
Red Hat Linux and Apache - need to fix the 2GB limit

Hello and thank you for any help you have.

I need to be able to serve a file larger than 2GB from my website, but Apache won't allow the download of any file over 2GB.

Please advise.

Thank you very much.

Best regards,

Dr34m3r
1 Solution
 
samriCommented:
I would think that this is the OS limit.  What is your OS and version?
 
dr34m3rsAuthor Commented:
Red Hat Linux 7.3

If so, how can I fix that?
 
dorwardCommented:
I believe you would need to compile support into your kernel for a file system that supports large files. I believe XFS will do the job. XFS support is available in recent 2.4 kernels (although I don't know if there are any RPM kernel packages that include it).

You would then need to format some of your hard disk to use that file system.

 
MercantilumCommented:
Have a look at
   http://cbbrowne.com/info/fs.html#AEN22253
for file system information.
You may be interested in "I want more than 2GB, what can I do...".
You will notice that the popular ext2 file system can support up to 4TB, much more than 2GB... so the file system is not the problem.
It is either Linux or Apache.

What is the version of your kernel? (run "uname -a")
 
MercantilumCommented:
It seems Red Hat 7.3 itself won't cause a size problem... ( http://answers.google.com/answers/threadview?id=122241 )

In the meantime, you can split the file:

   man split

I'll have a look on the Apache side to see if there is a file size limitation.
What kind of error do you get from Apache, exactly? What is in the log? Please give details.
 
dr34m3rsAuthor Commented:
Hello there.

When I try to access the file directly (http://mysite.com/myfile.tar.gz) I get a 403 (on anything bigger than 2GB), and when I try to link to the file I get a "file cannot be found or is not accessible" error...

I'm not really sure where to look in Apache to see if there is a size limit...

Thank you very much.

Best regards,

Dr34m3r
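A plausible explanation for that 403: on a 32-bit build without large-file support, the C library's stat() call fails with EOVERFLOW for files over 2GB, and Apache then refuses to serve the file. Here is a minimal diagnostic sketch (a hypothetical check, not from this thread; it assumes a 32-bit Linux/glibc system):

---- start of code ----
/* Hypothetical diagnostic: compiled WITHOUT large-file flags on
 * 32-bit Linux, stat() on a file over 2GB fails with EOVERFLOW
 * ("Value too large for defined data type"). */
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <sys/types.h>
#include <sys/stat.h>

int main(int argc, char **argv)
{
    struct stat st;

    if (argc < 2) {
        fprintf(stderr, "usage: %s <file>\n", argv[0]);
        return 1;
    }
    if (stat(argv[1], &st) == -1) {
        /* Expect EOVERFLOW here when pointed at the >2GB file. */
        fprintf(stderr, "stat failed: %s\n", strerror(errno));
        return 1;
    }
    printf("size = %ld bytes\n", (long)st.st_size);
    return 0;
}
----end----

If stat fails that way on the large file, the limit is in how the binary was compiled, not in Apache's configuration.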
 
dr34m3rsAuthor Commented:
kernel version is 2.4.21
 
dr34m3rsAuthor Commented:
The problem is Apache, not Linux.
 
MercantilumCommented:
OK, it seems this problem occurs very rarely, as there is *really* not much about it on either the web or the newsgroups.

I had a look at the Apache source files.

Let's look at what happens when Apache opens a file to be sent (explanation after):
   (I removed the code that is not relevant to our problem)

---- start of code ----
#if APR_HAS_LARGE_FILES
        if (r->finfo.size > AP_MAX_SENDFILE) {
            /* APR_HAS_LARGE_FILES issue; must split into multiple buckets,
             * no greater than MAX(apr_size_t), and more granular than that
             * in case the brigade code/filters attempt to read it directly.
             */
            apr_off_t fsize = r->finfo.size;
            e = apr_bucket_file_create(fd, 0, AP_MAX_SENDFILE, r->pool,
                                       c->bucket_alloc);
            while (fsize > AP_MAX_SENDFILE) {
                   ...
            }
            e->length = (apr_size_t)fsize; /* Resize just the last bucket */
        }
        else
#endif
            e = apr_bucket_file_create(fd, 0, (apr_size_t)r->finfo.size,
                                       r->pool, c->bucket_alloc);
----end----

So what can we see?
1. If you compile with APR_HAS_LARGE_FILES, Apache can send files larger than AP_MAX_SENDFILE (by splitting them into multiple buckets).
2. If not, your files are limited to AP_MAX_SENDFILE.

What is AP_MAX_SENDFILE? 16MB.

( from httpd.h:#define AP_MAX_SENDFILE 16777216  /* 2^24 */ )

Even if you compiled with APR_HAS_LARGE_FILES on Linux, you would be limited to the maximum value of "apr_off_t",
and apr_off_t is a long!

( from apr.h:typedef  long           apr_off_t; )

So, even with APR_HAS_LARGE_FILES, your max size will be 2GB, and I don't think you can get bigger unless you change the source code.

Conclusion: the maximum possible size is 2GB; have a look at "split" (man split) to cut your file into several files.
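To see concretely why a long caps things near 2GB, here is a minimal sketch (my own illustration, not Apache code; it assumes 32-bit x86 Linux, where long is 4 bytes):

---- start of code ----
/* Illustration only, not Apache code: with apr_off_t defined as
 * long, the largest representable offset on 32-bit x86 is
 * LONG_MAX = 2147483647 bytes, i.e. just under 2GB. */
#include <stdio.h>
#include <limits.h>

typedef long apr_off_t;   /* mirrors the apr.h typedef quoted above */

int main(void)
{
    printf("sizeof(apr_off_t) = %lu bytes\n",
           (unsigned long)sizeof(apr_off_t));
    printf("largest offset    = %ld bytes\n", LONG_MAX);
    return 0;
}
----end----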
 
dr34m3rsAuthor Commented:
Awesome. Thank you very much.

Best regards,

Dr34m3r
 
samriCommented:
Mercantilum: excellent stuff, dude!
 
MercantilumCommented:
Thanks guys.
 
dr34m3rsAuthor Commented:
Mercantilum rocks :)
 
dbrodersCommented:
I don't think the accepted answer is correct. What you need to do is configure/compile Apache with the following CFLAGS:

CFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64"

Once I had done this, Apache (2.0.48) started showing and serving up files greater than 2GB.

I've put up my short rundown on this at http://modzer0.cs.uaf.edu/wiki/?Apache/LargeFileSupport
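A quick way to verify what those flags change (a minimal sketch, assuming 32-bit Linux/glibc; the file name offtest.c is hypothetical): compile it once plainly and once with the CFLAGS above, and sizeof(off_t) should grow from 4 to 8 bytes.

---- start of code ----
/* offtest.c -- hypothetical check of large-file support.
 * gcc offtest.c
 *     -> off_t is 4 bytes (32-bit offsets, 2GB limit)
 * gcc -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 offtest.c
 *     -> off_t is 8 bytes (64-bit offsets) */
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    printf("sizeof(off_t) = %lu bytes\n", (unsigned long)sizeof(off_t));
    return 0;
}
----end----

That widening of the file offset type is the effect those CFLAGS are meant to have on Apache's apr_off_t as well.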
 
MercantilumCommented:
Oh, I didn't see that :)  (a Google search for "apache and FILE_OFFSET_BITS" gives about 10 results across both web and groups...)

The code analysis is still valid; that change should just make the default long used for apr_off_t grow to more bits...

dr34m3rs, please try this for a while (my only worry is that its use seems rare, judging by the Google search...) and if it works OK, ask in this thread to unaccept my answer and then accept the one above.
 
dr34m3rsAuthor Commented:
OK, I will look into this, although I have other pending projects I need to do first... but when I get to it I will let everyone know whether or not it worked. And thanks for the great info, guys!

Thank you very much for everyone's help.

Best regards,

Dr34m3r
 
dr34m3rsAuthor Commented:
I had time to look into that.

That fix did not work. Apache could not run and had to be reconfigured again.

I still have the problem where Apache cannot read files over 2GB :)

But that's OK, I can live with that. =D

Thank you very much.

Best regards,

Dr34m3r
