Solved

Red Hat Linux and Apache - need to fix 2GB limit

Posted on 2004-04-20
Medium Priority
742 Views
Last Modified: 2008-02-01
Hello and thank you for any help you have.

I need to be able to download a file larger than 2GB from my website, but Apache won't allow the download of any file over 2GB.

Please advise.

Thank you very much.

Best regards,

Dr34m3r
Question by:dr34m3rs
17 Comments
 
LVL 15

Expert Comment

by:samri
ID: 10877191
I would think that this is the OS limit.  What is your OS and version?
 
LVL 1

Author Comment

by:dr34m3rs
ID: 10877240
Red Hat Linux 7.3

If so, how can I fix that?
 
LVL 17

Expert Comment

by:dorward
ID: 10878095
I believe you would need to compile support into your kernel for a file system that supports large files. I believe XFS will do the job; XFS support is available in recent 2.4 kernels (although I don't know if there are any RPM kernel packages which include it).

You would then need to format some of your hard disk to use that file system, roughly as sketched below.
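
Something like this (just a sketch; /dev/hdb1 and the mount point are placeholders for whatever spare disk you have, and you would need the xfsprogs package installed):

   # format a spare partition with XFS and mount it where Apache can serve it
   mkfs.xfs /dev/hdb1
   mkdir -p /var/www/html/big
   mount -t xfs /dev/hdb1 /var/www/html/big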
 
LVL 10

Expert Comment

by:Mercantilum
ID: 10879543
Have a look at
   http://cbbrowne.com/info/fs.html#AEN22253
for FS information.
You may be interested in "I want more than 2GB, what can I do...".
You will notice that the popular ext2 FS can support up to 4TB, much more than 2GB... so the FS is not the problem.
It is either Linux or Apache.

What is the version of your kernel? (run "uname -a")
 
LVL 10

Expert Comment

by:Mercantilum
ID: 10879646
It seems Red Hat 7.3 won't cause a size problem... ( http://answers.google.com/answers/threadview?id=122241 )

In the meantime, you can split files

   man split
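
For example (just a sketch; the file names and the chunk size are placeholders, chosen to stay safely under the limit):

   # cut the archive into pieces of about 1.9 GB each
   split -b 1900m myfile.tar.gz myfile.tar.gz.part.
   # whoever downloads the pieces reassembles them with cat
   cat myfile.tar.gz.part.* > myfile.tar.gz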

I'll have a look on the Apache side to see if there is a file size limitation.
What kind of error do you get exactly from Apache? What's in the log? Please give details.
 
LVL 1

Author Comment

by:dr34m3rs
ID: 10892203
Hello there.

When I try to access the file directly (http://mysite.com/myfile.tar.gz) I get a 403 (on anything bigger than 2 GB), and when I try to link to the file I get a "file can not be found or is not accessible"...

I'm not really sure where to look in Apache to see if there is a size limit...

Thank you very much.

Best regards,

Dr34m3r
 
LVL 1

Author Comment

by:dr34m3rs
ID: 10892219
Kernel version is 2.4.21.
 
LVL 1

Author Comment

by:dr34m3rs
ID: 10914160
The problem is Apache, not Linux.
 
LVL 10

Accepted Solution

by:
Mercantilum earned 800 total points
ID: 10914822
OK, it seems that this problem occurs very rarely, as there is *really* not much about it either on the web or in the newsgroups.

I had a look at the Apache source files.

Let's look at what it does when it opens a file to be sent (explanation after):
   (I removed the code that is not relevant to our problem)

---- start of code ----
#if APR_HAS_LARGE_FILES
        if (r->finfo.size > AP_MAX_SENDFILE) {
            /* APR_HAS_LARGE_FILES issue; must split into multiple buckets,
             * no greater than MAX(apr_size_t), and more granular than that
             * in case the brigade code/filters attempt to read it directly.
             */
            apr_off_t fsize = r->finfo.size;
            e = apr_bucket_file_create(fd, 0, AP_MAX_SENDFILE, r->pool,
                                       c->bucket_alloc);
            while (fsize > AP_MAX_SENDFILE) {
                   ...
            }
            e->length = (apr_size_t)fsize; /* Resize just the last bucket */
        }
        else
#endif
            e = apr_bucket_file_create(fd, 0, (apr_size_t)r->finfo.size,
                                       r->pool, c->bucket_alloc);
----end----

So what can we see? That:
1. if you compile with APR_HAS_LARGE_FILES, it will send files larger than AP_MAX_SENDFILE;
2. if not, your files are limited to AP_MAX_SENDFILE.

What is AP_MAX_SENDFILE? 16 MB

( from httpd.h:#define AP_MAX_SENDFILE 16777216  /* 2^24 */ )

Even if you compiled with APR_HAS_LARGE_FILES on Linux, you would get a max size of "apr_off_t",
and apr_off_t is a long!

( from apr.h:typedef  long           apr_off_t; )

So, even with APR_HAS_LARGE_FILES, your max size will be 2 GB, and I don't think you can get bigger unless you change the source code.

Conclusion: the maximum possible size is 2 GB; have a look at "split" (man split) to cut your file into several files.
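
You can confirm that ceiling on the box itself (a quick sketch; getconf ships with glibc):

   getconf LONG_BIT    # prints 32 on a stock 32-bit Red Hat 7.3 install
   # a signed 32-bit offset tops out at 2^31 - 1 = 2147483647 bytes, i.e. 2 GB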
 
LVL 1

Author Comment

by:dr34m3rs
ID: 10915185
Awesome. Thank you very much.

Best regards,

Dr34m3r
 
LVL 15

Expert Comment

by:samri
ID: 10958677
Mercantilum: excellent stuff, dude!
 
LVL 10

Expert Comment

by:Mercantilum
ID: 10958695
Thanks guys.
 
LVL 1

Author Comment

by:dr34m3rs
ID: 10962877
Mercantilum rocks :)
 

Expert Comment

by:dbroders
ID: 11094243
I don't think the accepted answer is correct.  What you need to do is configure/compile Apache with the following CFLAGS:

CFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64"

Once I had done this, Apache (2.0.48) started listing and serving up files greater than 2 GB.

I've put up my short rundown on this at http://modzer0.cs.uaf.edu/wiki/?Apache/LargeFileSupport
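
In practice the rebuild would look something like this (a sketch; the source directory and install prefix are placeholders):

   # rebuild Apache from source with 64-bit file offsets
   cd httpd-2.0.48
   CFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64" ./configure --prefix=/usr/local/apache2
   make && make install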
 
LVL 10

Expert Comment

by:Mercantilum
ID: 11094616
Oh, I didn't see that :)  (a google search for "apache and FILE_OFFSET_BITS" gives about 10 results across both the web and the groups...)

The code analysis is still valid; it's just that this change should make the default long value behind apr_off_t grow to more bits... (a quick probe of that effect is sketched below)

dr34m3rs, please try this for a while (my only worry is that its use seems rare according to the google search...) and if it does work OK, ask in this thread for an unaccept and then accept the above answer.
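
If you want to see the flag doing its job before rebuilding anything, here is a tiny probe (a sketch; offt.c is a hypothetical file name):

---- start of code ----
# write a small C probe that prints the width of off_t
cat > offt.c <<'EOF'
#include <stdio.h>
#include <sys/types.h>
int main(void) { printf("off_t is %d bits\n", (int)sizeof(off_t) * 8); return 0; }
EOF
gcc offt.c -o offt && ./offt                               # "off_t is 32 bits" on a stock 32-bit build
gcc -D_FILE_OFFSET_BITS=64 offt.c -o offt64 && ./offt64    # "off_t is 64 bits"
----end----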
 
LVL 1

Author Comment

by:dr34m3rs
ID: 11094838
OK, I will look into this, although I have other projects pending that I need to do first... but when I get to it I will let everyone know whether it worked or not. And thanks for the great info, guys!!

Thank you very much for everyone's help.

Best regards,

Dr34m3r
 
LVL 1

Author Comment

by:dr34m3rs
ID: 11634612
I had time to look into that.

That fix did not work. Apache could not run and had to be reconfigured again.

I still have the problem where Apache cannot read files over 2 GB :)

But that's ok, I can live with that. = D

Thank you very much.

Best regards,

Dr34m3r
