Solved

Red Hat Linux and Apache - Need to fix 2GB limit

Posted on 2004-04-20
733 Views
Last Modified: 2008-02-01
Hello, and thank you for any help you can offer.

I need to be able to download a file larger than 2GB from my website, but Apache won't allow the download of any file over 2GB.

Please advise.

Thank you very much.

Best regards,

Dr34m3r
Question by:dr34m3rs
17 Comments
 
LVL 15

Expert Comment

by:samri
I would think that this is the OS limit.  What is your OS and version?
 
LVL 1

Author Comment

by:dr34m3rs
Red Hat Linux 7.3

If so, how can I fix that?
 
LVL 17

Expert Comment

by:dorward
I believe you would need to compile support into your kernel for a file system that supports large files. I believe XFS will do the job; XFS support is available in recent 2.4 kernels (although I don't know if there are any RPM kernel packages that include it).

You would then need to format some of your hard disk to use that file system.
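
A quick way to check whether the kernel and file system are actually the bottleneck (a minimal sketch; the path and size are illustrative) is to try creating a file larger than 2GB directly:

   # write ~3GB of zeros; if this stops near 2GB, the limit is the OS/FS, not Apache
   dd if=/dev/zero of=/tmp/bigfile bs=1024k count=3000
   ls -l /tmp/bigfile
   rm /tmp/bigfile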
 
LVL 10

Expert Comment

by:Mercantilum
Have a look at
   http://cbbrowne.com/info/fs.html#AEN22253
for FS information.
You may be interested in "I want more than 2GB, what can I do...".
You will notice that the popular FS ext2 can support up to 4TB, much more than 2GB... so the FS is not the problem.
It is either Linux or Apache.

What is the version of your kernel? (do "uname -a")
 
LVL 10

Expert Comment

by:Mercantilum
It seems Red Hat 7.3 won't cause a size problem... ( http://answers.google.com/answers/threadview?id=122241 )

In the meantime, you can split the file:

   man split
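
For example (a minimal sketch; the chunk size and filenames are illustrative):

   # on the server: cut the archive into 1GB pieces named myfile.tar.gz.aa, .ab, ...
   split -b 1024m myfile.tar.gz myfile.tar.gz.
   # on the client: reassemble the pieces and verify the archive
   cat myfile.tar.gz.?? > myfile.tar.gz
   gzip -t myfile.tar.gz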

I'll have a look on the Apache side to see if there is a file size limitation.
What kind of error do you get exactly from Apache? What is in the log? Please give details.
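
For example (the path below is the usual Red Hat default; yours may differ):

   # show the most recent Apache errors
   tail -20 /var/log/httpd/error_log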
 
LVL 1

Author Comment

by:dr34m3rs
Hello there.

When I try to access the file directly (http://mysite.com/myfile.tar.gz) I get a 403 (on anything bigger than 2GB), and when I try to link to the file I get a "file can not be found or is not accessible" error...

I'm not really sure where to look in Apache to see if there is a size limit...

Thank you very much.

Best regards,

Dr34m3r
 
LVL 1

Author Comment

by:dr34m3rs
Kernel version is 2.4.21.
 
LVL 1

Author Comment

by:dr34m3rs
The problem is Apache, not Linux.
 
LVL 10

Accepted Solution

by:Mercantilum (earned 200 total points)
OK, it seems that this problem occurs very rarely, as there is *really* not much about it on either the web or the newsgroups.

I had a look at the Apache source files.

Let's look at what it does when it opens a file to be sent (explanation after; I removed the code that is not relevant to our problem):

---- start of code ----
#if APR_HAS_LARGE_FILES
        if (r->finfo.size > AP_MAX_SENDFILE) {
            /* APR_HAS_LARGE_FILES issue; must split into multiple buckets,
             * no greater than MAX(apr_size_t), and more granular than that
             * in case the brigade code/filters attempt to read it directly.
             */
            apr_off_t fsize = r->finfo.size;
            e = apr_bucket_file_create(fd, 0, AP_MAX_SENDFILE, r->pool,
                                       c->bucket_alloc);
            while (fsize > AP_MAX_SENDFILE) {
                   ...
            }
            e->length = (apr_size_t)fsize; /* Resize just the last bucket */
        }
        else
#endif
            e = apr_bucket_file_create(fd, 0, (apr_size_t)r->finfo.size,
                                       r->pool, c->bucket_alloc);
----end----

So what can we see?
1. If you compile with APR_HAS_LARGE_FILES, it will send files larger than AP_MAX_SENDFILE.
2. If not, your files are limited to AP_MAX_SENDFILE.

What is AP_MAX_SENDFILE? 16MB.

( from httpd.h:#define AP_MAX_SENDFILE 16777216  /* 2^24 */ )

Even if you compiled with APR_HAS_LARGE_FILES on Linux, you would get a max size of "apr_off_t",
and apr_off_t is a long!

( from apr.h:typedef  long           apr_off_t; )

So, even with APR_HAS_LARGE_FILES, your max size will be 2GB, and I don't think you can get bigger unless you change the source code.

Conclusion: the maximum size is 2GB; have a look at "split" (man split) to cut your file into several files.
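
If you want to check these values against your own build, you can grep the headers (a sketch assuming an unpacked Apache 2.0.x source tree; adjust the paths to your layout):

   # the 16MB per-bucket cap
   grep -n AP_MAX_SENDFILE include/httpd.h
   # the apr_off_t typedef (apr.h is generated by ./configure)
   grep -n apr_off_t srclib/apr/include/apr.h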
 
LVL 1

Author Comment

by:dr34m3rs
Awesome. Thank you very much.

Best regards,

Dr34m3r
 
LVL 15

Expert Comment

by:samri
Mercantilum: excellent stuff, dude!
 
LVL 10

Expert Comment

by:Mercantilum
Thanks guys.
 
LVL 1

Author Comment

by:dr34m3rs
Mercantilum rocks :)
 

Expert Comment

by:dbroders
I don't think the accepted answer is correct. What you need to do is configure/compile Apache with the following CFLAGS:

CFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64"

Once I had done this, Apache (2.0.48) started showing and serving up files greater than 2GB.

I've put up my short run down on this at http://modzer0.cs.uaf.edu/wiki/?Apache/LargeFileSupport
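
For reference, the build would look something like this (a sketch assuming an Apache 2.0.x source tree; the --prefix is illustrative):

   # pass the large-file flags to the compiler at configure time
   CFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64" ./configure --prefix=/usr/local/apache2
   make
   make install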
 
LVL 10

Expert Comment

by:Mercantilum
Oh, I didn't see that :)  (a google search for "apache and FILE_OFFSET_BITS" gives about 10 results across both the web and groups...)

The code analysis is still valid; this change should make the long type behind apr_off_t grow to more bits...

dr34m3rs, please try this for a while (my only worry is that its use seems rare according to the google search...) and if it works OK, ask in this thread for an unaccept and then accept the answer above.
 
LVL 1

Author Comment

by:dr34m3rs
OK, I will look into this, although I have other projects pending that I need to do first... but when I get to it I will let everyone know whether it worked. Thanks for the great info, guys!

Thank you very much for everyone's help.

Best regards,

Dr34m3r
 
LVL 1

Author Comment

by:dr34m3rs
I had time to look into that.

That fix did not work; Apache could not run and had to be reconfigured again.

I still have the problem where Apache cannot read files over 2GB :)

But that's OK, I can live with that. =D

Thank you very much.

Best regards,

Dr34m3r
