• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 1349

2 Gig File Size Limit????

I seem to be running into a 2 GB file size limit on a new Linux machine.  I'm trying to unzip a file that I created on a Windows machine (under NTFS).  Once unzipped, the file should be over 3 GB.  However, unzip always fails at about 2.1478 GB, and I've tried other methods.

Is there some inherent file size restriction? If so, how do I get rid of it?

I have a Dell-installed Red Hat Linux 7.3 server.

Thanks,
Brett
Asked by: brgordon

1 Solution
 
elniniokevCommented:
AFAIK there is a 2 GB file size limit for Linux on 32-bit machines.  There should be a patch available from Red Hat to raise the maximum file size to 4 GB.  You will have to recompile your kernel to take advantage of the patch.
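For what it's worth, the failure point Brett reports (~2.1478 GB) matches the signed 32-bit off_t ceiling exactly. A minimal sketch of the arithmetic:

```shell
# A signed 32-bit off_t can address at most 2^31 - 1 bytes, which is
# why writes die just short of 2.15 GB (decimal) on an unpatched
# 32-bit system.
limit=$(( (1 << 31) - 1 ))
echo "$limit bytes"                      # 2147483647 bytes
echo "$(( limit / 1000000 )) MB (decimal)"
```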
 
brgordonAuthor Commented:
Is there any way to get past the 4 GB limit?

Thanks,
Brett
 
dorwardCommented:
Patch your kernel to support the XFS file system, install the XFS tools, and create an XFS file system on one of your partitions.  Depending on the page size, the maximum file size varies between sixteen and sixty-four terabytes, which should be more than enough for most people.

http://oss.sgi.com/projects/xfs/

Bonus - it's journaling :)
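The steps above can be sketched as shell commands. This is a hypothetical sketch: /dev/hdb1 and /mnt/big are placeholder names, and it assumes the XFS kernel patch and the xfsprogs tools from the SGI page are already installed.

```shell
# WARNING: mkfs destroys any existing data on the target partition.
# /dev/hdb1 and /mnt/big are placeholders for illustration only.
mkfs.xfs /dev/hdb1              # create the XFS file system
mkdir -p /mnt/big
mount -t xfs /dev/hdb1 /mnt/big # files here can now exceed 2 GB
```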
 
ahoffmannCommented:
With kernel 2.4.x and glibc 2.2.x there is no longer a 2 GB limit.
Unfortunately, most distributions ship old binaries compiled against a (g)libc with the 2 GB limit, so they won't work properly on 2.4.x kernels.
You need programs compiled under 2.4.x which also use the open64(), lseek64(), and related functions.
AFAIK there is an LFS (large file support) fileutils package.
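As a sketch of what "large-file aware" means at build time (assuming a glibc 2.2 system): defining the LFS feature-test macros makes the ordinary open()/lseek() calls resolve to their 64-bit counterparts, and getconf can report the resulting off_t width for a mount point.

```shell
# Build-time sketch (myprog.c is a placeholder source file):
#
#   gcc -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -o myprog myprog.c
#
# With _FILE_OFFSET_BITS=64, open() and lseek() transparently become
# open64() and lseek64().  POSIX getconf shows the file-size width in
# bits for files on a given mount point:
getconf FILESIZEBITS /
```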
 
ahoffmannCommented:
BTW, if you compile tar:

    ./configure --enable-largefile

 
brgordonAuthor Commented:
ahoffmann & dorward,

Thanks for both of your responses.  As it turns out, the problem does not seem to be with the OS - it seems like 'unzip' has a problem with files over 2 GB.  I was able to create a 10 GB file.

Do certain compression utilities have problems with the 2 GB limit?  If so, how do I get around it?  Are there other utilities?

ahoffmann - could the 'unzip' utility be one of the old binaries?

Thanks,
Brett
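A quick way to reproduce the OS-side test described above is to write a sparse file just past the 2 GB boundary with dd (the filename below is illustrative):

```shell
# Seek 2100 MiB into a new file and write one 1 MiB block; the result
# is a ~2.2 GB (sparse) file, comfortably past the 2^31 - 1 boundary.
dd if=/dev/zero of=bigtest bs=1M seek=2100 count=1
ls -l bigtest     # size should exceed 2147483647 bytes
rm bigtest
```

If this succeeds, the kernel and filesystem handle large files and the limit lies in an individual tool.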
 
brgordonAuthor Commented:
ahoffman,

lastly, what does 'AFAIK' mean?

-brett
 
ahoffmannCommented:
AFAIK - as far as I know
(OOTFA - one of these f... acronyms :)

> could the 'unzip' utility be one of the old binaries?
yes.
You can check with:
  nm -gop `which unzip`|grep -i open
but unzip is most likely a stripped binary, so you will probably get nothing.

I suggest using gzip instead, but make sure that it is 2 GB aware:
  nm -gop `which gzip`|grep -i open
(if the open symbols reference GLIBC_2.0 rather than open64(), it is an old one)
 
brgordonAuthor Commented:
I was able to use 'zcat'.

Thanks,
Brett
 
ahoffmannCommented:
zcat is a link to gzip, usually ;-)
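For the record, the zcat workaround can be sketched like this (filenames are placeholders). Because zcat streams decompressed data to stdout, the output size is limited only by the filesystem, not by the archiver:

```shell
# zcat is (usually) gzip -dc under another name; it decompresses to
# stdout, and the shell redirection writes the large output file.
zcat big.gz > big
# gzip can also read single-member, deflate-compressed .zip archives,
# which is likely why zcat worked on the file here:
zcat archive.zip > extracted_file
```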
