I have a very critical problem with very large files. I have 3 SGI machines exporting a directory via NFS with >2GB files. I need to mount these directories on a Linux Red Hat 6.2 machine, but I can't read these large files. What must I do? Where can I find the patches for the Linux kernel? I also need to read from these files using C, and I'm sure I can't do it with the standard libraries. Where can I find info about this? Please point me to the solution as soon as you can and I will give more points to the question.
To access "large files" (files larger than 2GB) you need a system with "large file support" (LFS) compiled into both the kernel (2.4 kernels include this) and glibc. Red Hat 7.x already has this, but
I have seen several blogs and forum entries elsewhere state that because NTFS volumes do not support Linux ownership or permissions, they cannot be used for anonymous FTP upload through the vsftpd program.
It can be done, and here's how to get i…
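The usual approach is to map the NTFS volume's ownership to the FTP user at mount time, since NTFS stores no Unix owner of its own, and then enable anonymous uploads in vsftpd. A sketch under those assumptions (the device, mount point, and user names are examples, not taken from the original answer):

```
# /etc/fstab entry: ntfs-3g lets you assign a uid/gid/umask for the
# whole volume, so vsftpd's Unix permission checks pass.
/dev/sdb1  /var/ftp/upload  ntfs-3g  uid=ftp,gid=ftp,umask=0022  0  0

# /etc/vsftpd/vsftpd.conf (relevant lines only)
anonymous_enable=YES
write_enable=YES
anon_upload_enable=YES
anon_root=/var/ftp
```

Because every file on the volume appears owned by the `ftp` user, vsftpd sees a writable upload directory even though NTFS itself has no concept of that ownership.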
Note: for this to work properly you need to use a crossover network cable.
1. Connect servers S1 and S2 via their second network ports. Note that you can use the first ports, but usually these are occupied by the Service Provide…
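Once the cable is in place, each second NIC needs an address on a private subnet of its own. A sketch of that step, assuming the second interface is `eth1` and using an arbitrary private subnet (interface names and addresses are examples, not from the original steps):

```
# On server S1:
ifconfig eth1 192.168.100.1 netmask 255.255.255.0 up

# On server S2:
ifconfig eth1 192.168.100.2 netmask 255.255.255.0 up

# Verify the back-to-back link from S1:
ping -c 3 192.168.100.2
```

Because the cable connects the two NICs directly, no switch or gateway is involved; the two addresses only need to share a subnet.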
Finds all prime numbers in a requested range and places them in a public primes() array. I've demonstrated a template size of 30 (2 * 3 * 5), but larger templates can be built, such as 210 (2 * 3 * 5 * 7) or 2310 (2 * 3 * 5 * 7 * 11).
The larger templa…