How to copy large files across the network?

I am trying to do a relatively simple task: copy a 15,618,606,818-byte (~15 GB) file from one host to another. I've tried ftp, rsync, and gftp. ftp and rsync die with "file too large" errors; gftp is probably failing for the same reason, but doesn't say so. The target device has 250 GB free.

NFS is not a desired solution, nor is breaking the file up into smaller files.

How do I do this?
martin_2110 commented:
Read this wrong the first time. I would try upgrading rsync and your kernel to the latest versions. Any chance you're rsyncing to FAT? The file-size limit on FAT32 is 4 GB. What OS and filesystem are you syncing to and from?
scp will transfer the file for you.

Command to use is:
scp myfiletocopy username@remoteip:remotefile

I've done 100 GB files this way, and it's secure.
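With a file this size, it's also worth verifying the copy afterwards. A hedged sketch (the helper name and test paths are mine, not from the thread): compute SHA-256 checksums of the original and the copy and compare them. In practice you would run `sha256sum` locally and again via ssh on the remote host, then check that the two hashes match.

```shell
#!/bin/sh
# checksums_match FILE_A FILE_B
# Succeeds (exit 0) only when both files hash to the same SHA-256.
checksums_match() {
    a=$(sha256sum "$1" | awk '{print $1}')
    b=$(sha256sum "$2" | awk '{print $1}')
    [ "$a" = "$b" ]
}
```

For example: `checksums_match original transferred && echo "copy verified"`.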
rsync -vrPtz -e ssh host:/remote_path/* /local_path/


-e ssh   use the ssh client instead of rsh
-z       compress data during transfer
-t       preserve modification times (owner and permissions can also be preserved, with -o and -p)
-P       show progress and resume incomplete file transfers
-r       recurse into subdirectories
-v       verbose output

If it stops running, run it again. Cheers!
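The "run it again" advice can be automated. A hedged sketch (the function name is mine): rerun the transfer command until it exits successfully, relying on rsync's -P flag to resume each attempt where the last one left off.

```shell
#!/bin/sh
# retry_transfer CMD [ARGS...]
# Reruns the given transfer command until it exits 0. With rsync -P,
# each retry resumes the partial file rather than starting over.
retry_transfer() {
    until "$@"; do
        echo "transfer failed; retrying in 10s..." >&2
        sleep 10
    done
}

# Example (host and paths as in the rsync command above):
# retry_transfer rsync -vrPtz -e ssh host:/remote_path/bigfile /local_path/
```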
jmarkfoley (Author) commented:
I used scp. It works and I like it, especially the progress info. As it turned out, the problem wasn't with any of the client programs; they probably all would have worked. The problem was my target filesystem. I had mounted an external USB drive that was preformatted with Windows FAT. I thought that would be nice because I could access the files from my Windows laptop, but, as everyone *should* know, that imposes a 2 GB file-size limit. So I reformatted it to ext2, and no problem.
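The root cause could have been spotted up front by checking the target filesystem type before copying. A hedged sketch (the mount point and device name are placeholders, not from the thread):

```shell
# Show the filesystem type of the directory you intend to write to;
# "vfat" or "msdos" in the Type column means a FAT file-size limit
# applies. Replace "." with the target drive's mount point.
df -T .

# Reformatting to ext2, as done above, destroys all data on the
# partition; /dev/sdX1 is a placeholder -- confirm the device first.
# umount /mnt/usbdrive
# mkfs.ext2 /dev/sdX1
```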
Question has a verified solution.
