How to copy a whole 200GB folder's contents from another server using SSH

Hi,

I would like to request some assistance.

May I know what is the BEST and FASTEST method to download, via shell (SSH), a folder containing 120GB of .tar.gz files from a remote server to a Linux server?

A method that supports resume on failure would be even better.

FYI, I would like to transfer those files from one Linux server to another.

I'd appreciate it if anybody can suggest the best method available.

Thank you.

Regards,
Sham
Shamsul Kamal (Junior Tech) asked:

edster9999 commented:
It depends on the data. FTP is not encrypted; SSH is.
If the data is sensitive, secret, or important, then SSH (or scp) is the best method.

.gz is not the smallest compression. Something like bz2 would be smaller, but recompressing takes time.
You could bzip2 all the files, but then you would be more exposed to a transfer failure. At least if you have multiple files you can continue where you failed (just delete the last one and start again from there).
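The recompression idea above can be sketched like this. The file name is illustrative, and gzip/bzip2 are assumed to be installed; run it against a copy, never the live backups.

```shell
# Create a small sample file standing in for a real backup.
printf 'sample backup payload\n' > sample.txt

# Compress with gzip, keeping the original for comparison.
gzip -c sample.txt > sample.txt.gz

# Recompress the .gz as .bz2 (usually smaller, but slower to produce).
gunzip -c sample.txt.gz | bzip2 > sample.txt.bz2

# Integrity-check the bz2 archive.
bzip2 -t sample.txt.bz2 && echo "bz2 archive OK"
```

For already-compressed .tar.gz backups the size gain is often modest, which is worth weighing against the extra CPU time on 120GB of data.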

Check rsync too.
It is a great method of transfer: it uses ssh but only transfers the files you need.
You do have to install it at both ends, so it is a bit more fiddly.
LinuxNtwrkng commented:
I second rsync.
omarfarid commented:
Is this a job you want to do once or repeatedly? Are these two systems on the same network? Can you share folders (NFS) between them?

Shamsul Kamal (Author) commented:
Hi,

Thanks, all.

This job is to be done one time only, not repeatedly.

These two servers are on different networks.

I would like to collect all the backup files located on the old server, compressed as .tar.gz, and copy them to the new server for restoration purposes.

I intend to do this using SSH.

I will be doing this for about 10 servers.

If rsync is the best, I'd appreciate it if anybody could write a script for me.

The old server's backup file location is /backup/cpbackup/daily/xxxx.tar.gz

And I would like to copy all those files into the new server's /backup/cpbackup/daily/

I do not want to do this one by one and would like to use a script for all the files.

Thank you,

Regards,
Sham

diepes commented:
Here is a shell script I use with rsync.
It does rsync through an ssh connection.
If you put "delete" on the command line, it will delete files on the destination that are not at the source.


#!/bin/sh
extparam=""
if [ "$1" = "delete" ]
then
	extparam="--delete"
	echo "Deleting missing files .."
	# --delete: delete extraneous files from dest dirs
fi

rsync --verbose --rsh=ssh --recursive \
	--exclude=".svn" \
	--update ${extparam} \
	/data/ server.destination.net:/data

## --update: skip files that are newer on the receiver


Shamsul Kamal (Author) commented:
Hi,

May I know where and how to run that script?

Are there any other scripts/methods which would specifically work in my situation?

Thank you.
diepes commented:
I use it for synchronization of two directory structures.

The rsync command takes a couple of flags, then a source and a destination.

If you are on the source server, use:

$ rsync --verbose --rsh=ssh /backup/cpbackup/daily/*.tar.gz new_server:/backup/cpbackup/daily/

Shamsul Kamal (Author) commented:
Hi,

Thanks for the updates.

May I know how to run this on the destination server?

Thank you.
diepes commented:
Just swap source and destination. It goes FROM a TO b.

$rsync --verbose --rsh=ssh old_server_ip:/backup/cpbackup/daily/*.tar.gz   /backup/cpbackup/daily/



nabeelmoidu commented:
From my experience FTP is the fastest method for file transfer.
If it's a one-time job, all you need to do is enable the FTP server and either set the FTP root to the directory where you have the tar.gz files or copy them to the FTP root.

If you let us know the distro you use, we can guide you better.
Shamsul Kamal (Author) commented:
Hi,

We are using CentOS 5 on both servers.

Hope you can guide us step by step.

Thanks for your help.
nabeelmoidu commented:
See if you have the vsftpd package installed:
rpm -qa | grep vsftpd
Shamsul Kamal (Author) commented:
Got it:

root@svr10 [~]# rpm -qa | grep vsftpd
vsftpd-2.0.5-12.el5
root@svr10 [~]#

edster9999 commented:
scp and rsync are normally faster, as they are (or can be) compressed.
Since you are transferring already-compressed files, though, it will not make much difference here.
FTP is not normally faster.
nabeelmoidu commented:
Either copy your tar.gz files to /var/ftp/pub or set the anon_root parameter in /etc/vsftpd/vsftpd.conf to the directory where the tar.gz files exist.

/etc/init.d/vsftpd start

If you are not much bothered about security, disable iptables temporarily:
/etc/init.d/iptables stop

Do a netstat -tnlp to verify that FTP is listening, and first test from the other Linux machine that it's working:

ftp servername
username:anonymous
enter

If it works, try using wget ftp://servername/path/filename.tar.gz, because wget supports continuation of a paused or crashed download. I've done it many times with an HTTP download; not very sure about FTP, though. Worth a try anyway.

LinuxNtwrkng commented:
wget will resume a failed ftp download.  Just add the '-c' argument:

wget -c ftp://servername/path/filename.tar.gz
Shamsul Kamal (Author) commented:
Hi,

But how do I wget the whole folder?

Thank you.
Shamsul Kamal (Author) commented:
Hi,

I'm now using the following command:

rsync --verbose --rsh='ssh -p6298' IP:/backup/cpbackup/daily/*.tar.gz /backup/cpbackup/daily/

But this will only copy the *.tar.gz files. May I know how to make the command copy the whole IP:/backup/cpbackup/daily/ folder?

Thank you.
nabeelmoidu commented:
rsync -av
Shamsul Kamal (Author) commented:
Hi,

Thanks, nabeelmoidu!

Can I get the full command? I'm scared of doing it wrongly.

I appreciate your assistance.

Thank you.

nabeelmoidu commented:
rsync -av --rsh='ssh -p6298' IP:/backup/cpbackup/daily/  /backup/cpbackup/daily/
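Since the same copy is needed for about 10 servers, the command can be wrapped in a loop. This is a sketch: the addresses are placeholders, and echo makes it a dry run that only prints each command; delete the echo to perform the transfers.

```shell
#!/bin/sh
# Placeholder addresses; replace with the real old-server IPs.
servers="192.0.2.10 192.0.2.11 192.0.2.12"

for ip in $servers; do
    # Dry run: prints the command for each server.
    # Remove "echo" to actually run the copy.
    echo rsync -av --rsh='ssh -p6298' "$ip:/backup/cpbackup/daily/" /backup/cpbackup/daily/
done
```

Running the copies one server at a time also keeps a single failed transfer from affecting the others, and each can be rerun on its own.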
Shamsul Kamal (Author) commented:
Hi,

Thanks...

It seems to work...

I have another question related.

Which one should be faster: running the script on the destination server or on the source server?

Thank you.
nabeelmoidu commented:
I doubt if that would make a difference either way.
0
Shell Scripting
