Shamsul Kamal
asked on
How to Copy the whole 200GB folder content from another server using SSH
Hi,
I would like to request some assistance.
What is the best and fastest method to download, via shell (SSH) or FTP, a folder containing 120GB of .tar.gz files from a remote server to a Linux server?
A method that supports resuming after a failure would be even better.
FYI, I would like to transfer these files from one Linux server to another.
I'd appreciate it if anybody could suggest the best method available.
Thank you.
Regards,
Sham
I second rsync
is this a job you want to do once or repeatedly? are these two systems on the same network? can you share folders (NFS) between them?
ASKER
Hi,
Thanks for all,
This job is a one-time task, not something to be repeated.
The two servers are on different networks.
I would like to collect all the backup files on the old server, compressed as .tar.gz, and copy them to the new server for restoration purposes.
I intend to do this over SSH.
I will be doing this for about 10 servers.
If rsync is the best option, I'd appreciate it if somebody could write a script for me.
The old server's backup files are located at /backup/cpbackup/daily/xxxx.tar.gz
and I would like to copy all of them into /backup/cpbackup/daily/ on the new server.
I do not want to do this one file at a time; I would like a script that handles all the files.
Thank you,
Regards,
Sham
Here is a shell script i use with rsync.
It does rsync through a ssh connection.
if you put "delete" on the command line it will delete files on the destination not at the source.
#!/bin/sh
extparam=""
if [ "$1" = "delete" ]
then
    extparam="--delete"
    echo "Deleting missing files .."
    # --delete: delete extraneous files from dest dirs
fi
rsync --verbose --rsh=ssh --recursive \
    --exclude=".svn" \
    --update ${extparam} \
    /data/ server.destination.net:/data
# --update: skip files that are newer on the receiver
ASKER
Hi,
May I know where and how to run that script?
Are there any other scripts / methods that would work specifically in my situation?
Thank you.
I use it for synchronization of 2 directory structures.
the rsync command takes a couple of flags and then a source and destination.
If you are on the source server, use:
$ rsync --verbose --rsh=ssh /backup/cpbackup/daily/*.tar.gz new_server:/backup/cpbackup/daily/
ASKER
Hi,
Thanks for the updates.
May I know how to run this on the destination server?
Thank you.
ASKER CERTIFIED SOLUTION
From my experience FTP is the fastest method for file transfer.
If it's a one-time job, all you need to do is enable the FTP server and either set the FTP root to the directory where you have the tar.gz files or copy them to the FTP root.
If you let us know the distro you use, we can guide you better.
ASKER
Hi,
We are using CentOS 5 on both servers.
I hope you can guide us step by step.
Thanks for your help.
see if you have vsftpd package installed
rpm -qa | grep vsftpd
ASKER
Got..
root@svr10 [~]# rpm -qa | grep vsftpd
vsftpd-2.0.5-12.el5
root@svr10 [~]#
scp and rsync are normally faster, as their traffic is (or can be) compressed.
Since you are transferring already-compressed files, it will not make much difference here.
FTP is not normally faster, though.
Either copy your tar.gz files to /var/ftp/pub, or set the anon_root parameter in /etc/vsftpd/vsftpd.conf to the directory where the tar.gz files exist.
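For reference, the relevant /etc/vsftpd/vsftpd.conf lines might look like this (the anon_root path is the one from this thread; anonymous_enable is needed for the anonymous login suggested below):

```ini
# /etc/vsftpd/vsftpd.conf -- assumed settings for an anonymous, read-only share
anonymous_enable=YES
anon_root=/backup/cpbackup/daily
```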
/etc/init.d/vsftpd start
If you are not much bothered about security, disable iptables temporarily:
/etc/init.d/iptables stop
Do a netstat -tnlp to verify that FTP is listening,
and first test from the other Linux machine that it is working:
ftp servername
username: anonymous
enter
If it works, try using wget ftp://servername/path/filename.tar.gz,
because wget supports continuation of a paused or crashed download. I've done it many times with an HTTP download, though I'm not sure about FTP. Worth a try anyway.
wget will resume a failed ftp download. Just add the '-c' argument:
wget -c ftp://servername/path/filename.tar.gz
ASKER
Hi,
But how do I wget the whole folder?
Thank you.
ASKER
Hi,
I'm now using the following command:
rsync --verbose --rsh='ssh -p6298' IP:/backup/cpbackup/daily/*.tar.gz /backup/cpbackup/daily/
But this will only copy the *.tar.gz files. May I know how to make the command copy the whole IP:/backup/cpbackup/daily/ folder?
Thank you.
rsync -av
ASKER
Hi,
Thanks nabeelmoidu! ...
Can I get the full command? I'm scared of doing it wrongly.
I appreciate your assistance.
Thank you.
SOLUTION
ASKER
Hi,
Thanks...
It seems to work...
I have another related question.
Which should be faster: running the script on the destination server, or running it on the source server?
Thank you.
I doubt that would make a difference either way.
If the data is sensitive, secret, or important, then SSH (or scp) is the best method.
.gz is not the smallest compression; something like bz2 would be smaller but would take more time to produce.
You could bzip2 all the files, but then you would be more exposed to a failed transfer. At least with multiple files you can continue where you failed (just delete the last, incomplete one and start again from there).
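The gzip-vs-bzip2 trade-off is easy to measure locally; a quick sketch comparing the two on the same sample file (sizes and timings will vary with real backup data):

```shell
# Compress the same data both ways and compare the resulting sizes.
tmp=$(mktemp -d)
head -c 1000000 /dev/zero > "$tmp/sample"   # highly compressible test data

gzip  -9 -c "$tmp/sample" > "$tmp/sample.gz"
bzip2 -9 -c "$tmp/sample" > "$tmp/sample.bz2"

ls -l "$tmp"/sample*   # compare the .gz and .bz2 sizes
```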
Check out rsync too.
It is a great method of transfer: it uses SSH but only transfers the files you need.
You do have to install it at both ends, so it is a bit more fiddly.