  • Status: Solved

SCP file to multiple RHEL5 hosts from central computer...

I want to copy my authorized_keys file to all of my Linux hosts at the same time, so that I can update it in one location and then push it out to all the others. However, since I'm new to shell scripts I have a couple of questions about this...

I have made a simple test file with:

#!/bin/sh
scp file.txt root@192.156.222.222:/root/.ssh
echo "Server has the file now!"

This will prompt me for the root password and then perform the copy as intended. However, suppose I want to define a variable $SERVERS in this file holding a list of all the hosts by IP, and change the script to use: scp file.txt root@$SERVERS:/root/.ssh. How can I write that?

Also, I need a way to use an admin user account for the servers instead of the root account. For example, I have my own user that uses a private/public key pair to log in as root, but how can I use that user account to do the SCP to each box? If I use the root user I would have to type in the password for each box, because they are all different, which totally defeats the purpose.

Any help would be appreciated.
Asked by willlandymore

2 Solutions
Deepak KosarajuCommented:
#!/bin/sh
scp file.txt admin@192.156.222.222:/root/.ssh
echo "Server has the file now!"

Put the admin username before the @ of the hostname; use an account that already has a public/private key pair set up for logging in.
 
woolmilkporcCommented:

SERVERS="server1 server2 server3 server4"
for SERVER in $SERVERS
  do
    scp file.txt root@$SERVER:/root/.ssh
    echo "Server $SERVER has the file now!"
done
 
 
Deepak KosarajuCommented:
You can do it this way: define all the hosts in a file called hosts, then:
#!/bin/sh
for i in `cat hosts`
do
scp file.txt admin@$i:/root/.ssh
if [ $? == 0 ]
then
echo "Server has the file now!"
else
echo "Cannot copy the file!"
done

You can use expect to automate the password entry inside the script.
http://bash.cyberciti.biz/security/expect-ssh-login-script/
http://www.unix.com/shell-programming-scripting/28194-using-expect-script-shell-script.html

 
Deepak KosarajuCommented:
I forgot to close the if block:
#!/bin/sh
for i in $(cat hosts)
do
    scp file.txt admin@$i:/root/.ssh
    if [ $? -eq 0 ]
    then
        echo "Server has the file now!"
    else
        echo "Cannot copy the file!"
    fi
done
 
willlandymoreAuthor Commented:
I tried using my own username, but it comes back with the password prompt over and over again, so it must need to know where the private key is, otherwise it won't work, right?
 
Deepak KosarajuCommented:
Is your private key appended inside the following file on all the hosts that you are trying to log in to? That is, appended to the file ~/.ssh/authorized_keys?
 
woolmilkporcCommented:
Never give away your private key!
If your local private key is in some nonstandard location use
scp -i /path/to/identity_file ...
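A minimal sketch combining the loop examples earlier in the thread with an explicit identity file. The paths are assumptions to adjust; note that -i takes the private key file, not authorized_keys:

```shell
#!/bin/sh
# Hypothetical paths -- adjust the key location and host list to your setup.
KEY=$HOME/.ssh/id_rsa          # the PRIVATE key file, never authorized_keys

for SERVER in $(cat hosts)     # hosts: one IP or name per line
do
    if scp -i "$KEY" file.txt admin@"$SERVER":/root/.ssh/
    then
        echo "Server $SERVER has the file now!"
    else
        echo "Cannot copy the file to $SERVER!" >&2
    fi
done
```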
 
 
willlandymoreAuthor Commented:
Hey kosarajudeepak,

I used the script you had there and I get:

./ssh_copy.sh: line 10: syntax error near unexpected token `done'
./ssh_copy.sh: line 10: `done'

when I run it....
 
mccrackyCommented:
Another way to set it up is to have each server pull the new authorized_keys file down. You can run the rsync daemon on the central server and have each client, through cron, rsync the new authorized_keys file down every x minutes/hours/days. Just another thought on how you could set things up.

BTW, if you think you have the keys set up correctly and you still get prompted for the password for ssh key authentication, you need to check the permissions on the directories and files.  If the permissions are too open, key authentication won't work.  Check the logs on the receiving end of the connection and it will point you to the permissions you need to change.
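The pull approach could be sketched like this (the module name, export path, and schedule are all hypothetical):

```
# /etc/rsyncd.conf on the central server: export the key file read-only
[keys]
    path = /srv/keys
    read only = yes

# crontab entry on each client: pull the file once an hour
0 * * * * rsync -az central-server::keys/authorized_keys /root/.ssh/authorized_keys
```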
 
willlandymoreAuthor Commented:
Actually, with the script that I was trying from above I just get the error about the 'done'.
 
willlandymoreAuthor Commented:
My bad... I had 'if' at the bottom, not 'fi'.

Now I get the following:

It is recommended that your private key files are NOT accessible by others.
This private key will be ignored.
bad permissions: ignore key: /root/.ssh/authorized_keys
Enter passphrase for key '/root/.ssh/authorized_keys':
 
willlandymoreAuthor Commented:
it's reading the hosts though from the file because I can see several IP's that it's trying to connect to. It just won't let me use my key from the file to connect and run the scp.

so if I used something like this:

#!/bin/sh
for i in `cat hosts`
do
scp -i /root/.ssh/authorized_keys file.txt userX@$i:/root/.ssh
if [ $? == 0 ]
then
echo $i " has the file now!"
else
echo "Cannot copy the file!"
fi
done

where userX has root permissions on all the other boxes, would that allow me to connect to each box, or would I need to put the key of the physical server in the other authorized_keys files of the boxes it's trying to connect to?
 
Deepak KosarajuCommented:
As always, a normal user does not have permission to access root's files; you are trying to scp as a normal user into /root/.ssh, which is not allowed. So use expect to supply the root password inside the script and copy the file to authorized_keys.
 
willlandymoreAuthor Commented:
okay, I'll give that a shot
 
Deepak KosarajuCommented:
Note: the only file where you place authorized keys is ~/.ssh/authorized_keys, so make sure you place your public key there correctly in order to get access to the servers.
 
mccrackyCommented:
Look again at what I put in above. The permissions on your private key file are too open. It should be PRIVATE, as in only the owner can read it. Check the permissions on your private key file and the .ssh directory.
 
mccrackyCommented:
No, NO, NO!

kosarajudeepak is WRONG.  DON'T put the root password anywhere in clear text!!!

As a normal user, you CAN ssh directly to the root account with keys. Your public key just needs to be included in the /root/.ssh/authorized_keys file.

You need to check that on the receiving end of the connection, sshd_config contains

PermitRootLogin without-password

(or "yes", but "without-password" is more secure, only allowing direct root login with key authentication.)

You also need to check on the receiving end that the .ssh/authorized_keys file and the parent directories (.ssh and the user's home directory) don't have their permissions too open.

And, according to the message above, the private key on the initiating end (and its parent directories) must not have its permissions too open.
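Installing the public key on each server could be sketched like this (a hypothetical one-time setup loop; ssh-copy-id ships with OpenSSH clients, and the hosts file is the one from the earlier comments). Each run still prompts for root's password once per server:

```shell
#!/bin/sh
# One-time setup: append the local public key to root's authorized_keys
# on every server listed in the hosts file (one IP or name per line).
PUBKEY=$HOME/.ssh/id_rsa.pub

for SERVER in $(cat hosts)
do
    ssh-copy-id -i "$PUBKEY" root@"$SERVER"
done
```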
 
willlandymoreAuthor Commented:
yeah, we have that setup already. We have public keys on the servers so that we can use the private ones and go in as root.

I'm not sure I understand how to fix the permissions issue, though, so that I can use my user account to run the command to copy the file...
 
mccrackyCommented:
Test things outside the script first.

Step one:  
  -  Make sure you can ssh directly to root (ssh root@<ip or fqdn>) without a password.  If you get the error above about the private key file, then check that your local .ssh directory has permissions of 700 and the files in it of 600 and that your home directory is 755 or "less".
  -  If you don't get the error above about the private key file, but still get a password prompt, you need to check the logs on the server end (/var/log/messages or /var/log/secure -- wherever sshd writes to) see where things are going wrong.  Probably permissions on the receiving end similar to above.  Set the permissions tighter like the above on the receiving end too.

Step two:
  -  After you can log in directly with ssh, it's just a matter of getting the scp commands and script to run.
  - Reread the man page of scp: the "-i" option points to your private key file, not the authorized_keys file.
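Step one's permission tightening can be sketched as follows (a minimal sketch; run it as the connecting user, and do the same for the target account on the receiving end):

```shell
#!/bin/sh
# Tighten the permissions ssh/sshd check before accepting key authentication.
mkdir -p "$HOME/.ssh"            # harmless if the directory already exists
chmod 755 "$HOME"                # home directory: 755 or stricter
chmod 700 "$HOME/.ssh"           # .ssh directory: owner-only

for f in "$HOME/.ssh/id_rsa" "$HOME/.ssh/authorized_keys"
do
    if [ -f "$f" ]; then
        chmod 600 "$f"           # key files: owner read/write only
    fi
done
```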
 
willlandymoreAuthor Commented:
okay, makes sense. I'll run through those tests. Thanks.
 
willlandymoreAuthor Commented:
Okay, I tried on one server and checked sshd_config, and the PermitRootLogin without-password setting is there.

I also adjusted the permissions to 600 on the files and 700 on the .ssh directory, on both the server and the machine connecting; however, I still get prompted for a password when I try ssh root@server.

Any thoughts?
 
willlandymoreAuthor Commented:
Here's the output from the log:

remote_server sshd[1998]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=172.16.0.236  user=root
remote_server sshd[1998]: Failed password for root from SERVER1 port 53699 ssh2
 
willlandymoreAuthor Commented:
Okay, I think I got it working. I put it on the central box and then I generated a key for that box. Then I put that key on all the servers in the known_hosts file and then ran the script from the central server. It asks if I want to connect to them on the first try, but after that it goes through without a hitch.
 
mccrackyCommented:
What key did you generate and put in the known_hosts file?  I hope not the private key.  You need to put the public key (id_rsa.pub or the like) into the authorized_keys file.  

The known_hosts file is managed by ssh itself when you connect to a server the first time. Then, if the key of the server to which you are connecting changes, it will complain, warning you that you might not be connecting to the server you think you are.

The first time you ran the script, you needed to accept the signature of each server to add it to the known_hosts file.  From then on, it should run without interaction.  If, in the future, you reload a server you will need to delete the entry in the known_hosts file or it won't connect and will throw up a warning about a possible security breach.  You can delete the entry in the known_hosts by using the command "ssh-keygen -R <ip or fqdn of changed server>".
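If you do want to pre-populate known_hosts so that an unattended first run never prompts, the host signatures can be collected with ssh-keyscan (a sketch; the hosts file is the one from the earlier comments):

```shell
#!/bin/sh
# Collect each server's host key once, so the first scp needs no "yes".
KNOWN=$HOME/.ssh/known_hosts

if [ -f hosts ]; then
    ssh-keyscan -t rsa $(cat hosts) >> "$KNOWN"
fi
```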
 
willlandymoreAuthor Commented:
yeah, the public key of the server goes in the authorized_keys file.

Then I pre-added the entries into known_hosts so I didn't get the warning... sorry, that wasn't very clear.
 
mccrackyCommented:
But the known_hosts file is used by the client (connecting) side and sits in the client's .ssh directory, not on the server (receiving) side.  You don't need to pre-add anything.  It's the sshd server's signature that is put into the ssh client's known_hosts file.  These are added to the known_hosts file when you connect the first time (as happened the first time you ran your script).
 
willlandymoreAuthor Commented:
ok
 
willlandymoreAuthor Commented:
I just got an error on one server (no route to host), but when I took the entry in the known_hosts file from another server and put it on that box the error went away. That's why I was assuming it was required.
 
mccrackyCommented:
Coincidence. "no route to host" is a routing or DNS issue, and since it cleared up, it was probably a temporary outage or some other thing (a typographical error in the FQDN or IP?).
 
willlandymoreAuthor Commented:
Well, I'm using the IP so I don't know what to say, and I was on that host at the time, but whatever.
 
willlandymoreAuthor Commented:
Problem solved.
