Solved

Transferring a lot of files with long names

Posted on 1998-07-15
532 Views
Last Modified: 2013-12-27
Hi,

   I need to move many files (thousands of them) from one machine to another (Solaris 2.6 to SunOS 4.1.4); these files have very long filenames.

   My first attempt was tar, but it fails miserably (the long-filename problem). ufsdump is out of the question because the files are on a vxfs (Veritas) filesystem, which is not supported on the SunOS machine.

   I tried pkzip, but ownership and permissions are not maintained by unzip (and pkzip, by the way, does not exist for SunOS). gzip is of no use.

  Any ideas?

  I am desperate !!!!

Thanks.

   
0
Question by:jlms
19 Comments
 
LVL 5

Expert Comment

by:n0thing
ID: 2008826
Have you tried to NFS-mount it? E.g. mount the Solaris disk on the SunOS machine (or the reverse) and just copy/move the files over?
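
Something like this ought to do it (hostname and paths are just placeholders, and I'm assuming you can export the filesystem on the Solaris side):

on Solaris (as root):

# share -F nfs -o ro /export/files

on SunOS:

# mount -t nfs solarisbox:/export/files /mnt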
0
 
LVL 4

Author Comment

by:jlms
ID: 2008827
I am sorry, I forgot to add that there is a firewall between the two machines and the only means of communication are ftp and telnet; for that reason I can't use NFS.


0
 
LVL 1

Expert Comment

by:hajek
ID: 2008828
tar does not handle long names? What version of tar?
Tar is a good method - try to get a newer tar.
(But I am really surprised that your tar has problems of this kind.)
0
 
LVL 4

Author Comment

by:jlms
ID: 2008829
The tar used to recover the files is the one shipped with SunOS 4.1.4; its man page says the length of the name is restricted to a maximum of 100 chars.
0
 
LVL 1

Expert Comment

by:hajek
ID: 2008830
And you have file names longer than 100 chars? Well, in this case tar really CAN have trouble... and users too...
I do not know how to help you, but can you give us an example of such a _really_looooooooong_name? What is it used (and good) for?

0
 
LVL 51

Expert Comment

by:ahoffmann
ID: 2008831
Have you tried GNU's tar?
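
GNU tar has its own header extension for names over 100 chars, so with gtar on both ends something like this should work (directory names are placeholders):

on Solaris:

# cd directory_with_your_files
# gtar cf archive.tar .

on SunOS, after ftp-ing archive.tar over in binary mode:

# cd destination_directory
# gtar xpf archive.tar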
0
 
LVL 4

Author Comment

by:jlms
ID: 2008832
About the example: it is one of the directories of a Sun package (SUNWspro, whatever that is). On top of that, the package is installed 3 directories deep in the tree. I don't think it matters to paste the actual name here; the fact is that it is too long for the SunOS tar to handle properly.

I tried GNU tar, but it crashes or tells me that some of the files don't exist (maybe because they are on a Veritas filesystem?). It could be that the version of GNU tar I have is old, so it would help if somebody could point me to a place where I can get it (I need binaries for SunOS because we don't have a compiler there).
0
 
LVL 1

Expert Comment

by:rgmisra
ID: 2008833
Horribly inefficient, but this may work for you. The "shar" program encodes files as a shell script. It shouldn't have filename-length limitations. However, it also uuencodes all the files, which results in a large increase in size.
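
Something like this, assuming GNU shar from the sharutils package (-S reads the list of files from stdin; the flags may differ in other shar versions):

# cd directory_with_your_files
# find . -type f -print | shar -S > archive.shar

and on the other machine, after transferring archive.shar:

# cd destination_directory
# sh archive.shar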

I can give you a copy of the SunOS 4 gtar binary.
0
 
Expert Comment

by:vkurland
ID: 2008835

What about cpio ? Something like this:

on Solaris:

# cd directory_where_you_have_your_files
# find . -print | cpio -o > archive.cpio

on SUN OS:

# cd directory
# cat archive.cpio | cpio -idmv

cpio should not have a problem with filename length. There are also some command line switches for better compatibility, but I don't think these are needed as you move files from one SUN to another SUN.
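
If the header format does turn out to be a problem, my guess is that the -c switch (portable ASCII headers) on both ends is the one to try:

# find . -print | cpio -oc > archive.cpio
# cpio -icdmv < archive.cpio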

0
 
LVL 4

Author Comment

by:jlms
ID: 2008836
cpio is limited to 128 characters in SunOS (better than tar, but not good enough). Something that accepts around 200 characters should do the trick. Also, the differences between Solaris and SunOS are sometimes dreadful. Actually, I was reading that the default header format for cpio in Solaris is not the best for compatibility with SunOS...

About shar: is it a free utility or part of the OS? I don't care about space (that is not an issue in this case), but I need something that bundles the whole set of files together so I can ftp it, preserving ownership and permissions.

0
 
LVL 1

Expert Comment

by:rgmisra
ID: 2008837
There is a GNU shar utility, and I can send you the Sun SPARC binaries (or I may be able to compile other binaries for you, depending on the machine type).

0
 
LVL 1

Expert Comment

by:rgmisra
ID: 2008838
I just noticed that shar doesn't store symbolic links properly.
Is this a problem for you? If so, I can try to figure out a way
around it.
0
 

Expert Comment

by:vkurland
ID: 2008839
I see. Can you just run ftp from one machine to the other? If yes, then you can use the "mget *" command to transfer all the files automatically. You can automate the process using a .netrc file and a script which you run as "ftp -iv < scriptfile".
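
For example (remotehost, myuser, mypass and the directories are made up; ~/.netrc must be mode 600):

in ~/.netrc:

machine remotehost login myuser password mypass

in scriptfile:

open remotehost
binary
lcd /where/to/put/them
cd /where/the/files/are
mget *
quit

Note that mget does not descend into subdirectories, so you would have to repeat this per directory.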

shar, as a matter of fact, just uuencodes your files and combines the output with a simple shell-script header for auto-decoding. This will probably work if uudecode on SunOS does not have the same limitation on filename length. Also check whether shar handles spaces in file names properly.

I just tried to uuencode/uudecode a file with a long name (about 140 chars, with spaces in it). uuencode does its job, but uudecode cannot extract the file because it gets confused by the spaces in the name. I did this on SunOS 4.1.4.

If all this does not help, I can write a script for you which will rename your files to something short and build a table of name correspondences. You can then move the files somehow and run the script in reverse to restore the names. This is going to be 50 lines of perl, or even awk if you don't have perl there.
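
A minimal sketch of the idea in plain sh (untested, one directory at a time, no recursion; assumes no newlines in the names):

i=0
for f in *; do
    i=`expr $i + 1`
    echo "F$i $f" >> nametable
    mv "$f" "F$i"
done

and on the other machine, to restore the names:

while read short long; do
    mv "$short" "$long"
done < nametable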


0
 
LVL 4

Author Comment

by:jlms
ID: 2008840
I certainly need to handle links properly; there are plenty of them.

About using ftp, the problem is the permissions: I guess they will change to whatever the umask of the user receiving the files dictates, and the ownership will certainly be changed.

From all that has been discussed here I think GNU tar could do the trick; I just would like to find a trustworthy place on the Internet where I could get the binaries for SunOS 4.1.x (I can compile the thing on Solaris).

   Other backup utilities that write to disk files could be of interest, so I am willing to try anything.

   Certainly I don't want to write scripts for this; to me it looks like a very straightforward thing: put a directory in a file, transfer it using ftp, then recover the directory (or directories, I should say). The only drawback is the length of the filenames; the tool should handle unusually long filenames (and when I say filenames this includes the full absolute or relative path).

   Thanks for the offer to write some scripts for me; I don't want to bother anybody that much, and if that is the only solution I can certainly do it myself. I just would like to get a feeling for whether there is something else out there that does the job.
0
 
LVL 1

Expert Comment

by:chytrace
ID: 2008841
Hi,

           try ncftp 3.0 or later. It can download whole directory trees recursively. The ncftpget and ncftpput utilities are also available in binary form, so you can use them in shell scripts; these two can transfer whole trees recursively as well. Try the following:

       ftp://ftp.ncftp.com:/ncftpput

 where you can find precompiled binaries.

If SunOS itself can handle such long file names, then this utility does exactly what you want. Just download ncftpput for Solaris and move the files.
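
For example (host, user, password and directories are made up):

# ncftpput -R -u myuser -p mypass sunosbox /incoming directory_with_your_files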

Hope it helps

                    Radovan
0
 
LVL 3

Expert Comment

by:braveheart
ID: 2008842
Can you use rcp through your firewall?
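
If so, its -p flag tries to preserve modification times and modes, e.g. (host and paths are placeholders):

# rcp -p -r /source/directory remotehost:/destination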
0
 
LVL 4

Author Comment

by:jlms
ID: 2008843
SunOS is the problem; it can't handle long file names.
0
 

Accepted Solution

by:
samarium earned 200 total points
ID: 2008844
Take the file system off line from users and perform a backup.

Generate a listing of the file space using find, and use awk to convert that to a "shorten" script that uses mv to rename the files and directories to generated short names, i.e. F1, F11, F123, etc. If the directory names are long and deep, such that the combined directory and file names exceed the limit on your machine (probably 1023), you will need to intersperse appropriate cd commands to descend into each directory as you process it, rather than just use the simpler "mv longpath shortpath". Also generate a "lengthen" script to do the reverse operation. Test both the shorten and lengthen operations in a "safe" area. Make sure you test the lengthen script on the remote machine too.

After you are sure you can generate reliable shorten and lengthen scripts, back up the file space, shorten it, tar up the file space, and transfer it and the lengthen script to the remote machine. Untar the files and run the lengthen script.
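
A rough sketch of the generator part (untested; it assumes no whitespace or quotes in the names, uses find -depth so files are renamed before the directories containing them, and renames each entry within its own directory; the lengthen script is the same table replayed in reverse order, and if the combined long paths exceed the system limit you would intersperse cd commands as described above):

# cd directory_with_your_files
# find . -depth -print | nawk '
    $0 == "." { next }
    { n++
      dir = $0; sub("/[^/]*$", "", dir)
      short = dir "/F" n
      printf "mv \"%s\" \"%s\"\n", $0, short >> "/tmp/shorten.sh"
      printf "mv \"%s\" \"%s\"\n", short, $0 >> "/tmp/lengthen.tmp"
    }'
# tail -r /tmp/lengthen.tmp > /tmp/lengthen.sh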

0
