How to copy an OS X drive?

I have a Mac with OS X 10.2. It is an office file server, so multiple users connect to it. There is a shared drive with different folders that have different access permissions for different users. This drive is almost full, so I want to copy everything from it to a larger new drive, then remove the old one and replace it with the new one. How is this done? Do I need to be root?
Asked by gharnett

3 Solutions
 
brettmjohnsonCommented:
You might not need to be root, but you do need sufficient privileges to read all the data you wish to copy.
To dup disks under Mac OS X, I use an excellent utility called Carbon Copy Cloner:

http://www.bombich.com/software/ccc.html

 
weedCommented:
You don't need root, but you do need to be an administrator. Some like Carbon Copy Cloner, but I use SuperDuper!, which has a much slicker interface and is somewhat faster.
 
Andrew DuffyTechnical Services CoordinatorCommented:
I have noticed that CC Cloner can be slow - it took a whole weekend to copy 450GB of data between ultra-wide RAIDs! However, it did work completely flawlessly (apparently).
 
Scorp888Commented:
To back up everyone else, CCC works fine.
 
grexxCommented:
I have used CCC several times to clone a complete system from one hard drive (or partition) to another. It worked perfectly! So for your (less demanding) job, it should do everything you want. You do need admin rights, but you don't need to log in as root.
 
gharnettAuthor Commented:
Both CCC and SuperDuper take a very long time to copy a 100GB HD. I haven't switched drives yet, so I'm not sure who gets the points. If someone can tell me how to speed it up to less than 12 hrs, I'll give them the points plus some bonus points too :)
Thanks people.
 
Scorp888Commented:
Well, you can look at tarring and zipping large files up, or use StuffIt to compress them.

I've just done this, and it reduced the time by about 30%, but that was because I had a large number of documents that compress well. If you have lots of already-compressed files (JPEGs, DMGs, ZIP files, that sort of thing), the gain won't be as good.
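
A minimal sketch of that tar route, assuming hypothetical mount points /Volumes/OldShare and /Volumes/NewShare (tar keeps the Unix permissions and ownership, which matters on a multi-user share, but it won't carry HFS resource forks):

cd /Volumes/OldShare
sudo tar czf /Volumes/NewShare/share.tar.gz .                      # compressed archive written straight to the new disk
sudo tar xzpf /Volumes/NewShare/share.tar.gz -C /Volumes/NewShare  # unpack on the new disk when you're ready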

 
Andrew DuffyTechnical Services CoordinatorCommented:
Good idea, but because the data will be offline while it's being compressed, you need to take that time into account in the overall process. So unless (time taken to compress files) + (time taken to copy compressed files) is quicker than a straight copy, it's not really worth doing.

I don't think the program is going to be your bottleneck - it'll be the disk interface. What interface does the shared disk use? There are two alternatives: one that is tried & tested, and one that is a 'plan B' - not quite ideal, but it might work better:

Tried & Tested
Put both drives into a machine on separate channels. Boot that machine from an OS X system CD and run Disk Utility. Click one of the disks and choose the 'Restore' tab. From the list of disks, drag the old existing drive into the Source box and drag the new drive into the Destination box. Check the 'Erase Destination' box (and make sure your disks are the right way round!) and click Restore. The data will then stream from one drive to the other, and because there are no other system processes going on, it should be quicker. I do this all the time and it works a treat.

Plan B
As above basically but using a PC and Norton Ghost. Ghost won't take advantage of some advanced disk features but as long as the BIOS supports the drives fully it should still be quick. I've never done this with a Mac disk though so I don't know if the software is actually capable of ghosting Macintosh partition data. Worth a shot though.
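
(For reference, the same copy can be done from the command line with asr, the Apple Software Restore tool that - as far as I know - sits behind Disk Utility's Restore pane. A hedged sketch, with hypothetical volume names; the flag syntax differs between OS X versions, so check man asr first:

sudo asr -source /Volumes/OldShare -target /Volumes/NewShare -erase

Newer releases spell it asr restore --source ... --target ... --erase. Either way, run it while booted from a volume other than the two disks involved.)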
 
jonberghCommented:
Is the "shared drive" you speak of the same as the boot drive? Then yes, like everybody is saying, run Carbon Copy Cloner or SuperDuper.

If it's all on a different physical disk and you feel like it, hook a second disk up and just drag stuff over. 100GB is a lot, though. If the server is staying up and in use while this is going on, you might look at running rsync locally (sketched below)... the first run would take a while, but then you could keep running it periodically until you were ready to kick everyone off, run rsync one last time, then bounce the box and pull out the old disk.
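
A rough sketch of that rsync approach, with hypothetical mount points (note that the stock rsync on older OS X releases does not preserve HFS resource forks or Finder metadata, so test it on a small folder first, or use a fork-aware build such as RsyncX):

sudo rsync -a /Volumes/OldShare/ /Volumes/NewShare/           # long first pass while the server stays up
sudo rsync -a --delete /Volumes/OldShare/ /Volumes/NewShare/  # quick final pass after everyone is off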
 
Andrew DuffyTechnical Services CoordinatorCommented:
I suppose it's a question of which solution (out of CCCloner, SuperDuper and Disk Utility) the guy used in the end.
 
Scorp888Commented:
And whether he zipped and tarred it.

He suggests that the points should go (with bonuses) to whoever can speed it up - well, that would be me, if of course he actually did it.

I'd be happy with a 3-way split.

 
Andrew DuffyTechnical Services CoordinatorCommented:
Yeah, for the sake of resolution I'll go with that. :0)
 
Scorp888Commented:
I seem to have about 40 emails at the moment, all with a 50/50 split with someone (mostly weed) where the solution could have been one of two things, and the person hasn't got back, so no one knows who was right.

 
VenabiliCommented:
>>I seem to have about 40 emails at the moment
Yep.. trying to get the old questions closed ..
 
Scorp888Commented:
Oh yes, don't take it the wrong way Venabili, you're doing a great job!

 
VenabiliCommented:
Sorry guys,

I have points for only a 3-way split.
The first two comments give the alternative solution, and the third is the one I also selected...
Sorry... I wish I had 5 more points here...
 
Scorp888Commented:
The only comment I would make is to go back to the poster's last comments:

"Both CCC and SuperDuper take a very long time to copy a 100GB HD. I haven't switched drives yet, so I'm not sure who gets the points. If someone can tell me how to speed it up to less than 12 hrs, I'll give them the points plus some bonus points too :)
Thanks people."

The answer to that is zip and tar.

 
Andrew DuffyTechnical Services CoordinatorCommented:
In that case, I feel I have to reiterate my point: presuming (logically) that the 12 hrs is in the context of minimising downtime, zip and tar would not necessarily be the best solution, because the data obviously needs to be offline while it's being compressed. So unless (time taken to compress files) + (time taken to copy compressed files) is quicker than a straight copy, it's no quicker.
 
Scorp888Commented:
Nope, you could compress whilst you're transferring the files to disk.

Something along the lines of:

tar cf - . | compress > /Volumes/newdisk/backup.tar.Z

Just a quick example, but it should show the way - the compression happens in the pipe as the data is written to the new disk.

 
Andrew DuffyTechnical Services CoordinatorCommented:
Okay, my shell experience clearly isn't as extensive as yours, but I still would have thought that total time = compression time + transfer time; it's merely occurring file-by-file in this case. I'd be delighted if I was wrong, as this is potentially a very useful method.
 
Scorp888Commented:
This is more for fun than to do with the question now, although I guess it's still useful.

Well, we're both potentially right.

How?

Well, if the transfer medium is slower than the compression, then I'm right.

If the transfer medium is faster than the compression, then you are correct.

Example:

1MB file, no compression, takes 1024 seconds to transmit. Total time: 1024 seconds.
1MB file takes 1 second to compress to 100K, which then takes 100 seconds to transmit. Total time: 101 seconds.

Saving: 923 seconds.

However:

1MB file takes 100 seconds to transmit. Total time: 100 seconds.
1MB file takes 99 seconds to compress to 100K, which then takes 10 seconds to transmit. Total time: 109 seconds.

Mostly, it's quicker to compress and send.

However, with a slow machine and a fast FireWire drive it's going to be close.

Also, for the shell you'd probably use find, not ls (see the sketch at the end of this comment).

If you're really interested I can dig something out for you.
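
(For illustration, a hedged sketch of that find-based variant, using cpio to take the file list on stdin - paths are hypothetical, and it runs from the top of the old share:

cd /Volumes/OldShare
find . -depth -print | cpio -o | compress > /Volumes/newdisk/backup.cpio.Z

Just one way of wiring find into the same compress-as-you-copy idea.)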

 
Andrew DuffyTechnical Services CoordinatorCommented:
Hehe, nah, I think you've indulged me enough, thanks. I see what you mean now, and I'm surprised I hadn't considered the difference compression could make - 900-odd seconds multiplied by 100,000 files is a lot of minutes saved.

Obviously, though, this is based on your example compression ratio of 10:1, so in this instance the question of whose answer is closest would be answered by finding out what sort of data is on the server in question. If it's video or graphics then of course there's not much to be gained from your method.

Cheers!
 
Scorp888Commented:
Just a final point, and it's to reiterate what you state.

If you had a disk of zipped files, or compressed JPEGs, then you're correct - you'd not see much gain.

Same, really, for a disk full of MP3s.

But for most people a backup is a mixture, and for business it's lots of text files, whether they be Word documents, Excel spreadsheets, or SQL dumps.

