Copying a file ???

Posted on 2003-04-01
Medium Priority
Last Modified: 2012-05-04
Hi people...
I have a FOR loop that is making many copies of files (a backup system). Here's the situation:
For the first files (around 600) it copies really fast. But after those 600 files, it starts to get slower and slower.
How can I keep the same speed for all the files? (By "same speed" I mean the speed of the first 600 files.)
Thanks a lot!
Question by:BrunoMS
LVL 18

Expert Comment

ID: 8246199
Is this a program you have written?  If so, in what language?

Are you copying to a blank directory?  Maybe as the directory fills, it takes longer for a new file to be created.
LVL 101

Expert Comment

ID: 8246532
Are you sure there is a slowdown?
Are the first several hundred files similar in size to the last several hundred?


Expert Comment

ID: 8246603
Is the destination on a networked drive? Is there any implicit ordering of the copies (e.g. smaller files first)?
Is it the bytes/sec that is decreasing, or the files/minute?

I'd go along with deighton's number-of-files-in-a-directory theory, though. Perhaps you could fill a backup1 directory with 600 files, then create a new backup directory and copy the next 600 files into that one? How much control do you have?
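The asker's original FOR loop isn't shown, but the split-into-subdirectories idea above can be sketched in Python (the directory names and function name here are illustrative, not from the thread):

```python
import os
import shutil

def copy_in_batches(files, dest_root, batch_size=600):
    """Copy files into numbered subdirectories (backup1, backup2, ...),
    capping each directory at batch_size files so no single destination
    directory grows large enough to slow down file creation."""
    for i, src in enumerate(files):
        # Start a new subdirectory every batch_size files.
        subdir = os.path.join(dest_root, "backup%d" % (i // batch_size + 1))
        os.makedirs(subdir, exist_ok=True)
        shutil.copy2(src, subdir)
```

With batch_size=600 this matches the suggestion above: the first 600 files land in backup1, the next 600 in backup2, and so on.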


LVL 24

Expert Comment

ID: 8247410
The Windows OS slows down when managing more than a few hundred changes to a directory structure; it is part and parcel of Windoze maintaining a file system whose access is sorted for the GUI. Directory updates are also buffered rather than written promptly, so that eventually, after making copy ####, your code can try to access a file and get "file not found". So, some workarounds:

1) Create more processes, each copying fewer files at a time, and wait for one to finish before starting the next.

2) Turn off Windoze's file-system maintenance features (things like FindFast) for the duration of your process.

3) Code for DOS, not Windoze. I've found that even in a DOS window, filesystem access is more prompt and stable, especially for large directories (many subdirectories, such as one per user, or many files - the greatest impact comes from the quantity of items Windoze has to display).

4) The length of the filename and path can have an impact. You can try shortening them, such as converting "\company_name\Backup Directory" to "\BkUp", though I haven't personally seen much improvement from that. I've also heard of problems with paths of 256+ characters, but haven't witnessed them myself (mine aren't that long). Just an FYI to be aware of.
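Workaround 1 above can be sketched in Python as well (a minimal illustration, assuming hypothetical helper names; the thread doesn't show the real code): each small batch runs in its own short-lived child process, and the parent waits for it before launching the next.

```python
import shutil
from multiprocessing import Process

def copy_chunk(srcs, dest):
    # Child-process worker: copy one small batch of files.
    for s in srcs:
        shutil.copy2(s, dest)

def copy_in_child_processes(files, dest, chunk_size=100):
    """Workaround 1: spawn a fresh process per chunk and wait for it
    to finish before starting the next, so no single process carries
    the accumulated overhead of thousands of copies."""
    for i in range(0, len(files), chunk_size):
        p = Process(target=copy_chunk, args=(files[i:i + chunk_size], dest))
        p.start()
        p.join()  # wait for this batch before launching the next
```

The serial join keeps only one worker alive at a time, which trades throughput for the bounded per-process state the comment describes.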
LVL 24

Expert Comment

ID: 8247453
On coding: take a peek at Task Manager while your code is running - you may find that each copy is a separate spawned process, which can lead to it being run virtually, meaning temporarily offloaded to disk, spending needless overhead on Windows' memory management. If so, either recode, live with it, or try eliminating the virtual store (pagefile) for the duration of the process. Then, as RAM fills up, your process will slow down because it cannot queue more copy requests - but avoiding the offload to disk can be a good tradeoff, since disk access is slower than RAM or CPU.

Author Comment

ID: 8262065
SunBow, thanks for the explanation...
So what can I do to make it faster?

Accepted Solution

billious earned 150 total points
ID: 8262660

This would seem to indicate that the first 600 files are being loaded into RAM/pagefile, and once that fills, the transfer slows while the transport to the final destination takes place. BrunoMS doesn't say, but if the REAL bottleneck here is a network (such that the first 600 files are merely queued locally), then attempting to prevent this queueing would ALSO slow down the first 600 files - wouldn't it?

I'd suggest that Windows overhead would be small in comparison with network speed restrictions, if a network is involved.

I'm perfectly willing to be shot down in flames about this - it's just speculation on my part.
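Before settling on any theory, it's worth confirming the slowdown empirically, as asked earlier (is it bytes/sec or files/minute that drops?). A small Python sketch (the function and its name are illustrative, not from the thread) that records cumulative throughput while copying:

```python
import os
import shutil
import time

def copy_with_throughput(files, dest, report_every=100):
    """Copy files one at a time and record cumulative KB/s after every
    report_every files. A steadily falling series suggests the later
    copies (or a filling cache/queue) really are slower, rather than
    the later files simply being larger."""
    rates = []
    t0 = time.monotonic()
    total_bytes = 0
    for n, src in enumerate(files, 1):
        total_bytes += os.path.getsize(src)
        shutil.copy2(src, dest)
        if n % report_every == 0:
            elapsed = time.monotonic() - t0
            rates.append(total_bytes / 1024.0 / elapsed)
    return rates
```

If the rate holds steady in KB/s but falls in files/minute, the later files are just bigger; if both fall, something like the queueing or directory-size effects discussed above is in play.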


Author Comment

ID: 8334422
Thank you.
"Good" because I didn't like to hear that. But it's the reality. Thank you.


Question has a verified solution.
