I've noticed something that really has me wondering.
I administer a 10BaseT network. Just straight 10BaseT, not 100BaseT. I'm running Windows NT Server 5.0 SP5 and Windows 95B (OSR2). I run only NetBIOS over TCP/IP.
I set up computers for the company. I make my own rescue CDs. I use an unnamed piece of software to make images of hard drives, then I burn them to CD.
The software I use is a DOS program. I'm running this software in a DOS window on Windows 95. This is very important. IT'S NOT DOS MODE. IT'S A DOS WINDOW. WINDOWS 95 SITS IN THE BACKGROUND WHILE THE IMAGE IS MADE.
While doing this, I brought up perfmon (the performance monitor that comes with Windows NT). I checked the speed at which the image was being written to the server. I was blown away. 700,000 bytes per second!!!! I have used perfmon several times. Using the Windows 95 copy functions that are built into the shell, I've reached speeds of only 300,000 to 400,000 bytes per second.
I can copy the exact same image file to the server via the Windows 95 copy commands at only 300,000 bytes per second. If I were using the software in DOS mode, I could see why, but I'm not. If a program running in a DOS window under Windows 95 can copy information at 700,000 bytes per second, there is no reason Windows 95 can't copy information at 700,000 bytes per second as well.
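For what it's worth, one classic explanation for this kind of gap is the read/write block size each program uses: a DOS imaging tool often reads and writes in much larger chunks than the shell's copy routine, so fewer round trips are needed per byte moved. I can't see inside either program, so this is only a guess, but the effect itself is easy to demonstrate. The sketch below (Python, purely illustrative — the file names and sizes are made up for the demo) copies the same scratch file with a small buffer and a large buffer and reports the throughput of each:

```python
import os
import tempfile
import time

def copy_rate(src, dst, bufsize):
    """Copy src to dst using bufsize-byte reads; return bytes per second."""
    start = time.perf_counter()
    copied = 0
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while True:
            chunk = fin.read(bufsize)
            if not chunk:
                break
            fout.write(chunk)
            copied += len(chunk)
    elapsed = time.perf_counter() - start
    return copied / elapsed if elapsed > 0 else float("inf")

# Hypothetical demo: make a 4 MB scratch file, then copy it once with a
# tiny 512-byte buffer and once with a 64 KB buffer, standing in for a
# small-block shell copy vs. a large-block imaging program.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "image.bin")
    with open(src, "wb") as f:
        f.write(os.urandom(4 * 1024 * 1024))
    small = copy_rate(src, os.path.join(d, "small.bin"), 512)
    large = copy_rate(src, os.path.join(d, "large.bin"), 64 * 1024)
    print(f"512-byte buffer: {small:,.0f} bytes/sec")
    print(f"64 KB buffer:    {large:,.0f} bytes/sec")
```

On most machines the large-buffer copy comes out well ahead, which would line up with what perfmon is showing here — but whether that's actually what the shell and the imaging tool are doing is exactly the question.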
I'm looking for a very educated explanation of why this happens.
Anyone who offers a tweak that can make Windows 95 copy at 700,000 bytes per second will get double the points.
I WILL ACCEPT ONLY COMMENTS AS ANSWERS!