A program I am responsible for writes a large amount of data to a binary file. Users can stop the process and exit, then restart it later. In that case, the old file is opened and just the data portions of the previous file (excluding the header info, time stamps, etc.) are copied to the new file; then the process continues to collect data and write it to the file (basically appending the new data to the old).

This copying of the data is the slowdown for me. For example, 4000 pieces of previous data take about 30-40 seconds to write to the new file, and during that time the user just sits there and waits.

I am using Rad Studio 2007 C++ as the IDE, and the function I use for file output is _rtl_write. Does anyone have a better way of doing this? Would plain old fwrite be faster? Memory-mapped files? Suggestions are appreciated.