Code Optimization

Hi,

I am trying to insert a LineFeed after every 100th character in my file.
I am using the following code for this.
It takes a long time when the file has around 100,000 characters, so can you suggest a way to make it faster?


Code:

char *file;                      /* input file name  (set elsewhere) */
char *file1;                     /* output file name (set elsewhere) */

unsigned long offset = 0;
char   buf[102] = {0};           /* 100 data bytes plus room for the inserted byte */
HANDLE hFile, hFile1;
DWORD  dnbytes = 0;
DWORD  dwact = 0;
DWORD  lowsize = 0;
DWORD  hisize = 0;
int    nStartIndex = 0;

/* open the input file */
if ((hFile = CreateFile(file,
                        GENERIC_READ,
                        0,
                        NULL,
                        OPEN_EXISTING,
                        FILE_FLAG_RANDOM_ACCESS,
                        NULL)) == INVALID_HANDLE_VALUE)
    return 1;

lowsize = GetFileSize(hFile, &hisize);
if (lowsize == 0xffffffff)
{
    CloseHandle(hFile);
    return FAILURE;
}

/* open the output file */
if ((hFile1 = CreateFile(file1,
                         GENERIC_WRITE,
                         0,
                         NULL,
                         OPEN_EXISTING,
                         FILE_FLAG_RANDOM_ACCESS,
                         NULL)) == INVALID_HANDLE_VALUE)
{
    CloseHandle(hFile);
    return FAILURE;
}

/* copy the file 100 bytes at a time, appending 0x0d to each block */
while (1)
{
    dnbytes = 100;

    if (!ReadFile(hFile, buf, dnbytes, &dwact, 0))
    {
        CloseHandle(hFile);
        CloseHandle(hFile1);
        return SUCCESS;
    }

    /* insert 0x0d at the end */
    if (dnbytes != dwact)        /* short read: end of file reached */
    {
        dnbytes = dwact;
        if (dwact != 0)
            buf[dwact] = 0x0d;
        CloseHandle(hFile);
        CloseHandle(hFile1);
        return SUCCESS;
    }
    else
    {
        buf[dnbytes] = 0x0d;
    }

    dnbytes++;                   /* write the 100 bytes plus the inserted 0x0d */

    if (!WriteFile(hFile1, &buf[nStartIndex], dnbytes, &dwact, NULL))
    {
        CloseHandle(hFile);
        CloseHandle(hFile1);
        return FAILURE;
    }

    if (dnbytes != dwact)
    {
        CloseHandle(hFile);
        CloseHandle(hFile1);
        return FAILURE;
    }

    Sleep(1);
}
return 0;

Thanks in Advance,
Varadha
stsanz commented:
Use the FILE_FLAG_SEQUENTIAL_SCAN flag in both CreateFile calls instead of FILE_FLAG_RANDOM_ACCESS.
This may improve file buffering.
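For reference, a minimal sketch of that change on the read side, assuming the rest of the call stays exactly as in the question (the write-side CreateFile would change the same way):

Code:

hFile = CreateFile(file,
                   GENERIC_READ,
                   0,
                   NULL,
                   OPEN_EXISTING,
                   FILE_FLAG_SEQUENTIAL_SCAN,   /* hint that the file is read front to back */
                   NULL);
if (hFile == INVALID_HANDLE_VALUE)
    return 1;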

drnick commented:
You could also speed things up a little by reading and writing larger blocks:
make your buffer n*100 + n bytes large,
read n*100 bytes,
then use memmove or RtlMoveMemory to spread the pieces out so there is room for one inserted byte after every 100-byte piece, working backwards from the last piece so nothing gets overwritten, for example

RtlMoveMemory(&buf[(n-1)*101], &buf[(n-1)*100], 100);
RtlMoveMemory(&buf[(n-2)*101], &buf[(n-2)*100], 100);

or do it in a loop, depending on how many bytes were actually read,
and then write the whole block out again.
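For what it's worth, here is a rough sketch of that idea using two buffers instead of shifting in place, which keeps the index arithmetic simpler. This is only a sketch, not a drop-in replacement: CHUNK, inbuf, outbuf, dwRead, dwWritten and the minimal error handling are illustrative; hFile, hFile1, SUCCESS and FAILURE are taken from the question; malloc/free need <stdlib.h> and memcpy needs <string.h>. It also leaves out the Sleep(1) from the original loop.

Code:

/* sketch: batched replacement for the while(1) loop in the question */
#define CHUNK 4096                           /* 100-byte pieces per ReadFile call; tune as needed */

char  *inbuf  = (char *)malloc(CHUNK * 100);
char  *outbuf = (char *)malloc(CHUNK * 101); /* room for one 0x0d per 100-byte piece */
DWORD  dwRead = 0;
DWORD  dwWritten = 0;
DWORD  i, out, piece;

if (inbuf == NULL || outbuf == NULL)
{
    free(inbuf);
    free(outbuf);
    CloseHandle(hFile);
    CloseHandle(hFile1);
    return FAILURE;
}

for (;;)
{
    if (!ReadFile(hFile, inbuf, CHUNK * 100, &dwRead, NULL) || dwRead == 0)
        break;                               /* end of file or read error */

    /* expand: copy each 100-byte piece and append 0x0d after it */
    out = 0;
    for (i = 0; i < dwRead; i += 100)
    {
        piece = (dwRead - i < 100) ? (dwRead - i) : 100;
        memcpy(&outbuf[out], &inbuf[i], piece);
        out += piece;
        outbuf[out++] = 0x0d;
    }

    /* one WriteFile per block instead of one per 100 bytes */
    if (!WriteFile(hFile1, outbuf, out, &dwWritten, NULL) || dwWritten != out)
        break;                               /* write error */
}

free(inbuf);
free(outbuf);
CloseHandle(hFile);
CloseHandle(hFile1);
return SUCCESS;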