Code Optimization

Solved | Posted on 2003-11-03 | Medium Priority | 275 Views | Last Modified: 2010-04-17
Hi,

I am trying to insert a line feed after every 100th character in my file.
I am using the following code for this.
It takes a long time when the file has around 100,000 characters. Can you suggest a way to do it faster?


Code:

char *file;
char *file1;

unsigned long offset = 0;
char          buf[102] = {0};
HANDLE        hFile, hFile1;
DWORD         dnbytes = 0;
DWORD         dwact = 0;
DWORD         lowsize = 0;
DWORD         hisize = 0;
int           nStartIndex = 0;

// open the source file for reading
if ((hFile = CreateFile(file,
                        GENERIC_READ,
                        0,
                        NULL,
                        OPEN_EXISTING,
                        FILE_FLAG_RANDOM_ACCESS,
                        NULL)) == INVALID_HANDLE_VALUE)
    return 1;

lowsize = GetFileSize(hFile, &hisize);
if (lowsize == 0xffffffff)
{
    CloseHandle(hFile);
    return FAILURE;
}

// open the destination file for writing
if ((hFile1 = CreateFile(file1,
                         GENERIC_WRITE,
                         0,
                         NULL,
                         OPEN_EXISTING,
                         FILE_FLAG_RANDOM_ACCESS,
                         NULL)) == INVALID_HANDLE_VALUE)
{
    CloseHandle(hFile);
    return FAILURE;
}

// copy the file 100 bytes at a time, appending 0x0d after each block
while (1)
{
    dnbytes = 100;

    if (!ReadFile(hFile, buf, dnbytes, &dwact, 0))
    {
        CloseHandle(hFile);
        CloseHandle(hFile1);
        return SUCCESS;
    }

    // insert 0x0d at the end
    if (dnbytes != dwact)
    {
        // short read: end of file reached
        dnbytes = dwact;
        if (dwact != 0)
            buf[dwact] = 0x0d;
        CloseHandle(hFile);
        CloseHandle(hFile1);
        return SUCCESS;
    }
    else
    {
        buf[dnbytes] = 0x0d;
    }

    dnbytes++;

    if (!WriteFile(hFile1, &buf[nStartIndex], dnbytes, &dwact, NULL))
    {
        CloseHandle(hFile);
        CloseHandle(hFile1);
        return FAILURE;
    }

    if (dnbytes != dwact)
    {
        CloseHandle(hFile);
        CloseHandle(hFile1);
        return FAILURE;
    }

    Sleep(1);
}
return 0;

Thanks in Advance,
Varadha
Question by: Varadha2k
5 Comments
 
Accepted Solution by stsanz (LVL 6), earned 80 total points
ID: 9670691
Use parameter FILE_FLAG_SEQUENTIAL_SCAN in both CreateFile calls instead of FILE_FLAG_RANDOM_ACCESS.
This may improve file buffering.
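
For reference, only the flag argument changes in the asker's two CreateFile calls; a sketch, with the error handling exactly as in the original code:

Code:

hFile = CreateFile(file, GENERIC_READ, 0, NULL,
                   OPEN_EXISTING, FILE_FLAG_SEQUENTIAL_SCAN, NULL);

hFile1 = CreateFile(file1, GENERIC_WRITE, 0, NULL,
                    OPEN_EXISTING, FILE_FLAG_SEQUENTIAL_SCAN, NULL);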

 
Expert Comment by drnick (LVL 5)
ID: 9670743
You could also speed things up a little by reading and writing larger blocks.
Make your buffer n*100 + n bytes large,
read n*100 bytes,
and use memmove or RtlMoveMemory to shift the buffer contents, e.g.

RtlMoveMemory(&buf[n*100 - 1], &buf[(n-1)*100], 100);
RtlMoveMemory(&buf[(n-1)*100 - 2], &buf[(n-2)*100], 100);

and so on, perhaps in a loop, depending on how many bytes were actually read,
and then write the whole block out again.
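
A minimal sketch of that idea, assuming n = 100 chunks per ReadFile call and reusing the asker's hFile/hFile1 handles (the loop, variable names, and handling of a short final read are mine, not from the comment above):

Code:

#define CHUNK   100
#define NCHUNKS 100                         /* "n" above: chunks per ReadFile   */

char  big[NCHUNKS * (CHUNK + 1)];           /* room for a 0x0d after each chunk */
DWORD got = 0, written = 0;

while (ReadFile(hFile, big, NCHUNKS * CHUNK, &got, NULL) && got > 0)
{
    DWORD nchunks = (got + CHUNK - 1) / CHUNK;  /* full chunks plus partial tail */
    LONG  i;

    /* Shift chunks starting from the last one so nothing gets overwritten,
       then drop 0x0d into the one-byte gap opened up behind each chunk.    */
    for (i = (LONG)nchunks - 1; i >= 0; i--)
    {
        DWORD len = got - (DWORD)i * CHUNK;
        if (len > CHUNK)
            len = CHUNK;
        MoveMemory(&big[i * (CHUNK + 1)], &big[i * CHUNK], len);
        big[i * (CHUNK + 1) + len] = 0x0d;
    }

    if (!WriteFile(hFile1, big, got + nchunks, &written, NULL))
        break;                                  /* handle the error as before   */
}

With n = 100, ReadFile and WriteFile are each called once per 10,000 bytes instead of once per 100.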
