aztec
asked on

TFileStream problem

Hello...
   Over a year ago, one of your experts - Andrewjb - answered a question I had regarding how to tack files together programmatically. He suggested using the TFileStream class, and the code example he provided worked great:

procedure TForm1.Button1Click(Sender: TObject);
var
  lOut : tFileStream;
  lIn  : tFileStream;
begin
  lOut := tFileStream.Create( 'c:\temp\out' , fmCreate );

  lIn  := tFileStream.Create( 'c:\temp\in.1' , fmOpenRead );
  lOut.CopyFrom( lIn , lIn.Size );
  lIn.Free;

  lIn := tFileStream.Create( 'c:\temp\in.2' , fmOpenRead );
  lOut.CopyFrom( lIn , lIn.Size );
  lIn.Free;

  { etc.! }

  lOut.Free;
end;


However, I find I'm running into a problem whenever I execute this code on very large files (1 or 2 GB). It bombs with a "Stream read error". I suspect this is because of the statement "lOut.CopyFrom( lIn , lIn.Size );", where it attempts to use the size of the whole file as the buffer...no? I'm no expert on FileStreams at all...any suggestions on how I can get my code to work on these very large files, without slowing down execution terribly?

Thanks!
   Shawn Halfpenny
   drumme59@sprint.ca
 
ASKER CERTIFIED SOLUTION
viktornet

(The accepted solution is available to Experts Exchange members only.)
kretzschmar
Hi Aztec,

try this little change - pass 0 as the Count (lOut.CopyFrom( lIn , 0 );):

procedure TForm1.Button1Click(Sender: TObject);
var
  lOut : tFileStream;
  lIn  : tFileStream;
begin
  lOut := tFileStream.Create( 'c:\temp\out' , fmCreate );

  lIn  := tFileStream.Create( 'c:\temp\in.1' , fmOpenRead );
  lOut.CopyFrom( lIn , 0 );
  lIn.Free;

  lIn := tFileStream.Create( 'c:\temp\in.2' , fmOpenRead );
  lOut.CopyFrom( lIn , 0 );
  lIn.Free;

  { etc.! }

  lOut.Free;
end;

meikl
aztec (ASKER)

Hello Gentlemen...
   I will be trying both your examples. Kretzschmar, I am curious as to what exactly your suggestion, lOut.CopyFrom( lIn , 0 );, does. It would appear to me that it would not copy any bytes at all, because the "Count" parameter is zero...can you explain?

Cheers
   Shawn
If you're copying 2GB files, then use 64k as the block size, not 512. It ought to be much quicker that way.
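For example, something along these lines (AppendFile is just a made-up name here; the 64k block size is the figure suggested above):

procedure AppendFile(Dest: TFileStream; const SrcName: string);
const
  BlockSize = 64 * 1024;            // 64k per chunk, per the suggestion above
var
  Src: TFileStream;
  Remaining, Chunk: LongInt;
begin
  Src := TFileStream.Create(SrcName, fmOpenRead);
  try
    Remaining := Src.Size;          // note: Size is a LongInt, so still capped at 2GB
    while Remaining > 0 do
    begin
      if Remaining > BlockSize then
        Chunk := BlockSize
      else
        Chunk := Remaining;
      Dest.CopyFrom(Src, Chunk);    // copies Chunk bytes from Src's current position
      Dec(Remaining, Chunk);
    end;
  finally
    Src.Free;
  end;
end;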
aztec (ASKER)

Hello Gentlemen...
   I tried all your suggestions, however I discovered something - I am reasonably sure that my problem lies in the fact that I am sometimes attempting to append files that are greater than 2 GB! Given that the TFileStream properties Size and Position are both LongInts, I believe I am exceeding the maximum allowable LongInt value of 2,147,483,647!
  Is there some other high-speed method of data transfer that I can use to append very large files together?

Thanks
   Shawn Halfpenny
   
Yes...try these two:

BlockRead() and BlockWrite()
Here is a function that kills a file... It uses BlockWrite()... it's just an example of how to use BlockWrite()... BlockRead() is exactly the same, except that you use it to read data from the file...

so you read with BlockRead() and then write it to the output file with BlockWrite()... (see the sketch after this post)

Let me know if you need an example of how to write that code in your app...

// requires SysUtils in the uses clause (for the Exception class)
function KillFile(FileName:string):boolean;
var
  f:file;
  Buf:array [0..1023] of byte;
  i:integer;
begin
  // fill the buffer with zeros to overwrite the file's contents
  for i:=0 to SizeOf(Buf)-1 do
    Buf[i]:=0;
  try
    AssignFile(f,FileName);
    FileMode:=2;   // read and write access; must be set before Reset
    Reset(f,1);    // open with a record size of 1 byte
    while not eof(f) do
      BlockWrite(f,Buf,SizeOf(Buf));  // overwrite the file with zeros, 1k at a time
    CloseFile(f);
    Erase(f);      // delete the zeroed file
  except
    on e:exception do
    begin
      Result:=false;
      exit;
    end; // except
  end; // try
  Result:=true; // file successfully deleted
end;

-Viktor
--Ivanov
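For the appending itself, here is a rough sketch along the same lines - JoinFiles is a made-up name, and the 64k buffer size is just an assumption:

procedure JoinFiles(const OutName: string; const InNames: array of string);
var
  fIn, fOut: file;
  Buf: array[0..65535] of byte;     // 64k transfer buffer (size is an assumption)
  i, BytesRead, BytesWritten: Integer;
begin
  AssignFile(fOut, OutName);
  Rewrite(fOut, 1);                 // create the output with 1-byte records
  for i := Low(InNames) to High(InNames) do
  begin
    AssignFile(fIn, InNames[i]);
    FileMode := 0;                  // read-only; must be set before Reset
    Reset(fIn, 1);                  // 1-byte records here too
    repeat
      BlockRead(fIn, Buf, SizeOf(Buf), BytesRead);
      BlockWrite(fOut, Buf, BytesRead, BytesWritten);
    until BytesRead < SizeOf(Buf);  // a short read means end of this input file
    CloseFile(fIn);
  end;
  CloseFile(fOut);
end;

Because it never seeks, the output simply grows as each input is written, so the writes stay strictly sequential.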
Here is also some code that splits files into chunks.. it will show you how to use BlockRead() and BlockWrite():

var
  inFile, outFile : File;
  CopyBuffer : Pointer;
  iRecsOK, iRecsWr, iX : Integer;
  sFileName : String;
const
  ChunkSize : LongInt = 1024000;    // roughly 1 MB per piece
begin
  GetMem(CopyBuffer, ChunkSize);
  sFileName := 'C:\windows\desktop\test';
  AssignFile(inFile, sFileName + '.ZIP');
  Reset(inFile, 1);                 // record size of 1 byte
  iX := 1;
  repeat
    AssignFile(outFile, sFileName + IntToStr(iX) + '.ZIP');
    Rewrite(outFile, 1);            // 1-byte records for the output too
    inc(iX);
    BlockRead(inFile, CopyBuffer^, ChunkSize, iRecsOK);   // iRecsOK = bytes actually read
    BlockWrite(outFile, CopyBuffer^, iRecsOK, iRecsWr);
    CloseFile(outFile);
  until (iRecsOK < ChunkSize);      // a short read means we hit the end of the input
  CloseFile(inFile);
  FreeMem(CopyBuffer, ChunkSize);
end;

-Viktor
--Ivanov
Hi aztec,

read the help file under TFileStream|CopyFrom

it sounds like: if Count = 0, CopyFrom sets Source.Position to 0 and copies the whole file into the stream.

That would also mean it doesn't matter if your file's Size > MaxLongInt.

meikl

Hi aztec,

another solution: give the work to the OS:

// ShellExecute and SW_HIDE require ShellApi and Windows in the uses clause
ShellExecute(self.Handle, nil, PChar('Command.Com'),
       PChar('/C copy c:\temp\in.1+c:\temp\in.2 c:\temp\out'), nil, SW_HIDE);

meikl

aztec (ASKER)

Hello Viktor...I'll be trying your suggestion soon...thanks!

Kretzschmar... I did try your lOut.CopyFrom( lIn , 0 ); suggestion, but on very large files (> 2 GB) it produced the same "Stream read error". Your suggestion of letting the OS do the work with ShellExecute looks interesting, but wouldn't there be a limit to the length of the command line you can submit to COMMAND.COM? Something like 128 characters or so? You see, I may have to append together literally hundreds of files! Will ShellExecute be able to handle that many filenames?

Thanks!
   Shawn Halfpenny
   drumme59@sprint.ca


P.S.: hmmm, I have the check mark set below for "Check here if you'd like an email notification whenever this question is updated"...but I do not receive the email notification! Is this feature working?
Yes, the feature is working for me.. I don't know why it won't work for you..

Yes, I think that command.com will be able to handle all the files, but you don't have any control over command.com, so you can't do exactly what you want... I think the BlockRead()/BlockWrite() approach is the better solution.. Well, let me know what you think.... btw - BlockRead() and BlockWrite() can write any size you want to the file... I think it is better to append it chunk by chunk, for example .5 MB or 1 MB at a time... and so on..

-Viktor
--Ivanov
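And since you may have hundreds of files: with a helper like the hypothetical JoinFiles sketch above, the whole job is one call in your own code, with no command-line length limit to worry about (the filenames here are made up):

JoinFiles('c:\temp\out', ['c:\temp\in.1', 'c:\temp\in.2', 'c:\temp\in.3' { etc. } ]);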
Hi aztec,

Yes, this feature is working for me too.
Viktor is right - his solution really is the better one.

meikl ;-)

aztec (ASKER)

Hi Viktor!
   Your answer involving BlockRead/BlockWrite works wonderfully! It even executes MUCH faster than the TFileStream approach I was using before (about twice as fast!). Thank you!

Regards
  Shawn Halfpenny