• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 537

TFileStream problem

Hello...
   Over a year ago, one of your experts - Andrewjb - answered a question I had regarding how to tack files together programmatically. He suggested using the TFileStream class, and the code example he provided worked great:

procedure TForm1.Button1Click(Sender: TObject);
var
  lOut : tFileStream;
  lIn  : tFileStream;
begin
  lOut := tFileStream.Create( 'c:\temp\out' , fmCreate );

  lIn  := tFileStream.Create( 'c:\temp\in.1' , fmOpenRead );
  lOut.CopyFrom( lIn , lIn.Size );
  lIn.Free;

  lIn := tFileStream.Create( 'c:\temp\in.2' , fmOpenRead );
  lOut.CopyFrom( lIn , lIn.Size );
  lIn.Free;

  { etc.! }

  lOut.Free;
end;


..However I find I'm running into a problem whenever I execute this code on very large files (like 1 or 2 Gigs). It bombs and gives a "stream read error". I suspect this is because of the statement "lOut.CopyFrom( lIn , lIn.Size );", where it attempts to use the size of the whole file as the buffer...no? I'm no expert on FileStreams at all...any suggestions on how I can get my code to work on these very large files...without having to slow down the execution speed terribly?

Thanks!
   Shawn Halfpenny
   drumme59@sprint.ca
 
aztec Asked:

1 Solution

viktornetCommented:
try this...

{...}
{ compare against the bytes remaining, not the total size,
  otherwise the loop never terminates }
while lIn.Size - lIn.Position > 512 do
  lOut.CopyFrom(lIn, 512);
lOut.CopyFrom(lIn, lIn.Size - lIn.Position);
{...}

I hope this works out for you ...tell me how it goes..

-Viktor
--Ivanov
 
kretzschmarCommented:
Hi Aztec,

try this little change (lOut.CopyFrom( lIn , 0 );),

procedure TForm1.Button1Click(Sender: TObject);
   var
     lOut : tFileStream;
     lIn  : tFileStream;
   begin
     lOut := tFileStream.Create( 'c:\temp\out' , fmCreate );

     lIn  := tFileStream.Create( 'c:\temp\in.1' , fmOpenRead );
     lOut.CopyFrom( lIn , 0 );
     lIn.Free;

     lIn := tFileStream.Create( 'c:\temp\in.2' , fmOpenRead );
     lOut.CopyFrom( lIn , 0 );
     lIn.Free;

     { etc.! }

     lOut.Free;
   end;

meikl
 
aztecAuthor Commented:
Hello Gentlemen...
   I will be trying both your examples. Kretzschmar, I am curious as to what exactly your suggestion does: (lOut.CopyFrom( lIn , 0 );), It would appear to me that it would not copy any bytes at all, because the "Count" parameter is zero...can you explain?

Cheers
   Shawn
philipleighsCommented:
If you're copying 2GB files, then use 64k as the block size, not 512. It ought to be much quicker that way.
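A chunked CopyFrom along those lines might look like this (a sketch, assuming classic Delphi's TFileStream from the Classes unit; the AppendChunked name and 64 KB block size are illustrative, not from the thread):

```pascal
{ Sketch only: copy Src into Dest in 64 KB blocks instead of one
  huge CopyFrom. Assumes both streams are already open; needs the
  Classes unit. AppendChunked is an illustrative name. }
procedure AppendChunked(Dest, Src: TFileStream);
const
  BlockSize = 65536; { 64 KB per CopyFrom call }
var
  Remaining: LongInt;
begin
  Src.Position := 0;
  Remaining := Src.Size;
  while Remaining > BlockSize do
  begin
    Dest.CopyFrom(Src, BlockSize); { advances Src.Position }
    Dec(Remaining, BlockSize);
  end;
  if Remaining > 0 then
    Dest.CopyFrom(Src, Remaining);
end;
```

Note that in the Delphi versions of this era, Size and Position are LongInt, so this approach still tops out near 2 GB per file.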
 
aztecAuthor Commented:
Hello Gentlemen...
   I tried all your suggestions, however I discovered something - I am reasonably sure that my problem lies in the fact that I am sometimes attempting to append files that are greater than 2 gigs! Given the fact that TFileStream variables Size and Position are both LongInt's, I believe I am exceeding the maximum LongInt allowable value of 2,147,483,647 !
  Is there some other high-speed method of data transfer that I can use to append very large files together?

Thanks
   Shawn Halfpenny
   
 
viktornetCommented:
Yes ...try these two...

BlockRead() BlockWrite()
 
viktornetCommented:
Here is a function that wipes a file... It uses BlockWrite()... it's just an example of how to use BlockWrite(). BlockRead() works exactly the same way, except that you use it to read data from the file...

So you read with BlockRead() and then write it to the output file with BlockWrite()...

Let me know if you need an example of the code to append the files this way...

function KillFile(FileName: string): boolean;
var
  f: file;
  Buf: array[0..1023] of byte;
  i: integer;
begin
  for i := 0 to SizeOf(Buf) - 1 do
    Buf[i] := 0;
  try
    AssignFile(f, FileName);
    FileMode := 2;   { read/write access; must be set before Reset }
    Reset(f, 1);     { record size = 1 byte, so the counts below are bytes }
    while not Eof(f) do
      BlockWrite(f, Buf, SizeOf(Buf));  { overwrite the contents with zeros }
    CloseFile(f);
    Erase(f);        { then delete the file }
  except
    on E: Exception do
    begin
      Result := False;
      Exit;
    end; // except
  end; // try
  Result := True; // file successfully wiped and deleted
end;

-Viktor
--Ivanov
 
viktornetCommented:
Here is also some code that splits a file into chunks... it will show you how to use BlockRead() and BlockWrite():

var
  inFile, outFile: File;
  CopyBuffer: Pointer;
  iRecsOK, iRecsWr, iX: Integer;
  sFileName: String;
const
  ChunkSize: LongInt = 1024000;
begin
  GetMem(CopyBuffer, ChunkSize);
  sFileName := 'C:\windows\desktop\test';
  AssignFile(inFile, sFileName + '.ZIP');
  Reset(inFile, 1);               { record size = 1 byte }
  iX := 1;
  repeat
    AssignFile(outFile, sFileName + IntToStr(iX) + '.ZIP');
    Rewrite(outFile, 1);          { record size = 1 byte }
    Inc(iX);
    BlockRead(inFile, CopyBuffer^, ChunkSize, iRecsOK);
    BlockWrite(outFile, CopyBuffer^, iRecsOK, iRecsWr);
    CloseFile(outFile);
  until iRecsOK < ChunkSize;
  CloseFile(inFile);
  FreeMem(CopyBuffer, ChunkSize);
end;

-Viktor
--Ivanov
 
kretzschmarCommented:
Hi aztec,

read in the help file under TFileStream.CopyFrom,

it says: if Count = 0 then CopyFrom sets Source.Position to 0 and copies the whole file into the stream.

That also means it shouldn't matter if your file's Size > MaxLongint.

meikl

 
kretzschmarCommented:
Hi aztec,

another solution: hand the work over to the OS:

{ needs ShellApi in the uses clause }
ShellExecute(Self.Handle, nil, PChar('Command.Com'),
       PChar('/C copy c:\temp\in.1+c:\temp\in.2 c:\temp\out'), nil, SW_HIDE);

meikl

 
aztecAuthor Commented:
Hello Viktor...I'll be trying your suggestion soon...thanks!

Kretzschmar... I did try your : (lOut.CopyFrom( lIn , 0 );), suggestion, but on very large files (> 2 gig), it produced the same "Stream Read Error". Your suggestion on letting the OS do the work with the ShellExecute command looks interesting, but would there not be a limit to the length of the command line you submit to COMMAND.COM? Less than or equal to 128 characters or something like that? You see, I may have to append together literally hundreds of files! Will the ShellExecute be able to handle that many filenames?

Thanks!
   Shawn Halfpenny
   drumme59@sprint.ca


P.S: hmmm, I have the check mark set below for "Check here if you'd like an email notification whenever this question is updated "...but I do not receive the email notification! Is this feature working?
 
viktornetCommented:
Yes, the feature is working for me... I don't know why it won't work for you.

Yes, I think that command.com will be able to handle all the files, but you don't have any control over command.com, so you can't do exactly what you want... I think that BlockRead() and BlockWrite() is the better solution... Well, let me know what you think... btw - BlockRead() and BlockWrite() can transfer any amount you want per call... I think it is better to append chunk by chunk, for example .5 MB or 1 MB at a time... and so on...

-Viktor
--Ivanov
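The chunk-by-chunk append described above might be sketched like this with untyped files (illustrative only; the ConcatFiles name and 1 MB chunk size are my choices, not from the thread):

```pascal
{ Sketch: concatenate several input files onto one output file using
  BlockRead/BlockWrite with a 1 MB buffer. Because the output stays
  open and the write position simply advances, no LongInt-sized
  stream Size is ever consulted. Names here are illustrative. }
procedure ConcatFiles(const OutName: string; const InNames: array of string);
const
  ChunkSize = 1048576; { 1 MB per BlockRead/BlockWrite }
var
  fIn, fOut: file;
  Buf: Pointer;
  BytesRead, BytesWritten: Integer;
  i: Integer;
begin
  GetMem(Buf, ChunkSize);
  try
    AssignFile(fOut, OutName);
    Rewrite(fOut, 1);           { record size = 1 byte }
    FileMode := 0;              { open the inputs read-only }
    for i := Low(InNames) to High(InNames) do
    begin
      AssignFile(fIn, InNames[i]);
      Reset(fIn, 1);            { record size = 1 byte }
      repeat
        BlockRead(fIn, Buf^, ChunkSize, BytesRead);
        if BytesRead > 0 then
          BlockWrite(fOut, Buf^, BytesRead, BytesWritten);
      until BytesRead < ChunkSize;
      CloseFile(fIn);
    end;
    CloseFile(fOut);
  finally
    FreeMem(Buf, ChunkSize);
  end;
end;
```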
 
kretzschmarCommented:
Hi aztec,

Yes, this feature is working for me too.
Viktor is right - his solution is really the better one.

meikl ;-)

 
aztecAuthor Commented:
Hi Viktor!
   Your answer involving Blockread/Blockwrite works wonderfully! It even executes MUCH faster than the TFileStream I was using before (about twice as fast!). Thank you!

Regards
  Shawn Halfpenny
Question has a verified solution.
