Solved

TFileStream problem

Posted on 1999-01-27
517 Views
Last Modified: 2010-04-06
Hello...
   Over a year ago, one of your experts - Andrewjb - answered a question I had regarding how to tack files together programmatically. He suggested using the TFileStream class, and the code example he provided worked great:

procedure TForm1.Button1Click(Sender: TObject);
var
  lOut : tFileStream;
  lIn  : tFileStream;
begin
  lOut := tFileStream.Create( 'c:\temp\out' , fmCreate );

  lIn  := tFileStream.Create( 'c:\temp\in.1' , fmOpenRead );
  lOut.CopyFrom( lIn , lIn.Size );
  lIn.Free;

  lIn := tFileStream.Create( 'c:\temp\in.2' , fmOpenRead );
  lOut.CopyFrom( lIn , lIn.Size );
  lIn.Free;

  { etc.! }

  lOut.Free;
end;


However, I find I'm running into a problem whenever I execute this code on very large files (like 1 or 2 GB): it bombs with a "stream read error". I suspect this is because of the statement "lOut.CopyFrom( lIn , lIn.Size );", where it attempts to use the size of the whole file as the buffer...no? I'm no expert on FileStreams at all...any suggestions on how I can get my code to work on these very large files, without slowing down execution terribly?

Thanks!
   Shawn Halfpenny
   drumme59@sprint.ca
 
Question by:aztec
14 Comments
 
LVL 10

Accepted Solution

by:
viktornet earned 50 total points
ID: 1363605
try this...

{...}
while (lIn.Size - lIn.Position) > 512 do   { copy in 512-byte chunks until less than one chunk remains }
  lOut.CopyFrom(lIn, 512);
lOut.CopyFrom(lIn, lIn.Size - lIn.Position);
{...}

I hope this works out for you ...tell me how it goes..

-Viktor
--Ivanov
 
LVL 27

Expert Comment

by:kretzschmar
ID: 1363606
Hi Aztec,

try this little change (lOut.CopyFrom( lIn , 0 );),

procedure TForm1.Button1Click(Sender: TObject);
   var
     lOut : tFileStream;
     lIn  : tFileStream;
   begin
     lOut := tFileStream.Create( 'c:\temp\out' , fmCreate );

     lIn  := tFileStream.Create( 'c:\temp\in.1' , fmOpenRead );
     lOut.CopyFrom( lIn , 0 );
     lIn.Free;

     lIn := tFileStream.Create( 'c:\temp\in.2' , fmOpenRead );
     lOut.CopyFrom( lIn , 0 );
     lIn.Free;

     { etc.! }

     lOut.Free;
   end;

meikl
 

Author Comment

by:aztec
ID: 1363607
Hello Gentlemen...
   I will be trying both your examples. Kretzschmar, I am curious as to what exactly your suggestion does: lOut.CopyFrom( lIn , 0 ); It would appear to me that it would not copy any bytes at all, because the "Count" parameter is zero...can you explain?

Cheers
   Shawn
 
LVL 3

Expert Comment

by:philipleighs
ID: 1363608
If you're copying 2GB files, then use 64k as the block size, not 512. It ought to be much quicker that way.
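For illustration, a chunked copy along those lines might look like the sketch below. The 64 KB block size follows the suggestion above; the lIn/lOut names come from the original question, and the CopyToStream helper is hypothetical, not code posted in the thread. Note that TFileStream.Size and Position are still LongInt in this Delphi version, so this alone does not get past the 2 GB limit.

procedure CopyToStream(lIn, lOut: TFileStream);
const
  BlockSize = 64 * 1024;
var
  Remaining: LongInt;
begin
  Remaining := lIn.Size - lIn.Position;
  while Remaining > BlockSize do
  begin
    lOut.CopyFrom(lIn, BlockSize);   { copy one 64 KB block }
    Dec(Remaining, BlockSize);
  end;
  if Remaining > 0 then
    lOut.CopyFrom(lIn, Remaining);   { copy whatever is left }
end;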
 

Author Comment

by:aztec
ID: 1363609
Hello Gentlemen...
   I tried all your suggestions; however, I discovered something - I am reasonably sure that my problem lies in the fact that I am sometimes attempting to append files that are greater than 2 GB! Given that TFileStream's Size and Position properties are both LongInts, I believe I am exceeding the maximum LongInt value of 2,147,483,647!
  Is there some other high-speed method of data transfer that I can use to append very large files together?

Thanks
   Shawn Halfpenny
   
 
LVL 10

Expert Comment

by:viktornet
ID: 1363610
Yes... try these two:

BlockRead() and BlockWrite()
 
LVL 10

Expert Comment

by:viktornet
ID: 1363611
Here is a function that wipes out a file... It uses BlockWrite(); it's just an example of how to use BlockWrite(). BlockRead() works exactly the same way, except that you use it to read the data from the file...

So you read with BlockRead() and then write it to the output file with BlockWrite()...

Let me know if you need an example of how to put that code together in your app...

function KillFile(FileName: string): boolean;
var
  f: file;
  Buf: array [0..1023] of byte;
  Remaining, ToWrite: LongInt;
begin
  FillChar(Buf, SizeOf(Buf), 0);      // buffer of zeros used to overwrite the file
  try
    AssignFile(f, FileName);
    FileMode := 2;                    // read and write access
    Reset(f, 1);                      // open with 1-byte records
    Remaining := FileSize(f);
    while Remaining > 0 do
    begin
      if Remaining > SizeOf(Buf) then
        ToWrite := SizeOf(Buf)
      else
        ToWrite := Remaining;
      BlockWrite(f, Buf, ToWrite);    // overwrite the next chunk with zeros
      Dec(Remaining, ToWrite);
    end;
    CloseFile(f);
    Erase(f);                         // delete the wiped file
  except
    on e: Exception do
    begin
      Result := False;
      Exit;
    end; // except
  end; // try
  Result := True; // file successfully wiped and deleted
end;

-Viktor
--Ivanov

 
LVL 10

Expert Comment

by:viktornet
ID: 1363612
Here is also some code that splits a file into chunks; it will show you how to use BlockRead() and BlockWrite().

var
  inFile, outFile : File;
  CopyBuffer : Pointer;
  iRecsOK, iRecsWr, iX: Integer;
  sFileName : String;
CONST
  ChunkSize : LongInt = 1024000;
begin
  GetMem(CopyBuffer, ChunkSize);
  sFileName := 'C:\windows\desktop\test';
  AssignFile(inFile, sFileName + '.ZIP');
  Reset(inFile, 1);               { 1-byte records, so the counts below are byte counts }
  iX := 1;
  repeat
    AssignFile(outFile, sFileName + IntToStr(iX) + '.ZIP');
    Rewrite(outFile, 1);
    inc(iX);
    BlockRead(inFile, CopyBuffer^, ChunkSize, iRecsOK);
    BlockWrite(outFile, CopyBuffer^, iRecsOK, iRecsWr);
    CloseFile(outFile);
  until (iRecsOK < Chunksize);
  CloseFile(inFile);
  FreeMem(CopyBuffer, Chunksize);
end;

-Viktor
--Ivanov
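
For reference, a minimal sketch of the same BlockRead/BlockWrite pattern running in the append direction, which is what the question actually needs. The file names mirror the original question; the 64 KB chunk size and the overall shape are assumptions, not code posted in the thread.

var
  inFile, outFile: File;
  CopyBuffer: Pointer;
  iRecsRead, iRecsWritten: Integer;
const
  ChunkSize: LongInt = 65536;
begin
  GetMem(CopyBuffer, ChunkSize);
  try
    AssignFile(outFile, 'c:\temp\out');
    Rewrite(outFile, 1);                 { 1-byte records }

    AssignFile(inFile, 'c:\temp\in.1');
    Reset(inFile, 1);
    repeat                               { copy in.1 chunk by chunk }
      BlockRead(inFile, CopyBuffer^, ChunkSize, iRecsRead);
      BlockWrite(outFile, CopyBuffer^, iRecsRead, iRecsWritten);
    until iRecsRead < ChunkSize;
    CloseFile(inFile);

    { repeat the AssignFile/Reset/copy loop for 'c:\temp\in.2', etc. }

    CloseFile(outFile);
  finally
    FreeMem(CopyBuffer, ChunkSize);
  end;
end;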
 
LVL 27

Expert Comment

by:kretzschmar
ID: 1363613
Hi aztec,

read the help file under TFileStream.CopyFrom:

it sounds like: if Count = 0, CopyFrom sets Source.Position to 0 and reads the whole file into the stream.

That should also mean it doesn't matter if your file size is > MaxLongInt.

meikl

 
LVL 27

Expert Comment

by:kretzschmar
ID: 1363614
Hi aztec,

another solution: give the work to the OS:

ShellExecute(self.handle, NIL, PChar('Command.Com'),
       PChar('/C copy c:\temp\in.1+c:\temp\in.2 c:\temp\out'), Nil, SW_HIDE);

meikl

 

Author Comment

by:aztec
ID: 1363615
Hello Viktor...I'll be trying your suggestion soon...thanks!

Kretzschmar... I did try your lOut.CopyFrom( lIn , 0 ); suggestion, but on very large files (> 2 GB) it produced the same "Stream Read Error". Your suggestion of letting the OS do the work with the ShellExecute command looks interesting, but would there not be a limit to the length of the command line you submit to COMMAND.COM? Something like 128 characters or less? You see, I may have to append together literally hundreds of files! Will ShellExecute be able to handle that many filenames?

Thanks!
   Shawn Halfpenny
   drumme59@sprint.ca


P.S: hmmm, I have the check mark set below for "Check here if you'd like an email notification whenever this question is updated "...but I do not receive the email notification! Is this feature working?
 
LVL 10

Expert Comment

by:viktornet
ID: 1363616
Yes, the feature is working for me; I don't know why it won't work for you.

Yes, I think that command.com will be able to handle all the files, but you don't have any control over command.com, so you can't do exactly what you want... I think BlockRead() and BlockWrite() are the better solution... Well, let me know what you think... By the way, BlockRead() and BlockWrite() can write any size you want to the file... I think it is better to append it chunk by chunk, for example .5 MB or 1 MB at a time, and so on..

-Viktor
--Ivanov
 
LVL 27

Expert Comment

by:kretzschmar
ID: 1363617
Hi aztec,

Yes, this feature is working for me too.
Viktor is right; his solution is really the better one.

meikl ;-)

 

Author Comment

by:aztec
ID: 1363618
Hi Viktor!
   Your answer involving BlockRead/BlockWrite works wonderfully! It even executes MUCH faster than the TFileStream code I was using before (about twice as fast!). Thank you!

Regards
  Shawn Halfpenny
