  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 505

GetChunk, AppendChunk Gotchas

I am mucking about with saving and retrieving BLOBs from an Access database with AppendChunk and GetChunk. The docs describe using 32K chunks in a loop to fill a byte array. I have jumped the gun and just get the chunk in one large bite (1.4 MB) to fill the array. This has worked fine. My question is: what are the gotchas? Am I messing with something dangerous, or does "if it works, it works" apply?
jarrahjackAsked:
1 Solution
 
watyCommented:
In previous versions of VB, the maximum length of a string was 32K.

The way you do it is not the right way, because reading the whole field
into one byte array does not perform well.

Try using the following routine.


Sub GetFileFromDB(BinaryField As Field, szFileName As String)
    ' *** Will retrieve an entire Binary field and write it to disk ***
   
    Dim NumChunks    As Long
    Dim TotalSize    As Long
    Dim RemChunk     As Integer
    Dim CurSize      As Integer
    Dim nChunkSize   As Long
    Dim nI           As Integer
    Dim nFile        As Integer
    Dim CurChunk     As String
   
    nChunkSize = 32000    ' Set size of chunk.
   
    ' *** Get field size.
    TotalSize = BinaryField.FieldSize()
    NumChunks = TotalSize \ nChunkSize   ' Set number of chunks.
   
    ' *** Set number of remaining bytes.
    RemChunk = TotalSize Mod nChunkSize
   
    ' *** Set starting size of chunk.
    CurSize = nChunkSize
    nFile = FreeFile ' Get free file number.
   
    Open szFileName For Binary As #nFile  ' Open the file.
    For nI = 0 To NumChunks
       If nI = NumChunks Then CurSize = RemChunk   ' Last chunk may be partial.
       CurChunk = BinaryField.GetChunk(nI * nChunkSize, CurSize)
       Put #nFile, , CurChunk   ' Write chunk to file.
    Next
    Close #nFile

End Sub
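
The thread also mentions AppendChunk, the write direction. A sketch of the counterpart routine might look like the following; the helper name `PutFileToDB` is hypothetical, a DAO `Field` on a recordset already in Edit or AddNew mode is assumed, and the same 32K chunk size is used:

```vb
Sub PutFileToDB(BinaryField As Field, szFileName As String)
    ' *** Sketch: read a disk file and append it to a Binary field
    ' *** in 32K chunks with AppendChunk. The recordset owning
    ' *** BinaryField must be in Edit/AddNew mode before calling.

    Dim nChunkSize As Long
    Dim TotalSize  As Long
    Dim NumChunks  As Long
    Dim RemChunk   As Long
    Dim nI         As Long
    Dim nFile      As Integer
    Dim CurChunk   As String

    nChunkSize = 32000
    nFile = FreeFile
    Open szFileName For Binary As #nFile

    TotalSize = LOF(nFile)              ' Size of the file on disk.
    NumChunks = TotalSize \ nChunkSize
    RemChunk = TotalSize Mod nChunkSize

    For nI = 0 To NumChunks
        If nI = NumChunks Then
            If RemChunk = 0 Then Exit For     ' Exact multiple: nothing left.
            CurChunk = String$(RemChunk, 0)   ' Size buffer for the tail.
        Else
            CurChunk = String$(nChunkSize, 0) ' Size buffer for a full chunk.
        End If
        Get #nFile, , CurChunk                ' Fill the buffer from disk.
        BinaryField.AppendChunk CurChunk      ' Append it to the field.
    Next
    Close #nFile
End Sub
```

The caller would then commit with `.Update` on the recordset once the loop finishes.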

 
jarrahjackAuthor Commented:
Thanks for your answer; I know that it is what SHOULD be done. I am not writing to disk but to a byte array in memory, which can then be manipulated. I am not really interested in performance. My concern was more: will there be a crash because of what I am doing, i.e. writing a large amount of data in one hit with AppendChunk and, ditto, reading with GetChunk? It works fine on my machine with 64 MB of RAM.
 
cymbolicCommented:
I've done the same with millions of records in processing migrations, and it works fine in one gulp.  However, with very large blocks you have two problems.  One is that you cannot release any time for background processing (the user waits and wonders whether to cancel your app), and the other is potential memory/swapping problems on busier systems with less memory.  That is, your app is more sensitive to low-memory, low-resource environments.  But, hey, why not push Microsoft apps to the limit?  We all know how robust they are and how easy it is to get information and a fix from them when operating outside expected parameters, now don't we?
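
The responsiveness point above is one practical argument for the documented 32K loop: each iteration is a natural place to yield. A hedged sketch, assuming a chunked loop like waty's routine, would add a `DoEvents` call per chunk (standard VB way to release time to the message queue):

```vb
For nI = 0 To NumChunks
    If nI = NumChunks Then CurSize = RemChunk
    CurChunk = BinaryField.GetChunk(nI * nChunkSize, CurSize)
    Put #nFile, , CurChunk
    DoEvents    ' Yield so the UI can repaint and a Cancel button can fire.
Next
```

Reading the whole 1.4 MB in one call gives you no such opportunity; the app appears frozen for the duration.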
 
jarrahjackAuthor Commented:
Just as I thought. But one never knows with MSoft tra la, tra la
