Avatar of mac_barnard (South Africa)

asked on

From disk (file) to SQL and back?

I would like to store files of many types (exe, dwg, drw, txt, bin) in SQL programmatically, and read them back to save them on disk in their original form.
I use ADO to access the SQL database.
I think I must use the varbinary field in SQL but I'm not sure.

Could someone help me with code, please?
Avatar of Triskelion (United States)

Why would you do this?
It sounds as if you simply need a file server and not a database.
And on second thought, maybe you could just keep the path name of each file in the database; then, as files are requested, you can count how many times each has been downloaded, record who retrieved it, and so on.

Avatar of jhance

What database engine are you using?  This problem (storing "raw" data in the database) is not handled the same way across all DB engines.  By SQL do you mean MS SQL Server?

In general, this is called a BLOB in the database world.  BLOB is an acronym for Binary Large OBject.
Avatar of Paul Maker
Maybe a better way would be to store the file on disk and hold an entry in the database pointing to the path of the file. I don't know too much about the internals of a database, but I can't see the advantage of storing massive binary objects in one. If anything, it must slow the DB engine down and may impact other queries.

Paul
Avatar of mac_barnard

ASKER

I'm using MS SQL Server 2000
Avatar of jhance

SQL Server does support BLOBs, and while there may be a case for including the data in the DB vs. only storing a pointer, you can do it either way.

See the section in the MSDN titled:

Recordset: Working with Large Data Items (ODBC)

It provides a discussion and some example code.
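
In SQL Server 2000, the column type normally used for file data of this size is image rather than varbinary, which is capped at 8,000 bytes in that release. Below is a minimal sketch (not from the MSDN article) of a table that could hold the documents, created through an ADO connection; the table, column, and DSN names are only illustrative, chosen to match code that appears further down in the thread:

// Sketch only: create a table with an image column for the file data.
// Assumes COM is initialized and msado15.dll has been #import-ed as usual.
_ConnectionPtr pConn;
pConn.CreateInstance(__uuidof(Connection));
pConn->Open("DSN=TEST;UID=sa;PWD=;", "", "", adConnectUnspecified);

// image holds up to 2 GB per row in SQL Server 2000
pConn->Execute(
    "CREATE TABLE INFO ("
    "  nIdx     INT IDENTITY PRIMARY KEY,"
    "  FileName VARCHAR(260) NOT NULL,"
    "  nFileLen INT NOT NULL,"
    "  BIN      IMAGE NULL )",
    NULL, adCmdText);

pConn->Close();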
I'm using MS SQL Server 2000
Yes, we know that now.  Why are you repeating yourself?

As I said:

SQL Server does support BLOBs, and while there may be a case for including the data in the DB vs. only storing a pointer, you can do it either way.

See the section in the MSDN titled:

Recordset: Working with Large Data Items (ODBC)

It provides a discussion and some example code.

I'm using MS SQL Server 2000
Yes, we know that now THREE TIMES!!!!  

Why are you repeating yourself?

As I said:

SQL Server does support BLOBs, and while there may be a case for including the data in the DB vs. only storing a pointer, you can do it either way.

See the section in the MSDN titled:

Recordset: Working with Large Data Items (ODBC)

It provides a discussion and some example code.

Avatar of cwrea

Don't have a cow, man. :-)  Maybe he was reloading the page and causing a repost of the form data.  Blame the EE software for not checking for that kind of scenario.
That's why EE has a "reload question" link.

Although "Don't have a cow, man. :-)" is a bloody funny thing to say, LOL.

Paul
There are several excellent reasons to store large files in a database.  One major reason is that the clients that need the data can use their database connection rather than filesystem access, avoiding network security problems.

mac_barnard,
Describe what help you need.  

In general, it is simply a matter of declaring a variable and using ADO to read data into it or store data from it into a DB table.  With ADO, it is really quite similar to accessing regular variables.  When using 'raw' SQL functions it is more involved, because you need to process chunks of data manually.

-- Dan
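
To make the "regular variable" point concrete, here is a minimal sketch (not the accepted solution; the table, column, and DSN names are assumptions) of reading a blob column back through ADO. The image field's Value arrives as a variant wrapping a SAFEARRAY of bytes, which can be locked and written to disk:

// Sketch only: read a blob column the "regular variable" way via Field->Value.
_RecordsetPtr pRs;
pRs.CreateInstance(__uuidof(Recordset));
pRs->Open("SELECT FileName, BIN FROM INFO WHERE nIdx = 1",
          "DSN=TEST;UID=sa;PWD=;",
          adOpenStatic, adLockReadOnly, adCmdText);

// the image column typically comes back as a VT_ARRAY|VT_UI1 variant
_variant_t vBlob = pRs->Fields->GetItem("BIN")->GetValue();

long lLo = 0, lHi = -1;
SafeArrayGetLBound(vBlob.parray, 1, &lLo);
SafeArrayGetUBound(vBlob.parray, 1, &lHi);
long nBytes = lHi - lLo + 1;            // actual blob size in bytes

void* pData = NULL;
SafeArrayAccessData(vBlob.parray, &pData);
// ... write nBytes bytes starting at pData to the output file ...
SafeArrayUnaccessData(vBlob.parray);

pRs->Close();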
Thanks cwrea, I did reload the page, sorry.
Dan

That is exactly my reason: I don't want the client to use file system access, because of network security.

I've got a dialog-based application that acts as a document manager for our drawing office.

My application is currently working fine with file system access for the documents and a table for the index info.

I must change my app to act in the following manner:

When they check a document IN, the app should insert the document index info as well as the document itself into a table. On a check OUT, the reverse must happen: get the document index info and the document itself from the table.

The code I've got to read the file doesn't seem to work for me: when I do the read ( is.read(buffer, length) ), it seems as if it encounters a carriage return and stops filling the buffer.

int length;
char * buffer;
char name[1024];

strcpy(name, "c:\\TEST.Bin");

ifstream is;
is.open (name, ios::binary );

// get length of file:
is.seekg (0, ios::end);
length = is.tellg();
is.seekg (0, ios::beg);

// allocate memory:
buffer = new char [length];

// read data as a block:
is.read (buffer,length);

is.close();

//Insert into table
strcpy(SqlStr,"Insert into INFO (FileName, BIN) values ('");
strcat(SqlStr,name);
strcat(SqlStr,"','");
strcat(SqlStr,buffer);
strcat(SqlStr,"')");

adoRs.CreateInstance(__uuidof(Recordset));
try
{
   adoRs->Open(SqlStr, "DSN=TEST;UID=sa;PWD=;", adOpenStatic, adLockOptimistic, adCmdText);
}
catch (...)
{
  ;
}
adoRs = NULL;

//Clear buffer
buffer = NULL;
Well, part of this is easy...
You use strcat to append the raw binary data into a text string.  Consider what will happen when the binary data contains a null (ASCII 0x00).  That will terminate the string and the SQL command goes kaput.

The file I/O looks OK.  I'll get back with the proper coding technique...

-- Dan
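
One way around that null-byte problem (a sketch only, not necessarily what the accepted solution below does) is to stop building the INSERT as a text string altogether: copy the file bytes into a SAFEARRAY and hand it to the field with AppendChunk, so ADO transfers the data as binary. This reuses the buffer, length, and name variables and the INFO table from the code above; the recordset must be opened updatable:

// Sketch only: insert the file as binary data via AppendChunk instead of strcat.
SAFEARRAYBOUND bound;
bound.lLbound = 0;
bound.cElements = length;                     // file size from the code above
SAFEARRAY* psa = SafeArrayCreate(VT_UI1, 1, &bound);

void* pDest = NULL;
SafeArrayAccessData(psa, &pDest);
memcpy(pDest, buffer, length);                // raw file bytes, embedded nulls and all
SafeArrayUnaccessData(psa);

_variant_t vBlob;
vBlob.vt = VT_ARRAY | VT_UI1;
vBlob.parray = psa;                           // vBlob now owns the array

_RecordsetPtr pRs;
pRs.CreateInstance(__uuidof(Recordset));
pRs->Open("SELECT FileName, BIN FROM INFO WHERE 1=0",   // empty result set, just for AddNew
          "DSN=TEST;UID=sa;PWD=;",
          adOpenKeyset, adLockOptimistic, adCmdText);

pRs->AddNew();
pRs->Fields->GetItem("FileName")->PutValue(_variant_t(name));
pRs->Fields->GetItem("BIN")->AppendChunk(vBlob);
pRs->Update();
pRs->Close();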
Thanks Dan
ASKER CERTIFIED SOLUTION
Avatar of DanRollins (United States)

(This solution is only available to Experts Exchange members.)
Thanks Dan,

I'm halfway there.

Please help me with GetChunk to get the blob out.

 
First, I did not mention that the above code copies the file data unnecessarily.  It would be wise to allocate the SafeArray first, then use SafeArrayAccessData to get a pointer to it, and then read the file directly into that address.
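
A sketch of how that could look (names are illustrative; the point is only that the file is read straight into the SAFEARRAY's own memory, so the bytes are never copied twice):

// Sketch only: allocate the SAFEARRAY first, then read the file straight into it.
// "length" would be obtained with seekg/tellg exactly as in the earlier snippet.
SAFEARRAY* psa = SafeArrayCreateVector(VT_UI1, 0, length);

void* pDest = NULL;
SafeArrayAccessData(psa, &pDest);

std::ifstream is("c:\\TEST.Bin", std::ios::binary);
is.read(static_cast<char*>(pDest), length);   // file bytes land in the array, no extra buffer
is.close();

SafeArrayUnaccessData(psa);

_variant_t vBlob;
vBlob.vt = VT_ARRAY | VT_UI1;
vBlob.parray = psa;                           // ready to hand to AppendChunk as before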

-=-=-=-=-=-=-=
To read the file from the database:  
You will either need to know its size in advance, or use GetChunk multiple times.

For the former, I suggest adding an integer nFileLen to the database table and set it when you insert the record.  Then your code will look like this:

   _bstr_t  bstrQuery( "SELECT * FROM TheTableWithTheBlobs WHERE nIdx=1234 " );
   hr= m_pRecSet->Open( _variant_t(bstrQuery), vNull,
       adOpenStatic,
       adLockOptimistic,
       adCmdText
   );
   // long nBlobCnt= m_pRecSet->GetRecordCount(); // eyeball check

   // read the stored file length from the nFileLen column
   _variant_t vTmp= m_pRecSet->Fields->GetItem("nFileLen")->GetValue();
   long nFileLen= (long)vTmp;

   // GetChunk hands back a VT_ARRAY|VT_UI1 variant holding the blob bytes
   _variant_t v= m_pRecSet->Fields->GetItem("rawBlob")->GetChunk( nFileLen );

   void* pArrayData= NULL;
   SafeArrayAccessData( v.parray, &pArrayData );

   // here put the code to...
   // save the data from the blob to a local file
   // the data is at pArrayData and should be nFileLen bytes long

   SafeArrayUnaccessData( v.parray );

The alternate method would be to read it in pieces:

long lOffset= 0;
const long CNUM_ChunkSize= 1024;
bool fDone= false;
while( !fDone )
{
     _variant_t v= m_pRecSet->Fields->GetItem("logo")->GetChunk( CNUM_ChunkSize );
     // note: the variant comes back as VT_NULL after the final chunk has been read
     if ( v.vt == VT_NULL ) { fDone= true; break; }
     void* pArrayData= NULL;
     SafeArrayAccessData( v.parray, &pArrayData );
     // here add the code to
     // write the chunk to the file.  It starts at pArrayData
     SafeArrayUnaccessData( v.parray );
     lOffset += CNUM_ChunkSize;
}

I think you will need to modify this code so that on the final chunk you only write the actual amount of data that is in the variant.  I don't have much expertise with these ugly VB constructs, so I can't say offhand how to do that.  That size may be available in v.parray->rgsabound[0].cElements.  I really hate it when the only examples of use were written by and for VB programmers.

-- Dan
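
For that final-chunk question, the element count can indeed be pulled from the SAFEARRAY itself, either via rgsabound[0].cElements or with SafeArrayGetLBound/SafeArrayGetUBound. A sketch of the body of the loop above (v is the variant returned by GetChunk; outFile is assumed to be an ofstream opened with ios::binary):

// Sketch only: write exactly as many bytes as the chunk actually contains.
long lLo = 0, lHi = -1;
SafeArrayGetLBound(v.parray, 1, &lLo);
SafeArrayGetUBound(v.parray, 1, &lHi);
long nChunkBytes = lHi - lLo + 1;             // same value as v.parray->rgsabound[0].cElements

void* pArrayData = NULL;
SafeArrayAccessData(v.parray, &pArrayData);
outFile.write(static_cast<const char*>(pArrayData), nChunkBytes);
SafeArrayUnaccessData(v.parray);

lOffset += nChunkBytes;                       // advance by what was really read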