I am having problems with SetFullState and GetFullState. I have datastores that require some complex calculations to populate with data. After population, I'd like to save each datastore in the database for subsequent rapid retrieval. I use a long varchar column (Adaptive Server Anywhere 9.0.2 via ODBC), as the datastore state is always > 32K.
GetFullState, UPDATEBLOB, and the subsequent SELECTBLOB all work fine, but the final SetFullState never succeeds, and the target datastore is never populated with the saved image from the database. Here's the sample code:
//the database table; column pk_id is the primary key
CREATE TABLE blob_table
    (pk_id    float NOT NULL,
     blob_col long varchar);

//PowerScript: save and restore the datastore state
blob blb_var
long id = 1234567
long rc

ds_source.GetFullState(blb_var)  //works fine

INSERT INTO blob_table (pk_id) VALUES (:id);
UPDATEBLOB blob_table SET blob_col = :blb_var WHERE pk_id = :id;
SELECTBLOB blob_col INTO :blb_var FROM blob_table WHERE pk_id = :id;

rc = ds_target.SetFullState(blb_var)  //this always returns -1 (fail)
The blb_var value from the initial GetFullState works just fine if passed directly to SetFullState.
If blb_var is saved to a file in stream mode, then read back into a blob in stream mode, that blob also works fine in SetFullState.
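For completeness, the file round-trip that does work looks roughly like this (the path is illustrative; FileWrite in PB 10 is limited to 32,765 bytes per call, hence the chunking):

```
// save the state blob to disk in stream mode, in 32K chunks
integer li_file
long ll_pos = 1
li_file = FileOpen("c:\temp\state.blb", StreamMode!, Write!, LockWrite!, Replace!)
DO WHILE ll_pos <= Len(blb_var)
    FileWrite(li_file, BlobMid(blb_var, ll_pos, 32765))
    ll_pos += 32765
LOOP
FileClose(li_file)

// read it back the same way (FileRead returns -100 at end of file)
blob lblb_chunk, lblb_state
li_file = FileOpen("c:\temp\state.blb", StreamMode!, Read!, LockRead!)
DO WHILE FileRead(li_file, lblb_chunk) > 0
    lblb_state = lblb_state + lblb_chunk
LOOP
FileClose(li_file)
// lblb_state works fine in SetFullState
```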
It fails only when the blob has been saved to the database as shown above, then retrieved into a blob and passed to SetFullState.
I saved the blob retrieved by SELECTBLOB to a file and compared it with the blob from the initial GetFullState, also saved to a file. The retrieved contents are still recognizable as a datawindow state, but some bytes are rearranged, and the two files differ in size by one byte.
It seems that saving and retrieving the blob representation of the datastore state mangles it. Is there some setting on the ASA side, the ODBC driver side, or the PowerBuilder side to retrieve an exact copy of what was initially stored?
(I chose a long varchar column instead of long binary after reading Sybase notes about values exceeding 32K.) Using PowerBuilder 10.0 Build 4510.
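For reference, the long binary variant I ruled out on the basis of those notes would differ only in the column type (blob_table_bin is just an illustrative name); I mention it because a binary column should bypass any character-set translation on the ODBC path:

```
// alternative table using a binary column
CREATE TABLE blob_table_bin
    (pk_id    float NOT NULL,
     blob_col long binary);

// the embedded SQL is otherwise unchanged
UPDATEBLOB blob_table_bin SET blob_col = :blb_var WHERE pk_id = :id;
SELECTBLOB blob_col INTO :blb_var FROM blob_table_bin WHERE pk_id = :id;
```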
Advice is most thankfully appreciated!
Knowledge Management Systems, Inc.