  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 667

Bulk Imports in Pervasive DB ("refresh" files)

Hi. I have a table in a Pervasive database that contains 800,000 records. Every so often a "refresh" file comes in, and we have to wipe out the table and load it with the new 800,000 records from the file. In Microsoft SQL Server (which I am more familiar with) I would do this pretty easily in a DTS package. What is the best way to do this in Pervasive? I wrote a quick VB app that reads the file and inserts the records one by one, but it is going to take days to get through the whole file. Is there some sort of bulk import available?
Asked by: tags266
2 Solutions
 
mirtheil Commented:
A few things:
1. What interface are you using in VB? If it's SQL/ODBC, make sure you use parameterized queries (a sketch follows below).
2. What version of PSQL are you using?  
3. What format is the "refresh" file in?

You might also look into the Pervasive Integration products.  
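
For illustration, here is a minimal VB6/ADO sketch of the parameterized-insert approach from point 1. The DSN, table name, column names, and field positions are placeholders (nothing in this thread specifies them), so treat it as a sketch rather than a drop-in loader.

' Prepared, parameterized INSERT executed once per line of the fixed-format file.
' DSN, table/column names, and field widths below are made-up placeholders.
Dim cn As ADODB.Connection
Dim cmd As ADODB.Command
Dim sLine As String

Set cn = New ADODB.Connection
cn.Open "DSN=MYPSQLDB"                         ' assumes an ODBC DSN for the Pervasive database

Set cmd = New ADODB.Command
Set cmd.ActiveConnection = cn
cmd.CommandText = "INSERT INTO RefreshTable (Field1, Field2) VALUES (?, ?)"
cmd.Prepared = True                            ' prepare once, execute 800,000 times
cmd.Parameters.Append cmd.CreateParameter("p1", adVarChar, adParamInput, 30)
cmd.Parameters.Append cmd.CreateParameter("p2", adInteger, adParamInput)

Open "refresh.txt" For Input As #1             ' the incoming fixed-format file
Do While Not EOF(1)
    Line Input #1, sLine
    cmd.Parameters("p1").Value = Trim$(Mid$(sLine, 1, 30))
    cmd.Parameters("p2").Value = CLng(Mid$(sLine, 31, 10))
    cmd.Execute , , adExecuteNoRecords         ' no recordset needed for an INSERT
Loop
Close #1
cn.Close

Wrapping batches of a few thousand inserts in cn.BeginTrans / cn.CommitTrans may also help, but that is worth testing against your own data.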
 
Bill Bach (President) Commented:
The quickest way to wipe out the table is to recreate it in-place.  You can do this from the OS layer (by copying in an empty file created with BUTIL -CLONE) or from the Btrieve API layer with a CREATE call.
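
For reference, the BUTIL route might look roughly like this from a command prompt (file names are placeholders, and make sure nothing has the file open when you swap it in; check the BUTIL documentation for your PSQL version):

REM Build an empty file with the same definition as the live data file
BUTIL -CLONE EMPTY.MKD MYTABLE.MKD

REM Swap the empty clone in over the old data file
COPY /Y EMPTY.MKD MYTABLE.MKD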

For inserting the records, a Btrieve application running ON THE SERVER will provide the best performance if you are inserting one record at a time.  This eliminates the network overhead for each record, and it runs many times faster.

You can gain some performance by trying two additional techniques:

1) Use InsertExtended calls instead of Insert calls.  InsertExtended allows you to build a large buffer (up to around 60K at a time) of records and insert them all at the same time, with one call. Internally, the database has to do the same amount of work, but this eliminates some additional calling overhead.  If you cannot run the import process on the server, this is the next best thing.

2) When you create the new file, create it with NO keys. Then, do your bulk insert as mentioned above. When done, go create the keys. While this CAN be a problem if you need to filter duplicate records, if the data is known to be "good", then the inserts with no keys will be MUCH faster, and adding the keys after the fact will save a lot of time. (An SQL-level sketch of this idea follows below.)
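
If the loader stays on ADO, one way to get part of the benefit of technique 2 without touching the Btrieve file directly is to drop the table's secondary indexes before the load and recreate them afterwards. This is only an SQL-level approximation of what Bill describes; the table and index names are placeholders, and the exact DDL syntax should be checked against the SQL reference for your PSQL version.

' Assumes cn is an open ADODB.Connection to the Pervasive database.
' Table and index names are made up for illustration.
cn.Execute "DROP INDEX RefreshTable.IX_Field2"                   ' drop the secondary index before loading
' ... run the bulk-insert loop here (see the parameterized-query sketch earlier) ...
cn.Execute "CREATE INDEX IX_Field2 ON RefreshTable (Field2)"     ' rebuild it in one pass afterwards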
 
tags266 (Author) Commented:
Hey mirtheil, here are the answers to your questions:

1) SQL/ADO
2) 8.6
3) Fixed-format text file
 
tags266 (Author) Commented:
I like your answer so far, Bill Bach. I'm going to try a couple of things out and get back. Thanks.
 
Bill Bach (President) Commented:
If you are accessing the database through ADO, then I'm not sure you can leverage the InsertExtended call, so this option may be unavailable to you. I checked, and I also believe that the ActiveX controls do NOT support InsertExtended. Since coding directly to the Btrieve API is more complicated than ADO, you might be limited in what you can accomplish. Please be aware that native access is MANY TIMES faster than SQL access via ADO; the time invested in writing a native API app will be well worthwhile.
 
mirtheil Commented:
The only way to approximate an InsertExtended with ADO would be parameterized queries. It's not the same and won't give the same performance as a Btrieve API program. Bill's right: to get the best performance, you're going to want to write a Btrieve API program.
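
To make the "write a Btrieve API program" suggestion a little more concrete, a bare-bones insert loop from VB might look like the sketch below. The Declare line is an assumption about how BTRCALL is commonly declared from VB (use the declaration and constants that ship with your PSQL SDK), and the file path and record layout are placeholders.

' Assumed VB declaration for the Btrieve interface DLL (w3btrv7.dll on PSQL 8);
' take the real declaration from the module supplied with your PSQL SDK.
Declare Function BTRCALL Lib "w3btrv7.dll" (ByVal OP As Integer, ByVal Pb As String, _
    Db As Any, DL As Long, Kb As Any, ByVal Kl As Integer, ByVal Kn As Integer) As Integer

Const B_OPEN = 0
Const B_CLOSE = 1
Const B_INSERT = 2

Sub LoadRefreshFile()
    Dim PosBlock As String * 128       ' Btrieve position block
    Dim KeyBuf As String * 255
    Dim RecBuf As String * 200         ' placeholder fixed-length record image
    Dim DataLen As Long
    Dim sLine As String
    Dim rc As Integer

    ' Open the data file (path is a placeholder; run this ON the server, as Bill suggests)
    KeyBuf = "C:\PVSW\DATA\MYTABLE.MKD" & Chr$(0)
    DataLen = 0
    rc = BTRCALL(B_OPEN, PosBlock, ByVal "", DataLen, ByVal KeyBuf, Len(KeyBuf), 0)
    If rc <> 0 Then Err.Raise vbObjectError, , "Btrieve Open failed: " & rc

    Open "refresh.txt" For Input As #1
    Do While Not EOF(1)
        Line Input #1, sLine
        RecBuf = sLine                 ' assumes each file line is already the record image
        DataLen = Len(RecBuf)
        ' Insert with key number -1 (no currency change); Insert Extended (operation 40)
        ' can batch many records per call, as Bill notes; see the Btrieve API reference for its buffer layout.
        rc = BTRCALL(B_INSERT, PosBlock, ByVal RecBuf, DataLen, ByVal KeyBuf, Len(KeyBuf), -1)
        If rc <> 0 Then Err.Raise vbObjectError, , "Btrieve Insert failed: " & rc
    Loop
    Close #1

    rc = BTRCALL(B_CLOSE, PosBlock, ByVal "", DataLen, ByVal KeyBuf, 0, 0)
End Sub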