Copying data from one database to another in Pervasive

I'm using the following query to copy data from one database table to another database table in Pervasive.

insert into c.SALES_HISTORY_DETAIL select * from r.SALES_HISTORY_DETAIL

While the query is running, the Monitor utility shows records being copied, but it just goes into an infinite loop and never actually copies any data. The file in the database I'm copying to keeps growing in size indefinitely as well.
GerhardpetAsked:

mirtheilCommented:
What version of Pervasive are you using?  
How large is the table?
Does it fail if you run the query in the PCC?  

I tried it with a small (30-record) table and your syntax worked. I'm using PSQL v11.31.
GerhardpetAuthor Commented:
I'm on v10.31.

Correct, for smaller tables it works for me too. The table I'm copying from has 1,334,395 records. In Windows Explorer the file is already split into two files, one at 2 GB and the other at almost 1 GB.
Bill BachPresidentCommented:
Your query may be working, but this syntax will force all inserts to complete within the same database transaction. There are two viable solutions:
1) write a stored procedure to handle moving one record at a time. This will be a bit faster, as well, but requires that you create a cursor, iterate over the source data set one record at a time, and then insert the records into the target table.
2) if the file structures are indeed the same, then use the BUTIL -COPY command at the Btrieve level to copy your records.

The latter is far easier, but only if the tables are identical. If you need help with the stored procedure, let me know and I can give you a shell to start from.

GerhardpetAuthor Commented:
I'd like to try the stored procedure. If you can help me get going, that would be great.

I have never tried BUTIL -COPY. How is that done? I may explore both.
Bill BachPresidentCommented:
The copy command is simple:
BUTIL -COPY source target

I will post a sample stored procedure tomorrow morning.
Bill BachPresidentCommented:
Here's a sample MoveData SP for you...
create procedure movedata()  with default handler as
begin
   -- One carrier variable per column; this sample assumes a two-column table of INTs
   declare :c1 int;
   declare :c2 int;
   -- Cursor over the source table
   declare cur1 cursor for
       select * from t1;
   open cur1;
   fetch next from cur1 into :c1, :c2;
   -- SQLSTATE '00000' means the last fetch succeeded; loop until end of data
   while (SQLSTATE = '00000') do
      insert into t2 values (:c1, :c2);
      fetch next from cur1 into :c1, :c2;
   end while;
   close cur1;
end;

There are other ways, of course, but this would allow you to map between disparate table definitions.
GerhardpetAuthor Commented:
Can you give me a bit more details on how to use the Stored Procedure?

Will the procedure also work if I only want to copy some records, using a WHERE clause?

Also, with BUTIL -COPY source target, do I enter database.table for the source and target? I have not tried it yet.
Bill BachPresidentCommented:
Sure. Just change the SQL statement inside the cursor. Be sure to create a variable of the correct type for each field, too.

The BUTIL command is run from the command line, accessing the base Btrieve files directly. Check the manuals for more information on BUTIL if you need it.
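For example, if SALES_HISTORY_DETAILS is backed by a file named SALESHD.BTR (the file name and paths below are hypothetical; check your DDFs for the real ones), the command would look something like:

BUTIL -COPY C:\OLDDATA\SALESHD.BTR C:\NEWDATA\SALESHD.BTR

Note that -COPY inserts records into an existing target file. If the target file doesn't exist yet, BUTIL -CLONE can create an empty file with the same structure as the source first.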
GerhardpetAuthor Commented:
I'm not that good with databases beyond basic queries and using Access to connect to a Pervasive database, so I'd like a bit more hand-holding with the stored procedure you have provided.

The database table that I want to copy is called SALES_HISTORY_DETAILS

Source OLD.SALES_HISTORY_DETAILS
Target NEW.SALES_HISTORY_DETAILS

OLD and NEW being the database names in PCC.

Also, while copying the data I want to use a WHERE clause to only include records with a date greater than one I specify.

What are the steps I need to make this work?
Bill BachPresidentCommented:
create procedure movedata()  with default handler as
begin
   declare :c1 int;
   declare :c2 int;
   declare cur1 cursor for
       -- {d 'yyyy-mm-dd'} is the ODBC date-literal escape; fill in your real cutoff date
       select * from OLD.SALES_HISTORY_DETAILS WHERE DateField > {d 'yyyy-mm-dd'};
   open cur1;
   fetch next from cur1 into :c1, :c2;
   while (SQLSTATE = '00000') do
      insert into NEW.SALES_HISTORY_DETAILS values (:c1, :c2);
      fetch next from cur1 into :c1, :c2;
   end while;
   close cur1;
end;

Modify the WHERE clause as needed. For each field in the table that you want to bring over, define a carrier field (e.g. :c1, :c2, etc.) of the proper data type and add it to the FETCH NEXT and INSERT lines.
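For example, with just three fields (the column names and types here are hypothetical placeholders for your real ones):

create procedure movedata()  with default handler as
begin
   declare :c1 int;
   declare :c2 char(20);
   declare :c3 date;
   declare cur1 cursor for
       select ID, CUSTOMER, SALE_DATE from OLD.SALES_HISTORY_DETAILS
           where SALE_DATE > {d 'yyyy-mm-dd'};
   open cur1;
   fetch next from cur1 into :c1, :c2, :c3;
   while (SQLSTATE = '00000') do
      -- Naming the target columns explicitly lets the two tables differ in column order
      insert into NEW.SALES_HISTORY_DETAILS (ID, CUSTOMER, SALE_DATE)
          values (:c1, :c2, :c3);
      fetch next from cur1 into :c1, :c2, :c3;
   end while;
   close cur1;
end;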
GerhardpetAuthor Commented:
The table I want to bring over has 84 fields. How can I use the SP to bring over all of them?

Next, I need to know how to execute the SP. Do I create it in the source database, or in both source and target?
Bill BachPresidentCommented:
For each field, define a carrier variable (:c1 through :c84) of the correct data type.  Add the field list to both the fetch and insert statements.

Use the command CALL MoveData().  If you need to pass in the date variable, then you can do this, too, but you'll have to add an input variable as well.  For exact details on syntax, see the SQL Engine Syntax Reference text in the manuals (PCC/Help/Documentation).
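A sketch of the parameterized version (DateField and the two carrier variables are placeholders carried over from the earlier sample, and the date value is just an example; verify the IN-parameter syntax against the Syntax Reference for your version):

create procedure movedata(in :cutoff date)  with default handler as
begin
   declare :c1 int;
   declare :c2 int;
   declare cur1 cursor for
       select * from OLD.SALES_HISTORY_DETAILS where DateField > :cutoff;
   open cur1;
   fetch next from cur1 into :c1, :c2;
   while (SQLSTATE = '00000') do
      insert into NEW.SALES_HISTORY_DETAILS values (:c1, :c2);
      fetch next from cur1 into :c1, :c2;
   end while;
   close cur1;
end;

-- Then run it, passing the cutoff date:
call movedata('2014-01-01');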
GerhardpetAuthor Commented:
I've been banging my head against the wall and can't get it to work. The reference guide is of some help, but it is an enormous task to do this for 84 fields. That is probably why I can't get it to work.

Isn't there a simpler way to do this?

My original query works; how could we make it work so that it only does one row at a time?
Bill BachPresidentCommented:
There is no simpler way using pure SQL that I am aware of.  You could try changing the * to a field list (Field1, Field2, Field3, etc.) and adding the fields and variables 10 at a time.  This will give you some successes.  However, the INSERT statements may fail depending on whether the target fields (those that are NOT included) are nullable or not.  In that case, comment out the INSERT statement and just get the reader process working first, then add the INSERT back in after the fact.
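For example, a first pass that only exercises the reader might look like this (column names hypothetical, INSERT disabled):

create procedure movedata()  with default handler as
begin
   declare :c1 int;
   declare :c2 int;
   declare :c3 int;
   declare cur1 cursor for
       select Field1, Field2, Field3 from OLD.SALES_HISTORY_DETAILS;
   open cur1;
   fetch next from cur1 into :c1, :c2, :c3;
   while (SQLSTATE = '00000') do
      -- Re-enable once the fetch loop runs cleanly:
      -- insert into NEW.SALES_HISTORY_DETAILS (Field1, Field2, Field3) values (:c1, :c2, :c3);
      fetch next from cur1 into :c1, :c2, :c3;
   end while;
   close cur1;
end;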

There *IS* a simpler way -- if you are up to using other tools. Essentially, what you are doing amounts to "ETL" work, short for Extract/Transform/Load. There are many ETL tools available on the market, some of which will also work with PSQL databases. In fact, if you can consider an upgrade to PSQLv12, then you have access to a very powerful tool called Actian DataFlow, which will allow you to do what you want graphically, in a few mouse clicks. However, this is still a young product, and it is rife with minor issues which make using it a bit touchy, at best. Other, more mature products include Actian's Data Integrator (which might be a better starting point, since it has been around for many years and is much more stable), along with products like SSIS, Jaspersoft, and Pentaho. Or just search the web for "ETL" and you'll find other ideas to consider.
GerhardpetAuthor Commented:
@BillBach - I have some experience with Pentaho Spoon; I have it set up and it works to move the data. Thanks for the idea!

One question though: my source table has 1,337,452 records, and after I move the data using a WHERE clause to only include records based on a date field >=, the target has 918,749 records.

But when I check the physical size of the files in Windows Explorer, the target file is 1.6 GB and the source is 0.6 GB (600 MB).

How is that possible if I have fewer records in the target than the source?
Bill BachPresidentCommented:
When you are moving data VERY rapidly, the number of shadow pages required to handle the database system transactions is exponentially higher.  This is more of a measure of how fast you are inserting than anything else, and if you run the conversion on a MUCH slower computer (think Pentium 233 over 10Mbps network), then you will see a much smaller result file.  (See "Shadow Paging" in the online manuals for more explanation of this process.)

You can rebuild the database file immediately after the copy to reclaim the free space.  Rebuilding will result in the most compact file possible, with very few shadow pages.  If you need a better idea of your actual page usage counts, check out Goldstar Software's KeyData utility.  It will analyze the PSQL database file and report back with the page usage counts, index utilization percentages, and more.  Alternatively, if you are on PSQLv12, try the "dbdefrag -analyze <filename>" command from a command prompt, and it will give you three core statistics letting you know if a defrag would be helpful.
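On v10, the Btrieve maintenance utility can also give you a quick look at a file's statistics (page size, record count, key definitions) from the command line; the path here is hypothetical:

BUTIL -STAT C:\NEWDATA\SALESHD.BTR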
GerhardpetAuthor Commented:
I'm on PSQL v10 and have no option to upgrade. The program only supports up to v10.31.

How do I rebuild a database file? Do I use the Pervasive Rebuild Utility? If so, what settings should I consider in the following?

[Screenshot: Pervasive Rebuild Utility settings]
Bill BachPresidentCommented:
Default settings should be fine, but...
Page Size: If you run the KeyData tool against the file, it will tell you if there is a more efficient page size in terms of storage capacity.  Usually, switching to a 16K page results in faster performance as well, but make sure that your application doesn't check this for some strange reason.  
Key Number: If you know that the application always accesses the file in a specific key order, then rebuilding by THAT key number can yield faster performance, since it effectively uses the cache to read records in order, not randomly.  However, rebuilding by one key may make OTHER key reads less efficient.
GerhardpetAuthor Commented:
I just did a rebuild on one of the files and was able to reduce the file size by about 50%.

I left all settings as default.

There are so many odd things about the ERP application, and I don't want to take any risks and break something. The application is Sage BusinessVision, slated to die for good in 2020. It won't let you log in anymore beyond Dec 31, 2020, and Sage is not going to do anything about it. That tells you everything.
Bill BachPresidentCommented:
>It won't let you login anymore beyond Dec 31 2020 and Sage is not going to do anything about it.
WOW!  That's service for you!
GerhardpetAuthor Commented:
Yep. A lot of frustration among 100+ partners supporting BusinessVision.