Solved

Data Pump by Date Range

Posted on 2008-06-25
Medium Priority
1,460 Views
Last Modified: 2008-06-29
Hello,

I have a schema (SCOTT) with two related tables: SCOTT.Documents (parent) and SCOTT.DocumentPages (child).  The SCOTT.Documents table has a "created" DATE column, and primary keys ensure there cannot be any duplicates.

SCOTT.Documents
--------------------------------------
doc_id   NUMBER    PK
description   VARCHAR2
created   DATE

SCOTT.DocumentPages
--------------------------------------
page_id   NUMBER   PK
doc_id   NUMBER   FK
document   BLOB

1. How can I use Data Pump to export the data in these two tables so that only SCOTT.Documents rows created after 01-01-2008 (and their corresponding SCOTT.DocumentPages rows) are exported, while all other schema data is exported in full?

2. The destination database/schema only has data earlier than 01-01-2008 in the two tables listed.  How can I import this data into the destination database while ensuring the existing older Documents/DocumentPages data and its indexes and constraints (PK, FK, etc.) are not negatively affected?

Note: Everything else in the schema outside these two tables should be replaced in full on the destination database/schema.

Parameter file (.par) examples would be ideal.

Thanks,

Michael4606
Question by:michael4606
4 Comments
 

Expert Comment

by:gatorvip
ID: 21869005
I don't know whether you can do it directly with expdp, but you can create two tables containing only the data you want, export those two tables, then import them into your new location (also into two "temporary" tables), and finally merge them back with your other data. A sketch of this approach is below.
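For illustration, a minimal sketch of that staging-table approach. The DOCS_2008/DOCPAGES_2008 table names, the DATA_PUMP_DIR directory object and the scott/tiger credentials are placeholders, not anything from the question:

-- On the source: build staging tables holding only the rows to move.
CREATE TABLE scott.docs_2008 AS
  SELECT * FROM scott.documents
   WHERE created > DATE '2008-01-01';

CREATE TABLE scott.docpages_2008 AS
  SELECT p.* FROM scott.documentpages p
   WHERE p.doc_id IN (SELECT doc_id FROM scott.docs_2008);

-- Table-mode export of just the two staging tables:
--   expdp scott/tiger DIRECTORY=DATA_PUMP_DIR DUMPFILE=docs_2008.dmp TABLES=DOCS_2008,DOCPAGES_2008

-- On the destination, after importing the staging tables,
-- merge them into the real tables (parent first, then child):
INSERT INTO scott.documents     SELECT * FROM scott.docs_2008;
INSERT INTO scott.documentpages SELECT * FROM scott.docpages_2008;
COMMIT;

-- Drop the staging tables once the merge is verified.
DROP TABLE scott.docpages_2008;
DROP TABLE scott.docs_2008;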
 

Expert Comment

by:schwertner
ID: 21872781
There is a tricky way to do this, and I use it for special purposes.
The trick is that Data Pump has an API called DBMS_DATA_PUMP.
You can create a PL/SQL procedure (or a package that exports/imports) that:
1. Imports both tables in their original form.
2. Deletes the unwanted records after the import is done.

The only problem you will have is synchronizing the end of the import with the beginning of the delete, because with the package you only initiate the Data Pump job. It runs in the background, and you have to know when it stops.
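For illustration, a minimal sketch of that idea, assuming a dump file scott_docs.dmp already sits in the DATA_PUMP_DIR directory object (the file, job and date-range names are placeholders). DBMS_DATA_PUMP.WAIT_FOR_JOB blocks until the background job finishes, which addresses the synchronization problem:

DECLARE
  l_handle NUMBER;
  l_state  VARCHAR2(30);
BEGIN
  -- Open a table-mode import job.
  l_handle := DBMS_DATA_PUMP.OPEN(operation => 'IMPORT',
                                  job_mode  => 'TABLE',
                                  job_name  => 'IMP_DOCS_JOB');

  DBMS_DATA_PUMP.ADD_FILE(
    handle    => l_handle,
    filename  => 'scott_docs.dmp',
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATA_PUMP.KU$_FILE_TYPE_DUMP_FILE);

  -- Restrict the job to the two tables of interest.
  DBMS_DATA_PUMP.METADATA_FILTER(
    handle => l_handle,
    name   => 'NAME_EXPR',
    value  => 'IN (''DOCUMENTS'',''DOCUMENTPAGES'')');

  DBMS_DATA_PUMP.START_JOB(l_handle);

  -- Block until the background job completes.
  DBMS_DATA_PUMP.WAIT_FOR_JOB(handle => l_handle, job_state => l_state);

  -- Only now is it safe to delete the unwanted rows
  -- (whatever "unwanted" means in your case; before 01-01-2008 here).
  DELETE FROM scott.documentpages
   WHERE doc_id IN (SELECT doc_id FROM scott.documents
                     WHERE created < DATE '2008-01-01');
  DELETE FROM scott.documents
   WHERE created < DATE '2008-01-01';
  COMMIT;
END;
/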
 

Author Comment

by:michael4606
ID: 21883088
The legacy export/import tools had a QUERY parameter that could be used to filter data with a WHERE clause.  I guess, in essence, I am asking for the modern Data Pump way of doing this.

Thanks again,

Michael4606
 

Accepted Solution

by:schwertner (earned 375 total points)
ID: 21883224
You can also use a filter in Data Pump, just like in the classic export.
But here you face a relationship between two tables.
That makes for a complex WHERE clause, and I am not sure you can express this condition in the QUERY clause.
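For what it's worth, here is a sketch of how the parameter files might look, assuming the parent/child relationship can be pushed into the child table's QUERY clause as a subquery. The directory object, dump/log file names and the two-step split are assumptions, not anything confirmed above:

# expdp_docs.par -- export only documents created after 01-01-2008, plus their pages
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=docs_2008.dmp
LOGFILE=docs_2008_exp.log
TABLES=SCOTT.DOCUMENTS,SCOTT.DOCUMENTPAGES
QUERY=SCOTT.DOCUMENTS:"WHERE created > TO_DATE('01-01-2008','DD-MM-YYYY')"
QUERY=SCOTT.DOCUMENTPAGES:"WHERE doc_id IN (SELECT doc_id FROM SCOTT.DOCUMENTS WHERE created > TO_DATE('01-01-2008','DD-MM-YYYY'))"

# impdp_docs.par -- append to the existing destination tables, leaving the
# older rows and the existing indexes/constraints untouched
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=docs_2008.dmp
LOGFILE=docs_2008_imp.log
TABLES=SCOTT.DOCUMENTS,SCOTT.DOCUMENTPAGES
TABLE_EXISTS_ACTION=APPEND

The rest of the schema could then be refreshed with a separate schema-mode export/import that excludes these two tables, e.g. EXCLUDE=TABLE:"IN ('DOCUMENTS','DOCUMENTPAGES')".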
