  • Status: Solved

Data Pump by Date Range

Hello,

I have a schema (SCOTT) with two related tables: SCOTT.Documents (parent) and SCOTT.DocumentPages (child).  The SCOTT.Documents table has a "Created" date column.  There are PK fields that ensure there cannot be duplicates, etc.

SCOTT.Documents
--------------------------------------
doc_id   NUMBER    PK
description   VARCHAR2
created   DATE

SCOTT.DocumentPages
--------------------------------------
page_id   NUMBER   PK
doc_id   NUMBER   FK
document   BLOB

1. How can I use Data Pump to export all the data in both these tables so that only SCOTT.Documents rows (and their corresponding SCOTT.DocumentPages rows) created after 01-01-2008 are exported, AND all other schema data is exported in full?

2. The destination database\schema only has data earlier than 01-01-2008 for the two tables listed.  How can I import this data into the destination database while ensuring the existing older Document\DocumentPages data and its indexes and constraints (PK, FK, etc.) are not negatively affected?

Note: Everything in the schema outside these two tables should be replaced in full on the destination database\schema.

Parameter file (.par) examples would be ideal.

Thanks,

Michael4606
1 Solution
 
gatorvipCommented:
I don't know whether you can do it directly with expdp, but you can create two tables containing only the data you want, export those two tables, import them into your new location (also into two "temporary" tables), and finally merge them back with your other data.
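The staging-table approach gatorvip describes could look roughly like this. This is a hedged sketch: the staging-table names and the date format are my assumptions, and the merge-back step runs on the destination after the two staging tables have been exported and imported.

```sql
-- Source side: stage only the rows to be moved (names and date format assumed).
CREATE TABLE scott.documents_stage AS
  SELECT * FROM scott.documents
   WHERE created > TO_DATE('01-01-2008', 'DD-MM-YYYY');

CREATE TABLE scott.documentpages_stage AS
  SELECT p.*
    FROM scott.documentpages p
   WHERE p.doc_id IN (SELECT d.doc_id
                        FROM scott.documents d
                       WHERE d.created > TO_DATE('01-01-2008', 'DD-MM-YYYY'));

-- Destination side, after exporting/importing the two staging tables:
-- insert parents first so the FK on documentpages is satisfied.
INSERT INTO scott.documents      SELECT * FROM scott.documents_stage;
INSERT INTO scott.documentpages  SELECT * FROM scott.documentpages_stage;
DROP TABLE scott.documents_stage;
DROP TABLE scott.documentpages_stage;
```

Because the destination only holds pre-2008 rows and the exported rows are all post-2008, the plain INSERTs should not collide with the existing PKs.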
 
schwertnerCommented:
There is a tricky way to do this, and I use it for special purposes.
The trick is that Data Pump has a PL/SQL API called DBMS_DATAPUMP.
You can create a PL/SQL procedure (or a package that exports/imports) that:
1. Imports both tables in their original form.
2. Deletes the unwanted records after the import is done.

The only problem you will have is synchronizing the end of the import
with the beginning of the delete action. This is because, using the package, you only
initiate the Pump job; it runs in the background, and you have to know when it stops.
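The synchronization problem schwertner raises is what DBMS_DATAPUMP.WAIT_FOR_JOB is for: it blocks until the background job finishes. A hedged PL/SQL sketch, where the directory object, dump-file name, and cutoff logic are my assumptions:

```sql
DECLARE
  h         NUMBER;
  job_state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'TABLE');
  DBMS_DATAPUMP.ADD_FILE(h, 'docs.dmp', 'DPUMP_DIR');
  DBMS_DATAPUMP.METADATA_FILTER(h, 'NAME_EXPR',
      'IN (''DOCUMENTS'',''DOCUMENTPAGES'')');
  DBMS_DATAPUMP.START_JOB(h);

  -- WAIT_FOR_JOB returns only when the background job completes,
  -- which is the synchronization point described above.
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);

  -- Delete the unwanted (old) rows: child first, to satisfy the FK.
  DELETE FROM scott.documentpages
   WHERE doc_id IN (SELECT doc_id FROM scott.documents
                     WHERE created <= TO_DATE('01-01-2008', 'DD-MM-YYYY'));
  DELETE FROM scott.documents
   WHERE created <= TO_DATE('01-01-2008', 'DD-MM-YYYY');
  COMMIT;
END;
/
```

Note the package is named DBMS_DATAPUMP (no underscore between DATA and PUMP).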
 
michael4606Author Commented:
The legacy export\import tools had a QUERY parameter that could be used to filter data using a WHERE clause.  I guess in essence I am asking for the modern Data Pump method of doing this.

Thanks again,

Michael4606
 
schwertnerCommented:
You can also use a filter in Data Pump, as in the classic Export.
But here you face a relationship between two tables.
That is a complex WHERE clause, and I am not sure that you can express this
condition in the EXCLUDE clause.
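For what it's worth, Data Pump does carry the legacy QUERY parameter forward (it is QUERY, not EXCLUDE, that takes a WHERE clause), and a QUERY filter on the child table can use a subquery against the parent. A hedged sketch of parameter files answering both questions; the directory object, dump-file names, and date format are my assumptions, so test on a scratch schema first:

```
# expdp.par -- full schema export, filtered rows for the two tables (sketch)
SCHEMAS=SCOTT
DIRECTORY=DPUMP_DIR
DUMPFILE=scott_filtered.dmp
LOGFILE=expdp_scott.log
QUERY=SCOTT.DOCUMENTS:"WHERE created > TO_DATE('01-01-2008','DD-MM-YYYY')"
QUERY=SCOTT.DOCUMENTPAGES:"WHERE doc_id IN (SELECT doc_id FROM SCOTT.DOCUMENTS WHERE created > TO_DATE('01-01-2008','DD-MM-YYYY'))"

# impdp_rest.par -- replace everything except the two tables (sketch)
DIRECTORY=DPUMP_DIR
DUMPFILE=scott_filtered.dmp
EXCLUDE=TABLE:"IN ('DOCUMENTS','DOCUMENTPAGES')"
TABLE_EXISTS_ACTION=REPLACE

# impdp_docs.par -- append to the two tables, leaving existing rows,
# indexes, and constraints alone (sketch)
DIRECTORY=DPUMP_DIR
DUMPFILE=scott_filtered.dmp
TABLES=SCOTT.DOCUMENTS,SCOTT.DOCUMENTPAGES
TABLE_EXISTS_ACTION=APPEND
```

Two impdp passes are needed because TABLE_EXISTS_ACTION applies to the whole run, and the question wants REPLACE for the rest of the schema but APPEND for the two document tables.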
