Jasmin Shahrzad
asked on
exp/imp
Is it possible to do a table export with a timestamp range (from 'time' to 'time')?
I don't want to use a snapshot over a database link. I want to export, e.g., the changes from the last 2 hours of some table and import them into another database.
If I create a snapshot in the source database, can I use the snapshot log in another database?
You can run queries with AS OF TIMESTAMP (SYSDATE - 2/24) only if the UNDO_RETENTION parameter of the DB is large enough.
The universal approach in your case is to use the archive logs to make a Point In Time Recovery (PITR) into a spare DB, then export and import the table(s).
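For reference, such a PITR into a spare instance is usually driven from RMAN; a minimal sketch (the UNTIL TIME value is a placeholder you must replace with your own point in time):

run {
  set until time "to_date('2016-01-15 12:00:00','yyyy-mm-dd hh24:mi:ss')";
  restore database;
  recover database;
}
alter database open resetlogs;

Once the spare DB is open you can export the table(s) from it as usual.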
ASKER
expdp system/dev Full=N DIRECTORY=DATA_PUMP_DIR dumpfile=dev3.dmp logfile=dev3.log query=dev.test:'" select * from dev.test minus select * from test as OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '120' MINUTE);"
is working
but impdp not working.
impdp system/dev schemas=dev2 directory=DATA_PUMP_DIR remap_schema=dev:dev2 dumpfile=dev2.dmp logfile=expdev2.log
ORA-39002: invalid operation
ORA-39165: Schema DEV2 was not found.
the dumpfile will only create the table and its data
you'll have to pre-create the schema dev2
ASKER
I created the dev2 user, so it's not because the user (schema) is not found.
>> it's not because user(schema) is not found.
I have to believe what the error message is telling me.
>>You can do queries with "as of timestamp 'sysdate-2/24'"
There is also a FLASHBACK_TIME parameter of expdp:
http://docs.oracle.com/database/121/SUTIL/GUID-CB8386E5-CA76-4D4A-884E-F97BFC58B230.htm#SUTIL850
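For example, FLASHBACK_TIME can export the table as it looked at a given moment, without any QUERY clause (the timestamp, directory and file names are placeholders; the quoting is easier to manage in a parameter file):

expdp dev/dev TABLES=test DIRECTORY=DATA_PUMP_DIR DUMPFILE=test_flb.dmp FLASHBACK_TIME="TO_TIMESTAMP('2016-01-15 10:00:00','YYYY-MM-DD HH24:MI:SS')"

Note this gives the whole table as of that time, not just the rows changed since then.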
You export objects via a SELECT statement but try to import a schema. I suppose the schema is not found IN THE EXPORT FILE.
Try to do FULL import of the export file and connect to the DB as user DEV2.
The best way is to use a parameter file:
userid="dev2/password@instance"
full=Y
........
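For example, a complete parameter file for this import (call it imp_dev2.par; the password and instance name are placeholders) could be:

userid="dev2/password@instance"
full=Y
directory=DATA_PUMP_DIR
dumpfile=dev2.dmp
logfile=impdev2.log

and then run: impdp parfile=imp_dev2.par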
ASKER
You are right, I think the export is wrong:
expdp dev/dev Full=N DIRECTORY=DATA_PUMP_DIR TABLES=TEST dumpfile=dev.dmp logfile=dev.log query=dev.test:'" select * from dev.test minus select * from test as OF TIMESTAMP (SYSDATE-2/24);"'
ORA-31693: Table data object "DEV"."TEST" failed to load/unload and is being skipped due to error:
ORA-00933: SQL command not properly ended
What I want is an export/import of the data of table test to another database every 2 hours.
It appears your two questions have now become the same?
As I posted in your other question, I'm not sure the path you have chosen will accomplish what you want.
Since your questions now appear to be the same, we should choose one to continue with so we are not duplicating posts.
you don't need the ";" at the end of your query statement
and why prepend the query with dev.test ?
ASKER
same problem
expdp dev/dev Full=N DIRECTORY=DATA_PUMP_DIR TABLES=TEST dumpfile=dev.dmp logfile=dev.log query=test:'"select * from test minus select * from test as OF TIMESTAMP (SYSDATE-10/1440)"'
error :
ORA-31693: Table data object "DEV"."TEST" failed to load/unload and is being skipped due to error:
ORA-00933: SQL command not properly ended
Master table "DEV"."SYS_EXPORT_SCHEMA_02" successfully loaded/unloaded
there is a syntax error in the query
try to run the query in sqlplus or toad first
after you get the query working, then copy it to the exp command
select * from test minus select * from test as OF TIMESTAMP (systimestamp-10/1440)
As I posted in your other question:
I don't think the query parameter for export takes a complete select statement. From everything I've read, it takes a WHERE clause to apply to the tables being exported.
I don't think it will work the way you are trying to make it work.
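For instance, if the table had a column recording the change time, the documented WHERE-clause form of QUERY would be enough on its own (last_updated is a hypothetical column, file names are placeholders):

expdp dev/dev TABLES=test DIRECTORY=DATA_PUMP_DIR DUMPFILE=test_2h.dmp QUERY=test:'"WHERE last_updated > SYSDATE - 2/24"'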
ASKER
There is no syntax error in the query. I think slightwv is right: a full SQL query is not supported in exp.
You should try a more simple way.
Ensure enough disk space and create there a new schema.
With a script grant access to the original schema to the new schema.
With another script try to create the tables there with a statement like:
CREATE TABLE name AS (SELECT * FROM original_schema.name AS OF TIMESTAMP (SYSDATE-10/1440));
Use the EXECUTE IMMEDIATE PL/SQL statement.
If all is OK, make an export and after that import.
Yes, I understand, there is possibly not enough disk space. But the export also needs disk space.
After the Import create the needed indexes.
You will also need other related objects, but step by step you will do the job.
Alternatively you can use a DB link, but it is slow and possibly the new schema will not be consistent.
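A minimal sketch of that EXECUTE IMMEDIATE step, assuming the source schema is DEV and the table is TEST (repeat or loop over a list for every table you need):

BEGIN
  EXECUTE IMMEDIATE
    'CREATE TABLE test AS (SELECT * FROM dev.test AS OF TIMESTAMP (SYSDATE - 10/1440))';
END;
/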
Creating a table as it looked in the past will not help solve the problem.
What Jasmin needs is to capture the changes in a table over time so those exact same changes can be made to a table in a remote database.
ASKER
What I decided to do is export the whole table the first time, then run SELECT * FROM original_schema.name AS OF TIMESTAMP (SYSDATE-10/1440) on the source table, spool it to a file, and load it into the other database with sqlldr. Of course delete is not supported here, so I support update/insert only.
I could also use LogMiner, but since we do not delete, sqlldr is the simpler way.
Open for suggestions.
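That spool-and-load cycle could be sketched like this, assuming a comma-delimited format and hypothetical columns col1..col3:

-- on the source, spool the 2-hour delta to a flat file
SET PAGESIZE 0 FEEDBACK OFF COLSEP ','
SPOOL /tmp/test_delta.csv
SELECT * FROM dev.test MINUS SELECT * FROM dev.test AS OF TIMESTAMP (SYSDATE - 2/24);
SPOOL OFF

-- test.ctl on the target
LOAD DATA
INFILE '/tmp/test_delta.csv'
APPEND INTO TABLE test
FIELDS TERMINATED BY ','
(col1, col2, col3)

Run it with sqlldr dev2/password control=test.ctl. APPEND matches the insert-only requirement; true updates would still need a staging table plus a MERGE.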
ASKER CERTIFIED SOLUTION
It is good news that a complex SELECT with MINUS and AS OF TIMESTAMP works in a Data Pump export.
But the idea to create an additional table based on a SELECT like select * from test minus select * from test as OF TIMESTAMP systimestamp-10/1440 and to export that table has its advantages. It is a simple and proven way, if there is enough disk space of course.
The same thing can also be done with a DB link. So there are more than enough alternatives (I didn't even mention materialized views). Of course the asker has to take the decision; I am only trying to point out other ways.
complex ?
there are a million ways to achieve this
the first scenario that springs to my mind:
- export data to a text file as insert statements with toad
- place the file on a usb stick
- attach to pigeon
- send pigeon to remote site
- detach usb from pigeon
- plugin usb on computer at remote
- run script in toad on dest db
- feed pigeon
ASKER
The problem with snapshots or materialized views is that they are read-only, and if the snapshot log breaks then insert, delete and update are impossible until you delete the log and recreate it from scratch. And DB links are not supported at my work.
But perfect, Geert Gruwez, THANK YOU
ASKER
I put it all in crontab on AIX.
have you checked materialized view logs ?
it's a prerequisite for a materialized view
> it contains all changes on the table as of a certain time
you can generate the insert, update and delete statements based on a materialized view log
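For reference, such a log is created on the source table like this (WITH PRIMARY KEY is an assumption; adjust to the table's actual key and the columns you need to track):

CREATE MATERIALIZED VIEW LOG ON dev.test
  WITH PRIMARY KEY, SEQUENCE
  INCLUDING NEW VALUES;

The changes then accumulate in the generated MLOG$_TEST table until a refresh consumes them.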
ASKER
Yes, yes, I have been working with this since Oracle 8. It is a real problem when your snapshot log breaks during the night batch (around 02:00).
And here DB links between databases are not allowed.
Deleted rows are no problem; we do not delete rows from the historical tables.
ASKER
Perfect
no clue what that is
if your undo is big enough ... you could increase it if not
you could use a query to find the data changed in the last 2 hours:
select * from table
minus
select * from table as of timestamp (sysdate-2/24);
http://docs.oracle.com/database/122/ADFNS/flashback.htm#ADFNS618
exp and expdp can export data based off a query:
http://docs.oracle.com/database/122/SUTIL/oracle-original-export-utility.htm#SUTIL2688
http://docs.oracle.com/database/122/SUTIL/oracle-data-pump-export-utility.htm#SUTIL860
and then run an import with data_only
http://docs.oracle.com/database/122/SUTIL/oracle-original-import-utility.htm#SUTIL1710
expdp and impdp have more capabilities than the original exp/imp
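For example, the periodic refresh on the target could be a data-only import that appends to the existing table (credentials and file names are placeholders):

impdp dev2/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=test_2h.dmp CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=APPEND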