  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 1068

How to do a full DB migration using Data Pump (DB running in NOARCHIVELOG mode)

Hi everybody,

I have two servers (10g Release 2) in two different locations. I want a certain instance, which is currently in production, to be fully copied onto another server for testing/development purposes.
I've never done such a thing before (I've been a plain PL/SQL developer so far).
The production instance is running in NOARCHIVELOG mode.

My first idea was to:
- stop the PROD database,
- make a full dump using the new Data Pump export utility (it is said to be more efficient than the older export/import utility) from the Linux command line (as far as I can tell, only a simple parameter file has to be assembled),
- restart the PROD server,
- copy the files to the TEST server (I guess via FTP),
- import them with the corresponding Data Pump utility.
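As a rough sketch of what those command-line steps might look like (the directory object name DPUMP_DIR, the file names, and the credentials below are placeholders, not anything from your system):

```shell
# Build a minimal parameter file for a full Data Pump export.
# DPUMP_DIR must be an existing Oracle directory object, e.g. created with:
#   CREATE DIRECTORY dpump_dir AS '/u01/app/oracle/dpump';
cat > full_exp.par <<'EOF'
FULL=Y
DIRECTORY=DPUMP_DIR
DUMPFILE=prod_full.dmp
LOGFILE=prod_full_exp.log
EOF

# On PROD (the database stays open -- Data Pump needs it running):
#   expdp system/password PARFILE=full_exp.par
#
# Copy prod_full.dmp to the TEST server's directory path, then on TEST:
#   impdp system/password FULL=Y DIRECTORY=DPUMP_DIR \
#         DUMPFILE=prod_full.dmp LOGFILE=prod_full_imp.log
```

Note that expdp connects to a running instance, so the parameter file is all you prepare in advance.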

My questions:
- It would be desirable not to stop the PROD server if it is not necessary. As I understand it, stopping it will be necessary, since it's running in NOARCHIVELOG mode and you won't get a consistent dump while it's running.
Or is it perhaps possible to make a full dump (for a dev/test copy) while it is running?

- I also saw in the documentation that there's an API (DBMS_DATAPUMP or something like that) with which you can copy one database instance to another database (without the command line and file copying).
Do you think this could be as good as my original plan (or even better) in terms of performance?

- Any comments, hints, tricks, or warnings about possible errors in such a migration are welcome.

- Also, it would be great if you could provide me with some sample code, just to save me some time.

Thanks for your help
Gergo
2 Solutions
 
gvsbnarayanaCommented:
Hi,
  To take an export (using Data Pump as well), you don't need the database running in ARCHIVELOG mode.
Your idea is correct, except for stopping the database. If you stop the PROD server, you will not be able to connect to the database, and hence no Data Pump export.
You can close all the applications that connect to the database, if possible, and initiate the Data Pump export. Otherwise, you can export the database as of a given SCN (you can query it with SELECT current_scn FROM v$database) to get a consistent dump, and supply this SCN to the Data Pump script.
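To make the SCN suggestion concrete, a small sketch (the SCN value below is made up):

```sql
-- On the source database, read the current SCN:
SELECT current_scn FROM v$database;

-- Say it returns 8843921. Adding this line to the export parameter file
-- makes Data Pump export all table data as of that single point in time:
--   FLASHBACK_SCN=8843921
-- (The related FLASHBACK_TIME parameter lets you specify a timestamp
--  instead of querying the SCN yourself.)
```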
HTH
Regards,
Badri.
 
bpgergoAuthor Commented:
Badri, thanks.

So I can export as of a certain SCN, and that will be consistent even if the database is running and in NOARCHIVELOG mode? And when I import it, do I get a consistent state corresponding to that SCN?
Does this apply only to the command-line version, or does the DBMS_DATAPUMP package work similarly?

And again: is it possible to do the migration without intermediate dump files, i.e. by submitting a job in the source database that runs a PL/SQL program which, using this DBMS_DATAPUMP package, connects to the target DB through a database link and duplicates the instance there?
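For what it's worth, the no-dump-file variant described here does exist: Data Pump's network mode. A minimal sketch, run on the TEST (target) database, where PROD_LINK is an assumed database link pointing at production (all names are placeholders):

```sql
-- Prerequisite, created on TEST beforehand:
--   CREATE DATABASE LINK prod_link
--     CONNECT TO system IDENTIFIED BY password USING 'PROD_TNS';
DECLARE
  h   NUMBER;
  st  VARCHAR2(30);
BEGIN
  -- A full network-mode import: no dump file is written anywhere;
  -- rows flow straight from PROD to TEST over the link.
  h := DBMS_DATAPUMP.open(operation   => 'IMPORT',
                          job_mode    => 'FULL',
                          remote_link => 'PROD_LINK',
                          job_name    => 'PROD_TO_TEST');
  DBMS_DATAPUMP.start_job(h);
  DBMS_DATAPUMP.wait_for_job(h, st);
  DBMS_OUTPUT.put_line('job finished: ' || st);
END;
/
```

The command-line equivalent is impdp with NETWORK_LINK=PROD_LINK FULL=Y. Be aware that network mode moves all data over SQL*Net, which is often reported to be slower than a dump file plus file transfer for large databases.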

Also, which option do you think fits my needs better?

(These are all the questions I would like to clear up before I close this thread and start the migration. Should additional questions arise during the process, they will go into a new thread.)

Thanks, again.
Best,
Gergo
 
rbrookerCommented:
Hi,

To create a test instance from a production instance, the easiest way I have found (and the least hassle) is a backup/restore.
NOARCHIVELOG mode dictates that it would be a cold backup (shut down, copy files).

Once the files have been copied to the test instance location, you can replace the test instance with the restored copy of production. (Reading your last post, the test instance is already created and running.)

I have always found this to be easier than an export/import, and more reliable.
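A sketch of that cold backup/restore, for orientation only (paths and names are placeholders, and the last step differs if file paths change between servers):

```
-- On PROD, in SQL*Plus as SYSDBA: list the files, then shut down cleanly.
SELECT name FROM v$datafile;
SELECT member FROM v$logfile;
SELECT name FROM v$controlfile;
SHUTDOWN IMMEDIATE

-- With the instance down, copy all datafiles, control files, online redo
-- logs and the spfile/pfile to the TEST server (scp, or FTP in binary mode).

-- On PROD: STARTUP. Production is down for the duration of the copy
-- (or of a local staging copy, if you copy locally first and transfer later).

-- On TEST, with the copied files and pfile in place: STARTUP.
-- If the paths differ between servers, recreate the control file or use
-- ALTER DATABASE RENAME FILE before opening.
```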

good luck.

:)
 
gvsbnarayanaCommented:
Hi,
  Creating tables on the other database using database links and taking a Data Pump export/import are two different mechanisms.
You can very easily create tables along with their data on the second instance using database links, but you will have problems with indexes and constraints. With a Data Pump export, you get all of these without any problem.
How big is your database? If it is very big, then a backup and restore will be easier; a little learning might be needed, since you haven't done one before.
Yes, the data being exported will be consistent with respect to that SCN. You may need a bigger undo tablespace if there are transactions running on the database during the export. If the job hits any such problems (for example, ORA-01555 "snapshot too old"), you will see them in the log file.
HTH
Regards,
Badri.
