Solved

Oracle Database Move

Posted on 2009-07-09
362 Views
Last Modified: 2013-12-19
I have a scenario where I need to replicate a live Oracle 10g database onto a staging environment. When I asked the IT staff controlling the origin database to give me a copy, they apparently copied the files from the running database.

As such, I have 5 .DBF files, 3 REDOXX.LOG files and 3 CONTROLXX.CTL files.

I am running a fresh install of Oracle 10g at my end, and I now need these files to work on this installation. How do I go about restoring these files? Or do I actually have to stop the source database to take these copies?

OS platform is Win 2k3 SP2.

Any help is appreciated. I am really running out of ideas here.

Regards,
Krish
Question by:ipwerks
17 Comments
 
LVL 6

Expert Comment

by:ahmad2121
You don't need any of these files to get data onto a staging environment!

1. Do an export of the production database. First, create a par file; sample below:

USERID=username/pass@DATABASE
DUMPFILE=c:\dumps\MYDBDUMP.dmp
LOGFILE=c:\dumps\logs\DP_MYDBDUMP.log
SCHEMAS=MYPRODSCHEMA
job_name=SRSDEV_datapump_export
estimate=statistics

save it as expdp.par

2. run the following command

expdp parfile=expdp.par

This will do a Data Pump export of the database. Take that file and copy it to your staging server.

3. Install oracle 10g, apply all the necessary patches (or the patches that match production)

4. do an import of the data. Create an import par file, sample:

USERID=username/pass@DATABASE
DUMPFILE=c:\import\MYDBDUMP.dmp
LOGFILE=c:\import\logs\DP_MYDBDUMP.log
SCHEMAS=MYPRODSCHEMA

Call it impdp.par

IMPDP PARFILE=impdp.par

5. That's it. Now you have an exact copy of the data and everything else related to it on the staging server.

impdp/expdp live in the oracle home bin, so you need that in your path to run those commands.

impdp/expdp have many options such as remapping schema and tablespace names which is good for maintaining several instances of the schema on one database.
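For example, a hypothetical remapping import par file (the schema and tablespace names here are invented, and it assumes a directory object such as dumpfolder has already been created) might look like:

```
# hypothetical impdp par file: restore MYPRODSCHEMA as MYSTAGESCHEMA
# and move its segments from PROD_DATA into STAGE_DATA
USERID=username/pass@STAGINGDB
DUMPFILE=dumpfolder:MYDBDUMP.dmp
LOGFILE=dumpfolder:DP_REMAP.log
SCHEMAS=MYPRODSCHEMA
REMAP_SCHEMA=MYPRODSCHEMA:MYSTAGESCHEMA
REMAP_TABLESPACE=PROD_DATA:STAGE_DATA
```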
 
LVL 40

Expert Comment

by:mrjoltcola
How did you determine the db was live?

If they took a copy of the live database, you will need the archive logs to recover and open the db. A hot backup is not consistent without the archive logs from during / after the backup.

Otherwise, they need to do one of:

1) Give you a cold backup while the db is closed
2) Give you an export
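For what it's worth, option 1 just means the files are copied while the database is cleanly closed; a rough sketch on the source server (paths are placeholders) is:

```
-- as SYSDBA on the source: close the database cleanly
SHUTDOWN IMMEDIATE;
-- at the OS level, copy all datafiles, control files, online redo
-- logs and the spfile/init.ora to the portable drive, e.g.:
--   xcopy C:\oracle\oradata\ORCL\*.* E:\coldbackup\ /E /I
-- then bring the source database back up:
STARTUP;
```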

 
LVL 6

Expert Comment

by:ahmad2121
You can drop that job_name line in the first par file; it is optional.
 

Author Comment

by:ipwerks
@mrjoltcola:
I was not around when the copy was made. The assumption is based on the information passed on to me by my colleague who was there that the staff there "plugged in the portable HDD to the server and just copied and pasted these files from the server to the portable HDD".

@ahmad2121:
Just for clarity, I am repeating the steps. First I need to export the DB. After I complete the export, I create a .par file as per your post. This par file will basically get me a datadump from the exported DB -- is my understanding right ?

Thanks,
Krish
 
LVL 40

Expert Comment

by:mrjoltcola
I would take the export approach, see ahmad2121's good instructions.

The only catch is that if the db is large, it may be troublesome to get a consistent export while the db is live. With classic export (exp) you need to use CONSISTENT=Y; with Data Pump (expdp) you should use FLASHBACK_TIME or FLASHBACK_SCN. Your DBA team should know how to do this.
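As a sketch (connection details are placeholders, and the SCN value is whatever the query returns on your system), the two consistent-export variants look like:

```
REM classic export, read-consistent as of the start of the export
exp username/pass@DATABASE OWNER=MYPRODSCHEMA FILE=c:\dumps\mydb.dmp CONSISTENT=Y

REM Data Pump: first get the current SCN in SQL*Plus ...
REM   SELECT current_scn FROM v$database;
REM ... then export as of that SCN
expdp username/pass@DATABASE SCHEMAS=MYPRODSCHEMA DUMPFILE=MYDBDUMP.dmp FLASHBACK_SCN=1234567
```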

If they cannot shutdown the database, and the export doesn't work out, post back here and I can walk you through an RMAN clone of a live database.

 
LVL 40

Expert Comment

by:mrjoltcola
Actually, reading ahmad's instructions, I am not sure which tool he is referring to -- his instructions are in conflict. The export instructions will only work with exp, but the import instructions are for impdp (not imp). With Data Pump (expdp) you cannot specify a full dumpfile path like that; it goes to a specific Oracle directory, so use exp (not expdp).

Just use this to export:

exp system/manager full=y file=system.dmp


Then, prior to import, create the empty database. You have 2 options: either create the exact data directories as on the source database and let the import create the tablespaces again, or pre-create the tablespaces prior to doing the full import.

Then:

imp system/manager full=y file=system.dmp
 
LVL 40

Expert Comment

by:mrjoltcola
Use the proper password, of course -- it is probably not 'manager' on your source database.

 
LVL 40

Expert Comment

by:mrjoltcola
Sorry for the spam follow-ups. I realized the instructions given by the other poster above were erroneous, and I am heading offline for now, so in case you need more pointers, here is another canned set of instructions I give to help people.

Creating a new DB from full import...

Run DBCA to create a general purpose DB (that is fastest). Choose a compatible character set, or use AL32UTF8 for best compatibility. Then import the data.

Remember with import there is an important rule when full importing.

Either
1) the tablespaces must be pre-created with the same names as the original DB
OR
2) pre-create the directories so that Oracle can create the tablespaces for you during the full import

Example:

If source DB has non-standard tablespaces like DATA and INDEX, then the datafiles may be in:

C:\oradata\ora1\system.dbf
C:\oradata\ora1\sysaux.dbf
C:\oradata\ora1\data.dbf
C:\oradata\ora1\index.dbf

For this example DATA and INDEX are just examples, they could be anything non-standard.

You can list these with:  select file_name from dba_data_files;

You can skip SYSTEM and SYSAUX (they will already be created by DBCA, as will TEMP and UNDO), but you should create DATA and INDEX prior to the import, OR make sure the directory C:\oradata\ora1 exists so that import can create the tablespaces for you.
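Pre-creating the two example tablespaces might look like this (sizes are placeholders, and I use INDX rather than INDEX since INDEX itself is a reserved word):

```
-- run as SYSDBA on the new staging database
CREATE TABLESPACE data
  DATAFILE 'C:\oradata\ora1\data.dbf' SIZE 500M AUTOEXTEND ON;

CREATE TABLESPACE indx
  DATAFILE 'C:\oradata\ora1\index.dbf' SIZE 500M AUTOEXTEND ON;
```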

Another way to find out the database structure from the actual import file is to use the LOG= and SHOW= parameters for import, which cause import to skip the actual import and instead dump the DDL to a file for your review.

imp system/manager full=y file=system.dmp log=log.sql show=y rows=n

Then edit log.sql and take the create tablespace statements from it to run on your new database, changing the paths if needed.

Then when ready do the import:

imp system/manager full=y file=system.dmp


 
LVL 6

Expert Comment

by:ahmad2121
Erroneous? Please explain to me how I export my databases several times a day using EXPDP with that exact method.
 
LVL 40

Expert Comment

by:mrjoltcola
>>Erroneous? Please explain to me how I export my databases several times a day using EXPDP with that exact method.

>>DUMPFILE=c:\dumps\MYDBDUMP.dmp


This is illegal syntax for expdp.

You cannot specify a path in an expdp DUMPFILE parameter. Perhaps you can explain how you export daily using that syntax?
 
LVL 6

Accepted Solution

by:
ahmad2121 earned 500 total points
My apologies, you are correct. I did skip a step, out of assumption rather than trial.

You do have to create a directory first, so run sqlplus / as sysdba. Then:

SQL> create or replace directory dumpfolder as 'c:\dumps';
SQL> create or replace directory dumplogfolder as 'c:\dumps\log';

this is for the export. Here is the fixed par file:

USERID=username/pass@DATABASE
DUMPFILE=dumpfolder:MYDBDUMP.dmp
LOGFILE=dumplogfolder:DP_MYDBDUMP.log
SCHEMAS=MYPRODSCHEMA
job_name=SRSDEV_datapump_export
estimate=statistics

then for the import, you also have to create directories. So on the new server, run the following:

SQL> create or replace directory dumplocation as 'c:\import';
SQL> create or replace directory importlog as 'c:\import\log';

The new import par file becomes:

USERID=username/pass@DATABASE
DUMPFILE=dumplocation:MYDBDUMP.dmp
LOGFILE=importlog:DP_MYDBDUMP.log  
SCHEMAS=MYPRODSCHEMA


Lessons learned: never skip steps.
 
LVL 40

Expert Comment

by:mrjoltcola
No problem, thanks for correcting it ahmad. I only corrected last night because I wanted to make sure the asker had the information he needed while we may have slept. :)

This is one key difference between old export and newer Data Pump (expdp), expdp has no ability to export to a client filesystem, it must go to an Oracle directory or database link (as ahmad corrected above).
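If you are ever unsure which directory objects an instance already has, the data dictionary will tell you (the grant is only needed when the export runs as a non-privileged user; the user name below is made up):

```
-- list existing directory objects and their filesystem paths
SELECT directory_name, directory_path FROM dba_directories;

-- allow a non-SYSDBA user to read/write through a directory object
GRANT READ, WRITE ON DIRECTORY dumpfolder TO some_user;
```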

I still find plain export useful for that reason alone.


 
LVL 6

Expert Comment

by:ahmad2121
Well, we migrated over to Data Pump because exp/imp were going away with 11g. After using Data Pump, old-school imp/exp seem outdated, especially in their syntax and features.

That said, once you create Oracle directories they last forever, unless you delete them of course. And if you really wanted to, you could run a SQL script just before your impdp/expdp that creates or replaces them every time, for easier portability.
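Such a pre-run script (paths are just examples) can be a couple of lines:

```
-- setup_dirs.sql: safe to re-run before every expdp/impdp invocation
CREATE OR REPLACE DIRECTORY dumpfolder AS 'c:\dumps';
CREATE OR REPLACE DIRECTORY dumplogfolder AS 'c:\dumps\log';
EXIT;
```

Run it with something like sqlplus -S "/ as sysdba" @setup_dirs.sql just before the expdp/impdp call.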
 

Author Closing Comment

by:ipwerks
@ Ahmad2121: I will try this out and post on the results.
 

Author Comment

by:ipwerks
@modus_operandi:

I must admit the slip is at my end. By rights, I should have tried out the solution and then closed the question, awarding points to whichever solution(s) worked. However, my situation is that it will take a few more days for me to set up an appointment with my client, where these servers are hosted, since I need to fall in line with my client's access policies to get onto these servers.

In this case, there would have been at least a 3 day lull before I reverted with a response -- which could also make me fall foul of EE requirements. I should have just made a comment and reposted after I tried out the solution.

I will accept the course of action recommended by the experts here -- as to whether the points should be refunded or otherwise, and whether I will have to re-position the question.
 
LVL 40

Expert Comment

by:mrjoltcola
Hi,

I understand no harm was intended. This is just normal objection procedure at EE.

Since I did correct the syntax errors, and ahmad also provided the corrected solution as a follow-up, I would request at least partial credit for my post: http:#24820433
I don't disagree with the selection of ahmad's final solution, but mine was the first working solution in the question. 50/50 is probably fair, but I won't object to your choice if it includes http:#24820433.

Thanks,

mrjoltcola


