Oracle 11g Server Setup Time

Do you have any rough ideas on the tasks and time required to move a 9i database from one machine to a new machine that will run the 11gR2 server?

The new 11g server will run on a RHEL virtual machine.

Also, the source database currently uses a Western European character set, and the 11g target database will be in Unicode (UTF8) format.

The database has a couple of PL/SQL applications, so these would need to be tested too.

There is no RAC involved. There are a couple of databases, each with a couple of instances.
sam15 asked:

johnsone (Senior Oracle DBA) commented:
I would say it should take you about a day to get the 11g server configured and up and running.

To move the data, I assume you would use exp/imp, and the amount of time that takes is going to be based on the amount of data as well as many other factors.  The best way to get an approximate timing is to do a test run.
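A minimal sketch of the exp/imp round trip described above. The connection strings, passwords, file names, and the "scott" schema are placeholders for illustration, not values from this thread; adjust them to your environment before any test run.

```shell
# On the 9i source: schema-level export with a log file
exp system/password@source_db owner=scott file=scott.dmp log=exp_scott.log

# Copy scott.dmp to the new server, then on the 11g target:
imp system/password@target_db fromuser=scott touser=scott \
    file=scott.dmp log=imp_scott.log
```

Timing the export and import of a representative schema like this gives a far better estimate than any rule of thumb.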
sam15 (Author) commented:
The exp takes about 30 minutes for a 10 GB database. It is not a big issue.

I think most of the time here will go to testing the applications and to running and reviewing the CSSCAN reports for the character set conversion, since I am going to Unicode.
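For reference, a typical CSSCAN invocation looks roughly like the sketch below. The password, target character set, and log name are placeholders; the scanner schema must first be created by running Oracle's csminst.sql script as described in the Globalization Support documentation.

```shell
# Full-database scan, checking convertibility to AL32UTF8
csscan \"sys/password AS SYSDBA\" FULL=Y TOCHAR=AL32UTF8 \
    ARRAY=1024000 PROCESS=2 LOG=dbcheck
```

The scan writes summary, detail, and error files (here dbcheck.txt, dbcheck.out, dbcheck.err); the "lossy" and "truncation" sections of the report are the ones to review most carefully.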

There is also some setup for Oracle HTTP Server. It is installed with the database.
slightwv (䄆 Netminder) commented:
Imports can take significantly longer since it has to build indexes and handle all the constraints.

Depending on how you prep the new database, there is also the truncate/delete/create/??? work that you need to perform.

There is also the time to transfer the file from point-A to point-B.

As far as testing: I agree this will likely take the most time, especially when changing character sets, since you have to make sure you don't drop data during the conversion.

I agree with the first post about test runs. Take a few smaller tables that you think might be problematic and test with those. Once you are confident the data conversions work properly, the biggest remaining effort is testing the applications.

sam15 (Author) commented:
I think the CSSCAN report should be the best indicator of whether there will be any data truncation or not.

How else can you check every single table and field manually? You would have to compare the new data in 11g to the old data in 9i to confirm whether they match. I doubt anyone does that.

It seems this whole effort can be done in 2-3 weeks.
slightwv (䄆 Netminder) commented:
The level of checking is directly related to the policies in place, compliance standards, etc.

For example:  Are there things like federal regulations that you must comply with?

If there are things like this in place, I'm not sure I would look at one row and, if it is OK, assume the millions of others are fine.
sam15 (Author) commented:
No, there are no regulations.

But if the CSSCAN reports were OK, would you still run tests to validate the data? Can you explain what kind of tests you would run to make sure the data is correct in the new target DB?

The two databases (source and target) are in different cities and cannot talk to each other.
slightwv (䄆 Netminder) commented:
I've never run CSSCAN.

When I have done data migrations in the past, I typically write code to check things for me. The MINUS operator and database links work very nicely.
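When the two databases can reach each other, the MINUS-over-a-database-link check looks roughly like this. The table name "emp" and the link name "old9i" are placeholders for illustration:

```sql
-- Rows present in the 11g copy but missing from the 9i source:
SELECT * FROM emp
MINUS
SELECT * FROM emp@old9i;

-- And the reverse direction, to catch rows lost in the migration:
SELECT * FROM emp@old9i
MINUS
SELECT * FROM emp;
```

If both queries return zero rows, the two copies of the table match.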

Since your databases cannot talk to each other, you can always spool the data out, copy the files to a common location, and do a simple diff on them.
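The spool-and-diff step could be scripted along these lines. This is a minimal sketch, assuming each side has been spooled to a plain text file with one delimited row per line; the function name and file layout are made up for illustration:

```python
def diff_spool_files(source_path, target_path):
    """Compare two spooled extracts and report the lines that exist
    on only one side (order-insensitive, assumes unique rows)."""
    with open(source_path) as f:
        source = set(line.rstrip("\n") for line in f)
    with open(target_path) as f:
        target = set(line.rstrip("\n") for line in f)
    return {
        "missing_in_target": sorted(source - target),
        "extra_in_target": sorted(target - source),
    }
```

An empty result in both lists means the spooled extracts agree; a plain `diff` on pre-sorted files achieves the same thing without any code.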

If there is a TON of data involved, you can look at selecting the PK fields and generating a checksum on the rest of the columns. Spool that out, then diff the files.
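The PK-plus-checksum idea can be sketched generically as below. The function and delimiter are made up for illustration; on the Oracle side you could instead compute a hash in SQL (for example with DBMS_UTILITY.GET_HASH_VALUE, or ORA_HASH on the 10g+ side), but note the two databases must use the same hash function and the same column concatenation for the spooled files to be comparable:

```python
import hashlib

def row_checksums(rows, pk_index=0, sep="|"):
    """For each row (a tuple of column values), keep the primary key
    and collapse the remaining columns into one MD5 checksum.
    Spooling this per database and diffing the two outputs compares
    every column without shipping the full data between cities."""
    out = []
    for row in rows:
        pk = row[pk_index]
        rest = sep.join(str(v) for v in row[:pk_index] + row[pk_index + 1:])
        out.append((pk, hashlib.md5(rest.encode("utf-8")).hexdigest()))
    return out
```

Any PK whose checksum differs between the two files pinpoints a row to inspect in full, so only mismatches need a detailed comparison.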

If possible:  I check ALL rows.

You should go as far as you feel confident that everything is OK.
sam15 (Author) commented:
May I ask why you don't use CSSCAN? That is Oracle's standard check tool when you migrate from one character set to another.

I like your idea about doing a checksum on the data or tables. Is there a link or article that shows code on how to implement that using Oracle functions?