DTS Import and Index on multiple tables

I've got a bunch of tables I'm transferring from two Oracle databases to Microsoft SQL Server.  I opened the SQL Server Import/Export wizard, selected 40 tables from one source, and ran it, and it worked great, but when I tried to edit the package I found it had given me 40 completely separate data connections and transformations.  Then I set up indexes on the tables and, of course, the next morning, as it recreated the tables, it blew all of my indexes away.

I think I need to set it up from scratch, using only one source and one destination, and preferably only one script that runs all of my SQL code, but DTS is new to me as an Oracle guy, and the import/export wizard doesn't create what I want.  If I could get some assistance getting started in the right direction, I think I can figure it out.  I'm also new to SQL Server concepts in general, so I'm grateful for any level of detail you can offer along the way.

1. I'd like to create a general connection to my first Oracle database and rely on the SQL statements to get the tables I want.
2. I'll need to drop my indexes on the SQL Server side.
3. I wondered if there were any cool way to gather only the latest data that has changed in Oracle; otherwise I'll probably end up dropping my SQL Server tables and recreating them.
4. Then I'll run the 40 statements that gather the data from one Oracle database, but I'll need to modify the SQL to name the tables uniquely on the SQL Server side, since I have another Oracle database with the exact same tables, and I want to import them into the same SQL Server database.  And how cool would it be if I could put all of the statements in a single script?!
5. Then I'll want to create the indexes.
6. And I've heard rumors that I could have it email me when it is complete, and I'd like to see if I can make that happen.
7. And finally I'll need to do the same thing with my second Oracle database.

I opened the Enterprise Manager and messed around to see if I could intuitively create what I want, but it's not as intuitive as I hoped, so I'm leaving this request for help on Experts Exchange while I go start my education with tutorials on the web.

Thanks in advance, for your help!

Create a new package..

1. Drop an ODBC or OLE DB connection that connects to Oracle onto the page. Each data pump that goes from Oracle to SQL Server can use a SQL statement as its source.

2. Drop a SQL Server connection onto the page and configure it. Now you can drop an Execute SQL Task on the page, set it to use the SQL Server connection, and run a DROP INDEX command. Use workflow to ensure it runs first.
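As a rough sketch, the Execute SQL Task for step 2 might contain something like this (the index and table names are placeholders, not from the original question; the sysindexes check is SQL Server 2000 syntax):

```sql
-- Drop the index only if it exists, so the task doesn't fail on first run.
-- IX_MyTable_Date and MyTable are hypothetical names.
IF EXISTS (SELECT 1 FROM sysindexes
           WHERE name = 'IX_MyTable_Date'
             AND id = OBJECT_ID('MyTable'))
    DROP INDEX MyTable.IX_MyTable_Date
```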

3. Assuming there is some kind of date stamp on the Oracle table to identify new records, I usually use parameters in my source SQL to gather only data that's newer than, say, seven days old. Then I delete that same data in the destination tables before loading it.
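On the Oracle side, the source query for such a data pump might look something like this (the table and column names are hypothetical; in DTS you can also replace the hard-coded window with a `?` parameter fed from a global variable):

```sql
-- Oracle source query: only rows stamped within the last seven days.
SELECT col1, col2, last_modified
FROM   my_schema.my_table
WHERE  last_modified >= SYSDATE - 7
```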

4. Can you add a column to your destination table that partitions the data? That way it can all go into the same table.

5. Use an Execute SQL Task as described in 2 (but don't add another SQL Server connection - you can use the same one).

6. Yep. You either need to use native SQL email functionality, which is a bit of a pain (you need to install the Outlook client on the SQL Server), or you can use xp_smtpmail from www.sqldev.net - very handy.
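If you go the native route, a call like this should work from an Execute SQL Task once SQL Mail is configured (the recipient address and message text are placeholders):

```sql
-- Requires SQL Mail to be set up (hence the Outlook client requirement).
EXEC master.dbo.xp_sendmail
    @recipients = 'dba@example.com',
    @subject    = 'DTS load complete',
    @message    = 'The nightly Oracle transfer finished successfully.'
```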

7. You'll probably need two Oracle connections on your DTS.

Finally, this site is REALLY helpful with DTS


AptDev (Author) Commented:
On number 3, when you are transferring only the data from the most recent 2 days, do you use a "select into" in an Execute SQL Task or do you use a Data Transformation Task?  If I use an Execute SQL task, and I can get a single query working, then I should be able to put a couple of dozen data transfers in a single script.  But if I use a Data Transformation Task, then I'll end up with 40 tasks running from my source to my destination database, is that true?
Yep, you can probably use a select into; you'd need to set up a linked server to your Oracle database.

Linked servers are under 'Security' in Enterprise Manager, or you can look up sp_addlinkedserver in Books Online.
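As a rough sketch (the server name, TNS alias, and credentials are all placeholders), creating an Oracle linked server in SQL Server 2000 looks something like this:

```sql
-- MSDAORA is the Microsoft OLE DB provider for Oracle;
-- 'MyTNSAlias' must match an entry in tnsnames.ora on the SQL Server box.
EXEC sp_addlinkedserver
    @server     = 'ORACLEDB',
    @srvproduct = 'Oracle',
    @provider   = 'MSDAORA',
    @datasrc    = 'MyTNSAlias'

-- Map a login so SQL Server can authenticate against Oracle.
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = 'ORACLEDB',
    @useself     = 'false',
    @rmtuser     = 'oracle_user',
    @rmtpassword = 'oracle_password'
```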

Once you've set up your linked server you could do this:

DELETE LocalTable
WHERE DateField >= DATEADD(d,-7,GETDATE())

INSERT INTO LocalTable (C1,C2,C3,C4,DateField)
SELECT C1,C2,C3,C4, DateField
FROM OracleDB.Database.Schema.Table
WHERE DateField >= DATEADD(d,-7,GETDATE())

I personally avoid using DTS as it can easily get its knickers in a knot if fields disappear, etc.

If you were to use the script above you could put it in a stored procedure and just run that.
That stored procedure could also contain your index manipulation scripts as well as your emailing script.
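Pulling those pieces together, one such stored procedure could be sketched like this (all object and column names are placeholders built from the example script above):

```sql
CREATE PROCEDURE dbo.usp_RefreshLocalTable
AS
BEGIN
    -- 1. Drop the index so the load runs faster (SQL 2000 existence check).
    IF EXISTS (SELECT 1 FROM sysindexes
               WHERE name = 'IX_LocalTable_Date'
                 AND id = OBJECT_ID('LocalTable'))
        DROP INDEX LocalTable.IX_LocalTable_Date

    -- 2. Remove the window of data we are about to reload.
    DELETE LocalTable
    WHERE DateField >= DATEADD(d, -7, GETDATE())

    -- 3. Pull the same window across the linked server.
    INSERT INTO LocalTable (C1, C2, C3, C4, DateField)
    SELECT C1, C2, C3, C4, DateField
    FROM OracleDB..SchemaName.TableName
    WHERE DateField >= DATEADD(d, -7, GETDATE())

    -- 4. Recreate the index.
    CREATE INDEX IX_LocalTable_Date ON LocalTable (DateField)
END
```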
AptDev (Author) Commented:
Thank you nmcdermaid, it worked great!