Solved

Import several Access tables into a temp table in SSIS

Posted on 2011-02-19
7
1,197 Views
Last Modified: 2013-11-10
I am setting up an SSIS package. I have four tables in an Access database to import into one SQL Server table. The plan is to insert the data from each table (some columns differ between tables) into a temp table, create aggregates on some columns, insert the aggregated data into the temp table, and then export everything to the main table in SQL Server.
So far, I've set up a Foreach Loop container, created a connection to the database that contains the files using a filename variable, and added that variable to the container. Then I added a data flow task for each table inside the loop container. My plan was then to add an Execute SQL Task to each data flow in the container to get the specific data into the temp table (I haven't tried this yet - I guess I would create the temp table first, then append to it).
This is where I am stalled. For each Access table I import, I need to count rows per group number and sum column values, insert the computed columns into the temp table, and then, when everything is done, export the finished temp table to the SQL Server table. Any suggestions or examples would be helpful since I am new to SSIS. Am I going about this the right way? I need help with this ASAP.
Thanks,

0
Comment
Question by:bar0822
7 Comments
 
LVL 30

Accepted Solution

by:
nmcdermaid earned 250 total points
ID: 34936847
If your source tables have different columns, then it's difficult to process them in a loop, because each iteration of the loop reuses the same data flow, which expects a source table with exactly the same columns (just possibly with a different table name). From your description, they don't have identical columns?

What I suggest is that you just get the data into a staging table in SQL Server first - forget about summarising it at that point. Create four individual data flows and load the tables into your SQL staging table. Then you can write some summary SQL to transfer the data from the staging table into your live table. So you'd have a total of five data flows.
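For the summary step, something along these lines could work (just a sketch - dbo.Staging, dbo.LiveTable and the columns GroupNumber/Amount are placeholders for whatever your schema actually is):

INSERT INTO dbo.LiveTable (GroupNumber, LineCount, TotalAmount)
SELECT
    GroupNumber,
    COUNT(*)    AS LineCount,    -- rows per group number
    SUM(Amount) AS TotalAmount   -- summed column value
FROM dbo.Staging
GROUP BY GroupNumber;

That statement could sit in a fifth data flow (with the query as its source) or just in an Execute SQL Task that runs after the four loads finish.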

You definitely need to create the SQL table first. Open up SQL Server Management Studio and do it in there. It's very similar to MS Access.
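For example, the staging table could look something like this (only a sketch - the column names and types are guesses; match them to what's actually in your Access tables):

CREATE TABLE dbo.Staging
(
    ClientID    nvarchar(50)  NULL,
    GroupNumber nvarchar(50)  NULL,
    Amount      decimal(18,2) NULL
);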

Can you clarify - this is four tables from one MDB file, right?
0
 

Author Comment

by:bar0822
ID: 34937204
OK. The four tables are from one MDB. Originally, the tables were in four different MDBs, but they are being imported into one database and updated with new data on a monthly basis. For the most part the columns are the same, with the exception of a few, because some tables do not have a specific column while others do. I created a staging table. To extract the information from the OLE DB source, I am using an Execute SQL Task and inserting into the staging table (trying to avoid using a data transformation task). But I will need to count rows over group numbers for each clientID - should I do this once all the tables are imported into the staging table, or would it be easier to do it in an Execute SQL Task for each Access table (using a stored procedure?). I am all over the place with this one...
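For example, once everything is in the staging table, would something like this do the counting in one pass (a rough sketch - ClientID, GroupNumber and Amount stand in for my actual columns)?

SELECT
    ClientID,
    GroupNumber,
    Amount,
    COUNT(*)    OVER (PARTITION BY ClientID, GroupNumber) AS LineCount,   -- rows per group number for each clientID
    SUM(Amount) OVER (PARTITION BY ClientID, GroupNumber) AS GroupTotal   -- summed value per group
FROM dbo.Staging;

The windowed version keeps the detail rows alongside the computed columns; a plain GROUP BY would give one row per group instead.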
0
 

Author Comment

by:bar0822
ID: 34938382
Before I redo this again, is it possible to use the loop container to loop through queries in the database? I could create a query on each table that returns the same columns (and alias or supply placeholders for the columns a table does not have). I am asking before I tear this apart again.
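For example, one query per table along these lines, so they all return the same column list (just a sketch - TableB and the column names are placeholders, and NULL stands in for a column that table doesn't have):

SELECT
    ClientID,
    GroupNumber,
    Amount,
    NULL AS RegionCode   -- TableB has no RegionCode, so supply a placeholder
FROM TableB;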
0
 
LVL 7

Assisted Solution

by:rmm2001
rmm2001 earned 250 total points
ID: 34989617
If you're opposed to data flow transformations, then you'll probably want to just insert the data into your SQL temp tables and then do the query work when you insert from the temp table into the real table.

When you say "is it possible to use the loop container to loop through queries in the database? I could create a query on each table that returns the same columns (and alias or supply placeholders for the columns a table does not have)"... what do you mean?

What exactly are you trying to do?
0
 
LVL 30

Expert Comment

by:nmcdermaid
ID: 34990027
There is a fair bit going on here:

1. You have a step combining four MDBs into one, and this isn't part of the scope of what you're asking.
2. If I understand rightly, you are using an Execute SQL Task to transfer data. This means you are either transferring data within the same MDB or you are using some kind of linked server.

Can you clarify this?

I recommend an approach where you get all of the raw data into one table, then perform updates/selects on that to get your information.

That way, when you get another four tables to process (MDB/Excel/SQL/whatever), all you have to do is push them into your central staging table, and the final counting/processing includes them automatically.

I really see two steps to this:

1. Get disparate data sources into a single table
2. Run whatever analysis you need over that.

Lastly:

1. Why are you avoiding data flow tasks?
2. Is this data finally residing in a SQL Server database or an Access database?
0
 

Author Comment

by:bar0822
ID: 34993375
Thanks for your reply. The tables reside in an Access database, and I need to import all of them into one table in SQL Server. I am using an OLE DB (Jet) driver to import the tables into the staging table. I was getting many errors on the data, e.g. "cannot convert unicode to non-unicode" on most of the fields. I tried the Data Conversion/Derived Column transformations but still received the same errors, so I created a temp table with an Execute SQL Task using nvarchar data types, and then insert that data into my table in SQL Server.
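So if the final table uses varchar columns, the insert from the nvarchar temp table does the conversion explicitly, something like this (a rough sketch - the table and column names here are just placeholders):

INSERT INTO dbo.FinalTable (ClientID, GroupNumber)
SELECT
    CAST(ClientID    AS varchar(50)),   -- explicit unicode to non-unicode conversion
    CAST(GroupNumber AS varchar(50))
FROM dbo.StagingTemp;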
0
 
LVL 21

Expert Comment

by:Alpesh Patel
ID: 35239560
Hi, please use the Data Conversion task.
0
