Help on SQL SSIS Task Optimization

Hi,

I have an Excel file containing around 1,200 columns and 100,000 rows. I have created an SSIS package to export this dataset to a SQL table. I am using a Data Flow task to export the data, but it takes a very long time (more than an hour). The client expects this task to complete within one minute.

Can someone suggest a better way to optimize this process or a better way to achieve this?

Thank you for your help!
r_pat72 asked:
Ryan Chong commented:
Have you tried using the BULK INSERT / OPENROWSET methods for importing the data?

Import Bulk Data by Using BULK INSERT or OPENROWSET(BULK...) (SQL Server)
https://docs.microsoft.com/en-us/sql/relational-databases/import-export/import-bulk-data-by-using-bulk-insert-or-openrowset-bulk-sql-server
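As a minimal sketch of the BULK INSERT approach: this assumes the workbook has first been saved as a CSV, and that a destination table (here called dbo.MyImport, a placeholder name) already exists with matching columns.

```sql
-- Sketch only: file path, table name, and batch size are assumptions.
BULK INSERT dbo.MyImport
FROM 'C:\data\export.csv'
WITH (
    FIRSTROW = 2,             -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK,                  -- table lock enables minimally logged load
    BATCHSIZE = 100000        -- commit in batches rather than one giant transaction
);
```

With TABLOCK and a simple recovery model, BULK INSERT can be minimally logged, which is usually where the large speedup over a row-by-row load comes from.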
r_pat72 (Author) commented:
Thank you for your reply. Question: how do I create the table on the fly using the Bulk Insert task? It only shows existing tables for the SQL connection.
Does it have any option for creating tables based on the file selected as the source?
Ryan Chong commented:
>How do I create the table on the fly using the Bulk Insert task?
BULK INSERT is only for importing data. To create new tables, we have to run CREATE TABLE statements prior to that.

Hope this is clear.
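The two-step sequence above might look like the following. This is a hypothetical example: the table name, column names, and types are placeholders that would need to match the actual file layout.

```sql
-- Step 1: create the destination table first (names/types are assumptions).
CREATE TABLE dbo.MyImport (
    Col1 INT,
    Col2 NVARCHAR(100),
    Col3 DATETIME2
    -- ...remaining columns, defined to match the file
);

-- Step 2: then bulk insert into it.
BULK INSERT dbo.MyImport
FROM 'C:\data\export.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
```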
Jim Horn (Microsoft SQL Server Developer, Architect, and Author) commented:
>I have an excel file contains around 1200 columns and 100,000 rows.
>The client expects to complete this task within 1 minute.
For starters, your client is clearly not an expert in SSIS or ETL in general: Excel is an extremely poor choice of source data format, because users can edit it in thousands of ways that would cause an SSIS mapping error in the data flow between source and destination. Far better would be to have them save this file as a .csv or some other text format.

>but it takes a very long time, more than 1 hour.
Have you tried opening the Advanced Editor on your source component and setting the output column data types to minimize the footprint of each column to what is actually needed, rather than whatever defaults it inferred?

Also, is this 100k-row file a 'full load' file of all rows, or an 'incremental' file that contains only the changes since the last time the file was provided? Incremental files are always much smaller and therefore quicker to load.
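If an incremental file is an option, one common pattern is to bulk load it into a staging table and then apply it to the target with a MERGE. This is only a sketch: dbo.Staging, dbo.Target, and the key/column names are all assumptions.

```sql
-- Apply an incremental load from staging to target.
-- Table and column names are placeholders for illustration.
MERGE dbo.Target AS t
USING dbo.Staging AS s
    ON t.BusinessKey = s.BusinessKey
WHEN MATCHED THEN
    UPDATE SET t.Col1 = s.Col1,
               t.Col2 = s.Col2
WHEN NOT MATCHED BY TARGET THEN
    INSERT (BusinessKey, Col1, Col2)
    VALUES (s.BusinessKey, s.Col1, s.Col2);
```

The staging load stays fast (bulk insert of a small file), and the MERGE does the change application set-based on the server.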

Also, is it really (c'mon, really?) necessary to import all 1,200 columns? If not, you still have to define them in the source, but not pumping them into the destination will save time.
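Outside of SSIS, the same "load only what you need" idea can be expressed with OPENROWSET(BULK...), which lets you SELECT a subset of columns from the file into the destination. This sketch assumes a format file describing the CSV layout exists; all names here are hypothetical.

```sql
-- Pull only the needed columns from the file, not all 1,200.
-- 'export.fmt' is an assumed format file describing the CSV columns.
INSERT INTO dbo.Target (CustomerID, OrderDate, Amount)
SELECT src.CustomerID, src.OrderDate, src.Amount
FROM OPENROWSET(
         BULK 'C:\data\export.csv',
         FORMATFILE = 'C:\data\export.fmt',
         FIRSTROW = 2
     ) AS src;
```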