Fastest way to import text file records into a database

Is there a fast and efficient way for a console application to read from a text file, do format checking and duplicate checking on each row, and then insert the records into a database? I would like to read about 100k lines and insert them into the database in a short time; I expect to insert 100k rows in 5-10 minutes.
mkdev2009Asked:
 
Raja Jegan R, SQL Server DBA & Architect, Commented:
Since this is MS SQL Server, you can follow this procedure.

1. Create a temporary table with exactly the same structure as your original table, without any constraints.
2. BCP the flat file into the temporary table without any validation.
3. From the temporary table, INSERT/UPDATE records into your main table, applying the required validations.
4. Capture the error records, if any.
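The steps above can be sketched in T-SQL. This is only an illustration, assuming a comma-delimited file; the table, column, and file names are all hypothetical, and BULK INSERT is used as the in-database equivalent of the bcp utility:

```sql
-- 1. Staging table with the same columns as the main table, no constraints.
CREATE TABLE dbo.Staging_Customer (
    CustCode  varchar(20),
    CustName  varchar(100),
    RegionId  varchar(10)
);

-- 2. Bulk-load the flat file into the staging table without validation.
BULK INSERT dbo.Staging_Customer
FROM 'C:\uploads\customers.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 3. Insert only rows that pass validation and do not already exist.
INSERT INTO dbo.Customer (CustCode, CustName, RegionId)
SELECT s.CustCode, s.CustName, s.RegionId
FROM dbo.Staging_Customer AS s
WHERE LEN(s.CustCode) > 0                      -- example format check
  AND NOT EXISTS (SELECT 1
                  FROM dbo.Customer AS c
                  WHERE c.CustCode = s.CustCode);

-- 4. Capture rejected rows for later review.
SELECT *
INTO dbo.Staging_Rejects
FROM dbo.Staging_Customer AS s
WHERE LEN(s.CustCode) = 0;
```

Because the validation and duplicate checks run as set-based SQL inside the server rather than row by row in the console application, this pattern usually handles 100k rows well within the stated time budget.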

Hope this helps
 
nirojexpert Commented:
The fastest way is to load using SQL*Loader, and you can enforce your data-validation and unique constraints using the database's own features.
If your database is not Oracle, use the equivalent tool for that database.
The main point is to use the tool provided by the database rather than building your own, if you are really concerned with performance.
 
Alpha Au Commented:
If you are using SQL Server, the fastest way to import a text file into the DB is BCP.

You may check this out:

http://msdn.microsoft.com/en-us/library/ms162802.aspx
 
Raja Jegan R, SQL Server DBA & Architect, Commented:
Have you tried the native loading techniques specific to your database?
Since you haven't specified your database, here are the options:

1. SQL Server - BCP
2. Oracle - SQL Loader
3. DB2 - LOAD
4. Sybase - BCP
5. MySQL - LOAD DATA

By the way, these native loading techniques are the fastest way to load a text file into the appropriate tables.
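As an illustration of one of these native loaders, a MySQL LOAD DATA statement looks roughly like this (the file path and table name are hypothetical, and the file is assumed to be comma-delimited):

```sql
-- Bulk-load a delimited text file into a MySQL table.
LOAD DATA INFILE '/tmp/records.txt'
INTO TABLE records
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
```

The other databases' tools (BCP, SQL*Loader, DB2 LOAD) follow the same idea: describe the file's layout once and let the engine stream it in, rather than issuing one INSERT per line.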
 
mkdev2009 (Author) Commented:
Hi all, thanks a lot for the replies.
FYI, I am using MSSQL as my database. Currently the txt file is uploaded via a web portal, and the console needs to read it, do the validation, and insert it into the DB.
Any ideas for speeding up the duplicate checking?

nirojexpert,
the duplicate validation is based on 3 criteria; I don't think a unique constraint is able to do that checking.
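For what it's worth, SQL Server does allow a unique constraint or index to span several columns, so a 3-criteria duplicate rule can often be expressed directly. A sketch, with a hypothetical table and columns standing in for the real 3 criteria:

```sql
-- A unique index across all 3 duplicate-check columns makes the
-- database itself reject any row that matches on all three values.
CREATE UNIQUE INDEX UX_Customer_DupCheck
ON dbo.Customer (CustCode, CustName, RegionId);
```

Whether this fits depends on the actual rule: it works when a duplicate means "all 3 columns equal", but not for fuzzier matching.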
 
Bill Bach, President, Commented:
An ETL tool may be quite helpful here, as well.  Pervasive Data Integrator (www.pervasive.com) can provide all of the text parsing, deduplication, and database insertion.  The real performance is in the automated processing of the "Engine", but you can certainly do all of what you need (except for automated runs from a batch file) via the less-expensive Developers Seat license.
 
Alpha Au Commented:
If speed is a concern, I would suggest bulk-inserting the txt file into a SQL Server table (create a table for temporary storage),
then running a stored procedure to do the deduplication and validation.
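One common way to write that deduplication step inside the stored procedure is with ROW_NUMBER() partitioned over the duplicate-check columns. This is a sketch only; the staging table and the 3 columns are hypothetical:

```sql
-- Keep one row per (CustCode, CustName, RegionId) and delete the rest.
WITH ranked AS (
    SELECT ROW_NUMBER() OVER (
               PARTITION BY CustCode, CustName, RegionId
               ORDER BY (SELECT NULL)) AS rn
    FROM dbo.Staging_Customer
)
DELETE FROM ranked
WHERE rn > 1;
```

Deleting through the CTE removes the duplicate rows from the underlying staging table in one set-based pass, which is far faster than checking each incoming row individually from the console application.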

Question has a verified solution.
