
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 489

How to import a very large .CSV file into an MS SQL table

Hi:

We have a very large .CSV file that we would like to import into an MS SQL table. The file is so big that we cannot convert it into an Excel file, because its size exceeds Excel's limits.

If you know how to import this .CSV file into an MS SQL table, please help. Any help would be appreciated.

Thank you very much,

Somits
0
somits
Asked:
5 Solutions
 
ptjcbCommented:
You could use DTS (if you are on SQL Server 2000) or SSIS (if on 2005).


In 2000
Right-click on the database in Enterprise Manager, select All Tasks, then Import Data. This brings up the Import/Export Wizard. Select your CSV as the data source and your database as the destination. In the next screens you can select the target table and any transformations needed. Answer the questions on the remaining screens, and you can either set it to run right now or at another time.
0
 
Scott PletcherSenior DBACommented:
Use BULK INSERT, since that's the fastest method of loading data.  As far as I know, there's no limit to the file size that can be loaded.  If you find you need a format file, use bcp to create one based on the table, then adjust as needed -- this is a pain to get working, but once it works, it's slick.
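
A sketch of that approach, assuming a destination table named dbo.urTable, a trusted connection, and a comma delimiter (database, server, and path names below are placeholders). From a command prompt:

   bcp YourDb.dbo.urTable format nul -c -t, -f C:\urTable.fmt -S YourServer -T

Then, after hand-adjusting the format file if needed:

   BULK INSERT dbo.urTable
      FROM 'C:\urCsv.csv'
      WITH (FORMATFILE = 'C:\urTable.fmt')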
0
 
Aneesh RetnakaranDatabase AdministratorCommented:
I agree with ScottPletcher; you should go for BULK INSERT:
BULK INSERT dbo.urTable
   FROM 'C:\urCsv.csv'
   WITH (FIELDTERMINATOR = ',')
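
If the file uses Unix-style line feeds rather than the default \r\n row ending (an assumption about how the file was produced), add a ROWTERMINATOR clause as well:

BULK INSERT dbo.urTable
   FROM 'C:\urCsv.csv'
   WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')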
0
 
Eugene ZCommented:
Bulk insert, DTS <=> BCP .. fastest method compared to what?
What is your SQL Server version?

0
 
somitsAuthor Commented:
Hi:

Thanks a lot for all your help. I'm eager to try both suggestions to see what I can do with each. I'm currently on a business trip and hope I can get back to this very soon. Sorry that I could not follow up on the issue for several days.

Thanks a lot,

Somits
0
 
thatelvisCommented:
How would you do this if you didn't have Enterprise Manager? My ISP hosts my database, but I have no tool here on my home PC to upload a 150 MB file to it.

regards

kenny
0
 
Aneesh RetnakaranDatabase AdministratorCommented:
You can use an FTP tool to upload your file.
0
 
thatelvisCommented:
Really? Upload it to my SQL Server 2000 table? How would I do that then?
0
 
Aneesh RetnakaranDatabase AdministratorCommented:
I hope you are using:

BULK INSERT dbo.urTable
   FROM 'C:\urCsv.csv'
   WITH (FIELDTERMINATOR = ',')


Now, for this you need to put the CSV in some FTP folder and put the appropriate path in the above query.
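
Note that BULK INSERT reads the file from the database server's own file system, so the path must be one the server can reach locally (or a UNC path). For example, if the FTP folder maps to a local folder on the server (the path below is only an assumption):

BULK INSERT dbo.urTable
   FROM 'D:\ftproot\uploads\urCsv.csv'
   WITH (FIELDTERMINATOR = ',')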
0
 
thatelvisCommented:
How would I know what content would go in which column, though? Thanks for the help, and sorry for hijacking this question.

my ftp tool is cuteftppro

regards

kenny
0
 
Eugene ZCommented:
1. Create the destination table (see the sketch below).
2. Try using a format file with bcp or BULK INSERT.
3. But the DTS GUI is the best option in your case.
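
A minimal sketch of step 1 plus a simple load, assuming a three-column CSV (column names and types are placeholders; match them to your file):

-- Destination table whose columns match the CSV fields, in order
CREATE TABLE dbo.urTable (
    Col1 VARCHAR(100),
    Col2 VARCHAR(100),
    Col3 VARCHAR(100)
)

-- Load the file; fields map to columns left to right
BULK INSERT dbo.urTable
   FROM 'C:\urCsv.csv'
   WITH (FIELDTERMINATOR = ',')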
0
 
somitsAuthor Commented:
Hi all:

I've tried it with a sample .CSV file, and it works as all of you said. Now for the real thing: if I have multiple .CSV files, how can I import all of them into the database? Can I do it one by one, so that all the .CSV files are imported into the DB correctly, in the order in which they are imported?

For example, I have 3 very big .CSV files, named CSV_1, CSV_2, and CSV_3. First I import CSV_1 into the database DB_1, then I import CSV_2. Can I do it that way, with all the records from CSV_2 added into DB_1 as new rows following the rows generated when CSV_1 was imported?

Thanks a lot,

Somits

 
0
 
thatelvisCommented:
Hello,
I had a very big file, 130-odd MB, and I imported it using http://www.navicat.com/download.html. I have tried many others over the last two days, both commercial and free-ish; Navicat was the fastest. I got nearly 200k records into one table within 2 hours, whereas some of the other tools were only about 25% done by then.

If all the CSVs have the same layout, I would almost be tempted to join them all together and then use Navicat; while it is working, go for a beer or two.

regards

kenny

PS: I did give this a go; it is commercial software: http://www.simx.com/simx/simx.stt?comp=cloyllf&recid=&uiinfo=C49_cloyllf:c1p84i3g,C2_cloyllf:false,C4_cloyllf:false,C6_cloyllf:true,&extra=

They have good support, but Navicat was quicker.
0
 
somitsAuthor Commented:
Hi thatelvis:

Thanks a lot for your comments. I'd like to join all the .CSV files, but the result would be huge, at least 17 GB. Instead of doing it all at once, can I import the .CSV files one by one into the database? It may take more time, but I think that's OK.

Thanks,

Somits
0
 
thatelvisCommented:
17 GB, wow, that would take ages, a day at least, maybe 48 hours. For something of that magnitude I would consider using a commercial piece of software and have it load each CSV automatically, one after the other.

 http://www.simx.com/simx/simx.stt?comp=cloyllf&recid=&uiinfo=C49_cloyllf:c1p84i3g,C2_cloyllf:false,C4_cloyllf:false,C6_cloyllf:true,&extra=

Above is a piece of software I believe could handle it; drop their support a line, and you should expect a reply from them within 24 hours.
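
A plain T-SQL alternative is to run one BULK INSERT per file: each statement appends its rows to the same table, so the files are loaded in whatever order you run them (the file names below are just the hypothetical ones from earlier):

BULK INSERT dbo.urTable FROM 'C:\CSV_1.csv' WITH (FIELDTERMINATOR = ',')
BULK INSERT dbo.urTable FROM 'C:\CSV_2.csv' WITH (FIELDTERMINATOR = ',')
BULK INSERT dbo.urTable FROM 'C:\CSV_3.csv' WITH (FIELDTERMINATOR = ',')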

Lol, out of interest, what have you got in the CSV for it to be 17 GB?

regards

Kenny


0
