Bulk Insert Question

I have a text file that I want to import into a table using BULK INSERT.  I want each entire row in the text file to go into one column in my database table.  Do I set the FIELDTERMINATOR and ROWTERMINATOR both to "\n"?

Books Online says not to use the same value for ROWTERMINATOR and FIELDTERMINATOR.

Also, is there any documentation that explains what these mean?
\t (I know this is a tab)
\n (I know this is a newline)
\r\n (no idea what this one is)
etc.
ABaruh asked:
 
miron commented:
Well, this can be done dynamically with the minor addition of a row-number variable:

-- define the start row (set @i to the first line you want to keep)
declare @i int
set @i = 1

-- staging table
create table #t( i int identity primary key, my_text varchar(8000) )
insert into #t ( my_text ) execute master.dbo.xp_cmdshell 'type "C:\my_text_file.txt"'

-- remove the rows before the start row
delete from #t where i < @i

-- final insert into the target table
INSERT MyDB.dbo.MyTable select my_text from #t
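
If the file name changes from run to run, the same approach can be parameterized by building the shell command in a variable. A minimal sketch, assuming xp_cmdshell is available to the executing account (the path, start row, and target table below are placeholders):

-- same technique, with the file path supplied in a variable
declare @file varchar(260), @cmd varchar(300), @i int
set @file = 'C:\my_text_file.txt'   -- placeholder: changes from run to run
set @cmd  = 'type "' + @file + '"'
set @i    = 1                       -- placeholder: first line to import

create table #t( i int identity primary key, my_text varchar(8000) )
insert into #t ( my_text ) execute master.dbo.xp_cmdshell @cmd

-- drop leading rows and the NULL rows xp_cmdshell returns for blank output lines
delete from #t where i < @i or my_text is null

INSERT MyDB.dbo.MyTable select my_text from #t
drop table #t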
 
miron commented:
Without BULK INSERT, you can try this script, running it in Query Analyzer:

create table t( i int identity primary key, my_text varchar(8000) )
insert into t ( my_text ) execute master.dbo.xp_cmdshell 'type "C:\my_text_file.txt"'


You need to make sure that:
- the text file is located at C:\my_text_file.txt on the SQL Server machine where the script is executed;
- the account running this script has high enough privileges to execute xp_cmdshell and to read C:\ (or whichever directory you chose).
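
On the privileges point: if the caller is not a sysadmin, execute permission on the extended procedure has to be granted explicitly. A minimal sketch, where log_loader is a hypothetical user in master standing in for the real account:

-- run in master; 'log_loader' is a hypothetical user, substitute the real one
USE master
GO
GRANT EXECUTE ON xp_cmdshell TO log_loader
GO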

-- cheers
 
Eugene Z commented:
Try the DTS Wizard to import the txt or csv file, to see what the row and field delimiters are.

and/or


BULK INSERT yourDB..TempTable  FROM 'c:\yourimport.txt'
   WITH
      (
         MAXERRORS=100
      )
 
ABaruh (Author) commented:
The reason I cannot use DTS is that the name of the text file I will be importing is constantly changing.

I will try EugeneZ's solution.
 
miron commented:
You should try the first solution first :)

-- cheers
 
ABaruh (Author) commented:
miron, I'm already trying to test in Query Analyzer.  Here is what I am trying:

BULK INSERT MyDB.dbo.MyTable
FROM "c:\Test.txt"
WITH (FIELDTERMINATOR = '\n', ROWTERMINATOR = '\n', FIRSTROW = 1)

I get 0 rows affected.
 
miron commented:
well,

INSERT MyDB.dbo.MyTable exec master.dbo.xp_cmdshell 'type c:\Test.txt'

should do it, unless you have a record longer than 8000 characters. There are more limitations and possible tweaks due to potential Unicode / locale issues, but this is a quick and simple way of loading a file row by row into the target table...

-- cheers
 
Partha Mandayam (Technical Director) commented:
Your FIELDTERMINATOR cannot be \n; \n is the newline character, which marks the end of a row.
If your file is tab delimited, use FIELDTERMINATOR = '\t'.
 
ABaruh (Author) commented:
mcp111, my file is not tab delimited; it's a log file that a Windows service produces.  We want to bring that logged information into a SQL table.  I just want each row to be inserted into one field.  I doubt any row would be more than 4000 characters, so I have set the column to nvarchar(4000).  Is there another special character (like \n, \t) that says the end of the field is the data UP TO the carriage return?

miron, I will try your INSERT method
 
miron commented:
If the field terminator is set to \n, then what SQL Server (should) try to do is load each line of text into a separate column, which would create a single flat row with as many columns as there are lines in the source file. Obviously this would not work. I am just wondering what is stopping you from using the command shell.

-- cheers
 
ABaruh (Author) commented:
Oh, miron, I just realized.  Regarding the INSERT...xp_cmdshell solution you identified, I forgot to mention that I need to be able to start the import at a certain line in the file.
 
Partha Mandayam (Technical Director) commented:
Try setting only the row terminator to '\n'.
Another strategy would be to use the DTS Wizard and see what it selects as the row terminator and field terminator, then use those values in your BULK INSERT statement.
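
A minimal sketch of the first suggestion, reusing the table and file path from the earlier attempt; with FIELDTERMINATOR omitted it defaults to '\t', so a log line containing no tab characters lands entirely in the single column:

-- row terminator only; field terminator left at its default ('\t')
BULK INSERT MyDB.dbo.MyTable
FROM 'c:\Test.txt'
WITH (ROWTERMINATOR = '\n')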
 
ABaruh (Author) commented:
Miron, I suppose that would work, but I hate adding the overhead of processing all those records and then eventually deleting them.  If all else fails, though, I'll go with that solution.

mcp111 - I did as you suggested.  In the DTS Wizard I set the field terminator to something like 'sasdfasfwef' and the row terminator to {CR}{LF}, and it imported fine.  Then I tried the BULK INSERT from Query Analyzer specifying FIELDTERMINATOR = 'sasdfasfwef' and ROWTERMINATOR = '{CR}{LF}', and it gave a STREAM...Unexpected EOF reached error.  I also tried ROWTERMINATOR = '\n' with the aforementioned FIELDTERMINATOR and got the same error.
 
miron commented:
Well, it looks like setting the FIELDTERMINATOR to '' should work.

-- cheers
 
Partha Mandayam (Technical Director) commented:
Try rowterminator='\n' and fieldterminator='\t'.
It works!
 
ABaruh (Author) commented:
Neither of those work.  The specific error I get is:

Bulk insert: Unexpected end-of-file (EOF) encountered in data file
OLE DB Provider 'STREAM' reported an error.  The provider did not give any information about the error.

It looks like I will go with Miron's Insert solution and delete the rows I do not want out of the temp table.

Thanks everyone.
 
Partha Mandayam (Technical Director) commented:
I just tried this query on a text file and it worked fine. Did you try a query exactly like this?

BULK INSERT abbreviations
   FROM 'd:\partha\abbreviations.txt'
   WITH
      (
         FIELDTERMINATOR = '\t',
         ROWTERMINATOR = '\n'
      )
 
ABaruh (Author) commented:
That is what I tried.  I got the error specified above.  The text file is a test file with just two lines:

This freaking thing
better work
 
Eugene Z commented:
ABaruh:
Did it work?

BULK INSERT yourDB..TempTable  FROM 'c:\yourimport.txt'
   WITH
      (
         MAXERRORS=100
      )

That is just for an ASCII text file.
If you have a binary (native) or Unicode file, it is almost the same; you just need to specify
 DATAFILETYPE [ =
                { 'char' | 'native' | 'widechar' | 'widenative' } ]
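
For example, a minimal sketch for a Unicode text file (placeholder path and table):

-- 'widechar' tells BULK INSERT the file is Unicode text
BULK INSERT yourDB..TempTable FROM 'c:\yourimport_unicode.txt'
   WITH
      (
         DATAFILETYPE = 'widechar',
         MAXERRORS = 100
      )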

see more

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/tsqlref/ts_ba-bz_4fec.asp