Losing data with bulk insert

I am using BULK INSERT to pull a large text file into my SQL table, and I noticed that the number of rows in my table was less than the number of rows in the text file. When I looked more closely at the data, it appears the end of each row isn't being detected: the start of the next row is getting attached to the last field of the previous row. When the stored procedure finishes, it shows (343520 row(s) affected), but my table contains 171760 records, which is exactly half the number of rows affected.

I assume my row terminator is incorrect? I have it set to '\n'. I've attached a screenshot of the last column of a few records in my SQL table. The "CGH" after the two boxes is the first field of the next record in the text file. Does anyone have any ideas? Doc2.doc
kshumway Asked:
 
dbaSQL Commented:
Well, at this point I'd need to see your table def. I just ran this, and it loaded everything fine:


drop table test
go
create table test (A char(5),B char(4),C varchar(25),D decimal(5,2),E char(5),F datetime,G char(6),H varchar(25),I datetime,J char(5),K char(20))
select * from test

BEGIN
BULK INSERT test
FROM '\\myserver\c$\mine\data.txt'
WITH
      (
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n'
      )
END


Obviously, I just created a big/generic table to take in the data from the file, but if you're taking into account that hidden column at the end, between the date and the "HGB AND HCT" field, then this should work. The above loaded all 14 records in your file (with the double quotes).

"CGH","2E","66400931",7.92,"0",08/01/2011,"HH","500120705",08/01/2011,,"HGB AND HCT"
                                                                      ^
                                                                      ^  this empty field


 
 
dbaSQL Commented:
Try \r\n.

\r is the carriage return and \n is the line feed; Windows-generated text files typically end each line with the \r\n pair.
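For reference, a minimal sketch of what that looks like in the statement (the table name and UNC path here are hypothetical placeholders, not the asker's actual objects):

```sql
-- Sketch only: load a file whose lines end in CRLF (\r\n).
-- dbo.SomeTable and the path are placeholders.
BULK INSERT dbo.SomeTable
FROM '\\someserver\share\somefile.txt'
WITH
      (
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\r\n'  -- carriage return + line feed
      )
```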
 
kshumway (Author) Commented:
I tried that and I got the following error when I tried to execute the stored procedure:

Running [dbo].[InsertHospitalData].

OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.
The statement has been terminated.
Bulk Insert fails. Column is too long in the data file for row 1, column 12. Make sure the field terminator and row terminator are specified correctly.
 
kshumway (Author) Commented:
BTW - thank you for your quick response...
 
dbaSQL Commented:
Can you post your bulk statement? Are you using a format file?

my pleasure.  :-)
 
kshumway (Author) Commented:
See attached. I've also attached a few rows of the .txt file that I'm importing and the table definition. I'll be offline for about an hour; back online after that. Again, THANK YOU! Data.txt Doc3.doc
ALTER PROCEDURE dbo.InsertHospitalData
AS
BEGIN
BULK INSERT [Hospital Billing Data SQL]
FROM '\\KSQLDB\Projects\HospitalInvoicesData.txt'
WITH 
	(
	FIELDTERMINATOR = ',',
	ROWTERMINATOR = '\r\n'
	)
END

 
kshumway (Author) Commented:
I've attached my table definition.  I made a few changes based on your definition, but I'm only getting six rows in my table.   tabledef.doc
 
kshumway (Author) Commented:
I figured it out. I had an identity (id) column on the table; once I took that off, I am getting all the data. Thank you so much for your help!
 
 
dbaSQL Commented:
Excellent!  I am glad you figured it out.  And I am glad to have helped.
 
kshumway (Author) Commented:
Thanks again!
 
dbaSQL Commented:
>>I had an id field on the table.  Took that off and I am getting all the data.
By the way, in situations like this it is always nice to use a format file; you can get around identity columns very easily.

Something like this, where the sixth value is the server column order. See, I am starting at server column #2, not #1, so we skip the ID column:

9.0
4
1       SQLCHAR       0       7       ","      2     Col1         ""
2       SQLCHAR       0       100     ","      3     Col2         SQL_Latin1_General_CP1_CI_AS
3       SQLCHAR       0       100     ","      4     Col3         SQL_Latin1_General_CP1_CI_AS
4       SQLCHAR       0       100     "\r\n"   5     Col4         SQL_Latin1_General_CP1_CI_AS

just an example, for the next time.
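To tie it together, the format file would then be referenced from the load itself. A sketch using the asker's table and data file from this thread (the .fmt path is a hypothetical placeholder):

```sql
-- Sketch: the field and row terminators now come from the format file,
-- which maps the data fields past the identity column on the table.
-- The .fmt path is a placeholder.
BULK INSERT [Hospital Billing Data SQL]
FROM '\\KSQLDB\Projects\HospitalInvoicesData.txt'
WITH
      (
      FORMATFILE = '\\KSQLDB\Projects\HospitalInvoices.fmt'
      )
```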
Question has a verified solution.
