Solved

SQL 2008 Bulk Insert issues

Posted on 2011-03-09
Medium Priority
1,199 Views
Last Modified: 2012-05-11
I am having issues with a CSV file that works in SQL 2000. I have identified why, but not how to solve it. If a row does not have data for the right-most columns, the file just ends the row with '\r\n' and leaves out the commas that would act as placeholders. BULK INSERT then ignores the row terminator.
Here is a sample of a typical file (the code snippet below):
 
20,00021B,N,12.56,,,,,12.34,12/31/2008,12.34,12/31/2008,12.34,12/31/2008,,,12.34,12/31/2008
20,01595J,N,218.07
20,01596J,N,257.99
20,01597J,N,439.83
20,01598J,N,331.91
20,02332I,N,9716.31
20,02333I,N,1505.44
20,02334I,N,1904.18
20,028030,Y,139.23,136.53,04/30/2008
20,028061,Y,349.39,342.61,04/30/2008
20,030028,Y,121.72,119.36,04/30/2008
20,03049I,N,3371.04
20,03051I,N,2527.22
20,03064I,N,5964.34
20,03065I,N,5964.34
20,038086,N,34.03,,,,,33.15,12/31/2008,33.15,12/31/2008,33.15,12/31/2008,,,33.15,12/31/2008

Question by:dkochenour
12 Comments
 
LVL 43

Expert Comment

by:Eugene Z
ID: 35093455
What are you using for the bulk insert: SSIS, the command line, something else?
What error did you get?
 
LVL 51

Expert Comment

by:Vitor Montalvão
ID: 35093583
Can you post the full command that you are using?
 
LVL 51

Expert Comment

by:Mark Wills
ID: 35094122
So what has changed: is it the generation of the CSV file, or the reading of it? What creates the CSV without the placeholders for the detail rows?

Do you want a routine to read that file regardless?

Does it have a header row?

Why aren't the commas there? Is it a multi-format line?
 

Author Comment

by:dkochenour
ID: 35094180
BULK INSERT ITEM_PROMOS_QUEUE_TST
FROM 'C:\ECSJOBS\TDG080229041829001.832'
WITH (FORMATFILE = 'C:\ECSJOBS\PromosPrice.fmt')


(1 row(s) affected)
 

Author Comment

by:dkochenour
ID: 35094199
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
 

Author Comment

by:dkochenour
ID: 35094202

Here is the format file I am using:
10.0
18
1       SQLCHAR             0       2       ","        1     CORP_CODE                                    SQL_Latin1_General_CP1_CI_AS
2       SQLCHAR             0       20      ","        2     IC_ITEM_NO                                   SQL_Latin1_General_CP1_CI_AS
3       SQLCHAR             0       1       ","        3     PROMO_IND                                    SQL_Latin1_General_CP1_CI_AS
4       SQLCHAR             0       30      ","        4     COLUMN_PRICE_AMT                             ""
5       SQLCHAR             0       30      ","        5     PROMO_PRICE_AMT                              ""
6       SQLCHAR             0       30      ","        6     PROMO_EXPIRATION_DTE                         ""
7       SQLCHAR             0       30      ","        7     EF_PRICE_AMT                                 ""
8       SQLCHAR             0       30      ","        8     EF_EXPIRATION_DTE                            ""
9       SQLCHAR             0       30      ","        9     EH_PRICE_AMT                                 ""
10      SQLCHAR             0       30      ","        10    EH_EXPIRATION_DTE                            ""
11      SQLCHAR             0       30      ","        11    EL_PRICE_AMT                                 ""
12      SQLCHAR             0       30      ","        12    EL_EXPIRATION_DTE                            ""
13      SQLCHAR             0       30      ","        13    FE_PRICE_AMT                                 ""
14      SQLCHAR             0       30      ","        14    FE_EXPIRATION_DTE                            ""
15      SQLCHAR             0       30      ","        15    FG_PRICE_AMT                                 ""
16      SQLCHAR             0       30      ","        16    FG_EXPIRATION_DTE                            ""
17      SQLCHAR             0       30      ","        17    ST_PRICE_AMT                                 ""
18      SQLCHAR             0       30      "\r\n"     18    ST_EXPIRATION_DTE                            ""

 
LVL 51

Expert Comment

by:Mark Wills
ID: 35094234
OK, your format file is specifying a column for every field in the CSV file. When BULK INSERT encounters a new line instead of the anticipated columns, it creates the error.

Just as a test, what happens if you specify just a single column terminated by \r\n? (Copy your format file first, or create a new test one - see the sketch below.)

Now, is this a legitimate CSV file? What created it? Do you have to handle variable-length columns?
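A minimal sketch of that test, assuming a throwaway format file and test table (the names single_line_test.fmt and tmp_single_line_test are made up; the data file path is the one from your BULK INSERT command):

-- single_line_test.fmt (hypothetical name) - one field covering the whole line, ended by \r\n:
-- 10.0
-- 1
-- 1   SQLCHAR   0   8000   "\r\n"   1   whole_line   ""

CREATE TABLE tmp_single_line_test (whole_line varchar(8000))
GO

BULK INSERT tmp_single_line_test
FROM 'C:\ECSJOBS\TDG080229041829001.832'
WITH (FORMATFILE = 'C:\ECSJOBS\single_line_test.fmt')
GO

-- each physical line of the CSV should come back as one row, commas and all
SELECT * FROM tmp_single_line_test
GO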
 

Author Comment

by:dkochenour
ID: 35094259
This file is produced by our mainframe. Yes, we want to insert the row even if it only has 4 columns. It has always left the commas out if it just finds a price but no government pricing, which is what the other columns are for.
 
LVL 51

Expert Comment

by:Vitor Montalvão
ID: 35094791
You should provide a row terminator character, because the number of columns differs from row to row.
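For instance, a sketch without the format file, just to make the terminators explicit (the table name is the one from the question; note this alone will not fix the problem, because rows missing the trailing columns will still come up short against an 18-column table):

BULK INSERT ITEM_PROMOS_QUEUE_TST
FROM 'C:\ECSJOBS\TDG080229041829001.832'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\r\n')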
 

Author Comment

by:dkochenour
ID: 35094862
Mainframe creates it, and yes, there are variable-length fields. When I pare down the format to the first 4 columns (which are always present) and end it with \r\n I get:
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 4 (COLUMN_PRICE_AMT).
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 2, column 4 (COLUMN_PRICE_AMT).
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 17, column 4 (COLUMN_PRICE_AMT).
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 27, column 4 (COLUMN_PRICE_AMT).
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 30, column 4 (COLUMN_PRICE_AMT).
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 43, column 4 (COLUMN_PRICE_AMT).
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 44, column 4 (COLUMN_PRICE_AMT).
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 53, column 4 (COLUMN_PRICE_AMT).
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 56, column 4 (COLUMN_PRICE_AMT).
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 63, column 4 (COLUMN_PRICE_AMT).
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 94, column 4 (COLUMN_PRICE_AMT).
Msg 4865, Level 16, State 1, Line 1
Cannot bulk load because the maximum number of errors (10) was exceeded.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

which looks like it is complaining about the rows that have the additional columns.
 
LVL 51

Accepted Solution

by:
Mark Wills earned 2000 total points
ID: 35097784
Right...

There are a few different approaches that can be tried. One is to read the file in, adjust the commas so they align, and rewrite it just to import again; or import each row as a single line and manually unpack it; or get the mainframe guys to fix their export (the best method).

It also depends on whether column names / a header row exist in the CSV file (because that can change things a lot).

It also depends on whether you can run xp_cmdshell. Let's assume you cannot, and there are no headings, so we really do have to take "manual" control.

So, just by way of example, let's see if we can do the job "manually":

-- using the bulk insert to load a staging table

if object_id('tmp_staging_table') is not null drop table tmp_staging_table
go
-- create our staging table afresh
CREATE TABLE tmp_staging_table (line varchar(max))
go

-- load it up
BULK INSERT tmp_staging_table from 'C:\ECSJOBS\TDG080229041829001.832' with (rowterminator='\r\n', firstrow=1)   -- rowterminator so each whole line lands in the single 'line' column
go

-- have a look
select * from tmp_staging_table
go

-- then run through a procedure to unpack it - not included just yet - there is one down below...

-- or try a openrowset of the entire row without having to go into a staging area first, but ready for manual unpacking into a staging table
-- first create a format file depicting a single line... (the three lines between the asterisks) saved as 'c:\single_line.fmt' in the example
/*
8.0
1
1       SQLCHAR       0      8000     "\r\n"   1     line           ""
*/
-- now select from the csv file...
select * FROM OPENROWSET(BULK 'C:\ECSJOBS\TDG080229041829001.832'  , FORMATFILE = 'c:\single_line.fmt',FIRSTROW = 1, MAXERRORS=10) AS A
go

-- right, so if they work, can now try using the above and doing the unpacking manually...

;with cte as
(
      SELECT line, ltrim(SUBSTRING(line, n, CHARINDEX(',', line + ',',n) - n)) AS Fields, row_number() over (partition by line order by n) as field_number
      FROM OPENROWSET(BULK 'C:\ECSJOBS\TDG080229041829001.832' , FORMATFILE = 'c:\single_line.fmt',FIRSTROW = 1, MAXERRORS=10) AS A   -- or tmp_staging_table
      CROSS JOIN (SELECT number n FROM master..spt_values WHERE type = 'P') AS Numbers
      WHERE SUBSTRING(',' + line, n, 1) = ','
      AND n < LEN(line) + 1
) 
select [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18] from 
(select line, fields, field_number from cte) src
pivot
(max(fields) for field_number in ([1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18])) pvt

-- so then we can use the above to load a temp table and then use that table to put away, or format, or validate and then do the put away.
-- we can also do things like select [1] as CORP_CODE, [2] as ....
-- we could also define the first three/four columns in our format file; "line" doesn't have to be a single column of all columns, just those that might need to be manually unpacked.
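By way of a follow-up sketch: the pivot output can be poured into a typed staging table and put away from there. The column names below come from the format file earlier in the thread; the table name tmp_promos_unpacked and the varchar(30) types are assumptions for a first pass, with conversion and validation left for afterwards.

-- hedged sketch: unpack straight into a typed staging table (assumed name tmp_promos_unpacked)
if object_id('tmp_promos_unpacked') is not null drop table tmp_promos_unpacked
go
create table tmp_promos_unpacked
(
      CORP_CODE            varchar(2),
      IC_ITEM_NO           varchar(20),
      PROMO_IND            varchar(1),
      COLUMN_PRICE_AMT     varchar(30),
      PROMO_PRICE_AMT      varchar(30),
      PROMO_EXPIRATION_DTE varchar(30),
      EF_PRICE_AMT         varchar(30),
      EF_EXPIRATION_DTE    varchar(30),
      EH_PRICE_AMT         varchar(30),
      EH_EXPIRATION_DTE    varchar(30),
      EL_PRICE_AMT         varchar(30),
      EL_EXPIRATION_DTE    varchar(30),
      FE_PRICE_AMT         varchar(30),
      FE_EXPIRATION_DTE    varchar(30),
      FG_PRICE_AMT         varchar(30),
      FG_EXPIRATION_DTE    varchar(30),
      ST_PRICE_AMT         varchar(30),
      ST_EXPIRATION_DTE    varchar(30)
)
go

;with cte as
(
      SELECT line, ltrim(SUBSTRING(line, n, CHARINDEX(',', line + ',', n) - n)) AS Fields, row_number() over (partition by line order by n) as field_number
      FROM OPENROWSET(BULK 'C:\ECSJOBS\TDG080229041829001.832' , FORMATFILE = 'c:\single_line.fmt', FIRSTROW = 1, MAXERRORS=10) AS A
      CROSS JOIN (SELECT number n FROM master..spt_values WHERE type = 'P') AS Numbers
      WHERE SUBSTRING(',' + line, n, 1) = ','
      AND n < LEN(line) + 1
)
insert into tmp_promos_unpacked
select [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18]
from (select line, fields, field_number from cte) src
pivot
(max(fields) for field_number in ([1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18])) pvt
go

-- rows with only 4 fields simply end up with NULLs in the remaining columns
select * from tmp_promos_unpacked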

 

Author Closing Comment

by:dkochenour
ID: 35097935
Awesome solution - Bless you!!! I have learned mucho today!!!
