SSIS skipping some rows on import

Posted on 2013-12-16
Medium Priority
Last Modified: 2016-07-11
For some reason, I can't figure out why I'm missing some lines when I import a CSV file that I exported using SQL*Plus. The SSIS package gives no warnings or errors either.

The reason I know this is that when I load the CSV file into a separate analysis application, it gets the correct totals for the numeric columns and the correct row count.

But for some reason, SSIS doesn't seem to capture all of the lines...

There were 6,313,052 rows in the CSV file, but SSIS imports only 6,308,607...

any thoughts?

I've tried different code pages too (1250, 1252, and UTF-8), but they didn't seem to have an effect.

I checked out this link: Why all the records are not being copied from CSV to SQL table in a SSIS package

and I've checked items 2, 3, and 4 from it.

For number 1, though, I'm using a Foreach Loop Container as described on this site: http://help.pragmaticworks.com/dtsxchange/scr/FAQ%20-%20How%20to%20loop%20through%20files%20in%20a%20specified%20folder,%20load%20one%20by%20one%20and%20move%20to%20archive%20folder%20using%20SSIS.htm to loop through the files in a folder and import them.

I've also thought about missing delimiters, but I exported the files myself using SQL*Plus like this:

select
    -- note: 'mi' (not 'mm') is the minutes element of an Oracle format mask
    trim(to_char(t1.D_DTM, 'yyyy-mm-dd hh24:mi:ss'))||','||
    ...
from table t1


I'm not sure how I could miss delimiters if I export a file like that...

Also, I tested importing these files into MySQL using LOAD DATA INFILE, and that seems to work fine and imports all of the data...

I've also looked for commas in my varchar columns but was unable to find any...
Question by:k1ng87

Accepted Solution

Jim Horn earned 1000 total points
ID: 39723927
Might not be a bad idea to pump these rows into a staging table with a single varchar(max) column and validate that all rows make it into SQL.  

Then compare rowcounts to verify.

Then write T-SQL to manually parse the rows and insert into another 'destination' table.
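The staging approach above can be sketched in T-SQL like this (the table and column names are made up for illustration):

```sql
-- 1) One wide column, no parsing: every line of the file lands here.
--    Point the SSIS flat-file connection at this table with only a
--    row delimiter defined, so nothing can be silently dropped.
CREATE TABLE dbo.StagingRaw
(
    RawLine varchar(max) NOT NULL
);

-- 2) After the load, compare this count against the line count
--    of the csv (6,313,052 in this case).
SELECT COUNT(*) AS RowsLoaded
FROM dbo.StagingRaw;
```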

My wild guess is that the file has some problems with the field delimiters that are causing rows not to insert, and for some reason no error is being thrown.

I had a project once where we were supposed to import a pipe-delimited .csv, and one of the columns had email text; whenever anyone had a signature block with pipes in it (see below), it caused the import to fail.

Hope this helps.
Jim Horn  |  SQL Genius Dang Dude | Cell (906) 555-1212
"This signature block is guaranteed to hose up pipe-delimited csv's"

Author Comment

ID: 39723999
Hey Jim,

So I did that, and I now get back exactly the same row count as the flat file contains. How do I parse the table now with T-SQL on the comma delimiters?

My row looks like this now...

Column 0
2013-11-27 13:11:00,1XRTT,DATA,East Michigan,Region 2,East Michigan_PORT HURON_CL#17,LNS1,2436,DE60XC049,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,,,,,,,,,,,,,,,76,0,0,0,0,0,0,0,41,35,0,2.59444444444444444444444444444444444444,0,76,0,0,0,168,168,,,,,,,,,,,,,,,,,,,,,,,,,155.666666666666666666666666666666666667,0,0,0,0,0,3,0,104,0,0,0,150,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,18,0

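One way to start parsing rows like that, assuming the staging table from the earlier suggestion is named dbo.StagingRaw with a single RawLine column (hypothetical names), is plain SUBSTRING/CHARINDEX arithmetic; this sketch pulls out just the first two fields:

```sql
-- First field: everything up to the 1st comma.
-- Second field: between the 1st and 2nd commas.
SELECT
    SUBSTRING(RawLine, 1,
              CHARINDEX(',', RawLine) - 1)                         AS D_DTM,
    SUBSTRING(RawLine,
              CHARINDEX(',', RawLine) + 1,
              CHARINDEX(',', RawLine, CHARINDEX(',', RawLine) + 1)
                  - CHARINDEX(',', RawLine) - 1)                   AS Field2
FROM dbo.StagingRaw;
```

Extending that by hand to hundreds of columns is painful, so in practice a numbers-table or XML-based split is usually generated instead; the point is only that the raw lines can now be dissected entirely in T-SQL.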


Assisted Solution

Anthony Perkins earned 1000 total points
ID: 39733210
I would first make sure that all the rows in this new table have the correct number of commas.  If you are not sure how to do that with T-SQL, just ask.
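For reference, the comma count per row can be checked without splitting anything, by comparing each line's length with and without its commas (again assuming the hypothetical dbo.StagingRaw/RawLine staging table):

```sql
-- Every well-formed row should land in the same group; any
-- stragglers with a different comma count are the suspect rows.
SELECT LEN(RawLine) - LEN(REPLACE(RawLine, ',', '')) AS CommaCount,
       COUNT(*)                                      AS RowsWithThatCount
FROM dbo.StagingRaw
GROUP BY LEN(RawLine) - LEN(REPLACE(RawLine, ',', ''))
ORDER BY CommaCount;
```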

Expert Comment

by:Jim Horn
ID: 41704213
Recommend split #39723927 and #39733210
