Solved

SSIS skipping some rows on import

Posted on 2013-12-16
47 Views
Last Modified: 2016-07-11
I can't figure out why I'm missing some lines when I import a csv file that I exported using SQL*Plus. The SSIS package gives no warnings or errors, either.

I know this because when I loaded the csv file into a separate analysis application, it got the correct totals for the numeric columns and the correct row count.

But for some reason, SSIS doesn't seem to capture all of the lines...

There were 6,313,052 rows in the csv file, but SSIS imported only 6,308,607...

any thoughts?

I've tried different code pages too (1250, 1252, and UTF-8), but they didn't seem to have an effect.

I checked out this link: Why all the records are not being copied from CSV to SQL table in a SSIS package

and I've already checked items 2, 3, and 4 from it.

For number 1, though, I'm using a For Each Loop container as described on this site: http://help.pragmaticworks.com/dtsxchange/scr/FAQ%20-%20How%20to%20loop%20through%20files%20in%20a%20specified%20folder,%20load%20one%20by%20one%20and%20move%20to%20archive%20folder%20using%20SSIS.htm to loop through files in a folder and import them.

I've also thought about missing delimiters, but I exported the files myself using SQL*Plus like this:

select
trim(to_char(t1.D_DTM, 'yyyy-mm-dd hh24:mi:ss'))||','|| -- MI is minutes; MM in this position would repeat the month
trim(t1.TECHNOLOGY)||','||
trim(t1.VOICEDATA)||','||
trim(t2.MRKT_NM)||','||
trim(t2.REGION_NM)||','||
trim(t2.CLUSTER_NM)||','||
trim(t3.BSC_NM)||','||
trim(t1.BTS_ID)||','||
trim(t1.CSCD_NM)||','||
trim(t1.SECT_SEQ_ID)||','||
trim(t1.BND_ID)||','||
trim(t1.FA_ID)||','||
trim(t1.TCE_DTCT_CALL_SETUP_F_CNT)||','||
trim(t1.MS_ACQ_CALL_SETUP_F_CNT)||','||
trim(t1.SIGN_CALL_SETUP_F_CNT)||','||
trim(t1.BAD_FRM_CALL_SETUP_F_CNT)||','||
trim(t1.REORG_ATT_CNT)||','||
trim(t1.TCE_CALL_SETUP_F_CNT)||','||
trim(t1.WCD_CALL_SETUP_F_CNT)||','||
trim(t1.L_CALL_SETUP_F_CNT)||','||
trim(t1.TRAF_FRM_MISS_CNT)||','||
trim(t1.BCP_TMO_CALL_SETUP_F_CNT)
from table

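As an aside, when spooling csv from SQL*Plus the session settings matter as much as the query: if LINESIZE is shorter than the longest output line, SQL*Plus wraps it onto a second physical line, which can throw off row counts. A sketch of a typical spool wrapper (the path and values here are illustrative, not the actual script):

-- Illustrative SQL*Plus spool settings; the real script was not posted.
SET HEADING OFF     -- no column headers in the csv
SET FEEDBACK OFF    -- no "n rows selected" trailer
SET PAGESIZE 0      -- no page breaks or titles
SET LINESIZE 4000   -- must exceed the longest concatenated line, or it wraps
SET TRIMSPOOL ON    -- strip trailing blanks from each spooled line
SPOOL /tmp/export.csv
-- ... the SELECT above goes here ...
SPOOL OFF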


I'm not sure how I could miss delimiters if I export a file like that...

Also, I tested importing these files into MySQL using LOAD DATA INFILE, and that works fine and imports all of the data...
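For reference, that MySQL test would have been along these lines (the table and file names are placeholders, not the actual ones used):

-- Hypothetical names; shown only to illustrate the working MySQL path.
LOAD DATA INFILE '/tmp/export.csv'
INTO TABLE csv_check
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';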

I've also looked for commas in my varchar columns but was unable to find any...
Question by:k1ng87
6 Comments
 
Accepted Solution
by: Jim Horn (LVL 65), earned 250 total points
ID: 39723927
Might not be a bad idea to pump these rows into a staging table with a single varchar(max) column and validate that all rows make it into SQL.  

Then compare rowcounts to verify.

Then write T-SQL to manually parse the rows and insert into another 'destination' table.
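A minimal sketch of that staging approach, with made-up object names (load it via the SSIS flat file source configured with only a row delimiter, or via BULK INSERT as commented):

-- One wide column so every physical line lands intact; parse later.
CREATE TABLE dbo.RawStage (RawLine varchar(max));

-- Alternative to SSIS for the load (path is hypothetical):
-- BULK INSERT dbo.RawStage FROM 'C:\import\export.csv'
--     WITH (ROWTERMINATOR = '\n');

-- Compare against the known file row count (6,313,052 in this case).
SELECT COUNT(*) AS loaded_rows FROM dbo.RawStage;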

My wild guess is that the file has some errors with the field delimiters that are causing rows not to insert, and for some reason the errors are not being thrown.

I had a project once where we were supposed to import a pipe-delimited .csv. One of the columns had email text, and whenever anyone had a signature block with pipes in it (see below), it caused the import to fail.

Hope this helps.
Jim Horn  |  SQL Genius Dang Dude | Cell (906) 555-1212
"This signature block is guaranteed to hose up pipe-delimeted csv's"
 
Author Comment
by: k1ng87 (LVL 1)
ID: 39723999
Hey Jim,

So I did that, and I now get back exactly the same row count as the flat file contains. How do I parse the table with T-SQL on the comma delimiters?

My row looks like this now...

Column 0
2013-11-27 13:11:00,1XRTT,DATA,East Michigan,Region 2,East Michigan_PORT HURON_CL#17,LNS1,2436,DE60XC049,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,,,,,,,,,,,,,,,76,0,0,0,0,0,0,0,41,35,0,2.59444444444444444444444444444444444444,0,76,0,0,0,168,168,,,,,,,,,,,,,,,,,,,,,,,,,155.666666666666666666666666666666666667,0,0,0,0,0,3,0,104,0,0,0,150,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,18,0
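Not a full answer, but a minimal sketch of positional parsing with CHARINDEX/SUBSTRING, assuming the staging table and column are named dbo.RawStage(RawLine) as sketched above; with roughly 300 columns, a split function (or re-pointing the SSIS flat file source at the staged data) scales better:

-- Pull the first three fields by locating the first two commas.
;WITH Cut AS (
    SELECT RawLine,
           CHARINDEX(',', RawLine)                              AS p1,
           CHARINDEX(',', RawLine, CHARINDEX(',', RawLine) + 1) AS p2
    FROM dbo.RawStage
)
SELECT SUBSTRING(RawLine, 1, p1 - 1)           AS D_DTM,
       SUBSTRING(RawLine, p1 + 1, p2 - p1 - 1) AS TECHNOLOGY,
       SUBSTRING(RawLine, p2 + 1,
                 CHARINDEX(',', RawLine, p2 + 1) - p2 - 1) AS VOICEDATA
FROM Cut;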

 
Assisted Solution
by: Anthony Perkins (LVL 75), earned 250 total points
ID: 39733210
I would first make sure that all the rows in this new table have the correct number of commas.  If you are not sure how to do that with T-SQL, just ask.
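One common way to do that check is to count delimiters per row and group by the count; rows whose comma count differs from (number of columns - 1) are the suspects. A sketch, again assuming the dbo.RawStage(RawLine) staging table from above:

-- Group rows by how many commas each one contains.
SELECT LEN(RawLine) - LEN(REPLACE(RawLine, ',', '')) AS comma_count,
       COUNT(*) AS rows_with_that_count
FROM dbo.RawStage
GROUP BY LEN(RawLine) - LEN(REPLACE(RawLine, ',', ''))
ORDER BY comma_count;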
 
Expert Comment
by: Jim Horn (LVL 65)
ID: 41704213
Recommend split #39723927 and #39733210
