Kenneth_K

asked on

BCP from SQL2005 to 2008R2

Hi
I am trying to BCP data from a SQL 2005 database to a SQL 2008 database; on the 2008 SQL Server we are using nvarchar instead of varchar.
When I try to use the command:
BCP xxx in x\xxx -T -n  -S xxxxx
I get this error:
SQLState = 22001, Native Error = 0, String data, right truncation.

I then tried to get an error file from BCP by using -e, and this is what I got (see attached file):

 error.txt

When I look at the error file, it says the failure happens on different columns each time, so at first I thought the source and target might have a different number of columns, but they are identical (apart from the nvarchar). Does anyone have an idea of what I could try?
dbaSQL

This is most likely because one of the data strings is too long for the column into which you're trying to insert. You could create a dummy table, or a working table, with each of your columns larger than in the current table. BCP into your working table, and then analyze the data to figure out which column(s) are causing the problem.
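Something along these lines might help narrow it down (just a sketch; the table and column names below are placeholders, not your real schema):

-- Working table with deliberately oversized columns (placeholder names)
CREATE TABLE dbo.WorkTable
(
    Col1 NVARCHAR(MAX) NULL,
    Col2 NVARCHAR(MAX) NULL,
    Col3 NVARCHAR(MAX) NULL
);

-- After BCP'ing the data into dbo.WorkTable, find the longest value in each column
-- and compare it against the lengths declared in the real target table.
SELECT
    MAX(DATALENGTH(Col1)) AS Col1_MaxBytes,
    MAX(DATALENGTH(Col2)) AS Col2_MaxBytes,
    MAX(DATALENGTH(Col3)) AS Col3_MaxBytes
FROM dbo.WorkTable;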
Kenneth_K

ASKER

OK, good idea, let me try that. I'll report back in a few hours.
Alpesh Patel
Ohh, you are trying to insert data that is longer than the field allows.

I.e. if the field length is 40 and you try to insert data with a length greater than 40, you get the truncation error.
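For example, the same failure can be reproduced in plain T-SQL (just an illustration; the table and value are made up):

-- A 41-character value will not fit into an NVARCHAR(40) column
DECLARE @demo TABLE (val NVARCHAR(40));
INSERT INTO @demo (val) VALUES (REPLICATE(N'x', 41));
-- Fails with: String or binary data would be truncated.
-- BCP reports the equivalent condition as "String data, right truncation".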
I may be blind, but I can't see why it should not fit in the target.
I have attached the source table and the target table. If anyone can see where it goes wrong, I would be very happy.

source.txt target.txt
NVARCHAR is stored as UTF-16, 16 bits or 2 bytes per character, twice the storage required for VARCHAR.
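You can see the difference with a quick check (just an illustration):

-- DATALENGTH returns bytes: 5 for the varchar value, 10 for the nvarchar value
SELECT DATALENGTH(CAST('hello'  AS VARCHAR(10)))  AS VarcharBytes,
       DATALENGTH(CAST(N'hello' AS NVARCHAR(10))) AS NvarcharBytes;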
 
Sure, but how would that be a problem here, or am I misunderstanding something? I know about nvarchar and Unicode, but I still thought I could take a varchar and cast it into an nvarchar.
Take a look at this:  http://www.dotnetspider.com/forum/158173-Difference-between-Varchar-nvarchar.aspx

I would still go with my initial suggestion -- why not create a working table, all attributes much larger than they are in your current target table.  And load the data.  At this point, you compare the data lengths, and figure out where the problem is.
Sure, I can try; it's just a huge table with 88 million records, but I can take one and try. I have tried copying the source table and inserting into that, which of course works fine. But I'm going to try making a copy of the destination table, just larger.
Then I'll report back on what happens.
Maybe create a view of the data, selecting a few thousand records.  And use that to extract your data.  Just to improve the chances of collecting data that will cause the same problem.
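Something like this, as a rough sketch (the view and table names are placeholders):

-- A small sample to BCP out instead of the full 88 million rows;
-- bcp can export from the view with the regular "out" option.
CREATE VIEW dbo.vSourceSample
AS
SELECT TOP (5000) *
FROM dbo.SourceTable;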
I did try making the size of the table columns "very oversized", and it still did not help. What I did then was export with -c and import with that option as well, and then it did work.
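For reference, that is consistent with what the bcp format switches do: -n writes the data in the source's native (binary) data types, while -c writes everything as plain character data (and -w as Unicode character data), which the 2008 side then converts into the nvarchar columns on load. A rough sketch of the character-format round trip, wrapped in xp_cmdshell so it stays in T-SQL (server, database, table, and path names are placeholders, and xp_cmdshell must be enabled on your server):

-- Export from the 2005 source and import into the 2008 R2 target in character format
EXEC master..xp_cmdshell 'bcp SourceDb.dbo.SourceTable out C:\temp\SourceTable.dat -c -T -S SourceServer';
EXEC master..xp_cmdshell 'bcp TargetDb.dbo.TargetTable in C:\temp\SourceTable.dat -c -T -S TargetServer -e C:\temp\errors.txt';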
ASKER CERTIFIED SOLUTION
dbaSQL