Solved

bulk insert date problem

Posted on 2007-11-15
Medium Priority
533 Views
Last Modified: 2010-04-21
table includes a smalldatetime field -- call it date1
i'm bulk inserting a ton of data to this table
i just now noticed that all the dates have been altered.  
11/13/2007 17:59 in the file is 1900-01-01 17:59:00 in the table

i just truncated the table and re-did one of the file bulk inserts.  same thing, every date is current (today) in the file, yet it is 1900 in the table

like i said, it's smalldatetime in the table
the bulk insert is done via format file; here is that particular attribute in the fmt file:


5  SQLCHAR        0   4    ","      4      date1      SQL_Latin1_General_CP1_CI_AS

is it my format file?
can anybody tell me what's happening to my dates?
Question by:dbaSQL
21 Comments
 
LVL 17

Author Comment

by:dbaSQL
ID: 20295357
here's the whole format file, just in case the problem is w/another attribute:

8.0
6
1   SQLCHAR   0   50   ","    1   orderno       SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   1    ","    2   source        SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   1    ","    3   type          SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   0    " "    0   mindex        SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR   0   4    ","    4   date1         SQL_Latin1_General_CP1_CI_AS
6   SQLCHAR   0   4    "\n"   5   timestampms   SQL_Latin1_General_CP1_CI_AS
 
LVL 17

Author Comment

by:dbaSQL
ID: 20295462
the oddity is i am doing two bulk inserts into two different tables.  the same flat files are the data source for both bulk inserts, and each goes into its table via a different format file.
one is working
the other one, all dates are 1900

does anybody know what this may be?
 
LVL 17

Author Comment

by:dbaSQL
ID: 20295543
the only diff between the two bulk inserts is the fmt file.  in this one, i am trying to ignore MIndex
4   SQLCHAR   0   0      " "    0    mindex      SQL_Latin1_General_CP1_CI_AS

that's why i put 0 in there for server order, length and no terminator
i put that column on this 2nd table (ugh), put the necessary details back into the fmt file, and it works w/out changing the date to 1900.

i don't want to load the attribute.  could anybody tell me how to properly ignore it from the data file?
i did this, but i see now that is to ignore something that doesn't exist in the data file.  i need to ignore something that is in the file, but not in the table


Note: It is possible to skip importing a table column if the field does not exist in the data file by specifying 0 prefix length, 0 length, 0 server column order, and no terminator. This effectively states that the data field does not exist in the data file, and that the server column should not have data loaded into it.

 
LVL 75

Expert Comment

by:Aneesh Retnakaran
ID: 20296285
Is that happening only with the dates after the 12th?
 
LVL 143

Expert Comment

by:Guy Hengel [angelIII / a3]
ID: 20296561
you have only 2 choices:
ensure the format of the date in the file is yyyy-mm-dd ...
or
use a staging table with the field as varchar, and insert from there with a proper CONVERT(datetime, yourfield, 101)
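A minimal sketch of that staging-table approach, assuming a hypothetical staging table called stg_orders, a hypothetical target called tableB, and a made-up data file name (none of these names come from the thread):

-- stage the raw text first, keeping the date column as plain varchar
CREATE TABLE stg_orders (
    orderno     varchar(50),
    source      char(1),
    type        char(1),
    mindex      varchar(20),
    tsdate      varchar(30),
    timestampms varchar(20)
);

BULK INSERT stg_orders
FROM 'E:\MSSQL\Tools\datafile.txt'          -- data file name is an assumption
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- then convert explicitly; style 101 tells CONVERT the string is mm/dd/yyyy
INSERT INTO tableB (orderno, source, type, tsdate, timestampms)
SELECT orderno,
       source,
       type,
       CONVERT(smalldatetime, tsdate, 101),
       CAST(timestampms AS int)
FROM stg_orders;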
 
LVL 17

Author Comment

by:dbaSQL
ID: 20297522
aneesh, it was all dates, but in this particular situation, yes, they were all after the 12th.  it was the 13th, 14th and 15th.  but i don't understand why that would apply???

angel, the same file goes into tableA via bulk insert w/format file, no problem
then i put the same file into tableB via bulk insert w/another fmt file, and the dates all change to 1900
so i have to assume the date in the file is correctly formatted, as it's working on tableA w/out error
the only difference between the two inserts is the fmt file, and the table it's going into

tableA
MsgID      int
OrderNo      varchar
Source      char
Type      char
mIndex      int
TSDate      smalldatetime
TimestampMS      int

fmt file #1
8.0
6
1   SQLCHAR   0   50   ","    2   orderno       SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   1    ","    3   source        SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   1    ","    4   type          SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   4    ","    5   mindex        SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR   0   4    ","    6   tsdate        SQL_Latin1_General_CP1_CI_AS
6   SQLCHAR   0   4    "\n"   7   timestampms   SQL_Latin1_General_CP1_CI_AS

table2
OrderNo   varchar      
Source    char      
Type      char      
MIndex    int      
TSDate   smalldatetime  
TimestampMS   int      
Latency   int      
Type   char      

fmt file #2
8.0
6
1   SQLCHAR   0   50   ","    1   orderno       SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   1    ","    2   source        SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   1    ","    3   type          SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   0    " "    0   mindex        SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR   0   4    ","    4   tsdate        SQL_Latin1_General_CP1_CI_AS
6   SQLCHAR   0   4    "\n"   5   timestampms   SQL_Latin1_General_CP1_CI_AS

i'm trying to ignore MIndex in the 2nd fmt file/table/bulk insert
tsdate is the problematic column
 
LVL 7

Expert Comment

by:wittyslogan
ID: 20297781
just a question:
Does your system think it's English?
The reason I ask is because all the problems occur after the 12th.
So is it thinking that you are putting in the 11th day of the 13th month and is this causing the error?
 
LVL 17

Author Comment

by:dbaSQL
ID: 20297826
>>Does your system think it's English?

um.... i would think so.  how would i confirm this?
 
LVL 7

Expert Comment

by:wittyslogan
ID: 20297860
when you do select getdate(), is the month or the day first?  I wasn't being cheeky; it's just that English and US dates are different
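For reference, a few standard commands to see how a session will interpret ambiguous date strings (nothing here is specific to this server; it is just the stock way to check):

SELECT GETDATE();        -- current server date/time

SELECT @@LANGUAGE;       -- language of the current session
DBCC USEROPTIONS;        -- reports the dateformat in effect (mdy, dmy, ...)

SET DATEFORMAT mdy;      -- force month/day/year interpretation of date strings for this session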
 
LVL 17

Author Comment

by:dbaSQL
ID: 20297889
2007-11-16 07:14:13.547
cheeky, i like that word
 
LVL 7

Expert Comment

by:wittyslogan
ID: 20297945
it's a marvellous word.

But you will notice that the order of the date is yyyy mm dd
your import is in the order yyyy dd mm.
so it is in English.

Should the collation be changed to cope with the import?
Anyone?
 
LVL 17

Author Comment

by:dbaSQL
ID: 20297962
i keep coming back to the fmt file.  cuz remember, the very same file goes into another table w/out error.  the dates are not altered

in my fmt file, i was trying to ignore mindex
when i changed that, and allowed it into the bulk insert/table, the dates came in fine
possibly i was ignoring it incorrectly -- syntactically wrong?

do you know the correct way to ignore a field in the flat file within the format file?
 
LVL 7

Expert Comment

by:wittyslogan
ID: 20298040
I think you set the column number to 0 to ignore it.
 
LVL 17

Author Comment

by:dbaSQL
ID: 20298591
i will try that now
 
LVL 17

Author Comment

by:dbaSQL
ID: 20298773
Server: Msg 4823, Level 16, State 1, Line 2
Could not bulk insert. Invalid column number in format file 'E:\MSSQL\Tools\test.fmt'.

there must be a way to ignore a field in the flat file
 
LVL 7

Accepted Solution

by:
wittyslogan earned 2000 total points
ID: 20299196
you have to change the column order.

if you've got columns 1,2,3,4

You want to ignore column 3

Then on the format file this will become 1,2,0,3
not 1,2,0,4

Which did you do?
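A minimal sketch of that renumbering rule, using a hypothetical four-field file loading into a three-column table (the field names and widths here are made up): field 3 gets server column 0 and an empty collation, and field 4 maps to server column 3, not 4.

8.0
4
1   SQLCHAR   0   50   ","    1   col_a    SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   10   ","    2   col_b    SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   10   ","    0   skipme   ""
4   SQLCHAR   0   10   "\n"   3   col_c    SQL_Latin1_General_CP1_CI_AS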
 
LVL 17

Author Comment

by:dbaSQL
ID: 20299453
i got it.  it was the fmt file.  here it is:

8.0
6
1   SQLCHAR   0   50   ","    1   orderno       SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   1    ","    2   source        SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   1    ","    3   type          SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   4    ","    0   mindex        ""
5   SQLCHAR   0   4    ","    4   tsdate        SQL_Latin1_General_CP1_CI_AS
6   SQLCHAR   0   4    "\n"   5   timestampms   SQL_Latin1_General_CP1_CI_AS

note, it's not really that much diff from the one that was hosing my dates:
8.0
6
1   SQLCHAR   0   50   ","    1   orderno       SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   1    ","    2   source        SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   1    ","    3   type          SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   0    " "    0   mindex        SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR   0   4    ","    4   tsdate        SQL_Latin1_General_CP1_CI_AS
6   SQLCHAR   0   4    "\n"   5   timestampms   SQL_Latin1_General_CP1_CI_AS

correct:     4   SQLCHAR   0   4   ","    0   mindex   ""
incorrect:   4   SQLCHAR   0   0   " "    0   mindex   SQL_Latin1_General_CP1_CI_AS

aside from the collation, you are correct, i had to change the column order.  but you said ignore column 3; i actually wanted to ignore column 4 in the file

crap

i totally thought i had it.  i ran it from query analyzer w/the bulk insert statement, using the test.fmt file
it worked.
so i took the content of the test.fmt file, put it into my correct fmt file, now it fails w/this:

Server: Msg 4839, Level 16, State 1, Line 1
Cannot perform bulk insert. Invalid collation name for source column 6 in format file 'E:\MSSQL\Tools\ee_messages.fmt'.

yet it works w/the test.fmt file
there must be something silly i am missing here
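For completeness, the kind of Query Analyzer statement being referred to above would look roughly like this (the target table and data file names are assumptions; only the format file path appears in the thread):

BULK INSERT dbo.tableB
FROM 'E:\MSSQL\Tools\datafile.txt'            -- data file name is hypothetical
WITH (FORMATFILE = 'E:\MSSQL\Tools\test.fmt');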
 
LVL 17

Author Comment

by:dbaSQL
ID: 20299475
no clue.  honestly, they both appear identical.  i just renamed them --- ee_messages.fmt became ee_messagesOLD.fmt, test.fmt became ee_messages.fmt, and it's working now
who knows

again, thank you each for your input
 
LVL 17

Author Closing Comment

by:dbaSQL
ID: 31409488
i ended up getting it myself, but wittyslogan was definitely correct, i had to change my column order
 
LVL 7

Expert Comment

by:wittyslogan
ID: 20299492
I'm glad, it took a bit of head scratching.  It would be good to know what made it work and then not work, though.
 
LVL 17

Author Comment

by:dbaSQL
ID: 20299513
yeah... i agree.  if i see this again, i'll be a bit more informed, for sure... but still, i don't have the exact explanation, which kinda blows.  

regardless, thanks very much.  we're there now
