
Solved

Bulk Inserting Dates

Posted on 2004-08-19
Medium Priority
888 Views
Last Modified: 2012-06-27
Hi,

I am using BULK INSERT to import large amounts of data into SQL tables. I have come across a problem when I need to insert date fields with the value "1899-12-30". Whenever such a field is imported, BULK INSERT generates this error:

Server: Msg 4864, Level 16, State 1, Line 1
Bulk insert data conversion error (type mismatch) for row 2072, column 9 (LAST_AMENDED_DATE).

I have tried changing the field type from smalldatetime to datetime but haven't had any success.

This is a sample row from the text file I am importing (pipes are delimiting the fields): -
PO008726|X001 041|1|0|0|1|9|0.00|1899-12-30|BIMPORTD|0||1||1||WI|0.00|0||0|WI|I

Is there any way of going around this?
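
For reference, the statement I'm running is along these lines (table name and file path changed here):

    BULK INSERT dbo.PO_Import
    FROM 'C:\import\po_data.txt'
    WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')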

Cheers
Question by:jbonello
9 Comments
 
LVL 10

Expert Comment

by:AustinSeven
ID: 11840849
Can't imagine why it's failing, as there's nothing wrong with the date format in the example you've given. Have you looked at the line corresponding to row 2072? Is the date valid in that row? As a test, can you remove row 2072 from the input file and re-run?

AustinSeven
 
LVL 10

Expert Comment

by:AustinSeven
ID: 11840856
Also... what happens if you bcp in the data?   What happens if you DTS in the data?  Would be interested to know if you can isolate row 2072.

AustinSeven
 
LVL 17

Expert Comment

by:BillAn1
ID: 11840871
As AustinSeven says, there doesn't appear to be anything wrong with the sample row. Can you isolate row 2072?

Alternatively, can you import the file using DTS and set it up to log an error file?
You can set max errors to, say, 999. That way you can load most of the data and have the bad rows logged. You can then examine the bad rows to determine what's wrong with them.
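
You can get similar behaviour from BULK INSERT itself with the MAXERRORS option (object names made up here):

    -- tolerate up to 999 bad rows instead of aborting on the first error
    BULK INSERT dbo.PO_Import
    FROM 'C:\import\po_data.txt'
    WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', MAXERRORS = 999)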

 

Author Comment

by:jbonello
ID: 11840962
I have tried removing the indicated row from the text file and the import went in without any problems.
I have also tried changing the year from 1899 to 1900 and this also worked fine. However changing the date isn't a viable option for me.
Also, I am restricted to using Bulk Insert because this forms part of a larger process so I can't use BCP or DTS at this stage.
 
LVL 10

Accepted Solution

by:
AustinSeven earned 375 total points
ID: 11840967
Or... how about copying your existing SQL Server table schema to another, empty table, and then changing the data type of the datetime column to varchar(25)? Then do the bulk insert again. This will give you the opportunity to easily query the data to find duff values. You could extend this test into a solution by using the new table as a staging table. Such a table can be 'loosely typed' to allow for dirty data. Obviously, you would then need a procedure to clean the data and insert it into the original table. I guess this strategy depends on the likelihood of you getting bad data.
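
Something like this (column list cut down, all names made up for illustration):

    -- staging table: date column loosely typed so nothing gets rejected
    CREATE TABLE dbo.PO_Staging (
        ORDER_NO varchar(20),
        LAST_AMENDED_DATE varchar(25)
        -- remaining columns, also loosely typed
    )

    BULK INSERT dbo.PO_Staging
    FROM 'C:\import\po_data.txt'
    WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')

    -- find the duff values; the format is yyyy-mm-dd, so a string compare
    -- is enough to catch dates below the smalldatetime minimum
    SELECT *
    FROM dbo.PO_Staging
    WHERE ISDATE(LAST_AMENDED_DATE) = 0
       OR LAST_AMENDED_DATE < '1900-01-01'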

AustinSeven
 
LVL 19

Expert Comment

by:Melih SARICA
ID: 11841271
In that row the date value is below the minimum your column type supports ('1753/01/01' for datetime).

If your data may contain dates that early, you must change your datatype to varchar.


Melih SARICA
 

Author Comment

by:jbonello
ID: 11841419
Thanks Austin,

I used varchar in the staging table, then converted to datetime in the production table.

Cheers
 
LVL 14

Expert Comment

by:Jan Franek
ID: 11841445
It looks like your date column is of SMALLDATETIME type - this type supports dates from January 1, 1900, through June 6, 2079. To import a date from 1899 you need to use the DATETIME type (January 1, 1753 through December 31, 9999).
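
You can see the difference directly in Query Analyzer:

    SELECT CAST('1899-12-30' AS datetime)       -- works
    SELECT CAST('1899-12-30' AS smalldatetime)  -- fails: out-of-range smalldatetime value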
 
LVL 4

Expert Comment

by:rlibrandi
ID: 11841469
I've had issues like this before as well.  My solution was to write a program to scrub the data before the bulk insert ever runs.

In VB -

Open the file.
Read each line and replace the bad date with a good one:
    strTemp = Replace(strTemp, "1899-12-30", "1900-12-30")

Write the "scrubbed" line to a new file.
Point your bulk insert at the scrubbed file.
