Solved

Bulk Inserting Dates

Posted on 2004-08-19
9
882 Views
Last Modified: 2012-06-27
Hi,

I am using Bulk Insert to import large amounts of data into SQL tables. I have come across a problem when I need to insert date fields with the value "1899-12-30". Whenever such a field is imported, Bulk Insert generates an error saying "Server: Msg 4864, Level 16, State 1, Line 1
Bulk insert data conversion error (type mismatch) for row 2072, column 9 (LAST_AMENDED_DATE)."

I have tried changing the field type from smalldatetime to datetime, but that hasn't been successful.

This is a sample row from the text file I am importing (the fields are pipe-delimited):
PO008726|X001 041|1|0|0|1|9|0.00|1899-12-30|BIMPORTD|0||1||1||WI|0.00|0||0|WI|I

Is there any way of going around this?

Cheers
0
Comment
Question by:jbonello
9 Comments
 
LVL 10

Expert Comment

by:AustinSeven
ID: 11840849
I can't imagine why it's failing, as there's nothing wrong with the date format in the example you've given. Have you looked at row 2072 in the input file? Is the date valid in that row? As a test, can you remove row 2072 from the input file and re-run?

AustinSeven
0
 
LVL 10

Expert Comment

by:AustinSeven
ID: 11840856
Also... what happens if you bcp the data in? What happens if you DTS it in? I'd be interested to know if you can isolate row 2072.

AustinSeven
0
 
LVL 17

Expert Comment

by:BillAn1
ID: 11840871
As AustinSeven says, there doesn't appear to be anything wrong with the sample row. Can you isolate row 2072?

Alternatively, you could import the file using DTS and set it up to log an error file.
Set the maximum error count to, say, 999. That way you can load most of the data and have the bad rows logged. You can then examine the bad rows to determine what's wrong with them.
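If you have to stay with BULK INSERT, it has a comparable MAXERRORS option that lets the load continue past bad rows instead of aborting. A sketch only — the table name, file path, and terminators below are assumptions based on the sample row, and unlike DTS, BULK INSERT won't write the rejected rows out to a file for you:

```sql
-- Sketch: dbo.PurchaseOrders and the file path are placeholders.
BULK INSERT dbo.PurchaseOrders
FROM 'C:\import\po_data.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n',
    MAXERRORS       = 999   -- tolerate up to 999 bad rows before aborting
);
```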
0
 

Author Comment

by:jbonello
ID: 11840962
I have tried removing the indicated row from the text file, and the import went through without any problems.
I have also tried changing the year from 1899 to 1900, and that also worked fine. However, changing the date isn't a viable option for me.
Also, I am restricted to using Bulk Insert because this forms part of a larger process, so I can't use BCP or DTS at this stage.
0
 
LVL 10

Accepted Solution

by:
AustinSeven earned 125 total points
ID: 11840967
Or... how about copying your existing SQL Server table schema to an empty table and changing the data type of the datetime column to varchar(25)? Then do the bulk insert again. This will give you the opportunity to easily query the data to find duff values. You could extend this test into a solution by using the new table as a staging table. Such a table can be 'loosely typed' to allow for dirty data. Obviously, you would need a procedure to clean the data and then insert it into the original table. I guess this strategy depends on the likelihood of you getting bad data.
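In outline, it would look something like this — only LAST_AMENDED_DATE comes from the question; the other table, column, and file names are made up for the sketch:

```sql
-- Loosely-typed staging table: the date arrives as varchar, so no row is rejected.
CREATE TABLE dbo.PO_Staging (
    PO_NUMBER         varchar(25),
    -- ... other columns ...
    LAST_AMENDED_DATE varchar(25)
);

BULK INSERT dbo.PO_Staging
FROM 'C:\import\po_data.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');

-- Find the duff values: anything that won't convert to the target type.
-- (The string comparison works here because the dates are in ISO format.)
SELECT LAST_AMENDED_DATE
FROM dbo.PO_Staging
WHERE ISDATE(LAST_AMENDED_DATE) = 0
   OR LAST_AMENDED_DATE < '1900-01-01';  -- below smalldatetime's minimum
```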

AustinSeven
0
 
LVL 19

Expert Comment

by:Melih SARICA
ID: 11841271
In that row, the date value is below the minimum your column's type will accept (smalldatetime starts at 1900-01-01; datetime at 1753-01-01).

If your data may contain such early dates, you may need to change your datatype to varchar.


Melih SARICA
0
 

Author Comment

by:jbonello
ID: 11841419
Thanks Austin,

I used varchar in the staging table then converted into a datetime in the production table.
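For anyone following along, the clean-up insert from the staging table can look something like this — the names are placeholders, and the CASE here maps out-of-range dates to NULL, though you could substitute a sentinel date instead:

```sql
-- Convert only values that fit in the target column; NULL out the rest.
INSERT INTO dbo.PurchaseOrders (PO_NUMBER, LAST_AMENDED_DATE /* , ... */)
SELECT PO_NUMBER,
       CASE WHEN ISDATE(LAST_AMENDED_DATE) = 1
                 AND LAST_AMENDED_DATE >= '1900-01-01'
            THEN CAST(LAST_AMENDED_DATE AS smalldatetime)
            ELSE NULL
       END
FROM dbo.PO_Staging;
```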

Cheers
0
 
LVL 14

Expert Comment

by:Jan_Franek
ID: 11841445
It looks like your date column is of SMALLDATETIME type - this type supports dates from January 1, 1900, to June 6, 2079. To import a date from the year 1899 you need to use the DATETIME type (January 1, 1753 to December 31, 9999).
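You can see the difference with a quick test in Query Analyzer — the first statement raises an out-of-range conversion error, the second succeeds:

```sql
SELECT CAST('1899-12-30' AS smalldatetime);  -- fails: below smalldatetime's 1900-01-01 minimum
SELECT CAST('1899-12-30' AS datetime);       -- succeeds: datetime's range starts at 1753-01-01
```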
0
 
LVL 4

Expert Comment

by:rlibrandi
ID: 11841469
I've had issues like this before as well. My solution was to write a program to scrub the data before the bulk insert ever runs.

In VB:

Open the file, read each line, and replace the bad date with a good one:
    strTemp = Replace(strTemp, "1899-12-30", "1900-12-30")

Write the "scrubbed" line to a new file, then point your bulk insert at the scrubbed file.
0


Question has a verified solution.
