
Solved

Bulk Inserting Dates

Posted on 2004-08-19
Medium Priority
887 Views
Last Modified: 2012-06-27
Hi,

I am using Bulk Insert to import large amounts of data into SQL tables. I have come across a problem when I need to insert date fields with the value "1899-12-30". Whenever such a field is imported, Bulk Insert generates an error saying "Server: Msg 4864, Level 16, State 1, Line 1
Bulk insert data conversion error (type mismatch) for row 2072, column 9 (LAST_AMENDED_DATE)."

I have tried changing the field type from smalldatetime to datetime, but that hasn't been successful.

This is a sample row from the text file I am importing (pipes delimit the fields):
PO008726|X001 041|1|0|0|1|9|0.00|1899-12-30|BIMPORTD|0||1||1||WI|0.00|0||0|WI|I
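
For illustration, a minimal BULK INSERT along these lines reproduces the problem (the table and file names here are placeholders, not the real ones):

    -- Placeholder table and path; LAST_AMENDED_DATE is the
    -- smalldatetime column (column 9) that rejects 1899-12-30.
    BULK INSERT dbo.PurchaseOrders
    FROM 'C:\import\po_data.txt'
    WITH (
        FIELDTERMINATOR = '|',
        ROWTERMINATOR = '\n'
    )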

Is there any way around this?

Cheers
Question by:jbonello
9 Comments
 
LVL 10

Expert Comment

by:AustinSeven
ID: 11840849
I can't imagine why it's failing, as there's nothing wrong with the date format in the example you've given. Have you looked at row 2072 in the input file? Is the date valid in that row? As a test, can you remove row 2072 from the input file and re-run the import?

AustinSeven
 
LVL 10

Expert Comment

by:AustinSeven
ID: 11840856
Also... what happens if you bcp the data in? What happens if you bring it in with DTS? I'd be interested to know if you can isolate row 2072.
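
For example, a bcp load of the same file would look something like this (server, database, table and path are all placeholders):

    REM Placeholders throughout. -c = character mode, -t = field
    REM terminator, -S = server name, -T = trusted connection.
    bcp MyDB.dbo.PurchaseOrders in "C:\import\po_data.txt" -c -t "|" -S MYSERVER -T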

AustinSeven
 
LVL 17

Expert Comment

by:BillAn1
ID: 11840871
As AustinSeven says, there doesn't appear to be anything wrong with the sample row. Can you isolate row 2072?

Alternatively, can you import the file using DTS and set it up to log an error file?
You can set the maximum error count to, say, 999. That way you can load most of the data and have the bad rows logged. You can then examine the bad rows to determine what's wrong with them.
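
BULK INSERT itself has a similar knob. As a sketch (placeholder names again, and assuming MAXERRORS counts conversion errors on your version), raising it lets the load carry on past the bad rows:

    -- Sketch with placeholder names: allow up to 999 bad rows
    -- instead of aborting on the first failure.
    BULK INSERT dbo.PurchaseOrders
    FROM 'C:\import\po_data.txt'
    WITH (
        FIELDTERMINATOR = '|',
        ROWTERMINATOR = '\n',
        MAXERRORS = 999
    )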

 

Author Comment

by:jbonello
ID: 11840962
I have tried removing the indicated row from the text file and the import then ran without any problems.
I have also tried changing the year from 1899 to 1900, and that also worked fine. However, changing the date isn't a viable option for me.
Also, I am restricted to using Bulk Insert because this forms part of a larger process, so I can't use bcp or DTS at this stage.
 
LVL 10

Accepted Solution

by:
AustinSeven earned 375 total points
ID: 11840967
Or... how about copying your existing SQL Server table schema to another, empty table and then changing the data type of the datetime column to varchar(25)? Then do the bulk insert again. This will give you the opportunity to easily query the data to find duff values. You could extend this test into a solution by using the new table as a staging table. Such a table can be 'loosely typed' to allow for dirty data. Obviously, you would need a procedure to clean the data and then insert it into the original table. I guess this strategy depends on the likelihood of you getting bad data.
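
As a rough sketch of the idea (all names are placeholders, and most of the file's 23 columns are left out for brevity - in practice every field would be staged):

    -- Staging table: everything loosely typed as varchar.
    CREATE TABLE dbo.PO_Staging (
        PO_NUMBER varchar(25),
        STOCK_CODE varchar(25),
        LAST_AMENDED_DATE varchar(25)
    )

    BULK INSERT dbo.PO_Staging
    FROM 'C:\import\po_data.txt'
    WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')

    -- Query for the duff values before converting.
    SELECT * FROM dbo.PO_Staging
    WHERE ISDATE(LAST_AMENDED_DATE) = 0

    -- Move the clean rows into the real table; the target column
    -- must be datetime (not smalldatetime) for 1899-12-30 to fit.
    INSERT INTO dbo.PurchaseOrders (PO_NUMBER, STOCK_CODE, LAST_AMENDED_DATE)
    SELECT PO_NUMBER, STOCK_CODE, CONVERT(datetime, LAST_AMENDED_DATE)
    FROM dbo.PO_Staging
    WHERE ISDATE(LAST_AMENDED_DATE) = 1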

AustinSeven
 
LVL 19

Expert Comment

by:Melih SARICA
ID: 11841271
In that row the date value is below the minimum your column type allows ('1900/01/01' for smalldatetime; datetime only goes back to '1753/01/01').

If your data may contain these kinds of earlier dates, you must change your datatype to datetime or varchar.


Melih SARICA
 

Author Comment

by:jbonello
ID: 11841419
Thanks Austin,

I used varchar in the staging table then converted into a datetime in the production table.

Cheers
 
LVL 14

Expert Comment

by:Jan Franek
ID: 11841445
It looks like your date column is of the SMALLDATETIME type - this type supports dates from January 1, 1900, to December 31, 2079. To import a date from the year 1899 you need to use the DATETIME type (January 1, 1753, to December 31, 9999).
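
A quick way to see the difference (just a demonstration, not your data):

    -- 1899-12-30 fits in a datetime...
    SELECT CAST('1899-12-30' AS datetime)
    -- ...but is out of range for a smalldatetime, which is the
    -- same problem the bulk load is tripping over.
    SELECT CAST('1899-12-30' AS smalldatetime)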
 
LVL 4

Expert Comment

by:rlibrandi
ID: 11841469
I've had issues like this before as well. My solution was to write a program to scrub the data before the bulk insert ever runs.

In VB - open the file, read each line and replace the bad date with a good one, write the "scrubbed" line to a new file, then point your bulk insert at the scrubbed file:

    ' File paths are placeholders.
    Dim strTemp As String
    Open "C:\import\po_data.txt" For Input As #1
    Open "C:\import\po_data_clean.txt" For Output As #2
    Do While Not EOF(1)
        Line Input #1, strTemp
        ' Replace the bad date with a good one.
        Print #2, Replace(strTemp, "1899-12-30", "1900-12-30")
    Loop
    Close #1
    Close #2