Can I convert a string "DD:MM:YYYY" to a DATETIME field during a bcp?


I've got a comma-separated file ...

My table is like so...

I need to 'bcp' the file containing these rows of comma-separated fields with '\n' as a row terminator.

When I try...
'bcp db.owner.table in file -U -P -c -t , -r \\n'
It will not work: I get an error on the second field (the dd:mm:yyyy). I know this date field should be like yyyy:mm:dd, but this 'cannot' be changed!

Is it possible during my bcp to actually convert this string dd:mm:yyyy into a format that can be inserted into a Sybase DATETIME type?

Remember, I need to bcp the whole file as is (I cannot read each line of the file and convert the date string).
Also, the field in the database needs to be a DATETIME; I can't just use a char here, as I need to be able to sort by date (and I can't sort on dd:mm:yyyy) :(


brianon - any updates on this question?
If you are on a unix platform, I can help you with a script that will convert that field to the format you want, and then you can bcp that file.

If you are familiar with perl, you can write a script yourself, or I guess it could be done in MS Excel also.

Let me know if I can help you.
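Something like this minimal awk sketch would do it on unix. The file names and the assumption that the date is the second field are mine, not from the question; adjust $2 to match the real layout. It rearranges dd:mm:yyyy into yyyymmdd, which Sybase generally accepts into DATETIME regardless of the session's dateformat setting:

```shell
# Sample input (made-up data, for illustration only):
printf '1,25:12:2003,hello\n2,01:06:2004,world\n' > in.csv

# Rewrite field 2 from dd:mm:yyyy to yyyymmdd, keep everything else as is.
awk -F, 'BEGIN { OFS = "," }
{
    split($2, d, ":")        # d[1]=dd, d[2]=mm, d[3]=yyyy
    $2 = d[3] d[2] d[1]      # yyyymmdd
    print
}' in.csv > out.csv

# Then bcp the converted file as before, e.g.:
#   bcp db.owner.table in out.csv -Ulogin -Ppass -c -t, -r '\n'
```

On a typical unix box awk will chew through a file like this far faster than the bcp itself, so the extra pass should not hurt your timings.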

A suggestion to skip scripts and do everything through SQL:

Create a table similar to the existing one; just change the datatype of that particular column from datetime to char(n).

Then bcp the file into this new table.

Once the data is in the new table, update that column into the format you want using the server's string functions.

Finally, insert into the old table from the new table (remember to convert this char column to DATETIME).
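For what it's worth, those steps might look something like the sketch below. Every name here (stage_tbl, real_tbl, login, server, the column widths) is invented for illustration, since the thread never shows the real schema. The SQL is written to a file so you can inspect it before feeding it to isql:

```shell
# Hedged sketch of the staging-table route; all object names are assumptions.
cat > load_dates.sql <<'SQL'
-- staging table: same shape as the real one, but the date stays char(10)
create table stage_tbl (id int, dt_raw char(10), val varchar(40))
go
SQL

# bcp the raw file into the staging table first, e.g.:
#   bcp db.owner.stage_tbl in file.csv -Ulogin -Ppass -c -t, -r '\n'

cat >> load_dates.sql <<'SQL'
-- rearrange dd:mm:yyyy into yyyymmdd and convert on the way across
insert into real_tbl (id, dt, val)
select id,
       convert(datetime,
               substring(dt_raw, 7, 4) + substring(dt_raw, 4, 2)
                                       + substring(dt_raw, 1, 2)),
       val
from stage_tbl
go
SQL

# run it against the server (again, assumed names):
#   isql -Ulogin -Ppass -Sserver -i load_dates.sql
```

Drop the staging table when you are done; yyyymmdd is the one string form Sybase converts unambiguously, so the insert/select does not depend on anyone's dateformat setting.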


brianonAuthor Commented:
Thanks guys.

However, what I'm really looking for is a way to convert the char (dd:mm:yyyy) to the Sybase DATETIME 'during' the bcp of the comma-separated file to the database.

I don't think the script approach would work, as time/speed of execution is a big factor in this one.

The extra-table option? Not sure, but I'll think about it.


Do you work on a unix platform or MS Windows?

Please let me know.
brianonAuthor Commented:
I work on a unix platform.
OK, that's a good thing - now I can provide you with a script that will convert this format into one that Sybase accepts.

Could you please paste here a sample line of the file.


I think amit really has your answer. A simple external program (C or perl) can convert 100K rows in a few seconds, though of course this depends on your system. Amit's thought about using a temp table is a good idea too: you could put the logic in a stored procedure and let Sybase do all the work. Just keep an eye on your log space as well as your indexes when you do this (you will be doing an insert/select rather than a bcp, and that will affect log usage). You may need to drop all or most of your indexes and then recreate them. Note: that would normally have been necessary even if you were to bcp directly into the table. You might need to do some experimenting here, as it depends on the number of rows already in the table vs. the number of rows being imported. If you post a sample line (or just dummy one up), then I'm sure amit can get you what you need.

brianonAuthor Commented:
thanks for your help.
It solved half the problem and I found the rest myself.