
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 740

How do I determine data types from .csv file columns?

I am looking to import columns from a .csv file into a MS SQL Server database. Is there a utility out there that can analyze a .csv file and recommend the best data type for each column based on the data it contains?
1 Solution

The SQL Server Import and Export utility will help you.
Refer to http://msdn.microsoft.com/en-us/library/ms140052.aspx for more details.

Raja Jegan R (SQL Server DBA & Architect) commented:
>> Is there a utility out there that can analyze a .csv file to make recommendations on what the best data type is based on the data in the column?

No. You have to tell SQL Server the data types of the columns in the CSV file:

* Import & Export Wizard - you need to specify the data type for correct conversion
* BCP - you need to create a format file specifying the data type
* SSIS package - you need to map each column to the appropriate data type
mikedgibson (Author) commented:
Is there at least a utility that will look at the file and tell you the max length of each field so I can set the string sizes appropriately?

Raja Jegan R (SQL Server DBA & Architect) commented:
Since it is a CSV file, just dump it into a temp table with every column typed as varchar(1000). You can then identify the maximum length of each column present in the feed file and set that as the length of your columns.
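The same maximum-length analysis can also be done before loading anything into SQL Server. As a sketch (not a tested utility; the file path, delimiter, and the presence of a header row are assumptions), a short script can scan the file and report the longest value seen in each column:

```python
import csv

def max_field_lengths(path, delimiter=","):
    """Report the maximum character length seen in each column of a CSV file."""
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter=delimiter)
        header = next(reader)          # assumes the first row holds column names
        maxima = [0] * len(header)
        for row in reader:
            # ignore any trailing extra fields beyond the header width
            for i, value in enumerate(row[:len(header)]):
                maxima[i] = max(maxima[i], len(value))
    return dict(zip(header, maxima))
```

The returned lengths can then be used directly when sizing varchar columns for the destination table.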
Raja Jegan R (SQL Server DBA & Architect) commented:

Since a CSV file won't contain information about the data types or maximum lengths of its columns, I suggested above loading it into a table with all varchar columns and then manually analyzing the data type and length of those fields.
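That manual analysis can be partially automated outside SQL Server as well. A rough sketch of a type-recommendation pass (the fallback order INT → FLOAT → VARCHAR(n), the header-row assumption, and the NULL handling are all simplifying assumptions, not a production tool):

```python
import csv

def suggest_column_types(path, delimiter=","):
    """Suggest a SQL Server data type per column by scanning every value.

    Each column starts as INT and falls back to FLOAT, then VARCHAR(n),
    as soon as a value fails to parse; empty strings are treated as NULLs.
    """
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter=delimiter)
        header = next(reader)          # assumes the first row holds column names
        kinds = ["INT"] * len(header)
        widths = [1] * len(header)
        for row in reader:
            for i, value in enumerate(row[:len(header)]):
                if value == "":
                    continue           # treat empty string as NULL
                widths[i] = max(widths[i], len(value))
                if kinds[i] == "INT":
                    try:
                        int(value)
                        continue       # still a valid INT column
                    except ValueError:
                        kinds[i] = "FLOAT"
                if kinds[i] == "FLOAT":
                    try:
                        float(value)
                    except ValueError:
                        kinds[i] = "VARCHAR"
    return {
        col: kind if kind != "VARCHAR" else f"VARCHAR({widths[i]})"
        for i, (col, kind) in enumerate(zip(header, kinds))
    }
```

The suggestions are only a starting point; dates, money, and Unicode data would still need a manual decision before creating the destination table.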
Hence I would recommend:

Accept 29060683

Using the Import and Export Wizard will not help, as we still need to specify the data types and column lengths manually.
Unfortunately, that won't work with the ODBC data source, because the conversions occur automatically _before_ the process even reaches the load-into-varchar-column step; i.e., by the time the insert occurs, the data is already corrupted. That's why the schema.ini file is needed.
Ignore that last comment. It was posted to the wrong thread.
