Solved

Problem w/ Nulls:  Import Excel 2002 -> SQL Server 2000

Posted on 2003-10-23
Last Modified: 2007-12-19
I am attempting to import 6 worksheets from Excel to 6 tables in SQL Server 2000.  When the package executes, I get the following error message:

"Error during Transformation 'DirectCopyXform' for Row number 801.  Errors encountered so far in this task:  1. TransformCopy 'DirectCopyXform' conversion error:  Destination does not allow NULL on column pair1 (source: column 'my_col_name' (DBTYPE_R8), destination column 'my_dest_col' (DBTYPEI8))."

This is strange because my Excel spreadsheet has only 800 rows, and none of them are null.  Is it possible that SQL Server is reading in my spreadsheet and is running right off the end of my data?

In an attempt to understand what was going on, I created a test environment with an identical destination table that allowed nulls in my_dest_col.  With this, I was able to import successfully, but I got 2 null rows in my database for some (possibly all) tables.

One possible workaround is temporarily allowing nulls in the database in my production environment (while I import the data).  Then I could delete the erroneously imported null rows and uncheck the box to allow nulls in Enterprise Manager.  Is it possible that this solution would cause damage to my database?  It's not under constant pounding, but a few people might fill out submission forms in that interval.
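(For reference, that temporary allow-nulls sequence would look roughly like this in T-SQL; the table name is a placeholder and the data type should be adjusted to match the real column:)

ALTER TABLE dbo.MyTable ALTER COLUMN my_dest_col bigint NULL
-- run the DTS import here, then remove the phantom rows
DELETE FROM dbo.MyTable WHERE my_dest_col IS NULL
ALTER TABLE dbo.MyTable ALTER COLUMN my_dest_col bigint NOT NULL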

Comments by mid-morning 10/24 EST would be greatly appreciated since I'll probably try the import in production at that time.
Question by:Igiwwa
10 Comments
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 9610987
>>Is it possible that SQL Server is reading in my spreadsheet and is running right off the end of my data? <<
Correct.  Delete the "blank" rows at the end and your problem will go away.

>>One possible workaround is temporarily allowing nulls in the database in my production environment (while I import the data). <<
You should never import directly into a table (let alone in Production), but rather you should import into a temporary table first and then select the appropriate rows into the final table.  This way you can verify what is actually going in (remember the old adage: "garbage in, garbage out").
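A rough sketch of that staging approach (table and column names here are only placeholders):

-- point the DTS package at a staging table such as dbo.MyTable_Staging,
-- then move only the rows you actually want into the real table:
INSERT INTO dbo.MyTable (col1, col2)
SELECT col1, col2
FROM dbo.MyTable_Staging
WHERE col1 IS NOT NULL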

Anthony
 

Author Comment

by:Igiwwa
ID: 9611062
I already tried to delete these "blank" rows and it didn't seem to make any difference.  I guess that's because Excel always gives you infinite empty rows and columns.  Is there any way I can make the spreadsheet just end after row 800?  Maybe I can use MS Access (*shudder*) as an intermediary.

Thanks for the tip about the temporary table, I will definitely use that regardless of the outcome of this problem.
 
LVL 75

Accepted Solution

by:
Anthony Perkins earned 450 total points
ID: 9611093
>>I already tried to delete these "blank" rows and it didn't seem to make any difference. <<
I suspect it depends on how you are deleting them.

In any case, when you import it in, use a query instead of a straight copy, as in:

Select *
From [Sheet1$]
Where Not [Column1] Is Null
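(If you would rather pull the data from the SQL Server side instead, roughly the same filter should work through OPENROWSET with the Jet provider; the file path here is only a placeholder:)

Select *
From OpenRowset('Microsoft.Jet.OLEDB.4.0',
     'Excel 8.0;Database=C:\MyWorkbook.xls',
     'Select * From [Sheet1$] Where Not [Column1] Is Null')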

Anthony
 
LVL 15

Expert Comment

by:namasi_navaretnam
ID: 9613365
I have not had any issues using DTS from Excel to MS SQL Server, but I often use a funky way to import data from Excel to MS SQL Server.

For example, say I have the data listed below in a spreadsheet:
Col1  Col2  Col3  Col4
1     A     XXXX
2     B     YYYY


For Col4, I use the Excel function CONCATENATE to build a SQL INSERT statement from the values in the other columns:

=CONCATENATE( "INSERT INTO MYTABLE (COL1, COL2, COL3) VALUES (", A1, ",'", B1, "','", C1, "')" )

Then copy the Col4 data into Query Analyzer and execute it.  The data is imported.
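With the sample rows above, Col4 would end up holding statements like these, ready to paste and run:

INSERT INTO MYTABLE (COL1, COL2, COL3) VALUES (1,'A','XXXX')
INSERT INTO MYTABLE (COL1, COL2, COL3) VALUES (2,'B','YYYY')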

You may want to use this method if nothing else helps.

Namasi.
 
LVL 30

Expert Comment

by:nmcdermaid
ID: 9613614
If Excel is too dumb to work out where the data stops, you can select all the data and make it a named range (type a name into the Name Box at the left end of the formula bar).

Then I think it is possible to tell DTS to only import the named range.
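For example, if the range is named MyData, the source query in DTS could reference it directly (the range name is just an example):

Select *
From [MyData]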

It's a bit impractical for an automated system, though.
 

Author Comment

by:Igiwwa
ID: 9616870
I like acperkins' suggestion:

Select *
From [Sheet1$]
Where Not [Column1] Is Null

It seems to work well in a test environment.  I was going to try it for real later today, but other stuff has come up, so it may have to wait until next week.  It looks quite promising.

Thanks.
 
LVL 34

Expert Comment

by:arbert
ID: 9623520
I agree with acperkins' suggestion in every way.  Not only can you use the query to extract only the records you need, but you REALLY SHOULD use the temp table--it makes it easier to "clean" any data ahead of time, and also makes it easier to back out data if you have a problem.

Brett
 

Author Comment

by:Igiwwa
ID: 9627548
OK, I did the import and everything worked well.  There is just one small problem.  I have a unique ID for each record.  Before the import I was in the mid 5000s.  To be able to easily identify the newly imported records, I started them at 10,000.  Now SQL Server's auto-numbering is starting new records in the 10,000 block.  Is there any way I can make it start giving out IDs in the 5000s again?
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 9628881
You cannot safely reseed the IDENTITY column to a number lower than one that already exists in the table.  If that is not the case, check out the command:
DBCC CHECKIDENT

From BOL:
<quote>
Checks the current identity value for the specified table and, if needed, corrects the identity value.

Syntax
DBCC CHECKIDENT
    ( 'table_name'
        [ , { NORESEED
                | { RESEED [ , new_reseed_value ] }
            }
        ]
    )

</quote>
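For example, something like this would make the next identity value 5501, though it will eventually collide with the existing IDs in the 10,000 block unless those rows are removed or renumbered first (the table name and value are placeholders):

DBCC CHECKIDENT ('MyTable', RESEED, 5500)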

Anthony
 

Author Comment

by:Igiwwa
ID: 9630409
Thanks a lot everyone!
