Handling Null Values in BULK INSERT

We have an Excel file which contains some 50,000 records. While uploading this Excel file we first convert it into text format. In this text file the 50,000 records are saved, and the remaining 15,000 rows are "null".
Now while inserting this text file using the BULK INSERT command in SQL Server 2000, the "null" values are also getting inserted into the database, which is not required.

Can you please suggest a solution where the "null" values can be avoided while running BULK INSERT?


 
 
Venkatkp007 asked:
frankytee commented:
That may be a limitation of BULK INSERT. When importing data into SQL Server I usually use a "staging" table, which is identical in structure to your "real" table but with no constraints or indexes, to ensure the best speed. Then, after it imports, do the necessary cleaning, which in your case is to remove the "nulls" (or just ignore them) when inserting into the real table; a sketch of that follows below.
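
A minimal sketch of that staging approach, assuming hypothetical table names (Staging_Records, Real_Records), two placeholder columns, and a made-up file path; adjust all of these to your actual schema:

-- Staging table: same columns as the real table, but no constraints or indexes
CREATE TABLE Staging_Records (
    Col1 varchar(100) NULL,
    Col2 varchar(100) NULL
)

-- Load the whole text file, "null" rows and all, into the staging table
BULK INSERT Staging_Records
FROM 'E:\upload\records.txt'                         -- placeholder path
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

-- Copy only the rows that actually contain data into the real table
-- (if the file holds the literal text 'null' rather than empty fields,
--  filter on <> 'null' instead)
INSERT INTO Real_Records (Col1, Col2)
SELECT Col1, Col2
FROM Staging_Records
WHERE Col1 IS NOT NULL OR Col2 IS NOT NULL

-- Remove the staging table when done
DROP TABLE Staging_Records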
 
Jagdish Devaku (Sr DB Architect) commented:
Hi,

I think using the Import wizard in SQL Server Management Studio (SSMS) is a better option than using BULK INSERT.

Right-click the database into which you want to import the data, go to Tasks -> Import Data..., then follow the steps as it asks.

When it asks for the data source, select Microsoft Excel.



 
Mark Wills (Topic Advisor) commented:
Yeah, BULK INSERT is a bit "dumb", but that is what it was designed for...

If you need more finesse, you can go to OPENROWSET, or even create a linked server to the spreadsheet itself and save having to convert it...

Even still, like frankytee, I always use a staging table, do all the validation / verification on that, and then import "clean" records into the real database table.

You will need to swap in the appropriate spreadsheet name (c:\example3.xls) and the worksheet ("example3$"), but hopefully it will be self-explanatory enough...

You can then do things like select where a column is not null, etc...


Select * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database=c:\example3.xls;HDR=YES', 'SELECT * FROM [example3$]')
 
-- or below for a linked server
 
EXEC sp_addlinkedserver MyExcel,
     'Jet 4.0',
     'Microsoft.Jet.OLEDB.4.0',
     'c:\example3.xls', 
      NULL,
     'Excel 5.0;'
 
GO
 
--Set up login mappings (just ADMIN - jet wants something).
EXEC sp_addlinkedsrvlogin MyExcel, FALSE, NULL, Admin, NULL
GO
 
--List the tables in the linked server (these are the worksheet names).
EXEC sp_tables_ex MyExcel
GO
 
--List the contents of the 'example3$' worksheet
select * from MyExcel...example3$
 
--Define a table - permanent or otherwise - suggest otherwise as staging
if exists (select * from information_schema.tables where table_name = 'my_tmp_Sheet1') drop table my_tmp_Sheet1
select * into my_tmp_Sheet1 from MyExcel...example3$
 
-- now from here you will have to adapt 
select * from my_tmp_Sheet1
 
 
-- now can unpack that table into a real home...
-- fetch each record from the cursor, and insert / update a 'real' table
-- then clean up
 
 
-- remove 'staging' area
 
if exists (select * from information_schema.tables where table_name = 'my_tmp_Sheet1') drop table my_tmp_Sheet1
 
-- remove added logins
sp_droplinkedsrvlogin 'MyExcel', 'Admin'
-- remove server
sp_dropserver 'MyExcel', 'droplogins';

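For the "select where a column is not null" idea above, the OPENROWSET form can filter out the empty rows as it loads the staging table. This is only a sketch: 'Col1' is a placeholder for whatever your first header column is actually called.

-- Pull only non-empty rows straight from the worksheet into the staging table
if exists (select * from information_schema.tables where table_name = 'my_tmp_Sheet1') drop table my_tmp_Sheet1

select *
into my_tmp_Sheet1
from OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=c:\example3.xls;HDR=YES',
                'SELECT * FROM [example3$]') as src
where src.Col1 is not null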
 
Venkatkp007 (Author) commented:
Mark,
Does this OPENROWSET work with .txt files also?
Now we are facing a new problem.
We have 4 columns in the text file and the field terminator is a tab.
We use the BULK INSERT statement as:
BULK INSERT Temp_Locations2 FROM 'E:\upload\KGI Files\TDC Song Zone1.txt' WITH (FIRSTROW = 2,FIELDTERMINATOR = '      ',ROWTERMINATOR = '\n')
In this text file the last column is null. Some rows have a tab field terminator after the 3rd column and some don't.
So if it doesn't find the field terminator, it inserts the next row into the current row itself.
How can this be handled?
 
Mark Wills (Topic Advisor) commented:
Yes, and a few other options as well. Normally you need to get involved with SQL "format files" when dealing with flat files... kind of like a primitive schema...

P.S. Use '\t' as the FIELDTERMINATOR....

If it is only 4 columns, import it as a single column and a bit of T-SQL code will quickly break it up for you... (see the sketch below)

Want to send through a zipped copy of your file?
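
A rough sketch of that single-column approach, under some assumptions: the staging table Temp_Lines, the target column names Col1..Col4, and the line length limit are all placeholders, and the CHARINDEX/SUBSTRING splitting shown is just one way to break a tab-delimited line apart (a format file is the more formal alternative).

-- 1. Load each whole line of the file into a single-column staging table
CREATE TABLE Temp_Lines (Line varchar(1000) NULL)

BULK INSERT Temp_Lines
FROM 'E:\upload\KGI Files\TDC Song Zone1.txt'
WITH (FIRSTROW = 2, ROWTERMINATOR = '\n')

-- 2. Break each line on the tab character (CHAR(9)) into four columns.
--    Three tabs are appended to every line first, so rows with a missing
--    4th field (with or without their own trailing tab) still parse cleanly.
INSERT INTO Temp_Locations2 (Col1, Col2, Col3, Col4)   -- swap in your real column names
SELECT
    LEFT(Line, p1 - 1),
    SUBSTRING(Line, p1 + 1, p2 - p1 - 1),
    SUBSTRING(Line, p2 + 1, p3 - p2 - 1),
    NULLIF(REPLACE(SUBSTRING(Line, p3 + 1, LEN(Line)), CHAR(9), ''), '')
FROM (
    SELECT Line,
           CHARINDEX(CHAR(9), Line) AS p1,
           CHARINDEX(CHAR(9), Line, CHARINDEX(CHAR(9), Line) + 1) AS p2,
           CHARINDEX(CHAR(9), Line,
               CHARINDEX(CHAR(9), Line, CHARINDEX(CHAR(9), Line) + 1) + 1) AS p3
    FROM (SELECT Line + REPLICATE(CHAR(9), 3) AS Line FROM Temp_Lines) AS padded
) AS t
-- (if the file uses \r\n line endings, you may also need to strip a trailing CHAR(13))

-- 3. Clean up
DROP TABLE Temp_Lines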