Solved

How do I make working SqlBulkCopy in VB.NET?

Posted on 2008-06-13
3,362 Views
Last Modified: 2013-11-26
We have a text file (around 2 GB) that loads very slowly, taking more than 8 hours, so we decided to use bulk copy to speed up the process.

This text file has a special header (the second line contains the actual column headers) and one footer line.

First, I don't know how to skip the first header line and the last (footer) line when reading with StreamReader.

Second, when I cut these lines manually (not easy with a 2 GB file, so for my test I used only 500 lines) and process the file, I get the following error:
System.InvalidOperationException was unhandled
  Message="The given ColumnName 'BIN_LOC' does not match up with any column in data source."
  Source="System.Data"

The column BIN_LOC exists in my destination table!

What is wrong with the code, and how can I fix the header/footer problem?
Thanks for your time.

Mike
Sub BulkImport()

        Const strDestTable As String = "[SCM-OBJ-12_input_im_ftn]"
        Const strFilename As String = "C:\Mydata.txt"

        Dim i As Long = 0
        Dim dbConn As SqlConnection = New SqlConnection("server=MyServer;uid=MyName;pwd=MyPass;database=MyDB")
        Dim sr As StreamReader = New StreamReader(strFilename)
        Dim line As String = sr.ReadLine()
        Dim strArray As String() = line.Split("~")

        Dim dt As DataTable = New DataTable()
        Dim row As DataRow

        For Each s As String In strArray
            dt.Columns.Add(New DataColumn())
        Next

        Do
            row = dt.NewRow()
            row.ItemArray = line.Split("~")
            dt.Rows.Add(row)
            i = i + 1
            line = sr.ReadLine()
        Loop While Not line = String.Empty

        Dim bc As SqlBulkCopy = New SqlBulkCopy(dbConn, SqlBulkCopyOptions.TableLock, Nothing)
        bc.DestinationTableName = strDestTable
        bc.BatchSize = dt.Rows.Count

        bc.ColumnMappings.Add("BIN_LOC", "BIN_LOC")
        bc.ColumnMappings.Add("MATL_TYPE_IN", "MATL_TYPE_IN")
        bc.ColumnMappings.Add("MFG_PART_NO", "MFG_PART_NO")
        bc.ColumnMappings.Add("LOC_PRM_IN", "LOC_PRM_IN")
        bc.ColumnMappings.Add("LOC_SCND", "LOC_SCND")
        bc.ColumnMappings.Add("CUST_ID_IN", "CUST_ID_IN")
        bc.ColumnMappings.Add("CO_NO", "CO_NO")
        bc.ColumnMappings.Add("MFR_PART_NBR", "MFR_PART_NBR")

        dbConn.Open()
        bc.WriteToServer(dt)
        dbConn.Close()
        bc.Close()

End Sub
 

--------------------------------------------------------------------

Here is a sample of the beginning of the file:
 

SCM-OBJ-12~SCM-OBJ-12_IM_FTN_INPUT~2008-06-10~0509

BIN_LOC~MATL_TYPE_IN~MFG_PART_NO~LOC_PRM_IN~LOC_SCND~CUST_ID_IN~CO_NO~MFR_PART_NBR~FTN_NBR~MATL_TYPE_FTN~STORG_LOC_ID~FTN_NBR_LAST~RECD_DT~SVC_DT~MFG_CODE~OH_PRICED_QTY~DEFAULT_UPR~MFR_CD~LOC_PRM_FTN~PME_NBR~CUST_ID_FTN~MTC_EVT_NM~RTN_FOR_INSPEC_DT~SER_NBR~MFR_DT~FTN_STS_CD~SER_NBR_TYP_CD~ITEM_CLASS_CODE~ITT_CUST_ID~ITT_MATL_TYPE~ITT_LOC_PRM~ITT_STORG_LOC_ID

                  ~  ~3555111-4                 ~     ~          ~  ~PP1 ~3555111-7                 ~R8Z0L1~  ~177-22 11 ~      ~20070307~        ~99193~ 000000000.0000~ 00000000.01000~99193~ABC  ~               ~PP~               ~        ~    060122212409~        ~ANS~M~0702~  ~  ~     ~

Question by:mjasic
 
LVL 96

Expert Comment

by:Bob Learned
ID: 21785170
Here are some thoughts:

1) If the column names are the same between the source and the target, then you don't need the ColumnMappings.

2) The SqlBulkCopy should handle the connection itself, so you don't need to open and close the connection.

3) You are adding columns to a DataTable without any column names.
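
For example, something along these lines (a rough, untested sketch that reuses the connection string, delimiter and table name from your post):

        ' Name each DataColumn after the corresponding header field
        Dim headers As String() = line.Split("~"c)
        For Each s As String In headers
            dt.Columns.Add(New DataColumn(s.Trim()))
        Next

        ' Pass the connection string so SqlBulkCopy opens and closes the connection itself
        Using bc As New SqlBulkCopy("server=MyServer;uid=MyName;pwd=MyPass;database=MyDB", SqlBulkCopyOptions.TableLock)
            bc.DestinationTableName = strDestTable
            bc.WriteToServer(dt)
        End Using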
 
LVL 2

Author Comment

by:mjasic
ID: 21793411
Thanks for the ideas, but the problem is still there.
On Friday I made some changes to my code because I noticed that the column names weren't there, so that isn't an issue any more.

After executing the new code, I had another error:
"The given value of type String from the data source cannot be converted to type nchar of the specified target column."

Today I tried without opening the connection (as you suggested), but it didn't work; I had to open the connection.

You'll find the latest version of my code attached, so I'd appreciate any other ideas on how to fix the problems with BulkCopy.
Also, how could I skip or cut the first and the last line of data? Is there any option in BulkCopy to begin at the second line (the header) and stop one line before the end? In my tests I cut these lines manually (a sample of 500 lines), but I cannot do that on a complete file of more than 2 million lines and then save it; I think it would take hours just to save the file.

Thanks

Mike





        Const strDestTable As String = "[SCM-OBJ-12_input_im_ftn]"
        Const strFilename As String = "C:\Mydata.txt"

        Dim i As Long = 0
        Dim dbConn As SqlConnection = New SqlConnection("server=MyServer;uid=MyName;pwd=MyPass;database=MyDB")
        Dim sr As StreamReader = New StreamReader(strFilename)
        Dim line As String = sr.ReadLine()
        Dim strArray As String() = line.Split("~")

        Dim dt As DataTable = New DataTable()
        Dim row As DataRow
        Dim dc As DataColumn

        For Each s As String In strArray
            dc = New DataColumn
            dc.ColumnName = Trim(s)
            dt.Columns.Add(dc)
        Next

        Do
            row = dt.NewRow()
            row.ItemArray = line.Split("~")
            dt.Rows.Add(row)
            i = i + 1
            line = sr.ReadLine()
        Loop While Not line = String.Empty

        Dim bc As SqlBulkCopy = New SqlBulkCopy(dbConn, SqlBulkCopyOptions.TableLock, Nothing)
        bc.DestinationTableName = strDestTable
        bc.BatchSize = dt.Rows.Count

        dbConn.Open()
        Try
            bc.WriteToServer(dt)    ' <--- the error happens here
        Catch ex As Exception
            Debug.Print(ex.Message)
        End Try
        dbConn.Close()
        bc.Close()
 
LVL 96

Expert Comment

by:Bob Learned
ID: 21793460
1) I wasn't saying that you don't need the connection; I just said that you don't need to manage its state.

2) If you want to skip the first line, just call ReadLine once before you start the Do loop (see the sketch below).

3) Can you change the column type from nchar to nvarchar?
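
A rough, untested sketch of the read loop with that extra ReadLine (keeping your variable names and the "~" delimiter; adjust as needed):

        Dim sr As New StreamReader(strFilename)
        sr.ReadLine()                          ' discard the first special header line
        Dim line As String = sr.ReadLine()     ' this line holds the real column headers

        Dim dt As New DataTable()
        For Each s As String In line.Split("~"c)
            dt.Columns.Add(New DataColumn(s.Trim()))
        Next

        line = sr.ReadLine()                   ' move past the header line before adding data rows
        Do While line IsNot Nothing AndAlso line.Length > 0
            Dim row As DataRow = dt.NewRow()
            row.ItemArray = line.Split("~"c)
            dt.Rows.Add(row)
            line = sr.ReadLine()
        Loop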
 
LVL 2

Author Comment

by:mjasic
ID: 21793623
For 2), I did it before the first Split (before For..Next). Great!
What about the last line?

For 3) - Unfortunately, I am not the one who defined the SQL tables and (also unfortunately) all the fields in all the tables are "nchar". The tables were generated by a tool that a member of our team developed two years ago. Is there anything we can do about this?

Thanks


 
LVL 96

Expert Comment

by:Bob Learned
ID: 21793816
2) For the last line, I would just remove the last DataRow from the DataTable that you are building (see the snippet below).

3) You should be able to use something like SQL Server Management Studio, or Server Explorer, to change the column type from nchar to nvarchar.
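
For 2), something like this right after the read loop should do it (assuming the footer line was read into the DataTable as the last row):

        ' Drop the footer line that ended up as the last data row
        If dt.Rows.Count > 0 Then
            dt.Rows.RemoveAt(dt.Rows.Count - 1)
        End If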


 
LVL 2

Author Comment

by:mjasic
ID: 21793939
Unfortunately, I am not allowed to change the data types in the tables (and, moreover, there are several hundred tables in our staging system, so it would take too long). I must work with the tables (and their structures) I have. I tried casting and some other things, but it wasn't any better :(
Is there any other way to fix this problem?
 
LVL 2

Author Comment

by:mjasic
ID: 21794055
I just did a test. I recreated one table (under another name) and changed nchar to nvarchar. The error is:
The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.

So, it must be something else :(
 
LVL 96

Expert Comment

by:Bob Learned
ID: 21794086
Hmmm... I usually don't use nvarchar unless I have a very specific reason, since it is 2 bytes per character (globalization), but I would have thought that SqlBulkCopy could handle the conversion--I have been wrong before, and I will probably be wrong again in the future about something else.
 
LVL 2

Accepted Solution

by:
mjasic earned 0 total points
ID: 21835484
I think the problem was only in the header, but the displayed error message was completely misleading.
 
LVL 1

Expert Comment

by:pickedaname
ID: 22596608
So what was the solution?
