Craig Beamson (United Kingdom of Great Britain and Northern Ireland)

asked on

How should I load a VB array of a class (within VB 2010 Express) into a SQL Server database table?

I am using Visual Basic 2010 Express.
I have an API which returns, amongst other things, an array of a class.
I want to load the information in this array into a SQL Server 2008 Express database.

The items in the class are simple datatypes (dates, integers, strings etc) and can be mapped to SQL datatypes.

So, my current solution works, but is a bit ugly.

If the API returns AllPeople, my code goes something like:

With AllPeople
  For i = 0 To .PeopleData.Length - 1
    With .PeopleData(i)
      ConSql.Open()
      MySqlCommand = New SqlCommand("SpInsertPeople", ConSql)
      MySqlCommand.CommandType = CommandType.StoredProcedure
      MySqlCommand.Parameters.AddWithValue("Forename", .forename)
      MySqlCommand.Parameters.AddWithValue("Surname", .surname)
      ' [etc]
      MySqlCommand.ExecuteNonQuery()
      ConSql.Close()
    End With
  Next
End With



So, I'm looping through the index of the array and, for each item, adding the class values as parameters to a stored procedure and executing it. (The stored procedure adds new records and updates existing ones.)

Currently, the size of the array is fairly small - say tens or hundreds of records - but it will grow to several thousand. I don't want to be hammering my SQL Server with tens of thousands of separate executions of stored procedures. This process will have to repeat every 30 seconds or so.
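Reusing one connection and one command object tidies the loop up a little (a rough sketch of the same idea, with guessed parameter sizes), but it is still one stored procedure execution per record:

' Sketch only: same approach, but the connection and command are created once
' and only the parameter values change inside the loop.
Using ConSql As New SqlConnection(MyConnectionString)   ' MyConnectionString: placeholder
    ConSql.Open()
    Using MySqlCommand As New SqlCommand("SpInsertPeople", ConSql)
        MySqlCommand.CommandType = CommandType.StoredProcedure
        MySqlCommand.Parameters.Add("@Forename", SqlDbType.NVarChar, 50)
        MySqlCommand.Parameters.Add("@Surname", SqlDbType.NVarChar, 50)
        ' [etc]
        For i As Integer = 0 To AllPeople.PeopleData.Length - 1
            With AllPeople.PeopleData(i)
                MySqlCommand.Parameters("@Forename").Value = .forename
                MySqlCommand.Parameters("@Surname").Value = .surname
                ' [etc]
            End With
            MySqlCommand.ExecuteNonQuery()   ' still one round trip per record
        Next
    End Using
End Using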

So, is there a cleaner way of getting data from an array of class into a SQL Server table?
Or is For... Execute stored procedure...  Next...    as good as it gets?
ASKER CERTIFIED SOLUTION
Haver Ramirez
Craig Beamson

ASKER

I've done various comma- or tab-delimited text file BULK INSERTs successfully before. To use that here, I'd again need to step through the array to generate the text file and get it somewhere the SQL Server can read it (rough sketch below). It could work, but it feels like an even more cumbersome route to the same end!
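For the record, that route would look roughly like this. The file path, the staging table name and the PeopleResult type are only placeholders for whatever the API actually returns, and the file has to sit somewhere the SQL Server service account can read it:

Imports System.IO
Imports System.Data.SqlClient

Module BulkInsertSketch
    ' Rough sketch of the text-file route: dump the array to a delimited
    ' file, then ask SQL Server to BULK INSERT it into a staging table.
    Sub BulkInsertViaTextFile(AllPeople As PeopleResult, connectionString As String)
        Dim filePath As String = "\\MyServer\Share\people.csv"

        ' 1. Dump the array to a delimited file.
        Using writer As New StreamWriter(filePath)
            For i As Integer = 0 To AllPeople.PeopleData.Length - 1
                With AllPeople.PeopleData(i)
                    ' Naive CSV: breaks if a value contains a comma or newline.
                    writer.WriteLine(.forename & "," & .surname)   ' [etc]
                End With
            Next
        End Using

        ' 2. Load it in one statement.
        Dim sql As String =
            "BULK INSERT dbo.PeopleStaging FROM '\\MyServer\Share\people.csv' " &
            "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')"
        Using conn As New SqlConnection(connectionString)
            Using cmd As New SqlCommand(sql, conn)
                conn.Open()
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module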

I was hoping there would be something in the area of data adapters and datasets that could somehow connect to both the source array and the target SQL table, copy between them locally with an insert query, then update back to the SQL Server via the data adapter (something like the sketch at the end of this comment).

If stepping through arrays and copying data line by line is the only way to do this kind of thing, I'm happy to accept this.  I'm just dimly aware that in a lot of my work, I'm doing a lot of stepping through arrays or datasets, in order to copy data into SQL Server, and it feels like I'm doing it the more processing-intensive way.
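To be concrete, the first half of what I'm picturing is just copying the array into a local DataTable, something like this rough sketch (the PeopleResult type and the column names are only placeholders for whatever the API actually returns):

Imports System.Data

Module DataTableSketch
    ' Copy the class array into a local DataTable. The column names and
    ' types just need to match whatever the upload step expects.
    Function PeopleToDataTable(AllPeople As PeopleResult) As DataTable
        Dim table As New DataTable("People")
        table.Columns.Add("Forename", GetType(String))
        table.Columns.Add("Surname", GetType(String))
        ' [etc] - one column per property in the class
        For i As Integer = 0 To AllPeople.PeopleData.Length - 1
            With AllPeople.PeopleData(i)
                table.Rows.Add(.forename, .surname)   ' [etc]
            End With
        Next
        Return table
    End Function
End Module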
Haver Ramirez

With datasets and an adapter, when the data comes straight from the database via a data source, the inserts are still done one by one. You can also map the stored procedure to the adapter, but I think that is still one by one.
The one-by-one insert is going to happen anyway. It could become a bit easier on the code side if your code returned a DataTable instead of a class array.
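If you do go the DataTable route, the wiring might look roughly like this (parameter names and sizes are guesses); I believe SqlDataAdapter.UpdateBatchSize can at least group several of those calls into one round trip, even though the procedure still runs once per row:

Imports System.Data
Imports System.Data.SqlClient

Module AdapterSketch
    ' Adapter wired to the existing stored procedure. Parameter names and
    ' sizes are guesses; UpdateBatchSize groups the per-row calls into
    ' fewer round trips, but the procedure still executes once per row.
    Sub UploadPeople(peopleTable As DataTable, connectionString As String)
        Using conn As New SqlConnection(connectionString)
            Dim cmd As New SqlCommand("SpInsertPeople", conn)
            cmd.CommandType = CommandType.StoredProcedure
            cmd.Parameters.Add("@Forename", SqlDbType.NVarChar, 50, "Forename")
            cmd.Parameters.Add("@Surname", SqlDbType.NVarChar, 50, "Surname")
            ' [etc]
            cmd.UpdatedRowSource = UpdateRowSource.None   ' required when batching

            Dim adapter As New SqlDataAdapter()
            adapter.InsertCommand = cmd
            adapter.UpdateBatchSize = 100   ' send up to 100 rows per round trip

            ' Rows added via DataTable.Rows.Add are in the Added state, so
            ' Update() pushes every row through the InsertCommand; the adapter
            ' opens and closes the connection itself if it is closed.
            adapter.Update(peopleTable)
        End Using
    End Sub
End Module

The calling code would then just build the DataTable from the array and pass it in on each 30-second cycle.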
It doesn't feel like a very efficient way of doing things, but in the absence of anything better, stepping through the dataset and executing a stored procedure for each record seems to be the only way.