
ADO.NET code to update a temporary table

Hi,

I have some simple code below, and I would like some ADO.NET code to update a temporary table, ideally using DataSets / DataTables.



Thanks,

Ward
SqlConnection myConn = new SqlConnection("Data Source=localhost\\SQLEXPRESS;Initial Catalog=CarDB;Integrated Security=True");
string str = "create table #temp1 (col1 int, col2 int);";

// Code to do a dataset to insert some records.

SqlCommand myCommand = new SqlCommand(str, myConn);

myConn.Open();
myCommand.ExecuteNonQuery();
myCommand.CommandText = "drop table #temp1;";
myCommand.ExecuteNonQuery();
myConn.Close();
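For reference, a minimal sketch of the kind of DataTable-based insert being asked about (untested; it reuses the CarDB connection string above and assumes the temp table is created and used on the same open connection, since a local temp table only lives for that session):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class TempTableDemo
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
            "Data Source=localhost\\SQLEXPRESS;Initial Catalog=CarDB;Integrated Security=True"))
        {
            conn.Open();

            // Create the temp table on this connection
            using (SqlCommand create = new SqlCommand(
                "create table #temp1 (col1 int, col2 int);", conn))
            {
                create.ExecuteNonQuery();
            }

            // Adapter whose InsertCommand writes DataTable rows into #temp1
            SqlDataAdapter adapter = new SqlDataAdapter(
                "select col1, col2 from #temp1", conn);
            adapter.InsertCommand = new SqlCommand(
                "insert into #temp1 (col1, col2) values (@col1, @col2)", conn);
            adapter.InsertCommand.Parameters.Add("@col1", SqlDbType.Int, 4, "col1");
            adapter.InsertCommand.Parameters.Add("@col2", SqlDbType.Int, 4, "col2");

            DataTable table = new DataTable("temp1");
            adapter.FillSchema(table, SchemaType.Source);

            // Add rows in memory, then push them to the server
            table.Rows.Add(1, 10);
            table.Rows.Add(2, 20);
            adapter.Update(table);

            using (SqlCommand drop = new SqlCommand("drop table #temp1;", conn))
            {
                drop.ExecuteNonQuery();
            }
        }
    }
}
```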


whorsfall asked:
1 Solution
 
redpipeCommented:
Can you say something about why you are using temporary tables and what you are trying to achieve through your code? Such information will greatly help in giving you a to-the-point answer :)

You do not need to CREATE local temporary tables. Instead you can just SELECT...INTO them. See the following article for basic tips on using SQL Server's local temp tables:
http://weblogs.sqlteam.com/mladenp/archive/2006/11/03/17197.aspx

Many of the operations done with local temporary tables, which exist only for a given session, can be done more efficiently by using a derived table directly in the FROM clause of your SELECT statement, a strategy many argue gives better performance. An example is given in this article: http://www.sql-server-performance.com/articles/per/derived_temp_tables_p1.aspx
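To illustrate the difference (the Sales table and its columns are invented purely for this example), here is the same query written both ways, first materializing a temp table and then using a derived table inline:

```csharp
// Approach 1: SELECT...INTO a temp table, join against it, then drop it
string viaTempTable =
    "select CarId, avg(Price) as AvgPrice into #prices from Sales group by CarId; " +
    "select s.* from Sales s join #prices p on s.CarId = p.CarId; " +
    "drop table #prices;";

// Approach 2: the same aggregate as a derived table in the FROM clause,
// with no temp table to create or clean up
string viaDerivedTable =
    "select s.* from Sales s " +
    "join (select CarId, avg(Price) as AvgPrice from Sales group by CarId) p " +
    "on s.CarId = p.CarId;";
```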

Hope this helps you on your quest..?
 
whorsfallAuthor Commented:
Hi,

Thanks for your response. OK, I am building a data acquisition application which needs to sample data at a high rate and record it for later analysis. We estimate we need to store 400 results, 100 times per second.

So I am trying to find the fastest method to achieve this throughput. This is why I was thinking about temporary tables.

I would even be happy with something that keeps it all in memory and only takes a short amount of time to save to disk.

Ward.
 
redpipeCommented:
Sorry for the delay, but I've been offline for a couple of days. When it comes to a reply to your question, I first have to admit that I have not worked directly on a solution with such high demands for data caching and object persistence.

If the acquisition part is entirely disconnected from the analysis part, you could do ordinary INSERT statements into a permanent table where one of the columns contains e.g. a binary representation of your 400-result sample. Databases such as Berkeley DB and MySQL are both perceived as swift, high-capacity 'tools' for object persistence.
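As a rough sketch of that binary-column idea (the ResultLog table and its columns are invented here, and the code is untested): pack each 400-value sample into a byte array and insert it as a single varbinary row, so each acquisition tick costs one INSERT:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class BlobInsertDemo
{
    // Assumes a table like: create table ResultLog
    //   (TakenAt datetime, Readings varbinary(max))
    static void InsertSample(SqlConnection conn, DateTime takenAt, double[] readings)
    {
        // Pack the 400 doubles into one contiguous byte array
        byte[] blob = new byte[readings.Length * sizeof(double)];
        Buffer.BlockCopy(readings, 0, blob, 0, blob.Length);

        using (SqlCommand cmd = new SqlCommand(
            "insert into ResultLog (TakenAt, Readings) values (@takenAt, @readings)",
            conn))
        {
            cmd.Parameters.Add("@takenAt", SqlDbType.DateTime).Value = takenAt;
            cmd.Parameters.Add("@readings", SqlDbType.VarBinary, blob.Length).Value = blob;
            cmd.ExecuteNonQuery();
        }
    }
}
```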

You could use queuing, but that is just an externalization of native database caching.

Another solution could be to use a specialized database targeted at such caching, e.g. DB4o, which claims to cache up to 200,000 objects per second (http://www.db4o.com/about/solutions/networks/default.aspx).
 
whorsfallAuthor Commented:
Hi,

Thanks for the great response. Have you tried DB4o? It sounds impressive.

thanks,

Ward
 
redpipeCommented:
No, I have not tried it myself, but I've heard great things from a friend of a friend.