Solved

How to run performance tests for .NET Db Libraries

Posted on 2010-09-02
463 Views
Last Modified: 2013-12-13
I want to write some tests to compare the performance of accessing a SQL Server database using different .NET data access libraries - in particular OleDb and SqlClient.

Can anyone point me to a list of dos and don'ts for running performance tests? I'm thinking that perhaps there is some way to turn off caching for example to ensure it is the actual Db access that is being measured.

Thanks for any help you can give.
Question by:purplesoup
7 Comments
 
LVL 9

Assisted Solution

by:richard_hughes
richard_hughes earned 50 total points
Hello purplesoup

This may help:

http://msdn.microsoft.com/en-us/library/system.diagnostics.stopwatch.aspx

It is a really handy part of the System.Diagnostics namespace (available since .NET 2.0).

Thanks,

Richard Hughes
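A minimal sketch of how Stopwatch might be used for this kind of measurement (the `AverageSeconds` helper and the Sleep workload are just placeholders for the real Db call - run the workload once untimed first, then average several timed runs):

```csharp
using System;
using System.Diagnostics;

class TimingSketch
{
    // Averages the elapsed time of an action over several runs.
    // The untimed first call absorbs one-off costs (JIT compilation,
    // connection-pool warm-up) that would otherwise skew the first sample.
    public static double AverageSeconds(Action action, int runs)
    {
        action(); // warm-up, not timed

        double total = 0;
        for (int i = 0; i < runs; i++)
        {
            Stopwatch w = Stopwatch.StartNew();
            action();
            w.Stop();
            total += w.Elapsed.TotalSeconds;
        }
        return total / runs;
    }

    static void Main()
    {
        // Placeholder workload; substitute the real Db call being measured.
        double avg = AverageSeconds(() => System.Threading.Thread.Sleep(15), 5);
        Console.WriteLine("average: {0:F3}s", avg);
    }
}
```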
 

Author Comment

by:purplesoup
I've run two groups of tests, comparing the SQL Client library with the OleDb library.

The first group of tests uses a Command object which has an INSERT statement assigned, parameters set, and 500 rows inserted.

First I used a table with one ID UNIQUEIDENTIFIER PRIMARY KEY and 989 DATETIME NULL columns.

The average SQL Client time to insert 500 rows was 13.62 seconds; the average OleDb time was 2.86 seconds.

Next I used a table with one ID UNIQUEIDENTIFIER PRIMARY KEY and 350 NVARCHAR(10) NOT NULL columns.

The average SQL Client time to insert 500 rows was 5.11 seconds; the average OleDb time was 2.19 seconds.

I've included the StringInsert code to give an example of what was run.

Clearly, OleDb is actually faster here than the native SQL Client library. Can anyone explain why this might be?

The second group of tests uses a DataAdapter which is populated with data from a SQL table containing around 10,000 rows.

At first just the 10,000 rows were returned, with results as follows:

OLE     SQL
4.39    2.53
4.21    2.71
4.26    2.55
4.40    2.69
4.32    2.62

This makes the SQL Client on average 1.65 times faster than OleDb.

I then modified the test so that as well as returning 10,000 rows, 5000 rows are updated, 500 rows inserted and the data adapter Update command is then run.

Test results are as follows (in seconds):

OLE     SQL
90      84
87      82
96      95
83      89
-------------
89      87.5   (averages)

As you can see, the SQL Client is marginally faster, but not by much.

I had thought things would be much faster with SQL - for example this test

http://social.msdn.microsoft.com/Forums/en-US/vbgeneral/thread/fadb2742-d75a-49fb-a839-b2d4a9183998

claims SQL to be 2.5 times faster than OLEDB.

Can anyone give some comments on my tests to see whether I am missing something?

private void StringInsert(ClientClassesEnum clientType, int recs)
{
    const int C_COLUMNS = 350;

    if (recs <= 0) recs = 500;

    IDbConnection c;
    if (clientType == ClientClassesEnum.SqlClient)
    {
        c = new SqlConnection();
    }
    else
    {
        c = new OleDbConnection();
    }

    string conn = Helper.GetConnectionString("DataSource.iMIS.Connection");

    if (clientType == ClientClassesEnum.OleDbClient)
    {
        c.ConnectionString = "Provider=SQLOLEDB;" + conn;
    }
    else
    {
        c.ConnectionString = conn;
    }

    c.Open();

    IDbCommand cmd;
    if (clientType == ClientClassesEnum.SqlClient)
    {
        cmd = new SqlCommand();
    }
    else
    {
        cmd = new OleDbCommand();
    }

    cmd.Connection = c;

    StringBuilder cmdTxt = new StringBuilder();
    cmdTxt.Append("INSERT INTO PerformanceTable2 (");
    cmdTxt.Append("ID,");

    // Generate the SQL INSERT string columns
    for (int i = 1; i <= C_COLUMNS; i++)
    {
        string s1 = "String%1".Replace("%1", i.ToString());
        cmdTxt.Append(s1);
        if (i < C_COLUMNS) cmdTxt.Append(",");
    }
    cmdTxt.Append(") VALUES (");

    // SqlClient uses named parameters; OleDb uses positional "?" markers
    if (clientType == ClientClassesEnum.SqlClient)
    {
        cmdTxt.Append("@ID,");
    }
    else
    {
        cmdTxt.Append("?,");
    }

    // Generate the SQL INSERT string values
    for (int i = 1; i <= C_COLUMNS; i++)
    {
        string s1;
        if (clientType == ClientClassesEnum.SqlClient)
        {
            s1 = "@P%1".Replace("%1", i.ToString());
        }
        else
        {
            s1 = "?";
        }
        cmdTxt.Append(s1);
        if (i < C_COLUMNS) cmdTxt.Append(",");
    }
    cmdTxt.Append(")");

    cmd.CommandText = cmdTxt.ToString();

    // pass in the ID value
    IDataParameter p;
    if (clientType == ClientClassesEnum.SqlClient)
    {
        p = new SqlParameter("ID", SqlDbType.UniqueIdentifier);
    }
    else
    {
        p = new OleDbParameter("ID", OleDbType.Guid);
    }
    p.Value = Guid.NewGuid();
    cmd.Parameters.Add(p);

    for (int i = 1; i <= C_COLUMNS; i++)
    {
        if (clientType == ClientClassesEnum.SqlClient)
        {
            string s1 = "@P%1".Replace("%1", i.ToString());
            // p = new SqlParameter(s1, SqlDbType.DateTime);
            p = new SqlParameter(s1, SqlDbType.NVarChar);
        }
        else
        {
            string s1 = "P%1".Replace("%1", i.ToString());
            // p = new OleDbParameter(s1, OleDbType.Date);
            p = new OleDbParameter(s1, OleDbType.VarWChar);
        }
        p.Value = DateTime.Now.ToShortDateString();
        cmd.Parameters.Add(p);
    }

    // Time only the ExecuteNonQuery calls; connection open and
    // parameter setup are excluded from the measurement
    Stopwatch w = new Stopwatch();
    w.Start();

    int r = cmd.ExecuteNonQuery();

    // First insert above plus 499 iterations here = 500 rows in total
    for (int j = 1; j < 500; j++)
    {
        if (clientType == ClientClassesEnum.SqlClient)
        {
            (cmd.Parameters[0] as SqlParameter).Value = Guid.NewGuid();
        }
        else
        {
            (cmd.Parameters[0] as OleDbParameter).Value = Guid.NewGuid();
        }

        for (int i = 1; i <= C_COLUMNS; i++)
        {
            if (clientType == ClientClassesEnum.SqlClient)
            {
                (cmd.Parameters[i] as SqlParameter).Value = DateTime.Now.AddMinutes(i).ToShortDateString();
            }
            else
            {
                (cmd.Parameters[i] as OleDbParameter).Value = DateTime.Now.AddMinutes(i).ToShortDateString();
            }
        }

        int r2 = cmd.ExecuteNonQuery();
    }

    w.Stop();

    TimeSpan ts = w.Elapsed;
    txtResults.Text += String.Format("{0:00}:{1:00}:{2:00}.{3:00}\n",
                        ts.Hours, ts.Minutes, ts.Seconds,
                        ts.Milliseconds / 10);
}


 
LVL 21

Accepted Solution

by:MogalManic
MogalManic earned 250 total points
Your numbers kind of make sense.

With SqlClient, almost all of the code is managed code that sends the Db transaction over the network to execute on the SQL Server instance.
With OleDb, the transaction flows through managed code, then through the OLE DB provider, then over the network to execute on the SQL Server instance (one extra hop).

With small Db transactions transferring little data (i.e. update/insert of one row), the extra hop is negligible and performance MAY be better because the data is transferred to SQL Server via compiled C code (or maybe even assembly).

With large transactions (i.e. a select of multiple rows), data has to be transferred between managed code and unmanaged code (compiled C code) on its way to and from SQL Server. This extra transfer explains why the select is slower.

A better insert/update test would be to update/insert a varchar(2000) field that has the full 2000 characters filled.
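For instance, instead of the short date string used in the StringInsert code above, each parameter could be filled with a full fixed-length value (a sketch; `BuildPayload` is a hypothetical helper, and the commented parameter assignment refers to the asker's code):

```csharp
using System;

class PayloadSketch
{
    // Builds a fully-populated fixed-length test value, so that row size
    // (rather than per-call overhead) dominates the measurement.
    public static string BuildPayload(int length)
    {
        return new string('x', length);
    }

    static void Main()
    {
        string payload = BuildPayload(2000);
        // In the StringInsert code this would replace the date string, e.g.:
        // (cmd.Parameters[i] as SqlParameter).Value = payload;
        Console.WriteLine(payload.Length); // 2000
    }
}
```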
 
LVL 6

Assisted Solution

by:r3nder
r3nder earned 100 total points
 
LVL 1

Assisted Solution

by:shar5
shar5 earned 100 total points
Once the application is completely built and tested, I will conduct something called a "load test" or "stress test". There are several ways (or tools) in which one can stress test a database application. I use Mercury LoadRunner, though it is a very expensive tool to work with. Cheaper alternatives include tools like "Database Hammer" from the SQL Server 2000 Resource Kit. The idea behind stress testing is to be able to simulate real production loads and user connections and see how the application and database behave under extreme stress. This enables us to fix potential performance issues, and blocking and deadlocking issues, before we put the application in production. So it is very important to have a separate load testing environment which mirrors your production environment in terms of hardware and software resources (processing power, available memory, disk layouts and RAID arrays, versions and editions of software), network bandwidth, etc. Without this, your load test may not be a valid one.
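As a very rough illustration of the idea (a sketch only, nothing like a full LoadRunner scenario): several concurrent "users" each running the same workload repeatedly, where `RunQuery`-style work is a placeholder for the real data-access call under test:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class LoadTestSketch
{
    // Runs a workload from several concurrent "users" and counts completions.
    public static int Run(Action workload, int users, int iterationsPerUser)
    {
        int completed = 0;
        Parallel.For(0, users, u =>
        {
            for (int i = 0; i < iterationsPerUser; i++)
            {
                workload();
                Interlocked.Increment(ref completed);
            }
        });
        return completed;
    }

    static void Main()
    {
        // Placeholder workload; a real test would open a pooled connection
        // and execute the query under measurement here.
        int done = Run(() => Thread.Sleep(1), 20, 50);
        Console.WriteLine("completed: " + done); // completed: 1000
    }
}
```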
