Solved

SQL Select

Posted on 2014-09-23
263 Views
Last Modified: 2014-09-23
We have a SQL stored procedure that selects and returns records from a SQL Server table containing 3 million+ records.

The selection logic is very simple: any record with a created date greater than a date variable that is passed into the procedure. The resulting record set can be as few as a couple dozen rows or number in the hundreds of thousands.

The logic run for each record is extensive, to the extent that it takes a couple of seconds for each record to be processed.

Could performance be improved if we were to do a SELECT * INTO a new table with just the subset of records that we want, and then run the logic against that limited number of records?
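
In other words, something along these lines (the table and column names below are just placeholders, not our actual schema; @CreatedDate stands in for the date variable passed to the procedure):

-- Copy only the rows newer than the cutoff into a work table,
-- then run the heavy per-record logic against that smaller table.
SELECT *
INTO dbo.WorkRecords
FROM dbo.SourceTable
WHERE CreatedDate > @CreatedDate;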

Thanks for your help
Question by:ordo
9 Comments
 
LVL 40

Expert Comment

by:Kyle Abrahams
ID: 40340047
"The logic run for each record is extensive"

Can you expound on that?  What are you doing in the logic run?  Does it make more sense to store the data differently?

Without seeing the relevant sections (or an approximation of them), it will be difficult to determine whether you're better off using a temporary table.
 
LVL 65

Expert Comment

by:Jim Horn
ID: 40340078
Mind readers we ain't.  Copy-paste the stored procedure into this question in a code block, and perhaps we can help.
 

Author Comment

by:ordo
ID: 40340083
Kyle,

The logic/operations performed against each record isn't going to change and is way too involved to bore you with. I guess my question is more related to memory usage and processor utilization.

Thanks
 

Author Comment

by:ordo
ID: 40340087
To elaborate, the stored procedure simply selects about 10 columns from each record that meets the desired created-date criteria.

Thanks again.
 
LVL 69

Assisted Solution

by:ScottPletcher
ScottPletcher earned 334 total points
ID: 40340113
With such limited info, the only things that are clear so far are:

1) cluster the table by [created date] (a sketch of that index is below)
2) do as much set-based processing as you can.  If you must use a cursor, make it as efficient as possible.
3) >> Could performance be improved if we were to do a SELECT * INTO a new table with just the subset of records that we want, and then run the logic against that limited number of records? <<  Not likely, but still possible.
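
For point 1, if the table isn't already clustered that way, the index would look something like this (the object names are only placeholders, since we can't see your schema):

-- Cluster on the filter column so the date-range query reads a contiguous range of pages.
CREATE CLUSTERED INDEX CL_SourceTable
    ON dbo.SourceTable ( CreatedDate );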
 

Author Comment

by:ordo
ID: 40340143
Scott,

Table is clustered by [create date] column.

Can you elaborate a little bit on your second suggestion (set-based processing)?

Thanks for your help.
 
LVL 69

Accepted Solution

by:ScottPletcher
ScottPletcher earned 334 total points
ID: 40340154
Rather than doing this:

DECLARE cursor_name CURSOR LOCAL FAST_FORWARD FOR
    SELECT <columns>
    FROM <table>
    WHERE <created date column> > @CreatedDate;
OPEN cursor_name;
FETCH NEXT FROM cursor_name INTO <variables>;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- per-row processing here
    FETCH NEXT FROM cursor_name INTO <variables>;
END;
CLOSE cursor_name;
DEALLOCATE cursor_name;


as much as possible use standard "SELECT ..." to process the data.
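
For example, a loop that computes a value for each qualifying row and writes it out can usually be collapsed into a single set-based statement along these lines (all object names and the expression are illustrative only):

-- One pass over the qualifying rows instead of one operation per row.
INSERT INTO dbo.Results ( Id, DerivedValue )
SELECT Id, Column8 + Column9   -- stand-in for the real per-row calculation
FROM dbo.SourceTable
WHERE CreatedDate > @CreatedDate;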


Again, w/o being able to see the code, that's just a very general guideline.
 
LVL 40

Assisted Solution

by:Kyle Abrahams
Kyle Abrahams earned 166 total points
ID: 40340230
Another example:

Instead of looping through each row and doing

column10 = column8 + column9

he's saying:

update table
set column10 = column8 + column9

The logic may not change, but the way you process the logic can be optimized.
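
Put concretely, for your table that single statement would be something like this (names are placeholders; the WHERE clause is the same created-date filter the procedure already uses):

-- Updates every qualifying row in one statement instead of row by row.
UPDATE dbo.SourceTable
SET column10 = column8 + column9
WHERE CreatedDate > @CreatedDate;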
 

Author Closing Comment

by:ordo
ID: 40340383
Thank you all.
