Solved

INDEXING with Cast / Convert function in SQL Server (Enhancing Query Performance)

Posted on 2013-12-12
Medium Priority
2,339 Views
Last Modified: 2014-02-10
I have several SQL Server tables that are dropped and reloaded from text files on a regular basis. The result is dozens of tables in which every column has the data type NVARCHAR(255). In the past this has worked fine, since I can cast the data to whatever type I need later without worrying about type-conversion issues in the daily job. This practice has now created a problem: I would like to create an index on a datetime value that is derived from multiple text fields. Is it possible to create an index on a value that is derived from the table? In my case I would like something similar to:

CREATE NONCLUSTERED INDEX NDX_prks_Position_Dta_01
ON [dbo].[Position_Dta]
([Position Number - Position Dta],
DATEADD(Minute, cast([Effective Date Sequence # - Position Dta] as int), [Effective Date - Position Dta]),
cast(tbl_Secondary.[Action Date - Position Dta] as date))

Obviously the above syntax does not work, but hopefully it explains what I am looking for. The query in the example is one of dozens of subqueries with the same issue, and the result is queries that take 4 hours to run. I suppose I could create a temp table with the correct data types and index the temp table, but I was really hoping for a solution that would not require me to update a lot of functioning code (it is super slow, but it works). Additionally, I am not the only person who accesses these tables, so I am trying to avoid modifying the source tables.

Attached is a typical example of where I am trying to create a more effective index. I am open to any ideas that will increase performance, and the less modification to the existing code the better! I appreciate the help, experts!
EE-IndexPositionQuery-01.png
Question by:HRISTeam
7 Comments
 
LVL 20

Expert Comment

by:TheAvenger
ID: 39714917
You can add computed columns to the tables and then create the index on those.

Here is a topic about computed columns: http://technet.microsoft.com/en-us/library/ms191250(v=sql.105).aspx
And here is one about indexes on them: http://technet.microsoft.com/en-us/library/ms189292.aspx
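A sketch of that approach, using the column names from the question. Note two assumptions: the date text must be stored in a deterministic format (ISO 8601 / style 126 is assumed here — CAST from nvarchar to datetime without an explicit style is nondeterministic and cannot be persisted or indexed), and since these tables are dropped and reloaded, the computed column would have to be re-added as part of each reload:

```sql
-- Hypothetical sketch; assumes [Effective Date - Position Dta] holds
-- ISO 8601 text (yyyy-mm-ddThh:mi:ss). CONVERT with an explicit,
-- culture-independent style is deterministic, which an indexed
-- PERSISTED computed column requires.
ALTER TABLE dbo.Position_Dta
ADD [Effective DateTime - Computed] AS
    DATEADD(MINUTE,
            CAST([Effective Date Sequence # - Position Dta] AS int),
            CONVERT(datetime, [Effective Date - Position Dta], 126))
    PERSISTED;

CREATE NONCLUSTERED INDEX NDX_Position_Dta_EffDT
    ON dbo.Position_Dta ([Position Number - Position Dta],
                         [Effective DateTime - Computed]);
```

Existing queries can then reference the computed column instead of repeating the DATEADD/CAST expression, and the optimizer can seek on the index.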
 
LVL 43

Expert Comment

by:pcelba
ID: 39715018
Creating an index across TWO tables, as you showed in the question, is probably not a good solution...

You should try creating an index on the [Position Number - Position Dta] column on both tables and see what happens. Then you may add indexes on [Effective Date Sequence # - Position Dta] and [Effective Date - Position Dta], which will optimize your subquery.

The "less than" comparison in the WHERE condition isn't easily optimizable; an index on a computed column could be a good start, BUT I am skeptical...
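For reference, the plain single-column indexes suggested above would look something like this (index names are hypothetical):

```sql
-- Plain indexes on the raw nvarchar columns, no computed expressions.
CREATE NONCLUSTERED INDEX IX_Position_Dta_PosNum
    ON dbo.Position_Dta ([Position Number - Position Dta]);

CREATE NONCLUSTERED INDEX IX_Position_Dta_EffDate
    ON dbo.Position_Dta ([Effective Date - Position Dta],
                         [Effective Date Sequence # - Position Dta]);
```

These help the join on position number, but because the queries wrap the date columns in DATEADD/CAST, the second index can only narrow scans, not support a direct seek on the derived datetime.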
 
LVL 1

Author Comment

by:HRISTeam
ID: 39715181
Sorry, the screenshot is not 100% accurate. All the data comes from the same table, so the aliases tbl_Primary and tbl_Secondary refer to the exact same table.

 
LVL 43

Expert Comment

by:pcelba
ID: 39715322
In that case I would rather create a temporary table to use in place of the first subselect. That subselect must slow the whole query down because it calculates a minimum over unoptimized data.

So create a table in which each [Position Number - Position Dta] value has one corresponding value calculated as MIN(DateAdd(MINUTE, ... WHERE ... etc.,
and join this table to your Position_dta table with a LEFT JOIN. This should optimize the subselect part.

If the result is still slow, you can then optimize the correlated subquery in the WHERE part.
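A minimal sketch of that idea, with hypothetical names and assuming the same ISO-formatted date text as the computed-column example above:

```sql
-- Pre-aggregate the minimum derived datetime per position into a
-- temp table with real data types, then index it for the join.
SELECT [Position Number - Position Dta] AS PositionNumber,
       MIN(DATEADD(MINUTE,
                   CAST([Effective Date Sequence # - Position Dta] AS int),
                   CONVERT(datetime, [Effective Date - Position Dta], 126)))
           AS MinEffectiveDateTime
INTO #MinEffective
FROM dbo.Position_Dta
GROUP BY [Position Number - Position Dta];

CREATE CLUSTERED INDEX IX_MinEffective ON #MinEffective (PositionNumber);

-- The slow first subselect is then replaced by:
--   LEFT JOIN #MinEffective me
--     ON me.PositionNumber = p.[Position Number - Position Dta]
```

The expensive MIN computation runs once per load instead of once per outer row.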
 
LVL 20

Expert Comment

by:TheAvenger
ID: 39715333
What about the computed columns I wrote about?
 
LVL 1

Author Comment

by:HRISTeam
ID: 39715549
The computed columns look promising, but it would seemingly require me to rewrite much of my code. I am going to try the concept on a small scale and test the performance gains.
 
LVL 66

Accepted Solution

by:
Jim Horn earned 1500 total points
ID: 39716774
>several SQL Server tables which are dropped and reloaded through text files on a regular basis. The result is dozens of tables which all have the data type NVARCHAR(255).

That's an excellent start, but a vastly better idea would be to treat all of these nvarchar(255) tables as 'staging' tables, whose only purpose in life is to ensure that all rows in your text file(s) make it into SQL Server.

Then do some validation to make sure dates are dates, numbers are numbers, etc., and INSERT rows from those staging tables into destination tables where the date columns have a date data type, numeric columns have numeric data types, and so on. If you have time, you can also build a way to gracefully handle rows where a value that should have been a date was not.

Then index/relate the destination tables, and use them, not the nvarchar(255) tables, as the source of data in everything that needs it.

The benefits of this approach include not having to CAST() all the time, which means your indexes will actually be used.
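A sketch of the staging-to-typed load under this approach. The destination table name and columns here are hypothetical; TRY_CONVERT (available in SQL Server 2012 and later) returns NULL instead of raising an error on bad values, which keeps the validation step simple:

```sql
-- Hypothetical: Position_Dta is the nvarchar(255) staging table,
-- Position_Dta_Typed an assumed destination with real data types.
INSERT INTO dbo.Position_Dta_Typed
        (PositionNumber, EffectiveDateTime, ActionDate)
SELECT  [Position Number - Position Dta],
        DATEADD(MINUTE,
                TRY_CONVERT(int, [Effective Date Sequence # - Position Dta]),
                TRY_CONVERT(datetime, [Effective Date - Position Dta])),
        TRY_CONVERT(date, [Action Date - Position Dta])
FROM    dbo.Position_Dta
WHERE   TRY_CONVERT(datetime, [Effective Date - Position Dta]) IS NOT NULL;
-- Rows failing conversion can be routed to an error table instead.
```

Because the destination columns are real int/datetime/date types, indexes on them are seekable with no CAST in the queries.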