• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 753

How to Track How Often a SQL Server Table is Queried?

Is there a way in SQL Server 2012 to track "how often" a Table is queried?

I'm in the process of analyzing long-running ETL processes (SSIS) to try to identify whether any of them can be optimized. As part of my initial analysis I needed to pull a list of long-running ETL packages, which I've done. Now I need to identify which of these long-running packages contain updates to tables that are "highly utilized" within our company.

And so, I want to know if there is a way to track how often a certain table is queried, as this will tell me which ETLs touch the tables that are most widely used in our organization.
Asked by MIKE
4 Solutions
 
kevwitCommented:
For your initial analysis you could use the SSIS Dashboard to drill into the executions. Depending on the number of packages you've identified this might be helpful.

You can change the amount of performance data you gather by checking your SSIS Catalog (usually SSISDB) settings and changing the server logging level to "Performance".

This script would also do the above:
EXEC [SSISDB].[catalog].[configure_catalog] @property_name=N'SERVER_LOGGING_LEVEL', @property_value=2
GO

Tracing anything from the SQL DB side likely isn't going to be valuable for you because you need to know which package is executing the queries. However, if you can look at package execution timing and compare to a SQL trace including query execution times (filtered by the execution account for the SQL job) that may also work depending on how many packages are running at the same time.

The last option is the most complex, but the most powerful. SSIS 2012 has a new feature called "data taps". These have to be programmed and run as part of the package execution (instead of simply executing a job that runs a package, a script has to be created that executes the package and initiates the data taps as the package runs). Here is one blog that gives an overview of this process:

http://www.rafael-salas.com/2012/01/ssis-2012-quick-peek-to-data-taps.html
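To give a feel for what that script looks like, here is a minimal sketch using the SSISDB catalog procedures. The folder, project, package, task path, and path ID string below are placeholders — substitute the values from your own catalog and data flow:

```sql
-- Hypothetical names: folder "ETL", project "LoadProject", package "Package.dtsx".
-- The path ID string must match the identification string of a path
-- inside your package's data flow.
DECLARE @execution_id BIGINT;

EXEC [SSISDB].[catalog].[create_execution]
     @folder_name  = N'ETL',
     @project_name = N'LoadProject',
     @package_name = N'Package.dtsx',
     @execution_id = @execution_id OUTPUT;

-- Tap the rows flowing down one path of the data flow into a CSV file.
EXEC [SSISDB].[catalog].[add_data_tap]
     @execution_id            = @execution_id,
     @task_package_path       = N'\Package\Data Flow Task',
     @dataflow_path_id_string = N'Paths[Source Output]',
     @data_filename           = N'tap_output.csv';

EXEC [SSISDB].[catalog].[start_execution] @execution_id;
```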
 
MIKESoftware Solutions ConsultantAuthor Commented:
Thanks for the expert input.

Related to your comment:
"...Tracing anything from the SQL DB side likely isn't going to be valuable for you because you need to know which package is executing the queries...."

I want to clarify my requirement. I actually need to find out which literal DB tables are being utilized within our database, "outside and separate" from any SSIS package. What we are trying to analyze is:

1. Which of the long-running ETLs affects the "most utilized DB tables", based on SQL queries processed outside of SSIS. Basically, I need to know which tables are most widely used by the reporting processes that are consumed by our customers.

The reason for this requirement is so that we can prioritize which long-running ETL/SSIS packages to begin analyzing for optimization and refinement. That way we will get the most bang for our buck, as we'll be optimizing the long-running ETL packages that affect the most widely used tables in our system.

Thanks
 
kevwitCommented:
Ok, I think I understand now. But just to be sure: you want to identify the most widely used tables before going further?

In an effort to better understand your goal I'll highlight two ways to identify queries using performance info.

You can download and install the SQL Server 2012 Performance Dashboard reports:
http://www.microsoft.com/en-us/download/details.aspx?id=29063

This would be the easiest option; there is a report named "Top Queries by CPU time" and another named "Top Queries by I/O". These may meet your requirements.

Another option is to use DMVs. Check out this blog post and look at all the related posts on the bottom:

http://sqlserverplanet.com/dmvs/find-queries-taking-most-cpu-processor
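In the spirit of that post, here is a minimal DMV sketch that lists the most CPU-hungry statements cached on the instance. Note the numbers are cumulative since each plan was cached, not per hour:

```sql
-- Top 20 statements by total CPU time since the plan was cached.
SELECT TOP (20)
       qs.execution_count,
       qs.total_worker_time / 1000 AS total_cpu_ms,
       qs.total_worker_time / qs.execution_count / 1000 AS avg_cpu_ms,
       -- Offsets are byte positions into the Unicode batch text, hence the /2.
       SUBSTRING(st.[text],
                 (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.[text])
                        ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM   sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.[sql_handle]) AS st
ORDER BY qs.total_worker_time DESC;
```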

These posts highlight the problem: how are you looking to identify the "most utilized DB tables"? By the sheer number of times the entity is accessed? If so, I'll have to tackle this much differently.

Does this help at all?


This blog post likely won't help you, but I'll include it in case you're interested:

http://www.chilledsql.com/welcome/tip_category_dmvqueries/tip_detail_dmvqueries_gettopnlongrunningqueriesonserver

 
Scott PletcherSenior DBACommented:
You can use the view:
sys.dm_db_index_usage_stats
to see table and index usage (for tables without a clustered index, i.e. "heaps", the index_id will be 0).

That is a cumulative view, from the time the SQL instance started.  Only tables/indexes that were actually used will be in this view; thus, any table/index not in the view has not been used since SQL started.  

Also, you can calculate usage for a selected period(s) of time by capturing the view results at those times and comparing/subtracting them from the previous view results.

For example, if you captured the view results every hour, you could determine read activity for each hour.
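That hourly diff could be sketched as follows, assuming a hypothetical table dbo.IndexUsageSnapshot holding the raw view contents plus a capture_time column, populated on the hour by a scheduled job:

```sql
-- Reads per hour = this snapshot's counters minus the previous snapshot's.
-- Assumes snapshots are taken exactly on the hour; adjust the join if yours
-- are keyed by a snapshot ID instead.
SELECT  cur.[database_id],
        cur.[object_id],
        cur.[index_id],
        cur.capture_time,
        (cur.user_seeks + cur.user_scans + cur.user_lookups)
      - (prev.user_seeks + prev.user_scans + prev.user_lookups) AS reads_this_hour
FROM    dbo.IndexUsageSnapshot AS cur
JOIN    dbo.IndexUsageSnapshot AS prev
          ON  prev.[database_id] = cur.[database_id]
          AND prev.[object_id]   = cur.[object_id]
          AND prev.[index_id]    = cur.[index_id]
          AND prev.capture_time  = DATEADD(HOUR, -1, cur.capture_time)
ORDER BY reads_this_hour DESC;
```

Bear in mind the counters reset when the instance restarts, so a "current minus previous" value can go negative across a restart and should be discarded for that interval.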
 
kevwitCommented:
Good point Scott.

Here's some sample code that should work in your environment and you can build on that:

USE /*DBNAME*/ -- substitute your database name here
SELECT  DB_NAME(ddius.[database_id]) AS [database name],
        OBJECT_NAME(ddius.[object_id]) AS [Table name],
        ddius.*
FROM    [sys].[dm_db_index_usage_stats] AS ddius
WHERE   OBJECT_NAME(ddius.[object_id]) IS NOT NULL
 
Scott PletcherSenior DBACommented:
Since that view actually contains rows from all databases on the instance, you'll need to include the database_id to get the object name. However, because that lookup adds a lot of overhead, I'd do it after you capture the view data into your own table.

That is, create your own table based on the view, insert the raw view data into it periodically, on whatever schedule you want.  Then, later, update your table containing the view data with the correct object names.
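A minimal sketch of that capture-then-resolve approach, using a hypothetical table name dbo.IndexUsageCapture:

```sql
-- One-time setup: create an empty capture table shaped like the view,
-- plus a timestamp and a slot for the resolved name.
SELECT  GETDATE() AS capture_time,
        CAST(NULL AS sysname) AS [table_name],
        ius.*
INTO    dbo.IndexUsageCapture
FROM    sys.dm_db_index_usage_stats AS ius
WHERE   1 = 0;

-- Periodic capture (schedule via a SQL Agent job):
INSERT INTO dbo.IndexUsageCapture
SELECT  GETDATE(), NULL, ius.*
FROM    sys.dm_db_index_usage_stats AS ius;

-- Later, resolve names using the captured database_id,
-- via OBJECT_NAME's optional second argument:
UPDATE  c
SET     [table_name] = OBJECT_NAME(c.[object_id], c.[database_id])
FROM    dbo.IndexUsageCapture AS c
WHERE   c.[table_name] IS NULL;
```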
