
How do I use SQL Profiler efficiently?

luyan asked · 377 Views · Last Modified: 2012-08-13
Sometimes the cube build takes several minutes, but sometimes it takes around 2 hours. I want to trace it and see what is going on, so I used SQL Profiler.
Question 1: Do you have any suggestions on using Profiler? In the trace properties, any suggestions on setting Events, Data Columns, and Filters? Which of them do you think are necessary to set?
Question 2: Do you know any good documents on using SQL Profiler?
Thanks!

Top Expert 2006

Commented:
>>Question 1.
It purely depends on what you want to look at. For example, if you want to find any statement that is taking a long time, select the SQL:StmtCompleted event, add the TextData, Duration, and DatabaseName data columns, and in the filter set Duration > 1000
(greater than 1 second).
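The same trace can also be defined in T-SQL instead of the GUI. A minimal sketch, assuming SQL Server 2000-era trace procedures (the output path and max file size are placeholders; the event and column IDs below are the documented ones for SQL:StmtCompleted, TextData, Duration, and DatabaseName):

```sql
-- Minimal server-side trace sketch: capture SQL:StmtCompleted over 1 second.
DECLARE @TraceID int, @maxsize bigint, @on bit, @dur bigint
SET @maxsize = 50       -- MB per trace file (placeholder)
SET @on = 1

-- 2 = TRACE_FILE_ROLLOVER: roll to a new file when @maxsize is reached.
EXEC sp_trace_create @TraceID OUTPUT, 2, N'C:\Traces\slow_stmts', @maxsize

-- Event 41 = SQL:StmtCompleted; columns 1 = TextData, 13 = Duration, 35 = DatabaseName.
EXEC sp_trace_setevent @TraceID, 41, 1,  @on
EXEC sp_trace_setevent @TraceID, 41, 13, @on
EXEC sp_trace_setevent @TraceID, 41, 35, @on

-- Keep only statements over 1 second. Note the unit: Duration is reported
-- in milliseconds on SQL Server 2000 but in microseconds on 2005 and later.
SET @dur = 1000
EXEC sp_trace_setfilter @TraceID, 13, 0, 4, @dur
-- 0 = AND, 4 = greater than or equal

EXEC sp_trace_setstatus @TraceID, 1   -- start the trace
```

A server-side trace defined this way is also lighter-weight than watching the same events live in the Profiler GUI.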
Top Expert 2006

Commented:
>>Question2.
http://www.microsoft.com/technet/prodtechnol/sql/70/tips/sqlprof.mspx

Author

Commented:
Can I run SQL Profiler in the background? Right now, if I log off and disconnect my computer, the running Profiler is stopped.
Thanks!
Top Expert 2006

Commented:
>>Can I run the SQL profiler in the background?
No. The best thing is to run it from one machine that stays logged in and keep it running there.

Author

Commented:
Can Profiler trace a job?
1) The whole Profiler definition will be saved into a xxx.trc file, right?
2) If I want to modify the definition of the Profiler, I need to open the trace file, then modify its properties and save them, right?
3) If I need to rerun the Profiler, I need to select "replay" and then "start", right?
Thanks!
Top Expert 2006

Commented:
Can profiler trace a job?
Yes.

1) All of the trace output will be saved in it.
2) There are templates; you modify a template to change the Profiler definition.
3) Select the template and start the trace.

Author

Commented:
Question 1:
I saved the configuration into one template file, xxx.tdf.
I opened it through File -> Open -> Trace Template -> xxx.tdf.
But how can I re-run it?

Question 2:
If I put the trace results into a table, TBL_trace, the TextData column is big; how can I see it clearly?

Question 3:
Which is the better way to see the trace results - the trace file (xxx.trc) or the trace table?

Thanks a lot!
Top Expert 2006

Commented:
1: It is a template now. The next time you select this template and run Profiler, it will select the columns and the filter automatically.


2: Use Query Analyzer:
select TextData from the trace table where ....

3: A trace file can be opened in Profiler; you can click on a record and it will display the information in Profiler only.
With a trace table, on the other hand, you can apply all SQL techniques to get to a particular record.
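As a sketch of the trace-table approach: a saved .trc file can be loaded into a table with fn_trace_gettable and then queried like any other table (the file path and table name below are placeholders):

```sql
-- Load a trace file into a table (the path is a placeholder).
SELECT * INTO TBL_trace
FROM ::fn_trace_gettable(N'C:\Traces\slow_stmts.trc', DEFAULT)

-- Find the longest-running statements; cast TextData so it is readable.
SELECT TOP 20
       CAST(TextData AS nvarchar(4000)) AS StatementText,
       Duration,
       DatabaseID,
       StartTime
FROM TBL_trace
WHERE EventClass = 41       -- 41 = SQL:StmtCompleted
ORDER BY Duration DESC
```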

Author

Commented:
I found that one query took a long time, and I would like to tune its indexes. The query takes a long time to return around 80 million rows of data. Through Profiler or other tuning tools, without executing the query, how can I find which indexes should be used for better performance?
Thanks!
Top Expert 2006

Commented:
>>Through profiler or other tuning tools, without executing the query, how can I find which indexes should be used for >>better performance?

Inside Profiler ---> Tools --> Index Tuning Wizard

Save the query in a script file and then check it against the database using the Index Tuning Wizard. If you know the table names, specify the table names.
Author

Commented:
Thanks! I did the index defragmentation because I found some tables had low scan density. But it only cut about 5 minutes off the cube build job (recently it took around 2 hours). Would a full re-index help? Or do you have any good ideas for tuning query performance?
Thanks!
Top Expert 2006

Commented:
>>Is there any help if I do re-index?
If an index is fragmented, then yes, it helps a lot. You can check whether an index is fragmented by running the

DBCC SHOWCONTIG command.
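A minimal sketch of the check and the defragmentation options, assuming SQL Server 2000-era DBCC commands (the table and index names are placeholders):

```sql
-- Check fragmentation for one table and all of its indexes.
DBCC SHOWCONTIG ('dbo.FactSales') WITH TABLERESULTS, ALL_INDEXES
-- Look at Scan Density (should be close to 100%) and Logical Fragmentation.

-- Lighter, online option: defragment one index in place.
DBCC INDEXDEFRAG (0, 'dbo.FactSales', 'IX_FactSales_Date')
-- 0 = the current database

-- Heavier option: rebuild all indexes on the table (takes locks).
DBCC DBREINDEX ('dbo.FactSales')
```

INDEXDEFRAG compacts the existing index without blocking readers, while DBREINDEX rebuilds it from scratch and generally produces a better result at the cost of locking.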


>>you have any good ideas for tuning query performance?

Try to avoid using:

DISTINCT
IN
NOT IN
SELECT *

unless it's really necessary.
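To illustrate one of those rewrites (the table and column names here are hypothetical): a NOT IN subquery can often be replaced with NOT EXISTS, which the optimizer usually handles at least as well and which is also safe when the subquery can return NULLs.

```sql
-- NOT IN: compares against every value and behaves badly if the
-- subquery returns a NULL (the whole predicate yields no rows).
SELECT o.OrderID
FROM Orders o
WHERE o.CustomerID NOT IN (SELECT c.CustomerID FROM BlockedCustomers c)

-- Equivalent NOT EXISTS: NULL-safe and typically at least as fast.
SELECT o.OrderID
FROM Orders o
WHERE NOT EXISTS (SELECT 1
                  FROM BlockedCustomers c
                  WHERE c.CustomerID = o.CustomerID)
```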

Author

Commented:
Thanks for your answers!

I ran a DBCC SHOWCONTIG check and saw a little improvement after the index defragmentation.
I found that it takes a long time to read data from one view, and this view gets its data from around 20 tables. Would it help to create a temporary table holding part of the data from some of the tables, and then let the view get data from the temporary table and the other tables? Or do you have any good ideas for improving the performance of reading from a big view?

Thanks!

Author

Commented:
How about creating indexes on the view? Would that be helpful?
Thanks!
Top Expert 2006

Commented:
>>How about create indexes for the view? Will it be helpful?

Yes, it will boost performance, but there are limitations on indexed views;
check "Requirements for the View" in Books Online.
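A minimal sketch of what an indexed view looks like (the view, table, and column names are placeholders; Books Online also lists required SET options such as ANSI_NULLS and QUOTED_IDENTIFIER being ON when the view and its base tables are created):

```sql
-- The view must be created WITH SCHEMABINDING and use two-part names.
CREATE VIEW dbo.v_SalesSummary
WITH SCHEMABINDING
AS
SELECT ProductID,
       SUM(Amount)  AS TotalAmount,
       COUNT_BIG(*) AS RowCnt   -- COUNT_BIG(*) is required with GROUP BY
FROM dbo.FactSales
GROUP BY ProductID
GO

-- Creating a unique clustered index materializes the view on disk.
CREATE UNIQUE CLUSTERED INDEX IX_vSalesSummary
ON dbo.v_SalesSummary (ProductID)
GO
```

Once the clustered index exists, the aggregated result is stored and maintained automatically, so reads against the view no longer have to touch the base tables.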

Author

Commented:
It seems this has no relation to the indexes, because sometimes the query runs fast (around 15 minutes) and sometimes it runs slow (around 2 hours). Why? What could be causing this?
Thanks!

Author

Commented:
Would it help performance to add "with nolock" to the query?
Thanks!
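For reference, the hint being asked about is written like this (table and column names are hypothetical); it avoids waiting on other sessions' locks, but at the cost of possibly reading uncommitted data:

```sql
-- WITH (NOLOCK) reads without taking shared locks, so the query is not
-- blocked by concurrent writers, but it can return dirty (uncommitted) rows.
SELECT s.ProductID, s.Amount
FROM dbo.FactSales s WITH (NOLOCK)
WHERE s.SaleDate >= '2006-01-01'
```

It only helps when the slowness is caused by blocking, so it is worth confirming blocking first (for example with sp_who2) before relying on the hint.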