CaptainGiblets (United Kingdom of Great Britain and Northern Ireland)

asked on 

SQL Server performance

Hi, I have recently purchased a copy of SolarWinds and set it up to monitor my SQL Server.

Straight away it has raised loads of critical alerts, mainly on the following counters:

Workfiles Created/sec        - average 35 out of hours, 250+ in hours, sometimes as high as 600
Worktables Created/sec       - average 25 out of hours, 90 in hours
Page Splits/Batch Request    - 0.35 out of hours, 1.45 in hours
Page Splits/sec              - 20 out of hours, 143 in hours

Nobody has reported any issues regarding performance.

Is it safe to simply raise the thresholds at which these critical alerts fire?

Originally the server had 96 GB of RAM and was using 100% of it. I upped the RAM to 256 GB, but weirdly it is only using around 60 GB now.

I have tracked down a lot of cross-database/cross-server queries and changed the way they run to reduce their CPU time. However, it has made little difference to these counters.

If anyone has any advice on these counters, safe limits, etc., I would really appreciate it.
Microsoft SQL Server · SQL · SolarWinds

ste5an (Germany)

Workfiles/worktables mean that temporary results are created during some queries, depending on the join operations in the query plan and on data reuse. This is not a problem per se, because some queries simply need them.

Same for page splits: depending mainly on the table structure and workload, this can be pretty normal.

So: as long as you don't have performance issues, there is no need to do anything.

For the RAM: usage depends on the data pages read since the last service restart. This can simply mean that you only need 60 GB of RAM for your typical day-to-day workload.
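To check that yourself, a sketch of a buffer pool breakdown per database (each page is 8 KB; needs VIEW SERVER STATE permission):

```sql
-- Buffer pool usage per database, in MB (8 KB data pages)
SELECT CASE database_id
         WHEN 32767 THEN 'ResourceDb'
         ELSE DB_NAME(database_id)
       END AS database_name,
       COUNT(*) * 8 / 1024 AS cached_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY cached_mb DESC;
```

If the totals stay well under 256 GB over a full business cycle, the working set really is that small.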

The primary indicators for proactive optimization are, imho, high physical and logical reads and writes for your queries.
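To see which cached queries drive those reads, a sketch against the plan cache (numbers accumulate since each plan was cached, so treat them as relative):

```sql
-- Top 10 cached statements by total logical reads
SELECT TOP (10)
       qs.total_logical_reads,
       qs.total_physical_reads,
       qs.execution_count,
       qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                     WHEN -1 THEN DATALENGTH(st.text)
                     ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;
```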
Pavel Celba (Czechia)

Page splits depend on data updates and on the index definitions. If you are heavily inserting random key values, you can expect page splits, as the values must be inserted at various places in the index tree. A reindex or index recreation with a lower fill factor, say 50%, could help here. Of course, a lower fill factor means the engine must read more data to traverse the tree... but you seem to have enough RAM, so those operations should still be very fast. And reindexing should be part of regular database maintenance anyway.
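As an illustration, you could check fragmentation first and then rebuild the affected index with a lower fill factor; index and table names below are placeholders, and 50 is just the example fill factor mentioned above:

```sql
-- Find fragmented indexes in the current database (LIMITED mode is cheapest)
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
  AND ips.page_count > 1000;

-- Rebuild one index with a lower fill factor (placeholder names)
ALTER INDEX IX_YourIndex ON dbo.YourTable
REBUILD WITH (FILLFACTOR = 50);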

Workfiles are created for hash joins. The more such queries are executed, the more workfiles are created. You may minimize this by creating appropriate indexes, but that requires analyzing your query plans... which is always time-consuming.
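One shortcut before reading every plan by hand is the missing-index DMVs; a sketch (the output resets on service restart, and the suggestions are hints, not commands):

```sql
-- Missing-index suggestions collected by the engine since the last restart
SELECT mid.statement AS table_name,
       mid.equality_columns,
       mid.inequality_columns,
       mid.included_columns,
       migs.user_seeks,
       migs.avg_user_impact
FROM sys.dm_db_missing_index_details AS mid
JOIN sys.dm_db_missing_index_groups AS mig
  ON mig.index_handle = mid.index_handle
JOIN sys.dm_db_missing_index_group_stats AS migs
  ON migs.group_handle = mig.index_group_handle
ORDER BY migs.user_seeks * migs.avg_user_impact DESC;
```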

I would also recommend optimizing the number of logical reads. A low number of logical reads means optimal index usage; a high number usually means a missing index, an invalid SQL engine plan choice, etc.
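For a single query you are tuning, SET STATISTICS IO reports the logical and physical reads per table in the Messages tab (the SELECT below is a placeholder; substitute the statement under test):

```sql
SET STATISTICS IO ON;

-- placeholder: the query you are tuning
SELECT *
FROM dbo.YourTable
WHERE SomeColumn = 42;

SET STATISTICS IO OFF;
```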
Basically it looks like people are over-using temp tables. Some developers do (almost) everything through temp tables.

With tuning, my first recommendation is always to review the clustering indexes. Until you get the best clustering index on every table, you won't get truly good results from any other tuning anyway.
CaptainGiblets (ASKER)

Is there an easy way to find out what queries are creating these high results?

I have looked at all the top queries that use CPU time etc and none of them seem to have anything that jumps out at me.
ASKER CERTIFIED SOLUTION
ste5an (Germany)
