Troubleshooting SQL Server slowdowns and T-SQL optimization tips

If you ever find yourself in a situation where SQL Server is maxing out its available CPU, here are a few tips for identifying the cause: 

Head blocking sessions 

Example: You have a table which is the root source of data for most queries used by your apps. 

A long-running query against this table locks the resource, blocking subsequent queries. 

In this example, under high load, you may notice this being followed by "timeout expired" errors in your logs and a general slowness in your application. 

The cause of this may be a query that could be optimized. 

To find out which session is causing the head block, right-click the server instance in Object Explorer and open Activity Monitor. 

Expand Processes and order by Head Blocker descending. 

If you find a process with Head Blocker set to 1 that stays there for a significant amount of time, it is likely the culprit. 

Right-click the process causing the block and click Details - this will show the query being executed that is causing the block; most of the time it will be a stored procedure. 

Using SQL Search from Redgate (a free download), search for part of the query against the database in question and it will point you to the stored procedure. 
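
If you prefer T-SQL over Activity Monitor, a rough sketch along these lines (using the standard blocking DMVs) will list the blocked sessions and who is blocking them - a session id that appears as a blocker but is not itself blocked is your head blocker:

-- Sketch only: list blocked requests and the sessions blocking them
SELECT  r.session_id,
        r.blocking_session_id,
        r.wait_type,
        r.wait_time,
        t.text AS running_sql
FROM    sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE   r.blocking_session_id <> 0;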

 

Expensive queries 

When looking at expensive queries in Activity Monitor, I tend to focus on three columns: 

Executions/min 

If you have a query in this list with a very high executions/min count and the query in question cannot be optimized any further, it could be worth creating a cached table that you repopulate every x seconds/minutes/hours using a SQL job. 

Example: You have a complex select statement with multiple joins returning a large number of rows. 

Under high load, that complex select statement could be what is causing the load on the server, so creating a table that mirrors the select statement's results and using a SQL job to truncate and repopulate it at a set interval can have a significant impact in reducing the strain on the server. 
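
As a rough sketch of such a job step (all table and column names here are made up for illustration), the refresh can be as simple as a truncate followed by an insert from the complex select:

-- Hypothetical cache refresh run by a SQL job on a schedule
SET XACT_ABORT ON;  -- roll the whole refresh back if anything fails

BEGIN TRANSACTION;

TRUNCATE TABLE dbo.OrderSummaryCache;

INSERT INTO dbo.OrderSummaryCache (CustomerId, OrderCount, LastOrderDate)
SELECT  c.CustomerId,
        COUNT(o.OrderId),
        MAX(o.OrderDate)
FROM    dbo.Customers AS c
        JOIN dbo.Orders AS o ON o.CustomerId = c.CustomerId
GROUP BY c.CustomerId;

COMMIT TRANSACTION;

Apps then select from the cache table instead of running the joins themselves.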
 

CPU (ms/sec) 

A query with high CPU usage in ms/sec may be in need of some optimization. If the query itself cannot be optimized any further and all the necessary indexes are in place, it may be worth creating a static table that is read from directly rather than producing the data as the result of a complex query each time, effectively a data cache. 

 

Average Duration (ms) 

A query with a high average duration may be in need of optimization. 

If a query uses a core data source that is also used by other applications and it takes a significant amount of time, you run the risk of locks occurring and of it becoming a head blocker. 

It is always worth looking at ways to fine-tune the performance of the query and, if that is not possible, trying to move it away from heavily used core data to prevent it interfering with any apps using that data.
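
The same three measures can also be pulled straight from the plan cache with something like the sketch below (execution_count is a total since the plan was cached, so you would need its age via creation_time to get a per-minute rate; times are reported in microseconds):

-- Sketch: top cached queries by total CPU, with execution count and average duration
SELECT TOP (10)
        qs.execution_count,
        qs.total_worker_time / 1000                       AS total_cpu_ms,
        qs.total_elapsed_time / qs.execution_count / 1000 AS avg_duration_ms,
        t.text                                            AS query_text
FROM    sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS t
ORDER BY qs.total_worker_time DESC;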

 

Tips for optimizing T-SQL queries

Always check the estimated execution plan 

When writing queries that you plan to turn into a stored procedure, it is always extremely useful to check the execution plan of what you have written. 

The execution plan will also show you where an index could be added to aid the performance of the query. 
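
In SSMS this is the Display Estimated Execution Plan button (Ctrl+L), but you can also request the estimated plan from T-SQL; a minimal sketch (the select is only a placeholder):

-- SET SHOWPLAN_XML must be the only statement in its batch, hence the GO separators
SET SHOWPLAN_XML ON;
GO

SELECT OrderId, OrderDate        -- placeholder query
FROM   dbo.Orders
WHERE  CustomerId = 42;
GO

SET SHOWPLAN_XML OFF;
GO

While SHOWPLAN_XML is ON, the query is not executed - SQL Server returns the estimated plan as XML instead.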

 

Be mindful of how a query will cope under high load 

Whenever taking on a new task that involves writing queries to create stored procedures, always be mindful of how the query would cope under high load; otherwise you could be caught out when releasing to a production server. 

It's very easy to be given a task - create a stored procedure to return some data - then test it yourself, test it with a few users, and all appears well. 

Next it gets released live and is exposed to a high-traffic environment, causing lag on the SQL Server. 

 

SQL Jobs with lots of steps 

When creating a SQL job that needs to do multiple things in order, it is worth splitting the actions up into separate steps in the job's Steps tab (where possible - please note this may not always be suitable depending on circumstances). 

The reason for this is that if you ever need to troubleshoot an issue with some data maintained by your SQL job, you can check the history of the job and see whether it failed at a particular step, allowing you to jump right to that part of the job and fix the issue. 
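
For illustration only (the job, database and object names below are made up, and the job itself is assumed to already exist - created with sp_add_job or through the SSMS UI), adding two separate steps could look like this:

USE msdb;
GO

-- Step 1: clear the cache table, then continue to the next step on success
EXEC dbo.sp_add_jobstep
     @job_name          = N'Refresh order cache',
     @step_name         = N'1 - Truncate cache',
     @subsystem         = N'TSQL',
     @database_name     = N'MyAppDb',
     @command           = N'TRUNCATE TABLE dbo.OrderSummaryCache;',
     @on_success_action = 3;   -- 3 = go to the next step

-- Step 2: repopulate the cache table
EXEC dbo.sp_add_jobstep
     @job_name      = N'Refresh order cache',
     @step_name     = N'2 - Repopulate cache',
     @subsystem     = N'TSQL',
     @database_name = N'MyAppDb',
     @command       = N'EXEC dbo.RefreshOrderSummaryCache;';  -- hypothetical procedure

If step 1 fails, the job history records the failure against that specific step, which is exactly what makes the troubleshooting described above easier.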

 

Checking statistics time 

If you find yourself writing a query that contains complex statements, uses functions, etc., it is useful to put the following at the top of your query: 

SET STATISTICS TIME ON

and the following at the bottom of your query: 

SET STATISTICS TIME OFF

Doing this will show you the execution times for your query: CPU time and elapsed time. 

If you notice that your query is using a high amount of CPU time, attempt to tune it. 
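
For example (the query is just a placeholder, and the numbers shown are illustrative), the Messages tab will then report something like this for each statement:

SET STATISTICS TIME ON;

SELECT COUNT(*)                  -- placeholder query
FROM   dbo.Orders
WHERE  OrderDate >= '20170101';

SET STATISTICS TIME OFF;

-- Messages tab output:
--  SQL Server Execution Times:
--     CPU time = 12 ms,  elapsed time = 45 ms.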

 
 

Checking fields for nulls and/or empty strings 

I once came across a query that was using the DATALENGTH function and also checking whether a field was NULL. 

Example:

WHERE (DATALENGTH([FIELD1]) = 0 OR [FIELD1] IS NULL)  

This query had a slightly high CPU count, so I attempted to simplify the logic check with the following: 

WHERE (ISNULL([FIELD1], '') = '')  

Changing it to this made it evaluate significantly quicker, and that query no longer appeared in the expensive queries list. 

 
 

Checking fields for a boolean (BIT fields) 

Whenever dealing with BIT columns in T-SQL, i.e. using them in WHERE clauses, ensure that you always use the bit form (0 or 1) rather than 'true'/'false' or '0'/'1'. 

Doing this means SQL Server does not have to convert the string into a bit before it can use it in the WHERE clause. 

I have seen this in a query that experienced a very high load, and updating it to use the bit equivalents immediately dropped the CPU usage on the server by half. 
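
As a trivial illustration (IsActive is a made-up column name):

-- Preferred: compare against the bit literal directly
WHERE [IsActive] = 1

-- Avoid: the string has to be converted to a bit before the comparison
WHERE [IsActive] = 'true'
WHERE [IsActive] = '1'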

 
 

Cached data tables 

Whenever you start dealing with complex queries with multiple joins returning thousands of rows from a massive table, putting such a query into a stored procedure and releasing it could have a significant impact on your SQL Server's resources and may cause slowdowns. 

The time may come when it is useful to create a static table that matches the result of your query and to truncate then repopulate it using a SQL job, effectively creating a data cache that can be read with a simple select statement without all the complex joins, where clauses, etc. 
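
A minimal sketch of the idea, reusing the made-up names from the job example earlier:

-- Static table shaped like the result of the expensive query
CREATE TABLE dbo.OrderSummaryCache
(
    CustomerId    INT      NOT NULL PRIMARY KEY,
    OrderCount    INT      NOT NULL,
    LastOrderDate DATETIME NULL
);

-- Applications then read the cache with a trivial query
-- instead of re-running the joins every time
SELECT CustomerId, OrderCount, LastOrderDate
FROM   dbo.OrderSummaryCache;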


