
SQL Performance Troubleshooting Tools

Overview

Problem / Symptoms:

You may have a system that is not performing as expected and would like some performance statistics to point you in the right direction.

SQL Server has a lot of performance monitors that can be called on to highlight the areas where your performance issues most likely are.

But I am primarily going to discuss three topics in this guide:

  1. Disk I/O

  2. CPU

  3. Memory

We all know SQL Server loves memory, and the more the better. But here is the catch: you can have all the memory in the world, yet at some stage SQL Server has to come down from its memory clouds and store data on a physical drive.

This is most likely where your performance issues will arise.

CPU generally only becomes a problem if the system is running too many third-party applications on the same server.

So the general rule is that SQL Server and its production databases should run on their own instance or machine, away from any other third-party application that could degrade performance or take system resources away from SQL Server.

Diagnosis:

So your SQL Server is running slowly or not performing as expected. What can you do to identify possible causes?

Here I will list three applications that can highlight performance issues.

Solution / resolution:

TEST 1:

The first tool is a simple disk performance test, using a simple free utility like CrystalDiskMark.

Download Link

You can get a quick snapshot of disk performance, primarily sequential and random disk reads and writes. In this case, the higher the number the better.

Here is an example of a RAID 1 setup

vs. a standalone SSD drive

Now you might say: my system already has an SSD, I have run a test and the benchmark values are high, so why is SQL Server still performing slowly?

This is probably because too many applications and processes are sharing the same hard drive, even if it is an SSD.

Examples are:

OS (Windows)

SQL

SQLTEMPDB

ProductionDB

3rd Party applications

All running off the same hard drive, competing for disk I/O resources.
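
If you want to confirm from inside SQL Server which database files are suffering, a quick check against the sys.dm_io_virtual_file_stats DMV shows the cumulative read and write latency per file. A minimal sketch (the 15 ms / 5 ms yardsticks are the same guideline values discussed under TEST 2 below):

-- Average read/write latency (ms) per database file since the last SQL Server restart
SELECT DB_NAME(vfs.database_id) AS database_name,
       mf.physical_name,
       vfs.num_of_reads,
       vfs.num_of_writes,
       CASE WHEN vfs.num_of_reads  = 0 THEN 0 ELSE vfs.io_stall_read_ms  / vfs.num_of_reads  END AS avg_read_ms,
       CASE WHEN vfs.num_of_writes = 0 THEN 0 ELSE vfs.io_stall_write_ms / vfs.num_of_writes END AS avg_write_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
  ON mf.database_id = vfs.database_id
 AND mf.file_id = vfs.file_id
ORDER BY avg_read_ms DESC;

Files belonging to the production database or tempdb that show high averages on a shared drive are usually the first candidates to move onto their own disk.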

TEST 2:

Another tool, which is included with the SQL Server installation, is SQLIOSim.

Download Link

This is another great tool to get a quick snapshot of whether the system your production database is running on has I/O performance issues.

The tool gives you the ability to specify a database and log file size, and also where those database and log files are located.

You then specify how many cycles you would like to test. In this example I am going to run 2 cycles of 60 seconds each to get some feedback from my current environment.

Once the test starts to run you will see that it simulates SQL Server I/O activity.

Once the test completes it gives you a long list of log entries with potential warnings and performance stats.

In my test case I am running the SQLIOSim test on a RAID 1 setup.

The sqliosim.mdx file in this case stands in for the SQL Server MDF data file.

We have a target I/O duration of 100 ms, an average I/O duration of 15 ms, and lastly the number of times the I/O was throttled.

The same applies to sqliosim.ldx, which stands in for the LDF log file.

We have a target I/O duration of 100 ms, an average I/O duration of 1 ms, and lastly the number of times the I/O was throttled.

The goal is to keep the average duration as small as possible.

For Data files:

Less than 15ms is excellent performance

For Log files:

Less than 5ms would be ideal

And keep the number of throttled I/Os as small as possible.

As mentioned above, SQLIOSim might also highlight other errors, as shown below.

In the example below we have an instance where the drive cannot keep up with the SQL requests and the requests remain outstanding for more than 15 seconds, which highlights that disk performance is where the bottleneck lies.
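
You can cross-check this conclusion from inside SQL Server as well: if the instance's top wait types are I/O related (for example PAGEIOLATCH_* waits on data file reads, or WRITELOG waits on log flushes), that supports the view that the disk, rather than CPU or memory, is the bottleneck. A minimal sketch against the sys.dm_os_wait_stats DMV (the excluded wait types are just a few common idle waits, not an exhaustive filter):

-- Top waits since the last restart; heavy PAGEIOLATCH_* or WRITELOG waits point at disk I/O
SELECT TOP (10)
       wait_type,
       waiting_tasks_count,
       wait_time_ms,
       wait_time_ms / NULLIF(waiting_tasks_count, 0) AS avg_wait_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP', N'XE_TIMER_EVENT', N'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;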

TEST 3:

The last tool is a set of stored procedures that are triggered to take snapshots of SQL Server at a specified time interval and then write them to a database, so that a tool like Microsoft Power BI can give you a graphical representation of what the SQL instance is doing.

Download Link

These stored procedures were created by Brent Ozar and are free to download.

He has many articles on SQL Server that can be read here

Brent Ozar Unlimited - SQL Server Consulting and Training

The first step is to install the stored procedures, which will all install under the master database.

Within the First Responder Kit you will have a SQL file called ‘Install-All-Scripts.sql’

Run this via SSMS with the master database selected and click Execute; it will install all the stored procedures.

Next we need to create a new Blank database called ‘DBAtools’
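
If you prefer to do these steps from a query window instead of the SSMS GUI, a minimal equivalent is shown below (the LIKE filter is just a quick sanity check that the procedures landed in master):

-- Confirm the First Responder Kit procedures were installed into master
SELECT name FROM master.sys.procedures WHERE name LIKE N'sp_Blitz%';

-- Create the blank database that will hold the captured snapshots
CREATE DATABASE DBAtools;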

Once installed, we need to create an agent job to capture the data and store it in the database so we can accumulate data over a period of time.

Head over to the Agent section in SSMS

Create a new Agent job by right clicking jobs

Give it a name

Click on Steps – New

Give it a name

Select the database you want it to check on

And paste in the T-SQL command below

Click OK

EXEC sp_BlitzFirst

  @OutputDatabaseName = 'DBAtools',

  @OutputSchemaName = 'dbo',

  @OutputTableName = 'BlitzFirst',

  @OutputTableNameFileStats = 'BlitzFirst_FileStats',

  @OutputTableNamePerfmonStats = 'BlitzFirst_PerfmonStats',

  @OutputTableNameWaitStats = 'BlitzFirst_WaitStats',

  @OutputTableNameBlitzCache = 'BlitzCache',

  @OutputTableNameBlitzWho = 'BlitzWho';

Click on Schedule – New

Give it a name

Specify the time interval (in this example, every 15 minutes)

Click OK

Finally Click OK
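
If you would rather script the job than click through the SSMS dialogs, a rough equivalent using the standard msdb Agent procedures is sketched below. The job, step and schedule names are only placeholders, and the @command value is abbreviated, so paste in the full sp_BlitzFirst call shown above:

USE msdb;
GO

-- Placeholder name; use whatever you called the job
EXEC dbo.sp_add_job
     @job_name = N'Collect sp_BlitzFirst snapshots';

-- The step runs in master, where the First Responder Kit procedures live
EXEC dbo.sp_add_jobstep
     @job_name = N'Collect sp_BlitzFirst snapshots',
     @step_name = N'Run sp_BlitzFirst',
     @subsystem = N'TSQL',
     @database_name = N'master',
     @command = N'EXEC sp_BlitzFirst @OutputDatabaseName = ''DBAtools'', @OutputSchemaName = ''dbo'', @OutputTableName = ''BlitzFirst'';';

-- Repeat every 15 minutes, every day
EXEC dbo.sp_add_jobschedule
     @job_name = N'Collect sp_BlitzFirst snapshots',
     @name = N'Every 15 minutes',
     @freq_type = 4,               -- daily
     @freq_interval = 1,
     @freq_subday_type = 4,        -- repeat in units of minutes
     @freq_subday_interval = 15;

-- Target the local server
EXEC dbo.sp_add_jobserver
     @job_name = N'Collect sp_BlitzFirst snapshots',
     @server_name = N'(LOCAL)';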

This should now have created the agent job that will run the stored procedure at 15-minute intervals and store the results in the DBAtools database.

Let this run for as long as needed, ideally 24 hours or more, to get a picture of what is happening on the server.
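
While the job is running you can peek at the output table to confirm data is actually being captured. A quick sanity check, assuming the default table name from the agent step above (the CheckDate column name may differ between First Responder Kit versions):

-- Most recent snapshots captured by the agent job
SELECT TOP (10) *
FROM DBAtools.dbo.BlitzFirst
ORDER BY CheckDate DESC;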

Now for the next part you need Microsoft Power BI to pull the data from the DBAtools database into a report.

Download Microsoft Power BI

Open the ‘FirstResponderKit.pbix’ file found in the First Responder Kit.

Once loaded you will see a dashboard as follows; this is the template with some sample data to show you the various performance counters.

There are ‘Overview’, ‘File Speeds’, ‘File Sizes’ and ‘Perfmon’ tabs that highlight the metrics.

This gives you a great overall view of what is going on within this instance of SQL.

Since the template is only populated with demo data, you need to point it to your ‘DBAtools’ database. To do this, click on Transform Data -> Data Source

Click Close, then Apply changes

This will then read the data from the DBAtools database into the BI report

Once data has been added you can always refresh to get new entries

But this paints a nice picture of what is going on in this SQL instance.

An example from the File Speeds section, captured while performing a table update via the database.exe function, shows how much SQL Server actually uses tempdb.

Diagnosis: are tempdb and the production database sharing the same drive?

Remember, once you are done diagnosing your issue, to stop the agent job; if you like, you can also uninstall the stored procedures by running ‘Uninstall.sql’ from the First Responder Kit.
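
Stopping the collection can also be scripted; a one-line sketch using msdb (the job name is whatever you called it when you created the agent job):

-- Disable the collection job once diagnosis is finished
EXEC msdb.dbo.sp_update_job
     @job_name = N'Collect sp_BlitzFirst snapshots',
     @enabled = 0;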