Posted in SQL Server

Reviewing My Go-To SQL, Part 1

Recently, I was doing some performance testing and came to a realization – I think I’ve been using the same code for this since my SQL Server 6.5/7.0 days. Yikes!

Usually I am trying to look at two or more ways of writing something and figure out which one is faster. If I’m trying to improve bad code, I can keep the starting point and incrementally see if what I’m doing is improving it.

The solution I’ve been using is fairly straightforward:

First, I create a variable to hold the start date and time. Then, I get the current date and time right before I run each set of code. Finally, when the code finishes, I get the difference in milliseconds between the variable and the current date and time.

It’s quick and dirty and gets me the results I am looking for.
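The pattern described above can be sketched roughly like this (variable and column names here are my own, not from any particular script):

```sql
-- Quick-and-dirty timing: capture the start time, run the statements
-- under test, then report elapsed milliseconds.
DECLARE @StartTime DATETIME;

SET @StartTime = GETDATE();

-- ... first set of statements to test ...

SELECT DATEDIFF(ms, @StartTime, GETDATE()) AS ElapsedMs;

SET @StartTime = GETDATE();

-- ... second set of statements to test ...

SELECT DATEDIFF(ms, @StartTime, GETDATE()) AS ElapsedMs;
```

Resetting the variable before each set keeps the timings independent, so comparing two candidate rewrites is just a matter of reading the two `ElapsedMs` values side by side.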

There are definitely some downsides to this –

  • If I’m looking at the execution plan or if the queries return a lot of rows, the time returned includes the time it takes to generate the execution plan or the result sets.
  • It’s looking at all of the steps of the code at once. So if I’m working on improving a multi-step stored procedure, it tells me how everything looks overall, not which step is taking the longest.
  • It’s not necessarily the most accurate way of looking at this information.
  • I’ve been using this SQL since the turn of the millennium.

But there are some upsides –

  • I’m usually trying to find the overall speed, and I’m not looking for anything more specific or 100% accurate.
  • It’s easy to set up and expand or condense. I could get fancy by creating a loop to run the statements multiple times and dump the results into a temp table or variable so it returns an average. But I can also run it for one set or several sets.
  • This will work across multiple versions of SQL Server if you need to worry about backwards compatibility.
  • It’s something I run as I’m working on the T-SQL statements and I don’t have to switch between windows to see what the results are.
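The loop-and-average idea mentioned above could look something like the following sketch (the table variable, run count, and names are assumptions on my part; the `DECLARE ... = 1` shorthand assumes SQL Server 2008 or later):

```sql
-- Run the statements under test several times and report the average.
DECLARE @StartTime DATETIME, @i INT = 1;
DECLARE @Results TABLE (RunNumber INT, ElapsedMs INT);

WHILE @i <= 5
BEGIN
    SET @StartTime = GETDATE();

    -- ... statements to test ...

    INSERT INTO @Results (RunNumber, ElapsedMs)
    VALUES (@i, DATEDIFF(ms, @StartTime, GETDATE()));

    SET @i = @i + 1;
END;

SELECT AVG(ElapsedMs) AS AvgElapsedMs FROM @Results;
```

Averaging over a few runs helps smooth out caching effects and other run-to-run noise, which matters more once the individual timings get small.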

I know I could use Extended Events for this and set them up to limit the capture to just the query or spid I was working on. Usually, I’m just working out of a query window, testing my T-SQL in an isolated environment, so setting up an Extended Events session for a given spid is pretty simple. And as I start using SQL Server 2016 more, I may be able to figure out if and how Query Store could give me this information as well. I’m sure there are other tools out there that get me the same information. But I’m so used to using this that I don’t think I’ve had a real incentive to look for something else.
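For reference, an Extended Events session filtered to a single spid might be set up along these lines (the session name and spid value are placeholders, and the syntax assumes SQL Server 2012 or later):

```sql
-- Capture completed batches for one session id (55 here) into a ring buffer.
CREATE EVENT SESSION [TimingBySpid] ON SERVER
ADD EVENT sqlserver.sql_batch_completed (
    WHERE ([sqlserver].[session_id] = 55))
ADD TARGET package0.ring_buffer;

ALTER EVENT SESSION [TimingBySpid] ON SERVER STATE = START;

-- ... run the T-SQL being tested, read the ring buffer, then clean up ...

ALTER EVENT SESSION [TimingBySpid] ON SERVER STATE = STOP;
DROP EVENT SESSION [TimingBySpid] ON SERVER;
```

The trade-off is exactly the one described in the post: the event data is more precise and broken out per batch, but reading it means switching away from the query window where the testing is happening.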

So what do you think? Do I need to update my go-to testing SQL or should I go with the “it ain’t broke, why fix it” mentality here? If I update my methods, what should I go with?



2 thoughts on “Reviewing My Go-To SQL, Part 1”

  1. I’m with you on this. I still fall back to this and “Set statistics IO/TIME ON/OFF”. Yeah, it’s old school and perhaps a bit of it is me being slow to learn the new stuff, but I figure for 90% of what I need to do the old-school methods get the job done. And can do it fairly quickly and easily.

