Creating test data using TSQL – 4 weeks of CPU stats

To spoof SQL Server creating the CPU stats for the last 4 weeks I used some useful SQL routines. The full code is linked here, but the relevant sections are explained in this blog post. These are a few great TSQL tips and routines that can be used in a variety of situations; listed below are ways to:

  • create an N number of rows quickly
  • create an incremental number (like an IDENTITY column) without identity or any ordered rows (it will even work with a table full of duplicates)
  • create a random number in a result set

The output is a result set that looks like this:

[Screenshot: CPU stats result set]

I declared some variables to control a change in CPU between specific hours and an activity threshold of LOW, MED or HIGH.

Spoofing the results of the CPU

DECLARE @StartDate DATETIME = GETDATE()
DECLARE @BusinessStartHours TIME = '09:00:00'
DECLARE @BusinessEndHours TIME = '18:00:00'
DECLARE @ServerProfile CHAR(4) = 'HIGH' --LOW, MED, HIGH

I used a simple join that creates a Cartesian product of sysobjects and sys.columns to produce a large result set. On a SQL Server 2016 instance this gave 2,347,149 rows, more than enough for what I need.


FROM Master.dbo.sysobjects
CROSS JOIN sys.columns
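
As a quick sanity check (my own sketch, not part of the original post), you can count how many rows the cross join actually gives you on your server; the number will vary by version and database context:

-- Sketch only: how many rows does the Cartesian product give us to play with?
SELECT COUNT_BIG(*) AS RowsAvailable
FROM Master.dbo.sysobjects
CROSS JOIN sys.columns;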

I then used a CTE (Common Table Expression) and a really clever bit of SQL (which of course I stole from somewhere at some point) to generate a row number without having any row numbers or columns to order by.


ROW_NUMBER() OVER (PARTITION BY (SELECT 1) ORDER BY (SELECT 1)) AS ID

This generates a sequential number for any result set.
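
Wrapped around the cross join, a minimal sketch (the CTE name Numbers is my own choice) looks something like this:

-- Sketch: number every row of the Cartesian product without needing a real ordering column
;WITH Numbers AS
(
    SELECT ROW_NUMBER() OVER (PARTITION BY (SELECT 1) ORDER BY (SELECT 1)) AS ID
    FROM Master.dbo.sysobjects
    CROSS JOIN sys.columns
)
SELECT TOP (10) ID
FROM Numbers;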

[Screenshot: sequential ID column output]

I now have a numbered result set running from 1 up to 2,347,149 if needed. I then filtered it down to the time range I wanted: [number of hours] x [minutes in an hour] x [days in a week] x [number of weeks] = 24 x 60 x 7 x 4 = 40,320 minutes.


WHERE ID <= (
    SELECT 24 * 60 * 7 * 4
)

Combined with (a single line of code shown over two lines for clarity):

DATEADD(minute,ROW_NUMBER() OVER (PARTITION BY (SELECT 1)
ORDER BY (SELECT 1)),@StartDate) AS ReportDate

This instantly gave me data for every minute of every day across a 4-week window anchored on now (GETDATE()).

The next thing I needed was some random CPU data. RAND() is not the way to go here, because when it is combined with a result set you get the same random number on every row.

To generate a random number I used NEWID(); the following gives me a random number from 0 to 59:

ABS(CHECKSUM(NewId())) % 60

This gives me a set of random numbers:
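
To see the difference against RAND() for yourself, here is a small comparison of my own (not from the original post); RAND() is evaluated once per query, so it repeats, while the NEWID() version changes on every row:

-- RAND() returns the same value on every row of this query;
-- ABS(CHECKSUM(NEWID())) % 60 is evaluated per row, giving 0-59 each time
SELECT TOP (5)
    RAND() AS SameEveryRow,
    ABS(CHECKSUM(NEWID())) % 60 AS DifferentEveryRow
FROM sys.objects;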

[Screenshot: random CPU values]
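
Putting the pieces together, here is a minimal sketch of the whole idea. This is my own reconstruction rather than the full linked script, it ignores the business-hours and LOW/MED/HIGH profile logic, and I subtract the minutes so the series runs back 4 weeks from @StartDate:

-- Sketch only: one row per minute for the last 4 weeks, each with a random CPU percentage
DECLARE @StartDate DATETIME = GETDATE();

;WITH Numbers AS
(
    SELECT ROW_NUMBER() OVER (PARTITION BY (SELECT 1) ORDER BY (SELECT 1)) AS ID
    FROM Master.dbo.sysobjects
    CROSS JOIN sys.columns
)
SELECT DATEADD(minute, -ID, @StartDate) AS ReportDate,
       ABS(CHECKSUM(NEWID())) % 60 AS CPUPercent
FROM Numbers
WHERE ID <= 24 * 60 * 7 * 4; -- 40,320 minutes in 4 weeks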

Hopefully some of these elements will save someone some time.


Monitoring CPU

As part of a licensing project, it proved very useful to get an overview of current CPU usage across the SQL estate. An expensive third-party software solution was reporting high CPU usage on certain servers that, when checked, had next to no usage.

There is a wealth of free scripts in the Glenn Berry Diagnostic scripts. These are really useful and worth a look if you do anything around SQL Server. The one in question here is query 44; the code is below and can also be found in the script link above.

 


-- Get CPU Utilization History for last 256 minutes (in one minute intervals) (Query 44) (CPU Utilization History)
DECLARE @ts_now bigint = (SELECT cpu_ticks/(cpu_ticks/ms_ticks) FROM sys.dm_os_sys_info WITH (NOLOCK));

SELECT TOP(256) SQLProcessUtilization AS [SQL Server Process CPU Utilization],
       SystemIdle AS [System Idle Process],
       100 - SystemIdle - SQLProcessUtilization AS [Other Process CPU Utilization],
       DATEADD(ms, -1 * (@ts_now - [timestamp]), GETDATE()) AS [Event Time]
FROM (SELECT record.value('(./Record/@id)[1]', 'int') AS record_id,
             record.value('(./Record/SchedulerMonitorEvent/SystemHealth/SystemIdle)[1]', 'int') AS [SystemIdle],
             record.value('(./Record/SchedulerMonitorEvent/SystemHealth/ProcessUtilization)[1]', 'int') AS [SQLProcessUtilization],
             [timestamp]
      FROM (SELECT [timestamp], CONVERT(xml, record) AS [record]
            FROM sys.dm_os_ring_buffers WITH (NOLOCK)
            WHERE ring_buffer_type = N'RING_BUFFER_SCHEDULER_MONITOR'
            AND record LIKE N'%<SystemHealth>%') AS x) AS y
ORDER BY record_id DESC OPTION (RECOMPILE);
------

This shows the CPU usage for the last 256 minutes, which in itself can be quite useful: if you come in one morning and a job on server A has run slowly, you can quickly see whether it was a CPU-related issue. However, it starts to be even more useful if you capture the output on a continual basis (in theory every 256 minutes); I set it to run hourly so it would track server usage over time.

[Screenshot: CPU utilization history output]

This was achieved simply by adding it to a scheduled job (the whole job is here). It creates the job and a table in master, so you can roll it out across the SQL estate, collect the data locally on each server and then pull it back to a central repository for CPU analysis.
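
As a rough idea of what the collection side needs (the table and column names below are my own guesses, not the ones used in the linked job), the hourly step just has to land the query 44 output in a local table:

-- Sketch only: a local table in master to hold each hourly sample of the query 44 output
IF OBJECT_ID('master.dbo.CPUStatsHistory') IS NULL
CREATE TABLE master.dbo.CPUStatsHistory
(
    ServerName              sysname  NOT NULL DEFAULT (@@SERVERNAME),
    EventTime               DATETIME NOT NULL,
    SQLProcessUtilization   INT      NOT NULL,
    SystemIdle              INT      NOT NULL,
    OtherProcessUtilization INT      NOT NULL
);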

The next blog post shows how to spoof a result set of CPU results, which uses some useful TSQL techniques. I'll also blog about pulling the data back to a central location.

 

 

 


Setting up Docker for testing – Setting up 3 SQL servers Manchester / Glasgow / Edinburgh

There are a lot of resources out there about Docker; I am just going to cover the steps I ran to get the test environment I used to present with. These steps create 3 different SQL Server environments (Glasgow, Manchester and Edinburgh). I had previously used other names for other presentations, so call them anything you want.

1. Download Docker Desktop for Windows and install it.

2. Create a user and email address for Docker.

I did have one issue with logging into Docker: I was able to log in using my email address and it was accepted fine, but when running docker login at the command prompt I got the following error:

[Screenshot: docker login error]

I logged out and back in using my username and all was good.

[Screenshot: Docker login with username]

3. The next step is to download the SQL Server image you want to use:

docker pull microsoft/mssql-server-linux:2017-latest

[Screenshot: docker pull output]

This will take a while, but you only need to do it once for each version of SQL Server you want. You might notice I am using PowerShell; I can also use CMD and will switch between the two in the demos.

Now we can create our SQL Servers:

docker run -e "ACCEPT_EULA=Y"
-v sqlvolume:/var/opt/mssql
-e "SA_PASSWORD=Passw0rd"
-p 12435:1433 --name glasgow
-d --hostname glasgow
microsoft/mssql-server-linux:2017-latest

I’ll go through each line above

docker run -e "ACCEPT_EULA=Y"

docker run is the command used for all of the container statements; ACCEPT_EULA is the same as accepting the license agreement when installing SQL Server manually.

 -v sqlvolume:/var/opt/mssql

This specifies a volume named sqlvolume. You might notice the path is in Linux format, as it is a Linux image, but we will be connecting from a Windows 10 machine.

-e "SA_PASSWORD=Passw0rd"

This sets the SA password to a simple password. Locking the password down properly takes more effort and is out of scope for this blog; in a later step I will show you how you can see what the password has been set to.

-p 12435:1433 --name glasgow

We now need to set the port that SQL Server will communicate on. You might notice the 1433 value, which is SQL Server's default port; the 12435 is the host port used to distinguish each server, and this value will be different for Edinburgh and Manchester.

-d --hostname glasgow

The --hostname part is quite important for the demo: without it you can still connect to the server by name, but running SELECT @@SERVERNAME will bring back a gibberish value, which isn't great when I want to return geographical server names.

microsoft/mssql-server-linux:2017-latest

And finally, the image we want to run.

If we then run the whole command, within seconds we have SQL Server up and running.

To confirm this we can run:

docker ps -a

[Screenshot: docker ps -a output]

This shows the running containers, what image they are running, their ports and their names.

There are a couple of tweaks required to allow me to connect through SSMS (and Azure Data Studio) using the server name.

To get a response from ping (as I demoed, I could ping it) you need to run the following as administrator to create a static route:

route add 172.17.0.0 MASK 255.255.0.0 10.0.75.2

I also mentioned that you can see the SA password that has been set. If you run

docker inspect manchester

it prints out the container's configuration, and as you can see the SA password is there in clear text.

[Screenshot: docker inspect showing the SA password]

It also shows the IP address details.

[Screenshot: docker inspect showing the IP address]

I could have used a SQL alias to redirect a DNS name to that IP address; however, I planned on using Azure Data Studio and aliases don't work there, due to it being a new application built from the ground up (utilising the Visual Studio Code codebase).

There are other ways I could have achieved what I wanted (creating a network, for example), but I decided on mapping each hostname to its IP address in the lmhosts file for simplicity.

This file is located in C:\Windows\System32\drivers\etc and is called lmhosts, with no extension. To edit it in a text editor you will need to run the editor as administrator.

The entries in the file look like this:

172.17.0.2 glasgow #PRE
172.17.0.3 edinburgh #PRE
172.17.0.4 manchester #PRE

Once saved, you can reload the lmhosts entries by running (and yes, it's an upper-case R):

nbtstat -R

So now, in Management Studio, you should be able to connect to Manchester, Glasgow or Edinburgh.

[Screenshot: connected to the manchester container in SSMS]
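
Once connected, a quick check (my own addition, not from the original demo) confirms that the name set with --hostname is what SQL Server reports back:

-- Run against each container; should return glasgow, edinburgh or manchester respectively
SELECT @@SERVERNAME AS ServerName, SERVERPROPERTY('Edition') AS Edition;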

Also, as the sqlvolume was added, the data is persisted: stopping and removing the Docker container still retains the data if you recreate it.

So there we have it: 3 SQL environments spun up in no time at all.


Data Scotland Presentation

It's now (well) over a week since I presented at Data Scotland in Glasgow. I had an amazing time; it's definitely one to add to the calendar for next year, and Glasgow is such a vibrant city.

[Photo: Data Scotland]

I presented a talk called “The Productive data professional”, which covered a mix of things, using Docker to create 3 geographical environments (Glasgow, Edinburgh and Manchester) to use in all the demos.

I will have a series of blog posts covering various elements of the presentation over the next week. In the meantime, for anyone interested, the scripts can be found with the slide deck on GitHub:

https://github.com/martincroft/The-Productive-data-professional


5 things in 5 minutes presentation- Data & Gravy

It was my 2nd year attending the great data event hosted in Leeds (for those who don't know, it's near Manchester and not far from Lancashire). Last year I did a full hour presentation; this year I wasn't as organised with submitting, but saw they were looking for lightning talks of just 5 minutes. Umm, that can't be hard, right? So I came up with a title before I came up with any content. This might have been influenced by a presentation I did a few years ago, "99 things but the batch isn't one", on 99 things I had learned recently. I do nothing by halves 😉

So 5 things in 5 minutes

[GIF: 5 things in 5 minutes]

This was done in SQL Notebooks in Azure Data Studio. The gif above (created by my son Leo) was embedded using the following syntax in a text cell in a SQL Notebook (these are really cool). A bit of a tip: use Internet Explorer/Chrome (other browsers are available) to browse to the location of the file if it is local; you can also reference URLs on the web.

![Data and Gravy](file:///C:/DANDG/5-things-in-5-minutes2.gif)
One nice thing with SQL Notebooks is that you can add headings to your script by prefixing text with #, ##, ### or ####, as shown below.
[Screenshot: notebook heading markup]
I also showed a very useful PowerShell script that takes the Glenn Berry Diagnostic SQL scripts (which are just amazing) and converts them to SQL Notebooks.
SQL Notebooks allow you to have separate code and text sections, and you can run individual sections. One great feature is that the result sets can be saved with the notebook.
Now that can be really useful: you can send a script to a developer, a client, anyone, ask them to run it and return the notebook back to you. You can use it for runbooks, like a server upgrade for example, with multiple scripts to be executed one after another, and you get to keep a history of the results! You can share notebooks within a team, and you can also clear all output with a couple of clicks. I can think of many use cases for these.
[Screenshot: SQL Notebook]

The script I found was from Gianluca Sartori (aka spaghettidba) and his post can be found here.
The next step will be to take some of the SQL scripts I use and run them through the routine he's created to convert them to SQL Notebooks.
Next up in my 5 things in 5 minutes was a choice for the attendees of Data in Devon (which Data & Gravy were streaming to) and the locals in Leeds. Over two recent contracting projects I had used some routines, one set as part of a licensing project and one as part of a decommissioning project. The winner was CPU stats (I'll expand on these in a future post, as I found them useful and someone else might too). I'll cover the logging one at another point.
Anyway, I'm about 4 minutes in at this point. I think I could probably expand this out to an hour's talk, so quite why I thought I could cram it into 5 minutes I don't know.
It was nicely presented, mind (on the screen, I don't mean by me): lovely images and SQL scripts.
CPU Stats
The script here (which again was from cleverer people than me) uses the Glenn Berry scripts to pull back CPU stats. The job creates a table (if it doesn't exist) and runs hourly, pulling the CPU stats back from the server; you can look at the CPU for the last 256 minutes. It's only one sample per minute, but as part of a licensing project it highlighted multiple SQL VMs that could be moved to max-virtualisation hosts (if you have an Enterprise Agreement and Software Assurance; don't get me started on licensing, that's maybe another post).
Anyway, I digress. Some expensive tools (names withheld) that show where CPU savings can be made reported servers having high CPU usage, but these simple scripts showed me that, yes, CPU hit 70% at 3 AM when the backups ran for 5 minutes on a SQL Server that is used 9-5, so it didn't matter and the server certainly didn't need that many cores!
[Screenshot: CPU stats for my SQL Server]
I did go a bit further with the stats: I have PowerShell routines to pull them to a central reporting area, a Power BI report to graph CPU, and filters that allow ignoring backups to get a true picture. Anyway, I'm at about 4.5 minutes at this stage. I probably scared everyone, to be honest, so hopefully a blog post will make it easier to look back at what I was waffling on about.
These can also be very useful for baselining systems.
Last but not least is a really useful set of routines that I came across, again as part of an upgrade project and as another failsafe when changing systems.
I came across George Walkey's set of PowerShell scripts, which allow you to script out an entire SQL Server. It's worth taking a look, and it uses dbatools.
[Screenshot: George Walkey's scripts]
That's taken me ages to write... and I presented it in 5 minutes. I must be simple.

Email Security SQL Server

One security vulnerability that people don't often realise is that SQL Server can email information out through the likes of sp_send_dbmail. It goes without saying that you should take steps to stop external emails and, where they are a requirement, ensure they go to approved email domains only.

The following bit of code shows the domains that emails have actually been sent to. The code below uses several email addresses (a mock-up of the MSDB.dbo.sysmail_allitems recipient addresses).

You will notice there are multiple email addresses on one line.

[Screenshot: example email addresses]

The following code splits out each email address individually and then strips it back to just the domain name, so you can quickly check whether emails are being sent to places they shouldn't be. You can even exclude your company's email domains so it turns into an exception report (there is a small sketch of that after the results below)!

WITH CTE_Email
AS
(
    --SELECT recipients FROM MSDB.[dbo].[sysmail_allitems]
    SELECT Email AS Recipients FROM
    (VALUES
        ('Martin.Croft@SQLServerSolutions.com'),
        ('Someone@MyWorksDomain.co.uk;SomeoneElse@MyWorksDomain.co.uk'),
        ('PersonalEmail@Yahoo.co.uk'),
        ('Evil@NastyHackers.co.uk'),
        ('Bill@Acceptable3rdParty.com;Helpdesk@Acceptable3rdParty.com')
    ) EmailAddress(Email)
)
SELECT DISTINCT
    SUBSTRING(recipients, CHARINDEX('@', recipients) + 1, LEN(recipients)) AS EmailDomains
FROM
(
    SELECT
        LTRIM(RTRIM(m.n.value('.[1]', 'varchar(8000)'))) AS recipients
    FROM
    (
        SELECT CAST('<XMLRoot><RowData>' + REPLACE(recipients, ';', '</RowData><RowData>') + '</RowData></XMLRoot>' AS XML) AS Email
        FROM CTE_EMAIL
    ) t
    CROSS APPLY Email.nodes('/XMLRoot/RowData') m(n)
) EmailAddress

Which results in the following:

[Screenshot: email address domains]
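
As a rough, self-contained illustration of the exception-report idea (the approved domains here are just assumed for the example), you simply exclude your own and any approved third-party domains from the final list:

-- Sketch only: keep anything that is not on the approved-domain list
SELECT EmailDomains
FROM (VALUES ('SQLServerSolutions.com'),
             ('MyWorksDomain.co.uk'),
             ('Yahoo.co.uk'),
             ('NastyHackers.co.uk'),
             ('Acceptable3rdParty.com')) AS d(EmailDomains)
WHERE EmailDomains NOT IN ('MyWorksDomain.co.uk', 'Acceptable3rdParty.com');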


Kerberos -The annoyance that is

This is more about how to find something than about the thing that was found, if that makes sense. Anyway, a bit of back story is probably worthwhile at this point.

In the beginning 

There was once a server that wasn't working properly. The poor little SQL Server, when started, would write the following out to the SQL event log (but not in italics):

The SQL Server Network Interface library could not register the Service Principal Name (SPN) for the SQL Server service. Error: 0x80090350, state: 4. Failure to register an SPN may cause integrated authentication to fall back to NTLM instead of Kerberos. This is an informational message. Further action is only required if Kerberos authentication is required by authentication policies.

Every time it started, it would moan and whinge; this, however, was a lowly test server and didn't get the time it deserved. Until someone (accidentally) broke some integration that was set up with another server. This was rebuilt, and Kerberos was actually needed now (how it wasn't before is another question for another day).

Anyway, all the blogs on the internet were googled, as is the way, and there are lots out there regarding SPNs and their setup, configuration and fixing; these had all been followed. It's at this point that I mention there was another server, a test server just like our poor little server, only it didn't have the same problem.

Hallelujah, I hear people exclaim: just compare the good server's and the poorly server's configuration and settings. That was exactly what the DBA had done, and he was starting to pull his hair out when he asked for another opinion and another set of eyes on the issue. We tried a couple of things to eliminate the issue, without success! "Both service accounts are set up the same," he exclaimed. But are they? I wondered. Have you seen how many settings can be made against users and computers in AD? Wow.

A quick google for the syntax and a couple of lines of PowerShell, run for both the poorly server and the good server:

 

Get-ADUser -filter {name -like "svc_Poorly_SQL"} -Properties * |
Export-Csv -Path "C:\Temp\MSTEST.csv" -Delimiter ","

This allowed me to compare the settings, and yep, there were ones that weren't the same. Taking usernames, SIDs and the like out of the equation, some digging was done on the settings that were different, and on the poorly server the UseDESKeyOnly option had been set to TRUE (which was also visible in the GUI). There are other blogs out there that describe what this option is and does; in this case all you need to know is that it was the cause of our issues.

The moral of the story

You can stare at something for ages and never see it. Don't ever be afraid to ask for someone else's opinion, as you will without doubt be in the same position at some point. You don't necessarily have to deep dive into things: if you've got a baseline (the working server in this case) and the broken one, then it's a matter of logic, working out what is actually different and finding a method to discover it.

 
