5 things in 5 minutes presentation - Data & Gravy

It was my second year attending the great data event hosted in Leeds (for those who don’t know, it’s near Manchester and not far from Lancashire); last year I did a full hour presentation. This year I wasn’t as organised about submitting, but I saw they were looking for lightning talks of just 5 minutes. Umm, that can’t be hard, right? So I came up with a title before I came up with any content. This might have been influenced by a presentation I did a few years ago, 99 things but the batch isn’t one, on 99 things I had learned recently. Do nothing by halves, me 😉

So 5 things in 5 minutes

5-things-in-5-minutes2

This was done in SQL Notebooks in Azure Data Studio. The GIF above (created by my son Leo) was embedded using the following syntax in a text cell in a SQL Notebook (these are really cool). Bit of a tip: use Internet Explorer/Chrome (other browsers are available) and browse to the location of the file if it’s stored locally; you can also reference URLs on the web.

![Data and Gravy](file:///C:/DANDG/5-things-in-5-minutes2.gif)
One nice thing with SQL Notebooks is that you can add headings to your script by prefixing text with #, ##, ### or #### as shown below.
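For example, a text cell along these lines (the heading text here is just illustrative) renders as nested headings:

# CPU stats
## Ring buffer history
### Last 256 minutes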
I also showed a very useful PowerShell script that takes the Glenn Berry diagnostic SQL scripts, which are just amazing, and converts them to SQL Notebooks.
SQL Notebooks allow you to have separate code and text sections, and you can run individual sections. One great feature is that the result sets can be saved with the notebook.
Now that can be really useful: you can send a script to a developer, a client, anyone, then ask them to run it and return the notebook to you. You can use it for run books, like a server upgrade for example, with multiple scripts to be executed one after another, and you get to keep a history of the results! You can share them within a team, and you can also clear all output with a couple of clicks. I can think of many use cases for these.
SQL Notebooks

The script I found was from Gianluca Sartori (aka spaghettidba) and his post can be found here.
The next step will be to take some of the SQL scripts I use and run them through the routine he’s created to convert them to SQL Notebooks.
Next up in my 5 things in 5 minutes was a choice for the attendees of Data in Devon (which Data & Gravy were streaming to) and the locals in Leeds. Over two recent projects where I am contracting I had used some routines, one as part of a licensing project and one as part of a decommissioning project. The winner was CPU stats (I’ll expand on these in a future post as I found them useful and someone else might). I’ll cover the logging one at another point.
Anyway, I’m about 4 minutes in at this point. I think I could probably expand this out to an hour’s talk, so why I thought I would cram it into 5 minutes I don’t know.
It was nicely presented, mind (on the screen, I don’t mean by me): lovely images and SQL scripts.
CPU Stats
The script here (which again was from cleverer people than me) uses the Glenn Berry scripts to pull back CPU stats. The job creates a table (if it doesn’t exist) and runs hourly, pulling back the CPU stats from the server. You can look at the CPU for the last 256 minutes. It’s only one sample per minute, but as part of a licensing project it highlighted multiple SQL VMs that could be moved to maximum-virtualisation hosts (if you have an Enterprise Agreement and Software Assurance; don’t get me started on licensing, that’s another post maybe).
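To give a flavour of the approach, here is a minimal sketch of the idea (my own illustrative version, not the exact script from the talk; the table and column names are mine). It persists the output of Glenn Berry’s ring buffer query, which holds one reading per minute for roughly the last 256 minutes:

-- Create the logging table on first run (names are illustrative)
IF OBJECT_ID(N'dbo.CPUStatsHistory', N'U') IS NULL
    CREATE TABLE dbo.CPUStatsHistory
    (
        EventTime               datetime NOT NULL,
        SQLProcessUtilization   int      NOT NULL,
        SystemIdle              int      NOT NULL,
        OtherProcessUtilization int      NOT NULL
    );

-- Glenn Berry style ring buffer query: one reading per minute
-- (a real job would also de-duplicate the overlap between hourly runs)
DECLARE @ts_now bigint = (SELECT cpu_ticks / (cpu_ticks / ms_ticks)
                          FROM sys.dm_os_sys_info WITH (NOLOCK));

INSERT INTO dbo.CPUStatsHistory (EventTime, SQLProcessUtilization, SystemIdle, OtherProcessUtilization)
SELECT DATEADD(ms, -1 * (@ts_now - [timestamp]), GETDATE()) AS EventTime,
       SQLProcessUtilization,
       SystemIdle,
       100 - SystemIdle - SQLProcessUtilization AS OtherProcessUtilization
FROM (SELECT record.value('(./Record/SchedulerMonitorEvent/SystemHealth/SystemIdle)[1]', 'int') AS SystemIdle,
             record.value('(./Record/SchedulerMonitorEvent/SystemHealth/ProcessUtilization)[1]', 'int') AS SQLProcessUtilization,
             [timestamp]
      FROM (SELECT [timestamp], CONVERT(xml, record) AS record
            FROM sys.dm_os_ring_buffers WITH (NOLOCK)
            WHERE ring_buffer_type = N'RING_BUFFER_SCHEDULER_MONITOR'
              AND record LIKE N'%<SystemHealth>%') AS x) AS y;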
Anyway, I digress. Some expensive tools for showing where CPU savings can be made (names withheld) reported servers as having high CPU usage, but these simple scripts showed me that, yes, CPU hit 70% at 3 AM when the backups ran for 5 minutes on a SQL Server that is used 9-5, so it didn’t matter and the server certainly didn’t need that many cores!
My SQL Server
I did go a bit further with the stats: I have PowerShell routines to pull them to a central reporting area and a Power BI report to graph CPU, with filters that allow ignoring backups to get a true picture. Anyway, I’m at about 4.5 minutes at this stage. I probably scared everyone, to be honest, so hopefully the addition of a blog post will make it easier to look up what I was waffling on about.
These can also be very useful for baselining systems.
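As a rough illustration of the baselining idea (again just a sketch against the illustrative table above, and the 3 AM backup window filter is only an example), something like this gives an hourly CPU profile with the backup spike excluded:

-- Hourly CPU profile, ignoring the 03:00 backup window (window is illustrative)
SELECT DATEPART(hour, EventTime)  AS HourOfDay,
       AVG(SQLProcessUtilization) AS AvgSqlCpu,
       MAX(SQLProcessUtilization) AS MaxSqlCpu
FROM dbo.CPUStatsHistory
WHERE DATEPART(hour, EventTime) <> 3
GROUP BY DATEPART(hour, EventTime)
ORDER BY HourOfDay;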
Last but not least is a really, really useful set of routines that I came across, again as part of an upgrade project and as another failsafe when changing systems.
I came across George Walkey’s set of PowerShell scripts, which allow you to script out an entire SQL Server. Just to give you an indicator, see below. It’s worth taking a look, and it uses DBAtools.
Gwalkey
That’s taken me ages to write… and I presented it in 5 minutes. I must be simple.
This entry was posted in PowerShell, Productivity, SQL Notebooks, Tips & Tricks, TSQL.

2 Responses to 5 things in 5 minutes presentation - Data & Gravy

  1. George Walkey says:

    Thanks for the mention!
    But my powershell scripts do not use DBATools
    All pure Powershell


  2. martincroft says:

    My mistake nevertheless really useful tools. Thanks for sharing.
