SQL Process Automation with PowerShell

Here is a handy way to handle SQL processes that you find yourself needing to schedule. Of course, you can always set up that kind of scheduling via the SQL Server Agent, but there are two good reasons to do it with PowerShell instead.

1: You don't have rights to add jobs via the SQL Server Agent. Some security teams will restrict non-DBA access to the agent, or will insist on setting the agent to a manual start.
2: You want easier tracking, easier configuration, and, honestly, just want to do something cool with PowerShell.

The other option when trying to schedule something like SQL processes is to simply use Task Scheduler. Indeed, in my solution I actually use Task Scheduler as the base engine, running every minute or so. What I don't like about Task Scheduler is putting SQL command lines directly in it; it's flat-out a pain. So I built something that is easy to configure (even by someone who is not skilled in PowerShell), easy to implement, and has all of the typical good stuff you want with PowerShell.

The solution uses three main pieces. First is a script that we will schedule to run every minute via Task Scheduler. Second is a configuration file in JSON format; I chose JSON since it's simple to read and easy to write. Lastly, there is an XML file that tracks the last time each command ran. Let's examine each piece.

The script file is fairly straightforward, and there are only a couple of pieces that need explaining. In simple terms, this script is the engine that calls the various SQL commands we specify in the config.json file. It imports a few modules, sets up a couple of base variables, and then loops through the SQL commands, updating the runhistory.xml file with a timestamp for each one it runs. Schedule this file to run every minute via Task Scheduler.

# Build the log file path from the script's own name (e.g. engine.ps1 -> engine.log)
$log = $PSScriptRoot + '\' + ($MyInvocation.MyCommand.Name).Split('.')[0] + '.log'

# Import the DDTLogging and SQL modules from the local modules directory
$ModulesDir = 'D:\pwsh\modules'
$ModulesToImport = 'DDTLogging','SQL'
foreach ($module in $ModulesToImport){
    Get-ChildItem $ModulesDir\$module\*.psd1 -Recurse | Resolve-Path | ForEach-Object { Import-Module $_.ProviderPath -Force }
}

$Date = Get-Date

# Load the command definitions and the run history
$Commands = (Get-Content $PSScriptRoot\config.json | ConvertFrom-Json).Commands
[xml]$RunHistory = Get-Content "$PSScriptRoot\runhistory.xml"

foreach ($command in $Commands){
    $commandname = $command.name
    $LastRunTime = ($RunHistory.Catalog.dataset | Where-Object {$_.name -eq $commandname} | Select-Object -Property time).time
    # The time attribute is stored as a string, so cast it to [datetime] before comparing
    if ($null -eq $LastRunTime -or [datetime]$LastRunTime -lt $Date.AddMinutes(-$command.TimePeriod))
    {
        if ($null -eq $LastRunTime){
            # First run for this command - add a new dataset node to the history file
            $newdata = $RunHistory.Catalog.AppendChild($RunHistory.CreateElement("dataset"))
            $newdata.SetAttribute("name","$commandname")
            $newdata.SetAttribute("time","$Date")
            $RunHistory.Save("$PSScriptRoot\runhistory.xml")
        }
        else {
            # Update the existing node with the new run time
            $RunHistory.Catalog.dataset | Where-Object {$_.name -eq $commandname} | ForEach-Object {$_.time = [string]$Date}
            $RunHistory.Save("$PSScriptRoot\runhistory.xml")
        }
        try{
            Invoke-SQL -server $command.server -database $command.database -method $command.commandtype -integratedsecurity $true -statement $command.statement
        }
        catch{
            Write-Log -text "Error in $commandname. $_" -level ERROR -log $log
        }
    }
}
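
Since the engine is driven by Task Scheduler rather than the SQL Server Agent, registering the task itself can also be scripted. Here is a minimal sketch using the built-in ScheduledTasks cmdlets; the script path and task name are assumptions, so adjust them to your environment:

# Assumed location of the engine script above
$scriptPath = 'D:\pwsh\SQLProcessEngine.ps1'
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument "-NoProfile -ExecutionPolicy Bypass -File `"$scriptPath`""
# Fire once now, then repeat every minute
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 1) -RepetitionDuration (New-TimeSpan -Days 3650)
Register-ScheduledTask -TaskName 'SQLProcessEngine' -Action $action -Trigger $trigger -RunLevel Highest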

If you are wondering about the Invoke-SQL command, it’s something that I wrapped up in a quick module. You can get the module here.
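
The module itself isn't reproduced in this post, but for context, a minimal sketch of what a wrapper like Invoke-SQL could look like is below. It is built on System.Data.SqlClient and only mirrors the parameters used in the engine script; the real module may differ.

function Invoke-SQL {
    # Illustrative sketch only - not the actual module referenced above
    param(
        [string]$Server,
        [string]$Database,
        [string]$Method,                  # e.g. 'execute', 'update', or 'query'
        [bool]$IntegratedSecurity = $true,
        [string]$Statement
    )
    $connString = "Server=$Server;Database=$Database"
    if ($IntegratedSecurity) { $connString += ';Integrated Security=SSPI' }
    $conn = New-Object System.Data.SqlClient.SqlConnection $connString
    $cmd  = New-Object System.Data.SqlClient.SqlCommand ($Statement, $conn)
    try {
        $conn.Open()
        if ($Method -eq 'query') {
            # Load the result set into a DataTable and return it
            $table = New-Object System.Data.DataTable
            $table.Load($cmd.ExecuteReader())
            return $table
        }
        else {
            # 'execute' and 'update' statements don't return rows
            [void]$cmd.ExecuteNonQuery()
        }
    }
    finally {
        $conn.Close()
    }
}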

Now that we have the base script, let's look at the config.json file. This is where the meat of the information about your commands comes from:

{
    "Commands":
    [
        {
            "name":"SQL_Views_Stored_Proc",
            "timeperiod":15,
            "statement":"exec refreshviews",
            "server":"sqlserver1.constoso.com",
            "database":"customers",
            "commandtype":"execute"
        },
        {
            "name":"Update_Time",
            "timeperiod":15,
            "statement":"update cogsuppliers set time = getdate()",
            "server":"sqlserver2.constoso.com",
            "database":"suppliers",
            "commandtype":"update"
        }       
    ]
}

As you can see, it's pretty straightforward. Put in how often you would like the statement to run (timeperiod), the statement itself, the server and database to run it against, and finally the 'type' of command it is. That last field just tells the SQL module whether or not to load the results into a data table. That's it! Anyone with a basic knowledge of how to write a JSON file can add to or remove from this config file quickly.
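
If you'd rather not hand-edit the JSON at all, new entries can be appended with a few lines of PowerShell. The Purge_Staging command below is just a made-up example:

$configPath = "$PSScriptRoot\config.json"
$config = Get-Content $configPath | ConvertFrom-Json
$config.Commands += [pscustomobject]@{
    name        = 'Purge_Staging'            # hypothetical example command
    timeperiod  = 60
    statement   = 'exec purgestagingtables'
    server      = 'sqlserver1.contoso.com'
    database    = 'customers'
    commandtype = 'execute'
}
$config | ConvertTo-Json -Depth 5 | Set-Content $configPath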

The final piece is the runhistory.xml file. This simply lets the main script keep track of the last time each statement was run. You shouldn't ever have to update this file manually.

<Catalog>
  <dataset name="SQL_Views_Stored_Proc" time="11/15/2018 10:49:12" />
  <dataset name="Update_Time" time="11/15/2018 10:49:12" />
</Catalog>

For full transparency, there are a few things I still need to swing back around and address. First, the commands run sequentially, so a long-running SQL statement might cause others to fail or fall behind. The plan for that is to use something like PoshRSJob to run the jobs in parallel, each in its own runspace; see the sketch below. Secondly, there is a small chance that if someone were to manually edit the runhistory.xml file and remove one of the lines, the script could error. I will update the script with a catch to make sure this doesn't happen.
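
For the parallel piece, a rough sketch of what that could look like with PoshRSJob is below. It assumes the module is installed and that the SQL module can be loaded into each runspace by name:

Import-Module PoshRSJob
$jobs = foreach ($command in $Commands) {
    # Each command gets its own runspace, so a long-running statement no longer holds up the rest
    Start-RSJob -Name $command.name -ArgumentList $command -ModulesToImport 'SQL' -ScriptBlock {
        param($command)
        Invoke-SQL -server $command.server -database $command.database -method $command.commandtype -integratedsecurity $true -statement $command.statement
    }
}
$jobs | Wait-RSJob | Receive-RSJob
$jobs | Remove-RSJob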

SiteScope Logging to a SQL AlwaysOn Availability Group

SiteScope can log monitor status and response data to a couple of locations, primarily flat files and databases. Useful as that is, the logging fails when you attempt to send it to a Microsoft SQL Server Availability Group (AlwaysOn). If we use a newer driver, however, we can make it work.

Versions:  SiteScope 11.32, SQL 2014, JDBC driver 4.2

The first thing to do is download the latest JDBC driver from Microsoft’s website – https://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=11774
I downloaded the sqljdbc_6.0.7507.100_enu.exe version since it was the latest at the time of this post. Download the file and open the exe; it will prompt you for an extraction location. You will only need two files from this extract, so a temp location is fine. Grab the latest sqljdbc jar (sqljdbc42.jar in this case) and copy it to the SiteScope \WEB-INF\lib directory. From the extract, you also need to grab the sqljdbc_auth.dll file and copy it to the c:\windows\system32 directory on the SiteScope server.
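
If you'd rather script those copies than do them by hand, something like the following works. The extraction and SiteScope install paths are assumptions, so point them at your own locations (the folder layout inside the extract can also vary by driver version):

# Assumed paths - adjust to wherever you extracted the driver and where SiteScope is installed
$jdbcExtract   = 'C:\Temp\sqljdbc'
$siteScopeRoot = 'D:\SiteScope'
# Find the jar and the 64-bit auth dll inside the extract
$jar  = Get-ChildItem -Path $jdbcExtract -Filter 'sqljdbc42.jar' -Recurse | Select-Object -First 1
$auth = Get-ChildItem -Path $jdbcExtract -Filter 'sqljdbc_auth.dll' -Recurse | Where-Object { $_.FullName -match 'x64' } | Select-Object -First 1
Copy-Item $jar.FullName  "$siteScopeRoot\WEB-INF\lib\"
Copy-Item $auth.FullName 'C:\Windows\System32\'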

On the SiteScope server, remove the old \WEB-INF\lib\sqljdbc.jar file (delete it or move it out of the SiteScope directories entirely). Be advised that any previous database connections that use this driver can fail, so test those after removing the file and restarting SiteScope. Launch SiteScope, navigate to Preferences->Log Preferences, and enter "jdbc:sqlserver://<AG listener name>:1433;databaseName=<your database>;IntegratedSecurity=true;MultiSubnetFailover=true" (without the quotes, substituting your own listener and database names) as the database connection URL. Enter "com.microsoft.sqlserver.jdbc.SQLServerDriver" (without the quotes) as the database driver. Leave the username and password entries blank.

The important pieces in play here: the updated driver supports the connection property MultiSubnetFailover, which tells the connection that it can fail over to another member of the AG. The IntegratedSecurity=true entry tells the driver that the account running SiteScope will be making the connection, so it is important that the SiteScope service runs as a user with rights to Create and Insert on the database.

Restart the SiteScope service. It will attempt to connect to the database in the URL, create a table called SiteScopeLog, and start logging after a few monitor runs (default 20).

Bonus: you can change the table that SiteScope logs to. For example, if you want multiple SiteScope instances logging to a single database, you can have each one log to its own table. This support article details how to change the master.config file, specifically the CREATE and INSERT statements.

Hope this article helps. This took way too long to figure out, so I hope it saves you some time.