Super-fast mass update of management servers for OpsMgr

Here’s a quick one – you want to update the failover management servers on your agents en masse, and don’t want to wait 12 years for it to complete. Why would you want to set failover explicitly? Maybe you only want certain agents talking to certain data-centers, or specific management servers have very limited resources. Regardless of the reason, if you do need to update the agent config, it can be a bit slow. Here is a quick little script that can make those updates a LOT quicker.

First things first – download PoshRSJob from Boe Prox. It’s about the best thing since sliced bread, and I use it constantly. Download the module and place it in one of your module directories (C:\Windows\System32\WindowsPowerShell\v1.0\Modules, for example). Next, create a CSV called FailOverPairs.csv. This should have 2 columns – Primary and Failover. For example:

Primary,Failover
MS01,MS02
MS03,MS04
MS05,MS06

You will want that header line – Import-Csv uses it to name the columns, which saves us a couple of lines of code in PowerShell. This CSV will be used to set the appropriate failover partner. Save it in the same directory as the script below, and you are good to go! Here is the script:

Import-Module PoshRSJob -Force
Import-Module OperationsManager -Force
$modules = (Get-Module | Where-Object{ $_.Name -notlike 'Microsoft.*' -and $_.Name -ne 'PoshRSJob' -and $_.Name -ne 'ISE' }).path
try
{
    $agents = Get-SCOMAgent
}
catch
{
    write-verbose "Cannot load agent list"
}
$Pairs = Import-Csv -Path "$PSScriptRoot\FailoverPairs.csv"
$agents|Start-RSJob -Name { $_.DisplayName } -Throttle 20 -ModulesToImport $modules -ScriptBlock {
    param($agent)
    $Pairs = $using:Pairs
    $primary = $agent.PrimaryManagementServerName
    # Note: GetFailoverManagementServers() can return more than one server; this script assumes a single failover
    $CurrentFailover = ($agent.GetFailoverManagementServers().DisplayName)
    foreach ($Pair in $Pairs)
    {
        if ($Pair.Primary -eq $primary){$secondary = $Pair.Failover}
        if ($Pair.Failover -eq $primary){$secondary = $Pair.primary}
    }
    if ($secondary -ne $CurrentFailover)
    {
        $AgentName = $agent.DisplayName
        write-verbose "$AgentName Secondary wrong. Primary $Primary, Current Secondary $CurrentFailover, Discovered Secondary $secondary"
        try
        {
            $Failover = Get-SCOMManagementServer | Where-Object {$_.Name -eq $secondary}
            if ($Failover.IsGateway -eq $true)
            {
                $FailOverServerObject = Get-SCOMGatewayManagementServer | Where-Object {$_.Name -eq $secondary}
            }
            else 
            {
                $FailOverServerObject = $Failover
            }
            Set-SCOMParentManagementServer -Agent $agent -FailoverServer $FailOverServerObject
            write-verbose "$AgentName $secondary set."
        }
        catch
        {
            $ErrorText = $error[0]
            write-verbose "$AgentName Failed to set failover. Current Failover $CurrentFailover, Discovered Failover $secondary.$ErrorText"
        }
    }
}|Out-Null
get-rsjob|Wait-RSJob|Remove-RSJob -force|Out-Null

Let’s examine some of this – the imports are obvious. If you have any issue with unblocking files or execution policy, leave a comment and I will help you through the import. The next line is different:

$modules = (Get-Module | Where-Object{ $_.Name -notlike 'Microsoft.*' -and $_.Name -ne 'PoshRSJob' -and $_.Name -ne 'ISE' }).path

What we are doing here is getting a list of the loaded modules, then excluding some of them. We do this because when we run this script, we create a ton of runspaces, and by default each runspace needs to know which modules to load. We don’t need them to load PoshRSJob, and we don’t need things like the ISE – the runspaces are ephemeral, and will go away after they have completed their processing. This line can be modified if you need to exclude other modules. It will load the OperationsManager module, which is the heavy lifter of this script.

Next, we get all of the agents from the management group. This script needs to be run from a SCOM server, but you could easily modify it to run from a non-SCOM system by adding the “-ComputerName” parameter to the Get-SCOMAgent command. Then we import the CSV that contains our failover pairs.
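A hedged sketch of that remote variant (the server name here is a stand-in for one of your own management servers):

# Hypothetical remote variant - point Get-SCOMAgent at a management server
$agents = Get-SCOMAgent -ComputerName 'MS01.contoso.local'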

Now the fun starts – this line starts the magic:

$agents|Start-RSJob -Name { $_.DisplayName } -Throttle 20 -ModulesToImport $modules -ScriptBlock {

We are feeding the list of SCOM agents (via the pipeline) to the Start-RSJob cmdlet. The “-Name” parameter tells PoshRSJob to use the agent name as the job name, and the “-Throttle” parameter controls how many runspaces we want running at once. I typically find there isn’t a lot of benefit to going much over 2 or 3 times the number of logical cores. If you have very long-running remote processes, it might be beneficial to go up to 5-10 times the number of processors, but for this task I found 2-3 to be the sweet spot. You will also see that we are telling Start-RSJob which modules to import (see above).

The rest of the script is the scriptblock we want PoshRSJob to run. This is actually pretty straightforward – we set some variables (some of them we have to get with “$using:“). Then we find the current primary and failover, see if they match our pairs, and if they don’t, we correct them. This isn’t a fast process on its own, but when you are doing 20 at a time, it goes by a lot faster!

At the end of the script, we are simply waiting for the jobs to finish. In fact, if you want to track the progress, comment out this line:

get-rsjob|Wait-RSJob|Remove-RSJob -force|Out-Null

If you comment that line out, you can track how fast your jobs are completing by using this:

get-rsjob|group -property state

We’ve been able to check several thousand systems daily in very little time to make sure our primary and failover pairs are set correctly. I hope you guys get some use from this, and go give Boe some love for his awesome module! Leave a comment if you have any questions, or hit me up on Twitter.

Customize your PowerShell profile for useful startup actions

Did you know you can make PowerShell run any commands you want when you start a shell? This is amazingly useful for gathering information, making settings changes, or kicking off processes – all at shell startup. There are plenty of places that talk about profiles, so I won’t go into each type, but long story short, there are almost 10 different profiles when you account for 32- and 64-bit PowerShell. Some of them are explained in detail here.

For my example here, I am going to deal with the %UserProfile%\My Documents\WindowsPowerShell\profile.ps1 profile, which affects the current user, but all shells. This is useful for when you are switching back and forth between the ISE and console – say when you are testing new scripts. By default this file won’t exist – you will have to create it yourself.

#See if your profile file exists. $profile alone refers to the current-host profile;
#CurrentUserAllHosts is the 'My Documents\WindowsPowerShell\profile.ps1' file
Test-Path $profile.CurrentUserAllHosts
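If that comes back False, here is a minimal sketch for creating it (using the same $profile variable for the path):

#Create the all-hosts profile for the current user if it is missing, then open it
if (-not (Test-Path $profile.CurrentUserAllHosts))
{
    New-Item -Path $profile.CurrentUserAllHosts -ItemType File -Force | Out-Null
}
notepad $profile.CurrentUserAllHosts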

The profile file is really just a .ps1 file. You can put any PowerShell you want in this file. Say you want to get a random Cat Fact every time you start a shell? (who am I to judge?)

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
#The API returns JSON, so let Invoke-RestMethod do the parsing for us
(Invoke-RestMethod -Uri 'https://catfact.ninja/fact' -Method Get).fact

There are some really useful things you can do now that you know you can run anything. For example – this will show the number of running PowerShell processes, along with WinRM service status:

write-host -ForegroundColor Green "PowerShell Processes: " (Get-process 'PowerShell').count
write-host -ForegroundColor Green "WinRM Status: " (Get-service 'winrm').status

And this will show the PowerShell module paths:

write-host -ForegroundColor Green "Module Paths:"
foreach ($module in ($env:PSModulePath).Split(";")){write-host -ForegroundColor Green "    "$module}

Notice here – I also added a simple “cd c:\blog” to my profile – really useful for starting straight in your scripts directory (no more “cd c:\workspaces\app\development\scripts….” every time you start a console).

But you can do even more! Full functions can be loaded into your profile and are available immediately. One of my favorites adds a Start-RDP function, so I can initiate a remote desktop session without ever touching the Start menu! How cool is this?

write-host -ForegroundColor Green "PowerShell Processes: " (Get-process 'PowerShell').count
write-host -ForegroundColor Green "WinRM Status: " (Get-service 'winrm').status
write-host -ForegroundColor Green "Module Paths:"
foreach ($module in ($env:PSModulePath).Split(";")){write-host -ForegroundColor Green "    "$module}
cd C:\Blog
function Start-RDP
{
    param  
    (  
        [Parameter(
            Position = 0,
            ValueFromPipeline=$true,
            Mandatory=$true
        )]
        [ValidateNotNullOrEmpty()]
        [string]
        $ServerName
    )
    mstsc /v:$ServerName 
}

There is also a special function you can put in your profile – the ‘prompt’ function. This will change the PowerShell command-line prompt to whatever you want! Just create a new function called ‘prompt’, write-host anything you want inside it, and make sure you put a ‘return ” “‘ at the end – it’s that simple! You can put some great data right on the prompt – for example, you can make the current time show up each time the prompt is shown. This is really useful for roughly measuring how long something takes to run without pulling out Measure-Object. Here is my prompt, along with the full profile:

function prompt
{
    $time = "("+(get-date).ToLongTimeString()+")"
    write-host -NoNewline -ForegroundColor Green "PS " (Get-Location).Path $time ">"
    return " "
}
write-host -ForegroundColor Green "PowerShell Processes: " (Get-process 'PowerShell').count
write-host -ForegroundColor Green "WinRM Status: " (Get-service 'winrm').status
write-host -ForegroundColor Green "Module Paths:"
foreach ($module in ($env:PSModulePath).Split(";")){write-host -ForegroundColor Green "    "$module}
cd C:\Blog
function Start-RDP
{
    param  
    (  
        [Parameter(
            Position = 0,
            ValueFromPipeline=$true,
            Mandatory=$true
        )]
        [ValidateNotNullOrEmpty()]
        [string]
        $ServerName
    )
    mstsc /v:$ServerName 
}

And the outcome:

It’s that simple! Customize your profile, and start being productive quicker! Leave a comment below to tell me what your favorite profile modifications are!!

Austin PowerShell User Group Survey Responses!

Last week, I sent out a survey where we were asking people questions about the Austin PowerShell User Group. Basically we are trying to find out what you all want when it comes to location, meeting times, etc… Well, here are the results!!

To begin with we had around 40 responses, which is excellent! Thank you all for responding!

The first question – “Would you be interested in attending an Austin PowerShell User Group meeting in the future?”. I know – I know – softball question. If someone is responding, then they are probably interested in attending. Not surprisingly, the percent that answered yes was 100%! That’s great!

When we asked what days worked best, a couple of clear winners emerged – Friday and Doesn’t Matter.

Next – How often should we meet? Again, a pretty clear winner – Once every other month took almost 50% of the votes!

Now we get to the fun questions – where should we meet, and for how long? First, here are the primary and secondary choices for location. North Austin and Round Rock/Pflugerville appear to be the leaders for the primary choice, with North Austin taking the bulk of the secondary choice!


When we asked how long each meeting should be, we got some great varied responses! All Day and Afternoon took two-thirds of the primary votes, while Afternoon and Morning dominated the secondary choice!


We know we can’t pick everyone’s preferred time and/or location, so next we asked how likely you would still be able to attend if the selection didn’t go your way. All in all, everyone seemed somewhat flexible!

Now on to the free-form text! Some great suggestions on venues:

Microsoft or Dell Campus
Employer
Microsoft, Member Facilities
Just happy to be aware of this
Dave & Busters, Alamo Drafthouse, MSFT Store (free)
Domain area
eBay (Daytime only), Microsoft (daytime only), User Group Member Businesses
Private companies to host
Yes 🙂
Microsoft Austin on Stonelake

I don’t know who the smart-ass was that said “Yes” with a smiley face, but I will find you 🙂

I should have known better than to ask for open comments, but here they are.

I think this is a great idea! It would be great to meet other PS developers in the area
I could participate more easily on days I could not attend if we had live feed or if the presentations were available on you tube or something
Maybe we could expand CTSMUG and devote a session to Powershell every time we meet – We could schedule it after lunch to allow for those attendees who cant take an entire day – Or whenever during the CTSMUG Day that makes the most sense. The technologies are complimentary and it benefits the CTSMUGers as much as the PUGers. Or barring that how about a Powershell Happy Hour post CTSMUG.
Newcomers’ meeting would be cool for a start.
Meetings during business hours opens up more options for locations because you don’t have to pay for extra security or host in a Retail/Food location which may be too noisy. I work for eBay and can easily host meetings (small or large) given enough advanced notice but it must be during the week, during business hours.
Ask for more volunteers to lead the group so we can spread the load.

Here is something nice – someone thanked me!

Thanks Donnie !

And then there’s Duncan McAlynn’s comment (Yes, I know it was you)

Donnie’s an asshole.

Thanks to everyone that took the survey (except Duncan) – we REALLY appreciate it! We will munch on this data, and send out invites shortly!

Run _Anything_ with Flow. PowerShell Triggers

Want to start PowerShell commands from a Tweet? Yeah you do, and you didn’t even know you wanted to.

Earlier this month, a great Flow of the Week was posted that highlighted the ability to use a .net filesystemwatcher to kick off local processes. This sparked an idea – I think we can expand on this and basically run anything we want. Here’s how:

First, let’s start with the Connected Gateway. The link above goes into a bit of detail on how to configure the connection. Nothing special there.
Second, on the Connected Gateway, run this PowerShell script:

$FileSystemWatcher = New-Object System.IO.FileSystemWatcher
$FileSystemWatcher.Path = "C:\temp\WatchMe"
$FileSystemWatcher.Filter = "Flow.txt"
$FileSystemWatcher.EnableRaisingEvents = $true

Register-ObjectEvent $FileSystemWatcher "Changed" -Action {
    # Grab the newest line appended by Flow and hand it to a fresh PowerShell process
    $content = Get-Content C:\temp\WatchMe\Flow.txt | Select-Object -Last 1
    powershell.exe $content
}

This script sets up a FileSystemWatcher on the C:\temp\WatchMe\Flow.txt file. As written, the watcher will only perform the action when the file is changed – you can also register for other events such as Created, Deleted, Renamed, or Error. When the event fires, the action looks at the last line of C:\temp\WatchMe\Flow.txt and launches a PowerShell process that takes that last line as the input.

Third – This is the best part. Since we have a FileSystemWatcher, and that watcher is reading the last line of the C:\temp\WatchMe\Flow.txt file and kicking that process off, all we have to do is append a line to that file to start a PowerShell session. Flow has a built-in connection for FileSystem. You can see where this is going. Create a new Flow, and add an input action – I am fond of the Outlook.com Email Arrives action. Supply a suitable trigger in the subject, and add the ‘Append File’ action from the FileSystem service. Here is how mine is configured:

The only catch with this particular setup is that the body of the email needs to be in plain text – Windows 10 Mail app, for example, will not send in plain text. The body of the mail is the PowerShell command we want to run. For example, maybe we want PowerShell to get a list of processes that have a certain name, and dump those to a text file for parsing later. Simply send an email that has the body of “get-process -name chrome|out-file c:\temp\ChromeProcesses.txt”. Here is what that results in:
Before we send the email:

The Email:

After a few minutes, a new folder appears:

The contents of the text file:

Handles  NPM(K)    PM(K)      WS(K)     CPU(s)     Id  SI ProcessName                                                  
-------  ------    -----      -----     ------     --  -- -----------                                                  
   1956      97   131588     191648     694.98    728   1 chrome                                                       
    249      22    34268      43356       1.63   4264   1 chrome                                                       
    381      81   307592     331312     145.16   6080   1 chrome                                                       
    149      12     2140      10076       0.05   7936   1 chrome                                                       
    632      86   277900     557484     974.00   9972   1 chrome                                                       
    431      31   147956     159404     182.11  10056   1 chrome                                                       
    219      12     2132       9608       0.08  11636   1 chrome                                                       
    283      50   135932     141512      98.05  12224   1 chrome                                                       
    396      54   133912     297432      18.58  12472   1 chrome                                                       
    253      46   107348     106752      50.13  13276   1 chrome                                                       
    381      48   114452     128836     242.89  14328   1 chrome                                                       

Think about what you could do with this – Perhaps you want to do an Invoke-WebRequest every time a RSS Feed updates. Maybe start a set of diagnostic commands when an item is added to Sharepoint. Kick off actions with a Tweet. If you want full scripts to run instead of commands, change the Action section of the FileSystemWatcher to “PowerShell.exe -file $content”. Easy as pie.
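For reference, the script-file variant of the watcher action would look roughly like this – assuming the line Flow appends now holds a path to a .ps1 file rather than a command:

Register-ObjectEvent $FileSystemWatcher "Changed" -Action {
    # The last line is expected to be a script path, e.g. C:\scripts\diag.ps1
    $content = Get-Content C:\temp\WatchMe\Flow.txt | Select-Object -Last 1
    powershell.exe -file $content
}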

PowerShell WSMAN Configuration for Massive Scale

In my day job, I constantly strive to push PowerShell to the limit, attempting to use absolutely every bit of processor/memory/network bandwidth available. One way I do this is with PoshRSJob, written by Boe Prox. PoshRSJob is a wonderful multi-threading tool, and I use it at pretty heavy scale – typically at a 100 thread throttle.

Sometimes, when you are running a lot of concurrent threads attaching to remote machines, you will run into WinRM connection limitations. They typically show up in error messages like this when you try commands like “invoke-command -computername remoteserver01”:
“This user is allowed a maximum number of 5 concurrent shells, which has been exceeded. “

Configuring typical WSMAN connection limits is fairly well documented, but I was running into another type of error. This error was occurring even after I had upped the connection limits:
“The maximum number of concurrent shells allowed for this plugin has been exceeded.”

This was driving me crazy, until I realized the slightly different wording. Browsing the WSMAN PSDrive, I was eventually able to solve it. The key word in the second error was “plugin”. I had to configure the limits on the plugin, not just the shell. After I realized the difference, I was able to find the right settings. I have compiled them here in a small script that will enable WinRM, and set the limits very high for both the shell and plugin.

enable-psremoting -force
# Shell-level quotas
cd WSMan:\localhost\Shell
set-item MaxConcurrentUsers 100
set-item MaxProcessesPerShell 10000
set-item MaxMemoryPerShellMB 1024
set-item MaxShellsPerUser 1000
# Plugin-level quotas - these were the missing piece
cd WSMan:\localhost\Plugin\microsoft.powershell\Quotas
set-item MaxConcurrentUsers 100
set-item MaxProcessesPerShell 10000
set-item MaxShells 1000
set-item MaxShellsPerUser 1000
restart-service winrm

Obviously test this before you deploy to production. I also found a neat one-liner to monitor the number of WSMan connections of a target system (set the $computername variable to the target, or use localhost):

while($true){(Get-WSManInstance -ComputerName $ComputerName -ResourceURI Shell -Enumerate).count;start-sleep 1} 
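If you want to verify the quotas before and after the change, the same WSMan drive will show you the current values:

# Dump the current shell and plugin quota values on the local machine
Get-ChildItem WSMan:\localhost\Shell | Select-Object Name, Value
Get-ChildItem WSMan:\localhost\Plugin\microsoft.powershell\Quotas | Select-Object Name, Value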

Azure Runbook for Posting to the OMS API

For MMSMOA 2017, I created an Azure Runbook that could post to the OMS API. Well, it’s more than a month later, but I finally got around to writing a post about it. I’m going to skip the basics of creating a runbook, but if you need a primer, I suggest starting here.

Let’s start with the runbook itself. Here is a decent template that I modified from the OMS API documentation. This template takes an input string, parses the string into 3 different fields, and sends those fields over to OMS. Here’s the runbook:

Param
(
    [Parameter (Mandatory= $true)]
    [string] $InputString
)



$CustomerID = Get-AutomationVariable -Name "CustomerID"
$SharedKey = Get-AutomationVariable -Name "SharedKey"
write-output $customerId
write-output $SharedKey
$date = (get-date).AddHours(-1)

# Specify the name of the record type that you'll be creating
$LogType = "MyRecordType"

# Specify a field with the created time for the records
$TimeStampField = "DateValue"

# Create the function to create the authorization signature
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource

    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)

    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $customerId,$encodedHash
    return $authorization
}


# Create the function to create and post the request
Function Post-OMSData($customerId, $sharedKey, $body, $logType)
{
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = Build-Signature `
        -customerId $customerId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"

    $headers = @{
        "Authorization" = $signature;
        "Log-Type" = $logType;
        "x-ms-date" = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }

    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    $WhatISent = "Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing"
    write-output $WhatISent
    return $response.StatusCode
}

# Submit the data to the API endpoint

$ComputerName = $InputString.split(';')[0]
$AlertName = $InputString.split(';')[1]
$AlertValue = $InputString.split(';')[2]

# Craft JSON
$json = @"
[{  "StringValue": "$AlertName",
    "Computer": "$computername",
    "NumberValue": "$AlertValue",
    "BooleanValue": true,
    "DateValue": "$date",
    "GUIDValue": "9909ED01-A74C-4874-8ABF-D2678E3AE23D"
}]
"@

Post-OMSData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $logType  

There are a couple of things to note with this runbook – first, the date it posts into OMS is offset for my timezone (Central Standard Time). If you want another timezone, adjust the $date = (get-date).AddHours(-1) line. Second, this script has output which you can remove. The output will only show up in the output section in Azure Automation, which makes it handy for troubleshooting. The third thing you might want to change is the $LogType = “MyRecordType” line. This is the name that OMS will give the log (with one caveat mentioned below).

So, create your runbook in Azure Automation, and give it a test. You will be prompted for the InputString. In my example here, I will use the input string of “Blog Test;Critical;This is a test of an Azure Runbook that calls the OMS HTTP API”
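You can also start it from PowerShell instead of the portal – a hedged sketch using the AzureRM module (the resource group, automation account, and runbook names are placeholders for your own):

# Assumes Add-AzureRmAccount has already been run in this session
Start-AzureRmAutomationRunbook -ResourceGroupName 'MyResourceGroup' -AutomationAccountName 'MyAutomationAccount' `
    -Name 'Post-ToOMS' -Parameters @{ InputString = 'Blog Test;Critical;This is a test of an Azure Runbook that calls the OMS HTTP API' }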

Give it a minute or so, and you are rewarded with this:

Notice the “_CL” at the end of my log name? Notice the “_S” at the end of the fields? OMS does that automatically – CL for custom log, S for string (or whatever data type you happen to pass).

There you have it – runbooks that post to OMS. Add a webhook to the Runbook, and call it from Flow. Send an email to an inbox, have Flow trigger the Runbook with some of the email data, and suddenly you have the ability to send emails and have that data appear in OMS.

PowerShell – Runspaces and Large Enterprises

You’ve got 40gb of log files, a broken app, and a CFO reminding you how much money the company is losing per minute. You’ve got to find that one error in one log that will clue you in on how to fix the issue. You have no idea where it is, but you know you have this issue in the bag. How? Because you have PowerShell. I’m going to show you how.

We’ve all been there – a task that has to be done across hundreds of systems, a search of thousands of files, pulling a property from tens of thousands of AD user accounts. PowerShell can do it, but there is always a need to shave those seconds off. In this article we examine how to perform these large actions in the quickest possible ways.

Asynchronous Processing

PowerShell has a couple of options when it comes to running tasks in a ‘multi-threaded’ fashion. The two you will primarily hear about are workflows and runspaces (jobs are another topic). Workflows are dead easy to set up, but can be picky about what they will and will not allow to happen in them. Workflows do have the nice feature of sequencing – being able to tell part of the workflow to run in sequence, then run other parts in parallel. Runspaces are more difficult to set up initially, but allow essentially any action you desire, and they allow for insane parallel processing. Personally, runspaces are always in my toolbox. It becomes a no-brainer when you combine them with some of the work that the PowerShell heavy-weights have done to make runspaces super easy – mainly Boe Prox and Warren F. These guys are serious rock-stars.

Invoke-Parallel – Your new best friend

When you absolutely, positively have to burn up those CPUs and flood the network, you need Invoke-Parallel. Seriously, download it now. I made that a link for a reason. Go get it. Using this beast, we can run multiple commands against 20,000 remote server nodes every evening. We can put thousands of SCOM nodes into maintenance mode in a matter of minutes, or search hundreds of directories with thousands of files in a matter of seconds. This is one function that will elevate your PowerShell game. It’s all built on runspaces, and has some amazing logic wrapped around it.

From Github you will get a .ps1 file. You can either pull the function out of that file and include it in your script, dot-source the whole .ps1 file (. “C:\temp\invoke-parallel.ps1”), or take the function and wrap it up in a module. That is my preferred method, since I wrap it up with other useful functions. Regardless of how you reference the function, calling it is easy. Here is a simple example:

. "c:\blog\Parallel\invoke-parallel.ps1"
$servers = 'server1','server2','server3'
Invoke-Parallel -InputObject $servers -ScriptBlock {Test-Connection $_ -Count 1 -Quiet}

This is pretty straightforward. I am dot-sourcing the ps1, generating an array that has 3 servers in it, and then sending that array to the Invoke-Parallel function as the InputObject parameter. This couldn’t be easier, and guess what? You just ‘multi-threaded’ a PowerShell script. Pat yourself on the back, and then buy Boe and Warren a drink the next time you see them.

Now, invoking these runspaces doesn’t come free – there is overhead and startup time associated with starting a runspace, and that can actually hurt your outcome. For example, if you have 20,000 log files that are relatively small (10MB or less) and need to do a “select-string -pattern ‘something'” across them, it might not be advantageous to use Invoke-Parallel. Let’s look at the time it takes to find an error in one of those log files with each method. In a previous blog post, I created a function to create a lot of log files with random data – I am using that here to create 20,000 log files of about 1MB in size. (Side note: I will later expand that function to take advantage of Invoke-Parallel.)

I have edited a random file and added this line somewhere in the middle:
2016-08-21–ERROR–TOO MANY FILES, IDIOT.
I have no idea which one I edited. That’s how dedicated I am to this cause. Now, let’s measure how long it takes to find this string both with and without Invoke-Parallel.
Without:

PS C:\blog\Parallel> Measure-Command{Get-ChildItem c:\Temp\blog -recurse | Select-String -pattern "ERROR"}


Days              : 0
Hours             : 0
Minutes           : 2
Seconds           : 11
Milliseconds      : 966
Ticks             : 1319662757
TotalDays         : 0.00152738745023148
TotalHours        : 0.0366572988055556
TotalMinutes      : 2.19943792833333
TotalSeconds      : 131.9662757
TotalMilliseconds : 131966.2757

With:

Measure-Command{
	. "c:\blog\Parallel\invoke-parallel.ps1"
	$files = Get-ChildItem c:\Temp\blog -Recurse
	Invoke-Parallel -InputObject $files -Throttle 8 -ScriptBlock { $_ | Select-String -Pattern 'ERROR' }
}

Days              : 0
Hours             : 0
Minutes           : 4
Seconds           : 1
Milliseconds      : 880
Ticks             : 2418807160
TotalDays         : 0.00279954532407407
TotalHours        : 0.0671890877777778
TotalMinutes      : 4.03134526666667
TotalSeconds      : 241.880716
TotalMilliseconds : 241880.716

Because the files are small, Select-String can process them faster than we can spin up new runspaces. But if we change the size of the files – say to something like 1GB – the difference is dramatic.
Without Invoke-Parallel:

PS C:\blog\Parallel> Measure-Command{Get-ChildItem c:\Temp\blog -recurse | Select-String -pattern "ERROR"}


Days              : 0
Hours             : 0
Minutes           : 5
Seconds           : 12
Milliseconds      : 368
Ticks             : 3123680771
TotalDays         : 0.00361537126273148
TotalHours        : 0.0867689103055556
TotalMinutes      : 5.20613461833333
TotalSeconds      : 312.3680771
TotalMilliseconds : 312368.0771

And with:

Days              : 0
Hours             : 0
Minutes           : 2
Seconds           : 15
Milliseconds      : 444
Ticks             : 1354442030
TotalDays         : 0.00156764123842593
TotalHours        : 0.0376233897222222
TotalMinutes      : 2.25740338333333
TotalSeconds      : 135.444203
TotalMilliseconds : 135444.203

It halved the time it took to process those files. Why? Because we could have multiple long-running Select-String operations going at once. Each individual Select-String is not CPU intensive – it just takes time. In this instance I could process 40GB of files in 2.25 minutes, whereas before I could only do 20GB in 4 minutes. This tells us that runspaces are great for commands or scripts that take a while to run but aren’t horribly CPU intensive.

All of this brings me to the title of this article – when you are dealing with an absolutely massive number of machines, or AD accounts, or large files – whatever the case may be – Invoke-Parallel should be in your toolbox. At my current job, I have 6 commands to run every night on around 18,000 servers. I run these through 8 jump servers – I pipe Invoke-Commands through a large Invoke-Parallel with a throttle of 80, and can finish the job in about 3 hours. Prior to using Invoke-Parallel, it was taking about 18 hours to complete. That is how you utilize a network.
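As a rough sketch of that nightly pattern (the server list path and the remote command are stand-ins, not my actual job):

. "c:\blog\Parallel\invoke-parallel.ps1"
$servers = Get-Content 'C:\jobs\servers.txt'   # hypothetical list of server names
Invoke-Parallel -InputObject $servers -Throttle 80 -ScriptBlock {
    # Each runspace owns one remote session; the remote scriptblock is a placeholder
    Invoke-Command -ComputerName $_ -ScriptBlock { Get-Service -Name 'winrm' }
}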

The main parameters we typically deal with when using Invoke-Parallel are InputObject, Throttle, ScriptBlock (or ScriptFile), and Timeout. The InputObject is the array the function works through – in essence, the function will open a runspace for each object in the array. It could be an array of servers, an array of users, or a list of files.

Throttle is how many runspaces you want running at the same time. Avoid the temptation to set this value too high – it can actually be detrimental if too many runspaces are vying for the same resources (CPU/memory/disk). A good rule of thumb for my environment is to limit it to the number of processors on the system running the task; if I am using multiple servers to run tasks, or if the tasks have extremely minimal requirements, I might set it higher. Timeout is how many seconds you want a runspace to run before it is killed – this is typically used to free up runspaces that have encountered a problem, such as hung commands. The last parameter – ScriptBlock (or ScriptFile) – is what you actually want to happen in the runspace. Take this example:

$sites = 'www.google.com','www.cnn.com','www.microsoft.com','www.reddit.com'
. "C:\blog\Parallel\invoke-parallel.ps1"
invoke-parallel -InputObject $sites -scriptblock {Test-Connection $_ -quiet}

In this case, the ScriptBlock is a simple Test-Connection. The $_ is the reference to the current object being processed by the runspace. In this example it was a single URL, but it can also be an object with properties, which you would access like any other property ($_.Name, $_.Size, etc…). Inside the scriptblock, there are two options for accessing variables that are declared outside of the scriptblock: you can either use the ‘$using:variable’ method, or specify the ‘-ImportVariables’ parameter on Invoke-Parallel. Along the same lines, if you want to use modules that are imported outside of the runspace, you can use the ‘-ImportModules’ parameter.

This example expands on the script block a bit, and shows how to use the -ImportVariables parameter:

$sites = 'www.google.com','www.cnn.com','www.microsoft.com','www.reddit.com', 'www.powershell.org','www.draith.com','www.bing.com','www.arstechnica.com','www.bbcnews.com'
$texttofind = 'PowerShell'
. "C:\blog\Parallel\invoke-parallel.ps1"
invoke-parallel -InputObject $sites  -ImportModules -ImportVariables -scriptblock {
    if (Test-Connection $_ -quiet -count 1)
    {
        try
        {
            $result = Invoke-WebRequest -Uri $_ -UseBasicParsing|select-string -Pattern $texttofind
            if ($result)
            {
                Write-output $_
            }
        }
        catch
        {
            Write-Verbose 'Error getting site data.'
        }
    }
}
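The same search expressed with ‘$using:’ instead of -ImportVariables would look roughly like this (a sketch of just the changed portion):

invoke-parallel -InputObject $sites -scriptblock {
    # $using: pulls $texttofind in from the parent scope at runspace creation
    Invoke-WebRequest -Uri $_ -UseBasicParsing | Select-String -Pattern $using:texttofind
}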

These are the basics of Invoke-Parallel. If you have any questions, feel free to leave a comment or ping me via email. In a future post we will go over jobs and how they compare to runspaces. See you then!

Again – special thanks to Boe Prox and Warren F. You guys make this stuff look easy.

‘Why Not?’ Series – PowerShell, IFTTT, and Smartthings

Ever wanted to turn your kitchen lights off from a command line?

OF COURSE YOU HAVE.

In another edition of my ‘Why Not?’ series, I explore how to bring the power of IFTTT and SmartThings to PowerShell.

SmartThings is a home automation suite that consists of a central hub, plus Z-Wave or Zigbee switches, outlets, light bulbs, smoke detectors, water sensors, etc… Thousands of devices exist that integrate with the SmartThings system, and it has an accessible API. I swear, this is not a paid advertisement – it just happens to be the system I use in my home automation. One night I was sitting at home wondering why I had to use an app or even my Harmony remote to turn off the lights in my man-cave. I was working on a PowerShell script, and that’s when it hit me – ‘Why Not?’ – why can’t I use PowerShell to turn off these lights?

Let’s assume you have SmartThings set up in your home already, and that you have signed into the https://graph.api.smartthings.com/ portal at least once. If that is done – we simply need to head over to IFTTT. If you need a primer on IFTTT, head here: https://ifttt.com/wtf. Yeah, I am NOT going to change that link – it’s awesome. If you already have an IFTTT account, sign in. If not, sign up. Once signed in, we will want to add a new channel. The channel we want to add, ironically enough, is called SmartThings. Click on the “Channels” link, search for SmartThings, and click on the icon (it should be the only one returned if you search correctly).

Now click on the GIANT connect button on the right hand side. It will take you to the SmartThings Api login page. Sign in, and you are greeted with this:
Pick the Hub/Location that you want to integrate with IFTTT. When you do, you are shown a list of devices connected to your SmartThings hub:
Select the devices you want to control, and press the “Authorize” button. You will be taken back to IFTTT. Don’t try to add any recipes yet – we need one more channel to make this work. In order to get IFTTT to trigger, we can use either the DO channel/app (which I am going to ignore in this demo), or we can use the Maker channel. Search for, and add the Maker channel just like we did for the SmartThings channel.
When you add the Maker Channel, a key is automatically generated for you. Keep this key handy – we will be using it soon. I am not showing you my key, cause I barely know you guys – and it is unique to my recipes.

Great – now what? Let’s add a recipe!! Click the “Create a New Recipe” button, and you are shown the typical IFTTT recipe builder page:
Click on the “this” portion, and search for/select the Maker channel:
Pick the “Receive Web Request” tile. We now need to name our trigger. These will be unique to each device, and unique to the function we are calling. For example, if I want to turn on and off my Man-Cave Lights, I need to specify two unique triggers. Leave out spaces in this name.
Create the trigger. BOOM – we are back to the equation:
Click the ‘THAT’ section. We are dropped back to the channel search page – this time it’s the search for the Action channel – type in and select ‘SmartThings’. You should see a list of all the fun things that SmartThings brings to IFTTT.
Choose the ‘Switch On’ tile. You can now pick the switch you want to interact with. In my example I will choose ‘Man Cave Lights’
One final step – click “Create Recipe”. Done! Now, create another new recipe, following the same steps, except name the Maker trigger something like ‘Man_Cave_Lights_Off’, and make sure you select the ‘Switch Off’ tile in the action section. You should now have 2 recipes – one for turning the lights on, another for turning the lights off. If you check the My Recipes section, you should have something along the lines of these two:

We are done in IFTTT for the moment – let’s head over to PowerShell.

This is pretty straightforward, actually. We need to craft a URL in this format: https://maker.ifttt.com/trigger/{event}/with/key/{key}. The {event} is the Maker event we specified when we created the recipe – the {key} is the unique key that was generated when we added the Maker channel. All we need to do is craft the URL and send the web request.

$MakerKey = 'abcdefghijklmnopqrstuvwxyz'
$BaseURL = 'https://maker.ifttt.com/trigger/'
$EndURL = "/with/key/$MakerKey"
$event = 'Man_Cave_Lights_On'
$url = $BaseURL+$event+$endurl
Invoke-WebRequest -uri $url -UseBasicParsing|Select-Object -Property content

Again, we use the -UseBasicParsing parameter to keep from firing up IE initial config. If we have done everything right, we should be greeted with something along the lines of:
Content
——-
Congratulations! You’ve fired the Man_Cave_Lights_On event

Want proof? Here you go!


Once you add the Maker Channel, it opens a world of possibility up when it comes to IFTTT. For example, I later added my Harmony remote as a channel and was able to turn my AV system on and off and perform a large number of automations from PowerShell. Oh, the trouble we will get into….
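If you end up with a pile of these triggers, a tiny wrapper keeps things tidy – a minimal sketch (the function name Send-IFTTTEvent is my own invention):

function Send-IFTTTEvent
{
    param
    (
        [Parameter(Mandatory=$true)]
        [string]$EventName,

        [Parameter(Mandatory=$true)]
        [string]$MakerKey
    )
    # Craft the Maker channel URL and fire the event
    $url = "https://maker.ifttt.com/trigger/$EventName/with/key/$MakerKey"
    Invoke-WebRequest -Uri $url -UseBasicParsing | Select-Object -Property Content
}

Send-IFTTTEvent -EventName 'Man_Cave_Lights_Off' -MakerKey 'abcdefghijklmnopqrstuvwxyz'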

PowerShell – Create a ton of random files filled with text

While working on an upcoming blog post, I found myself needing to create a lot of files filled with random data. I know there are plenty of tools out there that will create files, but the bulk of them create empty files – I needed them filled with a considerable amount of text data. So I did what any good PowerShell guy does – I built it myself using ideas from everyone else!

Download the function on GitHub!

Basically, we are using the old “Lorem Ipsum” block of text and filling each file until it reaches about the right size. In this example, I am not really concerned about getting the EXACT size. In fact, I included a ‘Variance’ parameter to let the size get a bit more random – setting this will make each file larger or smaller by up to the variance amount. Also included is a pair of parameters that handle sub-directory creation and the sub-directory creation chance. At this time it will only go one layer deep, but in future releases I will allow nested subdirectories.

Here is the help text:

<#
	.SYNOPSIS
		Creates a number of auto-generated files in the specified directories.
	
	.DESCRIPTION
		This function will supply any number of files approximately at the size you request.  Rather than setting a file size, this function actually fills the files with text until they reach the requested size.  You can control the number of files, size of files, and whether or not to create random sub-directories.
	
	.PARAMETER Path
		The top level path where the script will begin to create files.   C:\Temp, for example
	
	.PARAMETER FileTypes
		The types of files to create.  This parameter expects an array of values.  For example:  'log','log1','log2'
	
	.PARAMETER Size
		Size of each individual file.  This parameter is added to, or subtracted from, by the variance parameter.  This value is in MB.  Exact sizes will vary, and larger file sizes will lead to larger variance.
	
	.PARAMETER Variance
		This parameter is added to, or subtracted from, the Size parameter to get a semi-random file size.  Blocks of text are added to the files, so exact file sizes will be random within a range.
	
	.PARAMETER SubDirectories
		Specify if this script should create subdirectories under the directory specified by the Path parameter.  Directories will have random names similar to file names, and there is a random chance of them being created (specified by the SubDirectoryCreateChance parameter)
	
	.PARAMETER Count
		The number of files to create under the directory specified by the Path parameter
	
	.PARAMETER SubDirectoryCreateChance
		The chance that a subdirectory will be created.  Larger numbers here will actually make it less likely that a subdirectory is created.  A number of 20 means a 1-in-20 chance of creating a subdirectory.  A number of 100 means a 1-in-100 chance.
	
	.EXAMPLE
		PS C:\> Create-Files -Path c:\Temp\blog -FileTypes log, log1, log2, log3, log4, log5, log6, log7, log8 -Size 3 -Count 100

	.EXAMPLE
		PS C:\> Create-Files -Path c:\Temp\blog -FileTypes log, log1, log2, log3, log4, log5, log6, log7, log8 -Size 1 -Count 20000 -SubDirectories True -SubDirectoryCreateChance 50	
	
	.EXAMPLE
		PS C:\> Create-Files -Path c:\Temp\blog -FileTypes log, log1, log2, log3, log4, log5, log6, log7, log8 -Size 10 -Count 50 -Variance 4 -SubDirectories True -SubDirectoryCreateChance 10
	
#>

Let’s look at the examples:

Create-Files -Path c:\Temp\blog -FileTypes log, log1, log2, log3, log4, log5, log6, log7, log8 -Size 3 -Count 100

This example will create log files in the c:\temp\blog directory. It will create 100 files with a size of approximately 3MB. It will do so in only one directory.

Create-Files -Path c:\Temp\blog -FileTypes log, log1, log2, log3, log4, log5, log6, log7, log8 -Size 1 -Count 20000 -SubDirectories True -SubDirectoryCreateChance 50	

This example will create log files in the c:\temp\blog directory. It will create 20,000 files with a size of around 1MB. It has a 1-in-50 chance on each file of putting it in a sub-directory.

Create-Files -Path c:\Temp\blog -FileTypes log, log1, log2, log3, log4, log5, log6, log7, log8 -Size 10 -Count 50 -Variance 4 -SubDirectories True -SubDirectoryCreateChance 10

This final example will create log files in the c:\temp\blog directory. It will create 50 files of approximately 10MB, plus or minus 4MB. It has a 1-in-10 chance of creating subdirectories.
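To sanity-check the output afterwards, something like this will count the generated files and total their size:

# Count files under the target path and sum their size in MB
$files = Get-ChildItem -Path c:\Temp\blog -Recurse -File
$files.Count
($files | Measure-Object -Property Length -Sum).Sum / 1MB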

This fun little script is up on GitHub at the link above. Feel free to leave any comments or suggestions! I am already planning to add nested sub-directories, disk space checking, and multi-threading the creation to decrease generation time.

Thanks!

PowerShell on Linux – Try it Out Fast!

Unless you have been living under a rock, you probably heard that Microsoft open-sourced PowerShell and made it available on Linux/MacOS! Everyone wants to give this thing a try. Want to test it out quick? Here’s how:

Versions used in this example:
Windows 10 Build 10586.545 (Hyper-V Host)
CentOS 7
WinSCP 5.9.1

A quick assumption (I know – we all know what happens when we assume) – You already have Hyper-V up and running. If not, read this post by @adbertram about Essential PowerShell Cmdlets For Managing Hyper-V. It’s a really well put together article.

Start by downloading CentOS. The link above will take you to the download page – in this example I will use the DVD ISO. Since the PowerShell repo is still in Alpha, I am going to stick with CentOS. Open Hyper-V Manager, and create a new VM. Do this by right-clicking on your computer in the tree pane, and selecting New – Virtual Machine.


Click through the wizard, choosing a VM name, the generation (I used 2 for this demo), memory (8gb), the network (make sure it has internet access), where to store the VHD, and finally – under the installation options – select “Install an operating system from a bootable image file”. Point it to the CentOS iso you downloaded previously. Finish the wizard.

Before we start the VM we will need to disable Secure Boot, otherwise we won’t be able to load the DVD image. Right click on the new VM, click Settings, Security, and uncheck the box for Secure Boot.
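If you would rather skip the wizard, the same VM can be stood up from PowerShell – a rough sketch (the VM name, paths, switch name, and ISO file name are mine; adjust to your environment):

# Create a Gen 2 VM, attach the CentOS ISO, and disable Secure Boot
New-VM -Name 'CentOS7' -Generation 2 -MemoryStartupBytes 8GB -SwitchName 'External' `
    -NewVHDPath 'C:\VMs\CentOS7.vhdx' -NewVHDSizeBytes 60GB
Add-VMDvdDrive -VMName 'CentOS7' -Path 'C:\ISO\CentOS-7-x86_64-DVD.iso'
Set-VMFirmware -VMName 'CentOS7' -EnableSecureBoot Off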

Start and connect to the VM. When this menu appears, select ‘Install CentOS 7’.

Installing CentOS is pretty straightforward. Select your language for setup, and click Continue. On the next screen, you will see the “Installation Summary” screen, where you will probably see some items marked with the classic warning symbol. In my demo, it was the “Installation Destination” section – simply going into that section and clicking Done is enough to clear it. There is, however, one change we want to make before starting the installation. Click on the “Network & Host Name” section, set the hostname, and turn on eth0. We need this VM to be able to access the internet, which is difficult without an active network adapter.

Click Begin Installation. The install will begin to run; during this time you should go ahead and set the root password and create a non-root user (not required, but a good idea anyway). Setup doesn’t take long, so in a few minutes we are presented with the “Complete!” message. Click the Reboot button.

Within a minute or two, you should be presented with a CentOS Linux 7 login. Go ahead and log in with root.

If you don’t already have it installed, download and install WinSCP. We are going to need to get the RPM for PowerShell from Github, and move it to the CentOS VM. For this demo, we can get the RPM here. Once we have the RPM downloaded, launch WinSCP and connect to the CentOS VM. Depending on networking, it might be necessary to connect via IP address. In order to get the IP in this minimal CentOS install, we are going to need to use ‘ip addr’. Note the IP for eth0.

In WinSCP, connect to the VM using root, and copy the rpm to the /tmp directory.

Now the fun begins. Go back to the CentOS VM, and run the following command:

sudo yum -y install /tmp/powershell-6.0.0_alpha.9-1.el7.centos.x86_64.rpm

This command tells the VM to install the PowerShell RPM we downloaded, and the ‘-y’ tells yum to answer yes automatically, pulling in the two dependencies. When it’s done, you are rewarded with something like this:

Once this is done – we have PowerShell on this CentOS VM. That is not something I thought I would type anytime soon 🙂 Why don’t we see if it works? It’s basically another shell, so let’s start by typing ‘powershell’.


Take a moment to look at this – a Microsoft PowerShell prompt, running in a Linux VM. Want proof? Do a Get-Process, but look for something uniquely *nix.
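Something along these lines will do it (assuming bash is running, which it will be if you launched PowerShell from it):

# A process you will never see on a Windows box
Get-Process | Where-Object { $_.ProcessName -like '*bash*' }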

And there you go. Microsoft dropped the mic in a big way on this one. VERY well done.