A Couple of Log-Analyzing Functions

Heya all – here are a couple of quick functions to help analyze log files. Coming from a ConfigMgr/SCCM background, I got used to reading a LOT of logs. Having a couple of functions like this would have greatly helped!

First – let’s see if there are warning and/or error messages in a log (or a stack of logs):

function Analyze-LogContent {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [string]$LogFilePath,

        # -match is case-insensitive by default, so one casing covers ERROR/Error/error
        [string]$ErrorPattern = 'ERROR',
        [string]$WarningPattern = 'WARNING'
    )

    if (-not (Test-Path -Path $LogFilePath)) {
        Write-Error "Log file does not exist at the path: $LogFilePath"
        return
    }

    # Read the log file
    $logContent = Get-Content -Path $LogFilePath

    # Find matching lines (named to avoid shadowing the automatic $Error variable)
    $errorLines = $logContent | Where-Object { $_ -match $ErrorPattern }
    $warningLines = $logContent | Where-Object { $_ -match $WarningPattern }

    # Output analysis
    $output = @()
    if ($errorLines.Count -gt 0) {
        $output += "Found $($errorLines.Count) errors in the log."
    } else {
        $output += "No errors found in the log."
    }

    if ($warningLines.Count -gt 0) {
        $output += "Found $($warningLines.Count) warnings in the log."
    } else {
        $output += "No warnings found in the log."
    }

    return $output
}

# Example usage
$logPath = "C:\Path\To\Your\LogFile.log"
$result = Analyze-LogContent -LogFilePath $logPath
$result | ForEach-Object { Write-Host $_ }

Change the patterns as necessary – ERR, for example.
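Since the function takes one file at a time, here’s a minimal sketch for running it across a whole folder of logs (C:\Logs is just a placeholder path):

# Analyze every .log file in a folder (hypothetical path)
Get-ChildItem -Path 'C:\Logs' -Filter '*.log' | ForEach-Object {
    "=== $($_.Name) ==="
    Analyze-LogContent -LogFilePath $_.FullName
}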

The second function is pretty straightforward – summarize a log by counting the number of INFO, WARNING, and ERROR messages:

function Summarize-LogFile {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [string]$LogFilePath
    )

    if (-not (Test-Path -Path $LogFilePath)) {
        Write-Error "Log file does not exist at the path: $LogFilePath"
        return
    }

    $logContent = Get-Content -Path $LogFilePath

    $infoCount = 0
    $errorCount = 0
    $warningCount = 0

    foreach ($line in $logContent) {
        # Note: switch -Regex runs every branch that matches, so a line containing
        # both "ERROR" and "WARNING" increments both counters
        switch -Regex ($line) {
            "INFO" { $infoCount++ }
            "ERROR" { $errorCount++ }
            "WARNING" { $warningCount++ }
        }
    }

    $summary = @"
Log File Summary:
Info Entries: $infoCount
Error Entries: $errorCount
Warning Entries: $warningCount
Total Entries: $($logContent.Count)
"@

    return $summary
}

# Example usage
$logPath = "C:\Path\To\Your\LogFile.log"
$summary = Summarize-LogFile -LogFilePath $logPath
Write-Host $summary

There ya go! I will keep adding to these, and eventually get them into GitHub so you all can tell me how wrong they are 🙂

Happy Coding!

Creating Alert Rules in Azure with AZ PowerShell – Some Samples

Let’s go over a simple one – how to create various types of alert rules in Azure using the Az PowerShell module.

Each example targets a different aspect of Azure monitoring, though they don’t cover every alert type. Remember to tweak the parameters to match your environment.

Metric Alerts for Performance Monitoring

To keep an eye on Azure service metrics:

$criteria = New-AzMetricAlertRuleV2Criteria -MetricName 'Percentage CPU' -TimeAggregation Average -Operator GreaterThan -Threshold 80

Add-AzMetricAlertRuleV2 -Name 'HighCPUAlert' `
    -ResourceGroupName 'YourResourceGroupName' `
    -WindowSize 00:05:00 `
    -Frequency 00:01:00 `
    -TargetResourceId '/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/Microsoft.Compute/virtualMachines/yourVMName' `
    -Condition $criteria `
    -ActionGroupId '/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/microsoft.insights/actionGroups/yourActionGroupName' `
    -Severity 3 `
    -Description 'Alert on high CPU usage.'

Log Alerts for Custom Log Queries

For alerts based on Log Analytics queries (note: the scheduled query rule cmdlets have changed across Az.Monitor versions, so verify these parameter names against your installed version):

$query = "AzureActivity | where OperationName == 'Create or Update Virtual Machine' and ActivityStatus == 'Succeeded'"

Set-AzScheduledQueryRule -Name 'VMCreationAlert' `
    -ResourceGroupName 'YourResourceGroupName' `
    -Location 'East US' `
    -ActionGroup '/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/microsoft.insights/actionGroups/yourActionGroupName' `
    -ConditionQuery $query `
    -Description "VM creation alert" `
    -Enabled $true `
    -EvaluationFrequency 'PT5M' `
    -Severity 0 `
    -WindowSize 'PT5M'

Activity Log Alerts for Azure Resource Events

To monitor specific Azure service events:

$condition = New-AzActivityLogAlertCondition -Field 'category' -Equal 'Administrative'
$actionGroupId = "/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/microsoft.insights/actionGroups/yourActionGroupName"

Set-AzActivityLogAlert -Location 'Global' `
    -Name 'AdminActivityAlert' `
    -ResourceGroupName 'YourResourceGroupName' `
    -Scopes "/subscriptions/yourSubscriptionId" `
    -Condition $condition `
    -ActionGroupId $actionGroupId `
    -Description "Alert on administrative activities"

Application Insights Alerts for Application Performance

Track application performance with a simple Application Insights web test. (The web test alert cmdlets below vary by Az module version – verify the names with Get-Command before relying on them.)

$rule = New-AzApplicationInsightsWebTestAlertRule -Name 'AppPerfAlert' `
    -ResourceGroupName 'YourResourceGroupName' `
    -Location 'East US' `
    -WebTestId '/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/microsoft.insights/webtests/yourWebTestId' `
    -FailedLocationCount 3 `
    -WindowSize 'PT5M' `
    -Frequency 'PT1M' `
    -Criteria $criteria

Set-AzApplicationInsightsWebTestAlertRule -InputObject $rule
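Whichever rule type you create, it’s worth confirming it actually landed. A quick sketch for the metric alert example above (Get-AzMetricAlertRuleV2 is in Az.Monitor):

# List the metric alert rules in the resource group to confirm creation
Get-AzMetricAlertRuleV2 -ResourceGroupName 'YourResourceGroupName' | Select-Object Name, Severity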

Mastering PowerShell: Organizing Functions and Modules with Source Control

When your scripts evolve from one-off tasks to a library of tools, proper organization is key. Let’s explore how to structure your PowerShell functions and modules for maximum impact, and integrate source control to safeguard and collaborate on your code.

Elevating Scripts with Functions

At the heart of organized PowerShell scripting are functions. Functions allow you to encapsulate logic, making your scripts more readable, reusable, and maintainable. Here’s how to structure a function effectively:

function Get-DemoData {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory=$true)]
        [string]$Parameter1,

        [Parameter(Mandatory=$false)]
        [int]$Parameter2 = 10
    )

    Begin {
        # Initialization code here
    }

    Process {
        # Main function logic here
    }

    End {
        # Cleanup code here
    }
}

Best Practices:

  • CmdletBinding and Parameters: Use [CmdletBinding()] to make your function behave like a built-in cmdlet, including support for common parameters like -Verbose. Define parameters clearly, marking mandatory ones as such.
  • Verb-Noun Naming: Follow the PowerShell naming convention of Verb-Noun, making your function’s purpose immediately clear.
  • Comment-Based Help: Provide detailed help within your function using comment-based help. This makes your functions self-documenting and user-friendly – see the sketch below.
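For example, minimal comment-based help on the function above might look like this (a sketch – fill in the sections that matter for your tool):

function Get-DemoData {
    <#
    .SYNOPSIS
        Retrieves demo data.
    .DESCRIPTION
        Longer description of what Get-DemoData does and when to use it.
    .PARAMETER Parameter1
        Describes what Parameter1 controls.
    .EXAMPLE
        Get-DemoData -Parameter1 'Value'
    #>
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [string]$Parameter1
    )
    # Function logic here
}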

Scaling with Modules

As your collection of functions grows, modules become your best friend. A module is a package of functions, scripts, and variables that you can share and reuse across projects. Here’s a simple structure for a module:

# MyModule.psm1
function Get-DemoData {
    # Function definition here
}

function Set-DemoData {
    # Another function definition here
}

Export-ModuleMember -Function Get-DemoData, Set-DemoData

Module Manifests: For more complex modules, consider creating a module manifest (MyModule.psd1). This file defines metadata about your module, including version, author, and which functions to export.
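You can scaffold a manifest with the built-in New-ModuleManifest cmdlet – a minimal sketch (the path and metadata are placeholders):

# Generate a manifest next to the module file
New-ModuleManifest -Path .\MyModule.psd1 `
    -RootModule 'MyModule.psm1' `
    -ModuleVersion '1.0.0' `
    -Author 'Your Name' `
    -FunctionsToExport @('Get-DemoData', 'Set-DemoData')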

Integrating Source Control with Git

Source control is not just for developers; it’s essential for scripters too. Git, a version control system, helps you track changes, collaborate with others, and revert to earlier versions when needed.

  1. Initialize a Git Repository: Start by creating a new directory for your project, then initialize a Git repository.
git init

  2. Commit Your Functions and Modules: As you create or modify your functions and modules, add them to the repository and commit changes.

git add .
git commit -m "Add Get-DemoData function"

  3. Branching for New Features: When working on a new feature or major change, use branches to keep your main branch stable.

git checkout -b feature/new-feature

  4. Collaboration and Backup: Use online Git repositories like GitHub or Azure Repos for backup, collaboration, and leveraging CI/CD pipelines for automated testing and deployment – for example:
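(A quick sketch – the remote URL and branch name below are placeholders.)

git remote add origin https://github.com/yourname/powershell-tools.git
git push -u origin main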

Happy scripting!

Quick Code – Send Events to an Event Hub with PowerShell

Here is another quick one – let’s send events to an event hub with PowerShell!

function New-SasToken {
    param(
        [string]$ResourceUri,
        [string]$Key,
        [string]$PolicyName,
        [timespan]$TokenTimeToLive
    )

    # Needed for [System.Web.HttpUtility] on Windows PowerShell 5.1 (harmless on PowerShell 7+)
    Add-Type -AssemblyName System.Web

    # Token expiry as a Unix timestamp
    $Expires = [DateTimeOffset]::Now.Add($TokenTimeToLive).ToUnixTimeSeconds()

    # Sign "<url-encoded resource URI>`n<expiry>" with the shared access key
    $StringToSign = [System.Web.HttpUtility]::UrlEncode($ResourceUri) + "`n" + $Expires
    $HMACSHA256 = New-Object System.Security.Cryptography.HMACSHA256
    $HMACSHA256.Key = [Text.Encoding]::UTF8.GetBytes($Key)
    $Signature = $HMACSHA256.ComputeHash([Text.Encoding]::UTF8.GetBytes($StringToSign))
    $Signature = [Convert]::ToBase64String($Signature)

    # Assemble the SharedAccessSignature token
    $Token = "SharedAccessSignature sr=" + [System.Web.HttpUtility]::UrlEncode($ResourceUri) + "&sig=" + [System.Web.HttpUtility]::UrlEncode($Signature) + "&se=" + $Expires + "&skn=" + $PolicyName
    return $Token
}

# Event Hub parameters
$namespace = "yourNamespace"
$eventHubName = "yourEventHubName"
$sharedAccessKeyName = "yourSharedAccessKeyName"
$sharedAccessKey = "yourSharedAccessKey"
$endpoint = "https://$namespace.servicebus.windows.net/$eventHubName/messages"
$tokenTimeToLive = New-TimeSpan -Minutes 60

# Generate SAS token
$sasToken = New-SasToken -ResourceUri $endpoint -Key $sharedAccessKey -PolicyName $sharedAccessKeyName -TokenTimeToLive $tokenTimeToLive

# Event data
$body = @"
{
    "Data": "Sample Event Data"
}
"@

# Send the event
$headers = @{
    "Authorization" = $sasToken
    "Content-Type" = "application/json"
}

try {
    $response = Invoke-RestMethod -Uri $endpoint -Method Post -Body $body -Headers $headers
    Write-Output "Event sent successfully"
}
catch {
    Write-Error "Failed to send event: $_"
}
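If you want to fire off a quick burst of test events, the same call works in a loop – a minimal sketch reusing $endpoint and $headers from above:

# Send five sample events using the same SAS token and headers
1..5 | ForEach-Object {
    $eventBody = @{ Data = "Sample event $_" } | ConvertTo-Json
    Invoke-RestMethod -Uri $endpoint -Method Post -Body $eventBody -Headers $headers
}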

Azure Inventory Management with PowerShell

Listen – creating resources in Azure with PowerShell is easy – but actually knowing what you have deployed is something else. Let’s dive into the steps to harness the power of PowerShell for a streamlined Azure inventory process.

Prerequisites

Before we embark on this journey, ensure you have:

  • An Azure account with necessary access permissions.
  • PowerShell and the Azure PowerShell module ready on your machine.

Configuring PowerShell for Azure

Connecting to Azure is the first step. Open your PowerShell window and enter these commands – the Out-GridView step lets you pick your subscription context interactively.

# Connect to Azure with interactive login
Connect-AzAccount

# List subscriptions and select one interactively
Get-AzSubscription | Out-GridView -PassThru | Set-AzContext

Let’s go ahead and start looking at your resources:

# List all resources and export to CSV
Get-AzResource | Select-Object ResourceType, Name, Location | Export-Csv -Path ./AllResources.csv -NoTypeInformation

# VM Inventory: List VMs and export their details (VmSize needs a calculated property)
Get-AzVM | Select-Object Name, Location, @{Name='VmSize';Expression={$_.HardwareProfile.VmSize}} | Export-Csv -Path ./VMInventory.csv -NoTypeInformation

# Storage Accounts: List accounts and export their details (SkuName via a calculated property)
Get-AzStorageAccount | Select-Object StorageAccountName, Location, @{Name='SkuName';Expression={$_.Sku.Name}} | Export-Csv -Path ./StorageAccounts.csv -NoTypeInformation

# Network Resources: List VNets and export their details (address space flattened to a string)
Get-AzVirtualNetwork | Select-Object Name, Location, @{Name='AddressSpace';Expression={$_.AddressSpace.AddressPrefixes -join ';'}} | Export-Csv -Path ./VNetInventory.csv -NoTypeInformation

In the scripts above, each command not only fetches the necessary details but also exports them to a CSV file for easy access and reporting.
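A quick rollup is often handy too – for example, counting resources by type before digging into the CSVs:

# Count resources by type, most common first
Get-AzResource | Group-Object ResourceType | Sort-Object Count -Descending | Select-Object Count, Name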

Advanced Techniques

Organizing and managing your resources effectively can further be achieved by using tags.

# Organizing resources with Tags: Filter by tag and export
Get-AzResource -Tag @{ Department="Finance"} | Select-Object Name, ResourceType | Export-Csv -Path ./FinanceResources.csv -NoTypeInformation

For more insights and advanced techniques, visit the Azure PowerShell documentation. Here’s to efficient management of your Azure resources. Happy scripting!

Setting Up and Accessing Azure Cognitive Services with PowerShell

Alright folks – we’re going to dive into how you can leverage Azure Cognitive Services with PowerShell to not only set up AI services but also to interact with them. Let’s go!

Prerequisites

Before we begin, ensure you have the following:

  • An Azure subscription.
  • PowerShell 7.x or higher installed on your system.
  • Azure PowerShell module. Install it by running Install-Module -Name Az -AllowClobber in your PowerShell session.

Use Connect-AzAccount to get into your subscription, then run this to create a new resource group and Cognitive Services resource:

$resourceGroupName = "<YourResourceGroupName>"
$location = "EastUS"
$cognitiveServicesName = "<YourCognitiveServicesName>"

# Create a resource group if you haven't already
New-AzResourceGroup -Name $resourceGroupName -Location $location

# Create Cognitive Services account
New-AzCognitiveServicesAccount -Name $cognitiveServicesName -ResourceGroupName $resourceGroupName -Type "CognitiveServices" -Location $location -SkuName "S0"

It’s that simple!

To interact with Cognitive Services, you’ll need the access keys. Retrieve them with:

$key = (Get-AzCognitiveServicesAccountKey -ResourceGroupName $resourceGroupName -Name $cognitiveServicesName).Key1
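While you’re at it, you can pull the account’s endpoint the same way instead of hand-building the URL later (Endpoint is a property on the account object):

$endpoint = (Get-AzCognitiveServicesAccount -ResourceGroupName $resourceGroupName -Name $cognitiveServicesName).Endpoint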

With your Cognitive Services resource set up and your access keys in hand, you can now interact with various cognitive services. Let’s explore a couple of examples:

Text Analytics

To analyze text for sentiment, language, or key phrases, you’ll use the Text Analytics API. Here’s a basic example to detect the language of a given text:

$text = "Hello, world!"
$uri = "https://<YourCognitiveServicesName>.cognitiveservices.azure.com/text/analytics/v3.1/languages"

$body = @{
    documents = @(
        @{
            id = "1"
            text = $text
        }
    )
} | ConvertTo-Json

$response = Invoke-RestMethod -Uri $uri -Method Post -Body $body -Headers @{
    "Ocp-Apim-Subscription-Key" = $key
    "Content-Type" = "application/json"
}

$response.documents.detectedLanguage | Format-Table -Property name, confidenceScore

So this code will try to determine the language of the submitted text. The output might look like this:

Name           ConfidenceScore
----           ---------------
English        0.99
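Sentiment analysis follows the same pattern – same headers and body, different endpoint and response property. A hedged sketch against the v3.1 sentiment endpoint (response shape assumed from the v3.1 REST reference):

$uri = "https://<YourCognitiveServicesName>.cognitiveservices.azure.com/text/analytics/v3.1/sentiment"

$response = Invoke-RestMethod -Uri $uri -Method Post -Body $body -Headers @{
    "Ocp-Apim-Subscription-Key" = $key
    "Content-Type" = "application/json"
}

# Overall sentiment label (positive/neutral/negative) per document
$response.documents | Format-Table -Property id, sentiment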

Let’s try computer vision now:

Computer Vision

Azure’s Computer Vision service can analyze images and extract information about visual content. Here’s how you can use PowerShell to send an image to the Computer Vision API for analysis:

$imageUrl = "<YourImageUrl>"
$uri = "https://<YourCognitiveServicesName>.cognitiveservices.azure.com/vision/v3.1/analyze?visualFeatures=Description"

$body = @{
    url = $imageUrl
} | ConvertTo-Json

$response = Invoke-RestMethod -Uri $uri -Method Post -Body $body -Headers @{
    "Ocp-Apim-Subscription-Key" = $key
    "Content-Type" = "application/json"
}

$response.description.captions | Format-Table -Property text, confidence

This code is trying to describe the image, so the output might look like this:

Text                                                      Confidence
----                                                      ----------
A scenic view of a mountain range under a clear blue sky        0.98

To learn more about Cognitive Services – check out the Docs!

Adding Azure Alert Rules with PowerShell 7

Here is a quick and dirty post to create both Metric and Scheduled Query alert rule types in Azure:

To create an Azure alert rule using PowerShell 7 and the Az module, you will need to have both installed.

PowerShell 7: https://docs.microsoft.com/en-us/powershell/scripting/install/installing-powershell?view=powershell-7.1

To install the Az module, run the following command in PowerShell:

Install-Module -Name Az

Once both PowerShell 7 and the Az module are installed, you can use the following commands to create an Azure alert rule.

To create a metric alert rule (note: this snippet uses the older classic alert cmdlets, which are deprecated in favor of the V2 cmdlets shown earlier – verify the cmdlet and parameter names against your Az.Monitor version):

$alertRule = New-AzMetricAlertRule `
    -ResourceGroupName "MyResourceGroup" `
    -RuleName "MyMetricAlertRule" `
    -TargetResourceId "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Compute/virtualMachines/{vm-name}" `
    -MetricName "Percentage CPU" `
    -Operator GreaterThan `
    -Threshold 90 `
    -WindowSize 30 `
    -TimeAggregationOperator Average

And to create a scheduled query rule (same caveat – in current Az.Monitor versions this is handled by New-AzScheduledQueryRule, so verify the cmdlet name before use):

$alertRule = New-AzLogAlertRule `
    -ResourceGroupName "MyResourceGroup" `
    -RuleName "MyScheduledQueryAlertRule" `
    -Location "East US" `
    -Condition "AzureMetrics | where ResourceProvider == 'Microsoft.Compute' and ResourceType == 'virtualMachines' and MetricName == 'Percentage CPU' and TimeGrain == 'PT1H' and TimeGenerated > ago(2h) | summarize AggregateValue=avg(MetricValue) by bin(TimeGenerated, 1h), ResourceId" `
    -ActionGroupId "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Insights/actionGroups/{action-group-name}"

You will have to replace the main bits you would expect – resource group name, subscription ID, action group name, location, and so on.

Hope this helps!

Azure Monitor Action Group Source IP Addresses

One of the hurdles a company might run into when moving to Azure, especially with Azure Monitor and Log Analytics, is integration with Action Groups. Rarely are the actions of SMS or email good enough to integrate with an internal service management system, so that leaves webhooks as the simplest way to get data back into the datacenter. But that leaves one problem – firewalls.

The Azure backend can change at any time, so it’s important to know what IP addresses the Action Group traffic can originate from – and we can do that with Get-AzNetworkServiceTag.

To start, let’s install the Az.Network module. I am specifying AllowClobber and Force, to both update dependencies and upgrade existing modules. Make sure you are running your pwsh window or VS Code as an administrator.

Install-Module Az.Network -AllowClobber -Force

If you aren’t already connected, do a quick Connect-AzAccount. It might prompt you for a username/password, and will default to a subscription, so specify an alternative if necessary.

Once connected, we can run the Get-AzNetworkServiceTag cmdlet, specifying the region we want the information for – in this case, EastUS2.

Get-AzNetworkServiceTag -Location eastus2

The ‘Values’ property is what we are after. It contains a massive number of resource types, regions, and address ranges. In our case, let’s just get the Action Groups, specifically for EastUS2.

(Get-AzNetworkServiceTag -Location eastus2).Values | Where-Object { $_.Name -eq 'ActionGroup.EastUS2' }
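If you just want the raw ranges for the firewall team, expand them out – on the Az.Network versions I have used, the ranges live under Properties.AddressPrefixes (verify on yours):

# Pull just the address prefixes for the EastUS2 Action Group tag
(Get-AzNetworkServiceTag -Location eastus2).Values |
    Where-Object { $_.Name -eq 'ActionGroup.EastUS2' } |
    Select-Object -ExpandProperty Properties |
    Select-Object -ExpandProperty AddressPrefixes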

And there we go! A list of 3 ranges. We can now work on our firewall rules and make sure the webhook can get back into the datacenter.

Why not? Pathfinder 2e API and PowerShell

In another of my “Why Not?” series – a category of posts that don’t actually set out to show a widespread use, but rather just highlight something cool I found, I present just how much of a nerd I am.

I love tabletop RPG games – DnD, Pathfinder, Shadowrun, Call of Cthulhu, Earthdawn, etc. You name it, I have probably sat around a table playing it with a group of friends. Our new favorite game to play recently – Pathfinder 2e.

Recently I found something very cool – a really good PF2e character builder/manager called Wanderer’s Guide. I don’t have any contact with the author at all – I am just a fan of the software.

A bit of background before I continue – I have been looking for an API to find raw PF2e data for quite a while. An app might be in the future, but I simply couldn’t find the data that I wanted. The Archive would be awesome to get API access to, but it’s not to be yet.

After moping around for a couple of months, I found this, and a choir of angels began to sing. Wanderer’s Guide has an API, and it is simply awesome. Grab your free API key, and start to follow along.

We are going to make a fairly standard API call first. Let’s craft the header with the API key you get from your profile. This is pretty straightforward:

$ApiKey = "<Put your Wanderer's Guide API Key here>"
$header = @{"Authorization" = "$apikey" }

Next, let’s look at the endpoints we want to access. Each call will access a category of PF2e data – spells, feats, backgrounds, ancestries, heritages. This lists the categories of data available.

$baseurl = 'https://wanderersguide.app/api'
$endpoints = 'spell', 'feat', 'background', 'ancestry', 'heritage'

Now we are going to iterate through each endpoint and make the call to retrieve the data. But – since Wanderer’s Guide is nice enough to provide the API for free, we aren’t going to be jerks and constantly pull the full list of data each time we run the script. We want to only pull the data once (per session), so we will check to see if we have already done it.

foreach ($endpoint in $endpoints) {
    if (-not (Test-Path "variable:${endpoint}data")) {
        "Fetching $endpoint data from $baseurl/$endpoint/all"
        New-Variable -Name ($endpoint + 'data') -Force -Value (Invoke-WebRequest -Uri "$baseurl/$endpoint/all" -Headers $header)
    }
}

The trick here is the New-Variable cmdlet – it lets us create a variable with a dynamic name while simultaneously filling it with the web request data. We can check whether the variable already exists with the Test-Path cmdlet against the variable: drive.
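If that trick is new to you, here’s a tiny standalone demo (the variable name is made up):

# Create $demodata only if it doesn't already exist, then show it
$name = 'demo'
if (-not (Test-Path "variable:${name}data")) {
    New-Variable -Name "${name}data" -Value (Get-Date) -Force
}
Get-Variable "${name}data"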

Once we have the data, we need to do some simple parsing. Most of it is pretty straightforward – just convert it from JSON and pick the right property – but a couple of the endpoints need a bit more massaging. Heritages and Backgrounds, specifically.

Here is the full script – it’s really handy to use Out-GridView to parse the data. For example, do you want a background that gives training in Athletics? Just pull up the background grid and filter away!

$ApiKey = "<Put your Wanderer's Guide API Key here>"
$header = @{"Authorization" = "$apikey" }
$baseurl = 'https://wanderersguide.app/api'
$endpoints = 'spell', 'feat', 'background', 'ancestry', 'heritage'

foreach ($endpoint in $endpoints) {
    if (-not (Test-Path "variable:${endpoint}data")) {
        "Fetching $endpoint data from $baseurl/$endpoint/all"
        New-Variable -Name ($endpoint + 'data') -Force -Value (Invoke-WebRequest -Uri "$baseurl/$endpoint/all" -Headers $header)
    }
    # Parse the cached response and pop each category into a grid view
    switch ($endpoint) {
        'spell'      { $spells = ($spelldata.Content | ConvertFrom-Json).psobject.Properties.Value.spell; $spells | Out-GridView }
        'feat'       { $feats = ($featdata.Content | ConvertFrom-Json).psobject.Properties.Value.feat; $feats | Out-GridView }
        'ancestry'   { $ancestries = ($ancestrydata.Content | ConvertFrom-Json).psobject.Properties.Value.ancestry; $ancestries | Out-GridView }
        'background' { $backgrounds = ($backgrounddata.Content | ConvertFrom-Json).psobject.Properties | Where-Object { $_.Name -eq 'syncroot' } | Select-Object -ExpandProperty Value; $backgrounds | Out-GridView }
        'heritage'   { $heritages = ($heritagedata.Content | ConvertFrom-Json).psobject.Properties | Where-Object { $_.Name -eq 'syncroot' } | Select-Object -ExpandProperty Value; $heritages | Out-GridView }
        default      { "Instruction set not defined." }
    }
}

Enjoy!

EASY PowerShell API Endpoint with FluentD

One of the biggest problems that I have had with PowerShell is that it’s just too good. I want to use it for everything. Need to perform automation based on monitoring events? Pwsh. Want to update rows in a database when someone clicks a link on a webpage? Pwsh. Want to automate annoying your friends with the push of a button on your phone? Pwsh. I use it for everything. There is just one problem.

I am lazy and impatient. Ok, that’s two problems. Maybe counting is a third.

I want things to happen instantly. I don’t want to schedule something in Task Scheduler. I don’t want to have to run a script manually. I want an API-like endpoint that will allow me to trigger my shenanigans immediately.

Oh, and I want it to be simple.

Enter FluentD. What is FluentD, you ask? From the website – “Fluentd allows you to unify data collection and consumption for a better use and understanding of data.” I don’t necessarily agree with this statement, though – I believe it’s so much more than that. I view it more like an integration engine with a wide community of plug-ins that lets you tie together a huge variety of toolsets. It’s simple, lightweight, and quick. It doesn’t consume a ton of resources sitting in the background, either. You can run it on a ton of different platforms too – *nix, Windows, Docker, etc. There is even a slim version for edge devices – IoT or small containers. And I can run it all on-prem if I want.

What makes it so nice to use with PowerShell is that I can have a web API endpoint stood up in seconds that will trigger my PowerShell scripts. Literally – it’s amazingly simple. A simple config file like this is all it takes:

<source>
  @type http
  port 9880
</source>

Boom – you have a listener on port 9880 ready to accept data. If you want to run a PowerShell script from the data it receives, just expand your config file a little.

<source>
  @type http
  port 9880
</source>

#Outputs
<match **>
  @type exec
  command "e:/tasks/pwsh/pwsh.exe  -file e:/tasks/pwsh/events/start-annoyingpeople.ps1"
  <format>
    @type json
  </format>
  <buffer>
    flush_interval 2s
  </buffer>
</match>

With this config file you are telling FluentD to listen on port 9880 for traffic (the URL path you post to – e.g. http://localhost:9880/automation – becomes the event tag). If it sees a JSON payload (POST request) on that port, it will execute the command specified – in this case, my script to amuse me and annoy my friends. All I have to do is run this as a service on my Windows box (or a process on *nix, of course) and I have a fully functioning PowerShell-executing web API endpoint.
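Testing it from PowerShell is a one-liner (localhost and the ‘automation’ tag are just my example values):

# Post a JSON payload to the FluentD HTTP input; the URL path becomes the event tag
Invoke-RestMethod -Uri 'http://localhost:9880/automation' -Method Post -ContentType 'application/json' -Body (@{ user = 'buddy'; action = 'annoy' } | ConvertTo-Json)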

It doesn’t have to just be web, either. They have over 800 plug-ins for input and output channels. Want SNMP traps to trigger your scripts? You can do it. How about an entry in a log starting your PowerShell fun? Sure! Seriously – take a look at FluentD and how it can up your PowerShell game immensely.