Quick Dive: Integrating Logic Apps with Azure OpenAI

Let’s cut to the chase: Integrating Azure Logic Apps with Azure OpenAI unlocks a plethora of possibilities, from automating content creation to enhancing data analysis. Below is a step-by-step guide to melding these powerful tools.

Step 1: Set Up Azure OpenAI

First, you need an Azure OpenAI service instance. Go to the Azure Portal, search for Azure OpenAI Service, and create a new instance. Once deployed, grab your API key and endpoint URL from the Keys and Endpoint page under Resource Management.

Step 2: Create Your Logic App

Navigate back to the Azure Portal and create a new Logic App:

  • Choose your subscription and resource group.
  • Pick a region close to you for lower latency.
  • Name your Logic App.
  • Click “Review + create” and then “Create” after validation passes.

Step 3: Design Your Logic App Workflow

Once your Logic App is ready, it’s time to design the workflow:

  • Open your Logic App in the Azure Portal and go to the Logic App Designer.
  • Start with a common trigger like “When an HTTP request is received” if you want your Logic App to act based on external requests.
  • Add a new step by searching for “HTTP” in the actions list and choose the “HTTP – HTTP” action. This will be used to call the Azure OpenAI API.

Step 4: Configure the HTTP Action for Azure OpenAI

  • Method: POST
  • URI: Enter the full completions endpoint for your Azure OpenAI deployment. It includes the deployment name and an API version, e.g. https://<your-resource>.openai.azure.com/openai/deployments/<your-deployment>/completions?api-version=2023-05-15
  • Headers: Add two headers:
    • Content-Type with the value application/json
    • api-key with the value <Your Azure OpenAI API Key> (Azure OpenAI expects the key in an api-key header; an Authorization: Bearer header is only used with Microsoft Entra ID tokens)
  • Body: Craft the JSON payload according to your task. For example, to generate text, your body might look like this:
{
  "prompt": "Write a brief about integrating Azure OpenAI with Logic Apps.",
  "temperature": 0.7,
  "max_tokens": 100
}

Step 5: Process the Response

After calling the Azure OpenAI API, you’ll want to handle the response:

  • Add a “Parse JSON” action to interpret the API response.
  • In the “Content” box, select the body of the HTTP action.
  • Define the schema based on the Azure OpenAI response format. For text generation, you’ll focus on extracting the generated text from the response – see the sample schema below.
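For the completions-style call above, a minimal Parse JSON schema might look like this sketch. It assumes the response carries a choices array with a text field (the shape of the completions API); trim or extend it to match the payload you actually receive:

{
  "type": "object",
  "properties": {
    "choices": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "text": { "type": "string" }
        }
      }
    }
  }
}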

Step 6: Add Final Actions

Decide what to do with the Azure OpenAI response. You could:

  • Send an email with the generated content.
  • Save the response to a database or a file in Azure Blob Storage.
  • Respond to the initial HTTP request with the generated content (see the expression sketch after this list).
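For that last option, a Response action can pull the generated text out of the parsed payload with an expression along these lines (assuming the parse step is named Parse_JSON – adjust to your action’s actual name):

@{body('Parse_JSON')?['choices'][0]?['text']}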

Step 7: Test Your Logic App

  • Save your Logic App and run a test by triggering it based on your chosen trigger method.
  • Monitor the run in the “Overview” section of your Logic App to ensure everything executes as expected.

Deploy Logic Apps with PowerShell

This post is basically just a way to refresh my memory for when, three months from now, I completely forget how easy this is. Here’s how you can leverage PowerShell to manage your Logic Apps and their connections more effectively.

# Define variables
$resourceGroupName = 'YourResourceGroup'
$logicAppName = 'YourLogicAppName'
$templateFilePath = 'path/to/your/template.json'
$parametersFilePath = 'path/to/your/parameters.json'

# Deploy the Logic App
New-AzResourceGroupDeployment -Name DeployLogicApp `
  -ResourceGroupName $resourceGroupName `
  -TemplateFile $templateFilePath `
  -TemplateParameterFile $parametersFilePath

If you need a template or parameters example, check the end of this post!

Managing Logic App Connections with PowerShell

PowerShell can also simplify the creation and management of Logic App connections, making it easier to connect to services like Office 365 or custom APIs:

# Creating a connection to Office 365
$connectionName = 'office365Connection'
$subscriptionId = '<YourSubscriptionId>'

# Microsoft.Web/connections resources expect an 'api' reference plus parameterValues
$connectionProps = @{
    displayName = 'Office 365 Connection'
    api = @{
        id = "/subscriptions/$subscriptionId/providers/Microsoft.Web/locations/eastus/managedApis/office365"
    }
    parameterValues = @{
        'token:TenantId'     = '<YourTenantId>'
        'token:PrincipalId'  = '<YourPrincipalId>'
        'token:ClientSecret' = '<YourClientSecret>'
    }
}

New-AzResource -ResourceType 'Microsoft.Web/connections' -ResourceName $connectionName `
  -ResourceGroupName $resourceGroupName -Location 'eastus' `
  -Properties $connectionProps -Force

Sample Template and Parameters JSON Files:

Template:

{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2019-05-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('location')]",
      "properties": {
        "state": "Enabled",
        "definition": {
          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
          "contentVersion": "1.0.0.0",
          "triggers": {
            "When_a_HTTP_request_is_received": {
              "type": "Request",
              "kind": "Http",
              "inputs": {
                "method": "POST",
                "schema": {}
              }
            }
          },
          "actions": {
            "Send_an_email": {
              "type": "ApiConnection",
              "inputs": {
                "host": {
                  "connection": {
                    "name": "@parameters('$connections')['office365']['connectionId']"
                  }
                },
                "method": "post",
                "body": {
                  "Subject": "Email Subject Here",
                  "Body": "<p>Email Body Here</p>",
                  "To": "example@example.com"
                },
                "path": "/Mail"
              }
            }
          },
          "outputs": {}
        },
        "parameters": {
          "$connections": {
            "defaultValue": {},
            "type": "Object"
          }
        }
      }
    }
  ],
  "parameters": {
    "logicAppName": {
      "defaultValue": "YourLogicAppName",
      "type": "String"
    },
    "location": {
      "defaultValue": "eastus",
      "type": "String"
    }
  }
}

Parameters:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "YourLogicAppName"
    },
    "location": {
      "value": "eastus"
    }
  }
}

Automating Azure Service Health Alerts with PowerShell

Hello, Azure amigos! Today we’re diving into the depths of automating Azure Service Health alerts using PowerShell.

What’s Azure Service Health Anyway?

Azure Service Health provides personalized alerts and guidance when Azure service issues affect you. It breaks down into three main types of alerts:

  • Service issues: Problems in Azure services that affect you right now.
  • Planned maintenance: Upcoming maintenance that can affect your services in the future.
  • Health advisories: Issues that require your attention but don’t directly impact Azure services (e.g., security vulnerabilities, deprecated features).

Now, onto the fun part—automating these alerts with PowerShell!

Prerequisites

I’ll assume you’ve got the Azure PowerShell module installed and you’re familiar with the basics of PowerShell scripting and Azure. If not, it’s like assuming you can cook a gourmet meal without knowing how to turn on the stove—start there first!

Let’s get one more thing worked out – creating an action group to use in the Alert Rule.

$ActionGroupName = "MyActionGroup"
$ResourceGroupName = "MyResourceGroup"
$ShortName = "MyAG"

# Replace these values with your actual email and phone number
$Email = "your-email@domain.com"
$Sms = "+1-555-867-5309"

# Creating the action group
New-AzActionGroup -ResourceGroupName $ResourceGroupName -Name $ActionGroupName -ShortName $ShortName -EmailReceiver $Email -SmsReceiver $Sms -Location "Global"

With our action group ready, it’s time to define what we’re actually alerting on. We can create alerts for specific issues, maintenance events, or advisories. Here’s how:

# Assuming you've already created an action group as per the previous steps

$ResourceGroupName = "MyResourceGroup"
$RuleName = "MyServiceHealthAlert"
$ActionGroupId = (Get-AzActionGroup -ResourceGroupName $ResourceGroupName -Name "MyActionGroup").Id

# Service Health alert criteria
$criteria = New-AzActivityLogAlertCondition -Field 'category' -Equal 'ServiceHealth'

# Wrap the action group ID in a reference object, then create the Service Health alert
# (classic Az.Monitor syntax; newer module versions use New-AzActivityLogAlert instead)
$actionGroupRef = New-AzActionGroup -ActionGroupId $ActionGroupId
Set-AzActivityLogAlert -Location "Global" -Name $RuleName -ResourceGroupName $ResourceGroupName -Scope "/subscriptions/your-subscription-id" -Condition $criteria -Action $actionGroupRef

This PowerShell command creates an alert rule specifically for Service Health notifications within Azure. It triggers based on the ‘ServiceHealth’ category in the Azure Activity Log, ensuring you’re notified whenever there are relevant service health events affecting your subscription.

Explanation:

  • $criteria: This line defines what we’re alerting on. In this case, it’s any activity log entries with a category of ‘ServiceHealth’.
  • Set-AzActivityLogAlert: This cmdlet creates or updates an activity log alert rule. We specify the alert name, the scope (usually your subscription or a resource group), the conditions under which to trigger, and the action group to notify.

And there ya go! Simple and quick. Enjoy your new Alert Rule!

Optimizing Azure Cost Management with PowerShell

Let’s dig into some quick hits for keeping your costs down in Azure – and since I am who I am, let’s use PowerShell.

Automating Cost Reports

First, let’s script the retrieval of usage and cost data so you can monitor your cloud expenditures closely, identify trends, and make informed decisions to optimize costs.

Get-AzConsumptionUsageDetail -StartDate "2023-01-01" -EndDate "2023-01-31" | Export-Csv -Path "./AzureCostsJan.csv"

This simple script fetches the consumption details for January 2023 and exports the data to a CSV file – from there you can use something like Excel to dig into your big costs.
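If you’d rather stay in PowerShell for a quick top-offenders view, something like this sketch works – it assumes the usage detail objects expose InstanceName and PretaxCost properties, so verify against your own output first:

# Total January cost per resource, largest first
Get-AzConsumptionUsageDetail -StartDate "2023-01-01" -EndDate "2023-01-31" |
    Group-Object -Property InstanceName |
    ForEach-Object {
        [pscustomobject]@{
            Resource = $_.Name
            Cost     = ($_.Group | Measure-Object -Property PretaxCost -Sum).Sum
        }
    } |
    Sort-Object -Property Cost -Descending |
    Select-Object -First 10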

Identifying Underutilized Resources

PowerShell scripts can scan Azure services to pinpoint underutilized resources, such as VMs with low CPU utilization or oversized and underused storage accounts, which are prime candidates for downsizing or deletion to cut costs.

Get-AzVM | ForEach-Object {
    $metrics = Get-AzMetric -ResourceId $_.Id -MetricName "Percentage CPU" -TimeGrain "00:05:00" -StartTime (Get-Date).AddDays(-30) -EndTime (Get-Date)
    $avgCpu = ($metrics.Data | Measure-Object -Property Average -Average).Average
    if ($avgCpu -lt 10) {
        Write-Output "$($_.Name) is underutilized."
    }
}

This script assesses VMs for low CPU usage, identifying those with an average CPU utilization below 10% over the last 30 days.

Implementing Budget Alerts

Setting up budget alerts with PowerShell helps prevent unexpected overspending by notifying you when your costs approach predefined thresholds.

$budget = New-AzConsumptionBudget -Amount 1000 -Category Cost -TimeGrain Monthly -StartDate 2023-01-01 -EndDate 2023-12-31 -Name "MonthlyBudget" -NotificationKey "90PercentAlert" -NotificationThreshold 90 -NotificationEnabled -ContactEmail "admin@example.com"

This script creates a monthly budget of $1000 and sets up an alert to notify specified contacts via email when 90% of the budget is consumed.

And there you go! Some quick and easy scripts to make sure you don’t blow your Azure budget!

Hidden Gems in PowerShell 7

PowerShell 7 introduces several lesser-known features that can significantly enhance your scripting prowess. Let’s dive into these hidden gems.

Ternary Operator for Concise Conditional Logic

PowerShell 7 brings the ternary operator (?:), a shorthand for simple if-else statements, allowing for more concise and readable code.

$value = 10
$result = ($value -eq 10) ? "Equal to 10" : "Not equal to 10"

Pipeline Parallelization with ForEach-Object -Parallel

The -Parallel parameter in ForEach-Object can dramatically improve performance by executing script blocks in parallel. The optional -ThrottleLimit parameter controls the number of concurrent threads (it defaults to 5).

1..50 | ForEach-Object -Parallel { $_ * 2 } -ThrottleLimit 10

Simplified Error Viewing with $ErrorView and Get-Error

PowerShell 7 introduces a new view for error messages through the $ErrorView variable, which can be set to ConciseView (the default in PowerShell 7) for a more streamlined error display. Additionally, Get-Error provides detailed error information, perfect for troubleshooting.

$ErrorView = 'ConciseView'
Get-Error

Null Conditional Operators for Handling $null

The null conditional operators ?. and ?[] provide a safe way to access properties and methods or index into arrays when there’s a possibility of $null values, preventing unnecessary errors. One wrinkle: because ? is a valid character in a PowerShell variable name, the variable must be wrapped in braces:

$obj = $null
$name = ${obj}?.Name    # Returns $null without throwing an error
$array = $null
$value = ${array}?[0]   # Safely attempts to access the first element

The switch Statement Enhancements

The switch statement’s -Regex and -File options aren’t new in PowerShell 7, but they’re easy to overlook: they allow pattern matching against regular expressions and simplify file content parsing.

$inputString = 'An error occurred while processing'
switch -Regex ($inputString) {
    'error'   { Write-Output 'Error found' }
    'warning' { Write-Output 'Warning found' }
}

Coalescing Operators for Default Values

The null coalescing operators ?? and ??= simplify the process of providing default values for potentially $null variables, reducing the need for verbose if statements.

$name = $null
$displayName = $name ?? 'Default Name'
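And ??= assigns only when the variable is currently $null:

$settings = $null
$settings ??= @{ Theme = 'Dark' }  # $settings was $null, so it receives the default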

Automatic Unwrapping of Single-Element Arrays

A subtle but handy behavior (long-standing in PowerShell rather than new to 7): when a command or expression emits a single element through the pipeline, PowerShell automatically unwraps it, eliminating the need for manual indexing to access the single item.
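When you genuinely need an array no matter how many items come back, wrap the expression in @():

$procs = @(Get-Process -Id $PID)  # always an array, even with a single element
$procs.Count                      # 1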

Enhanced JSON Handling with ConvertFrom-Json and ConvertTo-Json

Improvements to the ConvertFrom-Json and ConvertTo-Json cmdlets include better depth handling and hashtable support, streamlining JSON serialization and deserialization.

$json = '{"name": "PowerShell", "version": 7}'
$obj = $json | ConvertFrom-Json
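Two additions worth calling out here: -AsHashtable returns a hashtable instead of a PSCustomObject, and ConvertFrom-Json now accepts -Depth to raise the nesting limit for large payloads:

$hash = $json | ConvertFrom-Json -AsHashtable   # hashtable output instead of PSCustomObject
$deep = $json | ConvertFrom-Json -Depth 20      # allow deeper nesting than the default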

Invoke DSC Resources Directly from PowerShell 7

Directly invoking Desired State Configuration (DSC) resources within PowerShell 7 scripts bridges traditional configuration management with modern PowerShell scripting, enhancing automation capabilities.
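Here’s a minimal sketch of direct invocation, assuming the PSDesiredStateConfiguration module is available and using its built-in File resource:

# Apply a single DSC resource without compiling a full configuration
Invoke-DscResource -Name File -ModuleName PSDesiredStateConfiguration -Method Set -Property @{
    DestinationPath = 'C:\Temp\dsc-demo.txt'
    Contents        = 'Hello from Invoke-DscResource'
}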

There ya go! Hope you find something in here that makes coding a bit more fun/easy!

Cranking Up the Efficiency: Optimizing PowerShell Scripts

Hey there, PowerShell aficionados! Whether you’re automating your morning coffee or deploying a fleet of VMs into the cloud, efficiency is key. Nobody wants to watch paint dry while their script runs in the background. So, let’s put some pep into that PowerShell script of yours. We’re diving straight into the realm of optimization – no fluff, just the good stuff.

Measure, Then Cut: Profiling Your Script

Before you start tweaking, let’s figure out where the bottlenecks are. PowerShell, being the Swiss Army knife it is, comes equipped with some nifty profiling tools like Measure-Command. This cmdlet lets you time how long it takes for a script or command to run. Use it to identify slow parts of your script:

Measure-Command { .\YourScript.ps1 }
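Measure-Command also makes head-to-head comparisons easy – for example, pitting the pipeline against a plain foreach loop:

(Measure-Command { 1..100000 | ForEach-Object { $_ * 2 } }).TotalMilliseconds
(Measure-Command { foreach ($i in 1..100000) { $i * 2 } }).TotalMilliseconds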

Lean and Mean: Streamlining Execution

1. Filter Left, Format Right

One of the golden rules for optimizing PowerShell scripts is to do your filtering as early as possible. Use cmdlets like Where-Object and Select-Object judiciously to trim down your data before processing it further. Remember, processing less data means faster execution:

Get-Process | Where-Object { $_.CPU -gt 100 } | Select-Object Name, CPU

2. Avoid the Pipeline When Possible

While the pipeline is one of PowerShell’s most powerful features, it’s not always the most efficient. Each pipe operation adds overhead. For tight loops or operations that need to be as fast as possible, consider using .NET collections or array manipulations:

$processes = Get-Process
$highCpuProcesses = [System.Collections.ArrayList]@()
foreach ($process in $processes) {
    if ($process.CPU -gt 100) {
        [void]$highCpuProcesses.Add($process)
    }
}

3. Use Foreach-Object Carefully

Foreach-Object is versatile but can be slower than its foreach loop counterpart due to pipeline overhead. For large datasets, stick to foreach for better performance:

# Slower (pipeline overhead per object); scoped to notepad so we don't kill every process
Get-Process notepad | Foreach-Object { $_.Kill() }

# Faster
foreach ($process in (Get-Process notepad)) {
    $process.Kill()
}

The Need for Speed: Parallel Processing

When you’re dealing with tasks that can be run concurrently, PowerShell 7’s ForEach-Object -Parallel can be a game-changer. This allows you to run multiple operations at the same time, significantly speeding up processes:

1..10 | ForEach-Object -Parallel { Start-Sleep -Seconds $_; "Slept for $_ seconds" } -ThrottleLimit 10

A Parting Tip: Stay Up-to-Date

PowerShell and .NET are constantly evolving, with new features and performance improvements being added regularly. Make sure your PowerShell version is up-to-date to take advantage of these enhancements.

Wrap-Up

Optimizing PowerShell scripts can turn a sluggish sequence of commands into a streamlined process that runs at lightning speed. By measuring performance, refining your approach, and employing parallel processing, you can ensure your scripts are not only efficient but also maintainable. Happy scripting, and may your execution times always be minimal!

A Couple of Log-Analyzing Functions

Heya all – here are a couple of quick functions to help analyze log files. Coming from a ConfigMgr/SCCM background, I got used to reading a LOT of logs. Having a couple of functions like this would have greatly helped!

First – let’s see if there are warning and/or error messages in a log (or stack of logs)

function Analyze-LogContent {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [string]$LogFilePath,

        [string]$ErrorPattern = 'ERROR|Error|error',
        [string]$WarningPattern = 'WARNING|Warning|warning'
    )

    if (-not (Test-Path -Path $LogFilePath)) {
        Write-Error "Log file does not exist at the path: $LogFilePath"
        return
    }

    # Reading the log file
    $logContent = Get-Content -Path $LogFilePath

    # Analyzing for errors
    $errors = $logContent | Where-Object { $_ -match $ErrorPattern }
    $warnings = $logContent | Where-Object { $_ -match $WarningPattern }

    # Output analysis
    $output = @()
    if ($errors.Count -gt 0) {
        $output += "Found $($errors.Count) errors in the log."
    } else {
        $output += "No errors found in the log."
    }

    if ($warnings.Count -gt 0) {
        $output += "Found $($warnings.Count) warnings in the log."
    } else {
        $output += "No warnings found in the log."
    }

    return $output
}

# Example usage
$logPath = "C:\Path\To\Your\LogFile.log"
$result = Analyze-LogContent -LogFilePath $logPath
$result | ForEach-Object { Write-Host $_ }

Change the patterns as necessary – 'ERR', for example. (Since -match is case-insensitive by default, 'ERROR' alone would already catch 'Error' and 'error'.)

The second function is pretty straightforward – it summarizes a log by counting the number of INFO, Warning, and Error messages:

function Summarize-LogFile {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [string]$LogFilePath
    )

    if (-not (Test-Path -Path $LogFilePath)) {
        Write-Error "Log file does not exist at the path: $LogFilePath"
        return
    }

    $logContent = Get-Content -Path $LogFilePath

    $infoCount = 0
    $errorCount = 0
    $warningCount = 0

    foreach ($line in $logContent) {
        switch -Regex ($line) {
            "INFO" { $infoCount++ }
            "ERROR" { $errorCount++ }
            "WARNING" { $warningCount++ }
        }
    }

    $summary = @"
Log File Summary:
Info Entries: $infoCount
Error Entries: $errorCount
Warning Entries: $warningCount
Total Entries: $($logContent.Count)
"@

    return $summary
}

# Example usage
$logPath = "C:\Path\To\Your\LogFile.log"
$summary = Summarize-LogFile -LogFilePath $logPath
Write-Host $summary

There ya go! I will keep adding to these, and eventually get them on GitHub so you all can tell me how wrong they are 🙂

Happy Coding!

Creating Alert Rules in Azure with AZ PowerShell – Some Samples

Let’s go over a simple one – how to create various types of alert rules in Azure using the Az PowerShell module.

Each example targets a different aspect of Azure monitoring, though they don’t cover everything. Remember to tweak the parameters to match your environment.

Metric Alerts for Performance Monitoring

To keep an eye on Azure service metrics:

$criteria = New-AzMetricAlertRuleV2Criteria -MetricName 'Percentage CPU' -TimeAggregation Average -Operator GreaterThan -Threshold 80

Add-AzMetricAlertRuleV2 -Name 'HighCPUAlert' -ResourceGroupName 'YourResourceGroupName' `
  -WindowSize 00:05:00 -Frequency 00:01:00 `
  -TargetResourceId '/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/Microsoft.Compute/virtualMachines/yourVMName' `
  -Condition $criteria `
  -ActionGroupId '/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/microsoft.insights/actionGroups/yourActionGroupName' `
  -Severity 3 -Description 'Alert on high CPU usage.'

Log Alerts for Custom Log Queries

For alerts based on log analytics:

$query = "AzureActivity | where OperationName == 'Create or Update Virtual Machine' and ActivityStatus == 'Succeeded'"

Set-AzScheduledQueryRule -ResourceGroupName 'YourResourceGroupName' -Location 'East US' -ActionGroup '/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/microsoft.insights/actionGroups/yourActionGroupName' -ConditionQuery $query -Description "VM creation alert" -Enabled $true -EvaluationFrequency 'PT5M' -Severity 0 -WindowSize 'PT5M' -Name 'VMCreationAlert'

Activity Log Alerts for Azure Resource Events

To monitor specific Azure service events:

$condition = New-AzActivityLogAlertCondition -Field 'category' -Equal 'Administrative'
$actionGroupId = "/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/microsoft.insights/actionGroups/yourActionGroupName"
$actionGroupRef = New-AzActionGroup -ActionGroupId $actionGroupId

Set-AzActivityLogAlert -Location 'Global' -Name 'AdminActivityAlert' -ResourceGroupName 'YourResourceGroupName' -Scope "/subscriptions/yourSubscriptionId" -Condition $condition -Action $actionGroupRef -Description "Alert on administrative activities"

Application Insights Alerts for Application Performance

Track application performance with an Application Insights availability (web) test alert. One caveat: the Az module doesn’t ship a dedicated web test alert cmdlet – availability test alerts are metric alerts built from web test criteria. This sketch assumes a recent Az.Monitor version with the -WebTest parameter set:

$webTestId = '/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/microsoft.insights/webtests/yourWebTestName'
$componentId = '/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/microsoft.insights/components/yourAppInsightsName'
$criteria = New-AzMetricAlertRuleV2Criteria -WebTest -WebTestId $webTestId -ComponentId $componentId -FailedLocationCount 3

Add-AzMetricAlertRuleV2 -Name 'AppPerfAlert' -ResourceGroupName 'YourResourceGroupName' -WindowSize 00:05:00 -Frequency 00:01:00 -TargetResourceScope $webTestId, $componentId -TargetResourceType 'microsoft.insights/webtests' -TargetResourceRegion 'eastus' -Condition $criteria -ActionGroupId '/subscriptions/yourSubscriptionId/resourceGroups/yourResourceGroupName/providers/microsoft.insights/actionGroups/yourActionGroupName' -Severity 2

Mastering PowerShell: Organizing Functions and Modules with Source Control

When your scripts evolve from one-off tasks to a library of tools, proper organization is key. Let’s explore how to structure your PowerShell functions and modules for maximum impact, and integrate source control to safeguard and collaborate on your code.

Elevating Scripts with Functions

At the heart of organized PowerShell scripting are functions. Functions allow you to encapsulate logic, making your scripts more readable, reusable, and maintainable. Here’s how to structure a function effectively:

function Get-DemoData {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory=$true)]
        [string]$Parameter1,

        [Parameter(Mandatory=$false)]
        [int]$Parameter2 = 10
    )

    Begin {
        # Initialization code here
    }

    Process {
        # Main function logic here
    }

    End {
        # Cleanup code here
    }
}

Best Practices:

  • CmdletBinding and Parameters: Use [CmdletBinding()] to make your function behave like a built-in cmdlet, including support for common parameters like -Verbose. Define parameters clearly, marking mandatory ones as such.
  • Verb-Noun Naming: Follow the PowerShell naming convention of Verb-Noun, making your function’s purpose immediately clear.
  • Comment-Based Help: Provide detailed help within your function using comment-based help, as sketched below. This makes your functions self-documenting and user-friendly.
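A comment-based help block sits just inside the function definition; here’s a minimal sketch for the Get-DemoData function above:

function Get-DemoData {
    <#
    .SYNOPSIS
        Retrieves demo data for a given name.
    .PARAMETER Parameter1
        The name of the item to retrieve.
    .PARAMETER Parameter2
        Optional result limit; defaults to 10.
    .EXAMPLE
        Get-DemoData -Parameter1 'Sample' -Verbose
    #>
    # ... [CmdletBinding()], Param(...), and Begin/Process/End blocks as shown earlier
}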

Scaling with Modules

As your collection of functions grows, modules become your best friend. A module is a package of functions, scripts, and variables that you can share and reuse across projects. Here’s a simple structure for a module:

# MyModule.psm1
function Get-DemoData {
    # Function definition here
}

function Set-DemoData {
    # Another function definition here
}

Export-ModuleMember -Function Get-DemoData, Set-DemoData

Module Manifests: For more complex modules, consider creating a module manifest (MyModule.psd1). This file defines metadata about your module, including version, author, and which functions to export.
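Generating a manifest is a single cmdlet call; a minimal sketch (all metadata values are placeholders):

New-ModuleManifest -Path .\MyModule.psd1 `
    -RootModule 'MyModule.psm1' `
    -ModuleVersion '1.0.0' `
    -Author 'Your Name' `
    -Description 'Demo data helper functions' `
    -FunctionsToExport 'Get-DemoData', 'Set-DemoData'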

Integrating Source Control with Git

Source control is not just for developers; it’s essential for scripters too. Git, a version control system, helps you track changes, collaborate with others, and revert to earlier versions when needed.

  1. Initialize a Git Repository: Start by creating a new directory for your project, then initialize a Git repository.

git init

  2. Commit Your Functions and Modules: As you create or modify your functions and modules, add them to the repository and commit changes.

git add .
git commit -m "Add Get-DemoData function"

  3. Branching for New Features: When working on a new feature or major change, use branches to keep your main branch stable.

git checkout -b feature/new-feature

  4. Collaboration and Backup: Use online Git repositories like GitHub or Azure Repos for backup, collaboration, and leveraging CI/CD pipelines for automated testing and deployment.

Happy scripting!

Quick Code – Send Events to an Event Hub with PowerShell

Here is another quick one – let’s send events to an event hub with PowerShell!

# [System.Web.HttpUtility] needs System.Web loaded on Windows PowerShell 5.1;
# PowerShell 7 resolves it automatically
if ($PSVersionTable.PSVersion.Major -lt 6) { Add-Type -AssemblyName System.Web }

function New-SasToken {
    param(
        [string]$ResourceUri,
        [string]$Key,
        [string]$PolicyName,
        [timespan]$TokenTimeToLive
    )

    # Token expiry as Unix epoch seconds
    $Expires = [DateTimeOffset]::Now.Add($TokenTimeToLive).ToUnixTimeSeconds()

    # Sign the URL-encoded resource URI plus the expiry with HMAC-SHA256
    $StringToSign = [System.Web.HttpUtility]::UrlEncode($ResourceUri) + "`n" + $Expires
    $HMACSHA256 = New-Object System.Security.Cryptography.HMACSHA256
    $HMACSHA256.Key = [Text.Encoding]::UTF8.GetBytes($Key)
    $Signature = $HMACSHA256.ComputeHash([Text.Encoding]::UTF8.GetBytes($StringToSign))
    $Signature = [Convert]::ToBase64String($Signature)

    # Assemble the SharedAccessSignature token
    $Token = "SharedAccessSignature sr=" + [System.Web.HttpUtility]::UrlEncode($ResourceUri) + "&sig=" + [System.Web.HttpUtility]::UrlEncode($Signature) + "&se=" + $Expires + "&skn=" + $PolicyName
    return $Token
}

# Event Hub parameters
$namespace = "yourNamespace"
$eventHubName = "yourEventHubName"
$sharedAccessKeyName = "yourSharedAccessKeyName"
$sharedAccessKey = "yourSharedAccessKey"

# The token is signed over the event hub URI itself; events are POSTed to its /messages path
$resourceUri = "https://$namespace.servicebus.windows.net/$eventHubName"
$endpoint = "$resourceUri/messages"
$tokenTimeToLive = New-TimeSpan -Minutes 60

# Generate SAS token
$sasToken = New-SasToken -ResourceUri $resourceUri -Key $sharedAccessKey -PolicyName $sharedAccessKeyName -TokenTimeToLive $tokenTimeToLive

# Event data
$body = @"
{
    "Data": "Sample Event Data"
}
"@

# Send the event
$headers = @{
    "Authorization" = $sasToken
    "Content-Type" = "application/json"
}

try {
    $response = Invoke-RestMethod -Uri $endpoint -Method Post -Body $body -Headers $headers
    Write-Output "Event sent successfully"
}
catch {
    Write-Error "Failed to send event: $_"
}