Azure Automation and Table Storage: Creating a Basic Azure IPAM Solution

Overview

I wanted to provide a basic demo of Azure Automation, Azure Storage Explorer, and the use of Storage Account Tables, because I think all three are extremely valuable to any Azure Administrator or Developer. Personally, I think it’s more fun to provide a working solution that fully utilizes the features rather than regurgitate information that’s already available, so I came up with this.

The Problem

Managing, planning, and organizing address spaces within Azure can get quite cumbersome, and it’s something that I believe could be improved upon in the portal. I find myself clicking in and out of VNets and their subnets, checking outdated spreadsheets, and trying to keep everything updated once there are too many cooks in the kitchen. I chose to build this solution in Azure Automation and store my data in a Storage Account Table. Here’s the final product:

 

The Solution

I’ve provided both an Azure Automation Runbook and a local PowerShell version of this solution, which can be found on my GitHub or the PowerShell Gallery:

https://github.com/cooperlutz/cooperscloud/tree/master/powershell-scripts/Generate-AzureIPAMTable

https://www.powershellgallery.com/packages/Generate-AzureIPAMTable/

 

Table Storage

Storage Account Tables provide simple storage of structured datasets. A Table is made up of “entities”, each of which contains a “PartitionKey” and a “RowKey”. The “PartitionKey” groups related entities together, while the “RowKey” serves as the unique identifier for an entity within its partition, so a “RowKey” must not be repeated within a partition.

Tables cannot be viewed in the Azure portal, but Microsoft provides a free tool, “Azure Storage Explorer”, which allows an Azure administrator to view their tables, blobs, file shares, and queues. Azure Storage Explorer also allows you to add, modify, and delete data stored in a Storage Account.

I like to think of them as a mix between a database table and a spreadsheet.
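To make the structure concrete, here is a minimal sketch of adding and reading back a single entity with the AzureRmStorageTable module (the resource group, storage account, table, and property names below are just placeholders, and the table is assumed to already exist):

# Get a context for the storage account and a reference to an existing table
$saContext = (Get-AzureRmStorageAccount -ResourceGroupName "my-rg" -Name "mystorageacct").Context
$table = Get-AzureStorageTable -Name "DemoTable" -Context $saContext

# Add one entity - the PartitionKey groups related entities, the RowKey must be unique within that partition
Add-StorageTableRow -table $table -partitionKey "VNet" -rowKey "Vnet001" -property @{
    "Name"         = "prod-vnet"
    "AddressSpace" = "10.0.0.0/16"
}

# Read back everything stored under the "VNet" partition
Get-AzureStorageTableRowByPartitionKey -table $table -partitionKey "VNet"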

 

The Script

I wrote this script in a local PowerShell instance and then converted it to Azure Automation. That’s my preferred method because it’s generally a little easier to troubleshoot locally, where I can run portions of the code rather than troubleshooting the entire script in the Azure Automation Test Pane.

I started by defining the resources I wanted to display in this solution:

  • Virtual Networks
  • Subnets
  • Public IPs
  • Reserved Address Spaces – something Azure does not provide a solution for, but that I think would be nice to have. I think of this as a placeholder for a planned address space, or a way to define my on-premises address range.

 

From here I was able to come up with a script that loops through all of the Public IPs, the Virtual Networks and their Subnets, and then adds a test output for the Reserved Spaces.

I decided I also wanted to add functionality that would calculate my host count, host IP, and broadcast IP, and separate out the address length. This was accomplished using the PSipcalc script created by Svendsen Tech, which I added in as a function. Big thanks to him for that piece of art. His script also provides some features that I’d love to implement in a later version, specifically around calculating and displaying any address overlap.
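For reference, here is roughly how the function gets called and which of its output properties the script relies on later (the address range is just an example):

# PSipcalc takes a CIDR range and returns an object describing it
$rangeInfo = PSipcalc -NetworkAddress "10.1.0.0/24"

# Properties consumed when building the table rows
$rangeInfo.IP             # network address, e.g. 10.1.0.0
$rangeInfo.NetworkLength  # prefix length, e.g. 24
$rangeInfo.TotalHosts     # usable host count, e.g. 254
$rangeInfo.Broadcast      # broadcast address, e.g. 10.1.0.255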

Outputting the data was tricky and something I had to try out a few times. I wanted to be able to quickly organize my output table by the “RowKey” in a fashion that would first show the VNet, then its Subnets immediately following. I accomplished this by iterating through the VNets and assigning each a “RowKey” of “Vnet###”, then iterating through each Subnet and assigning it a “RowKey” of “Vnet###” + “Subnet###”. Now I’m able to view the data in a more understandable way by sorting by “RowKey” ascending.
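The zero-padded keys come from PowerShell’s format operator; a quick sketch of the pattern:

# "{0:d3}" pads the counter to three digits so the keys sort correctly as strings
$val = 1
$val2 = 2
"Vnet" + ("{0:d3}" -f $val)                                   # Vnet001
"Vnet" + ("{0:d3}" -f $val) + "Subnet" + ("{0:d3}" -f $val2)  # Vnet001Subnet002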

 

Here’s a small subset of the code to display some of the data retrieval, the “RowKey” iteration, the PSipcalc function calls, and the output:

# Get the table, PIPs, and VNets
$table = Get-AzureStorageTable -Name $tableName -Context $saContext
$vnets = Get-AzureRmVirtualNetwork
$publicIPs = Get-AzureRmPublicIpAddress -ErrorAction SilentlyContinue

# VNets loop - get the VNet data and its relevant subnets.
# ($val is the VNet row counter, initialized earlier in the full script.)
Write-Verbose "Populating VNets and Subnets..."
foreach ($vnet in $vnets) {

    # Reset $val2 to 1 to re-initiate the subnet RowKey counter for this VNet
    $val2 = 1

    # Get the Virtual Network's address range and run it through PSipcalc
    $vnetAddressSpace = Get-AzureRmVirtualNetwork -Name $vnet.Name -ResourceGroupName $vnet.ResourceGroupName | Select-Object AddressSpace
    $vnetAddressSpace = $vnetAddressSpace.AddressSpace.AddressPrefixes
    $rangeInfo = PSipcalc -NetworkAddress $vnetAddressSpace

    # Add the table row for the VNet
    Add-StorageTableRow -table $table -partitionKey "VNet" -rowKey ("Vnet" + ("{0:d3}" -f $val++)) -property @{
        "VirtualNetwork"           = $vnet.Name
        "Type"                     = "Range"
        "Name"                     = $vnet.Name
        "PublicIpAllocationMethod" = ""
        "NetworkLength"            = $rangeInfo.NetworkLength
        "IP"                       = $rangeInfo.IP
        "TotalHosts"               = $rangeInfo.TotalHosts
        "Broadcast"                = $rangeInfo.Broadcast
        "AddressSpace"             = $vnetAddressSpace
        "ResourceGroupName"        = $vnet.ResourceGroupName
    }

    # Loop through all subnets within the VNet and output each to a table row
    $subnets = Get-AzureRmVirtualNetwork -Name $vnet.Name -ResourceGroupName $vnet.ResourceGroupName | Select-Object Subnets
    $subnetCount = $subnets.Subnets.Name.Count
    for ($i = 0; $i -lt $subnetCount; $i++) {

        $rangeInfo = PSipcalc -NetworkAddress $subnets.Subnets[$i].AddressPrefix

        # Add the table row for each subnet
        Add-StorageTableRow -table $table -partitionKey "Subnet" -rowKey ("Vnet" + ("{0:d3}" -f ($val - 1)) + "Subnet" + ("{0:d3}" -f $val2++)) -property @{
            "VirtualNetwork"           = $vnet.Name
            "Type"                     = "Range"
            "Name"                     = $subnets.Subnets[$i].Name
            "PublicIpAllocationMethod" = ""
            "NetworkLength"            = $rangeInfo.NetworkLength
            "IP"                       = $rangeInfo.IP
            "TotalHosts"               = $rangeInfo.TotalHosts
            "Broadcast"                = $rangeInfo.Broadcast
            "AddressSpace"             = $subnets.Subnets[$i].AddressPrefix
            "ResourceGroupName"        = $vnet.ResourceGroupName
        }
    }
}

 

Azure Automation and Storage Account Table Walk Through

Creating the Storage Account

I generally like to recommend that an organization establish a general purpose Storage Account for shared resources that integrate with Azure services, such as Azure Cloud Shell, ARM template archives, or other resources that Azure admins use regularly. In this case, we will create a general purpose Storage Account to hold our Table Storage output.

I choose LRS for these general purpose Storage Accounts because I’m not too concerned about losing the data or being unable to access it during an Azure outage.
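The walkthrough uses the portal, but if you prefer PowerShell, a minimal sketch with the AzureRM cmdlets looks something like this (the resource group, account name, and location are placeholders):

# Create a resource group and a general purpose, locally redundant storage account
New-AzureRmResourceGroup -Name "rg-shared-services" -Location "eastus"
New-AzureRmStorageAccount -ResourceGroupName "rg-shared-services" `
    -Name "ipamdemostorage" `
    -Location "eastus" `
    -SkuName Standard_LRS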

That’s it! Our Storage Account is ready to be used.

 

Configuring the Azure Automation Account

Start by searching for the Automation Account blade in Azure.

Click to add a new automation account.

Add Automation Account

Create Azure Run As account – I chose “Yes” for this option, so the Automation Account deployment will generate an Azure AD account that has Contributor access to my Azure resources. You can navigate to “Run as accounts” under your Automation Account’s “Account Settings” to view its generated display name, Application ID, roles, and other relevant information. It is possible to modify this account’s level of access after it is generated.
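For context, inside a runbook you authenticate with this Run As account using the standard pattern Microsoft documents for Azure Automation, sketched below (“AzureRunAsConnection” is the default connection name created with the account):

# Retrieve the Run As connection and log in as its service principal
$conn = Get-AutomationConnection -Name "AzureRunAsConnection"
Add-AzureRmAccount -ServicePrincipal `
    -TenantId $conn.TenantId `
    -ApplicationId $conn.ApplicationId `
    -CertificateThumbprint $conn.CertificateThumbprint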

Updating Modules

Start by viewing the default PowerShell modules added to the Automation Account. The default modules are very early versions, so we want to update them to the latest. The “Update Azure Modules” button at the top will quickly update all modules to their latest versions.

Adding Modules

Once our modules are updated, we need to add a few from the gallery for the runbook we are about to deploy. Navigate to the gallery and search for the needed modules:

AzureRmStorageTable

AzureRM.Network

 

Click OK to add the module to your Automation Account.

Handling Module Dependency Issues

Upon importing the “AzureRM.Network” module, I ran into a dependency issue with “AzureRM.Profile”. Although I had updated all of my Azure modules, AzureRM.Profile remained at version 4.2.0. I’ve found the best way to work around this is the following method:

 

From the Modules blade, click on the “AzureRM.Profile” module, and then choose to delete it.

 

Once deleted, the module version will default back to the lowest version.

 

Navigate to the online PowerShell Gallery, search for the needed module, and select it.

Click the “Deploy to Azure Automation” button.

 

Select your Automation Account and click OK.

 

Once the import has completed, return to the Automation Account and check the “Modules” blade; the AzureRM.Profile version should now show as 4.3.0.
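You can also verify the imported version from PowerShell instead of the portal; a quick sketch (the Automation Account and resource group names are placeholders):

# Check the version and provisioning state of a module in the Automation Account
Get-AzureRmAutomationModule -AutomationAccountName "my-automation-account" `
    -ResourceGroupName "rg-automation" `
    -Name "AzureRM.Profile" | Select-Object Name, Version, ProvisioningState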

 

Now we can return to the gallery and attempt to import the AzureRM.Network module. This time we are not met with any dependency warning. Click OK to import the module.

 

Deploying the Runbook

Navigate to the PowerShell Gallery where the script is hosted. Click the “Deploy to Azure Automation” button to add it to your Automation Account.

Select your Automation Account and click OK.

 

Navigate back to your Azure Automation Account and select “Runbooks”. We see that the “Generate-AzureIPAMTable” runbook has been added. Select the runbook.

 

From here we can make edits to the code, delete the runbook, add a schedule to automatically run our script, or review past executions. First we will click “Edit” and then choose “Publish” to make the Runbook available for execution.

 

After publishing, we can run the script for the first time. Click “Start” to execute the script.

 

Enter the resource group of the storage account, and the storage account name, then click OK.
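If you’d rather start the runbook from PowerShell, Start-AzureRmAutomationRunbook accepts the same inputs as a hashtable. A sketch is below; note that the parameter names in the hashtable are assumptions based on the prompts above, so check the runbook’s param() block for the exact names:

# Start the runbook and pass in the storage account details
Start-AzureRmAutomationRunbook -AutomationAccountName "my-automation-account" `
    -ResourceGroupName "rg-automation" `
    -Name "Generate-AzureIPAMTable" `
    -Parameters @{
        "ResourceGroupName"  = "rg-shared-services"   # assumed parameter name
        "StorageAccountName" = "ipamdemostorage"      # assumed parameter name
    }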

 

We can click on “Output” to view the script’s progress as it runs.

Once completed we will move on to viewing our Table’s data.

 

Viewing the Storage Table

Download Azure Storage Explorer from Microsoft.

Connect to your Storage Account either by logging into Azure or by providing the Storage Account name and key.
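If you go the name-and-key route, the key can be pulled with PowerShell rather than copied from the portal (the names below are placeholders):

# Retrieve the first storage account key to paste into Storage Explorer
(Get-AzureRmStorageAccountKey -ResourceGroupName "rg-shared-services" -Name "ipamdemostorage")[0].Value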

 

Once connected, we see our table has been generated with our Azure IP resources.

 

You can change the column options to make things a little friendlier.

Complete!
