As you start using Platform as a Service (PaaS) architectures in Azure, there is a strategic need to standardize and automate building these services. This post covers automating the creation of an Azure Resource Group, Blob storage, Azure SQL Server, and Azure Analysis Services in a PaaS offering using Azure Automation PowerShell Runbooks. Whether it is for multiple developers or for QA, Development, or Preproduction environments, having an automated way to standardize their creation saves time and allows for a consistent build pattern. This article reviews creating a PowerShell script, moving it into a Runbook in Azure Automation, and some of the issues you can face along the way.
Reference – A tutorial has been added to the Microsoft Documentation, Manage runbooks in Azure Automation | Microsoft Docs
You can take scripts that you run against Azure on a local machine and turn them into reusable, parameterized Runbooks: develop once, run many builds of the architecture. I use Runbooks for many automation tasks in Azure data workflows, and scripting out the creation of a SQL BI environment is especially handy if you have many environments and developers.
Note: Many of the commands are linked to their documentation which provides more detail on their format, options and parameters.
This article assumes that you have a few things in place before you begin. See the links provided for more information.
- You need an Azure Automation account; see Create a standalone Azure Automation account.
- When you create the Automation Account, there is an option to Create Azure Run As account. This account is used to manage the Resource Manager deployment model resources. When you create a Run As account, it creates a new service principal user in Azure Active Directory and assigns the Contributor role to this user at the subscription level.
- This blog post will use a version that does not use parameters but has hard-coded values instead. The best advice I can give is to get it working as a bare-bones script first, then get more advanced.
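If you want to confirm the Run As service principal and its role assignment before relying on it, a quick local check might look like the following sketch. The search string is an assumption; substitute the name of your own Automation account.

```powershell
# Sketch: confirm the Run As service principal's role assignment.
# The search string is hypothetical; Run As applications are usually
# named after the Automation account with a generated suffix.
$sp = Get-AzureRmADServicePrincipal -SearchString "MyAutomationAccount"

# List what the principal can do and at what scope; a default Run As
# account should show Contributor at the subscription level.
Get-AzureRmRoleAssignment -ObjectId $sp.Id |
    Select-Object RoleDefinitionName, Scope
```

This is optional, but it is a quick way to rule out permission problems before debugging a Runbook.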
A Bare Bones Script
Our goal is to create a minimal BI environment that will be made available to developers and DBA teams who would continue with any unique setup required for a project.
The script as written will run either in a PowerShell Workflow Runbook or in a local command window, such as the PowerShell ISE or Visual Studio Code; the changes required are commented in the code below. With a minimal change to the login section, it will run from an Azure Automation PowerShell Workflow Runbook.
A PowerShell Workflow script has a specific format. The script used in this example can be inserted into the template below when the Runbook is created. On the Azure Automation Runbooks tab, select the Create A Runbook menu item. The following panel is displayed; select Create a New Runbook, add a Name, and choose PowerShell Workflow as the type for this example.
The following is created; notice the Connections section has an “Azure Run As” member, which we will use to provide the permissions required to create the objects.
To run the script, I use the following code segment as a starting point. This example will not have parameters but will cover these in a future post.
Step 1 – Login
There are two sections to the login script: one for running locally (in Visual Studio Code, for example) and one for logging in to Azure from within a Runbook. Notice that the Runbook login uses the “AzureRunAsConnection” asset. This is required to give the Runbook the security context to create the objects. The screen capture below shows the assets that were created when the Azure Automation service was set up.
These can be found in the Shared Resources section of the Azure Automation account.
This connection was created during the Azure Automation setup. It allows the script to run through automation without having account and password information in the code. Runbooks also let you create other assets if required, such as SQL logins to run SQL commands. More detail is available in the documentation: Getting Started with Azure Automation: Automation Assets and Credential assets in Azure Automation.
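As a small illustration, a credential asset can be pulled into a runbook without embedding secrets in the script. The asset name below is hypothetical; you would create it under Shared Resources > Credentials first.

```powershell
# Sketch: retrieve a credential asset inside a runbook.
# "MySqlCredential" is a hypothetical asset name.
$sqlCredential = Get-AutomationPSCredential -Name "MySqlCredential"

# The credential can then be passed to any cmdlet that accepts a
# PSCredential, or unpacked if a command needs the raw values.
$sqlUser = $sqlCredential.UserName
$sqlPassword = $sqlCredential.GetNetworkCredential().Password
```

This keeps the secret in Azure Automation's encrypted store rather than in the script body.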
The first portion of the script, which is commented out, will log you in when running the script locally. The big benefit of this process is that you can develop these scripts locally and move them to a Runbook later; I find debugging the initial script easier locally. The command Connect-AzureRmAccount -SubscriptionName <<Subscription Name>> lets you perform the local login as one step. The reference Sign in with Azure PowerShell and the command documentation, Connect-AzureRmAccount, go into more detail on this.
Logging In from a Local PowerShell Session
## Login when running from a local PowerShell prompt
## General connect info

# Log in to your Azure account
# Connect-AzureRmAccount

# List all the subscriptions associated with your account
# Get-AzureRmSubscription

# Select a subscription
# Set-AzureRmContext -SubscriptionId <subscription ID>

## Or log in and select the subscription in one step
# Connect-AzureRmAccount -SubscriptionName <<Subscription Name>>
# Connect-AzureRmAccount -SubscriptionName "Visual Studio Enterprise"
Login Using the Run As Account
The following code will give the Runbook the elevated security required to create the objects.
Login when running from a Runbook
## Login when running from a Runbook
# Get the connection "AzureRunAsConnection"
$servicePrincipalConnection = Get-AutomationConnection -Name "AzureRunAsConnection"

"Logging in to Azure..."
Add-AzureRmAccount `
    -ServicePrincipal `
    -TenantId $servicePrincipalConnection.TenantId `
    -ApplicationId $servicePrincipalConnection.ApplicationId `
    -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
Step 2 – Create Objects
The following section reviews the code to create each object inside the new Resource Group. You can certainly parameterize these values and add Try/Catch error checking to make this more of a production version.
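To give a sense of where that would lead, a parameterized workflow with Try/Catch might look like the following sketch. The workflow and parameter names are illustrative only; this post keeps the values hard-coded.

```powershell
workflow Create-BIEnvironment
{
    # Illustrative parameters; this post uses hard-coded values instead.
    param(
        [string] $ResourceGroupName = "MyResourceGroupName",
        [string] $Location = "eastus2"
    )

    try
    {
        # -ErrorAction Stop turns non-terminating errors into catchable ones.
        New-AzureRmResourceGroup -Name $ResourceGroupName -Location $Location -ErrorAction Stop
        Write-Output "Resource group $ResourceGroupName created."
    }
    catch
    {
        Write-Error "Failed to create resource group: $_"
        throw
    }
}
```

The same pattern can wrap each of the create commands below.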
Create the Resource Group
New-AzureRmResourceGroup -Name "MyResourceGroupName" -Location "eastus2"
Create a Storage Account
Note that the storage account and blob container names must be lowercase.
New-AzureRmStorageAccount `
    -ResourceGroupName "MyResourceGroupName" `
    -Name "mybistorageaccountname" -Location "eastus2" `
    -SkuName Standard_LRS -Kind StorageV2
Create a BLOB inside the Storage Account
In order to create a BLOB inside the storage container, you have to get the storage key and the context of the account.
#Get the storage Keys
$storagekey = (Get-AzureRmStorageAccountKey -ResourceGroupName "MyResourceGroupName" `
    -Name "mybistorageaccountname")[0]
#Get the Context
$context = New-AzureStorageContext -StorageAccountName "mybistorageaccountname" `
-StorageAccountKey $storagekey.Value -Protocol Http
#Create the Storage Container
New-AzureStorageContainer -Name "myblobfilecontainername" -Permission Blob `
    -Context $context
Create the Azure Analysis Server
The following code creates the Azure Analysis Services server. The SKU is set to D1, the lower-cost developer pricing tier. You also need to provide an Azure Active Directory account as the administrator. Note that the server name must be lowercase.
New-AzureRmAnalysisServicesServer -ResourceGroupName "MyResourceGroupName" `
-Name "myssaservername" -Location "eastus2" -Sku "D1" `
-Administrator "MyName@domain.com" -ErrorAction Ignore
Create the SQL Server Database and a Sample DB
The following code creates the Azure SQL Server and a sample database at the S0 pricing tier.
New-AzureRmSqlServer -ResourceGroupName "MyResourceGroupName" `
    -ServerName "mysqlservername" -Location "eastus2" `
    -SqlAdministratorCredentials $(New-Object -TypeName System.Management.Automation.PSCredential `
        -ArgumentList "ServerAdmin", $(ConvertTo-SecureString -String "ChangeYourAdminPassword1" -AsPlainText -Force))
New-AzureRmSqlDatabase -ResourceGroupName "MyResourceGroupName" `
    -ServerName "mysqlservername" -DatabaseName "MySampleDB" `
    -RequestedServiceObjectiveName "S0" -SampleName "AdventureWorksLT"
Step 3 – Create References to the Azure Modules
Not a quick Plug and Play
As you develop scripts locally and move them to a Runbook, the move is generally not quick plug-and-run; you can run into errors with scripts that worked when tested in the local shell. When you run this script for the first time, you could receive the following errors:
Missing Module Reference
“The term ‘New-AzureRmAnalysisServicesServer’ is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.”
Old PowerShell Module Version Referenced
“A parameter cannot be found that matches parameter name ‘SkuName’.”
Note: I wrote a more detailed post on this error, available at How to Keep Your Microsoft Azure Automation Runbook Modules Up-to-Date.
Both errors are easy to fix. As the following screen capture shows, the Azure Automation panel has a Shared Resources section where selecting the Modules option brings up the Azure PowerShell modules referenced in your account.
The modules are not updated automatically, which can be useful when developing scripts, as you may want to control when updates happen. However, when you run the scripts as an automated job, the module versions in the account are used, which can cause headaches if they are out of date.
First, on the menu, select Update Azure Modules and follow the prompts. This will update all your referenced modules to the current released versions.
The ‘SkuName’ error above turned out to be a parameter that was not available in version 1.0.3 of the Azure.Storage module but was added in later versions. After the update, the version is 4.6.1 and the command works. This can happen if you have an environment you have not used in a while.
The missing module reference occurs as not all Azure modules are available when you first create your Azure Automation account.
When you go to a PowerShell command’s documentation page, the module that the command belongs to is noted at the top.
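If you are unsure which module a cmdlet ships in, you can also check from any session where the module is installed, a quick sketch:

```powershell
# Sketch: find the module that a cmdlet belongs to.
# ModuleName shows the owning module, e.g. AzureRM.Storage.
Get-Command New-AzureRmStorageAccount | Select-Object Name, ModuleName
```

This saves a trip to the documentation when an unrecognized-cmdlet error appears.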
On the Modules tab, select Browse Gallery, which lets you add module references to your scripts by importing the module into the Azure Automation environment you are using.
This will bring up a search box. When you search for the module the command belongs to, here AzureRM.Storage, the module will be displayed.
Selecting the module name brings up more detail; select the Import menu item. The display changes when the import finishes.
You now have all the modules you require and can do a test run of your Runbook. As shown in the screen capture below, select the Runbooks option in the Process Automation section, then click on the Runbook that you created.
This brings up the detail header of the Runbook; select Edit to bring up the script. Once you select Edit, the edit pane is displayed. Select the Test pane, which allows you to run your script in test mode, just as if it were running in a local window.
Click Start, and results will begin to populate the lower display window. You can select Refresh Job Streams to update this window while the script is running. The job is submitted to the queue and then run. Note that you can write messages to this window while your script runs using Write-Output.
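For example, interleaving output messages with the create calls makes the test output easier to follow, a minimal sketch using commands from this post:

```powershell
# Progress messages appear in the job output stream of the Test pane.
Write-Output "Creating resource group MyResourceGroupName..."
New-AzureRmResourceGroup -Name "MyResourceGroupName" -Location "eastus2"
Write-Output "Resource group created."
```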
When the script completes, you should see results like the following, which include messages about the objects created.
Checking in the Portal, you can find the Resource Group and the objects shown.
Keep in mind that you will be charged for anything you create, so if you are rerunning your script during testing, you can run the following command to delete the resource group and all the objects you created.
#Clean Up resources after test
#This removes the resource group and all objects created
Remove-AzureRmResourceGroup -Name "MyResourceGroupName"