
Multi subscription deployment with DevOps and Azure Lighthouse

  • 11/03/2020 (updated 03/09/2020)
  • by Martin Ehrnst

Companies today are rapidly adopting new technology, adding more pressure on the company's IT department or its service provider. No matter where the workload runs, governance, security, and deployments are fundamental parts.
Azure Lighthouse came last year, and while it solves a lot of our trouble, multi-subscription deployment isn't one of them – but it sure makes it more feasible!

Azure deployments with Azure DevOps

One of the things that make service providers great at what they do is standardization. For example, making sure all subscriptions (customers, in many cases) have the same set of baseline Azure policies.

Azure DevOps has multiple ways to deploy resources to Azure or elsewhere. But how do we deploy the same ARM template to multiple subscriptions using the same pipeline? In my case, I solved this with some existing features, forward thinking, and PowerShell magic.

Azure DevOps service connection with Azure Lighthouse

Multi-subscription deployment with Azure DevOps is not a built-in feature. Azure Lighthouse makes it a little bit easier, but it still requires some work.

First, you must set up a service connection and allow it to access one of your internal subscriptions. In Azure DevOps, service connections are bound to one subscription.

For this service connection to be capable of multi-subscription deployments, it needs access to your customers' subscriptions. This can be solved through delegated resource management and Azure Lighthouse.
In my case, I had a group with Contributor access and could add the SPN to that group. Otherwise, you will have to update your current delegation.
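As a sketch of that last step (the display names are placeholders, and I am assuming the Az module with an existing Azure AD group that is already authorized in the Lighthouse delegation), adding the pipeline's SPN to the group could look like this:

```powershell
# Placeholder names - replace with your own SPN and group
$spnDisplayName = "devops-deployment-spn"
$groupName      = "Customer Contributors"   # group with Contributor in the Lighthouse delegation

# find the service principal behind the Azure DevOps service connection
$spn = Get-AzADServicePrincipal -DisplayName $spnDisplayName

# add it to the Azure AD group that carries the delegated Contributor role
$group = Get-AzADGroup -DisplayName $groupName
Add-AzADGroupMember -TargetGroupObjectId $group.Id -MemberObjectId $spn.Id
```

If you don't have such a group, you must instead redeploy the Lighthouse delegation template with the SPN's object id added as an authorization.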

Repository structure

Everything you need, including a YAML pipeline is available on GitHub, but I will walk you through how and why I set it up.

I needed a solution that could deploy not only one resource to multiple subscriptions, but also multiple ARM templates.

For this purpose, I had the option of creating one large PowerShell script with complex logic, or reusing the same script for every deployment. I chose option two. My code repository now looks something like this:

  • ARM-Templates (folder)
    • storage-Account (folder)
      • azuredeploy.json
      • azuredeploy.parameters.json
      • deploy.ps1
    • another-resource (folder)
      • azuredeploy.json
      • azuredeploy.parameters.json
      • deploy.ps1
    • […]
  • azure-pipelines.yaml

PowerShell magic?

In regular deployments, we can use built-in tasks in the pipeline and deploy directly. For multi subscription deployment, PowerShell is my weapon of choice.

For people with PowerShell competency, the script used is fairly simple:

  1. Connect to Azure
  2. Retrieve the subscriptions
  3. Iterate and deploy the ARM template(s) to each customer subscription.

Below is a short example showing the core of my deployment script (deploy.ps1):

$deploymentName = "Multi-sub-deployment"
$deploymentLocation = "westeurope"
$templateFile = ".\ARM-Templates\storage-Account\azuredeploy.json"
$templateParameterFile = ".\ARM-Templates\storage-Account\azuredeploy.parameters.json"

# subscription ids to skip, if any
$excludedSubs = @()

# getting all subscriptions
$subscriptions = Get-AzSubscription | Where-Object { $_.Id -notin $excludedSubs }

foreach ($subscription in $subscriptions) {

    # set context to the current subscription
    Set-AzContext -SubscriptionId $subscription.Id

    # deploy the ARM template at subscription scope
    New-AzSubscriptionDeployment -Name $deploymentName -Location $deploymentLocation `
        -TemplateParameterFile $templateParameterFile -TemplateFile $templateFile
}

Multi subscription deployment, build pipeline

Although my repository contains a YAML pipeline, you don't have to use it. To be honest, I'm not sure I like them. I spent too much time trying to wrap my head around it, and at this point, it seems unfinished from the Azure DevOps side. Therefore, I will show you how to set up your pipeline to support multi-subscription deployments using the classic method.

azure pipeline template selector

Kick off with the classic mode, and choose the empty job on the template page. We could probably discuss whether we even need multiple pipelines for this. But it doesn't hurt, and it will be easier for the next person if we do this by the book.

For the build pipeline, I rename my job to something meaningful and add one single task: Publish pipeline artifact.

After you save and run the build, an artifact should be produced. The result should look something like this:

Azure pipeline artifact

Multi subscription deployment, release pipeline

After a successful build (or in this case, a file copy), it is time to create our release pipeline. When using YAML, the two pipelines are combined. Not at all confusing for someone like me, who not that many months ago didn't know anything about this stuff.

Once again, I start off with an empty job before adding the artifact from my build pipeline. I also renamed the first stage to "resource deployment".

Now it's a matter of adding tasks to our job, and this is where you need the service connection you added earlier. The task we are working with is the Azure PowerShell task, and for multi-subscription deployment it is the only task you'll need. Below is my task configuration.
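The configuration screenshot is gone from this archive, but for reference, the same task expressed in YAML would look roughly like this (the service connection name and script path are placeholders, and the task version may differ in your organization):

```yaml
steps:
- task: AzurePowerShell@4
  displayName: 'Multi subscription deployment'
  inputs:
    azureSubscription: 'lighthouse-service-connection'  # the SPN-backed service connection
    ScriptType: 'FilePath'
    ScriptPath: '$(System.ArtifactsDirectory)/drop/ARM-Templates/storage-Account/deploy.ps1'
    azurePowerShellVersion: 'LatestVersion'
```

The important parts are the service connection (which signs in the SPN before your script runs) and the path to deploy.ps1 inside the published artifact.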

For some reason, it takes a few tries before the pipeline wants to work. It might be because of late hours, or a law created by some guy named Murphy. Anyway, once it's up and running you should be able to see something similar to my output below:

2020-03-10T17:13:12.2503976Z ## Az module initialization Complete
2020-03-10T17:13:12.2517976Z ## Beginning Script Execution
2020-03-10T17:13:12.2532801Z Generating script.
2020-03-10T17:13:12.3293381Z ========================== Starting Command Output ===========================
2020-03-10T17:13:12.3416275Z ##[command]"C:\Program Files\PowerShell\6\pwsh.exe" -NoLogo -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -Command ". 'd:\a\_temp\112fe6df-8ee0-4a10-b53e-5694a4d34b0f.ps1'"
2020-03-10T17:13:25.0914923Z No subscription specified. Deploying to all subscriptions
2020-03-10T17:13:25.2151812Z 
2020-03-10T17:13:25.2211374Z Name                                     Account             SubscriptionName    Environment         TenantId
2020-03-10T17:13:25.2292766Z ----                                     -------             ----------------    -----------         --------
2020-03-10T17:13:25.2300155Z MVP-Sponsorship (6dca9329-fb22-46cb-826…  MVP-Sponsorship     AzureCloud          22046864-98a9-4a9…
2020-03-10T17:14:01.7100820Z 
2020-03-10T17:14:01.7151741Z Id                      : /subscriptions/6dca9329-fb22-46cb-826c-/providers/Microsoft.Resources/deployments
2020-03-10T17:14:01.7152947Z                           /Multi-sub-deployment
2020-03-10T17:14:01.7154721Z Location                : westeurope
2020-03-10T17:14:01.7159918Z ManagementGroupId       : 
2020-03-10T17:14:01.7161790Z ResourceGroupName       : 
2020-03-10T17:14:01.7162557Z OnErrorDeployment       : 
2020-03-10T17:14:01.7163149Z DeploymentName          : Multi-sub-deployment
2020-03-10T17:14:01.7163805Z CorrelationId           : d49c10f8-0260-49d5-aa8d-08a41591d1a7
2020-03-10T17:14:01.7164463Z ProvisioningState       : Succeeded
2020-03-10T17:14:01.7165107Z Timestamp               : 3/10/2020 5:14:00 PM
2020-03-10T17:14:01.7166059Z Mode                    : Incremental
2020-03-10T17:14:01.7166610Z TemplateLink            : 
2020-03-10T17:14:01.7167216Z TemplateLinkString      : 
2020-03-10T17:14:01.7167793Z DeploymentDebugLogLevel : 
2020-03-10T17:14:01.7168836Z Parameters              : {[rgName, Microsoft.Azure.Commands.ResourceManager.Cmdlets.SdkModels.DeploymentVariable], 
2020-03-10T17:14:01.7169800Z                           [location, Microsoft.Azure.Commands.ResourceManager.Cmdlets.SdkModels.DeploymentVariable], 
2020-03-10T17:14:01.7173723Z                           [storagePrefix, 
2020-03-10T17:14:01.7174379Z                           Microsoft.Azure.Commands.ResourceManager.Cmdlets.SdkModels.DeploymentVariable]}
2020-03-10T17:14:01.7175014Z ParametersString        : 
2020-03-10T17:14:01.7177870Z                           Name             Type                       Value     
2020-03-10T17:14:01.7178513Z                           ===============  =========================  ==========
2020-03-10T17:14:01.7179228Z                           rgName           String                     adatum    
2020-03-10T17:14:01.7179831Z                           location         String                     westeurope
2020-03-10T17:14:01.7180452Z                           storagePrefix    String                     str       
2020-03-10T17:14:01.7181006Z                           
2020-03-10T17:14:01.7181454Z Outputs                 : {}
2020-03-10T17:14:01.7181916Z OutputsString           : 
2020-03-10T17:14:01.7182291Z 
2020-03-10T17:14:02.0220641Z 
2020-03-10T17:14:02.0937716Z ##[command]Disconnect-AzAccount -Scope Process -ErrorAction Stop

If you look at the output, you can see that the script sets the context to a subscription, and that there is no subscription specified in the pipeline.

Multi subscription deployment summary

Multi-subscription deployment with Azure DevOps is not available by default. But with a little bit of PowerShell trickery, you get a great solution. For service providers and large enterprises, Azure Lighthouse is now a preferred way to manage resources.

By granting a service principal access to your customer subscriptions (or internal ones, for that matter) and using this SPN as a service connection in your Azure pipeline, you can use PowerShell to iterate through each subscription and deploy the resources needed.

The beauty of this is that it works regardless of where your DevOps environment is hosted. You can have separate tenants for Lighthouse, Azure DevOps, and your workplace.

This post described the following steps, which are required for multi-subscription deployment to work:

  • Created an SPN in the management tenant
  • Authorized the service principal through Azure Lighthouse
  • Created a repository and added our scripts and templates to it
  • Created pipelines and used the SPN as our service connection
  • Used PowerShell and the built-in task to iterate through each subscription and perform deployments.


Working with Azure Monitor Rest API

  • 29/02/2020 (updated 03/09/2020)
  • by Martin Ehrnst

It might be an edge case, or you are creating your own integration. But chances are, after fiddling around with Azure Monitor, you will encounter a situation where you have to work with its API.
Personally, I have had numerous situations over the last few years that required me to integrate directly with the Azure Management APIs.

In this blog post, I will help you get started using the APIs, hopefully making it less intimidating. All code examples will use PowerShell.

Create SPN and assign it to your subscription RBAC

I am using a service principal to connect programmatically to Azure. SPNs are created in Azure Active Directory as application registrations. You can choose whether to do this via the portal or by using PowerShell.

After creating (or reusing) an SPN, grant the application appropriate access. For simplicity, I am using Contributor on my subscription. That might be fine for you as well, but one should always use the least-privileged assignment needed.
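As a minimal sketch of this prerequisite, assuming the Az module (the display name and subscription id are placeholders):

```powershell
# create the app registration / service principal with a generated secret
$spn = New-AzADServicePrincipal -DisplayName "azuremonitor-api-spn"

# grant it Contributor on the subscription (use least privilege in production)
New-AzRoleAssignment -ApplicationId $spn.ApplicationId `
    -RoleDefinitionName "Contributor" `
    -Scope "/subscriptions/<subscription id>"
```

Note the generated secret and the application (client) id; you will need both, together with your tenant id, to authenticate below.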

Consider the above as a prerequisite for you to continue.

At this point, you should have an application registration, a secret, and a role assignment on your subscription. We can now use this to acquire an access token and connect to Azure Monitor’s REST API.

Connect to Azure Monitor API using PowerShell

Azure Monitor APIs are a part of the Azure Management APIs. I will, therefore, use these names interchangeably. Also keep in mind, that all other APIs under Azure Management will follow the same methods I demonstrate for Azure Monitor.

To query data we need to authenticate. In the example below, I am using client credentials to acquire the access token. Microsoft's official example uses the ADAL method, connecting with your identity. I have never had use for this, as I am usually writing service-to-service integrations.
If you are creating an interactive portal and want to leverage the user's authorization, ADAL (or MSAL) is probably better.
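The embedded code is missing from this archive, but a client-credentials token request against Azure AD looks roughly like this (tenant id, client id, and secret are placeholders; this uses the v1 token endpoint):

```powershell
$tenantId     = "<tenant id>"
$clientId     = "<application (client) id>"
$clientSecret = "<client secret>"

# request a token scoped to the Azure management endpoint
$body = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    resource      = "https://management.azure.com/"
}

$token = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" -Body $body

# reuse this header for every request against the management API
$headers = @{ Authorization = "Bearer $($token.access_token)" }
```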


Retrieve Azure Monitor alert rules

I have no idea why you are exploring Azure Monitor's API, so providing a complete integration solution is not possible. But my gut feeling is that alerts and metrics are a good place to start.

When working with alerts, we need to work with multiple endpoints. Depending on what you are working on, these are the most common:

  • Alert rules*
  • Alert incidents
  • Metric Alerts
  • Metric Alert status

We can start with one of the basics: retrieving the currently configured alert rules. To do that, we need to know what kind of rule it is. The classic alert rules (old type) use a single endpoint, while the current ones use their own.

Below I have included three endpoints and a screenshot. As you can see, all the information you would expect is found in the output. From here we can start to explore each alert rule by accessing its properties.
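The three endpoints did not survive this archive. To the best of my knowledge they correspond to the following management routes (API versions may have moved since; `$headers` is assumed to hold the bearer token from the authentication step):

```powershell
$subscriptionId = "<subscription id>"
$baseUri = "https://management.azure.com/subscriptions/$subscriptionId"

# classic (old type) alert rules
$classicRules = Invoke-RestMethod -Headers $headers `
    -Uri "$baseUri/providers/microsoft.insights/alertrules?api-version=2016-03-01"

# current metric alert rules
$metricAlerts = Invoke-RestMethod -Headers $headers `
    -Uri "$baseUri/providers/microsoft.insights/metricAlerts?api-version=2018-03-01"

# activity log alert rules
$activityLogAlerts = Invoke-RestMethod -Headers $headers `
    -Uri "$baseUri/providers/microsoft.insights/activityLogAlerts?api-version=2017-04-01"

# each response has a 'value' array holding the rule objects and their properties
$metricAlerts.value | Select-Object name, @{ n = 'severity'; e = { $_.properties.severity } }
```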

Azure monitor rest api metric alert rule output

Get resource metrics from Azure Monitors API

Metrics are another fundamental of monitoring. When working with the API in the context of metrics, you can explore the available metrics for each resource type by using the metric definitions endpoint.

Actual metric values require a bit more in the query itself. The official documentation describes everything pretty well, but I have provided an example for a VM below. It shows the basics of how to get data from one metric and one VM. You can add multiple metrics to one query, and do additional filtering using an OData filter.
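Again, the embedded example is missing from this archive; here is a sketch of a metric query for a single VM (the resource path is a placeholder, and `$headers` is assumed to hold a valid bearer token):

```powershell
$vmResourceId = "/subscriptions/<subscription id>/resourceGroups/<rg>" +
    "/providers/Microsoft.Compute/virtualMachines/<vm name>"

# one hour of 'Percentage CPU', averaged into 5-minute buckets
$uri = "https://management.azure.com$vmResourceId" +
    "/providers/microsoft.insights/metrics" +
    "?api-version=2018-01-01" +
    "&metricnames=Percentage%20CPU" +
    "&timespan=2020-03-10T16:00:00Z/2020-03-10T17:00:00Z" +
    "&interval=PT5M&aggregation=Average"

$metrics = Invoke-RestMethod -Headers $headers -Uri $uri

# each metric has one or more time series with timestamped data points
$metrics.value.timeseries.data | Select-Object timeStamp, average
```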


Manage alerts, updating status, etc.

Viewing configured alert rules, looking at disk metrics for a VM – but what about alerts, the actual things that send you emails? Can we work with them using this API? Yes, you can.

Like I said, providing a complete integration solution in this blog post isn't possible. But most integrations I have seen with Azure Monitor or other monitoring solutions have had some kind of functionality to handle active alerts. Personally, I created one for SquaredUp earlier, where we could acknowledge alerts in Azure Monitor as well as in our on-premises SCOM installation.

Before we wrap up, let's take a look at how we can interact with an active alert. I have configured a very naggy alert rule, creating a lot of noise, and I want to change the status of those alerts. Armed with PowerShell and the alerts management endpoints, everything is possible.
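As a sketch of what I mean by the alerts management endpoints (the rule name filter is a placeholder, and `$headers` is assumed to hold a valid bearer token), listing and acknowledging alerts could look like this:

```powershell
$subscriptionId = "<subscription id>"
$base = "https://management.azure.com/subscriptions/$subscriptionId" +
    "/providers/Microsoft.AlertsManagement/alerts"

# list all current alerts in the subscription
$alerts = Invoke-RestMethod -Headers $headers -Uri "$base?api-version=2019-03-01"

# acknowledge every alert fired by my naggy rule
$alerts.value |
    Where-Object { $_.properties.essentials.alertRule -like "*naggy-rule*" } |
    ForEach-Object {
        # the alert's 'name' property is its id; POST to changestate updates it
        Invoke-RestMethod -Method Post -Headers $headers `
            -Uri "$base/$($_.name)/changestate?api-version=2019-03-01&newState=Acknowledged"
    }
```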

Summary

This blog post has covered the basics regarding the Azure Monitor REST API and PowerShell. With the examples above and the official documentation, you can start creating your own solutions and integrations.

While we have only covered how to get data out of Azure Monitor, you should know it's also possible to inject data. With the HTTP Data Collector API and the custom metric store, the possibilities are 'endless'.

Integrations ideas

  • Alert remediation/handling from a ticketing system
  • Dashboarding with third-party or custom web integration
  • Teams/Slack/IM connector
  • Custom application metrics or logs

In my examples, I have purposely not included how new alert rules are created, as I believe this should be done through ARM. If that is your use case, you should know it is possible and fully supported.

This blog post was originally published in November 2017. It was rewritten for Azure Spring Clean 2020 and to reflect changes to the Azure Monitor API.


Azure spring clean 2020

  • 30/01/2020 (updated 03/02/2020)
  • by Martin Ehrnst

Azure Spring Clean is a community initiative where the idea is to convey best practices and lessons learned from managing Azure. During February 2020 we will have a broad selection of blog posts and videos on real-world scenarios and solutions from the community, covering topics like Azure Monitor, policies, and cost management.

The first post goes online on February 3.

| Date | Article | Contributor | Category |
| --- | --- | --- | --- |
| 03/02/20 | Azure RBAC – Best Practices | Alan Kinane | Azure Foundations |
| 04/02/20 | Azure Policy for AKS | Sam Cogan | Azure Policy |
| 05/02/20 | Monitoring Containers on Azure with Windows Admin Center | Dave Rendón | Azure Monitor |
| 06/02/20 | How to use Tags to organize your Azure resources | Wim Matthyssen | Azure Foundations |
| 07/02/20 | Azure Governance – Best Practises | Amine Charot | Azure Foundations |
| 10/02/20 | Nailing your Naming Convention with Azure Policy | Matt Browne | Azure Foundations |
| 11/02/20 | Azure Cost Management – Best Practises | Sarah Lean | Azure Cost Management |
| 12/02/20 | Protect your network resources with Azure Firewall | Luis Beltran | Azure Security Principles |
| 13/02/20 | Monitoring Azure Site Recovery | Karel De Winter | Azure Monitor |
| 14/02/20 | Using Azure Advisor to baseline your platform | Sam Hodgkinson | Azure Foundations |
| 17/02/20 | Using Azure Resource Graph To Assess Your Azure Environment Quickly & Efficiently | Jack Tracey | Azure Foundations |
| 18/02/20 | Azure Monitor – Best Practices for Sanity | Kam Salisbury | Azure Monitor |
| 19/02/20 | Azure Storage and Backup Lifecycle Best Practices | Dwayne Natwick | Azure Foundations |
| 20/02/20 | How to Use and Monitor Azure Update Management | Vukasin Terzic | Azure Fundamentals |
| 21/02/20 | Azure Security: my top 10 best practises to make your tenant secure as possible | Shabaz Darr | Azure Security Principles |
| 24/02/20 | Simplify Large Scale Deployments with Azure Blueprints | Isham Mohamed | Azure Foundations |
| 25/02/20 | Azure Kubernetes Service (AKS) securing Clusters and Applications | Adil Touati | Azure Security Principles |
| 26/02/20 | Azure Monitor – Autoscaling Resources Based on Performance | Anthony Mashford | Azure Monitor |
| 27/02/20 | How to Avoid a Billing Shock With Azure Serverless Solutions | Stanislav Lebedenko | Azure Cost Management |
| 28/02/20 | Securing Your Azure Platform Web Applications | Tidjani Belmansour | Azure Security Principles |



Thanks to all the content creators, and to MVPs Joe Carlyle and Thomas Thornton for starting this!


Access to Blob storage using Managed Identity in Logic…

  • 23/01/2020
  • by nadeemahamed

I am delighted to share the first guest post on my blog.
Nadeem Ahamed Riswanbasha is a cloud enthusiast and community contributor working for Serverless360 in India.

Please check out his Twitter and LinkedIn profile.

Access to Blob storage using Managed Identity in Logic Apps

By default, when we create a new blob storage container, the public access level is set to "Private (no anonymous access)". This is to strengthen the security of the blob container. Nevertheless, if you wish to set the public access level to "Container (anonymous read access to the container)", which allows anyone to access the files, you can change it at creation time.

Assume the business use case requires a high level of security and you want to keep the container/blob more secure. In that case, there are many ways to access the secured blob/container through proper authentication. One way of achieving authentication is through Managed Identity.

What is Managed Identity?

Managed Identity allows you to authenticate to Azure AD and access Azure resources. On the back end, the identity (credentials) is managed and secured for you. You don't need to provide or rotate any secrets.

There are two types of Managed Identity:

System-assigned

The lifecycle of a system-assigned identity is directly tied to the Azure service instance it's enabled on (here, the logic app). So, if the logic app is deleted, Azure automatically cleans up the credentials and the identity in Azure AD.

User-assigned

A user-assigned managed identity is created as a standalone Azure resource, i.e. not tied to any service. So it is the same as explicitly creating an AD app, and it can be shared by any number of services. Currently, Logic Apps only supports the system-assigned identity. Now, let us explore how to authenticate access to Azure Blob storage using Managed Identity in Logic Apps.

Enable managed identity in Logic Apps

First off, we need to enable the system-assigned identity in the logic app that you wish to access the blob storage from.
To do this, follow the steps below:

  1. Go to the logic app menu and select the identity option under settings
  2. A new window will open; switch the status option to On and click Save.
enable system assigned identity in Logic App

Give managed identity access to Blob storage through RBAC

  1. Switch to the Azure Blob Storage container menu
  2. In the left pane, click on the Access control (IAM)
  3. Go to the Role Assignments option and click Add
  4. Now, a new blade will be opened on the right side of the window
  5. Fill in the details as follows:
    • Role – Storage Blob Data Contributor
    • Assign access to – Logic App
    • Select – <your logic app>
add role assignment RBAC

You can now see that the logic app has been assigned the Storage Blob Data Contributor role in the role assignment section.

Design the Logic App to access the Blob

Now, let us jump in and design the logic app to access the blob storage container or the files in it. But remember, not all logic app triggers and actions support the managed identity feature. Here is the list of triggers and actions that support it.

  1. Now, let us head back to the logic app (here LinkedInTest) designer page
  2. Add a Recurrence trigger for the logic app with a defined interval of time
  3. Subsequently, add a new HTTP (it supports managed identity) action to the logic app
  4. In the HTTP action, fill in the following fields:
    • Method – GET
    • URI – <the URI of the blob>
    • Headers –
      • x-ms-blob-type: BlockBlob
      • x-ms-version: 2019-02-02
    • Authentication – Managed Identity
    • Audience – https://storage.azure.com (this sets the scope to all of your storage accounts)
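Under the hood, the HTTP action performs an ordinary blob GET with a bearer token issued to the logic app's identity. A hand-rolled equivalent of the request it sends would look roughly like this (the account, container, and blob names are placeholders; the token is acquired for you by the Managed Identity setting):

```
GET https://<account>.blob.core.windows.net/<container>/<blob> HTTP/1.1
x-ms-blob-type: BlockBlob
x-ms-version: 2019-02-02
Authorization: Bearer <token issued for the https://storage.azure.com audience>
```

This is why the RBAC assignment from the previous section is required: the storage account authorizes the token bearer, not an account key or SAS.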

Test your Managed Identity enabled Logic App

Save the logic app and run it. You can now see in the run history of the logic app that blob content has been successfully accessed through managed identity authentication.

Wrap-up

In this blog, we have seen how to access a blob using a system-assigned managed identity in Logic Apps.

On enabling the logic app's managed identity, an AD app is created in Active Directory with the same name as the Azure service (here LinkedInTest, the logic app); you can find it under Enterprise Applications.

Hope you enjoyed reading this article. Happy Learning! 


Speaking at NIC 2020 vision

  • 17/01/2020
  • by Martin Ehrnst

Nordic Infrastructure Conference, or NIC for short, is a leading technology conference in the Nordics.

If you have been to NIC previously, you already know that the conference is packed with great speakers from across the industry. It doesn't matter if you work with Microsoft Azure or AWS – technology is the focus of NIC Conf.

Whether you are an IT pro looking to step up your automation game, or just interested in new technology, I encourage you to attend my session about Azure serverless.

The two-day conference is held on February 6 and 7 in Oslo, Norway. Are you coming?
