Microsoft Secure Application Model

The Secure Application Model was released by Microsoft late last year (2018). At the time I noticed it but didn't quite understand how it impacted my work, so I moved on.

A few weeks ago I discovered what this change actually means.
If you are a Microsoft Cloud Solution Provider or a control panel vendor, you will have to move to this new authentication model soon. Depending on how you deliver apps or how you manage your customer tenants, there is quite a bit of work to do.

Microsoft is enforcing MFA on all user accounts with access to CSP. That is great, but if you are using app + user credentials to access Partner Center (and you likely are), you cannot keep doing this programmatically, as the current method uses the password grant. I have written about how that method works here.

The Secure Application Model depends on refresh tokens and access tokens. As a service provider, your customers will have to consent to an application getting access to their tenant. When the admin user consents, you will get a code response. This code is used to create a refresh token, which can later be used to access Azure or other resources.
I am not the right person to teach you how refresh and access tokens work, but I found the articles on oauth.com pretty good.

The picture below shows a broad overview of the flow and the ‘infrastructure’ required. I suggest you download the official document as well.

Token flow in the secure application model

Secure application model infrastructure

I have built these examples with PowerShell to authenticate to a customer tenant using the new model. The setup used here assumes you function as a managed service provider, maintaining your customers' Azure tenants and subscriptions, so that you can consent on behalf of your customers. If you provide Azure Marketplace applications the process is a bit different, but infrastructure-wise we are using the same tools.

In my implementation of the Secure Application Model I have used the following tools:

  • Azure Key Vault
  • A multi-tenant Azure AD application with access to the APIs you require
  • A user able to consent (in my case a member of Admin Agents in CSP)
  • A single-tenant Azure AD application to authenticate against Key Vault

Multi-tenant Azure AD application

If you find this blog post interesting, I assume you already have a multi-tenant Azure AD app used in your integration or software delivery.
If not, you can check out my blog post on how to create Azure AD apps using PowerShell.

In my case I have one application used to monitor customers' workloads in Azure. This application has access to the Azure management APIs.

Single-tenant web app

This is the additional application needed. It represents your system, in my case a monitoring tool. I use this application registration to access the key vault where I have stored my refresh token. The refresh token is then used to get an access token for a customer tenant.

Azure Key Vault

We need a secure place to store the refresh token and possibly other things down the road. I chose to run with Key Vault. There are multiple blog posts and plenty of documentation on how to provision Key Vault and grant permissions, but remember to give your single-tenant application read and write access to secrets.
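As a rough sketch (assuming the Az.KeyVault module and a service principal for the single-tenant app; the vault name and application ID are placeholders), granting that access could look like this:

```powershell
# Give the single-tenant app's service principal read/write access to secrets.
# Vault name and application (client) ID are placeholders - replace with your own values.
Set-AzKeyVaultAccessPolicy -VaultName 'my-csp-vault' `
    -ServicePrincipalName '00000000-0000-0000-0000-000000000000' `
    -PermissionsToSecrets get,set
```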

Using PowerShell and REST APIs

As I have multiple times before, I chose to run with REST rather than PowerShell modules. Feel free to use modules or SDKs; they just don't work that well in my environment.

Getting consent

I'm sure there are much slicker ways to do this, but I only needed one consent to make our integration work. If you have to handle multiple refresh tokens, I would build some kind of callback service that could handle the consent flow.
Alter the following code to your needs, paste the resulting URL in your web browser and sign in with appropriate credentials. In return you will receive a code. Copy it and use it in the next step.
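The original snippet is not reproduced in this excerpt; a minimal sketch of how the consent URL might be built (the client ID and redirect URI are placeholders, not values from the original post) could look like this:

```powershell
# Build the admin consent URL for the multi-tenant application.
# ClientId and RedirectUri are placeholders - replace with your own values.
$clientId    = '<multi-tenant-app-client-id>'
$redirectUri = 'https://localhost'   # must match a reply URL on the app registration

$consentUrl = 'https://login.microsoftonline.com/common/oauth2/authorize' +
              "?client_id=$clientId" +
              '&response_type=code' +
              "&redirect_uri=$([uri]::EscapeDataString($redirectUri))" +
              '&response_mode=query'

# Paste the resulting URL into a browser and sign in with an Admin Agent account.
$consentUrl
```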

Azure AD refresh token

Now that you have consent it is time to get a refresh token. This is what you later use to get access tokens for your customer tenants. By default, and as long as it is used, the refresh token is valid for 90 days. You will have to store it in Key Vault or a similar service.
Add your information to the script below to get your refresh token.
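The original script is not reproduced in this excerpt; a minimal sketch of the token request (all values are placeholders) could look like this:

```powershell
# Exchange the authorization code from the consent step for a refresh token.
# All values below are placeholders - replace with your own.
$clientId     = '<multi-tenant-app-client-id>'
$clientSecret = '<multi-tenant-app-secret>'
$authCode     = '<code-from-consent-step>'
$redirectUri  = 'https://localhost'

$body = @{
    grant_type    = 'authorization_code'
    client_id     = $clientId
    client_secret = $clientSecret
    code          = $authCode
    redirect_uri  = $redirectUri
    resource      = 'https://api.partnercenter.microsoft.com'
}

$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri 'https://login.microsoftonline.com/common/oauth2/token' `
    -Body $body

# This is the value you store in Key Vault.
$refreshToken = $tokenResponse.refresh_token
```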

Write and retrieve from Key Vault API

Since you don't want to get a new consent every time, you will need to save your refresh token to a secure place. I chose to run with Key Vault, but feel free to choose whatever software you want. Below are two snippets that allow you to write and retrieve secrets from Azure Key Vault.
You will need your key vault URL and the single-tenant application ID and secret.
That way the application accessing your customers' tenants, in my case a monitoring system, has its own credentials, separated from the credentials acquiring the refresh and access tokens.
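The original snippets are not reproduced in this excerpt; a rough sketch against the Key Vault REST API (the vault name, app credentials and api-version are assumptions) might look like this:

```powershell
# Authenticate the single-tenant application against Key Vault (client credentials flow).
# All values below are placeholders.
$kvTenantId     = '<your-own-tenant-id>'
$kvClientId     = '<single-tenant-app-client-id>'
$kvClientSecret = '<single-tenant-app-secret>'
$vaultUrl       = 'https://my-csp-vault.vault.azure.net'

$kvToken = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$kvTenantId/oauth2/token" `
    -Body @{
        grant_type    = 'client_credentials'
        client_id     = $kvClientId
        client_secret = $kvClientSecret
        resource      = 'https://vault.azure.net'
    }).access_token

$kvHeaders = @{ Authorization = "Bearer $kvToken" }

# Write the refresh token as a secret.
Invoke-RestMethod -Method Put `
    -Uri "$vaultUrl/secrets/PartnerCenterRefreshToken?api-version=7.0" `
    -Headers $kvHeaders -ContentType 'application/json' `
    -Body (@{ value = $refreshToken } | ConvertTo-Json)

# Retrieve the secret again (latest version).
$storedToken = (Invoke-RestMethod -Method Get `
    -Uri "$vaultUrl/secrets/PartnerCenterRefreshToken?api-version=7.0" `
    -Headers $kvHeaders).value
```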

Retrieving data from customer tenant

It's time to connect to your customers' tenants. Before doing that, let's summarize. By now you should have the following in place:

  • One multi-tenant application with API access and proper consent
  • A key vault with a refresh token
  • A single-tenant (your integration) application with access to the key vault
  • App IDs and access keys for both application registrations

Below I have included three examples of how to retrieve data from your customers. The first gets all customers from Partner Center, the second uses the same refresh token to access Microsoft Graph, and the third accesses the Azure management APIs (Azure Resource Manager). In order for this to work, your multi-tenant app must have access to these APIs.
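The original examples are not reproduced in this excerpt. As a hedged sketch of the common pattern, the snippet below exchanges the stored refresh token for an Azure Resource Manager access token in a customer tenant (all IDs are placeholders); the same pattern applies to Microsoft Graph and Partner Center by changing the resource and tenant:

```powershell
# Exchange the refresh token (from Key Vault) for an access token in a customer tenant.
$customerTenantId = '<customer-tenant-id>'
$clientId         = '<multi-tenant-app-client-id>'
$clientSecret     = '<multi-tenant-app-secret>'

$armToken = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$customerTenantId/oauth2/token" `
    -Body @{
        grant_type    = 'refresh_token'
        refresh_token = $storedToken          # retrieved from Key Vault earlier
        client_id     = $clientId
        client_secret = $clientSecret
        resource      = 'https://management.azure.com/'
    }).access_token

# List the customer's subscriptions through Azure Resource Manager.
Invoke-RestMethod -Method Get `
    -Uri 'https://management.azure.com/subscriptions?api-version=2016-06-01' `
    -Headers @{ Authorization = "Bearer $armToken" }

# For Microsoft Graph, use resource 'https://graph.microsoft.com' instead.
# For Partner Center, request a token for 'https://api.partnercenter.microsoft.com' in your
# own tenant and then call the generatetoken endpoint as described later in this post.
```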

Azure monitoring, connecting the dots


Welcome to the continuing saga on how to monitor your customers' Azure tenants as a service provider. Previously we have covered how to authenticate against Microsoft CSP, how to use the Azure Resource Health API with PowerShell, and more.

This post is all about connecting the dots. We are far from finished, but things are moving in this project and at the time of writing we have two separate workstreams going.
The first is focused on creating a single pane of glass for all our customers' workloads. This involves custom coding and management pack development for SCOM. The second, which this post will cover, is how we have designed each customer tenant and how we plan to use built-in Azure monitoring functionality.

 

Customer tenant setup

Working for a service provider, we need to construct Azure tenants while taking into account that we are going to manage the cloud resources, so using many of the built-in cloud features makes a lot of sense. The challenge is that we always have to think about how we can integrate with existing deployments and work with monitoring solutions on-premises.

When we first started this project we looked into what had been done before, and most of the examples we found would not scale to our requirements or used OMS/Log Analytics only. We wanted to use our SCOM environment for alert handling, dashboards and platform health, as SCOM is already integrated with customer portals, CMDBs and more. We will discuss that later in this blog post.

Things are moving very fast in Azure; we changed our initial customer tenant setup twice before we found a structure we believe is future friendly.
When a customer signs up for an Azure subscription, we populate their tenant with a default monitoring resource group and an OMS/Log Analytics (LA) workspace. Along with the default LA workspace we add the Azure Activity Log, Web Apps and Office 365 solutions as standard (see the sketch below).
For “bread and butter” Azure resources, such as compute and web apps, we set up the same monitoring regime we provide for on-premises resources, but we use alerts in Azure Monitor. This approach works well for Azure resources which do not have existing custom Log Analytics solutions and searches to provide health state. This means that VMs deployed using our custom ARM template also include Monitor alerts such as “CPU usage % above 95” and “Web app response time above x”. In conjunction with Azure Monitor we use Azure Resource Health, which provides health state data regardless of resource type, alongside custom alerts in Monitor or Log Analytics.
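As a rough sketch of that default provisioning (using the Az PowerShell modules; the resource names, location and solution names are assumptions, not our actual template):

```powershell
# Provision the default monitoring resource group and Log Analytics workspace
# for a newly onboarded customer subscription. Names and location are placeholders.
Import-Module Az.Resources, Az.OperationalInsights

New-AzResourceGroup -Name 'rg-monitoring-default' -Location 'westeurope'

$workspace = New-AzOperationalInsightsWorkspace `
    -ResourceGroupName 'rg-monitoring-default' `
    -Name 'la-customer-default' `
    -Location 'westeurope' `
    -Sku 'PerGB2018'

# Enable the standard solutions (the intelligence pack names are assumptions).
'AzureActivity', 'AzureWebAppsAnalytics', 'Office365' | ForEach-Object {
    Set-AzOperationalInsightsIntelligencePack `
        -ResourceGroupName $workspace.ResourceGroupName `
        -WorkspaceName $workspace.Name `
        -IntelligencePackName $_ `
        -Enabled $true
}
```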

Below is a (not so detailed) illustration of our default tenant.

 

SCOM and Azure Integration

We use System Center Operations Manager (SCOM) as our main monitoring platform for operating systems and applications. As SCOM is already integrated with our ticketing system, CMDB and other internal tools, it seems reasonable to provide insight into applications and workloads running in Azure on the same monitoring platform. That way we can provide a single pane of glass into on-premises, hybrid and cloud-only workloads.

 

Azure Management Packs

To get monitoring data into our on-premises SCOM we looked into two major options.

Option #1:
The official Azure management pack from Microsoft. The official MP's discovery process for adding new tenants cannot be automated; it relies on a GUI where you sign in to the tenant. Neither does it provide any “umbrella” functionality for companies enrolled in the CSP program.

Option #2:
Daniele Grandini's Azure/OMS management packs. Daniele's management packs provide insight into Log Analytics, Azure Backup and Automation, but rely on the official Microsoft MP for initial discovery. They focus on the solutions within the “Monitoring + Management” (formerly known as OMS) space in Azure. Since many of the alerting features from OMS/Log Analytics are moving to Azure Monitor, I reached out to Daniele and asked if he had looked into creating a management pack for that. He had looked a little into it, but was also concerned about the rapid changes. Unfortunately these MPs are bound to the initial discovery from the official Azure MP, and a service provider managing several hundred tenants (and growing) cannot live with that limitation. I hope to be able to help Daniele with an upcoming Azure Monitor MP.

Here's where our problems started. I wanted to discover all our managed tenants automatically. Taking advantage of being a CSP, we set out to create our own management pack(s). I have created one management pack for the CSP platform that integrates with the Partner Center API (see the example in this blog post) to do the initial discovery. Tenants and subscriptions are populated as objects in SCOM. Further, using a Partner Center managed application we can pre-consent access to all managed tenants. That means we can use this application's credentials to authenticate against each of our managed tenants, bypassing the limitation of the official management pack. All resources are then created as objects with a hosting relationship to resource group, subscription and tenant. Basic monitoring is done through the Azure Resource Health API.

Below is a diagram showing the structure of our CSP management pack.

Credentials used to authenticate against Partner Center and the Azure tenants are provided through SCOM Run As accounts.

Our next step in SCOM and Azure integration is to create an Azure Monitor management pack that references the CSP management pack. This will provide the richer monitoring offered by Azure Monitor. Due to the many recent changes to the Monitor platform I have decided to wait and see where things end up. At the time of writing, Azure Monitor has two new alert features in preview and none of their APIs are officially documented; I will come back with examples when I have something tangible.

Summary

To provide effective monitoring as a service provider for customers which span on-prem and cloud environments, we recommend the following:

  1. For “bread and butter” monitoring, use a combination of SCOM and Azure Monitor.
  2. If you are in the CSP program, create a management pack using the CSP REST APIs (hopefully I can share our MP later) combined with a custom Azure Monitor MP.
  3. Not a CSP? Look into a combination of the official MP and Daniele’s management packs.
  4. Deploy Log Analytics by default to all tenants. This will give you an advantage when customers require custom solutions and log sources.

Wrapping up

All service providers do their monitoring differently, but hopefully you have gotten some ideas on how you can do yours. Our solution is far from finished, but I feel we have a structure that is future-proof (the modern type of future). Hopefully we can share the SCOM management packs later, but feel free to contact me about specifics. Just remember I cannot share the MP itself at this point in time.

Until further notice, this will be the closing post on how you can do Azure Monitoring as a service provider.

 

Big thanks to Kevin Green and Cameron Fuller for providing feedback and for reaching out to other community friends on my behalf.

Working with the Azure Monitor REST API

Before delving into the Azure Monitor REST API and PowerShell, let's take a small step back. Azure Monitor was released in public preview a little over a year ago (September 2016), introduced as “the built-in solution to make monitoring available for all Azure users”.
At that time I was personally all over the Operations Management Suite, or OMS, which is now a deprecated brand. All features from OMS are now available under “Monitoring + Management” in the Azure Marketplace.

Therefore, Azure Monitor flew a bit under my radar, but when OMS shifted we started to see more and more about Azure Monitor becoming the one-stop shop for Azure monitoring. Especially the alerting feature seems to be richer in Monitor than in Log Analytics, and it is my (and others') expectation that we will see Monitor as the default alerting tool for both metrics and activity logs.

So, as part of my never-ending story of monitoring Azure as a CSP, let's take a look at Azure Monitor and its REST API.

We want to retrieve alert rules and incidents (alerts) programmatically, but first we create an alert rule to work with through the GUI, in the Azure portal:

At the time of writing, an alert rule is bound to a specific resource. I hope we will see the ability to create rules based on resource type, for example when you want all web apps to have the same standard alert rule for response time.

Locate Azure Monitor

Find your resource and metric (or you can jump straight into alerts)

Verify your alert rule exists

Retrieve Azure Monitor alerts and incidents

From PowerShell I connect to Azure AD and generate an authentication header, reusing code from an earlier blog post about Azure Resource Health. For the purpose of this post I will focus on two Azure Monitor API endpoints: Alert Rules – List By Resource Group and Alert Rule Incidents – List By Alert Rule.

After you have authenticated against Azure AD, and if you are using my previous sample, you should have the following header available.
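The header itself is not shown in this excerpt; assuming the $accessToken variable from the earlier authentication sample, it is essentially just a bearer token header:

```powershell
# $accessToken is assumed to come from the Azure AD authentication in the earlier post.
$authHeader = @{
    'Content-Type'  = 'application/json'
    'Authorization' = "Bearer $accessToken"
}
```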

Using this header we call the alert rules endpoint to get our alert rules. Pay attention to the URL, as it requires you to specify a subscription ID and the resource group name.
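The original snippet is not included here; a minimal sketch of the call (the subscription ID and resource group are placeholders, and api-version 2016-03-01 is the classic alert rules version) could look like this:

```powershell
# List alert rules in a resource group (classic metric alerts, microsoft.insights/alertrules).
$subscriptionId = '<subscription-id>'
$resourceGroup  = '<resource-group-name>'

$uri = "https://management.azure.com/subscriptions/$subscriptionId" +
       "/resourceGroups/$resourceGroup" +
       '/providers/microsoft.insights/alertrules?api-version=2016-03-01'

$alertRules = (Invoke-RestMethod -Method Get -Uri $uri -Headers $authHeader).value
$alertRules
```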

 

The output from the above should look something like this. I only have one alert rule configured for the resource group:


id : /subscriptions/a2782f8e/resourceGroups/2017/providers/microsoft.insights/alertrules/name
name : name
type : Microsoft.Insights/alertRules
location : westeurope
tags : @{$type=Microsoft.WindowsAzure.Management.Common.Storage.CasePreservedDictionary,}
properties : @{name=No runs; description=; isEnabled=True; condition=; action=; lastUpdatedTime=2017-11-02T12:52:26.9091865Z; provisioningState=Succeeded; actions=System.Object[]}

 

Next we use the ID from our previous result as part of the URL to get our alert incidents.
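As a sketch of that call, reusing $alertRules and $authHeader from above (the api-version is the same assumption as before):

```powershell
# Get incidents for the first alert rule, using its resource ID in the URL.
$ruleId = $alertRules[0].id

$incidentUri = "https://management.azure.com$ruleId/incidents?api-version=2016-03-01"

$incidents = (Invoke-RestMethod -Method Get -Uri $incidentUri -Headers $authHeader).value
$incidents
```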

I am jumping straight into the 'value' node at this point. If your alert rule has triggered an incident, it will return a result. We see the time it was activated (and resolved, if it is old), a boolean value for its status, and some information about the resource itself.


id : L3N1YnNjcmlwdGlvbnMvZTUxNGRhY2EtYTA3Ny00NGYwLTljZmEtNjBlMzRjOTk1Zjk3L3Jlc291cmNlR3JvdXBzL1NlbWluYXIyMDE3L3Byb3ZpZGVycy9taWNyb3NvZnQuaW5zaWdodHMvYWxlcnRydWxlcy9ObyUyMHJ1bnMwNjM2NDUyMjM3NjcyODMwODI2
ruleName : /subscriptions/a2782f8e/resourceGroups/2017/providers/microsoft.insights/alertrules/name
isActive : True
activatedTime : 2017-11-02T12:49:27.2830826+00:00
resolvedTime :
targetResourceId :
targetResourceLocation :
legacyResourceId :


Below I have tied everything together: using my Azure app authentication function we generate the auth header and retrieve alert rules based on user input.
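The combined script is not included in this excerpt; a hedged sketch of how it might look (Get-AzureADAuthHeader is a stand-in for the authentication function from the earlier post, and the parameter names are my own):

```powershell
function Get-AzureMonitorAlertRules {
    <#
        Sketch: retrieves classic Azure Monitor alert rules and their incidents
        for a given subscription and resource group.
    #>
    param (
        [Parameter(Mandatory)] [string] $SubscriptionId,
        [Parameter(Mandatory)] [string] $ResourceGroup,
        [Parameter(Mandatory)] [hashtable] $AuthHeader   # e.g. from Get-AzureADAuthHeader
    )

    $rulesUri = "https://management.azure.com/subscriptions/$SubscriptionId" +
                "/resourceGroups/$ResourceGroup" +
                '/providers/microsoft.insights/alertrules?api-version=2016-03-01'

    foreach ($rule in (Invoke-RestMethod -Method Get -Uri $rulesUri -Headers $AuthHeader).value) {
        $incidentUri = "https://management.azure.com$($rule.id)/incidents?api-version=2016-03-01"
        $incidents   = (Invoke-RestMethod -Method Get -Uri $incidentUri -Headers $AuthHeader).value

        [pscustomobject]@{
            RuleName  = $rule.name
            IsEnabled = $rule.properties.isEnabled
            Incidents = $incidents
        }
    }
}

# Example usage (Get-AzureADAuthHeader is hypothetical; replace with your own auth function):
# $header = Get-AzureADAuthHeader
# Get-AzureMonitorAlertRules -SubscriptionId '<sub-id>' -ResourceGroup '<rg>' -AuthHeader $header
```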


Wrapping up

We now know the basic concepts of how to authenticate and retrieve alert rules and incidents from Azure Monitor through its REST API using PowerShell. From here we can easily expand our work to create new alert rules or retrieve metrics from our resources, which lets us build custom solutions on-premises or anywhere else.

If you want to continue exploring Azure monitoring capabilities, I suggest you follow Adin Ermie's series describing all the different solutions.

Hopefully I can soon provide some insights on how we are building our CSP monitoring solution in a single blog post, using the different tools mentioned in my latest posts.

Authenticate against the Microsoft Partner Center API using PowerShell

Update 04.01.2019:
While the method described in this post still works, Microsoft is moving to what they call the Secure Application Model, meaning that the password grant is deprecated and you will need to use a refresh token model. I have written a new blog post explaining the new model.

If you're not familiar with the Microsoft Cloud Solution Provider (CSP) program, it is, in short, a program that lets service providers more easily manage their customers' tenants and subscriptions within Azure and Office 365 from a centralized platform.

Apart from a very limited web portal, it has a set of APIs and SDKs for building your own solutions, which I assume is preferred by both Microsoft and the service provider. For a project I needed to authenticate against the REST API using PowerShell and then retrieve some information about each tenant. Who would have thought that could be so much work?

Here's what I said:

That’s fine, I will have it to you in an hour.

For your reference, this is the API I am working with: Partner Center Swagger

An hour later I did have authentication in place, but I was unable to retrieve any information about our customers. After digging through the documentation I found that the customer endpoints require “App + User authentication”, whereas I had only authenticated with an app ID and app secret.

After spending too much time deciphering the C# examples of how to authenticate with app and user against the CSP REST API, I finally had a working PowerShell function.

These are the steps required:

  • Generate a token from Azure AD by calling https://login.microsoftonline.com/tenant-name/oauth2/token
    • Specify the resource you want to access (the Partner Center API), client ID, username and password, and the correct grant type and scope
  • Use the Azure AD token to authenticate against partnercenter/generatetoken and receive a proper App + User jwt_token
  • Use the JWT token to authenticate against whichever endpoints you prefer

If you ever find yourself in a situation where you need to authenticate against the CSP REST API as app + user, here is a function to do it.

Be aware that the function requires a credential object, but when you authenticate against Azure AD the password is decoded and sent in the POST request.
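The original function is not included in this excerpt; below is a hedged sketch of the same flow (the function and parameter names are my own, and the tenant and app values are placeholders):

```powershell
function Get-PartnerCenterToken {
    <#
        Sketch: App + User authentication against the Partner Center REST API.
        1. Password grant against Azure AD for the Partner Center resource.
        2. Exchange the AAD token at partnercenter/generatetoken for an App + User token.
    #>
    param (
        [Parameter(Mandatory)] [string] $TenantName,          # e.g. 'contoso.onmicrosoft.com'
        [Parameter(Mandatory)] [string] $ClientId,            # Partner Center app registration
        [Parameter(Mandatory)] [pscredential] $Credential     # user with CSP access
    )

    # Step 1: Azure AD token (password grant). Note the password is decoded and sent in the body.
    $aadBody = @{
        grant_type = 'password'
        resource   = 'https://api.partnercenter.microsoft.com'
        client_id  = $ClientId
        username   = $Credential.UserName
        password   = $Credential.GetNetworkCredential().Password
        scope      = 'openid'
    }

    $aadToken = (Invoke-RestMethod -Method Post `
        -Uri "https://login.microsoftonline.com/$TenantName/oauth2/token" `
        -Body $aadBody).access_token

    # Step 2: exchange the AAD token for an App + User token at the Partner Center token endpoint.
    $pcResponse = Invoke-RestMethod -Method Post `
        -Uri 'https://api.partnercenter.microsoft.com/generatetoken' `
        -Headers @{ Authorization = "Bearer $aadToken" } `
        -ContentType 'application/x-www-form-urlencoded' `
        -Body 'grant_type=jwt_token'

    # The returned token is used as "Authorization: Bearer <token>" against Partner Center endpoints.
    return $pcResponse.access_token
}

# Example usage (values are placeholders):
# $cred  = Get-Credential
# $token = Get-PartnerCenterToken -TenantName 'contoso.onmicrosoft.com' -ClientId '<app-id>' -Credential $cred
# Invoke-RestMethod -Uri 'https://api.partnercenter.microsoft.com/v1/customers' -Headers @{ Authorization = "Bearer $token" }
```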