
Migrate from Azure DevOps to GitHub – what you…

  • 16/02/2024 (updated 07/01/2025)
  • by Martin Ehrnst

Developers love GitHub, and I do too! However, migrating from one platform to another is akin to orchestrating a complex dance. The rhythm of code, the choreography of pipelines, and the harmony of collaboration must all align seamlessly. But beneath the surface lie the intricate steps: data mapping, permissions, and legacy dependencies. As the curtain rises, let us delve into the intricacies of this migration journey, where every line of code carries the weight of history, and every commit echoes the promise of a new beginning.

I’ve been part of a project like this before, and now I find myself in the same situation. The goal is the same, but not all solutions are identical. You will need to adjust your tasks to fit your company’s situation. But in this blog post, I will outline what you need to know when migrating from Azure DevOps to GitHub.

GitHub Organization and GitHub Enterprise

GitHub Enterprise is probably what you’re looking for. You can get some things done with an Organization only, but if you want to use certain security features, say branch policies for internal repositories, you’re forced to go the Enterprise route.

GitHub Organizations are well-suited for open-source projects, small teams, or individual developers.

  • They provide a collaborative space for managing repositories, teams, and access permissions.
  • Features include team-based access control, issue tracking, and project management.
  • Ideal for community-driven development and public repositories.

GitHub Enterprise caters to larger organizations and enterprises.

  • It offers enhanced security, scalability, and administrative controls.
  • Features include advanced auditing, single sign-on (which you want), and enterprise-grade support.
  • Perfect for companies with (any) compliance requirements and a need for robust infrastructure.

GitHub Enterprise – account types

Time to choose how accounts are managed. In Azure DevOps you probably used your company’s existing accounts synced with Entra ID. For GitHub you have two options.

Individual accounts using SAML: These are user accounts linked to an identity provider (IdP) using Security Assertion Markup Language (SAML). Users can sign in to GitHub Enterprise Cloud with their existing credentials through an account linked to your identity provider, most likely Microsoft Entra ID.

The other option is Enterprise managed users: Here the user accounts are provisioned and managed through the IdP. Of course, this is what you want, right? Full control! However, in both projects we ended up with option 1, individual accounts with SAML. What you do comes down to whether you favor developer experience a bit more than full control and central management.

Enterprise managed users are completely blocked from any collaboration outside your enterprise; they cannot even create issues on a public repo. I really hope GitHub will change this, because what we actually want is both!

Migrating repositories

Let’s delve into the more technical side of things. You want your repositories moved from DevOps to GitHub. And that is pretty damn simple, possibly the easiest part of the whole project, as both platforms use Git as the underlying technology.

If you only have a handful of repos, a simple git clone can do the job. But most likely you want to do a bit more, and if you are like me, working as a platform engineer or similar, you probably want to streamline the process and have each repository set up with some baseline settings. All this will require some scripting.
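For the handful-of-repos path, a mirror clone brings along all branches and tags in one go. A minimal sketch, with hypothetical organization, project, and repository names:

# Mirror-clone from Azure DevOps and push everything to an empty GitHub repo (hypothetical names)
git clone --mirror "https://dev.azure.com/Adatum/MyProject/_git/my-repo"
Set-Location my-repo.git
git push --mirror "https://github.com/Adatum/my-repo.git"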

Enter GitHub CLI and the ADO2GH extension. Despite a super annoying limitation requiring personal access tokens (PAT) for both DevOps and GitHub, I still think this is your best option. I spent a few hours trying to find out how to use the GH CLI with an application, but without luck. Considering this is a time-limited project, using a PAT from a service account (it will consume a license) is acceptable.

Our solution for migrating repositories is a workflow in GitHub that developers can run. Below is an example of the workflow and the PowerShell migration script.

# migrate-devops-repo.ps1
param(
    [Parameter(HelpMessage = "The Azure DevOps organization.")]
    [string]$adoOrg = "Adatum",
    [Parameter(Mandatory = $true, HelpMessage = "The Azure DevOps team project.")]
    [string]$adoTeamProject,
    [Parameter(Mandatory = $true, HelpMessage = "The Azure DevOps repository.")]
    [string]$adoRepo,
    [Parameter(HelpMessage = "The GitHub organization.")]
    [string]$githubOrg = "Adatum",
    [Parameter(HelpMessage = "Lock the Azure DevOps repository after migration.")]
    [bool]$lockAdoRepo = $false,
    [Parameter(Mandatory = $true, HelpMessage = "Repository owner.")]
    [string]$repoOwner
)

# Use the Azure DevOps repository name as the GitHub repository name
[string]$githubRepo = $adoRepo

# gh auth login --with-token reads the token from stdin;
# the ado2gh extension reads GH_PAT and ADO_PAT from the environment
$env:GH_PAT | gh auth login --with-token

$repoExists = $null
$repoExists = gh repo view $githubOrg/$githubRepo

if ($null -eq $repoExists) {
    # Use the custom extension to migrate the repository
    gh ado2gh migrate-repo --ado-org $adoOrg --ado-team-project $adoTeamProject --ado-repo $adoRepo --github-org $githubOrg --github-repo $githubRepo --target-repo-visibility 'internal'
    if ($LASTEXITCODE -ne 0) {
        Write-Output "Migration failed. Aborting..."
        gh ado2gh abort-migration --ado-org $adoOrg --ado-team-project $adoTeamProject --ado-repo $adoRepo --github-org $githubOrg --github-repo $githubRepo
        exit 1
    }

    # Get the default branch and set branch protection
    Write-Output "Setting branch protection..."
    $defaultBranch = gh repo view "${githubOrg}/${githubRepo}" --json defaultBranchRef --jq '.defaultBranchRef.name'
    gh api repos/$githubOrg/$githubRepo/branches/$defaultBranch/protection --method PUT `
        -H "Accept: application/vnd.github+json" `
        -F "required_pull_request_reviews[required_approving_review_count]=1" `
        -F "required_status_checks=null" `
        -F "restrictions=null" `
        -F "enforce_admins=true"

    # Set the repository admin
    gh api repos/$githubOrg/$githubRepo/collaborators/$repoOwner --method PUT -F permission=admin

    Write-Output "Creating environments..."
    gh api repos/$githubOrg/$githubRepo/environments/production --method PUT `
        -H "Accept: application/vnd.github+json" `
        -F "deployment_branch_policy[protected_branches]=true" `
        -F "deployment_branch_policy[custom_branch_policies]=false"
    gh api repos/$githubOrg/$githubRepo/environments/dev --method PUT `
        -H "Accept: application/vnd.github+json"
    gh api repos/$githubOrg/$githubRepo/environments/qa --method PUT `
        -H "Accept: application/vnd.github+json"

    if ($lockAdoRepo) {
        Write-Output "Disabling Azure DevOps repository..."
        gh ado2gh disable-ado-repo --ado-org $adoOrg --ado-team-project $adoTeamProject --ado-repo $adoRepo
    }
}
else {
    Write-Output "Repository already exists. Migration skipped."
}
# run-repo-migration.yaml
name: Migrate DevOps Repo
on:
  workflow_dispatch:
    inputs:
      adoTeamProject:
        description: 'Azure DevOps team project'
        required: true
      adoRepo:
        description: 'Azure DevOps repository'
        required: true
      lockAdoRepo:
        description: 'Lock Azure DevOps repository'
        required: false
        type: boolean
        default: false
jobs:
  migrate:
    name: Migrate repo, ${{ github.event.inputs.adoRepo }}
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          persist-credentials: false
          ref: ${{ github.head_ref }}
      - name: Install GH CLI Extension
        run: |
          gh extension install github/gh-ado2gh
        env:
          GH_TOKEN: ${{ secrets.GH_PAT }} # gh requires authentication on the runner
      - name: Run PowerShell script
        shell: pwsh
        run: |
          ./scripts/migrate-devops-repo.ps1 -adoTeamProject "${{ github.event.inputs.adoTeamProject }}" -adoRepo "${{ github.event.inputs.adoRepo }}" -repoOwner "${{ github.triggering_actor }}"
        env:
          GH_PAT: ${{ secrets.GH_PAT }}
          ADO_PAT: ${{ secrets.ADO_PAT }}
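Developers can start a migration from the Actions tab, or from a terminal with the gh CLI. A hypothetical invocation (the repository hosting the workflow and the input values are examples):

# Trigger the migration workflow from the command line (hypothetical names)
gh workflow run "Migrate DevOps Repo" --repo Adatum/platform-tools `
    -f adoTeamProject="MyProject" -f adoRepo="my-legacy-repo"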

From Azure Pipelines to GitHub workflows

Next up in your migration is CI/CD. What do you do? In our case, we have also discussed whether it’s time to ditch our deployment pipelines in favor of GitOps, using Flux or ArgoCD. All our applications run on Kubernetes (AKS), which makes this a viable option. However, it is a broader discussion; most likely some developers will want to move and others will not. It’s reasonable to think deployment pipelines will be part of our setup for a long time.

The question is, should you try the GitHub Actions Importer, or is a refactor about due anyway? Given that the importer tool has some limitations, and you probably have wanted to make some adjustments to your existing pipelines already, I believe this project will force some refactoring anyway.
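If you want to evaluate the importer first, it also ships as a gh CLI extension. A sketch of the audit step, with commands worth double-checking against the current documentation:

# Install the importer extension and audit the Azure DevOps organization
# (run gh actions-importer configure first to supply PATs)
gh extension install github/gh-actions-importer
gh actions-importer audit azure-devops --output-dir tmp/audit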

As a platform engineer, I always strive to create self-service options. For pipelines, I can create custom starter workflows, as sketched below. I really like this approach, as it gives the DevOps/platform team a way to streamline and scaffold the bare minimum of what’s required, and developers can adjust to their application-specific needs. A starter workflow is often nothing but a slightly modified standard workflow. However, with it we can add references to our container registries, use organization secrets, and pre-populate the connection to our Azure environment. As I mentioned above with the user accounts: we want both freedom and control!
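To make this concrete, here is a minimal sketch of such a starter workflow. Starter workflows live in a workflow-templates folder in the organization’s .github repository, alongside a small .properties.json metadata file; the registry and workflow content here are hypothetical examples:

# workflow-templates/container-ci.yml in the organization's .github repository (hypothetical)
name: Container CI
on:
  push:
    branches: [$default-branch] # GitHub replaces this placeholder with the repo's default branch
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Hypothetical baseline step: build an image tagged for the organization's registry
      - name: Build container image
        run: docker build -t ghcr.io/${{ github.repository }}:${{ github.sha }} .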

Azure Work Items and GitHub issues

Azure work items translate (almost) to GitHub issues. Work items are part of Azure Boards, and boards are roughly similar to GitHub projects. With some careful consideration and new thinking, I believe it is possible to ditch Azure Boards and work items in favor of issues with projects. If not, you can connect Azure Boards to GitHub. As you probably have noticed, I haven’t solved this yet, but I will do my best to make it happen.

The biggest difference between work items and issues is that work items are linked to the board, whereas issues are tied to one repository. After the repo migration, you will have to create a script to pull work items from Azure DevOps and create them as issues in the correct repository on GitHub. After that, we can re-create the boards in GitHub projects. There are a few options/scripts for the first task, but I believe every organization uses these features differently, so customization is needed. This solution by Josh Johanning is where I will start.
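The shape of such a script is roughly this: query work item IDs with WIQL, fetch each item, and create an issue. A minimal sketch, assuming gh is authenticated, ADO_PAT is set, and the organization, project, and target repo names are hypothetical:

# Sketch: pull Azure DevOps work items via WIQL and create GitHub issues (hypothetical names)
$adoOrg = "Adatum"; $adoProject = "MyProject"
$adoHeaders = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$env:ADO_PAT")) }
$wiql = @{ query = "SELECT [System.Id] FROM WorkItems WHERE [System.TeamProject] = '$adoProject' AND [System.State] <> 'Closed'" } | ConvertTo-Json
$ids = (Invoke-RestMethod -Method Post -Uri "https://dev.azure.com/$adoOrg/$adoProject/_apis/wit/wiql?api-version=7.0" -Headers $adoHeaders -ContentType "application/json" -Body $wiql).workItems.id
foreach ($id in $ids) {
    # Fetch the full work item, then create an issue in the migrated repository
    $wi = Invoke-RestMethod -Uri "https://dev.azure.com/$adoOrg/$adoProject/_apis/wit/workitems/$($id)?api-version=7.0" -Headers $adoHeaders
    gh issue create --repo "Adatum/my-repo" --title $wi.fields.'System.Title' --body "Migrated from Azure DevOps work item $id"
}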

I will update this post when I hit a wall or find different solutions. Until then, happy migrating!


GitHub actions federated identity with Azure AD

  • 24/08/2022 (updated 07/01/2025)
  • by Martin Ehrnst

A long-awaited feature for many is the ability to authenticate with Azure in a GitHub workflow. For a long time, this has been an integrated feature in Azure DevOps, and for some time GitHub Actions have had the ability to authenticate using Azure application registrations and secrets. But now, we can use federated credentials, which utilize OIDC. This completely removes the cumbersome task of rotating app secrets.

Configure an Azure Application with federated identity

At my company, I have decided we should look into moving pipelines from Azure DevOps to GitHub. Federated identity is now generally available (GA), which means we can work with a production solution in mind. So how do you set it up? My suggestion is to follow the official guide. Below is a script that will configure an Azure AD application with GitHub authentication enabled. The script assumes you use a branching strategy; however, you can change this to any of the other entity types by altering the $federationSubject variable.

# create-adappgithub.ps1
# Creates an app registration in Azure AD and connects it with a GitHub repo.
# Use as an example only.
[CmdletBinding()]
param (
    [Parameter(Mandatory)]
    [string]
    $gitHubRepoName,
    [Parameter(Mandatory)]
    [string]
    $teamName,
    [Parameter()]
    [string]
    $branchName = "main" # not Mandatory, otherwise the default value would be ignored
)

$organization = "myorgltd"
$appRegistrationName = "gh-$teamName-action-sp"
$appRegistrationDescription = "Used by $teamName for enabling GitHub actions with Azure AD OIDC authentication"
$audience = "api://AzureADTokenExchange"
$ghIssuer = "https://token.actions.githubusercontent.com" # must match the token issuer exactly, no trailing slash
$federationName = "action-$gitHubRepoName"
$federationSubject = "repo:$($organization)/$($gitHubRepoName):ref:refs/heads/$($branchName)"

# Check if the app already exists
$GhAdApp = $null
$GhAdApp = Get-AzADApplication -DisplayName $appRegistrationName
if ($GhAdApp) {
    Write-Output "App registration already exists. Adding new credential for specified repo $gitHubRepoName"
    $federatedCred = New-AzADAppFederatedCredential -ApplicationObjectId $GhAdApp.Id `
        -Audience $audience -Issuer $ghIssuer -Subject $federationSubject `
        -Name $federationName -Description "Used to authenticate pipeline in $gitHubRepoName"
}
# If the app does not exist
else {
    Write-Output "Adding new app registration for $teamName with credentials for $gitHubRepoName"
    $GhAdApp = New-AzADApplication -DisplayName $appRegistrationName -Description $appRegistrationDescription
    $federatedCred = New-AzADAppFederatedCredential -ApplicationObjectId $GhAdApp.Id `
        -Audience $audience -Issuer $ghIssuer -Subject $federationSubject `
        -Name $federationName -Description "Used to authenticate pipeline in $gitHubRepoName"
}

$credentialObject = [PSCustomObject]@{
    ApplicationName     = "$($GhAdApp.DisplayName)"
    ApplicationObjectId = "$($GhAdApp.Id)"
}
Write-Output $credentialObject | ConvertTo-Json
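For reference, the subject strings for some of the other entity types follow the same pattern (repository, environment, and tag names here are hypothetical):

# Other federated credential subject formats
# Environment:  repo:myorgltd/my-repo:environment:production
# Pull request: repo:myorgltd/my-repo:pull_request
# Tag:          repo:myorgltd/my-repo:ref:refs/tags/v1.0.0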

Configure repository secrets in GitHub using PowerShell

As you probably understand, I need to automate this entire process. If our developers are moving from DevOps to GitHub workflows, it has to be as easy as it can be. No one will move unless they see some benefits and onboarding is at least as easy as the current DevOps setup. I am pretty familiar with Azure as an identity platform, especially with app registrations and enterprise apps, so the above script was done quickly. On the other hand, I am not that familiar with the GitHub APIs. I know there’s a CLI, but for various reasons I need to look into the API, more specifically the repository secrets API. What you can see from the first part of the documentation is that you need to encrypt the secret before you make the PUT request.

Creates or updates an organization secret with an encrypted value. Encrypt your secret using LibSodium. 

GitHub API docs

Now I was in trouble. Should I write everything in .NET, or is it possible to solve this using PowerShell? For sure I would be better off using PowerShell, at least in the PoC phase. First I tried to load the .NET Sodium DLLs in my script, but it did not work, and the approach is not that delightful. Luckily, with some search engine work I found a module that wraps Sodium, created specifically for creating secrets in GitHub. Thanks, Tyler! See my example script below.

# create-ghsecret.ps1
# Add secrets to a repo, using PSSodium to encrypt the value
Import-Module PSSodium

$ghToken = "ghp_"
$headers = @{ Authorization = "token " + $ghToken }
$secret = "fb4ca569-c690-4391-96d9-928e7a8fd7ff"
$repoName = "myrepo"
$organization = "myorgltd"

# List existing secrets (sanity check)
Invoke-RestMethod -Method Get -Uri "https://api.github.com/repos/$($organization)/$($repoName)/actions/secrets" -Headers $headers

# Get the repository public key and encrypt the secret with it
$publicKey = (Invoke-RestMethod -Method Get -Uri "https://api.github.com/repos/$($organization)/$($repoName)/actions/secrets/public-key" -Headers $headers)
$encryptedSecret = ConvertTo-SodiumEncryptedString -Text $secret -PublicKey $($publicKey.key)

$secretBody = @"
{
  "encrypted_value": "$encryptedSecret",
  "key_id": "$($publicKey.key_id)"
}
"@
Invoke-RestMethod -Method Put -Uri "https://api.github.com/repos/$($organization)/$($repoName)/actions/secrets/AZURE_TENANT_ID" -Headers $headers -Body $secretBody
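In practice the login step in the next section needs more than one secret, so the last call can be wrapped in a loop. A sketch reusing the variables from the script above; the placeholder values are assumptions:

# Set every secret the azure/login action expects (placeholder values)
$secretsToSet = @{
    AZURE_CLIENT_ID = "<client id of the app registration>"
    AZURE_TENANT_ID = "<your tenant id>"
}
foreach ($name in $secretsToSet.Keys) {
    $encrypted = ConvertTo-SodiumEncryptedString -Text $secretsToSet[$name] -PublicKey $publicKey.key
    $body = @{ encrypted_value = $encrypted; key_id = $publicKey.key_id } | ConvertTo-Json
    Invoke-RestMethod -Method Put -Uri "https://api.github.com/repos/$($organization)/$($repoName)/actions/secrets/$name" -Headers $headers -Body $body
}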

What about the pipeline?

Next, you need a working pipeline. Lucky for you, the documentation is way better than when I first looked at this (before GA), so the example works just fine as it is. Anyway, there are a couple of things to point out: the permissions block and the setting that allows login without a subscription ID.

# workflow.yaml
name: 'Federated identity test'
on:
  push:
    branches:
      - main
# id-token: write is required to request the OIDC token
permissions:
  id-token: write
  contents: read
jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    steps:
      - name: checkout
        uses: actions/checkout@main
      - name: azure login
        uses: azure/login@v1
        with:
          client-id: ${{ secrets.AZURE_CLIENT_ID }}
          tenant-id: ${{ secrets.AZURE_TENANT_ID }}
          allow-no-subscriptions: true

Summary

I hope this short post on how to configure Azure and GitHub with federated identity helps you, and that it provides some more information than the official documentation does. If anything is unclear, please reach out. I now have my scripts in place and can start the full automation and migration from Azure DevOps pipelines.


How to move Azure blobs up the path

  • 25/01/2022 (updated 07/01/2025)
  • by Martin Ehrnst

This is the short, but for me pretty intense, story from when I uploaded 900 blobs of one GB each to the wrong path in a storage container. Eventually I was able to move these files using AzCopy and PowerShell.

Thanos persistent Prometheus metrics

In our Azure Kubernetes Service (AKS) environment we use Prometheus and Thanos for application metrics. Thanos allows us to use Azure Storage for long-term retention and a highly available Prometheus setup. The other day I was challenged with deleting a series of metrics causing high cardinality, meaning that a lot of new series were written due to a parameter being inserted during scraping.

The way Thanos works is that it takes raw Prometheus data, downsamples it, and uploads it to Azure Storage for long-term retention. Each time this process runs, it creates a new blob. In our production environment we had around 900 blobs and 900 GB of data.

Thanos has a built-in tool to rewrite these blobs and remove the metric we wanted gone, which seemed easy enough, but we had no idea when the problem first started, so I had to analyze, rewrite and upload all the data. It all seemed to work fine, until I discovered no metrics were available. It turned out that the tool I used inherited my local path and uploaded all the modified data to <guid>/chunks/c:/users/appdata/local/[...]/00001.blob

So no matter how satisfied I was, all the data was useless, as Thanos expected the files to be under <guid>/chunks/00001. On the bright side, all data was there, so the challenge was to move the files from <guid>/chunks/c:/users/appdata/local/[...] to <guid>/chunks/. The screenshot below showed the folder structure. Going through a download-and-upload approach was the last thing I wanted to do.

(Screenshot: the misplaced folder structure in Azure Storage Explorer)

AzCopy and PowerShell to the rescue

I already knew my way around AzCopy. But I did not know the copy process actually runs on the Azure backbone if you copy within or between storage accounts. Luckily, my dear Twitter friends were there to help where I failed to read the documentation.

To perform the copy operation I used a combination of Azure PowerShell and AzCopy.

  • Connect
  • Get all current blobs
  • Filter them
  • Actually copy
  • Second loop to delete

Below is my complete script. It could be way smarter, but I quickly put it together to get the job done.

# azcopy-move.ps1
## Connect to storage using SAS (the SAS token should include the leading '?')
$storageName = ""
$sasToken = ""
$container = "thanos"
$ctx = New-AzStorageContext -StorageAccountName $storageName -SasToken $sasToken

# Get all the blobs
$blobs = Get-AzStorageBlob -Container $container -Context $ctx

# A date to filter on (ISO format to avoid culture-dependent parsing)
$date = (Get-Date -Date "2022-01-20" -AsUTC)

# Filter the blobs on date and on names containing /chunks/C:/Users/
$blobsToModify = $blobs | Where-Object { ($_.LastModified.DateTime -ge $date) -and ($_.LastModified.DateTime -le $date.AddHours(24)) -and ($_.Name -like "*/chunks/C:/Users/*") }

# Loop through the blobs.
# Get the original folder name and the blob name with some splitting.
foreach ($blob in $blobsToModify) {
    $blobToMove = $blob.Name
    $original = $blob.Name.Split("/", 2)[0] # trim to original block name
    $newBlob = $blob.Name.Split("/")[-1]    # trim to original chunk name
    # The actual copy runs server side between the two blob URLs
    ./azcopy.exe copy "https://$storageName.blob.core.windows.net/$container/$blobToMove$sasToken" "https://$storageName.blob.core.windows.net/$container/$original/chunks/$newBlob$sasToken" --overwrite=prompt --s2s-preserve-access-tier=false --include-directory-stub=false --recursive --log-level=INFO
}

# Another loop to delete the whole C:/ folder after the chunks of data are moved.
# A separate loop, as there might be multiple chunks in the folder.
foreach ($blob in $blobsToModify) {
    $original = $blob.Name.Split("/", 2)[0] # trim to original block name
    ./azcopy.exe remove "https://$storageName.blob.core.windows.net/$container/$original/chunks/C%3A/$sasToken" --from-to=BlobTrash --recursive --log-level=INFO
}

Summary

I hope this helps someone else who accidentally uploads a lot of data to the wrong place. If you by any chance are using Thanos, I filed this as a bug.
