
As you take software development more seriously, Application Lifecycle Management (ALM) becomes more important.

Application Lifecycle Management

Application Lifecycle Management isn’t something that stands on its own. After years of developing apps, you can’t implement ALM overnight; you first need to put some processes in place.

In this post I’m looking at a number of things that you should consider.

  • Get started with environments and solutions
  • Deployment automation
  • Look after your low code in Azure DevOps

Deployment Automation

The Power Platform has come a long way. Starting out as a pure citizen development tool, it has increasingly been shaped by pro development practices. A while back, using multiple environments to separate development from production became standard.

As using multiple environments becomes more common, application lifecycle management becomes more important too.

In short: how do we get our development efforts into production without breaking something, or at least with a much reduced risk of doing so?

Get Started with solutions

I’m going to assume for now that you have already created a development, test and production environment. Within the development environment we will need to create a solution within Power Apps or Power Automate and then add a couple of things to it. In my case I added a flow and a canvas app to my solution.

A solution with a flow and an app

We can now save and publish these changes in development and do some basic testing to ensure that everything works. But then comes the big test: will this work in the test environment, and even more importantly, will it work in production without potentially breaking previous versions of the solution?

And what if all your apps and flows were created by many people within your organisation in the default environment? Even worse, you might not even know what you have, or who is using it all. Wouldn’t it be helpful if all development was done within solutions? Wouldn’t it be good if all your citizen developers behaved like pro developers?

There is a fine balance between letting citizen developers create exactly the solutions they need and managing those solutions properly. A lot of citizen developers will complain that ALM is over-engineering things.

Quite often I leave apps that are used by small groups of people where they are. Apps that have a greater impact on the business, however, often need to be managed more carefully.

Get Started with DevOps

For all of my projects I create a new project within DevOps. Projects within DevOps let you manage tasks, code and releases of your products.

Create a new project in Azure DevOps

Within this post I will focus on the code management and the release management. Specifically, I’m going to have a look at how this all works with the Power Platform’s solutions.

Manage Repos in Azure DevOps

Once we have created a project we will need to initialize the repository that will hold the code. We can click the Initialize button near the bottom of the screenshot below, which will add a readme file to the repository.

Initialize repository

So what is this code going to be? And why do we want to put it in a repository?

Code in the Power Platform

The solutions mentioned earlier in this post bundle all elements of your project. This can be a canvas app, a flow or the definition of a Dataverse table, but there is a lot more. All of these elements are described in small text files. These text files are bundled together in a single zip file, also called a solution file. When you unzip the solution file you will find a large collection of files in various formats.

These files are what we want to store in our Azure DevOps repository. One of the major benefits of doing this is that we can see what has changed during any given period of time.

Taking this one step further: how about exporting the solutions, apps and flows you develop on a regular basis, so that any changes made are tracked and the exports can even become your backups?
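Once the unpacked files live in a repository, ordinary git commands show exactly what changed between two exports. Here is a minimal sketch using a throwaway local repository and a made-up file name (real unpacked solutions contain many more files):

```shell
# throwaway repo to illustrate how unpacked solution files become comparable
demo=$(mktemp -d)
cd "$demo"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

# simulate two consecutive exports of the same flow definition
mkdir -p solutions/Workflows
echo '{"trigger": "manual"}' > solutions/Workflows/MyFlow.json
git add --all && git commit -qm "export 1"
echo '{"trigger": "scheduled"}' > solutions/Workflows/MyFlow.json
git add --all && git commit -qm "export 2"

# which files changed between the exports, and the history of one file
git diff HEAD~1 HEAD --stat
git log --oneline -- solutions/Workflows/MyFlow.json
```

The same commands work against the real repository once a pipeline has pushed a couple of exports.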

Exporting Development

Within the Azure DevOps project there is an option to create Pipelines. Pipelines are a set of steps that are executed in order. (Yes, a pipeline is a bit like a flow in Power Automate, but not quite the same.)

To create a pipeline, select Pipelines in the left-hand navigation, then click the Create Pipeline button.

Create a pipeline

We can now select that we want to store our code in an Azure Repos repository, although you could of course use GitHub, Bitbucket or Subversion instead. I’m going to use the classic editor via the link at the bottom.

Where is your code

Now the following dialog will appear, where the repository related to the pipeline can be selected. In my project I only have one repository but you could potentially have more than one.

Use a classic editor instead

Using all the default options, I just press the Continue button and my pipeline will be created:

New Empty job has been created

Then click on Agent job 1 and add a few steps. I will need 4 steps in total; the first 3 are all related to the Power Platform. Typing Power in the search box helps a bit:

Adding steps for Power Platform to pipeline job

Within no time I’m adding the following 4 steps:

  • Power Platform Tool Installer
  • Power Platform Export Solution
  • Power Platform Unpack Solution
  • Command Line Script
Basic export pipeline

It’s probably worth explaining a bit about what is going on here.

Our pipeline will first install the tools that handle the export and unpacking of solutions. Then it will export the solution from our development environment. Next, the exported solution will be unpacked (unzipping the solution file so that we end up with mostly text files). The last step will then push the files to our repository.

Is it really that simple? No!

Now, we will need to configure the 4 steps to get the first part of our Application Lifecycle Management solution in place.
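For reference, the same four steps can also be written as a YAML pipeline instead of the classic editor. Treat this purely as a sketch: the task names come from the Power Platform Build Tools extension, but the task versions, the service connection name and the solution name used here are assumptions you will need to replace with your own values:

```yaml
steps:
  # step 1: install the tooling used by the export/unpack tasks below
  - task: PowerPlatformToolInstaller@2

  # step 2: export the unmanaged solution from the development environment
  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: 'MyDevServiceConnection'   # assumed service connection name
      SolutionName: 'PVTestSolution'               # assumed solution name
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\PV Test solution.zip'

  # step 3: unzip the solution into mostly text files
  - task: PowerPlatformUnpackSolution@2
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\PV Test solution.zip'
      SolutionTargetFolder: '$(Build.SourcesDirectory)\solutions'

  # step 4: the command line script that commits the files
  - script: echo commit all changes
```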

Power Platform Tool Installer

This is the easy step. There is nothing to configure here.

Power Platform Export Solution

The Export Solution step is slightly more complicated, especially the authentication part. You could supply a username and password here, but the Service Principal/client secret option is really the better choice.

Configure export solution step

In the service connection you supply the Tenant ID, Application ID and a client secret, as configured in an app registration within Azure.

Configure the service principal

Within your app registration in Azure make sure that you have the right API permissions configured. The Permissions should look like this:

Power Apps API permissions granted within the app registration

Once all of the above is done, the pipeline can create the zip file for our managed or unmanaged solution. In our case we want it to be unmanaged. I’ve given the solution file the name PV Test solution.zip, but you can call it whatever suits your project.

Power Platform Export Solution

Power Platform Unpack Solution

Now that we have a zip file, we want to unpack it. Having zip files in a repository might give us a backup, but it doesn’t give us the option to compare code.

Unpack solution configured

The Unpack Solution step is easier to configure. The only thing to look out for is that you have to type the solution input file name yourself: while we are developing the pipeline the solution file doesn’t exist yet, so it can’t be selected.

Command Line Script

Now for the final step: uploading all our code to our repository.

So far, the exporting and unpacking have created a set of files within the pipeline’s working storage. This storage is cleaned at the start of each pipeline run, so creating these files does nothing to the repository by itself.

To make the last step work, we will need to run a script.

Command line check in project

Using the following line of code, the command line will just print “commit all changes”. This isn’t too exciting, but it can be very helpful to know where in your script things are going wrong.

echo commit all changes

Then the next two lines tell the script which account to use to commit the changes:


git config user.email "pieter.veenstra@myorg.com"
git config user.name "Pieter Veenstra"

Then I’m adding a few commands that will show me the files that I have. These aren’t strictly needed, but when changes don’t come through in my repository it is useful to know whether the unpack step actually created my files. (Note that dir is a Windows command; on a Linux agent you would use ls instead.)


dir /w

dir .\solutions\*

Now the real part of the code update. First the code from the repository is checked out, then all our changes are added, then the changes are committed, and finally there is a complicated-looking push command.

git checkout main
git add --all
git commit -m "solution"
echo push code to repo
git push --all https://iyzyivaf4fr5trme7zarxh37cvfbsbkdieg3theg62qszsnugt3q@dev.azure.com/SharePains/PV%%20Test%%20Solution%%20Project/_git/PV%%20Test%%20Solution%%20Project 

That push command needs a bit more explanation. First of all, there are the %%20 parts in the path. Because I have spaces in my project name, you would expect %20 to encode the spaces, but that doesn’t work here: in a Command Line Script task the % sign needs to be escaped, and this is done by doubling it to %%.

Then there is that long string of random characters at the front of the URL. This is a PAT, or personal access token.

Get a personal access token

A PAT can be generated from your user settings in Azure DevOps and then used in the script.
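As a side note, the doubled %% is specific to the cmd-based Command Line task; in a bash step a single %20 per space would do, because bash does not treat % specially. A small sketch with a made-up PAT, building the same style of URL:

```shell
# made-up PAT, organisation and project name, for illustration only
PAT="abc123exampletoken"
ORG="SharePains"
PROJECT="PV%20Test%20Solution%20Project"   # %20 is the URL-encoded space
PUSH_URL="https://${PAT}@dev.azure.com/${ORG}/${PROJECT}/_git/${PROJECT}"
echo "$PUSH_URL"
```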

The total result is the following script (make sure that you update the email, name and PAT!):

echo commit all changes
git config user.email "pieter.veenstra@myorg.com"
git config user.name "Pieter Veenstra"

dir /w

dir .\solutions\*

git checkout main
git add --all
git commit -m "solution"
echo push code to repo
git push --all https://iyzyivaf4fr5trme7zarxh37cvfbsbkdieg3theg62qszsnugt3q@dev.azure.com/SharePains/PV%%20Test%%20Solution%%20Project/_git/PV%%20Test%%20Solution%%20Project 
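An alternative worth knowing about: instead of embedding a PAT in the script, a pipeline can authenticate with its own job access token, provided the project’s build service account has Contribute permission on the repository. A sketch of that variant of the push, shown here as a YAML script step (the rest of the script stays the same):

```yaml
# assumes the build service account has Contribute rights on the repo
- script: |
    git checkout main
    git add --all
    git commit -m "solution"
    git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin main
  displayName: 'Push using the job access token instead of a PAT'
```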

Running the Pipeline

Now we can finally run the pipeline by selecting the Save & queue option.

Save & queue the pipeline

If all is done correctly, this will result in a successful run:

Successful pipeline

Now all we have to do is check the repository, and indeed we find our files there.

Files have arrived in my repository

The next steps

The following post about Application Lifecycle Management with the Power Platform will follow soon:

Importing the solution into test/production


By Pieter Veenstra

Business Applications Microsoft MVP working as the Head of Power Platform at Vantage 365. You can contact me using contact@sharepains.com

16 thoughts on “Step 1 in Application Lifecycle Management and the Power Platform”
  1. “I’m going to assume for now that you already have created a development, test and production environment.”

    That is a big assumption. What if we have not yet created these environments? What are the implications if we do create them? Specifically, what happens to my existing apps and flows when I create a new environment? Do they continue to run as normal? How do they know which environment to run in? Do they all have to be updated?

  2. this is fantastic and def an area i have been watching closely … the biggest issue is with connection references when the pipe runs from dev to test and prod. Since the solution is managed those connections dont work and rendering the whole ALM pointless. I have noticed they made a recent release 5 days ago regrading this but any help there is appreciated.

    1. I will include the solution for that in my next post. In short I add a second solution that is deployed as unmanaged. this second solution will only include variables. As this second solution is unmanaged I can now edit the variables for each environment.

  3. Hi there, would you please provide link for next step. Importing the solution into test/production

  4. Hi There

    I’ve followed this guide and the first three tasks run successfully but when I run the command line script I get the below:

    error: pathspec ‘main’ did not match any file(s) known to git

    I can see the main under branches.

    Does anyone have any ideas?

      1. Hi There.

        Please see below.

        echo commit all changes
        git config user.email "alias@domain.com"
        git config user.name "Automatic Build"
        git checkout main
        git add --all
        git commit -m "solution"
        echo push code to repo
        git push --all https://

        I have left out the email and PAT with URL to the repo

        I have also tried the below and the error is the same as above. All previous tasks are successful but cmd gives an error “error: pathspec ‘main’ did not match any file(s) known to git”

        echo commit all changes
        git config user.email "alias@domain.com"
        git config user.name "Automatic Build"
        git checkout main
        git add --all
        git commit -m "solution"
        echo push code to new repo
        git -c http.extraheader="AUTHORIZATION: Bearer $(System.AccessToken)" push origin main
