Power Platform Solution Components to Work Items with PowerShell

In previous posts, we explored how to link Power Platform Solution Components to Work Items in Azure DevOps.

The reason we would like to do this is to be able to see, for example, what work has been done on an existing Canvas App or a Power Automate flow, or which requirements originated the creation of a web resource.

We saw how creating a new Work Item type could help with this task.

But, if you have to create every Solution Item manually so you can link it to User Stories and Tasks, that would be a lot of work, right? Why not try to automate it in some way?

There are many ways to do it, but probably the simplest one is using PowerShell.

In a previous post, we saw how to connect PowerShell to the Azure DevOps API to interact with it. In another article, we learned how to navigate the contents of any given Power Platform solution using a script.

Now is the moment to put everything together.

Previous Steps

In order to fully understand what comes in the next sections, I recommend quickly checking the related previous posts listed below:

Code

The code can be broken down into 4 parts, as shown in the image below.

1. Getting the Solution Components

You can find a detailed explanation in the article PowerShell to Power Platform Solution Components, but to summarize: we use PowerShell to read the outputs of the Solution Packager tool, left in the Source Control repository.

The Solution Packager extracts all the metadata about the contents of any Power Platform solution into files. These files follow a well-defined structure that is easy to read.

Outputs of the Solution Packager stored in the Source Control repository in Azure DevOps.

Why do we take the Solution Components from Source Control instead of reading them from the development environment itself, you may ask?
It is because Source Control is always considered the source of truth. In the end, what you have in Source Control is what will later be used to deploy to production. Besides, some projects have multiple development environments, and their changes are committed to the repo once they are merged.

This is what the code for getting the solution components looks like:

https://gist.github.com/crisfervil/b5cb091d0eaf26008a5236f5f2b0795b
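
In case you want a quick reference before opening the gist, here is a minimal sketch of the idea. The ./src folder and the focus on root components are assumptions; adjust the path to your repository layout and extend it to whatever level of detail you need.

```powershell
# Minimal sketch: read the root components from the Solution Packager output.
# Assumes the unpacked solution lives under ./src, following the standard
# Solution Packager folder layout (Other/Solution.xml, Entities/, WebResources/, ...).
$solutionFile = Join-Path "./src" "Other/Solution.xml"
[xml]$solutionXml = Get-Content -Path $solutionFile -Raw

# Each RootComponent node carries a component type code and, usually, a schema name or id.
$components = $solutionXml.ImportExportXml.SolutionManifest.RootComponents.RootComponent |
    ForEach-Object {
        [pscustomobject]@{
            Type       = $_.type
            SchemaName = $_.schemaName
            Id         = $_.id
        }
    }

$components | Format-Table
```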

A. Check if it Exists in Azure DevOps

In order to do this, we first need to connect to Azure DevOps from PowerShell and run a query.

To query Azure DevOps, we use WIQL (Work Item Query Language).

https://gist.github.com/crisfervil/7fe983b437f579507947db82ea60f312
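
As a rough reference, this is what such a query can look like. The organization URL, project name, PAT environment variable and the 'Solution Item' Work Item type are all assumptions; replace them with your own values.

```powershell
# Minimal sketch: check whether a Work Item with a given title already exists, using WIQL.
$org     = "https://dev.azure.com/yourorg"
$project = "YourProject"
$pat     = $env:AZDO_PAT   # Personal Access Token with Work Items read/write scope
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

$componentName = "My Canvas App"
$wiql = @{
    query = "SELECT [System.Id] FROM WorkItems " +
            "WHERE [System.WorkItemType] = 'Solution Item' AND [System.Title] = '$componentName'"
} | ConvertTo-Json

$result = Invoke-RestMethod -Method Post -Uri "$org/$project/_apis/wit/wiql?api-version=7.0" `
                            -Headers $headers -ContentType "application/json" -Body $wiql

$exists = $result.workItems.Count -gt 0
```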

B. Creating a new Work Item

Similar to the previous point, this is just a matter of calling the API with the right parameters.

https://gist.github.com/crisfervil/0ddd4661dcd4cc67577a4931df1a61c9
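
Here is a hedged sketch of that call, reusing the $org, $project, $headers and $componentName variables from the previous snippet; the 'Solution Item' type name is again an assumption.

```powershell
# Minimal sketch: create a new Work Item through the REST API.
# Work Item creation expects a JSON Patch document and the application/json-patch+json content type.
$body = ConvertTo-Json -Depth 5 @(
    @{ op = "add"; path = "/fields/System.Title"; value = $componentName }
)

$type = [uri]::EscapeDataString("Solution Item")   # the custom Work Item type from the earlier post
$uri  = "$org/$project/_apis/wit/workitems/`$${type}?api-version=7.0"

$newWorkItem = Invoke-RestMethod -Method Post -Uri $uri -Headers $headers `
                                 -ContentType "application/json-patch+json" -Body $body
# $newWorkItem.id can then be used to link the item to User Stories and Tasks.
```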

C. Updating a Work Item

Instead of passing all the values to the endpoint, you only need to pass the fields that will be updated.

Why not delete everything and recreate it instead of doing an upsert?
Because you want to keep the history and not delete the existing Work Items and their relationships.

https://gist.github.com/crisfervil/3934e397a9f2d9324976889e57e9ab14
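
For illustration, here is a sketch of updating a single field on an existing Work Item, again reusing the connection variables from the snippets above; the Work Item id and the field being updated are placeholders.

```powershell
# Minimal sketch: patch only the fields that changed; the Work Item and its history are preserved.
$workItemId = $result.workItems[0].id   # taken from the WIQL query result above

$patch = ConvertTo-Json -Depth 5 @(
    @{ op = "add"; path = "/fields/System.Description"; value = "Updated by the solution sync script" }
)

Invoke-RestMethod -Method Patch -Uri "$org/$project/_apis/wit/workitems/$($workItemId)?api-version=7.0" `
                  -Headers $headers -ContentType "application/json-patch+json" -Body $patch | Out-Null
```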

All together now

About 150 lines of code are enough to perform this task. You can run this script in an Azure DevOps build so your Work Items are always kept up to date.

https://gist.github.com/crisfervil/25e2c17b6c0a8eb6cf5e23e49b72a9ed

Conclusion

This is the last post in the series explaining this idea I have been experimenting with. Although the setup process is somewhat cumbersome, I believe the benefits far outweigh the effort, especially in mid-sized projects.

I hope you find it useful. If you have any comments or queries, please leave them below.
