Extending the Power Platform Pipeline
Power Platform Pipelines has officially graduated from preview. However, Microsoft is still ironing out a few rough edges. In the meantime, I've found a handy workaround that updates site settings during deployment so that authentication credentials are configured correctly in each environment.
May 3, 2024
So the problem is this
I'm currently working on a project where we're utilizing Power Platform Pipelines for our portal (Power Pages) ALM. The setup for the pipeline is robust and it's exciting to note that it's no longer in preview — meaning it's fully supported by Microsoft.
If you're not familiar with setting up a Pipeline, I’ll be covering that in a future post. For now, you can check out the official documentation on how to set up pipelines in Power Platform.
Set up pipelines in Power Platform
The project uses an authentication system provided through OpenID, which requires a unique redirect URL for each environment. Initially, I thought of using environment variables to manage this, but soon realized that wouldn’t work well with the site settings.
Setup
Power Platform Pipelines offers some useful guidance on how to extend your pipelines:
Extend pipelines in Power Platform
After some research, and with help from my colleague Ulrikke Akerbæk, I discovered that the best approach is to update the site settings post-deployment with Power Automate, triggered at the end of the deployment process.
OnDeploymentCompleted
I started by identifying the specific site setting that needed updating, which I located through the Power Pages management interface.
To capture the ID of this setting for later use, I simply navigated to the site setting in the Power Pages management and copied the GUID from the URL bar:
I also need to manage the RedirectUri for different environments, specifically for testing and production. This is crucial because the initial site and settings are configured in the development environment, and when the pipelines run, they replicate these settings from dev.
Another important detail is knowing which table to target for updates. Initially, I tried writing directly to the virtual table for Site Settings, which didn’t work. I got this error:
I have not figured out what it means. If you have, please send me a message on LinkedIn.
After talking to Ulrikke again, I eventually found that the appropriate table to use is the Site Components table. This table aggregates all the virtual tables and is the one that gets updated during pipeline execution.
The redirect URLs for the two environments:
- https://your-portal-test.powerappsportals.com/signin-openid_1
- https://your-portal-prod.powerappsportals.com/signin-openid_1
Table (logical name): powerpagecomponent
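To keep these per-environment values in one place, the mapping from pipeline stage to redirect URL and the resulting Content payload can be sketched like this (the stage names and portal URLs are examples; adjust them to your own setup):

```python
import json

# Hypothetical mapping from pipeline stage name to that environment's
# redirect URL -- replace the stage names and portal URLs with your own.
REDIRECT_URLS = {
    "Dev To Test": "https://your-portal-test.powerappsportals.com/signin-openid_1",
    "Test To Prod": "https://your-portal-prod.powerappsportals.com/signin-openid_1",
}

def build_content(stage_name: str) -> str:
    """Build the JSON string that goes into the site setting's Content column."""
    return json.dumps({"value": REDIRECT_URLS[stage_name]})
```

One flow per stage can then use the value that matches its own stage name.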
I also tried fetching the virtual Site Settings table using XrmToolBox, but got nothing back. When I fetched from the Site Components table instead, filtering on site settings, I got the settings I needed.
Automated Cloud Flow
I've set up a new Automated Cloud Flow, and it's crucial to note that this must be done within the HOST environment where your pipeline is set up. This ensures access to the necessary pipeline triggers.
I also need the ID of the row we are going to update. This will be the same ID in all environments, because the configuration moves downstream with the pipeline.
In the Site Components table I first found the Row ID unique column, which I thought might be the right one, since the component's ID was unique for the site settings row.
That did not work; the correct one is the Power Pages component ID, in the column next to the Row ID (i.e. the first GUID I copied).
1. Step - Trigger
For the trigger, use the OnDeploymentCompleted action. In the parameters tab:
- Catalog: Microsoft Dataverse Common
- Category: Power Platform Pipelines
- Table Name: (none)
- Action Name: OnDeploymentCompleted
Add a trigger condition to confirm the deployment is for the Test environment. Ensure the last part matches your pipeline stage name:
@equals(triggerOutputs()?['body/OutputParameters/DeploymentStageName'], 'Dev To Test')
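For clarity, the trigger condition is equivalent to the following check on the trigger outputs (a sketch in Python, assuming the payload shape shown in the expression; the `?` operator's null-safety is emulated with `dict.get` defaults):

```python
def deployment_stage_matches(trigger_outputs: dict, expected_stage: str) -> bool:
    """Mirror of
    @equals(triggerOutputs()?['body/OutputParameters/DeploymentStageName'], 'Dev To Test')
    using null-safe dictionary lookups."""
    stage = (
        trigger_outputs.get("body", {})
        .get("OutputParameters", {})
        .get("DeploymentStageName")
    )
    return stage == expected_stage
```

If the condition evaluates to false, the flow simply does not run for that deployment.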
2. Step - Action - Get site setting
In this step it is important to choose the right environment: both the one you get the site setting from and, later, the one in which the site setting will be updated.
So we select the Dataverse action Get a row by ID from selected environment.
And because we know the GUID of the site setting, we use that.
This step is not strictly necessary; I do it so that I know what I need to update the row with, and to confirm it is the right row.
- Environment: Choose the right Environment
- Table Name: Site Components
- Row ID: Use the GUID we found before
- Select Columns: name, content
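Under the hood, this connector action issues a request against the Dataverse Web API. A rough sketch of the equivalent GET, assuming the entity set name `powerpagecomponents` and the `name`/`content` columns (the environment URL and GUID below are placeholders):

```python
def build_get_request(env_url: str, component_id: str) -> str:
    # Rough Web API equivalent of 'Get a row by ID from selected environment':
    # fetch the powerpagecomponent row, selecting only the name and content columns.
    return (
        f"{env_url}/api/data/v9.2/powerpagecomponents({component_id})"
        "?$select=name,content"
    )
```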
3. Step - Action - Update a row in selected environment
Update the site setting in the desired environment.
- Environment: (target environment)
- Table Name: Site Components
- Component Type: Site Setting
- Advanced parameters - Content:
{
"value": "https://your-portal-test.powerappsportals.com/signin-openid_1"
}
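The Dataverse connector handles the call for you, but conceptually the update is a PATCH of the component's Content column. A minimal sketch, again assuming the `powerpagecomponents` entity set and `content` column (environment URL and GUID are placeholders):

```python
import json

def build_update_request(env_url: str, component_id: str, redirect_url: str):
    """Build the URL and body for a PATCH that sets the site setting's value.
    env_url and component_id are placeholders for your target environment."""
    url = f"{env_url}/api/data/v9.2/powerpagecomponents({component_id})"
    # The Content column of a Site Setting component stores a JSON document
    # whose "value" key holds the setting's value.
    body = json.dumps({"content": json.dumps({"value": redirect_url})})
    return url, body
```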
Next step?
Repeat the same process for your Production Environment. This ensures consistency across all your deployment stages and aligns the Production settings with those tested in your Development and Test environments.
One more thing
I sent a message to Nick Doelman, told him what I had done, and asked if he knew a better way of doing this. He told me he had solved it differently:
"But here’s what I found… this is specific for authentication site settings (OKTA).
The site settings for authentication will “travel” from DEV to downstream environments.
However, if I configure authentication manually on the downstream environment, it doesn’t overwrite, but creates a new set of values.
If I make that particular authentication provider the “default”, it will ignore the one brought over from the DEV environment. So I don’t need to update"
I tried this myself, but it did not work for my customer, because the environments and settings were created long before managed environments and the enhanced data model were introduced.
So the steps are:
- Provision your new portal.
- Set up authentication.
- Run the pipeline to test and prod (or as many environments as you use).
- Set up authentication again in both environments, in an unmanaged solution, with different values for the site settings, and make them DEFAULT in the respective environments.
- They will then not be overwritten the next time you run your pipelines.
You should always use managed solutions in your test and production environments. But in a special case like this, and until Microsoft solves it, an unmanaged solution in those environments is necessary.
Thank you for reading, Stig.