Using Azure Key Vault with Power Automate Desktop
In this article, I am going to outline one approach for using Azure Key Vault (AKV) with Power Automate Desktop (PAD). The scenario I am trying to address is one where an organization would like to automate the processing of content in a system that is only accessible through a web browser from within their corporate network. The on-premises system, a web portal, requires authentication and does not support any type of single sign-on, so we have to provide credentials. For (hopefully obvious) security reasons, we don't want to store those credentials in the desktop flow itself.
To address this scenario, I created a Power Automate cloud flow that reads the user credentials from Azure Key Vault and sends those credentials to a PAD flow; the PAD flow logs into the portal and can perform any additional processing from there.
Use a Service Principal for Azure Key Vault access
The first consideration is that we do not want to make the connection to Azure Key Vault using a named user account. Rather, we want to use a service principal configured in Azure Active Directory (AAD). This is where I got tripped up first.
The next challenge I faced was that the out-of-the-box AKV connector in Power Automate would not allow me to supply credentials for a service principal; it kept authenticating me with my logged-in user account. As a solution design principle, I did not want this, and I also had a separate technical challenge with respect to my tenants/Azure subscriptions, though that is not relevant to this post. As a result, I decided to try using the HTTP connector and call the AKV REST API directly.
Not covered: Setting up your Azure Key Vault. Sorry — not my area of expertise and I would recommend checking out other articles on the topic.
Configure Service Principal
There is a lot of content about setting up service principals for Azure and I won't pretend to know all of it. However, when I configured my service principal, I assumed that simply assigning application permissions to the app registration would allow it to access my key vault. For an unattended account, that doesn't suffice: in order for the service principal to access the key vault through the API, you also need to add an Access Policy on the key vault itself.
To test this outside of the cloud flow and verify the steps, I used Postman for the two-step process:
First, we retrieve an access_token by calling the token endpoint for the Microsoft authentication system:
https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token
where {tenant-id} is the GUID for your tenant.
With that request, you have to provide a few key parameters in the request body (see the image below):
The client_id and client_secret will be specific to your service principal.
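If you prefer to test this step from a script instead of Postman, here is a minimal sketch of the same token request in Python. The tenant ID, client ID, and client secret placeholders are assumptions you would replace with your own values; this uses the standard client credentials grant against the v2.0 endpoint with the Key Vault scope.

```python
import requests

# Placeholders -- replace with your own tenant and service principal values
TENANT_ID = "<tenant-guid>"
CLIENT_ID = "<service-principal-client-id>"
CLIENT_SECRET = "<service-principal-client-secret>"

# Client credentials grant against the Microsoft identity platform v2.0 endpoint.
# The scope tells AAD we want a token that is valid for Azure Key Vault.
token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
payload = {
    "grant_type": "client_credentials",
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    "scope": "https://vault.azure.net/.default",
}

response = requests.post(token_url, data=payload)
response.raise_for_status()
access_token = response.json()["access_token"]
```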
If I have everything configured correctly and I get that access token back, I can execute step two and actually send a request to the key vault to get my secret:
The format of the request for the secret is:
https://{key-vault-name}.vault.azure.net/secrets/{secret-name}?api-version={api-version}
This piece is pretty well-documented in the official Azure Key Vault REST documentation.
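Again as a sketch only (the vault name, secret name, and API version below are placeholders, and the access token is the one returned by step one), the second request sends the token as a Bearer token:

```python
import requests

# Placeholders -- replace with your own values
VAULT_NAME = "<key-vault-name>"
SECRET_NAME = "<secret-name>"
API_VERSION = "7.4"  # assumption: use whichever api-version your vault supports
ACCESS_TOKEN = "<access_token returned by the token request in step one>"

secret_url = (
    f"https://{VAULT_NAME}.vault.azure.net/secrets/{SECRET_NAME}"
    f"?api-version={API_VERSION}"
)

# The access token goes in the Authorization header as a Bearer token
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

response = requests.get(secret_url, headers=headers)
response.raise_for_status()
secret_value = response.json()["value"]  # the secret itself is in the "value" property
```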
With these tests complete, I knew that my service principal was configured correctly, and I could move on to configuring my cloud flow.
Cloud Flow for accessing AKV and calling PAD
As stated before, I ran into some issues configuring the out-of-the-box AKV action in the Power Automate cloud flow, so I decided to use HTTP actions instead.
Secure inputs and outputs
Since this automation is dealing with sensitive data (passwords), we should configure each of our stages to use secure inputs and secure outputs. You can find more details on that here. As mentioned in the documentation, it is harder to debug flows with this enabled so it often makes sense to enable this once the rest of the flow is mostly complete.
Configure HTTP action to get access_token
Here is the configuration for the HTTP action that retrieves the access_token. Notice that I have abstracted the sensitive content for this request into separate variables. This is practical for many reasons, but it also makes it easier to share my screenshots :-)
Once this responds, we use a Parse JSON action to parse the response so that there is a variable representing the access token. The input to this action is the body of the HTTP action.
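For reference, the token response body looks roughly like the sample below (values truncated); in the Parse JSON action you can paste a sample payload like this and use "Generate from sample" to build the schema. The exact fields are illustrative of the v2.0 token endpoint response.

```json
{
  "token_type": "Bearer",
  "expires_in": 3599,
  "ext_expires_in": 3599,
  "access_token": "eyJ0eXAiOiJKV1QiLCJhbGciOi..."
}
```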
Configure HTTP action to get the actual AKV secret
Now that we have the access_token, we can retrieve the secret from AKV. We use the access_token variable that was parsed out of the first HTTP action's response as the Bearer token in our request to the AKV service:
Then parse that response to get the actual secret:
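The secret response is also simple JSON; the secret itself sits in the "value" property, which is the field to pull out of this second Parse JSON action. The sample below is illustrative of the shape returned by the get-secret call, with placeholder values.

```json
{
  "value": "<your-secret-value>",
  "id": "https://<key-vault-name>.vault.azure.net/secrets/<secret-name>/<version>",
  "attributes": {
    "enabled": true,
    "created": 1615913096,
    "updated": 1615913096,
    "recoveryLevel": "Recoverable+Purgeable"
  }
}
```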
Call Power Automate Desktop
Now that we have the credentials for our automated user account, we can send them down to our desktop flow:
Power Automate Desktop flow configuration
Because this article is pretty long already, I’ll quickly describe the components of the desktop flow:
- As a first step in the desktop flow, we define the input variables that the flow expects to receive from the cloud flow. In this case, these are the username and the password that we extracted from AKV.
- In step two, we launch a browser session that opens up the login page.
- In step three, we take the username variable and input it into the login page's username form field.
- Then, we take the password variable and input it into the login page’s password form field. Worth noting that this variable was designated to be of type “Sensitive text” in the desktop flow configuration.
- As a last step, the desktop flow simulates the click of the login button.
Once logged into the portal in the browser, the desktop flow could simulate any number of user actions — that is beyond the scope of this article.
What this article demonstrates is how to securely store user credentials in Azure Key Vault and pass those credentials into a Power Automate Desktop flow to log in to an on-premises web portal.