Github Actions: Using AKV To Get Secrets
Description:
A key goal I wish to accomplish with Github Actions is to set all the secrets I can in an Azure Key Vault and then only get them at run time to populate all the other secrets needed. I have been doing this for years with Function Apps and other Azure services and wish to continue this strategy. Basically, the strategy works like this: “Only store Azure Credentials as Env secrets and use those at runtime to populate more secrets.” This way you can create a Service Principal with limited rights like “Key Vault Secrets User” at the AKV level and then monitor it in Azure Active Directory to ensure it is only being used to access AKVs. Here is how I do just this:
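For reference, that grant is just a single role assignment scoped to the vault. This is only a sketch with placeholder names and IDs, assuming the vault uses Azure RBAC rather than access policies:

```bash
# Placeholders throughout: grant the workflow's Service Principal read-only
# secret access on a single vault (assumes the vault is RBAC-enabled).
az role assignment create \
  --assignee "<service-principal-app-id>" \
  --role "Key Vault Secrets User" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>"
```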
To Resolve:
- The only secrets needed for OIDC Auth are `${/{ secrets.CLIENT_ID }}`, `${/{ secrets.TENANT_ID }}`, and `${/{ secrets.SUB_ID }}`, so I put each of these in the Repo as secrets (see the login sketch after the note below).
- NOTE: Jekyll Liquid filters clash with Github variables, so replace all instances of `${/{` by removing the forward slash :)
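For context, here is a minimal sketch of the OIDC login step that consumes those three secrets. It is not my exact workflow, just the general shape; note the `id-token: write` permission that OIDC requires:

```yaml
permissions:
  id-token: write   # required for OIDC federation
  contents: read

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: "Azure Login via OIDC"
        uses: azure/login@v2
        with:
          client-id: ${/{ secrets.CLIENT_ID }}
          tenant-id: ${/{ secrets.TENANT_ID }}
          subscription-id: ${/{ secrets.SUB_ID }}
```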
- Then I need to add `${/{ secrets.REPO_BOT_PEM }}` as discussed here for access to my module repos.
- I would have that one come from a Key Vault as well, but I had issues reading secrets from AKV that are private keys, as explained here.
- Once those secrets are added, we then just need to do two things: populate our AKV with all possible secrets (a seed example is sketched just below), and then add a task in our pipeline that will update the workflow and “switch” based on the needed variables.
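Populating the vault is a one-time (or as-needed) seeding exercise. A rough sketch with placeholder vault and value names:

```bash
# Placeholders: seed the vault ahead of time with every secret the workflows may need
az keyvault secret set --vault-name "<vault-name>" --name "hub-subscription-id"   --value "<hub-sub-id>"
az keyvault secret set --vault-name "<vault-name>" --name "spoke-subscription-id" --value "<spoke-sub-id>"
```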
- Here is the action that will switch based on the specific workflow:
```yaml
- name: "Parse Workflow TF Folder From Matrix"
  id: parse
  run: |
    cd $GITHUB_WORKSPACE
    chmod +x ./.github/scripts/parse.sh
    ./.github/scripts/parse.sh
  env:
    CURRENT_DIRECTORY: ${/{ matrix.directories }}
```
- And here is the script.
- You will notice a few important things about this script.
- First, it makes use of Github Outputs context which is critical to how workflows work in Github Actions.
- I used to think outputs only worked with inline bash, which led to long workflows, but I found that shell scripts called from a step will inherit the `$GITHUB_OUTPUT` env var.
- Second, the script mostly works by looking at the current matrix item and then setting outputs based on its value. Just like a PowerShell switch statement, but in bash.
- Lastly, after the parsing script, you just reference any output key/value pair by the key's name in subsequent steps. For example:

```yaml
TF_VAR_subscription_id: ${/{ steps.azure-keyvault-secrets-spoke.outputs.spoke-subscription-id }}
TF_VAR_hub_subscription_id: ${/{ steps.azure-keyvault-secrets-hub.outputs.hub-subscription-id }}
```

- In these examples the step name is `azure-keyvault-secrets` as seen here, and the secrets we are setting are `spoke-subscription-id` and `hub-subscription-id`, as seen in various spots of the parse script above (look for any lines with `>> $GITHUB_OUTPUT`). A trimmed sketch of that pattern follows below.
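To make the switch pattern concrete, here is a heavily trimmed, illustrative sketch of what such a parse script can look like. The folder and vault names are placeholders, not the real values from my repo:

```bash
#!/usr/bin/env bash
# Illustrative sketch only -- the real parse.sh sets more outputs per matrix item.
set -euo pipefail

case "$CURRENT_DIRECTORY" in
  *hub*)
    echo "keyvault-name=<hub-akv-name>"   >> "$GITHUB_OUTPUT"
    echo "tf-folder=<path-to-hub-tf>"     >> "$GITHUB_OUTPUT"
    ;;
  *spoke*)
    echo "keyvault-name=<spoke-akv-name>" >> "$GITHUB_OUTPUT"
    echo "tf-folder=<path-to-spoke-tf>"   >> "$GITHUB_OUTPUT"
    ;;
  *)
    echo "Unhandled matrix directory: $CURRENT_DIRECTORY" >&2
    exit 1
    ;;
esac
```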
- In the previous step we dynamically get the subscription ID and hub subscription ID needed so that we can build providers in our calling workflows (one way to implement such a lookup step is sketched below). See my lab section for how this works.
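Here is an illustrative sketch of one way a step like `azure-keyvault-secrets-hub` could pull a secret at run time and expose it as an output. The step id, secret name, and the `keyvault-name` output it reads from the parse step are assumptions for the example, not my exact workflow:

```yaml
- name: "Get Hub Secrets From AKV"
  id: azure-keyvault-secrets-hub
  run: |
    # Read one secret from the vault chosen by the parse step (assumed output name)
    value=$(az keyvault secret show \
      --vault-name "${/{ steps.parse.outputs.keyvault-name }}" \
      --name "hub-subscription-id" \
      --query value -o tsv)
    echo "::add-mask::$value"                            # keep the value out of the logs
    echo "hub-subscription-id=$value" >> "$GITHUB_OUTPUT"
```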
- Also, I have recently added a random delay of up to 45 seconds (sketched at the end of this section) because I might have multiple parallel workflows running, and since the AKV firewall only whitelists the runner at run time, jobs would often error out with ‘(Forbidden) Client address is not authorized’ like so:
```text
Run az account set --subscription "prd-hub"
ERROR: (Forbidden) Client address is not authorized and caller is not a trusted service.
Client address: 20.81.159.17
Caller: appid=***;oid=fe62cc9a-b71b-46ae-93b3-d154327f57a4;iss=https://sts.windows.net/***/
Vault: aa-prd-scus-hub-akv-v1;location=southcentralus
Code: Forbidden
Message: Client address is not authorized and caller is not a trusted service.
Client address: 20.81.159.17
Caller: appid=***;oid=fe62cc9a-b71b-46ae-93b3-d154327f57a4;iss=https://sts.windows.net/***/
Vault: aa-prd-scus-hub-akv-v1;location=southcentralus
Inner error: {
    "code": "ForbiddenByFirewall"
}
Error: Process completed with exit code 1.
```
- The fix is to simply click the `rerun failed jobs` button. This happens due to concurrency, which the 45 second delay is supposed to fix.
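For completeness, the jitter step itself is tiny. A minimal sketch (the step name is just illustrative):

```yaml
- name: "Random Delay Before AKV Access"
  shell: bash
  # Sleep 0-44 seconds so parallel jobs don't race the AKV firewall whitelist
  run: sleep $(( RANDOM % 45 ))
```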