Terraform: Data Sources Module
Description:
So if your organization uses a hub-and-spoke-like structure like mine, you most likely have certain resources that you point all your subscriptions to. For example, you could say something like ‘all resources must send their logs to one of our Log Analytics workspaces based on environment and region’. Here is a way you could develop a module that automatically resolves the correct Log Analytics Workspace to send logs to, and that you can copy and paste freely between your different module compositions.
Note that this is one of the official ways to use Terraform.
Note: You can see the code for this post on my GitHub repo.
To Resolve:
- First create a module called ‘data_sources’ or something like that. Mine is here => data-sources
- Inside the module, accept providers for each of your hub network environments:
```hcl
terraform {
  required_providers {
    azurerm = {
      source                = "hashicorp/azurerm"
      configuration_aliases = [azurerm.nonprod-hub, azurerm.prod-hub]
      version               = ">= 3.20.0"
    }
  }
}
```
- NOTE: I used to build providers inside the module, but I kept getting errors in my terraform plan saying this was deprecated and that you should pass them in instead. Makes sense; modules should have very little except vars and code. A sketch of the caller side follows.
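As a rough sketch of what passing the providers in looks like from the calling composition (the subscription variables and module path here are my assumptions, not from the original code):

```hcl
# Hypothetical aliased providers for each hub environment; the
# subscription_id variables and module path are placeholders.
provider "azurerm" {
  alias           = "nonprod-hub"
  subscription_id = var.nonprod_hub_subscription_id
  features {}
}

provider "azurerm" {
  alias           = "prod-hub"
  subscription_id = var.prod_hub_subscription_id
  features {}
}

module "dataLookup" {
  source = "./modules/data-sources"

  # Map the aliases declared in the module's configuration_aliases
  # to the aliased providers defined above.
  providers = {
    azurerm.nonprod-hub = azurerm.nonprod-hub
    azurerm.prod-hub    = azurerm.prod-hub
  }
}
```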
- Next, just start doing data lookups like here, here, and here, where you query resources from your hub network based on environment. (A sketch of what these lookups might look like follows this list.)
- A couple of things to note here:
  - We are querying both Nonprod and Prod environments, as well as the South Central and East regions, and getting their resources at the same time.
  - We are accessing whatever properties are available on those resources, like `name`, `id`, etc., and exporting them as outputs.
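Here is a minimal sketch of what the inside of the module might look like; the workspace names, resource group names, and output names are hypothetical:

```hcl
# Hypothetical workspace/resource group names; query each hub
# environment through its aliased provider.
data "azurerm_log_analytics_workspace" "nonprod_scus" {
  provider            = azurerm.nonprod-hub
  name                = "law-nonprod-scus"
  resource_group_name = "rg-hub-nonprod-scus"
}

data "azurerm_log_analytics_workspace" "prod_scus" {
  provider            = azurerm.prod-hub
  name                = "law-prod-scus"
  resource_group_name = "rg-hub-prod-scus"
}

# Export whatever properties callers need (name, id, etc.).
output "law_nonprod_scus_id" {
  value = data.azurerm_log_analytics_workspace.nonprod_scus.id
}

output "law_prod_scus_id" {
  value = data.azurerm_log_analytics_workspace.prod_scus.id
}
```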
- Ok, so now we have `module.dataLookup.law_pe_rg` and `module.dataLookup.law_pe_name`; how does that help? Well, the real power of this is that you can use a lookup function in your module composition, like so:
Let’s break this `common.tf` down:
  - First, we call our module so we get all of its outputs in `module.dataLookup.$some_output`.
  - Next, we pass in our `${var.env_stage_abbr}` and `${var.region_abbr}` and store them as a string with an underscore separator in `local.lookup` (line 65).
  - Next, we create a map object for each combination of environments and regions. NOTE: I only did it by environment in this example, but you can extend this logic to make some pretty sweet lookups.
  - For example, you could do `environment_region_filter` or something if you want to have lookups based on different environments, Azure regions, or some other filter like `logical stage`. Either way, you then create a lookup table that maps all possible values to all possible combinations. (A sketch of this pattern follows this list.)
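As a minimal sketch of that `common.tf` pattern (the map keys, abbreviation values, and module output names are my assumptions, carried over from the sketch above):

```hcl
locals {
  # e.g. "np_scus" when var.env_stage_abbr = "np" and var.region_abbr = "scus";
  # the abbreviation values themselves are hypothetical.
  lookup = "${var.env_stage_abbr}_${var.region_abbr}"

  # Map every environment/region combination to the matching module output.
  law_ids = {
    np_scus = module.dataLookup.law_nonprod_scus_id
    pr_scus = module.dataLookup.law_prod_scus_id
  }

  # Resolve the correct workspace id for this composition.
  law_id = lookup(local.law_ids, local.lookup, null)
}
```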
- Man, this is a lot of work, why do it?
- You may wonder why do all this work to begin with just to get a stupid `id` for a Log Analytics workspace. Well, the power of this is that you only have to do this once, and you can store it as a `common.tf` or something and copy it between all your module compositions. The real power of this comes from deploying between different regions and environments: since the data lookup will match based on those, you will never have to change your code!!
  - For example, every time I need to pass a `log_analytics_workspace_id`, I can just pass `local.law_id` blindly and know it will perform the lookup for me! (A short sketch of this follows.)
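For instance, here is a sketch of wiring `local.law_id` into a diagnostic setting (the resource and names here are hypothetical, not from the original code):

```hcl
# Hypothetical diagnostic setting; local.law_id resolves to the right
# workspace for the current environment/region automatically.
resource "azurerm_monitor_diagnostic_setting" "example" {
  name                       = "send-to-law"
  target_resource_id         = azurerm_storage_account.example.id
  log_analytics_workspace_id = local.law_id

  metric {
    category = "AllMetrics"
  }
}
```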
- NOTE: I have recently started playing with Terragrunt, and it seems like it does the same thing but without a module call/lookup. I will update on it shortly.