Terraform module input variable changes not reflected in remote state for GCP resources
I'm converting an old project and have run into a scenario where changes to input variables in my Terraform module are not reflected in the resources managed by the remote state. I'm using Terraform 1.3.6 with the `google` provider version 3.5.0. I have a module that creates a Google Cloud Storage bucket, and I want to set its location dynamically from an input variable. Here's the relevant module call:

```hcl
module "storage_bucket" {
  source      = "./modules/storage_bucket"
  bucket_name = var.bucket_name
  location    = var.bucket_location
}
```

In my root module, the variables are defined like so:

```hcl
variable "bucket_name" {
  type = string
}

variable "bucket_location" {
  type = string
}
```

I initialize and apply with:

```bash
terraform init
terraform apply -var="bucket_name=my-awesome-bucket" -var="bucket_location=US"
```

The bucket is created successfully in the US, but when I change `bucket_location` to `EU`:

```bash
terraform apply -var="bucket_name=my-awesome-bucket" -var="bucket_location=EU"
```

I get the following message:

```
No changes. Your infrastructure matches the configuration.
```

Running `terraform refresh` before the `apply` makes no difference. The state file seems to be up to date, since it reflects what was applied earlier, and I've checked that the variable names match exactly the ones used in the module call. My development environment is Ubuntu 22.04, and I've been using HCL for about a year.

Why aren't the module's input variables triggering an update? Am I missing a step, or are there best practices for managing state with remote backends that would cover this?
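For context, an input can only propagate if the child module declares the variable and actually references it in the resource. The question doesn't show the contents of `modules/storage_bucket`, so the following is only a sketch of what it would roughly need to contain for the location to flow through (the resource label and file layout here are assumptions):

```hcl
# modules/storage_bucket/main.tf (hypothetical sketch -- the actual
# module contents are not shown in the question)

variable "bucket_name" {
  type = string
}

# Name must match the argument used in the module block ("location")
variable "location" {
  type = string
}

resource "google_storage_bucket" "bucket" {
  name = var.bucket_name

  # If this argument were hardcoded (e.g. location = "US") instead of
  # referencing the variable, changing the input at apply time would
  # yield "No changes."
  location = var.location
}
```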