Experimental resource exporter

Generates *.tf files for Databricks resources together with an import.sh script that is used to import objects into the Terraform state. Available as part of the provider binary. The exporter can only authenticate through environment variables. It's best used when you need to quickly export Terraform configuration for an existing Databricks workspace. After generating the configuration, we strongly recommend manually reviewing all created files.

Example Usage

After downloading the latest released binary, unpack it and place it in the same folder as your configuration. You may already have this binary: check the .terraform folder of any state directory where you've used the databricks provider, or look in your plugin cache at ~/.terraform.d/plugins/registry.terraform.io/databricks/databricks/*/*/terraform-provider-databricks. Here's the tool in action:
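If you have used the databricks provider before, a previously downloaded copy of the binary may already be on disk. A minimal sketch for locating it, using the paths described above (adjust for your OS, architecture, and provider version):

```shell
# Look in the plugin cache first
ls ~/.terraform.d/plugins/registry.terraform.io/databricks/databricks/*/*/terraform-provider-databricks*

# Or search the .terraform folder of an existing state directory
find . -path '*/.terraform/*' -name 'terraform-provider-databricks*' -type f
```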

(asciicast recording of an interactive export session)

The exporter can also be used in non-interactive mode:

export DATABRICKS_HOST=...
export DATABRICKS_TOKEN=...
./terraform-provider-databricks exporter -skip-interactive \
 -services=groups,secrets,access,compute,users,jobs,storage \
 -listing=jobs,compute \
 -last-active-days=90 \
 -debug

Argument Reference

All arguments are optional; they tune what code is generated.

Services

Services are logical groups of resources, used for filtering and for organizing the files written to -directory. All resources are sorted globally by resource name, which allows you to use the generated files for compliance purposes. Nevertheless, managing the entire Databricks workspace with Terraform is the preferred approach, except for notebooks and possibly libraries, which may have their own CI/CD processes.
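For example, to export only notebooks into a dedicated directory, you can combine -services with -directory. This is a sketch; the notebooks service name is assumed here, so verify the available service names against the exporter's help output for your provider version:

```shell
export DATABRICKS_HOST=...
export DATABRICKS_TOKEN=...
./terraform-provider-databricks exporter -skip-interactive \
 -services=notebooks \
 -directory=./notebooks-export
```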

Secrets

For security reasons, databricks_secret cannot contain actual plaintext secrets. By default, the importer creates a variable in vars.tf with the same name as the secret, and you are expected to fill in its value afterwards. Alternatively, you can use the -export-secrets command-line option to generate a terraform.tfvars file with the secret values.
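As an illustration of the shape this takes (the variable name below is hypothetical; the actual generated names depend on your secret scopes and keys), the generated vars.tf and the corresponding terraform.tfvars entry might look like:

```hcl
# vars.tf — generated by the exporter; name is illustrative
variable "string_value_my_scope_my_key" {
  description = "Secret my-key from scope my-scope"
  type        = string
  sensitive   = true
}

# terraform.tfvars — filled in by you, or generated with -export-secrets
string_value_my_scope_my_key = "actual-secret-value"
```

Keep terraform.tfvars out of version control when it contains real secret values.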

Parallel execution

To speed up the export, the Terraform exporter performs many operations, such as listing and actual data export, in parallel using goroutines. Built-in defaults control the parallelism, but some parameters can also be tuned through environment variables specific to the exporter.
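A sketch of tuning parallelism before invoking the exporter. The environment variable names below are assumptions and may differ between provider versions; consult the exporter documentation for your release before relying on them:

```shell
# Parallelism for listing workspace objects (name assumed; verify for your version)
export EXPORTER_WS_LIST_PARALLELISM=10
# Per-resource parallelism override (name assumed; verify for your version)
export EXPORTER_PARALLELISM_databricks_notebook=10

./terraform-provider-databricks exporter -skip-interactive -services=notebooks
```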

Support Matrix

The exporter aims to generate HCL code for most of the resources within the Databricks workspace:

| Resource | Supported | Incremental | Workspace | Account |
|---|---|---|---|---|
| databricks_access_control_rule_set | Yes | No | No | Yes |
| databricks_artifact_allowlist | Yes | No | Yes | No |
| databricks_catalog | Yes | Yes | Yes | No |
| databricks_cluster | Yes | No | Yes | No |
| databricks_cluster_policy | Yes | No | Yes | No |
| databricks_connection | Yes | Yes | Yes | No |
| databricks_dbfs_file | Yes | No | Yes | No |
| databricks_external_location | Yes | Yes | Yes | No |
| databricks_file | Yes | No | Yes | No |
| databricks_global_init_script | Yes | Yes | Yes | No |
| databricks_grants | Yes | No | Yes | No |
| databricks_group | Yes | No | Yes | Yes |
| databricks_group_instance_profile | Yes | No | Yes | No |
| databricks_group_member | Yes | No | Yes | Yes |
| databricks_group_role | Yes | No | Yes | Yes |
| databricks_instance_pool | Yes | No | Yes | No |
| databricks_instance_profile | Yes | No | Yes | No |
| databricks_ip_access_list | Yes | Yes | Yes | No |
| databricks_job | Yes | No | Yes | No |
| databricks_library | Yes* | No | Yes | No |
| databricks_metastore | Yes | Yes | No | Yes |
| databricks_metastore_assignment | Yes | No | No | Yes |
| databricks_mlflow_experiment | No | No | No | No |
| databricks_mlflow_model | No | No | No | No |
| databricks_mlflow_webhook | Yes | Yes | Yes | No |
| databricks_model_serving | Yes | Yes | Yes | No |
| databricks_notebook | Yes | Yes | Yes | No |
| databricks_obo_token | Not Applicable | No | No | No |
| databricks_permissions | Yes | No | Yes | No |
| databricks_pipeline | Yes | Yes | Yes | No |
| databricks_recipient | Yes | Yes | Yes | No |
| databricks_registered_model | Yes | Yes | Yes | No |
| databricks_repo | Yes | No | Yes | No |
| databricks_schema | Yes | Yes | Yes | No |
| databricks_secret | Yes | No | Yes | No |
| databricks_secret_acl | Yes | No | Yes | No |
| databricks_secret_scope | Yes | No | Yes | No |
| databricks_service_principal | Yes | No | Yes | Yes |
| databricks_service_principal_role | Yes | No | Yes | Yes |
| databricks_share | Yes | Yes | Yes | No |
| databricks_sql_alert | Yes | Yes | Yes | No |
| databricks_sql_dashboard | Yes | Yes | Yes | No |
| databricks_sql_endpoint | Yes | No | Yes | No |
| databricks_sql_global_config | Yes | No | Yes | No |
| databricks_sql_permissions | No | No | Yes | No |
| databricks_sql_query | Yes | Yes | Yes | No |
| databricks_sql_table | Yes | Yes | Yes | No |
| databricks_sql_visualization | Yes | Yes | Yes | No |
| databricks_sql_widget | Yes | Yes | Yes | No |
| databricks_storage_credential | Yes | Yes | Yes | No |
| databricks_system_schema | Yes | No | Yes | No |
| databricks_token | Not Applicable | No | Yes | No |
| databricks_user | Yes | No | Yes | Yes |
| databricks_user_instance_profile | No | No | No | No |
| databricks_user_role | Yes | No | Yes | Yes |
| databricks_volume | Yes | Yes | Yes | No |
| databricks_workspace_conf | Yes (partial) | No | Yes | No |
| databricks_workspace_file | Yes | Yes | Yes | No |

Notes: