databricks_dbfs_file Resource

This resource lets you manage relatively small files on the Databricks File System (DBFS). The best use cases are libraries for databricks_cluster or databricks_job. You can also read existing files with the databricks_dbfs_file and databricks_dbfs_file_paths data sources.

Example Usage

To manage a file on the Databricks File System (DBFS) with Terraform, you must specify the source attribute containing the full path to the file on the local filesystem.

resource "databricks_dbfs_file" "this" {
  source = "${path.module}/main.tf"
  path   = "/tmp/main.tf"
}

Alternatively, you can create DBFS files with custom content generated by Terraform's built-in functions.

resource "databricks_dbfs_file" "this" {
  content_base64 = base64encode(<<-EOT
    Hello, world!
    Module is ${abspath(path.module)}
    EOT
  )
  path = "/tmp/this.txt"
}

Upload a wheel and install it as a databricks_library on all clusters returned by the databricks_clusters data source:

data "databricks_clusters" "all" {
}

resource "databricks_dbfs_file" "app" {
  source = "${path.module}/baz.whl"
  path   = "/FileStore/baz.whl"
}

resource "databricks_library" "app" {
  for_each   = data.databricks_clusters.all.ids
  cluster_id = each.key
  whl        = databricks_dbfs_file.app.dbfs_path
}
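
The introduction also mentions databricks_job as a common consumer of DBFS-hosted wheels. The sketch below reuses databricks_dbfs_file.app from the previous example and attaches the wheel to a job task; the block layout (task, new_cluster, library, python_wheel_task) reflects recent provider versions, and the runtime version, node type, package name, and entry point are placeholder assumptions to adjust for your workspace.

resource "databricks_job" "app" {
  name = "baz-wheel-job"

  task {
    task_key = "main"

    new_cluster {
      num_workers   = 1
      spark_version = "13.3.x-scala2.12" # assumption: pick a runtime available in your workspace
      node_type_id  = "i3.xlarge"        # assumption: pick a node type available in your cloud
    }

    # attach the wheel uploaded by databricks_dbfs_file.app to this task
    library {
      whl = databricks_dbfs_file.app.dbfs_path
    }

    # hypothetical package name and entry point inside baz.whl
    python_wheel_task {
      package_name = "baz"
      entry_point  = "main"
    }
  }
}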
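
The databricks_dbfs_file and databricks_dbfs_file_paths data sources mentioned in the introduction can read an existing file or list a directory. A minimal sketch, assuming the limit_file_size and recursive arguments as described in those data sources' own documentation:

data "databricks_dbfs_file" "uploaded" {
  # reads metadata (and base64 content) of the wheel uploaded above
  path            = databricks_dbfs_file.app.path
  limit_file_size = true
}

data "databricks_dbfs_file_paths" "filestore" {
  # lists files directly under /FileStore, without recursing
  path      = "/FileStore"
  recursive = false
}

output "uploaded_file_size" {
  value = data.databricks_dbfs_file.uploaded.file_size
}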

Argument Reference

The following arguments are supported:

source - The full absolute path to the file on the local filesystem. Conflicts with content_base64.
content_base64 - The contents of the file, base64-encoded. Conflicts with source.
path - (Required) The path at which the file is stored on DBFS.

Attribute Reference

In addition to all arguments above, the following attributes are exported:

id - The path of the file on DBFS, same as path.
dbfs_path - The path of the file, prefixed with the dbfs: scheme, as used in the library example above.
file_size - The size of the file in bytes.

Import

The databricks_dbfs_file resource can be imported using the path of the file:

terraform import databricks_dbfs_file.this <path>
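
For example, to import the file created by the first example above, which was stored at /tmp/main.tf:

terraform import databricks_dbfs_file.this /tmp/main.tf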

The following resources are often used in the same context: