databricks_pipeline Resource

Use the databricks_pipeline resource to deploy and manage Delta Live Tables pipelines.

Example Usage

resource "databricks_notebook" "dlt_demo" {
  #...
}

resource "databricks_repo" "dlt_demo" {
  #...
}

resource "databricks_pipeline" "this" {
  name    = "Pipeline Name"
  storage = "/test/first-pipeline"
  configuration = {
    key1 = "value1"
    key2 = "value2"
  }

  cluster {
    label       = "default"
    num_workers = 2
    custom_tags = {
      cluster_type = "default"
    }
  }

  cluster {
    label       = "maintenance"
    num_workers = 1
    custom_tags = {
      cluster_type = "maintenance"
    }
  }

  library {
    notebook {
      path = databricks_notebook.dlt_demo.id
    }
  }

  library {
    file {
      path = "${databricks_repo.dlt_demo.path}/pipeline.sql"
    }
  }

  continuous = false

  notification {
    email_recipients = ["user@domain.com", "user1@domain.com"]
    alerts = [
      "on-update-failure",
      "on-update-fatal-failure",
      "on-update-success",
      "on-flow-failure"
    ]
  }
}

Argument Reference

The following arguments are supported:

* name - A user-friendly name for this pipeline.
* storage - A location on DBFS or cloud storage where output data and metadata required for pipeline execution are stored.
* configuration - An optional map of key-value pairs applied to the entire pipeline.
* cluster - one or more blocks defining the clusters on which the pipeline runs (the example above declares a default and a maintenance cluster).
* library - one or more blocks specifying the pipeline code, as notebook or file entries.
* continuous - A flag indicating whether to run the pipeline continuously. The default value is false.
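A minimal pipeline needs little more than a name and a library block. The sketch below assumes the databricks_notebook.dlt_demo resource from the example above:

resource "databricks_pipeline" "minimal" {
  name = "Minimal Pipeline"

  # A single notebook library is enough to define the pipeline code.
  library {
    notebook {
      path = databricks_notebook.dlt_demo.id
    }
  }
}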

notification block

DLT allows you to specify one or more notification blocks to receive notifications about the pipeline's execution. Each block consists of the following attributes:

* email_recipients (Required) - non-empty list of emails to notify.
* alerts (Required) - non-empty list of alert types; the example above uses on-update-success, on-update-failure, on-update-fatal-failure, and on-flow-failure.
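Separate notification blocks can route different alert types to different recipients. A sketch, with hypothetical addresses:

resource "databricks_pipeline" "this" {
  # ...

  # Failures go to the on-call address (hypothetical recipient).
  notification {
    email_recipients = ["oncall@domain.com"]
    alerts           = ["on-update-failure", "on-update-fatal-failure"]
  }

  # Successes go to the wider team (hypothetical recipient).
  notification {
    email_recipients = ["team@domain.com"]
    alerts           = ["on-update-success"]
  }
}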

Attribute Reference

In addition to all arguments above, the following attributes are exported:

* id - The unique identifier of the Delta Live Tables pipeline.

Import

The resource databricks_pipeline can be imported using the ID of the pipeline:

terraform import databricks_pipeline.this <pipeline-id>
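With Terraform 1.5 or later, the same import can also be expressed in configuration; a sketch (replace the placeholder with a real pipeline ID):

import {
  to = databricks_pipeline.this
  id = "<pipeline-id>"
}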

The following resources are often used in the same context:

* databricks_notebook to manage Databricks Notebooks (used as pipeline code in the example above).
* databricks_repo to manage Databricks Repos (used as the source of the pipeline.sql file in the example above).