databricks_notebook Resource

This resource allows you to manage Databricks Notebooks. You can also work with databricks_notebook and databricks_notebook_paths data sources.

Example Usage

You can declare a Terraform-managed notebook by specifying the source attribute of a corresponding local file. Only the .scala, .py, .sql, .r, and .ipynb extensions are supported if you would like to omit the language attribute.

data "databricks_current_user" "me" {
}

resource "databricks_notebook" "ddl" {
  source = "${path.module}/DDLgen.py"
  path   = "${data.databricks_current_user.me.home}/AA/BB/CC"
}

You can also create a managed notebook with an inline source through the content_base64 and language attributes.

resource "databricks_notebook" "notebook" {
  content_base64 = base64encode(<<-EOT
    # created from ${abspath(path.module)}
    display(spark.range(10))
    EOT
  )
  path     = "/Shared/Demo"
  language = "PYTHON"
}

You can also manage Databricks Archives to import whole folders of notebooks statically. Whenever you update the .dbc file, the Terraform-managed notebook folder is removed and replaced with the contents of the new .dbc file. You are strongly advised to use the .dbc format only with the source attribute of the resource:

resource "databricks_notebook" "lesson" {
  source = "${path.module}/IntroNotebooks.dbc"
  path   = "/Shared/Intro"
}

Argument Reference

The size of a notebook's source code must not exceed a few megabytes. The following arguments are supported:

- path - (Required) The absolute path of the notebook in the workspace, beginning with "/".
- source - Path to a local notebook file from which the content is read. Conflicts with content_base64.
- content_base64 - The base64-encoded source of the notebook. Conflicts with source.
- language - Notebook language: one of SCALA, PYTHON, SQL, or R. Required when content_base64 is used; inferred from the file extension when source is used.
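As a variation on the examples above, an existing local file can also be supplied through content_base64 using Terraform's filebase64 function; a minimal sketch (the file name is illustrative):

resource "databricks_notebook" "report" {
  # language must be set explicitly, because content_base64
  # carries no file extension to infer it from
  content_base64 = filebase64("${path.module}/report.py")
  path           = "/Shared/Report"
  language       = "PYTHON"
}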

Attribute Reference

In addition to all arguments above, the following attributes are exported:

Access Control
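Notebook permissions are typically managed with the databricks_permissions resource. A minimal sketch, assuming the ddl notebook from the first example and a hypothetical data-engineers group; check the databricks_permissions documentation for the exact permission levels available for notebooks:

resource "databricks_permissions" "notebook_usage" {
  notebook_path = databricks_notebook.ddl.path

  access_control {
    group_name       = "data-engineers"
    permission_level = "CAN_RUN"
  }
}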

Import

The resource notebook can be imported using the notebook path:

terraform import databricks_notebook.this /path/to/notebook

The following resources are often used in the same context: