google_dataproc_workflow_template

A Workflow Template is a reusable workflow configuration. It defines a graph of jobs with information on where to run those jobs.

Example Usage

resource "google_dataproc_workflow_template" "template" {
  name = "template-example"
  location = "us-central1"
  placement {
    managed_cluster {
      cluster_name = "my-cluster"
      config {
        gce_cluster_config {
          zone = "us-central1-a"
          tags = ["foo", "bar"]
        }
        master_config {
          num_instances = 1
          machine_type = "n1-standard-1"
          disk_config {
            boot_disk_type = "pd-ssd"
            boot_disk_size_gb = 15
          }
        }
        worker_config {
          num_instances = 3
          machine_type = "n1-standard-2"
          disk_config {
            boot_disk_size_gb = 10
            num_local_ssds = 2
          }
        }

        secondary_worker_config {
          num_instances = 2
        }
        software_config {
          image_version = "2.0.35-debian10"
        }
      }
    }
  }
  jobs {
    step_id = "someJob"
    spark_job {
      main_class = "SomeClass"
    }
  }
  jobs {
    step_id = "otherJob"
    prerequisite_step_ids = ["someJob"]
    presto_job {
      query_file_uri = "someuri"
    }
  }
}

Argument Reference

The following arguments are supported:

The jobs block supports:

The placement block supports:

The config block supports:

The hadoop_job block supports:

The logging_config block supports:

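A minimal sketch of a hadoop_job step with a nested logging_config, assuming the standard Dataproc HadoopJob fields; the bucket and jar paths are placeholders:

jobs {
  step_id = "hadoopJob"
  hadoop_job {
    main_jar_file_uri = "gs://my-bucket/jars/wordcount.jar"   # placeholder URI
    args              = ["gs://my-bucket/input/", "gs://my-bucket/output/"]
    logging_config {
      # Driver log levels, keyed by logger name.
      driver_log_levels = {
        "root" = "INFO"
      }
    }
  }
}
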
The hive_job block supports:

The query_list block supports:

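A hedged sketch of a hive_job that inlines its statements through a query_list block instead of a query_file_uri; the queries shown are illustrative:

jobs {
  step_id = "hiveJob"
  hive_job {
    query_list {
      queries = ["SHOW DATABASES;", "SHOW TABLES IN default;"]
    }
    continue_on_failure = false
  }
}
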
The pig_job block supports:

The logging_config block supports:

The query_list block supports:

The presto_job block supports:

The logging_config block supports:

The query_list block supports:

The pyspark_job block supports:

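A pyspark_job sketch, assuming the usual PySpark job fields; the script URIs are placeholders:

jobs {
  step_id = "pysparkJob"
  pyspark_job {
    main_python_file_uri = "gs://my-bucket/scripts/main.py"        # placeholder
    python_file_uris     = ["gs://my-bucket/scripts/helpers.py"]   # placeholder
    args                 = ["--run-date", "2023-01-01"]
  }
}
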
The logging_config block supports:

The scheduling block supports:

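A scheduling sketch showing per-step restart limits, assuming the Dataproc JobScheduling fields max_failures_per_hour and max_failures_total:

jobs {
  step_id = "flakyJob"
  scheduling {
    max_failures_per_hour = 5
    max_failures_total    = 20
  }
  spark_job {
    main_class = "SomeClass"
  }
}
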
The spark_job block supports:

The logging_config block supports:

The spark_r_job block supports:

The logging_config block supports:

The spark_sql_job block supports:

The logging_config block supports:

The query_list block supports:

The parameters block supports:

The validation block supports:

The regex block supports:

The values block supports:

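A hedged sketch of a template parameter constrained by a values validation; the field path follows the Dataproc parameterization syntax and the allowed zones are illustrative:

parameters {
  name        = "ZONE"
  description = "Zone the managed cluster runs in"
  fields      = ["placement.managedCluster.config.gceClusterConfig.zone"]
  validation {
    values {
      values = ["us-central1-a", "us-central1-b"]
    }
    # Alternatively, a regex validation could be used:
    # regex {
    #   regexes = ["us-central1-.*"]
    # }
  }
}
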
The cluster_selector block supports:

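Instead of a managed_cluster, placement can target an existing cluster through a cluster_selector; a minimal sketch, assuming selection by cluster labels:

placement {
  cluster_selector {
    zone = "us-central1-a"
    cluster_labels = {
      "env" = "staging"   # jobs run on an existing cluster carrying this label
    }
  }
}
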
The managed_cluster block supports:

The master_config block supports:

The accelerators block supports:

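A sketch of an accelerators block inside an instance group config; the field names (accelerator_type, accelerator_count) and the GPU type string are assumptions to verify against the provider schema:

master_config {
  num_instances = 1
  machine_type  = "n1-standard-8"
  accelerators {
    accelerator_type  = "nvidia-tesla-t4"   # assumed short-name form; a full type URI may be required
    accelerator_count = 1
  }
}
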
The disk_config block supports:

The autoscaling_config block supports:

The encryption_config block supports:

The endpoint_config block supports:

The gce_cluster_config block supports:

The node_group_affinity block supports:

The reservation_affinity block supports:

The shielded_instance_config block supports:

For this resource, shielded VM options are set under placement.managed_cluster.config.gce_cluster_config:

config {
  gce_cluster_config {
    shielded_instance_config {
      enable_secure_boot          = true
      enable_vtpm                 = true
      enable_integrity_monitoring = true
    }
  }
}

The gke_cluster_config block supports:

The namespaced_gke_deployment_target block supports:

The initialization_actions block supports:

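An initialization_actions sketch, assuming the API-style field names executable_file and execution_timeout; the script URI is a placeholder:

initialization_actions {
  executable_file   = "gs://my-bucket/scripts/install-deps.sh"   # placeholder
  execution_timeout = "300s"
}
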
The lifecycle_config block supports:

The metastore_config block supports:

The security_config block supports:

The kerberos_config block supports:

The software_config block supports:

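A software_config sketch extending the main example with cluster properties; the key uses the Dataproc prefix:property format and the value is illustrative:

software_config {
  image_version = "2.0.35-debian10"
  properties = {
    "spark:spark.executor.memory" = "4g"
  }
}
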
Attributes Reference

In addition to the arguments listed above, the following computed attributes are exported:

Timeouts

This resource provides the following Timeouts configuration options:

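A timeouts block can be added to the resource to override the defaults; a minimal sketch (the 20-minute values are illustrative):

resource "google_dataproc_workflow_template" "template" {
  # ...
  timeouts {
    create = "20m"
    delete = "20m"
  }
}
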
Import

WorkflowTemplate can be imported using any of these accepted formats:

* projects/{{project}}/locations/{{location}}/workflowTemplates/{{name}}
* {{project}}/{{location}}/{{name}}
* {{location}}/{{name}}

In Terraform v1.5.0 and later, use an import block to import WorkflowTemplate using one of the formats above. For example:

import {
  id = "projects/{{project}}/locations/{{location}}/workflowTemplates/{{name}}"
  to = google_dataproc_workflow_template.default
}

When using the terraform import command, WorkflowTemplate can be imported using one of the formats above. For example:

$ terraform import google_dataproc_workflow_template.default projects/{{project}}/locations/{{location}}/workflowTemplates/{{name}}
$ terraform import google_dataproc_workflow_template.default {{project}}/{{location}}/{{name}}
$ terraform import google_dataproc_workflow_template.default {{location}}/{{name}}