google_storage_transfer_job

Creates a new Transfer Job in Google Cloud Storage Transfer.

To get more information about Google Cloud Storage Transfer, see:

* Overview: https://cloud.google.com/storage-transfer/docs/overview
* API documentation: https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs

Example Usage

Example creating a nightly Transfer Job from an AWS S3 bucket to a GCS bucket.

data "google_storage_transfer_project_service_account" "default" {
  project = var.project
}

resource "google_storage_bucket" "s3-backup-bucket" {
  name          = "${var.aws_s3_bucket}-backup"
  storage_class = "NEARLINE"
  project       = var.project
  location      = "US"
}

resource "google_storage_bucket_iam_member" "s3-backup-bucket" {
  bucket     = google_storage_bucket.s3-backup-bucket.name
  role       = "roles/storage.admin"
  member     = "serviceAccount:${data.google_storage_transfer_project_service_account.default.email}"
  depends_on = [google_storage_bucket.s3-backup-bucket]
}

resource "google_pubsub_topic" "topic" {
  name = "${var.pubsub_topic_name}"
}

resource "google_pubsub_topic_iam_member" "notification_config" {
  topic = google_pubsub_topic.topic.id
  role = "roles/pubsub.publisher"
  member = "serviceAccount:${data.google_storage_transfer_project_service_account.default.email}"
}

resource "google_storage_transfer_job" "s3-bucket-nightly-backup" {
  description = "Nightly backup of S3 bucket"
  project     = var.project

  transfer_spec {
    object_conditions {
      max_time_elapsed_since_last_modification = "600s"
      exclude_prefixes = [
        "requests.gz",
      ]
    }
    transfer_options {
      delete_objects_unique_in_sink = false
    }
    aws_s3_data_source {
      bucket_name = var.aws_s3_bucket
      aws_access_key {
        access_key_id     = var.aws_access_key
        secret_access_key = var.aws_secret_key
      }
    }
    gcs_data_sink {
      bucket_name = google_storage_bucket.s3-backup-bucket.name
      path        = "foo/bar/"
    }
  }

  schedule {
    schedule_start_date {
      year  = 2018
      month = 10
      day   = 1
    }
    schedule_end_date {
      year  = 2019
      month = 1
      day   = 15
    }
    start_time_of_day {
      hours   = 23
      minutes = 30
      seconds = 0
      nanos   = 0
    }
    repeat_interval = "86400s"
  }

  notification_config {
    pubsub_topic   = google_pubsub_topic.topic.id
    event_types    = [
      "TRANSFER_OPERATION_SUCCESS",
      "TRANSFER_OPERATION_FAILED",
    ]
    payload_format = "JSON"
  }

  depends_on = [google_storage_bucket_iam_member.s3-backup-bucket, google_pubsub_topic_iam_member.notification_config]
}

Argument Reference

The following arguments are supported:


The transfer_spec block supports:

The schedule block supports:

The event_stream block supports:
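
As an alternative to the schedule block, a transfer can be driven by object-change events; an event-driven job typically omits schedule entirely. A minimal sketch, assuming the Pub/Sub subscription name and the two RFC 3339 timestamps below are placeholders rather than values taken from the example above:

  event_stream {
    name                         = "projects/my-project/subscriptions/my-subscription"
    event_stream_start_time      = "2024-01-01T00:00:00Z"
    event_stream_expiration_time = "2025-01-01T00:00:00Z"
  }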

The object_conditions block supports:

The transfer_options block supports:

The gcs_data_sink block supports:

The gcs_data_source block supports:
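
For bucket-to-bucket transfers the source is declared the same way as the gcs_data_sink shown in the example above. A minimal sketch, with a placeholder bucket name and prefix:

    gcs_data_source {
      bucket_name = "my-source-bucket"
      path        = "source/prefix/"
    }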

The posix_data_sink block supports:

The posix_data_source block supports:
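
The posix_data_sink and posix_data_source blocks both describe a filesystem location reached through a transfer agent. A minimal sketch, with a placeholder directory:

    posix_data_source {
      root_directory = "/mnt/backups"
    }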

The aws_s3_data_source block supports:

The aws_access_key block supports:

The http_data_source block supports:
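
An HTTP/HTTPS source is defined entirely by a URL list. A minimal sketch, with a placeholder URL pointing at a publicly readable list of objects to fetch:

    http_data_source {
      list_url = "https://example.com/url-list.tsv"
    }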

The azure_blob_storage_data_source block supports:

The azure_credentials block supports:
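
Putting the two Azure blocks together, a minimal sketch with a placeholder storage account, container, path, and SAS token variable:

    azure_blob_storage_data_source {
      storage_account = "mystorageaccount"
      container       = "my-container"
      path            = "path/to/data/"

      azure_credentials {
        sas_token = var.azure_sas_token
      }
    }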

The schedule_start_date and schedule_end_date blocks support:

The start_time_of_day block supports:

The notification_config block supports:

Attributes Reference

In addition to the arguments listed above, the following computed attributes are exported:

Import

Storage Transfer Jobs can be imported using the Transfer Job's project and name (without the transferJobs/ prefix), e.g.

* {{project_id}}/{{name}}

In Terraform v1.5.0 and later, use an import block to import Storage Transfer Jobs using one of the formats above. For example:

import {
  id = "{{project_id}}/{{name}}"
  to = google_storage_transfer_job.default
}

When using the terraform import command, Storage Transfer Jobs can be imported using one of the formats above. For example:

$ terraform import google_storage_transfer_job.default {{project_id}}/123456789