Resource: aws_appflow_flow

Provides an AppFlow flow resource.

Example Usage

resource "aws_s3_bucket" "example_source" {
  bucket = "example-source"
}

data "aws_iam_policy_document" "example_source" {
  statement {
    sid    = "AllowAppFlowSourceActions"
    effect = "Allow"

    principals {
      type        = "Service"
      identifiers = ["appflow.amazonaws.com"]
    }

    actions = [
      "s3:ListBucket",
      "s3:GetObject",
    ]

    resources = [
      "arn:aws:s3:::example-source",
      "arn:aws:s3:::example-source/*",
    ]
  }
}

resource "aws_s3_bucket_policy" "example_source" {
  bucket = aws_s3_bucket.example_source.id
  policy = data.aws_iam_policy_document.example_source.json
}

resource "aws_s3_object" "example" {
  bucket = aws_s3_bucket.example_source.id
  key    = "example_source.csv"
  source = "example_source.csv"
}

resource "aws_s3_bucket" "example_destination" {
  bucket = "example-destination"
}

data "aws_iam_policy_document" "example_destination" {
  statement {
    sid    = "AllowAppFlowDestinationActions"
    effect = "Allow"

    principals {
      type        = "Service"
      identifiers = ["appflow.amazonaws.com"]
    }

    actions = [
      "s3:PutObject",
      "s3:AbortMultipartUpload",
      "s3:ListMultipartUploadParts",
      "s3:ListBucketMultipartUploads",
      "s3:GetBucketAcl",
      "s3:PutObjectAcl",
    ]

    resources = [
      "arn:aws:s3:::example-destination",
      "arn:aws:s3:::example-destination/*",
    ]
  }
}

resource "aws_s3_bucket_policy" "example_destination" {
  bucket = aws_s3_bucket.example_destination.id
  policy = data.aws_iam_policy_document.example_destination.json
}

resource "aws_appflow_flow" "example" {
  name = "example"

  source_flow_config {
    connector_type = "S3"
    source_connector_properties {
      s3 {
        bucket_name   = aws_s3_bucket_policy.example_source.bucket
        bucket_prefix = "example"
      }
    }
  }

  destination_flow_config {
    connector_type = "S3"
    destination_connector_properties {
      s3 {
        bucket_name = aws_s3_bucket_policy.example_destination.bucket

        s3_output_format_config {
          prefix_config {
            prefix_type = "PATH"
          }
        }
      }
    }
  }

  task {
    source_fields     = ["exampleField"]
    destination_field = "exampleField"
    task_type         = "Map"

    connector_operator {
      s3 = "NO_OP"
    }
  }

  trigger_config {
    trigger_type = "OnDemand"
  }
}

Argument Reference

This resource supports the following arguments:

Destination Flow Config

Destination Connector Properties

Generic Destination Properties

EventBridge, Honeycode, and Marketo destination properties all support the following attributes:
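As a hedged sketch of these generic properties (assuming the connector-specific block takes a single object argument naming the destination object, and that "Lead" is merely an illustrative value), a Marketo destination might look like:

```terraform
destination_flow_config {
  connector_type = "Marketo"

  destination_connector_properties {
    marketo {
      # `object` names the Marketo object to write to. The value shown
      # is an assumption for illustration; valid values depend on the
      # connector profile.
      object = "Lead"
    }
  }
}
```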

Custom Connector Destination Properties
Customer Profiles Destination Properties
Redshift Destination Properties
S3 Destination Properties
S3 Output Format Config
Salesforce Destination Properties
SAPOData Destination Properties
Success Response Handling Config
Snowflake Destination Properties
Upsolver Destination Properties
Upsolver S3 Output Format Config
Aggregation Config
Prefix Config
Zendesk Destination Properties
Error Handling Config

Source Flow Config

Source Connector Properties

Generic Source Properties

Amplitude, Datadog, Dynatrace, Google Analytics, Infor Nexus, Marketo, ServiceNow, Singular, Slack, Trend Micro, and Zendesk source properties all support the following attributes:
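As a hedged sketch of these generic properties (assuming the connector-specific block takes a single object argument naming the source object, and that the profile name and object value below are illustrative), a Datadog source might look like:

```terraform
source_flow_config {
  connector_type = "Datadog"

  # Hypothetical profile name; it must match an existing
  # aws_appflow_connector_profile.
  connector_profile_name = "example-profile"

  source_connector_properties {
    datadog {
      # `object` names the object to pull from the source connector.
      object = "Users"
    }
  }
}
```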

Custom Connector Source Properties
S3 Source Properties
S3 Input Format Config
Salesforce Source Properties
SAPOData Source Properties
Veeva Source Properties

Incremental Pull Config
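A minimal sketch, assuming incremental_pull_config nests inside source_flow_config and takes a datetime_type_field_name argument identifying the timestamp field used to detect changed records (the profile name and field value below are illustrative):

```terraform
source_flow_config {
  connector_type = "Salesforce"

  # Hypothetical profile name; it must match an existing
  # aws_appflow_connector_profile.
  connector_profile_name = "example-profile"

  # Assumed argument: the datetime field AppFlow tracks between runs
  # to pull only new or changed records.
  incremental_pull_config {
    datetime_type_field_name = "LastModifiedDate"
  }

  source_connector_properties {
    salesforce {
      object = "Opportunity"
    }
  }
}
```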

Task

Connector Operator

Trigger Config

Scheduled Trigger Properties

The trigger_properties block supports only one attribute: scheduled, a block which in turn holds the scheduling arguments. For example:

resource "aws_appflow_flow" "example" {
  # ... other configuration ...

  trigger_config {
    trigger_type = "Scheduled"

    trigger_properties {
      scheduled {
        schedule_expression = "rate(1minutes)"
      }
    }
  }
}

Attribute Reference

This resource exports the following attributes in addition to the arguments above:

arn - The flow's Amazon Resource Name (ARN).

Import

In Terraform v1.5.0 and later, use an import block to import AppFlow flows using the arn. For example:

import {
  to = aws_appflow_flow.example
  id = "arn:aws:appflow:us-west-2:123456789012:flow/example-flow"
}

Using terraform import, import AppFlow flows using the arn. For example:

% terraform import aws_appflow_flow.example arn:aws:appflow:us-west-2:123456789012:flow/example-flow