airflow.operators.redshift_to_s3_operator

Module Contents

class airflow.operators.redshift_to_s3_operator.RedshiftToS3Transfer(schema:str, table:str, s3_bucket:str, s3_key:str, redshift_conn_id:str='redshift_default', aws_conn_id:str='aws_default', verify:Union[bool, str]=None, unload_options:List=None, autocommit:bool=False, include_header:bool=False, *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Executes an UNLOAD command, copying data from a Redshift table to S3 as a CSV file, optionally with headers.

Parameters
  • schema (str) – reference to a specific schema in the Redshift database

  • table (str) – reference to a specific table in the Redshift database

  • s3_bucket (str) – reference to a specific S3 bucket

  • s3_key (str) – reference to a specific S3 key

  • redshift_conn_id (str) – reference to a specific Redshift connection

  • aws_conn_id (str) – reference to a specific AWS connection, used for S3 access

  • verify (bool or str) –

    Whether or not to verify SSL certificates for S3 connection. By default SSL certificates are verified. You can provide the following values:

    • False: do not validate SSL certificates. SSL will still be used

      (unless use_ssl is False), but SSL certificates will not be verified.

    • path/to/cert/bundle.pem: A filename of the CA cert bundle to use.

      You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.

  • unload_options (list) – list of UNLOAD options to append to the UNLOAD statement (e.g. CSV, ALLOWOVERWRITE)
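Conceptually, the operator assembles a Redshift UNLOAD statement from the parameters above and runs it against the connection named by redshift_conn_id. The following is a hedged, self-contained sketch of that assembly, not the operator's actual source; build_unload_query and credentials_block are illustrative names, and the real operator derives the credentials clause from aws_conn_id:

```python
def build_unload_query(schema, table, s3_bucket, s3_key,
                       credentials_block, unload_options):
    """Illustrative sketch of the UNLOAD statement the operator issues.

    credentials_block stands in for the access-key clause that the real
    operator derives from aws_conn_id.
    """
    options = '\n\t\t\t'.join(unload_options)
    return (
        "UNLOAD ('SELECT * FROM {schema}.{table}')\n"
        "TO 's3://{s3_bucket}/{s3_key}/{table}_'\n"
        "WITH CREDENTIALS '{credentials}'\n"
        "{options};"
    ).format(schema=schema, table=table, s3_bucket=s3_bucket,
             s3_key=s3_key, credentials=credentials_block,
             options=options)


query = build_unload_query(
    schema='public',
    table='users',
    s3_bucket='my-bucket',
    s3_key='exports',
    credentials_block='aws_access_key_id=KEY;aws_secret_access_key=SECRET',
    unload_options=['CSV', 'ALLOWOVERWRITE'],
)
```

With the sample arguments above, the query unloads public.users to files prefixed s3://my-bucket/exports/users_, in CSV format, overwriting existing objects.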

template_fields = [][source]
template_ext = [][source]
ui_color = #ededed[source]
execute(self, context)[source]
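The include_header flag asks for a header row in the unloaded CSV. One hedged way to picture how it could interact with unload_options (a simplified sketch only; resolve_unload_options is a hypothetical helper, and the real operator may produce headers by a different mechanism):

```python
def resolve_unload_options(unload_options=None, include_header=False):
    """Hypothetical helper: normalize the UNLOAD option list.

    When include_header is True, ensure a HEADER option is present so
    the unloaded CSV files begin with a column-name row (sketch only).
    """
    options = list(unload_options or [])
    if include_header and 'HEADER' not in options:
        options.append('HEADER')
    return options
```

For example, resolve_unload_options(['CSV'], include_header=True) yields ['CSV', 'HEADER'], while include_header=False leaves the option list unchanged.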