CopyFromExternalStageToSnowflakeOperator

Use the CopyFromExternalStageToSnowflakeOperator to load data stored in AWS S3, Google Cloud Storage, or Azure Blob Storage to a Snowflake table.

Note

This operator is a thin wrapper around the COPY INTO <table> query and requires that the stage be created first.

Using the Operator

Similarly to the SnowflakeOperator, use the snowflake_conn_id and the other relevant parameters to establish a connection with your Snowflake instance. This operator loads one or more named files from a specific Snowflake stage (a predefined S3 path). To do so, pass the relevant file names to the files parameter and the relevant Snowflake stage to the stage parameter. pattern can be used to specify match patterns for file names and/or paths (see docs). file_format can be used either to reference an existing Snowflake file format or to pass a custom string that defines a file format (see docs).

An example usage of the CopyFromExternalStageToSnowflakeOperator is as follows:

tests/system/snowflake/example_copy_into_snowflake.py[source]

copy_into_table = CopyFromExternalStageToSnowflakeOperator(
    task_id="copy_into_table",
    snowflake_conn_id=SNOWFLAKE_CONN_ID,
    files=[S3_FILE_PATH],
    table=SNOWFLAKE_SAMPLE_TABLE,
    stage=SNOWFLAKE_STAGE,
    file_format="(type = 'CSV',field_delimiter = ';')",
    pattern=".*[.]csv",
)
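Since the operator is a wrapper around COPY INTO, each of these parameters maps to a clause of that statement. The following is a minimal sketch in plain Python of how such a mapping could look; build_copy_into_sql is a hypothetical helper for illustration, not the operator's actual internal API:

```python
def build_copy_into_sql(table, stage, files=None, file_format=None, pattern=None):
    """Illustrative sketch: map operator parameters to a COPY INTO statement.

    The real operator assembles this SQL internally; this hypothetical helper
    only shows which parameter feeds which clause.
    """
    sql = f"COPY INTO {table}\nFROM @{stage}"
    if files:
        # files -> FILES = ('name1', 'name2', ...)
        quoted = ", ".join(f"'{f}'" for f in files)
        sql += f"\nFILES = ({quoted})"
    if file_format:
        # file_format -> FILE_FORMAT = (...) or a named format reference
        sql += f"\nFILE_FORMAT = {file_format}"
    if pattern:
        # pattern -> PATTERN = '<regex>'
        sql += f"\nPATTERN = '{pattern}'"
    return sql


# With the parameters from the example above, this yields roughly:
# COPY INTO <table>
# FROM @<stage>
# FILES = ('<file>')
# FILE_FORMAT = (type = 'CSV',field_delimiter = ';')
# PATTERN = '.*[.]csv'
```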
