apache-airflow-providers-snowflake
This is a provider package for the snowflake provider. All classes for this provider package are in the airflow.providers.snowflake Python package.
You can install this package on top of an existing Airflow 2 installation (see Requirements below for the minimum Airflow version supported) via
pip install apache-airflow-providers-snowflake
| PIP package | Version required |
|---|---|
| apache-airflow | >=2.3.0 |
| apache-airflow-providers-common-sql | >=1.3.1 |
| snowflake-connector-python | >=2.4.1 |
| snowflake-sqlalchemy | >=1.1.0 |
Those are dependencies that might be needed in order to use all the features of the package. You need to install the specified provider packages in order to use them.
You can install such cross-provider dependencies when installing from PyPI. For example:
pip install apache-airflow-providers-snowflake[common.sql]
| Dependent package | Extra |
|---|---|
| apache-airflow-providers-common-sql | common.sql |
| apache-airflow-providers-slack | slack |
You can download officially released packages and verify their checksums and signatures from the Official Apache Download site:
The apache-airflow-providers-snowflake 4.0.3 sdist package (asc, sha512)
The apache-airflow-providers-snowflake 4.0.3 wheel package (asc, sha512)
provide missing connection to the parent class operator (#29211)
Snowflake Provider - hide host from UI (#29208)
This release of the provider is only available for Airflow 2.3+ as explained in the Apache Airflow providers support policy.
The SnowflakeHook now conforms to the same semantics as all the other DBApiHook implementations and returns the same kind of response from its run method. Previously (in pre-4.* versions of the provider), the hook returned a dictionary of { "column": "value" ... }, which was not compatible with other DBApiHooks, which return just a sequence of sequences. After this change (and with the dependency on common.sql >= 1.3.1), the SnowflakeHook returns Python DB-API-compatible results by default.
The description (i.e., among other things, the names and types of the columns returned) can be retrieved via the descriptions and last_description fields of the hook after the run method completes. That makes the SnowflakeHook suitable for the generic SQL operator and detailed lineage analysis.
If you had custom hooks, or used the hook in your TaskFlow code or custom operators that relied on this behaviour, you need to adapt your DAGs, or you can switch the SnowflakeHook back to returning dictionaries by passing return_dictionaries=True to the run method of the hook.
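A minimal sketch of both behaviours, assuming a configured snowflake_default connection; the MY_TABLE table and the query are illustrative, not part of the provider:

```python
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

# Assumes a configured "snowflake_default" connection; MY_TABLE is hypothetical.
hook = SnowflakeHook(snowflake_conn_id="snowflake_default")

# Default behaviour since 4.0.2: the handler sees a standard cursor, so rows
# come back as plain sequences, like other DBApiHook implementations.
rows = hook.run("SELECT ID, NAME FROM MY_TABLE", handler=lambda cur: cur.fetchall())

# Column metadata is available after run() completes.
print(hook.last_description)  # description of the last executed statement
print(hook.descriptions)      # one description per executed statement

# Opt back into the pre-4.* behaviour of {"column": "value"} rows.
dict_rows = hook.run(
    "SELECT ID, NAME FROM MY_TABLE",
    handler=lambda cur: cur.fetchall(),
    return_dictionaries=True,
)
```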
The SnowflakeOperator is also more standard now: it derives from the common SQLExecuteQueryOperator and uses a more consistent approach to processing output when SQL queries are run. However, in this case the result returned by the execute method is unchanged (it still returns dictionaries rather than sequences), and those dictionaries are pushed to XCom, so your DAGs relying on this behaviour should continue working without any change.
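For illustration, a minimal DAG sketch using the operator; the dag_id, task_id and query below are assumptions, not taken from the provider documentation:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="snowflake_example",  # illustrative name
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # The operator now derives from SQLExecuteQueryOperator, but execute()
    # still pushes dictionary-shaped rows to XCom, as before.
    select_rows = SnowflakeOperator(
        task_id="select_rows",
        snowflake_conn_id="snowflake_default",
        sql="SELECT ID, NAME FROM MY_TABLE",  # MY_TABLE is hypothetical
        do_xcom_push=True,
    )
```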
In SnowflakeHook, if both extra__snowflake__foo and foo existed in the connection's extra dict, the prefixed version used to take precedence; now the non-prefixed version is preferred.
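A minimal sketch of the precedence change, using the warehouse extra as an illustrative example:

```python
import json

from airflow.models.connection import Connection

# Illustrative values; only the key precedence is the point here.
conn = Connection(
    conn_id="snowflake_default",
    conn_type="snowflake",
    extra=json.dumps(
        {
            "extra__snowflake__warehouse": "OLD_WH",  # prefixed key: used to win
            "warehouse": "NEW_WH",  # un-prefixed key: now preferred
        }
    ),
)
# With provider 4.0+, the hook resolves the warehouse to "NEW_WH".
```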
The 4.0.0 and 4.0.1 versions were broken and have been yanked, so 4.0.2 is the first release from the 4.* line that should be used.
Fix wrapping of run() method result of exasol and snowflake DB hooks (#27997)
Make Snowflake Hook conform to semantics of DBApi (#28006)
Warning
This version is yanked, as it contained problems when interacting with the common.sql provider. Please install a version released afterwards.
Fix errors in Databricks SQL operator introduced when refactoring (#27854)
Bump common.sql provider to 1.3.1 (#27888)
Fixing the behaviours of SQL Hooks and Operators finally (#27912)
Warning
This version is yanked, as it contained problems when interacting with the common.sql provider. Please install a version released afterwards.
Update snowflake hook to not use extra prefix (#26764)
Move min airflow version to 2.3.0 for all providers (#27196)
Add SQLExecuteQueryOperator (#25717)
Use unused SQLCheckOperator.parameters in SQLCheckOperator.execute. (#27599)
Add custom handler param in SnowflakeOperator (#25983)
Fix wrong deprecation warning for 'S3ToSnowflakeOperator' (#26047)
Move all "old" SQL operators to common.sql providers (#25350)
Unify DbApiHook.run() method with the methods which override it (#23971)
Adding generic 'SqlToSlackOperator' (#24663)
Move all SQL classes to common-sql provider (#24836)
Pattern parameter in S3ToSnowflakeOperator (#24571)
S3ToSnowflakeOperator: escape single quote in s3_keys (#24607)
This release of the provider is only available for Airflow 2.2+ as explained in the Apache Airflow providers support policy (https://github.com/apache/airflow/blob/main/README.md#support-for-providers).
Fix error when SnowflakeHook take empty list in 'sql' param (#23767)
Add support for private key in connection for Snowflake (#22266)
Fix mistakenly added install_requires for all providers (#22382)
Add more SQL template fields renderers (#21237)
Fix #21096: Support boolean in extra__snowflake__insecure_mode (#21155)
Support insecure mode in SnowflakeHook (#20106)
Remove unused code in SnowflakeHook (#20107)
Improvements for 'SnowflakeHook.get_sqlalchemy_engine' (#20509)
Exclude snowflake-sqlalchemy v1.2.5 (#20245)
Limit Snowflake connector to <2.7.2 (#20395)
Add test_connection method for Snowflake Hook (#19041)
Add region to Snowflake URI. (#18650)
Auto-apply apply_default decorator (#15667)
Warning
Due to apply_default decorator removal, this version of the provider requires Airflow 2.1.0+.
If your Airflow version is < 2.1.0, and you want to install this provider version, first upgrade
Airflow to at least version 2.1.0. Otherwise your Airflow package version will be upgraded
automatically and you will have to manually run airflow upgrade db
to complete the migration.
Add 'template_fields' to 'S3ToSnowflake' operator (#15926)
Allow S3ToSnowflakeOperator to omit schema (#15817)
Added ability for Snowflake to attribute usage to Airflow by adding an application parameter (#16420)
fix: restore parameters support when sql passed to SnowflakeHook as str (#16102)
Corrections in docs and tools after releasing provider RCs (#14082)
Prepare to release the next wave of providers: (#14487)
Updated documentation and readme files.
Fix S3ToSnowflakeOperator to support uploading all files in the specified stage (#12505)
Add connection arguments in S3ToSnowflakeOperator (#12564)
Initial version of the provider.