airflow.providers.postgres.hooks.postgres

Module Contents

Classes

PostgresHook

Interact with Postgres.

Attributes

CursorType

airflow.providers.postgres.hooks.postgres.CursorType[source]
class airflow.providers.postgres.hooks.postgres.PostgresHook(*args, options=None, **kwargs)[source]

Bases: airflow.providers.common.sql.hooks.sql.DbApiHook

Interact with Postgres.

You can specify SSL parameters in the extra field of your connection as {"sslmode": "require", "sslcert": "/path/to/cert.pem", etc}. You can also choose the cursor type as {"cursor": "dictcursor"}. Refer to psycopg2.extras for more details.

Note: For Redshift, use keepalives_idle in the extra connection parameters and set it to less than 300 seconds.

Note: For AWS IAM authentication, use iam in the extra connection parameters and set it to true. Leave the password field empty. This will use the "aws_default" connection to get the temporary token unless you override it in extras. Example extras: {"iam": true, "aws_conn_id": "my_aws_conn"}. For Redshift, also use redshift in the extra connection parameters and set it to true. The cluster identifier is extracted from the beginning of the host field, so it is optional; it can, however, be overridden in the extra field. Example extras: {"iam": true, "redshift": true, "cluster-identifier": "my_cluster_id"}.
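As an illustration, these extras could be supplied when defining the connection in code; the connection id, host, and AWS connection id below are hypothetical:

```python
import json

from airflow.models.connection import Connection

# Hypothetical Redshift connection combining SSL, cursor, and IAM extras.
conn = Connection(
    conn_id="my_postgres",  # hypothetical connection id
    conn_type="postgres",
    host="my-cluster-id.abc123.us-east-1.redshift.amazonaws.com",
    login="airflow_user",
    schema="analytics",
    port=5439,
    extra=json.dumps(
        {
            "sslmode": "require",
            "cursor": "dictcursor",
            "iam": True,
            "redshift": True,
            "aws_conn_id": "my_aws_conn",
        }
    ),
)
```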

Parameters
  • postgres_conn_id – The connection ID used to reference a specific Postgres database.

  • options (str | None) – Optional. Specifies command-line options to send to the server at connection start. For example, setting this to -c search_path=myschema sets the session’s value of the search_path to myschema.
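A minimal sketch of instantiating the hook with these parameters; the connection id is hypothetical:

```python
from airflow.providers.postgres.hooks.postgres import PostgresHook

# "my_postgres" is a hypothetical connection id defined in Airflow.
hook = PostgresHook(
    postgres_conn_id="my_postgres",
    options="-c search_path=myschema",
)
```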

property schema[source]
conn_name_attr = 'postgres_conn_id'[source]
default_conn_name = 'postgres_default'[source]
conn_type = 'postgres'[source]
hook_name = 'Postgres'[source]
supports_autocommit = True[source]
get_conn()[source]

Establish a connection to a postgres database.
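A minimal usage sketch, assuming a hypothetical connection id of "my_postgres"; get_conn() returns a psycopg2 connection, so the usual cursor protocol applies:

```python
from airflow.providers.postgres.hooks.postgres import PostgresHook

hook = PostgresHook(postgres_conn_id="my_postgres")  # hypothetical connection id
conn = hook.get_conn()
try:
    with conn.cursor() as cur:
        cur.execute("SELECT version()")
        print(cur.fetchone())
finally:
    conn.close()
```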

copy_expert(sql, filename)[source]

Execute SQL using psycopg2’s copy_expert method.

Necessary to execute a COPY command without superuser access.

Note: if this method is called with a "COPY FROM" statement and the specified input file does not exist, it creates an empty file and loads no data, but the operation succeeds. Users who want to know when the input file does not exist must check for its existence themselves.
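For example, a round trip through COPY might look like this sketch; the table name, file path, and connection id are hypothetical:

```python
from airflow.providers.postgres.hooks.postgres import PostgresHook

hook = PostgresHook(postgres_conn_id="my_postgres")  # hypothetical connection id

# Export a table to a local CSV file via COPY ... TO STDOUT.
hook.copy_expert("COPY my_table TO STDOUT WITH CSV HEADER", "/tmp/my_table.csv")

# Load the file back via COPY ... FROM STDIN.
hook.copy_expert("COPY my_table FROM STDIN WITH CSV HEADER", "/tmp/my_table.csv")
```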

get_uri()[source]

Extract the URI from the connection.

Returns

the extracted URI.

Return type

str
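One common use is handing the URI to SQLAlchemy; a sketch assuming the connection id exists and the returned URI uses a scheme SQLAlchemy accepts:

```python
from sqlalchemy import create_engine

from airflow.providers.postgres.hooks.postgres import PostgresHook

hook = PostgresHook(postgres_conn_id="my_postgres")  # hypothetical connection id
engine = create_engine(hook.get_uri())
with engine.connect() as conn:
    print(conn.exec_driver_sql("SELECT 1").scalar())
```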

bulk_load(table, tmp_file)[source]

Load a tab-delimited file into a database table.

bulk_dump(table, tmp_file)[source]

Dump a database table into a tab-delimited file.
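A sketch of a round trip with bulk_dump and bulk_load; the table name, file path, and connection id are hypothetical, and both methods work on tab-delimited files:

```python
from airflow.providers.postgres.hooks.postgres import PostgresHook

hook = PostgresHook(postgres_conn_id="my_postgres")  # hypothetical connection id

# Dump the table to a tab-delimited file ...
hook.bulk_dump("my_table", "/tmp/my_table.tsv")

# ... and load a tab-delimited file back into the table.
hook.bulk_load("my_table", "/tmp/my_table.tsv")
```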

get_iam_token(conn)[source]

Get the IAM token.

This uses AWSHook to retrieve a temporary password to connect to Postgres or Redshift. A port is required; if none is provided, the default 5432 is used.

get_table_primary_key(table, schema='public')[source]

Get the table’s primary key.

Parameters
  • table (str) – Name of the target table

  • schema (str | None) – Name of the target schema, public by default

Returns

Primary key columns list

Return type

list[str] | None
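For example, under the assumption that a hypothetical table "my_table" has a single-column primary key:

```python
from airflow.providers.postgres.hooks.postgres import PostgresHook

hook = PostgresHook(postgres_conn_id="my_postgres")  # hypothetical connection id

# Returns e.g. ["id"], or None if the table has no primary key.
pk_columns = hook.get_table_primary_key("my_table", schema="public")
```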

get_openlineage_database_info(connection)[source]

Return Postgres/Redshift specific information for OpenLineage.

get_openlineage_database_dialect(connection)[source]

Return postgres/redshift dialect.

get_openlineage_default_schema()[source]

Return the current schema. This is usually changed with the search_path parameter.

classmethod get_ui_field_behaviour()[source]
