airflow.providers.amazon.aws.sensors.bedrock

Classes

BedrockBaseSensor

General sensor behavior for Amazon Bedrock.

BedrockCustomizeModelCompletedSensor

Poll the state of the model customization job until it reaches a terminal state; fails if the job fails.

BedrockProvisionModelThroughputCompletedSensor

Poll the provisioned model throughput job until it reaches a terminal state; fails if the job fails.

BedrockKnowledgeBaseActiveSensor

Poll the Knowledge Base status until it reaches a terminal state; fails if creation fails.

BedrockIngestionJobSensor

Poll the ingestion job status until it reaches a terminal state; fails if creation fails.

BedrockBatchInferenceSensor

Poll the batch inference job status until it reaches a terminal state; fails if creation fails.

Module Contents

class airflow.providers.amazon.aws.sensors.bedrock.BedrockBaseSensor(deferrable=conf.getboolean('operators', 'default_deferrable', fallback=False), **kwargs)[source]

Bases: airflow.providers.amazon.aws.sensors.base_aws.AwsBaseSensor[_GenericBedrockHook]

General sensor behavior for Amazon Bedrock.

Subclasses must implement the following methods:
  • get_state()

Subclasses must set the following fields:
  • INTERMEDIATE_STATES

  • FAILURE_STATES

  • SUCCESS_STATES

  • FAILURE_MESSAGE

Parameters:

deferrable (bool) – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in config file by setting default_deferrable to True)
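
A minimal subclassing sketch (hypothetical: the class name, job_name parameter, state values, and the boto3 call in get_state() are all illustrative; only the four class fields and get_state() are required by the base class):

    from __future__ import annotations

    from airflow.providers.amazon.aws.hooks.bedrock import BedrockHook
    from airflow.providers.amazon.aws.sensors.bedrock import BedrockBaseSensor


    class MyBedrockJobSensor(BedrockBaseSensor[BedrockHook]):
        """Hypothetical sensor showing the fields the base class requires."""

        INTERMEDIATE_STATES: tuple[str, ...] = ("InProgress",)
        FAILURE_STATES: tuple[str, ...] = ("Failed", "Stopped")
        SUCCESS_STATES: tuple[str, ...] = ("Completed",)
        FAILURE_MESSAGE = "My Bedrock job failed."

        aws_hook_class = BedrockHook  # hook class used to reach the Bedrock API

        def __init__(self, *, job_name: str, **kwargs):
            super().__init__(**kwargs)
            self.job_name = job_name

        def get_state(self) -> str:
            # Illustrative call; substitute the boto3 get/describe call for
            # the resource your sensor watches.
            return self.hook.conn.get_model_customization_job(
                jobIdentifier=self.job_name
            )["status"]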

INTERMEDIATE_STATES: tuple[str, ...] = ()[source]
FAILURE_STATES: tuple[str, ...] = ()[source]
SUCCESS_STATES: tuple[str, ...] = ()[source]
FAILURE_MESSAGE = ''[source]
aws_hook_class: type[_GenericBedrockHook][source]
ui_color = '#66c3ff'[source]
deferrable = True[source]
poke(context, **kwargs)[source]

Override when deriving this class.

abstract get_state()[source]

Implement in subclasses.

class airflow.providers.amazon.aws.sensors.bedrock.BedrockCustomizeModelCompletedSensor(*, job_name, max_retries=75, poke_interval=120, **kwargs)[source]

Bases: BedrockBaseSensor[airflow.providers.amazon.aws.hooks.bedrock.BedrockHook]

Poll the state of the model customization job until it reaches a terminal state; fails if the job fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock customize model job

Parameters:
  • job_name (str) – The name of the Bedrock model customization job.

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 120)

  • max_retries (int) – Maximum number of polling attempts before returning the current state. (default: 75)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html
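
A minimal usage sketch (the task_id and job_name values are illustrative; the customization job must already exist, e.g. created by an upstream operator):

    from airflow.providers.amazon.aws.sensors.bedrock import (
        BedrockCustomizeModelCompletedSensor,
    )

    wait_for_customization = BedrockCustomizeModelCompletedSensor(
        task_id="wait_for_customization",       # illustrative task id
        job_name="my-model-customization-job",  # name of an existing job
        poke_interval=120,                      # seconds between polls
        max_retries=75,                         # attempts before giving up
    )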

INTERMEDIATE_STATES: tuple[str, ...] = ('InProgress',)[source]
FAILURE_STATES: tuple[str, ...] = ('Failed', 'Stopping', 'Stopped')[source]
SUCCESS_STATES: tuple[str, ...] = ('Completed',)[source]
FAILURE_MESSAGE = 'Bedrock model customization job sensor failed.'[source]
aws_hook_class[source]
template_fields: collections.abc.Sequence[str][source]
poke_interval = 120[source]
max_retries = 75[source]
job_name[source]
execute(context)[source]

Derive when creating an operator.

The main method to execute the task. Context is the same dictionary used when rendering jinja templates.

Refer to get_template_context for more context.

get_state()[source]

Implement in subclasses.

class airflow.providers.amazon.aws.sensors.bedrock.BedrockProvisionModelThroughputCompletedSensor(*, model_id, poke_interval=60, max_retries=20, **kwargs)[source]

Bases: BedrockBaseSensor[airflow.providers.amazon.aws.hooks.bedrock.BedrockHook]

Poll the provisioned model throughput job until it reaches a terminal state; fails if the job fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock provision model throughput job

Parameters:
  • model_id (str) – The ARN or name of the provisioned throughput.

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 60)

  • max_retries (int) – Maximum number of polling attempts before returning the current state. (default: 20)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html
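
A minimal usage sketch (the ARN is illustrative; a provisioned-throughput name also works for model_id):

    from airflow.providers.amazon.aws.sensors.bedrock import (
        BedrockProvisionModelThroughputCompletedSensor,
    )

    wait_for_throughput = BedrockProvisionModelThroughputCompletedSensor(
        task_id="wait_for_throughput",
        model_id="arn:aws:bedrock:us-east-1:123456789012:provisioned-model/abc123",
        deferrable=True,  # requires aiobotocore to be installed
    )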

INTERMEDIATE_STATES: tuple[str, ...] = ('Creating', 'Updating')[source]
FAILURE_STATES: tuple[str, ...] = ('Failed',)[source]
SUCCESS_STATES: tuple[str, ...] = ('InService',)[source]
FAILURE_MESSAGE = 'Bedrock provision model throughput sensor failed.'[source]
aws_hook_class[source]
template_fields: collections.abc.Sequence[str][source]
poke_interval = 60[source]
max_retries = 20[source]
model_id[source]
get_state()[source]

Implement in subclasses.

execute(context)[source]

Derive when creating an operator.

The main method to execute the task. Context is the same dictionary used when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.amazon.aws.sensors.bedrock.BedrockKnowledgeBaseActiveSensor(*, knowledge_base_id, poke_interval=5, max_retries=24, **kwargs)[source]

Bases: BedrockBaseSensor[airflow.providers.amazon.aws.hooks.bedrock.BedrockAgentHook]

Poll the Knowledge Base status until it reaches a terminal state; fails if creation fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock Knowledge Base

Parameters:
  • knowledge_base_id (str) – The unique identifier of the knowledge base for which to get information. (templated)

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 5)

  • max_retries (int) – Maximum number of polling attempts before returning the current state. (default: 24)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html
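
A minimal usage sketch (knowledge_base_id is templated; the upstream "create_kb" task pulled from XCom here is hypothetical):

    from airflow.providers.amazon.aws.sensors.bedrock import (
        BedrockKnowledgeBaseActiveSensor,
    )

    wait_for_kb = BedrockKnowledgeBaseActiveSensor(
        task_id="wait_for_kb",
        # Jinja template resolving the ID from a hypothetical upstream task.
        knowledge_base_id="{{ task_instance.xcom_pull(task_ids='create_kb') }}",
        poke_interval=5,
        max_retries=24,
    )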

INTERMEDIATE_STATES: tuple[str, ...] = ('CREATING', 'UPDATING')[source]
FAILURE_STATES: tuple[str, ...] = ('DELETING', 'FAILED')[source]
SUCCESS_STATES: tuple[str, ...] = ('ACTIVE',)[source]
FAILURE_MESSAGE = 'Bedrock Knowledge Base Active sensor failed.'[source]
aws_hook_class[source]
template_fields: collections.abc.Sequence[str][source]
poke_interval = 5[source]
max_retries = 24[source]
knowledge_base_id[source]
get_state()[source]

Implement in subclasses.

execute(context)[source]

Derive when creating an operator.

The main method to execute the task. Context is the same dictionary used when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.amazon.aws.sensors.bedrock.BedrockIngestionJobSensor(*, knowledge_base_id, data_source_id, ingestion_job_id, poke_interval=60, max_retries=10, **kwargs)[source]

Bases: BedrockBaseSensor[airflow.providers.amazon.aws.hooks.bedrock.BedrockAgentHook]

Poll the ingestion job status until it reaches a terminal state; fails if creation fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock ingestion job to finish

Parameters:
  • knowledge_base_id (str) – The unique identifier of the knowledge base for which to get information. (templated)

  • data_source_id (str) – The unique identifier of the data source in the ingestion job. (templated)

  • ingestion_job_id (str) – The unique identifier of the ingestion job. (templated)

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 60)

  • max_retries (int) – Maximum number of polling attempts before returning the current state. (default: 10)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html
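
A minimal usage sketch (all three identifiers are illustrative and templated):

    from airflow.providers.amazon.aws.sensors.bedrock import BedrockIngestionJobSensor

    wait_for_ingestion = BedrockIngestionJobSensor(
        task_id="wait_for_ingestion",
        knowledge_base_id="ABCDEFGHIJ",  # illustrative identifiers
        data_source_id="KLMNOPQRST",
        ingestion_job_id="UVWXYZ12",
    )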

INTERMEDIATE_STATES: tuple[str, ...] = ('STARTING', 'IN_PROGRESS')[source]
FAILURE_STATES: tuple[str, ...] = ('FAILED',)[source]
SUCCESS_STATES: tuple[str, ...] = ('COMPLETE',)[source]
FAILURE_MESSAGE = 'Bedrock ingestion job sensor failed.'[source]
aws_hook_class[source]
template_fields: collections.abc.Sequence[str][source]
poke_interval = 60[source]
max_retries = 10[source]
knowledge_base_id[source]
data_source_id[source]
ingestion_job_id[source]
get_state()[source]

Implement in subclasses.

execute(context)[source]

Derive when creating an operator.

The main method to execute the task. Context is the same dictionary used when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.amazon.aws.sensors.bedrock.BedrockBatchInferenceSensor(*, job_arn, success_state=SuccessState.SCHEDULED, poke_interval=120, max_retries=75, **kwargs)[source]

Bases: BedrockBaseSensor[airflow.providers.amazon.aws.hooks.bedrock.BedrockHook]

Poll the batch inference job status until it reaches a terminal state; fails if creation fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock batch inference job

Parameters:
  • job_arn (str) – The Amazon Resource Name (ARN) of the batch inference job. (templated)

  • success_state (SuccessState | str) – A BedrockBatchInferenceSensor.SuccessState; defaults to SuccessState.SCHEDULED. (templated)

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 120)

  • max_retries (int) – Maximum number of polling attempts before returning the current state. (default: 75)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html

class SuccessState[source]

Target state for the BedrockBatchInferenceSensor.

Bedrock adds batch inference jobs to a queue, and they may take some time to complete. If you want to wait for the job to complete, use SuccessState.COMPLETED, but if you only want to wait until the service confirms that the job is in the queue, use SuccessState.SCHEDULED.

The normal successful progression of states is:

Submitted > Validating > Scheduled > InProgress > PartiallyCompleted > Completed

SCHEDULED = 'scheduled'[source]
COMPLETED = 'completed'[source]
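
A sketch contrasting the two success states (the job ARN is illustrative):

    from airflow.providers.amazon.aws.sensors.bedrock import BedrockBatchInferenceSensor

    JOB_ARN = "arn:aws:bedrock:us-east-1:123456789012:model-invocation-job/abc123"

    # Succeed as soon as the service queues the job (the default).
    wait_until_scheduled = BedrockBatchInferenceSensor(
        task_id="wait_until_scheduled",
        job_arn=JOB_ARN,
        success_state=BedrockBatchInferenceSensor.SuccessState.SCHEDULED,
    )

    # Wait for the job to finish instead; batch jobs can sit in the queue for
    # a long time, so deferrable mode frees the worker slot while waiting.
    wait_until_done = BedrockBatchInferenceSensor(
        task_id="wait_until_done",
        job_arn=JOB_ARN,
        success_state=BedrockBatchInferenceSensor.SuccessState.COMPLETED,
        deferrable=True,  # requires aiobotocore
    )
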
INTERMEDIATE_STATES: tuple[str, ...][source]
FAILURE_STATES: tuple[str, ...] = ('Failed', 'Stopped', 'PartiallyCompleted', 'Expired')[source]
SUCCESS_STATES: tuple[str, ...][source]
FAILURE_MESSAGE = 'Bedrock batch inference job sensor failed.'[source]
INVALID_SUCCESS_STATE_MESSAGE = 'success_state must be an instance of TargetState.'[source]
aws_hook_class[source]
template_fields: collections.abc.Sequence[str][source]
poke_interval = 120[source]
max_retries = 75[source]
job_arn[source]
success_state[source]
trigger_class: type[airflow.providers.amazon.aws.triggers.bedrock.BedrockBaseBatchInferenceTrigger][source]
get_state()[source]

Implement in subclasses.

execute(context)[source]

Derive when creating an operator.

The main method to execute the task. Context is the same dictionary used when rendering jinja templates.

Refer to get_template_context for more context.
