Airbyte (dagster-airbyte)

This library provides a Dagster integration with Airbyte.

For more information on getting started, see the Airbyte integration guide.

Ops

dagster_airbyte.airbyte_sync_op = <dagster._core.definitions.op_definition.OpDefinition object>[source]

Config Schema:
connection_id (String):

The Airbyte Connection ID that this op will sync. You can retrieve this value from the “Connections” tab of a given connector in the Airbyte UI.

poll_interval (Float, optional):

The time (in seconds) that will be waited between successive polls.

Default Value: 10

poll_timeout (Union[Float, None], optional):

The maximum time (in seconds) that will be waited before this operation is timed out. By default, this will never time out.

Default Value: None

yield_materializations (Bool, optional):

If True, materializations corresponding to the results of the Airbyte sync will be yielded when the op executes.

Default Value: True

asset_key_prefix (List[String], optional):

If provided and yield_materializations is True, these components will be used to prefix the generated asset keys.

Default Value: ['airbyte']

Executes an Airbyte job sync for a given connection_id and polls until that sync completes, raising an error if it is unsuccessful. It outputs an AirbyteOutput, which contains the job details for a given connection_id.

It requires the use of the airbyte_resource, which allows it to communicate with the Airbyte API.

Examples:

from dagster import job
from dagster_airbyte import airbyte_resource, airbyte_sync_op

my_airbyte_resource = airbyte_resource.configured(
    {
        "host": {"env": "AIRBYTE_HOST"},
        "port": {"env": "AIRBYTE_PORT"},
    }
)

sync_foobar = airbyte_sync_op.configured({"connection_id": "foobar"}, name="sync_foobar")

@job(resource_defs={"airbyte": my_airbyte_resource})
def my_simple_airbyte_job():
    sync_foobar()

@job(resource_defs={"airbyte": my_airbyte_resource})
def my_composed_airbyte_job():
    final_foobar_state = sync_foobar(start_after=some_op())
    other_op(final_foobar_state)
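
The remaining fields from the config schema above can be set in the same configured call. A minimal sketch, with arbitrary values for the optional fields:

sync_foobar = airbyte_sync_op.configured(
    {
        "connection_id": "foobar",
        "poll_interval": 30.0,
        "poll_timeout": 3600.0,
        "yield_materializations": True,
        "asset_key_prefix": ["airbyte", "foobar"],
    },
    name="sync_foobar",
)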

Resources

dagster_airbyte.airbyte_resource ResourceDefinition[source]

Config Schema:
host (dagster.StringSource):

The Airbyte Server Address.

port (dagster.StringSource):

Port for the Airbyte Server.

use_https (Bool, optional):

Whether to use HTTPS to connect to the Airbyte server.

Default Value: False

request_max_retries (Int, optional):

The maximum number of times requests to the Airbyte API should be retried before failing.

Default Value: 3

request_retry_delay (Float, optional):

Time (in seconds) to wait between each request retry.

Default Value: 0.25

request_timeout (Int, optional):

Time (in seconds) after which the requests to Airbyte are declared timed out.

Default Value: 15

request_additional_params (permissive dict, optional):

Any additional kwargs to pass to the requests library when making requests to Airbyte.

Default Value: {}

forward_logs (Bool, optional):

Whether to forward Airbyte logs to the compute log. This can be expensive for long-running syncs.

Default Value: True

This resource allows users to programmatically interface with the Airbyte REST API to launch syncs and monitor their progress. This currently implements only a subset of the functionality exposed by the API.

For a complete set of documentation on the Airbyte REST API, including expected response JSON schema, see the Airbyte API Docs.

To configure this resource, we recommend using the configured method.

Examples:

from dagster import job
from dagster_airbyte import airbyte_resource

my_airbyte_resource = airbyte_resource.configured(
    {
        "host": {"env": "AIRBYTE_HOST"},
        "port": {"env": "AIRBYTE_PORT"},
    }
)

@job(resource_defs={"airbyte":my_airbyte_resource})
def my_airbyte_job():
    ...
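
The optional request and logging settings documented above can be passed alongside the connection details; the values below are purely illustrative:

my_airbyte_resource = airbyte_resource.configured(
    {
        "host": {"env": "AIRBYTE_HOST"},
        "port": {"env": "AIRBYTE_PORT"},
        "use_https": False,
        "request_max_retries": 5,
        "request_retry_delay": 1.0,
        "request_timeout": 30,
        "forward_logs": False,
    }
)
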
class dagster_airbyte.AirbyteResource(host, port, use_https, request_max_retries=3, request_retry_delay=0.25, request_timeout=15, request_additional_params=None, log=<Logger dagster.builtin (DEBUG)>, forward_logs=True)[source]

This class exposes methods on top of the Airbyte REST API.
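
For example, the resource can be used directly from inside an op. The sketch below reuses my_airbyte_resource from the example above and assumes the resource instance exposes a sync_and_poll method that starts a sync for a connection and blocks until it finishes:

from dagster import job, op

@op(required_resource_keys={"airbyte"})
def manual_airbyte_sync(context):
    # Assumption: sync_and_poll launches the sync for the given connection
    # and polls the Airbyte API until the job completes, returning its details.
    return context.resources.airbyte.sync_and_poll(connection_id="foobar")

@job(resource_defs={"airbyte": my_airbyte_resource})
def my_manual_airbyte_job():
    manual_airbyte_sync()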

Assets

dagster_airbyte.load_assets_from_airbyte_instance(airbyte, workspace_id=None, key_prefix=None, create_assets_for_normalization_tables=True, connection_to_group_fn=<function _clean_name>, connection_filter=None)[source]

Loads Airbyte connection assets from a configured AirbyteResource instance. This fetches information about defined connections at initialization time, and will error on workspace load if the Airbyte instance is not reachable.

Parameters:
  • airbyte (ResourceDefinition) – An AirbyteResource configured with the appropriate connection details.

  • workspace_id (Optional[str]) – The ID of the Airbyte workspace to load connections from. Only required if multiple workspaces exist in your instance.

  • key_prefix (Optional[CoercibleToAssetKeyPrefix]) – A prefix for the asset keys created.

  • create_assets_for_normalization_tables (bool) – If True, assets will be created for tables created by Airbyte’s normalization feature. If False, only the destination tables will be created. Defaults to True.

  • connection_to_group_fn (Optional[Callable[[str], Optional[str]]]) – Function which returns an asset group name for a given Airbyte connection name. If None, no groups will be created. Defaults to a basic sanitization function.

  • connection_filter (Optional[Callable[[AirbyteConnectionMetadata], bool]]) – Optional function which takes in connection metadata and returns False if the connection should be excluded from the output assets.

Examples:

Loading all Airbyte connections as assets:

from dagster_airbyte import airbyte_resource, load_assets_from_airbyte_instance

airbyte_instance = airbyte_resource.configured(
    {
        "host": "localhost",
        "port": "8000",
    }
)
airbyte_assets = load_assets_from_airbyte_instance(airbyte_instance)

Filtering the set of loaded connections:

from dagster_airbyte import airbyte_resource, load_assets_from_airbyte_instance

airbyte_instance = airbyte_resource.configured(
    {
        "host": "localhost",
        "port": "8000",
    }
)
airbyte_assets = load_assets_from_airbyte_instance(
    airbyte_instance,
    connection_filter=lambda meta: "snowflake" in meta.name,
)
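
Prefixing asset keys and grouping assets by connection, using the key_prefix and connection_to_group_fn arguments described above (the prefix and grouping rule here are illustrative):

from dagster_airbyte import airbyte_resource, load_assets_from_airbyte_instance

airbyte_instance = airbyte_resource.configured(
    {
        "host": "localhost",
        "port": "8000",
    }
)
airbyte_assets = load_assets_from_airbyte_instance(
    airbyte_instance,
    key_prefix=["raw", "airbyte"],
    connection_to_group_fn=lambda name: name.replace("-", "_").lower(),
)
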
dagster_airbyte.load_assets_from_airbyte_project(project_dir, workspace_id=None, key_prefix=None, create_assets_for_normalization_tables=True, connection_to_group_fn=<function _clean_name>, connection_filter=None)[source]

Loads an Airbyte project into a set of Dagster assets.

Point to the root folder of an Airbyte project synced using the Octavia CLI. For more information, see https://github.com/airbytehq/airbyte/tree/master/octavia-cli#octavia-import-all.

Parameters:
  • project_dir (str) – The path to the root of your Airbyte project, containing sources, destinations, and connections folders.

  • workspace_id (Optional[str]) – The ID of the Airbyte workspace to load connections from. Only required if multiple workspace state YAML files exist in the project.

  • key_prefix (Optional[CoercibleToAssetKeyPrefix]) – A prefix for the asset keys created.

  • create_assets_for_normalization_tables (bool) – If True, assets will be created for tables created by Airbyte’s normalization feature. If False, only the destination tables will be created. Defaults to True.

  • connection_to_group_fn (Optional[Callable[[str], Optional[str]]]) – Function which returns an asset group name for a given Airbyte connection name. If None, no groups will be created. Defaults to a basic sanitization function.

  • connection_filter (Optional[Callable[[AirbyteConnectionMetadata], bool]]) – Optional function which takes in connection metadata and returns False if the connection should be excluded from the output assets.

Examples:

Loading all Airbyte connections as assets:

from dagster_airbyte import load_assets_from_airbyte_project

airbyte_assets = load_assets_from_airbyte_project(
    project_dir="path/to/airbyte/project",
)

Filtering the set of loaded connections:

from dagster_airbyte import load_assets_from_airbyte_project

airbyte_assets = load_assets_from_airbyte_project(
    project_dir="path/to/airbyte/project",
    connection_filter=lambda meta: "snowflake" in meta.name,
)
dagster_airbyte.build_airbyte_assets(connection_id, destination_tables, asset_key_prefix=None, normalization_tables=None, upstream_assets=None)[source]

Builds a set of assets representing the tables created by an Airbyte sync operation.

Parameters:
  • connection_id (str) – The Airbyte Connection ID that this op will sync. You can retrieve this value from the “Connections” tab of a given connector in the Airbyte UI.

  • destination_tables (List[str]) – The names of the tables that you want to be represented in the Dagster asset graph for this sync. This will generally map to the name of the stream in Airbyte, unless a stream prefix has been specified in Airbyte.

  • normalization_tables (Optional[Mapping[str, List[str]]]) – If you are using Airbyte’s normalization feature, you may specify a mapping of destination table to a list of derived tables that will be created by the normalization process.

  • asset_key_prefix (Optional[List[str]]) – A prefix for the asset keys inside this asset. If left blank, assets will have a key of AssetKey([table_name]).

  • upstream_assets (Optional[Set[AssetKey]]) – A set of asset keys to add as sources (upstream dependencies) of these assets.
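
Examples:

A sketch of building assets for a single connection; the connection ID, table names, and prefix are placeholders:

from dagster_airbyte import build_airbyte_assets

airbyte_assets = build_airbyte_assets(
    connection_id="foobar",
    destination_tables=["releases", "tags", "teams"],
    normalization_tables={"releases": ["releases_author"]},
    asset_key_prefix=["airbyte"],
)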