Run Airbyte and Fivetran from your dbt project using fal

You can now use fal to easily specify and trigger sync jobs on pipelines from both Fivetran and Airbyte.

Airbyte and Fivetran are popular solutions for loading data from external sources into your data warehouse. If you are a dbt user, you typically have to schedule two separate “cron jobs”: one to extract and load, and the other to run your dbt transformations.

Today, we are introducing our Airbyte and Fivetran integrations for fal, which lets you specify and trigger sync jobs right from your dbt project. Used in combination with fal flow, you can now represent extract and load operations as nodes in your dbt DAG.

With this update, you no longer have to reach for a complex orchestrator like Airflow to run multiple commands in sequence; instead, you can represent the whole workflow in a single fal flow command.

In order for fal to be aware of EL connections, we first add them to the profiles.yml file:

my_profile:
  target: dev
  fal_extract_load:
    dev:
      airbyte_config:
        type: airbyte
        host: http://localhost:8001
        connections:
          - name: pokemon
            id: my_connection_id
      fivetran_config:
        type: fivetran
        api_key: my_api_key
        api_secret: my_api_secret
        connectors:
          - name: fivetran_log
            id: my_connector_id
  outputs:
    dev:
      type: postgres

This is the same profiles.yml file that dbt uses. As you can see, we define a new node, fal_extract_load, and enter the EL configurations there.

In schema.yml, we specify a Python script that should run before a model:

models:
  - name: pokemon_height
    description: Pokemon heights and names
    config:
      materialized: table
    meta:
      fal:
        scripts:
          before:
            - fal_scripts/

Now, inside a script under fal_scripts/, you have access to a new magic variable, el, which provides methods for running EL jobs:

# Runs an Airbyte sync job on connection "pokemon"
el.airbyte_sync(config_name="airbyte_config", connection_name="pokemon")

# Runs a Fivetran sync job on connector "fivetran_log"
el.fivetran_sync(config_name="fivetran_config", connector_name="fivetran_log")

There's no need to import el; it's available automatically in the fal runtime.

You can now run the entire ELT pipeline end-to-end by using the new fal flow run command.
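For example, with a project configured as above, a single invocation replaces the two separate cron jobs; fal resolves the pre-hook scripts and dbt models into one DAG and runs them in order:

```shell
# Triggers the EL sync scripts, then runs the dbt models that depend on them
fal flow run
```

Because the EL jobs are nodes in the same DAG, a model's sync script always finishes before that model is built.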

For more information on running EL jobs from fal, see our docs. Feedback and comments are always welcome on our Discord server. You can also check out our GitHub repository to give us a star or open an issue.