Run Airbyte and Fivetran from your dbt project using fal
You can now use fal to easily specify and trigger sync jobs on both Fivetran and Airbyte pipelines.

Airbyte and Fivetran are popular solutions for loading data from external sources into your data warehouse. If you are a dbt user, you typically have to schedule two separate “cron jobs”: one to extract and load, and the other to run your dbt transformations.
Today, we are introducing our Airbyte and Fivetran integrations for fal, which let you specify and trigger sync jobs right from your dbt project. In combination with `fal flow`, this lets you represent extract and load operations as nodes in your dbt DAG.
With this update, you no longer have to reach for a complex orchestrator like Airflow to run multiple commands in sequence; instead, you can represent the whole workflow in a single `fal flow` command.
To make fal aware of EL connections, we first add them to the `profiles.yml` file:
```yaml
fal_test:
  target: dev
  fal_extract_load:
    dev:
      airbyte_config:
        type: airbyte
        host: http://localhost:8001
        connections:
          - name: pokemon
            id: my_connection_id
      fivetran_config:
        type: fivetran
        api_key: my_api_key
        api_secret: my_api_secret
        connectors:
          - name: fivetran_log
            id: my_connector_id
  outputs:
    dev:
      type: postgres
      ...
```
This is the same `profiles.yml` that dbt uses. As you can see, we define a new node, `fal_extract_load`, and enter our EL configurations there.
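To make the configuration shape concrete, here is a small illustrative sketch (not part of fal's API) of how a connection or connector name could be resolved to its id once the `fal_extract_load` section has been parsed. The dict literal stands in for the output of parsing the YAML above, and the `connection_id` helper is hypothetical.

```python
# Stand-in for the parsed fal_extract_load section of profiles.yml above.
fal_extract_load = {
    "dev": {
        "airbyte_config": {
            "type": "airbyte",
            "host": "http://localhost:8001",
            "connections": [{"name": "pokemon", "id": "my_connection_id"}],
        },
        "fivetran_config": {
            "type": "fivetran",
            "api_key": "my_api_key",
            "api_secret": "my_api_secret",
            "connectors": [{"name": "fivetran_log", "id": "my_connector_id"}],
        },
    }
}

def connection_id(target: str, config_name: str, name: str) -> str:
    """Hypothetical helper: look up the id registered under a given
    connection (Airbyte) or connector (Fivetran) name."""
    config = fal_extract_load[target][config_name]
    # Airbyte configs list "connections"; Fivetran configs list "connectors".
    entries = config.get("connections", config.get("connectors", []))
    for entry in entries:
        if entry["name"] == name:
            return entry["id"]
    raise KeyError(f"{name!r} not found in {config_name}")
```

Keying syncs by name rather than raw id is what lets the scripts shown below stay readable while the ids live in one place in `profiles.yml`.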
In `schema.yml`, we define a Python script that is set to run before a model:
```yaml
models:
  - name: pokemon_height
    description: Pokemon heights and names
    config:
      materialized: table
    meta:
      fal:
        scripts:
          before:
            - fal_scripts/run_el_jobs.py
```
Now, inside the `fal_scripts/run_el_jobs.py` script, you have access to a new magic variable, `el`, which has methods for running EL jobs:
```python
# Runs an Airbyte sync job on connection "pokemon"
el.airbyte_sync(config_name="airbyte_config", connection_name="pokemon")

# Runs a Fivetran sync job on connector "fivetran_log"
el.fivetran_sync(config_name="fivetran_config", connector_name="fivetran_log")
```
There's no need to import `el`; it is made available automatically in the fal runtime.
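Because these scripts run before the model, the sync calls need to block until the EL job reaches a terminal state, so the downstream dbt model only runs against fresh data. The following is a simplified, self-contained sketch of that poll-until-done pattern; the `wait_for_sync` helper, the status strings, and the fake `get_status` callable are illustrative assumptions, not fal's actual internals.

```python
import time

def wait_for_sync(get_status, poll_interval=0.0, timeout=60.0):
    """Poll a job-status callable until the job reaches a terminal state.

    get_status: callable returning "running", "succeeded", or "failed"
    (a stand-in for querying the Airbyte/Fivetran API for a job's status).
    Returns True on success; raises on failure or timeout.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status == "succeeded":
            return True
        if status == "failed":
            raise RuntimeError("sync job failed")
        time.sleep(poll_interval)
    raise TimeoutError("sync job did not finish in time")

# Fake job that reports "running" twice before succeeding.
statuses = iter(["running", "running", "succeeded"])
print(wait_for_sync(lambda: next(statuses)))  # prints True
```

This blocking behavior is what makes it safe to chain the EL step and the dbt model in one DAG: the model node simply does not start until its before-script returns.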
You can now run the entire ELT pipeline end to end with the new `fal flow run` command.
For more information on running EL jobs from fal, see our docs. Feedback and comments are always welcome on our Discord server. You can also check out our GitHub repository to give us a star or raise an issue.