dbt recently introduced native support for Python models!
At fal, we believe this should be the way to run your Python models, so we have been working on dbt-fal, an adapter that enables you to run dbt Python models with any data warehouse (including Redshift and Postgres). dbt-fal can also run Python models locally, making it very easy to iterate quickly on them. If you are interested, check out our first demo.
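For reference, a dbt Python model is a `.py` file in your models directory that defines a `model(dbt, session)` function; this is a minimal sketch in which `my_source_model` and the column names are illustrative placeholders for your own project:

```python
# models/my_python_model.py
# Minimal dbt Python model sketch. "my_source_model" is a placeholder
# for an upstream model in your own project.

def model(dbt, session):
    # dbt.ref() loads an upstream model as a dataframe
    df = dbt.ref("my_source_model")

    # Any Python transformation can happen here, e.g. a derived column
    df["doubled"] = df["value"] * 2

    # The returned dataframe is materialized as a table by the adapter
    return df
```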
The dbt v1.3 release makes fal's Python model format incompatible with dbt's. To give users of fal Python models an easy migration route to dbt-fal Python models, we have to keep fal Python models separate from the other dbt models.
To do this, update fal to version 0.7.0 or above. Then add a `fal-models-paths` variable to your `dbt_project.yml` with a list of directories containing your fal Python models. These directories must not overlap with any folders listed under `model-paths` for dbt models.
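For example, assuming your fal Python models live in a directory named `fal_models` (the directory name is illustrative), your `dbt_project.yml` would contain:

```yaml
# dbt_project.yml
model-paths: ["models"]  # regular dbt models

vars:
  # Directories holding fal Python models; must not overlap with model-paths
  fal-models-paths: ["fal_models"]
```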
You can check the PR (#578) for more details.
- Ensure broken dbt models are not corrupting other models (#500)
- Correctly serialize script type when constructing task (#522)
- Ensure the venv based environments are thread safe (#527)
- Use POSIX which to figure out the binary path (#526)
- Correct dependency analysis for external dbt packages (#529)
- Include fal-extras in the dependency analysis (and fix sf failures) (#539)
- Update docs to include pre-hook
- Change readme to reflect new recommended uses (#515)
- Model-scoped environments (#492)
- Implement "local" hooks (#505)
- Two-way bridge for transporting objects to isolated hooks (#519)
- Dual-venvs for faster/cheaper environment creation (#520)
- Add DO_NOT_TRACK as option for telemetry opt out (#538)
- Include shell-variables in the isolated debug script (#540)
- Add telemetry information about fal python models being used (#530)
- Run npm audit fix for docsite (#533)
- Add a PR template (#536)
- Add telemetric project identifier (#537)
- Pass config and adapter down to lib functions when possible (#516)
- Define the explicit flow between env<->conn (#523)
- Use sqlalchemy engine to write and read dataframes by default (#517)
- Move to new logging system (#524)
- Remove duckdb and redshift from daily integration tests (#518)
- Remove fake credentials for redshift (#531)
- Add athena integration tests (#503)
- Adapt profiles_dir info import for new location in dbt 1.3 (#564)
- Enable global script selection (#569)
- Document structured hooks
- Document the new environment management feature (#548)
- Implement initial support for conda (#528)
- Generalize dual Python IPC and speed-up conda builds (#534)
- Add PythonAdapter and PythonConnectionManager classes (#553)
- Add --globals flag for more explicit global script execution (#574)