With prefect-dbt, you can execute dbt Core from a Prefect flow with per-node observability, trigger and observe dbt Cloud jobs, and incorporate other tools, such as Snowflake, into your dbt runs.
Prefect provides a global view of the state of your workflows and allows you to take action based on state changes.
Getting started
Prerequisites
- A dbt adapter for your target database if using dbt Core (for example, dbt-duckdb, dbt-snowflake, or dbt-bigquery).
- A dbt Cloud account if using dbt Cloud.
Install prefect-dbt
Installing prefect-dbt installs dbt-core, but not a dbt database adapter. For adapter-specific extras ([snowflake], [bigquery], [postgres], [all_extras]) and notes on adapters without a bundled extra, such as dbt-duckdb, see the installation guide.
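Assuming you install with pip, the base package and an adapter extra look like this (the snowflake extra is one example from the list above):

```shell
# Base install: prefect-dbt plus dbt-core, no database adapter
pip install prefect-dbt

# Adapter extras bundle the matching dbt adapter
pip install "prefect-dbt[snowflake]"

# dbt-duckdb has no bundled extra, so install it alongside
pip install prefect-dbt dbt-duckdb
```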
Start here
If you already have a dbt Core project with a working profiles.yml, the shortest path is PrefectDbtRunner:
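A minimal sketch of that path, assuming prefect-dbt 0.7+ is installed and the flow is launched from the dbt project root (the flow and project names here are placeholders):

```python
from prefect import flow
from prefect_dbt import PrefectDbtRunner


@flow
def my_dbt_flow():
    # Runs `dbt build` against the project and profiles.yml discovered
    # in the current working directory, surfacing each dbt node as a
    # Prefect log line and failing the flow if any node fails.
    PrefectDbtRunner().invoke(["build"])


if __name__ == "__main__":
    my_dbt_flow()
```

Any dbt CLI verb can be passed to invoke (for example, ["run"] or ["test"]); build is the common choice because it pairs with the runner's per-node failure handling.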
dbt Core
prefect-dbt 0.7.0 and later ship one interface for running dbt Core from a Prefect flow: the PrefectDbtRunner. It pairs dbt build semantics with Prefect-native logging, failure handling, and asset lineage.
For projects still using the pre-0.7.0 API (DbtCoreOperation, DbtCliProfile, and the TargetConfigs block hierarchy), see the Legacy guide.
dbt Cloud
Use the pre-built run_dbt_cloud_job flow to trigger dbt Cloud jobs from Prefect, with automatic retries for failed nodes. See the dbt Cloud guide for end-to-end setup with DbtCloudCredentials and DbtCloudJob blocks.
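A short sketch of triggering a job this way, assuming a DbtCloudJob block has already been saved; the block name "my-dbt-cloud-job" is a placeholder:

```python
from prefect import flow
from prefect_dbt.cloud import DbtCloudJob
from prefect_dbt.cloud.jobs import run_dbt_cloud_job


@flow
def trigger_dbt_cloud():
    # Load the saved block, which wraps DbtCloudCredentials and a job ID
    dbt_cloud_job = DbtCloudJob.load("my-dbt-cloud-job")
    # Trigger the job and wait for completion; failed nodes are retried
    run_dbt_cloud_job(dbt_cloud_job=dbt_cloud_job, targeted_retries=3)


if __name__ == "__main__":
    trigger_dbt_cloud()
```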
Resources
- PrefectDbtRunner — run dbt Core from a Prefect flow with asset lineage.
- dbt Cloud — run dbt Cloud jobs from Prefect.
- Run dbt with Prefect example — complete working flow.
- Installation — adapter extras and detailed install options.
- Legacy — prefect-dbt 0.6.6 and earlier.
- prefect-dbt release notes — version history.
- SDK reference — full prefect-dbt API.
- dbt documentation — upstream dbt docs.