With prefect-dbt, you can execute dbt Core from a Prefect flow with per-node observability, trigger and observe dbt Cloud jobs, and incorporate other tools, such as Snowflake, into your dbt runs. Prefect provides a global view of the state of your workflows and allows you to take action based on state changes.

Getting started

Prerequisites

  • A dbt adapter for your target database if using dbt Core (for example, dbt-duckdb, dbt-snowflake, or dbt-bigquery).
  • A dbt Cloud account if using dbt Cloud.

Install prefect-dbt

pip install "prefect[dbt]"
This installs prefect-dbt and dbt-core, but not a dbt database adapter. For adapter-specific extras ([snowflake], [bigquery], [postgres], [all_extras]) and notes on adapters without a bundled extra such as dbt-duckdb, see the installation guide.

Start here

If you already have a dbt Core project with a working profiles.yml, the shortest path is PrefectDbtRunner:
from prefect import flow
from prefect_dbt import PrefectDbtRunner


@flow
def run_dbt():
    PrefectDbtRunner().invoke(["build"])


if __name__ == "__main__":
    run_dbt()
For a complete end-to-end flow that downloads a dbt project, runs it, and tests it, see the Run dbt with Prefect example.

dbt Core

prefect-dbt 0.7.0 and later ship one interface for running dbt Core from a Prefect flow: the PrefectDbtRunner. It pairs dbt build semantics with Prefect-native logging, failure handling, and asset lineage. For projects still using the pre-0.7.0 API (DbtCoreOperation, DbtCliProfile, and the TargetConfigs block hierarchy), see the Legacy guide.

dbt Cloud

Use the pre-built run_dbt_cloud_job flow to trigger dbt Cloud jobs from Prefect, with automatic retries for failed nodes. See the dbt Cloud guide for end-to-end setup with DbtCloudCredentials and DbtCloudJob blocks.

Resources