When a deployment runs, the execution environment needs access to the flow code. Flow code is not stored directly in Prefect server or Prefect Cloud; instead, it must be made available to the execution environment. There are two main ways to achieve this:

  1. Include source code directly in your runtime: Often, this means building your code into a Docker image.
  2. Retrieve code from storage at runtime: The worker pulls code from a specified location before starting the flow run.

This page focuses on the second approach: retrieving code from a storage location at runtime.

You have several options for where your code can be stored and pulled from:

  • Local filesystem
  • Git-based storage (GitHub, GitLab, Bitbucket)
  • Blob storage (AWS S3, Azure Blob Storage, GCP GCS)

The ideal choice depends on your team’s needs and tools.

In the examples below, we show how to create a deployment configured to run on dynamic infrastructure for each of these storage options.

Deployment creation options

As detailed in the Deployment overview, you can create a deployment in one of two main ways:

  • Python code with the flow.deploy method

    • When using .deploy, specify a storage location for your flow with the flow.from_source method.

    • The source is either a URL to a git repository or a storage object. For example:

      • A local directory: source=Path(__file__).parent or source="/path/to/file"
      • A URL to a git repository: source="https://github.com/org/my-repo.git"
      • A storage object: source=GitRepository(url="https://github.com/org/my-repo.git")
    • The entrypoint is the path to the file the flow is located in and the function name, separated by a colon.

  • YAML specification defined in a prefect.yaml file

    • To create a prefect.yaml file interactively, run prefect deploy from the CLI and follow the prompts.

    • The prefect.yaml file may define a pull section that specifies the storage location for your flow. For example:

      • Set the working directory:
      pull:
          - prefect.deployments.steps.set_working_directory:
              directory: /path/to/directory
      
      • Clone a git repository:
      pull:
          - prefect.deployments.steps.git_clone:
              repository: https://github.com/org/my-repo.git
      
      • Pull from blob storage:
      pull:
          - prefect_azure.deployments.steps.pull_from_azure_blob_storage:
              container: my-container
              folder: my-folder
      

Whether you use from_source or prefect.yaml to specify the storage location for your flow code, the resulting deployment will have a set of pull steps that your worker will use to retrieve the flow code at runtime.
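As noted above, an entrypoint is the path to the file the flow is located in and the flow function name, separated by a colon. A short sketch (the file and flow names here are placeholders):

```python
# An entrypoint has the form "<path to flow file>:<flow function name>".
entrypoint = "flows/etl.py:my_flow"

# Splitting on the last colon separates the file path from the function name.
path, func_name = entrypoint.rsplit(":", 1)
print(path)       # flows/etl.py
print(func_name)  # my_flow
```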

Store code locally

If using a Process work pool, you can use one of the remote code storage options shown above, or you can store your flow code in a local folder.

Here is an example of how to create a deployment with flow code stored locally:

from prefect import flow
from pathlib import Path


@flow(log_prints=True)
def my_flow(name: str = "World"):
    print(f"Hello {name}!")


if __name__ == "__main__":
    my_flow.from_source(
        source=str(Path(__file__).parent),  # code stored in local directory
        entrypoint="local_process_deploy_local_code.py:my_flow",
    ).deploy(
        name="local-process-deploy-local-code",
        work_pool_name="my-process-pool",
    )

Git-based storage

Git-based version control platforms provide redundancy, version control, and collaboration capabilities. Prefect supports GitHub, GitLab, and Bitbucket.

For a public repository, you can use the repository URL directly.

If you are using a private repository and are authenticated in your environment at deployment creation and deployment execution, you can use the repository URL directly.

Alternatively, for a private repository, you can create a Secret block or a git-platform-specific credentials block (such as GitHubCredentials) to store your credentials.

Then you can reference this block in the Python deploy method or the prefect.yaml file pull step.
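For example, a prefect.yaml pull step for a private repository might reference a GitHubCredentials block like this sketch (the repository URL and the block name my-github-creds are placeholders):

```yaml
pull:
    - prefect.deployments.steps.git_clone:
        repository: https://github.com/org/my-private-repo.git
        credentials: "{{ prefect.blocks.github-credentials.my-github-creds }}"
```

At runtime, the worker resolves the templated block reference and uses the stored token to clone the repository.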

If using the Python deploy method with a private repository that references a block, provide a GitRepository object instead of a URL.

from prefect import flow

if __name__ == "__main__":
    flow.from_source(
        source="https://github.com/org/my-public-repo.git",
        entrypoint="gh_public_repo.py:my_flow",
    ).deploy(
        name="my-github-deployment",
        work_pool_name="my_pool",
    )

For accessing a private repository, we suggest creating a Personal Access Token (PAT). We recommend using HTTPS with fine-grained Personal Access Tokens to limit access by repository.

Per the principle of least privilege, we recommend granting the token read access to Contents for your repository only.

If using a Secret block, you can create it through code or the UI ahead of time and reference it at deployment creation as shown above.

If using a GitHubCredentials block to store your credentials, you can create it ahead of time and reference it at deployment creation.

  1. Install prefect-github with pip install -U prefect-github
  2. Register all block types defined in prefect-github with prefect block register -m prefect_github
  3. Create a GitHubCredentials block through code or the Prefect UI and reference it at deployment creation as shown above.
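With the block created, referencing it from the Python deploy method might look like the following sketch (the repository URL, block name, entrypoint, and work pool name are all placeholders):

```python
from prefect import flow
from prefect.runner.storage import GitRepository
from prefect_github import GitHubCredentials

if __name__ == "__main__":
    flow.from_source(
        # A GitRepository object instead of a plain URL, so credentials
        # can be attached; "my-github-creds" is a placeholder block name.
        source=GitRepository(
            url="https://github.com/org/my-private-repo.git",
            credentials=GitHubCredentials.load("my-github-creds"),
        ),
        entrypoint="my_file.py:my_flow",
    ).deploy(
        name="my-private-github-deployment",
        work_pool_name="my_pool",
    )
```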

Note that you can specify a branch if creating a GitRepository object. The default is "main".

Push your code

When you make a change to your code, Prefect does not push your code to your git-based version control platform. You need to push it yourself or as part of your CI/CD pipeline. This design decision is intentional, to avoid confusion about the git history and push process.

Docker-based storage

Another popular flow code storage option is to include it in a Docker image. All work pool options except Process and Prefect Managed allow you to bake your code into a Docker image.

To create a deployment with Docker-based flow code storage use the Python deploy method or create a prefect.yaml file.

If you use the Python deploy method to store the flow code in a Docker image, you don’t need to use the from_source method.

If you instead use a prefect.yaml file, you can generate one by running prefect deploy from the CLI and following the prompts.

Note that a build section is necessary if baking your flow code into a Docker image.
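A build section in a prefect.yaml file typically looks like this sketch (the image name and tag are placeholders):

```yaml
build:
    - prefect_docker.deployments.steps.build_docker_image:
        id: build_image
        requires: prefect-docker>=0.3.1
        image_name: my-registry/my-image
        tag: latest
        dockerfile: auto
```

The id gives later steps a handle for referencing this step's outputs, and dockerfile: auto tells Prefect to generate a Dockerfile that copies in your flow code.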

from prefect import flow


@flow
def my_flow():
    print("Hello from inside a Docker container!")


if __name__ == "__main__":
    my_flow.deploy(
        name="my-docker-deploy",
        work_pool_name="my_pool",
        image="my-docker-image:latest",
        push=False
    )

By default, .deploy will build a Docker image that includes your flow code and any pip packages specified in a requirements.txt file.

In the example above, we elected not to push the resulting image to a remote registry.

To push the image to a remote registry, pass push=True in the Python deploy method or add a push_docker_image step to the push section of the prefect.yaml file.
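A push section might look like this sketch, assuming an earlier build step with id: build_image whose outputs it references:

```yaml
push:
    - prefect_docker.deployments.steps.push_docker_image:
        requires: prefect-docker>=0.3.1
        image_name: "{{ build_image.image_name }}"
        tag: "{{ build_image.tag }}"
```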

Custom Docker image

If an image is not specified by one of the methods above, deployment flow runs associated with a Docker work pool will use the base Prefect image (e.g. prefecthq/prefect:3-latest) when executing. Alternatively, you can create a custom Docker image outside of Prefect by running docker build && docker push elsewhere (e.g. in your CI/CD pipeline) and then reference the resulting image in the job_variables section of your deployment definition, or set the image as a default directly on the work pool.
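Referencing a pre-built custom image through job_variables in a prefect.yaml deployment definition might look like this sketch (the deployment, pool, and image names are placeholders):

```yaml
deployments:
    - name: my-custom-image-deployment
      entrypoint: my_file.py:my_flow
      work_pool:
          name: my-docker-pool
          job_variables:
              # Pre-built image from your own docker build && docker push
              image: my-registry/my-custom-image:latest
```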

For more information, see this discussion of custom Docker images.

Blob storage

Another option for flow code storage is any fsspec-supported storage location, such as AWS S3, Azure Blob Storage, or GCP GCS.

If the storage location is publicly available, or if you are authenticated in the environment where you are creating and running your deployment, you can reference the storage location directly. You don’t need to pass credentials explicitly.

To pass credentials explicitly to authenticate to your storage location, you can use either of the following block types:

  • Prefect integration library storage blocks, such as the prefect-aws library’s S3Bucket block, which can use an AwsCredentials block when it is created.
  • Secret blocks

If you use a storage block such as the S3Bucket block, you need to have the prefect-aws library available in the environment where your flow code runs.

You can do any of the following to make the library available:

  1. Install the library into the execution environment directly
  2. Specify the library in the work pool’s Base Job Template in the Environment Variables section, like this: {"EXTRA_PIP_PACKAGES": "prefect-aws"}
  3. Specify the library in the environment variables of the deploy method through its job_variables argument
  4. Specify the library in a requirements.txt file and reference the file in the pull step of the prefect.yaml file like this:
    - prefect.deployments.steps.pip_install_requirements:
        directory: "{{ pull_code.directory }}" 
        requirements_file: requirements.txt
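For option 3, passing the environment variable through the deploy method's job_variables might look like this sketch (the bucket, entrypoint, deployment, and pool names are placeholders):

```python
from prefect import flow

if __name__ == "__main__":
    flow.from_source(
        source="s3://my-bucket/my-folder",
        entrypoint="my_file.py:my_flow",
    ).deploy(
        name="my-s3-deployment",
        work_pool_name="my-work-pool",
        # The worker installs these packages before starting the flow run.
        job_variables={"env": {"EXTRA_PIP_PACKAGES": "prefect-aws"}},
    )
```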

The examples below show how to create a deployment with flow code in a cloud provider storage location. For each example, we show how to access code that is publicly available. If authenticating to a private storage location with a prefect.yaml file, the pull step takes an additional line to reference a credentials block.

We also include Python code that shows how to use an existing storage block, as well as an example that creates, but doesn’t save, a storage block that references an existing nested credentials block.

from prefect import flow


if __name__ == "__main__":
    flow.from_source(
        source="s3://my-bucket/my-folder",
        entrypoint="my_file.py:my_flow",
    ).deploy(
        name="my-aws-s3-deployment",
        work_pool_name="my-work-pool"
    )
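An equivalent prefect.yaml pull section might look like this sketch; the credentials line is only needed for a private bucket, and the block name my-aws-creds is a placeholder:

```yaml
pull:
    - prefect_aws.deployments.steps.pull_from_s3:
        bucket: my-bucket
        folder: my-folder
        credentials: "{{ prefect.blocks.aws-credentials.my-aws-creds }}"
```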

To create an AwsCredentials block:

  1. Install the prefect-aws library with pip install -U prefect-aws
  2. Register the blocks in prefect-aws with prefect block register -m prefect_aws
  3. Create a user with a role with read and write permissions to access the bucket. If using the UI, create an access key pair with IAM -> Users -> Security credentials -> Access keys -> Create access key. Choose Use case -> Other and then copy the Access key and Secret access key values.
  4. Create an AwsCredentials block in code or the Prefect UI. In addition to the block name, most users will fill in the AWS Access Key ID and AWS Access Key Secret fields.
  5. Reference the block in your deployment, either in the Python deploy method or in a prefect.yaml pull step.
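Referencing the block from Python might look like the following sketch, which creates, but does not save, an S3Bucket storage block that nests the AwsCredentials block (the bucket, folder, block, entrypoint, and pool names are placeholders):

```python
from prefect import flow
from prefect_aws import AwsCredentials, S3Bucket

if __name__ == "__main__":
    # Created in memory for this deployment; not saved to the server.
    s3_bucket = S3Bucket(
        bucket_name="my-bucket",
        bucket_folder="my-folder",
        credentials=AwsCredentials.load("my-aws-creds"),
    )

    flow.from_source(
        source=s3_bucket,
        entrypoint="my_file.py:my_flow",
    ).deploy(
        name="my-private-s3-deployment",
        work_pool_name="my-work-pool",
    )
```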

Another authentication option is to give the worker access to the storage location at runtime through SSH keys.