In the Quickstart, you learned how to convert a Python script to a Prefect flow.

In this tutorial, you’ll learn how to get that flow off your local machine and run it on a schedule with Prefect.

Publish your code to a remote repository

First, you need to take the code from your local machine and publish it to a remote repository. We’ve already published the code you need for this tutorial to GitHub:

https://github.com/prefecthq/demos.git

Create a work pool

Running a flow locally is a good start, but most use cases require a remote execution environment. A work pool is the most common interface for deploying flows to remote infrastructure.

Deploy your flow to a self-hosted Prefect server instance using a Process work pool. All flow runs submitted to this work pool will run in a local subprocess (the creation mechanics are similar for other work pool types that run on remote infrastructure).

  1. Create a Process work pool:

    prefect work-pool create --type process my-work-pool
    
  2. Verify that the work pool exists:

    prefect work-pool ls
    
  3. Start a worker to poll the work pool:

    prefect worker start --pool my-work-pool
    

You can also choose from other work pool types.
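To build intuition for what a Process worker does, here is a minimal conceptual sketch in plain Python (this is not Prefect's implementation; the `scheduled_runs` list stands in for the work pool's queue, which a real worker polls via the Prefect API):

```python
import subprocess
import sys

# Hypothetical stand-in for the work pool's queue of scheduled flow runs.
# A real Prefect worker polls the server's API for these instead.
scheduled_runs = [
    'print("flow run 1 complete")',
    'print("flow run 2 complete")',
]

def run_in_subprocess(flow_code: str) -> int:
    """Execute one 'flow run' in a local subprocess, as a Process worker does."""
    result = subprocess.run([sys.executable, "-c", flow_code])
    return result.returncode

if __name__ == "__main__":
    for code in scheduled_runs:
        exit_code = run_in_subprocess(code)
        print(f"worker: run finished with exit code {exit_code}")
```

Other work pool types follow the same pattern; only the execution environment changes (for example, a container or a cloud service instead of a local subprocess).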

Deploy and schedule your flow

A deployment determines when, where, and how a flow should run. Deployments elevate flows to remotely configurable entities that have their own API. To run a flow on a schedule, you need to create a deployment.

  1. Create a deployment in code:

    create_deployment.py
    from prefect import flow
    
    # Source for the code to deploy (here, a GitHub repo)
    SOURCE_REPO="https://github.com/prefecthq/demos.git"
    
    if __name__ == "__main__":
        flow.from_source(
            source=SOURCE_REPO,
            entrypoint="my_gh_workflow.py:repo_info", # Specific flow to run
        ).deploy(
            name="my-first-deployment",
            work_pool_name="my-work-pool", # Work pool target
            cron="* * * * *", # Cron schedule (every minute)
        )
    

    You can store your flow code in nearly any location as long as Prefect can access it. See Where to store your flow code for more details.

  2. Run the script to create the deployment:

    python create_deployment.py
    

    Check the logs to ensure your deployment was created:

    Successfully created/updated all deployments!
    _______________________________________________________
    |                     Deployments                     |
    _______________________________________________________
    | Name                          | Status   | Details  |
    _______________________________________________________
    | repo-info/my-first-deployment | applied  |          |
    _______________________________________________________
    
  3. Schedule a run for the deployment:

    prefect deployment run 'repo-info/my-first-deployment'
    

    Soon you should see the flow run graph and logs on the Flow Run page in the UI. Logs are also streamed to the terminal.
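The deployment above uses the cron schedule "* * * * *". As a rough illustration of how the five cron fields are read (a simplified sketch, not Prefect's scheduler, which also supports ranges, steps, and lists):

```python
from datetime import datetime

# The five cron fields are: minute, hour, day-of-month, month, day-of-week.
# "*" matches any value, so "* * * * *" fires every minute.
def cron_matches(expr: str, when: datetime) -> bool:
    """Return True if `when` matches a cron expression of plain numbers and '*'."""
    fields = expr.split()
    values = [when.minute, when.hour, when.day, when.month,
              when.isoweekday() % 7]  # cron uses 0 = Sunday
    return all(f == "*" or int(f) == v for f, v in zip(fields, values))

if __name__ == "__main__":
    now = datetime(2024, 1, 1, 9, 30)
    print(cron_matches("* * * * *", now))   # every minute
    print(cron_matches("0 9 * * *", now))   # daily at 09:00, so not at 09:30
```

In production you would typically pick a less aggressive schedule than every minute, such as `0 9 * * *` for a daily 9:00 AM run.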

Next steps

In this tutorial, you successfully deployed your flow to remote infrastructure and scheduled it to run automatically.

Next, learn how to build a resilient and performant data pipeline with retries, concurrent tasks, concurrency limits, and caching.

Need help? Book a meeting with a Prefect Product Advocate to get your questions answered.