The simplest way to create a deployment for your flow is by calling its `serve` method.
Serve a flow
The `serve` method creates a deployment for the flow and starts a long-running process that monitors for work from the Prefect server. When work is found, it is executed within its own isolated subprocess.

hello_world.py
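The original example file isn't reproduced here; a minimal sketch of what it could contain (the flow body, parameters, and deployment name are illustrative):

```python
from prefect import flow


@flow(log_prints=True)
def hello_world(name: str = "world", goodbye: bool = False):
    print(f"Hello {name}!")
    if goodbye:
        print(f"Goodbye {name}!")


if __name__ == "__main__":
    # creates a deployment and starts a long-running process
    # that polls for scheduled runs of this deployment
    hello_world.serve(name="my-first-deployment")
```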
This interface provides the configuration for a deployment, such as:

- schedules
- event triggers
- metadata such as tags and description
- default parameter values
Schedules are auto-paused on shutdown

By default, stopping the process running `flow.serve` will pause the schedule for the deployment (if it has one). When running this in environments where restarts are expected, use the `pause_on_shutdown=False` flag to prevent this behavior:
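A sketch of this pattern (the flow and deployment names are placeholders):

```python
from prefect import flow


@flow
def my_flow():
    print("Hello from my flow!")


if __name__ == "__main__":
    my_flow.serve(
        name="my-deployment",
        # keep the deployment schedule active even after this process stops
        pause_on_shutdown=False,
    )
```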
Additional serve options
The `serve` method on flows exposes many options for the deployment.
Here’s how to use some of those options:
- `cron`: a keyword that allows you to set a cron string schedule for the deployment; see schedules for more advanced scheduling options
- `tags`: a keyword that allows you to tag this deployment and its runs for bookkeeping and filtering purposes
- `description`: a keyword that allows you to document what this deployment does; by default the description is set from the docstring of the flow function (if documented)
- `version`: a keyword that allows you to track changes to your deployment; uses a hash of the file containing the flow by default; popular options include semver tags or git commit hashes
- `triggers`: a keyword that allows you to define a set of conditions for when the deployment should run; see triggers for more on Prefect Events concepts
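A sketch combining several of these options (the flow, tag, description, and version values are illustrative):

```python
from prefect import flow


@flow
def do_important_stuff():
    print("Doing lots of important stuff!")


if __name__ == "__main__":
    do_important_stuff.serve(
        name="important-stuff-deployment",
        cron="* * * * *",  # run every minute
        tags=["demo"],
        description="Does lots of important stuff.",
        version="tutorial/deployments",
    )
```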
Triggers with `.serve`

See this example that triggers downstream work on upstream events.

Hit CTRL+C to stop the process, and your schedule automatically pauses.
`serve()` is a long-running process

To execute remotely triggered or scheduled runs, your script with `flow.serve` must be actively running.

Serve multiple flows at once
Serve multiple flows with the same process using the `serve` utility along with the `to_deployment` method of flows:

serve_two_flows.py
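The original file isn't reproduced here; a sketch of what it could contain (the flow names, interval, and parameter are illustrative):

```python
import time

from prefect import flow, serve


@flow
def slow_flow(sleep: int = 60):
    """Sleep for a configurable number of seconds."""
    time.sleep(sleep)


@flow
def fast_flow():
    """Return immediately."""
    return


if __name__ == "__main__":
    slow_deploy = slow_flow.to_deployment(name="sleeper", interval=45)
    fast_deploy = fast_flow.to_deployment(name="fast")
    # both deployments are registered with the API and served by a single process
    serve(slow_deploy, fast_deploy)
```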
A few things to note:

- the `flow.to_deployment` interface exposes the exact same options as `flow.serve`; this method produces a deployment object
- the deployments are only registered with the API once `serve(...)` is called
- when serving multiple deployments, the only requirement is that they share a Python environment; they can be executed and scheduled independently of each other
A few optional steps for exploration include:

- pause and unpause the schedule for the "sleeper" deployment
- use the UI to submit ad-hoc runs for the "sleeper" deployment with different values for `sleep`
- cancel an active run for the "sleeper" deployment from the UI
Hybrid execution option

Prefect’s deployment interface allows you to choose a hybrid execution model.
Whether you use Prefect Cloud or self-host Prefect server, you can run workflows in the
environments best suited to their execution.
This model enables efficient use of your infrastructure resources while maintaining the privacy
of your code and data.
There is no ingress required.
Read more about our hybrid model.
Retrieve a flow from remote storage
Just like the `.deploy` method, the `flow.from_source` method is used to define how to retrieve the flow that you want to serve.
from_source

The `flow.from_source` method on `Flow` objects requires a `source` and an `entrypoint`.
source

The `source` of your deployment can be:

- a path to a local directory such as `path/to/a/local/directory`
- a repository URL such as `https://github.com/org/repo.git`
- a `GitRepository` object that accepts
  - a repository URL
  - a reference to a branch, tag, or commit hash
  - `GitCredentials` for private repositories
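As a sketch, a `GitRepository` source for a private repository might look like this (the repository URL, branch, and Secret block name are placeholders to adapt to your setup):

```python
from prefect.runner.storage import GitRepository
from prefect.blocks.system import Secret

# a private repository pinned to a branch; the Secret block name is a placeholder
git_source = GitRepository(
    url="https://github.com/org/private-repo.git",
    branch="main",
    credentials={"access_token": Secret.load("my-github-token")},
)
```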
entrypoint

A flow `entrypoint` is the path to the file where the flow is located within that `source`, in the form `path/to/file.py:flow_function_name`.

For example, the following loads the `hello` flow from the `flows/hello_world.py` file in the `PrefectHQ/examples` repository:
load_from_url.py
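A sketch of what this file could contain, assuming the `hello` flow takes no required arguments:

```python
from prefect import flow

# load the flow definition from the remote repository
my_flow = flow.from_source(
    source="https://github.com/PrefectHQ/examples.git",
    entrypoint="flows/hello_world.py:hello",
)

if __name__ == "__main__":
    my_flow()
```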
You can serve loaded flows

You can serve a flow loaded from remote storage with the same `serve` method as a local flow:

serve_loaded_flow.py
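A sketch of this pattern (the deployment name is a placeholder):

```python
from prefect import flow

if __name__ == "__main__":
    # load the flow from the remote repository and serve it locally
    flow.from_source(
        source="https://github.com/PrefectHQ/examples.git",
        entrypoint="flows/hello_world.py:hello",
    ).serve(name="deployment-from-remote-flow")
```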