Submit flows directly to different infrastructure types without a deployment
| Infrastructure | Required Package | Decorator |
|---|---|---|
| Docker | `prefect-docker` | `@docker` |
| Kubernetes | `prefect-kubernetes` | `@kubernetes` |
| AWS ECS | `prefect-aws` | `@ecs` |
| Google Cloud Run | `prefect-gcp` | `@cloud_run` |
| Google Vertex AI | `prefect-gcp` | `@vertex_ai` |
| Azure Container Instances | `prefect-azure` | `@azure_container_instance` |
## `@docker`
When using the `@docker` decorator with a local Docker engine, you can use volume mounts to share data between your Docker container and host machine. Here's an example:
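Below is a minimal sketch of that pattern. The work pool name, the shared directory, and the `prefect_docker.experimental` import path are assumptions here; check your installed version of `prefect-docker` for the decorator's exact location and the job variables (such as `volumes`) it accepts.

```python
from pathlib import Path

from prefect import flow
from prefect.filesystems import LocalFileSystem
from prefect_docker.experimental import docker  # assumed import path

# Hypothetical directory shared between the host and the container.
shared_dir = Path.home() / "prefect-shared"
shared_dir.mkdir(exist_ok=True)

# Store results in the shared directory so files written inside the
# container are visible on the host.
local_fs = LocalFileSystem(basepath=str(shared_dir))
local_fs.save("docker-shared-fs", overwrite=True)


@docker(
    work_pool="my-docker-pool",              # hypothetical work pool name
    volumes=[f"{shared_dir}:{shared_dir}"],  # host_path:container_path
)
@flow(result_storage=local_fs)
def write_file() -> str:
    path = shared_dir / "hello.txt"
    path.write_text("Hello from inside the container!")
    return str(path)


if __name__ == "__main__":
    write_file()
```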
Make sure the `LocalFileSystem` block's `basepath` matches the path specified in the volume mount.

## `@kubernetes`

Here's an example:
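A sketch, assuming a Kubernetes work pool named `my-kubernetes-pool` and the same `experimental` import path convention (verify against your installed `prefect-kubernetes` version):

```python
from prefect import flow
from prefect_kubernetes.experimental import kubernetes  # assumed import path


@kubernetes(work_pool="my-kubernetes-pool")  # hypothetical work pool name
@flow
def my_remote_flow(name: str):
    print(f"Hello, {name}!")


@flow
def my_flow():
    # Called like a normal flow, but submitted to the work pool and
    # executed in a Kubernetes job.
    my_remote_flow("Marvin")


if __name__ == "__main__":
    my_flow()
```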
When this code runs, `my_flow` will execute locally, while `my_remote_flow` will be submitted to run in a Kubernetes job.
Parameters passed to infrastructure-bound flows are serialized with `cloudpickle` so that they can be transported to the destination infrastructure. Most Python objects can be serialized with `cloudpickle`, but objects like database connections cannot be serialized. For parameters that cannot be serialized, you'll need to create the object inside your infrastructure-bound workflow.
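For example, a database connection can be recreated from a serializable connection string inside the flow. A sketch (the work pool name and import path are assumptions, as above):

```python
import sqlite3

from prefect import flow
from prefect_kubernetes.experimental import kubernetes  # assumed import path


@kubernetes(work_pool="my-kubernetes-pool")  # hypothetical work pool name
@flow
def query_db(database_path: str):
    # A sqlite3.Connection cannot be pickled, so pass the serializable
    # path and open the connection inside the infrastructure-bound flow.
    connection = sqlite3.connect(database_path)
    try:
        print(connection.execute("SELECT 1").fetchall())
    finally:
        connection.close()
```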