Why use Google Cloud Run for flow run execution?
Google Cloud Run is a fully managed compute platform that automatically scales your containerized applications.
- Serverless architecture: Cloud Run follows a serverless architecture, which means you don’t need to manage any underlying infrastructure. Google Cloud Run automatically handles the scaling and availability of your flow run infrastructure, allowing you to focus on developing and deploying your code.
- Scalability: Cloud Run can automatically scale your pipeline to handle varying workloads and traffic. It can quickly respond to increased demand and scale back down during low activity periods, ensuring efficient resource utilization.
- Integration with Google Cloud services: Google Cloud Run easily integrates with other Google Cloud services, such as Google Cloud Storage, Google Cloud Pub/Sub, and Google Cloud Build. This interoperability enables you to build end-to-end data pipelines that use a variety of services.
- Portability: Since Cloud Run uses container images, you can develop your pipelines locally using Docker and then deploy them on Google Cloud Run without significant modifications. This portability allows you to run the same pipeline in different environments.
Google Cloud Run guide
After completing this guide, you will have:
- Created a Google Cloud Service Account
- Created a Prefect Work Pool
- Deployed a Prefect Worker as a Cloud Run Service
- Deployed a Flow
- Executed the Flow as a Google Cloud Run Job
Prerequisites
Before starting this guide, make sure you have:
- A Google Cloud Platform (GCP) account.
- A project on your GCP account where you have the necessary permissions to create Cloud Run Services and Service Accounts.
- The `gcloud` CLI installed on your local machine. You can follow Google Cloud’s installation guide. If you’re using macOS (or a Linux system), you can also install it with Homebrew.
- Docker installed on your local machine.
- A Prefect server instance. You can sign up for a forever free Prefect Cloud Account or, alternatively, self-host a Prefect server.
Step 1. Create a Google Cloud service account
First, open a terminal or command prompt on your local machine where `gcloud` is installed. If you haven’t already authenticated with `gcloud`, run the following command and follow the instructions to log in to your GCP account.
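The standard login command is:

```bash
gcloud auth login
```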
Next, set your active GCP project to the one where you’d like to create the service account, replacing `<PROJECT_ID>` with your GCP project’s ID. For example, if your project’s ID is `prefect-project`, the command will look like this:
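```bash
gcloud config set project prefect-project
```

With the project set, create the service account that your worker and flow runs will use, and grant it the permissions it needs. The exact roles depend on your organization’s policies; a minimal sketch, using the illustrative account name `prefect-service-account` and granting the Cloud Run admin and service account user roles, could look like this:

```bash
# Create the service account (the name is illustrative; choose your own)
gcloud iam service-accounts create prefect-service-account \
  --display-name="Prefect worker service account"

# Grant roles commonly needed to create and run Cloud Run jobs;
# adjust these to match your project's security requirements.
gcloud projects add-iam-policy-binding <PROJECT_ID> \
  --member="serviceAccount:prefect-service-account@<PROJECT_ID>.iam.gserviceaccount.com" \
  --role="roles/run.admin"

gcloud projects add-iam-policy-binding <PROJECT_ID> \
  --member="serviceAccount:prefect-service-account@<PROJECT_ID>.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountUser"
```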
Step 2. Create a Cloud Run work pool
Let’s walk through the process of creating a Cloud Run work pool.
Fill out the work pool base job template
You can create a new work pool using the Prefect UI or CLI. The following command creates a work pool of type `cloud-run` via the CLI (you’ll want to replace `<WORK-POOL-NAME>` with the name of your work pool):
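```bash
prefect work-pool create --type cloud-run "<WORK-POOL-NAME>"
```

Once the pool exists, you can fill out the rest of its base job template (for example, the GCP region and the service account you created in step 1) from the work pool’s page in the Prefect UI.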


Step 3. Deploy a Cloud Run worker
Now you can launch a Cloud Run service to host the Cloud Run worker. This worker will poll the work pool that you created in the previous step. Navigate back to your terminal and run the following commands to set your Prefect API key and URL as environment variables. Be sure to replace `<ACCOUNT-ID>` and `<WORKSPACE-ID>` with your Prefect account and workspace IDs (both will be available in the URL of the UI when previewing the workspace dashboard). You’ll want to replace `<YOUR-API-KEY>` with an active API key as well.
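If you’re using Prefect Cloud, the variables take this shape (if you’re self-hosting a Prefect server, point `PREFECT_API_URL` at your server’s API instead):

```bash
export PREFECT_API_URL="https://api.prefect.cloud/api/accounts/<ACCOUNT-ID>/workspaces/<WORKSPACE-ID>"
export PREFECT_API_KEY="<YOUR-API-KEY>"
```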
Once those variables are set, run the following command to deploy your worker as a Cloud Run service. Replace `<YOUR-SERVICE-ACCOUNT-NAME>` with the name of the service account you created in the first step of this guide, and replace `<WORK-POOL-NAME>` with the name of the work pool you created in the second step.
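A sketch of that deployment, assuming the service name `prefect-worker` and a recent Prefect 2 image (run `gcloud run deploy --help` for the full set of flags, and adjust the image tag and region to your setup):

```bash
gcloud run deploy prefect-worker \
  --image=prefecthq/prefect:2-latest \
  --set-env-vars="PREFECT_API_URL=$PREFECT_API_URL,PREFECT_API_KEY=$PREFECT_API_KEY" \
  --service-account=<YOUR-SERVICE-ACCOUNT-NAME> \
  --min-instances=1 \
  --no-cpu-throttling \
  --args="prefect","worker","start","--with-healthcheck","--pool","<WORK-POOL-NAME>","--type","cloud-run"
```

`gcloud` will prompt you for a region if one isn’t configured, and it may expect the service account’s full email address (`<NAME>@<PROJECT_ID>.iam.gserviceaccount.com`) rather than just its name.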
After running the command, you’ll be able to see your new `prefect-worker` service by navigating to the Cloud Run page of your Google Cloud console. Additionally, you should be able to see a record of this worker in the Prefect UI on the work pool’s page by navigating to the Workers tab.
Let’s not leave our worker hanging; it’s time to give it a job.
Step 4. Deploy a flow
Let’s prepare a flow to run as a Cloud Run job. In this section of the guide, we’ll “bake” our code into a Docker image and push that image to Google Artifact Registry.
Create a registry
Let’s create a Docker repository in your Google Artifact Registry to host your custom image. If you already have a registry and are authenticated to it, skip ahead to the Write a flow section. The following command creates a repository using the gcloud CLI. You’ll want to replace `<REPOSITORY-NAME>` with your own value:
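The location below is illustrative; pick the one closest to you. If you haven’t yet configured Docker to authenticate to Artifact Registry, `gcloud auth configure-docker` takes care of that as well:

```bash
gcloud artifacts repositories create <REPOSITORY-NAME> \
  --repository-format=docker \
  --location=us

# Allow Docker to push to repositories hosted at us-docker.pkg.dev
gcloud auth configure-docker us-docker.pkg.dev
```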
Write a flow
First, create a new directory. This will serve as the root of your project’s repository. Within the directory, create a sub-directory called `flows`.
Navigate to the `flows` subdirectory and create a new file for your flow. Feel free to write your own flow, but here’s a ready-made one for your convenience:
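A minimal example that fetches a forecast from the Open-Meteo API and records it as a markdown artifact. The flow name `fetch_weather`, its default coordinates, and the use of `httpx` are illustrative choices, which the `prefect.yaml` example later in this guide assumes:

```python
import httpx
from prefect import flow, task
from prefect.artifacts import create_markdown_artifact


@task
def mark_it_down(temp: float):
    """Record the forecast as a markdown artifact visible in the Prefect UI."""
    markdown_report = f"""# Weather Report

| Time | Temperature |
|:-----|------------:|
| Now  | {temp} °C   |
"""
    create_markdown_artifact(
        key="weather-report",
        markdown=markdown_report,
        description="Very scientific weather report",
    )


@flow
def fetch_weather(lat: float = 38.9, lon: float = -77.0):
    """Fetch the forecasted temperature for the given coordinates."""
    base_url = "https://api.open-meteo.com/v1/forecast/"
    weather = httpx.get(
        base_url,
        params=dict(latitude=lat, longitude=lon, hourly="temperature_2m"),
    )
    most_recent_temp = float(weather.json()["hourly"]["temperature_2m"][0])
    mark_it_down(most_recent_temp)


if __name__ == "__main__":
    fetch_weather()
```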
In the remainder of this guide, this script will be referred to as `weather_flow.py`, but you can name yours whatever you’d like.
Creating a prefect.yaml file
Now we’re ready to make a `prefect.yaml` file, which will be responsible for managing the deployments of this repository.
Navigate back to the root of your directory, and run the following command to create a `prefect.yaml` file using Prefect’s docker deployment recipe:
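```bash
prefect init --recipe docker
```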
When prompted for an image name, prefix it with the path to the Docker repository you created in your registry: `us-docker.pkg.dev/<PROJECT-ID>/<REPOSITORY-NAME>/`. You’ll want to replace `<PROJECT-ID>` with the ID of your project in GCP. This should match the ID of the project you used in the first step of this guide. Here is an example of what this could look like:
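Assuming the project ID `prefect-project` and a repository named `my-docker-repository` (both illustrative), the values you enter might look like this:

```
image_name: us-docker.pkg.dev/prefect-project/my-docker-repository/gcp-weather-image
tag: latest
```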
At this point, there will be a new `prefect.yaml` file available at the root of your project. The contents will look similar to the example below; however, we’ve added a combination of YAML templating options and Prefect deployment actions to build out a simple CI/CD process. Feel free to copy the contents and paste them into your `prefect.yaml`:
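A sketch of what that file might contain, assuming the illustrative image name `gcp-weather-image`, the `fetch_weather` flow from the Write a flow section, and the work pool from step 2 (the version pin and default parameters are placeholders; swap in your own values):

```yaml
# Generic metadata about this project
name: <WORKING-DIRECTORY>
prefect-version: 2.14.0

# build section: build the Docker image that will run your flow
build:
- prefect_docker.deployments.steps.build_docker_image:
    id: build_image
    requires: prefect-docker>=0.3.1
    image_name: <PATH-TO-ARTIFACT-REGISTRY>/gcp-weather-image
    tag: latest
    dockerfile: auto

# push section: upload the image to your Artifact Registry repository
push:
- prefect_docker.deployments.steps.push_docker_image:
    requires: prefect-docker>=0.3.1
    image_name: '{{ build_image.image_name }}'
    tag: '{{ build_image.tag }}'

# pull section: set the working directory before the flow is imported
pull:
- prefect.deployments.steps.set_working_directory:
    directory: /opt/prefect/<WORKING-DIRECTORY>

# deployments section: deployment declarations for this project
deployments:
- name: gcp-weather-deploy
  entrypoint: flows/weather_flow.py:fetch_weather
  parameters:
    lat: 14.5994
    lon: 28.6731
  work_pool:
    name: <WORK-POOL-NAME>
    job_variables:
      image: '{{ build_image.image }}'
```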
After copying the example above, don’t forget to replace `<WORKING-DIRECTORY>` with the name of the directory where your flow folder and `prefect.yaml` live. You’ll also need to replace `<PATH-TO-ARTIFACT-REGISTRY>` with the path to the Docker repository in your Google Artifact Registry.
To get a better understanding of the components of the `prefect.yaml` file above and what they do, feel free to read the next section. Otherwise, you can skip ahead to Flow deployment.
In the `build` section of the `prefect.yaml` file, the following step is executed at deployment build time:
- `prefect_docker.deployments.steps.build_docker_image`: builds a Docker image automatically, using the name and tag chosen previously.
If you are using an ARM-based chip (such as an M1 or M2 Mac), you’ll want to add `platform: linux/amd64` to your `build_docker_image` step to ensure that your Docker image is built for the AMD64 architecture. For example:
The `push` section sends the Docker image to the Docker repository in your Google Artifact Registry, so that it can be easily accessed by the worker for flow run execution.
The `pull` section sets the working directory for the process prior to importing your flow.
In the `deployments` section of the `prefect.yaml` file above, you’ll see that there is a deployment declaration named `gcp-weather-deploy`. Within the declaration, the entrypoint for the flow is specified along with some default parameters which will be passed to the flow at runtime. Last but not least, the name of the work pool that we created in step 2 of this guide is specified.
Flow deployment
Once you’re happy with the specifications in the `prefect.yaml` file, run the following command in the terminal to deploy your flow:
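```bash
prefect deploy --name gcp-weather-deploy
```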
Step 5. Flow execution
Find your deployment in the UI, and hit the Quick Run button. You have now successfully submitted a flow run to your Cloud Run worker! If you used the flow script provided in this guide, check the Artifacts tab for the flow run once it completes. You’ll have a nice little weather report waiting for you there. Hope your day is a sunny one!
Recap and next steps
Congratulations on completing this guide! Looking back on our journey, you have:
- Created a Google Cloud service account
- Created a Cloud Run work pool
- Deployed a Cloud Run worker
- Deployed a flow
- Executed a flow