Constructing a Vertex AI Pipeline with a custom training container and a model serving container

Sanger Steel

Apr 15, 2022, 4:31:45 PM
to cloud-nl-discuss

Hi there:

I'd like to train a model with a training-app container that I've built and pushed to my Artifact Registry. I want to deploy the model with a Flask app whose /predict route can handle some logic, not just run prediction on an input JSON. It'll also need a /healthz route, I understand. So basically I want a pipeline that runs a training job using a model-training container I build, then deploys the model behind a Flask serving container I build. Looking around on Stack Overflow, I wonder if this question's pipeline has the layout I'll eventually want.

I'm hoping that model_serving_container_image_uri and serving_container_image_uri both refer to the URI of the model serving container I'm going to build. I've already made a training container that trains a model and saves saved_model.pb to Google Cloud Storage. Other than a Flask app that handles the prediction and health-check routes, and a Dockerfile that exposes a port for the Flask app, what else do I need so the serving container works in this pipeline? Where in the code do I install the model from GCS? In the Dockerfile? How is the model serving container meant to work so that everything goes swimmingly when the pipeline is constructed? I'm having trouble finding tutorials or examples of precisely this, even though it seems like a pretty common scenario.

oakinlaja

Apr 15, 2022, 7:56:39 PM
to cloud-nl-discuss
Hello, 

Have you gone through this article[0]? My understanding of your use case is that you would like to create your own custom container for training (or rather, it sounds like you have already created one). If that's right, then go through the details of the link[0], which gives an overview of using custom containers with Vertex AI. This doc[1] gives more information specifically about creating a custom container image, which provides the most flexibility for training on Vertex AI.

Vertex AI only supports custom training with container images on Artifact Registry, Container Registry, or Docker Hub. From the information you provided, are your container images in GCS, or am I misunderstanding the info provided? This guide[1] focuses on using Artifact Registry with Vertex AI.
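On the URI question from the original post: in the `google-cloud-aiplatform` Python SDK, `model_serving_container_image_uri` (on `CustomContainerTrainingJob`) and `serving_container_image_uri` (on `Model.upload`) both name the serving image you build, so yes, point both at the Flask-app container. A rough sketch of wiring the two images together; every project, bucket, and image value below is a placeholder, not a real resource:

```python
# Sketch only: all project/bucket/image values are placeholders.
TRAINER_IMAGE = "us-central1-docker.pkg.dev/my-project/my-repo/trainer:latest"
SERVING_IMAGE = "us-central1-docker.pkg.dev/my-project/my-repo/server:latest"


def train_and_deploy():
    from google.cloud import aiplatform  # pip install google-cloud-aiplatform

    aiplatform.init(
        project="my-project",
        location="us-central1",
        staging_bucket="gs://my-bucket",
    )
    job = aiplatform.CustomContainerTrainingJob(
        display_name="my-custom-train",
        container_uri=TRAINER_IMAGE,  # your training image
        model_serving_container_image_uri=SERVING_IMAGE,  # your Flask image
        model_serving_container_predict_route="/predict",
        model_serving_container_health_route="/healthz",
    )
    # Vertex AI sets AIP_MODEL_DIR under base_output_dir; have the trainer
    # write saved_model.pb there so the resulting Model resource can find it.
    model = job.run(
        base_output_dir="gs://my-bucket/model-output",
        replica_count=1,
        machine_type="n1-standard-4",
    )
    return model.deploy(machine_type="n1-standard-4")


if __name__ == "__main__":
    train_and_deploy()
```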

Sanger Steel

Apr 18, 2022, 12:50:03 PM
to cloud-nl-discuss
My custom container images are in the Artifact Registry. I want a custom container for training, but I also ultimately want to be able to deploy an endpoint where I have a flask app that handles some logic in the /predict path.
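For context, the serving image's Dockerfile would look roughly like this; the base image, file names, and requirements are placeholders, and the Flask app itself reads the AIP_* variables Vertex injects at runtime:

```dockerfile
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # e.g. flask, tensorflow
COPY app.py .
# Vertex AI injects AIP_HTTP_PORT (default 8080); EXPOSE is documentation only
EXPOSE 8080
CMD ["python", "app.py"]
```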

Muhammad Sarder

Apr 19, 2022, 6:22:54 PM
to cloud-nl-discuss

Hello, 

Thank you for providing more information and context. Could you please clarify what you mean by deploying an endpoint? Do you mean using private endpoints[0] to serve online predictions with Vertex AI?

Please clarify the details so we can have a better understanding of your intentions. 


[0]https://cloud.google.com/vertex-ai/docs/predictions/using-private-endpoints
