Scale your data science workflows with the Vertex AI Workbench notebook executor

When solving a new ML problem, it’s common to start by experimenting with a subset of your data in a notebook environment. But if you want to execute a long-running job, add accelerators, or run multiple training trials with different input parameters, you’ll likely find yourself copying code over to a Python file to do the actual computation. That’s why we’re excited to announce the launch of the notebook executor, a new feature of Vertex AI Workbench that lets you schedule notebook runs ad hoc or on a recurring basis. With the executor, your notebook is run cell by cell on Vertex AI Training. You can seamlessly scale your notebook workflows by configuring different hardware options, passing in parameters you’d like to experiment with, and setting an execution schedule, all via the Console UI or the Notebooks API.
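As a rough illustration of the API route, here is a minimal sketch of submitting a notebook execution with the google-cloud-notebooks Python client. The project, bucket paths, parameter names, and container image below are hypothetical, and the exact field names of the execution template may vary between client library versions, so treat this as a starting point rather than a reference implementation.

```python
# Sketch: schedule an ad hoc notebook run via the Notebooks API.
# Assumes `pip install google-cloud-notebooks`; paths and values are placeholders.
from google.cloud import notebooks_v1

client = notebooks_v1.NotebookServiceClient()

parent = "projects/your-project/locations/us-central1"  # hypothetical project/region

execution = notebooks_v1.Execution(
    description="Ad hoc training run",
    execution_template=notebooks_v1.ExecutionTemplate(
        input_notebook_file="gs://your-bucket/notebooks/train.ipynb",   # hypothetical path
        output_notebook_folder="gs://your-bucket/executor-output",      # executed copy lands here
        master_type="n1-standard-8",                                    # hardware for the run
        container_image_uri="gcr.io/deeplearning-platform-release/tf2-cpu.2-6",
        # Overrides a cell tagged "parameters" in the notebook; format is an assumption.
        parameters="learning_rate=0.001,epochs=10",
    ),
)

# create_execution returns a long-running operation; result() waits for the
# execution resource to be created, not for the notebook itself to finish.
operation = client.create_execution(
    parent=parent, execution_id="train-run-1", execution=execution
)
print(operation.result())
```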

Built to Scale

Imagine you’re tasked with building a new image classifier. You start by loading a portion of the dataset into your notebook environment and running some analysis and experiments on a small machine. After a few trials, your model looks promising, so you want to train on the full image dataset. With the notebook executor, you can easily scale up model training by configuring a cluster with machine types and accelerators, such as NVIDIA GPUs, that are much more powerful than the current instance where your notebook is running.
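Building on the sketch above, attaching a GPU is a matter of changing the execution template. The accelerator enum and field names here are assumptions based on the google-cloud-notebooks client and may need adjusting for your library version.

```python
# Hypothetical GPU-backed execution template: a larger machine plus one NVIDIA T4.
from google.cloud import notebooks_v1

gpu_template = notebooks_v1.ExecutionTemplate(
    input_notebook_file="gs://your-bucket/notebooks/train.ipynb",   # hypothetical path
    output_notebook_folder="gs://your-bucket/executor-output",
    master_type="n1-standard-16",
    accelerator_config=notebooks_v1.ExecutionTemplate.SchedulerAcceleratorConfig(
        type_=notebooks_v1.ExecutionTemplate.SchedulerAcceleratorType.NVIDIA_TESLA_T4,
        core_count=1,
    ),
    # GPU variant of the deep learning container so CUDA drivers are available.
    container_image_uri="gcr.io/deeplearning-platform-release/tf2-gpu.2-6",
)
```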

Your model training gets a huge performance boost from adding a GPU, and you now want to run a few extra experiments with different model architectures from TensorFlow Hub. For example, you can train a new model using feature vectors from various architectures, such as Inception, ResNet, or MobileNet, all pretrained on the ImageNet dataset. Using these feature vectors with the Keras Sequential API is simple; all you need to do is pass the TF Hub URL for the particular model to hub.KerasLayer.
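For concreteness, here is a short sketch of plugging a TF Hub feature vector into a Keras Sequential model. The MobileNetV2 URL is one example; swap in an Inception or ResNet feature-vector URL to compare architectures. The image size and class count are illustrative assumptions.

```python
import tensorflow as tf
import tensorflow_hub as hub

# ImageNet-pretrained feature extractor from TF Hub, kept frozen.
feature_extractor = hub.KerasLayer(
    "https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/feature_vector/4",
    trainable=False,
    input_shape=(224, 224, 3),
)

model = tf.keras.Sequential([
    feature_extractor,
    tf.keras.layers.Dense(5, activation="softmax"),  # 5 classes, as an example
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

Because the architecture choice is reduced to a single URL, it is a natural parameter to pass in through the executor and sweep across multiple scheduled runs.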
