Intro to Model Zoo
Get an overview of the Cerebras Model Zoo, including model portability, modules, and updated directory paths for enhanced usability.
The Cerebras Model Zoo is a comprehensive repository of deep learning models optimized for Cerebras hardware. The collection showcases best practices for building models that take full advantage of the Cerebras Wafer-Scale Cluster. With a focus on models developed in PyTorch, the Model Zoo provides detailed instructions for deploying neural network jobs on Cerebras hardware, guiding users through the compilation, validation, and training processes.
From version 2.2 onwards, the Model Zoo has been reorganized to enhance clarity and usability. If you’ve upgraded from any version prior to 2.2, see Directory Path Updates for more info.
Included Models
The Cerebras Model Zoo includes a variety of models spanning the NLP, vision, and multimodal domains.
To deploy your neural network jobs on the Cerebras Wafer-Scale Cluster, refer to the Quick start guide. This guide will walk you through the necessary steps to compile, validate, and train models from this Model Zoo using your preferred framework.
Model Portability
The Cerebras Model Zoo facilitates model portability, providing tools that help users adapt existing models or craft new ones using Cerebras APIs. It supports a spectrum of users, from beginners to advanced, with varying levels of integration and customization:
Beginners
Start with Cerebras data preprocessing tools and use model implementations found in the Cerebras Model Zoo.
Intermediate Users
Integrate your own data preprocessing methods by referring to the section Create your own data preprocessing.
Advanced Users
Define your own PyTorch model or code using the run function in the Cerebras Model Zoo and the Supported Operations API.
Structure
The Model Zoo is designed to be user-friendly, offering an organized array of models, datasets, tools, and utilities.
Key features include:
- Reorganized models and datasets, categorized by class (NLP, Vision, Multimodal).
- Registry APIs that enable querying of paths and supported combinations for each model.
- Config classes that encapsulate the parameters in YAML files, providing a clear class hierarchy and ensuring the validation of configurations used with models.
- Enhanced data preprocessing tools for NLP models, significantly improving performance.
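To illustrate the config-class idea, here is a minimal sketch of how YAML-style parameters can map onto a validating class hierarchy. The class and field names below are hypothetical, chosen for illustration only; the actual Model Zoo config classes differ.

```python
from dataclasses import dataclass

# Hypothetical config classes -- the real Model Zoo classes differ, but the
# idea is the same: each section of a params YAML file maps to a class that
# validates its own parameters on construction.

@dataclass
class OptimizerConfig:
    learning_rate: float = 0.001

    def __post_init__(self):
        if self.learning_rate <= 0:
            raise ValueError("learning_rate must be positive")

@dataclass
class ModelConfig:
    hidden_size: int = 768
    num_layers: int = 12

    def __post_init__(self):
        if self.hidden_size % 2 != 0:
            raise ValueError("hidden_size must be even")

@dataclass
class RunConfig:
    model: ModelConfig
    optimizer: OptimizerConfig

    @classmethod
    def from_dict(cls, params: dict) -> "RunConfig":
        # In practice this dict would come from parsing a params YAML file.
        return cls(
            model=ModelConfig(**params.get("model", {})),
            optimizer=OptimizerConfig(**params.get("optimizer", {})),
        )

params = {
    "model": {"hidden_size": 512, "num_layers": 6},
    "optimizer": {"learning_rate": 3e-4},
}
config = RunConfig.from_dict(params)
print(config.model.hidden_size)  # -> 512
```

The benefit over passing raw dictionaries around is that invalid parameter combinations fail loudly at construction time, before a compile or training job is launched.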
The structure of the Cerebras Model Zoo is illustrated in the diagram below:
Fig. 7 Cerebras Model Zoo structure
Modules
The principal modules in the Cerebras Model Zoo include:
| Module | Description |
|---|---|
| modelzoo.config | Configuration classes and base-level configuration |
| modelzoo.data | Data processing scripts and loaders for vision and NLP models |
| modelzoo.layers | Layers and modules for model building and adaptation |
| modelzoo.losses | A variety of loss functions for different model training phases |
| modelzoo.models | Implementations of a range of models across the NLP, vision, and multimodal domains |
| modelzoo.data_preparation | Tools for data preprocessing stages |
| modelzoo.tools | Utilities for model conversion and configuration |
| modelzoo.common | Common utilities shared across models |
| modelzoo.trainer | The Trainer API for training and validating Model Zoo models |
This redesign streamlines the user experience, making it easier for ML developers and researchers to explore, experiment, and develop solutions within the Cerebras ecosystem. The documentation further enriches this experience with examples and guidance on using the Model Zoo's Registry APIs and Config Classes throughout model development and deployment.
Directory Path Updates
If you're updating to the latest version of the Model Zoo from any version prior to 2.2, the directory paths have changed. The table below compares the old paths with their new equivalents.
| Old Path (relative to $MODELZOO) | New Path (relative to $MODELZOO) |
|---|---|
| common/pytorch/*, common/run_utils/*, common/model_utils/* | common/ |
| common/pytorch/layers | layers/ |
| common/pytorch/input, common/input | data/ |
| models/nlp/*, vision/pytorch/*, multimodal/pytorch/* | models/nlp/*, models/vision/*, models/multimodal/* |
| data_preparation/* | data_preparation/ |
| common/pytorch/model_utils/ | tools/ |
| fc_mnist/ | fc_mnist/ |
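If you have scripts or notes that reference the old layout, the path mapping above can be expressed as a small prefix-rewrite helper. This is an illustrative sketch built directly from the table, not an official migration tool; the table remains the source of truth.

```python
# Old -> new path prefixes, taken from the table above (relative to $MODELZOO).
# Wildcard rows are treated as prefix rewrites.
PATH_UPDATES = {
    "common/pytorch/layers/": "layers/",
    "common/pytorch/input/": "data/",
    "common/input/": "data/",
    "common/pytorch/model_utils/": "tools/",
    "common/pytorch/": "common/",
    "common/run_utils/": "common/",
    "common/model_utils/": "common/",
    "vision/pytorch/": "models/vision/",
    "multimodal/pytorch/": "models/multimodal/",
}

def update_path(old: str) -> str:
    """Rewrite a pre-2.2 Model Zoo path to its post-2.2 location."""
    # Try the longest prefixes first, so a specific rule such as
    # common/pytorch/layers/ wins over the generic common/pytorch/ rule.
    for prefix in sorted(PATH_UPDATES, key=len, reverse=True):
        if old.startswith(prefix):
            return PATH_UPDATES[prefix] + old[len(prefix):]
    return old  # paths that did not move, e.g. fc_mnist/ or models/nlp/

print(update_path("common/pytorch/layers/attention.py"))  # -> layers/attention.py
```

Running the helper over a list of your own references is a quick way to audit which ones need updating after the upgrade.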