Get started building models even faster by using Public Checkpoints.
Reusable Checkpoints Make Models Even More Flexible
trainML now supports the creation and use of Checkpoints to store immutable versions of large model weight files.
Google Drive Storage Source
Deliver Analytics To Customers Securely With Federated Inference
Analytics providers can now run their models directly on the trainML deployments of their customers. This allows the analytics provider to maintain and protect their intellectual property while providing analytics services inside their customers' secure, private infrastructure.
Take Inference to the Edge with GPU-enabled CloudBender Devices
Run real-time inference workloads on NVIDIA Jetson fully managed by CloudBender™.
Central Storage Controllers for Physical CloudBender Regions
Physical CloudBender™ regions now support running a centralized storage controller similar to cloud regions.
Azure Blob Storage and Container Registry Integration
Integration with Azure Blob Storage and Azure Container Registry is now available natively in trainML.
Private Endpoints for CloudBender Regions
CloudBender™ now allows you to deploy applications as endpoints in your local region, so they are only accessible from inside your infrastructure.
More Flexibility in Worker Output Format
Customers can now disable the automatic archiving of job outputs prior to upload.
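To illustrate what this toggle changes, here is a minimal stdlib-only sketch of the two output modes. The `package_outputs` helper and its `archive` flag are hypothetical names for illustration, not the trainML API: with archiving on, all output files are bundled into a single compressed tarball before upload; with it off, each file is uploaded as-is.

```python
import os
import pathlib
import tarfile
import tempfile

def package_outputs(output_dir: str, archive: bool = True) -> list[str]:
    """Sketch of job-output packaging (hypothetical helper, not trainML's API).

    archive=True  -> one compressed .tar.gz containing every output file
    archive=False -> the individual files, uploaded without archiving
    """
    files = sorted(
        str(p) for p in pathlib.Path(output_dir).rglob("*") if p.is_file()
    )
    if not archive:
        # Archiving disabled: hand back each output file unchanged.
        return files
    # Archiving enabled: bundle everything into a single tarball.
    archive_path = os.path.join(tempfile.mkdtemp(), "outputs.tar.gz")
    with tarfile.open(archive_path, "w:gz") as tar:
        for f in files:
            tar.add(f, arcname=os.path.relpath(f, output_dir))
    return [archive_path]
```

Disabling archiving is useful when downstream systems expect the raw files (for example, individual images or per-record predictions) rather than a tarball they would have to extract first.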
Wasabi Cloud Storage Integration
Wasabi cloud storage has been added as an available storage integration. Wasabi can save you up to 80% on persistent storage compared to AWS and charges no egress or API fees, making it a cost-effective option for storing trainML data.