
Documentation

Welcome to the comprehensive user documentation for Ori Global Cloud (OGC). As your trusted partner in the journey towards innovative cloud solutions, OGC is committed to offering a suite of services tailored to meet the ever-evolving needs of machine learning (ML) and artificial intelligence (AI) enthusiasts, practitioners, and enterprises. Our documentation aims to guide you through the full spectrum of OGC services, ensuring a smooth and efficient experience.

OGC's array of services is designed to provide end-to-end solutions for ML and AI applications, from development to deployment and orchestration. Whether you are starting with ML models, looking to leverage high-performance computing, or seeking to streamline your MLOps, our platform is equipped to support your goals:

  • Ori Metal: For those who need the raw power of bare metal, our Ori Metal service provides the foundation for performance-intensive and scalable applications.

  • Ori GPU VMs: The flexibility of on-demand virtual machines with a wide range of GPU configurations, from the most powerful accelerators to cost-optimised options.

  • Kubernetes Services: Explore our Kubernetes offerings, from a serverless option that delivers standard Kubernetes functionality, enhanced with GPU support and accessible through the familiar kubectl, to fully managed Ori GPU Clusters that give you full control over a powerful Kubernetes cluster.

  • Inference Orchestration: Learn how to deploy, scale, and maintain your inference workloads across multiple cloud environments with our Inference Orchestration service. Ensure your models are always running and serving predictions with minimal downtime.

  • ML GPU VMs: Dive into on-demand, pre-configured ML environments on VMs, complete with Jupyter notebooks and a curated set of ML libraries. Get started with your ML projects quickly, accessing these environments via SSH or HTTPS, without the hassle of setup.

  • ML Kubeflow: For flexible MLOps on powerful GPU Kubernetes clusters, this service enables users to run their entire ML/AI operations on distributed infrastructure. Power and simplicity combined for those who just want to get going.
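
As an illustration of the kubectl-based workflow mentioned above, a minimal pod manifest requesting a single GPU might look like the sketch below. The pod name, container image, and GPU count are illustrative assumptions, not OGC defaults; the `nvidia.com/gpu` resource name is the standard Kubernetes device-plugin convention for NVIDIA GPUs.

```yaml
# Hypothetical example: a pod requesting one NVIDIA GPU.
# Name, image, and resource values are illustrative only.
apiVersion: v1
kind: Pod
metadata:
  name: gpu-test
spec:
  restartPolicy: Never
  containers:
    - name: cuda-check
      image: nvidia/cuda:12.3.1-base-ubuntu22.04
      command: ["nvidia-smi"]
      resources:
        limits:
          nvidia.com/gpu: 1
```

Applied with `kubectl apply -f gpu-test.yaml`, the pod runs `nvidia-smi` on the allocated GPU, and `kubectl logs gpu-test` shows the output once it completes.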

Additional Resources

  • Billing and Quotas: Understand how to manage your costs and resources effectively on the OGC platform.

  • User Access Control: Get to know how to securely manage access to your services and resources.

  • APIs and Marketplace: Learn how to integrate OGC services into your workflows with our APIs, and enhance your capabilities through our Marketplace.

Support and Community

At OGC, we believe in not just providing a service, but also in fostering a supportive community. Our documentation is complemented by a community forum where you can discuss features, share insights, and find answers from fellow users and OGC experts.

Getting Started

To begin your journey with OGC, we recommend familiarizing yourself with the documentation corresponding to the service you are interested in. Each section provides detailed instructions, best practices, and tips to help you make the most out of OGC services.

Thank you for choosing Ori Global Cloud. We look forward to supporting your ML and AI endeavors with our cutting-edge cloud services.

Sections

  • Organisation: Manage and visualise your resources with multiple dashboard views;
  • Ori Metal: Bare metal with the most powerful GPUs on the market, for when you need dedicated resources on a private cloud;
  • Ori GPU VMs: On-demand Virtual Machines with GPUs optimised for your performance needs;
  • Ori GPU Kubernetes: Kubernetes services with GPUs and the flexibility to run your applications as you want;
  • Ori Inference Orchestration: Ori's orchestration solution for inference across distributed and multi-cloud environments;
  • Ori ML Services: Run your ML projects simply and easily on powerful compute rigs.