Frequently Asked Questions

This page answers common questions and addresses frequently encountered issues.

1. What is Ori Global Cloud (OGC)?

OGC is an end-to-end AI/ML cloud platform specializing in GPU resources and services tailored for machine learning (ML) and artificial intelligence (AI) applications. It aims to empower developers, data scientists, and organizations with advanced cloud infrastructure for building, training, and deploying AI applications at scale.

2. How do I get started with using Ori Global Cloud?

To start using OGC, all you need to do is sign up!

  • Create an account by following the sign-up instructions here.
  • Verify your email by clicking on the confirmation link sent to your address.
  • Log in to explore and utilize the services offered by OGC.

3. What libraries come preinstalled on OGC instances?

You’ll soon be able to spin up instances that come preinstalled with CUDA, Python 3.10, TensorFlow, TensorRT, and PyTorch. Stay tuned for upcoming releases!
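
Once such an image is available, you could sanity-check the stack from Python with a few lines like the sketch below. This assumes PyTorch and TensorFlow are present on the image; the exact contents may change between releases.

```python
# Quick sanity check on a GPU instance (assumes PyTorch and TensorFlow are installed).
import tensorflow as tf
import torch

print("PyTorch CUDA available:", torch.cuda.is_available())
print("TensorFlow GPU devices:", tf.config.list_physical_devices("GPU"))
if torch.cuda.is_available():
    print("GPU name:", torch.cuda.get_device_name(0))
```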

4. I'm unfamiliar with SSH. How do I use the .pem private key?

See our documentation on adding, generating, and deleting SSH keys using the Cloud dashboard. You can also watch our getting started video tutorial.
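
In short, you pass the key to your SSH client when connecting, typically `chmod 400 your-key.pem` followed by `ssh -i your-key.pem <user>@<vm-ip>`. If you prefer to script the connection, here is a minimal Python sketch using the paramiko library; the IP address, username, and key path are placeholders rather than OGC-specific values.

```python
# Minimal sketch: connect to a VM with a .pem private key and run a command.
import os
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    hostname="203.0.113.10",                             # placeholder: your VM's public IP
    username="ubuntu",                                   # placeholder: default user varies by image
    key_filename=os.path.expanduser("~/.ssh/your-key.pem"),
)
_, stdout, _ = client.exec_command("nvidia-smi")          # check that the GPU is visible
print(stdout.read().decode())
client.close()
```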

5. What are the main services offered by Ori Global Cloud?

OGC offers a range of services, including:

  • Ori GPU Metal: Provides reserved compute power with direct access to physical GPU hardware.
  • Ori GPU VMs: Virtual machines equipped with high-performance GPUs designed for ML and AI tasks.
  • Ori GPU Serverless: Offers high-performance GPU resources in a serverless framework, ideal for on-demand tasks.
  • Ori GPU Clusters: Integrates powerful GPUs to manage demanding AI workloads efficiently.
  • Ori ML Services (MLaaS): A comprehensive service designed to implement and scale ML and AI initiatives swiftly.

6. How can I gain access to the platform?

To access our platform, please sign up here. Simply follow the registration process, which involves entering your details and verifying your email. Once registered, you can log in and start exploring the wide range of services and tools we offer.

7. How do I report a bug or issue?

If you encounter a bug or need help troubleshooting an issue, you can raise a support case through our dedicated support portal. Our support team will address your query promptly.

8. How do I request access to GPU resources?

To request access to our GPU resources, please contact our team here. We offer tailored GPU cloud solutions to fit your specific project needs. Our team will work with you to understand your requirements and provide the appropriate GPU resources to ensure your projects run efficiently.

9. How do I request an increase in my service quotas?

If you find that your current service quotas do not meet your project's needs, you can request an increase directly through the platform, or by raising a ticket through our dedicated support channel. Select the ‘Licensing and billing questions’ option when raising your ticket, and include details such as your project requirements and expected usage so we can best support you.

10. What advantages do Ori GPU VMs offer for machine learning tasks?

Ori GPU VMs are optimized for ML and AI applications, providing the computational horsepower necessary for high-speed data processing, deep learning training, and advanced analytics. These VMs help in accelerating model training times and enhancing overall performance for AI-driven projects.
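
As a rough illustration of what GPU acceleration looks like in practice, the sketch below runs a single forward and backward pass on the GPU with PyTorch. It assumes a CUDA-enabled PyTorch install and is not specific to Ori's images.

```python
import torch

# Use the GPU if one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1024, 10).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(64, 1024, device=device)    # dummy batch
loss = model(x).pow(2).mean()               # dummy loss
loss.backward()
optimizer.step()
print("Completed one training step on", device)
```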

11. How does Ori GPU Serverless support AI/ML workloads?

The GPU Serverless service by Ori enables efficient and scalable deployment of AI-driven applications without the hassle of managing infrastructure. It automates resource scaling, ensuring cost-effective and performance-optimized usage of GPU resources.

12. What is the purpose of Ori GPU Clusters?

Ori GPU Clusters are specifically designed to handle the most demanding AI workloads, providing scalability and flexibility through Kubernetes combined with the raw power of GPU computing. This service simplifies the deployment and management of AI projects, allowing for efficient execution regardless of complexity.
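
Because these clusters are standard Kubernetes under the hood, you can inspect them with the usual tooling. The sketch below uses the official Kubernetes Python client to list the GPU capacity advertised by each node; it assumes you have a kubeconfig for the cluster and that the NVIDIA device plugin exposes the nvidia.com/gpu resource, and it is a generic example rather than an Ori-specific API.

```python
# List GPU capacity per node (assumes `pip install kubernetes` and a valid kubeconfig).
from kubernetes import client, config

config.load_kube_config()          # reads the kubeconfig for your cluster
v1 = client.CoreV1Api()

for node in v1.list_node().items:
    gpus = node.status.capacity.get("nvidia.com/gpu", "0")
    print(f"{node.metadata.name}: {gpus} GPU(s)")
```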

13. How is billing handled on Ori Global Cloud?

Billing for Ori Global Cloud services is based on the resources used. Detailed information about billing and payment methods can be found in the Billing and Payments section of the OGC documentation.

14. How can I inquire about billing options for my company account?

For large accounts or enterprise-level services that require custom billing solutions, get in touch with our billing department by raising a ticket in our dedicated support channel (select the ‘Licensing and billing questions’ option).

15. Is there a free trial available for Ori Global Cloud?

New users can join Ori Global Cloud for free to explore and experiment with the platform's capabilities. However, accessing full features such as spinning up GPU infrastructure requires subscribing to a paid plan.

16. Where can I find tutorials and documentation for Ori Global Cloud?

OGC offers extensive resources to help users get started and make the most of the platform, including a Quick Start Guide, a Basic Concepts overview, detailed How-To Guides, and comprehensive documentation available on the OGC website.

17. How can I get support if I encounter issues with Ori Global Cloud?

You can reach out directly by raising a ticket through our dedicated support channel.

18. Where can I find compliance information about Ori?

Ori’s control plane is ISO 27001 compliant, with SOC 2 compliance expected in May 2024. All of Ori’s data centers are ISO 27001 and/or SOC 2 compliant. See detailed information on our compliance page.

19. How can I contact support if I wish to make a complaint?

You can reach out directly by raising a ticket through our dedicated support channel.

OGC Services

20. What is Ori Metal and who should use it?

Ori Metal provides direct access to bare metal resources, offering the raw computing power needed for performance-intensive applications. This service is ideal for users who require scalable, high-performance computing without the overhead of virtualization, such as those running large-scale simulations or needing full control over their computing environment for compliance reasons.

21. How do Ori GPU VMs cater to different computing needs?

Ori GPU VMs offer a flexible, on-demand virtual machine service with a variety of GPU configurations to suit a range of needs, from ultra-high performance to cost-optimized options. This service is perfect for users who need customizable computing resources for ML and AI projects that can scale with their requirements.

22. Can you explain the Kubernetes services offered by OGC?

OGC provides two main Kubernetes services:

  • Serverless Kubernetes: This offers all the functionalities of Kubernetes with additional GPU support, suitable for those familiar with kubectl but seeking enhanced scalability without managing the underlying infrastructure.
  • Ori GPU Clusters: These are fully managed Kubernetes clusters equipped with powerful GPUs, ideal for users who need to control complex applications and workloads requiring extensive computation.

23. What are ML GPU VMs and how do they simplify ML project setups?

ML GPU VMs are pre-configured virtual environments that come equipped with Jupyter notebooks and a curated selection of ML libraries. They are designed to help users quickly start their ML projects without the hassle of setting up software environments. Users can access these VMs via SSH or HTTPS, making them a convenient option for rapid deployment and iterative testing.
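
If you connect over SSH, a common pattern is to forward the VM's Jupyter port to your local machine so the notebook opens in your own browser. The sketch below uses the sshtunnel Python package; the IP address, username, key path, and port are placeholders (Jupyter's default port 8888 is assumed), and your instance details may differ.

```python
# Forward the VM's Jupyter port to localhost:8888 over SSH.
from sshtunnel import SSHTunnelForwarder

with SSHTunnelForwarder(
    ("203.0.113.10", 22),                       # placeholder: your VM's public IP
    ssh_username="ubuntu",                      # placeholder: default user varies by image
    ssh_pkey="/path/to/your-key.pem",           # placeholder: your .pem private key
    remote_bind_address=("127.0.0.1", 8888),
    local_bind_address=("127.0.0.1", 8888),
) as tunnel:
    print("Open http://localhost:8888 in your browser")
    input("Press Enter to close the tunnel...")
```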

24. How does ML Kubeflow support ML operations?

ML Kubeflow leverages the power of Kubernetes to provide a flexible and powerful platform for managing ML workflows. This service allows users to run their entire ML/AI operations across distributed infrastructures, simplifying the process from data preparation to model deployment. It's especially suited for users looking for an integrated platform that supports a wide range of ML frameworks and tools while offering the scalability necessary for complex projects.
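
For context, a Kubeflow Pipelines workflow is typically defined in Python with the kfp SDK and compiled into a spec that the cluster can execute. The sketch below is a generic, minimal kfp v2 example rather than an Ori-specific API; the component logic and names are purely illustrative.

```python
# Minimal Kubeflow Pipelines sketch (assumes `pip install kfp`, v2 SDK).
from kfp import dsl, compiler

@dsl.component
def preprocess(text: str) -> str:
    return text.lower()

@dsl.component
def train(data: str) -> str:
    return f"model trained on: {data}"

@dsl.pipeline(name="minimal-ml-pipeline")
def ml_pipeline(text: str = "Hello Kubeflow"):
    step = preprocess(text=text)
    train(data=step.output)

# Produces a YAML spec that can be uploaded to a Kubeflow Pipelines deployment.
compiler.Compiler().compile(ml_pipeline, "pipeline.yaml")
```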