NVIDIA today announced the NVIDIA GPU Cloud (NGC), a cloud-based platform that will give developers convenient access -- via their PC, NVIDIA DGX system or the cloud -- to a comprehensive software suite for harnessing the transformative powers of AI.
Speaking at the eighth annual GPU Technology Conference, NVIDIA CEO and founder Jensen Huang said that NGC will make it easier for developers to access the latest, optimized deep learning frameworks and the newest GPU computing resources.
"We're designing a cloud platform that will unleash AI developers, so they can build a smarter world," said Jim McHugh, vice president and general manager of DGX Systems at NVIDIA. "You can do your best work no matter where you are, using our latest technology in the cloud. It's accelerated computing when and where you need it."
Harnessing deep learning presents two challenges for developers and data scientists. One is the need to gather into a single stack the requisite software components -- including deep learning frameworks, libraries, operating system and drivers. Another is getting access to the latest GPU computing resources to train a neural network.
NVIDIA solved the first challenge earlier this year by combining the key software elements within the NVIDIA DGX-1™ AI supercomputer into a containerized package. As part of the NGC, this package, called the NGC Software Stack, will be more widely available and kept updated and optimized for maximum performance.
To address the hardware challenge, NGC will give developers the flexibility to run the NGC Software Stack on a PC (equipped with a TITAN X or GeForce® GTX 1080 Ti), on a DGX system or from the cloud.
NGC will accelerate and simplify deep learning development, making it easier for developers to conduct training, experimentation and deployment. Developers will be able to design more sophisticated neural networks, process more data, iterate quickly and get to market faster.
NGC will offer the following benefits:
- Purpose Built: Designed for deep learning on the world's fastest GPUs.
- Optimized and Integrated: The NGC Software Stack will provide a wide range of software, including: Caffe, Caffe2, CNTK, MXNet, TensorFlow, Theano and Torch frameworks, as well as the NVIDIA DIGITS™ GPU training system, the NVIDIA Deep Learning SDK (for example, cuDNN and NCCL), nvidia-docker, GPU drivers and NVIDIA® CUDA® for rapidly designing deep neural networks.
- Convenient: With just one NVIDIA account, NGC users will have a simple application that guides them through deep learning workflow projects, whether on a PC, a DGX system or in the cloud.
- Versatile: It's built to run anywhere. Users can start with a single GPU on a PC and add more compute resources on demand with a DGX system or through the cloud. They can import data, set up the job configuration, select a framework and hit run. The output could then be loaded into TensorRT™ for inferencing.
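The "import data, configure, train, iterate" cycle described above can be sketched in miniature. NGC itself is a managed platform rather than a Python API, so the code below is purely illustrative: a generic gradient-descent training loop on a toy dataset, standing in for the kind of job a developer would configure and run.

```python
import numpy as np

# Illustrative only: NGC is a managed platform, not a Python API.
# This generic gradient-descent loop sketches the "import data,
# configure, train, iterate" workflow described above.

rng = np.random.default_rng(0)

# "Import data": a toy regression dataset, y = 3x + 1 plus noise.
X = rng.normal(size=(256, 1))
y = 3.0 * X[:, 0] + 1.0 + 0.1 * rng.normal(size=256)

# "Set up the job configuration": hyperparameters.
lr, epochs = 0.1, 200

# Model parameters: a single weight and bias.
w, b = 0.0, 0.0

# "Hit run": iterate until the model fits.
for _ in range(epochs):
    pred = w * X[:, 0] + b
    err = pred - y
    # Gradients of mean-squared error with respect to w and b.
    w -= lr * 2.0 * np.mean(err * X[:, 0])
    b -= lr * 2.0 * np.mean(err)

print(w, b)  # converges near the true values 3.0 and 1.0
```

In the workflow the bullet describes, the trained model would then be exported and handed to TensorRT for optimized inferencing.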
With NGC, developers can build models of any size or type, using a versatile platform that makes it easier to move models from prototyping to deployment. They can increase or decrease computing resources, and only pay for what they need.
NGC is expected to enter public beta by the third quarter. Pricing will be announced at a later date. Learn more at www.nvidia.com/cloud.