UbiOps, the widely used artificial intelligence deployment and serving platform, is adding support for Graphcore IPUs to help developers run large AI jobs faster and more cost-efficiently.
The rollout of UbiOps, initially for IPUs on Gcore’s cloud service, allows developers to train and run AI models, and implement MLOps best practices without the need to configure and maintain complex compute infrastructure.
UbiOps acts as a serverless AI compute platform, handling the serving, workflow management and dynamic scaling of companies' AI workloads. This way, AI teams can focus more on their core tasks and spend less time on resource-intensive DevOps and IT work.
UbiOps' underlying multi-cloud architecture enables teams to run and orchestrate AI workloads on state-of-the-art hardware with the click of a button; Graphcore IPUs on Gcore cloud are the latest addition.
UbiOps users can easily create AI microservices and pipelines for both deployment and training workloads, deploy them on IPU-enabled nodes, and scale their IPU usage dynamically as their needs change.
This proves especially effective for large, modern AI workloads like GenAI models and LLM fine-tuning.
UbiOps CEO Yannick Maltha welcomed the partnership, saying: “Graphcore IPUs are emerging as a powerful and cost-effective platform for AI-accelerated compute to build the next generation of AI solutions.”
Victor Pereboom, CTO at UbiOps, said: "This partnership will enable businesses to run large AI workloads like GenAI models and LLM fine-tuning faster while optimizing their costs at the same time. Graphcore's specialized IPUs orchestrated by Gcore's cost-effective cloud provide a powerful computing architecture that data science teams can utilize with a click of a button in UbiOps."
The growing IPU ecosystem
The accelerating adoption of Graphcore IPU compute by AI-centric businesses and those conducting leading-edge research is being supported by a growing ecosystem of developer tools as well as an emerging layer of AI-as-a-Service.
Graphcore’s software stack is fully integrated with major AI frameworks, including PyTorch and PyTorch Geometric, TensorFlow, Keras and PaddlePaddle, while Hugging Face Optimum provides a wide range of IPU-optimised models.
The performance and standout economics of Graphcore IPUs are also proving an attractive proposition to new AI-centric businesses building higher-level AIaaS offerings, including NLP business insights platform Pienso, foundation model creator Aleph Alpha, and NLP Cloud, a provider of convenient API-based models.
Visit UbiOps.com to find out how UbiOps and IPU-enabled compute can accelerate your next AI project.