What is multi-model routing?
Multi-model routing is the process of linking multiple AI models together. The routing can be done either in series or in parallel, meaning that you use a router to send prompts to specific models...
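As a minimal sketch of the router idea described above (all model names and the keyword rule are hypothetical, not UbiOps APIs): a rule-based router inspects each prompt and dispatches it to one of several model endpoints.

```python
# Hypothetical multi-model router: a simple rule decides which model
# receives a given prompt. Model names are illustrative placeholders.

def route(prompt: str) -> str:
    """Pick a model name based on a simple keyword/length rule."""
    if "code" in prompt.lower():
        return "code-model"
    if len(prompt) > 200:
        return "large-context-model"
    return "general-model"

# Stand-in "models": in practice these would be calls to deployed endpoints.
MODELS = {
    "code-model": lambda p: f"[code-model] {p}",
    "large-context-model": lambda p: f"[large-context-model] {p}",
    "general-model": lambda p: f"[general-model] {p}",
}

def run(prompt: str) -> str:
    model = route(prompt)          # router step: choose a model
    return MODELS[model](prompt)   # dispatch step: invoke the chosen model
```

A parallel variant would instead fan the prompt out to several entries in `MODELS` and merge the responses.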
Accelerate time-to-results for European NVIDIA AI Sovereign-Hybrid Cloud,...
Amsterdam, 27 June 2024. UbiOps, an AI serving and orchestration platform, has partnered with NEBUL, a modern HPC solutions provider and an official NVIDIA Partner for NVIDIA DGX, GPU and...
New UbiOps features July 2024
On 11 July 2024 we released new functionality and made improvements to our UbiOps SaaS product. An overview of the changes is given below. Python client library version for this release:...
UbiOps vs standard Model Serving Platforms
What does UbiOps deliver beyond standard model serving platforms? Model serving is the process of providing access to production-level models for end-users or applications, meaning that they will be...
Deploy Llama 3.1 8B Instruct on UbiOps
In this guide, we’ll take you through the new release from Meta AI. This update made changes to the existing Llama 3 8B & 70B models, while also releasing a new model with 405B...
Create a chatbot using Llama 3.1, Streamlit and UbiOps
In recent times we’ve seen that open-source LLMs like Mixtral and Llama are starting to rival the performance of some proprietary LLMs. One of the things to consider when working with open-source...
UbiOps vs MLOps platforms
Machine learning operations (MLOps) involve a set of techniques and principles aimed at the design, development, deployment, and maintenance of machine learning models for production use. The purpose...
Data Privacy and AI in Healthcare
Artificial intelligence (AI) has the potential to significantly improve efficiency in the medical field. However, as the healthcare sector has very sensitive data, organizations and regulators need to...
Why are companies opting for on-premise instead of public cloud?
In a LinkedIn post from May, Fergal Mcgovern tries to explain why around 83% of enterprise CIOs plan to place some workloads on-premise instead of in the cloud. Let’s briefly explain what we mean when...
Deploying Phi 3.5 Mini-Instruct to UbiOps using the CLI
In this guide, we will take you through the newly released Phi-3.5 Small Language Models (SLMs), which update Microsoft’s two existing Phi-3 SLMs, Phi-3.5-mini & Phi-3.5-vision, and the...
UbiOps Revolutionizes AI Model Inference Using AMD Instinct
UbiOps, a leading AI and machine learning deployment and serving platform, is thrilled to announce its collaboration with ecosystem partner AMD. UbiOps today announced that it will transform how...
Why is Hybrid Cloud Deployment Useful?
Hybrid cloud is a deployment architecture in which storage or compute capabilities are distributed across on-premise hardware and public or private cloud hardware. In the AI field, this means...
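One common hybrid-cloud pattern the definition above implies can be sketched as a routing decision between endpoints (both URLs and the sensitivity flag are purely illustrative assumptions, not part of any UbiOps API):

```python
# Hypothetical hybrid-cloud dispatch: sensitive workloads stay on the
# on-premise cluster, while other workloads can run in the public cloud.

ENDPOINTS = {
    "on_premise": "https://gpu-cluster.internal/api",        # illustrative
    "public_cloud": "https://cloud-provider.example/api",    # illustrative
}

def select_endpoint(contains_sensitive_data: bool) -> str:
    """Return the endpoint a workload should be sent to."""
    if contains_sensitive_data:
        return ENDPOINTS["on_premise"]
    return ENDPOINTS["public_cloud"]
```

In practice the routing criterion might also be cost, latency, or GPU availability rather than data sensitivity alone.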