Oracle Cloud Infrastructure—Powering Bing Conversational Search
By: Craig Kennedy
Yesterday, November 7, Oracle announced it had entered into a multi-year agreement with Microsoft to use Oracle Cloud Infrastructure (OCI) resources to support the growing demand for Bing conversational search.
Oracle and Microsoft—Competitors or Partners?
Oracle and Microsoft both compete for cloud revenue share with the other hyperscale cloud providers; however, these two major players have demonstrated a growing tendency to collaborate.
Less than two months ago, they jointly announced Oracle Database@Azure, which gives users direct access to Oracle database services running on OCI inside Microsoft Azure datacenters.
A History of Collaboration Dating Back to 2019
The collaboration between these two tech giants goes back even further. The companies realized that a broad set of customers use both of their business applications and database offerings, and that enticing those customers to migrate to the cloud required embracing multi-cloud. In 2019, they established high-speed, low-latency, redundant interconnects between their respective cloud platforms.
OCI—Purpose-Built to Perform at Lower Cost
Oracle has been quietly, and in some cases not so quietly, spending billions of dollars to amass high-end NVIDIA GPUs to build out its AI infrastructure. The company has been touting the performance and cost savings of its OCI Supercluster and AI infrastructure, claiming greater scalability (tens of thousands of GPUs per cluster) at 35%–41% lower cost than other cloud providers.
Microsoft—AI Resource Consumption on the Rise
Microsoft cites the explosive growth of Bing conversational search as the reason for the additional AI compute resources; however, I suspect it is a combination of that growth and the accelerating demand for generative AI resources from Azure partners and enterprise users. Microsoft plans to use both its own AI infrastructure and OCI's AI infrastructure to run daily inferencing of the massive AI models that power Bing conversational search.
Bottom Line
There are two takeaways from this announcement:
1) Users in Microsoft Azure are consuming AI resources faster than Microsoft can physically build them, and
2) Oracle has spare capacity for AI workloads that is affordable enough to entice Microsoft to rent rather than buy. Inferencing at this scale requires clusters of tens of thousands of GPUs, and it appears Oracle has what Microsoft needs.