
Oracle Cloud Infrastructure—Powering Bing Conversational Search

By: Craig Kennedy

 

Yesterday, November 7, Oracle announced it had entered into a multi-year agreement with Microsoft to use Oracle Cloud Infrastructure (OCI) resources to support the growing demand for Bing conversational search.

Oracle and Microsoft—Competitors or Partners?

Oracle and Microsoft both compete with the other hyperscale cloud providers for cloud revenue share; however, these two major players have demonstrated a growing tendency to collaborate.

Less than two months ago, they jointly announced Oracle Database@Azure, which gives users direct access to Oracle database services running on Oracle Cloud Infrastructure (OCI) inside Microsoft Azure datacenters.

History of Collaboration back to 2019

The history of collaboration between these two tech giants goes back even further. They realized they share a broad overlap of customers using their business apps and database offerings, and that enticing those customers to migrate to the cloud would require embracing multi-cloud and establishing high-speed, low-latency, redundant interconnects between their respective cloud platforms, which they did back in 2019.

OCI—Purpose Built to Perform at Less Cost

Oracle has been quietly, and in some cases not so quietly, spending billions of dollars to amass high-end GPUs from NVIDIA to build out its AI infrastructure. Oracle has been touting the performance and cost savings of its OCI Supercluster and AI Infrastructure, claiming greater scalability, with support for tens of thousands of GPUs per cluster, and pricing 35% to 41% lower than other cloud providers.
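To put the claimed savings in perspective, here is a minimal back-of-the-envelope sketch in Python. The GPU-hour rate, cluster size, and utilization below are hypothetical placeholders, not published Oracle or Microsoft pricing; only the 35% to 41% discount range comes from Oracle's claim.

```python
# Back-of-the-envelope cost comparison for a large GPU cluster.
# All inputs below are hypothetical assumptions for illustration;
# only the 35%-41% savings range reflects Oracle's public claim.

HOURS_PER_MONTH = 730

def monthly_gpu_cost(gpu_count: int, price_per_gpu_hour: float,
                     utilization: float = 1.0) -> float:
    """Estimated monthly spend for a cluster of GPUs."""
    return gpu_count * price_per_gpu_hour * HOURS_PER_MONTH * utilization

# Hypothetical baseline: 20,000 GPUs at an assumed $2.50/GPU-hour list rate.
baseline = monthly_gpu_cost(gpu_count=20_000, price_per_gpu_hour=2.50)

# Apply the claimed 35%-41% savings to the same workload.
for savings in (0.35, 0.41):
    oci_estimate = baseline * (1 - savings)
    print(f"At {savings:.0%} savings: "
          f"${baseline/1e6:.1f}M -> ${oci_estimate/1e6:.1f}M per month")
```

At cluster sizes in the tens of thousands of GPUs, even the low end of that range translates into a difference of millions of dollars per month, which helps explain why renting capacity elsewhere can be attractive.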

Microsoft—AI Resource Consumption on the Rise

Microsoft cites the explosive growth of Bing conversational search as the reason for the additional AI compute resources; however, I suspect it is a combination of that growth and the accelerating demand for generative AI resources from Azure partners and enterprise users. Microsoft plans to use both its own AI infrastructure and OCI's AI infrastructure to run daily inferencing of Microsoft's massive AI models that power Bing conversational search.

Bottom Line

There are a couple of takeaways from this announcement.

1) Users in Microsoft Azure are consuming AI resources faster than Microsoft can physically build them, and

2) Oracle has spare AI infrastructure capacity that is affordable enough to entice Microsoft to rent rather than buy. Inference at this scale requires clusters of tens of thousands of GPUs, and it appears Oracle has what Microsoft needs.
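As a rough illustration of why inference at this scale consumes tens of thousands of GPUs, the sketch below divides an assumed daily query volume by an assumed per-GPU throughput. Every number in it is a hypothetical placeholder; neither Microsoft nor Oracle has published these figures.

```python
# Rough sizing sketch: how many GPUs might conversational search inference need?
# Every figure here is a hypothetical assumption for illustration only.

import math

def gpus_needed(queries_per_day: float,
                tokens_per_query: float,
                tokens_per_second_per_gpu: float,
                peak_to_average: float = 2.0) -> int:
    """Estimate GPUs required to serve a steady token stream, with headroom for peaks."""
    tokens_per_second = queries_per_day * tokens_per_query / 86_400
    return math.ceil(tokens_per_second / tokens_per_second_per_gpu * peak_to_average)

# Assumed: 100M conversational queries/day, ~1,000 generated tokens per query,
# and ~50 tokens/second of effective throughput per GPU for a large model.
print(gpus_needed(queries_per_day=100e6,
                  tokens_per_query=1_000,
                  tokens_per_second_per_gpu=50))  # lands in the tens of thousands
```

Under those assumptions the estimate comes out around 45,000 GPUs, which is consistent with the order of magnitude the announcement implies.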


See the Future of Unified Communications and Collaboration with CEO and Lead Analyst Jim Lundy!

Airing LIVE on Thursday, November 16th at 10 AM PT | 1 PM ET

 

The Future of Communications and Collaboration

The second largest market in enterprise software is Communications and Collaboration. With a growing focus on AI and on enabling users to interact via video, voice, or messaging, Communications and Collaboration is poised for the most change as video becomes pervasive and messaging takes over from email.

Join Aragon Research’s Founder and CEO, Jim Lundy, on Thursday, November 16th, 2023, as he talks about what the future of UC&C looks like.

Key issues being covered include:

Register Here


 

Blog 1: Introducing the Digital Operations Blog Series

Blog 2: Digital Operations: Keeping Your Infrastructure Secure

Blog 3: Digital Operations: Cloud Computing

Blog 4: Cybersecurity Attacks Have Been Silently Escalating

Blog 5: Automation—The Key to Success in Today’s Digital World

Blog 6: Infrastructure—Making the Right Choices in a Digital World

Blog 7: Open-Source Software—Is Your Supply Chain at Risk?

Blog 8: IBM AIU—A System on a Chip Designed For AI

Blog 9: IBM Quantum: The Osprey Is Here

Blog 10: The Persistence of Log4j

Blog 11: AWS re:Invent 2022—Focus on Zero-ETL for AWS

Blog 12: AWS re:Invent 2022—The Customer Is Always Right

Blog 13: How Good is the New ChatGPT?

Blog 14: The U.S. Department of Defense Embraces Multi-Cloud

Blog 15: 2022 Digital Operations—The Year in Review

Blog 16: Lucky Number 13 for Intel—Intel Is Back on Top

Blog 17: Quantum Decryption—The Holy Grail for Cybercriminals

Blog 18: Microsoft and OpenAI—Intelligent Partnership

Blog 19: ChatGPT—The First One Is Free

Blog 20: Bing and ChatGPT—Your Co-Pilot When Searching the Web

Blog 21: ESXiArgs—Ransomware Attack on VMware

Blog 22: The Cost of Supply Chain Security—$250M in Sales

Blog 23: OpenAI Delivers on APIs—Accelerating the Adoption of ChatGPT

Blog 24: OpenAI Delivers on Plugins—Is ChatGPT The New Generative Content Platform?

Blog 25: Microsoft Security Copilot—Defending the Enterprise at the Speed of AI

Blog 26: Operation Cookie Monster Takes a Huge Bite Out of The Dark Web

Blog 27: AWS Bedrock—Amazon’s Generative AI Launch

Blog 28: Google Cloud Security AI Workbench – Conversational Security

Blog 29: World Password Day – Is This the Last Anniversary

Blog 30: Intel Partners to Enter the Generative AI Race—Aurora genAI

Blog 31: Charlotte AI – CrowdStrike Enters the Generative AI Cybersecurity Race

Blog 32: NICE Catches the Generative AI Wave

Blog 33: AMD Instinct MI300X—A New Challenger to Nvidia

Blog 34: Storm-0558—Chinese Cyber Attack on US Government Organizations

Blog 35: Network Resilience Coalition—Making the Network Safer

Blog 36: Frontier Model Forum—Power Players Unite to Make AI Safer

Blog 37: Intel Is Back in the Foundry Business—Entering the Angstrom Era

Blog 38: Check Point Acquires Perimeter 81—Securing Remote Access

Blog 39: PSA—Akira Ransomware Targeting Cisco VPNs

Blog 40: Volt Typhoon: A Chinese Cyberwar Threat to Critical US Infrastructure
