Apple’s Not So Secret AI Investments Are About to Pay Off
Apple, notoriously tight-lipped about its future plans, seems poised to capitalize on the coming shift to edge computing – a requirement for many AI applications such as intelligent assistants. From the blazing-fast M4 chip rumored to power upcoming devices to the reported development of dedicated AI data center chips, this blog discusses the work Apple has been doing to prepare for the age of AI and the edge.
The M4 Processor and New AI Data Center Chip Put Others on Notice
A few years ago, new MacBooks took forever to get refreshed – in part due to Intel’s inability to update its chipsets to support Apple’s product needs. Enter the System on a Chip.
What is amazing about the new M4 processor isn’t the specs alone – it is that it is already shipping, less than a year after the M3 processor debuted in MacBooks and iPads. Given Apple’s use of the Arm architecture, there are no signs of it slowing down.
The M4 processor is fast – in fact, just-released benchmarks show that it is 16% faster than Intel’s flagship Core i9 desktop processor. Why? Look at the components of its all-in-one System on a Chip design: the M4 features a 10-core CPU and a 10-core GPU, plus a 16-core Neural Engine capable of 38 TOPS. The question now is how much faster the M5 will be.
Apple’s Quiet Shift to GPUs Started Years Ago
Apple’s shift to custom-designed, GPU-heavy System on a Chip (SoC) architectures across iPhones, iPads, and Macs has never been just about faster graphics. This move, years in the making, paved the way for dedicated on-device AI processing. Rather than relying solely on cloud-based AI, Apple seems positioned to leverage machine learning models right on your device.
Apple Has a Lead in GPUs for Edge Computing
Apple’s strategic investments point towards a future where AI isn’t just an add-on, but a core component of its ecosystem. Hints of this focus can be seen in the groundwork for on-device AI within iOS 18.
Apple’s data center processor effort appears to be based on its M2 Ultra processor, with reports from Bloomberg stating that the new M4 mentioned above will quickly replace the M2. Given the M4’s performance benchmarks, that makes perfect sense.
Small Language Models are Coming
Apple has hinted strongly that small language models (SLMs) will run on iOS 18. This means that local, on-device generative AI is coming to iPhones, iPads, and Macs. We expect this to enable lightning-fast, tailored responses and insights without needing to query a large language model (LLM) in the cloud every time. This shift could elevate Apple’s user experience to an entirely new level.
While Microsoft has made significant strides with AI integration into Windows and Office, Apple’s unique control over both hardware and software could offer an edge. Deep integration between its AI-powered chips, operating systems, and applications could enable unmatched AI capabilities within its ecosystem.
Bottom Line
Apple has been quietly laying the foundation for a significant AI revolution. While details remain shrouded in secrecy, the signs are clear. Enterprises need to understand that the future of AI processing will reside as much on local devices as it does in the cloud. This could be a pivotal turning point, redefining the role of AI in personal computing.