Those who buy an Intel notebook with Ice Lake this fall may start to notice something special: a sprinkling of AI magic, first here and there, and then more and more.
This isn’t to say that AI capabilities are exclusive to Ice Lake, or that without it you won’t see drastic improvements in desktop software. But some of the “whoa” moments that app developers are working on require artificial intelligence and machine learning capabilities, which Intel is building into its new 10th-gen Core chip.
Some of those you already see today. Microsoft’s Photos app, for example, uses AI image analysis to come up with its own assessment of what it’s “seeing,” such as a beach scene or snow. Microsoft Photos and Google Photos already identify and group the subjects of your photos, recognizing who’s in them.
But on the PC, “AI” tends to be equated with digital assistants like Cortana. Intel’s trying to redefine how we think of AI.
What will AI do for you?
In a briefing before Computex, Intel showed off other AI examples: stylizing a video in real time as it plays, much like applying a filter in Snapchat; removing unwanted background noise from a call or chat; and accelerating CyberLink PhotoDirector 10’s ability to de-blur photos. Microsoft’s Skype and Teams apps can already pick you out from a video call and blur or even replace the background. AI functionality will make that faster.
What is a Gaussian Neural Accelerator?
Intel’s secret sauce is what it calls the Gaussian Neural Accelerator, a tuned piece of logic found within the Ice Lake chip package, and it works hand in hand with the CPU cores. The CPU architecture adds Deep Learning Boost (DL Boost), a set of instructions that accelerates inferencing on Ice Lake. (Inferencing is the stage where a trained machine-learning model is applied to new data to draw conclusions.) The Gaussian Neural Accelerator, meanwhile, can run under very low-power conditions to perform a specialized task, such as real-time translation of an ongoing conversation.
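Much of DL Boost’s speedup comes from doing inference arithmetic on low-precision (int8) values rather than full 32-bit floats, with the multiply-accumulate happening in integer hardware. As a rough illustration of that idea (the quantization scheme, sizes, and scale factors below are invented for this sketch, not Intel’s implementation), here is the int8 trick in plain NumPy:

```python
import numpy as np

def quantize(x, scale):
    """Map float values to int8 using a simple symmetric scale."""
    return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 8)).astype(np.float32)  # a tiny toy layer
inputs = rng.standard_normal(8).astype(np.float32)

# Full-precision result for comparison.
ref = weights @ inputs

# Quantize both operands to int8, multiply-accumulate in integer
# arithmetic (the part DL Boost-style hardware speeds up), then
# rescale the result back to float.
w_scale = np.abs(weights).max() / 127
x_scale = np.abs(inputs).max() / 127
q_out = quantize(weights, w_scale).astype(np.int32) @ \
        quantize(inputs, x_scale).astype(np.int32)
approx = q_out * (w_scale * x_scale)

print(np.max(np.abs(ref - approx)))  # small quantization error
```

The payoff is that int8 values are a quarter the size of floats, so more of them fit in each vector register and each memory fetch, at the cost of the small rounding error shown above.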
New functions inside a PC typically go through something of a struggle between running on the general-purpose CPU or on a dedicated add-on card. In the early days of the PC, for example, multimedia functions were accelerated by Native Signal Processing and Intel’s Multimedia Instruction Set (MMX), then migrated to dedicated sound and graphics cards. Over time, basic graphics and audio capabilities moved back into the CPU and chipset as it became more cost-effective.
Figuring out where AI will be processed is still in its early stages, too. For now, Intel is hedging its bets, splitting the workload between the CPU’s DL Boost instructions, the Gaussian Neural Accelerator, and the more traditional Iris Plus integrated GPU. “The thing that we’re good at is providing primitives that they [software developers] can build upon,” Ronak Singhal, an Intel fellow, told reporters before Computex.
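The split Intel describes amounts to routing each job to the block it suits best. As a purely illustrative sketch (the device names, workload fields, and selection rules below are invented for this example, not Intel’s or anyone’s actual API), the routing logic might look like this:

```python
def pick_device(workload):
    """Choose an execution unit for an inference job (hypothetical rules)."""
    if workload["always_on"] and workload["power_budget_mw"] < 100:
        return "GNA"          # low-power, specialized tasks (e.g. speech)
    if workload["parallelism"] == "high":
        return "iGPU"         # throughput-oriented image/video work
    return "CPU (DL Boost)"   # latency-sensitive, general-purpose inference

jobs = [
    {"name": "wake-word detection", "always_on": True,
     "power_budget_mw": 50, "parallelism": "low"},
    {"name": "video style transfer", "always_on": False,
     "power_budget_mw": 5000, "parallelism": "high"},
    {"name": "photo de-blur", "always_on": False,
     "power_budget_mw": 2000, "parallelism": "low"},
]

for job in jobs:
    print(job["name"], "->", pick_device(job))
```

In practice, developers wouldn’t write this dispatch themselves; it’s the kind of decision Intel’s “primitives” and middleware would make for them.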
Not the only game in town
Intel, obviously, isn’t alone. In fact, Qualcomm was first to show off how it eliminates background noise within an audio or video call, at the launch of the Snapdragon 8cx last December. AMD’s Lisa Su also told reporters that her company is working on AI inferencing, but offered no details.
For users, though, it’s not quite clear where AI-powered functions will pop up. One example, executives said, is how Adobe Photoshop has grown steadily smarter at identifying the subject of a photo through AI-powered recognition, so it can be automatically moved to another background. But that “magic lasso” function existed before Ice Lake, too.
It probably will be up to the app developers themselves to communicate what capabilities they’re delivering in a particular app or game, and what is expressly powered by AI. If history holds, Nvidia will have something to say about all of this, too. In the mid-2000s, Ageia pitched dedicated hardware accelerators for its PhysX physics engine, which gave games realistic approximations of how objects ricochet and fall in the real world. Those functions were later absorbed into discrete GPUs after Nvidia bought Ageia in 2008. It’s very possible a similar fight will occur over which chip gets AI inferencing inside your PC.