25/02/2023

The iPhone maker’s purchase of the startup Xnor.ai is the latest move in a trend toward computing on the “edge” rather than in the cloud.

Apple dropped $200 million this week on a company that makes lightweight artificial intelligence. It’s all about keeping an edge in AI … by adding more AI to the edge.
The acquisition of Xnor.ai, a Seattle startup working on low-power machine learning software and hardware, points to a key AI battleground for Apple and other tech heavyweights: packing ever more intelligence into smartphones, smartwatches, and other smart devices that do computing on the edge rather than in the cloud. And doing it without killing your battery.
“Machine learning is going to happen at the edge in a big way,” predicts Subhasish Mitra, a professor at Stanford who is working on low-power chips for AI. “The big question is how do you do it efficiently? That requires new hardware technology and design. And, at the same time, new algorithms as well.”
The most powerful AI algorithms tend to be large and power-hungry when run on general-purpose chips. But a growing number of startups, Xnor.ai among them, have begun devising ways to pare down AI models and run them on extremely energy-efficient, highly specialized hardware.
Last March, Xnor.ai demoed a computer chip capable of running image recognition using only the power from a solar cell. A research paper authored by the founders of Xnor.ai and posted online in 2016 describes a more efficient form of convolutional neural network, a machine learning tool that is particularly well suited to visual tasks. The researchers shrank the network by replacing its full-precision weights and operations with much simpler binary approximations of the interplay among its layers.
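To make the idea concrete, here is a minimal, hypothetical sketch (not Xnor.ai’s code) of the kind of weight approximation the paper describes: a block of full-precision weights is replaced by a single scaling factor and a grid of +1/−1 values, so that on suitable hardware the expensive multiplications collapse into cheap bitwise operations. The function and variable names are illustrative.

```python
import numpy as np

def binarize_weights(w):
    """Approximate full-precision weights w with alpha * sign(w),
    where alpha is the mean absolute value. This is the general
    binary-weight idea described in the 2016 paper, heavily simplified."""
    alpha = np.abs(w).mean()          # one scaling factor for the whole block
    b = np.sign(w)                    # +1 / -1 entries
    b[b == 0] = 1                     # keep every weight strictly binary
    return b, alpha

# Toy comparison: a full-precision dot product vs. its binary approximation.
rng = np.random.default_rng(0)
w = rng.normal(size=128)              # pretend these are one filter's weights
x = rng.normal(size=128)              # an input patch

b, alpha = binarize_weights(w)
full = np.dot(w, x)
approx = alpha * np.dot(b, x)         # on dedicated hardware: XNOR + popcount
print(full, approx)
```

The approximation is cruder than the original network, but storing one bit per weight instead of 32 is what makes it plausible to run such models on a chip powered by a solar cell.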
Apple already makes chips that perform certain AI tasks, like recognizing the wake phrase “Hey, Siri.” But its hardware will need to become more capable without draining your battery. Apple did not respond to a request for comment.
Now, AI on the edge means running pretrained models that do a specific task, such as recognizing a face in a video or a voice in a call. But Mitra says it may not be long before we see edge devices that learn, too. This could let a smartphone or another device improve its performance over time, without sending anything to the cloud. “That would be truly exciting,” he says. “Today most devices are essentially dumb.”
Applying AI to video more efficiently, as Xnor.ai has demoed, will also be key for Apple, Google, and anyone working in mobile computing. Cameras and related software are a key selling point for iPhones and other smartphones, and video-heavy apps like TikTok are popular among younger smartphone customers. Edge computing has the added benefit of keeping personal data on your device, instead of sending it to the cloud.
Dave Schubmehl, an analyst with the research firm IDC, says machine learning could also be used in Apple gadgets that currently don’t include AI. “I can see them running AI on the Apple Watch and in AirPods, to clean up sound for example,” he says. “There’s tremendous opportunity in existing products.”
Running sophisticated AI on video, like an algorithm that can tell what’s happening in a scene or add complex special effects, is usually done in the cloud because it requires a significant amount of computing power. For example, “adding synthetic depth of field to your photos might require running a deep network to estimate the depth of each pixel,” says James Hays, a professor at Georgia Tech who specializes in computer vision.
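As a rough illustration (a toy sketch, not Apple’s or Xnor.ai’s pipeline), once a network has produced a per-pixel depth estimate, the depth-of-field effect itself is simple: keep pixels near the chosen focal depth sharp and blend everything else with a blurred copy of the image. The depth map below is a random placeholder standing in for a network’s output.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_depth_of_field(image, depth, focus_depth, tolerance=0.1):
    """Blend a sharp image with a blurred copy, keeping pixels whose
    estimated depth is near focus_depth sharp and blurring the rest.
    `depth` would come from a per-pixel depth-estimation network."""
    blurred = gaussian_filter(image, sigma=(5, 5, 0))    # blur rows/cols, not channels
    in_focus = np.abs(depth - focus_depth) < tolerance   # mask of sharp pixels
    mask = in_focus[..., np.newaxis].astype(image.dtype)
    return mask * image + (1 - mask) * blurred

# Toy example with random data standing in for a photo and its depth map.
rng = np.random.default_rng(0)
image = rng.random((64, 64, 3))
depth = rng.random((64, 64))            # 0 = near, 1 = far (placeholder values)
result = synthetic_depth_of_field(image, depth, focus_depth=0.2)
print(result.shape)
```

The hard, power-hungry part is producing a good depth map in the first place, which is why that step has typically run in the cloud rather than on the phone.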
Besides making your iPhone’s camera smarter, Xnor.ai’s technology could help Apple in other areas. Giving machines more ability to perceive and understand the messy real world will be key to robotics, autonomous driving, and natural language understanding.
“If the goal of AI is to achieve human-level intelligence, reasoning about images is vital to that,” Hays says, noting that roughly a third of the human brain is dedicated to visual processing. “Evolution seems to consider vision vital to intelligence,” he says.
Apple seems to think that a more evolved form of computer vision is pretty valuable too.