Over the last couple of days Apple announced a raft of new features across its products, many of them built directly atop AI technologies, or in partnership with OpenAI. They also decided to rebrand “Artificial Intelligence” as “Apple Intelligence.” I know I should feel anxious when a giant organization bigfoots a whole industry and tries to claim it as its own, but I also like watching Godzilla eat people, so. Also maybe the AI industry could use some devouring.
Apple’s AI move isn’t that exciting. One thing rules: the calculator app. The rest is more of the same. Memojis, but drawn by AI, are a downgrade, frankly. But okay—new paradigm, new features, they’re in the game.
The biggest part of these announcements, to me, is that many of the AI models are going to run locally—like, right on your phone, built in, no server required. Apple has written a nerdy and exhaustive explanation of how it’s all going to work, which you might enjoy reading if you’re nerdy and exhaustive.
AI models really are hyper-compressed databases of connections between numbers—numbers often derived from written sentences or images. Making these databases is very expensive, but when it comes down to it, AI stuff is just software. Perhaps it’s in a new weird category between “data” and “application,” but most models could fit on a small SD card. And what Apple is doing is saying: Yes, it’s actually better that way. Just have it. Do whatever. It’s up to you.
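To get a feel for the “fits on a small SD card” claim, here’s some back-of-the-envelope arithmetic. This is a sketch: the 7-billion-parameter model size and the 4-bit quantization are illustrative assumptions (common for on-device models), not Apple’s actual numbers.

```python
# Rough storage estimate for a locally runnable AI model.
# Assumes a 7B-parameter model, a typical size for on-device use.
def model_size_gb(num_params: float, bits_per_param: float) -> float:
    """Approximate storage needed, in gigabytes (10^9 bytes)."""
    return num_params * bits_per_param / 8 / 1e9

full_precision = model_size_gb(7e9, 16)  # 16-bit weights
quantized = model_size_gb(7e9, 4)        # 4-bit quantized weights

print(f"16-bit: {full_precision:.1f} GB, 4-bit: {quantized:.1f} GB")
# → 16-bit: 14.0 GB, 4-bit: 3.5 GB — either way, well within a cheap SD card.
```

Quantization (storing each weight in fewer bits) is what makes the on-device story plausible: a 4× size reduction turns “needs a server rack” into “fits next to your photo library.”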
Servers and Clients and Wares, Oh My!
The entire history of computing swings back and forth between “do everything on a networked computer you’re leasing” and “do everything on a local computing device.” There’s always a financial logic on either side—it’s sort of like how people are always being told it makes more sense to rent forever but still want to own a house.
A great reason to use a server is if you’re running an e-commerce website that thousands of people use. Amazon started that way. Or a huge index of web pages anyone can search. That’s Google, or it was. But given that AI models are medium-sized blobs of data, there’s no real reason they shouldn’t run locally and update themselves from time to time. And if you can use them to do search-style things—if they can tell you about the stock market, or help you find a new car or apartment, bolstered by some real-world data like stock prices or maps while keeping your privacy largely intact—then that is genuinely bad news for Google. We’ll see!
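The “update themselves from time to time” part is ordinary software plumbing, the same shape as any app update check. A minimal sketch, where the manifest format and version scheme are hypothetical, just to show the shape of the check:

```python
import json

# Hypothetical manifest a device might fetch from an update server.
REMOTE_MANIFEST = json.dumps({"model": "assistant-small", "version": "2024.06.2"})

def needs_update(installed_version: str, manifest_json: str) -> bool:
    """Compare the locally installed model version against the remote manifest."""
    remote = json.loads(manifest_json)
    return remote["version"] != installed_version

if needs_update("2024.06.1", REMOTE_MANIFEST):
    # In real life: download a few gigabytes of weights, swap them in, done.
    print("Downloading new model weights...")
```

The point is that nothing here requires a round-trip to a server at query time—only an occasional check, like your phone already does for the OS itself.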
You Need to Calm Down
With Apple’s somewhat underwhelming announcement, I definitely felt like some air was coming out of the balloon. Or more specifically, Apple was saying: This is neat stuff! It’s about as neat as any other layer we add to our operating system. Such layers include the web, payments, health and fitness data, and so forth. It’s not going to completely change everything. You’re still going to have to spend $80 on a cloth Apple Watch wristband at the Apple store, and you’re going to download apps. Relax, nerds.
As for Aboard and its AI? My guess is that we’ll keep using OpenAI’s services for a while. Then we’ll spin up our own servers that replace OpenAI more cheaply. And some time down the road, good AI models will be built into operating systems and browsers, and we’ll just tap into those. Meanwhile, the cost of this technology—including the ecological costs—is going to keep going down.
Seeing Apple come to similar conclusions is like getting a hug from Godzilla. They see it too: The new thing is going to get increasingly commoditized, and instead of freaking out about imaginary AI robot overlords in the future, we can use it to do things for actual humans now.