
Siri’s AI app control in iOS 18 is the start of Minority Report-like computing

Published May 31st, 2024 6:50AM EDT
iPhone 15 Pro display. Image: Christian de Looper for BGR


I said earlier this week that the killer AI feature of iOS 18 might be privacy like nobody else in the industry can offer, but that was before a huge iOS 18 Siri leak dropped, detailing AI features more advanced than anything ChatGPT or Gemini can do.

If Mark Gurman’s report is accurate, and his reports usually are, you’ll be able to tell Siri what actions to perform on your iPhone on your behalf. That’s the kind of voice control I’ve been dying to see on computers. With the arrival of AI, it was only a matter of time until it happened.

Telling Siri to open documents, move files around, send or delete emails, crop a photo for you, and share it with a friend is just the start. Siri controlling apps in iOS 18 will be much bigger than the iPhone. It’ll be a killer feature for wearable devices like the Apple Watch and the Vision Pro. And it should work amazingly well on the Mac and iPad, adding another layer of multitasking that’s currently impossible.

Gurman’s report mentions specific use cases for this Siri functionality and a few drawbacks. First, Siri voice control will initially work only with Apple’s own apps, though I expect Apple to let third-party apps plug into it in the future.
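If third-party support does arrive, the most obvious hook is App Intents, the framework Apple already uses to let apps expose actions to Siri and Shortcuts. Here’s a minimal sketch of what a hypothetical transcription intent from a third-party recorder app might look like; the intent name and parameter are my own illustration, not anything Apple has announced:

```swift
import AppIntents

// A hypothetical App Intent a third-party recorder app could expose to Siri.
// The intent and its parameter are illustrative, not a confirmed iOS 18 API.
struct TranscribeRecordingIntent: AppIntent {
    static var title: LocalizedStringResource = "Transcribe Recording"

    @Parameter(title: "Recording Name")
    var recordingName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's own transcription logic would run here.
        return .result(dialog: "Transcribing \(recordingName)…")
    }
}
```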

Second, Siri will take one command at a time rather than performing multiple commands strung together. For example, you’ll have to tell Siri to transcribe a voice recording, and only once that’s done can you tell it to share the resulting document with specific contacts. In the future, you might be able to give Siri both commands in one go while you’re doing something else on the device.
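For what it’s worth, that one-command-at-a-time behavior mirrors how App Shortcuts work today: each spoken phrase is bound to exactly one intent. Continuing the hypothetical example above:

```swift
import AppIntents

// Hypothetical shortcut phrases for the TranscribeRecordingIntent sketched
// earlier. Each phrase maps to exactly one intent; chaining steps like
// "transcribe, then share" by voice would require a user-built shortcut today.
struct RecorderShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: TranscribeRecordingIntent(),
            phrases: ["Transcribe my latest recording in \(.applicationName)"],
            shortTitle: "Transcribe",
            systemImageName: "waveform"
        )
    }
}
```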

Here are some of the things Siri might do for you on iPhone and iPad, according to Gurman:

Siri will be a key focus of the WWDC unveiling. The new system will allow the assistant to control and navigate an iPhone or iPad with more precision. That includes being able to open individual documents, moving a note to another folder, sending or deleting an email, opening a particular publication in Apple News, emailing a web link, or even asking the device for a summary of an article.

While that happens in the background, you’ll focus on other tasks on the device you’re using.

But I suspect this only scratches the surface. What Apple is doing here is something few other tech companies can offer. Google will surely give Gemini similar powers, and Microsoft is doing it with Copilot. It’s all part of the AI revolution we’re currently witnessing.

I’ve said time and again that we’re heading towards personal AI experiences. These are AI assistants that will know everything about you and help you get things done faster. They’ll offer ChatGPT-style chatbot functionality to answer all sorts of questions and perform complex web searches for you.

But that’s half the job. The other half is what Apple will have Siri doing for you in iOS 18: letting you use your voice to control the iPhone.

Devices like the Humane Ai Pin and the Rabbit r1 wanted to introduce such AI experiences, but those gadgets are already dead on arrival; the iPhone will be a much better alternative.

A Vision Pro user running multiple apps side by side. Image source: Apple Inc.

The spatial computing of tomorrow

This Siri assistant will work even better on the Apple Watch or the Vision Pro. In fact, I said months ago that the Vision Pro needs AI like ChatGPT as soon as possible to improve the whole experience:

Generative AI like ChatGPT is an easy fix for making the most of that terrific hardware and limited battery life. Rather than Siri or any other type of input mechanism, an Apple version of ChatGPT would take the Vision Pro to the next level.

You could just talk to the AI and tell it what you need from your immediate AR/VR experience rather than using your eyes and hands to navigate menus. You could run queries as you use Vision Pro to get work done or for fun. The faster AI produces the answers you need, the more time you’ll spend doing what you want to do on the gadget instead of opening apps and navigating menus.

I also told you that voice control will complement eye- and hand-tracking on the Vision Pro for a Minority Report type of spatial computing:

When Apple unveiled the Vision Pro, I said the headset would benefit from generative AI support. Siri, as it is now, isn’t good enough. But voice will be one way you interact with the computer of the future.

You’ll use your eyes to move the “cursor” and various hand gestures to control digital objects. But voice control will play a key role in spatial computing, especially once generative AI comes to Vision Pro. Voice will perfectly complement eye- and hand-tracking to get things done quickly. The best example that comes to mind is Minority Report computing. That’s where we’re heading.

Gurman’s report only reinforces my belief that we’re about to enter a new era of computing, where voice will play a much bigger role than ever now that AI can actually understand us. It’ll start with Siri getting smarter in iOS 18 on the iPhone, but the same features will quickly be available across Apple’s growing ecosystem. And this will be a huge advantage over rivals that don’t have similarly rich product lineups.

Chris Smith, Senior Writer

Chris Smith has been covering consumer electronics ever since the iPhone revolutionized the industry in 2008. When he’s not writing about the most recent tech news for BGR, he brings his entertainment expertise to Marvel’s Cinematic Universe and other blockbuster franchises.

Outside of work, you’ll catch him streaming almost every new movie and TV show release as soon as it's available.
