Apple has revealed that when iOS 15 reaches iPhones in September, it will let users pick text directly out of photographs to copy, share, and search, thanks to a tool called Live Text.
Live Text will be capable of recognising text in images taken on the iPhone, images shared by friends or pulled from the web, and screenshots, and it can even spot text in the Camera app’s live viewfinder.
Apple showed off examples of an iPhone scanning the text from a whiteboard with meeting notes, recognising the restaurant name in a logo on its sign, and picking out details from a boba cup. Apple even claims it will be able to scan handwritten notes and recipes.
Tapping the Live Text button in the Camera app instantly converts any recognised text in the frame, letting you select and copy it, and the feature can also pull text directly from saved images.
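Apple hasn’t detailed how Live Text works under the hood, but this kind of on-device text recognition has been open to developers through the Vision framework since iOS 13, which gives a sense of what’s happening when you tap that button. A minimal sketch of pulling text from a saved image might look like this (the function and the image you pass in are hypothetical):

```swift
import UIKit
import Vision

// Recognise text in a saved image using Apple's Vision framework.
// This is an illustrative sketch, not Apple's Live Text implementation.
func recogniseText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // Ask Vision for accurate (rather than fast) on-device recognition.
    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Each observation is one detected line; take its best candidate string.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```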
You can copy, save, and share that text, or search it directly – looking up the restaurant or ingredient you just found in the text. It’s also smart enough about context to recognise a phone number and instantly turn it into a link that opens the Phone app.
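Again, Apple hasn’t said how Live Text classifies what it finds, but spotting a phone number in a run of recognised text is the sort of job iOS already handles with NSDataDetector. A rough sketch, assuming the string came from the recognition step above (the sample text is made up):

```swift
import Foundation

// Detect phone numbers in recognised text so they can become tappable links.
let recognised = "Call us on 0161 496 0000 to book a table."

if let detector = try? NSDataDetector(types: NSTextCheckingResult.CheckingType.phoneNumber.rawValue) {
    let range = NSRange(recognised.startIndex..., in: recognised)
    for match in detector.matches(in: recognised, options: [], range: range) {
        if let number = match.phoneNumber {
            // A tel: URL like this is what opens the Phone app when tapped.
            print("Found \(number) -> tel:\(number.replacingOccurrences(of: " ", with: ""))")
        }
    }
}
```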
Live Text works in seven languages for the time being, including English and both simplified and traditional Chinese.
Live Text will be included in iOS 15, expected to arrive alongside the iPhone 13 series this September, when it will also roll out to existing iPhones as a free update. We cover how to use Live Text on iPhone or iPad separately if you’re interested in finding out more.