The Find My app has become a benchmark for locating virtually anything: our friends and family, our own devices and those of our iCloud family, and anything we have attached an AirTag to. With that in mind, it is no surprise that Apple keeps thinking about and developing ways to make locating things even easier, including through augmented reality.
Information that is already available, with a lot of potential
The truth is that the Find My app’s location system is already very intuitive: an arrow in the center of the screen indicates the direction, and the distance to the object appears at the bottom. To go further, the “Electronic Device Tracking Systems” patent proposes the use of what it defines as a “visual aid” to locate our objects.
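For context, on iPhones with the U1 chip the public NearbyInteraction framework already exposes exactly this kind of data, a distance in meters and a direction vector toward a nearby device, which is what drives an arrow-and-distance UI like this one. Here is a minimal sketch of reading those values; it assumes a peer discovery token has already been exchanged, and a peer device stands in for the tracked item, since AirTag Precision Finding itself is not exposed through the public API:

```swift
import Foundation
import NearbyInteraction
import simd

final class ItemLocator: NSObject, NISessionDelegate {
    private let session = NISession()

    // Hypothetical entry point: `peerToken` would be obtained out of band
    // (e.g. over MultipeerConnectivity) from the device being located.
    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first else { return }

        // Distance in meters: the number Find My shows at the bottom.
        if let distance = object.distance {
            print(String(format: "%.1f m away", distance))
        }

        // Unit vector toward the object: what the on-screen arrow follows.
        if let direction = object.direction {
            let bearing = atan2(direction.x, -direction.z) // hypothetical screen mapping
            print("Rotate arrow by \(bearing) radians")
        }
    }
}
```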
“The control circuit [of the iPhone] can use the screen or another output device to guide the user to the item,” the patent says. “The screen can display a visual guide such as an arrow, sphere, circle, compass, a map, or any other visual aid that directs the user toward the item’s location.”
“The screen can overlay a visual aid on live images of the user’s environment captured by a camera,” it continues. “If desired, the control circuit can wait until the electronic device is within a predetermined range of the object before turning on the camera.”
“The control circuit can change the size or other characteristics of the visual aid as the distance between the electronic device and the object changes,” it adds. “The control circuit can change the location of the visual aid on the screen when the orientation of the electronic device relative to the object changes.”
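Taken together, the patent describes three simple rules: turn the camera on only once the item is within a predetermined range, scale the overlaid marker as the distance changes, and move it across the screen as the device’s orientation relative to the item changes. Here is a toy sketch of that logic in Swift, where the 10-meter activation range, the scale curve, and the assumed field of view are all invented values for illustration:

```swift
import CoreGraphics

struct TrackedItem {
    var distance: Float   // meters to the item
    var bearing: Float    // radians, item direction relative to device heading
}

struct OverlayState {
    var cameraOn: Bool
    var markerScale: CGFloat
    var markerX: CGFloat  // horizontal position in screen points
}

/// Hypothetical mapping from tracking data to overlay state, following the
/// patent's description; none of these constants come from Apple.
func overlayState(for item: TrackedItem, screenWidth: CGFloat) -> OverlayState {
    // "Wait until the electronic device is within a predetermined range
    // of the object before turning on the camera."
    let cameraOn = item.distance <= 10

    // "Change the size ... of the visual aid as the distance changes":
    // grow the marker as we get closer, clamped to a sensible range.
    let scale = min(max(2.0 - CGFloat(item.distance) / 10.0, 0.5), 2.0)

    // "Change the location of the visual aid ... when the orientation
    // of the electronic device relative to the object changes":
    // map the bearing onto the screen assuming a ~60° horizontal FOV.
    let halfFOV: Float = .pi / 6
    let normalized = CGFloat(max(-1, min(1, item.bearing / halfFOV)))
    let markerX = screenWidth / 2 + normalized * (screenWidth / 2)

    return OverlayState(cameraOn: cameraOn, markerScale: scale, markerX: markerX)
}
```

In a real implementation these inputs would come from UWB ranging and the camera tracking pipeline rather than being supplied by hand.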
After seeing the augmented reality walking directions in the iOS 15 Maps app, which combine Apple Maps data with the iPhone’s ability to recognize its surroundings to show the route to follow overlaid on the real world, a similar system reaching the Find My app seems entirely consistent.
Like many technologies Apple has developed in recent times, the system fits very well with its future augmented reality glasses. We have already noted several times that much of the technology arriving on our iPhones, besides being useful to us day to day, serves Apple as groundwork for those future glasses. Features such as recognizing text in photographs, for example, will be very useful down the line.
In fact, locating objects from the Find My app, especially AirTags, already uses the cameras. It does so to gain more context about the surrounding space and give us better clues about where our lost item is. Given that, combining this information so we can see it on screen in augmented reality seems like only a matter of time.