A few days ago, Google presented several projects as part of its I/O 2024 keynote. Unsurprisingly, artificial intelligence took center stage.
This trailer was also part of the presentation and is intended to show off the practical advantages of Project Astra, an AI technology that lets users interact with their surroundings directly through the smartphone camera.
In the video you can see the following examples:
“Let me know as soon as you discover something that can produce sound.” – In response to this request, Astra immediately spots a loudspeaker.
The person filming then draws an arrow on the screen pointing at the upper part of the speaker and asks what this part is called. The AI understands the request, promptly replies that it is a tweeter, and explains how it works.
If these gimmicks aren’t enough for you:
– Project Astra recognizes code and can explain how it works.
– It can compose alliterations about objects (in this case, crayons).
– It can guess your location based on what the camera sees.
– Astra even casually remembers where the user left their glasses.
Project Astra is not limited to the camera of a Pixel smartphone; it was also tested with a prototype pair of smart glasses.
There it reacts directly to the wearer’s field of vision and, for example, recognizes a drawing of Schrödinger’s cat on a board.
Of course, it remains to be seen how well the AI performs in everyday use and how reliably the recognition works. The demonstration itself is definitely impressive.
Further information about the keynote can be found here: Google I/O 2024: All innovations from the keynote at a glance
Spec comparison: Pixel 8a vs. Pixel 8: The two Android phones compared