These are the gestures with which the Apple Vision Pro is controlled

The conceptual revolution behind Apple's new product, the Apple Vision Pro, is extraordinary. The company has introduced what it calls spatial computing: users interact with the device in an immersive reality through eye movement and hand gestures, and Apple's hardware and software take care of the rest. This week at WWDC, engineers showed the six main gestures that control the Apple Vision Pro interface and how those gestures could evolve into more complex movements also available on the headset.

Six gestures that control the Apple Vision Pro

The Apple Vision Pro is backed by more than 5,000 patents, a technological breakthrough that is setting a precedent for spatial computing in mixed reality. Thanks to ten sensors, cameras and screens, users will be able to work, have fun and do much more from a single place with the headset.

During WWDC23, a session entitled “Design for spatial input” was presented by Israel Pastrana and Eugene Krivoruchko, in which they discussed the importance of the eyes and hands in operating the Apple Vision Pro.

As you can see, the operation of visionOS is governed by six main gestures:

  • Tap: pinch your index finger and thumb together to select the virtual item you are looking at. It is the equivalent of tapping the screen of a device like an iPhone or iPad.
  • Double tap: perform the tap gesture twice in quick succession.
  • Press and hold: pinch and hold your fingers together; this gesture can be used, for example, to select text.
  • Pinch and drag: pinch, then move your hand to scroll through a web page or pan horizontally or vertically through the content you are viewing. The faster the drag, the faster the content moves.
  • Zoom: the first of the two-handed gestures. Pinch with both hands and pull them apart to enlarge the content, much like pinch-to-zoom on a touchscreen.
  • Rotate: the other two-handed gesture. Pinch with both hands and turn them around an invisible axis to rotate the content.
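For developers, these six system gestures surface to apps through the standard SwiftUI gesture APIs; on Vision Pro, looking at a view and pinching is delivered to SwiftUI as an ordinary tap. A minimal sketch along those lines (the view content and state names are illustrative, not from Apple's sample code) might look like:

```swift
import SwiftUI

// Sketch: mapping the six visionOS system gestures onto SwiftUI gestures.
struct GestureDemoView: View {
    @State private var offset: CGSize = .zero
    @State private var scale: CGFloat = 1.0
    @State private var angle: Angle = .zero

    var body: some View {
        Text("Pinch me")
            .padding()
            .offset(offset)
            .scaleEffect(scale)
            .rotationEffect(angle)
            // Double tap: declared before the single tap so it gets priority.
            .onTapGesture(count: 2) { print("double tap") }
            // Tap: a single look-and-pinch.
            .onTapGesture { print("tap") }
            // Press and hold: pinch and keep the fingers together.
            .onLongPressGesture { print("long press") }
            // Pinch and drag: pinch, then move the hand to scroll or pan.
            .gesture(DragGesture().onChanged { value in
                offset = value.translation
            })
            // Zoom: a two-handed pinch-and-spread.
            .gesture(MagnificationGesture().onChanged { value in
                scale = value
            })
            // Rotate: two hands turning around an invisible axis.
            .gesture(RotationGesture().onChanged { value in
                angle = value
            })
    }
}
```

Note that nothing in this code is headset-specific: the same view compiles for iOS, where the gestures come from the touchscreen instead of the hands.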

More complex, context-oriented gestures were also discussed, like the ones you can see in the picture a few lines above. Here developers have endless possibilities, treating the Apple Vision Pro as an extension of reality. In other words, the gestures will depend on the application or scenario we find ourselves in, and the interactions will be created by the developers: for example, placing a vinyl record on a turntable, as in the example, or pinching to spin an object on its own axis.

oriXone
