This is probably a moment very similar to the arrival of the first iPhone. I am referring to the launch of Apple Vision Pro (for now, only in the United States), which is encouraging the appearance of new applications that take advantage of the hardware capabilities of Apple's new device.
One of the most curious, and one of the most talked about, is Magic Room: a native visionOS application that makes visible what is called the "Mesh Dynamics" of the Apple Vision Pro's LiDAR sensor. Essentially, it is the "mapping" of reality as a polygonal mesh composed of thousands of points sampled by this type of sensor.
This polygonal mesh is capable of detecting the depth, position and size of each element in real time (since hundreds of thousands of samples are produced per second), allowing visionOS to know where each element is located in relation to the user.
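To picture what "knowing where each element is in relation to the user" means in practice, here is a toy Swift sketch of my own (not anything from Magic Room): given a sampled point in world space and the headset's pose, the offset in the user's own frame of reference is a single inverse transform away.

```swift
import simd

// Illustrative only: express a sampled world-space point in the user's
// (headset's) own coordinate frame.
func offsetFromUser(worldPoint p: SIMD3<Float>,
                    deviceWorldTransform device: simd_float4x4) -> SIMD3<Float> {
    // `device` maps headset space to world space, so its inverse maps back.
    let q = device.inverse * SIMD4<Float>(p, 1)
    return SIMD3<Float>(q.x, q.y, q.z)
}

// Example: a point two metres in front of a headset at the world origin.
let headPose = matrix_identity_float4x4
let offset = offsetFromUser(worldPoint: [0, 1.6, -2], deviceWorldTransform: headPose)
// offset.z == -2: the element sits two metres ahead of the user.
```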
Apple Vision Pro's LiDAR technology
LiDAR technology has been present in iPhones since the iPhone 12 Pro, and it allows the device to "know" what reality looks like in order to make decisions that require distance calculations: depth of field, photography, mixed reality applications and so on. On iPhones it is mainly used to improve photographs in low-light conditions, since this type of sensor does not need light to work.
LiDAR is an active remote sensing system. In other words, it emits infrared laser pulses and measures how long each one takes to return to the sensor for every mapped point. It does this in real time, at around 120,000 samples per second, so we can use Apple Vision Pro while moving through a real environment and the system is able to detect our surroundings and their volume.
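The arithmetic behind each of those samples is simple time-of-flight. As a back-of-the-envelope sketch (the formula and constant are standard physics, not anything from Apple's documentation):

```swift
// Time of flight: a pulse travels to the surface and back, so the one-way
// distance is half the round trip multiplied by the speed of light.
let speedOfLight = 299_792_458.0 // metres per second

func distance(forRoundTrip seconds: Double) -> Double {
    speedOfLight * seconds / 2
}

// A pulse that returns after ~13.3 nanoseconds hit something about 2 m away.
let d = distance(forRoundTrip: 13.3e-9) // ≈ 1.99 m
```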
What Magic Room does is precisely "reveal this magic", as its creators put it. When the application starts, this mesh of points connected into polygons, reality exactly as visionOS recognizes it via LiDAR, suddenly appears in front of us.
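For the developers among you: Magic Room's source is not public, but visionOS exposes this same mesh to any app through ARKit's SceneReconstructionProvider. A minimal sketch of subscribing to those mesh updates might look like this (it assumes an immersive space and the world-sensing permission; the class name is my own):

```swift
import ARKit

// A minimal sketch (not Magic Room's actual code) of receiving the
// LiDAR-derived scene mesh in a visionOS app.
@MainActor
final class RoomMeshObserver {
    private let session = ARKitSession()
    private let sceneReconstruction = SceneReconstructionProvider()

    func start() async throws {
        guard SceneReconstructionProvider.isSupported else { return }
        try await session.run([sceneReconstruction])

        // visionOS streams the mesh as "anchors": chunks of polygons that are
        // added, refined or removed as the sensor keeps sampling the room.
        for await update in sceneReconstruction.anchorUpdates {
            let anchor = update.anchor
            switch update.event {
            case .added, .updated:
                print("Chunk \(anchor.id): \(anchor.geometry.vertices.count) vertices, \(anchor.geometry.faces.count) triangles")
            case .removed:
                print("Chunk \(anchor.id) removed")
            }
        }
    }
}
```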
I tried the example you see in the image above on the terrace of my house. The reference photo was taken during the day, but I ran the test with the app at night, to show that even without light the LiDAR works perfectly and is able to "retrace" reality with astonishing detail and speed.
Not only is it curious (and spectacular, when you try it with the Apple headset), it is also an interesting way to check the device's power to interpret reality. In the app you can even adjust the speed of the effect and a few other special tricks.
What surprises you the first time you try it is that it makes you feel like you can "see through things". Not really: Magic Room is constantly mapping reality, even if we do not activate the visualization of the mesh. If we have already walked through an environment, its spatial point data is kept in memory.
At home, for example, this happens to me with the room where the washing machine is: the app is able to understand that it sits behind a wall, using the information it already has from previous sampling, and renders it on the model from our point of view. This interpretation of the polygonal mesh almost deceives us into believing we can "see through the wall".
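My guess at the plumbing behind this effect (Magic Room has not published how it works) is simply keeping every mesh chunk ARKit has ever delivered, keyed by its anchor ID, so geometry scanned earlier stays available even when a wall now blocks the line of sight. A hypothetical sketch:

```swift
import ARKit

// Hypothetical store that remembers every mesh chunk ever sampled, so rooms
// that are no longer in view (the laundry room behind the wall) can still
// be rendered from the stored geometry.
final class MeshStore {
    private var chunks: [UUID: MeshAnchor] = [:]

    func apply(_ update: AnchorUpdate<MeshAnchor>) {
        switch update.event {
        case .added, .updated:
            chunks[update.anchor.id] = update.anchor
        case .removed:
            // Deliberately keep removed chunks: that is what preserves
            // geometry for areas the sensor can no longer see.
            break
        }
    }

    /// Everything sampled so far, including out-of-view areas.
    var allChunks: [MeshAnchor] { Array(chunks.values) }
}
```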
Matrix in your living room
One of the most curious and spectacular uses is activating the "Digital Rain" visualization, in which our house magically becomes an interpretation of the Matrix code, moving and flowing around us. As we move around the room, new parts are revealed in real time, with no perceptible latency or delay.
You can also change the font size of the flowing code, and the changes are applied instantly in the experience. There is also the option to blur the real environment or blend it with the interactive experience for an even more cinematic feel.
The images and videos you can see here cannot convey the feeling of having it not only in front of you but enveloping you. It is a very curious immersive sensation, and one that also demonstrates the technological capabilities of Apple Vision Pro, capabilities that games like the great Super Fruit Ninja already use to "map" the room and know where the furniture in our living room is, so that fruits hit, stain or slide down (for example) the side of our sofa. The future of these applications, ever more sophisticated, immersive and consistent with the reality surrounding the user, is fast approaching.
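As a closing sketch for developers, this is roughly how a game can turn that same mesh into physics, so that virtual objects collide with real furniture. It relies on RealityKit's ShapeResource.generateStaticMesh(from:) helper; everything else is an illustrative assumption, not Super Fruit Ninja's actual code:

```swift
import ARKit
import RealityKit

// Hedged sketch: wrap a LiDAR mesh chunk in an invisible static collider so
// thrown fruit can hit and slide down the real sofa.
func collisionEntity(for anchor: MeshAnchor) async throws -> ModelEntity {
    // Build a static collision shape straight from the mesh chunk.
    let shape = try await ShapeResource.generateStaticMesh(from: anchor)
    let entity = ModelEntity()
    entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
    entity.collision = CollisionComponent(shapes: [shape])
    // Mass 0 and .static: the furniture never moves, but fruit bounces off it.
    entity.physicsBody = PhysicsBodyComponent(shapes: [shape], mass: 0, mode: .static)
    return entity
}
```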
In Pommesfera | Apple Vision Pro, first impressions: after trying it for 24 hours, I can say it’s the most amazing thing Apple has done in recent years