These are the secrets of the new sensor that makes augmented reality truly desirable

oriXone



It wasn't clear it would happen (in fact, I personally bet against it), but Apple's plans are inscrutable, and we finally got a new product launch: a full MacBook Air update with 10th-generation Intel processors, double the base SSD and a switch to the Magic Keyboard; a Mac mini update with double the starting storage; and the new iPad Pro.

If you want more details, you can read the articles our colleagues have already published, covering things like the new A12Z chip or the Magic Keyboard case with trackpad that holds the iPad floating on magnets. Here we will focus on the new rear camera array. Not on the wide-angle and ultra-wide-angle cameras, similar to those on the iPhone 11, but on the other big star: the 3D sensor based on LiDAR (Light Detection and Ranging) technology. What is it, and what is it for?

LiDAR, a sensor that measures with laser light

LiDAR is widely used in the professional world to capture, almost instantly, a cloud of points that tells us the distance (and height) of objects relative to the point from which a laser beam is projected. The system is as simple as firing a pulse of laser light and measuring the time it takes for the reflection of that pulse to come back to the sensor.
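The time-of-flight principle described above reduces to one formula: the pulse travels out and back, so the distance is half the path light covers in the measured time. A minimal sketch (illustrative only; the iPad's sensor does this in hardware at the photon level, and the 33 ns figure below is just an example round-trip time):

```python
# Time-of-flight distance estimation, the principle behind LiDAR.
# Illustrative sketch, not Apple's implementation.

C = 299_792_458  # speed of light in m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to a surface given the round-trip time of a laser pulse.

    The pulse travels to the object and back, so the one-way
    distance is half the total path: d = c * t / 2.
    """
    return C * t_seconds / 2

# A pulse that returns after roughly 33 nanoseconds bounced off
# something about 5 m away -- the iPad Pro sensor's stated range.
print(round(distance_from_round_trip(33.356e-9), 2))  # → 5.0
```

This is why the timing must happen at nanosecond precision: at light speed, every nanosecond of round trip corresponds to only about 15 cm of distance.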

The iPad Pro's LiDAR with augmented reality

As Apple explained when announcing the new iPad Pro, the laser operates at the photon level (individual particles of light) and at nanosecond speeds. In other words, it is a very high-speed scanner that receives the bounce of its own light to measure the distance to its surroundings almost instantly.

LiDAR is a technology widely used in atmospheric physics, geology and seismology, because it delivers high-resolution 3D data for any terrain. In recent years, though, it has also become the technology autonomous vehicles rely on to get a 360-degree picture of their surroundings. Google's Waymo cars use this type of sensor to know, at every instant, their position and the distance to everything around them.

A Waymo autonomous vehicle with LiDAR sensors and cameras

In the iPad's case, LiDAR can map points indoors and outdoors up to a depth of 5 meters. That means that in almost any room, office or similar space it can get an accurate reading of the nearest objects and build a depth map of them, and with it a far more convincing augmented reality experience.

Augmented reality, from computer vision to laser light

As we have already told you, augmented reality today uses a machine-learning engine to interpret what the camera sees through its lens. It estimates depth from how light falls on different areas of the camera's view, and from that the device builds a depth map and a map of so-called feature points. Those feature points are positions in 3D space that correspond to a real object, pinned to known X, Y, Z coordinates.

Once a feature point is fixed in 3D space, even when the camera moves, the point stays where it is, as if it were an invisible anchor in the augmented world attached to the real one.
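The idea of an anchor can be sketched in a few lines: the point is stored in world coordinates, and only its position relative to the camera changes as the camera moves. This is a deliberately simplified, hypothetical illustration (a translation-only camera with no rotation, nothing like ARKit's real math):

```python
# A feature point anchored in world space stays put while the
# camera moves; only its position in the camera's frame changes.
# Minimal sketch with an axis-aligned, translation-only camera.

def world_to_camera(point_world, camera_pos):
    """Express a world-space point in the camera's coordinate frame
    (camera at camera_pos, no rotation, for simplicity)."""
    return tuple(p - c for p, c in zip(point_world, camera_pos))

anchor = (1.0, 0.0, 2.0)  # feature point fixed in the real world

# Two camera positions as the user walks around the room.
view_a = world_to_camera(anchor, (0.0, 0.0, 0.0))
view_b = world_to_camera(anchor, (0.5, 0.0, 0.5))

print(view_a)  # (1.0, 0.0, 2.0)
print(view_b)  # (0.5, 0.0, 1.5)
# The anchor itself never moved; only the camera did.
```

The virtual object attached to the anchor is rendered at whatever camera-relative position this transform yields each frame, which is what makes it appear glued to the real world.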

Obviously, if there is not enough light, or if the camera is facing a flat surface with no visible differences in how light falls across it, the computer-vision algorithm will not be able to locate these features or measure their distance to the camera in order to set feature points. If, on the other hand, it gathers enough points lying on the same plane, it will recognize that plane, whether horizontal or vertical, and position it roughly through the middle of those points.

The 3D volumetric point cloud captured by the LiDAR sensor on the iPad Pro 2020

Once computer vision has measured the scene and detected feature points or 3D planes, we can place 3D objects (assets) anchored to those points, creating the feeling that they are genuinely embedded in, and positioned within, the real world.

LiDAR is a step forward precisely because it no longer depends on computer vision, which, as we have seen, is far from 100% effective and relies on the scene being properly lit. Applications will now receive a real 3D map of distances. With LiDAR, anything in front of the camera can be accurately detected and reconstructed as a 3D object.

Placing objects in the real world

This is the magic trick behind augmented reality that goes unnoticed. When we place a 3D object on a surface, we are actually placing it on a 3D plane with no visible texture that blends into its surroundings. We think we have left a virtual vase on top of the real table, but in fact we have put it on a transparent 3D plane whose position and size match the actual table.
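Placing that vase boils down to intersecting a ray (cast from the user's screen tap) with the invisible plane fitted to the table. A toy sketch of that geometry, with a hypothetical helper and made-up numbers, not any ARKit API:

```python
# Placing a virtual object: intersect a ray with the invisible
# horizontal plane fitted to the real table.
# Hypothetical helper for illustration, not an ARKit call.

def ray_hits_horizontal_plane(origin, direction, plane_y):
    """Return the 3D point where a ray meets the plane y = plane_y,
    or None if the ray is parallel to it or points away from it."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy == 0:
        return None  # ray runs parallel to the plane
    t = (plane_y - oy) / dy
    if t < 0:
        return None  # plane is behind the ray's origin
    return (ox + t * dx, oy + t * dy, oz + t * dz)

# Camera 1.5 m above the floor, looking down and forward; the
# detected table plane sits at y = 0.7 m.
hit = ray_hits_horizontal_plane((0.0, 1.5, 0.0), (0.0, -1.0, 1.0), 0.7)
print(hit)  # the virtual vase is anchored here, ≈ (0.0, 0.7, 0.8)
```

Whether that plane comes from computer vision or from the LiDAR mesh, the placement logic is the same; LiDAR simply makes the plane's position and extent far more accurate.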

For augmented reality to be 100% immersive, it has to build a virtual map of the 3D space and align it with the real world, so that one gives the impression of being integrated into the other.

Building on these capabilities, the new LiDAR sensor will allow far greater accuracy in the 3D reconstruction of our environment, which in turn will produce more precise volumetric occlusions. So when we place virtual objects in this space, they will treat the sofa, the floor, the table or any other object as part of the scene's geometry (a mirror of the physical world, we might call it), and thanks to that, the occlusion between real and virtual elements will feel absolutely accurate.
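Occlusion itself is a per-pixel depth comparison: wherever the real world (according to the depth map) is closer to the camera than the virtual object, the virtual object must be hidden. A toy sketch over a single row of pixels, with invented depth values rather than real sensor output:

```python
# Per-pixel occlusion: the virtual object is hidden wherever the
# real scene (per the depth map) sits in front of it.
# Toy 1-D sketch with assumed data, not real LiDAR output.

def occlusion_mask(real_depth, virtual_depth):
    """True where the virtual object is visible, i.e. nothing
    real lies between it and the camera."""
    return [v <= r for r, v in zip(real_depth, virtual_depth)]

lidar_row   = [4.0, 2.0, 1.2, 1.2, 4.0]  # metres to the real scene
virtual_row = [1.5, 1.5, 1.5, 1.5, 1.5]  # virtual object at 1.5 m

print(occlusion_mask(lidar_row, virtual_row))
# [True, True, False, False, True] -- the object disappears behind
# the real obstacle at 1.2 m, say the arm of a sofa.
```

With the noisy, sparse depth estimates of pure computer vision this mask flickers at the edges; a dense, accurate LiDAR depth map is what makes the cut-out look clean.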

A whole future ahead

Once the real environment has been reconstructed as a 3D mesh in our application using this LiDAR technology, the processing needed to manage the scene is much lighter, because we no longer rely on computer-vision algorithms. That headroom lets us raise the display quality or the placement accuracy: the GPU can be spent rendering higher-quality graphics instead of interpreting the scene in real time.

And best of all is how LiDAR will fit into ARKit (Apple's augmented reality library): developers don't have to do anything to take advantage of it, because it is completely transparent to us. Simply put, surfaces will now be detected accurately by the new laser, and nothing more is required.

The groundwork is laid, and it looks really good

That said, I wouldn't rule out Apple later shipping a specific API in iPadOS 14 to manage this scanner directly, enabling high-quality, efficient 3D scanning, real-time 3D capture and the reconstruction of any object. The possibilities it opens up are, without doubt, exciting and enormously creative.

Bear in mind, too, that this LiDAR technology has been dramatically miniaturized, down to fitting in the camera housing, which squares with the rumors about Apple's augmented reality glasses or headset and would make the experience even more integrated. And if the rumors are right, it will also arrive with the new iPhone 12 Pro. We are only looking at the beginning of a change.
