oriXone

This is Apple's bet on augmented reality



Apple continues to move quietly, without haste but without pause. Its steps are firmly committed to a future of augmented reality and to making it available to everyone. Augmented reality is set to become another user interface soon, just as touch or voice are leading interfaces today.

The latest step, as we told you, was the launch of the Reality Converter app. That release serves as an excuse to talk about Apple's augmented reality ecosystem, its present state, and its possible future.

Not only have we gone through three versions of the ARKit augmented reality library; this year we also got two very important additions in the form of a new library and an app that greatly simplify the creation of augmented reality experiences: the RealityKit library and the Reality Composer app (which also has iPad and Mac versions).

Reality Composer on Mac and iPad

Let's review all the tools and apps that Apple has published so far around augmented reality.

ARKit, the base library

ARKit is Apple's augmented reality library. It is a library built on Objective-C, integrated into the iOS and iPadOS systems, and its purpose is to provide information about the reality captured by our device's camera and sensors.

ARKit does not paint any object in augmented reality; it only gives us information on how to interpret what the device's camera sees, together with sensor data, in real time. With this information, we can draw the augmented reality using a 3D graphics library.

The ARKit library cannot paint anything, since its function is to collect, 60 times per second, the data coming from the camera as well as from the device's motion sensors. In this way, ARKit tells us how the phone or tablet has moved along the x, y, and z axes (horizontal, vertical, and depth). When we start an augmented reality session, the point where we are becomes the origin (0, 0, 0), and from there values are added or subtracted so we know how the device moves.
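
As a rough illustration of that flow, here is a minimal sketch that starts a session and reads the device's position relative to that (0, 0, 0) origin on every frame. The class name is ours; everything else is ARKit's public API:

```swift
import ARKit

// Minimal sketch: start an AR session and read the device's position
// relative to the session's (0, 0, 0) origin.
class SessionObserver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // Delivered roughly 60 times per second, once per captured frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // The camera transform encodes translation and rotation
        // relative to where the session started.
        let position = frame.camera.transform.columns.3
        print("Device at x: \(position.x) y: \(position.y) z: \(position.z)")
    }
}
```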

ARKit 1.0, the first version of the library

It also gives us rotation data on those same axes, so that we know the device's degree of inclination through a full 360 degrees. All this data together gives us enough information to make the augmented reality follow the device's movement convincingly and, with that, make virtual things feel as if they really exist.

But the important thing to understand is that this information is then used by a 3D library to paint objects over the frames captured by the camera, all based on the information provided by ARKit. This is why the library is compatible not only with Apple's traditional 3D scene API, SceneKit, but also with the native low-level API, Metal, and with third-party engines such as Unreal Engine or Unity.
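
To make that division of labor concrete, here is a minimal sketch using SceneKit through ARSCNView: ARKit does the tracking, SceneKit does the drawing. The box and its position are arbitrary examples of ours:

```swift
import ARKit
import SceneKit

// Minimal sketch: ARKit tracks, SceneKit draws. ARSCNView ties the two
// together: its session runs ARKit, its scene renders over the camera feed.
func makeARView() -> ARSCNView {
    let arView = ARSCNView(frame: .zero)

    // An arbitrary example object: a 10 cm box half a meter ahead of the origin.
    let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                       length: 0.1, chamferRadius: 0))
    box.position = SCNVector3(0, 0, -0.5)
    arView.scene.rootNode.addChildNode(box)

    // From here on, SceneKit keeps the box anchored as the device moves.
    arView.session.run(ARWorldTrackingConfiguration())
    return arView
}
```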

The evolution of ARKit over its three years of existence

The first version of ARKit, like any library worth its salt, laid out its initial tools and objectives: to create the foundation on which to build. In that first edition, everything revolved around finding horizontal surfaces such as tables or floors, where an accumulation of feature points on a surface meant that a plane had been detected.

A feature point is a point in the 3D field seen by the camera where light and contrast allow the device to determine the distance from itself to that "spot" in three-dimensional space.

Once we have planes, we can place 3D objects on them with the assurance that they will integrate visually into the scene. That combination of feature points with plane detection and positioning was the core capability provided by ARKit in that first release. We could also use the Face ID sensors to capture faces in real time and obtain a 3D map of them. That is the basis of Animoji.
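
A minimal sketch of that first capability (the class name is ours, the rest is ARKit's public API): enable horizontal plane detection and listen for the anchors ARKit creates once enough feature points line up:

```swift
import ARKit

// Minimal sketch of ARKit 1.0-style plane detection: ARKit consolidates
// feature points into ARPlaneAnchor objects we can build on.
class PlaneFinder: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        session.delegate = self
        session.run(configuration)
    }

    // Called whenever ARKit detects new anchors, planes among them.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            // plane.center and plane.extent tell us where 3D objects can rest.
            print("Detected a plane of extent \(plane.extent)")
        }
    }
}
```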

Plane detection via ARKit. The yellow dots are feature points.

A few months later, ARKit 1.5 arrived. The new step: vertical plane detection and image recognition. With the latter, we can load a texture that computer vision then finds in 3D space, creating a plane we can work with, either to place objects on it or to apply texture to it as a 3D object.
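
A sketch of that image recognition, assuming a reference image group named "AR Resources" exists in the app's asset catalog (that name is our example):

```swift
import ARKit

// A sketch of ARKit 1.5-style image recognition plus vertical planes.
func runImageAndPlaneDetection(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // Images to look for in the real world; each match becomes an ARImageAnchor.
    if let references = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                         bundle: nil) {
        configuration.detectionImages = references
    }

    // Vertical plane detection, the other novelty of ARKit 1.5.
    configuration.planeDetection = [.horizontal, .vertical]
    session.run(configuration)
}
```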

With iOS 12 came ARKit 2.0. It brought overall performance improvements, camera view quality increased from 720p to 1080p on more powerful devices, and shared experiences arrived. The system became able to recognize 3D objects that we had previously registered, and even scenes where we had already been (because we can save the details of a place captured in 3D).

Using WiFi or Bluetooth, we can create a real-time data-sharing network between devices: an augmented reality space that shares its information and events with other devices. This way, several people (and without the need for an internet connection) can watch the same augmented reality experience from different devices and points of view.
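
In code, that sharing boils down to serializing an ARWorldMap on one device and relaunching a session from it on another. A minimal sketch, leaving out the actual WiFi/Bluetooth transport (MultipeerConnectivity is the usual choice; the function names are ours):

```swift
import ARKit

// Capture this device's world map and serialize it for sending to peers.
func captureWorldMap(from session: ARSession,
                     completion: @escaping (Data?) -> Void) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap else { return completion(nil) }
        completion(try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                     requiringSecureCoding: true))
    }
}

// On the receiving device, relaunch the session inside the shared space.
func joinSharedExperience(_ session: ARSession, mapData: Data) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: mapData) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map   // resume tracking in the shared space
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```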

With iOS 13 came ARKit 3.0, and the power kept increasing: people occlusion, so that if someone stands in front of the camera with a virtual object behind them, the object is not drawn over the person because, logically, the person is in front. And motion capture, movie-style. ARKit no longer stops at the more than 50 facial expressions it included in its original version.

People occlusion with ARKit 3.0. The girl in front of the object occludes it.

With ARKit 3.0 we can also detect people and obtain a 3D skeleton map with bone and rotation points, so that their movement can be replicated in any 3D object rigged with the same skeletal structure.
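
A minimal sketch of that capture, using ARKit's body tracking API (the class name is ours): each detected person arrives as an anchor carrying a full 3D skeleton whose joint transforms can drive a rigged model.

```swift
import ARKit

// Minimal sketch of ARKit 3.0 body tracking: each detected person is an
// ARBodyAnchor whose skeleton joints can drive any identically rigged 3D model.
class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARBodyTrackingConfiguration.isSupported else { return } // A12 and later only
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // Each joint transform is expressed relative to the body anchor.
            if let head = body.skeleton.modelTransform(for: .head) {
                print("Head joint transform: \(head)")
            }
        }
    }
}
```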

Face capture capacity increased to 3 faces simultaneously, experiences can now use the rear and front cameras at the same time, and sessions can start with scenes already loaded onto our device for a fast and efficient startup.

In addition, up to 100 static images can now be detected at once, machine learning is now used for this task so detection is much faster, and 3D object detection now supports larger objects with improved accuracy.

It is important to note that people occlusion and skeleton capture are limited to devices with the A12 chip onwards, due to their power and specific hardware requirements.

But the release of ARKit 3 did not end there: a new library was introduced, RealityKit, along with a new app, Reality Composer.

RealityKit, create your scenes without coding

RealityKit is a new, Swift-native library that takes over from the SceneKit library (using Metal underneath) when it comes to painting the information from ARKit.

RealityKit is also the interpreter of the scene files created by the Reality Composer app.

One of the hardest parts of creating an augmented reality experience is building the scenes themselves. Importing 3D objects, placing them, lighting them, creating interactions (taps, for example), animations … all of this had to be programmed against the SceneKit library (which is a 3D video game library), a learning curve that kept many away from this type of application.

A common structure of how RealityKit works

However, alongside RealityKit we have the Reality Composer app, capable of composing scenes from a standalone app. We can create a scene, add objects, give them functionality, animate them, make them respond to events, set up their physics … everything we previously had to program, but from a simple application and without writing a single line of code.

Reality Composer for Mac

So RealityKit is the library that loads the scenes created in Reality Composer, so that our augmented reality experiences are up and running with just a few lines of code.
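
Those "few lines" look roughly like this. When a Reality Composer project is added to Xcode, a Swift API is generated for it; "Experience" and its "Box" scene are the names generated for the default sample project, so the names in a real project will differ:

```swift
import RealityKit

// A sketch of loading a Reality Composer scene through its generated API.
func loadComposedScene(into arView: ARView) {
    do {
        // Load the scene that was built visually in Reality Composer...
        let boxScene = try Experience.loadBox()
        // ...and anchor it into the AR view. RealityKit handles the rest.
        arView.scene.anchors.append(boxScene)
    } catch {
        print("Could not load the Reality Composer scene: \(error)")
    }
}
```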

Reality Converter, convert your 3D objects

The 3D formats that Reality Composer and Xcode accept are those inherited from SceneKit, such as the open Collada format (.dae extension), or the new standard format used for scenes: USDZ. This format, developed by Pixar for its films, is Apple's bet for augmented reality scenes that we can put on the web and view directly from Safari.

Reality Converter Beta

But many 3D content providers do not use these formats, because they depend heavily on the tools they use for their work. This is where Reality Converter comes in: basically, a program capable of loading other 3D object and scene formats, such as FBX.

Reality Converter is not meant to be a simple converter: it lets us preview the loaded elements in detail, adjust their materials, and save the result so we can carry everything over to the USDZ format. It also supports .obj, .gltf, and .usd files.

Everything is ready for the headset and the glasses

If we analyze the evolution ARKit has gone through since its introduction in iOS 11 until today, we can clearly see how Apple has been building the perfect environment for development based on this technology, covering almost every type of use case and offering developers an amazing palette of possibilities.

The ultimate goal is not just to use the iPhone and iPad as augmented reality tools for entertainment, education, or professional use, among others. The real goal is an augmented reality headset and glasses: for this technology to accompany us daily and let us enjoy a step beyond in real-time communication and information.

Prototype of Apple augmented reality glasses

Routes marked with arrows, product information and offers, real-time data on any kind of point of interest, interactive museums, immersive online shopping experiences, previewing furniture in our home … millions of possibilities for reinventing how we interact with information.

If we add to all of this the next step, the 3D depth sensors that would arrive in the new iPad Pro (according to rumors), everything is aligned to usher in a new dimension of user experience.

The future is just around the corner, and Apple, little by little, step by step, has been showing developers how to build it. Without a doubt, it's an exciting future.
