Don't do with hardware what you can do with software. That has long been one of Google's guiding principles: the company has spent years showing that it doesn't take two cameras to shoot a good portrait, that great results don't require exotic hardware, and that machine learning can power features across its apps without users ever having to think about it.
With the latest March update, Google has delivered more interesting news for Pixel phones. One addition stands out above the rest: Google has managed to build its own '3D Touch' using software alone.
A '3D Touch' made by Google
With the batch of new features arriving in the March patch, Google gave its Pixels the ability to sense pressure from the screen. As the company explained to The Verge, this feature will come to more Google apps in the future; for now it is available in the launcher.
Long press currently works in a select set of apps and interfaces, such as the launcher, with the Photos and Drive apps to follow. The update speeds up the interaction so that the contextual options appear sooner.
Basically, Google has used its 'Deep Press' API, available in Android since version 10. With it, a set of algorithms detects the contact area of the touch: the harder you press, the larger the area your finger covers. As usual, machine learning is key to this kind of feature, and it lets Google tell apart different fingers and styles of press.
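On a real device, an app reads this classification from `MotionEvent.getClassification()`, which since Android 10 (API 29) can return `MotionEvent.CLASSIFICATION_DEEP_PRESS`. The Android framework isn't available outside a device, so the sketch below only models the same dispatch pattern with stand-in types; the heuristic classifier is a hypothetical placeholder for the system's machine-learning model, not Google's actual logic.

```java
// Stand-in for the MotionEvent.getClassification() constants (Android API 29+).
enum TouchClassification { NONE, AMBIGUOUS_GESTURE, DEEP_PRESS }

public class DeepPressDemo {
    // Hypothetical stand-in for the system classifier: on a real device the
    // framework's machine-learning model decides; here a simple heuristic on
    // pressure and contact area mimics the same decision.
    static TouchClassification classify(float pressure, float touchAreaMm2) {
        if (pressure > 0.8f && touchAreaMm2 > 60f) return TouchClassification.DEEP_PRESS;
        if (pressure > 0.5f) return TouchClassification.AMBIGUOUS_GESTURE;
        return TouchClassification.NONE;
    }

    // App-side dispatch: a deep press opens the shortcut menu immediately
    // instead of waiting out the long-press timeout.
    static String onTouch(float pressure, float touchAreaMm2) {
        switch (classify(pressure, touchAreaMm2)) {
            case DEEP_PRESS:        return "show app shortcuts now";
            case AMBIGUOUS_GESTURE: return "wait for long-press timeout";
            default:                return "treat as normal tap";
        }
    }

    public static void main(String[] args) {
        System.out.println(onTouch(0.9f, 70f)); // firm press -> immediate menu
    }
}
```

In a real app the same branching would sit inside a `View.OnTouchListener`, switching on `event.getClassification()` rather than on a home-made heuristic.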
Since all of this is exposed through an API, any developer can adopt the technology, although for now only Google has taken the initiative to use it, first in the launcher and, in the future, in apps such as Google Photos or Drive.
How things have changed since Force Touch
It's striking how much things have changed since the idea first left Apple's hands.
Force Touch is a technology Apple unveiled five years ago, in September 2014. It debuted on the Apple Watch and later came to the iPhone as '3D Touch'.
Apple evolved Force Touch and brought it to the iPhone; Google imitated it with long presses. Later, Apple moved to replace it with long presses of its own. Now, Google is imitating the original with software that senses pressure.
The main "problem" with this technology is that it requires a physical component inside the device, and Apple eventually decided to drop it, starting with the iPhone XR. That phone replaced the pressure sensors with a long-press system (Haptic Touch) which, while covering part of what 3D Touch did, never reached the original's level.
Similarly, Google itself has long used long presses in its launcher to invoke actions: basically the same philosophy as Haptic Touch, holding the press for a moment to trigger a shortcut. Either way, the door to Android apps that can sense pressure is now more open than before, so we will have to see whether it ends up catching on or not.
Via | Action