One of the new features of the AirPods Pro 2 is the new Adaptive Transparency mode, a way of prioritizing the sounds that need to reach our ears, and it will also come to the original AirPods Pro. In the meantime, if we venture into the world of betas, we can already try it.
Adaptive transparency: the environment, but not the noise
With the third beta of iOS 16.1, Apple brings Adaptive Transparency mode to the first-generation AirPods Pro. A mode that, thanks to artificial intelligence, is able to discern which sounds we want to hear in transparency mode. Thus, if we walk down the street with this mode active, we can hear a siren or someone talking to us, but not the noise of roadworks.
This adaptive transparency comes to the AirPods thanks to firmware 5A305A, which is currently in beta. A firmware that should officially arrive on all these headphones within a few weeks. In the meantime, if we want to try this feature, the betas give us the opportunity to do so.
During the presentation of the AirPods Pro 2, it seemed that only the new H2 chip in those headphones would support Adaptive Transparency mode. Now, with its arrival in beta form, questions remain about how it will work on the current H1 chip and what differences there will be between the two generations of AirPods.
The truth is that the capabilities of the AirPods, despite their small size, never cease to amaze us. Update after update, they gain new features and capabilities, and that pattern repeats now with the firmware Apple is working on, which will bring one of the great novelties of the AirPods Pro 2 to the previous generation. Certainly good news.