Apple is just months away from unleashing five new features for its $250 AirPods Pro 2 that will use machine learning to optimize listening.
The new updates were announced at the Worldwide Developers Conference (WWDC) this month, but were likely overshadowed by the AR Vision Pro headset.
The software listens for changes in your environment to let the outside world in or block out the noise without you having to adjust manually, and lowers media volume when speech is detected.
Apple also said the upgrade reduces the time it takes to connect between different devices and comes with a new Mute or Unmute feature.
Apple has announced that it will be adding five new features to the second-generation AirPods Pro this fall
“This fall, software updates for AirPods will unlock powerful new capabilities to transform the personal audio experience,” the tech giant said in an announcement.
“AirPods Pro (2nd generation) becomes easier to use in different environments and interactions with three powerful new features: Adaptive Audio, Personalized Volume, and Conversation Awareness. The entire lineup will also receive new and improved features that make calling and automatic switching even smoother.”
Adaptive Audio: Combines two functions into one
This is Apple’s middle ground for listening to your favorite music while still chatting with people in the real world.
Machine learning analyzes the environment of the earbuds and adjusts the sound accordingly when the background noise changes.
For example, loud or disturbing sounds around you are automatically muted, while other sounds remain audible.
9to5Mac shared an example of a user doing the dishes with Transparency activated; the system switches to Active Noise Cancellation (ANC) the moment they turn on the vacuum cleaner.
Transparency mode lets in outside noise so you can hear what’s going on around you.
Personalized Volume: AirPods Pro 2 will know how you like it
Another feature powered by machine learning is Personalized Volume, which learns the user’s volume preferences in different conditions to automatically fine-tune the media experience.
Conversation Awareness: Hear conversations without removing the earbuds
You no longer have to remove your AirPods to talk to someone.
Conversation Awareness automatically lowers the volume of your song or podcast and enhances the voices in front of you while reducing background noise such as traffic.
Automatic Switching: No more waiting for AirPods Pro 2 to pair with other Apple devices
Apple said at WWDC that AirPods Pro 2 will appear to pair instantly with other devices; currently, the process takes a few seconds to complete.
“Plus, switching between Apple devices with AirPods gets even easier with auto-switching updates,” Apple shared in a press release.
“Now the connection time between a user’s Apple devices is significantly faster and more reliable, making it easier to switch from a favorite podcast on iPhone to a work call on Mac.”
Mute or Unmute: No more reaching for your iPhone to mute or unmute a call
For added convenience, using AirPods during calls has been enhanced with a new feature for AirPods Pro (1st and 2nd generation), AirPods (3rd generation), and AirPods Max.
Users can press the stem — or the Digital Crown on AirPods Max — to quickly mute or unmute themselves, making multitasking effortless.
While the AirPods Pro 2 software update may sound exciting to some, Apple’s Vision Pro headset was the star of the annual tech conference.
The $3,499 headset allows users to merge the real world with a digital world controlled by their eyes, voice, and hands — no controllers required.
The headset runs on visionOS, which Apple touts as “the world’s first spatial operating system.”
Apple calls it “spatial computing” because it blends content into the space around you.
Mike Rockwell, vice president of Apple’s Technology Development Group, said, “Creating our first spatial computer required ingenuity in almost every facet of the system.
“Through tight integration of hardware and software, we have designed a standalone spatial computer in a compact portable form factor that is the most advanced personal electronic device ever.”
Users navigate the augmented experience with their eyes, their hands, and spoken commands.
Alan Dye, Apple’s head of human interface, said users select content with their eyes, tap their fingers together to click, and swipe gently to scroll.