In iOS 16, apps can trigger hands-free actions in the real world

New functionality arriving in iOS 16 allows apps to trigger real-world actions hands-free. That means users can do things like start playing music simply by walking into a room, or turn on an e-bike for a workout just by standing on it. Apple told developers today, in a session hosted at the company’s Worldwide Developers Conference (WWDC), that these hands-free actions can be triggered even when the user isn’t actively using the app at the time.

The update, which leverages Apple’s Nearby Interaction framework, could lead to some interesting use cases where the iPhone becomes a way to interact with objects in the real world, if developers and accessory makers choose to adopt the technology.

During the session, Apple explained how apps today can connect and exchange data with Bluetooth LE accessories, even while running in the background. In iOS 16, however, apps can also start a Nearby Interaction session with a Bluetooth LE accessory that supports Ultra Wideband, and keep it running in the background.
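
For developers, a background accessory session would presumably look something like the sketch below, which assumes an accessory that has already sent its configuration data over Bluetooth LE. The names `AccessorySessionManager`, `accessoryConfigData`, and `peripheralID` are placeholders; the `accessoryData:bluetoothPeerIdentifier:` initializer is the iOS 16 addition that ties the Nearby Interaction session to a paired Bluetooth peripheral so ranging can continue while the app is backgrounded.

```swift
import NearbyInteraction
import CoreBluetooth

// Minimal sketch, not Apple's sample code. `accessoryConfigData` is the
// configuration blob an accessory sends over its BLE data channel, and
// `peripheralID` is the identifier of the paired CBPeripheral.
final class AccessorySessionManager: NSObject, NISessionDelegate {
    private var niSession: NISession?

    func start(accessoryConfigData: Data, peripheralID: UUID) {
        // Ultra Wideband ranging requires a U1-equipped device.
        guard NISession.deviceCapabilities.supportsPreciseDistanceMeasurement else { return }

        let session = NISession()
        session.delegate = self
        do {
            // iOS 16: associating the configuration with a Bluetooth peer
            // lets the session keep ranging while the app is in the background.
            let config = try NINearbyAccessoryConfiguration(
                accessoryData: accessoryConfigData,
                bluetoothPeerIdentifier: peripheralID)
            session.run(config)
            niSession = session
        } catch {
            print("Could not create accessory configuration: \(error)")
        }
    }

    // Send this shareable configuration back to the accessory over BLE so it
    // can join the UWB session (the transport is accessory-specific).
    func session(_ session: NISession,
                 didGenerateShareableConfigurationData shareableConfigurationData: Data,
                 for object: NINearbyObject) {
        // e.g. write `shareableConfigurationData` to a GATT characteristic here
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let accessory = nearbyObjects.first, let distance = accessory.distance else { return }
        print("Accessory is \(distance) meters away")
    }
}
```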

As a result, Apple has updated its specification for accessory manufacturers to support these new background sessions.

This paves the way for a future where the line between apps and the physical world blurs, though it remains to be seen whether third-party app and device makers choose to use the functionality.

The new feature is part of a broader update to Apple’s Nearby Interaction framework, which was the focus of the developer session.

Introduced at WWDC 2020 with iOS 14, this framework allows third-party app developers to leverage the U1, or Ultra Wideband (UWB), chip in iPhone 11 and later devices, the Apple Watch, and third-party accessories. It’s what powers the Precision Finding capability of Apple’s AirTag today, which lets iPhone users open the Find My app and be guided to the precise location of their AirTag with on-screen directional arrows, along with other guidance that tells them how far away they are from the AirTag or whether it’s on a different floor.

With iOS 16, third-party developers can build apps that do much the same, thanks to a new capability that allows them to integrate ARKit, Apple’s augmented reality developer toolkit, with the Nearby Interaction framework.

This allows developers to take advantage of the device’s trajectory as computed by ARKit, so their apps can intelligently guide a user to a lost item or another object the user may want to interact with, depending on the app’s functionality. By using ARKit, developers get more consistent distance and direction information than they would with Nearby Interaction alone.
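
A minimal sketch of what enabling that integration might look like is below, assuming an app that already runs its own ARSession (behind an ARView, for example). The accessory configuration data and the function name are placeholders; `setARSession(_:)` and `isCameraAssistanceEnabled` are the iOS 16 additions that tie the two frameworks together.

```swift
import NearbyInteraction
import ARKit

// Sketch only: ties a Nearby Interaction session to the app's existing
// ARSession so UWB ranging can be fused with ARKit's device trajectory.
func runCameraAssistedSession(accessoryConfigData: Data,
                              arSession: ARSession,
                              delegate: NISessionDelegate) throws -> NISession {
    let niSession = NISession()
    niSession.delegate = delegate // NISession holds its delegate weakly; retain it elsewhere

    // Share the app's ARSession so both frameworks use the same world tracking.
    niSession.setARSession(arSession)

    let config = try NINearbyAccessoryConfiguration(data: accessoryConfigData)
    // iOS 16: camera assistance blends ARKit's trajectory with UWB
    // measurements for steadier distance and direction updates.
    config.isCameraAssistanceEnabled = true

    niSession.run(config)
    return niSession
}
```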

The functionality doesn’t have to be used only for AirTag-like accessories made by third parties, however. Apple demonstrated another use case where a museum could use Ultra Wideband accessories to guide visitors through its exhibits.

In addition, the capability can be used to overlay directional arrows or other AR objects on the camera’s view of the real world as the app directs users to the Ultra Wideband object or accessory. Continuing the demo, Apple briefly showed how red AR bubbles could appear on screen, on top of the camera view, to point the way.
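
One way to build that kind of overlay, sketched below under the same assumptions as the earlier snippets, is to ask the Nearby Interaction session for the accessory’s position in ARKit’s world coordinate space (a new iOS 16 call, `worldTransform(for:)`) and drop an AR anchor there for the app’s renderer to decorate with an arrow or bubble; the function and anchor names here are illustrative.

```swift
import NearbyInteraction
import ARKit

// Sketch: place an AR anchor at the UWB accessory's estimated position so
// rendering code (RealityKit, SceneKit, etc.) can draw guidance on top of
// the camera feed.
func placeGuidanceAnchor(for object: NINearbyObject,
                         niSession: NISession,
                         arSession: ARSession) {
    // Returns nil until the fused UWB + ARKit estimate has converged.
    guard let transform = niSession.worldTransform(for: object) else { return }

    let anchor = ARAnchor(name: "uwb-accessory", transform: transform)
    arSession.add(anchor: anchor)
    // A renderer would attach an arrow or bubble entity to this anchor.
}
```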

In the longer term, this functionality lays the groundwork for Apple’s rumored mixed reality smartglasses, where AR-powered apps would presumably be the core of the experience.

The updated functionality is rolling out to beta testers of the iOS 16 software update that will reach the general public later this year.

