In iOS 16, apps can trigger real-world actions hands-free


The new feature arriving in iOS 16 will allow apps to trigger real-world actions hands-free. That means users could do things like start playing music just by walking into a room, or turn on an e-bike for a workout just by getting on it. Apple told developers today during a session at the company’s Worldwide Developers Conference (WWDC) that these hands-free actions can be triggered even if the iOS user isn’t actively using the app at the time.

The update, which leverages Apple’s Nearby Interaction Framework, could lead to some interesting use cases where the iPhone becomes a way to interact with objects in the real world, if developers and accessory manufacturers choose to adopt the technology.

During the session, Apple explained how today’s apps can connect and exchange data with Bluetooth LE accessories even while running in the background. In iOS 16, however, apps will also be able to start a Nearby Interaction session in the background with a Bluetooth LE accessory that also supports Ultra Wideband.
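As a rough illustration of what that could look like in code, the sketch below starts an accessory session with the NearbyInteraction framework. It assumes the accessory has already been discovered and paired over Bluetooth LE, and that `accessoryConfigData` (the UWB configuration blob sent by the accessory) and `peripheralID` (its Bluetooth peripheral identifier) are supplied by the app’s own accessory protocol; the background-capable initializer shown here matches how Apple describes the iOS 16 addition, but the exact signature should be confirmed against Apple’s documentation.

```swift
import NearbyInteraction

final class AccessorySessionManager: NSObject, NISessionDelegate {
    private var session: NISession?

    // `accessoryConfigData` is the UWB configuration blob the accessory sends over
    // Bluetooth LE; `peripheralID` is its Bluetooth peripheral identifier. Both are
    // placeholders for values the app's own accessory protocol would supply.
    func startBackgroundSession(accessoryConfigData: Data, peripheralID: UUID) {
        do {
            // iOS 16 adds an initializer that ties the Nearby Interaction session to a
            // paired Bluetooth LE peripheral, which is what lets ranging continue while
            // the app is in the background.
            let config = try NINearbyAccessoryConfiguration(
                accessoryData: accessoryConfigData,
                bluetoothPeerIdentifier: peripheralID
            )
            let session = NISession()
            session.delegate = self
            session.run(config)
            self.session = session
        } catch {
            print("Could not create accessory configuration: \(error)")
        }
    }

    // Ranging updates: distance (and, with UWB line of sight, direction) to the accessory.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let accessory = nearbyObjects.first else { return }
        if let distance = accessory.distance {
            print("Accessory is \(distance) meters away")
        }
    }

    // The session generates configuration data the app must relay back to the
    // accessory (over Bluetooth LE) so that both sides can begin ranging.
    func session(_ session: NISession,
                 didGenerateShareableConfigurationData shareableConfigurationData: Data,
                 for object: NINearbyObject) {
        // Send `shareableConfigurationData` to the accessory here.
    }
}
```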

Accordingly, Apple has updated its specification for accessory manufacturers to support these new background sessions.

This paves the way for a future where the line between apps and the physical world blurs, but whether third-party app and device makers choose to use the feature remains to be seen.

The new feature is part of a larger update to Apple’s Nearby Interaction Framework, which was the focus of the developer session.

Introduced at WWDC 2020 with iOS 14, this framework lets third-party app developers tap into the U1, or Ultra Wideband (UWB), chip found in iPhone 11 and later devices, the Apple Watch, and other third-party accessories. It’s what powers the Precision Finding capabilities of Apple’s AirTag today: iPhone users can open the Find My app and be guided to their AirTag’s precise location with on-screen directional arrows, along with other cues that indicate how far away the AirTag is or whether it may be on a different floor.

With iOS 16, third-party developers will be able to build apps that do much the same thing, thanks to a new capability that lets them integrate ARKit, Apple’s augmented reality development toolkit, with the Nearby Interaction framework.

This will allow developers to take advantage of the device trajectory computed by ARKit, so their apps can likewise intelligently guide a user to a misplaced item or another object the user may want to interact with, depending on the app’s functionality. By leveraging ARKit, developers get more consistent distance and direction information than they would from Nearby Interaction alone.
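In practice, that integration is expected to surface as a camera-assistance option on the session configuration. The following is a minimal sketch under that assumption; the `isCameraAssistanceEnabled` flag and its availability on accessory configurations should be verified against Apple’s iOS 16 documentation.

```swift
import NearbyInteraction

// Sketch: enabling ARKit-backed camera assistance on an accessory session. Assumes
// `accessoryConfigData` and `peripheralID` were obtained from the accessory over
// Bluetooth LE, as in the earlier example. The caller must keep strong references
// to both the returned session and the delegate.
func runCameraAssistedSession(accessoryConfigData: Data,
                              peripheralID: UUID,
                              delegate: NISessionDelegate) throws -> NISession {
    let config = try NINearbyAccessoryConfiguration(
        accessoryData: accessoryConfigData,
        bluetoothPeerIdentifier: peripheralID
    )

    // With camera assistance enabled, Nearby Interaction fuses UWB measurements with
    // the device trajectory computed by ARKit, which is what yields the more consistent
    // distance and direction values described above.
    config.isCameraAssistanceEnabled = true

    let session = NISession()
    session.delegate = delegate
    session.run(config)
    return session
}
```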

The feature isn’t meant only for third-party accessories in the AirTag mold, however. Apple presented another use case in which a museum could use Ultra Wideband accessories to guide visitors through its exhibits.

Additionally, the capability can be used to overlay directional arrows or other AR objects on top of the real-world camera view as the app guides users to the Ultra Wideband object or accessory. Continuing the demo, Apple briefly showed how red AR bubbles could appear in the app on top of the camera view to indicate the way forward.
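A hedged sketch of that overlay pattern follows, assuming a RealityKit `ARView` is already showing the camera feed and a camera-assisted `NISession` is running: the app asks the session for the accessory’s world transform and drops a marker there. The `worldTransform(for:)` call reflects how the iOS 16 API has been described, and `placeGuidanceMarker` is a hypothetical helper, not part of Apple’s API.

```swift
import ARKit
import NearbyInteraction
import RealityKit

// Sketch: placing an AR marker at an Ultra Wideband accessory's position. Assumes
// `arView` is a RealityKit ARView already showing the camera feed and `niSession`
// is a running, camera-assisted NISession.
func placeGuidanceMarker(for accessory: NINearbyObject,
                         niSession: NISession,
                         in arView: ARView) {
    // worldTransform(for:) reports the accessory's position in ARKit world
    // coordinates once camera assistance has converged; it is nil until then.
    guard let transform = niSession.worldTransform(for: accessory) else { return }

    // A small red sphere stands in for the "red AR bubbles" shown in Apple's demo.
    let marker = ModelEntity(
        mesh: .generateSphere(radius: 0.05),
        materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    let anchor = AnchorEntity(world: transform)
    anchor.addChild(marker)
    arView.scene.addAnchor(anchor)
}
```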

Longer term, this feature lays the foundation for Apple’s mixed reality smart glasses, where AR-powered apps would presumably be at the heart of the experience.

The updated framework is rolling out to beta testers of the iOS 16 software update, which will reach the general public later this year.

