iOS 11 is Apple’s new operating system update for mobile devices.

Apple just released the latest update to its mobile operating system, iOS 11, to developers at WWDC (the Worldwide Developers Conference), introducing several new technologies (and updates to existing ones) for building innovative experiences for users and businesses.

Image: ARKit in use, with a child holding an iPad up to a table and viewing digital content superimposed on the scene through the device.

ARKit

When most people think of augmented reality, they think of Minority Report (waving your arms around with gestures and jazz hands) or Google Cardboard. Both require new pieces of hardware and new ways of interacting with digital products, and the experience is unfamiliar to users.

Apple’s taken a different approach, and built a software development kit for the mobile devices people already know how to use (iPhone, iPad)—using the camera as the interface for looking into the augmented reality world. This enables anyone with a newer device to play games that interact with the real world, see how furniture will fit into their living space, see an approximation of what  clothes may look like on them, and anything else that super-imposes digital content or objects into the real world.

By deploying ARKit with iOS, every new phone running the latest operating system gains access to augmented reality products, and Apple gains the largest AR platform in the world, overnight.
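
To give a sense of how little code is involved, here is a minimal sketch of starting an AR session in Swift. The view controller and outlet names are placeholders; plane detection finds horizontal surfaces like the tabletop pictured above.

```swift
import UIKit
import ARKit

// Minimal sketch of an ARKit-backed view controller.
// "sceneView" is assumed to be an ARSCNView configured in a storyboard.
class ARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking follows the device's position and orientation;
        // plane detection finds horizontal surfaces to anchor content to.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view goes away to save power.
        sceneView.session.pause()
    }
}
```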

Image: The Siri waveform shown when audio input is captured with the iPhone microphone.

Siri®

Siri® is Apple’s voice assistant, which lets users control iOS and certain types of apps with just their voice. In iOS 11, SiriKit (the platform developers use to integrate Siri into their apps) has opened up even further, adding more support for sending and replying to messages, making calls, transferring money, making restaurant reservations, and booking rides.

Giving developers more access to what Siri® can do allows designers, developers, and companies to create experiences where a user isn’t required to open an app, or to provide an alternative interface when someone’s hands are busy (working out, cooking, etc.) and they still want access to the features your product provides.

Features supported by Siri®:

VoIP Calling
  • Initiate calls and search the user’s call history.
Messaging
  • Send messages and search the user’s received messages.
Payments
  • Send payments between users or pay bills.
Lists and Notes
  • Create and manage notes and to-do list items.
Visual Codes
  • Convey contact and payment information using QR codes.
Photos
  • Search for and display photos.
Workouts
  • Start, end, and manage fitness routines.
Ride Booking
  • Book rides and report their status.
Car Commands
  • Manage vehicle door locks and get the vehicle’s status.
CarPlay
  • Interact with a vehicle’s CarPlay system.
Restaurant Reservations
  • Create and manage restaurant reservations with help from the Maps app.
(Source)
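
As an illustration of the Messaging domain above, here is a minimal sketch of a SiriKit intent handler in Swift. It would live in an Intents app extension; the class name and the final message-sending step are hypothetical stand-ins for your own app logic.

```swift
import Intents

// Hypothetical handler for "Send a message with <YourApp>" requests.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    // Confirm who the message is going to, asking Siri to prompt if unknown.
    func resolveRecipients(for intent: INSendMessageIntent,
                           with completion: @escaping ([INPersonResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            completion([.needsValue()])
            return
        }
        completion(recipients.map { .success(with: $0) })
    }

    // Confirm the message body.
    func resolveContent(for intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(.success(with: text))
        } else {
            completion(.needsValue())
        }
    }

    // Hand the resolved message to the app's own messaging service
    // (omitted here), then report success back to Siri.
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```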

Image: The logo for Core ML, Apple's machine learning framework.

Core ML

Build intelligent applications.

Until now, apps have done only what they’ve explicitly been built to do. The designer and developer specify a feature, and the system responds exactly as it was programmed to. Machine learning focuses on the development of applications that can change when exposed to new data: providing answers, understanding text and language, recognizing people, and adapting the system’s responses to the context in which a user is using the product.

This opens the door for businesses to provide more personalized experiences: showing users products that data indicates they’ll be interested in, responding to support requests automatically without the user choosing from a list of options, providing recommendations, and making tasks easier by taking the first few steps for users. With iOS 11, Apple is providing support for your applications to learn from you, predict what you’re about to do, and show you the right things at the right time.

The framework, called Core ML and built on top of Metal (Apple’s interface to the GPU, or Graphics Processing Unit), makes it easy to integrate machine learning models into your app. Use models provided by Apple, or convert your existing models into the Core ML format to integrate them into your products.
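
Here is a minimal sketch of what that integration looks like, assuming a compiled model bundled with the app. The model name "SentimentClassifier" and its feature names are hypothetical; in practice, Xcode also generates a typed Swift wrapper class for any .mlmodel you add to a project.

```swift
import CoreML

func classify(_ text: String) {
    do {
        // Load a compiled Core ML model bundled with the app.
        // "SentimentClassifier" is a hypothetical model name.
        guard let modelURL = Bundle.main.url(forResource: "SentimentClassifier",
                                             withExtension: "mlmodelc") else { return }
        let model = try MLModel(contentsOf: modelURL)

        // Feature names must match the model's declared inputs and outputs;
        // "text" and "label" here are assumptions for illustration.
        let input = try MLDictionaryFeatureProvider(dictionary: ["text": text])
        let output = try model.prediction(from: input)
        print(output.featureValue(for: "label") ?? "no prediction")
    } catch {
        print("Prediction failed: \(error)")
    }
}
```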

iOS Machine Learning Examples

Some of the ways machine learning is already used in Apple’s products.

Computer vision.

With the iPhone camera and Core ML, developers are now able to build applications that see what your camera sees. Build products with computer vision features including face tracking, face detection, landmark detection, text detection, object tracking, and barcode detection. This opens up a wide range of applications, such as scanning printed documents and editing the digital text on your device, showing users information about a landmark when they view it through their viewfinder, and maintaining focus on someone’s face as they move around to get the perfect selfie.
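
Face detection, for example, takes only a few lines with the Vision framework, which sits on top of Core ML. A minimal sketch, assuming `photo` is a UIImage with a backing CGImage:

```swift
import UIKit
import Vision

func detectFaces(in photo: UIImage) {
    guard let cgImage = photo.cgImage else { return }

    // Ask Vision for the bounding box of every face it finds.
    let request = VNDetectFaceRectanglesRequest { request, error in
        let faces = request.results as? [VNFaceObservation] ?? []
        for face in faces {
            // Bounding boxes are normalized (0-1) relative to the image.
            print("Face at \(face.boundingBox)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```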

Natural Language Processing.

The APIs provided by Apple in iOS 11 allow for “language identification, tokenization, lemmatization, part of speech, and named entity recognition”.
(Source)
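
Those capabilities are exposed through the NSLinguisticTagger API. A minimal sketch that identifies the dominant language of a string and tags each word’s part of speech:

```swift
import Foundation

let text = "Apple introduced Core ML at WWDC in San Jose."
let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass, .nameType], options: 0)
tagger.string = text

// Language identification.
print(tagger.dominantLanguage ?? "unknown") // "en"

// Part-of-speech tagging, word by word.
let range = NSRange(location: 0, length: text.utf16.count)
let options: NSLinguisticTagger.Options = [.omitPunctuation, .omitWhitespace]
tagger.enumerateTags(in: range, unit: .word,
                     scheme: .lexicalClass, options: options) { tag, tokenRange, _ in
    if let tag = tag {
        let word = (text as NSString).substring(with: tokenRange)
        print("\(word): \(tag.rawValue)")
    }
}
```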

 

Image: MusicKit, released with iOS 11.

MusicKit

With MusicKit’s release at WWDC this year, Apple is finally opening up its music platform, with 27 million paid subscribers, to developers. Apps are now able to “create playlists, add songs to their library, and play any of the millions of songs in the Apple Music catalog”. For users who aren’t Apple Music subscribers, companies can offer a trial of the music service from within their own products.

This new software development kit opens up possibilities for rich applications focused on entertainment, smart homes, content viewing, gaming, and sharing music with friends. Since the announcement, companies partnered with Apple (Nike, Shazam) have announced MusicKit integrations that provide music services tightly coupled with the underlying operating system.
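
On the device side, the iOS 11 pieces of this are StoreKit (for Apple Music authorization) and MediaPlayer (for playback). A minimal sketch; the store ID below is a placeholder, not a real catalog entry:

```swift
import StoreKit
import MediaPlayer

// Ask the user for Apple Music access, then queue a catalog song by its
// store ID and start playback. "1234567890" is a placeholder ID.
SKCloudServiceController.requestAuthorization { status in
    guard status == .authorized else { return }

    DispatchQueue.main.async {
        let player = MPMusicPlayerController.systemMusicPlayer
        player.setQueue(with: ["1234567890"])
        player.play()
    }
}
```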