Summary of Apple Worldwide Developers Conference 2018

Every year Apple invites the community to their Worldwide Developers Conference, and even though 6 000 developers get to go, it’s really hard to get a ticket. For me it took six attempts, but I finally got one and it was worth the wait.

In a five-day conference a lot of topics are covered. There were over 100 sessions and even more labs, where you could actually get access to over 1 000 Apple engineers. I managed to go to about a quarter of the sessions and visit a handful of labs, and in this post I’ll try to summarize my impressions, talk about how it may affect us developers and link to the best talks.

Machine Learning and Augmented Reality

Last year Apple introduced Core ML and ARKit and to be honest I was not really that interested. ARKit lacked a lot of stuff and you still had to train your machine learning models using other tools before you could take advantage of Core ML. After this year’s talks though, I’m really impressed.

The first piece of the puzzle is Create ML, a simple way to augment prebuilt models from Apple with your own custom data. You can use images, text or tabular data and it’s ridiculously easy to get started. For images you can start by putting a lot of images of apples in a folder called apple, oranges in a folder called orange and bananas in a third folder. Then you can use Swift Playgrounds to pass the images to Create ML or just write a simple script like this:

#!/usr/bin/swift
import Foundation
import CreateML

// Specify Data
let trainDirectory = URL(fileURLWithPath: "/Users/Joakim/Desktop/Fruits")
let testDirectory = URL(fileURLWithPath: "/Users/Joakim/Desktop/TestFruits")

// Create Model
let model = try MLImageClassifier(trainingData: .labeledDirectories(at: trainDirectory))
// Evaluate Model and print the metrics
let evaluation = model.evaluation(on: .labeledDirectories(at: testDirectory))
print(evaluation)
// Save Model
try model.write(to: URL(fileURLWithPath: "/Users/Joakim/Desktop/FruitClassifier.mlmodel"))

Now you have a trained model that you can drag into your project, and you’re ready to recognize fruits in your images. Text and tabular data can be trained in the same way. I can think of a lot of cases where this could have improved previous projects I’ve been involved in and I hope I’ll get to use it in the future.
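
To give you an idea of how little code is needed on the consuming side, here is a minimal sketch of classifying an image with the Vision framework, assuming FruitClassifier is the class Xcode generates when you drag the model into your project:

import CoreGraphics
import CoreML
import Vision

// A minimal sketch, assuming FruitClassifier is the class Xcode generated
// from FruitClassifier.mlmodel.
func classifyFruit(in image: CGImage) throws {
    let visionModel = try VNCoreMLModel(for: FruitClassifier().model)
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let best = results.first else { return }
        print("Looks like \(best.identifier) (confidence \(best.confidence))")
    }
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}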

With machine learning in place we can turn to one of its applications: augmented reality. Apple introduced ARKit last year, and this year they added image, face and object tracking, which means you can both recognize these in the real world and track them as they move around. You should really check out this video to understand how it works, but I was able to try this for myself and it’s actually really nice.
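
Getting started is not much work either. Here is a minimal sketch of turning on image detection and tracking, assuming an ARSCNView called sceneView and reference images added to an asset catalog group named “AR Resources”:

import ARKit

// A minimal sketch, assuming sceneView is an ARSCNView and the reference
// images live in an asset catalog group called "AR Resources".
let configuration = ARWorldTrackingConfiguration()
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                          bundle: nil) {
    configuration.detectionImages = referenceImages
    configuration.maximumNumberOfTrackedImages = 2 // tracking while they move is new this year
}
sceneView.session.run(configuration)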

They also added support for persisting the AR world as you build it and support for multiple users in the same world which makes it possible to create really fun experiences. At WWDC they ran an AR tournament around the game SwiftShot. The entire game is open source so you can download it to play or learn more.
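
Persistence is built around the new ARWorldMap. Here is a minimal sketch of saving one to disk, assuming an existing ARSession and a hypothetical file URL called mapURL:

import ARKit

// A minimal sketch, assuming session is a running ARSession and mapURL is a
// file URL of your own choosing (hypothetical).
func saveWorldMap(from session: ARSession, to mapURL: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let worldMap = worldMap else { return }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                        requiringSecureCoding: true)
            try data.write(to: mapURL)
        } catch {
            print("Could not save world map: \(error)")
        }
    }
}

Loading works the other way around: unarchive the ARWorldMap from the file and assign it to initialWorldMap on the configuration before running the session again.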

Universal Scene Descriptions

If games and object recognition aren’t your thing, how about shopping for a bag online and actually being able to put it on your back to see how it looks? That won’t be so hard to do in the future.

If, in addition to an image of the bag, you create your bag as an AR scene, you can now store this in a .usdz file. This is an open file format that Apple created in collaboration with Pixar. When you’ve created this file you can link to it, just as you normally do on the web, and users can click on it to show the bag in the augmented real world and do whatever they want with it.
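
In a native app you get the same experience through AR Quick Look. Here is a minimal sketch, assuming a file called bag.usdz bundled with the app (a hypothetical name):

import UIKit
import QuickLook

// A minimal sketch, assuming bag.usdz is bundled with the app (hypothetical).
class BagViewController: UIViewController, QLPreviewControllerDataSource {

    func showBagInAR() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        return Bundle.main.url(forResource: "bag", withExtension: "usdz")! as NSURL
    }
}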

Creating these scenes will obviously be quite hard, but Adobe showed a preview of Project Aero that will make this much easier.

This may have been one of the most underappreciated announcements this year, and when you create your next e-commerce solution I will call it a failure if it doesn’t include AR.

Siri Shortcuts

Building on the machine learning theme, Siri Shortcuts provide a way for developers to surface actions that a user performs frequently to Siri, so that Siri can learn the behaviour of the user. With this knowledge Siri can surface these actions as shortcuts on the lock screen, in Spotlight or on the Siri watch face on the Apple Watch as they become relevant (based on time, location and user behaviour). Contrary to other companies, Apple doesn’t collect this data on their servers; it all happens on-device. To me this makes it much less creepy but, to be honest, probably not as good.

These shortcuts can also be triggered by voice if the user assigns a trigger phrase to them. The trigger phrase can even be used on CarPlay or the HomePod. This finally opens Siri to all developers, which is really good. Unfortunately the trigger phrases are static, so we can’t pass parameters the way we can with Alexa for instance, and that makes the feature way less attractive. But it’s a first step and I wouldn’t be surprised to see this added in a future release.

If your app already uses NSUserActivity you can expose it to Siri Shortcuts by simply setting userActivity.isEligibleForPrediction = true, and if you want more control, and the ability to run commands in the background using your voice, you can use the more advanced Intents API.
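
Here is a minimal sketch of the NSUserActivity route, assuming an activity type and invocation phrase of my own choosing:

import UIKit

// A minimal sketch, assuming "com.example.order-coffee" and the phrase are
// placeholders for your own activity.
func donateOrderActivity(to viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.order-coffee")
    activity.title = "Order coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true            // lets Siri suggest it as a shortcut
    activity.suggestedInvocationPhrase = "Coffee time" // hint shown when recording a phrase
    viewController.userActivity = activity             // the system donates it from here
}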

Finally there is a Shortcuts app that can tie all of these shortcuts together. The shortcuts can be scripted, combined with calls to web services and we can even assign a trigger phrase to them. This means that for the first time since the iPhone was introduced, we finally have built-in automation on iOS!

Notifications

In the keynote Apple showed grouped notifications and got huge applause. The changes to the notification system were much bigger though; it really got a complete overhaul. Notifications can now be more dynamic, have custom actions and, if the notification provides a custom UI using a content extension, be fully interactive. This means that if a user receives an offer as a push notification, they can interact with that notification and choose what to do with the offer right there, without opening the app.
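
Here is a minimal sketch of what the offer scenario could look like for a local notification, assuming identifiers of my own choosing (the custom UI itself would live in a separate content extension target):

import UserNotifications

// A minimal sketch, assuming "OFFER" and the action identifiers are placeholders.
let accept = UNNotificationAction(identifier: "ACCEPT_OFFER", title: "Accept", options: [])
let decline = UNNotificationAction(identifier: "DECLINE_OFFER", title: "Decline", options: [])
let category = UNNotificationCategory(identifier: "OFFER",
                                      actions: [accept, decline],
                                      intentIdentifiers: [],
                                      options: [])
UNUserNotificationCenter.current().setNotificationCategories([category])

let content = UNMutableNotificationContent()
content.title = "20% off today"
content.categoryIdentifier = "OFFER"   // ties the actions (and a content extension) to it
content.threadIdentifier = "offers"    // groups related notifications together
content.summaryArgument = "offers"     // used in the group summary text, new this year

let request = UNNotificationRequest(identifier: UUID().uuidString,
                                    content: content,
                                    trigger: nil)
UNUserNotificationCenter.current().add(request)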

Apple is also making it way easier to manage notifications as the user receives them. The user will now be able to turn notifications off right from the notification itself, so if you abuse the system by sending too many or irrelevant notifications, I’m pretty sure that user engagement will go down over time.

Performance, performance, performance

Performance really was the theme of the conference, and if you run the beta you will not be disappointed. You can really see that the platforms have grown up: you don’t have to focus on how to put stuff on the screen anymore and can instead focus on how to do it really, really fast. This includes everything from the network stack up to scrolling in table views, from persistence in Core Data to how image assets are compressed.

I learned so many great tricks and this really is where native development shines. I’ve never been a fan of cross-platform tools such as Electron or React Native, and this is one reason why. I realize that in many situations this doesn’t really matter, but if you want to make your product the best it can be, there really isn’t any choice.

The conference

Besides the big stuff there obviously was a lot of “small” stuff as well. NFC for Wallet Passes, a new Mac App Store, Dark Mode for macOS, a sneak peek of UIKit running on macOS, WebKit running on the Apple Watch and a whole lot more.

To summarize, I was really impressed with both the content of the conference and the conference itself. The organization around the keynote was a mess but besides that, everything worked great.

There were a lot of great sessions, but these are my favorites:

After visiting a few labs I realized what everyone had been telling me: the labs are the real benefit of the conference. I kind of regret not going to more, but I guess I can do that the next time I get to go, in six years. I really hope it won’t take that long though, because as an Apple fan, this was without a doubt the best conference experience I’ve ever had, and I hope I can go next year.