So Christmas time for Apple developers has just come around, bringing a bunch of expected and unexpected new features and APIs. While I wasn’t in San Jose this year, I kept up to date with the livestreams. Let’s have a look at some of the features that I thought were cool.
This is really cool. I created an emoji-like version of myself, applied the cartoon filter, and now I have a perfect online avatar that represents me.
You can also use Memoji while chatting on FaceTime with up to 32 of your friends. I know, who has 32 friends??
- Shared experiences
- Persistent worlds & object recognition
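Both features hang off the same new class, `ARWorldMap`. Here’s a minimal sketch of how I understand the persistence side works, assuming you already have a running `ARSession` (the function names and file URL are mine, not Apple’s): capture the session’s world map, archive it to disk, and hand it back to a new configuration later to relocalize.

```swift
import ARKit

// Sketch: capture the current world map and archive it to disk.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        // ARWorldMap supports NSSecureCoding, so it can be archived directly.
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}

// Sketch: load a saved map and relocalize a session against it.
func restoreWorldMap(in session: ARSession, from url: URL) {
    guard let data = try? Data(contentsOf: url),
          let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    // This is what makes worlds persistent; sending the same archived
    // map to a nearby device is the basis of shared experiences.
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

Sending that same archived `Data` blob to a peer (over MultipeerConnectivity, in Apple’s demos) is what lets two devices agree on the same coordinate space.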
The demo featured people playing a LEGO game (created in Unity) by holding up their iPads for about 10 minutes. I wonder how long they worked out before being able to hold the iPad up for that long.
A lot of people think that AR is gimmicky, which I agree with for now. But I see Apple taking steps towards a device with a radical new form factor. By creating great APIs and a bunch of apps in an App Store that would just work with the new device, Apple might be able to launch it with a lot of content available from day one.
I’d guess that the Apple Lens will be released in 2020 👓
A new file format created in collaboration with Pixar. It lets you pack an entire 3D scene into a single file and serve it in place of an image. Think product images in an online shop, or contextual objects and scenes in news articles.
Apple has an AR Quick Look Gallery on their website. Make sure to use an iOS 12 device to see the 3D preview.
I can’t figure out how to make them just yet, since I can’t find any of the tools that were mentioned. I am looking forward to checking this out and potentially integrating it into Overflights’ Road Code app.
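While authoring `.usdz` files is still a mystery to me, displaying one in-app looks straightforward: Quick Look recognizes the format and shows the AR object viewer automatically on iOS 12. A minimal sketch, assuming a hypothetical `toy_robot.usdz` bundled with the app:

```swift
import UIKit
import QuickLook

class ModelPreviewController: UIViewController, QLPreviewControllerDataSource {

    // Present the standard Quick Look preview, which on iOS 12
    // offers the "AR" tab for .usdz files.
    func presentModel() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // "toy_robot.usdz" is a placeholder asset name; any bundled
        // .usdz file URL works as the preview item.
        return Bundle.main.url(forResource: "toy_robot",
                               withExtension: "usdz")! as NSURL
    }
}
```

On the web, the same file can apparently be linked in place of an image and Safari on iOS 12 will open the AR viewer, which is how the gallery above works.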
In iOS 12, developers can offer shortcuts in their applications that can be mapped to a phrase the user says to Siri.
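The simplest route appears to be donating an `NSUserActivity` with the new iOS 12 prediction properties. A hedged sketch, where the activity type string and phrase are hypothetical examples of mine (the type would also need to be listed under `NSUserActivityTypes` in Info.plist):

```swift
import UIKit
import Intents

// Sketch: donate a shortcut by attaching an eligible activity
// to a visible view controller.
func donatePracticeQuizShortcut(on viewController: UIViewController) {
    // Hypothetical activity type for a Road Code practice quiz.
    let activity = NSUserActivity(activityType: "com.example.roadcode.practice-quiz")
    activity.title = "Practice a quiz"
    activity.isEligibleForSearch = true
    // New in iOS 12: marks the activity as a candidate for
    // Siri's shortcut predictions.
    activity.isEligibleForPrediction = true
    // The phrase Siri suggests when the user records a voice trigger.
    activity.suggestedInvocationPhrase = "Practice road code"
    // Making it the view controller's current activity donates it.
    viewController.userActivity = activity
}
```

Each donation tells Siri the user performed the action, and repeated donations are what drive the suggestions on the lock screen and in Settings.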
I am very excited to see how this plays out as the iOS 12 beta seeds continue to be released.