A short recap of some of the Apple WWDC 2020 talks


As this year’s WWDC has recently come to a close, we would like to share a short recap of the talks we think might interest you. Some talks are worth watching in full, but often a short written summary is enough. So here are our thoughts on a few talks that caught our eye.

This list is entirely subjective, filtered through the lens of our Apple fans and iOS developers. It might save you time or help you discover some talks that you missed. We’ve omitted the talks on App Clips and the updated widgets, as those are the most-watched and have been covered a hundred times by now.

What is Apple WWDC?

Apple’s annual Worldwide Developers Conference is centered around new technologies and features being added to Apple products and services. Developers can attend sessions with Apple engineers to explore new ways to use these tools and implement them in their upcoming projects.

The 2020 WWDC came with multiple new features to implement and updates to pre-existing tools. So, starting with the obvious one:

What’s new in Swift

If you are not curious how Swift’s speed and memory management compare to Obj-C, you can skip this talk – it is aimed squarely at hardcore language enthusiasts; not a bad thing at all, but a fairly hermetic focus. It may be more time-efficient to just read what’s new in Swift 5.3, for example here. And if you just want a short summary, here it is:

  • the binary size of apps written in Swift is now only 50% bigger than in Obj-C apps 
  • heap usage (i.e. the app’s variables stored in memory) is significantly smaller in Swift 5.3 compared to older versions
  • code completion should work much better in the new Swift version (up to 15x faster)
  • the debugger should be more stable and resilient
  • multiple, mostly small improvements to Swift syntax and code style
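One of those small syntax improvements in Swift 5.3 is multiple trailing closures (SE-0279). A minimal sketch – the `animate` helper below is a made-up stand-in for any API taking two closures, not a real UIKit function:

```swift
// Hypothetical helper with two closure parameters, for illustration only.
func animate(duration: Double, animations: () -> Void, completion: (Bool) -> Void) {
    animations()
    completion(true)
}

// Before Swift 5.3, the second closure had to stay inside the parentheses.
// Now both can trail the call, with every closure after the first labeled:
animate(duration: 0.3) {
    print("animating")
} completion: { finished in
    print("finished: \(finished)")
}
```
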

Advances in UICollectionView and Lists in UICollectionView

The first talk is a very short summary of what’s new in UICollectionView. Here’s a short list so you don’t have to watch it:

  • Section Snapshots – update to diffable data sources, now we can use a separate snapshot for each section
  • List Configuration – a new style for UICollectionViewCompositionalLayout, UITableView-like appearance, creating a simple list layout in 2 lines of code
  • Cell registrations – a new way of using cells; configuration and registration of cells are now connected, so there’s no more forgetting to register a new cell
  • Cell content configurations – predefined cell styles from UITableView now also in UICollectionView as UIListContentConfiguration
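As a taste of the first two items, here is a minimal sketch based on the documented iOS 14 API – the plain `String` item type is an assumption for illustration:

```swift
import UIKit

// A list layout in two lines, via the new compositional layout list style:
let config = UICollectionLayoutListConfiguration(appearance: .insetGrouped)
let layout = UICollectionViewCompositionalLayout.list(using: config)

// Cell registration ties configuration and registration together, so there is
// no separate register(_:forCellWithReuseIdentifier:) call to forget.
let registration = UICollectionView.CellRegistration<UICollectionViewListCell, String> { cell, indexPath, item in
    var content = cell.defaultContentConfiguration()  // a UIListContentConfiguration
    content.text = item
    cell.contentConfiguration = content
}

// In the data source, cells are then dequeued with:
// collectionView.dequeueConfiguredReusableCell(using: registration, for: indexPath, item: item)
```
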

The second one covers the new UICollectionView list style in more depth: a UITableView-like appearance, designed among other things for use as a sidebar menu in iPad apps. It also delivers a new type of UICollectionViewCell: UICollectionViewListCell.

This new cell type adds:

  • Improved separator handling
  • Swipe actions as features of a cell
  • New accessory types for the leading and trailing sides of cells

In the talk, you can see very short code snippets presenting how the new API can be used. If you want to learn how to use this new API, we would recommend checking the documentation and trying it out yourself.
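The talk’s own snippets aren’t reproduced here, but a minimal sketch of the new cell features, assembled from the documented iOS 14 API, looks roughly like this (the action body is a placeholder):

```swift
import UIKit

// New accessory types on UICollectionViewListCell, set inside a cell registration:
let registration = UICollectionView.CellRegistration<UICollectionViewListCell, String> { cell, _, item in
    var content = cell.defaultContentConfiguration()
    content.text = item
    cell.contentConfiguration = content
    // Accessories for the trailing/leading sides of the cell:
    cell.accessories = [.disclosureIndicator(), .delete()]
}

// Swipe actions are now a feature of the list configuration itself:
var config = UICollectionLayoutListConfiguration(appearance: .plain)
config.trailingSwipeActionsConfigurationProvider = { indexPath in
    let delete = UIContextualAction(style: .destructive, title: "Delete") { _, _, done in
        // Remove the item from the data source here.
        done(true)
    }
    return UISwipeActionsConfiguration(actions: [delete])
}
```
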

Meet Nearby Interaction

The new U1 chip in newer iPhones enables much more precise device location capabilities through a new NearbyInteraction API. So far, it’s only been used by the operating system, e.g. when determining what devices are available for sharing files via AirDrop. 

Starting with iOS 14, a public API is available to developers, offering distance and direction measurement between devices with centimeter-level accuracy. This might become a very powerful tool for real-world interactions, for example, pinpointing which rideshare vehicle is waiting for the user.

The feature will also find a place in AR gaming and collaboration, or even health and fitness, especially when combined with other advances, like access to AirPods Pro motion data or the new features in ARKit 4.
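The NearbyInteraction flow is small enough to sketch. This assumes the peer’s discovery token has already been exchanged over your own networking layer (the framework leaves that part to you):

```swift
import NearbyInteraction

// Minimal sketch of an iOS 14 NearbyInteraction session.
final class NearbyDemo: NSObject, NISessionDelegate {
    let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Called repeatedly as distance/direction estimates are refined.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first else { return }
        if let distance = object.distance {      // meters
            print("Peer is \(distance) m away")
        }
        if let direction = object.direction {    // unit vector toward the peer
            print("Direction: \(direction)")
        }
    }
}
```

Note that `distance` and `direction` are optionals: the session reports whatever it can currently measure, which depends on device orientation and line of sight.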

Detect Body and Hand Pose with Vision

Do you want to create an application for sequence photography? Or maybe for detecting and translating finger gestures to seamlessly talk with people with hearing impairments? Or just draw with your fingers without touching the screen? 

If you said yes to any of those questions, we highly recommend this WWDC session. You can learn how to use the Vision framework for hand and body pose detection.
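For a feel of the API, here is a sketch of single-image hand pose detection on iOS 14, assuming a `CGImage` coming from your camera or photo pipeline:

```swift
import Vision

// Detect hand poses in a still image and read the thumb-tip joint.
func detectHandPose(in cgImage: CGImage) throws {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    for observation in request.results ?? [] {
        // Joints come back as normalized points with confidence scores.
        let thumbPoints = try observation.recognizedPoints(.thumb)
        if let tip = thumbPoints[.thumbTip], tip.confidence > 0.3 {
            print("Thumb tip at \(tip.location)")
        }
    }
}
```
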

What’s new in location

It should rather be called “What’s new in location privacy” as its main focus is new privacy settings for apps. If this is something that interests you, we highly recommend watching it. If you don’t have time, here’s a short summary of what’s new:

  • when granting an app permission to use their location, users can now choose whether that location should be precise – this should help them feel more comfortable giving access to this information
  • new callbacks for CLLocationManager inform the app about changes to the location accuracy authorization
  • location accuracy authorization can be granted and revoked at any time, e.g. a user can authorize the app to use precise location only until the app is closed

The talk also covers (with examples) how to handle the new location authorization and the location settings for App Clips and widgets.
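The core of the new authorization handling fits in a few lines. A sketch for iOS 14 – the "Navigation" purpose key is an assumption and must match an entry under `NSLocationTemporaryUsageDescriptionDictionary` in Info.plist:

```swift
import CoreLocation

final class LocationHandler: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // New iOS 14 callback, replacing locationManager(_:didChangeAuthorization:).
    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        switch manager.accuracyAuthorization {
        case .fullAccuracy:
            manager.startUpdatingLocation()
        case .reducedAccuracy:
            // Ask for precise location temporarily; the grant lasts until
            // the app is closed, matching the behavior described above.
            manager.requestTemporaryFullAccuracyAuthorization(withPurposeKey: "Navigation")
        @unknown default:
            break
        }
    }
}
```
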

What’s new in Wallet and Apple Pay

No need to watch the whole thing – there isn’t much here beyond a list of the new things connected to Apple Pay. To save you some time, we’ve listed them here:

  • The Apple Pay button now has new, specialized types that change the text on it, e.g. Buy with Apple Pay, Rent with Apple Pay, Tip with Apple Pay, etc.
  • The new styles will also come to the Web versions
  • Supporting Apple Pay in App Clips – the recommended way to handle payments
  • Apple Pay is coming to Catalyst and native Mac apps available in the App Store
  • Shipping and Contact data when paying with Apple Pay can now be formatted and validated based on region
  • Issuer extensions for adding cards – users can add cards from inside the Wallet app when they are signed in to the card issuer’s app

There isn’t much more outside of this list – just a simple example of how to handle payments, which isn’t worth the watch, so we’ve done the watching for you.
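The new button types are a one-liner. A sketch, assuming iOS 14 where types like `.tip` and `.rent` were added (the button text and localization come for free):

```swift
import PassKit

// One of the new specialized button types – renders as "Tip with Apple Pay".
let tipButton = PKPaymentButton(paymentButtonType: .tip, paymentButtonStyle: .black)

// The classic buy button, shown for contrast; .automatic style is also new in
// iOS 14 and adapts to light/dark mode.
let buyButton = PKPaymentButton(paymentButtonType: .buy, paymentButtonStyle: .automatic)
```
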

Synchronize health data with HealthKit

If you are interested in how to correctly process and share health data between devices, remote servers, and HealthKit – this talk is for you. It goes over the typical use cases that require data to be synchronized and versioned.

If you are creating an app that uses health data and you wish to know how to share it with medical staff, it’s a must-watch for you. It’s never been easier.

Considering the dominant trends in healthcare – big data analysis, AI, and the Internet of Things being used to boost the efficiency of healthcare systems – HealthKit arrives as a handy tool to make all this data synchronization and management a lot easier.

We highly recommend watching the talk as it presents information with simple diagrams and easy to follow examples.
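The typical building block for this kind of sync is an anchored query, which returns only what changed since the last saved anchor. A sketch – the heart-rate type and the persistence of the anchor are illustrative choices, not prescriptions from the talk:

```swift
import HealthKit

// Fetch only the heart-rate samples added or deleted since the last sync.
func syncHeartRate(store: HKHealthStore, since anchor: HKQueryAnchor?) {
    let type = HKQuantityType.quantityType(forIdentifier: .heartRate)!
    let query = HKAnchoredObjectQuery(type: type,
                                      predicate: nil,
                                      anchor: anchor,
                                      limit: HKObjectQueryNoLimit) { _, added, deleted, newAnchor, error in
        guard error == nil else { return }
        // Upload `added`, tombstone `deleted`, then persist `newAnchor` so the
        // next sync picks up only changes made after this point.
        print("added: \(added?.count ?? 0), deleted: \(deleted?.count ?? 0)")
    }
    store.execute(query)
}
```
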

Record stereo audio with AVAudioSession

Did you know that there are 4 mics built into your iPhone? And did you know that you can now record audio in stereo, with 8 different orientation settings? Do you want to know how it all works? Then go ahead and watch this talk.

It’s short and condensed, and it explains the topic in an easy-to-understand way. In the video, you can also find tips on when and how to set the orientation of the audio recording to make your users happy.
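The setup boils down to picking a stereo-capable data source on the built-in mic and telling the session which way the device is held. A sketch for iOS 14, assuming the session category is already `.record` or `.playAndRecord` and mic permission is granted:

```swift
import AVFoundation

// Enable stereo capture on the built-in microphone.
func enableStereo(orientation: AVAudioSession.StereoOrientation) throws {
    let session = AVAudioSession.sharedInstance()

    // Find the built-in mic and a data source that supports the stereo polar pattern.
    guard let input = session.availableInputs?.first(where: { $0.portType == .builtInMic }),
          let source = input.dataSources?.first(where: {
              $0.supportedPolarPatterns?.contains(.stereo) == true
          })
    else { return }

    try source.setPreferredPolarPattern(.stereo)
    try session.setPreferredInput(input)

    // Tell the session how the device is oriented so left/right channels match
    // what the user sees – this is where the orientation settings come in.
    try session.setPreferredInputOrientation(orientation)
}
```
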

Did you watch an interesting (or the opposite) talk that you wish to discuss? Do you think we missed something that is a must-watch?

Or maybe you have a cool idea for a new app that you wish to share with us? Great! Don’t wait a second longer – contact us now!