WWDC brought with it many goodies for all app developers. Here are a few photography highlights that we found interesting!
Taking over the Lock Screen camera button
The biggest new feature for lovers of third-party camera apps is the new LockedCameraCapture API, which finally lets us put our apps behind the Lock Screen camera button, something we’ve wanted to do ever since Apple added that button way back in iOS 6!
It’s understandable that it has taken them time to get this right. The system camera does a special dance with photos shot in this mode to preserve your privacy and keep your photos safe: the phone’s data is normally encrypted and inaccessible while it’s locked, so photos captured in this state are stored separately and only merged into the main photo library once the phone is unlocked. This new API brings all of that security to third-party camera apps, prohibiting them from accessing any private data while the phone is locked.
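Here’s a rough sketch of what adoption looks like: you ship a capture extension that the system launches from the Lock Screen, with your UI living in a LockedCameraCaptureUIScene. (The view here is a hypothetical placeholder; a real app would host its full camera UI.)

```swift
import SwiftUI
import LockedCameraCapture

// The capture extension the system launches from the Lock Screen
// camera button. While the phone is locked it runs sandboxed, with
// no access to the app's normal (still encrypted) container.
@main
struct LockedCaptureExtension: LockedCameraCaptureExtension {
    var body: some LockedCameraCaptureExtensionScene {
        LockedCameraCaptureUIScene { session in
            // Captures are written to session.sessionContentURL and
            // handed over to the app once the phone is unlocked.
            CaptureView(session: session)
        }
    }
}

// Hypothetical placeholder; a real app hosts its camera UI here.
struct CaptureView: View {
    let session: LockedCameraCaptureSession

    var body: some View {
        Text("Camera preview and shutter go here")
    }
}
```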
Spatial video recording
Ever since Apple introduced spatial video recording last year with the iPhone 15 Pro and Pro Max, we’ve been waiting for an API to record spatial video for the Vision Pro, and this year Apple has delivered! You can look forward to third-party video apps adding support for this in iOS 18.
The APIs are well thought out, including everything needed to warn users when the subject is too close (which misaligns the two images and causes crossed eyes) or the scene is too dark.
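In rough strokes, opting in looks something like the sketch below, with the new spatialCaptureDiscomfortReasons property driving those warnings. (We’re assuming an already-configured, running AVCaptureSession here; `device` and `movieOutput` belong to it.)

```swift
import AVFoundation

// Sketch: turn on spatial video and watch for discomfort warnings.
// `device` is the back camera and `movieOutput` is an
// AVCaptureMovieFileOutput attached to a running session.
func enableSpatialVideo(device: AVCaptureDevice,
                        movieOutput: AVCaptureMovieFileOutput) -> NSKeyValueObservation? {
    guard movieOutput.isSpatialVideoCaptureSupported else { return nil }
    movieOutput.isSpatialVideoCaptureEnabled = true

    // KVO-observable set of reasons why a capture would be
    // uncomfortable to watch: .subjectTooClose and .notEnoughLight.
    // Keep the returned observation alive for as long as you record.
    return device.observe(\.spatialCaptureDiscomfortReasons,
                          options: [.initial, .new]) { device, _ in
        let reasons = device.spatialCaptureDiscomfortReasons
        if reasons.contains(.subjectTooClose) {
            print("Warn: subject too close, the two images will misalign")
        }
        if reasons.contains(.notEnoughLight) {
            print("Warn: too dark for a good spatial capture")
        }
    }
}
```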
Enhanced video stabilization
To make spatial video recordings more comfortable to watch in the Vision Pro, they’ve introduced a new video stabilization mode with the brilliant name of “Cinematic Extended Enhanced”. It’s also available for regular video recording, although it remains to be seen how it compares to, say, Action Mode stabilization (for which there is no API). Since it does not require the ultra-wide lens, it will probably slot in between the current stabilization and Action Mode, cropping the image more than current video stabilization does but working better in low-light situations.
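In code it’s just a new AVCaptureVideoStabilizationMode case. A quick sketch, assuming a movie file output already attached to your session:

```swift
import AVFoundation

// Sketch: request the new stabilization mode on the video connection
// of an already-configured AVCaptureMovieFileOutput.
func enableEnhancedStabilization(on movieOutput: AVCaptureMovieFileOutput) {
    guard let connection = movieOutput.connection(with: .video),
          connection.isVideoStabilizationSupported else { return }
    // New in iOS 18, joining .cinematic and .cinematicExtended.
    connection.preferredVideoStabilizationMode = .cinematicExtendedEnhanced
}
```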
No spatial photos?
One surprise is that despite there being an API for Spatial Video, there isn’t anything for Spatial Photos, even though Apple has thoroughly documented the format, and portions of the talks almost seem to hint at cut material that would have covered it. It will be interesting to see whether this arrives later this year with the new iPhone, or perhaps at the next WWDC.
Constant Color Photography
One surprising new API released this year is called “Constant Color Capture”, which lets you shoot photos that are color-calibrated no matter the current lighting conditions. It works by taking two photos – one with the flash off and one with the flash on. It calculates how the flash affects the photo, and uses factory calibration data about the camera and flash to adjust the photo to be consistent. This can work some real magic, even removing quite extreme color casts from colored RGB lighting.
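Opting in is a two-step affair: enable it on the photo output, then request it per capture. A rough sketch (the delegate and the session setup are assumed):

```swift
import AVFoundation

// Sketch: request a constant color capture. `photoOutput` is assumed
// to be attached to a configured, running AVCaptureSession.
func captureConstantColorPhoto(with photoOutput: AVCapturePhotoOutput,
                               delegate: AVCapturePhotoCaptureDelegate) {
    guard photoOutput.isConstantColorSupported else { return }
    photoOutput.isConstantColorEnabled = true

    let settings = AVCapturePhotoSettings()
    settings.flashMode = .on  // the flash-off/flash-on pair needs the flash
    settings.isConstantColorEnabled = true
    // Also deliver a regular photo in case the result is low-confidence.
    settings.isConstantColorFallbackPhotoDeliveryEnabled = true
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```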
They used several examples to demonstrate use cases:
- One was shadow removal for document scanning – this was actually released earlier this year as a feature on the new iPads, which quietly use this API under the hood.
- Another example was product photography: taking color-accurate photos of products like T-shirts for your online shop, so customers can be certain the colors match.
- The last example was medical: for instance, tracking the development of a rash to see if it’s getting better or worse, even between morning and evening, when the color of daylight changes.