“The best camera is the one that’s with you,” photographer Jay Maisel famously said. And for millions of people around the world, that camera will be the one on the back of the iPhone in their pocket – for now, anyway.
With the launch of the Vision Pro headset, Apple will release the most radically different camera system it has ever made. The company’s ‘Spatial Computing’ wearable will include an advanced camera array capable of capturing 3D stills and videos; according to Apple, these clips and shots add greater depth than traditional ‘flat’ 2D media, leading to more immersive recreations of the moments captured in time.
Apple’s serious about this 3D photo and video push — there’s a dedicated camera shutter hardware button on the Vision Pro, and the headset’s WWDC 2023 reveal showcase had a whole section dedicated to capturing what Apple’s calling spatial video and spatial photos.
“Apple Vision Pro lets users capture, relive, and immerse themselves in favorite memories with Spatial Audio,” says Apple of the feature.
“Every spatial photo and video transports users back to a moment in time, like a celebration with friends or a special family gathering. Users can access their entire photo library on iCloud, and view their photos and videos at a life-size scale with brilliant color and spectacular detail.”
But the staged WWDC clip of a father capturing videos of his kids while wearing the headset was the most derided part of an otherwise impressive Vision Pro unveiling, and a slightly dystopian view of distanced parenting norms to come. It looked and felt creepy, and circling back to Jay Maisel, it’s hard to imagine people taking the Vision Pro (with its limited battery life) out and about with them each day. Those looking to take spatial photos of their beach-side holiday will be wary of toting their $3,499 headset down to the sand — and would end up with some very interesting facial tan lines if they did.
Look at the Meta Quest headsets: Despite millions of sales, how many have you seen in public? If Apple expects 3D video and photography libraries for Vision Pro to swell in size, something else is going to have to take that media-capturing burden.
That’s where the rumored iPhone Ultra might just come in.
An Ultra camera shake-up
For months, there’s been speculation that the iPhone 15 Pro, expected to launch at the next Apple event on September 12, 2023, will in fact be called the iPhone Ultra, in keeping with the premium Apple Watch Ultra wearable launched last year (and the Apple Watch Ultra 2 likely to arrive next week).
Despite many leaks in support of this assertion, it’s a theory that’s fallen out of favor, with the Ultra now expected to be an even more premium iPhone that will launch next year above the current iPhone range. So you’ll end up with a 2024 launch line-up that looks like iPhone 16 < iPhone 16 Pro < iPhone Ultra.
Rumors circulating on Chinese social media site Weibo, as first spotted by MacRumors, point to an advanced 3D photography array being present on the upcoming 2024 handset. In addition to the wide, ultra-wide and telephoto camera lenses, alongside the LiDAR scanner and True Tone flash on the rear of an iPhone (not to mention the rumored periscope lens the iPhone 15 Pro Max is said to have), another 3D-focused lens would be required.
What form that takes remains to be seen, but it could necessitate yet another camera bump on the rear of the handset, one spaced far enough away from a partnered lens to create natural and believable depth when two images are combined.
It’s a delicate art to pull off.
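To see why that lens spacing matters, consider the basic stereo geometry: the apparent horizontal shift (parallax, or disparity) of a subject between two parallel cameras scales with the distance between the lenses. The sketch below uses the textbook pinhole-camera relationship with entirely illustrative numbers — none of these figures are Apple’s actual specs.

```python
# Illustrative stereo-parallax math. All numbers are hypothetical
# examples, not specifications of any real iPhone or headset camera.

def disparity_mm(baseline_mm: float, focal_mm: float, depth_mm: float) -> float:
    """Horizontal shift of a subject between two parallel cameras.

    Standard pinhole-camera relation: disparity = focal_length * baseline / depth.
    """
    return focal_mm * baseline_mm / depth_mm

# A subject 2 meters away, seen through a 6mm lens:
wide = disparity_mm(baseline_mm=64, focal_mm=6.0, depth_mm=2_000)   # ~human eye spacing
tight = disparity_mm(baseline_mm=12, focal_mm=6.0, depth_mm=2_000)  # lenses packed together

# The wider baseline yields far more parallax, and so a stronger depth cue;
# lenses crammed into one camera bump produce an almost-flat result.
print(f"64mm baseline: {wide:.3f}mm shift on sensor")
print(f"12mm baseline: {tight:.3f}mm shift on sensor")
```

This is why a 3D-capable iPhone might need a lens placed well away from its partner, rather than another lens squeezed into the existing cluster.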
“3D photography hasn’t really taken off for a couple of reasons,” explains Mark Wilson, pro photographer and TechRadar.com News Editor.
“It’s still widely seen as a gimmick, which is understandable given that most people’s experience of it was those old View-Master toys. There are also unflattering parallels with 3D TV. But tech limitations have also restricted it to being a low-quality experience,” Wilson told iMore. “Stereoscopic photography goes back to the 1800s, but we’re now seeing major advances like NeRFs (neural radiance fields) that could take 3D captures to a new level.”
visionOS, the underlying Spatial Computing software platform, has some example spatial photos to examine in its earliest beta release, presumably taken with the headset’s own camera system. Developers with early access have revealed how the headset takes two HEIC images from slightly varying positions to match the distance between a user’s eyes; it presents them in tandem to create the 3D effect. Presumably an iPhone Ultra camera array would be able to replicate this method.
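The two-frame idea those developers describe can be sketched in miniature: one capture per eye, offset horizontally, stored together and shown in tandem. This is a toy model only — the class name, fields, and the tiny pixel grids below are all hypothetical stand-ins, not Apple’s actual spatial photo format.

```python
# Toy sketch of the "two frames, one per eye" spatial photo concept.
# Names, fields, and values here are illustrative, not Apple's format.
from dataclasses import dataclass

@dataclass
class SpatialPhoto:
    left: list[list[int]]   # stand-ins for the two HEIC frames
    right: list[list[int]]
    baseline_mm: float      # horizontal offset between the two captures

def shift_view(frame: list[list[int]], pixels: int) -> list[list[int]]:
    """Crudely simulate the second eye's view by shifting each row sideways."""
    return [row[pixels:] + row[:pixels] for row in frame]

scene = [[0, 1, 2, 3], [4, 5, 6, 7]]
photo = SpatialPhoto(left=scene, right=shift_view(scene, 1), baseline_mm=64.0)

# The headset would decode both frames and route left to the left eye and
# right to the right eye; the brain fuses the horizontal offset into depth.
print(photo.right[0])  # → [1, 2, 3, 0]
```

The point is simply that nothing exotic is stored — just two ordinary images plus the knowledge of how far apart they were captured, which is why a suitably spaced pair of iPhone lenses could plausibly produce the same kind of file.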
3D — the undead zombie of entertainment technology
The elephant in the room is that this would be far from the first attempt at making a 3D-media-capturing phone — it’s just that there’s never been a successful one. To my memory, LG was first out of the gate with the LG Optimus 3D back in 2011, which featured a 3D screen and interface as well as the ability to capture 3D stills and three-dimensional video up to 720p in resolution. It was a flop, but a valiant one given the resurgence of interest in 3D TV at the time.
Then there was of course Amazon’s Fire Phone, released in 2014. Its take on 3D was slightly different: Rather than capturing 3D imagery, it used its 3D sensing cameras to present a 3D, head-tracked interface on the phone screen itself. It’s perhaps the biggest smartphone failure of all time, with Amazon selling just a few thousand handsets and resulting in part of a $170 million write-down for the company.
It’s not a like-for-like comparison, but I bring up the Fire Phone despite its functional differences to help emphasize a point: 3D, in all its forms, has never amounted to more than a bit of a fad. Whether it’s the Avatar-led cinema boom, or the Nintendo 3DS’s glasses-free 3D screen, visual media makers are on what feels like an eternal cycle to present 3D as the next hot ‘new’ thing every few years.
Now it’s Apple’s turn to take it for a spin.
“The Apple Vision Pro is a promising step for the 3D viewing experience, but a lot more needs to be done,” Wilson said.
“Apple needs to make 3D photography a point-and-shoot experience by adding next-gen LiDAR scanners and extra sensors to its iPhones and iPads. It also needs to open up spatial capture to third-party apps. If that’s all combined with more affordable, approachable headsets or glasses, then 3D photography could well have a revival. 2D photography won’t go away as an art form, but I could see 3D captures becoming an important way to capture our memories in more powerful ways.”
As just one function of a wider immersive ‘Spatial Computing’ ecosystem, spatial media perhaps stands a better chance of long-term survival than previous examples of 3D media. But to feed that application of Vision Pro fully would require a significant change for iPhone photography.
If the rumors of the iPhone Ultra’s premium positioning are true, you could be looking at needing a $1,500 phone to fuel the 3D library of your $3,499 headset — and that just doesn’t scream ‘mass adoption’ to me. Given the yo-yoing fortunes of, and interest in, 3D video and photography over the years, I’m not convinced it should be the core photography play for any eventual iPhone Ultra. But it may prove the inevitable direction to get a 3D ecosystem off the ground.