The iPhone 14 Pro, with its 48 megapixel camera. Photo / AP
Smartphone makers focus (sorry) on cameras more than anything, and try to leapfrog each other when they launch new devices.
The reason for that is simple: images and video are tangible things. You can literally see the improvements that higher resolution, improved dynamic range, better color reproduction and image stabilization bring.
Except not always, because what you really want is the full workflow, from taking pictures and shooting videos to publishing them, and that doesn’t always happen.
It’s early days and photo geeks like yours truly are frantically taking pictures and video, but initial impressions are that even the mid-range iPhone 14 takes stunningly good pictures with its 12 megapixel camera.
If you want better than that, the Pro versions of the iPhone 14 get a 48 MP main camera with some new tricks. By “pixel binning” four adjacent photodiodes on the sensor to form one super pixel, you get 12 MP images with lower noise in them; on the Android side, Oppo does the same with a 50 MP sensor, and Samsung has “nona binning” for its 108 MP imager.
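As a rough illustration of the idea (a toy model, not Apple’s or anyone else’s actual pipeline), binning can be thought of as averaging each 2×2 block of sensor values into one super pixel:

```python
import numpy as np

# Toy model of 2x2 pixel binning: four adjacent sensor values are
# averaged into one "super pixel", quartering the resolution but
# reducing noise. Function name and array sizes are invented for
# the example.
def bin_2x2(sensor: np.ndarray) -> np.ndarray:
    h, w = sensor.shape
    # Trim any odd edge, group into 2x2 blocks, average each block.
    trimmed = sensor[: h - h % 2, : w - w % 2]
    return trimmed.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# An 8x8 patch stands in for the 48 MP mosaic; binning yields 4x4,
# the way 48 MP frames become 12 MP ones.
raw = np.arange(64, dtype=float).reshape(8, 8)
print(bin_2x2(raw).shape)  # (4, 4)
```

Averaging four readings suppresses random sensor noise, which is why the binned 12 MP shots look cleaner than a straight 12 MP crop would.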
You can shoot at the full 48 MP too, in RAW format for post-processing, but that produces huge files, over 75 MB each. The same applies to Apple’s ProRes 4:2:2 HQ video format.
ProRes is overkill for normal use, but if you really want to get the very best out of your iPhone videos and don’t mind wrangling huge, multi-gigabyte files, you can shoot in that format at up to 4K and 30 frames per second – it’s why there are iPhone 14 Pros with 1 terabyte of storage.
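Some back-of-the-envelope arithmetic shows why the storage adds up fast. Note that the 14-bit depth and the roughly 700 Mbps ProRes data rate below are my ballpark assumptions for illustration, not Apple-published figures:

```python
# Rough storage arithmetic for the formats above. The 14-bit RAW
# depth and ~700 Mbps ProRes rate are assumed ballpark figures,
# not numbers from Apple.
raw_pixels = 48_000_000
bits_per_pixel = 14
raw_mb = raw_pixels * bits_per_pixel / 8 / 1_000_000
print(round(raw_mb))        # 84 MB uncompressed, in the 75 MB ballpark

prores_mbps = 700           # assumed 4K30 ProRes 4:2:2 HQ data rate
gb_per_minute = prores_mbps * 60 / 8 / 1000
print(gb_per_minute)        # 5.25 GB per minute of footage
```

At those rates a terabyte of storage stops looking extravagant: it holds only a few hours of ProRes footage.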
On top of the main wide camera, both the ultrawide and the telephoto ones have had a polish, ditto the front-facing imager, which has surprisingly high specs.
All the imaging magic is controlled by Apple’s computational photography tech, now called Photonic Engine, and the A16 Bionic chipset that’s even a bit faster than Apple’s M1 Pro in the MacBook Pro laptop on a single-core basis.
That said, sharing your masterpieces with friends and the world on social media is usually a disappointing experience, because the results can look meh. Smartphone photo tech has exploded in terms of quality, but the social sites where you hang out with people? Not so much. They’re more worried about storage costs (see above about the big files) and fast page loading times on the web and in apps.
The iOS Twitter app is particularly brutal in that respect. I was trying out Action mode, which provides even better image stabilization for when you’re moving around with the iPhone 14 Pro in your hand: this comes at the cost of lower resolution, 2.8K instead of 4K, but you get Dolby Vision high dynamic range and can even shoot in ProRes at a fast 60 fps.
The footage that looked stunning on the iPhone was scaled down on Twitter to 1280 by 720 pixel resolution. While Twitter kept the 60 fps (OK, 59.97 fps to be pedantic) rate, it dropped the data rate from 55.76 megabits per second to 1.1–2.3 Mbps.
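Put in file-size terms, and taking the measured rates above at face value, a one-minute clip shrinks dramatically:

```python
# What the measured bitrates above mean for a one-minute clip.
def size_mb(mbps: float, seconds: float) -> float:
    return mbps * seconds / 8  # megabits per second -> megabytes

print(round(size_mb(55.76, 60), 1))  # 418.2 MB straight off the iPhone
print(round(size_mb(2.3, 60), 2))    # 17.25 MB at Twitter's best rate
```

That’s a twenty-fold-plus cut in data, which is exactly where the fine detail and smooth gradients go.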
Audio was downgraded as well, and you won’t get 10-bit color on @jack’s social network.
Does this mean it’s pointless to have good camera systems, if the sites where you share photos and videos compress the snot out of them to reduce their size? No, because if you start off with good quality, the slimmed down versions will look better.
Meanwhile, Apple’s iCloud Photos keeps images and videos at full resolution and size (although RAW files are rendered as JPEGs), and you can share them easily on iOS, Apple’s mobile operating system.
iCloud Photos needs a tweak though, so that you can show off what iPhones are capable of on the platforms where your followers hang out.
It’s arguably a very first-world problem, and yes, you can use third-party sites that don’t mangle your images, and link to those. But that means juggling platforms that invariably behave differently from one another.
You’d think Apple would want to own the whole workflow from imaging to publishing, and sell more iCloud subscriptions that way. I wouldn’t be surprised if Big A has something to announce on this at some stage.