
Reports of the iPhone 16 Pro’s 48-megapixel camera being ‘fake’ have been greatly exaggerated. Here’s why

Jason Hiner/ZDNET

Pro photographer and YouTuber Tony Northrup has been on the offensive recently, slamming Apple’s marketing for the iPhone 16 Pro line, claiming that the way the camera is being advertised is “meaningless and misleading.”

Also: We’ve used every iPhone 16 model and here’s our best buying advice for 2024

The claim at the center of his attack is that the camera uses a 48-megapixel sensor. To test it, Northrup compares an iPhone 15 Pro Max (yes, Northrup doesn't have an iPhone 16 Pro, so he's evaluating its predecessor) against a 48MP professional Sony camera. And, as I'd expect, the professional camera far outperforms the iPhone.

So, are Apple’s 48MP claims “absolutely fake,” as Northrup says? Let me break things down.


There’s more going on here than meets the eye. Let’s start with the 48MP claim. Does the sensor that Apple uses for the 13mm ultra-wide camera actually have 48 million pixels? Well, yes, but not in the sense that we may have come to expect. The sensor at the heart of the iPhone’s camera, and of many other smartphones, action cameras, and drones, is a Quad Bayer sensor, where each group of four pixels sits under a single color filter (red, green, or blue). In contrast, a standard sensor has a separate filter over each individual pixel.


The iPhone 16 and iPhone 16 Plus field a 48-megapixel ‘Fusion’ camera that can perform a 2X optical zoom.

Kerry Wan/ZDNET

In a Quad Bayer sensor, these pixels are grouped into a repeating 4×4 pattern containing four red, four blue, and eight green pixels. This arrangement improves light sensitivity and color accuracy, and, because the human eye is most sensitive to green light, produces images that look more natural.
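To make that layout concrete, here's a minimal Python sketch (my illustration, not anything from Apple) of one repeating Quad Bayer tile:

```python
import numpy as np

# One repeating 4x4 Quad Bayer tile: each 2x2 quadrant sits under a
# single color filter, giving 4 red, 4 blue, and 8 green sites per tile.
QUAD_BAYER_TILE = np.array([
    ["R", "R", "G", "G"],
    ["R", "R", "G", "G"],
    ["G", "G", "B", "B"],
    ["G", "G", "B", "B"],
])

colors, counts = np.unique(QUAD_BAYER_TILE, return_counts=True)
print(dict(zip(colors, counts)))  # {'B': 4, 'G': 8, 'R': 4}
```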

Also: I tested the iPhone 16 Plus, and its most impressive feature was not Apple Intelligence

Under ideal lighting conditions, a 48MP Quad Bayer sensor can output a 48MP image, but this requires the use of computational photography to stack and merge several exposures into one, along with some neural engine processing. The overall result is an image that is better and more detailed than a 12MP image would be.
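To give a rough sense of why merging exposures helps, here's a toy Python sketch of frame averaging. It's a big simplification of what any real pipeline does, and it assumes the frames are already perfectly aligned:

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.uniform(0.2, 0.8, size=(64, 64))  # the "true" scene

# Eight noisy captures of the same scene, assumed pre-aligned.
frames = [scene + rng.normal(0.0, 0.05, scene.shape) for _ in range(8)]

# Averaging N frames cuts random noise by roughly sqrt(N).
merged = np.mean(frames, axis=0)

print("single-frame noise:", round(np.std(frames[0] - scene), 4))
print("merged noise:      ", round(np.std(merged - scene), 4))
```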

Quad Bayer sensors really shine in low light because they can use a trick called pixel binning to enhance image quality. Pixel binning combines adjacent pixels on the sensor into bigger superpixel clusters that pool their light, making each cluster more sensitive than any individual pixel. In this mode, the output does indeed drop from 48MP to 12MP.
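Here's what 2×2 binning looks like in code, again as a simplified sketch: I'm using a plain mono grid, whereas a real Quad Bayer sensor bins within same-color groups, and a real 48MP frame is roughly 8,000 x 6,000 pixels:

```python
import numpy as np

def bin_2x2(img: np.ndarray) -> np.ndarray:
    """Sum each 2x2 block into one superpixel, quartering the resolution."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# A small stand-in grid of simulated photon counts.
full_res = np.random.default_rng(1).poisson(lam=10, size=(800, 600))
binned = bin_2x2(full_res)

print(full_res.shape, "->", binned.shape)    # (800, 600) -> (400, 300)
print(full_res.mean(), "->", binned.mean())  # each superpixel pools ~4x the signal
```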


Try taking a handheld shot like this – a solar storm over the UK – with a mirrorless camera. It won’t be easy.

Adrian Kingsley-Hughes/ZDNET

Quad Bayer sensors also allow smartphone cameras to shoot HDR photographs by capturing multiple exposures simultaneously within the same pixel group. This improves a photo’s overall dynamic range: the span between its brightest and darkest parts.
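As a toy illustration of the merging idea, assume just two captures with an 8x exposure ratio (real HDR pipelines are far more sophisticated):

```python
import numpy as np

scene = np.geomspace(0.001, 1.0, 1024)   # radiance spanning ~10 stops
long_exp = np.clip(scene * 8.0, 0, 1)    # highlights clip to white
short_exp = np.clip(scene * 1.0, 0, 1)   # shadows dark, highlights intact

# Use the long exposure for shadows; where it clipped, substitute the
# short exposure scaled up by the exposure ratio, then normalize.
hdr = np.where(long_exp >= 1.0, short_exp * 8.0, long_exp) / 8.0

print(f"long exposure alone spans {long_exp.max() / long_exp.min():.0f}:1")
print(f"merged result spans       {hdr.max() / hdr.min():.0f}:1")
```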

And remember, smartphone cameras aren’t just for shooting stills. They are also being used to shoot more and more video, and Quad Bayer sensors bring with them huge quality, low-light, and dynamic range benefits.


So, what about the claims that the quality of an iPhone is not as good as that from a professional camera? Well, this is to be expected, as the entire smartphone camera, from lens to pixels, is much smaller. For example, the pixels on the iPhone’s sensor are a minuscule 1.22 micrometers across, compared to the beefier 3.73 micrometers of my Sony A7R IV mirrorless camera.
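That gap matters more than the raw numbers suggest, because a pixel's light-gathering area scales with the square of its pitch:

```python
iphone_pitch_um = 1.22  # iPhone sensor pixel pitch, as quoted above
sony_pitch_um = 3.73    # Sony A7R IV pixel pitch, as quoted above

# Light-gathering area scales with the square of the pixel pitch.
area_ratio = (sony_pitch_um / iphone_pitch_um) ** 2
print(f"per-pixel light advantage: {area_ratio:.1f}x")  # ~9.3x
```

In other words, each of the Sony's pixels collects roughly nine times the light of the iPhone's.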

Those larger pixels, combined with bigger, far superior lenses, make for better-quality images. And on the subject of lenses, the one Northrup uses on the Sony camera, a Sony FE 24-70mm F2.8 G Master, costs $1,700 on its own.

ZDNET’s expert takeaway

We’re comparing apples to oranges here. It’s also disingenuous to single out Apple for using Quad Bayer sensors since pretty much every smartphone manufacturer does the same. And they do it because big numbers – screen size, RAM, camera resolution, and so on – sell.

As someone who shoots with a wide variety of cameras, from smartphones to drones to dedicated mirrorless bodies, I’ve found that each one has its strengths and weaknesses. I can do things with my iPhone that I can’t do with any other camera I own. For example, there’s no way I could take handheld night shots of an aurora with my Sony camera, but it’s no problem for my iPhone.



