iPhone 14 Pro comes with a 48-megapixel wide-angle rear camera for the first time on an iPhone. It’s the first megapixel upgrade since the iPhone 6s in 2015, which jumped to a 12-megapixel rear camera. However, the only way to take a 48MP photo with iPhone 14 Pro is to use ProRAW or third-party apps – I wish I could take compressed 48MP photos using Apple’s Camera app.
How the iPhone 14 Pro uses its 48MP camera
You may ask yourself: If the iPhone doesn’t take 48MP pictures by default, how does it use the new sensor?
To put it simply, Apple uses a process known as “pixel binning,” which combines data from four adjacent pixels into one. Capturing at the full 48MP requires much more light per pixel, which is a downside in low-light scenarios. With pixel binning, the iPhone uses the 48MP sensor to take a 12MP picture with better quality and less noise.
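To make the idea concrete, here’s a toy sketch of 2x2 binning: each output pixel averages a 2x2 block of sensor pixels, quartering the resolution (48MP to 12MP) while smoothing out noise. This is purely illustrative – Apple’s real pipeline involves quad-Bayer demosaicing and heavy computational photography on top:

```python
def bin_2x2(sensor):
    """Average each 2x2 block of a grid of pixel values.

    Assumes even dimensions. Halves width and height, so the
    output has a quarter of the pixels of the input.
    """
    h, w = len(sensor), len(sensor[0])
    return [
        [
            (sensor[y][x] + sensor[y][x + 1] +
             sensor[y + 1][x] + sensor[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A tiny 4x4 "sensor" of brightness values, binned down to 2x2.
sensor = [
    [10, 12, 200, 202],
    [14, 16, 198, 200],
    [50, 52,  90,  92],
    [54, 56,  94,  96],
]
binned = bin_2x2(sensor)
print(binned)  # [[13.0, 200.0], [53.0, 93.0]]
```

Averaging is also why noise drops: random per-pixel noise partially cancels out when four readings are combined into one.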
The company has also introduced a new 2x zoom in the iPhone 14 Pro’s Camera app that crops the original 48MP image to produce a 12MP zoomed photo. This allows users to digitally zoom in without losing detail and without having to switch to the telephoto lens with its 3x zoom.
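The arithmetic behind this “lossless” 2x zoom is simple: cropping the central half of the width and half of the height halves the field of view (a 2x zoom) while keeping exactly a quarter of the pixels – which is still 12MP, the same resolution as the default binned photo. The sketch below assumes an 8064x6048 full-resolution frame, which matches the dimensions of iPhone 14 Pro ProRAW files:

```python
# Full-resolution frame (assumed 8064x6048, ~48.8MP as in ProRAW files).
full_w, full_h = 8064, 6048
full_mp = full_w * full_h / 1e6

# A 2x zoom crop keeps the central half-width and half-height,
# i.e. one quarter of the pixels.
crop_w, crop_h = full_w // 2, full_h // 2
crop_mp = crop_w * crop_h / 1e6

print(f"full: {full_mp:.1f} MP, 2x crop: {crop_w}x{crop_h} = {crop_mp:.1f} MP")
# full: 48.8 MP, 2x crop: 4032x3024 = 12.2 MP
```

So the 2x mode trades no resolution away relative to the standard 12MP output – it simply uses a different quarter of the sensor’s data.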
Since image files with 48MP resolution are much larger than those with 12MP resolution, Apple has limited how users can take pictures with the new sensor. If you really want to take a 48MP picture, you must enable Apple ProRAW in the Camera app.
RAW vs. non-RAW
For those unfamiliar, a RAW photo is basically the original image data captured by the sensor, with minimal or no post-processing. It preserves information about things like brightness, shadows, and colors that can be edited later in software like Adobe Lightroom. Because of this, a RAW file can be 15 times larger than a compressed image.
When RAW is disabled, the camera takes the picture and then discards some of this data, resulting in a smaller file that takes up less space.
Many iPhone models can take RAW photos with the help of third-party apps. Since the introduction of the iPhone 12 Pro, Apple has implemented this feature natively in the iOS Camera app with Apple ProRAW. Now with iPhone 14 Pro, Apple has decided to tie 48MP photos to ProRAW.
The problem, as you can imagine, is that these photos take up a lot of space in the iPhone’s internal storage. Apple says that each 48MP RAW file can be approximately 75MB. Of course, only real “pro” users end up enabling this option, but everyone should be able to take advantage of the full 48 megapixels of the iPhone 14 Pro camera.
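Some rough storage arithmetic shows why this matters. Apple’s ~75MB-per-file estimate is from the source above; the ~5MB figure for a compressed 12MP HEIC is an illustrative assumption, and the 128GB tier is the base iPhone 14 Pro configuration:

```python
# Approximate file sizes: 48MP ProRAW (~75 MB, Apple's estimate)
# vs. a compressed 12MP HEIC (~5 MB, illustrative assumption).
raw_mb, heic_mb = 75, 5
storage_gb = 128  # base iPhone 14 Pro storage tier

raw_photos = storage_gb * 1000 // raw_mb
heic_photos = storage_gb * 1000 // heic_mb

print(raw_photos, heic_photos)  # 1706 25600
```

In other words, even ignoring the OS and apps, shooting everything in ProRAW cuts the number of photos a base-model iPhone can hold by roughly a factor of 15.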
48MP compressed photos
I really wish I could take 48MP photos without bloating my iPhone’s storage, because the only way to easily do so is by enabling ProRAW. But why does it matter? Well, I did some experiments to show you all how photos taken at 48-megapixel resolution are noticeably better, even when compressed.
You can see some of the results below. The images have been cropped so you can get a better look at the details:
Here’s another example that shows the level of detail in a 48MP photo, even after compression:
To produce those compressed versions, I used a Shortcut created by developer Gabriele Trabucco (via Vadim Yuryev) that quickly converts 48MP ProRAW photos to the compressed HEIF (High Efficiency Image File Format). HEIF is a format officially supported by Apple that preserves image quality in much smaller files.
While each 48MP RAW file in these examples is about 60MB, the HEIF-compressed version is only 3.2MB – less than the 3.3MB of the 12MP JPEG image.
For videos, iPhone users can choose between shooting in 720p, 1080p, or 4K. So why not offer the same option for taking photos? I’m sure that a lot of users would choose to take pictures in 48MP resolution despite the compression.
Plus: I want to get rid of Smart HDR
Since I’m talking about the iPhone camera, here’s another thing that annoys me a lot: Smart HDR. This is a feature introduced back in 2018 with iPhone XS that uses AI to enhance photos by making a number of post-processing adjustments. Remember the “Beautygate” of the iPhone XS? That was Smart HDR’s fault.
But back then, Smart HDR wasn’t as aggressive as it is today, and users could still turn it off. Since iPhone 12, Smart HDR can no longer be turned off by users. While Smart HDR tries to improve photos, it ends up ruining some of them by making them look extremely unnatural.
I’ve seen a lot of people recently complaining about pictures taken with iPhone, and even I no longer like some of the pictures I take.
YouTuber Max Tech has made a great video showing how iPhone 14’s Smart HDR lags behind compared to the post-processing of the recently released Google Pixel 7 Pro.
I understand that post-processing is important to compensate for the limitations of the hardware (in this case, the small camera lenses and sensors), but it has reached a point where some of this post-processing is just too much.
If someone from Apple is reading this, please bring back the option to turn off Smart HDR. I don’t want to have to take RAW pictures just to take advantage of the 48-megapixel resolution or to avoid this excessive post-processing.
But what about you? Do you think Apple needs to bring more options to the native iPhone Camera app? Let me know in the comments below.