New rumors reaffirm that Apple could increase the resolution of the main sensor of the iPhone 14 Pro and Pro Max to 48 megapixels. iPhone sensor resolution has been stuck at 12 megapixels since the iPhone 6s, while the competition opts for 50, 64 or 108 megapixel sensors. But would this new sensor really improve photo quality on these iPhones? That's not certain.
In recent years, the resolution of smartphone photo sensors has increased sharply, now even exceeding 100 megapixels. We have had the opportunity to test several such phones, such as the Samsung Galaxy S21 Ultra 5G, the Xiaomi Mi 11i or the Honor 50. Starting this year, the first smartphones with 200-megapixel sensors could appear. The Xiaomi Note 11 could even be the first smartphone to take advantage of such a component.
While sensor resolution is quite high at most phone brands, it remains rather low at one stubborn American holdout: Apple. Since the iPhone 6s, the Cupertino company has stayed true to a 12-megapixel resolution. It is true that the nature of the sensors has changed since then, as have the quality of the lenses, the stabilizers and the autofocus. But Apple has refused to increase the resolution of the main sensor so as not to create an imbalance with the secondary sensors.
Apple would choose a 48-megapixel sensor for the iPhone 14 Pro and Pro Max
But that could change. Rumors have claimed that some models expected in September 2022 could feature 48-megapixel sensors. That information is corroborated today by TrendForce, which specifies that this change would only concern the professional range, namely the iPhone 14 Pro and iPhone 14 Pro Max. Once again, Apple would give its much more expensive Pro line a photographic edge: it was the first to benefit from a secondary sensor, an optical zoom, an optical stabilizer and even LiDAR autofocus.
If this leak is confirmed, will photos improve? The leak indicates that the sensor would support “Quad Pixel” binning, meaning that four adjacent pixels combine to form one larger pixel. The default resolution of photos would therefore remain at 12 megapixels, the same as today, preserving the balance with the secondary sensors mentioned above.
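To make the arithmetic concrete, here is a minimal sketch of what 2×2 pixel binning does, assuming a hypothetical 8000 × 6000 photosite layout (8000 × 6000 = 48 million). It illustrates the principle only, not Apple's actual image pipeline; real sensors bin within the Bayer color filter pattern, which this simplification ignores.

```python
import numpy as np

def bin_2x2(sensor: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of photosites into one larger output pixel."""
    h, w = sensor.shape
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Hypothetical 48 MP readout: 8000 x 6000 photosites.
raw = np.zeros((8000, 6000), dtype=np.float32)
binned = bin_2x2(raw)

print(binned.shape)             # (4000, 3000) -> 12 million pixels
print(raw.size // binned.size)  # 4 photosites feed each output pixel
```

The takeaway: four times the photosites, same 12-megapixel output as today.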
So why change? There are two hypotheses. Either Apple chooses a physically larger sensor. This would increase the size of the pixels: a “quad pixel” on the new sensor would be larger than a native pixel on the current component, and more light captured per pixel generally means better image quality. Or Apple keeps the sensor the same size, in which case the decision would be motivated solely by the marketing appeal of a bigger number. Which would be rather sad.
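As a back-of-the-envelope illustration of the first hypothesis (the pitch values below are assumptions, not leaked specifications): the current 12-megapixel main sensor is reported to use a pixel pitch of about 1.9 µm; if a 48-megapixel sensor used, say, 1.22 µm photosites, a 2×2 quad pixel would span roughly 2.44 µm per side, larger than today's native pixel.

```python
# Purely illustrative figures; the actual iPhone 14 Pro sensor specs are not known.
current_pitch_um = 1.90   # reported pixel pitch of the current 12 MP main sensor
new_photosite_um = 1.22   # hypothetical photosite pitch of a 48 MP sensor

# A 2x2 quad pixel is two photosites wide and two photosites tall.
quad_pixel_um = 2 * new_photosite_um

print(f"quad pixel: {quad_pixel_um:.2f} um per side")  # 2.44 um
print(quad_pixel_um > current_pitch_um)  # True: more light-gathering area per output pixel
```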
Source: TrendForce