Apple is rumored to be introducing a stacked rear camera sensor design in its upcoming iPhone models. According to respected Apple industry analyst Ming-Chi Kuo, the iPhone 15 series, expected to be released in 2023, will feature this camera technology.
Previous rumors suggested that the iPhone 15 and 15 Plus models would have a 48-megapixel main camera, but stacked image sensor technology was not mentioned at the time. However, Kuo reveals that the stacked sensor design will be included in Apple’s standard, lower-end iPhone 15 and 15 Plus models, rather than the higher-end iPhone 15 Pro and Pro Max devices.
Limited production capacity appears to be the reason stacked image sensors are coming to only part of the iPhone 15 lineup, and the constraint may continue to affect Apple in 2024. To mitigate this, Apple has reportedly secured most of the stacked CMOS image sensor (CIS) production capacity from Sony Semiconductor Solutions well in advance.
Kuo also suggests that any production shortfall at Sony may benefit rival supplier Will Semi, particularly for CIS orders from non-Apple smartphone manufacturers.
Reports indicate that the introduction of stacked sensors in interchangeable lens cameras (ILCs) has had minimal impact on image quality but has greatly improved camera speed. Because stacked sensors allow faster data readout, they enable quicker burst shooting, improved autofocus, and better video performance; the faster data can be read off the sensor, the better the overall performance.
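The relationship between readout throughput and frame timing can be illustrated with a quick back-of-the-envelope sketch. The figures below are purely hypothetical, chosen only to show the scaling; they are not Apple, Sony, or Canon specifications.

```python
# Illustrative sketch (assumed numbers, not real sensor specs): how the
# time to read a full frame off a sensor scales with readout throughput.

def readout_time_ms(megapixels: float, throughput_mp_per_s: float) -> float:
    """Time to read one full frame off the sensor, in milliseconds."""
    return megapixels / throughput_mp_per_s * 1000

# Hypothetical conventional 48 MP sensor vs. a stacked design that can
# be read out four times faster thanks to on-chip circuitry and memory.
conventional = readout_time_ms(48, 1500)  # 48 MP at an assumed 1,500 MP/s
stacked = readout_time_ms(48, 6000)       # same 48 MP at an assumed 6,000 MP/s

print(f"conventional: {conventional:.0f} ms per frame")  # 32 ms
print(f"stacked:      {stacked:.0f} ms per frame")       # 8 ms
```

A shorter per-frame readout leaves more time for autofocus calculations between frames, reduces rolling-shutter skew, and raises the ceiling on burst and video frame rates, which matches the improvements reported for stacked-sensor ILCs.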
Canon Europe explains how the stacked sensor in the Canon EOS R3 enables faster data readout. The R3's sensor is also backside-illuminated, which improves light-gathering efficiency and, in turn, image quality.
While rumors suggest that the stacked image sensor in the iPhone 15 and 15 Plus will offer better image quality, its potential impact on image processing is worth considering, since processing accounts for much of the excellent image quality of Apple's iPhones. As processors become more powerful and artificial intelligence (AI)-driven software improves, the image processing pipeline can handle more data from the image sensor. A sensor that can deliver more data into that pipeline faster should offer noticeable benefits for image processing tasks.
It is unlikely that Apple will want its standard iPhone 15 to outperform the iPhone 15 Pro models. Therefore, consumers may not fully experience the power of a 48-megapixel stacked Sony chip until 2024.
With stacked rear camera sensor technology, Apple aims to improve the imaging capabilities of its lower-end iPhone models. Bringing this feature to the standard iPhone 15 and 15 Plus shows Apple's commitment to delivering advanced camera capabilities to a wider range of users.