OmniVision's patent application US20120038014 proposes a stress film to passivate the backside surface:
"For a BSI CMOS image sensor, dark currents may be a particular problem. A typical BSI CMOS image sensor has dark current levels that are over 100 times greater than that of a front side illuminated sensor.
A BSI image sensor's backside surface stress may affect its dark current level. The present application discloses utilizing structures and methods to adjust the stress on a CMOS image sensor's backside silicon surface, thereby reducing the dark current effect by facilitating the movement of photo generated charge carriers away from the backside surface.
Stress on a backside silicon surface may be adjusted by forming a stress loaded layer on the surface. A stress loaded layer may include materials such as metal, organic compounds, inorganic compounds, or otherwise. For example, the stress loaded layer may include a silicon oxide (SiO2) film, a silicon nitride (SiNx) film, a silicon oxynitride (SiOxNy) film, or a combination thereof."
Apple patent application US20120044328 proposes to split the image sensor into three - one luminance sensor and two chrominance ones:
"Typically, the luminance portion of a color image may have a greater influence on the overall image resolution than the chrominance portion. This effect can be at least partially attributed to the structure of the human eye, which includes a higher density of rods for sensing luminance than cones for sensing color.
While an image sensing device that emphasizes luminance over chrominance generally does not perceptibly compromise the resolution of the produced image, color information can be lost if the luminance and chrominance sensors are connected to separate optical lens trains, and a “blind” region of the luminance sensor is offset from the “blind” region of the chrominance sensor. One example of such a blind region can occur due to a foreground object occluding a background object. Further, the same foreground object may create the blind region for both the chrominance and luminance sensors, or the chrominance blind region created by one object may not completely overlap the luminance blind region created by a second object. In such situations, color information may be lost for the “blind” regions of the chrominance sensor, thereby compromising the resolution of the composite color image."
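The luminance-dominance argument above is the same rationale behind chroma subsampling in video coding: the eye resolves fine detail mostly from luma, so chroma can be stored at reduced resolution with little perceptible loss. A minimal sketch (using the standard ITU-R BT.601 luma/chroma weights, which are a conventional choice and not taken from the patent):

```python
# Sketch: separate luma (Y) from chroma (Cb, Cr) and subsample chroma 4:2:0-style.
# BT.601 full-range coefficients; values are illustrative, not from the patent.

def rgb_to_ycbcr(r, g, b):
    """Convert one RGB pixel (0-255) to full-range YCbCr per ITU-R BT.601."""
    y  =        0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def subsample_420(plane):
    """Average each 2x2 block of a chroma plane (4:2:0-style), a 4x
    reduction in chroma samples while luma stays at full resolution."""
    h, w = len(plane), len(plane[0])
    return [[(plane[i][j] + plane[i][j + 1]
              + plane[i + 1][j] + plane[i + 1][j + 1]) / 4
             for j in range(0, w, 2)]
            for i in range(0, h, 2)]

# A neutral white pixel has all its signal in luma; chroma sits at the midpoint.
y, cb, cr = rgb_to_ycbcr(255, 255, 255)   # y == 255, cb == cr == 128
```

This is exactly the budget Apple's split exploits: a dedicated full-resolution luminance sensor, with color captured by separate (and potentially lower-effective-resolution) chrominance sensors.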
So Apple proposes to use two chrominance sensors together with a processing flow, illustrated in the application's figures, that combines their outputs with the luminance data.