Patent application number | Description | Published |
20120019669 | SYSTEMS AND METHODS FOR CALIBRATING IMAGE SENSORS - Systems and methods are provided for calibrating image sensors. In some embodiments, a processing module of an image system can automatically perform a self-calibration process after a production unit of an image sensor has been integrated into an end product system. For example, the processing module can calibrate a production unit based on one or more reference pixels of the production unit, where the one or more reference pixels have minimal color filtration. In some embodiments, the processing module may perform local calibrations by correcting specifically for spatial variations in a color filter array (“CFA”). In some embodiments, the processing module can perform global calibrations by correcting for optical density variations in the CFA. In some embodiments, a processing module can determine whether the cause of production variations is related to production variations of a CFA or production variations of an infrared (“IR”) cutoff filter. | 01-26-2012 |
20120091317 | IMAGING SYSTEMS AND METHODS INCLUDING PIXEL ARRAYS WITH REDUCED NUMBERS OF METAL LINES AND CONTROL SIGNALS - This is generally directed to systems and methods for reducing the number of metal lines and control signals in an imaging system. For example, in some embodiments a pixel cell of an imaging system can operate without a row select transistor, and therefore can operate without a row select metal control line. As another example, in some embodiments a pixel cell can share its reset transistor control line with a transfer transistor control line of another pixel cell. In this manner, an imaging system can be created that uses, on average, a single metal line per pixel cell. In some embodiments, operation of such reduced-metal-line imaging systems can use modified timing schemes of control signals. | 04-19-2012 |
20120193515 | IMAGERS WITH DEPTH SENSING CAPABILITIES - An imager may include depth sensing pixels that provide an asymmetrical angular response to incident light. The depth sensing pixels may each include a substrate region formed from a photosensitive portion and a non-photosensitive portion. The depth sensing pixels may include mechanisms that prevent regions of the substrate from receiving incident light. Depth sensing pixel pairs may be formed from depth sensing pixels that have different asymmetrical angular responses. Each of the depth sensing pixel pairs may effectively divide the corresponding imaging lens into separate portions. Depth information for each depth sensing pixel pair may be determined based on the difference between output signals of the depth sensing pixels of that depth sensing pixel pair. The imager may be formed from various combinations of depth sensing pixel pairs and color sensing pixel pairs arranged in a Bayer pattern or other desired patterns. | 08-02-2012 |
20120274744 | STRUCTURED LIGHT IMAGING SYSTEM - Structured light imaging methods and systems are described. An imaging method generates a stream of light pulses, converts the stream, after reflection by a scene, into charge, stores charge converted during the light pulses to a first storage element, and stores charge converted between light pulses to a second storage element. A structured light imaging system includes an illumination source that generates a stream of light pulses and an image sensor. The image sensor includes a photodiode, first and second storage elements, first and second switches, and a controller that synchronizes the image sensor to the illumination source and actuates the first and second switches to couple the first storage element to the photodiode to store charge converted during the light pulses and to couple the second storage element to the photodiode to store charge converted between the light pulses. | 11-01-2012 |
20130038691 | ASYMMETRIC ANGULAR RESPONSE PIXELS FOR SINGLE SENSOR STEREO - Depth sensing imaging pixels include pairs of left and right pixels that together form an asymmetrical angular response to incident light. A single microlens is positioned above each pair of left and right pixels and spans the pair in the horizontal direction. Each microlens has a length that is substantially twice the length of either the left or right pixel in the horizontal direction, and a width that is substantially the same as the width of either the left or right pixel in the vertical direction. The horizontal and vertical directions are those of the planar image array. A light pipe in each pixel is used to improve light concentration and reduce crosstalk. | 02-14-2013 |
20130222552 | IMAGING PIXELS WITH DEPTH SENSING CAPABILITIES - An imager may include depth sensing pixels that receive and convert incident light into image signals. The imager may have an associated imaging lens that focuses the incident light onto the imager. Each of the depth sensing pixels may include a microlens that focuses incident light received from the imaging lens through a color filter onto first and second photosensitive regions of a substrate. The first and second photosensitive regions may provide different and asymmetrical angular responses to incident light. Depth information for each depth sensing pixel may be determined based on the difference between output signals of the first and second photosensitive regions of that depth sensing pixel. Color information for each depth sensing pixel may be determined from a summation of output signals of the first and second photosensitive regions. | 08-29-2013 |
20130222603 | IMAGING SYSTEMS FOR INFRARED AND VISIBLE IMAGING - An imaging device capable of simultaneously capturing visible and infrared images may be provided with an array of photosensitive elements, an array of filter elements arranged over the array of photosensitive elements, and a dual bandpass filter arranged over the array of filter elements. The dual bandpass filter may have a first passband in the visible spectral range and a second passband in the infrared spectral range. The array of filter elements may include color filter elements and infrared filter elements. During color image capturing operations, each color pixel receives visible and near infrared light through the dual bandpass filter and an associated color filter element. The infrared portion of the pixel signal from the color pixels may be removed using signals from the near infrared pixels. During infrared image capturing operations, each near infrared pixel receives infrared light through the dual bandpass filter and an associated infrared filter element. | 08-29-2013 |
20130292548 | IMAGE SENSORS WITH PHOTOELECTRIC FILMS - An image sensor with an organic photoelectric film for converting light into charge may be provided. The image sensor may include an array of image sensor pixels. Each image sensor pixel may include a charge-integrating pinned diode that collects photo-generated charge from the photoelectric film during an integration period. An anode electrode may be coupled to an n+ doped charge injection region in the charge-integrating pinned diode and may be used to convey the photo-generated charge from the photoelectric film to the charge-integrating pinned diode. Upon completion of a charge integration cycle, a first transfer transistor gate may be pulsed to move the charge from the charge-integrating pinned diode to a charge-storage pinned diode. The charge may be transferred from the charge-storage pinned diode to a floating diffusion node for readout by pulsing a gate of a second charge transfer transistor. | 11-07-2013 |
20140077323 | IMAGING SYSTEMS WITH BACKSIDE ILLUMINATED NEAR INFRARED IMAGING PIXELS - An imaging system may include an image sensor having backside illuminated near infrared image sensor pixels. Each pixel may be formed in a graded epitaxial substrate layer such as a graded n-type epitaxial layer. Each pixel may be separated from an adjacent pixel by an isolation trench formed in the graded epitaxial layer. The isolation trench may be a continuous isolation trench or may be formed from a combined front side isolation trench and backside isolation trench that are separated by a wall structure. A buried front side reflector may be provided that reflects light such as infrared light that has passed through a pixel back into the pixel, thereby effectively doubling the silicon absorption depth of the pixels. | 03-20-2014 |
20140078349 | IMAGING SYSTEMS WITH CROSSTALK CALIBRATION PIXELS - An image sensor may include crosstalk calibration pixels. Crosstalk calibration pixels may include exposed pixels and shielded pixels. Exposed pixels may be partially or completely surrounded by shielded pixels. Calibration pixels may be formed in a checkerboard pattern of alternating shielded and exposed pixels or a double checkerboard pattern of alternating pairs of shielded and exposed pixels. Exposed pixels may have apertures of various sizes in a shielding layer that shields the shielded pixels from light. Signals generated by exposed and shielded pixels may be used in assessing pixel optical and electrical crosstalk and in indirectly deducing the spectral composition of incoming light at particular locations in a pixel array. Information about local crosstalk across the array may be used in coordinate-dependent color correction matrices, white balance algorithms, luminance and chroma noise cancellation, edge sharpening, assessment of pixel implantation depth, and measurement of the modulation transfer function. | 03-20-2014 |
20140078366 | IMAGING SYSTEMS WITH IMAGE PIXELS HAVING VARYING LIGHT COLLECTING AREAS - An image sensor may have an array of image sensor pixels having varying light collecting areas. The light collecting area of each image pixel may vary with respect to other image pixels due to varied microlens sizes and varied color filter element sizes throughout the array. The light collecting area may vary within unit pixel cells and the variability of the light collecting areas of pixels within each pixel cell may depend on the location of the pixel cell in the pixel array. Each unit pixel cell may include at least one clear pixel having a light collecting area that is smaller than the light collecting areas of other single color pixels in the unit pixel cell. | 03-20-2014 |
20140197301 | GLOBAL SHUTTER IMAGE SENSORS WITH LIGHT GUIDE AND LIGHT SHIELD STRUCTURES - An image sensor operable in global shutter mode may include an array of image pixels. Each image pixel may include a photodiode for detecting incoming light and a separate storage diode for temporarily storing charge. To maximize the efficiency of the image pixel array, image pixels may include light guide structures and light shield structures. The light guide structures may be used to funnel light away from the storage node and into the photodiode, while the light shield structures may be formed over storage nodes to block light from entering the storage nodes. The light guide structures may fill cone-shaped cavities in a dielectric layer, or the light guide structures may form sidewalls having a ring-shaped horizontal cross section. Metal interconnect structures in the dielectric layer may be arranged in concentric annular structures to form a near-field diffractive element that funnels light towards the appropriate photodiode. | 07-17-2014 |
20140263980 | IMAGERS WITH DEPTH SENSING CAPABILITIES - An imager may include depth sensing pixels that provide an asymmetrical angular response to incident light. The depth sensing pixels may each include a substrate region formed from a photosensitive portion and a non-photosensitive portion. The depth sensing pixels may include mechanisms that prevent regions of the substrate from receiving incident light. Depth sensing pixel pairs may be formed from depth sensing pixels that have different asymmetrical angular responses. Each of the depth sensing pixel pairs may effectively divide the corresponding imaging lens into separate portions. Depth information for each depth sensing pixel pair may be determined based on the difference between output signals of the depth sensing pixels of that depth sensing pixel pair. The imager may be formed from various combinations of depth sensing pixel pairs and color sensing pixel pairs arranged in a Bayer pattern or other desired patterns. | 09-18-2014 |
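Several of the abstracts above (20120193515, 20130222552, 20140263980) derive depth from the difference between the output signals of an asymmetric pixel pair and color from their summation. A minimal sketch of that arithmetic, where the function name and the `depth_scale` calibration constant are illustrative assumptions, not terms from the applications:

```python
def depth_and_color(left_signal, right_signal, depth_scale=1.0):
    """Illustrative sketch of the pixel-pair arithmetic described in the
    depth-sensing abstracts: depth information comes from the signed
    difference of the pair's outputs, color information from their sum.
    depth_scale is a hypothetical calibration constant."""
    depth = depth_scale * (left_signal - right_signal)
    color = left_signal + right_signal
    return depth, color
```

A matched pair (equal left and right signals) yields zero depth signal, consistent with the abstracts' description that depth is determined from the asymmetry between the two responses.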
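Application 20130222603 describes removing the infrared portion of a color pixel's signal using signals from nearby near-infrared pixels. A hedged sketch of that subtraction, assuming a simple scaled-reference model; the function name and `ir_gain` factor are hypothetical:

```python
def remove_ir(color_signal, nir_signal, ir_gain=1.0):
    """Sketch of IR-contamination removal per 20130222603's description:
    subtract a scaled near-infrared reference from a color pixel signal.
    ir_gain is a hypothetical per-channel calibration factor; the clamp
    prevents negative corrected signals."""
    corrected = color_signal - ir_gain * nir_signal
    return max(corrected, 0.0)
```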
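Application 20120274744 describes storing charge converted during light pulses in a first storage element and charge converted between pulses in a second. The accumulation logic can be sketched as below; the function name, inputs, and discrete-sample framing are assumptions for illustration only:

```python
def bin_structured_light(samples, pulse_on_flags):
    """Sketch of the two-storage-element scheme from 20120274744:
    charge converted while the illumination pulse is on accumulates in
    one bin, charge converted between pulses in the other. samples is a
    hypothetical list of per-interval charge amounts, pulse_on_flags the
    synchronized pulse state for each interval."""
    during, between = 0, 0
    for charge, pulse_on in zip(samples, pulse_on_flags):
        if pulse_on:
            during += charge   # first storage element
        else:
            between += charge  # second storage element
    return during, between
```

The "between" bin captures ambient light alone, which is how such schemes can separate the structured illumination from background illumination.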