Patent application number | Description | Published |
20130016292 | EYEPIECE FOR NEAR-TO-EYE DISPLAY WITH MULTI-REFLECTORS (Inventors: Miao, Xiaoyu, Sunnyvale, CA; Amirparviz, Babak, Mountain View, CA) - An eyepiece for a head mounted display includes an illumination module, an end reflector, a viewing region, and a polarization rotator. The illumination module includes an image source for launching computer generated image (“CGI”) light along a forward propagation path. The end reflector is disposed at an opposite end of the eyepiece from the illumination module to reflect the CGI light back along a reverse propagation path. The viewing region is disposed between the illumination module and the end reflector. The viewing region includes a polarizing beam splitter (“PBS”) and a non-polarizing beam splitter (“non-PBS”) disposed between the PBS and the end reflector. The viewing region redirects the CGI light from the reverse propagation path out of an eye-ward side of the eyepiece. The polarization rotator is disposed in the forward and reverse propagation paths of the CGI light between the viewing region and the end reflector. | 01-17-2013 |
20130016413 | WHOLE IMAGE SCANNING MIRROR DISPLAY SYSTEM (Inventors: Saeedi, Ehsan, Santa Clara, CA; Miao, Xiaoyu, Sunnyvale, CA; Amirparviz, Babak, Mountain View, CA) - An optical apparatus includes an image source, a scanning mirror, an actuator, and a scanning controller. The image source outputs an image by simultaneously projecting a two-dimensional array of image pixels representing a whole portion of the image. The scanning mirror is positioned in an optical path of the image to reflect the image. The actuator is coupled to the scanning mirror to selectively adjust the scanning mirror about at least one axis. The scanning controller is coupled to the actuator to control a position of the scanning mirror about the at least one axis. The scanning controller includes logic to continuously and repetitiously adjust the position of the scanning mirror to cause the image to be scanned over an eyebox area that is larger than the whole portion of the image. | 01-17-2013 |
20130021374 | Manipulating And Displaying An Image On A Wearable Computing System - Example methods and systems for manipulating and displaying a real-time image and/or photograph on a wearable computing system are disclosed. A wearable computing system may provide a view of a real-world environment of the wearable computing system. The wearable computing system may image at least a portion of the view of the real-world environment in real-time to obtain a real-time image. The wearable computing system may receive at least one input command that is associated with a desired manipulation of the real-time image. The at least one input command may be a hand gesture. Then, based on the at least one received input command, the wearable computing system may manipulate the real-time image in accordance with the desired manipulation. After manipulating the real-time image, the wearable computing system may display the manipulated real-time image in a display of the wearable computing system. | 01-24-2013 |
20130021658 | Compact See-Through Display System - An optical system includes a display panel, an image former, a viewing window, a proximal beam splitter, and a distal beam splitter. The display panel is configured to generate a light pattern. The image former is configured to form a virtual image from the light pattern generated by the display panel. The viewing window is configured to allow outside light in from outside of the optical system. The virtual image and the outside light are viewable along a viewing axis extending through the proximal beam splitter. The distal beam splitter is optically coupled to the display panel and the proximal beam splitter and has a beam-splitting interface in a plane that is parallel to the viewing axis. A camera may also be optically coupled to the distal beam splitter so as to be able to receive a portion of the outside light that is viewable along the viewing axis. | 01-24-2013 |
20130033756 | METHOD AND APPARATUS FOR A NEAR-TO-EYE DISPLAY - An eyepiece for a head mounted display includes an illumination module, an end reflector, a viewing region, and a polarization rotator. The illumination module provides CGI light along a forward propagation path within the eyepiece. The end reflector is disposed at an opposite end of the eyepiece from the illumination module to reflect the CGI light back along a reverse propagation path within the eyepiece. The viewing region is disposed between the illumination module and the end reflector and includes an out-coupling polarizing beam splitter (“PBS”). The out-coupling PBS passes the CGI light traveling along the forward propagation path and redirects the CGI light traveling along the reverse propagation path out of an eye-ward side of the eyepiece. The polarization rotator is disposed in the forward and reverse propagation paths between the out-coupling PBS and the end reflector. | 02-07-2013 |
20130063486 | Optical Display System and Method with Virtual Image Contrast Control - A method includes generating a light pattern using a display panel and forming a virtual image from the light pattern utilizing one or more optical components. The virtual image is viewable from a viewing location. The method also includes receiving external light from a real-world environment incident on an optical sensor. The real-world environment is viewable from the viewing location. Further, the method includes obtaining an image of the real-world environment from the received external light, identifying a background feature in the image of the real-world environment over which the virtual image is overlaid, and extracting one or more visual characteristics of the background feature. Additionally, the method includes comparing the one or more visual characteristics to an upper threshold value and a lower threshold value and controlling the generation of the light pattern based on the comparison. | 03-14-2013 |
20130069985 | Wearable Computer with Superimposed Controls and Instructions for External Device - A wearable computing device includes a head-mounted display (HMD) that provides a field of view in which at least a portion of the environment of the wearable computing device is viewable. The HMD is operable to display images superimposed over the field of view. When the wearable computing device determines that a target device is within its environment, the wearable computing device obtains target device information related to the target device. The target device information may include information that defines a virtual control interface for controlling the target device and an identification of a defined area of the target device on which the virtual control interface is to be provided. The wearable computing device controls the HMD to display the virtual control interface as an image superimposed over the defined area of the target device in the field of view. | 03-21-2013 |
20130088413 | Method to Autofocus on Near-Eye Display - An optical system has an aperture through which virtual and real-world images are viewable along a viewing axis. The optical system may be incorporated into a head-mounted display (HMD). By modulating the length of the optical path along an optical axis within the optical system, the virtual image may appear to be at different distances away from the HMD wearer. The wearable computer of the HMD may be used to control the length of the optical path. The length of the optical path may be modulated using, for example, a piezoelectric actuator or stepper motor. By determining the distance to an object with respect to the HMD using a range-finder or autofocus camera, the virtual images may be controlled to appear at various distances and locations in relation to the target object and/or HMD wearer. | 04-11-2013 |
20130100362 | NEAR-TO-EYE DISPLAY WITH DIFFRACTION GRATING THAT BENDS AND FOCUSES LIGHT - A near-to-eye optical system includes an optically transmissive substrate having a see-through display region and a repeating pattern of diffraction elements. The repeating pattern of diffraction elements is disposed across the see-through display region of the optically transmissive substrate and organized into a reflective diffraction grating that bends and focuses computer generated image (“CGI”) light impingent upon the reflective diffraction grating. The see-through display region is at least partially transmissive to external ambient light impingent upon an exterior side of the optically transmissive substrate and at least partially reflective to the CGI light impingent upon an interior side of the optically transmissive substrate opposite the exterior side. | 04-25-2013 |
20130108229 | HEADS-UP DISPLAY INCLUDING AMBIENT LIGHT CONTROL | 05-02-2013 |
20130113973 | ADAPTIVE BRIGHTNESS CONTROL OF HEAD MOUNTED DISPLAY - A technique for adaptive brightness control of an eyepiece of a head mounted display (“HMD”) includes displaying a computer generated image (“CGI”) to an eye of a user from a viewing region of the eyepiece of the HMD. Image data is captured from an ambient environment surrounding the HMD. A brightness value is calculated for the ambient environment based at least in part upon the image data. A bias power setting is determined based at least in part upon the brightness value. The bias power setting is applied to an illumination source for generating the CGI and a brightness level of the CGI displayed to the eye of the user is controlled with the bias power setting. | 05-09-2013 |
20130124204 | Displaying Sound Indications On A Wearable Computing System - Example methods and systems for displaying one or more indications that indicate (i) the direction of a source of sound and (ii) the intensity level of the sound are disclosed. A method may involve receiving audio data corresponding to sound detected by a wearable computing system. Further, the method may involve analyzing the audio data to determine both (i) a direction from the wearable computing system of a source of the sound and (ii) an intensity level of the sound. Still further, the method may involve causing the wearable computing system to display one or more indications that indicate (i) the direction of the source of the sound and (ii) the intensity level of the sound. | 05-16-2013 |
20130235191 | NEAR-TO-EYE DISPLAY WITH AN INTEGRATED OUT-LOOKING CAMERA - Embodiments of a near-to-eye display include a light guide with a proximal end, a distal end, a front surface spaced apart from a back surface, an ambient input region on the front surface and an output region on the back surface. A display and a camera are positioned at or near the proximal end. A proximal optical element is positioned in the light guide and optically coupled to the display and the camera. A distal optical element is positioned in the light guide and optically coupled to the proximal optical element, the ambient input region and the output region. The proximal optical element can direct display light toward the distal optical element and ambient light to the camera, and the distal optical element can direct display light to the output region and ambient light to the output region and to the proximal optical element. | 09-12-2013 |
20130279705 | Displaying Sound Indications On A Wearable Computing System - Example methods and systems for displaying one or more indications that indicate (i) the direction of a source of sound and (ii) the intensity level of the sound are disclosed. A method may involve receiving audio data corresponding to sound detected by a wearable computing system. Further, the method may involve analyzing the audio data to determine both (i) a direction from the wearable computing system of a source of the sound and (ii) an intensity level of the sound. Still further, the method may involve causing the wearable computing system to display one or more indications that indicate (i) the direction of the source of the sound and (ii) the intensity level of the sound. | 10-24-2013 |
20130314759 | Compact See-Through Display System - An optical system includes a display panel, an image former, a viewing window, a proximal beam splitter, and a distal beam splitter. The display panel is configured to generate a light pattern. The image former is configured to form a virtual image from the light pattern generated by the display panel. The viewing window is configured to allow outside light in from outside of the optical system. The virtual image and the outside light are viewable along a viewing axis extending through the proximal beam splitter. The distal beam splitter is optically coupled to the display panel and the proximal beam splitter and has a beam-splitting interface in a plane that is parallel to the viewing axis. A camera may also be optically coupled to the distal beam splitter so as to be able to receive a portion of the outside light that is viewable along the viewing axis. | 11-28-2013 |
20130335301 | Wearable Computer with Nearby Object Response - Exemplary methods and systems relate to detecting physical objects near a substantially transparent head-mounted display (HMD) system and activating a collision-avoidance action to alert a user of the detected objects. Detection techniques may include receiving data from distance and/or relative movement sensors and using this data as a basis for determining an appropriate collision-avoidance action. Exemplary collision-avoidance actions may include de-emphasizing virtual objects displayed on the HMD to provide a less cluttered view of the physical objects through the substantially transparent display and/or presenting new virtual objects. | 12-19-2013 |
20140016015 | IMAGING DEVICE WITH A PLURALITY OF DEPTHS OF FIELD - An imaging device includes an image sensor that includes a first group of pixels and a second group of pixels disposed on a semiconductor die. The first group of pixels are arranged to capture a first image and the second group of pixels are arranged to capture a second image. The imaging device also includes a first lens configured to focus image light from a first focus distance onto the first group of pixels. The imaging device further includes a second lens configured to focus the image light from a second focus distance onto the second group of pixels and not the first group of pixels. The first lens is positioned to focus the image light from the first focus distance onto the first group of pixels and not the second group of pixels. The first focus distance is different than the second focus distance. | 01-16-2014 |
20140078333 | IMAGING DEVICE WITH A PLURALITY OF PIXEL ARRAYS - An imaging device includes a first pixel array arranged to capture a first image and a second pixel array arranged to capture a second image. The first pixel array and the second pixel array face substantially a same direction. The imaging device also includes shutter control circuitry which is coupled to the first pixel array to initiate a first exposure period of the first pixel array to capture the first image. The shutter control circuitry is also coupled to the second pixel array to initiate a second exposure period of the second pixel array to capture the second image. The imaging device also includes processing logic coupled to receive first pixel data of the first image and coupled to receive second pixel data of the second image. The processing logic is configured to generate at least one image using the first pixel data and the second pixel data. | 03-20-2014 |
20140125810 | LOW-PROFILE LENS ARRAY CAMERA - An imaging device includes an image sensor and an array of wafer lenses. The image sensor has rows and columns of pixels partitioned into an array of sensor subsections. The array of wafer lenses is disposed over the image sensor. Each of the wafer lenses in the array of wafer lenses is optically positioned to focus image light onto a corresponding sensor subsection in the array of sensor subsections. Each sensor subsection includes unlit pixels that do not receive the image light focused from the wafer lenses, and each sensor subsection also includes lit pixels that receive the image light focused by the wafer lenses. A rectangular subset of the lit pixels from each sensor subsection is arranged to capture images. | 05-08-2014 |
20150042834 | SINGLE PIXEL CAMERA - A camera system includes a single pixel photo-sensor disposed in or on a substrate to acquire image data. A micro-lens is adjustably positioned above the single pixel photo-sensor to focus external scene light onto the single pixel photo-sensor. An actuator is coupled to the micro-lens to adjust a position of the micro-lens relative to the single pixel photo-sensor to reposition the micro-lens to focus the external scene light incident from different angles onto the single pixel photo-sensor. Readout circuitry is coupled to readout the image data associated with each of the different angles from the single pixel photo-sensor. | 02-12-2015 |
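The adaptive brightness control described in application 20130113973 (computing a brightness value for the ambient environment from captured image data, then deriving a bias power setting for the CGI illumination source) is concrete enough to illustrate with a short sketch. This is not the claimed method: the function names, the 8-bit grayscale input, and the linear power mapping are all illustrative assumptions.

```python
# Illustrative sketch of the adaptive-brightness idea in application
# 20130113973. All names and the linear mapping are assumptions, not
# the claimed implementation.

def ambient_brightness(pixels):
    """Mean luminance of ambient image data, normalized to [0, 1].
    `pixels` is a flat sequence of 8-bit grayscale samples."""
    return sum(pixels) / (255.0 * len(pixels))

def bias_power(brightness, p_min=0.1, p_max=1.0):
    """Map ambient brightness to an illumination-source bias power:
    brighter surroundings call for a brighter CGI to stay visible."""
    return p_min + (p_max - p_min) * brightness

# A dim scene yields a lower bias power than a sunlit one.
dim = ambient_brightness([10, 20, 15, 5])
sunlit = ambient_brightness([240, 250, 245, 255])
assert bias_power(dim) < bias_power(sunlit)
```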
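Applications 20130124204 and 20130279705 determine both a direction from the system to a sound source and an intensity level of the sound. A minimal sketch of one way to compute those two quantities, assuming a two-microphone far-field geometry and RMS-based intensity (both assumptions, not the claimed analysis):

```python
import math

# Illustrative sketch of the direction/intensity analysis in
# applications 20130124204 and 20130279705. The two-microphone
# far-field model and RMS intensity measure are assumptions.

def sound_intensity_db(samples, ref=1.0):
    """Intensity level of the detected sound in dB relative to `ref`."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20.0 * math.log10(max(rms, 1e-12) / ref)

def direction_from_delay(delay_s, mic_spacing_m=0.15, speed_of_sound=343.0):
    """Azimuth (degrees) of the source from the inter-microphone time
    delay, via the far-field relation delay = spacing * sin(theta) / c."""
    x = delay_s * speed_of_sound / mic_spacing_m
    x = max(-1.0, min(1.0, x))  # clamp against measurement noise
    return math.degrees(math.asin(x))
```

A wearable system could then render an indication (for example, an arrow) pointing at the returned azimuth, sized by the returned decibel level.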
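Application 20130063486 compares visual characteristics of the background feature behind the virtual image against an upper and a lower threshold and controls generation of the light pattern based on that comparison. A hedged sketch of that comparison step, using a single luminance characteristic and hypothetical threshold and step values:

```python
# Illustrative sketch of the threshold comparison in application
# 20130063486. The single-luminance characteristic, thresholds, and
# step size are assumptions, not the claimed control scheme.

def adjust_display_level(background_luma, level, lower=0.2, upper=0.8, step=0.1):
    """Raise the light-pattern level over a bright background feature,
    lower it over a dark one, and hold it in between. All quantities
    are normalized to [0, 1]."""
    if background_luma > upper:
        return min(1.0, level + step)
    if background_luma < lower:
        return max(0.0, level - step)
    return level
```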