23rd week of 2016 patent application highlights part 78
Patent application number | Title | Published
20160165127IMAGE CAPTURE DEVICE AND IMAGE PROCESSING METHOD - A first restoration processing section 2016-06-09
20160165128CAPTURING AND SENDING IMAGES AND VIDEOS BASED ON A SINGLE USER INTERACTION WITH A USER INTERFACE ELEMENT - A user interacts with a messaging application on a client device to capture and send images to contacts or connections of the user, with a single user interaction. The messaging application installed on the client device, presents to the user a user interface. The user interface includes a camera view and a face tray including contact icons. On receiving a single user interaction with a contact icon in the face tray, the messaging application captures an image including the current camera view presented to the user, and sends the captured image to the contact represented by the contact icon. In another example, the messaging application may receive a single user interaction with a contact icon for a threshold period of time, and may capture a video for the threshold period of time, and send the captured video to the contact.2016-06-09
20160165129Image Processing Method - A method of processing an image comprises: acquiring an image of a scene including an object having a recognisable feature. A lens actuator setting providing a maximum sharpness for a region of the image including the object and a lens displacement corresponding to the lens actuator setting are determined. A distance to the object based on the lens displacement is calculated. A dimension of the feature as a function of the distance to the object, the imaged object size and a focal length of a lens assembly with which the image was acquired, is determined. The determined dimension of the feature is employed instead of an assumed dimension of the feature for subsequent processing of images of the scene including the object.2016-06-09
20160165130Eye/Head Controls for Camera Pointing - A setting of a video camera is remotely controlled. Video from a video camera is displayed to a user using a video display. At least one eye of the user is imaged as the user is observing the video display, a change in an image of at least one eye of the user is measured over time, and an eye/head activity variable is calculated from the measured change in the image using an eyetracker. The eye/head activity variable is translated into a camera control setting, and an actuator connected to the video camera is instructed to apply the camera control setting to the video camera using a processor.2016-06-09
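To picture the translation step described in 20160165130, here is a minimal sketch that maps an eye/head activity variable (assumed here to be a normalized gaze displacement) to a pan/tilt camera setting; the gains, clamping, and function names are illustrative assumptions, not details from the application.

```python
# Sketch: translate an eye/head activity variable into a camera control
# setting. The gaze displacement is assumed normalized to [-1, 1] per axis;
# the gains are illustrative, not values from the application.
PAN_GAIN_DEG = 30.0   # max pan change per update (assumed)
TILT_GAIN_DEG = 20.0  # max tilt change per update (assumed)

def gaze_to_camera_setting(gaze_dx: float, gaze_dy: float) -> dict:
    """Map a measured change in the eye image to a pan/tilt adjustment."""
    return {
        "pan_deg": max(-1.0, min(1.0, gaze_dx)) * PAN_GAIN_DEG,
        "tilt_deg": max(-1.0, min(1.0, gaze_dy)) * TILT_GAIN_DEG,
    }

if __name__ == "__main__":
    print(gaze_to_camera_setting(0.4, -0.1))  # pan right, tilt down slightly
```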
20160165131REFRIGERATOR - A refrigerator is provided. The refrigerator may include a main body having a storage compartment, a first door rotatably installed at a first side of the main body to open and close a first portion of the storage compartment, and a second door rotatably installed at a second side of the main body to open and close a second portion of the storage compartment. A first camera may be installed at the first door to take a picture of an interior of the first storage compartment during rotation of the first door, and a second camera may be installed at the second door to take a picture of the interior of the first storage compartment during rotation of the second door. A controller may combine plural pictures taken by the first camera and the second camera into a single corrected image of a region of the first compartment spanning from the first door to the second door.2016-06-09
20160165132ELECTRONIC APPARATUS, AND CONTROL METHOD THEREFOR - In order to enable users to use various functions via simple operations, a setting unit is configured to set any of setting values included in a first item in a hierarchy of a first setting menu and to set any of setting values included in a second item in a hierarchy of a second setting menu, and a control unit is configured to, according to a specific operation, switch to selection of a second setting value included in the second item in a case a first setting value included in the first item is selected, and to switch to selection of a fourth setting value included in the second item in a case a third setting value included in the first item is selected.2016-06-09
20160165133METHOD OF CONTROLLING CAMERA OF DEVICE AND DEVICE THEREOF - Provided is a method of controlling a camera of a device capable of minimizing a vision difference between a front camera and a user by displaying a reduced size preview image within a partial region of a display proximate to a lens of the front camera. When a photo is taken while the user views the reduced size preview image, a more natural image is captured.2016-06-09
20160165134Feature Based High Resolution Motion Estimation from Low Resolution Images Captured Using an Array Source - Systems and methods in accordance with embodiments of the invention enable feature based high resolution motion estimation from low resolution images captured using an array camera. One embodiment includes performing feature detection with respect to a sequence of low resolution images to identify initial locations for a plurality of detected features in the sequence of low resolution images, where the at least one sequence of low resolution images is part of a set of sequences of low resolution images captured from different perspectives. The method also includes synthesizing high resolution image portions, where the synthesized high resolution image portions contain the identified plurality of detected features from the sequence of low resolution images. The method further including performing feature detection within the high resolution image portions to identify high precision locations for the detected features, and estimating camera motion using the high precision locations for said plurality of detected features.2016-06-09
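The flow in 20160165134 (coarse feature locations from low-resolution frames, refinement at high resolution, then motion estimation from the refined locations) can be pictured with this toy sketch. It models camera motion as a single global translation solved by averaging feature displacements and simulates the refined high-resolution locations with synthetic data; the upscale factor and all numbers are assumptions standing in for the array-camera pipeline.

```python
import numpy as np

# Sketch: coarse feature locations from low-resolution (LR) frames are refined
# at high resolution (HR), then camera motion is estimated from the refined
# matches. Motion is modeled here as a single global translation.

def estimate_translation(pts_prev: np.ndarray, pts_curr: np.ndarray) -> np.ndarray:
    """Least-squares global translation between matched feature locations."""
    return (pts_curr - pts_prev).mean(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    upscale = 4                                   # assumed LR -> HR factor
    lr_prev = rng.uniform(0, 100, size=(20, 2))   # coarse LR feature locations
    true_motion = np.array([1.25, -0.5])          # sub-LR-pixel camera motion
    # Refined HR locations (simulated): LR locations scaled up, plus motion and noise.
    hr_prev = lr_prev * upscale
    hr_curr = (lr_prev + true_motion) * upscale + rng.normal(0, 0.2, size=lr_prev.shape)
    est_hr = estimate_translation(hr_prev, hr_curr)
    print("estimated motion in LR pixels:", est_hr / upscale)  # ~ [1.25, -0.5]
```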
20160165135IMAGE PHOTOGRAPHING APPARATUS, METHOD OF PHOTOGRAPHING IMAGE AND NON-TRANSITORY RECORDABLE MEDIUM - An imaging apparatus is disclosed. The imaging apparatus according to an exemplary embodiment includes a camera configured to capture a subject, a combiner configured to be combined with another imaging apparatus, a controller configured to perform capturing by controlling the camera and the other imaging apparatus, respectively, and an image processor configured to, in response to a field of view interference occurring between the camera and the other imaging apparatus, delete an area where the field of view interference occurs from an image captured by the camera.2016-06-09
20160165136SERVICE SYSTEM, INFORMATION PROCESSING APPARATUS, AND SERVICE PROVIDING METHOD - A service system includes a mobile terminal and an information processing device capable of communication via a network. The mobile terminal includes a first transmission unit that transmits spherical images taken in respective imaging locations and positional information about the imaging locations to the information processing device. The information processing device includes a reception unit that receives the spherical images transmitted by the first transmission unit; a map data obtaining unit that obtains map data from a map data storage, the map data including the imaging locations of the spherical images; a path information creation unit that creates information about a path made by connecting the imaging locations in the map data obtained by the map data obtaining unit; and a content providing unit that makes content available for a request through the network, the content including the map data, the information about the path, and the spherical images.2016-06-09
20160165137IMAGING APPARATUS AND METHOD OF CONTROLLING THE SAME - An imaging apparatus and method of controlling the same are disclosed. The imaging apparatus includes camera modules, a motion sensor configured to sense a motion of the imaging apparatus to generate a motion value, a controller configured to generate a control signal to adjust a position of a lens of the camera modules based on the motion value, image stabilizers configured to adjust the position of the lens of the camera modules in response to the control signal and to sense the adjusted position of the lens, and a selector configured to select at least one of the first image stabilizing unit or the second image stabilizing unit, in response to a selective input signal, to transfer the control signal to the selected image stabilizer, and to transfer the sensed position of the lens to the controller from the selected image stabilizer.2016-06-09
20160165138IMAGE CAPTURING APPARATUS AND CONTROL METHOD THEREFOR - An image capturing apparatus comprises: a first detection unit configured to detect a panning amount of the image capturing apparatus; a second detection unit configured to detect a moving amount of a subject; and a control unit configured to perform control to display, on a display unit during an exposure duration, information which is related to a difference between the panning amount of the image capturing apparatus detected by the first detection unit in the exposure duration and the moving amount of the subject detected by the second detection unit before a start of the exposure.2016-06-09
20160165139IMAGE SHAKE CORRECTION DEVICE, IMAGE PICKUP APPARATUS, AND CONTROL METHOD - Provided is an image shake correction device that uses a shift lens group to correct image shake during panning photographing, letting the movement of an image pick-up apparatus follow the movement of a main subject. The image shake correction device executes a determination process on a motion vector for calculating an angular velocity of the main subject for use in calculation of a drive signal of the shift lens group. The image shake correction device changes the determination process on the motion vector according to a state of the image pick-up apparatus.2016-06-09
20160165140METHOD FOR CAMERA MOTION ESTIMATION AND CORRECTION - A motion estimation and correction system and methods are shown comprising: an image acquisition device configured to acquire an image via scanning an image frame over a period of time, an inertial measurement unit configured to measure at least one of a position, an orientation, and a movement of the image acquisition device during the period of time and output an indication of the movement as detected; and a state estimation module, operatively coupled to the inertial measurement unit and the image acquisition device, configured to estimate a state related to at least one of position and orientation of the image acquisition device based on the at least one of the position, the orientation, and the movement. In one example, systems and methods include using an inertial measurement unit along with an image acquisition device. In one example, the image acquisition device includes a rolling-shutter camera.2016-06-09
20160165141CAMERA TIMER - Disclosed are devices, systems, methods, and non-transitory computer-readable storage media for displaying useful countdown timers on a media capture device. A media capture device can increase contrast between countdown timer and the video of a scene to be captured by a media capture device and adjust the position and size of a counter displayed on the device based on whether the object is determined to be closer or further than a predetermined threshold distance. Countdown timers can be triggered by detection of objects and gestures.2016-06-09
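A minimal sketch of the placement rule suggested by 20160165141: the countdown counter is drawn larger and centered when the subject is beyond a threshold distance, and smaller in a corner when the subject is close. The threshold, sizes, and positions are assumed values for illustration only.

```python
# Sketch: choose countdown-timer size/position from subject distance.
# Threshold, sizes, and positions are assumed values, not from the application.
FAR_THRESHOLD_M = 2.0

def countdown_layout(subject_distance_m: float, frame_w: int, frame_h: int) -> dict:
    if subject_distance_m > FAR_THRESHOLD_M:
        size = int(0.25 * frame_h)                      # big, readable from afar
        pos = (frame_w // 2, frame_h // 2)              # centered
    else:
        size = int(0.10 * frame_h)                      # small, unobtrusive
        pos = (int(0.9 * frame_w), int(0.1 * frame_h))  # corner
    return {"font_px": size, "position": pos}

if __name__ == "__main__":
    print(countdown_layout(3.5, 1920, 1080))
    print(countdown_layout(0.8, 1920, 1080))
```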
20160165142IMAGE CAPTURING APPARATUS, CONTROL METHOD THEREFOR, AND STORAGE MEDIUM - There is provided an image capturing apparatus. A detection unit detects a position that has been designated by a user on a first display screen. A display unit displays, on the first display screen, a captured image generated by an image capturing unit. A transmission unit transmits the captured image to an external apparatus, the external apparatus being configured to display the captured image on a second display screen and detect a position that has been designated by the user on the second display screen. An obtaining unit obtains, from the detection unit or the external apparatus, position information based on the position designated by the user. A selection unit selects a partial region of the captured image based on the position information if the position designated by the user is within an effective range.2016-06-09
20160165143Position Capture Input Apparatus, System and Method Therefor - According to various embodiments, a position capture input system uses a camera to capture an image of a displayed graphical user interface that may be partially obstructed by an object, such as a user's hand or other body part. The position capture input system also includes a software component that causes a computing device to compare the captured image with a displayed image to determine which portion, if any, of the graphical user interface is obstructed. The computing device can then identify any user interface elements with which the user is attempting to interact. The position capture input system may also include an accelerometer or accelerometers for detecting gestures performed by the user to, for example, select or otherwise interact with a user interface element. The position capture input system may also include a haptic feedback module to provide confirmation, for example, that a user interface element has been selected.2016-06-09
20160165144IMAGE DISPLAY APPARATUS, IMAGE DISPLAY METHOD, AND STORAGE MEDIUM - An image display apparatus configured to recognize one or more objects from an image, determine a display priority of a menu to be displayed relating to a recognized object based on a set display mode, and display menus corresponding to one or more recognized objects based on the determined display priority.2016-06-09
20160165145CONFIGURATION AND MANAGEMENT OF LIGHT POSITIONING SYSTEM USING DIGITAL PULSE RECOGNITION - In one aspect, the present disclosure relates to a method for hiding a camera preview feed of a mobile device application. The method may proceed by the mobile device application enabling an imaging sensor of the mobile device, where the software of the mobile device requires the mobile device application to display the camera preview feed when the imaging sensor is enabled. The method may continue by creating a camera preview surface for displaying the camera preview feed. The method may further continue by modifying the camera preview surface to be hidden from the mobile device user. The method may end by setting the camera preview feed to be displayed on the camera preview surface. In another aspect, the present disclosure further relates to modifying the camera preview surface by resizing the camera preview surface to be one pixel large.2016-06-09
20160165146DEVICES, METHODS, AND SYSTEMS FOR VISUAL IMAGING ARRAYS - Computationally implemented methods and systems include capturing a scene that includes one or more images, through use of an array of more than one image sensor, selecting a particular portion of the scene that includes at least one image, wherein the selected particular portion is smaller than the scene, transmitting only the selected particular portion from the scene to a remote location, and de-emphasizing pixels from the scene that are not part of the selected particular portion of the scene. In addition to the foregoing, other aspects are described in the claims, drawings, and text.2016-06-09
20160165147Camera Modules Patterned with Pi Filter Groups - Systems and methods in accordance with embodiments of the invention pattern array camera modules with π filter groups. In one embodiment, an array camera module includes: an M×N imager array including a plurality of focal planes, where each focal plane includes an array of pixels; an M×N optic array of lens stacks, where each lens stack corresponds to a focal plane, and where each lens stack forms an image of a scene on its corresponding focal plane; where each pairing of a lens stack and focal plane thereby defines a camera; where at least one row in the M×N array of cameras includes at least one red camera, one green camera, and one blue camera; and where at least one column in the M×N array of cameras includes at least one red camera, one green camera, and one blue camera.2016-06-09
20160165148IMAGE SYNTHESIZER FOR VEHICLE - An image synthesizer apparatus for vehicle includes an image generator and an error detector. From multiple cameras arranged to a vehicle so that an imaging region of each camera partially overlaps with an imaging region of an adjacent camera, the image generator acquires images of areas allocated to the respective cameras, and synthesizes the acquired images to generate a synthetic image around the vehicle viewed from a viewpoint above the vehicle. The error detector detects errors in the cameras. When the error detector detects a faulty camera in the cameras, the image generator acquires, from the image captured by the camera adjacent to the faulty camera, an overlap portion overlapping with the image captured by the faulty camera, uses the overlap portion to generate the synthetic image, and applies image reinforcement to the overlap portion.2016-06-09
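The fallback described in 20160165148 can be pictured with a toy numpy sketch: when one camera is flagged as faulty, the overlapping strip from the adjacent camera's image is reused for the shared region, with a simple brightness boost standing in for the "image reinforcement". Array sizes, the fixed overlap width, and the boost factor are assumptions.

```python
import numpy as np

# Sketch: if a camera is faulty, fill its allocated area of the top-view
# composite from the overlapping strip of the adjacent camera, with a crude
# brightness boost standing in for "image reinforcement". All sizes assumed.
H, W = 60, 80          # size of each camera's allocated area (assumed)
OVERLAP_COLS = 20      # columns shared by adjacent cameras (assumed)

def compose_area(own_img: np.ndarray, neighbor_img: np.ndarray,
                 own_faulty: bool) -> np.ndarray:
    if not own_faulty:
        return own_img
    out = np.zeros_like(own_img)
    # Reuse the neighbor's overlapping strip for the shared region...
    strip = neighbor_img[:, -OVERLAP_COLS:].astype(np.float32)
    out[:, :OVERLAP_COLS] = np.clip(strip * 1.3, 0, 255)  # crude reinforcement
    # ...the rest of the faulty camera's area simply stays blank in this toy.
    return out.astype(own_img.dtype)

if __name__ == "__main__":
    front = np.full((H, W), 100, dtype=np.uint8)
    right = np.full((H, W), 90, dtype=np.uint8)
    print(compose_area(right, front, own_faulty=True)[0, :25])
```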
20160165149Daisy Chain Devices and Systems for Signal Switching and Distribution - The invention provides systems, devices, methods and software to daisy chain multiple individual transmitters, optionally nodes, optionally extenders, and receivers to form any sized scalable system of digital video and audio signal switching and distribution. The video/audio systems allow for simpler system design, wiring, control and expansion, accomplishing signal interfacing, switching, and splitting for many varied input and output requirements with a scalable pair of transmitters and receivers.2016-06-09
20160165150MOBILE DEVICE FOR RECORDING, REVIEWING, AND ANALYZING VIDEO - Two parties have three distinct viewpoints of their relationship, from which assumptions and working hypotheses about how to manage their relationship emerge. The system, device, and method described herein include using a mobile device for understanding face-to-face human interactions. The process includes using a mobile device for recording an interaction with one or more other persons, whereby one or more of the participants use the mobile device to describe their viewpoints of the interaction. The participants can use the mobile device to receive immediate feedback for analysis, to compare viewpoints, to examine how the viewpoints were arrived at, and to explore the viewpoints' consequences for the participants' relationship.2016-06-09
20160165151Virtual Focus Feedback - A system and method for focusing a camera is disclosed. The system may generate a proxy image that is blurred to an extent that is correlated to a degree to which a camera is out of focus. A user may be asked to adjust a focusing mechanism to attempt to bring the proxy image into focus. This allows the camera to be focused without the user needing to see an image from the camera. This can be used, for example, to focus an infrared camera. The infrared camera could be a tracking camera in a device such as a head mounted display device.2016-06-09
20160165152Drift Correction Method for Infrared Imaging Device - A method reduces drift induced by environment changes when imaging radiation from a scene in two wavelength bands. Scene radiation is focused by two wedge-shaped components through a lens onto a detector that includes three separate regions. The wedge-shaped components are positioned at a fixed distance from the lens. The radiation from the scene is imaged separately onto two of the detector regions through an f-number of less than approximately 1.5 to produce a first pixel signal. Imaged radiation on each of the two regions includes radiation in one respective wavelength band. Radiation from a radiation source is projected by at least one of the wedge-shaped components through the lens onto a third detector region to produce a second pixel signal. The first pixel signal is modified based on a predetermined function that defines a relationship between second pixel signal changes and first pixel signal changes induced by environment changes.2016-06-09
20160165153AMBIENT INFRARED DETECTION IN SOLID STATE SENSORS - An image sensor device has a first region configured to sense only infrared illumination and a second region configured to not sense visible and infrared illumination.2016-06-09
20160165154OPERATION INPUT DEVICE, OPERATION INPUT METHOD, AND COMPUTER-READABLE RECORDING MEDIUM - An operation input device includes an infrared irradiation unit that radiates infrared, a capturing unit that detects the infrared to capture an image, a luminance calculating unit that calculates, in a captured image, a luminance difference between a luminance of a first area irradiated with the infrared by the infrared irradiation unit and a luminance of a second area arranged outside the first area, an infrared control unit that adjusts an irradiation intensity of the infrared to be radiated from the infrared irradiation unit so that the luminance difference calculated by the luminance calculating unit becomes a predetermined target value, an image processing unit that detects a shape of an indication object from the captured image, a determination unit that determines an operation by the indication object from the shape detected by the image processing unit, and a command unit that makes a device to be operated perform a function corresponding to the determined operation.2016-06-09
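The intensity-adjustment behavior in 20160165154 amounts to a feedback loop that drives the measured luminance difference toward a target value. The sketch below uses a plain proportional update and a made-up linear scene response in place of an actual captured-image measurement; the gain and target are assumptions.

```python
# Sketch: adjust infrared irradiation intensity until the luminance difference
# between the irradiated area and its surroundings reaches a target value.
# The linear "scene response" and the gain are illustrative assumptions.
TARGET_DIFF = 50.0
GAIN = 0.2

def scene_luminance_diff(intensity: float) -> float:
    return 0.4 * intensity          # stand-in for measuring the captured image

def regulate_intensity(intensity: float = 0.0, steps: int = 100) -> float:
    for _ in range(steps):
        error = TARGET_DIFF - scene_luminance_diff(intensity)
        intensity = max(0.0, intensity + GAIN * error)
    return intensity

if __name__ == "__main__":
    final = regulate_intensity()
    print(final, scene_luminance_diff(final))   # luminance difference converges to ~50
```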
20160165155METHOD FOR REDUCING IMAGE FUZZY DEGREE OF TDI-CCD CAMERA - The present invention belongs to the field of image processing, and particularly relates to the determination of an aerial remote sensing image fuzzy parameter and the elimination of aerial remote sensing image fuzziness based on a TDI-CCD camera. The method comprises the following specific steps: establishing an image coordinate system, reading an area array image, constructing a similarity matching criterion, conducting offset resolving to acquire homonymy points so as to obtain a digital image reducing the chattering influence. The method is relatively simple and precise in computing process, and good in processing effect.2016-06-09
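The "similarity matching criterion" and "offset resolving" steps in 20160165155 can be pictured as a search for the integer shift that best aligns a reference patch with an area-array image, scored by normalized cross-correlation. This numpy sketch shows only that generic step, not the patent's specific TDI-CCD procedure or its homonymy-point matching.

```python
import numpy as np

# Sketch: find the integer (dy, dx) offset between a reference patch and an
# image by maximizing normalized cross-correlation over a small search window.
# This stands in for the "similarity matching" / "offset resolving" steps.

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def find_offset(ref: np.ndarray, img: np.ndarray, search: int = 5):
    h, w = ref.shape
    best, best_score = (0, 0), -2.0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            window = img[search + dy: search + dy + h, search + dx: search + dx + w]
            score = ncc(ref, window)
            if score > best_score:
                best, best_score = (dy, dx), score
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    scene = rng.random((50, 50))
    ref = scene[15:35, 15:35]        # 20x20 reference patch
    img = scene[8:38, 13:43]         # 30x30 window containing ref offset by (2, -3)
    print(find_offset(ref, img))     # -> (2, -3)
```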
20160165156DEVICE FOR PICTURE TAKING IN LOW LIGHT AND CONNECTABLE TO A MOBILE TELEPHONE TYPE DEVICE - A picture taking device includes a sensor of resolution higher than thirteen megapixels and an objective lens open at minimum to f/2.0 for at least one focal length value. A diagonal dimension of the photosensitive surface of the sensor is greater than 15 mm. A digital processor reduces noise while preserving the textures in at least one image captured by the sensor. The device has a power supply and a rigid connector to connect it to the mobile phone or tablet type device comprising a screen, and communicates with and/or commands the mobile phone or tablet type device. A picture taking method is provided using the picture taking device.2016-06-09
20160165157STEREO ASSIST WITH ROLLING SHUTTERS - An imaging system for a vehicle may include a first image capture device having a first field of view and configured to acquire a first image relative to a scene associated with the vehicle, the first image being acquired as a first series of image scan lines captured using a rolling shutter. The imaging system may also include a second image capture device having a second field of view different from the first field of view and that at least partially overlaps the first field of view, the second image capture device being configured to acquire a second image relative to the scene associated with the vehicle, the second image being acquired as a second series of image scan lines captured using a rolling shutter. As a result of overlap between the first field of view and the second field of view, a first overlap portion of the first image corresponds with a second overlap portion of the second image. The first image capture device has a first scan rate associated with acquisition of the first series of image scan lines that is different from a second scan rate associated with acquisition of the second series of image scan lines, such that the first image capture device acquires the first overlap portion of the first image over a period of time during which the second overlap portion of the second image is acquired.2016-06-09
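The scan-rate relationship in 20160165157 can be reduced to a line of arithmetic: if the overlap region spans a different number of scan lines in each camera, the second camera's line rate can be chosen so that both overlap portions are swept over the same time period. The numbers below are invented for illustration and are not taken from the application.

```python
# Sketch: pick the second camera's line scan rate so that both cameras sweep
# their (differently sized) overlap regions over the same time period.
# All numbers are illustrative, not taken from the application.

def matched_scan_rate(rate1_lines_per_s: float,
                      overlap_lines_cam1: int,
                      overlap_lines_cam2: int) -> float:
    overlap_time_s = overlap_lines_cam1 / rate1_lines_per_s
    return overlap_lines_cam2 / overlap_time_s

if __name__ == "__main__":
    # Camera 1: wide FOV, overlap spans 600 of its lines at 30,000 lines/s.
    # Camera 2: narrower FOV, the same scene overlap spans 900 of its lines.
    rate2 = matched_scan_rate(30_000, 600, 900)
    print(rate2)   # 45000.0 lines/s -> both overlap sweeps take 20 ms
```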
20160165158IMAGING APPARATUS AND CAMERA SYSTEM - An imaging apparatus that forms an image of a light beam transmitted through an imaging lens on an imaging element includes a laminated material that is provided on the imaging element, the light beam being transmitted through the laminated material, the laminated material being provided at a position at which an end portion of an upper surface of the laminated material allows an outermost light beam out of light beams to be transmitted therethrough, the light beams entering a pixel in an outer end portion of the imaging element in an effective pixel area, the position having a width Hopt.2016-06-09
20160165159SOLID STATE IMAGE SENSOR WITH ENHANCED CHARGE CAPACITY AND DYNAMIC RANGE - Certain aspects relate to imaging systems and methods for manufacturing imaging systems and image sensors. The imaging system includes a pixel array including a plurality of pixels, the pixels configured to generate a charge when exposed to light and disposed on a first layer. The imaging system further includes a plurality of pixel circuits for reading light integrated in the pixels coupled thereto, each of the plurality of pixel circuits comprising one or more transistors shared between a subset of the plurality of the pixels, the one or more transistors disposed on a second layer different than the first layer. The imaging system further includes a plurality of floating diffusion nodes configured to couple each of the plurality of pixels to the plurality of pixel circuits.2016-06-09
20160165160PIXEL READOUT ARCHITECTURE FOR FULL WELL CAPACITY EXTENSION - Certain aspects relate to systems and techniques for full well capacity extension. For example, a storage capacitor included in the pixel readout architecture can enable multiple charge dumps from a pixel in the analog domain, extending the full well capacity of the pixel. Further, multiple reads can be integrated in the digital domain using a memory, for example DRAM, in communication with the pixel readout architecture. This also can effectively multiply a small pixel's full well capacity. In some examples, multiple reads in the digital domain can be used to reduce, eliminate, or compensate for kTC noise in the pixel readout architecture.2016-06-09
20160165161IMAGE SENSING SYSTEM AND METHOD OF DRIVING THE SAME - In a first operation mode, signals are read from pixels including ranging pixels in a pixel array and ranging and image generation are performed based on the read signals. In a second operation mode, signals are read from the pixels excluding the ranging pixels and exposure is controlled based on the read signals.2016-06-09
20160165162SOLID-STATE IMAGING DEVICE AND DRIVE CONTROL METHOD FOR THE SAME - A CMOS sensor has unit pixels each structured by a light receiving element and three transistors, to prevent the phenomenon of saturation shading and the reduction of dynamic range. The transition time (fall time), in switching off the voltage on a drain line shared in all pixels, is given longer than the transition time in turning off any of the reset line and the transfer line. For this reason, the transistor constituting a DRN drive buffer is made proper in its W/L ratio. Meanwhile, a control resistance or current source is inserted on a line to the GND, to make proper the operation current during driving. This reduces the saturation shading amount. By making a reset transistor a depletion type, the leak current to a floating diffusion is suppressed to broaden the dynamic range.2016-06-09
20160165163IMAGING DEVICE - An object of the present invention is to reduce capacitance of a charge accumulation part (floating diffusion) of each pixel unit. In an imaging device, in addition to a plurality of first switching transistors for coupling a plurality of coupling wires extending in the column direction, a second switching transistor is provided between each of the coupling wires and a floating diffusion in each pixel unit. Preferably, the gate of the first switching transistor and the gate of the second switching transistor are electrically coupled to each other.2016-06-09
20160165164IMAGING APPARATUS AND CONTROL METHOD THEREOF - Provided is an imaging apparatus including an imaging element including a pixel unit having first and second photoelectric conversion units configured to generate image signals by photoelectrically converting optical fluxes passing through different regions into which an exit pupil of an imaging optical system is divided for one micro-lens. The imaging apparatus controls each of the timing of a first removal operation of removing a noise component from a first image signal read from the first photoelectric conversion unit and the timing of a second removal operation of removing a noise component from a second image signal read from the first and second photoelectric conversion units to have a predetermined relationship with a frequency of a noise source occurring during an operation of reading the image signals.2016-06-09
20160165165FLOATING DIFFUSION RESET LEVEL BOOST IN PIXEL CELL - A reset level in a pixel cell is boosted by switching ON a reset transistor of the pixel cell to charge the floating diffusion to a first reset level during a reset operation. A select transistor is switched from OFF to ON during the floating diffusion reset operation to discharge an output terminal of an amplifier transistor. The reset transistor is switched OFF after the output terminal of the amplifier transistor has been discharged in response to the switching ON of the select transistor. The output terminal of the amplifier transistor charges to a static level after being discharged. The floating diffusion coupled to the input terminal of the amplifier transistor follows the output terminal of the amplifier transistor across an amplifier capacitance coupled between the input terminal and the output terminal of the amplifier transistor to boost the reset level of the floating diffusion.2016-06-09
20160165166IMAGE SENSOR FOR IMPROVING NONLINEARITY OF ROW CODE REGION, AND DEVICE INCLUDING THE SAME - An image sensor is provided. The image sensor includes a pixel configured to generate a reset signal and an image signal, a comparator configured to compare the reset signal with a reference signal and generate a first comparison signal, a counter configured to generate a first count value corresponding to the reset signal based on a clock signal and the first comparison signal, and a reference signal generator configured to generate the reference signal which changes between a first level corresponding to a maximum reset count value of the counter and a second level corresponding to a minimum reset count value of the counter during a reset signal period.2016-06-09
20160165167SOLID STATE IMAGING DEVICE AND ELECTRONIC APPARATUS - A solid state imaging device includes a pixel array unit in which color filters of a plurality of colors are arrayed with four pixels of vertical 2 pixels×horizontal 2 pixels as a same color unit that receives light of the same color, shared pixel transistors that are commonly used by a plurality of pixels are intensively arranged in one predetermined pixel in a unit of sharing, and a color of the color filter of a pixel where the shared pixel transistors are intensively arranged is a predetermined color among the plurality of colors. The present technology can be applied, for example, to a solid state imaging device such as a back-surface irradiation type CMOS image sensor.2016-06-09
20160165168SYSTEMS AND METHODS FOR TRIGGERING THE TRANSMISSION OF RECOVERY VIDEO FRAMES TO A VIDEO-RECEIVING DEVICE OVER A HALF-DUPLEX AIR INTERFACE - An embodiment takes the form of a process that includes transmitting video frames to a receiving device during a first transmission period of one or more time slots of a half-duplex air interface, and receiving feedback messaging from the receiving device during a feedback period of one or more time slots of the half-duplex air interface. The process further includes suspending, after the first transmission period, transmission of video frames for a suspension period of one or more time slots of the half-duplex air interface, and after the feedback period and after the suspension period, transmitting one or more recovery frames to the receiving device during a recovery period of one or more time slots of the half-duplex air interface. The one or more recovery frames collectively include inter-coded macroblock data and/or intra-coded macroblock data.2016-06-09
20160165169Apparatus and Method for Transmitting Video Data in Communication System - Video data transmission in a communication system is provided. A method for operating a transmitting node includes generating a packet comprising data selected from compressed data and raw data based on a data size, and transmitting the packet.2016-06-09
20160165170Augmented reality remote control - An Augmented Reality (AR) device places manual controls and virtual displays onto a surface of a controllable electronic device (CED) or next to the CED as viewed through the AR device, allowing the user to manipulate the controls and view feedback via the virtual displays associated with the controllable device. The AR device overlays an image on a surface of the CED with virtual control objects, and virtual feedback image(s) are displayed on a surface of the CED or adjacent to the CED. The user views the control objects and virtual displays on the surface and/or area adjacent to the CED. These control objects are manipulated via voice, hand, head, and eye gestures recognized by the AR device, and the user is able to see real time feedback displayed on the virtual displays.2016-06-09
20160165171METHOD AND DEVICE FOR INSERTING A GRAPHICAL OVERLAY IN A VIDEO STREAM - The present invention relates generally to video communication systems, and more specifically to a method and device for determining a degree of animation of a graphical overlay for a video stream based on a degree of movement in the video stream.2016-06-09
20160165172HUMAN-COMPUTER INTERACTION METHOD AND CONTROLLED TERMINAL AND REMOTE-CONTROL DEVICE UTILIZING THE SAME - A control method for a controlled terminal and the controlled terminal and a remote-control device utilizing the same are provided. According to the control method for a controlled terminal, a time interval for the controlled terminal to respond to the input operations from a remote-control device is set according to a setting instruction, and the input operations from the remote-control device are processed according to the time interval. Using the above-described manner, the time interval for the controlled terminal to respond to input operations can be flexibly set to ensure that users can watch TV programs normally, and improve the user experience.2016-06-09
20160165173Video Preview During Trick Play - Methods and systems are described for displaying a thumbnail preview of video content. In an aspect, one or more mosaic images made up of thumbnails corresponding to frames of the video content at multiple time points can be loaded into the system or created by the methods described. In an aspect, the selected thumbnail, as well as any other thumbnails, can be selected in response to receiving a command (e.g., trick play request) from the viewer. The command can dictate the direction from the selected thumbnail that the next thumbnails will be selected. The command can also dictate the frequency with which thumbnails will be selected from the mosaic image. In an aspect, frames comprised of thumbnails can be encoded to create the video content.2016-06-09
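The selection behavior in 20160165173 can be pictured as index arithmetic over a mosaic whose tiles are thumbnails taken at a fixed interval: the trick-play command supplies a direction and a speed, which determine the next tile and therefore the video time it represents. The tile interval, mosaic geometry, and stepping rule below are assumptions.

```python
# Sketch: pick the next thumbnail tile from a mosaic image during trick play.
# Tiles are assumed laid out row-major, one per THUMB_INTERVAL_S of video.
THUMB_INTERVAL_S = 10.0
MOSAIC_COLS = 8

def next_thumbnail(current_index: int, total_tiles: int,
                   direction: int, speed: int) -> dict:
    """direction: +1 fast-forward, -1 rewind; speed: e.g. 1, 2, 4 -> step size."""
    idx = max(0, min(total_tiles - 1, current_index + direction * speed))
    return {
        "tile_index": idx,
        "tile_row": idx // MOSAIC_COLS,
        "tile_col": idx % MOSAIC_COLS,
        "video_time_s": idx * THUMB_INTERVAL_S,
    }

if __name__ == "__main__":
    print(next_thumbnail(current_index=12, total_tiles=64, direction=+1, speed=4))
    print(next_thumbnail(current_index=12, total_tiles=64, direction=-1, speed=2))
```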
20160165174SYSTEMS AND METHODS FOR AUTOMATICALLY CONTROLLING MEDIA ASSET PLAYBACK IN A VEHICLE - Systems and methods for automatically controlling playback of a media asset in a vehicle are provided. A presentation of a media asset to a user in the vehicle is generated. A motion state of the vehicle is determined. In response to determining that the motion state indicates that the vehicle is moving, the presentation of the media asset is paused. In response to determining that the motion state indicates that the vehicle is not moving, the presentation of the media asset is resumed.2016-06-09
20160165175SYSTEMS AND METHODS FOR RE-RECORDING CONTENT ASSOCIATED WITH RE-EMERGED POPULARITY - Systems and methods for re-recording content associated with popularity that re-emerged are provided. A plurality of media assets is recorded. Responsive to determining that popularity of a given one of the plurality of media assets fell below a first threshold, the given media asset is selected for deletion and the given media asset is added to a list of a plurality of media assets that have been selected for deletion. Popularity for the given media asset in the list is retrieved after selecting the given media asset for deletion. The retrieved popularity of the given media asset, which previously fell below the first threshold, is compared to a second threshold. In response to determining that the retrieved popularity of the given media asset, which previously fell below the first threshold, is now above the second threshold, an action relating to re-recording the given media asset is performed.2016-06-09
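The two-threshold behavior in 20160165175 reads like simple hysteresis: popularity falling below a first threshold marks the recording for deletion, and popularity later rising above a second threshold triggers a re-record action. The sketch captures only that control flow, with made-up thresholds and an in-memory dictionary standing in for the recording system.

```python
# Sketch of the two-threshold flow: assets whose popularity drops below
# DELETE_THRESHOLD are marked for deletion; if popularity later rises above
# RERECORD_THRESHOLD, a re-record action is triggered. Thresholds are assumed.
DELETE_THRESHOLD = 20
RERECORD_THRESHOLD = 60

def update_asset(asset: dict, popularity: int, actions: list) -> None:
    asset["popularity"] = popularity
    if not asset["marked_for_deletion"] and popularity < DELETE_THRESHOLD:
        asset["marked_for_deletion"] = True
        actions.append(("select_for_deletion", asset["title"]))
    elif asset["marked_for_deletion"] and popularity > RERECORD_THRESHOLD:
        asset["marked_for_deletion"] = False
        actions.append(("re_record", asset["title"]))

if __name__ == "__main__":
    actions = []
    asset = {"title": "Show A", "popularity": 50, "marked_for_deletion": False}
    for p in (15, 30, 75):            # popularity dips, then re-emerges
        update_asset(asset, p, actions)
    print(actions)  # [('select_for_deletion', 'Show A'), ('re_record', 'Show A')]
```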
20160165176News Production System with Display Controller - An example news production system includes a scheduling system, a composite display with multiple display panels, and a driver and a controller for the composite display. The controller receives data from the scheduling system. The controller then identifies a digital video effect to be run by the driver. The digital video effect involves the driver using an input video stream, at least in part, to generate output video streams for the display panels. In response, the controller prompts a user for an input, and then receives a timing signal based on a user input. In response to receiving the timing signal, the controller causes a video feed network to route the input video stream to the driver and causes the driver to run the digital video effect using the input video stream to thereby generate video streams to the composite display.2016-06-09
20160165176SUPER-RESOLUTION OF DYNAMIC SCENES USING SAMPLING RATE DIVERSITY - Super-resolution of dynamic scenes using sampling rate diversity, processes and systems thereof are provided. The method implementing the processes includes: in a first stage, super-resolving secondary low-resolution (LR) images using a set of primary LR images to create LR dictionaries to represent polyphase components (PPCs) of high resolution (HR) patches of images; and, in a second stage, reverting to a single frame super-resolution (SR) applied to each frame which comprises an entire image, using the HR dictionaries extracted from the super-resolved sequence obtained in the first stage.2016-06-09
20160165177SUPER-RESOLUTION OF DYNAMIC SCENES USING SAMPLING RATE DIVERSITY - Super-resolution of dynamic scenes using sampling rate diversity, processes and systems thereof are provided. The method implementing the processes includes: in a first stage, super-resolving secondary low-resolution (LR) images using a set of primary LR images to create LR dictionaries to represent polyphase components (PPCs) of high resolution (HR) patches of images; and, in a second stage, reverting to a single frame super-resolution (SR) applied to each frame which comprises an entire image, using the HR dictionaries extracted from the super-resolved sequence obtained in the first stage.2016-06-09
20160165179Method, Device, and System for Reducing Bandwidth Usage During a Communication Session - A communication system, method and communication terminal are configured to permit bandwidth reduction when communication partners are engaged in a communication session. In one embodiment, a communication terminal such as a cellular phone or tablet may turn off at least one of its display and camera sensor within a predetermined amount of time when the terminal is detected as being positioned near a user's ear via at least one sensor of the terminal. In some embodiments, the communication terminal may also transmit one or more messages to communication devices of communication partners engaged in the communication session to inform those devices that video transmissions should no longer be sent to the communication terminal. The communication terminal and other communication devices involved in the communication session may also stop transmitting video as a result of the detection of the communication terminal being positioned near a user's ear.2016-06-09
20160165180TRANSMISSION MANAGEMENT APPARATUS - A transmission management apparatus manages transmission states of a plurality of transmission terminals including a first transmission terminal and a second transmission terminal. The apparatus includes a terminal management table storage unit configured to store therein a terminal management table in which terminal information including an identifier and an identification name of each transmission terminal is managed; a receiving unit configured to receive a terminal information request signal from the first transmission terminal, the terminal information request signal indicating a request for information for identifying the second transmission terminal, the first and second transmission terminals being in transmission therebetween; a terminal state acquisition unit configured to acquire the information for identifying the second transmission terminal from the terminal management table in response to the terminal information request signal; and a transmitting unit configured to transmit the information acquired by the terminal state acquisition unit to the first transmission terminal.2016-06-09
20160165181DETERMINING ELECTRONIC MEDIA FORMAT WHEN TRANSFERRING A CUSTOMER BETWEEN SPECIALISTS OR AMONGST COMMUNICATION SOURCES AT A CUSTOMER SERVICE OUTLET - Systems, apparatus, and computer program products are provided for determining the media format for transferring a customer between specialists and/or from one communication source to another communication source within a customer service outlet, such as a banking center or the like. For example, while a two-way video conference system may be used for communication between a customer and a remote specialist, in certain instances in which the customer requires transfer to another specialist (e.g., a specialist having a different specialty) other media formats, such as one-way live video conference, live audio-only conference, text chat or the like may be implemented.2016-06-09
20160165182DETERMINING ELECTRONIC MEDIA FORMAT WHEN TRANSFERRING A CUSTOMER BETWEEN SPECIALISTS OR AMONGST COMMUNICATION SOURCES AT A CUSTOMER SERVICE OUTLET - Systems, apparatus, and computer program products are provided for determining the media format for transferring a customer between specialists and/or from one communication source to another communication source within a customer service outlet, such as a banking center or the like. For example, while a two-way video conference system may be used for communication between a customer and a remote specialist, in certain instances in which the customer requires transfer to another specialist (e.g., a specialist having a different specialty) other media formats, such as one-way live video conference, live audio-only conference, text chat or the like may be implemented.2016-06-09
20160165183SYSTEM AND METHOD TO ENABLE LAYERED VIDEO MESSAGING - A method includes determining, at a first computing device, first capabilities of a second computing device. The method includes setting, at the first computing device, a user configurable option based on a first capability of a second computing device, a second capability associated with a bandwidth of a network, or a combination thereof. The method includes generating, at the first computing device, multimedia content. The multimedia content includes a first layer and a second layer. The first layer includes first media content received from a first content source. A first bit rate of the first layer is determined based on the user configurable option. The method also includes sending the multimedia content to the second computing device.2016-06-09
20160165184PROVISION OF VIDEO CONFERENCE SERVICES - Embodiments are described for provision of video conference services. In some embodiments, a service connector module of a video conference system receives a request to switch a participant user of a video conference to a communication mode for the video conference received at a participant endpoint device, where the request to switch is (a) a request to switch from a unidirectional communication mode to a bidirectional communication mode, or (b) a request to switch from the bidirectional communication mode to the unidirectional communication mode; determines information on at least one distribution capability from one or more interactions with the participant endpoint device; determines a distribution scheme for the communication mode using the at least one distribution capability; sends a request to establish a connection with the endpoint device using the distribution scheme; receives a notification on establishment of the connection; and sends a request to terminate a prior connection.2016-06-09
20160165185INTERACTIVE VIDEO CONFERENCING - Technology for a local user equipment (UE) operable to perform video conferencing with a remote UE is disclosed. The local UE can receive a set of predefined region of interests (ROIs) from the remote UE. The local UE can select a predefined ROI from the set of predefined ROIs received from the remote UE. The local UE can communicate the predefined ROI to the remote UE that directs the remote UE to capture video within the predefined ROI and encode the video within the predefined ROI. The local UE can receive encoded video within the predefined ROI from the remote UE. The encoded video can include regions within the predefined ROI and excluding regions outside the predefined ROI. The local UE can provide the encoded video within the predefined ROI for rendering and display at the local UE.2016-06-09
20160165186NUI VIDEO CONFERENCE CONTROLS - A system and method providing gesture controlled video conferencing includes a local capture device detecting movements of a user in a local environment and an audio/visual display. A processor is coupled to the capture device and a remote capture device and a remote processor at a remote environment via a network. The local processor includes instructions to render a representation of the remote environment on the display responsive to the remote processor and remote capture device. The processor also tracks movements of a local user in a space proximate to the local capture device. Responsive to a user gesture detected at the local capture device, the audio or visual signals provided by the remote capture device are altered so that the representation of the remote location is altered locally.2016-06-09
20160165187SYSTEMS AND METHODS FOR AUTOMATED VISUAL SURVEILLANCE - Embodiments relate to systems, devices, and computer-implemented methods for performing automated visual surveillance by obtaining video camera coordinates determined using video data, video camera metadata, and/or digital elevation models, obtaining a surveillance rule associated with rule coordinates, identifying a video camera that is associated with video camera coordinates that include at least part of the rule coordinates, and transmitting the surveillance rule to a computing device associated with the video camera. The rule coordinates can be automatically determined based on received coordinates of an object. Additionally, the surveillance rule can be generated based on instructions from a user in a natural language syntax.2016-06-09
20160165188VEHICLE VISION SYSTEM WITH ENHANCED FUNCTIONALITY - A vision system for a vehicle includes at least one camera disposed at a vehicle so as to have a field of view exterior of the vehicle. Control circuitry includes an image processor for processing image data captured by the camera. The image processor processes image data captured by the camera for a pedestrian detection system of the vehicle. The control circuitry, responsive at least in part to image processing of captured image data by the image processor, is operable to cause adjustment of the pedestrian detection system of the vehicle from regular sensitivity to higher sensitivity. Adjustment from regular sensitivity to higher sensitivity is at least in part responsive to image processing of captured image data by the image processor determining that the vehicle is in the vicinity of children that a driver of the vehicle should watch out for.2016-06-09
20160165189Virtual Home Safety Assessment Framework - Disclosed herein is a framework for facilitating virtual safety assessment. In accordance with one aspect, the framework receives image data of an environment to be assessed for safety from an agent support system. The framework sends such image data to an expert support system and receives safety assessment information from the expert support system determined based on the image data. The framework then provides a report based at least in part on such safety assessment information.2016-06-09
20160165190METHOD AND SYSTEM FOR MONITORING LOGISTICS FACILITIES - The invention relates to a method for monitoring processes and/or operating states in logistics facilities, in particular in roofed logistics facilities, comprising: providing a system which comprises at least one unmanned aerial vehicle (2016-06-09
20160165191TIME-OF-APPROACH RULE - A method for predicting when an object will arrive at a boundary includes receiving visual media captured by a camera. An object in the visual media is identified. One or more parameters related to the object are detected based on analysis of the visual media. It is predicted when the object will arrive at a boundary using the one or more parameters. An alert is transmitted to a user indicating when the object is predicted to arrive at the boundary.2016-06-09
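The prediction step in 20160165191 can be illustrated with a constant-velocity model: given two observed positions of the tracked object and their timestamps, extrapolate the time at which it reaches the boundary. The straight-line boundary and constant-velocity assumption are simplifications of the rule described in the abstract.

```python
# Sketch: predict when a tracked object reaches a boundary, assuming constant
# velocity between two observations. The boundary is modeled as the vertical
# line x = BOUNDARY_X; the abstract's rule covers more general cases.
BOUNDARY_X = 100.0

def predict_arrival(t0: float, x0: float, t1: float, x1: float) -> float | None:
    """Return the predicted arrival time at BOUNDARY_X, or None if receding."""
    vx = (x1 - x0) / (t1 - t0)
    if vx == 0 or (BOUNDARY_X - x1) / vx < 0:
        return None                     # not moving toward the boundary
    return t1 + (BOUNDARY_X - x1) / vx

if __name__ == "__main__":
    eta = predict_arrival(t0=0.0, x0=40.0, t1=2.0, x1=52.0)   # 6 units/s toward x=100
    print(f"alert: object predicted at boundary at t={eta:.1f}s")   # t=10.0s
```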
20160165192METHODS, SYSTEMS, AND APPARATUSES FOR CAPTURING AND ARCHIVING FIREARM EVENT TELEMETRY AND ENVIRONMENTAL CONDITIONS - In accordance with embodiments disclosed herein, there are provided mechanisms, methods, systems, and apparatuses for capturing and archiving firearm event telemetry and environmental conditions. According to a particularly described embodiment, there is, for example, a method executing within a device having at least a processor and a memory therein and being physically coupled to a weapon, wherein the method includes operating a proximity sensor at the weapon, the proximity sensor to trigger an activation event upon removal of the weapon from a weapon holster; detecting the activation event and triggering event archiving; archiving event data by storing event data to the memory of the device; and uploading the event data to a remote storage separate and distinct from the device. The weapon may be a lethal or a non-lethal weapon. In a related embodiment, there is a device capable of mounting to a weapon, the device including at least a proximity sensor to trigger an event upon change in status of the weapon from a holster or to a holster; an audio capture device; a video capture device; a solid state memory to store captured audio, video, and event telemetry data; and a wireless communications interface from the device to a remote location having storage capability distinct from the device. Other related embodiments are described.2016-06-09
20160165193SYSTEMS AND METHODS FOR VIDEO ANALYSIS RULES BASED ON MAP DATA - Systems, methods and computer-readable media for creating and using video analysis rules that are based on map data are disclosed. A sensor(s), such as a video camera, can track and monitor a geographic location, such as a road, pipeline, or other location or installation. A video analytics engine can receive video streams from the sensor, and identify a location of the imaged view in a geo-registered map space, such as a latitude-longitude defined map space. A user can operate a graphical user interface to draw, enter, select, and/or otherwise input on a map a set of rules for detection of events in the monitored scene, such as tripwires and areas of interest. When tripwires, areas of interest, and/or other features are approached or crossed, the engine can perform responsive actions, such as generating an alert and sending it to a user.2016-06-09
20160165194LIGHTING DEVICE HAVING PHOSPHOR WHEEL AND EXCITATION RADIATION SOURCE - Various embodiments may relate to a lighting device, including an excitation radiation source and a two-sided luminescent-material wheel. At least one luminescent material is provided on each of the two sides of the two-sided luminescent-material wheel, thus both on the front side and on the opposite back side. The luminescent materials on both sides are excited sequentially in time. For this purpose, at least one transmissive region is provided in the rotating luminescent-material wheel, through which transmissive region the excitation radiation can radiate, which excitation radiation can be deflected onto the luminescent material on the back side by means of an optical unit.2016-06-09
20160165195IMAGE PROJECTOR, IMAGE PROJECTION METHOD, AND RECORDING MEDIUM - An image projector includes a light source to emit light, an image forming device to form an image based on a reflected light of the light emitted from the light source, a color wheel disposed between the light source and the image forming device so as to pass the light emitted from the light source to the image forming device through an optical path, the color wheel including a plurality of color filters, at least one lens to project the image formed on the image forming device to a projection surface as a projection image, and circuitry to cause the color wheel to be moved away from the optical path for the light emitted from the light source, when a brightness of the projection image is to be increased.2016-06-09
20160165196SYSTEMS AND METHODS FOR AN IMMERSION THEATER ENVIRONMENT WITH DYNAMIC SCREENS - Provided herein are systems and methods to configure an immersive theater environment by moving one or more screens. A movie screen can be provided with one or more actuators coupled to the movie screen, wherein the one or more actuators are configured to move the movie screen in a plurality of directions including, for example, up, down, left, right, back, forward, pitch, roll, and yaw. This system can include a processor coupled to the actuator, wherein the processor is configured to control the movement of the one or more actuators, synchronize cinema content for movement of the movie screen, process cinema content for metadata cues indicating movement of the movie screen, and track positions of the movie screen.2016-06-09
20160165197PROJECTION PROCESSOR AND ASSOCIATED METHOD - A projection processor includes a receiving circuit and an image processing circuit. The receiving circuit receives an input image. The image processing circuit performs at least one predetermined image processing operation upon the input image to generate an output image, wherein a projection source is generated according to the output image. The projection source is displayed or projected by a projection source component of an electronic device, such that a first cover of a projection display component partially reflects the projection source.2016-06-09
20160165198IMAGE DISPLAYING SYSTEM, CONTROLLING METHOD OF IMAGE DISPLAYING SYSTEM, AND STORAGE MEDIUM - It aims to make synthesized luminance uniform in an image overlap region of a projection image by a simple process, by means of a mechanism of: performing display control to project a black image as the projection image to be projected from a first projection-type image displaying apparatus to a screen, and project, as the projection image to be projected from a second projection-type image displaying apparatus to the screen, an image in which dimming correction to the overlap region of the projection images respectively projected by the first and second projection-type image displaying apparatuses has been enabled; measuring a luminance characteristic of the second projection-type image displaying apparatus based on a photographed image obtained by photographing the overlap region; and setting a dimming correction characteristic for the overlap region by the first projection-type image displaying apparatus, based on the measured luminance characteristic.2016-06-09
20160165199Hybrid Image Decomposition and Projection - Hybrid image projection systems and methods can superimpose image components of an input image. An input image can be divided into smaller regions and at least one parameter of each region can be determined. The input image can be decomposed based on the parameter of each region into multiple, less correlated, orthogonal or quasi-orthogonal image components. Each projector can display respective image components so that the images projected may be optically superimposed on a screen. Superposing orthogonal or quasi-orthogonal image components can make the superposition of images in existing multi-projector systems less sensitive to inter-projector image misalignment. Superimposing orthogonal or quasi-orthogonal images can be used to avoid visible image degradation and to provide more robust image quality in a multiple-projector system implementation.2016-06-09
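The abstract does not give the decomposition rule, so the following is only a minimal sketch of the general idea: split each image block between two projectors so that the two components are only weakly correlated while their sum still reproduces the block. The block size and the checkerboard weighting heuristic are assumptions for illustration.

```python
import numpy as np

def decompose_hybrid(image, block=16, low=0.2, high=0.8):
    """Split `image` into two components that sum back to the original but have
    reduced correlation: blocks are assigned alternately (checkerboard) a high
    or a low weight, so each projector dominates in disjoint sets of blocks."""
    h, w = image.shape
    weights = np.empty_like(image, dtype=float)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # Per-block parameter: here simply the block's parity on the grid.
            dominant = ((by // block) + (bx // block)) % 2 == 0
            weights[by:by+block, bx:bx+block] = high if dominant else low
    comp_a = image * weights
    comp_b = image * (1.0 - weights)
    return comp_a, comp_b

if __name__ == "__main__":
    img = np.random.rand(64, 64)
    a, b = decompose_hybrid(img)
    print(np.allclose(a + b, img))                    # components superimpose to the input
    print(np.corrcoef(a.ravel(), b.ravel())[0, 1])    # correlation between the two components
```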
20160165200MEASUREMENTS OF CINEMATOGRAPHIC PROJECTION - A method for determining the operation of an image projector for projecting images onto a screen of a projection room. In particular, the method is implemented by computing means and comprises the following steps: driving the projector so as to project onto the screen a test card comprising a distribution of patterns of different hues, acquiring an image of the test card on the screen by a picture taking apparatus, and applying a processing of the image acquired, so as to determine at least one deviation of chrominance of the image acquired with respect to a predefined number of colors.2016-06-09
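The method above reduces to comparing the hues of photographed test-card patches against their reference values. The sketch below is one hedged way to express such a per-patch chrominance deviation, using a BT.601 RGB-to-CbCr conversion as the chrominance space; the conversion choice and the patch representation are assumptions, not taken from the patent.

```python
import numpy as np

# ITU-R BT.601 RGB -> (Cb, Cr), used here only to separate chrominance from luminance.
def rgb_to_cbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr = 0.500 * r - 0.419 * g - 0.081 * b
    return np.stack([cb, cr], axis=-1)

def chrominance_deviation(captured_patches, reference_patches):
    """Per-patch Euclidean deviation in the (Cb, Cr) plane between the patch
    colors measured from the photographed test card and the reference hues."""
    diff = rgb_to_cbcr(captured_patches) - rgb_to_cbcr(reference_patches)
    return np.linalg.norm(diff, axis=-1)

if __name__ == "__main__":
    reference = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
    captured = reference * 0.95 + 0.02   # simulated mild projector colour drift
    print(chrominance_deviation(captured, reference))
```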
20160165201COLOR SIGNAL PROCESSING DEVICE AND COLOR SIGNAL PROCESSING METHOD - A color signal processing device can correct color phase distortion for each luminance value by using a small-capacity memory and a circuit configuration having a small chip area and low power consumption. The present invention relates to a color signal processing device including a matrix coefficient storage section that stores a plurality of first matrix coefficients corresponding to a plurality of evaluation values set in advance and a predetermined offset value, an evaluation value calculation section that calculates a predetermined evaluation value based on each of a plurality of color component signals contained in a first color signal, a matrix coefficient interpolation operation section that generates a second matrix coefficient based on the predetermined evaluation value calculated by the evaluation value calculation section and the plurality of first matrix coefficients, a matrix operation section that performs an operation based on each of the color component signals in the first color signal and the second matrix coefficient to generate a second color signal containing a plurality of color component signals, and an offset operation section that performs an operation based on the predetermined offset value and the second color signal.2016-06-09
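To make the interpolation step concrete, here is a small hedged sketch of how a correction matrix could be interpolated from stored matrices according to a per-pixel evaluation value (luminance is used here) and then applied with an offset. The luminance weights, the stored matrices, and the linear interpolation between the two nearest evaluation points are illustrative assumptions.

```python
import numpy as np

def interpolate_matrix(eval_value, eval_points, matrices):
    """Linearly interpolate a 3x3 correction matrix between the two stored
    matrices whose evaluation values bracket `eval_value`."""
    idx = int(np.clip(np.searchsorted(eval_points, eval_value), 1, len(eval_points) - 1))
    lo, hi = eval_points[idx - 1], eval_points[idx]
    t = (eval_value - lo) / (hi - lo)
    return (1.0 - t) * matrices[idx - 1] + t * matrices[idx]

def correct_color(rgb, eval_points, matrices, offset):
    """The evaluation value (here: luminance) selects the interpolated matrix,
    which is applied to the input color; the stored offset is added last."""
    luminance = float(np.dot([0.299, 0.587, 0.114], rgb))
    m = interpolate_matrix(luminance, eval_points, matrices)
    return m @ rgb + offset

if __name__ == "__main__":
    eval_points = np.array([0.0, 0.5, 1.0])
    matrices = np.stack([np.eye(3), 1.1 * np.eye(3), 1.2 * np.eye(3)])
    print(correct_color(np.array([0.4, 0.5, 0.3]), eval_points, matrices,
                        offset=np.array([0.01, 0.0, -0.01])))
```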
20160165202METHOD AND APPARATUS FOR CONTROLLING WHITE BALANCE - A method comprising: capturing, by an electronic device, a first incident light sample and generating an image based on the first incident light sample; capturing, by the electronic device, a second incident light sample and identifying a light source type associated with the second incident light sample; and adjusting a white balance of the image according to the light source type.2016-06-09
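The abstract leaves the classification of the second light sample unspecified; a minimal sketch of one possible flow is shown below, where the light source type is guessed from the sample's channel ratios and a per-illuminant gain table is applied to the image. The illuminant names and gain values are invented for illustration and would need calibration in a real device.

```python
import numpy as np

# Assumed per-illuminant channel gains; a real device would calibrate these.
ILLUMINANT_GAINS = {
    "daylight":     np.array([1.00, 1.00, 1.00]),
    "incandescent": np.array([0.70, 1.00, 1.55]),
    "fluorescent":  np.array([0.85, 1.00, 1.20]),
}

def identify_light_source(sample_rgb):
    """Pick the illuminant whose expected colour cast best explains the second
    incident-light sample (mean R/G and B/G ratios)."""
    r_g = sample_rgb[..., 0].mean() / sample_rgb[..., 1].mean()
    b_g = sample_rgb[..., 2].mean() / sample_rgb[..., 1].mean()
    best, best_err = None, float("inf")
    for name, gains in ILLUMINANT_GAINS.items():
        # A cast is the inverse of the gains that would correct it.
        err = (r_g - 1.0 / gains[0]) ** 2 + (b_g - 1.0 / gains[2]) ** 2
        if err < best_err:
            best, best_err = name, err
    return best

def adjust_white_balance(image, light_source):
    # Apply the gains associated with the identified light source type.
    return np.clip(image * ILLUMINANT_GAINS[light_source], 0.0, 1.0)

if __name__ == "__main__":
    ambient = np.random.rand(8, 8, 3) * np.array([1.4, 1.0, 0.65])  # warm cast
    source = identify_light_source(ambient)
    image = np.random.rand(4, 4, 3)
    print(source, adjust_white_balance(image, source).shape)
```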
20160165203Method and System for Delivery of Content Over Communication Networks - Computer-implemented systems and methods for determining second content which was not selected by the user but which is related to first content which was selected by the user. For example, a system and method can be configured to receive, using one or more processors, first and second content where the first and second content are rich media (e.g., content containing audio or video elements). Systems and methods may be further configured where the relationship between the first content and second content are determined based on data or metadata of the first content (e.g., the title of the content or an episode number). Systems and methods may be further configured where the relationship between the first content and second content are determined based on scheduling data of the first content (e.g., the time and channel when the first content is transmitted).2016-06-09
20160165204METHOD FOR ENCODING AND METHOD FOR DECODING A COLOR TRANSFORM AND CORRESPONDING DEVICES - The present invention relates to a method for encoding a color transform. The method includes encoding first parameters representative of video signal characteristics of color output decoded pictures remapped by at least one color transform, and encoding second parameters representative of the at least one color transform.2016-06-09
20160165205HOLOGRAPHIC DISPLAYING METHOD AND DEVICE BASED ON HUMAN EYES TRACKING - A holographic displaying method and device based on human eyes tracking are disclosed. The holographic displaying method includes the following steps of: tracking human eyes of a viewer in real time and acquiring an image of the human eyes; determining whether coordinates of both eyes can be determined according to the tracked image of the human eyes; and decreasing a transforming depth of field of the displayed 3D image when the coordinates of both eyes cannot be determined according to the tracked image of the human eyes. In the aforesaid way, the present disclosure allows users to view clear 3D images even if the camera cannot track positions of the human eyes clearly.2016-06-09
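The control logic described above is essentially a fallback: when both-eye coordinates cannot be recovered, reduce the displayed depth of field. The sketch below illustrates that decision in isolation; the depth values and the placeholder eye-locator are assumptions, since the abstract does not specify them.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class HoloDisplayState:
    depth_of_field: float = 1.0          # full transform depth when tracking works
    reduced_depth_of_field: float = 0.3  # assumed fallback value

def locate_both_eyes(tracked) -> Optional[Tuple[Tuple[int, int], Tuple[int, int]]]:
    """Stand-in for the eye tracker: returns (left_xy, right_xy) when both eyes
    are found in the tracked image, or None when their coordinates cannot be
    determined. In this sketch the caller passes the result directly."""
    return tracked

def update_depth(state: HoloDisplayState, tracked):
    coords = locate_both_eyes(tracked)
    if coords is None:
        # Tracking lost: decrease the transforming depth of field so the
        # displayed 3D image stays legible without accurate eye positions.
        return state.reduced_depth_of_field
    return state.depth_of_field

if __name__ == "__main__":
    state = HoloDisplayState()
    print(update_depth(state, ((120, 200), (180, 200))))  # eyes found -> 1.0
    print(update_depth(state, None))                      # eyes lost  -> 0.3
```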
20160165206DIGITAL REFOCUSING METHOD - A digital refocusing method includes: a plurality of images corresponding to multiple views of a scene is obtained, the images including a central view image and at least one non-central view image; a pixel shift or a pixel index shift is performed on the non-central view image; a line scan along a pre-determined linear path is performed on the central view image and the non-central view images to obtain corresponding pixels of the central view image and corresponding pixels of the non-central view images; view interpolation based on the disparities defined in a disparity map is performed, in which target pixels corresponding to a novel view image are obtained from the corresponding pixels of the central view image and the corresponding pixels of the non-central view images according to a target disparity; and a refocused novel view image is obtained by averaging and compositing the target pixels of the novel views.2016-06-09
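The core of such refocusing is a shift-and-average over the views: each non-central view is shifted in proportion to its view offset and the target disparity, and the shifted views are composited. The sketch below shows only that step, on a synthetic 3x3 light field; the dictionary layout, the 2-pixel-per-view baseline, and the plain averaging are assumptions made for illustration.

```python
import numpy as np

def refocus(views, target_disparity):
    """Shift-and-average refocusing: each non-central view is shifted by its
    view offset times the target disparity, then all views are composited by
    averaging. `views` is a dict {(dy, dx): 2-D image}; (0, 0) is the central
    view and is never shifted."""
    acc = np.zeros_like(next(iter(views.values())), dtype=float)
    for (dy, dx), img in views.items():
        shift_y = int(round(dy * target_disparity))
        shift_x = int(round(dx * target_disparity))
        acc += np.roll(np.roll(img, shift_y, axis=0), shift_x, axis=1)
    return acc / len(views)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.random((32, 32))
    # Synthetic 3x3 light field: each view is the scene shifted by 2 px per unit offset.
    views = {(dy, dx): np.roll(np.roll(base, -2 * dy, axis=0), -2 * dx, axis=1)
             for dy in (-1, 0, 1) for dx in (-1, 0, 1)}
    refocused = refocus(views, target_disparity=2.0)
    print(np.abs(refocused - base).mean())   # ~0: views re-align at disparity 2
```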
20160165207ELECTRONIC DEVICE, METHOD, AND COMPUTER PROGRAM PRODUCT - In general, according to one embodiment, an electronic device includes a hardware processor. The hardware processor is configured to output a user interface for designating disparity sharpness related to a difference in sharpness at a border between an object and a background of the object, the difference resulting from a difference in the depth-direction distances of the object and the background, to set the sharpness at the border between the background and the object based on the disparity sharpness designated via the user interface, and to generate one multiscopic image from parallax images.2016-06-09
20160165208SYSTEMS FOR PROVIDING IMAGE OR VIDEO TO BE DISPLAYED BY PROJECTIVE DISPLAY SYSTEM AND FOR DISPLAYING OR PROJECTING IMAGE OR VIDEO BY A PROJECTIVE DISPLAY SYSTEM - A system for providing image or video to be displayed by a projective display system includes: an encoding subsystem and a packing subsystem. The encoding subsystem is configured to encode at least one image or video of a subject to generate encoded image data. The packing subsystem is coupled to the encoding subsystem, and configured to pack the encoded image data with projection configuration information regarding the projective display system to generate packed image data. The projective display system comprises a projection source device and a projection surface, the projection source device projects the image or video to the projection surface according to the packed image data.2016-06-09
20160165209Method of Sub-PU Syntax Signaling and Illumination Compensation for 3D and Multi-view Video Coding - A method of sub-PU (prediction unit) syntax element signaling for a three-dimensional or multi-view video coding system is disclosed. A first syntax element associated with a texture sub-PU size is transmitted only for texture video data and a second syntax element associated with a depth sub-PU size is transmitted only for depth video data. The first syntax element associated with the texture sub-PU size is used to derive an IVMP (inter-view motion prediction) prediction candidate used for a texture block. The second syntax element associated with the depth sub-PU size is used to derive an MPI (motion parameter inheritance) prediction candidate for a depth block.2016-06-09
20160165210METHOD AND APPARATUS FOR ENCODING THREE-DIMENSIONAL CONTENT - Disclosed is a method of encoding three-dimensional (3D) content. The method of encoding 3D content according to an embodiment may include setting a dependency between texture information and depth information of the 3D content, and generating a bitstream comprising the dependency.2016-06-09
20160165211AUTOMOTIVE IMAGING SYSTEM - Various implementations include an automotive imaging system that includes at least three cameras disposed on a vehicle and an electronic control unit (ECU) in electronic communication with the cameras. The three cameras have overlapping fields of view, and a processor of the ECU may be configured for generating at least three stereoscopic images from images captured by each pair of cameras and blending these stereoscopic images into one high quality panoramic image. These images provide a wider field of coverage and improved images, which improves the ability of the safety and advanced driver assistance systems of the vehicle to detect and identify potential collision hazards and conduct situational analyses of the vehicle according to certain implementations.2016-06-09
20160165212System and Methods for Calibration of an Array Camera - Systems and methods for calibrating an array camera are disclosed. Systems and methods for calibrating an array camera in accordance with embodiments of this invention include the capturing of an image of a test pattern with the array camera such that each imaging component in the array camera captures an image of the test pattern. The image of the test pattern captured by a reference imaging component is then used to derive calibration information for the reference component. A corrected image of the test pattern for the reference component is then generated from the calibration information and the image of the test pattern captured by the reference imaging component. The corrected image is then used with the images captured by each of the associate imaging components associated with the reference component to generate calibration information for the associate imaging components.2016-06-09
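The abstract describes geometric calibration of associate cameras against a corrected reference image of the test pattern, but not how the alignment is measured. As a hedged, much-simplified stand-in, the sketch below estimates only an integer translation between an associate camera's test-pattern shot and the corrected reference shot via FFT cross-correlation; a real calibration would recover far richer parameters (distortion, rotation, intrinsics).

```python
import numpy as np

def translation_offset(reference, image):
    """Estimate the integer (dy, dx) shift by which `image` is displaced
    relative to `reference`, using FFT-based cross-correlation of the two
    test-pattern captures."""
    corr = np.fft.ifft2(np.conj(np.fft.fft2(reference)) * np.fft.fft2(image)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = reference.shape
    # Map wrap-around peaks to signed shifts.
    dy = int(dy) if dy <= h // 2 else int(dy) - h
    dx = int(dx) if dx <= w // 2 else int(dx) - w
    return dy, dx

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pattern = rng.random((64, 64))               # stand-in for the captured test pattern
    corrected_reference = pattern                # assume the reference image is already corrected
    associate = np.roll(pattern, (3, -5), axis=(0, 1))  # associate camera sees a shifted pattern
    print(translation_offset(corrected_reference, associate))  # -> (3, -5)
```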
20160165213LAYERED TYPE COLOR-DEPTH SENSOR AND THREE-DIMENSIONAL IMAGE ACQUISITION APPARATUS EMPLOYING THE SAME - Provided are a color-depth sensor and a three-dimensional image acquisition apparatus including the same. The color-depth sensor includes a color sensor that senses visible light and an infrared sensor that is stacked on the color sensor and senses infrared light. The 3D image acquisition apparatus includes: an imaging lens unit; a color-depth sensor that simultaneously senses color image information and depth image information about an object from light reflected by the object and transmitted through the imaging lens unit; and a 3D image processor that generates 3D image information by using the color image information and the depth image information sensed by the color-depth sensor.2016-06-09
20160165214IMAGE PROCESSING APPARATUS AND MOBILE CAMERA INCLUDING THE SAME - Disclosed is an image processing apparatus which includes a light projection unit for projecting infrared light having a predetermined pattern onto an object, an image acquisition unit for absorbing light having a visible-light band and transmitting light having an infrared wavelength band to acquire an image having a target pattern projected onto the object, and an image processing unit for obtaining information on 3D distance of the object using the light acquired by the image acquisition unit.2016-06-09
20160165215SYSTEM AND METHOD FOR GENERALIZED VIEW MORPHING OVER A MULTI-CAMERA MESH - An apparatus is configured to perform a method for generalized view morphing. The method includes determining a camera plane based on a predetermined view point of a virtual camera associated with a desired virtual image, the camera plane comprising at least three real cameras; pre-warping at least three image planes such that all of the image planes are parallel to the camera plane, each image plane associated with one of the real cameras positioned in the camera plane; determining a virtual image plane by performing a linear interpolation morphing on the at least three image planes; and post-warping the virtual image plane to a predetermined pose.2016-06-09
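One way to read the linear-interpolation step is as a barycentric combination of the three pre-warped, mutually parallel image planes, weighted by the virtual camera's position inside the triangle of real cameras. The sketch below shows only that blend (intensities only, no per-pixel position morphing, and no post-warp), under those simplifying assumptions.

```python
import numpy as np

def barycentric_weights(cameras, virtual_point):
    """Barycentric coordinates of the virtual viewpoint with respect to the
    triangle formed by the three real camera centres in the camera plane."""
    a, b, c = (np.asarray(p, dtype=float) for p in cameras)
    p = np.asarray(virtual_point, dtype=float)
    m = np.column_stack([b - a, c - a])
    beta, gamma = np.linalg.solve(m, p - a)
    return np.array([1.0 - beta - gamma, beta, gamma])

def morph_views(prewarped_images, cameras, virtual_point):
    """Linear-interpolation morph: blend the three pre-warped (mutually
    parallel) image planes with the barycentric weights of the virtual camera."""
    w = barycentric_weights(cameras, virtual_point)
    return sum(wi * img for wi, img in zip(w, prewarped_images))

if __name__ == "__main__":
    cams = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
    imgs = [np.full((4, 4), v) for v in (0.0, 3.0, 6.0)]
    # Virtual camera at the triangle centroid -> equal blend of the three views.
    print(morph_views(imgs, cams, (1 / 3, 1 / 3))[0, 0])   # 3.0
```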
20160165216DISPARITY SEARCH RANGE DETERMINATION FOR IMAGES FROM AN IMAGE SENSOR ARRAY - A range is determined for a disparity search for images from an image sensor array. In one example, a method includes receiving a reference image and a second image of a scene from multiple cameras of a camera array, detecting feature points of the reference image, matching points of the detected features to points of the second image, determining a maximum disparity between the reference image and the second image, and determining disparities between the reference image and the second image by comparing points of the reference image to points of the second image wherein the points of the second image are limited to points within the maximum disparity.2016-06-09
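The idea above is to bound the expensive dense search by the largest disparity observed among sparse feature matches. The sketch below illustrates that two-stage flow with simple SSD patch matching along scanlines; the feature points are hand-picked stand-ins for a real detector, and the patch size is an assumption.

```python
import numpy as np

def match_point(ref, img, y, x, patch=3, max_search=None):
    """Find the horizontal disparity of (y, x) by SSD patch matching along the
    same scanline of the second image, optionally limited to `max_search`."""
    r = patch // 2
    template = ref[y - r:y + r + 1, x - r:x + r + 1]
    limit = max_search if max_search is not None else x - r
    best_d, best_cost = 0, np.inf
    for d in range(0, min(limit, x - r) + 1):   # candidate shifts to the left in the second image
        cand = img[y - r:y + r + 1, x - d - r:x - d + r + 1]
        cost = np.sum((template - cand) ** 2)
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

def max_disparity_from_features(ref, img, feature_points):
    """Match sparse feature points over the full scanline and return the
    largest disparity found; the dense search is then limited to this value."""
    return max(match_point(ref, img, y, x) for y, x in feature_points)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    ref = rng.random((32, 64))
    img = np.roll(ref, -7, axis=1)              # second camera sees a 7-px shift
    feats = [(10, 40), (20, 50), (15, 30)]      # stand-in for detected feature points
    d_max = max_disparity_from_features(ref, img, feats)
    print(d_max)                                # 7: the maximum disparity for the dense pass
    # Dense pass: per-pixel matching now limited to disparities <= d_max.
    print(match_point(ref, img, 16, 45, max_search=d_max))
```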
20160165217SYSTEM AND METHOD FOR MEASURING VIEWING ZONE CHARACTERISTICS OF AUTOSTEREOSCOPIC 3D IMAGE DISPLAY - Disclosed are a system and method for measuring viewing zone characteristics of an autostereoscopic three-dimensional (3D) image display device. The system for measuring viewing zone characteristics of the autostereoscopic 3D image display device includes at least one image sensor that is provided on a front side of the image display device, and measures characteristics of luminance distribution of viewpoint images in a depth direction (Z-direction) formed from at least two local areas which are designated in advance in a horizontal direction (X-direction) of the image display device, and a determination unit that determines, as an optimum viewing distance (OVD), a position of the image sensor corresponding to a depth direction (Z-direction) having a horizontal direction (X-direction) minimum deviation of a center position of luminance distribution of light generated from the same viewpoint image of each of the at least two local areas by analyzing the characteristics of luminance distribution on an X-Z plane measured from the image sensor.2016-06-09
20160165218HEAD MOUNTED DISPLAY DEVICE - A head mounted display device includes a first flexible display to display a left-eye image, a second flexible display to display a right-eye image, a first optical system between the first flexible display and a left eye of a user, a second optical system between the second flexible display and a right eye of the user, and a housing including the first flexible display, the second flexible display, the first optical system, and the second optical system.2016-06-09
20160165219IMAGE DISPLAY DEVICE COMPRISING CONTROL CIRCUIT - An image display device according to an aspect of the present disclosure includes: a display including light-emitting elements arrayed two-dimensionally, and having regions, in each of which a part of the light-emitting elements is located; a lens array including lenses, each of the lenses being disposed correspondingly to one of the regions, the lens array forming real images or virtual images of images displayed at each of the regions; and a control circuit that, in operation, controls each of the light-emitting elements, the control circuit being electrically connected to the display, and, in operation, causing a first part of the light-emitting elements to emit light when the control circuit causes a second part of the light-emitting elements different from the first part of the light-emitting elements not to emit light.2016-06-09
20160165220DISPLAY APPARATUS AND METHOD OF CONTROLLING DISPLAY APPARATUS - A display apparatus includes a display unit, a plurality of sensors, a first control unit that controls the display apparatus, and a second control unit that is connected to the plurality of sensors and transmits data including detection results of the plurality of sensors to the first control unit.2016-06-09
20160165221Simulated 3D Projection Apparatus - A display apparatus for displaying simulated 3D images is provided which is preferably portable and scalable in size, and comprises a front display device having side edges, arranged to project a first image of a first program material towards a viewer, and a background display device having side edges, arranged to project a second image of a second program material towards a viewer, wherein said front display device and said background display device are separated to provide an apparent parallax effect between said first image and said second image, and wherein said background display device extends beyond the side edges of the front projection device. The display device provides a parallax effect which can extend to the edges of the front display device.2016-06-09
20160165222MEDICAL STEREOSCOPIC OBSERVATION APPARATUS, MEDICAL STEREOSCOPIC OBSERVATION METHOD, AND PROGRAM - There is provided a medical stereoscopic observation apparatus, including an acquiring unit configured to acquire a status signal according to a status of image processing from each of image processing sections of a plurality of systems that perform the image processing on input image data through a selected image processing unit among a plurality of image processing units and generate output image data to be output as a right eye image or a left eye image, and a control unit configured to cause a second image processing section different from a first image processing section to switch the selected image processing unit according to the status signal acquired from the first image processing section among the image processing sections of the plurality of systems.2016-06-09
20160165223LIGHT-RESTRICTED PROJECTION UNITS AND THREE-DIMENSIONAL DISPLAY SYSTEMS USING THE SAME - The invention features a multi-view display system based on planar- or circular-aligned light-restricted projection units. A light-restricted projection unit is constituted by a display panel, a directional imaging structure transmitting light from the display panel along a specific direction, and baffles encasing the display panel/directional imaging structure pair for light blocking. During operation, all the light-restricted projection units project images to a common display zone. Each light-restricted projection unit generates two kinds of zones: a VZ, through which light rays from all pixels of the display panel pass, and a PVZ, through which light rays from only some pixels of the display panel pass. With the light-restricted projection units aligned closely, PVZs from different light-restricted projection units completely or partially overlap into a fusing zone (FZ). Through each point in the FZ pass light rays from pixels belonging to segments of different display panels. The spatial proportions that these segments occupy within their display panels change with the point location, so the view presented to a pupil changes as it moves from one VZ (or the image area of one VZ) to an adjacent VZ (or the image area of the adjacent VZ). These continuously changing views implement continuous motion parallax. Furthermore, by introducing sequentially gated apertures, the display system can present perspective views corresponding to more viewpoints, relying on persistence of vision for better three-dimensional effects.2016-06-09
20160165224PROJECTOR - A projector includes a lamp that emits projection light to project a 3D picture, in which a right eye image and a left eye image are represented in a time-division manner, onto an object; a synchronization signal transmission section which transmits a shutter synchronization signal to glasses having a right eye shutter and a left eye shutter to control the opened or closed state of the right eye shutter and the left eye shutter, based on a signal indicating the displaying period of the right eye image and the left eye image of the 3D picture; and a lamp drive section that supplies to the lamp, based on the signal, an AC current having a peak overlapping with the period when the right eye shutter of the glasses is in the opened state and a peak overlapping with the period when the left eye shutter of the glasses is in the opened state.2016-06-09
20160165225VIDEO STREAMING AND VIDEO TELEPHONY UPLINK PERFORMANCE ANALYSIS SYSTEM - The technology disclosed relates to scoring user experience of video frames displayed on a mobile or other video display device. In particular, it relates to capture alignment and test stimulus isolation techniques that compensate for artifacts in the capture mechanism. The technology disclosed includes methods and systems for analyzing both downlink and uplink quality for mobile or other video display device cameras capturing and transmitting video frames including teleconference video display. Particular aspects of the technology disclosed are described in the claims, specification and drawings.2016-06-09
20160165226VIDEO STREAMING AND VIDEO TELEPHONY DOWNLINK PERFORMANCE ANALYSIS SYSTEM - The technology disclosed relates to scoring user experience of video frames displayed on a mobile or other video display device. In particular, it relates to capture alignment and test stimulus isolation techniques that compensate for artifacts in the capture mechanism. The technology disclosed includes methods and systems for analyzing both downlink and uplink quality for mobile or other video display device cameras capturing and transmitting video frames including teleconference video display. Particular aspects of the technology disclosed are described in the claims, specification and drawings.2016-06-09