01st week of 2013 patent application highlights part 29
Patent application number | Title | Published
20130002806APPARATUS AND METHOD FOR MANAGING TELEPRESENCE SESSIONS - A system that incorporates teachings of the present disclosure may include, for example, obtaining images that are captured by a camera system at a location associated with a user, transmitting video content representative of the images over a network for presentation by another media processor at another location, receiving at a media processor media content and second video content representative of second images that are associated with the second user, and presenting at a display device at the location the media content and the second video content in a telepresence configuration that simulates a presence of the other user at the location, where the presentation of at least one of the media content and the second video content at the display device is delayed based on latency parameters associated with the other media processor.2013-01-03
20130002807METHOD AND SYSTEM FOR MEASURING ANGLES BASED ON 360 DEGREE IMAGES - A method of measuring an angle includes orienting a measurement device at a reference position characterized by a reference angle. A first panoramic image defined by a predetermined range of elevation angles is acquired where the first panoramic image includes an object. A first bearing of the object in relation to the reference angle is determined and the measurement device is rotated to a measurement position characterized by a measurement angle. A second panoramic image defined by the predetermined range of elevation angles is acquired where the second panoramic image includes the object. A second bearing of the object in relation to the reference angle is determined. The measurement angle is computed as a function of the first bearing and the second bearing.2013-01-03
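The bearing-difference step described above lends itself to a short worked example. The following Python sketch is illustrative only (the angle convention, degree units and wrap-around handling are assumptions, not details from the application): the rotation of the measurement device is recovered from how the object's bearing changes between the two panoramic images.

```python
def measurement_angle(reference_angle_deg, first_bearing_deg, second_bearing_deg):
    """Estimate the measurement angle from the two bearings of the same object
    observed in the first and second panoramic images (all angles in degrees)."""
    # The object's apparent bearing shifts opposite to the device rotation,
    # so the rotation is the first bearing minus the second bearing.
    delta = (first_bearing_deg - second_bearing_deg) % 360.0
    return (reference_angle_deg + delta) % 360.0

print(measurement_angle(0.0, 75.0, 30.0))  # -> 45.0
```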
20130002808METHOD FOR PHOTOGRAPHIC PANORAMIC IMAGE WHEN THRESHOLD EXCEEDS COMPARISON BETWEEN CURRENT AND PREVIOUS IMAGES - Disclosed is a method for photographing a panoramic image including the steps of recognizing movement of a corresponding photographing apparatus by comparing a current real-time input image with a previous image through a motion estimation mechanism with exposure compensation, determining a time to photograph each next picture by determining whether movement in a photography direction reaches a preset threshold value, and photographing each next picture by manual or automatic operation at the determined time.2013-01-03
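As a rough illustration of the trigger logic described above, the sketch below fires each shot once the motion estimated since the previous shot reaches a preset threshold. The callbacks, threshold value and shot count are hypothetical placeholders, not values from the application.

```python
def capture_panorama(estimate_motion_step, take_picture, threshold_deg=20.0, shots=4):
    """Take `shots` pictures, triggering each one when the accumulated sweep
    estimated from successive live images reaches `threshold_deg`."""
    pictures = [take_picture()]
    swept = 0.0
    while len(pictures) < shots:
        swept += estimate_motion_step()        # motion since the last live frame
        if swept >= threshold_deg:
            pictures.append(take_picture())
            swept = 0.0
    return pictures

# Toy usage with a fake motion source and camera.
motions = iter([5.0] * 100)
print(len(capture_panorama(lambda: next(motions), lambda: "frame")))  # -> 4
```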
20130002809IMAGE GENERATING APPARATUS, SYNTHESIS TABLE GENERATING APPARATUS, AND COMPUTER READABLE STORAGE MEDIUM - An image generating apparatus includes an image receiving unit, a projecting unit, and a panoramic image generating unit. The image receiving unit receives camera images captured by a plurality of cameras, the subject of which is common. The projecting unit projects, within each camera image received by the image receiving unit, images which are present below the horizontal line within each camera image on the bottom surface of an infinite hemisphere having a planar bottom surface and projects images which are present above the horizontal line within each camera image on the hemispherical surface of the infinite hemisphere. The panoramic image generating unit generates a panoramic image based on the images projected on the infinite hemisphere by the projecting unit.2013-01-03
20130002810OUTLIER DETECTION FOR COLOUR MAPPING - A method and an arrangement for an improved outlier detection for colour mapping are recommended, wherein a neighborhood of a partially colour compensated test-image by comparing the initially corrected test-image with a reference image in a neighborhood comparator is used for outlier detection and not outlier detection of at least two images for colour mapping.2013-01-03
20130002811THREE-DIMENSIONAL IMAGING METHOD USING SINGLE-LENS IMAGE-CAPTURE APPARATUS AND THREE-DIMENSIONAL IMAGE ENHANCEMENT METHOD BASED ON TWO-DIMENSIONAL IMAGES - A three-dimensional (3D) imaging method using one single-lens image-capture apparatus, comprising: deriving a first two-dimensional (2D) image with the single-lens image-capture apparatus; deriving a depth map corresponding to the first 2D image; synthesizing a view synthesized image according to the depth map and the first 2D image; and deriving a second 2D image with the single-lens image-capture apparatus according to the view synthesized image, wherein the first 2D image and the second 2D image are utilized for 3D image display.2013-01-03
20130002812ENCODING AND/OR DECODING 3D INFORMATION - There is an encoding of three dimensional (3D) information. The encoding may include receiving a signal including frames in a 3D video sequence, receiving caption information to appear in a caption window associated with the frames, and/or receiving disparity information associated with the frames. The encoding may also include determining frame disparity maps based on the disparity information associated with the frames. The frame disparity maps may be determined by dividing a part of a frame into a plurality of grid cells. The grid cells may define a disparity measure associated with locations in a grid. The grid cells may form a caption window disparity map dividable into equivalent size portions including an equivalent amount of grid cells. The encoding may also include encoding the frames, the caption information and the frame disparity maps. There is also a decoding of the 3D information.2013-01-03
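The grid-cell idea can be pictured with a few lines of code. In this hedged sketch (array shapes, slices and the NumPy representation are assumptions for illustration), the caption window disparity map is simply the sub-grid of per-cell disparity measures covered by the caption window.

```python
import numpy as np

def caption_window_disparity_map(frame_disparity_map, row_slice, col_slice):
    """Cut the caption window's cells out of the frame disparity map; a decoder
    could derive a caption depth from this sub-grid of disparity measures."""
    return frame_disparity_map[row_slice, col_slice]

frame_map = np.arange(36).reshape(6, 6)        # toy 6x6 grid of disparity measures
print(caption_window_disparity_map(frame_map, slice(4, 6), slice(1, 5)))
```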
20130002813VIEWING WINDOWS FOR VIDEO STREAMS - Techniques are provided for viewing windows for video streams. A video stream from a video capture device is accessed. Data that describes movement or position of a person is accessed. A viewing window is placed in the video stream based on the data that describes movement or position of the person. The viewing window is provided to a display device in accordance with the placement of the viewing window in the video stream. Motion sensors can detect motion of the person carrying the video capture device in order to dampen the motion such that the video on the remote display does not suffer from motion artifacts. Sensors can also track the eye gaze of either the person carrying the mobile video capture device or the remote display device to enable control of the spatial region of the video stream shown at the display device.2013-01-03
20130002814METHOD FOR AUTOMATICALLY IMPROVING STEREO IMAGES - A method for improving a stereo image including a left view image and a right view image, comprising: using a data processor to automatically analyze the stereo image to determine an original stereo quality score responsive to relative positions of corresponding points in the left view image and the right view image; specifying a set of one or more candidate modifications to the stereo image; determining revised stereo quality scores based on each of the candidate modifications to the stereo image; selecting a particular candidate modification that produces a revised stereo quality score which indicates a higher quality level than the original stereo quality score; forming an output stereo image corresponding to the selected particular candidate modification; and storing the output stereo image in a processor-accessible memory.2013-01-03
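The select-the-best-modification step is essentially a scoring loop. The sketch below shows that loop in Python under the assumption that the quality metric and the candidate modifications are supplied as callables; the metric itself is not specified here.

```python
def improve_stereo(stereo_image, quality_score, candidate_modifications):
    """Return the version of the stereo image (original or modified) with the
    highest stereo quality score among the supplied candidate modifications."""
    best_image, best_score = stereo_image, quality_score(stereo_image)
    for modify in candidate_modifications:
        revised = modify(stereo_image)
        revised_score = quality_score(revised)
        if revised_score > best_score:          # higher score indicates higher quality
            best_image, best_score = revised, revised_score
    return best_image
```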
20130002815 3D DRAWING SYSTEM FOR PROVIDING A REAL TIME, PERSONALIZED, AND IMMERSIVE ARTISTIC EXPERIENCE - A method for providing a three dimensional (3D) drawing experience. The method includes capturing a 3D image of a participant and then processing this image to key the participant's image from a background. The keyed participant's image is mixed with a 3D background image such as frames or scenes from a 3D movie, and the mixed 3D image is projected on a projection screen. For example, left and right eye images may be projected from a pair of projectors with polarization films over the lenses, and the projection screen may be a polarization-maintaining surface such as a silver screen. The user moves a drawing instrument in space in front of the projection screen, and spatial tracking is performed to generate a locus of 3D positions. These 3D positions are used to create a 3D drawing image that is projected with the 3D background and participant images in real time.2013-01-03
20130002816Depth Map Coding - The invention relates to coding of depth information for multi-view video coding. Different parameters and/or any features from picture encoding or the encoded and reconstructed pictures may be used in the coding of the depth information, especially in filtering the depth picture using e.g. a loop filter in the depth coding loop. The same principle may be applied in decoding, that is, the decoded (texture) pictures and parameters may be used to control the decoding of the depth data, e.g. to control the filtering of the depth data in a loop filter. Parameters and data that may be used as such control may comprise features extracted from the reconstructed pictures, the encoded video data and parameters, the motion estimation data and others.2013-01-03
20130002817IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD THEREOF - An apparatus and method for processing an image are provided. The image processing apparatus, which uses a two-dimensional (2D) video signal and depth information corresponding to the 2D video signal to generate a three-dimensional (3D) video signal, includes: an image receiver which receives a 2D video signal containing a background and an object; and an image processor which adjusts a transition area corresponding to a boundary between the object and the background in the depth information, and renders a 3D image from the 2D video signal through the adjusted transition area.2013-01-03
20130002818IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD THEREOF - Provided are an apparatus and method for processing an image. The image processing apparatus that generates a three-dimensional (3D) video signal corresponding to a two-dimensional (2D) video signal includes: an image receiver which receives a frame of a 2D video signal containing a background and an object; and an image processor which determines a hole area caused by a shift of the object based on depth information of the object, and inpaints the hole area according to layer information of the background and the object. Accordingly, there are provided an apparatus and method for processing an image, in which a hole area is naturally and easily inpainted.2013-01-03
20130002819RECEIVING SYSTEM AND METHOD OF PROCESSING DATA - A receiving system and a method of processing data are disclosed herein. The receiving system includes a receiving unit, a system information processor, a decoding unit, and a display unit. The receiving unit receives a broadcast signal including a 3D content and system information associated with the 3D content. The system information processor extracts identification information from the system information. Herein, the identification information may identify that the broadcast signal being received by the receiving unit includes the 3D content. The decoding unit decodes the received 3D content based upon transmission format information of the 3D content. Herein, the transmission format information may be included in the extracted identification information. And, the display unit displays the 3D content decoded by the decoding unit as a 3D image based upon a display method of a display device.2013-01-03
20130002820Streaming and Rendering Of 3-Dimensional Video - Transmitting and receiving 3D video content via an Internet protocol (IP) stream are described. 3D video content may be transmitted in a single IP stream and adjusted by a device associated with a display for rendering the 3D video content in a desired manner. 3D content also may be transmitted in a plurality of IP streams and a device associated with a display for rendering the 3D content may determine which of the plurality of IP streams to decode based upon a mode of operation of the device. A device receiving 3D video content may be configured to adjust the appearance of the content displayed on a display associated with the device. Such adjusting of the appearance may include moving the position of the rendered 3D video content within the display, positioning in band and/or out of band content in front of, behind, or within the rendered 3D video content.2013-01-03
20130002821VIDEO PROCESSING DEVICE - A video processing device is a device capable of outputting stereoscopic video information containing a first eye image and a second eye image and enabling stereoscopic viewing to a video display device. The video processing device includes an obtaining unit that obtains the stereoscopic video information which is obtained by coding the first and second eye images in a coding method using different bit rates to the first and second eye images respectively, and a transmitting unit that transmits identification information indicating one of the first and second eye images that is coded with higher bit rate, to the video display device, with the identification information being associated with the decoded stereoscopic video information.2013-01-03
20130002822PRODUCT ORDERING SYSTEM, PROGRAM AND METHOD - A product ordering system storing the information of 3D product models including data for forming each of the 3D product models and the scale of each of the 3D product models. A method for ordering a product using the system includes first capturing an image. Then, sensing the distance between the image capturing unit and the user, next, obtaining specific image data from the captured image according to one selected 3D product model, then, converting life size data needed to form a 3D model of the user according to the focus of the image capturing unit and the sensed distance. After that, generating a 3D model of the user according to the scale of the 3D product model, and overlaying the selected 3D product model with the 3D model of the user and displaying the combination for viewing by the user.2013-01-03
20130002823IMAGE GENERATING APPARATUS AND METHOD - An image generating apparatus may include a first reflector and a second reflector. When a light emitter emits an infrared light, the infrared light may be reflected from a first reflector and be omni-directionally reflected. The reflected infrared light that is the infrared light reflected from the object may be reflected from a second reflector and be transferred to a sensor. The sensor may receive the reflected infrared light and generate a depth image of the object.2013-01-03
20130002824INTEGRATED OTOSCOPE AND THREE DIMENSIONAL SCANNING SYSTEM - A multi-purpose device that can be used, as, among other things, an otoscope and a three dimensional scanning system is disclosed.2013-01-03
20130002825IMAGING DEVICE - This 3D image capture device includes a light-transmitting section 2013-01-03
20130002826CALIBRATION DATA SELECTION DEVICE, METHOD OF SELECTION, SELECTION PROGRAM, AND THREE DIMENSIONAL POSITION MEASURING APPARATUS - Appropriate selection of calibration data by shortening process time without wasteful processing is provided. Before measuring a three dimensional point of a target object from a stereo image, the calibration data according to an in-focus position of taking optical systems are applied to the stereo image. To select the calibration data, an object distance is acquired according to parallax obtained from the stereo image being reduced. The object distance is an estimated focusing distance corresponding to the in-focus position. One of the calibration data assigned with a set distance region in which the estimated focusing distance is included is selected. Respective view images are reduced in a range in which it is possible to detect any one of the set distance regions determined for a respective reference focusing distance corresponding to the calibration data.2013-01-03
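The selection rule reduces to a range lookup: find the calibration entry whose set distance region contains the focusing distance estimated from the reduced-image parallax. The table layout below is a hypothetical representation, not the format used by the apparatus.

```python
def select_calibration(calibration_table, estimated_focusing_distance):
    """`calibration_table` is a list of (near_limit, far_limit, calibration_data)
    tuples; return the data whose set distance region contains the estimate."""
    for near_limit, far_limit, calibration_data in calibration_table:
        if near_limit <= estimated_focusing_distance < far_limit:
            return calibration_data
    raise ValueError("estimated focusing distance falls outside all set distance regions")

table = [(0.0, 1.0, "near"), (1.0, 3.0, "mid"), (3.0, float("inf"), "far")]
print(select_calibration(table, 2.2))  # -> "mid"
```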
20130002827APPARATUS AND METHOD FOR CAPTURING LIGHT FIELD GEOMETRY USING MULTI-VIEW CAMERA - An apparatus and method for capturing a light field geometry using a multi-view camera that may refine the light field geometry varying depending on light within images acquired from a plurality of cameras with different viewpoints, and may restore a three-dimensional (3D) image.2013-01-03
20130002828Context and Epsilon Stereo Constrained Correspondence Matching - A catadioptric camera having a perspective camera and multiple curved mirrors images the multiple curved mirrors and uses the epsilon constraint to establish a vertical parallax between points in one mirror and their corresponding reflection in another. An ASIFT transform is applied to all the mirror images to establish a collection of corresponding feature points, and edge detection is applied on mirror images to identify edge pixels. A first edge pixel in a first imaged mirror is selected, its 25 nearest feature points are identified, and a rigid transform is applied to them. The rigid transform is fitted to 25 corresponding feature points in a second imaged mirror. The closest edge pixel to the expected location as determined by the fitted rigid transform is identified, and its distance to the vertical parallax is determined. If the distance is not greater than a predefined maximum, then it is deemed to correlate with the edge pixel in the first imaged mirror.2013-01-03
20130002829Method of and device for capturing 3D data of one or more airborne particles - Disclosed is a method of capturing 3D data of one or more airborne particles. At least one image of the one or more airborne particles is taken by a plenoptic camera of which the geometry and the optical properties of its optics are known, and the distance of a plane of focus with at least one selected particle of the one or more airborne particles from a defined reference location is determined by use of the captured image together with the known optical properties and the known geometry of the optics of the plenoptic camera.2013-01-03
20130002830STEREOSCOPIC IMAGING DEVICE AND METHOD FOR AUTOMATICALLY ADJUSTING THE FOCAL POINT OF A STEREOSCOPIC IMAGING DEVICE - A stereoscopic imaging device comprising: a second focus adjusting unit that operates a second focus lens to carry out a search within a second search range and searches for a second lens position at which a subject to be imaged is brought into focus; and a photographing unit that performs a photographing of a first viewpoint image and a second viewpoint image when a photographing instruction is inputted after a process by a first focus adjusting unit and a second focus adjusting unit, wherein the second focus adjusting unit calculates the second lens position based upon a first lens position and a focus positional deviation amount stored in a storage unit, and shifts the second focus lens to the second lens position, if it is not possible to acquire the second lens position within the second search range as a result of the search.2013-01-03
20130002831Infrared Emitter in Projection Display Television - An IR emitter internally mounted behind the screen of a projection television apparatus. The IR emitter is mounted so as not to interfere with the image display light path. The IR emitter is mounted so that IR rays are reflected off the mirror and exit through the screen of the projection television. The IR emitter includes one or more IR type LEDs.2013-01-03
20130002832METHOD AND APPARATUS FOR MONITORING AN OBJECT - A method and system of monitoring an object (e.g., for change in configuration of a person) includes projecting a radiation pattern onto the object; recording at a first time first image data representing a portion of the projected radiation pattern on the object, the first image data representative of a three dimensional configuration of the object at the first time; recording at a second time second image data representing a portion of the projected pattern of radiation on the object, the second image data representative of a three dimensional configuration of the object at the second time; and processing the first and second image data to generate differential data representative of a change in the configuration of the object between the first and second times.2013-01-03
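A crude way to picture the differential-data step is a pixel-wise difference of the two recorded pattern images. The sketch below is only illustrative (8-bit images and a fixed threshold are assumptions) and stands in for whatever differential representation the method actually uses.

```python
import numpy as np

def configuration_change(first_image, second_image, threshold=10):
    """Return the fraction of pattern pixels that changed between the two
    recordings, as a simple surrogate for the differential data."""
    diff = np.abs(second_image.astype(np.int16) - first_image.astype(np.int16))
    return float((diff > threshold).mean())
```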
20130002833SYSTEMS AND METHODS FOR PROVIDING CLOSED CAPTIONING IN THREE-DIMENSIONAL IMAGERY - Systems and methods are presented for processing three-dimensional (3D or 3-D) or pseudo-3D programming. The programming includes closed caption (CC) information that includes caption data and a location identifier that specifies a location for the caption data within the 3D programming. The programming information is processed to render the caption data at the specified location and to present the programming on the display. By encoding location identification information into the three-dimensional programming, a high level of configurability can be provided and the 3D experience can be preserved while captions are displayed.2013-01-03
20130002834 3D IMAGE DISPLAY DEVICE - The disclosed 3D image display device has: a delay unit (2013-01-03
20130002835BACKLIGHT MODULATION TO PROVIDE SYNCHRONIZATION BETWEEN SHUTTER GLASSES AND THREE DIMENSIONAL (3D) DISPLAY - In general, in one aspect, a three dimensional (3D) display includes an optical stack, a backlight, panel electronics and a backlight driver. The optical stack is to present left eye and right eye images. The backlight is to illuminate the optical stack so the left eye and right eye images are visible and to provide signals for synchronizing the image illuminated on the optical stack with shutter glasses worn by a user to enable a left eye to view the left eye images and a right eye to view the right eye images. The panel electronics are to generate the left eye and right eye images on the optical stack. The backlight driver is to control operation of the backlight.2013-01-03
201300028363D DISPLAY APPARATUS AND 3D DISPLAY METHOD THEREOF - A 3-dimensional (3D) display apparatus and a 3D display method thereof are provided. The 3D display apparatus includes: a display unit which outputs a plurality of image frames; a synchronization signal processor which generates a synchronization signal corresponding to the image frames; a controller which acquires period information of first and second sequences of the synchronization signal and determines whether the synchronization signal has been stabilized, according to change information of the period information; and a communicator which determines time information of the stabilized synchronization signal according to the determination result and transmits shutter control data, which is generated based on the time information, to 3D glasses.2013-01-03
20130002837DISPLAY CONTROL CIRCUIT AND PROJECTOR APPARATUS - A display control circuit includes: a device drive unit writing left and right images alternately into a light modulation device in time division based on time-division display type stereoscopic image data including the left and right images; a shutter glass drive unit driving opening and closing of liquid crystal shutters in the shutter glasses; a light source drive unit maintaining the total sum of light source current to be constant over cycles when an opening and closing period of the liquid crystal shutters is taken as one cycle; and a control unit controlling the timing of opening and closing the liquid crystal shutters with respect to the shutter glass drive unit, the timing of writing the left and right images with respect to the device drive unit and the luminance level of the light source in the opening and closing period with respect to the light source drive unit.2013-01-03
20130002838THREE-DIMENSIONAL IMAGE PLAYBACK METHOD AND THREE-DIMENSIONAL IMAGE PLAYBACK APPARATUS - When the video images to be played back are three-dimensional (3D) images each including a first parallax image and a second parallax image obtained when an object in a 3D space is viewed from different viewpoints, a video attribute determining unit determines whether or not the frame rate of the images is within a predetermined frame rate range at which a flicker is likely to occur. When the frame rate of the 3D images is within the predetermined frame rate range, a frame rate converter raises the frame rate of the 3D images until the frame rate exceeds the predetermined frame rate range.2013-01-03
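The frame-rate rule can be illustrated in a couple of lines: if the source rate falls inside a flicker-prone band, keep doubling it (by repeating frames) until it is above that band. The band limits used here are placeholders, not figures taken from the application.

```python
def playback_frame_rate(input_fps, flicker_low=48.0, flicker_high=75.0):
    """Raise a 3D source's frame rate by integer factors until it leaves the
    assumed flicker-prone band [flicker_low, flicker_high]."""
    fps = input_fps
    while flicker_low <= fps <= flicker_high:
        fps *= 2.0
    return fps

print(playback_frame_rate(60.0))  # -> 120.0; a rate outside the band passes through unchanged
```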
20130002839Device and Method for the Recognition of Glasses for Stereoscopic Vision, and Related Method to Control the Display of a Stereoscopic Video Stream - A method for the recognition of stereoscopic glasses, wherein two images of an environment in front of a screen are acquired from the same point of view. A differential image is then calculated by subtracting one of the two images from the other one, and the presence of two lenses is detected within the differential image. A method is also provided for controlling the display of stereoscopic images by using the method for the recognition of glasses. Also described are the devices allowing the methods to be implemented.2013-01-03
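The differential-image step is a straightforward subtraction; the sketch below (NumPy arrays and a fixed change threshold are assumptions) produces a candidate mask in which two lens-shaped blobs would subsequently be searched for.

```python
import numpy as np

def lens_candidate_mask(image_a, image_b, threshold=30):
    """Subtract the two images acquired from the same viewpoint and keep the
    strongly changed pixels; lens detection would then look for two blobs here."""
    diff = np.abs(image_a.astype(np.int16) - image_b.astype(np.int16))
    return diff > threshold
```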
20130002840METHOD AND APPARATUS FOR RECORDING MEDIA EVENTS - An apparatus, method, and computer program product are provided for recording media events. In particular, the apparatus includes a processor and a memory including computer program code that are configured to cause the apparatus to identify a party to the media event and access authorization data associated with the party. The authorization data may indicate whether the party consents to the recording of the party's participation in the media event and, if so, under what circumstances. Location data may also be accessed regarding the location of the party, and the location data may also be considered. In this way, the determination of whether the party's voice or image may be recorded can be based on the explicit consent of the party, the legal jurisdiction in which the media event takes place, the location or environment of the media event, or a combination of one or more of these factors.2013-01-03
20130002841SYSTEM AND METHOD FOR ACQUIRING IMAGES - A vision system useful in acquiring images includes: a light dome having a window and a perimeter; an annular light curtain positioned within and radially inwardly from the perimeter of the light dome such that an annular gap is formed between the light dome and the light curtain; and a light ring positioned to illuminate the gap between the light dome and the light curtain. The light curtain and window are sized and positioned such that no direct light from the light ring reaches the window. The system further comprises a camera having a lens facing the window to acquire images of an object on a side of the window opposite the camera. The images acquired by the camera can then be compared to stored images to determine whether the identity of the objects (which may be pharmaceutical tablets) is as expected.2013-01-03
20130002842Systems and Methods for Motion and Distance Measurement in Gastrointestinal Endoscopy - Systems and methods for extracting and measuring motion from images captured during capsule endoscopy in accordance with embodiments of the invention are disclosed. In one embodiment of the invention, an endoscope system configured to generate a spatial index of images captured along a passageway includes a processor and a camera configured to capture a plurality of images as the camera moves along a passageway, wherein the processor is configured to compare sets of at least two images from the plurality of images, determine motion of the camera along the passageway using the sets of at least two images, determine the distance the camera traveled along the passageway at the point at which an image in each of the sets of at least two images was captured, and generate a spatial index for the plurality of images by associating distances traveled along the passageway with the images.2013-01-03
20130002843MOTOR DRIVE APPARATUS AND IMAGING APPARATUS FOR DIAGNOSIS - A motor drive apparatus mounted with an imaging probe having a transmitting and receiving unit which carries out signal transmission and reception continuously, comprises: a scanner unit; a pull-back unit; and an operation unit, wherein on the side surface of the scanner unit, there are formed, at positions facing each other, a convex portion formed for a predetermined length along the straight-ahead direction of the scanner unit, and a concave portion having an upper end surface which forms an identical surface as a lower end surface of the convex portion.2013-01-03
20130002844ENDOSCOPE APPARATUS - To provide an endoscope apparatus with which an affected site can be treated with a treatment instrument while viewing tissue in the body cavity, including a region located at the rear of the treatment instrument. An endoscope apparatus is employed, which is provided with an image generating portion that generates an image of a subject; an image-saving memory portion that saves a real-time image; a forceps-region extracting portion that extracts a forceps region, in which forceps exist, from the real-time image; an image-position aligning portion that aligns positions of the saved image saved in the image-saving memory portion and the real-time image; a forceps-region extracting portion that extracts a region corresponding to the forceps region from the saved image saved in the image-saving memory portion; and an image combining portion that combines an image of the region extracted by the forceps-region extracting portion and the real-time image.2013-01-03
20130002845SYSTEM FOR DETECTING AN ITEM WITHIN A SPECIFIED ZONE - The disclosure reveals a system for detecting one or more persons in a specified zone. A determination is made as to whether there is a person in the zone. A presence determination module may indicate from a current image of the zone compared with a reference image of the zone, whether there is a person in or not in the zone. An illumination controller may assure that the zone is sufficiently illuminated for a current image sufficient for comparison with the reference image to determine a possible presence of a person in the zone. The illumination may be infrared. The system may be used to assure appropriate and adequate face velocity at a fume hood having the presence of a person and having minimal face velocity in the absence of a person at the fume hood.2013-01-03
20130002846SYSTEM AND METHOD FOR TRACKING THE POINT OF GAZE OF AN OBSERVER - A system for tracking the point of gaze of an observer observing an object comprises a camera for recording an image of an eye of the observer, a means for providing a luminous marker, and means for analyzing the image of the eye to determine the reflection of the marker on the eye and the centre of the pupil. The relative positions of the corneal reflection of the marker and the pupil centre are measured. The marker is repositioned, in dependence on the determined relative positions, to improve correspondence between the corneal reflection of the marker and the pupil centre.2013-01-03
20130002847SYSTEMS AND METHODS FOR SAMPLE DISPLAY AND REVIEW - Methods and systems for displaying images of cells in a sample include obtaining a plurality of images of cells in the sample, where each image corresponds to one of the cells in the sample, determining values of at least one property for each of the cells based on the plurality of images, arranging the plurality of images to form a first image array, where the images are ordered in the first image array based on the values of the at least one property, displaying the first image array, sorting the plurality of images to form a second image array in which an ordering of the images is different from the first image array, and displaying the second image array, where the sample includes blood and the cells include red blood cells.2013-01-03
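The two image arrays described above are just two orderings of the same per-cell images. A minimal sketch follows (the property values and the second sort key are hypothetical inputs, not part of the application):

```python
def build_image_arrays(cell_images, property_values, second_sort_key):
    """Order the per-cell images by a measured property to form the first array,
    then re-sort them by a different key to form the second array."""
    first_array = [image for _, image in
                   sorted(zip(property_values, cell_images), key=lambda pair: pair[0])]
    second_array = sorted(first_array, key=second_sort_key)
    return first_array, second_array
```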
20130002848STAGE ADAPTOR FOR IMAGING BIOLOGICAL SPECIMENS - A stage adaptor for imaging a biological specimen is described. The adaptor having a housing; a vented chamber contained within the housing; and a removable lid for covering the vented chamber. A depression is provided on the removable lid for receiving an objective from a microscope. An aperture is also provided at the apex of the depression for viewing inside the vented chamber. Also described is an integrated stage adaptor and imaging system as well as a method for imaging the biological specimen using the stage adaptor.2013-01-03
20130002849METHOD AND APPARATUS FOR PATTERN INSPECTION - According to the present invention, for a pattern inspection apparatus that compares images in corresponding areas of two patterns that are identical and that determines that an unmatched portion between the images is a defect, a plurality of detection systems and a plurality of corresponding image comparison methods are provided. With this configuration, the effect of uneven brightness for a pattern that occurs due to differences in film thicknesses can be reduced, a highly sensitive pattern inspection can be performed, a variety of defects can be revealed, and the pattern inspection apparatus can be applied for processing performed within a wide range. Furthermore, the pattern inspection apparatus also includes a unit for converting the tone of image signals of comparison images for a plurality of different processing units, and when a difference in brightness occurs in the same pattern of the images, a defect can be correctly detected.2013-01-03
20130002850INSPECTION OF A COMPONENT - A component such as a pressure vessel includes a cladding layer bonded to a substrate by a welding process. The component is inspected by a method that includes using a first camera to record a first image of an inspection portion of the cladding layer, the inspection portion having a pattern of markings; causing or allowing a temperature change in the component; using a second camera to record a second image of the inspection portion; and identifying the deformation of the pattern of markings on the inspection portion by reference to relative movement of the respective markings between the first and second image. The temperature change may occur as a result of cooling of the component after the welding process. Analysis of the deformation may be conducted by a Digital Image Correlation process, enabling possible defects or flaws in the component to be identified.2013-01-03
20130002851Optical Inspection of Containers - An apparatus and method for inspecting a container having a base and a mouth, wherein light is directed through the container base into the container, and out of the container through the container mouth, using at least first and second light sources operatively disposed adjacent to each other beneath the container base and having differing operating characteristics. Light transmitted through the container mouth is sensed, and a composite image of the container mouth may be produced from two or more images of portions of the container mouth.2013-01-03
20130002852System and Method for Inspecting Components of Hygienic Articles - System and method to inspect hygienic articles. Defects are detected using a vision system by comparing an inspection image of a component to a reference image of a defect-free component. Detection of a defect can then be used to reject components and perform other functions.2013-01-03
20130002853FILTER INSPECTION METHOD AND APPARATUS - An approximate circle approximated to the profile of a filter end face in an inspection image obtained by photographing the filter end face of a filter cigarette is obtained, and the center position of the approximate circle is found. An area indicating a string-like flavor containing element embedded in the filter is detected in the inspection image, and the position of gravity center of the area is found. Distance between the gravity center position indicating the flavor containing element and the center position of the approximate circle is determined, and thus, the quality of the flavor containing element embedded in the filter is inspected.2013-01-03
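The quality check boils down to a distance between two centres. The sketch below is a simplification (the circle centre is taken as the mean of the contour points rather than a proper least-squares circle fit, and NumPy point arrays are assumed):

```python
import numpy as np

def flavor_element_offset(end_face_contour_xy, element_pixels_xy):
    """Distance between the centre of the approximate end-face circle and the
    centre of gravity of the detected flavour-containing element."""
    circle_centre = end_face_contour_xy.mean(axis=0)      # crude centre estimate
    element_centroid = element_pixels_xy.mean(axis=0)     # area's centre of gravity
    return float(np.linalg.norm(element_centroid - circle_centre))
```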
20130002854MARKING METHODS, APPARATUS AND SYSTEMS INCLUDING OPTICAL FLOW-BASED DEAD RECKONING FEATURES - A position of a marking device is monitored by receiving start position information indicative of an initial position of the marking device, capturing one or more images using one or more camera systems attached to the marking device, and analyzing the image(s) to determine tracking information indicative of a motion of the marking device. The tracking information and the start position information are then analyzed to determine current position information. In one example, images of a target surface over which the marking device is carried are analyzed pursuant to an optical flow algorithm to provide estimates of relative position for a dead-reckoning process, and the current position information is determined based on the estimates of relative position and the start position information.2013-01-03
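The dead-reckoning part of the scheme accumulates the optical-flow position estimates onto the start position. A minimal 2D sketch follows (the coordinate frame and units are assumptions):

```python
def dead_reckon(start_position, relative_displacements):
    """Accumulate per-frame relative displacements, estimated from optical flow
    over the target surface, onto the received start position."""
    x, y = start_position
    for dx, dy in relative_displacements:
        x += dx
        y += dy
    return (x, y)

print(dead_reckon((0.0, 0.0), [(0.10, 0.00), (0.10, 0.02), (0.12, -0.01)]))  # ~ (0.32, 0.01)
```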
20130002855 3-D LUMINOUS PIXEL ARRAYS, 3-D LUMINOUS PIXEL ARRAY CONTROL SYSTEMS AND METHODS OF CONTROLLING 3-D LUMINOUS PIXEL ARRAYS - A luminous pixel array includes a plurality of luminous flying vehicles configured to move in 3-dimensional space. A first luminous flying vehicle of the plurality of luminous flying vehicles corresponds to at least one pixel of the luminous pixel array. The first luminous flying vehicle is configured to radiate a first color and intensity of light.2013-01-03
20130002856SHAPE MEASUREMENT METHOD AND SHAPE MEASUREMENT APPARATUS FOR TIRES - A shape measurement method for a tire includes: detecting an outer surface shape data and an inner surface shape of the tire from image data of the outer surface and the inner surface; subjecting irregularities along the tire circumferential direction around the tire in the outer surface shape data and in the inner surface shape data to Fourier transformation to take out primary waveform components respectively; adjusting the tire circumferential positions of both of the waveform components to adjust the tire circumferential positions thereof; adjusting the tire radial direction cross section positions of the outer surface shape data and the inner surface shape data from information about the placement angles and the positions of the first camera and the second camera; and synthesizing the outer surface shape data and the inner surface shape data based on the adjusted tire circumferential positions and the tire radial direction cross section positions.2013-01-03
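The "primary waveform component" is the first Fourier harmonic of the profile measured around the circumference; its phase is what allows the outer and inner data to be aligned circumferentially. A hedged NumPy sketch (uniform angular sampling is assumed):

```python
import numpy as np

def primary_waveform(circumferential_profile):
    """Return the first-harmonic waveform and its phase for a profile sampled
    uniformly around the tyre circumference."""
    samples = np.asarray(circumferential_profile, dtype=float)
    n = samples.size
    first_harmonic = np.fft.rfft(samples)[1]            # primary component
    phase = np.angle(first_harmonic)                     # used for circumferential alignment
    theta = 2.0 * np.pi * np.arange(n) / n
    waveform = (2.0 / n) * np.abs(first_harmonic) * np.cos(theta + phase)
    return waveform, phase
```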
20130002857NAVIGATION IN BUILDINGS WITH RECTANGULAR FLOOR PLAN - An apparatus and method for providing a direction based on an angle of a reference wall is provided. A mobile device uses an angle of a horizontal feature from an image to calibrate a sensor and future sensor measurements. The angle of the horizontal feature is determined by image processing and this angle is mapped to one of four assumed parallel or perpendicular angles of an interior of a building. A sensor correction value is determined from a difference between the sensor-determined angle and the image-processing determined angle. The image processing determined angle is assumed to be very accurate and without accumulated errors or offsets that the sensor measurements may contain.2013-01-03
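The calibration idea can be shown in a few lines: snap the image-derived heading to the nearest of the four assumed wall directions and take the difference from the sensor heading as the correction. Degree units and the wrap-around convention below are assumptions.

```python
def heading_correction(sensor_heading_deg, image_heading_deg):
    """Snap the image-processing heading to the nearest of 0/90/180/270 degrees
    (relative to the building) and return it with the sensor correction value."""
    snapped = (round(image_heading_deg / 90.0) % 4) * 90.0
    correction = (snapped - sensor_heading_deg + 180.0) % 360.0 - 180.0
    return snapped, correction

print(heading_correction(97.0, 92.5))  # -> (90.0, -7.0)
```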
20130002858Mechanisms for Conserving Power in a Compressive Imaging System - A system and method for conserving power in compressive imaging. An optical subsystem separates an incident light stream into a primary light stream and a secondary light stream. The primary light stream is modulated with a sequence of spatial patterns by a light modulator. The modulated light stream is sensed by a first light sensing device. The secondary light stream is sensed by a second light sensing device. The signal(s) produced by the second light sensing device may be monitored to determine when to turn on power to the light modulator. Thus, the light modulator may remain off when not needed. In an alternative implementation, a light sensing device is used to sense the light reflected from the light modulator in its power-off state. The signal(s) produced by that light sensing device may be monitored to determine when to turn on power to the light modulator.2013-01-03
20130002859INFORMATION ACQUIRING DEVICE AND OBJECT DETECTING DEVICE - Laser light emitted from a laser light source is converted into light having a dot pattern by a projection optical system for projection onto a target area. The projection optical system is configured such that the density of dots in a peripheral portion of the dot pattern is smaller than that in a center portion of the dot pattern in the target area. A dot pattern captured by irradiating a dot pattern onto a reference plane is divided into segment areas. A distance to each segment area is acquired by matching between dots in each segment area, and a dot pattern acquired by capturing an image of the target area at the time of distance measurement. The segment areas are set such that a segment area in the peripheral portion of the dot pattern is larger than a segment area in the center portion of the dot pattern.2013-01-03
20130002860INFORMATION ACQUIRING DEVICE AND OBJECT DETECTING DEVICE - An information acquiring device is provided with a projection optical system which projects laser light onto a target area with a predetermined dot pattern, and a light receiving optical system which captures an image of the target area. Segment areas are set on a reference dot pattern reflected on a reference plane and captured by the light receiving optical system. A distance to each segment area is acquired by matching between a dot pattern captured at the time of distance measurement and dots in each segment area. The segment area sizes differ depending on regions of the reference dot pattern.2013-01-03
20130002861CAMERA DISTANCE MEASUREMENT DEVICE - A camera distance measurement device displays an image in which a plurality of graduation lines which are arranged in the form of a grid with respect to a vehicle is superimposed on a camera image which is captured by a camera mounted to the vehicle on a display unit, and estimates a distance in a direction of the width of the vehicle, and a distance in a direction of the capturing by the camera from a unit distance defined for each grid side of the graduation lines.2013-01-03
20130002862MEASURING DEVICE USER EXPERIENCE THROUGH DISPLAY OUTPUTS - Methods and systems may include a high-speed camera to capture a video of a display output, a robotic arm to interact with a device, a processor, and a computer readable storage medium having a set of instructions. If executed by the processor, the instructions cause the system to identify one or more user experience characteristics based on the captured video, and generate a report based on the one or more user experience characteristics. The report may include a perceptional model score that is generated based on the user experience characteristics as well as other parameters. The user experience characteristics could include response time, frame rate and run time characteristics.2013-01-03
20130002863SYSTEM AND METHOD FOR AUTO-COMMISSIONING AN INTELLIGENT VIDEO SYSTEM - An auto-commissioning system provides automatic parameter selection for an intelligent video system based on target video provided by the intelligent video system. The auto-commissioning system extracts visual feature descriptors from the target video and provides the one or more visual feature descriptors associated with the received target video to a parameter database that is comprised of a plurality of entries, each entry including a set of one or more stored visual feature descriptors and associated parameters tailored for the set of stored visual feature descriptors. A search of the parameter database locates one or more best matches between the extracted visual feature descriptors and the stored visual feature descriptors. The parameters associated with the best matches are returned as part of the search and used to commission the intelligent video system.2013-01-03
20130002864QUALITY CHECKING IN VIDEO MONITORING SYSTEM - Performance control of a video monitoring system (2013-01-03
20130002865MODE REMOVAL FOR IMPROVED MULTI-MODAL BACKGROUND SUBTRACTION - A method and system for updating a visual element model of a scene model associated with a scene, the visual element model including a set of mode models for a visual element for a location of the scene. The method receives an incoming visual element of a frame of the image sequence and, for each mode model, classifies the respective mode model as either a matching mode model or a distant mode model, by comparing an appearance of the incoming visual element and a set of visual characteristics of the respective mode model. The method removes a distant mode model from the visual element model, based upon a first temporal characteristic of a matching mode model exceeding a maturity threshold and a second temporal characteristic of the distant mode model being below a stability threshold.2013-01-03
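The removal rule combines the two temporal characteristics named above. In the sketch below, a mode model is assumed to expose `age` (how long it has existed, standing in for the first temporal characteristic) and `hits` (how often it has matched, standing in for the second); the attribute names and thresholds are illustrative, not taken from the application.

```python
def prune_mode_models(mode_models, matched_flags, maturity_threshold, stability_threshold):
    """Drop distant (non-matching) mode models that are still unstable, but only
    once at least one matching mode model has exceeded the maturity threshold."""
    has_mature_match = any(model.age > maturity_threshold
                           for model, matched in zip(mode_models, matched_flags) if matched)
    if not has_mature_match:
        return list(mode_models)
    return [model for model, matched in zip(mode_models, matched_flags)
            if matched or model.hits >= stability_threshold]
```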
20130002866Detection and Tracking of Moving Objects - Techniques for performing visual surveillance of one or more moving objects are provided. The techniques include registering one or more images captured by one or more cameras, wherein registering the one or more images comprises region-based registration of the one or more images in two or more adjacent frames, performing motion segmentation of the one or more images to detect one or more moving objects and one or more background regions in the one or more images, and tracking the one or more moving objects to facilitate visual surveillance of the one or more moving objects.2013-01-03
20130002867System and Method for Providing Wireless Security Surveillance Services Accessible via a Telecommunications Device - A system and method for providing video surveillance may include providing digital television services to a customer via middleware. The middleware may include digital rights management services. Digital surveillance services may be provided to the customer via the middleware. In providing digital surveillance services to the customer, the customer may be enabled to access surveillance equipment via the middleware from a remote location using a telecommunications device, where the telecommunications device is authorized to access the surveillance equipment by the digital rights management services.2013-01-03
20130002868SURVEILLANCE CAMERA TERMINAL - A surveillance camera terminal includes a wireless communication means that performs wireless communication with another terminal located near the subject terminal either directly or via another terminal, an imaging means that images part of a surveillance area assigned to the subject terminal with an imaging visual field positioning the part of the surveillance area, an imaging visual field control means that adjusts the imaging visual field of the imaging means to a desired region in the surveillance area, an object extracting means that extracts an object being imaged by processing a frame image taken by the imaging means, a tracking means that tracks the object extracted by the object extracting means in the surveillance area, a handover means that hands over the object being tracked by the tracking means, and a wireless communication path forming means.2013-01-03
20130002869SURVEILLANCE CAMERA TERMINAL - A surveillance camera terminal includes an imaging means that outputs a frame image, an imaging visual field control means that changes the imaging visual field of the imaging means in a surveillance area, an object extracting means that extracts an object being imaged by processing the frame image taken by the imaging means, a tracking means that detects a position of the object in the surveillance area based on a position of the object extracted by the object extracting means on the frame image and the imaging visual field of the imaging means, an imaging visual field storage means that stores an imaging visual field, and detection means that detects whether or not the object being tracked by the tracking means has traveled to a position near a handover area for handover of the object to another surveillance camera terminal2013-01-03
20130002870Method for the Output of Information - A method for the output of information by an output unit in the passenger compartment of a motor vehicle on the basis of capturable personal data. Activation of an approach module is followed by initiation of person recognition, in which personal data from a person who is outside of the motor vehicle are captured by an ambient sensor system and are compared with stored personal data. If the captured personal data match stored personal data then output of information relating to the person is initiated by the output unit.2013-01-03
20130002871Vehicle Vision System - In order to determine extrinsic parameters of a vehicle vision system or an aspect of a vehicle vision system, road lane markings may be identified in an image captured by a camera and provided to the vehicle vision system. For one or more of the identified road lane markings, a first set of parameters defining an orientation and a position of a line along which a road lane marking extends in an image plane may be determined. Also, a second set of parameters defining an orientation and a position of a line along which a road lane marking extends in a road plane may be determined. A linear transformation that defines a mapping between the first set of parameters and the second set of parameters may be identified. The extrinsic parameters may be established based on the identified linear transformation.2013-01-03
20130002872ADAPTABLE NIGHT VISION SYSTEM - An adaptable night vision system comprises a series of individual night vision function providing modules that are adapted to selectively releasably interface with others in the system so that, when mated, each combination provides a product whose features are not available with the individual modules themselves. Such modules comprise a combined camera and illumination module, separate illumination and camera modules, a display module, auxiliary optics module for observing display images when close to the naked eye, and communication link modules. The modules are adapted with common interfaces to easily combine with base units to form a head mount night vision device, a hand held night vision device, a remote camera device, a toy vehicle, and a hand held vehicle controller.2013-01-03
20130002873IMAGING SYSTEM FOR VEHICLE - An imaging or vision system for a vehicle includes an imaging sensor disposed at the vehicle and having an imaging array of photosensing pixels. A first optical element is disposed at a first portion of the imaging array and has a first focal length, and a second optical element is disposed at a second portion of the imaging array and has a second focal length. The first focal length is longer than the second focal length so that the first portion of the imaging array captures focused images of a more remote or distant scene than that of the second portion of the imaging array. The first portion of the imaging array may capture images of a scene occurring forwardly of the vehicle and the second portion of the imaging array may capture images of a surface of the vehicle windshield.2013-01-03
20130002874SURROUNDING AREA MONITORING APPARATUS FOR VEHICLE - A surrounding area monitoring apparatus for vehicle includes an infrared camera mounted on a vehicle for capturing images of surrounding area around the vehicle, a unit for generating and displaying images based on image data captured by the camera, and a controller for calibrating output of the camera with respect to a relation between pixels, based on image data produced by imaging a surface of a shutter that opens and closes an aperture introducing a light to the camera. The controller estimates whether a temperature of the camera is stable, based on an operation state of the vehicle, and determines a possibility that a driver is looking at the unit, based on a behavior of the vehicle. When the temperature of the camera is estimated stable and the possibility that the driver is looking at the unit is determined to be low, the means for calibrating executes the calibration.2013-01-03
20130002875Mobile Studio - A mobile studio for producing video and audio content includes a vehicle having a body mounted on a chassis. The body includes a studio chamber enclosure formed by a floor, a ceiling, spaced opposite outer side walls and spaced outer front and rear walls, and the floor includes a stage area. The mobile studio further includes at least one LED lighting assembly located in the studio chamber enclosure that is suitable for providing sufficient illumination for image capture such that a captured image is suitable for projection as a Pepper's Ghost image, at least one camera to capture an image of a subject on the stage area and generate the captured image, and a communications device to transmit the captured image.2013-01-03
20130002876EXTERIOR MIRROR VISION SYSTEM FOR A VEHICLE - An exterior mirror vision system for a vehicle includes driver-side and passenger-side exterior rearview mirror assemblies including respective driver-side and passenger-side light modules and driver-side and passenger-side cameras. The light modules are sealed so as to be substantially water-impervious, and the light modules provide at least one of (a) a driver-side floodlight and a passenger-side floodlight and (b) a driver-side indicator and a passenger-side indicator. The driver-side camera and the passenger-side camera are part of a multi-camera vision system of the vehicle, and the multi-camera vision system has a display device that is operable to display images for viewing by a driver of the vehicle. The displayed images displayed by the display device include an image portion derived from image data captured by at least one of the driver-side camera and the passenger-side camera.2013-01-03
20130002877IMAGE CONTROL APPARATUS - In order to provide an image control apparatus that allows a driver to easily check the situation of a passenger's getting on/off a vehicle through an opening operation of a door of the vehicle, the apparatus includes a camera which captures an image of the vicinity of the vehicle door, a monitor capable of displaying the image captured by the camera and a door opening detection sensor which detects an open state of the vehicle door. The monitor displays the image captured by the camera when the door opening detection sensor detects an open state of the door.2013-01-03
20130002878SALES DATA PROCESSING APPARATUS, SALES DATA PROCESSING SYSTEM, AND COMPUTER-READABLE STORAGE MEDIUM - A sales data processing apparatus which registers and processes sales data on a transaction-by-transaction basis. The apparatus includes: an imaging module that images manipulation situations successively; a recording module that successively records manipulation images taken by the imaging module; a journal storage module that successively stores and manages manipulation data of respective manipulations as journal data; a reproducing module that performs a reproduction operation of successively reading and displaying the manipulation images recorded by the recording module; a detecting module that detects timing of occurrence of a prescribed caution-needed manipulation by successively referring to contents of the journal storage module in link with the reproduction operation of the reproducing module; and a display control module that displays prescribed alerting data in association with a manipulation image at the timing of occurrence, detected by the detecting module, of the prescribed caution-needed manipulation.2013-01-03
20130002879SYSTEMS AND METHODS FOR TRACKING A COMMODITY - A system for monitoring articles within a monitored area is provided. The system for monitoring articles within a monitored area comprises tags configured to be coupled to articles. The tags have GPS transceivers and RF transmitters therein. The RF transmitters transmit RF data. The RF data transmitted from the RF transmitters uniquely identify the corresponding tag. The system for monitoring articles within a monitored area also includes RF sensors configured to be distributed over a monitored area. The RF sensors receive the RF data transmitted from the RF transmitters. The system for monitoring articles within a monitored area further includes GPS sensors configured to be positioned to cover the monitored area. The GPS sensors and GPS transceivers convey GPS data therebetween. The system for monitoring articles within a monitored area also includes a processor module to identify movement of the articles within the monitored area based on the RF data. The processor module tracks movement of the articles within the monitored area based on the GPS data.2013-01-03
20130002880SYSTEM AND METHOD FOR ASSIGNING CAMERAS AND CODES TO GEOGRAPHIC LOCATIONS AND GENERATING SECURITY ALERTS USING MOBILE PHONES AND OTHER DEVICES - A system and method for generating security alerts for a facility are presented. The system can comprise a server; cameras operable to stream video to the server or another storage location and to a surveillance center of the facility; input devices; a first module operable to assign one or more codes to the facility, associate one or more cameras with each code, and associate response guidelines with each code; and a second module operable to receive one code of the one or more codes from one of the one or more input devices, notify the facility assigned to the one code based on the response guidelines, stream video from the one or more cameras associated with the one code to the surveillance center of the facility, and generate the security alert based on the video stream. In one aspect, a user can send a video feed along with the code.2013-01-03
20130002881SYNTHETIC INFRARED IMAGE INJECTION APPARATUS - A synthetic infrared image injection apparatus for simulating images obtained from an electron optics head and injecting them into a signal processing part so as to evaluate the performance of an infrared image seeker. The synthetic infrared image injection apparatus comprises: an image input module for receiving the images from a channel synthetic image generation apparatus and extracting an effective area from the received images; a first pixel process module for performing an image process so as to reflect elements influenced by model characteristic effects of the electron optics head in the images; a second pixel process module for realizing dead and hot pixels and an automatic gain; an image transmitting and receiving module for outputting the processed images of each module and the real-time images formed by at least one of the modules before they are output; and a system control module for diagnosing and controlling the operations of each module.2013-01-03
20130002882IMAGE-CAPTURING DEVICE - The image-capturing device according to the present invention includes a solid-state imaging element, an infrared LED which emits infrared light, a light-emission controlling unit which causes the infrared LED to emit infrared pulsed light on a per frame time basis, and a signal processing unit which extracts, from the solid-state imaging element, a color visible-light image signal in synchronization with a non-emitting period and an infrared image signal in synchronization with an emitting period of the infrared LED. The solid-state imaging element includes an image-capturing region in which unit-arrays are two-dimensionally arranged, and each of the unit-arrays has a pixel for receiving green visible light and infrared light, a pixel for receiving red visible light and infrared light, a pixel for receiving blue visible light and infrared light, and a pixel for receiving infrared light.2013-01-03
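A small Python sketch, under the assumption that each frame is simply tagged with whether the infrared LED was emitting, of how the signal processing described above could separate the color visible-light and infrared image streams; the names are illustrative only.

    def split_streams(frames):
        """frames: list of (led_emitting, image) pairs, one per frame time."""
        visible, infrared = [], []
        for led_emitting, image in frames:
            # Frames captured during the non-emitting period feed the color
            # visible-light signal; frames captured while the IR LED pulses
            # feed the infrared signal.
            (infrared if led_emitting else visible).append(image)
        return visible, infrared

    frames = [(False, "frame 0"), (True, "frame 1"), (False, "frame 2"), (True, "frame 3")]
    visible, infrared = split_streams(frames)
    print(visible)    # ['frame 0', 'frame 2']
    print(infrared)   # ['frame 1', 'frame 3']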
20130002883DEVICE FOR SENDING IMAGE DATA FROM CAMERA TO CCTV NETWORK - An integrated Internet camera includes an infrared sensor, an optical system with lenses formed from infrared-transmitting material for forming an image on the infrared sensor, an image capturing circuit for capturing digital images from the infrared sensor, and a network interface device connectible to a computer network for transmission of the digital images, as digital image files, across the computer network. A file transfer device of the camera communicates, via the network interface device, with a destination computer at a selected network address on the computer network, and transfers the digital image files to the destination computer according to a predetermined file transfer protocol. A transport control device of the camera controls addressing of the digital image files to the selected network address. The camera self-initiates a connection with the computer network and a connection with the destination computer at the selected network address.2013-01-03
20130002884IMAGING APPARATUS HAVING OBJECT DETECTION FUNCTION AND METHOD FOR CONTROLLING IMAGING APPARATUS - An imaging apparatus having a function of detecting a face from an image signal on which a focusing frame is superimposed is configured so that a face selected as the main object can continue to be selected as the main object even if a focusing frame is displayed superimposed on the face. A central processing unit moves the position of a focusing frame to be displayed by a display unit to a position corresponding to a detected object if the ratio of the size of the detected object to the size of the focusing frame displayed on the display unit is greater than or equal to a predetermined threshold, and does not move the focusing frame to the position corresponding to the detected object if the ratio is less than the threshold.2013-01-03
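A short Python sketch of the threshold rule in the abstract above; the 0.5 threshold value and the tuple-based positions are illustrative assumptions, not values from the application.

    def focusing_frame_position(face_size, face_pos, frame_size, frame_pos, threshold=0.5):
        # Move the focusing frame onto the detected face only when the ratio of the
        # face size to the displayed focusing frame size reaches the threshold.
        if face_size / frame_size >= threshold:
            return face_pos
        return frame_pos

    print(focusing_frame_position(face_size=120, face_pos=(300, 200),
                                  frame_size=200, frame_pos=(100, 100)))   # -> (300, 200)
    print(focusing_frame_position(face_size=40, face_pos=(300, 200),
                                  frame_size=200, frame_pos=(100, 100)))   # -> (100, 100)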
20130002885IMAGE PICK-UP APPARATUS AND TRACKING METHOD THEREFOR - An image pick-up apparatus that makes it possible to appropriately set a holding time, during which there remains a possibility of returning to a state that allows tracking when a subject temporarily cannot be tracked, to improve ease-of-use for the user. A specifying unit specifies a subject included in a captured image. A display unit displays the captured image on a screen and displays identification information showing that the specified subject is being tracked. A tracking unit tracks the subject. A setting unit sets a holding time, during which the display of the identification information is held, according to at least one of a focal length of the image pick-up apparatus and a subject distance. An elimination unit eliminates the display of the identification information when the holding time has passed after the tracking unit loses the subject.2013-01-03
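A Python sketch of the holding-time behaviour described above. The abstract does not specify how the holding time depends on focal length and subject distance, so the rule below is purely an illustrative assumption.

    def holding_time(focal_length_mm, subject_distance_m):
        # Illustrative assumption: with a longer focal length or a nearer subject,
        # a lost subject is less likely to come back into frame, so the tracking
        # indicator is held for a shorter time.
        return max(0.5, 5.0 * subject_distance_m / focal_length_mm)

    def indicator_shown(seconds_since_lost, focal_length_mm, subject_distance_m):
        # Keep the identification display until the holding time has passed.
        return seconds_since_lost < holding_time(focal_length_mm, subject_distance_m)

    print(indicator_shown(1.0, focal_length_mm=50, subject_distance_m=20))   # True
    print(indicator_shown(3.0, focal_length_mm=200, subject_distance_m=2))   # False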
20130002886APPARATUS AND METHOD FOR INVERSE TELECINE WITH LOCAL VIDEO DE-INTERLACING - The present invention relates to systems and methods for inverse telecine or video de-interlacing for picture quality improvement on set-top-box and TV products. The system comprises a film mode detector at the picture or sequence level, a global mixed video and film content detector at the region, picture, or sequence level on top of the detected film content, and a local video content detector at the pixel level on top of the detected mixed video and film content. Inverse telecine processing is applied to detected film content, fading in with locally de-interlaced video content. The invention further provides an apparatus and method for globally detecting mixed video and film content at the region, picture, or sequence level. Such apparatus and method comprise a plurality of detectors for robustness and increased detection accuracy.2013-01-03
20130002887System And Method For Automated Set-Top Box Testing Via Configurable Event Time Measurements - The present application provides a user-configurable test system for the automatic detection of events on a set-top box (STB) and the determination of their timing. The system relies upon performing metric calculations on the A/V output of the STB and measuring the duration of an event by reference to a set of user-defined metrics satisfying a set of user-defined conditions. The system is particularly suited to zap time measurement.2013-01-03
20130002888APPARATUS FOR EDITING MULTIMEDIA INFORMATION AND METHOD THEREOF - Disclosed are an apparatus for editing multimedia information and a method thereof. The apparatus for editing multimedia information in accordance with the exemplary embodiment of the present invention includes: a display unit that displays a document input window for preparing a document including the multimedia information; a control unit that sets an insertion area for inputting images or moving pictures within the document input window according to a request of a user; and a picture photographing unit, driven when the insertion area is set, that photographs the images or moving pictures to be input to the insertion area, wherein the control unit inputs the photographed images or moving pictures to the insertion area within the document input window.2013-01-03
20130002889Method and System for Managing The Lifecycles of Media Assets - There is provided a method for managing the lifecycles of one or more media assets. The method comprises importing the one or more media assets into a system for managing their lifecycles, determining one or more metadata tags for association with the media assets by evaluating them with one or more tagging filters, associating the determined metadata tags with the media assets, and grouping the media assets according to the associated metadata tags by evaluating those tags with one or more grouping filters to generate one or more media asset groups.2013-01-03
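A compact Python sketch of the import-tag-group flow described above; the particular tagging and grouping filters (based on clip duration) are invented for illustration and are not taken from the application.

    from collections import defaultdict

    def tag_assets(assets, tagging_filters):
        # Evaluate each asset against the tagging filters and attach matching tags.
        for asset in assets:
            asset["tags"] = {tag for tag, test in tagging_filters.items() if test(asset)}
        return assets

    def group_assets(assets, grouping_filter):
        # Evaluate the attached tags with a grouping filter to form asset groups.
        groups = defaultdict(list)
        for asset in assets:
            groups[grouping_filter(asset["tags"])].append(asset["name"])
        return dict(groups)

    assets = [{"name": "clip_a.mov", "duration": 30}, {"name": "clip_b.mov", "duration": 3600}]
    tagging_filters = {"short-form": lambda a: a["duration"] < 300,
                       "long-form": lambda a: a["duration"] >= 300}
    tagged = tag_assets(assets, tagging_filters)
    print(group_assets(tagged, lambda tags: "archive" if "long-form" in tags else "active"))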
20130002890USING METADATA TAGS IN VIDEO RECORDINGS PRODUCED BY PORTABLE TERMINALS - A system and method for facilitating video information search and user navigation through multiple video image or video stream files, as well as for reducing the amount of data transferred between a video content management server and a video playback client, by employing metadata tags inserted into the video recordings automatically or by an operator of an encoded information reading terminal.2013-01-03
20130002891POLICY-BASED CONTROLS FOR WIRELESS CAMERAS - A system and method provide parental and corporate control for a wireless camera. An administration authority web site is provided that enables an owner of a wireless camera to log into a server and establish a set of control policies that specify which actions on the device are authorized for which user. The control policies are downloaded to the wireless camera, and the wireless camera is then operated such that a user of the wireless camera is only able to perform actions authorized by the control policies. In a further embodiment, a control policy is provided that requires the wireless camera to upload captured images to the server for review by the wireless camera owner. In addition, the images may be quarantined until the wireless camera owner authorizes their release to the user.2013-01-03
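A minimal Python sketch of the policy check described above, assuming the control policies have already been downloaded to the camera; the policy contents, user names, and action names are all hypothetical.

    control_policies = {            # downloaded from the administration authority web site
        "child": {"capture_image": True, "share_image": False},
        "owner": {"capture_image": True, "share_image": True},
    }

    def is_authorized(user, action):
        # Anything not explicitly granted by a control policy is denied.
        return control_policies.get(user, {}).get(action, False)

    def perform(user, action):
        if not is_authorized(user, action):
            return f"{action} blocked for {user} by policy"
        return f"{action} performed for {user}"

    print(perform("child", "share_image"))   # blocked
    print(perform("owner", "share_image"))   # performed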
20130002892IMAGING SYSTEM, IMAGING INSTRUCTION ISSUING APPARATUS, IMAGING APPARATUS, AND IMAGING METHOD - An imaging system includes an imaging instruction issuing apparatus and one or more imaging apparatuses. The imaging instruction issuing apparatus includes a generation unit that generates notification data including identification information unique to each imaging instruction issuing apparatus and an imaging request signal, and a first communication unit that transmits and outputs the notification data to each imaging apparatus. Each imaging apparatus includes an imaging unit that obtains captured image data of a subject, a saving unit that saves the captured image data obtained by the imaging unit, a second communication unit that receives the notification data from the imaging instruction issuing apparatus, and a control unit that causes the saving unit to save the captured image data and the identification information included in the notification data in an associated manner in a case where the notification data received by the second communication unit includes the imaging request signal.2013-01-03
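A Python sketch of the notification and capture flow described above; the dict-based notification data and the list used as the saving unit are illustrative stand-ins, not structures from the application.

    def make_notification(issuer_id, imaging_request):
        # Notification data: identification information unique to the issuing
        # apparatus plus an imaging request signal.
        return {"issuer_id": issuer_id, "imaging_request": imaging_request}

    class ImagingApparatus:
        def __init__(self):
            self.saved = []                      # stand-in for the saving unit

        def capture(self):
            return "captured image data"         # placeholder subject image

        def on_notification(self, notification):
            # When the notification includes an imaging request, save the captured
            # image data and the issuer's identification in an associated manner.
            if notification["imaging_request"]:
                self.saved.append({"image": self.capture(),
                                   "issuer_id": notification["issuer_id"]})

    camera = ImagingApparatus()
    camera.on_notification(make_notification("issuer-42", imaging_request=True))
    print(camera.saved)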
20130002893IMAGING APPARATUS AND IMAGING METHOD - An imaging apparatus includes an optical wavefront coding element that performs a dispersion of a light flux emitted from a subject into three or more directions, an imaging element that receives the light flux dispersed by the optical wavefront coding element to acquire a subject preimage, and a generation unit that generates an image of the subject by applying a process corresponding to the dispersion to the subject preimage.2013-01-03
20130002894STABILIZED CAMERA MODULE - The present invention relates to a stabilized camera module configured to prevent unstable movement of a bobbin caused by external force, the module including a holder formed at a lateral surface with at least one slit, a bobbin movably coupled to an inside of the holder and having a protruder that protrudes outside the holder through the slit, a lens assembly coupled to an inside of the bobbin and including one or more lenses receiving an optical image of an object, and one or more elastic members formed between the holder and the bobbin to prevent the bobbin from moving to one side of the holder.2013-01-03
20130002895ACCELEROMETER REMOTE FOR ARTICULATION OF A VIDEO PROBE - An articulated video probe system and associated method. The system includes an articulated video probe for in-taking and transmitting an image of an area; the articulated video probe is moveable for changing the in-taken image. The system includes a unit for causing the video probe to move; the unit utilizes a received movement command signal to move the articulated video probe. The system includes an accelerometer configured to sense motion and transmit the movement command signal to the unit for causing the video probe to move.2013-01-03
20130002896IMAGE CAPTURING APPARATUS AND CONTROL METHOD THEREOF - An image capturing apparatus capable of shooting a still image and recording a moving image. If the apparatus is set so that moving image recording is not started in response to a shooting instruction, it controls still image shooting to start in response to the shooting instruction; if it is set so that moving image recording is started in response to the shooting instruction, it controls moving image recording to start.2013-01-03
20130002897ACCESSORY, CAMERA, ACCESSORY CONTROL PROGRAM, AND CAMERA CONTROL PROGRAM - An accessory is supplied with power from a camera, and includes an accessory control section that controls a process executed in the accessory in accordance with an image capture mode of the camera.2013-01-03
20130002898ENCODER-SUPERVISED IMAGING FOR VIDEO CAMERAS - A controller controls a camera that produces a sequence of images and that has an output coupled to a video encoder. The camera has an operating condition, including a field of view and lighting, and one or more imaging parameters. The video encoder encodes images from the camera into codewords. The controller receives one or more encoding properties from the video encoder and causes one or more of the imaging parameters to be adjusted based on at least one of the received encoding properties, such that the camera produces additional images of the sequence for the video encoder using the adjusted imaging parameters.2013-01-03
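A Python sketch of the feedback loop described above, in which the controller adjusts an imaging parameter from an encoding property reported back by the encoder; the bits-per-frame property and the gain-reduction rule are illustrative assumptions.

    def adjust_imaging_parameters(params, encoding_props, target_bits_per_frame=50_000):
        adjusted = dict(params)
        # Illustrative rule: if encoded frames come out far larger than the target,
        # assume a noisy image and reduce sensor gain so later frames compress better.
        if encoding_props["bits_per_frame"] > 1.5 * target_bits_per_frame:
            adjusted["analog_gain"] = max(1.0, adjusted["analog_gain"] * 0.8)
        return adjusted

    params = {"analog_gain": 4.0, "exposure_ms": 10.0}
    print(adjust_imaging_parameters(params, {"bits_per_frame": 120_000}))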
20130002899IMAGING DEVICE - Described herein, by way of example, is a digital imaging device comprising a CCD image sensor, a setting dial, a controller, and an image processor. The CCD image sensor captures a subject image and outputs image information. The setting dial accepts an input operation and outputs an operation signal, including one that corresponds to the operation direction of the dial. The controller sets an object region that will be the object of a specific image processing process, so that at least one of a region direction and a region range for setting the object region can be changed according to an operation signal corresponding to the operation direction of the setting dial. The image processor performs diorama filtering on the object region on the basis of the region direction and the region range.2013-01-03
20130002900Object Tracking - In general, the subject matter described in this specification can be embodied in methods, systems, and program products. A computing system accesses an indication of a first template that includes a region of a first image. The region of the first image includes a graphical representation of a face. The computing system receives a second image. The computing system identifies indications of multiple candidate templates. Each respective candidate template from the multiple candidate templates includes a respective candidate region of the second image. The computing system compares at least the first template to each of the multiple candidate templates, to identify a matching template from among the multiple candidate templates that includes a candidate region that matches the region of the first image that includes the graphical representation of the face.2013-01-03
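A toy Python sketch of the template comparison described above; sum-of-absolute-differences over flat pixel lists is used here only as an illustrative similarity measure, not as the comparison the application specifies.

    def sad(a, b):
        # Sum of absolute differences: lower means the regions look more alike.
        return sum(abs(x - y) for x, y in zip(a, b))

    def matching_template(first_template, candidate_templates):
        # Compare the face template from the first image against every candidate
        # region of the second image and keep the closest one.
        return min(candidate_templates, key=lambda c: sad(first_template, c["pixels"]))

    face_template = [10, 12, 200, 198]                        # toy pixel values
    candidates = [{"region": (5, 5),  "pixels": [90, 91, 80, 75]},
                  {"region": (40, 8), "pixels": [11, 13, 199, 197]}]
    print(matching_template(face_template, candidates)["region"])   # -> (40, 8)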
20130002901FINE GRAINED POWER GATING OF CAMERA IMAGE PROCESSING - Methods and apparatus relating to fine grained power gating of camera image processing are described. In an embodiment, an Image Signal Processor (ISP) includes a first partition to receive and store image sensor data in a memory during a first time period. The ISP also includes a second partition to process the stored image sensor data during a second time period that follows the first time period. The second partition is entered into a low power consumption state during the first time period. Other embodiments are also disclosed and claimed.2013-01-03
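A Python sketch of the two-period operation described above; the state names, in-memory buffer, and method names are hypothetical stand-ins for the two ISP partitions.

    class ImageSignalProcessor:
        def __init__(self):
            self.memory = []                            # buffer written by the first partition
            self.processing_partition_power = "gated"   # second partition off in period 1

        def first_period(self, sensor_frames):
            # First partition: receive and store image sensor data while the
            # processing partition stays in a low power consumption state.
            self.memory.extend(sensor_frames)

        def second_period(self):
            # Second partition: power up and process the data buffered earlier.
            self.processing_partition_power = "on"
            return [f"processed({frame})" for frame in self.memory]

    isp = ImageSignalProcessor()
    isp.first_period(["raw_frame_0", "raw_frame_1"])
    print(isp.processing_partition_power)   # still "gated" during capture
    print(isp.second_period())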
20130002902FLARE DETERMINATION APPARATUS, IMAGE PROCESSING APPARATUS, AND STORAGE MEDIUM STORING FLARE DETERMINATION PROGRAM - A flare determination apparatus includes an image input unit that inputs an image, an image correction unit that corrects the image using a correction coefficient set based on a gain value for white balance processing that depends on the light source type, a region detection unit that detects a region included in a predetermined color component range from the image corrected by the image correction unit, and a determination unit that determines whether or not a flare is generated in the region detected by the region detection unit.2013-01-03
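A Python sketch of the flare-determination flow described above; the way the correction coefficient is derived from the white-balance gain, the color component range, and the region-size test are all illustrative assumptions.

    def correct_image(pixels, wb_gain):
        # Illustrative correction: scale the red channel by a coefficient set from
        # the white-balance gain for the current light source type.
        coefficient = 1.0 / wb_gain
        return [(r * coefficient, g, b) for r, g, b in pixels]

    def flare_detected(pixels, wb_gain, min_region_pixels=2):
        corrected = correct_image(pixels, wb_gain)
        # "Flare-colored" here means bright, low-saturation pixels (illustrative range).
        region = [p for p in corrected if min(p) > 200 and max(p) - min(p) < 30]
        return len(region) >= min_region_pixels

    image = [(250, 248, 246), (252, 249, 247), (30, 60, 90)]
    print(flare_detected(image, wb_gain=1.1))   # True for this toy image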
20130002903CAMERA USER INPUT BASED IMAGE VALUE INDEX - In an imaging evaluation method, camera, and system, a scene is imaged with a camera. User inputs to the camera are received concurrently with the imaging. The inputs each define a setting of one of a plurality of operational functions of the camera. The inputs are valued to provide a set of input values. An image value index is calculated using the input values.2013-01-03
20130002904REMOVABLE DATA STORAGE DEVICE WITH INTERFACE TO RECEIVE IMAGE CONTENT FROM A CAMERA - A data storage device includes an interface removably connectable to a camera to receive from the camera image content in a first format, where the image content is received at the interface from the camera when the camera is operating in a print mode. The data storage device also includes a processor coupled to the interface. The processor processes the received image content in the first format to generate converted image content in a second format. The data storage device also includes a non-volatile memory to store the converted image content in the second format. The data storage device also includes a second interface to a second electronic device. The second interface selectively outputs the image content in the second format to the second electronic device. The data storage device emulates a printer via the interface.2013-01-03
20130002905IMAGING APPARATUS - An imaging apparatus includes the following processing units. An imaging device generates first image data according to incident light. A compression unit performs fixed length coding on the first image data to generate first compressed data. A storage unit stores the first compressed data. A de-compression unit de-compresses only first designated data that is a part of the first compressed data so as to generate first partial de-compressed data. A signal processing unit corrects image quality of the first partial de-compressed data to generate first partial corrected image data. A display unit displays the first partial corrected image data.2013-01-03