OBJECT TRACKING

Subclass of: 348 - Television

Patent class list (only non-empty subclasses are listed)

Deeper subclasses:

Class / Patent application number | Description | Number of patent applications / Date published
348170000 | Using tracking gate | 5
20100321505TARGET TRACKING APPARATUS, IMAGE TRACKING APPARATUS, METHODS OF CONTROLLING OPERATION OF SAME, AND DIGITAL CAMERA - A detection area is decided in a case where a target has gone out-of-frame. If a target is being imaged by a camera continuously, it is determined whether the target has gone out-of-frame. If the target has gone out-of-frame, then the magnitude and direction of motion of the camera are detected. If camera motion is large, it can be concluded that the camera user is imaging the target while tracking it. Accordingly, it can be concluded that the target will again be imaged at the center of the imaging zone. An area defined as a region in which the target will be detected is set at the center of the imaging zone. If camera motion is small in a case where the target goes out-of-frame, it can be concluded that the user is waiting for the target to re-enter the imaging zone and therefore the edge of the imaging zone is set as the detection area.12-23-2010
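The decision logic in the abstract above (pick the re-detection area from the camera's own motion once the target leaves the frame) is compact enough to sketch. The following is a hypothetical paraphrase, not the patented implementation; the motion threshold, band width, and function name are invented for illustration.

```python
def choose_detection_area(frame_w, frame_h, camera_motion_px, motion_threshold=20.0):
    """Pick the region(s) where a lost target is expected to reappear.

    If the camera moved a lot, the user is presumably panning after the
    target, so search near the image center; if the camera is nearly
    still, the user is waiting for the target to re-enter, so search
    along the frame edges.  Returns a list of (x, y, w, h) boxes.
    """
    if camera_motion_px >= motion_threshold:
        # Large camera motion: target expected to come back at the center.
        w, h = frame_w // 2, frame_h // 2
        return [(frame_w // 4, frame_h // 4, w, h)]
    # Small camera motion: watch a border band around the frame edge.
    band = frame_w // 10
    return [
        (0, 0, frame_w, band),                # top
        (0, frame_h - band, frame_w, band),   # bottom
        (0, 0, band, frame_h),                # left
        (frame_w - band, 0, band, frame_h),   # right
    ]

print(choose_detection_area(1920, 1080, camera_motion_px=35.0))
print(choose_detection_area(1920, 1080, camera_motion_px=3.0))
```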
20140168448IMAGING DEVICE, ANNOUNCING METHOD, AND RECORDING MEDIUM - An imaging device includes: an area setting unit which sets a first area and a second area other than the first area on the captured image acquired by an imaging unit; a main subject setting unit which sets a main subject based on the captured image acquired by the imaging unit or another image; a subject tracking unit which periodically detects the position of the main subject on the captured image acquired by the imaging unit; and a control unit which performs control depending on a detection result of the subject tracking unit so that an announcement unit announces that the main subject exists only in the first area when the main subject does not exist in the second area, and the announcement unit announces that the main subject does not exist only in the first area when the main subject exists in the second area.06-19-2014
20150077569METHOD AND APPARATUS FOR PERFORMING IMAGE PROCESSING OPERATION BASED ON FRAME/ALGORITHM SELECTION - One image processing method has at least the following steps: receiving an image input in a device, wherein the image input is composed of at least one source image; receiving image selection information; regarding a source image included in the image input, checking the image selection information to determine whether the source image is selected or skipped; and performing an object oriented image processing operation upon each selected source image. Another image processing method has at least the following steps: receiving an image input in a device, wherein the image input is composed of at least one source image; receiving algorithm selection information; and regarding a source image included in the image input, checking the algorithm selection information to determine a selected image processing algorithm from a plurality of different image processing algorithms, and performing an image processing operation upon the source image based on the selected image processing algorithm.03-19-2015
20160171330VISION BASED REAL-TIME OBJECT TRACKING SYSTEM FOR ROBOTIC GIMBAL CONTROL06-16-2016
20160191871INFORMATION PRESENTATION DEVICE - An object of the present invention is to provide a device capable of presenting image information that is larger than an object by irradiating light onto the object, which moves on a trajectory that is not known in advance. An object tracking section controls the line-of-sight direction so that it is directed towards the moving object. A rendering section irradiates a light beam in a direction along the line-of-sight direction. In this way the rendering section can irradiate the light beam onto the surface of the object. It is possible to present information that has been rendered in a range larger than the surface area of the object to an observer, utilizing an afterimage of the light beam that has been irradiated on the surface of the object.06-30-2016
348172000 | Centroidal tracking | 2
20090015677Beyond Field-of-View Tracked Object Positional Indicators for Television Event Directors and Camera Operators - A system and method for implementing beyond field-of-view tracked object positional indicators for television event directors and camera operators. The present invention includes a camera having a field-of-view. The camera tracks an off-screen object. A coordinate manager blends an on-screen indication of distance that the object is away from said field-of-view. The camera is positioned to avoid the object in the field-of-view.01-15-2009
20110007167High-Update Rate Estimation of Attitude and Angular Rates of a Spacecraft - System and method are provided for estimating attitude and angular rate of a spacecraft with greater accuracy by obtaining star field image data at smaller exposure times and processing the data using algorithms that allow attitude and angular rate to be calculated during the short exposure times.01-13-2011
Entries
Document | Title | Date
20080231709SYSTEM AND METHOD FOR MANAGING THE INTERACTION OF OBJECT DETECTION AND TRACKING SYSTEMS IN VIDEO SURVEILLANCE - A system, method and program product for providing a video surveillance system that enhances object detection by utilizing feedback from a tracking system to an object detection system. A system is provided that includes: a moving object detection system for detecting moving objects in a video input; an object tracking system for tracking a detected moving object in successive time instants; and a tracker feedback system for feeding tracking information from the object tracking system to the moving object detection system to enhance object detection.09-25-2008
20080239081SYSTEM AND METHOD FOR TRACKING AN INPUT DEVICE USING A DISPLAY SCREEN IN CAPTURED FRAMES OF IMAGE DATA - A system and method for tracking an input device uses positional information of a display screen in frames of image data to determine the relative position of the input device with respect to the display screen.10-02-2008
20080259163METHOD AND SYSTEM FOR DISTRIBUTED MULTIPLE TARGET TRACKING - A method and system for distributed tracking of multiple targets is disclosed. Multiple targets to be tracked by a plurality of trackers are detected in a frame. The motion state variable of each of the plurality of trackers is calculated in the E-step of a variational Expectation-Maximization algorithm. Further, the data association variable of each of the plurality of trackers is calculated in the M-step of the algorithm. Depending on the motion state variable and the data association variable, the multiple targets are tracked.10-23-2008
20080278584Moving Object Detection Apparatus And Method By Using Optical Flow Analysis - Disclosed is a moving object detection apparatus and method using optical flow analysis. The apparatus includes four modules of image capturing, image aligning, pixel matching, and moving object detection. Plural images are successively input from a camera. Based on neighboring images, the frame relationship between the neighboring images is estimated. With the frame relationship, a set of warping parameters is further estimated. Based on the warping parameters, the background areas of the neighboring images are aligned to obtain an aligned previous image. After the alignment, a corresponding motion vector for each pixel of the neighboring images is traced. The location of the moving object in the scene can be correctly determined by analyzing all the information generated from the optical flow.11-13-2008
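The pipeline described above (estimate the inter-frame warp, align the background, then look at per-pixel flow) maps naturally onto standard OpenCV calls. The sketch below is an illustration of that general pipeline, not the patented method; feature counts, the Farneback parameters, and the flow threshold are arbitrary assumptions.

```python
import cv2
import numpy as np

def detect_moving_object(prev_gray, curr_gray, flow_thresh=2.0):
    """Align the previous frame to the current one, then flag pixels
    whose residual optical flow is large (independent motion)."""
    # 1. Estimate the frame relationship (a homography) from sparse features.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                  qualityLevel=0.01, minDistance=8)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good_prev = pts[status.ravel() == 1]
    good_next = nxt[status.ravel() == 1]
    H, _ = cv2.findHomography(good_prev, good_next, cv2.RANSAC, 3.0)

    # 2. Warp the previous frame so the background areas line up.
    h, w = curr_gray.shape
    aligned_prev = cv2.warpPerspective(prev_gray, H, (w, h))

    # 3. Dense flow between the aligned previous frame and the current frame.
    flow = cv2.calcOpticalFlowFarneback(aligned_prev, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)

    # 4. Pixels with large residual motion belong to the moving object.
    return magnitude > flow_thresh
```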
20090009606Tracking device, focus adjustment device, image-capturing device, and tracking method - A tracking device includes: a first tracking control unit that tracks an object based upon focus adjustment states that are detected by a focus detection unit in a plurality of focus detection positions; a second tracking control unit that tracks the object based upon image information that is outputted by an image-capturing unit and reference image information that has been set as a reference; a setting unit that sets a degree to which focus adjustment based upon a focus adjustment state detected by the focus detection unit is temporarily prohibited; and a control unit that selects one of the first tracking control unit and the second tracking control unit to be used for tracking the object, based upon the degree that has been set by the setting unit.01-08-2009
20090015676Recognition and Tracking Using Invisible Junctions - The present invention uses invisible junctions which are a set of local features unique to every page of the electronic document to match the captured image to a part of an electronic document. The present invention includes: an image capture device, a feature extraction and recognition system and database. When an electronic document is printed, the feature extraction and recognition system captures an image of the document page. The features in the captured image are then extracted, indexed and stored in the database. Given a query image, usually a small patch of some document page captured by a low resolution image capture device, the features in the query image are extracted and compared against those stored in the database to identify the query image. The present invention also includes methods for recognizing and tracking the viewing region and look at point corresponding to the input query image. This information is combined with a rendering of the original input document to generate a new graphical user interface to the user. This user interface can be displayed on a conventional browser or even on the display of an image capture device.01-15-2009
20090027501DETECTING AN OBJECT IN AN IMAGE USING CAMERA REGISTRATION DATA INDEXED TO LOCATION OR CAMERA SENSORS - An object is detected in images of a live event by storing and indexing camera registration-related data from previous images. For example, the object may be a vehicle which repeatedly traverses a course. A first set of images of the live event is captured when the object is at different locations in the live event. The camera registration-related data for each image is obtained and stored. When the object again traverses the course, for each location, the stored camera registration-related data which is indexed to the location can be retrieved for use in estimating a position of a representation of the object in a current image, such as by defining a search area in the image. An actual position of the object in the image is determined, in response to which the camera registration-related data may be updated, such as for use in a subsequent traversal of the course.01-29-2009
20090027502Portable Apparatuses Having Devices for Tracking Object's Head, and Methods of Tracking Object's Head in Portable Apparatus - The portable apparatus includes a camera section, a head tracking section, an image processor, a video codec section and a camera controller. The camera section obtains an image of an object. The head tracking section receives the image from the camera section, detects a head area from the image, models the head area as an ellipse, and calculates a shape similarity, which represents the similarity between the shape formed by the gradients of pixels at the boundary of the ellipse and the shape of the ellipse itself, and a color histogram similarity between the internal area of a candidate figure and the internal area of the modeling figure. To obtain the position of the candidate ellipse whose color histogram similarity has the maximum value, mean shift is used, which requires only a small amount of calculation with respect to a first number of samples in the internal area of the candidate ellipse. The image processing section performs image processing on the image based on quality information of the detected head area. The video codec section performs differential encoding on the detected head area based on its location. The camera controller controls rotation of the camera section on the basis of the location of the detected head area. Because a robust head tracking algorithm with a small amount of calculation is adapted to the portable device, the user's head area can be tracked appropriately on the portable device.01-29-2009
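One concrete piece of the pipeline above, the color-histogram similarity between a candidate region and the model region, is easy to sketch. The version below uses a Bhattacharyya coefficient over hue histograms as an illustrative stand-in; the abstract does not specify this exact measure, and the bin count and toy data are assumptions.

```python
import numpy as np

def hue_histogram(region_hue, bins=32):
    """Normalized hue histogram of a region (hue in 0..179, OpenCV convention)."""
    hist, _ = np.histogram(region_hue, bins=bins, range=(0, 180))
    return hist / max(hist.sum(), 1)

def color_histogram_similarity(candidate_hue, model_hue, bins=32):
    """Bhattacharyya coefficient between two hue histograms (1.0 = identical)."""
    p = hue_histogram(candidate_hue, bins)
    q = hue_histogram(model_hue, bins)
    return float(np.sum(np.sqrt(p * q)))

# Toy usage: two random "skin-like" hue patches, one for the model ellipse
# interior and one for a candidate ellipse interior.
rng = np.random.default_rng(0)
model = rng.integers(5, 25, size=(40, 30))
candidate = rng.integers(5, 25, size=(42, 28))
print(round(color_histogram_similarity(candidate, model), 3))
```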
20090059008DATA PROCESSING APPARATUS, DATA PROCESSING METHOD, AND DATA PROCESSING PROGRAM - A data processing apparatus includes: a detection section detecting an image of an object from moving image data; a table creation section recording position information indicating a position on the moving image data in a table on the basis of a detection result by the detection section; a dubbing processing section performing dubbing processing on the moving image data; and a control section controlling the dubbing processing section so as to extract a portion of the moving image data recorded on a first recording medium on the basis of the position information recorded in the table, and to perform the dubbing processing on the extracted portion onto a second recording medium.03-05-2009
20090059009OBJECT TRACKABILITY VIA PARAMETRIC CAMERA TUNING - A method and apparatus are described for improving object trackability via parametric camera tuning. In one embodiment, a determination is made as to whether the camera settings loaded cause saturation of a video image and hue differences between objects and between the objects and a background of the video image. If the saturation and hue differences do not exceed the threshold, a search of camera settings is performed to increase saturation and hue differences between objects and between the objects and a background of the video image.03-05-2009
20090079833TECHNIQUE FOR ALLOWING THE MODIFICATION OF THE AUDIO CHARACTERISTICS OF ITEMS APPEARING IN AN INTERACTIVE VIDEO USING RFID TAGS - The present solution can include a method for allowing the selective modification of audio characteristics of items appearing in a video. In this method, a RFID tag can be loaded with audio characteristics specific to a sound-producing element. The RFID tag can then be attached to an item that corresponds to the sound-producing element. The video and audio of the area including the item can be recorded. The audio characteristics can be automatically obtained by scanning the RFID tag. The audio characteristics can then be embedded within the video so that the audio characteristics are available when the item appears in the video.03-26-2009
20090079834CAMERA - A camera of the present invention is provided with a gapless prism which obtains the wavelength spectrum of an optical image of a target object, an IR-cut filter which can be inserted into and retracted from an optical path, and a Bch image sensor, a Gch image sensor, and an Rch+IR image sensor which are arranged for the respective spectra separated by the gapless prism. An infrared ray separator separates the infrared-ray-mixed image data output from the Rch+IR image sensor into a specific spectrum signal and an infrared ray signal when the IR-cut filter is retracted from the optical path.03-26-2009
20090086027Method And System For Providing Images And Graphics - A system detects a subject(s) or recipient(s), or an object(s) associated therewith, and sends a communication, for example, a message, to the subject, or object associated therewith, collectively, the “subject.” The communication is projected as an image proximate to the subject, and is projected to the subject when stationary or in motion, and in some cases, the communication is selected based on the direction of travel of the subject.04-02-2009
20090096871TRACKING DEVICE, TRACKING METHOD, TRACKING DEVICE CONTROL PROGRAM, AND COMPUTER-READABLE RECORDING MEDIUM - A tracking device includes feature information detection means for detecting feature information from a photographic image and tracking object matching means for comparing the feature information with tracking object information in which feature information of a plurality of figures are registered so that the feature information corresponds to a priority indicating tracking order of the feature information and for determining whether or not the feature information is information of the tracking object. The tracking device also includes priority acquisition means for acquiring the priority of the feature information detected from the tracking object information where it is determined that the feature information detected is the information of the tracking object and control means for controlling the photographing section, based on the priority acquired, so as to continuously include, in the photographic image from which the feature information is detected, feature information that has a highest priority in the photographic image.04-16-2009
20090115850MOBILE OBJECT IMAGE TRACKING APPARATUS AND METHOD - A mobile object image tracking apparatus includes at least one unit rotating about at least one axis, a camera sensor photographing a mobile object to acquire image data, a unit detecting a tracking error as a tracking error detection value, a unit detecting an angle of the rotary unit, a unit estimating the tracking error as a tracking error estimation value, a unit selecting the tracking error detection value when the mobile object falls within the field of view, and selecting the tracking error estimation value when the mobile object falls outside the field of view, a unit computing an angular velocity instruction value used to drive the rotating unit to track the mobile object, a unit detecting an angular velocity of the rotary unit, and a unit controlling the rotating unit to make zero a difference between the angular velocity instruction value and the angular velocity.05-07-2009
20090122146METHOD AND APPARATUS FOR TRACKING THREE-DIMENSIONAL MOVEMENTS OF AN OBJECT USING A DEPTH SENSING CAMERA - A controller (05-14-2009
20090128632CAMERA AND IMAGE PROCESSOR - A video signal produced through a shooting operation of a camera is sent to a motion detecting unit which detects a motion included in the video signal to set an area of the motion as a space motion area and inputs information of the area to a distance determination circuit which calculates distance between the motion area and the camera by use of a parallax signal produced from stereo cameras disposed in the camera to send information of the space motion area and the distance to a mask determination circuit. The mask determination circuit conducts a comparison between the space motion area information and the space mask area information and between the distance information of the motion area and that of the space mask area, and resultantly compares three-dimensional positions between the detected motion and the mask to determine a position relationship therebetween.05-21-2009
20090135254IMAGE CAPTURING DEVICE AND METHOD FOR TRACKING OBJECT OF SAME - An image capturing device includes a main body with an image capturing unit received therein, a base, a pan and tilt mechanism for rotating the main body relative to the base, a viewfinder display window, and an object tracking system. The object tracking system includes an object obtaining unit, a storing unit, a detecting unit, a calculating unit, and a driving unit. The object obtaining unit is used for selecting and obtaining an object for tracking in the viewfinder display window. The storing unit is used for storing the information of the object. The detecting unit is used for detecting the position of the object image displayed in the viewfinder display window. The calculating unit is used for calculating an amount and direction of rotation for the main body. The driving unit is used for driving the pan and tilt mechanism to rotate the main body.05-28-2009
20090153666TRACKING DEVICE, AUTOMATIC FOCUSING DEVICE, AND CAMERA - A tracking device includes: a light measurement device for light measuring an object; a focus detection device for performing focus detection of the object by an optical system; and a tracking control device for tracking the object based on light measurement information from the light measurement device and focus detection information from the focus detection device corresponding to the light measurement information.06-18-2009
20090153667Surveying instrument - The present invention provides a surveying instrument provided with a tracking function, said surveying instrument comprising a first image pickup means 06-18-2009
20090160942Tagging And Path Reconstruction Method Utilizing Unique Identification And The System Thereof - Disclosed is a tagging and path reconstruction method utilizing unique identification characteristics and the system thereof. The tagging and path reconstruction system comprises a plurality of readers for reading identification information having unique IDs, a plurality of cameras for taking object's image data, and a server. The server includes an identifying and tagging module, an interlaced fusion and identification module, and a path reconstruction module. The identifying and tagging module identifies and tags the object image data with unique IDs. The interlaced fusion and identification module filters, checks and updates the tagged object image data. The path reconstruction module recovers the tagged object image data, lets them regress to their original identity data, and reconstructs the motion path of each object.06-25-2009
20090167867CAMERA CONTROL SYSTEM CAPABLE OF POSITIONING AND TRACKING OBJECT IN SPACE AND METHOD THEREOF - A space position device capable of generating position signals according to its position in space is used in the camera control system for tracking an object. The space position device generates and transmits its position signals to a control unit every predetermined time interval. The control unit then generates control command for controlling a camera to rotate upward/downward, leftward/rightward, zoom in or zoom out according to the generated position signals such that the camera adjusts its focus on the space position device for tracking an object automatically.07-02-2009
20090195656INTERACTIVE TRANSCRIPTION SYSTEM AND METHOD - A method and system which seamlessly combines natural way of handwriting (real world) with interactive digital media and technologies (virtual world) for providing a mixed or augmented reality perception to the user is disclosed.08-06-2009
20090213222SYSTEM FOR TRACKING A MOVING OBJECT, BY USING PARTICLE FILTERING - In a tracking system for tracking a moving object by using a particle filter, the particle filter is configured to arrange particles initially, in a standby state, in a given background region provided in the screen of a camera and to rearrange the particles with respect to the moving object in accordance with a change in likelihood that the object has with respect to the particles.08-27-2009
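A minimal particle-filter loop of the kind the abstract above alludes to (spread particles over a standby region, weight them by likelihood, resample so they concentrate on the moving object) is sketched below under simplifying assumptions: a 1-D state, a Gaussian likelihood around a measured position, and names invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def particle_filter_step(particles, measurement, motion_std=2.0, meas_std=3.0):
    """One predict/weight/resample cycle for a 1-D position tracker."""
    # Predict: diffuse particles with a simple random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # Weight: likelihood of each particle given the measurement.
    weights = np.exp(-0.5 * ((particles - measurement) / meas_std) ** 2)
    weights /= weights.sum()
    # Resample: particles are rearranged in proportion to their likelihood,
    # so they gather around the moving object.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# Standby state: particles spread over a background region [0, 100).
particles = rng.uniform(0.0, 100.0, size=500)

for measurement in [60.0, 62.0, 65.0]:   # object observed moving to the right
    particles = particle_filter_step(particles, measurement)

print("estimated position:", round(float(particles.mean()), 1))
```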
20090231436Method and apparatus for tracking with identification - Combining and fusing the tracking of people and objects with image processing and the identification of the people and objects being tracked. Also, conditions of a person, object, area or facility can be detected, evaluated and monitored.09-17-2009
20090262197MOVING OBJECT IMAGE TRACKING APPARATUS AND METHOD - An apparatus includes a computation unit computing a moving velocity of a moving object (MO) by differentiation on a first angle of a first-rotation unit and a second angle of a second-rotation unit, a setting unit setting a first-angular velocity of the first-rotation unit and a second-angular velocity of the second-rotation unit as angular-velocity-instruction values when the MO falls outside a correction range, and setting the second-angular velocity and a third-angular velocity as the angular-velocity-instruction values when the MO falls within the correction range, a detection unit detecting a fourth-angular velocity and a fifth-angular velocity of the first-rotation unit and the second-rotation unit, and a control unit controlling a driving unit to eliminate a difference between the fourth-angular velocity and an angular velocity corresponding to the first-rotation unit, and controlling the driving unit to eliminate a difference between the fifth-angular velocity and an angular velocity corresponding to the second-rotation unit.10-22-2009
20090268033Method for estimating connection relation among wide-area distributed camera and program for estimating connection relation - An estimating method and program for estimating the connection relation among distributed cameras, so that the estimated relation can be used for monitoring and tracking many objects in a wide area. A feature of the invention is that no object association among cameras by recognition of camera images is needed. Each camera independently detects and tracks objects entering/exiting its observation image, and the image coordinates and times of the moments when each object is first and last detected in the image are acquired. All the data acquired by each camera is tentatively associated with all the data acquired by all the cameras before that detection, and the associated items are counted for each elapsed time. Because the elapsed time of data correctly associated with the movement of the same object has a significant peak in the histogram relating elapsed time to the number of observations, the connection relation among the fields of view of the cameras (presence/absence of overlap between fields of view, image coordinates at which entrance/exit occurs, elapsed time, and pass probability) is acquired according to the peak detection result.10-29-2009
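The core statistic above, a histogram of elapsed times between an object disappearing from one camera and any object appearing at another, with a peak indicating a true link, can be sketched in a few lines. This is a hedged illustration only; the bin width, the peak criterion, and the toy traffic model are invented.

```python
import numpy as np

def link_histogram(exit_times_a, entry_times_b, max_gap=30.0, bin_width=1.0):
    """Histogram of elapsed times between every exit at camera A and every
    later entry at camera B (the tentative all-pairs association)."""
    gaps = [t_in - t_out
            for t_out in exit_times_a
            for t_in in entry_times_b
            if 0.0 < t_in - t_out <= max_gap]
    bins = np.arange(0.0, max_gap + bin_width, bin_width)
    counts, edges = np.histogram(gaps, bins=bins)
    return counts, edges

def has_link(counts, peak_factor=3.0):
    """Declare a link if some bin stands well above the random-association floor."""
    if counts.sum() == 0:
        return False, None
    background = np.median(counts) + 1e-9
    peak_bin = int(np.argmax(counts))
    return counts[peak_bin] > peak_factor * background, peak_bin

# Toy data: true transit time A -> B is about 5.5 s, plus unrelated traffic.
rng = np.random.default_rng(1)
exits_a = np.cumsum(rng.uniform(5, 20, 100))
entries_b = np.concatenate([exits_a + rng.normal(5.5, 0.3, 100),   # same objects
                            rng.uniform(0, exits_a[-1], 100)])     # clutter
counts, edges = link_histogram(exits_a, entries_b)
linked, peak = has_link(counts)
print("linked:", linked, "typical transit around", edges[peak], "s")
```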
20090278936METHOD FOR AUTOSTEREOSCOPICALLY PRODUCING THREE-DIMENSIONAL IMAGE INFORMATION FROM SCANNED SUB-PIXEL EXTRACTS AND DEVICE FOR CARRYING OUT SAID METHOD - A method for autostereoscopically producing three-dimensional image information from scanned subpixel extracts uses a multiplex track method (MTV) having a separating raster (TR) obliquely extended with respect to a matrix screen (MB) and an electronic tracking (TS) of viewing areas based on two separated image views (L, R), that adjacently disposes two or three subpixels (SP) of each pixel (P) of the two image views (L, R) in the actual subpixel extraction (SPA), continuously and alternatingly preserving each subpixel address and disposes said subpixels (SP) in an overlapping manner on each other with an offset, whereby the resolution loss affects the subpixels (SP) only. The crosstalk resulting from the inclination of the separating raster (TR) is reduced by a special structure of the subpixel extraction (SPA), wherein the resolution homogenisation in two directions of the screen is simultaneously preserved. The formation of the actual subpixel extraction (SPA) is carried out according to multiplex schemes (MUX11-12-2009
20090278937Video data processing - An embodiment of the present invention relates to systems and methods for dynamically detecting and visualizing actions and/or events in video data streams. In one embodiment, a method involves dynamically detecting and extracting objects and attributes relating to the objects from a video data stream by using action recognition filtering for attribute detection and time series analysis for relation detection among the extracted objects. In addition, the method may involve dynamically generating a multi-field video visualization along a time axis by depicting the video data stream as a series of frames at a relatively sparse or dense interval, and by continuously rendering the attributes relating to the objects with substantially continuous abstract illustrations. Finally, a method may also involve dynamically combining detection, and extraction of objects and combining with multi-field visualization in a video perpetuo gram (VPG), which may show a video stream in parallel, and which allows for real-time display and interaction.11-12-2009
20090278938Cognitive Change Detection System - A method of detecting a changed condition within a geographical space from a moving vehicle. Images of that geographic space are memorialized in conjunction with GPS coordinates together with its GPS coordinates. The same geographic space is traversed from the moving vehicle while accessing the route's GPS coordinates. The memorialized images are played back by coordinating the GPS data on a memorialized images with that of the traversed geographic space such that the memorialized images are viewed simultaneously with the geographic space being traversed. An observer traveling within the moving vehicle can compare the memorialized images with those being traversed in order to identify changed conditions.11-12-2009
20090295926IMAGE PICKUP APPARATUS - An image pickup apparatus includes a first detection unit configured to detect movement information of an object which is a tracking target from a movie generated at an image pickup element; a second detection unit configured to detect movement information of an apparatus main body; and a determination unit configured to determine that tracking cannot be continued when the difference between a motion vector of the apparatus main body and a motion vector of the object is larger than a certain threshold.12-03-2009
20090315996Video tracking systems and methods employing cognitive vision - Video tracking systems and methods include a peripheral master tracking process integrated with one or more tunnel tracking processes. The video tracking systems and methods utilize video data to detect and/or track separately several stationary or moving objects in a manner of tunnel vision. The video tracking system includes a master peripheral tracker for monitoring a scene and detecting an object, and a first tunnel tracker initiated by the master peripheral tracker, wherein the first tunnel tracker is dedicated to track one detected object.12-24-2009
20090315997IMAGE PROCESSING METHOD, IMAGING APPARATUS, AND STORAGE MEDIUM STORING CONTROL PROGRAM OF IMAGE PROCESSING METHOD EXECUTABLE BY COMPUTER - An image processing method is provided for detecting the position of a specific subject from a movie and combining a display of detection result indicating the detected position with the movie. The image processing method includes a step of determining, depending on a display time of the detection result, whether the detection result should be continuously displayed, when the subject cannot be detected during the display of the detection result combined with the movie.12-24-2009
20090322885IMAGE PROCESSING METHOD, IMAGING APPARATUS, AND STORAGE MEDIUM STORING CONTROL PROGRAM OF IMAGE PROCESSING METHOD EXECUTABLE BY COMPUTER - An image processing method is provided for detecting the position of a specific subject from a movie and combining a display of detection result indicating the detected position with the movie. The image processing method includes a step of determining, depending on a display time of the detection result, whether the detection result should be continuously displayed, when the subject cannot be detected during the display of the detection result combined with the movie.12-31-2009
20100002083MOVING OBJECT AUTOMATIC TRACKING APPARATUS - Even when a moving object intrudes into an area other than an initially registered preset position, movement of the moving object can be detected and tracked.01-07-2010
20100007740Statistical modeling and performance characterization of a real-time dual camera surveillance system - The present invention relates to a method for visually detecting and tracking an object through a space. The method chooses modules for a restricting a search function within the space to regions with a high probability of significant change, the search function operating on images supplied by a camera. The method also derives statistical models for errors, including quantifying an indexing step performed by an indexing module, and tuning system parameters. Further the method applies a likelihood model for candidate hypothesis evaluation and object parameters estimation for locating the object.01-14-2010
20100013935MULTIPLE TARGET TRACKING SYSTEM INCORPORATING MERGE, SPLIT AND REACQUISITION HYPOTHESES - A tracking system having a video detector for associating observations of blobs and objects and deriving objects' or blobs' paths. Hypotheses may be computed by the system for merging, splitting and reacquisition of the observations. There may be objects tracked among the observations, and best paths selected as trajectories of corresponding objects. The observations may be placed in a sliding window containing a series of observations inferred from a collection of frames for improving the accuracy of the tracking (or data association). The processed observations and data may be represented graphically.01-21-2010
20100033579Image Shooting Device And Image Playback Device - An image shooting device includes: an image sensing device that, by sequential shooting, outputs a signal representing a series of shot images; a tracking processing portion that, based on the output signal of the image sensing device, sequentially detects the position of a tracking target on the series of shot images and thereby tracks the tracking target on the series of shot images; a clipping processing portion that, for each shot image, based on the detected position, sets a clipping region in the shot image and extracts the image within the clipping region as a clipped image or outputs clipping information indicating the position and extent of the clipping region; and a tracking evaluation portion that, based on the output signal of the image sensing device, evaluates the degree of reliability or ease of tracking by the tracking processing portion. The clipping processing portion varies the extent of the clipping region in accordance with the evaluated degree.02-11-2010
20100045799Classifying an Object in a Video Frame - In a digital video surveillance system, a number of processing stages are employed to identify foreground regions representing moving objects in a video sequence. An object tracking stage (02-25-2010
20100045800Method and Device for Controlling Auto Focusing of a Video Camera by Tracking a Region-of-Interest - The invention concerns an electronic device equipped with a video imaging process capability, which device includes a camera unit arranged to produce image frames from an imaging view which includes a region-of-interest ROI, an adjustable optics arranged in connection with the camera unit in order to focus the ROI on the camera unit, an identifier unit in order to identify a ROI from the image frame, a tracking unit in order to track the ROI from the image frames during the video imaging process and an auto-focus unit arranged to analyze the ROI on the basis of the tracking results provided by the tracking unit in order to adjust the optics. The device is arranged to determine the spatial position of the ROI in the produced image frame without any estimation measures.02-25-2010
20100053333METHOD FOR DETECTING A MOVING OBJECT IN A SEQUENCE OF IMAGES CAPTURED BY A MOVING CAMERA, COMPUTER SYSTEM AND COMPUTER PROGRAM PRODUCT - The invention relates to a method for detecting a moving object in a sequence of images captured by a moving camera. The method comprises the step of constructing a multiple number of difference images by subtracting image values in corresponding pixels of multiple pairs of images being based on captured images. Further, the method comprises the step of retrieving a moving object by extracting spatial information of pixels in the multiple number of constructed difference images having relatively large image values. In addition, from a pair of images in the construction step an image is a representation of a high resolution image having a higher spatial resolution than original captured images on which the high resolution image is based.03-04-2010
20100073484Display control apparatus, display control method, and program - A display control apparatus includes a receiving unit that receives a television broadcast signal containing at least remote broadcast image information, a display unit that displays image information contained in the television broadcast signal, a player information acquiring unit that acquires, from the remote broadcast image information, information regarding players in a sports game included in a broadcast image signal, a field information acquiring unit that acquires field information from the remote broadcast image information, a player position information acquiring unit that acquires player position information from the image signal using the player information and the field information, a player information providing unit that provides the acquired player information by displaying the player information on the display unit, and a cursor control function unit that sets, using the player position information, a cursor on one of the players selected using the provided player information and displayed on the display unit.03-25-2010
20100091110SINGLE CAMERA TRACKER - A camera tracker, in which an image captured by a camera oriented to capture images across a surface is accessed. A region in which an object detected within the accessed image is positioned is determined from among multiple defined regions within a field of view of the camera. User input is determined based on the determined region and an application is controlled based on the determined user input.04-15-2010
20100097474SMOKE DETECTING METHOD AND SYSTEM - A smoke detecting method and system are provided. The smoke detecting method and system capture a plurality of images; determine whether a moving object exists in the plurality of images; select the images having the moving object to be analyzed; analyze whether the moving object is moving toward a specific direction and a displacement of a base point of the moving object; and determine the moving object as a smoke when the moving object is moving toward the specific direction and the displacement is less than a threshold value.04-22-2010
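The decision rule in the abstract above (the candidate is smoke if it drifts in a specific direction while its base point barely moves) is small enough to sketch. Below is a hypothetical rendering; the direction test, the thresholds, and the function name are assumptions.

```python
import math

def is_smoke(centroid_track, base_point_track, expected_dir=(0.0, -1.0),
             direction_cos_min=0.7, base_displacement_max=5.0):
    """Return True if the tracked region behaves like smoke:
    its centroid drifts roughly along `expected_dir` (e.g. upward, which is
    (0, -1) in image coordinates) while its base point stays nearly fixed."""
    (x0, y0), (x1, y1) = centroid_track[0], centroid_track[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy)
    if norm == 0:
        return False
    # Does the overall motion point in the expected direction?
    cos_sim = (dx * expected_dir[0] + dy * expected_dir[1]) / norm
    # Has the base point (where the plume is rooted) stayed put?
    (bx0, by0), (bx1, by1) = base_point_track[0], base_point_track[-1]
    base_disp = math.hypot(bx1 - bx0, by1 - by0)
    return cos_sim >= direction_cos_min and base_disp < base_displacement_max

# A plume rising from a fixed base vs. a person walking across the frame.
print(is_smoke([(100, 200), (102, 150)], [(100, 205), (101, 204)]))  # True
print(is_smoke([(100, 200), (160, 198)], [(100, 205), (160, 203)]))  # False
```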
20100097475INTER-CAMERA LINK RELATION INFORMATION GENERATING APPARATUS - The apparatus comprises a feature quantity extraction part for extracting a feature quantity of a subject from video captured by plural cameras, an In/Out point extraction part for extracting In/Out points indicating points in which a subject appears and disappears in each video captured, an In/Out region formation part for forming In/Out regions based on the In/Out points extracted, a correlation value calculation part for calculating a correlation value by obtaining the total sum of similarities every feature quantity of the subject in each of the plural combinations of In/Out points included in the In/Out regions, a frequency histogram creation part for creating a frequency histogram based on the correlation value, and a link relation information generation part for extracting a peak of the frequency histogram and estimating the presence or absence of a link relation between the plural cameras and generating link relation information.04-22-2010
20100097476Method and Apparatus for Optimizing Capture Device Settings Through Depth Information - A method for adjusting image capture settings for an image capture device is provided. The method initiates with capturing depth information of a scene at the image capture device. Depth regions are identified based on the captured depth information. Then, an image capture setting is adjusted independently for each of the depth regions. An image of the scene is captured with the image capture device, wherein the image capture setting is applied to each of the depth regions when the image of the scene is captured.04-22-2010
20100103269DETERMINING ORIENTATION IN AN EXTERNAL REFERENCE FRAME - Orientation in an external reference is determined. An external-frame acceleration for a device is determined, the external-frame acceleration being in an external reference frame relative to the device. An internal-frame acceleration for the device is determined, the internal-frame acceleration being in an internal reference frame relative to the device. An orientation of the device is determined based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration.04-29-2010
20100118149SYSTEM AND METHOD FOR TRACKING AND MONITORING PERSONNEL AND EQUIPMENT - A system and method are described for using RFID tags to track and monitor personnel and equipment in large environments and environments that are prone to multipath fading. The system scans the environment by selecting local interrogation zones where RFID tags may be located. Multiple antennae are used, each transmitting a portion of an activation signal, such that the activation signal will be formed in the selected local interrogation zone. Different subsets of the antennae are successively selected, each targeting the selected local interrogation zone, to repeat the activation signal for each subset of antenna. RFID tags in the local interrogation zone will receive the portions of the activation signals and process them to determine whether the full activation signal was destined for that local interrogation zone for each of the subsets of antennae. An activated RFID tag will transmit its tag information, including any data collected from sensors connected to the tag, back to the system. The systems and method will use the location information of the various RFID tags in the global environment and combine that with data received through cameras and other sensors to provide a display with the RFID tag location information superimposed. The data collected about various regions of the environment may be transmitted back to the RFID tags to provide the personnel with information about their surroundings.05-13-2010
20100123782AUTOFOCUS SYSTEM - By performing a tap operation on any one of the buttons on a screen of a liquid crystal display equipped with a touch panel for performing an operation input on the AF frame auto-tracking, it is possible to select a desired mode among a fixation mode, an object tracking mode, a face detection tracking mode, and a face recognition tracking mode. The fixation mode is suitable to set a position of the AF frame by means of manual operation. The object tracking mode is suitable to allow the AF frame to automatically track a desired subject other than a face. The face detection tracking mode is suitable to allow the AF frame to track a face of an arbitrary person previously not registered. The face recognition tracking mode is suitable to allow the AF frame to track a face of a specific person previously registered.05-20-2010
20100134631APPARATUS AND METHOD FOR REAL TIME IMAGE COMPRESSION FOR PARTICLE TRACKING - A real-time image compression method includes identifying pixels in a set of image data that have a brightness value past a predetermined threshold; determining a position of each identified pixel in the image data; and for each of the identified pixels, defining a vector that includes the brightness value and the position of the identified pixel in the image data.06-03-2010
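The compression scheme described above, which keeps only (brightness, position) vectors for pixels past a threshold, is a few lines of array code. A minimal sketch under obvious assumptions (8-bit grayscale frames, an arbitrary threshold):

```python
import numpy as np

def compress_frame(frame, threshold=200):
    """Keep only bright pixels as (row, col, brightness) vectors."""
    rows, cols = np.nonzero(frame > threshold)
    return np.stack([rows, cols, frame[rows, cols]], axis=1)

def decompress_frame(vectors, shape):
    """Rebuild a sparse frame from the stored vectors (dark pixels become 0)."""
    frame = np.zeros(shape, dtype=np.uint8)
    frame[vectors[:, 0], vectors[:, 1]] = vectors[:, 2]
    return frame

# Toy frame: mostly dark background with a few bright particles.
rng = np.random.default_rng(0)
frame = rng.integers(0, 60, size=(480, 640), dtype=np.uint8)
frame[120, 300] = 250
frame[305, 18] = 240
packed = compress_frame(frame)
print(packed)                                   # one row per bright pixel
print(packed.size, "values instead of", frame.size)
```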
20100134632APPARATUS FOR TRACKING AN OBJECT USING A MOVING CAMERA AND METHOD THEREOF - Provided is an apparatus for tracking an object, including: a dynamic area extracting unit that extracts an object to be tracked from an image frame collected through an image collecting apparatus; an object modeling unit that models the object to be tracked extracted through the dynamic area extracting unit to calculate the color distribution of the object to be tracked; and an object tracking unit that calculates the color distribution of a next image frame collected through the image collecting apparatus, after calculating the color distribution of the object to be tracked, and calculates a posterior probability that each pixel belongs to the object to be tracked in the next image frame based on the calculated color distribution of the next image frame and the color distribution of the object to be tracked to track the object to be tracked.06-03-2010
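The probabilistic step described above, computing for each pixel a posterior probability of belonging to the modeled object from color distributions, can be illustrated with a simple Bayes ratio over color histograms. The sketch below is one interpretation, not the patented algorithm; it works on quantized hue values, and the prior and bin count are assumptions.

```python
import numpy as np

def color_distribution(values, bins=32, vmax=180):
    hist, _ = np.histogram(values, bins=bins, range=(0, vmax))
    return (hist + 1e-6) / (hist.sum() + bins * 1e-6)   # smoothed, normalized

def object_posterior(frame_hue, object_hist, frame_hist, prior=0.3, bins=32, vmax=180):
    """P(object | color) for every pixel, from the object and whole-frame color models."""
    idx = np.clip((frame_hue.astype(np.int32) * bins) // vmax, 0, bins - 1)
    p_color_obj = object_hist[idx]            # P(color | object)
    p_color = frame_hist[idx]                 # P(color) in the current frame
    post = prior * p_color_obj / np.maximum(p_color, 1e-9)
    return np.clip(post, 0.0, 1.0)

# Toy example: object hues around 20, background hues around 100.
rng = np.random.default_rng(3)
obj_pixels = rng.normal(20, 3, 2000) % 180
frame_hue = rng.normal(100, 10, (120, 160)) % 180
frame_hue[40:60, 70:90] = rng.normal(20, 3, (20, 20)) % 180   # object re-appears here
obj_hist = color_distribution(obj_pixels)
frm_hist = color_distribution(frame_hue.ravel())
post = object_posterior(frame_hue, obj_hist, frm_hist)
print(round(post[50, 80], 2), round(post[10, 10], 2))   # high inside the object, near 0 outside
```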
20100141772IMAGE PROCESSING DEVICE AND METHOD, IMAGE PROCESSING SYSTEM, AND IMAGE PROCESSING PROGRAM - An image processing device includes: an entire image display control portion that performs control to display an entire image of a predetermined region in an entire image display window; and a cutout image display control portion that performs control to enlarge a plurality of tracking subjects included in the entire image and display the tracking subjects in a cutout image display window. The cutout image display control portion performs the control in such a manner that one cutout image including the tracking subjects is displayed in the cutout image display window in a case where relative distances among the tracking subjects are equal to or smaller than a predetermined value, and that two cutout images including the respective tracking subjects are displayed in the cutout image display window in a case where the relative distances among the tracking subjects are larger than the predetermined value.06-10-2010
20100141773DEVICE FOR RECOGNIZING MOTION AND METHOD OF RECOGNIZING MOTION USING THE SAME - The present invention provides a device for recognizing a motion. The device for recognizing a motion includes: an input device that includes a light source and an inertial sensor; and a motion recognition mechanism that extracts the trajectory of a user's motion by detecting position change of the light source for a user's motion section that is determined in response to a sensing signal of the inertial sensor.06-10-2010
20100149340COMPENSATING FOR BLOOMING OF A SHAPE IN AN IMAGE - A number of brightness samples are taken outside a shape to compensate for blooming of the shape in an image generated by a digital camera. The brightness of each of the samples is determined and averaged, and the size of the shape is adjusted based on the difference between the brightness of the shape and the average of the brightness samples.06-17-2010
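The correction described above, shrinking the detected shape according to how much brighter it is than its surroundings, reduces to a small formula. The sketch below is only a guess at one reasonable form of that adjustment; the linear model and the gain constant are invented.

```python
def compensate_blooming(shape_radius_px, shape_brightness, surround_samples, gain=0.02):
    """Adjust the measured radius of a bright shape for blooming.

    Blooming makes a bright blob look bigger than it is, roughly in
    proportion to how much brighter it is than the local background, so
    the radius is reduced by gain * (shape - mean(surround)).
    """
    surround_mean = sum(surround_samples) / len(surround_samples)
    excess = max(shape_brightness - surround_mean, 0.0)
    corrected = shape_radius_px - gain * excess
    return max(corrected, 1.0)

# A glowing ball that reads 12 px in radius against a dim background.
samples_around_ball = [38, 42, 35, 40, 44, 39, 41, 37]   # brightness outside the shape
print(compensate_blooming(12.0, shape_brightness=230, surround_samples=samples_around_ball))
```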
20100149341CORRECTING ANGLE ERROR IN A TRACKING SYSTEM - To correct an angle error, acceleration data is received corresponding to a tracked object in a reference frame of the tracked object. Positional data of the tracked object is received from a positional sensor, and positional sensor acceleration data is computed from the received positional data. The acceleration data is transformed into a positional sensor reference frame using a rotation estimate. An amount of error between the transformed acceleration data and the positional sensor acceleration data is determined. The rotation estimate is updated responsive to the determined amount of error.06-17-2010
20100149342IMAGING APPARATUS - An imaging apparatus has an imaging section that creates video data from an optical image of a subject field; a feature acquiring section that acquires features of a main subject in the subject field; a feature holding section that holds the acquired features; a tracking processing section that performs a predetermined process for tracking the main subject using the created video data and the held features; and a controlling section that validates or invalidates an operation of the feature acquiring section, and the controlling section invalidates the operation of the feature acquiring section when the imaging apparatus satisfies a predetermined condition.06-17-2010
20100149343PHOTOGRAPHING METHOD AND APPARATUS USING FACE POSE ESTIMATION OF FACE - Provided are a photographing method and apparatus using face pose estimation. The photographing method includes: detecting a face area from an input image; estimating pose information in the detected face area; and determining a face direction based on the estimated pose information and recording the input image according to the face direction. Accordingly, when a face of a subject looks at something other than a camera, a picture is not taken, and thus a failed photograph is prevented.06-17-2010
20100157063SYSTEM AND METHOD FOR CREATING AND MANIPULATING SYNTHETIC ENVIRONMENTS - Disclosed herein are systems, computer-implemented methods, and tangible computer-readable media for synthesizing a virtual window. The method includes receiving an environment feed, selecting video elements of the environment feed, displaying the selected video elements on a virtual window in a window casing, selecting non-video elements of the environment feed, and outputting the selected non-video elements coordinated with the displayed video elements. Environment feeds can include synthetic and natural elements. The method can further toggle the virtual window between displaying the selected elements and being transparent. The method can track user motion and adapt the displayed selected elements on the virtual window based on the tracked user motion. The method can further detect a user in close proximity to the virtual window, receive an interaction from the detected user, and adapt the displayed selected elements on the virtual window based on the received interaction.06-24-2010
20100157064OBJECT TRACKING SYSTEM, METHOD AND SMART NODE USING ACTIVE CAMERA HANDOFF - If an active smart node detects that an object leaves a center region of a FOV for a boundary region, the active smart node predicts a possible path of the object. When the object gets out of the FOV, the active smart node predicts the object appears in a FOV of another smart node according to the possible path and a spatial relation between cameras. The active smart node notifies another smart node to become a semi-active smart node which determines an image characteristic similarity between the object and a new object and returns to the active smart node if a condition is satisfied. The active smart node compares the returned characteristic similarity, an object discovery time at the semi-active smart node, and a distance between the active smart node and the semi-active smart node to calculate possibility.06-24-2010
20100157065AUTOFOCUS SYSTEM - An autofocus system includes: an image pickup unit; an autofocus unit performing focus adjustment such that a subject in a AF area is to be in focus in the photographing image; a tracking unit moving the AF area to follow the movement of the subject in the range of the photographing image; a reference pattern registering unit registering the subject image in focus as a reference pattern; and a matched image detecting unit detecting a subject which is most closely matched with the reference pattern in the photographing image. When the amount of movement of the detected subject in a screen is less than a given value, the reference pattern is updated with the image of the subject, and an AF frame is updated to follow the moved subject. When the amount of movement is equal to or more than the given value, a tracking operation stops.06-24-2010
20100165112AUTOMATIC EXTRACTION OF SECONDARY VIDEO STREAMS - The automatic generation (07-01-2010
20100165113SUBJECT TRACKING COMPUTER PROGRAM PRODUCT, SUBJECT TRACKING DEVICE AND CAMERA - A computer performs a template-matching step of template-matching an input image with a plurality of template images with different magnifications by a program stored in an object tracking computer program product, selecting the template image with the highest similarity to an image in a predetermined region of the input image among the plurality of template images and extracting the predetermined region of the input image as a matched region, a determination step of determining whether or not the matching results of the template-matching step satisfy update conditions for updating the plurality of template images, and an updating step of updating at least one of the plurality of template images when the update conditions are determined to be satisfied by the determination step.07-01-2010
20100165114DIGITAL PHOTOGRAPHING APPARATUS AND METHOD OF CONTROLLING THE SAME - Provided is a method of controlling a digital photographing apparatus. The method includes: determining a target subject; determining whether the target subject exists in a moving image photographing domain; determining whether a condition of temporarily stopping recording of a moving image based on whether the target subject exists is satisfied; and temporarily stopping the recording of a moving image when the condition is satisfied.07-01-2010
20100171836IMAGE CAPTURING APPARATUS, CONTROL METHOD THEREOF, AND PROGRAM - An image capturing apparatus comprising an object detection unit which detects a specific object from an image signal, and a control unit which performs first control corresponding to the specific object when the object detection unit detects the specific object, and performs second control different from the first control when the object detection unit does not detect the specific object, wherein when a state in which the specific object is detected by the object detection unit transits to a state in which the specific object becomes undetectable, the control unit changes, based on information before the specific object becomes undetectable, at least either of a time for which the first control is held and a transition speed when transiting from the first control to the second control.07-08-2010
20100188511IMAGING APPARATUS, SUBJECT TRACKING METHOD AND STORAGE MEDIUM - If there is no movement of the imaging apparatus and no subject presence estimation region, normal tracking setting is accomplished (step S07-29-2010
20100208078HORIZONTAL GAZE ESTIMATION FOR VIDEO CONFERENCING - Techniques are provided to determine the horizontal gaze of a person from a video signal generated from viewing the person with at least one video camera. From the video signal, a head region of the person is detected and tracked. The dimensions and location of a sub-region within the head region is also detected and tracked from the video signal. An estimate of the horizontal gaze of the person is computed from a relative position of the sub-region within the head region.08-19-2010
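The geometric idea above, estimating horizontal gaze from where a tracked facial sub-region sits inside the tracked head box, can be written as a one-line ratio. The sketch below is only a plausible reading of that idea; the linear mapping to degrees and the maximum angle are assumptions.

```python
def horizontal_gaze(head_box, sub_box, max_angle_deg=60.0):
    """Estimate horizontal gaze from the sub-region's offset within the head box.

    head_box, sub_box: (x, y, w, h) in image pixels.  When the sub-region is
    centered in the head box the person looks straight at the camera (0 deg);
    as it shifts left or right the estimate moves toward +/- max_angle_deg.
    """
    hx, _, hw, _ = head_box
    sx, _, sw, _ = sub_box
    head_center = hx + hw / 2.0
    sub_center = sx + sw / 2.0
    # Normalized offset in [-1, 1]: how far the sub-region sits from center.
    offset = (sub_center - head_center) / (hw / 2.0)
    offset = max(-1.0, min(1.0, offset))
    return offset * max_angle_deg

print(horizontal_gaze(head_box=(200, 100, 120, 160), sub_box=(250, 150, 40, 50)))  # slight turn, about +10 deg
print(horizontal_gaze(head_box=(200, 100, 120, 160), sub_box=(215, 150, 40, 50)))  # turned well to one side, about -25 deg
```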
20100220194Image processing device, image processing system, camera device, image processing method, and program - An image processing device includes: a detecting section configured to detect a plurality of objects by type from one input image; a generating section configured to generate image data on each of the objects detected by the detecting section as images of respective different picture frames by types of the objects; and a processing section configured to subject the images of the different picture frames, the images of the different picture frames being generated by the generating section, to processing according to one of a setting and a request.09-02-2010
20100231723APPARATUS AND METHOD FOR INFERENCING TOPOLOGY OF MULTIPLE CAMERAS NETWORK BY TRACKING MOVEMENT - Provided are an apparatus and a method for tracking movements of objects to infer a topology of a network of multiple cameras. The apparatus infers the topology of the network formed of the multiple cameras that sequentially obtain images and includes an object extractor, a haunting data generator, a haunting database (DB), and a topology inferrer. The object extractor extracts at least one moving object from each of the obtained images, for the multiple cameras. The haunting data generator generates appearing cameras and appearing times at which the moving objects appear, and disappearing cameras and disappearing times at which the moving objects disappear, for the multiple cameras. The haunting DB stores the appearing cameras and appearing times and the disappearing cameras and disappearing times of the moving objects, for the multiple cameras. The topology inferrer infers the topology of the network using the appearing cameras and appearing times and the disappearing cameras and disappearing times of the moving objects. Therefore, the apparatus accurately infers topologies and distances among the multiple cameras in the network using the appearing and disappearing cameras and the times at which the moving objects appear and disappear. As a result, the apparatus accurately tracks the moving objects in the network.10-22-2010
20100238296MOBILE OBJECT IMAGE TRACKING APPARATUS - A mobile object image tracking apparatus includes: a base; a first gimbal; a second gimbal; an image guiding passage configured to guide an image received through an input opening portion of the second gimbal to the base; an image capturing device; an angle sensor; a tracking error detector configured to detect a first tracking error of an image data; a delay circuit; a tracking error calculator configured to calculate a second tracking error based on the first tracking error, a delayed first rotation angle, and a delayed second rotation angle; an angular velocity processor configured to generate a first target angular velocity and a second target angular velocity based on the first rotation angle, the second rotation angle, and the second tracking error; and an actuator controller configured to control the first gimbal the second gimbal based on the first and second target angular velocities.09-23-2010
20100245587Automatic tracking method and surveying device - The present invention provides an automatic tracking method, comprising a light spot detecting step of detecting a light spot 09-30-2010
20100245588TAG TRACKING SYSTEM - A new system is described that integrates real time item location data from electronically tagged items with a video system to allow cameras in proximity to a tagged item to automatically be enabled or recorded and to move and follow the movement of the tagged item. The methodology to implement such a system is described and involves computerized coordinate system scaling and conversion to automatically select and command movement, if available, of the most appropriate cameras. The system can follow a moving tagged item and hand off the item from one camera to another, and also command other facility assets, such as lights and door locks.09-30-2010
20100245589CAMERA CONTROL SYSTEM TO FOLLOW MOVING OBJECTS - The present invention is directed to an image tracking system that tracks the motion of an object. The image processing system tracks the motion of an object with an image recording device that records a first image of an object to be tracked and shortly thereafter records a second image of the object to be tracked. The system analyzes data from the first and the second images to provide a difference image of the object, defined by a bit map of pixels. The system processes the difference image to determine a threshold and calculates a centroid of the pixels in the difference image above the threshold. The system then determines the center of the difference image and determines a motion vector defined by the displacement from the center to the centroid and determines a pan tilt vector based on the motion vector and outputs the pan tilt vector to the image recording device to automatically track the object.09-30-2010
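As an illustration of the centroid-based processing this abstract outlines (a minimal sketch, not the patented implementation), the following Python differences two frames, thresholds the result, computes the centroid of the above-threshold pixels, and converts the displacement from the image center into a pan/tilt command. The adaptive threshold and the gain are assumptions.

```python
import numpy as np

def pan_tilt_from_frames(frame1, frame2, gain=0.1):
    """Illustrative centroid-of-difference tracking: returns a (pan, tilt) command."""
    diff = np.abs(frame2.astype(np.int32) - frame1.astype(np.int32))
    threshold = diff.mean() + 2 * diff.std()      # simple adaptive threshold (assumption)
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return 0.0, 0.0                           # no motion above threshold
    centroid = np.array([xs.mean(), ys.mean()])
    center = np.array([frame1.shape[1] / 2.0, frame1.shape[0] / 2.0])
    motion_vector = centroid - center             # displacement from image center to centroid
    pan, tilt = gain * motion_vector              # assumed linear mapping to pan/tilt
    return float(pan), float(tilt)
```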
20100271484Object tracking using momentum and acceleration vectors in a motion estimation system - There is provided a method and apparatus for motion estimation in a sequence of video images. The method comprises a) subdividing each field or frame of a sequence of video images into a plurality of blocks, b) assigning to each block in each video field or frame a respective set of candidate motion vectors, c) determining for each block in a current video field or frame, which of its respective candidate motion vectors produces a best match to a block in a previous video field or frame, d) forming a motion vector field for the current video field or frame using the thus determined best match vectors for each block, and e) forming a further motion vector field by storing a candidate motion vector derived from the best match vector at a block location offset by a distance derived from the candidate motion vector. Finally, steps a) to e) are repeated for a video field or frame following the current video field or frame. The set of candidate motion vectors assigned at step b) to a block in the following video field or frame includes the candidates stored at that block location at step e) during the current video field or frame. The method enables a block- or tile-based motion estimator to improve its accuracy by introducing true motion vector candidates derived from the physical behaviour of real-world objects.10-28-2010
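The sketch below illustrates the general idea of projecting a block's best-match vector forward as a "momentum" candidate for the next frame, as the abstract describes; the SAD matching criterion, block size, and data structures are assumptions, not details taken from the patent.

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two equally sized blocks."""
    return int(np.abs(a.astype(np.int32) - b.astype(np.int32)).sum())

def block_at(img, x, y, size):
    """Return the size x size block at (x, y), clamped to the image bounds."""
    h, w = img.shape
    x = max(0, min(w - size, x))
    y = max(0, min(h - size, y))
    return img[y:y + size, x:x + size]

def estimate_with_momentum(curr, prev, candidates, size=16):
    """For each block, pick the candidate vector giving the best match in the previous
    frame, then store that vector at the block it points to so it becomes a momentum
    candidate for the following frame. `candidates` maps (x, y) -> list of (dx, dy)."""
    h, w = curr.shape
    best_field, momentum = {}, {}
    for y in range(0, h - size + 1, size):
        for x in range(0, w - size + 1, size):
            cands = candidates.get((x, y), [(0, 0)])
            best = min(cands, key=lambda v: sad(block_at(curr, x, y, size),
                                                block_at(prev, x + v[0], y + v[1], size)))
            best_field[(x, y)] = best
            # offset the storage location by the vector itself, snapped to the block grid
            tx = max(0, min(w - size, x + best[0])) // size * size
            ty = max(0, min(h - size, y + best[1])) // size * size
            momentum.setdefault((tx, ty), []).append(best)
    return best_field, momentum
```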
20100271485IMAGE PHOTOGRAPHING APPARATUS AND METHOD OF CONTROLLING THE SAME - A method of controlling an image photographing apparatus to track a subject using a non-viewable pixel region of an image sensor includes detecting a motion vector of a subject when a moving image is photographed, determining whether a non-viewable pixel region of an image sensor is present in the direction of the detected motion vector of the subject, and moving a photographing region to the non-viewable pixel region so as to track the motion of the subject. Accordingly, the moving subject may be tracked within a range of a predetermined Field of View (FOV) of the image photographing apparatus without an additional hardware system.10-28-2010
20100277596AUTOMATIC TRACKING APPARATUS AND AUTOMATIC TRACKING METHOD - An automatic tracking apparatus controls a direction of an imager and adjusts an imaging region so as to include a tracked object in imaged video by a pan/tilt/zoom controller and a pan/tilt driver. Here, a tracking speed is acquired from control information for controlling the imager by a tracking speed decision portion. Then, when the tracking speed is a predetermined value or less, a zoom scaling factor is calculated according to the tracking speed by a zoom scaling factor calculator and the zoom scaling factor of the imager is changed by the pan/tilt/zoom controller and a zoom lens driver.11-04-2010
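A minimal sketch of the kind of speed-dependent zoom adjustment this abstract describes, assuming a simple linear mapping between tracking speed and zoom scaling factor; the threshold, zoom range, and linear relation are all placeholders, not values from the patent.

```python
def zoom_factor_for_speed(tracking_speed, speed_threshold=10.0,
                          min_zoom=1.0, max_zoom=4.0):
    """Illustrative mapping: below the threshold, zoom in more the slower the target
    moves; above it, stay at the widest setting so the target is not lost."""
    if tracking_speed > speed_threshold:
        return min_zoom
    # linear interpolation: zero speed -> max_zoom, threshold speed -> min_zoom
    ratio = 1.0 - tracking_speed / speed_threshold
    return min_zoom + ratio * (max_zoom - min_zoom)
```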
20100302377IMAGE SENSOR, APPARATUS AND METHOD FOR DETECTING MOVEMENT DIRECTION OF OBJECT - An image sensor includes a base, at least a plurality of first sensing elements and second sensing elements formed on the base. The first sensing elements and the second sensing elements are arranged in an alternate fashion. The first sensing elements cooperatively form a first noncontinuous planar sensing surface facing toward an object, and the second sensing elements cooperatively form a second noncontinuous planar sensing surface facing toward the object. The first noncontinuous planar sensing surface is lower than the second noncontinuous planar sensing surface.12-02-2010
20100302378TRACKING SYSTEM CALIBRATION USING OBJECT POSITION AND ORIENTATION - To calibrate a tracking system, a computing device receives positional data of a tracked object from an optical sensor as the object is pointed approximately toward the optical sensor. The computing device computes a first angle of the object with respect to an optical axis of the optical sensor using the received positional data. The computing device receives inertial data corresponding to the object, wherein a second angle of the object with respect to a plane normal to gravity can be computed from the inertial data. The computing device determines a pitch of the optical sensor using the first angle and the second angle.12-02-2010
20100321503IMAGE CAPTURING APPARATUS AND IMAGE CAPTURING METHOD - There is provided an image capturing apparatus and method capable of preventing frame-out of a moving object even when a fast-moving object is photographed. The solution comprises: an image capturing unit that obtains image data by capturing an object; a moving object detecting unit that detects a moving object to be tracked based on the image data obtained with the image capturing unit; a tracking frame setting unit that sets an area portion which includes the moving object in the image data as a tracking frame when the moving object is detected by the moving object detecting unit; a display frame setting unit that sets an area which is shifted from the tracking frame in the image data in the direction opposite to the moving direction of the moving object as a display frame; and a display processing unit that displays the image data which is included in the display frame on a display unit.12-23-2010
20100321504INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD, PROGRAM, AND RECORDING MEDIUM - An information processing system, apparatus and method is disclosed wherein an image of a predetermined region and an image of moving bodies in the region can be picked up and any of the images obtained by such image pickup which is desired by a user can be reproduced readily. Sensor images of the predetermined region are stored, and an image of the moving bodies in the region is picked up separately and stored together with reproduction information relating to reproduction of the sensor image from which the moving bodies are detected. When an instruction to reproduce the sensor image is issued, the reproduction information corresponding to the moving body is read out, and the sensor image is reproduced based on the read out reproduction information. The invention can be applied, for example, to a monitoring system.12-23-2010
20100328467Movable-mechanical-section controlling device, method of controlling movable mechanical section, and program - A movable-mechanical-section controlling device includes a driving controlling unit and an angular position range setting unit. The driving controlling unit performs control so that a movable mechanical section performs a unit pan operation that is performed in a predetermined angular position range in a pan direction, the movable mechanical section having a structure that moves so that an image pickup direction of an image pickup section changes in the pan direction, the predetermined angular position range being set on the basis of a previously set pan-direction movable angular range. The angular position range setting unit sets the angular position range so that, after the unit pan operation has been performed in a first angular position range, the unit pan operation is performed in a second angular position range differing from the first angular position range.12-30-2010
20110001831Video Camera - A video camera includes an imager. The imager repeatedly outputs an object scene image captured on an imaging surface. A determiner repeatedly determines whether or not one or at least two dynamic objects exist in the object scene by referring to the object scene image outputted from the imager. A first searcher searches for a specific dynamic object that satisfies a predetermined condition from the one or at least two dynamic objects when a determination result of the determiner is updated from a negative result to an affirmative result. An adjuster adjusts an imaging condition by tracking the specific dynamic object discovered by the first searcher.01-06-2011
20110025854CONTROL DEVICE, OPERATION SETTING METHOD, AND PROGRAM - A control device includes an operation decision unit which inputs the information on image data and a subject detected in an image of the image data and decides the operations to be executed based on the position of the subject in the image in the case of a predetermined limitation position state.02-03-2011
20110043639Image Sensing Apparatus And Image Processing Apparatus - An image sensing apparatus includes an imaging unit which outputs image data of images obtained by photography, and a photography control unit which controls the imaging unit to perform sequential photography of a plurality of target images including a specific object as a subject. The photography control unit sets a photography interval of the plurality of target images in accordance with a moving speed of the specific object.02-24-2011
20110090344Object Trail-Based Analysis and Control of Video - Systems and methods for analyzing scenes from cameras imaging an event, such as a sporting event broadcast, are provided. Systems and methods include detecting and tracking patterns and trails. This may be performed with intra-frame processing and without knowledge of camera parameters. A system for analyzing a scene may include an object characterizer, a foreground detector, an object tracker, a trail updater, and a video annotator. Systems and methods may provide information regarding centers and spans of activity based on object locations and trails, which may be used to control camera fields of view, such as camera pose and zoom level. A magnification may be determined for images in a video sequence based on the size of an object in the images. Measurements may be determined from object trails in a video sequence based on an effective magnification of images in the video sequence.04-21-2011
20110090345DIGITAL CAMERA, IMAGE PROCESSING APPARATUS, AND IMAGE PROCESSING METHOD - An image processing apparatus is provided which is capable of stably tracking a target even when an amount of characteristics of an image of the target varies on input pictures due to an abrupt movement of the target or other causes. An image processing apparatus (04-21-2011
20110115920MULTI-STATE TARGET TRACKING METHOD AND SYSTEM - A multi-state target tracking method and a multi-state target tracking system are provided. The method detects a crowd density of a plurality of images in a video stream and compares the detected crowd density with a threshold when receiving the video stream, so as to determine a tracking mode used for detecting the targets in the images. When the detected crowd density is less than the threshold, a background model is used to track the targets in the images. When the detected crowd density is greater than or equal to the threshold, a non-background model is used to track the targets in the images.05-19-2011
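A minimal sketch of the mode decision this abstract describes, using the foreground-pixel ratio as a crude stand-in for crowd density; the density measure, threshold, and mode names are assumptions.

```python
import numpy as np

def crowd_density(foreground_mask):
    """Crude density proxy (assumption): fraction of pixels marked as foreground."""
    return float(np.count_nonzero(foreground_mask)) / foreground_mask.size

def choose_tracking_mode(density, threshold=0.5):
    """Background-model tracking below the threshold, non-background-model tracking
    at or above it, following the decision rule in the abstract."""
    return "background_model" if density < threshold else "non_background_model"
```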
20110122253IMAGING APPARATUS - An imaging apparatus includes an imaging unit, a field angle change unit, and a movement detection unit. The imaging unit includes a lens that forms an image of a subject and acquires a picture image by taking the image formed by the lens. The field angle change unit changes a field angle of the picture image acquired by the imaging unit. The movement detection unit detects a movement of the imaging apparatus. The field angle change unit changes the field angle of the picture image in accordance with a moving direction of the imaging apparatus when the movement detection unit detects the movement of the imaging apparatus.05-26-2011
20110122254DIGITAL CAMERA, IMAGE PROCESSING APPARATUS, AND IMAGE PROCESSING METHOD - Images of a target are stably tracked even when an image-capture environment changes. An image processing apparatus (05-26-2011
20110128387SYSTEMS AND METHODS FOR MAINTAINING MULTIPLE OBJECTS WITHIN A CAMERA FIELD-OF-VIEW - In one embodiment, a system and method for maintaining objects within a camera field of view include identifying constraints to be enforced, each constraint relating to an attribute of the viewed objects, identifying a priority rank for the constraints such that more important constraints have a higher priority than less important constraints, and determining the set of solutions that satisfy the constraints relative to the order of their priority rank such that solutions that satisfy lower ranking constraints are only considered viable if they also satisfy any higher ranking constraints, each solution providing an indication as to how to control the camera to maintain the objects within the camera field of view.06-02-2011
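A minimal sketch of prioritized constraint filtering in the spirit of this abstract: constraints are applied from highest to lowest priority, and a solution survives a lower-ranked constraint only if it already satisfies the higher-ranked ones. Keeping the candidate set non-empty when no solution meets a constraint is a design choice assumed here, not a detail from the patent.

```python
def filter_by_priority(solutions, constraints):
    """`constraints` is a list of predicates sorted from highest to lowest priority.
    Returns the solutions that satisfy as many constraints as possible in rank order."""
    viable = list(solutions)
    for constraint in constraints:            # highest priority first
        satisfying = [s for s in viable if constraint(s)]
        if satisfying:                        # relax a constraint only if nobody meets it
            viable = satisfying
    return viable

# Example: candidate camera settings filtered by "all objects in view" before "faces large enough"
settings = [{"zoom": z} for z in (1.0, 1.5, 2.0, 3.0)]
result = filter_by_priority(settings,
                            [lambda s: s["zoom"] <= 2.0,    # hypothetical: keeps everyone in frame
                             lambda s: s["zoom"] >= 1.5])   # hypothetical: keeps faces resolvable
```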
20110141287VIDEO PROCESSING SYSTEM PROVIDING ENHANCED TRACKING FEATURES FOR MOVING OBJECTS OUTSIDE OF A VIEWABLE WINDOW AND RELATED METHODS - A video processing system may include a display and a video processor coupled to the display. The video processor may be configured to display a georeferenced video feed on the display defining a viewable area, determine actual geospatial location data for a selected moving object within the viewable area, and generate estimated geospatial location data along a predicted path for the moving object when the moving object is no longer within the viewable area and based upon the actual geospatial location data. The video processor may be further configured to define a successively expanding search area for the moving object when the moving object is no longer within the viewable window and based upon the estimated geospatial location data, and search within the successively expanding search area for the moving object when the successively expanding search area is within the viewable area.06-16-2011
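A minimal sketch of the two ideas this abstract combines: dead-reckoning an estimated position along a predicted path and growing the search area the longer the object stays out of view. The constant-velocity assumption and the radius constants are placeholders.

```python
def predicted_position(last_position, velocity, elapsed):
    """Estimated geospatial position along the predicted path (constant velocity assumed)."""
    return (last_position[0] + velocity[0] * elapsed,
            last_position[1] + velocity[1] * elapsed)

def search_radius(elapsed, base_radius=10.0, growth_rate=5.0):
    """Successively expanding search area: the radius grows with time out of view."""
    return base_radius + growth_rate * elapsed
```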
20110141288Object Tracking Method And Apparatus For A Non-Overlapping-Sensor Network - An object tracking method for a non-overlapping-sensor network works in a sensor network. The method may comprise a training phase and a detection phase. In the training phase, a plurality of sensor information measured by the sensors in the sensor network is used as training samples. At least an entrance/exit is marked out within the measurement range of each sensor. At least three characteristic functions including sensor spatial relation among the sensors in the sensor network, difference of movement time and similarity in appearance, are estimated by an automatically learning method. The at least three characteristic functions are used as the principles for object tracking and relationship linking in the detection phase.06-16-2011
20110157370TAGGING PRODUCT INFORMATION - A system for enabling the tagging of items appearing in a moving or paused image includes the use of an identification device on or within said item to be tagged. The method includes capturing moving image footage containing the image of said item to be tagged, detecting the presence and the position of the identification means within each frame of the moving image footage and hence determining the position of the item to be tagged in each frame of the moving image. By automatically determining the position of the identification device, a suitable tag can then be automatically associated with the item which has the identification device provided thereon or therein when saving or transmitting the moving image.06-30-2011
20110187869TRACKING-FRAME INITIAL-POSITION SETTING APPARATUS AND METHOD OF CONTROLLING OPERATION OF SAME - Automatic tracking of a target image is made comparatively easy. Specifically, an image obtained by imaging a subject is displayed on a display screen and a tracking frame is displayed at a reference position located at the central portion of the display screen. A target area for setting the initial position of the tracking frame is set surrounding the tracking frame and a high-frequency-component image representing high-frequency components of the image within this area is generated. The amount of high-frequency component is calculated while a moving frame is moved within the high-frequency-component image. The position of the moving frame where the calculated amount of high-frequency component is largest is decided upon as the initial position of the tracking frame.08-04-2011
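A minimal sketch of the initial-position search this abstract describes: a frame is slid over a high-frequency-component image and the position with the largest summed high-frequency content is chosen. Using the gradient magnitude as the high-frequency image, and the frame size and step, are assumptions.

```python
import numpy as np

def high_frequency_image(gray):
    """Rough high-pass proxy (assumption): magnitude of the image gradient."""
    gy, gx = np.gradient(gray.astype(np.float64))
    return np.hypot(gx, gy)

def best_initial_position(gray, frame_size=32, step=4):
    """Slide a moving frame over the high-frequency image and return the top-left corner
    where the summed high-frequency content is largest."""
    hf = high_frequency_image(gray)
    h, w = hf.shape
    best_pos, best_score = (0, 0), -1.0
    for y in range(0, h - frame_size + 1, step):
        for x in range(0, w - frame_size + 1, step):
            score = hf[y:y + frame_size, x:x + frame_size].sum()
            if score > best_score:
                best_pos, best_score = (x, y), score
    return best_pos
```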
20110193969OBJECT-DETECTING SYSTEM AND METHOD BY USE OF NON-COINCIDENT FIELDS OF LIGHT - The invention provides an object-detecting system and method for detecting information of an object located in an indicating space. In particular, the invention is to capture images relative to the indicating space by use of non-coincident fields of light, and further to determine the information of the object located in the indicating space. The invention also preferably sets the operation times of image-capturing units and the exposure times of light-emitting units in the object-detecting system to improve the quality of the captured images.08-11-2011
20110228099SYSTEM AND METHOD FOR TRACKING COOPERATIVE, NON-INCANDESCENT SOURCES - A system and method for tracking a cooperative, non-incandescent source may include collecting scene images of a scene that includes the cooperative, non-incandescent source and background clutter. First and second scene images of the scene may be generated over distinct spectral bands. The first and second scene images may be imaged onto respective first and second focal plane arrays. In one embodiment, the imaging may be substantially simultaneous. The first and second scene image frame data respectively generated by the first and second focal plane arrays may be processed to produce resultant scene image frame data. The scene image frame data may result in reducing magnitude of scene image frame data representative of the background clutter more than magnitude of scene image frame data representative of the cooperative, non-incandescent source.09-22-2011
20110228100OBJECT TRACKING DEVICE AND METHOD OF CONTROLLING OPERATION OF THE SAME - Disclosed is a technique for accurately tracking a tracking target object. A disparity map image in which the disparity of each pixel of an object is shown is generated from a three-dimensional object image. A detection range is determined such that an object disposed on the front side of a pedestrian, which is a tracking target object, in the depth direction is excluded from the generated disparity map image. The pedestrian, which is a tracking target object, is detected in the determined detection range. In this way, it is possible to prevent a bike driver disposed on the front side of the pedestrian in the depth direction from being tracked.09-22-2011
20110279682Methods for Target Tracking, Classification and Identification by Using Foveal Sensors - A method of operating a sensor system may include the steps of sensing a predetermined area including a first object to obtain first sensor data at a first predetermined time, sensing the substantially same predetermined area including the first object to obtain second sensor data at a second predetermined time, determining a difference between the first sensor data and the second sensor data, identifying a target based upon the difference between the first sensor data and the second sensor data, identifying a material of the target and determining a target of interest to track based upon the material of the target.11-17-2011
20110279683Automatic Motion Triggered Camera with Improved Triggering - An automatic motion triggered camera with improved triggering to produce images with the target animal substantially centered in the field of view. A single motion detector, a camera, and image memory are controlled by a processor that detects changes in the motion detector output and generates a trigger signal for the camera. The trigger follows a first stage of waiting until a predetermined minimum threshold of movement is detected and a second stage of waiting until a further change in the output signal indicative of an animal being substantially centered in the field of view of the camera is detected. Compensation for camera capture delay time can be included by sampling a plurality of motion detector signals, computing an estimated time at which the animal will be centered, and triggering the camera prior to that time.11-17-2011
20110285854System and method for theatrical followspot control interface - There is provided a system and method for controlling a tracking device to follow a location of a performer on a stage. The tracking device may comprise a lighting fixture such as a high output video projector or an automated mechanical moving light such as a DMX lighting fixture to provide a followspot, or a microphone to record audio from a particular performer. The method comprises capturing a video feed of the stage using a camera, presenting the video feed on a display of a control device, receiving input data indicating a position of the performer in the video feed, translating the input data into the location of the performer on the stage, and adjusting the tracking device to follow the location of the performer on the stage. The control device thereby provides lighting operators with an intuitive interface readily implemented using cost effective commodity hardware.11-24-2011
20110285855METHOD OF AUTOMATICALLY TRACKING AND PHOTOGRAPHING CELESTIAL OBJECTS AND CAMERA EMPLOYING THIS METHOD - A method of automatically tracking and photographing a celestial object so that the celestial object image, which is formed on an imaging surface of an image sensor via a photographing optical system, becomes stationary relative to a predetermined imaging area of the imaging surface of the image sensor during a tracking and photographing operation. The method includes performing a preliminary photographing operation at a predetermined preliminary-photographing exposure time with the photographic apparatus directed toward the celestial object and with a celestial-body auto tracking action suspended to obtain a preliminary image before automatically tracking and photographing the celestial object, calculating a moving direction and a moving speed of the celestial object image from the preliminary image that is obtained by the preliminary photographing operation, and automatically tracking and photographing the celestial object based on the moving direction and the moving speed of the celestial object image.11-24-2011
20110292217METHOD OF AUTOMATICALLY TRACKING AND PHOTOGRAPHING CELESTIAL OBJECTS AND PHOTOGRAPHIC APPARATUS EMPLOYING THIS METHOD - A method of automatically tracking and photographing a celestial object, is provided, which moves relative to a photographic apparatus due to diurnal motion so that the celestial object image formed on an image sensor becomes stationary during a celestial-object auto-tracking photographing operation. The method includes inputting photographing azimuth angle and elevation angle information of the photographic apparatus; calculating preliminary-tracking drive control data based on the photographing azimuth angle and elevation angle information; obtaining first and second preliminary images corresponding to commencement and termination points of the preliminary tracking operation; calculating a deviation amount between a celestial object image in the first preliminary image and a corresponding celestial object image in the second preliminary image; calculating, from the deviation amount, actual-tracking drive control data with the deviation amount cancelled; and performing the celestial-object auto-tracking photographing operation based on the actual-tracking drive control data.12-01-2011
20110304736GIMBAL POSITIONING WITH TARGET VELOCITY COMPENSATION - A gimbal system, including components and methods of use, configured to track moving targets. More specifically, the disclosed system may be configured to orient and maintain the line of sight (“los”) of the gimbal system toward a target, as the target and, in some cases, the platform supporting the gimbal system are moving. The system also may be configured to calculate an estimated target velocity based on user input, and to compute subsequent target positions from previous positions by integrating the estimated target velocity over time. A variety of filters and other mechanisms may be used to enable a user to input information regarding a target velocity into a gimbal controller.12-15-2011
20110304737GIMBAL POSITIONING WITH TARGET VELOCITY COMPENSATION - A gimbal system, including components and methods of use, configured to track moving targets. More specifically, the disclosed system may be configured to orient and maintain the line of sight (“los”) of the gimbal system toward a target, as the target and, in some cases, the platform supporting the gimbal system are moving. The system also may be configured to calculate an estimated target velocity based on user input, and to compute subsequent target positions from previous positions by integrating the estimated target velocity over time. A variety of filters and other mechanisms may be used to enable a user to input information regarding a target velocity into a gimbal controller.12-15-2011
20120002055Correlation of position data that are acquired by means of a video tracking system with a second localization system - The invention relates to a video tracking system which determines the position of objects, e.g. players in a football match, by evaluating video image information. According to the present invention, the video tracking system is additionally provided with an independent positioning system which is preferably used in instances when the video-based position detection is faulty or does not work at all. According to one embodiment, the independent positioning system is GPS based, each football player carrying a portable GPS module which transmits the position data to the system by radio. Additional data, such as heart rate or acceleration, can also be transmitted.01-05-2012
20120002056APPARATUS AND METHOD FOR ACTIVELY TRACKING MULTIPLE MOVING OBJECTS USING A MONITORING CAMERA - An apparatus for actively tracking an object is provided. The apparatus includes a camera unit; a motor drive for changing a shooting direction of the camera unit; and a controller for acquiring a first comparative image and a second comparative image in sequence using the camera unit, comparing the first comparative image with the second comparative image, detecting a moving direction and a speed of an identical object existing in the first and second comparative images, determining an estimated location of the object after receipt of the second comparative image based on the detected moving direction and speed, and enlarging and capturing the object in the estimated location of the object.01-05-2012
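A minimal sketch of the location estimation this abstract describes: the object's velocity is derived from its positions in the two comparative images and extrapolated to predict where it will be a short time later. Constant velocity and the parameter names are assumptions.

```python
def estimate_location(pos_first, pos_second, frame_interval, lead_time):
    """Predict the object position `lead_time` seconds after the second comparative image,
    from its displacement between the two images (constant-velocity sketch)."""
    vx = (pos_second[0] - pos_first[0]) / frame_interval
    vy = (pos_second[1] - pos_first[1]) / frame_interval
    return (pos_second[0] + vx * lead_time, pos_second[1] + vy * lead_time)
```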
20120019664CONTROL APPARATUS FOR AUTO-TRACKING CAMERA SYSTEM AND AUTO-TRACKING CAMERA SYSTEM EQUIPPED WITH SAME - A control apparatus for a camera-monitor system having an auto-tracking function has a mode switcher switching the camera between a normal shooting mode and a tracking mode, a display position selector selecting a display position at which the selected specific object is to be displayed on the monitor screen, and a signal outputting unit outputting a drive signal for driving the auto-tracking camera so that an image of the tracked specific object is displayed at the display position. When the tracking mode is selected by the mode switcher, the signal outputting unit outputs to the camera a drive signal responsive to the amount of operation of the operation part so that the direction in which the selected specific object is shifted on the monitor screen coincides with the direction represented during setting performed before the camera starts auto tracking of the selected specific object.01-26-2012
20120019665AUTONOMOUS CAMERA TRACKING APPARATUS, SYSTEM AND METHOD - An autonomous camera tracking system. The system can include a tracking apparatus, and a beacon. The tracking apparatus can include an infrared sensor, a camera, a processor, and a panning stepper motor. The beacon can include an infrared emitter. The infrared sensor can receive an infrared signal from the beacon and, upon detection of a shift in the position of the infrared signal, the stepper motor can pan the camera and the infrared sensor so as to place the beacon in the field of view of the camera. The tracking apparatus can further include a tilting stepper motor, and, upon detection of a shift in the position of the infrared signal, the stepper motor can tilt the camera and the infrared sensor so as to place the beacon in the field of view of the camera.01-26-2012
20120026340SYSTEMS AND METHODS FOR PRESENTING VIDEO DATA - Described herein are systems and methods for presenting video data to a user. In overview, video data originates from a source, such as a capture device in the form of a camera. This video data is defined by a plurality of sequential frames, having a common geometric size. This size is referred to as the “geometric bounds of captured video data”. Analytics software is used to track objects in the captured video data, and provide position data indicative of the location of a tracked object relative to the geometric bounds of captured video data. Video data is presented to a user via a “view port”. By default, this view port is configured to display video data corresponding to geometric bounds of captured video data. That is, the view port displays the full scope of video data, as captured. Embodiments of the present invention use the position data to selectively adjust the view port to display a geometrically reduced portion of the geometric bounds of captured video data, thereby assisting the user in following a tracked object.02-02-2012
20120057028IMAGING SYSTEM AND PIXEL SIGNAL READOUT METHOD - An imaging system is provided that includes a target detector, a readout area determiner and a readout processor. The target detector detects a target subject from an effective pixel area of an image sensor. The readout area determiner defines a readout area within the effective pixel area, the readout area corresponding to a detected target. The readout processor reads out only pixel signals within the readout area. A partial area within the readout area is redefined as the readout area when the size of the original readout area is greater than a predetermined size.03-08-2012
20120057029CAPTURE OF VIDEO WITH MOTION-SPEED DETERMINATION AND VARIABLE CAPTURE RATE - A method of capturing a video of a scene depending on the speed of motion in the scene, includes capturing a video of the scene; determining the relative speed of motion within a first region of the video of the scene with respect to the speed of motion within a second region of the video of the scene; and causing a capture rate of the first region of the video of the scene to be greater than a capture rate of the second region of the video of the scene, or causing an exposure time of the first region to be less than exposure time of the second region.03-08-2012
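A minimal sketch of the region-dependent capture-rate decision this abstract describes, using mean absolute frame difference as a crude per-region motion measure; the measure, the base rate, and the boost factor are assumptions, not values from the patent.

```python
import numpy as np

def region_speed(prev, curr):
    """Crude per-region motion measure (assumption): mean absolute frame difference."""
    return float(np.abs(curr.astype(np.int32) - prev.astype(np.int32)).mean())

def capture_rates(prev, curr, region_a, region_b, base_rate=30.0, boost=2.0):
    """Give the faster-moving region the higher capture rate.
    `region_a` / `region_b` are (y0, y1, x0, x1) bounds; the rates are placeholders."""
    speeds = []
    for y0, y1, x0, x1 in (region_a, region_b):
        speeds.append(region_speed(prev[y0:y1, x0:x1], curr[y0:y1, x0:x1]))
    if speeds[0] >= speeds[1]:
        return base_rate * boost, base_rate
    return base_rate, base_rate * boost
```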
20120081552VIDEO TRACKING SYSTEM AND METHOD - A video tracking system and method are provided for tracking a target object with a video camera.04-05-2012
20120105647CONTROL DEVICE, CONTROL METHOD, PROGRAM, AND CONTROL SYSTEM - A control device, a control method, a program, and a control system whereby more intelligent imaging operation, more useful for a user, can be realized, particularly in the event of performing automatic imaging operation using a subject detection result within an imaged image. Control relating to imaging operation is performed according to a positional relation between an edge region set as an edge portion region of an image frame, and a subject detected within the image frame. Control relating to automatic imaging operation can be performed based on a determination reference that is a positional relation between the edge region and a detected subject, and more intelligent automatic imaging operation more useful for a user can be realized.05-03-2012
20120113267IMAGE CAPTURING APPARATUS, METHOD, AND RECORDING MEDIUM CAPABLE OF CONTINUOUSLY CAPTURING OBJECT - An image capturing apparatus includes an image capturing unit, a designating unit configured to designate a main-area or a position which a moving object is to reach, in each captured image captured by the image capturing unit, an image capturing control unit configured to control the image capturing unit to continuously capture the moving object at a predetermined frame rate, a position specifying unit configured to specify a position of the moving object in the image captured by the image capturing unit, and a frame rate control unit configured to control the predetermined frame rate, based on the specified position of the moving object, and either the main-area or position.05-10-2012
20120113268IMAGE GENERATION APPARATUS, IMAGE GENERATION METHOD AND STORAGE MEDIUM - An image generation apparatus comprises: a first-image obtaining unit adapted to obtain an image obtained by causing an image capturing unit to capture the grip unit controlled so as to place the target object in one predetermined orientation of a plurality of predetermined orientations with respect to the image capturing unit and the target object in the one predetermined orientation as a grip-state image; a second-image obtaining unit adapted to obtain, as a non-grip-state image corresponding to the one predetermined orientation, an image of the grip unit that does not grip the target object and is placed in a predetermined orientation coincident with the orientation controlled to place the target object in the one predetermined orientation; and an image generation unit adapted to generate a target object image including only the target object for the one predetermined orientation based on a difference between the grip-state image and the non-grip-state image.05-10-2012
20120120247Image sensing module and optical signal processing device - An image sensing module includes an image sensor and a sensing controller. The image sensor captures an image with a first frequency. The sensing controller determines whether the image sensor has detected an object according to the image. When the sensing controller determines that the image sensor has detected an object, the sensing controller switches the image sensor to capture the image with a second frequency for the image sensor to continuously detect the location of the object. By setting the first frequency higher than the second frequency, the image sensing module detects the object in real time, and the power consumed by the image sensing module in continuously detecting the location of the object is reduced as well.05-17-2012
20120120248IMAGE PHOTOGRAPHING DEVICE AND SECURITY MANAGEMENT DEVICE OF OBJECT TRACKING SYSTEM AND OBJECT TRACKING METHOD - An image photographing device of an object tracking system includes: an image recognizing module for collecting image information within a field of view (FOV) region in real time, recognizing occurrence of an event from the collected image information to extract an object contributing to the occurrence of the event, and sensing whether the extracted object is out of the FOV region or not. The device further includes an object tracking module for extracting property of the object from the extracted object to generate metadata, storing the metadata in a database, and providing the metadata stored in the database to ambient image photographing devices based on the sensing result of the image recognizing module.05-17-2012
20120120249CONTROL APPARATUS, IMAGING SYSTEM, CONTROL METHOD, AND PROGRAM - A control apparatus, an imaging system, a control method, and a program in which, when performing automatic image-recording, subjects which seem to be present around an imaging apparatus can be recorded as evenly as possible. An automatic recording operation for recording, upon detection of a subject from an image obtained by imaging, data representing an image containing the subject is performed. On that basis, if it is determined, on the basis of image-recording history information, that the transition to a subject configuration different from that used in the last image-recording is to be performed, a movable mechanism unit is moved to change an imaging field-of-view range, thereby obtaining a different subject configuration.05-17-2012
20120127319METHODS AND APPARATUS FOR CONTROLLING A NETWORKED CAMERA - An apparatus for controlling a remote camera is described. The apparatus includes a housing and a processor positioned within the housing. A transceiver coupled to the processor communicates with a remote server. The remote server is coupled to the remote camera. A motion tracking component is mechanically coupled to the housing and electrically coupled to the processor. The motion tracking component generates a motion signal. The remote server controls a parameter of the remote camera in response to the motion signal. A display is coupled to the processor for displaying the output signal from the remote camera. The output signal is associated with the parameter of the remote camera.05-24-2012
20120133777CAMERA TRACKING WITH USER SCRIPT CONTROL - The present application provides increased flexibility and control to a user by providing a camera and camera controller system that is responsive to a user-defined script. The user-defined script can allow a user to choose a subject and have the camera follow the subject automatically. In one embodiment, a camera is provided for taking still or video images. Movement of the camera is automatically controlled using a camera controller coupled to the camera. A user script is provided that describes a desired tracking of an object. The camera controller is responsive to the script for controlling the camera in order to track the object.05-31-2012
20120133778TRACKING SYSTEM AND METHOD FOR IMAGE OBJECT REGION AND COMPUTER PROGRAM PRODUCT THEREOF - In one exemplary embodiment, an object region tracking and picturing module is constructed on a moving platform of a mobile end and a remote control module is constructed on anther platform for an image object region tracking system. The two modules communicate with each other via a digital network for delivering required information. The object region tracking and picturing module uses a real-time image backward search technology to store at least an image frame previously captured on the moving platform into a frame buffer, and start tracking an object region from the position pointed out by the remote control module to a newest image frame captured on the moving platform, then find out a relative position on the newest image frame for the tracked object region.05-31-2012
20120154599ZOOMING FACTOR COMPUTATION - Systems, methods, and devices are disclosed for determining a zooming factor for a camera in a pan, tilt, and zoom (PTZ) camera tracking system to enable a camera to keep an object at a constant size within the camera's viewing area, despite changes in the object's distance from the camera. This provides a complement to a camera's pan and tilt tracking of the moving object. For example, a PTZ camera tracking system that determines an object to track utilizes information regarding images of the object of interest to determine a zooming factor (or other zooming value) for a camera in the PTZ camera tracking system. This information includes variables such as tilt angles of one or more cameras and a reference zooming factor.06-21-2012
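A minimal sketch of a distance-compensating zoom in the spirit of this abstract: the ground distance to the object is estimated from the camera height and tilt angle, and a reference zooming factor is scaled so the object keeps a roughly constant size in the image. The flat-ground geometry and the linear zoom/size relation are assumptions, not the patented method.

```python
import math

def zoom_factor(camera_height, tilt_angle_deg, ref_distance, ref_zoom):
    """Scale a reference zooming factor with the estimated distance to the object.
    `tilt_angle_deg` is the angle below horizontal and must be greater than zero."""
    tilt = math.radians(tilt_angle_deg)
    distance = camera_height / math.tan(tilt)    # flat-ground approximation
    return ref_zoom * distance / ref_distance
```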
20120154600IMAGE PICKUP DEVICE - An image pickup device may include an image pickup unit that captures a subject, an acquiring unit that acquires an absolute position of the image pickup device, an estimating unit that estimates an image capturing direction of the image pickup device, a receiving unit that receives first information or second information from an external terminal, the first information representing an absolute position of the subject, the second information representing an absolute position of the external terminal, an image capturing direction of the external terminal, and a distance from the external terminal to the subject, and a control unit that controls the image capturing direction or notifies an operator of the image pickup device of information instructing that the image capturing direction be changed, based on the absolute position, the image capturing direction, and at least one of the first information and the second information.06-21-2012
20120188379AUTOMATIC-TRACKING CAMERA APPARATUS - An automatic-tracking camera apparatus which is capable of realizing continuous and smooth driving and obtaining an image with little position variation of a tracking target from a target position within the image and with little blur. The position of a camera body is changed by a gimbal device. The speed of a tracking target object at the next-after-next start timing of image acquisition by the camera body is predicted. The gimbal device is controlled so that the camera body reaches the position indicated by a position instruction value generated for the next-after-next start timing of image acquisition by the camera body, at the next-after-next start timing, and the speed of the camera body at the next-after-next start timing of image acquisition by the camera body corresponds to the speed predicted for the next-after-next timing of image acquisition.07-26-2012
20120200715IMAGING APPARATUS, CONTROL METHOD THEREOF, AND STORAGE MEDIUM - An imaging apparatus includes an imaging unit configured to acquire image data, a positioning unit configured to perform positioning processing for acquiring positional information, a first control unit configured to control the positioning unit to perform the positioning processing at a first time interval, and to control an association unit to associate the positional information with the image data, and a second control unit configured to control the positioning unit to perform the positioning processing at a second time interval, and to control a generation unit to generate the log data based on the positional information, wherein the second control unit changes a time interval based on the acquisition status of the positional information.08-09-2012
20120212622MOVING OBJECT IMAGE TRACKING APPARATUS AND METHOD - According to one embodiment, a moving object image tracking apparatus includes two drivers, a camera sensor, a tracking error detector, angle sensors, angular velocity sensors, a first calculator, a second calculator, a corrected tracking error detector, a generator, and a controller. The tracking error detector detects tracking errors, as deviation amounts of a moving object from the visual field center, from the image data as tracking error detection values. The corrected tracking error detector calculates corrected tracking errors for each period shorter than a sampling period, the tracking error detection values being constant, from a velocity vector and a relationship between a visual axis vector and a position vector. The generator generates angular velocity command values required to drive the drivers to track the moving object using the corrected tracking errors. The controller controls the drivers so that differences between the angular velocity command values and angular velocities become zero.08-23-2012
20120212623SYSTEM AND METHOD OF CONTROLLING VISION DEVICE FOR TRACKING TARGET BASED ON MOTION COMMANDS - A system of controlling a vision device based on motion commands may include: a movable body; a vision device driven by being connected to the body and receiving image information; a driving unit driving the body in accordance with a motion command; and a control unit which calculates motion information of the body using the motion command and drives the vision device so as to compensate for an influence caused by the motion of the body using the calculated motion information. By using the system, reliable image information may be obtained in a manner such that a vision device included in a subject such as a locomobile robot is controlled to watch a predetermined target even when the subject moves.08-23-2012
20120224068DYNAMIC TEMPLATE TRACKING - Various arrangements for tracking a target within a series of images are presented. The target may be detected within a first image at least partially based on a correspondence between the target and a stored reference template. A tracking template may be created for tracking the target using the first image. The target may be located within a second image using the tracking template.09-06-2012
20120229651IMAGE PICKUP APPARATUS WITH TRACKING FUNCTION AND TRACKING IMAGE PICKUP METHOD - An image pickup apparatus with a tracking function includes a lens apparatus having aperture and focus adjustment functions and a camera apparatus connected to the lens apparatus and having an image pickup element. The apparatus has a movement detector that detects movement of an object to be tracked in a picked-up image, and a stop controller that performs a control of changing the aperture value of the aperture stop in an opening direction when movement of the object to be tracked is detected by the movement detector.09-13-2012
20120242838IMAGE CAPTURING APPARATUS AND METHOD FOR CONTROLLING THE SAME - An image capturing apparatus is provided that is capable of performing both object detection using image recognition and object detection using movement detection on successively captured images. In the image capturing apparatus, the reliability of the result of the object detection using image recognition is evaluated based on the previous detection results. If it is determined that the reliability is high, execution of the object detection using movement detection is determined. If it is determined that the reliability is low, non-execution of the object detection using movement detection is determined. With this configuration, the object region can be tracked appropriately.09-27-2012
20120249802DISTRIBUTED TARGET TRACKING USING SELF LOCALIZING SMART CAMERA NETWORKS - A plurality of camera devices are configured to localize one another based on visibility of each neighboring camera in an image plane. Each camera device captures images and identifies sightings of candidate targets. The camera devices share information about sightings and triangulate the positions of targets. Targets are matched to known tracks based on prior images, allowing targets to be tracked in a 3D environment.10-04-2012
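A minimal sketch of the triangulation step mentioned in this abstract: given one sighting ray per camera (origin plus direction), the least-squares point closest to all rays is computed. The ray representation and the least-squares formulation are assumptions, not the patented method.

```python
import numpy as np

def triangulate(origins, directions):
    """Return the 3D point with the smallest summed squared distance to a set of rays.
    `origins` and `directions` are (N, 3) arrays; directions need not be unit length."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(np.asarray(origins, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)         # assumes the rays are not all parallel
```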
20120268608AUTOMATIC TRACKING CONTROL APPARATUS FOR CAMERA APPARATUS AND AUTOMATIC TRACKING CAMERA SYSTEM HAVING SAME - An automatic tracking control apparatus for a camera apparatus having a panning or tilting function, comprising an object recognition unit to recognize an object in a picked-up image, a tracking object setting unit to set the recognized object as the tracking object, an output position setting unit to set a position in the image for outputting the image of the tracking object, a control computing unit to output a drive signal to locate the tracked object at the output position, and an image output unit to output an image in which an indication of the tracking object is superimposed on the image. Automatic tracking is suspended to change the position of the tracking object in the image by the output position setting unit, and automatic tracking is restarted by outputting a drive signal for driving the camera apparatus to locate the tracked object at the output position.10-25-2012
20120274780IMAGE APPARATUS, IMAGE DISPLAY APPARATUS AND IMAGE DISPLAY METHOD - An image apparatus, comprising an image capture device that captures an image, a display device that displays the image, a display control block that enlarges and displays a first area, included in a first image, on the display device, a target setting block that sets an object in the first area as a target for tracking and a detecting block that detects a second area, including the object, in a second image which is specified to be displayed next on the display device in place of the first image, the display control block enlarging and displaying the second area on the display device.11-01-2012
20120274781MARGINAL SPACE LEARNING FOR MULTI-PERSON TRACKING OVER MEGA PIXEL IMAGERY - A method for tracking pedestrians in a video sequence, where each image frame of the video sequence corresponds to a time step, includes using marginal space learning to sample a prior probability distribution p(x11-01-2012
20120274782Object Recognition Method and Recognition Apparatus - A target fish-eye image for object recognition is split into regions in accordance with the distortion direction. An object recognition portion performs object recognition by using databases prepared in accordance with the regions respectively. The same process is applied to a plurality of target fish-eye images created by rotation of the target fish-eye image. A detected coordinate transformation portion restores the obtained position of an object into the original position by inverse rotation, and outputs a thus obtained result as a detection result. Detection accuracy in object recognition is improved, and the quantity of data in the databases is reduced. Rotated object images required for database creation are used for creation of databases for object recognition for the fish-eye image. Thus, a distorted fish-eye image for object recognition is directly used without projective conversion or image conversion based on corrective operation to the fish-eye image.11-01-2012
20120274783IMAGING WITH REAL-TIME TRACKING USING OPTICAL COHERENCE TOMOGRAPHY - An optical coherence tomography system is provided. The system includes an OCT imager; a two-dimensional transverse scanner coupled to the OCT imager, the two-dimensional transverse scanner receiving light from the light source and coupling reflected light from a sample into the OCT imager; optics that couple light between the two-dimensional transverse scanner and the sample; a video camera coupled to the optics and acquiring images of the sample; and a computer coupled to receive images of the sample from the video camera, the computer processing the images and providing a motion offset signal based on the images to the two-dimensional transverse scanner.11-01-2012
20120274784SYSTEM AND METHOD FOR RECOGNIZING A UNIT LOAD DEVICE (ULD) NUMBER MARKED ON AN AIR CARGO UNIT - The present invention provides a system and method for recognizing a Unit Load Device (ULD) number marked on an air cargo unit. The system includes at least one camera configured to acquire images of the ULD number. It also includes a presence sensing module configured to detect a presence status of the air cargo unit in a scanning zone of the system, the presence status can have a value being one of present and absent, and a recognition processor coupled to the presence sensing module and to the at least one camera. The recognition processor is configured to obtain from the presence sensing module information relating to the presence status of said air cargo unit, to trigger the at least one camera to acquire the images upon a change in the value of the presence status, and to process the images for recognizing the ULD number.11-01-2012
20120274785SUBJECT TRACKING APPARATUS, CAMERA HAVING THE SUBJECT TRACKING APPARATUS, AND METHOD FOR TRACKING SUBJECT - A subject tracking apparatus includes a light detector, a focus detector, and a tracking controller. The focus detector is configured to detect focus information at a plurality of focus detection regions in a view of an image. The focus information includes at least focus information detected at a subject region of a subject in the view. The tracking controller is configured to determine at least one first region having substantially same light information as light information detected at the subject region from among a plurality of light measurement regions, to determine at least one second region having substantially same focus information as the focus information detected at the subject region from among the plurality of focus detection regions, and to determine reference information for tracking the subject in the view based on the at least one first region and the at least one second region.11-01-2012
20120293663DEVICE FOR DETERMINING DISAPPEARING DIRECTION AND METHOD THEREOF, APPARATUS FOR VIDEO CAMERA CALIBRATION AND METHOD THEREOF - A disappearing direction determination device and method, a video camera calibration apparatus and method, a video camera and a computer program product are provided. The device comprises: a moving target detecting unit for detecting in the video image a moving target area where a moving object locates; a feature point extracting unit for extracting at least one feature point on the moving object in the detected moving target area; a moving trajectory obtaining unit for tracking a movement of the feature point in a predetermined number of video image frames to obtain a movement trajectory of the feature point; and a disappearing direction determining unit for determining, according to the movement trajectories of one or more moving objects in the video image, a disappearing direction pointed by a major moving direction of the moving objects. Thus, a disappearing direction and video camera gesture parameters can be determined accurately.11-22-2012
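A minimal sketch of deriving a dominant disappearing direction from feature-point trajectories, as this abstract describes at a high level; averaging per-trajectory unit displacement vectors is an illustrative choice, not the method claimed in the patent.

```python
import math

def disappearing_direction(trajectories):
    """Return the dominant direction of motion, in degrees, from a set of feature-point
    trajectories, where each trajectory is a list of (x, y) positions over time."""
    sx = sy = 0.0
    for traj in trajectories:
        dx = traj[-1][0] - traj[0][0]
        dy = traj[-1][1] - traj[0][1]
        norm = math.hypot(dx, dy)
        if norm > 0:
            sx += dx / norm                 # accumulate unit displacement vectors
            sy += dy / norm
    return math.degrees(math.atan2(sy, sx))
```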
20120293664CHARACTER RECOGNITION SYSTEM AND METHOD FOR RAIL CONTAINERS - A system and method that enable precise identification of characters contained in vehicle license plates, container I.D., chassis I.D., aircraft serial numbers and other such identification markings. The system can process these identified characters and operate devices, such as access control operations, traffic systems and vehicle and container tracking and management systems, and provide records of all markings together with their images.11-22-2012
20120293665IMAGING DEVICE INCLUDING TARGET TRACKING FUNCTION - An imaging device includes an imaging unit capturing an image of a subject, and tracks, through images captured in time series, an area in which a specific target appears. The device includes a parameter acquiring unit acquiring a photographic parameter from the imaging unit, a target area determining unit determining an area of a captured image including the specific target as a target area, a track area adjusting unit setting a track frame for a track area to track the target area including the specific target and adjusting a size of the track frame based on the photographic parameter, and a track area searching unit searching the captured image for the track area, while moving the size-adjusted track frame, based on a similarity between a characteristic amount of the track area of a current captured image and that of the target area of a previous captured image.11-22-2012
20120300082Automatic Device Alignment Mechanism - An alignment suite includes first and second targeting devices and an optical coupler. The first targeting device is configured to perform a positional determination regarding a downrange target. The first targeting device includes an image processor. The second targeting device is configured to perform a targeting function relative to the downrange target and is affixable to the first targeting device. The optical coupler enables the image processor to capture an image of a reference object at the second targeting device responsive to the first and second targeting devices being affixed together. The image processor employs processing circuitry that determines pose information indicative of an alignment relationship between the first and second targeting devices relative to the downrange target based on the image captured.11-29-2012
20120300083IMAGE PICKUP APPARATUS AND METHOD FOR CONTROLLING THE SAME - An image pickup apparatus is provided which realizes an improvement in the accuracy of a subject tracking function of the image pickup apparatus during continuous shooting. This image pickup apparatus includes an image pickup unit configured to capture a plurality of auxiliary images during an interval between capturing of a main image and capturing of a next main image; a first subject tracking processing unit configured to detect a region where a subject that is the same as a main subject exists, from a first region that is a part of a first auxiliary image among the plurality of auxiliary images; and a second subject tracking processing unit configured to detect a region where a subject that is the same as the main subject exists, from a second region of a second auxiliary image among the plurality of auxiliary images, the second region being larger than the first region.11-29-2012
20120320219IMAGE GATED CAMERA FOR DETECTING OBJECTS IN A MARINE ENVIRONMENT - System for detecting objects protruding from the surface of a body of water in a marine environment under low illumination conditions, the system comprising a gated light source, generating light pulses toward the body of water illuminating substantially an entire field of view, a gated camera, sensitive at least to wavelengths of the light generated by the gated light source, the gated camera receiving light reflected from at least one object, within the field of view, protruding from the surface of the body of water and acquiring a gated image of the reflected light, and a processor coupled with the gated light source and with the gated camera, the processor gating the gated camera to be set ‘OFF’ for at least the duration of time it takes the gated light source to produce a light pulse in its substantial entirety in addition to the time it takes the end of the light pulse to complete traversing a determined distance from the system and back to the gated camera, the processor further setting, for each pulse, the gated camera to be ‘ON’ for an ‘ON’ time duration until the light pulse, reflecting back from the object, is received by the gated camera.12-20-2012
20120327249AUGMENTED REALITY METHOD AND DEVICES USING A REAL TIME AUTOMATIC TRACKING OF MARKER-FREE TEXTURED PLANAR GEOMETRICAL OBJECTS IN A VIDEO STREAM - Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.12-27-2012
20130002884IMAGING APPARATUS HAVING OBJECT DETECTION FUNCTION AND METHOD FOR CONTROLLING IMAGING APPARATUS - An imaging apparatus having a function of detecting a face from an image signal on which a focusing frame is superimposed is configured so that a face selected to be a main object can continue being selected to be a main object even if a focusing frame is displayed as superimposed on the face. A central processing unit moves a position of a focusing frame to be displayed by a display unit to a position corresponding to a detected object if a ratio of a size of the detected object to a size of the focusing frame displayed on the display unit is greater than or equal to a predetermined threshold, and does not move the position of the focusing frame to be displayed by the display unit to a position corresponding to the detected object if the ratio is less than the threshold.01-03-2013
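The rule described above reduces to a single size-ratio test; a minimal sketch (illustrative only; the function names and the threshold value are assumptions) might read:

```python
def update_focusing_frame(frame_pos, frame_size, face_pos, face_size,
                          ratio_threshold=0.5):
    """Re-center the focusing frame on a detected face only when the face is
    large enough relative to the frame; otherwise keep the current position."""
    if face_size / frame_size >= ratio_threshold:
        return face_pos   # ratio meets the threshold: move the frame to the face
    return frame_pos      # face too small relative to the frame: leave it in place
```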
20130002885IMAGE PICK-UP APPARATUS AND TRACKING METHOD THEREFOR - An image pick-up apparatus that makes it possible to appropriately set a holding time, during which there is a possibility of returning to a state that allows tracking when a subject temporarily cannot be tracked, to improve ease of use for the user. A specifying unit specifies a subject included in a captured image. A display unit displays the captured image on a screen and displays identification information showing that the specified subject is tracked. A tracking unit tracks the subject. A setting unit sets a holding time in which the display of the identification information is held according to at least one of a focal length of the image pick-up apparatus and a subject distance. An elimination unit eliminates a display of the identification information when the holding time has passed after the tracking unit lost the subject.01-03-2013
20130021477METHOD AND CAMERA FOR DETERMINING AN IMAGE ADJUSTMENT PARAMETER - The present invention relates to a method and a camera for determining an image adjustment parameter. The method includes receiving a plurality of images representing an image view, detecting from the plurality of images events of a specific event type, identifying a location within the image view where the event of the specific type is present, determining a presence value of each of the identified locations, and determining an image adjustment parameter based on data from an adjustment location within the image view. The adjustment location is determined based on the presence value in each location of a plurality of locations within the image view.01-24-2013
20130033607METHOD OF AUTOMATICALLY TRACKING AND PHOTOGRAPHING CELESTIAL OBJECTS, AND CAMERA EMPLOYING THIS METHOD - A method of automatically tracking and photographing celestial objects which captures a still image of a celestial object(s) where each celestial object appears stationary simply by making an exposure with a camera directed toward an arbitrary-selected celestial object and fixed with respect to the ground and without using an equatorial, and also a camera that employs this method. The method includes inputting latitude information at a photographic site, photographing azimuth angle information, photographing elevation angle information, attitude information of a photographic apparatus and focal length information of a photographing optical system; calculating movement amounts of the celestial object image relative to the photographic apparatus, for fixing the celestial object image with respect to the predetermined imaging area of an image pickup device, using all of the input information; and obtaining a photographic image by moving at least one of the predetermined imaging area and the celestial object image.02-07-2013
20130050499INDIRECT TRACKING - A mobile platform in a multi-user system tracks its own position with respect to an object and tracks remote mobile platforms even though they are unknown moving objects. The mobile platform captures images of the object and tracks its position with respect to the object as the position changes over time using the multiple images. The mobile platform receives the position of a remote mobile platform with respect to the object as that position changes over time. The mobile platform tracks the position of the mobile platform with respect to the remote mobile platform using the position of the mobile platform and the received position of the remote mobile platform. The mobile platform may render virtual content with respect to the object and the remote mobile platform based on the tracked positions or may detect or control interactions or trigger events based on the tracked positions.02-28-2013
20130050500INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING APPARATUS, AND INFORMATION PROCESSING METHOD, UTILIZING AUGMENTED REALITY TECHNIQUE - An exemplary embodiment provides an information processing program. The information processing program includes image obtaining instructions, search target detection instructions, shape recognition instructions, event occurrence instructions, virtual image generation instructions, and display control instructions. The search target detection instructions cause a computer to detect a search target from an image of a subject obtained in accordance with the image obtaining instructions. The shape recognition instructions cause the computer to recognize a shape of the search target. The event occurrence instructions cause the computer to cause an event to occur in a virtual space in accordance with the shape of the search target. The virtual image generation instructions cause the computer to generate a virtual image by shooting the event that occurs in accordance with the event occurrence instructions with a virtual camera arranged in the virtual space.02-28-2013
20130050501Metrology Method and Apparatus, and Device Manufacturing Method - A target structure including a periodic structure is formed on a substrate. An image of the target structure is detected while illuminating the target structure with a beam of radiation, the image being formed using a first part of non-zero order diffracted radiation while excluding zero order diffracted radiation. Intensity values extracted from a region of interest within the image are used to determine a property of the periodic structure. A processing unit recognizes locations of a plurality of boundary features in the image of the target structure to identify regions of interest. The number of boundary features in each direction is at least twice a number of boundaries of periodic structures within the target structure. The accuracy of locating the region is greater than by recognizing only the boundaries of the periodic structure(s).02-28-2013
20130050502MOVING OBJECT TRACKING SYSTEM AND MOVING OBJECT TRACKING METHOD - A moving object tracking system includes an input unit, a detection unit, a creating unit, a weight calculating unit, a calculating unit, and an output unit. The detection unit detects all tracking target moving objects from each of input images input. The creating unit creates a combination of a path that links each moving object detected in a first image to each moving object detected in a second image, a path that links each moving object detected in the first image to an unsuccessful detection in the second image, and a path that links an unsuccessful detection in the first image to each moving object detected in the second image. The calculating unit calculates a value for the combination of the paths to which weights are allocated. The output unit outputs a tracking result.02-28-2013
20130057700LINE TRACKING WITH AUTOMATIC MODEL INITIALIZATION BY GRAPH MATCHING AND CYCLE DETECTION - A vision based tracking system in a mobile platform tracks objects using groups of detected lines. The tracking system detects lines in a captured image of the object to be tracked. Groups of lines are formed from the detected lines. The groups of lines may be formed by computing intersection points of the detected lines and using the intersection points to identify connected lines, where the groups of lines are formed from the connected lines. A graph of the detected lines may be constructed and intersection points identified. Interesting subgraphs are generated using the connections and the group of lines is formed with the interesting subgraphs. Once the groups of lines are formed, the groups of lines are used to track the object, e.g., by comparing the groups of lines in a current image of the object to groups of lines in a previous image of the object.03-07-2013
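A minimal sketch of the line-grouping step (not the patented graph matching or cycle detection; the segment representation, the proper-crossing test, and connected-component grouping are all assumptions) could be:

```python
from itertools import combinations

def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 properly crosses segment q1-q2 (touching/collinear excluded)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def group_lines(segments):
    """Group line segments (pairs of endpoints) into connected components of
    an intersection graph."""
    n = len(segments)
    adj = [set() for _ in range(n)]
    for i, j in combinations(range(n), 2):
        a, b = segments[i], segments[j]
        if segments_intersect(a[0], a[1], b[0], b[1]):
            adj[i].add(j)
            adj[j].add(i)
    groups, seen = [], set()
    for start in range(n):
        if start in seen:
            continue
        stack, component = [start], []
        while stack:
            v = stack.pop()
            if v in seen:
                continue
            seen.add(v)
            component.append(v)
            stack.extend(adj[v] - seen)
        groups.append(component)
    return groups
```

Groups produced this way could then be compared between consecutive frames to follow the object, as the abstract suggests.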
20130057701DEVICE AND METHOD FOR IMAGE PROCESSING - An image processing device includes: an extractor configured to extract a region of interest which includes a point of interest and satisfies a specified condition in a first image frame; a divider configured to divide the region of interest into a first subregion including the point of interest and a second subregion not including the point of interest at a narrow portion of the region of interest; and a specifying unit configured to specify a specified pixel in the first subregion as a point of interest of a second image frame.03-07-2013
20130057702OBJECT RECOGNITION AND TRACKING BASED APPARATUS AND METHOD - Object recognition and tracking methods, devices and systems are disclosed. One embodiment of the present invention pertains to a method for associating an object with an event and a condition triggering the event in response to a receipt of a representation of the object to be recognized and tracked. The method also comprises tracking a movement of the object and storing information associated with the movement. The method further comprises generating data associated with the object, based on the information associated with the movement of the object, in response to occurrence of the condition triggering the event.03-07-2013
20130063605IMAGING APPARATUS, IMAGE PROCESSING METHOD, AND RECORDING MEDIUM FOR RECORDING PROGRAM THEREON - An imaging apparatus includes: a setting processor which, with respect to a specific frame image, sets an image area including the object to be tracked in a pan-blur shot as a tracking image area; a searching processor which sets a scanning area for each of a plurality of frame images following the frame image used to set the tracking image area, moves the scanning area over each corresponding frame image, compares a characteristic amount of the image between the tracking image area and the scanning area, and obtains, for each of the frame images, a scanning area whose characteristic amount is similar to that of the tracking image area as a tracking object existing area including the tracking object image; a measuring processor which measures a moving speed of the tracking object image on a monitor screen by dividing the difference between the coordinate of the tracking object existing area obtained for one frame image and that obtained for the next frame image by a certain time interval; and a displaying processor which displays a speed display mark corresponding to the moving speed of the tracking object image on the monitor screen.03-14-2013
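The speed measurement described in the abstract above is a simple finite difference; as a brief sketch (function and parameter names are assumptions):

```python
def on_screen_speed(pos_prev, pos_curr, frame_interval_s):
    """Per-axis speed of the tracking object existing area on the monitor,
    in pixels per second, from two consecutive frame coordinates."""
    vx = (pos_curr[0] - pos_prev[0]) / frame_interval_s
    vy = (pos_curr[1] - pos_prev[1]) / frame_interval_s
    return vx, vy
```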
20130070103SUPER RESOLUTION BINARY IMAGING AND TRACKING SYSTEM - In one aspect, the invention provides an imaging system including an optical system adapted to receive light from a field of view and direct the received light to two image planes. A fixed image detector is optically coupled to one of the image planes to detect at least a portion of the received light and generate image data corresponding to at least a portion of the field of view. A movable (e.g., rotatable) image detector is optically coupled to the other image plane to sample the received light at different locations thereof to generate another set of image data at a higher resolution than the image data obtained by the fixed detector. The system can include a processor for receiving the two sets of image data to generate two images of the field of view.03-21-2013
20130070104SOUND SOURCE MONITORING SYSTEM AND METHOD THEREOF - A sound source monitoring system includes a sound receiving module, a sound detection module, a sound source localization module, and a camera module. The sound receiving module is configured to receive a plurality of sound signals. The sound detection module is for dividing an integrated signal formed by adding the sound signals received by the sound receiving module and normalizing the sum of the sound signals or dividing each of the sound signals into a plurality of sub-bands, calculating a signal-to-noise ratio (SNR) of each sub-band and a background noise, and accordingly determining whether to output the sound signals received by the sound receiving module to the sound source localization module. The sound source localization module is for outputting a sound source location by using the sound signals received by the sound receiving module. The camera module is for shooting an image corresponding to the sound source location.03-21-2013
20130070105TRACKING DEVICE, TRACKING METHOD, AND COMPUTER PROGRAM PRODUCT - According to an embodiment, a tracking device includes an acquiring unit, a first calculator, a second calculator, and a setting unit. The acquiring unit images a tracking target object to time-sequentially acquire an image. The first calculator calculates a first likelihood representing a degree of coincidence between a pixel value of each pixel included in a search region within the image and a reference value. The second calculator calculates a difference value between the pixel value of each pixel in the search region and the pixel value of a corresponding pixel in an image in a past frame. The setting unit sets weights of the first likelihood and the difference value so that as a distance between each pixel in the search region and a position of the tracking target object in the past increases, the weight of the first likelihood decreases and the weight of the difference value increases.03-21-2013
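A minimal sketch of the distance-weighted combination of the two scores (illustrative only; the Gaussian likelihood, the exponential weight profile, and all parameter values are assumptions, not the patented weighting):

```python
import numpy as np

def combined_score(search_pixels, prev_pixels, reference_value, prev_target_xy,
                   sigma=20.0):
    """Per-pixel tracking score over a search region: a reference-value
    likelihood and a frame-difference term, blended so the likelihood dominates
    near the target's previous position and the difference term dominates
    farther away."""
    search_pixels = np.asarray(search_pixels, dtype=np.float64)
    prev_pixels = np.asarray(prev_pixels, dtype=np.float64)
    h, w = search_pixels.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - prev_target_xy[0], ys - prev_target_xy[1])

    likelihood = np.exp(-((search_pixels - reference_value) ** 2) / (2 * sigma ** 2))
    difference = np.abs(search_pixels - prev_pixels) / 255.0

    w_like = np.exp(-dist / max(float(dist.max()), 1.0))  # decays with distance
    return w_like * likelihood + (1.0 - w_like) * difference
```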
20130076913SYSTEM AND METHOD FOR OBJECT IDENTIFICATION AND TRACKING - What is disclosed is a system and method for identifying materials comprising an object captured in a video and for using the identified materials to track that object as it moves across the captured video scene. In one embodiment, a multi-spectral or hyper-spectral sensor is used to capture a spectral image of an object in an area of interest. Pixels in the spectral planes of the spectral images are analyzed to identify a material comprising objects in that area of interest. A location of each of the identified objects is provided to an imaging sensor which then proceeds to track the objects as they move through a scene. Various embodiments are disclosed.03-28-2013
20130083201METHODS AND APPARATUS FOR DETERMINING MISALIGNMENT OF FIRST AND SECOND SENSORS - Method and apparatus of the invention determine the spatial and roll misalignment of first and second sensors using a common image scene from the sensors' overlapping Fields of View (FOV). The sensors are dissimilar in one or more of the following respects: Field-Of-View (FOV) size, detector array size, spectral response, gain and level, pixel resolution, dynamic range, and thermal sensitivity.04-04-2013
20130083202Method, System and Computer Program Product for Reducing a Delay From Panning a Camera System - For reducing a delay from panning a camera system, an estimate is received of a physical movement of the camera system. In response to the estimate, a determination is made of whether the camera system is being panned. In response to determining that the camera system is not being panned, most effects of the physical movement are counteracted in a video sequence from the camera system. In response to determining that the camera system is being panned, most effects of the panning are preserved in the video sequence, while concurrently the video sequence is shifted toward a position that balances flexibility in counteracting effects of a subsequent physical movement of the camera system.04-04-2013
20130100295INFORMATION PROCESSING APPARATUS AND METHOD - An information processing apparatus includes an acquirement section, a detection section, a selection section, a display control section and a commodity recognition section. The acquirement section acquires an image captured by an image capturing section. The detection section detects all or part of targets included in the image acquired by the acquirement section. The selection section selects any one target when the detection section detects a plurality of targets. The display control section displays the target selected by the selection section from the plurality of targets on the image acquired by the acquirement section. The commodity recognition section recognizes a commodity captured by the image capturing section based on a similarity showing the degree to which all or part of the images of the target selected by the selection section are similar to the reference image of each commodity.04-25-2013
20130107057METHOD AND APPARATUS FOR OBJECT TRACKING AND RECOGNITION05-02-2013
20130113940IMAGING DEVICE AND SUBJECT DETECTION METHOD - An imaging device includes a first detection part which detects one or more subjects in an image captured by the image capturing part capturing an image continuously; a second detection part which follows the one or more subjects detected; and a system control part which includes a setting part setting a part of the image as a limited region, and causes, after the first detection part detects the one or more subjects in the captured image, the second detection part to follow and detect a subject in an image captured subsequently to the captured image, and causes the first detection part to detect a subject in the limited region.05-09-2013
20130113941TRACKING APPARATUS AND TRACKING METHOD - The tracking target subject specifying unit specifies a tracking target subject in image data. The first tracking position detection unit detects first characteristic information from the image data and sets a first candidate tracking position based on the first characteristic information. The second tracking position detection unit detects second characteristic information from the image data and detects a second candidate tracking position based on the second characteristic information. The reference information acquisition unit acquires reference information. The control unit decides a true tracking position based on two determinations.05-09-2013
20130120585AUTOMATIC TRACKING CAMERA SYSTEM - An automatic tracking camera system includes: an image pickup unit; a driving unit for rotating the image pickup unit in panning or tilting direction; a signal receiver for receiving an object information signal; an image signal processor for recognizing an object in an image and detecting motion of the object in the image; a controller for controlling the image pickup unit, the driving unit, and the image signal processor; and a memory for storing, for each passageway, standby positions at which the image signal processor detects the object. The controller calculates an approaching passageway and angle of the object; selects a corresponding standby position from the standby positions stored in the memory; drives the driving unit and a lens apparatus to the standby position; and controls, when the image signal processor recognizes the object, the driving unit and the lens apparatus to automatically track the object based on detected information.05-16-2013
20130120586AUTOMATIC TRACKING CAMERA SYSTEM - An automatic tracking camera system includes: a rotating unit for panning and tilting an image pickup unit including a lens apparatus and an image pickup apparatus; a tracking object detector; a motion vector detector for detecting a motion vector of the object to be tracked; a capture position setting unit for setting a capture position of the object to be tracked in the picked up image; and a controller for controlling drive of the rotating unit. The controller controls the rotating unit in a capture mode to capture the object to be tracked at the capture position based on the motion vector detected by the motion vector detector after the tracking object detector has detected the object to be tracked in the picked up image, and a maintenance mode to continuously capture the object to be tracked at the capture position after the capture mode.05-16-2013
20130128054System and Method for Controlling Fixtures Based on Tracking Data - Systems and methods are provided for using tracking data to control the functions of an automated fixture. Examples of automated fixtures include light fixtures and camera fixtures. A method includes obtaining a first position of a tracking unit. The tracking unit includes an inertial measurement unit and a visual indicator configured to be tracked by a camera. A first distance is computed between the automated fixture and the first position and it is used to set a function of the automated fixture to a first setting. A second position of the tracking unit is obtained. A second distance between the automated fixture and the second position is computed, and the second distance is used to set the function of the automated fixture to a second setting.05-23-2013
20130141591HUMAN-MACHINE-INTERFACE AND METHOD FOR MANIPULATING DATA IN A MACHINE VISION SYSTEM - This invention provides a Graphical User Interface (GUI) that operates in connection with a machine vision detector or other machine vision system, which provides a highly intuitive and industrial machine-like appearance and layout. The GUI includes a centralized image frame window surrounded by panes having buttons and specific interface components that the user employs in each step of a machine vision system set up and run procedure. One pane allows the user to view and manipulate a recorded filmstrip of image thumbnails taken in a sequence, and provides the filmstrip with specialized highlighting (colors or patterns) that indicate useful information about the underlying images. The programming of logic is performed using a programming window that includes a ladder logic arrangement.06-06-2013
20130162837Image Capture - An apparatus including a processor configured to change automatically which pixels are used to define a target captured image in response to relative movement of a sensor frame of reference defined by a camera sensor and an image frame of reference defined by the image.06-27-2013
20130162838Transformation between Image and Map Coordinates - Systems and methods for transformations between image and map coordinates, such as those associated with a video surveillance system, are described herein. An example of a method described herein includes selecting a reference point within the image with known image coordinates and map coordinates, computing at least one transformation parameter with respect to a location and a height of the camera and the reference point, detecting a target location to be tracked within the image, determining image coordinates of the target location, and computing map coordinates of the target location based on the image coordinates of the target location and the at least one transformation parameter.06-27-2013
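The abstract derives its transformation parameters from a single reference point plus the camera's location and height; the sketch below swaps in a standard least-squares affine fit from several reference points, which is a deliberate simplification and not the patented computation (all names are assumptions):

```python
import numpy as np

def fit_affine(img_pts, map_pts):
    """Least-squares affine transform (6 parameters) from three or more
    image/map point correspondences. Simplified stand-in: the abstract instead
    derives its parameters from one reference point plus the camera's known
    location and height."""
    img = np.asarray(img_pts, dtype=float)
    A = np.hstack([img, np.ones((len(img), 1))])        # rows of [x, y, 1]
    M, *_ = np.linalg.lstsq(A, np.asarray(map_pts, dtype=float), rcond=None)
    return M                                            # 3x2 parameter matrix

def image_to_map(img_xy, M):
    """Map coordinates of a tracked target from its image coordinates."""
    x, y = img_xy
    return tuple(np.array([x, y, 1.0]) @ M)
```

Once the transform is fitted, every tracked image location maps to map coordinates with a single matrix product, which mirrors the "compute map coordinates of the target location" step in the abstract.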
20130162839TRACKING DEVICE AND TRACKING METHOD FOR PROHIBITING A TRACKING OPERATION WHEN A TRACKED SUBJECT IS OBSTRUCTED - A tracking device includes an imaging part for repeatedly acquiring image data on a subject image, a tracking processing part for setting a tracking position based on first image data to perform tracking processing on a subject in the tracking position based on second image data, and a relative distance information calculating part for calculating relative distance information using (1) determined information about a distance to the tracking position and (2) determined information about a distance to a surrounding area around the tracking position. When the tracking processing part determines that another subject in the surrounding area is located at a closer range than the subject in the tracking position, the tracking processing part prohibits tracking processing.06-27-2013
20130169820CAMERA DEVICE TO CAPTURE AND GENERATE TARGET LEAD AND SHOOTING TECHNIQUE DATA AND IMAGES - This invention relates to processes for capturing images of a target, and/or the shooter, at the time around the discharge of a gun, bow, or shooting device, and for displaying the images prior to discharge, around the point of discharge, and post discharge in a manner that allows the shooter to analyze the images and data. More particularly, the present invention relates to the process in shooting where a moving target must be led in order that the projectile (or projectiles) arrives on target after the point in time where the shoot decision is made and the projectile reaches the target area. This invention will aid the shooter by letting them see images and sight pictures of successful and unsuccessful shots and how much lead, if any, they had given the targets at the point in time they decided to shoot. It also allows for the shooter's technique to be recorded and analyzed.07-04-2013
20130169821Detecting Orientation of Digital Images Using Face Detection Information - A method of automatically establishing the correct orientation of an image using facial information. This method is based on the exploitation of the inherent property of image recognition algorithms in general and face detection in particular, where the recognition is based on criteria that are highly orientation-sensitive. By applying a detection algorithm to images in various orientations, or alternatively by rotating the classifiers, and comparing the number of faces successfully detected in each orientation, one may infer the most likely correct orientation. Such a method can be implemented as an automated or semi-automatic method to guide users in viewing, capturing or printing of images.07-04-2013
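The described orientation test can be sketched directly: rotate the image through the four 90-degree orientations, run a face detector on each, and keep the orientation with the most detections. The callable `detect_faces` below is an assumed stand-in for whatever detector is used; this is an illustration, not the patented method.

```python
import numpy as np

def estimate_orientation(image, detect_faces):
    """Return the rotation (0/90/180/270 degrees) at which the supplied face
    detector (a callable returning a list of detections) finds the most faces."""
    best_angle, best_count = 0, -1
    for k in range(4):                      # 0, 90, 180, 270 degrees
        rotated = np.rot90(image, k)
        count = len(detect_faces(rotated))
        if count > best_count:
            best_angle, best_count = 90 * k, count
    return best_angle
```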
20130182122INFORMATION PROCESSING APPARATUS AND METHOD - According to one embodiment, an information processing apparatus includes an acquirement unit configured to acquire a color image captured by an image capturing unit, an image conversion unit configured to convert the acquired color image into a monochrome image, a first recognition unit configured to specify a commodity included in the image captured by the image capturing unit based on the monochrome image, a second recognition unit configured to specify the commodity included in the image captured by the image capturing unit based on the acquired color image if the commodity cannot be specified by the first recognition unit, and an output unit configured to output information showing the commodity specified by the first recognition unit or the second recognition unit.07-18-2013
20130182123IMAGE PICKUP APPARATUS, CONTROL METHOD FOR THE SAME, AND PROGRAM THEREOF - An image pickup apparatus including an image pickup device configured to obtain a captured image corresponding to an object image, an object detecting unit configured to detect a specific object in the captured image, a position obtaining unit configured to obtain a position at which the specific object exists in the captured image, a composition specifying unit configured to specify a recommended composition on the basis of the existence position of the specific object when the specific object is viewed as a main object, and an instruction control unit configured to instruct the execution of a predetermined operation which notifies a user of the recommended composition.07-18-2013
20130188059Automated System and Method for Tracking and Detecting Discrepancies on a Target Object - A detection system including a target object having a target object coordinate system, a tracking unit configured to monitor a position and/or an orientation of the target object and generate a target object position signal indicative of the position and/or the orientation of the target object, a camera positioned to capture an image of the target object, an orienting mechanism connected to the camera to control an orientation of the camera relative to the target object, and a processor configured to analyze the image to detect a discrepancy in the image and, when the discrepancy is present in the image, determine a location of the discrepancy relative to the target object coordinate system based at least upon the target object position signal and the orientation of the camera, and then orient the camera and laser to aim at and point out the discrepancy.07-25-2013
20130194433IMAGING PROCESSING SYSTEM AND METHOD AND MANAGEMENT APPARATUS - An imaging processing system includes one or more image capturing apparatuses, a reading unit configured to read biometric information from an authentication object person, a similarity calculation unit configured to calculate similarity based on a result of comparing biometric information read by the reading unit with true biometric information of the authentication object person, an authentication unit configured to perform authentication based on a comparison between the similarity calculated by the similarity calculation unit and a preliminarily set threshold, and a control unit configured to control, if the authentication performed by the authentication unit is successful, imaging processing, which is performed by the image capturing apparatus, based on the similarity calculated by the similarity calculation unit.08-01-2013
20130201344SMART CAMERA FOR TAKING PICTURES AUTOMATICALLY - Methods, apparatuses, systems, and computer-readable media for taking great pictures at an event or an occasion. The techniques described in embodiments of the invention are particularly useful for tracking an object, such as a person dancing or a soccer ball in a soccer game and automatically taking pictures of the object during the event. The user may switch the device to an Event Mode that allows the user to delegate some of the picture-taking responsibilities to the device during an event. In the Event Mode, the device identifies objects of interest for the event. Also, the user may select the objects of interest from the view displayed by the display unit. The device may also have pre-programmed objects including objects that the device detects. In addition, the device may also detect people from the users' social networks by retrieving images from social networks like Facebook® and LinkedIn®.08-08-2013
20130201345METHOD AND APPARATUS FOR CONTROLLING VIDEO DEVICE AND VIDEO SYSTEM - Embodiments of the present invention disclose a method and an apparatus for controlling a video device as well as a video system. The video device includes a monitor and a camera that are relatively fixed, face a same direction, and are connected to a moving mechanism. The method includes: obtaining a facial image of a participant that is identified from a conference site image, where the conference site image is shot and provided by the camera; analyzing the facial image, and, after determining, with reference to an analysis result, that a facial position of the participant has deviated from directions of facing the monitor and the camera, determining a deviation direction; and controlling the moving mechanism to drive the monitor and the camera to move to a position of facing the facial position of the participant according to the deviation direction.08-08-2013
20130201346IMAGE PROCESSING DEVICE WITH FUNCTION FOR AUTOMATICALLY ADJUSTING SEARCH WINDOW - An image processing device according to the present invention includes a means (…)08-08-2013
20130201347PRESENCE DETECTION DEVICE - A user presence detection device includes a camera module with a silicon-based image sensor adapted to capture an image and a processing device configured to process the image to detect the presence of a user. The camera module further includes a light filter having a lower cut-off wavelength of between 550 nm and 700 nm and a higher cut-off wavelength of between 900 nm and 1100 nm.08-08-2013
20130208126COOPERATIVE OPTICAL-IMAGING SENSOR ARRAY - An apparatus and method for providing image primitives, such as edge polarity, edge magnitude, edge orientation, and edge displacement, and derivatives thereof, for an object are described. The data are obtained substantially simultaneously and processed in parallel such that multiple objects can be distinguished from one another in real time.08-15-2013
20130208127AUTO BURST IMAGE CAPTURE METHOD APPLIED TO A MOBILE DEVICE, METHOD FOR TRACKING AN OBJECT APPLIED TO A MOBILE DEVICE, AND RELATED MOBILE DEVICE - The present invention discloses a mobile device, where the mobile device includes an image sensing unit, a touch screen, and a processor. The image sensing unit is configured to receive at least an image of a scene comprising at least an object. The touch screen is configured to display at least an image of a scene and receive at least one user input. The processor is configured to identify the object in response to receiving a first user input corresponding to the object, determine characteristics of the object, track the object in the scene according to the characteristics of the object, and capture a number of images of the scene according to a motion state of the object. The motion state is determined according to variance of the characteristics of the object in consecutive images received by the image sensing unit.08-15-2013
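A minimal sketch of the burst decision (illustrative only; the variance of recent object-center positions stands in for the "variance of the characteristics", and the window, threshold, and burst-length values are assumptions):

```python
import numpy as np
from collections import deque

class BurstTrigger:
    """Decide how many frames to capture from the spread of the tracked
    object's recent positions."""
    def __init__(self, window=10, var_threshold=25.0, burst_len=5):
        self.history = deque(maxlen=window)
        self.var_threshold = var_threshold
        self.burst_len = burst_len

    def update(self, center_xy):
        """Record the latest object center; return the number of frames to capture."""
        self.history.append(center_xy)
        if len(self.history) < self.history.maxlen:
            return 1                                  # not enough samples yet
        spread = np.var(np.asarray(self.history), axis=0).sum()
        return self.burst_len if spread > self.var_threshold else 1
```

Here `update()` returns a burst length while the object is moving erratically and a single frame otherwise; the real device presumably derives its motion state from richer characteristics than the center position alone.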
20130208128METHOD AND APPARATUS FOR USING GESTURES TO CONTROL A LASER TRACKER - A laser measurement system includes a laser tracker having a structure rotatable about first and second axes, a first light source that launches a light beam from the structure, a distance meter, first and second angular encoders that measure first and second angles of rotation about the first and second axes, respectively, a processor, and a camera system. The system also includes a communication device that includes a second light source and an operator-controlled device that controls emission of light from the second light source, and a retroreflector target not disposed on the communication device. Also, the camera system is operable to receive the second light and to convert it into a digital image, and the processor is operable to determine a command to control operation of the tracker based on a pattern of movement of the second light source between first and second times and the digital image.08-15-2013
20130229528APPARATUS AND METHOD FOR AUTOMATIC VIDEO RECORDING - System and methods for pointing a device, such as a camera, at a remote target wherein the pointing of the device is controlled by a combination of location information obtained by global positioning technology and orientation information obtained by line of sight detection of the direction from the device to the target.09-05-2013
20130229529Camera to Track an Object - Methods and apparatus to create and display screen stereoscopic and panoramic images are disclosed. Methods and apparatus are provided to generate multiple images that are combined into a stereoscopic or a panoramic image. A controller provides correct camera settings for different conditions. A controller rotationally aligns images of lens/sensor units that are rotationally misaligned. A compact controllable platform holds and rotates a camera. A remote computing device with a camera and a digital compass tracks an object causing the camera in the platform to track the object.09-05-2013
20130235211Multifunctional Bispectral Imaging Method and Device - A multifunctional device and method for bispectral imaging are provided. The device and method include acquiring a plurality of bispectral images (IBM), each bispectral image being the combination of two acquired images (…)09-12-2013
20130242112IMAGE STABILIZATION AND TRACKING SYSTEM - An image stabilization and tracking system includes a primary imaging detector, a stabilization and tracking detector, an image processing and correction command control, and an adaptive optic device. The primary imaging detector is configured to detect, within a field of view, images of a primary object in an optic image. The stabilization and tracking detector is disposed outside of the field of view, and is configured to detect images of a tracking object in the optic image. The image processing and correction command control is coupled to receive the images of, and is configured to detect relative movement of, the tracking object. The adaptive optic device is coupled to receive correction commands and is configured, in response thereto, to move and thereby vary a position of the optic image.09-19-2013
20130242113PHOTOGRAPHING DEVICE, PHOTOGRAPHING DEVICE CONTROLLING METHOD, PHOTOGRAPHING DEVICE CONTROLLING PROGRAM, AND COMPUTER-READABLE RECORDING MEDIUM IN WHICH PHOTOGRAPHING DEVICE CONTROLLING PROGRAM IS RECORDED - A photographing device comprises a detector configured to detect a plurality of subjects from an image photographed by a photographing part; a priority setting part configured to set a priority on each of the plurality of subjects detected by the detector using one or a plurality of pieces of photographing history information, the photographing history information including focused subject information that shows one or a plurality of focused subjects in a history image, the focused subject information being related to the history image, the history image being photographed by the photographing part in the past; a focused subject selector configured to select the subject on which a lens is focused from the plurality of subjects, in order of the priority set by the priority setting part; and a presenting part configured to present the subject selected by the focused subject selector.09-19-2013
20130250126TRACKING APPARATUS - According to one embodiment, a tracking apparatus includes a spherical body (…)09-26-2013
20130258113SYSTEM AND METHOD FOR CONTROL BASED ON FACE OR HAND GESTURE DETECTION - System and method for control using face detection or hand gesture detection algorithms in a captured image. Based on the existence of a detected human face or a hand gesture in an image captured by a digital camera, a control signal is generated and provided to a device. The control may provide power to, or disconnect the power supply from, the device (or part of the device circuits). The location of the detected face in the image may be used to rotate a display screen to achieve a better line of sight with a viewing person. The difference between the location of the detected face and an optimum location is the error to be corrected by rotating the display to the required angular position. Hand gesture detection can be used as a replacement for a remote control for the controlled unit, such as a television set.10-03-2013
20130265439Robust Video-based Camera Rotation Estimation - A robust system and method for estimating camera rotation in image sequences. A rotation-based reconstruction technique is described that is directed to performing reconstruction for image sequences with a zero or near-zero translation component. The technique may estimate only the rotation component of the camera motion in an image sequence, and may also estimate the camera intrinsic parameters if not known. Input to the technique may include an image sequence, and output may include the camera intrinsic parameters and the rotation parameters for all the images in the sequence. By only estimating a rotation component of camera motion, the assumption is made that the camera is not moving throughout the entire sequence. However, the camera is allowed to rotate and zoom arbitrarily. The technique may support both the case where the camera intrinsic parameters are known and the case where the camera intrinsic parameters are not known.10-10-2013
20130265440IMAGE CAPTURING SYSTEM AND IMAGE CAPTURING METHOD - An image capturing system includes a sensor unit that is worn by a user and includes a first motion sensor that detects motion of the user, and a subject tracking apparatus that is integrated with a camera platform on which an imaging apparatus is mounted, includes a second motion sensor that detects motion of the camera platform, and controls the motion of the camera platform by using the second motion sensor based on the motion of the user detected by the first motion sensor to allow the imaging apparatus to track a subject.10-10-2013
20130271613SYSTEM FOR THE DETERMINATION OF RETROREFLECTIVITY OF ROAD SIGNS AND OTHER REFLECTIVE OBJECTS - A system for the determination of retroreflectivity values for reflective surfaces disposed along a roadway repeatedly illuminates an area along the roadway that includes at least one reflective surface using a light source. Multiple light intensity values are measured over a field of view which includes at least a portion of the area illuminated by the light source. A computer processing system is used to identify a portion of the light intensity values associated with a reflective surface and analyze the portion of the light intensity values to determine at least one retroreflectivity value for that reflective surface.10-17-2013
20130278777CAMERA GUIDED WEB BROWSING - Systems and methods for performing camera-guided browsing, such as web browsing, are described herein. A method for operating a camera-guided web browser as provided herein includes displaying a web page on a display associated with a portable device; passively detecting a first object within a field of view of a camera associated with the portable device; and altering at least part of the web page with first content associated with the first object in response to passively detecting the first object within the field of view of the camera.10-24-2013
20130278778AUTOMATIC TRACKING APPARATUS - An automatic tracking apparatus includes: an image pickup apparatus having a zoom function; a camera platform driving the image pickup apparatus in at least one of pan and tilt directions, and automatically tracking an object by operating the image pickup apparatus and camera platform; a unit for detecting a position of the object in picked-up image information; a prohibition area setting unit for setting a tracking prohibition area, according to information on the automatic tracking apparatus that includes information on the object including information on at least one of type, orientation and traveling speed of the object, or information on at least one of pan and tilt positions of the camera platform and zoom position; and a controller that does not perform a tracking operation in the prohibition area, and performs zooming, panning and tilting to perform the tracking operation when out of the prohibition area.10-24-2013
20130286216Rifle Scope Including a Circuit Configured to Track a Target - A rifle scope includes at least one optical sensor configured to capture a video of a view area, a display, a processor coupled to the display and to the at least one optical sensor, and a memory accessible to the processor. The memory stores instructions that, when executed, cause the processor to receive user input that identifies a target within the video, apply a visual tag to the target within the video, and adjust the visual tag to track the target within a sequence of frames. The memory further stores instructions that, when executed, cause the processor to provide the video including the visual tag to the display.10-31-2013
20130286217SUBJECT AREA DETECTION APPARATUS THAT EXTRACTS SUBJECT AREA FROM IMAGE, CONTROL METHOD THEREFOR, AND STORAGE MEDIUM, AS WELL AS IMAGE PICKUP APPARATUS AND DISPLAY APPARATUS - A subject area detection apparatus which is capable of improving the rate of detection for a subject area in an image and detecting the subject area with ease. A first subject area which is a specific area is detected from a subject image in an image having at least one subject image. An area including at least a part of the first subject area is detected as a second subject area from the subject image in the image. An estimated area is obtained by estimating, in the second subject area, an area corresponding to the first subject area. The first subject area and the estimated area are compared with each other to obtain a correlation determination result. A subject area detection result is output according to the correlation determination result.10-31-2013
20130286218IMAGE RECOGNITION DEVICE THAT RECOGNIZES SPECIFIC OBJECT AREA, METHOD OF CONTROLLING THE DEVICE, AND STORAGE MEDIUM, AS WELL AS IMAGE PICKUP APPARATUS, AND DISPLAY DEVICE - An image recognition device capable of accurately recognizing an object by efficiently selecting a specific object area as a recognition target, from detected object areas. The image recognition device recognizes a specific object area from images sequentially input in time series. An object detection section detects object areas from each image. An appropriateness determination section determines whether or not each detected object area is appropriate as a recognition target. A recognition target-selecting section selects an object area for use as the recognition target, from the detected object areas, based on a result of determination by the appropriateness determination section. An image recognition device recognizes whether or not the object area selected by the recognition target-selecting section is the specific object area.10-31-2013
20130307993IMAGE CAPTURE APPARATUS AND CONTROL METHOD THEREFOR - A specific subject region is detected, and if a focus detection area that is encompassed in the specific subject region is present, focus detection is performed on that focus detection area. This makes it possible to reduce the influence of a conflict between far and near subjects and reduce the amount of time required for focus detection in an image capture apparatus and in a control method therefor, the image capture apparatus being for performing automatic focus detection based on contrast evaluation values for a plurality of focus detection areas that are each located at a fixed position and each has a fixed size.11-21-2013
20130307994PHOTOMETRIC DEVICE, IMAGING DEVICE, AND CAMERA - A photometric device includes: an imaging unit that captures an image formed by an optical system, with a marker capable of switching between display and non-display disposed in a light path of the optical system; and a correction unit that, based upon a difference between a first image captured by the imaging unit in a state where the marker is displayed and a second image captured by the imaging unit in a state where the marker is not displayed, performs a correction of a third image that is different from the first image and the second image.11-21-2013
20130314547CONTROLLING APPARATUS FOR AUTOMATIC TRACKING CAMERA, AND AUTOMATIC TRACKING CAMERA HAVING THE SAME - A controller for a tracking camera for automatically tracking an object by controlling panning/tilting and zooming of an image pickup apparatus mounted on a camera platform includes an object recognition unit that recognizes an object in picked-up image, a setting unit that sets a tracking condition, a memory that stores at least one tracking condition, a readout unit that reads out the tracking condition, and a controller that controls panning/tilting of the platform and zooming of the image pickup apparatus to track the object under the condition, wherein the condition includes an output position in the picked-up image, and a display size of the object in the picked-up image, the setting unit includes an output position setting unit, and a display size setting unit, the memory stores the output position and/or the display size, and the readout unit reads out the output position and/or the display size.11-28-2013
20130314548IMAGE CAPTURING APPARATUS, IMAGE CAPTURING METHOD, AND COMPUTER PROGRAM PRODUCT - An image capturing apparatus includes: an image input unit that inputs an image; a designating unit that receives designation of an initial tracking-target region that is a first region in a first image frame of time-series image frames input from the image input unit; a feature value extracting unit that extracts a predetermined feature value from a target region in the first image frame; a first search unit that searches for a second region obtained by changing a size of the first region and by determining whether the feature value extracted from the second region satisfies a predetermined condition for enabling successful tracking and sets the second region as a new initial tracking-target region; and a second search unit that searches for a region similar to the newly set initial tracking-target region as a tracking-result region in a second image frame subsequent to the first image frame.11-28-2013
20130321643IMAGE DISPLAY DEVICE AND OBJECT DETECTION DEVICE - An image display device is provided with: a display having a display screen; a shooting unit that is arranged on an external side of the display screen so that an optical axis obliquely intersects a normal line of the display screen (for example, the normal line passing through the center) on a front surface side of the display screen and sequentially shoots images in a direction of the optical axis to capture shooting images; and a detection unit that detects a change of the shooting images shot by the shooting unit.12-05-2013
20130329055Camera System for Recording and Tracking Remote Moving Objects - A camera system for detecting and tracking moving objects located at a great distance includes a camera having a camera lens system, and a position stabilizing device. The camera includes a first image sensor and a second image sensor. The camera lens system includes optical elements for focusing incident radiation onto a radiation sensitive surface of the first image sensor and/or the second image sensor with a reflecting telescope arrangement and a target tracking mirror arrangement, and a drive device for a movable element of the target tracking mirror arrangement and with a control system for the drive device. The optical elements includes a first subassembly of optical elements having a first focal length and associated with the first image sensor, and a second subassembly of optical elements having a second focal length that is shorter than the first focal length and associated with the second image sensor.12-12-2013
20130335575ACCELERATED GEOMETRIC SHAPE DETECTION AND ACCURATE POSE TRACKING - A reference in an unknown environment is generated on the fly for positioning and tracking. The reference is produced in a top down process by capturing an image of a planar object with a predefined geometric shape, detecting edge pixels of the planar object, then detecting a plurality of line segments from the edge pixels. The plurality of line segments may then be used to detect the planar object in the image based on the predefined geometric shape. An initial pose of the camera with respect to the planar object is determined and tracked using the edges of the planar object.12-19-2013
20130335576DYNAMIC ADAPTATION OF IMAGING PARAMETERS - Representative implementations of devices and techniques provide adaptable settings for imaging devices and systems. Operating modes may be defined based on whether an object is detected within a preselected area. One or more parameters of emitted electromagnetic radiation may be dynamically adjusted based on the present operating mode.12-19-2013
20130335577CAMERA DEVICE AND METHOD FOR DETERMINING FIELD OF VIEW OF THE CAMERA DEVICE - In a method for determining a field of view of a camera device, a location device sends position information to establish anchor points. The camera device captures images of the anchor points, and obtains vertical angles of the anchor points, horizontal angles of the anchor points, and distances between the camera device and the anchor points. Each anchor point corresponds to one or more speakers. The method calculates rotation angles of speakers according to the vertical angles, the horizontal angles, the distances and anthropometric values of the speakers. The method further determines the field of view of the camera device according to the rotation angles.12-19-2013
20130342705BACKGROUND MODEL UPDATE METHOD FOR IMAGE PROCESS - The present invention relates to a background model update method for image process. The accuracy of distinguishing a foreground image from the background model can be improved by adjusting the background model update level according to image variations such as brightness variation, motion variation, color variation, etc.12-26-2013
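One way to read the abstract above is as an adaptive learning rate for a running-average background model; the sketch below is only an illustrative policy (the brightness scaling, the motion threshold, and the slow-down factor are all assumptions, not the patented update rule):

```python
import numpy as np

def update_background(background, frame, base_alpha=0.01):
    """Running-average background whose learning rate grows with global
    brightness change and shrinks where motion is detected.

    background: float array accumulated so far; frame: current frame (any
    integer or float dtype, same shape). Returns the updated background."""
    frame = frame.astype(np.float32)
    brightness_shift = abs(float(frame.mean()) - float(background.mean())) / 255.0
    alpha = min(1.0, base_alpha * (1.0 + 10.0 * brightness_shift))

    motion_mask = np.abs(frame - background) > 30      # crude foreground estimate
    rate = np.where(motion_mask, alpha * 0.1, alpha)   # update foreground slowly
    return (1.0 - rate) * background + rate * frame
```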
20140009623GESTURE RECOGNITION SYSTEM AND GLASSES WITH GESTURE RECOGNITION FUNCTION - Glasses with gesture recognition function include a glasses frame and a gesture recognition system. The gesture recognition system is disposed on the glasses frame and configured to detect hand gestures in front of the glasses thereby generating a control command. The gesture recognition system transmits the control command to an electronic device to correspondingly control the electronic device.01-09-2014
20140009624MOVABLE-MECHANICAL-SECTION CONTROLLING DEVICE, METHOD OF CONTROLLING MOVABLE MECHANICAL SECTION, AND PROGRAM - A movable-mechanical-section controlling device includes a pan/tilt driving controlling unit configured to perform driving control on a movable mechanical section having a structure that moves so that an image pickup direction of an image pickup section that obtains an image-pickup image by performing an image pickup operation changes in a pan direction and a tilt direction. In the controlling device, unit pan operations over an angular range in the pan direction are performed at each of two or more different tilt positions, with decreasing angle of elevation at those tilt positions.01-09-2014
20140022394APPARATUS AND METHOD FOR TRACKING OBJECT - A method of tracking an object includes obtaining an image captured by a camera, setting a plurality of patterns having various sizes, according to a distance from a horizon in the image to a plurality of pixels in the image, extracting an object matching one of the plurality of patterns having various sizes, while scanning the image using the plurality of patterns having various sizes, and displaying information about a position of the extracted object in the image.01-23-2014
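A minimal sketch of the horizon-dependent window sizing described above (the linear growth model and all constants are assumptions, not the patented sizing rule):

```python
def window_height_for_row(row, horizon_row, min_h=16, growth=0.25):
    """Scanning-window height for an image row: objects near the horizon appear
    small, and the window grows roughly linearly with distance below the horizon."""
    dist = max(0, row - horizon_row)
    return int(min_h + growth * dist)

def scan_rows(image_height, horizon_row, step=8):
    """Yield (row, window_height) pairs for a ground-plane scan below the horizon."""
    for row in range(horizon_row, image_height, step):
        yield row, window_height_for_row(row, horizon_row)
```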
20140028855CAMERA BASED INTERACTION AND INSTRUCTION - Disclosed are methods and apparatus for instructing persons using computer-based programs and/or remote instructors. One or more video cameras obtain images of the student or other participant. These images are analyzed by a computer to determine the locations or motions of one or more points on the student. This location data is fed to a computer program which compares the motions to known desired movements, or alternatively provides such movement data to an instructor, typically located remotely, who can aid in analyzing student performance. The invention preferably is used with a substantially life-size display, such as a projection display can provide, in order to make the displayed information a realistic partner or instructor for the student. Other applications are also disclosed, including sports training, dance, and remote dating.01-30-2014
20140028856FIREARM, AIMING SYSTEM THEREFOR, METHOD OF OPERATING THE FIREARM AND METHOD OF REDUCING THE PROBABILITY OF MISSING A TARGET - A firearm aiming system comprising an imaging system comprising an imaging sensor and an image processor; and a user display, wherein the imaging system is adapted to detect a potential target on the user display based on target features. In some embodiments the system includes a firing processor with an epsilon logic module for calculating a target aim-point/area used by the firing processor to make a firing decision.01-30-2014
20140028857HIGH FLUX COLLIMATED ILLUMINATOR AND METHOD OF UNIFORM FIELD ILLUMINATION - A device including an optical reader, a first light source, and a second light source. The optical reader has a field of view comprising a first surface point and a second surface point horizontally offset from the first surface point along the field of view. The first light source is positioned a first distance from the first surface point. The first light source is operably connected to a first control channel and has a first luminous output. The second light source is positioned a second distance from the second surface point and has a second luminous output. The first distance is different from the second distance, and the first luminous output is different from the second luminous output such that the illumination at the first surface point is substantially equivalent to the illumination at the second surface point of the field of view.01-30-2014
20140043490MOVABLE-MECHANICAL-SECTION CONTROLLING DEVICE, METHOD OF CONTROLLING MOVABLE MECHANICAL SECTION, AND PROGRAM - A movable-mechanical-section controlling device includes a pan/tilt driving controlling unit configured to perform driving control on a movable mechanical section having a structure that moves so that an image pickup direction of an image pickup section that obtains an image-pickup image by performing an image pickup operation changes in a pan direction and a tilt direction. In the controlling device, unit pan operations that are performed in an angular range in the pan direction are performed for the respective two or more different tilt positions with decreasing angle of elevation at the tilt positions.02-13-2014
20140063263SYSTEM AND METHOD FOR OBJECT TRACKING AND TIMING ACROSS MULTIPLE CAMERA VIEWS - A system and method for object tracking and timing across multiple camera views includes local and global tracking modules for tracking the location of objects as they traverse particular regions of interest within an area of interest. A local timing module measures the time spent with each object within the area captured by a camera. A global timing module measures the time taken by the tracked object to traverse the entire area of interest or the length of the stay of the object within the area of interest.03-06-2014
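A small sketch of the local/global timing bookkeeping, assuming some upstream tracker supplies per-frame object IDs and an in/out-of-region flag; the class name and fields are illustrative, not taken from the application.

```python
import time

class DwellTimer:
    """Accumulate, per tracked object ID, the time spent inside a region of interest.
    One instance per camera gives local timing; one fed with 'anywhere in the area
    of interest' flags gives global timing."""
    def __init__(self):
        self._entered = {}   # object_id -> entry timestamp
        self._totals = {}    # object_id -> accumulated seconds

    def update(self, object_id, inside_region, now=None):
        now = time.time() if now is None else now
        if inside_region and object_id not in self._entered:
            self._entered[object_id] = now                      # object just entered
        elif not inside_region and object_id in self._entered:
            dwell = now - self._entered.pop(object_id)          # object just left
            self._totals[object_id] = self._totals.get(object_id, 0.0) + dwell
        return self._totals.get(object_id, 0.0)
```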
20140071296DISPLAY APPARATUS AND DISPLAY METHOD - A display apparatus includes a sight line detection unit that detects a line of sight of a user by analyzing user video information, an enhancement processing unit that detects an intersection point of the line of sight of the user detected by the sight line detection unit and a video display surface of a monitor as an attention point, which is a point on which the user focuses attention, and performs enhancement processing for the monitor by setting a higher gain amount in stages from a position of a longer distance from the attention point toward a position of a smaller distance, and a display video output control unit that outputs video via the monitor based on the gain amount set by the enhancement processing unit and display video information whose input is received by a display video information input unit.03-13-2014
20140078311METHOD FOR GUIDING CONTROLLER TO MOVE TO WITHIN RECOGNIZABLE RANGE OF MULTIMEDIA APPARATUS, THE MULTIMEDIA APPARATUS, AND TARGET TRACKING APPARATUS THEREOF - Methods, systems, and devices for guiding a subject back within the recognizable visual range of a multimedia system are described. According to one of the described methods, when it is determined that the target has left the recognizable range of the multimedia system, sensor information is acquired from a portable electronic device (or controller) the user has been using to control the multimedia system, and the acquired sensor information is used to determine where the user is, relative to the recognizable range. In one example, the user is asked to make a gesture with the portable electronic device, and the sensor information concerning that gesture is used to determine the user's relative location. In another example, the sensor information recorded at the time the user left the recognizable range is used to determine the user's relative location.03-20-2014
20140078312METHOD AND APPARATUS FOR TRACKING THREE-DIMENSIONAL MOVEMENTS OF AN OBJECT USING A DEPTH SENSING CAMERA - A controller (03-20-2014
20140078313METHOD AND TERMINAL FOR DETECTING AND TRACKING MOVING OBJECT USING REAL-TIME CAMERA MOTION ESTIMATION - A method is provided for detecting and tracking a moving object using real-time camera motion estimation, including generating a feature map representing a change in an input pattern in an input image, extracting feature information of the image, estimating a global motion for recognizing a motion of a camera using the extracted feature information, correcting the input image by reflecting the estimated global motion, and detecting a moving object using the corrected image.03-20-2014
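The global-motion-compensation step lends itself to a short OpenCV sketch: estimate the camera motion from tracked features, warp the previous frame to cancel it, and treat the residual difference as candidate moving objects. This is a generic illustration under assumed thresholds, not the specific feature map or estimator of the application above.

```python
import cv2
import numpy as np

def moving_object_mask(prev_gray, cur_gray, diff_thresh=30):
    """Estimate global (camera) motion between two frames, compensate for it, and
    return a binary mask of residual motion attributed to independently moving objects."""
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                       qualityLevel=0.01, minDistance=7)
    if pts_prev is None:
        return np.zeros_like(cur_gray)
    pts_cur, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, pts_prev, None)
    good = status.flatten() == 1
    M, _ = cv2.estimateAffinePartial2D(pts_prev[good], pts_cur[good], method=cv2.RANSAC)
    if M is None:
        return np.zeros_like(cur_gray)
    h, w = cur_gray.shape
    stabilized_prev = cv2.warpAffine(prev_gray, M, (w, h))   # camera motion cancelled
    residual = cv2.absdiff(cur_gray, stabilized_prev)
    _, mask = cv2.threshold(residual, diff_thresh, 255, cv2.THRESH_BINARY)
    return mask                                              # non-zero where the scene moved on its own
```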
20140078314IMAGE BASED SYSTEMS FOR DETECTING INFORMATION ON MOVING OBJECTS - Systems and methods for generating images of an object having a known object velocity include imaging electromagnetic radiation from the object onto a sensor array of an imaging system, adjusting at least one of a shutter rate and a shutter direction of the imaging system in accordance with an image velocity of the image across the sensor array, and sampling output of the sensor array in accordance with the shutter rate and the shutter direction to generate the images. Systems and methods for generating images of an object moving through a scene include a first imaging system generating image data samples of the scene, a post processing system that analyzes the samples to determine when the object is present in the scene, and one or more second imaging systems triggered by the post processing system to generate one or more second image data samples of the object.03-20-2014
20140085483BROADBAND PASSIVE TRACKING FOR AUGMENTED REALITY - Technologies are generally described for a broadband passive sensing and tracking system that may employ a number of passive receivers that each have the capability of sensing electromagnetic waves (e.g., Radio Frequency “RF” signals) from surrounding broadcast sources. Each passive receiver may be adapted to sense through one or more antennas. Multiple receivers at different positions may be utilized to form a broadband sensing network adapted to perform collaborative tracking of a scene of interest. According to some examples, a beam-forming algorithm may be applied over the broadband sensing network utilizing an antenna array formed by the passive receivers to localize and track objects.03-27-2014
20140098240METHOD AND APPARATUS FOR PROCESSING COMMANDS DIRECTED TO A MEDIA CENTER - A system that incorporates teachings of the subject disclosure may include, for example, a method for controlling a steering of a plurality of cameras to identify a plurality of potential sources, identifying the plurality of potential sources according to image data provided by the plurality of cameras, assigning a beam of a plurality of beams of a plurality of microphones to each of the plurality of potential sources, detecting a first command comprising one of a first audible cue based on signals from a portion of the plurality of microphones, a first visual cue based on image data from one of the plurality of cameras, or both for controlling a media center, and configuring the media center according to the first command. Other embodiments are disclosed.04-10-2014
20140098241COMPACT, RUGGED, INTELLIGENT TRACKING APPARATUS AND METHOD - In a video recording environment, a compact, rugged, intelligent tracking apparatus and method enables the automation of the labor-intensive operation of cameras, lights, microphones and other devices. Auto-framing of a tracked object within the viewfinder of a supported camera is possible. The device can sense more than one object at once, and includes multiple easy ways to switch from one object to another. The methods show how the auto-framing device can be “predictive” of movements and can intelligently smooth the tilt and swivel motions so that the end effect is a professional-looking picture or video. It is designed to be uniquely small yet rugged and waterproof, and it can accept configuration input from users via a smartphone or extreme-sports camera over wi-fi or bluetooth, including user-programmable scripts that automate the device functionality in easy-to-use ways.04-10-2014
20140111653METHOD AND SYSTEM FOR THE TRACKING OF A MOVING OBJECT BY A TRACKING DEVICE - A method for tracking a moving object by a tracking device comprises the following steps: (a) reception by the tracking device of a signal transmitted by the moving object or objects and comprising a first item of information of the position of the moving object determined from information received from a satellite positioning system, (b) determination of a second item of information of the relative altitude of the moving object with respect to the tracking device, and the said second item of information is calculated on the basis of the atmospheric pressures measured at the altitude of the tracking device and on the moving object, (c) calculation of the relative position of the moving object with respect to the tracking means, from the first and second items of information using a data fusion technique, (d) orientation of the tracking means towards the relative position calculated in step (c).04-24-2014
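Step (b), the relative altitude from two pressure readings, can be approximated with the international barometric formula; a small sketch follows, with the standard sea-level pressure as an assumed reference (the abstract does not specify the exact conversion).

```python
def relative_altitude_m(pressure_tracker_hpa, pressure_object_hpa, p0_hpa=1013.25):
    """Approximate altitude difference (object minus tracker) in metres from two
    barometric pressure readings, using the international barometric formula."""
    def altitude(p_hpa):
        return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
    return altitude(pressure_object_hpa) - altitude(pressure_tracker_hpa)
```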
20140118555Systems, Devices, and Methods Employing Angular-Resolved Scattering and Spectrally Resolved Measurements for Classification of Objects - Systems, devices, and methods are described for identifying, classifying, differentiating, etc., objects. For example a hyperspectral imaging system can include a dark-field module operably coupled to at least one of an optical assembly, a dark-field illuminator, and a hyperspectral imaging module. The dark-field module can include circuitry having one or more sensors operable to acquire one or more dark-field micrographs associated with scattered electromagnetic energy from an object interrogated by the dark-field interrogation stimulus. The hyperspectral imaging module can be operably coupled to the dark-field module, and can include circuitry configured to generate an angular-resolved and spectrally resolved scattering matrix based on the one or more dark-field micrographs of the object.05-01-2014
20140118556DETECTION SYSTEM - A detection system comprises a light source configured to illuminate an object, an image sensor configured to receive light reflected from the object, and a processor. The image sensor generates a first picture when the light source is turned on. The image sensor generates a second picture when the light source is turned off. The processor is configured to subtract the second picture from the first picture for determining an object image produced by the object.05-01-2014
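The on/off subtraction described above can be shown in a couple of lines; this sketch assumes grayscale frames and an illustrative noise floor.

```python
import cv2

def isolate_object(picture_light_on, picture_light_off, noise_floor=10):
    """Subtract the source-off picture from the source-on picture so the ambient
    background cancels and only the object lit by the controlled source remains."""
    diff = cv2.subtract(picture_light_on, picture_light_off)   # saturating, stays >= 0
    _, mask = cv2.threshold(diff, noise_floor, 255, cv2.THRESH_BINARY)
    return diff, mask
```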
20140125812SPHERICAL PIN-HOLE MODEL FOR USE WITH CAMERA LENS IMAGE DATA - A camera image processing subsystem processes image data corresponding to observations taken through a lens of focal length f using a spherical pin-hole model that maps the image data through a perspective center of a pin-hole perspective plane located within the lens onto a model sphere that is a focal length f in diameter and has its center at the perspective center of the pin-hole perspective plane. The subsystem models systematic distortion as rotation about the coordinate axes of the pin-hole perspective plane, and maps all of the data, over the entire field of view of the lens, to corresponding spherical coordinates.05-08-2014
20140125813OBJECT DETECTION AND TRACKING WITH VARIABLE-FIELD ILLUMINATION DEVICES - Imaging systems and methods optimize illumination of objects for purposes of detection, recognition and/or tracking by tailoring the illumination to the position of the object within the detection space. For example, feedback from a tracking system may be used to control and aim the lighting elements so that the illumination can be reduced or increased depending on the need.05-08-2014
20140125814IMAGING APPARATUS, IMAGING METHOD THEREOF, AND COMPUTER READABLE RECORDING MEDIUM - An imaging apparatus has a display control unit that causes a display unit to display target information indicating a direction to a target position with an optical axis of the imaging apparatus being a reference, based on an azimuth angle of the imaging apparatus detected by an azimuth detecting unit and an elevation angle detected by a tilt angle detecting unit.05-08-2014
20140125815OBJECT DETECTION AND TRACKING WITH REDUCED ERROR DUE TO BACKGROUND ILLUMINATION - An image sensor frame rate can be increased by “interlaced” mode operation whereby only half the number of lines (alternating between odd and even lines) of an image is transported to the readout circuitry. This halves the integration time but also halves the resolution of the sensor. The reduction is tolerable for motion characterization as long as sufficient image resolution remains. Accordingly, in one embodiment, an image sensor operated in an interlaced fashion is first exposed to a scene under a first form of illumination (e.g., narrowband illumination), and a first set of alternating (horizontal or vertical) lines constituting half of the pixels is read out of the array; the sensor is then exposed to the same scene under a second form of illumination (e.g., existing ambient illumination with the illumination source turned off), and a second set of alternating lines, representing the other half of the pixel array, is read out. The two images are compared and noise removed from the image obtained under narrowband illumination. As this occurs, the image sensor is capturing the next image under the first form of illumination, and the process continues.05-08-2014
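A minimal sketch of the interlaced ambient-subtraction idea, assuming a single frame in which the odd lines were integrated with the narrowband source on and the even lines with it off; the field assignment and 8-bit data are assumptions for illustration.

```python
import numpy as np

def denoised_narrowband_field(interlaced_frame_u8):
    """Split an interlaced capture into the illuminated field (odd lines) and the
    ambient-only field (even lines), then subtract ambient to suppress background light.
    Returns a half-height image of the narrowband-lit scene."""
    lit = interlaced_frame_u8[1::2, :].astype(np.int16)       # lines exposed with source on
    ambient = interlaced_frame_u8[0::2, :].astype(np.int16)   # lines exposed with source off
    rows = min(lit.shape[0], ambient.shape[0])                # tolerate odd frame heights
    diff = lit[:rows] - ambient[:rows]
    return np.clip(diff, 0, 255).astype(np.uint8)
```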
20140125816METHOD OF AUTOMATICALLY TRACKING AND PHOTOGRAPHING CELESTIAL OBJECTS, AND CELESTIAL-OBJECT AUTO-TRACKING PHOTOGRAPHING APPARATUS - A method of automatically tracking and photographing celestial objects and a celestial-object auto-tracking photographing apparatus, in which the burden on the CPU is reduced by eliminating unnecessary arithmetic processes and in which an object can be photographed clearly so as to appear stationary, without using an equatorial mount and without using an actuator that must be precisely controlled. A celestial object that moves relative to the photographic apparatus due to diurnal motion is photographed in a trimming area that is electronically trimmed from a part of the imaging area of an image sensor and that is moved while the celestial object is photographed. The method includes obtaining movement information of the image on the imaging area; setting movement data for the trimming area based on the obtained movement information; and carrying out the photographing operation while moving the trimming area based on the set movement data.05-08-2014
20140139686Digital Camera and Image Capturing Method Thereof - An image capturing method for a digital camera includes the following steps: capturing a plurality of pre-captured images in sequence, wherein each of the pre-captured images includes a plurality of sections; capturing a plurality of captured images based on the number of sections and piecing the plurality of captured images together as an output image, wherein an image capturing optical setting value of each captured image is substantially the same; a target section is selected among the plurality of sections while capturing the plurality of captured images, and an indicating signal is outputted by determining the relative position between an object and the target section based on the plurality of pre-captured images.05-22-2014
20140146182DEVICE AND METHOD FOR DETECTING MOVING OBJECTS - A moving object detection apparatus comprising: an image acquisition device; a first moving object detection device; a difference image generation device; a second moving object detection device for detecting existence/nonexistence of the moving object based on the difference image generated by the difference image generation device; and an integration device for integrating a detection result by the first moving object detection device and a detection result by the second moving object detection device and determining that the moving object is detected in a case where the moving object is not detected by at least the first moving object detection device and the moving object is detected by the second moving object detection device.05-29-2014
20140152843OVERHEAD CAMERA AND METHOD FOR CONTROLLING OVERHEAD CAMERA - An overhead camera includes an imaging section that captures an image of a subject, a position recognition section that recognizes a position specified with an operation member operated toward the subject, and a projection section that projects a predetermined image in the position specified with the operation member.06-05-2014
20140168447Optical Device Including a Mode for Grouping Shots for Use with Precision Guided Firearms - An optical device for use with a firearm includes an image sensor configured to capture visual data corresponding to video of a view area, and includes a display, and a controller coupled to the image sensor and to the display. The controller is configured to provide the video including a visual marker at a previously selected tag location on a target within the video to a display. The controller is configured to reapply the visual marker to the previously selected tag location on the target within the video, shot after shot, when the controller is in a group shooting mode.06-19-2014
20140176726IMAGE PROCESSOR FOR PROCESSING IMAGES RECEIVED FROM A PLURALITY OF IMAGE SENSORS - An image processor for processing images received from a plurality of image sensors affixed to a platform, each image sensor having a field of view that at least partially overlaps with the field of view of another one of the other image sensors. For each image sensor, the image processor uses motion data that is received from a motion sensor associated with the respective image sensor, together with reference motion data from a motion sensor affixed to the platform, to determine whether the image sensor has moved from an expected position during the interval between its capturing a first and second image. The image processor adjusts the second image received from each respective image sensor accordingly and combines the adjusted images into a single output image.06-26-2014
20140176727METHOD OF GENERATING INDEX ELEMENTS OF OBJECTS IN IMAGES CAPTURED BY A CAMERA SYSTEM - A camera system comprises an image capturing device, object detection module, object tracking module, and match classifier. The object detection module receives image data and detects objects appearing in one or more of the images. The object tracking module temporally associates instances of a first object detected in a first group of the images. The first object has a first signature representing features of the first object. The match classifier matches object instances by analyzing data derived from the first signature of the first object and a second signature of a second object detected in a second image. The second signature represents features of the second object derived from the second image. The match classifier determines whether the second signature matches the first signature. A training process automatically configures the match classifier using a set of possible object features.06-26-2014
20140184811IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND COMPUTER PROGRAM PRODUCT - The invention concerns an image processing apparatus including an acquisition unit, a first setting unit, a second setting unit, and a detection unit. The acquisition unit acquires image data. The first setting unit sets an area included in the image data as a template area. The second setting unit sets a search area indicating an area to be subjected to a detection process for detecting an area similar to the template area from the image data. The detection unit performs the detection process for detecting, within the search area, whether or not there is an area similar to the template area in an area other than the areas corresponding to the template area and to any area that has already been detected by the detection process.07-03-2014
20140184812IMAGE RECOGNITION APPARATUS, CONTROL METHOD, AND PROGRAM OF THE SAME - An image recognition apparatus is arranged to extract feature information of an object area detected from among images and compare it with feature information of previously-registered objects, thereby calculating a degree of similarity; when the calculated degree of similarity is equal to or greater than a predetermined threshold value, determine the object included in the object area to be a registered object and set a recognition confirmed state that is successively maintained between images; and when the reliability of the object tracking is low and the degree of similarity is smaller than the predetermined threshold value, change the successively maintained recognition confirmed state into a recognition unconfirmed state in which the object included in the object area is not determined to be a registered object.07-03-2014
20140192204Controlling Movements of Pointing Devices According to Movements of Objects - A method and apparatus for controlling pointing devices such as cameras according to the movements of an object, such as a ball or a selected player of interest, by attaching to the object an RFID tag which, when triggered, transmits the identification of the object, and detecting the transmitted identification at different locations to define the instantaneous spatial location of the object. The detected identifications are input into an RTLP (Real-Time Location Processor) to produce tracking signals which cause the pointing device to track the movements of the object in real time. Another signal is also derived from the RTLP, defining another instantaneous condition of the object, and is also fed to the pointing device to control another parameter, such as the magnification or the displayed field-of-view, of the pointing device in real time.07-10-2014
20140192205APPARATUS AND METHOD FOR OBJECT TRACKING DURING IMAGE CAPTURE - An apparatus and method for object tracking during image capture are provided. The method includes identifying an object of interest in an original camera image, detecting movement of a mobile terminal performing image capture, and tracking the object of interest in subsequent camera images using the detected movement of the mobile terminal.07-10-2014
20140192206POWER CONSUMPTION IN MOTION-CAPTURE SYSTEMS - The technology disclosed relates to reducing the overall power consumption of a motion-capture system without compromising the quality of motion capture and tracking. In general, this is accomplished by operating the motion-detecting cameras and associated image-processing hardware in a low-power mode unless and until a moving object is detected. Once an object of interest has been detected in the field of view of the cameras, the motion-capture system is “woken up,” i.e., switched into a high-power mode, in which it acquires and processes images at a frame rate sufficient for accurate motion tracking.07-10-2014
20140198220IMAGE PICKUP APPARATUS, REMOTE CONTROL APPARATUS, AND METHODS OF CONTROLLING IMAGE PICKUP APPARATUS AND REMOTE CONTROL APPARATUS - An image pickup apparatus, adapted to be used with a remote control apparatus external to the image pickup apparatus, includes an image pickup unit for performing photoelectric conversion on a captured image to generate a first image signal representing the captured image, a communicator operable to send the generated first image signal to the remote control apparatus, and further operable to receive from the remote control apparatus a second image signal relating to the sent first image signal, and a controller operable to use the received second image signal to detect a target object in a further captured image represented by a third image signal generated by the image pickup unit subsequently to the first image signal.07-17-2014
20140232877Device Having Built-In Digital Data Device Powered by Unlimited Power Source for Light Device - A device having built-in digital data means is powered by an unlimited power source for a lamp-holder, LED bulb, or light device connected to the unlimited power source by prongs or a base that can be inserted into a socket that would otherwise receive a bulb. The device may take the form of a webcam having auto-tracking functions and retractable prongs that plug directly into a wall outlet.08-21-2014
20140253736IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD - An image processing device includes a focus adjusting section, a controller, and a track processing section. The focus adjusting section executes imaging operations plural times while moving a focus lens, performs a scanning operation, and moves the focus lens to a focusing position calculated based on the position where the contrast reaches the peak. The controller allows the imaging element to execute a continuous photographing operation of continuously performing the imaging operation for photographing. The track processing section tracks a subject based on the image data in which the contrast reaches the peak.09-11-2014
20140253737SYSTEM AND METHOD OF TRACKING AN OBJECT IN AN IMAGE CAPTURED BY A MOVING DEVICE - A system and method for calculating an expected change in a position of an object in a series of images resulting from a movement of an imager capturing such series of images, and comparing an actual position of such object in an image captured after such movement to determine if, and to what extent, a change in position of the object in such later captured image resulted from a change of a position in space of the object.09-11-2014
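Under a small-angle pinhole assumption, the expected image shift due to camera rotation is roughly the focal length (in pixels) times the rotation angle; the sketch below compares that prediction with the observed shift and reports the residual as object motion. The focal length and angle inputs are assumed to come from calibration and an inertial sensor, which the abstract does not specify.

```python
import math

def object_motion_residual(focal_px, yaw_deg, pitch_deg, prev_xy, observed_xy):
    """Predict how far the object should have shifted in the image purely because the
    imager rotated, then return the residual shift attributed to the object itself."""
    expected_dx = focal_px * math.radians(yaw_deg)     # horizontal shift from camera yaw
    expected_dy = focal_px * math.radians(pitch_deg)   # vertical shift from camera pitch
    actual_dx = observed_xy[0] - prev_xy[0]
    actual_dy = observed_xy[1] - prev_xy[1]
    return actual_dx - expected_dx, actual_dy - expected_dy
```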
20140267770IMAGE-BASED APPLICATION LAUNCHER - Techniques for managing applications associated with a mobile device are provided. The techniques disclosed herein include techniques for obtaining an image of an object in the view of a camera associated with a mobile device, identifying the object in the image based on attributes of the object extracted from the image, and determining whether one or more applications are associated with the object. If there are one or more applications associated with the real-world object, an application associated with the object can be automatically launched on the mobile device. The association between a real-world object and an application may be identified by a visual indicator, such as an icon, symbol, or other markings on the object that indicates that the object is associated with one or more applications.09-18-2014
20140267771GAZE TRACKING AND RECOGNITION WITH IMAGE LOCATION - An image of a person is captured. A left eye image is located in the image. A right eye image is located in the image. A first quantity of left eye white pixels and a second quantity of left eye white pixels in the left eye image are determined. A left eye image ratio of the first quantity of left eye white pixels to the second quantity of left eye white pixels is calculated. A first quantity of right eye white pixels and a second quantity of right eye white pixels in the right eye image are determined. A right eye image ratio of the first quantity of right eye white pixels to the second quantity of right eye white pixels is calculated. A gaze direction of the person is determined based upon an average ratio.09-18-2014
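One plausible reading of the two pixel quantities is the amount of sclera (white) on the inner versus outer half of each eye crop; the sketch below averages the two per-eye ratios into a coarse left/centre/right gaze cue. The halving scheme and thresholds are assumptions, not taken from the application.

```python
import numpy as np

def gaze_from_eye_crops(left_eye_bw, right_eye_bw, hi=1.2, lo=0.8):
    """Compare white-pixel counts in the two halves of each binary eye crop and derive
    a coarse gaze direction from the average of the left- and right-eye ratios."""
    def half_ratio(eye):
        h, w = eye.shape
        first_half = np.count_nonzero(eye[:, : w // 2])
        second_half = np.count_nonzero(eye[:, w // 2:])
        return first_half / max(second_half, 1)
    avg_ratio = (half_ratio(left_eye_bw) + half_ratio(right_eye_bw)) / 2.0
    if avg_ratio > hi:
        return "right"
    if avg_ratio < lo:
        return "left"
    return "centre"
```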
20140267772ROBOTIC TOTAL STATION WITH IMAGE-BASED TARGET RE-ACQUISITION - A robotic total station includes a camera and a pattern recognition subsystem that automatically determine an azimuth angle to which to direct a laser on the total station for target re-acquisition. The camera records images that scan a search area of interest, and the pattern recognition subsystem processes the images to locate the target in one or more of the images as a predetermined pixel pattern that is based on a distinct characteristic of the target, such as a shape of the target, a color of the target, markings present on the target, and so forth. The subsystem calculates the azimuth angle of the target based on the location of the target in the images, the pointing direction of the camera and the known characteristics of the camera. The robotic total station then rotates to the azimuth angle and directs laser pulses to re-acquire the target.09-18-2014
20140267773MARKER SYSTEM WITH LIGHT SOURCE - A marker system includes: a first marker; and a second marker; wherein the first marker and the second marker are configured to emit light from one or more light sources coupled to the first marker and the second marker; and wherein the first marker and the second marker are configured to emit the light for detection by a camera. A method performed using a marker system includes: generating light using one or more light sources; emitting the light at a plurality of markers that are coupled to the light sources; and detecting the light emitted from the plurality of markers using a camera; wherein the act of detecting comprises using one or more filters to reduce ambient light to a level that corresponds with a noise level of the camera while allowing light emitted from the markers to be imaged by the camera.09-18-2014
20140267774DETERMINING THE ORIENTATION OF OBJECTS IN SPACE - A method and system determines object orientation using a light source to create a shadow line extending from the light source. A camera captures an image including the shadow line on an object surface. An orientation module determines the surface orientation from the shadow line. In some examples a transparency imperfection in a window through which a camera receives light can be detected and a message sent to a user as to the presence of a light-blocking or light-distorting substance or particle. A system can control illumination while imaging an object in space using a light source mounted to a support structure so a camera captures an image of the illuminated object. Direct illumination of the camera by light from the light source can be prevented such as by blocking the light or using a light-transmissive window adjacent the camera to reject light transmitted directly from the light source.09-18-2014
20140267775Camera in a Headframe for Object Tracking - Methods and apparatus track an object with a first camera causing a second camera to also track the object. The object is a moving object. Geospatial coordinates, including an elevation or altitude of a camera are determined on the camera. A pose of the camera, including a pitch angle and an azimuth angle are also determined. Azimuth is determined by a digital compass. A camera pose including pitch is determined by accelerometers and/or gyroscopes. Cameras are communicatively connected allowing to display an image recorded by one camera being displayed on a second camera. At least one camera is on a movable platform with actuators. Actuators are controlled to minimize a difference between a first image of the object and a second image of the object. Cameras are part of a wearable headframe.09-18-2014
20140267776TRACKING SYSTEM USING IMAGE RECOGNITION - A system and method for tracking an object in a defined area, such as a facility or warehouse. The system includes an imaging device installed in the defined area that provides at least a partial three-dimensional image of the object to a server. A database storing a computer-generated three-dimensional model of the object is in communication with the server. The server compares the three-dimensional image from the imaging device to the three-dimensional model stored in the database to identify the object. The server provides location information of the object to a receiving device by sending an image of the defined area including an indicia indicating the location of the object in the defined area. The system also may include tracking technology that can be used to determine the location of the object.09-18-2014
20140267777METHOD FOR SHOOTING A PERFORMANCE USING AN UNMANNED AERIAL VEHICLE - The present invention discloses a method for shooting a performance making use of unmanned aerial vehicles, such as drones, to provide the physical markers that are needed to give a physical actor indications of the positioning of virtual elements which will be inserted into the scene later and with which he or she needs to interact.09-18-2014
20140267778APPARATUSES AND METHODS FOR CONTROLLING A GIMBAL AND OTHER DISPLACEMENT SYSTEMS - Apparatuses and methods for controlling a gimbal and other displacement systems are disclosed herein. In accordance with one or more embodiments of the invention, a pointing angle of a camera attached to a gimbal may be controlled based, at least in part, on one or more control signals provided by a controller. The control signals may be used to compensate for displacement of the camera, to add perceived displacement of the camera, to selectively align a pointing angle of the camera and/or to allow a pointing angle to be manually determined.09-18-2014
20140285674IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND IMAGING APPARATUS - The present technique has a face detection unit configured to detect a face region which is determined as a face in an image, a position specifying unit configured to specify a particular position in the image, a range specifying unit configured to set a hue range which has a first range containing a hue at a position specified by the position specifying unit in a case where the position specified by the position specifying unit is not contained in the face region and to set a hue range which has a second range containing a hue at a position specified by the position specifying unit and wider than the first range in a case where the position specified by the position specifying unit contains the face region, and image processing unit for performing image conversion processing based on color information contained in the hue range.09-25-2014
20140293064IMAGE PICKUP APPARATUS - An image pickup apparatus capable of performing object recognition with accuracy even when an object image formed on a focusing screen becomes out of focus on a photometric sensor. A light flux of the object image formed on the focusing screen is captured by the photometric sensor through a variable photometric aperture, and object recognition is performed by an object recognition unit based on image information contained in photometric information output from the photometric sensor. If it is determined that object recognition cannot be achieved based on an object recognition operation of the object recognition unit performed according to the image information, the variable photometric aperture is stopped down under the control of a photometric control unit.10-02-2014
20140307100ORTHOGRAPHIC IMAGE CAPTURE SYSTEM - An image capture system for an image data capture and processing system, consisting of a digital imaging device, an active illumination source, a computer, and software that generates two-dimensional data sets from which real-world coordinate information with planarity, scale, aspect, and innate dimensional qualities can be extracted, in order to transform the image data into other geometric perspectives and to extract real dimensional data from the imaged objects. The image transformations may be homographic transformations, orthographic transformations, perspective transformations, or other transformations that take into account distortions in the captured image caused by the camera angle.10-16-2014
20140307101METHOD AND APPARATUS FOR DISPLAYING AND RECORDING IMAGES USING MULTIPLE IMAGE CAPTURING DEVICES INTEGRATED INTO A SINGLE MOBILE DEVICE - A method for capturing images using a mobile device that includes a plurality of integrated image capturing devices having a plurality of different fields of view includes displaying a first image of a first field of view associated with a first one of the plurality of image capturing devices on a first region of a display of the mobile device, and displaying a second image of a second field of view associated with a second one of the plurality of image capturing devices on a second region of the display of the mobile device, wherein the first image and the second image are displayed simultaneously.10-16-2014
20140313345FLYING OBJECT VISUAL IDENTIFICATION SYSTEM - A system for visually identifying a flying object includes a detection subsystem, a visual inspection subsystem, and an identification processor. The detection subsystem is configured to detect the location of one or more flying objects within an area, and includes at least one of radar, lidar, and visual detection. The visual inspection subsystem is configured to visually inspect an object of interest selected from the one or more detected flying objects. The visual inspection subsystem includes a camera having a field of view, a positioning system, and an image processor. The positioning system is configured to support the camera and controllably articulate the field of view to track the object of interest. Finally, the processor is configured to receive one or more images from the visual inspection subsystem, and identify a characteristic of the object of interest from the one or more images.10-23-2014
20140313346TRACKING SHOOTING SYSTEM AND METHOD - A tracking shooting system and method are disclosed herein. The tracking shooting system includes a camera device, a wireless radio frequency module, a wireless transceiver and a control module. The wireless radio frequency module is mounted beside a target area. The wireless transceiver is disposed on an object, and is configured to communicate with the wireless radio frequency module to obtain coordinate information. The control module is configured to receive the coordinate information transmitted from the wireless transceiver, and to calculate a position coordinate of the object in accordance with the coordinate information. The control module controls the camera device in accordance with the position coordinate, so as to perform tracking shooting of the object.10-23-2014
20140320666OBJECT DETECTION - An object detection apparatus comprising a camera having video output comprising frames; and a digital video processor configured to receive the video output from the camera, detect and track a blob in the frames to determine a trajectory for the blob and trigger an alert message if the trajectory of the blob is characteristic of the object. The digital video processor may detect and classify the object as a leak, and provide an alert or alarm. The digital video processor may detect and classify the object as a bird, and provide a bird report. A weather station may be combined with the digital video processor to receive input from the weather station and take the input from the weather station into account in determining whether to trigger an alert.10-30-2014
20140320667System and Method for Tracking - Systems and methods are provided for tracking at least position and angular orientation. The system comprises a computing device in communication with at least two cameras, wherein each of the cameras are able to capture images of one or more light sources attached to an object. A receiver is in communication with the computing device, wherein the receiver is able to receive at least angular orientation data associated with the object. The computing device determines the object's position by comparing images of the light sources and generates an output comprising the position and angular orientation of the object.10-30-2014
20140320668METHOD AND APPARATUS FOR IMAGE CAPTURE TARGETING - In accordance with an example embodiment of the present invention, a method and corresponding apparatus and computer program are disclosed for: receiving images from an image sensor of a camera unit (10-30-2014
20140327784COMPUTER VISION-BASED OBJECT TRACKING SYSTEM - A computer-implemented method for utilizing a camera device to track an object is presented. As part of the method, a region of interest is determined within an overall image sensing area. A point light source is then tracked within the region of interest. In a particular arrangement, the camera device incorporates CMOS image sensor technology and the point light source is an IR LED. Other embodiments pertain to manipulations of the region of interest to accommodate changes to the status of the point light source.11-06-2014
20140340522STAR TRACKER WITH STEERABLE FIELD-OF-VIEW BAFFLE COUPLED TO WIDE FIELD-OF-VIEW CAMERA - A star tracker has an electronically steerable point of view, without requiring a precision aiming mechanism. The star tracker can be strapped down, thereby avoiding problems associated with precision aiming of mechanical devices. The star tracker images selectable narrow portions of a scene, such as the sky. Each stellar sighting can image a different portion of the sky, depending on which navigational star or group of navigational stars is of interest. The selectability of the portion of the sky imaged enables the star tracker to avoid unwanted light, such as from the sun.11-20-2014
20140340523SYSTEM AND METHOD TO IDENTIFY AND TRACK OBJECTS ON A SURFACE - A system is provided by which objects with RFID tags can communicate with a surface containing exciting and sensing antennas so as to actuate optical emission from those objects. That light emission can then be used for fine position and orientation sensing by an array of cameras placed around the surface.11-20-2014
20140340524SYSTEMS AND METHODS FOR PROVIDING NORMALIZED PARAMETERS OF MOTIONS OF OBJECTS IN THREE-DIMENSIONAL SPACE - Systems and methods are disclosed for detecting user gestures using detection zones to save computational time and cost and/or to provide normalized position-based parameters, such as position coordinates or movement vectors. The detection zones may be established explicitly by a user or a computer application, or may instead be determined from the user's pattern of gestural activity. The detection zones may have three-dimensional (3D) boundaries or may be two-dimensional (2D) frames. The size and location of the detection zone may be adjusted based on the distance and direction between the user and the motion-capture system.11-20-2014
20140362229CONTROLLING DIRECTION OF LIGHT ASSOCIATED WITH A FLASH DEVICE - A flash device includes a light emitting element configured to emit light and light controlling elements. The light controlling elements include a light receiving surface and a light exiting surface that is non-parallel to the light exiting surface of an adjacent light controlling element. The light controlling elements are individually adjustable to vary a difference between an amount of light received at the light receiving surface and an amount of light exited from the light exiting surface such that amounts of light exiting respective non-parallel light exiting surfaces vary to control direction of light emitted by the flash device.12-11-2014
20140362230METHOD AND SYSTEMS OF CLASSIFYING A VEHICLE USING MOTION VECTORS - This disclosure provides methods and systems of classifying a vehicle using motion vectors associated with captured images including a vehicle. According to an exemplary method, a cluster of motion vectors representative of a vehicle within a target region is analyzed to determine geometric attributes of the cluster and/or measure a length of a detected vehicle, which provides a basis for classifying the detected vehicle.12-11-2014
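As an illustrative take on measuring vehicle length from a motion-vector cluster (not the claimed method), the sketch below projects the vector origins onto the cluster's principal axis, converts the spread to metres with an assumed pixels-per-metre calibration, and applies an assumed 6 m cut-off.

```python
import numpy as np

def classify_by_length(vector_origins_px, px_per_metre, long_vehicle_m=6.0):
    """Estimate vehicle length from the spread of its motion-vector cluster along the
    dominant (travel) direction and classify it as a short or long vehicle."""
    pts = np.asarray(vector_origins_px, dtype=float)
    if len(pts) < 2:
        return "unknown", 0.0
    centred = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)   # vt[0] ~ direction of travel
    along_axis = centred @ vt[0]
    length_m = (along_axis.max() - along_axis.min()) / px_per_metre
    label = "long vehicle" if length_m > long_vehicle_m else "short vehicle"
    return label, length_m
```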
20140362231TRAFFIC ENFORCEMENT SYSTEM WITH TIME TRACKING AND INTEGRATED VIDEO CAPTURE - A method, system, and apparatus are provided for capturing a video image and speed of a target vehicle. A ranging device detects a distance to a target vehicle. The focal distance or zoom of a video camera is set and adjusted based on the distance. The speed of travel of the vehicle is detected, displayed, and/or stored in association with a video image captured of the vehicle by the video camera. A range of distances within which to capture the video image and speed of the vehicle may be set by detecting distances between a pair of landmarks or using GPS and compass heading data. An inclinometer is provided to aid initiation of a power-conservation mode. A target tracking time may be determined and compared to a minimum tracking time period. A device certification period can be stored and displayed and the device deactivated upon expiration thereof.12-11-2014
20140362232OBJECTIVE LENS WITH HYPER-HEMISPHERIC FIELD OF VIEW - The invention relates to an optical device (12-11-2014
20140368663METHOD AND APPARATUS FOR DISPLAYING AN IMAGE FROM A CAMERA - A method and apparatus for displaying video is provided herein. During operation, video is displayed on one of many displays in a geographically correct fashion. For example, in an embodiment involving two displays (e.g., on a firefighter's two wrists), each of which display a video feed, the video is displayed such that the video feed of the most appropriate scene (not necessarily the video feed of the closest camera) is shown on that display.12-18-2014
20140368664SYSTEM AND METHOD FOR MEASURING TRACKER SYSTEM ACCURACY - The present invention relates to a simple and effective system and method for measuring camera-based tracker system accuracy, especially for a helmet-mounted tracker system, utilizing a Coordinate Measuring Machine (CMM). The method comprises the steps of: computing the spatial relation between the tracked object and the calibration pattern using the CMM; computing the relation between the reference camera and the tracker camera; computing the relation between the reference camera and the calibration pattern; computing the ground truth relation between the tracker camera and the tracked object; obtaining actual tracker system results; comparing these results with the ground truth relations and finding the accuracy of the tracker system; recording the accuracy results; and testing whether a new accuracy calculation is required. The system comprises a reference camera; a calibration pattern visible by the reference camera; a camera spatial relation computation unit; a relative spatial relation computation unit; a memory unit; and a spatial relation comparison unit.12-18-2014
20150035990IMPACT TIME FROM IMAGE SENSING - Impact time between image sensing circuitry and an object moving at least partially towards, or away from, the image sensing circuitry can be computed. Image data associated with a respective image frame of a sequence (1 . . . N) of image frames sensed by said image sensing circuitry, which image frames image said object, can be received. For each one (i) of multiple pixel positions, a respective duration value (f(i)) indicative of the largest duration of consecutively occurring local extreme points in said sequence (1 . . . N) of image frames can be computed. A local extreme point is present in a pixel position (i) when an image data value of the pixel position (i) is a maximum or minimum in relation to the image data values of those pixel positions that are the closest neighbours of said pixel position (i).02-05-2015
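The duration value f(i) can be sketched as the longest run of consecutive frames in which a pixel stays a local extreme among its four neighbours; larger values occur where the image moves slowly (near the focus of expansion), which is what ties the map to time-to-impact. The 4-neighbour choice below is an assumption for illustration.

```python
import numpy as np

def longest_extreme_runs(frames):
    """For each pixel position, return the largest number of consecutive frames in the
    sequence for which that pixel is a local extreme (min or max) among its 4-neighbours."""
    n_frames, h, w = frames.shape
    best = np.zeros((h, w), dtype=int)
    run = np.zeros((h, w), dtype=int)
    for k in range(n_frames):
        f = frames[k].astype(int)
        pad = np.pad(f, 1, mode="edge")
        neighbours = np.stack([pad[:-2, 1:-1], pad[2:, 1:-1],   # up, down
                               pad[1:-1, :-2], pad[1:-1, 2:]])  # left, right
        extreme = (f > neighbours.max(axis=0)) | (f < neighbours.min(axis=0))
        run = np.where(extreme, run + 1, 0)       # extend or reset the current run
        best = np.maximum(best, run)              # keep the largest duration seen so far
    return best
```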
20150054965IMAGE CAPTURING APPARATUS AND CONTROL METHOD THEREOF - An image capturing apparatus comprises a light emitting unit which provides, by light emission, a notification of an operation status of a self-timer when performing self-timer shooting; a mode setting unit which sets one of a plurality of operation modes; and a control unit which controls the light emitting unit to provide the notification of the operation status of the self-timer in self-timer shooting if the mode setting unit has set a first operation mode, and controls the light emitting unit not to provide the notification of the operation status of the self-timer in self timer shooting if the mode setting unit has set a second operation mode.02-26-2015
20150062348IMAGE CAPTURE APPARATUS AND METHOD FOR CONTROLLING THE SAME - In an image capture apparatus that can perform processing using an image on which information displayed by a display device is superimposed and a method for controlling the same, the influence of the superimposed information included in the image is reduced. A first image in which information displayed by the display device is superimposed on an optical finder image is acquired. Then, the predetermined image processing is performed using a second image that excludes data on a pixel having the display color of the information, of pixels included in the first image.03-05-2015
20150085138IMAGE CAPTURING APPARATUS AND IMAGE CAPTURING METHOD - An image capturing apparatus comprises an object detection unit configured to detect a predetermined object and a movement amount acquisition unit configured to acquire an amount of movement of the predetermined object. An object tracking unit suppresses the amount of movement of the predetermined object by moving an optical element constituting an image capturing optical system. A motion vector detection unit detects a motion vector indicating an image blur amount. An image blur correction unit corrects image blur based on the motion vector. A switching unit switches whether or not to perform object tracking using the object tracking unit. The image blur correction unit makes effect of the image blur correction in a case of performing the object tracking smaller than effect of the image blur correction in a case of not performing the object tracking.03-26-2015
20150085139IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD - An image processing device including a subject frame setting section which, by operating a subject detector which detects a subject captured in an image, sets a subject frame which surrounds a predetermined range of the subject detected from the image; an acceptance frame setting section which sets an acceptance frame with a range wider than the subject frame according to the context of the image; a position detecting section which detects a specified position on an image which is specified by a user; and a recognizing section which recognizes a subject which is a tracking target based on the acceptance frame set by the acceptance frame setting section and the specified position detected by the position detecting section.03-26-2015
20150092064Recording Device Positioner Based on Relative Head Rotation - In one aspect, a recording device positioner is provided. The recording device positioner includes a base having a connection portion that is configured to receive a recording device. The recording device positioner further includes a positioning sensor configured to sense the movement of a user. Additionally, the recording device positioner includes a motor attached to the base, the motor being configured to rotate the recording device relative to the base based upon signals sent by the positioning sensor. In another aspect, a method of recording a desired area of interest using a recording device positioner is provided. In a further aspect, a device positioner for moving a video recording device based on movements of a user is provided.04-02-2015
20150103183METHOD AND APPARATUS FOR DEVICE ORIENTATION TRACKING USING A VISUAL GYROSCOPE - A method for tracking device orientation on a portable device is disclosed. The method comprises initializing a device orientation to a sensor orientation, wherein the sensor orientation is based on information from an inertial measurement unit (IMU) sensor. It also comprises initiating visual tracking using a camera on the portable device and capturing a frame. Next, it comprises determining a plurality of visual features in the frame and matching the frame to a keyframe, wherein capture of the keyframe precedes capture of the frame. Subsequently, it comprises computing a rotation amount between the frame and the keyframe. Finally, responsive to a determination that the rotational distance between the frame and the keyframe exceeds a predetermined threshold, the frame is promoted to keyframe status, added to a first orientation map, and adjusted with all prior captured keyframes.04-16-2015
20150103184METHOD AND SYSTEM FOR VISUAL TRACKING OF A SUBJECT FOR AUTOMATIC METERING USING A MOBILE DEVICE - Embodiments of the present invention provide a novel solution that enables mobile devices to continuously track interesting subjects by creating dynamic visual models that can be used to detect and track subjects in-real time through total occlusion or even if a subject temporarily leaves the mobile device's field of view. Additionally, embodiments of the present invention use an online learning scheme that dynamically adjusts tracking procedures responsive to any appearance and/or environmental changes associated with an interesting subject that may occur over a period of time. In this manner, embodiments of the present invention can determine a more optimal focus position that allows movement by either the mobile device or the subject during the performance of auto-focusing procedures and also enables other camera parameters to properly calibrate (meter) themselves based on the focus position determined.04-16-2015
20150103185DEVICE FOR TRACKING ADJUSTMENT AND TRACKING ADJUSTMENT METHOD - A device for tracking adjustment comprising a zoom instruction input device, a focus instruction input device, a tracking instruction input device, an image signal obtaining device, a determining device which determines whether a state is a first adjustment state in which the zoom lens is set by the zoom instruction input device at a position on a tele side and the focus lens is moved by the focus instruction input device or a second adjustment state in which the zoom lens is set by the zoom instruction input device at a position on a wide side and the tracking lens is moved by the tracking instruction input device, an area setting device, an evaluation value generating device, and a display device, wherein the determining device determines whether the state is the first adjustment state or the second adjustment state based on the image signal obtained from the camera device.04-16-2015
20150109457MULTIPLE MEANS OF FRAMING A SUBJECT - A system for tracking a cinematography target can comprise an emitter configured to attach to a target and to emit a tracking signal. A tracker can be configured to receive the tracking signal from the emitter and to track the emitter based upon the received tracking signal. The tracker can comprise a control module configured to identify a location of the target and to position an audiovisual device to align with a target. Additionally, the tracker can comprise a script execution processor configured to execute a user selected script. The user selected script may be selected from a set of respectively unique scripts. The user selected script can determine one or more control module movements specific to tracking the emitter.04-23-2015
20150116501SYSTEM AND METHOD FOR TRACKING OBJECTS - Various aspects of a system and a method for tracking one or more objects may comprise a network capable of communicatively coupling a plurality of cameras, a plurality of sensors, and a controlling device. The controlling device may receive metadata associated with the one or more objects. The metadata identifies the one or more objects. The controlling device may select a first set of cameras from the plurality of cameras to track the one or more objects based on the received metadata. The controlling device may enable tracking the one or more objects by the selected first set of cameras.04-30-2015
20150116502APPARATUS AND METHOD FOR DYNAMICALLY SELECTING MULTIPLE CAMERAS TO TRACK TARGET OBJECT - A method for dynamically selecting multiple cameras to track a target object, the method including selecting a main camera from among multiple cameras; selecting a target object from an image captured by the main camera; projecting a captured location of the target object onto images to be captured by one or more sub cameras; and selecting sub cameras according to a pixel proportion that indicates a number of pixels which are included in a capture location of the target object in the images captured by the one or more sub cameras.04-30-2015
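A minimal sketch of the selection step described above, assuming calibrated pinhole sub cameras (rotation R, translation t, intrinsics K) and a set of 3D sample points on the target; the proportion of sampled target points that project inside each sub camera's image is used as the selection score, and the 0.5 threshold is an assumption.

```python
import numpy as np

def project_points(points_world, R, t, K):
    """Pinhole projection of Nx3 world points into pixel coordinates.
    Returns the Nx2 pixel coordinates and the depth of each point."""
    p_cam = points_world @ R.T + t                  # world -> camera frame
    uvw = p_cam @ K.T                               # camera -> homogeneous pixels
    z = np.where(np.abs(uvw[:, 2:3]) < 1e-9, 1e-9, uvw[:, 2:3])
    return uvw[:, :2] / z, p_cam[:, 2]

def visible_proportion(points_world, cam):
    """Fraction of sampled target points that land inside this camera's image."""
    uv, depth = project_points(points_world, cam["R"], cam["t"], cam["K"])
    h, w = cam["image_size"]
    inside = (depth > 0) & (uv[:, 0] >= 0) & (uv[:, 0] < w) \
                         & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return float(inside.mean())

def select_sub_cameras(points_world, cameras, min_proportion=0.5):
    """Keep the sub cameras whose visible proportion of the target meets the threshold."""
    return [cam["name"] for cam in cameras
            if visible_proportion(points_world, cam) >= min_proportion]
```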
20150116503IMAGING DEVICE FOR CAPTURING SELF-PORTRAIT IMAGES - A digital camera for capturing an image containing the photographer, comprising: an image sensor; an optical system for forming an image of a scene on the image sensor; a processor for processing the output of the image sensor in order to detect the presence of one or more faces in a field of view of the digital camera; a feedback mechanism for providing feedback to the photographer while the photographer is included within the field of view, responsive to detecting at least one face in the field of view, and a means for initiating capture of a digital image of the scene containing the photographer.04-30-2015
20150116504IMAGE PROCESSING DEVICE AND METHOD, IMAGE PROCESSING SYSTEM, AND IMAGE PROCESSING PROGRAM - An image processing device includes: an entire image display control portion that performs control to display an entire image of a predetermined region in an entire image display window; and a cutout image display control portion that performs control to enlarge a plurality of tracking subjects included in the entire image and display the tracking subjects in a cutout image display window. The cutout image display control portion performs the control in such a manner that one cutout image including the tracking subjects is displayed in the cutout image display window in a case where relative distances among the tracking subjects are equal to or smaller than a predetermined value, and that two cutout images including the respective tracking subjects are displayed in the cutout image display window in a case where the relative distances among the tracking subjects are larger than the predetermined value.04-30-2015
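A small sketch of the cutout decision described above: if the largest pairwise distance between tracked subjects is within the threshold, a single shared cutout box is produced, otherwise one box per subject; the padding value and the generalization to more than two subjects are assumptions.

```python
from itertools import combinations

def cutout_regions(subject_centers, distance_threshold, pad=80):
    """subject_centers: list of (x, y) tracked-subject positions in the
    entire image.  Returns a list with a single bounding box when the
    subjects are close together, otherwise one box per subject."""
    def box_around(points):
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        return (min(xs) - pad, min(ys) - pad, max(xs) + pad, max(ys) + pad)

    max_dist = max((((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
                    for a, b in combinations(subject_centers, 2)), default=0.0)
    if max_dist <= distance_threshold:
        return [box_around(subject_centers)]           # one shared cutout
    return [box_around([c]) for c in subject_centers]  # one cutout per subject

print(cutout_regions([(100, 100), (150, 120)], distance_threshold=200))
print(cutout_regions([(100, 100), (600, 500)], distance_threshold=200))
```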
20150116505MULTIPLE MEANS OF TRACKING - A system for tracking a cinematography target comprises an emitter configured to attach to a target and to provide a tracking indicator. The system also comprises a tracker configured to receive tracking data from a separate tracking data reception device and based upon the received tracking data to actuate one or more motors that cause an attached cinematography device to point towards the tracking indicator. Further, the tracking data reception device can be configured to generate information relating to the location of the tracking indicator. In particular, the tracking data reception device can comprise one or more sensor modules that are configured to identify a location of the tracking indicator relative to the tracking data reception device. The system can also comprise a user interface device configured to receive commands from a user and communicate the commands to the tracker.04-30-2015
20150124103Navigation System with Monocentric Lens and Curved Focal Plane Sensor - A navigation system includes a monocentric lens and one or more curved image sensor arrays disposed parallel and spaced apart from the lens to capture respective portions, not all, of the field of view of the lens.05-07-2015
20150146010OBJECT DETECTING APPARATUS, IMAGE CAPTURING APPARATUS, METHOD FOR CONTROLLING OBJECT DETECTING APPARATUS, AND STORAGE MEDIUM - An object detecting apparatus includes a detecting unit configured to detect an area of a predetermined object from an image, a calculating unit configured to calculate an evaluation value on the area detected by the detecting unit, and a control unit configured, when the evaluation value satisfies a predetermined criterion, to determine that the area is the predetermined object. The predetermined criterion is set depending on an amount of distortion of an image displayed on a display unit.05-28-2015
20150146011IMAGE PICKUP APPARATUS HAVING FA ZOOM FUNCTION, METHOD FOR CONTROLLING THE APPARATUS, AND RECORDING MEDIUM - An image pickup apparatus includes a movement detector detecting movement of the apparatus, a subject detector detecting a subject, a setting unit setting a mode for controlling a zoom operation which is enabled as a first mode or second mode, and a controller controlling the zoom operation based on the enabled mode. In the first mode, the controller controls the zoom operation based on at least one of position or size of the detected subject. In the second mode, the controller performs a first zoom operation for changing a zoom magnification to a wide-angle side when a movement amount of the apparatus is a first amount larger than a predetermined amount and performs a second zoom operation for changing the zoom magnification to a telephoto side when a movement amount of the apparatus detected after execution of the first zoom operation is a second amount smaller than the predetermined amount.05-28-2015
20150294176AUTOMATIC TRACKING IMAGE PICKUP SYSTEM - An automatic tracking image pickup system including: an image pickup apparatus picking up an image of an object; a driving unit changing an image pickup direction of the image pickup apparatus; a recognition unit recognizing a tracking object in a picked up image; and a controller controlling a speed of the driving unit based on a difference between a position of the tracking object in the image and a target position in the image, in an initial mode until the tracking object reaches a predetermined position in the image after the recognition unit recognizes the tracking object for the first time, and in a normal mode after the tracking object reaches the predetermined position, with the gain used to obtain the speed of the driving unit from the difference in the normal mode being larger than the gain used in the initial mode.10-15-2015
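The sketch below illustrates the two-mode speed control described above: driving speed is proportional to the pixel difference between the tracking object and the target position, with a smaller gain until the object first reaches the neighbourhood of the target position and a larger gain afterwards; the gain values and the switch radius are assumptions.

```python
class TrackingSpeedController:
    """Pan/tilt speed from pixel error, with a soft 'initial' gain that is
    switched to a larger 'normal' gain once the object first reaches the
    neighbourhood of the target position in the image."""

    def __init__(self, gain_initial=0.2, gain_normal=0.6, switch_radius_px=10.0):
        self.gain_initial = gain_initial
        self.gain_normal = gain_normal
        self.switch_radius = switch_radius_px
        self.normal_mode = False

    def speed(self, object_pos, target_pos):
        ex = target_pos[0] - object_pos[0]
        ey = target_pos[1] - object_pos[1]
        if not self.normal_mode and (ex * ex + ey * ey) ** 0.5 <= self.switch_radius:
            self.normal_mode = True            # object reached the target position
        gain = self.gain_normal if self.normal_mode else self.gain_initial
        return gain * ex, gain * ey            # pan, tilt drive speeds

ctrl = TrackingSpeedController()
print(ctrl.speed((400, 300), (320, 240)))      # initial mode: gentle approach
print(ctrl.speed((322, 241), (320, 240)))      # reaches target -> normal mode
print(ctrl.speed((350, 260), (320, 240)))      # larger gain afterwards
```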
20150317539IMAGE RECOGNITION SYSTEM WITH ASSISTANCE OF MULTIPLE LENSES AND METHOD THEREOF - The present disclosure illustrates an image recognition system with assistance of multiple lenses. The system is characterized by using an image capturing device having dual lenses to calculate 3D data of an object, where the 3D data includes spatial coordinates corresponding to multiple parts of the object image. The object image exists in both the first image and the second image, and corresponds to the object.11-05-2015
20150326769IMAGING DEVICE FOR SCENES IN APPARENT MOTION - Imaging systems and methods for imaging of scenes in apparent motion are described. A multi-axis positioning mechanism is operable to move an area imaging device along a tracking axis. A control module directs the multi-axis positioning mechanism to set the tracking axis to be substantially parallel with the apparent motion, and directs the multi-axis positioning mechanism to move the area imaging device in one or more cycles such that the area imaging device moves, in each of the one or more cycles, forward along the tracking axis at a tracking speed that compensates for the apparent motion. The control module directs the area imaging device to take at least one exposure during each of the one or more cycles to generate one or more exposures. An imaging module forms an image of the scene based on the one or more exposures.11-12-2015
20150331083Camera tracking system - To operate the Camera Tracking System, the operator sets up the camera and attaches the signal module to the person or object to be filmed. The operator uses the control device or module (a smartphone) to jog the camera until the person or object wearing the module is in full view of the camera, and then pushes the initiation button. From that point the system is active and follows the person wearing the signal module everywhere, whether or not the person remains in full view.11-19-2015
20150332465METHOD AND SYSTEM FOR MEDICAL TRACKING USING A PLURALITY OF CAMERA POSITIONS - A method for tracking trackable objects using a medical tracking device, the tracking device being a camera or an EM transmitter, during a medical workflow comprising a plurality of workflow steps, wherein each trackable object has at least one marker and the method comprises the steps of: acquiring a set of camera positions, wherein each tracking device position is associated with at least one workflow step; identifying a workflow; sequentially and automatically moving the tracking device to the camera positions associated with the workflow steps; and performing a tracking step only when the tracking device is in a fixed position.11-19-2015
20150332476METHOD AND APPARATUS FOR TRACKING OBJECT IN MULTIPLE CAMERAS ENVIRONMENT - The present invention relates to a method of tracking an object in a multiple cameras environment and the method includes generating first feature information of the object from an image input from a first camera; detecting a second camera in which identification information for the object is recognized when the object moves out of a view angle of the first camera, and comparing second feature information of the object generated from an image input from the second camera with the first feature information to track the object from the image input from the second camera. According to the present invention, the object is tracked based on an image in one camera image and if the object moves out of the camera, the identification information of the terminal which is possessed by the object is recognized to hand over the camera to continuously track the same object.11-19-2015
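As a rough sketch of the feature comparison used for handover, the code below builds a color-histogram descriptor for the object in the first camera and matches it against candidate detections in the second camera; the choice of color histograms, the L1 distance, and the acceptance threshold are assumptions.

```python
import numpy as np

def color_histogram(patch_rgb, bins=8):
    """Per-channel color histogram, L1-normalized, as a simple appearance feature."""
    hist = [np.histogram(patch_rgb[..., c], bins=bins, range=(0, 256))[0]
            for c in range(3)]
    h = np.concatenate(hist).astype(float)
    return h / max(h.sum(), 1e-9)

def match_candidate(first_feature, candidate_patches, max_distance=0.5):
    """Return the index of the candidate in the second camera whose feature
    is closest to the first-camera feature, or None if nothing is close enough."""
    best_idx, best_dist = None, float("inf")
    for i, patch in enumerate(candidate_patches):
        d = np.abs(first_feature - color_histogram(patch)).sum()   # L1 distance
        if d < best_dist:
            best_idx, best_dist = i, d
    return best_idx if best_dist <= max_distance else None

# Toy usage with random patches standing in for detections.
rng = np.random.default_rng(0)
obj = rng.integers(0, 256, (40, 20, 3))
feature = color_histogram(obj)
print(match_candidate(feature, [rng.integers(0, 256, (40, 20, 3)), obj]))
```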
20150334310POSITION CONTROL APPARATUS, POSITION CONTROL METHOD, OPTICAL APPARATUS, AND IMAGING APPARATUS - A target generating unit outputs a target value signal of a control target, and a position encoding processing unit outputs a position detection signal of the control target. A PID compensator calculates a control amount for causing the control target to follow a target position based on a difference signal between the target value signal and the position detection signal, and outputs it to an adding unit. A disturbance estimation observer outputs the result of estimation of disturbance to the adding unit. The adding unit adds the control amount output of the PID compensator to the output of the disturbance estimation observer to calculate a total control amount. A storage unit stores a control amount during position control to a fixing position as an attitude difference correction amount, and the disturbance estimation observer performs disturbance estimation using a control amount obtained by subtracting a value stored in the storage unit during position control to a moving position from the total control amount.11-19-2015
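The sketch below mirrors the structure described above under simplifying assumptions: a PID compensator acts on the target-minus-position difference, a crude disturbance observer (assuming the plant behaves roughly like an integrator) produces an additive correction, the two are summed into the total control amount, and the stored attitude-difference correction is subtracted from the control amount seen by the observer; all gains and the plant model are assumptions.

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_error = 0.0, None

    def step(self, error):
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv


class DisturbanceObserver:
    """Crude stand-in for a disturbance estimation observer.  Assumes the
    plant behaves roughly like an integrator: velocity ~ plant_gain * u + d.
    Its output is the additive correction that cancels the estimated
    disturbance, so it can simply be added to the PID control amount."""

    def __init__(self, plant_gain, dt, alpha=0.1):
        self.plant_gain, self.dt, self.alpha = plant_gain, dt, alpha
        self.disturbance, self.prev_pos = 0.0, None

    def update(self, position, applied_control):
        if self.prev_pos is not None:
            measured_vel = (position - self.prev_pos) / self.dt
            residual = measured_vel - self.plant_gain * applied_control
            self.disturbance += self.alpha * (residual - self.disturbance)
        self.prev_pos = position

    def correction(self):
        return -self.disturbance / self.plant_gain


def control_step(pid, observer, target, position, attitude_correction):
    total = pid.step(target - position) + observer.correction()
    # The observer estimates disturbance from the control amount with the
    # stored attitude-difference correction removed.
    observer.update(position, total - attitude_correction)
    return total
```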
20150334311CAMERA TRACKING SYSTEM - A camera tracking system allows a camera to lock onto and follow a target object dynamically and automatically, without direct human intervention, including when the object moves erratically or unpredictably.11-19-2015
20150338497TARGET TRACKING DEVICE USING HANDOVER BETWEEN CAMERAS AND METHOD THEREOF - A target tracking device has an input unit configured to receive information on a target to be searched for, and a predicted path calculating unit configured to use the received information and a plurality of prediction models to calculate a movement candidate point of the target for each of the prediction models so as to provide movement candidate points. The predicted path calculating unit also determines a predicted movement point of the target by making a comparison among the calculated movement candidate points. A determining unit, which can determine whether an image of the target is included in the imagery of a camera at one of the movement candidate points, is controlled to first check the imagery of the camera at only the predicted movement point of the target.11-26-2015
20150338498CAMERA AND METHOD FOR CAPTURING IMAGE DATA11-26-2015
20150338499IMAGE RECORDING SYSTEM WITH RELATIONAL TRACKING - A relational tracking and recording method and apparatus can include: transmitting a signal from a beacon; detecting the signal with a first antenna and with a second antenna, the first antenna coupled to a tracking and recording element; determining a time of flight for the signal between the beacon and the first antenna; calculating a distance between the beacon and the first antenna based on the time of flight; determining a horizontal angle of the beacon; determining a vertical angle of the beacon; positioning an image sensor to face toward the beacon based on the vertical angle and based on the horizontal angle; focusing optics in front of the image sensor based on the distance; zooming the optics in front of the image sensor based on the distance; and recording an image with the image sensor.11-26-2015
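A geometry-only sketch of the steps listed above: the one-way time of flight gives the beacon distance, the reported horizontal and vertical angles become the pan/tilt commands, and focus and zoom are derived from the distance; the focus and zoom mappings are assumed placeholders.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def beacon_distance_m(time_of_flight_s):
    """One-way time of flight between the beacon and the first antenna."""
    return SPEED_OF_LIGHT_M_S * time_of_flight_s

def point_and_frame(time_of_flight_s, horizontal_angle_deg, vertical_angle_deg):
    """Returns the pan/tilt angles to face the beacon plus assumed focus and
    zoom settings derived from the beacon distance."""
    d = beacon_distance_m(time_of_flight_s)
    pan_deg = horizontal_angle_deg          # face the beacon horizontally
    tilt_deg = vertical_angle_deg           # and vertically
    focus_m = d                             # focus at the beacon distance
    zoom = min(10.0, max(1.0, d / 5.0))     # assumed: ~1x at 5 m, capped at 10x
    return {"pan_deg": pan_deg, "tilt_deg": tilt_deg,
            "focus_m": focus_m, "zoom": zoom}

# Example: a beacon 30 m away, slightly left of and below the camera axis.
print(point_and_frame(30.0 / SPEED_OF_LIGHT_M_S, horizontal_angle_deg=-12.0,
                      vertical_angle_deg=-3.0))
```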
20150341551PROJECTING LIGHT AT ANGLE CORRESPONDING TO THE FIELD OF VIEW OF A CAMERA - In one aspect, a device includes a camera, a processor, and a memory accessible to the processor. The memory bears instructions executable by the processor to initiate the camera and project light from the device at an angle corresponding to a field of view of the camera according to a current focal length of the camera.11-26-2015
20150350537VIRTUAL HEAD MOUNTED VIDEO CAMERA SYSTEM - A facial image capture system may capture images of a face of a person while the person is moving. A video camera may capture sequential images of a scene to which the video camera is directed. A marker-based location detection system may determine and generate information about the location of a marker worn on or close to the face of the person. A camera control system may automatically adjust both the horizontal and vertical direction to which the video camera is directed so as to cause the sequential images of the camera to each be of the face of the person while the person is moving, based on the information about the location of the marker from the marker-based location detection system.12-03-2015
20150350568METHOD OF IMAGING A TARGET IN A NIGHT SKY - A method is provided for detecting a target not emitting in the wavelength region lying between 1 μm and 1.9 μm. The target is situated in a night sky luminous environment of level less than or equal to 4. Use is made of an imaging device of focal length f and of pupil diameter D, comprising at least one detector comprising types of pixels configured to operate in the wavelength region lying between 1 μm and 1.9 μm, the detectors exhibiting a noise level of less than 0.6×1012-03-2015
20150355309TARGET TRACKING IMPLEMENTING CONCENTRIC RINGLETS ASSOCIATED WITH TARGET FEATURES - Systems, methods, and computer product for identifying and tracking an object of interest from an image capturing system based on a plurality of features associated with the object of interest. The object of interest may be tracked based on features associated with the object of interest. A center feature associated with the object of interest is designated. The center feature changes location as the object of interest changes location. A plurality of ringlets is generated. Each ringlet is concentrically positioned so that each ringlet encircles the center feature and encompasses additional features associated with the object of interest. The object of interest is tracked with feature data extracted by each ringlet as the object of interest changes location and/or orientation. The feature data is associated with each feature of the object of interest that each ringlet encompasses.12-10-2015
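A minimal sketch of the ringlet feature extraction described above: concentric ring masks are built around the designated center feature and an intensity histogram is collected per ring (such ring histograms are insensitive to in-plane rotation about the center); the radii, bin count, and use of plain grayscale histograms are assumptions.

```python
import numpy as np

def ringlet_features(gray, center, radii=(5, 10, 15, 20), bins=16):
    """Intensity histogram per concentric ring ('ringlet') around `center`.
    gray: 2D array; center: (row, col).  Returns a (len(radii), bins) array."""
    rows, cols = np.indices(gray.shape)
    dist = np.hypot(rows - center[0], cols - center[1])
    features = []
    inner = 0.0
    for outer in radii:
        mask = (dist >= inner) & (dist < outer)
        hist, _ = np.histogram(gray[mask], bins=bins, range=(0, 256))
        features.append(hist / max(hist.sum(), 1))   # normalize each ringlet
        inner = outer
    return np.array(features)

def ringlet_distance(f1, f2):
    """Simple L1 distance between two ringlet descriptors, used to re-locate
    the object as it moves or rotates."""
    return np.abs(f1 - f2).sum()

# Toy usage on a synthetic image.
img = np.random.default_rng(0).integers(0, 256, (120, 160)).astype(np.uint8)
f_a = ringlet_features(img, center=(60, 80))
f_b = ringlet_features(img, center=(62, 81))
print(f_a.shape, round(ringlet_distance(f_a, f_b), 3))
```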
20150356737SYSTEM AND METHOD FOR MULTIPLE SENSOR FIDUCIAL TRACKING - In a head mounted virtual reality or augmented reality system, a fast but lower resolution second camera is added and used to quickly find an area of a visual field returned by a slower but higher resolution first camera, where that area is likely to contain the image of a marker for a head tracking system or a hand held device.12-10-2015
20150363009PROXIMITY OBJECT TRACKER - Object tracking technology, in which an illumination source is controlled to illuminate while a camera is capturing an image, to define an intersection region within the image captured by the camera. The image captured by the camera is analyzed to detect an object within the intersection region. User input is determined based on the object detected within the intersection region and an application is controlled based on the determined user input.12-17-2015
20150365599INFORMATION PROCESSING APPARATUS, IMAGE CAPTURING APPARATUS, AND CONTROL METHOD - An information processing apparatus obtains information of a plurality of viewpoints corresponding to images in which a same subject is captured. Then the apparatus detects a region of interest within a captured range including the subject, and displays, on a display medium, an image corresponding to a display viewpoint selected as a viewpoint corresponding to the region of interest from among the plurality of viewpoints.12-17-2015
20150378000DELAY COMPENSATION WHILE CONTROLLING A REMOTE SENSOR - The presently disclosed subject matter includes a method, a system and a delay compensation unit configured for compensating a delay in communication between a sensing unit and a control unit. A succession of two or more captured images is received from the sensing unit; information indicative of command dynamics parameters is obtained from the control unit; based on this information and a respective transfer function which models the dynamics of the sensing unit, the expected reaction of the sensing unit to the command is determined; and information indicative of that reaction can be provided before the command is executed in the sensing unit.12-31-2015
20150381898DIRECTING FIELD OF VISION BASED ON PERSONAL INTERESTS - A method for directing the field of vision based on personal interests. The method includes receiving a keyword and/or an image file and processing the keyword and/or image file to generate data representing a user interest. The method includes receiving a video input from a camera representative of the field of vision of the camera and processing the video input to identify a visible element in the field of vision of the camera. The method further includes comparing the visible element in the field of vision of the camera and the data representing the user interest to determine whether the visible element is of interest to the user. A notification is provided to the user for identified visible elements that are of interest to the user.12-31-2015
20160018647SEE-THROUGH COMPUTER DISPLAY SYSTEMS - Aspects of the present invention relate to providing see-through computer display optics.01-21-2016
20160027188Methods for Capturing Images of a Control Object and Tracking to Control Interfacing with Video Game Objects - Methods for real time motion capture for controlling an object in a video game are provided. One method includes defining a model of a control object and identifying a marker on the control object. The method also includes capturing movement associated with the control object with a video capture device. Then, interpreting the movement associated with the control object to change a position of the model based on data captured through the video capture device, wherein the data captured includes the marker. The method includes moving the video game object presented on the display screen in substantial real-time according to the change of position of the model.01-28-2016
20160028917SYSTEMS AND METHODS FOR REMEMBERING HELD ITEMS AND FINDING LOST ITEMS USING WEARABLE CAMERA SYSTEMS - Apparatuses and methods are provided for storing information related to objects associated with a hand of a user via a wearable camera system. In one implementation, a wearable apparatus for storing the information is provided comprising a wearable image sensor configured to capture a plurality of images from the environment of the user, and at least one processing device programmed to process the images. The processing device may detect the hand of the user, and an object associated with the user's hand. The processing device may proceed to store information related to the object. Consistent with disclosed embodiments, the stored information may be used for various purposes, such as warning the user of dangers, catering advertising to the user, and helping the user find objects when they are lost.01-28-2016
20160028952Super Resolution Binary Imaging And Tracking System - In one aspect, the invention provides an imaging system including an optical system adapted to receive light from a field of view and direct the received light to two image planes. A fixed image detector is optically coupled to one of the image planes to detect at least a portion of the received light and generate image data corresponding to at least a portion of the field of view. A movable (e.g., rotatable) image detector is optically coupled to the other image plane to sample the received light at different locations thereof to generate another set of image data at a higher resolution than the image data obtained by the fixed detector. The system can include a processor for receiving the two sets of image data to generate two images of the field of view. In some implementations, the processor can employ one of the images (typically the image having a lower resolution) to detect one or more objects of interest (e.g., one or more objects moving within the field of view) and to effect the acquisition of image data corresponding to one or more of those moving objects at a higher resolution (e.g., by controlling the movement of the movable image detector).01-28-2016
20160033270METHOD AND SYSTEM FOR DETERMINING POSITION AND ORIENTATION OF A MEASURING INSTRUMENT - A method for determining position and orientation of a first measuring instrument is disclosed. A second measuring instrument and at least one reflective target including a retroreflector unit are arranged in the vicinity of the first measuring instrument. At least one imaging module is arranged in the first measuring instrument for determining its orientation. The at least one imaging module in the first measuring instrument can be used in a manner similar to a tracker unit of an optical total station, by detecting optical radiation emitted from the second measuring instrument and reflected by the at least one reflective target.02-04-2016
20160041693GESTURE CONTROL HAVING AUTOMATED CALIBRATION - A basis image is captured by an image capture device and transmitted to a computing unit. A gesture of a user of the computing unit is identified by the computing unit based on the basis image. An action is determined and executed by the computing unit depending on the identified gesture. For at least one of the gestures, the action is additionally determined by the computing unit in dependence on the relative position of the image capture device with respect to a display device. No later than upon capturing the basis image, an additional image is captured by at least one additional image capture device and transmitted to the computing unit. On the basis of the additional image, the relative position of the image capture device relative to the display device is determined by the computing unit.02-11-2016
20160063328Video tracking systems and methods employing cognitive vision - Video tracking systems and methods include a peripheral master tracking process integrated with one or more tunnel tracking processes. The video tracking systems and methods utilize video data to detect and/or track separately several stationary or moving objects in a manner of tunnel vision. The video tracking system includes a master peripheral tracker for monitoring a scene and detecting an object, and a first tunnel tracker initiated by the master peripheral tracker, wherein the first tunnel tracker is dedicated to track one detected object.03-03-2016
20160071286IMAGE PROCESSING APPARATUS, IMAGING APPARATUS, CONTROL METHOD, AND STORAGE MEDIUM - To increase the accuracy of object tracking using particle filter processing, an imaging apparatus performs object tracking processing as follows. During an SW1 holding state and continuous shooting, the apparatus repeatedly performs distributing particles according to random numbers following a normal distribution based on light metering image data obtained from an AE sensor, estimating an image region of an object by calculating the likelihood at the position of each of the particles, and rearranging particles having a lower likelihood at the positions of particles having a higher likelihood. At this time, the apparatus calculates a movement amount of the object based on the difference between the present position of the object and its previous position. If the calculated movement amount is greater than a predetermined threshold value, the apparatus increases the dispersion of the normal distribution used for the next random movement of the particles.03-10-2016
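The loop described above maps onto a standard particle filter; the sketch below distributes particles with normally distributed noise, weights them with a likelihood function, resamples low-likelihood particles onto high-likelihood positions, and widens the dispersion when the estimated movement exceeds a threshold. The Gaussian likelihood and all numeric constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def track_step(particles, likelihood_fn, prev_estimate, sigma,
               move_threshold=20.0, sigma_boost=2.0):
    """One particle-filter update.  particles: (N, 2) positions."""
    # 1. Disperse particles with normally distributed noise.
    particles = particles + rng.normal(0.0, sigma, particles.shape)

    # 2. Likelihood of the object being at each particle position.
    w = np.array([likelihood_fn(p) for p in particles])
    w = w / w.sum() if w.sum() > 0 else np.full(len(w), 1.0 / len(w))

    # 3. Resample: low-likelihood particles are re-drawn at the positions
    #    of high-likelihood particles.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    particles = particles[idx]

    # 4. Estimate the object position and its movement since last frame.
    estimate = particles.mean(axis=0)
    movement = np.linalg.norm(estimate - prev_estimate)

    # 5. If the object moved a lot, widen the dispersion for the next frame.
    next_sigma = sigma * sigma_boost if movement > move_threshold else sigma
    return particles, estimate, next_sigma

# Toy usage: likelihood peaked at (200, 150).
target = np.array([200.0, 150.0])
like = lambda p: np.exp(-np.sum((p - target) ** 2) / (2 * 30.0 ** 2))
particles = rng.uniform(0, 300, (500, 2))
estimate, sigma = particles.mean(axis=0), 5.0
for _ in range(10):
    particles, estimate, sigma = track_step(particles, like, estimate, sigma)
print(np.round(estimate, 1))
```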
20160078056Data Processing Method and Data Processing System - The invention provides a data processing method, comprising: a first terminal performs image acquisition on at least one photographic object entity, and encodes the image and corresponding recognition information to form video data which is sent to a second terminal; the second terminal performs data separation on the video data to obtain a video file and recognition information associated with at least one photographic object in the video file; the second terminal recognizes at least one photographic object in the video file according to the recognition information, and forms a corresponding operation area in the video file; and when the video file is played, the second terminal performs an associated operation function according to a detected operation action on a designated operation area.03-17-2016
20160080633METHOD FOR CAPTURING IMAGE AND IMAGE CAPTURING APPARATUS - Image capturing methods and image capturing apparatuses are provided. The image capturing method may include setting a target position of a live view image, tracking a moving object from the live view image, estimating a position of the moving object by using information on the tracked moving object, and capturing a still image based on the estimated position when the moving object is positioned at the target position of the live view image.03-17-2016
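A small sketch of the estimate-and-capture logic described above, assuming a constant-velocity prediction from the last two tracked positions and a fixed shutter lag; the capture is triggered once the predicted position falls within a tolerance of the target position set on the live view. The lag and tolerance values are assumptions.

```python
def predict_position(history, lead_time):
    """Constant-velocity prediction from the last two tracked positions.
    history: list of (t, x, y) samples, most recent last."""
    (t0, x0, y0), (t1, x1, y1) = history[-2], history[-1]
    dt = max(t1 - t0, 1e-6)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return x1 + vx * lead_time, y1 + vy * lead_time

def should_capture(history, target_xy, shutter_lag=0.05, tolerance_px=12.0):
    """Trigger the still capture when the object is predicted to be on the
    target position by the time the shutter actually fires."""
    px, py = predict_position(history, lead_time=shutter_lag)
    return ((px - target_xy[0]) ** 2 + (py - target_xy[1]) ** 2) ** 0.5 <= tolerance_px

track = [(0.00, 100.0, 240.0), (0.10, 140.0, 240.0), (0.20, 180.0, 240.0)]
print(should_capture(track, target_xy=(200.0, 240.0)))   # 20 px away at 400 px/s
```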
20160084932IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, IMAGE PROCESSING SYSTEM, AND STORAGE MEDIUM - An image processing apparatus includes a detection unit configured to detect movement information for specifying a movement direction of a specific moving object detected from an image obtained by at least one of a plurality of image capturing units, a prediction unit configured to predict a second image capturing unit configured to image-capture the specific moving object subsequently to a first image capturing unit based on the movement information detected by the detection unit and information representing an image capturing range of each of the plurality of image capturing units, and a display control unit configured to perform display for specifying a prediction result by the prediction unit before the second image capturing unit image-captures the specific moving object.03-24-2016
20160088219IMAGE CAPTURE APPARATUS WHICH CONTROLS FRAME RATE BASED ON MOTION OF OBJECT, INFORMATION TRANSMISSION APPARATUS, IMAGE CAPTURE CONTROL METHOD, INFORMATION TRANSMISSION METHOD, AND RECORDING MEDIUM - An image capture apparatus 03-24-2016
20160094790AUTOMATIC OBJECT VIEWING METHODS AND APPARATUS - Apparatus and methods provide an automatic, focused camera view of objects in motion. The camera view stream from a camera system is displayed on a user's mobile device. The system recognizes and highlights candidate objects in the camera view. After a user selects one or more candidate objects as the target object for view following, the system continuously computes the position and motion of the target object and controls the camera orientation so that the aimpoint of the camera system follows the target object closely and at substantially the same velocity. The system also controls the camera zoom to achieve a reference object presentation ratio during view following. Meanwhile, the camera view stream is transferred to connected mobile devices and display devices. The camera view stream can also be recorded to a video file for playback review and for video sharing over the internet.03-31-2016
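A compact sketch of the follow-and-zoom update described above: the aimpoint rate combines the target's angular velocity (so the aimpoint moves at substantially the same velocity) with a proportional correction toward the target, and the zoom is nudged toward the value that restores a reference presentation ratio; the gains, the reference ratio, and the proportional zoom update are assumptions.

```python
def follow_step(target_angles, target_angular_vel, aim_angles,
                target_size_frac, zoom, ref_presentation_ratio=0.3,
                k_pos=1.5, k_zoom=0.5, dt=0.05):
    """One control update for camera aimpoint and zoom.

    target_angles / aim_angles: (pan, tilt) in degrees.
    target_angular_vel: (pan_rate, tilt_rate) of the target, deg/s.
    target_size_frac: target height as a fraction of the frame height."""
    # Aimpoint rate = feed-forward target velocity + proportional correction.
    rates = tuple(v + k_pos * (ta - aa)
                  for v, ta, aa in zip(target_angular_vel, target_angles, aim_angles))
    # Zoom moves toward the value that gives the reference presentation ratio.
    desired_zoom = zoom * ref_presentation_ratio / max(target_size_frac, 1e-6)
    new_zoom = zoom + k_zoom * (desired_zoom - zoom) * dt
    return rates, new_zoom

rates, zoom = follow_step(target_angles=(10.0, 2.0), target_angular_vel=(4.0, 0.0),
                          aim_angles=(9.0, 2.5), target_size_frac=0.2, zoom=2.0)
print(rates, round(zoom, 3))
```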
20160100146IMAGING APPARATUS, IMAGE PROCESSING METHOD, AND MEDIUM - An imaging apparatus acquires an image including a still image and a moving image, tracks a position of the object within an image capturing area of the image, records a luminance change within a region of the object being tracked, detects a point having a luminance change amount equal to or greater than a threshold, as a specular reflection area, calculates an object correction value, which is a color correction value of the object, based on color information of the specular reflection area, and corrects color overlap of the image based on the object correction value.04-07-2016
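The sketch below illustrates the two steps described above under simple assumptions: pixels in the tracked region whose frame-to-frame luminance increase meets a threshold are taken as the specular reflection area, and per-channel gains are derived from that area's average color (treated as an approximation of the illuminant) to correct the color cast; the threshold and the gain-based correction are assumptions.

```python
import numpy as np

def specular_mask(prev_rgb, curr_rgb, region, threshold=40.0):
    """Pixels in `region` (y0, y1, x0, x1) whose luminance jumped by at
    least `threshold` between frames are treated as specular reflection."""
    y0, y1, x0, x1 = region
    def luma(img):
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        return 0.299 * r + 0.587 * g + 0.114 * b
    delta = luma(curr_rgb[y0:y1, x0:x1].astype(float)) - \
            luma(prev_rgb[y0:y1, x0:x1].astype(float))
    return delta >= threshold

def color_correction_gains(curr_rgb, region, mask):
    """Per-channel gains that neutralize the average color of the specular
    area (assumed to approximate the illuminant color)."""
    y0, y1, x0, x1 = region
    patch = curr_rgb[y0:y1, x0:x1].astype(float)
    if not mask.any():
        return np.ones(3)
    mean_rgb = patch[mask].mean(axis=0)
    return mean_rgb.mean() / np.maximum(mean_rgb, 1e-6)

def correct_image(img_rgb, gains):
    """Apply the gains to the whole frame to remove the color overlap."""
    return np.clip(img_rgb.astype(float) * gains, 0, 255).astype(np.uint8)
```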
20160103200SYSTEM AND METHOD FOR AUTOMATIC TRACKING AND IMAGE CAPTURE OF A SUBJECT FOR AUDIOVISUAL APPLICATIONS - A system and method for automatically tracking and capturing an image of a subject is disclosed. The system and method include at least one magnetic emitter device emitting a magnetic field and a magnetic sensing device, attached to at least one subject, enabled to receive the magnetic field from the magnetic emitter device. The system and method further include an image capturing device in communication with the magnetic sensing device, in which the magnetic sensing device is enabled to communicate data to the image capturing device and the data enables the image capturing device to track a position of the subject. The image capturing device captures an image of the subject.04-14-2016
20160105679IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD - In imaging processing, image data outputted from an imaging device is acquired as a data string formed of a first display value expressed by a first number of gradations obtained through conversion of a luminance value in accordance with preset nonlinear conversion characteristics. The first display value is compressed in accordance with preset compression characteristics, and outputted as a second display value expressed by a second number of gradations smaller than the first number of gradations. A recognition target is detected from an image expressed by compressed data that is a data string formed of the second display value. A ratio of the second number of gradations to the first number of gradations is taken as a basic compression ratio, a luminance range including at least a recognition target range that is a luminance range where the recognition target is estimated to be present is taken as a specified range, and the first display value corresponding to a boundary luminance value that is a minimum luminance value in the specified range is taken as a boundary first display value. In the recognition target range, the compression characteristics are set so that the second display value is a sum of a compressed value and the boundary first display value, the compressed value being obtained by compressing a value of not less than the boundary first display value among the first display values at a low compression ratio lower than the basic compression ratio.04-14-2016
20160116564METHOD AND APPARATUS FOR FORWARDING A CAMERA FEED - A device tracks a user's field of vision/view (FOV). Based on the FOV, the device may receive video and/or audio from cameras having similar FOVs. More particularly, the device may fetch a camera feed from a camera having an FOV similar to the user's. Alternatively, the device may fetch a camera feed from a camera within the user's FOV.04-28-2016
20160117561DISTANCE SENSOR - In one embodiment, a distance sensor includes an image capturing device positioned to capture an image of a field of view and a first plurality of projection points arranged around a first lens of the image capturing device, wherein each projection point of the first plurality of projection points is configured to emit a plurality of projection beams in different directions within the field of view.04-28-2016
20160125638Automated Texturing Mapping and Animation from Images - A system for generating texture maps for 3D models of real-world objects includes a camera and reflective surfaces in the field of view of the camera. The reflective surfaces are positioned to reflect one or more reflected views of a target object to the camera. The camera captures a direct image of the target object and reflected images from the reflective surfaces. An image processor device separates the reflected views/images from the direct image of the target object in the captured image by detecting distortion in the reflected views. The image processor reduces distortion in the reflected views, and generates a texture map based on the direct and reflected views.05-05-2016
20160134805IMAGING APPARATUS, IMAGING METHOD THEREOF, AND COMPUTER READABLE RECORDING MEDIUM - An electronic device including an imaging device that captures an image of an object and generates image data of the object, a tilt angle detector that detects a tilt angle of the imaging device with respect to a horizontal plane, an azimuth detector that detects an azimuth angle between a reference orientation based on a preset orientation and an optical axis of the imaging device, and a display controller that causes a display to display target information indicating a direction to a target position for instructing a position of a capturing area of the imaging device in an image corresponding to the image data, with the optical axis of the imaging device being a reference, based on the tilt angle detected by the tilt angle detector and the azimuth angle detected by the azimuth detector.05-12-2016
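A small sketch of the direction indicator described above: the azimuth and tilt offsets between the target bearing and the current optical axis are wrapped into a signed range and turned into an on-screen arrow angle; the wrap-around convention and the arrow mapping are assumptions.

```python
import math

def wrap_deg(angle):
    """Wrap an angle difference into (-180, 180]."""
    return (angle + 180.0) % 360.0 - 180.0

def target_indicator(device_azimuth_deg, device_tilt_deg,
                     target_azimuth_deg, target_tilt_deg):
    """Returns the horizontal/vertical offsets (degrees) from the optical
    axis to the target direction and a screen-arrow angle for display
    (0 deg = right, 90 deg = up)."""
    d_az = wrap_deg(target_azimuth_deg - device_azimuth_deg)   # + = turn right
    d_tilt = target_tilt_deg - device_tilt_deg                 # + = tilt up
    arrow_deg = math.degrees(math.atan2(d_tilt, d_az))
    return {"d_azimuth_deg": d_az, "d_tilt_deg": d_tilt, "arrow_deg": arrow_deg}

# Example: camera facing 350 deg and level; target bearing 20 deg, 5 deg above.
print(target_indicator(350.0, 0.0, 20.0, 5.0))
```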
20160134806HYPERACUITY SYSTEM AND METHODS FOR REAL TIME AND ANALOG DETECTION AND KINEMATIC STATE TRACKING - Certain embodiments of the methods and systems disclosed herein determine a location of a tracked object with respect to a coordinate system of a sensor array by using analog signals from sensors having overlapping nonlinear responses. Hyperacuity and real time tracking are achieved by either digital or analog processing of the sensor signals. Multiple sensor arrays can be configured in a plane, on a hemisphere or other complex surface to act as a single sensor or to provide a wide field of view and zooming capabilities of the sensor array. Other embodiments use the processing methods to adjust to contrast reversals between an image and the background.05-12-2016
20160178991SMART ILLUMINATION TIME OF FLIGHT SYSTEM AND METHOD06-23-2016
20160182769APPARATUS AND METHOD FOR GENERATING MOTION EFFECTS BY ANALYZING MOTIONS OF OBJECTS06-23-2016
20160191820IMAGING APPARATUS, IMAGE PROCESSING APPARATUS, AND IMAGE PROCESSING METHOD - An imaging apparatus comprising: a subject recognition section for recognizing a subject image at rest on an imaging surface and a subject image moving on the imaging surface, an image shift-amount detection section for detecting a positional shift on the imaging surface, and an image composition section for executing composition processing by additive composition of the respective image data if the subject recognition section recognizes the subject image as the subject image at rest on the imaging surface and for correcting the positional shift of the moving subject image detected by the image shift-amount detection section if the subject recognition section recognizes the subject image as the subject image moving on the imaging surface and then, for executing the composition processing by relatively bright composition or additionally averaged composition of the respective corrected image data so as to generate taken image data with multiple exposure.06-30-2016
20160196667System and Method for Tracking07-07-2016
20160381267Hemispherical Star Camera - A digital camera optically couples a monocentric lens to image sensor arrays, without optical fibers, yet shields the image sensor arrays from stray light. In some digital cameras, baffles are disposed between an outer surface of a monocentric lens and each image sensor array to shield the image sensor arrays from stray light. In other such digital cameras, an opaque mask defines a set of apertures, one aperture per image sensor array, to limit the amount of stray light. Some digital cameras include both masks and baffles.12-29-2016
20180027161IMAGING DEVICE FOR SCENES IN APPARENT MOTION01-25-2018
20190147219METHOD FOR ESTIMATING A 3D TRAJECTORY OF A PROJECTILE FROM 2D CAMERA IMAGES05-16-2019
