Omek Interactive, Ltd.
Patent applications
|Patent application number|Title|Published|
|---|---|---|
|20140258942|INTERACTION OF MULTIPLE PERCEPTUAL SENSING INPUTS - A system and method for using multiple perceptual sensing technologies to capture information about a user's actions and for synergistically processing the information are described. Non-limiting examples of perceptual sensing technologies include gesture recognition using depth sensors, two-dimensional cameras, gaze detection, and/or speech recognition. Information about a user's gestures captured with one type of sensing technology often cannot be captured with another type of technology. Thus, using multiple perceptual sensing technologies allows more information to be captured about the user's gestures. Further, by synergistically leveraging the information acquired using multiple perceptual sensing technologies, a more natural user interface can be created for a user to interact with an electronic device.|09-11-2014|
|20140232631|MODEL-BASED MULTI-HYPOTHESIS TARGET TRACKER - The present disclosure describes a target tracker that evaluates frames of data of one or more targets, such as a body part, body, and/or object, acquired by a depth camera. Positions of the joints of the target(s) in the previous frame and the data from a current frame are used to determine the positions of the joints of the target(s) in the current frame. To perform this task, the tracker proposes several hypotheses and then evaluates the data to validate the respective hypotheses. The hypothesis that best fits the data generated by the depth camera is selected, and the joints of the target(s) are mapped accordingly.|08-21-2014|
|20140123077|SYSTEM AND METHOD FOR USER INTERACTION AND CONTROL OF ELECTRONIC DEVICES - A system and method for close range object tracking are described. Close range depth images of a user's hands and fingers are acquired using a depth sensor. Movements of the user's hands and fingers are identified and tracked. This information is used to permit the user to interact with a virtual object, such as an icon or other object displayed on a screen, or the screen itself.|05-01-2014|
|20140037135|CONTEXT-DRIVEN ADJUSTMENT OF CAMERA PARAMETERS - A system and method for adjusting the parameters of a camera based upon the elements in an imaged scene are described. The frame rate at which the camera captures images can be adjusted based upon whether the object of interest appears in the camera's field of view, to reduce the camera's power consumption. The exposure time can be set based on the distance of an object from the camera to improve the quality of the acquired camera data.|02-06-2014|
|20140022171|SYSTEM AND METHOD FOR CONTROLLING AN EXTERNAL SYSTEM USING A REMOTE DEVICE WITH A DEPTH SENSOR - A system and method for implementing a remote controlled user interface using close range object tracking are described. Close range depth images of a user's hands and fingers or other objects are acquired using a depth sensor. Using depth image data obtained from the depth sensor, movements of the user's hands and fingers or other objects are identified and tracked. The tracking data is transmitted to an external control device, thus permitting the user to interact with an object displayed on a screen controlled by the external control device, through movements of the user's hands and fingers.|01-23-2014|
|20130266174|SYSTEM AND METHOD FOR ENHANCED OBJECT TRACKING - A system and method are provided for object tracking using depth data, amplitude data and/or intensity data. In some embodiments, time of flight (ToF) sensor data may be used to enable enhanced image processing, the method including acquiring depth data for an object imaged by a ToF sensor; acquiring amplitude data and/or intensity data for an object imaged by a ToF sensor; applying an image processing algorithm to process the depth data and the amplitude data and/or the intensity data; and tracking object movement based on an analysis of the depth data and the amplitude data and/or the intensity data.|10-10-2013|
|20130265220|SYSTEM AND METHOD FOR COMBINING THREE-DIMENSIONAL TRACKING WITH A THREE-DIMENSIONAL DISPLAY FOR A USER INTERFACE - Systems and methods for combining three-dimensional tracking of a user's movements with a three-dimensional user interface display are described. A tracking module processes depth data of a user performing movements, for example, movements of the user's hand and fingers. The tracked movements are used to animate a representation of the hand and fingers, and the animated representation is displayed to the user using a three-dimensional display. Also displayed are one or more virtual objects with which the user can interact. In some embodiments, the interaction of the user with the virtual objects controls an electronic device.|10-10-2013|
|20130142417|SYSTEM AND METHOD FOR AUTOMATICALLY DEFINING AND IDENTIFYING A GESTURE - A system and method for creating a gesture and generating a classifier that can identify the gesture for use with an application are described. The designer constructs a training set of data containing positive and negative examples of the gesture. Machine learning algorithms are used to compute the optimal classification of the training data into positive and negative instances of the gesture. The machine learning algorithms generate a classifier which, given input data, makes a decision on whether the gesture was performed in the input data or not.|06-06-2013|
|20120327125|SYSTEM AND METHOD FOR CLOSE-RANGE MOVEMENT TRACKING - A system and method for close range object tracking are described. Close range depth images of a user's hands and fingers or other objects are acquired using a depth sensor. Using depth image data obtained from the depth sensor, movements of the user's hands and fingers or other objects are identified and tracked, thus permitting the user to interact with an object displayed on a screen, by using the positions and movements of his hands and fingers or other objects.|12-27-2012|
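The propose-and-evaluate loop described in the abstract of application 20140232631 can be sketched as follows. This is a minimal illustration only: the abstract does not say how hypotheses are generated or scored, so the Gaussian perturbation proposals and the nearest-point fitting error below are assumptions, not the patented method.

```python
import numpy as np

def score(joints, depth_points):
    """Fitting error: mean distance from each joint to its nearest depth point."""
    d = np.linalg.norm(depth_points[None, :, :] - joints[:, None, :], axis=2)
    return d.min(axis=1).mean()

def track_joints(prev_joints, depth_points, n_hypotheses=50, noise=0.05, seed=0):
    """Return the candidate joint configuration that best fits the depth data.

    prev_joints:  (J, 3) joint positions from the previous frame.
    depth_points: (N, 3) point cloud from the current depth frame.
    """
    rng = np.random.default_rng(seed)
    best, best_err = prev_joints, score(prev_joints, depth_points)
    for _ in range(n_hypotheses):
        # Propose a hypothesis by perturbing the previous frame's joints
        # (illustrative proposal scheme, not specified in the abstract).
        candidate = prev_joints + rng.normal(scale=noise, size=prev_joints.shape)
        err = score(candidate, depth_points)
        if err < best_err:
            best, best_err = candidate, err
    return best
```

Because the previous frame's configuration is itself one of the hypotheses, the selected result can never fit the current frame worse than simply reusing the old joints.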
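The two adjustments named in the abstract of application 20140037135 (frame rate keyed to object presence, exposure keyed to object distance) can be sketched as a single policy function. The specific frame rates, the inverse-square exposure scaling, and the 33 ms clamp below are illustrative assumptions; the abstract gives no formulas.

```python
def adjust_camera(object_in_view, distance_m=None,
                  idle_fps=5, active_fps=30, base_exposure_ms=1.0):
    """Return (frame_rate, exposure_ms) for the current scene context.

    Drop the frame rate when no object of interest is in the field of view,
    to reduce power consumption; lengthen the exposure for farther objects,
    whose return signal is weaker.
    """
    frame_rate = active_fps if object_in_view else idle_fps
    if object_in_view and distance_m is not None:
        # Assume returned light falls off with the square of distance,
        # so scale exposure accordingly, clamped below the frame period.
        exposure_ms = min(33.0, base_exposure_ms * distance_m ** 2)
    else:
        exposure_ms = base_exposure_ms
    return frame_rate, exposure_ms
```

For example, an empty scene yields the low-power setting `(5, 1.0)`, while an object two meters away yields `(30, 4.0)`.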
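One simple way to combine the ToF depth and amplitude channels mentioned in application 20130266174 is to treat amplitude as a per-pixel confidence when segmenting an object. The thresholding rule and the specific cutoff values below are assumptions for illustration; the abstract only says the channels are processed together.

```python
import numpy as np

def segment_object(depth, amplitude, depth_range=(0.2, 1.0), min_amplitude=0.1):
    """Combine ToF depth and amplitude images into one boolean object mask.

    Low-amplitude pixels carry unreliable depth measurements, so they are
    rejected even when their depth value falls inside the target range.
    """
    near, far = depth_range
    in_range = (depth >= near) & (depth <= far)   # depth gate
    reliable = amplitude >= min_amplitude          # confidence gate
    return in_range & reliable
```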
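The train-on-positive-and-negative-examples workflow in the abstract of application 20130142417 maps onto standard supervised learning. The abstract leaves the learning algorithm open, so the nearest-centroid model below is a deliberately minimal stand-in, not the patented classifier; feature vectors here are hypothetical fixed-length gesture descriptors.

```python
import numpy as np

class GestureClassifier:
    """Toy gesture detector trained on labeled positive/negative examples."""

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.pos_ = X[y == 1].mean(axis=0)  # centroid of gesture examples
        self.neg_ = X[y == 0].mean(axis=0)  # centroid of non-gesture examples
        return self

    def predict(self, X):
        X = np.asarray(X, float)
        d_pos = np.linalg.norm(X - self.pos_, axis=1)
        d_neg = np.linalg.norm(X - self.neg_, axis=1)
        return (d_pos < d_neg).astype(int)  # 1 = gesture performed
```

Once fitted, the classifier makes the decision the abstract describes: given new input data, it outputs whether the gesture was performed.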