Aquifi, Inc. Patent applications |
Patent application number | Title | Published |
20150192991 | Systems and Methods for Implementing Head Tracking Based Graphical User Interfaces (GUI) that Incorporate Gesture Reactive Interface Objects - Embodiments in accordance with this invention disclose systems and methods for implementing head tracking based graphical user interfaces that incorporate gesture reactive interface objects. The disclosed embodiments perform a method in which a GUI that includes interface objects is rendered and displayed. Image data of an interaction zone is captured. A targeting gesture targeting a targeted interface object is detected in the captured image data, and a set of 3D head interaction gestures is enabled. Additional image data is captured. Motion of at least a portion of a human head is detected and one of the 3D head interaction gestures is identified. The rendering of the interface is modified in response to the detected gesture, and the modified interface is displayed. | 07-09-2015 |
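The targeting-then-gesture flow described in application 20150192991 can be sketched as a small state machine. This is an illustrative sketch only, not the patented implementation; all class, method, and gesture names here are hypothetical.

```python
# Hypothetical sketch of the head-tracking GUI pipeline: a targeting
# gesture selects an interface object and enables 3D head gestures,
# after which an identified head gesture modifies the rendering.
class HeadTrackingGUI:
    """Tracks the targeted interface object and reacts to head gestures."""

    HEAD_GESTURES = {"nod", "shake", "lean_in"}  # illustrative gesture set

    def __init__(self, interface_objects):
        self.objects = list(interface_objects)   # rendered interface objects
        self.targeted = None                     # object selected by targeting
        self.gestures_enabled = False            # head gestures start disabled

    def process_targeting(self, detected_object):
        # A targeting gesture detected in the interaction zone selects an
        # object and enables the set of 3D head interaction gestures.
        if detected_object in self.objects:
            self.targeted = detected_object
            self.gestures_enabled = True
        return self.gestures_enabled

    def process_head_motion(self, gesture):
        # Once enabled, an identified head gesture yields a modified
        # rendering state for the targeted object; otherwise no change.
        if self.gestures_enabled and gesture in self.HEAD_GESTURES:
            return f"{self.targeted}:{gesture}"
        return None


gui = HeadTrackingGUI(["button_a", "slider_b"])
gui.process_targeting("button_a")
print(gui.process_head_motion("nod"))  # button_a:nod
```

The two-stage design mirrors the abstract: head gestures have no effect until a targeting gesture has been detected in the captured image data.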
20150089453 | Systems and Methods for Interacting with a Projected User Interface - A system and method for providing a 3D gesture based interaction system for a projected 3D user interface is disclosed. A user interface display is projected onto a user surface. Image data of the user interface display and an interaction medium are captured. The image data includes visible light data and IR data. The visible light data is used to register the user interface display on the projected surface with the Field of View (FOV) of at least one camera capturing the image data. The IR data is used to determine gesture recognition information for the interaction medium. The registration information and gesture recognition information are then used to identify interactions. | 03-26-2015 |
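The split in application 20150089453 between registration (from visible light data) and gesture sensing (from IR data) can be illustrated with a minimal coordinate-mapping sketch. The axis-aligned rectangle model and all function names below are simplifying assumptions, not the patent's method; a real system would register the projected display within the camera FOV more generally (e.g. with a homography).

```python
# Illustrative sketch: visible-light data registers the projected display
# in camera coordinates; IR-detected points are then mapped through that
# registration into display coordinates to identify interactions.
def register_display(visible_corners):
    """From visible-light data, record where the projected display's
    top-left and bottom-right corners fall in camera pixel coordinates.
    Assumes (for simplicity) an axis-aligned projected rectangle."""
    (x0, y0), (x1, y1) = visible_corners
    return {"x0": x0, "y0": y0, "w": x1 - x0, "h": y1 - y0}


def camera_to_display(reg, ir_point):
    """Map an IR-detected interaction point from camera pixels to
    normalized display coordinates using the registration."""
    cx, cy = ir_point
    u = (cx - reg["x0"]) / reg["w"]
    v = (cy - reg["y0"]) / reg["h"]
    return (u, v)


reg = register_display([(100, 50), (500, 350)])
print(camera_to_display(reg, (300, 200)))  # (0.5, 0.5): display center
```

Because registration and gesture recognition are computed from different spectral bands of the same image data, the projected imagery does not interfere with detecting the interaction medium.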
20150062004 | Method and System Enabling Natural User Interface Gestures with an Electronic System - An electronic device coupleable to a display screen includes a camera system that acquires optical data of a user comfortably gesturing in a user-customizable interaction zone having a z | 03-05-2015 |
20150062003 | Method and System Enabling Natural User Interface Gestures with User Wearable Glasses - User wearable eye glasses include a pair of two-dimensional cameras that optically acquire information for user gestures made with an unadorned user object in an interaction zone responsive to viewing displayed imagery, with which the user can interact. Glasses systems intelligently signal process and map acquired optical information to rapidly ascertain a sparse (x,y,z) set of locations adequate to identify user gestures. The displayed imagery can be created by glasses systems and presented with a virtual on-glasses display, or can be created and/or viewed off-glasses. In some embodiments the user can see local views directly, but augmented with imagery showing internet provided tags identifying and/or providing information as to viewed objects. On-glasses systems can communicate wirelessly with cloud servers and with off-glasses systems that the user can carry in a pocket or purse. | 03-05-2015 |
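The sparse (x,y,z) recovery from a pair of 2D cameras described in application 20150062003 resembles standard stereo triangulation. The sketch below uses the textbook pinhole-stereo depth formula z = f·B/disparity; the focal length, baseline, and function name are illustrative assumptions, not values or signal-processing details from the patent.

```python
# Hedged sketch of recovering a sparse set of (x, y, z) locations from
# matched feature points in a left/right camera pair, via the standard
# stereo relation z = focal_length * baseline / disparity.
def sparse_3d_points(matches, focal_px=500.0, baseline_m=0.06):
    """`matches` pairs a left-image point (xl, y) with its right-image
    match (xr, y), in pixels measured from the principal point.
    Returns a sparse list of (x, y, z) locations in meters."""
    points = []
    for (xl, y), (xr, _) in matches:
        disparity = xl - xr
        if disparity <= 0:
            continue  # skip degenerate or unmatchable points
        z = focal_px * baseline_m / disparity      # depth from disparity
        points.append((xl * z / focal_px,          # back-project x
                       y * z / focal_px,           # back-project y
                       z))
    return points


# One matched point with 10 px disparity lands roughly 3 m from the cameras.
print(sparse_3d_points([((50, 20), (40, 20))]))
```

Only sparse matched points are triangulated, which fits the abstract's emphasis on a sparse set of locations adequate to identify gestures rather than a dense depth map.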
20150057082 | Method and System to Create Three-Dimensional Mapping in a Two-Dimensional Game - Natural three-dimensional (x | 02-26-2015 |