Entries |
Document | Title | Date |
20080198131 | Temperature Feedback PC Pointing Peripheral - A pointing device implementation that uses temperature sensors mounted on or in a personal computer's pointing device or a game console's game controller to collect temperature information about the surface of the peripheral and send it to the connected personal computer or game console for use by the software running on that device. The software running on the connected personal computer or game console also sends instructions specifying the desired surface temperature of the pointing peripheral. In response to these instructions, the device heats up or cools down to reflect the context of the video game or software running on the screen or display. | 08-21-2008 |
20080204410 | RECOGNIZING A MOTION OF A POINTING DEVICE - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, a gesture recognition subsystem employing a wireless pointing device, and a pointing analysis subsystem also employing the pointing device are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs. | 08-28-2008 |
20080204411 | RECOGNIZING A MOVEMENT OF A POINTING DEVICE - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, a gesture recognition subsystem employing a wireless pointing device, and a pointing analysis subsystem also employing the pointing device are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs. | 08-28-2008 |
20080211771 | Approach for Merging Scaled Input of Movable Objects to Control Presentation of Aspects of a Shared Virtual Environment - A system for controlling operation of a computer. The system includes a sensing apparatus configured to obtain positional data of a sensed object controllable by a first user, such positional data varying in response to movement of the sensed object, and engine software operatively coupled with the sensing apparatus and configured to produce control commands based on the positional data, the control commands being operable to control, in a multi-user software application executable on the computer, presentation of a virtual representation of the sensed object in a virtual environment shared by the first user and a second user, the virtual representation of the sensed object being perceivable by the second user in a rendered scene of the virtual environment, where the engine software is configured so that the movement of the sensed object produces control commands which cause corresponding scaled movement of the virtual representation of the sensed object in the rendered scene that is perceivable by the second user. | 09-04-2008 |
20080211772 | SUCCESSIVELY LAYERED MODULAR CONSTRUCTION FOR A PORTABLE COMPUTER SYSTEM - A modular portable computer system is described. A top modular layer has a coupled display interface and is adapted to be interconnected with other modular layers. A second modular layer is interconnected with the top modular layer and other modular layers, for providing a power source to supply operating power to said top modular layer and to those other modular layers present, and is disposed beneath the top modular layer. A third modular layer is interconnected with the top modular layer and the second modular layer for providing baseline logic electronics and communication components to the modular portable computer system and is disposed beneath the top modular layer. A universal interconnect for providing electronic and communicative interconnection of each modular layer is disposed at least once on each modular layer. | 09-04-2008 |
20080218474 | Ultra Thin Optical Pointing Device and Personal Portable Device Having the Same - The present invention relates to an ultra thin optical pointing device, and a personal portable device having the optical pointing device. The optical pointing device includes a PCB ( | 09-11-2008 |
20080218475 | METHOD AND APPARATUS FOR DYNAMICALLY MODIFYING WEB PAGE DISPLAY FOR MOBILE DEVICES - A method of dynamically modifying web page displays used in various mobile devices. The method uses a motion detection mechanism to detect whether the mobile device is moving or in motion and then modifies web page displays sent to the device based upon the sensor readings. As such, the method, system, and apparatus are capable of automatically modifying a display provided to a mobile device based upon a determination that the user and/or device are moving and/or in motion. In another aspect, the method, system, and apparatus are also capable of modifying the complexity of a display provided to a mobile device based upon the degree of movement and/or motion by the user and/or device. | 09-11-2008 |
20080225000 | Cancellation of Environmental Motion In Handheld Devices - A method, system and computer program product for compensating the environmental motion in handheld devices. A sensor unit is affixed to an object in the environment to detect and measure environmental motion. Upon measuring any detected environmental motion, the sensor unit transmits a value corresponding to the measured environmental motion to one or more handheld devices. Alternatively, the sensor unit may transmit the value corresponding to the measured environmental motion to a unit configured to retransmit the value to one or more handheld devices. Upon receiving the value corresponding to the measured environmental motion, the handheld device cancels this environmental motion from the motion it measured thereby taking into consideration only the motion inputted by the user of the handheld device. | 09-18-2008 |
20080225001 | Method For Controlling an Interface Using a Camera Equipping a Communication Terminal - The invention concerns a method for controlling a graphic, audio and/or video interface using a camera equipping a communication terminal, which consists of acquiring and/or storing a first image, acquiring and storing a new image, computing the apparent movement by matching the two images, interpreting the apparent movement into user commands in accordance with a predetermined control mode, storing the user commands in a memory of said terminal, modifying the display or sound of the terminal according to the user commands, and optionally inputting a command validating an element or a graphic zone, opening a menu, triggering or scrolling an audio or video file, triggering a sound superimposition over a sound track, or executing a task or application by the user on the communication terminal, and optionally transmitting the same to a second terminal. | 09-18-2008 |
20080231596 | KEY SHAPED POINTING DEVICE - A key shaped pointing device adapted to be mounted on a keyboard of a notebook computer, a keypad of a mobile phone, or a keyboard of a PC includes a substrate including two opposite top latches; a circuit board releasably secured to the substrate; a chip fixedly mounted on the substrate and including a light source, a photosensor, and a processor; and a casing including a transparent top window and opposite openings releasably secured to the latches. The finger, as a reflection member, is adapted to contact the window for creating a first optical path from the light source to the photosensor via the finger. A movement of the finger is adapted to create a second optical path. The processor is adapted to calculate a direction and a distance corresponding to the movement by comparing the second optical path with the first optical path for generating a cursor control output. | 09-25-2008 |
20080238871 | Display apparatus and method for operating a display apparatus - A display apparatus includes a plurality of scanning lines and data lines and, at each intersection of said data lines and scanning lines, an electrochromic pixel element and a sensor element connected to the pixel element. The sensor element is sensitive to a user's graphical input, which changes a charge state of the respective pixel element. The display apparatus also includes a control means, which is configured to allow the pixel element to be selectively set to a first charge state corresponding to a first display state, or to a second charge state corresponding to a second display state. The second display state reflects the user's graphical input. The second charge state of the pixels may be brought about by exposing the sensor elements to light from a light pen or to a magnetic field from a magnetic pen, the pen being drawn across the display screen of the apparatus. Due to the direct action of the sensor elements in modulating the charge on the pixel elements, it is possible to display an image drawn on the screen instantly, without the need to scan the display to determine which pixels have changed their initial charge. This reduces the power consumption of the apparatus. | 10-02-2008 |
20080246726 | Portable electronic device having a pliable or flexible portion - A portable electronic device or handheld computer is disclosed. The device has a housing and computing electronics supported by the housing. A pliable sensor that is supported by the housing is also disclosed. The pliable sensor provides input from the hand of a user by applying pressure to the pliable sensor. | 10-09-2008 |
20080259028 | Hand glove mouse - A hand glove mouse, including first, second and third buttons or sensors and a cursor positioner that is not a motion or tilt sensor (i.e., a roller ball, a light sensor, etc.) for adjusting the position of a cursor on a computer monitor. In one embodiment the positioner and buttons are located on a user's body, but not on the user's fingertips. In another embodiment, the first, second and third sensors are load sensors on the user's fingertips with an actuation threshold above the load generated by typing. The hand glove mouse also includes a communication means communicating signals generated by the first, second and third buttons or sensors, and by the cursor positioner, to the computer. | 10-23-2008 |
20080259029 | Input Device Having the Function of Recognizing Hybrid Coordinates and Operating Method of the Same - The present invention relates to a coordinate input device to input a variety of job commands, diagrams or characters, and to store or output the input data. More particularly, the present invention provides an input device for recognizing hybrid coordinates and a method of operating the device. The input device uses an absolute coordinate recognition method and a relative coordinate recognition method in a combined fashion as a coordinate recognition method for inputting the track of the characters or diagrams. By doing so, input coordinates are converted into absolute coordinates, and the tracks of handwritten characters and diagrams are stored as the converted absolute coordinates such that the tracks are displayed on a display of the input device or on a monitor of an information terminal connected to the input device. Accordingly, problems of respective conventional input devices for recognizing coordinates can be solved, the structure of hardware can be simplified, and characters or diagrams can be input with accurate recognition of coordinates. | 10-23-2008 |
20080259030 | Pre-assembled part with an associated surface convertible to a transcription apparatus - An apparatus including a part board having an associated surface or a frame for a surface, such as a whiteboard, that is pre-assembled to include components that when connected to an external module convert the surface to an electronic transcription apparatus. In one version, the components include a set of sensors and electronics therefor, with wiring and a connector. | 10-23-2008 |
20080259031 | CONTROL APPARATUS, METHOD, AND PROGRAM - There is provided an intuitive, easy-to-use operation interface that is less liable to erroneous operations and is operated by a motion of a user. A motion operation mode is entered in response to recognition of a particular motion (preliminary motion) of a particular object in a video image and, after that, operation of any of various devices is controlled in accordance with various command motions recognized in a motion area being locked on. When an end command motion is recognized or after the motion area is unable to be recognized for a predetermined period of time, the lock-on is canceled to exit the motion operation mode. | 10-23-2008 |
20080266253 | SYSTEM AND METHOD FOR TRACKING A LASER SPOT ON A PROJECTED COMPUTER SCREEN IMAGE - A system and method for tracking a laser spot on a projected computer screen image computes a projective transformation matrix using the projected computer screen image electronically captured in a frame of image data. The projective transformation matrix is computed by fitting a polygon to a contour of each graphical object in the frame of image data and determining whether the polygon for each graphical object satisfies a set of predefined parameters to find a candidate polygon that corresponds to an outline of the projected computer screen image in the frame of image data. | 10-30-2008 |
20080273011 | Interactive game method and system with sports injury protection - The present invention discloses an interactive game method with sports injury protection, comprising: providing a remote pointing device for a user to swing; and triggering a safety mechanism in one or more of the following conditions: (1) when a user swings the remote pointing device drastically; (2) when a count of swings exceeds a first threshold; and (3) when a count of swings in a predetermined time period exceeds a second threshold. | 11-06-2008 |
20080278445 | FREE-SPACE MULTI-DIMENSIONAL ABSOLUTE POINTER WITH IMPROVED PERFORMANCE - According to one embodiment, a system includes a handheld device having a pixelated sensor, an optical filter for passing a predetermined frequency band of radiation to the sensor and a transmitter, an electronic equipment having a display, and at least two spaced-apart markers, each of which is positioned proximate to the display. The markers provide radiation at the frequency band passed by the optical filter. The handheld device includes a processor coupled to receive image data of the markers from the sensor for computing coordinate data from the image data. The coordinate data requires less data than the image data. The processor is coupled to the transmitter to transmit the coordinate data to the electronic equipment. Other methods and apparatuses are also described. | 11-13-2008 |
20080278446 | MULTI-SENSOR USER INPUT DEVICE FOR INTERACTIVE PROGRAMS ON A MACHINE - A user input device is provided for a machine having an interactive program and signals for a humanly-perceptible output for said program to allow a user in an inflatable physical structure to interact or perceive the output. The user input device includes one or more physical structures forming an enclosure adapted to accommodate and substantially laterally surround the body of at least one human user of the machine, a plurality of input elements which are disposed on the physical structure in three dimensions, wherein at least one of the input elements does not lie on the same horizontal or vertical plane on which others of said input elements lie, relative to the possible locations of the user so as to be accessible to the user by unseated body movement of the user, and an interface between the input elements and the machine for providing inputs from the input elements to the interactive program. | 11-13-2008 |
20080278447 | THREE-DIMENSIONAL MOUSE APPARATUS - A 3D mouse apparatus is disclosed in the present invention. The apparatus is mainly utilized to calculate, recognize, analyze and output 3D gestures, which include physical quantities such as the 3D position coordinates, displacement, velocity and acceleration of a point light source and the moving behavior of a human hand, so as to achieve the purpose of a 3D mouse apparatus. | 11-13-2008 |
20080291164 | COORDINATE INPUT APPARATUS, CONTROL METHOD THEREOF, AND PROGRAM - A plurality of sensors for receiving arrival light detect the change ranges of light amount distributions generated upon the pointing operation of a pointer on a coordinate input region. Coordinate values corresponding to the change ranges are calculated on the basis of the number of change ranges in the respective sensors and the number of pen-down signals obtained from the pointer. | 11-27-2008 |
20080297474 | Electronic Device and a Method for Controlling the Functions of the Electronic Device as Well as Program Product for Implementing the Method - The invention relates to an electronic device, which includes a display component, in which at least one controllable element is arranged to be visualized in its entirety, the control of which element is arranged to be based on determining a change (M) relating to the attitude or position of the device and camera means arranged to form image frames (IMAGE | 12-04-2008 |
20080309619 | Cursor control method applied to presentation system and computer readable storage medium - The invention discloses a cursor control method applied to a presentation system. The presentation system comprises a computer, an imaging plane, an optical pointer, a camera, and a projector. The projector is a mobile or built-in projector of the computer for projecting output from the computer onto the imaging plane, wherein the output of the computer comprises an internal cursor generated by the computer. The optical pointer is used for projecting an external cursor onto the imaging plane. The camera is a mobile or built-in camera of the computer for capturing an image of the imaging plane. After capturing the image, a processor of the computer detects both a first position of the external cursor and a second position of the internal cursor corresponding to the image, calculates a shift vector between the first and second positions, and moves the internal cursor based on the shift vector. | 12-18-2008 |
20080316171 | Low-Cost Haptic Mouse Implementations - Low-cost haptic interface device implementations for interfacing a user with a host computer. A haptic feedback device, such as a mouse or other device, includes a housing physically contacted by a user, and an actuator for providing motion that causes haptic sensations on the device housing and/or on a movable portion of the housing. The device may include a sensor for detecting x-y planar motion of the housing. Embodiments include actuators with eccentric rotating masses, buttons having motion influenced by various actuator forces, and housing portions moved by actuators to generate haptic sensations to a user contacting the driven surfaces. | 12-25-2008 |
20090009468 | MULTI-FUNCTIONAL WIRELESS MOUSE - A multi-functional wireless mouse ( | 01-08-2009 |
20090009469 | Multi-Axis Motion-Based Remote Control - Motion-based control of an electronic device uses an array of at least three reference elements forming a triangle. An image sensor (e.g., a video camera), which may be located on a user-manipulated device, captures an image of the array. The array image has a pattern formed by a nonparallel projection of the reference triangle onto the image sensor. The pattern carries information of the relative position between the image sensor and the reference element array, and changes as the relative position changes. The pattern is identified and used for generating position information, which may express a multidimensional position of the user-manipulated device with respect to three axes describing a translational position, and three rotational axes describing pitch, roll and yaw motions. The control system and method are particularly suitable for videogames. | 01-08-2009 |
20090009470 | WRISTWATCH-TYPE MOBILE TERMINAL HAVING PRESSURE SENSOR AND METHOD FOR RECEIVING USER INPUTS THEREIN - Disclosed is a wristwatch-type mobile terminal and method for receiving user inputs therein, which can efficiently receive user inputs through a pressure sensor. The method for receiving user inputs in a wristwatch-type mobile terminal includes providing a sensor unit to the wristwatch-type mobile terminal and detecting a user input through the sensor unit, deciding an operation corresponding to a corresponding location of the sensor unit depending on the user input, determining whether the decided operation relates to movement of a pointer movable on a display screen, determining whether pressure of the user input is more than a preset value when the decided operation relates to the movement of the pointer, and moving the pointer a first distance and displaying the pointer when the pressure of the user input is not more than the preset value, or moving the pointer a second distance and displaying the pointer when the pressure of the user input is more than the preset value. | 01-08-2009 |
20090009471 | INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, AND CONTROL METHOD - An input apparatus outputting input information for controlling a movement of a user interface displayed on a screen is provided. The input apparatus includes: an angular velocity output unit for outputting a first angular velocity about a first axis, a second angular velocity about a second axis, and a third angular velocity about a third axis; a combination calculation unit calculating a first combined angular velocity as a combination of two angular velocities obtained by respectively multiplying the second and third angular velocities by two migration coefficients of a predetermined ratio; and an output unit that outputs, as the input information, information on the first angular velocity for controlling a movement of the user interface on the screen in an axial direction corresponding to the second axis and information on the first combined angular velocity for controlling the movement of the user interface on the screen in an axial direction corresponding to the first axis. | 01-08-2009 |
20090015553 | IMAGE DISPLAYING APPARATUS, IMAGE DISPLAYING METHOD, AND COMMAND INPUTTING METHOD - A disclosed image displaying apparatus comprises: a photographing unit configured to photograph an image on a screen; a projection image generating unit generating the image to be projected on the screen; an image extracting unit extracting identification information from the image photographed by the photographing unit, the identification information concerning object or figure information; an object recognizing unit recognizing attribute information from the identification information concerning the object information extracted by the image extracting unit; a figure recognizing unit recognizing characteristic information from the identification information concerning the figure information extracted by the image extracting unit; and an operation processing unit operating the projection image generating unit based on the attribute information and characteristic information. | 01-15-2009 |
20090015554 | SYSTEM, CONTROL MODULE, AND METHOD FOR REMOTELY CONTROLLING A COMPUTER WITH OPTICAL TRACKING OF AN OPTICAL POINTER - A control module is used with a projector projecting an image frame having at least one icon from a computer onto a screen, and an optical pointer projecting an optical cursor onto the screen, and obtains a closed-loop optical track of the optical cursor based on positions of the optical cursor in a series of images of the projected image frame and the optical cursor captured by an image capturing device, selects an image portion of one captured image based on the closed-loop optical track of the optical cursor to serve as a standard sample image, compares the standard sample image with the image frame to determine an image block from the image frame most similar to the standard sample image, and generates a control signal based on the image block to control the computer to execute a command associated with at most one icon contained in the image block. | 01-15-2009 |
20090015555 | INPUT DEVICE, STORAGE MEDIUM, INFORMATION INPUT METHOD, AND ELECTRONIC APPARATUS - An optical input device has a display screen, a portion of the display screen being shielded from light by an operating member to implement input processing based on a plurality of input modes. The input device includes an input detection unit including a display unit that displays predetermined input information, and a control unit. The input detection unit detects an area and/or brightness of a light-shielded portion formed on the display screen by the operating member by approaching the display unit. The control unit provides display control of the display unit and input control on the basis of the detected area and/or brightness. The control unit compares the detected area and/or brightness with a predetermined threshold value to select a desired input mode from among the plurality of input modes. | 01-15-2009 |
20090021479 | Device for Extracting Data by Hand Movement - The invention relates to a hand receiver ( | 01-22-2009 |
20090021480 | POINTER LIGHT TRACKING METHOD, PROGRAM, AND RECORDING MEDIUM THEREOF - A pointer light tracking method wherein an all-black image with white square images located at its four corners is projected on the display, the display on which the all-black image and the white square images are displayed is shot by a camera, a domain corresponding to each white square image is extracted from the obtained image data, central coordinates (x, y) of the extracted domain are computed, and a parameter necessary for performing distortion correction, by use of projection conversion, on coordinates expressing the position of the pointer light on the display is computed from the computed central coordinates (x, y) and the central coordinates (X, Y) of the white square images. | 01-22-2009 |
20090027335 | Free-Space Pointing and Handwriting - A position detection method using one or more one-dimensional image sensors for detecting a light source ( | 01-29-2009 |
20090027336 | REMOTE CONTROLLED POSITIONING SYSTEM, CONTROL SYSTEM AND DISPLAY DEVICE THEREOF - The invention relates to a remote controlled positioning system, a control system and a display device thereof. The remote controlled positioning system includes a liquid crystal display (LCD) panel, a backlight source, a plurality of infrared ray (IR) sources and a directional remote controller. The LCD panel includes a plurality of display areas. The plurality of IR sources is disposed behind the LCD panel, wherein the IR sources are correspondingly disposed according to the positions of the display areas to respectively emit infrared rays that pass through the LCD panel. The directional remote controller receives the infrared rays emitted by the IR sources to obtain positional information about the position on the LCD panel pointed to by the directional remote controller. | 01-29-2009 |
20090027337 | ENHANCED CAMERA-BASED INPUT - Enhanced camera-based input, in which a detection region surrounding a user is defined in an image of the user within a scene, and a position of an object (such as a hand) within the detection region is detected. Additionally, a control (such as a key of a virtual keyboard) in a user interface is interacted with based on the detected position of the object. | 01-29-2009 |
20090027338 | Gestural Generation, Sequencing and Recording of Music on Mobile Devices - System and methods for an application that allows users to interactively create, transform and play music using cell phones, iPhones™ and other enabled mobile communication devices communicating with a remote host are disclosed. Using an enabled mobile communication device, users are able to strike the mobile device like a drum to create and record rhythms, enter melodies using the keypads, add voice recordings, and manipulate musical tracks by tilting the mobile device continuously in three dimensions. The musical input is sequenced in multiple tracks and the transformative manipulations are applied in real time, allowing users to create their songs in an expressive motion-based manner. | 01-29-2009 |
20090033621 | Inertial Sensor-Based Pointing Device With Removable Transceiver - An inertial sensor-based pointing device | 02-05-2009 |
20090033622 | Smartscope/smartshelf - The SmartScope technology implements perceptual interfaces with a focus on machine vision and establishes a footprint for data collection based on the field of view of the data collecting device. The SmartScope implemented in a retail environment integrates multiple perceptual modalities such as computer vision, speech and sound processing, and haptic (feedback) input/output into the customer's interface. The SmartScope computer vision technology will be used as an effective input modality in human computer interaction (HCI). | 02-05-2009 |
20090033623 | THREE-DIMENSIONAL VIRTUAL INPUT AND SIMULATION APPARATUS - The present invention relates to a three-dimensional virtual input and simulation apparatus, and more particularly to an apparatus comprising a plurality of point light sources, a plurality of optical positioning devices with a visual axis tracking function, and a control analysis procedure. The invention is characterized in that the plurality of optical positioning devices with the visual axis tracking function are provided for measuring and analyzing 3D movements of the plurality of point light sources to achieve the effect of a virtual input and simulator. | 02-05-2009 |
20090040178 | Position detecting device - An object of the present invention is to provide a position detecting device that allows simultaneous use by a plurality of users by recognizing positional information about a plurality of pointers and associating the pointers and the users. A position detecting device of the invention includes a rear projector ( | 02-12-2009 |
20090040179 | GRAPHIC USER INTERFACE DEVICE AND METHOD OF DISPLAYING GRAPHIC OBJECTS - A graphic user interface, an input/output computing apparatus for intuitive interfacing, and a method of interfacing are disclosed. The input/output computing apparatus for intuitive interfacing with a user, includes an input unit to detect one of a plurality of predetermined motions of the user and generate a signal corresponding to the detected predetermined motion, and a controller to carry out an operation corresponding to the signal and generate a control signal to display the result corresponding to the operation. | 02-12-2009 |
20090046061 | Dynamically Controlling a Cursor on a Screen when Using a Video Camera as a Pointing Device - A system provides for controlling a cursor on a screen automatically and dynamically when using a video camera as a pointing device. A computer displays static or dynamic content to a screen. A video camera connected to the computer points at the screen. As the video camera films the screen, frames captured by the video camera are sent to the computer. A target image is displayed by the computer onto the screen and marks the position of the screen cursor of the video camera. Frames captured by the video camera include the target image, and the computer dynamically moves the target image on the screen to ensure that the target image stays in the center of the view of the video camera. | 02-19-2009 |
20090046062 | POINTING DEVICE WITH CUSTOMIZATION OPTIONS - A pointing device that can interface with a graphical user interface of a computer or other electronic device. The pointing device includes a body having an upper portion and an underside. Also included is a tracking assembly having at least one sensor to detect movement and output a control signal responsive to the detected movement. The pointing device further includes several customization features. The customization features include mechanical customization features and software customization features. At least some of the mechanical customization features are configured to be replaceable. Such replaceable customization features are releasably mechanically coupled to the pointing device body. | 02-19-2009 |
20090046063 | Coordinate positioning system and method with in-the-air positioning function - A coordinate positioning system with in-the-air positioning function includes an illuminator and an image sensor. The illuminator produces a directional light. The image sensor receives the directional light produced by the illuminator and produces an image corresponding to the directional light to accordingly analyze the image and obtain a rotating angle corresponding to the directional light. | 02-19-2009 |
20090051650 | Pass through of remote commands - In one embodiment, a television set having remote control signaling to a controlled device has a data communication interface for communication of data, said data communication interface having a connection reserved for DC power. A remote control interface receives commands from a remote control device. A circuit determines whether a command received at the remote control interface is destined for the television set or for the controlled device. The controlled device is connected to the television set via the data communication interface. A modulator modulates a signal representing a command destined for the controlled device onto the DC power connection in order to convey the command to the controlled device. In another embodiment, a television accessory device that is interconnected to and controllable by the television has a data communication interface for communication of data, said data communication interface having a connection reserved for DC power. The accessory device is connected to the television via the data communication interface. A demodulator is coupled to the DC power connection and demodulates a signal representing a command that is modulated onto the DC power connection in order to receive a command from the television. A processor implements the command in the accessory device. This abstract is not to be considered limiting, since other embodiments may deviate from the features described in this abstract. | 02-26-2009 |
20090051651 | APPARATUS FOR REMOTE POINTING USING IMAGE SENSOR AND METHOD OF THE SAME - Problem: since a remote pointing system using an image sensor and having a communication function through an infrared remote controller is used in various environments, those environments must be considered when designing the system. Solution: a signal reception unit outputs a control signal for operating in the mode, among a remote control mode and a remote pointing mode, that corresponds to an infrared signal received from a remote controller. When receiving from the signal reception unit a control signal for operating in the remote pointing mode, an image reception unit obtains a background image during a first signal reception section and obtains an optical image corresponding to an infrared signal received from the remote controller during a second signal reception section; the infrared signal is not received from the remote controller during the first signal reception section and is received during the second. An image-processing unit creates a corrected optical image according to the difference value between the optical image and the background image. A pointing calculator calculates the distance to the remote controller according to the size of the corrected optical image input from the image-processing unit and calculates a movement amount of the remote controller according to the calculated distance, thereby solving the above problem. | 02-26-2009 |
20090051652 | CONTROL APPARATUS AND METHOD - The invention discloses a control apparatus capable of providing a user with control over an object displayed on a display apparatus by a pointing device. The control apparatus includes an image capturing module, a separating module, a positioning module, a constructing module, and a processing module. The image capturing module is used to record an image sequence including N images. The separating module is applied to capture the image of the pointing device from each image. The positioning module is used for calculating a specific point of the pointing device image of each image to generate a first set of specific point information. The constructing module is applied to generate a trajectory in accordance with a pre-defined criterion and the first set of specific point information. Additionally, the processing module is used to analyze the trajectory and generate a control signal to control the object. | 02-26-2009 |
20090051653 | TOY DEVICES AND METHODS FOR PROVIDING AN INTERACTIVE PLAY EXPERIENCE - The invention provides a unique interactive play experience carried out utilizing a toy “wand” and/or other actuation/tracking device. In one embodiment the wand incorporates a wireless transmitter and motion-sensitive circuitry adapted to actuate the transmitter in response to particular learned wand motions. The wand allows play participants to electronically and “magically” interact with their surrounding play environment simply by pointing, touching and/or using their wands in a particular manner to achieve desired goals or produce desired effects. Various wireless receivers or actuators are distributed throughout the play facility to support such wireless interaction and to facilitate full immersion in a fantasy experience in which participants can enjoy the realistic illusion of practicing, performing and mastering “real” magic. | 02-26-2009 |
20090058807 | MOUSE POINTER FUNCTION EXECUTION APPARATUS AND METHOD IN PORTABLE TERMINAL EQUIPPED WITH CAMERA - An apparatus and a method for executing a mouse pointer function in a portable terminal equipped with a camera are disclosed. The method includes: capturing an external image signal by a camera module; sampling the external image signal and converting a sampled image signal into image data; mapping a group of pixels of the image data generated by the sampling to a group of pixels of an image sensor for each unit pixel on a one-to-one basis; detecting coordinate values of image data including a point light source in the group of the mapped pixels; determining if a point light source is actually included in the detected coordinate values of the image data; and displaying the detected coordinate values of the image data determined to include the point light source on a screen. | 03-05-2009 |
20090066645 | WIRELESS MOUSE WITH ALPHANUMERIC KEYPAD - A wireless mouse, for use by a user's hand with a computer and a display having a cursor, that includes a hand-holdable housing and a battery carried within the housing. An integrated circuit is carried within the housing and includes radio communications circuitry adapted for communicating with the computer. At least one sensor is carried by the housing for moving the cursor across the display and at least one button is provided on the housing for clicking on objects on the display. An alphanumeric keypad is provided on the housing for typing on the display. | 03-12-2009 |
20090066646 | Pointing apparatus, pointer control apparatus, pointing method, and pointer control method - Provided are a pointing apparatus, a pointer control apparatus, a pointing method, and a pointer control method capable of recognizing image codes included in an image frame using an image sensor to determine a pointing direction, and of continuously updating the gain between the displacement of the motion of the pointing apparatus and the displacement of the motion of a displayed pointer. The pointing apparatus includes an image receiving unit sensing image patterns that exist in a sensed region, among all of the image patterns arranged in a display region; an inertial sensor sensing an input motion using at least one of the acceleration and angular velocity that are generated due to the motion; and a coordinate determining unit determining moving coordinates that are moved from the central coordinates of the sensed image pattern by coordinate displacement corresponding to the sensed motion. | 03-12-2009 |
20090066647 | GUI APPLICATIONS FOR USE WITH 3D REMOTE CONTROLLER - A remote wand for controlling the operations of a media system is provided. The wand may be operative to control the movement of a cursor displayed on screen by the position and orientation at which the wand is held. As the user moves the wand, the on-screen cursor may move. The user may use the wand to control a plurality of operations and applications that may be available from the media system, including for example zoom operations, a keyboard application, an image application, an illustration application, and a media application. | 03-12-2009 |
20090066648 | GUI APPLICATIONS FOR USE WITH 3D REMOTE CONTROLLER - A remote wand for controlling the operations of a media system is provided. The wand may be operative to control the movement of a cursor displayed on screen by the position and orientation at which the wand is held. As the user moves the wand, the on-screen cursor may move. The user may use the wand to control a plurality of operations and applications that may be available from the media system, including for example zoom operations, a keyboard application, an image application, an illustration application, and a media application. | 03-12-2009 |
20090073117 | Image Processing Apparatus and Method, and Program Therefor - An image processing apparatus includes an extracting unit configured to extract a feature point from a captured image; a recognizing unit configured to recognize a position of the feature point; a display control unit configured to perform control, based on the position of the feature point, to display a feature-point pointer indicating the feature point; and an issuing unit configured to issue, based on the position of the feature point, a command corresponding to the position of the feature point or a motion of the feature point. | 03-19-2009 |
20090085870 | Multimedia device - A multimedia device includes a display device, a storage device, an orientation sensor and a processing unit. The storage device stores a plurality of programs, each of which is capable of rendering an operation interface on the display device. The orientation sensor is capable of detecting the disposed orientation of the display device and sending an orientation signal. The processing unit is electrically connected to the display device, the storage device and the orientation sensor. The operation interface shown on the display device is determined by the processing unit in accordance with the orientation signal emitted from the orientation sensor. | 04-02-2009 |
20090091532 | REMOTELY CONTROLLING COMPUTER OUTPUT DISPLAYED ON A SCREEN USING A SINGLE HAND-HELD DEVICE - A method, hand-held device and system for remotely controlling computer output displayed on a screen. A single hand-held device is used to remotely control the output of the computer displayed on a screen, where the hand-held device includes both a laser pointer and a camera. The camera is aligned with the laser pointer in such a manner as to be able to capture an image on the screen where the light is projected from the laser pointer. Image matching software may then be used to match the captured image with the image of the output of the computer displayed on the screen. User input (e.g., left-click action) may be received which is then used by the computer to perform that action in connection with the location (e.g., print icon) on the image displayed on the screen by the computer that corresponds to the position the point of light is projecting. | 04-09-2009 |
20090091533 | Keyboard - A keyboard includes a control panel and a cursor controller. The cursor controller includes a first key unit provided on the control panel, a second key unit provided on the control panel and a photo-sensor unit provided on the control panel. When a user moves his finger over the photo-sensor unit, the photo-sensor unit senses the movement and causes a cursor to move on a screen of a display of a computer used with the keyboard. | 04-09-2009 |
20090102788 | Manipulation input device - When a vehicle navigation system is manipulated by taking pictures of a user's hand motions and gestures with a camera, the associated hand shapes and hand motions increase as the number of apparatuses and operational objects increases, making manipulation complex for the user. Furthermore, in detecting a hand with the camera, detection accuracy is reduced when the image of a face having color tone information similar to that of a hand appears in the camera image, or when outside light rays such as sun rays or illumination rays vary. To overcome these problems, a manipulation input device is provided that includes a limited hand manipulation determination unit and a menu representation unit, whereby a simple manipulation can be achieved and the manipulation can be determined accurately. In addition, detection accuracy can be improved by a unit that selects a single result from the results determined by a plurality of determination units, based on images taken with a plurality of cameras. | 04-23-2009 |
20090109176 | DIGITAL, DATA, AND MULTIMEDIA USER INTERFACE WITH A KEYBOARD - A system and corresponding method for providing a 3-dimensional (3-D) user interface to display images in a 3-D coordinate system. The 3-D interface generates and displays one type of holographic keyboard in response to a user's desired selection. The holographic keyboard provides versatility and ergonomic benefits to the user. Sensors are configured to sense user interaction within the 3-D coordinate system, so that a processor may receive user interaction information from the sensors. The sensors are able to provide information to the processor that enables the processor to correlate user interaction with images in the 3-D coordinate system. The system may be used for interconnecting or communicating between two or more components connected to an interconnection medium (e.g., a bus) within a single computer or digital data processing system. | 04-30-2009 |
20090115723 | Multi-Directional Remote Control System and Method - A multi-directional remote control system and method is adapted for use with an entertainment system of a type including a display ( | 05-07-2009 |
20090115724 | THREE-DIMENSIONAL OPERATION INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, CONTROL METHOD, METHOD OF PRODUCING A THREE-DIMENSIONAL OPERATION INPUT APPARATUS, AND HANDHELD APPARATUS - A three-dimensional operation input apparatus for controlling a pointer on a screen includes: a casing; a sensor for detecting a movement of the casing; a movement value calculation section for calculating, based on a detection value detected by the sensor, first and second movement values respectively corresponding to the movements of the casing in directions along first and second axes that are mutually orthogonal; and a modification section for calculating first and second modified movement values for respectively moving the pointer in first and second directions on the screen respectively corresponding to the first and second axes, the first modified movement value obtained by multiplying the first movement value by a first modification coefficient, the second modified movement value obtained by multiplying the second movement value by a second modification coefficient different from the first modification coefficient. | 05-07-2009 |
20090115725 | Input device and method of operation thereof - A generic input device, built of an electro-optical camera, sensors, buttons, and communication means, provides a way to operate, in absolute and/or relational mode, most software applications on many electronic platforms with a display, independent of the screen characteristics. | 05-07-2009 |
20090122010 | Apparatus for operating objects and a method for identifying markers from digital image frame data - An input apparatus other than a mouse is provided that supports input to a computer by the fingers or the mouth. An apparatus for operating objects according to the present invention operates an object on a screen based on an imaged operating element. The apparatus includes a computing apparatus; a display apparatus connected to the computing apparatus; and an imaging apparatus connected to the computing apparatus. The imaging apparatus images a predetermined operating element. The computing apparatus displays, as a marker, an image of the imaged operating element on the display apparatus. The object on the screen of the display apparatus is operated by movement of the marker. | 05-14-2009 |
20090128488 | OPTICAL NAVIGATION DEVICE WITH CONSOLIDATED PROCESSING FOR SURFACE AND FREE SPACE NAVIGATION - An optical navigation device for operation in a surface navigation mode and a free space navigation mode. The optical navigation device includes a microcontroller, a first navigation sensor, and a second navigation sensor. The first navigation sensor is coupled to the microcontroller, and the second navigation sensor is coupled to the first navigation sensor. The microcontroller processes a movement of the optical navigation device. The first navigation sensor generates a first navigation signal in a first navigation mode. The second navigation sensor generates a second navigation signal in a second navigation mode and sends the second navigation signal to the first navigation sensor. By implementing a navigation sensor to process signals from multiple navigation sensors, the cost and size of the optical navigation device can be controlled, and a small packaging design can be used. | 05-21-2009 |
20090128489 | METHODS AND DEVICES FOR REMOVING UNINTENTIONAL MOVEMENT IN 3D POINTING DEVICES - Systems and methods according to the present invention describe 3D pointing devices and methods which detect movement of the 3D pointing device and remove unintentional movement from the output readings. | 05-21-2009 |
20090140979 | ELECTRONIC APPARATUS - According to one embodiment, an electronic apparatus includes a housing having an outer surface and an inner surface, a pointing device having a flat input surface and located in the housing with the input surface on the inner surface of the housing, an operation area provided in a position on the outer surface of the housing corresponding to at least a part of the pointing device, and a display section which illuminates at least a part of an outline of the operation area and displays a position of the operation area. | 06-04-2009 |
20090140980 | DISPLAY SYSTEM AND METHOD FOR DETECTING POINTED POSITION - A plurality of infrared-light-emitting areas are displayed in a display screen of a liquid crystal display apparatus in a method that allows each of the infrared-light-emitting areas to be distinguished. Then, an image in a direction of a pointed position is captured by an operating device. Based on a result of distinguishing each of the infrared-light-emitting areas and a position of each of the infrared-light-emitting areas, a pointed position on the display screen is calculated. This makes it possible to properly detect the pointed position on the display screen pointed by the operating device, regardless of (i) a distance between the operating device and the display apparatus and (ii) a rotation angle of the operating device around an axis in an image capture direction of the operating device. | 06-04-2009 |
20090140981 | IMAGE SENSOR AND OPTICAL POINTING SYSTEM - Provided is an image sensor and optical pointing system using the same. The image sensor has a plurality of pixels, each pixel including a photocell for receiving light and generating an analog signal having a voltage corresponding to a quantity of the received light; a comparator for, in response to a shutter control signal, comparing the analog signal of the photocell with an analog signal of an adjacent pixel to generate a digital signal for movement calculation, or comparing the analog signal of the photocell with a reference voltage to generate a digital signal for shutter control; and a switch for transferring the digital signal for movement calculation and the digital signal for shutter control in response to a pixel selection signal. The optical pointing system includes: a reference voltage generation unit for generating a reference voltage; the image sensor; a signal selector for receiving the digital signal for movement calculation and the digital signal for shutter control, and selecting and outputting one of the two in response to a shutter control period selection signal; and a movement calculation and shutter control unit for receiving the digital signal for movement calculation to obtain an image of an object and output a movement value of the optical pointing system and the shutter control period selection signal, and receiving the digital signal for shutter control to compare a high-level count value with a maximum count value and a minimum count value and output the shutter control signal. | 06-04-2009 |
20090146950 | METHOD AND SYSTEM OF VISUALISATION, PROCESSING, AND INTEGRATED ANALYSIS OF MEDICAL IMAGES - The present invention concerns a method and a system for managing a station for the visualisation, processing and analysis of medical images based on non-manual commands, particularly optical and vocal, able to provide feedback to the user to direct further exploration of the medical images. | 06-11-2009 |
20090146951 | User Interface Devices - A method and apparatus for a user interface having multiple motion dots capable of detecting user inputs are disclosed. In one embodiment, a user interface (“UI”) device includes a first motion dot and a second motion dot. The first motion dot is capable of attaching to a first finger and the second motion dot is configured to attach to a second finger. The first finger, in one example, is a thumb and the second finger is an index finger. The first motion dot includes multiple accelerometers used for identifying the physical location of the first motion dot. The second motion dot, which is logically coupled to the first motion dot via a wireless communications network, is capable of detecting a user input in response to a relative physical position between the first and the second motion dots. | 06-11-2009 |
20090146952 | DISPLAY SYSTEM AND LIQUID CRYSTAL DISPLAY DEVICE - A display system includes a liquid crystal display apparatus and an operating device. The operating device can point to a desired position on a display screen of the display apparatus and capture an image of the display apparatus to obtain a captured image including the desired position pointed to by the operating device. The display system includes an image analyzing section for analyzing the captured image to find reference light areas in the display apparatus and to detect the desired position on the display screen based on the positions of the reference light areas in the captured image. The display apparatus also includes a blinking controlling section which controls the blinking conditions of all the backlight units in the display apparatus so that the blinking condition of two or more reference light area backlight units differs from that of the other backlight units. This allows the desired position on the display screen pointed to by the operating device to be detected properly. | 06-11-2009 |
20090153477 | Computer mouse glove - A computer mouse glove for transferring computer mouse functions to the hand of a computer user. The glove includes: a glove member having finger fittings and a thumb fitting; a computer cursor control system having buttons and a tracking system having an optical tracking device; a computer module; a power module; a connection module; a tracking ball; and a power switch. The glove member encases a user's hand. The computer cursor control system controls functions of a cursor on a computer screen. The buttons provide mouse electrical switching functions. The tracking system controls movement of the cursor on a computer screen. The power module provides energy to the computer glove mouse. The connection module transmits electronic signals from the computer glove mouse to the computer module. The tracking ball controls movement of the computer cursor. The power switch enables a user to select either the tracking ball or the optical tracking device. | 06-18-2009 |
20090153478 | Centering a 3D remote controller in a media system - An electronic device associated with a remote wand controlling the operations of the electronic device is provided. The wand may include a motion detection component operative to provide an output reflecting the motion of the wand to the electronic device, such that the movements of a cursor displayed by the electronic device may be related to the output of the motion detection component. The wand may also include an input mechanism operative to receive user inputs. Using the input mechanism, the wand may detect a user's inputs and direct the electronic device to zoom or scroll displayed objects. The electronic device may display a screen saver by which the user may select particular media items for playback while remaining in the screen saver mode. In some embodiments, the electronic device may display video with a scroll bar that includes a preview window of the video. | 06-18-2009 |
20090153479 | Positioning Device of Pointer and Related Method - A positioning device for positioning an aim point of a pointer on a screen includes a screen, a pointer and a processor. The screen is utilized for displaying a plurality of characteristic points having already-known coordinate values. The pointer is utilized for forming an aim point, and includes an image acquisition unit for acquiring an image and a calculation unit for calculating image coordinate values of the plurality of characteristic points in the image. The processor is coupled to the screen and the pointer, and is utilized for establishing a transformation matrix according to the already-known coordinate values of the plurality of characteristic points and the image coordinate values of the plurality of characteristic points in the image and for deciding the position of the aim point according to the transformation matrix. | 06-18-2009 |
20090160768 | Enhanced Presentation Capabilities Using a Pointer Implement - Providing enhanced presentation capabilities using a pointer implement. In an embodiment, a user operates a key on a pointer implement to cause the pointer implement to capture the display image on a screen and send the captured image frame to a digital processing system. The digital processing system examines the image frame to determine the location of a beam spot caused by the pointer implement, which can be used as a basis for several user features. For example, a user may cause the digital processing system to draw a line on the screen or use the pointer implement as a mouse as well. | 06-25-2009 |
20090167682 | INPUT DEVICE AND ITS METHOD - In an input device capable of easily operating a touch panel at hand while viewing a forward display screen, the accuracy of sensing a hand shape and the accuracy of sensing a gesture are improved. | 07-02-2009 |
20090167683 | INFORMATION PROCESSING APPARATUS - According to one embodiment, an optical position detection IC outputs movement amount information in accordance with movement of an object on a detection area including a light-transmissive area which is disposed on a top surface of a housing. A control module controls a movement direction and a movement amount of a cursor, which is displayed on a display screen of a display device, based on an attitude signal which indicates in which of two directions the optical position detection IC is disposed, and the movement amount information which is output from the optical position detection IC. | 07-02-2009 |
20090174657 | ENTERTAINMENT SYSTEM AND PROCESSING APPARATUS - In a game system | 07-09-2009 |
20090174658 | SYSTEM AND METHOD OF ADJUSTING VIEWING ANGLE FOR DISPLAY BASED ON VIEWER POSITIONS AND LIGHTING CONDITIONS - A method for adjusting a viewing angle of a display, includes determining a location of one or more viewers and determining lighting conditions. Additionally, the method includes calculating an optimal viewing position of the display based on the location of the one or more viewers and the lighting conditions and adjusting the display based on the optimal viewing position. | 07-09-2009 |
20090179858 | APPARATUS AND METHOD GENERATING INTERACTIVE SIGNAL FOR A MOVING ARTICLE - Apparatus and method generate an interactive signal for a moving article such as an airplane model. The airplane model is provided with a human-sensible interactive signal source, and the moving status of the airplane model, such as its velocity, is detected to generate a movement parameter. The movement parameter is operated on with a frequency-dependent conversion function to obtain a first interactive data. A second interactive data is generated when the trace of the moving article matches a default pattern. A third interactive data is generated when the velocity along at least one dimension exceeds a threshold value. The interactive signal source, such as a loudspeaker or lamps, is selectively driven by one of the interactive data to generate a movement-dependent audiovisual effect. Therefore, the apparatus and method generating an interactive signal for a moving article can provide an enhanced amusement effect for the user. | 07-16-2009 |
20090189857 | TOUCH SENSING FOR CURVED DISPLAYS - Described herein is an apparatus that includes a curved display surface that has an interior and an exterior. The curved display surface is configured to display images thereon. The apparatus also includes an emitter that emits light through the interior of the curved display surface. A detector component analyzes light reflected from the curved display surface to detect a position on the curved display surface where a first member is in physical contact with the exterior of the curved display surface. | 07-30-2009 |
20090189858 | Gesture Identification Using A Structured Light Pattern - In at least some embodiments, a computer system includes a processor. The computer system also includes a light source. The light source provides a structured light pattern. The computer system also includes a camera coupled to the processor. The camera captures images of the structured light pattern. The processor receives images of the structured light pattern from the camera and identifies a user gesture based on distortions to the structured light pattern. | 07-30-2009 |
20090207134 | REMOTE CONTROL APPARATUS WITH INTEGRATED POSITIONAL RESPONSIVE ALPHABETIC KEYBOARD - In accordance with one embodiment, a remote control apparatus includes a first transmission means for use when the remote control apparatus is in a horizontal orientation and a second transmission means for use when the remote control apparatus is in a vertical orientation. The remote control apparatus further includes a keypad having a plurality of keys that have a first set of labels for use in the horizontal orientation and a second set of labels for use in the vertical orientation. In addition, a means for determining whether the remote control is in the horizontal orientation or the vertical orientation is provided as part of the remote control apparatus. At least some of the keys have a first functionality when in the horizontal orientation and a second functionality when in the vertical orientation. | 08-20-2009 |
20090207135 | SYSTEM AND METHOD FOR DETERMINING INPUT FROM SPATIAL POSITION OF AN OBJECT - A system and method for determining an input is provided. The system includes an object position determination device and an input determination device. The object position determination device is configured to determine a first position of an object at a first time and a second position of the object at a second time. The object position determination device includes a camera configured to detect light traveling from the object to the camera. The input determination device is configured to determine an input based at least partly upon the first position and the second position. The object position determination device can include a second camera. The object can include a radio frequency emitter. The object can include an infrared emitter. The object can be an electronic device. | 08-20-2009 |
20090213071 | CONTROL SYSTEM AND METHOD FOR A CURSOR IN DISPLAY DEVICE - The present invention discloses a control system for a cursor in a display device, in which said display device is connected to a data processing device and said system comprises: an image acquisition device for acquiring user image information and sending a signal; a signal receiving unit for receiving the signal from the image acquisition device; a signal processing unit for parsing said signal and determining whether the cursor needs to be shifted and, if so, the target region the cursor should reach; and a cursor control unit for sending a cursor control signal to said display device based on the determination result of the signal processing unit and shifting the cursor to said target region. Accordingly, the present invention further discloses a control method for a cursor in a display device. | 08-27-2009 |
20090213072 | REMOTE INPUT DEVICE - An input device providing users with a pointing capability includes a sender portion and a receiver portion. The sender portion is adapted to be manipulated by a user to specify a target point within a target area. The sender portion projects a light beam including a pattern on to the target area. A receiver portion includes one or more sensor units located in or near the target area. At least some of the sensor units receive a portion of the light beam regardless of the location of the target point within the target area. A processing unit in the receiver portion analyzes the portions of the light beam received by one or more sensor units to determine an attribute of the target point. The attribute can be the location or relative motion of the target point. The receiver portion may be integrated with a display device. | 08-27-2009 |
20090231278 | Gesture Based Control Using Three-Dimensional Information Extracted Over an Extended Depth of Field - Systems and methods are described for gesture-based control using three-dimensional information extracted over an extended depth of field. The system comprises a plurality of optical detectors coupled to at least one processor. The optical detectors image a body. At least two optical detectors of the plurality of optical detectors comprise wavefront coding cameras. The processor automatically detects a gesture of the body, wherein the gesture comprises an instantaneous state of the body. The detecting comprises aggregating gesture data of the gesture at an instant in time. The gesture data includes focus-resolved data of the body within a depth of field of the imaging system. The processor translates the gesture to a gesture signal, and uses the gesture signal to control a component coupled to the processor. | 09-17-2009 |
20090244005 | INPUT SYSTEM INCLUDING POSITION-DETECTING DEVICE - A position-detecting device detects a position pointed to by a position-pointing instrument and includes an operation panel detecting the position pointed to by the position-pointing instrument; and a manipulation-detecting unit located at at least one of the interior and the exterior of the operation panel, and detecting a manipulation by a second instrument other than the position-pointing instrument, or detecting a manipulation by both the position-pointing instrument and the second instrument. | 10-01-2009 |
20090251411 | Computer input device for automatically scrolling in different speed - A computer input device includes a body and a trace-detecting module coupled to the body. The body has a micro control unit (MCU), and the trace-detecting module has a light-pervious area and a trace-detecting unit. The trace-detecting unit further includes a light source and a sensor. The sensor senses a light beam reflected from a user's digit moving on the light-pervious area and thereby measures the velocity of the movement. If the automatic scrolling mode is activated and the velocity exceeds a threshold stored in the MCU, the MCU executes automatic scrolling at a predetermined scrolling speed. | 10-08-2009 |
20090251412 | MOTION SENSING INPUT DEVICE OF COMPUTER SYSTEM - A motion sensing input device of a computer system includes a motion sensor and a receiver. The motion sensor has two gyroscopes for sensing motions in different directions when a user operates the motion sensor. The motion signals are wirelessly transmitted to the receiver connected with the computer system. Furthermore, the motion signals are directly decoded via the receiver, and then they are converted to keyboard input signals. When the user plays a computer game by using the motion sensor to replace a keyboard, the user can experience a lifelike game environment and an intuitive operation mode. | 10-08-2009 |
20090262073 | TOUCH SENSITIVE REMOTE CONTROL SYSTEM THAT DETECTS HAND SIZE CHARACTERISTICS OF USER AND ADAPTS MAPPING TO SCREEN DISPLAY - Sensors around the periphery of the remote control unit detect contact with the user's hand. A trained model-based pattern classification system analyzes the periphery sensor data and makes a probabilistic prediction of the user's hand size. The hand size is then used to control a mapping system that defines how gestures by the user's thumb upon a touchpad of the remote control unit are mapped to the control region upon a separate display screen. | 10-22-2009 |
20090262074 | CONTROLLING AND ACCESSING CONTENT USING MOTION PROCESSING ON MOBILE DEVICES - Various embodiments provide systems and methods capable of facilitating interaction with handheld electronics devices based on sensing rotational rate around at least three axes and linear acceleration along at least three axes. In one aspect, a handheld electronic device includes a subsystem providing display capability, a set of motion sensors sensing rotational rate around at least three axes and linear acceleration along at least three axes, and a subsystem which, based on motion data derived from at least one of the motion sensors, is capable of facilitating interaction with the device. | 10-22-2009 |
20090267897 | REMOTE CONTROL TRANSMITTER - A remote control transmitter detects motion in a specific direction or in a rotational direction around a specific axis. The transmitter includes a battery placed on a bottom surface side within a case containing circuitry of the transmitter. The bottom surface side has a convex surface, with a center of curvature C located vertically above a center of gravity G of the transmitter. When placed on a flat surface, the transmitter assumes a stable orientation such that, when the transmitter is grasped in order to perform a motion-based operation, it can be assumed that the vertical direction is the direction of the line joining the center of curvature C of the stable portion to the center of gravity G, serving as an absolute reference direction for detecting the motion operation. | 10-29-2009 |
20090267898 | INPUT APPARATUS AND CONTROL SYSTEM - An input apparatus includes: a casing; a sensor module that includes a reference potential and outputs, as a detection signal, a fluctuation of a potential with respect to the reference potential that corresponds to a movement of the casing; a velocity calculation unit to calculate a pointer velocity value as a velocity value for moving a pointer based on an output of the sensor module; a first execution section to execute a calibration mode as processing for correcting the reference potential; a second execution section to execute an operation mode as processing for moving the pointer on a screen in accordance with the pointer velocity value calculated by the velocity calculation unit; and a switch to switch between the execution of the calibration mode and the execution of the operation mode in accordance with an input operation from outside. | 10-29-2009 |
20090278798 | Active Fingertip-Mounted Object Digitizer - A finger-mounted implement including a kinesthetic sensor, at least one tactile sensor, and means for securing the kinesthetic sensor and the at least one tactile sensor to a fingertip. The tactile sensor may be a thin-film force transducer, a piezoelectric accelerometer, or a combination thereof. An artificial fingernail may be connected to the accelerometer. The kinesthetic sensor may include a magnetic transducer and may sense an X-Y-Z position and an angular orientation of a fingertip to which the kinesthetic sensor is secured. The securing means may include at least one means selected from the group consisting of adhesive tape, an elastically deformable cover, and detachable adhesive. The implement can be further connected to a computer processing system for, amongst other things, the virtual representation of sensed objects. The implement can also be used as part of a method of haptic sensing of objects. | 11-12-2009 |
20090278799 | COMPUTER VISION-BASED MULTI-TOUCH SENSING USING INFRARED LASERS - The claimed subject matter provides a system and/or a method that facilitates detecting a plurality of inputs simultaneously. A laser component can be coupled to a line generating (LG) optic that can create a laser line from an infrared (IR) laser spot, wherein the laser component and line generating (LG) optic emit a plane of IR light. A camera device can capture a portion of imagery within an area covered by the plane of light. The camera device can be coupled to an IR-pass filter that can block visible light and pass IR light in order to detect a break in the emitted plane of IR light. An image processing component can ascertain a location of the break within the area covered by the emitted plane of IR light. | 11-12-2009 |
20090278800 | METHOD OF LOCATING AN OBJECT IN 3D - Methods and devices for calculating the position of a movable device are disclosed. The device may include multiple optical detectors (ODs) and the movable device may include light sources. Optics may be above the ODs. A controller may calculate the position of the light source based on data from the ODs and properties of the optics. The device may be a game console, and the light source may be a game controller. The roles of the OD and light sources may be interchanged. The rotation of the movable device may be determined using multiple light sources and/or multiple ODs on the movable device. The movable device may calculate its position and transmit it to a console. The light sources may be modulated by time or frequency to distinguish between the light sources. There may be two or more movable devices. There may be two or more consoles. | 11-12-2009 |
20090284469 | VIDEO BASED APPARATUS AND METHOD FOR CONTROLLING THE CURSOR - A video-based apparatus and method for controlling the cursor are provided. A video camera is used to acquire a hand image of a user, and the image is then analyzed and processed to move the cursor and to take the place of the left and right mouse button functions. The user may use a “V” shaped hand gesture to replace the mouse. An index finger image corresponds to the left mouse button, and a middle finger image corresponds to the right mouse button. A valley point of the “V” shaped hand gesture corresponds to the position of the cursor. | 11-19-2009 |
20090289896 | INPUT ARRANGEMENT FOR ELECTRONIC DEVICES - A new form of input arrangement for cursor control devices or other handheld electronic devices in which the activation surfaces are designed to allow the fingers and thumb of the user to effect commands by means of ergonomic and less repetitive motion as compared to current devices. In embodiments, sensors associated with the user's digits sense motion not only in the downward direction but in multiple directions. In embodiments, the cursor control device has resilient pads to further add comfort for a user. The resulting ability of the user to vary and reduce the points of pressure and other stresses on different surfaces of the digits and the corresponding nerves and muscles serves to reduce the discomfort and pain resulting from current devices. | 11-26-2009 |
20090295718 | MULTIPLE INPUT OPTICAL NAVIGATION SYSTEM - An optical navigation system with optical imaging of multiple inputs using a single navigation sensor. The optical navigation system includes a tactile interface device, an image sensor, and a processor. The tactile interface device facilitates a navigation input. The image sensor intermittently generates images of a surface of the tactile interface device and images of a contact navigation surface. The image sensor also generates the images of the surface of the tactile interface device exclusive of the images of the contact navigation surface. The processor is coupled to the image sensor. The processor generates a first navigation signal based on the images of the tactile interface device and generates a second navigation signal based on the images of the contact navigation surface. | 12-03-2009 |
20090295719 | DTV CAPABLE OF RECEIVING SIGNAL FROM 3D POINTING DEVICE, AND METHOD OF EXECUTING FUNCTION AND ADJUSTING AUDIO PROPERTY OF DTV EMPLOYING 3D POINTING DEVICE - A system and method for controlling a digital TV (DTV) are provided, the system including the DTV and a 3D pointing device. The DTV includes a receiver to receive control signals from the 3D pointing device; and a control unit to select one of a plurality of functions provided by the DTV in response to a selection signal received from the 3D pointing device, each function having a corresponding execution profile, and to execute the selected function in accordance with the corresponding execution profile and a motion parameter sensed by the 3D pointing device, including a direction of movement and one of a distance and a velocity corresponding to the direction of movement. The 3D pointing device includes a transmitter to transmit the control signals and the motion parameter; and a sensor to sense the motion of the 3D pointing device and generate the corresponding motion parameter. | 12-03-2009 |
20090295720 | METHOD FOR EXECUTING MOUSE FUNCTION OF ELECTRONIC DEVICE AND ELECTRONIC DEVICE THEREOF - A method for executing a mouse function of an electronic device and an electronic device thereof are provided. In the present method, the amount and the relative position of input signals are detected by a sensor module. Then, it is determined whether the amount and the relative position each conform to a predetermined value. If the predetermined values are met and a variation of the relative position occurs, it is determined whether the input signal conforms to a specific signal. Finally, a corresponding mouse function is executed according to the type of the variation if the variation conforms to the specific signal. As a result, a mouse device is no longer needed for a user to perform directional operations on the electronic device, avoiding the inconvenience of carrying a separate mouse device. | 12-03-2009 |
20090295721 | INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, AND CONTROL METHOD - An input apparatus is provided that includes a main body operated by a user in a first operation form, in which a predetermined position on a screen is pointed to using a pointer, and in a second operation form different from the first operation form. The input apparatus includes an operation form detection section to detect which of the first operation form and the second operation form the operation form of the main body is; and a movement detection section to detect a movement of the main body. The input apparatus further includes an operational section to switch between a first operational mode corresponding to the first operation form and a second operational mode corresponding to the second operation form according to the operation form of the main body, and to calculate a corresponding movement value for the movement of the pointer on the screen that corresponds to the detected movement of the main body. | 12-03-2009 |
20090295722 | INPUT APPARATUS, CONTROL SYSTEM, HANDHELD APPARATUS, AND CALIBRATION METHOD - An input apparatus includes a casing, an output section, a processing section, a judgment section, and a calibration section. The output section includes a reference potential and outputs, as a detection signal, a fluctuation of a potential with respect to the reference potential, that corresponds to a movement of the casing. The processing section executes processing for generating a control command for controlling a screen based on an output value of the output section. The judgment section judges an operational state of the input apparatus. The calibration section executes, when it is judged that the input apparatus is moving, a calibration mode as processing for correcting the reference potential by obtaining the output value over a data obtainment period that is a sampling period of the output value and calculating a correction value of the reference potential based on the obtained output value. | 12-03-2009 |
20090295723 | INTERACTIVE DISPLAY SYSTEM - An interactive display system comprises a white board which communicates with a PC. A projector receives signals from the PC which are translated into a corresponding image projected onto the white board. The image projected onto the white board is the same as that shown on the computer screen. An electronic pen, whose position can be detected electronically by means of a plurality of wires embedded beneath the surface of the white board using methods already known in the art, can function in the same way as a computer mouse. The image projected onto the white board may also be manipulated by means of a remote control device, which uses infrared communication to transmit signals to a transponder built into the white board. | 12-03-2009 |
20090309831 | WIRELESS CONTROL DEVICE AND MULTI-CURSOR CONTROL METHOD - A wireless control device and a multi-cursor control method for use with a computer system are provided. The application program of the computer system is executed to generate at least a first pointer and a second pointer. The wireless control device includes a first pointer controller, a second pointer controller and a wireless transceiver. The multi-cursor control method includes steps of issuing a first wireless control signal containing a first identification code in response to manipulation of a first user, issuing a second wireless control signal containing a second identification code in response to manipulation of a second user, receiving and transmitting the first wireless control signal and the second wireless control signal to the computer system, and controlling corresponding shifts of the first pointer and the second pointer defined in the application program according to the first identification code and the second identification code, respectively. | 12-17-2009 |
20090315829 | Multi-User Pointing Apparatus and Method - An apparatus for interaction of a plurality of users with an application, comprising: a plurality of pointing devices ( | 12-24-2009 |
20090322676 | GUI APPLICATIONS FOR USE WITH 3D REMOTE CONTROLLER - A remote wand for controlling the operations of a media system is provided. The wand may be operative to control the movement of a cursor displayed on screen by the position and orientation at which the wand is held. As the user moves the wand, the on-screen cursor may move. The user may use the wand to control a plurality of operations and applications that may be available from the media system, including for example zoom operations, a keyboard application, an image application, an illustration application, and a media application. | 12-31-2009 |
20090322677 | LIGHT GUIDE PLATE FOR SYSTEM INPUTTING COORDINATE CONTACTLESSLY, A SYSTEM COMPRISING THE SAME AND A METHOD FOR INPUTTING COORDINATE CONTACTLESSLY USING THE SAME - Disclosed are a light guide plate for a non-contact type coordinate input system, a system including the same, and a non-contact type coordinate input method using the same. More particularly, the present invention relates to a light guide plate for a non-contact type coordinate input system, which eliminates inconvenience of a conventional contact-type coordinate input system inputting coordinates through direct contact, and which can reduce use of sensors and optical loss as much as possible. The present invention also relates to a system including the same, and a non-contact type coordinate input method using the same. | 12-31-2009 |
20090322678 | PRIVATE SCREENS SELF DISTRIBUTING ALONG THE SHOP WINDOW - An interactive method and system include at least one detector ( | 12-31-2009 |
20090322679 | ORIENTATION CALCULATION APPARATUS, STORAGE MEDIUM HAVING ORIENTATION CALCULATION PROGRAM STORED THEREIN, GAME APPARATUS, AND STORAGE MEDIUM HAVING GAME PROGRAM STORED THEREIN - An orientation calculation apparatus obtains data from an input device including at least a gyroscope and an acceleration sensor, and calculates an orientation of the input device in a three-dimensional space. Orientation calculation means calculates the orientation of the input device in accordance with an angular rate detected by the gyroscope. Acceleration vector calculation means calculates an acceleration vector representing an acceleration of the input device in accordance with acceleration data from the acceleration sensor. Correction means corrects the orientation of the input device such that a direction of the acceleration vector in the space approaches a vertically downward direction in the space. Also, the correction means corrects the orientation of the input device such that a directional change before and after the correction is minimized regarding a predetermined axis representing the orientation of the input device. | 12-31-2009 |
20100001952 | STORAGE MEDIUM STORING INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - A game apparatus includes a CPU, and the CPU controls a moving object within a virtual space on the basis of acceleration data and angular velocity data which are transmitted from a controller. For example, until the angular velocity data exceeds a predetermined magnitude, a position and an orientation of the moving object are controlled on the basis of the angular velocity data. When the angular velocity data exceeds the predetermined magnitude, an initial velocity of the moving object is decided on the basis of the acceleration data, and a moving direction (orientation) of the moving object is decided on the basis of the angular velocity data. Thereafter, the moving object moves within the virtual space according to general physical behavior. | 01-07-2010 |
20100001953 | INPUT APPARATUS, CONTROL APPARATUS, CONTROL METHOD, AND HANDHELD APPARATUS - Provided are an input apparatus, a control apparatus, a control system, and a control method that are capable of improving the operational feeling when a user uses the input apparatus to input an operation signal via an operation section. An MPU ( | 01-07-2010 |
20100007603 | METHOD AND APPARATUS FOR CONTROLLING DISPLAY ORIENTATION - An approach provides controlling of display orientation in a mobile device. Motion of a mobile device having a plurality of displays is detected, wherein each of the displays is configured to present an image. Orientation of one or more of the images is changed on the displays in response to the detected motion. | 01-14-2010 |
20100007604 | TOUCH-SENSITIVE CONTROL SYSTEMS AND METHODS - Touch-sensitive control systems and methods are provided. The touch-sensitive control system includes a touch interface and a sensor. The touch interface includes a first zone and a second zone having an icon corresponding to a function. The sensor is disposed under the touch interface for detecting contacts on the touch interface. When a contact on the icon of the second zone is detected by the sensor, the corresponding function is activated. When a movement on the first zone is detected by the sensor, an operation corresponding to the function is performed according to the movement. | 01-14-2010 |
20100013763 | METHOD AND APPARATUS FOR TOUCHLESS INPUT TO AN INTERACTIVE USER DEVICE - A plurality of light sources is mounted on a housing of an interactive user device. The sources are spaced from each other in a defined spatial relationship, for example in a linear configuration. At least one light sensor is also positioned at the surface of the housing. The light sensor senses light that is reflected from an object placed by the user, such as the user's finger, within an area of the light generated by the light sources. A processor in the user device can recognize the sensed reflected light as a user input command correlated with a predefined operation and respond accordingly to implement the operation. | 01-21-2010 |
20100013764 | Devices for Controlling Computers and Devices - One aspect of the invention provides a device for providing input to a computer. In some embodiments, the device includes a body, first and second reflective elements that have at least a first configuration and a second configuration, and a movable member coupled to the body. The movable member may be configured to move from a first position to a second position under an applied load, such that the reflective elements change from the first configuration to the second configuration, and then return to the first position. | 01-21-2010 |
20100013765 | METHODS FOR CONTROLLING COMPUTERS AND DEVICES - One aspect of the invention provides a method for providing input to a computer. In some embodiments, the method includes the steps of emitting light from a registered light source, reflecting a first pattern of reflected light emitted by the registered light source from at least first and second reflective elements, moving a movable member, coupled to a body, with respect to the body to create a second pattern of reflected light from the at least two reflective elements, and detecting a change from the first pattern to the second pattern to perform at least one of a computer mouse click, a computer mouse scroll, a keyboard input, and a combination thereof. | 01-21-2010 |
20100013766 | Methods for Controlling Computers and Devices - One aspect of the invention provides a method for providing input to a computer. In some embodiments, the method includes the steps of emitting light from a registered light source, reflecting a first pattern of light emitted by the registered light source with at least two reflective elements, detecting the movement of at least one of the reflective elements, and translating the movement of the at least one reflective element to movement of a cursor on a viewing system such that there is a first relationship between the movement of the at least one reflective element and the movement of the cursor, detecting a change from the first pattern to a second pattern of light with the at least two reflective elements, and changing the relationship between the movement of the reflective element and the movement of the cursor from the first relationship to a second relationship. | 01-21-2010 |
20100013767 | Methods for Controlling Computers and Devices - One aspect of the invention provides a method for providing input to a first computer and a second computer. In some embodiments, the method includes the steps of emitting light from a registered light source, reflecting light emitted by the registered light source with a reflective element, detecting the movement of the reflective element, translating the movement of the reflective element to movement of a cursor on a viewing system coupled to the first computer, detecting a computer switching input from a reflective element, and translating the movement of the reflective element to movement of a cursor on a viewing system coupled to the second computer. | 01-21-2010 |
20100020011 | MAPPING DETECTED MOVEMENT OF AN INTERFERENCE PATTERN OF A COHERENT LIGHT BEAM TO CURSOR MOVEMENT TO EFFECT NAVIGATION OF A USER INTERFACE - A method, system and apparatus provide that movement of an interference pattern of a coherent light beam is mapped to cursor movement to effect navigation of a user interface. A remote controller operable to emit a coherent light beam is in cooperative arrangement with a display device operable to display a user interface, navigable by means of a cursor. A laser diode element and coupled diffuser element of the remote controller generate the coherent light beam. Movement of the remote controller causes movement of an interference pattern of the coherent light beam impinging upon a sensor of the display device; movement of the interference pattern is sensed by the display device and mapped to corresponding movement of the cursor in the user interface. Thus, the remote controller may be used to navigate an on-screen user interface by movement of the remote controller itself. | 01-28-2010 |
20100033431 | SELECTION DEVICE AND METHOD - A selection device for selecting an icon in an image area is provided including a motion-sensing unit and a processing unit. The motion-sensing unit senses a first motion and converts the first motion into a first signal. The processing unit converts the first signal into a first locus in the image area, determines a first area in the image area according to the first locus, and determines whether the icon is to be selected according to the first area and a second area where the icon is to be displayed in the image area. | 02-11-2010 |
20100039381 | ROTATABLE INPUT DEVICE - In an example embodiment, a computer mouse is provided. This computer mouse includes a surface tracking sensor that detects movement of the computer mouse along the support surface. Additionally included are one or more orientation sensors that detect a movement of the computer mouse relative to a pivot point. The computer mouse also includes a controller that is configured to translate the movement along the support surface into a two-dimensional coordinate and to translate the movement relative to the pivot point into a magnitude of rotation. | 02-18-2010 |
20100039382 | INFORMATION PROCESSING APPARATUS, INPUT APPARATUS, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus, an input apparatus, an information processing system, an information processing method, and a program that are capable of improving operability when a target object is selected on a screen are provided. A control apparatus is provided to which, when a button is pressed in a state where a pointer is indicating an area around an icon on a screen, a signal indicating that the button has been pressed and a signal of positional information of the pointer at that time are input, and the control apparatus performs movement control such that the pointer indicates the icon based on those signals. Therefore, even when the pointer is not directly indicating the icon, the icon can be indicated by indicating the area around the icon, thus improving operability in selecting the icon on the screen by the pointer. | 02-18-2010 |
20100039383 | DISPLAY CONTROL DEVICE, PROGRAM FOR IMPLEMENTING THE DISPLAY CONTROL DEVICE, AND RECORDING MEDIUM CONTAINING THE PROGRAM - A virtual plane including a display screen is divided into small regions. If the calculated coordinates of an intersection are located in one of the small regions outside the display screen, an icon corresponding to that small region is displayed at a predetermined position. If coordinates of the intersection are not calculated, an icon corresponding to the small region in which the preceding intersection coordinates were located is displayed at a predetermined position. | 02-18-2010 |
20100045598 | APPARATUS FOR CONTROLLING THE MOVEMENT OF AN OBJECT ON A PLANE - An apparatus is provided for controlling the movement of an object on a plane. The apparatus comprises a basin, a movable object positioned within the basin, and a sensor coupled to the apparatus for detecting the movement of the movable object within the basin, wherein the movement of the object on the plane is related to the movement of the object within the basin. | 02-25-2010 |
20100045599 | INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, AND CONTROL METHOD - An input apparatus includes a sensor, a calculation section, and a transmission section. The sensor detects a movement of the input apparatus and outputs a detection signal corresponding to the movement of the input apparatus. The calculation section calculates a corresponding value that corresponds to a movement of an image displayed on a screen in a predetermined calculation cycle, the corresponding value corresponding to the detection signal. The transmission section transmits the corresponding value in a transmission cycle shorter than the calculation cycle. | 02-25-2010 |
20100053082 | REMOTE CONTROLS FOR ELECTRONIC DISPLAY BOARD - Techniques for interacting with an electronic display board are disclosed. According to one embodiment, a remote controller includes a laser generator, several motion sensors, a Micro Central Unit (MCU) with data processing capability and internal memory to store programs, and a transceiver or a transmitter. A laser beam from the laser generator facilitates a writing movement on the electronic data board, the motion sensors detect the movement of the remote controller, the MCU processes the sensor data to derive the movement, and the transmitter transmits the movement from the controller to the electronic data board. In accordance with the detected movement, the movement of the remote controller corresponding to the laser is electronically represented on the electronic display board. | 03-04-2010 |
20100053083 | PORTABLE DEVICES AND CONTROLLING METHOD THEREOF - A portable electronic device including a first input/output unit including a monostable display element, a second input/output unit including a bistable display element, a main setting unit configured to selectively set either one of the first and second input/output units as a main input/output unit and the other one of the first and second input/output units as a sub input/output unit, and a conversion unit configured to convert the sub input/output unit into a touch pad for inputting a command on the main input/output unit. | 03-04-2010 |
20100060573 | METHODS AND APPARATUS FOR INCREMENTAL PREDICTION OF INPUT DEVICE MOTION - Methods and apparatus for incremental prediction of input device motion. In one embodiment, the input device comprises one or more sensors adapted to output motional data of the input device as measured at a certain period. A prediction of input device motion is generated based upon the last prediction and a weighted error in estimate determined by the sensory output. According to one embodiment, the weight is calculated as a Kalman gain. In one embodiment, once the prediction has been generated, it is provided to a display update algorithm adapted to orient a navigational object upon an associated display screen. | 03-11-2010 |
20100060574 | OPERATING APPARATUS FOR HAND-HELD ELECTRONIC APPARATUS AND METHOD THEREOF - An operating apparatus for a hand-held electronic apparatus includes a motion sensor, an analog to digital converting module and a processing unit. The motion sensor is utilized for sensing a motion track of the hand-held electronic apparatus to generate an analog sensing signal. The analog to digital converting module is coupled to the motion sensor, and is utilized for converting the analog sensing signal into a digital sensing signal. The processing unit is coupled to the analog to digital converting module, and is utilized for executing a multimedia motion control software to determine whether the motion track of the hand-held electronic apparatus corresponds to a predetermined track according to the digital sensing signal. Furthermore, the processing unit executes at least one predetermined function corresponding to the predetermined track from a plurality of predetermined functions when determining that the motion track corresponds to the predetermined track. | 03-11-2010 |
20100060575 | COMPUTER READABLE RECORDING MEDIUM RECORDING IMAGE PROCESSING PROGRAM AND IMAGE PROCESSING APPARATUS - Displayed region size data indicating a size of a screen of a display device, or a size of a region in which an image of a virtual space is displayed on the screen, is obtained. Distance data indicating a distance between a user and the display device is obtained. A position and an angle of view of the virtual camera in the virtual space are set based on the displayed region size data and the distance data. | 03-11-2010 |
20100060576 | Control System for Navigating a Principal Dimension of a Data Space - Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space. | 03-11-2010 |
20100066672 | METHOD AND APPARATUS FOR MOBILE COMMUNICATION DEVICE OPTICAL USER INTERFACE - An optical user interface is provided in a mobile communication device. Motion of an input mechanism of the mobile communication device is detected via an optical sensor. In response to detection, one or more properties governing the motion are evaluated based on input from a motion sensor. Control information is generated based on evaluation. | 03-18-2010 |
20100066673 | Laser pointer capable of detecting a gesture associated therewith and representing the gesture with a function - Described is a laser pointer capable of detecting a gesture associated therewith and representing the gesture with a function, comprising a light emitting unit for emitting a laser light; a detection unit for detecting a set of accelerations composed of at least an acceleration of the gesture and outputting an acceleration signal set; a control unit for receiving and outputting the acceleration signal set or outputting a processed acceleration signal set; and a communications unit for receiving and transmitting the acceleration signal set or the processed acceleration signal set through a communications channel to be received by an electronic device in order to determine the gesture and perform the corresponding function according to the acceleration signal set or the processed acceleration signal set. | 03-18-2010 |
20100066674 | Cursor controlling apparatus and the method therefor - The invention relates to a cursor controlling apparatus for controlling a cursor according to input information, comprising an input module, a processing module and a controlling module. The input module is used for generating the input information. The processing module, coupled to the input module, is used for generating processing data according to the input information. The controlling module is coupled to the processing module for comparing the processing data with a predetermined value to generate a comparison result, and further generating a first control signal or a second control signal corresponding to the processing data and the comparison result to control the cursor. | 03-18-2010 |
20100066675 | Compact Interactive Tabletop With Projection-Vision - The subject application relates to a system(s) and/or methodology that facilitate vision-based projection of any image (still or moving) onto any surface. In particular, a front-projected computer vision-based interactive surface system is provided which uses a new commercially available projection technology to obtain a compact, self-contained form factor. The subject configuration addresses installation, calibration, and portability issues that are primary concerns in most vision-based table systems. The subject application also relates to determining whether an object is touching or hovering over an interactive surface based on an analysis of a shadow image. | 03-18-2010 |
20100066676 | Gestural Control of Autonomous and Semi-Autonomous Systems - Systems and methods are described for controlling a remote system. The controlling of the remote system comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The controlling comprises translating the gesture to a gesture signal, and controlling a component of the remote system in response to the gesture signal. | 03-18-2010 |
20100073289 | 3D CONTROL OF DATA PROCESSING THROUGH HANDHELD POINTING DEVICE - A handheld pointer device provides radiation with a directional characteristic relative to its main axis. The radiation is detected by at least two detectors at different positions. This enables determining the orientation of the main axis as well as the displacement of the device along its main axis for 3D control of an object rendered on the screen of a display monitor. | 03-25-2010 |
20100079374 | METHOD OF CONTROLLING A SYSTEM - The invention describes a method of controlling a system ( | 04-01-2010 |
20100090949 | Method and Apparatus for Input Device - Method and apparatus for an input device. In an embodiment, the present invention provides an apparatus that is designed as a hollow-out glove, which can be worn on the wrist of a user. This apparatus has three working modes: touch mode, air-mouse mode, and action-induction mode. In touch mode, it works with a laser-positioning device to identify the user's moving motion; in air-mouse mode, it works with multiple sensors to capture the user's moving motion; in action-induction mode, the apparatus can track all directions of the user's moving motion. This invention also provides a method of operating the apparatus, making it possible to position accurately in space and capture the user's moving motion. | 04-15-2010 |
20100090950 | Sensing System and Method for Obtaining Position of Pointer thereof - In a sensing system and a method for obtaining a position of a pointer, the sensing system includes a sensing area, a reflective mirror, an image sensor and a processing circuit. The reflective mirror is configured for generating a mirror image of a pointer when the pointer approaches the sensing area. The image sensor is configured for sensing the pointer and the mirror image thereof when the pointer approaches the sensing area. When the pointer approaches the sensing area, the processing circuit calculates a coordinate value of the pointer according to an image sensed by the image sensor and a predetermined size of the pointer. The pointer forms an imaginary orthographic projection in the sensing area, the processing circuit regards the imaginary orthographic projection as a round projection, and a radius of the round projection is the predetermined size. | 04-15-2010 |
20100097316 | System and Method for Determining an Attitude of a Device Undergoing Dynamic Acceleration - A system and a method for determining an attitude of a device undergoing dynamic acceleration is presented. A first attitude measurement is calculated based on a magnetic field measurement received from a magnetometer of the device and a first acceleration measurement received from a first accelerometer of the device. A second attitude measurement is calculated based on the magnetic field measurement received from the magnetometer of the device and a second acceleration measurement received from a second accelerometer of the device. A correction factor is calculated based at least in part on a difference of the first attitude measurement and the second attitude measurement. The correction factor is then applied to the first attitude measurement to produce a corrected attitude measurement for the device. | 04-22-2010 |
20100097317 | METHODS AND APPARATUS TO PROVIDE A HANDHELD POINTER-BASED USER INTERFACE - Methods and apparatus to provide a handheld pointer-based user interface are described herein. An example apparatus includes a wireless pointer component and one or more base components. The wireless pointer component is configured to transmit one or more human-computer interaction (HCI) signals associated with an HCI event via a first communication link. One or more base components are operatively coupled to a screen of a display to receive the one or more HCI signals from the wireless pointer component via the first communication link. Further, the one or more base components are configured to generate at least one of operating information and position information of the wireless pointer component based on the one or more HCI signals, and to transmit the at least one of operating information and position information to a processor configured to generate screen information on the screen of the display via a second communication link. | 04-22-2010 |
20100097318 | METHODS AND APPARATUSES FOR OPERATING A PORTABLE DEVICE BASED ON AN ACCELEROMETER - Methods and apparatuses for operating a portable device based on an accelerometer are described. According to one embodiment of the invention, an accelerometer attached to a portable device detects a movement of the portable device. In response, a machine executable code is executed within the portable device to perform one or more predetermined user configurable operations. Other methods and apparatuses are also described. | 04-22-2010 |
20100103098 | User Interface Elements Positioned For Display - User interface elements positioned for display is described. In various embodiment(s), sensor input can be received from one or more sensors that are integrated with a portable device. A device hold position that corresponds to where the portable device is grasped by a user can be determined based at least in part on the sensor input. A display of user interface element(s) can then be initiated for display on an integrated display of the portable device based on the device hold position that corresponds to where the portable device is grasped. | 04-29-2010 |
20100103099 | POINTING DEVICE USING CAMERA AND OUTPUTTING MARK - A pointing device, such as a mouse or joystick, comprises a camera for capturing the display screen and image processing means for recognizing and tracking the pointing cursor icon or mark from the captured image and producing the pointing signal. The pointing device of the present invention can be used with any type of display without any additional tracking means such as an ultrasonic sensor, infrared sensor or touch sensor. The pointing device of the present invention includes a mark outputting portion, a camera portion for capturing the said mark outputting portion, and an image processing portion for recognizing the said mark outputting portion from the captured image and producing the pointing signal. | 04-29-2010 |
20100103100 | INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, CONTROL METHOD, AND HANDHELD APPARATUS - An input apparatus, a control apparatus, a control system, and a control method that are capable of correcting an output signal when a hand movement is input to the input apparatus and with which a user does not feel a phase delay are provided. An input apparatus includes a velocity calculation section, a filter, a control section, and a memory. The velocity calculation section calculates velocity values of a casing in X′- and Y′-axis directions based on physical amounts output from a sensor unit like acceleration values in the X′- and Y′-axis directions output from an acceleration sensor unit. The filter attenuates, by predetermined scale factors, velocity values of signals of the predetermined frequency range out of the velocity values calculated by the velocity calculation section. Since the filter dynamically attenuates the velocity values of a shake frequency range in accordance with the velocity values, a precise pointing operation with a pointer becomes possible. | 04-29-2010 |
20100103101 | SPATIALLY-AWARE PROJECTION PEN INTERFACE - One embodiment of the present invention sets forth a technique for providing an end user with a digital pen embedded with a spatially-aware miniature projector for use in a design environment. Paper documents are augmented to allow a user to access additional information and computational tools through projected interfaces. Virtual ink may be managed in single and multi-user environments to enhance collaboration and data management. The spatially-aware projector pen provides end-users with dynamic visual feedback and improved interaction capabilities. | 04-29-2010 |
20100103102 | DISPLAYING METHOD AND DISPLAY CONTROL MODULE - A displaying method for a portable device under a standby condition is provided. The portable device includes a display unit displaying a first standby frame. The method includes steps of sensing a physical motion of the portable device. Then, the physical motion is identified as an operation command. Finally, according to the operation command, a corresponding operation is executed to change the first standby frame displayed on the display unit. | 04-29-2010 |
20100103103 | Method And Device for Input Of Information Using Visible Touch Sensors - A device and method for manual input of information into computing device using a camera and visible touch sensors. An image of a virtual input device is displayed on a screen, and the positions of the visible touch sensors, recorded by the video camera, are overlaid on the image of the virtual input device, thus allowing the user to see the placement of the touch sensors relative to the keys or buttons on the virtual input device. The touch sensors change their appearance upon contact with a surface, and the camera records their position at the moment of change. This way information about the position of intended touch is recorded. Touch sensors can be binary (ON-OFF) or may have a graded response reflecting the extent of displacement or pressure of the touch sensor relative to the surface of contact. | 04-29-2010 |
20100103104 | APPARATUS FOR USER INTERFACE BASED ON WEARABLE COMPUTING ENVIRONMENT AND METHOD THEREOF - Provided is an apparatus for user interface based on wearable computing environment includes: a signal measurement unit including a plurality of image measurement units that includes image sensors each of which receives optical signals generated from a position indicator that is worn on user's fingers or near a user's wrist to generate optical signals and measures images of the user foreground; and a signal processor that analyzes each image measured by the signal measurement unit to measure three-dimensional coordinates, recognizes a motion pattern of the user's hand on the three-dimensional coordinates from the optical signals received by the signal measurement unit, and outputs the corresponding instructions. | 04-29-2010 |
20100103105 | APPARATUS AND METHOD FOR EXECUTING A MENU IN A WIRELESS TERMINAL - A menu execution apparatus and method for conveniently providing a menu in a wireless terminal are provided. The apparatus includes a pointer unit for indicating a specific object in a menu execution recognition mode. A camera photographs the object indicated by the pointer unit in the menu execution recognition mode. A controller controls an operation for recognizing the object indicated by the pointer unit and photographed by the camera and displaying a menu for controlling the recognized object. | 04-29-2010 |
20100110005 | INTERACTIVE INPUT SYSTEM WITH MULTI-ANGLE REFLECTOR - An interactive input system comprises a pointer input region; and a multi-angle reflecting structure located along a single side of the pointer input region and operable to reflect radiation from a pointer within the pointer input region from at least two surface locations of the multi-angle reflecting structure, wherein the at least two surface locations each have different respective angles. An imaging system is operable to capture within at least a portion of the pointer input region images of the reflected radiation located within a field of view of the imaging system. Processing structure is provided for determining the location of the pointer relative to the pointer input region based on the at least one image. | 05-06-2010 |
20100110006 | Remote pointing apparatus and method - A remote pointing apparatus is provided. A light receiving unit may receive emitted light. A filtering unit may filter a received signal to a first frequency component, a second frequency component, a third frequency component, and a fourth frequency component. A calculation unit may compare amplitudes of the first frequency component and the second frequency component to calculate a first coordinate axis value of a cursor on a display unit, and compare amplitudes of the third frequency component and the fourth frequency component to calculate a second coordinate axis value of the cursor on the display unit. | 05-06-2010 |
20100110007 | INPUT SYSTEM AND METHOD, AND COMPUTER PROGRAM - An input system comprises: a plurality of indicators having a plurality of displays for displaying an image including coordinate information unique to the displays on the screens, (i) a detecting means for detecting the image displayed on the screens, and (ii) a transmitting means for transmitting transmission information generated according to the coordinate information included in the detected image; and a controller having (i) a receiving means for receiving the transmission information, (ii) a generating means for generating drawing information indicating the trace along which each of the indicators moves on the screen of each of the displays according to the transmission information, and (iii) a control means for controlling each of the displays so as to perform the drawing processing corresponding to the drawing information generated by the generating means. | 05-06-2010 |
20100117958 | INFORMATION INPUT PANEL USING LIGHT EMITTED DIODE MATRIX - The present invention relates to an information input panel using the light emitted diode (LED) matrix. The panel includes the LED matrix and a control circuit. The LED matrix includes N×M LEDs. The control circuit includes N pieces of first terminal and M pieces of second terminal, wherein the i | 05-13-2010 |
20100117959 | MOTION SENSOR-BASED USER MOTION RECOGNITION METHOD AND PORTABLE TERMINAL USING THE SAME - A motion sensor-based user motion recognition method and portable terminal having a motion sensor is disclosed. The method recognizes user motions in a portable terminal. At least one parameter value is extracted from at least one user motion applied to the portable terminal. A reference parameter value serving as a user motion recognition reference is established according to at least one extracted parameter value. The established reference parameter value is stored. | 05-13-2010 |
20100117960 | HANDHELD ELECTRONIC DEVICE WITH MOTION-CONTROLLED CURSOR - A handheld electronic device of the present invention includes a display, a memory, a motion sensor, and a controller. The memory is configured for storing a viewable output of at least one software application. The controller is in communication with the display, the memory, and the motion sensor. The controller includes a first control logic that generates a first image on the display representative of a portion of the viewable output of the software application. The first image has a field of view (FOV), where the viewable output includes an inner region and an outer region. A second control logic adjusts the FOV of the first image based upon movement of the handheld device. A third control logic displays a second image of a cursor in the inner region. A fourth control logic displays the second image of the cursor in the outer region. | 05-13-2010 |
20100123661 | SLIDE PRESENTATION SYSTEM AND METHOD OF PERFORMING THE SAME - A slide presentation system, and a method of performing the same, capable of providing real-time interaction between a conference presenter and attendees are disclosed. When a projector projects at least one slide so as to map a screened image generated from a host, an image identifying unit identifies a pointer after an image capturing unit images the content expressed on the projected slide. After the pointer is identified, an orienting unit detects a two-dimensional coordinate value of where the pointer is pointed on the projected slide, corresponding to the screened image of the host. Then, the two-dimensional coordinate values are transmitted to the host for determining an action of the pointer according to the two-dimensional coordinate value with reference to the screened image of the host. By the present invention, the pointer pointing on the projected slide can be directly implemented to function as a mouse. | 05-20-2010 |
20100127978 | POINTING DEVICE HOUSED IN A WRITING DEVICE - A method for controlling a pointing icon in a computer, including the steps of (A) establishing a wireless connection between a pointing device and the computer, (B) generating directional information through one or more three dimensional movements of the pointing device, (C) transmitting the directional information from the pointing device to the computer and (D) translating the directional information into movements of the pointing icon on a screen of the computer using a device driver program stored on the computer. | 05-27-2010 |
20100127979 | Input device - Provided is an input device that may sense a touch and a motion, generate a sensing signal with respect to the sensed touch and motion, generate an input signal based on the sensing signal, and transmit the input signal to a display device. The display device may control an object displayed on the display device based on the input signal. | 05-27-2010 |
20100134414 | INPUT APPARATUS WITH BALL - An input apparatus is disclosed. The input apparatus provides a control signal to a host system. It includes a housing that includes an upper portion and a lower portion. A ball is coupled to the upper portion of the housing and can reside within a ring. A first sensor assembly is configured to sense the position of the ball, and a second sensor assembly is configured to sense the position of the input apparatus relative to a work surface. The input apparatus also includes a mode switch, where the mode switch is operatively coupled to the first sensor assembly and the second sensor assembly. The mode switch includes a first mode where the first sensor assembly provides the control signal to the host system and a second mode where the second sensor assembly provides the control signal to the host system. | 06-03-2010 |
20100134415 | IMAGE PROCESSING APPARATUS, IMAGE DISPLAYING METHOD, AND IMAGE DISPLAYING PROGRAM - An image processing apparatus includes an instructed-position detecting unit configured to receive an instruction operation by a user on a display screen of a display device and detect and output a position where the instruction operation is performed; a storing unit configured to store multiple image data items each including information corresponding to a search key; a search-key display controlling unit configured to cause at least one search key to be selectively displayed on the display screen of the display device; a searching unit configured to, if the search key displayed on the display screen is instructed by the search-key display controlling unit through the instructed-position detecting unit, search the storing unit for the image data corresponding to the search key to extract the image data; and a display controlling unit configured to collectively display images corresponding to the image data in a certain part on the display screen. | 06-03-2010 |
20100141578 | IMAGE DISPLAY CONTROL APPARATUS, IMAGE DISPLAY APPARATUS, REMOTE CONTROLLER, AND IMAGE DISPLAY SYSTEM - An image display control apparatus comprises a menu creating unit that displays an operation menu ME on a liquid crystal display unit of an image display apparatus, a camera with an infrared filter capable of recognizing an infrared signal coming from a remote controller, a remote controller position identifying unit that identifies the position which the remote controller occupies during image capturing on the basis of the recognition result, a remote controller position signal creating unit that displays the identified position of the remote controller on the liquid crystal display unit, and a user operation judging unit that determines the operable specification object in the operation menu displayed on the liquid crystal display unit. | 06-10-2010 |
20100141579 | METHOD AND APPARATUS FOR CONTROLLING A COMPUTING SYSTEM - A handheld computing device is introduced comprising a motion detection sensor(s) and a motion control agent. The motion detection sensor(s) detect motion of the computing device in one or more of six (6) fields of motion and generate an indication of such motion. The motion control agent, responsive to the indications of motion received from the motion sensors, generates control signals to modify one or more of the operating state and/or the displayed content of the computing device based, at least in part, on the received indications. | 06-10-2010 |
20100141580 | PIEZO-ELECTRIC SENSING UNIT AND DATA INPUT DEVICE USING PIEZO-ELECTRIC SENSING - Disclosed herein is a piezoelectric sensing unit and a data input device using piezoelectric sensing. The data input device of the present invention includes a base, an input unit, first piezoelectric sensing parts, and a control unit. The input unit performs a first directional input in such a way that the input unit moves to one of first direction indicating locations arranged around a base location in radial directions at positions spaced apart from each other within a pre-determined input radius defined on the base. The first piezoelectric sensing parts are provided on respective moving paths of the input unit, so that when the first directional input is performed, the corresponding first piezoelectric sensing part is pressed by the input unit, thus generating a first sensing signal proportional to a pressing force. When the first sensing signal is greater than a preset value, the control unit extracts data, assigned to the corresponding first direction indicating location at which movement of the input unit is sensed, from a memory unit and inputs the data. | 06-10-2010 |
20100149096 | NETWORK MANAGEMENT USING INTERACTION WITH DISPLAY SURFACE - A computing system is provided to make managing the devices and content on a network easier by making the process intuitive, tactile and gestural. The computing system includes a display surface for graphically displaying the devices connected to a network and the content stored on those devices. A sensor is used to recognize activity on the display surface so that gestures may be used to control a device on the network and transport data between devices on the network. Additionally, new devices can be provided access to communicate on the network based on interaction with the display device. | 06-17-2010 |
20100156788 | INPUT DEVICE AND DATA PROCESSING SYSTEM - An input device includes a main body, a motion sensor unit and a coordinate conversion processing unit. The coordinate conversion processing unit is configured to perform coordinate conversion processing based on a Y-axis acceleration and a Z-axis acceleration detected by the motion sensor unit, with a first two-dimensional orthogonal coordinate system being defined by a mutually orthogonal Y-axis and Z-axis in a first plane perpendicular to an X-axis coinciding with a pointing direction of the main body. The coordinate conversion processing unit is configured to convert the Y-axis angular velocity and the Z-axis angular velocity detected by the motion sensor unit to a U-axis angular velocity and a V-axis angular velocity, respectively, in a second two-dimensional orthogonal coordinate system defined by a U-axis corresponding to a horizontal axis in the first plane and a V-axis perpendicular to the U-axis in the first plane. | 06-24-2010 |
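The tilt compensation this abstract describes — converting body-frame (Y, Z) angular rates into screen-frame (U, V) rates using a roll angle recovered from the gravity components — can be sketched roughly as follows. The function name and the rotation convention are assumptions for illustration, not taken from the patent:

```python
import math

def body_to_screen_rates(accel_y, accel_z, rate_y, rate_z):
    """Convert body-frame angular rates about Y and Z into horizontal (U)
    and vertical (V) rates, compensating for roll about the pointing axis."""
    # Roll angle of the casing about the pointing (X) axis, estimated
    # from the gravity components measured on the Y and Z axes.
    roll = math.atan2(accel_y, accel_z)
    c, s = math.cos(roll), math.sin(roll)
    # Rotate the (Y, Z) rate vector by the roll angle into (U, V).
    rate_u = c * rate_y - s * rate_z
    rate_v = s * rate_y + c * rate_z
    return rate_u, rate_v
```

With the device held level (gravity entirely on Z), the rates pass through unchanged; rolled 90 degrees, the axes swap, which is exactly why an untilted conversion would drive the pointer diagonally.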
20100164866 | HANDHELD ELECTRONIC DEVICE, CURSOR POSITIONING SUB-SYSTEM AND METHOD EMPLOYING CURSOR SCALING CONTROL - A track ball cursor positioning sub-system is employed by a handheld electronic device including an operating system and a plurality of applications having a plurality of predetermined scaling values. The cursor positioning sub-system includes a track ball cursor positioning device adapted to output a plurality of device pulses, and a track ball cursor resolution controller adapted to repetitively input the device pulses and to responsively output to the operating system a plurality of cursor movement events. The cursor resolution controller is further adapted to be controlled by the operating system or by the applications to learn which one of the applications is active and to automatically scale a number of the cursor movement events for a corresponding number of the device pulses based upon a corresponding one of the predetermined scaling values of the active one of the applications. | 07-01-2010 |
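A minimal sketch of the per-application scaling idea above, assuming a lookup table of scaling values and a residual accumulator so fractional pulses are not lost between events (the application names and scaling values are hypothetical):

```python
# Assumed per-application scaling values; real values would come from
# each application's registration with the resolution controller.
SCALING = {"browser": 1.0, "photo_editor": 0.25, "game": 2.0}

class CursorResolutionController:
    def __init__(self, scaling_table):
        self.scaling = scaling_table
        self.active_app = None
        self.residual = 0.0  # carries fractional pulses between calls

    def set_active(self, app):
        """Called by the OS or application to report the active application."""
        self.active_app = app

    def pulses_to_events(self, pulses):
        """Scale raw trackball pulses into whole cursor-movement events."""
        scaled = pulses * self.scaling.get(self.active_app, 1.0) + self.residual
        events = int(scaled)          # emit only whole events
        self.residual = scaled - events
        return events
```

A photo editor registered at 0.25 thus gets fine cursor control (one event per four pulses), while a game at 2.0 gets coarse, fast movement from the same physical trackball.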
20100171696 | MOTION ACTUATION SYSTEM AND RELATED MOTION DATABASE - The claimed invention relates to an interactive system for recognizing a single or a series of hand motion of the user to control or create applications used in multimedia. In particular, the system includes a motion sensor detection unit (MSDU) | 07-08-2010 |
20100171697 | METHOD OF CONTROLLING VIEW OF STEREOSCOPIC IMAGE AND STEREOSCOPIC IMAGE DISPLAY USING THE SAME - A method of controlling a view of a stereoscopic image and a stereoscopic image display using the same are disclosed. The method of controlling a view of a stereoscopic image includes: detecting a position information of a viewer from an output of a sensor; changing parameters for rendering a viewing angle and a depth information according to the position information; generating a left-eye image and a right-eye image in which a viewing angle and a depth information are changed in accordance with the parameters; and displaying the left-eye image and the right-eye image on a stereoscopic image display. | 07-08-2010 |
20100171698 | DIGITAL TELEVISION AND METHOD OF DISPLAYING CONTENTS USING THE SAME - A DTV and a method of displaying content using the same are provided. A DTV includes: a plurality of first and second display units physically isolated from each other; a communication unit configured to communicate with an external remote control; and a control unit configured to move a content between the first and second display units in response to a motion of the remote control sensed by a signal received from the remote control through the communication unit. | 07-08-2010 |
20100171699 | AUTOMATIC ORIENTATION-BASED USER INTERFACE FOR AN AMBIGUOUS HANDHELD DEVICE - An electronic device is provided that includes a user-interface feature, a detection mechanism and one or more internal components. The user-interface feature is configurable to have a selected orientation about one or more axes. The detection mechanism can detect orientation information about the electronic device. The one or more components may select the orientation of the user-interface feature based on the detected orientation information. | 07-08-2010 |
20100182235 | INPUT DEVICE AND INPUT METHOD, INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD, INFORMATION PROCESSING SYSTEM AND PROGRAM - An input device includes an operation unit that is held by a user and operated in a three-dimensional free space to remotely operate an information processing device, a directional button that is provided on the operation unit and operated by the user to point in a direction, and a transmission unit that, when the directional button is operated while the operation unit is being operated in the free space, transmits information corresponding to the operation in the free space and information corresponding to the operated directional button to the information processing device so that an object image linearly moves by only an amount corresponding to a directional component of the directional button out of an operation amount in the free space after the operation of the directional button. | 07-22-2010 |
20100182236 | COMPACT RTD INSTRUMENT PANELS AND COMPUTER INTERFACES - This concerns my RTD invention disclosed in co-pending applications, particularly but not necessarily for use in the “center stack” region of vehicle instrument panels. Disclosed are novel prism devices to shrink the size of the unit, while increasing resistance to vibration and condensation and providing easier assembly into the vehicle. Also disclosed are methods to improve the efficiency of such projector based systems for delivering display light to the driver of the vehicle, as well as to reduce noise caused by backscatter from the screen and control surface, or sunlight coming through the windshield. RTD versions for home or office use are also disclosed, including a “cushion computer”-like device meant primarily for use on one's lap and optionally having a reconfigurable keyboard. The device can also serve as a TV remote and perform other useful functions in the home, workplace, or car, and may serve as a useful interface accessory to expand the usability and enjoyment of mobile devices. | 07-22-2010 |
20100188334 | INPUT DEVICE AND METHOD, INFORMATION PROCESSING APPARATUS AND METHOD, INFORMATION PROCESSING SYSTEM, AND PROGRAM - An input device includes: an operating section which is held by a user and operated in a three-dimensional free space in order to operate an information processing apparatus by remote control; a calculation section which calculates a hand shake related value for controlling selection of an image to be controlled which is displayed on the information processing apparatus, the hand shake related value being relevant to an amount of hand shake of the operating section; and an output section which outputs the hand shake related value as an operation signal for operating the information processing apparatus by remote control. | 07-29-2010 |
20100194687 | REMOTE INPUT DEVICE - An input device providing users with a pointing capability includes a sender portion ( | 08-05-2010 |
20100201619 | INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, CONTROL METHOD, AND HANDHELD APPARATUS - An input apparatus, a control apparatus, a control system, a control method, and a handheld apparatus with which a user can easily control a movement and stop of a pointer displayed on a screen are provided. An input apparatus includes a sensor unit for detecting a movement of a casing and a button. An MPU outputs a determination code when a press of the button is released within a first time period. On the other hand, when the button is pressed and held for a time period equal to or longer than a first time period, a movement command is output from after an elapse of the first time period. Accordingly, the button is provided with a function corresponding to a determination input button and a function corresponding to an input button for controlling a movement and stop of a pointer, for example. As a result, a user can easily control the movement and stop of the pointer without mixing up an input operation for moving and stopping the pointer with other input operations. | 08-12-2010 |
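The single-button timing scheme described above — release within a first time period yields a determination (click) input, while holding past it switches the button to pointer movement/stop control — reduces to a comparison like the following sketch. The 0.3-second period is an assumed placeholder, not a value from the patent:

```python
def classify_button(press_duration, first_time_period=0.3):
    """Classify a button press by its duration, per the two-function
    button scheme: short press = determination, long hold = movement."""
    if press_duration < first_time_period:
        # Released within the first time period: output a determination code.
        return "determination"
    # Held for the first time period or longer: after that period elapses,
    # movement commands are output while the button remains held.
    return "movement"
```

The point of the threshold is that the user never has to mix up the two roles of the button: a deliberate hold is unambiguous before any movement command is emitted.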
20100201620 | FIREARM TRAINING SYSTEM - The present invention provides a firearm training system for actual and virtual moving targets comprising a firearm, a trigger-initiated image-capturing device mounted on a firearm, a processor, and a display. The system allows a user to visualize the accuracy of a shot taken when the trigger was pulled or the gun fired by showing the predicted position of the firearm's projectile in relation to the moving targets. | 08-12-2010 |
20100201621 | MOVING OBJECT DETECTING APPARATUS, MOVING OBJECT DETECTING METHOD, POINTING DEVICE, AND STORAGE MEDIUM - Even when the user's eyeball is actually moving slightly while the user is intentionally gazing at one point, the slight movement is not reproduced directly as the position of a cursor; instead, a determination is made that the user is gazing at one point intentionally, that is, that the eyeball is stopped. Thus, when a determination is made that the eyeball is stopped, the cursor is displayed still even when the gazing point moves slightly. Furthermore, when a determination is made that the cursor is stopped, selection of an object, such as an icon, displayed at the position where the cursor is displayed is identified. | 08-12-2010 |
20100207880 | COMPUTER INPUT DEVICES AND ASSOCIATED COMPUTING DEVICES, SOFTWARE, AND METHODS - Computer input devices include a detector adapted to detect relative movement of an input member in x-, y-, and z-dimensions relative to a base point in a base plane, and a controller adapted to send a signal to an associated computing device based at least in part on the relative movement of the input member. Associated computing devices, software, and methods are also disclosed. | 08-19-2010 |
20100207881 | Apparatus for Remotely Controlling Computers and Other Electronic Appliances/Devices Using a Combination of Voice Commands and Finger Movements - An apparatus for remotely operating a computer using a combination of voice commands and finger movements. The apparatus includes a microphone and a plurality of control elements in the form of touch-sensitive touchpads and/or motion-sensitive elements that are used to operate the computer and to move an on-screen cursor. At least one touchpad is used to selectively switch between a command-mode of operation in which the computer interprets spoken words as commands for operating the computer and any software applications being used, and a text-mode of operation in which the computer interprets spoken words literally as text to be inserted into a software application. The apparatus is ergonomically designed to enable it to be easily worn and to enable a user to operate a computer from a standing, sitting or reclining position. The apparatus can be used to operate a computer for traditional computing purposes such as word processing or browsing the Internet, or for other purposes such as operating electronic devices such as a television and/or other household appliances. The apparatus eliminates the need for a keyboard and a mouse to operate a computer. In addition, the apparatus can be used as a telephone. | 08-19-2010 |
20100214214 | REMOTE INPUT DEVICE - An input device providing users with a pointing capability includes a sender portion ( | 08-26-2010 |
20100214215 | INPUT DEVICE FOR USE WITH A DISPLAY SYSTEM - Methods and systems for use of an input device with a display system are disclosed. In an example embodiment, a projection device configured to project a displayed image is provided, where the displayed image includes one or more selectable items. The projection system further includes an input device which may be movable in free space and may be configured to point to the selectable items. The input device may be enabled to provide a double-click input to effect one or more changes in a graphical user interface that corresponds with a selection of a particular one of the selectable items at which the input device is pointed. The double-click input may be identified such that movement of the input device after initiation of the double-click input may be ignored by the graphical user interface until completion of the double-click input. | 08-26-2010 |
20100214216 | MOTION SENSING AND PROCESSING ON MOBILE DEVICES - Display devices including motion sensing and processing. In one aspect, a handheld electronic device includes a subsystem providing display capability and a set of motion sensors provided on a single substrate and including at least one gyroscope sensing rotational rate of the device around three axes of the device and at least one accelerometer sensing gravity and linear acceleration of the device along these axes. A computation unit is capable of determining motion data from the sensor data stored in the memory, the motion data derived from a combination of the sensed rotational rate around at least one of the axes and the sensed gravity and linear acceleration along at least one of the axes. The motion data describes movement of the device including a rotation of the device around at least one of the axes, the rotation causing interaction with the device. | 08-26-2010 |
20100225582 | INFORMATION PROCESSING APPARATUS, STORAGE MEDIUM HAVING INFORMATION PROCESSING PROGRAM STORED THEREIN, INFORMATION PROCESSING SYSTEM, AND DISPLAY RANGE CONTROL METHOD - An information processing apparatus inputs an angular rate detected by a gyroscope included in an input device and displays an image on a display device. The information processing apparatus initially calculates an orientation of the input device based on the angular rate. Then, the information processing apparatus calculates a coordinate point at an intersection between a line extending from a predetermined position in a predetermined space toward a vector representing the orientation and a predetermined plane within the predetermined space. A display range of a display target that is to be displayed on the display device is controlled based on the coordinate point. | 09-09-2010 |
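The intersection computation this abstract outlines — a line extending from a predetermined position along the orientation vector, intersected with a predetermined plane — can be illustrated with a simple ray-plane intersection. For brevity the sketch assumes the plane is z = const in the predetermined space, a simplification of the general plane case:

```python
def ray_plane_point(origin, direction, plane_z):
    """Intersect the ray from `origin` along `direction` with the plane
    z = plane_z; return the (x, y) coordinate point on that plane."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None  # orientation is parallel to the plane: no intersection
    t = (plane_z - oz) / dz
    if t < 0:
        return None  # the plane lies behind the device
    return (ox + t * dx, oy + t * dy)
```

The returned coordinate point is what then drives the display range of the display target; integrating gyroscope angular rate into the `direction` vector is the orientation-calculation step the abstract mentions first.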
20100225583 | COORDINATE CALCULATION APPARATUS AND STORAGE MEDIUM HAVING COORDINATE CALCULATION PROGRAM STORED THEREIN - A coordinate calculation apparatus calculates a coordinate point representing a position on a display screen based on an orientation of an input device. The coordinate calculation apparatus includes direction acquisition means, orientation calculation means, first coordinate calculation means, and correction means. The direction acquisition means acquires information representing a direction of the input device viewed from a predetermined position in a predetermined space. The orientation calculation means calculates the orientation of the input device in the predetermined space. The first coordinate calculation means calculates a first coordinate point for determining the position on the display screen based on the orientation of the input device. The correction means corrects the first coordinate point such that the first coordinate point calculated when the input device is directed in a predetermined direction takes a predetermined reference value. | 09-09-2010 |
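The correction step described above — adjusting the first coordinate point so that pointing the input device in the predetermined direction always yields the predetermined reference value — might look like this minimal sketch, where the class and method names are invented for illustration:

```python
class CoordinateCorrector:
    """Offsets raw orientation-derived coordinates so that a chosen
    pointing direction maps to a fixed reference screen position."""

    def __init__(self, reference_value=(0, 0)):
        self.ref = reference_value
        self.offset = (0.0, 0.0)

    def calibrate(self, raw_at_reference):
        """Call with the raw coordinate computed while the device is
        pointed in the predetermined direction."""
        self.offset = (self.ref[0] - raw_at_reference[0],
                       self.ref[1] - raw_at_reference[1])

    def correct(self, raw):
        """Apply the stored correction to a raw coordinate point."""
        return (raw[0] + self.offset[0], raw[1] + self.offset[1])
```

This kind of re-zeroing is what keeps an orientation-based pointer from drifting away from where the user believes "straight ahead" is.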
20100225584 | SILENT OR LOUD 3D INFRARED FUTURISTIC COMPUTER MICE AND KEYBOARD DESIGN FOR A MICE&KEYBOARD LESS COMPUTER - A new futuristic silent or loud 3D computer mice and keyboard design for a mice- and keyboard-less computer is presented. The current invention uses the fact that each word spoken by humans has a distinctive three-dimensional pattern of the mouth and face and a unique infrared spectrum. Using this fact, the present invention presents a method where the facial expression of the spoken word, irradiated by an array of infrared diodes, is picked up by infrared sensors installed either in stand-alone mode, on top of the computer display, or directly into the computer display to translate any spoken word, silent or loud, into computer commands that will facilitate the interaction of humans and computers without the use of a keyboard or mouse. | 09-09-2010 |
20100231512 | ADAPTIVE CURSOR SIZING - Disclosed herein are systems and methods for controlling a computing environment with one or more gestures by sizing a virtual screen centered on a user, and by adapting the response of the computing environment to gestures made by a user and modes of use exhibited by a user. The virtual screen may be sized using depth, aspects of the user such as height and/or user profile information such as age and ability. Modes of use by a user may also be considered in determining the size of the virtual screen and the control of the system, the modes being based on profile information and/or information from a capture device. | 09-16-2010 |
20100231513 | POSITION MEASUREMENT SYSTEMS USING POSITION SENSITIVE DETECTORS - Methods and devices for a remote control device for a display device are disclosed. In one embodiment, the remote control device may comprise a plurality of light sources that each has a light profile angled in a predetermined degree different from other light sources. In another embodiment, the remote control device may comprise a controller; and a plurality of optical detectors coupled to the controller. Each optical detector may generate a pair of electrical signals in response to incident light from a plurality of light sources located on a display device and the controller may calculate the position of the remote control device based on the electrical signals. | 09-16-2010 |
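If the optical detectors above are one-dimensional position-sensitive detectors, the standard PSD readout recovers the light-spot position from the pair of electrode currents each detector generates. A sketch under that assumption (the formula is the conventional lateral-effect PSD relation, not text from the patent):

```python
def psd_position(i1, i2, length):
    """Recover the incident light-spot position on a 1-D PSD of the given
    active length from the two electrode photocurrents i1 and i2.
    Returns the position relative to the detector center, or None if
    no light is incident."""
    total = i1 + i2
    if total == 0:
        return None  # no incident light, position undefined
    # The spot position is proportional to the normalized current difference.
    return (length / 2.0) * (i2 - i1) / total
```

Combining several such positions, one per detector, with the known angular light profiles of the sources is what lets the controller solve for the remote control's position.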
20100238112 | INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, AND CONTROL METHOD - An input apparatus includes: a casing; a first acceleration detection section to detect a first acceleration value of the casing in a first direction; a first angle-related value detection section to detect a first angle-related value of the casing about an axis in a second direction; a radius gyration calculation section to calculate, based on the first acceleration value and first angle-related value, a first radius gyration of the casing about the axis in the second direction, the first radius gyration being a distance from a rotational center axis to the first acceleration detection section; and a pseudo velocity calculation section to generate a first pseudo radius related to a magnitude of the first radius gyration and calculate a first pseudo velocity value of the casing in the first direction by multiplying the first pseudo radius by a first angular velocity value obtained from the first angle-related value. | 09-23-2010 |
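The pseudo-velocity computation above — estimating a radius of gyration from the casing's acceleration and angle-related values, replacing it with a bounded "pseudo radius", then multiplying by the angular velocity — can be sketched as follows. The clamp limits and the use of the acceleration-to-angular-acceleration ratio as the radius estimate are assumptions for illustration:

```python
def pseudo_velocity(accel, angular_accel, angular_rate,
                    min_radius=0.1, max_radius=0.8):
    """Estimate the casing's linear velocity in the first direction from
    a rotation: radius of gyration r ~ a / alpha, then v = r * omega."""
    if abs(angular_accel) < 1e-6:
        # Nearly pure translation: fall back to the largest plausible radius.
        radius = max_radius
    else:
        radius = abs(accel / angular_accel)
    # Clamp to a plausible wrist-to-shoulder range: the "pseudo radius".
    radius = max(min_radius, min(max_radius, radius))
    return radius * angular_rate
```

Clamping matters because near-zero angular acceleration makes the raw ratio blow up; the pseudo radius keeps the pointer velocity finite and smooth.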
20100245244 | CHARACTER INPUTTING DEVICE - The present invention relates to a character inputting device. The character inputting device includes an input unit, a detection unit, and a control unit. The input unit is configured such that a plurality of input areas is radially arranged around a reference location, one or more characters are assigned to each input area, and the respective characters can be selected through different input actions. The detection unit is configured to separately detect different input actions for each of the input areas. The control unit is configured to extract a relevant character from a memory unit in accordance with results of the detection by the detection unit, and input the character. | 09-30-2010 |
20100245245 | SPATIAL INPUT OPERATION DISPLAY APPARATUS - A spatial input operation display apparatus providing a user interface allowing input operations inside a space without requiring hands or fingers to be stopped in space, and not requiring a physical shape or space. The spatial input operation display apparatus is provided with a shape identifying portion ( | 09-30-2010 |
20100253622 | POSITION INFORMATION DETECTION DEVICE, POSITION INFORMATION DETECTION METHOD, AND POSITION INFORMATION DETECTION PROGRAM - A position information detection device, method, and program are provided, which are capable of detecting position information with high precision using simple and easily identified discrimination marks. A position information detection device has an image capture portion | 10-07-2010 |
20100253623 | REMOTE CONTROL, IMAGING DEVICE, METHOD AND SYSTEM FOR THE SAME - A remote control, an imaging device, a method and a system for the same are provided to realize functions such as channel selection of programs, character input, etc. The remote control comprises: an operation means having multiple keys, an ultrasonic and radio signal transmitting means for transmitting radio signals and ultrasonic signals while one of the said multiple keys is operated to map the position of the remote control into a cursor displayed on a screen, and a control means for controlling the said operation means and the said ultrasonic and radio signal transmitting means. | 10-07-2010 |
20100253624 | SYSTEM FOR DISPLAYING AND CONTROLLING ELECTRONIC OBJECTS - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, gesture recognition subsystem employing a wireless pointing device and pointing analysis subsystem also employing the pointing device, are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs. | 10-07-2010 |
20100253625 | METHOD AND DEVICE FOR DETERMINATION OF COORDINATES OF A COMPUTER POINTING DEVICE SUCH AS MOUSE OR ELECTRONIC STYLUS - The invention relates to radio engineering, in particular to methods and devices for loading information into computers or game consoles. The inventive method for extending functionalities and the scope of a mouse- or electronic stylus pen-type manipulator involves emitting electromagnetic wave radiation with the aid of a main transmitter antenna built into the manipulator, receiving said electromagnetic waves with the aid of at least three, one base and two working, spaced antennas of a receiver, when the manipulator is moved on a plain, measuring the phase difference of the signals for the different pairs of the receiver antennas (the base antenna and each of the working antennas) with the aid of at least four, one base and three working, spaced antennas of the receiver, when the manipulator is three-dimensionally moved, and calculating the manipulator plane or three-dimensional coordinates according to the phase differences, wherein the device for determining the coordinates of a mouse- or electronic stylus pen-type manipulator comprises a manipulator transmitter with a base antenna built into the manipulator and a receiving device which contains at least three, one base and two working, spaced antennas, when the manipulator is moved on a plane, and at least four, one base and three working, spaced antennas, when the manipulator is three-dimensionally moved. | 10-07-2010 |
20100259478 | METHOD AND DEVICE FOR INPUTTING INFORMATION BY DESCRIPTION OF THE ALLOWABLE CLOSED TRAJECTORIES - The method and the device for inputting information by description of the allowable closed trajectories (ACT), the device of the sensors of the characteristic points. The manipulator unit ( | 10-14-2010 |
20100271300 | Multi-Touch Pad Control Method - A multi-touch pad control method is disclosed. The method comprises the following steps. First of all, a primary cursor is detected. Then a first touch motion is detected; if there is no first touch motion, the primary cursor is re-detected. Next, a secondary cursor is detected if there is a first touch motion. Then a second touch motion is detected. A first function is performed if there is no second touch motion. A first direction of the second touch motion is detected if the second touch motion controlling the secondary cursor is detected. A second function is performed if the second touch motion is toward the first direction. A second direction of the second touch motion is detected if the second touch motion is not toward the first direction. A third function is performed if the second touch motion is toward the second direction, wherein the first function is performed if the second touch motion controlling the secondary cursor is detected and the second touch motion is neither toward the first direction nor the second direction. | 10-28-2010 |
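The branching flow in this abstract can be condensed into a dispatch function; the direction values and the returned labels below are placeholders standing in for the patent's first, second, and third functions:

```python
def dispatch(first_touch, second_touch, second_dir,
             dir1="left", dir2="right"):
    """Return which action the multi-touch method takes for a given
    combination of first/second touch motions and direction."""
    if not first_touch:
        return "redetect_primary_cursor"   # no first touch: start over
    if not second_touch:
        return "first_function"            # first touch only
    if second_dir == dir1:
        return "second_function"           # second touch toward direction 1
    if second_dir == dir2:
        return "third_function"            # second touch toward direction 2
    # Second touch present but toward neither direction: first function.
    return "first_function"
```

Writing the flow this way makes the fallback rule explicit: a second touch in an unrecognized direction degrades gracefully to the single-touch behavior.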
20100271301 | INPUT PROCESSING DEVICE - An input processing device includes an input pad; a detector which detects a position of an indicating object coming into contact with the input pad; and a processor which controls a display state of an image displayed on a display on the basis of an input signal obtained from the detector, wherein an input surface of the input pad is provided with a detection region for detecting a specific input operation, and wherein when the processor receives the input signal corresponding to the specific input operation given from the indicating object onto the detection region, the processor rotates the image. | 10-28-2010 |
20100283730 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM - An information processing apparatus includes: a control detection block configured to detect a control in a predetermined detection space; a position detection block configured to detect a three-dimensional position of a control detected by the control detection block; a threshold value setting block configured, if the control has approached the control detection block beyond a threshold value set to a predetermined distance from the control detection block on the basis of a three-dimensional position detected by the position detection block, to set the threshold value farther from the control detection block than the predetermined distance; a setting change block configured, if the control has exceeded the threshold value set by the threshold value setting block, to change setting values for predetermined processing; and a processing execution block configured to execute the processing by use of the setting values set by the setting change block. | 11-11-2010 |
20100283731 | METHOD AND APPARATUS FOR PROVIDING A HAPTIC FEEDBACK SHAPE-CHANGING DISPLAY - A haptic device includes a processor, a communication module coupled to the processor for receiving a shape input, and a housing for housing the communication module and including a deformable portion. The deformable portion includes a deformation actuator, and the processor provides a signal to the deformation actuator in response to the shape input to deform the housing. The shape of other areas of the device may also change in response to the signal. The shape changes may provide haptic effects, provide information, provide ergonomic changes, provide additional functionality, etc., to a user of the device. | 11-11-2010 |
20100283732 | EASILY DEPLOYABLE INTERACTIVE DIRECT-POINTING SYSTEM AND PRESENTATION CONTROL SYSTEM AND CALIBRATION METHOD THEREFOR - A method for controlling movement of a computer display cursor based on a point-of-aim of a pointing device within an interaction region includes projecting an image of a computer display to create the interaction region. At least one calibration point having a predetermined relation to said interaction region is established. A pointing line is directed to substantially pass through the calibration point while measuring a position of and an orientation of the pointing device. The pointing line has a predetermined relationship to said pointing device. Movement of the cursor is controlled within the interaction region using measurements of the position of and the orientation of the pointing device. | 11-11-2010 |
20100289743 | LASER POINTER AND GESTURE-BASED INPUT DEVICE - A laser pointer is combined with a gesture-based input system to enable presenters to make a seamless presentation, using the laser pointer to highlight content on a screen and in addition as a mount for a motion sensor comprising at least one small sensor such as a micro-electromechanical sensor MEMS that is used as an input device for delivering commands to a host computer. | 11-18-2010 |
20100295781 | Electronic Device with Sensing Assembly and Method for Interpreting Consecutive Gestures - A method for interpreting at least two consecutive gestures includes providing a sensing assembly having at least one photoreceiver and a plurality of phototransmitters, wherein each phototransmitter is positioned to emit infrared light away from the electronic device about a corresponding central transmission axis, wherein each central transmission axis is oriented in a different direction with respect to the others, and controlling emission of infrared light by each of the phototransmitters during each of a plurality of time periods. For each of the plurality of phototransmitters and for each of the plurality of time periods, a corresponding measured signal is generated which is indicative of a respective amount of infrared light which originated from that phototransmitter during that time period and was reflected by the external object prior to being received by a photoreceiver. The measured signals are evaluated to identify a first gesture, and the electronic device is controlled in response to identification of the first gesture according to a first mode of operation. A parameter of a second gesture is also determined, and the electronic device is controlled in response to the determined parameter of the second gesture according to a second mode of operation. | 11-25-2010 |
20100295782 | SYSTEM AND METHOD FOR CONTROL BASED ON FACE OR HAND GESTURE DETECTION - System and method for control using face detection or hand gesture detection algorithms in a captured image. Based on the existence of a detected human face or a hand gesture in an image captured by a digital camera, a control signal is generated and provided to a device. The control may provide power or disconnect power supply to the device (or part of the device circuits). The location of the detected face in the image may be used to rotate a display screen to achieve a better line of sight with a viewing person. The difference between the location of the detected face and an optimum is the error to be corrected by rotating the display to the required angular position. A hand gesture detection can be used as a replacement to a remote control for the controlled unit, such as a television set. | 11-25-2010 |
20100295783 | GESTURE RECOGNITION SYSTEMS AND RELATED METHODS - A method and apparatus for performing gesture recognition. In one embodiment of the invention, the method includes the steps of receiving one or more raw frames from one or more cameras, each of the one or more raw frames representing a time sequence of images, determining one or more regions of the one or more received raw frames that comprise highly textured regions, segmenting the one or more determined highly textured regions in accordance with textured features thereof to determine one or more segments thereof, determining one or more regions of the one or more received raw frames that comprise other than highly textured regions, and segmenting the one or more determined other than highly textured regions in accordance with color thereof to determine one or more segments thereof. One or more of the segments are then tracked through the one or more raw frames representing the time sequence of images. | 11-25-2010 |
20100295784 | DUAL-PEN: MASTER-SLAVE - There is disclosed an interactive display system comprising an interactive surface for displaying an image and for receiving inputs from remote devices, the system being adapted to detect the presence of at least two remote devices proximate the interactive surface. | 11-25-2010 |
20100302150 | PEER LAYERS OVERLAPPING A WHITEBOARD - A method for displaying edits overlapping a whiteboard comprising creating peer layers overlapping the whiteboard for a peer and peers coupled to the peer and sending or receiving metadata of edits for updating one or more of the peer layers on the peer and the coupled peers in response to any of the peer layers being edited. | 12-02-2010 |
20100302151 | IMAGE DISPLAY DEVICE AND OPERATION METHOD THEREFOR - An image display device and an operation method thereof are provided. The method may include displaying an image on a display, displaying a pointer that moves in correspondence with an operation of a pointing device on the display, and displaying an object for receiving a command or representing image display device-related information on the display when the pointer moves to a predetermined area of the display. | 12-02-2010 |
20100302152 | DATA PROCESSING DEVICE - A data processing device having a display unit, a position sensor unit and an input control unit is provided. The display unit displays an option for operation. The position sensor unit detects an orientation of a linearly shaped pointer put in a front space of a screen of the display unit, and detects a position of a tip of the pointer. The input control unit identifies, upon the position sensor unit detecting the tip of the pointer as being in front of the option displayed on the screen of the display unit, the option as being selected. The input control unit displays the selected option in a form different from a form in which another option is displayed. The input control unit scrolls, upon the position sensor unit detecting a change of the orientation of the pointer, content displayed on the screen of the display unit. | 12-02-2010 |
20100302153 | DEPRESSABLE TOUCH SENSOR - An input device and a method for providing an input device are provided. The input device assembly includes a base, a sensor support, and a scissor mechanism attached to the base and the sensor support. The scissor mechanism allows for only substantially uniform translation of the sensor support towards the base in response to a force biasing the sensor support substantially towards the base. | 12-02-2010 |
20100302154 | MULTI-MODE POINTING DEVICE AND METHOD FOR OPERATING A MULTI-MODE POINTING DEVICE - A method for pairing and operating a multi-mode pointing device is provided. A pointing device may be automatically paired with an image display device. A pairing request signal may be transmitted on a prescribed frequency channel, and a signal may be received indicating information of a plurality of frequency channels. One of the frequency channels may be selected as a pairing frequency channel for operating in a radio frequency mode. | 12-02-2010 |
20100309124 | METHOD OF CALIBRATING POSITION OFFSET OF CURSOR - The present invention provides a method of calibrating a position offset of a cursor on a screen such that, when a pointing device has already been moved to a position beyond the screen boundary, virtual coordinates of the pointing device are calculated and recorded to track the physical positions of the pointing device efficiently, and then the position offset between the pointing device and the cursor on the screen is compensated and corrected, so that the user is spared the hassle of manually operating the pointing device to control cursor movement and can thereby operate the cursor on the screen at will. | 12-09-2010 |
20100315335 | Pointing Device with Independently Movable Portions - A pointing device with independently movable portions is described. In an embodiment, a pointing device comprises a base unit and a satellite portion. The base unit is arranged to be located under a palm of a user's hand and be movable over a supporting surface. The satellite portion is arranged to be located under a digit of the user's hand and be independently movable over the supporting surface relative to the base unit. In embodiments, data from at least one sensing device is read, and movement of both the base unit and the independently movable satellite portion of the pointing device is calculated from the data. The movement of the base unit and the satellite portion is analyzed to detect a user gesture. | 12-16-2010 |
20100315336 | Pointing Device Using Proximity Sensing - A pointing device using proximity sensing is described. In an embodiment, a pointing device comprises a movement sensor and a proximity sensor. The movement sensor generates a first data sequence relating to sensed movement of the pointing device relative to a surface. The proximity sensor generates a second data sequence relating to sensed movement relative to the pointing device of one or more objects in proximity to the pointing device. In embodiments, data from the movement sensor of the pointing device is read and the movement of the pointing device relative to the surface is determined. Data from the proximity sensor is also read, and a sequence of sensor images of one or more objects in proximity to the pointing device are generated. The sensor images are analyzed to determine the movement of the one or more objects relative to the pointing device. | 12-16-2010 |
20100315337 | OPTICAL CAPACITIVE THUMB CONTROL WITH PRESSURE SENSOR - A small sensor surface designed to control a smart phone or Mobile Internet Device (MID). The sensor surface may be mounted on the side of the proposed device in a position where a user's thumb or finger naturally falls when holding the device in his/her hand. The sensor surface is simultaneously convex and concave, providing both visual and physical cues for the use of the sensor surface. The sensor may include capacitive sensing, optical sensing and pressure sensing capabilities to interpret thumb gestures into device control. | 12-16-2010 |
20100315338 | DUPLICATE OBJECTS - There is disclosed an interactive display system comprising an interactive surface for displaying an image and for receiving inputs from remote devices, the system being adapted to detect the presence of at least two remote devices proximate the interactive surface. | 12-16-2010 |
20100321293 | COMMAND GENERATION METHOD AND COMPUTER USING THE SAME - A command generation method is suitable for a computer. First, a human body image is captured by an image capturing device in a two-dimensional image form. Then, the shape of the human body image is determined in two-dimensional image recognition for obtaining a determined result. A command is generated according to the determined result. | 12-23-2010 |
20100321294 | STRETCH OBJECTS - There is disclosed an interactive display system comprising an interactive surface for displaying an image and for receiving inputs from remote devices, the system being adapted to detect the presence of at least two remote devices proximate the interactive surface. | 12-23-2010 |
20100328212 | SYSTEM AND METHOD FOR PROVIDING ROLL COMPENSATION - The embodiments of the present disclosure are directed towards a method and apparatus for providing roll compensation in a control device, the method and apparatus including acquiring rotational data and linear data indicative of movement of the control device, applying roll compensation to the acquired data, and removing a roll compensation error from the roll compensated data. Inertial sensors such as gyroscope sensors and accelerometer sensor(s) may be used to acquire the rotational and linear data. | 12-30-2010 |
20110006984 | Optical Helmet-Position Detection Device Having a Large Dynamic Range - The general field of the invention is that of optical devices for detecting the position/orientation of a helmet. The device according to the invention comprises an optional stationary light source, a stationary camera associated with an image processing system, and a helmet. The helmet has a scattering coating and includes at least one set of markers, each marker comprising at least a first optical element having a very low reflection coefficient, a very low scattering coefficient and a very high absorption coefficient in the visible range and in that of the light source. In one embodiment, each marker may also include a first optical element having a very high retroreflection coefficient and a very low scattering coefficient in the visible range. The marker may also include a second optical element having a high scattering or phosphorescence coefficient in the emission range of the light source. | 01-13-2011 |
20110006985 | DISPLAY SURFACE AND CONTROL DEVICE COMBINED THEREWITH - The invention relates to a display surface and a control device combined therewith for a data processing system, wherein a display surface is equipped with photosensitive elements. A photosensitive element is configured as a planar position detector on the basis of a layer made of an organic photoactive material, which on both sides is connected by a planar electrode, wherein at least one electrode inside the circuit thereof has relatively high resistance, and wherein the current through an electrode having poor conductivity is measured at several connecting points disposed at a distance from each other, from which conclusions may be drawn about the position of a local conductive connection through the photosensitive layer caused by light absorption. A luminous indicator produces a light spot on the display surface, the spot is detectable by the position detectors and reported to a data processing unit. | 01-13-2011 |
20110012830 | STEREO IMAGE INTERACTION SYSTEM - A stereo image interaction system includes a stereo image capturing module, a stereo image processing unit, a system host, and a stereo image display module. When the stereo image display module displays a stereo image, the stereo image capturing module obtains a motion image of an operation body, the stereo image processing unit obtains a motion characteristic from the motion image and transmits the motion characteristic to a central processing unit (CPU), and the CPU calculates a real-time motion of the stereo image under the motion characteristic. At this time, the stereo image displayed by the stereo image display module changes along with the motion characteristic, so that a virtual stereo image is displayed in a physical space, and the operation body is enabled to directly perform a real-time interaction on the stereo image. Furthermore, the displayed stereo image may be a first stereo image captured by the stereo image capturing module or a second stereo image pre-stored by a storage unit in the system host. | 01-20-2011 |
20110018801 | Visual Input/Output Device with Light Shelter - A visual input/output device with a light shield is disclosed. The device includes an active area for performing visual input/output tasks, and a non-active area surrounding the active area in the peripheral area of the device. The light shield is formed in the non-active area for substantially preventing unwanted light from interfering with the active area. The light shield is formed on a same layer as a color filter in the active area, and the light shield is made of black pigment or carbon. | 01-27-2011 |
20110018802 | Remote Control Device and Multimedia System - A remote control device for a multimedia device includes a housing, a touch pad placed on a plane of the housing and comprising a dedicated touch area, a signal determination unit placed inside the housing and coupled to the touch pad for generating an indication signal corresponding to an output effect when the dedicated touch area receives a touch signal, and a wireless transmitter placed inside the housing and coupled to the signal determination unit for wirelessly transmitting the indication signal to the multimedia device to control the multimedia device to reach the output effect. | 01-27-2011 |
20110018803 | Spatial, Multi-Modal Control Device For Use With Spatial Operating System - A system comprising an input device includes a detector coupled to a processor. The detector detects an orientation of the input device. The input device has multiple modal orientations corresponding to the orientation. The modal orientations correspond to multiple input modes of a gestural control system. The detector is coupled to the gestural control system and automatically controls selection of an input mode in response to the orientation. | 01-27-2011 |
20110018804 | Operation control device and operation control method - There is provided an operation control device including a motion detection part which detects an object to be detected, which is moved by motion of a user, a motion determination part which determines motion of the object to be detected based on a detection result, a movable region movement processing part which moves a cursor movable region including a cursor operating an object displayed in a display region, and a cursor movement processing part which moves the cursor. Based on motion of a first detected object, the movable region movement processing part moves the cursor movable region along with the cursor in the display region by a first movement unit. Based on motion of a second detected object, the cursor movement processing part moves only the cursor in the cursor movable region by a second movement unit smaller than the first movement unit. | 01-27-2011 |
20110018805 | LOCATION-DETECTING SYSTEM AND ARRANGEMENT METHOD THEREOF - The invention discloses a location-detecting system including an indication region, one of a camera unit and a light-emitting unit, and an optical device. The indication region is for indication of a target location thereon. One of the camera unit and the light-emitting unit is disposed at a first location of the indication region, and the optical device is disposed at a second location of the indication region and corresponding to one of the camera unit and the light-emitting unit. The optical device is for forming one of a specular reflection camera unit and a specular reflection light-emitting unit, wherein the specular reflection camera unit originates from the camera unit, and the specular reflection light-emitting unit originates from the light-emitting unit. | 01-27-2011 |
20110025603 | Spatial, Multi-Modal Control Device For Use With Spatial Operating System - A system comprising an input device includes a detector coupled to a processor. The detector detects an orientation of the input device. The input device has multiple modal orientations corresponding to the orientation. The modal orientations correspond to multiple input modes of a gestural control system. The detector is coupled to the gestural control system and automatically controls selection of an input mode in response to the orientation. | 02-03-2011 |
20110037695 | ERGONOMIC CONTROL UNIT FOR PROVIDING A POINTING FUNCTION - A control unit disposed in a remote-control pointing device for providing a control and pointing function. The control unit comprises a stationary element, a movable control element with a reflective surface disposed proximate to the stationary element, at least one optical sensor fixedly disposed in close proximity to the movable control element, and at least one pressure sensor. When the optical sensor is activated by movement of the movable control element and when the movable control element is put into pressure contact with the pressure sensor, location and pressure contact data, respectively, are collected by the control unit enabling control and pointing functions associated with the pointing device without the need for an external, stationary reference surface. | 02-17-2011 |
20110043448 | OPERATION INPUT SYSTEM, CONTROL APPARATUS, HANDHELD APPARATUS, AND OPERATION INPUT METHOD - An operation input system includes a casing and a motion sensor for detecting a movement of the casing inside the casing and calculates a position of the casing in a predetermined space based on an output of the motion sensor. The operation input system includes a position sensor and a correction section. The position sensor directly detects the position of the casing in the predetermined space. The correction section corrects the output of the motion sensor using an output of the position sensor. | 02-24-2011 |
20110050569 | Motion Controlled Remote Controller - A handheld device includes a display having a viewable surface and operable to generate an image indicating a currently controlled remote device and a gesture database maintaining a plurality of remote command gestures. Each remote command gesture is defined by a motion of the device with respect to a first position of the handheld device. The device includes a gesture mapping database comprising a mapping of each of the remote command gestures to an associated command for controlling operation of the remote device and a motion detection module operable to detect motion of the handheld device within three dimensions and to identify components of the motion in relation to the viewable surface. The device includes a control module operable to track movement of the handheld device using the motion detection module, to compare the tracked movement against the remote command gestures to determine a matching gesture, and to identify the one of the commands corresponding to the matching gesture. The device also includes a wireless interface operable to transmit the identified command to a remote receiver for delivery to the remote device. | 03-03-2011 |
20110050570 | Orientation-Sensitive Signal Output - Orientation-sensitive signal output, in which a neutral position of a device is automatically determined in relation to at least a first axis, an angular displacement of the device is measured about at least the first axis, and shaking of the device is detected. A selection of the first control is received, and an output signal is output based at least upon the selection and the angular displacement or based upon detecting the shaking of the device. | 03-03-2011 |
20110057879 | Image Projection System with Adjustable Cursor Brightness - An image projection system is disclosed. The system comprises a projector, a user input device and a computing device. The micro-mirror based projector projects an image including a cursor on a display plane. The present invention discloses various embodiments for the system and method of adjusting the brightness of the cursor to improve effects of a presentation. According to one embodiment, the brightness of the cursor is adjusted by modifying the on/off time ratio of the mirrors by which the cursor is formed. According to another embodiment, the projector comprises a first and a second micro-mirror array. The second array is dedicated for projecting the cursor image. The brightness of the cursor may be adjusted by changing the number of micro-mirrors by which the cursor is formed from the second array. | 03-10-2011 |
20110057880 | INFORMATION DISPLAY APPARATUS, INFORMATION DISPLAY METHOD AND PROGRAM - An information display apparatus including: a tilt detection unit that detects a basic position of a casing and detects a tilt from the basic position of the casing; a display unit that is mounted on the casing and displays information on a display screen; a touch detection unit that is mounted on the casing and detects a touch of an operating body on the casing; and a control unit that after movement of information displayed on the display screen of the display unit is started based on the tilt of the casing detected by the tilt detection unit and when a touch of an operating body is detected by the touch detection unit, stops the movement of the information displayed on the display screen. | 03-10-2011 |
20110063213 | REMOTE TOUCHPAD DEVICE FOR VEHICLE AND CONTROL METHOD THEREOF - The present invention features a remote touchpad device for a vehicle, which preferably comprises a circuit board having luminous elements installed at predetermined intervals along the circumference of a circle to irradiate light and at least one light-receiving element to receive the light from the luminous elements, a pad provided on an upper part of the circuit board to make the light from the luminous elements reflected by an approaching or contacting object and incident to the light-receiving element, a controller controlling a user interface by calculating the position of the object with 3D coordinates based on the amount of light incident to the light-receiving element, and a housing forming the exterior of the circuit board, the pad, and the controller. The invention also features a method of controlling the remote touchpad device. | 03-17-2011 |
20110063214 | Display and optical pointer systems and related methods - Display and optical pointer systems and related methods are disclosed that utilize LEDs in a display device to respond to optical signals from optical pointing devices. In part, the disclosed embodiments relate to displays with arrays of LEDs and associated pointing devices that communicate with individual LEDs in the arrays using visible light. The LED arrays can produce images directly as in LED billboards and sports arena scoreboards or can produce the backlight for LCD screens for instance. The pointing devices communicate with individual pixels or groups of pixels using a beam of light that may or may not be modulated with data, which is detected by the LEDs in the array that are exposed to the beam. Periodically, the LEDs in an array stop producing light and are configured with an associated driver device to detect light from the pointing device. Such configurations enable the user to point and click at on screen displays much like a computer mouse. | 03-17-2011 |
20110063215 | REMOTE CONTROL SYSTEM AND REMOTE CONTROL METHOD - A remote control system includes a PC and a MFP capable of remotely controlling an external device. PC includes a browsing portion to receive an operation screen from the MFP, a display control portion to allow a projector to project the received operation screen onto a projection plane, and a position detection portion to detect a position pointed to by a user in the projected operation screen. The browsing portion transmits to the MFP a command which is included in the operation screen and is related to the detected position in the operation screen. The MFP includes an operation screen transmission portion to transmit to the PC the operation screen, including a command for specifying control, for accepting an operation, a command reception portion to receive a command transmitted from the PC, and a process control portion to control the external device or the device itself in accordance with the received command. | 03-17-2011 |
20110063216 | SYSTEM AND METHOD FOR NAVIGATING A MOBILE DEVICE USER INTERFACE WITH A DIRECTIONAL SENSING DEVICE - An electronic mobile device includes a display, a tilt sensor and a processor. The display is for displaying a graphical element. The tilt sensor is configured to measure a tilt angle of the mobile device. The processor is configured to store the measured tilt angle as a reference tilt angle, subsequently determine a delta tilt angle as the difference between a currently measured tilt angle and the reference tilt angle, compare the delta tilt angle to different thresholds, and alter the position of the displayed element on the display at a rate that is based on the number of the thresholds the delta tilt angle has exceeded. | 03-17-2011 |
20110063217 | DIRECT NAVIGATION OF TWO-DIMENSIONAL CONTROL USING A THREE-DIMENSIONAL POINTING DEVICE - Direct and absolute pointing is provided for with respect to a two-dimensional information display surface, much like how one would point a laser pointer or flashlight at a desired point. The displayed control may be moved by manipulating the pointing device in three dimensions. The translational position of the pointing device may be measured in three dimensions. Also, the three-dimensional orientation of the pointing device may be measured. A computing device may receive this information from the pointing device and determine where the pointing device is pointing to. If the pointing device is pointing at a display, then the computing device may cause the control to be displayed at the position to which the pointing device is pointing. In addition, the control may be displayed at an orientation that depends upon the orientation of the pointing device. | 03-17-2011 |
20110069007 | POINTING DEVICE - A gyroscopic pointing apparatus for an electronic device, particularly a games console, that is provided with means to detect when the mobile component is pointing at the screen upon which it controls a cursor to provide a mechanism to correct for drift in gyroscope readings and an improved method of dynamically recalibrating the zero point of the gyroscopes. The pointing detection mechanism may be provided by the combination of an infra-red LED and an infra-red sensor in either permutation on the mobile component and fixed component respectively. | 03-24-2011 |
20110074674 | PORTABLE INPUT DEVICE, METHOD FOR CALIBRATION THEREOF, AND COMPUTER READABLE RECORDING MEDIUM STORING PROGRAM FOR CALIBRATION - Provided are a portable input device for inputting coordinates, a method of calibrating the device, and a computer readable recording medium storing a computer program for making a computer perform the method. The portable input device includes two digital cameras, a calibration tool, a storage section, and a controller for calculating coordinates of an object on an input surface based on images taken by the two digital cameras so as to include the calibration tool. The controller also calibrates positions and widths of a detection band which corresponds to a detection zone defined in a vicinity of the input surface. The positions and the widths of the detection band are stored in the storage section in relationship to positions on the input surface. | 03-31-2011 |
20110074675 | METHOD AND APPARATUS FOR INITIATING A FEATURE BASED AT LEAST IN PART ON THE TRACKED MOVEMENT - In accordance with an example embodiment of the present invention, an apparatus comprising a camera configured to capture one or more media frames. Further, the apparatus comprises at least one processor and at least one memory including computer program code. The at least one memory and the computer program code is configured to, with the at least one processor, cause the apparatus to perform at least the following: filter the one or more media frames using one or more shaped filter banks; determine a gesture related to the one or more media frames; track movement of the gesture; and initiate a feature based at least in part on the tracked movement. | 03-31-2011 |
20110074676 | Large Depth of Field Navigation Input Devices and Methods - Disclosed are various embodiments of a navigation input device, and methods, systems and components corresponding thereto. According to some embodiments, the navigation input device has a large depth of field associated therewith and employs time- and/or frequency-domain processing algorithms and techniques. The device is capable of providing accurate and reliable information regarding the (X,Y) position of the device on a navigation surface as it is moved laterally thereatop and thereacross, notwithstanding changes in a vertical position of the device that occur during navigation and that do not exceed the depth of field of an imaging lens incorporated therein. According to one embodiment, the navigation input device is a writing instrument that does not require the use of an underlying touch screen, touch pad or active backplane to accurately and reliably record successive (X,Y) positions of the writing device as it is moved across and atop an underlying writing medium such as paper, a pad or a display. | 03-31-2011 |
20110074677 | Methods for Determining a Cursor Position from a Finger Contact with a Touch Screen Display - A portable device with a touch screen display detects a contact area of a finger with the touch screen display and then determines a first position associated with the contact area. The cursor position of the finger contact is determined, at least in part, based on: the first position, one or more distances between the first position and one or more of the user interface objects; and one or more activation susceptibility numbers, each associated with a respective user interface object in the plurality of user interface objects. If the cursor position falls into the hidden hit region of a virtual push button on the touch screen display, the portable device is activated to perform operations associated with the virtual push button. | 03-31-2011 |
20110080340 | System And Method For Remote Control Of A Computer - A remote control system for a computer, and a corresponding method include a Web camera having an image capture unit, the image capture unit including one or more devices capable of receiving imagery from multiple, distinct sections of the electromagnetic spectrum; a detection and separation module capable of detecting and separating the imagery into at least one signal capable of cursor control, wherein the signal capable of cursor control is generated by a remote control device; and a processing unit that receives the signal capable of cursor control and generates one or more cursor control signals, wherein the one or more cursor control signals include signals indicative of movement of the remote control device, the movement capable of translation to movement of a cursor displayed on a display of the computer. | 04-07-2011 |
20110090148 | Wearable input device - A wearable input device includes a body and a soft battery. The body is connected to the soft battery to form a collar range, and the soft battery surrounds a hand of a user. The soft battery supplies an electric power to a finger-contact control module and a wireless transmitter in the body, such that the finger-contact control module senses a movement of an object (for example, a finger) on the body, and generates a control signal corresponding to a moving position of the object, and then the wireless transmitter transmits the control signal to a computer host, thus manipulating a cursor on an operating system frame of the computer host. | 04-21-2011 |
20110090149 | METHOD AND APPARATUS FOR ADJUSTING A VIEW OF A SCENE BEING DISPLAYED ACCORDING TO TRACKED HEAD MOTION - A method for controlling a view of a scene is provided. The method initiates with detecting an initial location of a control object. An initial view of the scene is displayed on a virtual window, the initial view defined by a view-frustum based on a projection of the initial location of the control object through outer edges of the virtual window. Movement of the control object to a new location is detected. An updated view of the scene is displayed on the virtual window, the updated view defined by an updated view-frustum based on a projection of the new location of the control object through the outer edges of the virtual window. | 04-21-2011 |
20110095977 | INTERACTIVE INPUT SYSTEM INCORPORATING MULTI-ANGLE REFLECTING STRUCTURE - An interactive input system comprises at least one imaging device having a field of view looking into a region of interest. At least one radiation source emits radiation into the region of interest. A bezel at least partially surrounds the region of interest. The bezel comprises a multi-angle reflecting structure to reflect emitted radiation from the at least one radiation source towards the at least one imaging device. | 04-28-2011 |
20110095978 | REMOTE CONTROL - In a method for controlling objects, objects to be controlled are arranged in a real space. Said real space is linked to a multi-dimensional representational space by a transformation rule. Representations in the representational space are associated with the controllable objects of the real space by a mapping. Said method comprises steps of determining the position and orientation of a pointer in the real space, determining the position and orientation of a pointer representation associated with the pointer in the representational space using the position and orientation of the pointer in the real space and the transformation rule between the real space and the representational space, determining the representations in the representational space that are intersected by the pointer representation, selecting a representation that is intersected by the pointer representation, and controlling the object in the real space that is associated with the pointer representation in the representational space. | 04-28-2011 |
20110095979 | Real-Time Dynamic Tracking of Bias - A bias value associated with a sensor, e.g., a time-varying, non-zero value which is output from a sensor when it is motionless, is estimated using at least two, different bias estimating techniques. A resultant combined or selected bias estimate may then be used to compensate the biased output of the sensor in, e.g., a 3D pointing device. | 04-28-2011 |
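Combining two bias estimates, as the abstract above describes, can be sketched with a standard inverse-variance fusion rule. The specific estimating techniques and fusion rule in the patent are not given here, so this is only a hedged illustration of the combine-then-compensate idea.

```python
def combine_bias(est_a, var_a, est_b, var_b):
    """Fuse two independent bias estimates by inverse-variance weighting
    (a textbook fusion rule, assumed here for illustration).
    Returns (combined_estimate, combined_variance)."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    est = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return est, 1.0 / (w_a + w_b)

def compensate(raw_rate, bias):
    """Subtract the estimated bias from the raw sensor reading,
    e.g. a gyroscope rate in a 3D pointing device."""
    return raw_rate - bias

# Two bias estimators disagree; the lower-variance one dominates the fusion.
bias, var = combine_bias(0.30, 0.04, 0.20, 0.01)
rate = compensate(5.22, bias)
```

The combined variance is smaller than either input variance, which is why fusing two estimators beats using either alone.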
20110095980 | HANDHELD VISION BASED ABSOLUTE POINTING SYSTEM - A method is described that involves identifying one or more images of respective one or more fixed markers. Each marker is positioned on or proximate to a display. The images appear on a pixilated sensor within a handheld device. The method also involves determining a location on, or proximate to, the display where the handheld device was pointed during the identifying. The method also involves sending from the handheld device information derived from the identifying of the one or more images of respective one or more fixed markers. The method also involves triggering action taken by electronic equipment circuitry in response to the handheld device's sending of a signal to indicate the action is desired. | 04-28-2011 |
20110102318 | USER INPUT BY POINTING - Presented is apparatus for capturing user input by pointing at a surface using pointing means. The apparatus comprises: a range camera for producing a depth-image of the pointing means; and a processor. The processor is adapted to determine from the depth-image the position and orientation of a pointing axis of the pointing means; extrapolate from the position and orientation the point of intersection of the axis with the surface; and control an operation based on the location of the point of intersection. | 05-05-2011 |
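The extrapolation step above — intersecting the pointing axis with the surface — is a ray-plane intersection. A minimal sketch, assuming the surface is planar and the axis is given as an origin and direction from the depth-image:

```python
def intersect_pointing_axis(origin, direction, plane_point, plane_normal):
    """Extrapolate the pointing axis (origin + t*direction, t >= 0) to its
    intersection with a planar surface. Returns the 3D hit point, or None
    if the axis is parallel to the plane or points away from it."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # axis parallel to the surface
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(f * n for f, n in zip(diff, plane_normal)) / denom
    if t < 0:
        return None  # surface lies behind the pointing means
    return tuple(o + t * d for o, d in zip(origin, direction))

# Pointing from (0, 0, 2) straight down the -z axis at the z = 0 surface.
hit = intersect_pointing_axis((0.0, 0.0, 2.0), (0, 0, -1), (0, 0, 0), (0, 0, 1))
```

The controlled operation would then be dispatched from the returned hit location.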
20110102319 | HYBRID POINTING DEVICE - The present invention discloses a hybrid pointing device including an optical navigation module and a pointing module. The optical navigation module is configured to replace the conventional buttons of a conventional pointing device, such as an optical mouse or a trackball mouse. The optical navigation module is configured to sense gestures of at least one object operated by a user to activate commands associated with particular programs running on the host. Since the optical navigation module senses only gestures of the object, and not the movement of the hybrid pointing device relative to a surface, its resolution need only be high enough for sensing gestures and need not be especially high. | 05-05-2011 |
20110102320 | INTERACTION ARRANGEMENT FOR INTERACTION BETWEEN A SCREEN AND A POINTER OBJECT - An interaction arrangement for interaction between at least one screen arranged behind a transparent pane and at least one pointer object located in front of the pane, comprising at least two cameras arranged behind the pane, wherein there is associated with each of the cameras a deflection unit by means of which at least one optical path from an interaction area in the vicinity of and in front of the pane can be directed into the camera, and comprising a computing unit connected to all of the cameras for determining a position of the pointer object which is guided so as to be visible for at least two of the cameras, wherein at least the interaction area can be stroboscopically illuminated with infrared light, and the cameras are sensitive to infrared light and can be synchronized with the stroboscopic illumination. | 05-05-2011 |
20110102321 | IMAGE DISPLAY APPARATUS AND METHOD FOR CONTROLLING THE IMAGE DISPLAY APPARATUS - An image display apparatus including a remote control interface configured to receive a signal from a remote controller; a controller configured to calculate a first pointer position at which a pointer is to be displayed on a display of the image display apparatus based on the received signal, to determine a depth of a three-dimensional (3D) object displayed on the display of the image display apparatus, and to calculate a second position of the pointer based on the determined depth of the 3D object; and a video processor configured to display the pointer at the calculated second pointer position on the display of the image display apparatus. | 05-05-2011 |
20110109545 | Pointer and controller based on spherical coordinates system and system for use - A hand-held pointing device for manipulating an object on a display is disclosed. The device is constructed from at least one accelerometer and at least one linear input element. The accelerometer or accelerometers generate a pitch signal and a roll signal. These pitch and roll signals are used to determine a position on a display. | 05-12-2011 |
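Deriving pitch and roll from an accelerometer's gravity reading and mapping them to a display position, as the abstract above describes, can be sketched as follows. The axis convention and the linear angle-to-pixel mapping are assumptions for illustration, not the patent's specific method.

```python
import math

def pitch_roll(ax, ay, az):
    """Derive pitch and roll (radians) from a static accelerometer reading
    of gravity. Axis convention assumed: x forward, y right, z down."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def to_screen(pitch, roll, width, height, max_angle=math.pi / 4):
    """Map pitch/roll linearly onto display coordinates, clamped at
    +/- max_angle of device tilt."""
    clamp = lambda v: max(-max_angle, min(max_angle, v))
    x = (clamp(roll) / max_angle + 1) / 2 * (width - 1)
    y = (clamp(pitch) / max_angle + 1) / 2 * (height - 1)
    return round(x), round(y)

# Device held level: gravity entirely on z, cursor centers on the display.
x, y = to_screen(*pitch_roll(0.0, 0.0, 9.81), 1920, 1080)
```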
20110109546 | ACCELEROMETER-BASED TOUCHSCREEN USER INTERFACE - A CE device for, e.g., displaying the time can incorporate an accelerometer to provide various features and enhancements. For example, tilting of the housing as sensed by the accelerometer may be used for controlling a volume output by an audio display, and/or for controlling a position of a screen cursor relative to underlying presentation on a visual display, and/or for controlling motion of a virtual object presented on the visual display; and/or for rotating a presentation on the visual display to always be oriented up and/or for determining that a person has tapped the housing based on signals from the accelerometer and in response thereto presenting an image of a rotatable object on the display. | 05-12-2011 |
20110109547 | POSITION REMOTE CONTROL SYSTEM FOR WIDGET - A position remote control system for a widget is disclosed. The position remote control system is utilized for controlling positions of a plurality of widgets on a display device, and includes a remote controller for selecting one of the plurality of widgets and a corresponding target position and accordingly generating a remote control signal, and a display module including a first wireless device for receiving the remote control signal, an interpreter coupled to the first wireless device for interpreting the remote control signal to generate a first display signal, and a display unit coupled to the interpreter for displaying the widget window on a display area of the display device according to the first display signal. | 05-12-2011 |
20110109548 | Systems and methods for motion recognition with minimum delay - Techniques for performing motion recognition with minimum delay are disclosed. A processing unit is provided to receive motion signals from at least one motion sensing device, where the motion signal describes motions made by a user. The processing unit is configured to access a set of prototypes included in a motion recognizer to generate corresponding recognition signals from the motion signals in response to the motion recognizer without considering one or more of the prototypes completely in the motion recognizer. Movements of at least one of the objects in a virtual interactive environment is responsive to the recognition signals such that feedback from the motions to control the one of the objects is immediate and substantially correct no matter how much of the motion signals have been received. | 05-12-2011 |
20110115705 | POINTING DEVICE AND ELECTRONIC APPARATUS - A pointing device includes: a touching surface on which a fingertip is placed; a light emitting diode for illuminating the touching surface from the side opposite to the side where the fingertip is placed; and an imaging element for receiving light reflected from the fingertip. The pointing device further includes first light control means for controlling light which is emitted from the light emitting diode and reaches the touching surface so that the light is evenly incident on the touching surface, the light emitting diode emitting light whose intensity varies with the radiation angle, and the first light control means being positioned on the light path from the light emitting diode to the touching surface. This makes it possible to provide a pointing device that mitigates the decrease in detection accuracy caused by deviation in the light source's output, thereby preventing malfunction, as well as an electronic apparatus including the pointing device. | 05-19-2011 |
20110115706 | Apparatus and method for providing pointer control function in portable terminal - An apparatus and a method for controlling a pointer output at a peripheral device using a portable terminal. In particular, an apparatus and method for controlling a pointer on a peripheral device screen using the portable terminal and shifting an object selected with the pointer to another peripheral device, including a pointer management unit to select information on a pointer to be used and transmit the information to a peripheral device, and to determine a motion position of the pointer and provide the motion position to the peripheral device. | 05-19-2011 |
20110122062 | MOTION RECOGNITION APPARATUS AND METHOD - Provided are a motion recognition apparatus and method, and more particularly, a motion recognition apparatus and method which are employed to move a pointer only when intended by a user using a touch sensor included in a pointing device that moves the pointer according to a motion sensed by a motion sensor. | 05-26-2011 |
20110128221 | Computer Display Pointer Device for a Display - According to one embodiment, a computer display pointer device includes an image processor coupled to a display and a video camera. The display is configured to be worn by a user and display a computer image over a portion of the user's field-of-view. The video camera is operable to be worn by the user and boresighted to a field-of-view of the user. The image processor receives a video signal from the video camera that includes an image of a pointer element configured on a hand of the user, determines a position of the pointer element according to the received video image, and moves a cursor on the display according to the position of the pointer element. | 06-02-2011 |
20110128222 | INFORMATION PROCESSING APPARATUS AND CONTROL METHOD - According to one embodiment, a switch circuit switches a resonance frequency band of an antenna in a display unit between first and second resonance frequency bands. The second resonance frequency band is overlapped with a part of the first resonance frequency band and is higher than the first resonance frequency band. A wireless communication module wirelessly transmits and receives signals using a transmission frequency band and a reception frequency band which are included in the first resonance frequency band. A screen image orientation control module changes an orientation of a screen image displayed on the display unit. A resonance frequency shift module shifts the resonance frequency band of the antenna from the first resonance frequency band to the second frequency band by controlling the switch circuit when the orientation of the screen image is an orientation in which the antenna is positioned on a downward side of the screen image. | 06-02-2011 |
20110128223 | METHOD OF AND SYSTEM FOR DETERMINING A HEAD-MOTION/GAZE RELATIONSHIP FOR A USER, AND AN INTERACTIVE DISPLAY SYSTEM - The invention describes a method of determining a head-motion/gaze relationship for a user ( | 06-02-2011 |
20110134034 | Input Device and Method, and Character Input Method - There is provided an input device capable of detecting a motion of a hand of a user ( | 06-09-2011 |
20110134035 | Transmitting Apparatus, Display Apparatus, and Remote Signal Input System - Disclosed is a remote signal input system. The remote signal input system comprises a transmitting device for generating signal light and a display panel comprising a plurality of sensors for sensing the signal light. Location signals are input into a display device having the display panel by using the signal light generated from the transmitting device. | 06-09-2011 |
20110134036 | Touchscreen Display With Plural Cameras - A display system (AP | 06-09-2011 |
20110141013 | USER-INTERFACE APPARATUS AND METHOD FOR USER CONTROL - An apparatus comprising at least two sensors, a pointing device and an object-recognition unit. The sensors are at different locations and are capable of detecting a signal from at least a portion of a user. The pointing device is configured to direct a user-controllable signal that is detectable by the sensors. The object-recognition unit is configured to receive output from the sensors, and, to determine locations of the portion of the user and of the pointing device based on the output. The object-recognition unit is also configured to calculate a target location pointed to by the user with the pointing device, based upon the determined locations of the portion of the user and of the pointing device. | 06-16-2011 |
20110141014 | MOVABLE TOUCHPAD WITH HIGH SENSITIVITY - A highly sensitive movable touchpad is disclosed in the present invention. It is used for laptop computers and has a slidable template that users move so that a cursor can be controlled by the touchpad. A resistive or capacitive detecting surface can be applied for detecting a user's click, double click, drag, or scroll motion at any point on the surface. Additionally, an optical displacement sensor is provided under the slidable template for detecting surface information on the back surface of the slidable template. A sequence of images of surface movement is processed by an image processing unit. Then, relative movement information is calculated and sent to an operating system in the computer. The operating system controls the cursor with the relative movement information. The present invention uses edge detectors for dynamically controlling the cursor and calibrating the location of the cursor so that positioning of the touchpad is synchronous with the cursor. | 06-16-2011 |
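The optical displacement sensing under the slidable template amounts to estimating the shift between successive images of the template's back surface. A minimal 1-D stand-in, assuming a block-matching (sum-of-absolute-differences) search rather than whatever correlator the patent uses:

```python
def detect_shift(prev, curr, max_shift):
    """Estimate the 1-D displacement between two image rows by finding the
    shift with the smallest mean absolute difference -- a toy version of
    the surface-image matching an optical displacement sensor performs."""
    best_shift, best_cost = 0, float("inf")
    n = len(prev)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(prev[i], curr[i + s]) for i in range(n) if 0 <= i + s < n]
        cost = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

row = [0, 0, 5, 9, 5, 0, 0, 0]
moved = [0, 0, 0, 5, 9, 5, 0, 0]   # same surface pattern, one pixel right
shift = detect_shift(row, moved, max_shift=2)
```

The per-frame shift, accumulated over the image sequence, is the relative movement information sent to the operating system.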
20110141015 | Storage medium having information processing program stored thereon and information processing apparatus - A motion information obtaining step successively obtains motion information from a motion sensor. An imaging information obtaining step successively obtains imaging information from an imaging means. An invalid information determination step determines whether the imaging information is valid information or invalid information for predetermined processing. A motion value calculation step calculates a motion value representing a magnitude of a motion of the operation apparatus in accordance with the motion information. A processing step executes, when the imaging information is determined as the invalid information in the invalid information determination step and when the motion value calculated in the motion calculation step is within a predetermined value range, predetermined processing in accordance with most recent valid imaging information among valid imaging information previously obtained. | 06-16-2011 |
20110148762 | SYSTEM AND METHOD FOR MULTI-MODE COMMAND INPUT - A controlling device has a moveable touch sensitive panel positioned above a plurality of switches. When the controlling device senses an activation of at least one of the plurality of switches caused by a movement of the touch sensitive panel resulting from an input at an input location upon the touch sensitive surface, the controlling device responds by transmitting a signal to an appliance, wherein the signal is reflective of the input location upon the touch sensitive surface. | 06-23-2011 |
20110148763 | OPERATION INPUT DEVICE AND METHOD, PROGRAM, AND ELECTRONIC APPARATUS - An operation input device includes: angular velocity detecting means for detecting an angular velocity; relative velocity detecting means for contactlessly detecting a relative velocity to a target object; distance detecting means for detecting a distance to the target object; and computing means for computing an amount of movement based on the angular velocity, the relative velocity, and the distance. | 06-23-2011 |
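The computing means above combines angular velocity, relative velocity, and distance into a movement amount. One plausible combination — an assumption for illustration, not the patent's disclosed formula — adds the device's translation over a time step to the arc swept on the target at the measured range:

```python
def movement_amount(angular_velocity, relative_velocity, distance, dt):
    """Hypothetical combination of the three measured quantities:
    translation over dt plus the arc length swept by rotating at
    angular_velocity (rad/s) at range `distance` over the same dt."""
    translation = relative_velocity * dt
    arc = distance * angular_velocity * dt
    return translation + arc

# 0.5 rad/s rotation at 0.2 m range plus 0.1 m/s translation, over 0.1 s.
move = movement_amount(angular_velocity=0.5, relative_velocity=0.1,
                       distance=0.2, dt=0.1)
```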
20110157015 | METHOD OF GENERATING MULTI-TOUCH SIGNAL, DONGLE FOR GENERATING MULTI-TOUCH SIGNAL, AND RELATED CONTROL SYSTEM - A dongle includes: a receiver, for receiving a control signal which is not generated from a multi-touch panel; a processing unit, coupled to the receiver, for generating a multi-touch output signal corresponding to a multi-touch event according to the control signal; and a data port, coupled to the processing unit, for outputting the multi-touch output signal. | 06-30-2011 |
20110157016 | GESTURE RECOGNITION INPUT DEVICE - A gesture recognition based input device includes a number of finger wear components, a number of image capture modules, and an image recognition module. Each finger wear component dedicatedly reflects light of a unique wavelength. Each image capture module dedicatedly picks up light reflected by a corresponding finger wear component and thereby dedicatedly captures images of the corresponding finger wear component. The image recognition module recognizes movements of the finger wear components from the images and interprets the movements of the finger wear components into control signals. | 06-30-2011 |
20110157017 | PORTABLE DATA PROCESSING APPARATUS - A portable data processing apparatus is provided. It has at least one data processing function which depends on detected motion of the apparatus. The apparatus comprises a video camera operable to capture successive images of a part of the real environment around the apparatus; a video motion detector operable to detect motion of the apparatus by analysis of image motion between pairs of captured images; a hardware motion detector operable to detect motion of the apparatus, whereby the data processing function depends on motion detected by the hardware motion detector; and a controller operable to adjust the operation of the hardware motion detector if the motion detected by the video motion detector and the motion detected by the hardware motion detector differ by at least a threshold difference. | 06-30-2011 |
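The controller's cross-check between the video motion detector and the hardware motion detector can be sketched as a threshold comparison followed by a recalibration. The adjustment rule below (rescaling the hardware estimate toward the video estimate) is an assumption; the patent only says the hardware detector's operation is adjusted.

```python
def reconcile(video_motion, hw_motion, threshold, hw_scale=1.0):
    """Compare the two motion estimates; if they differ by at least
    `threshold`, recalibrate the hardware detector's scale factor so its
    output matches the video estimate (hypothetical adjustment rule)."""
    diff = abs(video_motion - hw_motion * hw_scale)
    if diff >= threshold and hw_motion != 0:
        hw_scale = video_motion / hw_motion
    return hw_scale

# Hardware detector reads twice the video estimate: scale is halved.
scale = reconcile(video_motion=2.0, hw_motion=4.0, threshold=0.5)
```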
20110163953 | Lapdesk with Retractable Touchpad - A lapdesk for use with a laptop computer includes a housing having a top configured to support the laptop computer. The housing is configured to block heat emitted from the laptop computer from passing through the housing. The lapdesk further includes a tray having a touchpad disposed thereon. The tray is configured to slide into the housing and slide out from the housing. The lapdesk further includes a circuit coupled to the touchpad where the circuit is configured to transmit control signals from the touchpad to the laptop computer. | 07-07-2011 |
20110163954 | DISPLAY DEVICE AND CONTROL METHOD THEREOF - A display device and a control method thereof are provided. The display device includes a camera obtaining an image and a controller obtaining the direction of a user included in the obtained image and correcting the image such that the direction of the user is synchronized with the photographing direction of the camera. Even when the direction of the user does not correspond to the photographing direction of the camera, an image of the user can be corrected to correctly recognize a user's gesture. | 07-07-2011 |
20110163955 | MOTION SENSING AND PROCESSING ON MOBILE DEVICES - Handheld electronic devices including motion sensing and processing. In one aspect, a handheld electronic device includes a set of motion sensors provided on a single sensor wafer, including at least one gyroscope sensing rotational rate of the device around at least three axes and at least one accelerometer sensing gravity and linear acceleration of the device along the at least three axes. Memory stores sensor data derived from the at least one gyroscope and accelerometer, where the sensor data describes movement of the device including a rotation of the device around at least one of the three axes of the device, the rotation causing interaction with the device. The memory is provided on an electronics wafer positioned vertically with respect to the sensor wafer and substantially parallel to the sensor wafer. The electronics wafer is vertically bonded to and electrically connected to the sensor wafer. | 07-07-2011 |
20110163956 | Bimanual Gesture Based Input and Device Control System - A user conveys information to a receiving device with a data input tool which uses combinatorial gesture patterns from the cursor or track point of two single point devices. The input method is independent from hardware and language limitations, improves the user's ability to focus on the data stream being entered and reduces the footprint of the data input tool. | 07-07-2011 |
20110169735 | Apparatus and Method for Interacting with Handheld Carrier Hosting Media Content - Improved techniques for interacting with one or more handheld carriers hosting media content are disclosed. The handheld carrier hosting media content may be sensed, and at least a portion of the media content may be integrated into operation of a media activity provided by a computing device, upon recognizing the media activity and the media content. The media activity provided by the computing device may involve creating or editing an electronic document. The integration of the media content into operation of the media activity may involve insertion or importation of the media content into the electronic document. | 07-14-2011 |
20110169736 | INTERACTIVE INPUT SYSTEM AND TOOL TRAY THEREFOR - An interactive input system comprises an interactive surface and a tool tray supporting at least one tool to be used to interact with the interactive surface. The tool tray comprises processing structure for communicating with at least one imaging device and processing data received from the at least one imaging device for locating a pointer positioned in proximity with the interactive surface. | 07-14-2011 |
20110169737 | STORAGE MEDIUM HAVING INFORMATION PROCESSING PROGRAM STORED THEREIN, INFORMATION PROCESSING APPARATUS, AND INFORMATION PROCESSING SYSTEM - An information processing apparatus is capable of obtaining operation data according to a tilt of a predetermined object that can be moved by a user. The information processing apparatus calculates tilt information corresponding to the tilt of the object based on the operation data. The information processing apparatus calculates a specified position on a screen of a display device based on the operation data so that the specified position changes according to at least one of a position and the tilt of the object. Selection items are displayed on the screen of the display device. The information processing apparatus switches between sets of the selection items displayed on the screen according to an amount of tilt represented by the tilt information. Moreover, the information processing apparatus selects an item displayed at the specified position from among the selection items to perform an information process according to the selected item. | 07-14-2011 |
20110175809 | Tracking Groups Of Users In Motion Capture System - In a motion capture system, a unitary input is provided to an application based on detected movement and/or location of a group of people. Audio information from the group can also be used as an input. The application can provide real-time feedback to the person or group via a display and audio output. The group can control the movement of an avatar in a virtual space based on the movement of each person in the group, such as in a steering or balancing game. To avoid a discontinuous or confusing output by the application, missing data can be generated for a person who is occluded or partially out of the field of view. A wait time can be set for activating a new person and deactivating a currently-active person. The wait time can be adaptive based on a first detected position or a last detected position of the person. | 07-21-2011 |
20110175810 | Recognizing User Intent In Motion Capture System - Techniques for facilitating interaction with an application in a motion capture system allow a person to easily begin interacting without manual setup. A depth camera system tracks a person in physical space and evaluates the person's intent to engage with the application. Factors such as location, stance, movement and voice data can be evaluated. Absolute location in a field of view of the depth camera, and location relative to another person, can be evaluated. Stance can include facing a depth camera, indicating a willingness to interact. Movements can include moving toward or away from a central area in the physical space, walking through the field of view, and movements which occur while standing generally in one location, such as moving one's arms around, gesturing, or shifting weight from one foot to another. Voice data can include volume as well as words which are detected by speech recognition. | 07-21-2011 |
20110181509 | Gesture Control - An apparatus including: a radio transmitter configured to transmit radio signals that are at least partially reflected by a human body; one or more radio receivers configured to receive the transmitted radio signals after having been at least partially reflected by a human body of a user; a gesture detector configured to detect a predetermined time-varying modulation that is present in the received radio signals compared to the transmitted radio signals; and a controller configured to interpret the predetermined time-varying modulation as a predetermined user input command and change the operation of the apparatus. | 07-28-2011 |
20110181510 | Gesture Control - An apparatus including one or more radio transmitters configured to transmit radio signals that are at least partially reflected by an object or objects moving as a consequence of a gesture; multiple radio receivers configured to receive the transmitted radio signals after having been at least partially reflected by an object or objects moving as a consequence of a gesture; a detector configured to detect an attribute of the received signals, for each receiver, that varies with the position of the object or objects moving as a consequence of the gesture; and a controller configured to interpret the detected attributes, for the receivers, as a user input associated with the gesture. | 07-28-2011 |
20110181511 | METHOD FOR ADJUSTING EXPOSURE CONDITION OF OPTICAL SENSOR MODULE AND OPTICAL SENSOR MODULE THEREOF - A method for adjusting an exposure condition of an optical sensor module includes the following steps, (A) receiving reflected light reflected by a working surface; (B) generating an image signal by exposing the optical sensor module to the reflected light, in which the image signal includes a plurality of luminance signals and an image quality signal; (C) setting an exposure condition of the optical sensor module according to part of the luminance signals; (D) repeating Step (B) and Step (C) under different exposure conditions so that the optical sensor module generates a plurality of image quality signals; and (E) setting an optimal exposure condition corresponding to the working surface according to the image quality signals under the different exposure conditions. The optical sensor module is applicable to a pointing device. | 07-28-2011 |
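Steps (B) through (E) above amount to sweeping candidate exposure conditions and keeping the one whose image quality signal is best. A minimal sketch, with the sensor abstracted as any callable mapping an exposure setting to a quality score (the scoring function here is hypothetical):

```python
def best_exposure(sensor, exposures):
    """Expose the optical sensor module under each candidate condition,
    read the resulting image quality signal, and return the exposure
    that maximizes quality for the current working surface."""
    best, best_quality = None, float("-inf")
    for exposure in exposures:
        quality = sensor(exposure)  # steps (B)-(C): expose and read quality
        if quality > best_quality:  # step (E): track the optimum
            best, best_quality = exposure, quality
    return best

# Hypothetical surface whose quality signal peaks near an 8 ms exposure.
opt = best_exposure(lambda e: -(e - 8) ** 2, [2, 4, 8, 16, 32])
```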
20110187642 | Interaction Terminal - Embodiments of the present invention are directed to systems, apparatuses and methods for using a mobile device with an accelerometer to conduct a financial transaction by making contact with an interaction terminal, thereby generating interaction data that is representative of the physical contact between the mobile device and the interaction terminal. The mobile device may be a mobile phone. The interaction terminal may be a point of sale terminal, access point device, or any other stationary (i.e., in a fixed position) device positioned at a line, door, gate, or entrance. A mobile device with an accelerometer physically contacts the interaction terminal. The interaction terminal flexes, recoils, or moves and generates interaction data (e.g., accelerometer, location, time data, etc.) representative of the physical interaction between the mobile device and the interaction terminal. A server computer determines, based on interaction data, that the mobile device and the interaction terminal made physical contact. After determining that the mobile device and the interaction terminal made contact, communication may be initiated between the devices. Communications may relate to processing a payment transaction using a payment processing network. | 08-04-2011 |
20110187643 | USER INTERFACE SYSTEM BASED ON POINTING DEVICE - The user interaction system comprises a portable pointing device ( | 08-04-2011 |
20110193778 | DEVICE AND METHOD FOR CONTROLLING MOUSE POINTER - Disclosed is a device for controlling a mouse pointer, providing a display unit; an image photographing unit for photographing images of a first object and a second object; and a controller for setting a point between the first object and the second object detected from the photographed images as a position of a mouse pointer on the display unit, and when a distance between the first object and the second object is less than a predetermined distance, determining that a user selection instruction for the point has been input. The device detects movement of fingers using differential images according to the movement of the fingers, so that even when a continuously changing surrounding lighting or a user face having the similar color with the finger is included in a background, it is possible to accurately identify the movement of the fingers. | 08-11-2011 |
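The pointer-positioning and selection logic described above — a point between two detected objects as the cursor position, and a below-threshold separation as a click — can be sketched directly. The midpoint choice is an assumption; the abstract says only "a point between" the objects.

```python
import math

def pointer_state(finger_a, finger_b, click_distance):
    """Place the pointer at the midpoint between two tracked objects
    (e.g. fingertips) and report a selection when their separation
    drops below click_distance."""
    ax, ay = finger_a
    bx, by = finger_b
    pointer = ((ax + bx) / 2, (ay + by) / 2)
    clicked = math.dist(finger_a, finger_b) < click_distance
    return pointer, clicked

# Fingers 5 px apart with a 10 px threshold: pointer placed, click fired.
pos, clicked = pointer_state((100, 100), (104, 103), click_distance=10)
```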
20110199301 | Sensor-based Pointing Device for Natural Input and Interaction - A pointing or input device is generally cylindrical or puck-shaped, and has various sensors for sensing 2D, 3D, and high degree of freedom motion for more natural user interaction. | 08-18-2011 |
20110199302 | CAPTURING SCREEN OBJECTS USING A COLLISION VOLUME - A system is disclosed for providing a user a margin of error in capturing moving screen objects, while creating the illusion that the user is in full control of the onscreen activity. The system may create one or more “collision volumes” attached to and centered around one or more capture objects that may be used to capture a moving onscreen target object. Depending on the vector velocity of the moving target object, the distance between the capture object and target object, and/or the intensity of the collision volume, the course of the target object may be altered to be drawn to and captured by the capture object. | 08-18-2011 |
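The collision-volume attraction above — bending a moving target's course toward the capture object based on distance and volume intensity — can be sketched as a proximity-weighted velocity adjustment. The linear falloff used here is an assumption; the abstract does not specify the weighting.

```python
import math

def capture_pull(capture_pos, target_pos, target_vel, radius, intensity):
    """Inside a circular collision volume of the given radius centered on
    the capture object, bend the target's velocity toward the capture
    object in proportion to `intensity` and proximity (2-D sketch)."""
    d = math.dist(capture_pos, target_pos)
    if d >= radius or d == 0:
        return target_vel  # outside the volume: course unchanged
    pull = intensity * (1 - d / radius)  # stronger nearer the center
    to_capture = [(c - t) / d for c, t in zip(capture_pos, target_pos)]
    return [v + pull * u for v, u in zip(target_vel, to_capture)]

# A target drifting past, 1 unit from the capture object, is drawn toward it.
vel = capture_pull((0, 0), (1, 0), [0.0, 1.0], radius=2.0, intensity=1.0)
```

Applied every frame, small per-frame pulls accumulate into the "drawn to and captured" behavior while preserving the illusion of user control.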
20110199303 | DUAL WRIST USER INPUT SYSTEM - A dual wrist user input system includes a first wrist band that conforms to a first wrist of a user. The dual wrist user input system includes a motion tracking sensor that tracks aerial motion of the first wrist of the user as aerial motion data, the motion tracking sensor being adhered to the first wrist band. The aerial motion of the first wrist of the user is performed by the user to move an indicator displayed by a display screen operably connected to a computing device. Further, the dual wrist user input system includes a second wrist band that conforms to a second wrist of the user. In addition, the dual wrist user input system includes a rotational sensor that tracks rotational movement of the second wrist of the user as rotational movement data. The rotational sensor is adhered to the second wrist band. | 08-18-2011 |
20110199304 | Systems and Methods for Providing Enhanced Motion Detection - Provided are systems and methods for providing enhanced motion detection. One system providing enhanced motion detection includes a smart display, an interface subsystem including a human interface device (HID), and a console having a processor configured to form communication links with the smart display and the interface subsystem and to provide motion detection feedback, using the smart display, to a user of the HID, where the HID is configured to sense motion of the HID and utilize a predictive model to characterize the motion of the HID. One interface subsystem includes a camera to sense motion of a user of the HID. One processor is configured to negotiate a reduced response latency with the smart display. | 08-18-2011 |
20110199305 | MOUSE CONTROLLED BY MOVEMENTS OF FINGERS IN THE AIR - Provided is a new type of finger mouse that increases user convenience by detecting the fine movement of a finger moving freely in the air and using that fine movement information as the coordinate information of a computer mouse. Through the technology of the present disclosure, the user may easily and conveniently control a mouse pointer regardless of posture or place, free from the geometric and spatial limitations of wireless mice. | 08-18-2011 |
20110205157 | System and Method for Information Handling System Touchpad Enablement - A disabled information handling system integrated pointing device is automatically enabled based upon user inputs detected at the integrated pointing device, such as inputs that indicate a user is unsuccessfully attempting to use the integrated pointing device. For example, a touchpad disposed at a portable information handling system housing in a disabled state automatically transitions to an enabled state when user inputs include rapid movements, movements from end to end or movements of increasing speed. The touchpad automatically transitions to an enabled state based upon movements interpreted as an attempt by the user to make inputs at the touchpad. | 08-25-2011 |
20110216002 | Calibration of Portable Devices in a Shared Virtual Space - Methods, systems, and computer programs for generating an interactive space viewable through at least a first and a second device are presented. The method includes an operation for detecting from the first device a location of the second device or vice versa. Further, synchronization information data is exchanged between the first and the second device to identify a reference point in a three-dimensional (3D) space relative to the physical location of the devices in the 3D space. The devices establish the physical location in the 3D space of the other device when setting the reference point. The method further includes an operation for generating views of an interactive scene in the displays of the first and second devices. The interactive scene is tied to the reference point and includes virtual objects. The view in the display shows the interactive scene as observed from the current location of the corresponding device. Moving the device in the 3D space causes the view to change according to the perspective from the current location. | 09-08-2011 |
20110227825 | 3D Pointer Mapping - Systems, devices, methods and software are described for mapping movement or motion of a 3D pointing device into cursor position, e.g., for use in rendering the cursor on a display. Absolute and relative type mapping algorithms are described. Mapping algorithms can be combined to obtain beneficial characteristics from different types of mapping. | 09-22-2011 |
20110227826 | INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - There is provided an information processing apparatus including a position detection section which detects a position of an object, and a coordinate calculation section which calculates absolute coordinates based on the position of the object detected by the position detection section, and which calculates relative coordinates, which indicate a display position of the object on a screen, depending on the absolute coordinates and a motion of the object. The coordinate calculation section moves the relative coordinates in order for the relative coordinates to be asymptotic to or correspondent to the absolute coordinates based on a predetermined condition. | 09-22-2011 |
20110227827 | Interactive Display System - An interactive display system including a wireless pointing device including a camera or other video capture system. The pointing device captures images displayed by the computer, including one or more human-imperceptible positioning targets. The positioning targets are presented as patterned modulation of the intensity (e.g., variation in pixel intensity) in a display frame of the visual payload, followed by the opposite modulation in a successive frame. At least two captured image frames are subtracted from one another to recover the positioning target in the captured visual data and to remove the displayed image payload. The location, size, and orientation of the recovered positioning target identify the aiming point of the remote pointing device relative to the display. Another embodiment uses temporal sequencing of positioning targets (either human-perceptible or human-imperceptible) to position the pointing device. | 09-22-2011 |
20110234492 | Gesture processing - Presented are a method and system for processing a gesture performed by a user of an input device. The method comprises detecting the gesture and determining a distance of the input device from a predetermined location. A user command is then determined based on the detected gesture and the determined distance. | 09-29-2011 |
20110241986 | INTERACTIVE PROJECTION DEVICE - The subject invention relates to an interactive projection device. The interactive projection device includes a light source configured to emit an input light beam, wherein the light source comprises a visible light emitting device; a first beam splitter configured to split the input light beam into first and second split light beams; a second beam splitter configured to split a scattered light beam received from a surface into third and fourth split light beams; an image forming device configured to produce an image light beam based on the first split light beam and emit the image light beam onto the surface through the first and second splitters, thereby generating a projection image on the surface; and a detector configured to detect the invisible light of the third split light beam, thereby acquiring a scattering image from the surface. | 10-06-2011 |
20110241987 | INTERACTIVE INPUT SYSTEM AND INFORMATION INPUT METHOD THEREFOR - An interactive input system comprises at least one light source configured for emitting radiation into a region of interest; a bezel at least partially surrounding the region of interest and having a surface in the field of view of the at least one imaging device, where the surface absorbs the emitted radiation; and at least one imaging device having a field of view looking through a filter and into the region of interest and capturing image frames. The filter has a passband comprising a wavelength of the emitted radiation. | 10-06-2011 |
20110241988 | INTERACTIVE INPUT SYSTEM AND INFORMATION INPUT METHOD THEREFOR - An interactive input system includes at least one imaging device having a field of view looking into a region of interest and capturing images; at least one pen tool comprising an accelerometer configured to measure acceleration of the pen tool and to generate acceleration data, the pen tool configured to wirelessly transmit the acceleration data; and processing structure configured to process the images and acceleration data to determine the location of at least one pointer in the region of interest. | 10-06-2011 |
20110241989 | Remote touch panel using light sensor and remote touch screen apparatus having the same - A remote touch panel includes a plurality of light sensor cells arranged in two dimensions. Each light sensor cell may include a light-sensitive semiconductor layer and first and second electrodes electrically connected to the light-sensitive semiconductor layer. The remote touch panel may be controlled at a remote distance. For example, a large display apparatus can be easily controlled by using a simple light source device, for example, a laser pointer. | 10-06-2011 |
20110241990 | PROJECTION APPARATUS AND LOCATION METHOD FOR DETERMINING A POSITION OF A LIGHT POINT ON A PROJECTION IMAGE - A projection apparatus and a location method for determining a position of a light point on a projection image of the projection apparatus are provided. The projection apparatus comprises a lens, a light detector, a light guide module and a processing circuit. The light guide module is configured to receive the light point via the lens, and to guide the light point into the light detector. The light detector outputs a detection signal to the processing circuit according to the light point within a detection period. The processing circuit determines the position of the light point on the projection image according to the detection signal. | 10-06-2011 |
20110241991 | TRACKING OBJECT SELECTION APPARATUS, METHOD, PROGRAM AND CIRCUIT - Provided is a tracking object selection apparatus ( | 10-06-2011 |
20110254765 | Remote text input using handwriting - A method for user input includes capturing a sequence of positions of at least a part of a body, including a hand, of a user of a computerized system, independently of any object held by or attached to the hand, while the hand delineates textual characters by moving freely in a 3D space. The positions are processed to extract a trajectory of motion of the hand. Features of the trajectory are analyzed in order to identify the characters delineated by the hand. | 10-20-2011 |
20110260968 | 3D POINTING DEVICE AND METHOD FOR COMPENSATING ROTATIONS OF THE 3D POINTING DEVICE THEREOF - A 3D pointing device utilizing an orientation sensor, capable of accurately transforming rotations and movements of the 3D pointing device into a movement pattern in the display plane of a display device is provided. The 3D pointing device includes the orientation sensor, a rotation sensor, and a computing processor. The orientation sensor generates an orientation output associated with the orientation of the 3D pointing device associated with three coordinate axes of a global reference frame associated with the Earth. The rotation sensor generates a rotation output associated with the rotation of the 3D pointing device associated with three coordinate axes of a spatial reference frame associated with the 3D pointing device itself. The computing processor uses the orientation output and the rotation output to generate a transformed output associated with a fixed reference frame associated with the display device above. The transformed output represents a segment of the movement pattern. | 10-27-2011 |
20110267268 | BACKLIGHTING FOR OPTICAL FINGER NAVIGATION - An optical finger navigation (OFN) device includes an OFN sensor module, a light source, and a vertical light guide. The OFN sensor module is coupled to a circuit substrate. The OFN sensor module generates a navigation signal in response to a movement detected at a navigation surface based on light reflected from a user's finger. The light source is also coupled to the circuit substrate. The light source generates light (which is separate from the light generated for the OFN sensor module). The vertical light guide is disposed to circumscribe a perimeter of the OFN sensor module. The vertical light guide receives the light from the light source and guides the light toward a light emission surface at a perimeter surface area circumscribing the navigation surface. | 11-03-2011 |
20110267269 | HETEROGENEOUS IMAGE SENSOR SYNCHRONIZATION - A computer implemented method for synchronizing information from a scene using two heterogeneous sensing devices. Scene capture information is provided by a first sensor and a second sensor. The information comprises video streams including successive frames provided at different frequencies. Each frame is separated by a vertical blanking interval. A video output comprising a stream of successive frames each separated by a vertical blanking interval is rendered based on information in the scene. The method determines whether an adjustment of the first and second video stream relative to the video output stream is required by reference to the video output stream. A correction is then generated to at least one of said vertical blanking intervals. | 11-03-2011 |
20110273369 | ADJUSTMENT OF IMAGING PROPERTY IN VIEW-DEPENDENT RENDERING - An image is displayed by determining a relative position and orientation of a display in relation to a viewer's head, and rendering an image based on the relative position and orientation. The viewer's eye movement relative to the rendered image is tracked, with at least one area of interest in the image to the viewer being determined based on the viewer's eye movement, and an imaging property of the at least one area of interest is adjusted. | 11-10-2011 |
20110279367 | PRESENTER - A presenter includes a main body, a light source, a pointing input unit and a wireless transmission unit. The light source can be a laser diode or any other light-emitting element capable of emitting a laser beam. The pointing input unit can be a magnetically sensitive pointing unit or an optical cursor controller. The pointing input unit is operable to output a pointing control signal to a host via the wireless transmission unit. The presenter is capable of projecting a laser beam and controlling a cursor of the host device via the wireless transmission unit. | 11-17-2011 |
20110279368 | INFERRING USER INTENT TO ENGAGE A MOTION CAPTURE SYSTEM - Techniques are provided for inferring a user's intent to interact with an application run by a motion capture system. Deliberate user gestures to interact with the motion capture system are disambiguated from unrelated user motions within the system's field of view. An algorithm may be used to determine the user's aggregated level of intent to engage the system. Parameters in the algorithm may include posture and motion of the user's body, as well as the state of the system. The system may develop a skeletal model to determine the various parameters. If the system determines that the parameters strongly indicate an intent to engage the system, then the system may react quickly. However, if the parameters only weakly indicate an intent to engage the system, it may take longer for the user to engage the system. | 11-17-2011 |
20110279369 | HYBRID POINTING DEVICE - The present invention discloses a hybrid pointing device including an optical navigation module and a pointing module. The optical navigation module is configured to replace the conventional buttons of a conventional pointing device, such as an optical mouse or a trackball mouse. The optical navigation module is configured to sense gestures of at least one object operated by a user to activate commands associated with particular programs running on the host. Since the optical navigation module is configured only to sense gestures of the object and not the movement of the hybrid pointing device relative to a surface, its resolution need only be high enough for sensing gestures and need not be especially high. | 11-17-2011 |
20110285622 | RENDITION OF 3D CONTENT ON A HANDHELD DEVICE - A handheld device having a display and a front-facing sensor and a back-facing sensor is able to render 3D content in a realistic and spatially correct manner using position-dependent rendering and view-dependent rendering. In one scenario, the 3D content is only computer-generated content and the display on the device is a typical, non-transparent (opaque) display. The position-dependent rendering is performed using either the back-facing sensor or a front-facing sensor having a wide-angle lens. In another scenario, the 3D content is composed of computer-generated 3D content and images of physical objects and the display is either a transparent or semi-transparent display where physical objects behind the device show through the display. In this case, position-dependent rendering is performed using a back-facing sensor that is actuated (capable of physical panning and tilting) or is wide-angle, thereby enabling virtual panning. | 11-24-2011 |
20110285623 | Motion Sensing System - A motion sensing system includes a hand-held device and a receiver device. The hand-held device includes a microcontroller, a G-sensor (one 3-axis accelerometer), only one 2-axis gyroscope, and a wireless transmitter. The receiver device is preferably a dongle and includes a microcontroller and a wireless receiver. A first axis of the 2-axis gyroscope is parallel to the Z axis of the hand-held device and the second axis of the 2-axis gyroscope forms an acute angle α with the X axis of the hand-held device. The acute angle α allows the microcontroller of the receiver device to calculate rotational data around each of the three axes of the hand-held device. | 11-24-2011 |
20110285624 | SCREEN POSITIONING SYSTEM AND METHOD BASED ON LIGHT SOURCE TYPE - A screen positioning system includes a display, a light emitter, an image capture device and a computer. The light emitter emits light to a position on the display in which an object icon is displayed to form a projection point thereon. The image capture device captures whole images of the screen; and the computer determines the position or properties of the projection point, and further determines an operating command of the object icon corresponding to the position or properties of the projection point, and then implements the determined operating command and directs the display to display a result of implementing the operating command. | 11-24-2011 |
20110285625 | INFORMATION PROCESSING APPARATUS AND INPUT METHOD - According to one embodiment, an information processing apparatus comprises a touch-screen display, a detection module and an output module. The detection module is configured to detect a number of touch positions on the touch-screen display. The output module is configured to output first data indicative of a touch position on the touch-screen display in order to activate a function associated with a touched display object on the touch-screen display, the output module being configured to output second data indicative of a direction of movement and an amount of movement of a touch position on the touch-screen display, in place of the first data, in order to move a cursor on a screen of a display, if the detection module detects that a plurality of positions on the touch-screen display are touched. | 11-24-2011 |
20110285626 | GESTURE RECOGNIZER SYSTEM ARCHITECTURE - Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture (such as an arm acceleration for a throwing gesture) may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data. | 11-24-2011 |
20110291926 | Gesture recognition system using depth perceptive sensors - Acquired three-dimensional positional information is used to identify user created gesture(s), which gesture(s) are classified to determine appropriate input(s) to an associated electronic device or devices. Preferably, at at least one instance of a time interval, the posture of a portion of a user is recognized based on at least one factor such as shape, position, orientation, or velocity. Posture over each of the instances is recognized as a combined gesture. Because the acquired information is three-dimensional, two gestures may occur simultaneously. | 12-01-2011 |
20110291927 | Smart Method and Device for Adaptive User Interface Experiences - A wireless communication device ( | 12-01-2011 |
20110291928 | Multifunctional flexible handwriting board and method for manufacturing the same - The present invention discloses a multifunctional flexible handwriting board and its manufacturing method. The multifunctional flexible handwriting board comprises a name plate, a linear plate, a shock absorbing layer and a circuit board. The name plate is made of a flexible material and has an upper surface used as a mouse pad. The linear plate is made of a flexible material and has a sensing area at the middle. The shock absorbing layer is made of a soft material for protecting the linear plate. The circuit board is electrically coupled to the sensing area of the linear plate for transmitting information. The method of manufacturing the multifunctional flexible handwriting board is to couple the circuit board to an edge of the linear plate and sequentially couple the name plate, the linear plate and the shock absorbing layer. | 12-01-2011 |
20110291929 | COMPUTER READABLE STORAGE MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING SYSTEM - Correspondence data representing correspondence between a plurality of selection target objects and the attitude of an input device is stored in an information processing apparatus. In accordance with attitude/motion data acquired from the input device, the attitude of the input device is calculated. In accordance with the correspondence data, a selection target object corresponding to the attitude of the input device is selected thereby to perform a process based on the selection target object having been selected. | 12-01-2011 |
20110298708 | Virtual Touch Interface - A user may issue commands to a computing device by moving a pointer within a light field. Sensors may capture light reflected from the moving pointer. A virtual touch engine may analyze the reflected light captured as light portions in a sequence of images by the sensors to issue a command to a computing device in response to the movements. Analyzing the sequence of images may include finding the light portions in the sequence of images, determining a size of the light portions, and determining a location of the light portions. | 12-08-2011 |
20110298709 | SYSTEM AND METHOD FOR DIGITAL RECORDING OF HANDPAINTED, HANDDRAWN AND HANDWRITTEN INFORMATION - The present invention provides a method and system for recording hand-painted, hand-drawn and handwritten information defined by movement of a hand and/or fingers. The system corresponding to the invented method comprises: a computing device with a display; an input device comprising an end-point coupled to a force sensor, additional motion sensors, and an IC circuit for digitizing the information from the sensors and processing the data related to the force and motion vector components; and hardware and software for providing a digital description of how the device has been pressed to the surface and how the device has been moved. Besides the above-mentioned applications, the method and system can also be used for precise cursor navigation on the display, for computer gaming, as a universal remote control for electronic equipment and appliances, or as a security device with multi-level authentication. With the addition of several components the input device can be used as a smart cell-phone. | 12-08-2011 |
20110298710 | HAND-HELD POINTING DEVICE, SOFTWARE CURSOR CONTROL SYSTEM AND METHOD FOR CONTROLLING A MOVEMENT OF A SOFTWARE CURSOR - A hand-held pointing device for controlling a movement of a software cursor comprises an acceleration detector and an image capturing unit. The acceleration detector determines an inclination parameter, wherein the movement of the software cursor in a vertical direction is controllable based on the inclination parameter. Further, the image capturing unit records images within the visible spectral range, wherein the movement of the software cursor in a horizontal direction is controllable based on the recorded images. | 12-08-2011 |
20110304537 | AUTO-CORRECTION FOR MOBILE RECEIVER WITH POINTING TECHNOLOGY - A mobile station and an unattached work area are used with an electronic pen, which includes a transmitter, such as an acoustic transmitter. The mobile station includes a receiver that receives signals from the transmitter and orientation sensors that detect movement of the mobile station. The position of the receiver is calibrated with respect to the unattached work area. Data from the orientation sensors is received when the mobile station, and thus the receiver, is moved with respect to the work area. A transformation matrix is generated based on the data from the orientation sensors, which can be used to correct for the movement of the receiver. The position of the transmitter in the electronic pen is calculated and mapped based on received signals and the transformation matrix, and the mapped position is then displayed. | 12-15-2011 |
20110304538 | OPTICAL POINTING DEVICE AND ELECTRONIC DEVICE INCLUDING THE OPTICAL POINTING DEVICE - An inclined plane ( | 12-15-2011 |
20110304539 | REMOTE CONTROLLING APPARATUS AND METHOD FOR CONTROLLING THE SAME - Disclosed are a remote controlling apparatus and a method for controlling the same, the apparatus capable of intuitively and conveniently controlling a display apparatus and capable of simply and precisely performing a text input, by controlling activation of a text input unit based on tilt information of the remote controlling apparatus having a three-dimensional pointing function. The remote controlling apparatus for controlling a pointer displayed on a screen of a display apparatus by a three-dimensional operation includes a housing, a text input unit disposed on one surface of the housing, and configured to receive a text input, a sensing unit configured to detect a tilt and a motion of the remote controlling apparatus, and a controller configured to selectively operate a first mode to transmit the text input to the display apparatus or a second mode to transmit information on the detected motion of the remote controlling apparatus to the display apparatus based on the detected tilt of the remote controlling apparatus. | 12-15-2011 |
20110304540 | IMAGE GENERATION SYSTEM, IMAGE GENERATION METHOD, AND INFORMATION STORAGE MEDIUM - An image generation system includes an image information acquisition section that acquires image information from an image sensor, a motion information acquisition section that acquires motion information about an operator based on the image information from the image sensor, an object control section that moves an object in a movement area based on the motion information about the operator, and an image generation section that generates an image displayed on a display section. The object control section limits movement of the object so that the object does not move beyond a movement limiting boundary set in the movement area even when it has been determined that the object has moved beyond the movement limiting boundary based on the motion information. | 12-15-2011 |
20110304541 | METHOD AND SYSTEM FOR DETECTING GESTURES - A method and system for detecting user interface gestures. The method includes obtaining an image from an imaging unit; identifying an object search area of the images; detecting at least a first gesture object in the search area of an image of a first instance; detecting at least a second gesture object in the search area of an image of at least a second instance; and determining an input gesture from an occurrence of the first gesture object and the at least second gesture object. | 12-15-2011 |
20110310013 | INTERFACE APPARATUS AND METHOD FOR CONTACT-FREE SPACE INPUT/OUTPUT - A space input/output interface apparatus includes: a proximity sensor for sensing a movement of a user's wrist; an inertial sensor for sensing a movement of the user's arm; and a controller for generating user input/output interface recognition information corresponding to a sensing value of the proximity sensor or the inertial sensor. The apparatus is an armband-type space input/output interface apparatus which can be put on the user's wrist. | 12-22-2011 |
20110310014 | IMAGE PICKUP APPARATUS AND PROJECTION TYPE IMAGE DISPLAY APPARATUS - In an image pickup apparatus, a visible light cut filter allows infrared components to pass through, and blocks visible light components. A plurality of image pickup devices receive the light transmitted through the visible light cut filter such that a plurality of color components are received separately from each other. The visible light cut filter allows part of the visible light components to pass through such that a visible light component enters at least one of the plurality of image pickup devices. | 12-22-2011 |
20110316776 | INPUT DEVICE WITH PHOTODETECTOR PAIRS - Input devices configured to provide user interface by detecting three dimensional movement of an external object are disclosed. The input device comprises at least two photodetector pairs, a radiation source and a circuit configurable to detect differential and common mode signals generated in the photodetector pairs. By detecting the common mode and differential signals, movement of an external object may be determined and used to control a pointer, or a cursor. | 12-29-2011 |
20110316777 | ELECTRONIC DEVICE PROVIDING REGULATION OF BACKLIGHT BRIGHTNESS AND METHOD THEREOF - An electronic device includes a display module, a function key, a motion sensor, a determination module, and a regulation module. The motion sensor is operable to acquire a coordinate of the electronic device. The determination module is operable to determine whether the electronic device is in the upright position based on the coordinate. The regulation module regulates the backlight brightness of the display module when the electronic device is in the upright position, the function key is operative, and the specific application is not executed. | 12-29-2011 |
20110316778 | POINTING DEVICE, CONTROLLING METHOD OF THE SAME, GLASSES FOR 3D IMAGE, AND DISPLAY APPARATUS - A pointing device includes: a communication unit which performs communication with glasses for viewing a 3D image and a display apparatus; and a control unit which controls the communication unit to perform communication with the glasses, detects a reference distance between the glasses and the pointing device and a measurement distance between the glasses and the pointing device, and determines depth information for 3D pointing based on the reference distance and the measurement distance. | 12-29-2011 |
20120007804 | INTERACTIVE INPUT SYSTEM AND METHOD - A method of resolving ambiguities between at least two pointers within a region of interest comprises capturing images of the region of interest and at least one reflection thereof from different vantages using a plurality of imaging devices, processing image data to identify a plurality of targets for the at least two pointers, for each image, determining a state for each target and assigning a weight to the image data based on the state, and calculating a pointer location for each of the at least two pointers based on the weighted image data. | 01-12-2012 |
20120019443 | TOUCH SYSTEM AND TOUCH SENSING METHOD - A touch system comprises a transparent panel, a first image sensing module, a second image sensing module and a processing circuit. The first image sensing module is at least partially disposed on a first flat surface of the transparent panel for obtaining an image above the first flat surface. The second image sensing module is at least partially disposed under a second flat surface of the transparent panel for obtaining an image of the first flat surface through the second flat surface. When two pointers approach the first flat surface, the processing circuit calculates possible coordinates of the pointers according to the image obtained by the first image sensing module and calculates coordinates of the pointers according to the image obtained by the second image sensing module, so as to compare all of the coordinates to obtain actual coordinates of the pointers from the possible coordinates. | 01-26-2012 |
20120026088 | HANDHELD DEVICE WITH PROJECTED USER INTERFACE AND INTERACTIVE IMAGE - Systems and methods for a device with a user interactive image projector disposed in a distal end of the device from the user are described. In one aspect, the device is operatively configured to project at least a portion of a user interactive image on a projection surface separate from the device. The device locks at least a portion of the projected user interactive image with respect to the projection surface. Responsive to receiving user input, the device allows the user to navigate the user interactive image in accordance with the user input. | 02-02-2012 |
20120038552 | UTILIZATION OF INTERACTIVE DEVICE-ADJACENT AMBIENTLY DISPLAYED IMAGES - A telecommunication device configured to project an ambiently displayed image at a location proximate to the telecommunication device on a surface that is substantially parallel to a plane formed by a body of the telecommunication device is described herein. The telecommunication device is further configured to detect an interaction with the ambiently displayed image and perform an action based on the detected interaction. | 02-16-2012 |
20120038553 | THREE-DIMENSIONAL VIRTUAL INPUT AND SIMULATION APPARATUS - The present invention relates to a three-dimensional virtual input and simulation apparatus, and more particularly to an apparatus comprising a plurality of point light sources, a plurality of optical positioning devices with a visual axis tracking function, and a control analysis procedure. The invention is characterized in that the plurality of optical positioning devices with the visual axis tracking function are provided for measuring and analyzing 3D movements of the plurality of point light sources to achieve the effect of a virtual input and simulator. | 02-16-2012 |
20120044141 | INPUT SYSTEM, INPUT METHOD, COMPUTER PROGRAM, AND RECORDING MEDIUM - A position of a cursor | 02-23-2012 |
20120044142 | DISPLAY HAVING AUTO-OFF FUNCTION BY PERSON INFRARED SENSING - A display having an auto-off function by person infrared sensing includes a display unit, an infrared sensor module, a control unit and a power supply unit. The infrared sensor module senses whether or not a person is within a zone surrounding the display, and generates a sensing trigger signal when sensing no person within the zone. A processor of the control unit stops outputting a control signal when it continues to receive the sensing trigger signal for a predetermined period of time. The power supply unit supplies power to the control unit or to the display unit and the control unit when receiving the control signal, and enters a DC off mode to stop supplying power when not receiving the control signal. The present invention can automatically turn off the display when sensing no person within the zone for the predetermined period of time so as to achieve power saving. | 02-23-2012 |
20120056807 | POSITION SENSING SYSTEMS FOR USE IN TOUCH SCREENS AND PRISMATIC FILM USED THEREIN - A dual light-source position detecting system for use in touch screens is provided that utilizes parallax to determine the position of an interposing object, and a prismatic film that is brightly retroreflective over a broad entrance angle to enhance the accuracy of the parallax determination of position. The position detecting system includes at least one camera positioned to receive light radiation traversing a detection area and that generates a signal representative of an image; two spaced-apart sources of light radiation, which may be LEDs, or IR emitters positioned adjacent to the camera for outputting light radiation that overlaps over at least a portion of a detection area, and a prismatic film positioned along a periphery of at least a portion of the detection area that retroreflects the light radiation from the two sources to the camera. The prismatic film includes a plurality of triangular cube corner retroreflective elements having dihedral angle errors e | 03-08-2012 |
20120056808 | EVENT TRIGGERING METHOD, SYSTEM, AND COMPUTER PROGRAM PRODUCT - An event triggering method, system, and computer program product are provided, in which optical spot trajectory tracking is utilized. First, a cursor at a position on the surface of a screen is moved to a preset position through a first optical spot; the projecting of the first optical spot is then stopped. Next, it is determined whether the preset position is a preset position of a command. Then, a second optical spot is projected to generate a light-triggered signal, and a processing unit executes the command according to the light-triggered signal. | 03-08-2012 |
20120056809 | HANDHELD PORTABLE POINTING APPARATUS - A portable pointing apparatus utilized with a computing device such as a personal computer, laptop computer, and/or an Internet connected television. The pointing apparatus generally comprises a spin ball, a right click button, a left click button, and a hold button for performing various cursor movements and cursor operations on a display of the computing device. The hold button can be utilized to highlight data displayed on the computing device. The apparatus also includes a laser pointer for pointing out important data on the display. A tracking device generates a movement signal based on a movement of the pointing apparatus. The computing device receives a movement signal from a tracking system and controls the movement of a cursor displayed via the computing device. The pointing apparatus can be operated from any convenient location and can be of any shape or form for user comfort. | 03-08-2012 |
20120068925 | SYSTEM AND METHOD FOR GESTURE BASED CONTROL - Methods and apparatus are provided for gesture based control of a device. In one embodiment, a method includes detecting a first position sensor signal, the first position sensor signal detected by a first sensor, and detecting a second position sensor signal, the second position sensor signal detected by a second sensor, wherein the second position sensor signal is based on position of the second sensor relative to the first sensor. The method may further include generating a control signal for a device based on the first and second position sensor signals, wherein the first and second position sensor signals are generated based on user positioning of the first and second sensors. The method may further include transmitting the control signal to the device. | 03-22-2012 |
20120068926 | DISPLAY DEVICE WITH REVERSIBLE DISPLAY AND DRIVING METHOD THEREOF - A display device includes a control circuit, a data driving circuit, a gate driving circuit, and a display panel comprising a display region. The control circuit determines a located orientation of the display panel, sets a display-mode state parameter according to the located orientation of the display panel, and outputs control signals to the data driving circuit and the gate driving circuit according to the display-mode state parameter. The data driving circuit outputs data signals of an image along a first shift direction according to corresponding control signals to the display region, and the gate driving circuit outputs gate signals along a second shift direction according to corresponding control signals to the display region. | 03-22-2012 |
20120075182 | MOBILE TERMINAL AND DISPLAYING METHOD THEREOF - A mobile terminal and a control method thereof are provided. The mobile terminal may include a body, at least one display provided to one side of the body, at least one key button provided to the body to receive a user input, a sensing unit to sense a motion of the body, and a controller to cancel a lock screen displayed on the display based on the motion detected by the sensing unit when the user input is acquired through the key button. | 03-29-2012 |
20120075183 | 3D Pointing Devices with Orientation Compensation and Improved Usability - Systems and methods according to the present invention describe 3D pointing devices which enhance usability by transforming sensed motion data from a first frame of reference (e.g., the body of the 3D pointing device) into a second frame of reference (e.g., a user's frame of reference). One exemplary embodiment of the present invention removes effects associated with a tilt orientation in which the 3D pointing device is held by a user. | 03-29-2012 |
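The frame-of-reference transform in entry 20120075183 amounts, in the simplest roll-only case, to rotating the sensed cursor motion back by the tilt angle at which the device is held. The 2D sketch below assumes only roll is compensated; the application's transforms are more general:

```python
import math

def tilt_compensate(dx: float, dy: float, roll_radians: float):
    """Rotate motion sensed in the device's body frame into the user's
    frame by undoing the roll (tilt) angle at which the device is held."""
    c, s = math.cos(roll_radians), math.sin(roll_radians)
    return (c * dx + s * dy, -s * dx + c * dy)
```

With the device rolled 90 degrees, a purely horizontal body-frame motion comes out vertical in the user's frame, which is exactly the tilt effect the compensation removes.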
20120086636 | SENSOR RING AND INTERACTIVE SYSTEM HAVING SENSOR RING - This disclosure provides a sensor ring and an interactive system having the sensor ring. The interactive system includes the sensor ring, an RF receiver, an image-capture device and a signal processor. The sensor ring is adopted for wear on fingers or toes, and has a sensor module to produce a sensing signal and an RF transmitter for transmission of sensing signals. The image-capture device has a camera module used to produce a detection signal. The RF receiver receives the sensing signals of the ring. The signal processor processes the sensing signals of the ring and the detection signals of the image-capture device to produce an interactive operation. | 04-12-2012 |
20120086637 | SYSTEM AND METHOD UTILIZED FOR HUMAN AND MACHINE INTERFACE - The present invention discloses a system for human and machine interface. The system includes a 3-dimensional (3D) image capture device, for capturing a gesture of a motion object in a period of time; a hand-held inertial device (HHID), for transmitting a control signal; and a computing device. The computing device includes a system integration and GUI module, for compensating the control signal according to an image signal corresponding to the motion object, to generate a compensated control signal. | 04-12-2012 |
20120092254 | PROXIMITY SENSOR WITH MOTION DETECTION - A proximity sensor with movement detection is provided. The proximity sensor may provide a navigation function in response to movement of an object. The proximity sensor includes a driver operable to generate a current to a plurality of light sources in a particular timing sequence, a photo detector configured to receive light and generate an output signal, a controller configured to report the movement of an object near the proximity sensor if the output signal pattern generated matches one of the output signal patterns from among a set of known output signal patterns. The proximity sensor may be configured to provide a navigation operation when an object moves near the proximity sensor. | 04-19-2012 |
20120098744 | SYSTEMS, METHODS, AND APPARATUSES FOR SPATIAL INPUT ASSOCIATED WITH A DISPLAY - An exemplary system includes a handheld user input device configured to emit a pointing signal and a selection signal from within a physical user space and directed at a display screen. The exemplary system further includes a spatial input subsystem configured to detect the pointing signal, determine a physical position within the physical user space based on the detected pointing signal, map the determined physical position within the physical user space to a cursor position on the display screen, output data representative of the cursor position for use by a display subsystem associated with the display screen, detect the selection signal, and output, in response to the selection signal, data representative of a selection command for use by the display subsystem. Corresponding systems, methods, and apparatuses are also disclosed. | 04-26-2012 |
20120098745 | SYSTEM AND METHOD FOR HIGHLIGHTING MULTIMEDIA PRESENTATION - A projector includes an infrared camera. Infrared rays projected on a screen are detected, and the projector calculates coordinates of a center point of a projection area of the infrared ray projected on the screen. The coordinates of the center point are transformed to coordinates of a pixel point on the screen corresponding to a resolution of a projection lens of the projector. The projector determines the pixel point on the screen according to the transformed coordinates. A light source of the projector is controlled to highlight the pixel point on the screen. | 04-26-2012 |
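The coordinate transformation in entry 20120098745 (infrared-camera center point mapped to a pixel point matching the projection lens resolution) is, read as proportional scaling, a one-liner; the camera and screen resolutions below are illustrative parameters:

```python
def to_screen_pixel(cx: float, cy: float, cam_w: int, cam_h: int,
                    screen_w: int, screen_h: int):
    """Map the detected center point in infrared-camera coordinates to
    the pixel grid of the projected image by proportional scaling."""
    return (round(cx * screen_w / cam_w), round(cy * screen_h / cam_h))
```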
20120098746 | Optical Position Detection Apparatus - This invention is to provide an optical position detection apparatus including a retroreflective member ( | 04-26-2012 |
20120105325 | CAPACITIVE FINGER NAVIGATION INPUT DEVICE - A capacitive finger navigation input device uses a capacitive sensor array of capacitive sensing cells that includes only two capacitive sensing cells positioned along a linear direction. The capacitive finger navigation input device uses a drive circuit to drive at least one drive electrode of the capacitive sensor array and a sense circuit to sense mutual capacitance at each of the capacitive sensing cells of the capacitive sensor array to produce mutual capacitance signals, which are used to determine at least one of position and movement of a finger of a user with respect to the capacitive sensor array. The capacitive finger navigation input device may be used in a hand-held computing device and in a method for performing finger navigation. | 05-03-2012 |
20120105326 | METHOD AND APPARATUS FOR GENERATING MOTION INFORMATION - Provided are a method and an apparatus for generating motion information relating to motion of an object to provide reaction of a user interface to the motion of the object. The method for generating the motion information includes: detecting depth information of the object using an image frame acquired by capturing the object through a sensor; generating the motion information by compensating a determined size of the motion of the object in the acquired image frame based on the detected depth information; and generating an event corresponding to the generated motion information. | 05-03-2012 |
20120113002 | METHOD AND APPARATUS FOR CONTROLLING AN OUTPUT DEVICE OF A PORTABLE ELECTRONIC DEVICE - According to embodiments described in the specification, a method and apparatus are provided for controlling an output device of a portable electronic device comprising a processor, a first motion sensor, a second motion sensor and an output device. The method comprises: receiving at the processor, from the first motion sensor, first motion data representing movement of an external object relative to the portable electronic device; receiving at the processor, from the second motion sensor, second motion data representing movement of the portable electronic device; generating, at the processor, third motion data based on the first and second motion data, the third motion data representing movement of the external object; and, controlling the output device based on the third motion data. | 05-10-2012 |
20120113003 | DISPLAY SURFACE AND CONTROL DEVICE COMBINED THEREWITH FOR A DATA PROCESSING SYSTEM - A combined display surface and a control device for a data processing system, wherein the position of a light beam hitting the display surface is measured and the measured result is used by the data processing system as a basis for determining a cursor position on the display surface. Several strip-shaped optical position detectors are arranged along the edge of the display surface, the measured signals of which are fed into the data processing system. The cross-sectional shape of the indicator beam is formed by several lines which extend across both the display surface and the position detectors arranged thereon. The optical position detectors are formed by a layered structure made of organic material. | 05-10-2012 |
20120113004 | COMPUTER READABLE RECORDING MEDIUM RECORDING IMAGE PROCESSING PROGRAM AND IMAGE PROCESSING APPARATUS - Displayed region size data indicating a size of a screen of a display device, or a size of a region in which an image of a virtual space is displayed on the screen, is obtained. Distance data indicating a distance between a user and the display device is obtained. A position and an angle of view of the virtual camera in the virtual space are set based on the displayed region size data and the distance data. | 05-10-2012 |
20120113005 | DUAL POINTER MANAGEMENT METHOD USING COOPERATING INPUT SOURCES AND EFFICIENT DYNAMIC COORDINATE REMAPPING - The pointer management technology establishes a protocol and method for dual pointer management in both absolute input mode and relative input mode. The method defines a set of properties/constraints for contextual dynamic remapping between input sensor coordinates and target screen coordinates. The remapping of the left pointer (respectively the right pointer) depends on the position of the right pointer (respectively the left pointer) in the target screen space. This inter-dependence enables a more flexible and more powerful interaction as it exploits the contextual layout to re-estimate the remapping transformations at each instant. | 05-10-2012 |
20120119990 | POINTER CONTROL DEVICE, SYSTEM AND METHOD - A pointer control system includes a pointer control device and a pointer sensing device. The pointer control device includes two coils. The two coils respectively transmit a first signal and a second signal to the pointer sensing device, so that the pointer sensing device senses a first coordinate and a second coordinate. The pointer sensing device outputs a control signal corresponding to a distance between the first coordinate and the second coordinate. Therefore, the pointer control system can use an angle formed between the pointer control device and the pointer sensing device to control the operation mode of the pointer. | 05-17-2012 |
20120119991 | 3D GESTURE CONTROL METHOD AND APPARATUS - A 3D gesture control method is provided. The method includes the steps of: obtaining a series of images by a stereo camera; recognizing a control article in the images and acquiring 3D coordinates of the control article; determining the speed of the control article according to the 3D coordinates of the control article; and operating a visible object according to the speed. | 05-17-2012 |
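The speed determination in entry 20120119991 follows directly from consecutive 3D coordinates of the control article; a minimal sketch, assuming two samples taken a known `dt` seconds apart:

```python
import math

def control_speed(p1, p2, dt: float) -> float:
    """Speed of the control article between two 3D coordinate samples
    taken dt seconds apart: Euclidean displacement over elapsed time."""
    dist = math.sqrt(sum((b - a) ** 2 for a, b in zip(p1, p2)))
    return dist / dt
```

The resulting speed can then drive the visible object's operation, e.g. by thresholding slow versus fast gestures.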
20120119992 | INPUT SYSTEM, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, AND SPECIFIED POSITION CALCULATION METHOD - An example input system calculates a specified position on a screen of a display device, the position being specified by an operating device. The input system includes an attitude calculation section, an identification section, and a specified position calculation section. The attitude calculation section calculates an attitude of the operating device. The identification section identifies one of a plurality of display devices toward which the operating device is directed, based on the attitude of the operating device. The specified position calculation section calculates a specified position in accordance with the attitude of the operating device as a position on a screen of the display device identified by the identification section. | 05-17-2012 |
20120127073 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus and a control method thereof, are provided. The display apparatus includes: a camera which detects a coordinate of light projected on a screen by a pointing device projecting a light beam; an image processor which processes an image to be displayed on the screen, the image corresponding to the coordinate of the light detected by the camera; an optical filter which is disposed on a path of the light entering the camera to be detected by the camera and transmits light having a first wavelength to the camera by reducing an amount of the light at a preset ratio; and a controller which determines a characteristic of the detected light according to a wavelength based on a brightness level of the light entering through the optical filter and detected by the camera. | 05-24-2012 |
20120127074 | SCREEN OPERATION SYSTEM - A screen operation system obtains operation information associated with an operation performed by a user with a pointing object relative to a screen of an image display apparatus controlled by an information processing apparatus and causes the information processing apparatus to execute processing associated with the operation information. The screen operation system includes a mobile information apparatus including a camera capturing an image of the screen of the image display apparatus; a display displaying the image captured by the camera; and a communicator communicating information with the information processing apparatus. The camera of the mobile information apparatus captures an image such that the pointing object is displayed overlapping a predetermined position on the screen of the image display apparatus. Based on captured image information obtained thereby, operation information is obtained. | 05-24-2012 |
20120133584 | Apparatus and method for calibrating 3D position in 3D position and orientation tracking system - An apparatus and method for calibrating a 3D position in a 3D position and orientation tracking system are provided. The apparatus according to an embodiment may track the 3D position and the 3D orientation of a remote device in response to a detection of a pointing event, may acquire positions pointed to by the laser beams, in response to the detection of the pointing event, may generate a 3D reference position, based on information about the pointed to positions and the tracked 3D orientation, may calculate an error using the reference position and the tracked 3D position, and may calibrate the 3D position to be tracked, using the error. | 05-31-2012 |
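The calibration loop in entry 20120133584 (an error computed between a generated 3D reference position and the tracked position, then used to calibrate subsequent tracking) can be sketched as a stored per-axis offset; the class and method names are illustrative, not from the application:

```python
class PositionCalibrator:
    """Hold the offset between a 3D reference position generated at a
    pointing event and the position the tracker reported, and apply it
    to subsequent tracked positions."""

    def __init__(self):
        self.error = (0.0, 0.0, 0.0)

    def calibrate(self, tracked, reference):
        # Error = reference position minus tracked position, per axis.
        self.error = tuple(r - t for r, t in zip(reference, tracked))

    def correct(self, tracked):
        # Shift later tracked positions by the stored error.
        return tuple(t + e for t, e in zip(tracked, self.error))
```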
20120133585 | Apparatus and method for controlling object - An apparatus and method for controlling an object are provided. Motions of fingers present in a 3-dimensional (3D) sensing area are detected, and a pointer or an object being displayed is controlled corresponding to the detected motions. Therefore, input of a control signal may be achieved without a direct touch on a display device such as a terminal, thereby preventing leaving marks on a screen of the display. In addition, since the motions of fingers are detected within the 3D sensing area, not on a 2-dimensional (2D) plane, more types of input motions may be used. | 05-31-2012 |
20120133586 | INPUT DEVICE - An input device comprises a rotatable magnet member, a lower casing and an upper casing. The rotatable magnet member has a magnet, a spherical part and a rod-like shaft. The magnet is formed into a ring-shaped disc with a through hole in the center, and magnetized with N and S poles alternately along the circumferential direction. The rod-like shaft having a flange portion near a first end is inserted in the through hole, and both ends protrude from the through hole. The lower casing rotatably supports the rod-like shaft of the rotatable magnet member. There is a space provided between an outer peripheral surface of a part of the rod-like shaft inside the through hole and an inner wall of the through hole over the entire peripheries, and the flange portion is press-fitted and fixed to the spherical part. | 05-31-2012 |
20120139838 | APPARATUS AND METHOD FOR PROVIDING CONTACTLESS GRAPHIC USER INTERFACE - Disclosed herein are an apparatus and method for providing a contactless Graphical User Interface (GUI). The apparatus for providing a contactless GUI includes a basic information management unit, a pointer tracking unit, and a mouse event management unit. The basic information management unit receives finger image information, and generates basic pointer information for a mouse pointer service. The pointer tracking unit analyzes the finger image information based on the basic pointer information, and generates a mouse operation event by tracking a mouse operation in order to control a mouse pointer based on results of the analysis. The mouse event management unit analyzes the mouse operation event, and generates an operation message corresponding to the mouse operation. The pointer tracking unit tracks the mouse operation by calculating the movement distance of the tips of a thumb and an index finger based on the finger image information. | 06-07-2012 |
20120139839 | METHODS AND SYSTEMS FOR MEDIA ANNOTATION, SELECTION AND DISPLAY OF ADDITIONAL INFORMATION ASSOCIATED WITH A REGION OF INTEREST IN VIDEO CONTENT - Methods, systems, and processor-readable media for selecting a region within a particular frame of video content to access additional information about an area of interest associated with the region within the particular frame, and displaying the additional information, in response to selecting the region associated with the particular frame of video content to access the additional information about the area of interest associated with the region within the particular frame. A selection packet can be generated, which includes frame selection data associated with the particular frame of video content. The frame selection data can include data that is sufficient to identify the particular frame of video content. | 06-07-2012 |
20120146902 | ORIENTING THE POSITION OF A SENSOR - Techniques are provided for re-orienting a field of view of a depth camera having one or more sensors. The depth camera may have one or more sensors for generating a depth image and may also have an RGB camera. In some embodiments, the field of view is re-oriented based on the depth image. The position of the sensor(s) may be altered to change the field of view automatically based on an analysis of objects in the depth image. The re-orientation process may be repeated until a desired orientation of the sensor is determined. Input from the RGB camera might be used to validate a final orientation of the depth camera, but is not required during the process of determining a new possible orientation of the field of view. | 06-14-2012 |
20120146903 | GESTURE RECOGNITION APPARATUS, GESTURE RECOGNITION METHOD, CONTROL PROGRAM, AND RECORDING MEDIUM - A gesture recognition apparatus has a temperature sensor in which a plurality of infrared sensors are arranged, a change region specifying unit that specifies a change region where a temperature change is generated as a region indicating a hand based on a temperature detected by each infrared sensor of the temperature sensor, and a gesture recognition unit that specifies a movement locus of the change region specified by the change region specifying unit and recognizes a gesture of the hand. | 06-14-2012 |
20120146904 | APPARATUS AND METHOD FOR CONTROLLING PROJECTION IMAGE - Disclosed are an apparatus and a method related to a user interface that allows a user to easily use services provided by equipment when output images of video equipment are projected onto an external screen by a projector. According to exemplary embodiments of the present invention, a human body may be used as a pointer to form a shadow, and the shadow may be analyzed to control the projection image. Therefore, the projection image can be controlled without using an interface device, which provides an easy and intuitive user interface to a user. | 06-14-2012 |
20120146905 | TILT DIRECTION DETECTOR FOR ORIENTING DISPLAY INFORMATION - An electronic apparatus having a display function is able to alter the orientation of an image displayed on a display means for displaying images between a first orientation and a second orientation different from the first orientation. A plurality of operating means are provided at positions symmetrical between disposal positions which take the first orientation as a standard orientation and disposal positions which take the second orientation as a standard orientation. | 06-14-2012 |
20120146906 | PROMOTABLE INTELLIGENT DISPLAY DEVICE AND PROMOTING METHOD THEREOF - A promotable intelligent display device includes: an image output means outputting character and promotion images; a camera taking pictures of people; an image recognition means receiving the captured image and outputting contextual information on the position, moving direction, distance, number, and size of people; a controlling means specifying a promotion target at time intervals based on the contextual information, determining driving information on the direction and speed of rotation of the image output means based on the information on moving direction and position of the promotion target, and determining promotion activities by the distance, number, and size of the promotion target; a voice output means outputting promotion and character voices based on the promotion activities; and a driving means connected with the image output means by a connection shaft to enable the rotation of the image output means by controlling the shaft based on the driving information and promotion activities. | 06-14-2012 |
20120154276 | REMOTE CONTROLLER, REMOTE CONTROLLING METHOD AND DISPLAY SYSTEM HAVING THE SAME - Disclosed are a remote controller, a remote controlling method, and a display system having the same. An image displayed on a display apparatus may be converted by using one remote controller. The display system may convert the image by using one remote controller, without implementing a touch screen on the display apparatus or using two or more remote controllers for multi-touch. This may enhance a user's convenience and reduce fabrication costs. | 06-21-2012 |
20120154277 | OPTIMIZED FOCAL AREA FOR AUGMENTED REALITY DISPLAYS - A method and system that enhances a user's experience when using a near eye display device, such as a see-through display device or a head mounted display device is provided. An optimized image for display relative to the field of view of a user in a scene is created. The user's head and eye position and movement are tracked to determine a focal region for the user. A portion of the optimized image is coupled to the user's focal region in the current position of the eyes, a next position of the head and eyes is predicted, and a portion of the optimized image is coupled to the user's focal region in the next position. | 06-21-2012 |
20120162073 | APPARATUS FOR REMOTELY CONTROLLING ANOTHER APPARATUS AND HAVING SELF-ORIENTATING CAPABILITY - A remote control apparatus for communicating with a target device includes: a sensing portion for sensing points of user contact with the apparatus, user gestures, and an acceleration value of the apparatus; a transmitting device for sending signals representative of user commands to the target device; a controller; and a memory including instructions for configuring the controller to perform a self-orientation process based upon at least one of the acceleration value and the points of user contact to determine a forward direction of a plane of operation for defining the user gestures. An axis of the determined plane of operation substantially intersects the apparatus at any angle. | 06-28-2012 |
20120162074 | USER INTERFACE APPARATUS AND METHOD USING TWO-DIMENSIONAL IMAGE SENSOR - There are provided a user interface apparatus and method using a two-dimensional (2D) image sensor capable of setting an object for an interface from a user image detected by the 2D image sensor and performing a user input accordingly. The user interface apparatus includes: a two-dimensional (2D) image sensor; and a user interface unit detecting a movement of a user captured by the 2D image sensor, setting the detected movement as an object, and outputting the movement of the object detected by the 2D image sensor to a system in which the user intends to establish an interface. | 06-28-2012 |
20120162075 | ULTRA THIN OPTICAL POINTING DEVICE AND PERSONAL PORTABLE DEVICE HAVING THE SAME - An optical pointing device including a Printed Circuit Board (PCB); an infrared Light Emitting Diode (LED) provided on a side of the PCB; a cover plate for detecting motion of a subject, which is placed in an upper portion of the optical pointing device; an image forming system lens placed below the cover plate and configured to condense a light reflected from the subject; an optical image sensor for receiving a reflected image of the subject and detecting motion of the subject; and an optical coating unit placed on the optical image sensor. The cover plate and the optical coating unit are made of a material capable of passing a wavelength of infrared rays which cannot be perceived by a user's eye. A first cut off wavelength of the cover plate is shorter than a second cut off wavelength of the optical coating unit. | 06-28-2012 |
20120169594 | ELECTRONIC DEVICE AND METHOD FOR BACKLIGHT CONTROL - An electronic device includes a display screen, a backlight for illuminating the display screen, and a timer for indicating the expiration of a given time period, wherein when the backlight is first turned on, the timer is initialized to indicate if the expiration of a first predetermined illumination time period occurs. An image capture component is operable to acquire one or more images of a scene in front of the display screen; and a processor is operable to analyze the one or more images to determine whether a user is present in front of the display screen, to control the timer if a user is determined to be present such that the timer is reset to indicate if the expiration of a second predetermined illumination time period occurs, and to control the backlight to dim it or turn it off if the timer indicates an expiration has occurred. | 07-05-2012 |
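The timer logic described in entry 20120169594 — a first illumination period on power-on, a reset to a second period whenever a user is detected in front of the screen, and dimming on expiry — can be sketched as follows. This is a minimal illustration, not the application's implementation; the class name, period values, and injectable clock are assumptions.

```python
import time

class BacklightController:
    """Sketch of the backlight timer in entry 20120169594 (illustrative).
    The backlight starts with a first timeout; if image analysis finds a
    user in front of the screen, the timer is reset to a second timeout;
    on expiry the backlight is dimmed or turned off."""

    def __init__(self, first_period=10.0, second_period=30.0, clock=time.monotonic):
        self.first_period = first_period    # initial illumination period
        self.second_period = second_period  # period applied after user detection
        self.clock = clock
        self.backlight_on = False
        self.deadline = None

    def turn_on(self):
        # Backlight first turned on: initialize the timer with the first period.
        self.backlight_on = True
        self.deadline = self.clock() + self.first_period

    def on_frame(self, user_present):
        """Called once per captured image after presence analysis."""
        if not self.backlight_on:
            return
        if user_present:
            # User detected: restart the timer with the second period.
            self.deadline = self.clock() + self.second_period
        elif self.clock() >= self.deadline:
            self.backlight_on = False  # expiration: dim or turn off
```

A fake clock makes the expiry behavior easy to exercise without real delays.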
20120169595 | Operation control device - An operation control device includes a housing, a control module having a movable operating device carried on a carrier frame in the housing and partially exposed to the outside and rotatable and axially slidable by the user, a circuit module, which includes a rotation sensor module for sensing the direction and amount of rotation of the movable operating device and producing a respective control signal and magnetic sensors for sensing the direction and amount of axial displacement of the movable operating device in a non-contact manner. The human-friendly design of the operation control device facilitates cursor control, assuring high operation stability and comfort. | 07-05-2012 |
20120169596 | METHOD AND APPARATUS FOR DETECTING A FIXATION POINT BASED ON FACE DETECTION AND IMAGE MEASUREMENT - The present invention provides an apparatus for detecting a fixation point based on face detection and image measurement, comprising: a camera for capturing a face image of a user; a reference table acquiring unit for acquiring a reference table comprising relations between reference face images and line-of-sight directions of the user; and a calculating unit for performing image measurement based on the face image of the user captured by the camera and looking up the reference table in a reference table acquiring unit, so as to calculate the fixation point of the user on the screen. | 07-05-2012 |
20120176311 | Optical pointing system and method - Optical pointing systems, devices and methods are provided wherein a selected area on a surface is illuminated with a spot of light generated by an optical pointer, the spot being substantially invisible to unassisted human vision. The spot of light is detected and its position determined via an optical sensor, and a visible marker representing the selected area is provided on the surface, under control of an electronic interface. Surfaces of physical objects as well as displayed images are accommodated, and systems, devices and methods are provided for independent operation, as well as for integrated operation with electronic display and presentation systems. | 07-12-2012 |
20120176312 | POINTING DEVICE USING CAPACITIVE SENSING - A pointing device includes: an operator supported above a substrate to be horizontally movable from an initial position; a conductive layer formed on the substrate and formed with conductive patterns spaced apart from each other to form a ring-shaped structure that is disposed coaxially around the operator when the operator is at the initial position; a conductor plate spaced apart from the conductive patterns, mounted coaxially on a bottom end of the operator, and having a diameter larger than an inner diameter of the ring-shaped structure; and a control unit providing a corresponding electrical signal to each conductive pattern during movement of the operator to measure a capacitance generated between each conductive pattern and the conductor plate so as to generate a sensing output corresponding to the measured capacitances, and generating a pointer signal corresponding to the movement of the operator based on the sensing output and predetermined capacitance information. | 07-12-2012 |
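Entry 20120176312 derives a pointer signal from per-segment capacitances between a conductor plate and a ring of conductive patterns. A toy reconstruction of that idea: treat each ring segment as sitting at a known angle, and take the displacement direction as the capacitance-change-weighted sum of segment directions. The geometry and weighting are assumptions for illustration, not the application's actual signal processing.

```python
import math

def pointer_vector(capacitances, baseline):
    """Toy pointer reconstruction for entry 20120176312 (illustrative).
    Each ring segment i sits at a known angle; when the operator plate
    shifts toward a segment, that segment's capacitance grows relative
    to the baseline measured at the initial position. The weighted sum
    of segment directions approximates the displacement direction."""
    n = len(capacitances)
    x = y = 0.0
    for i, (c, c0) in enumerate(zip(capacitances, baseline)):
        angle = 2 * math.pi * i / n   # angular position of segment i
        delta = c - c0                # capacitance change vs. initial position
        x += delta * math.cos(angle)
        y += delta * math.sin(angle)
    return x, y
```

With the operator at its initial position all deltas are zero and the output vector vanishes, matching the abstract's notion of a ring disposed coaxially around the operator at rest.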
20120176313 | DISPLAY APPARATUS AND VOICE CONTROL METHOD THEREOF - A voice-controllable display apparatus is provided. The display apparatus includes a display unit which displays a plurality of icons on a screen, a control unit which controls the display unit to display identifiers corresponding to the plurality of icons, the identifiers being assigned to the plurality of icons based on a preset standard when voice recognition starts, and the identifiers being different from each other, and a voice input unit which receives a voice input. If a voice input for an arbitrary identifier is received through the voice input unit, the control unit selects the icon to which that identifier is assigned. Thereby, effective voice control of the apparatus is achieved. | 07-12-2012 |
20120176314 | METHOD AND SYSTEM FOR CONTROLLING MOBILE DEVICE BY TRACKING THE FINGER - A method and system control a mobile device by tracking the movement of a finger. A finger mode, in which the mobile device is controlled by tracking the movement of a finger, is activated. The finger is detected and the movement of the detected finger is tracked via a camera. When the tracked movement of the detected finger corresponds to a preset motion pattern, a function corresponding to the preset motion pattern is performed. A number of application programs can be controlled respectively by tracking the movement of fingers, via one of a number of input means. | 07-12-2012 |
20120194432 | PORTABLE ELECTRONIC DEVICE AND METHOD THEREFOR - An electronic device includes an object sensor for detecting motion of an object, such as a stylus or finger, relative to the device during a period of contactless object movement. A motion sensor, such as an accelerometer, detects device motion during the period of contactless object movement. A processor determines a gesture that corresponds to the movement of the object and to movement of the device. This device, and the associated method, results in a more accurate determination of an intended gesture, such as a three-dimensional gesture. For example, the processor, or gesture determinator, can compensate for movement of the device when determining the gesture corresponding to detected contactless movement of the object. | 08-02-2012 |
20120194433 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM - There is provided an image processing apparatus comprising: an input unit configured to input object data including reflection characteristics of an object; an acquisition unit configured to acquire observation position data indicating the observation position of an observer and light source data indicating a surrounding light source around the image capture unit, on the basis of image data captured by the image capture unit; and a generation unit configured to generate image data, which image includes the object placed on the display unit, on the basis of the object data, the observation position data, and the light source data, wherein the image data indicate the image which is observed at the observation position when light from the light source is reflected by the object. | 08-02-2012 |
20120200499 | AR GLASSES WITH EVENT, SENSOR, AND USER ACTION BASED CONTROL OF APPLICATIONS RESIDENT ON EXTERNAL DEVICES WITH FEEDBACK - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes event, sensor, and user action based control of applications resident on external devices with feedback. | 08-09-2012 |
20120200500 | INFORMATION PROCESSING PROGRAM - A game system comprises image obtaining means, direction calculation means, first rotation means, and display control means. The image obtaining means obtains an image taken by an imaging device. The direction calculation means calculates a direction originally determined from the image of an imaging target included in the image taken by the imaging device. The first rotation means rotates an operation target in accordance with the calculated direction. The display control means generates an image in accordance with the rotation of the operation target performed by the first rotation means and displays the generated image on a screen of a display device. | 08-09-2012 |
20120206348 | DISPLAY DEVICE AND METHOD OF CONTROLLING THE SAME - A display device and a method of controlling the same are provided. The display device comprises: a camera for acquiring an image comprising a gesture taken by a user; and a controller for extracting the gesture from the image acquired by the camera and for setting a specific point of an object in which the gesture is performed as a reference point when the gesture corresponding to acquisition of a control right is comprised in the extracted gesture. Therefore, by setting a specific point of an object in which the gesture corresponding to acquisition of a control right is performed as a reference point, a gesture taken by a user can be accurately and effectively recognized. | 08-16-2012 |
20120206349 | UNIVERSAL STYLUS DEVICE - A stylus device receives light from a display through an optical element that is adapted to increase the field curvature of an image formed on an image sensor of the stylus device. Based on the size and shape of a portion of the image that is in focus, a distance, orientation, and/or azimuth of the stylus device with respect to the display can be determined. In addition, a position corresponding to each pixel, or groups of pixels, is encoded into blue light emitted by each pixel or group of pixels of the display. Upon initialization, or after a loss of synchronization, the stylus device can determine its position with respect to the pixels by decoding the encoded position. After synchronizing its position with the display, the stylus device can determine its subsequent positions by tracking pixels of the display. | 08-16-2012 |
20120206350 | Device Control of Display Content of a Display - Methods, apparatuses and systems of providing control of display content on a display with a device are disclosed. One method includes establishing a fixed reference on the display. A user input is received indicating that the device is at a user selected position corresponding to the fixed reference and capturing a position of the device in order to establish a corresponding reference position. The display content on the display is determined based on measured displacement of the device relative to the established reference position. | 08-16-2012 |
20120206351 | INFORMATION PROCESSING APPARATUS, COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING SYSTEM - A first housing including an orientation detection section for detecting an orientation, and a second housing including a screen section for displaying a predetermined image are connected to each other such that the relative orientation of the second housing with respect to the first housing can be changed. The relative orientation of the second housing is estimated based on a value obtained by adding a predetermined offset to detected data detected by the orientation detection section, and predetermined display processing is performed for the screen section, based on the relative orientation of the second housing. | 08-16-2012 |
20120206352 | IMAGE-CAPTURING DEVICE FOR OPTICAL POINTING APPARATUS - An image-capturing device configured for an optical pointing apparatus includes a plurality of image-sensing units arranged adjacently. The plurality of image-sensing units are configured to sense images of a surface and generate sensing signals that are used for evaluating the velocity of the optical pointing apparatus. The image-capturing device is configured to use different image-sensing units arranged differently to sense the surface according to the velocity of the optical pointing apparatus. When the optical pointing apparatus moves at a first velocity, the image-capturing device uses the image-sensing units configured to occupy a smaller area to sense the surface. When the optical pointing apparatus moves at a second velocity, the image-capturing device uses the image-sensing units configured to occupy a larger area to sense the surface. The first velocity is lower than the second velocity. | 08-16-2012 |
20120206353 | HYBRID POINTING DEVICE - The present invention discloses a hybrid pointing device including an optical navigation module and a pointing module. The optical navigation module is configured to replace the conventional buttons of a conventional pointing device, such as an optical mouse or a trackball mouse. The optical navigation module is configured to sense gestures of at least one object operated by a user to activate commands associated with particular programs running on the host. Since the optical navigation module is configured to sense only gestures of the object and not the movement of the hybrid pointing device relative to a surface, its resolution need only be high enough for sensing gestures and need not be relatively high. | 08-16-2012 |
20120212413 | Method and System for Touch-Free Control of Devices - The present invention provides a system and computerized method for receiving image information and translating it to computer inputs. In an embodiment of the invention, image information is received for a predetermined action space to identify an active body part. From such image information, depth information is extracted to interpret the actions of the active body part. Predetermined gestures can then be identified to provide input to a computer. For example, gestures can be interpreted to mimic computerized touchscreen operation. Touchpad or mouse operations can also be mimicked. | 08-23-2012 |
20120212414 | AR GLASSES WITH EVENT AND SENSOR TRIGGERED CONTROL OF AR EYEPIECE APPLICATIONS - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes an event and sensor triggered control of eyepiece applications. | 08-23-2012 |
20120212415 | INTERACTIVE SYSTEM, METHOD FOR CONVERTING POSITION INFORMATION, AND PROJECTOR - A position information converting device in an interactive system comprising: a conversion control section which determines, if an image formed by an optical signal from an object in the neighborhood of the projection surface is detected within the projection image included in the captured image data, that the predetermined manipulation has been performed, and uses the position conversion information stored in the position conversion information storing section to convert position information representing the position where the predetermined manipulation has been performed into a position on the image based on the image signal. | 08-23-2012 |
20120218184 | ELECTRONIC FINGER RING AND THE FABRICATION THEREOF - The present invention provides an electronic finger ring ( | 08-30-2012 |
20120223884 | SYSTEM AND METHOD TO DISPLAY CONTENT - An apparatus and method for displaying content is disclosed. A particular method includes determining a viewing orientation of a user relative to a display and providing a portion of content to the display based on the viewing orientation. The portion includes at least a first viewable element of the content and does not include at least one second viewable element of the content. The method also includes determining an updated viewing orientation of the user and updating the portion of the content based on the updated viewing orientation. The updated portion includes at least the second viewable element. A display difference between the portion and the updated portion is non-linearly related to an orientation difference between the viewing orientation and the updated viewing orientation. | 09-06-2012 |
20120223885 | IMMERSIVE DISPLAY EXPERIENCE - A data-holding subsystem holding instructions executable by a logic subsystem is provided. The instructions are configured to output a primary image to a primary display for display by the primary display, and output a peripheral image to an environmental display for projection by the environmental display on an environmental surface of a display environment so that the peripheral image appears as an extension of the primary image. | 09-06-2012 |
20120229380 | Gyroscope control and input/output device selection in handheld mobile devices - A technique to provide a duplicate I/O device along an adjacent edge of a handheld mobile device to ensure that at least one I/O device is not obscured by a user when the user's hand grasps the handheld mobile device. Depending on whether the device is in portrait or landscape orientation relative to the user, a sensing device senses the orientation and sends a position signal to a control circuit. The control circuit controls a switching device that controls which of the I/O devices is to be activated depending on the orientation. | 09-13-2012 |
20120229381 | PUSH PERSONALIZATION OF INTERFACE CONTROLS - A computing system is configured to receive one or more depth images, from the depth camera, of a world space scene including a human target. The computing system translates a world space position of a hand of the human target to a screen space cursor position of the user interface using a virtual desktop transformation. The computing system also dynamically adjusts the virtual desktop transformation based on a history of button press actions executed by the human target. | 09-13-2012 |
20120229382 | COMPUTER-READABLE STORAGE MEDIUM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD - A movement direction of an object arranged in a virtual world is set based on attitude data. Further, the object is moved in the movement direction in the virtual world in accordance with data based on a load applied to a load detection device. An image showing the virtual world including at least the object or an image showing the virtual world viewed from the object is displayed as a first image on a portable display device. | 09-13-2012 |
20120229383 | GESTURE SUPPORT FOR CONTROLLING AND/OR OPERATING A MEDICAL DEVICE - The invention relates to a gesture support device ( | 09-13-2012 |
20120229384 | DISPLAY DEVICE WITH LOCATION DETECTION FUNCTION AND INPUT LOCATION DETECTION SYSTEM - An input position detection system ( | 09-13-2012 |
20120235903 | APPARATUS AND A METHOD FOR GESTURE RECOGNITION - The present invention relates to a method for gesture recognition and an apparatus for gesture recognition carrying out the method. More specifically, the present invention relates to a method for gesture recognition recognizing a finger gesture by using depth information and an apparatus for gesture recognition carrying out the method. | 09-20-2012 |
20120235904 | Method and System for Ergonomic Touch-free Interface - With the advent of touch-free interfaces such as described in the present disclosure, it is no longer necessary for computer interfaces to be in predefined locations (e.g., desktops) or configurations (e.g., rectangular keyboard). The present invention makes use of touch-free interfaces to encourage users to interface with a computer in an ergonomically sound manner. Among other things, the present invention implements a system for localizing human body parts such as hands, arms, shoulders, or even the full body, with a processing device such as a computer along with a computer display to provide visual feedback on the display that encourages a user to maintain an ergonomically preferred position with ergonomically preferred motions. For example, the present invention encourages a user to maintain his motions within an ergonomically preferred range without having to reach out excessively or repetitively. | 09-20-2012 |
20120235905 | POINTING METHOD, A DEVICE AND SYSTEM FOR THE SAME - The invention shows a method to control a pointing device with an angular rate sensor, which comprises generating an ensemble of orthogonal unit vector associated signals by at least one angular rate sensor to represent angular rates in a dimensional space for each mutually orthogonal unit vector direction of said dimensional space, amplifying at least one of said signals non-linearly for determination of the cursor on a screen in (x,y) coordinates of the screen, and applying a decision criterion to determine the state of the pointing device based on said unit vector associated signals. The invention also shows a pointer utilising the method and a system comprising such a pointer. | 09-20-2012 |
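The non-linear amplification described in entry 20120235905 — mapping angular rate signals to cursor (x,y) deltas — can be sketched with a dead band plus super-linear gain. The constants, the power-law shape, and the dead-band idea are illustrative assumptions; the application does not specify this particular curve.

```python
def cursor_delta(rate_x, rate_y, dt, gain=400.0, power=1.5, deadband=0.02):
    """Sketch of non-linear angular-rate amplification for cursor control
    (entry 20120235905; constants and curve shape are illustrative).
    Small rates are suppressed by a dead band, and larger rates are
    amplified super-linearly, so slow motion gives fine control and
    fast motion gives large cursor jumps."""
    def shape(r):
        if abs(r) < deadband:
            return 0.0  # ignore sensor noise and hand tremor
        sign = 1.0 if r > 0 else -1.0
        return gain * (abs(r) ** power) * sign * dt
    return shape(rate_x), shape(rate_y)
```

With `power > 1`, doubling the angular rate more than doubles the cursor displacement, which is the usual motivation for such a curve in air-pointing devices.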
20120235906 | APPARATUS AND METHOD FOR INPUTTING INFORMATION BASED ON EVENTS - Disclosed are an apparatus and a method for inputting events. An embodiment of the present invention can generate left and right click events in addition to activating and stopping pointers by sensing a rolling of a wrist and calculate and output a coordinate displacement according to the motion of the hand at the time of activating the pointers according to the events. Further, the embodiment of the present invention can be applied to a large-sized display or a contactless spatial input apparatus of an HMD, entertainment such as games, and the like, and can overcome restricted environments by a gesture input scheme under special environment. | 09-20-2012 |
20120235907 | SENSOR MAPPING - Techniques, systems and computer program products are disclosed for providing sensor mapping. In one aspect, a method includes receiving input from a user. The received input includes at least one of motion, force and contact. In addition, a sensor signal is generated based on the received input. From a choice of data structures a data structure associated with a selected application having one or more functions is identified. The data structure indicates a relationship between the generated sensor signal and the one or more functions of the selected application. The generated sensor signal is selectively mapped into a control signal for controlling the one or more functions of the selected application by using the identified data structure. | 09-20-2012 |
20120235908 | MULTI-DIRECTIONAL REMOTE CONTROL SYSTEM AND METHOD WITH AUTOMATIC CURSOR SPEED CONTROL - A multi-directional remote control system and method is adapted for use with an entertainment system of a type including a display such as a monitor or TV and having display functions employing a mouse type control. The remote controller may be conveniently held in one hand of a user and still provides full mouse type functionality. The remote control system and method images the controller to detect relative motion between the controller and screen. This position information is used for control of a cursor or other GUI interface with automatic control of cursor speed based on detected controller distance from the screen and characteristic hand movement. | 09-20-2012 |
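Entry 20120235908 adjusts cursor speed automatically based on the detected controller-to-screen distance. One plausible scaling law (an assumption, not taken from the application) keeps on-screen cursor speed roughly constant for a characteristic hand movement by reducing gain proportionally with distance:

```python
def scaled_cursor_speed(base_speed, distance_m, ref_distance_m=3.0):
    """Sketch of automatic cursor-speed control as in entry 20120235908.
    The inverse-distance scaling law and the reference distance are
    illustrative assumptions: at larger distance the same hand rotation
    sweeps a larger arc, so gain is reduced to keep on-screen cursor
    speed roughly constant. A floor on distance avoids division blow-up."""
    return base_speed * ref_distance_m / max(distance_m, 0.1)
```

At the reference distance the gain is unchanged; at twice the distance it is halved.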
20120235909 | CONTROLLING AND/OR OPERATING A MEDICAL DEVICE BY MEANS OF A LIGHT POINTER - The invention relates to a system for controlling and/or operating a medical device ( | 09-20-2012 |
20120242576 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus may include a touchpad, a hardware processor, and a storage medium coupled to the processor. The storage medium may store instructions that, when executed by the processor, cause the information processing apparatus to receive a proximity signal indicative of whether a user is providing input to the touchpad; receive a movement signal indicative of whether the input includes movement of an object relative to the touchpad and/or whether the input includes movement of the object from an outer area surrounding an inner area of the touchpad to the inner area of the touchpad; and select one of a pointing user input mode or a scrolling user input mode based on the signals. | 09-27-2012 |
20120249422 | INTERACTIVE INPUT SYSTEM AND METHOD - An interactive input system comprises an interactive surface, an illumination source projecting light onto the interactive surface such that a shadow is cast onto the interactive surface when a gesture is made by an object positioned between the illumination source and the interactive surface, at least one imaging device capturing images of a three-dimensional (3D) space in front of the interactive surface, and processing structure processing captured images to detect the shadow and object therein, and determine therefrom whether the gesture was performed within or beyond a threshold distance from the interactive surface and execute a command associated with the gesture. | 10-04-2012 |
20120249423 | Image Display Apparatus - A screen is provided on a housing in such a manner that the screen protrudes toward its nearer side, and is inclined so that the nearer side becomes lower. This screen displays thereon a first hierarchy menu for indicating an arrangement of genre pictures, a second hierarchy menu for indicating an arrangement of commodity pictures, and a map image for indicating a map of the sales counters. A user brings his or her hand tip nearer to this screen. This operation allows a double-ring pointer to be displayed at a position which corresponds to the spatial position or direction of the user's hand tip relative to the screen. On this screen, the user is permitted to change the position of this double-ring pointer by performing a gesture of changing the spatial position or direction of the user's hand tip. | 10-04-2012 |
20120249424 | Methods and apparatus for accessing peripheral content - In exemplary implementations of this invention, a main content feed is displayed on a main screen. A user may select one or more auxiliary feeds of content to display simultaneously on a second screen. The second screen is located on a handheld device. The user makes the selection by changing the orientation of the handheld device relative to the main screen. For example, the user may select which auxiliary feed to display by pointing the device at different areas that are around the periphery of the main screen. The handheld device includes one or more sensors for gathering data, and one or more processors for (a) processing the sensor data to calculate the orientation of the handheld device relative to the main screen and (b) based at least in part on that orientation, selecting which of the auxiliary feeds to display. | 10-04-2012 |
20120256834 | PHYSICAL OBJECT FOR INTUITIVE NAVIGATION IN A THREE-DIMENSIONAL SPACE - A computer-implemented method for manipulating graphics objects within a display viewed by an end-user is disclosed. The method involves: receiving motion information generated in response to the end-user moving an object that is external to the display; determining at least one zone of motion in which the end-user moves the object; determining a first motion type associated with the movement of the object within the at least one zone of motion; and based on the at least one zone of motion and the first motion type, determining at least one change to a viewpoint associated with one or more graphics objects displayed to the end-user within the display. The at least one change to the viewpoint causes an alteration in how the one or more graphics objects are displayed to the end-user within the display. | 10-11-2012 |
20120262372 | METHOD AND DEVICE FOR GESTURE RECOGNITION DIAGNOSTICS FOR DEVICE ORIENTATION - Systems, circuits, and devices for recognizing gestures are discussed. A mobile device includes a housing, an orientation sensor, a camera implemented on the housing, a memory for storing a lookup table comprising multiple gestures and corresponding commands, and a controller coupled to the orientation sensor, the camera, and the memory. The controller is configured to generate trace data corresponding to a gesture captured by the camera, wherein x, y, and z coordinates of the trace data are applied according to an orientation of the housing during the gesture. The controller is also configured to determine an orientation angle of the housing detected by the orientation sensor. The controller is further configured to recognize the gesture through accessing the lookup table based on the trace data and the orientation angle of the housing. | 10-18-2012 |
20120262373 | METHOD AND DISPLAY APPARATUS FOR CALCULATING COORDINATES OF A LIGHT BEAM - A display apparatus is disclosed which includes: a camera which senses a light beam focused on a screen; a video processor which processes at least one of a first image including a reference position for calculating coordinates of the light beam and a second image corresponding to the coordinates of the light beam to be displayed on the screen; and a controller which calculates the coordinates of the light beam on the basis of the reference position changed in accordance with change in a display characteristic of the first image, and transmits the calculated coordinates to the video processor so that the second image corresponding to the calculated coordinates can be displayed on the screen. | 10-18-2012 |
20120262374 | REMOTE CONTROL SYSTEM AND METHOD CAPABLE OF SWITCHING DIFFERENT POINTING MODES - The present invention relates to a remote control method and a remote control system capable of switching between different pointing modes. The remote control method includes a first wireless transmission module transmitting a first wireless signal with a first pointing mode code to an electronic device, a second wireless transmission module transmitting a second wireless signal with a second pointing mode code to the electronic device, and the electronic device performing movement of an icon in a first pointing mode or in a second pointing mode according to the received first pointing mode code or the received second pointing mode code and pointing mode relation information. | 10-18-2012 |
20120268372 | METHOD AND ELECTRONIC DEVICE FOR GESTURE RECOGNITION - Provided are a method and an electronic device for gesture recognition capable of dividing a display into a plurality of display regions assigned to a plurality of users, recognizing gestures of the plurality of users, and controlling the display regions assigned to the users who have made a gesture according to the recognized gestures. The method for gesture recognition includes: dividing a display into a plurality of display regions assigned to a plurality of users; recognizing gestures made by the plurality of users, respectively; and controlling the plurality of display regions respectively assigned to the plurality of users who have made the gestures according to the respective recognized gestures. | 10-25-2012 |
20120268373 | METHOD FOR RECOGNIZING USER'S GESTURE IN ELECTRONIC DEVICE - Provided is a method for recognizing a user's gesture in an electronic device, the method including sensing movement of an object by using a motion sensor, checking a distance between the motion sensor and the movement-sensed object and referring to a preset value which is currently applied in relation to gesture recognition, and adaptively recognizing a gesture corresponding to the movement-sensed object according to the set value and the checked distance. | 10-25-2012 |
20120268374 | METHOD AND APPARATUS FOR PROCESSING TOUCHLESS CONTROL COMMANDS - A method and apparatus of detecting an input gesture command are disclosed. According to one example method of operation, a digital image may be obtained from a digital camera of a pre-defined controlled movement area. The method may also include comparing the digital image to a pre-stored background image previously obtained from the digital camera of the same pre-defined controlled movement area. The method may also include identifying one or more pixel differences between the digital image and the pre-stored background image and designating the digital image as having a detected input gesture command. | 10-25-2012 |
20120274559 | Diffusing light of a laser - Embodiments disclosed herein relate to diffusing light of a multi-mode laser. In one embodiment, the multi-mode laser projects a plurality of modes of light and a diffuser reflects the plurality of modes of light to output a single lobe of light. | 11-01-2012 |
20120274560 | POINTING DEVICE - An improved air pointing device which is capable of compensating for the roll angle imparted to said device by a user of said device. The device of the invention includes at least two gyrometers and two accelerometers, the latter being each used for different roll angles. The correction of the roll angle is effected by using the measurements of the first accelerometer, the output of the second accelerometer being simulated as an approximated function of the measurements of the first accelerometer, using a polynomial approximation of the relation between the output of the two accelerometers, the order of the polynomial being chosen as a function of the computing power which is available and the precision which is needed. In an embodiment of the invention, the first and second accelerometers are advantageously swapped at a value of the roll angle which is substantially equal to 45°. | 11-01-2012 |
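The scheme in this abstract — simulating the second accelerometer's output as a polynomial function of the first's, and swapping roles near a 45° roll angle — can be sketched as below. The polynomial coefficients here are hypothetical placeholders; in the described device they would be fitted to the measured relation between the two sensors, with the polynomial order chosen to trade precision against computing power.

```python
# Illustrative sketch of the roll-compensation idea: one accelerometer is
# read directly; the other's output is *simulated* as a polynomial in the
# first's reading; the two roles swap at roughly 45 degrees of roll.
# The coefficients below are hypothetical, not fitted values.

def simulate_second_accel(a1, coeffs=(0.0, 1.0, 0.05)):
    """Approximate accelerometer 2's output as a polynomial in accelerometer 1's."""
    return sum(c * a1 ** i for i, c in enumerate(coeffs))

def roll_corrected_reading(a1, roll_deg, swap_at_deg=45.0):
    """Below the swap angle use the measured reading; above it, the simulated one."""
    if abs(roll_deg) < swap_at_deg:
        return a1
    return simulate_second_accel(a1)
```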
20120274561 | OPERATION CONTROL SYSTEM USING INFRARED RAYS AND METHOD THEREOF - An operation control method is provided. The method is applied on an operation control system. The system includes an operation device and an electronic device. The operation device includes an infrared emitter to emit infrared rays. The electronic device includes a display unit and infrared receivers. The infrared ray creates a light/heat spot on the display unit. Infrared receivers receive the infrared ray emitted by the infrared emitter. The infrared receiver receiving one infrared ray generates a corresponding signal. The method includes receiving the signal generated by the infrared receiver, determining which infrared receivers generated the signal to determine the light spot on the display unit, and controlling the electronic device to execute the corresponding function according to signals created by the light spot or spots striking the display unit. | 11-01-2012 |
20120274562 | Method, Apparatus and Computer Program Product for Displaying Media Content - In accordance with an example embodiment a method and apparatus is provided. The method comprises receiving at least one face as an input. A presence of the at least one face in a media content is determined and a modified display of the media content is generated if the at least one face is determined to be present in the media content. | 11-01-2012 |
20120280909 | IMAGE DISPLAY SYSTEM AND IMAGE DISPLAY METHOD - An image display system includes an image projection unit that projects image information; a light pointer that designates a point in the image information by irradiating pointing light, wherein the irradiation of the pointing light can be turned on and turned off; a photographing unit that photographs an area on which the image information is projected, and that outputs photographed information; an instruction detection unit that detects an irradiation position of the pointing light in the image information based on the photographed information, and that detects whether the irradiation of the pointing light is turned on or turned off; and a control unit that sets additional image information at a timing in which the instruction detection unit detects that the irradiation of the pointing light is turned off, depending on a position at which the irradiation of the pointing light is turned off. | 11-08-2012 |
20120280910 | CONTROL SYSTEM AND METHOD FOR CONTROLLING A PLURALITY OF COMPUTER DEVICES - The invention provides a control system for controlling a plurality of computer devices, the computer devices each having at least a processing unit, the control system comprising a pointer device, a tracking system connected to a processing unit of each of the plurality of computer devices, the tracking system comprising at least a tracking unit arranged to determine an actual position and/or orientation of the pointer device, wherein the tracking system is arranged to determine a parameter set representative of at least one of position and orientation of the pointer device and to select one of a plurality of computer devices depending on said parameter set and to send a control signal from the tracking system to a processing unit of the selected computer device wherein the control signal is based on said parameter set. | 11-08-2012 |
20120287043 | COMPUTER-READABLE STORAGE MEDIUM HAVING MUSIC PERFORMANCE PROGRAM STORED THEREIN, MUSIC PERFORMANCE APPARATUS, MUSIC PERFORMANCE SYSTEM, AND MUSIC PERFORMANCE METHOD - An input device includes a movement and orientation sensor for detecting one of a movement or an orientation of the input device itself. Firstly, information about one of the movement or the orientation of the input device having been detected by this movement and orientation sensor is obtained. Next, a difference between the orientation of the input device having been obtained, and a predetermined reference orientation is calculated. A predetermined sound is produced based on the difference in orientation, thereby executing music performance. | 11-15-2012 |
20120287044 | PROCESSING OF GESTURE-BASED USER INTERACTIONS USING VOLUMETRIC ZONES - Systems and methods for processing gesture-based user interactions within an interactive display area are provided. The display of one or more virtual objects and user interactions with the one or more virtual objects may be further provided. Multiple interactive areas may be created by partitioning an area proximate a display into multiple volumetric spaces or zones. The zones may be associated with respective user interaction capabilities. A representation of a user on the display may change as the ability of the user to interact with one or more virtual object changes. | 11-15-2012 |
20120293410 | Flexible Input Device Worn on a Finger - Methods and apparatus are directed to an input device including a wearable ring shaped component that is supported on a finger and located between a finger tip and a knuckle of the wearer. The wearable component comprises a touch pad device that is located on an outward surface of the wearable ring shaped component. The touch pad device is contacted to provide an input command. The wearable component includes a transmitter to transmit the input command. | 11-22-2012 |
20120293411 | Methods and apparatus for actuated 3D surface with gestural interactivity - In exemplary implementations of this invention, an array of linear actuators can be used to form a segmented surface. The surface can resemble a low relief sculpture. A user may control the shape of the surface by direct touch manipulation or by making freehand gestures at a distance from the surface. For example, the freehand gestures may comprise input instructions for selecting, translating, and rotating the shape of an object. A projector may augment the rendered shapes by projecting graphics on the surface. | 11-22-2012 |
20120293412 | METHOD AND SYSTEM FOR TRACKING OF A SUBJECT - Method, computer program and system for tracking movement of a subject. The method includes receiving data from a distributed network of camera sensors employing one or more emitted light sources associated with one or more of the one or more camera sensors to generate a volumetric three-dimensional representation of the subject, identifying a plurality of clusters within the volumetric three-dimensional representation that correspond to motion features indicative of movement of the motion features of the subject, presenting one or more objects on one or more three dimensional display screens, and using the plurality of fixed position sensors to track motion of the motion features of the subject and track manipulation of the motion features of the volumetric three-dimensional representation to determine interaction of one or more of the motion features of the subject and one or more of the one or more objects on the three dimensional display. | 11-22-2012 |
20120299826 | Human/Machine Interface for Using the Geometric Degrees of Freedom of the Vocal Tract as an Input Signal - A human/machine (HM) interface that enables a human operator to control a corresponding machine using the geometric degrees of freedom of the operator's vocal tract, for example, using the tongue as a virtual joystick. In one embodiment, the HM interface has an acoustic sensor configured to monitor, in real time, the geometry of the operator's vocal tract using acoustic reflectometry. A signal processor analyzes the reflected acoustic signals detected by the acoustic sensor, e.g., using signal-feature selection and quantification, and translates these signals into commands and/or instructions for the machine. Both continuous changes in the machine's operating parameters and discrete changes in the machine's operating configuration and/or state can advantageously be implemented. | 11-29-2012 |
20120299827 | MULTI-PLATFORM MOTION-BASED COMPUTER INTERACTIONS - Systems and methods for multi-platform motion interactivity are provided. The system includes a motion-sensing subsystem, a display subsystem including a display, a logic subsystem, and a data-holding subsystem containing instructions executable by the logic subsystem. The system is configured to display a displayed scene on the display; receive a dynamically-changing motion input from the motion-sensing subsystem that is generated in response to movement of a tracked object; generate, in real time, a dynamically-changing 3D spatial model of the tracked object based on the motion input; and control, based on the movement of the tracked object and using the 3D spatial model, motion within the displayed scene. The system is further configured to receive, from a secondary computing system, a secondary input; and control the displayed scene in response to the secondary input to visually represent interaction between the motion input and the secondary input. | 11-29-2012 |
20120299828 | METHOD AND APPARATUS FOR CLASSIFYING MULTIPLE DEVICE STATES - Techniques are described herein for classifying multiple device states using separate Bayesian classifiers. An example of a method described herein includes accessing sensor information of a device, wherein at least some of the sensor information is used in a first feature set and at least some of the sensor information is used in a second feature set; processing the first feature set using a first classification algorithm configured to determine a first proposed state of a first state type and a first proposed state of a second state type; processing the second feature set using a second classification algorithm configured to determine a second proposed state of the first state type and a second proposed state of the second state type; and determining a proposed state of the device as the first proposed state of the first state type and the second proposed state of the second state type. | 11-29-2012 |
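The combination scheme this abstract describes — two classifiers run on two different feature sets, with the final device state taking the first state type from classifier one and the second state type from classifier two — can be sketched as follows. The score tables stand in for real Bayesian models; the state names and features are invented for illustration.

```python
# Sketch of combining two classifiers over two feature sets: each proposes
# a state for every state type, and the final proposal mixes the first
# state type from classifier A with the second from classifier B.
# The additive score tables below are stand-ins, not Bayesian likelihoods.

def classify(feature_set, score_tables):
    """Per state type, pick the state whose table scores the features highest."""
    proposed = {}
    for state_type, table in score_tables.items():
        proposed[state_type] = max(
            table, key=lambda state: sum(table[state].get(f, 0) for f in feature_set)
        )
    return proposed

def combine(proposal_a, proposal_b, first_type, second_type):
    """Final state: first state type from A, second state type from B."""
    return {first_type: proposal_a[first_type], second_type: proposal_b[second_type]}
```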
20120306745 | MOTION PATTERN CLASSIFICATION AND GESTURE RECOGNITION - Methods, program products, and systems for gesture classification and recognition are disclosed. In general, in one aspect, a system can determine multiple motion patterns for a same user action (e.g., picking up a mobile device from a table) from empirical training data. The system can collect the training data from one or more mobile devices. The training data can include multiple series of motion sensor readings for a specified gesture. Each series of motion sensor readings can correspond to a particular way a user performs the gesture. Using clustering techniques, the system can extract one or more motion patterns from the training data. The system can send the motion patterns to mobile devices as prototypes for gesture recognition. | 12-06-2012 |
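The prototype pipeline in this abstract — cluster training traces of a gesture into motion patterns, then recognize a live trace by its nearest prototype — can be sketched as below. A trivial greedy distance-threshold clustering stands in for whatever clustering technique the actual system uses, and traces are fixed-length vectors rather than raw motion-sensor series.

```python
# Sketch of prototype extraction and recognition: training traces are
# greedily clustered (each trace joins the first cluster whose running-mean
# prototype is within `radius`), and a live trace is labeled by its nearest
# prototype. Real systems would use a proper clustering algorithm.

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cluster_prototypes(traces, radius=1.0):
    """Greedy one-pass clustering; returns one mean prototype per cluster."""
    clusters = []  # each entry: [component-wise sum, member count]
    for t in traces:
        for c in clusters:
            proto = [s / c[1] for s in c[0]]
            if distance(t, proto) <= radius:
                c[0] = [s + x for s, x in zip(c[0], t)]
                c[1] += 1
                break
        else:
            clusters.append([list(t), 1])
    return [[s / c[1] for s in c[0]] for c in clusters]

def recognize(trace, prototypes):
    """Index of the prototype nearest to the live trace."""
    return min(range(len(prototypes)), key=lambda i: distance(trace, prototypes[i]))
```

Sending only the prototypes to mobile devices, as the abstract describes, keeps on-device recognition cheap: matching is a handful of distance computations instead of a search over all training data.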
20120306746 | LENS ACCESSORY FOR VIDEO GAME SENSOR DEVICE - A lens accessory for a video game sensor device and a method of adjusting a sensing distance of a video game sensor device. A lens accessory for a video game sensor device includes a first lens configured to cover an infrared light emitter of the video game sensor device, a second lens configured to cover an infrared light receiver of the video game sensor device, and a body portion coupling the first lens and the second lens together, the body portion being removably attachable to the video game sensor device, and the first lens and the second lens having a magnification for adjusting a sensing distance of the video game sensor device. | 12-06-2012 |
20120313852 | SENSOR SYSTEM FOR GENERATING SIGNALS THAT ARE INDICATIVE OF THE POSITION OR CHANGE OF POSITION OF LIMBS - The invention relates to a sensor device for generating electronic signals which as such provide information on a position in space, and/or the movement of limbs, especially of the hand of a user in relation to the sensor device. Said electronic signals can be used to carry out input processes in data processing devices, communication devices and other electric devices. The aim of the invention is to devise ways of generating, in an especially advantageous manner, signals that are indicative of the position and/or the movement of limbs. A sensor device for generating electrical signals which as such provide information on the position or movement of limbs in relation to a reference area comprises a transmitter electrode device (G | 12-13-2012 |
20120319948 | MOBILE TERMINAL AND CONTROLLING METHOD THEREOF - According to one embodiment, a mobile terminal includes: a touchscreen configured to display a first menu and receive a plurality of touch inputs of a first pattern via the first menu and to display a second menu and receive a plurality of touch inputs of a second pattern via the second menu; and a controller configured to: calculate a first moving distance of a pointer for each of the received touch inputs of the first pattern; determine a minimum among the plurality of calculated first moving distances; calculate a second moving distance of the pointer for each of the received touch inputs of the second pattern; determine a maximum among the plurality of calculated second moving distances; and determine a threshold moving distance of the pointer for discriminating the touch input of the first pattern from the touch input of the second pattern by using the minimum and the maximum. | 12-20-2012 |
20120319949 | POINTING DEVICE OF AUGMENTED REALITY - The present invention relates to a pointing device capable of inputting a particular position of augmented reality to a computer. The invention comprises: a camera which takes a picture of a feature point or a mark used to generate an augmented reality image; and an image processor which recognizes the feature point or the mark taken by the camera, and outputs position information indicative of a particular position in the augmented reality. Mouse cursor images can be synthesized as augmented reality images at the position outputted from the image processor. | 12-20-2012 |
20120319950 | METHODS AND SYSTEMS FOR ENABLING DEPTH AND DIRECTION DETECTION WHEN INTERFACING WITH A COMPUTER PROGRAM - One or more images can be captured with a depth camera having a capture location in a coordinate space. First and second objects in the one or more images can be identified and assigned corresponding first and second object locations in the coordinate space. A relative position can be identified in the coordinate space between the first object location and the second object location when viewed from the capture location by computing an azimuth angle and an altitude angle between the first object location and the second object location in relation to the capture location. The relative position includes a dimension of depth with respect to the coordinate space. The dimension of depth is determined from analysis of the one or more images. A state of a computer program is changed based on the relative position. | 12-20-2012 |
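The geometry in this abstract — azimuth and altitude angles between two object locations as viewed from the capture location — can be sketched with basic trigonometry. The coordinate conventions (y up, z toward the scene, angles in degrees) are assumptions for illustration, not taken from the patent.

```python
# Minimal sketch: given a capture location and two object locations in a
# 3D coordinate space, compute each object's azimuth (about the vertical
# axis) and altitude as seen from the capture point, then their difference.
# Axis conventions (y up, z forward) and degree units are assumptions.
import math

def view_angles(capture, obj):
    """Azimuth and altitude of obj as seen from the capture location."""
    dx, dy, dz = (o - c for o, c in zip(obj, capture))
    azimuth = math.degrees(math.atan2(dx, dz))
    altitude = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return azimuth, altitude

def relative_angles(capture, first, second):
    """Angular separation of the two objects viewed from the capture point."""
    az1, alt1 = view_angles(capture, first)
    az2, alt2 = view_angles(capture, second)
    return az2 - az1, alt2 - alt1
```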
20120326980 | COMMUNICATION OF USER INPUT - A keyboard application operably configured to facilitate user inputs through a keyboard displayed within a screen or other user-viewable interface by way of signaling received from a remote control. The application may be configured to facilitate user selection of one or more keys with the use of shortcut buttons on the remote control that allow the user to navigate to different areas of the keyboard in a more efficient manner. The application may be operable with an interactive television (iTV) system to facilitate interfacing user requests with an output device and other systems associated with a multiple system operator (MSO), such as to support interfacing user inputs required to enable product purchasing, navigation of an electronic programming guide (EPG), instigation of video on demand (VOD), textual messaging, web browsing, etc. | 12-27-2012 |
20130002547 | POINTING DEVICE - A pointing device is used with a computer device, and the pointing device includes a fixed base and a main body. The fixed base has a first hinge portion and a sensing displacement module, the first hinge portion is located on one end of the fixed base, and the sensing displacement module is electrically located in the fixed base to generate a displacement signal. Further, the main body includes a second hinge portion and a processing unit, and the second hinge portion is located on one end of the main body and is pivoted on the first hinge portion. The main body and the fixed base are selectively rotated to a first angle or a second angle. The processing unit is electrically located in the main body and is electrically connected to the sensing displacement module for receiving the displacement signal and transmitting the displacement signal to the computer device. | 01-03-2013 |
20130002548 | DISPLAY DEVICE - According to an aspect, a display device includes a display unit and a control unit. The display unit stereoscopically displays a display object. When a movement of an object is detected in a three-dimensional space where the display object is stereoscopically displayed, the control unit changes the display object in the three-dimensional space according to the movement of the object. | 01-03-2013 |
20130002549 | REMOTE-CONTROL DEVICE AND CONTROL SYSTEM AND METHOD FOR CONTROLLING OPERATION OF SCREEN - A control system for controlling an operation of a screen having a first geometric reference includes a marking device and a remote-control device. The marking device displays a first pattern associated with the first geometric reference on the screen. The remote-control device obtains a signal from the screen. The signal represents an image having a second geometric reference and a second pattern associated with the first pattern. The second pattern and the second geometric reference have a first geometric relationship therebetween. The remote-control device uses the first geometric relationship to transform the second pattern into a third pattern, and calibrates the first geometric reference according to the third pattern for controlling the operation of the screen. | 01-03-2013 |
20130002550 | DETERMINATION OF CONTROLLER THREE-DIMENSIONAL LOCATION USING IMAGE ANALYSIS AND ULTRASONIC COMMUNICATION - Methods, systems, and computer programs are presented for determining the location of a controller. One method includes an operation for capturing image data of a capture area in front of a display. Additionally, the method includes another operation for capturing sound data emitted by the controller in the capture area in front of the display. The two-dimensional location of the controller is calculated based on the captured image data, and the controller's location in the third dimension is calculated based on the captured sound data. | 01-03-2013 |
20130002551 | INSTRUCTION INPUT DEVICE, INSTRUCTION INPUT METHOD, PROGRAM, RECORDING MEDIUM, AND INTEGRATED CIRCUIT - An instruction input device includes: a first direction detection unit detecting a first direction in which the user is looking; a second direction detection unit detecting a second direction in which the user is performing a pointing operation; a gaze position calculation unit calculating a gaze position of the user on the screen; a reference coordinate group calculation unit calculating a reference line in space corresponding to the gaze position and connecting the user and the screen; an offset amount calculation unit calculating, as an offset amount with respect to the gaze position, a distance in the second direction between the reference line and an input coordinate indicating the user's hand; and a pointer display position calculation unit calculating a position on the screen at which the distance, in a first predetermined direction of the screen, between that position and the gaze position equals the offset amount. | 01-03-2013 |
20130009872 | VIRTUAL INPUT SYSTEM - For a user having a user input actuator, a virtual interface device, such as for a gaming machine, for determining actuation of a virtual input by the input actuator is disclosed. The device comprises a position sensing device for determining a location of the user input actuator and a controller coupled to the position sensing device, the controller determining whether a portion of the user input actuator is within a virtual input location in space defining the virtual input. | 01-10-2013 |
20130021245 | INTERACTIVE CONTENT CONTROL METHOD AND USER INTERFACE APPARATUS USING THE SAME - An interactive content control method and a user interface apparatus using the same are provided. The interactive content control method detects a reference length which becomes a reference for controlling interactive content according to a movement of a user, on the basis of skeletal information of the user, detects a comparison length on the basis of the skeletal information, and controls the interactive content according to a result of comparing the reference length and the comparison length. Accordingly, the present invention can provide a highly interactive user interface environment. | 01-24-2013 |
20130021246 | INPUT APPARATUS OF DISPLAY APPARATUS, DISPLAY SYSTEM AND CONTROL METHOD THEREOF - An input apparatus of a display apparatus, a display system, and a control method thereof, are provided herein, the input apparatus including: a communication unit which communicates with the display apparatus; a sensing unit which detects angular speed and acceleration from a motion of the input apparatus; a storage unit which stores position information on a position of the input apparatus; and a controller which calculates the motion information based on the detected angular speed and the position information and transmits the calculated motion information through the communication unit if the input apparatus moves, and updates the position information in the storage unit based on the detected acceleration if the input apparatus does not move. | 01-24-2013 |
20130027300 | RECOGNITION APPARATUS, METHOD, AND COMPUTER PROGRAM PRODUCT - In an embodiment, a recognition apparatus includes: an obtaining unit configured to obtain positions of a specific part in a coordinate system having a first axis to an n-th axis (n≧2); a calculating unit configured to calculate a movement vector of the specific part; a principal axis selecting unit configured to select a principal axis; a turning point setting unit configured to set a position at which there is a change in the principal axis; a section setting unit configured to set a determination target section, and set a previous section; a determining unit configured to calculate an evaluation value of the determination target section and an evaluation value of the immediately previous section and determine which of the first axis to the n-th axis is advantageous; and a presenting unit configured to present the determined result. | 01-31-2013 |
20130027301 | OPERATION METHOD AND CONTROL SYSTEM FOR MULTI-TOUCH CONTROL - An operation method and a control system for multi-touch control are provided. A map positioner and a map block are set in a display region, and the position of a cursor is set according to an input signal of an operation region. The position of a fast positioner set in the operation region corresponds to the position of the map positioner. The position of the cursor is shifted according to a motion vector inputted by an input device, and the position of the map block is reset. The map block and the cursor are shifted to the map positioner when the input device receives a trigger signal of the fast positioner. An object in the map block is selected, a multi-touch function is enabled, and the operation property of the object is changed according to a relative shift quantity formed by first and second control points. | 01-31-2013 |
20130027302 | ELECTRONIC DEVICE, ELECTRONIC DOCUMENT CONTROL PROGRAM, AND ELECTRONIC DOCUMENT CONTROL METHOD - There are provided an electronic device, an electronic document control program and an electronic document control method for the electronic device. The electronic device includes a display unit configured to display an electronic document, an image taking unit configured to take an image, an eye-gaze position detecting unit configured to detect an eye-gaze position with respect to the display unit based on the image taken by the image taking unit, a determining unit configured to determine whether the electronic document displayed on the display unit has been read based on the eye-gaze position detected by the eye-gaze position detecting unit, and a performing unit configured to perform a predetermined process on the electronic document if the determining unit determines that the electronic document has been read. | 01-31-2013 |
20130027303 | Touch Screen Device, Method, and Graphical User Interface for Moving On-Screen Objects Without Using a Cursor - An electronic device with a touch screen display: detects a single finger contact on the touch screen display; creates a touch area that corresponds to the single finger contact; determines a representative point within the touch area; determines if the touch area overlaps an object displayed on the touch screen display, which includes determining if one or more portions of the touch area other than the representative point overlap the object; connects the object with the touch area if the touch area overlaps the object, where connecting maintains the overlap of the object and the touch area; after connecting the object with the touch area, detects movement of the single finger contact; determines movement of the touch area that corresponds to movement of the single finger contact; and moves the object connected with the touch area in accordance with the determined movement of the touch area. | 01-31-2013 |
20130033428 | ELECTRONIC APPARATUS USING MOTION RECOGNITION AND METHOD FOR CONTROLLING ELECTRONIC APPARATUS THEREOF - An electronic apparatus and controlling method thereof is disclosed. The method for controlling the electronic apparatus includes photographing an object using motion recognition, and changing and displaying a screen based on a movement direction of the object when it is determined that the photographed object has moved while maintaining a first shape. By this method, the user is able to perform zoom in and zoom out operations more easily and intuitively by using motion recognition. | 02-07-2013 |
20130038531 | CURSOR CONTROLLING SYSTEM AND APPARATUS - A cursor-controlling system or apparatus includes a sensing module, a displacement controlling module and a driving module. The sensing module is for receiving a force to generate an input signal. The displacement controlling module includes a first signal amplifying circuit for amplifying the input signal to generate a first amplified signal; a comparator for receiving the first amplified signal to generate a selected signal; a second signal amplifying circuit for amplifying the selected signal to generate a second amplified signal; a filtering circuit for filtering the second amplified signal to generate a filtered signal; a third signal amplifying circuit for amplifying the filtered signal to generate a displacement signal. The driving module is for receiving the displacement signal to control movement of a cursor. | 02-14-2013 |
20130038532 | INFORMATION STORAGE MEDIUM, INFORMATION INPUT DEVICE, AND CONTROL METHOD OF SAME - Provided is a program capable of facilitating an operation input made by a user. A computer-readable information storage medium has stored thereon a program for controlling a computer connected to an operation input device to function so as to obtain, for each time unit, an input value indicating specifics of the operation input of the user received by the operation input device, and to calculate, as an output value of a parameter to be operated, a value obtained by changing a reference value by a change amount, the reference value being determined by one of a plurality of obtained input values, the change amount being determined by each of the plurality of obtained input values. | 02-14-2013 |
20130044053 | Combining Explicit Select Gestures And Timeclick In A Non-Tactile Three Dimensional User Interface - A method including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving, from a depth sensor, a sequence of three-dimensional (3D) maps containing at least a hand of a user of the computer. An explicit select gesture performed by the user toward one of the interactive items is detected in the maps, and the one of the interactive items is selected responsively to the explicit select gesture. Subsequent to selecting the one of the interactive items, a TimeClick functionality is actuated for subsequent interactive item selections to be made by the user. | 02-21-2013 |
20130044054 | METHOD AND APPARATUS FOR PROVIDING BARE-HAND INTERACTION - An apparatus for providing bare-hand interaction includes a pattern image projecting unit for projecting a pattern image of structured light onto a projection zone and an image projection unit for projecting an image of digital contents onto the projection zone. The pattern image is captured from the projection zone by a pattern image capturing unit and processed by an image recognizing unit in order to recognize a user input interacting with the projected contents image. The apparatus then generates a system event corresponding to the recognized user input to control an application program in accordance with the event. | 02-21-2013 |
20130044055 | METHOD AND SYSTEM OF USER AUTHENTICATION WITH BIORESPONSE DATA - In one exemplary embodiment, a computer-implemented method includes the step of providing an image to a user. The image is provided with a computer display. Eye-tracking data is obtained from the user when the user views the image. The eye-tracking data is obtained with an eye-tracking system. A user attribute is determined based on the eye-tracking data. The user is enabled to access a digital resource when the user attribute is associated with a permission to access the digital resource. The user attribute can be a personhood state. The digital resource can be a web page document. An instruction can be provided to the user regarding a pattern of viewing the image. The pattern of viewing the image can include instructing the user to gaze on a specified sequence of image elements. | 02-21-2013 |
20130050079 | OPTICAL POINTING SYSTEM AND RELATED METHOD - An optical pointing system includes a plurality of light sources, an image receiver, and an analyzing unit. The plurality of light sources are disposed on multiple locations of an object and configured to provide light having distinct wavelengths. The image receiver is configured to detect optical signals of the plurality of light sources, thereby generating a plurality of corresponding images. The analyzing unit is configured to calculate a relative position or angle of the image receiver with respect to the object according to the images. | 02-28-2013 |
20130050080 | USER INTERFACES - A first handheld device transmits an acoustic signal. A second handheld device receives a signal derived from the transmitted signal, and performs an action based on the received signal. | 02-28-2013 |
20130063348 | POINTING DEVICE WITH MULTIPLE VIEW ANGLES - A pointing device includes two lenses with wide and narrow view angles respectively. In a short distance range, the pointing device utilizes the lens with wide view angle to increase visible range. In a long distance range, the pointing device utilizes the lens with narrow view angle to increase size of a formed image of a reference point. In addition, the pointing device senses images through both of the lenses with wide and narrow view angles to obtain rotational information. The pointing device can provide not only positional information but also angular information. | 03-14-2013 |
20130063349 | OPTICAL NAVIGATION DEVICE - An optical navigation device may have an adaptive sleep mode for preventing unwanted scrolling inputs. A motion indicator may move a device between a sleep mode and an active mode. According to the sleep mode, a number of different sleep states are defined which have further reduced frame rates. The device may be only woken from the deeper sleep modes once repeated motion events are detected. This may prevent the device from being woken accidentally, while preserving the user experience. | 03-14-2013 |
20130063350 | SPATIALLY-CORRELATED MULTI-DISPLAY HUMAN-MACHINE INTERFACE - A human-machine interface involves plural spatially-coherent visual presentation surfaces at least some of which are movable by a person. Plural windows or portholes into a virtual space, at least some of which are handheld and movable, are provided by using handheld and other display devices. Aspects of multi-dimensional spatiality of the moveable window (e.g., relative to another window) are determined and used to generate images. As one example, the moveable window can present a first person perspective “porthole” view into the virtual space, this porthole view changing based on aspects of the moveable window's spatiality in multi-dimensional space relative to a stationary window. A display can present an image of a virtual space, and an additional, moveable display can present an additional image of the same virtual space. | 03-14-2013 |
20130069872 | POSITION DETECTING DEVICE, INFORMATION PROCESSING DEVICE, POSITION DETECTION METHOD, INFORMATION PROCESSING METHOD, AND COMPUTER READABLE MEDIUM - A position detecting device includes a device specifying unit, a motion obtaining unit, and a relative position detecting unit. The device specifying unit specifies plural information processing devices that have been brought into contact with one another. The motion obtaining unit obtains information about a motion of any one of the plural information processing devices. The relative position detecting unit detects, on the basis of a motion produced when the any one of the plural information processing devices specified by the device specifying unit is brought into contact with another of the plural information processing devices, relative positions of the plural information processing devices specified by the device specifying unit. | 03-21-2013 |
20130069873 | Dynamically Varying Classified Image Display System - A dynamically varying image display system including a display, a thematic based sorting algorithm processor, and a display processor. The display is configured to sequentially display a plurality of thematically classified images and the thematic based sorting algorithm processor relies on various recognition systems. | 03-21-2013 |
20130082926 | IMAGE DISPLAY - There is provided an image display including a first light source, a second light source and a modulation unit. The first light source has a plurality of first point lights arranged to form a first shape for generating a predetermined spectrum signal. The second light source has a plurality of second point lights arranged to form a second shape for generating a predetermined spectrum signal. The modulation unit is for simultaneously modulating the first point lights of the first light source with a first predetermined modulation frequency and for simultaneously modulating the second point lights of the second light source with a second predetermined modulation frequency to generate a modulated predetermined spectrum. | 04-04-2013 |
20130088429 | APPARATUS AND METHOD FOR RECOGNIZING USER INPUT - An apparatus includes an image sensor to obtain optical image information, a control unit to generate input recognition information based on the optical image information, and to determine a user input based on the input recognition information, and a display unit to display control information corresponding to the user input. A method for recognizing a user input includes obtaining optical image information, generating input recognition information based on the optical image information, the input recognition information including a region corresponding to an input object, and determining a user input based on the input recognition information. | 04-11-2013 |
20130088430 | ULTRA THIN LIGHT SCANNING APPARATUS FOR PORTABLE INFORMATION DEVICE - Disclosed is an ultra thin optical scanning device for a portable information device, which includes an LED as a light source and totally reflects light from an object-side surface to form an image, thereby increasing a contrast ratio of the image and improving the resolution thereof. The ultra thin optical scanning device includes a light emitting device that emits light for sensing an object, an object-side surface contacting the object and totally reflecting the light emitted from the light emitting device, an image formation part collecting the light totally reflected by the object-side surface, and transmitting the light, and a light receiver part forming an image by using the light transmitted by the image formation part. | 04-11-2013 |
20130093674 | Hybrid Pointing System and Method - A handheld controller which includes at least two disparate sensors, such as a motion sensor and a touchpad sensor. A processor deployed in either handheld controller or separate product implements a hybrid pointing and selection method that uses data from the first sensor to adjust the sensitivity to stimulus of the second sensor, and vice versa. The respective sensor data are thus tempered and combined to generate a cursor control signal that includes a large scale control component to control size and movement of a rough pointer region, and a fine scale control component to control position of a precise pointer within the rough pointer region. | 04-18-2013 |
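The coarse/fine split described in this abstract can be sketched as follows: one gain maps motion-sensor deltas to the rough pointer region, another maps touchpad deltas to the precise pointer, clamped inside that region. All function names, gains, and the region size are illustrative assumptions, not the patented method (the cross-sensitivity adjustment between the two sensors is omitted).

```python
def hybrid_cursor(rough_center, motion_delta, touch_delta,
                  motion_gain=1.0, fine_gain=0.2, region=50.0):
    """Combine two disparate sensors: the motion sensor moves a rough
    pointer region; the touchpad positions a precise pointer inside it.
    Gains and region size are illustrative assumptions."""
    # Large-scale component: motion sensor moves the rough pointer region.
    rx = rough_center[0] + motion_delta[0] * motion_gain
    ry = rough_center[1] + motion_delta[1] * motion_gain
    # Fine-scale component: touchpad offset, clamped to the rough region.
    fx = max(-region, min(region, touch_delta[0] * fine_gain))
    fy = max(-region, min(region, touch_delta[1] * fine_gain))
    return (rx, ry), (rx + fx, ry + fy)
```

For example, a motion delta of (10, 0) moves both the rough region and the precise pointer, while an extreme touchpad delta is clamped so the precise pointer never leaves the rough region.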
20130093675 | Remote controllable image display system, controller, and processing method therefor - The present invention discloses a remote controllable image display system, and a controller and a motion detection method for use in the system. The system includes: an image display showing images generated by a program; a light source generating at least a light beam; a controller controlling a current image according to its displacement or rotation and including at least one image sensor sensing the light beam to obtain a first frame having at least two light spots; a processor obtaining a first angle between a main operation surface of the controller and a basis plane according to the differences between the coordinates of the two light spots in the first frame. | 04-18-2013 |
20130093676 | 3D Pointing Devices and Methods - Systems and methods according to the present invention address these needs and others by providing a handheld device, e.g., a 3D device, which uses at least one sensor to detect motion of the handheld device. The detected motion can then be mapped into a desired output, e.g., cursor movement. | 04-18-2013 |
20130100017 | Notification Profile Configuration Based on Device Orientation - In one embodiment, a user places a mobile device (e.g., a smart phone) facing downward on a table. A process running on the mobile device determines an orientation of the mobile device (i.e., a facing downward orientation), and determines that the mobile device has been in the facing downward orientation for over a threshold period of time (e.g., 3 seconds); the process then automatically selects a “Quiet” notification profile and turns off the mobile device's display, without additional input from the user. | 04-25-2013 |
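A minimal sketch of the profile selection logic this abstract describes, using the abstract's 3-second example threshold; the profile name "Normal" and all function names are hypothetical assumptions:

```python
def select_profile(orientation, face_down_since, now, threshold=3.0):
    """Return the notification profile implied by device orientation:
    'Quiet' once the device has been face down past the threshold.
    'Normal' is an assumed default; 'Quiet' and 3.0 s follow the abstract."""
    if orientation == "face_down" and face_down_since is not None:
        if now - face_down_since >= threshold:
            return "Quiet"
    return "Normal"
```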
20130100018 | ACCELERATION-BASED INTERACTION FOR MULTI-POINTER INDIRECT INPUT DEVICES - An indirect interaction input device, such as but not limited to a touch sensor, can provide multiple points of input. These multiple points are in turn mapped to multiple positions on an output device such as a display. The multiple points of input, however, make the application of pointer ballistics and resolution differences between the input sensor and target display more difficult to manage. Thus, a characteristic of the set of points is identified and used to adjust the mapping of each of the points. For example, one way to solve this problem is to identify the input point with the least displacement from a prior frame, whether from its prior point or from a reference point. This displacement is used to adjust the mapping of the set of input points from the input device to their corresponding display coordinates. | 04-25-2013 |
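The least-displacement heuristic described above can be sketched as: find the input point that moved least since the prior frame, derive a shared gain from that displacement, and apply the gain to the whole point set. The linear gain curve and all names are assumptions for illustration, not the patented pointer-ballistics function.

```python
def least_displacement(prev_points, cur_points):
    """Return the smallest per-point displacement magnitude between frames."""
    return min(
        ((cx - px) ** 2 + (cy - py) ** 2) ** 0.5
        for (px, py), (cx, cy) in zip(prev_points, cur_points)
    )

def map_points(prev_points, cur_points, base_gain=1.0, accel=0.5):
    """Map raw multi-pointer deltas to display deltas with a shared gain
    derived from the least-displaced point (an assumed linear curve)."""
    d = least_displacement(prev_points, cur_points)
    gain = base_gain + accel * d  # one gain for the whole set of points
    return [
        ((cx - px) * gain, (cy - py) * gain)
        for (px, py), (cx, cy) in zip(prev_points, cur_points)
    ]
```

Using one gain for the whole set keeps the relative geometry of the contact points intact, which is the point of keying the adjustment to a single characteristic of the set.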
20130100019 | ENHANCED PROJECTED IMAGE INTERFACE - An interactive display projection system, includes a pointing device which determines a location on the projected display indicated by the pointing device using a combination of a location signal in the display captured by the pointing device and optical mouse circuitry to determine motion of the pointing device when the pointing device is close to the projected display. In another embodiment, the pointing device also includes an inertial sensor and associated circuitry which detects linear accelerations and rotational rates to determine motion and orientation of the pointing device, which are also used to determine the location on the projected display indicated by the pointing device. | 04-25-2013 |
20130100020 | ELECTRONIC DEVICES WITH CAMERA-BASED USER INTERFACES - Electronic devices may include touch-free user input components that include camera modules having overlapping fields-of-view. The overlapping fields-of-view may form a gesture tracking volume in which multi-dimensional user gestures can be tracked using images captured with the camera modules. A camera module may include an image sensor having an array of image pixels and a diffractive element that redirects light onto the array of image pixels. The diffractive element may re-orient the field-of-view of each camera module so that an outer edge of the field-of-view runs along an outer surface of a display for the device. The device may include processing circuitry that operates the device using user input data based on the user gestures in the gesture tracking volume. The processing circuitry may operate the display based on the user gestures by displaying regional markers having a size and a location that depend on the user gestures. | 04-25-2013 |
20130106695 | Context-sensitive query enrichment | 05-02-2013 |
20130106696 | DISPLAY DEVICE AND INFORMATION TRANSMISSION METHOD | 05-02-2013 |
20130106697 | SYSTEM AND METHOD FOR IMPROVING ORIENTATION DATA | 05-02-2013 |
20130120254 | Two-Stage Swipe Gesture Recognition - Systems, methods and computer program products for facilitating the recognition of user air swipe gestures are disclosed. Such systems, methods and computer program products provide a two-stage gesture recognition approach that combines desirable aspects of object manipulation gestures and symbolic gestures in order to create an interaction that is both reliable and intuitive for users of a computing system. In a first position-based stage, the user moves the cursor into a swipe activation zone. Second, in a motion-based stage, the user swipes their hand from the activation zone past a swipe gate within a certain amount of time to complete the interaction. GUI feedback is provided following the first stage to let the user know that the swipe interaction is available, and after the second stage to let the user know that the swipe is completed. | 05-16-2013 |
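The two-stage interaction can be sketched as a small state machine: stage one arms when the cursor enters the activation zone; stage two fires when the hand crosses the swipe gate within a time window. The zone/gate geometry and the timeout are illustrative assumptions, and the GUI feedback described in the abstract is omitted.

```python
class SwipeRecognizer:
    """Two-stage swipe: (1) cursor enters an activation zone,
    (2) hand crosses a swipe gate within a time window.
    Zone/gate positions and the timeout are assumed values."""

    def __init__(self, zone_x=0.8, gate_x=0.2, timeout=1.0):
        self.zone_x = zone_x    # activation zone: x >= zone_x
        self.gate_x = gate_x    # swipe gate: x <= gate_x
        self.timeout = timeout  # seconds allowed for stage two
        self.armed_at = None

    def update(self, x, t):
        """Feed one (normalized x, timestamp) sample; True on a completed swipe."""
        if self.armed_at is None:
            if x >= self.zone_x:          # stage one: arm the gesture
                self.armed_at = t
            return False
        if t - self.armed_at > self.timeout:
            self.armed_at = None          # too slow: reset to idle
            return False
        if x <= self.gate_x:              # stage two: gate crossed in time
            self.armed_at = None
            return True
        return False
```

Requiring both the position-based arming step and the time-bounded motion step is what filters out incidental hand movement while keeping the gesture easy to perform deliberately.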
20130120255 | IMAGE PROCESSING APPARATUS AND METHOD - An image processing apparatus provided with a display unit for displaying an operation picture, showing control indicia (e.g., icons) corresponding to multiple image processing functions, is operated to display the operation picture on an external display device. The operator selects one of the indicia shown on the external display device by operating an input device, such as touch screen of the apparatus or a remote control, to indicate the position of the desired one of the indicia. The operator can thereby quickly and reliably select an image processing function to be executed, without any need to view an operation picture on the display unit when making the selection. | 05-16-2013 |
20130120256 | ELECTRONIC APPARATUS, CONTROL PROGRAM, AND CONTROL METHOD - A control unit changes a display orientation of a display screen according to a tilt direction of a display unit when a display orientation detection unit detects that the display orientation of the display screen coincides with, among orientations in which the display screen is allowed to be displayed, an orientation closest to a vertical downward direction in a state where the detected display orientation of the display screen is not changed according to the tilt direction of the display unit. The display orientation detection unit detects the display orientation of the display screen displayed on the display unit. A tilt detection unit detects the tilt direction of the display unit, which displays information, in relation to the vertical downward direction. | 05-16-2013 |
20130120257 | MOVEMENT SENSING DEVICE USING PROXIMITY SENSOR AND METHOD OF SENSING MOVEMENT - There is disclosed a movement sensing device configured to detect movement of an object on a touch region, including: three or more proximity sensors arranged on one surface adjacent to the touch region independently and two-dimensionally, to measure electrical scalars corresponding to distances to the object on the touch region, respectively; and a control unit configured to calculate a vector of a second touch point relatively changed with respect to a first touch point based on a first electrical scalar and a second electrical scalar measured at a predetermined time difference, wherein a relative moving signal with respect to a reference point is generated by calculating the movement of the object as the vector. | 05-16-2013 |
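The vector calculation at the core of this abstract — a relative moving signal from two touch points measured at a time difference — can be sketched geometrically. The derivation of the touch points from the proximity-sensor scalars is omitted, and all names are assumptions:

```python
def movement_vector(first_point, second_point, dt):
    """Return the relative movement vector from the first to the second
    touch point, plus a velocity estimate over the measurement interval dt.
    A geometric sketch only; the scalar-to-point step is not shown."""
    dx = second_point[0] - first_point[0]
    dy = second_point[1] - first_point[1]
    return (dx, dy), (dx / dt, dy / dt)
```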
20130127711 | TOUCH TRACKING OPTICAL INPUT DEVICE - A trackpad has a cover abutting a housing. The cover includes a body that is transparent to infrared light and visible light. A first ink, deposited in a first area on a surface of the cover body, is transmissive to the infrared light and substantially opaque to visible light. A second ink is deposited in at least one second area on the surface of the cover body and is transmissive to visible light. A first emitter, within the housing, produces infrared light that is transmitted through the cover. A second emitter, within the housing, produces visible light that is transmitted through each second area of the cover. An optical sensor is provided within the housing for receiving infrared light that is reflected by an external object back through the cover. | 05-23-2013 |
20130127712 | GESTURE AND VOICE RECOGNITION FOR CONTROL OF A DEVICE - A user interface allows one or more gestures to be devised by a user and mapped or associated with one or more commands or operations of a TV or other device. The user can select the command/operation that is to be associated with each gesture that he/she devised or created. The user is not limited to the use of pre-set gestures that were previously programmed into a system and is not limited to using pre-set commands/operations that were previously associated with pre-set gestures. In alternative embodiments, voice commands or other audible signals are devised by a user and are mapped or associated with commands/operations of a device. | 05-23-2013 |
20130127713 | Input Device - An input device comprising an optical module having a light guide plate, a light source, a scattering layer and a sensor is provided. The light guide plate has a top surface, a bottom surface, and a side. The light source emits a light to the side. The light travels within the light guide plate. The scattering layer changes a path of parts of the light on the bottom surface so that the light is projected out of the top surface to form a penetrating light. When an object approaches or touches the top surface of the light guide plate, at least a part of the penetrating light is reflected by the object to form a reflected light received by the sensor. According to the reflected light, the input device generates a position signal indicating at least one relative position of the object with respect to the top surface. | 05-23-2013 |
20130127714 | USER INTERFACE SYSTEM AND OPTICAL FINGER MOUSE SYSTEM - There is provided a user interface system including a slave device and a master device. The slave device provides light of two different wavelengths to illuminate a finger surface, receives reflected light from the finger surface to generate a plurality of image frames, calculates and outputs an image data associated with a predetermined number of the image frames. The master device calculates a contact status and a displacement of the finger surface and a physiological characteristic of a user according to the image data. | 05-23-2013 |
20130127715 | 3D Pointing Device With Up-Down-Left-Right Mode Switching and Integrated Swipe Detector - A 3D pointing device for use with a content delivery system is provided. The pointing device can operate in at least one of two modes: a first 3D or scrolling mode, and a second non-3D mode that can also be referred to as an up-down-left-right (UDLR) mode. The pointing device can include one or more directional sensors, to provide orientation and movement information. For either of the at least two modes, an optical finger navigation (OFN) module is provided that can detect movement of a user's finger or object across its screen, and provides a predetermined threshold that must be exceeded before movement information is generated from the OFN module. The pointing device can generate scroll and UDLR commands based on the information from the orientation and movement sensors, as well as the OFN module, or can provide the information from the orientation and movement sensors to a user interface that can generate the appropriate scrolling or UDLR commands for use by the content delivery system. | 05-23-2013 |
20130127716 | Projector - A projector capable of detecting the coordinates of a detection object at a height position away from a projection area to some extent is provided. | 05-23-2013 |
20130127717 | Projector - A projector capable of detecting the position of a detection object while suppressing complication of the structure is provided. This projector is so configured that the optical axes of a laser beam of visible light emitted from a first laser beam generation portion and a laser beam of invisible light emitted from a second laser beam generation portion substantially coincide with each other. | 05-23-2013 |
20130135203 | INPUT GESTURES USING DEVICE MOVEMENT - A handheld electronic device has a cursor which is moved by tilting and/or accelerating the device, where the cursor movement correlates to a bubble in a bull's eye level. Gestures include flicking, shaking, and reversing an acceleration or tilting, to control movement of the cursor, and to execute instructions corresponding to a position of the cursor. These gestures may be combined with touch, speech, buttons, or other known methods of communication between users and devices. | 05-30-2013 |
20130135204 | Unlocking a Screen Using Eye Tracking Information - Methods and systems for unlocking a screen using eye tracking information are described. A computing system may include a display screen. The computing system may be in a locked mode of operation after a period of inactivity by a user. Locked mode of operation may include a locked screen and reduced functionality of the computing system. The user may attempt to unlock the screen. The computing system may generate a display of a moving object on the display screen of the computing system. An eye tracking system may be coupled to the computing system. The eye tracking system may track eye movement of the user. The computing system may determine that a path associated with the eye movement of the user substantially matches a path associated with the moving object on the display and switch to be in an unlocked mode of operation including unlocking the screen. | 05-30-2013 |
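The path-matching test this abstract describes can be sketched as a mean point-to-point distance between the recorded gaze path and the on-screen object path. Assuming the two paths are sampled at aligned instants; the tolerance and names are illustrative, not the patented matching criterion.

```python
def paths_match(eye_path, object_path, tolerance=0.1):
    """Unlock test: True when the gaze path substantially matches the
    moving object's path. Assumes aligned, equal-length samples; the
    tolerance is an illustrative value."""
    if len(eye_path) != len(object_path):
        return False
    total = sum(
        ((ex - ox) ** 2 + (ey - oy) ** 2) ** 0.5
        for (ex, ey), (ox, oy) in zip(eye_path, object_path)
    )
    # mean pointwise distance must stay within the tolerance
    return total / len(eye_path) <= tolerance
```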
20130135205 | Display Method And Terminal Device - The embodiments of the present disclosure provide a display method and a terminal device. The method includes detecting the state of the terminal device, producing a first display command when it is detected that the terminal device is in a first state, and displaying a first content on the display unit according to the first display command; and producing a second display command when it is detected that the terminal device is in a second state and displaying a second content on the display unit according to the second display command; wherein objects included in the first content and objects included in the second content are not exactly the same. The embodiments of the present disclosure produce different display commands according to the detected states of the terminal device and display contents that are not exactly the same through the display unit, so that the terminal device displays different contents in different states, which improves the user's operation and experience. | 05-30-2013 |
20130141331 | METHOD FOR PERFORMING WIRELESS DISPLAY CONTROL, AND ASSOCIATED APPARATUS AND ASSOCIATED COMPUTER PROGRAM PRODUCT - A method and apparatus for performing wireless display control and an associated computer program product are provided, where the method is applied to an electronic device. The method includes: detecting whether a wireless display control agent device corresponding to the electronic device exists, wherein the wireless display control agent device is utilized as an agent for the electronic device to perform wireless display control on a display device electrically connected to the wireless display control agent device; and when it is detected that the wireless display control agent device exists, providing a user with a user interface, allowing the user to utilize a specific operating gesture to start an automatic wireless configuration of the electronic device without performing any manual wireless configuration of the electronic device, wherein based upon the automatic wireless configuration, a wireless connection between the electronic device and the wireless display control agent device is automatically established. | 06-06-2013 |
20130141332 | HYBRID POINTING DEVICE - The present invention discloses a hybrid pointing device including an optical navigation module and a pointing module. The optical navigation module is configured to replace the conventional buttons of a conventional pointing device, such as an optical mouse or a trackball mouse. The optical navigation module is configured to sense gestures of at least one object operated by a user to activate commands associated with particular programs running on the host. Since the optical navigation module is only configured to sense gestures of the object but not the movement of the hybrid pointing device relative to a surface, its resolution only needs to be high enough for sensing gestures and need not be relatively high. | 06-06-2013 |
20130147709 | APPARATUS AND METHOD FOR DETECTING TAP - Provided is an apparatus and method for detecting a tap. The apparatus includes a sensor configured to detect a motion and output a signal corresponding to the motion, a gradient calculating unit connected to the sensor to calculate a gradient of the output signal from the sensor, a similarity determining unit connected to the gradient calculating unit to determine a similarity between a rising gradient and a falling gradient of a curve of the output signal, a tap determining unit connected to the similarity determining unit to determine detection of a tap according to the determination result of the similarity determining unit, and an output unit configured to output the determination result of the tap determining unit. | 06-13-2013 |
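The gradient-similarity test described in this abstract can be sketched as: scan the signal's first differences for a sharp rise immediately followed by a fall of similar magnitude. Thresholds and names are illustrative assumptions, not values from the patent:

```python
def detect_tap(signal, spike_threshold=1.0, similarity=0.5):
    """Scan a sampled motion signal for a tap: a sharp rising gradient
    immediately followed by a falling gradient of similar magnitude.
    Thresholds are assumed values for illustration."""
    grads = [b - a for a, b in zip(signal, signal[1:])]
    for rise, fall in zip(grads, grads[1:]):
        if rise > spike_threshold and fall < -spike_threshold:
            # similar rising and falling gradients => tap, not slow drift
            if abs(abs(rise) - abs(fall)) <= similarity * abs(rise):
                return True
    return False
```

A slow tilt produces gradients below the spike threshold, and a sustained push produces a rise without a matching fall, so neither registers as a tap under this test.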
20130147710 | DISPLACEMENT DETECTING APPARATUS AND DISPLACEMENT DETECTING METHOD - A displacement detecting apparatus, comprising: a first detecting module, for detecting displacement for an object on a detecting surface of the displacement detecting apparatus to generate first location information, and for generating a first control signal according to the first location information; a second detecting module, for detecting a target, and for detecting second location information for the displacement detecting apparatus relative to the target, and for generating a second control signal according to the second location information; and a switch apparatus, for selectively outputting at least one of the first control signal and the second control signal. | 06-13-2013 |
20130147711 | CAMERA-BASED MULTI-TOUCH INTERACTION APPARATUS, SYSTEM AND METHOD - An apparatus, system and method control and interact within an interaction volume within a height over the coordinate plane of a computer such as a computer screen, interactive whiteboard, horizontal interaction surface, video/web-conference system, document camera, rear-projection screen, digital signage surface, television screen or gaming device, to provide pointing, hovering, selecting, tapping, gesturing, scaling, drawing, writing and erasing, using one or more interacting objects, for example, fingers, hands, feet, and other objects, for example, pens, brushes, wipers and even more specialized tools. The apparatus and method may be used together with, or even be integrated into, data projectors of all types and their fixtures/stands, and used together with flat screens to render display systems interactive. The apparatus has a single camera covering the interaction volume from either a very short distance or from a larger distance to determine the lateral positions and to capture the pose of the interacting object(s). | 06-13-2013 |
20130147712 | Information Processing Device And Control Method Thereof - An information processing device and a control method applied to the information processing device are described. The information processing device includes a display unit configured to display images; an input unit configured to receive inputs from a user; a motion detecting unit configured to detect motion of the information processing device and to generate data related to the motion; and a processing unit connected to the display unit, the input unit and the motion detecting unit. The processing unit is configured to receive the motion-related data from the motion detecting unit, and to enable/disable the display unit and/or the input unit based on the data related to the motion. | 06-13-2013 |
20130154930 | GESTURE CONTROLLED AUDIO USER INTERFACE - A user interface, methods and article of manufacture each for selecting an audio cue presented in three-dimensional (3D) space are disclosed. The audio cues are audibly perceivable in a space about a user, where each of the audio cues may be perceived by the user as a directional sound at a distinct location from other audio cues in the space. Selection of a specific audio cue is made based on one or more user gestures. A portable electronic device may be configured to present the audio cues perceived by a user and detect certain user gestures to select audio cues. The audio cue selection can be used to control operation of the portable device and/or other associated devices. | 06-20-2013 |
20130162534 | Device, Method, and Graphical User Interface for Manipulating a Three-Dimensional Map View Based on a Device Orientation - An electronic device displays on a display a first three-dimensional map view of a respective map location. The first three-dimensional map view is viewed from a first angle while an orientation of the electronic device corresponds to a first orientation. The electronic device detects a rotation of the electronic device with at least one orientation sensor, and determines a respective orientation of the electronic device. The respective orientation is distinct from the first orientation. While detecting the rotation of the electronic device, the electronic device updates the first three-dimensional map view with a respective three-dimensional map view of the respective map location. The respective three-dimensional map view is viewed from a respective angle distinct from the first angle. The respective angle is determined in accordance with the respective orientation of the electronic device. | 06-27-2013 |
20130162535 | MANIPULATION INPUT DEVICE WHICH DETECTS HUMAN HAND MANIPULATIONS FROM CAPTURED MOTION IMAGES - When a vehicle navigation system is manipulated by taking pictures of a user hand motion and gesture with a camera, as the number of apparatuses and operational objects increases, the associated hand shapes and hand motions increase, thus causing a complex manipulation for a user. Furthermore, in detecting a hand with the camera, when the image of a face having color tone information similar to that of a hand appears in an image taken with a camera, or outside light rays such as sun rays or illumination rays vary, detection accuracy is reduced. To overcome such problems, a manipulation input device is provided that includes a limited hand manipulation determination unit and a menu representation unit, whereby a simple manipulation can be achieved and manipulation can accurately be determined. In addition, detection accuracy can be improved by a unit that selects a single result from results determined by a plurality of determination units, based on images taken with a plurality of cameras. | 06-27-2013 |
20130162536 | SYSTEMS AND METHODS FOR ENABLING OR ACCESSING OR VIEWING COMPONENTS, INVOLVING AN INPUT ASSEMBLY AND A SCREEN - The present invention in a preferred embodiment provides systems and methods for enabling or accessing or viewing components through a graphical user interface, wherein the system comprises, | 06-27-2013 |
20130162537 | ORIENTATION CALCULATION APPARATUS, STORAGE MEDIUM HAVING ORIENTATION CALCULATION PROGRAM STORED THEREIN, GAME APPARATUS, AND STORAGE MEDIUM HAVING GAME PROGRAM STORED THEREIN - An orientation calculation apparatus obtains data from an input device including at least a gyroscope and an acceleration sensor, and calculates an orientation of the input device in a three-dimensional space. Orientation calculation means calculates the orientation of the input device in accordance with an angular rate detected by the gyroscope. Acceleration vector calculation means calculates an acceleration vector representing an acceleration of the input device in accordance with acceleration data from the acceleration sensor. Correction means corrects the orientation of the input device such that a direction of the acceleration vector in the space approaches a vertically downward direction in the space. Also, the correction means corrects the orientation of the input device such that a directional change before and after the correction is minimized regarding a predetermined axis representing the orientation of the input device. | 06-27-2013 |
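The correction step this abstract describes — integrating the gyroscope's angular rate, then pulling the result toward the gravity direction implied by the acceleration vector — resembles a complementary filter. Below is a one-axis sketch under that assumption; the patent works on full 3D orientations, and the blend factor and names are illustrative:

```python
import math

def update_tilt(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One complementary-filter step: integrate the gyroscope rate,
    then nudge the result toward the tilt implied by the measured
    acceleration (assumed to be dominated by gravity). A 1-axis
    simplification of the 3D correction described in the abstract."""
    gyro_angle = angle + gyro_rate * dt          # dead-reckoned orientation
    accel_angle = math.atan2(accel_x, accel_z)   # gravity-derived orientation
    # blend: mostly gyro, slightly corrected toward the accelerometer
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```

Repeated application keeps short-term motion tracking driven by the gyroscope while the accelerometer term slowly bleeds off integration drift, which mirrors the abstract's "direction of the acceleration vector approaches a vertically downward direction" correction.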
20130169531 | System and Method of Determining Pupil Center Position - Determining pupil center position. At least some illustrative embodiments are methods including: creating a video signal of an eye, the video signal comprising a stream of frames; and finding an indication of pupil position. The finding may include: calculating a set of feature points within a first frame of the video signal; dividing, by the computer system, the first frame of the video signal into a plurality of sections; selecting a plurality of feature points from the first frame, at least one feature point selected from each section; and determining an ellipse from the plurality of feature points. The method may further include moving a cursor on a display device responsive to change in location of a feature of the ellipse with respect to a previous feature of an ellipse from a previous frame. | 07-04-2013 |
20130169532 | System and Method of Moving a Cursor Based on Changes in Pupil Position - Moving a cursor based on changes in pupil position. At least some of the illustrative embodiments are methods including: creating an analog video signal of an eye of a computer user, the analog video signal comprising interlaced video with two fields per frame; calculating a first location of a pupil within at least one field of a frame; calculating a frame location of the pupil based on location of the pupil in the at least one field; and moving a cursor on a display device of the computer system, the moving responsive to a change in the frame location of the pupil with respect to a previous frame location, and the moving in real time with movement of the pupil. | 07-04-2013 |
20130169533 | System and Method of Cursor Position Control Based on the Vestibulo-Ocular Reflex - Cursor position control based on the vestibulo-ocular reflex. At least some of the illustrative embodiments are methods including: creating a first video stream, the first video stream depicting an eye of user of a computer system, wherein a pupil of the eye changes position relative to a face of the user during use of the computer system by the user; tracking pupil position relative to the face of the user, the tracking by way of the first video stream; moving a cursor position on the display device, the moving responsive to changes in pupil position relative to the face of the user, and the moving in real time with pupil position changes; and adjusting cursor position based on the vestibulo-ocular reflex. | 07-04-2013 |
20130169534 | COMPUTER INPUT DEVICE - A computer input device is disclosed which comprises a keyboard having a plurality of keys for entering commands and characters into the computer, the keyboard having a designated surface area overlaying the plurality of keys, at least one of the plurality of keys being located outside of the designated surface area, a touch sensor for detecting one or more touches by one or more objects on the designated surface area of the keyboard, and an input processor configured to switch the keyboard into a mouse mode when the touch sensor has detected the designated surface area being touched by a single object, the input processor configured to switch the keyboard into a keyboard mode when the touch sensor has detected the designated surface area being touched by two or more objects. | 07-04-2013 |
20130169535 | MORE USEFUL MAN MACHINE INTERFACES AND APPLICATIONS - A method for enhancing a well-being of a small child or baby utilizes at least one TV camera positioned to observe one or more points on the child or an object associated with the child. Signals from the TV camera are outputted to a computer, which analyzes the output signals to determine a position or movement of the child or child associated object. The determined position or movement is then compared to preprogrammed criteria in the computer to determine a correlation or importance, and thereby to provide data to the child. | 07-04-2013 |
20130169536 | CONTROL OF A WEARABLE DEVICE - A wearable device including a camera and a processor and a control interface between the wearable device and a user of the wearable device. An image frame is captured from the camera. Within the image frame, an image of a finger of the user is recognized. The recognition of the finger by the wearable device controls the wearable device. | 07-04-2013 |
20130169537 | IMAGE PROCESSING APPARATUS AND METHOD, AND PROGRAM THEREFOR - An image processing apparatus includes an extracting unit for extracting a feature point from a captured image; a recognizing unit for recognizing a position of the feature point; a display unit for displaying, based on the position of the feature point, a feature-point pointer indicating the feature point and a mirrored image of the captured image in a translucent manner; and an issuing unit for issuing, based on the position of the feature point, a command corresponding to the position of the feature point or a motion of the feature point. | 07-04-2013 |
20130176220 | TOUCH FREE OPERATION OF ABLATOR WORKSTATION BY USE OF DEPTH SENSORS - An inventive system and method for touch free operation of an ablation workstation is presented. The system can comprise a depth sensor for detecting a movement, motion software to receive the detected movement from the depth sensor, deduce a gesture based on the detected movement, and filter the gesture to accept an applicable gesture, and client software to receive the applicable gesture at a client computer in an ablation workstation for performing a task in accordance with client logic based on the applicable gesture. The system can also comprise hardware for making the detected movement an applicable gesture. The system can also comprise voice recognition providing voice input for enabling the client to perform the task based on the voice input in conjunction with the applicable gesture. The applicable gesture can be a movement authorized using facial recognition. | 07-11-2013 |
20130176221 | SENSING DEVICE HAVING CURSOR AND HYPERLINKING MODES - An optical sensing device for controlling a computer system is disclosed. The device has a nib for receiving a nib force upon the nib being pressed against a substrate and a nib switch coupled to the nib. An optical sensor images optically coded data printed on the substrate. A processor effects a mode change between a cursor control mode and a hyperlinking mode upon the nib force actuating the nib switch, generates cursor control data when the optical sensing device is in the cursor control mode, and generates interaction data when the optical sensing device is in the hyperlinking mode. The interaction data indicates a coordinate position of the optical sensing device relative to the substrate. The cursor control data or the interaction data is then communicated to the computer system, where the cursor control data initiates a cursor control response and the interaction data initiates a hyperlinking response. | 07-11-2013 |
20130176222 | OPERATIONAL DISPLAY DEVICE AND METHOD OF CONTROLLING THE SAME, AND RECORDING MEDIUM - An operational display device includes a display portion for displaying an image based on image data including a specific portion, a detection portion for detecting an orientation in which the display portion is held, and a display control unit for controlling a manner of display on the display portion. The display control unit causes an image of the specific portion to be displayed as being zoomed-in and rotated in accordance with an orientation of holding of the display portion when the orientation in which the display portion is held is changed from a first orientation to a second orientation. | 07-11-2013 |
20130176223 | DISPLAY APPARATUS, USER INPUT APPARATUS, AND CONTROL METHODS THEREOF - A display apparatus, a user input apparatus, and control methods thereof are provided. The user input apparatus is connected to a display apparatus having a plurality of display modes and includes: a touch pad unit which receives a user command relating to operation of a user interface (UI) screen corresponding to a first display mode; a button unit which receives a user command relating to operation of a UI screen corresponding to a second display mode; and a communicator which transmits a mode change command, which is entered via either the touch pad unit or the button unit, to the display apparatus. The mode change command is a user command which is generated in conjunction with an execution of an operation method corresponding to a display mode which is different from a current display mode. | 07-11-2013 |
20130181899 | REMOTE CONTROL FOR SENSING MOVEMENT, IMAGE DISPLAY APPARATUS FOR CONTROLLING POINTER BY THE REMOTE CONTROL, AND CONTROLLING METHOD THEREOF - A remote control is provided including a plurality of sensors which sense movement of the remote control, and a control unit which turns on at least one sensor of the plurality of sensors and thereby senses movement of the remote control, and determines whether to turn on or off the remaining sensors according to whether or not the at least one sensor senses movement of the remote control. Consequently, battery consumption is reduced. | 07-18-2013 |
20130181900 | NON-CONTACT SELECTION DEVICE - A non-contact selecting device is disclosed. The non-contact selecting device includes a light source, emitting light to the outside; a camera unit, generating and outputting a video signal corresponding to an external video; a video data generating unit, generating video data corresponding to the video signal; and an identity unit, detecting, in units of each frame, a location of a detected area formed by the emitted light that is reflected by pointing means and inputted, recognizing a moving locus of the detected area by comparing at least two continuous frames, and generating and outputting corresponding change information. With the present invention, function selection can be performed more quickly and easily. | 07-18-2013 |
20130187852 | THREE-DIMENSIONAL IMAGE PROCESSING APPARATUS, THREE-DIMENSIONAL IMAGE PROCESSING METHOD, AND PROGRAM - A three-dimensional image processing apparatus includes: an output unit configured to output a plurality of three-dimensional images to a display apparatus; a detection unit configured to detect a pointer in association with a three-dimensional image displayed on the display apparatus; an operation determination unit configured to determine a predetermined operation based on movement of the pointer detected by the detection unit; and an image processing unit configured to perform, on the three-dimensional image associated with the pointer, processing associated with the predetermined operation determined by the operation determination unit, and to cause the output unit to output the processed three-dimensional image. | 07-25-2013 |
20130187853 | DISPLAY SYSTEM - An embodiment of the present invention provides a display system, and the display system comprises a light beam emitting device, emitting a first light beam for marking an input position and a second light beam for confirming the input position, the first light beam being visible light, and the second light beam differing from the first light beam; and a display device, comprising a displaying area, photo-sensitive devices distributed within the displaying area, processing devices coupled with the photo-sensitive devices. The photo-sensitive devices are used for sensing the second light beam projected upon the displaying area and achieving a sensing result; the processing devices are used for determining the projecting position of the second light beam upon the displaying area according to the sensing result, and performing a corresponding operation based on the sensing result. The present invention is capable of performing remote touch operation on the display device. | 07-25-2013 |
20130187854 | Pointing Device Using Camera and Outputting Mark - A pointing device such as a mouse or joystick comprises a camera for capturing the display screen and image processing means for recognizing and tracking the pointing cursor icon or mark from the captured image and producing the pointing signal. The pointing device of the present invention can be used with any type of display without any additional tracking means such as an ultrasonic sensor, infrared sensor or touch sensor. The pointing device of the present invention includes a mark outputting portion, a camera portion for capturing the said mark outputting portion, and an image processing portion for recognizing the said mark outputting portion from the captured image and producing the pointing signal. | 07-25-2013 |
20130194182 | GAME DEVICE, CONTROL METHOD FOR A GAME DEVICE, AND NON-TRANSITORY INFORMATION STORAGE MEDIUM - A position information acquiring unit acquires position information relating to positions of a plurality of body parts of a player. A determination unit determines whether or not the at least one of the plurality of body parts exists within a determination region including the reference position at a time point corresponding to the reference time. A body part information acquiring unit acquires body part information relating to a kind of the at least one of the plurality of body parts determined to exist within the determination region. An evaluation unit evaluates gameplay of the player based on the kind of the at least one of the plurality of body parts acquired by the body part information acquiring unit. | 08-01-2013 |
20130194183 | COMPUTER MOUSE PERIPHERAL - A computer pointing device including: a base portion with a lower surface adapted for sliding across a work surface, a spine portion, projecting substantially upward from said base portion and having a thumb-engaging surface on a first lateral side of the spine and at least one index fingertip and/or middle fingertip-engaging surface on a second lateral side of the spine opposing said first lateral side. A keyboard with an altered arrangement of function of keys, such as an enlarged or truncated or no spacebar or capable of an altered appearance in accordance with keys being re-mapped to sensors on a pointing device. A keyboard with a virtual screen display, which may be made semi-transparent by activating a sensor on a pointing device. A computer with a recess capable of accommodating a mouse device. A locked scrolling or zooming means, using any pointing device, in which scrolling or zooming in a defined direction is proportional to the distance travelled by the device, irrespective of direction of movement of the device. | 08-01-2013 |
20130194184 | METHOD AND APPARATUS FOR CONTROLLING MOBILE TERMINAL USING USER INTERACTION - A method and apparatus for controlling a mobile terminal through use of user interaction are provided. The method includes operating in a vision recognition mode that generates a vision recognition image through use of a signal output from a second plurality of pixels designated as vision pixels from among a plurality of pixels of an image sensor included in the mobile terminal; determining whether a predetermined object in the vision recognition image corresponds to a person; determining a gesture of the predetermined object when the predetermined object corresponds to the person; and performing a control function of the mobile terminal corresponding to the gesture of the predetermined object. | 08-01-2013 |
20130201104 | MULTI-USER INTERACTIVE DISPLAY SYSTEM - A multi-user interactive display system including a soft-copy display including at least an information display region and a command control region, and a digital image capture system positioned to capture a time sequence of images of users located in a field-of-view of the soft-copy display. A time sequence of images is analyzed to detect a plurality of users, and at least one of the users is designated to be a controlling user. The captured images are displayed in the command control region, wherein the detected users are demarked using graphical elements. The captured time sequence of images is analyzed to detect a gesture made by the controlling user and content displayed in the information display region is updated accordingly. | 08-08-2013 |
20130201105 | METHOD FOR CONTROLLING INTERACTIVE DISPLAY SYSTEM - A method for controlling a multi-user interactive display system including a soft-copy display including at least an information display region and a command control region, and a digital image capture system positioned to capture a time sequence of images of users located in a field-of-view of the soft-copy display. A time sequence of images is analyzed to detect a plurality of users, and at least one of the users is designated to be a controlling user. The captured images are displayed in the command control region, wherein the detected users are demarked using graphical elements. The captured time sequence of images is analyzed to detect a gesture made by the controlling user and content displayed in the information display region is updated accordingly. | 08-08-2013 |
20130201106 | METHOD FOR CONTROLLING ACTIONS BY USE OF A TOUCH SCREEN - A method for controlling a pointer having a position determined by the position of at least one end of a member on a touch screen. An offset is inserted between the position of the pointer and that of the end of the member for driving movements of the pointer such that the end does not have to cover an object displayed on the screen in order to effectively select the object. | 08-08-2013 |
20130207894 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM - There is provided an information processing device including a movement information acquisition part acquiring information which is based on movement of an operation device, and a control information generation part generating, based on the information, control information for changing a display status continuously in a non-linear manner according to the movement. | 08-15-2013 |
20130207895 | EYE TRACKING METHOD AND DISPLAY APPARATUS USING THE SAME - A display apparatus employs two tracking units to track the gaze of a user. The display apparatus includes a first tracking unit to generate position information on a user positioned relative to a displayed image; and a second tracking unit to track a gaze of the user upon the displayed image, based on the position information. A method of eye tracking using the display apparatus includes steps of displaying an image; generating position information on a user positioned relative to the displayed image; and tracking a gaze of the user upon the displayed image, based on the position information. | 08-15-2013 |
20130207896 | Augmented reality display system and method of display - The present invention describes a display system that includes a display, including a display screen; a viewpoint assessment component to determine a viewpoint of a user positioned in front the display screen; and an object tracking component to track the user manipulation of an object positioned behind the display screen. | 08-15-2013 |
20130215027 | Evaluating an Input Relative to a Display - Disclosed embodiments relate to evaluating an input relative to a display. A processor may receive information from an optical sensor and a depth sensor. The depth sensor may sense the distance of an input from the display. The processor may evaluate an input to the display based on information from the optical sensor and the depth sensor. | 08-22-2013 |
20130215028 | APPARATUS SYSTEM AND METHOD FOR HUMAN-MACHINE-INTERFACE - There is provided a 3D human machine interface (“3D HMI”), which 3D HMI may include: (1) an image acquisition assembly, (2) an initializing module, (3) an image segmentation module, (4) a segmented data processing module, (5) a scoring module, (6) a projection module, (7) a fitting module, (8) a scoring and error detection module, (9) a recovery module, (10) a three dimensional correlation module, (11) a three dimensional skeleton prediction module, (12) an output module and (13) a depth extraction module. | 08-22-2013 |
20130222241 | APPARATUS AND METHOD FOR MANAGING MOTION RECOGNITION OPERATION - Provided is an apparatus and method for managing a motion recognition operation. The apparatus includes a sensor unit to detect a motion; and a motion recognition processing unit to determine a motion event corresponding to the detected motion, to determine an execution event corresponding to the motion event, and to perform the execution event. The method includes detecting a motion using a sensor; determining a motion event corresponding to the detected motion; converting the motion event into an execution event with respect to an application operating in a foreground; and transmitting the execution event to the application. | 08-29-2013 |
20130222242 | VIRTUAL INTERFACE AND CONTROL DEVICE - An input device for a computer or other programmable device translates the proximity of an object to one or more antennae into an electronic signal. The antennae generate a first frequency and a second frequency. When an object, such as a hand, is placed in proximity to the antenna, the object causes the first and second frequencies to heterodyne, which creates a third frequency, also referred to as a beat frequency or pulse frequency. A receiver interprets the pulse frequency and translates it into an electronic signal that can be used to command a computer or other programmable device. | 08-29-2013 |
20130222243 | METHOD AND APPARATUS FOR SCROLLING A SCREEN IN A DISPLAY APPARATUS - A method of scrolling a screen in a display apparatus includes initiating a screen-scroll according to a speed of a currently generated flick input when the flick input is generated, comparing a tilt of an axis of a corresponding apparatus with an initial location at a time of generation of the flick input to determine whether the amount of change is within a reference value, maintaining a current screen-scroll speed if the change amount deviates from the reference value, and stopping a screen-scroll operation when a scroll stop condition is met. | 08-29-2013 |
20130229345 | Manual Manipulation of Onscreen Objects - According to some embodiments, hand gestures may be used entirely to control the apparent action of objects on a display screen. As used herein, using "only" hand gestures means that no physical object need be grasped by the user's hand in order to provide the hand gesture commands. As used herein, the term "hand-shaped cursor" means a movable hand-like image that can be made to appear to engage or grasp objects depicted on a display screen. In contrast, a normal arrow cursor cannot engage objects on a display screen. | 09-05-2013 |
20130229346 | METHOD AND APPARATUS FOR A CAMERA MODULE FOR OPERATING GESTURE RECOGNITION AND HOME APPLIANCE - A camera module for operating gesture recognition includes a camera, an evaluation unit which evaluates picture data generated by the camera for recognition of an operating gesture from a predefined set of operating gestures, wherein the operating gestures serve for operator control of a home appliance, wherein the home appliance includes a control device for function control separate from the camera module, and an interface for coupling of the camera module with the control device, wherein information according to the recognized operating gesture is transmitted to the control device via the interface. | 09-05-2013 |
20130234938 | KEY MODULE AND ELECTRONIC DEVICE INCLUDING THE SAME - An electronic device includes a case and a key module. The key module is mounted on a supporting surface of a housing of the case and includes: a touch pad, a circuit board, a press switch and a transmission member. The press switch is coupled electrically to the circuit board and is disposed at a position corresponding to the middle of the touch pad in a lengthwise direction. The transmission member is mounted between the circuit board and the supporting surface, and includes two levers and a press member disposed between the levers and abutting against the press switch. A corresponding electrical signal can be produced by abutment of the press member against the press switch linked by the transmission member. | 09-12-2013 |
20130234939 | Keyboard Having a Key-In Area Integrated with a Touch Sensor Device and a Method Thereof - The present invention is related to a keyboard having a key-in area integrated with a touch sensor device and a method thereof by providing at least one touch sensor device at a key location within the key-in area of a keyboard main body, and electrically connecting the touch sensor device to the keyboard main body to transmit a signal via the keyboard main body. With the method of integrating a key-in area of a keyboard with a touch sensor device provided by the present invention, the keyboard and the pointing device can be integrated altogether as one such that the operation and control efficiency thereof is advantageously enhanced and is adapted for various application environments and practical usages. | 09-12-2013 |
20130234940 | Pointing Device, Operating Method Thereof and Relative Multimedia Interactive System - An operating method of a display device includes controlling a shift of a cursor of a user interface reference frame according to a shift of the pointing device with reference to an initial point in a 3D spatial reference frame; and updating a position of the initial point in the 3D spatial reference frame according to an updating signal. An advantage of the present invention is that when the operating range is changed, the reference coordinates utilized by the pointing device are appropriately adjusted so as to lower the effect of offset, allowing the pointing device to be applied in different areas/directions without the cursor displayed on the display device incorrectly reflecting the shift of the pointing device. | 09-12-2013 |
20130234941 | CONTROL SYSTEM WITH RIDGE INTERFACE AND METHOD OF OPERATION THEREOF - A control system includes: a ridge interface configured to perform a gesture therewith; and a control unit, coupled to the ridge interface, configured to interpret the gesture for controlling a device. | 09-12-2013 |
20130241830 | GESTURE INPUT APPARATUS, CONTROL PROGRAM, COMPUTER-READABLE RECORDING MEDIUM, ELECTRONIC DEVICE, GESTURE INPUT SYSTEM, AND CONTROL METHOD OF GESTURE INPUT APPARATUS - A gesture input apparatus that recognizes gesture made by a gesture actor in front of a camera, and controls a control target device on the basis of the recognized gesture is provided. The gesture input apparatus comprising a sensor detection part configured to detect an input from a sensor, and a gesture recognition part configured to start gesture recognition using an image captured by the camera, on the basis of a time when the input from the sensor is detected. | 09-19-2013 |
20130241831 | Collapsible input device - In exemplary implementations of this invention, a handheld, collapsible input device (CID) may be employed by a user to input and manipulate 3D information. The CID telescopes in length. As a user presses the CID against a display screen, the physical length of the CID shortens, and the display screen displays a virtual end of the CID that appears to project through the screen into the virtual 3D space behind the screen. The total apparent length of the CID, comprised of a physical portion and a virtual portion, remains the same (after taking into account foreshortening). Thus, the user experience is that, as the user holds the physical CID and pushes it against the display screen, the end of the CID appears to be pushed through the display screen into the virtual 3D space beyond it. The CID housing may include a push button for user input. | 09-19-2013 |
20130241832 | METHOD AND DEVICE FOR CONTROLLING THE BEHAVIOR OF VIRTUAL OBJECTS ON A DISPLAY - A method for use in controlling images on a screen, including identifying each object from some objects with respect to a sensing surface, and assigning a dedicated image to that object for presentation on a screen, sensing behavior of that object by monitoring its position contacting the sensing surface and generating position data indicative thereof, and selectively identifying a break in contact between the contacting object and the sensing surface and generating data indicative thereof, processing the position data and generating transformation data between the coordinate system of the sensing surface and a virtual coordinate system of the screen, and selectively generating and storing data indicative of a last position in the virtual coordinate system of an image corresponding to a contacting object, when the contacting object breaks contact with the sensing surface; and using the transformation data for controlling the image associated with each contacting object on the screen. | 09-19-2013 |
20130241833 | POSE TRACKING PIPELINE - A method of tracking a target includes receiving from a source a depth image of a scene including the human subject. The depth image includes a depth for each of a plurality of pixels. The method further includes identifying pixels of the depth image that belong to the human subject and deriving from the identified pixels of the depth image one or more machine readable data structures representing the human subject as a body model including a plurality of shapes. | 09-19-2013 |
20130241834 | System and method for using information from intuitive multimodal interactions for media tagging - System and method for using information extracted from intuitive multimodal interactions in the context of media for media tagging are disclosed. In one embodiment, multimodal information related to media is captured during multimodal interactions of a plurality of users. The multimodal information includes speech information and gesture information. Further, the multimodal information is analyzed to identify speech portions of interest. Furthermore, relevant tags for tagging the media are extracted from the speech portions of interest. | 09-19-2013 |
20130249796 | INFORMATION PROCESSING DEVICE, COMPUTER-READABLE STORAGE MEDIUM, AND PROJECTING SYSTEM - An information processing device includes a storage unit configured to store a predetermined motion of a user who uses an operating device and an attribute of the predetermined motion per role of the user in association with each other; an image capturing unit configured to capture an image of a predetermined area including a projection area on which a projecting device projects an image; an identification unit configured to identify the attribute associated with the predetermined motion corresponding to a motion of light emitted to the predetermined area from the operating device based on the motion of light and an operation signal, referring to the storage unit; a synthetic image generation unit configured to generate a synthetic image by reflecting the attribute of the predetermined motion in the image projected; and a history record unit configured to generate history data including the synthetic image, the role, the attribute. | 09-26-2013 |
20130249797 | METHOD FOR EXECUTING MOUSE FUNCTION OF ELECTRONIC DEVICE AND ELECTRONIC DEVICE THEREOF - A method for executing a mouse function of an electronic device and an electronic device thereof are provided. In the present method, an amount and a relative position of input signals are detected by a sensor module. Then, it is determined whether the amount and the relative position conform to respective predetermined values. If they conform, it is determined whether the input signal conforms to a specific signal when a variation of the relative position occurs. Finally, a corresponding mouse function is executed according to a type of the variation if the variation conforms to the specific signal. As a result, a mouse device is no longer needed for the user to perform directional operations on the electronic device, avoiding the inconvenience of carrying a mouse device. | 09-26-2013 |
20130257723 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND COMPUTER PROGRAM - There is provided an information processing apparatus including an operation detection unit configured to detect an orientation of a user's face and operations performed by the user, and an area selection unit configured to, when the operation detection unit detects that the user has performed a first operation, select an area on a screen based on the orientation of the user's face during the first operation. | 10-03-2013 |
20130257724 | Method And Apparatus For User Interface Of Input Devices - A 3 dimensional (3-D) user interface system employs: one or more 3-D projectors configured to display an image at a first location in 3-D space; one or more sensors configured to sense user interaction with the image and to provide user interaction information; and a processor configured (i) to receive the user interaction information from the one or more sensors; (ii) to correlate the user interaction with the image; and (iii) to provide one or more indications responsive to a correlation of the user interaction with the image, including displaying the image at a second location in 3-D space. | 10-03-2013 |
20130257725 | SELECTION DEVICE AND METHOD FOR PERFORMING POSITIONING OPERATION - A method for performing a positioning operation related to an image area having a specific position includes the following steps. A specific orientation directed towards the specific position is provided, wherein the specific orientation and the specific position have a specific spatial relation therebetween, and the specific orientation and the image area have a specific angle structure therebetween. The specific angle structure is determined by detecting the specific orientation. The specific spatial relation is calculated according to the determined specific angle structure and the image area. A selection device for performing a positioning operation is also provided. | 10-03-2013 |
20130265229 | CONTROL OF REMOTE DEVICE BASED ON GESTURES - Embodiments of the present invention are directed toward controlling electronic devices based on hand gestures detected from the topography of a portion of a user's body. For example, pressure data indicative of the user's bone and tissue position corresponding to a certain movement, position, and/or pose of the user's hand may be detected. An electromyographic (EMG) sensor coupled to the user's skin can also be used to determine gestures made by the user. These sensors can be coupled to a camera that can be used to capture images, based on recognized gestures, of a device. The device can then be identified and controlled. | 10-10-2013 |
20130265230 | IMAGE POSITIONING METHOD AND INTERACTIVE IMAGING SYSTEM USING THE SAME - There is provided an image positioning method including the steps of: capturing an image frame with an image sensor; identifying at least one object image in the image frame; comparing an object image size of the object image with a size threshold and identifying the object image having the object image size larger than the size threshold as a reference point image; and positioning the reference point image. There is further provided an interactive imaging system. | 10-10-2013 |
20130265231 | Gaze Based Communications for Locked-In Hospital Patients - Effective patient-centered care in a hospital relies heavily on the ability of patients to communicate their physical needs to care givers. If a patient is unable to speak, he has limited means of communicating at a time when he needs it the most. The embodiments presented here, generally referred to as EyeVoice, include unobtrusive eye-operated communication systems for locked-in hospital patients who cannot speak or gesture. EyeVoice provides an alternate means of communication, allowing hospital patients to communicate with their care givers using their eyes in place of their voices. Simply by looking at images and cells displayed on a computer screen placed in front of them, patients are able to: answer questions posed by caregivers; specify locations, types, and degrees of pain and discomfort; request specific forms of assistance; ask or answer care-related questions; and help direct their own care. | 10-10-2013 |
20130265232 | TRANSPARENT DISPLAY APPARATUS AND METHOD THEREOF - A transparent display apparatus and method for displaying information thereon includes sensing a position of an object, sensing a position of a user, determining an area of the transparent display through which the object is viewable by the user, and displaying the information on the transparent display based on the area. | 10-10-2013 |
20130271370 | FREE HAND GESTURE CONTROL OF AUTOMOTIVE USER INTERFACE - A free hand gesture control user interface is described. Components of the interface system may include a stereo vision camera, or multiple cameras, a control unit, a projector and a projecting surface. In an automotive application, the projecting surface may be the windscreen in the vehicle or a mounted screen. The camera together with software modules in the control unit define a gesture acceptance border and a gesture acceptance space within which hand gestures may be imaged, defined for various control actions, and executed. The hand images are analyzed in slices of the gesture acceptance space to determine hand orientation (open or closed) and/or motion. | 10-17-2013 |
20130271371 | ACCURATE EXTENDED POINTING APPARATUS AND METHOD THEREOF - An accurate extended pointing device and a method thereof are disclosed, which use a line vector formed by a user's finger and hand to move an indicator on a screen. The accurate extended pointing device mainly includes: an image capturing unit, being configured to capture a user image; and a directional processing unit, being configured to analyze the user image to generate a piece of user finger image data and a piece of user hand image data and generate a virtual pointing vector according to the user finger image data and the user hand image data. According to the virtual pointing vector, the operation indicator in the working frame is controlled to move as the line vector formed by the user's finger and hand moves, and the operation indicator is located at the location where the virtual pointing vector intersects with the working frame. | 10-17-2013 |
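The geometric core of such a pointing scheme, extending the hand-to-finger line until it meets the screen plane, can be sketched as follows. The coordinate convention (screen at z = 0, points in arbitrary camera units) is an assumption for illustration:

```python
def pointing_intersection(hand, finger, screen_z=0.0):
    """Extend the line through `hand` and `finger` (hypothetical 3D
    points) until it crosses the screen plane z = screen_z, and
    return the (x, y) hit point, or None if there is no hit."""
    hx, hy, hz = hand
    fx, fy, fz = finger
    dz = fz - hz
    if dz == 0:
        return None          # pointing vector parallel to the screen
    t = (screen_z - hz) / dz
    if t <= 0:
        return None          # the screen is behind the hand
    return (hx + t * (fx - hx), hy + t * (fy - hy))
```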
20130271372 | HIGH FIDELITY REMOTE CONTROLLER DEVICE FOR DIGITAL LIVING ROOM - Described herein is an intelligent remote controlling device (e.g. a mobile phone). The device can include a six-axis motion sensor to accurately track three dimensional hand motions. For example, the sensors can include a three-axis accelerometer and a three-axis gyroscope. The remote control device can also include a processing unit integrated with the motion sensors in a single module. The processing unit can convert data regarding the hand motion to data regarding a cursor motion for a cursor that will be displayed on a screen of an electronic device. The processing unit can be integrated with the motion sensors in a single module (e.g. an integrated circuit chip (IC)). The processing unit can include at least two modes of functionality corresponding to different types of hand motion: a one to one mode where the cursor directly tracks the hand motion and a non-linear mode that filters data from the motion sensors to eliminate hand jitter. | 10-17-2013 |
20130278502 | METHOD OF DETERMINING OBJECT POSITION AND SYSTEM THEREOF - The present invention provides a method for determining an object position, including the following steps: determining the intensities of a plurality of pixels; determining that a portion of pixels in the plurality of pixels belong to an object image corresponding to an object; and determining coordinate data of the object based on a number of pixels in each row or column of the portion of the plurality of pixels belonging to the object image and the coordinates of pixels in the portion of the plurality of pixels belonging to the object image. The present invention also provides a pointing system. | 10-24-2013 |
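The coordinate determination this abstract describes, weighting each row and column by its count of object pixels, amounts to a centroid computation; a minimal sketch over a binary mask (the function and variable names are illustrative):

```python
def object_centroid(mask):
    """Locate an object in a binary pixel mask from the per-row and
    per-column counts of object pixels, returning (x, y) coordinates
    as the count-weighted average, or None for an empty mask."""
    rows = len(mask)
    cols = len(mask[0]) if rows else 0
    row_counts = [sum(mask[r]) for r in range(rows)]
    col_counts = [sum(mask[r][c] for r in range(rows)) for c in range(cols)]
    total = sum(row_counts)
    if total == 0:
        return None              # no pixels belong to the object image
    y = sum(r * n for r, n in enumerate(row_counts)) / total
    x = sum(c * n for c, n in enumerate(col_counts)) / total
    return x, y
```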
20130278503 | GESTURE OPERATION INPUT PROCESSING APPARATUS AND GESTURE OPERATION INPUT PROCESSING METHOD - An instruction point extraction unit extracts an instruction point of a user from an image in which a gesture of the user made while the user is looking at a display is captured. A distance calculation unit obtains a distance to the instruction point in the depth direction. A gesture recognition parameter adjustment unit adjusts a parameter related to detection sensitivity in the depth direction when operation input by the gesture of the user is recognized, based on at least one of resolution of distance measurement in the depth direction and three-dimensional display performance of a display. A gesture recognition processing unit recognizes the operation input by the gesture of the user based on the adjusted parameter in reference to the distance to the instruction point in the depth direction calculated by the distance calculation unit. | 10-24-2013 |
20130278504 | DYNAMIC GESTURE BASED SHORT-RANGE HUMAN-MACHINE INTERACTION - Systems, devices and methods are described including starting a gesture recognition engine in response to detecting an initiation gesture and using the gesture recognition engine to determine a hand posture and a hand trajectory in various depth images. The gesture recognition engine may then use the hand posture and the hand trajectory to recognize a dynamic hand gesture and provide corresponding user interface command. | 10-24-2013 |
20130285905 | THREE-DIMENSIONAL POINTING DEVICE AND SYSTEM - A device that comprises at least one image sensor and a processing unit. The at least one image sensor is configured to consecutively capture a plurality of images at a predetermined rate. The processing unit is configured to identify in each of the plurality of images a first region and a second region, wherein intensities of the first region and the second region are different; determine a displacement of the first region from the first image of the plurality of images to the last image of the plurality of images; and output a first signal comprising the displacement. | 10-31-2013 |
20130285906 | MOBILE TERMINAL AND CONTROL METHOD THEREOF - A mobile terminal including a body; first and second cameras respectively provided to opposite sides of the body; a display configured to display images captured by the first and second cameras; a microphone configured to acquire a voice input; a posture detection sensor included in the body and configured to sense a motion of the body; and a controller configured to store at least one of first and second images received by the first and second cameras based on an image selection signal. Further, the image selection signal includes at least one of the sensed motion, the acquired voice input, and an input gesture captured by at least one of the first and second cameras. | 10-31-2013 |
20130285907 | HYBRID HUMAN-INTERFACE DEVICE - The present invention discloses a hybrid human-interface device including an optical navigation module and a pointing module. The optical navigation module is configured to replace the conventional buttons of a conventional pointing device, such as an optical mouse, a trackball mouse, or a TV controller. The optical navigation module is configured to sense gestures of at least one object operated by a user to activate commands associated with particular programs running on the host. Since the optical navigation module senses only gestures of the object, and not the movement of the hybrid human-interface device relative to a surface, its resolution need only be high enough for sensing gestures and does not need to be especially high. | 10-31-2013 |
20130285908 | COMPUTER VISION BASED TWO HAND CONTROL OF CONTENT - A system and method for manipulating displayed content based on computer vision by using a specific hand posture. In one embodiment a mode is enabled in which content can be manipulated in a typically two handed manipulation (such as zoom and rotate). | 10-31-2013 |
20130285909 | SYSTEM AND METHOD FOR AUTOMATED CAPTURE AND COMPACTION OF INSTRUCTIONAL PERFORMANCES - The system comprises functionality for instructors to record their lessons easily while capturing their teaching techniques with a tool that improves the effectiveness of playback, through reverse scripting of the teacher's motions, use of educational tools including blackboards, whiteboards, and tablet computers, among others, and automated highlighting of the relevant media channels for best emphasis during presentation. Access to the centralized lesson database permits students to learn from the best teachers and instructors; students can execute the system right on their desktop or portable computers or access it through a dedicated website. Playback may be personalized to the needs and preferences of each student, and the conceptual content is highlighted in video and audio to maximize the didactic effectiveness of the presentation. | 10-31-2013 |
20130293467 | USER INPUT PROCESSING WITH EYE TRACKING - A system determines which user of multiple users provided input through a single input device. A mechanism captures images of the one or more users. When input is detected, the images may be processed to determine which user provided the input using the input device. The images may be processed to identify each user's head and eyes, and to determine the focus point of each user's eyes. The user whose eyes are focused at the input location is identified as providing the input. When the input mechanism is a touch screen, the user whose eyes are focused on the portion of the touch screen that was touched is identified as the source of the input. | 11-07-2013 |
20130293468 | COLLABORATION ENVIRONMENT USING SEE THROUGH DISPLAYS - A see-through, near-eye, mixed reality display device and system for collaboration amongst various users of other such devices and personal audio/visual devices of more limited capabilities. One or more wearers of a see through head mounted display apparatus define a collaboration environment. For the collaboration environment, a selection of collaboration data and the scope of the environment are determined. Virtual representations of the collaboration data in the field of view of the wearer, and other device users are rendered. Persons in the wearer's field of view to be included in collaboration environment and who are entitled to share information in the collaboration environment are defined by the wearer. If allowed, input from other users in the collaboration environment on the virtual object may be received and allowed to manipulate a change in the virtual object. | 11-07-2013 |
20130293469 | USER INTERFACE CONTROL DEVICE, USER INTERFACE CONTROL METHOD, COMPUTER PROGRAM AND INTEGRATED CIRCUIT - A user interface control device provides a GUI allowing a depth of a graphic to be easily set when composing the graphic with a stereoscopic image. The device comprises: a graphic information obtaining unit that specifies an area occupied by the graphic when the graphic is arranged on one of two viewpoint images forming a stereoscopic image; a depth information analyzing unit that acquires a depth of a subject appearing within the specified area occupied by the graphic in the one viewpoint image; and a depth setting presenting unit that presents a first alternative and a second alternative for setting a depth of the graphic, the first alternative corresponding to the depth of the subject, and the second alternative corresponding to a depth differing from the depth of the subject. | 11-07-2013 |
20130293470 | METHOD AND APPARATUS FOR MOVING OBJECT - An apparatus and a method for moving an object(s) in an electronic device are provided. The method for moving the object(s) includes detecting a tilt change of the electronic device, and moving the object displayed on a display according to the tilt change of the electronic device. An electronic device comprising a display, at least one processor, a memory and at least one program stored in the memory and configured for execution by the at least one processor, wherein the program comprises at least one instruction configured to detect a tilt change of the electronic device, and to move an object displayed on the display according to the tilt change of the electronic device. | 11-07-2013 |
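The tilt-to-movement instruction at the heart of this abstract can be sketched as a simple linear mapping; the (pitch, roll) representation and the gain value are illustrative assumptions, not taken from the patent:

```python
def move_object(position, tilt_delta, sensitivity=10.0):
    """Move a displayed object according to a detected tilt change of
    the device. `tilt_delta` is a hypothetical (pitch, roll) change in
    degrees; `sensitivity` is an assumed pixels-per-degree gain."""
    x, y = position
    d_pitch, d_roll = tilt_delta
    # Roll shifts the object horizontally, pitch shifts it vertically.
    return (x + d_roll * sensitivity, y + d_pitch * sensitivity)
```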
20130293471 | PUSH ACTUATION OF INTERFACE CONTROLS - A computing system translates a world space position of a hand of a human target to a screen space cursor position of a user interface. When the cursor overlaps a button in the user interface, the computing system actuates the button in response to a movement of the hand in world space that changes the cursor position along a z-axis regardless of an initial z-axis position of the cursor. | 11-07-2013 |
20130300659 | Recognizing Commands with a Depth Sensor - Recognizing a command may include monitoring a tangible reference with a depth sensor, maintaining a virtual reference approximately on calibrated three dimensional coordinates of the tangible reference, maintaining a touch space adjacent the virtual reference, and recognizing a command when a predetermined object enters the touch space. | 11-14-2013 |
20130300660 | CURSOR CONTROL SYSTEM - A cursor control system is provided. Cursor position information is calculated according to position information and axial information of a handheld device. The display displays the cursor or executes a corresponding operating instruction according to the cursor position information. | 11-14-2013 |
20130300661 | TOUCH-SENSITIVE ELECTRONIC DEVICE - The present invention provides for a touch-sensitive electronic device having at least first and second touch screens, interface control means for touch selection of content displayed on the screens and wherein the interface control means is arranged such that as touch-selection on the first screen reaches an end location, the selected portion of content is saved and touch-selection functionality is arranged to continue at a start location on the second screen, and also provides for a related method allowing for touch selection of content extending across both screens so that such content can be selected in a unitary manner. | 11-14-2013 |
20130300662 | CONTROL SYSTEM WITH GESTURE-BASED INPUT METHOD - A control system with a gesture-based input method is introduced. The system includes an image capturing unit, an image processing unit, a database, and a computing unit. The image capturing unit captures an input image with a user's gesture. The gesture may be sign-language gesture or gesture of the user holding an object. The image processing unit, connected to the image capturing unit, is used to receive and recognize the gesture shown in the input image. The database stores a plurality of reference images and each of which indicates at least one control command. The computing unit, connected to the image processing unit and database, is used to compare the gesture recognized by the image processing unit with the reference images in the database. The comparison is used to determine a control command corresponding to the reference image. The control command is used to operate an electronic device. | 11-14-2013 |
20130307771 | INTERACTION AND MANAGEMENT OF DEVICES USING GAZE DETECTION - User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user. The interaction sets may include grammars for a speech recognition system, movements for a gesture recognition system, physiological states for a user health parameter detection system, or other possible inputs. When a user focuses on a selected object or area, an interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based upon areas that a user is not viewing. Multiple devices can share gaze information so that a device does not require its own gaze detector. | 11-21-2013 |
20130307772 | INTERACTIVE PROJECTION SYSTEM WITH LIGHT SPOT IDENTIFICATION AND CONTROL METHOD THEREOF - The present invention discloses an interactive projection system with light spot identification, including M pointing devices and a projector, and a control method thereof. A light emitting module of each pointing device selectively generates an infrared light spot, and generates an infrared lighting spot status signal corresponded to the operating status of the light emitting module. A wireless signal transmission module of each pointing device transmits a wireless signal including an identification code and the infrared lighting spot status signal. An infrared light spot capturing module of the projector is used to capture an image indicating the infrared light spot projected on a screen. The interactive projection system correspondingly displays movement tracks in order of the infrared light spots on a monitor according to the image and the wireless signals from the M pointing devices, wherein M is a positive integer and greater than or equal to 1. | 11-21-2013 |
20130307773 | IMAGE PROCESSING APPARATUS, COMPUTER-READABLE RECORDING MEDIUM, AND IMAGE PROCESSING METHOD - In the present invention, an image is projected so as to form an acute angle between an axis of projection from a projection device and a screen; the projected image and an image of a perimeter thereof are captured; and a process corresponding to an operation about to be performed by a user is executed based on a shadow area of a finger of the user included in the captured image. | 11-21-2013 |
20130307774 | INFORMATION PROCESSING APPARATUS, PROJECTION SYSTEM, AND INFORMATION PROCESSING METHOD - An information processing apparatus includes: a control-permission/denial storage unit that stores therein permission/denial information as to whether or not execution of respective control actions in response to predetermined motions made by users is permitted to the roles of the users; a motion detecting unit that detects a predetermined motion from images captured by an image capturing device; and a control-permission/denial determining unit that determines whether or not execution of a control action in response to a predetermined motion made by a user and detected by the motion detecting unit is permitted to the role of the user, based on the control-permission/denial storage unit. | 11-21-2013 |
20130307775 | GESTURE RECOGNITION - A system includes two or more optical sensors configured to generate image data based on gestures made by a user. One or more processing devices identifies movement quadrants based on the generated image data. If a match of the identified movement quadrants to one of a set of gesture commands is detected, one or more control signals associated with the matching gesture command are generated. | 11-21-2013 |
20130307776 | THREE-DIMENSIONAL MAN/MACHINE INTERFACE - A method is provided for selecting controls, which implements a control interface, a display, and at least one sensor capable of detecting at least one control object, including a step (i) of obtaining information on the distance between the control object(s) and the control interface using the sensor(s), and a step (ii) of displaying, on the display, at least one symbol representing a control or set of controls according to a display mode, wherein the method further includes a step of using the distance information to determine the display mode of the symbol(s). A device for implementing the method, and a related apparatus, are also provided. | 11-21-2013 |
20130314317 | APPARATUS FOR NON-CONTACT 3D HAND GESTURE RECOGNITION WITH CODE-BASED LIGHT SENSING - An apparatus for non-contact 3D hand gesture recognition with code-based light sensing is provided, including a plurality of light emitters, at least a light sensor, and a controller, wherein the controller is connected to and controls the plurality of light emitters to emit lights containing a respective identification code. The emitted lights can be reflected by an object, for example, a hand in our application. The at least a light sensor can identify the original light emitter of each respective reflected light through the identification code as well as computing the power level of each respective reflected light to determine the distance or location of the object. The hand gesture recognition can be deduced based on the power levels of respective reflected lights over a time period. | 11-28-2013 |
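Turning a reflected power level into a distance, as the sensor in this abstract does after demultiplexing by identification code, can be sketched with a simple inverse-square falloff model; the model and the constant `k` are assumptions for illustration, not specified by the patent:

```python
import math

def estimate_distance(received_power, emitted_power, k=1.0):
    """Estimate the distance to a reflecting object (e.g. a hand) from
    the received power of one identified emitter, assuming received
    power falls off as k * emitted_power / distance**2."""
    if received_power <= 0:
        return float("inf")      # no reflection seen: object out of range
    return math.sqrt(k * emitted_power / received_power)
```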
20130314318 | METHOD OF IMPROVING CURSOR OPERATION OF HANDHELD POINTER DEVICE IN A DISPLAY AND HANDHELD POINTER DEVICE WITH IMPROVED CURSOR OPERATION - A method and apparatus for improving cursor operation of a handheld pointer device with a display are provided, applicable to a handheld pointer device and a display. The display can display a cursor indicating the location of the handheld pointer device. The method includes the steps of: the handheld pointer device transmitting a control signal to the display to enter a slow cursor movement mode; the display, in the slow movement mode, receiving control signals transmitted by the handheld pointer device; the display showing a slowly moving cursor when the control signal is a move-cursor command; the display exiting the slow cursor movement mode and entering the selected object when the control signal is a select-object command; and the display exiting the slow cursor movement mode when the control signal is an exit-mode command. | 11-28-2013 |
20130314319 | APPARATUS AND METHOD FOR SENSING IMAGE - The present invention relates to an apparatus and method for sensing an image formed by a fluid, such as water, that wets a real paint brush. The image sensing apparatus includes an input unit configured to receive an image drawn with a paint brush that is wet with the fluid, a light source unit configured to emit infrared light to the input unit for sensing the image, and a sensing unit configured to sense the scattered light generated in response to the image received by the input unit and to image the sensed results. | 11-28-2013 |
20130314320 | METHOD OF CONTROLLING THREE-DIMENSIONAL VIRTUAL CURSOR BY USING PORTABLE ELECTRONIC DEVICE - A method of controlling a three-dimensional (3D) virtual cursor by using a portable electronic device, the method including: sensing at least one of a movement and a touch input of the portable electronic device through a sensor mounted in the portable electronic device; and converting the sensed movement or touch input into a cursor control signal for controlling an operation of a cursor in a 3D space, and outputting the cursor control signal. According to the method, a 3D virtual cursor may be conveniently controlled, without location or time limits, by using a portable electronic device that the user carries. | 11-28-2013 |
20130321269 | DISPLAYING ELEMENTS - Apparatus comprises at least one processor, and at least one memory including computer program code, wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus to cause a plurality of elements to be displayed in respective positions on a map, to determine a section of the map based upon one or more conditions, to determine that two or more of the elements are positioned in or near to the section of the map and are spaced closer together than a threshold spacing, and, in response thereto, to cause a perceptible output to be provided to further distinguish each of the two or more elements from one another. | 12-05-2013 |
20130321270 | ARRANGEMENT, METHOD AND COMPUTER PROGRAM FOR CONTROLLING A COMPUTER APPARATUS BASED ON EYE-TRACKING - A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request. | 12-05-2013 |
20130321271 | POINTING-BASED DISPLAY INTERACTION - A method includes receiving and segmenting a first sequence of three-dimensional (3D) maps over time of at least a part of a body of a user of a computerized system in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a motion of the second point with respect to a display coupled to the computerized system. A line segment that intersects the first point and the second point is calculated, and a target point is identified where the line segment intersects the display. An interactive item presented on the display in proximity to the target point is engaged. | 12-05-2013 |
20130328771 | Camera-Assisted Motion Estimation for Application Control - Embodiments of the present invention generate estimates of device motion from two data sources on a computing device—a motion sensor and a camera. The device may compare the estimates to each other to determine if they agree. If they agree, the device may confirm that device motion estimates based on the motion sensor are accurate and may output those estimates to an application within the device. If the device motion estimates disagree, the device may alter the motion estimates obtained from the motion sensor before outputting them to the application. | 12-12-2013 |
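The agree-or-alter decision this abstract describes, per axis, between the motion-sensor estimate and the camera-based estimate, can be sketched as follows; the tolerance and the averaging used when the estimates disagree are illustrative assumptions:

```python
def fuse_motion(sensor_motion, camera_motion, tolerance=0.25):
    """Compare two per-axis motion estimates. Where they agree within
    `tolerance`, pass the sensor estimate through unchanged; where
    they disagree, alter it by blending toward the camera estimate."""
    fused = []
    for s, c in zip(sensor_motion, camera_motion):
        if abs(s - c) <= tolerance:
            fused.append(s)            # estimates agree: sensor is trusted
        else:
            fused.append((s + c) / 2)  # estimates disagree: damp the sensor
    return fused
```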
20130328772 | Handheld Pointing Device - A handheld pointing device includes a main body, an image sensing module, an acceleration sensing module and a processing circuit. The image sensing module is disposed in the main body and configured to capture an image comprising at least one reference light source and accordingly generate an optical sensing signal. The acceleration sensing module is disposed in the main body and configured to sense an acceleration value in each one of two dimensions; wherein the acceleration sensing module outputs an acceleration sensing signal if an absolute value of the summation of the two acceleration values in two dimensions is located within a predetermined acceleration range. The processing circuit is configured to receive the optical sensing signal and the acceleration sensing signal and accordingly generate an output signal. | 12-12-2013 |
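The gating condition in this abstract, emitting an acceleration sensing signal only when the absolute value of the summed two-axis accelerations lies in a predetermined range, is simple to sketch; the range bounds here are assumed values, not from the patent:

```python
def acceleration_gate(ax, ay, low=0.05, high=2.0):
    """Return the two-axis acceleration values as a sensing signal only
    when |ax + ay| falls within the predetermined [low, high] range;
    otherwise return None (no signal output)."""
    magnitude = abs(ax + ay)
    return (ax, ay) if low <= magnitude <= high else None
```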
20130328773 | CAMERA-BASED INFORMATION INPUT METHOD AND TERMINAL - Disclosed are a camera-based information input method and a terminal, for providing an input method that consumes few resources and does not block the terminal screen. The method comprises: a terminal identifying an area having specified color information from an image acquired by a camera; determining change information of the area; and determining, according to the change information, information input to the terminal. | 12-12-2013 |
20130328774 | OPTICAL TOUCH MOUSE - An optical touch mouse includes a cover having a portion as a detect window, a lightguide adjacent to the detect window for directing light to the detect window, and a light source adjacent to the lightguide for providing light to enter the lightguide. The detect window is transparent to light provided by the light source such that light provided by the light source can penetrate through the detect window. | 12-12-2013 |
20130328775 | User Interface Elements Positioned for Display - User interface elements positioned for display is described. In various embodiment(s), sensor input can be received from one or more sensors that are integrated with a portable device. A device hold position that corresponds to where the portable device is grasped by a user can be determined based at least in part on the sensor input. A display of user interface element(s) can then be initiated for display on an integrated display of the portable device based on the device hold position that corresponds to where the portable device is grasped. | 12-12-2013 |
20130328776 | METHOD FOR EXECUTING USER COMMAND ACCORDING TO SPATIAL MOVEMENT OF USER INPUT DEVICE AND IMAGE APPARATUS THEREOF - A method for executing a user command in a display device including receiving, from an input device, a signal that indicates a movement direction of the input device, determining the movement direction of the input device based on the received signal, determining one of a plurality of functions of the display device that corresponds to the movement direction from among the plurality of functions of the display device, and executing the determined function. | 12-12-2013 |
20130335323 | CURSOR CONTROL DEVICE AND SYSTEM - A cursor control device includes an image sensor, at least one button, a processing unit, a selection unit and a transmitter. The image sensor captures a plurality of image frames. The at least one button outputs a trigger signal while being pressed. The processing unit calculates a displacement according to the image frames and outputs a control signal according to the trigger signal. The selection unit selects one of at least two different cursor lock periods. The transmitter outputs the control signal and the displacement, wherein the processing unit controls the transmitter to output zero displacement within the cursor lock period selected after receiving the trigger signal. The present disclosure further provides a cursor control system. | 12-19-2013 |
20130335324 | COMPUTER VISION BASED TWO HAND CONTROL OF CONTENT - A system and method for manipulating displayed content based on computer vision by using a specific hand posture. In one embodiment a mode is enabled in which content can be manipulated in a manner that typically requires two hands (such as zoom and rotate). | 12-19-2013 |
20130342454 | DISPLAY APPARATUS, REMOTE CONTROLLING APPARATUS AND CONTROL METHOD THEREOF - A remote controlling apparatus, a display apparatus and a controlling method are provided. The remote controlling apparatus for selecting one of a plurality of operating modes of an external device being operable between a pointing mode and a gesture mode, associated with the remote controlling apparatus, includes an output unit for outputting information regarding the remote controlling apparatus to the external device, a detection unit for detecting motion of the remote controlling apparatus, a motion information generating unit for generating motion information based on the detected motion of the remote controlling apparatus, and an operation mode change unit for providing information regarding an operation mode, for changing the operation mode of the external device being operable between the pointing mode and the gesture mode, wherein the information regarding the remote controlling apparatus comprises the information regarding the operating mode and the motion information generated by the motion information generating unit. | 12-26-2013 |
20130342455 | DISPLAY APPARATUS, REMOTE CONTROLLING APPARATUS AND CONTROL METHOD THEREOF - A remote controlling apparatus, a display apparatus and a controlling method are provided. The remote controlling apparatus for selecting one of a plurality of operating modes of an external device being operable between a pointing mode and a gesture mode, associated with the remote controlling apparatus, includes an output unit for outputting information regarding the remote controlling apparatus to the external device, a detection unit for detecting motion of the remote controlling apparatus, a motion information generating unit for generating motion information based on the detected motion of the remote controlling apparatus, and an operation mode change unit for providing information regarding an operation mode, for changing the operation mode of the external device being operable between the pointing mode and the gesture mode, wherein the information regarding the remote controlling apparatus comprises the information regarding the operating mode and the motion information generated by the motion information generating unit. | 12-26-2013 |
20130342456 | REMOTE CONTROL APPARATUS AND CONTROL METHOD THEREOF - A remote controlling apparatus to provide a plurality of control modes includes a communicating unit which performs communication with an external display apparatus which provides a user interface screen, a detecting unit which detects a movement of the remote controlling apparatus, a mode change button unit which receives a user command to change the control mode, and a control unit which controls a display status of the user interface screen according to the movement of the remote controlling apparatus as detected through the detecting unit, and which operates in a pointing mode if the mode change button unit is released from a pressed state, or operates in a gesture mode while the mode change button unit is in a pressed state. The mode change button unit is arranged on a rear surface of the remote controlling apparatus for a user to grip. | 12-26-2013 |
20130342457 | DATA MANIPULATION ON ELECTRONIC DEVICE AND REMOTE TERMINAL - Methods and systems of manipulating data on an electronic device and a remote terminal are disclosed. The system includes at least one controller in an electronic device and a remote terminal, configured to initialize the electronic device and the remote terminal, detect a first input signal associated with a desired action, capture at least a portion of an external environment, detect a second input signal associated with the desired action, and perform the desired action on the electronic device and the remote terminal according to the first input signal and the second input signal. The system also includes at least one signal detection system in the electronic device and the remote terminal controlled by the at least one controller, and at least one imaging and display system in the electronic device and the remote terminal generating image output data in real time. | 12-26-2013 |
20130342458 | METHODS AND SYSTEMS FOR INPUT TO AN INTERACTIVE AUDIOVISUAL DEVICE - A method of operating a user input device relative to a projection screen includes: displaying a user interface via a projector in the A/V device; receiving a captured image from a camera; tracking a two-dimensional (2D) coordinate of a light source from the captured image; displaying a cursor over the user interface via the projector in the A/V device based on the 2D coordinate; detecting an optical pattern from the light source; and activating a click interaction on the user interface at the 2D coordinate based on the optical pattern received. | 12-26-2013 |
20140002357 | Enabling and Disabling Features of a Headset Computer Based on Real-Time Image Analysis | 01-02-2014 |
20140002358 | COMPACT CAPACITIVE TRACK PAD | 01-02-2014 |
20140002359 | Calibration Of Portable Devices In A Shared Virtual Space | 01-02-2014 |
20140015748 | High Resolution and High Sensitivity Optically Activated Touch Sensing Device Using Multiple Color Light Sources - A cursor maneuvering device comprises a cavity containing a plurality of colored lights and an image sensor and covered with a light mixing plate. The colored lights are located in the cavity as far apart from one another as possible and illuminate a pointing device contacting the surface of the light mixing plate. The image sensor detects a hue of light illuminating the pointing device and controls a cursor depending upon the hue. Changes in the hue of a moving pointing device are translated into movement of the cursor on the display of an electronic device. The pointing device may be anything touching the light mixing plate, for instance a finger. | 01-16-2014 |
20140022171 | SYSTEM AND METHOD FOR CONTROLLING AN EXTERNAL SYSTEM USING A REMOTE DEVICE WITH A DEPTH SENSOR - A system and method for implementing a remote controlled user interface using close range object tracking are described. Close range depth images of a user's hands and fingers or other objects are acquired using a depth sensor. Using depth image data obtained from the depth sensor, movements of the user's hands and fingers or other objects are identified and tracked. The tracking data is transmitted to an external control device, thus permitting the user to interact with an object displayed on a screen controlled by the external control device, through movements of the user's hands and fingers. | 01-23-2014 |
20140022172 | GESTURE INPUT SYSTEMS AND METHODS - A gesture input system with a two-dimension (2D) image sensor and a processing module is provided. The 2D image sensor obtains a plurality of images of a user. The processing module determines positions of an object and a face of the user in a first image of the plurality of images, and determines an operation area for the user according to the positions of the object and the face. Also, the processing module generates a control command according to the images, subsequent to the first image, of the user within the operation area. | 01-23-2014 |
20140028553 | METHOD, SYSTEM AND APPARATUS FOR DETERMINING LOCATIONS IN A PROJECTED IMAGE - A method, system and apparatus for determining locations in a projected image are provided. The apparatus comprises a light sensor; a body comprising the light sensor, the body enabled to position the light sensor proximal to a screen to detect light from a projector; a communication interface for communicating with a projector system comprising at least the projector; and, a processor enabled to transmit a request to the projector system to project a structured light pattern using the projector; and when at least one pixel in the structured light pattern projected by the projector is detected at the light sensor, transmit a detection indication to the projector system to communicate detection of the at least one pixel. | 01-30-2014 |
20140028554 | RECOGNIZING GESTURE ON TACTILE INPUT DEVICE - A non-transitory computer-readable storage medium may comprise instructions stored thereon for recognizing gestures on a tactile input device. The instructions may be configured to cause a computing system to at least receive, from a sensor of the tactile input device, a signal representing a first contact on the tactile input device and subsequent release of the first contact from the tactile input device, receive, from the sensor of the tactile input device, a signal representing a second contact on the tactile input device after the first contact is released, the second contact being maintained and changing location on the tactile input device, and recognize the first contact and the second contact as a single gesture if the second contact occurs within a re-tap threshold period of time after the first contact, and the second contact begins within a maximal threshold distance on the tactile input device from the first contact. | 01-30-2014 |
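The re-tap logic in the entry above reduces to two checks: the second contact must begin within a threshold time of the first contact's release, and within a threshold distance of it. A minimal sketch of that decision, assuming illustrative threshold values (the abstract calls them "predetermined" without giving numbers, so `RETAP_THRESHOLD_S` and `MAX_DISTANCE_PX` are hypothetical):

```python
import math

# Hypothetical thresholds: the abstract names a "re-tap threshold period"
# and a "maximal threshold distance" but specifies no values.
RETAP_THRESHOLD_S = 0.3   # assumed max gap between release and re-tap
MAX_DISTANCE_PX = 40.0    # assumed max distance between the two contacts

def is_single_gesture(first_release_t, first_pos, second_touch_t, second_pos):
    """Return True when a release-then-touch pair should be treated as one
    gesture, per the two conditions described in the abstract."""
    dt = second_touch_t - first_release_t
    dist = math.hypot(second_pos[0] - first_pos[0],
                      second_pos[1] - first_pos[1])
    return 0 <= dt <= RETAP_THRESHOLD_S and dist <= MAX_DISTANCE_PX
```

A quick re-tap 10 px away would pass both checks; a contact half a second later, or far across the pad, would be treated as a separate gesture.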
20140028555 | METHOD AND APPARATUS FOR CONTROLLING DRAG FOR A MOVING OBJECT OF A MOBILE TERMINAL HAVING A TOUCH SCREEN - A method and apparatus for controlling a drag to variably change the speed of an object moving on a display unit include displaying an object and a control region for controlling a movement of the object. If a touch input is detected in the control region, the control unit determines whether the duration of a drag input after the initial touch input is longer than a predetermined time period, and then defines a ratio of drag speed to object speed based on the determination outcome, so that the object is moved in response to the drag input according to the defined ratio. | 01-30-2014 |
20140028556 | OPTICAL NAVIGATION DEVICES - An optical navigation device is provided for detecting movement of a pointer, such as a finger, in three dimensions. A sensor obtains images of the pointer which have been illuminated by an illumination source, and an image scaling module determines the difference in size between images acquired by the image sensor to determine the difference in height of the pointer between images. | 01-30-2014 |
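The size-comparison idea in the optical navigation entry above can be sketched under a pinhole-camera assumption (mine, not the patent's): a pointer's apparent size varies inversely with its distance from the sensor, so the ratio of imaged sizes across two frames gives the ratio of heights.

```python
# Hedged sketch of the size-to-height idea. The inverse-proportionality
# model (apparent size ~ 1 / height) is an assumption; the abstract only
# says size differences between images determine height differences.

def estimate_height(size_prev, size_curr, height_prev):
    """Estimate the pointer's new height above the sensor from the change
    in its imaged size, assuming apparent size ~ 1 / height."""
    if size_curr <= 0:
        raise ValueError("pointer not detected in the current image")
    return height_prev * (size_prev / size_curr)
```

Under this model, a finger whose image doubles in size has halved its height above the sensor.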
20140028557 | DISPLAY DEVICE, DISPLAY CONTROL METHOD AND DISPLAY CONTROL PROGRAM, AND INPUT DEVICE, INPUT ASSISTANCE METHOD AND PROGRAM - A display device is equipped with a proximity touch panel that detects coming into proximity or contact of a detection target; a position detection unit that detects position coordinates of the detection target whose coming into proximity or contact has been detected, the position coordinates consisting of coordinates in the X and Y directions and a coordinate in the Z direction; a display unit on which the touch panel is placed; a direction judgment unit that judges a direction of the detection target on the basis of the detected position coordinates; and a display control unit for controlling the display unit to perform a display so that a part of display contents of the display unit which would otherwise be hidden by the detection target is prevented from being hidden, on the basis of the direction of the direction-judged detection target. | 01-30-2014 |
20140035814 | ADJUSTING SETTINGS OF A PRESENTATION SYSTEM - Techniques for adjusting settings of a presentation system are described in various implementations. A method that implements the techniques may include receiving, at a computer system and from an image capture device, an image that depicts a viewing area proximate to the presentation system. The method may also include processing the image, using the computer system, to determine whether a viewer is present in the viewing area. The method may also include, in response to determining that a viewer is present in the viewing area, processing the image, using the computer system, to determine an ambient lighting value associated with the viewing area and to determine a distance from the presentation system to the viewer. The method may also include adjusting a presentation setting of the presentation system based on the ambient lighting value and the distance. | 02-06-2014 |
20140035815 | 3D POINTING DEVICE - A three-dimensional (3D) pointing device includes a housing, an inertial measurement unit, a data processing unit, a communication unit, and a power unit. The housing has a rough surface, and the inertial measurement unit is provided inside the housing and in contact with the housing. The inertial measurement unit includes a gyroscope and an accelerometer. The data processing unit is used to integrate data from the gyroscope and the accelerometer to generate an output data, so as for the communication unit to send out the output data. The power unit provides power to the 3D pointing device. The 3D pointing device enables the user to execute pointing control in a 3D space and allows the user to input commands by means of the rough surface. Thus, the 3D pointing device features both convenience of use and a small physical volume. | 02-06-2014 |
20140043233 | INTERACTIVE SYSTEM AND REMOTE DEVICE - An interactive system includes a display, a processor and a remote controller. The display includes at least one reference beacon for providing light with a predetermined feature. The remote controller includes an image sensor configured to capture an image containing the reference beacon and calculates an aiming coordinate according to an imaging position of the reference beacon in the captured image. The processor calculates a scale ratio of a pixel size of the display with respect to that of the image captured by the image sensor and moves a cursor position according to the scale ratio and the aiming coordinate. | 02-13-2014 |
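The "scale ratio" in the interactive-system entry above is, in the simplest reading, a per-axis mapping from sensor-image pixels to display pixels. A minimal sketch under that reading (the function name and the assumption that both axes scale independently are mine):

```python
# Hypothetical sketch: map an aiming coordinate measured in the remote's
# captured image to a cursor position on the display, using the ratio of
# display resolution to sensor-image resolution on each axis.

def map_aim_to_cursor(aim_x, aim_y, display_w, display_h, image_w, image_h):
    """Scale an image-space aiming coordinate to display pixels; calibration
    offsets and distortion handling are omitted."""
    return (aim_x * display_w / image_w, aim_y * display_h / image_h)
```

For a 128×96 sensor image and a 1920×1080 display, aiming at the image center lands the cursor at the display center.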
20140043234 | COMPUTER VISION GESTURE BASED CONTROL OF A DEVICE - A system and method are provided for controlling a device based on computer vision. Embodiments of the system and method of the invention are based on receiving a sequence of images of a field of view; detecting movement of at least one object in the images; applying a shape recognition algorithm on the at least one moving object; confirming that the object is a user hand by combining information from at least two images of the object; and tracking the object to control the device. | 02-13-2014 |
20140055355 | METHOD FOR PROCESSING EVENT OF PROJECTOR USING POINTER AND AN ELECTRONIC DEVICE THEREOF - A method for performing a function of an electronic device includes outputting an image outside the electronic device; capturing the output image; detecting a trace of a pointer in the captured image; and performing a preset function corresponding to the detected trace of the pointer. | 02-27-2014 |
20140062874 | CLIENT DEVICE ORIENTATION - Systems and methods are provided for determining an orientation of one or more client devices relative to a computing device. In various embodiments a method and system are provided for receiving, via a client device, data being displayed on a computing device. The data is used to determine an orientation of the client device relative to a computing device. The determined orientation of the client device is used to orient a user interface of a computing device. | 03-06-2014 |
20140062875 | MOBILE DEVICE WITH AN INERTIAL MEASUREMENT UNIT TO ADJUST STATE OF GRAPHICAL USER INTERFACE OR A NATURAL LANGUAGE PROCESSING UNIT, AND INCLUDING A HOVER SENSING FUNCTION - A mobile device has an inertial measurement unit (IMU) that senses linear and rotational movement, a touch screen including (i) a touch-sensitive surface and (ii) a 3D sensing unit, and a state change determination module that determines state changes from a combination of (i) an output of the IMU and (ii) the 3D sensing unit sensing the hovering object. The mobile device may include a pan/zoom module. A mobile device may include a natural language processing (NLP) module that predicts a next key entry based on xy positions of keys so far touched, xy trajectory of the hovering object and NLP statistical modeling. A graphical user interface (GUI) visually highlights a predicted next key and presents a set of predicted words arranged around the current key above which the object is hovering as selectable buttons to enable entry of a complete word from the set of predicted words. | 03-06-2014 |
20140062876 | EYE-CONTROLLED COMMUNICATION SYSTEM - An eye-controlled communication system comprises an eye controlled aid and a visiting aid. The eye controlled aid has an eye controlled module, a first display module and a first processing unit. The eye controlled module detects the eye movements of a patient, generates a control command based on the detected results and transmits it to the first processing unit. The first processing unit uses the control command to allow the patient to operate a first operator interface by using eye movements. The first processing unit generates an execute command according to the operating results and performs the execute command, wherein the execute command includes information for transmitting messages to a second processing unit. The second processing unit receives the execute command and performs it. Thus, the system can help a patient easily express himself or herself to family members or friends. | 03-06-2014 |
20140062877 | DISPLAY APPARATUS AND METHOD OF CONTROLLING THE SAME - A display apparatus is disclosed. The display apparatus includes: a signal reception unit receiving an image signal; a signal processing unit processing the image signal; a display unit displaying an image based on the processed image signal; a communication unit receiving a motion corresponding to an instruction of a user from a remote control unit; and a controller displaying a user directory interface having a predetermined area on the display unit and controlling the display apparatus to perform a function selected from among a plurality of functions performed by the display apparatus, based on a direction of the motion from the remote control unit to the user directory interface. | 03-06-2014 |
20140062878 | CONTROL DEVICE, INPUT DEVICE, CONTROL SYSTEM, HANDHELD DEVICE, AND CONTROL METHOD - A control device includes: a receiver for receiving first information regarding the movement of a casing, and second information regarding whether to reflect the first information on the movement of coordinate values; a storage unit for storing a whole-screen region including a real-screen region, and a virtual-screen region set around the real-screen region; a generator for generating the coordinate values within the whole-screen region based on the first information; a switcher for switching a first state in which the coordinate values are movable, and a second state in which the coordinate values are immovable, based on the second information; a determining unit for determining which of the real-screen region or the virtual-screen region the coordinate values belong to; and a coordinate-value control unit for controlling the coordinate values so as to move the coordinate values within the virtual-screen region to the position of predetermined coordinate values within the real-screen region. | 03-06-2014 |
20140062879 | USER INTERFACE SYSTEM BASED ON POINTING DEVICE - The user interaction system comprises a portable pointing device. | 03-06-2014 |
20140062880 | SYSTEM AND METHOD FOR CONTROLLING THE POSITION OF A MOVABLE OBJECT ON A VIEWING DEVICE - A system for controlling the position of a movable object on a viewing device in an aircraft cockpit, which can be actuated by an operator, includes means for determining, at least at one given moment, a position of a target zone on said viewing device toward which the operator's gaze is directed, and positioning means adapted for placing the object on the target zone of the viewing device. | 03-06-2014 |
20140062881 | ABSOLUTE AND RELATIVE POSITIONING SENSOR FUSION IN AN INTERACTIVE DISPLAY SYSTEM - An interactive display system including a wireless pointing device, and positioning circuitry capable of determining absolute and relative positions of the display at which the pointing device is aimed. An error value between the absolute position and an estimated or actual relative position at the point in time of the absolute position is determined, and a compensation factor is determined from this error value that is applied to subsequent relative positioning results. | 03-06-2014 |
20140062882 | DISPLAY CONTROL DEVICE, METHOD, AND PROGRAM - The present technique relates to a display control device, a method, and a program that can improve a user's operability of a free-cursor type user interface. | 03-06-2014 |
20140071049 | METHOD AND APPARATUS FOR PROVIDING ONE-HANDED USER INTERFACE IN MOBILE DEVICE HAVING TOUCH SCREEN - A method provides a user interface in a mobile device having a touch screen. The method includes detecting a touch region from the touch screen, and determining whether the detected touch region satisfies at least one of a first condition of exceeding a predetermined area, a second condition of exceeding a predetermined time, and a third condition of exceeding a predetermined pressure. The method further includes displaying a pointer at a specific location of the touch screen when the detected touch region satisfies at least one of the first, second and third conditions, detecting a movement of the touch region, and moving the pointer in response to the movement of the touch region. | 03-13-2014 |
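The one-handed-interface entry above triggers the pointer when the touch region exceeds any one of three thresholds: area, time, or pressure. A minimal sketch of that "at least one of" test, with illustrative threshold values (the abstract says only "predetermined"):

```python
# Assumed threshold values -- the abstract gives none.
AREA_THRESHOLD = 150.0     # e.g. square pixels of the touch region
TIME_THRESHOLD = 0.5       # seconds the touch is held
PRESSURE_THRESHOLD = 0.8   # normalized pressure, 0..1

def should_show_pointer(area, duration, pressure):
    """Per the abstract, a pointer is displayed when the touch region
    satisfies at least one of the three threshold conditions."""
    return (area > AREA_THRESHOLD
            or duration > TIME_THRESHOLD
            or pressure > PRESSURE_THRESHOLD)
```

A brief light tap fails all three checks and is handled as ordinary input; a large, long, or firm press brings up the movable pointer.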
20140071050 | Optical Sensing Mechanisms for Input Devices - A computer or other electronic device including a processor and an input device, such as a track pad. The track pad being in communication with the processor and including a movable surface, a light source in communication with the processor, and an optical sensor in selective optical communication with the light source and in communication with the processor. The optical sensor detects movement of the movable surface by receiving light from the light source. | 03-13-2014 |
20140071051 | METHOD OF CONTROLLING A CONTROL POINT POSITION ON A COMMAND AREA AND METHOD FOR CONTROL OF A DEVICE - The invention describes a method of controlling a position (x′, y′) of a control point (c) on a command area, and a method for control of a device. | 03-13-2014 |
20140078059 | 3D Pointing Devices with Orientation Compensation and Improved Usability - Systems and methods according to the present invention describe 3D pointing devices which enhance usability by transforming sensed motion data from a first frame of reference (e.g., the body of the 3D pointing device) into a second frame of reference (e.g., a user's frame of reference). One exemplary embodiment of the present invention removes effects associated with a tilt orientation in which the 3D pointing device is held by a user. | 03-20-2014 |
20140085201 | DEVICE WITH TOUCH SCREEN FALSE ACTUATION PREVENTION - A device and method for preventing false actuation of a touch screen are provided. When one or more of: (a) an initial proximity event occurs at a proximity sensor of the device, and (b) a reorientation event occurs at an orientation sensor of the device: touch data received via a touch screen of the device is buffered in a memory of the device. Thereafter, a processor of the device processes the buffered touch data when a further proximity event does not occur within a given time period, and disables the touch screen when the proximity event occurs within the given time period. | 03-27-2014 |
20140085202 | METHOD, APPARATUS, AND COMPUTER PROGRAM PRODUCT FOR REDUCING HAND OR POINTING DEVICE OCCLUSIONS OF A DISPLAY - A method is provided for reducing hand or pointing device occlusions of a display, particularly a touch screen or hover display. Detection of the position of a pointing object and eyes relative to a device display may be used to calculate an offset correction. Shifting a user interface component based on the offset correction may reduce pointing object occlusions of the display. A corresponding apparatus and computer program product are also provided. | 03-27-2014 |
20140085203 | VIDEO IMAGE DISPLAY SYSTEM AND HEAD MOUNTED DISPLAY - A video image display system including an information apparatus and a transmissive head mounted display that allows a user to visually recognize video images distributed from the information apparatus as virtual images is provided. The information apparatus includes a video image distributor that distributes video images corresponding to a specific geographic region to the head mounted display. The head mounted display includes a motion detector that detects motion of the user's head and allows the user to visually recognize, as virtual images, video images selected based on motion information representing the motion. | 03-27-2014 |
20140085204 | Correlating Pupil Position to Gaze Location Within a Scene - Correlating pupil position to gaze location within a scene. Illustrative embodiments may include correlating pupil position of a user to gaze location within a scene viewed by the user. The correlating may include: illuminating with light an eye of the user, the eye containing the pupil; creating a first video stream depicting the eye; creating a second video stream depicting the scene in front of the user; determining pupil position within the first video stream; calculating gaze location in the second video stream based on pupil position in the first video stream; and sending an indication of the gaze location in the second video stream to a computer system. | 03-27-2014 |
20140092014 | MULTI-MODAL TOUCH SCREEN EMULATOR - Systems and methods may provide for capturing a user input by emulating a touch screen mechanism. In one example, the method may include identifying a point of interest on a front facing display of the device based on gaze information associated with a user of the device, identifying a hand action based on gesture information associated with the user of the device, and initiating a device action with respect to the front facing display based on the point of interest and the hand action. | 04-03-2014 |
20140092015 | METHOD AND APPARATUS FOR MANIPULATING A GRAPHICAL USER INTERFACE USING CAMERA - A method and a computing device for manipulating a graphical user interface (GUI) using camera are disclosed. The computing device captures images of a user by using a camera, detects the face image of the user from captured images, and detects at least one facial feature in the face image. The computing device analyzes at least one parameter of one or more facial features to determine gestures. | 04-03-2014 |
20140092016 | Handheld Pointing Device and Operation Method Thereof - A handheld pointing device includes a body, an image sensing module and a processing circuit. The image sensing module is disposed in the body and configured to sense a reference light source and thereby capture an image including the reference light source. The processing circuit, disposed in the body and electrically connected to the image sensing module, is configured to obtain the image including the reference light source, calculate a coordinate of the image of the reference light source relative to the image captured by the image sensing module, and correct the coordinate according to a distance or a distance change between the body and the reference light source. An operation method of a handheld pointing device is also provided. | 04-03-2014 |
20140092017 | INTERACTIVE SIMULATED-GLOBE DISPLAY SYSTEM - The invention discloses an interactive simulated-globe display system including an imaging body, N image-projecting units, a data processing unit, an optical pointer, and M image-capturing units, where N and M are each a natural number. The N image-projecting units project N images onto an external hemispheric surface of the imaging body. The N images constitute a hemi-globe image of a whole globe image. The data processing unit detects an indicated spot projected on the external hemispheric surface by the M image-capturing units, judges whether a track relative to the indicated spot meets one of a plurality of position input rules, and if so, executes an instruction corresponding to that position input rule. | 04-03-2014 |
20140104170 | METHOD OF PERFORMING KEYPAD INPUT IN A PORTABLE TERMINAL AND APPARATUS - A method for performing keypad input in a portable terminal is provided. The method includes displaying, on a touch screen, a keypad and an operating zone; detecting a first user operation performed in the operating zone; identifying, by a processor, a key from the keypad, the key being identified based on the first user operation; and inputting the identified key. | 04-17-2014 |
20140104171 | Electrical device, in particular a telecommunication device, having a projection device, and method for operating an electrical device - An electrical device having a projection device for projecting image information onto a projection surface disposed externally to the electrical device; an inertial sensor to detect a movement of the electrical device; the electrical device being operable in a first operating mode so that the projection of the image information onto the projection surface follows a movement of the electrical device with a movement component in a plane disposed in parallel to the plane of the projection surface; the electrical device being operable in a second operating mode so that, in spite of the electrical device's movement, the projection of the image information onto the projection surface with a movement component in a plane parallel to the plane of the projection surface is at least partially substantially fixed; the electrical device being configured so that its movement in the second operating mode is interpreted as a user input. | 04-17-2014 |
20140104172 | Method for Automatically Switching User Interface of Handheld Terminal Device, and Handheld Terminal Device - A method for automatically switching a user interface of a handheld terminal device and a handheld terminal device are provided. The method in an embodiment of the present disclosure includes: obtaining a current state of the terminal device by using a first sensor, and obtaining current trigger states of touch sensors of the terminal device, where the current state is a horizontally holding state or a vertically holding state, and the touch sensors are set on the back and/or a side of the terminal device; determining a current holding mode of the terminal device according to the current state of the terminal device and the current trigger states of the touch sensors; and switching a user interface to a user interface corresponding to the current holding mode of the terminal device. An embodiment of the present disclosure further discloses the handheld terminal device. | 04-17-2014 |
20140111431 | OPTIMIZING PHOTOS - Implementations generally relate to optimizing photos. In some implementations, a method includes collecting attention information associated with one or more objects. The method further includes generating an attention map based on the attention information. The method further includes allocating resources to the one or more objects in one or more photos based on the attention map. | 04-24-2014 |
20140111432 | INTERACTIVE MUSIC PLAYBACK SYSTEM - An interactive music method for controlling a media player device is provided. The interactive music method comprises the steps of receiving one or more gestures, interpreting the gesture in accordance with a plurality of predefined gestures, and executing at least one process corresponding to the gesture. The process comprises controlling audio for a specific amount of time. | 04-24-2014 |
20140111433 | MOTION COMPENSATION IN AN INTERACTIVE DISPLAY SYSTEM - An interactive display system including a wireless pointing device, and positioning circuitry capable of determining absolute and relative positions of the display at which the pointing device is aimed. The pointing device captures images displayed by the computer, including one or more human-imperceptible positioning targets. The positioning targets are presented as patterned modulation of the intensity (e.g., variation in pixel intensity) in a display frame of the visual payload, followed by the opposite modulation in a successive frame. At least two captured image frames are subtracted from one another to recover the positioning target in the captured visual data and to remove the displayed image payload. Motion of the pointing device between the two frames is detected by relative motion sensors, and used to align the positions of the positioning targets in the captured images for those frames to improve the fidelity of the recovered positioning target following the subtraction. | 04-24-2014 |
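The frame-subtraction scheme this abstract describes can be illustrated with a minimal Python sketch (a toy model with integer pixel shifts and a pure-translation motion estimate; the function names and the modulation depth are illustrative assumptions, not the patent's implementation):

```python
def shift_frame(frame, dx, dy, fill=0):
    """Translate a 2-D frame (list of rows) by integer (dx, dy), padding with `fill`."""
    h, w = len(frame), len(frame[0])
    out = [[fill] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sx, sy = x - dx, y - dy
            if 0 <= sx < w and 0 <= sy < h:
                out[y][x] = frame[sy][sx]
    return out

def recover_target(frame_a, frame_b, dx, dy):
    """Align frame_b by the sensed inter-frame motion (dx, dy), then subtract.

    The static visual payload cancels; the +/- modulated positioning
    target survives as a nonzero difference."""
    aligned = shift_frame(frame_b, -dx, -dy)
    return [[a - b for a, b in zip(ra, rb)]
            for ra, rb in zip(frame_a, aligned)]
```

With a payload frame carrying a +m modulation in one frame and -m in the next, the motion-aligned difference is zero everywhere except at the positioning target, which is the alignment benefit the abstract claims.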
20140118255 | GRAPHICAL USER INTERFACE ADJUSTING TO A CHANGE OF USER'S DISPOSITION - A user interface apparatus and a method for adjusting the apparatus are disclosed. A position and/or a viewing angle of the user is tracked, and graphical interface objects are adjusted to keep them visible at different user's distances and viewing angles. For example, as the user steps away from the display, the objects on the display can be proportionally enlarged to make them appear of the same size to the user. The sensitivity of a gesture recognition system to the user's movements and gestures can be also adjusted to facilitate manipulation of the objects by the user at different distances from the display of the graphical user interface. | 05-01-2014 |
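Keeping objects at "the same apparent size" reduces, for small angles, to scaling them linearly with the user's distance. A minimal sketch (the function name and the one-metre reference distance are assumptions):

```python
def scaled_size(base_size_px, user_distance_m, reference_distance_m=1.0):
    """Scale an on-screen object linearly with the user's distance so its
    angular (apparent) size stays roughly constant as the user steps away."""
    return base_size_px * (user_distance_m / reference_distance_m)
```

The same factor could plausibly scale gesture-recognition sensitivity, as the abstract suggests adjusting both together.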
20140118256 | DISPLAY DIRECTIONAL SENSING - An electronic device includes a display with an image display area configured to change orientation based on a change of position of the device. Upon detection of the position change, a camera may capture an image adjacent the device. The orientation of the image display area may be moved relative to a reference feature captured in the image. | 05-01-2014 |
20140118257 | GESTURE DETECTION SYSTEMS - The amount of power and processing needed to enable gesture input for a computing device can be reduced by utilizing one or more gesture sensors. A gesture sensor can have a lower resolution but larger pixel pitch than conventional cameras. The lower resolution can be achieved in part through skipping or binning pixels in some embodiments. The low resolution enables a global shutter to be used with the gesture sensor. The gesture sensor can be connected to an illumination controller for synchronizing illumination from a device emitter with the global shutter. In some devices, the gesture sensor can be used as a motion detector, enabling the gesture sensor to run in a low power state unless there is likely gesture input to process. At least some processing and circuitry is included with the gesture sensor such that functionality can be performed without accessing a central processor or system bus. | 05-01-2014 |
20140118258 | MOBILE TERMINAL AND CONTROL METHOD THEREOF - The present invention relates to a mobile terminal capable of controlling contents displayed on a display unit, and a control method thereof. A mobile terminal according to one exemplary embodiment includes a terminal body, a transparent display unit having first and second transparent substrates both configured to sense touch inputs, a sensing unit to sense at least one of motion and rotation of the terminal body, and a controller to determine whether or not the sensed at least one of the motion and the rotation of the terminal body meets preset condition information when an event is generated while a content is output on the first transparent substrate, and to decide whether or not to display information relating to the generated event on the second transparent substrate based on the determination result. | 05-01-2014 |
20140118259 | PORTABLE DEVICE AND METHOD FOR PROVIDING USER INTERFACE THEREOF - A method for providing a user interface based on a light sensor includes recognizing a motion of an object with respect to a portable device based on a light signal received by at least one light sensor of the portable device and, according to the motion of the object, controlling an application of the portable device. A portable device to provide a user interface based on a light sensor includes at least one light sensor to recognize a motion of an object with respect to the portable device based on a light signal received by at least one light sensor and a control unit to control an application of the portable device according to the motion of the object. | 05-01-2014 |
20140118260 | MOBILE TERMINAL AND METHOD FOR MOVING CURSOR THEREOF - The disclosure provides a mobile terminal and a method for implementing movement of a cursor thereof. The method includes: acquiring acceleration values a | 05-01-2014 |
20140118261 | INPUT DEVICE COMPRISING GEOMAGNETIC SENSOR AND ACCELERATION SENSOR, DISPLAY DEVICE FOR DISPLAYING CURSOR CORRESPONDING TO MOTION OF INPUT DEVICE, AND CURSOR DISPLAY METHOD THEREOF - A display device for displaying a cursor according to motion of an input device is provided. The display device comprises an input part which receives pitch angle information and yaw angle information corresponding to motion of an external input device; a computation part which computes a first relative angle corresponding to the information of the pitch angle and a second relative angle corresponding to the information of the yaw angle; a coordinate calculator which calculates a cursor coordinate value which gradually varies according to the changes of the first and second relative angles; and a display which displays a cursor on a position corresponding to the calculated cursor coordinate value. Thus, it is possible to avoid trembling of the cursor caused by noise. | 05-01-2014 |
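The angle-to-coordinate mapping and the gradual (anti-trembling) variation can be sketched as a linear mapping plus exponential smoothing (the screen size, angular ranges, and smoothing constant are illustrative assumptions; the patent does not specify them):

```python
def angles_to_cursor(rel_yaw_deg, rel_pitch_deg, width=1920, height=1080,
                     yaw_range_deg=40.0, pitch_range_deg=25.0):
    """Map relative yaw/pitch angles (degrees from the calibrated centre
    direction) to a screen coordinate, clamped to the display bounds."""
    x = width / 2 + (rel_yaw_deg / yaw_range_deg) * (width / 2)
    y = height / 2 - (rel_pitch_deg / pitch_range_deg) * (height / 2)
    return (min(max(round(x), 0), width - 1),
            min(max(round(y), 0), height - 1))

def smooth(prev, new, alpha=0.2):
    """Exponentially smooth successive cursor positions so the cursor
    'gradually varies' instead of trembling with sensor noise."""
    return (prev[0] + alpha * (new[0] - prev[0]),
            prev[1] + alpha * (new[1] - prev[1]))
```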
20140125592 | APPARATUS TO TRACK A POINTING DEVICE - An apparatus for use in helping to track a pointing device is disclosed herein. An example of the apparatus includes an orientation member coupled to the pointing device that includes a plurality of points at least some of which define a plane. The apparatus also includes a sensor to detect the plurality of points of the orientation member and a processor to determine a location in a workspace of each of the points of the orientation member detected by the sensor. The processor also determines a geometric property of the plane defined by at least some of the points of the orientation member and locates a position of an end of the pointing device along a line segment associated with the geometric property. A method for use in helping to track a pointing device in a workspace and a non-volatile storage medium are also disclosed herein. | 05-08-2014 |
20140125593 | USING MOTION GESTURES TO SEND, SAVE, DELETE, AND REJECT A MESSAGE - A method for providing a user interface includes creating a representative image of a media content, displaying the representative image on a screen, making the representative image a movable object on the screen, monitoring a motion gesture to move the representative image, and performing a corresponding action based on the motion gesture. The action includes saving, deleting, sending, or rejecting a message including the representative image and the media content or a link to the media content. | 05-08-2014 |
20140125594 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - Embodiments disclose a display apparatus which processes an image signal for display, and a control method thereof, the display apparatus including: a display device; an image processing device which processes an image signal to display an image on the display device; a detection device which detects a motion of a user; and a controller which moves a pointer corresponding to the motion detected by the detection device in a preset first mode, the pointer being displayed on the image, and moves the pointer in a preset second mode, which is different from the first mode, in response to a distance between a position where the motion is detected by the detection device and the display apparatus being changed. | 05-08-2014 |
20140132510 | Handheld Electronic Apparatus, Operating Method Thereof, and Non-Transitory Computer Readable Medium Thereof - A handheld electronic apparatus, an operating method thereof, and a non-transitory computer readable medium thereof are provided. The handheld electronic apparatus includes a sensor, a processing unit, and an interface. When the handheld electronic apparatus is moved from a first position to a second position, the sensor generates a plurality of sensed data. The processing unit calculates a movement direction and a movement distance according to the sensed data. The interface is connected to a computer having a monitor. The interface transmits a control signal to the computer. The control signal carries the movement direction and the movement distance so that the computer controls a cursor shown on the monitor to move from a first coordinate to a second coordinate according to the movement direction and the movement distance carried in the control signal. | 05-15-2014 |
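Deriving a movement direction and distance from accelerometer samples can be sketched by naive double integration (a toy model: rectangular integration of planar samples; real implementations must filter drift and remove gravity, which this sketch ignores):

```python
import math

def displacement_from_accel(samples, dt):
    """Doubly integrate planar (ax, ay) acceleration samples, taken every
    `dt` seconds, into a movement direction (degrees) and distance."""
    vx = vy = x = y = 0.0
    for ax, ay in samples:
        vx += ax * dt   # velocity accumulates acceleration
        vy += ay * dt
        x += vx * dt    # position accumulates velocity
        y += vy * dt
    return math.degrees(math.atan2(y, x)), math.hypot(x, y)
```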
20140132511 | CONTROL APPARATUS BASED ON EYES AND METHOD FOR CONTROLLING DEVICE THEREOF - The present invention relates to an eye-gaze based control device. The eye-gaze based control device of the present invention may control a control target device according to an eye-gaze point of a user. Here, the eye-gaze based control device controls the control target device by controlling a size of an image displayed to the user, thereby more precisely controlling the control target device. | 05-15-2014 |
20140132512 | Controlling a graphical user interface - Apparatus and methods for enabling a user to interact with movable control elements of a graphical user interface (GUI) by moving a hand | 05-15-2014 |
20140132513 | Information Processing Method And Electronic Device - An information processing method and an electronic device comprising a touch sensitive unit and a display unit separated from each other are described. The electronic device includes at least a first display mode, and a second display mode suitable for the touch sensitive unit. The method includes acquiring a third display mode of the display unit at a first timing when the electronic device is turned on; detecting whether the third display mode is the second display mode or not; acquiring coordinates of each point of N points of the display unit and generating a first control instruction, when the third display mode is the second display mode, wherein N is an integer above 2; and performing the first control instruction, to correspond the coordinates of the each point of the N points to coordinates of each touch sensitive point of the touch sensitive unit. | 05-15-2014 |
20140132514 | Portable Electronic Device With Dual Opposing Displays - A portable electronic device may include a first exterior side and a second exterior side. The first exterior side has a first display that renders a virtual keyboard. The second exterior side is located opposite the first exterior side and has a second display and a physical keyboard. The first display and the physical keyboard may be alternately and oppositely disabled and enabled in response to a reorientation signal. | 05-15-2014 |
20140132515 | SYSTEM AND METHOD FOR INPUTTING USER COMMANDS TO A PROCESSOR - A system for inputting operation system (OS) commands to a data processing device. The system comprises a video camera that captures images of a viewing space. A processor detects a predetermined object in the images using an object recognition algorithm not involving background information in an image. One or more image analysis parameters of the object are extracted from the images and one or more motion detection tests are applied. Each motion detection test has an associated OS command, and when a test succeeds, the OS command associated with the test is executed. By not relying on background information in an image, the system of the invention may be used in devices that are moved in use, such as a palmtop computer, a personal digital assistant (PDA), a mobile telephone, a digital camera, and a mobile game machine. | 05-15-2014 |
20140139431 | METHOD FOR DISPLAYING IMAGES OF TOUCH CONTROL DEVICE ON EXTERNAL DISPLAY DEVICE - The present invention provides a method for displaying images of a touch control device on an external display device. The method comprises providing a graphical icon on the image displayed on the external display device; providing a control interface for controlling the graphical icon on the touch control device; receiving a control signal from the control interface; and performing an operation corresponding to the control signal. | 05-22-2014 |
20140139432 | System and Method for Determining an Attitude of a Device Undergoing Dynamic Acceleration - A system and a method for determining an attitude of a device undergoing dynamic acceleration is presented. A first attitude measurement is calculated based on a magnetic field measurement received from a magnetometer of the device and a first acceleration measurement received from a first accelerometer of the device. A second attitude measurement is calculated based on the magnetic field measurement received from the magnetometer of the device and a second acceleration measurement received from a second accelerometer of the device. A correction factor is calculated based at least in part on a difference of the first attitude measurement and the second attitude measurement. The correction factor is then applied to the first attitude measurement to produce a corrected attitude measurement for the device. | 05-22-2014 |
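The two-accelerometer correction can be sketched as follows (a simplified, gravity-only model: the magnetometer/heading term is omitted, and the fixed blending gain is an assumption; the abstract states only that the correction factor is based on the difference of the two attitude measurements):

```python
import math

def tilt_from_accel(ax, ay, az):
    """Pitch and roll (radians) implied by one accelerometer reading,
    assuming the reading is dominated by gravity."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    return pitch, roll

def corrected_attitude(acc1, acc2, gain=0.5):
    """Dynamic acceleration perturbs two spatially separated accelerometers
    differently, so the disagreement between the attitudes they imply is
    used as a correction factor applied to the first estimate."""
    p1, r1 = tilt_from_accel(*acc1)
    p2, r2 = tilt_from_accel(*acc2)
    return p1 + gain * (p2 - p1), r1 + gain * (r2 - r1)
```

When the device is static both sensors agree and the correction vanishes; any disagreement shifts the estimate toward the second reading.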
20140139433 | REMOTE CONTROLLER AND DISPLAY APPARATUS, CONTROL METHOD THEREOF - A remote controller includes a communicator configured to perform communication with a display apparatus which provides a user interface screen, a sensor configured to sense a movement of the remote controller, and a controller configured to control so that, when a preset event occurs, motion information of the remote controller as sensed at the sensor is mapped with a reference point of a pointer provided on the user interface screen. | 05-22-2014 |
20140145948 | INTERACTIVE PROJECTION SYSTEM AND METHOD FOR CALIBRATING POSITION OF LIGHT POINT THEREOF - A method for calibrating the position of a light point is used for an interactive projection system including an image capture device. The image capture device includes an image sensor and an optical filter. The method includes the following steps. A plurality of calibrating patterns formed by visible light is individually projected on a screen. The distance between a calibrating pattern with a large area and an optical axis of the image sensor is larger than that between a calibrating pattern with a small area and the optical axis. Next, the exposure time of the image sensor is adjusted so the image sensor can capture the images of the calibrating patterns through the optical filter. According to the images of the calibrating patterns, a plurality of coordinate positions is obtained. Then, the coordinate positions are adjusted according to a standard coordinate. | 05-29-2014 |
20140145949 | INSTRUCTION INPUT DEVICE AND METHOD USING EYE MOVEMENT - An instruction input device and method using eye movement in which the error rate of detecting the point at which a user gazes is reduced by storing frequencies of an object moving on a screen and instructions corresponding thereto. In addition, the error rate is reduced by comparing the frequency detected from eye movement gazing at the object with the frequency of the object, and determining an instruction input when the compared value falls within a predetermined range. | 05-29-2014 |
20140145950 | WRITING DEVICE HAVING LIGHT EMITTING DIODE DISPLAY PANEL - A writing device includes an emitter and a light emitting diode display panel. The light emitting diode display panel includes an array and a receiver. The array has a plurality of pixels each including a first light source, a second light source, a third light source, a driving circuit and a light detecting device. The light detecting device is configured to receive the control signal from the emitter and transmit the control signal to the driving circuit to turn on the pixel. The receiver is electrically connected with the driving circuit. The receiver is configured to receive at least one of the first selective signal, the second selective signal and the third selective signal and transmit it to the driving circuit to turn on at least a corresponding one of the first light source, the second light source, and the third light source. | 05-29-2014 |
20140145951 | DISPLAY DEVICE AND CURSOR CONTROLLING METHOD THEREOF - A display device controlled by a remote control includes an image capture device and a display screen. A remote control of the display device emits light using a light-emitting element, and real-time images of the remote control are captured during the time when the light-emitting element emits light. The captured images are analyzed to calculate a horizontal movement and a vertical movement of the light-emitting element. The display device controls movements of a cursor displayed on the display screen according to the horizontal movement and the vertical movement of the light-emitting element. | 05-29-2014 |
20140145952 | COORDINATE CALCULATION APPARATUS AND STORAGE MEDIUM HAVING COORDINATE CALCULATION PROGRAM STORED THEREIN - A coordinate calculation apparatus calculates a coordinate point representing a position on a display screen based on an orientation of an input device. The coordinate calculation apparatus includes direction acquisition means, orientation calculation means, first coordinate calculation means, and correction means. The direction acquisition means acquires information representing a direction of the input device viewed from a predetermined position in a predetermined space. The orientation calculation means calculates the orientation of the input device in the predetermined space. The first coordinate calculation means calculates a first coordinate point for determining the position on the display screen based on the orientation of the input device. The correction means corrects the first coordinate point such that the first coordinate point calculated when the input device is directed in a predetermined direction takes a predetermined reference value. | 05-29-2014 |
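The correction step (forcing the coordinate obtained when pointing in the predetermined direction to a reference value) can be sketched as a calibration offset (the function and parameter names are illustrative; the abstract does not give the correction formula):

```python
def make_corrected_mapper(raw_at_reference, reference_point):
    """Build the correction described above: offset raw orientation-derived
    coordinates so that pointing in the predetermined direction yields the
    predetermined reference value (e.g. the screen centre)."""
    ox = reference_point[0] - raw_at_reference[0]
    oy = reference_point[1] - raw_at_reference[1]
    def correct(raw):
        return raw[0] + ox, raw[1] + oy
    return correct
```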
20140152560 | SYSTEMS FOR CHANGING ORIENTATION OF DISPLAYED CONTENT - In a multiuser touch sensitive device including an orientation module capable of providing touch interfaces and a touch sensitive display screen for displaying content, wherein the device enables users to provide touch sensitive inputs through the display screen and wherein not all users are oriented in the same direction with respect to the device and the content, a method for changing an orientation of the content without changing an orientation of the device is provided. The method includes receiving a touch sensitive input from one of the touch interfaces, wherein the input is provided by one of the users using a single object for changing the orientation of the content; and processing the touch sensitive input at the orientation module for changing the orientation of the content such that the content can be oriented toward at least one of the users for viewing at a desired orientation. | 06-05-2014 |
20140152561 | MOUSE PADS AND METHOD FOR USING THE SAME - A mouse pad including a body having a surface, a touch panel, and a processor is provided. The touch panel is located on the surface and electrically connected to the processor. The processor receives signals from the touch panel and can divide the touch panel into a first area and a second area, the first area and the second area respectively acting as a left mouse button (for left clicks) and as a right mouse button (for right clicks). Methods for using the mouse pad are also provided. | 06-05-2014 |
20140152562 | DISPLAY CONTROLLER, DISPLAY SYSTEM, STORAGE MEDIUM AND METHOD - An exemplary display controller includes: an attitude detecting unit configured to detect an attitude of a terminal device; a first display controlling unit configured to control a display unit to display a partial image that is clipped in response to the detected attitude from a panoramic image corresponding to a position; a determining unit configured to determine a direction in which the position is moving; and a second display controlling unit configured to update, in response to a first operation input, an image displayed on the display unit from the first partial image to a second partial image that is clipped in response to the determined direction from the panoramic image. | 06-05-2014 |
20140152563 | APPARATUS OPERATION DEVICE AND COMPUTER PROGRAM PRODUCT - According to one embodiment, an apparatus operation device includes: a direction operation module configured to receive an operating instruction in a two-dimensional direction; a recognizing module configured to recognize a swing motion or a tilting motion of the apparatus operation device; and an output module configured to output a first operation command corresponding to a first operation for screen transition corresponding to a predetermined condition when the swing motion or the tilting motion recognized satisfies the predetermined condition, and to output a second operation command corresponding to a second operation for transition of a pointer on the screen in the two-dimensional direction when the operating instruction in the two-dimensional direction is received. | 06-05-2014 |
20140152564 | PRESENTATION SELECTION BY DEVICE ORIENTATION - A technique involves providing multiple different presentations on a mobile device, and switching between the different presentations depending upon the orientation of the mobile device. For example, a first presentation could include a scripted action, which may include recorded speech or be used concurrently with a speaker, and a second presentation that enables free play within a model. When the mobile device is held in a first orientation, the scripted action is used to present a pitch, lesson, or the like. When the mobile device is held in a second orientation, the free play enables a user of the mobile device to illustrate reactions within a model by way of example or in response to questions that can most effectively be answered within a model. | 06-05-2014 |
20140152565 | EASILY DEPLOYABLE INTERACTIVE DIRECT-POINTING SYSTEM AND PRESENTATION CONTROL SYSTEM AND CALIBRATION METHOD THEREFOR - A method for controlling movement of a computer display cursor based on a point-of-aim of a pointing device within an interaction region includes projecting an image of a computer display to create the interaction region. At least one calibration point having a predetermined relation to said interaction region is established. A pointing line is directed to substantially pass through the calibration point while measuring a position of and an orientation of the pointing device. The pointing line has a predetermined relationship to said pointing device. Movement of the cursor is controlled within the interaction region using measurements of the position of and the orientation of the pointing device. | 06-05-2014 |
20140160019 | METHODS FOR ENHANCING USER INTERACTION WITH MOBILE DEVICES - A method for enhancing user interaction with mobile electronic devices is presented. The method includes determining screen orientation on the device by first detecting the presence of a user using data captured by a camera of the portable electronic device. The method further includes searching the data from the camera for a plurality of physical characteristics of the user if a user is detected. The method also includes determining a facial orientation of the user based on information regarding at least one physical characteristic of the user determined from the data. Finally, the method includes setting a screen orientation of a display device of the portable electronic device based on the determined facial orientation of the user. | 06-12-2014 |
20140160020 | OPTICAL TOUCH DEVICE AND OPERATION METHOD THEREOF - An operation method of an optical touch device includes: emitting, by a light emitting unit, a light beam to illuminate an object; capturing, by an image sensing device, an image of the object reflecting the light beam; selecting all pixels in the image having a brightness greater than or equal to a brightness threshold; sorting the selected pixels along a first coordinate axis of the image, a second coordinate axis of the image or based on pixel brightness; selecting the top first predetermined ratio of pixels from the sorted pixels as an object image of the object; and calculating a gravity center of the object image according to positions of the top first predetermined ratio of pixels or according to the positions of the top first predetermined ratio of pixels with a weight of pixel brightness. An optical touch device is also provided. | 06-12-2014 |
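The thresholding, sorting, top-ratio selection, and brightness-weighted gravity-center steps map directly onto a short sketch (sorting here is by brightness, one of the three options the abstract lists; names and the default ratio are illustrative):

```python
def object_centroid(image, brightness_threshold, top_ratio=0.5):
    """Keep pixels at/above the threshold, sort them by brightness, take
    the top `top_ratio` fraction as the object image, and return its
    brightness-weighted centre of gravity (cx, cy)."""
    candidates = [(b, x, y)
                  for y, row in enumerate(image)
                  for x, b in enumerate(row)
                  if b >= brightness_threshold]
    if not candidates:
        return None
    candidates.sort(reverse=True)            # brightest first
    keep = candidates[:max(1, int(len(candidates) * top_ratio))]
    total = sum(b for b, _, _ in keep)
    cx = sum(b * x for b, x, _ in keep) / total
    cy = sum(b * y for b, _, y in keep) / total
    return cx, cy
```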
20140160021 | Optical Mouse with Cursor Rotating Ability - A surface navigation device for a computer or similar graphical display and methods for implementing its operation. The device moves sensitively and precisely over a surface such as a desktop and operator generated changes in its position relative to targetable objects on that desktop arranged about the circumference of a pseudo-circle are described in terms of a lumped motion vector. The motion vector is decomposed into a translational and rotational part by metrical and topological methods. The device communicates each part of the decomposed motion quickly and accurately to a computer screen or other display where it may implement the motion of a cursor or it may be used to manipulate objects having a 3D character by providing them with translational and rotational motions. The rotational parameter generated by the device may also be used independently to trigger some computer action. | 06-12-2014 |
20140168079 | CURSOR CONTROL SYSTEM - Disclosed is a cursor control system for controlling a cursor position of a display screen. The cursor control system includes a sensing module, a speed tuning module, a signal processing module and a transmission module. The sensing module includes a direction sensing unit and an acceleration sensing unit, such that the instant posture of a user's hand can be detected and an intuitive inertial behavior can be used to achieve the effect of controlling a cursor. The speed tuning module can increase or decrease the moving speed of the display screen cursor to improve the convenience for users to control the cursor. | 06-19-2014 |
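The speed-tuning module amounts to a user-adjustable gain applied to the sensed cursor deltas; a minimal sketch (the gain step and floor are assumptions):

```python
def tuned_delta(dx, dy, speed_level, base_gain=1.0, step=0.25):
    """Scale raw cursor deltas from the sensing module by a user-selected
    speed level; each level up or down changes the gain by `step`,
    floored so the cursor never freezes entirely."""
    gain = max(0.1, base_gain + speed_level * step)
    return dx * gain, dy * gain
```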
20140168080 | OPTICAL INPUT APPARATUS - An optical input apparatus is provided and includes an input device for providing data and control signals to a computer; and at least one object sensing module disposed on a top of the input device and each including an optical sensor and a light source. The light source is capable of emitting light upward divergently to create a virtual sensing space above. | 06-19-2014 |
20140168081 | 3D REMOTE CONTROL SYSTEM EMPLOYING ABSOLUTE AND RELATIVE POSITION DETECTION - The present invention can include three-dimensional remote control systems that can detect an absolute location to which a remote control is pointing in first and second orthogonal axes and an absolute position of the remote control in a third orthogonal axis. Remote control systems of the present invention can employ absolute position detection with relative position detection. Absolute position detection can indicate an initial absolute position of the remote control and relative position detection can indicate changes in the position of the remote control. By combining absolute and relative position detection, remote control systems of the present invention can track remote controls more precisely than systems that only employ absolute position detection. The present invention also can include methods and apparatus for zooming in and out of an image shown on a display based on the absolute position of the remote control in the third axis. | 06-19-2014 |
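Combining an absolute fix with relative deltas is the classic complementary-filter pattern; a one-axis sketch (the blend constant is an assumption, and the patent's actual fusion method is not given in the abstract):

```python
def fuse(absolute, previous, relative_delta, alpha=0.1):
    """Propagate the previous estimate by the sensed relative motion, then
    pull the prediction a fraction `alpha` toward the absolute fix:
    relative data gives smoothness, absolute data prevents drift."""
    predicted = previous + relative_delta
    return predicted + alpha * (absolute - predicted)
```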
20140176435 | Computer input device - A computer input device is disclosed which comprises a keyboard having a plurality of keys for entering commands and characters into the computer, a touch sensor for detecting one or more touches by one or more objects on a surface area of the plurality of keys, and an input processor coupled to both the keyboard and the touch sensor, where the input processor is configured to switch the computer input device to a mouse mode when the touch sensor has detected one of the plurality of keys being touched prior to the key being pressed, and the input processor is configured to switch the computer input device to a keyboard mode when the touch sensor has detected one of the plurality of keys being touched and pressed at approximately the same starting time. | 06-26-2014 |
20140176436 | TECHNIQUES FOR GESTURE-BASED DEVICE CONNECTIONS - Techniques for gesture-based device connections are described. For example, a method may comprise receiving video data corresponding to motion of a first computing device, receiving sensor data corresponding to motion of the first computing device, comparing, by a processor, the video data and the sensor data to one or more gesture models, and initiating establishment of a wireless connection between the first computing device and a second computing device if the video data and sensor data correspond to gesture models for the same gesture. Other embodiments are described and claimed. | 06-26-2014 |
20140176437 | METHOD AND DEVICE FOR SENSING ORIENTATION OF AN OBJECT IN SPACE IN A FIXED FRAME OF REFERENCE - The invention discloses an improved method and device for sensing orientation of an object in space in a Working Reference Frame. The device of the invention includes an angular velocity sensor with at least two sensing axes and a linear acceleration sensor with at least three sensing axes. The method used for sensing orientation of the object in space in the Working Reference Frame uses synthetic values of the gravity component of the acceleration of the object. The computation depends upon the number of sensing axes of the angular velocity sensor and upon the dynamicity of the movement of the object. The dynamicity of the movement of the object is tested to compute the synthetic values which are used. The method and device of the invention allow control of a cursor on a display, independently of the roll imparted to the device by a user, in a seamless manner which is greatly improved over the devices and methods of the prior art. | 06-26-2014 |
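The dynamicity test can be sketched as a check of the accelerometer magnitude against 1 g: near 1 g the motion is quasi-static and the raw reading can stand in for gravity; otherwise a previously held synthetic value is carried forward (the tolerance and hold-last-value policy are assumptions, not the patent's formula):

```python
import math

G = 9.81  # assumed gravity magnitude, m/s^2

def synthetic_gravity(accel, prev_gravity, tolerance=0.5):
    """Dynamicity test: if the accelerometer magnitude is close to 1 g the
    movement is treated as quasi-static and the raw reading is taken as
    gravity; otherwise the previous synthetic value is carried forward."""
    mag = math.sqrt(sum(a * a for a in accel))
    if abs(mag - G) <= tolerance:
        return tuple(accel)
    return prev_gravity
```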
20140176438 | HANDHELD DEVICE AND DISPLAY CONTROL METHOD - An exemplary display control method is provided. The method determines whether the handheld device is being held by a right hand or a left hand according to the angle of inclination of the top of the handheld device relative to the Y axis, as detected by a detection unit. The method then controls a display unit to display information in a first display mode when the handheld device is being held by a right hand, and controls the display unit to display information in a second display mode when the handheld device is being held by a left hand. | 06-26-2014 |
20140184502 | PORTABLE DEVICE WITH DISPLAY MANAGEMENT BASED ON USER INTENT INTELLIGENCE - The claimed subject matter provides a system for enhancing user experience while conserving power in a portable device. The system includes logic to: determine whether a portable electronic device is being held for viewing; and maintain performance of a display of the portable electronic device at least partially in response to a determination that the portable electronic device is being held for viewing. | 07-03-2014 |
20140184503 | TERMINAL AND METHOD FOR OPERATING THE SAME - A terminal includes a button, a touch screen receiving a touch input from an outside and displaying a display screen, and a controller controlling the display screen of the touch screen in accordance with a state of the button and the received touch input, the state of the button being pressed or press-released, wherein if a press or a press release of the button is sensed, the controller displays an operation region having a size that is smaller than a size of the display screen, and wherein, if the touch input is at a first point within the operation region, the controller is configured to execute a same operation as an operation executed in response to a touch input at a second point corresponding to the first point, the second point being within an entire region of the display screen of the touch screen. | 07-03-2014 |
20140184504 | ELECTRONIC DEVICE AND METHOD FOR CONTROLLING SCREEN ORIENTATION THEREOF - A method for controlling a screen orientation for an electronic device is provided. The method comprises generating a sensing signal when an edge of the electronic device is held by a user, determining a screen display mode according to the sensing signal, determining a direction of a gravity vector from a center of gravity of the electronic device, determining a right-side-up display orientation according to the determined direction of the gravity vector, and displaying a screen in the right-side-up orientation under the determined screen display mode. An electronic device using the method for controlling the screen orientation is also provided. | 07-03-2014 |
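Choosing a right-side-up orientation from the direction of the gravity vector, as in the entry above, can be sketched as a comparison of axis components. The axis conventions and orientation names below are assumptions (+y toward the device top, +x toward its right edge, gravity reported in device coordinates).

```python
def display_orientation(gx, gy):
    """Map a 2-axis gravity reading to a right-side-up orientation.

    Whichever axis carries most of gravity decides portrait vs.
    landscape; its sign decides which way is up. Conventions here
    are illustrative, not taken from the patent.
    """
    if abs(gy) >= abs(gx):
        # gravity mostly along the device's long axis
        return "portrait" if gy <= 0 else "portrait-flipped"
    # gravity mostly along the device's short axis
    return "landscape-right" if gx > 0 else "landscape-left"
```

Holding the device upright (gravity pulling toward the bottom edge) yields portrait; tilting it onto its right edge yields a landscape mode.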
20140184505 | Magnetic Vector Sensor Positioning and Communications System - A system is described herein for monitoring the movement of one or more magnets located external to a device using the vector data from one or more magnetic vector sensors incorporated in the device to determine a position and/or to communicate information. | 07-03-2014 |
20140184506 | ELECTRO-OPTICAL POINTING DEVICE - This utility model relates to systems for interfacing with a computer via control of an image created on a projector screen and can be used for the interaction of a user with the computer during presentations, training sessions or educational events. The electro-optical pointing device comprises a source of radiation in the form of a light wand and a video camera. The video camera is connected to the computer and records the movement of a light spot over the screen, onto which an image being transmitted from the computer is projected. The source of radiation is a laser or a light-emitting diode emitting radiation in the infrared spectral range. An infrared range light filter is mounted in front of the lens of the video camera. The utility model makes it possible to simplify the design of the device whilst maintaining the operational efficiency thereof. | 07-03-2014 |
20140184507 | DISPLAY DEVICE AND DISPLAY CONTROL SYSTEM - A display control system includes a display device including a display area in which a plurality of pixels is provided and which displays an image, and a pointer device configured to indicate a location on the display area. A locational information pattern indicating the location on the display area is provided on the display area. The locational information pattern is made of a plurality of marks which are provided in the sub-pixels and absorb or reflect light. The pointer device is configured to optically read the locational information pattern in a location indicated on the display area. The display device controls the display area such that display contents are updated in a location corresponding to the locational information pattern read by the pointer device. | 07-03-2014 |
20140191963 | APPARATUS AND METHOD FOR CONTROLLING A USER INTERFACE OF A DEVICE - Certain aspects of an apparatus and a method for controlling a user interface of a device may comprise one or more sensors coupled to a vibratory surface associated with the apparatus. The one or more sensors may detect one or more vibrations of the vibratory surface caused by an interaction of an object with the vibratory surface. The one or more sensors may generate one or more vibratory signals in response to the detected one or more vibrations. One or more processors that are communicatively coupled to the one or more sensors may generate a control signal corresponding to the one or more generated vibratory signals to control the user interface of the device. | 07-10-2014 |
20140191964 | Headset Computer with Head Tracking Input Used For Inertial Control - A head tracker is built into a headset computer (HSC) as a user input device. A user interface navigation tool utilizes the head tracking but with inertial control. The navigation tool is formed of two different-sized circles depicted concentrically, and a pointer. The pointer is moveable within the two circles defining inner and outer boundaries. The pointer represents the user's head position and movement sensed by the head tracker. The HSC displays a document and pans (navigates) the document as a function of user head movement sensed by the head tracker and illustrated by the navigation tool. The direction of movement of the pointer depicted in the navigation tool defines the pan direction of the displayed document. The pan speed of the displayed document is defined by the position of the pointer with respect to the inner and outer circle boundaries in the navigation tool. | 07-10-2014 |
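The inner/outer-circle navigation tool described above implies a pan-speed rule along these lines; the linear ramp and numeric values below are assumptions, not taken from the patent.

```python
def pan_speed(r, inner_radius, outer_radius, max_speed=100.0):
    """Pan speed from the pointer's distance r from the circles' center.

    Inside the inner circle there is no panning (a dead zone so small
    head movements don't scroll the document); between the circles the
    speed ramps linearly, reaching max_speed at the outer boundary.
    """
    if r <= inner_radius:
        return 0.0
    r = min(r, outer_radius)            # clamp at the outer boundary
    return max_speed * (r - inner_radius) / (outer_radius - inner_radius)
```

A pointer halfway between the boundaries pans at half the maximum speed; beyond the outer circle the speed stays clamped at the maximum.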
20140191965 | REMOTE POINT OF VIEW - In a system and method that provides a remote point of view, a user views an image (e.g., captured by an imaging device such as a camera) on a display. The position of the user's head relative to display device is detected and the image is processed in response to a ‘point of view’ derived from the position of the user's head relative to the display device. A change in the position of the user's head relative to the display device may be detected and the image may be reprocessed in response to a revised ‘point of view’ derived from the change in position of the user's head relative to the display device. | 07-10-2014 |
20140191966 | INTERACTIVE IMAGE SYSTEM AND OPERATING APPARATUS THEREOF - An operating apparatus adapted to manipulate an operating interface of a display device and having a switchable operation mode is provided. The display device is adapted to provide a first optical signal. The operating apparatus includes a processing module, a light-emitting element, a light sensing element and a switching module. The light-emitting element, the light sensing element and the switching module are electrically connected to the processing module. The switching module is adapted to switch the switchable operation mode to a first mode or a second mode. When the switchable operation mode is switched to the first mode, the light sensing element receives the first optical signal. When the switchable operation mode is switched to the second mode, the light-emitting element provides a second optical signal and the light sensing element receives the second optical signal. An interactive image system having the operating apparatus is also provided. | 07-10-2014 |
20140191967 | ELECTRONIC DEVICE FOR RECOGNIZING ASYNCHRONOUS DIGITAL PEN AND RECOGNIZING METHOD THEREOF - An electronic device for receiving an input by a digital pen having a waveform generation means is provided. The electronic device includes at least three reception sensors and a processor. The at least three reception sensors are installed in the electronic device at positions that are separated from one another, and are configured to receive a waveform generated by the waveform generator of the digital pen. The processor is configured to calculate an input coordinate of the digital pen using a difference of velocities and reception times of the waveform received by the at least three reception sensors. | 07-10-2014 |
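Computing a pen coordinate from differences in waveform arrival times at three or more receivers, as described above, is a time-difference-of-arrival (TDOA) problem. The brute-force grid search below is a sketch under assumed sensor positions and wave speed; real implementations would solve it in closed form or with least squares rather than by exhaustive search.

```python
import math

def locate_pen(sensors, arrival_times, speed=343.0, grid=100, size=0.5):
    """Estimate a pen position on a size x size work area via TDOA.

    For each candidate grid point, compare predicted pairwise
    arrival-time differences against the measured ones and keep the
    point with the smallest squared error. Sensor layout, wave speed
    (speed of sound in air, m/s) and area size are illustrative.
    """
    def residual(x, y):
        dists = [math.hypot(x - sx, y - sy) for sx, sy in sensors]
        err = 0.0
        for i in range(len(sensors)):
            for j in range(i + 1, len(sensors)):
                measured = arrival_times[i] - arrival_times[j]
                predicted = (dists[i] - dists[j]) / speed
                err += (measured - predicted) ** 2
        return err

    step = size / grid
    best = min((residual(ix * step, iy * step), ix * step, iy * step)
               for ix in range(grid + 1) for iy in range(grid + 1))
    return best[1], best[2]
```

Only time *differences* enter the residual, so an unknown common transmit time (the pen and device being asynchronous) cancels out, which is the point of the TDOA formulation.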
20140191968 | USER INTERFACE BASED ON MAGNETIC INDUCTION - A receiving and transmitting node for a wireless data network, and a wireless data network based on magnetic induction. The receiving node includes an antenna receive module for receiving one or more data signals emitted from the transmitting node and a calculation module adapted to calculate one or more distances between the receiving node and the transmitting node, and/or adapted to calculate the position of the transmitting node in relation to the position of the receiving node, and/or adapted to calculate the orientation of the transmitting node in relation to the orientation of the receiving node. The reception of the data signal is based on magnetic induction and the calculation of the one or more distances, and the position of the transmitting node and/or the orientation of the transmitting node is based on the one or more data signals. | 07-10-2014 |
20140198041 | POSITION INFORMATION OBTAINING DEVICE AND METHOD, AND IMAGE DISPLAY SYSTEM - A position information obtaining device and a position information obtaining method are provided. Each of the position information obtaining device and the position information obtaining method captures images to which a light pointer is directed and on which a light spot is formed, in chronological order, and estimates position information of a specified position on the images specified by the light pointer using a plurality of pieces of image information obtained from the captured image. | 07-17-2014 |
20140198042 | OPERATION INPUT DEVICE AND METHOD, PROGRAM, AND ELECTRONIC APPARATUS - An operation input device includes: angular velocity detecting means for detecting an angular velocity; relative velocity detecting means for contactlessly detecting a relative velocity to a target object; distance detecting means for detecting a distance to the target object; and computing means for computing an amount of movement based on the angular velocity, the relative velocity, and the distance. | 07-17-2014 |
20140198043 | THE METHOD FOR GENERATING POINTER MOVEMENT VALUE AND POINTING DEVICE USING THE SAME - Disclosed are an apparatus for calculating the movement value of a pointer, a method of correcting the movement value of the pointer, and a 3D pointing device. The method of generating correction information for the pointer includes acquiring movement data of a 3D pointing device during a predetermined time, adding up values of the movement data, generating control information and moving the pointer if the added-up result of the movement data is greater than or equal to a threshold value, and transmitting the control information to the pointer. | 07-17-2014 |
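The add-up-and-threshold step described above can be sketched as follows; the threshold value and the sample format are illustrative assumptions, not values from the filing.

```python
def filter_movement(samples, threshold=3.0):
    """Suppress pointer jitter by thresholding accumulated movement.

    samples is a list of (dx, dy) movement deltas collected over a
    predetermined time window. Only when the summed movement reaches
    the threshold is control information (the move) emitted; smaller
    totals are treated as hand tremor and dropped.
    """
    total_dx = sum(dx for dx, _ in samples)
    total_dy = sum(dy for _, dy in samples)
    if abs(total_dx) + abs(total_dy) >= threshold:
        return (total_dx, total_dy)   # send as control information
    return None                       # below threshold: no movement
```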
20140204026 | Synchronizing A Cursor from a Managed System with a Cursor from a Remote System - A method includes receiving reports of the pointing device events occurring on a remote computer at a host computer and performing computations in the host computer based upon the mouse reports. The method includes generating screen images in the host computer based upon the computations, the screen images not containing images of a cursor representing locations pointed to by a pointing device of the host computer. The generated screen images are transmitted to the remote computer. In some embodiments, the reports may be received by a remote console controller. An information handling system includes boot firmware to set a mouse to operate in absolute mode under control of the boot firmware. An information handling system separately transmits to a remote console controller of the information handling system screen images without a cursor and cursor images. | 07-24-2014 |
20140204027 | SMART BEZEL ICONS FOR TABLETS - A system, method, and computer-readable medium are disclosed for facilitating user interaction with a mobile device. A first control function is associated with a first user control located in a first location on the bezel of a mobile device to establish a target operational position for any user control that is subsequently associated with the control function. When the orientation of the mobile device is changed, the first control function is associated with a second user control, located in a second location on the bezel, to maintain the target operational position of the user control associated with the control function. | 07-24-2014 |
20140210714 | IMAGE DISPLAY APPARATUS AND METHOD FOR OPERATING THE SAME - An image display apparatus and a method for operating the same are disclosed. The method for operating an image display apparatus includes displaying a home screen including at least one card object including a content list, displaying a dynamic screen on the home screen if dynamic screen display input is received, and moving and displaying the dynamic screen on the home screen if dynamic screen movement input is received. Therefore, it is possible to increase user convenience. | 07-31-2014 |
20140210715 | GESTURE DETECTION DEVICE FOR DETECTING HOVERING AND CLICK - There is provided a gesture detection device including two linear image sensor arrays and a processing unit. The processing unit is configured to compare sizes of pointer images in the image frames captured by the two linear image sensor arrays in the same period or different periods so as to identify a click event. | 07-31-2014 |
20140210716 | GESTURE DETECTION DEVICE FOR DETECTING HOVERING AND CLICK - There is provided a gesture detection device including two linear image sensor arrays and a processing unit. The processing unit is configured to compare sizes of pointer images in the image frames captured by the two linear image sensor arrays in the same period or different periods so as to identify a click event. | 07-31-2014 |
20140218290 | SYSTEM AND METHODS FOR PROVIDING ORIENTATION COMPENSATION IN POINTING DEVICES - Axis orientation compensation is provided in a system in which movement of a controlling device is used to control navigational functions of a target appliance by determining which one of plural sides of the controlling device is an active side of the controlling device and by causing navigational functions of the target appliance made relative to at least one of an X, Y, and Z axis of the target appliance to be dynamically aligned with movements of the controlling device made relative to at least one of an A, B, and C axis of the controlling device as a function of the one of the plural sides of the controlling device that is determined to be the active side of the controlling device. | 08-07-2014 |
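Re-aligning controller axes with the target appliance's axes depending on which side of the controlling device is active, as the entry above describes, amounts to applying a per-side rotation to the motion deltas. The side names and the 90-degree rotation convention below are assumptions for illustration.

```python
def remap_motion(dx, dy, active_side):
    """Rotate controller motion deltas into the appliance's screen axes.

    Each active side corresponds to a fixed rotation of the device's
    A/B axes relative to the screen's X/Y axes; the mapping here is an
    assumed convention, not the patent's specific table.
    """
    if active_side == "front":
        return dx, dy            # axes already aligned
    if active_side == "left":
        return -dy, dx           # device rotated 90 degrees
    if active_side == "back":
        return -dx, -dy          # device rotated 180 degrees
    if active_side == "right":
        return dy, -dx           # device rotated 270 degrees
    raise ValueError(f"unknown side: {active_side}")
```

With this in place a rightward wrist movement produces a rightward cursor movement regardless of which face of the controller is up.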
20140218291 | ALIGNING VIRTUAL CAMERA WITH REAL CAMERA - Embodiments are disclosed that relate to aligning a virtual camera with a real camera. For example, one disclosed embodiment provides a method comprising receiving accelerometer information from a mobile computing device located in a physical space and receiving first image information of the physical space from a capture device separate from the mobile computing device. Based on the accelerometer information and first image information, a virtual image of the physical space from an estimated field of view of the camera is rendered. Second image information is received from the mobile computing device, and the second image information is compared to the virtual image. If the second image information and the virtual image are not aligned, the virtual image is adjusted. | 08-07-2014 |
20140218292 | DATA SEARCHING METHOD AND SYSTEM - An electronic device includes a touch-sensitive screen, a cursor locating module, a touch detecting module, and a user interface (UI) generating module. The cursor locating module locates a cursor in a graphical user interface (GUI) displayed on the touch-sensitive screen. The touch detecting module detects whether a point of the touch-sensitive screen corresponding to the cursor is continuously pressed for a preset time duration. When the point is continuously pressed for the preset time duration, the UI generating module generates a search UI and displays the search UI adjacent to the cursor in the GUI. A method for searching data in a touch-sensitive device is also provided. | 08-07-2014 |
20140218293 | Method And Apparatus For User Interface Of Input Devices - A 3 dimensional (3-D) user interface system employs: one or more 3-D projectors configured to display an image at a first location in 3-D space; one or more sensors configured to sense user interaction with the image and to provide user interaction information; and a processor configured (i) to receive the user interaction information from the one or more sensors; (ii) to correlate the user interaction with the image; and (iii) to provide one or more indications responsive to a correlation of the user interaction with the image, including displaying the image at a second location in 3-D space. | 08-07-2014 |
20140232650 | User Center-Of-Mass And Mass Distribution Extraction Using Depth Images - Embodiments described herein use depth images to extract user behavior, wherein each depth image specifies that a plurality of pixels correspond to a user. A depth-based center-of-mass position is determined for the plurality of pixels that correspond to the user. Additionally, a depth-based inertia tensor can also be determined for the plurality of pixels that correspond to the user. In certain embodiments, the plurality of pixels that correspond to the user are divided into quadrants and a depth-based quadrant center-of-mass position is determined for each of the quadrants. Additionally, a depth-based quadrant inertia tensor can be determined for each of the quadrants. Based on one or more of the depth-based center-of-mass position, the depth-based inertia tensor, the depth-based quadrant center-of-mass positions or the depth-based quadrant inertia tensors, an application is updated. | 08-21-2014 |
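A depth-based center of mass over the user's pixels, as described above, can be sketched as a weighted centroid. Weighting each pixel by the square of its depth (to approximate the physical surface area a pixel covers at that distance) is one plausible choice, not necessarily the weighting the patent uses.

```python
def center_of_mass(depth_pixels):
    """Depth-weighted center of mass of the pixels labeled as the user.

    depth_pixels is a list of (x, y, depth) tuples. A pixel's weight is
    depth squared, since a pixel at twice the distance covers roughly
    four times the physical area; this weighting is an assumption.
    """
    total = sum(d * d for _, _, d in depth_pixels)
    cx = sum(x * d * d for x, _, d in depth_pixels) / total
    cy = sum(y * d * d for _, y, d in depth_pixels) / total
    return cx, cy
```

With uniform depth the result reduces to the plain centroid; a deeper (farther) pixel pulls the result toward itself.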
20140232651 | CASCADING OPTICS IN OPTICAL COMBINERS OF HEAD MOUNTED DISPLAYS - An apparatus for a head mounted display includes a display module for launching display light along a forward propagating path. The apparatus also includes a light relay to receive the display light. The light relay includes a first optic disposed along the forward propagating path. The light relay also includes a second optic disposed along the forward propagating path between the first optic and the display module. The first optic is configured to direct the display light in an eye-ward direction and the second optic is configured to direct the display light in an eye-ward direction. | 08-21-2014 |
20140232652 | CALIBRATION OF PORTABLE DEVICES IN A SHARED VIRTUAL SPACE - Methods, systems, and computer programs are provided for generating an interactive space. One method includes operations for associating a first device to a reference point in 3D space, and for calculating by the first device a position of the first device in the 3D space based on inertial information captured by the first device and utilizing dead reckoning. Further, the method includes operations for capturing images with a camera of the first device, and for identifying locations of one or more static features in the images. The position of the first device is corrected based on the identified locations of the one or more static features, and a view of an interactive scene is presented in a display of the first device, where the interactive scene is tied to the reference point and includes virtual objects. | 08-21-2014 |
20140232653 | PORTABLE ELECTRONIC APPARATUS, TOUCH REGION SETTING METHOD, AND TOUCH REGION SETTING PROGRAM - A supporting-point position detecting unit detects the supporting-point position of a thumb as the thumb moves while a portable electronic apparatus (portable information apparatus) is held, and a finger length acquiring unit acquires the length of the thumb. An application processing unit then sets a touch region (for example, a display region of a push-button icon), in which a touch operation serving as a predetermined input operation is received, within the movable range of the thumb obtained from the supporting-point position and the length of the thumb, inside the region in which the touch sensor is able to detect a touch. | 08-21-2014 |
20140240231 | PROCESSING TRACKING AND RECOGNITION DATA IN GESTURAL RECOGNITION SYSTEMS - Systems and methods are described for detecting an event of a source device, and generating at least one data sequence comprising device event data specifying the event and state information of the event. The device event data and state information are type-specific data having a type corresponding to an application of the source device. A data capsule is formed to include the at least one data sequence. The data capsule has a data structure comprising an application-independent representation of the at least one data sequence. The systems and methods detect poses and motion of an object, translate the poses and motion into a control signal using a gesture notation, and control a computer application using the control signal. The systems and methods automatically detect a gesture of a body, translate the gesture to a gesture signal, and control a component coupled to a computer in response to the gesture signal. | 08-28-2014 |
20140240232 | Automatic Cursor Rotation - A touch device is configured to display a touch activated cursor on a touch screen. The touch device is configured to automatically rotate the cursor depending on movement of the cursor on the touch screen. | 08-28-2014 |
20140247215 | DELAY WARP GAZE INTERACTION - A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. The delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves gaze while continuing the action (e.g., continued holding of the touchpad). | 09-04-2014 |
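The delayed warp described above reduces to a simple timing gate: the indicator only jumps to the gaze target once the touch has been held past a delay, giving the user a window to abort. The delay value and position representation below are illustrative assumptions.

```python
def cursor_target(gaze_pos, cursor_pos, touch_duration, delay=0.2):
    """Where the visual indicator should be, given a held touch.

    While the touch has been held for less than the delay, the
    indicator stays put (the user can still lift off and abort);
    once the delay elapses, it warps to the gaze target. The 200 ms
    delay is a hypothetical value.
    """
    return gaze_pos if touch_duration >= delay else cursor_pos
```

After the warp, continued touch input would fine-tune the indicator with ordinary relative movement, per the abstract.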
20140247216 | TRIGGER AND CONTROL METHOD AND SYSTEM OF HUMAN-COMPUTER INTERACTION OPERATION COMMAND AND LASER EMISSION DEVICE - Disclosed are a trigger and control method and system for a human-computer interaction operation command and an associated laser emission device, the method comprising: utilizing a camera device to shoot a display area output by an image output device; determining the coordinate mapping transformation relationship between the shot display area and the original image output by the image output device; detecting a laser point in the shot display area, and transforming its coordinates into coordinates in the original image according to the relationship; and, when the laser point is identified as transmitting the code signal corresponding to a certain human-computer interaction operation command, triggering the human-computer interaction operation command corresponding to the code signal at the coordinates in the original image transformed from the coordinates of the laser point. The present invention facilitates a user in conducting medium-range and long-range human-computer interaction operations. | 09-04-2014 |
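Mapping the detected laser-point coordinates from the camera image back into the original image, as described above, is typically expressed as a 3x3 homography. The sketch below shows only the mapping step and assumes the homography has already been estimated (for example from the four projected screen corners); that estimation step is not shown.

```python
def apply_homography(H, x, y):
    """Map a camera-image point (x, y) into original-image coordinates.

    H is a 3x3 homography as row-major nested lists. The point is
    treated as homogeneous (x, y, 1), multiplied by H, and then
    de-homogenized by dividing by the resulting w component.
    """
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

With the identity homography the point is unchanged; a pure scaling homography scales both coordinates, as expected.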
20140253442 | ULTRASONIC HYBRID INPUT DEVICE AND CORRESPONDING TUNING METHOD - A hybrid input device is described. The hybrid input device includes a stylus for writing on a touchscreen, a writing instrument and a pressure sensor. The hybrid input device also includes an ultrasonic transmitter that transmits an ultrasonic data pattern. The hybrid input device also includes a control element that toggles ultrasonic functionality of the hybrid input device. | 09-11-2014 |
20140253443 | USING PORTABLE ELECTRONIC DEVICES FOR USER INPUT - Various techniques of using a portable electronic device for user input are disclosed herein. In one embodiment, a method includes acquiring a sensor reading from an inertial measurement unit of the portable electronic device. The sensor reading contains an acceleration of the portable electronic device. The method also includes determining a position change of the portable electronic device based on the acquired sensor reading and transmitting the determined position change to a computer. The position change is usable by the computer to control a cursor position on the computer. | 09-11-2014 |
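Deriving a position change from accelerometer readings, as the entry above describes, amounts to double integration (dead reckoning). This one-axis sketch uses assumed units (m/s^2 and seconds) and ignores the drift correction a real system would need, since integration error grows rapidly in practice.

```python
def integrate_motion(accels, dt):
    """Double-integrate acceleration samples into a position change.

    accels is a list of acceleration samples along one axis at a fixed
    sampling interval dt. Velocity is the running integral of
    acceleration; position is the running integral of velocity.
    """
    velocity, position = 0.0, 0.0
    for a in accels:
        velocity += a * dt
        position += velocity * dt
    return position
```

One second of constant 1 m/s^2 acceleration sampled at 100 Hz integrates to roughly half a meter of displacement, matching the s = at^2/2 expectation.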
20140253444 | MOBILE COMMUNICATION DEVICES AND MAN-MACHINE INTERFACE (MMI) OPERATION METHODS THEREOF - A mobile communication device including a wireless communication module, a local display device, and a processing module is provided. The wireless communication module performs wireless transceiving to and from a display host machine. The local display device is equipped with a first display screen including a first control area and a second control area within the first control area. The processing module detects a first touch event in the first control area and a second touch event for moving the second control area within the first control area, transforms coordinate information of the first and second touch events into a first set and a second set of coordinates on a second display screen of the display host machine, respectively, and presents a touch operation and a cursor operation on the second display screen via the wireless communication module according to the first and second sets of coordinates, respectively. | 09-11-2014 |
20140267031 | SPATIALLY AWARE POINTER FOR MOBILE APPLIANCES - A spatially aware pointer that can augment a pre-existing mobile appliance with remote control, hand gesture detection, and/or 3D spatial depth sensing abilities. In at least one embodiment, a spatially aware pointer can be operatively connected to the data port of a mobile appliance (e.g., mobile phone or tablet computer) to provide remote control, hand gesture detection, and 3D spatial depth sensing abilities to the mobile appliance. Such an enhancement may, for example, allow a user with a mobile appliance to make a 3D spatial model of at least a portion of an environment, or remotely control a TV set with hand gestures, or engage other mobile appliances to create interactive projected images. | 09-18-2014 |
20140267032 | SYSTEMS, METHODS, AND MEDIA FOR PROVIDING AN ENHANCED REMOTE CONTROL HAVING MULTIPLE MODES - Systems, methods, and media for providing a multipurpose remote control are provided. In some implementations, a system for controlling a media device is provided, the system comprising: a hardware processor connected to a touch sensor, a directional input region, and a motion sensor, wherein the hardware processor is configured to: detect that the touch sensor is activated for a predetermined period of time; and upon detecting that the touch sensor has been activated for the predetermined period of time, switching from a first mode to a second mode, wherein: the first mode comprises controlling a highlighted region displayed on the media device in response to an input provided on the directional input region and selecting an item corresponding to the highlighted region in response to depression of the touch sensor; and the second mode comprises controlling a position of a cursor displayed on the media device in response to an output of the motion sensor, selecting an item corresponding to the position of the cursor in response to depression of the touch sensor, and inhibiting the display of the highlighted region. | 09-18-2014 |
20140267033 | Information Technology Device Input Systems And Associated Methods - A method for generating a control signal to control an information technology device includes the following steps: (1) capturing, using an image sensor, a current control image of a light source of a remote controller positioned within a field of view of the image sensor; (2) identifying, within the current control image, a current location of light emitted from the light source; (3) determining movement between (a) the current location of the light emitted from the light source and (b) a previous location of the light emitted from the light source determined from a previously captured image; (4) generating a movement control signal based upon the movement; and (5) sending the movement control signal to the information technology device. The method is executed, for example, by a movement control module of an information technology device input system. | 09-18-2014 |
20140267034 | SYSTEMS AND METHODS FOR DEVICE INTERACTION BASED ON A DETECTED GAZE - Systems and methods are provided that allow a user to interact with a device using gaze detection. In the provided systems and methods, the gaze detection is initiated by detecting a triggering event. Once gaze detection has been initiated, detecting a gaze of a user may allow the user to activate a display component of the device, pass a security challenge on the device, and view content and alerts on the device. The gaze detection may continue looking for the user's gaze and keep the display component of the device activated as long as a gaze is detected, but may deactivate the display component of the device once a gaze is no longer detected. To conserve power the gaze detection may also be deactivated until another triggering event is detected. | 09-18-2014 |
20140267035 | Multimodal User Interface Design - A multimodal user interface is provided. A human machine interface in a vehicle utilizes a plurality of modalities. A cognitive model for secondary driving tasks indicates a best use of one or more particular modalities for performing each secondary driving task. | 09-18-2014 |
20140267036 | Real-Time Dynamic Tracking of Bias - A bias value associated with a sensor, e.g., a time-varying, non-zero value which is output from a sensor when it is motionless, is estimated using at least two, different bias estimating techniques. A resultant combined or selected bias estimate may then be used to compensate the biased output of the sensor in, e.g., a 3D pointing device. | 09-18-2014 |
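The entry above calls for combining bias estimates from at least two different estimators. One plausible combination rule is inverse-variance weighting, shown below; the patent itself does not specify this particular rule, so treat it as an illustrative assumption.

```python
def fuse_bias(est_a, var_a, est_b, var_b):
    """Combine two sensor-bias estimates by inverse-variance weighting.

    Each estimate is weighted by the reciprocal of its variance, so the
    more certain estimator dominates the combined value. With equal
    variances this reduces to a simple average.
    """
    wa, wb = 1.0 / var_a, 1.0 / var_b
    return (wa * est_a + wb * est_b) / (wa + wb)
```

The fused bias would then be subtracted from the raw sensor output to compensate the 3D pointing device's motionless drift.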
20140267037 | INPUT APPARATUS - An input apparatus includes a display element (e.g., an LCD element) and a first sensor. The display element has a display screen of a predetermined size. The first sensor is configured to have a detection area that is smaller than the display screen, and to detect coordinates of a position pointed to in the detection area by a pointing body, such as a finger and a pen. The first sensor is coupled with the display screen such that its detection area, in which a pointing position pointed to by the pointing body is detected, maps to a predetermined display area within the display screen. The input apparatus may further include a second sensor configured to detect a pointing position pointed to by a pointing body in a full display detection area of the display screen, and the input apparatus selectively processes a detection output from the first sensor or the second sensor. | 09-18-2014 |
20140292654 | SYSTEM AND METHOD FOR DETERMINING 3D ORIENTATION OF A POINTING DEVICE - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, gesture recognition subsystem employing a wireless pointing device and pointing analysis subsystem also employing the pointing device, are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs. | 10-02-2014 |
20140300544 | Projector and Electronic Device Having Projector Function - A projector includes a laser beam scanning portion, a light detecting portion, and a control portion performing control of predicting a prediction region where a detection object exists on the basis of detection of the detection object, changing a modulation pattern of a laser beam when the laser beam is scanned on the prediction region, acquiring the position of the detection object on the basis of reflected light of the laser beam emitted in a detection object detection pattern, and setting the position of the detection object to an indication position. | 10-09-2014 |
20140300545 | IMAGE PROJECTION SYSTEM AND A METHOD OF CONTROLLING A PROJECTED POINTER - An image projection system whereby a position of a pointer on a projected image can be easily controlled from a location separated from the main control means. The system comprises a projector, a computer for controlling the projector to display a pointer on a projected image in accordance with an operational signal from a terminal, and a remote controller for the projector. The remote controller comprises a pointing device and a transmitter for transmitting an operation signal as an infrared ray operational signal. The projector comprises a receiver which is an infrared (IR) optical receiver having directivity, and receives the infrared ray operational signal. | 10-09-2014 |
20140300546 | MEDIA SYSTEM WITH OFF SCREEN POINTER CONTROL - A media system providing off screen pointer control is disclosed. A remote control device is adapted to be held and pointed by a user and controls a menu based on both on screen and off screen pointing. The on screen pointing may control a cursor or other pointer. The off screen pointing may provide a variable scrolling function or other control function. | 10-09-2014 |
20140300547 | Indirect 3D Scene Positioning Control - Embodiments of the present invention generally relate to interacting with a virtual scene at a perspective which is independent of the perspective of the user. Methods and systems can include tracking and defining a perspective of the user based on the position and orientation of the user in the physical space, projecting a virtual scene for the user perspective to a virtual plane, tracking and defining a perspective of a freehand user input device based on the position and orientation of the freehand user input device, identifying a mark in the virtual scene which corresponds to the position and orientation of the device in the physical space, creating a virtual segment from the mark, and interacting with virtual objects in the virtual scene at the end point of the virtual segment, as controlled using the device. | 10-09-2014 |
20140306891 | HOLOGRAPHIC OBJECT FEEDBACK - Methods for providing real-time feedback to an end user of a mobile device as they are interacting with or manipulating one or more virtual objects within an augmented reality environment are described. The real-time feedback may comprise visual feedback, audio feedback, and/or haptic feedback. In some embodiments, a mobile device, such as a head-mounted display device (HMD), may determine an object classification associated with a virtual object within an augmented reality environment, detect an object manipulation gesture performed by an end user of the mobile device, detect an interaction with the virtual object based on the object manipulation gesture, determine a magnitude of a virtual force associated with the interaction, and provide real-time feedback to the end user of the mobile device based on the interaction, the magnitude of the virtual force applied to the virtual object, and the object classification associated with the virtual object. | 10-16-2014 |
20140320407 | Motion Sensing Device and Motion Sensing System thereof - A motion sensing device, for a motion sensing system having a light emitting device for generating light of a first frequency range, includes a motion sensing area comprising a sensor array, for sensing light of the first frequency range to generate two-dimensional motion information of a first axis and a second axis; and a distance sensing area, disposed outside the motion sensing area, for sensing light of the first frequency range to generate distance information of a third axis. | 10-30-2014 |
20140320408 | NON-TACTILE INTERFACE SYSTEMS AND METHODS - Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape or a motion of the portion of the hand or other detectable object. | 10-30-2014 |
20140327617 | Method and Locating Device for Locating a Pointing Device - A locating device is for locating a pointing device. The pointing device is configured to capture a plurality of images, to measure an angular rate of the pointing device, and to generate and output current coordinate data associated with an orientation of the pointing device when at least one of the images contains infrared light from a light source module. The locating device is configured to receive the current coordinate data and the angular rate from the pointing device, to generate a cursor signal based on one of the current coordinate data and the angular rate, and to transmit the cursor signal to a computing device, which is configured to move a cursor on a display unit, according to the cursor signal. | 11-06-2014 |
20140333535 | GAZE-ASSISTED COMPUTER INTERFACE - Methods, systems, and computer programs for interfacing a user with a Graphical User Interface (GUI) are provided. One method includes operations for identifying a point of gaze (POG) of a user, and for detecting initiation of an action by the user to move a position of a selector for selecting objects presented on a graphical user interface (GUI). In addition, the method includes an operation for determining the distance between the current position of the selector and the POG. The displacement speed of the selector is adjusted based on the distance between the current position of the selector and the POG, where the displacement speed is reduced as the distance between the current position of the selector and the POG becomes smaller. | 11-13-2014 |
20140340310 | INPUT DEVICE AND FUNCTION SWITCHING METHOD THEREOF - An input device includes a housing; a plurality of keyswitches disposed on the housing; a touch panel disposed on the housing; and a processing unit disposed in the housing and electrically connected to the keyswitches and the touch panel, the processing unit being used for switching coordinate information outputted by the touch panel to a cursor control function or a virtual numeric keypad function. | 11-20-2014 |
20140340311 | CURSOR MODE SWITCHING - Methods and systems are provided for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining an intent of the user to change modes of operation, subsequently interpreting user gestures in accordance with a second mode of operation. | 11-20-2014 |
20140347273 | User Interface Apparatus and Associated Methods - An apparatus including at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: determine a curvature of a deformable flexible user interface area of an electronic device, the deformable flexible user interface area being capable of detecting user input within a hover region; and based on the determined curvature, enable adjusting the hover input region of the user interface area to maintain a substantially flat planar hover input region across the deformable flexible user interface area. | 11-27-2014 |
20140347274 | Low Power Management of Multiple Sensor Integrated Chip Architecture - A method, device, system, or article of manufacture is provided for low-power management of multiple sensor chip architecture. In one embodiment, a method comprises, at a computing device that includes a first processor, a second processor and a third processor, receiving, by the first processor, sensor data from a first sensor; determining, by the first processor, a movement by the computing device using the sensor data; receiving, by the first processor, a modality of the computing device; in response to determining that the modality corresponds to a predetermined state, determining, by the first processor, a modality move distance associated with the predetermined state; determining, by the first processor, a move distance of the computing device using the modality move distance; determining, by the first processor, that the move distance of the computing device is at least a move distance threshold; and, in response to determining that the move distance of the computing device is at least a move distance threshold, reporting, by the first processor, to at least one of the second processor and the third processor, that the move distance of the computing device is at least the move distance threshold. | 11-27-2014 |
20140347275 | METHOD AND APPARATUS FOR EXECUTING APPLICATIONS IN PORTABLE ELECTRONIC DEVICES - Disclosed herein is a method of executing an application in a portable electronic device. A notification is output on the portable electronic device and a movement or a rotation of the portable electronic device is detected. An application associated with the notification is executed in response to the movement or the rotation of the portable electronic device. | 11-27-2014 |
20140347276 | ELECTRONIC APPARATUS INCLUDING TOUCH PANEL, POSITION DESIGNATION METHOD, AND STORAGE MEDIUM - An electronic apparatus including a display section with a touch panel displays a pointer for position designation on a screen of the display section while the operation mode is switched to a predetermined operation mode, causes an entire region of the touch panel to function as a virtually transparent touchpad, and executes an operation process including controlling the display position of the pointer on the display section in accordance with an input operation on the touch panel. For this reason, nothing needs to be displayed on the screen of the display section to indicate the virtual touchpad, and there is no possibility that the operable screen region is decreased or that some of the icons and the like disposed on the screen are hidden. | 11-27-2014 |
20140347277 | CONTROLLER USER INTERFACE FOR A CATHETER LAB INTRAVASCULAR ULTRASOUND SYSTEM - A touchpad controller for a componentized intravascular ultrasound system is disclosed for acquisition and display of intravascular information in a catheter lab environment. The system includes a patient interface module (PIM) adapted to hold a catheter having an imaging probe located near a distal end, a control panel, a monitor for displaying images and patient data, and a processing unit. The touchpad controller is designed for use beneath a sterile drape and is sensitive to gloved touch. Furthermore, the touchpad controller is sized for handheld use during an imaging session. A rail mount facilitates easy attachment of the touchpad controller alongside a patient table. | 11-27-2014 |
20140354545 | OPTICAL OBJECT RECOGNITION SYSTEM - An optical object recognition system includes at least two beacons, an image sensor and a processing unit. The beacons operate in an emission pattern, and the emission patterns of the beacons are phase-shifted from one another. The image sensor captures image frames with a sampling period. The processing unit is configured to recognize different beacons according to the phase shift of the emission pattern in the image frames. | 12-04-2014 |
20140354546 | INTERACTIVE PROJECTION SYSTEM AND INTERACTIVE IMAGE-DETECTING METHOD - An interactive projection system includes an electronic device, a projection device and an interactive module. The projection device is connected with the electronic device for receiving a first image signal generated by the electronic device and accordingly projecting a first image. The interactive module includes a processing unit, a storage unit connected with the processing unit for storing calibration data, an image capture unit connected with the processing unit for capturing the first image and a light point image, and a communication unit connected with the electronic device and the processing unit for transmitting absolute coordinate information, computed and generated by the processing unit according to the light point image and the calibration data, to the electronic device. An output signal is generated by the electronic device with the absolute coordinate information. Therefore, the present invention avoids repeated image calibration and reduces labor and time costs. | 12-04-2014 |
20140354547 | EASILY DEPLOYABLE INTERACTIVE DIRECT-POINTING SYSTEM AND PRESENTATION CONTROL SYSTEM AND CALIBRATION METHOD THEREFOR - A method for controlling movement of a computer display cursor based on a point-of-aim of a pointing device within an interaction region includes projecting an image of a computer display to create the interaction region. At least one calibration point having a predetermined relation to said interaction region is established. A pointing line is directed to substantially pass through the calibration point while measuring a position of and an orientation of the pointing device. The pointing line has a predetermined relationship to said pointing device. Movement of the cursor is controlled within the interaction region using measurements of the position of and the orientation of the pointing device. | 12-04-2014 |
20140368432 | WEARABLE SMART GLASSES AS WELL AS DEVICE AND METHOD FOR CONTROLLING THE SAME - A method for controlling wearable smart glasses is provided, wherein the method comprises the following steps: a gaze point on the operating system interface of the wearable smart glasses, on which the user's eyes focus, is determined and tracked. A control command provided by the user through a touch switch module of the wearable smart glasses is received. A corresponding process is then performed on the gaze point according to the control command. | 12-18-2014 |
20140375562 | System and Process for Human-Computer Interaction Using a Ballistic Projectile as an Input Indicator - A process for interacting with computer software using a physical ballistic projectile, with the steps of: the computer sends a target image to a projector; the user shoots a projectile at a light-blocking shot screen; a video imaging device sends video frames to the computer for processing by hit detection software; the hit detection software compares video frames, identifies a difference as a hit, stores the x,y location of the hit, applies a mirror transform to the x,y coordinates, and adjusts the coordinates based on pre-calibrated skewing angles of the video frames; the hit detection software verifies that the hit is not moving, ruling out light flicker, debris, or the projectile itself; and the computer executes a user input event, such as a mouse click, at the adjusted x,y coordinates. | 12-25-2014 |
20150009140 | NAVIGATION DEVICE INCLUDING THERMAL SENSOR - A navigation device includes a thermal sensor and a processing unit. The thermal sensor is configured to output temperature readings of a work surface. The processing unit is configured to identify a lift event according to the temperature readings. | 01-08-2015 |
20150009141 | POINTER POSITIONING METHOD OF HANDHELD POINTER DEVICE - A pointer positioning method of a handheld pointer device, which includes: capturing a first image frame containing a reference point to compute a first pointing coordinate according to the image position of the reference point in the first image frame; generating a cursor parameter of a cursor according to the first pointing coordinate; when the handheld pointer device enters a pointer-lock mode, recording the first pointing coordinate and positioning the cursor at the first pointing coordinate on a display apparatus; when the handheld pointer device exits the pointer-lock mode, capturing a second image frame to compute a second pointing coordinate according to the image position of the reference point in the second image frame to obtain a displacement vector between the first and the second pointing coordinates; and generating the cursor parameter and controlling the movement of the cursor according to the displacement vector and the first pointing coordinate. | 01-08-2015 |
20150009142 | INFORMATION PROCESSING DEVICE, METHOD FOR OPERATING INFORMATION PROCESSING DEVICE, AND INFORMATION PROCESSING SYSTEM - An STB that receives (i) a selection button operation corresponding to a press of one of selection buttons of a remote control and (ii) a cursor operation corresponding to a tilt of a case of the remote control, the STB including: a display control unit that displays a pointer on a display screen according to the cursor operation and displays, if the pointer is on an object displayed on the display screen, a plurality of input items assigned to the object; and an input item receiving unit that receives one of the input items displayed by the display control unit, according to the selection button operation. | 01-08-2015 |
20150009143 | OPERATING SYSTEM - An operating system includes a coordinate information generating unit configured to, when a touch on a touch panel is detected, deem that a virtual hovering operation that is performed virtually in air above the touch panel surface was performed and shift to a hovering operation mode that generates two-dimensional coordinate information indicating the touched position and height position information having a positive value, and then, when a particular operation is received during the virtual hovering operation, deem that a virtual touch operation was performed and generate two-dimensional coordinates indicating the touched position and height position information with a value of zero, a display control unit configured and programmed to display on a display unit a specified hovering cursor at the display position that corresponds to the two-dimensional position of the virtual hovering operation so as to be superimposed on key images, and a function information output unit configured to output, when the virtual touch operation is performed, function information assigned to the corresponding key. | 01-08-2015 |
20150009144 | METHOD AND APPARATUS FOR ACTIVATING ELECTRONIC DEVICES WITH GESTURES - An electronic device with gesture activation includes a body having at least one infrared (IR) transmissive window, an IR gesture detection sensor aligned with the transmissive window, a processor coupled to the gesture detection sensor and digital memory coupled to the processor. The digital memory includes code segments executable on the processor for starting a timer if a first gesture of an activation gesture sequence including an ordered plurality of gestures is received while at least one process of the electronic device is in an inactive mode, and for activating the at least one process of the electronic device if the remainder of the ordered plurality of gestures is received before the timer has elapsed. | 01-08-2015 |
20150015488 | Isolating Mobile Device Electrode - In one embodiment, a method includes receiving real-time sensor data from multiple sensors located on multiple surfaces of a computing device; detecting a transition in the real-time sensor data from a steady state; and determining based on the detection an imminent use of the computing device. | 01-15-2015 |
20150015489 | SYSTEM AND METHOD FOR DIGITAL RECORDING OF HANDPAINTED, HANDDRAWN AND HANDWRITTEN INFORMATION - A system and method of digitally recording painted, drawn and written information and navigating a cursor on the display by freely moving at least one part of a painter's body has the steps of: providing a computing device with a display serving as a digital electronic canvas; providing an input device comprising: an interchangeable end-point; a single MEMS sensor of mechanical parameters integrated on a semiconductor substrate chip, which is included in the interchangeable end-point; providing any working surface suitable for moving the input device relative to the working surface in a process of painting, drawing, writing or cursor navigating; moving the input device with at least one part of a painter's body such that the interchangeable end-point is interacting with the working surface while recording the change of vectors of mechanical parameters applied to the sensor; digitizing this information and processing the data related to the change of the vectors of mechanical parameters; and providing a description in digital format of how the input device has been moved over and how it has been pressed to the working surface based on the change of the corresponding vectors of mechanical parameters. | 01-15-2015 |
20150022447 | NON-LINEAR MOTION CAPTURE USING FRENET-SERRET FRAMES - Implementations of the technology disclosed convert captured motion from Cartesian/(x,y,z) space to Frenet-Serret frame space, apply one or more filters to the motion in Frenet-Serret space, and output data (for display or control) in a desired coordinate space—e.g., in a Cartesian/(x,y,z) reference frame. The output data can better represent a user's actual motion or intended motion. | 01-22-2015 |
20150029099 | METHOD FOR CONTROLLING TOUCH AND MOTION SENSING POINTING DEVICE - A method for controlling a touch and motion sensing pointing device is disclosed. In the method, a touch on the pointing device is sensed through a touch sensing module. Whether or not a touch size of the touch is smaller than a predetermined size is determined when the touch is sensed. An output of coordinate signals of the pointing device is locked when the touch size of the touch is smaller than the predetermined size. Whether or not a persisting time of the touch is larger than a predetermined time is measured when the output of coordinate signals is locked. The output of coordinate signals is unlocked when the persisting time of the touch is greater than the predetermined time. A mouse-click signal is transmitted when the persisting time of the touch is smaller than or equal to the predetermined time. | 01-29-2015 |
20150029100 | 2D AND 3D POINTING DEVICE BASED ON A PASSIVE LIGHTS DETECTION OPERATION METHOD USING ONE CAMERA - Systems for surface-free pointing and/or command input include a computing device operably linked to an imaging device. The imaging device can be any suitable video recording device including a conventional webcam. At least one pointing/input device is provided including first, second, and third sets of actuable light sources, wherein at least the first and second sets emit differently colored light. The imaging device captures one or more sequential image frames each including a view of a scene including the activated light sources. One or more computer program products calculate a two-dimensional or three-dimensional position and/or a motion and/or an orientation of the pointing/input device in the captured image frames by identifying a two-dimensional or three-dimensional position of the activated light sources of the first, second, and/or third sets of light sources. Certain activation patterns of light sources are mapped to particular pointing and/or input commands. | 01-29-2015 |
20150035750 | ERGONOMIC PHYSICAL INTERACTION ZONE CURSOR MAPPING - Users move their hands in a three dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI. | 02-05-2015 |
20150035751 | INTERFACE APPARATUS USING MOTION RECOGNITION, AND METHOD FOR CONTROLLING SAME - Provided are a 3-dimensional interface device using motion recognition and a control method thereof, including a method of controlling an operation range and a movement speed of the interface device. The interface device includes a display unit, a control terminal generating an output signal output to the display unit and controlling the drive of the display unit, and an interface unit connected to the control terminal, receiving a user's command from a user, and transmitting the user's command to the control terminal. The interface unit includes a detecting unit detecting a user's motion, and a control unit controlling the drive of the interface unit. | 02-05-2015 |
20150035752 | IMAGE PROCESSING APPARATUS AND METHOD, AND PROGRAM THEREFOR - An image processing apparatus includes an extracting unit for extracting a feature point from a captured image; a recognizing unit for recognizing a position of the feature point; a display unit for displaying, based on the position of the feature point, a feature-point pointer indicating the feature point and a mirrored image of the captured image in a translucent manner; and an issuing unit for issuing, based on the position of the feature point, a command corresponding to the position of the feature point or a motion of the feature point. | 02-05-2015 |
20150042564 | PROJECTOR - A technology reducing the number of components in a projector. Illuminating light from a lamp is made incident from a first surface of a TIR prism via an optical system, reflected, output from a second surface of the TIR prism, optically modulated by a DMD, and made incident to the second surface of the TIR prism. Projection light from the DMD is transmitted through the TIR prism, emitted from a third surface of the TIR prism, and enlarged by a projection lens, to form an image on a screen. Light from a light emitting element is reflected by the screen, made incident to the third surface of the TIR prism via the projection lens, reflected, and output from a fourth surface of the TIR prism, to form an image on an imaging element. | 02-12-2015 |
20150049019 | CONTROL DEVICE, INPUT DEVICE, CONTROL SYSTEM, HANDHELD DEVICE, AND CONTROL METHOD - A control device includes: a receiver for receiving first information regarding the movement of a casing, and second information regarding whether to reflect the first information on the movement of coordinate values; a storage unit for storing a whole-screen region including a real-screen region, and a virtual-screen region set around the real-screen region; a generator for generating the coordinate values within the whole-screen region based on the first information; a switcher for switching a first state in which the coordinate values are movable, and a second state in which the coordinate values are immovable, based on the second information; a determining unit for determining which of the real-screen region or the virtual-screen region the coordinate values belong to; and a coordinate-value control unit for controlling the coordinate values so as to move the coordinate values within the virtual-screen region to the position of predetermined coordinate values within the real-screen region. | 02-19-2015 |
20150054741 | DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND PROGRAM - There is provided a display control device including an acquisition unit configured to acquire contact information of contact of an operation body on an operation surface, a display control unit that has a first function of moving a cursor displayed on a display device in accordance with a change amount of a contact position indicated by the contact information acquired by the acquisition unit and a second function of displaying the cursor at a position, on a given display area of the display device, corresponding to the contact position, and a switch unit configured to switch a function to be exerted by the display control unit between the first function and the second function. The switch unit exerts the second function when the acquisition unit has acquired first contact information of contact of the operation body on a given operation area of the operation surface. | 02-26-2015 |
20150054742 | INFORMATION PROCESSING METHOD AND INFORMATION PROCESSING APPARATUS - An information processing method is performed by an information processing apparatus, for executing predetermined processing in relation to an object displayed on a screen. The information processing method includes: obtaining position information indicating positions detected for two or more parts of an operating body; calculating a control amount on a basis of the positions of the two or more parts, the positions being indicated in the position information; and controlling including displaying, on the screen, a control symbol including a first symbol which represents the control amount calculated in the calculating and a second symbol which represents a threshold to be a criterion for determining whether or not the control amount satisfies a predetermined condition, and executing the predetermined processing in a case where the control amount is determined to satisfy the predetermined condition. | 02-26-2015 |
20150054743 | DISPLAY METHOD THROUGH A HEAD MOUNTED DEVICE - In one embodiment, a method is proposed for validating a control operation to be performed by a target device through a virtual control interface, the virtual control interface being displayed by a head-mounted device of a user. The method is remarkable in that it comprises:
| 02-26-2015 |
20150054744 | Document Mode Processing For Portable Reading Machine Enabling Document Navigation - Controlling a reading machine while reading a document to a user by receiving an image of a document, accessing a knowledge base that provides data that identifies sections in the document and processing user commands to select a section of the document. The reading machine applies text-to-speech to a text file that corresponds to the selected section of the document, to read the selected section of the document aloud to the user. | 02-26-2015 |
20150054745 | HANDHELD POINTER DEVICE AND POINTER POSITIONING METHOD THEREOF - A pointer positioning method for a handheld pointer device includes: capturing a first frame containing a reference point when the handheld pointer device updates a first tilt angle presently used to a second tilt angle; computing a first pointing coordinate according to the image position of the reference point in the first frame and the first tilt angle; computing a second pointing coordinate according to the image position of the reference point in the first frame and the second tilt angle; capturing a second frame containing the reference point to compute a third pointing coordinate according to the image position of the reference point in the second frame and the second tilt angle; generating a cursor parameter for controlling a display position of a cursor on a display apparatus according to the first pointing coordinate, the second pointing coordinate, and the third pointing coordinate. | 02-26-2015 |
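The three-coordinate step in the abstract above can be sketched as an offset compensation: the two coordinates computed from the shared frame measure the jump introduced by the tilt-angle update, and that jump is subtracted from subsequent new-tilt coordinates so the cursor does not leap when the tilt estimate changes. The planar rotation model and the compensation rule are illustrative assumptions, not taken from the patent.

```python
import math

# Sketch: a pointing coordinate is the reference point's image position
# rotated by the device tilt; the cursor parameter offsets the new-tilt
# coordinate by the jump observed on the shared frame. Both the rotation
# model and the compensation rule are assumptions for illustration.

def pointing_coord(image_pos, tilt_deg):
    # Rotate the sensed image position by the device tilt angle.
    t = math.radians(tilt_deg)
    x, y = image_pos
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t))

def cursor_param(p1, p2, p3):
    # Compensate the tilt-update jump: shift p3 by the difference the
    # tilt change produced on the shared first frame (p1 vs p2).
    return (p3[0] + (p1[0] - p2[0]), p3[1] + (p1[1] - p2[1]))

frame1_pos, frame2_pos = (100.0, 0.0), (102.0, 0.0)
p1 = pointing_coord(frame1_pos, tilt_deg=0.0)    # old tilt, first frame
p2 = pointing_coord(frame1_pos, tilt_deg=90.0)   # new tilt, first frame
p3 = pointing_coord(frame2_pos, tilt_deg=90.0)   # new tilt, second frame
cursor = cursor_param(p1, p2, p3)
```

With these sample values the cursor stays near the pre-update position (about x = 100) instead of jumping to the rotated coordinate, which is the practical point of computing two coordinates from the same frame.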
20150062011 | REMOTE CONTROL APPARATUS AND METHOD OF AUDIO VIDEO NAVIGATION SYSTEM - Provided are a remote control apparatus and method that remotely control an operation of an AVN system through a virtual touch panel generated by using a stylus pen and a sensor equipped in the AVN system. Therefore, the AVN system is remotely controlled by using the virtual touch panel generated in a vehicle, thus enhancing a user's convenience. | 03-05-2015 |
20150062012 | DISPLAY CONTROL APPARATUS, DISPLAY CONTROL METHOD, DISPLAY CONTROL SIGNAL GENERATING APPARATUS, DISPLAY CONTROL SIGNAL GENERATING METHOD, PROGRAM, AND DISPLAY CONTROL SYSTEM - There is provided a display control apparatus including a display control unit configured to move an object displayed on a display unit in accordance with a change amount of an indicated direction of a direction indicator. The display control unit changes a movement amount of the object corresponding to the change amount on the basis of a rotational angle with the indicated direction used as an axis. | 03-05-2015 |
20150062013 | Rolling Shutter Synchronization of a Pointing Device in an Interactive Display System - An interactive display system including a wireless pointing device, and positioning circuitry capable of determining absolute and relative positions of the display at which the pointing device is aimed. The pointing device captures images displayed by the computer, using a rolling shutter, the images including one or more human-imperceptible positioning targets. The positioning targets are presented as patterned modulation of the intensity (e.g., variation in pixel intensity) in a display frame of the visual payload, followed by the opposite modulation in a successive frame. At least two captured image frames are subtracted from one another to recover the positioning target in the captured visual data and to remove the displayed image payload. The capturing of images at the pointing device is synchronized with the release of image data to the display, to avoid errors in the positioning operation. | 03-05-2015 |
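The target-recovery step described above (opposite intensity modulation in successive frames, then frame subtraction) can be sketched as follows; the one-dimensional payload, target mask, and modulation depth are illustrative assumptions.

```python
# Sketch of recovering a human-imperceptible positioning target by
# subtracting two successive frames carrying opposite modulation.
# The payload values, mask, and depth are illustrative assumptions.

def modulate(payload, target_mask, depth, sign):
    # Add (or subtract) a small intensity offset wherever the target is.
    return [p + sign * depth * t for p, t in zip(payload, target_mask)]

def recover_target(frame_a, frame_b):
    # Subtracting the frames cancels the payload and doubles the target.
    return [a - b for a, b in zip(frame_a, frame_b)]

payload = [120, 64, 200, 33, 90]        # visual payload (1-D toy "image")
target  = [0, 1, 1, 0, 0]               # positioning-target mask
frame1 = modulate(payload, target, depth=4, sign=+1)
frame2 = modulate(payload, target, depth=4, sign=-1)

diff = recover_target(frame1, frame2)   # payload cancels: [0, 8, 8, 0, 0]
```

The subtraction only cancels cleanly if both captured frames sample the same modulation phase, which is why the abstract emphasizes synchronizing the rolling-shutter capture with the display's frame release.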
20150077340 | METHOD, SYSTEM AND COMPUTER PROGRAM PRODUCT FOR REAL-TIME TOUCHLESS INTERACTION - The real-time touchless interaction method of the present invention comprises the following steps: characteristic information of an object is acquired; the characteristic information is recognized and used to generate a 3D icon which corresponds to the object; the characteristic information of the object continues to be acquired, and the 3D icon is reconstructed in real-time based on how the object is manipulated; at the same time, the 3D icon is utilized as a pointer to interact with the smart device without requiring any physical manipulation of the touch screen. A system and a computer program product for real-time touchless interaction are disclosed herein. The present invention enables users to interact with a smart device without physically touching the screen, thereby generating a plethora of new interactive applications and greatly increasing the versatility of available operations between the user and the smart device. | 03-19-2015 |
20150084864 | Input Method - Methods and systems for authenticating a user using eye tracking information are described. A wearable computing system may include a head mounted display (HMD). The wearable computing system may be operable to be in a locked mode of operation after a period of inactivity by a user. Locked mode of operation may include a locked screen and reduced functionality of the wearable computing system. The user may be authenticated to be able to use the wearable computing system after the period of inactivity. The wearable computing system may generate a display of a random content on the HMD including a content personalized to the user. The wearable computing system may receive information associated with a gaze location of an eye of the user and determine that the gaze location substantially matches a predetermined location of the content personalized to the user on the HMD and authenticate the user. | 03-26-2015 |
20150084865 | Input Device Backlighting - Input device backlighting techniques are described. In one or more implementations, an input device includes a light guide configured to transmit light, a sensor assembly having a plurality of sensors that are configured to detect proximity of an object as a corresponding one or more inputs, a connection portion configured to form a communicative coupling to a computing device to communicate the one or more inputs received by the sensor assembly to the computing device, and an outer layer. The outer layer has a plurality of indications of inputs formed using openings in the outer layer such that light from the light guide is configured to pass through the openings to function as a backlight. The outer layer also has a plurality of sub-layers arranged to have increasing levels of resistance to transmission of the light from the light guide, one to another. | 03-26-2015 |
20150084866 | VIRTUAL HAND BASED ON COMBINED DATA - An example computing system may include a display and an input device. The input device may include a touch sensor to provide touch data, and a contactless sensor to provide contactless data. A field of view of the contactless sensor is directed away from the touch sensor. A controller is to combine the touch data and the contactless data into combined data to generate a virtual hand to be displayed on the display. The virtual hand is to include an unsensed feature. | 03-26-2015 |
20150091799 | PROJECTOR WITH REMOVABLE MODULE - A projector ( | 04-02-2015 |
20150091800 | 3D POINTING DEVICES WITH ORIENTATION COMPENSATION AND IMPROVED USABILITY - Systems and methods according to the present invention describe 3D pointing devices which enhance usability by transforming sensed motion data from a first frame of reference (e.g., the body of the 3D pointing device) into a second frame of reference (e.g., a user's frame of reference). One exemplary embodiment of the present invention removes effects associated with a tilt orientation in which the 3D pointing device is held by a user. | 04-02-2015 |
20150097772 | Gaze Signal Based on Physical Characteristics of the Eye - A computing device may receive an eye-tracking signal or gaze signal from an eye-tracking device. The gaze signal may include information indicative of observed movement of an eye. The computing device may make a determination that movement of the eye derived from analyzing the received gaze signal violates a set of rules for eye movement, where the set of rules may be based on an analytical model of eye movement. In response to making the determination, the computing device may provide an indication that the received gaze signal contains unreliable eye-movement information for at least one computer-implemented application that uses measured eye movement as an input. | 04-09-2015 |
20150097773 | METHOD FOR ACTIVATING AN APPLICATION AND SYSTEM THEREOF - The disclosure is related to a method for activating an application. The method involves detecting a position status by using at least one sensor, determining whether the position status satisfies a predetermined condition, and executing an application corresponding to the position status when the position status satisfies a predetermined condition. The disclosure is also related to a system for activating an application. The system includes at least one sensor and a processor communicates with the at least one sensor. The sensor detects a position status of an electrical device and generates a detecting signal corresponding to the position status. The processor determines the position status of the electrical device in response to the detecting signal outputted from the sensor, and determines whether the position status satisfies a predetermined condition, and executing an application corresponding to the position status when the position status satisfies a predetermined condition. | 04-09-2015 |
20150097774 | OPERATION METHOD, CONTROL APPARATUS, AND PROGRAM - A control apparatus comprising a processor, a memory, and a communication circuit configured to communicate with an input apparatus is provided. The memory stores instructions which, when executed by the processor, cause the processor to receive displacement information from the input apparatus, and at least one of: (i) generate a displacement value for displacing an operation target on a display based on the displacement information, wherein a first set of instructions is used to calculate the displacement value if the displacement information is within a predetermined range, and a second set of instructions is used to calculate the displacement value if the displacement information is outside the predetermined range; and (ii) transmit a feedback signal to the input apparatus at a timing based on the displacement information, wherein the timing is calculated differently if the displacement information is within the predetermined range than if it is outside the predetermined range. | 04-09-2015 |
20150097775 | METHOD AND APPARATUS FOR DETERMINING THE POSE OF A LIGHT SOURCE USING AN OPTICAL SENSING ARRAY - A pose determination system includes a light source configured to emit a pattern of light corresponding to three or more non-collinear points, and a display panel including a plurality of optical sensors in a display area. The optical sensors are configured to detect the light pattern. The pose determination system is configured to determine a position of the light source with respect to the display panel utilizing the detected light pattern. A method for determining a pose of a light source with respect to a display panel includes emitting light from the light source and having a pattern corresponding to three or more non-collinear points, detecting the light pattern utilizing a plurality of optical sensors in a display area of the display panel, and determining by a processor a position of the light source with respect to the display panel utilizing the detected light pattern. | 04-09-2015 |
20150097776 | CONTROL USING MOVEMENTS - A movement of an object is recognised as a predetermined movement, by transmitting signals between transmitter-receiver pairs, which are reflected from the object. A first event is recorded for one of the transmitter-receiver pairs if a reflected signal meets a predetermined proximity criterion, and a second event is recorded for a second transmitter-receiver pair if, after the first event, a subsequent reflected signal meets a predetermined proximity criterion. The first and second events are used to identify the movement. | 04-09-2015 |
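The two-event scheme described above can be sketched as a small state machine over time-ordered reflection readings: a first event is recorded when one transmitter-receiver pair's reflection meets the proximity criterion, and a later event on a different pair identifies the movement. The proximity threshold and the sample signal values are illustrative assumptions.

```python
# Sketch: identify a movement direction from two transmitter-receiver
# pairs. A "first event" fires when a pair's reflected signal meets the
# proximity criterion; a subsequent event on a different pair identifies
# a movement from the first pair toward the second. Threshold and sample
# values are illustrative assumptions.

def identify_movement(samples, threshold=0.5):
    first_pair = None
    for pair_id, strength in samples:          # time-ordered readings
        if strength >= threshold:
            if first_pair is None:
                first_pair = pair_id           # record the first event
            elif pair_id != first_pair:
                return (first_pair, pair_id)   # second event: movement
    return None                                # no movement identified

# A hand passes over pair 0, then over pair 1:
readings = [(0, 0.2), (0, 0.8), (1, 0.3), (1, 0.9)]
motion = identify_movement(readings)           # → (0, 1)
```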
20150097777 | 3D Motion Interface Systems and Methods - A 3D interface system for moving at least one digital displayed object based on movement of at least one physical object. The 3D interface system comprises a display system for displaying 3D images, a sensor input system, and a computing system. The sensor input system generates sensor data associated with at least one physical control object. The computing system receives the sensor data and causes the display system to display the at least one digital displayed object and the at least one digital sensed object associated with the at least one physical object. The computing system moves the at least one digital displayed object based on movement of the at least one physical object. | 04-09-2015 |
20150103003 | USER INTERFACE PROGRAMMATIC SCALING - Embodiments that relate to scaling a visual element displayed via a display device are disclosed. In one embodiment a method includes receiving and using gaze tracking data to determine gaze locations at which a user is gazing on the display device. Depth tracking data is received and used to determine that a user's pointer is at a predetermined location. In response, a locked gaze location on the screen is locked, where the locked gaze location includes at least a portion of the visual element. In response to locking the locked gaze location, the visual element is programmatically scaled by a predetermined amount to an enlarged size. A user input selecting the visual element is then received. | 04-16-2015 |
20150103004 | VELOCITY FIELD INTERACTION FOR FREE SPACE GESTURE INTERFACE AND CONTROL - The technology disclosed relates to automatically interpreting a gesture of a control object in a three dimensional sensor space by sensing a movement of the control object in the three dimensional sensor space, sensing orientation of the control object, defining a control plane tangential to a surface of the control object and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to the control plane. | 04-16-2015 |
20150103005 | POINTING DEVICE USING CAMERA AND OUTPUTTING MARK - A pointing device such as a mouse or joystick comprises a camera for capturing the display screen and image processing means for recognizing and tracking the pointing cursor icon or mark from the captured image and producing the pointing signal. The pointing device of the present invention can be used with any type of display without any additional tracking means such as an ultrasonic sensor, infrared sensor, or touch sensor. The pointing device of the present invention includes a mark outputting portion, a camera portion for capturing the said mark outputting portion, and an image processing portion for recognizing the said mark outputting portion from the captured image and producing the pointing signal. | 04-16-2015 |
20150109206 | REMOTE INTERACTION SYSTEM AND CONTROL THEREOF - A system, method, and process for interactions between a user and a display are described. | 04-23-2015 |
20150109207 | Keyboard and Mouse of Handheld Digital Device - A keyboard and mouse of a handheld digital device comprises a keyboard and a simulated mouse. The keyboard comprises a plurality of side-keys and a main keyboard. The side-keys are located on the side(s) of the device, mainly used for controlling the inputs of the keys of the main keyboard, and operated by one hand. The main keyboard is mainly used for inputting characters and commands, and operated by another hand. The simulated mouse is simulated by a screen-touching finger or a touch pen. The left and right buttons of the simulated mouse are simulated respectively by the fingers clicking on the screen at the left and right of the simulating finger. The middle button of the simulated mouse is simulated by two fingers touching and sliding on the screen side by side or by one finger touching and sliding together with another touching and staying on the screen. And some of the side-keys can be reused as the left, right and middle simulated mouse buttons. A device with a touch pen can also simulate a mouse by cooperating with side keys and/or fingers. | 04-23-2015 |
20150116218 | MOBILE TERMINAL AND CONTROLLING METHOD THEREOF - A mobile terminal including a wireless communication unit configured to provide wireless communication; a touchscreen configured to receive a touch input; and a controller configured to partition the touchscreen into a first display region and a second display region, when the touchscreen is in a locked state and is touched with a first pointer and a second pointer, reduce sizes of the first and second display regions when the first and second pointers are dragged in opposite directions, and unlock the touchscreen or display a password input object for unlocking the touchscreen based on whether a password for unlocking the touchscreen is set, when a moving distance of at least one of the first and second pointers is greater than a preset moving distance or a size of at least one of the first and second display regions is smaller than a preset size. | 04-30-2015 |
20150123901 | GESTURE DISAMBIGUATION USING ORIENTATION INFORMATION - Embodiments are disclosed that relate to controlling a computing device based upon gesture input. In one embodiment, orientation information of the human subject is received, wherein the orientation information includes information regarding an orientation of a first body part and an orientation of a second body part. A gesture performed by the first body part is identified based on the orientation information, and an orientation of the second body part is identified based on the orientation information. A mapping of the gesture to an action performed by the computing device is determined based on the orientation of the second body part. | 05-07-2015 |
20150123902 | METHOD AND APPARATUS FOR SYNCHRONIZING VIRTUAL AND PHYSICAL MOUSE POINTERS ON REMOTE KVM SYSTEMS - A method and system is disclosed for synchronizing the virtual and physical mouse cursors of a local computer and a remotely controlled computer. Video signals generated by a host computer are transmitted to a client computer in order to allow the user of a client computer to have a virtual presence on the host computer. However, the signals transmitted by the host computer may contain errors that can cause a physical mouse to lose synchronization with a virtual mouse. Therefore this virtual presence architecture uses USB protocol and human interface descriptors that support the movement of a mouse to an absolute position in order to synchronize a virtual mouse cursor with a physical mouse cursor. | 05-07-2015 |
20150123903 | APPARATUS AND METHOD FOR RECOGNIZING MOTION - Provided is an apparatus and method of recognizing a motion that is capable of performing a pointing function and a character input function using motions sensed by an optical sensor and an inertial sensor. The apparatus includes an inertial sensor sensing a first motion by using at least one of acceleration and angular velocity that are generated by an input motion; an optical sensor sensing a second motion by using reflection of light due to the motion; a locus calculating unit calculating the locus of the motion on the basis of the locus of the first motion and the locus of the second motion; and a communication unit transmitting the calculated locus of the motion. | 05-07-2015 |
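The abstract above does not specify how the locus calculating unit combines the two loci; a minimal sketch, assuming a fixed-weight blend of per-sample positions from the inertial and optical sensors, might look like this.

```python
# Sketch: blend per-sample positions of an inertial-sensor locus and an
# optical-sensor locus into one motion locus. The fixed 0.5/0.5 weighting
# and the sample data are illustrative assumptions; the patent does not
# specify the fusion rule.

def fuse_loci(inertial, optical, w_inertial=0.5):
    w_optical = 1.0 - w_inertial
    return [(w_inertial * ix + w_optical * ox,
             w_inertial * iy + w_optical * oy)
            for (ix, iy), (ox, oy) in zip(inertial, optical)]

inertial_locus = [(0.0, 0.0), (2.0, 0.0), (4.0, 2.0)]
optical_locus  = [(0.0, 0.0), (1.0, 1.0), (3.0, 3.0)]
fused = fuse_loci(inertial_locus, optical_locus)
# → [(0.0, 0.0), (1.5, 0.5), (3.5, 2.5)]
```

In practice the weight would favor the optical locus for fine, slow motion and the inertial locus for fast motion, since that is where each sensor is most reliable.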
20150130716 | AUDIO-VISUAL INTERACTION WITH USER DEVICES - A user device is enabled by an audio-visual assistant for audio-visual interaction with a user. The audio-visual assistant enables the user device to track the user's eyes and face to determine objects on the screen that the user is currently observing. Various tasks can be executed on the objects based on further input provided by the user. The user can provide further inputs via facial gestures, voice or combinations thereof for executing the various tasks. | 05-14-2015 |
20150130717 | DISPLAY APPARATUS, DISPLAY SYSTEM, AND CONTROL METHOD - A display apparatus includes a display section that displays a displayed image on a display surface based on an original image, an imaging section that captures the displayed image displayed on the display surface by the display section, a generation section that generates a difference image representing a difference between a captured image produced by the imaging section that captures the displayed image and the original image, and an instruction section that instructs the display apparatus to take a predetermined action based on a temporal change in the difference image generated by the generation section. | 05-14-2015 |
20150130718 | INFORMATION PROCESSOR - An information processor includes a means for displaying a plurality of item buttons in a display region, the item buttons each having center coordinates and outlines, a means for determining coordinates of an indicated position in the display area based on an input signal, a means for calibrating the center coordinates of each of the plurality of item buttons to coordinates between the center coordinates and the outlines of each of the item buttons to obtain the calibrated center coordinates, and a means for determining one of the item buttons to be in a selected state such that the calibrated center coordinates of the determined one is closest to the coordinates of the indicated position. | 05-14-2015 |
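The calibration-and-selection step described above can be sketched as pulling each button's center a fixed fraction of the way toward its outline in the direction of the indicated position, then choosing the button whose calibrated center is nearest. The pull fraction, button half-size, and two-button layout are illustrative assumptions.

```python
import math

# Sketch: calibrate each item button's center toward its outline in the
# direction of the indicated position, then select the button with the
# nearest calibrated center. Fraction, half-size, and layout are assumed.

def calibrated_center(center, half_size, toward, fraction=0.5):
    # Move from the center toward the outline, in the direction of the
    # indicated position, by a fraction of the button's half-size.
    dx, dy = toward[0] - center[0], toward[1] - center[1]
    dist = math.hypot(dx, dy) or 1.0   # guard against a zero vector
    return (center[0] + fraction * half_size * dx / dist,
            center[1] + fraction * half_size * dy / dist)

def select_button(buttons, indicated, half_size=20.0):
    # buttons: {name: center}; returns the name whose calibrated center
    # is closest to the indicated position.
    def d(name):
        cx, cy = calibrated_center(buttons[name], half_size, indicated)
        return math.hypot(indicated[0] - cx, indicated[1] - cy)
    return min(buttons, key=d)

buttons = {"A": (50.0, 50.0), "B": (150.0, 50.0)}
picked = select_button(buttons, indicated=(105.0, 50.0))   # → "B"
```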
20150130719 | METHODS AND APPARATUSES FOR OPERATING A PORTABLE DEVICE BASED ON AN ACCELEROMETER - Methods and apparatuses for operating a portable device based on an accelerometer are described. According to one embodiment of the invention, an accelerometer attached to a portable device detects a movement of the portable device. In response, a machine executable code is executed within the portable device to perform one or more predetermined user configurable operations. Other methods and apparatuses are also described. | 05-14-2015 |
20150138085 | Electronic apparatus for simulating or interfacing a backward compatible human input device by means or control of a gesture recognition system - Method and apparatus where human gestures are interpreted, by means of software running on a host computer, into screen coordinates and low level commands—keyboard presses, clicks, double-clicks, drag-and-drop, wheel scroll etc.—which are sent to a hardware peripheral instead of a software based Application Programming Interface. The hardware corrects and polishes the said screen coordinates and low level commands and translates said data by means of emulating, simulating or manipulating the protocol of an actual Human Input Device (HID)—such as a standard keyboard, mouse, joystick, touchpad, etc.—which actual HID-compliant device or simuloid is embedded into the invention proper, and is in turn connected back into the host computer where it is recognized by native drivers as a standard HID device so that it may interact with common end-user programs in the usual manner—but thus be controlled by means of human gestures. | 05-21-2015 |
20150138086 | CALIBRATING CONTROL DEVICE FOR USE WITH SPATIAL OPERATING SYSTEM - Systems and methods comprise an input device. A detector is coupled to a processor and detects an orientation of the input device. The input device has modal orientations corresponding to the orientation, and the modal orientations correspond to input modes of a gestural control system. The detector is coupled to the gestural control system and automatically controls selection of an input mode in response to the orientation. A calibration object comprises a plurality of sensors, and the calibration object receives data used to calibrate the input device. | 05-21-2015 |
20150138087 | HEAD-UP DISPLAY APPARATUS AND DISPLAY METHOD THEREOF - A head-up display apparatus includes: a head-up display unit including a first region in which general driving information is displayed and a second region in which circumference environment interworking information is displayed; a position information obtaining unit that obtains position information on an obstacle in front of a vehicle; and a controlling unit that controls the head-up display unit to display the circumference environment interworking information in the first region when the obstacle in front of the vehicle is present within a preset first distance. | 05-21-2015 |
20150138088 | Apparatus and Method for Recognizing Spatial Gesture - The present invention relates to an apparatus for recognizing a gesture in a space. In accordance with an embodiment, a spatial gesture recognition apparatus includes a pattern formation unit for radiating light onto a surface of an object required to input a gesture in a virtual air bounce, and forming a predetermined pattern on the surface of the object, an image acquisition unit for acquiring a motion image of the object, and a processing unit for recognizing a gesture input by the object based on the pattern formed on the surface of the object using the acquired image. In this way, depending on the depths of an object required to input a gesture in a space, haptic feedbacks having different intensities are provided, and thus a user can precisely input his or her desired gesture. | 05-21-2015 |
20150138089 | INPUT DEVICES AND METHODS - Devices and methods for providing an interface to a computing device are disclosed herein. The disclosed embodiments allow a user to utilize a first computing device, such as a smartphone or other mobile computing device, as a mouse-like peripheral input device for an associated second computing device, such as a tablet computing device. A user can utilize the first computing device as a fully functional touchpad, movement, and/or accelerometer mouse input device. Manipulation of the first computing device is translated into mouse control inputs and movements to be displayed on the associated second computing device. The first computing device may be manipulated in order to fully control movements, gestures, and various touch inputs as well as click inputs on a display, and functionality of applications executing on the second computing device. | 05-21-2015 |
20150138090 | ELECTRONIC DEVICE AND A METHOD FOR CONTROLLING THE FUNCTIONS OF THE ELECTRONIC DEVICE AS WELL AS PROGRAM PRODUCT FOR IMPLEMENTING THE METHOD - The invention relates to an electronic device, which includes a display component, in which at least one controllable element is arranged to be visualized in its entirety, the control of which element is arranged to be based on determining a change (M) relating to the attitude or position of the device and camera means arranged to form image frames (IMAGE | 05-21-2015 |
20150138091 | RFID-Based Input Device - A method includes steps of: receiving a first energy and a second energy emitted from within close proximity to a computer; powering a portable unit using the first energy; determining a position and status of the portable unit using the second energy; and transmitting a user identifier from the portable unit to the computer for verification. | 05-21-2015 |
20150145772 | Methods, Devices, and Computer Readable Storage Device for Touchscreen Navigation - Methods, devices, and computer readable storage device for navigation of a touchscreen display include calculating a scaled position based on a scaling factor. The scaling factor is based on dimensions of a region of reach and dimensions of the touchscreen display. | 05-28-2015 |
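The scaled-position calculation named above can be illustrated with a minimal linear mapping from a smaller "region of reach" to the full touchscreen; the region placement, screen dimensions, and the purely linear mapping are illustrative assumptions.

```python
# Sketch: map a touch inside a smaller "region of reach" to a position
# on the full touchscreen using per-axis scaling factors derived from
# the two sets of dimensions. Dimensions and the purely linear mapping
# are illustrative assumptions.

def scaled_position(touch, reach_origin, reach_size, screen_size):
    sx = screen_size[0] / reach_size[0]   # scaling factor, x axis
    sy = screen_size[1] / reach_size[1]   # scaling factor, y axis
    return ((touch[0] - reach_origin[0]) * sx,
            (touch[1] - reach_origin[1]) * sy)

# A thumb-reachable 540x960 region in the corner of a 1080x1920 screen:
pos = scaled_position(touch=(270, 480), reach_origin=(0, 0),
                      reach_size=(540, 960), screen_size=(1080, 1920))
# → (540.0, 960.0), i.e. the center of the full display
```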
20150145773 | BEHIND-DISPLAY USER INTERFACE - Example systems and methods of providing a user interface are presented. In one example, a graphical object is displayed on an opaque display component on a user-facing side of a computing device. Using a sensing component of the computing device, movement of a physical pointer controlled by a user is sensed. The physical pointer may be located opposite the user-facing side of the computing device. On the opaque display component, a representation of the physical pointer is displayed during the movement of the physical pointer. The graphical object, as displayed on the opaque display component, is modified based on the sensed movement of the physical pointer during the movement of the physical pointer. | 05-28-2015 |
20150145774 | METHOD AND SYSTEM FOR GESTURE IDENTIFICATION - A method for gesture identification includes determining a first gesture is performed on a surface sensing region. The first gesture includes at least one input object. The method further includes determining a first action corresponding to the first gesture, and issuing, based on performing the first gesture, a first report reporting the first gesture and the first action. The method further includes determining, within a first predefined length of time subsequent to performing the first gesture, a presence of the at least one input object in an above surface sensing region, determining a second action corresponding to the first gesture and the at least one input object being in the above surface sensing region within the predefined length of time, and issuing a second report reporting the second action. | 05-28-2015 |
20150145775 | PORTABLE INPUT DEVICE - A portable input device includes a casing, a light sensing module and a printed circuit board. The casing includes a reflecting member. The light sensing module is movably mounted on the reflecting member and exposed from the casing. The light sensing module includes a lighting member and a sensor both facing the reflecting member. The lighting member is adapted for emitting light to the reflecting member. The sensor is adapted for receiving the light reflected by the reflecting member and configured for transmitting a sensing signal. The printed circuit board is mounted on the casing and electrically connected to the light sensing module. The printed circuit board is configured for processing the sensing signal to compute a value of movement of the light sensing module moving with respect to the reflecting member. | 05-28-2015 |
20150145776 | Information Input System, Portable Terminal Device, and Computer That Ensure Addition of Function - An information input system includes a computer and a portable terminal device. The portable terminal device includes: an image capturing unit; a detecting unit that detects a moving direction and an amount of movement of the portable terminal device based on a captured image, and a transmission controller that transmits movement information indicative of the moving direction and the amount of movement detected by the detecting unit to the computer. The computer includes: an information receiving unit that receives the movement information, the movement information being transmitted from the transmission controller of the portable terminal device; and an information accepting unit that accepts the moving direction and the amount of movement indicated by the movement information received by the information receiving unit as a moving direction and an amount of movement in which a cursor moves, the cursor functioning in an application program operated on the computer. | 05-28-2015 |
20150145777 | EYE TRACKING AND USER REACTION DETECTION - Methods, systems, and devices are disclosed for optical sensing and tracking of eye movement. In one aspect, a device for obtaining a user's eye dynamics data includes a display screen to present at least one content. The device includes an eye sensing module to obtain the user's eye dynamics data including the user's eye movement while presenting to the user the at least one content on the display screen. The device includes a processor in communication with the eye sensing module. The processor can determine the user's reaction to the at least one content presented on the display screen based at least partly on the obtained user's eye dynamics data. | 05-28-2015 |
20150293586 | EYE GAZE DIRECTION INDICATOR - A user of a computing system gazes at an image of visible objects in the user's field of view. The computing system determines the direction of the user's eye gaze into the field of view, and augments the image seen by the user with a computer-generated gaze direction indicator. The computing system may also determine the direction of an auxiliary pointer into the field of view, and may further augment the image seen by the user with a computer-generated auxiliary pointer direction indicator. The gaze direction indicator may be adjusted based on the user's eye movement and the auxiliary pointer direction indicator may be adjusted based on manipulation of the auxiliary pointer. The direction indicators may intersect over an object in the user's field of view, and may be displayed on a transparent display screen positioned between the user and the object. | 10-15-2015 |
20150293612 | PEN-TYPE OPTICAL INDEXING APPARATUS AND METHOD FOR CONTROLLING THE SAME - Provided is to a pen-type optical indexing apparatus and a method for controlling the same. The apparatus is a handheld device conveniently provided for a user to manipulate. Housing of the apparatus has an opening in the end, and the opening is a passage allowing emitting and receiving lights. The apparatus essentially includes a control unit for integrating the internal signals, a control interface unit provided for manipulating the apparatus to generate control signals, and an integrated circuit which packages a light-source module and a sensing module of the apparatus. The sensing module includes a sensor array composed of multiple sensing cells arranged in an array, and is used to sense the incident lights reflected by an external object. The apparatus includes a communication unit and a power management unit. One of operational modes including cursor-indicating mode, handwriting mode and touching mode can be activated while initiating the apparatus. | 10-15-2015 |
20150293656 | METHOD AND APPARATUS FOR SCROLLING A SCREEN IN A DISPLAY APPARATUS - A method of scrolling a screen in a display apparatus includes initiating a screen-scroll according to a speed of a currently generated flick input when the flick input is generated, comparing a tilt of an axis of a corresponding apparatus with an initial location at a time of generation of the flick input to determine whether a change amount of a change is within a reference value, maintaining a current screen-scroll speed if the change amount deviates from the reference value, and stopping a screen-scroll operation when a scroll stop condition is met. | 10-15-2015 |
20150301598 | METHOD, ELECTRONIC DEVICE, AND COMPUTER PROGRAM PRODUCT - According to one embodiment, a method includes acquiring first image data including first information at a first time by capturing an image of a user's eye; acquiring a first coordinate at a second time substantially the same as the first time; and calculating a calibration value using the first information and the first coordinate. The first information is information related to a first position of the user's eye. The first coordinate is a coordinate of a pointer configured to move in a display area on a first screen by following an operation of a pointing device by the user, or a coordinate of a touched position at which the user touches a display area of a touch panel. The calibration value is a value for estimating a second position at which the user is looking in the display area. | 10-22-2015 |
20150301610 | Methods and Apparatus Recognition of Start and/or Stop Portions of a Gesture Using Relative Coordinate System Boundaries - Described are apparatus and methods for reconstructing a gesture by aggregating various data from various sensors, including data for recognition of start and/or stop portions of the gesture using a detection of an intersection with a relative coordinate system boundary. | 10-22-2015 |
20150301617 | SYSTEMS AND METHODS FOR PROVIDING ENHANCED MOTION DETECTION - Provided are systems and methods for providing enhanced motion detection. One system providing enhanced motion detection includes a smart display, an interface subsystem including a human interface device (HID), and a console having a processor configured to form communication links with the smart display and the interface subsystem and to provide motion detection feedback, using the smart display, to a user of the HID, where the HID is configured to sense motion of the HID and utilize a predictive model to characterize the motion of the HID. One interface subsystem includes a camera to sense motion of a user of the HID. One processor is configured to negotiate a reduced response latency with the smart display. | 10-22-2015 |
20150301626 | COORDINATE DEVICE WITH ROLLING CYLINDER - The invention refers to a coordinate device. | 10-22-2015 |
20150302821 | IMAGE DISPLAY DEVICE WITH MOVEMENT ADJUSTMENT - Devices, methods, and systems for displaying an image are described herein. One or more device embodiments include a user interface configured to display an image, a motion sensor configured to sense movement of the device, and a processor configured to convert the movement of the device to a corresponding movement of the display of the image. | 10-22-2015 |
20150309584 | INPUT CONTROL DEVICE AND METHOD - A processor recognizes a shape of an indicator that performs an operation in a space on an object to be operated that is displayed on a display surface. The processor specifies an operation assigned to the recognized shape of the indicator. The processor changes a size of the space in which the operation is performed in accordance with the specified operation. | 10-29-2015 |
20150324116 | SYSTEMS AND METHODS FOR DETECTING A PRESS ON A TOUCH-SENSITIVE SURFACE - Systems and methods for displaying and intuitively interacting with keyboards on a touch-sensitive surface are disclosed herein. In one aspect, a method is performed at an electronic device with one or more processors, memory, a touch-sensitive display, and one or more touch sensors coupled to the touch-sensitive display. The method includes: displaying a plurality of keys on a keyboard on the touch-sensitive display and detecting, by the one or more touch sensors, a first contact at a first key of the plurality of keys on the keyboard. The method further includes: determining a value of a signal corresponding to the first contact. When the value is above a first non-zero threshold, the method includes actuating the first key. When the value is between a second non-zero threshold and the first non-zero threshold, the method includes forgoing actuating the first key. | 11-12-2015 |
20150331499 | INFORMATION PROCESSING APPARATUS, INPUT APPARATUS, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus, an input apparatus, an information processing system, an information processing method, and a program that are capable of improving operability when a target object is selected on a screen are provided. A control apparatus is provided to which, when a button is pressed in a state where a pointer is indicating an area around an icon on a screen, a signal indicating that the button has been pressed and a signal of positional information of the pointer at that time are input, and the control apparatus performs movement control such that the pointer indicates the icon based on those signals. Therefore, even when the pointer is not directly indicating the icon, the icon can be indicated by indicating the area around the icon, thus improving operability in selecting the icon on the screen by the pointer. | 11-19-2015 |
20150338652 | EYEWEAR TYPE TERMINAL AND CONTROL METHOD THEREOF - An eyewear type terminal includes a camera, an infrared light emitting part, a display unit and a controller. The camera obtains a first image using at least one image sensor. The infrared light emitting part is disposed to be spaced apart from the camera at a predetermined distance, and transmits infrared light using at least one infrared light emitting device. The display unit displays the first image and a marker moving along a first input signal on the first image, when the infrared light emitting part is driven. The controller obtains a second image of an object existing in the area where the marker is displayed using the at least one infrared light emitting part and the camera, when a second input signal is sensed, and outputs visual information related to the obtained second image to be adjacent to the object. | 11-26-2015 |
20150338930 | POSITION DETECTOR AND POSITION POINTER - A position detector includes a position pointer having an AC signal generation circuit that is disposed in a housing and that transmits an AC signal, and a sensor that receives the AC signal. The position detector detects the position pointed to by the position pointer on the sensor. The position pointer includes at least three electrodes electrically isolated from each other, and a control circuit that controls so that the AC signal is selectively supplied to the electrodes, and so that identification information identifying the electrode to which the AC signal is selectively supplied is generated and transmitted to the sensor. The position detector further includes a position detection circuit that detects the position based on the AC signal, and an angular information calculation circuit that calculates the rotation angle and/or the tilt angle of the position pointer based on the AC signal and the identification information. | 11-26-2015 |
20150346825 | SYSTEMS AND METHODS FOR ENABLING FINE-GRAINED USER INTERACTIONS FOR PROJECTOR-CAMERA OR DISPLAY-CAMERA SYSTEMS - A computer-implemented method performed in connection with a mobile computing device held by a user, the mobile computing device displaying a marker pattern, the method being performed in a computerized system incorporating a processing unit, a camera and a memory, the computer-implemented method involving: acquiring a plurality of images of the mobile computing device displaying the marker pattern using the camera; using the processing unit to detect the marker pattern within the acquired plurality of images; using the processing unit to determine a plurality of positions of the mobile computing device based on the detected marker pattern within the acquired plurality of images; and processing a user event based on the determined plurality of positions of the mobile computing device. | 12-03-2015 |
20150346833 | GESTURE RECOGNITION SYSTEM AND GESTURE RECOGNITION METHOD - A gesture recognition system comprises: a signal collection terminal, configured to collect and preprocess gesture data of a gesture object; a local recognition device, configured to extract features from the gesture data received from the signal collection terminal, form a multi-dimensional feature vector based on extracted features, establish local gesture models based on the multi-dimensional feature vector and perform local gesture recognition according to the local gesture models; and a cloud server, configured to receive the multi-dimensional feature vector from the local recognition device when there is a network connection between the local recognition device and the cloud server, establish cloud gesture models based on the received multi-dimensional feature vector and perform cloud gesture recognition according to the cloud gesture models. A gesture recognition method is also disclosed. | 12-03-2015 |
20150363008 | DISPLAYING A USER INPUT MODALITY - An aspect provides a method, including: receiving, from at least one detector, data input associated with a position of a user with respect to an information handling device; determining, using a processor, the position of the user with respect to the information handling device using the data input; and displaying, using a display, a user position based input modality based on the position that has been determined. Other aspects are described and claimed. | 12-17-2015 |
20150370349 | ERGONOMIC PHYSICAL INTERACTION ZONE CURSOR MAPPING - Users move their hands in a three dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI. | 12-24-2015 |
20160004334 | Advanced wireless pointing device - The subject of the invention is a method for calculating the point on a display device at which a user is pointing, and a device one part of which forms part of a pointing device (such as a computer mouse or other electronic pointing device), a game controller, or a (smart) TV remote controller. The device consists of a part placed at a fixed position relative to the display device (attached to it or built in) and a part that can be handheld, worn, or built into another device (a game or TV controller). Both parts have several transceivers (receiver and transmitter) that communicate with each other to ascertain the distance from one to the other. The distance is calculated from the time the signal takes to travel from one transmitter to the other (and back) and the velocity of the signal. Different wave propagations (sound or electromagnetic waves) can be used for the communication. Knowing the positions of all transceivers, the point on the display device at which the person is pointing can be calculated. | 01-07-2016 |
20160004337 | PROJECTOR DEVICE, INTERACTIVE SYSTEM, AND INTERACTIVE CONTROL METHOD - A projector device includes: a projecting unit that projects a projection image onto a projection surface; a receiving unit that receives information transmitted from a pointer operating the projection surface, the information including unique identification information of the pointer and first operation direction information indicating an operation direction of the pointer; a photographing unit that photographs the projection surface; a generating unit that generates second operation direction information indicating an operation direction of the pointer operating the projection surface from multiple photographed images taken by the photographing unit; an associating unit that associates the second operation direction information matching with the first operation direction information with the unique identification information; an operation-information generating unit that generates operation information including the associated unique identification information and second operation direction information; and a transmitting unit that transmits the generated operation information to a controlled device. | 01-07-2016 |
20160011660 | Handheld and Wearable Remote-Controllers | 01-14-2016 |
20160011671 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND STORAGE MEDIUM | 01-14-2016 |
20160011675 | Absolute Position 3D Pointing using Light Tracking and Relative Position Detection | 01-14-2016 |
20160011676 | HIGH FIDELITY REMOTE CONTROLLER DEVICE FOR DIGITAL LIVING ROOM | 01-14-2016 |
20160012294 | Athletic Team Integrated Communication, Notification, and Scheduling System | 01-14-2016 |
20160026254 | DISPLAY DEVICE AND METHOD FOR DRIVING THE SAME - A display device and a method for driving the same are disclosed. The display device includes a display unit configured to display a pointer and a controller configured to control a motion of the pointer corresponding to a moving distance of a pointing part differently, depending on a distance from the pointing part that inputs a gesture command. | 01-28-2016 |
20160026265 | System and Method for Determining an Attitude of a Device Undergoing Dynamic Acceleration - A system and method for determining an attitude of a device undergoing dynamic acceleration is provided. A first attitude measurement may be calculated based on a magnetic field measurement received from a magnetometer of the device and a first acceleration measurement received from a first accelerometer of the device. A second attitude measurement can be calculated based on the magnetic field measurement received from the magnetometer and a second acceleration measurement received from a second accelerometer of the device. A correction factor, calculated based on a difference between the two attitude measurements, can be applied to the first attitude measurement to produce a corrected device attitude measurement. The device can be a headset having two sets of in-the-ear and behind-the-ear microphones, a digital signal processor, and a communications interface. The device may comprise two hearing aids, each having multiple microphones, configured to wirelessly intercommunicate. | 01-28-2016 |
20160026268 | INPUT DEVICE WITH MEANS FOR ALTERING THE OPERATING MODE OF THE INPUT DEVICE - An input device for a computing device includes a first input sensor for receiving a first type of instructions associated with a first operating mode of the input device, the first type of instructions including moving the first input sensor with respect to a reference surface. The input device includes a second input sensor for receiving a second type of instructions associated with a second operating mode of the input device, the second type of instructions including moving an indicator with respect to an active surface of the input device provided with said second input sensor. The input device includes a control sensor adapted to detect the position of at least a portion of the hand of a user with respect to the input device. The control sensor is coupled to a controller adapted to allow the input device to operate in either the first or the second operating mode. | 01-28-2016 |
20160026360 | METHOD AND APPARATUS FOR USER INTERFACE OF INPUT DEVICES - A 3 dimensional (3-D) user interface system employs: one or more 3-D projectors configured to display an image at a first location in 3-D space; one or more sensors configured to sense user interaction with the image and to provide user interaction information; and a processor configured (i) to receive the user interaction information from the one or more sensors; (ii) to correlate the user interaction with the image; and (iii) to provide one or more indications responsive to a correlation of the user interaction with the image, including displaying the image at a second location in 3-D space. | 01-28-2016 |
20160034048 | VIDEO DISPLAY SYSTEM, THREE-DIMENSIONAL VIDEO POINTING DEVICE AND VIDEO DISPLAY DEVICE - A video display system includes: a display device that displays a video, which constitutes a stereoscopic video, on its surface; and a light beam emitting device capable of pointing to one point with scattering light obtained by projecting a non-visible light beam or a visible light beam onto the surface, the display device has a camera capable of capturing the scattering light and a superimposition unit that superimposes a pointer image on a display video, and the superimposition unit superimposes the pointer image on the display video depending on a position of the scattering light captured by the camera. | 02-04-2016 |
20160034183 | Projection Screen, Remote Control Terminal, Projection Device, Display Device, Projection System and Remote Control Method for Projection System - The projection screen includes signal lines, photosensitive sensors and a signal processor. The photosensitive sensor is configured to sense a light beam for determining a remote control region from a remote control terminal. The signal line is configured to transmit to the signal processor the sensing signal generated by the photosensitive sensor after sensing the light beam. The signal processor is configured to determine position information about a position irradiated by the light beam. The display device can determine the remote control region in accordance with the position information determined by the signal processor about the position irradiated by the light beam, and receive an operation command from a command key of the remote control terminal so as to perform a remote control operation. | 02-04-2016 |
20160054802 | SENSOR BASED UI IN HMD INCORPORATING LIGHT TURNING ELEMENT - A head-mounted display (HMD) system includes a head-mounted frame that encases a portable electronic device (PED) that includes a touch-less sensor, and the PED. The frame includes: a front slot to hold the PED and maintain contact with the PED's display face; an entry via dimensioned to prevent blockage of the sensor's field of view; and a light turning element positioned to redirect the sensor's field of view through an exit via in the frame and into open space outside of the frame. The PED includes: the sensor to detect a touch-less gesture; an image display to display image content on the PED's front display face; and processing circuitry configured to change modes from handheld mode to HMD operation mode based on a determination that the PED is encased by the frame and to select the image content based on input signals received from the sensor and the mode. | 02-25-2016 |
20160054807 | SYSTEMS AND METHODS FOR EXTENSIONS TO ALTERNATIVE CONTROL OF TOUCH-BASED DEVICES - Systems and methods configured to facilitate multi-modal user inputs in lieu of physical input for a processing device configured to execute an application include obtaining non-physical input for the processing device and the application, wherein the physical input comprises one or more of touch-based input and tilt input; processing the non-physical input to convert it into appropriate physical input commands for the application; and providing the physical input commands to the processing device. | 02-25-2016 |
20160054809 | MOTION DETECTION SYSTEM - The present invention provides a motion detecting system, which includes a light source module, a plurality of image sensors and a control unit. The light source module illuminates at least one object. The image sensors respectively detect the object under the light emitted by the light source module to generate a plurality of detection results. The control unit is coupled to the image sensors, and generates a control command according to the detection results. | 02-25-2016 |
20160054814 | EASILY DEPLOYABLE INTERACTIVE DIRECT-POINTING SYSTEM AND PRESENTATION CONTROL SYSTEM AND CALIBRATION METHOD THEREFOR - A method for controlling movement of a computer display cursor based on a point-of-aim of a pointing device within an interaction region includes projecting an image of a computer display to create the interaction region. At least one calibration point having a predetermined relation to said interaction region is established. A pointing line is directed to substantially pass through the calibration point while measuring a position of and an orientation of the pointing device. The pointing line has a predetermined relationship to said pointing device. Movement of the cursor is controlled within the interaction region using measurements of the position of and the orientation of the pointing device. | 02-25-2016 |
20160070367 | WEARABLE ELECTRONIC DEVICE AND CURSOR CONTROL DEVICE - A wearable electronic device adapted to be worn on a user's wrist is provided. The wearable electronic device includes a wristband, a device body, and a sensing unit. The device body is disposed on the wristband and includes a display unit and a control unit, wherein the display unit is electrically coupled to the control unit. The sensing unit is disposed on the wristband, opposite the device body. The sensing unit is electrically coupled to the control unit for sensing data relating to an outward movement, so as to perform the cursor pointing function. Furthermore, a cursor control device is also provided. | 03-10-2016 |
20160077609 | MAPPED VARIABLE SMOOTHING EVOLUTION METHOD AND DEVICE - The present inventions generally relate to methods and dedicated apparatuses outputting a variable mapped on a device orientation in a non-inertial reference system, with the device orientation being estimated using measurements of motion sensors (such as 3D accelerometers and gyroscopes) and a magnetometer or other similar sensors including cameras. A variable mapped on an orientation of a device is smoothed to have a gradual evolution by adjusting the estimated orientation of the device obtained via sensor fusion or other sensor processing to take into consideration a current measured angular velocity. | 03-17-2016 |
20160077614 | DATA INPUT DEVICE AND ASSOCIATED BRAKING MEANS - A data input device comprises a body intended to be fixed onto a workstation, a part that is rotationally mobile in relation to the body, a set of sensors delivering information on the relative position of the mobile part in relation to the body and means for braking the rotational movements of the mobile part in relation to the body. The braking means comprise a friction ring encircling the mobile part, the ring being split and extending mainly in a plane at right angles to an axis of symmetry of the mobile part, an annular spring extending in the plane and radially compressing the friction ring against the mobile part and means for adjusting the length of the annular spring. | 03-17-2016 |
20160091963 | REMOTE WEARABLE INPUT SOURCES FOR ELECTRONIC DEVICES - In one example an electronic device comprises at least one sensor to detect an input from a remote input source and a controller comprising logic, at least partly including hardware logic, to detect an input on the at least one sensor, generate a signal in response to the input; and forward the signal to an application. Other examples may be described. | 03-31-2016 |
20160091986 | HANDHELD DEVICE, MOTION OPERATION METHOD, AND COMPUTER READABLE MEDIUM - A handheld device includes a memory module, a sensing module and a processing module. The processing module is coupled to the sensing module and the memory module. The memory module stores a plurality of applications. The sensing module detects a motion of the handheld device and provides a sensing signal. The processing module executes at least one of the plurality of applications and receives the sensing signal. When the handheld device enters a motion operation mode and the processing module receives the sensing signal from the sensing module, the processing module controls the handheld device to execute a specific function according to at least one of the applications being executed and the sensing signal received by the handheld device. | 03-31-2016 |
20160109962 | HANDWRITING INPUT DEVICE OF ELECTRONIC DEVICE - A handwriting input device includes a shell, a refill, a light emitting unit, a pressure sensing unit, a light sensing unit, a track determining unit, and a transmitting unit. The refill includes a nib and a distal end. The light emitting unit emits light on a writing medium. The light sensing unit captures images on the writing medium and analyzes the images to obtain a track of the nib. The pressure sensing unit senses whether the nib writes on the writing medium. The track determining unit determines a writing track based on the track of the nib and a sensing result of the pressure sensing unit. The transmitting unit transmits the writing track to an electronic device. The writing track is used to determine an input operation of the electronic device. | 04-21-2016 |
20160110057 | REMOTE CONTROLLER APPARATUS AND CONTROL METHOD THEREOF - A remote controller apparatus including: a communicator configured to communicate with an external device displaying a pointing object thereon; a sensor configured to sense a movement of the remote controller apparatus; and a controller configured to control movement of the pointing object based on the movement of the remote controller apparatus, determine position information corresponding to the movement of the remote controller apparatus with a first method when the remote controller apparatus operates in a first operating mode, determine the position information corresponding to the movement of the remote controller apparatus with a second method when the remote controller apparatus operates in a second operating mode, control the movement of the pointing object based on the position information, and change the operating mode of the remote controller apparatus in response to a preset event occurring. | 04-21-2016 |
20160132106 | MENU SELECTION APPARATUS USING GAZE TRACKING - A menu selection apparatus using gaze tracking includes: a gaze tracker configured to track a gaze of a driver and to output time series gaze data; a timer included in the gaze tracker and configured to output a timing pulse; a buffer configured to store the time series gaze data; a counter configured to count the timing pulse to adjust synchronization with the timer; a button configured to output a click signal when being clicked by the driver; and a controller configured to extract the time series gaze data for a critical time based on button click timing as the click signal is output from the button and to then calculate an average of gaze vectors within each time series gaze data. | 05-12-2016 |
20160139669 | Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds - A device for dexterous interaction in a virtual world is disclosed. The device includes a housing including a plurality of buttons and a plurality of vibration elements each associated with at least one of the plurality of buttons. An orientation sensor detects orientation of the housing, and a bearing is configured to allow the housing to freely rotate in a plurality of directions. A processor is in communication with the plurality of buttons, the plurality of vibration elements, and the orientation sensor. A transmitter/receiver unit is configured to receive data from the processor and configured to send and receive data from a central processing unit. | 05-19-2016 |
20160139687 | HAND HELD POINTING DEVICE WITH ROLL COMPENSATION - A pointing device includes accelerometers and rotational sensors that are coupled to a processor. The processor samples the accelerometers and rotational sensors to detect gravity and pointing device motion and uses algebraic algorithms to calculate roll compensated cursor control signals. The processor transmits the cursor control signals to a receiver that is coupled to an electronic device that moves the cursor on the visual display. | 05-19-2016 |
20160139688 | METHOD FOR REMOTE-CONTROLLING TARGET APPARATUS USING MOBILE COMMUNICATION TERMINAL AND REMOTE CONTROL SYSTEM THEREOF - Methods and apparatuses are provided for remote-controlling a target apparatus using a mobile communication terminal. A communication link is established with the target apparatus having a display screen adapted to visualize an object. An amount of displacement of the mobile communication terminal is detected. Motion information is generated using the amount of displacement of the mobile communication terminal. The motion information is transmitted to the target apparatus. The motion information corresponds to movement data of the object and the amount of displacement is detected based on at least one of current position information, acceleration information and angular velocity information of the mobile communication terminal. | 05-19-2016 |
20160139723 | USER INTERFACE WITH TOUCH SENSOR - A hand-held electronic device includes a display for outputting visual information and a proximity-sensitive touch panel. The display is arranged on a first side of the electronic device, while the proximity-sensitive touch panel is arranged on a second side of the electronic device that is opposite the first side. | 05-19-2016 |
20160147410 | COMPUTING APPARATUS AND METHOD FOR PROVIDING THREE-DIMENSIONAL (3D) INTERACTION - A computing apparatus for providing a three-dimensional (3D) interactive user experience (UX) is provided. The computing apparatus may include an object position estimator configured to calculate first coordinates recognized by a user as a position of a first point of an object in a stereoscopic image. The computing apparatus may include a pointing determiner configured to determine whether the user points to the first point based on the first coordinates and second coordinates, the second coordinates representing a pointing position of the user. | 05-26-2016 |
20160154469 | MID-AIR GESTURE INPUT METHOD AND APPARATUS | 06-02-2016 |
20160154470 | 3D POINTING DEVICES WITH ORIENTATION COMPENSATION AND IMPROVED USABILITY | 06-02-2016 |
20160154478 | POINTING APPARATUS, INTERFACE APPARATUS, AND DISPLAY APPARATUS | 06-02-2016 |
20160162033 | APPARATUS FOR RECOGNIZING GESTURE USING INFRARED RAY AND METHOD THEREOF - Disclosed are an apparatus and a method for recognizing a gesture using an infrared ray. The apparatus for recognizing a gesture includes: a sensing unit which detects a gesture using an infrared sensor and obtains a sensing value from the sensing result; a control unit which performs gesture recognition reflecting the intention of the user in accordance with a predetermined recognizing mode, based on the obtained sensing value; and a storing unit which stores the recognizing mode when the gesture recognition set in advance by the user is performed. The recognizing mode includes a first recognizing mode in which the gesture is recognized directly and a second recognizing mode in which the gesture is recognized after a hold motion marking the start of gesture recognition. | 06-09-2016 |
20160162042 | 3D POINTING DEVICES AND METHODS - Systems and methods according to the present invention address these needs and others by providing a handheld device, e.g., a 3D pointing device, which uses at least one sensor to detect motion of the handheld device. The detected motion can then be mapped into a desired output, e.g., cursor movement. | 06-09-2016 |
20160170501 | CURSOR POSITION CONTROLLING APPARATUS, CURSOR POSITION CONTROLLING METHOD, PROGRAM AND INFORMATION STORAGE MEDIUM | 06-16-2016 |
20160179188 | HAND TRACKER FOR DEVICE WITH DISPLAY | 06-23-2016 |
20160180596 | HEADSET VISION SYSTEM FOR PORTABLE DEVICES | 06-23-2016 |
20160188003 | INFORMATION PROCESSING METHOD AND ELECTRONIC DEVICE - The present disclosure provides an information processing method and electronic device. The method comprises: displaying an indicator at a first location in a display area of a first electronic device at a first timing; acquiring first operation data corresponding to a second timing, later than the first timing, and updating the display so that the indicator appears at a second location in the display area different from the first location; acquiring second operation data corresponding to a third timing, later than the second timing, when the second location lies on a border of the display area; and performing a predetermined process such that, at a fourth timing later than the third timing, when the pointing of the second electronic device re-enters the display area, the indicator is displayed at a third location matching the pointing at the fourth timing. The present disclosure thereby addresses the mismatch between the pointing of the second electronic device and the location of the indicator in the display area. | 06-30-2016 |
20160188078 | FLOATING TOUCH METHOD - A floating touch method for an electronic device is disclosed. The method includes: detecting a first position of an operation object near a screen of the electronic device to generate a first three-dimensional coordinate, and recording the time that the operation object appears at the first position as a first time; detecting a second position of the operation object near the screen to generate a second three-dimensional coordinate, and recording the time that the operation object appears at the second position as a second time; computing a first speed at which the operation object moves from the first position to the second position according to the first three-dimensional coordinate, the second three-dimensional coordinate, the first time, and the second time; and determining that the operation object selects a second two-dimensional coordinate, which is the projection of the second three-dimensional coordinate on the screen, if the first speed conforms to a first threshold speed. | 06-30-2016 |
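The speed test in this floating touch method can be sketched as follows: compute the speed from the two 3-D coordinates and timestamps, and treat a sufficiently slow move as a selection of the screen projection of the second coordinate. Reading "conforms to" as at-or-below a threshold, and the threshold value itself, are assumptions for illustration.

```python
import math

def select_on_hover(p1, t1, p2, t2, threshold_speed=0.02):
    """Return the 2-D screen coordinate selected by a floating touch,
    or None if the move was too fast.

    p1, p2: 3-D coordinates of the operation object at times t1, t2.
    The at-or-below reading of the speed test and the default
    threshold value are illustrative assumptions.
    """
    speed = math.dist(p1, p2) / (t2 - t1)
    if speed <= threshold_speed:   # slow move: treat as a selection
        return (p2[0], p2[1])      # project onto the screen plane
    return None
```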
20160188103 | Touch Pad with Force Sensors and Actuator Feedback - Electronic devices may use touch pads that have touch sensor arrays, force sensors, and actuators for providing tactile feedback. A touch pad may be mounted in a computer housing. The touch pad may have a rectangular planar touch pad member that has a glass layer covered with ink and contains a capacitive touch sensor array. Force sensors may be mounted under each of the four corners of the rectangular planar touch pad member. The force sensors may be used to measure how much force is applied to the surface of the planar touch pad member by a user. Processed force sensor signals may indicate the presence of button activity such as press and release events. In response to detected button activity or other activity in the device, actuator drive signals may be generated for controlling the actuator. The user may supply settings to adjust signal processing and tactile feedback parameters. | 06-30-2016 |
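One plausible way to turn the four corner force sensors above into press and release events is to sum the corner readings and apply a hysteresis pair of thresholds, so a single press does not chatter. The summing, the hysteresis scheme, and the threshold values are assumptions for illustration, not the patented signal processing.

```python
def detect_button_events(corner_forces, press_level=1.0, release_level=0.5):
    """Turn per-sample corner-force readings into press/release events.

    corner_forces: iterable of 4-tuples, one force reading per corner
    of the touch pad per sample. Hysteresis thresholds are
    illustrative assumptions.
    """
    events = []
    pressed = False
    for sample in corner_forces:
        total = sum(sample)  # total force applied by the user
        if not pressed and total >= press_level:
            pressed = True
            events.append("press")
        elif pressed and total <= release_level:
            pressed = False
            events.append("release")
    return events
```

A processed event stream like this is what would then drive the actuator for tactile feedback.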
20160188117 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND PROGRAM - An apparatus and method provide logic for processing information. In one implementation, an apparatus includes a display unit configured to display a first portion of content to a user. The display unit includes a display surface. A detection unit is configured to detect a distance between an interface surface and a reference point disposed along an operational tool of the user. A control unit is configured to determine whether the distance falls within a threshold range, and to generate a signal to display a second portion of content to the user when the detected distance falls within the threshold range. The display unit is further configured to display the second content portion based on the generated signal. | 06-30-2016 |
20160188154 | AUXILIARY INPUT DEVICE - Disclosed herein are technologies for providing an auxiliary input device. The auxiliary input device may provide data to a mobile host device. In some implementations, the auxiliary input device includes a photoelectric sensor that tracks movement and provides spatial data that manipulates a cursor displayed on a user interface of the mobile host device. | 06-30-2016 |
20160192117 | DATA TRANSMISSION METHOD AND FIRST ELECTRONIC DEVICE - The present disclosure provides a data transmission method and a first electronic device. The method comprises: displaying a sharing mark on a display unit of a first electronic device; obtaining, by a sensor unit of the first electronic device, sensing parameters which indicate an input operation of an operator; determining that an operation object of the input operation is a first object displayed on the display unit based on a first parameter from the sensing parameters; determining an operation trajectory of the input operation based on a second parameter from the sensing parameters; determining that the input operation is a sharing operation associated with the first object when the operation trajectory moves towards the sharing mark; and responding to the sharing operation. | 06-30-2016 |
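The "moves towards the sharing mark" determination above can be sketched as a dot-product test between the overall motion vector of the trajectory and the vector from its start point to the mark. This geometric interpretation is an assumption for illustration, not the method claimed in the filing.

```python
def moves_toward_mark(trajectory, mark):
    """Check whether an input trajectory moves towards the sharing mark.

    trajectory: ordered list of (x, y) points of the input operation;
    mark: (x, y) position of the sharing mark. The dot-product test
    is an illustrative interpretation of "moves towards".
    """
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    # Overall motion vector and the vector from the start to the mark
    mvx, mvy = x1 - x0, y1 - y0
    tmx, tmy = mark[0] - x0, mark[1] - y0
    return mvx * tmx + mvy * tmy > 0
```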
20160195934 | SYSTEM AND METHOD FOR USING A SIDE CAMERA FOR FREE SPACE GESTURE INPUTS | 07-07-2016 |
20160195940 | USER-INPUT CONTROL DEVICE TOGGLED MOTION TRACKING | 07-07-2016 |
20160202774 | OPTICAL POINTING SYSTEM | 07-14-2016 |
20160202775 | RELATIVE LOCATION DETERMINING METHOD, DISPLAY CONTROLLING METHOD AND SYSTEM APPLYING THE METHOD | 07-14-2016 |
20160202876 | INDIRECT 3D SCENE POSITIONING CONTROL | 07-14-2016 |
20160253068 | METHOD AND APPARATUS FOR USER INTERFACE OF INPUT DEVICES | 09-01-2016 |
20160378200 | TOUCH INPUT DEVICE, VEHICLE COMPRISING THE SAME, AND METHOD FOR CONTROLLING THE SAME - A touch input device includes a protrusion unit protruding from a mounting surface and receiving a touch signal from a user. A recess unit is disposed inside the protrusion unit. A controller is configured to determine whether to input or delete a character in accordance with an input direction of the touch signal. | 12-29-2016 |
20160378205 | METHOD FOR REMOTE-CONTROLLING TARGET APPARATUS USING MOBILE COMMUNICATION TERMINAL AND REMOTE CONTROL SYSTEM THEREOF - Methods and apparatuses are provided for remote-controlling a target apparatus using a mobile communication terminal. A communication link is established with the target apparatus having a display screen adapted to visualize an object. An amount of movement of the mobile communication terminal is detected. Motion information for moving the object is generated using the amount of movement of the mobile communication terminal. The motion information is transmitted to the target apparatus. The amount of movement is detected based on at least one of current position information, acceleration information, and angular velocity information of the mobile communication terminal. | 12-29-2016 |
20160378251 | SELECTIVE POINTER OFFSET FOR TOUCH-SENSITIVE DISPLAY DEVICE - A user contacts a touch-sensitive surface of a touch-sensitive display device with a finger. An initial finger contact patch is determined for the user finger, and a default position is assigned to a display pointer based on the initial finger contact patch. The display pointer is assigned to an offset position, different from the default position, responsive to detecting a triggering gesture input from the user finger. | 12-29-2016 |
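A minimal sketch of the selective pointer offset above: place the pointer at the centroid of the contact patch by default, and shift it by a fixed offset once a triggering gesture has been detected. Using the centroid as the default position and the particular offset values are assumptions for illustration.

```python
def pointer_position(contact_patch, triggered, offset=(0, -40)):
    """Place the display pointer for a finger contact patch.

    contact_patch: list of (x, y) touch points; the default pointer
    position is their centroid (an assumption). When a triggering
    gesture has been detected, shift by 'offset' (illustrative
    values) so the pointer is not occluded by the finger.
    """
    n = len(contact_patch)
    cx = sum(x for x, _ in contact_patch) / n
    cy = sum(y for _, y in contact_patch) / n
    if triggered:
        return (cx + offset[0], cy + offset[1])
    return (cx, cy)
```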
20160378320 | MANIPULATION APPARATUS - A manipulation apparatus includes an image display device for displaying an image containing a command portion, an operation unit manually operable by a user, a pointer display unit for displaying a pointer at a position corresponding to an operation state of the operation unit on the image displayed on the image display device, a vibration application unit for applying vibration to the operation unit, and a direction determination unit for determining, based on a positional relationship between the command portion and the pointer or based on the contents of the command corresponding to the command portion, a direction of the force that is first applied to the operation unit as the vibration by the vibration application unit when the pointer is displayed on the command portion. | 12-29-2016 |
20170235372 | INTERACTIVE THREE-DIMENSIONAL DISPLAY APPARATUS AND METHOD | 08-17-2017 |
20170235376 | SYSTEMS AND METHODS OF DIRECT POINTING DETECTION FOR INTERACTION WITH A DIGITAL DEVICE | 08-17-2017 |
20180024629 | CONTENT ACQUIRING METHOD AND APPARATUS, AND USER EQUIPMENT | 01-25-2018 |
20180024647 | SENSITIVITY ADJUSTMENT FOR A POINTING DEVICE | 01-25-2018 |
20180024659 | SYSTEM AND METHOD FOR PROVIDING ABSOLUTE COORDINATE AND ZONE MAPPING BETWEEN A TOUCHPAD AND A DISPLAY SCREEN | 01-25-2018 |
20190146598 | SYSTEM AND METHOD FOR DISTRIBUTED DEVICE TRACKING | 05-16-2019 |