Patent application number | Description | Published |
20120075166 | ACTUATED ADAPTIVE DISPLAY SYSTEMS - Adjustable, adaptive display system having individual display elements is able to change its configuration based on a user's movements, position, and activities. In a method of adjusting a display system, a user is tracked using a camera or other tracking sensor, thereby creating user-tracking data. The user-tracking data is input to an actuator signal module which generates input signals for one or more actuators. The input signals are created, in part, from the user-tracking data. Two or more display elements are actuated using the one or more actuators based on the input signals. The display elements may be planar or curved. In this manner, the configuration of the display system adapts to user movements and adjusts systematically, allowing the display system to fill a greater portion of the user's visual field (user FOV). | 03-29-2012 |
20120260207 | DYNAMIC TEXT INPUT USING ON AND ABOVE SURFACE SENSING OF HANDS AND FINGERS - A virtual keyboard is displayed on a touch screen display surface of a computing device. Partial images of the keyboard are displayed, where a partial image may be one key, referred to as the most probable key that the user will touch, or a group of keys, which may include some less probable or surrounding keys that may be touched. Sensors under or near the display surface detect an outline of the user's hands and determine which finger is the fastest moving finger, which is presumed to be the finger used to touch a key. The most probable key is determined based on the fastest moving finger and may be displayed before the finger touches the surface. If the most probable key is not touched, a user profile containing user typing habits may be updated to reflect that a less probable key was touched. | 10-11-2012 |
20130073956 | LINKING PROGRAMMATIC ACTIONS TO USER ACTIONS AT DIFFERENT LOCATIONS - A method for operating a computing device is disclosed, where data that associates a user action at a predetermined location with a programmatic action is stored in memory. A user action being performed at the predetermined location is detected, and the corresponding programmatic action is performed in response to detecting the user action being performed at the predetermined location. | 03-21-2013 |
20130076860 | THREE-DIMENSIONAL RELATIONSHIP DETERMINATION - Example embodiments disclosed herein relate to determining relationships between locations based on beacon information. At least three sensors of a device can be used to determine locations of a beacon. The device can determine a three-dimensional relationship between the locations. | 03-28-2013 |
20130076909 | SYSTEM AND METHOD FOR EDITING ELECTRONIC CONTENT USING A HANDHELD DEVICE - Embodiments of the present invention disclose a system and method for editing electronic content using a handheld device. According to one example embodiment, the system includes a mobile computing device hosting electronic content, and a handheld imaging device. The handheld imaging device is configured to communicate with the mobile computing device and includes an optical sensor for capturing image data associated with an object or area. Still further, the handheld imaging device is configured to transmit said image data and designate a location for its insertion into the electronic content hosted on the mobile computing device. | 03-28-2013 |
20130077059 | DETERMINING MOTION OF PROJECTION DEVICE - Example embodiments disclosed herein relate to determining a motion based on projected image information. Image information is projected onto an external surface from a device. Sensor information about the external surface and/or projection is received. Motion of the device is determined based on the sensor information. | 03-28-2013 |
20130082928 | KEYBOARD-BASED MULTI-TOUCH INPUT SYSTEM USING A DISPLAYED REPRESENTATION OF A USER'S HAND - Example embodiments relate to a keyboard-based multi-touch input system using a displayed representation of a user's hand. In example embodiments, a sensor detects movement of a user's hand in a direction parallel to a top surface of a physical keyboard. A computing device may then receive information describing the movement of the user's hand from the sensor and output a real-time visualization of the user's hand on the display. This visualization may be overlaid on a multi-touch enabled user interface, such that the user may perform actions on objects within the user interface by performing multi-touch gestures. | 04-04-2013 |
20130082937 | METHOD AND SYSTEM FOR ENABLING INSTANT HANDWRITTEN INPUT - Embodiments of the present invention disclose a method and system for enabling instant handwriting input on a mobile computing device. According to one embodiment, while the mobile device is in an inactive state and identity-protected, an activation event associated with a writing tool operated by a user is detected. In response to the activation event, the mobile computing device is switched from the inactive state to a low power state in which the mobile computing device is configured to accept and store handwritten input while remaining identity-protected. | 04-04-2013 |
20130088427 | MULTIPLE INPUT AREAS FOR PEN-BASED COMPUTING - Embodiments of the present invention disclose a pen-based computing system and method using multiple input areas. According to one embodiment, the system includes a mobile computing device having a display, and a pen input device configured to transmit a signal for determining a position of the pen device relative to the mobile computing device. A plurality of input areas are designated around the entire outer periphery of the display and the mobile computing device such that the presence or movement of the pen input device within any one of the plurality of input areas corresponds to an input operation on the mobile computing device. | 04-11-2013 |
20130091238 | PEN-BASED CONTENT TRANSFER SYSTEM AND METHOD THEREOF - Embodiments of the present invention disclose a system and method for providing pen-based content transfer between mobile computing devices. According to one embodiment, a first mobile computing device and second mobile computing device are configured to host electronic content. A pen device is operated by a user for selecting preferred electronic content from the electronic content hosted on the first computing device. Furthermore, the pen device is configured to store transfer information for facilitating transmission of the preferred electronic content from the first mobile computing device to the electronic content of the second mobile computing device based on an action from the user. | 04-11-2013 |
20130100008 | Haptic Response Module - Embodiments provide an apparatus that includes a tracking sensor to track movement of a hand behind a display, such that a virtual object may be output via the display, and a haptic response module to output a stream of gas based on a determination that the virtual object has interacted with a portion of the image. | 04-25-2013 |
20130167161 | PROCESSING OF RENDERING DATA BY AN OPERATING SYSTEM TO IDENTIFY A CONTEXTUALLY RELEVANT MEDIA OBJECT - Examples disclose a processor to execute an application associated with an operating system to transmit rendering data which identifies visual objects to display by the operating system. Further, the examples provide the operating system to process the rendering data to identify a media object contextually relevant to the rendering data. Additionally, the examples also disclose the operating system to output the rendering data and the identified media object. | 06-27-2013 |
20130222381 | AUGMENTED REALITY WRITING SYSTEM AND METHOD THEREOF - Embodiments of the present invention disclose an augmented reality writing system and method thereof. According to one example embodiment, the system includes a handheld writing tool having an end portion and a display device for displaying digital content for viewing by an operating user. An optical sensor is coupled to the display device and includes a field of view facing away from the operating user. Furthermore, coupled to the optical sensor is a processing unit configured to detect and track the position of the end portion of the writing tool. In accordance therewith, handwritten content is digitally rendered on the display device to correspond with the handwriting motion of the writing tool within the field of view of the optical sensor. | 08-29-2013 |
20130257734 | USE OF A SENSOR TO ENABLE TOUCH AND TYPE MODES FOR HANDS OF A USER VIA A KEYBOARD - Example embodiments relate to a keyboard-based system that enables a user to provide touch input via the keyboard with one hand and typed input via the keyboard with the other hand. In example embodiments, a sensor detects the user's hands on the top surface of the keyboard. In response, a computing device identifies a first hand and a second hand by analyzing information provided by the sensor. The computing device then assigns the first hand to a touch mode and the second hand to a typing mode. The user may then provide touch input using a visualization of the first hand overlaid on a user interface, while providing typed input with the second hand via the keyboard. | 10-03-2013 |
20130286199 | GENERATION OF A COMBINED IMAGE OF A PRESENTATION SURFACE - Example embodiments relate to generating a combined image of a presentation surface. In example embodiments, an image including a portion of a presentation surface is captured using a camera. A location identifier is then received, where the location identifier specifies a location of a digital version of the content displayed on the presentation surface. Next, the digital version of the content displayed on the presentation surface is retrieved from the location specified by the location identifier. Finally, a combined image is generated by combining the captured image and the retrieved digital version of the content displayed on the presentation surface. | 10-31-2013 |
20150058776 | PROVIDING KEYBOARD SHORTCUTS MAPPED TO A KEYBOARD - Example embodiments relate to the provision of keyboard shortcuts that are mapped to a physical keyboard. In example embodiments, a user interface including a plurality of selectable UI elements is outputted. A plurality of keyboard shortcuts may then be outputted, such that each keyboard shortcut corresponds to a key on a physical keyboard and the shortcuts are spatially arranged in a layout corresponding to a layout of the keyboard. A selection of a particular key may then be received and, in response, the UI element positioned at the location of the keyboard shortcut corresponding to the selected key may be activated. | 02-26-2015 |
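The action-to-location mapping described in application 20130073956 above (store an association between a user action at a predetermined location and a programmatic action, then perform the latter when the former is detected) can be sketched as a lookup table of callables. All class, method, and action names here are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the idea in 20130073956: a table associates
# (user action, location) pairs with programmatic actions, and a detected
# user action at a known location triggers the mapped callable.
from typing import Callable, Dict, Tuple

class ActionLinker:
    def __init__(self) -> None:
        # (user_action, location) -> programmatic action
        self._bindings: Dict[Tuple[str, str], Callable[[], str]] = {}

    def bind(self, user_action: str, location: str,
             programmatic_action: Callable[[], str]) -> None:
        """Store the association in memory, as the abstract describes."""
        self._bindings[(user_action, location)] = programmatic_action

    def on_detected(self, user_action: str, location: str) -> str:
        """Perform the mapped action when the user action is detected
        at the predetermined location; otherwise do nothing."""
        action = self._bindings.get((user_action, location))
        return action() if action else "no-op"

linker = ActionLinker()
linker.bind("double-tap", "desk-surface", lambda: "open-notes")
print(linker.on_detected("double-tap", "desk-surface"))  # open-notes
print(linker.on_detected("double-tap", "wall"))          # no-op
```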
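The prediction loop in application 20120260207 above (pick the fastest-moving finger, show the key it is most likely to touch, and update a typing profile when a less probable key is touched instead) can be sketched roughly as follows. The nearest-key heuristic, the key coordinates, and the profile format are assumptions for illustration only.

```python
# Illustrative sketch of the dynamic-text-input idea in 20120260207: the
# fastest-moving finger is presumed to be the touching finger, and the key
# nearest its position is treated as the "most probable key".
import math
from typing import Dict

# Assumed key-center coordinates for a few keys (arbitrary units).
KEY_CENTERS = {"f": (4.0, 1.0), "g": (5.0, 1.0), "h": (6.0, 1.0)}

def most_probable_key(fingers: Dict[str, dict]) -> str:
    """fingers maps a finger name to {"pos": (x, y), "speed": float}."""
    fastest = max(fingers, key=lambda f: fingers[f]["speed"])
    fx, fy = fingers[fastest]["pos"]
    # Nearest key center to the fastest finger is the most probable key.
    return min(KEY_CENTERS, key=lambda k: math.dist((fx, fy), KEY_CENTERS[k]))

def update_profile(profile: Dict[str, int], predicted: str, touched: str) -> None:
    """Record that a less probable key was touched, per the abstract."""
    if touched != predicted:
        profile[touched] = profile.get(touched, 0) + 1

fingers = {"index": {"pos": (4.2, 1.1), "speed": 9.0},
           "middle": {"pos": (6.0, 1.0), "speed": 2.0}}
print(most_probable_key(fingers))  # f
```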
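The spatial shortcut mapping in application 20150058776 above (shortcuts arranged in a layout mirroring the physical keyboard, so pressing a key activates the UI element at the corresponding position) can be sketched as a grid-to-row zip. The keyboard rows, element names, and fallback behavior are hypothetical.

```python
# A minimal sketch of the spatial mapping in 20150058776: UI elements sit in
# a grid, each cell takes the key at the same position on a keyboard row,
# and pressing that key activates the element in that cell.
KEYBOARD_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def map_shortcuts(ui_grid):
    """ui_grid: list of rows of UI element names -> dict of key -> element."""
    mapping = {}
    for row_keys, row_elems in zip(KEYBOARD_ROWS, ui_grid):
        for key, elem in zip(row_keys, row_elems):
            mapping[key] = elem
    return mapping

def activate(mapping, key):
    """Activate the element mapped to the pressed key, if any."""
    return mapping.get(key, "ignored")

grid = [["File", "Edit", "View"],
        ["Open", "Save"]]
shortcuts = map_shortcuts(grid)
print(activate(shortcuts, "w"))  # Edit (second cell, top row)
print(activate(shortcuts, "a"))  # Open (first cell, home row)
```

Unassigned keys fall through to "ignored", mirroring the abstract's implication that only keys with a positioned UI element trigger an activation.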