Patent application number | Description | Published |
20090231285 | INTERPRETING AMBIGUOUS INPUTS ON A TOUCH-SCREEN - A method for interpreting ambiguous click events in relation to click targets on a touch-screen display disposed at respective click target locations includes detecting a touch at a touch location and determining whether the touch location corresponds to a click target location. The method also includes searching for nearby click target locations in a predetermined click region and, upon locating a first click target location within the predetermined click region, associating the touch with the first click target location to identify a click event. A computing device for facilitating accurate touch input targeting with respect to a touch-screen display includes a display component, a touch detection component, a targeting component that associates a touch with a click target, and an event detection component that associates the touch with one of a right click event, a left click event, or a drag event. | 09-17-2009 |
20090327531 | Remote Inking - In one or more embodiments, a bus driver, included on a local computing system, enables detection of hardware available on a host computing system for a remote access session. Upon detecting a hardware device on the host computing system, an operating system included in the local computing system may obtain a device driver for controlling data captured from the hardware device. The device driver may be used to inject data captured from the hardware device into the local operating system's input stack. In some examples, the data is injected into the local operating system's input stack at a layer that corresponds to a layer at which the data was captured on the host computing system. | 12-31-2009 |
20100017758 | PROCESSING FOR DISTINGUISHING PEN GESTURES AND DYNAMIC SELF-CALIBRATION OF PEN-BASED COMPUTING SYSTEMS - Systems, methods, and computer-readable media process and distinguish user input device gestures, such as gestures input via a pen in a pen-based computing system, e.g., to quickly and reliably distinguish between electronic ink entry, single taps, double taps, press-and-hold actions, dragging operations, and the like. Systems, methods, and computer-readable media also are provided for dynamically calibrating a computer system, e.g., calibrating a displayed input panel view based on input data recognized and received by a digitizer. Such systems and methods may operate without entering a dedicated or special calibration application, program, or routine. | 01-21-2010 |
20100146444 | Motion Adaptive User Interface Service - Motion adaptive user interface service is described. In embodiment(s), a user interface can be displayed on an integrated display of a device when an application is executed on the device. Context data associated with movement of the device can be received and used to determine an enhancement of the user interface for ease of usability. The enhancement can then be initiated to modify the user interface while the device is in motion. | 06-10-2010 |
20100207904 | TARGETING IN A STYLUS-BASED USER INTERFACE - Aspects of the invention provide virtual hover zones. When a user lowers a hovering stylus while remaining within a hover zone, cursor control is modified to be more easily controllable by the user. If the user pauses the stylus in mid-air before lowering the stylus, and if the stylus remains within the hover zone, then upon touchdown the cursor may be moved to the projection of the location where the stylus was paused. Any action that may be taken in response to the touch down may be sent to the projection location as well. Also provided are cursor control zones. A dampening zone may be used to provide dampened cursor movement feedback in response to movement input provided by a pointing device. Also, a dead zone may be used to prohibit cursor movement in response to movement input provided by the pointing device. | 08-19-2010 |
20100245106 | Mobile Computer Device Binding Feedback - Embodiments of mobile computer device binding feedback are described. In embodiments, an application interface for a device application is displayed on a first display that is integrated in a first housing of a dual-display mobile computer device. The application interface can also be displayed on a second display that is integrated in a second housing of the dual-display mobile computer device. Binding position data is received that is associated with a binding system that movably connects the first housing and the second housing. Application context data that is associated with the device application is also received. Feedback can then be generated that correlates to the binding position data and to the application context data. | 09-30-2010 |
20100295873 | AUTOMATIC USER VIEWING PREFERENCE - A system may allow an initial viewing adjustment curve set at a factory to be adjusted by a user. The adjustment may pull the viewing adjustment curve in a particular direction but may not result in a multistep, jerky viewing adjustment curve; through the use of regions and smoothing, the viewing adjustment curve may retain its curve design. | 11-25-2010 |
20100318930 | ASSISTING USER INTERFACE ELEMENT USE - Methods of controlling the display and use of a UI element are disclosed. In an embodiment, the UI element may be configured so that it initially maintains a topmost position but eventually allows other applications to assume the topmost position. In an embodiment, the display of the element may be adjusted in response to an input so that the UI element is not visible on the display. In an embodiment, the use of the UI element may allow for seamless dragging of the UI element even if the user inadvertently fails to make consistent contact with the touch-sensitive display while dragging the UI element. | 12-16-2010 |
20110063192 | MOBILE COMPUTER DEVICE BINDING FEEDBACK - In embodiments of mobile computer device binding feedback, an application interface for a device application is displayed on a first display that is integrated in a dual-display mobile device. The application interface can also be displayed on a second display that is integrated in the dual-display mobile device. Binding position data is received from a binding system that movably couples the first display to the second display. Application context data that is associated with the device application is also received. Feedback can then be generated based on the binding position data and the application context data, where the feedback can be generated as audio feedback, video feedback, display feedback, and/or haptic feedback. | 03-17-2011 |
20110072441 | MESSAGE COMMUNICATION OF SENSOR AND OTHER DATA - A service may be provided that reads sensors, and that communicates information based on the sensor readings to applications. In one example, an operating system provides a sensor interface that allows programs that run on a machine to read the values of sensors (such as an accelerometer, light meter, etc.). A service may use the interface to read the value of sensors, and may receive subscriptions to sensor values from other programs. The service may then generate messages that contain the sensor value, and may provide these messages to programs that have subscribed to the messages. The messages may contain raw sensor data. Or, the messages may contain information that is derived from the sensor data and/or from other data. | 03-24-2011 |
20110157062 | Touch Input Data Handling - A system for enabling a tablet input object is described. A tablet input object can take various inputs from touch, a mouse, and a pen and send their information to an application or operating system. Also, a pen message pathway may be used to handle touch messages, thereby reusing an existing pen message pathway for messages created by something other than a pen. | 06-30-2011 |
20110216028 | Methods For Allowing Applications To Filter Out Or Opt Into Tablet Input - Methods and systems for enabling a tablet input object are described. A tablet input object can take various inputs from touch, a mouse, and a pen and send their information to an application. | 09-08-2011 |
20120274592 | INTERPRETING AMBIGUOUS INPUTS ON A TOUCH-SCREEN - Methods are provided for interpreting a touch in relation to touch targets displayed on a touch-screen display, the touch targets associated with an application. A touch is detected at a first touch-screen location. The application is queried to determine a first touch target located within a predetermined touch region of the first touch-screen location. The application is then queried to determine whether a second touch target is located within the predetermined touch region. The touch is then disambiguated to determine whether the touch was intended for the first touch target or the second touch target. | 11-01-2012 |
20130044070 | Unintentional Touch Rejection - A method for rejecting an unintentional palm touch is disclosed. In at least some embodiments, a touch is detected by a touch-sensitive surface associated with a display. Characteristics of the touch may be used to generate a set of parameters related to the touch. In an embodiment, firmware is used to determine a reliability value for the touch. The reliability value and the location of the touch are provided to a software module. The software module uses the reliability value and an activity context to determine a confidence level of the touch. In an embodiment, the confidence level may include an evaluation of changes in the reliability value over time. If the confidence level for the touch is too low, it may be rejected. | 02-21-2013 |
20130067486 | MESSAGE COMMUNICATION OF SENSOR AND OTHER DATA - A service may be provided that reads sensors, and that communicates information based on the sensor readings to applications. In one example, an operating system provides a sensor interface that allows programs that run on a machine to read the values of sensors (such as an accelerometer, light meter, etc.). A service may use the interface to read the value of sensors, and may receive subscriptions to sensor values from other programs. The service may then generate messages that contain the sensor value, and may provide these messages to programs that have subscribed to the messages. The messages may contain raw sensor data. Or, the messages may contain information that is derived from the sensor data and/or from other data. | 03-14-2013 |
20130314316 | TARGETING IN A STYLUS-BASED USER INTERFACE - Aspects of the invention provide virtual hover zones. When a user lowers a hovering stylus while remaining within a hover zone, cursor control is modified to be more easily controllable by the user. If the user pauses the stylus in mid-air before lowering the stylus, and if the stylus remains within the hover zone, then upon touchdown the cursor may be moved to the projection of the location where the stylus was paused. Any action that may be taken in response to the touch down may be sent to the projection location as well. Also provided are cursor control zones. A dampening zone may be used to provide dampened cursor movement feedback in response to movement input provided by a pointing device. Also, a dead zone may be used to prohibit cursor movement in response to movement input provided by the pointing device. | 11-28-2013 |
20130326432 | Processing For Distinguishing Pen Gestures And Dynamic Self-Calibration Of Pen-Based Computing Systems - Systems, methods, and computer-readable media process and distinguish user input device gestures, such as gestures input via a pen in a pen-based computing system, e.g., to quickly and reliably distinguish between electronic ink entry, single taps, double taps, press-and-hold actions, dragging operations, and the like. Systems, methods, and computer-readable media also are provided for dynamically calibrating a computer system, e.g., calibrating a displayed input panel view based on input data recognized and received by a digitizer. Such systems and methods may operate without entering a dedicated or special calibration application, program, or routine. | 12-05-2013 |
20130326544 | Remote Inking - In one or more embodiments, a bus driver, included on a local computing system, enables detection of hardware available on a host computing system for a remote access session. Upon detecting a hardware device on the host computing system, an operating system included in the local computing system may obtain a device driver for controlling data captured from the hardware device. The device driver may be used to inject data captured from the hardware device into the local operating system's input stack. In some examples, the data is injected into the local operating system's input stack at a layer that corresponds to a layer at which the data was captured on the host computing system. | 12-05-2013 |
20140111462 | Unintentional Touch Rejection - A method for rejecting an unintentional palm touch is disclosed. In at least some embodiments, a touch is detected by a touch-sensitive surface associated with a display. Characteristics of the touch may be used to generate a set of parameters related to the touch. In an embodiment, firmware is used to determine a reliability value for the touch. The reliability value and the location of the touch are provided to a software module. The software module uses the reliability value and an activity context to determine a confidence level of the touch. In an embodiment, the confidence level may include an evaluation of changes in the reliability value over time. If the confidence level for the touch is too low, it may be rejected. | 04-24-2014 |
20140285457 | Touch Input Data Handling - A system for enabling a tablet input object is described. A tablet input object can take various inputs from touch, a mouse, and a pen and send their information to an application or operating system. Also, a pen message pathway may be used to handle touch messages, thereby reusing an existing pen message pathway for messages created by something other than a pen. | 09-25-2014 |
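Several of the applications above (e.g., 20090231285 and 20120274592) describe associating an ambiguous touch with the nearest click target inside a predetermined click region. The applications do not publish source code, so the following is only a minimal sketch of that general idea; the names `ClickTarget`, `resolve_touch`, and the `click_radius` default are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class ClickTarget:
    # Hypothetical target record: a label and the target's screen location.
    name: str
    x: float
    y: float

def resolve_touch(touch_x, touch_y, targets, click_radius=24.0):
    """Associate a touch with the nearest click target whose distance from
    the touch location falls within a predetermined click region; return
    None if no target lies in the region (a sketch of the disambiguation
    described in application 20090231285)."""
    best = None
    best_dist = click_radius  # only targets within the click region qualify
    for t in targets:
        d = math.hypot(t.x - touch_x, t.y - touch_y)
        if d <= best_dist:
            best, best_dist = t, d
    return best

targets = [ClickTarget("ok", 100, 200), ClickTarget("cancel", 160, 200)]
hit = resolve_touch(110, 205, targets)   # nearest target within the region
miss = resolve_touch(400, 400, targets)  # no target within the region
```

In this sketch a touch landing near the "ok" target resolves to it even though the touch location does not coincide with the target, while a touch far from every target resolves to nothing and could be ignored or disambiguated further.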