Entries |
Document | Title | Date |
20080222575 | Device Comprising a Detector for Detecting an Uninterrupted Looping Movement | 09-11-2008 |
20080229255 | Apparatus, method and system for gesture detection - Apparatuses, methods, and computer program products are provided to sense orientations or sequence of orientations, i.e. gestures, of mobile devices. The orientation or sequence of orientations control components and/or functions of the mobile device. Indications may be provided to a user to inform the user that the mobile device is in a particular orientation, or that the user has successfully performed a sequence of orientations corresponding to a functionality of the mobile device. The orientation or sequence of orientations may be performed while the mobile device is in a locked or idle state in order to control components and/or functions of the mobile device. A low energy sensor may activate the mobile device after a particular orientation is achieved. | 09-18-2008 |
20080229256 | Image processing apparatus, image processing method and computer readable medium - The image processing apparatus is provided with: a display that displays an object; a receiving device that receives a specified position specified by a user on the display; a holding device that holds a setting of a process for the object; a judgment device that judges whether or not the specified position received by the receiving device is included in the overlapping display areas of a plurality of objects; and an execution device that executes the process for selecting the plurality of objects whose display areas include the specified position received by the receiving device, according to the setting held in the holding device, when the judgment device judges that the specified position received by the receiving device is included in the overlapping display areas of the plurality of objects. | 09-18-2008 |
20080244465 | COMMAND INPUT BY HAND GESTURES CAPTURED FROM CAMERA - A method and system for invoking an operation of a communication terminal in response to registering and interpreting a predetermined motion or pattern of an object. An input is received, the image data of the object is captured and the object in the image data is identified. | 10-02-2008 |
20080244466 | SYSTEM AND METHOD FOR INTERFACING WITH INFORMATION ON A DISPLAY SCREEN - A technique for interfacing with graphical information on a display screen involves using a hand-held controller unit to collect image information that includes at least a portion of the display screen and using the image of the display screen to generate position data that is indicative of the position of the hand-held controller unit relative to the display screen. An action in a computer program related to the graphical information is then triggered in response to the position data and in response to a user input at the hand-held controller unit. Using this technique, a user can navigate a graphical user interface on a display screen with a hand-held controller unit without relying on beacon-based navigation. | 10-02-2008 |
20080244467 | METHOD FOR EXECUTING USER COMMAND ACCORDING TO SPATIAL MOVEMENT OF USER INPUT DEVICE AND IMAGE APPARATUS THEREOF - A method for executing a user command based on spatial movement of a user input device and an image apparatus having the same are provided. According to the method for executing a user command, a user command which is determined based on the spatial movement of the user input device is executed. Accordingly, a method for inputting a user command becomes more diverse and convenient to use, and a more compact user input device may be provided. | 10-02-2008 |
20080244468 | Gesture Recognition Interface System with Vertical Display - One embodiment of the invention includes a gesture recognition interface system. The system may comprise a substantially vertical surface configured to define a gesture recognition environment based on physical space in a foreground of the substantially vertical surface. The system may also comprise at least one light source positioned to provide illumination of the gesture recognition environment. The system also comprises at least two cameras configured to generate a plurality of image sets based on the illumination being reflected from an input object in the gesture recognition environment. The system further comprises a controller configured to determine a given input gesture based on changes in relative locations of the input object in each of the plurality of image sets. The controller may further be configured to initiate a device input associated with the given input gesture. | 10-02-2008 |
20080256494 | TOUCHLESS HAND GESTURE DEVICE CONTROLLER - A simple user interface for touchless control of electrically operated equipment. Unlike other systems, which depend on distance to the sensor or sensor selection, this system depends on hand and/or finger motions: a hand wave in a certain direction, a flick of the hand in one area, holding the hand in one area, or pointing with one finger, for example. The device is based on optical pattern recognition using a solid-state optical matrix sensor with a lens to detect hand motions. This sensor is connected to a digital image processor, which interprets the patterns of motion and outputs the results as signals to control fixtures, appliances, machinery, or any device controllable through electrical signals. | 10-16-2008 |
20080263479 | Touchless Manipulation of an Image - The invention relates to a method of providing touchless manipulation of an image through a touchless input device. | 10-23-2008 |
20080270950 | METHOD AND APPARATUS FOR IMPORTING DATA FROM AN APPLICATION INTO A SECOND APPLICATION - One embodiment of the present invention provides a system that automatically acquires data from an application and imports the data into a second application. During operation, the system receives at a data-acquisition tool a command from a user to acquire data from the application. In response to the command, the system overlays a semi-transparent layer over at least a portion of a display which is generated by the application, so that the data within the display is still visible to the user. Next, the system receives a drawing command from the user to draw a shape around an item of data within the display. In response to the drawing command, the system draws a shape around the item of data within the display, wherein the shape is drawn on the semi-transparent layer. The system then acquires the item of data bounded by the shape. | 10-30-2008 |
20080276203 | USER INTERFACE AND COOKING OVEN PROVIDED WITH SUCH USER INTERFACE - A user interface for domestic appliances, particularly for cooking ovens, comprises an input and a display for showing menus and/or items selected by the user through said input. The input comprises a selection zone where the user's finger can move, the display having at least a portion with a shape substantially corresponding to the shape of the selection zone and showing the result of the finger movement in terms of item or menu selection. | 11-06-2008 |
20080282202 | GESTURED MOVEMENT OF OBJECT TO DISPLAY EDGE - The use of gestures to organize displayed objects on an interactive display. The gesture is used to move the displayed object to the edge of the interactive display so that the displayed object is only partially displayed after being moved. The size of the displayed object may be reduced and/or the displayed object may be rotated such that an identified portion of the displayed object remains in the display after moving. A gesture may also be used to move multiple displayed objects to the edge of the display. | 11-13-2008 |
20080282203 | GENERATING VECTOR GEOMETRY FROM RASTER INPUT FOR SEMI-AUTOMATIC LAND PLANNING - One embodiment of the invention includes a land planning tool that may be used to perform a variety of land planning tasks. The land planning tool may interpret geographic information system (GIS) electronic data in conjunction with user-specified constraints to analyze and display a development site, visually indicating developable areas. The user may then use a pen-based device to sketch outlines of land planning objects. As the user sketches, the land planning tool may generate vector geometry stored in an electronic database for use by a variety of computer aided design tools. | 11-13-2008 |
20080288895 | Touch-Down Feed-Forward in 3D Touch Interaction - A 3-D display device in which zooming is controlled based on the distance of a user's finger from the display screen generates a virtual drop shadow of the user's finger at the detected X/Y position of the finger with respect to the display screen. The virtual drop shadow represents the center of the zooming of the display image. In addition, the size and darkness of the drop shadow change relative to the distance of the user's finger from the display screen. | 11-20-2008 |
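The entry above describes a drop shadow whose size and darkness track the finger's height above the screen. A minimal sketch of that mapping is below; the function name, sensing range, and numeric constants are illustrative assumptions, not details from the patent.

```python
def shadow_params(finger_height_mm, max_height_mm=50.0,
                  min_radius=4.0, max_radius=20.0):
    """Map finger height above the screen to a drop-shadow radius and opacity.

    A closer finger yields a smaller, darker shadow, as the abstract
    describes; all ranges here are made-up example values.
    """
    # Clamp the height into the assumed sensing range.
    h = max(0.0, min(finger_height_mm, max_height_mm))
    t = h / max_height_mm              # 0.0 at touch-down, 1.0 at max range
    radius = min_radius + t * (max_radius - min_radius)
    opacity = 1.0 - 0.8 * t            # darkest at touch-down, faint far away
    return radius, opacity
```

A renderer would call this every frame with the hover sensor's latest height reading and draw the shadow at the detected X/Y position.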
20080288896 | Method And System For Attention-Free User Input On A Computing Device - A method and system for attention-free user input on a computing device is described that allows the recognition of a user input irrespective of the area of entry of the user input on a writing surface (such as a digitizer) without the user having to make a visual contact with the writing surface. | 11-20-2008 |
20080313575 | SYSTEM AND PROCESS FOR CONTROLLING ELECTRONIC COMPONENTS IN A UBIQUITOUS COMPUTING ENVIRONMENT USING MULTIMODAL INTEGRATION - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, gesture recognition subsystem employing a wireless pointing device and pointing analysis subsystem also employing the pointing device, are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs. | 12-18-2008 |
20080320419 | Touch Screen Device, Method, and Graphical User Interface for Providing Maps, Directions, and Location-Based Information - A device, method, and graphical user interface for providing maps, directions, and location-based information on a touch screen display are disclosed. | 12-25-2008 |
20090007025 | USER-INTERFACE FEATURES FOR COMPUTERS WITH CONTACT-SENSITIVE DISPLAYS - Embodiments described herein provide for a portable computer with a contact-sensitive display having a user-interface that is configurable through user-contact with the display. An active input area may be provided that is configurable in appearance and functionality. The contents of the active input area, its functionality, and the manner in which it is oriented, particularly with respect to a left or right handedness, are described herein. | 01-01-2009 |
20090024965 | GRAPHICAL METHOD OF SEMANTIC ORIENTED MODEL ANALYSIS AND TRANSFORMATION DESIGN - The user is given a new modeling capability, a Semantic Lasso, which allows grouping of model elements so they can be mapped or transformed to the high-level concepts of a business ontology. Such grouping is done by drawing a line contour around the model elements relevant to the advertised high-level concept on one of the OMG Unified Modeling Language (UML) class diagrams. In addition, the user is given a capability to specify extension points of the high-level concepts and the projection of such extension points to the individual model. Another tooling capability, a Semantic Transformation Lens, allows dynamic graphical projection of individual model fragments to the high-level concepts as those elements are being selected. The Semantic Transformation Lens provides a mechanism for reasoning-based smart selection. | 01-22-2009 |
20090031258 | GESTURE ACTIVATED CLOSE-PROXIMITY COMMUNICATION - A system for establishing a link from a wireless communication device (WCD) to at least one target device that is a member of a particular user group. The process of both locating the target device and establishing a link may incorporate the orientation and/or movement of the WCD into the procedure in lieu of the extensive use of traditional menu interfaces. For example, a WCD may recognize a combination of orientation and/or movement changes as a pattern for triggering activities, such as scanning for other devices. Various movement patterns may also be employed to establish a wireless link and for further interaction between users/devices in a user group. | 01-29-2009 |
20090037849 | Apparatus, methods, and computer program products providing context-dependent gesture recognition - At least some exemplary embodiments of the invention enable the use of context-dependent gestures, for example, in order to assist in the automation of one or more tasks. In one exemplary embodiment, an apparatus senses a predefined gesture and, in conjunction with context information (e.g., location information), performs a predefined action in response to the gesture. As non-limiting examples, the gesture may involve movement of the apparatus (e.g., shaking, tapping) or movement relative to the apparatus (e.g., using a touch screen). In one exemplary embodiment of the invention, a method includes: obtaining context information for an apparatus, wherein the context information includes a predefined context; and in response to sensing a predefined movement associated with the predefined context, performing, by the apparatus, a predefined action, wherein the predefined movement includes a movement of or in relation to the apparatus. | 02-05-2009 |
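The context-dependent dispatch in the entry above (the same gesture triggering different actions depending on sensed context, such as location) reduces to a lookup keyed on the (context, gesture) pair. The bindings below are made-up examples, not from the application.

```python
# Hypothetical (context, gesture) -> action bindings; a real system would
# populate this from user or application configuration.
ACTION_TABLE = {
    ("home", "shake"):      "toggle_lights",
    ("car",  "shake"):      "skip_track",
    ("home", "double_tap"): "mute_tv",
}

def dispatch(context, gesture):
    """Return the predefined action bound to this gesture in this context,
    or None when the pair has no binding."""
    return ACTION_TABLE.get((context, gesture))
```

Note how "shake" resolves to different actions at home versus in the car, which is the core of the claimed context dependence.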
20090064055 | Application Menu User Interface - Methods, systems, and apparatus, including computer program products, for presenting user interface elements. A first page of one or more user interface elements is presented on a touch-sensitive display. Each of the user interface elements corresponds to a respective application. A gesture performed on the touch-sensitive display is detected. In response to the detected gesture, a second page of one or more user interface elements is presented on the touch-sensitive display. | 03-05-2009 |
20090077504 | Processing of Gesture-Based User Interactions - Systems and methods for processing gesture-based user interactions with an interactive display are provided. | 03-19-2009 |
20090089716 | Automatic communication notification and answering method in communication correspondence - A communication correspondence notification and reply method is provided. The method is implemented as a software program with the objective of being less distracting and increasing work productivity compared to prior methods. In particular, a notification format for incoming communication correspondences is determined, without any guidance/input from the user, taking into account (i) monitored user activity and (ii) the type of incoming correspondence; i.e., the notification format is a function of tracked/monitored user activity and the message type, with the objective of minimizing distraction to the user. To further minimize user distraction, the software program determines an area on the display of the computer system where the incoming correspondence can be presented to the user. Once presented, the user then has the ability to reply with minimal effort by making a pointer-device gesture movement in reply to the presented notification. | 04-02-2009 |
20090089717 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE MOBILE TERMINAL - A method of controlling a mobile terminal and which includes displaying a first screen image on a touch screen of the mobile terminal as an idle background screen, receiving a touch and drag input operation being performed on the touch screen including the displayed first screen image, and displaying a second screen image corresponding to a direction of the touch and drag input operation, said second screen image corresponding to a new idle background screen. | 04-02-2009 |
20090094560 | HANDLE FLAGS - The claimed subject matter provides techniques to effectuate and facilitate efficient and flexible selection of display objects. The system can include devices and components that acquire gestures from pointing instrumentalities and thereafter ascertain velocities and proximities in relation to the displayed objects. Based at least upon these ascertained velocities and proximities falling below or within threshold levels, the system displays flags associated with the displayed object. | 04-09-2009 |
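The threshold test in the entry above (show a flag only when pointer velocity and distance to the object both fall below limits) can be sketched in a few lines. The function name and threshold values are illustrative assumptions.

```python
import math

def should_show_flag(pointer, obj_center, velocity,
                     max_velocity=120.0, max_distance=80.0):
    """Decide whether to display a selection flag for an object.

    The flag appears only when the pointer has slowed down (velocity in
    px/s below a limit) near the object (Euclidean distance in px below a
    limit). Both thresholds here are made-up example values.
    """
    dx = pointer[0] - obj_center[0]
    dy = pointer[1] - obj_center[1]
    distance = math.hypot(dx, dy)
    return velocity < max_velocity and distance < max_distance
```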
20090094561 | Displaying Personalized Documents To Users Of A Surface Computer - Methods, apparatus, and products are disclosed for displaying personalized documents to users of a surface computer, the surface computer comprising a surface and being capable of receiving multi-touch input through the surface and rendering display output on the surface. The disclosed techniques include: registering a plurality of users with the surface computer; assigning, to each registered user, a portion of the surface for interaction between that registered user and the surface computer; selecting a user profile for each registered user; creating, for each registered user from a content repository, personalized display content for that registered user in dependence upon the user profile selected for that registered user; and rendering the personalized display content for each registered user on that user's assigned portion of the surface. | 04-09-2009 |
20090094562 | MENU DISPLAY METHOD FOR A MOBILE COMMUNICATION TERMINAL - A mobile terminal comprising a display module to display a tag and to display a menu screen image related to the tag at one portion of a background image as the tag is dragged, the menu screen image being displayed according to a dragging direction and a dragging distance; an input unit to detect a touch input with respect to the display module or the tag to determine the dragging direction and the dragging distance; and a controller to expose or hide the menu screen image according to the dragging direction and the dragging distance of the inputted touch. | 04-09-2009 |
20090100383 | PREDICTIVE GESTURING IN GRAPHICAL USER INTERFACE - A computing system. The computing system includes a display presenting a user interface, and a gesture input configured to translate a user gesture into a command for controlling the computing system. The computing system also includes a gesture-predicting engine to predict a plurality of possible commands based on the beginning of the user gesture, and a rendering engine to indicate the plurality of possible commands via the user interface. | 04-16-2009 |
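Predicting possible commands from the beginning of a gesture, as in the entry above, amounts to prefix matching against a gesture vocabulary. The vocabulary and function below are hypothetical examples, not the patent's actual method.

```python
def predict_commands(partial_strokes, gesture_table):
    """Return every command whose full gesture begins with the stroke
    sequence observed so far (sorted for a stable display order)."""
    n = len(partial_strokes)
    return sorted(cmd for cmd, strokes in gesture_table.items()
                  if list(strokes[:n]) == list(partial_strokes))

# Hypothetical gesture vocabulary for the example.
GESTURES = {
    "copy":  ["down", "right"],
    "cut":   ["down", "left"],
    "paste": ["up", "right"],
}
```

A rendering engine would call `predict_commands` after each stroke segment and overlay the surviving candidates on the user interface.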
20090100384 | VARIABLE DEVICE GRAPHICAL USER INTERFACE - Methods, systems, devices, and apparatus, including computer program products, for adjusting a graphical user interface. A motion of a device is detected. A graphical user interface of the device is adjusted in response to the detected motion. | 04-16-2009 |
20090113354 | BROADCAST RECEIVING APPARATUS AND CONTROL METHOD THEREOF - A broadcast receiving apparatus wherein a user is able to control a graphical user interface (GUI) using a pointing device, and a control method thereof are provided. The broadcast receiving apparatus receives a movement pattern from a pointing device, and operates a function corresponding to the pattern. The broadcast receiving apparatus conveniently switches the pointing device between a position mode and a step mode, without having a key for switching between the position mode and the step mode or direction keys. | 04-30-2009 |
20090113355 | METHOD AND APPARATUS FOR CONTROLLING MULTI-TASKING OPERATION FOR TERMINAL DEVICE PROVIDED WITH TOUCH SCREEN - A method of controlling a terminal device, and which includes executing a first function on the terminal device, displaying at least one function icon for executing at least one second function that is different than the first function being executed, and selectively executing the second function simultaneously with the first function when said at least one function icon is selected. | 04-30-2009 |
20090125848 | TOUCH SURFACE-SENSITIVE EDIT SYSTEM - A method, medium and implementing processing system are provided in which displayed text is manipulated using two fingers within an editing application to select a region of text or objects. In an example, two fingers are placed on a touch-sensitive display or touch pad and the region of text between the fingers is selected. The selected text can be manipulated as otherwise selected text is currently manipulated, e.g. cut, paste and copy functions can be performed. The movement of the fingers also performs this manipulation. In one example, if the fingers are brought together, the selected text is cut, or a split screen could occur. If the fingers are placed together and then parted, the action would be to part the text to make room for a picture or other insert. | 05-14-2009 |
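The two-finger editing gestures above (select between fingers, pinch to cut, spread to part the text) can be sketched on character indices. This is a simplified model under the assumption that each finger position has already been mapped to an index in the text; all names are illustrative.

```python
def select_between(text, pos_a, pos_b):
    """Return the text between two finger positions (character indices)."""
    lo, hi = sorted((pos_a, pos_b))
    return text[lo:hi]

def pinch_cut(text, pos_a, pos_b):
    """Fingers brought together: cut the selection, returning
    (remaining_text, clipboard)."""
    lo, hi = sorted((pos_a, pos_b))
    return text[:lo] + text[hi:], text[lo:hi]

def spread_insert(text, pos, inserted):
    """Fingers placed together and then parted: part the text at `pos` to
    make room for an insert (a placeholder string stands in for a picture)."""
    return text[:pos] + inserted + text[pos:]
```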
20090125849 | Eye Tracker with Visual Feedback - The present invention relates to entry of control commands into a computer in response to eye-tracker detected movement sequences of a point of regard over a graphical display, which is associated with the computer. A processing module in the computer causes the display to present graphical feedback information in the form of a data-manipulating window, which visually confirms any entered control commands. The data-manipulating window is presented at a position relative to an active control object on the display, such that a center point of the window is located within a relatively small offset distance from a center point of the active control object. The window includes graphical information which symbolizes an active portion of the display that is presently the object of an eye-tracker-controlled entry of control commands. Moreover, the information in the window is repeatedly updated in response to the eye-tracker-controlled entry of control commands. | 05-14-2009 |
20090138830 | Method, article, apparatus and computer system for inputting a graphical object - Method, article, apparatus and computer system facilitating the easy and intuitive inputting of a desired graphical object into an electronic system from a large plurality of predetermined graphical objects. In one example embodiment, this is achieved by assigning each of said graphical objects into one of a plurality of groups in accordance with a predetermined similarity criterion, associating respective base shapes to each of said groups, wherein said base shapes have a certain degree of similarity to the objects assigned to the associated group according to said similarity criterion, and associating in each of said groups at least one gesture with each of said graphical objects, so that the associated gestures are distinguishable from each other. In order to input the desired graphical object, one of the groups is selected by selecting its base shape and then the desired graphical object is identified by drawing the respective gesture associated thereto. | 05-28-2009 |
20090138831 | Apparatus and method of determining a user selection in a user interface | 05-28-2009 |
20090144667 | Apparatus, method, computer program and user interface for enabling user input - An apparatus including a display for presenting text; a touch sensitive input device configured to enable a user to make a trace input via the display; and a processor, wherein the processor is configured to detect a first trace input that starts at a predetermined first location and extends across the touch sensitive input device to a second location wherein the processor is configured such that the detection of the first trace input actuates the deletion of the text presented between the predetermined first location and the second location. | 06-04-2009 |
20090144668 | SENSING APPARATUS AND OPERATING METHOD THEREOF - A sensing apparatus is disclosed. The sensing apparatus comprises a first image capturing module, a second image capturing module, a calculating module, and a controlling module. The first image capturing module and the second image capturing module capture a first image and a second image related to a plurality of objects respectively at a specific time. The calculating module obtains a 3-D position of an object according to the first image and the second image and obtains a 3-D displacement of the object according to the 3-D position and a former 3-D position of the object. If any one of the 3-D displacements corresponding to the objects is approximately vertical, the controlling module controls an electrical apparatus to perform a first function. If a weighted average of the approximately horizontal vector components of all 3-D displacements meets a condition, the controlling module controls the electrical apparatus to perform a second function. | 06-04-2009 |
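The decision logic above (a first function for an approximately vertical displacement, a second for a sufficient weighted average of horizontal components) can be sketched as follows. The dominance ratio, threshold, and axis convention (y vertical) are all assumptions for illustration.

```python
import math

def classify_displacements(displacements, weights=None,
                           vertical_ratio=2.0, horizontal_threshold=1.0):
    """Classify per-object 3-D displacements (dx, dy, dz), with dy vertical.

    Returns "first" if any displacement is approximately vertical,
    "second" if the weighted average of horizontal components exceeds a
    threshold, else None. Ratio and threshold are illustrative values.
    """
    if weights is None:
        weights = [1.0] * len(displacements)
    for dx, dy, dz in displacements:
        # "Approximately vertical": vertical motion dominates both horizontal axes.
        if abs(dy) > vertical_ratio * max(abs(dx), abs(dz)):
            return "first"
    total = sum(weights)
    avg_dx = sum(w * d[0] for w, d in zip(weights, displacements)) / total
    avg_dz = sum(w * d[2] for w, d in zip(weights, displacements)) / total
    if math.hypot(avg_dx, avg_dz) > horizontal_threshold:
        return "second"
    return None
```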
20090158219 | ENGINE SUPPORT FOR PARSING CORRECTION USER INTERFACES - A parsing system provides a parsed document to a user application labeling the document with indication symbols according to a scheme associated with the parsing results. Users are enabled to insert correction indicators such as handwritten gestures, icon selections, menu item selections, and the like in conjunction with the indication symbols. The document is re-analyzed performing the requested corrections such as line or block separations, line, block, word connections, etc. The operations provide support for the engine stack of the parsing system while accommodating independent user interfaces employed by the users. Insertion of correction indicators and subsequent re-analysis for correction may be performed upon user signal, in an iterative manner, or continuously. | 06-18-2009 |
20090158220 | DYNAMIC THREE-DIMENSIONAL OBJECT MAPPING FOR USER-DEFINED CONTROL DEVICE - A computer-implemented method is provided to interactively capture and utilize a three-dimensional object as a controlling device for a computer system. One operation of the method is capturing depth data of the three-dimensional object. In another operation, the depth data of the three-dimensional object undergoes processing to create geometric defining parameters for the three-dimensional object. The method can also include defining correlations between particular actions performed with the three-dimensional object and particular actions to be performed by the computer system. The method also includes an operation to save the geometric defining parameters of the three-dimensional object to a recognized object database. In another operation, the correlations between particular actions performed with the three-dimensional object and particular actions to be performed by the computer system in response to recognizing the particular actions are also saved to the recognized object database. | 06-18-2009 |
20090164951 | Input architecture for devices with small input areas and executing multiple applications - A run time environment (e.g., operating system, device drivers, etc.) translates a touch gesture representing one or more directions on a touch screen into a corresponding choice and indicates the choice to a user application. As the choice depends merely on the direction(s) of movement of the touch, choices can be easily indicated for all applications executing on a device with small input areas. | 06-25-2009 |
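Reducing a touch stroke to a direction, as the entry above describes, is a small quantization step. The sketch below assumes screen-pixel coordinates with y growing downward (a common convention, but an assumption here) and an illustrative dead zone.

```python
def swipe_direction(start, end, dead_zone=10.0):
    """Reduce a touch stroke to one of four directions, or None if the
    movement stayed inside the dead zone.

    Coordinates are (x, y) pixels with y growing downward; the dead-zone
    size is a made-up example value.
    """
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if max(abs(dx), abs(dy)) < dead_zone:
        return None
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

A run time environment could then hand the resulting token to the foreground application as the user's choice, independent of where on the small screen the stroke occurred.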
20090164952 | CONTROLLING AN OBJECT WITHIN AN ENVIRONMENT USING A POINTING DEVICE - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, gesture recognition subsystem employing a wireless pointing device and pointing analysis subsystem also employing the pointing device, are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs. | 06-25-2009 |
20090172606 | METHOD AND APPARATUS FOR TWO-HANDED COMPUTER USER INTERFACE WITH GESTURE RECOGNITION - A method and apparatus for manipulating displayed content using first and second types of human-machine interface in combination are disclosed. Machine operations are divided into two sets and the first type of user interface controls a first set and a second set of operations, while the second type of user interface controls only the second set. In a preferred method embodiment, one hand controls the first set via a mouse interface and the other hand controls the second set via a stereo camera based hand gesture recognition interface. In a preferred apparatus embodiment, the apparatus has a manipulable input device capable of interacting with displayed content and visualization of the displayed content. Additionally, the apparatus has a gesture based input device capable of interacting only with the visualization of the displayed content. | 07-02-2009 |
20090178010 | Specifying Language and Other Preferences for Mobile Device Applications - A user interface for specifying a preference for content is displayed over the content on a display of a mobile device. Preferences (e.g., language preferences) can be specified for audio, closed captions, subtitles and any other features or operations associated with the mobile device. In one aspect, the user interface is a partially transparent sheet that at least partially overlies the content. The sheet can be navigated (e.g., scrolled) in response to input (e.g., touch input). In one aspect, the specified option is made a default option for at least some other applications running on the mobile device. In one aspect, the content is video which is automatically paused while the user interface is displayed. | 07-09-2009 |
20090178011 | GESTURE MOVIES - The display of gesture movies is disclosed to assist users in performing gestures. Gesture movies can be short, unintrusive, and available on demand. A list box can appear in a pop-up window or preference panel, containing a list of gestures that can be displayed. If a user clicks on a gesture in the list, a video, movie or animation of the gesture being performed appears in one box, and a video, movie or animation of the action being performed on a particular object is displayed in another box. Thus, a hand can be shown performing the selected gesture over a touch sensor panel, while at the same time, and synchronized with the gesture being displayed, an object being manipulated by the gesture is displayed. | 07-09-2009 |
20090193366 | GRAPHICAL USER INTERFACE FOR LARGE-SCALE, MULTI-USER, MULTI-TOUCH SYSTEMS - A method implemented on the graphical user interface device to invoke an independent, user-localized menu in an application environment, by making a predetermined gesture with a pointing device on an arbitrary part of a display screen or surface, especially when applied in a multi-touch, multi-user environment, and in environments where multiple concurrent pointing devices are present. As an example, the user may trace out a closed loop of a specific size that invokes a default system menu at any location on the surface, even when a second user may be operating a different portion of the system elsewhere on the same surface. As an additional aspect of the invention, the method allows the user to smoothly transition between the menu-invocation and menu control. | 07-30-2009 |
20090217210 | SYSTEM AND METHOD FOR TELEVISION CONTROL USING HAND GESTURES - Systems and methods which allow for control of televisions and other media devices are disclosed. A television set is provided with a gesture capture device configured to receive a gesture input directed to at least one of a plurality of predefined areas related to the television set. The television set further includes a user interaction interface configured to generate data indicative of the gesture input directed toward the at least one of the predefined areas and control the television based at least in part on the generated data. | 08-27-2009 |
20090217211 | ENHANCED INPUT USING RECOGNIZED GESTURES - Enhanced input using recognized gestures, in which a user's gesture is recognized from first and second images, and a representation of the user is displayed in a central region of a control that further includes interaction elements disposed radially in relation to the central region. The enhanced input also includes interacting with the control based on the recognized user's gesture, and controlling an application based on interacting with the control. | 08-27-2009 |
20090222770 | METHOD OF INPUTTING CONTROL INSTRUCTION AND HANDHELD DEVICE THEREOF - A method of inputting a control instruction and a handheld device thereof are provided. The handheld device includes a memory unit, a touch module, and a recognition module. The method includes receiving a writing track input by the user from a touch module, analyzing the writing track by the recognition module to convert the writing track into a track data, and comparing the track data with a feature data stored in the memory unit to judge whether the two are consistent with each other, so as to determine whether to execute a program instruction corresponding to the feature data. Through the handheld device and method, when a user inputs a writing track, the handheld device activates a corresponding application program and specific actions thereof, so as to reduce the time of searching for the application program, thereby enhancing the practicability of the handheld device to the user. | 09-03-2009 |
20090241072 | Unlocking a Device by Performing Gestures on an Unlock Image - A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture. | 09-24-2009 |
20090249258 | Simple Motion Based Input System - In one embodiment, a programmable device embodies a program of executable instructions to perform steps including: assigning multiple tasks or symbols to each of a number of motion groups; segmenting motion data from sensor(s); matching the segments to motion groups; and composing and then selecting task(s) or symbol sequence(s) from the task(s) and/or symbol(s) assigned to the matched motion groups. | 10-01-2009 |
20090254868 | TRANSLATION OF GESTURE RESPONSES IN A VIRTUAL WORLD - Translating gestures made by one avatar to a second avatar in a virtual world by receiving an input from a first user representing an input gesture to be made by the first avatar to the second avatar. The input gesture is translated to generate at least one translated gesture for display. The translated gesture may be output for display as being made by the first avatar to the second avatar. | 10-08-2009 |
20090254869 | MULTI-PARAMETER EXTRACTION ALGORITHMS FOR TACTILE IMAGES FROM USER INTERFACE TACTILE SENSOR ARRAYS - A user interface employs a tactile sensor array producing a rich flux of independently-adjustable interactive control parameters, rates of change, and symbols derived from these as well as tactile shapes, patterns, gestures, syntaxes, and phrases from each of one or more regions of contact or proximity. The tactile sensor array may comprise a pressure sensor array, proximity sensor array, or other sensor such as a video camera. The user interface derives up to six independently-adjustable interactive real-time control parameters plus rates and symbols from a single finger tip. Simple running sums are employed during scans so that individual sensor measurements need not be stored. The user interface supports multiple regions of contact or proximity wherein at least one of the regions has a non-convex shape. In addition, the tactile sensor array may be partitioned into sections or modules each with a separate scanning loop and/or processor. | 10-08-2009 |
20090265669 | LANGUAGE INPUT INTERFACE ON A DEVICE - Methods, systems, devices, and apparatus, including computer program products, for inputting text. A user interface element is presented on a touch-sensitive display of a device. The user interface element is associated with a plurality of characters, at least a subset of which is associated with respective gestures. A user input performing a gesture with respect to the user interface element is received. The character from the subset that is associated with the gesture performed with respect to the user interface element is inputted. | 10-22-2009 |
20090265670 | USER INTERFACE FOR A MOBILE DEVICE USING A USER'S GESTURE IN THE PROXIMITY OF AN ELECTRONIC DEVICE - An electronic device having a user interface on a display and method for controlling the device, the method including: detecting a proximity of an object to the display; detecting a two-dimensional motion pattern of the object; and controlling the user interface according to the detected two-dimensional motion pattern. Also, a method including: detecting an object in a space over a border between first and second zones of a plurality of touch-sensitive zones and outputting a detection signal; and simultaneously displaying first and second information elements corresponding to the first and second zones in response to the detection signal. | 10-22-2009 |
20090265671 | MOBILE DEVICES WITH MOTION GESTURE RECOGNITION - Mobile devices using motion gesture recognition. In one aspect, processing motion to control a portable electronic device includes receiving, on the device, sensed motion data derived from motion sensors of the device and based on device movement in space. The motion sensors include at least three rotational motion sensors and at least three accelerometers. A particular operating mode is determined to be active while the movement of the device occurs, the mode being one of multiple different operating modes of the device. Motion gesture(s) are recognized from the motion data from a set of motion gestures available for recognition in the active operating mode. Each of the different operating modes, when active, has a different set of gestures available. State(s) of the device are changed based on the recognized gestures, including changing output of a display screen on the device. | 10-22-2009 |
20090276734 | Projection of Images onto Tangible User Interfaces - A surface computing device is described which has a surface which can be switched between transparent and diffuse states. When the surface is in its diffuse state, an image can be projected onto the surface and when the surface is in its transparent state, an image can be projected through the surface and onto an object. In an embodiment, the image projected onto the object is redirected onto a different face of the object, so as to provide an additional display surface or to augment the appearance of the object. In another embodiment, the image may be redirected onto another object. | 11-05-2009 |
20090282370 | GRAPHICAL USER INTERFACE FOR DATA ENTRY - A graphical user interface is provided for facilitating entry of data into a telephone, personal digital assistant or other computing device having a touch-sensitive input component (e.g., a touch screen). The interface includes multiple initial contact areas associated with different input (e.g., characters, numerical values, commands), a home area and spokes positioned between the initial contact areas and the home area. The interface is manipulated using gestures. A data input gesture begins by touching in or near an initial contact area and moving to or toward the home area, generally in proximity to the corresponding spoke. Other illustrative gestures include tracing directly from one initial contact area to another (e.g., to add the corresponding data values), performing a “throwing” gesture out of the home area (e.g., to delete the last input), gesturing backward/forward in the home area (e.g., to move backward/forward through a series of fields), etc. | 11-12-2009 |
20090282371 | INTEGRATION SYSTEM FOR MEDICAL INSTRUMENTS WITH REMOTE CONTROL - An integration system for medical instruments is described. In various embodiments, the integration system is useful for managing information from, and controlling, multiple medical instruments in a medical facility, as well as providing high fidelity audio communications between members of a clinical team. The system can be operated remotely in a sterile environment using gesture-based control and/or voice-recognition control. The system can record combined instrument data, clinical data, system data, and video and audio signals from a surgical procedure synchronously, as the data would be perceived during the procedure, in a central database. The recorded data can be retrieved and reviewed for instructional, diagnostic or analytical purposes. | 11-12-2009 |
20090288044 | ACCESSING A MENU UTILIZING A DRAG-OPERATION - Computer-readable media, computerized methods, and computer systems for intuitively invoking a presentation action (e.g., rendering a menu) by applying a drag-operation at a top-level control button rendered at a touchscreen display are provided. Initially, aspects of a user-initiated input applied at the top-level control button are detected. These aspects may include an actuation location and a distance of a drag-movement therefrom. If a distance of the drag-movement at the touchscreen display is greater than a threshold distance in a particular radial direction from the actuation location, the user-initiated input is considered a drag-operation. Typically, a set of trigger boundaries are constructed based on system metrics to assist in disambiguating the drag-operation from a tap-type operation. If a drag-operation is identified, the presentation action is invoked; otherwise, a principal action associated with the top-level control button (e.g., manipulating content of an application) may be invoked. | 11-19-2009 |
20090300554 | Gesture Recognition for Display Zoom Feature - A method, apparatus, and system are disclosed that provide a computing user with an ability to engage in a multitude of operations via the entry of gestures. Computer operations may be mapped to shapes, and a comparison may take place between a user-entered gesture and the shapes to determine whether the gesture approximates at least one of the shapes. Responsive to determining that the gesture approximates at least one of the shapes, an operation associated with the shape may be executed. The operation may include a zoom operation (e.g., a zoom-in or a zoom-out operation), wherein the dimensions of the gesture may influence content to be included in an updated display. Additional adjustments may be performed to improve a resolution associated with the content included in the updated display. | 12-03-2009 |
20090307634 | User Interface, Device and Method for Displaying a Stable Screen View - A user interface configured to display a screen view representing an application and to receive motion data representing a detected movement, said user interface being further configured to update said displayed screen view to visually counteract said detected movement. | 12-10-2009 |
20090313587 | METHOD AND APPARATUS FOR PROVIDING MOTION ACTIVATED UPDATING OF WEATHER INFORMATION - An approach provides updating of weather information on a mobile device. Motion of a mobile device is detected, wherein the mobile device is configured to execute a weather application for presenting weather information to a user. Update of the weather information is retrieved in response to the detected motion. | 12-17-2009 |
20090327974 | USER INTERFACE FOR GESTURAL CONTROL - A UI (user interface) for gestural control enhances the navigation experience for the user by preventing multiple gestures from being inadvertently invoked at the same time. This problem is overcome by establishing two or more categories of gestures. For instance, the first category of gestures may include gestures that are likely to be invoked before gestures that are included in the second category of gestures. That is, gestures in the second category will typically be invoked after a gesture in the first category has already been invoked. One example of a gesture that falls into the first category may be a gesture that initiates operation of a device, whereas a gesture that falls into the second category may be a change in volume. Gestures that fall into the second category require more criteria to be satisfied in order to be invoked than gestures that fall into the first category. | 12-31-2009 |
20090327975 | Multi-Touch Sorting Gesture - A method and apparatus are provided for recognizing multi-touch gestures on a touch sensitive display. A plurality of graphical objects is displayed within a user interface (UI) of a display screen operable to receive touch input. A first touch input exceeding a first time duration is detected over a first graphical object. A touch-and-hold gesture action is generated, which is then applied to the first graphical object. A second touch input is then detected over a second graphical object and a touch-select gesture action is generated, which is then applied to the second graphical object. The first and second gestures are processed to determine an associated operation, which is then performed on the second graphical object. | 12-31-2009 |
20090327976 | Portable Device, Method, and Graphical User Interface for Displaying a Portion of an Electronic Document on a Touch Screen Display - A computer-implemented method for use in conjunction with a computing device with a touch screen display comprises displaying a portion of a web page in a web browser application without concurrently displaying a Uniform Resource Locator (URL) entry area for inputting URLs of web pages. A gesture is detected in a predefined area at the top of the touch screen display. In response to detecting the gesture in the predefined area at the top of the touch screen display, the URL entry area is displayed. | 12-31-2009 |
20090327977 | INTERACTIVE CONTROL DEVICE AND METHOD FOR OPERATING THE INTERACTIVE CONTROL DEVICE - An interactive control device includes a display device, and a method is for operating the interactive control device. The method includes: displaying graphical information on the display device; receiving sensor information; activating a control action if on the basis of the sensor information it is ascertained that a body part of a user is located within an activation region that is spatially defined relative to a display region of a control element on the display device with which the control action is associated; the received sensor information including user information that is evaluated prior to an activation of the control action in order to ascertain a control intention for the at least one control element; and the information represented on the display device being adapted as a function of the ascertained control intention such that the at least one control element is represented in a manner optimized for the activation of the control action associated with the control element. The control device may be arranged as a component of a motor vehicle console so as to be able to implement the control method. | 12-31-2009 |
20090327978 | Hand-Held Device and Method for Operating a Single Pointer Touch Sensitive User Interface - A hand-held device and method for operating a single pointer touch sensitive user interface of a hand-held electronic device are provided. The method includes defining as being active a first one of a set of two or more controllable interface functions including at least a first controllable interface function and a second controllable interface function. A control gesture is then detected and the control gesture is associated with the active one of the set of two or more controllable interface functions, where the detected pattern adjusts the performance of the active controllable interface function. A transition gesture is then detected including a pointer pattern movement, which is not included as a control gesture for any of the two or more controllable interface functions, where upon detection of the transition gesture, the transition gesture defines a second one of the set of two or more controllable interface functions as being the active one of the set of two or more controllable interface functions. A further control gesture is then detected and the control gesture is associated with the active one of the set of two or more controllable interface functions, where the detected pattern adjusts the performance of the active controllable interface function. | 12-31-2009 |
20100005427 | Systems and Methods of Touchless Interaction - A contactless display system enables a user to interact with a displayed image by moving a finger, or pointer, toward a selected portion of the image. Images can be enlarged or translated dynamically in response to detected movement. Operational methodology can be manually switched between contact-type and contactless operation to enhance flexibility. | 01-07-2010 |
20100005428 | INFORMATION PROCESSING APPARATUS AND METHOD FOR DISPLAYING AUXILIARY INFORMATION - There is provided an information processing apparatus, including a direction detection unit that detects a drawing direction of a locus drawn in an input process of a gesture when the gesture is input, a gesture search unit that searches for the gesture matching the drawing direction of the locus detected by the direction detection unit from among a plurality of predetermined gestures, and an auxiliary information display unit that displays the search result from the gesture search unit on the screen as auxiliary information each time the drawing direction of the locus is detected by the direction detection unit. | 01-07-2010 |
20100017758 | PROCESSING FOR DISTINGUISHING PEN GESTURES AND DYNAMIC SELF-CALIBRATION OF PEN-BASED COMPUTING SYSTEMS - Systems, methods, and computer-readable media process and distinguish user input device gestures, such as gestures input via a pen in a pen-based computing system, e.g., to quickly and reliably distinguish between electronic ink entry, single taps, double taps, press-and-hold actions, dragging operations, and the like. Systems, methods, and computer-readable media also are provided for dynamically calibrating a computer system, e.g., calibrating a displayed input panel view based on input data recognized and received by a digitizer. Such systems and methods may operate without entering a dedicated or special calibration application, program, or routine. | 01-21-2010 |
20100017759 | Systems and Methods For Physics-Based Tactile Messaging - Systems and methods for physics-based tactile messaging are disclosed. For example, one disclosed method includes the steps of receiving a sensor signal from a sensor configured to sense a physical interaction with a messaging device; determining an interaction between one or more virtual message objects and a virtual message environment, the interaction based at least in part on the sensor signal and a virtual physical parameter of at least one of the one or more virtual message objects; and determining a haptic effect based at least in part on the interaction. The method additionally includes the step of generating a haptic signal configured to cause an actuator to output the haptic effect. | 01-21-2010 |
20100023895 | Touch Interaction with a Curved Display - Touch interaction with a curved display (e.g., a sphere, a hemisphere, a cylinder, etc.) is facilitated by preserving a predetermined orientation for objects. In an example embodiment, a curved display is monitored to detect a touch input on an object. If a touch input on an object is detected based on the monitoring, then one or more locations of the touch input are determined. The object may be manipulated responsive to the determined one or more locations of the touch input. While manipulation of the object is permitted, a predetermined orientation is preserved. | 01-28-2010 |
20100031200 | METHOD OF INPUTTING A HAND-DRAWN PATTERN PASSWORD - A method of inputting a hand-drawn pattern password includes the steps of providing a plurality of input keys on a touch panel; corresponding each of the input keys to a unique character, numeral, or symbol; and causing a user to sequentially touch a sequence of some of the input keys on the touch panel to thereby draw a user-remembered pattern password, so that a sequence of characters, numerals, and/or symbols corresponding to the input keys for drawing the pattern password constitutes an effective password and is input. | 02-04-2010 |
20100031201 | PROJECTION OF A USER INTERFACE OF A DEVICE - A method may include projecting, by a device, content on a surface, determining an orientation of the device, detecting a movement of the device, determining an operation that corresponds to the movement and interacts with the content, and performing the operation. | 02-04-2010 |
20100031202 | USER-DEFINED GESTURE SET FOR SURFACE COMPUTING - The claimed subject matter provides a system and/or a method that facilitates generating an intuitive set of gestures for employment with surface computing. A gesture set creator can prompt two or more users with a potential effect for a portion of displayed data. An interface component can receive at least one surface input from the user in response to the prompted potential effect. A surface detection component can track the surface input utilizing a computer vision-based sensing technique. The gesture set creator collects the surface input from the two or more users in order to identify a user-defined gesture based upon a correlation between the respective surface inputs, wherein the user-defined gesture is defined as an input that initiates the potential effect for the portion of displayed data. | 02-04-2010 |
20100031203 | USER-DEFINED GESTURE SET FOR SURFACE COMPUTING - The claimed subject matter provides a system and/or a method that facilitates generating an intuitive set of gestures for employment with surface computing. A gesture set creator can prompt two or more users with a potential effect for a portion of displayed data. An interface component can receive at least one surface input from the user in response to the prompted potential effect. A surface detection component can track the surface input utilizing a computer vision-based sensing technique. The gesture set creator collects the surface input from the two or more users in order to identify a user-defined gesture based upon a correlation between the respective surface inputs, wherein the user-defined gesture is defined as an input that initiates the potential effect for the portion of displayed data. | 02-04-2010 |
20100037184 | PORTABLE ELECTRONIC DEVICE AND METHOD FOR SELECTING MENU ITEMS - A portable electronic device includes a motion detection module and a storage system. The motion detection module is configured for determining a direction of movement of the portable electronic device when orientation of the portable electronic device has been changed. The motion detection module is further configured for generating an input signal associated with the movement and providing the input signal to an application of the portable electronic device to initiate an operation performed by the application, wherein the input signal includes menu position information of a menu item of the application. The motion detection module is further configured for selecting a desired menu item according to the menu position. The storage system is used for storing the application and movement data of the portable electronic device. | 02-11-2010 |
20100037185 | INPUT METHOD FOR COMMUNICATION DEVICE - An input method for a communication device includes the steps of displaying a result box and a dial ring comprising a plurality of buttons on a touch screen of the communication device; receiving a touch and slide operation on a button; recognizing the touch and slide operation and rotating the dial ring according to the slide operation; and displaying information represented by the touched button in the result box when the touched button rotates to a predetermined position. People who are used to rotary dial phones can use the input method without difficulty. | 02-11-2010 |
20100042954 | Motion based input selection - A method for selecting an input value based on sensed motion is provided. In one embodiment, the method includes varying a graphical element displayed on a handheld device in response to sensed motion to identify an input value. The motion-based input may be used to perform a function on the handheld device or on an external device. For example, the input may be used to open a lock or to rotate a displayed image. Various additional methods, devices, and systems employing motion-based inputs are also provided. | 02-18-2010 |
20100050133 | Compound Gesture Recognition - One embodiment of the invention includes a method for executing and interpreting gesture inputs in a gesture recognition interface system. The method includes detecting and translating a first sub-gesture into a first device input that defines a given reference associated with a portion of displayed visual content. The method also includes detecting and translating a second sub-gesture into a second device input that defines an execution command for the portion of the displayed visual content to which the given reference refers. | 02-25-2010 |
20100050134 | ENHANCED DETECTION OF CIRCULAR ENGAGEMENT GESTURE - The enhanced detection of a circular engagement gesture, in which a shape is defined within motion data, and the motion data is sampled at points that are aligned with the defined shape. It is determined whether a moving object is performing a gesture correlating to the defined shape based on a pattern exhibited by the sampled motion data. An application is controlled if determining that the moving object is performing the gesture. | 02-25-2010 |
20100058251 | OMNIDIRECTIONAL GESTURE DETECTION - An omnidirectional electronic device is disclosed. The electronic device can perform operations associated with a combination of inputs that can, in some cases, be recognized irrespective of the position or orientation in which they are applied to the electronic device. The inputs can include, for example, single or multi-touch taps, presses, swipes, rotations, characters and symbols. The inputs can be provided one or more times in succession and can be held for an amount of time. In one embodiment, an omnidirectional media player can perform media operations associated with a combination of inputs that can be recognized irrespective of the position or orientation in which they are applied to an input area of the media player. | 03-04-2010 |
20100058252 | GESTURE GUIDE SYSTEM AND A METHOD FOR CONTROLLING A COMPUTER SYSTEM BY A GESTURE - A gesture guide system and a method for controlling a computer system by a gesture are provided. The system includes a sensor element and a computer system. The method includes the steps of: communicating the sensor element with the computer system; showing, by the computer system, at least one gesture option and the corresponding function instruction; detecting, by the sensor element, a gesture of the user; and executing, by the computer system, the corresponding function instruction in response to the detected gesture. | 03-04-2010 |
20100058253 | MOBILE TERMINAL AND METHOD FOR CONTROLLING MUSIC PLAY THEREOF - A mobile terminal is provided including a display unit, a sensing unit, and a controller. The display unit is configured as a touch screen for displaying album art of a song currently being played. The sensing unit is for sensing a touch applied to the touch screen. The controller is for controlling play of the song based on a touch sensed at a certain region of the album art displayed on the display unit. | 03-04-2010 |
20100058254 | Information Processing Apparatus and Information Processing Method - To provide an information processing apparatus and information processing method capable of rapidly and easily zooming in or out on an image displayed on a display unit. The apparatus includes a display unit | 03-04-2010 |
20100064261 | PORTABLE ELECTRONIC DEVICE WITH RELATIVE GESTURE RECOGNITION MODE - A computer program executable on a portable electronic device having a touch screen sensor is provided. The computer program may include an input mode switching module configured to receive a mode switch user input to switch between a direct input mode and a relative gesture recognition mode. The computer program may further include a gesture-based control module configured, in the relative gesture recognition mode, to recognize a contact point on the touch screen sensor between a digit of a user and a surface of the touch screen sensor in a defined region in which the graphical user interface elements are unselectable, and to identify a detected gesture based on user touch input originating from the contact point, and to send a message to an application program to adjust an operation of the portable electronic device based on the detected gesture. | 03-11-2010 |
20100064262 | Optical multi-touch method of window interface - An optical multi-touch method of a window interface is adapted to control an object in the window interface. The method includes providing a first optical sensing window to obtain a first tracking signal and providing a second optical sensing window to obtain a second tracking signal; resolving the first tracking signal to determine a first displacement direction and resolving the second tracking signal to determine a second displacement direction; and controlling a motion of the object in the window interface according to a relative relation between the first displacement direction and the second displacement direction. | 03-11-2010 |
20100070931 | METHOD AND APPARATUS FOR SELECTING AN OBJECT - A device and method for selecting objects on a touch-sensitive display are described. The device includes a touch-sensitive display for displaying at least one object, the touch-sensitive display responsive to a user input, a selection detection section operatively coupled to the touch-sensitive display, the selection detection section configured to detect a back-and-forth movement of the user input when the input is in contact with the touch-sensitive display and to select an object on the touch-sensitive display for further operation when the back-and-forth motion is in proximity to the object. | 03-18-2010 |
20100070932 | VEHICLE ON-BOARD DEVICE - A vehicle on-board device includes a user interface device and a processing section. The user interface device is mounted inside of a vehicle, and configured and arranged to output information to a user and to receive a user input. The processing section is operatively coupled to the user interface device, and configured to perform a prescribed function in response to a prescribed user operation received by the user interface device. The processing section is further configured to perform an interactive tutorial control to provide the user with at least one interactive instruction for the prescribed function in which the processing section prompts the user to input the prescribed user operation, determines whether the user input received by the user interface device matches the prescribed user operation, and completes the interactive tutorial control when the user input matches the prescribed user operation. | 03-18-2010 |
20100077361 | Method of displaying multiple points of interest on a personal navigation device - A method of displaying points of interest in a personal navigation device includes displaying a map on a display of the personal navigation device, receiving touch input at a touched position of the display, and searching an area within a search radius of the touched position for points of interest. The method also includes displaying points of interest located in the area within the search radius, where the found points of interest are represented by icons connected to their locations on the map with a line extending out from the touched position, and spreading out the icons around the touched position to separate the icons from each other. | 03-25-2010 |
20100083188 | COMPUTER USER INTERFACE SYSTEM AND METHODS - Systems and methods may provide user control of a computer system via one or more sensors. Also, systems and methods may provide automated response of a computer system to information acquired via one or more sensors. The sensor(s) may be configured to measure distance, depth, proximity and/or presence. In particular, the sensor(s) may be configured to measure a relative location, distance, presence, movements and/or gestures of one or more users of the computer system. Thus, the systems and methods may provide a computer user interface based on measurements of distance, depth, proximity, presence and/or movements by one or more sensors. For example, various contexts and/or operations of the computer system, at the operating system level and/or the application level, may be controlled, automatically and/or at a user's direction, based on information acquired by the sensor(s). | 04-01-2010 |
20100083189 | METHOD AND APPARATUS FOR SPATIAL CONTEXT BASED COORDINATION OF INFORMATION AMONG MULTIPLE DEVICES - The invention includes a method and apparatus for coordinating transfer of information between ones of a plurality of devices including a coordinating device and at least one other device. In one embodiment, a method includes detecting selection of an item available at a first one of the devices, detecting a gesture-based command for the selected item, identifying a second one of the devices based on the gesture-based command and a spatial relationship between the coordinating device and the second one of the devices, and initiating a control message adapted for enabling the first one of the devices to propagate the selected item toward the second one of the devices. The first one of the devices on which the item is available may be the coordinating device or another device. | 04-01-2010 |
20100083190 | TOUCH GESTURE INTERFACE APPARATUSES, SYSTEMS, AND METHODS - In certain embodiments, an object touch is detected on a touch screen display, a touch gesture interface is displayed on the touch screen display in response to the object touch, a touch gesture is detected on the touch screen display, and an action is performed based on the touch gesture. In certain embodiments, the touch gesture includes a directional touch gesture in a direction away from a position of the object touch on a surface of the touch screen display. In certain embodiments, the touch gesture interface includes a plurality of selectable options, and the action includes one of navigating through the selectable options and selecting one of the selectable options. | 04-01-2010 |
20100083191 | METHOD AND APPARATUS FOR DISPLAYING CONTENT AT A MOBILE DEVICE - The invention relates to displaying content at a mobile device. In some embodiments, a method and apparatus are arranged to display content at a mobile device in association with a graphical device indicating a method of user interaction associated with the content. The mobile device is arranged to detect, at a detector such as a motion detector or on a touch screen of the mobile device, a user interaction and to determine whether the user interaction corresponds to the indicated method of interaction. The method and apparatus are arranged to perform an action relating to the content in response to the detection of the indicated method of interaction. | 04-01-2010 |
20100088653 | PORTABLE ELECTRONIC DEVICE AND METHOD OF CONTROLLING SAME - A method of controlling a portable electronic device that has a touch screen display includes rendering a graphical user interface including a plurality of selectable features, detecting a first touch event at a first location on the touch screen display, detecting a second touch event at a second location on the touch screen display during the first touch event, and selecting ones of the plurality of selectable features located in an area having boundaries defined by the first location and the second location to define a group of selected features. | 04-08-2010 |
20100088654 | ELECTRONIC DEVICE HAVING A STATE AWARE TOUCHSCREEN - An electronic device having a touchscreen display. A graphical user interface (GUI) is displayed on the touchscreen display that includes a user interface element displayed in a default state at a location, the user interface element being associated with a function. The user interface element is changed from the default state to a first state upon detecting a first input event at the location. The user interface element is changed from the first state to a second state upon detecting a second input event at the location. | 04-08-2010 |
20100095250 | Facilitating Interaction With An Application - An apparatus for facilitating interaction with an application includes a memory and logic. The memory stores image data generated by an instance of an application. The logic repeats the following for each user of a number of users: receives a sensor signal representing a gesture performed by a user and indicating a user instruction; modifies the image data according to the user instruction; and sends the image data to initiate a display of an image according to the user instruction. | 04-15-2010 |
20100095251 | LINKAGE BETWEEN MOTION SENSING AND POSITION APPLICATIONS IN A PORTABLE COMMUNICATION DEVICE - Criteria for movement of a mobile communication device that can be initiated by the user are defined. A criterion can be stored as a data characteristic in device memory. Motion of the device can be sensed to determine, by the device controller, whether sensed motion meets the defined criterion. The sensed motion may be derived from an accelerometer, or equivalent means, in the device. If the sensed motion is determined by the controller to match stored criterion data, the controller triggers activation of an application that is dependent on location of the device. A stored application associated with the matched data characteristic is accessed from one or more stored applications respectively associated in memory with stored data characteristics. | 04-15-2010 |
20100100854 | GESTURE OPERATION INPUT SYSTEM - A gesture operation input system includes one or more subsystems to receive an input indicating a modifier input, receive a gesture input, wherein the gesture input indicates an action to be performed, and receive an indication that the modifier input is no longer being received. After receiving the gesture input, the gesture operation input system then determines the action to be performed using the gesture input and performs the action. | 04-22-2010 |
20100100855 | HANDHELD TERMINAL AND METHOD FOR CONTROLLING THE HANDHELD TERMINAL USING TOUCH INPUT - A handheld terminal includes a coordinate recognizer to recognize a first coordinate on a screen where a touch starts and to recognize a second coordinate on the screen where the touch ends, a function identifier to identify a function corresponding to the pair of coordinates, and a function performer to perform the identified function. The first and second coordinates may respectively correspond to a service icon displayed at or near the first coordinate and a process area displayed at or near the second coordinate and associated with the identified function. A method for controlling a handheld terminal includes recognizing a first coordinate on a screen where a touch starts and a second coordinate on the screen where the touch ends, identifying a function corresponding to the first coordinate and the second coordinate, and performing the identified function. | 04-22-2010 |
20100115473 | ASSOCIATING GESTURES ON A TOUCH SCREEN WITH CHARACTERS - The present invention provides methods for associating a gesture, in contact with a touch screen, with a character. More specifically, the present invention links a user's movement on a surface of a device to represent a character. A character includes any number, letter, or symbol. For example, in an illustrative embodiment of the present invention, a user may swipe a surface on their device, such as a cell phone. The present invention recognizes the swipe to represent the number “0,” a swipe in another direction to represent the number “1,” a tap in the middle region to represent the number “2,” etc. | 05-06-2010 |
20100125816 | MOVEMENT RECOGNITION AS INPUT MECHANISM - The detection of relative motion or orientation between a user and a computing device can be used to control aspects of the device. For example, the computing device can include an imaging element and software for locating positions, shapes, separations, and/or other aspects of a user's facial features relative to the device, such that an orientation of the device relative to the user can be determined. A user then can provide input to the device by performing actions such as tilting the device, moving the user's head, making a facial expression, or otherwise altering an orientation of at least one aspect of the user with respect to the device. Such an approach can be used in addition to, or as an alternative to, conventional input devices such as keypads and touch screens. | 05-20-2010 |
20100125817 | 3D INTERFACE APPARATUS AND INTERFACING METHOD USING THE SAME - A 3D interface apparatus which is operated based on motion and an interfacing method using the same are provided. The interface apparatus includes a motion sensor, a controller which determines a wind property, and a wind generation module which generates a wind. Accordingly, a user is allowed to manipulate a GUI more easily, conveniently, and intuitively. | 05-20-2010 |
20100131904 | TILTABLE USER INTERFACE - A programmable effects system for graphical user interfaces is disclosed. One embodiment comprises adjusting a graphical user interface in response to a tilt of a device. In this way, a graphical user interface may have viewable content not shown in a first view, where the viewable content may be displayed in a tilted view in response to the device tilt. | 05-27-2010 |
20100131905 | Methods, Systems, and Products for Gesture-Activation - Methods, systems, and products are disclosed for operating home appliances using gesture recognition. A sequence of video images is received and compared to a stored sequence of gesture images. A gesture image is associated to an operation of an appliance. | 05-27-2010 |
20100138797 | PORTABLE ELECTRONIC DEVICE WITH SPLIT VISION CONTENT SHARING CONTROL AND METHOD - A portable electronic device, such as a mobile phone, has a main camera and a video call camera that receive optical input representative of motion of a user's hand(s) or hand gestures. The motion or gestures are decoded and used as a remote control input to control the displaying of content by a display device, such as a television or a projector, which receives the content for display from the mobile phone. A method of displaying content from a portable electronic device on a separate display or projector, and of controlling such displaying by remote control based on hand movement or gestures, is also provided. | 06-03-2010 |
20100138798 | System and method for executing a game process - A 3-D imaging system for recognition and interpretation of gestures to control a computer. The system includes a 3-D imaging system that performs gesture recognition and interpretation based on a previous mapping of a plurality of hand poses and orientations to user commands for a given user. When the user is identified to the system, the imaging system images gestures presented by the user, performs a lookup for the user command associated with the captured image(s), and executes the user command(s) to effect control of the computer, programs, and connected devices. | 06-03-2010 |
20100146457 | Interactive Display - A pen-based calculator, wherein the user is presented with a screen and a gesture based input device by means of which a sum can be “handwritten” and displayed on the screen. The visibility on the screen of the system's status is provided through two types of feedback: annotation and morphing. As the user is writing the expression, the system can process in the background and, as a symbol is recognised, the user is made aware of this recognition by visual feedback: a typeset character stretched to the stroke's hull replaces the written strokes. Morphing formats the entered expression into a correctly typeset equation by moving the symbols as little as possible from the user's writing, and the morph provides the continuity between the user's input and the typeset equation that allows them to continue to edit and use it. Finally, the answer may be displayed at the correct location relative to the equation. | 06-10-2010 |
20100146458 | Operating System Providing Multi-Touch Support For Applications In A Mobile Device - An operating system providing multi-touch support for (user) applications in a mobile device. In one embodiment, a check of whether the touch screen (in the mobile device) has multi-touch capability is performed. A first interface with multi-touch capability is provided to the (user) applications if the touch screen has multi-touch capability, and a second interface with single touch capability is provided if the touch screen does not have multi-touch capability. The first and second interfaces may be provided by corresponding device drivers loaded when the mobile device is initialized with the operating system. A device driver (providing the second interface) is also designed to perform the check and execute another device driver (providing the first interface) if the touch screen has multi-touch capability. | 06-10-2010 |
20100146459 | Apparatus and Method for Influencing Application Window Functionality Based on Characteristics of Touch Initiated User Interface Manipulations - Apparatuses and methods for presenting an application window on a touch sensitive screen of a mobile device, the application window configured to facilitate user interaction with an application and with a plurality of touch activatable items displayable in a predetermined manner within the application window. A first long tap is detected having a first predetermined duration within the application window and invoking, in response to detecting the first long tap, a first mode of the application window that enables a first type of behavior of one or more of the touch activatable items. At a touch sensitive screen location other than a location within the application window, a second long tap having a second predetermined duration is detected and, in response to the second long tap, a second mode is invoked that influences behavior of one or more of the touch activatable items within the application window. | 06-10-2010 |
20100146460 | SYSTEM AND METHOD FOR MODIFYING A PLURALITY OF KEY INPUT REGIONS BASED ON DETECTED TILT AND/OR RATE OF TILT OF AN ELECTRONIC DEVICE - A system, method and computer program that utilizes motion detection circuitry to dynamically update displayed labels on one or more key input regions. In one aspect of the invention, the number of key input regions is substantially less than the number of keys on a conventional QWERTY keypad and the labels on the key input regions dynamically change based on the detected motion of the motion detection circuitry. | 06-10-2010 |
20100146461 | ELECTRONIC APPARATUS AND DISPLAYING METHOD THEREOF - An electronic apparatus and a displaying method thereof are provided. The electronic apparatus includes a sensor which senses user information, and a controller which reads out item information based on the sensed user information, determines a display method of item content corresponding to the read-out item information, and controls the item content to be displayed in the determined display method. Accordingly, actual goods are displayed along with information regarding the actual goods, so that the user can perceive a displayed image as actual goods and can easily obtain goods information. | 06-10-2010 |
20100146462 | INFORMATION PROCESSING APPARATUS AND METHOD - An information processing apparatus having a touch-sensitive panel and processing a gesture input performed via the touch-sensitive panel accepts an instruction from a user for transitioning from a first processing state to a second processing state; sets a number of gesture-input-based operations in accordance with the instruction accepted; and executes corresponding processing as a gesture input in the second processing state with regard to gesture inputs of the number of operations set. The information processing apparatus executes corresponding processing as a gesture input in the first processing state with regard to a gesture input after the gesture inputs of the number of operations have been performed. | 06-10-2010 |
20100146463 | WATCH PHONE AND METHOD FOR HANDLING AN INCOMING CALL IN THE WATCH PHONE - A watch phone and a method for handling an incoming call using the watch phone are provided. In the watch phone, a display device includes a touch screen panel and a display, turns off the touch screen panel in a watch mode, turns on the touch screen panel in an idle mode or upon receipt of an incoming call, and displays at least two areas for call connection and call rejection, upon receipt of the incoming call. A single mode selection key selects one of the watch mode and the idle mode. A controller performs control operations so that the touch screen panel is turned off in the watch mode and is turned on in the idle mode or upon receipt of the incoming call, and connects or rejects the incoming call, when the at least two areas for call connection or call rejection, which are displayed upon receipt of the incoming call, are pointed to or dragged to. | 06-10-2010 |
20100146464 | Architecture For Controlling A Computer Using Hand Gestures - Architecture for implementing a perceptual user interface. The architecture comprises alternative modalities for controlling computer application programs and manipulating on-screen objects through hand gestures or a combination of hand gestures and verbal commands. The perceptual user interface system includes a tracking component that detects object characteristics of at least one of a plurality of objects within a scene, and tracks the respective object. Detection of object characteristics is based at least in part upon image comparison of a plurality of images relative to a coarse mapping of the images. A seeding component iteratively seeds the tracking component with object hypotheses based upon the presence of the object characteristics and the image comparison. A filtering component selectively removes the tracked object from the object hypotheses and/or at least one object hypothesis from the set of object hypotheses based upon predetermined removal criteria. | 06-10-2010 |
20100153890 | Method, Apparatus and Computer Program Product for Providing a Predictive Model for Drawing Using Touch Screen Devices - An apparatus for providing a predictive model for use with touch screen devices may include a processor. The processor may be configured to identify a stroke event received at a touch screen display, evaluate an environmental parameter corresponding to the touch screen display to determine a scenario based on the environmental parameter, and generate a graphic output corresponding to the identified stroke event for the scenario determined. A corresponding method and computer program product are also provided. | 06-17-2010 |
20100162177 | INTERACTIVE ENTERTAINMENT SYSTEM AND METHOD OF OPERATION THEREOF - An interactive entertainment system comprises a plurality of devices providing an ambient environment, gesture detection means for detecting a gesture of a user, and control means for receiving an output from the gesture detection means and for communicating with at least one device. The control means is arranged to derive from the output a location in the ambient environment and to change the operation of one or more devices in the determined location, according to the output of the gesture detection means. | 06-24-2010 |
20100162178 | Apparatus, method, computer program and user interface for enabling user input - An apparatus, method, computer program and user interface, the apparatus including: a display configured to display a first item; | 06-24-2010 |
20100162179 | Method and Apparatus for Adding or Deleting at Least One Item Based at Least in Part on a Movement - In accordance with an example embodiment of the present invention, an apparatus comprises a user interface configured to detect a first touch, detect a second touch, and detect a movement from the first touch or the second touch. The apparatus further comprises a processor configured to delete or add at least one item based at least in part on the movement. | 06-24-2010 |
20100162180 | GESTURE-BASED NAVIGATION - A method includes detecting an area of a touch screen that is touched by an instrument and determining a gesture corresponding to the area touched. The method further includes performing a crossover operation when it is determined that the gesture corresponds to a crossover gesture and displaying on the touch screen a content that includes a first child content and a second child content that is associated with the crossover operation, where the crossover operation includes navigating from the first child content to the second child content in response to the crossover gesture. The first child content is accessible via a first parent content and the second child content is accessible via a second parent content, and when navigating from the first child content to the second child content, the first parent content and the second parent content are not displayed. | 06-24-2010 |
20100162181 | Interpreting Gesture Input Including Introduction Or Removal Of A Point Of Contact While A Gesture Is In Progress - A touch-sensitive device accepts single-touch and multi-touch input representing gestures, and is able to change a parameter of a gesture responsive to introduction or removal of a point of contact while the gesture is in progress. The operation associated with the gesture, such as a manipulation of an on-screen object, changes in a predictable manner if the user introduces or removes a contact point while the gesture is in progress. The overall nature of the operation being performed does not change, but a parameter of the operation can change. In various embodiments, each time a contact point is added or removed, the system and method of the present invention resets the relationship between the contact point locations and the operation being performed, in such a manner as to avoid or minimize discontinuities in the operation. In this manner, the invention avoids sudden or unpredictable changes to an object being manipulated. | 06-24-2010 |
20100162182 | METHOD AND APPARATUS FOR UNLOCKING ELECTRONIC APPLIANCE - An unlocking method and apparatus for an electronic appliance are disclosed. The method and apparatus may enable a user to unlock the electronic appliance by identifying a gesture and to invoke a function mapped to the gesture. The unlocking method includes detecting a preset gesture input when an input means is locked. The method includes unlocking the input means in response to the input gesture. The method also includes invoking an application mapped to the input gesture in response to unlocking. | 06-24-2010 |
20100169840 | Method For Recognizing And Tracing Gesture - A method for recognizing and tracing a gesture fetches a gesture image by an image sensor. The gesture image is processed for recognizing and tracing, and a corresponding action is performed according to the processed result. The gesture image is pre-processed and then a moved image is detected. The moved image is analyzed to obtain a gesture feature. When the gesture feature corresponds to a moved gesture, a center coordinate of the moved gesture is detected and outputted to control a cursor. When the gesture feature corresponds to a command gesture, a relevant action command is outputted. Therefore, the method provides cursor movement and command input by user gesture. | 07-01-2010 |
20100169841 | HANDWRITING MANIPULATION FOR CONDUCTING A SEARCH OVER MULTIPLE DATABASES - A method and system is provided for allowing a user of a multifunctional device to search information without going through several menu/button manipulations. More specifically, a multifunctional device comprises a user screen area including a touch sensitive layer to receive user input in order to initiate a search over several local and/or remote databases. The user is allowed to input a free-style handwriting query on a screen of a device to look up information, such as contact information, available applications, wallpapers, ringtones, photos, call logs, etc. After conducting a search, the device presents the search results to the user in such a way that the user can start an intended operation by selecting a search result, such as making a call, starting an application, etc. | 07-01-2010 |
20100169842 | Control Function Gestures - Techniques involving control function gestures are described. In an implementation, a control function is identified in response to gesture input at a touch screen of the remote control device. Execution of the identified control function is initiated at a client device that is communicatively coupled to the remote control device and that is configured to alter an output of content broadcast to the client device. | 07-01-2010 |
20100169843 | INPUT APPARATUS, HANDHELD APPARATUS, AND CONTROL METHOD - An input apparatus includes: a motion signal output section to detect a movement of an object for controlling a movement of an image displayed on a screen and output a motion signal corresponding to the movement of the object; a control command generation section to generate a control command corresponding to the motion signal for controlling the movement of the image; an operation signal output section to detect a user operation unintended for the control of the movement of the image and output an operation signal based on the operation; an operation command generation section to generate an operation command based on the operation signal; and a control section to control the control command generation section to generate, in temporal relation with a generation timing of the operation command, the control command with a sensitivity of the movement of the image with respect to the movement of the object changed. | 07-01-2010 |
20100180237 | FUNCTIONALITY SWITCHING IN POINTER INPUT DEVICES - Various embodiments for switching functionality of a graphical user interface (GUI) pointer input device are provided. A first gesture pattern is configured. The first gesture pattern, when performed, enables a predetermined function of the input device. The predetermined function substitutes for a default function of the input device. The enabling of the predetermined function is indicated to a user on the GUI. A second gesture pattern is configured. The second gesture pattern, when performed, cancels the predetermined function of the input device and enables the default function. | 07-15-2010 |
20100185990 | Movable display apparatus, robot having movable display apparatus and display method thereof - A movable display apparatus, a robot having the movable display apparatus, and a display method thereof, and, more particularly, a display method of a robot having an apparatus to display an image according to a visual point of a user are provided. It is possible to provide a convenient extended image service, by movably mounting the apparatus to display the image according to the visual point of the user and mounting the movable display apparatus in the robot so as to accurately display the image according to the visual point of the user using the mobility and motion of the robot. In addition, it is possible to provide an image, which is viewed like a three-dimensional image, via a two-dimensional display apparatus, by changing the displayed image according to a variation in the sightline of the user. | 07-22-2010 |
20100192108 | METHOD FOR RECOGNIZING GESTURES ON LIQUID CRYSTAL DISPLAY APPARATUS WITH TOUCH INPUT FUNCTION - One aspect of the present invention discloses a method for detecting gestures on a liquid crystal display apparatus with touch input functions, wherein the liquid crystal display apparatus includes a display region and a button region. The method comprises the steps of checking whether an object touches a first region of the liquid crystal display apparatus, checking whether the object slides into a second region of the liquid crystal display apparatus after touching the first region, and sending a gesture signal to perform a predetermined function if the object slides between the display region and button region. | 07-29-2010 |
20100192109 | Detecting and Interpreting Real-World and Security Gestures on Touch and Hover Sensitive Devices - “Real-world” gestures such as hand or finger movements/orientations that are generally recognized to mean certain things (e.g., an “OK” hand signal generally indicates an affirmative response) can be interpreted by a touch or hover sensitive device to more efficiently and accurately effect intended operations. These gestures can include, but are not limited to, “OK gestures,” “grasp everything gestures,” “stamp of approval gestures,” “circle select gestures,” “X to delete gestures,” “knock to inquire gestures,” “hitchhiker directional gestures,” and “shape gestures.” In addition, gestures can be used to provide identification and allow or deny access to applications, files, and the like. | 07-29-2010 |
20100199226 | Method and Apparatus for Determining Input Information from a Continuous Stroke Input - An apparatus, comprising a processor configured to receive a continuous stroke input related to a virtual keypad, determine a first input information based at least in part on said continuous stroke input, display a shape associated with said first input information, receive input associated with said shape, and determine a second input information based at least in part on said shape and said input associated with said shape is disclosed. | 08-05-2010 |
20100199227 | IMAGE COLLAGE AUTHORING - A user interface that includes a catalog area, a collage mock-up area, and a mode select interface control operable to select an operational state of the user interface is displayed. Thumbnails of respective images are shown in the catalog area. A layout of a subset of the images is presented in the collage mock-up area. In response to the receipt of a user input gesture and a determination that the user interface is in a first operational state, a first action type is performed based on the type of the received user input gesture and the object type of the target object. In response to the receipt of the user input gesture and a determination that the user interface is in a second operational state, a second action type is performed based on the type of the received user input gesture and the object type of the target object. | 08-05-2010 |
20100199228 | Gesture Keyboarding - Systems, methods and computer readable media are disclosed for gesture keyboarding. A user makes a gesture by either making a pose or moving in a pre-defined way that is captured by a depth camera. The depth information provided by the depth camera is parsed to determine at least that part of the user that is making the gesture. When parsed, the character or action signified by this gesture is identified. | 08-05-2010 |
20100199229 | MAPPING A NATURAL INPUT DEVICE TO A LEGACY SYSTEM - Systems and methods for mapping natural input devices to legacy system inputs are disclosed. One example system may include a computing device having an algorithmic preprocessing module configured to receive input data containing a natural user input and to identify the natural user input in the input data. The computing device may further include a gesture module coupled to the algorithmic preprocessing module, the gesture module being configured to associate the natural user input to a gesture in a gesture library. The computing device may also include a mapping module to map the gesture to a legacy controller input, and to send the legacy controller input to a legacy system in response to the natural user input. | 08-05-2010 |
20100199230 | GESTURE RECOGNIZER SYSTEM ARCHITECTURE - Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture, that may then be tuned by an application receiving information from the gesture recognizer so that the specific parameters of the gesture—such as an arm acceleration for a throwing gesture—may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data. | 08-05-2010 |
20100199231 | PREDICTIVE DETERMINATION - Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture, that may then be tuned by an application receiving information from the gesture recognizer so that the specific parameters of the gesture—such as an arm acceleration for a throwing gesture—may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data. | 08-05-2010 |
20100199232 | Wearable Gestural Interface - This invention may be implemented as a wearable apparatus comprised of a camera, a projector, a mirror, a microphone and a digital computer. The camera captures visual data. This data is analyzed by the digital computer to recognize objects and hand gestures, using color tracking and edge detection techniques. The projector is used, along with a mirror to adjust the direction of the projected light, to project images on objects in the user's environment. For example, the images may be projected on surfaces such as a wall, table, or piece of paper. The projected images may contain information relevant to the object being augmented. Indeed, the information may include current data obtained from the Internet. Also, the projected images may comprise graphical interfaces, with which a user may interact by making hand gestures. | 08-05-2010 |
20100211918 | Web Cam Based User Interaction - This document describes tools for inputting data into a computer via the movement of features of a user as detected by a webcam or other input device. This is accomplished by a user moving his or her features in view of a webcam. The webcam or other input device then detects the presence and motion of the feature(s) and converts these motions into input signals to execute predetermined input instructions. | 08-19-2010 |
20100211919 | RENDERING OBJECT ICONS ASSOCIATED WITH A FIRST OBJECT ICON UPON DETECTING FINGERS MOVING APART - A method comprises detecting two fingers touching a first object icon on a touch sensitive display and then moving in generally opposing directions. The first object icon is associated with one or more constituent elements. In response to such detecting, the method causes additional object icons to appear on the display. Each additional object icon represents a constituent element of the first object icon. | 08-19-2010 |
20100211920 | Detecting and Interpreting Real-World and Security Gestures on Touch and Hover Sensitive Devices - “Real-world” gestures such as hand or finger movements/orientations that are generally recognized to mean certain things (e.g., an “OK” hand signal generally indicates an affirmative response) can be interpreted by a touch or hover sensitive device to more efficiently and accurately effect intended operations. These gestures can include, but are not limited to, “OK gestures,” “grasp everything gestures,” “stamp of approval gestures,” “circle select gestures,” “X to delete gestures,” “knock to inquire gestures,” “hitchhiker directional gestures,” and “shape gestures.” In addition, gestures can be used to provide identification and allow or deny access to applications, files, and the like. | 08-19-2010 |
20100218144 | Method and Apparatus for Displaying Additional Information Items - An apparatus, that may include a processor configured to receive a touch input associated with a first information item, determine a first set of at least one additional information item associated with said first information item, based at least in part on said touch input, generate a first visual representation based at least in part on said first set, and display said first visual representation is disclosed. A corresponding method and computer-readable medium are also disclosed. | 08-26-2010 |
20100223582 | SYSTEM AND METHOD FOR ANALYZING MOVEMENTS OF AN ELECTRONIC DEVICE USING ROTATIONAL MOVEMENT DATA - The disclosure relates to a system and method for analyzing movements of a handheld electronic device. The system comprises: memory; a microprocessor; a first module to generate movement data responsive to movements of the device, such as rotational movements; a second module providing instructions to the microprocessor to map the movement data against symbols representing an input movement string and store the string representation in the memory; and a third module. The third module provides instructions to the microprocessor to analyze data relating to the string representation against data relating to a gesture string representing a gesture related to a command for the device to determine if the gesture has been imparted on the device; and if the string representation sufficiently matches the gesture string, executes a command associated with the gesture on the device. | 09-02-2010 |
20100229129 | CREATING ORGANIZATIONAL CONTAINERS ON A GRAPHICAL USER INTERFACE - Embodiments related to the formation of an organizational container on a touch-sensitive graphical user interface are disclosed. One disclosed embodiment provides a method of forming an organizational container comprising receiving a touch gesture at the graphical user interface, the touch gesture defining a set of zero or more content items to be grouped together and further defining a region of the touch-sensitive graphical user interface. The method further comprises forming an organizational container responsive to receiving the touch gesture at the touch-sensitive graphical user interface, presenting a boundary defining the organizational container, moving the set of content items into the organizational container, and presenting the set of content items arranged within the boundary according to an organized view. | 09-09-2010 |
20100229130 | Focal-Control User Interface - A user interface and techniques for manipulating a graphical representation via indirect manipulation of focal controls are described. Generally, the user interface includes a graphical representation (e.g., an image, video, application, browser, map, etc.), one or more visible or transparent focal controls, and gesture detection functionality to detect inputs from a user. The user may provide this input via a peripheral device (e.g., a mouse, keyboard, etc.), a touch-screen display, or in another suitable manner. In each instance, the user provides an input relative to the focal control and, in response to detecting the input, the gesture detection functionality manipulates the underlying graphical representation. | 09-09-2010 |
20100235793 | Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display - In some embodiments, a device displays content on a touch screen display and detects input by finger gestures. In response to the finger gestures, the device selects content, visually distinguishes the selected content, and/or updates the selected content based on detected input. In some embodiments, the device displays a command display area that includes one or more command icons; detects activation of a command icon in the command display area; and, in response to detecting activation of the command icon in the command display area, performs a corresponding action with respect to the selected content. Exemplary actions include cutting, copying, and pasting content. | 09-16-2010 |
20100235794 | Accelerated Scrolling for a Multifunction Device - A computer-implemented method is performed at a multifunction device with a display and a touch-sensitive surface. The method includes detecting multiple input gestures by a user, beginning with an initial input gesture. For each input gesture after the initial input gesture, the method scrolls information on the display at a respective scrolling speed. The respective scrolling speed is determined based on the movement speed of the respective input gesture and a movement multiplier. The method determines whether the respective input gesture meets one or more swipe gesture criteria, and determines whether the respective input gesture meets one or more successive gesture criteria. When the input gesture meets the one or more swipe gesture criteria and the one or more successive gesture criteria, the method updates the movement multiplier in accordance with one or more movement multiplier adjustment criteria. | 09-16-2010 |
20100241999 | Canvas Manipulation Using 3D Spatial Gestures - User interface manipulation using three-dimensional (3D) spatial gestures may be provided. A two-dimensional (2D) user interface (UI) representation may be displayed. A first gesture may be performed, and, in response to the first gesture's detection, the 2D UI representation may be converted into a 3D UI representation. A second gesture may then be performed, and, in response to the second gesture's detection, the 3D UI representation may be manipulated. Finally, a third gesture may be performed, and, in response to the third gesture's detection, the 3D UI representation may be converted back into the 2D UI representation. | 09-23-2010 |
20100251188 | Method of Determining Input Pattern and Computer Readable Storage Medium - A method of determining input pattern is adapted to be implemented on an electronic apparatus equipped with a touch panel and includes the steps of: detecting a plurality of boundary points between an input pattern inputted through the touch panel and a circumscribed polygon of the input pattern, detecting an area ratio of a polygon defined by the boundary points to the circumscribed polygon, and determining the shape of the input pattern at least according to the area ratio. The present invention also provides a computer readable storage medium having a program stored therein. When the program is executed, it enables an electronic apparatus equipped with a touch panel to determine the shape and/or direction of an input pattern inputted through the touch panel. | 09-30-2010 |
20100251189 | Using gesture objects to replace menus for computer control - The present invention generally comprises a computer control environment that builds on the Blackspace™ software system to provide further functionality and flexibility in directing a computer. It employs graphic inputs drawn by a user and known as gestures to replace and supplant the pop-up and pull-down menus known in the prior art. | 09-30-2010 |
20100269072 | USER INTERFACE DEVICE, USER INTERFACE METHOD, AND RECORDING MEDIUM - A user interface device ( | 10-21-2010 |
20100275166 | USER ADAPTIVE GESTURE RECOGNITION METHOD AND USER ADAPTIVE GESTURE RECOGNITION SYSTEM - The present invention relates to a user adaptive gesture recognition method and a user adaptive gesture recognition system that, by using a terminal equipped with an acceleration sensor, can drive mobile application software in the terminal or can process a function of an application program for browsing to be displayed on the terminal based on acceleration information. Accordingly, the user gesture can be recognized and processed by using an acceleration sensor installed in a mobile apparatus. In addition, the user adaptive gesture can be stored in the mobile apparatus by using the acceleration sensor, and thus a mobile application can be easily utilized with a simple gesture. | 10-28-2010 |
20100281435 | SYSTEM AND METHOD FOR MULTIMODAL INTERACTION USING ROBUST GESTURE PROCESSING - Disclosed herein are systems, computer-implemented methods, and tangible computer-readable media for multimodal interaction. The method includes receiving a plurality of multimodal inputs associated with a query, the plurality of multimodal inputs including at least one gesture input, editing the at least one gesture input with a gesture edit machine. The method further includes responding to the query based on the edited gesture input and remaining multimodal inputs. The gesture inputs can be from a stylus, finger, mouse, and other pointing/gesture device. The gesture input can be unexpected or errorful. The gesture edit machine can perform actions such as deletion, substitution, insertion, and aggregation. The gesture edit machine can be modeled as a finite-state transducer. In one aspect, the method further includes generating a lattice for each input, generating an integrated lattice of combined meaning of the generated lattices, and responding to the query further based on the integrated lattice. | 11-04-2010 |
20100281436 | BINDING USERS TO A GESTURE BASED SYSTEM AND PROVIDING FEEDBACK TO THE USERS - Techniques for managing a set of states associated with a capture device are disclosed herein. The capture device may detect and bind to users, and may provide feedback about whether the capture device is bound to, or detecting a user. Techniques are also disclosed wherein virtual ports may be associated with users bound to a capture device and feedback about the state of virtual ports may be provided. | 11-04-2010 |
20100281437 | MANAGING VIRTUAL PORTS - Techniques for managing virtual ports are disclosed herein. Each such virtual port may have different associated features such as, for example, privileges, rights or options. When one or more users are in a capture scene of a gesture based system, the system may associate virtual ports with the users and maintain the virtual ports. Also provided are techniques for disassociating virtual ports with users or swapping virtual ports between two or more users. | 11-04-2010 |
20100281438 | ALTERING A VIEW PERSPECTIVE WITHIN A DISPLAY ENVIRONMENT - Disclosed herein are systems and methods for altering a view perspective within a display environment. For example, gesture data corresponding to a plurality of inputs may be stored. The input may be input into a game or application implemented by a computing device. Images of a user of the game or application may be captured. For example, a suitable capture device may capture several images of the user over a period of time. The images may be analyzed and processed for detecting a user's gesture. Aspects of the user's gesture may be compared to the stored gesture data for determining an intended gesture input for the user. The comparison may be part of an analysis for determining inputs corresponding to the gesture data, where one or more of the inputs are input into the game or application and cause a view perspective within the display environment to be altered. | 11-04-2010 |
20100281439 | Method to Control Perspective for a Camera-Controlled Computer - Systems, methods and computer readable media are disclosed for controlling perspective of a camera-controlled computer. A capture device captures user gestures and sends corresponding data to a recognizer engine. The recognizer engine analyzes the data with a plurality of filters, each filter corresponding to a gesture. Based on the output of those filters, a perspective control is determined, and a display device displays a new perspective corresponding to the perspective control. | 11-04-2010 |
20100281440 | Detecting, Representing, and Interpreting Three-Space Input: Gestural Continuum Subsuming Freespace, Proximal, and Surface-Contact Modes - Systems and methods for detecting, representing, and interpreting three-space input are described. Embodiments of the system, in the context of an SOE, process low-level data from a plurality of sources of spatial tracking data and analyze these semantically uncorrelated spatiotemporal data and generate high-level gestural events according to dynamically configurable implicit and explicit gesture descriptions. The events produced are suitable for consumption by interactive systems, and the embodiments provide one or more mechanisms for controlling and effecting event distribution to these consumers. The embodiments further provide to the consumers of its events a facility for transforming gestural events among arbitrary spatial and semantic frames of reference. | 11-04-2010 |
20100287513 | MULTI-DEVICE GESTURE INTERACTIVITY - A system is provided for enabling cross-device gesture-based interactivity. The system includes a first computing device with a first display operative to display an image item, and a second computing device with a second display. The second display is operative to display a corresponding representation of the image item in response to a gesture which is applied to one of the computing devices and spatially interpreted based on a relative position of the first computing device and the second computing device. | 11-11-2010 |
20100299641 | PORTABLE ELECTRONIC DEVICE AND METHOD OF CONTROLLING SAME - A method of controlling a portable electronic device having a touch-sensitive display includes rendering content on the touch-sensitive display, detecting a first touch at a first location on the touch-sensitive display, detecting a second touch at a second location on the touch-sensitive display during the first touch, determining an area having a boundary defined by the first location and the second location, and, when a zoom selection is detected, performing a zooming operation by expanding rendered content in at least the area. | 11-25-2010 |
20100299642 | Electronic Device with Sensing Assembly and Method for Detecting Basic Gestures - An electronic device and method are described for detecting a predefined gesture that is a specified pattern of movement of an external object relative to the electronic device. The method includes providing as part of the electronic device a sensing assembly including at least one photoreceiver and a plurality of phototransmitters, wherein each phototransmitter is positioned to emit infrared light away from the electronic device about a corresponding central transmission axis, wherein each central transmission axis is oriented in a different direction with respect to the others. The emission of infrared light by each of the phototransmitters is controlled during each of a plurality of sequential time periods, and wherein the external object moves in the specified pattern of movement during the plurality of sequential time periods. For each of the plurality of phototransmitters and for each of the plurality of sequential time periods, a corresponding measured signal indicative of a respective amount of infrared light which originated from that phototransmitter during that time period and was reflected by the external object prior to being received by the photoreceiver is generated, and the measured signals are evaluated to detect the predefined gesture. | 11-25-2010 |
20100306711 | Method and Apparatus for a Motion State Aware Device - A device comprising a motion context logic that receives data from at least one motion sensor is described. The motion context logic determines a user's motion context. Context based action logic manages the device based on the user's motion context. | 12-02-2010 |
20100306712 | Gesture Coach - A capture device may capture a user's motion and a display device may display a model that maps to the user's motion, including gestures that are applicable for control. A user may be unfamiliar with a system that maps the user's motions or not know what gestures are applicable for an executing application. A user may not understand or know how to perform gestures that are applicable for the executing application. User motion data and/or outputs of filters corresponding to gestures may be analyzed to determine those cases where assistance to the user on performing the gesture is appropriate. | 12-02-2010 |
20100306713 | Gesture Tool - Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. From that, the data is parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of such to the gesture recognizer engine and application, where that change occurs. | 12-02-2010 |
20100306714 | Gesture Shortcuts - Systems, methods and computer readable media are disclosed for gesture shortcuts. A user's movement or body position is captured by a capture device of a system, and is used as input to control the system. For a system-recognized gesture, there may be a full version of the gesture and a shortcut of the gesture. Where the system recognizes that either the full version of the gesture or the shortcut of the gesture has been performed, it sends an indication that the system-recognized gesture was observed to a corresponding application. Where the shortcut comprises a subset of the full version of the gesture, and both the shortcut and the full version of the gesture are recognized as the user performs the full version of the gesture, the system recognizes that only a single performance of the gesture has occurred, and indicates to the application as such. | 12-02-2010 |
20100306715 | Gestures Beyond Skeletal - Systems, methods and computer readable media are disclosed for gesture input beyond skeletal. A user's movement or body position is captured by a capture device of a system. Further, non-user-position data is received by the system, such as controller input by the user, an item that the user is wearing, a prop under the control of the user, or a second user's movement or body position. The system incorporates both the user-position data and the non-user-position data to determine one or more inputs the user made to the system. | 12-02-2010 |
20100306716 | EXTENDING STANDARD GESTURES - In a system that utilizes gestures for controlling aspects of an application, strict requirements for success may limit approachability or accessibility for different types of people. The system may receive data reflecting movement of a user and remap a standard gesture to correspond to the received data. Following the remapping, the system may receive data reflecting skeletal movement of a user, and determine from that data whether the user has performed one or more standard and/or remapped gestures. In an exemplary embodiment, a gesture library comprises a plurality of gestures. Where these gestures are complementary with each other, they may be grouped into gesture packages. A gesture package may include gestures that are packaged as remapped gestures or a gesture package may include options for remapping standard gestures to new data. | 12-02-2010 |
20100306717 | STORAGE MEDIUM HAVING GAME PROGRAM STORED THEREIN AND GAME APPARATUS - A game apparatus detects a path inputted by a player, and moves an object placed in a virtual game space along the path. Moreover, the game apparatus controls the object, which is moving along the path, to perform a predetermined action, and determines a return position when the predetermined action is finished. The return position is a position at which the object having finished the predetermined action returns to the path, and is determined from among positions along the path. The game apparatus resumes the movement of the object along the path after returning the object, having finished the predetermined action, to the return position. | 12-02-2010 |
20100306718 | APPARATUS AND METHOD FOR UNLOCKING A LOCKING MODE OF PORTABLE TERMINAL - A method of unlocking a locking mode of a portable terminal, which includes sensing a user's gesture input which is set in a locking mode of the portable terminal. The locking mode is unlocked in response to the user's gesture input, and a function mapped to the user's gesture can be executed when unlocking the locking mode. A portable terminal compares gestures among predefined sets of gesture information in order to check whether there is a gesture that coincides with the analyzed gesture. | 12-02-2010 |
20100325590 | OPERATION CONTROL DEVICE, OPERATION CONTROL METHOD, AND COMPUTER-READABLE RECORDING MEDIUM - An apparatus and method provide logic for controlling a controllable device by distinguishing between an intended motion of a user and an unintended motion of the user. In one implementation, a computer-implemented method is provided to control a controllable device by distinguishing between a control movement and a non-control movement. The method receives spatial positions of a joint of a human appendage and a reference point disposed along the appendage and distal to the joint. The method determines whether a movement of the reference point about the joint is a control movement or a non-control movement, based on a comparison of direction of movement of the reference point and a direction of displacement between the reference point and the upper joint. A control instruction is executed when the movement is a control movement. | 12-23-2010 |
20100333043 | Terminating a Communication Session by Performing a Gesture on a User Interface - There is disclosed a wireless communication device for communicating with one or more remote devices. The device comprises a touch-sensitive surface, a user interface, and a transceiver. The user interface produces an input signal in response to detecting a predetermined gesture at the touch-sensitive surface. The transceiver communicates wirelessly with a remote device and terminates communication with the remote device in response to the input signal from the user interface. The device determines that it is communicating with the remote device, detects the predetermined gesture at the touch-sensitive surface, and terminates communication with the remote device in response to detecting the predetermined gesture while communicating with the remote device. The predetermined gesture includes continuous contact at the touch-sensitive surface between discrete locations of the surface. | 12-30-2010 |
20100333044 | Gesture-based Interface System and Method - Systems and methods of manipulating display parameters of displayed images, and optionally designating the images for manipulation, via a gesture pad. | 12-30-2010 |
20100333045 | Gesture Based Interaction with Traffic Data - Gesture based interaction with traffic data is disclosed. A virtual broadcast presentation may be generated based on dynamic information such as traffic information, weather information, or other information that may be featured on a virtual broadcast presentation. A gesture made by a user is detected and processed to determine an input command associated with the detected gesture. The virtual broadcast presentation may be manipulated based on the input command. | 12-30-2010 |
20110004853 | Method for multiple touch modes, method for applying multi single-touch instruction and electronic device performing these methods - A method for multiple touch modes, a method for applying multi single-touch instruction, and an electronic device performing these methods are disclosed. The method for multiple touch modes comprises the following steps: receiving at least one instruction; determining whether the at least one instruction comprises a start instruction; if yes, determining whether the at least one instruction is a multi single-touch instruction; and if yes, performing a multi single-touch operation corresponding to the at least one instruction. | 01-06-2011 |
20110010676 | SYSTEM AND METHOD FOR ALLOCATING DIGITAL GRAFFITI OBJECTS AND CANVASSES - The subject specification provides a system, method, and computer readable storage medium directed towards allocating digital canvasses for digital graffiti. The specification discloses receiving data corresponding to digital graffiti formed from a gesture undergone by a device. The specification also discloses identifying a digital canvas corresponding to the digital graffiti as a function of the received data. | 01-13-2011 |
20110022991 | TOUCH DETECTING INTERACTIVE DISPLAY - The invention provides an interactive display that is controlled by user gestures identified on a touch detecting display surface. In the preferred embodiment of the invention, imagery is projected onto a horizontal projection surface from a projector located above the projection surface. The locations where a user contacts the projection surface are detected using a set of infrared emitters and receivers arrayed around the perimeter of the projection surface. For each contact location, a computer software application stores a history of contact position information and, from the position history, determines a velocity for each contact location. Based upon the position history and the velocity information, gestures are identified. The identified gestures are associated with display commands that are executed to update the displayed imagery accordingly. Thus, the invention enables users to control the display through direct physical interaction with the imagery. | 01-27-2011 |
20110022992 | METHOD FOR MODIFYING A REPRESENTATION BASED UPON A USER INSTRUCTION - The invention relates to a method for modifying a representation based upon a user instruction and a system for producing a modified representation by said method. Conventional drawing systems, such as pen and paper and writing tablets, require a reasonable degree of drawing skill which not all users possess. Additionally, these conventional systems produce static drawings. The method of the invention comprises receiving a representation from a first user, associating the representation with an input object classification, receiving an instruction from a second user, associating the instruction with an animation classification, determining a modification of the representation using the input object classification and the animation classification, and modifying the representation using the modification. When the first user provides a representation of something, for example a character in a story, it is identified to a certain degree by associating it with an object classification. In other words, the best possible match is determined. As the second user imagines a story involving the representation, dynamic elements of the story are exhibited in one or more communication forms such as writing, speech, gestures, facial expressions. By deriving an instruction from these signals, the representation may be modified, or animated, to illustrate the dynamic element in the story. This improves the feedback to the users, and increases the enjoyment of the users. | 01-27-2011 |
20110029934 | Finger Touch Gesture for Joining and Unjoining Discrete Touch Objects - An approach is provided to join graphical user interface objects into a group. A request is received at a touch-enabled display screen to join a first graphical user interface object with a second graphical user interface object. The request is from a user of the system. The first and second graphical user interface objects are then associated with each other. The first and second graphical user interface objects are displayed on the touch-enabled display screen adjacent to each other and a visual indicator is also displayed near the objects that indicates that the objects have been joined in a group. | 02-03-2011 |
20110029935 | METHOD AND APPARATUS FOR DETECTING UNDESIRED USERS USING SOCIALLY COLLABORATIVE FILTERING - In one embodiment, a method includes identifying at least one socially relevant gesture associated with a user and identifying at least one gesture graph that identifies content associated with at least one undesirable entity. The at least one socially relevant gesture is identified while the user is interacting with a system. The content includes a plurality of socially relevant gestures associated with the at least one undesirable entity. The method also includes determining when a distance between the at least one socially relevant gesture associated with the user and the content indicates that the user is undesirable, and processing the user as being undesirable when the distance indicates that the user is undesirable. | 02-03-2011 |
20110035708 | MULTI-TOUCH WALLPAPER MANAGEMENT - A method and apparatus for multi-touch wallpaper management for a mobile computing device are described wherein a first wallpaper image is displayed on a multi-touch-sensitive display of the mobile computing device and a multi-touch gesture is received indicating a request to change the first wallpaper image. In response to the multi-touch gesture, at least a portion of a second wallpaper image is displayed. Other embodiments are described and claimed. | 02-10-2011 |
20110041100 | Method and Device for Touchless Signing and Recognition - A touchless sensor device ( | 02-17-2011 |
20110041101 | MOBILE TERMINAL AND CONTROLLING METHOD THEREOF - A mobile terminal and controlling method thereof are disclosed. The present invention includes displaying a plurality of objects on a touchscreen and, if a first user command is inputted, controlling the objects pertaining to a category corresponding to the first user command, among the plurality of objects displayed on the touchscreen, to move into a specific region on the touchscreen. According to at least one embodiment of the present invention, even if numerous icons for executing diverse functions are displayed in a touchscreen type mobile terminal, the present invention helps a terminal user discover a specific icon among the numerous icons. | 02-17-2011 |
20110041102 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME - A mobile terminal and a method for controlling the same are disclosed, which allow diverse functions thereof to be registered to a touch gesture and, when the touch gesture is inputted on the touchscreen, the functions, information and menu icons registered to the touch gesture to be arranged and displayed adjacent to the input touch gesture. | 02-17-2011 |
20110047517 | METADATA TAGGING SYSTEM, IMAGE SEARCHING METHOD AND DEVICE, AND METHOD FOR TAGGING A GESTURE THEREOF - A metadata tagging system, an image searching method, a device, and a gesture tagging method are provided. The metadata tagging system includes a first device which tags metadata to an image and transmits the image tagged with the metadata and a second device which allows at least one image from among stored images to be searched. Accordingly, generated data may be searched and used more easily and conveniently. | 02-24-2011 |
20110055772 | SYSTEM AND METHOD FOR ENHANCED COMMAND INPUT - A portable electronic device having an input device for receiving a gesture based input from a user is used to control a navigation operation of an appliance. The portable electronic device receives via the input device the gesture based input and uses one or more parameters stored in a memory of the portable electronic device and one or more characteristics associated with the gesture based input to cause the portable electronic device to transmit a navigation step command to thereby control the navigation operation of the appliance. | 03-03-2011 |
20110055773 | DIRECT MANIPULATION GESTURES - The present disclosure describes various techniques that may be implemented to execute and/or interpret manipulation gestures performed by a user on a multipoint touch input interface of a computing device. An example method includes receiving a multipoint touch gesture at a multipoint touch input interface of a computing device, wherein the multipoint touch gesture comprises a gesture that is performed with multiple touches on the multipoint touch input interface, and resolving the multipoint touch gesture into a command. The example method further includes determining at least one physical simulation effect to associate with the resolved multipoint touch gesture, and rendering a unified feedback output action in a graphical user interface of the computing device by executing the command, wherein the unified feedback output action includes at least a graphical output action incorporated with the at least one physical simulation effect in the graphical user interface. | 03-03-2011 |
20110055774 | SYSTEM AND METHOD FOR CONTROLLING INTERACTION BETWEEN A MOBILE TERMINAL AND A DIGITAL PICTURE FRAME - A mobile terminal includes a wireless communication unit, a memory, a touch screen, and a controller. The wireless communication unit establishes a connection to an external digital picture frame. The memory stores a plurality of images including one or more characters and information mapped to the characters. The touch screen displays a first image stored in the memory. And, the controller transmits the first image and first information mapped to the first image to the digital picture frame via the wireless communication unit. | 03-03-2011 |
20110055775 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND INFORMATION PROCESSING PROGRAM - To present the operating state of a tracing operation when a tracing operation is performed to scroll displayed objects. | 03-03-2011 |
20110061029 | GESTURE DETECTING METHOD FOR TOUCH PANEL - A gesture detecting method for a touch panel is provided. Firstly, a command mode of the touch panel is established based on a hop touch with fingers sequentially touching the touch panel. Then, a gesture is determined according to a subsequently detected touch result of a single touch or multipoint touch, i.e., a detected moving track of the touch points, so as to generate and transmit a gesture instruction. | 03-10-2011 |
20110066984 | Gesture Recognition on Computing Device - A computer-implemented user interface method is disclosed. The method includes displaying information on a touchscreen of a computing device, receiving from a user of the device an input drawn on the touchscreen, correlating the input to a template, where the correlating includes employing a closed-form solution to find a rotation that reduces angular distance between the input and the template, and providing output based on a result of the correlating. | 03-17-2011 |
20110066985 | Systems, Methods, and Mobile Devices for Providing a User Interface to Facilitate Access to Prepaid Wireless Account Information - A method for operating a mobile device includes, in response to receiving a swipe gesture via a user interface of the mobile device, displaying a balance of a prepaid wireless service account on a display of the mobile device. The balance may be displayed in a currency such as monetary currency or a proprietary currency such as minutes or credits provided by a wireless service provider. The method may also include displaying an expiration time to identify a time remaining until the balance expires. The method may also include generating a balance request including a request for the balance of the prepaid wireless service account from a prepaid billing system in response to receiving the swipe gesture, transmitting the balance request to the prepaid billing system, and receiving the balance in response to the balance request. | 03-17-2011 |
20110072400 | METHOD OF PROVIDING USER INTERFACE OF MOBILE TERMINAL EQUIPPED WITH TOUCH SCREEN AND MOBILE TERMINAL THEREOF - A method of providing an interface in a terminal equipped with a touch screen, including displaying a preset first screen when an unlock input for displaying the first screen is received; receiving, through the touch screen displaying the first screen, a screen change input for changing part or all of the first screen into a preset second screen; and changing the part of the screen displayed on the touch screen from the first screen into the second screen according to the progress of the screen change input. | 03-24-2011 |
20110083110 | TOUCH-SENSITIVE DISPLAY AND METHOD OF CONTROL - A method includes displaying one or more selection options on a touch-sensitive display and detecting a hovering touch associated with a first option of the one or more selection options. Information associated with the first option is previewed in a first format in an information field in response to detecting the hovering touch. A selection of one of the one or more selection options is detected, and a function associated with the selected option is performed. | 04-07-2011 |
20110083111 | USER INTERFACE GESTURES AND METHODS FOR PROVIDING FILE SHARING FUNCTIONALITY - Methods and devices provide a gesture activated file sharing functionality enabling users to share files with other nearby computing devices. The file sharing functionality may include establishing wireless links with nearby devices and determining their relative locations. The computing device may detect a file sharing gesture and transmit files to or request files from a nearby device in response to the gesture. Based on gesture parameters, e.g., direction, speed and shape, and computing device attitude parameters, e.g., tilt angle and pointing direction, the computing device may identify a targeted device to which a file may be transmitted. The computing device may request user verification of the identified device and send a request to transmit files to the targeted device. The computing devices may transmit files using networks and addresses provided over the device-to-device communication links. | 04-07-2011 |
20110083112 | INPUT APPARATUS - An input apparatus including an input unit to which a predetermined motion image signal is input, a motion detection unit for detecting a motion from the motion image signal which is input to the input unit, a video signal processing unit for outputting a predetermined video signal, and a control unit, wherein if a hand revolving motion of an operator is detected, the control unit controls the video signal processing unit to output a predetermined first video signal in synchronism with the hand revolving motion in order to inform the operator of a detection situation of the revolving motion and to output a predetermined second video signal in synchronism with the first video signal in order to inform the operator of a progress situation of a manipulation until a predetermined manipulation is definitely fixed. | 04-07-2011 |
20110088002 | METHOD AND PLATFORM FOR GESTURAL TRANSFER OF DIGITAL CONTENT FOR MOBILE DEVICES - A platform is provided which allows for gesture-initiated transfer of digital content from a mobile device to at least one other device which may also be a mobile device. The platform includes an application that leverages components which help in determining the pose of the mobile device. Upon detecting a gesturing motion (e.g., a throwing or casting motion), the system begins transfer of digital content (such as the current application or a set of pre-packaged information) to the at least one other device. The throwing or casting direction is analyzed to determine the appropriate device or devices to receive the content. | 04-14-2011 |
20110088003 | APPARATUS, METHODS AND COMPUTER-READABLE STORAGE MEDIA FOR SECURITY PROVISIONING AT A COMMUNICATION DEVICE - Apparatus, methods and computer-readable storage media are provided for security provisioning at a communication device. In some embodiments, a method can include: executing a high security application on a communication device based, at least, on detecting that high security is enabled for the communication device and detecting execution of a low security application; outputting, via a user interface (UI), information configured to detect an entry to the communication device; detecting an entry at the UI of the communication device; determining whether the entry corresponds to security access information stored in the communication device; and providing access to the communication device based, at least, on determining that the entry corresponds to the security access information. | 04-14-2011 |
20110093820 | GESTURE PERSONALIZATION AND PROFILE ROAMING - A gesture-based system may have default or pre-packaged gesture information, where a gesture is derived from a user's position or motion in a physical space. In other words, no controllers or devices are necessary. Depending on how a user uses his or her gesture to accomplish the task, the system may refine the properties and the gesture may become personalized. The personalized gesture information may be stored in a gesture profile and can be further updated with the latest data. The gesture-based system may use the gesture profile information for gesture recognition techniques. Further, the gesture profile may be roaming such that the gesture profile is available in a second location without requiring the system to relearn gestures that have already been personalized on behalf of the user. | 04-21-2011 |
20110093821 | DISPLAYING GUI ELEMENTS ON NATURAL USER INTERFACES - A computing system for displaying a GUI element on a natural user interface is described herein. The computing system includes a display configured to display a natural user interface of a program executed on the computing system, and a gesture sensor configured to detect a gesture input directed at the natural user interface by a user. The computing system also includes a processor configured to execute a gesture-recognizing module for recognizing a registration phase, an operation phase, and a termination phase of the gesture input, and a gesture assist module configured to first display a GUI element overlaid upon the natural user interface in response to recognition of the registration phase. The GUI element includes a visual or audio operation cue to prompt the user to carry out the operation phase of the gesture input, and a selector manipulatable by the user via the operation phase of the gesture. | 04-21-2011 |
20110093822 | Image Navigation for Touchscreen User Interface - Various embodiments relate to a local computing device that includes a display and a touchscreen interface. The device is operable to establish a remote network computing session with a host computer system, transmit touch event information associated with touch events, receive graphical display information corresponding to a host image associated with the host computer system, translate the graphical display information from host coordinates to local coordinates, update the local image based on the graphical display information, the local image comprising a selected portion of the host image, and, in response to mouse movement events caused by associated touch events, change the selected portion of the host image while keeping a cursor in the center of the display, except when the center of the selected portion is within a predetermined limit of an edge of the host image, thereafter move the cursor relative to the local display. | 04-21-2011 |
20110107276 | Icon/text interface remote controller - An icon/text interface remote controller, particularly a remote controller that works with a display device displaying an icon/text menu interface to allow users to make selections, mainly includes a touch panel, a button switch, a power supply, a power control circuit and a wireless emission circuit that cooperate with the display device. Users can see options of the icon/text menu on the display device to perform remote control operations in a more user-friendly fashion. | 05-05-2011 |
20110119637 | METHOD AND APPARATUS FOR INTERACTING WITH A CONTENT OBJECT - An approach is provided for interacting with an embedded content object. A request is received, from a device, to access an embedded content object, wherein the content object is related to a content playlist. On receipt of the request, the content object determines whether its content is available. The content object then causes, at least in part, actions that result in an interaction behavior based on the determination. | 05-19-2011 |
20110119638 | USER INTERFACE METHODS AND SYSTEMS FOR PROVIDING GESTURING ON PROJECTED IMAGES - Methods and systems enable a user to interact with a computing device by tracing a gesture on a surface with a laser beam. The computing device may be equipped with or coupled to a projector and a digital camera. The projector may project an image generated on the computing device on a projection surface which the camera images. Location and movement of a laser spot on the projection surface may be detected within received camera images. The projected image and the received camera image may be correlated so that the computing device can determine the location of a laser spot within the projected image. Movements of the laser spot may be correlated to predefined laser gestures which may be associated to particular functions that the computing device may implement. The functions may be similar to other user interface functionality. The function results may be displayed and projected. | 05-19-2011 |
20110119639 | SYSTEM AND METHOD OF HAPTIC COMMUNICATION AT A PORTABLE COMPUTING DEVICE - A method of haptic communication at a wireless device is disclosed. The method may include receiving an input gesture and generating an input gesture message from the input gesture. The input gesture message may be operable for transmission to a receiving wireless device. | 05-19-2011 |
20110119640 | DISTANCE SCALABLE NO TOUCH COMPUTING - Disclosed herein are techniques for scaling and translating gestures such that the applicable gestures for control may vary depending on the user's distance from a gesture-based system. The techniques for scaling and translation may take the varying distances from which a user interacts with components of the gesture-based system, such as a computing environment or capture device, into consideration with respect to defining and/or recognizing gestures. In an example embodiment, the physical space is divided into virtual zones of interaction, and the system may scale or translate a gesture based on the zones. A set of gesture data may be associated with each virtual zone such that gestures appropriate for controlling aspects of the gesture-based system may vary throughout the physical space. | 05-19-2011 |
20110119641 | Call connection method and apparatus in mobile terminal - A call connection method and apparatus of a portable terminal capable of reducing errors in automatic call connection based on a user gesture and recognition of the user gesture are provided. The call connection method of a portable terminal includes providing a list according to a user request, sequentially sensing a first event and a second event according to a user gesture performed after a specific object in the list is selected, and performing automatic call connection based on the specific object when the first event and the second event are sensed. | 05-19-2011 |
20110131537 | METHOD AND APPARATUS FOR PROVIDING USER INTERFACE OF PORTABLE DEVICE - A method includes displaying a user interface for displaying a graphic and a hidden graphic in a first area; displaying a set of contents corresponding to the graphic in a second area distinguishable from the first area; detecting a user's gesture for selecting a part of the first area; enlarging the first area to include a part of the second area; displaying a plurality of graphics including the graphic and the hidden graphic in the enlarged first area in response to the user's gesture; detecting a user's additional gesture for moving a first graphic among the plurality of graphics; and moving the first graphic to a part of the enlarged first area in response to the user's additional gesture, and moving a second graphic of the plurality of graphics to an area from which the first graphic is moved out. | 06-02-2011 |
20110145768 | Device, Method, and Graphical User Interface for Managing User Interface Content and User Interface Elements - Computing devices and methods for managing user interface content and user interface elements are disclosed. In one embodiment, after a plurality of user interface elements is selected from an ordered list, wherein a selection order is maintained for the selected plurality of user interface elements: a user gesture to perform an operation on the plurality of user interface elements is detected, and in response, a stack of temporarily displayed thumbnails corresponding to the selected plurality of user interface elements is displayed, wherein a display order of the stack of temporarily displayed thumbnails corresponds to the selection order of the selected plurality of user interface elements. | 06-16-2011 |
20110154266 | CAMERA NAVIGATION FOR PRESENTATIONS - Techniques for managing a presentation of information in a gesture-based system, where gestures are derived from a user's body position or motion in the physical space, may enable a user to use gestures to control the manner in which the information is presented or to otherwise interact with the gesture-based system. A user may present information to an audience using gestures that control aspects of the system, or multiple users may work together using gestures to control aspects of the system. Thus, in an example embodiment, a single user can control the presentation of information to the audience via gestures. In another example embodiment, multiple participants can share control of the presentation via gestures captured by a capture device or otherwise interact with the system to control aspects of the presentation. | 06-23-2011 |
20110154267 | Method and Apparatus for Determining an Operation Associated with a Continuous Stroke Input - An apparatus, comprising a processor and memory including computer program code, the memory and the computer program code configured to, working with the processor, cause the apparatus to perform at least the following: receiving indication of a first input associated with a first touch display, receiving indication of a second input relating to an exiting touch display boundary input associated with the first touch display, receiving indication of a third input relating to an entering touch display boundary input associated with a second touch display, receiving indication of a fourth input associated with the second display, determining that a continuous stroke input comprises the first input, second input, third input, and fourth input, and determining an operation based, at least in part, on the continuous stroke input is disclosed. | 06-23-2011 |
20110154268 | METHOD AND APPARATUS FOR OPERATING IN POINTING AND ENHANCED GESTURING MODES - Methods and apparatuses for implementing gesture command recognition functionality are disclosed. The apparatuses may operate in a pointing mode and operate in an enhanced gesturing mode. While in the enhanced gesturing mode, the apparatuses may cause associated actions in response to recognizing sliding inputs as gesture commands. The gesture commands may be selectively associated with actions based on localities. The apparatuses may present overlays with information content independent of gesture command recognition. The apparatuses may change appearances of visual representations of sliding inputs in response to recognizing the sliding inputs as gesture commands. | 06-23-2011 |
20110161889 | User Interface for Electronic Devices - An electronic device having a user interface and a display unit on which an object is selected from a source screen in response to a first input at the user interface. The selected object is then tunneled to a target screen, via a virtual tunnel, in response to a second input at the user interface. The source screen and the target screen may be a part of the display unit in the electronic device. The tunneled object is then edited or modified to create an object desired by the user. | 06-30-2011 |
20110161890 | Using multi-modal input to control multiple objects on a display - Embodiments of the invention are generally directed to systems, methods, and machine-readable mediums for implementing gesture-based signature authentication. In one embodiment, a system may include several modal input devices. Each modal input device is capable of retrieving a stream of modal input data from a user. The system also includes modal interpretation logic that can interpret each of the retrieved modal input data streams into a corresponding set of actions. The system additionally includes modal pairing logic to assign each corresponding set of actions to control one of the displayed objects. Furthermore, the system has modal control logic which causes each displayed object to be controlled by its assigned set of actions. | 06-30-2011 |
20110161891 | INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - According to one embodiment, the relationship between respective fingers of a user and corresponding screens is registered, a screen and command are determined according to the motion (motion trajectory) of a finger that touches a touch input device and the type of the finger, and a multi-screen is operated. | 06-30-2011 |
20110161892 | Display Interface and Method for Presenting Visual Feedback of a User Interaction - A display interface and method for presenting visual feedback of a user interaction with an image being presented via an electronic device are provided. The display interface includes a display adapted for visually presenting to the user at least a portion of an image. The display interface further includes a user input adapted for receiving a gesture including a user interaction with the electronic device. The display interface still further includes a controller. The controller is adapted for associating the gesture with a function and determining whether the associated function has reached a limit which would preclude its execution. If the associated function has not reached the limit, the controller is adapted for executing the function associated with the detected gesture. If the associated function has reached the limit, the controller is adapted for producing an image distortion proximate the user interaction. | 06-30-2011 |
20110167391 | USER INTERFACE METHODS AND SYSTEMS FOR PROVIDING FORCE-SENSITIVE INPUT - Methods and systems implement touch sensors or force sensitive materials disposed on the case of a computing device in order to enable user input gestures to be performed on portions of the device case. The force sensitive elements may generate an electrical signal in response to a gesture, such as a tap, squeeze, swipe or twist. The properties of the generated electrical signal may be compared to various reference templates to recognize particular input gestures. The force sensitive elements may operate in conjunction with more traditional input methods, such as touch-screen display and electromechanical buttons. By enabling user input gestures on the case of computing devices, the various aspects permit one-handed operation of the devices, including intuitive gestures that do not require the user's focused attention to accomplish. Thus, the various aspects may enable users to utilize their computing devices in situations not suitable to conventional user input technologies. | 07-07-2011 |
20110173574 | IN APPLICATION GESTURE INTERPRETATION - In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. A gesture-based system may have a plurality of modes, each mode a hardware configuration, a software configuration, or a combination thereof. Techniques for transitioning a user's control, via the user's gestures, between different modes enable a system to coordinate controls between multiple modes. For example, while a first mode is active, the user's gestures may control aspects of the first mode. The system may transition the user's control from a control of the first mode to a control of a second mode. The transition may be between hardware, software, or a combination thereof. In another embodiment, reserved gestures that correspond to a first mode may be executed whether or not a second mode is present. | 07-14-2011 |
20110173575 | METHOD AND DEVICE FOR INPUTTING TEXTS - There is disclosed a method for the detection of the selection of a character of a character string to be input from a character set on an input surface, wherein the selection of at least one character of the character string is detected by evaluating a direction vector of a gesture which is input on the input surface. There is also disclosed an input device for carrying out the method, especially a mobile terminal with a touch-sensitive input surface for selecting characters of a character string to be input, in which on the touch-sensitive input surface an input pattern with a number of characters from a character set can be displayed, whereby the input device comprises an evaluation unit, which detects the selection of at least one character of the character string by evaluating a direction vector of a gesture input with an input medium on the input surface. | 07-14-2011 |
20110173576 | USER INTERFACE FOR AUGMENTED REALITY - The disclosed embodiments are directed to a method, an apparatus, and a user interface. The disclosed embodiments include acquiring an image, identifying one or more objects of interest in the image, and providing an indication that additional information is available for one or more of the objects of interest without obscuring the one or more objects of interest. | 07-14-2011 |
20110185316 | Device, Method, and Graphical User Interface for Managing User Interface Content and User Interface Elements - Alignment guides configured for velocity-sensitive behavior are disclosed. In one embodiment, during a user interface element move gesture, the gesture velocity is determined, and while moving the user interface element during the gesture, the user interface operates in a first or a second state with respect to displaying alignment guides. When the velocity of the user gesture exceeds a predefined velocity threshold, the display of the user interface is maintained in the first state, which does not include visibly displaying alignment guides. When the velocity of the user gesture is less than the predefined velocity threshold, the user interface is displayed in a second state that includes visibly displaying one or more alignment guides. In some embodiments, gesture velocity is used to set alignment guide attraction strength. | 07-28-2011 |
20110185317 | Device, Method, and Graphical User Interface for Resizing User Interface Content - Aspect ratio locking alignment guides for gestures are disclosed. In one embodiment, a gesture is detected to resize a user interface element, and in response, a first alignment guide is visibly displayed, wherein the first alignment guide includes positions representing different sizes the user interface element can be resized to while maintaining the initial aspect ratio of the user interface element. While the user interface element is resized in accordance with the user gesture, and while the first alignment guide is visibly displayed: when the user gesture is substantially aligned with the first alignment guide, visible display of the first alignment guide is maintained; and when the user gesture substantially deviates from the first alignment guide, visible display of the first alignment guide is terminated. | 07-28-2011 |
20110185318 | EDGE GESTURES - Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the amount of gestures that are made available to initiate operations of a computing device. | 07-28-2011 |
20110185319 | VIRTUAL PIN PAD FOR FUEL PAYMENT SYSTEMS - A method and system for displaying a virtual PIN pad in varying locations on a touch screen in order to prevent fraud or the interception of personal identification numbers. | 07-28-2011 |
20110185320 | Cross-reference Gestures - Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the amount of gestures that are made available to initiate operations of a computing device. | 07-28-2011 |
20110185321 | Device, Method, and Graphical User Interface for Precise Positioning of Objects - A method includes, at a computing device with a touch-sensitive display: displaying a user interface object on the touch-sensitive display; detecting a contact on the user interface object; while continuing to detect the contact on the user interface object: detecting an M-finger gesture, distinct from the contact, in a first direction on the touch-sensitive display, where M is an integer; and, in response to detecting the M-finger gesture, translating the user interface object a predefined number of pixels in a direction in accordance with the first direction. | 07-28-2011 |
20110191724 | DEVICE FOR ITEM CONTROL, SYSTEM FOR ITEM CONTROL, AND METHOD - A first device classifies and displays an item, identifies a suitable class matched to approach information of a second device out of the entire area of the classified item as the second device approaches the first device, and provides the second device with the identified class or executes a service linked to the class. The second device approaches a portion where a desired class is displayed by the first device, receives the class from the first device, and provides a linked service using the same. | 08-04-2011 |
20110209097 | Use of Bezel as an Input Mechanism - Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures. | 08-25-2011 |
20110209098 | On and Off-Screen Gesture Combinations - Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures. | 08-25-2011 |
20110209099 | Page Manipulations Using On and Off-Screen Gestures - Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures. | 08-25-2011 |
20110209100 | MULTI-SCREEN PINCH AND EXPAND GESTURES - Embodiments of multi-screen pinch and expand gestures are described. In various embodiments, a first input is recognized at a first screen of a multi-screen system, and the first input includes a first motion input. A second input is recognized at a second screen of the multi-screen system, and the second input includes a second motion input. A pinch gesture or an expand gesture can then be determined from the first and second motion inputs that are associated with the recognized first and second inputs. | 08-25-2011 |
20110209101 | MULTI-SCREEN PINCH-TO-POCKET GESTURE - Embodiments of a multi-screen pinch-to-pocket gesture are described. In various embodiments, a first motion input to a first screen region is recognized at a first screen of a multi-screen system, and the first motion input is recognized to select a displayed object. A second motion input to a second screen region is recognized at a second screen of the multi-screen system, and the second motion input is recognized to select the displayed object. A pinch-to-pocket gesture can then be determined from the recognized first and second motion inputs within the respective first and second screen regions, the pinch-to-pocket gesture effective to pocket the displayed object. | 08-25-2011 |
20110209102 | MULTI-SCREEN DUAL TAP GESTURE - Embodiments of a multi-screen dual tap gesture are described. In various embodiments, a first tap input to a displayed object is recognized at a first screen of a multi-screen system. A second tap input to the displayed object is recognized at a second screen of the multi-screen system, and the second tap input is recognized approximately when the first tap input is recognized. A dual tap gesture can then be determined from the recognized first and second tap inputs. | 08-25-2011 |
20110209103 | MULTI-SCREEN HOLD AND DRAG GESTURE - Embodiments of a multi-screen hold and drag gesture are described. In various embodiments, a hold input is recognized at a first screen of a multi-screen system when the hold input is held in place. A motion input is recognized at a second screen of the multi-screen system, and the motion input is recognized to select a displayed object while the hold input remains held in place. A hold and drag gesture can then be determined from the recognized hold and motion inputs. | 08-25-2011 |
20110209104 | MULTI-SCREEN SYNCHRONOUS SLIDE GESTURE - Embodiments of a multi-screen synchronous slide gesture are described. In various embodiments, a first motion input is recognized at a first screen of a multi-screen system, and the first motion input is recognized when moving in a particular direction across the first screen. A second motion input is recognized at a second screen of the multi-screen system, where the second motion input is recognized when moving in the particular direction across the second screen and approximately when the first motion input is recognized. A synchronous slide gesture can then be determined from the recognized first and second motion inputs. | 08-25-2011 |
20110214092 | System and Method for Management of User Interactions Using Configurable Listeners in a Data Processing System - A system, method, and computer program product for management of user interactions with a data processing system. A method includes loading a listener dependency definition for a user interaction listener in a data processing system, and initializing listener lookup information for the user interaction listener. The method includes detecting a defined user interaction event by the user interaction listener, and handling the detected defined user interaction event by performing a corresponding defined action. | 09-01-2011 |
20110214093 | STORAGE MEDIUM STORING OBJECT CONTROLLING PROGRAM, OBJECT CONTROLLING APPARATUS AND OBJECT CONTROLLING METHOD - A game apparatus includes a first LCD and a second LCD, and on the second LCD, a touch panel is provided. On the first LCD, an enemy object is displayed. On the second LCD, a drawn object generated according to a touch operation is displayed. When the drawn object is generated, the drawn object is moved according to a moving course based on a locus of the touch operation. When the drawn object hits the enemy object, the enemy object is damaged. | 09-01-2011 |
20110214094 | HUMAN-MACHINE INTERFACE - A human-machine interface includes a panel formed of energy transmissive material having a contact surface on which one or more contacts may simultaneously be made. An energy source directs energy to the panel. The panel transmits energy received from the energy source to the contact surface. At least one parameter of the energy transmitted by the panel is altered at regions where contacts are made with the contact surface. A detector is coupled to the panel and detects the at least one parameter of the energy generally over the area of the contact surface and outputs values corresponding thereto. A processor is in communication with the detector. The processor processes the output values to determine the locations of the contact regions on the contact surface and at least one other attribute associated with each of the contacts. | 09-01-2011 |
20110219340 | SYSTEM AND METHOD FOR POINT, SELECT AND TRANSFER HAND GESTURE BASED USER INTERFACE - A system and method for a point, select and transfer hand gesture based user interface is disclosed. In one embodiment, a depth image of a hand gesture is captured using an in-front camera substantially on a frame by frame basis within a predefined interaction volume. Also, a nearest point of the hand gesture to a display screen of a display device is found using a substantially nearest depth value in the captured depth image for each frame. Further, an image-to-screen mapping of the captured depth image and the found nearest point to the display screen is performed upon validating the found nearest point as associated with the hand for each frame. Moreover, one of select options displayed on the display screen is pointed and selected when the substantially nearest depth value is within one or more predetermined threshold ranges, and based on the outcome of the image-to-screen mapping. | 09-08-2011 |
20110225553 | Use Of Standalone Mobile Devices To Extend HID Capabilities Of Computer Systems - A mobile device is adapted so that its HID functionality may be used to control an associated computer GUI. The computer may also be used to extend the HID capabilities of the mobile device. | 09-15-2011 |
20110239166 | METHOD AND SYSTEM FOR CONTROLLING FUNCTIONS IN A MOBILE DEVICE BY MULTI-INPUTS - A method and system for providing control functions in a mobile device according to modes of inputs are provided. The method includes receiving a proximity signal via a sensing module, detecting a touch signal via a touch screen while the proximity signal is being retained, and executing a function set according to the input mode of the touch signal. | 09-29-2011 |
20110246951 | PORTABLE DEVICE AND UNLOCKING METHOD THEREOF - A portable device and an unlocking method store information groups, each information group including one primary key and at least one subordinate key. The portable device obtains the primary key and at least one subordinate key in one information group, and at least one subordinate key in another information group, and displays them as an unlocking image. The portable device further detects a user selection and determines whether the user selection comprises the primary key and at least one subordinate key in the same information group. The portable device switches from a lock state to an unlock state when the user selection comprises the primary key and at least one subordinate key in the same information group. | 10-06-2011 |
20110246952 | ELECTRONIC DEVICE CAPABLE OF DEFINING TOUCH GESTURES AND METHOD THEREOF - An electronic device capable of defining touch gestures is provided. The electronic device includes a touch panel, a touch detection circuit, a processing unit, and a storage unit. The touch detection circuit produces touch signals in response to user touch operations on the touch panel. The processing unit includes a gesture recognition module and a defining module. The gesture recognition module receives the touch signals from the touch detection circuit and recognizes touch gestures according to the touch signals. The defining module controls the electronic device to enter a defining mode in response to a user operation, prompts the user to define a touch gesture for a function of the electronic device, establishes a mapping between the touch gesture input by the user and the function selected by the user, and stores the mapping in the storage unit. | 10-06-2011 |
20110252383 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - There is provided an information processing apparatus including a display section which displays, as a first layout state, an object group including a plurality of objects arranged in a first direction, a detection section which detects an operation input that is input to the display section, and a control section which, when the detection section detects an operation input in a second direction that is perpendicular to the first direction, changes the first layout state into a second layout state in which the respective objects constituting the object group which has been selected are spread and pieces of information associated with the plurality of objects, respectively, are displayed. | 10-13-2011 |
20110258586 | USER CONTROL - An apparatus including: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to: resolve a user input trace into a first displacement in a first direction and a second displacement in a second direction, orthogonal to the first direction; and control a position within a range in dependence upon both the first displacement and the second displacement. | 10-20-2011 |
20110265045 | ELECTRONIC SYSTEM AND METHOD FOR OPERATING TOUCH SCREEN THEREOF - The disclosure provides an unlock method for an electronic system with a touch screen. The unlock method includes the steps of: receiving a triggering event when the system is locked; activating the touch screen in response to the event; receiving an input gesture; comparing the input gesture with a customized gesture; and unlocking the system if the input gesture matches the customized gesture, which is customized by the user and is not a default unlock gesture built into the electronic system. | 10-27-2011 |
20110265046 | THROWING GESTURES FOR MOBILE DEVICES - At least one tilt sensor generates a sensor value. A context information server receives the sensor value and sets at least one context attribute. An application uses at least one context attribute to determine that a flinging gesture has been made and to change an image on a display in response to the flinging gesture. | 10-27-2011 |
20110271235 | Method for displaying a setting menu and corresponding device - The invention relates to a method for displaying a settings menu. In order to optimize the graphical representation of settings carried out by a spectator, the method comprises a series of display steps. | 11-03-2011 |
20110271236 | DISPLAYING CONTENT ON A DISPLAY DEVICE | 11-03-2011 |
20110283241 | Touch Gesture Actions From A Device's Lock Screen - Embodiments enable a mobile device to execute an action analogous to a user-defined action in response to receipt of a gesture analogous to a user-defined gesture. In a first embodiment, a computer-implemented method executes an action on a mobile device. A lock screen view is displayed on the mobile device to prevent unauthorized and inadvertent access to the mobile device's data. While the mobile device is locked, a touch gesture having a pre-defined shape is detected on a touch screen of the mobile device independently of the initial position of the touch gesture on the touch screen. In response to detection of the touch gesture, a particular action is executed on the mobile device while the mobile device stays locked. The particular action is determined according to the pre-defined shape. In this way, detection of the touch gesture causes the particular action to execute while keeping the mobile device locked. | 11-17-2011 |
20110283242 | REPORT OR APPLICATION SCREEN SEARCHING - Search results may be graphically displayed on a client device as thumbnail images. A search for one or more files in the form of a search term may be received from a client device. The search may be executed based on the search term by searching one or more databases corresponding to applications associated with the client device. One or more files may be identified that satisfy the search term. Metadata associated with the identified files may be processed to generate a thumbnail image of the file based at least in part on the metadata for each of the one or more identified files. The thumbnail images of at least a subset of the identified files may be provided to and displayed on the client device. The associated files may be accessed by the client device. | 11-17-2011 |
20110289462 | Computing Device Magnification Gesture - Computing device magnification gesture techniques are described. In implementations, a first input is recognized as a magnification gesture to initiate magnification of at least a portion of a user interface displayed by a display device of a computing device. The magnified portion is displayed in the user interface as at least partially encompassed by an unmagnified portion of the user interface. A second input is recognized as specifying a modification to be made to data included in the magnified portion of the user interface, the second input recognized as occurring during provision of the first input. Responsive to recognition that the first input is no longer being provided, the display of the magnified portion ceases in the user interface. | 11-24-2011 |
20110296355 | Techniques for self adjusting kiosk display information - Techniques for self adjusting kiosk display information are provided. Presentation information is centered within a display of the kiosk. A center location for the presentation information is recalibrated within the display based on the direction of a user. | 12-01-2011 |
20110296356 | Unlocking a Device by Performing Gestures on an Unlock Image - A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture. | 12-01-2011 |
20110296357 | Method For Providing A User Interface Using Three-Dimensional Gestures And An Apparatus Using The Same - Provided is a method capable of making various modifications to widgets, graphic objects, or images, which are displayed on a display device, according to motions of a plurality of input units such as finger or stylus pen, with the use of a three-dimensional multi-sensor configured to detect the motions of the input units in a space, without touching the display device. | 12-01-2011 |
20110302538 | SYSTEM AND METHOD FOR DISTINGUISHING MULTIMODAL COMMANDS DIRECTED AT A MACHINE FROM AMBIENT HUMAN COMMUNICATIONS - A method and system of distinguishing multimodal HCI from ambient human interactions using wake up commands is disclosed. In one embodiment, in a method of distinguishing multimodal HCI from ambient human interactions, a wake up command is detected by a computing system. The computing system is then woken up to receive a valid user command from a user upon detecting the wake up command. A countdown timer is substantially simultaneously turned on upon waking up the computing system to receive valid user commands. The countdown timer is set based on application usage parameters such as semantics of the valid user command and context of an application associated with the valid user command. | 12-08-2011 |
20110307840 | ERASE, CIRCLE, PRIORITIZE AND APPLICATION TRAY GESTURES - Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including an activate gesture, a fill gesture, a level gesture, a jump gesture, a checkmark gesture, a strikethrough gesture, an erase gesture, a circle gesture, a prioritize gesture, and an application tray gesture. | 12-15-2011 |
20110307841 | METHOD AND APPARATUS FOR BINDING USER INTERFACE ELEMENTS AND GRANULAR REFLECTIVE PROCESSING - An approach is provided for binding user interface elements and granular reflective processing. An information management infrastructure determines to detect an event, from a first device, for specifying one or more user interface elements for transfer to a second device. The information management infrastructure further identifies one or more processes bound to the user interface elements. The information management infrastructure also determines at least one of a user context, an execution context within the user context, and one or more other execution contexts for the processes, wherein the one or more other execution contexts are from at least one of the user context and one or more other user contexts. The information management infrastructure further causes, at least in part, serialization of at least one of the user context, the execution context, and the one or more other execution contexts. The information management infrastructure further determines to transmit the serialization to the second device to initiate reconstruction of the at least one of the user context, the execution context, and the one or more other execution contexts. | 12-15-2011 |
20110307842 | ELECTRONIC READING DEVICE - This invention provides an electronic reading device which comprises an eye glass frame and a camera-projection component mounted on the eye glass frame comprising a projection unit to project an image onto a projection surface and an optical sensor unit to perform a scan of a region near the projection surface, wherein the optical sensor unit is configured to operate as a user interface by detecting a user input based on the scan. The electronic reading device can create the same page-reading and page-turning experience on any surface, such as walls, tables, or other kinds of panels, as reading a conventional book, i.e. simulating “page turning like a real paper book”. | 12-15-2011 |
20110307843 | Information Processing Apparatus, Operation Method, and Information Processing Program - An information processing apparatus includes an operation unit; and a control unit performing a process in response to an operation executed through the operation unit. Different gesture operations are able to be assigned to an operation corresponding to copy of information and an operation corresponding to cut of information, respectively. The control unit selects a portion designated by a user in information displayed on a display unit, and then copies the selected portion when the user executes the gesture operation corresponding to the copy through the operation unit, whereas the control unit cuts the selected portion when the user executes the gesture operation corresponding to the cut through the operation unit. | 12-15-2011 |
20110314425 | Air gesture recognition type electronic device operating method - An air gesture recognition type electronic device operating method for operating an electronic device having multiple sensors on each of multiple peripheral sides thereof by: moving an object close to the sensors to produce sensing signals and determining whether or not the object has been continuously sensed; determining whether or not the moving direction and moving speed of the object match respective predetermined values; and then combining and computing all received sensing signals to produce an operating parameter for running an air gesture application procedure. Thus, a user can operate the electronic device without direct contact or the use of any camera or input media, saving hardware cost and enhancing operational flexibility. | 12-22-2011 |
20110314426 | RISK-BASED ALERTS - Some embodiments provide a system that facilitates use of a computer system. During operation, the system obtains notification of a risk associated with a user action on the computer system. Next, the system generates an alert within a user interface based at least on a severity of the risk. The alert may include a set of user-interface elements representing an effect of the user action. The system then receives a response to the alert from a user of the computer system. The response may include a dragging of a first of the user-interface elements in one or more directions to a second of the user-interface elements. Finally, the system processes the user action based at least on the response. | 12-22-2011 |
20110314427 | PERSONALIZATION USING CUSTOM GESTURES - A method and apparatus allow users of touchscreen-based devices to create custom gestures on the touchscreen that are associated with behaviors and recognized throughout the operation of the device. The method and apparatus include sensing a user interaction on a touchscreen and detecting whether the sensed user interaction is a custom gesture stored in a behavior repository, the custom gesture being a user-defined interaction on the touchscreen. A gesture processor determines a behavior that is associated with the custom gesture. A personality adapter selects an appropriate operation from a set of operations associated with the behavior based on policies for the behavior, and a main processor executes the appropriate operation. | 12-22-2011 |
20110314428 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus includes: a display unit; a user interface (UI) generator which generates UI information to be displayed on the display unit; a user input unit which includes a touch pad; and a controller which controls the UI generator to display a plurality of selective items per item page on the display unit, to move a focus in accordance with a touch motion of the user, and to stop moving the focus if the focus is located at an outermost selective item positioned at an edge of the item page. | 12-22-2011 |
20110314429 | APPLICATION PROGRAMMING INTERFACES FOR GESTURE OPERATIONS - At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application to provide gesture operations for a display of a device. A method for operating through an application programming interface (API) in this environment includes transferring a scaling transform call. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points. | 12-22-2011 |
20110314430 | APPLICATION PROGRAMMING INTERFACES FOR GESTURE OPERATIONS - At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application to provide gesture operations for a display of a device. A method for operating through an application programming interface (API) in this environment includes transferring a scaling transform call. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points. | 12-22-2011 |
20120005632 | EXECUTE A COMMAND - A method for executing a command, including detecting a gesture from a user with a sensor, identifying the gesture and a command associated with the gesture, identifying at least one corresponding device on which to execute the command, and configuring a device to execute the command on at least one of the corresponding devices. | 01-05-2012 |
20120011476 | ELECTRONIC DEVICE AND METHOD FOR SEARCHING MULTIMEDIA FILE - An electronic device includes a touch screen, a memory module, a dividing module, an identifying module, a sorting module, and a driving module. The memory module saves multimedia files (MFs). Each MF has four tags, and each tag corresponds to a category. The dividing module drives the touch screen to display areas that make up a grid. The grid has reference lines: a horizontal line, a vertical line, a first diagonal line, and a second diagonal line. The identifying module identifies a vector of a user's slide on the touch screen that is substantially parallel to a reference line. The sorting module receives the vector identified by the identifying module and generates a list made up of the tags of the same category that corresponds to the vector. The driving module reads the list and drives the areas corresponding to the vector to display the tags of the list. | 01-12-2012 |
20120023456 | INTERACTIVE IMAGE MATTING - A user interface enables interactive image matting to be performed on an image. The user interface may provide results including an alpha matte as feedback in real time. The user interface may provide interactive tools for selecting a portion of the image, and an unknown region for alpha matte processing may be automatically generated adjacent to the selected region. The user may interactively refine the alpha matte as desired to obtain a satisfactory result. | 01-26-2012 |
20120023457 | PRESENTATION OF ADVERTISEMENTS BASED ON USER INTERACTIVITY WITH A WEB PAGE - Methods and systems for presenting advertisements based on user interactivity with a web page are provided. According to embodiments of the invention, a web page is rendered on a client device. Gesture interactivity with the web page is monitored on the client device. A trigger is executed which defines an interactive event. When the interactive event occurs, as determined based on the monitored gesture interactivity with the web page, secondary content, such as an advertisement, is downloaded and displayed on the client device. | 01-26-2012 |
20120023458 | Unlocking a Device by Performing Gestures on an Unlock Image - A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture. | 01-26-2012 |
20120023459 | SELECTIVE REJECTION OF TOUCH CONTACTS IN AN EDGE REGION OF A TOUCH SURFACE - The selective rejection of touch contacts in an edge region of a touch sensor panel is disclosed. In addition, by providing certain exceptions to the rejection of edge contacts, the functionality of the touch sensor panel can be maximized. Contacts in edge bands around the perimeter of a touch sensor panel can be ignored. However, if a contact in the edge band moves beyond a threshold distance or speed, it can be recognized as part of a gesture. To accommodate different finger sizes, the size of the edge band can be modified based on the identification of the finger or thumb. Furthermore, if contacts in the center region of a touch sensor panel track the movement of contacts in the edge band, the contacts in the edge band can be recognized as part of a gesture. | 01-26-2012 |
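The edge-band rejection described in the abstract above (20120023459) can be sketched as a small accept/reject predicate. This is a hypothetical simplification, not the patent's implementation: the band width, distance threshold, and speed threshold (`band`, `dist_thresh`, `speed_thresh`) and the single-contact model are illustrative assumptions.

```python
def accept_contact(x, y, moved, speed, panel_w, panel_h,
                   band=20.0, dist_thresh=10.0, speed_thresh=50.0):
    """Return True if a touch contact should be processed as part of a gesture.

    Contacts that begin inside an edge band around the panel perimeter are
    ignored unless they move far or fast enough to look intentional.
    All parameter values here are illustrative assumptions.
    """
    in_edge_band = (x < band or y < band or
                    x > panel_w - band or y > panel_h - band)
    if not in_edge_band:
        return True  # center-region contacts are always accepted
    # Exception: an edge-band contact that exceeds a distance or speed
    # threshold is recognized as part of a gesture after all.
    return moved > dist_thresh or speed > speed_thresh
```

A resting thumb near the bezel (small `moved`, low `speed`) is ignored, while a deliberate swipe starting in the same band passes the threshold test. The abstract's further refinements, such as sizing the band per finger and tracking correlated center-region contacts, are omitted here.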
20120023460 | APPLICATION PROGRAMMING INTERFACES FOR GESTURE OPERATIONS - At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application to provide gesture operations for a display of a device. A method for operating through an application programming interface (API) in this environment includes transferring a scaling transform call. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points. | 01-26-2012 |
20120023461 | APPLICATION PROGRAMMING INTERFACES FOR GESTURE OPERATIONS - At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application to provide gesture operations for a display of a device. A method for operating through an application programming interface (API) in this environment includes transferring a scaling transform call. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points. | 01-26-2012 |
20120023462 | SKIPPING THROUGH ELECTRONIC CONTENT ON AN ELECTRONIC DEVICE - Embodiments of the present invention disclose a method for skipping through electronic content displayed on an electronic device having a touchscreen display coupled to a processing engine. According to one embodiment, a multi-touch gesture is received from a user. Based on the user's multi-touch gesture, electronic content associated with digital media immediately advances to a subsequent section or immediately reverses back to a previous section of the digital media. | 01-26-2012 |
20120030632 | SYSTEM, METHOD AND APPARATUS FOR CONTROLLING PRESENTATION OF CONTENT - An application for a system that enables cooperating devices to transfer presentation of content from one device to the other by sending either the content or an identification of content from a source device to a destination device. In some embodiments, the actual content is transferred while in other embodiments, an identification of the content and a position within the content is transferred from the source device to the destination device. | 02-02-2012 |
20120030633 | DISPLAY SCENE CREATION SYSTEM - Provided is a display scene creation system that can cause a display scene to make a transition when a gesture is input to a touch panel, without a processing program that relates the touch panel to the gesture. A design of a display scene is set. One or more display components to be displayed in the set design of the display scene are set. A gesture with which the display scene makes a transition when the gesture is input to the set display components is set. A transition display scene table is provided that stores the set gesture and a post-transition display scene where the gesture and the post-transition display scene are associated with each other. | 02-02-2012 |
20120030634 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM - An information processing device includes an operating unit, and a control unit for switching, when dragging is performed which is an operation to change a predetermined value via the operating unit, between changing the value by an amount equivalent to the amount of dragging, and setting a change speed of the value according to the dragging and continue to change the value at the change speed, according to the position of an ending point as to a starting point of the dragging. | 02-02-2012 |
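The dual dragging behavior in the abstract above (20120030634), changing a value by the drag amount near the starting point but switching to a sustained change speed farther away, can be sketched as follows. The zone size, gains, and time-step model are assumptions for illustration, not the patent's actual parameters.

```python
def drag_update(value, start, end, dt, zone=100.0, gain=0.5, rate_gain=2.0):
    """Update a value being adjusted by a drag from `start` to `end`.

    Within a hypothetical `zone` around the starting point, the value tracks
    the drag distance directly; beyond it, the excess distance instead sets
    a change speed, and the value keeps changing at that speed over time dt.
    """
    delta = end - start
    if abs(delta) <= zone:
        # Near the start point: change by an amount equivalent to the drag.
        return value + gain * delta
    # Far from the start point: the offset beyond the zone sets a change
    # speed that continues to apply while the drag is held (time step dt).
    excess = delta - zone if delta > 0 else delta + zone
    return value + rate_gain * excess * dt
```

Holding the finger just past the zone boundary thus scrolls slowly but continuously, while dragging far past it scrolls fast, which is the usual motivation for this kind of hybrid mapping.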
20120030635 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND INFORMATION PROCESSING PROGRAM - An information processing apparatus includes: an operation section; and a control section adapted to execute a process in response to dragging through the operation section; the control section changing, when dragging is carried out continuously after a particular operation by the operation section, a process to be executed in response to the dragging based on the particular operation. | 02-02-2012 |
20120030636 | INFORMATION PROCESSING APPARATUS, DISPLAY CONTROL METHOD, AND DISPLAY CONTROL PROGRAM - An information processing apparatus includes: an operation unit; and a control unit performing a process corresponding to dragging and displaying, on a display unit, a cursor which elongates from a start point of the dragging to an end point of the dragging and of which at least one of a size and a shape is different at one end portion, which is on a side of the start point of the dragging, and at the other end portion, which is on a side of the end point of the dragging, when the dragging is executed through the operation unit. | 02-02-2012 |
20120030637 | QUALIFIED COMMAND - A method for executing a qualified command including detecting a hand gesture input for identifying a command, detecting one or more non-hand gesture inputs to qualify the command, and configuring a processor to execute the qualified command on a machine in response to the hand gesture input and one or more non-hand gesture inputs. | 02-02-2012 |
20120036485 | Motion Driven User Interface - A motion driven user interface for a mobile device is described which provides a user with the ability to cause execution of user interface input commands by physically moving the mobile device in space. The mobile device uses embedded sensors to identify its motion which causes execution of a corresponding user interface input command. Further, the command to be executed can vary depending upon the operating context of the mobile device. | 02-09-2012 |
20120042288 | SYSTEMS AND METHODS FOR INTERACTIONS WITH DOCUMENTS ACROSS PAPER AND COMPUTERS - Systems and methods provide for mixed use of physical documents and a computer, and more specifically provide for detailed interactions with fine-grained content of physical documents that are integrated with operations on a computer to provide for improved user interactions between the physical documents and the computer. The system includes a camera which processes the physical documents and detects gestures made by a user with respect to the physical documents, a projector which provides visual feedback on the physical document, and a computer with a display to coordinate the interactions of the user with the computer and the interactions of the user with the physical document. The system, which can be portable, is capable of detecting interactions with fine-grained content of the physical document and translating interactions at the physical document with the computer display, and vice versa. | 02-16-2012 |
20120047468 | Translating User Motion Into Multiple Object Responses - A system for translating user motion into multiple object responses of an on-screen object based on user interaction of an application executing on a computing device is provided. User motion data is received from a capture device from one or more users. The user motion data corresponds to user interaction with an on-screen object presented in the application. The on-screen object corresponds to an object other than an on-screen representation of a user that is displayed by the computing device. The user motion data is automatically translated into multiple object responses of the on-screen object. The multiple object responses of the on-screen object are simultaneously displayed to the users. | 02-23-2012 |
20120047469 | METHOD AND APPARATUS FOR ADAPTING A CONTENT PACKAGE COMPRISING A FIRST CONTENT SEGMENT FROM A FIRST CONTENT SOURCE TO DISPLAY A SECOND CONTENT SEGMENT FROM A SECOND CONTENT SOURCE - An apparatus may include a user interface configured to display a content package including a first content segment from a first content source. A gesture interpreter may be configured to receive a gesture input in a positional relationship to the first content segment. The apparatus may further include a content relationship manager which may be configured to determine relationships between content segments such that a content segment selector may select a second content segment relating to the first content segment from a second content source. Further, the apparatus may include a content package adaptor configured to adapt the content package to provide for display of the second content segment. In some instances the content package adaptor may adapt the content package by providing for display of a second content package, for example from a different application than an application which the first content segment is from. | 02-23-2012 |
20120047470 | METHOD AND APPARATUS FOR BROWSING AN ELECTRONIC BOOK ON A TOUCH SCREEN DISPLAY - A method is disclosed for navigating in an electronic book comprising a plurality of pages, the method comprising displaying in an interface a first given page of the electronic book with a fore edge section representative of a fore edge of the electronic book; detecting a finger motion in the fore edge section and displaying a second given page of the electronic book wherein the second given page is displayed depending on characteristics of the detected finger motion in the fore edge section. | 02-23-2012 |
20120060127 | Automatic orientation of items on a touch screen display utilizing hand direction - Methods of orienting items on the display to a user, based on the direction of the user's hand or of the user, are disclosed. The methods rely on detecting the direction of the user's hand and orienting the item at a selected orientation thereto. An aspect of the invention also includes methods of detecting a user's position about a touch screen display. | 03-08-2012 |
20120060128 | DIRECT, GESTURE-BASED ACTIONS FROM DEVICE'S LOCK SCREEN - Embodiments enable a mobile device to execute an action analogous to a user-defined action in response to receipt of a gesture analogous to a user-defined gesture. In a first embodiment, a computer-implemented method executes an action on a mobile device. A lock screen view is displayed on the mobile device to prevent unauthorized and inadvertent access to the mobile device's data. While the mobile device is locked, a touch gesture having a pre-defined shape is detected on a touch screen of the mobile device independently of the initial position of the touch gesture on the touch screen. In response to detection of the touch gesture, a particular action is executed on the mobile device while the mobile device stays locked. The particular action is determined according to the pre-defined shape. In this way, detection of the touch gesture causes the particular action to execute while keeping the mobile device locked. | 03-08-2012 |
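Detecting a gesture shape independently of its starting position, as in the abstract above (20120060128), is commonly done by normalizing the stroke's coordinates before matching. The following is a minimal sketch under stated assumptions: equal point counts, no resampling or scale normalization, and an invented tolerance `tol`; a real recognizer would do considerably more.

```python
import math

def normalize(points):
    """Translate a stroke so its centroid is at the origin, making the
    comparison independent of where on the screen the gesture started."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return [(x - cx, y - cy) for x, y in points]

def matches(stroke, template, tol=10.0):
    """Crude position-independent shape test: average point-to-point
    distance between the normalized stroke and template (assumes equal
    point counts; resampling and scaling are deliberately omitted)."""
    a, b = normalize(stroke), normalize(template)
    if len(a) != len(b):
        return False
    err = sum(math.hypot(ax - bx, ay - by)
              for (ax, ay), (bx, by) in zip(a, b)) / len(a)
    return err < tol
```

With this normalization, the same horizontal swipe matches its template whether it is drawn at the top or the bottom of the lock screen, which is the position-independence property the abstract describes.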
20120060129 | MOBILE TERMINAL HAVING TOUCH SCREEN AND METHOD FOR DISPLAYING CONTENTS THEREIN - A mobile terminal having a touch screen and a method for displaying contents therein are provided. The method for displaying contents in a mobile terminal having a touch screen includes determining whether a touch action moves when the touch action is sensed on displayed contents, calculating a physical display change amount for changing and displaying the contents according to the touch action when the touch moves, and continuously changing and displaying the contents according to the calculated physical display change amount when the touch action stops. | 03-08-2012 |
20120066650 | Electronic Device and Method for Evaluating the Strength of a Gestural Password - An electronic device includes a movement sensing assembly for providing signals indicative of movement of an object with respect to the electronic device, wherein the movement includes a sequence of gestures making up a proposed gestural password. A processor in electronic communication with the movement sensing assembly is operable to receive and evaluate the signals to compute a password strength metric indicative of a strength of the proposed gestural password, and a user output component receives and displays an acceptability of the password strength metric. | 03-15-2012 |
20120072873 | TRANSPARENT DISPLAY DEVICE AND METHOD FOR PROVIDING OBJECT INFORMATION - According to an embodiment of the present invention, a method for providing object information includes determining an eye direction of a person toward a first region of a transparent display, selecting at least one object seen via the transparent display in the determined eye direction, acquiring information on the selected object, and displaying the acquired information on the transparent display. | 03-22-2012 |
20120079435 | INTERACTIVE PRESENTATION CONTROL SYSTEM - An interactive presentation control system includes a computer, a projection screen, a computer-controlled projector, and a presentation control device. The projector projects images generated by the computer onto the projection screen. The presentation control device includes a storage unit, an image capturing unit, and a processing unit. The storage unit stores a control table recording the relationship between gesture shadow patterns and control commands. Each gesture shadow pattern corresponds to one control command. The image capturing unit captures an image of the projection screen periodically. The processing unit processes the captured image to determine whether the captured image includes a gesture shadow pattern, and determines the corresponding control command if the determined gesture shadow pattern is recorded in the control table. The processing unit further generates a control signal corresponding to the determined control command, and transmits the control signal to the computer through the communication unit. | 03-29-2012 |
20120084734 | MULTIPLE-ACCESS-LEVEL LOCK SCREEN - A multiple-access-level lock screen system allows different levels of functionality to be accessed on a computing device. For example, when a device is in a locked state, a user can select (e.g., by making one or more gestures on a touchscreen) a full-access lock screen pane and provide input that causes the device to be fully unlocked, or a user can select a partial-access lock screen pane and provide input that causes only certain resources (e.g., particular applications, attached devices, documents, etc.) to be accessible. Lock screen panes also can be selected (e.g., automatically) in response to events. For example, when a device is in a locked state, a messaging access lock screen pane can be selected automatically in response to an incoming message, and a user can provide input at the messaging access lock screen pane that causes only a messaging application to be accessible. | 04-05-2012 |
20120084735 | GESTURE CONTROLS FOR MULTI-SCREEN USER INTERFACE - Method and apparatus for controlling a computing device using gesture inputs. The computing device may be a handheld computing device with multiple displays. The displays may be capable of displaying a graphical user interface (GUI). The GUI may be a multi screen GUI or a single screen GUI such that receipt of gesture inputs may result in the movement of a GUI from one display to another display or may result in maximization of a multi screen GUI across multiple displays. | 04-05-2012 |
20120084736 | GESTURE CONTROLLED SCREEN REPOSITIONING FOR ONE OR MORE DISPLAYS - Control of a computing device using gesture inputs. The computing device may be a handheld computing device with a plurality of displays. The displays may be capable of displaying a graphical user interface (GUI). The plurality of displays may be modified in response to receipt of a gesture input such that the displays are changed from a first state to a second state. The change of the displays from the first state to the second state may include moving a GUI from a first display to a second display. Additionally, a second GUI may be moved from the second display to the first display. The gesture input may comprise multiple touches, such as a pinch gesture. | 04-05-2012 |
20120084737 | GESTURE CONTROLS FOR MULTI-SCREEN HIERARCHICAL APPLICATIONS - Control of a computing device using gesture inputs. The computing device may be a handheld computing device with a plurality of displays. The displays may be capable of displaying a graphical user interface (GUI). The plurality of displays may be modified in response to receipt of a gesture input such that a hierarchical application having related GUI screens are modified in response to the gesture input. The modification may include changing the hierarchical application from being displayed in a single screen mode to being displayed in a multi screen mode or vice versa. | 04-05-2012 |
20120084738 | USER INTERFACE WITH STACKED APPLICATION MANAGEMENT - Methods and apparatus for controlling a computing device using gesture inputs. The gesture inputs may be operative to move screens corresponding to applications executing on the handheld computing device from one display to another. Additionally, a multi portion gesture may be used to target different screens. For example, a first portion of the gesture may maintain or “pin” a screen in a display such that a second portion of the gesture is operative to move a different screen behind the pinned application. | 04-05-2012 |
20120084739 | FOCUS CHANGE UPON USE OF GESTURE TO MOVE IMAGE - Embodiments are described for handling focus when a gesture is input in a multi-screen device. In embodiments, the gesture indicates a request to move an image displayed on the multi-screen device. In response, the image is moved and the focus is placed on the moved image. | 04-05-2012 |
20120089952 | APPARATUS AND METHOD FOR ADAPTIVE GESTURE RECOGNITION IN PORTABLE TERMINAL - An apparatus and method for gesture recognition in a portable terminal. An operation of the portable terminal includes determining a user situation by using at least one situation information, determining a user's gesture by using at least one sensor measurement value, and performing a function corresponding to the user situation and the gesture. | 04-12-2012 |
20120096411 | Navigation in a Display - An apparatus including a controller configured to switch from a continuous navigation mode to a discontinuous navigation mode in response to a predefined discontinuous navigation input, and to switch from a discontinuous navigation mode to a continuous navigation mode in response to a predefined continuous navigation input. | 04-19-2012 |
20120102436 | APPARATUS AND METHOD FOR USER INPUT FOR CONTROLLING DISPLAYED INFORMATION - In accordance with an example embodiment of the present invention, a method for proximity based input is provided, comprising: detecting presence of an object in close proximity to an input surface, detecting a displayed virtual layer currently associated with the object on the basis of distance of the object to the input surface, detecting a hovering input by the object, and causing a display operation to move at least a portion of the associated virtual layer in accordance with the detected hovering input. | 04-26-2012 |
20120102437 | Notification Group Touch Gesture Dismissal Techniques - In exemplary embodiments, multiple notifications can be displayed by a touch-screen of a computing device and dismissed as a group. For example, touch input sensed by the touch-screen can be used to select multiple notifications. The multiple notifications can then be dismissed using a dismissal gesture. In addition to the foregoing, other aspects are described in the detailed description, claims, and figures. | 04-26-2012 |
20120102438 | DISPLAY SYSTEM AND METHOD OF DISPLAYING BASED ON DEVICE INTERACTIONS - The present invention describes a display system capable of interacting with an interfacing device positioned behind a display screen. The display system includes a display, including a display screen that in one embodiment is transparent. The display system further includes: a viewpoint assessment component for determining a viewpoint of a user positioned in front of the display screen and an object tracking component for tracking the user manipulation of an object positioned behind the display screen. The display system includes an interaction tracking component. The interaction tracking component receives data regarding predefined interactions with the interfacing device. Responsive to the predefined interactions with the interfacing device, content on the display screen is modified. | 04-26-2012 |
20120102439 | SYSTEM AND METHOD OF MODIFYING THE DISPLAY CONTENT BASED ON SENSOR INPUT - A display system comprised of: a display including a display screen configured to operate in at least a transparent display mode; an interaction sensing component for receiving sensed data regarding physical user interactions; and an interaction display control component, wherein responsive to the sensed data meeting predefined interaction criteria, content on the display screen is modified. | 04-26-2012 |
20120110516 | POSITION AWARE GESTURES WITH VISUAL FEEDBACK AS INPUT METHOD - A gesture based user interface is provided for a user to interact with a device in order to operate and control the device through detection of gestures and movements of the user. Visual feedback of the user gestures is provided to the user to aid in the user's operational and control decisions of a device. An image capturing device such as a video camera may be employed to capture a user's image, and an integrated application on a computing device may process continuous images from the capturing device to recognize and track user gestures. The gestures may correlate to an object and/or location on the display and the user's image may be projected on the display to provide visual feedback of the user's interaction. | 05-03-2012 |
20120110517 | METHOD AND APPARATUS FOR GESTURE RECOGNITION - A touchscreen device is configured to display a number of user interface elements in accordance with a menu hierarchy. Upon receipt of a predetermined touchscreen gesture (e.g., the circular motion of a manipulator), the menu hierarchy is bypassed and the user is given immediate control over a selected function, for example, a tuning function such as audio volume, screen contrast, and the like. | 05-03-2012 |
20120110518 | TRANSLATION OF DIRECTIONAL INPUT TO GESTURE - A user device is disclosed which includes a touch input and a keypad input. The user device is configured to operate in a gesture capture mode as well as a navigation mode. In the navigation mode, the user interfaces with the touch input to move a cursor or similar selection tool within the user output. In the gesture capture mode, the user interfaces with the touch input to provide gesture data that is translated into key code output having a similar or identical format to outputs of the keypad. | 05-03-2012 |
20120110519 | GRAPHICAL MANIPULATION OF DATA OBJECTS - In an embodiment, a user input defining an enclosed, graphical shape on a video display is received. A number of graphical items are identified as being included within the enclosed, graphical shape. Here, each graphical item is displayed on the video display and represents a data object that has a number of properties. A property is extracted from the number of properties that the data objects have in common based on the identification. A number of other manipulation techniques are also described. | 05-03-2012 |
20120110520 | DEVICE FOR USING USER GESTURE TO REPLACE EXIT KEY AND ENTER KEY OF TERMINAL EQUIPMENT - A device for using user gesture to replace the exit key and the enter key of a terminal equipment, comprising a CPU module, a gesture input module, a gesture processing module, a terminal application module, a memory module and a terminal function module. The CPU module can be connected with the gesture input module, the gesture processing module, the terminal application module, the memory module and the terminal function module, and can receive the user gesture input information sent by the gesture input module, the setting content information sent by the terminal application module, and the gesture identifying information sent by the gesture processing module. The CPU module can exit with or without saving from the received setting content information based on the gesture identifying information. The device increases the viewable area of the user and simplifies the human-machine interaction process. | 05-03-2012 |
20120117517 | USER INTERFACE - A method of controlling a user interface comprising the steps of: detecting movement of a contact continuously between a first contact point at which contact is made and a second contact point at which contact is released; determining a state of a selection/de-selection mode of operation in dependence on: a line traced between the first contact point and the second contact point traversing one or more objects of an application layer; the first contact point not being co-incident with an object of the application layer; and the second contact point not being coincident with an object of the application layer. | 05-10-2012 |
20120124525 | METHOD FOR PROVIDING DISPLAY IMAGE IN MULTIMEDIA DEVICE AND THEREOF - A display device includes a sensor to track movement of at least one body part of a person and a processor to compare an amount of the tracked movement to a reference value, recognize a position shift when the amount of tracked movement exceeds the reference value, and perform a predetermined function of the display device based on the position shift. | 05-17-2012 |
20120124526 | METHOD FOR CONTINUING A FUNCTION INDUCED BY A MULTI-TOUCH GESTURE ON A TOUCHPAD - In a method for continuing a function induced by a multi-touch gesture on a touchpad, the object number of the multi-touch gesture is monitored while the function is performed. If the object number is detected to have changed such that one or more objects are still on the touchpad, the objects left on the touchpad are examined to identify whether one or more of them move clockwise or anticlockwise, and if a clockwise or anticlockwise movement is detected, the function is continued. | 05-17-2012 |
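The clockwise/anticlockwise test in the abstract above (20120124526) is typically implemented by summing the cross products of successive displacement vectors along the tracked path. This sketch assumes screen coordinates (y growing downward) and is an illustration, not the patent's actual method.

```python
def rotation_direction(points):
    """Classify a 2-D point path as 'clockwise', 'anticlockwise', or None
    using the sign of the summed cross products of successive segments.

    Screen coordinates (y growing downward) are assumed, so a positive
    cross-product sum corresponds to visually clockwise motion.
    """
    total = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        # z-component of the cross product of consecutive displacements
        total += (x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1)
    if total > 0:
        return "clockwise"
    if total < 0:
        return "anticlockwise"
    return None  # collinear or too few points to decide
```

A function such as continuous zoom or scroll could then keep running while `rotation_direction` of the remaining contact's recent samples stays non-None.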
20120124527 | PORTABLE ELECTRONIC DEVICE, AND CONTROL METHOD AND CONTROL PROGRAM FOR THE SAME - An object of the present invention is to provide a portable electronic device capable of performing character inputs whose usability is improved for users, and to provide a control method and a control program for the portable electronic device. Depending on a predetermined operation detected by a detecting unit, a character input control unit rotates an around area around a touched character "na", and depending on rotation of the around area, the character input control unit cancels display of some of the related characters based on a predetermined order, and displays new related characters related to the touched character "na" in the around area. | 05-17-2012 |
20120131513 | Gesture Recognition Training - Gesture recognition training is described. In an example, a gesture recognizer is trained to detect gestures performed by a user on an input device. Example gesture records, each showing data describing movement of a finger on the input device when performing an identified gesture are retrieved. A parameter set that defines spatial triggers used to detect gestures from data describing movement on the input device is also retrieved. A processor determines a value for each parameter in the parameter set by selecting a number of trial values, applying the example gesture records to the gesture recognizer with each trial value to determine a score for each trial value, using the score for each trial value to estimate a range of values over which the score is a maximum, and selecting the value from the range of values. | 05-24-2012 |
20120131514 | Gesture Recognition - Gesture recognition is described. In one example, gestures performed by a user of an input device having a touch-sensitive portion are detected using a definition of a number of regions corresponding to zones on the touch-sensitive portion, each region being associated with a distinct set of gestures. Data describing movement of the user's digits on the touch-sensitive portion is received, and an associated region for the data determined. The data is compared to the associated region's set of gestures, and a gesture applicable to the data selected. A command associated with the selected gesture can then be executed. In an example, comparing the data to the set of gestures comprises positioning a threshold for each gesture relative to the start of the digit's movement. The digit's location is compared to each threshold to determine whether a threshold has been crossed, and, if so, selecting the gesture associated with that threshold. | 05-24-2012 |
20120131515 | METHOD AND APPARATUS OF ERROR CORRECTION IN RESISTIVE TOUCH PANELS - A method and apparatus of generating display data based on at least one input display value received from a touch screen device is disclosed. According to one example method of operation the method may include receiving an input display value in a predefined enclosed area of an input domain, and calculating a parametric representation of the received input display value based on the boundaries of the predefined enclosed area in the input domain. The predefined enclosed area may be a triangle. The operations may also include mapping the parametric representation of the input display value to a corresponding output display value in an output domain. The operations may also include displaying the at least one output display value via the display device. | 05-24-2012 |
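The parametric representation over a triangular region described in the abstract above (20120131515) is naturally expressed with barycentric coordinates: the input point is written as a weighted sum of the triangle's vertices, and the same weights applied to a corresponding output triangle yield the corrected display value. The function names and the calibration-triangle framing below are illustrative assumptions, not the patent's disclosed implementation.

```python
def barycentric(p, a, b, c):
    """Parametric (barycentric) coordinates (u, v, w) of point p with
    respect to triangle a, b, c, satisfying u + v + w == 1 and
    p == u*a + v*b + w*c."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return u, v, 1.0 - u - v

def map_point(p, src_tri, dst_tri):
    """Map an input-domain point to the output domain by applying its
    barycentric weights in the source triangle to the vertices of the
    corresponding destination triangle."""
    u, v, w = barycentric(p, *src_tri)
    (ax, ay), (bx, by), (cx, cy) = dst_tri
    return (u * ax + v * bx + w * cx, u * ay + v * by + w * cy)
```

Because the weights are affine-invariant, a point measured inside a calibration triangle on the resistive panel lands at the corresponding position inside the matching display triangle, correcting local distortion triangle by triangle.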
20120131516 | METHOD, SYSTEM AND COMPUTER READABLE MEDIUM FOR DOCUMENT VISUALIZATION WITH INTERACTIVE FOLDING GESTURE TECHNIQUE ON A MULTI-TOUCH DISPLAY - A method, system and computer readable medium for folding a document page object are provided. A method for folding a document page object in a graphical user interface using multi-touch gestures includes establishing at least two contact points on a display; moving at least one of the two contact points to create a fold on the document page object; and displaying the folded document page object. | 05-24-2012 |
20120131517 | INFORMATION PROCESSING APPARATUS AND OPERATION METHOD THEREOF - There is provided an information processing apparatus having a user interface that is highly convenient for the user. A reference speed is set according to an amount of movement or a movement time period of a pointer such as a stylus or a finger. Based on the movement speed of the pointer and the reference speed, it is determined that a flick operation with the pointer has occurred. | 05-24-2012 |
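The flick determination above compares a release speed against a reference speed derived from the pointer's movement. A minimal sketch, with the scaling constant as an illustrative assumption rather than the patent's value:

```python
def is_flick(distance_px, duration_s, release_speed):
    """Decide whether a pointer release was a flick.

    The reference speed scales with the pointer's average speed over the
    whole movement, so a slow deliberate drag needs a faster release to
    count as a flick. The 0.5 factor is an assumed constant.
    """
    avg_speed = distance_px / duration_s
    reference = 0.5 * avg_speed
    return release_speed > reference
```

A 200 px drag over 0.2 s yields a reference of ~500 px/s, so a 600 px/s release flicks and a 400 px/s release does not.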
20120131518 | APPARATUS AND METHOD FOR SELECTING ITEM USING MOVEMENT OF OBJECT - An item selecting apparatus includes a movement detecting unit that detects a movement of a user, a screen that displays image information, a display image storage unit that stores data used to generate an image to be displayed on the screen, a display image generating unit that generates the image to be displayed on the screen, and a control unit that controls the screen so that a plurality of items is displayed in a one-, two- or three-dimensional arrangement. The control unit receives a signal from the movement detecting unit to measure a movement of the user in at least one of the x-, y- and z-axis directions and, in response to the measured movement and in accordance with the arrangement of the items on the screen, issues a command to select at least one of the items or provides visual feedback. | 05-24-2012 |
20120131519 | Surfacing Off-Screen Visible Objects - A computer-implemented user input process for a computing device includes receiving, on a touch pad surface over a graphical display, a user input motion dragging across the touch pad surface, identifying the dragging input motion as originating off an edge of the touch pad by identifying a sensed first location for the input motion at a peripheral edge of the touch pad surface, and displaying on the graphical display a sliding graphical element that is animated to move from the edge of the display into a body of the display, over a nonmoving element on the display, in response to identifying the dragging input motion. | 05-24-2012 |
20120131520 | Gesture-based Text Identification and Selection in Images - A device with a touch-sensitive screen supports tapping gestures for identifying, selecting or working with initially unrecognized text. A single tap gesture can cause a portion of a character string to be selected. A double tap gesture can cause the entire character string to be selected. A tap and hold gesture can cause the device to enter a cursor mode wherein a placement of a cursor relative to the characters in a character string can be adjusted. In a text selection mode, a finger can be used to move the cursor from a cursor start position to a cursor end position and to select text between the positions. Selected or identified text can populate fields, control the device, etc. Recognition of text can be performed upon access of an image or upon the device detecting a tapping gesture in association with display of the image on the screen. | 05-24-2012 |
20120137258 | MOBILE ELECTRONIC DEVICE, SCREEN CONTROL METHOD, AND STORAGE MEDIUM STORING SCREEN CONTROL PROGRAM - According to an aspect, a mobile electronic device includes a touch panel and a control unit. The touch panel displays a screen thereon and detects a gesture performed on a surface thereof. When a sweep gesture is detected by the touch panel, the control unit causes an object corresponding to a process, which is executable while the screen is displayed on the touch panel, to be displayed near a position where the sweep gesture is detected on the touch panel. | 05-31-2012 |
20120137259 | ASSOCIATED FILE - A method for accessing a file on a computing machine includes configuring a sensor to detect an object and a user interacting with the object, associating the object with at least one file on the computing machine, and configuring a display device to render an associated file being accessed in response to the user interacting with the object. | 05-31-2012 |
20120144345 | Methods and Systems for Radial Input Gestures - A computerized device can comprise a touch-enabled surface and a data processing hardware element configured by a gesture input engine to recognize an input gesture using data from the touch-enabled surface. A parameter value can be set based on determining a path traversed by the input gesture. The data processing element can comprise a processor and the gesture input engine can comprise program logic in a memory device and/or the engine may be implemented using hardware logic. Regardless, the radial input gesture can be used to set one or more parameter values without use of a direct mapping of interface coordinates to parameter values. A method can comprise tracking a plurality of input points, identifying a path defined by the plurality of input points, identifying a closed curve including the path, determining a percentage of the curve traversed by the path, and setting a parameter value based on the percentage. | 06-07-2012 |
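The percentage-of-curve determination in the radial-gesture entry above can be sketched by accumulating the angle a drag path sweeps around its own centroid; the volume mapping and all constants here are illustrative assumptions, not the patent's implementation:

```python
import math

def radial_fraction(points):
    """Fraction of a full circle swept by a drag path around its centroid."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # unwrap jumps across the -pi/pi seam
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    return abs(total) / (2 * math.pi)

def set_volume(points, max_value=100):
    """Map the swept fraction to a parameter value (e.g. volume 0-100),
    with no direct mapping of screen coordinates to parameter values."""
    return round(min(radial_fraction(points), 1.0) * max_value)
```

Tracing a full loop anywhere on the surface sets the parameter to its maximum; half a loop sets roughly half, regardless of where the loop is drawn.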
20120144346 | MOTION-BASED USER INTERFACE FEATURE SUBSETS - The present disclosure relates to motion adaptive user equipment (UE) in a wireless communications network environment adapted for selecting one or more subsets of accessible user interface (UI) functions based, at least in part, on the determined motion of a UE. By selectively employing the one or more subsets of UI functions, a UE can dynamically adapt to the motion of the UE and limit distracting interactions with the UE creating a safer or more compliant wireless environment. Further disclosed are features related to override of UI limitations, auxiliary location sensing aspects, motion rule updating features, and voluntary and involuntary user preference aspects. | 06-07-2012 |
20120144347 | DISPLAY DEVICE AND CONTROL METHOD THEREOF - A display device and a control method thereof are provided. The display device includes: a touch screen which displays a screen and senses a gesture of a user on the screen; a video processor which processes an image for displaying the screen; a communication unit which performs communication with at least one neighboring device; and a controller which performs control to display a miniature image of a screen of contents being displayed and a first user interface (UI) item showing a connected neighboring device of the at least one neighboring device if a first gesture of a user is made while displaying the screen corresponding to predetermined contents, and to transmit information for sharing the contents to the corresponding neighboring device in accordance with a second gesture of a user with regard to the miniature image and the first UI item. | 06-07-2012 |
20120144348 | Managing Virtual Ports - Techniques for managing virtual ports are disclosed herein. Each such virtual port may have different associated features such as, for example, privileges, rights or options. When one or more users are in a capture scene of a gesture based system, the system may associate virtual ports with the users and maintain the virtual ports. Also provided are techniques for disassociating virtual ports with users or swapping virtual ports between two or more users. | 06-07-2012 |
20120151420 | Devices, Systems, and Methods for Conveying Gesture Commands - Devices, systems, and methods are disclosed which relate to conveying gestures associated with commands by displaying images that a user associates with a gesture. Upon performance of the gesture, the commands are carried out by a device, system, etc. For example, a mobile device displays a gesture icon of a hammer. The gesture icon is labeled with a command. When a user makes a downward motion with the forearm, the mobile device senses that gesture through a gesture sensor. The mobile device interprets the gesture and executes the command in the label of the gesture icon. | 06-14-2012 |
20120151421 | ENHANCED DETECTION OF CIRCULAR ENGAGEMENT GESTURE - The enhanced detection of a circular engagement gesture, in which a shape is defined within motion data, and the motion data is sampled at points that are aligned with the defined shape. It is determined whether a moving object is performing a gesture correlating to the defined shape based on a pattern exhibited by the sampled motion data. An application is controlled if determining that the moving object is performing the gesture. | 06-14-2012 |
20120159401 | Workspace Manipulation Using Mobile Device Gestures - Workspaces are manipulated on a mobile device having a display screen. A set of two or more discrete workspaces is established. A default discrete workspace is then displayed on the screen, where the default discrete workspace is one of the discrete workspaces in the set. Whenever a user gestures with the mobile device, the gesture is used to select one of the discrete workspaces from the set, and the selected discrete workspace will be displayed on the screen. | 06-21-2012 |
20120159402 | METHOD AND APPARATUS FOR PROVIDING DIFFERENT USER INTERFACE EFFECTS FOR DIFFERENT IMPLEMENTATION CHARACTERISTICS OF A TOUCH EVENT - A method for providing a mechanism by which different user interface effects may be performed for different classifications of gestures may include receiving an indication of a touch event at a touch screen display, determining a gesture classification and an implementation characteristic classification for the touch event, and enabling provision of substantively different user interface effects for respective different implementation characteristic classifications of the gesture classification. A corresponding apparatus and computer program product are also provided. | 06-21-2012 |
20120159403 | SYSTEM AND METHOD FOR GAUGING AND SCORING AUDIENCE INTEREST OF PRESENTATION CONTENT - A system and method for managing content displayed in media presentations. A client computing device includes a client media presentation application that is responsive to feedback input received during the display of one or more slides of a media presentation to assign a score to each of the one or more slides. The media presentation application stores the assigned scores in a client database as media presentation data. A server computing device includes a server database that stores media presentation data. The server computing device also includes a server media presentation application that communicates with the client media presentation application during a synchronization process to synchronize media presentation data between the client database and the server database. | 06-21-2012 |
20120159404 | DETECTING VISUAL GESTURAL PATTERNS - A processing device and method are provided for capturing images, via an image-capturing component of a processing device, and determining a motion of the processing device. An adaptive search center technique may be employed to determine a search center with respect to multiple equal-sized regions of an image frame, based on previously estimated motion vectors. One of several fast block matching methods may be used, based on one or more conditions, to match a block of pixels of one image frame with a second block of pixels of a second image. Upon matching blocks of pixels, motion vectors of the multiple equal-sized regions may be estimated. The motion may be determined, based on the estimated motion vectors, and an associated action may be performed. Various embodiments may implement techniques to distinguish motion blur from de-focus blur and to determine a change in lighting condition. | 06-21-2012 |
20120167017 | SYSTEMS AND METHODS FOR ADAPTIVE GESTURE RECOGNITION - Systems and methods are described for adaptively recognizing gestures indicated by user inputs received from a touchpad, touchscreen, directional pad, mouse or other multi-directional input device. If a user's movement does not indicate a gesture using current gesture recognition parameters, additional processing can be performed to recognize the gesture using other factors. The gesture recognition parameters can then be adapted so that subsequent user inputs that are similar to the previously-rejected inputs will appropriately trigger gesture commands as desired by the user. Gestural data or parameters may be locally or remotely stored for further processing. | 06-28-2012 |
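The adaptive-recognition entry above describes a two-pass scheme: if an input just misses the current recognition parameters, secondary processing accepts it and the parameters are relaxed so similar future inputs pass directly. A minimal sketch; the threshold semantics and the 0.8 relaxation factor are assumptions:

```python
def recognize(distance, threshold, relaxed_factor=0.8):
    """Return (recognized, updated_threshold) for a swipe of given length.

    First pass: the current threshold. Second pass: a relaxed threshold;
    if it succeeds, the threshold adapts downward so that subsequent
    inputs resembling the previously-rejected one trigger the gesture.
    """
    if distance >= threshold:
        return True, threshold                     # normal recognition
    if distance >= relaxed_factor * threshold:     # secondary processing
        return True, distance                      # adapt for next time
    return False, threshold                        # genuinely not a gesture
```

A 120 px swipe against a 100 px threshold is recognized outright; an 85 px swipe is recognized on the second pass and lowers the threshold to 85; a 50 px movement is rejected and changes nothing.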
20120174041 | GESTURE-BASED SELECTION - Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for gesture-based selection. In one aspect, a method includes displaying, on a user interface, information referencing a first set of data items, and at least one control that references multiple values, identifying a first gesture input by a user using the control, selecting a first value based on the first gesture, identifying a second gesture input by the user using the control, selecting a second value based on the second gesture, selecting a second set of data items to display based on the first and second values, and displaying information referencing the second set of data items. | 07-05-2012 |
20120174042 | METHOD FOR UNLOCKING SCREEN AND EXECUTING APPLICATION PROGRAM - A method for unlocking screen and executing application program is provided. The method is adapted to a mobile device having a touch screen. Under a screen lock mode of the mobile device, the touch screen is used to detect a touch and drag operation of a user. Then, it is determined whether a start point of the touch and drag operation is located within a predetermined region and a dragging distance of the touch and drag operation along a predetermined path is over a predetermined distance. If yes, it is further determined whether an end point of the touch and drag operation is located within one of a plurality of segmented regions of the touch screen, which respectively correspond to a plurality of application programs. If yes, the screen is unlocked and the application program corresponding to segmented region where the end point is located is executed simultaneously. | 07-05-2012 |
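The unlock-and-launch method above chains three checks: start point inside a predetermined region, drag distance over a predetermined threshold, and end point inside one of several segmented regions, each mapped to an application. A minimal sketch on a hypothetical 480x800 screen; all regions, thresholds, and app names are illustrative:

```python
def unlock_action(start, end, drag_dist, apps):
    """Return the app to launch on unlock, or None if the screen stays locked.

    start/end: (x, y) of the touch-and-drag endpoints; drag_dist: pixels
    moved along the predetermined path; apps: one app per segmented region.
    """
    START_REGION = (0, 400, 480, 800)   # x0, y0, x1, y1: lower half of screen
    MIN_DRAG = 300                      # assumed predetermined distance, px
    x0, y0, x1, y1 = START_REGION
    if not (x0 <= start[0] <= x1 and y0 <= start[1] <= y1):
        return None                     # drag did not begin in the region
    if drag_dist < MIN_DRAG:
        return None                     # drag too short to unlock
    if end[1] > 100:
        return None                     # end point not in the top band
    seg_w = 480 // len(apps)            # equal vertical segments, one per app
    return apps[min(end[0] // seg_w, len(apps) - 1)]
```

Unlocking and launching happen in one motion: the returned app is started directly instead of showing a home screen first.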
20120174043 | GESTURE-BASED SELECTION - Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for gesture-based selection. In one aspect, a method includes displaying, on a user interface, information referencing a first set of data items and at least one control that references multiple values. A first gesture input by a user using the control is identified, and a first value is selected based on the first gesture. A second gesture input by the user using the control is identified, and a second value is selected based on the second gesture. A second set of data items is selected based on the first and second values, and information referencing the second set of data items is displayed. | 07-05-2012 |
20120174044 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND COMPUTER PROGRAM - Provided is an information processing apparatus including a display unit, provided on an apparatus front-surface side, for displaying information, a first detection unit, provided on an apparatus back-surface side, for detecting an operation input to a back surface, a second detection unit, provided on the apparatus front-surface side, for detecting an operation input to the display unit, and an operation input information determination unit for causing a function corresponding to the operation inputs to be executed, based on detection results of the first detection unit and the second detection unit. When the operation input is detected by the first detection unit and the operation input for operating an object displayed on the display unit is detected by the second detection unit, the operation input information determination unit executes the function corresponding to the operation inputs detected by the first detection unit and the second detection unit. | 07-05-2012 |
20120180001 | ELECTRONIC DEVICE AND METHOD OF CONTROLLING SAME - A method includes detecting a gesture associated with an edge of a display, determining an element associated with the edge, and opening the element. | 07-12-2012 |
20120180002 | NATURAL INPUT FOR SPREADSHEET ACTIONS - Different gestures and actions are used to interact with spreadsheets. The gestures are used in manipulating the spreadsheet and performing other actions in the spreadsheet. For example, gestures may be used to move within the spreadsheet, select data, filter, sort, drill down/up, zoom, split rows/columns, perform undo/redo actions, and the like. Sensors that are associated with a device may also be used in interacting with spreadsheets. For example, an accelerometer may be used for moving and performing operations within the spreadsheet. | 07-12-2012 |
20120180003 | IMAGE FORMING APPARATUS AND TERMINAL DEVICE EACH HAVING TOUCH PANEL - MFP as an image forming apparatus includes a touch panel, a controller connected to the touch panel, a memory for storing a data file, and a communication device for communicating with another device. After two contacts are made on the touch panel, when a gesture of moving the two contacts in a direction that decreases the spacing between them and then releasing them after the movement is detected, the controller identifies a file represented by an icon displayed in an area defined by the two contacts, at least either before or after the movement, as a file to be transferred, and transfers that file to the other device in response to a transfer request from the other device received by the communication device. | 07-12-2012 |
20120185806 | INTERACTION DATA DISPLAY APPARATUS, PROCESSING APPARATUS AND METHOD FOR DISPLAYING THE INTERACTION DATA - Disclosed herein is a system for continuously sensing mass data from members of an organization with respect to their meetings and activities, and for analyzing and evaluating their interactions according to the sensor data. Interaction data includes first information denoting whether or not a terminal unit has faced a different terminal unit, and second information denoting a state of the terminal unit, excluding position information of the terminal unit and the first information. An interaction data display apparatus includes a receiving unit for receiving interaction data from the terminal unit and a display unit for displaying the received interaction data. The display unit displays the first and second information items included in the received interaction data so as to be related to each other according to the times at which those items were respectively obtained. | 07-19-2012 |
20120192116 | Pinch Zoom Velocity Detent - An information handling system includes a gesture sensitive interface and a processor. The processor is configured to receive inputs from the gesture sensitive interface corresponding to first and second interaction points, determine a relative motion between the first and second interaction points, and obtain a velocity of the relative motion. The processor is further configured to determine if the velocity exceeds a threshold, and scale an image on a display from an initial magnification to a predetermined magnification when the velocity exceeds a threshold. | 07-26-2012 |
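The velocity-detent behavior above can be sketched as follows: zoom proportionally while the pinch is slow, but snap to a preset magnification when the relative velocity of the two interaction points crosses a threshold. The preset magnification and threshold values are illustrative assumptions:

```python
def pinch_zoom(d0, d1, dt, current_mag, detent_mag=2.0, v_threshold=500.0):
    """Return the new magnification after a pinch step.

    d0/d1: distance between the two touch points before/after the step,
    in px; dt: elapsed seconds. A fast pinch (relative velocity above
    v_threshold px/s) snaps to the detent magnification; otherwise the
    image scales in proportion to the finger spacing.
    """
    velocity = abs(d1 - d0) / dt
    if velocity > v_threshold:
        return detent_mag                 # detent: jump to the preset zoom
    return current_mag * (d1 / d0)        # ordinary proportional zoom
```

So a quick flick-apart jumps straight to 2x, while a slow spread from 100 px to 110 px nudges the magnification up by 10%.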
20120192117 | Device, Method, and Graphical User Interface with a Dynamic Gesture Disambiguation Threshold - An electronic device with a display, a touch-sensitive surface, one or more processors, and memory detects a first portion of a gesture, and determines that the first portion has a first gesture characteristic. The device selects a dynamic disambiguation threshold in accordance with the first gesture characteristic. The dynamic disambiguation threshold is used to determine whether to perform a first type of operation or a second type of operation when a first kind of gesture is detected. The device determines that the gesture is of the first kind of gesture. After selecting the dynamic disambiguation threshold, the device determines whether the gesture meets the dynamic disambiguation threshold. When the gesture meets the dynamic disambiguation threshold, the device performs the first type of operation, and when the gesture does not meet the dynamic disambiguation threshold, the device performs the second type of operation. | 07-26-2012 |
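The dynamic disambiguation threshold above is selected from a characteristic of the gesture's first portion, then used to choose between two operations for the same kind of gesture. A minimal sketch; the speed cutoff, distance thresholds, and operation names are all assumptions for illustration:

```python
def handle_swipe(start_speed, travel_px):
    """Choose an operation for a swipe using a dynamically selected threshold.

    The threshold depends on how the gesture begins (here, its initial
    speed): a fast start lowers the distance needed to trigger the first
    operation type; a slow start requires a longer swipe.
    """
    threshold = 50 if start_speed > 1000 else 150   # dynamic disambiguation
    return "delete" if travel_px >= threshold else "scroll"
```

The same 60 px swipe therefore deletes when begun briskly but merely scrolls when begun slowly.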
20120192118 | Device, Method, and Graphical User Interface for Navigating through an Electronic Document - An electronic device with a display and a touch-sensitive surface stores a document having primary content, supplementary content, and user-generated content. The device displays a representation of the document in a segmented user interface on the display. Primary content of the document is displayed in a first segment of the segmented user interface and supplementary content of the document is concurrently displayed in a second segment of the segmented user interface distinct from the first segment. The device receives a request to view user-generated content of the document. In response to the request, the device maintains display of the previously displayed primary content, ceases to display at least a portion of the previously displayed supplementary content, and displays user-generated content of the document in a third segment of the segmented user interface distinct from the first segment and the second segment. | 07-26-2012 |
20120192119 | USB HID DEVICE ABSTRACTION FOR HDTP USER INTERFACES - A method for implementing USB communications providing user interface measurement and detection of at least one gesture and one angle of finger position for a touch-based user interface is disclosed. The method comprises receiving real-time tactile-image information from a tactile sensor array; processing the tactile-image information to detect and measure the variation of one angle of a finger position and to detect at least one gesture producing at least one of a parameter value responsive to the variation in the finger angle and a symbol responsive to a detected gesture. These are mapped to a Universal Serial Bus (USB) Human Interface Device message which is transmitted to a host device over USB hardware for use by an application executing on the host device. The method provides for the incorporation of various configurations, tactical grammars, use with a touch screen, and numerous other features. | 07-26-2012 |
20120192120 | IMAGE FORMING APPARATUS AND TERMINAL DEVICE EACH HAVING TOUCH PANEL - An image forming apparatus includes a touch panel, a controller connected to the touch panel, and a memory. When a pinch-in gesture on the touch panel is detected during execution of an application, the controller stores, in the memory, information showing a state of processing of the application when the pinch-in gesture is detected, and when a pinch-out gesture on the touch panel is detected, the controller reads the stored information showing the state of processing of the application from the memory, and resumes processing of the application from the state shown by the information. | 07-26-2012 |
20120192121 | BREATH-SENSITIVE DIGITAL INTERFACE - A breath-sensitive digital interface that enables use of a person's breath or another fluid for purposes of navigating digital media, and a method for using such an interface. | 07-26-2012 |
20120198392 | ASSOCIATING DEVICES IN A MEDICAL ENVIRONMENT - A medical device includes a gesture detector for detecting a gesture of a second device with respect to the medical device. The gesture is detected within a small time window. The medical device also includes an association gesture determiner for determining that the gesture is an association gesture for initiating a request to associate the medical device with the second device, and a device associator for associating the medical device with the second device based on the association gesture. | 08-02-2012 |
20120204133 | Gesture-Based User Interface - A user interface method, including capturing, by a computer, a sequence of images over time of at least a part of a body of a human subject, and processing the images in order to detect a gesture, selected from a group of gestures consisting of a grab gesture, a push gesture, a pull gesture, and a circular hand motion. A software application is controlled responsively to the detected gesture. | 08-09-2012 |
20120216151 | Using Gestures to Schedule and Manage Meetings - Techniques and configurations for an apparatus are provided for creating and managing meetings using gestures. Movements of a user's hand in a three-dimensional space are detected. The hand movements in the three-dimensional space are interpreted to identify a gesture intended by the user to set up or manage a meeting among a plurality of persons. An electronic command is generated from the detected gesture to set up or manage the meeting. | 08-23-2012 |
20120216152 | TOUCH GESTURES FOR REMOTE CONTROL OPERATIONS - In general, this disclosure describes techniques for providing a user of a first computing device (e.g., a mobile device) with the ability to utilize the first computing device to control a second computing device (e.g., a television). Specifically, the techniques of this disclosure may, in some examples, allow the user to use drawing gestures on a mobile computing device to remotely control and operate the second computing device. Using a presence-sensitive user interface device (e.g., a touch screen), the user may use drawing gestures to indicate characters associated with operations and commands to control the second computing device. | 08-23-2012 |
20120216153 | HANDHELD DEVICES, ELECTRONIC DEVICES, AND DATA TRANSMISSION METHODS AND COMPUTER PROGRAM PRODUCTS THEREOF - Data transmission methods for handheld devices are provided. The data transmission method includes the steps of: receiving a gesture input; determining whether the gesture input matches a predetermined gesture; and if so, obtaining directional information corresponding to the gesture input and transmitting a file and the directional information to at least one electronic device such that display of a user interface of the at least one electronic device generates a display effect corresponding to the gesture according to the directional information. | 08-23-2012 |
20120216154 | TOUCH GESTURES FOR REMOTE CONTROL OPERATIONS - In general, this disclosure describes techniques for providing a user of a first computing device (e.g., a mobile device) with the ability to utilize the first computing device to control a second computing device (e.g., a television). Specifically, the techniques of this disclosure may, in some examples, allow the user to use drawing gestures on a mobile computing device to remotely control and operate the second computing device. Using a presence-sensitive user interface device (e.g., a touch screen), the user may use drawing gestures to indicate characters associated with operations and commands to control the second computing device. | 08-23-2012 |
20120254808 | HOVER-OVER GESTURING ON MOBILE DEVICES - Aspects of the disclosure may relate to detecting, by a computing device, a first user input comprising a first gesture to interact with a touch-sensitive screen of the computing device. Aspects may also include detecting a second user input comprising a second gesture using the touch-sensitive screen of the computing device. Aspects may also include, responsive to detecting the first user input, initiating a hover mode of interaction in a graphical user interface. | 10-04-2012 |
20120254809 | METHOD AND APPARATUS FOR MOTION GESTURE RECOGNITION - Various methods for motion gesture recognition are provided. One example method may include receiving motion gesture test data that was captured in response to a user's performance of a motion gesture. The motion gesture test data may include acceleration values in each of three dimensions of space that have directional components that are defined relative to an orientation of a device. The example method may further include transforming the acceleration values to derive transformed values that are independent of the orientation of the device, and performing a comparison between the transformed values and a gesture template to recognize the motion gesture performed by the user. Similar and related example methods, example apparatuses, and example computer program products are also provided. | 10-04-2012 |
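The orientation-independent transform in the motion-gesture entry above can be sketched with the simplest such invariant, per-sample acceleration magnitude, followed by a template comparison. This is one possible transform under stated assumptions; the patent's actual transform and matching tolerance may differ:

```python
import math

def to_invariant(samples):
    """Replace each (x, y, z) acceleration with its magnitude, which is
    unchanged when the device is held in a different orientation."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]

def matches(samples, template, tol=0.5):
    """Compare transformed test data against a gesture template,
    sample by sample, within an assumed tolerance (m/s^2)."""
    a, b = to_invariant(samples), to_invariant(template)
    if len(a) != len(b):
        return False
    return all(abs(u - v) <= tol for u, v in zip(a, b))
```

A shake recorded along the x-axis then matches the same shake performed with the phone rotated 90 degrees, since both reduce to identical magnitude sequences.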
20120254810 | Combined Activation for Natural User Interface Systems - A user interaction activation may be provided. A plurality of signals received from a user may be evaluated to determine whether the plurality of signals are associated with a visual display. If so, the plurality of signals may be translated into an agent action and a context associated with the visual display may be retrieved. The agent action may be performed according to the retrieved context and a result associated with the performed agent action may be displayed to the user. | 10-04-2012 |
20120260220 | PORTABLE ELECTRONIC DEVICE HAVING GESTURE RECOGNITION AND A METHOD FOR CONTROLLING THE SAME - The present disclosure provides a portable electronic device having gesture recognition and a method for controlling the same. In accordance with one example embodiment, the method comprises: sensing distortion of the portable electronic device from a neutral state; determining an action associated with a sensed distortion; and causing the determined action to be performed. | 10-11-2012 |
20120266109 | MULTI-DIMENSIONAL BOUNDARY EFFECTS - Multi-dimensional boundary effects provide visual feedback to indicate that boundaries in user interface elements (e.g., web pages, documents, images, or other elements that can be navigated in more than one dimension) have been reached or exceeded (e.g., during horizontal scrolling, vertical scrolling, diagonal scrolling, or other types of movement). A compression effect can be displayed to indicate that movement has caused one or more boundaries (e.g., a horizontal boundary and/or a vertical boundary) of a UI element to be exceeded. Exemplary compression effects include compressing content along a vertical axis when a vertical boundary has been exceeded and compressing content along a horizontal axis when a horizontal boundary has been exceeded. | 10-18-2012 |
20120272193 | I/O DEVICE FOR A VEHICLE AND METHOD FOR INTERACTING WITH AN I/O DEVICE - An I/O device for a vehicle includes at least one touch-sensitive I/O display unit DT, at least one output display unit DI and a control unit CU connecting the I/O display unit DT and the output display unit DI with an information exchange unit IEU. The touch-sensitive I/O display unit DT is located in a readily reachable position for a driver and the output display unit DI is located in a readily discernible position for a driver, and the control unit CU communicates output data DD-I/O related to an interactive I/O communication to the I/O display unit DT, receives touchscreen input data TI from the I/O display unit DT and communicates output data DD-O to the output display unit DI in relation with the input data TI. An I/O method using the above mentioned I/O device for a vehicle is also provided. | 10-25-2012 |
20120272194 | METHODS AND APPARATUSES FOR FACILITATING GESTURE RECOGNITION - Methods and apparatuses are provided for facilitating gesture recognition. A method may include constructing a matrix based at least in part on an input gesture and a template gesture. The method may further include determining whether a relationship determined based at least in part on the constructed matrix satisfies a predefined threshold. In an instance in which the relationship does not satisfy the predefined threshold, the method may also include eliminating the template gesture from further consideration for recognition of the input gesture. In an instance in which the relationship satisfies the predefined threshold, the method may further include determining a rotation matrix based at least in part on the constructed matrix. Corresponding apparatuses are also provided. | 10-25-2012 |
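The matrix construction, threshold test, and rotation-matrix derivation above resemble a Procrustes-style alignment. A 2-D sketch of that idea under stated assumptions (the cross-covariance matrix, the 0.8 correlation threshold, and the reduction to a single rotation angle are illustrative, not the patent's exact math):

```python
import math

def best_rotation_2d(input_pts, template_pts):
    """Build a cross-covariance matrix between centered input and template
    gestures; eliminate the template if the correlation is weak, otherwise
    return the rotation angle (radians) relating input to template."""
    def center(pts):
        mx = sum(p[0] for p in pts) / len(pts)
        my = sum(p[1] for p in pts) / len(pts)
        return [(x - mx, y - my) for x, y in pts]
    a, b = center(input_pts), center(template_pts)
    # entries of the 2x2 cross-covariance matrix
    sxx = sum(p[0] * q[0] for p, q in zip(a, b))
    sxy = sum(p[0] * q[1] for p, q in zip(a, b))
    syx = sum(p[1] * q[0] for p, q in zip(a, b))
    syy = sum(p[1] * q[1] for p, q in zip(a, b))
    strength = math.hypot(sxx + syy, sxy - syx)
    norm_a = sum(x * x + y * y for x, y in a)
    norm_b = sum(x * x + y * y for x, y in b)
    if strength < 0.8 * math.sqrt(norm_a * norm_b):   # assumed threshold
        return None                                   # eliminate this template
    return math.atan2(sxy - syx, sxx + syy)           # best-fit rotation angle
```

An L-shaped stroke drawn sideways is thus matched to its upright template with a quarter-turn rotation, while a circular stroke is eliminated against a straight-line template before any rotation is computed.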
20120284673 | METHOD AND APPARATUS FOR PROVIDING QUICK ACCESS TO DEVICE FUNCTIONALITY - A method for providing quick access to device functionality responsive to a touch gesture may include receiving an indication of a swipe gesture being performed from a first predefined portion of a touch screen display to a second predefined portion of the display, classifying the swipe gesture as a trigger gesture based on insertion of a motion delay of at least a threshold period of time in connection with the swipe gesture, and causing, in response to classifying the trigger gesture, a display of a predefined set of functional elements that cause execution of a corresponding function when a respective one of the predefined set of functional elements is selected. A corresponding apparatus and computer program product are also provided. | 11-08-2012 |
20120284674 | TOUCH CONTROL METHOD AND APPARATUS - Embodiments of the present disclosure disclose a touch control method and an apparatus. The method includes: entering, when it is detected that a user triggers a function control, a function state corresponding to the function control; detecting a touch control operation performed by the user on an operation object on a touch control panel; under the function state corresponding to the function control, performing corresponding processing on the operation object according to the touch control operation of the user. | 11-08-2012 |
20120297347 | GESTURE-BASED NAVIGATION CONTROL - A user interface may be provided by: displaying a graphical user interface including at least one graphical user interface element; receiving at least one gesture-based user input; displaying a graphical user interface including the at least one graphical user interface element and one or more graphical user interface elements that are hierarchically dependent from the at least one graphical user interface element in response to the at least one gesture-based user input. | 11-22-2012 |
20120297348 | CONTROL OF A DEVICE USING GESTURES - A method includes, in an operating system running on a processing device: detecting a gesture input via a user interface; identifying an operating system operation that corresponds to the gesture; performing the operating system operation; identifying an application running on the operating system that has subscribed to gesture input; and passing data corresponding to the gesture to the application for use by the application. | 11-22-2012 |
20120304131 | EDGE GESTURE - This document describes techniques and apparatuses enabling an edge gesture. In some embodiments, these techniques and apparatuses enable selection of a user interface not currently exposed on a display through an edge gesture that is easy to use and remember. | 11-29-2012 |
20120304132 | SWITCHING BACK TO A PREVIOUSLY-INTERACTED-WITH APPLICATION - This document describes techniques and apparatuses for switching back to a previously-interacted-with application. In some embodiments, these techniques and apparatuses enable selection of a user interface not currently exposed on a display through a simple gesture that is both easy to use and easy to remember. | 11-29-2012 |
20120304133 | EDGE GESTURE - This document describes techniques and apparatuses enabling an edge gesture. In some embodiments, these techniques and apparatuses enable selection of a user interface not currently exposed on a display through an edge gesture that is easy to use and remember. | 11-29-2012 |
20120311507 | Devices, Methods, and Graphical User Interfaces for Navigating and Editing Text - An electronic device displays text of an electronic document on a display; displays an insertion marker at a first position in the text of the electronic document; detects a first horizontal gesture on a touch-sensitive surface; in response to a determination that the first horizontal gesture satisfies a first set of one or more predefined conditions: translates the electronic document on the display in accordance with a direction of the first horizontal gesture, and maintains the insertion marker at the first position in the text; and, in response to a determination that the first horizontal gesture satisfies a second set of one or more predefined conditions, moves the insertion marker by one character in the text from the first position to a second position in the text in accordance with the direction of the first horizontal gesture. | 12-06-2012 |
20120311508 | Devices, Methods, and Graphical User Interfaces for Providing Accessibility Using a Touch-Sensitive Surface - An electronic device presents a first user interface element of a first type and a second user interface element of a second type. In a sighted mode, the device detects a first interaction with the first user interface element, and performs an operation in accordance with sighted-mode gesture responses for the first user interface element. The device detects a second interaction with the second user interface element, and performs an operation in accordance with sighted-mode gesture responses for the second user interface element. In an accessible mode, the device detects a third interaction with the first user interface element, and performs an operation in accordance with accessible-mode gesture responses for the first user interface element. The device detects a series of interactions with the second user interface element; and, for each interaction, performs an operation in accordance with the sighted-mode gesture responses for the second user interface element. | 12-06-2012 |
20120311509 | READER WITH ENHANCED USER FUNCTIONALITY - A bookmarking system including a reader interface configured to cause content to be displayed on an electronic device having a touch-sensitive screen. The system includes a bookmark module configured to add a bookmark when a user swipes downwardly on the screen, wherein the bookmark is a record relating to the content displayed on the screen during the downward swipe. | 12-06-2012 |
20120317520 | APPARATUS AND METHOD FOR PROVIDING A DYNAMIC USER INTERFACE IN CONSIDERATION OF PHYSICAL CHARACTERISTICS OF A USER - Provided are an apparatus and method for providing a user interface and a terminal employing the same. The apparatus is capable of dynamically changing a graphical object according to the physical characteristics of a user or in an effort to prevent muscle stress to the user. | 12-13-2012 |
20120317521 | General User Interface Gesture Lexicon and Grammar Frameworks for Multi-Touch, High Dimensional Touch Pad (HDTP), Free-Space Camera, and Other User Interfaces - A method for a multi-touch gesture-based user interface wherein a plurality of gestemes are defined as functions of abstract space and time, the gestemes being primitive gesture segments that can be concatenated over time and space to construct gestures. Various distinct subsets of the gestemes can be concatenated in space and time to construct distinct gestures. Real-time multi-touch gesture-based information provided by the user interface is processed into at least a recognized sequence of specific gestemes, and it is determined from the sequence of gestemes that the user's execution of a gesture has been completed. The specific gesture rendered by the user is recognized according to the sequence of gestemes. Many additional features are then provided from this foundation, including gesture grammars, a structured-meaning gesture lexicon, context, and the use of gesture prosody. | 12-13-2012 |
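The gesteme-to-gesture lexicon described in 20120317521 can be sketched as a lookup keyed on concatenated gesteme sequences. The gesteme names and lexicon entries below are invented for illustration; the patent defines gestemes abstractly over space and time:

```python
# Hypothetical lexicon mapping concatenated gesteme sequences
# (primitive gesture segments) to completed gestures.
GESTURE_LEXICON = {
    ("down", "right"): "L-shape",
    ("right", "down", "left", "up"): "rectangle",
    ("down", "up"): "check-bounce",
}

def recognize(gesteme_sequence):
    """Return the gesture whose gesteme sequence has been completed,
    or None if the sequence so far matches no lexicon entry."""
    return GESTURE_LEXICON.get(tuple(gesteme_sequence))
```

A real framework would also track partial prefixes in time to decide when execution of a gesture is complete; exact-tuple lookup is a simplifying assumption here.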
20120317522 | COMPUTING DEVICE AND METHOD FOR SELECTING DISPLAY REGIONS RESPONSIVE TO NON-DISCRETE DIRECTIONAL INPUT ACTIONS AND INTELLIGENT CONTENT ANALYSIS - A computing device includes a display surface, a human interface feature, and processing resources. The human interface feature enables a user of the computing device to enter a non-discrete directional input action. The processing resources execute to: (i) provide content on the display surface; (ii) detect the user performing the input action; (iii) determine a vector from the input action; and (iv) select a region of the display surface based on the vector. | 12-13-2012 |
20120324403 | METHOD OF INFERRING NAVIGATIONAL INTENT IN GESTURAL INPUT SYSTEMS - In a processing system having a touch screen display, a method of inferring navigational intent by a user in a gestural input system of the processing system is disclosed. A graphical user interface may receive current gestural input data for an application of the processing system from the touch screen display. The graphical user interface may generate an output action based at least in part on an analysis of one or more of the current gestural input data, past gestural input data for the application, and current and past context information of usage of the processing system. The graphical user interface may cause performance of the output action. | 12-20-2012 |
20120331424 | ELECTRONIC DEVICE AND METHOD OF DISPLAYING INFORMATION IN RESPONSE TO INPUT - A method includes displaying, in a window or field, first information associated with a first source running on a portable electronic device and detecting an input to display second information associated with a second source. After the detecting, second information associated with the second source and the first information in the window or field are displayed. | 12-27-2012 |
20130007672 | Methods and Systems for Correlating Head Movement with Items Displayed on a User Interface - The present description discloses systems and methods for moving and selecting items in a row on a user interface in correlation with a user's head movements. One embodiment may include measuring an orientation of a user's head and communicating the measurement to a device. Next, the device can be configured to execute instructions to correlate the measurement with a shift of a row of items displayed in a user interface, and execute instructions to cause the items to move in accordance with the correlation. The device may also receive a measurement of an acceleration of the user's head movement, and can be configured to execute instructions to cause the items to move at an acceleration comparable to the measured acceleration. | 01-03-2013 |
20130007673 | REPOSITION PHYSICAL MEDIA - A device including a touch component to detect a directional hand gesture, a media compartment to receive physical media, and a mechanism to reposition the physical media from the media compartment if the directional hand gesture is detected and if the media compartment includes the physical media. | 01-03-2013 |
20130024820 | MOVING A GRAPHICAL SELECTOR - In general, this disclosure describes techniques for moving a graphical selector. In one example, a method includes activating, by a computing device, a graphical key that is displayed with a presence-sensitive interface of the computing device. Upon activation of the graphical key, the method also includes receiving gesture input corresponding to a directional gesture using the presence-sensitive interface of the computing device and moving a graphical selector displayed with the presence-sensitive interface from a first graphical location to a second graphical location by at least one selected increment based on a property of the gesture input. | 01-24-2013 |
20130024821 | METHOD AND APPARATUS FOR MOVING ITEMS USING TOUCHSCREEN - A method for moving or copying displayed items in response to at least one touch event on a touch screen is provided. The method comprises: detecting a user selection of one or more items; displaying an object corresponding to the one or more selected items at a predetermined location; moving the object to a point selected by the user; and moving the selected one or more items in association with the object. The object can be a convergence of a first object depicting a number representing a numerical count of the selected items, and a second object visually representing the items. | 01-24-2013 |
20130031514 | Gestures for Presentation of Different Views of a System Diagram - Presenting different views of a system based on input from a user. A first view of a first portion of the system may be displayed. For example, the first portion may be a device of the system. User input specifying a first gesture may be received. In response to the first gesture, a second view of the first portion of the system may be displayed. For example, the first view may represent a first level of abstraction of the portion of the system and the second view may represent a second level of abstraction of the portion of the system. A second gesture may be used to view a view of a different portion of the system. Additionally, when changing from a first view to a second view, the first view may “morph” into the second view. | 01-31-2013 |
20130031515 | Method And Apparatus For Area-Efficient Graphical User Interface - A GUI screen image is a standard screen image, and displays a first combined GUI area, which is a combination of a GUI of the directional keys and a GUI of a joystick, and a second combined GUI area, which is a combination of a GUI of the four-type operation buttons and a GUI of a joystick, at the lower left and at the lower right on the screen image, respectively. Depending on which area in the first combined GUI area or in the second combined GUI area the user newly touches, the combined GUI to be used is determined and the screen image is switched; if the finger or thumb detaches, the screen image switches back. | 01-31-2013 |
20130031516 | IMAGE PROCESSING APPARATUS HAVING TOUCH PANEL - An image processing apparatus includes an operation panel as an example of a touch panel and a display device, as well as a CPU as an example of a processing unit for performing processing based on a contact. The CPU includes a first identifying unit for identifying a file to be processed, a second identifying unit for identifying an operation to be executed, a determination unit for determining whether or not the combination of the file and operation as identified is appropriate, and a display unit for displaying a determination result. In the case where one of the identifying units previously detects a corresponding gesture to identify the file or the operation, and a gesture corresponding to the other identifying unit is detected next, the determination result is displayed on the display device before identification of the file or the operation is completed by that gesture. | 01-31-2013 |
20130031517 | HAND POSE INTERACTION - Provided is a method of hand pose interaction. The method recognizes a user input related to selection of an object displayed on a computing device and displays a graphical user interface (GUI) corresponding to the object. The graphical user interface comprises at least one representation of a hand pose, wherein each representation of a hand pose corresponds to a unique function associated with the object. Upon recognition of a user hand pose corresponding to a hand pose representation in the graphical user interface, the function associated with the hand pose representation is executed. | 01-31-2013 |
20130036389 | COMMAND ISSUING APPARATUS, COMMAND ISSUING METHOD, AND COMPUTER PROGRAM PRODUCT - According to an embodiment, a command issuing apparatus includes an acquiring unit configured to acquire an image obtained by capturing a subject; a detector configured to detect a specific region of the subject from the image; a first setting unit configured to set a specific position indicating a position of the specific region; a second setting unit configured to set a reference position indicating a position that is to be a reference in the image; a first calculator configured to calculate a position vector directed from the reference position toward the specific position; a second calculator configured to calculate, for each of a plurality of command vectors respectively corresponding to predetermined commands, a first parameter indicating a degree of coincidence between the command vector and the position vector; and an issuing unit configured to issue the command based on the first parameter. | 02-07-2013 |
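The degree of coincidence between a position vector and each command vector, as described in 20130036389, is commonly scored with cosine similarity. In the sketch below, the command set, its direction vectors, and the 0.8 cutoff are all assumptions for illustration, not the patented parameters:

```python
import math

# Hypothetical command set: each command is associated with a unit
# direction vector from the reference position.
COMMAND_VECTORS = {
    "volume_up": (0.0, 1.0),
    "volume_down": (0.0, -1.0),
    "next_track": (1.0, 0.0),
    "prev_track": (-1.0, 0.0),
}

def issue_command(reference_pos, specific_pos, min_coincidence=0.8):
    """Pick the command whose vector best coincides with the position
    vector from the reference position to the detected specific region."""
    px = specific_pos[0] - reference_pos[0]
    py = specific_pos[1] - reference_pos[1]
    mag = math.hypot(px, py)
    if mag == 0:
        return None  # degenerate position vector: no command
    best, best_score = None, min_coincidence
    for command, (cx, cy) in COMMAND_VECTORS.items():
        # Degree of coincidence as the cosine of the angle between vectors.
        score = (px * cx + py * cy) / (mag * math.hypot(cx, cy))
        if score > best_score:
            best, best_score = command, score
    return best
```

A hand detected almost directly above the reference point would therefore issue the hypothetical `volume_up` command.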
20130042209 | System and Method for Providing Direct Access to an Application when Unlocking a Consumer Electronic Device - A consumer electronic device has an orientation sensor and a lock control. The orientation sensor outputs signals identifying the orientation of the device, while the lock control allows a user to move the device from a locked state to an unlocked state. The device also includes a plurality of application programs stored in memory. Responsive to the user unlocking the device, a controller will launch a selected application program. The application that is launched by the device is based on an orientation of the device. | 02-14-2013 |
20130047125 | TOUCHSCREEN GESTURES FOR VIRTUAL BOOKMARKING OF PAGES - A system and method are disclosed for navigating an electronic document using a touch-sensitive display screen with gestures that are reminiscent of physically handling the pages of a conventional, bound document. A user may temporarily bookmark one or more selected pages by touching the touchscreen with a finger when the pages are displayed, to mimic using a finger to hold a selected page of a conventional, bound document. Predefined gestures may be specified with different functions, such as returning to a bookmarked page or removing a bookmark. | 02-21-2013 |
20130047126 | SWITCHING BACK TO A PREVIOUSLY-INTERACTED-WITH APPLICATION - This document describes techniques and apparatuses for switching back to a previously-interacted-with application. In some embodiments, these techniques and apparatuses enable selection of a user interface not currently exposed on a display through a simple gesture that is both easy to use and easy to remember. | 02-21-2013 |
20130055168 | MULTI-SELECT TOOLS - Touch-sensitive features of devices may be used to demarcate a displayed area on the device to identify a set of data points contained in the demarcated area. After identifying the data points in the demarcated area, the user may be presented with an interface on the display listing different actions that may be performed on at least one of the identified data points. After the user selects the actions to be performed, the computing device may perform the selected actions on one or more of the identified data points in the demarcated area. | 02-28-2013 |
20130055169 | APPARATUS AND METHOD FOR UNLOCKING A TOUCH SCREEN DEVICE - An apparatus and method for unlocking a device including a touchscreen display are provided. The method includes detecting an input to the touchscreen, determining whether the input corresponds to a request to unlock the device, determining whether the input is associated with a functionality of the device, unlocking the device if the input corresponds to a request to unlock the device, and loading, if the input is determined to be associated with a functionality of the device, an application associated with the input. | 02-28-2013 |
20130055170 | ELECTRONIC DEVICE AND METHOD OF DISPLAYING INFORMATION IN RESPONSE TO DETECTING A GESTURE - A gesture is detected by an electronic device including a display. In response to detecting a first part of the gesture, a first part of first information is displayed on the display. In response to detecting a second part of the gesture subsequent to the first part of the gesture, the display of the first part of the first information is maintained. In response to detecting a third part of the gesture subsequent to the second part of the gesture, an additional part of the information is displayed. | 02-28-2013 |
20130067419 | Gesture-Enabled Settings - Techniques ( | 03-14-2013 |
20130067420 | Semantic Zoom Gestures - Semantic zoom techniques are described. In one or more implementations, techniques are described that may be utilized by a user to navigate to content of interest. These techniques may also include a variety of different features, such as to support semantic swaps and zooming “in” and “out.” These techniques may also include a variety of different input features, such as to support gestures, cursor-control device, and keyboard inputs. A variety of other features are also supported as further described in the detailed description and figures. | 03-14-2013 |
20130067421 | Secondary Actions on a Notification - Various embodiments enable notifications to be generated in both touch and non-touch environments. In at least some embodiments, a notification window is presented and a drag operation can reveal one or more secondary actions that can be performed. In at least some embodiments, selection of one or more of the secondary actions can occur independent of, and without utilizing additional special affordances, such as buttons. | 03-14-2013 |
20130067422 | TERMINAL CAPABLE OF CONTROLLING ATTRIBUTE OF APPLICATION BASED ON MOTION AND METHOD THEREOF - A terminal controls an application attribute based on a motion. The terminal includes a motion measurement unit, an attribute mapping unit, an attribute control unit, and a controller. The motion measurement unit measures a motion of the terminal. The attribute mapping unit classifies motion directions of the terminal, classifies a motion degree, and maps an attribute type and a control strength of an application installed in the terminal in response to each motion direction and each motion degree. The attribute control unit activates an attribute control of the application based on a motion of the terminal. When the attribute control of the application is activated by the attribute control unit and the motion of the terminal is measured by the motion measurement unit, the controller controls an attribute of the application based on the attribute type and the control strength mapped by the attribute mapping unit. | 03-14-2013 |
20130074014 | COLLABORATIVE GESTURE-BASED INPUT LANGUAGE - In one example, a method includes receiving, by a server, from a plurality of computing devices, data representative of a group of gestures detected by the plurality of computing devices and data representative of one or more shortcuts associated with the group of gestures, wherein each shortcut corresponds to an action performed by at least one of the computing devices. The method may further include aggregating, by the server, the data representative of the gestures and the data representative of the associated shortcuts received from the plurality of computing devices based at least in part on detected similarities between at least one of 1) the group of gestures and 2) the associated shortcuts, and defining, by the server, a gesture-shortcut language based at least in part on the aggregated data. | 03-21-2013 |
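The server-side aggregation step in 20130074014 can be sketched as majority voting over reported (gesture, shortcut) pairs. Grouping by exact gesture signature, rather than the similarity-based clustering the abstract describes, is a simplifying assumption, and the signature and shortcut names are invented:

```python
from collections import Counter, defaultdict

def define_gesture_language(reports):
    """Aggregate (gesture_signature, shortcut) pairs reported by many
    devices and keep the most commonly associated shortcut per gesture,
    yielding a shared gesture-shortcut language."""
    votes = defaultdict(Counter)
    for gesture_signature, shortcut in reports:
        votes[gesture_signature][shortcut] += 1
    return {gesture: counts.most_common(1)[0][0]
            for gesture, counts in votes.items()}
```

With three devices mapping a zigzag to mail and one to maps, the defined language would associate the zigzag with mail.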
20130074015 | SYSTEM TO SCROLL TEXT ON A SMALL DISPLAY SCREEN - The present invention is a system to scroll text on a small display screen. It includes a screen finger sensor that detects when a selected one of a user's finger and a stylus is dragged across the small display screen and a page, sets a scroll speed, scrolls to an edge of the page, moves down a sentence, and repeats the scrolling; upon reaching the bottom of the page, it moves to a first sentence of the next consecutive page. The system also includes a logic controller that detects that the text is displayed and sets the screen finger sensor to await activation of the system, and is typically a software module or a downloaded app. | 03-21-2013 |
20130086531 | COMMAND ISSUING DEVICE, METHOD AND COMPUTER PROGRAM PRODUCT - According to an embodiment, a command issuing device includes an acquiring unit configured to acquire a moving image by capturing a hand of an operator; a projection area recognizer configured to recognize a projection area of a projection finger in the moving image; a projector configured to project one of the pictures of a graphical user interface (GUI) onto the projection area; an operation area recognizer configured to recognize an operation area of an operation finger in the moving image; a selection determining unit configured to measure an overlapping degree of the projection area and the operation area to determine whether or not the GUI is selected; and an issuing unit configured to issue a command associated with the GUI when it is determined that the GUI is selected. | 04-04-2013 |
20130086532 | TOUCH DEVICE GESTURES - A system and method for facilitating employing touch gestures to control or manipulate a web-based application. The example method includes employing a browser running on a device with a touch-sensitive display to access content provided via a website; determining a context associated with the content, including ascertaining one or more user interface controls to be presented via a display screen used to present the content, and providing a first signal in response thereto; receiving touch input from a touch-sensitive display and providing a second signal in response thereto; and using the second signal to manipulate the display screen in accordance with the context associated with the content presented via the display screen. A library of touch gestures can represent common functions through touch movement patterns. These gestures may be context sensitive so as not to conflict with default touch tablet gestures. | 04-04-2013 |
20130086533 | DEVICE FOR INTERACTING WITH REAL-TIME STREAMS OF CONTENT - An end-user system ( | 04-04-2013 |
20130091473 | CHANGING DISPLAY BETWEEN GRID AND FORM VIEWS - Data can be displayed in a display in a first orientation. The display can include a grid view of the data. A user input can be received, where the user input directs a change of orientation of the display from the first orientation to a second orientation. For example, the user input can include rotating a display device. In response to the user input, the orientation of the display can be changed from the first orientation to the second orientation, and the grid view can be changed to a form view of the data. Also, in response to another user input such as rotating the display device, the orientation can be changed from the second orientation to the first orientation, and the display can be changed from the form view to the grid view. | 04-11-2013 |
20130091474 | METHOD AND ELECTRONIC DEVICE CAPABLE OF SEARCHING AND DISPLAYING SELECTED TEXT - An electronic device includes a storage unit, a touch display unit and a central processing unit. The central processing unit includes a control module, a searching module, and a split-screen module. The control module generates a first window on the touch display unit to display a text document when the text document is opened, and determines a selected text of the displayed text document by a user according to touch positions when the touch display unit is touched. The searching module searches occurrences of the selected text in the text document, and the control module stores the searched text in the storage unit. The split-screen module displays each occurrence of the selected text on a second window produced thereby with a size thereof smaller than that of the first window. A related method is also provided. | 04-11-2013 |
20130097565 | LEARNING VALIDATION USING GESTURE RECOGNITION - Embodiments are disclosed that relate to assessing a user's ability to recognize a target item by reacting to the target item and performing a target gesture. For example, one disclosed embodiment provides a method of assessing a user's ability to recognize a target item from a collection of learning items that includes the target item. The method includes providing to a display device the learning items in a sequence and receiving input from a sensor to recognize a user gesture made by the user. If the user gesture is received within a target timeframe corresponding to the target item, then the method includes determining whether the user gesture matches a target gesture. If the user gesture matches the target gesture, then the method includes providing to the display device a reward image for the user. | 04-18-2013 |
20130097566 | SYSTEM AND METHOD FOR DISPLAYING ITEMS ON ELECTRONIC DEVICES - A method, computer readable storage medium, and electronic device are provided which display items on such an electronic device by displaying a first set of items, receiving a first gesture on a touch-sensitive panel of the electronic device, and displaying a second set of items, the second set of items being a subset of the first set of items. | 04-18-2013 |
20130104089 | GESTURE-BASED METHODS FOR INTERACTING WITH INSTANT MESSAGING AND EVENT-BASED COMMUNICATION APPLICATIONS - Gesture-based methods of managing communications of a user participating in communication sessions permit the user to easily manage the communications sessions by defining gestures, defining a meaning of the gesture, and outputting the meaning of the gesture to a communication session when the gesture is detected. The gestures may be contextually dependent, such that a single gesture may generate different output, and may be unconventional to eliminate confusion while gesturing during the communication sessions, and thereby the communications sessions may be more effectively managed. | 04-25-2013 |
20130104090 | DEVICE AND METHOD FOR SELECTION OF OPTIONS BY MOTION GESTURES - A method for selection of an option on a device is provided where the device is enabled for option selection through motion gestures by a user. The method comprises providing at least one option for a first input request and announcing the first input request and at least one option of the first input request. A first motion gesture is detected, and the device determines whether the first motion gesture corresponds to a positive selection or a negative selection, wherein a control module of the device determines whether the first motion gesture meets a threshold for a positive gesture selection or a negative gesture selection. The device advances to a second option and announces the second option upon the determination of a negative selection as the first motion gesture. The selected option for the first input request is stored in a memory of the device after a positive selection. | 04-25-2013 |
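The threshold test for positive versus negative motion gestures described in 20130104090 might look like the following sketch. The tilt axes, the 15-degree threshold, and the announce-and-advance loop are invented for illustration, not the patented implementation:

```python
def classify_motion(pitch_delta, roll_delta, threshold=15.0):
    """Classify a device tilt (in degrees) as a positive or negative
    selection: a forward/back nod past the threshold is positive,
    a sideways tilt past the threshold is negative."""
    if abs(pitch_delta) >= threshold and abs(pitch_delta) >= abs(roll_delta):
        return "positive"
    if abs(roll_delta) >= threshold:
        return "negative"
    return None  # below threshold: no selection registered

def select_option(options, motions):
    """Announce options in turn: advance on a negative gesture,
    return the current option on a positive gesture."""
    idx = 0
    for pitch, roll in motions:
        result = classify_motion(pitch, roll)
        if result == "positive":
            return options[idx]
        if result == "negative":
            idx = (idx + 1) % len(options)
    return None
```

For example, a sideways tilt followed by a firm nod would skip the first announced option and select the second.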
20130111414 | VIRTUAL MOUSE DRIVING APPARATUS AND VIRTUAL MOUSE SIMULATION METHOD | 05-02-2013 |
20130117718 | ELECTRONIC DEVICE AND METHOD OF DISPLAYING INFORMATION IN RESPONSE TO A GESTURE - A method includes displaying information associated with a first application on a touch-sensitive display of an electronic device. A gesture is detected on the touch-sensitive display, which gesture indicates a request to display information associated with a second application. At least part of the information associated with the second application is displayed without opening the second application. | 05-09-2013 |
20130125068 | Methods and Apparatus for Natural Media Painting Using a Realistic Brush and Tablet Stylus Gestures - Systems and methods for providing a natural media painting application may receive user inputs through tablet stylus gestures. A user interface may detect stylus gestures that mimic real-world actions of artists based on information collected during user manipulation of the stylus, and may map the gestures to various digital painting and image editing tasks that may be invoked and/or controlled using the gesture-based inputs. The collected information may include spatial and/or directional information, acceleration data, an initial and/or ending position of the stylus, an initial and/or ending orientation of the stylus, and/or pressure data. The stylus gestures may include translations, rotations, twisting motions, mashing gestures, or jerking motions. The application may perform appropriate painting and image editing actions in response to detecting and recognizing the stylus gestures, and the actions taken may be dependent on the work mode and/or context of the graphics application in which the stylus gesture was performed. | 05-16-2013 |
20130125069 | System and Method for Interactive Labeling of a Collection of Images - Various embodiments of a system and method for labeling images are described. An image labeling system may label multiple images, where each of the images may be labeled to identify image content or image elements, such as backgrounds or faces. The system may display some of the labeled image elements in different portions of a display area. Unlabeled image elements may be displayed in the same display area. The display size and position of each unlabeled image element may be dependent on similarities between the unlabeled image element and the displayed, labeled image elements. The system may also receive gesture input in order to determine a corresponding labeling task to perform. | 05-16-2013 |
20130139115 | RECORDING TOUCH INFORMATION - A method of recording user-driven events within a computing system includes receiving, at a motion-sensitive display surface, at least one user-performed gesture, which includes user movement of an object over the surface, the surface recognizing such user interaction. Touch information is generated corresponding to the at least one user-performed gesture. The touch information is configured to be provided to an application. The touch information is intercepted and recorded before it is provided to the application. The intercepted touch information is grouped into at least one chunk, and the at least one chunk is output to the application. | 05-30-2013 |
20130145327 | Interfaces for Displaying an Intersection Space - User-submitted content (e.g., stories) may be associated with descriptive metadata, such as a timeframe, location, tags, and so on. The user-submitted content may be browsed and/or searched using the descriptive metadata. Intersection criteria comprising a prevailing timeframe, a location, and/or other metadata criteria may be used to identify an intersection space comprising one or more stories. The stories may be ordered according to relative importance, which may be determined (at least in part) by comparing story metadata to the intersection criteria. Stories may be browsed in an intersection interface comprising a timeframe control. The intersection interface (and the timeframe control) may be configured to receive inputs in various forms including gesture input, movement input, orientation input, and so on. | 06-06-2013 |
20130152024 | ELECTRONIC DEVICE AND PAGE ZOOMING METHOD THEREOF - A page zooming method for an electronic device having a touch screen and a storage unit is provided. The method includes the following steps: generating operation signals in response to a touch operation applied on a page displayed on the touch screen; determining the touch operation to be a zooming gesture if the touch operation comprises a press operation and a slide operation at the same time; determining the slide direction of the slide operation and determining the type of the zooming gesture according to the determined slide direction, the type of the zooming gesture comprising a zooming in gesture and a zooming out gesture; creating a zoomed page of the page displayed on the touch screen according to the type of the zooming gesture; and displaying the zoomed page on the touch screen. An electronic device using the page zooming method is also provided. | 06-13-2013 |
20130159938 | GESTURE INFERRED VOCABULARY BINDINGS - The subject disclosure relates to annotating data based on gestures. Gestures include user interaction with a client device or client software. Gestures are tracked and associated with data. In an aspect, client context associated with a gesture is also tracked. The gestures are then employed to determine a global term to associate with the data. In an aspect, a look-up table comprising a pre-defined relationship between gestures and a global term can be employed. In another aspect, an inference component employs context information in conjunction with the tracked gestures to determine a global term to assign to data. After a global term is determined for data based on a gesture, an annotation file for the data can be created associating the data with the global term. | 06-20-2013 |
20130159939 | AUTHENTICATED GESTURE RECOGNITION - Methods, apparatuses, systems, and computer-readable media for performing authenticated gesture recognition are presented. According to one or more aspects, a gesture performed by a user may be detected. An identity of the user may be determined based on sensor input captured substantially contemporaneously with the detected gesture. Then, it may be determined, based on the identity of the user, that the detected gesture corresponds to at least one command of a plurality of commands. Subsequently, the at least one command may be executed. In some arrangements, the gesture may correspond to a first command when performed by a first user, and the same gesture may correspond to a second command different from the first command when performed by a second user different from the first user. | 06-20-2013 |
20130159940 | Gesture-Controlled Interactive Information Board - A method of controlling an information board comprises the steps of sensing a gesture using a gesture capturing controller and determining a type of command from the gesture expression provided by the gesture capturing controller, where the type of command is one of a navigation request. Depending on the determined type of gesture, user interface elements of a spatial configuration are displayed. | 06-20-2013 |
20130159941 | ELECTRONIC DEVICE AND METHOD OF DISPLAYING INFORMATION IN RESPONSE TO A GESTURE - A method includes displaying, on a display of an electronic device, first information and detecting a gesture on the touch-sensitive display, which gesture indicates a request to display an inbox associated with a plurality of applications. In response to detecting the gesture, when a message is received for a first application that is not one of the plurality of applications, a plurality of visual notification icons is displayed and at least part of the inbox is gradually displayed while reducing display of the first information along with movement of the gesture, wherein a first visual notification icon of the plurality of visual notification icons is associated with the first application. | 06-20-2013 |
20130159942 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus includes a proximity panel, a communication module and a controller. The proximity panel receives first gesture information. The first gesture information is received in response to a trajectory of movement on the proximity panel. The communication module receives second gesture information from a computing device. The controller determines whether the first gesture information and the second gesture information correspond to predetermined gesture information. In the event that the controller determines that the first gesture information and the second gesture information correspond to the predetermined gesture information, the controller causes predetermined data to be communicated with the computing device. | 06-20-2013 |
20130167093 | DISPLAY APPARATUS FOR RELEASING LOCKED STATE AND METHOD THEREOF - A method and apparatus are provided for releasing a locked state of a display apparatus. A display unit displays a locked view having a line and an affordance object connected to the line. When a user touches a point on the line in the locked view, a control unit controls the display unit to switch to an unlocked view while separating the affordance object by cutting the line at the point on the line. | 06-27-2013 |
20130167094 | Device, Method, and Graphical User Interface for Selection of Views in a Three-Dimensional Map Based on Gesture Inputs - An electronic device displays a first map view of a map that includes one or more map objects on a touch-sensitive display. While displaying the first map view, the device detects a first gesture of a first gesture type at a first location on the touch-sensitive display. The first location corresponds to a respective map object. In response to detecting the first gesture at the first location, the device enters a map view selection mode. While in the map view selection mode, the device detects a second gesture of a second gesture type at a second location on the touch-sensitive display. The second location corresponds to a respective location on the map. In response to detecting the second gesture at the second location, the device replaces the first map view with a second map view that includes a view of the respective map object from the respective location. | 06-27-2013 |
20130174100 | Device, Method, and Graphical User Interface for Configuring Restricted Interaction with a User Interface - An electronic device, while in an interaction configuration mode: displays a first user interface that includes a plurality of user interface objects; and, while displaying the first user interface, detects one or more gesture inputs on a touch-sensitive surface. For a respective gesture input, the device determines whether one or more user interface objects of the plurality of user interface objects correspond to the respective gesture input. The device visually distinguishes a first set of user interface objects in the plurality of user interface objects that correspond to the detected one or more gesture inputs from a second set of user interface objects in the plurality of user interface objects that do not correspond to the detected one or more gesture inputs. The device detects an input; and, in response to detecting the input, exits the interaction configuration mode and enters a restricted interaction mode. | 07-04-2013 |
20130174101 | ELECTRONIC APPARATUS AND METHOD OF CONTROLLING THE SAME - An electronic apparatus and a method of controlling the electronic apparatus are provided. The method includes: receiving a two hand start command which is to perform a motion task using two hands; if the two hand start command is input, changing a mode of the electronic apparatus to a two hand task mode which is to perform the motion task using the two hands; and if the mode of the electronic apparatus is changed to the two hand task mode, displaying a two hand input guide graphical user interface (GUI) which is to perform the motion task using the two hands. Therefore, a user can more intuitively and conveniently perform a function of the electronic apparatus, such as a zoom-in/zoom-out, by using two hands. | 07-04-2013 |
20130179844 | Input Pointer Delay - Various embodiments enable repetitive gestures, such as multiple serial gestures, to be implemented efficiently so as to enhance the user experience. In at least some embodiments, a first gesture associated with an object is detected. The first gesture is associated with a first action. Responsive to detecting the first gesture, pre-processing associated with the first action is performed in the background. Responsive to detecting a second gesture associated with the object within a pre-defined time period, an action associated with the second gesture is performed. Responsive to the second gesture not being performed within the pre-defined time period, processing associated with the first action is completed. | 07-11-2013 |
20130179845 | METHOD AND APPARATUS FOR DISPLAYING KEYPAD IN TERMINAL HAVING TOUCH SCREEN - A method and an apparatus display a key pad, easing the trouble and difficulty a user faces in selecting the key pad displayed in a terminal having a touch screen. The method detects a touch gesture with respect to the touch screen; determines whether the detected touch gesture is a zoom-out; displays thumbnails representing key pads, respectively, when the detected touch gesture is the zoom-out; and displays a key pad of a selected thumbnail when one of the displayed thumbnails is selected by a user. | 07-11-2013 |
20130185680 | Unlocking a Device by Performing Gestures on an Unlock Image - A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture. | 07-18-2013 |
20130191789 | CONTROLLING A TRANSACTION WITH COMMAND GESTURES - Embodiments of the invention include systems, methods, and computer-program products that provide for a unique system for controlling transactions based on command gestures. In one embodiment of the invention, a computer-implemented method determines that a user is conducting a transaction with a mobile device. The mobile device senses a gesture performed by the user with the mobile device and alters at least one aspect of the transaction based on the gesture. The gestures can control a wide variety of aspects of the transaction. For example, the user may flag an item for review during the transaction, silence the transaction, receive a subtotal for the transaction, select a payment method, or complete the transaction. In an embodiment, the user is able to customize the gestures to control the transaction according to the user's preferences. | 07-25-2013 |
20130191790 | INTELLIGENT GESTURE-BASED USER'S INSTANTANEOUS INTERACTION AND TASK REQUIREMENTS RECOGNITION SYSTEM AND METHOD - Methods and apparatus for determining an intended gesture-based input command from an incomplete gesture-based input command that is supplied to a gesture-based touch screen display that includes at least a touch sensitive region include receiving an incomplete gesture-based input command on the touch sensitive region of the gesture-based touch screen device, the incomplete gesture-based input command including a gesture profile and a gesture direction. Gesture signals that include data representative of the gesture profile and the gesture direction are generated in response to the input command. The gesture signals are processed in a processor to predict the intended gesture-based input command. The intended gesture-based command is retrieved, with the processor, from an electronically stored standard gesture library. | 07-25-2013 |
20130191791 | ELECTRONIC DEVICE AND METHOD OF CONTROLLING A DISPLAY - A method includes entering, by a portable electronic device, a low-power condition and while in the low-power condition, detecting an input. In response to detecting the input, a cover image is displayed. An application image is progressively revealed along with movement of a gesture while reducing display of the cover image. | 07-25-2013 |
20130205262 | METHOD AND APPARATUS FOR ADJUSTING A PARAMETER - A method comprising receiving an indication of a continuous stroke input, setting an adjustment magnitude based on a predetermined adjustment magnitude, determining that the continuous stroke input comprises a first magnitude adjustment input, adjusting the adjustment magnitude based on the magnitude adjustment input, determining that the continuous stroke input comprises a first adjustment input, and adjusting a parameter based on the adjustment magnitude and the first adjustment input is disclosed. | 08-08-2013 |
20130212541 | METHOD, A DEVICE AND A SYSTEM FOR RECEIVING USER INPUT - The invention relates to a method, a device and system for receiving user input. User interface events are first formed from low-level events generated by a user interface input device such as a touch screen. The user interface events are modified by forming information on a modifier for the user interface events such as time and coordinate information. The events and their modifiers are sent to a gesture recognition engine, where gesture information is formed from the user interface events and their modifiers. The gesture information is then used as user input to the apparatus. In other words, the gestures may not be formed directly from the low-level events of the input device. Instead, user interface events are formed from the low-level events, and gestures are then recognized from these user interface events. | 08-15-2013 |
20130219345 | APPARATUS AND ASSOCIATED METHODS - An apparatus including: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: in response to user input, disassociate the linking of the orientation of content with respect to the orientation of a display of a portable electronic device to retain the particular orientation of the content with respect to the orientation of the display during the user input. | 08-22-2013 |
20130219346 | Input to Locked Computing Device - The subject matter of this specification can be embodied in, among other things, a method that includes receiving at a computing device that is in a locked state, one or more user inputs to unlock the device and to execute at least one command that is different from a command for unlocking the device. The method further includes executing in response to the user inputs to unlock the device an unlocking operation by the device to convert the device from a locked state to an unlocked state. The method further includes executing the at least one command in response to receiving the user inputs to execute the at least one command. The at least one command executes so that results of executing the at least one command are first displayed on the device to a user automatically after the device changes from the locked state to the unlocked state. | 08-22-2013 |
20130227495 | ELECTRONIC DEVICE AND METHOD OF CONTROLLING A DISPLAY - A method comprises detecting, by a portable electronic device having a display, a gesture, selecting a category of selectable items based on the gesture. The method further comprises identifying one or more selectable items within the selected category to be displayed, and displaying the one or more selectable items. | 08-29-2013 |
20130227496 | IMAGE PROCESSING DEVICE, NON-TRANSITORY COMPUTER READABLE MEDIUM, AND IMAGE PROCESSING METHOD - Provided is an image processing device including a display unit that displays a function setting screen, a recognition unit that recognizes a trail of an operation of a user on the function setting screen displayed on the display unit, a determining unit that determines a function selected by the user based on a position of the trail of the operation recognized by the recognition unit, an identification unit that identifies an operation condition designated by the user with respect to the function determined by the determining unit based on the trail of the operation recognized by the recognition unit, and a setting unit that performs setting for executing image processing of the function determined by the determining unit using the operation condition identified by the identification unit. | 08-29-2013 |
20130239069 | CONTROL METHOD FOR MOBILE DEVICE USING SETTING PATTERN AND MOBILE DEVICE - A control method for a mobile device allows for manipulation of various features of a mobile device by inputting a touch pattern even if an application program is operating in a foreground of the mobile device. The control method includes: receiving a touch input signal; transferring the touch input signal to a software block of the mobile device and an application program; determining if the touch input signal corresponds to a setting pattern in the software block; and performing a set action corresponding to the setting pattern if the touch input signal corresponds to the setting pattern. | 09-12-2013 |
20130246978 | SYSTEM AND METHOD FOR ENHANCED COMMAND INPUT - A portable electronic device having an input device for receiving a gesture based input from a user is used to control a navigation operation of an appliance. The portable electronic device receives via the input device the gesture based input and uses one or more parameters stored in a memory of the portable electronic device and one or more characteristics associated with the gesture based input to cause the portable electronic device to transmit a navigation step command to thereby control the navigation operation of the appliance. | 09-19-2013 |
20130246979 | SYSTEM AND METHOD FOR ENHANCED COMMAND INPUT - A portable electronic device having an input device for receiving a gesture based input from a user is used to control a navigation operation of an appliance. The portable electronic device receives via the input device the gesture based input and uses one or more parameters stored in a memory of the portable electronic device and one or more characteristics associated with the gesture based input to cause the portable electronic device to transmit a navigation step command to thereby control the navigation operation of the appliance. | 09-19-2013 |
20130254721 | SYSTEM AND METHOD FOR ENHANCED COMMAND INPUT - A portable electronic device having an input device for receiving a gesture based input from a user is used to control a navigation operation of an appliance. The portable electronic device receives via the input device the gesture based input and uses one or more parameters stored in a memory of the portable electronic device and one or more characteristics associated with the gesture based input to cause the portable electronic device to transmit a navigation step command to thereby control the navigation operation of the appliance. | 09-26-2013 |
20130254722 | SYSTEM AND METHOD FOR ENHANCED COMMAND INPUT - A portable electronic device having an input device for receiving a gesture based input from a user is used to control a navigation operation of an appliance. The portable electronic device receives via the input device the gesture based input and uses one or more parameters stored in a memory of the portable electronic device and one or more characteristics associated with the gesture based input to cause the portable electronic device to transmit a navigation step command to thereby control the navigation operation of the appliance. | 09-26-2013 |
20130268900 | TOUCH SENSOR GESTURE RECOGNITION FOR OPERATION OF MOBILE DEVICES - Touch sensor gesture recognition for operation of mobile devices. An embodiment of a mobile device includes a touch sensor for the detection of gestures, the touch sensor including multiple sensor elements, and a processor, the processor to interpret the gestures detected by the touch sensor, where the mobile device divides the plurality of sensor elements into multiple zones, and the mobile device interprets the gestures based at least in part on which of the zones detects the gesture. An embodiment of a mobile device includes a touch sensor for the detection of gestures, the touch sensor including multiple sensor elements, and a processor, the processor to interpret the gestures detected by the touch sensor, where the processor is to identify one or more dominant actions for an active application or a function of the active application and is to choose a gesture identification algorithm from a plurality of gesture recognition algorithms based at least in part on the identified one or more dominant actions, and is to determine a first intended action of a user based on an interpretation of a first gesture using the chosen gesture identification algorithm. An embodiment of a mobile device includes a touch sensor for the detection of gestures, the touch sensor including multiple sensor elements, and a processor, the processor to interpret the gestures detected by the touch sensor, and a mapping between touch sensor data and actual positions of user gestures, the mapping of data being generated by an artificial neural network, where the processor utilizes the mapping at least in part to interpret the gestures. | 10-10-2013 |
20130275923 | Method and Device Having Touchscreen Keyboard with Visual Cues - A method for providing visual cues rendered on a display is provided. The method comprises: detecting a touch input associated with a user interface element rendered on the display; determining an input direction of the touch input; and displaying on the display a visual cue associated with the user interface element, wherein the visual cue is located at a position based on the input direction of the touch input. | 10-17-2013 |
20130275924 | LOW-ATTENTION GESTURAL USER INTERFACE - A system and method to generate a touch-based user interface that allows users to reliably carry out tasks in a low-attention environment. The touch-based user interface relies upon swipe and tap gestures that are easily entered by a user without having to concentrate on a touchscreen or display associated with a touchpad on which the gestures are entered. For gestures that are swipes, only the direction of the swipe is utilized by the system in interpreting the entered command. For tap gestures, only the number of taps in a sequence, as well as the duration of taps, is utilized by the system in interpreting the entered command. By not correlating the location of the entered gestures with what is displayed on the associated display screen, the touch interface disclosed herein is well suited for environments where a user is unable to look at the display screen while performing gestures. | 10-17-2013 |
20130283215 | APPLICATION DISPLAY ON A LOCKED DEVICE - A user request to display an application while the device is locked is received. In response to this user request, one or more images generated by the application are obtained and displayed while the device is locked. Additionally, an indication of an application to be displayed upon resuming operation from a power-saving mode can be received, and an image generated by the application is displayed in response to resuming operation from the power-saving mode. | 10-24-2013 |
20130290910 | USER INTERFACE CONTROL USING A KEYBOARD - User interface control using a keyboard is described. In an embodiment, a user interface displayed on a display device is controlled using a computer connected to a keyboard. The keyboard has a plurality of alphanumeric keys that can be used for text entry. The computer receives data comprising a sequence of key-presses from the keyboard, and generates for each key-press a physical location on the keyboard. The relative physical locations of the key-presses are compared to calculate a movement path over the keyboard. The movement path describes the path of a user's digit over the keyboard. The movement path is mapped to a sequence of coordinates in the user interface, and the movement of an object displayed in the user interface is controlled in accordance with the sequence of coordinates. | 10-31-2013 |
20130290911 | METHOD AND SYSTEM FOR MULTIMODAL AND GESTURAL CONTROL - Embodiments of the present invention disclose a multimodal and gestural control system. According to one embodiment, the multimodal and gestural control system is configured to detect a gesture command from a user via at least one device of a plurality of devices. A control operation and a destination device are both determined based on the gesture command such that the determined control operation is executed on the determined destination device. | 10-31-2013 |
20130298085 | DISPLAYED CONTENT DRILLING IN A TOUCH SCREEN DEVICE - Methods may provide drilling of displayed content in a touch screen device. A method may include detecting a touch gesture by a user on a first portion of displayed content on a touch display. The first portion may include a drillable data element having at least a first dimension, a second dimension and a third dimension. The method may further include detecting information associated with the touch gesture, determining a requested drilling action based at least in part on the detected information, the requested drilling action including at least one of a change of a displayed drill dimension and a change of a displayed drill degree, sending the requested drilling action of the first portion to a report server and presenting a drilled first portion on the touch display. | 11-07-2013 |
20130311955 | System and Method for Providing Direct Access to an Application when Unlocking a Consumer Electronic Device - A consumer electronic device has an orientation sensor and a lock control. The orientation sensor outputs signals identifying the orientation of the device, while the lock control to allow a user to move the device from a locked state to an unlocked state. The device also includes a plurality of application programs stored in memory. Responsive to the user unlocking the device, a controller will launch a selected application program. The application that is launched by the device is based on an orientation of the device. | 11-21-2013 |
20130311956 | INPUT ERROR-CORRECTION METHODS AND APPARATUSES, AND AUTOMATIC ERROR-CORRECTION METHODS, APPARATUSES AND MOBILE TERMINALS - An input error-correction method for a software keyboard is provided. The method includes: when entering an input key on the software keyboard, detecting if there is a sliding input; if there is a sliding input, obtaining a slide angle and a slide direction from the sliding input; and determining a target key to replace the input key according to the input key, the slide angle, and the slide direction for input error correction. | 11-21-2013 |
20130318482 | GESTURAL CONTROL FOR QUANTITATIVE INPUTS - A method for value specification in a responsive interface control, the method including: displaying an interface control in a user interface on a touch display device, wherein the interface control is an interactive interface element configured to set an interface value selected from an ordered continuum of values; detecting a shape of a touch gesture input on the interface control on the touch display device; and changing the interface value in response to detecting a change in the shape of the gesture input. | 11-28-2013 |
20130326429 | CONTEXTUAL GESTURES MANAGER - According to some embodiments, a method and apparatus are provided to receive a first gesture registration associated with a first application, receive a portal gesture registration associated with a web portal, and prioritize gestures associated with the web portal based on the first gesture registration and the portal gesture registration. | 12-05-2013 |
20130326430 | OPTIMIZATION SCHEMES FOR CONTROLLING USER INTERFACES THROUGH GESTURE OR TOUCH - A web application provides a custom selection for editing text on a gesture or touch screen. The application replaces native browser handles with selection handles to provide consistent user interface experience across platforms. The application also provides a scheme for semantic interpretation of browser gesture or touch events. The application standardizes browser events into a consistent stream of semantic events that are compatible with a plurality of devices and browsers. The application also provides a gesture or touch optimized user interface in the browser. The application determines gesture or touch input and optimizes the user interface according to the type of input. | 12-05-2013 |
20130326431 | ARRANGEMENT, METHOD AND COMPUTER PROGRAM FOR CONTROLLING A COMPUTER APPARATUS BASED ON EYE-TRACKING - A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request. | 12-05-2013 |
20130326432 | Processing For Distinguishing Pen Gestures And Dynamic Self-Calibration Of Pen-Based Computing Systems - Systems, methods, and computer-readable media process and distinguish user input device gestures, such as gestures input via a pen in a pen-based computing system, e.g., to quickly and reliably distinguish between electronic ink entry, single taps, double taps, press-and-hold actions, dragging operations, and the like. Systems, methods, and computer-readable media also are provided for dynamically calibrating a computer system, e.g., calibrating a displayed input panel view based on input data recognized and received by a digitizer. Such systems and methods may operate without entering a dedicated or special calibration application, program, or routine. | 12-05-2013 |
20130332892 | USER INTERFACE DEVICE ENABLING INPUT MOTIONS BY FINGER TOUCH IN DIFFERENT MODES, AND METHOD AND PROGRAM FOR RECOGNIZING INPUT MOTION - A process is disclosed of operating a user interface device configured to display an image on a display screen and enable user input by a user's finger touch on a touch screen in association with the displayed image, the process including: after the user starts a finger touch on the touch screen, if the user holds the finger at a substantially same location for a duration equal to or longer than a predetermined duration, initiating stationary displaying to display the image stationarily on the display screen, despite any later finger-slide motions on screen; and, after the stationary displaying starts, if a match is made between the finger's contact point on the touch screen and a location of a desirable graphical object included in the image displayed on the display screen, determining that the user has provisionally selected the object. | 12-12-2013 |
20130339908 | USING AN ADAPTIVE CURSOR FOR PREVENTING AND/OR REHABILITATING AN INJURY - Disclosed is software allowing a user to operate a computing device while preventing and/or rehabilitating an injury. Three-dimensional gestures of a user are translated into corresponding movement of a cursor on a display device. Different gestures can indicate the same motion of the cursor. As the user gestures to move the cursor, the software determines, based on a history of use specific to the user, whether the user can continue without feeling pain or fatigue. If it is determined that continued use will cause or is likely to cause pain or fatigue, the software can request the user to take a break, or can switch the gesture or motion required by the user to move the cursor in a similar manner. | 12-19-2013 |
20130339909 | TERMINAL AND METHOD FOR SETTING MENU ENVIRONMENTS IN THE TERMINAL - An apparatus and method for setting a menu environment in a mobile terminal are provided. The apparatus includes a controller for switching to an environment setting mode of a menu according to a type of a gesture having occurred on the menu. | 12-19-2013 |
20130346924 | TOUCH INTERACTIONS WITH A DRAWING APPLICATION - Concepts and technologies are described herein for touch interactions with a drawing application. In accordance with the concepts and technologies disclosed herein, user devices can obtain drawing data generated by a web-based drawing application and can display a drawing by rendering the data in one or more UIs. The user device can interpret touch gestures at a touch sensitive display used to present the UIs and can interpret the touch gestures as corresponding to one or more commands for modifying the UIs. According to various embodiments, the user device can interpret the touch gestures by determining if the touch gestures intersect an object in the drawing and other behavior associated with the touch gesture such as movement, subsequent touches, and whether or not an object intersected was selected when the touch gesture was commenced. | 12-26-2013 |
20140007018 | SUMMATION OF TAPPABLE ELEMENTS RESULTS/ACTIONS BY SWIPE GESTURES | 01-02-2014 |
20140007019 | METHOD AND APPARATUS FOR RELATED USER INPUTS | 01-02-2014 |
20140007020 | USER CUSTOMIZABLE INTERFACE SYSTEM AND IMPLEMENTING METHOD THEREOF | 01-02-2014 |
20140007021 | DISPLAY METHOD AND INFORMATION PROCESSING DEVICE | 01-02-2014 |
20140007022 | NATURAL GESTURE BASED USER INTERFACE METHODS AND SYSTEMS | 01-02-2014 |
20140013285 | METHOD AND APPARATUS FOR OPERATING ADDITIONAL FUNCTION IN MOBILE DEVICE - A method and apparatus for easily operating an additional function associated with an application are provided in a mobile device. In the method, the apparatus detects a gesture inputted while a specific application is running, and invokes an additional function corresponding to the detected gesture. Also, the apparatus displays an execution screen that contains at least one link item for offering a connection with the specific application. Furthermore, when a signal for selecting the link item is inputted, the apparatus applies an execution result of the additional function to the specific application. The inventive method and apparatus thus enhance a user's convenience. | 01-09-2014 |
20140019918 | SMART PHONE LIKE GESTURE INTERFACE FOR WEAPON MOUNTED SYSTEMS - A smart phone like gesture based interface for controlling a weapon mounted system is disclosed. In one embodiment, a gesture input is detected via the gesture based interface. One or more inputs and/or one or more outputs associated with the detected gesture input are then identified. The weapon mounted system is then controlled based on the identified one or more inputs and/or one or more outputs. | 01-16-2014 |
20140026105 | Method and Apparatus Pertaining to a Gesture-Controlled Snooze Instruction - A control circuit detects a predetermined event (such as but not limited to a particular time of day) and responsively automatically switches to an alarm state. While in this alarm state the control circuit monitors at least one non-snooze-specific area of a user input (such as but not limited to an area of a touch-sensitive display) for a user's gesture (such as but not limited to a swipe). In response to detecting the user's gesture while in the alarm state the control circuit automatically interprets the user's gesture as a snooze instruction regarding the alarm state. | 01-23-2014 |
20140033133 | SEMANTIC LEVEL GESTURE TOOL TRACKING AND POSITIONING - Various embodiments described herein include one or more of systems, methods, and software operable to identify a location of or position a gesture tool, such as a mouse pointer or cursor, within a web conference display. Some embodiments may communicate an identified location of a gesture tool within a user interface control of a web conference presenter to web conference participants. The communicated location of the gesture tool may cause the gesture tool to be displayed in a corresponding location within a display of a web conference participant despite differences between a view of the presenter and participant. The gesture tool may include a pointer under the control of a mouse, a cursor, or other gesturing tool. Some embodiments include a web conference recording module operable to record data associated with a web conference, including gesture tool positioning data. | 01-30-2014 |
20140033134 | VARIOUS GESTURE CONTROLS FOR INTERACTIONS IN BETWEEN DEVICES - In some example embodiments, a system and method is shown that includes joining a first device to an asset sharing session to access an asset with the first device. Additionally, a system and method is shown for receiving gesture-based input via the first device, the gesture-based input relating to the asset. Further, a system and method is shown for sharing the asset with a second device, participating in the asset sharing session, based on the gesture-based input. | 01-30-2014 |
20140033135 | GESTURE-INITIATED SYMBOL ENTRY - Apparatus, systems, and methods may operate to receive a symbol recognition process gesture indicating a starting point of symbol entry, to recognize a series of symbols as they are received subsequent to receiving the symbol recognition process gesture, and to instantiate the series of symbols as a symbol set upon receiving an indication that reception of the series of symbols is complete. Instantiating can serve to indicate an ending point of the symbol entry, and to provide the symbol set as a source of input to a process associated with an icon into which the symbol set has been moved, such as by dragging and dropping, or flicking. Additional apparatus, systems, and methods are disclosed. | 01-30-2014 |
20140033136 | Custom Gestures - In one embodiment, a method includes identifying a touch input made by a user of a computing device on a touch screen of the computing device as a particular one of a plurality of custom touch gestures of the user stored on the computing device; determining the particular one of the user inputs corresponding to the particular one of the custom touch gestures identified as the touch gesture made by the user; and executing one or more actions based on the particular one of the user inputs. | 01-30-2014 |
20140033137 | ELECTRONIC APPARATUS, METHOD OF CONTROLLING THE SAME, AND COMPUTER-READABLE STORAGE MEDIUM - An electronic apparatus, a method of controlling the same, and a non-transitory computer-readable storage medium are provided. The method includes: detecting an inclination of the electronic apparatus; recognizing a gesture command from an input image by taking into consideration the inclination of the electronic apparatus; and controlling the electronic apparatus to perform an operation according to the recognized gesture command. | 01-30-2014 |
20140033138 | PHOTOGRAPHING APPARATUS, METHOD OF CONTROLLING THE SAME, AND COMPUTER-READABLE RECORDING MEDIUM - A method of controlling a photographing apparatus includes: recognizing a gesture in an input image as a previously defined gesture; if the gesture corresponds to a previously defined gesture for a zoom operation, performing the zoom operation using optical zoom; and while the zoom operation is being performed, continuing recognizing the gesture by taking into consideration the zoom operation. | 01-30-2014 |
20140033139 | INFORMATION PROCESSING APPARATUS AND COMPUTER READABLE MEDIUM - An information processing apparatus includes: an image pickup unit, a recognition unit and an execution unit. The image pickup unit photographs a recording medium to generate a picked-up image. The recognition unit recognizes a handwritten mark in the picked-up image generated by the image pickup unit. The execution unit performs a predetermined process according to the handwritten mark recognized by the recognition unit. | 01-30-2014 |
20140033140 | QUICK ACCESS FUNCTION SETTING METHOD FOR A TOUCH CONTROL DEVICE - A quick access function setting method for a touch control device is provided, comprising: establishing a one-to-one correspondence relation between input gestures of a touch screen and a designated quick access module; saving parameters of the input gestures in the memory area of a central processing unit; regularly detecting input signals in a touch control area and recording the input signals into a memory of the central processing unit by the touch screen in the working state; and comparing the input signals with the parameters in the memory area, and executing the corresponding quick access function template if the input signals match the parameters. By this method, the required function module can be conveniently and quickly accessed, even with the keyboard locked, thus saving time for searching the function templates on the touch interface. | 01-30-2014 |
20140033141 | METHOD, APPARATUS AND COMPUTER PROGRAM FOR USER CONTROL OF A STATE OF AN APPARATUS - A method comprising: performing user input detection using at least a first detector; detecting using at least the first detector a predefined first phase of a predefined user gesture; detecting using at least a second detector a predefined second phase of the predefined user gesture; and responsive to detecting both the predefined first phase and the predefined second phase of the predefined user gesture, switching between operating in a two-dimensional user interface state and operating in a three-dimensional user interface state. | 01-30-2014 |
20140040835 | ENHANCED INPUT USING RECOGNIZED GESTURES - A representation of a user can move with respect to a graphical user interface based on input of a user. The graphical user interface comprises a central region and interaction elements disposed outside of the central region. The interaction elements are not shown until the representation of the user is aligned with the central region. A gesture of the user is recognized, and, based on the recognized gesture, the display of the graphical user interface is altered and an application control is outputted. | 02-06-2014 |
20140047395 | GESTURE BASED CONTROL OF ELEMENT OR ITEM - Apparatuses ( | 02-13-2014 |
20140053113 | PROCESSING USER INPUT PERTAINING TO CONTENT MOVEMENT - A method of processing user input and an apparatus that includes instructions for executing the method are presented. The user input may pertain to a request to move displayed content in a diagonal direction. In accordance with the inventive concept, the user input may be processed simultaneously along the vertical and horizontal directions to move the displayed content as desired. In one aspect, the method may entail determining a content to be moved based on the user input (e.g., a visual object that a user selects), breaking down the user input into an x-direction component and a y-direction component, computing an elasticity factor for at least one of the x-direction component and the y-direction component, and processing the user input by applying the elasticity factor. The elasticity factor cancels out accidental directional deviation in the user input from the main intended direction of displacement. | 02-20-2014 |
20140053114 | METHOD OF CONTROLLING FUNCTION EXECUTION IN A MOBILE TERMINAL BY RECOGNIZING WRITING GESTURE AND APPARATUS FOR PERFORMING THE SAME - Methods and apparatus are provided for executing a function of a mobile terminal by recognizing a writing gesture. The writing gesture that is inputted on a touchscreen of the mobile terminal is detected. At least one target item to which the writing gesture applies is determined. A preset writing gesture of the at least one target item is compared with the detected writing gesture to determine whether the preset writing gesture is at least similar to the detected writing gesture. An execution command corresponding to the preset writing gesture is extracted, when it is determined that the detected writing gesture is at least similar to the preset writing gesture. The function of the at least one target item is executed by the execution command. | 02-20-2014 |
20140053115 | COMPUTER VISION GESTURE BASED CONTROL OF A DEVICE - A system and method are provided for controlling a device based on computer vision. Embodiments of the system and method of the invention are based on receiving a sequence of images of a field of view; detecting movement of at least one object in the images; applying a shape recognition algorithm on the at least one moving object; confirming that the object is a user hand by combining information from at least two images of the object; and tracking the object to control the device. | 02-20-2014 |
20140053116 | APPLICATION CONTROL IN ELECTRONIC DEVICES - A portable electronic device is provided, comprising a display screen area for providing visual feedback and for receiving gesture inputs, and a switching controller to enable switching between multiple applications that have been executed on the device, the switching controller being adapted to interact with an operating system on the device and including a number of software components that interact with components that are native to an operating system on the device, and wherein the device further comprises a processor for invoking procedures relating to the particular components of the switching controller, wherein the switching controller comprises a task management component for maintaining an ordered list of tasks that are running on the device and allowing for task status to be changed. A method is also provided for controlling switching between a plurality of applications in a portable electronic device comprising a display screen, wherein the method includes generating an ordered list of the plurality of applications that are running on a device and controlling switching between the applications on the basis of the list. A computer readable medium comprises computer program code for causing an electronic device to carry out the method. | 02-20-2014 |
20140059500 | DATA PROCESSING DEVICE AND METHOD OF PERFORMING DATA PROCESSING ACCORDING TO GESTURE OPERATION - The present invention appropriately determines a detected gesture operation and performs data processing according thereto. In the present invention, a CPU judges a processing status at the time of the detection of a gesture operation performed on a touch panel, and after determining its gesture operation type according to the processing status, performs data processing according to the operation type. In this case, the CPU determines and evaluates a gesture operation type for each processing status, and updates values (evaluation accumulation values) of a flick priority pointer and a tap priority pointer. Then, based on these values of the flick priority pointer and the tap priority pointer, the CPU determines one of the gesture operation types (flick operation/tap operation). | 02-27-2014 |
20140059501 | SCREEN DISPLAY CONTROL METHOD OF ELECTRONIC DEVICE AND APPARATUS THEREFOR - A method and apparatus for zooming in or out and displaying a screen according to a gesture of a user is provided. The method includes sensing gesture input, determining whether the gesture input corresponds to a predetermined pattern of a first semicircle or semi oval shape, and zooming in or out the image displayed on the screen and displaying a zoomed in or zoomed out image on a screen, wherein the zoom ratio is in proportion to a radius of a first semicircle or a radius of a long or short axis of a first semi oval when the gesture input is the pattern of the first semicircle or semi oval shape. | 02-27-2014 |
20140068526 | METHOD AND APPARATUS FOR USER INTERACTION - The subject matter discloses a method of screen navigating; the method comprises the steps of: classifying a gesture of a body organ as an action of screen navigation; capturing an image of a body organ; analyzing said image to determine whether the image matches said gesture of said body organ; and executing said action of screen navigation if said image matches said body organ gesture. | 03-06-2014 |
20140075393 | Gesture-Based Search Queries - An image-based text extraction and searching system extracts an image selected by gesture input from a user, along with the associated image data and proximate textual data, in response to the image selection. Extracted image data and textual data can be utilized to perform or enhance a computerized search. The system can determine one or more database search terms based on the textual data and generate at least a first search query proposal related to the image data and the textual data. | 03-13-2014 |
20140075394 | METHOD AND APPARATUS TO FACILITATE INTEROPERABILITY OF APPLICATIONS IN A DEVICE - A method and an apparatus to facilitate interoperability of applications in a device are provided. The method includes linking at least one application with at least one running application, storing at least one content of the linked applications in a stack, and accessing the stack using a gesture on the device. | 03-13-2014 |
20140082569 | Security System and Methods For Portable Devices - A portable communications or computing device includes one or more sensors for detecting motion in the x, y, and z axes, and/or pitch, yaw, and roll, and/or magnetic direction, and includes hardware and software programmed to allow a user to use one or more motions of the portable device as a security feature such that such motions are needed in order to gain access to the functionality of the device or data stored in memory of the device. A display provides prompts allowing the user to choose to include one or more motions of the device as security, and allows the user to include such one or more motions of the device alone or in combination with one or more alphanumeric characters as a password to restrict access to the functionality of the device and/or data stored in memory on the device. Such motions can be selected from preset motions. Such motions can be used to provide varying levels of security for selected data or other files or functionality of the device. | 03-20-2014 |
20140082570 | WEIGHTED N-FINGER SCALING AND SCROLLING - In one example, a method includes receiving an indication of an input gesture detected at a presence-sensitive input device, where the input gesture includes one or more input points and each input point is detected at a respective location of the presence-sensitive input device. The method may also include determining a focal point of the input gesture, and determining a radius length. The method may also include determining a shape centered at the focal point and having a size determined based on the radius length. The method may also include responding to a change in a geometric property of the shape by scaling information included in a graphical user interface, where the scaling of the information being centered at the focal point. | 03-20-2014 |
20140082571 | INTERACTION METHOD AND DEVICE IN TOUCH TERMINAL, AND INTERACTION METHOD, SERVER AND COMPUTER STORAGE MEDIUM IN NETWORK APPLICATION - The invention relates to an interaction method and a device in a touch terminal, and an interaction method, a server and a computer storage medium in network application. The interaction method in the touch terminal comprises the steps of: acquiring a touch event from a user; acquiring a slide track based on the continuous slide of the touch event, and obtaining a selected interaction object based on the slide track; and triggering the selected interaction object to respond to the touch event based on the slide track. By obtaining the selected interaction object and slide track based on the touch event inputted by the user, and then realizing the response to the touch event via the slide track, the interaction method and device in the touch terminal, and the interaction method, server and storage medium, can achieve interaction operations without operations such as click selection and twice confirmation; the complexity of operation is reduced effectively and the convenience of operation is improved. | 03-20-2014 |
20140089863 | SYSTEMS, METHODS, AND COMPUTER PROGRAM PRODUCTS FOR GESTURE-BASED SEARCH AND DISCOVERY THROUGH A TOUCHSCREEN INTERFACE - A touchscreen-based user interface that allows the user to perform information searching on a mobile information appliance, such as a tablet computer or smart phone. By moving one or two fingers in appropriate direction(s) across the touchscreen, a search may be specified. Moreover, by using the appropriate gestures, a search may be broadened or narrowed by specifying additional search terms. This may be performed iteratively to allow the user to discover information, at various times narrowing or broadening a search. | 03-27-2014 |
20140089864 | Method of Fusing Multiple Information Sources in Image-based Gesture Recognition System - A method of interpreting input from a user includes providing a surface within reach of a hand of the user. A plurality of locations on the surface that are touched by the user are sensed. An alphanumeric character having a shape most similar to the plurality of touched locations on the surface is determined. The determining includes collecting information associated with hand region localized modules, and modeling the information using statistical models. The user is informed of the alphanumeric character and/or a word in which the alphanumeric character is included. Feedback is received from the user regarding whether the alphanumeric character and/or word is an alphanumeric character and/or word that the user intended to be determined in the determining step. | 03-27-2014 |
20140089865 | HANDWRITING RECOGNITION SERVER - A gesture input application adapted for translating gesture input into font characters. A web application, such as a webpage, embedded with the gesture application is served over a network to one or more computing devices for local execution of the gesture input application by a web browsing software on the computing device. The web application includes rules for styling the webpage on the computing device and the source code for the gesture input application. The computing device executing the web application with the web browsing software receives from a user a gesture input and translates the gesture input into at least one standard font character as an input to the web application. | 03-27-2014 |
20140089866 | COMPUTING SYSTEM UTILIZING THREE-DIMENSIONAL MANIPULATION COMMAND GESTURES - A computing system utilizing three-dimensional manipulation command gestures. An embodiment of an apparatus includes a sensing element to sense a presence or movement of a user of the apparatus, a processor, wherein operation of the processor includes interpretation of command gestures of the user to provide input to the apparatus, and a display screen to provide a display. The command gestures include one or more command gestures to manipulate at least a portion of the display, the one or more command gestures being gestures including motion along an axis between the display screen and the user. | 03-27-2014 |
20140089867 | MOBILE TERMINAL HAVING TOUCH SCREEN AND METHOD FOR DISPLAYING CONTENTS THEREIN - A mobile terminal having a touch screen and a method for displaying contents therein are provided. The method for displaying contents in a mobile terminal having a touch screen includes determining whether a touch action moves when the touch action is sensed on displayed contents, calculating a physical display change amount for changing and displaying the contents according to the touch action when the touch moves, and continuously changing and displaying the contents according to the calculated physical display change amount when the touch action stops. | 03-27-2014 |
20140096091 | SYSTEMS AND METHODS FOR THREE-DIMENSIONAL INTERACTION MONITORING IN AN EMS ENVIRONMENT - A method for tracking interactions in an emergency response environment according to embodiments of the present invention includes receiving color images and depth information from within a field of view of a sensor array; maintaining an emergency encounter record; monitoring one or both of a position of an object and movement of the object in the emergency response environment based on the color images and depth information received by the sensor array; and recording an occurrence of a condition in the emergency encounter record, wherein the condition is based on the one or both of the position of the object and the movement of the object. | 04-03-2014 |
20140096092 | System and Method for Indirect Manipulation of User Interface Object(s) - Provided is a system and method for indirectly manipulating user interface object(s) of a user interface. In a pressure sensitive display embodiment, a user maintains a convenient touch position to a display, performs a search gesture (or selection gesture), and user interface object(s) are identified as satisfying the search criteria (or as selected). Upon being identified, the user interface object(s) are acted upon as though the user were interacting with each object(s) by touching them directly, although further gestured actions are located remote and away from the object(s) at the time of acting upon the object(s). Further provided is the ability to assign the identified object(s) to a remote device for remote user manipulation, for example using a smartphone. Many remote users may each manipulate their own subset of object(s) simultaneously in the same display system, for example facilitating classroom or team collaboration. | 04-03-2014 |
20140101619 | Denoising Touch Gesture Input - In one embodiment, a computing device determines a touch gesture on a touch screen of the computing device. The touch gesture includes two or more data points that each correspond to a particular location on the touch screen and a particular point in time. For each of one or more of the data points, the computing device adjusts a time value representing its particular point in time. For each of one or more of the data points, the computing device adjusts a position value representing its particular location on the touch screen. The computing device fits a curve to the two or more data points to determine a user intent associated with the touch gesture. | 04-10-2014 |
20140101620 | METHOD AND SYSTEM FOR GESTURE IDENTIFICATION BASED ON OBJECT TRACING - A method and system provide light to project to an operation space so that a received image from the operation space will include, if an object is in the operation space, a bright region due to the reflection of light by the object, and identify a gesture according to the variation of a barycenter position, an average brightness, or an area of the bright region in successive images, for generating a corresponding command. Only simple operation and calculation is required to detect the motion of an object moving in the X, Y, or Z axis of an image, for identifying a gesture represented by the motion of the object. | 04-10-2014 |
20140101621 | MOBILE TERMINAL BROWSER PAGE REFRESHING METHODS AND MOBILE TERMINALS - Methods and mobile terminals for refreshing a page in a browser on a mobile terminal are provided herein. In an exemplary method, an external shake instruction can be received from a user. According to the shake instruction, acceleration values of the mobile terminal can be obtained. When at least one of the acceleration values of the mobile terminal exceeds a preset threshold value, the page in the browser on the mobile terminal can be refreshed. | 04-10-2014 |
20140109018 | GESTURE ENTRY TECHNIQUES - Techniques are provided for entering, verifying, and saving a gesture on a touch-sensitive display device. In one embodiment, the device displays a gesture entry screen where a user enters a gesture. The device estimates the entered gesture and displays the estimated gesture on a gesture replay screen. The estimated gesture may be replayed repeatedly until stopped, and the device may display a gesture verification screen where the user may reenter the gesture. The device verifies if the re-entered gesture is substantially the same as the original estimated gesture. Some embodiments include a visible trace following a user's touch on the touch-sensitive display, where the trace may change in color and/or length depending on the speed, duration, and/or complexity of an entered gesture. Some embodiments include display indicator(s) (e.g., a strength bar, color change, timer, etc.) to indicate the strength and/or elapsed time during an entry or replay of a gesture. | 04-17-2014 |
20140109019 | ELECTRONIC DEVICE INCLUDING TOUCH-SENSITIVE DISPLAY AND METHOD OF CONTROLLING SAME - A method includes displaying a plurality of display elements on a touch-sensitive display of an electronic device; displaying a selection tool on the touch-sensitive display; in response to detecting a first gesture, selecting a first portion of the plurality of display elements, the first portion comprising at least one display element; in response to detecting a second gesture, moving the selection tool without selecting the display elements; and in response to detecting a third gesture, selecting a second portion of the plurality of display elements, the second portion being non-contiguous with the first portion. | 04-17-2014 |
20140109020 | METHOD FOR GENERATING A GRAPHICAL USER INTERFACE - A method for generating a graphical user interface object, the method comprising the steps of awaiting a user's gesture input from a gesture input interface; providing information listing at least one gesture type wherein at least two threshold values of at least one parameter are assigned to the given gesture type; verifying whether the input gesture matches a parameterized gesture type; in case the verification confirms that the input gesture matches a parameterized gesture type, extracting the gesture type and the gesture parameter's value from a gesture event notification; identifying an associated action based on the gesture type and parameter; and generating an output signal with a differently configured graphical user interface object content dependent on the gesture type and the gesture parameter. | 04-17-2014 |
20140109021 | METHOD FOR OPERATING A GESTURE-CONTROLLED GRAPHICAL USER INTERFACE - A method for operating a gesture-controlled graphical user interface for a device with a display screen interface | 04-17-2014 |
20140109022 | Touch Operation Processing Method and Terminal Device - A touch operation processing method and a terminal device. The method includes: detecting a touch operation of a user, which starts from a border of a screen display area to the screen display area, and using the first point touched by the touch operation in the screen display area as a starting point; and performing, according to the touch operation, reduction processing on an operation interface displayed in the screen display area, where one edge of an operation interface after the reduction processing includes the starting point. Therefore, the method meets the demand that a user holding a large-screen terminal device with one hand can trigger reduction processing on the operation interface and perform a selection operation at an arbitrary position in the entire screen display area of the terminal device. | 04-17-2014 |
20140109023 | ASSIGNING GESTURE DICTIONARIES - Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures. | 04-17-2014 |
20140109024 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND COMPUTER PROGRAM PRODUCT - An information processing apparatus includes a detection unit configured to detect a position of a manipulation body on a display screen and a control unit configured to, if a first manipulation on the display screen in a locked state is detected, display, on the display screen, icons to be arranged based on a reference point indicating a position of the manipulation body when the first manipulation is completed, and if a second manipulation indicating a direction from the reference point is detected, start up an application corresponding to the icon specified by the second manipulation. An associated method and computer readable storage medium are also described. | 04-17-2014 |
20140109025 | HIGH PARAMETER-COUNT TOUCH-PAD CONTROLLER - An apparatus including a touch user interface device including a sensor array configured to sense spatial information associated with one or more contiguous regions of contact and a processing device in communication with the touch user interface device configured to discern a first contiguous region of contact corresponding to a first finger from the one or more contiguous regions of contact, determine a first spatial distribution profile of the first contiguous region at a first time, determine a second spatial distribution profile of the first contiguous region at a second time, analyze a shape variation of the second spatial distribution profile in comparison to the first spatial distribution profile to determine a rotational movement of the first finger, generate a control signal in response to the detected rotational movement. | 04-17-2014 |
20140115544 | METHOD FOR ZOOMING SCREEN AND ELECTRONIC APPARATUS AND COMPUTER READABLE MEDIUM USING THE SAME - A method for zooming a screen, an electronic apparatus, and a computer readable medium using the same are provided. The method is adapted to an electronic apparatus having a touch screen. Firstly, a zooming gesture is detected by utilizing the touch screen. Next, a screen object acted on by the zooming gesture on the touch screen is searched. Afterwards, a specific part of the screen object and a screen edge of the touch screen are used as a reference to zoom all of the objects displayed in a screen of the touch screen, such that a height of the specific part of the screen object is constantly maintained during zooming of the screen. | 04-24-2014 |
20140115545 | Method for Identifying Operation Mode of Handheld Device and Handheld Device - The embodiments of the present invention provide a method for identifying an operation mode of a handheld device and a handheld device, and relate to the field of electronic technologies, to address operational inconvenience caused by a difference between left and right hand usage habits. The method includes obtaining left and right hand operation information of a user. If the operation information indicates that the user uses the right hand to perform an operation, displaying a first preset operation interface that conforms to a right hand operation habit on a screen of the handheld device, and if the operation information indicates that the user uses the left hand to perform an operation, displaying a second preset operation interface that conforms to a left hand operation habit on the screen of the handheld device. The embodiments of the present invention are applicable to operating a handheld device. | 04-24-2014 |
20140123077 | SYSTEM AND METHOD FOR USER INTERACTION AND CONTROL OF ELECTRONIC DEVICES - A system and method for close range object tracking are described. Close range depth images of a user's hands and fingers are acquired using a depth sensor. Movements of the user's hands and fingers are identified and tracked. This information is used to permit the user to interact with a virtual object, such as an icon or other object displayed on a screen, or the screen itself. | 05-01-2014 |
20140123078 | MOBILE COMMUNICATIONS DEVICE, NON-TRANSITORY COMPUTER-READABLE MEDIUM AND METHOD OF SWITCHING SCREEN OF MOBILE COMMUNICATIONS DEVICE FROM SCREEN LOCKED STATE TO SCREEN UNLOCKED STATE - A method of switching a screen of a mobile communications device from a screen locked state to a screen unlocked state is provided. The mobile communications device includes a display panel configured to display the screen that has a background section and a first section in the screen locked state. The method includes: moving the first section from a first location to a second location of the screen in the screen locked state when a screen unlocking requirement is determined to be satisfied; and keeping displaying the first section at the second location of the screen after switching the screen from the screen locked state to the screen unlocked state. A non-transitory computer-readable medium and a mobile communications device for switching a screen of the mobile communications device from a screen locked state to a screen unlocked state are also provided. | 05-01-2014 |
20140123079 | DRAWING CONTROL METHOD, APPARATUS, AND MOBILE TERMINAL - In a drawing control method and a terminal provided in the embodiments, a gesture track input by a user is detected, then first attribute information about the gesture track is acquired, and then the gesture track is recognized according to a preset rule and the first attribute information about the gesture track, so as to acquire second attribute information about the gesture track, and finally the gesture track is presented according to the second attribute information about the gesture track; in this manner, part of the feature information required for presenting the track is carried in the first attribute information about the gesture track, so the user does not need to frequently and manually switch among various options to input the gesture track, thereby solving the problem of complexity in drawing operations. | 05-01-2014 |
20140123080 | Electrical Device, Touch Input Method And Control Method - An electrical device, a touch input method, and a control method thereof are provided. The touch input method is applied to the electrical device. The electrical device includes a display unit and a touch sensing unit arranged on top of the display unit; the touch area of the touch sensing unit is overlaid with the display area of the display unit, the display unit is used to display objects of the electrical device in the display area, and the touch area is divided into a first area and a second area that do not overlap each other. The touch input method comprises: detecting a gesture input; determining whether the start point of the gesture input is in the first area or the second area; generating a system management command corresponding to the gesture input when the start point of the gesture input is determined to be in the second area; generating an object operation command used to operate the object corresponding to the gesture input when the start point of the gesture input is determined to be in the first area; and executing the system management command or the object operation command. | 05-01-2014 |
20140123081 | DISPLAY APPARATUS AND METHOD THEREOF - A display method of a display apparatus is provided. The display method includes displaying an image on a screen, detecting a touch manipulation with respect to the image, and if the touch manipulation is detected, changing a display status of the image according to a physical attribute of the touch manipulation. | 05-01-2014 |
20140129993 | TILTABLE USER INTERFACE - A programmable effects system for graphical user interfaces is disclosed. One embodiment comprises adjusting a graphical user interface in response to a tilt of a device. In this way, a graphical user interface may display a parallax effect in response to the device tilt. | 05-08-2014 |
20140129994 | Variable Device Graphical User Interface - Methods, systems, devices, and apparatus, including computer program products, for adjusting a graphical user interface. A motion of a device is detected. A graphical user interface of the device is adjusted in response to the detected motion. | 05-08-2014 |
20140129995 | CHANGING OF LIST VIEWS ON MOBILE DEVICE - Various embodiments related to a hand-held mobile computing device are disclosed. One disclosed embodiment comprises a hand-held mobile computing device having a touch-sensitive display forming a surface of the hand-held mobile computing device configured to receive touch input. The hand-held mobile computing device further comprises a processor and memory comprising code executable by the processor to display a scrollable list of items in a first content viewing mode having a first set of content for each of the items in the scrollable list, to detect a dynamic multi-touch gesture over the scrollable list of items, and to change the scrollable list of items to a second content viewing mode responsive to detection of the dynamic multi-touch gesture, wherein the second content viewing mode comprises a second set of content for each item in the scrollable list of items. | 05-08-2014 |
20140137052 | SYSTEM FOR CAPTURING AND REPLAYING SCREEN GESTURES - A capture system may capture client events for an application session. Some client events may contain display information associated with screen gestures. The screen gestures may be associated with any user input that changes how images are displayed during the application session. For example, the screen gestures may comprise one or more of a scroll gesture, a touch start gesture, a touch move gesture, a touch end gesture, and/or a pinch gesture. In another example the screen gesture may comprise a reorientation of a device operating in the application session. A replay system may replay the application session based on the captured client events to recreate images displayed during the application session in response to the screen gestures. | 05-15-2014 |
20140137053 | Information Processing Method And Information Processing Device - An information processing method and an information processing device, used in an electronic device that has a display area and an operation detecting area corresponding to the display area, are described. The information processing method includes detecting the gesture operation in the operation detecting area when the first operation object and the second operation object are displayed on the display area; generating a corresponding operating instruction based on the gesture operation when the gesture operation corresponds to the first operation detecting area; performing the operation corresponding to the operating instruction on the first operation object when the operation object parameter indicates the first operation object; and performing the operation corresponding to the operating instruction on the second operation object when the operation object parameter indicates the second operation object. | 05-15-2014 |
20140143738 | SYSTEM AND METHOD FOR APPLYING GESTURE INPUT TO DIGITAL CONTENT - A system and method for managing messages within an application interface that includes receiving a message stream; providing a navigational menu to a set of message streams; detecting an initiated gesture item for at least one message within a view of the navigational menu; tracking gesture-state within a progressive order of gesture-states; identifying an action of the gesture-state wherein the action corresponds to the current view and relative ordering of the navigational menu; and applying the message sorting action on the message according to a final gesture-state. | 05-22-2014 |
20140149946 | SELECTIVE SHARING OF DISPLAYED CONTENT IN A VIEW PRESENTED ON A TOUCHSCREEN OF A PROCESSING SYSTEM - Arrangements described herein relate to sharing a view presented on a touchscreen of a processing system. Whether a show gesture state is enabled on the processing system and whether a gesture event gate is open on the processing system are determined. The show gesture state determines whether a gesture detected by the touchscreen is depicted onto a version of the view shared with another processing system. The gesture event gate determines whether a corresponding gesture event is passed to an application that is active in the view. | 05-29-2014 |
20140149947 | MULTI-TOUCH INTERFACE FOR VISUAL ANALYTICS - A system and method for facilitating adjusting a displayed representation of a visualization. An example method includes employing a touch-sensitive display to present a user interface display screen depicting a first visualization; and providing a first user option to apply touch input to a region of the user interface display screen coinciding with a portion of the first visualization to facilitate affecting an arrangement of data displayed via the first visualization, wherein the touch input includes a multi-touch gesture. In a more specific embodiment, the touch gesture includes a rotation gesture, and the method further includes displaying a visual indication of a change, e.g., a pivot operation, to be applied to a second visualization as a user performs the rotation gesture, and updating the second visualization as a user continues to perform the rotation gesture. The first visualization is updated based on the second visualization upon completion of the rotation gesture. | 05-29-2014 |
20140149948 | DISPLAY APPARATUS AND METHOD OF CONTROLLING THE SAME - A display apparatus and a method of controlling the same include: a signal receiver which receives a video signal; a signal processor which processes the video signal; a display unit which displays an image based on the video signal; a motion detector which detects a user's motion input with regard to the display unit; and a controller which controls a writing operation to be implemented on the display unit with regard to a user's first motion input forming a predetermined trajectory on a screen of the display unit and controls a preset function to be implemented corresponding to the writing operation with regard to a user's second motion input that gets closer to or farther from the screen of the display unit. | 05-29-2014 |
20140149949 | SELECTIVE SHARING OF DISPLAYED CONTENT IN A VIEW PRESENTED ON A TOUCHSCREEN OF A PROCESSING SYSTEM - A method for sharing a view presented on a touchscreen of a processing system. Whether a show gesture state is enabled on the processing system and whether a gesture event gate is open on the processing system are determined. The show gesture state determines whether a gesture detected by the touchscreen is depicted onto a version of the view shared with another processing system. The gesture event gate determines whether a corresponding gesture event is passed to an application that is active in the view. | 05-29-2014 |
20140149950 | IMAGE OVERLAY-BASED USER INTERFACE APPARATUS AND METHOD - An apparatus includes a display unit configured to display a user interface screen in an area, and a sensing unit configured to capture an image of a motion of an object used to enter user input. The apparatus further includes an input recognition unit configured to lay the image over the user interface screen, and recognize the user input based on the image laid over the user interface screen, and a processing unit configured to process an operation corresponding to the recognized user input. | 05-29-2014 |
20140149951 | METHOD AND APPARATUS CONTINUING ACTION OF USER GESTURES PERFORMED UPON A TOUCH SENSITIVE INTERACTIVE DISPLAY IN SIMULATION OF INERTIA - A method and apparatus for identifying user gestures to control an interactive display identifies gestures based on a bounding box enclosing points at which a user contacts a touch sensor corresponding with the display surface and permits use of inexpensive and highly reliable grid-based touch sensors that provide a bounding box to describe contact information. In identifying gestures, position, motion, shape, and deformation of the bounding box may be considered. Center, width, height, aspect ratio, length and orientation of the bounding box diagonal may be determined. A stretch factor, defined as the maximum of the ratio of the height of the bounding box to the width of the bounding box and the ratio of the width of the bounding box to the height of the bounding box, may also be computed. Gestures may be identified based on the changes in time of these characteristics and quantities. | 05-29-2014 |
20140157209 | SYSTEM AND METHOD FOR DETECTING GESTURES - A system and method that includes detecting an application change within a multi-application operating framework; updating an application hierarchy model for gesture-to-action responses with the detected application change; detecting a gesture; according to the hierarchy model, mapping the detected gesture to an action of an application; and triggering the action. | 06-05-2014 |
20140157210 | Gesture Based Interface System and Method - A user interface apparatus for controlling any kind of a device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system which analyzes images obtained by the image sensor to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions. | 06-05-2014 |
20140165012 | SINGLE-GESTURE DEVICE UNLOCK AND APPLICATION LAUNCH - A computing device can be unlocked and an application selected for execution with a single gesture. The single gesture can comprise a portion of an unlock gesture and an application selection gesture. An unlock-and-launch user interface can comprise a plurality of tracks and a user can unlock a device and select an application by first moving an icon in a first direction along a first track from a starting position and then along a second track in a second direction. A user can unlock a device and launch an application by supplying an unlock gesture and then selecting an application icon from a series of icons presented while the user's finger or stylus remains in contact with the touchscreen. Applications to be included in an unlock-and-launch interface can be selected by the user, or automatically selected by the device based on application usage and/or device context. | 06-12-2014 |
20140165013 | ELECTRONIC DEVICE AND PAGE ZOOMING METHOD THEREOF - A page zooming method for an electronic device includes, firstly, generating operation signals in response to a touch operation applied on a page displayed on a touch screen of the electronic device. Secondly, one touch operation is determined to be a zoom-triggering gesture when a duration of time of pressing the touch screen on one touching point is longer than a first preset duration of time. Thirdly, one slide operation is identified according to the operation signals generated by the touch screen in response to the slide operation. Next, a zooming type and a zooming ratio are determined according to the zoom-triggering gesture and the slide operation. Lastly, the page is zoomed according to the zooming type and zooming ratio. An electronic device using the page zooming method is also provided. | 06-12-2014 |
20140165014 | TOUCH DEVICE AND CONTROL METHOD THEREOF - A touch device includes a touch display screen, a detection device, and a main control unit. The touch display screen displays an operation interface including at least one shortcut mark and executes a corresponding function when the at least one shortcut mark is operated on. The detection device detects whether the touch device is held by a left hand, a right hand, or both hands of a user in a landscape or portrait orientation, and outputs detection signals based on the detection. The main control unit adjusts a display location of the at least one shortcut mark based on the detection signals. | 06-12-2014 |
20140173528 | CONTACT ENVIRONMENTS WITH DYNAMICALLY CREATED OPTION GROUPS AND ASSOCIATED COMMAND OPTIONS - Disclosed herein are systems, methods, and software for facilitating contact environments. In one implementation, a computing system presents a view of a contact environment comprising a canvas and a plurality of contacts arranged on the canvas. Responsive to a plurality of inclusion gestures, each gesture indicative of an intention to include a different one of the plurality of contacts in an option group, the computing system identifies which of the plurality of contacts to include in the option group. The computing system presents a modified view of the contact environment comprising a plurality of included contacts identified in response to the plurality of inclusion gestures. | 06-19-2014 |
20140173529 | CIRCULAR GESTURE FOR TOUCH SENSITIVE UI CONTROL FEATURE - Techniques are disclosed for providing a circular gesture mode in electronic touch sensitive devices. The user can engage the mode with a particular gesture that includes a combination of contact points that uniquely identify that the circular gesture mode is desired. The combination may include, for example, a press-and-hold activation contact point in conjunction with one or more additional contact points moving in a circular motion, or multiple contact points moving in a circular motion. The circular gesture can be used to trigger, for instance, specific functions within a given application, and/or within different applications. Clockwise movement can be used to cause one type of change, while counter-clockwise motion can be used to cause another type. Changing pages, sections, and chapters of a book, changing the volume of an audio application, changing a tool within a given application, or changing from one application to another are example uses. | 06-19-2014 |
20140173530 | TOUCH SENSITIVE DEVICE WITH PINCH-BASED EXPAND/COLLAPSE FUNCTION - Techniques are disclosed for expanding and collapsing content in electronic touch sensitive devices. The expand/collapse function can be used to navigate through content that may be displayed on a screen. The user can engage the function with a pinch-based gesture. In some cases, the UI feature includes a reading pane that displays a sample of the previously hidden detailed data relating to the content currently displayed on the screen. Background content outside the reading pane can be faded or otherwise softened. In some cases, the UI feature also includes, or alternatively includes, a tap-expand feature that allows the user to select one or more items that are intended to be expanded or collapsed. Such a feature can include an initial single contact, followed by an inward or outward pinch with at least one additional contact point, in some example cases. | 06-19-2014 |
20140173531 | USER INTERFACE - An apparatus, method and computer program product for: receiving a trace input entered on a surface; receiving movement data describing the movement of the surface during entry of the trace input; and modifying the trace input based on the movement data. | 06-19-2014 |
20140173532 | DISPLAY CONTROL APPARATUS, DISPLAY CONTROL METHOD, AND STORAGE MEDIUM - Among display apparatuses each including a touch UI, some correct or suppress the direction of display position shift during a swipe operation. However, it is inconvenient that the correction or suppression is sometimes performed at an undesired timing. The display position shift is suppressed depending on an attribute of a displayed object. | 06-19-2014 |
20140181758 | System and Method for Displaying Characters Using Gestures - A method and a system are provided for displaying a character on an electronic device. The method includes displaying a multi-segment display on a touch-sensitive display. A touch gesture is detected over the multi-segment display. The electronic device activates segments of the multi-segment display that correspond to the touch gesture, and then displays the character that corresponds to the activated segments. In an example embodiment, the multi-segment display is a seven-segment display. | 06-26-2014 |
20140181759 | CONTROL SYSTEM AND METHOD USING HAND GESTURE FOR VEHICLE - A control system and method using a hand gesture for a vehicle are provided. The method includes extracting, by a processor, a hand gesture image from a captured hand image within the vehicle and representing the extracted hand gesture image by overlapping the image on a windshield of the vehicle. In addition, the method includes representing a graphic screen on the windshield and operating devices within the vehicle by manipulating a graphic screen according to the hand gesture image. | 06-26-2014 |
20140181760 | ELECTRONIC DEVICE AND METHOD FOR UNLOCKING TOUCH SCREEN THEREOF - A method for unlocking a touch screen of an electronic device includes the following steps. First, touch signals are generated in response to manual touch operations on the touch screen. Second, a shape is identified according to the touch signals. Third, the touch screen is unlocked if the identified shape is substantially the same as a predefined shape, wherein the predefined shape is a shape of two fingertips simultaneously sliding in a direction from up to down on the touch screen. An electronic device equipped for unlocking a touch screen thereof is also provided. | 06-26-2014 |
20140189602 | METHOD AND ASSOCIATED SYSTEM FOR DISPLAYING GRAPHIC CONTENT ON EXTENSION SCREEN - A method for displaying graphic content on an extension screen, comprising: when a detected user activity at a host screen is matched to one of a plurality of gestures, providing a gesture icon in response to the matched gesture; and providing a shared content according to an original content, such that a combined content, which is a combination of the shared content and the gesture icon, can be displayed on the extension screen. An associated system is also disclosed. | 07-03-2014 |
20140189603 | Gesture Based Partition Switching - Generally, a user records one or more user-specific gestures to enable switching from one partition to another using the gesture recorder. Information relating to the recorded gestures is stored in a data storage. Once a user-defined gesture is recorded, a context switcher detects a user performed gesture corresponding to the recorded gesture. Then the context switcher automatically switches from one environment to another environment. | 07-03-2014 |
20140189604 | METHOD AND SYSTEM FOR UNLOCKING A TOUCHSCREEN OF AN ELECTRONIC DEVICE - A method and system for unlocking a touchscreen of an electronic device includes detecting a sequence of one or more press gestures on the touchscreen; and responsive to determining that each individual press gesture comprising the sequence of one or more press gestures is applied for a specific period of time to a specific area of the touchscreen, unlocking the touchscreen. | 07-03-2014 |
20140189605 | Method for Controlling the Magnification Level on a Display - A method for controlling a screen in a data processing system to generate displays that include portions of an underlying scene is disclosed. The method includes displaying on the screen a first displayed scene, detecting a long touch gesture, and displaying a second scene on the screen. The first displayed scene is characterized by a first magnification and a first offset relative to the underlying scene. The second displayed scene includes a different portion of the underlying scene and is characterized by a second magnification that is different from the first magnification. In one aspect of the invention, the second displayed scene is characterized by a second offset that is determined by the long touch gesture, and the second offset depends on the first displayed scene and the long touch gesture. | 07-03-2014 |
20140189606 | USER INTERFACE FOR A COMPUTING DEVICE - There is disclosed a smartphone, tablet or other computing device comprising: (a) a touch sensitive display; (b) one or more processors; (c) computer memory; (d) one or more computer programs stored in the computer memory and configured to be executed by the one or more processors and including instructions for detecting a swipe in from each and any of the four edges of the touch sensitive display and, in response to the detected swipe, causing the device to behave in a manner that depends on the specific edge swiped-in from (e.g. left, right, top or bottom). | 07-03-2014 |
20140189607 | USER INTERFACE FOR A COMPUTING DEVICE - There is disclosed a smartphone, tablet or other computing device comprising: (a) a touch sensitive display; (b) one or more processors; (c) computer memory; (d) one or more computer programs stored in the computer memory and configured to be executed by the one or more processors and including instructions for detecting a single gestural input, and, in response to detecting the single gestural input, causing two or more different outcomes or functions to be triggered or invoked, depending on the detected extent of the gestural input. | 07-03-2014 |
20140189608 | USER INTERFACE FOR A COMPUTING DEVICE - There is disclosed a smartphone, tablet or other computing device comprising: (a) a touch sensitive display; (b) one or more processors; (c) computer memory; (d) one or more computer programs stored in the computer memory and configured to be executed by the one or more processors and including instructions for detecting a swipe in from one or more edges and then unlocking or making accessible the device from a screen in response to the detected swipe. | 07-03-2014 |
20140189609 | METHOD FOR CONTROLLING TWO OR THREE DIMENSIONAL FIGURE BASED ON TOUCH AND APPARATUS THEREOF - A method for controlling a figure based on a touch includes recognizing a figure which is input by handwriting on a touch screen, displaying one or more variable points to which a controlling function of a figure is mapped on a trajectory of the recognized figure, detecting a touch input that selects one of the variable points, determining whether a controlling function execution request of the recognized figure is detected based on at least one of pressure information and touch gesture information corresponding to the detected touch input, and executing the controlling function of the figure corresponding to the selected variable point when the controlling function execution request is detected. | 07-03-2014 |
20140189610 | UNIVERSAL SCRIPT INPUT DEVICE & METHOD - A method providing for input of any script/language, on any computing device, mobile or otherwise, by conveying Unicode characters to the computing device instead of keyboard scan codes that require further processing. The method keeps all script/language processing independent from the computing device, permits changing input script/language “on-the-fly”, provides for a universal platform-independent method to select each particular script/language, and requires no language-specific support on the computing device, other than the ability to display the selected script. The method also provides for input of commands and backward-compatible input using legacy keyboard key codes. | 07-03-2014 |
20140195987 | Moving A Virtual Object Based on Tapping - A mobile device enables refined selections of displayed virtual objects by responding to a user's tapping actions on the sides of the device. The device can move the object by a small increment in a direction opposite the tapped surface, as though the tapping were gently nudging the object away from that surface. For example, if the user taps on the right side of the device, then the device can responsively move a currently selected object leftward by one pixel. Conversely, if the user taps on the left side of the device, then the device can responsively move the currently selected object rightward by one pixel. Similar movements of similar magnitude and in expected directions can be achieved by tapping the top or bottom of the device. Thus, a currently selected object can be moved in a more refined and precise manner than might be possible using a touchscreen alone. | 07-10-2014 |
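The per-tap nudge described in entry 20140195987 above reduces to a small lookup of direction vectors. A minimal sketch, with hypothetical names and assuming screen coordinates (y increasing downward), not taken from the patent itself:

```python
# Direction each tapped side nudges the selected object, in screen
# coordinates (x grows rightward, y grows downward). Tapping a side
# moves the object one pixel away from that side.
NUDGE = {
    "right": (-1, 0),   # tap on right side -> move left
    "left": (1, 0),     # tap on left side -> move right
    "top": (0, 1),      # tap on top -> move down (assumed direction)
    "bottom": (0, -1),  # tap on bottom -> move up (assumed direction)
}

def nudge(position, tapped_side):
    """Return the object's new (x, y) after a single tap on a side."""
    dx, dy = NUDGE[tapped_side]
    return position[0] + dx, position[1] + dy
```

The abstract only fixes the left/right behavior explicitly; the top/bottom vectors follow its "expected directions" wording under the screen-coordinate assumption above.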
20140195988 | OPERATING ENVIRONMENT COMPRISING MULTIPLE CLIENT DEVICES, MULTIPLE DISPLAYS, MULTIPLE USERS, AND GESTURAL CONTROL - Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across at least one of the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The gesture data is absolute three-space location data of an instantaneous state of the at least one object at a point in time and space. The detecting comprises aggregating the gesture data, and identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control at least one of the display devices and the remote client devices in response to the gesture signal. | 07-10-2014 |
20140195989 | INPUT DEVICE, DISPLAY DEVICE AND METHOD OF CONTROLLING THEREOF - Exemplary embodiments may disclose an input device, a display device, and methods of controlling thereof. More particularly, exemplary embodiments may disclose an input device, a display device, and methods of controlling thereof, which inputs a command using gestures. The input device may include: a sensor configured to sense a location and a movement of the input device in response to a gesture command being input; a controller configured to convert the sensed movement into a physical quantity, create a cursor movement command when a size of the physical quantity is less than a predetermined threshold value, and create a predetermined control command when the size of the physical quantity is more than the predetermined threshold value; and a communicator configured to transmit the created commands to a display device. | 07-10-2014 |
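The controller rule in entry 20140195989 — a cursor-movement command when the converted physical quantity falls below a threshold, a control command when it exceeds it — can be sketched as follows (the threshold value and the names are illustrative; the patent leaves them unspecified):

```python
THRESHOLD = 5.0  # illustrative value; not specified in the abstract

def create_command(magnitude: float) -> str:
    """Convert a sensed movement magnitude into a command type,
    following the threshold rule described in the abstract: small
    movements drive the cursor, large ones trigger a control command."""
    if magnitude < THRESHOLD:
        return "cursor_move"
    return "control_command"
```

In the abstract's architecture this string would then be handed to the communicator for transmission to the display device.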
20140195990 | MOBILE DEVICE SYSTEM PROVIDING HYBRID WIDGET AND ASSOCIATED CONTROL - A control method of a mobile device providing hybrid widgets includes: displaying a first hybrid widget on a touch screen; detecting a gesture made via said touch screen; and in response to detecting the gesture, converting the first hybrid widget into a second hybrid widget, and displaying the second hybrid widget on the touch screen. | 07-10-2014 |
20140195991 | BOUNDING BOX GESTURE RECOGNITION ON A TOUCH DETECTING INTERACTIVE DISPLAY - The invention provides a method and apparatus for identifying gestures performed by a user to control an interactive display. The gestures are identified based on a bounding box enclosing the points at which a user contacts a touch sensor corresponding with the display surface. The invention thus permits the use of inexpensive and highly reliable grid-based touch sensors that provide a bounding box to describe contact information. In identifying the gestures, the position, motion, shape, and deformation of the bounding box may all be considered. In particular, the center, width, height, aspect ratio, length of the diagonal, and orientation of the diagonal of the bounding box may be determined. A stretch factor, defined as the maximum of the ratio of the height of the bounding box to the width of the bounding box and the ratio of the width of the bounding box to the height of the bounding box, may also be computed. Finally, gestures may be identified based on the changes in time of these characteristics and quantities. | 07-10-2014 |
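Entry 20140195991 defines its stretch factor explicitly: the maximum of height/width and width/height of the bounding box enclosing the contact points. A direct sketch of those two quantities, with hypothetical helper names:

```python
def bounding_box(points):
    """Axis-aligned bounding box (min_x, min_y, max_x, max_y) of the
    contact points reported by a grid-based touch sensor."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return min(xs), min(ys), max(xs), max(ys)

def stretch_factor(width: float, height: float) -> float:
    """Maximum of height/width and width/height, per the abstract's
    definition; always >= 1 for a non-degenerate box."""
    return max(height / width, width / height)
```

Gesture identification in the abstract then tracks how these quantities (together with center, diagonal length, and orientation) change over time.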
20140201688 | USER INTERFACE - GESTURAL TOUCH - A method for controlling a domestic appliance that treats contents, said domestic appliance having a display, a controller, and at least one user input component, the method comprising: displaying a first screen on the display; detecting user input to the at least one user input component; correlating the user input to a gesture type with the controller; determining the gesture type from among a plurality of gesture types with the controller; and displaying a second screen on the display according to at least the gesture type, wherein said second screen is different from said first screen. | 07-17-2014 |
20140201689 | FREE-SPACE USER INTERFACE AND CONTROL USING VIRTUAL CONSTRUCTS - During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes may be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct may be updated, continuously or from time to time, based on the control object's location. | 07-17-2014 |
20140201690 | DYNAMIC USER INTERACTIONS FOR DISPLAY CONTROL AND SCALING RESPONSIVENESS OF DISPLAY OBJECTS - The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing with the 3D sensory space, or virtual object density in the gestural interface. The technology disclosed further relates to detecting if a user has intended to interact with a virtual object based on measuring a degree of completion of gestures and creating interface elements in the 3D space. | 07-17-2014 |
20140208274 | CONTROLLING A COMPUTING-BASED DEVICE USING HAND GESTURES - Methods and system for controlling a computing-based device using both input received from a traditional input device (e.g. keyboard) and hand gestures made on or near a reference object (e.g. keyboard). In some examples, the hand gestures may comprise one or more hand touch gestures and/or one or more hand air gestures. | 07-24-2014 |
20140208275 | COMPUTING SYSTEM UTILIZING COORDINATED TWO-HAND COMMAND GESTURES - A computing system utilizing coordinated two-hand command gestures. An embodiment of an apparatus includes a sensing element to sense a presence or movement of a user of the apparatus, including the sensing of command gestures by the user to provide input to the apparatus, a processor, wherein operation of the processor includes interpretation of the command gestures of the user and the implementation of actions in response to the command gestures, and a display screen to display elements to the user, where the display screen may display one or more elements used in command gestures, where command gestures include coordinated two-hand command gestures of the user. | 07-24-2014 |
20140208276 | APPARATUS AND METHOD FOR PERFORMING MULTI-TASKING - Disclosed is an apparatus and method for performing multi-tasking, and more particularly a multi-tasking performance apparatus and method which easily enable a shift between a plurality of applications being executed and a shift into an initial application. A currently operating application window is displayed simultaneously with at least one executed application window by overlapping the executed application window on the currently operating application window according to a predetermined format. | 07-24-2014 |
20140208277 | INFORMATION PROCESSING APPARATUS - An information processing apparatus of the present application includes a drawing unit that draws a line on an image in a drawing area displayed in a fixed mode, which displays an image by fixing a position of the image; a judgment unit that judges whether a position of the line during drawing by the drawing unit is positioned on a boundary indicating the drawing area; a mode switching unit that switches from the fixed mode to a predetermined mode other than the fixed mode in a case in which the position during drawing is judged as being positioned on the boundary indicating the drawing area by the judgment unit; and a display control unit that executes control to display the image based on the mode switched by the mode switching unit. | 07-24-2014 |
20140215409 | ANIMATED DELETE APPARATUS AND METHOD - A computer-implemented method is disclosed for deleting an electronic receipt. Within the method, a computer system may present a receipt image documenting a point-of-sale transaction to a customer via a display of a computing device corresponding to the customer. While the receipt image is being presented, the computer system may receive one or more multi-touch pinch commands applied by the customer to the display of the computing device. In response to such commands, the computer system may virtually crumple the receipt image in a manner tracking the one or more multi-touch pinch commands. Once the receipt image is fully crumpled and/or crumpled and flicked, the computer system may delete receipt data corresponding to the receipt image from the computer device or an account accessed through the computing device. | 07-31-2014 |
20140215410 | ACTIVATION OF A SCREEN READING PROGRAM - Systems and processes for activating a screen reading program are disclosed. One process can include receiving a request to activate the screen reading program and prompting the user to perform an action to confirm the request. The action can include making a swiping gesture, shaking the device, covering a proximity sensor, tapping a display, or the like. In some examples, the confirming action must be received within a time limit or the input can be ignored. In response to receipt of the confirmation (e.g., within the time limit), the screen reading program can be activated. The time limit can be identified using audible notifications at the start and end of the time limit. In another example, a device can detect an event associated with a request to activate a screen reading program. The event can be detected at any time to cause the device to activate the screen reading program. | 07-31-2014 |
20140215411 | METHOD AND APPARATUS FOR CONTROLLING CONTENT PLAYBACK - A method and an apparatus are provided for controlling content in an electronic device. A gesture detection area that detects a gesture of a user is displayed, when a display unit of the electronic device is deformed during content playback. A range of the gesture detection area corresponds to a degree of the content playback. The gesture of the user is detected in the gesture detection area. The content being played is controlled in accordance with the gesture of the user. | 07-31-2014 |
20140215412 | ADJUSTING VALUES OF A PLURALITY OF CONDITIONS - A method, system, and/or computer program product adjust values of a plurality of conditions. A processor receives a user input, which is a movement across a user interface. A tendency of the movement, which describes a direction and velocity of the movement, is determined. According to the tendency of the movement, a processor adjusts a value of at least one of the plurality of conditions by using a plurality of graphs representing the plurality of conditions, where the plurality of conditions describe search criteria, where the user input describes the search criteria, and where the plurality of graphs representing the plurality of conditions have a common starting point and are radial. | 07-31-2014 |
20140215413 | CONTENT SCRUBBING - Navigation of electronic content using content scrubbing is contemplated. The content scrubbing may include moving from one offset position within the content to another offset position as a function of user interaction with a navigational feature, such as but not necessarily limited to scrubbing pages in an e-book or frames in a video according to user interaction with a mouse, touchpad, touchscreen, etc. | 07-31-2014 |
20140223381 | INVISIBLE CONTROL - An invisible control may be implemented in a client device or in an application of the client device. A user may activate the invisible control by applying a gesture on a predetermined region of the client device or the application. In response to receiving the user gesture, a predetermined action associated with the invisible control may be activated. The predetermined action may be applied to the application or some or all of the content associated with the application. An Application Programming Interface may further be provided to allow the user, an application vendor or a content provider to customize the invisible control or operating modes associated with activation of the invisible control. | 08-07-2014 |
20140223382 | Z-SHAPED GESTURE FOR TOUCH SENSITIVE UI UNDO, DELETE, AND CLEAR FUNCTIONS - Techniques are disclosed for providing a Z-shaped gesture mode in electronic touch sensitive devices. In some cases, the Z-shaped gesture mode may be configured to undo an action or delete or clear content when a Z-shaped gesture is made. The Z-shaped gesture mode may also be configured to allow the reversal of previously performed undo, delete, and clear functions using a Z-shaped gesture. In some instances, the undo, delete, and clear functions are performed by a Z-shaped gesture drawn from top to bottom, and the reverse function is performed by a reverse Z-shaped gesture drawn from bottom to top. In some cases, the starting contact point and/or ending contact point of the Z-shaped gesture may control the function performed. In some configurations, the Z-shaped gesture mode may include a gesture and hold feature that is activated by holding the ending contact point of the Z-shaped gesture. | 08-07-2014 |
20140223383 | REMOTE CONTROL AND REMOTE CONTROL PROGRAM - This invention provides a remote control with a touch operation unit comprising multiple sensor elements disposed thereon, said remote control provided with functionality to discriminate which sensor elements are active according to, for example, the type of user interface. Specifically, the remote control has: a touch operation unit with multiple sensor elements disposed on a touch surface for the purpose of reading gestures; a first identification information acquisition unit that acquires first identification information, which is information that identifies the sensor element to make active in order to read the aforementioned gesture; and a gesture reading unit that reads the gesture using sensor signals from active sensor elements only. | 08-07-2014 |
20140223384 | SYSTEMS, METHODS, AND APPARATUS FOR CONTROLLING GESTURE INITIATION AND TERMINATION - Certain embodiments of the invention may include systems, methods, and apparatus for controlling gesture initiation and termination of a user and controlling devices based on a user's gestures. According to one embodiment, a vehicle can include at least one actuator; at least one gesture detection device; and one or more processors. The one or more processors receive an initiation indication from the at least one actuator; receive gesture information from the at least one gesture detection device; receive a termination indication from the at least one actuator; determine, from the received gesture information, a gesture from at least one occupant of the vehicle, wherein the gesture is determined based at least in part on the initiation indication; select a command from a plurality of commands, based at least in part on the determined gesture; and output a control signal or a command control signal associated with the command. | 08-07-2014 |
20140223385 | METHODS FOR SYSTEM ENGAGEMENT VIA 3D OBJECT DETECTION - Methods and apparatuses are presented for controlling an application on a device. In some embodiments, a method may include detecting that a user is maintaining an object or gesture at a position hovering near the device for a threshold length of time. The method may also include anchoring an initial position of the object or gesture to the device based on the detection of the maintained position, and controlling the application using the anchored initial position. In some embodiments, controlling the application using the anchored initial position may include manipulating the application based on detecting within a stable zone associated with the anchored initial position a change in height of the gesture or object relative to the device, and not manipulating the application whenever the object or gesture is detected to move along a plane above and parallel to the device and within the stable zone. | 08-07-2014 |
20140223386 | METHOD FOR RECORDING A TRACK AND ELECTRONIC DEVICE USING THE SAME - A method for recording a track and an electronic device using the same are provided. When the electronic device loads an operation system, the recording module is automatically enabled and enters a standby state. When a display unit of the electronic device displays an operating interface and the recording module is triggered, the recording module is switched from the standby state to an active state. When the recording module is at the active state, a drawing layer is displayed and covers the operating interface, and an input track is received via the drawing layer. When a storing command is received, the input track is stored to obtain a composite image. | 08-07-2014 |
20140223387 | TOUCH-SENSITIVE DEVICE AND ON-SCREEN CONTENT MANIPULATION METHOD - An electronic device includes a front panel and a side panel. A display is disposed at the front panel and configured to present content to a user. A touch-sensitive bar is disposed at the side panel and configured to receive a touch gesture input by the user. A control module is configured to manipulate the presented content according to the touch gesture. An on-screen content manipulation method is also provided. | 08-07-2014 |
20140223388 | DISPLAY CONTROL METHOD AND APPARATUS - A method is provided for adjusting a touch screen display of a terminal, the method including adjusting, by a processor, a brightness level of the display based on a direction of a touch gesture performed in a predetermined region of the display; adjusting a contrast level of the display in response to a first characteristic included in the touch gesture, when the touch gesture progresses in a direction of increasing the brightness level; and adjusting a tone level of the display in response to a second characteristic included in the touch gesture, when the touch gesture progresses in a direction of decreasing the brightness level. | 08-07-2014 |
20140237432 | GESTURE-BASED USER-INTERFACE WITH USER-FEEDBACK - A system has a contactless user-interface through which a user controls a functionality of the system. The contactless user-interface has a detector sub-system and a user-feedback sub-system. The contactless user-interface has an alert mode and a control mode. In the alert mode, the user-feedback sub-system has a display monitor to provide a visible acknowledgement to the user, in response to the detector sub-system having detected the presence of the user within an alert range. The contactless user-interface transitions from the alert mode to the control mode in response to the detector sub-system detecting an initialization gesture of the user. In the control mode, the contactless user-interface controls the functionality in response to the detector sub-system detecting a control gesture of the user. The visible acknowledgement is made to mirror the movements of the user. | 08-21-2014 |
20140237433 | INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD - A processing device and method are provided. According to an illustrative embodiment, the device and method are implemented by detecting a face region of an image, setting at least one action region according to the position of the face region, comparing image data corresponding to the at least one action region to the detection information for purposes of determining whether or not a predetermined action has been performed, and generating a notification when it is determined that the predetermined action has been performed. | 08-21-2014 |
20140245234 | METHOD FOR CONTROLLING DISPLAY OF MULTIPLE OBJECTS DEPENDING ON INPUT RELATED TO OPERATION OF MOBILE TERMINAL, AND MOBILE TERMINAL THEREFOR - A method of controlling a display of a plurality of objects according to an input related to operation of a mobile terminal is provided. The method includes changing, if an input related to operation of the mobile terminal is received when a layout including a plurality of areas in which a plurality of objects are respectively displayed is displayed, one or more of the plurality of areas corresponding to the input related to the operation of the mobile terminal, and displaying a layout including the changed areas. The input related to the operation of the mobile terminal may be a motion of the mobile terminal, a user's breath, or a gesture. If an input related to operation of the mobile terminal is detected while one of the areas constituting the layout is being touched, the remaining areas except for the touched area are rearranged. | 08-28-2014 |
20140245235 | FEEDBACK METHOD AND ELECTRONIC DEVICE THEREOF - The embodiment of the invention provides a feedback method applied to a feedback device, comprising: detecting a first operation between a first electronic device and a second electronic device; generating first information if the first operation meets a first precondition; displaying the first information which indicates that the first operation is an operation interacting between the first electronic device and the second electronic device. The feedback method and the device thereof provided by the embodiments of the invention display the operation relationships between the electronic devices to remind the user of the operation object, thereby avoiding errors in the operation. | 08-28-2014 |
20140245236 | Data Processing Apparatus Which Detects Gesture Operation - An object of the present invention is to appropriately judge a gesture operation when the gesture operation is detected and data processing is performed in accordance with the gesture operation. A gesture operation is detected, and gesture operation types are narrowed down based on the detection result. Also, by referring to a user information table including user attributes of an operator performing the gesture operation, the gesture operation types are narrowed down to one gesture operation type. | 08-28-2014 |
20140250413 | ENHANCED PRESENTATION ENVIRONMENTS - Implementations disclosed herein include systems, methods, and software for enhanced presentations. In at least one implementation, motion information is generated that is associated with motion of a subject captured in three dimensions from a top view perspective of the subject. A control is identified based at least in part on the motion information and a presentation of information is rendered based at least in part on the control. | 09-04-2014 |
20140258942 | INTERACTION OF MULTIPLE PERCEPTUAL SENSING INPUTS - A system and method for using multiple perceptual sensing technologies to capture information about a user's actions and for synergistically processing the information is described. Non-limiting examples of perceptual sensing technologies include gesture recognition using depth sensors, two-dimensional cameras, gaze detection, and/or speech recognition. The information captured about a user's gestures using one type of sensing technology is often not able to be captured with another type of technology. Thus, using multiple perceptual sensing technologies allows more information to be captured about the user's gestures. Further, by synergistically leveraging the information acquired using multiple perceptual sensing technologies, a more natural user interface can be created for a user to interact with an electronic device. | 09-11-2014 |
20140258943 | PROVIDING EVENTS RESPONSIVE TO SPATIAL GESTURES - Systems and methods for providing an output responsive to a spatial gesture are provided. In some aspects, an event associated with a spatial gesture or body position information corresponding to the event are received via a two-way socket. A function corresponding to the event is determined, where the function includes modifying data rendered for display at a display device responsive to the spatial gesture. The function is executed. | 09-11-2014 |
20140258944 | MOBILE APPARATUS HAVING FUNCTION OF PRE-ACTION ON OBJECT AND CONTROL METHOD THEREOF - A method of controlling a mobile apparatus having a function of a pre-action on an object is provided. The method includes displaying the object on a touch screen, detecting a pre-action gesture on the object, and performing the pre-action on the object in response to the pre-action gesture. | 09-11-2014 |
20140258945 | INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - In accordance with one embodiment, an information processing apparatus comprises a box-shaped information display input section configured to be arranged with, on an upper surface thereof, a display screen which is used to display an image and to input information through a touching operation; an approach detection unit configured to detect which side of the information display input section an operator approached; and an input character recognition unit configured to recognize a character input to the display screen on condition that the character is oriented in the correct direction as seen from the side detected by the approach detection unit. | 09-11-2014 |
20140282269 | NON-OCCLUDED DISPLAY FOR HOVER INTERACTIONS - A computing device can be configured to recognize when a user hovers over or is within a determined distance of an element displayed on the computing device to perform certain tasks. Information associated with the element can be displayed when such a hover input is detected. This information may comprise a description of what tasks are performed by selection of the element. This information could also be an enlarged version of the element to help the user disambiguate selection of multiple elements. The information can be displayed in a manner such that at least substantive portions of the information would not be obscured or occluded by the user. | 09-18-2014 |
20140282270 | Method and System for Gesture Recognition - A method and system for recognizing gestures on an electronic device, such as a mobile device (e.g., watch), are disclosed. In one example embodiment, the method includes obtaining a gesture template, determining a first mean value based upon the gesture template, obtaining gesture data by way of a motion sensing component of the electronic device, and calculating (by way of a processing device) a correlation metric based at least indirectly upon the gesture data and the gesture template, where the correlation metric is calculated based at least in part upon the first mean value. The method also includes determining based at least in part upon the correlation metric that a first of the gestures has occurred, and taking at least one additional action based at least in part upon the determining. | 09-18-2014 |
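The mean-centred correlation metric in entry 20140282270 is plausibly a normalized cross-correlation between the stored gesture template and incoming motion-sensor data; the patent's exact formula may differ. A sketch under that assumption:

```python
import math

def correlation_metric(template, data):
    """Normalized cross-correlation of a gesture template against
    sensor data, each series centred on its mean (an assumed reading
    of the abstract's correlation metric 'based at least in part upon
    the first mean value'). Returns a value in [-1, 1]."""
    mt = sum(template) / len(template)
    md = sum(data) / len(data)
    num = sum((t - mt) * (d - md) for t, d in zip(template, data))
    den = math.sqrt(sum((t - mt) ** 2 for t in template) *
                    sum((d - md) ** 2 for d in data))
    return num / den if den else 0.0
```

A gesture would then be declared detected when this metric exceeds some decision threshold, which the abstract does not specify.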
20140282271 | USER INTERFACE RESPONSIVE TO OPERATOR POSITION AND GESTURES - Various embodiments are generally directed to the provision of multiple modes of a user interface that are automatically selected in response to the position and gestures of its operator. An apparatus includes an image sensor to capture at least one image of an operator, and a position component communicatively coupled to the image sensor to determine a proximate distance of the operator to a manually-operable control and to provide the determination of the distance to a user interface component to enable dynamic selection of one of multiple views of a visual portion of a user interface. Other embodiments are described and claimed. | 09-18-2014 |
20140282272 | Interactive Inputs for a Background Task - Systems and methods according to one or more embodiments of the present disclosure provide improved multitasking on user devices. In an embodiment, a method for multitasking comprises detecting a non-touch gesture input received by a user device. The method also comprises associating the non-touch gesture input with an application running in a background, wherein a different focused application is running in a foreground. And the method also comprises controlling the background application with the associated non-touch gesture input without affecting the foreground application. | 09-18-2014 |
20140282273 | SYSTEM AND METHOD FOR ASSIGNING VOICE AND GESTURE COMMAND AREAS - A system and method for assigning user input command areas for receiving user voice and air-gesture commands and allowing user interaction and control of multiple applications of a computing device. The system includes a voice and air-gesture capturing system configured to allow a user to assign three-dimensional user input command areas within the computing environment for each of the multiple applications. The voice and air-gesture capturing system is configured to receive data captured by one or more sensors in the computing environment and identify user input based on the data, including user speech and/or air-gesture commands within one or more user input command areas. The voice and air-gesture capturing system is further configured to identify an application corresponding to the user input based on the identified user input command area and allow user interaction with the identified application based on the user input. | 09-18-2014 |
20140282274 | DETECTION OF A GESTURE PERFORMED WITH AT LEAST TWO CONTROL OBJECTS - Methods, systems, computer-readable media, and apparatuses for implementation of a contactless panning gesture are disclosed. In some embodiments, a remote detection device detects synchronized motion of at least two control objects across a control plane. An attached computing device may then adjust a current position of a displayed content in response to detection of the synchronized motion. In certain embodiments, a threshold for variation in the movement of the control objects may be established to determine when to terminate a panning mode. The threshold may vary based on the velocity of the control objects. | 09-18-2014 |
20140282275 | DETECTION OF A ZOOMING GESTURE - Methods, systems, computer-readable media, and apparatuses for implementation of a contactless zooming gesture are disclosed. In some embodiments, a remote detection device detects a control object associated with a user. An attached computing device may use the detection information to estimate a maximum and minimum extension for the control object, and may match this with the maximum and minimum zoom amount available for a content displayed on a content surface. Remotely detected movement of the control object may then be used to adjust a current zoom of the content. | 09-18-2014 |
20140282276 | GESTURES INVOLVING DIRECT INTERACTION WITH A DATA VISUALIZATION - Functionality is described herein for directly interacting with parts of a data visualization. For instance, the functionality allows a user to directly interact with data items to filter them out from the data visualization, and later restore them to the data visualization. The functionality also allows a user to directly interact with an axis to sort the data items in the data visualization. The functionality also allows a user to directly interact with a label of the data visualization to choose a new label, and to reorganize the information represented by the data visualization in response thereto. Further, before finalizing any update to the data visualization, the functionality may provide a preview of the updated data visualization. | 09-18-2014 |
20140282277 | ADAPTIVE SEARCHING AND RANKING BASED ON GESTURES SIGNIFYING USER PREFERENCES - A system and method for incrementally optimizing the selection or ranking of elements or items from a collection for a user, and providing for immediate modification of the criteria used for selection of a next element or item based on simple user feedback on features or characteristics of a previously reviewed selection, as indicated by user gestures. | 09-18-2014 |
20140282278 | DEPTH-BASED USER INTERFACE GESTURE CONTROL - Technologies for depth-based gesture control include a computing device having a display and a depth sensor. The computing device is configured to recognize an input gesture performed by a user, determine a depth relative to the display of the input gesture based on data from the depth sensor, assign a depth plane to the input gesture as a function of the depth, and execute a user interface command based on the input gesture and the assigned depth plane. The user interface command may control a virtual object selected by depth plane, including a player character in a game. The computing device may recognize primary and secondary virtual touch planes and execute a secondary user interface command for input gestures on the secondary virtual touch plane, such as magnifying or selecting user interface elements or enabling additional functionality based on the input gesture. Other embodiments are described and claimed. | 09-18-2014 |
20140282279 | INPUT INTERACTION ON A TOUCH SENSOR COMBINING TOUCH AND HOVER ACTIONS - A system and method for defining a gesture to be any combination of touch and hover actions, the touch and hover actions being combined in any order and any number of discrete touch and hover actions that define a single gesture or a series of gestures. | 09-18-2014 |
20140282280 | GESTURE DETECTION BASED ON TIME DIFFERENCE OF MOVEMENTS - Disclosed herein are a method and electronic device for detecting or identifying a gesture. A first movement and a second movement are detected. A gesture is identified or detected based at least partially on a time difference between the first and second movements. A function associated with the gesture is performed. | 09-18-2014 |
20140282281 | SEAMLESS MEDIA NAVIGATION - Disclosed is technology for seamless media navigation. A computing device according to the technology includes a processor, a network interface, an input device (e.g. a touch screen) and a media navigation module. The network interface communicates with multiple media servers. The input device generates user input signals for swipe motions. The media navigation module is configured, when executed by the processor, to perform a process. The process includes playing a first media object; gradually switching from playing the first media object to playing multiple media objects including the first media object based on a first swipe motion; and gradually switching from playing the multiple media objects to playing one individual media object of the media objects based on a second swipe motion subsequent to the first swipe motion. | 09-18-2014 |
20140282282 | DYNAMIC USER INTERACTIONS FOR DISPLAY CONTROL - The technology disclosed relates to using gestures to supplant or augment use of a standard input device coupled to a system. It also relates to controlling a display using gestures. It further relates to controlling a system using more than one input device. In particular, it relates to detecting a standard input device that causes on-screen actions on a display in response to control manipulations performed using the standard input device. Further, a library of analogous gestures is identified, which includes gestures that are analogous to the control manipulations and also cause the on-screen actions responsive to the control manipulations. Thus, when a gesture from the library of analogous gestures is detected, a signal is generated that mimics a standard signal from the standard input device and causes at least one on-screen action. | 09-18-2014 |
20140282283 | Semantic Gesture Processing Device and Method Providing Novel User Interface Experience - A method for semantic processing of gestures on a user interface, a method for semantic processing of gestures on a user interface using remote resources, and a device incorporating the method as a novel user interface paradigm. User gestural input is sampled, analyzed, and interpreted while input is still occurring, and a change in user intent effects an action. | 09-18-2014 |
20140282284 | PORTABLE REPRODUCTION DEVICE, AND CONTROL METHOD, PROGRAM AND INFORMATION STORAGE MEDIUM FOR PORTABLE REPRODUCTION DEVICE - To provide a portable reproduction device for facilitating change by a user of a reproduction position of content data reproduced in a portable reproduction device. An inclination detection unit ( | 09-18-2014 |
20140289682 | Equivalent Gesture and Soft Button Configuration for Touch Screen Enabled Device - Methods and systems for configuring, on touch screen enabled devices, custom gestures and custom soft buttons that are equivalent to default gestures configured on such devices. These methods and systems allow equivalent custom gestures and soft buttons to be configured seamlessly on such devices without invoking a configuration tool or widget, enable multiple custom gestures and soft buttons equivalent to a particular default gesture to coexist on such devices, and permit sets of personal equivalent custom gestures and soft buttons to be saved and loaded on such devices by particular users. | 09-25-2014 |
20140289683 | METHOD AND APPARATUS FOR CALCULATING CHANNEL QUALITY ADAPTIVELY IN MOBILE COMMUNICATION SYSTEM - A method and apparatus for a configurable screen lock are disclosed herein, including receiving a request to release the screen lock while the electronic device is in a screen lock state, and in response to the request, displaying on a display a screen lock setting screen enabling adjustment of at least one execution condition of the screen lock. | 09-25-2014 |
20140298272 | CLOSING, STARTING, AND RESTARTING APPLICATIONS - Described herein are embodiments that relate to implementation of multi-stage gestures, using multi-stage gestures to control applications, and allowing, under certain conditions, invocation of an open operation (which would normally only open an application or bring an application to the fore) to cause a target application to terminate before being newly opened. A multi-stage gesture may be used to invoke different functions at respective gesture stages of a same input stroke. The functions may be different forms of application “closing”, such as backgrounding or suspending an application, terminating an application, and restarting an application. The restarting (including termination) of an application when the application is opened may be termed a “smart-restart”, which may involve interpreting from specific user activity that a user intends to restart an application. | 10-02-2014 |
20140298273 | Systems and Methods for Implementing Three-Dimensional (3D) Gesture Based Graphical User Interfaces (GUI) that Incorporate Gesture Reactive Interface Objects - Systems and methods in accordance with embodiments of the invention implement three-dimensional (3D) gesture based graphical user interfaces (GUI) using gesture reactive interface objects. One embodiment includes using a computing device to render an initial user interface comprising a set of interface objects, detect a targeting 3D gesture in captured image data that identifies a targeted interface object within the user interface, change the rendering of at least the targeted interface object within the user interface in response to the targeting 3D gesture that targets the interface object, detect an interaction 3D gesture in additional captured image data that identifies a specific interaction with a targeted interface object, modify the user interface in response to the interaction with the targeted interface object identified by the interaction 3D gesture, and render the modified user interface. | 10-02-2014 |
20140298274 | METHOD AND ELECTRONIC DEVICE FOR PROCESSING DATA - An electronic device and method are disclosed for not exposing a memo to others when delivering the memo from the electronic device to a specific user. An electronic device for processing data includes at least one processor, a memory, and at least one program stored in the memory and executed by the at least one processor, wherein the program causes the at least one processor to receive data from a user, detect a user's gesture, and transform a form of the data based on the detected gesture. A method in an electronic device includes receiving a memo from a user, framing the memo with a memo frame, detecting a user's gesture, and transforming a form of the memo frame based on the detected gesture. | 10-02-2014 |
20140298275 | METHOD FOR RECOGNIZING INPUT GESTURES - The present invention relates to methods, systems, and computer program products for recognizing input point gestures. The system recognizes a position of a cursor finger | 10-02-2014 |
20140298276 | DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND DISPLAY CONTROL PROGRAM - This display control device can effectively reduce power consumption while assuring visibility of a display area. A display control device ( | 10-02-2014 |
20140304663 | Gesture Interface - The instant application discloses, among other things, techniques for a simplified Gesture Interface, which may provide a consistent, easy-to-remember interface for performing various actions with a portable device, including, but not limited to, sharing files, data, and information, winking, waving, pointing, picking up, and dropping. | 10-09-2014 |
20140304664 | PORTABLE DEVICE AND METHOD FOR CONTROLLING THE SAME - A portable device and a method for controlling the same are disclosed, in which a user may set a timer or alarm more conveniently and more exactly. The portable device comprises a display unit, a sensor unit configured to sense a touch input for the display unit, and a processor configured to control the display unit and the sensor unit, wherein the processor is further configured to display an unlock interface for unlocking a lock state when the portable device is in the lock state, the unlock interface including a current time indicator displaying current time information, the current time indicator including a first zone indicating current hour unit information and a second zone indicating current minute unit information. | 10-09-2014 |
20140304665 | CUSTOMIZED GESTURE INTERPRETATION - The technology disclosed relates to filtering gestures, according to one implementation. In particular, it relates to distinguishing between interesting gestures from non-interesting gestures in a three-dimensional (3D) sensory space by comparing characteristics of user-defined reference gestures against characteristics of actual gestures performed in the 3D sensory space. Based on the comparison, a set of gestures of interest are filtered from all the gestures performed in the 3D sensory space. | 10-09-2014 |
20140310661 | DYNAMIC MANAGEMENT OF EDGE INPUTS BY USERS ON A TOUCH DEVICE - Systems and methods of blocking, ignoring, suspending, or otherwise altering edge-related UI gestures on touch-sensitive computing devices or on non-touch-sensitive computing devices having active edge I/O commands in certain situations are disclosed. In one embodiment, a second UI gesture coming from an outside edge may be altered after a first UI gesture from a user using a running application under certain conditions—e.g., if the second UI gesture is made within a certain time period after the first UI gesture, the second UI gesture is made within a certain proximity of the first UI gesture, etc. In another embodiment, a computing device is disclosed that comprises a controller, a display screen and an operating system that alters certain edge-related UI gestures if, e.g., certain conditions are present. | 10-16-2014 |
20140317577 | GESTURE CONTROLLABLE SYSTEM USES PROPRIOCEPTION TO CREATE ABSOLUTE FRAME OF REFERENCE - A system has a contactless user-interface for control of the system through pre-determined gestures of a bodily part of the user. The user-interface has a camera and a data processing system. The camera captures video data, representative of the bodily part and of an environment of the bodily part. The data processing system processes the video data. The data processing system determines a current spatial relationship between the bodily part and another bodily part of the user. Only if the spatial relationship matches a pre-determined spatial relationship representative of the pre-determined gesture, the data processing system sets the system into a pre-determined state. | 10-23-2014 |
20140317578 | Multifunction Device with Integrated Search and Application Selection - In some embodiments, a multifunction device with a touch screen display and a plurality of applications concurrently displays a first plurality of application launch icons in a first area of the touch screen display, detects a first input by a user, and in response to detecting the first input by the user, displays a search input area on the touch screen display. In some embodiments, the device receives search input from the user, performs a search using the search input, and displays a plurality of search results from the search. In some embodiments, the device detects user selection of a first search result in the plurality of search results, and displays information corresponding to the first search result in the corresponding first application. In some embodiments, in response to input from the user, the device returns to the search results and repeats this process for one or more applications. | 10-23-2014 |
20140325457 | SEARCHING OF LINE PATTERN REPRESENTATIONS USING GESTURES - The gesture-based searching of a line pattern representation amongst a collection of line pattern representations. Upon detecting an input gesture, a computing system matches the input gesture against each of multiple pattern representations. Each line pattern representation represents a line pattern having a changing value in a first dimension as a function of a value in a second dimension. At least some of the matched set may then be visualized to the user. The input gesture may be a literal line pattern to match against, or might be a gesture that has semantic meaning that describes search parameters of a line pattern to search for. The matched set may be presented so that a display parameter conveys a closeness of the match. | 10-30-2014 |
20140325458 | TOUCH INPUTS INTERACTING WITH USER INTERFACE ITEMS - Techniques for managing user interactions with items on a user interface are disclosed. In one aspect, a representation of an opening is presented in response to touch input. A display object is moved over the opening, and the display object is processed in response to the moving. In another aspect, touch input pinching two opposite corners of a display object followed by touch input flicking the display object is received and the display object is deleted in response to the inputs. In another aspect, touch input centered over a display object is received and the display object is deleted in response to the input. In another aspect, touch input corresponding to swiping gestures are received and a display object is securely deleted in response to the gestures. | 10-30-2014 |
20140325459 | GESTURE CONTROL SYSTEM - A control system is provided based on the use of gestures and functioning especially in mobile terminals. The gesture control system is provided with a general purpose interface, with its commands, for applications to be controlled. The processing software for the gesture signals includes a training program, by which free-form gestures made by the user are stored in the gesture library, and a recognizing program, which matches a gesture made by the user to the stored gestures and chooses the most similar gesture thereof. Gestures can hence be used as commands for controlling any application configured or programmed to receive the command. One and the same application functions in different models of mobile terminals without matching, and in a certain mobile terminal all applications can be run which use the specified interface commands. The application can be, e.g., a game or an activity included in the basic implementation of a mobile terminal. | 10-30-2014 |
20140331188 | MULTI TOUCH COMBINATION FOR VIEWING SENSITIVE INFORMATION - A method at an electronic device including a user input device, the method comprising: receiving data comprising displayable content and data indicating that touch events corresponding to a pattern of multiple touches are to be detected in order to display the displayable content; upon detecting touch events corresponding to the pattern of multiple touches, displaying the displayable content; and ceasing to display the displayable content once the touch events corresponding to the pattern of multiple touches are no longer detected. | 11-06-2014 |
20140331189 | ACCESSIBLE SELF-SERVICE KIOSK WITH ENHANCED COMMUNICATION FEATURES - Accessible self-service kiosks with enhanced communication features are disclosed. According to one embodiment, a method for interacting with a user of an accessible self-service kiosk may include (1) an accessible self-service kiosk entering a hearing-impaired accessibility mode for interacting with a user; (2) receiving, using at least one imaging device, a gesture made by the user; (3) the at least one computer processor accessing a database comprising a plurality of gestures and commands associated with each of the plurality of gestures; (4) the at least one computer processor identifying a command that is associated with the gesture; and (5) the at least one computer processor responding to the command. | 11-06-2014 |
20140331190 | NON-STRAIGHT GESTURE RECOGNITION METHOD FOR TOUCH DEVICES - A non-straight gesture recognition method is used for a touch device which includes a touch display panel and a data processing module. The method comprises: a user draws on the touch display panel to form a non-straight line locus; next, the data processing module registers a movement time from the non-straight line locus and captures a plurality of locus points to establish a locus equation to calculate a first-order differential value and a second-order differential value; finally, the data processing module judges a locus configuration of the non-straight line locus based on the movement time, first-order differential value and second-order differential value, and orders the touch device to execute a preset program corresponding to the locus configuration. The non-straight line locus can be formed in various types of locus configurations to activate different types of corresponding preset programs, and also reduces users' erroneous operations. | 11-06-2014 |
20140337804 | SYMBOL-BASED DIGITAL INK ANALYSIS - Techniques are described for automatically performing application-specific actions based on global pre-defined symbols entered using digital ink. For example, a computing device supporting digital ink input can receive digital ink content from a user (e.g., via a digitizer and/or touchscreen), process the digital ink input to recognize text and/or graphical content, determine whether global pre-defined symbols are present in the recognized text and/or graphical content, and perform application-specific actions associated with the global pre-defined symbols that are present. The application-specific actions can be associated with built-in and/or third-party applications. | 11-13-2014 |
20140337805 | INFORMATION PROCESSOR AND COMPUTER PROGRAM PRODUCT - According to one embodiment, an information processor includes a detector and a controller. The detector is configured to detect position coordinates on a display screen specified with an object. The controller is configured to perform disabling based on the position coordinates detected by the detector. When a predetermined gesture operation by the object is received upon selection of a first link displayed on the display screen, the controller is configured to disable the selection of the first link based on the position coordinates. When the detector detects position coordinates near the first link, the controller is configured to disable a function of automatically selecting the first link. | 11-13-2014 |
20140337806 | INTERFACING WITH A COMPUTING APPLICATION USING A MULTI-DIGIT SENSOR - A technology is described for interfacing with a computing application using a multi-digit sensor. A method may include obtaining an initial stroke using a single digit of a user on the multi-digit sensor. A direction change point for the initial stroke can be identified. At the direction change point for the initial stroke, a number of additional digits can be presented by the user to the multi-digit sensor. Then a completion stroke can be identified as being made with the number of additional digits. A user interface signal can be sent to the computing application based on the number of additional digits used in the completion touch stroke. In another configuration of the technology, the touch stroke or gesture may include a single stroke where user interface items can be selected when additional digits are presented at the end of a gesture. | 11-13-2014 |
20140337807 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND RECORDING MEDIUM - There is provided an information processing apparatus including a processing unit configured to control combining of a captured image and an operation target image so as to generate a combined image for feeding back gesture recognition to a user. A degree of visualization of the captured image appears to be changed in the combined image. | 11-13-2014 |
20140344764 | SHAKE-BASED FUNCTIONS ON A COMPUTING DEVICE - Techniques are disclosed for managing active applications on a mobile computing device, referred to collectively herein as a manage active apps mode. The manage active apps mode may be invoked by shaking the device while pressing the device's power button (or while manipulating one or more other user interface control features). The device may include one or more accelerometers, for example, to detect when (and possibly how) the device is being shaken. When invoked, the manage active apps mode may be configured to perform the function of closing, stopping, force stopping, quitting, or deleting of one or more of the device's active applications, for example. In some cases, the mode function performed may be determined by the direction the device is being shaken, such as if the device is being shaken from side-to-side or up-and-down. | 11-20-2014 |
20140344765 | Touch Sensitive UI Pinch and Flick Techniques for Managing Active Applications - Techniques are disclosed for managing active applications on a touch sensitive computing device using a pinch and flick gesture input, referred to collectively herein as a manage active apps mode. The manage active apps mode allows a user to perform a pinch gesture on a display of active applications to form a stack of those active applications. The user can then perform a flick gesture on the stack to perform a function on all of the active applications in the stack. The function may include closing, stopping, force stopping, quitting, or deleting the active applications in the stack, for example. In some cases, the manage active apps mode may be configured to provide feedback (e.g., an animation or sound) after a stack has been flicked to indicate that the function was performed (e.g., that the apps were closed, stopped, etc.). | 11-20-2014 |
20140344766 | REMOTING OR LOCALIZING TOUCH GESTURES AT A VIRTUALIZATION CLIENT AGENT - Aspects of the present disclosure are directed towards responding to a touch gesture at a touch-enabled computing device. An interface control element may be presented at a first computing environment provided by a computing device. A touch gesture may be received at a touchscreen of the computing device, and it may be determined whether at least a portion of the touch gesture occurred at the interface control element. Based, at least in part, on whether at least a portion of the touch gesture occurred at the interface control element, a display of the first computing environment may be adjusted or information corresponding to the touch gesture may be transmitted to a second computing environment. | 11-20-2014 |
20140344767 | REMOTE CONTROL METHOD AND REMOTE CONTROL SYSTEM OF IMAGE DISPLAY APPARATUS - A remote control method includes the steps of connecting an image display apparatus ( | 11-20-2014 |
20140344768 | METHOD OF APPLYING A HANDWRITING SIGNAL TO ACTIVATE AN APPLICATION - A method for activating an application by applying a handwriting signal is provided herein, in which the handwriting signal is input to a touch device on which activation of the application is called. The method includes the steps of: providing a handwriting signal to a handwriting recognition process; outputting a recognition signal from the handwriting recognition process; and activating the application by an operating system (OS) according to the recognition signal. | 11-20-2014 |
20140351770 | METHOD AND APPARATUS FOR IMMERSIVE SYSTEM INTERFACING - Disclosed are methods and systems for immersive system interfacing. A stimulus is defined in the form of a gesture or posture with an end-effector such as a hand. The stimulus is sensed within an environment and communicated to a processor. In response to the stimulus the processor generates a user interface, the interface being outputted near the end-effector. The stimulus may be identical for all times, all places, and all conditions in the environment. The interface may be generated and outputted in response to the stimulus for all times, places, and conditions in the environment. The apparatus includes a processor adapted to define a stimulus in the form of an end-effector gesture or posture, and to generate a user interface in response to the stimulus. A sensor is adapted to sense the stimulus, and a display is adapted to output the interface to an environment. | 11-27-2014 |
20140359538 | SYSTEMS AND METHODS FOR MOVING DISPLAY OBJECTS BASED ON USER GESTURES - Certain embodiments herein relate to systems and methods for moving display objects based on user gestures. In one embodiment, a system can include at least one memory configured to store computer-executable instructions and at least one control device configured to access the at least one memory and execute the computer-executable instructions. The instructions may be configured to detect a first user gesture adjacent to an output device in order to identify a display object displayed on the output device. The instructions may be configured to detect a second user gesture adjacent to the output device in order to identify a location to move the display object. The instructions may be configured to update the output device to display the display object at the identified location on the output device. | 12-04-2014 |
20140359539 | ORGANIZING DISPLAY DATA ON A MULTIUSER DISPLAY - For organizing display data on a multiuser display, a position module determines a first user position from a first hover presence at a multiuser display. An organization module organizes display data on the multiuser display in response to the first user position. | 12-04-2014 |
20140359540 | UBIQUITOUS NATURAL USER SYSTEM FOR HUMAN-MACHINE INTERACTION - A system is provided that includes sensor(s) configured to provide sensed input including measurements of motion of a user during performance of a task, and in which the motion may include a gesture performed in one of a plurality of 3D zones in an environment of the user that are defined to accept respective, distinct gestures. A front-end system may receive and process the sensed input including the measurements to identify the gesture and from the gesture, identify operations of an electronic resource. The front-end system may identify the gesture based on the one of the plurality of 3D zones in which the gesture is performed. The front-end system may then form and communicate an input to cause the electronic resource to perform the operations and produce an output. And the front-end system may receive the output from the electronic resource, and communicate the output to a display device. | 12-04-2014 |
20140359541 | TERMINAL AND METHOD FOR CONTROLLING MULTI-TOUCH OPERATION IN THE SAME - A terminal is provided. The terminal includes a first sensor disposed in a screen region and configured to sense a user's first touch, a second sensor disposed in a region other than the screen region and configured to sense at least two second touches from the user, and a controller configured to perform a multi-touch operation when the first touch is sensed through the first sensor and at least one of the second touches is sensed through the second sensor. | 12-04-2014 |
20140365977 | Accommodating Sensors and Touch in a Unified Experience - Automatically alternating between input modes on a computing device based on a usage pattern is provided. A first input mode is initiated for interacting with content displayed on the computing device. An input corresponding to a second input mode on the computing device is then detected. A transition is then made from the first input mode to the second input mode on the computing device. Upon detecting a termination of the input on the displayed content in the second input mode, a gradual transition is made from the second input mode to the first input mode based on a current sensor state of the computing device and a threshold. | 12-11-2014 |
20140365978 | Managing Ink Content in Structured Formats - Managing ink content in structured formats on a computing device is provided. Ink content may be received by the computing device. The ink content may then be recognized by the computing device to correspond to a content format associated with one or more applications. The ink content may then be converted by the computing device into content associated with the one or more applications. | 12-11-2014 |
20140365979 | METHOD AND APPARATUS FOR PERFORMING COMMUNICATION SERVICE BASED ON GESTURE - A method and an apparatus for performing gesture-based communication service are provided, in which a first device detects a first motion, and when the first motion corresponds to a first gesture that belongs to a gesture group, receives information about a second motion from a second device. The first device may perform an event that corresponds to a combination of the first and second gestures, when the received information corresponds to the second gesture that belongs to the gesture group. | 12-11-2014 |
20140365980 | FRAMEWORKS, DEVICES AND METHODS CONFIGURED FOR ENABLING GESTURE-BASED INTERACTION BETWEEN A TOUCH/GESTURE CONTROLLED DISPLAY AND OTHER NETWORKED DEVICES - Described herein are frameworks, devices and methods configured for enabling display of facility information and content, in some cases via touch/gesture controlled interfaces. Embodiments of the invention have been particularly developed for allowing an operator to conveniently access a wide range of information relating to a facility via, for example, one or more wall mounted displays. While some embodiments will be described herein with particular reference to that application, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts. | 12-11-2014 |
20140365981 | MOTION CONTROL OF MOBILE DEVICE - One aspect of the invention pertains to a method for using motion to control an application on a mobile device. Physical movement or acceleration of a mobile device is detected using a motion sensor in the mobile device. A determination is made as to whether the movement of the mobile device exceeds a predetermined toggle threshold. A feature (e.g., a recording function, a talk function, a rendering or listen function, etc.) on the mobile device is toggled or activated based on the toggle threshold determination. | 12-11-2014 |
20140380247 | TECHNIQUES FOR PAGING THROUGH DIGITAL CONTENT ON TOUCH SCREEN DEVICES - Techniques are disclosed for providing a page flipping mode in electronic touch sensitive devices. The user can engage the page flipping mode by performing an activation gesture, which causes the device to display a magazine page flipping mode or a fast page flipping mode. The page flipping modes may show paginated content such as an opened book or magazine in a single stack or side-by-side layout. The fast page flipping modes may show a single page lying relatively flat or somewhat curled with the edges of subsequent pages visible at the right edge of the page. A page flipping gesture may prompt an animation showing one or more pages folding up to display subsequent pages to the user. In some cases, the number of pages being flipped and/or the speed at which the pages are flipped, is dependent upon the speed and/or length and/or duration of the page flipping gesture. | 12-25-2014 |
20140380248 | METHOD AND APPARATUS FOR GESTURE BASED TEXT STYLING - A method and apparatus for gesture based text styling on a touch screen display is disclosed. The method comprises determining a gesture of a plurality of predefined gestures made on text displayed on a touch screen display, wherein the gesture selectively signifies at least one text style change to the text; and applying the text style change to a least a portion of the displayed text on the touch screen display. | 12-25-2014 |
20140380249 | VISUAL RECOGNITION OF GESTURES - Techniques that enable a user to interact with an electronic device using spatial gestures without touching the electronic device. An electronic device provides a contactless mode of operation during which a user can interact with the electronic device using touchless gestures. A touchless gesture may be used to indicate an action to be performed and also to set an action-related parameter value that is then used when the action is performed. | 12-25-2014 |
20140380250 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD AND PROGRAM - An image processing apparatus comprises: an acquisition unit to acquire screen information related to whether or not an operation screen accepts an input by a gesture operation; a retrieval unit to retrieve, from screen information associated with a plurality of screen configuration patterns applicable to the operation screen, the screen information which coincides with the screen information acquired by the acquisition unit; a determining unit to determine the screen configuration pattern to be applied to the operation screen, based on the screen information retrieved by the retrieval unit; and an applying unit to apply a display rule of a screen element defined by the screen configuration pattern determined by the determining unit to a display screen of the operation screen, whereby a user can easily recognize whether or not the gesture operation can be performed on the screen on which the user intends to perform an operation. | 12-25-2014 |
20140380251 | METHOD AND DEVICE FOR AUGMENTED HANDLING OF MULTIPLE CALLS WITH GESTURES - A wireless communication device ( | 12-25-2014 |
20140380252 | DISPLAY DEVICE AND PROJECTION DEVICE - Provided is a convenient display device including: a display unit capable of projecting a projection image on a domain in accordance with a position of a user; and a judgment unit that judges whether the projection image can be projected on the domain. | 12-25-2014 |
20140380253 | INFORMATION PROCESSING APPARATUS AND METHOD OF PROCESSING INFORMATION - [Object] To improve operability of an information processing apparatus. | 12-25-2014 |
20140380254 | GESTURE TOOL - Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. From that, the data is parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters is also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of such to the gesture recognizer engine and application, where that change occurs. | 12-25-2014 |
20150012892 | Gestures to Encapsulate Intent - A method is provided for enabling gameplay with a character in a game environment on a mobile device. In the game environment, a player can play a game via a character. When there is an expectation of action from the character in the game, a touch gesture input is received on the mobile device. The fidelity of the touch gesture input is calculated with reference to an optimal gesture in a reference gesture table. If the fidelity is within a predefined range associated with the optimal gesture, the touch gesture input is matched to the optimal gesture and an associated game script is called wherein the character would be shown as having successfully completed the expectation. | 01-08-2015 |
20150012893 | DEVICE, METHOD, AND STORAGE MEDIUM STORING PROGRAM - According to an aspect, a device includes: a communication unit, a touch screen display, and a controller. The communication unit acquires information through communication service. The touch screen display displays a screen for setting an image in individual information registered in address book data. When a predetermined gesture is detected during display of the screen, the controller acquires an image to be associated with the individual information through communication service registered in the individual information. | 01-08-2015 |
20150012894 | RECORDING AND REPRODUCING APPARATUS - A recording and reproducing apparatus includes: a recording means for storing a plurality of images in groups; a display means for displaying images stored in the recording means; a detecting means for detecting a part of a human body or an object in a predetermined form; and a display switching means for switching images to be displayed on the display means in accordance with a form of a part of a human body or a form of an object detected by the detecting means. | 01-08-2015 |
20150020033 | METHOD AND APPARATUS FOR ACTIVATING A USER INTERFACE FROM A LOW POWER STATE - A method for activating a user interface from a low-power state using a touch-sensitive display module, the touch-sensitive display module including a touch-sensitive screen and a display, the touch-sensitive screen including a first touch-sensitive region and a second touch-sensitive region, the method including performing a continuous touch gesture on the touch-sensitive display module, the continuous touch gesture including a first gesture portion detected by the first touch-sensitive region and a second gesture portion detected by the second touch-sensitive region, the first gesture portion of the continuous touch gesture activating the second touch-sensitive region of the touch-sensitive screen. | 01-15-2015 |
20150020034 | TECHNIQUES FOR NOTEBOOK HINGE SENSORS - Techniques are described for notebook hinge sensors. For example, a computing device may comprise a housing having a processor circuit and an input device, the input device arranged on a side of the housing, a lid having a digital display arranged on a side of the lid, a hinge arranged to couple the housing and the lid, and a sensor module coupled to the processor circuit, the sensor module arranged inside the hinge and operative to capture motion input outside of the computing device. Other embodiments are described. | 01-15-2015 |
20150020035 | METHOD, APPARATUS AND MOBILE TERMINAL FOR CONTROLLING AN APPLICATION INTERFACE BY MEANS OF A GESTURE - Various embodiments provide a method, apparatus and mobile terminal for controlling an application interface by means of a gesture. In an exemplary method, a terminal browser can obtain a preset gesture that is inputted into a screen for generating a control point. The control point can be generated according to the preset gesture inputted into the screen. The control point can be associated with one or more preset areas on the screen. The terminal browser can obtain an operation gesture inputted into the screen targeted for the control point and, according to the operation gesture, generate an application interface corresponding to a preset area that the control point is located at. The present disclosure can break through conventional interface operation modes, realizing application interface control via gestures without occupying visible screen space, and adding more freedom for the layout of the application interface on a mobile device. | 01-15-2015 |
20150020036 | MOBILE TERMINAL AND CONTROLLING METHOD THEREOF - A mobile terminal including a touchscreen; a wireless communication unit; a memory configured to store a specific application; and a controller configured to display a first object related to the specific application, the first object having a first size, in which the specific application can be executed in response to a first touch gesture on the first object, in response to a second touch gesture with respect to the displayed first object, display an object indicator indicating that the first object can be changed into a second object related to the specific application, the second object having a second size, and in response to a third touch gesture with respect to the object indicator, change the displayed first object into the second object such that the first object is visually transformed into the second object. | 01-15-2015 |
20150026646 | USER INTERFACE APPARATUS BASED ON HAND GESTURE AND METHOD PROVIDING THE SAME - Provided is a user interface (UI) apparatus based on a hand gesture. The UI apparatus includes an image processing unit configured to detect a position of an index finger and a center position of a hand from a depth image obtained by photographing a user's hand, and detect a position of a thumb on a basis of the detected position of the index finger and the detected center position of the hand, a hand gesture recognizing unit configured to recognize a position change of the index finger and a position change of the thumb, and a function matching unit configured to match the position change of the index finger to a predetermined first function, match the position change of the thumb to a predetermined second function, and output a control signal for executing each of the matched functions. | 01-22-2015 |
20150026647 | MOBILE TERMINAL AND CONTROL METHOD THEREOF - A mobile terminal and a control method thereof are provided according to exemplary embodiments. The mobile terminal includes a main body that is configured to be wearable on a specific portion of a user's body, a sensing unit that is configured to sense whether or not the main body has been worn, and also sense a user gesture for deciding the worn position of the main body, and a controller that is configured to decide the worn position of the main body according to the sensed user gesture, and set a user input for generating a first control command in a different manner based on the decided worn position. With the configuration, the main body may sense the worn position by itself so as to provide a user interface, which is more intuitive and convenient for the user, according to the sensed worn position. | 01-22-2015 |
20150026648 | METHOD FOR TRIGGERING APPLICATIONS WITH A SMART DEVICE - A method of triggering applications on smart phone devices is provided, and the method triggers the smart device in a particular way to activate at least one application based on a screen brightness variation of the smart device, wherein the triggering method may be performed by clicking, shaking, or distance sensing. | 01-22-2015 |
20150026649 | METHOD, APPARATUS AND SYSTEM FOR CONTROLLING COMPUTER TERMINAL - A method, apparatus and system for controlling a computer terminal are provided. In the method a touch on a touch-sensitive display of a hand-held electronic terminal is detected. It is judged whether a touch posture of the touch matches with a preset touch posture; and if the touch posture of the touch matches with the preset touch posture, a control command corresponding to the touch posture is generated and sent to the computer terminal, wherein the control command is used for controlling the computer terminal to execute a click operation of a mouse and/or a movement operation of a mouse cursor. Through the disclosure, the mouse function is implemented by using the hand-held electronic terminal without increasing the cost of hardware, thereby improving the user experience of the hand-held electronic terminal. | 01-22-2015 |
20150033192 | METHOD FOR CREATING EFFECTIVE INTERACTIVE ADVERTISING CONTENT - A method for interacting with a viewer of a digital signage display providing advertising content. When a person is detected in view of the display, a user representation of the person, such as a silhouette or avatar, is generated and shown on the display. While the person remains in view of the display, the method shows a manipulation of the user representation such as a displayed alteration of it or the user representation interacting with or experiencing a displayed product. With use of the displayed user representation, the person effectively becomes part of the displayed advertisement while in view of the display. | 01-29-2015 |
20150033193 | METHODS FOR MODIFYING IMAGES AND RELATED ASPECTS - Examples are provided of methods and related aspects for presenting an image on a display and causing modification of the displayed image by displaying at least one tear feature within the image responsive to detecting at least one edge tearing gesture applied to an apparatus. Some methods and related aspects partition the image into image portions using said at least one displayed tear feature and retain a selected one of said image portions on the display. The retained image portion may then comprise a region of interest for which meta-data may be generated. Associating the meta-data with the file from which the image is generated enables the region of interest to be subsequently displayed without repeating the region of interest selection process. | 01-29-2015 |
20150033194 | MULTI-FINGER USER IDENTIFICATION - A method for identifying a user is provided, including the following method operations: identifying at least three contact regions on a touch sensitive surface, the contact regions defined by simultaneous contact of at least three fingers of the user with the touch sensitive surface; for each contact region, determining a center point; determining distances between each of the determined center points of the contact regions; comparing the determined distances against predefined distances associated with a known user; based on the comparing, determining the user to be the known user or not the known user. | 01-29-2015 |
20150033195 | HARDWARE DEVICE, USER CONTROL APPARATUS FOR THE SAME, MEDICAL APPARATUS INCLUDING THE SAME, AND METHOD OF OPERATING MEDICAL APPARATUS - Provided are a hardware device, a user control apparatus for the same, a medical apparatus including the same, and a method of operating the medical apparatus. The method includes sensing a pattern of a hardware device disposed on an ultrasonic touch screen, and when the sensed pattern matches a stored pattern, determining the hardware device as an input apparatus enabling a user command to be input. | 01-29-2015 |
20150040075 | DISPLAY APPARATUS AND CONTROL METHOD THEREOF - Disclosed are a display apparatus that includes: an image receiving section which receives an image; an image processing section which processes the received image; a display section which displays the processed image and comprises a touch panel through which a touch input of a user is receivable; a UI generating section which generates a UI in the display section; a controller which performs a control for displaying the processed image and generating the UI including a thumbnail image corresponding to the displayed image, and determines, if the touch input of the user is received or detected at a first position of the thumbnail image of the generated UI through the touch panel, that the touch input is received or detected at a corresponding second position of the image displayed in the display section to control the image processing section. | 02-05-2015 |
20150040076 | GESTURE RECOGNITION METHOD, APPARATUS AND DEVICE, COMPUTER PROGRAM PRODUCT THEREFOR - In an embodiment, hand gestures, such as hand or finger hovering, in the proximity space of a sensing panel are detected from X-node and Y-node sensing signals indicative of the presence of a hand feature at corresponding row locations and column locations of a sensing panel. Hovering is detected by detecting the locations of maxima for a plurality of frames over a time window for a set of X-node sensing signals and for a set of Y-node sensing signals, by recognizing a hovering gesture if the locations of the maxima detected vary over the plurality of frames for one of the sets of X-node and Y-node sensing signals while remaining stationary for the other of the sets of X-node and Y-node sensing signals. Finger shapes are distinguished over "ghosts" generated by palm or fist features by transforming the node-intensity representation for the sensing signals into a node-distance representation, based on the distances of the detection intensities for a number of nodes under a peak for a mean point between the valleys adjacent to the peak. | 02-05-2015 |
20150046884 | CONTEXT SENSITIVE ACTIONS IN RESPONSE TO TOUCH INPUT - Techniques for performing context-sensitive actions in response to touch input are provided. A user interface of an application can be displayed. Touch input can be received in a region of the displayed user interface, and a context can be determined. A first action may be performed if the context is a first context and a second action may instead be performed if the context is a second context different from the first context. In some embodiments, an action may be performed if the context is a first context and the touch input is a first touch input, and may also be performed if the context is a second context and the touch input is a second touch input. | 02-12-2015 |
20150046885 | Method and device for unlocking touch screen - Disclosed is a method for unlocking a touch screen, including that: an unlocking gesture and a screen transition in unlocking are preset, wherein the screen transition is as follows: a screen is turned along an axis on the screen to produce a visual effect similar to that of pushing open a revolving door, such that a lock screen is turned gradually from the front of the screen to the back of the screen, and an unlocked screen is turned gradually from the back of the screen to the front of the screen; the screen is divided into two or more zones each corresponding to a distinct application; when the screen is unlocked in such a zone, the unlocked screen is an interface of an application corresponding to the zone; and when the screen has to be unlocked, the unlocking gesture is performed on the screen to unlock the screen. With the technical solution, efficiency in unlocking a touch screen and user experience may be improved significantly. Also disclosed is a device for unlocking a touch screen including an unlocking gesture setting module, a screen transition setting module, a screen dividing module, and an unlocking module. | 02-12-2015 |
20150046886 | GESTURE RECOGNITION - A wrist-worn athletic performance monitoring system, including a gesture recognition processor configured to execute gesture recognition processes. Interaction with the performance monitoring system may be based, at least in part, on gestures performed by the user, and offers an alternative to making selections on the performance monitoring system using physical buttons, which may be cumbersome and/or inconvenient to use while performing athletic activities. Additionally, recognized gestures may be used to select one or more operational modes for the athletic performance monitoring system such that a reduction in power consumption may be achieved. | 02-12-2015 |
20150052487 | SCREEN-UNLOCKING METHOD, SYSTEM AND TOUCH SCREEN TERMINAL - The present disclosure provides a system for backing up and recovering data, which comprises a mobile terminal and a server. The mobile terminal comprises: a parameter backup module, being configured to receive a backup request for backing up system setting parameters of the mobile terminal, and enable the mobile terminal to transmit the system setting parameters to a network server according to the backup request so that the system setting parameters are backed up by the network server; a request-for-recovery module, being configured to receive a recovery request for recovering the system setting parameters to the mobile terminal and enable the mobile terminal to transmit the recovery request to the network server; and a recovery control module, being configured to receive the system setting parameters that are transmitted by the network server according to the recovery request and control to recover a system state of the mobile terminal to a state that is set by the system setting parameters. In this way, the present disclosure can back up the system setting parameters of the mobile terminal to the server, and recover the system setting parameters from the server to the mobile terminal during a data recovery. Thus, the present disclosure brings great convenience and a new experience to users. | 02-19-2015 |
20150058809 | MULTI-TOUCH GESTURE PROCESSING - One aspect of the invention is a system for multi-touch gesture processing. The system includes a multi-touch display and processing circuitry coupled to the multi-touch display. The processing circuitry is configured to detect a gesture on a gesture target area of a panel toolbar associated with a panel displayed on the multi-touch display. The panel includes panel content displayed in a content area. The gesture target area includes an empty area absent one or more command icons. Based on detection of the gesture, additional content is displayed on the multi-touch display associated with the panel content. | 02-26-2015 |
20150058810 | Electronic Device with Lateral Touch Control Combining Shortcut Function - An electronic device capable of simplifying user operation includes a shell having an opening; a displaying device disposed on the opening; and at least one lateral touch panel, disposed on one or more lateral parts of the shell, for generating a touch control signal according to a gesture applied on the at least one lateral touch panel; wherein the one or more lateral parts are connected to a plane where the opening is located and the external surface thereof is not parallel to the plane, and the electronic device executes a predefined function according to the touch control signal. | 02-26-2015 |
20150058811 | CONTROL SYSTEM FOR DISPLAY SCREEN, INPUT APPARATUS AND CONTROL METHOD - A control system for a display screen, an input apparatus and a control method are provided. An image capturing unit is used to continuously capture an image toward a first side of a display apparatus, and a processing unit is used to execute an image analyzing process for the captured image. The image analyzing process includes the following steps. Whether an object enters an initial sensing space located at the first side is detected. A virtual operating plane is established according to a location of the object when the object enters the initial sensing space is detected, wherein a size of the virtual operating plane is proportioned to a size of the display screen. A movement information of the object in the virtual operating plane is detected for controlling content of the display screen through the movement information. | 02-26-2015 |
20150058812 | SYSTEMS AND METHODS FOR CHANGING BEHAVIOR OF COMPUTER PROGRAM ELEMENTS BASED ON GAZE INPUT - According to the invention, a method for changing the behavior of computer program elements is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user. The method may also include causing, with a computer system, an interactive event controlled by the computer system to alter its behavior based at least in part on the gaze point of the user. | 02-26-2015 |
20150067613 | DISPLAY DEVICE AND METHOD OF SETTING GROUP INFORMATION - Disclosed are a display device and a method of setting group information by displaying additional information to be added to base information via first and second gesture inputs. The display device includes a display unit configured to display visual information, a sensor unit configured to detect an input signal and transmit a detected result to a processor, and the processor configured to control the display unit and the sensor unit. The processor is configured to display base information, detect a first gesture input to the displayed base information, determine an interval of additional information based on a position of the detected first gesture input, detect a second gesture input, determine the number of additional information based on a position of the detected second gesture input, and display at least one additional information according to the determined interval and the determined number of the additional information. | 03-05-2015 |
20150067614 | METHOD FOR DISPLAYING DATA AND ELECTRONIC DEVICE THEREOF - A method of displaying data and an electronic device thereof are provided. The method includes receiving a first gesture input when a first program is displayed on a first layer of a touch screen of the electronic device, displaying a second program corresponding to the first gesture input on a second layer of the touch screen, receiving a second gesture input on the second layer, and displaying a third program corresponding to the second gesture input on a third layer of the touch screen. | 03-05-2015 |
20150067615 | METHOD, APPARATUS, AND RECORDING MEDIUM FOR SCRAPPING CONTENT - A content scraping method includes recognizing a touch trace made on a screen according to a touch location moving to correspond to a touch input and selecting and storing at least one content located on the touch trace. | 03-05-2015 |
20150074613 | Menus with Hand Based Gestures - Several embodiments that allow for more precise three dimensional control of menus within electronic interfaces that may include televisions, computers, tablets, and smartphones. Interaction with electronic interfaces may be made through devices that measure three dimensional interaction such as the Leap Motion controller and Oculus Rift headset. | 03-12-2015 |
20150074614 | DIRECTIONAL CONTROL USING A TOUCH SENSITIVE DEVICE - A method and system for navigation within a two-dimensional grid object displayed on an electronic device includes determining a starting location and a circular motion of a touch gesture on the touch sensitive interface. Advancement of the circular motion of the touch gesture is mapped into a continuous navigation along an axis of the displayed grid object. The mapping into a navigation direction within the grid object is based on the starting location and the circular direction of the touch gesture. The results of the navigation, such as an indication of navigation direction and a location within the grid object are displayed. | 03-12-2015 |
20150074615 | DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR MANIPULATING USER INTERFACES BASED ON FINGERPRINT SENSOR INPUTS - An electronic device with a display and a fingerprint sensor displays a fingerprint enrollment interface and detects, on the fingerprint sensor, a plurality of finger gestures performed with a finger. The device collects fingerprint information from the plurality of finger gestures performed with the finger. After collecting the fingerprint information, the device determines whether the collected fingerprint information is sufficient to enroll a fingerprint of the finger. When the collected fingerprint information for the finger is sufficient to enroll the fingerprint of the finger, the device enrolls the fingerprint of the finger with the device. When the collected fingerprint information for the finger is not sufficient to enroll the fingerprint of the finger, the device displays a message in the fingerprint enrollment interface prompting a user to perform one or more additional finger gestures on the fingerprint sensor with the finger. | 03-12-2015 |
20150074616 | PORTABLE DEVICE AND METHOD FOR PROVIDING VOICE RECOGNITION SERVICE - A portable device including a touch sensor configured to sense touch inputs, the touch sensor being in an active state while the portable device is in a standby mode; a touch sensor controller configured to receive the sensed touch inputs; and a processor configured to receive a signal from the touch sensor controller indicating whether the received sensed touch input indicates a first pre-stored pattern corresponding to a first active mode or a second pre-stored pattern corresponding to a second active mode, control the portable device to be in the first active mode based on the touch sensor controller receiving the signal indicating the sensed touch inputs correspond to the first pre-stored pattern, and control the portable device to be in the second active mode based on the touch sensor controller receiving the signal indicating the sensed touch inputs correspond to the second pre-stored pattern. | 03-12-2015 |
20150074617 | Multimedia Playing Device - A multimedia playing device includes a central processing unit, a plurality of sensors electrically coupled to the central processing unit, and an output unit electrically coupled to the central processing unit. The plurality of sensors are operated together with the central processing unit, such that after the sensors detect different hand movements of a user, the central processing unit reads and determines the hand movement and transmits related control signals to the output unit according to different hand movements to achieve the effects of using a hand posture to control related functional movements and enhancing the convenience of using the multimedia playing device. | 03-12-2015 |
20150082255 | METHODS AND APPARATUS FOR DISPLAYING NOTIFICATION INFORMATION - Methods and apparatus for displaying notification information are disclosed. A computing device that is showing a breathing view on its touch screen display detects a peek request event, such as a press and hold on the display. If the user then swipes in one direction (e.g., up to a notification target), the computing device launches the notification intent (e.g., the full text message in the text messaging application). However, if the user swipes in another direction (e.g., down to a plurality of notification icons), the computing device displays a notification curtain (e.g., a list of various notifications and links to each associated application). | 03-19-2015 |
20150082256 | APPARATUS AND METHOD FOR DISPLAY IMAGES - An image display apparatus and an image display method are provided. The image display apparatus includes a communicating interface configured to receive an identifier and service information associated with the identifier, a storage configured to store the identifier and the service information, a sensor configured to sense a gesture, a controller configured to process service content corresponding to the identifier by using the service information in response to determining that the gesture corresponds to the identifier, and a display configured to display service content. | 03-19-2015 |
20150082257 | MOBILE TERMINAL AND CONTROL METHOD THEREOF - A mobile terminal including a wireless communication unit configured to provide wireless communication; a touch screen configured to display an execution screen; and a controller configured to receive a touch input to one end of the touch screen that continuously moves in a direction toward the other end of the touch screen, and display a control screen among any one of first and second control screens based on a point to which the touch input is applied. | 03-19-2015 |
20150089454 | OVERSCROLL STRETCH ANIMATION - Overscroll stretch animation. In accordance with a first method embodiment, a portion of displayable information is displayed on a touch screen display in a nominal state. A movement of an object on or near the touch screen display is detected. Responsive to the movement, a boundary limit of the displayable information is detected. Responsive to the detecting, the portion of displayable information is displayed in a distorted state. The distorted state simulates fabric stretching. | 03-26-2015 |
20150089455 | GESTURE INPUT METHOD - A gesture input method includes: observing a wrist; outputting state information indicating a state of the wrist; determining, by a processor, according to the state information whether the wrist is in a dorsiflexion state; and performing, by the processor, a predetermined process in accordance with whether the wrist is in the dorsiflexion state. | 03-26-2015 |
20150089456 | ELECTRONIC DEVICE - A display control unit executes a first control process if a swipe operation detected by an operation detection unit is a first swipe operation in which a finger is linearly moved, and executes a second control process differing from the first control process if the swipe operation detected by the operation detection unit is a second swipe operation in which the finger is moved along a route differing from the route of the finger in the first swipe operation. | 03-26-2015 |
20150095855 | ACTIONABLE CONTENT DISPLAYED ON A TOUCH SCREEN - Some implementations may present a media file that includes video on a touchscreen display. A user gesture performed on the touchscreen display may be detected. The user gesture may include one of a tap gesture, a swipe gesture, or a tap and hold and drag while holding gesture. Text selected by the user gesture may be determined. One or more follow-up actions may be performed automatically based at least partly on the text selected by the user gesture. | 04-02-2015 |
20150095856 | METHOD AND TERMINAL DEVICE FOR DISPLAYING MESSAGES - The present disclosure discloses a method for displaying messages in a terminal device and the terminal device thereof. The method includes the following steps: displaying messages by a display; detecting a first preset gesture acting on the display, wherein the gesture generates two endpoints on the display; and adjusting messages between the two endpoints. Accordingly, it is very convenient for the user to adjust the displayed messages by performing a simple gesture on the display of the terminal device. | 04-02-2015 |
20150100926 | DISTANCE SCALABLE NO TOUCH COMPUTING - Disclosed herein are techniques for scaling and translating gestures such that the applicable gestures for control may vary depending on the user's distance from a gesture-based system. The techniques for scaling and translation may take the varying distances from which a user interacts with components of the gesture-based system, such as a computing environment or capture device, into consideration with respect to defining and/or recognizing gestures. In an example embodiment, the physical space is divided into virtual zones of interaction, and the system may scale or translate a gesture based on the zones. A set of gesture data may be associated with each virtual zone such that gestures appropriate for controlling aspects of the gesture-based system may vary throughout the physical space. | 04-09-2015 |
20150106770 | A PRIMARY DEVICE THAT INTERFACES WITH A SECONDARY DEVICE BASED ON GESTURE COMMANDS - An incoming call from a remote device can be received by a primary device. The primary device can determine a numerical count of detected user gestures. Responsive to determining the numerical count of detected user gestures, the primary device can automatically generate an electronic message indicating a user will return the incoming call in a time frame corresponding to the numerical count of detected user gestures. The primary device can automatically communicate the electronic message to the remote device. | 04-16-2015 |
20150121314 | TWO-FINGER GESTURES - Methods and apparatus, including computer program products, are provided for two-finger gestures. In one aspect there is provided a method, which may include detecting a single-finger gesture proximate to a user interface; tracking the detected single-finger gesture to determine whether the detected single-finger gesture corresponds to a substantially circular motion; providing a first indication for presentation on the user interface, the first indication indicating at least a first selection on the user interface, when the detected single-finger gesture represents the substantially circular motion; detecting a two-finger gesture proximate to the user interface; tracking the detected two-finger gesture to determine whether the detected two-finger gesture corresponds to the substantially circular motion; and providing a second indication for presentation on the user interface, the second indication indicating at least a second selection on the user interface, when the detected two-finger gesture represents the substantially circular motion. | 04-30-2015 |
20150121315 | GESTURE BASED METHOD FOR ENTERING MULTI-VARIABLE DATA ON A GRAPHICAL USER INTERFACE - A gesture based method for entering multi-dimensional data on a graphical user interface, for use in conjunction with a portable electronic device with a touch screen display, comprising a plurality of vertical and horizontal gestures to specify two different but logically related items of data. | 04-30-2015 |
20150121316 | METHOD FOR UNLOCKING TOUCH-SENSITIVE DEVICE, AND TOUCH-SENSITIVE DEVICE - Embodiments of the invention disclose a method for unlocking a touch-sensitive device and a touch-sensitive device, which resolves a problem that a user always performs unlocking by inputting a default password or by using a default preset graphical interaction interface of the touch-sensitive device, which is monotonous and uninteresting. The method includes: receiving, by the touch-sensitive device, a second unlocking manner input by a user; and unlocking itself by the touch-sensitive device if the second unlocking manner is the same as a first unlocking manner, where the first unlocking manner is preset by the user according to an unlocking feature, and the unlocking feature is position-related and is preset by the user. The present invention is applicable to unlocking a touch-sensitive device having a touchscreen, such as a touchscreen mobile phone or a tablet computer. | 04-30-2015 |
20150128093 | Touch Screen Control for Adjusting a Numerical Value - A method of operating a data processing system having a touch enabled display screen to alter the value of a specified variable in the data processing system is disclosed. A value control is provided on the display screen to alter the variable. The value control has an increment tap region, a decrement tap region, and a drag region. The data processing system recognizes gestures in these regions and changes the variable in response to the gestures by an amount that is determined by the gesture. The drag gesture is characterized by a drag direction and a drag length, the drag direction depending on whether the drag gesture is performed toward the increment or decrement tap regions. The value control is advantageous for small display screens. | 05-07-2015 |
20150128094 | Gesture-Based Controls Via Bone Conduction - Concepts and technologies are disclosed herein for utilizing bone conduction to detect gestures. According to one aspect, a device can generate a signal and send the signal to a sensor network that is connected to a user. The device can receive a modified signal from the sensor network. The modified signal can include the signal as modified by a body of the user. The device can compare the modified signal to the signal to determine a difference in a feature between the signal and the modified signal. The device can determine a gesture performed by the user based upon the difference in the feature between the signal and the modified signal. | 05-07-2015 |
20150128095 | METHOD, DEVICE AND COMPUTER SYSTEM FOR PERFORMING OPERATIONS ON OBJECTS IN AN OBJECT LIST - The present application discloses methods, devices and computer systems for performing operations on objects in an object list. After detecting a swipe gesture on the touch screen, a computer system, e.g. a smart phone, can identify a target object of the swipe gesture. The computer system can also determine the direction of the swipe gesture and a preset operation corresponding to the direction. If the swiping distance of the swipe gesture is sufficiently long, the identified operation is performed on the target object without additional confirmation from the user. Different directions of the swipe gesture result in different operations that are of different nature. A warning message can be displayed when the swiping distance of the swipe gesture is greater than a threshold value. | 05-07-2015 |
20150128096 | SYSTEM TO FACILITATE AND STREAMLINE COMMUNICATION AND INFORMATION-FLOW IN HEALTH-CARE - Processes and systems for facilitating communications in a health care environment are provided. In one example, a process includes receiving a trigger from a wearable computer device to communicate with a medical application interface. The trigger may include detecting a hand gesture of a user of a wearable computer device (e.g., via a camera device or motion sensing device associated with the wearable computer device). The process may then display information associated with the medical application interface on the wearable computer device, and receive input from a user via the wearable computer device for interacting with the medical application interface. Displayed information may include patient information, medical records, test results, and so on. Further, a user may initiate and communicate with a remote user, the communication synchronizing information between two or more users (e.g., to synchronously view medical records, medical image files, and so on). | 05-07-2015 |
20150128097 | METHOD AND ELECTRONIC DEVICE FOR USER INTERFACE - A method for operating an electronic device includes detecting a touch input by an external object on a first part of a display, determining whether the touch input is accompanied by an adjacent input, and processing a function corresponding to a second part of the display when the touch input is not accompanied by the adjacent input. An electronic device includes a display to display an image, a touch sensor to sense at least one of a touch input and an adjacent input, and a processor configured to detect the touch input by an external object on a first part of the display via the touch sensor, determine whether the touch input is accompanied by the adjacent input, and process a function corresponding to a second part of the display when the touch input is not accompanied by the adjacent input. Other embodiments are also disclosed. | 05-07-2015 |
20150135145 | ELECTRONIC DEVICE - There is provided a user-friendly electronic device including: touch sensors provided on a first surface and at least a second surface other than the first surface; a processor that displays application information based on a way of holding the electronic device by a user, the processor detecting the way of holding the electronic device using a result of detection of the touch sensors; and an assignment unit that assigns a function corresponding to an application to be run to the touch sensors. | 05-14-2015 |
20150149968 | TOUCH DEVICE AND CONTROL METHOD THEREOF - A touch device and a control method thereof are provided. In the control method, a sensing signal generated by a sensor is received. When an operating object is present, a touch screen is configured for touchable regions and untouchable regions, wherein the untouchable regions cannot generate a response through a touch by the operating object. A gesture is received on the untouchable regions of the touch screen, a processor determines whether the gesture conforms to a default gesture setting, and the processor decides whether or not to adjust part or all of the untouchable regions to the touchable regions according to the determining result. Accordingly, the touch device and the control method thereof of the embodiments of the invention can prevent or lower the occurrence of an accidental touch by a user and enhance the operating convenience of the user. | 05-28-2015 |
20150293689 | TOUCH-CONTROL SYSTEM - A touch-control system is provided. The touch-control system includes: at least two image capturing units, configured to capture a plurality of hand images of a user; and an electronic device, coupled to the image capturing units, configured to recognize a target object from the hand images, and detect motions of the target object in an operating space, wherein the electronic device includes a display unit, and the operating space includes a virtual touch-control plane, wherein when the target object touches a virtual touch-control point on the virtual touch-control plane, the electronic device generates a touch-control signal and performs an associated touch-control operation at a position corresponding to the virtual touch-control point on the display unit. | 10-15-2015 |
20150293691 | ELECTRONIC DEVICE AND METHOD FOR SELECTING DATA ON A SCREEN - A method for an electronic device includes determining whether a drag of a touch draws a circumference of a closed region on a screen, recognizing at least one object within the closed region of the screen, and extracting the at least one object from the closed region of the screen. An apparatus includes a screen configured to display an image, and a processor configured to determine whether a drag of a touch draws a circumference of a closed region on a screen, and recognize at least one object within the closed region of the screen, and extract the at least one object from the closed region of the screen. | 10-15-2015 |
20150309570 | EYE TRACKING SYSTEMS AND METHODS WITH EFFICIENT TEXT ENTRY INPUT FEATURES - Eye tracking systems and methods include such exemplary features as a display device, at least one image capture device and a processing device. The display device displays a user interface including one or more interface elements to a user. The at least one image capture device detects a user's gaze location relative to the display device. The processing device electronically analyzes the location of user elements within the user interface relative to the user's gaze location and dynamically determines whether to initiate the display of a zoom window. The dynamic determination of whether to initiate display of the zoom window may further include analysis of the number, size and density of user elements within the user interface relative to the user's gaze location, the application type associated with the user interface or at the user's gaze location, and/or the structure of eye movements relative to the user interface. | 10-29-2015 |
20150309578 | CONTROL OF A REAL WORLD OBJECT USER INTERFACE - Systems and methods described allow users to select and obtain significant information about objects in the real world, and further to employ gestures as a “real world” interface to manipulate information and to manipulate the selection of objects. In this way, users may be enabled to make better decisions when, e.g., traveling and shopping, and may further be enabled to obtain significant information about objects with which they are surrounded. The systems and methods may include a camera which monitors a user's hand movements or gestures to control a UI, particularly where a user is interacting with real-world objects. Gestures can move the focus of a UI from one real world object to another. The systems and methods may also include a projector to illuminate selected objects, or to display information about objects. | 10-29-2015 |
20150312402 | COMPUTING SYSTEM WITH CONTROL MECHANISM AND METHOD OF OPERATION THEREOF - A computing system includes: a communication unit configured to communicate a client recognition pattern for detecting an agent device within a detection proximity; and a control unit, coupled to the communication unit, configured to: determine a detection quantity based on the client recognition pattern, assign a channel bin based on comparing the detection quantity to a channel occupancy available, and generate an activity command based on an activity request pattern assigned to the channel bin for controlling a device functionality of an electronic device. | 10-29-2015 |
20150324113 | UNLOCKING ELECTRONIC DEVICES USING TOUCHSCREEN INPUT GESTURES - A computer implemented method for detecting input gesture events on a touchscreen of an electronic device and for unlocking the electronic device is disclosed. The method may include displaying, while the electronic device is in a locked state, a plurality of guidance lines on the touchscreen of the electronic device, detecting, during an input gesture event, guidance line crossings and calculating a number of guidance line crossings detected during the input gesture event. The method may also include converting a calculated number of detected guidance line crossings into at least one password digit, comparing a sequence of password digits to a stored password in the electronic device and unlocking, in response to comparing the sequence of password digits to the stored password, the electronic device. | 11-12-2015 |
20150339022 | EVALUATION OF DIGITAL CONTENT USING NON-INTENTIONAL USER FEEDBACK OBTAINED THROUGH HAPTIC INTERFACE - Systems and methods are provided for evaluating the quality of automatically composed digital content based on non-intentional user feedback obtained through a haptic interface. For example, a method includes accessing non-intentional user feedback collected by a haptic interface executing on a computing device, wherein the non-intentional user feedback comprises information regarding user interaction with elements of digital content rendered by the computing device. The digital content is content that is automatically generated using content generation rules. The method further includes evaluating a quality of the digital content based on the non-intentional user feedback, and generating an evaluation report that includes information regarding the quality of the digital content. | 11-26-2015 |
20150339027 | SYSTEMS, METHODS, AND COMPUTER-READABLE MEDIA FOR TOUCH-SCREEN TRACING INPUT - Methods, systems, and computer-readable media for facilitating trace touch input through an input device of a computing system. The input device may include a touch-based input device, such as a touch screen input device. For example, some embodiments may provide a trace input graphical user interface (GUI) object (or “trace input object”) for facilitating trace touch input. The trace input object may include a graphical object, such as a target or pencil-shaped object, presented on a touch screen that is configured to copy or otherwise represent trace touch input entered by a user such that the touch input results are completely visible to the user in substantially real-time as they are being entered by the user. In this manner, users are able to achieve efficient and accurate touch input results when entering trace touch input through a touch-based input device. | 11-26-2015 |
20150339028 | Responding to User Input Gestures - Apparatus comprises at least one processor, and at least one memory, having computer-readable code stored thereon, the at least one memory and the computer program code being configured to, with the at least one processor, cause the apparatus to disable touch-sensitivity of a first touch-sensitive region, to enable touch-sensitivity of a second touch-sensitive region, and to be responsive to receipt, while the touch-sensitivity of the first touch-sensitive region is disabled, of a user input gesture, at least part of which is in respect of the second touch-sensitive region, to cause a graphical user interface to be displayed on a display panel, wherein the first and second touch-sensitive regions are configured to detect at least one type of touch input gesture and are configured such that the touch-sensitivities of the first and second touch sensitive regions are independently controllable. | 11-26-2015 |
20150346823 | System and Method for Selecting Gesture Controls Based on a Location of a Device - An information handling system includes a memory, a camera, an image analysis module, and an action module. The memory is configured to store a mapping of a plurality of gestures to a plurality of operations to be performed in the information handling system. The camera is configured to detect a field of view, and to capture a movement made by an individual within the field of view. The image analysis module is configured to receive movement data from the camera, and to determine a gesture associated with the movement based on the field of view. The action module is configured to map the determined gesture to an operation of the information handling system based on the mapping of the plurality of gestures to the plurality of operations stored in the memory. | 12-03-2015 |
20150346826 | DETECTING INPUT BASED ON MULTIPLE GESTURES - Detecting user input based on multiple gestures is provided. One or more interactions are received from a user via a user interface. An inferred interaction is determined based, at least in part, on a geometric operation, wherein the geometric operation is based on the one or more interactions. The inferred interaction is presented via the user interface. Whether a confirmation has been received for the inferred interaction is determined. | 12-03-2015 |
20150346837 | Gestures, Interactions, And Common Ground In a Surface Computing Environment - Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur. | 12-03-2015 |
20150346947 | FEEDBACK IN TOUCHLESS USER INTERFACE - A computing device ( | 12-03-2015 |
20150346995 | ELECTRONIC APPARATUS AND METHOD - According to one embodiment, an electronic apparatus includes circuitry configured to display strokes of one-line handwritten characters. The circuitry is further configured to display characters corresponding to a recognition result of the strokes in a third direction. The third direction is determined based on a first direction of the one-line and a second direction determined based on at least one vector direction of at least one stroke. | 12-03-2015 |
20150346996 | ELECTRONIC APPARATUS AND METHOD - According to one embodiment, an electronic apparatus includes circuitry. The circuitry is configured to display strokes handwritten on a screen and to display at least one character with a first font type corresponding to a recognition result of the strokes. The first font type is determined by using at least one of (i) whether a pressure is used for determining a form of the strokes and (ii) whether the strokes correspond to a cursive writing. | 12-03-2015 |
20150346998 | RAPID TEXT CURSOR PLACEMENT USING FINGER ORIENTATION - Aspects of the disclosed subject matter are related to a method for utilizing touch object orientation with a touch user interface. First, a first location within a text body on the touch interface is determined. Next, a change in an orientation of a touch object is determined while the touch object remains in contact with the touch device. Thereafter, a second location within the text body on the touch user interface different from the first location is determined based at least in part on the first location and the change in the orientation of the touch object. | 12-03-2015 |
20150346999 | SYSTEM AND METHOD FOR ENHANCED COMMAND INPUT - A portable electronic device having an input device for receiving a gesture based input from a user is used to control a navigation operation of an appliance. The portable electronic device receives via the input device the gesture based input and uses one or more parameters stored in a memory of the portable electronic device and one or more characteristics associated with the gesture based input to cause the portable electronic device to transmit a navigation step command to thereby control the navigation operation of the appliance. | 12-03-2015 |
20150355715 | MIRRORING TOUCH GESTURES - The present disclosure is directed toward systems and methods that mirror a display on a touch screen as well as touch gestures being performed on the touch screen. For example, systems and methods described herein involve detecting a touch gesture being performed on a touch screen and providing a semi-transparent animation of the touch gesture on a mirrored display. The semi-transparent animation can allow users to view both the mirrored display and the touch gesture animation. | 12-10-2015 |
20150355721 | Motion Pattern Classification and Gesture Recognition - Methods, program products, and systems for gesture classification and recognition are disclosed. In general, in one aspect, a system can determine multiple motion patterns for a same user action (e.g., picking up a mobile device from a table) from empirical training data. The system can collect the training data from one or more mobile devices. The training data can include multiple series of motion sensor readings for a specified gesture. Each series of motion sensor readings can correspond to a particular way a user performs the gesture. Using clustering techniques, the system can extract one or more motion patterns from the training data. The system can send the motion patterns to mobile devices as prototypes for gesture recognition. | 12-10-2015 |
20150355722 | Method, Device, System And Non-Transitory Computer-Readable Recording Medium For Providing User Interface - According to one aspect of the present invention, there is provided a method for providing a user interface. The method includes when an input event occurs, acquiring information on postures or motions of a first device and a second device associated with the input event in a preset time interval including the time of occurrence of the input event, and determining a command to be carried out in response to the input event with reference to a relative relationship between the posture or motion of the first device and that of the second device. | 12-10-2015 |
20150355806 | THRESHOLD-BASED DRAGGABLE GESTURE SYSTEM AND METHOD FOR TRIGGERING EVENTS - A novel approach to displaying content on user devices may include initializing a gesture-to-refresh functionality with a view of a page of a non-native application running on a client device. The non-native application can be platform-independent. Each page of the non-native application can have its own gesture-to-refresh functionality. User gesture events such as touch or mouse events relative to the view presented on a display of the client device are continuously monitored by the gesture-to-refresh functionality which detects and tracks the view when dragged by a dragging gesture from a first position to a second position on the display. The view is refreshed when the second position of the view reaches or exceeds a predetermined threshold. The view is allowed to return to the first position upon release of the dragging gesture when the second position of the view is less than the predetermined threshold. | 12-10-2015 |
20150363005 | TECHNIQUES FOR USING HUMAN GESTURES TO CONTROL GESTURE UNAWARE PROGRAMS - A capture device can detect gestures made by a user. The gestures can be used to control a gesture unaware program. | 12-17-2015 |
20150370332 | DEVICE AND METHOD FOR RECOGNIZING GESTURES FOR A USER CONTROL INTERFACE - In the context of a user interface control, a gesture-recognition device: receives gyroscopic data representing a gesture executed with a dedicated instrument including a gyroscopic sensor; determines a correlation between the received gyroscopic data and gyroscopic data relating to a supervised learning and pre-recorded into a database; recognizes the executed gesture, or not, according to said correlation, the only data representing the executed gesture taken into account being said gyroscopic data; and transposes each recognized gesture into a user interface command. | 12-24-2015 |
20150370471 | MULTI-TOUCH INTERFACE AND METHOD FOR DATA VISUALIZATION - A system and method for facilitating adjusting a displayed representation of a visualization. An example method includes employing a touch-sensitive display to present a user interface display screen depicting a first visualization; and providing a first user option to apply touch input to a region of the user interface display screen coinciding with a portion of the first visualization to facilitate affecting an arrangement of data displayed via the first visualization, wherein the touch input includes a multi-touch gesture. In a more specific embodiment, the touch gesture includes a rotation gesture, and the method further includes displaying a visual indication of a change, e.g., a pivot operation, to be applied to a second visualization as a user performs the rotation gesture, and updating the second visualization as the user continues to perform the rotation gesture. The first visualization is updated based on the second visualization upon completion of the rotation gesture. | 12-24-2015 |
20150378591 | METHOD OF PROVIDING CONTENT AND ELECTRONIC DEVICE ADAPTED THERETO - A method of providing content in an electronic device and an apparatus for supporting the method are provided. The electronic device includes a communication unit configured to communicate with external devices, a display configured to display at least one content item, and a controller configured to detect a gathering gesture, gather content items stored in the electronic device and content items from the external devices in response to the gathering gesture, determine an order of displaying content items based on attributes of the gathered content items, and arrange and display the gathered content items according to the determined display order. | 12-31-2015 |
20150378594 | Progressively Indicating New Content in an Application-Selectable User Interface - This document describes techniques for progressively indicating new content in an application-selectable user interface. These techniques permit a user to view indications of new content for applications progressively, rather than all at one time. By so doing, the techniques may avoid mentally or visually overloading or over-stimulating a user viewing the indications. | 12-31-2015 |
20160004423 | ELECTRONIC DEVICE WITH TOUCH GESTURE ADJUSTMENT OF A GRAPHICAL REPRESENTATION - An electronic device includes a touch-sensitive display screen to display a graphical representation of a mathematical relationship and to enable a user to enter a one-dimensional touch gesture thereon for performing a one-dimensional adjustment function for the displayed graphical representation. A display adjustment module interprets the one-dimensional touch gesture and performs the one-dimensional adjustment function. The display adjustment module performs a one-dimensional zoom adjustment function in response to one of a one-dimensional pinch touch gesture and a one-dimensional spread touch gesture. The one-dimensional zoom adjustment function increases or reduces a displayed range of values in a first dimension while maintaining unchanged a displayed range of values in a second dimension of the displayed graphical representation. | 01-07-2016 |
20160004430 | Device, Method, and Graphical User Interface for Determining Whether to Scroll or Select Content - An electronic device with a display, a touch-sensitive surface and one or more intensity sensors displays content. While a focus selector is over the content, the device detects a gesture on the touch-sensitive surface, the gesture including a first contact on the touch-sensitive surface and movement of the first contact across the touch-sensitive surface that corresponds to movement of the focus selector on the display. In response to detecting the gesture, when the contact has an intensity below a selection intensity threshold, the device scrolls the content on the display in accordance with the movement of the focus selector on the display without selecting the content. In response to detecting the gesture, when the contact has an intensity above the selection intensity threshold, the device selects at least a portion of the content in accordance with the movement of the focus selector over the content. | 01-07-2016 |
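The intensity-threshold branch this entry describes reduces to a simple dispatch: below the selection intensity threshold the drag scrolls, at or above it the drag selects. The threshold value and function names below are illustrative assumptions.

```python
# Minimal sketch of scroll-vs-select dispatch on contact intensity.
SELECTION_INTENSITY_THRESHOLD = 0.5  # normalized contact intensity (assumed)

def handle_gesture(intensity, movement_px):
    """Dispatch a one-contact drag according to its contact intensity."""
    if intensity < SELECTION_INTENSITY_THRESHOLD:
        # Scroll the content in accordance with the focus selector's movement.
        return ("scroll", movement_px)
    # Select content in accordance with the focus selector's movement.
    return ("select", movement_px)
```

A light drag scrolls; a firm drag of the same length selects instead.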
20160011667 | System and Method for Supporting Human Machine Interaction | 01-14-2016 |
20160011769 | SYSTEMS AND METHODS FOR INTERACTIVE IMAGE CARICATURING BY AN ELECTRONIC DEVICE | 01-14-2016 |
20160011772 | DENOISING TOUCH GESTURE INPUT | 01-14-2016 |
20160018899 | DETECTING LOSS OF USER FOCUS IN A DEVICE - A wearable computing device can detect device-raising gestures. For example, onboard motion sensors of the device can detect movement of the device in real time and infer information about the spatial orientation of the device. Based on analysis of signals from the motion sensors, the device can detect a raise gesture, which can be a motion pattern consistent with the user moving the device's display into his line of sight. In response to detecting a raise gesture, the device can activate its display and/or other components. Detection of a raise gesture can occur in stages, and activation of different components can occur at different stages. | 01-21-2016 |
20160018903 | INPUT DEVICE FOR OPERATING GRAPHICAL USER INTERFACE - An input device includes an input unit for inputting a predetermined motion image signal, a motion detector for detecting a motion on the basis of the motion image signal inputted into the input unit, a video signal processor for outputting a predetermined video signal, and a controller. The controller controls the video signal processor so that, when a motion detector detects a first motion, the video signal processor outputs a video signal to explain a predetermined second motion to be next detected by the motion detector after the detection of the first motion to a user. | 01-21-2016 |
20160018981 | Touch-Based Gesture Recognition and Application Navigation - An electronic device includes a display, a touch-sensitive surface, one or more processors, and memory storing one or more programs. The device displays a first user interface of a hierarchy of user interfaces of a software application associated with first and second pan gesture recognizers. The first pan gesture recognizer is configured to recognize a pan gesture that has an initial direction along a first axis and/or a first direction. The second pan gesture recognizer is configured to recognize a pan gesture that has an initial direction along a second axis and/or a second direction distinct from the first direction. The device detects a first pan gesture in an initial direction across the touch-sensitive surface while displaying the first user interface. The device identifies a pan gesture recognizer configured to recognize the first pan gesture, and processes the first pan gesture using the identified pan gesture recognizer. | 01-21-2016 |
20160018982 | Touch-Based Gesture Recognition and Application Navigation - An electronic device includes a display, a touch-sensitive surface, one or more processors, and memory storing one or more programs. The device displays a first user interface of a hierarchy of user interfaces of a software application associated with first and second pan gesture recognizers. The first pan gesture recognizer is configured to recognize a pan gesture that has an initial direction along a first axis and/or a first direction. The second pan gesture recognizer is configured to recognize a pan gesture that has an initial direction along a second axis and/or a second direction distinct from the first direction. The device detects a first pan gesture in an initial direction across the touch-sensitive surface while displaying the first user interface. The device identifies a pan gesture recognizer configured to recognize the first pan gesture, and processes the first pan gesture using the identified pan gesture recognizer. | 01-21-2016 |
20160034171 | MULTI-TOUCH GESTURE RECOGNITION USING MULTIPLE SINGLE-TOUCH TOUCH PADS - Described herein is a device and method that uses multiple touch-sensors on multiple ergonomically separated surfaces together with centralized, common processing to enable multi-touch performance for multi-touch applications. The device uses a combination of two or more separate touch-sensors with common processing to allow use of a wider portfolio of touch technologies which would otherwise only offer single-touch capabilities, for multi-touch applications. The usage of multiple separated sensors allows coverage of various surfaces using sensor technologies that might otherwise be unavailable. The segmented ergonomically formed touch sensitive devices use ergonomic single-touch and multi-touch gestures for controlling or passing general input information to electronic devices having a human-machine input. The devices fit a variety of surface conditions and are operable via a combination of a number of different human body parts. The multiple touch sensors are ergonomically separated or dedicated to body parts to prevent accidental activation by unintended body parts. | 02-04-2016 |
20160034172 | TOUCH DEVICE AND CONTROL METHOD AND METHOD FOR DETERMINING UNLOCKING THEREOF - A touch device and a control method and a method for determining unlocking thereof are provided. The touch device includes image capturing units arranged near a touch surface, and light-reflecting objects are arranged around the touch surface. In this method, the image capturing units are used to detect a reflected signal to receive a first gesture on the touch surface when the touch device is operated in a touch-lock mode. The image capturing units emit a detecting signal along the touch plane and detect the reflected signal reflected from the detecting signal. It is determined whether the first gesture matches a first specific gesture. When the first gesture matches the first specific gesture, the touch device is operated in a touch-on mode. Touch information is provided when the touch device is operated in the touch-on mode but not provided when the touch device is operated in the touch-lock mode. | 02-04-2016 |
20160041740 | SYSTEMS AND METHODS FOR PROCESSING OF VISUAL CONTENT USING AFFORDANCES - A system and method are disclosed for processing of visual content and data based on user inputs associated with affordances on a touch screen computer. Visual content may be obtained from one or more sources and presented to the user, either as still images or as a continual stream of images. As visual content is presented, the user can contact the touchscreen in any of a plurality of affordances. Depending on which affordance or affordances are contacted, the direction of motion across an affordance, or the like, the computer may perform a selected action to the instance of the visual content or invoke a change such that similar or related visual content is processed in a similar manner. | 02-11-2016 |
20160041749 | OPERATING METHOD FOR USER INTERFACE - An operating method of a user interface includes the following blocks. A slide area and a zoom area are defined on a touch screen. The user interface is displayed on the touch screen. A slide gesture is detected on the touch screen. A starting point and an extending direction of the slide gesture are determined on the touch screen. The slide operation or a zoom operation is exerted to the user interface according to the starting point and the extending direction of the slide gesture. The user interface slides along the extending direction of the slide gesture when the starting point is located in the slide area. The user interface zooms when the starting point is located in the zoom area. | 02-11-2016 |
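The dispatch this entry describes, on the starting point of a slide gesture, can be sketched as a point-in-rectangle test. The rectangle layout and names are illustrative assumptions, not the patent's layout.

```python
# Sketch: a slide gesture starting in the slide area slides the UI along
# the gesture's extending direction; one starting in the zoom area zooms.
def point_in_rect(x, y, rect):
    left, top, right, bottom = rect
    return left <= x < right and top <= y < bottom

def dispatch(start, direction, slide_area, zoom_area):
    x, y = start
    if point_in_rect(x, y, slide_area):
        return ("slide", direction)  # UI slides along the extending direction
    if point_in_rect(x, y, zoom_area):
        return ("zoom", direction)   # UI zooms in or out
    return ("none", None)
```

With the slide area on the left half of the screen and the zoom area on the right, the same upward gesture slides or zooms depending only on where it starts.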
20160048211 | Using the Z-Axis in User Interfaces for Head Mountable Displays - Methods, apparatus, and computer-readable media are described herein related to a user interface (UI) for a computing device, such as a head-mountable device (HMD). The computing device can detect a communication event. In response to the communication event, the computing device can display a first item having a current size on a display associated with a display plane. A hand-movement input device associated with the computing device can receive a first input indicative of a gesture toward the display plane. In response to receiving the first input, the computing device can display a first change to the current size of the first item. The hand-movement input device can receive a second input indicative of a gesture away from the display plane. In response to the second input, the computing device can display a second change to the current size of the first item. | 02-18-2016 |
20160048319 | Gesture-based Access to a Mix View - Techniques for gesture-based access to a mixed view associated with an application representation are described. In one or more implementations, a user interface is exposed by an operating system of a computing device. The user interface includes a concurrent display of a plurality of representations of applications that are selectable by a user to launch respective applications. Gesture-based techniques can be used to interact with an application representation to cause one or more visible targets to appear adjacent the representation. The individual targets are individually associated with some type of application functionality, e.g., a quick action or a deep link into content associated with the application. An individual target can then be selected, e.g., touch-selected, by a user to initiate the associated functionality. | 02-18-2016 |
20160054884 | METHOD TO UNLOCK A SCREEN USING A TOUCH INPUT - A computer-implemented method is provided for executing an action on an electronic device. The method includes outputting, at a touch screen of the electronic device, a lock screen view configured to prevent unauthorized or inadvertent access to the electronic device. Access is conditioned on a predefined shape of a user touch input. The method further includes, while the lock screen view is displayed at the touch screen: detecting, at the touch screen, a touch input having a first shape; verifying whether the first shape matches the predefined shape; and, in response to detecting that the touch input has the predefined shape, executing, by the electronic device, a particular action determined at least in part by the predefined shape, if a further touch input is detected prior or subsequent to the predefined shape. | 02-25-2016 |
20160062590 | USER INTERFACE FOR LIMITING NOTIFICATIONS AND ALERTS - The present disclosure relates to systems and processes for limiting notifications on an electronic device. In one example process, data representing a user input can be received by an electronic device. The data representing the user input can include touch data from the touch-sensitive device, ambient light data from an ambient light sensor, intensity data from a contact intensity sensor, and/or motion data from one or more motion sensors. Based on the data, it can be determined whether the user input is a cover gesture over a touch-sensitive display of the electronic device. In response to determining that the user input is a cover gesture over the touch-sensitive display, the electronic device can be put into a DND mode for a predetermined amount of time. While in the DND mode, the electronic device can cease to output some or all notifications. | 03-03-2016 |
20160062638 | ELECTRONIC DEVICE AND METHOD FOR PROVIDING DRAWING FUNCTION THEREOF - An electronic device is provided. The electronic device includes a touch screen that receives a touch input from a user and displays an object, and a control unit that determines an object having a shape corresponding to a trajectory of the touch input, determines whether a previously generated object is present in a position in which the touch input is received, modifies the previously generated object based on the determined object when the previously generated object is present, and generates the determined object when the previously generated object is absent. | 03-03-2016 |
20160070433 | DEVICES, METHODS, AND GRAPHICAL USER INTERFACES FOR ACCESSIBILITY USING A TOUCH-SENSITIVE SURFACE - An electronic device with a touch screen display is provided. Content in a user interface is displayed at a first magnification, and in response to detecting a first multi-finger gesture on the touch screen display (e.g., a three-finger tap gesture), a first portion of the content is magnified to a second magnification, greater than the first magnification. Further, while displaying the first portion of the content in the user interface at the second magnification, the device may detect a second multi-finger gesture on the touch screen display (e.g., movement of a three-finger contact on the display). In response to this movement, the device performs panning the user interface, so that a second portion of the content, different from the first portion, is displayed on the touch screen at the second magnification. | 03-10-2016 |
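The two multi-finger interactions this entry describes, a three-finger tap that toggles magnification and a three-finger drag that pans the magnified content, can be sketched with a small state holder. Class name, magnification levels, and pan representation are illustrative assumptions.

```python
# Illustrative sketch of the accessibility zoom behavior: a three-finger
# tap toggles between the first and second magnification, and a
# three-finger drag pans so a different content portion is displayed.
class AccessibilityZoom:
    def __init__(self, base=1.0, zoomed=2.0):
        self.base, self.zoomed = base, zoomed
        self.magnification = base
        self.pan = [0, 0]

    def three_finger_tap(self):
        # Toggle between the first and second magnification levels.
        self.magnification = (
            self.zoomed if self.magnification == self.base else self.base
        )

    def three_finger_drag(self, dx, dy):
        # Pan only while magnified, shifting which portion is displayed.
        if self.magnification > self.base:
            self.pan[0] += dx
            self.pan[1] += dy
```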
20160070872 | APPARATUS AND METHODS FOR MANAGING MEDICATION DOSING CONTENT IN AN ELECTRONIC ANESTHESIA RECORD - An apparatus and method for managing medications via gesture-based means in an electronic anesthesia record on a multi-functional device with a gesture-sensitive interface. | 03-10-2016 |
20160077594 | ENHANCED INPUT USING RECOGNIZED GESTURES - A representation of a user can move with respect to a graphical user interface based on input of a user. The graphical user interface comprises a central region and interaction elements disposed outside of the central region. The interaction elements are not shown until the representation of the user is aligned with the central region. A gesture of the user is recognized, and, based on the recognized gesture, the display of the graphical user interface is altered and an application control is outputted. | 03-17-2016 |
20160077704 | METHOD AND APPARATUS FOR TOUCH GESTURES - A system and method for facilitating employing touch gestures to control or manipulate a web-based application. The example method includes employing a browser running on a device with a touch-sensitive display to access content provided via a website; determining a context associated with the content, including ascertaining one or more user interface controls to be presented via a display screen used to present the content, and providing a first signal in response thereto; receiving touch input from a touch-sensitive display and providing a second signal in response thereto; and using the second signal to manipulate the display screen in accordance with the context associated with the content presented via the display screen. A library of touch gestures can represent common functions through touch movement patterns. These gestures may be context sensitive so as not to conflict with default touch tablet gestures. | 03-17-2016 |
20160077705 | SELECTIVE SHARING OF DISPLAYED CONTENT IN A VIEW PRESENTED ON A TOUCHSCREEN OF A PROCESSING SYSTEM - Arrangements described herein relate to sharing a view presented on a touchscreen of a processing system. Whether a show gesture state is enabled on the processing system and whether a gesture event gate is open on the processing system are determined. The show gesture state determines whether a gesture detected by the touchscreen is depicted onto a version of the view shared with another processing system. The gesture event gate determines whether a corresponding gesture event is passed to an application that is active in the view. | 03-17-2016 |
20160085311 | CONTROL UNIT AND METHOD OF INTERACTING WITH A GRAPHICAL USER INTERFACE - A control unit includes an electric field sensor and a motion sensor. Signals from the electric field sensor and the motion sensor are interpreted to generate corresponding graphical user interface control signals for a graphical user interface displayed by a separate electronic device. The graphical user interface control signals include movement control signals for a moveable element of the graphical user interface that correspond to movement of the control unit and a select control signal that corresponds to detection of a change in electric field sensed by the electric field sensor that is indicative of a change in a physical configuration of a user's body part. | 03-24-2016 |
20160085438 | MULTI-FINGER TOUCHPAD GESTURES - A multi-finger touchpad gesture refers to a movement of multiple fingers in a particular pattern across a touchpad. The touchpad senses the multiple fingers, and based on the sensed finger locations and finger movements, one of multiple multi-finger touchpad gestures is detected. A user interface being presented on a display is altered as appropriate in response to the detected multi-finger touchpad gesture. Various different multi-finger touchpad gestures can be detected. The multi-finger touchpad gestures can include a gesture that traverses different hierarchical views of the operating system user interface, a gesture that switches between two recent windows, a gesture that traverses a back stack of windows, a gesture that displays a window selection view and selects a particular window, and a gesture that moves a window to a different location (including snapping a window to an edge or corner of the display). | 03-24-2016 |
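Detection of the multi-finger touchpad gestures this entry lists can be sketched as classifying a (finger count, dominant direction) pair against a lookup table. The table contents and names below are illustrative assumptions, not the patent's mapping.

```python
# Hedged sketch: sense finger locations at gesture start and end, derive a
# dominant direction, and look up the matching multi-finger gesture.
GESTURE_TABLE = {
    (3, "up"): "show_window_selection_view",
    (3, "down"): "switch_to_recent_window",
    (3, "left"): "traverse_back_stack",
    (3, "right"): "traverse_back_stack",
    (4, "up"): "traverse_hierarchical_views",
}

def classify(starts, ends):
    # Average finger displacement gives the dominant movement direction.
    n = len(starts)
    dx = sum(e[0] - s[0] for s, e in zip(starts, ends)) / n
    dy = sum(e[1] - s[1] for s, e in zip(starts, ends)) / n
    if abs(dx) >= abs(dy):
        direction = "right" if dx >= 0 else "left"
    else:
        direction = "down" if dy >= 0 else "up"
    return GESTURE_TABLE.get((n, direction), "none")
```

Three fingers swept upward map to the window-selection view in this assumed table; an unlisted pair falls through to "none".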
20160092099 | Apparatus Equipped with a Touchscreen and Method for Controlling Such an Apparatus - In one embodiment, an apparatus comprises a touchscreen and a control device for executing an operating command which is input on the user side via the touchscreen. The control device is adapted to control the touchscreen such that, in order to prompt a command input, a first graphic element marking a touching starting location and a destination graphic arrangement having at least one second graphic element at a distance from the first graphic element are displayed on the touchscreen. The control device is further adapted to recognise a command input upon detecting a first movement of at least one object over the touchscreen, starting from the touching starting location and continuing up to a second graphic element of the destination graphic arrangement, followed on reaching this second graphic element by a movement stoppage of the object and then a follow-up action of the object on the touchscreen. | 03-31-2016 |
20160092100 | INFORMATION SEARCH - An example information search method includes: obtaining a plurality of paths on a screen of a mobile terminal generated by sliding two or more fingers on the screen, in which one finger corresponds to one path; determining whether the plurality of paths is consistent; if the determining result is positive, generating a search interface calling signal; and displaying a search interface based on the search interface calling signal. Thus, the techniques of the present disclosure enhance information search efficiency. | 03-31-2016 |
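The consistency determination this entry describes can be sketched by reducing each finger's path to a direction and requiring all directions to agree before generating the search-interface calling signal. The angular tolerance and function names are illustrative assumptions.

```python
# Sketch: one path per finger; the search interface signal fires only
# when all sliding paths are consistent (point the same way).
import math

def path_direction(path):
    (x0, y0), (x1, y1) = path[0], path[-1]
    return math.atan2(y1 - y0, x1 - x0)

def paths_consistent(paths, tolerance_rad=0.3):
    # All finger paths must agree in direction, within tolerance.
    dirs = [path_direction(p) for p in paths]
    return all(abs(d - dirs[0]) <= tolerance_rad for d in dirs)

def maybe_open_search(paths):
    # A positive determination generates the search-interface calling signal.
    return "show_search_interface" if paths_consistent(paths) else None
```

Two near-parallel downward slides open the search interface; a downward slide paired with a sideways slide does not.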
20160109952 | Method of Controlling Operating Interface of Display Device by User's Motion - The invention relates to a method of controlling an operating interface of a display device by the user's motion. The operating interface is displayed on a screen. The operating interface comprises columns extending along a horizontal direction and layers extending along a depth direction. The method comprises the steps of: displaying the operating interface on the screen of the display device; detecting the user's motion with the display device; the user moving a hand to an initial position; and mapping the user's motion to the operating interface by the display device. When the user moves the hand left or right or rotates the eyes left or right along the horizontal direction of the screen, the operating interface accordingly switches the columns left or right. When the user moves the hand forward or backward along the depth direction of the screen, the operating interface accordingly switches the layers forward or backward. | 04-21-2016 |
20160110053 | Drawing Support Tool - A graphical user interface displays a shape. Further, a buffer region that is adjacent to an edge of the shape is displayed at the graphical user interface. In addition, a set of drawing data located within the buffer region is received from a user input device. A first subset of the drawing data that is located in the buffer region at a predetermined distance from the edge and a second subset of the drawing data in the buffer region that is located at a distance from the edge that exceeds the predetermined distance are determined with a processor. Further, the first subset of drawing data is displayed. In addition, the second subset of drawing data is prevented from being displayed at the distance. The process also displays the second subset of drawing data at the predetermined distance. | 04-21-2016 |
20160110093 | METHOD OF PERFORMING ONE OR MORE OPERATIONS BASED ON A GESTURE - A method for performing an action in an electronic device is provided. The method includes detecting a gesture performed on a first edge and a second edge of the electronic device. Further, the method includes computing a length of a movement of the gesture, and performing an action corresponding to an item in the electronic device based on the length of the movement. | 04-21-2016 |
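Acting on the computed length of an edge gesture, as this entry describes, can be sketched as a path-length computation followed by a length-based action choice. The short/long cutoff and action names are illustrative assumptions.

```python
# Sketch: compute the length of movement of a gesture along the device
# edges, then perform an action for an item based on that length.
def movement_length(points):
    # Total path length of the gesture across its sampled points.
    return sum(
        ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        for (x0, y0), (x1, y1) in zip(points, points[1:])
    )

def action_for_item(points, short_max=50.0):
    # Assumed mapping: a short movement previews the item, a long one opens it.
    return "preview_item" if movement_length(points) <= short_max else "open_item"
```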
20160116986 | SYSTEMS AND METHODS FOR PROVIDING DIRECT AND INDIRECT NAVIGATION MODES FOR TOUCHSCREEN DEVICES - The described embodiments relate generally to systems and methods for providing direct and indirect navigation modes on mobile devices comprising a touch screen display based on a determined characteristic of the mobile device. In example embodiments, upon detecting a first characteristic, touch screen input is interpreted as direct navigation input. Upon detecting that the characteristic of the mobile device has changed, touch screen input may be interpreted as indirect navigation input. The touch screen display of the mobile device may also be reconfigured as a result of determining the change in the characteristic of the mobile device. | 04-28-2016 |
20160117823 | Imaging System and Method for Use in Surgical and Interventional Medical Procedures - A system and method for displaying images of internal anatomy includes an image processing device configured to provide high resolution images of the surgical field from low resolution scans during the procedure. The image processing device digitally manipulates a previously-obtained high resolution baseline image to produce many representative images based on permutations of movement of the baseline image. During the procedure a representative image is selected having an acceptable degree of correlation to the new low resolution image. The selected representative image and the new image are merged to provide a higher resolution image of the surgical field. The image processing device is also configured to provide interactive movement of the displayed image based on movement of the imaging device, and to permit placement of annotations on the displayed image to facilitate communication between the radiology technician and the surgeon. | 04-28-2016 |
20160124512 | GESTURE RECOGNITION USING GESTURE ELEMENTS - Aspects of the present disclosure provide a gesture recognition method and an apparatus for capturing gesture. The apparatus categorizes the raw data of a gesture into gesture elements, and utilizes the contextual dependency between the gesture elements to perform gesture recognition with a high degree of accuracy and small data size. A gesture may be formed by a sequence of one or more gesture elements. | 05-05-2016 |
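The gesture-element idea this entry describes, raw data categorized into a small set of elements and a gesture formed by their sequence, can be sketched by quantizing motion samples into a four-letter alphabet. The alphabet, collapsing rule, and gesture dictionary below are illustrative assumptions.

```python
# Hedged sketch: categorize raw (dx, dy) samples into gesture elements,
# then recognize a gesture as a sequence of those elements.
def to_elements(deltas):
    # Quantize each raw motion sample into one of four elements: L/R/U/D.
    elements = []
    for dx, dy in deltas:
        if abs(dx) >= abs(dy):
            elements.append("R" if dx >= 0 else "L")
        else:
            elements.append("D" if dy >= 0 else "U")
    # Collapse repeats: the element sequence, not the sample count, matters.
    return [e for i, e in enumerate(elements) if i == 0 or e != elements[i - 1]]

GESTURES = {
    ("R", "D"): "corner_swipe",
    ("U",): "swipe_up",
    ("L", "R", "L"): "shake",
}

def recognize_element_sequence(deltas):
    return GESTURES.get(tuple(to_elements(deltas)), None)
```

Encoding as elements keeps the data size small: a long raw trace reduces to a handful of symbols before matching.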
20160124513 | Human-to-Computer Natural Three-Dimensional Hand Gesture Based Navigation Method - Described herein is a method for enabling human-to-computer three-dimensional hand gesture-based natural interactions. From depth images provided by a range finding imaging system, the method enables efficient and robust detection of a particular sequence of natural gestures, including a beginning (start) and an ending (stop) of a predetermined type of natural gesture, for delimiting the period during which a control (interaction) gesture is operating in an environment wherein a user is freely moving his hands. The invention is more particularly, although not exclusively, concerned with detection, without any false positives or delay, of an intentionally performed natural gesture subsequent to a starting finger-tip or hand-tip based natural gesture, so as to provide efficient and robust navigation, zooming and scrolling interactions within a graphical user interface, up until the ending finger-tip or hand-tip based natural gesture is detected. | 05-05-2016 |
20160132123 | METHOD AND APPARATUS FOR INTERACTION MODE DETERMINATION - A method is disclosed comprising causing display of content information on an apparatus; determining that at least one reduced interaction criterion has been satisfied to enter a reduced interaction mode; determining that a hand is gripping the apparatus in a manner that is consistent with a use grip; and causing display of at least part of the content information based, at least in part, on the determination that the hand is gripping the apparatus in a manner consistent with the use grip. | 05-12-2016 |
20160132207 | MOBILE TERMINAL AND CONTROL METHOD THEREOF - The present disclosure relates to a mobile terminal capable of carrying out time counting to execute a control function, and a control method thereof. The mobile terminal includes a display unit, and a controller configured to continuously display counting information for guiding the time counting on at least a part of an edge area of a display unit for a predetermined time when the time counting is executed in association with a terminal operation, and execute a control function associated with the terminal operation after a lapse of the predetermined time. | 05-12-2016 |
20160132208 | Touch Operation Processing Method and Terminal Device - A touch operation processing method and a terminal device. The method includes: detecting a touch operation of a user, which starts from a border of a screen display area to the screen display area, and using the first point touched by the touch operation in the screen display area as a starting point; and performing, according to the touch operation, reduction processing on an operation interface displayed in the screen display area, where one edge of the operation interface after the reduction processing includes the starting point. This meets the demand for a user holding a large-screen terminal device with one hand to trigger reduction processing on the operation interface and perform a selection operation at an arbitrary position in the entire screen display area. | 05-12-2016 |
20160132210 | MOBILE ELECTRONIC DEVICE - A mobile electronic device and method are presented. An input operation by a user is received, functions are stored, a remaining battery capacity is measured, and a first image is displayed indicating the remaining battery capacity and full capacity. Battery blocks are set by dividing the full capacity if an input is entered while the first image is displayed, providing a battery-block setup. A capacity ratio is calculated based on the capacity of each of the battery blocks and the full capacity, and a remaining battery block capacity of the battery blocks is calculated based on the capacity ratio. A function is allocated to the battery blocks, and second images are displayed indicating the capacity ratio and the remaining battery block capacity. When the battery blocks are set, the function allocated to the battery blocks is displayed with the battery block images. | 05-12-2016 |
20160147307 | USER INTERFACE DEVICE, USER INTERFACE METHOD, PROGRAM, AND COMPUTER-READABLE INFORMATION STORAGE MEDIUM - To allow for easy entry of a plurality of characters by handwriting gestures in the air, a user interface device includes template data storage means for storing template data indicating changes in a predetermined writing position when a gesture to write each of a plurality of characters in the air is made, position obtaining means for sequentially obtaining the predetermined writing position when a user makes gestures to sequentially write characters in the air, similarity evaluation information output means, every time the predetermined writing position is obtained by the position obtaining means, for sequentially outputting similarity evaluation information indicating a similarity between data to be evaluated including a predetermined number of the predetermined writing positions taken in order from newly obtained data and the template data related to each of the plurality of characters, and character string determination means for determining a character string related to the gestures of the user based on the sequentially output similarity evaluation information related to each of the plurality of characters. | 05-26-2016 |
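The sliding-window template evaluation this entry describes can be sketched as follows: every newly obtained writing position extends the trace, the newest N positions form the data to be evaluated, and a character is emitted when its template's similarity crosses a threshold. The distance metric, window length, and threshold are illustrative assumptions.

```python
# Hedged sketch: evaluate each character template against the newest
# window of writing positions after every new sample.
def similarity(window, template):
    # Inverse mean point-to-point distance between window and template.
    d = sum(
        ((wx - tx) ** 2 + (wy - ty) ** 2) ** 0.5
        for (wx, wy), (tx, ty) in zip(window, template)
    ) / len(template)
    return 1.0 / (1.0 + d)

def recognize_stream(positions, templates, window_len=4, threshold=0.5):
    # Sequentially output matches as positions arrive, building the string.
    out = []
    for i in range(window_len, len(positions) + 1):
        window = positions[i - window_len:i]
        for char, tpl in templates.items():
            if similarity(window, tpl) >= threshold:
                out.append(char)
    return out
```

A real determination step would also suppress overlapping matches; this sketch only shows the per-sample similarity evaluation.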
20160147338 | IMAGE DISPLAY SYSTEM AND INPUT DEVICE - An image display system using an electronic apparatus includes a capacitive detection-type touch screen panel as an image display unit. The touch screen panel (touch panel) detects a plurality of touch positions. The system includes a control unit that changes a form of an object image displayed on the image display unit based on a change of set parameters or replaces the object image with another image based on the change of the set parameters. The parameters are changed by an input device having a plurality of detected portions detected by the capacitive detection-type touch screen panel. | 05-26-2016 |
20160147401 | ELECTRONIC APPARATUS AND A METHOD FOR DISPLAYING A SCREEN OF THE ELECTRONIC APPARATUS - An electronic device and a method of displaying a screen of the electronic device are provided. The method includes detecting a user gesture for unlocking a sleep state, determining a screen display direction based on the detected user gesture, and displaying the screen based on the determined screen display direction. | 05-26-2016 |
20160147406 | METHOD FOR PROVIDING GRAPHICAL USER INTERFACE AND ELECTRONIC DEVICE FOR SUPPORTING THE SAME - An electronic device, according to certain embodiments of the present disclosure, includes: a display module that displays a plurality of image items; and a processor that, when a swipe gesture input with respect to a specific image item among the plurality of image items is detected, controls the display module to display high level items or low level items of the specific image item. Other embodiments are provided. | 05-26-2016 |
20160147435 | HYBRIDIZATION OF VOICE NOTES AND CALLING - A system and method for receiving a user interaction with a user interface of a client device, determining a current communication mode and a desired communication mode, where the desired communication mode is determined based on the user interaction received by the sensor module. The system further sets the desired communication mode as the current communication mode, and causes presentation of a user interface of the client device based on the desired communication mode being set as the current communication mode. | 05-26-2016 |
20160147439 | Supporting Different Event Models using a Single Input Source - In at least some embodiments, input provided by a single source generates events representing multiple source types through a mapping process, e.g. a touch input generates both touch and mouse events. By configuring the system to not recognize certain gestures, messages associated with the events of the different source types are then interleaved and provided to an associated application for processing. Efficiencies are gained by configuring the system to interleave the messages associated with the source types because messages of one source type can be processed sooner than if the messages of the one source type were queued up and sent in a non-interleaved fashion. | 05-26-2016 |
20160154560 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND A PROGRAM FOR INFORMATION PROCESSING | 06-02-2016 |
20160162148 | APPLICATION LAUNCHING AND SWITCHING INTERFACE - Techniques for application launching and switching are provided. An example method includes receiving an interactive gesture at a computing device, when the interactive gesture matches a predefined gesture, determining a current context of the computing device based at least on one or more tasks, the tasks including previously performed tasks at the computing device or predicted future tasks to be performed at the computing device, based on the determined context, identifying one or more software applications, the software applications including executing applications, terminated applications or uninstalled applications, to perform the one or more tasks, displaying one or more user interface elements representing the software applications, where the user interface elements are selectable to instantiate the identified software applications. | 06-09-2016 |
20160162176 | Method, Device, System and Non-transitory Computer-readable Recording Medium for Providing User Interface - According to one aspect of the present invention, there is provided a method for providing a user interface, comprising the steps of: acquiring information on a trace of a user operation inputted to a device; and controlling a reference coordinate system applied to a user interface provided in the device, with reference to a relative relationship between a first direction specified by the trace of the user operation and a second direction specified by the reference coordinate system. | 06-09-2016 |
20160162177 | METHOD OF PROCESSING INPUT AND ELECTRONIC DEVICE THEREOF - An electronic device, and a method of an electronic device, are provided. The method includes entering a first input mode that receives input with a first input means and a second input mode that receives input with a second input means; receiving a touch input from the first input means; and performing a predetermined function corresponding to the touch input. The function corresponding to the touch input is different based on whether the electronic device is in the first input mode or the second input mode. | 06-09-2016 |
20160179209 | GESTURE INPUT WITH MULTIPLE VIEWS, DISPLAYS AND PHYSICS | 06-23-2016 |
20160179328 | MOBILE TERMINAL AND METHOD OF CONTROLLING CONTENT THEREOF | 06-23-2016 |
20160179329 | USER INPUT | 06-23-2016 |
20160179332 | METHOD AND APPARATUS FOR PROVIDING USER INTERFACE OF PORTABLE DEVICE | 06-23-2016 |
20160179333 | SYSTEM AND METHOD FOR CHANGING THE STATE OF USER INTERFACE ELEMENT MARKED ON PHYSICAL OBJECTS | 06-23-2016 |
20160179334 | DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR CONFIGURING AND IMPLEMENTING RESTRICTED INTERACTIONS FOR APPLICATIONS | 06-23-2016 |
20160179364 | DISAMBIGUATING INK STROKES AND GESTURE INPUTS | 06-23-2016 |
20160179366 | TOUCH INPUT DEVICE AND VEHICLE INCLUDING THE SAME | 06-23-2016 |
20160188151 | Information Processing Method And Electronic Device - An information processing method and an electronic device are provided. The information processing method is applied to an electronic device including a touch screen with a multi-touch function and a display unit. The information processing method includes: acquiring a touch operation in the case that a first display interface corresponding to a first content is displayed on the display unit; acquiring position information of the touch operation; and determining a first partial content in a graphic region as a first to-be-projected content according to the position information, where the first partial content is a part of the first content. With the information processing method, the projection region is quickly determined based on a requirement of a user. | 06-30-2016 |
20160188198 | Method for the Determination of a Speed Vector for a Gesture Recognition System in a Motor Vehicle and Device for the Implementation of the Method - A method for the determination of a speed vector for a gesture recognition system in a motor vehicle is disclosed. The method includes detection of movement data by a detection unit, and transmission of the movement data to a processing unit in the form of vectors. A totalling of the vectors to form a total vector occurs using the processing unit, until either the total vector reaches a predetermined minimum length or a predetermined number of vectors is totalled. If the predetermined number of vectors is totalled and the predetermined minimum length of the total vector is not reached, a determination of the speed vector as a zero vector occurs. If the predetermined minimum length of the total vector is reached, a determination of the speed vector occurs by averaging the information contained in the total vector. An accurate and dynamic determination of the speed vector is thereby achieved. | 06-30-2016 |
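The vector-totalling procedure in publication 20160188198 above can be sketched as follows. This is a minimal illustration under assumed names and thresholds (`min_length` and `max_count` are hypothetical parameters), not the patented implementation:

```python
def speed_vector(vectors, min_length=5.0, max_count=8):
    """Total 2-D movement vectors until the total vector reaches
    min_length, or until max_count vectors have been totalled.

    Returns the speed vector obtained by averaging the total vector,
    or the zero vector if the minimum length is never reached.
    """
    total = (0.0, 0.0)
    for n, (dx, dy) in enumerate(vectors[:max_count], start=1):
        total = (total[0] + dx, total[1] + dy)
        length = (total[0] ** 2 + total[1] ** 2) ** 0.5
        if length >= min_length:
            # Average the information contained in the total vector
            # over the n vectors totalled so far.
            return (total[0] / n, total[1] / n)
    # Predetermined number totalled without reaching the minimum
    # length: the speed vector is determined as the zero vector.
    return (0.0, 0.0)
```

With `min_length=5.0`, two samples of `(3, 0)` total to `(6, 0)` and yield the averaged speed vector `(3.0, 0.0)`, while a run of tiny samples never reaches the minimum length and yields the zero vector.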
20160188202 | DEVICES, METHODS, AND GRAPHICAL USER INTERFACES FOR PROVIDING CONTROL OF A TOUCH-BASED USER INTERFACE ABSENT PHYSICAL TOUCH CAPABILITIES - An electronic device with a display and a touch-sensitive surface displays, on the display, a first visual indicator that corresponds to a virtual touch. The device receives a first input from an adaptive input device. In response to receiving the first input from the adaptive input device, the device displays a first menu on the display. The first menu includes a virtual touches selection icon. In response to detecting selection of the virtual touches selection icon, a menu of virtual multitouch contacts is displayed. | 06-30-2016 |
20160196028 | PORTABLE ELECTRONIC DEVICE HAVING TOUCH-SENSITIVE DISPLAY WITH VARIABLE REPEAT RATE | 07-07-2016 |
20160196032 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND A NON-TRANSITORY STORAGE MEDIUM | 07-07-2016 |
20160196033 | Method for Displaying Interface Content and User Equipment | 07-07-2016 |
20160198054 | DATA PROCESSING APPARATUS | 07-07-2016 |
20160202769 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND RECORDING MEDIUM | 07-14-2016 |
20160253090 | METHOD AND SYSTEM FOR INK DATA GENERATION, INK DATA RENDERING, INK DATA MANIPULATION AND INK DATA COMMUNICATION | 09-01-2016 |
20160378198 | GENERAL SPATIAL-GESTURE GRAMMAR USER INTERFACE FOR TOUCHSCREENS, HIGH DIMENSIONAL TOUCH PAD (HDTP), FREE-SPACE CAMERA, AND OTHER USER INTERFACES - A system for a spatial-gesture user interface employing grammatical rules at various levels. Various distinct subsets of the gestemes can be concatenated in space and time to construct distinct gestures. Real-time spatial-gesture information measured by a spatial-gesture user interface is processed into at least a recognized sequence of specific gestemes and an indication that the user's execution of a gesture has been completed. The specific gesture rendered by the user is recognized according to the sequence of gestemes. Many additional features are then provided from this foundation, including gesture grammars, structured-meaning gesture lexicons, imposed interpretations, context, and the use of gesture-rendering prosody. The invention can be used to provide a very general spatial-gesture grammar user interface for touchscreens, high dimensional touch pads (HDTP), free-space cameras, and other user interfaces. | 12-29-2016 |
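The gesteme-to-gesture mapping described in publication 20160378198 above can be illustrated with a small sketch. The lexicon entries and gesteme names here are hypothetical, and a real system would recognize gestemes from measured spatial data rather than receive them as strings:

```python
# Hypothetical gesture lexicon: each gesture corresponds to a distinct
# sequence of gestemes (primitive gesture elements) concatenated in time.
GESTURE_LEXICON = {
    ("down", "right"): "L-shape",
    ("right", "down", "left"): "bracket",
}

def recognize_gesture(gesteme_sequence):
    """Look up a completed gesteme sequence in the lexicon.

    Returns the gesture name, or None if the sequence does not
    render any gesture in the lexicon.
    """
    return GESTURE_LEXICON.get(tuple(gesteme_sequence))
```

Higher-level grammatical rules (gesture grammars, imposed interpretations, prosody) would then operate on the recognized gesture names.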
20160378293 | SYSTEM AND METHODS FOR TOUCH TARGET PRESENTATION - The present disclosure relates to user interfaces and in particular to providing a touch target for device interaction. In one embodiment, a process for providing a touch target includes displaying, by a device, a touch target on a display of the device, wherein the touch target is displayed in a first position, and detecting an input touch command to the display of the device. The process can also include positioning the touch target based on the input touch command, wherein the touch target is moved to a second position and controlling device operation based on position of the touch target. The processes and devices described herein may be configured to project contact position of a contact to portions of the display. | 12-29-2016 |
20160378327 | PATH GESTURES - A system includes receiving a start of a path gesture and determining, via a processor, a decision point along the path gesture. At the decision point, a first command associated with a first dimension is displayed. In addition, at the decision point, a second command associated with a second dimension is displayed. | 12-29-2016 |
20160378328 | INFERRING INSIGHTS FROM ENHANCED USER INPUT - A method and associated systems for inferring insights from enhanced user input. A computerized messaging system identifies a user operating a mobile, location-enabled, “scribble” device and associates the user or the device with a domain of interest or with other metadata that characterizes the user. When the user enters an ad hoc “scribble” input via the device, the system automatically tags the input with the user's location, with sensory data received from one or more sensor devices, and with ancillary data received from extrinsic data repositories. The system may then consider this ancillary and sensory data in order to identify or infer rules or insights associated with the user and the scribble. These rules may then be used to identify targeted, user-specific steps to perform in response to receiving the scribble, where these user-specific steps accommodate a user preference without exposing confidential user information to a public data repository. | 12-29-2016 |
20160378330 | IMAGE FORMING APPARATUS AND STORAGE MEDIUM - An image forming apparatus includes: a detection unit that detects a pinch operation that is an operation of increasing or decreasing a distance between two touch positions on a touch panel; an adjustment unit that, upon reception of setting of a size of a non-standard size sheet, adjusts the set size in accordance with the pinch operation; an angle specification unit that specifies, as a pinch operation angle resulting from the pinch operation, an angle formed by a straight line connecting the two touch positions and one of two orthogonal coordinate axes of a coordinate system of the touch panel; and an adjustment amount change unit that changes an adjustment amount for the set size per operation in accordance with whether the pinch operation angle falls within a first angular range or a second angular range that does not overlap the first angular range. | 12-29-2016 |
20160378331 | Copy and Staple Gestures - Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the amount of gestures that are made available to initiate operations of a computing device. | 12-29-2016 |
20170235479 | EXECUTING A DEFAULT ACTION ON A TOUCHSCREEN DEVICE | 08-17-2017 |
20170235480 | INPUT APPARATUS, DISPLAY APPARATUS AND CONTROL METHOD THEREOF | 08-17-2017 |
20180022217 | METHOD FOR DRIVING AN OPERATING ARRANGEMENT FOR A MOTOR VEHICLE WITH PROVISION OF TWO OPERATING MODES FOR AN OPERATING ELEMENT, OPERATING ARRANGEMENT AND MOTOR VEHICLE | 01-25-2018 |
20180024728 | Terminal and Terminal Wallpaper Control Method | 01-25-2018 |
20190146576 | IMPLEMENTING A CUSTOMIZED INTERACTION PATTERN FOR A DEVICE | 05-16-2019 |
20190146653 | IMAGE DISPLAY SYSTEM, AND CONTROL APPARATUS FOR HEAD-MOUNTED DISPLAY AND OPERATION METHOD THEREFOR | 05-16-2019 |
20220137810 | HYBRIDIZATION OF VOICE NOTES AND CALLING - A system and method for receiving a user interaction with a user interface of a client device, determining a current communication mode and a desired communication mode, where the desired communication mode is determined based on the user interaction received by the sensor module. The system further sets the desired communication mode as the current communication mode, and causes presentation of a user interface of the client device based on the desired communication mode being set as the current communication mode. | 05-05-2022 |