Patent application number | Description | Published |
20110162048 | LOCAL DEVICE AWARENESS - Certain embodiments may take the form of a method of operating an electronic device to find and determine an identity of other local devices. The method includes transmitting electromagnetic signals from a first electronic device to find devices within a prescribed distance of the first device and receiving electromagnetic response signals from a second electronic device within the prescribed distance from the first electronic device. The method also includes identifying the second electronic device using information received in the electromagnetic response signals. Additionally, the method includes determining if the second electronic device is aware of other electronic devices and, if the second electronic device is aware of other electronic devices, obtaining identifying information of the other devices from the second electronic device. | 06-30-2011 |
20110163944 | INTUITIVE, GESTURE-BASED COMMUNICATIONS WITH PHYSICS METAPHORS - A user can make an intuitive, physical gesture with a first device, which can be detected by one or more onboard motion sensors. The detected motion triggers an animation having a “physics metaphor,” where the object appears to react to forces in a real-world, physical environment. The first device detects the presence of a second device, and a communication link is established, allowing a transfer of data represented by the object to the second device. During the transfer, the first device can animate the object to simulate the object leaving the first device, and the second device can animate the object to simulate the object entering the second device. In some implementations, in response to an intuitive gesture made on a touch-sensitive surface of a first device or by physically moving the device, an object can be transferred or broadcast to other devices or a network resource based on a direction, velocity or speed of the gesture. | 07-07-2011 |
20110164029 | Working with 3D Objects - Three-dimensional objects can be generated based on two-dimensional objects. A first user input identifying a 2D object presented in a user interface can be detected, and a second user input including a 3D gesture input that includes a movement in proximity to a surface can be detected. A 3D object can be generated based on the 2D object according to the first and second user inputs, and the 3D object can be presented in the user interface. | 07-07-2011 |
20110167078 | User Interfaces for Content Categorization and Retrieval - Methods, systems, and computer-readable media for providing a scenario desktop for recording a current event scenario and a content desktop for presenting information about a previously recorded event scenario are disclosed. When a first event scenario is detected on the mobile device, the scenario desktop is presented on the mobile device. The scenario desktop exists in parallel with a default primary desktop of the mobile device. An information bundle is created for the first event scenario, including one or more documents accessed through the scenario desktop during the first event scenario. Access to the one or more documents is automatically provided on the mobile device during a second event scenario related to the first event scenario. The access is provided through the content desktop existing in parallel with the primary desktop and the scenario desktop. Other scenario-based content retrieval and presentation methods are also disclosed. | 07-07-2011 |
20110167357 | Scenario-Based Content Organization and Retrieval - Methods, systems, and computer-readable media for scenario-based content categorization, retrieval, and presentation are disclosed. At a first moment in time, a first event scenario is detected by a mobile device, where the first event scenario is defined by one or more participants and one or more contextual cues concurrently monitored by the mobile device and observable to a human user of the mobile device. An information bundle is created in real-time for the first event scenario, where the information bundle includes one or more documents accessed during the first event scenario and is retrievable according to the one or more contextual cues. Access to the one or more documents is automatically provided on the mobile device during a second event scenario that is related to the first event scenario by one or more common contextual cues. Other scenario-based content retrieval and presentation methods are also disclosed. | 07-07-2011 |
20110179368 | 3D View Of File Structure - A file structure or data hierarchy can be navigated using 3D gesture inputs. For example, objects can be arranged in a plurality of layers. A user input, including a 3D gesture input having a movement in proximity to a display surface can be detected. Different layers can be navigated in response to a movement component that is perpendicular to the display surface. | 07-21-2011 |
20110193788 | GRAPHICAL OBJECTS THAT RESPOND TO TOUCH OR MOTION INPUT - A first graphical object on a user interface of a device can be transformed to a second graphical object on the user interface. The second graphical object can be manipulated by a user on the user interface using touch input or by physically moving the device. When manipulated, the object can be animated to appear to have mass that responds to real-world, physical forces, such as gravity, friction or drag. The data represented by the second graphical object can be compressed or archived using a gesture applied to the second graphical object. Graphical objects can be visually sorted on the user interface based on their mass (size). The visual appearance of graphical objects on the user interface can be adjusted to indicate the age of data represented by the graphical objects. | 08-11-2011 |
20110197153 | Touch Inputs Interacting With User Interface Items - Techniques for managing user interactions with items on a user interface are disclosed. In one aspect, a representation of an opening is presented in response to touch input. A display object is moved over the opening, and the display object is processed in response to the moving. In another aspect, touch input pinching two opposite corners of a display object, followed by touch input flicking the display object, is received, and the display object is deleted in response to the inputs. In another aspect, touch input centered over a display object is received and the display object is deleted in response to the input. In another aspect, touch inputs corresponding to swiping gestures are received and a display object is securely deleted in response to the gestures. | 08-11-2011 |
20110222466 | DYNAMICALLY ADJUSTABLE COMMUNICATIONS SERVICES AND COMMUNICATIONS LINKS - User equipment may be used in forming communications links between users. A virtual communications channel may be maintained between users while adjustments are made to the type of communications traffic that is being conveyed over the link, the nature of the physical channels being used in the link, link bandwidth and other link attributes, user requirements, and other factors. Monitoring circuitry may be used to monitor factors such as the location of a device and other operating parameters. As circumstances dictate, the operation of the device may be adjusted in real time, while maintaining the virtual communications channel intact. User devices may automatically advertise their presence and may automatically detect nearby devices. Content items may be shared using online services. | 09-15-2011 |
20120268410 | Working with 3D Objects - Three-dimensional objects can be generated based on two-dimensional objects. A first user input identifying a 2D object presented in a user interface can be detected, and a second user input including a 3D gesture input that includes a movement in proximity to a surface can be detected. A 3D object can be generated based on the 2D object according to the first and second user inputs, and the 3D object can be presented in the user interface. | 10-25-2012 |
20140173758 | Local Device Awareness - Certain embodiments may take the form of a method of operating an electronic device to find and determine an identity of other local devices. The method includes transmitting electromagnetic signals from a first electronic device to find devices within a prescribed distance of the first device and receiving electromagnetic response signals from a second electronic device within the prescribed distance from the first electronic device. The method also includes identifying the second electronic device using information received in the electromagnetic response signals. Additionally, the method includes determining if the second electronic device is aware of other electronic devices and, if the second electronic device is aware of other electronic devices, obtaining identifying information of the other devices from the second electronic device. | 06-19-2014 |
20140351726 | GRAPHICAL OBJECTS THAT RESPOND TO TOUCH OR MOTION INPUT - A first graphical object on a user interface of a device can be transformed to a second graphical object on the user interface. The second graphical object can be manipulated by a user on the user interface using touch input or by physically moving the device. When manipulated, the object can be animated to appear to have mass that responds to real-world, physical forces, such as gravity, friction or drag. The data represented by the second graphical object can be compressed or archived using a gesture applied to the second graphical object. Graphical objects can be visually sorted on the user interface based on their mass (size). The visual appearance of graphical objects on the user interface can be adjusted to indicate the age of data represented by the graphical objects. | 11-27-2014 |
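The discovery flow described in the "Local Device Awareness" abstracts (20110162048 and 20140173758) can be sketched in a few lines: a first device probes for devices in range, identifies a responder from its response, then asks that responder for the identities of any further devices it is already aware of. This is a minimal in-memory simulation; every class and method name here is an illustrative assumption, not taken from the filings.

```python
# Hypothetical sketch of transitive device discovery: probe nearby devices,
# identify each responder, and collect the identities of devices the
# responder already knows about. Names are illustrative, not from the patent.

class Device:
    def __init__(self, name):
        self.name = name
        self.known_devices = {}  # name -> Device: peers this device is aware of

    def respond_to_probe(self):
        """Stands in for the electromagnetic response carrying identifying info."""
        return {"identity": self.name, "aware_of_others": bool(self.known_devices)}

    def share_known_devices(self):
        """Return the identities of devices this device is aware of."""
        return list(self.known_devices)

def discover(first, devices_in_range):
    """Probe devices in range of `first` and transitively collect identities."""
    found = {}
    for second in devices_in_range:
        response = second.respond_to_probe()
        found[response["identity"]] = second
        # If the responder is aware of other devices, obtain their identities too.
        if response["aware_of_others"]:
            for name in second.share_known_devices():
                found.setdefault(name, second.known_devices.get(name))
    return found
```

For example, if device A probes and reaches only B, but B is already aware of C, `discover` returns identities for both B and C, mirroring the "obtaining identifying information of the other devices" step in the abstract.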
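The "3D View Of File Structure" abstract (20110179368) navigates layers using only the movement component perpendicular to the display surface. A minimal sketch of that mapping, assuming a made-up spacing constant that converts perpendicular travel into layer steps:

```python
# Illustrative sketch: only the perpendicular (z) component of a 3D gesture
# changes which layer is shown; in-plane movement is ignored. The spacing
# constant and function names are assumptions, not from the filing.

LAYER_SPACING_MM = 20.0  # assumed z-distance that maps to one layer step

def layer_for_gesture(start_z_mm, end_z_mm, current_layer, num_layers):
    """Map the perpendicular component of a gesture to a clamped layer index."""
    dz = end_z_mm - start_z_mm           # movement toward/away from the surface
    steps = int(dz / LAYER_SPACING_MM)   # whole layer steps; partial moves ignored
    return max(0, min(num_layers - 1, current_layer + steps))
```

Moving a hand 45 mm away from the surface would advance two layers, while any amount of sideways movement leaves the layer unchanged.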
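The gesture-based transfer in 20110163944 sends an object "based on a direction, velocity or speed of the gesture." One plausible reading, sketched below under stated assumptions: a flick below a minimum speed is ignored, and otherwise the object goes to the nearby device whose bearing best matches the flick direction. The threshold, the cosine-similarity choice, and all names are hypothetical.

```python
# Hedged sketch of choosing a transfer target from gesture kinematics: send
# the object toward whichever nearby device best matches the flick direction,
# provided the flick is fast enough. All constants and names are assumptions.

import math

MIN_SPEED = 0.5  # assumed minimum gesture speed (m/s) to count as a "send"

def pick_target(gesture_vector, speed, nearby):
    """Return the name of the device whose bearing best matches the gesture.

    gesture_vector: (dx, dy) of the flick on the touch surface
    nearby: dict mapping device name -> (dx, dy) bearing toward that device
    """
    if speed < MIN_SPEED:
        return None  # too slow: treat as an ordinary drag, not a transfer
    gx, gy = gesture_vector
    norm = math.hypot(gx, gy)
    if norm == 0:
        return None  # no direction to match
    best, best_cos = None, -1.0
    for name, (bx, by) in nearby.items():
        bnorm = math.hypot(bx, by)
        cos = (gx * bx + gy * by) / (norm * bnorm)  # cosine similarity
        if cos > best_cos:
            best, best_cos = name, cos
    return best
```

A rightward flick with two candidate devices to the left and right would select the right-hand device; the same flick below the speed threshold selects nothing.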