Patent application number | Description | Published |
20110216002 | Calibration of Portable Devices in a Shared Virtual Space - Methods, systems, and computer programs for generating an interactive space viewable through at least a first and a second device are presented. The method includes an operation for detecting, from the first device, a location of the second device, or vice versa. Further, synchronization data is exchanged between the first and the second device to identify a reference point in a three-dimensional (3D) space relative to the physical location of the devices in the 3D space. The devices establish the physical location in the 3D space of the other device when setting the reference point. The method further includes an operation for generating views of an interactive scene in the displays of the first and second devices. The interactive scene is tied to the reference point and includes virtual objects. The view in the display shows the interactive scene as observed from the current location of the corresponding device. Moving the device in the 3D space causes the view to change according to the perspective from the current location. | 09-08-2011 |
20110216060 | Maintaining Multiple Views on a Shared Stable Virtual Space - Methods, apparatus, and computer programs for controlling a view of a virtual scene with a portable device are presented. In one method, a signal is received and the portable device is synchronized to make the location of the portable device a reference point in a three-dimensional (3D) space. A virtual scene, which includes virtual reality elements, is generated in the 3D space around the reference point. Further, the method determines the current position in the 3D space of the portable device with respect to the reference point and creates a view of the virtual scene. The view represents the virtual scene as seen from the current position of the portable device and with a viewing angle based on the current position of the portable device. Additionally, the created view is displayed in the portable device, and the view of the virtual scene is changed as the portable device is moved by the user within the 3D space. In another method, multiple players share the virtual reality and interact with each other by viewing the objects in the virtual reality. | 09-08-2011 |
20110242134 | METHOD FOR AN AUGMENTED REALITY CHARACTER TO MAINTAIN AND EXHIBIT AWARENESS OF AN OBSERVER - Methods and systems for enabling an augmented reality character to maintain and exhibit awareness of an observer are provided. A portable device held by a user is utilized to capture an image stream of a real environment, and generate an augmented reality image stream which includes a virtual character. The augmented reality image stream is displayed on the portable device to the user. As the user maneuvers the portable device, its position and movement are continuously tracked. The virtual character is configured to demonstrate awareness of the user by, for example, adjusting its gaze so as to look in the direction of the portable device. | 10-06-2011 |
20110260830 | BIOMETRIC INTERFACE FOR A HANDHELD DEVICE - Methods and systems for applying biometric data to an interactive program executed by a portable device are provided. According to embodiments of the invention, raw bio-signal data is captured and filtered so as to determine the bio-signal of the user of the interactive program. The bio-signal is analyzed so as to determine biometrics of the user, which are applied as input to the interactive program. A setting or state of the interactive program is modified based on the biometrics. An updated state of the interactive program is rendered to the user, reflecting the modification of the setting or state of the interactive program. | 10-27-2011 |
20110281648 | PLACEMENT OF USER INFORMATION IN A GAME SPACE - The generation, association, and display of in-game tags are disclosed. Such tags introduce an additional dimension of community participation to both single and multiplayer games. Through such tags, players are empowered to communicate through filtered text messages and images as well as audio clips that other game players, including top rated players, have generated and placed at particular coordinates and/or in context of particular events within the game space. The presently described in-game tags and associated user generated content further allow for label based searches with respect to game play. | 11-17-2011 |
20110283238 | Management of Digital Information via an Interface - An interface for managing digital information is provided. Digital information including one or more digital files is stored in memory. An icon is associated with the digital information and rendered inside a translucent bubble. The bubble may be manipulated in the digital environment by a user. | 11-17-2011 |
20120072379 | Evolution of a User Interface Based on Learned Idiosyncrasies and Collected Data of a User - A user interface evolves based on learned idiosyncrasies and collected data of a user. Learned idiosyncrasies and collected data of the user can be stored in a knowledge base. Information from the surrounding environment of the user can be obtained during learning of idiosyncrasies or collection of data. Thought-based statements can be generated based at least in part on the knowledge base and the information from the environment surrounding the user during learning of idiosyncrasies or collection of data. The thought-based statements serve to invoke or respond to subsequent actions of the user. The user interface can be presented so as to allow for interaction with the user based at least in part on the thought-based statements. Furthermore, personality nuances of the user interface can be developed that affect the interaction between the user and the user interface. | 03-22-2012 |
20120072424 | Developing a Knowledge Base Associated with a User That Facilitates Evolution of an Intelligent User Interface - Developing a knowledge base associated with a user interface is disclosed. Development of the knowledge base includes cataloging local data associated with a user, collecting remote data associated with the user, recording information associated with verbal input received from the user, tracking acts performed by the user to determine user idiosyncrasies, and updating the knowledge base with the cataloged local data, the collected remote data, the recorded information, and the user idiosyncrasies. The updated knowledge base is then provided to a component of a user interface. | 03-22-2012 |
20130117201 | EVOLUTION OF A USER INTERFACE BASED ON LEARNED IDIOSYNCRASIES AND COLLECTED DATA OF A USER - A user interface evolves based on learned idiosyncrasies and collected data of a user. Learned idiosyncrasies and collected data of the user can be stored in a knowledge base. Information from the surrounding environment of the user can be obtained during learning of idiosyncrasies or collection of data. Thought-based statements can be generated based at least in part on the knowledge base and the information from the environment surrounding the user during learning of idiosyncrasies or collection of data. The thought-based statements serve to invoke or respond to subsequent actions of the user. The user interface can be presented so as to allow for interaction with the user based at least in part on the thought-based statements. Furthermore, personality nuances of the user interface can be developed that affect the interaction between the user and the user interface. | 05-09-2013 |
20140002359 | Calibration Of Portable Devices In A Shared Virtual Space | 01-02-2014 |
20140040168 | EVOLUTION OF A USER INTERFACE BASED ON LEARNED IDIOSYNCRASIES AND COLLECTED DATA OF A USER - A user interface evolves based on learned idiosyncrasies and collected data of a user. Learned idiosyncrasies and collected data of the user can be stored in a knowledge base. Information from the surrounding environment of the user can be obtained during learning of idiosyncrasies or collection of data. Thought-based statements can be generated based at least in part on the knowledge base and the information from the environment surrounding the user during learning of idiosyncrasies or collection of data. The thought-based statements serve to invoke or respond to subsequent actions of the user. The user interface can be presented so as to allow for interaction with the user based at least in part on the thought-based statements. Furthermore, personality nuances of the user interface can be developed that affect the interaction between the user and the user interface. | 02-06-2014 |
20140232652 | CALIBRATION OF PORTABLE DEVICES IN A SHARED VIRTUAL SPACE - Methods, systems, and computer programs are provided for generating an interactive space. One method includes operations for associating a first device to a reference point in 3D space, and for calculating by the first device a position of the first device in the 3D space based on inertial information captured by the first device and utilizing dead reckoning. Further, the method includes operations for capturing images with a camera of the first device, and for identifying locations of one or more static features in the images. The position of the first device is corrected based on the identified locations of the one or more static features, and a view of an interactive scene is presented in a display of the first device, where the interactive scene is tied to the reference point and includes virtual objects. | 08-21-2014 |
20140235311 | MAINTAINING MULTIPLE VIEWS ON A SHARED STABLE VIRTUAL SPACE - Methods, apparatus, and computer programs for controlling a view of a virtual scene with a handheld device are presented. In one method, images of a real world scene are captured using a device. The method further includes operations for creating an augmented view for presentation on a display of the device by augmenting the images with virtual reality objects, and for detecting a hand in the images as extending into the real world scene. In addition, the method includes operations for showing the hand in the screen as detected in the images, and for generating interaction data, based on an interaction of the hand with a virtual reality object, when the hand makes virtual contact in the augmented view with the virtual reality object. The augmented view is updated based on the interaction data, which simulates on the screen that the hand is interacting with the virtual reality object. | 08-21-2014 |
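Several of the applications above (notably 20140232652) describe tracking a device's position relative to a shared reference point by dead reckoning on inertial data, then correcting the accumulated drift using static features identified in camera images. A minimal sketch of that two-step scheme, assuming a simple double integration of acceleration and a fixed-gain blend toward the vision-derived position (all class, method, and parameter names here are illustrative assumptions, not taken from the patent text):

```python
# Sketch: dead-reckoning position tracking with drift correction from
# observed static features. Positions are expressed relative to the
# shared reference point established during calibration.

class PositionTracker:
    def __init__(self, correction_gain=0.2):
        self.position = [0.0, 0.0, 0.0]   # relative to the reference point
        self.velocity = [0.0, 0.0, 0.0]
        self.gain = correction_gain        # how strongly vision corrects inertia

    def integrate_inertial(self, accel, dt):
        # Dead reckoning: integrate acceleration twice per axis.
        # Error accumulates over time, motivating the correction step below.
        for i in range(3):
            self.velocity[i] += accel[i] * dt
            self.position[i] += self.velocity[i] * dt

    def correct_with_features(self, observed_position):
        # Blend the drifting inertial estimate toward the position implied
        # by the locations of static features found in the camera images.
        for i in range(3):
            error = observed_position[i] - self.position[i]
            self.position[i] += self.gain * error
```

The fixed-gain blend stands in for whatever estimator the actual implementation uses (e.g., a Kalman filter); the essential structure is the same: fast inertial updates, periodically pulled back toward a slower but drift-free visual fix.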