Patent application number | Description | Published |
20150185599 | AUDIO BASED ON CAPTURED IMAGE DATA OF VISUAL CONTENT - Techniques of providing audio based on visual content are disclosed. In some embodiments, image data of visual content is received. The image data has been captured by a computing device. Audio data is determined based on the received image data, and the corresponding audio of the audio data is then caused to be played on the computing device. Determining the audio data may comprise identifying the received image data based on one or more characteristics of the received image data, and determining the audio data based on the identification of the received image data. The received image data can comprise video or still pictures. The audio of the audio data can comprise a song or a voice recording. The computing device can comprise one of a smart phone, a tablet computer, a wearable computing device, a vehicle computing device, a laptop computer, and a desktop computer. | 07-02-2015 |
20150187108 | AUGMENTED REALITY CONTENT ADAPTED TO CHANGES IN REAL WORLD SPACE GEOMETRY - A system and method for augmented reality content adapted to changes in real world space geometry are described. A device captures an image of a local environment and maps a real world space geometry of the local environment using the image of the local environment. The device generates a visualization of a virtual object in the display relative to the mapped real world space geometry of the local environment. The content of the virtual object is adjusted in response to changes in the real world space geometry of the local environment. | 07-02-2015 |
20150187137 | PHYSICAL OBJECT DISCOVERY - A system and method for discovering a machine using an augmented reality application in a viewing device is described. A default virtual user interface is associated with the machine. The machine broadcasts a status of the machine, the default virtual user interface associated with the machine, and tracking data related to the machine to a viewing device that is authenticated with the machine and in proximity to the machine. The visualization of the status of the machine and the default virtual user interface are rendered in a display of the viewing device. | 07-02-2015 |
20150187138 | VISUALIZATION OF PHYSICAL CHARACTERISTICS IN AUGMENTED REALITY - A system and method for visualization of physical characteristics are described. A sensor coupled to an object generates live data. Physical characteristics of the object are computed using the live data. A visualization of the physical characteristics of the object is generated and communicated to a viewing device configured to capture an image of the object. The viewing device augments the image of the object with the visualization of the physical characteristics of the object. | 07-02-2015 |
20150235355 | ACTIVE PARALLAX CORRECTION - Techniques of active parallax correction are disclosed. In some embodiments, a first gaze direction of at least one eye of a user is determined. A determination about virtual content can then be made based on the first gaze direction, and the virtual content can be caused to be presented to the user based on the determination. In some embodiments, making the determination comprises determining a first location on a display surface at which to display the virtual content. In some embodiments, the virtual content can be caused to be displayed on the display surface at the first location. | 08-20-2015 |
20150235474 | THREE-DIMENSIONAL MAPPING SYSTEM - A survey application generates a survey of components associated with a three-dimensional model of an object. The survey application receives video feeds, location information, and orientation information from wearable devices in proximity to the object. The three-dimensional model of the object is generated based on the video feeds, sensor data, location information, and orientation information received from the wearable devices. Analytics are performed on the video feeds to identify a manipulation of the object. The three-dimensional model of the object is updated based on the manipulation of the object. A dynamic status related to the manipulation of the object is generated with respect to reference data related to the object. A survey of components associated with the three-dimensional model of the object is generated. | 08-20-2015 |
20150371448 | CONTEXTUAL LOCAL IMAGE RECOGNITION DATASET - A contextual local image recognition module of a device retrieves a primary content dataset from a server and then generates and updates a contextual content dataset based on an image captured with the device. The device stores the primary content dataset and the contextual content dataset. The primary content dataset comprises a first set of images and corresponding virtual object models. The contextual content dataset comprises a second set of images and corresponding virtual object models retrieved from the server. | 12-24-2015 |
20160048515 | SPATIAL DATA PROCESSING - A system and method for spatial data processing are described. Path bundle data packages from a viewing device are accessed and processed. The path bundle data packages identify a user interaction of the viewing device with augmented reality content relative to a physical object captured by the viewing device. The path bundle data packages are generated based on sensor data using a data model comprising a data header and a data payload. The data header comprises a contextual header having data identifying the viewing device and a user of the viewing device, a path header having data identifying the path of the interaction with the augmented reality content, and a sensor header having data identifying the sensors. The data payload comprises dynamically sized sampling data from the sensor data. The path bundle data packages are normalized and aggregated, and analytics computation is performed on the normalized and aggregated path bundle data packages. | 02-18-2016 |
20160049004 | REMOTE EXPERT SYSTEM - A remote expert application identifies a manipulation of virtual objects displayed in a first wearable device. The virtual objects are rendered based on a physical object viewed with a second wearable device. A manipulation of the virtual objects is received from the first wearable device. A visualization of the manipulation of the virtual objects is generated for a display of the second wearable device. The visualization of the manipulation of the virtual objects is communicated to the second wearable device. | 02-18-2016 |
20160049005 | VISUALIZATION OF PHYSICAL INTERACTIONS IN AUGMENTED REALITY - A system and method for visualization of physical interactions are described. Objects in a scene are captured with a viewing device. Physical characteristics of the objects are computed using data from at least one sensor corresponding to the objects. A physics model of predicted interactions between the objects is generated using the physical characteristics of the objects. An interaction visualization is generated based on the physics model of the predicted interactions between the objects. An image of the objects is augmented with the interaction visualization in a display of the viewing device. | 02-18-2016 |
20160049006 | SPATIAL DATA COLLECTION - A system and method for spatial data collection are described. Sensor data related to a position and an orientation of a device are generated over time using sensors of the device. Augmented reality content is generated based on a physical object captured by the device. A path bundle data package identifying a user interaction of the device with the augmented reality content relative to the physical object is generated. The user interaction identifies a spatial path of an interaction with the augmented reality content. The path bundle data package is generated based on the sensor data using a data model comprising a data header and a data payload. The data header comprises a contextual header having data identifying the device and a user of the device, a path header including data identifying the path of the interaction with the augmented reality content, and a sensor header including data identifying the sensors. The data payload comprises dynamically sized sampling data from the sensor data. | 02-18-2016 |
20160054791 | NAVIGATING AUGMENTED REALITY CONTENT WITH A WATCH - A system and method for navigating augmented reality (AR) content with a watch are described. A head mounted device identifies a watch, maps and generates a display of an AR menu in a transparent display of the head mounted device. The AR menu is displayed as a layer on the watch. The head mounted device detects a physical user interaction on the watch. The head mounted device navigates the AR menu in response to detecting the physical user interaction on the watch. | 02-25-2016 |
20160055673 | DISTRIBUTED APERTURE VISUAL INERTIA NAVIGATION - A system and method for visual inertial navigation for augmented reality are described. In some embodiments, at least one camera of a wearable device generates a plurality of video frames. At least one inertial measurement unit (IMU) sensor of the wearable device generates IMU data. Features in the plurality of video frames for each camera are tracked. The plurality of video frames for each camera are synchronized and aligned based on the IMU data. A dynamic state of the wearable device is computed based on the synchronized video frames and the IMU data for each camera. Augmented reality content is generated and positioned in a display of the wearable device based on the dynamic state of the wearable device. | 02-25-2016 |
20160055674 | EXTRACTING SENSOR DATA FOR AUGMENTED REALITY CONTENT - A system and method for extracting data for augmented reality content are described. A device identifies a sensing device using an image captured with at least one camera of the device. Visual data are extracted from the sensing device. The device generates AR content based on the extracted visual data, then maps and displays the AR content in its display to form a layer on the sensing device. | 02-25-2016 |
20160057511 | REMOTE SENSOR ACCESS AND QUEUING - An application generates instructions to a wearable device to remotely activate a sensor in the wearable device and to receive sensor data from the sensor. A query related to a physical object is received. Instructions to wearable devices are generated to remotely activate at least one sensor of the wearable devices in response to the query. Sensor data is received from at least one of the wearable devices in response to that wearable device being within a range of the physical object. | 02-25-2016 |
20160070109 | RETRACTABLE DISPLAY FOR HEAD MOUNTED DEVICE - A head mounted device includes a helmet with a guide, a lens frame, and at least one display surface mounted to the lens frame. The guide extends from a cavity of the helmet. The lens frame is moveably connected to the guide and moves along an axis of the guide between a first position within the cavity of the helmet and a second position outside the cavity of the helmet. The display surface is transparent and configured to display augmented reality content. | 03-10-2016 |
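The "path bundle" data model described in applications 20160048515 and 20160049006 (a data header made up of contextual, path, and sensor sub-headers, plus a dynamically sized sampling payload) can be sketched as plain data classes. This is a minimal illustration of the described structure; all field names are assumptions, not taken from the filings.

```python
# Hypothetical sketch of the path bundle data model: a data header
# (contextual, path, and sensor sub-headers) plus a dynamically
# sized data payload of sensor samples. Field names are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ContextualHeader:
    device_id: str          # identifies the viewing device
    user_id: str            # identifies the user of the device

@dataclass
class PathHeader:
    path_id: str            # identifies the spatial path of the interaction

@dataclass
class SensorHeader:
    sensor_ids: List[str]   # identifies the sensors that produced the samples

@dataclass
class PathBundle:
    contextual: ContextualHeader
    path: PathHeader
    sensors: SensorHeader
    payload: List[float] = field(default_factory=list)  # dynamically sized sampling data

bundle = PathBundle(
    ContextualHeader("device-1", "user-42"),
    PathHeader("path-7"),
    SensorHeader(["gyro", "accel"]),
    [0.1, 0.2, 0.3],
)
```

Keeping the identifying metadata in fixed headers while the payload grows freely is what lets packages from many devices be normalized and aggregated before analytics, as the first filing describes.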
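The two-tier dataset in application 20150371448 (a primary content dataset retrieved once from the server, plus a contextual content dataset grown from images captured on the device) can be illustrated with a small cache-like store. The class and method names are assumptions for illustration only.

```python
# Hypothetical sketch of a device-side store holding a primary content
# dataset (images -> virtual object models, fetched from the server) and
# a contextual content dataset updated from images captured on the device.
class LocalRecognitionStore:
    def __init__(self, primary):
        self.primary = dict(primary)   # server-provided image id -> model
        self.contextual = {}           # locally accumulated image id -> model

    def update_from_capture(self, image_id, fetch_model):
        # If the captured image is not covered locally, fetch its model
        # from the server and cache it in the contextual dataset.
        if image_id not in self.primary and image_id not in self.contextual:
            self.contextual[image_id] = fetch_model(image_id)
        return self.contextual.get(image_id, self.primary.get(image_id))

store = LocalRecognitionStore({"logo": "model-A"})
model = store.update_from_capture("poster", lambda i: "model-for-" + i)
```

After the capture, "poster" lives in the contextual dataset, so later recognitions of the same image resolve locally without another server round trip.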
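The synchronization step in application 20160055673 (aligning each camera's video frames with IMU data before computing the device's dynamic state) can be sketched as nearest-neighbor timestamp association. The filing does not specify the alignment method; this is one simple assumed approach.

```python
# Illustrative sketch: associate each video frame with the IMU sample
# closest in time (nearest-neighbor over sorted IMU timestamps).
from bisect import bisect_left

def align_frames_to_imu(frame_times, imu_times):
    """For each frame timestamp, return the index of the closest IMU sample."""
    aligned = []
    for t in frame_times:
        i = bisect_left(imu_times, t)  # first IMU sample at or after t
        candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_times)]
        aligned.append(min(candidates, key=lambda j: abs(imu_times[j] - t)))
    return aligned

# Frames at ~30 Hz, IMU at 200 Hz (times in seconds)
frames = [0.000, 0.033, 0.066]
imu = [k * 0.005 for k in range(40)]
pairs = align_frames_to_imu(frames, imu)
```

Because the IMU runs much faster than the cameras, each frame has a nearby inertial sample, which is what makes per-camera synchronization against a shared IMU clock workable.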