Patent application number | Description | Published |
20130321257 | Methods and Apparatus for Cartographically Aware Gestures - Methods and apparatus for a map tool on a mobile device for implementing cartographically aware gestures directed to a map view of a map region. The map tool may base a cartographically aware gesture on an actual gesture input directed to a map view and on map data for the map region, which may include metadata corresponding to elements within the map region. The map tool may then determine, based on one or more elements of the map data, a modification to be applied to an implementation of the gesture. Given the modified gesture implementation, the map tool may then render an updated map view based on the modified gesture instead of an updated map view based solely on the user gesture. | 12-05-2013 |
20130321395 | METHOD, SYSTEM AND APPARATUS FOR PROVIDING VISUAL FEEDBACK OF A MAP VIEW CHANGE - Methods, systems and apparatus are described to provide visual feedback of a change in map view. Various embodiments may display a map view of a map in a two-dimensional map view mode. Embodiments may obtain input indicating a change to a three-dimensional map view mode. Input may be obtained through touch, auditory, or other well-known input technologies. Some embodiments may allow the input to request a specific position to display. In response to the input indicating a change to a three-dimensional map view mode, embodiments may then display an animation that moves a virtual camera for the map display to different virtual camera positions to illustrate that the map view mode has changed to a three-dimensional map view mode. | 12-05-2013 |
20130321397 | Methods and Apparatus for Rendering Labels Based on Occlusion Testing for Label Visibility - Methods and apparatus for a map tool displaying a three-dimensional view of a map region, where the map tool determines whether partially occluded labels within the map region are to be drawn. The map tool determines whether to draw a label in a map view based on mapping information and on construction of a three-dimensional model from one or more two- or three-dimensional data sets. The map tool further determines whether to draw a label in the map view based on a measure of occlusion of the label in the map view. To determine a measure of occlusion, the map tool may calculate whether a line-of-sight projection from the virtual camera viewpoint for the mobile device intersects, once or more than once, with any objects or landmarks within the three-dimensional model. | 12-05-2013 |
20130321403 | SYSTEM AND METHOD FOR HIGHLIGHTING A FEATURE IN A 3D MAP WHILE PRESERVING DEPTH - Systems and methods for rendering 3D maps may highlight a feature in a 3D map while preserving depth. A map tool of a mapping or navigation application that detects the selection of a feature in a 3D map (e.g., by touch) may perform a ray intersection to determine the feature that was selected. The map tool may capture the frame to be displayed (with the selected feature highlighted) in several steps. Each step may translate the map about a pivot point of the selected map feature (e.g., in three or four directions) to capture a new frame. The captured frames may be blended together to create a blurred map view that depicts 3D depth in the scene. A crisp version of the selected feature may then be rendered within the otherwise blurred 3D map. Color, brightness, contrast, or saturation values may be modified to further highlight the selected feature. | 12-05-2013 |
20130321431 | METHOD, SYSTEM AND APPARATUS FOR PROVIDING A THREE-DIMENSIONAL TRANSITION ANIMATION FOR A MAP VIEW CHANGE - Methods, systems and apparatus are described to provide a three-dimensional transition for a map view change. Various embodiments may display a map view. Embodiments may obtain input selecting another map view for display. Input may be obtained through touch, auditory, or other well-known input technologies. In response to the input selecting a map view, embodiments may then display a transition animation that illustrates moving from the displayed map view to the selected map view in virtual space. Embodiments may then display the selected map view. | 12-05-2013 |
20130321442 | METHOD, SYSTEM AND APPARATUS FOR DYNAMICALLY GENERATING MAP TEXTURES - Methods, systems and apparatus are described to dynamically generate map textures. A client device may obtain map data, which may include one or more shapes described by vector graphics data. Along with the one or more shapes, embodiments may include texture indicators linked to the one or more shapes. Embodiments may render the map data. For one or more shapes, a texture definition may be obtained. Based on the texture definition, a client device may dynamically generate a texture for the shape. The texture may then be applied to the shape to render a current fill portion of the shape. In some embodiments the rendered map view is displayed. | 12-05-2013 |
20130321472 | METHOD, SYSTEM AND APPARATUS FOR SELECTIVELY OBTAINING MAP IMAGE DATA ACCORDING TO VIRTUAL CAMERA VELOCITY - Methods, systems and apparatus are described to selectively obtain map image data according to virtual camera velocity. Embodiments may display a map view of a map using a virtual camera. Some embodiments may detect a velocity of the virtual camera. Embodiments may then determine map image data for the map view of the map according to the velocity of the virtual camera and obtain the determined map image data. In at least some embodiments, a level-of-detail may be specified for map image data according to the velocity. Map image data may be obtained corresponding to this level-of-detail from a map service or from accessing local storage. | 12-05-2013 |
20130324098 | Methods and Apparatus for Determining Environmental Factors to Modify Hardware or System Operation - Methods and apparatus for an environment analysis tool on a mobile device, which may construct a model of the surrounding environment in order to determine whether characteristics of the model indicate a degradation in wireless signal quality. In response to an analysis of the constructed model to determine signal quality, the environment analysis tool may alter the behavior of any number of hardware or software functions to avoid or reduce attempts to receive or use the affected signal for as long as the mobile device remains within the environment with the signal-degrading characteristics. | 12-05-2013 |
20130328861 | Generation of Road Data - Some embodiments provide a method for generating road data. The method receives data regarding several road segments and several junctions for a map region. The road segments include a first road segment and a second road segment that intersect at a particular junction. The method determines whether the first road segment and the second road segment are separate segments of a same road. When the first and second road segments are separate segments of the same road, the method defines an aggregate road that references the first and second road segments. In some embodiments, the method determines whether the first and second road segments are separate segments of the same road by using location data and road properties of the first and second road segments. In some embodiments, the aggregate road is stored as an ordered list of road segments that link together at junctions. | 12-12-2013 |
20130328916 | Smoothing Road Geometry - Some embodiments provide a method for a mapping service. For a set of road segments that intersect at a junction in a map region, the method generates an initial set of geometries for use in generating downloadable map information for the map region. For each corner formed by the geometries at the junction, the method determines whether to perform a smoothing operation. When a particular corner meets a set of criteria, the method modifies the geometries of at least one road segment to smooth the corner. | 12-12-2013 |
20130332113 | CONTEXT AWARE MAP APPLICATION - The embodiments described relate to techniques and systems for utilizing a portable electronic device to monitor, process, present and manage data captured by a series of sensors and location awareness technologies to provide a context aware map and navigation application. The context aware map application offers a user interface including visual and audio input and output, and provides several map modes that can change based upon context determined by data captured by a series of sensors and location awareness technologies. | 12-12-2013 |
20140071119 | Displaying 3D Objects in a 3D Map Presentation - Some embodiments of the map display application described herein display three-dimensional representations of three-dimensional objects. When the map presentation is moved to display a new area, the three-dimensional representations rise from a ground level to their full heights and transition from transparent to opaque at the same time. The map display applications of some embodiments also remove three-dimensional representations of objects by lowering the objects from their full height to ground level and fading out the representations from opaque to transparent. | 03-13-2014 |
20140365114 | Providing Maneuver Indicators on a Map - For a device that runs a mapping application, a method for providing maneuver indicators along a route of a map. The maneuver indicators are arrows that identify the direction and orientation of a maneuver. A maneuver arrow may be selected and displayed differently from unselected maneuver arrows. Maneuver arrows may be selected automatically based on a user's current location. The mapping application transitions between maneuver arrows and provides an animation for the transition. Complex maneuvers may be indicated by multiple arrows, providing more detailed guidance for a user of the mapping application. | 12-11-2014 |
20140365965 | NIGHT MODE - A device that provides a map and/or navigation application that displays items on the map and/or navigation instructions differently in different modes. The applications of some embodiments provide a day mode and a night mode. In some embodiments the application uses the day mode as a default and activates the night mode when the time is after sunset at the location of the device. Some embodiments activate night mode when multiple conditions are satisfied (for example, when (1) the time is after sunset at the location of the device and (2) the ambient light level is below a threshold brightness). | 12-11-2014 |
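The velocity-based tile selection described in application 20130321472 can be sketched as a simple mapping from virtual-camera speed to a level-of-detail: the faster the camera moves, the coarser the tiles requested, since fine detail would be replaced before the user could see it. The function name, thresholds, and level values below are illustrative assumptions, not taken from the patent.

```python
def level_of_detail(camera_velocity: float, max_lod: int = 18) -> int:
    """Pick a map tile level-of-detail from the virtual camera's speed.

    Hypothetical rule: a near-stationary camera gets full detail, a
    panning camera gets medium detail, and a fast flyover gets coarse
    tiles. The speed thresholds (map units per second) are illustrative.
    """
    if camera_velocity < 10.0:    # near-stationary: request full detail
        return max_lod
    if camera_velocity < 100.0:   # moderate pan: drop a few levels
        return max_lod - 3
    return max_lod - 6            # fast flyover: coarse tiles suffice

# A client might then fetch tiles at the chosen level from a map
# service or local storage, e.g.:
#   tiles = map_service.get_tiles(view_bounds, level_of_detail(speed))
```

The hypothetical `map_service.get_tiles` call in the comment stands in for either the remote map service or the local cache mentioned in the abstract.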
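Application 20130328861 describes aggregating road segments that meet at junctions into a single road, stored as an ordered list of segments, with the same-road decision made from location data and road properties. The sketch below is a minimal illustration of that idea; the dataclass fields and the matching heuristic are assumptions for the example, not the patent's actual criteria.

```python
from dataclasses import dataclass

@dataclass
class RoadSegment:
    segment_id: int
    name: str
    road_class: str       # hypothetical property, e.g. "residential"
    start_junction: int
    end_junction: int

def same_road(a: RoadSegment, b: RoadSegment) -> bool:
    """Illustrative heuristic: segments meeting at a junction belong to
    the same road when their name and class match."""
    return a.name == b.name and a.road_class == b.road_class

def build_aggregate(segments: list[RoadSegment]) -> list[int]:
    """Build an aggregate road as an ordered list of segment ids that
    link end-to-start at junctions, as the abstract describes."""
    ordered = [segments[0]]
    for seg in segments[1:]:
        prev = ordered[-1]
        if same_road(prev, seg) and prev.end_junction == seg.start_junction:
            ordered.append(seg)
    return [s.segment_id for s in ordered]
```

Storing only segment ids keeps the aggregate a lightweight reference onto the underlying segment data, matching the abstract's "aggregate road that references the first and second road segments".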
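The multi-condition night-mode activation in application 20140365965 reduces to a conjunction: night mode only when the local time is after sunset and the ambient light is below a threshold, with day mode as the default. A minimal sketch, assuming times expressed as comparable values (e.g. hours as floats) and an illustrative lux threshold:

```python
def choose_display_mode(now: float, sunset_time: float,
                        ambient_lux: float,
                        lux_threshold: float = 10.0) -> str:
    """Return "night" only when both conditions from the abstract hold:
    (1) the time is after sunset at the device's location, and
    (2) the ambient light level is below a threshold brightness.
    Otherwise fall back to the default "day" mode. The threshold
    value and time representation are assumptions for illustration.
    """
    if now > sunset_time and ambient_lux < lux_threshold:
        return "night"
    return "day"
```

Requiring both conditions avoids, for example, switching to night mode after sunset in a brightly lit room, where the ambient light sensor still reports a high reading.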