Patent application number | Description | Published |
20130271602 | MOTION EVENT RECOGNITION SYSTEM AND METHOD - Enables recognition of events within motion data, including but not limited to motion capture data obtained from portable wireless motion capture elements such as visual markers and sensors, radio frequency identification tags and motion sensors within mobile device computer systems, or calculated based on analyzed movement associated with the same user, another user, a historical user or a group of users. Provides low power transmission of events. Greatly reduces storage for events such as a shot, move or swing of a player; a concussion of a player, boxer, rider or driver; or a medical event such as heat stroke, hypothermia, a seizure, an asthma attack or an epileptic attack. Events may be correlated with images captured from internal/external cameras or a nanny cam, for example to enable saving video of the event, such as the first steps of a child, violent shaking events, sporting, military or other motion events including concussions, or falling events associated with an elderly person. | 10-17-2013 |
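The core idea in 20130271602 — recognizing discrete events inside a stream of motion data — can be sketched as a simple threshold detector over accelerometer magnitude. The threshold value and the flat sample format below are illustrative assumptions, not details from the application:

```python
import math

def detect_events(samples, threshold=3.0):
    """Return indices where acceleration magnitude exceeds a threshold.

    samples: list of (ax, ay, az) accelerometer readings in g.
    threshold: illustrative impact threshold in g (an assumption,
    not a value from the application).
    """
    events = []
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold:
            events.append(i)
    return events

# A quiet ~1 g signal with one spike at index 2.
readings = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.0), (4.0, 3.0, 1.0), (0.0, 0.1, 1.0)]
print(detect_events(readings))  # → [2]
```

A real detector would also classify the event type (shot, swing, fall), which is where the storage savings come from: only samples around detected indices need be transmitted or kept.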
20140376876 | MOTION EVENT RECOGNITION AND VIDEO SYNCHRONIZATION SYSTEM AND METHOD - Enables recognition of events within motion data obtained from portable wireless motion capture elements, and video synchronization of the events with video as the events occur or at a later time, based on the location and/or time of the event. May use a camera integrated into, or external to, the mobile device to automatically generate generally smaller event videos of the event on the mobile device or server. Also enables analysis or comparison of movement associated with the same user, another user, a historical user or a group of users. Provides low memory and power utilization and greatly reduces storage for video data that corresponds to events such as a shot, move or swing of a player, a concussion of a player, or other medical related events, or events such as the first steps of a child or falling events. | 12-25-2014 |
20150154452 | VIDEO AND MOTION EVENT INTEGRATION SYSTEM - Enables intelligent synchronization and transfer of generally concise event videos synchronized with motion data from motion capture sensor(s) coupled with a user or piece of equipment. Greatly saves storage and increases upload speed by uploading event videos and avoiding upload of non-pertinent portions of large videos. Provides intelligent selection of multiple videos from multiple cameras covering an event at a given time, for example selecting the one with the least shake. Enables near real-time alteration of camera parameters during an event determined by the motion capture sensor, and alteration of playback parameters and special effects for synchronized event videos. Creates highlight reels filtered by metrics and can sort by metric. Integrates with multiple sensors to save event data even if other sensors do not detect the event. Also enables analysis or comparison of movement associated with the same user, another user, a historical user or a group of users. | 06-04-2015 |
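The camera-selection step in 20150154452 (picking the least-shaky of several synchronized clips) can be approximated by choosing the camera whose frame-to-frame global motion varies least. The per-frame motion magnitudes are assumed inputs here — producing them (e.g. via optical flow) is outside this sketch:

```python
def least_shaky_camera(camera_motion):
    """Pick the camera whose per-frame motion magnitudes vary least.

    camera_motion: dict mapping camera id -> list of per-frame global
    motion magnitudes covering the same event window.
    """
    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    return min(camera_motion, key=lambda cam: variance(camera_motion[cam]))

# cam_a jitters between near-zero and large motion; cam_b is steady.
clips = {"cam_a": [0.1, 5.0, 0.2, 4.8], "cam_b": [1.0, 1.1, 0.9, 1.0]}
print(least_shaky_camera(clips))  # → cam_b
```

Variance of motion is one plausible shake proxy; a production system could weight in resolution, framing, or sensor-reported stability instead.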
20150310280 | MOTION EVENT RECOGNITION AND VIDEO SYNCHRONIZATION SYSTEM AND METHOD - Enables recognition of events within motion data obtained from portable wireless motion capture elements, and video synchronization of the events with video as the events occur or at a later time, based on the location and/or time of the event. May use a camera integrated into, or external to, the mobile device to automatically generate generally smaller event videos of the event on the mobile device or server. Also enables analysis or comparison of movement associated with the same user, another user, a historical user or a group of users. Provides low memory and power utilization and greatly reduces storage for video data that corresponds to events such as a shot, move or swing of a player, a concussion of a player, or other medical related events, or events such as the first steps of a child or falling events. | 10-29-2015 |
Patent application number | Description | Published |
20110305369 | PORTABLE WIRELESS MOBILE DEVICE MOTION CAPTURE AND ANALYSIS SYSTEM AND METHOD - Portable wireless mobile device motion capture and analysis system and method configured to display motion capture/analysis data on a mobile device. System obtains data from motion capture elements and analyzes the data. Enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data. Ratings associated with the captured motion can also be displayed. Predicted ball flight path data can be calculated and displayed. Data can also be displayed on a time line to show the relative peaks of velocity for various parts of the user's body. Based on the display of data, the user can determine the equipment that fits the best and immediately purchase the equipment, via the mobile device. Custom equipment may be ordered through an interface on the mobile device from a vendor that can assemble-to-order custom-built equipment and ship the equipment. Includes active and passive golf shot count capabilities. | 12-15-2011 |
20120088544 | PORTABLE WIRELESS MOBILE DEVICE MOTION CAPTURE DATA MINING SYSTEM AND METHOD - Portable wireless mobile device motion capture data mining system and method configured to display motion capture/analysis data on a mobile device. System obtains data from motion capture elements, analyzes the data and stores it in a database for data mining, which may be charged for. Enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data including ratings. Predicted ball flight path data can be calculated and shown on a time line showing relative peaks of velocity for the user's body parts. User can determine equipment that fits best and immediately purchase the equipment, via the mobile device. Custom equipment may be ordered on the mobile device from a vendor that can assemble-to-order custom-built equipment and ship the equipment. Includes active and passive golf shot count capabilities. | 04-12-2012 |
20120116548 | MOTION CAPTURE ELEMENT - Motion capture element for low power and accurate data capture for use in healthcare compliance, sporting, gaming, military, virtual reality, industrial, retail loss tracking, security, baby and elderly monitoring and other applications, for example with data obtained from a motion capture element and relayed to a database via a mobile phone. System obtains data from motion capture elements, analyzes the data and stores it in a database for use in these applications and/or data mining, which may be charged for. Enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data. Ratings, compliance and ball flight path data can be calculated and displayed, for example on a map, a timeline or both. Enables performance related equipment fitting and purchase. Includes active and passive identifier capabilities. | 05-10-2012 |
20120122574 | SYSTEM AND METHOD FOR UTILIZING MOTION CAPTURE DATA - System and method for utilizing motion capture data for healthcare compliance, sporting, gaming, military, virtual reality, industrial, retail loss tracking, security, baby and elderly monitoring and other applications, for example with data obtained from a motion capture element and relayed to a database via a mobile phone. System obtains data from motion capture elements, analyzes the data and stores it in a database for use in these applications and/or data mining, which may be charged for. Enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data. Ratings, compliance and ball flight path data can be calculated and displayed, for example on a map, a timeline or both. Enables performance related equipment fitting and purchase. Includes active and passive identifier capabilities. | 05-17-2012 |
20120215474 | CALIBRATION SYSTEM FOR SIMULTANEOUS CALIBRATION OF MULTIPLE MOTION CAPTURE ELEMENTS - A calibration system for simultaneous calibration of multiple motion capture elements (MCEs) of at least one type (accelerometer and/or gyroscope). Includes a motion and/or rotational element coupled to a base and configured to move and/or rotate multiple MCEs mounted on a mount in and/or about at least one axis. For one-axis movement embodiments, after each motion and/or axial rotation, the motion and/or rotational mount itself is rotated, for example manually, so the mount points in a different direction, e.g., along the Z axis. In a single-axis embodiment, this is performed twice so that each axis of the MCEs experiences motion and/or rotation about three axes. The motion capture data is sampled and used in the calculation of a 3×3 calibration matrix. The physical format of the motion capture sensors may be any format, including chip, memory or SIM card format, PCB format, or mobile computers/phones. | 08-23-2012 |
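One minimal way to arrive at a 3×3 calibration matrix like the one 20120215474 describes: average the raw accelerometer vector while each sensor axis is in turn aligned with gravity, stack those three raw vectors as columns of a matrix, and invert it, so the result maps raw readings to the ideal unit axes. The application's exact procedure is not given in the abstract, so treat this as a sketch under that assumption:

```python
def calibration_matrix(raw_x, raw_y, raw_z):
    """Compute a 3x3 matrix M such that M applied to raw_i ~= unit axis e_i.

    raw_x/raw_y/raw_z: averaged raw accelerometer vectors recorded while
    the sensor's x, y and z axes were each aligned with gravity.
    """
    # Matrix A with the raw vectors as columns.
    a = [[raw_x[r], raw_y[r], raw_z[r]] for r in range(3)]
    # 3x3 inverse via the cyclic cofactor (adjugate) formula.
    det = (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
           - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
           + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    cof = [[(a[(i + 1) % 3][(j + 1) % 3] * a[(i + 2) % 3][(j + 2) % 3]
             - a[(i + 1) % 3][(j + 2) % 3] * a[(i + 2) % 3][(j + 1) % 3])
            for j in range(3)] for i in range(3)]
    # Inverse = transpose of the cofactor matrix divided by the determinant.
    return [[cof[j][i] / det for j in range(3)] for i in range(3)]

# A sensor whose every axis reads 2x the true value: the calibration
# matrix comes out as 0.5 on the diagonal.
m = calibration_matrix((2.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 2.0))
print(m[0])  # → [0.5, 0.0, 0.0]
```

With more than three sample orientations, a least-squares fit would replace the exact inverse, which is presumably why the system samples motion about all three axes.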
20130063432 | VIRTUAL REALITY SYSTEM FOR VIEWING CURRENT AND PREVIOUSLY STORED OR CALCULATED MOTION DATA - Virtual reality system for viewing current and previously stored or calculated motion data. System obtains data from motion capture elements, analyzes the data and stores it in a database for use in virtual reality applications and/or data mining, which may be charged for. Enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data. Ratings, compliance and ball flight path data can be calculated and displayed, for example on a map, a timeline or both. Enables performance related equipment fitting and purchase. Includes active and passive identifier capabilities. | 03-14-2013 |
20130095941 | ENCLOSURE AND MOUNT FOR MOTION CAPTURE ELEMENT - Enables coupling or retrofitting a golf club with active motion capture electronics that are battery powered, passive or active shot count components, for example a passive RFID, and/or a visual marker on the cap for use with visual motion capture cameras. Does not require modifying the golf club. Electronics package and battery can be easily removed and replaced, for example without any tools. May utilize a weight that is removed when inserting the electronics package in the mount, wherein the weight element may have the same weight as the electronics package, for no net change or minimal change in club weight. May be implemented with a shaft enclosure and expander that may be coupled with a screw aligned along an axis parallel to the axis of the shaft. May utilize non-permanent and/or friction coupling between the mount and shaft. Cap may include a visual marker and/or logo. | 04-18-2013 |
20130128022 | INTELLIGENT MOTION CAPTURE ELEMENT - Intelligent motion capture element that includes sensor personalities that optimize the sensor for specific movements and/or pieces of equipment and/or clothing, and that may be retrofitted onto existing equipment or interchanged therebetween and automatically detected, for example to switch personalities. May be used for low power applications and accurate data capture for use in healthcare compliance, sporting, gaming, military, virtual reality, industrial, retail loss tracking, security, baby and elderly monitoring and other applications, for example with data obtained from a motion capture element and relayed to a database via a mobile phone. System obtains data from motion capture elements, analyzes the data and stores it in a database for use in these applications and/or data mining. Enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data. Enables performance related equipment fitting and purchase. Includes active and passive identifier capabilities. | 05-23-2013 |
20130211774 | FITTING SYSTEM FOR SPORTING EQUIPMENT - Enables a fitting system for sporting equipment using an application that executes, for example, on a mobile phone to prompt and accept motion inputs from a given motion capture sensor to measure a user's size, range of motion and speed, and then utilizes that same sensor to capture motion data from a piece of equipment, for example to further optimize the fit of, or suggest purchase of, a particular piece of sporting equipment. Utilizes correlation or other data mining of motion data for the size, range of motion and speed of other users to maximize the fit of a piece of equipment for the user based on other users' performance with particular equipment. For example, this enables a user of a similar size, range of motion and speed to data mine for the best performance equipment, e.g., longest drive, lowest putt scores, highest winning percentage, etc., associated with other users having similar characteristics. | 08-15-2013 |
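The data-mining step in 20130211774 — recommend the equipment that performed best for users similar in size, range of motion and speed — resembles a k-nearest-neighbor lookup. The record layout, field names and performance metric below are illustrative assumptions:

```python
def suggest_equipment(user, others, k=3):
    """Suggest equipment from the best performer among the k users most
    similar to `user`.

    user: (size, range_of_motion, speed) tuple.
    others: list of (profile_tuple, equipment_name, performance_metric)
    records, where a higher metric is better (e.g. drive distance).
    """
    def dist(a, b):
        # Euclidean distance between user profiles.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    nearest = sorted(others, key=lambda rec: dist(user, rec[0]))[:k]
    # Among similar users, pick the equipment with the best metric.
    return max(nearest, key=lambda rec: rec[2])[1]

others = [
    ((181.0, 92.0, 98.0), "driver_a", 250.0),
    ((178.0, 88.0, 102.0), "driver_b", 270.0),
    ((150.0, 60.0, 60.0), "driver_c", 300.0),  # best metric, dissimilar user
]
print(suggest_equipment((180.0, 90.0, 100.0), others, k=2))  # → driver_b
```

Note the dissimilar user's 300-yard driver is ignored: restricting to the k nearest profiles is what makes the suggestion a fit rather than a leaderboard.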
20130225309 | BROADCASTING SYSTEM FOR BROADCASTING IMAGES WITH AUGMENTED MOTION DATA - A broadcasting system for broadcasting images with augmented motion data, which includes at least one camera, a computer and a wireless communication interface. The system obtains data from motion capture elements, analyzes the data and optionally stores it in a database for use in broadcasting applications, virtual reality applications and/or data mining. The system also recognizes at least one motion capture data element associated with a user or piece of equipment, and receives data associated with the motion capture element via the wireless communication interface. The system also enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data. Ratings, compliance and ball flight path data can be calculated and displayed, for example on a map, a timeline or both. Furthermore, the system enables performance related equipment fitting and purchase. | 08-29-2013 |
20140378194 | BROADCASTING METHOD FOR BROADCASTING IMAGES WITH AUGMENTED MOTION DATA - A broadcasting method for broadcasting images with augmented motion data, which may utilize a system having at least one camera, a computer and a wireless communication interface. The system obtains data from motion capture elements, analyzes the data and optionally stores it in a database for use in broadcasting applications, virtual reality applications and/or data mining. The system also recognizes at least one motion capture data element associated with a user or piece of equipment, and receives data associated with the motion capture element via the wireless communication interface. The system also enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data. Ratings, compliance and ball flight path data can be calculated and displayed, for example on a map, a timeline or both. Furthermore, the system enables performance related equipment fitting and purchase. | 12-25-2014 |
20150269435 | PORTABLE WIRELESS MOBILE DEVICE MOTION CAPTURE AND ANALYSIS SYSTEM AND METHOD - Portable wireless mobile device motion capture and analysis system and method configured to display motion capture/analysis data on a mobile device. System obtains data from motion capture elements and analyzes the data. Enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data. Ratings associated with the captured motion can also be displayed. Predicted ball flight path data can be calculated and displayed. Data can also be displayed on a time line to show the relative peaks of velocity for various parts of the user's body. Based on the display of data, the user can determine the equipment that fits the best and immediately purchase the equipment, via the mobile device. Custom equipment may be ordered through an interface on the mobile device from a vendor that can assemble-to-order custom-built equipment and ship the equipment. | 09-24-2015 |
20150317801 | EVENT ANALYSIS SYSTEM - Enables event analysis from sensors including environmental, physiological and motion capture sensors. Also enables displaying information based on events recognized using sensor data associated with a user or piece of equipment, or based on previous motion analysis data from the user, other user(s) or other sensors. Enables intelligent analysis, synchronization, and transfer of generally concise event videos synchronized with motion data from motion capture sensor(s) coupled with a user or piece of equipment. Enables creating, transferring, obtaining, and storing concise event videos generally without non-event video. Events stored in the database identify trends, correlations, models, and patterns in event data. Greatly saves storage and increases upload speed by uploading event videos and avoiding upload of non-pertinent portions of large videos. Creates highlight and fail reels filtered by metrics and can sort by metric. Compares motion trajectories of users and objects to optimally efficient trajectories, and to desired trajectories. | 11-05-2015 |
20150318015 | MULTI-SENSOR EVENT DETECTION SYSTEM - A sensor event detection system including a motion capture element with a memory, sensor, microprocessor, first communication interface and another sensor. The sensor captures values associated with an orientation, position, velocity and acceleration of the motion capture element. The first communication interface receives other values associated with a temperature, humidity, wind and elevation, and the other sensor locally captures the other values. The microprocessor collects data that includes sensor values from the sensor, stores the data in the memory, and recognizes an event within the data to determine event data. The microprocessor correlates the data or event data with the other values to determine a false positive event or a type of equipment the motion capture element is coupled with, or a type of activity indicated by the data or event data, and transmits the data or event data associated with the event via the first communication interface. | 11-05-2015 |
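The false-positive check in 20150318015 correlates a motion event with environmental values such as temperature, wind and elevation. One simple realization is a plausibility test: reject an event whose environment is implausible for the claimed activity. The activities, field names and ranges below are illustrative assumptions, not values from the application:

```python
# Illustrative plausibility ranges per activity.
ACTIVITY_RANGES = {
    "surfing": {"temp_c": (0, 45), "wind_mps": (0, 25), "elevation_m": (-10, 50)},
    "skiing": {"temp_c": (-30, 15), "wind_mps": (0, 40), "elevation_m": (500, 5000)},
}

def is_false_positive(activity, env):
    """Flag an event as a false positive when any environmental reading
    falls outside the plausible range for the claimed activity."""
    for key, (lo, hi) in ACTIVITY_RANGES[activity].items():
        if not lo <= env[key] <= hi:
            return True
    return False

reading = {"temp_c": 28.0, "wind_mps": 5.0, "elevation_m": 2.0}
print(is_false_positive("skiing", reading))   # → True (too warm, too low)
print(is_false_positive("surfing", reading))  # → False
```

The same comparison run across all activities would support the abstract's other use: inferring which activity (or equipment type) the data most plausibly indicates.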
20150324636 | INTEGRATED SENSOR AND VIDEO MOTION ANALYSIS METHOD - A method that integrates sensor data and video analysis to analyze object motion. Motion capture elements generate motion sensor data for objects of interest, and cameras generate video of these objects. Sensor data and video data are synchronized in time and aligned in space on a common coordinate system. Sensor fusion is used to generate motion metrics from the combined and integrated sensor data and video data. Integration of sensor data and video data supports robust detection of events, generation of video highlight reels or epic fail reels augmented with metrics that show interesting activity, and calculation of metrics that exceed the individual capabilities of either sensors or video analysis alone. | 11-12-2015 |
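The time-synchronization step that 20150324636 builds on — relating a sensor-detected event to the video stream — reduces, once both clocks are on a common time base, to finding the frame nearest the event timestamp. Clock alignment itself is outside this sketch:

```python
def frame_for_event(event_time_s, frame_times_s):
    """Return the index of the video frame closest in time to a sensor
    event, assuming sensor and video clocks already share a time base."""
    return min(range(len(frame_times_s)),
               key=lambda i: abs(frame_times_s[i] - event_time_s))

# 30 fps video: frame i is captured at i/30 seconds.
frames = [i / 30.0 for i in range(300)]
print(frame_for_event(1.52, frames))  # → 46
```

Spatial alignment onto a common coordinate system, and the sensor-fusion metrics computed afterwards, would layer on top of this index lookup.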
20150348591 | SENSOR AND MEDIA EVENT DETECTION SYSTEM - Enables detection of events using motion capture sensors and potentially other sensors, such as electromagnetic field, temperature, humidity, wind, pressure, elevation, light, sound or heart rate sensors, to confirm and post events, and to differentiate similar types of motion events to determine the type of equipment or activity, or the quality of the event, such as proficiency. Enables motion capture data and other sensor data to be utilized to curate text, images, video and sound, and to post the results to social networks, for example in a dedicated feed. Embodiments of the system may also post or filter to social media sites using filters other than location and time, for example the text in the social media posts. May use motion or other sensor data to define an event, eliminate false positive events, post true events, and/or correlate the events with social media to confirm the events, or post the events in a particular channel. | 12-03-2015 |
Patent application number | Description | Published |
20110157001 | METHOD AND APPARATUS FOR DISPLAY FRAMEBUFFER PROCESSING - Various methods for display framebuffer processing are provided. One example method includes determining, via a processor, that update criteria associated with a display region have been satisfied, and comparing current frame data for the display region to subsequent frame data for the display region to determine frame data changes associated with the display region. In this regard, the comparing is performed in response to the update criteria being satisfied. The example method may also include facilitating presentation of the frame data changes within the display region on a display. Similar and related example methods and example apparatuses are also provided. | 06-30-2011 |
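The compare step in 20110157001 — current frame data for a region against subsequent frame data, to find what changed — can be sketched as a row-level diff. Representing a region as a list of row tuples is an illustrative simplification:

```python
def changed_rows(current, subsequent):
    """Return row indices whose pixel data differ between two frames of
    the same display region (each frame given as a list of row tuples).
    Only these rows would then need to be presented/transmitted."""
    return [i for i, (a, b) in enumerate(zip(current, subsequent)) if a != b]

frame_a = [(0, 0, 0), (1, 1, 1), (2, 2, 2)]
frame_b = [(0, 0, 0), (9, 1, 1), (2, 2, 2)]
print(changed_rows(frame_a, frame_b))  # → [1]
```

Gating this diff behind update criteria (as the abstract describes) avoids comparing regions that cannot have changed, which is where the processing savings come from.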
20110214162 | METHOD AND APPARATUS FOR PROVIDING COOPERATIVE ENABLEMENT OF USER INPUT OPTIONS - An apparatus for providing cooperative enablement or disablement of user input options may include at least one processor and at least one memory including computer program code. The at least one memory and the computer program code may be configured, with the processor, to cause the apparatus to perform at least receiving a first indication identifying any user input option to be enabled or disabled based on context information associated with a local device, receiving a second indication of any user input option to be enabled or disabled based on context information associated with a remote device, and providing enablement or disablement of user input options of the local device based on the first indication and the second indication. A corresponding method and computer program product are also provided. | 09-01-2011 |
20110242122 | METHOD AND APPARATUS FOR DETERMINING AN ACTIVE INPUT AREA - Various methods for determining an active input area are provided. One example method includes acquiring frame buffer data defining an image area that has been refreshed and detecting a cursor within the frame buffer data. Detecting the cursor may include determining that dimensions of the image area match dimensions of a previously acquired image area associated with a successful cursor detection, and the example method may further include directing transmission of coordinates and dimensions of the image area to a remote environment. Similar and related example methods and example apparatuses are also provided. | 10-06-2011 |
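The cursor-detection heuristic in 20110242122 keys on dimensions: a freshly refreshed image area whose width and height match a previously successful cursor detection is treated as the cursor and its coordinates forwarded. The names and tuple layout below are illustrative assumptions:

```python
def detect_cursor(area, known_cursor_dims):
    """Treat a refreshed frame-buffer area as the cursor when its
    dimensions match a previously successful cursor detection.

    area: (x, y, width, height) of the refreshed image area.
    known_cursor_dims: set of (width, height) pairs from past detections.
    Returns (x, y, width, height) to transmit to the remote environment,
    or None when the area does not look like a cursor.
    """
    x, y, w, h = area
    if (w, h) in known_cursor_dims:
        return (x, y, w, h)
    return None

print(detect_cursor((120, 48, 12, 20), {(12, 20)}))  # → (120, 48, 12, 20)
print(detect_cursor((0, 0, 800, 600), {(12, 20)}))   # → None
```

The appeal of the dimension match is cheapness: a cursor repaints constantly at a fixed size, so its refresh areas are recognizable without inspecting pixel content.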
20110270600 | METHOD AND APPARATUS FOR PROVIDING INTEROPERABILITY BETWEEN DEVICES - An apparatus, method, and computer program product are provided for enabling interoperability between devices, such as a mobile terminal and some other remote device or remote environment. The apparatus may include at least one processor and at least one memory including computer program code. The at least one memory and the computer program code may be configured, with the processor, to cause the apparatus to maintain a terminal session between a server device and a client device in which the client device emulates at least a portion of a display presented at the server device; receive an indication of a user input received at the client device identifying a function to be performed at the server device; determine a corresponding input to elicit the identified function; and cause the identified function to be performed at the server device. | 11-03-2011 |
20110271183 | METHOD AND APPARATUS FOR PROVIDING INTEROPERABILITY BETWEEN DEVICES - Methods and apparatus are provided to promote interoperability between devices having different user input devices by correlating user input that is provided via one or more input mechanisms of a client device to touch events on the server device. The method may maintain a terminal session between a server device and a client device in which the client device simulates at least a portion of a display generated at the server device. The method may access a mapping between touch events on the server device and actuation of respective input mechanisms of the client device. The method may also interpret one or more control signals provided in response to actuation of an input mechanism based upon the mapping and may then cause a user interface state of the server device to be updated based on the actuation of the input mechanism of the client device. | 11-03-2011 |
20110271195 | METHOD AND APPARATUS FOR ALLOCATING CONTENT COMPONENTS TO DIFFERENT HARDWARE INTERFACES - A method and apparatus are described that facilitate mobile device interoperability with a remote environment. A method may be provided that may determine, for each of a plurality of content components of a user interface, a respective hardware interface via which to transmit the content component. The method may also generate meta-information associated with at least one of the content components to facilitate recomposition of the content component and may cause the plurality of content components and the meta-information to be transmitted via the respective hardware interfaces with at least two of the content components being transmitted via different hardware interfaces. A method may also be provided that receives a plurality of streams of content components and meta-information via different respective hardware interfaces, recomposes the content components in accordance with the meta-information to form a unified user interface and causes a display to be presented of the unified user interface. | 11-03-2011 |
20110271198 | METHOD AND APPARATUS FOR PROVIDING COOPERATIVE USER INTERFACE LAYER MANAGEMENT WITH RESPECT TO INTER-DEVICE COMMUNICATIONS - An apparatus for providing cooperative user interface layer management may include at least one processor and at least one memory including computer program code. The at least one memory and the computer program code may be configured, with the processor, to cause the apparatus to perform at least maintaining a terminal session between a server device and a client device in which the client device emulates at least a portion of a display presented at the server device, receiving, at the server device, an indication identifying a user interface layer for which display of information related to the user interface layer is not supported at the client device, and determining a response to a user input provided at the client device based on whether the user input relates to the user interface layer identified by the indication. A corresponding method and computer program product are also provided. | 11-03-2011 |
20110320953 | METHOD AND APPARATUS FOR PROJECTING A USER INTERFACE VIA PARTITION STREAMING - Various methods for projecting a user interface via multiple encoded streams are provided. One example method includes generating first and at least second data streams. The data included in the first and second data streams may be configured to cause respective partitions of a unified user interface image to be displayed. The example method may also include generating fiducial information indicating at least a location for displaying the data of the first data stream on a display. The example method may also include causing the first data stream, the second data stream, and the fiducial information to be transmitted from an apparatus to a remote environment for displaying the first partition of the unified user interface image and the second partition of the unified user interface image on a display of the remote environment. Similar and related example methods and example apparatuses are also provided. | 12-29-2011 |
20120095643 | Method, Apparatus, and Computer Program Product for Modifying a User Interface Format - Various methods for modifying a user interface format are provided. One example method comprises receiving vehicle-based data at a mobile device via a communications link between the mobile device and an on-board vehicle analysis system, determining an environmental context based at least on the vehicle-based data, and modifying a user interface format based on the determined environmental context. Similar and related example methods, example apparatuses, and example computer program products are also provided. | 04-19-2012 |
20120203491 | METHOD AND APPARATUS FOR PROVIDING CONTEXT-AWARE CONTROL OF SENSORS AND SENSOR DATA - An approach is provided for context-aware control of sensors and sensor data. A sensor manager determines context information based, at least in part, on one or more sensors. The sensor manager also determines resource consumption information associated with a one or more other sensors, one or more functions of the one or more other sensors, or a combination thereof. The sensor manager then processes and/or facilitates a processing of the context information and the resource consumption information to determine at least one operational state associated with the one or more other sensors, the one or more functions of the one or more other sensors, or a combination thereof. | 08-09-2012 |
20130024777 | METHOD AND APPARATUS FOR TRIGGERING A REMOTE DATA ENTRY INTERFACE - Various methods for triggering a remote data entry interface are provided. One example method includes receiving, at a device, a data entry field selection message notifying that data entry is desired, inhibiting a presentation of a data entry input interface on a display of the device, and causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device. Similar and related example methods and example apparatuses are also provided. | 01-24-2013 |
20130024783 | METHOD AND APPARATUS FOR PROVIDING DATA ENTRY CONTENT TO A REMOTE ENVIRONMENT - Various methods for providing data entry content to a remote environment are provided. One example method includes receiving, at a device, a data entry field selection message notifying that a selection of a data entry field has occurred at a remote device, modifying a focus of a user interface of the device to the data entry field, retrieving current data content of the data entry field, and causing an indication to be provided to the remote device informing the remote device of the current data content of the data entry field to enable the remote device to display the current data content of the data entry field in a data entry input interface. Similar and related example methods and example apparatuses are also provided. | 01-24-2013 |
20130044062 | METHOD AND APPARATUS FOR TRANSLATING BETWEEN FORCE INPUTS AND TEMPORAL INPUTS - An apparatus, method, and computer program product are described that provide for translating between force inputs and temporal inputs to execute operations conventionally associated with temporal inputs. The apparatus includes at least one processor and at least one memory including computer program code that are designed to cause the apparatus to at least receive a force indication relating to a force component of the input received, and provide for execution of an operation associated with a temporal input based on the force indication. For example, an input having a force component, such as a “hard press,” would result in the execution of an operation otherwise associated with a temporal input, such as a “long press.” By using a force input, the user can effect an operation in less time by cutting out the time that would have been expended to execute the corresponding “long press” gesture. | 02-21-2013 |
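The force-to-temporal translation in 20130044062 maps a "hard press" onto the operation normally reserved for a "long press", so the user skips the wait. The threshold constants below are illustrative assumptions, not values from the application:

```python
HARD_PRESS_FORCE = 2.5    # newtons; illustrative hard-press threshold
LONG_PRESS_SECONDS = 0.8  # illustrative long-press duration

def interpret_press(force_n, duration_s):
    """Map a press to an operation class: a sufficiently forceful press
    triggers the long-press operation immediately, without requiring
    the temporal (duration) condition to be met."""
    if force_n >= HARD_PRESS_FORCE or duration_s >= LONG_PRESS_SECONDS:
        return "long_press_operation"
    return "tap_operation"

print(interpret_press(force_n=3.0, duration_s=0.1))  # → long_press_operation
print(interpret_press(force_n=0.5, duration_s=0.1))  # → tap_operation
```

Keeping the duration branch alongside the force branch preserves the conventional long-press gesture for hardware without force sensing.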
20130050133 | METHOD AND APPARATUS FOR PRECLUDING OPERATIONS ASSOCIATED WITH ACCIDENTAL TOUCH INPUTS - An apparatus, method, and computer program product are described that provide for a determination of the validity of an input based, at least in part, on a force indication relating to the input received and a proximity indication. An operation associated with the input may be precluded (e.g., the input may be disregarded) based, at least in part, on a determination that the input is invalid. When the input is determined to be valid, on the other hand, the associated operation may be executed. As a result, the execution of operations that are unintended (such as operations that are associated with accidental contact with the touch screen display) may be minimized, and the user's experience with the device can be enhanced. | 02-28-2013 |
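The validity gate in 20130050133 combines a force indication with a proximity indication before executing the operation tied to a touch. A minimal sketch, with both thresholds as illustrative assumptions:

```python
def is_valid_touch(force_n, proximity_mm):
    """Judge a touch valid only when force and proximity agree that a
    deliberate contact occurred: enough force, and the pointing object
    close to the touched point (a hovering palm grazing the screen
    fails one test or the other)."""
    return force_n >= 0.3 and proximity_mm <= 5.0

def handle_touch(force_n, proximity_mm, operation):
    # Preclude (disregard) the operation when the input is invalid.
    return operation() if is_valid_touch(force_n, proximity_mm) else None

print(handle_touch(1.0, 2.0, lambda: "opened"))   # → opened
print(handle_touch(0.1, 30.0, lambda: "opened"))  # → None
```

Returning None rather than raising keeps accidental contact silent, which matches the abstract's goal of simply disregarding invalid input.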
20130093713 | METHOD AND APPARATUS FOR DETERMINING THE PRESENCE OF A DEVICE FOR EXECUTING OPERATIONS - An apparatus, method, and computer program product are described that can detect the presence of a tangible object without using computer vision. The apparatus receives a signal from a device proximate the apparatus, where the signal includes at least one of a proximity component and an orientation component, and also receives a touch input from the associated display. The apparatus then determines whether there is an association between the signal and the touch input based on the proximity component and/or the orientation component. If the signal and the touch input are associated, indicating that the device is disposed on the display, operations may be executed, such as to facilitate interaction between the apparatus and the device. As a result, any object capable of providing a signal having a proximity component or an orientation component can be detected (e.g., without the use of cameras or fiducial markers). | 04-18-2013 |
20130100167 | METHOD AND APPARATUS FOR CONTROL OF ORIENTATION OF INFORMATION PRESENTED BASED UPON DEVICE USE STATE - A method, apparatus, and computer program product are provided to enable the provision of a mechanism by which an orientation of information presented may be maintained based upon a use state of a device, regardless of a physical display orientation. A method may include determining a physical display orientation, determining whether the physical display orientation warrants a change of orientation of information presented, and determining whether a use state precludes a change of the orientation of the information presented based upon the physical display orientation. The method may further maintain the orientation of the information presented in response to a physical display orientation that warrants a change of the orientation of the information presented in an instance in which the use state is determined to preclude the change of the orientation of the information presented based on the physical display orientation. | 04-25-2013 |
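The decision described above reduces to suppressing a rotation the physical orientation would otherwise trigger whenever the use state precludes it. A minimal sketch, assuming a boolean use-state lock and string orientation labels (all hypothetical names):

```python
# Hedged sketch: maintain the presented orientation when the use state
# (e.g. the user lying on their side) precludes the rotation that the
# physical display orientation would otherwise warrant.

def next_content_orientation(current, physical, use_state_locks_rotation):
    """Return the orientation in which to present information."""
    if physical != current and use_state_locks_rotation:
        return current   # maintain presentation despite the device rotating
    return physical      # otherwise follow the physical display orientation
```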
20130109961 | APPARATUS AND METHOD FOR PROVIDING DYNAMIC FIDUCIAL MARKERS FOR DEVICES | 05-02-2013 |
20130132611 | Apparatus, a Method and a Computer Program - An apparatus includes one local input/output device; and an interface configured to interface between an application hosted by the apparatus and the local input/output device and is configured to interface between the application hosted by the apparatus and a remote input/output device hosted by another apparatus; wherein the interface has a first state in which the interface is configured to couple the application and the local input/output device but not couple the application and the remote input/output device hosted by the another apparatus; wherein the interface has a second state in which the interface is configured to couple the application and the local input/output device and to couple the application and the remote input/output device hosted by another apparatus; and wherein the interface is configured to be responsive to a proximity detection trigger to change its state. | 05-23-2013 |
20130163598 | Encoding Watermarks In A Sequence Of Sent Packets, The Encoding Useful For Uniquely Identifying An Entity In Encrypted Networks - A method includes sending over the network from a source entity to a destination entity a sequence of a plurality of packets. Each packet in the sequence includes a same identifier corresponding to a network entity on the network. Sending includes modifying a property of the sequence of packets to uniquely identify the sequence of packets. The method includes receiving information indicating the identifier corresponds to the modification of the property. Another method includes examining a sequence of packets sent over a network from a source entity to a destination entity, each packet in the sequence comprising a same identifier corresponding to a network entity on the network. The method includes determining whether a property of the sequence of packets was modified when sent to uniquely identify the sequence of packets; and responsive to the determining, associating the identifier with the network entity. Apparatus and program products are also disclosed. | 06-27-2013 |
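One concrete way a "property of the sequence" can carry a watermark is inter-packet timing: short gaps encode 0, long gaps encode 1, and an observer who cannot read the (encrypted) payload can still recover the bits from measured gaps. The patent claims the general idea of modifying a sequence property; the delay values and function names below are assumptions for illustration.

```python
# Hypothetical timing-based watermark: encode watermark bits into
# inter-packet send delays, decode them from observed gaps.

SHORT_GAP = 0.01  # seconds, encodes bit 0 (assumed value)
LONG_GAP = 0.05   # seconds, encodes bit 1 (assumed value)
THRESHOLD = (SHORT_GAP + LONG_GAP) / 2

def encode_delays(bits):
    """Turn a watermark bit string into inter-packet send delays."""
    return [LONG_GAP if b == "1" else SHORT_GAP for b in bits]

def decode_delays(observed_gaps):
    """Recover the watermark from gaps measured at an observation point."""
    return "".join("1" if g > THRESHOLD else "0" for g in observed_gaps)
```

A real deployment would need redundancy against network jitter; this sketch assumes gaps survive transit unchanged.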
20130190005 | DIRECTIONAL PEER-TO-PEER NETWORKING - Interaction between wireless devices, one of which may be equipped with a directional radio (a radio with a directional antenna for directional sensing capabilities) can be used in directional peer-to-peer networking. A method can include obtaining, at a first device from a serving device, proximity information regarding a second device. The method can also include obtaining, at the first device from the serving device, direction information regarding the second device. The method can further include calculating a position of the second device with respect to the first device based on the proximity information and based on the direction information. The method can additionally include communicating with the second device based on the position. | 07-25-2013 |
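In the simplest planar case, the position calculation described above is a conversion of a (distance, bearing) pair, proximity information plus direction information, into coordinates relative to the first device. The sketch below assumes 2-D geometry with the bearing measured counter-clockwise from the x-axis; the patent is not limited to this.

```python
import math

def relative_position(distance, bearing_deg):
    """Position of the second device relative to the first, given
    proximity (distance) and direction (bearing) information."""
    theta = math.radians(bearing_deg)
    return (distance * math.cos(theta), distance * math.sin(theta))
```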
20130212483 | APPARATUS AND METHOD FOR PROVIDING FOR REMOTE USER INTERACTION - An apparatus, method, and computer program product are described that provide for a replicated user interface, including a replicated active display area and a replicated boundary area, such that the replicated user interface can recognize the same types of user inputs that are recognized by a remote user interface, and the same operations may be executed upon receipt of such inputs. The apparatus can establish a communications link with the remote user interface and determine whether the remote user interface is configured to detect a user input at least partially received in a boundary area outside an active display area of the remote user interface. A replicated active display area and a replicated boundary area may then be provided for in response to a determination that the remote user interface is configured to detect a user input at least partially received in the boundary area of the remote user interface. | 08-15-2013 |
20130215040 | APPARATUS AND METHOD FOR DETERMINING THE POSITION OF USER INPUT - An apparatus, method, and computer program product are described that determine a position of a touch component of a user input received outside a touch sensitive area of a touch surface by correlating a position of the force component and a position of a touch component of a portion of the user input received within the touch sensitive area with a position of the force component of a portion of the user input received outside the touch sensitive area. In this way, the position of a touch component of the user input received outside the touch sensitive area, where the touch surface hardware is not capable of detecting the position of the touch component, may be determined, and operations may be executed based on the position of the touch component that is determined. | 08-22-2013 |
20130222223 | METHOD AND APPARATUS FOR INTERPRETING A GESTURE - A method, apparatus and computer program product are provided to facilitate user interaction with a display that is capable of presenting at least portions of the user interfaces of multiple devices, such as by recognizing and interpreting a gesture as providing input to one of the devices. In the context of a method, an identification of one or more valid gestures of at least a first device is received in an instance in which a plurality of devices interact such that portions of the respective user interfaces are capable of being presented upon a display. The method also includes receiving information indicative of a gesture and determining whether the gesture is valid. Depending upon whether the gesture is a valid gesture, the method also includes causing an indication of the gesture to be provided to the first device. | 08-29-2013 |
20130244625 | APPARATUS AND METHOD FOR PROVIDING FOR ENABLING USE OF AN APPLICATION BASED ON A DETERMINED OPERATING CONDITION AND SAFETY RATING - An apparatus, method, and computer program product are described that provide for an apparatus that can determine the operating condition of a device and a safety rating of an application executed by the device based on the operating condition that is determined. The apparatus may then cause a liability waiver regarding use of the application to be presented to the user for consideration and acceptance based on the safety rating that is determined. If the user accepts the liability waiver, the apparatus may further provide for the storage of an indication of acceptance of the liability waiver and, in turn, enable use of the application. In cases in which the user does not agree to the liability waiver, however, the apparatus may disable at least a portion of the functionality of the application and/or the device. | 09-19-2013 |
20130309972 | APPARATUS AND METHOD FOR DETECTING PROXIMATE DEVICES - An apparatus, method, and computer program product are described that provide for a user to share content with other users who are proximate to his or her device in a simple and intuitive manner. In some embodiments, a "wave" gesture is used to identify users of devices that are nearby the source user's device with whom the source user may communicate, such as to share content. Upon receiving a first orientation input, a scanning mode may be initiated during which one or more devices proximate the apparatus are determined. A second orientation input that is different from the first orientation input may then be received, and the scanning mode may be terminated in response. As a result, a communication with at least one selected device of the one or more devices determined to be proximate the apparatus may be facilitated. | 11-21-2013 |
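The two-orientation "wave" flow above can be sketched as a small state machine: the first orientation input enters scanning mode, devices found while scanning are collected, and a different second orientation input terminates the scan and yields the collected devices. Class and event names are illustrative assumptions.

```python
# Hedged sketch of the wave-gesture scan: start on the first orientation
# input, stop on a different orientation input, return devices found.

class WaveScanner:
    def __init__(self):
        self.scanning = False
        self.found = []

    def on_orientation(self, orientation, nearby_devices=()):
        if not self.scanning:
            self.start_orientation = orientation
            self.scanning = True           # first input: enter scanning mode
            return None
        if orientation != self.start_orientation:
            self.scanning = False          # different input: terminate scan
            return list(self.found)        # devices available for communication
        self.found.extend(nearby_devices)  # still scanning: collect devices
        return None
```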
20150382138 | LOCATION-BASED AUDIO MESSAGING - Mobile devices provide a variety of techniques for presenting messages from sources to a user. However, when the message pertains to the presence of the user at a location, the available communications techniques may exhibit deficiencies, e.g., reliance on the memory of the source and/or user of the existence and content of a message between its initiation and the user's visit to the location, or reliance on the communication accessibility of the user, the device, and/or the source during the user's location visit. Presented herein are techniques for enabling a mobile device, at a first time, to receive a request to present an audio message during the presence of the user at a location; and, at a second time, detecting the presence of the user at the location, and presenting the audio message to the user, optionally without awaiting a request from the user to present the message. | 12-31-2015 |
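The two-phase flow above, storing a message with a target location at a first time, then presenting it when the user arrives, can be sketched as a geofence check against stored messages. The haversine distance, radius semantics, and all names below are illustrative assumptions, not the claimed implementation.

```python
# Hypothetical sketch: queue audio messages against locations; when the
# device's position falls within a message's radius, the message is due
# for presentation and is removed from the queue.

import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class AudioMessageStore:
    def __init__(self):
        self.pending = []  # tuples of (lat, lon, radius_m, message)

    def add(self, lat, lon, radius_m, message):
        """First time: record a message to present at a location."""
        self.pending.append((lat, lon, radius_m, message))

    def on_position(self, lat, lon):
        """Second time: return messages due at this position, removing them."""
        due = [m for (la, lo, r, m) in self.pending
               if haversine_m(lat, lon, la, lo) <= r]
        self.pending = [p for p in self.pending
                        if haversine_m(lat, lon, p[0], p[1]) > p[2]]
        return due
```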