Patent application number | Description | Published |
20090327171 | RECOGNIZING GESTURES FROM FOREARM EMG SIGNALS - A machine learning model is trained by instructing a user to perform prescribed gestures, sampling signals from EMG sensors arranged arbitrarily on the user's forearm with respect to locations of muscles in the forearm, extracting feature samples from the sampled signals, labeling the feature samples according to the corresponding gestures instructed to be performed, and training the machine learning model with the labeled feature samples. Subsequently, gestures may be recognized using the trained machine learning model by sampling signals from the EMG sensors, extracting from the signals unlabeled feature samples of a same type as those extracted during the training, passing the unlabeled feature samples to the machine learning model, and outputting from the machine learning model indicia of a gesture classified by the machine learning model. | 12-31-2009 |
20100244767 | MAGNETIC INDUCTIVE CHARGING WITH LOW FAR FIELDS - A charging station wirelessly transmits power to mobile electronic devices (MEDs) each having a planar-shaped receiver coil (RC) and a capacitor connected in parallel across the RC. The station includes a planar charging surface, a number of series-interconnected bank A source coils (SCs), a number of series-interconnected bank B SCs, and electronics for energizing the SCs. Each SC generates a flux field perpendicular to the charging surface. The bank A and bank B SCs are interleaved and alternately energized in a repeating duty cycle. The coils in each bank are also alternately wound in a different direction so that the fields cancel each other out in a far-field environment. Whenever an MED is placed in close proximity to the charging surface, the fields wirelessly induce power in the RC. The MEDs can have any two-dimensional orientation with respect to the charging surface. | 09-30-2010 |
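The train/recognize loop described in 20090327171 can be sketched compactly. This is a hypothetical illustration only, not the patent's implementation: it uses per-channel RMS amplitude as the EMG feature and a nearest-centroid model in place of whatever classifier the application actually claims; all gesture names and signal values are invented.

```python
import math

def extract_features(window):
    # Root-mean-square amplitude per EMG channel: a common, simple EMG feature.
    return [math.sqrt(sum(x * x for x in ch) / len(ch)) for ch in window]

def train(labeled_windows):
    # Average the feature vectors for each gesture label (nearest-centroid model).
    sums, counts = {}, {}
    for label, window in labeled_windows:
        feats = extract_features(window)
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def recognize(model, window):
    # Classify an unlabeled window by distance to the nearest gesture centroid.
    feats = extract_features(window)
    return min(model, key=lambda lbl: sum((a - b) ** 2 for a, b in zip(model[lbl], feats)))

# Two EMG channels; "fist" shows high activity on channel 0, "point" on channel 1.
training = [
    ("fist",  [[0.9, -0.8, 1.0], [0.1, -0.1, 0.1]]),
    ("point", [[0.1, -0.1, 0.1], [0.9, -0.8, 1.0]]),
]
model = train(training)
gesture = recognize(model, [[0.8, -0.9, 0.7], [0.2, -0.1, 0.2]])
```

The key property the abstract emphasizes, that the sensors may sit arbitrarily on the forearm, is what the training phase compensates for: the model learns whatever feature pattern each gesture produces at the sensors' actual positions.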
Patent application number | Description | Published |
20130232552 | Automatic Context Sharing with Privacy - The subject disclosure is directed towards a technology by which a computing device user may share context-related information (e.g., including current activity) with other recipient machines. A requestor may request to peek at a user's context, and if the requestor is valid (pre-approved by the user), a response based on context-related information is sent, which may be via a cloud service. The response may be filtered and/or adjusted based upon the identity of the requestor and other information associated with that identity, e.g., filtering criteria set by the user. Also described is notifying the user of the peek request, and logging information corresponding to the request and response. A broadcast message may also be sent by the device to share context without waiting for a peek request. | 09-05-2013 |
20140160424 | MULTI-TOUCH INTERACTIONS ON EYEWEAR - The subject disclosure is directed towards eyewear configured as an input device, such as for interaction with a computing device. The eyewear includes a multi-touch sensor set, e.g., located on the frames of eyeglasses, that outputs signals representative of user interaction with the eyewear, such as via taps, presses, swipes and pinches. Sensor handling logic may be used to provide input data corresponding to the signals to a program, e.g., the program with which the user wants to interact. | 06-12-2014 |
20150071555 | Managing Access by Applications to Perceptual Information - Functionality is described herein by which plural environment-sensing applications capture information from an environment in a fine-grained and least-privileged manner. By doing so, the functionality reduces the risk that private information that appears within the environment will be released to unauthorized parties. Among other aspects, the functionality provides an error correction mechanism for reducing the incidence of false positives in the detection of objects, an offloading technique for delegating computationally intensive recognition tasks to a remote computing framework, and a visualization module by which a user may inspect the access rights to be granted (or already granted) to each application. | 03-12-2015 |
20150196209 | CARDIOVASCULAR RISK FACTOR SENSING DEVICE - Various technologies described herein pertain to sensing cardiovascular risk factors of a user. A chair includes one or more sensors configured to output signals indicative of conditions at site(s) on a body of a user. A seat of the chair, a back of the chair, and/or arms of the chair can include the sensor(s). Moreover, the chair includes a collection circuit configured to receive the signals from the sensor(s). A risk factor evaluation component is configured to detect a pulse wave velocity of the user based on the signals from the sensor(s). The risk factor evaluation component is further configured to perform a pulse wave analysis of the user based on a morphology of a pulse pressure waveform of the user, and the pulse pressure waveform is detected based on the signals from the sensor(s). | 07-16-2015 |
20150199480 | CONTROLLING HEALTH SCREENING VIA ENTERTAINMENT EXPERIENCES - Various technologies described herein pertain to controlling performance of a health assessment of a user in an entertainment venue. Data in a health record of the user is accessed, where the health record is retained in computer-readable storage. The user is located at the entertainment venue, and the entertainment venue includes an attraction. A health parameter of the user to be measured as part of the health assessment performed in the entertainment venue is selected based on the data in the health record of the user. Further, an interaction between the user and the attraction of the entertainment venue is controlled based on the health parameter to be measured. Data indicative of the health parameter of the user is computed based on a signal output by a sensor. The signal is output by the sensor during the interaction between the user and the attraction of the entertainment venue. | 07-16-2015 |
20150199484 | USING SENSORS AND DEMOGRAPHIC DATA TO AUTOMATICALLY ADJUST MEDICATION DOSES - Various technologies described herein pertain to adjusting recommended dosages of a medication for a user in a non-clinical environment. The medication can be identified and an indication of a symptom of the user desirably managed by the medication can be received. An initial recommended dosage of the medication can be determined based on static data of the user and the symptom. Dynamic data indicative of efficacy of the medication for the user over time in the non-clinical environment can be collected from sensor(s) in the non-clinical environment. The dynamic data indicative of the efficacy of the medication can include data indicative of the symptom and data indicative of a side effect experienced by the user as a result of the medication. A subsequent recommended dosage of the medication can be refined based on the static data of the user and the dynamic data indicative of the efficacy of the medication for the user. | 07-16-2015 |
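The pulse wave velocity detection in 20150196209 reduces to a simple ratio once two pulse-arrival times are known. A minimal sketch, with the sensor separation and timings invented for illustration (the application itself covers how the chair's sensors produce these signals, not this specific formula):

```python
def pulse_wave_velocity(distance_m, t_proximal_s, t_distal_s):
    """PWV = separation between two body sites / pulse transit time between them."""
    transit = t_distal_s - t_proximal_s
    if transit <= 0:
        raise ValueError("distal pulse must arrive after proximal pulse")
    return distance_m / transit

# Hypothetical chair geometry: a pulse detected at a back sensor arrives at an
# arm sensor 0.50 m away, 60 ms later.
pwv = pulse_wave_velocity(0.50, t_proximal_s=0.120, t_distal_s=0.180)
```

Higher PWV indicates stiffer arteries, which is why the abstract pairs it with pulse wave analysis as a cardiovascular risk factor.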
Patent application number | Description | Published |
20150228062 | RESTAURANT-SPECIFIC FOOD LOGGING FROM IMAGES - A “Food Logger” provides various approaches for learning or training one or more image-based models (referred to herein as “meal models”) of nutritional content of meals. This training is based on one or more datasets of images of meals in combination with “meal features” that describe various parameters of the meal. Examples of meal features include, but are not limited to, food type, meal contents, portion size, nutritional content (e.g., calories, vitamins, minerals, carbohydrates, protein, salt, etc.), food source (e.g., specific restaurants or restaurant chains, grocery stores, particular pre-packaged foods, school meals, meals prepared at home, etc.). Given the trained models, the Food Logger automatically provides estimates of nutritional information based on automated recognition of new images of meals provided by (or for) the user. This nutritional information is then used to enable a wide range of user-centric interactions relating to food consumed by individual users. | 08-13-2015 |
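Once the Food Logger of 20150228062 has recognized a meal, the nutrition estimate is conceptually a lookup over the trained meal model. The sketch below stands in for that model with a hand-written table; the restaurant name, dishes, and nutrition values are all invented placeholders:

```python
# Stand-in for a trained "meal model": recognized (source, dish) -> nutrition.
MEAL_MODEL = {
    ("ExampleBurger", "cheeseburger"): {"calories": 550, "protein_g": 30, "salt_g": 2.1},
    ("ExampleBurger", "fries"):        {"calories": 320, "protein_g": 4,  "salt_g": 0.8},
}

def estimate_nutrition(recognized_items, portion_scale=1.0):
    """Sum nutrition over recognized meal items, scaled by estimated portion size."""
    total = {}
    for item in recognized_items:
        for key, value in MEAL_MODEL[item].items():
            total[key] = total.get(key, 0) + value * portion_scale
    return total

meal = estimate_nutrition(
    [("ExampleBurger", "cheeseburger"), ("ExampleBurger", "fries")],
)
```

The restaurant-specific aspect matters because the same dish name can differ greatly in portion and preparation between sources, which is exactly the variability a per-restaurant model absorbs.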
Patent application number | Description | Published |
20120319959 | DEVICE INTERACTION THROUGH BARRIER - There is provided an electronic device having a touch-sensing element configured for sensing touches on a surface thereof. A baseline sensitivity setting determines a sensitivity of the touch-sensing element. The touch-sensing element is configured to register a touch that meets or exceeds the baseline sensitivity setting, and to ignore a touch that does not meet the baseline sensitivity setting. The device further includes a sensor that senses an operating condition of the device. A memory of the device includes code executable by the device and configured to adjust the baseline sensitivity setting based upon the sensed operating condition. | 12-20-2012 |
20140249398 | DETERMINING PULSE TRANSIT TIME NON-INVASIVELY USING HANDHELD DEVICES - A system and method to determine pulse transit time (PTT) using a handheld device. The method includes generating an electrocardiogram (EKG) for a user of the handheld device. Two portions of the user's body are in contact with two contact points of the handheld device. The method also includes de-noising the EKG to identify a start time when a blood pulse leaves a heart of the user. The method further includes de-noising a plurality of video images of the user to identify a pressure wave at an arterial site and a time when the pressure wave appears. Additionally, the method includes determining the PTT based on the de-noised EKG and the de-noised video images. | 09-04-2014 |
20140257533 | AUTOMATIC EXERCISE SEGMENTATION AND RECOGNITION - A physical activity monitoring device includes a sensor array with one or more sensors configured to measure physical activity attributes of a user. A controller automatically determines time intervals where the user is actively engaged in a physical activity based on the physical activity attributes. The controller also automatically determines a type of physical activity the user is actively engaged in during the determined time intervals based on the physical activity attributes. A reporter outputs information regarding the type of physical activity to the user. | 09-11-2014 |
20140257534 | EXTENDING GAMEPLAY WITH PHYSICAL ACTIVITY MONITORING DEVICE - A physical activity monitoring device receives an indication of one or more physical activities to be performed as an extension of a game being played on a game system and measures physical activity attributes of a user wearing the physical activity monitoring device. The physical activity monitoring device determines the user's progress towards completion of the one or more physical activities based on the physical activity attributes and outputs to the game device an indication of the user's progress towards completion of the one or more physical activities. | 09-11-2014 |
20140257535 | PERSONAL TRAINING WITH PHYSICAL ACTIVITY MONITORING DEVICE - A physical activity monitoring device receives a workout regimen including a plurality of exercises. For each of the plurality of exercises, the physical activity monitoring device indicates that exercise to a user and measures physical activity attributes of the user. The physical activity monitoring device outputs information regarding the user's progress towards completion of that exercise based on the physical activity attributes. | 09-11-2014 |
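The final step of 20140249398, combining the de-noised EKG with the de-noised video, amounts to pairing each R-peak with the next pulse arrival at the arterial site. A minimal sketch of that pairing, with all timestamps invented (the de-noising itself is outside this illustration):

```python
def pulse_transit_time(r_peak_times_s, pulse_arrival_times_s):
    """Pair each EKG R-peak with the next pulse arrival at the arterial site
    and return the mean delay, i.e. the pulse transit time (PTT)."""
    deltas = []
    for r in r_peak_times_s:
        later = [p for p in pulse_arrival_times_s if p > r]
        if later:
            deltas.append(min(later) - r)
    if not deltas:
        raise ValueError("no pulse arrival follows any R-peak")
    return sum(deltas) / len(deltas)

# Hypothetical de-noised R-peak times (EKG) and pulse arrivals (video frames).
ptt = pulse_transit_time([0.00, 0.80, 1.60], [0.22, 1.03, 1.82])
```

Averaging over several heartbeats, as here, is a common way to suppress residual timing jitter in either signal.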
Patent application number | Description | Published |
20090326406 | WEARABLE ELECTROMYOGRAPHY-BASED CONTROLLERS FOR HUMAN-COMPUTER INTERFACE - A “Wearable Electromyography-Based Controller” includes a plurality of Electromyography (EMG) sensors and provides a wired or wireless human-computer interface (HCI) for interacting with computing systems and attached devices via electrical signals generated by specific movement of the user's muscles. Following initial automated self-calibration and positional localization processes, measurement and interpretation of muscle generated electrical signals is accomplished by sampling signals from the EMG sensors of the Wearable Electromyography-Based Controller. In operation, the Wearable Electromyography-Based Controller is donned by the user and placed into a coarsely approximate position on the surface of the user's skin. Automated cues or instructions are then provided to the user for fine-tuning placement of the Wearable Electromyography-Based Controller. Examples of Wearable Electromyography-Based Controllers include articles of manufacture, such as an armband, wristwatch, or article of clothing having a plurality of integrated EMG-based sensor nodes and associated electronics. | 12-31-2009 |
20120188158 | WEARABLE ELECTROMYOGRAPHY-BASED HUMAN-COMPUTER INTERFACE - A “Wearable Electromyography-Based Controller” includes a plurality of Electromyography (EMG) sensors and provides a wired or wireless human-computer interface (HCI) for interacting with computing systems and attached devices via electrical signals generated by specific movement of the user's muscles. Following initial automated self-calibration and positional localization processes, measurement and interpretation of muscle generated electrical signals is accomplished by sampling signals from the EMG sensors of the Wearable Electromyography-Based Controller. In operation, the Wearable Electromyography-Based Controller is donned by the user and placed into a coarsely approximate position on the surface of the user's skin. Automated cues or instructions are then provided to the user for fine-tuning placement of the Wearable Electromyography-Based Controller. Examples of Wearable Electromyography-Based Controllers include articles of manufacture, such as an armband, wristwatch, or article of clothing having a plurality of integrated EMG-based sensor nodes and associated electronics. | 07-26-2012 |
20130232095 | RECOGNIZING FINGER GESTURES FROM FOREARM EMG SIGNALS - A machine learning model is trained by instructing a user to perform various predefined gestures, sampling signals from EMG sensors arranged arbitrarily on the user's forearm with respect to locations of muscles in the forearm, extracting feature samples from the sampled signals, labeling the feature samples according to the corresponding gestures instructed to be performed, and training the machine learning model with the labeled feature samples. Subsequently, gestures may be recognized using the trained machine learning model by sampling signals from the EMG sensors, extracting from the signals unlabeled feature samples of a same type as those extracted during the training, passing the unlabeled feature samples to the machine learning model, and outputting from the machine learning model indicia of a gesture classified by the machine learning model. | 09-05-2013 |
20140274159 | INFERRING PLACEMENT OF MOBILE ELECTRONIC DEVICES - A “Placement Detector” enables handheld or mobile electronic devices such as phones, media players, tablets, etc., to infer their current position or placement. Placement inference is performed by evaluating signals from one or more sensors associated with the device against one or more trained probabilistic models to infer the device's placement relative to a user. Example placement inferences include, but are not limited to, inferring whether the device is currently in a user's pocket, in a user's purse (or other carrying bag or backpack), in a closed area such as a drawer or box, in an open area such as on a table, indoors, outdoors, etc. These types of placement inferences facilitate a wide range of automated user-device interactions, including, but not limited to, placement-dependent notifications, placement-dependent responses to various inputs, prevention of inadvertent “pocket dialing,” prevention of inadvertent power cycling of devices, lost or misplaced device location assistance, etc. | 09-18-2014 |
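The "trained probabilistic models" in 20140274159 can be illustrated with per-placement Gaussian models over a couple of sensor features, scored by log-likelihood. Everything below is an invented toy: the feature names, the two placements, and all model parameters are assumptions, not the application's actual models or sensors.

```python
import math

# Hypothetical per-placement Gaussian models (mean, std dev) over two features:
# normalized ambient light level and motion variance.
MODELS = {
    "pocket": {"light": (0.05, 0.05), "motion": (0.60, 0.30)},
    "table":  {"light": (0.80, 0.20), "motion": (0.05, 0.05)},
}

def log_likelihood(features, model):
    # Sum of univariate Gaussian log-densities, assuming independent features.
    total = 0.0
    for name, x in features.items():
        mu, sigma = model[name]
        total += -((x - mu) ** 2) / (2 * sigma ** 2) - math.log(sigma * math.sqrt(2 * math.pi))
    return total

def infer_placement(features):
    # Pick the placement whose trained model best explains the observations.
    return max(MODELS, key=lambda p: log_likelihood(features, MODELS[p]))

placement = infer_placement({"light": 0.02, "motion": 0.50})  # dark and jostled
```

A dark, jostled reading scores far better under the "pocket" model than the "table" model, which is how the detector can suppress pocket dialing or choose a louder notification.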