Patent application number | Description | Published |
20140016860 | FACIAL ANALYSIS TO DETECT ASYMMETRIC EXPRESSIONS - A system and method for facial analysis to detect asymmetric expressions is disclosed. A series of facial images is collected, and an image from the series of images is evaluated with a classifier. The image is then flipped to create a flipped image. Then, the flipped image is evaluated with the classifier. The results of the evaluation of the original image and the flipped image are compared. Asymmetric features such as a wink, a raised eyebrow, a smirk, or a wince are identified. These asymmetric features are associated with mental states such as skepticism, contempt, condescension, repugnance, disgust, disbelief, cynicism, pessimism, doubt, suspicion, and distrust. | 01-16-2014 |
20140051047 | SPORADIC COLLECTION OF MOBILE AFFECT DATA - A user may react to an interaction by exhibiting a mental state. A camera or other monitoring device can be used to capture one or more manifestations of the user's mental state, such as facial expressions, electrodermal activity, or movements. However, there may be conditions where the monitoring device is not able to detect the manifestation continually. Thus, various capabilities and implementations are described where the mental state data is collected on an intermittent basis, analyzed, and an output rendered based on the analysis of the mental state data. | 02-20-2014 |
20140058828 | OPTIMIZING MEDIA BASED ON MENTAL STATE ANALYSIS - Mental state data is collected from a group of people as they view a media presentation, such as an advertisement, a television show, or a movie. The mental state data is analyzed to produce mental state information, such as inferred mental states, facial expressions, or valence. The mental state information is used to automatically optimize the previously viewed media presentation. The optimization may change various aspects of the media presentation including the length of different portions of the media presentation, the overall length of the media presentation, character selection, music selection, advertisement placement, and brand reveal time. | 02-27-2014 |
20140112540 | COLLECTION OF AFFECT DATA FROM MULTIPLE MOBILE DEVICES - A user interacts with various pieces of technology to perform numerous tasks and activities. Reactions can be observed and mental states inferred from these interactions. Multiple devices, including mobile devices, can observe and record or transmit a user's mental state data. The mental state data collected from the multiple devices can be used to analyze the mental states of the user. The mental state data can be in the form of facial expressions, electrodermal activity, movements, or other detectable manifestations. Multiple cameras on the multiple devices can be usefully employed to collect facial data. An output can be rendered based on an analysis of the mental state data. | 04-24-2014 |
20140200416 | MENTAL STATE ANALYSIS USING HEART RATE COLLECTION BASED ON VIDEO IMAGERY - Video of one or more people is obtained and analyzed. Heart rate information is determined from the video and the heart rate information is used in mental state analysis. The heart rate information and resulting mental state analysis are correlated to stimuli, such as digital media which is consumed or with which a person interacts. The heart rate information is used to infer mental states. The mental state analysis, based on the heart rate information, can be used to optimize digital media or modify a digital game. | 07-17-2014 |
20140200417 | MENTAL STATE ANALYSIS USING BLINK RATE - Mental state analysis is performed by obtaining video of an individual as the individual interacts with a computer, either by performing various operations or by consuming a media presentation. The video is analyzed to determine eye-blink information for the individual, such as eye-blink rate or eye-blink duration. A mental state of the individual is then inferred based on the eye-blink information. The eye-blink information and associated mental states can be used to modify an advertisement, a media presentation, or a digital game. | 07-17-2014 |
20140200463 | MENTAL STATE WELL BEING MONITORING - The mental state of an individual is obtained to determine their well-being status. The mental state is derived from an analysis of facial information and physiological information of an individual. The well-being status of other individuals is correlated to the well-being status of the first individual. The well-being status of the individual or group of individuals is rendered for display. The well-being status of an individual is used to provide feedback and to recommend activities for the individual. | 07-17-2014 |
20140201207 | MENTAL STATE DATA TAGGING FOR DATA COLLECTED FROM MULTIPLE SOURCES - Mental state data useful for determining mental state information on an individual, such as video of an individual's face, is captured. Additional data that is helpful in determining the mental state information, such as contextual information, is also determined. The data and additional data allows interpretation of individual mental state information. The additional data is tagged to the mental state data and at least some of the mental state data, along with the tagged data, can be sent to a web service where it is used to produce further mental state information. | 07-17-2014 |
20140323817 | PERSONAL EMOTIONAL PROFILE GENERATION - The mental state of an individual is obtained in order to generate an emotional profile for the individual. The individual's mental state is derived from an analysis of the individual's facial and physiological information. The emotional profiles of other individuals are correlated with that of the first individual for comparison. Various categories of emotional profiles are defined based upon the correlation. The emotional profile of the individual or group of individuals is rendered for display, used to provide feedback and to recommend activities for the individual, or used to provide information about the individual. | 10-30-2014 |
20140357976 | MENTAL STATE ANALYSIS USING AN APPLICATION PROGRAMMING INTERFACE - A mobile device is emotionally enabled using an application programming interface (API) in order to infer a user's emotions and make the emotions available for sharing. Images of an individual or individuals are captured and sent through the API. The images are evaluated to determine the individual's mental state. Mental state analysis is output to an app running on the device on which the API resides for further sharing, analysis, or transmission. A software development kit (SDK) can be used to generate the API or to otherwise facilitate the emotional enablement of a mobile device and the apps that run on the device. | 12-04-2014 |
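The flip-and-compare procedure described in application 20140016860 (evaluate an image with a classifier, mirror the image, evaluate again, and compare the two results) can be sketched as follows. The `asymmetry_score` helper and the toy classifier are illustrative assumptions, not from the filing:

```python
import numpy as np

def asymmetry_score(image, classifier):
    """Score facial asymmetry by comparing classifier output on an image
    and its horizontal mirror, per the flip-and-compare idea in the
    abstract. `classifier` maps an image to a scalar score."""
    original = classifier(image)
    flipped = classifier(np.fliplr(image))  # mirror left-right
    # A symmetric expression yields similar scores on both images; a large
    # gap suggests a one-sided feature such as a wink or a smirk.
    return abs(original - flipped)

# Toy classifier for demonstration: responds to brightness on the
# left half of the "face" only.
toy = lambda img: float(img[:, : img.shape[1] // 2].mean())

symmetric = np.ones((4, 4))
lopsided = np.zeros((4, 4))
lopsided[:, :2] = 1.0  # bright left half only

print(asymmetry_score(symmetric, toy))  # 0.0 -> symmetric
print(asymmetry_score(lopsided, toy))   # 1.0 -> strongly asymmetric
```

In practice the classifier would be a trained facial-expression model rather than a pixel average; the comparison logic is the same.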
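Application 20140200417 infers mental states from eye-blink rate and duration. One common way to extract those quantities, sketched below under assumptions the abstract does not make, is to threshold a per-frame eye-openness signal (such as an eye aspect ratio) and treat each run of closed-eye frames as one blink:

```python
def blink_stats(openness, fps, threshold=0.2):
    """Derive blink rate (blinks per minute) and per-blink durations
    (seconds) from a per-frame eye-openness signal. The threshold value
    and signal definition are assumptions for illustration."""
    blinks = []  # length in frames of each closed-eye run
    run = 0
    for value in openness:
        if value < threshold:
            run += 1          # eye closed this frame
        elif run:
            blinks.append(run)  # closed run just ended: one blink
            run = 0
    if run:
        blinks.append(run)      # signal ended mid-blink
    minutes = len(openness) / fps / 60.0
    rate = len(blinks) / minutes if minutes else 0.0
    durations = [n / fps for n in blinks]
    return rate, durations

# One second of 30 fps video containing two dips below the threshold.
signal = [0.3] * 10 + [0.1] * 3 + [0.3] * 10 + [0.1] * 2 + [0.3] * 5
rate, durations = blink_stats(signal, fps=30)
print(rate, durations)  # two blinks in one second -> 120 blinks/minute
```

The resulting rate and duration values are the "eye-blink information" the abstract feeds into mental state inference.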
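Application 20140200416 determines heart rate from video but the abstract does not specify how. A common remote-photoplethysmography recipe, shown here purely as a hedged sketch, averages the green channel over the face region each frame and reads the dominant frequency in a plausible pulse band:

```python
import numpy as np

def estimate_heart_rate(frames, fps):
    """Estimate pulse (beats per minute) from video frames: average the
    green channel per frame, remove the DC component, then take the
    strongest spectral peak between 0.75 and 4 Hz (45-240 bpm). One
    common approach, not necessarily the method in the filing."""
    signal = np.array([f[..., 1].mean() for f in frames])  # green channel
    signal = signal - signal.mean()                        # remove DC offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal))
    band = (freqs >= 0.75) & (freqs <= 4.0)                # plausible pulse band
    peak = freqs[band][np.argmax(power[band])]
    return peak * 60.0                                     # Hz -> bpm
```

With real video, face detection and band-pass filtering would precede this step; the spectral-peak idea is the core of it.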