Patent application number | Description | Published |
20110125734 | QUESTIONS AND ANSWERS GENERATION - A system, method and/or computer program product for automatically generating questions and answers based on any corpus of data. The computer system, given a collection of textual documents, automatically generates collections of questions about the documents together with answers to those questions. In particular, such a process can be applied to the so-called ‘open’ domain, where the type of the corpus is not given in advance, and neither is the ontology of the corpus. The system improves the exploration of large bodies of textual information. Applications implementing the system and method include new types of tutoring systems, educational question-answering games, national security and business analysis systems, etc. | 05-26-2011 |
20120077178 | SYSTEM AND METHOD FOR DOMAIN ADAPTATION IN QUESTION ANSWERING - The present disclosure relates generally to question answering systems and methods and, particularly, to systems and methods for domain adaptation in question answering. | 03-29-2012 |
20120078062 | DECISION-SUPPORT APPLICATION AND SYSTEM FOR MEDICAL DIFFERENTIAL-DIAGNOSIS AND TREATMENT USING A QUESTION-ANSWERING SYSTEM - A decision-support system for medical diagnosis and treatment comprises software modules embodied on a computer readable medium, and the software modules comprise an input/output module and a question-answering module. The method receives patient case information using the input/output module, and generates a medical diagnosis or treatment query based on the patient case information and also generates a plurality of medical diagnosis or treatment answers for the query using the question-answering module. The method also calculates numerical values for multiple medical evidence dimensions from medical evidence sources for each of the answers using the question-answering module and also calculates a corresponding confidence value for each of the answers based on the numerical value of each evidence dimension using the question-answering module. The method further outputs the medical diagnosis or treatment answers, the corresponding confidence values, and the numerical values of each medical evidence dimension for one or more selected medical diagnosis or treatment answers using the input/output module. | 03-29-2012 |
20120078636 | EVIDENCE DIFFUSION AMONG CANDIDATE ANSWERS DURING QUESTION ANSWERING - Diffusing evidence among candidate answers during question answering may identify a relationship between a first candidate answer and a second candidate answer, wherein the candidate answers are generated by a question-answering computer process, the candidate answers have associated supporting evidence, and the candidate answers have associated confidence scores. All or some of the evidence may be transferred from the first candidate answer to the second candidate answer based on the identified relationship. A new confidence score may be computed for the second candidate answer based on the transferred evidence. | 03-29-2012 |
20120078826 | FACT CHECKING USING AND AIDING PROBABILISTIC QUESTION ANSWERING - A system, a method and a computer program product for verifying a statement are provided. The system is configured to receive a statement. The system is configured to decompose the received statement into one or more sets of question and answer pairs. The system is configured to determine a confidence value of each answer in the one or more question and answer pair sets. The system is configured to combine the determined confidence values. The combined confidence value represents a probability that the received statement is evaluated as true. | 03-29-2012 |
20120078837 | DECISION-SUPPORT APPLICATION AND SYSTEM FOR PROBLEM SOLVING USING A QUESTION-ANSWERING SYSTEM - A decision-support system for problem solving comprises software modules embodied on a computer readable medium, and the software modules comprise an input/output module and a question-answering module. The method receives problem case information using the input/output module, generates a query based on the problem case information, and generates a plurality of answers for the query using the question-answering module. The method also calculates numerical values for multiple evidence dimensions from evidence sources for each of the answers using the question-answering module and calculates a corresponding confidence value for each of the answers based on the numerical value of each evidence dimension using the question-answering module. Further, the method outputs the answers, the corresponding confidence values, and the numerical values of each evidence dimension for one or more selected answers using the input/output module. | 03-29-2012 |
20120078873 | USING ONTOLOGICAL INFORMATION IN OPEN DOMAIN TYPE COERCION - A computer-implemented system, method and program product generates answers to questions in an input query text string. The method includes determining, by a programmed processor unit, a lexical answer type (LAT) string associated with an input query; automatically obtaining a candidate answer string to the input query from a data corpus; mapping the query LAT string to a first type string in a structured resource; mapping the candidate answer string to a second type string in the structured resource; determining whether the first type string and the second type string are disjoint; and scoring the candidate answer string based on that determination, wherein the structured resource includes a semantic database providing ontological content. | 03-29-2012 |
20120078888 | PROVIDING ANSWERS TO QUESTIONS USING LOGICAL SYNTHESIS OF CANDIDATE ANSWERS - A method, system and computer program product for generating answers to questions. In one embodiment, the method comprises receiving an input query, decomposing the input query into a plurality of different subqueries, and conducting a search in one or more data sources to identify at least one candidate answer to each of the subqueries. A ranking function is applied to each of the candidate answers to determine a ranking for each of these candidate answers; and for each of the subqueries, one of the candidate answers to the subquery is selected based on this ranking. A logical synthesis component is applied to synthesize a candidate answer for the input query from the selected candidate answers to the subqueries. In one embodiment, the procedure applied by the logical synthesis component to synthesize the candidate answer for the input query is determined from the input query. | 03-29-2012 |
20120078889 | PROVIDING ANSWERS TO QUESTIONS USING HYPOTHESIS PRUNING - A method, system and computer program product for generating answers to questions. In one embodiment, the method comprises receiving a query, conducting a search through one or more data sources to identify candidate answers to the query, and providing each of the candidate answers with a preliminary score. The method further comprises filtering out any of the candidate answers with a preliminary score that does not satisfy a defined condition. The candidate answers having preliminary scores that satisfy this condition form a subset of the candidate answers. Each of the candidate answers in this subset is processed to produce further scores. A ranking function is applied to these further scores to determine a ranking for each of the candidate answers in the subset; and after this ranking function is applied, one or more of the candidate answers are selected as one or more final answers to the query. | 03-29-2012 |
20120078890 | LEXICAL ANSWER TYPE CONFIDENCE ESTIMATION AND APPLICATION - A system, method and computer program product for automatically estimating the confidence of a detected LAT to provide a more accurate overall score for an obtained candidate answer. A confidence “score” or value of each detected LAT is obtained, and the system and method combine the confidence score with a degree of match between the LAT and an AnswerType of the candidate answer to provide an improved overall score for the candidate answer. | 03-29-2012 |
20120078891 | PROVIDING ANSWERS TO QUESTIONS USING MULTIPLE MODELS TO SCORE CANDIDATE ANSWERS - A method, system and computer program product for generating answers to questions. In one embodiment, the method comprises receiving an input query; conducting a search to identify candidate answers to the input query, and producing a plurality of scores for each of the candidate answers. For each of the candidate answers, one, of a plurality of candidate ranking functions, is selected. This selected ranking function is applied to each of the candidate answers to determine a ranking for the candidate answer based on the scores for that candidate answer. One or more of the candidate answers is selected, based on the rankings for the candidate answers, as one or more answers to the input query. In an embodiment, the ranking function selection is performed using information about the question. In an embodiment, the ranking function selection is performed using information about each answer. | 03-29-2012 |
20120078902 | PROVIDING QUESTION AND ANSWERS WITH DEFERRED TYPE EVALUATION USING TEXT WITH LIMITED STRUCTURE - A system, method and computer program product for conducting questions and answers with deferred type evaluation based on any corpus of data. The method includes processing a query including waiting until a “Type” (i.e. a descriptor) is determined AND a candidate answer is provided. Then, a search is conducted for evidence that the candidate answer has the required Lexical Answer Type (e.g., as determined by a matching function that can leverage a parser, a semantic interpreter and/or a simple pattern matcher). Prior to or during candidate answer evaluation, a process is provided for extracting and storing collections of entity-type pairs from semi-structured text documents. During QA processing and candidate answer scoring, a process is implemented to match the query LAT against the lexical type of each provided candidate answer and generate a score judging a degree of match. | 03-29-2012 |
20120078926 | EFFICIENT PASSAGE RETRIEVAL USING DOCUMENT METADATA - A system, method and computer program product for efficiently retrieving relevant passages to questions based on a corpus of data. A processor device receives an input query and performs a query analysis to obtain searchable query terms. The processor performs: matching metadata associated with one or more documents against the query terms. The document metadata includes one or more of: a title of the documents, one or more user tags or clouds. Then the processor device performs: mapping matched document metadata to corresponding one or more documents; identifying corresponding matched documents to form a subcorpus of documents; and conducting a search in the data subcorpus using the searchable query terms to obtain one or more passages relevant to the input query from the identified documents. | 03-29-2012 |
20120084076 | CONTEXT-BASED DISAMBIGUATION OF ACRONYMS AND ABBREVIATIONS - Context-based disambiguation of acronyms and/or abbreviations may determine a target abbreviation and one or more keywords appearing in context with the target abbreviation in a received passage, the target abbreviation representing a shortened form of one or more words. A contextual search query including the target abbreviation and said one or more keywords may be generated. A pseudo document index may be searched for one or more expansions of the target abbreviation by invoking the contextual search query, the pseudo document index containing an index of one or more pseudo documents, one or more associated abbreviations, and associated context keywords. One or more pseudo documents associated with the target abbreviation may be returned based on the searching of the pseudo document index. | 04-05-2012 |
20120084293 | PROVIDING ANSWERS TO QUESTIONS INCLUDING ASSEMBLING ANSWERS FROM MULTIPLE DOCUMENT SEGMENTS - A method, system and computer program product for generating answers to questions. In one embodiment, the method comprises receiving an input query, identifying a plurality of candidate answers to the query; and for at least one of these candidate answers, identifying at least one proof of the answer. This proof includes a series of premises, and a multitude of documents are identified that include references to the premises. A set of these documents is selected that include references to all of the premises. This set of documents is used to generate one or more scores for the one of the candidate answers. A defined procedure is applied to the candidate answers to determine a ranking for the answers, and this includes using the one or more scores for the at least one of the candidate answers in the defined procedure to determine the ranking for this one candidate answer. | 04-05-2012 |
20120089622 | SCORING CANDIDATES USING STRUCTURAL INFORMATION IN SEMI-STRUCTURED DOCUMENTS FOR QUESTION ANSWERING SYSTEMS - A system, program product, and methodology automatically scores candidate answers to questions in a question and answer system. In the candidate answer scoring method, a processor device performs one or more of receiving one or more candidate answers associated with a query string, the candidates obtained from a data source having semi-structured content; identifying one or more documents with semi-structured content from the data source having a candidate answer; and for each identified document: extracting one or more entity structures embedded in the identified document; determining a number of the entity structures in the identified document that appear in the received input query; and computing a score for a candidate answer in the document as a function of that number. Overall system efficiency is improved by giving the correct candidate answers higher scores through leveraging context-dependent structural information such as links to other documents and embedded tags. | 04-12-2012 |
20120131016 | EVIDENCE PROFILING - Evidence profiling, in one aspect, may receive a candidate answer and supporting pieces of evidence. An evidence profile may be generated, the evidence profile communicating a degree to which the evidence supports the candidate answer as being correct. The evidence profile may provide dimensions of evidence, and each dimension may support or refute the candidate answer as being correct. | 05-24-2012 |
20120301864 | USER INTERFACE FOR AN EVIDENCE-BASED, HYPOTHESIS-GENERATING DECISION SUPPORT SYSTEM - Systems and methods display at least one subject, and display a location for at least one user to enter at least one problem related to the subject. The problem comprises unknown items about which the user would like more information. In response to the problem, such systems and methods automatically generate evidence topics related to the problem, and automatically generate questions related to the problem and the evidence topics. Further, such systems and methods can receive additional questions from the user. In response to the questions, such systems and methods automatically generate answers to the questions by referring to sources, automatically calculate confidence measures of each of the answers, and then display the questions, the answers, and the confidence measures. When the user identifies one of the answers as a selected answer, such systems and methods display details of the sources and the factors used to generate the selected answer. | 11-29-2012 |
20120323906 | LEXICAL ANSWER TYPE CONFIDENCE ESTIMATION AND APPLICATION - A system, method and computer program product for automatically estimating the confidence of a detected LAT to provide a more accurate overall score for an obtained candidate answer. A confidence “score” or value of each detected LAT is obtained, and the system and method combine the confidence score with a degree of match between the LAT and an AnswerType of the candidate answer to provide an improved overall score for the candidate answer. | 12-20-2012 |
20120329032 | SCORING CANDIDATES USING STRUCTURAL INFORMATION IN SEMI-STRUCTURED DOCUMENTS FOR QUESTION ANSWERING SYSTEMS - A system, program product, and methodology automatically scores candidate answers to questions in a question and answer system. In the candidate answer scoring method, a processor device performs one or more of receiving one or more candidate answers associated with a query string, the candidates obtained from a data source having semi-structured content; identifying one or more documents with semi-structured content from the data source having a candidate answer; and for each identified document: extracting one or more entity structures embedded in the identified document; determining a number of the entity structures in the identified document that appear in the received input query; and computing a score for a candidate answer in the document as a function of that number. Overall system efficiency is improved by giving the correct candidate answers higher scores through leveraging context-dependent structural information such as links to other documents and embedded tags. | 12-27-2012 |
20120330648 | CONTEXT-BASED DISAMBIGUATION OF ACRONYMS AND ABBREVIATIONS - Context-based disambiguation of acronyms and/or abbreviations may determine a target abbreviation and one or more keywords appearing in context with the target abbreviation in a received passage, the target abbreviation representing a shortened form of one or more words. A contextual search query including the target abbreviation and said one or more keywords may be generated. A pseudo document index may be searched for one or more expansions of the target abbreviation by invoking the contextual search query, the pseudo document index containing an index of one or more pseudo documents, one or more associated abbreviations, and associated context keywords. One or more pseudo documents associated with the target abbreviation may be returned based on the searching of the pseudo document index. | 12-27-2012 |
20120330882 | FACT CHECKING USING AND AIDING PROBABILISTIC QUESTION ANSWERING - A system and a computer program product for verifying a statement are provided. The system is configured to receive a statement. The system is configured to decompose the received statement into one or more sets of question and answer pairs. The system is configured to determine a confidence value of each answer in the one or more question and answer pair sets. The system is configured to combine the determined confidence values. The combined confidence value represents a probability that the received statement is evaluated as true. | 12-27-2012 |
20120330921 | USING ONTOLOGICAL INFORMATION IN OPEN DOMAIN TYPE COERCION - A computer-implemented system, method and program product generates answers to questions in an input query text string. The method includes determining, by a programmed processor unit, a lexical answer type (LAT) string associated with an input query; automatically obtaining a candidate answer string to the input query from a data corpus; mapping the query LAT string to a first type string in a structured resource; mapping the candidate answer string to a second type string in the structured resource; determining whether the first type string and the second type string are disjoint; and scoring the candidate answer string based on that determination, wherein the structured resource includes a semantic database providing ontological content. | 12-27-2012 |
20120330934 | PROVIDING QUESTION AND ANSWERS WITH DEFERRED TYPE EVALUATION USING TEXT WITH LIMITED STRUCTURE - A system, method and computer program product for conducting questions and answers with deferred type evaluation based on any corpus of data. The method includes processing a query including waiting until a “Type” (i.e. a descriptor) is determined AND a candidate answer is provided. Then, a search is conducted for evidence that the candidate answer has the required Lexical Answer Type (e.g., as determined by a matching function that can leverage a parser, a semantic interpreter and/or a simple pattern matcher). Prior to or during candidate answer evaluation, a process is provided for extracting and storing collections of entity-type pairs from semi-structured text documents. During QA processing and candidate answer scoring, a process is implemented to match the query LAT against the lexical type of each provided candidate answer and generate a score judging a degree of match. | 12-27-2012 |
20120331003 | EFFICIENT PASSAGE RETRIEVAL USING DOCUMENT METADATA - A system, method and computer program product for efficiently retrieving relevant passages to questions based on a corpus of data. A processor device receives an input query and performs a query analysis to obtain searchable query terms. The processor performs: matching metadata associated with one or more documents against the query terms. The document metadata includes one or more of: a title of the documents, one or more user tags or clouds. Then the processor device performs: mapping matched document metadata to corresponding one or more documents; identifying corresponding matched documents to form a subcorpus of documents; and conducting a search in the data subcorpus using the searchable query terms to obtain one or more passages relevant to the input query from the identified documents. | 12-27-2012 |
20130006641 | PROVIDING ANSWERS TO QUESTIONS USING LOGICAL SYNTHESIS OF CANDIDATE ANSWERS - A method, system and computer program product for generating answers to questions. In one embodiment, the method comprises receiving an input query, decomposing the input query into a plurality of different subqueries, and conducting a search in one or more data sources to identify at least one candidate answer to each of the subqueries. A ranking function is applied to each of the candidate answers to determine a ranking for each of these candidate answers; and for each of the subqueries, one of the candidate answers to the subquery is selected based on this ranking. A logical synthesis component is applied to synthesize a candidate answer for the input query from the selected candidate answers to the subqueries. In one embodiment, the procedure applied by the logical synthesis component to synthesize the candidate answer for the input query is determined from the input query. | 01-03-2013 |
20130007055 | PROVIDING ANSWERS TO QUESTIONS USING MULTIPLE MODELS TO SCORE CANDIDATE ANSWERS - A method, system and computer program product for generating answers to questions. In one embodiment, the method comprises receiving an input query; conducting a search to identify candidate answers to the input query, and producing a plurality of scores for each of the candidate answers. For each of the candidate answers, one, of a plurality of candidate ranking functions, is selected. This selected ranking function is applied to each of the candidate answers to determine a ranking for the candidate answer based on the scores for that candidate answer. One or more of the candidate answers is selected, based on the rankings for the candidate answers, as one or more answers to the input query. In an embodiment, the ranking function selection is performed using information about the question. In an embodiment, the ranking function selection is performed using information about each answer. | 01-03-2013 |
20130013547 | EVIDENCE PROFILING - Evidence profiling, in one aspect, may receive a candidate answer and supporting pieces of evidence. An evidence profile may be generated, the evidence profile communicating a degree to which the evidence supports the candidate answer as being correct. The evidence profile may provide dimensions of evidence, and each dimension may support or refute the candidate answer as being correct. | 01-10-2013 |
20130013615 | PROVIDING ANSWERS TO QUESTIONS INCLUDING ASSEMBLING ANSWERS FROM MULTIPLE DOCUMENT SEGMENTS - A method, system and computer program product for generating answers to questions. In one embodiment, the method comprises receiving an input query, identifying a plurality of candidate answers to the query; and for at least one of these candidate answers, identifying at least one proof of the answer. This proof includes a series of premises, and a multitude of documents are identified that include references to the premises. A set of these documents is selected that include references to all of the premises. This set of documents is used to generate one or more scores for the one of the candidate answers. A defined procedure is applied to the candidate answers to determine a ranking for the answers, and this includes using the one or more scores for the at least one of the candidate answers in the defined procedure to determine the ranking for this one candidate answer. | 01-10-2013 |
20130017523 | UTILIZING FAILURES IN QUESTION AND ANSWER SYSTEM RESPONSES TO ENHANCE THE ACCURACY OF QUESTION AND ANSWER SYSTEMS - A method of enhancing the accuracy of a question-answer system. Missing information from a corpus of data is identified. The missing information is any information that improves a confidence for a candidate answer to a question. A follow-on inquiry is generated. The follow-on inquiry prompts for the missing information to be provided. The follow-on inquiry is output to an external source. A response to the follow-on inquiry is received from the external source. The response is added to the corpus of data. | 01-17-2013 |
20130017524 | UTILIZING FAILURES IN QUESTION AND ANSWER SYSTEM RESPONSES TO ENHANCE THE ACCURACY OF QUESTION AND ANSWER SYSTEMS - A computerized device for enhancing the accuracy of a question-answer system. The computerized device comprises a question-answer system comprising software for performing a plurality of question answering processes. A receiver receives a question into the question-answer system. A processor that generates a plurality of candidate answers to the question is connected to the question-answer system. The processor determines a confidence score for each of the plurality of candidate answers. The processor evaluates sources of evidence used to generate the plurality of candidate answers. The processor identifies missing information from a corpus of data. The missing information comprises any information that improves a confidence score for a candidate answer. The processor generates at least one follow-on inquiry based on the missing information. A network interface outputs the at least one follow-on inquiry to external sources separate from the question-answer system. | 01-17-2013 |
20130018652 | EVIDENCE DIFFUSION AMONG CANDIDATE ANSWERS DURING QUESTION ANSWERING - Diffusing evidence among candidate answers during question answering may identify a relationship between a first candidate answer and a second candidate answer, wherein the candidate answers are generated by a question-answering computer process, the candidate answers have associated supporting evidence, and the candidate answers have associated confidence scores. All or some of the evidence may be transferred from the first candidate answer to the second candidate answer based on the identified relationship. A new confidence score may be computed for the second candidate answer based on the transferred evidence. | 01-17-2013 |
20130018876 | PROVIDING ANSWERS TO QUESTIONS USING HYPOTHESIS PRUNING - A method, system and computer program product for generating answers to questions. In one embodiment, the method comprises receiving a query, conducting a search through one or more data sources to identify candidate answers to the query, and providing each of the candidate answers with a preliminary score. The method further comprises filtering out any of the candidate answers with a preliminary score that does not satisfy a defined condition. The candidate answers having preliminary scores that satisfy this condition form a subset of the candidate answers. Each of the candidate answers in this subset is processed to produce further scores. A ranking function is applied to these further scores to determine a ranking for each of the candidate answers in the subset; and after this ranking function is applied, one or more of the candidate answers are selected as one or more final answers to the query. | 01-17-2013 |
20130019285 | VALIDATING THAT A USER IS HUMAN - A method of validating that a user is human. A user question is generated using a computerized device. The user question is output to a user. A user response to the user question is received from the user. The user response is validated as having been provided by a human. | 01-17-2013 |
20130019286 | VALIDATING THAT A USER IS HUMAN - A method of validating that a user is human. A user question is generated using a computerized device. The user question is output to a user. A user response to the user question is received from the user. The user response is validated as having been provided by a human. | 01-17-2013 |
20130035930 | PREDICTING LEXICAL ANSWER TYPES IN OPEN DOMAIN QUESTION AND ANSWERING (QA) SYSTEMS - In an automated Question Answer (QA) system architecture for automatic open-domain Question Answering, a system, method and computer program product for predicting the Lexical Answer Type (LAT) of a question. The approach is completely unsupervised and is based on a large-scale lexical knowledge base automatically extracted from a Web corpus. This approach for predicting the LAT can be implemented as a specific subtask of a QA process, and/or used for general purpose knowledge acquisition tasks such as frame induction from text. | 02-07-2013 |
20130035931 | PREDICTING LEXICAL ANSWER TYPES IN OPEN DOMAIN QUESTION AND ANSWERING (QA) SYSTEMS - In an automated Question Answer (QA) system architecture for automatic open-domain Question Answering, a system, method and computer program product for predicting the Lexical Answer Type (LAT) of a question. The approach is completely unsupervised and is based on a large-scale lexical knowledge base automatically extracted from a Web corpus. This approach for predicting the LAT can be implemented as a specific subtask of a QA process, and/or used for general purpose knowledge acquisition tasks such as frame induction from text. | 02-07-2013 |
20130282363 | LEXICAL ANSWER TYPE CONFIDENCE ESTIMATION AND APPLICATION - A system, method and computer program product for automatically estimating the confidence of a detected LAT to provide a more accurate overall score for an obtained candidate answer. A confidence “score” or value of each detected LAT is obtained, and the system and method combine the confidence score with a degree of match between the LAT and an AnswerType of the candidate answer to provide an improved overall score for the candidate answer. | 10-24-2013 |
20140072947 | GENERATING SECONDARY QUESTIONS IN AN INTROSPECTIVE QUESTION ANSWERING SYSTEM - A method of generating secondary questions in a question-answer system. Missing information is identified from a corpus of data using a computerized device. The missing information comprises any information that improves confidence scores for candidate answers to a question. The computerized device automatically generates a plurality of hypotheses concerning the missing information. The computerized device automatically generates at least one secondary question based on each of the plurality of hypotheses. The hypotheses are ranked based on relative utility to determine an order in which the computerized device outputs the at least one secondary question to external sources to obtain responses. | 03-13-2014 |
20140072948 | GENERATING SECONDARY QUESTIONS IN AN INTROSPECTIVE QUESTION ANSWERING SYSTEM - A method of generating secondary questions in a question-answer system. Missing information is identified from a corpus of data using a computerized device. The missing information comprises any information that improves confidence scores for candidate answers to a question. The computerized device automatically generates a plurality of hypotheses concerning the missing information. The computerized device automatically generates at least one secondary question based on each of the plurality of hypotheses. The hypotheses are ranked based on relative utility to determine an order in which the computerized device outputs the at least one secondary question to external sources to obtain responses. | 03-13-2014 |
20140108321 | TEXT-BASED INFERENCE CHAINING - A method, system and computer program product for generating inference graphs over content to answer input inquiries. First, independent factors are produced from the inquiry, and these factors are converted to questions. The questions are then input to a probabilistic question answering system (PQA) that discovers relations which are used to iteratively expand an inference graph starting from the factors and ending with possible answers. A probabilistic reasoning system is used to infer the confidence in each answer by, for example, propagating confidences across relations and nodes in the inference graph as it is expanded. The inference graph generator system can simultaneously generate forward and backward inference graphs, using a depth controller component to limit the generation of both paths if they do not meet. Otherwise, a joiner process forces the discovery of relations that may join the answers to factors in the inquiry. | 04-17-2014 |
20140108322 | TEXT-BASED INFERENCE CHAINING - A method, system and computer program product for generating inference graphs over content to answer input inquiries. First, independent factors are produced from the inquiry, and these factors are converted to questions. The questions are then input to a probabilistic question answering system (PQA) that discovers relations which are used to iteratively expand an inference graph starting from the factors and ending with possible answers. A probabilistic reasoning system is used to infer the confidence in each answer by, for example, propagating confidences across relations and nodes in the inference graph as it is expanded. The inference graph generator system can simultaneously generate forward and backward inference graphs, using a depth controller component to limit the generation of both paths if they do not meet. Otherwise, a joiner process forces the discovery of relations that may join the answers to factors in the inquiry. | 04-17-2014 |
20140164303 | METHOD OF ANSWERING QUESTIONS AND SCORING ANSWERS USING STRUCTURED KNOWLEDGE MINED FROM A CORPUS OF DATA - In a method of answering questions and scoring answers, a title and at least one topical field are identified for a document. A field name and field content associated with the topical field are identified, and a title-oriented document is created by combining the title, the field name, and the field content associated with the topical field. For each title-oriented document, a term in the title is matched to previously established categories to produce a title concept identifier. The topical field is synthesized to produce a field concept identifier and a field content concept identifier. A question is received. The question topic term and the question content identifier are used to identify at least one question-matching relation instance. The title concept identifier of each question-matching relation instance is identified as a candidate answer to the question. Each candidate answer and a corresponding answer score are output. | 06-12-2014 |
20140164304 | METHOD OF ANSWERING QUESTIONS AND SCORING ANSWERS USING STRUCTURED KNOWLEDGE MINED FROM A CORPUS OF DATA - In a method of answering questions and scoring answers, a title and at least one topical field are identified for a document. A field name and field content associated with the topical field are identified, and a title-oriented document is created by combining the title, the field name, and the field content associated with the topical field. For each title-oriented document, a term in the title is matched to previously established categories to produce a title concept identifier. The topical field is synthesized to produce a field concept identifier and a field content concept identifier. A question is received. The question topic term and the question content identifier are used to identify at least one question-matching relation instance. The title concept identifier of each question-matching relation instance is identified as a candidate answer to the question. Each candidate answer and a corresponding answer score are output. | 06-12-2014 |
20140272904 | COMBINING DIFFERENT TYPE COERCION COMPONENTS FOR DEFERRED TYPE EVALUATION - In a method of answering questions, a question is received, a question LAT is determined, and a candidate answer to the question is identified. Preliminary types for the candidate answer are determined using first components to produce the preliminary types. Each of the first components produces a preliminary type using different methods. A first type-score representing a degree of match between the preliminary type and the question LAT is produced. Each preliminary type and each first type-score is evaluated using second components. Each of the second components produces a second score based on a combination of the first type-score and a measure of degree that the preliminary type matches the question LAT. The second components use different methods to produce the second score. A final score representing a degree of confidence that the candidate answer matches the question LAT is calculated based on the second score. | 09-18-2014 |
20140337329 | PROVIDING ANSWERS TO QUESTIONS USING MULTIPLE MODELS TO SCORE CANDIDATE ANSWERS - A method, system and computer program product for generating answers to questions. In one embodiment, the method comprises receiving an input query; conducting a search to identify candidate answers to the input query, and producing a plurality of scores for each of the candidate answers. For each of the candidate answers, one of a plurality of candidate ranking functions is selected. This selected ranking function is applied to each of the candidate answers to determine a ranking for the candidate answer based on the scores for that candidate answer. One or more of the candidate answers is selected, based on the rankings for the candidate answers, as one or more answers to the input query. In an embodiment, the ranking function selection is performed using information about the question. In an embodiment, the ranking function selection is performed using information about each answer. | 11-13-2014 |
20150026169 | PROVIDING ANSWERS TO QUESTIONS USING LOGICAL SYNTHESIS OF CANDIDATE ANSWERS - A method, system and computer program product for generating answers to questions. In one embodiment, the method comprises receiving an input query, decomposing the input query into a plurality of different subqueries, and conducting a search in one or more data sources to identify at least one candidate answer to each of the subqueries. A ranking function is applied to each of the candidate answers to determine a ranking for each of these candidate answers; and for each of the subqueries, one of the candidate answers to the subquery is selected based on this ranking. A logical synthesis component is applied to synthesize a candidate answer for the input query from the selected candidate answers to the subqueries. In one embodiment, the procedure applied by the logical synthesis component to synthesize the candidate answer for the input query is determined from the input query. | 01-22-2015 |