Traditional logical approaches to semantics and newer distributional
or vector space approaches have complementary strengths and weaknesses.
We have developed methods that integrate logical and distributional
models by using a CCG-based parser to produce a detailed logical form
for each sentence, and combining the result with soft inference rules
derived from distributional semantics that connect the meanings of the
sentences' component words and phrases. For recognizing textual entailment (RTE)
we use Markov Logic Networks (MLNs) to combine these representations,
and for Semantic Textual Similarity (STS) we use Probabilistic Soft
Logic (PSL). We present experimental results on standard benchmark
datasets for these problems and emphasize the advantages of combining
the logical structure of sentences with statistical knowledge mined from
large corpora.
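The core idea of a "soft inference rule derived from distributional semantics" can be sketched in a few lines: the weight of a first-order rule linking two predicates is set from the distributional similarity of the corresponding words. The toy vectors and the `soft_rule` helper below are hypothetical illustrations, not the actual system (which uses CCG parses plus MLN/PSL inference); this is a minimal sketch of how a rule weight might be derived.

```python
import math

# Toy distributional vectors (hypothetical values); the real system
# derives such vectors from co-occurrence statistics in large corpora.
vectors = {
    "man":    [0.9, 0.1, 0.3],
    "guy":    [0.8, 0.2, 0.3],
    "banana": [0.1, 0.9, 0.0],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def soft_rule(lhs, rhs):
    """Produce a weighted inference rule such as man(x) -> guy(x),
    with weight taken from the distributional similarity of the words."""
    w = cosine(vectors[lhs], vectors[rhs])
    return (f"{lhs}(x) -> {rhs}(x)", w)

rule, weight = soft_rule("man", "guy")
print(rule, round(weight, 2))
```

Because "man" and "guy" have similar vectors, the rule `man(x) -> guy(x)` receives a high weight, while a rule linking "man" and "banana" would receive a low one; a probabilistic logic such as an MLN or PSL can then trade these soft rules off against the hard logical structure of the parsed sentences.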
Bio:
Raymond J. Mooney is a Professor in the Department of Computer Science at the University of Texas at Austin. He received his Ph.D. in 1988 from the University of Illinois at Urbana-Champaign. He is the author of over
150 published research papers, primarily in the areas of machine
learning and natural language processing. He was the President of
the International Machine Learning Society from 2008 to 2011, program
co-chair for AAAI 2006, general chair for HLT-EMNLP 2005, and
co-chair for ICML 1990. He is a Fellow of the American Association for
Artificial Intelligence and the Association for Computing Machinery, and the recipient of best paper awards from AAAI-96, KDD-04, ICML-05, and ACL-07.