Prospects For Applied Semantics: A Textual Entailment Perspective
Speaker:
Ido Dagan (Bar-Ilan University, Israel)
Abstract:
What should an applied model of natural language semantics look like? While the desired functionality and output of models for other levels of language, such as morphology and syntax, are largely agreed upon, this is not the case for semantics. Classical formal approaches suggest that semantic models should produce logical representations of text, yet the formal logic-based approach has remained a rather small niche in state-of-the-art NLP. Common practice, on the other hand, is rather chaotic, with a plethora of scattered semantic processing tasks whose relationships are largely unclear.

This talk will argue that the textual entailment paradigm may provide a suitable encompassing framework for much of the applied semantics space. Under this approach, the underlying semantic modeling task is to map between natural language units whose meanings entail one another, rather than to map language units onto an extrinsic logical language. This approach is motivated by the observation that many semantic inference needs across NLP can be reduced to the entailment framework (for instance, a question answering system must verify that a candidate answer passage indeed entails the sought answer), which, in turn, may encompass many of the common-practice techniques in applied semantics.

The first part of the talk will motivate and present the entailment framework. We will then review BIUTEE, the Bar-Ilan University Textual Entailment Engine, illustrating practical entailment modeling and interesting research directions, including a proof system over parse trees and global learning of entailment graphs. Finally, I will outline a vision along two lines: (a) creating generic entailment engines that may power semantic processing across NLP tasks and applications; and (b) proposing ideas, inspired by the textual entailment paradigm, for potential leaps in long-awaited application areas, including text exploration, intelligent tutoring and natural language interfaces.
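To make the task formulation concrete, the following minimal Python sketch illustrates the text/hypothesis pair setting underlying textual entailment. The example pair and the naive word-overlap baseline are illustrative assumptions of this write-up only, not material from the talk; BIUTEE itself applies knowledge-based inference over parse trees rather than word overlap.

    # Illustrative sketch of the textual entailment task: given a text T and a
    # hypothesis H, decide whether T's meaning entails H. The overlap baseline
    # below is a deliberately naive stand-in for a real entailment engine.
    import string

    def content_words(sentence: str) -> set[str]:
        """Lowercase the sentence and strip punctuation from each token."""
        return {token.strip(string.punctuation) for token in sentence.lower().split()}

    def lexical_overlap_entails(text: str, hypothesis: str, threshold: float = 0.7) -> bool:
        """Judge H entailed if enough of its words also appear in T."""
        hyp = content_words(hypothesis)
        if not hyp:
            return True
        overlap = len(hyp & content_words(text)) / len(hyp)
        return overlap >= threshold

    # A (made-up) entailment pair in the style of the RTE benchmarks:
    text = "Norway's Yara agreed to buy Balderton Fertilisers for 258 million euros."
    hypothesis = "Yara bought Balderton Fertilisers."
    print(lexical_overlap_entails(text, hypothesis))  # True here, but only by accident:
    # the baseline never relates "buy" to "bought"; real engines need such entailment knowledge.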