Commonsense Knowledge, Ontology and Ordinary Language

Walid S. Saba

Over two decades ago a “quiet revolution”, as Charniak once called it, overwhelmingly replaced knowledge-based approaches in natural language processing (NLP) with quantitative (e.g., statistical, corpus-based, machine-learning) methods. Although it is our firm belief that purely quantitative approaches cannot be the only paradigm for NLP, dissatisfaction with purely engineering approaches to the construction of large knowledge bases for NLP is somewhat justified. In this paper we hope to demonstrate that both trends are partly misguided, and that the time has come to enrich logical semantics with an ontological structure that reflects our commonsense view of the world and the way we talk about it in ordinary language. In particular, it will be demonstrated that a number of challenges in the semantics of natural language (e.g., metonymy, intensionality, copredication, nominal compounds) can be properly and uniformly addressed if semantics is grounded in such an ontology.

Keywords: Ontology, compositional semantics, commonsense knowledge, reasoning.
