About this Project

This project is concerned with the computational problem of integrating heterogeneous data, described in accordance with disparate conceptual ontologies, in order to answer questions by means of comprehensive automated querying and analysis. This problem now affects all disciplines, including the humanities, and has become especially pressing in the era of “big data” on the Internet, which tantalizes us with the prospect of detecting previously unseen patterns and relationships that may stimulate new insights and enlarge our intellectual horizons.

With the aid of visiting speakers and computational work in ontology design, we will explore the theory and practice of ontology-based data integration using “top-level” formal ontologies. Our aim is to understand how theory informs practice in this area and how the stubborn inability of computer systems to achieve our desired goals—to achieve true “artificial intelligence”—may reveal philosophical weaknesses in the way human knowledge has been understood and represented. Achieving this aim will require interdisciplinary discussion between theoreticians and practitioners in the field of ontology and a willingness to range across the humanities and the sciences—in our case, with a focus on biomedical science, the area of natural science most intensively engaged with ontology-based data integration. We expect that our explorations of this topic will interact fruitfully with a rich literature in epistemology and the philosophy of mind, and will be of general interest to scholars engaged in the digital humanities.


Algorithmic Intelligence Has Gotten So Smart, It’s Easy to Forget It’s Artificial

July 1, 2019

This segment from NPR’s Fresh Air, on the increasing sophistication of AI technology, refers to research by Neubauer Collegium Visiting Fellow Brian Cantwell Smith and his forthcoming book, The Promise of Artificial Intelligence.
