What is it about?
This paper revisits the conception of intelligence and understanding embodied in the Turing Test. It argues that a simple system of meaning relations, drawn from the words or lexical items of a natural language and framed in terms of syntax-free relations in linguistic texts, can ground linguistic inferences in a way that counts as 'understanding' in a mechanized system. Understanding, on this view, is a matter of running through the inferences that meaning relations license: some of these inferences are straightforward deductions, while others function as abductions. Understanding in terms of meaning relations also supervenes on linguistic syntax without being simply reducible to syntactic relations. The approach thus offers one way, though not the only way, of reframing Alan Turing's original insight into the nature of thinking in computing systems.
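To make the idea of inference over meaning relations concrete, here is a minimal Python sketch under illustrative assumptions: the relation names (HYPONYM, TYPICAL) and the toy data are hypothetical and are not the paper's formal system. Hyponymy links between lexical items license plain deductions, while a "typical ability" relation supports a defeasible, abductive guess.

```python
# A minimal, hypothetical sketch of inference over lexical meaning relations.
# The relation names and data are illustrative assumptions, not the paper's system.

HYPONYM = {            # "X is a kind of Y"
    "sparrow": "bird",
    "penguin": "bird",
    "bird": "animal",
}
TYPICAL = {            # "X can typically Y"
    "bird": "fly",
}

def is_a(word, category):
    """Deduction: chain hyponymy links, e.g. sparrow -> bird -> animal."""
    while word in HYPONYM:
        word = HYPONYM[word]
        if word == category:
            return True
    return False

def plausible_categories(ability):
    """Abduction: given an observed ability, guess which category best explains it."""
    return [cat for cat, abl in TYPICAL.items() if abl == ability]

print(is_a("sparrow", "animal"))       # True  (a plain deduction)
print(plausible_categories("fly"))     # ['bird']  (a defeasible, abductive guess)
```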
Why is it important?
This work highlights the importance of natural language for AI by drawing attention to Alan Turing's emphasis on natural language understanding in his formulation of the Turing Test, and it formulates a fresh system of meaning relations that can be extracted from natural language text.
Read the Original
This page is a summary of: Meaning Relations, Syntax, and Understanding, Axiomathes, January 2021, Springer Science + Business Media. DOI: 10.1007/s10516-021-09534-x.