Natural Language Has a Logical Structure

Natural language is a complex phenomenon that can be studied from many different disciplinary perspectives: physical, neurological, psychological, sociological, historical, etc.

But what is truly distinctive and valuable about human natural language is its semantic or representational capacities — the features of language responsible for how words carry meaning, and how words can be combined into sentences to make an indefinite number of distinct, meaningful assertions about the world.

Linguists commonly distinguish three different perspectives from which one can study the representational capacity of language: syntax, semantics, and pragmatics.

Syntax involves the rules for combining words and parts of speech into meaningful sentences. “Alice saw Mary in the park” is a meaningful sentence in English. “Saw Alice park Mary in the” is not a meaningful sentence. The syntax of a language specifies the rules that explain why the former is a well-formed sentence but the latter is not. Different languages will have different rules, but many languages will share rules, and some rules may be shared by all languages.

Semantics involves the study of meaning and reference in language — how a word like “dog” comes to represent (refer to, or denote) the four-legged canines we know and love, and how a sentence like “that dog has fleas” comes to represent a state of affairs in the world which can be evaluated as true or false. Semantics, in this sense, is about how expressions in a language can refer to, or be about, objects or states of affairs in the world.

Pragmatics involves the social and contextual dimensions of human communication that determine how utterances are interpreted and acquire meaning in real-world situations. The very same sentence (“I promise to tell the truth”) can mean something very different when uttered in casual conversation among friends than when uttered in the course of swearing a legal oath (“Put your hand on the Bible and repeat after me …”). Understanding the pragmatic dimensions of communication is an important part of being a competent user of a natural language.

I always want to caution students about these labels, because distinguishing “semantics” — the study of meaning — from “syntax” and “pragmatics” makes it seem as though syntax and pragmatics aren’t involved in determining the meanings of words and sentences. But that’s obviously false. All three dimensions of language are involved in determining the meaning of natural language utterances.

A better way to think of it is that “syntax”, “semantics” and “pragmatics” are labels for different theoretical perspectives from which one can study different aspects of the complex bio-psycho-social phenomenon that is natural language.

So far so good. But where does logic enter the picture?

Well, here are two claims that have been made about natural language:

1. The semantics of natural language is generative.

Competent users of a natural language are able to understand the meaning of an indefinite number of different sentences, and can generate an indefinite number of different sentences, even though the vocabulary of a language is finite. By applying a finite number of syntactical rules to a finite vocabulary, we are able to generate and understand a virtual infinity of meaningful sentences. This remarkable ability is what we mean when we say that human natural language is “generative”.

A note about this terminology. In linguistics, “generative” is also more narrowly associated with the “generative” or “transformational” theories of grammar introduced by Noam Chomsky. I don’t want to imply any specific association with the details of those theories. To avoid confusion, philosophers of language often talk about the “productivity” of language as a synonym for what I’m calling “generativity”.
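To make the combinatorics concrete, here is a minimal sketch in Python (the toy vocabulary and the two rules are invented purely for illustration, and nothing here depends on any particular grammatical theory) showing how a finite vocabulary plus a handful of rules, one of them recursive, yields more and more sentences each time the recursive rule is reapplied.

```python
import itertools

# A toy grammar: the vocabulary and the two rules below are invented
# purely for illustration.
NAMES = ["Alice", "Bob", "Mary"]
VERBS = ["saw", "helped"]

def sentences(depth):
    """Yield every sentence the toy grammar generates up to the given
    embedding depth."""
    if depth == 0:
        return
    # Base rule: NAME VERB NAME
    for subj, verb, obj in itertools.product(NAMES, VERBS, NAMES):
        yield f"{subj} {verb} {obj}"
    # Recursive rule: NAME "said that" SENTENCE
    for subj in NAMES:
        for s in sentences(depth - 1):
            yield f"{subj} said that {s}"

print(len(list(sentences(1))))  # 18
print(len(list(sentences(2))))  # 18 + 3*18 = 72
print(len(list(sentences(3))))  # 18 + 3*72 = 234
```

The counts themselves don’t matter; what matters is that nothing stops the embedding rule from being applied again, so the set of sentences the finite vocabulary and rules can generate has no upper bound.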

2. The semantics of natural language is compositional.

In natural language, meaningful expressions are built up from other meaningful expressions.

We can analyze the meaning of a sentence like “Bob is a teacher and a violinist” by noting that its meaning is built out of the component sentences “Bob is a teacher” and “Bob is a violinist”.

Similarly, we can see that the meaning of a complex sentence like “If John buys the tickets then either Ben or Mary will pay for snacks” is determined in part by the meanings of the component sentences “John buys the tickets”, “Ben will pay for snacks” and “Mary will pay for snacks”.

In the philosophy of language, the “principle of compositionality” says that the meaning of a complex expression is determined by its structure and the meanings of its constituents.

In this example, the structure of the complex expression is determined by the meanings of terms like “if … then” and “or”.

To make this explicit, let

J = “John buys the tickets”

B = “Ben will pay for snacks”

M = “Mary will pay for snacks”

Then we can write

If John buys the tickets then either Ben or Mary will pay for snacks

as

If J then (B or M)

where the capital letters stand for the constituents of the expression, and the structural features are determined by the meanings of the terms “if”, “then”, “or”, and the placement of the brackets.

Now, this looks suspiciously like a translation exercise in propositional logic that you would learn how to do in a symbolic logic class. That’s no accident. Propositional logic was designed to model this kind of compositional semantics.
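To give a rough sense of what “modeling this kind of compositional semantics” amounts to, here is a minimal sketch of a propositional evaluator in Python (the nested-tuple representation of formulas is just one convenient choice for the sketch, not anything canonical): the truth value of the whole formula is computed from the truth values of its atomic parts J, B and M together with the meanings of the connectives.

```python
# A tiny propositional-logic evaluator. Formulas are nested tuples:
#   ("atom", "J")   -- an atomic sentence
#   ("or", p, q)    -- disjunction
#   ("if", p, q)    -- the material conditional

def evaluate(formula, valuation):
    """Return the truth value of a formula, computed compositionally:
    the value of a complex formula depends only on the values of its
    parts and the connective that combines them."""
    kind = formula[0]
    if kind == "atom":
        return valuation[formula[1]]
    if kind == "or":
        return evaluate(formula[1], valuation) or evaluate(formula[2], valuation)
    if kind == "if":
        return (not evaluate(formula[1], valuation)) or evaluate(formula[2], valuation)
    raise ValueError(f"unknown connective: {kind}")

# "If J then (B or M)"
sentence = ("if", ("atom", "J"), ("or", ("atom", "B"), ("atom", "M")))

print(evaluate(sentence, {"J": True, "B": False, "M": True}))   # True
print(evaluate(sentence, {"J": True, "B": False, "M": False}))  # False
```

Change the valuation of J, B or M and the value of the whole is recomputed from the parts; that dependence of the whole on its parts and its structure is exactly the compositionality the formal language is designed to exhibit.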

This suggests one obvious relationship between logic and language.

Logical systems can represent or model important structural features of natural language, such as the generativity and compositionality of language.

Of course we can always ask whether natural language really is generative and compositional in the ways suggested here. Many have thought so, but this is an empirical question that can only be settled by empirical investigation. (Natural language may not turn out to be strictly compositional, but there are compelling arguments that it is by and large compositional.)

On the other hand, many artificial languages, like those studied in an introductory symbolic logic class, are designed to meet the requirement of compositionality.

Thus, one way that logic is relevant to linguistics is this: by constructing artificial languages, studying their properties, and comparing them with the properties of natural language that we discover through empirical investigation, we can gain some insight into the logical properties of natural language.

Okay, let’s move on to the second point I want to make about logic and language.