Formal semantics (natural language)
Formal semantics is the scientific study of linguistic meaning through formal tools from logic and mathematics. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. Formal semanticists rely on diverse methods to analyze natural language. Many examine the meaning of a sentence by studying the circumstances in which it would be true. They describe these circumstances using abstract mathematical models to represent entities and their features. The principle of compositionality helps them link the meaning of expressions to abstract objects in these models. This principle asserts that the meaning of a compound expression is determined by the meanings of its parts.
Propositional and predicate logic are formal systems used to analyze the semantic structure of sentences. They introduce concepts like singular terms, predicates, quantifiers, and logical connectives to represent the logical form of natural language expressions. Type theory is another approach utilized to describe sentences as nested functions with precisely defined input and output types. Various theoretical frameworks build on these systems. Possible world semantics and situation semantics evaluate truth across different hypothetical scenarios. Dynamic semantics analyzes the meaning of a sentence as the information contribution it makes.
Using these and similar theoretical tools, formal semanticists investigate a wide range of linguistic phenomena. They study quantificational expressions, which indicate the quantity of something, like the sentence "all ravens are black". An influential proposal analyzes them as relations between two sets—the set of ravens and the set of black things in this example. Quantifiers are also used to examine the meaning of definite and indefinite descriptions, which denote specific entities, like the expression "the president of Kenya". Formal semanticists are also interested in tense and aspect, which provide temporal information about events and circumstances. In addition to studying statements about what is true, formal semantics also investigates other sentence types such as questions and imperatives. Other investigated linguistic phenomena include intensionality, modality, negation, plural expressions, and the influence of contextual factors.
Formal semantics is relevant to various fields. In logic and computer science, formal semantics refers to the analysis of meaning in artificially constructed logical and programming languages. In cognitive science, some researchers rely on the insights of formal semantics to study the nature of the mind. Formal semantics has its roots in the development of modern logic starting in the late 19th century. Richard Montague's work in the late 1960s and early 1970s was pivotal in applying these logical principles to natural language, inspiring many scholars to refine his insights and apply them to diverse linguistic phenomena.
Definition
Formal semantics is a branch of linguistics and philosophy that studies linguistic meaning using formal methods. To analyze language in a precise and systematic manner, it incorporates ideas from logic, mathematics, and philosophy of language, like the concepts of truth conditions, model theory, and compositionality.[1] Because of the prominence of these concepts, formal semantics is also referred to as truth-conditional semantics and model-theoretic semantics.[2] Formal semanticists typically adopt an externalist view that interprets meaning as the entities to which expressions refer. This focus on the connection between language and the external world sets formal semantics apart from semantic theories that concentrate on the cognitive processes and mental representations involved in understanding language.[3]
The primary focus of formal semantics is the analysis of natural languages such as English, Spanish, and Japanese. This enterprise faces challenges due to the complexity and context-dependence of natural language. As a result, theorists sometimes limit their studies to specific fragments or subsets of these languages to avoid the complexities.[4] They investigate diverse linguistic phenomena, including reference, quantifiers, plurality, tense, aspect, vagueness, modality, scope, binding, conditionals, questions, and imperatives.[5]
Understood in a wide sense, formal semantics also includes the study of artificial or constructed languages. This covers the formal languages used in the logical analysis of arguments, such as the language of first-order logic, and programming languages in computer science, such as C++, JavaScript, and Python.[6] Formal semantics is related to formal pragmatics since both are subfields of formal linguistics. One key difference is that formal pragmatics centers on how language is used in communication rather than the problem of meaning in general.[7]
Methodology
Formal semanticists rely on diverse methods, conceptual tools, and background assumptions, which distinguish the field from other branches of semantics. Most of these principles originate in logic, mathematics, and the philosophy of language.[8] One key principle is that an adequate theory of meaning needs to accurately predict sentences' truth conditions. A sentence's truth conditions are the circumstances under which it would be true. For example, the sentence "Tina is tall and happy" is true if Tina has the property of being tall and also the property of being happy; these are its truth conditions. This principle reflects the idea that understanding a sentence requires knowing how it relates to reality and under which circumstances it would be appropriate to use it.[9]
To test the adequacy of their theories, formal semanticists typically rely on the linguistic intuitions of competent speakers as a form of empirical validation.[10] A key source of data comes from judgments concerning entailment, a relation between sentences—called premises and conclusions—in which truth is preserved. For instance, the sentence "Tina is tall and happy" entails the sentence "Tina is tall" because the truth of the first sentence guarantees the truth of the second. One aspect of understanding the meaning of a sentence is comprehending what it does and does not entail.[11][a] The study of entailment relations helps formal semanticists build new theories and verify whether existing theories accurately reflect the intuitions of native speakers.[13]

Formal semanticists typically analyze language by proposing a system of model theoretic interpretation. In such a system, a sentence is evaluated relative to a mathematical structure called a model that maps each expression to an abstract object.[b] Models rely on set theory and introduce abstract objects for all entities in the situation under consideration. For example, a model of a situation where Tina is tall and happy may include an abstract object representing Tina and two sets of objects—one for all tall entities and another for all happy entities. Different models introduce different objects and group them into different sets while interpretation functions link natural language expressions to these objects and sets. Using this approach, semanticists can precisely calculate the truth values of sentences. For example, the sentence "Tina is tall" is true in models where the abstract object corresponding to the name "Tina"[c] is in the set corresponding to the predicate "tall". In this way, model theoretic interpretation makes it possible to formally capture truth conditions and entailments.[17]
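As an illustration, the following is a minimal Python sketch of such a model-theoretic setup; the domain, names, and predicate extensions are invented for the example rather than taken from any particular formalism.

```python
# A toy model: a domain of entities plus an interpretation function that maps
# names to entities and one-place predicates to sets of entities.
domain = {"tina", "bob"}
interpretation = {
    "Tina": "tina",
    "Bob": "bob",
    "tall": {"tina"},
    "happy": {"tina", "bob"},
}

def is_true(name, predicate, model=interpretation):
    """A simple subject-predicate sentence like "Tina is tall" is true iff the
    entity denoted by the name is in the set denoted by the predicate."""
    return model[name] in model[predicate]

print(is_true("Tina", "tall"))   # True in this model
print(is_true("Bob", "tall"))    # False in this model
```

In a different model, where the set for "tall" contains both entities, the second sentence would come out true, which is how truth values are relativized to models.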

The principle of compositionality is another key methodological assumption for analyzing the meaning of natural language sentences and linking them to abstract models. It states that the meaning of a compound expression is determined by the meanings of its parts and the way they are combined. According to this principle, if a person knows the meanings of the name "Tina", the verb "is", and the adjective "happy", they can understand the sentence "Tina is happy" even if they have never heard this specific combination of words before. The principle of compositionality explains how language users can comprehend an infinite number of sentences based on their knowledge of a finite number of words and rules.[19]
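A toy sketch of this idea, reusing the style of invented denotations above: the meaning of "Tina is happy" is computed from the meaning of the name and the meaning of the predicate by a single rule of function application, and the same parts recombine freely in other sentences.

```python
# Word meanings: the name denotes an entity, the adjective a function from
# entities to truth values; "is" is treated as semantically vacuous here.
tina, bob = "tina", "bob"
happy = lambda x: x in {"tina", "bob"}
tall = lambda x: x in {"tina"}

# One composition rule: apply the function meaning to the argument meaning.
def compose(function_meaning, argument_meaning):
    return function_meaning(argument_meaning)

print(compose(happy, tina))  # True:  "Tina is happy"
print(compose(tall, bob))    # False: "Bob is tall"
```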
Within formal semantics, there are diverse ways to construct models and relate linguistic expressions to them.[20] One common approach posits a level of grammatical structure called logical form. In this view, sentences do not have a unique syntactic form, but rather a phonological form, which gets pronounced, and a separate logical form, which undergoes semantic interpretation. In these approaches, apparent mismatches between form and meaning can be explained using syntactic mechanisms such as movement. This is in contrast to approaches such as Direct Compositionality, which assume that the surface grammatical form is the only grammatical form and that mismatches should be explained using other semantic mechanisms such as type shifting.[21][22][23][24][25]
Semantic formalisms differ in whether they adopt the method of direct interpretation or indirect interpretation. In direct interpretation, the semantic interpretation rules map natural language expressions directly to their denotations, whereas indirect interpretation maps them to expressions of some formal language. For example, in direct interpretation the name "Barack Obama" would be mapped to the man himself, while a system of indirect interpretation would map it to a logical constant "o" which would then be mapped to the man himself. In this way, direct interpretation can be seen as providing a model theory for natural languages themselves, while indirect interpretation provides a system of logic translation and then relies on the model theory of the formal language. These two methods are in principle equivalent, but are often chosen based on practical or expository considerations.[26][27]
Formal systems and theories
Propositional and predicate logic
Formal semanticists often rely on propositional and predicate logic to analyze the semantic structure of sentences. Propositional logic can be used to examine compound sentences made up of several independent clauses. It employs letters like P and Q to represent simple statements. Compound statements are created by combining simple statements with logical connectives, such as ∧ (and), ∨ (or), and → (if...then), which express the relationships between the statements. For example, the sentence "Alice is happy and Bob is rich" can be translated into propositional logic with the formula P ∧ Q, where P stands for "Alice is happy" and Q stands for "Bob is rich". Of key interest to semantic analysis is that the truth value of these compound statements is directly determined by the truth values of the simple statements. For instance, the formula P ∧ Q is only true if both P and Q are true; otherwise, it is false.[29]
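Because the connectives are truth-functional, their behavior can be sketched directly with Boolean operations; the following Python fragment is purely illustrative.

```python
# Truth-functional connectives: the truth value of a compound statement is
# fully determined by the truth values of its parts.
def conj(p, q): return p and q        # "and"
def disj(p, q): return p or q         # "or"
def impl(p, q): return (not p) or q   # "if...then" (material conditional)

P = True   # "Alice is happy"
Q = False  # "Bob is rich"

print(conj(P, Q))  # False: "Alice is happy and Bob is rich"
print(disj(P, Q))  # True:  "Alice is happy or Bob is rich"
print(impl(Q, P))  # True:  "If Bob is rich then Alice is happy"
```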
Predicate logic extends propositional logic by articulating the internal structure of non-compound sentences through concepts like singular term, predicate, and quantifier. Singular terms refer to specific entities, whereas predicates describe characteristics of and relations between entities. For instance, the sentence "Alice is happy" can be represented with the formula H(a), where a is a singular term and H is a predicate.[d] Quantifiers express that a certain condition applies to some or all entities. For example, the sentence "Someone is happy" can be represented with the formula ∃x H(x), where the existential quantifier ∃ indicates that happiness applies to at least one person. Similarly, the idea that everyone is happy can be expressed through the formula ∀x H(x), where ∀ is the universal quantifier.[31]
There are different ways in which natural language sentences can be translated into predicate logic. A common approach interprets verbs as predicates. Intransitive verbs, like "sleeps" and "dances", have a subject but no objects and are interpreted as one-place predicates. Transitive verbs, like "loves" and "gives", have one or more objects and are interpreted as predicates with two or more places. For example, the sentence "Bob loves Alice" can be formalized as L(b, a), using the two-place predicate L. Typically, not every word in natural language sentences has a direct counterpart symbol in the logic translation, and in some cases, the pattern of the logical formula differs significantly from the surface structure of the natural language sentence. For example, sentences like "all cats are animals" are usually translated as ∀x (C(x) → A(x)) (for all entities, if the entity is a cat then the entity is an animal) even though the expression "if...then" (→) is not present in the original sentence.[32]
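Under invented extensions for the predicates, quantified formulas of this kind can be evaluated over a small domain, as in the following sketch.

```python
# A small domain with one-place predicates interpreted as sets.
domain = {"alice", "bob", "felix"}
happy  = {"alice"}
cat    = {"felix"}
animal = {"felix"}

# "Someone is happy": there is an x in the domain with H(x).
print(any(x in happy for x in domain))                        # True

# "Everyone is happy": every x in the domain satisfies H(x).
print(all(x in happy for x in domain))                        # False

# "All cats are animals": for every x, if C(x) then A(x).
print(all((x not in cat) or (x in animal) for x in domain))   # True
```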
Logic translations face challenges as a result of attempting to associate vague and ambiguous ordinary language expressions with precise logical formulas. This process frequently requires case-by-case interpretation without a generally accepted algorithm to cover all cases.[33] Many early approaches to formal semantics, such as the works of Gottlob Frege, Rudolf Carnap, and Donald Davidson, relied primarily on predicate logic.[34]
Type theory
Type theory is another approach[e] to formal semantics that was popularized by Montague. Its core idea is that expressions belong to different types, which describe how the expressions can be used and combined with other expressions. Type theory, typically in the form of typed lambda calculus, provides a formalism for this endeavor. It begins by defining a small number of basic types, which can be fused to create new types.[37]
According to a common approach, there are only two basic types: entities (e) and truth values (t). Entities are the denotations of names and similar noun phrases, while truth values are the denotations of declarative sentences. All other types are constructed from these two types as functions that have entities, truth values, or other functions as inputs and outputs. This way, a sentence is analyzed as a complex function made up of several internal functions. When all functions are evaluated, the output is a truth value. Simple intransitive verbs without objects are functions that take an entity as input and produce a truth value as output. The type of this function is written as ⟨e,t⟩, where the first letter indicates the input type and the second letter the output type. According to this approach, the sentence "Alice sleeps" is analyzed as a function that takes the entity "Alice" as input to produce a truth value. Transitive verbs with one object, such as the verb "likes", are complex or nested functions. They take an entity as input and output a second function, which itself requires an entity as input to produce a truth value, formalized as ⟨e,⟨e,t⟩⟩.[f] This way, the sentence "Alice likes Bob" corresponds to a nested function to which two entities are applied. Similar types of analyses are provided for all relevant expressions, including logical connectives and quantifiers.[39]
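The nesting of functions described here corresponds to currying, which can be sketched with Python lambdas; the denotations below are invented for illustration.

```python
# Type e: entities (strings here); type t: truth values (Python booleans).
alice, bob = "alice", "bob"

# An intransitive verb of type <e,t>: entity in, truth value out.
sleeps = lambda x: x in {"alice"}

# A transitive verb of type <e,<e,t>>: it combines with the object first and
# returns an <e,t> function that is still waiting for the subject.
likes = lambda y: lambda x: (x, y) in {("alice", "bob")}

print(sleeps(alice))      # True:  "Alice sleeps"
print(likes(bob)(alice))  # True:  "Alice likes Bob"
print(likes(alice)(bob))  # False: "Bob likes Alice"
```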
Others
Possible worlds are another central concept used in the analysis of linguistic meaning. A possible world is a complete and consistent version of how everything could have been, similar to a hypothetical alternative universe. For instance, the dinosaurs were wiped out in the actual world but there are possible worlds where they survived. Possible worlds have various applications in formal semantics, usually to study expressions or aspects of meaning that are difficult to explain when referring only to entities of the actual world. They include modal statements about what is possible or necessary and descriptions of the contents of mental states, such as what people believe and desire. Possible worlds are also used to explain how two expressions can have different meanings even though they refer to the same entity, such as the expressions "the morning star" and "the evening star", which both refer to the planet Venus. One way to include possible worlds in the model-theoretic formalism is to define a set of all possible worlds as one additional component of a model. The interpretation of the meanings of different expressions is then modified to account for this change. For example, to explain that a sentence may be true in one possible world and false in another, one can interpret its meaning not directly as a truth value but as a function from a possible world to a truth value.[40]
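A minimal sketch of this last point, with invented worlds and facts: the meaning of the sentence is modeled not as a truth value but as a function from a possible world to a truth value.

```python
# Possible worlds as labels, each paired with the facts that hold there.
worlds = {
    "w1": {"raining"},
    "w2": set(),
}

# The intension of "it is raining": a function from worlds to truth values.
raining = lambda w: "raining" in worlds[w]

print(raining("w1"))  # True in w1
print(raining("w2"))  # False in w2
```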
Situation semantics is a theory closely related to possible world semantics. Situations, like possible worlds, present possible circumstances. However, unlike possible worlds, they do not encompass a whole universe but only capture specific parts or fragments of possible worlds. This modification reflects the observation that many statements are context-dependent and aim to describe the speaker's specific circumstances rather than the world at large. For example, the sentence "every student sings" is false when interpreted as an assertion about the universe as a whole. However, speakers may use this sentence in the context of a limited situation, such as a specific high school musical, in which it can be true.[41]
Dynamic semantics interprets language usage as a dynamic process in which information is continually updated against the background of an existing context. It rejects static approaches that associate a given expression with a fixed meaning. Instead, this theory argues that meaning depends on the information that is already present in the context, understanding the meaning of a sentence as the change in information it produces. This view reflects the idea that sentences are usually not interpreted in isolation but form part of a larger discourse, to which they contribute in some way.[42] For example, update semantics—one form of dynamic semantics—defines an information state as the set of all possible worlds compatible with the current information, reflecting the idea that the information is incomplete and cannot determine which of these worlds is the right one. Sentences introducing new information update the information state by excluding some possible worlds, thereby decreasing uncertainty.[43][g]
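A rough sketch of such an update, assuming a handful of invented worlds: each accepted sentence shrinks the information state by removing the worlds in which it is false.

```python
# Worlds paired with the facts holding in them; the information state is the
# set of worlds still compatible with what has been established so far.
worlds = {"w1": {"raining", "cold"}, "w2": {"raining"}, "w3": {"cold"}, "w4": set()}
state = set(worlds)  # initially, all worlds are live: nothing is known

def update(state, sentence):
    """Updating with a sentence removes the worlds where it does not hold."""
    return {w for w in state if sentence in worlds[w]}

state = update(state, "raining")  # {"w1", "w2"}
state = update(state, "cold")     # {"w1"}
print(state)
```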
Studied linguistic phenomena
Quantifiers
Quantifiers are expressions that indicate the quantity of something. In predicate logic, the most basic quantifiers only provide information about whether a condition applies to all or some entities, as seen in sentences like "all ravens are black" and "some students smoke". Formal semanticists use the concept of generalized quantifiers to extend this basic framework to a broad range of quantificational expressions in natural language that usually provide more detailed information. They include diverse expressions such as "most", "few", "twelve", and "fewer than ten".[45]
Most quantificational expressions can be interpreted as relations between two sets.[h] For instance, the sentence "all ravens are black" conveys the idea that the set of ravens is a subset of the set of black entities. Similarly, the sentence "fewer than ten books were sold" asserts that the set of books and the set of sold items have fewer than ten elements in common.[47] In English, quantifiers are often expressed with a determiner,[i] such as "all" and "few", indicating the relation between the sets, followed by a noun phrase and a predicate to describe the involved sets.[49]
Quantifiers can be divided into proportional and cardinal quantifiers based on the relation between the sets. Proportional quantifiers, such as "all" and "most", indicate the relative overlap of the first set with the second set. For them, the order of the sets matters. For instance, the sentences "all ravens are black" and "all black things are ravens" have different meanings even though they refer to the same sets. Cardinal quantifiers, such as "four" and "no", provide information about the absolute number of overlapping entities, independent of relative proportion. For them, the order of the sets does not matter, as exemplified by the sentences "no rose is black" and "no black thing is a rose".[50]
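These set relations can be written out directly, as in the sketch below; the sets are invented and the definitions are simplified illustrations rather than full analyses.

```python
# Quantifiers as relations between a restrictor set A and a scope set B.
def all_q(A, B):         return A <= B             # proportional: order matters
def no_q(A, B):          return len(A & B) == 0    # cardinal: order irrelevant
def fewer_than(n, A, B): return len(A & B) < n     # cardinal

ravens = {"r1", "r2"}
black  = {"r1", "r2", "coal"}

print(all_q(ravens, black))          # True:  "all ravens are black"
print(all_q(black, ravens))          # False: "all black things are ravens"
print(no_q(ravens, black))           # False, and no_q(black, ravens) agrees
print(fewer_than(10, ravens, black)) # True:  fewer than ten things are both
```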
Typically, the domain of natural language quantifiers is implicitly limited to a certain range of entities relevant to the discussed issue. For example, in the context of a specific kindergarten, the domain of the sentence "all children are sleeping" is limited to the children attending this kindergarten.[51]
The scope of a quantifier is the part of the sentence to which it applies. Some natural language sentences have scope ambiguity, resulting in competing interpretations of the scope of quantifiers. Depending on how the scope is interpreted, the sentence "Some man loves every woman" can mean either "there is a man such that he loves all women" or "for every woman there is at least one man who loves her".[52]
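The two readings can be made explicit by nesting the quantifiers in different orders; in the invented model below, the readings come apart.

```python
men, women = {"m1", "m2"}, {"f1", "f2"}
loves = {("m1", "f1"), ("m2", "f2")}  # each woman is loved by a different man

# Wide scope for "some": there is one man who loves every woman.
reading1 = any(all((m, w) in loves for w in women) for m in men)

# Wide scope for "every": each woman is loved by some (possibly different) man.
reading2 = all(any((m, w) in loves for m in men) for w in women)

print(reading1, reading2)  # False True: the two readings differ in this model
```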
Descriptions and names
Definite and indefinite descriptions are phrases that denote a specific entity or group of entities within a given context. Definite descriptions in English typically use the definite article "the", followed by a noun phrase, such as "the president of Kenya". However, they can also take other forms, such as "her husband" or "John's bicycle". Indefinite descriptions are usually expressed with the indefinite articles "a" and "an", as in "a lazy coworker" and "an old friend".[54] Definite descriptions typically point to a unique entity and assume that the listener is familiar with the referent. Indefinite descriptions usually allow the description to apply to more than one entity and introduce the entity without presupposing prior knowledge.[55]
Diverse theories about the correct analysis of definite and indefinite descriptions have been proposed. An influential early view, suggested by Bertrand Russell, interprets them using existential quantifiers. It proposes that indefinite descriptions like "a man ran" have the logical form ∃x (M(x) ∧ R(x)), stating that there is an entity that is a man and ran. Definite descriptions have a similar form, with the difference that the description is unique, meaning that the first predicate only applies to a single entity.[56] A central motivation for Russell's approach was to solve semantic puzzles that arise from definite descriptions that do not refer to any particular entity. For example, the sentence "the present king of France is bald" refers to no existing entity, posing challenges for determining its truth value. According to Russell's analysis, the sentence is false since no unique entity exists to which the predicates "present king of France" and "bald" apply.[57]
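A rough sketch of Russell's uniqueness analysis, evaluated in an invented model with no present king of France, so that the sentence comes out false:

```python
domain = {"macron", "eiffel_tower"}
king_of_france = set()        # nothing in this model is a present king of France
bald = {"eiffel_tower"}       # illustrative extension only

def russell_the(A, B, domain):
    """Russell's 'the A is B': there is exactly one A, and it is also B."""
    return any(
        x in A and all(y == x for y in domain if y in A) and x in B
        for x in domain
    )

print(russell_the(king_of_france, bald, domain))  # False: no unique king exists
```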
The problem of names is closely related to that of definite descriptions because both expressions aim to refer to a particular entity. According to Millian theories, names refer directly without any descriptive information of the denoted entity. This view is opposed by description theories, which argue that names carry implicit descriptive contents that help interpreters identify their referents. One view understands names as implicit definite descriptions, proposing that the descriptive content of the name "Socrates" may include information like "the teacher of Plato".[58]
Tense, aspect, and events
Tense and aspect provide temporal information about events and circumstances. Tense indicates whether something happened in the past, present, or future, offering a reference point to place events within a timeline relative to the time of the utterance. Aspect conveys additional information about how events unfold in time, like the distinction between completed, ongoing, and repetitive events. In English, both tense and aspect can be expressed through verb forms. On the level of tense the sentence "I ate" indicates the past, whereas the sentence "I will eat" indicates the future. On the level of aspect, the sentence "I ate" indicates a completed action, whereas the sentence "I was eating" indicates an ongoing action.[59]
Formal semanticists employ diverse conceptual tools to describe tense, such as different types of temporal logic as extensions of predicate logic. One approach includes a set of times in the mathematical model to interpret temporal statements. Some models conceptualize time as a series of instances, while others introduce intervals as the basic units of time. The difference is that intervals have a duration and can overlap, whereas instances are discrete time points that do not intersect. One form of temporal logic introduces tense operators to indicate the time a sentence describes, like the operator P for past events and the operator F for future events. This way, if the formula D(n) stands for "Naomi dances", then P D(n) expresses that Naomi danced in the past, while F D(n) asserts that she will dance in the future.[60] The semantic analysis of aspect is divided into grammatical aspect, expressed through verb forms, and lexical aspect, which covers the inherent temporal characteristics of different verbs.[61]
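A minimal sketch of such tense operators, assuming an invented timeline of instants and a fixed utterance time:

```python
# A timeline of instants and the facts holding at each of them.
facts = {1: set(), 2: {"naomi_dances"}, 3: set(), 4: {"naomi_dances"}}
now = 3

def past(sentence, t=now):    # "it was the case that ...": true at some earlier time
    return any(sentence in facts[u] for u in facts if u < t)

def future(sentence, t=now):  # "it will be the case that ...": true at some later time
    return any(sentence in facts[u] for u in facts if u > t)

print(past("naomi_dances"))    # True: "Naomi danced"
print(future("naomi_dances"))  # True: "Naomi will dance"
```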
An influential approach to the semantic role of events was proposed by Donald Davidson. Using predicate logic, it represents events as singular terms and translates action sentences into logical formulas about events, even if the original sentences contain no explicit reference to events. For example, it translates the sentence "Jones buttered the toast slowly with a knife" as ∃e (Buttered(jones, toast, e) ∧ Slow(e) ∧ With(e, knife)) (literally: there was an event, which was a buttering of the toast by Jones, was slow, and involved a knife). One motivation for this approach is to provide a systematic method for translating adverbs, like "slowly", and other adjuncts into logical formulas.[62]
Intensionality, modality, and propositional attitudes
Semanticists often distinguish two aspects of meaning: extension and intension.[63][j] Extension is the entity or group of entities to which an expression refers, while intension is the inherent concept or underlying idea it conveys. For example, the expressions "the morning star" and "the evening star" have the same extension, as both refer to the planet Venus. However, their meanings differ on the level of intension since they present the planet in different ways by evoking distinct concepts.[65]
Extensionality and intensionality[k] are characteristics of sentences. A sentence is extensional if expressions with the same extension can be substituted without changing the sentence's truth value. For example, the sentence "the morning star is a planet" remains true if the expression "the morning star" is replaced with the expression "the evening star". Intensional sentences, by contrast, are not only sensitive to extensions but also to intensions, meaning that extensionally equivalent expressions cannot be freely replaced. For instance, the sentence "Ann knows that the morning star is the morning star" is intensional since it can be true while the extensionally equivalent sentence "Ann knows that the evening star is the morning star" is false.[67]
Intensionality is present in various linguistic expressions. For instance, modal expressions, such as "may", "can", and "must", usually introduce intensional contexts.[l] They express what is possible or necessary, describing how the world could or could not have been rather than how it actually is. A common approach to the analysis of modal expressions is the use of the modal operators ◇ and □ to modify the meaning of sentences and represent what is possible and necessary. For example, if the formula R stands for the statement "it is raining", then the formula ◇R stands for the statement "it is possible that it is raining". To interpret the meaning of modal statements, formal semanticists often rely on the concept of possible worlds. According to this approach, a sentence is possibly true if it is true in at least one possible world, whereas it is necessarily true if it is true in all possible worlds.[69][m]
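A minimal sketch of this possible-worlds reading of the modal operators, with invented worlds and without refinements such as accessibility relations:

```python
# Each possible world is paired with the facts that hold in it.
worlds = {"w1": {"raining"}, "w2": set(), "w3": {"raining", "sunny"}}

def possibly(sentence):     # true in at least one possible world
    return any(sentence in facts for facts in worlds.values())

def necessarily(sentence):  # true in all possible worlds
    return all(sentence in facts for facts in worlds.values())

print(possibly("raining"))     # True:  "it is possible that it is raining"
print(necessarily("raining"))  # False: "it is necessarily raining"
```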
Propositional attitude reports—another example of intensionality—discuss mental states of individuals. They often use verbs like "believes", "doubts", and "wants", followed by a that-clause describing the content of the attitude, like the sentence "Kyrie believes that the earth is flat". The use of possible worlds is also common for the analysis of propositional attitudes. For example, the content of a propositional attitude can be understood as the set of all possible worlds in which it is true, such as all possible worlds with a flat earth in the mentioned example.[71] The meaning of propositional attitude reports containing definite or indefinite descriptions is often ambiguous. This ambiguity arises from the interpretation of the description, which can be subjective or objective. For example, if Jasper wants a drink from his butler but is unaware that his butler poisoned his wife, then the sentence "Jasper wants a drink from the poisoner of his wife" is ambiguous. According to the objective interpretation—called de re interpretation—the sentence is true since the butler is in fact the poisoner. Conversely, the subjective interpretation—called de dicto interpretation—renders it false since Jasper does not want drinks from poisoners.[72]
Questions and imperatives
The main focus of formal semantics is on statements, which aim to describe reality and are either true or false depending on whether they succeed. However, this analysis does not cover all types of sentences. Specific frameworks have been proposed for the analysis of other sentence types, such as questions and imperatives.[73]
Various theories analyze the meaning of questions in terms of possible answers, replacing the concept of truth conditions, common in the analysis of statements, with the related notion of answerhood conditions. One approach, initially formulated by Charles Leonard Hamblin, interprets answerhood conditions as the set of statements that qualify as answers to a question. For instance, the sentences "Marco called" and "Don called" qualify as answers to the question "Who called?", but the sentence "I like ice cream" does not. A common distinction is between yes-no questions, which only ask for confirmation, and open-ended questions, which seek more detailed information. Additional considerations include the distinctions between true and false answers, and between complete and partial answers, depending on whether the response contains all the requested information. On the symbolic level, questions can be expressed using ? as an operator to indicate the subject of the question. For example, the question "Who called?" can be formalized as ?x C(x), with C standing for "called", whereas the question "Did anyone call?" takes the form ?∃x C(x).[74]
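A rough sketch of the Hamblin-style idea that a question denotes its set of possible answers, using an invented domain of individuals:

```python
# The meaning of "Who called?" as the set of propositions "x called",
# one for each individual in the domain.
domain = {"Marco", "Don", "Lucia"}

def who_question(predicate):
    return {f"{x} {predicate}" for x in domain}

answers = who_question("called")
print("Marco called" in answers)      # True: a possible answer
print("I like ice cream" in answers)  # False: not an answer to this question
```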
Imperative sentences usually express commands or instructions, like the sentence "Close the door!". Unlike declarative and interrogative sentences, which generally convey or request information, the primary goal of imperatives is to influence the behavior of the listener. As a result, imperatives have no or at least no obvious truth conditions. Other difficulties in the analysis of imperative sentences are that they usually lack an explicit subject and that they can express various other meanings besides commands, such as advice, invitations, or permissions. Formal semanticists study the meaning of imperatives by examining how they interact with other linguistic phenomena. These include cases in which one imperative entails another imperative, the negation of an imperative, and conditional imperatives as well as conjunctions and disjunctions of several imperatives.[75]
Other phenomena
Diverse other linguistic phenomena are studied in formal semantics. Negation is typically understood as an operation that inverts the meaning of an expression. In classical logic, it is expressed through the operator ¬, as in ¬S(m), indicating that Mia is not sleeping. This operator inverts the truth value of a statement: if S(m) is false then ¬S(m) is true. In natural language, negative particles and quantifiers, such as "not" and "no", are often used to indicate negation. These expressions can occur in different positions within sentences to negate either the full sentence or specific parts of it. The scope of a negation operator is the part of the sentence that it affects, which can sometimes be ambiguous. For example, the sentence "all doctors have no car" can mean that not every doctor has a car, that not a single doctor has a car, or that no individual car is collectively owned by all doctors.[76]
Plural expressions refer to multiple objects, such as the terms "children" and "apples". Formal semanticists typically interpret them as denoting some kind of plural object, such as the set of individuals belonging to the group in question. They distinguish between distributive and collective uses depending on whether the predicate applies to each individual separately or to the group as a whole. Some sentences are ambiguous and allow for both interpretations. For example, the sentence "two boys pushed a car" can mean that there were two cars and each boy pushed one (distributive) or that there was one car that both boys pushed together (collective).[77]
Formal semanticists also examine expressions whose meaning depends on contextual factors. They include indexical or deictic expressions, which refer to some aspect of the situation in which they are uttered. Examples are the pronouns "I" and "you", which refer to the speaker and the addressee, as well as the adverbs "today" and "over there", which refer to temporal and spatial aspects of the situation. Anaphoric expressions are another type of context-dependent expression. They refer to terms or phrases used earlier in the text, called antecedents. In the passage "Peter woke up. He switched on the light." the word "he" is an anaphoric expression with the word "Peter" as its antecedent. This grammatical association is known as binding and depends on the context since the word "he" would refer to someone else if the preceding sentence had a different antecedent.[78] Other linguistic phenomena studied by formal semanticists include presupposition, conditionals, thematic roles, spatial expressions, adjectives, and adverbs.[79]
While much work focuses on the study of English sentences, formal semantics is not limited to the English language and includes cross-linguistic analysis.[80] For example, there are diverse methods of expressing grammatical aspect across distinct languages, including prefixes, verb endings, auxiliary verbs, or combinations of several methods. They impact semantic analysis by influencing the hierarchical structure of sentences.[81] Similar problems of cross-linguistic analysis are also found for other linguistic phenomena, including quantification and indefinite descriptions.[82]
In various fields
[edit]Formal logic
Formal logic studies the laws of deductive reasoning, focusing on entailment relations between premises and conclusions rather than linguistic meaning in general. It investigates rules of inference, such as modus ponens, which describe the logical structure of deductively valid arguments.[83] Formal logicians develop artificial languages, like the language of predicate logic, to avoid the ambiguities of natural language and give precise descriptions of the laws of logic.[84] Formal semantics plays a central role in applying these laws to natural language arguments: it helps logicians discern the logical form of everyday arguments, serving as a crucial step in translating them into logical formulas.[85]
Another key overlap between formal semantics and formal logic concerns the meaning of artificial logical languages. The semantics of logic examines the construction of mathematical models of formal languages, similar to the models used by formal semanticists to study natural language. These models typically include abstract objects to represent individuals and sets. The relation to formulas is established through an interpretation function that maps symbols to the abstract objects they denote.[86] A key aspect of this interface is the contrast between syntactic and semantic entailment.[n] A premise syntactically entails a conclusion if the conclusion can be deduced using rules of inference. A premise semantically entails a conclusion if the conclusion is true in every possible model where the premise is true.[88]
Computer science
Computational semantics is an interdisciplinary field at the intersection of computer science and formal semantics. It studies how computational processes can be utilized to deal with linguistic meaning. A primary focus is the analysis of natural language sentences through computer-based methods to discern their logical structure, understand their content, and extract information. This form of inquiry has various applications in areas of artificial intelligence, such as automated reasoning, machine learning, and machine translation. Difficulties in this process come from the ambiguity, vagueness, and context dependence of natural language expressions.[89]
Another intersection concerns the analysis of the meaning of programming languages. A programming language is an artificial language designed to give instructions or describe computations to be performed by computers. A formal semantics of a programming language is a mathematical model of how it works. Its goal is to help computer scientists understand, analyze, and verify program behavior.[90] Static semantics describes the process of compilation or how a human-readable programming language is translated into binary machine code.[91] Dynamic semantics examines run-time behavior or what happens during the execution of instructions.[92] The main approaches to dynamic semantics are denotational, axiomatic, and operational semantics. Denotational semantics describes the effects of code elements, axiomatic semantics examines the conditions before and after code execution, and operational semantics interprets code execution as a series of state transitions.[93]
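As a minimal sketch of the denotational style, the following maps each phrase of a toy expression language (not any real programming language) to the value it denotes.

```python
# A denotational semantics for a tiny expression language: every syntactic
# phrase is assigned a mathematical object, here a number, relative to an
# environment that supplies the values of variables.
def denote(expr, env):
    kind = expr[0]
    if kind == "num":   # ("num", 3) denotes the number 3
        return expr[1]
    if kind == "var":   # ("var", "x") denotes the value of x in the environment
        return env[expr[1]]
    if kind == "add":   # ("add", e1, e2) denotes the sum of the denotations
        return denote(expr[1], env) + denote(expr[2], env)
    raise ValueError(f"unknown expression: {expr}")

program = ("add", ("num", 3), ("var", "x"))
print(denote(program, {"x": 4}))  # 7
```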
Cognitive science
Cognitive science studies the mind by focusing on how it represents and transforms information. It is an interdisciplinary field that integrates research from diverse areas, ranging from psychology and neuroscience to philosophy, artificial intelligence, and linguistics.[94] Some researchers emphasize the central role of language in understanding the human mind and rely on formal semantics to provide an abstract model for analyzing how linguistic meaning is constructed and interpreted.[95]
Formal semantics is also relevant to cognitive neuroscience, which seeks to explain the biological processes underlying cognition. One approach uses brain imaging techniques to visualize brain activity and employs mathematical models to link this data to cognitive processes. Insights from formal semantics can refine these models and help formulate testable predictions. For instance, researchers can examine semantic cognition by presenting a person with semantic variations of a sentence and measuring the differences in brain responses.[96]
History
Formal semantics has its roots in the development of modern logic in the late 19th and early 20th centuries. Gottlob Frege laid the foundations of predicate logic and examined how this logical system can be used to analyze natural language arguments. He engaged in this analysis using a small number of basic concepts of formal semantics, such as singular terms, predicates, quantifiers, and logical connectives. Frege also formulated the principle of compositionality and introduced the distinction between sense and reference.[98]
Following Frege's work, Alfred Tarski developed a rigorous theory of truth in formal languages starting in the 1930s. He provided a precise analysis of truth conditions and clarified the concept of logical consequence. His work formed a cornerstone of model theory.[99] Rudolf Carnap synthesized and generalized many of Frege's and Tarski's ideas. To overcome problems associated with extensional definitions of meaning, Carnap pioneered the study of intensional semantics, defining intensions as functions from possible worlds to denotations.[100]
Donald Davidson was influenced by Tarski's approach and emphasized the role of truth conditions as a key component of semantic theory and the analysis of sentence meaning. He also proposed an event-based formalism to translate action sentences into predicate logic.[101] Noam Chomsky's work on generative grammar inspired Jerrold Katz and Jerry Fodor to explore the relation between syntactic rules and semantic content through the principle of compositionality.[102]
Many of these contributions prepared the ground for the work of Richard Montague, usually considered the main founding figure of formal semantics. One of his key achievements, starting in the late 1960s and early 1970s, was the development of a systematic formalism for analyzing significant portions of the English language using tools from formal logic. This stood in contrast to many earlier approaches, which addressed some aspects of natural language but were skeptical of broader applications and had their main focus on the analysis of formal languages. Relying on type theory and the principle of compositionality, Montague analyzed complex natural language expressions as nested functions with precisely defined input and output types.[103] This development happened against the background of the "linguistic wars"—a debate between proponents of generative semantics and interpretive semantics about whether syntax and semantics are deeply integrated or independent aspects of language. Montague's approach aimed to provide a unified perspective by explaining the relationship between syntactic and semantic rules.[104] His system also covers intensional sentences such as modal expressions and propositional attitude reports through the concept of possible worlds.[105]

In the following decades, Montague's work influenced many scholars, who sought to refine or modify his insights and apply them to diverse linguistic phenomena.[107] Barbara Partee was instrumental in explaining and popularizing Montague's ideas, helping formal semantics grow into a subfield of linguistics by integrating Montague's insights into linguistic theory.[108] In response, various theorists focused on the relation between syntax and semantics, proposing diverse grammatical theories to explain the interface, such as Generalized Phrase Structure Grammar and Head-Driven Phrase Structure Grammar. These developments also include the contributions of Irene Heim and Angelika Kratzer to the semantics of generative grammar.[109] A parallel development was a rising interest in pragmatics, which examines how the use of an expression affects its meaning, encompassing topics like context dependence, presupposition, and indexicality.[110]
The work of Robert Stalnaker and David Lewis prepared the ground for the development of dynamic semantics, which analyzes the meaning of a sentence as the information contribution it makes. Their theories inspired later developments by Hans Kamp, Heim, Jeroen Groenendijk, and Martin Stokhof.[111] Stalnaker and Lewis, together with Saul Kripke, also made influential contributions to possible world semantics.[112] Jon Barwise and John Perry proposed situation semantics as another influential framework. It incorporates many insights from possible world semantics but takes a more fine-grained approach, analyzing meaning in terms of situations rather than possible worlds.[113] Both David Kaplan and Pauline Jacobson made various contributions to the study of context-sensitive expressions, such as deictic and anaphoric terms. Jacobson also explored the principle of direct compositionality, which suggests a particularly close link between syntax and semantics.[114]
See also
- Alternative semantics
- Discourse representation theory
- Frame semantics (linguistics)
- Inquisitive semantics
- Syntax–semantics interface
- Traditional grammar
References
Notes
- ^ In this example, the entailment relation is one-way. However, entailments can also go in both directions if two sentences entail each other, like the sentences "Tina is tall and happy" and "Tina is happy and tall". In such cases, the two sentences are said to be equivalent.[12]
- ^ This general method also reflects the externalist theory of meaning common in formal semantics: the meaning of an expression is interpreted as the entities it denotes in an abstract model.[15]
- ^ This can be expressed symbolically through the use of double brackets. For example, the formula ⟦Tina⟧ᴹ refers to the object denoted by the name "Tina" in the model "M".[16]
- ^ Typically, predicates start with uppercase letters and singular terms start with lowercase letters.[30]
- ^ Predicate logic and type theory are not exclusive approaches and are sometimes combined into hybrid systems in modern formal semantics.[36]
- ^ This process is known as currying.[38]
- ^ Dynamic predicate logic is another approach that modifies the language of predicate logic to better capture natural language expressions that refer to individuals mentioned earlier, such as pronouns.[44]
- ^ In type theory, sets can be interpreted as characteristic functions from entities to truth values of the type ⟨e,t⟩, returning true if the entity is a member of the set and false otherwise. As a consequence, most quantifiers have the type ⟨⟨e,t⟩,⟨⟨e,t⟩,t⟩⟩, corresponding to a function that takes two sets as inputs and outputs a truth value that depends on the relation between the sets.[46]
- ^ In some cases, bare plurals act as quantifiers without a determiner, such as the sentence "firemen wear helmets", expressing the idea that all firemen wear helmets.[48]
- ^ This distinction is also discussed under the terms reference and sense as well as denotation and connotation.[64]
- ^ Intensionality is different from intentionality but the two concepts are related since expressions describing intentionality, like propositional attitude reports, are typically intensional.[66]
- ^ There are diverse ways to express modality, including modal auxiliaries such as "could" and "should"; modal adverbs such as "possibly" and "necessarily"; and modal adjectives such as "conceivable" and "probable".[68]
- ^ More fine-grained approaches distinguish between different types of modality, such as logical, epistemic, and deontic modality, while introducing separate operators for each type.[70]
- ^ These entailment relations are also referred to as deductive-theoretic and model-theoretic consequence.[87]
Citations
- ^
- Portner & Partee 2002, pp. 1–2
- Partee 2016, pp. 3–4
- Winter 2016, pp. 3–4
- Matthews 2007, p. 144
- ^
- Portner 2005, p. 14
- Moeschler 2007, p. 32
- Saeed 2009, p. 305
- ^
- Portner & Partee 2002, pp. 1–2
- Partee 2016, pp. 3–4
- Lappin 2003, pp. 370–371
- Yalcin 2014, pp. 17–54
- von Fintel 2023
- ^
- King 2009, pp. 557–558
- Fox 2014, pp. 85–87
- Barba 2007, pp. 637–639
- ^
- Cann 1993, pp. ix–xii, 1–3
- Winter 2016, pp. 5–6
- ^
- King 2009, pp. 557–558
- Portner 2005, pp. 214–216
- ^
- Moeschler 2007, pp. 31–32
- Griffiths & Cummins 2023, p. 1
- Bezuidenhout 2009, p. 875
- ^
- Portner & Partee 2002, pp. 1–2
- Partee 2016, pp. 3–4
- Winter 2016, pp. 3–4
- Kearns 2011, p. 24
- ^
- Lappin 2003, pp. 375–376
- Kearns 2011, pp. 6, 8–11
- Winter 2016, pp. 17–21
- ^
- Stokhof 2013, pp. 210–213
- Winter 2016, pp. 12–16
- ^
- Coppock & Champollion 2025, pp. 6, 15–28
- Winter 2016, pp. 12–13
- Kearns 2011, pp. 11–12
- ^ Winter 2016, p. 16
- ^
- Stokhof 2013, pp. 210–213
- Winter 2016, pp. 12–16
- ^ Winter 2016, p. 19
- ^
- Winter 2016, pp. 18, 240–241
- Portner & Partee 2002, pp. 1–2
- Partee 2016, pp. 3–4
- Lappin 2003, pp. 370–371
- ^ Winter 2016, p. 18
- ^
- Winter 2016, pp. 17–18, 24–27, 240–241
- Fox 2014, pp. 86, 90–92
- Saeed 2009, pp. 309–310
- ^
- Harris 2017, pp. 158–159
- Dowty 2007, p. 48
- ^
- Winter 2016, pp. 28–29
- Lappin 2003, pp. 370, 374
- Stokhof 2013, pp. 208–209
- Fox 2014, pp. 90–91
- ^
- Lappin 2003, pp. 370–371
- Moeschler 2007, p. 32
- Fox 2014, p. 92
- ^ Coppock & Champollion 2025, pp. 291–294, "'Logical Form' refers here to a level of syntactic representation… It is natural to refer to the [logical] translation as the ‘logical form’ of a sentence, but this is not what is meant by ‘Logical Form’ in this context.".
- ^ Ruys, Eddy; Winter, Yoad (2011). "Quantifier scope in formal linguistics." (PDF). In Gabbay, Dov; Guenthner, Franz (eds.). Handbook of Philosophical Logic (2 ed.). Dordrecht: Springer. pp. 159–225. doi:10.1007/978-94-007-0479-4_3. ISBN 978-94-007-0478-7.
The term Logical Form has of course been chosen to suggest similarity with the logician's notion of the logical form of a proposition which underlies its inference properties, as distinguished from the grammatical form. However, it has repeatedly been stressed… that the representation of a sentence at the grammatical level of LF is not to be equated with its "logical form."
- ^ Sailer, Manfred (2016). "The syntax–semantics interface". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 644–654. ISBN 978-1-316-55273-5.
In this section we will characterize an approach to the syntax–semantics interface that assumes a syntactic level of Logical Form (LF)... Whether or not a semantic representation language has a theoretically important status is a matter of debate. May (1991) calls such a representation a (lower-case) logical form in contrast to LF.
- ^ Heim & Kratzer 1998, Chapter 3.2, Chapter 7.
- ^ Szabolcsi, Anna (2001). "The syntax of scope". In Baltin, Mark; Collins, Chris (eds.). The Handbook of Contemporary Syntactic Theory. Wiley. pp. 607–633.
- ^ Coppock & Champollion 2025, pp. 36, 43–46.
- ^ Cable, Seth (2019). "From Logic To Montague Grammar (Course Notes)". LingBuzz. Interpretations for Natural Language: Towards Montague’s Theory of Translation. Retrieved 19 June 2025.
- ^
- Lappin 2003, pp. 371–373
- King 2009, pp. 557–558
- Harris 2017, pp. 149–150, 161–162
- Partee 2010, pp. 16–17
- ^
- Lappin 2003, p. 371
- Kearns 2011, pp. 24–26
- Klement, Lead section, § 1. Introduction, § 3. The Language of Propositional Logic
- ^ Shapiro & Kouri Kissel 2024, § 2.1 Building Blocks
- ^
- Lappin 2003, p. 371
- Kearns 2011, pp. 32–37, 45–47
- Cann 1993, pp. 27–28, 32–33, 151–153
- Shapiro & Kouri Kissel 2024, Lead section, § 2. Language
- ^
- Kearns 2011, pp. 32–37, 45–47
- Cann 1993, pp. 27–28, 32–33
- ^
- Hurley 2018, pp. 327–328
- Baumgartner & Lampert 2008, p. 95
- ^ Lappin 2003, pp. 371–375
- ^ Winter 2016, p. 55
- ^
- Fox 2014, pp. 92, 98–101
- Kearns 2011, pp. 24, 57–58
- Cann 1993, pp. 27, 82–84
- ^
- Winter 2016, pp. 45–46
- Kearns 2011, pp. 57–62
- Lappin 2003, pp. 375–376
- ^ Winter 2016, pp. 57–58
- ^
- Kearns 2011, pp. 57–62
- Winter 2016, pp. 44–58
- ^
- Winter 2016, pp. 5, 190–191, 198–199
- Kearns 2011, pp. 8, 79, 82–83
- Saeed 2009, pp. 32, 337, 378
- ^
- Lappin 2003, pp. 385–388
- Portner 2005, pp. 212–213
- Akman 2009, pp. 890–892
- ^
- Lappin 2003, pp. 381–385
- Groenendijk & Stokhof 2009, pp. 272–274
- Nouwen et al. 2022, Lead section
- ^
- Groenendijk & Stokhof 2009, pp. 275–276
- Nouwen et al. 2022, § 1.1 Update Semantics
- ^
- Nouwen et al. 2022, § 2. Dynamic Predicate Logic
- Groenendijk & Stokhof 2009, pp. 274–275
- ^
- Westerståhl 2016, pp. 206–209
- Kearns 2011, pp. 94–97
- Portner 2005, pp. 112–114
- ^
- Westerståhl 2016, pp. 206–209
- Kearns 2011, pp. 121–122
- ^
- Westerståhl 2016, pp. 206–209
- Kearns 2011, pp. 96–98
- Winter 2016, pp. 114–118
- ^ Westerståhl 2016, pp. 210–211
- ^
- Westerståhl 2016, pp. 210–211
- Kearns 2011, pp. 97–98
- Winter 2016, pp. 114–118
- ^
- Kearns 2011, pp. 94–97
- Westerståhl 2016, pp. 206–209
- ^
- Iacona 2015, p. 130
- Westerståhl 2016, pp. 212–213
- Kearns 2011, pp. 106–107
- ^
- Kearns 2011, p. 105
- Peters & Westerståhl 2006, pp. 16–17
- ^ Ludlow 2023, § 2. Russell's Theory of Descriptions
- ^
- Ostertag 2009, pp. 194–195
- Ludlow 2023, Lead section, § 1. What Are Descriptions?
- Winter 2016, p. 235
- ^ Abbott 2009, pp. 184–186
- ^ Ludlow 2023, § 2. Russell's Theory of Descriptions
- ^
- Kearns 2011, pp. 111–113
- Ostertag 2009, pp. 195–196
- Ludlow 2023, § 3. Motivations for Russell's Theory of Descriptions, § 5.1 The Challenge to Russell's Truth Conditions
- ^
- Reimer 2009, pp. 762–763
- Ludlow 2023, § 4.1 Descriptive Theories of Proper Names
- Cumming 2023, § 2.1 Meaning and Extension
- Kearns 2011, pp. 111–112
- ^
- Kearns 2011, pp. 176–177
- Winter 2016, pp. 235–236
- Portner 2005, pp. 137–138
- Grønn & von Stechow 2016, pp. 313–314
- ^
- Goranko & Rumberg 2025, § 2. 2. Formal Models of Time, § 3. Prior's Basic Tense Logic TL
- Kearns 2011, pp. 185–188
- Portner 2005, pp. 139–141
- Grønn & von Stechow 2016, pp. 313–314, 339
- ^
- Rothstein 2016, pp. 342–343
- Kearns 2011, pp. 156–157, 176–177
- ^
- Kearns 2011, pp. 241–245
- Lasersohn 2009, pp. 279–280
- ^
- Griffiths & Cummins 2023, pp. 7–9
- Cunningham 2009, pp. 526–527
- Saeed 2009, p. 46
- ^
- Griffiths & Cummins 2023, pp. 7–9
- Cunningham 2009, pp. 526–527
- Saeed 2009, p. 46
- ^ Fitting 2022, § 1. What Is This About?
- ^
- Cann 1993, pp. 308–309
- Parsons 2016, p. 9
- ^
- Oldager 2009, pp. 301–302
- Fitting 2022, Lead section, § 1. What Is This About?
- ^ Portner 2009a, pp. 1–2
- ^
- Winter 2016, p. 237
- Kearns 2011, pp. 79–84
- ^ Kearns 2011, pp. 79–82
- ^
- Cann 1993, pp. 308–309
- Lindeman, Lead section
- Portner 2005, pp. 161–163
- Kearns 2011, pp. 8–9, 134, 137, 141–142
- ^
- Kearns 2011, pp. 8–9, 134, 137, 141–142
- Winter 2016, pp. 218–221
- ^
- Dekker, Aloni & Groenendijk 2016, pp. 560–566
- Portner 2009, pp. 594–595
- Winter 2016, pp. 237–238
- Allan 2009, p. xiii
- ^
- Dekker, Aloni & Groenendijk 2016, pp. 560–566
- Cross & Roelofsen 2024, § 1.2 Kinds of Questions, § 2. The Semantics of Elementary Questions
- ^
- Portner 2009, pp. 594–598, 616, 618–619
- Fox 2015, pp. 314–319
- ^
- de Swart 2016, pp. 467–470, 489
- Cann 1993, pp. 60–61
- Kearns 2011, pp. 27–28
- de Swart 2012, p. 111
- ^
- Lasersohn 2009a, pp. 688–691
- Nouwen 2016, pp. 267–270
- Syrett & Musolino 2013, pp. 259–260
- ^
- Kearns 2011, p. 16
- Winter 2016, pp. 234–235
- Huang 2000, pp. 390–391
- ^
- Kearns 2011, pp. 17, 206
- Winter 2016, pp. 236–237
- ^
- Partee 2016, pp. 10–11, 21
- Bittner 2008, p. 383
- ^
- Saeed 2009, pp. 133–137
- Rothstein 2016, pp. 360–363, 366–367
- ^
- Partee 2016, p. 21
- Brasoveanu & Farkas 2016, pp. 243, 265
- ^
- Hintikka & Sandu 2006, pp. 13–14
- Audi 1999, pp. 679–681
- Cannon 2002, pp. 14–15
- ^
- Tully 2005, pp. 532–533
- Hodges 2005, pp. 533–536
- Walton 1996
- Johnson 1999, pp. 265–268
- ^
- Portner 2005, pp. 214–217
- Brun 2003, pp. 83, 175, 207
- Peregrin & Svoboda 2016, pp. 55, 64
- ^
- Cook 2009, pp. 124, 256
- Shapiro & Kouri Kissel 2024, Lead section, § 4. Semantics
- Barba 2007, p. 639
- ^
- ^
- Forster 2003, p. 74
- Cook 2009, pp. 82, 176
- McKeon 2010, pp. 1–2, 24–25
- Shapiro & Kouri Kissel 2024, Lead section, § 5. Meta-theory
- ^
- Portner 2005, pp. 214–217
- Geeraerts 2010, p. 118
- Bunt & Muskens 1999, pp. 1–2
- Erk 2018, Summary
- Stone 2016, pp. 775–777, 799–800
- ^
- Fernández 2014, pp. 14–16
- Winskel 1993, pp. xv–xvi
- Portner 2005, pp. 214–217
- ^
- Fernández 2014, pp. 14–15
- Fritzson 2010, p. 703
- Mosses 2003, p. 167
- ^
- Fernández 2014, pp. 15–16
- Fritzson 2010, p. 703
- Mosses 2003, p. 167
- ^
- Fernández 2014, p. 16
- O’Regan 2020, pp. 193–194
- Winskel 1993, pp. xv–xvi
- ^ Portner 2005, pp. 216–217
- ^
- Portner 2005, pp. 214–217
- Partee 2008, p. 10
- Partee 1995, pp. 311–360
- ^
- Winter 2016, pp. 3–4
- Baggio, Stenning & van Lambalgen 2016, pp. 756–757, 761–762, 773–774
- ^
- Harris 2017, p. 162
- King 2009
- Partee 2010, p. 17
- Magee 2011, pp. 39–40
- ^
- Lappin 2003, pp. 371–373
- King 2009, pp. 557–558
- Harris 2017, pp. 149–150, 161–162
- Partee 2010, pp. 16–17
- ^
- Harris 2017, p. 162
- King 2009
- Partee 2010, p. 17
- Magee 2011, pp. 39–40
- ^
- Harris 2017, pp. 162–163
- Partee 2010, p. 17
- Lappin 2003, pp. 373–374
- ^
- Lappin 2003, pp. 375–376
- Harris 2017, pp. 151–152
- Kearns 2011, pp. 241–245
- ^ Partee 2010, pp. 6–8
- ^
- Harris 2017, pp. 172–175
- Partee 2010, pp. 3–4, 26–27
- Portner & Partee 2002, pp. 3–4
- Lappin 2003, p. 390
- Barwise & Cooper 2002, pp. 114–115
- ^
- Portner & Partee 2002, pp. 3–4
- Partee 2010, pp. 14–15
- Magee 2011, pp. 40–41
- ^
- Harris 2017, pp. 172–175
- Partee 2010, pp. 3–4, 26–27
- Portner & Partee 2002, pp. 3–4
- ^ Partee 2010, pp. 15, 31, 34–36
- ^
- Portner & Partee 2002, pp. 3–6
- Partee 2010, pp. 36–38
- ^ Partee 2010, pp. 15, 31, 34–36
- ^
- Portner & Partee 2002, pp. 4–5
- Partee 2010, p. 36
- Heim & Kratzer 1998, p. ix
- ^
- Portner & Partee 2002, p. 5
- Partee 2010, pp. 38–40
- ^
- ^
- Partee 2010, pp. 19–20
- King 2009
- ^ Portner & Partee 2002, p. 6
- ^
- Harris 2017, pp. 158–159
- Dowty 2007, p. 48
- King 2009
Sources
- Abbott, B. (2009). "Definite and Indefinite". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. pp. 184–191. ISBN 978-0-08-095969-6.
- Akman, V. (2009). "Situation Semantics". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. pp. 890–893. ISBN 978-0-08-095969-6.
- Allan, Keith (2009). "Introduction". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. pp. xi–xv. ISBN 978-0-08-095969-6.
- Audi, Robert (1999). "Philosophy of Logic". The Cambridge Dictionary of Philosophy. Cambridge University Press. ISBN 978-1-107-64379-6.
- Baggio, Giosuè; Stenning, Keith; van Lambalgen, Michiel (2016). "24. Semantics and Cognition". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 756–774. ISBN 978-1-316-55273-5.
- Barba, Juan (2007). "Formal Semantics in the Age of Pragmatics". Linguistics and Philosophy. 30 (6): 637–668. doi:10.1007/s10988-008-9031-4.
- Barker, Chris; Jacobson, Pauline (2007). "Introduction: Direct Compositionality". In Barker, Chris; Jacobson, Pauline (eds.). Direct Compositionality. Oxford University Press. pp. 1–23. ISBN 978-0-19-152540-7.
- Barwise, Jon; Cooper, Robin (2002). "Generalized Quantifiers and Natural Language". In Portner, Paul H.; Partee, Barbara H. (eds.). Formal Semantics: The Essential Readings. Blackwell Publishers. pp. 75–126. ISBN 0-631-21541-7.
- Baumgartner, Michael; Lampert, Timm (September 2008). "Adequate Formalization". Synthese. 164 (1): 93–115. doi:10.1007/s11229-007-9218-1. S2CID 15396554.
- Bezuidenhout, A. (2009). "Semantics–Pragmatics Boundary". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. ISBN 978-0-08-095969-6. Retrieved 2024-02-04.
- Bittner, Maria (2008). "Aspectual Universals of Temporal Anaphora". In Rothstein, Susan D. (ed.). Theoretical and Crosslinguistic Approaches to the Semantics of Aspect. John Benjamins Publishing. pp. 349–386. ISBN 978-90-272-9158-5.
- Brasoveanu, Adrian; Farkas, Donka F. (2016). "8. Indefinites". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 238–266. ISBN 978-1-316-55273-5.
- Brun, Georg (2003). Die richtige Formel: Philosophische Probleme der logischen Formalisierung (in German). Ontos Verlag. ISBN 3-937202-13-7.
- Bunt, Harry; Muskens, Reinhard (1999). "Computational Semantics". Computing Meaning: Volume 1. Springer Netherlands. doi:10.1007/978-94-011-4231-1_1. ISBN 978-9-401-14231-1. Retrieved 2024-02-15.
- Cann, Ronnie (1993). Formal Semantics: An Introduction. Cambridge University Press.
- Cannon, Douglas (2002). Deductive Logic in Natural Language. Broadview Press. ISBN 978-1-77048-113-8.
- Cook, Roy T. (2009). Dictionary of Philosophical Logic. Edinburgh University Press. ISBN 978-0-7486-3197-1.
- Coppock, Elizabeth; Champollion, Lucas (2025). "Invitation to Formal Semantics" (PDF). Manuscript. Retrieved 3 June 2025.
- Cross, Charles; Roelofsen, Floris (2024). "Questions". The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. Retrieved 6 June 2025.
- Cumming, Sam (2023). "Names". The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. Retrieved 2 June 2025.
- Cunningham, D. J. (2009). "Meaning, Sense, and Reference". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. pp. 526–535. ISBN 978-0-08-095969-6.
- de Swart, Henriëtte (2012). "Scope Ambiguities with Negative Quantifiers". In Heusinger, H. K. von; Egli, U. (eds.). Reference and Anaphoric Relations. Springer Science & Business Media. pp. 109–132. ISBN 978-94-011-3947-2.
- de Swart, Henriëtte (2016). "Negation". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 467–489. ISBN 978-1-316-55273-5.
- Dekker, Paul; Aloni, Maria; Groenendijk, Jeroen (2016). "Questions". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 560–593. ISBN 978-1-316-55273-5.
- Dowty, David (2007). "Compositionality as an Empirical Problem". In Barker, Chris; Jacobson, Pauline (eds.). Direct Compositionality. Oxford University Press. pp. 23–101. ISBN 978-0-19-152540-7.
- Erk, Katrin (2018). "Computational Semantics". Oxford Research Encyclopedia of Linguistics. Oxford University Press. ISBN 978-0-199-38465-5. Archived from the original on 2024-02-13. Retrieved 2024-02-15.
- Fernández, Maribel (2014). Programming Languages and Operational Semantics: A Concise Overview. Springer. ISBN 978-1-447-16368-8. Retrieved 2024-02-04.
- Fitting, Melvin (2022). "Intensional Logic". The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. Retrieved 3 June 2025.
- Forster, Thomas (2003). Logic, Induction and Sets. Cambridge University Press. ISBN 978-0-521-53361-4.
- Fox, Chris (2014). "The Meaning of Formal Semantics". In Stalmaszczyk, Piotr (ed.). Semantics and Beyond: Philosophical and Linguistic Inquiries. Walter de Gruyter GmbH & Co KG. pp. 85–107. ISBN 978-3-11-039114-5.
- Fox, Chris (2015). "10. The Semantics of Imperatives". In Lappin, Shalom; Fox, Chris (eds.). The Handbook of Contemporary Semantic Theory (1 ed.). Wiley. pp. 314–341. doi:10.1002/9781118882139.ch10. ISBN 978-0-470-67073-6.
- Fritzson, Peter (2010). Principles of Object-Oriented Modeling and Simulation with Modelica 2.1. John Wiley & Sons. ISBN 978-0-470-93761-7. Retrieved 2024-02-19.
- Geeraerts, Dirk (2010). Theories of Lexical Semantics. Oxford University Press. ISBN 978-0-198-70030-2. Retrieved 2024-02-15.
- Goranko, Valentin; Rumberg, Antje (2025). "Temporal Logic". The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. Retrieved 2 June 2025.
- Griffiths, Patrick; Cummins, Chris (2023). An Introduction to English Semantics and Pragmatics (3rd ed.). Edinburgh University Press. ISBN 978-1-399-50460-7.
- Groenendijk, J.; Stokhof, M. (2009). "Dynamic Semantics". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. ISBN 978-0-08-095969-6.
- Grønn, Atle; von Stechow, Arnim (2016). "11. Tense". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 313–341. ISBN 978-1-316-55273-5.
- Harris, Daniel W. (2017). "The History and Prehistory of Natural-Language Semantics". In Lapointe, Sandra; Pincock, Christopher (eds.). Innovations in the History of Analytical Philosophy. Palgrave Macmillan UK. pp. 149–194. doi:10.1057/978-1-137-40808-2_6. ISBN 978-1-137-40807-5.
- Heim, Irene; Kratzer, Angelika (1998). Semantics in Generative Grammar. Wiley. ISBN 978-0-631-19712-6.
- Hintikka, Jaakko; Sandu, Gabriel (2006). "What Is Logic?". In Jacquette, Dale (ed.). Philosophy of Logic. North Holland. pp. 13–39. ISBN 978-0-444-51541-4.
- Hodges, Wilfrid (2005). "Logic, Modern". In Honderich, Ted (ed.). The Oxford Companion to Philosophy. Oxford University Press. pp. 533–536. ISBN 978-0-19-926479-7.
- Huang, Yan (2000). Anaphora: A Cross-linguistic Approach. Oxford University Press. ISBN 978-0-19-823528-6.
- Hurley, Patrick J. (2018). A Concise Introduction to Logic (13th ed.). Cengage Learning. ISBN 978-1-305-95809-8.
- Iacona, Andrea (2015). "Quantification and Logical Form". In Torza, Alessandro (ed.). Quantifiers, Quantifiers, and Quantifiers: Themes in Logic, Metaphysics, and Language. Springer. ISBN 978-3-319-18362-6.
- Jacobson, Pauline (2014). Compositional Semantics: An Introduction to the Syntax/Semantics Interface. OUP Oxford. ISBN 978-0-19-166483-0.
- Janssen, Theo M. V.; Zimmermann, Thomas Ede (2025). "Montague Semantics". The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. Retrieved 28 May 2025.
- Johnson, Ralph H. (1999). "The Relation Between Formal and Informal Logic". Argumentation. 13 (3): 265–274. doi:10.1023/A:1007789101256. S2CID 141283158.
- Kearns, Kate (2011). Semantics (2nd ed.). Bloomsbury Academic. ISBN 978-0-230-23229-7.
- King, Jeffrey C. (2009). "Formal Semantics". In Lepore, Ernie; Smith, Barry C. (eds.). The Oxford Handbook of Philosophy of Language (1 ed.). Oxford University Press. pp. 557–573. ISBN 978-0-19-955223-8.
- Klement, Kevin C. "Propositional Logic". Internet Encyclopedia of Philosophy. Retrieved 24 March 2025.
- Lappin, Shalom (2003). "An Introduction to Formal Semantics". In Aronoff, Mark; Rees-Miller, Janie (eds.). The Handbook of Linguistics (1st ed.). Wiley. pp. 369–393. doi:10.1002/9780470756409.ch15. ISBN 978-0-631-20497-8.
- Lasersohn, P. (2009). "Event-Based Semantics". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. pp. 279–282. ISBN 978-0-08-095969-6.
- Lasersohn, P. (2009a). "Plurality". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. pp. 688–691. ISBN 978-0-08-095969-6.
- Lindeman, David. "Propositional Attitudes". Internet Encyclopedia of Philosophy. Retrieved 4 June 2025.
- Ludlow, Peter (2023). "Descriptions". The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. Retrieved 2 June 2025.
- Magee, Liam (2011). "The Meaning of Meaning: Alternative Disciplinary Perspectives". In Cope, Bill; Kalantzis, Mary; Magee, Liam (eds.). Towards A Semantic Web: Connecting Knowledge in Academic Research. Elsevier. pp. 35–80. ISBN 978-1-78063-174-5.
- Matthews, P. H. (2007). The Concise Oxford Dictionary of Linguistics (2nd ed.). Oxford University Press. ISBN 978-0-19-920272-0.
- May, Robert C. (2001). "Logical Form in Linguistics". In Wilson, Robert A.; Keil, Frank C. (eds.). The MIT Encyclopedia of the Cognitive Sciences. MIT Press. pp. 486–487. ISBN 978-0-262-73144-7.
- McKeon, Matthew. "Logical Consequence". Internet Encyclopedia of Philosophy. Retrieved 12 June 2025.
- McKeon, Matthew W. (2010). The Concept of Logical Consequence: An Introduction to Philosophical Logic. Peter Lang. ISBN 978-1-4331-0645-3.
- Moeschler, Jacques (2007). "Introduction to Semantics". In Rajman, Martin (ed.). Speech and Language Engineering. EPFL Press. ISBN 978-0-824-72219-7.
- Mosses, Peter D. (2003). "The Varieties of Programming Language Semantics (And Their Uses)". In Bjørner, Dines; Broy, Manfred; Zamulin, Alexandre (eds.). Perspectives of System Informatics: 4th International Andrei Ershov Memorial Conference, PSI 2001, Akademgorodok, Novosibirsk, Russia, July 2-6, 2001, Revised Papers. Springer. ISBN 978-3-540-45575-2. Retrieved 2024-02-19.
- Nouwen, Rick (2016). "Plurality". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 267–284. ISBN 978-1-316-55273-5.
- Nouwen, Rick; Brasoveanu, Adrian; van Eijck, Jan; Visser, Albert (2022). "Dynamic Semantics". The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. Archived from the original on 25 February 2024. Retrieved 13 February 2024.
- Oldager, N. (2009). "Extensionality and Intensionality". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. pp. 301–304. ISBN 978-0-08-095969-6.
- Ostertag, G. (2009). "Definite and Indefinite Descriptions". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. pp. 194–200. ISBN 978-0-08-095969-6.
- O’Regan, Gerard (2020). Mathematics in Computing: An Accessible Guide to Historical, Foundational and Application Contexts. Springer Nature. ISBN 978-3-030-34209-8. Retrieved 2024-02-19.
- Parsons, David (2016). Theories of Intensionality: A Critical Survey. Springer. ISBN 978-981-10-2484-9.
- Partee, Barbara H. (1995). "11. Lexical Semantics and Compositionality". In Osherson, Daniel N.; Gleitman, Lila R. (eds.). An Invitation to Cognitive Science: Language. MIT Press. ISBN 978-0-262-65044-1.
- Partee, Barbara H. (2008). Compositionality in Formal Semantics: Selected Papers. John Wiley & Sons. ISBN 978-0-470-75129-9.
- Partee, Barbara H. (2010). "Formal Semantics: Origins, Issues, Early Impact". Baltic International Yearbook of Cognition, Logic and Communication. 6 (1): 1–52. doi:10.4148/biyclc.v6i0.1580.
- Partee, Barbara H. (2016). "1. Formal Semantics". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 3–32. ISBN 978-1-316-55273-5.
- Peregrin, Jaroslav; Svoboda, Vladimír (2016). "Logical Formalization and the Formation of Logic(s)". Logique et Analyse (233): 55–80. ISSN 0024-5836. JSTOR 26767818. Archived from the original on 21 February 2023. Retrieved 27 March 2023.
- Peters, Stanley; Westerståhl, Dag (2006). Quantifiers in Language and Logic. Clarendon Press. ISBN 978-0-19-929125-0.
- Portner, Paul H. (2005). What Is Meaning?: Fundamentals of Formal Semantics. John Wiley & Sons. ISBN 978-1-4051-0918-5.
- Portner, Paul (2009). "Imperatives". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. pp. 594–626. ISBN 978-0-08-095969-6.
- Portner, Paul (2009a). Modality. Oxford University Press. ISBN 978-0-19-929242-4.
- Portner, Paul H.; Partee, Barbara H. (2002). "Introduction". In Portner, Paul H.; Partee, Barbara H. (eds.). Formal Semantics: The Essential Readings. Blackwell Publishers. pp. 1–16. ISBN 0-631-21541-7.
- Reimer, M. (2009). "Proper Names: Philosophical Aspects". In Allan, Keith (ed.). Concise Encyclopedia of Semantics. Elsevier. pp. 762–766. ISBN 978-0-08-095969-6.
- Rothstein, Susan (2016). "12. Aspect". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 342–368. ISBN 978-1-316-55273-5.
- Saeed, John I. (2009). Semantics (3rd ed.). Wiley-Blackwell. ISBN 978-1-405-15639-4.
- Shapiro, Stewart; Kouri Kissel, Teresa (2024). "Classical Logic". The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. Retrieved 12 June 2025.
- Stokhof, Martin (2007). "Hand or Hammer? On Formal and Natural Languages in Semantics". Journal of Indian Philosophy. 35 (5–6): 597–626. doi:10.1007/s10781-007-9023-7.
- Stokhof, Martin (2013). "Formal Semantics and Wittgenstein: An Alternative?". Monist. 96 (2): 205–231. doi:10.5840/monist20139629.
- Stone, Matthew (2016). "25. Semantics and Computation". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 775–800. ISBN 978-1-316-55273-5.
- Syrett, Kristen; Musolino, Julien (2013). "Collectivity, Distributivity, and the Interpretation of Plural Numerical Expressions in Child and Adult Language". Language Acquisition. 20 (4): 259–291. doi:10.1080/10489223.2013.828060.
- Tully, Robert (2005). "Logic, Informal". In Honderich, Ted (ed.). The Oxford Companion to Philosophy. Oxford University Press. pp. 532–533. ISBN 978-0-19-926479-7.
- von Fintel, Kai (2023). "What Is Semantics?" (Document). MIT course notes.
- Walton, Douglas (1996). "Formal and Informal Logic". In Craig, Edward (ed.). Routledge Encyclopedia of Philosophy. Routledge. doi:10.4324/9780415249126-X014-1. ISBN 978-0-415-07310-3.
- Westerståhl, Dag (2016). "7. Generalized Quantifiers". In Aloni, Maria; Dekker, Paul (eds.). The Cambridge Handbook of Formal Semantics. Cambridge University Press. pp. 206–237. ISBN 978-1-316-55273-5.
- Winskel, Glynn (1993). The Formal Semantics of Programming Languages: An Introduction. MIT Press. ISBN 978-0-262-23169-5.
- Winter, Yoad (2016). Elements of Formal Semantics: An Introduction to the Mathematical Theory of Meaning in Natural Language. Edinburgh University Press. ISBN 978-0-7486-7777-1.
- Yalcin, Seth (2014). "Semantics and Metasemantics in the Context of Generative Grammar". In Burgess, Alexis; Sherman, Brett (eds.). Metasemantics: New Essays on the Foundations of Meaning. Oxford University Press. ISBN 978-0-19-966959-2.