Specimens: "most of" generic NPs in a contextually flexible type theory


This paper proposes to compute the meanings associated with sentences containing generic NPs corresponding to the most of generalized quantifier. We call these generics specimens; they resemble stereotypes or prototypes in lexical semantics. The meanings are viewed as logical formulae that can thereafter be interpreted in your favorite models. We depart from the dominant Fregean single untyped universe and opt for type theory, with hints from Hilbert's epsilon calculus and from medieval philosophy. Our type-theoretic analysis bears some resemblance to ongoing work in lexical semantics. Our model also applies to classical examples involving a class (or a generic element of this class) which is provided by the context. An outcome of this study is that, in the minimalism-contextualism debate, if one adopts a type-theoretical view, terms encode the purely semantic meaning component while their typing is pragmatically determined.


💡 Research Summary

The paper proposes a novel type‑theoretic framework for interpreting generic noun phrases that correspond to the “most of” generalized quantifier; the author calls these generics “specimens”. Departing from the traditional Fregean single untyped universe, the author adopts second‑order λ‑calculus (System F) to provide a flexible typing discipline that can capture both semantic content and pragmatic context‑dependence.

The central device is a specimen operator, denoted ∡ x·A, which maps any property A to a representative element that embodies the properties true of most A‑instances. This operator is inspired by Hilbert’s τ‑operator but differs from the ε‑choice function in that it does not assert the existence of a specific individual; rather, it encodes a measure‑theoretic “most” without committing to a concrete object. The paper argues that ∡ behaves more like τ (universal) than ε (existential) and that it avoids contradictions: a property and its negation cannot both hold of the specimen.

System F is chosen because it supports polymorphic types (Π‑types) and type‑level quantification, allowing a single lexical entry to be instantiated at many different types. Base types include the usual e (entities), t (truth values), and a collection of domain‑specific types such as human, dog, 2‑year‑girl, etc. The constant ∀ has type Π α.(α→t)→t, yielding a universal quantifier over any type α when instantiated. Analogously, the constant ∡ has type Π α.α, yielding the specimen of any type α when instantiated.
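The typing of the ∀ constant can be mimicked with parametric polymorphism. The sketch below relativizes the quantifier to an explicit finite domain so that it is executable (my assumption, not part of System F), and then instantiates the single polymorphic entry at two different base types.

```typescript
type T = boolean; // the type t of truth values

// ∀ : Πα.(α→t)→t, here relativized to a finite domain for executability
// (an assumption of this sketch, not of System F itself).
const forAll = <A>(domain: A[]) => (p: (x: A) => T): T => domain.every(p);

// One polymorphic entry, instantiated at two different base types.
const humans = ["alice", "bob"];
const nums = [2, 4, 6];
console.log(forAll(humans)((h) => h.length > 0)); // true
console.log(forAll(nums)((n) => n % 2 === 0));    // true
```

The point mirrored here is that a single lexical entry needs no duplication per type: instantiation at α is ordinary type application.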

The semantic composition follows a Montague‑style λ‑term application, but each lexical entry may also provide optional λ‑terms that perform type‑shifts when needed. For example, the adjective “tall” is given the polymorphic type Π α.α→t and is defined as a predicate that holds of an object x of type α precisely when x exceeds the height of the specimen of α. This yields two distinct readings for the sentence “Carlotta is tall”: (i) treating Carlotta as a 2‑year‑girl, the sentence is true because she is taller than the specimen of that class; (ii) treating her as a human, the sentence is false because she does not exceed the height of the specimen of the broader human class. The choice between these readings is driven by contextual information that selects the appropriate type‑shift (e.g., a coercion function h : 2‑year‑girl→human).
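The two readings of "Carlotta is tall" can be sketched by indexing "tall" with class-specific specimen heights. All concrete names, numbers, and the coercion below are invented for illustration; they are not from the paper.

```typescript
// Per-class information: how to measure height, and the height of the
// specimen of that class. All concrete numbers are invented.
interface ClassInfo<A> {
  height: (x: A) => number; // metres
  specimenHeight: number;   // height of the specimen of the class
}

type Human = { name: string; heightM: number };
type TwoYearGirl = Human & { ageMonths: number };

// tall : Πα.α→t — true of x iff x exceeds the height of the specimen of α.
const tall = <A>(cls: ClassInfo<A>) => (x: A): boolean =>
  cls.height(x) > cls.specimenHeight;

const asGirl: ClassInfo<TwoYearGirl> = { height: (x) => x.heightM, specimenHeight: 0.85 };
const asHuman: ClassInfo<Human> = { height: (x) => x.heightM, specimenHeight: 1.65 };

// The coercion h : 2-year-girl → human is plain type widening here.
const h = (g: TwoYearGirl): Human => g;

const carlotta: TwoYearGirl = { name: "Carlotta", heightM: 0.95, ageMonths: 26 };
console.log(tall(asGirl)(carlotta));     // true: tall for a 2-year-old girl
console.log(tall(asHuman)(h(carlotta))); // false: not tall for a human
```

The same polymorphic entry for "tall" yields opposite truth values depending on which class (type instantiation) the context supplies, which is exactly the type-driven ambiguity the paper describes.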

Other illustrative examples include: (1) “The AKC notes that any dog may bite …”, which is rendered using Hilbert’s τ‑operator as a universal generic; (2) “The Brits love France”, which becomes love(∡{Brits}, France), mirroring the behavior of the ι description operator but without requiring an actual individual Briton. The paper also discusses how “most of” is not merely a cardinality notion but a measure‑theoretic one, citing the mathematical fact that the proportion of prime numbers below n tends to zero as n→∞.
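The point about primes can be checked numerically (the bounds 100 and 10,000 are arbitrary choices of mine): the proportion of primes below n shrinks as n grows, in line with the prime number theorem's ~1/ln(n) asymptotic.

```typescript
// Proportion of primes among 1..n, by trial division; by the prime number
// theorem this is roughly 1/ln(n), so it tends to 0 as n grows.
function primeDensity(n: number): number {
  let count = 0;
  for (let k = 2; k <= n; k++) {
    let isPrime = true;
    for (let d = 2; d * d <= k; d++) {
      if (k % d === 0) { isPrime = false; break; }
    }
    if (isPrime) count++;
  }
  return count / n;
}

console.log(primeDensity(100));   // 0.25   (25 primes up to 100)
console.log(primeDensity(10000)); // 0.1229 (1229 primes up to 10000)
```

So "most numbers are not prime" is true in this limiting, measure-like sense even though there are infinitely many primes, which is why a pure cardinality reading of "most" is inadequate.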

The author emphasizes that the term calculus models the purely semantic component, while the flexibility of typing in System F captures pragmatic adaptation. This separation clarifies the long‑standing minimalism‑contextualism debate: semantics supplies the logical form; pragmatics supplies the type‑selection that determines which logical form is appropriate in a given discourse context.

In conclusion, the paper demonstrates that a System F‑based typed λ‑calculus equipped with a specimen operator can systematically derive logical representations for “most of” generics, handle context‑driven type shifts, and delineate the boundary between semantics and pragmatics. The approach bridges lexical semantics, formal logic, and medieval philosophical insights, offering a promising avenue for further research on genericity and type‑theoretic semantics.

