Algebras over a field and semantics for context based reasoning

Reading time: 6 minutes

📝 Original Info

  • Title: Algebras over a field and semantics for context based reasoning
  • ArXiv ID: 1111.1673
  • Date: 2011-11-08
  • Authors: Not provided in the supplied excerpt.

📝 Abstract

This paper introduces context algebras and demonstrates their application to combining logical and vector-based representations of meaning. Other approaches to this problem attempt to reproduce aspects of logical semantics within new frameworks. The approach we present here is different: We show how logical semantics can be embedded within a vector space framework, and use this to combine distributional semantics, in which the meanings of words are represented as vectors, with logical semantics, in which the meaning of a sentence is represented as a logical form.

📄 Full Content

This chapter introduces context algebras and demonstrates their application to combining logical and vector-based representations of meaning. Other chapters in this volume consider approaches that attempt to reproduce aspects of logical semantics within new frameworks. The approach we present here is different: We show how logical semantics can be embedded within a vector space framework, and use this to combine distributional semantics, in which the meanings of words are represented as vectors, with logical semantics, in which the meaning of a sentence is represented as a logical form.

The ideas discussed here are present (at least implicitly) in earlier work; however, we have introduced some notions that allow the mathematics to be tidied considerably:

• When context algebras were introduced [3] they were applied only to functions from a free monoid A* to R. In fact, this construction generalises to functions from A* to an arbitrary vector space V. The proof of the general case is identical to that of the specific one, and is reproduced here unchanged.

• This more general construction gives us an elegant way of embedding logical semantics within an algebraic framework. The embedding presented here follows similar lines to the thinking of [3], but uses the new, more general, context algebras.

• The method of combining logical semantics with vector-based lexical semantics is new, but follows similar lines to an approach suggested in [4].

Like other work in this book, we are concerned with the question of how to compose vector-based representations of meaning so that phrases and sentences are also represented as vectors. We wish to preserve the flexibility and fine-grained distinctions of meaning that vector spaces allow, and which have been so successful in lexical semantics, within a complete framework for natural language semantics encompassing words, phrases, sentences and beyond. Unlike other work, the approach presented here does not attempt to reconstruct logical semantics from scratch; instead, it embeds logical representations within a vector space. This has some benefits:

• Doing natural language semantics well is difficult, and a lot of work has gone into getting logical semantics for natural language right. It includes worrying about things like anaphora resolution, generalised quantifiers and negation, and reproducing this work from scratch in a vector-based framework is a mammoth task. Our approach allows us to reuse existing work while incorporating vector-based lexical semantics.

• There is the potential to reuse existing tools for natural language semantics, although computation in general is a problem with our approach.

The downside to our approach is that we don’t yet have an efficient way of computing with it, although we have ideas for how this may be achieved. Another potential criticism of this approach is that the flexibility in how vector representations are combined with logic may be hindered by requiring the wholesale adoption of existing formalisms, rather than the more tailored approaches of other work.

We first recall some basic definitions:

Definition 1 (Algebra over a field). An algebra over a field is a vector space A over a field K together with a binary operation (a, b) ↦ ab on A that is bilinear, i.e.

(αa + βb)c = α(ac) + β(bc) and a(αb + βc) = α(ab) + β(ac)

for all a, b, c ∈ A and all α, β ∈ K. If we additionally have the property (ab)c = a(bc), then A is called associative. An algebra is called unital if it has a distinguished unity element 1 satisfying 1x = x1 = x for all x ∈ A. We are generally interested only in real associative algebras, where K is the field of real numbers, R.

An example of an associative algebra is given by the square matrices of order n, with matrix multiplication as the product and entry-wise addition and scalar multiplication as the vector space operations. The field of the algebra is the field of the matrix entries; so real-valued matrices form a real associative algebra.
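
As a quick sanity check, the algebra axioms can be verified numerically for 2×2 real matrices. The following is a minimal sketch in plain Python; the helper names `madd`, `smul` and `mmul` are ours, not from the text:

```python
# 2x2 real matrices as lists of lists; vector space operations are entry-wise.
def madd(a, b):
    return [[a[i][j] + b[i][j] for j in range(2)] for i in range(2)]

def smul(c, a):
    return [[c * a[i][j] for j in range(2)] for i in range(2)]

def mmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[0.0, 1.0], [1.0, 0.0]]
c = [[2.0, 0.0], [0.0, 2.0]]
alpha, beta = 2.0, -1.0

# Bilinearity in the first argument: (alpha*a + beta*b)c = alpha*(ac) + beta*(bc)
assert mmul(madd(smul(alpha, a), smul(beta, b)), c) == \
       madd(smul(alpha, mmul(a, c)), smul(beta, mmul(b, c)))

# Associativity: (ab)c = a(bc)
assert mmul(mmul(a, b), c) == mmul(a, mmul(b, c))

# Unital: the identity matrix is the unity element.
I = [[1.0, 0.0], [0.0, 1.0]]
assert mmul(I, a) == a and mmul(a, I) == a
```

Bilinearity in the second argument is checked symmetrically; the same code pattern applies with the roles of the factors swapped.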

The distributional hypothesis of Harris [5] states that words will have similar meanings if and only if they occur in similar contexts. We formalise this idea, and examine the resultant mathematical properties.

Let A be some set, which we imagine to be the set of words of a natural language. If V is a vector space, we define a general language for V (or simply a language when there is no ambiguity) as a function from the free monoid A* to V. Each string x ∈ A* thus has an associated vector in V, which may be interpreted in several ways:

• V may simply be the real numbers R, and the language may describe a probability distribution over strings in A*, in which case we can view the language as a generative model of a natural language, describing the probability of observing each possible string as a sentence or a document.

• V may be a vector space describing the meaning of strings, for example a representation of model-theoretic semantics. In this case, the language attaches a meaning to each possible string in A*.
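
The two interpretations above can be illustrated with a toy sketch. This is purely illustrative: the vocabulary, the unigram probability model, and the additive meaning map are our own hypothetical examples, not constructions from the text.

```python
# A general language is a function from strings over A (here, tuples of
# words) to a vector space V.

# Interpretation 1: V = R, the language as a generative (unigram) model.
word_prob = {"the": 0.4, "cat": 0.3, "sat": 0.3}

def prob_language(s):
    p = 1.0
    for w in s:
        p *= word_prob.get(w, 0.0)  # unseen words get probability 0
    return p

# Interpretation 2: V = R^2, the language attaches a meaning vector to each
# string (here, naively, the sum of its word vectors).
word_vec = {"the": (0.0, 0.1), "cat": (0.9, 0.2), "sat": (0.1, 0.8)}

def meaning_language(s):
    v = (0.0, 0.0)
    for w in s:
        x, y = word_vec.get(w, (0.0, 0.0))
        v = (v[0] + x, v[1] + y)
    return v

print(prob_language(("the", "cat")))     # 0.4 * 0.3, i.e. ≈ 0.12
print(meaning_language(("the", "cat")))  # ≈ (0.9, 0.3)
```

In both cases the language is just a map A* → V; only the choice of V and of the map changes the interpretation.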

Given a general language L we define

…(Full text truncated)…


Reference

This content is AI-processed based on ArXiv data.
