Outlines of Algebraic Semantics


Pavel Bělíček


‘Algebraic semantics’ is a new theoretical approach to the formal description of the semantic structure of natural languages in terms of standard mathematical algebras. Its interdisciplinary systematics encroaches upon the theoretical fields of syntactic theory, logic, artificial intelligence as well as theoretical algebra. Its projections upon the sphere of logical thought can be referred to as conceptual logic, and its intersections with mathematical theories deserve to be denoted as notional algebra.




Formal Grammars

Formal Presuppositions of Semantic Description

Constituency and Dependency

Parenthetical Grammar

Concatenation and Decatenation

Fractional Grammars

Semantic Grammars

Semantic Restriction and Expansion

The Properties of Commutativity and Associativity

Revisiting String Theory and its Reaxiomatisation

Axiomatisation of Semantic Grammars

Projective Grammars

Semantic Modality and Its Quantification

Quantification and Quantifiers

Negation

Bivalent or Dual Negation

Quadral Quantification

Quadrivalent, Quadral or Quaternary Negation

Modal Semantics

Octovalent, Octal or Octonary Negation

References


Formal Grammars


The first endeavours to give a formal notation of linguistic strings were due to Axel Thue and Emil Post1, who adapted their theory to data processing in Turing machines. An epoch-making attempt at their formalisation was launched by Y. Bar-Hillel, who devised a quasi-arithmetical notation for deciphering syntactic phrases.2 In the mid-1950s it developed into categorial grammars that offered a recognoscative apparatus for evaluating the grammatical correctness of sentences and lexical strings. His analysis of language structures arose as a by-product of the earliest research in rewriting systems designed for machine processing. It provided a decision-making counterpart to Noam Chomsky’s generative phrase-structure grammars, which made the greatest contribution to modern techniques of artificial intelligence. These results influenced generations of young researchers and became a cornerstone of theoretical computer science.


Formal Presuppositions of Semantic Description


Linguistic studies wend their way in two directions, one devoted to visible language form and the other to the invisible sphere of meaning-oriented semantics. Their interrelations are not linked by strict mathematical homomorphism but allow us to speak informally about approximate mappings. Let there be a natural language L composed of its alphabet A, vocabulary V and the apparatus G of grammatical rules. Then it is possible to define algebraic semantics as a formal system dealing with their mapping f into the realm of semantic referents. The vocabulary is mapped into the set S of semantic meanings (sememes), while the grammatical apparatus G is projected upon the schematic layout C of logical and ontological categories.

                   f(words) = meanings                           f: V → S

                   f(grammar) = categories                      f: G → C
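These two mappings can be sketched informally as finite dictionaries. The vocabulary items and category names below are invented purely for illustration; a minimal Python sketch, not part of the formal system:

```python
# Toy sketch of the mappings f: V -> S and f: G -> C.
# All entries are hypothetical illustrations.
V_to_S = {            # words -> sememes (kernel, literal meanings)
    "dog": "DOG",
    "run": "RUN",
    "must": "NECESSITY",
}
G_to_C = {            # grammatical devices -> logical/ontological categories
    "past tense": "TIME",
    "plural": "QUANTITY",
    "imperative": "MODALITY",
}

def f(word):
    """Map a word to its primary sememe (None if outside the kernel vocabulary)."""
    return V_to_S.get(word)

assert f("must") == "NECESSITY"
```

Restricting the dictionary to primary literal meanings mirrors the methodological step described below: polysemy and figurative senses are simply left out of the kernel mapping.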

Natural languages involve much polysemy, so it is necessary to restrain the reference of words to the kernel vocabulary of primary literal meanings. This methodological step presupposes abstracting from the infinite variety of figurative meanings implied by numerous secondary connotations. This is why algebraic semantics switches directly from linguistic form to the realm of meaning. For simplicity’s sake it treats words as basic sememes in their primary elementary sense. When dealing with the modal meaning of must, may, will, shall, it considers them right away as sememes and resigns from mentioning the irrelevant intricacies of their formal lexemes.


The present state of human cognition may be summed up by concluding that theoretical logic and mathematics give an exact formalisation of the most essential fields of human thought but cover only a small part of semantic fields. Even if they give a precise logical treatment of basic elementary concepts, they do not care to render an integral description of the whole layout of a given semantic area. They are engrossed too much in their special internal technicalities that hinder them from joining their subtheories into an all-inclusive picture of the outer world. Algebraic semantics works with less rigorous theoretical apparatus but relentlessly strives to ensure mutual convertibility between semantic, logical, mathematical and algebraic calculi.


Constituency and Dependency


Modern advances in formal grammars have devised two elementary types of formal linguistic analysis. One was based on Chomsky’s phrase-structure grammars and their close predecessor, the immediate constituent analysis proposed by Rulon Wells3. Both approaches treated linguistic structures as linear sequences of words made up from the vocabulary of a natural language and put forward useful methods of their hierarchical segmentation. Their chief weakness was seen in low sensitivity to the mutual subordination of constituents. This drawback was partly removed by L. Tesnière’s project of dependency grammars4. His verb-centred system focused on semantic actants and syntactic pairs relating heads and dependents. Their mutual advantages are elucidated by the comparison5 of two ways of analysing the sentence We are trying to understand the difference given below.


Table 1. Dependency and constituency grammars

The chief asset of grammatical trees is that they give a vivid illustrative representation of syntactic structures for common lay observers, but this advantage is offset by the difficulties they bring about in automatic word processing. Hence, a convenient remedy is provided by parenthetical and fractional grammars.



Parenthetical Grammar


In formal linguistics it is essential to realise that the laws of associativity hold neither in lexical nor in syntactic strings. Their absence advances a strong argument for parenthetisation. The structuring and inner hierarchy of German and English expressions is much easier to understand when parentheses are used.

‘Parenthetical grammar’ is a formal rewriting system that applies parentheses for expressing the grammatical relations of dependency and semantic subordination. It provides the simplest method of syntactic parsing without requiring very demanding means of visual representation. It employs a simple apparatus of left brackets (‘{’, ‘[’ or ‘(’) in order to demarcate the initial boundary of linguistic expressions and right brackets (‘}’, ‘]’ or ‘)’) that delimit their end. As seen in the phrase a ladies’ dress, parenthetisation induces considerable differences in meaning:

             a ladies’ dress = a (ladies’ dress) ≠ (a lady’s) dress = a lady’s dress .

The expression on the left describes a dress for ladies, whereas the phrase structure on the right refers to a particular lady’s garment.

A simple example of sentence analysis is given by the collocation Such an extremely long journey exhausted our energy. Its parenthetical articulation segments couples of heads and dependents into the ensuing hierarchy:

((Such (an ((extremely long) journey))) (exhausted (our energy))) .

When rendered in terms of phrase structures, its decomposition proceeds as follows:

S → (NP VP) → ((D NP) VP) → ((D (AP NP)) VP) → ((D ((Adv A) NP)) VP) → ((D ((Adv A) NP)) (V NP)) .

Another telling illustration is supplied by the string Little Red Riding-Hood went to her grandmother in another village:

((Little (Red Riding-Hood)) (went (to ((her grandmother) (in (another village)))))) .

The main reason for introducing such adjustments in syntactic theory is not only that they save space and simplify analysis. Their most important theoretical asset consists in opening the second dimension of syntactic hierarchy. Parenthetical grammars turn linear sequences into 2D patterns embedding strings into a two-dimensional Cartesian space. The basic horizontal axis x depicts the linear sequencing of symbols, while the second vertical axis y plots strings with the scaled hierarchy of phrase structures according to different levels of syntactic validity.6
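This two-dimensional reading of parenthetical strings can be sketched in a few lines: each word receives an x-coordinate from its linear position and a y-coordinate from its nesting depth. The function name and the whitespace tokenisation are illustrative assumptions, not part of the formal theory:

```python
def plot_hierarchy(parenthesised):
    """Return (token, x, y) triples for a parenthesised string:
    x = linear position on the horizontal axis,
    y = nesting depth on the vertical axis of syntactic hierarchy."""
    depth, x, out = 0, 0, []
    for token in parenthesised.replace("(", " ( ").replace(")", " ) ").split():
        if token == "(":
            depth += 1
        elif token == ")":
            depth -= 1
        else:
            out.append((token, x, depth))
            x += 1
    return out

points = plot_hierarchy(
    "((Such (an ((extremely long) journey))) (exhausted (our energy)))")
# 'extremely' sits deeper on the y-axis than 'Such'
```

Deeper-embedded constituents receive larger y values, which is exactly the vertical dimension that plain linear strings lack.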


Concatenation and Decatenation


In current string theory individual symbols and strings are treated as immediate constituents linked by the binary operation of concatenation. Informally speaking, it is a procedure joining two strings of shorter length into a concatenation whose length is the sum of both segments. Given two arbitrary strings S1 = x1...xn and S2 = y1...ym, their concatenation S1S2 is given by the following formula:


S1S2 = x1...xny1...ym .


If x and y are basic symbols, their concatenation is written with various algebraic symbols such as

xy = x * y = x × y .

Bar-Hillel’s theoretical apparatus made use of analogies to arithmetical multiplication, division and cancellation, but such conventions represented only a formal and artificial apparatus. In fact, they have little to do with the properties of rational numbers featuring in arithmetical fractions. He might equally have applied an additive formalism that renders the concatenation of strings as a sum of two addends. An elementary case of additive binary concatenation can be illustrated by joining two lexical strings composed of several letters, as in the formula below:

town + hall = townhall .

An inverse operation to concatenation may be denoted as decatenation and defined as unlinking chains into shorter fragments. A simple illustration of decatenative cancellation is provided by

townhall – hall = town .

Neither concatenation nor decatenation is a commutative operation. This means that the order of addends and subtrahends cannot be switched:

town + hall = townhall ≠ hall + town .

This inconvenience makes us introduce a special symbol Ø for left subtraction:

-town + townhall = town Ø townhall = hall .

The chief argument for giving preference to additive notation for concatenative strings is that the slash sign for right and left division can be employed for other purposes such as syntactic dependence.
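The additive notation above can be sketched directly with ordinary character strings, where right decatenation strips a suffix and left decatenation (the Ø operation) strips a prefix. The function names are invented for illustration:

```python
def right_decat(z, y):
    """Right decatenation z - y: strip the suffix y from z (inverse of z = x + y)."""
    if not z.endswith(y):
        raise ValueError(f"{y!r} is not a right factor of {z!r}")
    return z[:len(z) - len(y)]

def left_decat(x, z):
    """Left decatenation x Ø z: strip the prefix x from z (inverse of z = x + y)."""
    if not z.startswith(x):
        raise ValueError(f"{x!r} is not a left factor of {z!r}")
    return z[len(x):]

assert "town" + "hall" == "townhall"               # concatenation
assert right_decat("townhall", "hall") == "town"   # townhall - hall = town
assert left_decat("town", "townhall") == "hall"    # town Ø townhall = hall
```

Because concatenation is not commutative, the two inverse operations genuinely differ: stripping a suffix and stripping a prefix are distinct, single-valued procedures.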


Some theoretical contributions have developed the idea of ‘right cancellation’ conceived as a string operation that deletes some symbols at the right end of the string: “The right cancellation of a letter a from a string s is the removal of the first occurrence of the letter a in the string s, starting from the right hand side. The empty string is always cancellable: ε ÷ a = ε. Clearly, right cancellation and projection commute.”7 However, it cannot be regarded as identical to the concept of right decatenation.


Fractional Grammars


The formal apparatus of parenthetical grammars shares many of the inadequacies encountered in immediate constituent analysis. It chains subsequent neighbouring words into pairs but does not specify their grammatical interrelations expressed by their mutual syntactic dependency. A convenient solution is offered by the so-called fractional grammars. They combine the convenient properties of constituency and dependency by indicating the subordinate position of dependents by the slash signs ‘/’ and ‘\’. This is how it is possible to analyse a simple sentence The extremely long journey exhausted our energy:


S → NP\VP → ((D\NP)\VP) → ((D\(AP\NP))\VP) → ((D\((Adv\A)\NP))\VP) → ((D\((Adv\A)\NP))\(V/(D\NP))) .

The right slash in V/NP means that in accusative object constructions the noun phrase NP functions as a dependent of the head V (verb). It is efficient especially in indicating the syntactic status of incongruent attributes following the governing nominal head. Its treatment of attribute constructions is illustrated by the phrase structure the flower of different colours:

(the\flower)/(of/(different\colours)) .

NP → (D\N)/NP → (D\N)/(A\N) .

The replacement of cancellation by subtraction seems convenient since it permits exploiting slash marks for designating other important string operations. One possible usage might serve for designating relations of syntactic dependency. The inner structure of a word becomes comprehensible if we combine dependency with parenthetisation. The afore-mentioned phrases gain in clarity and explicitness if they are segmented neatly by parentheses determining the hierarchy of terms:

Rücksichtslosigkeit ≈ ‘inconsiderateness’ ,

((((Rück\sichts)\los)\ig)\keit) ≈ ‘(in\((consider)\ate)\ness)’ .

In such lexical derivations suffixes act as the governing head because they explicitly give the whole expression its categorial and part-of-speech standing. If a lexical root is preceded by a few prefixes and followed by several suffixes, we do not consider the order of its etymological composition but the hierarchy of syntactic values. Etymologically speaking, in ‘boldness’ the adjective ‘bold’ is primary, but in lexical analysis it is secondary because the part-of-speech value of ‘boldness’ is determined by the suffix ‘-ness’.
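The idea that the rightmost suffix governs the part-of-speech value of the whole derivative can be sketched with a toy suffix table. The table below is an illustrative assumption, not a complete English morphology:

```python
# Hypothetical suffix-as-head table: the governing suffix fixes the
# part-of-speech standing of the whole word (N = noun, A = adjective, ...).
SUFFIX_POS = {"-ness": "N", "-ate": "A", "-ize": "V", "-ly": "Adv"}

def head_category(word):
    """Return the part of speech determined by the governing suffix, if any."""
    for suffix, pos in SUFFIX_POS.items():
        if word.endswith(suffix.lstrip("-")):
            return pos
    return None

assert head_category("boldness") == "N"   # '-ness' governs 'bold'
```

A toy like this of course over-generates (any word ending in the right letters matches), but it shows the hierarchy at issue: the suffix, not the etymological root, determines the categorial value.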


Semantic Grammars


Semantic Restriction and Expansion


Grammatical dependency represents an asymmetric binary relation and as such it needs denoting by left- and right-oriented symbols. The proposed quotient-like notation brings advantages but it may sometimes be found confusing. In order to avoid undesirable arithmetic connotations it is possible to make use of other symbols. This paper supports the afore-mentioned fractional notation that opens the chance to record dependency pairs as fractions. In its notation the phrase yellow glove reads as

yellow \ glove ≈ a + N = aN

and the object phrase eat sandwiches as

eat / sandwiches ≈ V + N = Vn .

The formula yellow \ glove says that glove functions as the governing head, while the first expression yellow is its attribute serving as a dependent. It specifies the mutual subordination of noun phrases and their attributive dependents.


When considering most attributive constructions, it is evident that they function as operations of semantic restriction. The expression young women narrows its reference to girls: the class of all women is reduced to the subclass of those women that are of younger age:

a \ b = c                 young women = girls .

Its inverse operation is semantic expansion:

a-1 + c = b                 young-1 girls = women .
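Semantic restriction and expansion can be modelled informally on finite sets, treating the extension of an attribute as a set and restriction as intersection. The domain and the names below are invented for illustration; this is a sketch of the idea, not the formal calculus:

```python
# Semantic restriction a \ b = c modelled as set intersection.
women = {"Ann", "Beth", "Carol", "Dana"}   # extension of the head noun
young = {"Ann", "Beth", "Eve"}             # extension of the attribute 'young'

girls = young & women                      # young \ women = girls
assert girls == {"Ann", "Beth"}

# Semantic expansion undoes the restriction only relative to a stated
# domain: the restricted class is always a subclass of the head noun.
assert girls <= women
```

The inverse character of expansion is visible here: restriction never leads outside the head noun’s extension, so the superclass can be recovered once the domain is fixed.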


The Properties of Commutativity and Associativity


Current string theory arose from word processing in Turing automata and firmly established several preconceptions worth revisiting. The standard account of string systems in automata theory has been worked out by John E. Hopcroft and Jeffrey D. Ullman, and hence it deserves denoting as the Hopcroft-Ullman axiomatisation.8 Its axiomatic ideas operate well when applied to interpreting commands in specialised artificial programming languages but break down when tackling the intricate syntax of natural languages. They generalise the algebraic properties of concatenation but forget that in natural languages this operation does not meet the requirements of commutativity and associativity.

In general, algebraic strings include the empty string functioning as the identity element but they fail to preserve associative laws. Therefore their algebraic systems could be classified as groupoids, quasigroups, loops or ‘grammoids’, i.e. systems with non-associative operations.

String algebras are generally considered as associative systems based on the associative operation of binary concatenation: “Concatenation of languages is associative.”9 “Concatenation of strings is associative: s · (t · u) = (s · t) · u. For example, (⟨b⟩ · ⟨l⟩) · (ε · ⟨ah⟩) = ⟨bl⟩ · ⟨ah⟩ = ⟨blah⟩.”10 “The strings over an alphabet, with the concatenation operation, form an associative algebraic structure with identity element the null string—a free monoid.”11

Most authors admit that “the concatenation of languages as well as concatenation of words is associative, but not commutative.”12 However, some mathematicians have elaborated the theory of special commutative strings, which form an Abelian monoid.13 Commutative monoids are associative monoids with commutative concatenation: “A monoid whose operation is commutative is called a commutative monoid (or, less commonly, an abelian monoid).”14

String algebras are free monoids or free semigroups with an identity element ε or ∅: “In abstract algebra, the free monoid on a set is the monoid whose elements are all the finite sequences (or strings) of zero or more elements from that set, with string concatenation as the monoid operation and with the unique sequence of zero elements, often called the empty string and denoted by ε or λ, as the identity element. The free monoid on a set A is usually denoted A∗. The free semigroup on A is the subsemigroup of A∗ containing all elements except the empty string. It is usually denoted A+.”15

These standard outlines of string algebras have been developed by adding a new operation of alternation that offers the choice of two different elements. Its algebraic properties resemble logical disjunction or set-theoretical union, and so it can be written as x ∪ y = z. It is remarkable for distributivity because concatenation distributes over alternation: zx ∪ zy = z(x ∪ y). When incorporated into concatenative string systems extended to free monoids, the whole two-operation algebra can be referred to as a semi-ring. “Sets of strings with concatenation and alternation form a semiring, with concatenation (*) distributing over alternation (+); 0 is the empty set and 1 the set consisting of just the null string.”16

These quotations allow us to conclude that the traditional Hopcroft-Ullman axiomatisation defines string algebras as non-commutative, associative, unital and distributive systems:

∅ * x = x * ∅ = x  (* is a unital operation) ,

x * y ≠ y * x (* is a non-commutative operation) ,

x * (y * z) = (x * y) * z (* is an associative operation) ,

z * x ∪ z * y = z * (x ∪ y) (* distributes over ∪) .
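These four Hopcroft-Ullman axioms can be checked mechanically on ordinary character strings and on languages modelled as sets of strings; a minimal sketch, where string `+` plays the role of `*` and set union plays the role of `∪`:

```python
def concat(X, Y):
    """Concatenation of two languages (sets of strings)."""
    return {x + y for x in X for y in Y}

x, y, z = "school", "garden", "party"

assert "" + x == x + "" == x          # unital: the empty string is the identity
assert x + y != y + x                 # non-commutative
assert x + (y + z) == (x + y) + z     # associative (for flat character strings)

A, B, C = {"men"}, {"women"}, {"tall "}
assert concat(C, A) | concat(C, B) == concat(C, A | B)   # * distributes over union
```

Note that associativity holds here precisely because flat character strings forget all grouping, which is the very property the next section contests for natural languages.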


Revisiting String Theory and its Reaxiomatisation 


Most algebraic notations for syntactic strings deal only with very simple sentence structures and find it difficult to analyse more complex collocations. A considerable improvement of their efficiency can be reached by several theoretical reforms designed to express the real algebraic properties of language structures. The main reason is that when algebraic models of mathematical theorems in automata theory are applied to natural languages, they have to be adjusted to new axioms. Their algebraic properties may be verified on examples taken from Modern English or German. In order to keep their phrases apart from traditional string algebras, it is advisable to denote them as ‘conceptual strings’ and provide them with a new ‘Conceptual Reaxiomatisation’.


‘Conceptual Reaxiomatisation’ regards conceptual strings as ordered finite polynomials linked by the binary operation of concatenation of additive nature. It is indispensable to distinguish clearly between lexical grammars composing words from the set A of letters called the alphabet and syntactic grammars concatenating sentence structures from the set V of words called the vocabulary. The following examples are predominantly taken from syntactic grammars, where concatenation ‘*’ functions as a unital, non-associative, non-commutative, left-unique and right-unique binary operation. Moreover, it distributes over the logical disjunction ‘∪’, whose meaning corresponds to the conjunction ‘or’:

          ∅ * x = x * ∅ = x                                                  (* is a unital operation) ,

          ∅ * people = people * ∅ = people                          (unitality) ,

          x * y ≠ y * x                                                          (* is a non-commutative operation) ,

          school garden ≠ garden school                               (non-commutativity) ,

          x * (y * z) ≠ (x * y) * z                                          (* is a non-associative operation) ,

          (very fast) train ≠ very (fast train)                             (non-associativity) ,

          z * x ∪ z * y = z * (x ∪ y)                                      (* distributes over ∪) .

          (tall men) or (tall women) = tall men or women         (distributivity) .
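The contrast with flat strings can be sketched by modelling conceptual strings as nested tuples, so that grouping becomes part of the object itself and associativity visibly fails; a minimal illustration:

```python
# Conceptual strings as nested tuples: the grouping (parenthetisation)
# is carried by the data structure, not discarded as in flat strings.

very_fast_train = (("very", "fast"), "train")    # (very fast) train
very_fast_train2 = ("very", ("fast", "train"))   # very (fast train)

assert very_fast_train != very_fast_train2               # non-associativity
assert ("school", "garden") != ("garden", "school")      # non-commutativity
```

Flat Python strings would collapse both groupings into "very fast train"; keeping the tuples distinct is what the reaxiomatisation demands of conceptual strings.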

Let l be a mapping that assigns to every string its length expressed by the number of its elementary symbols (letters, digits, words). Then for the concatenation and right decatenation of any strings x, y, z there exist the homomorphisms

              l(x) + l(y) = l(x * y) = l(z)                                    (the additive nature of concatenation) ,

              l(z) – l(y) = l(z – y) = l(x)                                   (the subtractive nature of decatenation) .

These partial findings justify summarising the theory of conceptual strings in natural languages into the following mathematical usages and theoretical reaxiomatisation:

Let S be a string system with a binary operation of concatenation ‘×’ over a vocabulary V in a natural language. 

A string system S is unital if there exists an identity element ε such that for every element s of S the equations s · ε = ε · s = s hold good.

The concatenation of strings in S and all natural languages is not a commutative operation. As a result, all factors in strings have to preserve their standard ordering.

The concatenation of strings in natural languages is not an associative operation. Accordingly, its factors in a string have to be separated by parentheses. 

The concatenation of strings in natural languages is a right-unique operation17. It means that if the products s × t  = z and s × u  = z hold in a string system, then t = u.

The concatenation of strings in natural languages is a left-unique operation. It implies that if s × u  = z and t × u  = z are both valid in a string system, then s = t. 

If concatenation is a right-unique binary operation ‘×’ in S, then there exists a binary operation of right decatenation ‘-’ inverse to ‘×’ in S.

If concatenation is a left-unique binary operation ‘×’ in S, then there exists a binary operation ‘Ø’ of left decatenation inverse to ‘×’ in S.

If the binary concatenation s × t = z of strings in natural languages joins two strings of lengths f and g and their product z is a string of length h equal to the sum of f and g, i.e. h = f + g, then it is an operation of additive type.

If the binary decatenation z – t = s of two strings in natural languages decreases the length h of z by the length g of t so that h – g = f, then it is an operation of subtractive type.

The free groupoid over the vocabulary is not a free monoid but a free quasigroup.

If a string system represents a free unital quasigroup where s · ε = ε · s = s, it functions as a free loop.

If S = [V, *, ∪] is a string system where ∪ operates as the disjunction ‘or’ and concatenation distributes over it, then S is a quasi-ring.
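The left- and right-uniqueness axioms above can be verified exhaustively on a small free monoid of character strings; a brute-force sketch over a two-letter alphabet, with names chosen only for illustration:

```python
# Left and right uniqueness (cancellativity) is what makes decatenation a
# single-valued inverse: s + t == s + u forces t == u (right uniqueness),
# and s + u == t + u forces s == t (left uniqueness).
import itertools

alphabet = ["a", "b"]
words = ["".join(p) for n in range(3) for p in itertools.product(alphabet, repeat=n)]

for s, t, u in itertools.product(words, repeat=3):
    if s + t == s + u:
        assert t == u          # right uniqueness
    if s + u == t + u:
        assert s == t          # left uniqueness
```

The check covers all triples of words up to length two; cancellativity in the free monoid is what licenses the decatenation operations ‘–’ and ‘Ø’ as well-defined inverses.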


Axiomatisation of Semantic Grammars


These axiomatic propositions put forward a series of structural reforms enhancing the present-day theories of formal grammars.

(a) parenthetical notation: replacing the tedious tree graphs of phrase structures by parentheses,

(b) parenthetical grammars: parenthesising phrase structures so as to mark their associative coupling,

(c) decatenative operations: ensuring the left and right uniqueness of concatenative operations in

    order to introduce their left and right inverse decatenative operations,

(d) fractional notation: distinguishing the dependency heads from dependents by quotient signs,

(e) fractional grammars: opening the vertical dimension of syntactic hierarchy and plotting it with the

    horizontal string-theoretical linearity into the two-dimensional space of semantic utterances,

(f)  treating phrase structures as free quasigroups or loops thanks to the uniqueness of decatenation,

(g) extending the part-of-speech repertory by adding semantic actants,

(h) designating the part-of-speech standing by various types of letters,

(i)  transcribing all productions into equations with single-valued operations,

(j)  joining generative and recognoscative grammars into compatible systems,

(k) decomposing complex sentences into branches of a projective grammar.

Table 2. Requirement for systematic grammars


Projective Grammars


More complex phrase structures evade formal description because their complexity exceeds the potential of a given apparatus. Many difficulties stem from phrases composed of peripheral and circumstantial projections bound optionally to noun phrases and verb phrases. Table 3 shows that predication in simple sentence structures gets encapsulated into various ‘semisentences’ that function as peripheral phrases annexed to the central predication core. Their syntactic potencies were studied by the Czech linguist Ivan Poldauf18, who classified them as forms of semipredication.

Chief semipredicative constructions are developed by postpositive sequencing, which appends them as dependents to the central verbal core. The whole semipredicative unit usually develops the verb phrase in the accusative or nominative case form and is joined to the central verb by means of postpositive sequencing. This mode of branching semipredicative phrases is instanced in the third row of Table 3. The lowest fourth row is reserved for semipredication standing in anteposition and developed by sequencing from right to left. It is characteristic of attributive constructions, compounds and derivation, where the dependents precede the head. In the word cowardness the morpheme coward functions as a stem, but it depends upon the suffix -ness that acts as the semantic head determining its overall part-of-speech valence.


                           object constructions                     attributive constructions

                           tribes collect honey                     snails are slow

apposition           tribes honey-collectors                        snails slow-walkers
participle             we can see tribes collecting honey           see snails moving slowly
gerund                discuss their collecting honey                about patients getting better
infinitive              this forces tribes to collect honey         make snails (become) slow
clause                 tribes that collect honey                    snails that are slow

                           honey-collecting tribes                  slow snails

Table 3. Semipredication in constructions from objects and attributes

Table 3 demonstrates that the simple verb + object relation collect honey consists of two actants but may be encapsulated into a tree-like hierarchy of complex collocations where the object relation is expressed in alternative phrases. They depend on the syntactic functions of the verb collect, which may act as an apposition, participle, gerund or infinitive, or may be embedded into a clause. The right side of the table displays similar collocations inherent to the attributive construction slow snails or the qualifying construction Snails are slow.

Similar branching is observed also in adverbials of manner, place and time. They demonstrate types of predication classified15 as localisation and temporalisation. The adverbial constructions Children sleep well and Londoners live in London can be annexed to the central verbal core by reducing their predicative charge to appositive, participial, gerundial and infinitive semipredication. Other acceptable transformations crop up when the subsidiary sentence is reduced to a subordinate clause or when it collapses and forms a new compound or derivative lexical unit. Table 4 proves that circumstantial adverbial constructions may occupy almost all syntactic positions that are common to object and complement collocations in Table 3.



                           adverbial constructions                  circumstantial constructions

                           children sleep well                      Londoners live in London

apposition           animals running fast                           Londoners based in London
participle             we saw cats running fast                     we met friends (living) in London
infinitive              it is difficult to run fast                 it is nice to live in L.A.
gerund                Jane’s coming soon                            my staying in London
clause                 girls that do well                           inhabitants who live in London

                           fast-running animals; well-to-do,        London residents; wintertime love;
                           fast-killing dose                        New Yorkers
Table 4. Semipredication derived from adverbial constructions


Semantic Modality and Its Quantification


Quantification and Quantifiers


Quantification is a manner of linguistic predication by means of logical quantifiers and adverbials of quantity. It employs specific means in the grammar of natural languages, logic, algebra and other branches of mathematics. Introducing quantifiers helps to give a quantitative gradation to the occurrence and frequency of various elements in propositional utterances. Aristotelian syllogistics introduced the logical quantifiers “all”, “some”, “no”, and modern predicate logic incorporated two degrees of occurrence, the universal quantifier ∀ and the existential quantifier ∃. Traditional grammar distinguishes cardinal, negative, ordinal, multiplicative and partitive numbers. Modern mathematics has built its disciplinary terminology on several special types of quantifying concepts, e.g. integers and natural, rational and algebraic numbers. The principal line of division leads between enumerative and partitive quantifiers. The former behave as arithmetic systems, the latter usually observe principles of Boolean logic.
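The two degrees of occurrence in predicate logic can be illustrated with Python’s built-ins, mapping the universal quantifier ∀ onto all() and the existential quantifier ∃ onto any(); the toy domain of ages is invented for illustration:

```python
# Quantifiers over a finite domain: all() realises "all" (∀),
# any() realises "some" (∃), and not any() realises "no".
people = [2, 15, 37, 68]                       # toy domain: ages

assert any(age > 60 for age in people)         # ∃x: x > 60   ("some")
assert not all(age > 60 for age in people)     # ¬∀x: x > 60  ("not all")
assert not any(age > 100 for age in people)    # "no" — the null quantifier
```

On finite domains the Aristotelian quantifiers “all”, “some” and “no” thus reduce to simple Boolean aggregations over the elements.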

Classical logic originally operated only with two values, true and false. Any proposition was evaluated by truth functions and given one of two values. Positive truth values were written 1 or T; negative truth values were termed false or untrue and written 0 or ⊥. For these reasons the theoretical apparatus of standard logic was classified as a two-valued logic (also two-level logic or bivalent logic). A deeper look at its opposites suggests that they are associated with quantifying degrees of veracity and alethic verification. In the expressions true equation and false statement they act as quantifiers conveying a certain degree of truthfulness.

Efforts to extend the narrow range of two-valued logic finally resulted in the invention of many-valued logic and fuzzy logic. Several important milestones, however, appeared in this development. The first turning point occurred thanks to explorations into the realm of three-valued logic (also trivalent or ternary logic). Its introduction was initiated by Jan Łukasiewicz and S. C. Kleene, who broadened the pair of true and false by adding the third value of unknown.19 Its disadvantage was that unknown lacked a polar opposite. Its value can organically function only in the quaternary of four values (known, unknown, admissible, inadmissible). The lack of coherent consistency vexed also the inventors of four-valued logic. Nuel Belnap devised a four-valued logic that counted with the values true, false, both (true and false) and neither (true nor false). Another application of four-valued logic was designed for digital circuits and calculated with the values 1 (true), 0 (false), Z (high impedance) and X (indifference).
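Belnap’s four values can be sketched as a small negation table in which negation swaps true and false but leaves both and neither fixed; the one-letter value names are an illustrative convention:

```python
# Negation in Belnap's four-valued logic:
#   T = true, F = false, B = both (true and false), N = neither.
NEG = {"T": "F", "F": "T", "B": "B", "N": "N"}

def neg(v):
    return NEG[v]

assert neg(neg("T")) == "T"     # double negation restores the value
assert neg("B") == "B"          # 'both' is its own negation
assert neg("N") == "N"          # 'neither' is its own negation
```

Even in this richer system the classical law ¬¬p = p survives, since negation is an involution on all four values.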




A special type of quantification is seen in grammatical and logical negation. In most natural languages its essence consists in the logical complement or the potencies of the so-called “null quantifier”. It reports a null degree of occurrence and existence. In formal logic a proposition p is negated by the symbols ~p or ¬p (read in both cases as ‘not p’), occasionally also p′ (read as p prime) and p-1. The elementary precept of logic is the equation ¬¬p = p, stating that the double negation of an arbitrary proposition yields its positive affirmation. This logical axiom was rejected by Brouwer’s and Heyting’s intuitionistic logic20, which claims only that ¬¬¬p = ¬p.


Bivalent or Dual Negation


Quantifiers and negation can be regarded as unary relations with one argument. A logic L is said to be closed under the unary operation of negation if there exists a negation in L for each of its elements. The basic conception of negative opposites is seen in pairs such as veil – unveil or go – stay. It forms the common case of dual negation, where (x⁻¹)⁻¹ = x. The sentence There exist many (q) swimmers (S) seems to be equivalent to the statement There exist few (q⁻¹) non-swimmers (S⁻¹).

              qS = q⁻¹S⁻¹                                many swimmers = few non-swimmers .

The values ‘true’ and ‘false’ in two-valued logic form antonyms: they function as opposites that negate one another, and the denial of either of them equals its dual antipode. Dual negation forms pairs of opposites that are linked together by opposite meanings. Negative antonyms are usually created by adding the prefixes in-, un-, a-, anti- and counter-. Their polarity is illustrated by pairs such as true – false, much – little, many – few, majority – minority, tall – short, high – low, wide – narrow, large – small or happy – sad.

              qT = (q⁻¹T₁)⁻¹                             much (q) money (T₁) = not little (q⁻¹) money ,

              qT = (q⁻¹T₂)⁻¹                             many (q) people (T₂) = not few (q⁻¹) people .

Given two opposite meanings, say much and little or many and few, they apply a sort of ‘central delimitation’ because they introduce relative quantification in respect to the customary mean. Much and many actually mean “more than the common average of cases”, while little and few indicate rarer incidence amounting to “less than the common average of cases”. Little and few function as negative duals of the positive antonyms much and many.
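The involutive law (x⁻¹)⁻¹ = x behind such dual pairs can be illustrated by a toy mapping; the word list below is a small illustrative sample, not an exhaustive lexicon.

```python
# Dual negation as an involution on antonym pairs (illustrative sample).
PAIRS = [("much", "little"), ("many", "few"), ("true", "false"),
         ("tall", "short"), ("high", "low"), ("wide", "narrow")]

DUAL = {}
for a, b in PAIRS:
    DUAL[a], DUAL[b] = b, a   # make the mapping symmetric

def dual(word):
    """Return the negative dual of a quantifying word."""
    return DUAL[word]
```

Applying `dual` twice returns the original word, which is exactly the involution (x⁻¹)⁻¹ = x.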




Quadral Quantification


Quantity can be expressed either in an enumerative way as in integers and natural numbers or in a holistic manner in terms of a part of an integral entirety. Generally, in an integral whole there are four degrees of accomplishment delimiting approximate occurrence: total, incomplete, partial and zero degree. There always exists the least lower bound (infimum) and the greatest upper bound (supremum). These bounds are related either to the whole set or only to some of its individual members.



Total degree | Incomplete degree | Partial degree | Zero degree
all girls | not all girls | some girls | no girls
every girl | not every girl | some girl | no girl
entire | not entire | a part/bit of | not a part/bit of
must do | need not do | may do | must not do
to make | not to make | to let | to forbid
it must be | it may not be | it may be | it can’t be
to begin | not to begin | to continue | to cease
to get | not to get | to keep | to lose
to come | not to come | to stay | to leave   (phasing of verbs of movement)
to learn | not to learn | to know | to forget

Table 5. Fields of quadrivalent gradation

Quadral quantification resists attempts at quantitative precision but allows subjective estimates of the whole category of entities. We may informally specify that for the universal quantifier in the collocation All (∀) dogs (x) bark (b₁) the probability p(∀xb₁) = 1, i.e. the event of barking has the highest possible probability. This means that the statement conveys a total degree of truthfulness and indicates that the universal quantifier ∀ pertains to the larger class of total quantifiers.

An opposite case occurs in the sentence structure Some dogs bite (b₂), where the subject acquires an existential quantifier ∃. Its meaning can be rewritten in algebraic symbols as ∃xb₂, implying that there exists at least one dog that bites. When observed through the prism of probability theory, the probability of the event of biting amounts to at least one occurrence, i.e. p(∃xb₂) > 0. This example makes it clear that the existential quantifier represents a special subtype of partial quantifiers.

The usual treatment of logical and mathematical classifiers seldom discusses the eventuality of negative quantifiers envisaged in the symbols ∀⁻¹x “not all dogs” and ∃⁻¹x “no dogs”. The former formula illustrates the subtype of a ‘non-universal quantifier’ that deserves designating as an incomplete quantifier ∀⁻¹. In Aristotelian syllogisms the expression Not all dogs bite was rephrased periphrastically as All dogs do not bite because natural languages generally omit special words for incomplete degrees of quantification. Statistically speaking, the statement ∀⁻¹xb₂ with an incomplete quantifier conveys the probability p(∀⁻¹xb₂) < 1.

There also exists a negative existential quantifier ∃⁻¹ denying the presence of a given attribute for any member of a class of individuals. It denotes the zero degree of quantification and hence its appropriate coinage might be the term zero quantifier (or null quantifier). It is exemplified by the collocation No dogs miaow (b₃), written as ∃⁻¹xb₃; its probability equals zero since p(∃⁻¹xb₃) = 0.

Introducing a probability function cannot be refused as a redundant futility here, because it is almost synonymous with propositional truthfulness. The denominations of total, incomplete, partial and zero quantifiers make their concepts acceptable for quantifying degrees of modality, where the probability function p(x) corresponds to the affiliated functions of feasibility and realisability.
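The correspondence between the four quadral quantifiers and probability bounds can be sketched as follows; the quantifier names come from the text, while the function itself is a hypothetical illustration.

```python
# The four quadral quantifiers read as constraints on the probability p
# that the attribute holds of a member of the class.
def satisfies(quantifier, p):
    """Check whether probability p conforms to a quadral quantifier."""
    if quantifier == "total":        # all x:      p = 1
        return p == 1
    if quantifier == "incomplete":   # not all x:  p < 1
        return p < 1
    if quantifier == "partial":      # some x:     p > 0
        return p > 0
    if quantifier == "zero":         # no x:       p = 0
        return p == 0
    raise ValueError(quantifier)
```

Note that the constraints overlap deliberately: a probability of 0.5, for instance, satisfies both the incomplete and the partial quantifier, mirroring the compatibility of not all and some.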


Quadrivalent, Quadral or Quaternary Negation


The elementary case of bivalent quantification with dual negation may be completed by more complex relations of quadral negation. Its idea is required by the Boolean concepts involved in the pair of lattice-theoretical operations join – meet or the set-theoretical operations union – intersection. It cannot be regarded as an absolute novelty. George Boole was an English mathematician working in mid-19th-century Ireland, but he had an early predecessor in Aristotle, who introduced quantitative functors such as all – some and whole – part.

Current accounts of modal logic seldom realise that the quadripartition of total, incomplete, partial and zero degrees encroaches also on the field of verbal modality. The only difference is that quantifiers specify the range of variables, while modal verbs determine the degree of accomplishment of an action. As a result, the formalism developed for describing the concepts of quantifiers may be extended successfully also to the sphere of verbal modality, which is concerned with determining various degrees of realisability. It is vital to notice that quadral negation relates modalities such as must – may (deontic modal logic), make – let (causative logic) and certainty – possibility (epistemic logic). Moreover, it may suitably cover the opposites shall – will and want – agree (volitive logic) or learn – remember (cognitive logic) and begin – continue (phasing or inchoative logic).

Quadral or quadrivalent quantification differs from bivalent antonyms by employing marginal delimitation of quantity. Whereas the dual antonyms much – little measure quantity in respect to the middle average, the quantifying functors all, some and no measure the range of variables in respect to the boundary cases of universal, partial or nullary occurrence. Generally speaking, they denote absolute, universal, occasional or nullary existence. If the dual antonym few is referred to as a negative dual of many, then it is convenient to denote the quadral antonym may as a quadral negation of must. Quadrals express the total and partial degrees of realisability of an action.

When we come across an arbitrary lexical unit and need to estimate its syntactic potencies, we have to find out its negative duals and quadrals. If we compare two arbitrary quantifying expressions r and s, they are dual antonyms if r = s⁻¹, in which case they comply with the following equations:

(qx ⊆ S) = (q⁻¹x ⊆ S⁻¹)        Many (q) people are swimmers = Few (q⁻¹) people are non-swimmers

(q⁻¹x ⊆ S) = (qx ⊆ S⁻¹)        Few (q⁻¹) people are swimmers = Many (q) people are non-swimmers

Given two arbitrary quantifying expressions r and s, they are quadral antonyms if they comply with the following four equations:

(∀x ⊆ S) = (∃⁻¹x ⊆ S⁻¹)          All people are swimmers = No people are non-swimmers

(∀⁻¹x ⊆ S) = (∃x ⊆ S⁻¹)          Not all people are swimmers = Some people are non-swimmers

(∃x ⊆ S) = (∀⁻¹x ⊆ S⁻¹)          Some people are swimmers = Not all people are non-swimmers

(∃⁻¹x ⊆ S) = (∀x ⊆ S⁻¹)          No people are swimmers = All people are non-swimmers
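The four quadral equations can be checked mechanically on a toy universe of individuals; the sets below are illustrative assumptions of mine.

```python
# Checking the four quadral equations on a toy universe.
people = {"ann", "bob", "eve"}
S = {"ann", "bob"}          # swimmers
S_c = people - S            # non-swimmers (the complement S^-1)

def forall(target):
    """Universal quantification over the universe."""
    return all(x in target for x in people)

def exists(target):
    """Existential quantification over the universe."""
    return any(x in target for x in people)

# (1) all are swimmers     <=>  none are non-swimmers
assert forall(S) == (not exists(S_c))
# (2) not all are swimmers <=>  some are non-swimmers
assert (not forall(S)) == exists(S_c)
# (3) some are swimmers    <=>  not all are non-swimmers
assert exists(S) == (not forall(S_c))
# (4) none are swimmers    <=>  all are non-swimmers
assert (not exists(S)) == forall(S_c)
```

Because S_c is the set-theoretic complement of S, all four equivalences hold for any choice of S, which is precisely what makes the quantifier pairs quadral antonyms.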


Modal Semantics


The semantic theories of grammatical modality are developed in three independent lines of research. They have become an issue of utmost interest in theoretical grammar, general semantics, modal logic, conceptual programming as well as in artificial intelligence. The most authoritative accounts of linguistic modality were recently given by F. R. Palmer21, P. Portner and A. Kratzer22. Their treatises lack mathematical precision but provide a deeper intuitive understanding of the inner structuring of semantic categories.

An alternative line of studies is conducted in the branch of modal logic, whose greatest pioneers were R. Carnap, C. I. Lewis and W. Quine. Their enquiries contributed a lot to systematising the logical and algebraic interrelations between various types of modal attitudes. They pursued different objectives and worked with a different terminological apparatus. Nevertheless, their ultimate results were mutually comparable and compatible. This is why the most urgent task to tackle in research consists in devising interdisciplinary studies enhancing the interactive convertibility of results in both directions.  

One of the promising enquiries combining linguistic, logical, mathematical and computational semantics is found in the theoretical treatise English Semantics23 (2005) by the Czech Anglicist Pavel Bělíček. His work endeavoured to clothe linguistic analysis in simple algebraic formulas and to elucidate the mathematical properties of the grammatical relations interweaving the network of semantic fields. When preparing his first textbook of English lexicology, he derived crucial pioneering ideas from Edward H. Bendix and his ‘componential analysis of general vocabulary’24. Another inspiring impetus appeared in Lakoff‘s generative semantics and his semantic equation kill = make die. Both approaches encouraged him to launch a project of the analytic decomposition of the English word stock into elementary subcategories.

Bělíček’s semantic apparatus united the findings of modal logic with linguistic enquiries and integrated them into a comprehensive scheme including many neglected areas of modality and modal attitudes. It considered sixteen types of predication in the indicative mood and coordinated them with mappings into their respective modal fields. Each type of predication induced a special projection upon the level of cognitive C-modality and volitive V-modality. An independent level of projections was found also in evaluative E-modality expressing emotional attitudes to reality.


The Quantification of Degrees in Deontic Modality


Semantic modality does not quantify variables in classes of individual members but concerns different degrees of the feasibility or realisability of actions. The collocation Patricia cooks cakes is formulated in indicative modality and presents the activity of cooking cakes as a real thing. On the other hand, the sentence Patricia can cook a cake specifies it as a desirable activity that is not beyond Patricia’s abilities. In modal logic modal verbs are treated as modal functors. In Rudolf Carnap’s enquiries the modal functor N(x) is interpreted as the necessity of an action x and the modal functor P(x) is introduced in order to denote its hypothetical possibility.


Linguistic modality expresses a huge variety of human attitudes to reality. The crucial core is sought in the field of necessity and deontic logic. The scaling of necessitation distinguishes four degrees: the total degree specifies necessity (1), the incomplete degree expresses evitability (<1), the partial degree conveys possibility (>0) and zero degree equals impossibility (0).


In the Czech language modal verbs are negated by the prefix ne-, which induces direct negation because it negates the modal verb itself. In English and some Germanic languages modal verbs are negated by not or nicht, which gives rise to ‘crossed negation’. Crossed negation in you must not leave means that not does not negate the modal verb must but the following non-modal verb leave. The Czech collocation nemusíš odejít says that your leaving is unnecessary, while the English phrase you must not leave implies that it is forbidden. The discordance between Czech and English modal negation is explained by parentheses indicating the different articulation of semantic constituents.

Czech  musím pracovat ‘I must work’                      English  I must work

            ((ne(musím)) pracovat) ‘I need not work’                  I need not work

            smím si oddechnout ‘I may relax’                          I may relax

            ((ne(smím)) si oddechnout) ‘I mustn’t relax’              I (must (not relax))


Logical properties of modal negation may be demonstrated by algebraic symbols. Let d designate a verb of action and d⁻¹ its antonym calibrated as its dual negation. Such a pair of opposites is illustrated by the verb d = to stay and its dual negation d⁻¹ = to leave.

(1)           xd = y⁻¹d⁻¹                                 he must stay = he must not leave

(2)           x⁻¹d = yd⁻¹                                he need not stay = he may leave

(3)           xd⁻¹ = y⁻¹d                               he must leave = he must not stay

(4)           x⁻¹d⁻¹ = yd                                 he need not leave = he may stay
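One way to see that the four equations hold is a toy possible-worlds reading in which, as a simplifying assumption of mine, the deontic state is modelled by a non-empty set of permitted actions:

```python
# A toy deontic model: `allowed` is the non-empty set of permitted actions.
def may(allowed, action):
    """An action is permitted when it belongs to the allowed set."""
    return action in allowed

def must(allowed, action):
    """An action is obligatory when it is the only permitted alternative."""
    return allowed == {action}

for allowed in [{"stay"}, {"leave"}, {"stay", "leave"}]:
    # (1) he must stay      = he must not leave (leave is prohibited)
    assert must(allowed, "stay") == (not may(allowed, "leave"))
    # (2) he need not stay  = he may leave
    assert (not must(allowed, "stay")) == may(allowed, "leave")
    # (3) he must leave     = he must not stay
    assert must(allowed, "leave") == (not may(allowed, "stay"))
    # (4) he need not leave = he may stay
    assert (not must(allowed, "leave")) == may(allowed, "stay")
```

The model captures the standard duality N(d) = ¬P(¬d): obligation to stay is exactly the prohibition of its dual antonym leave.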


The Phasing of Dynamic Actions


A wide field of application for quadral relationships is found in phasing logic. It is classified as a branch of temporal logic (also called time logic), which deals with various temporal aspects of actions on the axis of time. Phasing logic clearly distinguishes three moments in the accomplishment of an event or feat: inchoation (the verbs to begin, to commence), continuation (to continue) and termination (to cease, to finish).

Such a triple of phases may coordinate verbs of knowledge and learning. Let the grammatical relation begin to know be conceived as the concatenative operation begin * know = learn. Then verbs of cognition can be axiomatised by means of the following semantic equations.

   to learn = to begin to know = to cease not to know

   to remember = to continue to know = not to forget

   to forget = to cease to know = to begin not to know

The verbs to begin and to continue exhibit the relation of quadral negation denoted by the wavy sign ~. Their direct dual negations are (to begin)⁻¹ = not to begin and (to continue)⁻¹ = not to continue = to cease. These pairs of duals compose higher units of quadral negation written as ~continue = begin or reversely ~begin = continue. This semantic distinction is transferred also to their complex composites ~learn = remember or ~remember = learn. It also holds good that ~(~learn) = learn. Pairs continue – cease and remember – forget are duals linked by dual negation, while the pairs begin – continue and learn – remember are quadrals linked by quadral negation.
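The interplay of dual negation (an involution) and quadral negation (a swap of paired verbs) can be sketched as follows; the string encoding of not is an illustrative device of mine.

```python
# Dual negation prefixes or strips 'not '; quadral negation ~ swaps
# the paired phasing verbs.
def dual(verb):
    """Dual negation: to begin <-> not to begin."""
    return verb[4:] if verb.startswith("not ") else "not " + verb

QUADRAL = {"to begin": "to continue", "to continue": "to begin",
           "to learn": "to remember", "to remember": "to learn"}

def quadral(verb):
    """Quadral negation ~: to begin <-> to continue, to learn <-> to remember."""
    return QUADRAL[verb]
```

Both operations are involutions: applying either of them twice returns the original verb, matching (x⁻¹)⁻¹ = x and ~(~learn) = learn.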

When dealing with the semantic field of possession, it is advisable to introduce the symbol h for the verb to have and h⁻¹ for its dual antonym to lack. The state of ownership may arise or perish, and its continual changes are expressed by verbs of phasing. English conveys such changes by infinitive constructions where the verb h is preceded by the verbs to begin, to continue and to cease. These verbs express the states of inchoation, continuation and termination, and in connection with the verb to have they compose the semantic content of the possessive verbs to get, to keep and to lose. Such a method of lexical composition and decomposition joins isolated lexical items into an integrated network entwining the semantic field of possession.

              he gets = he begins to have

              he keeps = he continues to have

              he loses = he begins to lack = he ceases to have

Mathematically speaking, two sememes x and y are Boolean quadrals if they fulfil the following four equations concerning possessive relations:

(1)           xh = y⁻¹h⁻¹                                  he gets = he begins to have = he ceases to lack

(2)           x⁻¹h = yh⁻¹                                 he does not begin to have = he continues to lack

(3)           xh⁻¹ = y⁻¹h                                 he loses = he begins to lack = he ceases to have

(4)           x⁻¹h⁻¹ = yh                                  he does not begin to lack = he keeps = he continues to have
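Read as transitions of a possession state, the phasing verbs can be sketched as a small classifier; the boolean encoding of states is my own assumption.

```python
# Classify a change of possession state as one of the phasing verbs
# composed from begin/continue/cease + have/lack.
def phase(had_before, has_after):
    """Map a transition of the possession state to a phasing verb."""
    if not had_before and has_after:
        return "to get"     # begins to have = ceases to lack
    if had_before and has_after:
        return "to keep"    # continues to have
    if had_before and not has_after:
        return "to lose"    # begins to lack = ceases to have
    return "to lack"        # continues to lack
```

The four branches correspond one-to-one to the four equations above: each possessive verb names one cell of the Boolean square spanned by before-state and after-state.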

Such a comparison of verbs phasing actions makes it clear that the verb to keep defines the total degree (1) of accomplishment, to lose corresponds to the incomplete degree of possession (<1), to begin functions as the partial degree of possession (>0) and not to begin operates as the zero degree (0).


Active and Passive Quadrants


A brief glance at the degrees of phasing segmentation suggests that phasing logic requires its specific phasing scale of temporal events. It is beyond any doubt that the main quadral opposites are the verbs to begin and to continue, negated by their dual antonyms not to begin and to cease. Yet what is questionable is the priority of to continue in respect to the verb to begin. The main difference is that the incomplete degree is covered by special expressions while the zero degree is expressed periphrastically and lacks its peculiar lexical means.

These two manners of quantification induce an important distinction between active and passive gradation. Both fields of semantic quantification exhibit the same four grades (1)-(4), but one emphasises active intentional participation while the other relies on submissive obeisance. Phasing temporal logic lays stress on the doer’s will and resolution. It illustrates the type of active gradation and should be separated as the active quadrant playing the primary role in a semantic field. On the other hand, the quadral antonyms all – some and must – may represent passive gradation and belong to the secondary passive quadrant of quantification.


Octovalent, Octal or Octonary Negation

Combining quadrivalent relations allows us to set up more complex schemes structuring integrated semantic fields. Their composition makes it possible to construct composite patterns of octal (or octovalent) negation, which link the less congruent pairs of semantic opposites such as must – will or necessity – volition. It is defined as the relation between the active and the passive quadrant of a semantic field.

The active counterpart of classic modality is formed by the modal verbs shall and will accompanied by their dual negations shan’t and won’t. They differ from must and may by laying greater stress on human will and resolution. The verbs must and may presuppose that there exists a superordinate authority commanding the doer to perform prescribed activities. In their modal network the doer acts as a passive victim of other people’s will. On the other hand, will counts with active participation and efforts to enforce one’s wishes and desires. The verb shall exhibits semantic features of submissive obeisance but poses this modal attitude as a voluntary act complying with subjective decisions. Shall resembles must in denoting fatal necessity and will bears resemblance to may in referring to volitive subjectivity, but their pair emphasises the subjective point of view.

A deeper understanding of semantic shades in modal attitudes is acquired if modal verbs lose their natural polysemy and they are arranged in narrow juxtaposition with their respective periphrastic constructions. In the table below the four grades (1)-(4) compare active and subjective modality in the upper section in contradistinction to passive and objective modality in the lower section.


Modal verbs

Upper active quadrants:

(1) they shall stay = they will not leave
(2) they shall not stay = they will leave
(3) they will stay = they shall not leave
(4) they will not stay = they shall leave

(1) they agree to stay = they neglect to leave
(2) they do not agree to stay = they intend to leave
(3) they intend to stay = they do not agree to leave
(4) they do not intend to stay = they refuse to leave

Lower passive quadrants:

(1) I must stay = I mustn’t leave
(2) I needn’t stay = I may leave
(3) I may stay = I needn’t leave
(4) I mustn’t stay = I must leave

(1) I am obliged to stay = I am not allowed to leave
(2) I am not obliged to stay = I am allowed to leave
(3) I am allowed to stay = I am not obliged to leave
(4) I am not allowed to stay = I am obliged to leave

Table 6. Active and passive modality compared

The whole network of deontic modality is depicted in a scheme of four quadrants and eight partitions (Table 7). The left upper quadrant is reserved for conative modality with will constructions and negative shall-not constructions. They are linked by a full-line arrow denoting the relation of dual negation. On the right side this quadrant is complemented by the right upper quadrant reserved for the submissive necessity of the active and subjective type. Its lower half suggests that the category of submissive modals stands in dual opposition to refutative modality. Both upper quadrants are enclosed in separate frames linked by horizontal dashed arrows of rightward orientation. Dashed arrows denote quadral or quadrivalent negation typical of Boolean structures.

The two lower quadrants in the graph describe the passive and objective modality expressed by must and may. The left lower quadrant positions obligative modality confronted with the liberative or dispensabilitive modality present in the negative need-not constructions. On the other hand, the right lower quadrant is occupied by the permissive modality implied in the verb may. Its dual negation is provided by must not, indicating prohibitive modality. It is vital to realise that the afore-mentioned octet of conative, neglective, submissive, refutative, obligative, dispensabilitive, permissive and prohibitive modalities does not refer only to particular modal verbs but forms large categories that appear also in the fields of temporal, spatial, functional, behavioural and titulative semantics.

Table 7 outlines only the field of deontic V-modality (volitive modality) that prescribes infinitive constructions or that-clauses with conditionals and conjunctives. A similar graph can be drawn also for the corresponding C-modality (cognitive modality), where the modal verbs must, may, will and shall exhibit different meanings. Their C-modal octet conveys meanings of certainty, uncertainty, doubt, excludedness and impossibility.

Table 7. The diagram of deontic modality

Table 8. Algebraic symbols

Table 9. Essive constructions

Table 10. Spatial Semantics




Pavel Bělíček: English Semantics. The Semantic Structure of Modern English. Prague 2005, 353p.

Pavel Bělíček: Systematic Poetics III. Formal Poetics and Rhetoric. Prague 2017, 357p.

Leonard Bloomfield: Language. New York: Henry Holt, 1933.

Ronald V. Book – Friedrich Otto: String-rewriting Systems. Springer, 1993.

Richard H. Bruck: A Survey of Binary Systems. Berlin 1958.

J. P. Calbert: Toward the semantics of modality. In: J. P. Calbert & H. Vater (eds.): Aspekte der Modalität. Tübingen: Gunter Narr, 1975.

Noam Chomsky: Syntactic Structures. The Hague/Paris: Mouton, 1957.

P. Culicover: Syntax, 2nd edition. New York: Academic Press, 1982.

Seymour Ginsburg: Algebraic and Automata Theoretic Properties of Formal Languages. North-Holland, 1975.

L. Haegeman – J. Guéron: English Grammar: A Generative Perspective. Oxford, UK: Blackwell Publishers, 1999.

Michael A. Harrison: Introduction to Formal Language Theory, Addison-Wesley, 1978.

David Hays: 1960: Grouping and Dependency Theories. P-1910, RAND Corporation. 1960.

David Hays: Dependency theory: A formalism and some observations. Language, 40: 1964. 511-525. Reprint in: Syntactic Theory 1, Structuralist, ed. Fred W. Householder. Penguin, 1972.

John E. Hopcroft – Jeffrey D. Ullman: Introduction to Automata Theory, Languages, and Computation, Addison-Wesley Publishing, Reading Massachusetts, 1979.

R. Huddleston: English Grammar: An Outline. Cambridge, UK: Cambridge University Press, 1988.

F. R. Palmer: Mood and Modality. Cambridge Univ. Press, 1994, Second edition 2001.

F. R. Palmer: Modality and the English Modals. London: Routledge, 2014.        

P. Portner: Modality. Oxford: Oxford University Press, 2009. 

Lucien Tesnière: Éléments de syntaxe structurale. Paris: Klincksieck, 1959; Elements of Structural Syntax. John Benjamins, Amsterdam. 2015.

1 Emil Post: Recursive Unsolvability of a Problem of Thue. The Journal of Symbolic Logic, vol. 12, 1947: 1–11.

2 Y. Bar-Hillel: A quasi-arithmetical notation for syntactic description. Language, 29 (1), 1953: 47–58.

3 Rulon S. Wells: Immediate Constituents. Language, 23, 1947: 81–117.

4 L. Tesnière: Éléments de syntaxe structurale. Paris: Klincksieck, 1959.

5 https://en.wikipedia.org/wiki/Dependency_grammar#Dependency_vs._constituency.

6 Pavel Bělíček: Systematic Poetics III. Formal Poetics and Rhetoric. Prague 2017, 357p., p. 36, 40.

7 https://en.wikipedia.org/wiki/String_operations#String_substitution.

8 John E. Hopcroft – Jeffrey D. Ullman: Introduction to Automata Theory, Languages, and Computation, Addison-Wesley Publishing, Reading Massachusetts, 1979.

9 https://en.wikipedia.org/wiki/String_operations. 

10 https://en.wikipedia.org/wiki/String_operations.

11 https://en.wikipedia.org/wiki/Concatenation.

12 https://de.wikipedia.org/wiki/Formale_Sprache#Konkatenation.

13 David Kohel: Free Abelian monoids. In: Sage Reference Manual: Monoids. Release 7.5. 2017, pp. 11-14.

14 https://en.wikipedia.org/wiki/Monoid#Commutative_monoid.

15 M. Lothaire: Combinatorics on words. Cambridge Mathematical Library, 17. Cambridge University Press, 1997; https://en.wikipedia.org/wiki/Free_monoid.

16 https://en.wikipedia.org/wiki/Concatenation.

17 R. H. Bruck: A Survey of Binary Systems. Berlin, 1958.

18 Ivan Poldauf: Mluvnice současné angličtiny. Substantivum a substantivní větné členy. Praha: SPN, 1958, second edition 1968.

19  J. Łukasiewicz: O logice trójwartościowej. Ruch filozoficzny 5, 1920: 170–171; On three-valued logic, in L. Borkowski (ed.), Selected works by Jan Łukasiewicz, North–Holland, Amsterdam, 1970, pp. 87–88.

20 A. Heyting: Intuitionism. An introduction. North-Holland Publishing Co., Amsterdam, 1956; L. E. J. Brouwer: Collected Works, Vol. I, Amsterdam: North-Holland, 1975.

21 F. R. Palmer: Modality and the English Modals. London: Routledge, 2014.

22 A. Kratzer. The notional category of modality. In H.-J. Eikmeyer & H. Rieser (eds.): Words, Worlds, and Contexts: New Approaches in Word Semantics. Berlin: Walter de Gruyter, 1981.

23 Pavel Bělíček: English Semantics. The Semantic Structure of Modern English. Prague 2005, 353p.

24  Edward H. Bendix: Componential analysis of general vocabulary: the semantic structure of a set of verbs in English, Hindi, and Japanese. Indiana University. Research Center in Anthropology, Folklore, and Linguistics. The Hague; Bloomington: Indiana University, 1966.