Relation composition

☞ This page belongs to resource collections on Logic and Inquiry.

Relation composition, or the composition of relations, is the generalization of function composition, or the composition of functions. The following treatment of relation composition takes the "strongly typed" approach to relations that is outlined in the article on relation theory.

Preliminaries
There are several ways to formalize the subject matter of relations. Relations and their combinations may be described in the logic of relative terms, in set theories of various kinds, and through a broadening of category theory from functions to relations in general.

The first order of business is to define the operation on relations that is variously known as the composition of relations, relational composition, or relative multiplication. In approaching the more general constructions, it pays to begin with the composition of dyadic and triadic relations.

As an incidental observation on usage, there are many different conventions of syntax for denoting the application and composition of relations, with perhaps even more options in general use than are common for the application and composition of functions. In this case there is little chance of standardization, since the convenience of conventions is relative to the context of use, and the same writers use different styles of syntax in different settings, depending on the ease of analysis and computation.

These two factors together generate the following four styles of syntax:

Definition
A notion of relational composition is to be defined that generalizes the usual notion of functional composition:

Note on notation. The ordinary symbol for functional composition is the composition sign, a small circle "$$\circ$$" written between the names of the functions being composed, as $$f \circ g,$$ but the sign is often omitted if there is no risk of confusing the composition of functions with their algebraic product. In contexts where both compositions and products occur, either the composition is marked on each occasion or else the product is marked by means of a center dot "$$\cdot$$", as $$f \cdot g.$$

Generalizing the paradigm along parallel lines, the composition of a pair of dyadic relations is formulated in the following two ways:
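The general idea can be sketched in a few lines of Python, with dyadic relations modeled as finite sets of ordered pairs. The function name and the toy data here are our own illustration, not the article's.

```python
# A minimal sketch: dyadic relations as finite sets of ordered pairs.
# (x, z) belongs to G o H just in case there is a middle element y
# with (x, y) in G and (y, z) in H.

def compose(G, H):
    """Relational composition G o H of two dyadic relations."""
    return {(x, z) for (x, y) in G for (w, z) in H if y == w}

G = {(1, 2), (2, 3)}
H = {(2, 5), (3, 6)}
print(compose(G, H))  # {(1, 5), (2, 6)}
```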

Geometric construction
There is a neat way of defining relational compositions in geometric terms, not only showing their relationship to the projection operations that come with any cartesian product, but also suggesting natural directions for generalizing relational compositions beyond the dyadic case, and even beyond relations that have any fixed arity, in effect, to the general case of formal languages as generalized relations.

This way of looking at relational compositions is sometimes referred to as Tarski's Trick, on account of his having put it to especially good use in his work (Ulam and Bednarek, 1977). It supplies the imagination with a geometric way of visualizing the relational composition of a pair of dyadic relations, doing this by attaching concrete imagery to the basic set-theoretic operations, namely, intersections, projections, and a certain class of operations inverse to projections, here called tacit extensions.

The stage is set for Tarski's trick by highlighting the links between two topics that are likely to appear wholly unrelated at first, namely:


 * The use of logical conjunction, as denoted by the symbol $$\land,\!$$ in expressions of the form $$F(x, y, z) = G(x, y) \land H(y, z),\!$$ to define a triadic relation $$F\!$$ in terms of a pair of dyadic relations $$G\!$$ and $$H.\!$$


 * The concepts of dyadic projection and projective determination, that are invoked in the "weak" notion of projective reducibility.

The relational composition $$G \circ H\!$$ of a pair of dyadic relations $$G\!$$ and $$H\!$$ will be constructed in three stages: first, by taking the tacit extensions of $$G\!$$ and $$H\!$$ to triadic relations that reside in the same space; next, by taking the intersection of these extensions, tantamount to the maximal triadic relation that is consistent with the prima facie dyadic relation data; finally, by projecting this intersection on a suitable plane to form a third dyadic relation, constituting in fact the relational composition $$G \circ H\!$$ of the relations $$G\!$$ and $$H.\!$$
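The three stages just described can be sketched as follows, assuming finite relations given as sets of pairs; the function names and toy data are our own, not the article's.

```python
# Stage 1: tacit extensions of G ⊆ X×Y and H ⊆ Y×Z into X×Y×Z.
def te_xy_z(G, Z):
    return {(x, y, z) for (x, y) in G for z in Z}

def te_yz_x(H, X):
    return {(x, y, z) for (y, z) in H for x in X}

# Stage 3: projection of a triadic relation on the XZ plane.
def proj_xz(L):
    return {(x, z) for (x, y, z) in L}

X, Z = {1, 2}, {5, 6}
G = {(1, 3), (2, 4)}    # G ⊆ X×Y with Y = {3, 4}
H = {(3, 5), (4, 6)}    # H ⊆ Y×Z

# Stage 2 is the intersection (&) of the two tacit extensions.
composite = proj_xz(te_xy_z(G, Z) & te_yz_x(H, X))
print(composite)  # {(1, 5), (2, 6)}
```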

The construction of a relational composition in a specifically mathematical setting normally begins with mathematical relations at a higher level of abstraction than the corresponding objects in linguistic or logical settings. This is due to the fact that mathematical objects are typically specified only up to isomorphism, as the conventional saying goes, that is, any objects that have the "same form" are generally regarded as being the same thing, for most intents and mathematical purposes. Thus the mathematical construction of a relational composition begins by default with a pair of dyadic relations that reside, without loss of generality, in the same plane, say, $$G, H \subseteq X \times Y,\!$$ as shown in Figure 1.

The dyadic relations $$G\!$$ and $$H\!$$ cannot be composed at all at this point, not without additional information or further stipulation. In order for their relational composition to be possible, one of two types of case must obtain:


 * The first type of case occurs when $$X = Y.\!$$ In this case, both of the compositions $$G \circ H\!$$ and $$H \circ G\!$$ are defined.


 * The second type of case occurs when $$X\!$$ and $$Y\!$$ are distinct, but when it nevertheless makes sense to speak of a dyadic relation $$\hat{H}\!$$ that is isomorphic to $$H,\!$$ but living in the plane $$YZ,\!$$ that is, in the space of the cartesian product $$Y \times Z,\!$$ for some set $$Z.\!$$

Whether you view isomorphic things to be the same things or not, you still have to specify the exact isomorphisms that are needed to transform any given representation of a thing into a required representation of the same thing. Let us imagine that we have done this, and say how later:

With the required spaces carefully swept out, the stage is set for the presentation of Tarski's trick, and the invocation of the following symbolic formula, claimed to be a definition of the relational composition $$P \circ Q\!$$ of a pair of dyadic relations $$P, Q \subseteq X \times X.\!$$


 * Definition. $$P \circ Q = \mathrm{proj}_{13} (P \times X ~\cap~ X \times Q).\!$$

To get the drift of this definition one needs to understand that it comes from a point of view that regards all dyadic relations as covered well enough by subsets of a suitable cartesian square and thus of the form $$L \subseteq X \times X.\!$$ So, if one has started out with a dyadic relation of the shape $$L \subseteq U \times V,\!$$ one merely lets $$X = U \cup V,\!$$ trading in the initial $$L\!$$ for a new $$L \subseteq X \times X\!$$ as need be.

The projection $$\mathrm{proj}_{13}\!$$ is just the projection of the cartesian cube $$X \times X \times X\!$$ on the space of shape $$X \times X\!$$ that is spanned by the first and the third domains, but since they now have the same names and the same contents it is necessary to distinguish them by numbering their relational places.

Finally, the cartesian product sign "$$\times\!$$" is extended to signify two other products with respect to a dyadic relation $$L \subseteq X \times X\!$$ and a subset $$W \subseteq X,\!$$ as follows:


 * Definition. $$L \times W ~=~ \{ (x, y, z) \in X^3 ~:~ (x, y) \in L ~\mathrm{and}~ z \in W \}.\!$$


 * Definition. $$W \times L ~=~ \{ (x, y, z) \in X^3 ~:~ x \in W ~\mathrm{and}~ (y, z) \in L \}.\!$$

Applying these definitions to the case $$P, Q \subseteq X \times X,\!$$ the two dyadic relations whose relational composition $$P \circ Q \subseteq X \times X\!$$ is about to be defined, one finds:


 * $$P \times X ~=~ \{ (x, y, z) \in X^3 ~:~ (x, y) \in P ~\mathrm{and}~ z \in X \},\!$$


 * $$X \times Q ~=~ \{ (x, y, z) \in X^3 ~:~ x \in X ~\mathrm{and}~ (y, z) \in Q \}.\!$$

These are just the appropriate special cases of the tacit extensions already defined.


 * $$P \times X ~=~ \mathrm{te}_{12}^3 (P),~\!$$


 * $$X \times Q ~=~ \mathrm{te}_{23}^1 (Q).~\!$$

In summary, then, the expression:


 * $$\mathrm{proj}_{13} (P \times X ~\cap~ X \times Q)\!$$

is equivalent to the expression:


 * $$\mathrm{proj}_{13} (\mathrm{te}_{12}^3 (P) ~\cap~ \mathrm{te}_{23}^1 (Q))\!$$

and this form is generalized, although, relative to one's school of thought, perhaps inessentially so, by the following form:


 * Definition. $$P \circ Q ~=~ \mathrm{proj}_{XZ} (\mathrm{te}_{XY}^Z (P) ~\cap~ \mathrm{te}_{YZ}^X (Q)).\!$$
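Under those definitions, Tarski's trick admits a direct transcription into code. The following sketch uses our own function names and toy data over a single domain $$X,\!$$ checking the formula $$\mathrm{proj}_{13} (P \times X ~\cap~ X \times Q).\!$$

```python
# The two extended products and the projection from the definitions above.
def times_right(P, X):
    # P × X = {(x, y, z) : (x, y) in P and z in X}
    return {(x, y, z) for (x, y) in P for z in X}

def times_left(X, Q):
    # X × Q = {(x, y, z) : x in X and (y, z) in Q}
    return {(x, y, z) for (y, z) in Q for x in X}

def proj13(L):
    # Project a triadic relation on its first and third relational places.
    return {(x, z) for (x, y, z) in L}

X = {1, 2, 3, 4}
P = {(1, 2), (3, 4)}
Q = {(2, 3), (4, 1)}
print(proj13(times_right(P, X) & times_left(X, Q)))  # {(1, 3), (3, 1)}
```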

Figure 3 presents a geometric picture of what is involved in formulating a definition of the triadic relation $$F \subseteq X \times Y \times Z\!$$ by way of a conjunction between the dyadic relation $$G \subseteq X \times Y\!$$ and the dyadic relation $$H \subseteq Y \times Z,\!$$ as done for example by means of an expression of the following form:


 * $$F(x, y, z) ~=~ G(x, y) \land H(y, z).\!$$

To interpret the Figure, visualize the triadic relation $$F \subseteq X \times Y \times Z\!$$ as a body in $$XYZ\!$$-space, while $$G\!$$ is a figure in $$XY\!$$-space and $$H\!$$ is a figure in $$YZ\!$$-space.

The dyadic projections that accompany a triadic relation over $$X, Y, Z\!$$ are defined as follows:


 * $$\mathrm{proj}_{XY} (L) ~=~ \{ (x, y) \in X \times Y : (x, y, z) \in L ~\text{for some}~ z \in Z \},\!$$


 * $$\mathrm{proj}_{XZ} (L) ~=~ \{ (x, z) \in X \times Z : (x, y, z) \in L ~\text{for some}~ y \in Y \},\!$$


 * $$\mathrm{proj}_{YZ} (L) ~=~ \{ (y, z) \in Y \times Z : (x, y, z) \in L ~\text{for some}~ x \in X \}.\!$$
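In code, each dyadic projection is a one-line comprehension; this sketch (our own names and data) applies all three to a two-element triadic relation.

```python
# The three dyadic projections of a triadic relation L ⊆ X×Y×Z.
def proj_xy(L):
    return {(x, y) for (x, y, z) in L}

def proj_xz(L):
    return {(x, z) for (x, y, z) in L}

def proj_yz(L):
    return {(y, z) for (x, y, z) in L}

L = {(1, 2, 3), (1, 4, 5)}
print(proj_xy(L))  # {(1, 2), (1, 4)}
print(proj_xz(L))  # {(1, 3), (1, 5)}
print(proj_yz(L))  # {(2, 3), (4, 5)}
```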

For many purposes it suffices to indicate the dyadic projections of a triadic relation $$L\!$$ by means of the briefer equivalents listed next:


 * $$L_{XY} ~=~ \mathrm{proj}_{XY}(L),\!$$


 * $$L_{XZ} ~=~ \mathrm{proj}_{XZ}(L),\!$$


 * $$L_{YZ} ~=~ \mathrm{proj}_{YZ}(L).\!$$

In light of these definitions, $$\mathrm{proj}_{XY}\!$$ is a mapping from the set $$\mathcal{L}_{XYZ}\!$$ of triadic relations over the domains $$X, Y, Z\!$$ to the set $$\mathcal{L}_{XY}\!$$ of dyadic relations over the domains $$X, Y,\!$$ with similar relationships holding for the other projections. To formalize these relationships in a concise but explicit manner, it serves to add a few more definitions.

The set $$\mathcal{L}_{XYZ},~\!$$ whose members are just the triadic relations over $$X, Y, Z,\!$$ can be recognized as the set of all subsets of the cartesian product $$X \times Y \times Z,\!$$ also known as the power set of $$X \times Y \times Z,\!$$ and notated here as $$\mathrm{Pow} (X \times Y \times Z).\!$$


 * $$\mathcal{L}_{XYZ} ~=~ \{ L : L \subseteq X \times Y \times Z \} ~=~ \mathrm{Pow} (X \times Y \times Z).\!$$

Likewise, the power sets of the pairwise cartesian products encompass all the dyadic relations on pairs of distinct domains that can be chosen from $$\{ X, Y, Z \}.\!$$


 * $$\mathcal{L}_{XY} ~=~ \{L : L \subseteq X \times Y \} ~=~ \mathrm{Pow} (X \times Y),~\!$$


 * $$\mathcal{L}_{XZ} ~=~ \{L : L \subseteq X \times Z \} ~=~ \mathrm{Pow} (X \times Z),~\!$$


 * $$\mathcal{L}_{YZ} ~=~ \{L : L \subseteq Y \times Z \} ~=~ \mathrm{Pow} (Y \times Z).~\!$$

In mathematics, the inverse relation corresponding to a projection map is usually called an extension. To avoid confusion with other senses of the word, however, it is probably best for the sake of this discussion to stick with the more specific term tacit extension.

Given three sets, $$X, Y, Z,\!$$ and three dyadic relations,


 * $$U \subseteq X \times Y,~\!$$


 * $$V \subseteq X \times Z,~\!$$


 * $$W \subseteq Y \times Z,~\!$$

the tacit extensions, $$\mathrm{te}_{XY}^Z, \mathrm{te}_{XZ}^Y, \mathrm{te}_{YZ}^X,~\!$$ of $$U, V, W,\!$$ respectively, are defined as follows:


 * $$\mathrm{te}_{XY}^Z (U) ~=~ \{ (x, y, z) : (x, y) \in U \},\!$$


 * $$\mathrm{te}_{XZ}^Y (V) ~=~ \{ (x, y, z) : (x, z) \in V \},\!$$


 * $$\mathrm{te}_{YZ}^X (W) ~=~ \{ (x, y, z) : (y, z) \in W \}.\!$$

So long as the intended indices attaching to the tacit extensions can be gathered from context, it is usually clear enough to use the abbreviated forms, $$\mathrm{te}(U), \mathrm{te}(V), \mathrm{te}(W).\!$$

The definition and illustration of relational composition presently under way makes use only of the tacit extension of $$G \subseteq X \times Y\!$$ to $$\mathrm{te}(G) \subseteq X \times Y \times Z\!$$ and the tacit extension of $$H \subseteq Y \times Z\!$$ to $$\mathrm{te}(H) \subseteq X \times Y \times Z.\!$$

Geometric illustrations of $$\mathrm{te}(G)\!$$ and $$\mathrm{te}(H)\!$$ are afforded by Figures 4 and 5, respectively.

A geometric interpretation can now be given that fleshes out in graphic form the meaning of a formula like the following:


 * $$F(x, y, z) ~=~ G(x, y) \land H(y, z).\!$$

The conjunction that is indicated by "$$\land\!$$" corresponds as usual to an intersection of two sets; in this case, however, it is the intersection of the tacit extensions $$\mathrm{te}(G)\!$$ and $$\mathrm{te}(H).\!$$

Algebraic construction
The transition from a geometric picture of relation composition to an algebraic formulation is accomplished through the introduction of coordinates, in other words, identifiable names for the objects that are related through the various forms of relations, dyadic and triadic in the present case. Adding coordinates to the running Example produces the following Figure:

Thinking of relations in operational terms is facilitated by using variant notations for ordered tuples and sets of ordered tuples, namely, the ordered pair $$(x, y)\!$$ is written $$x\!:\!y,\!$$ the ordered triple $$(x, y, z)\!$$ is written $$x\!:\!y\!:\!z,\!$$ and so on, and a set of tuples is conceived as a logical-algebraic sum, which can be written out in the smaller finite cases in forms like $$a\!:\!b ~+~ b\!:\!c ~+~ c\!:\!d\!$$ and so on.

For example, translating the relations $$F \subseteq X \times Y \times Z, ~ G \subseteq X \times Y, ~ H \subseteq Y \times Z\!$$ into this notation produces the following summary of the data:

As often happens with abstract notations for functions and relations, the type information, in this case, the fact that $$G\!$$ and $$H\!$$ live in different spaces, is left implicit in the context of use.

Let us now verify that all of the proposed definitions, formulas, and other relationships check out against the concrete data of the current composition example. The ultimate goal is to develop a clearer picture of what is going on in the formula that expresses the relational composition of a couple of dyadic relations in terms of the medial projection of the intersection of their tacit extensions:

Here is the big picture, with all the pieces in place:

All that remains is to check the following collection of data and derivations against the situation represented in Figure 8.

Matrix representation
We have it within our reach to pick up another way of representing dyadic relations, namely, the representation as logical matrices, and also to grasp the analogy between relational composition and ordinary matrix multiplication as it appears in linear algebra.

First of all, while we still have the data of a very simple concrete case in mind, let us reflect on what we did in our last Example in order to find the composition $$G \circ H\!$$ of the dyadic relations $$G\!$$ and $$H.\!$$

Here is the setup that we had before:

Let us recall the rule for finding the relational composition of a pair of dyadic relations. Given the dyadic relations $$P \subseteq X \times Y\!$$ and $$Q \subseteq Y \times Z,\!$$ the composition of $$P ~\text{on}~ Q\!$$ is written as $$P \circ Q,\!$$ or more simply as $$PQ,\!$$ and obtained as follows:

To compute $$PQ,\!$$ in general, where $$P\!$$ and $$Q\!$$ are dyadic relations, simply multiply out the two sums in the ordinary distributive algebraic way, but subject to the following rule for finding the product of two elementary relations of shapes $$a:b\!$$ and $$c:d.\!$$
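That rule, under the standard convention that the product of $$a:b\!$$ and $$c:d\!$$ is $$a:d\!$$ when $$b = c\!$$ and vanishes otherwise, can be sketched as follows. Pairs stand in for elementary relations, and the toy data is chosen to be consistent with the Example's result $$G \circ H = 4:4.\!$$

```python
def product(e1, e2):
    # (a:b)(c:d) = a:d if b == c, otherwise the product vanishes.
    (a, b), (c, d) = e1, e2
    return {(a, d)} if b == c else set()

def compose_sums(P, Q):
    # Multiply out two logical sums of elementary relations term by term,
    # aggregating the surviving products by set union (logical disjunction).
    result = set()
    for e1 in P:
        for e2 in Q:
            result |= product(e1, e2)
    return result

G = {(4, 3), (4, 4), (4, 5)}   # G = 4:3 + 4:4 + 4:5 (toy data)
H = {(3, 4), (4, 4), (5, 4)}   # H = 3:4 + 4:4 + 5:4 (toy data)
print(compose_sums(G, H))      # {(4, 4)}, that is, G o H = 4:4
```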

To find the relational composition $$G \circ H,\!$$ one may begin by writing it as a quasi-algebraic product:

Multiplying this out in accord with the applicable form of distributive law one obtains the following expansion:

Applying the rule that determines the product of elementary relations produces the following array:

Since the plus sign in this context represents an operation of logical disjunction or set-theoretic aggregation, all of the positive multiplicities count as one, and this gives the ultimate result:

With an eye toward extracting a general formula for relation composition, viewed here on analogy to algebraic multiplication, let us examine what we did in multiplying the dyadic relations $$G\!$$ and $$H\!$$ together to obtain their relational composite $$G \circ H.\!$$

Given the space $$X = \{ 1, 2, 3, 4, 5, 6, 7 \},\!$$ whose cardinality $$|X|\!$$ is $$7,\!$$ there are $$|X \times X| = |X| \cdot |X|\!$$ $$=\!$$ $$7 \cdot 7 = 49\!$$ elementary relations of the form $$i:j,\!$$ where $$i\!$$ and $$j\!$$ range over the space $$X.\!$$ Although they might be organized in many different ways, it is convenient to regard the collection of elementary relations as arranged in a lexicographic block of the following form:

The relations $$G\!$$ and $$H\!$$ may then be regarded as logical sums of the following forms:

The notation $$\textstyle\sum_{ij}\!$$ indicates a logical sum over the collection of elementary relations $$i\!:\!j\!$$ while the factors $$G_{ij}\!$$ and $$H_{ij}\!$$ are values in the boolean domain $$\mathbb{B} = \{ 0, 1 \}~\!$$ that are called the coefficients of the relations $$G\!$$ and $$H,\!$$ respectively, with regard to the corresponding elementary relations $$i\!:\!j.\!$$

In general, for a dyadic relation $$L,\!$$ the coefficient $$L_{ij}\!$$ of the elementary relation $$i\!:\!j\!$$ in the relation $$L\!$$ will be $$0\!$$ or $$1,\!$$ respectively, as $$i\!:\!j\!$$ is excluded from or included in $$L.\!$$

With these conventions in place, the expansions of $$G\!$$ and $$H\!$$ may be written out as follows:

Stripping down to the bare essentials, one obtains the following matrices of coefficients for the relations $$G\!$$ and $$H.\!$$

These are the logical matrix representations of the dyadic relations $$G\!$$ and $$H.\!$$

If the dyadic relations $$G\!$$ and $$H\!$$ are viewed as logical sums then their relational composition $$G \circ H\!$$ can be regarded as a product of sums, a fact that can be indicated as follows:

The composite relation $$G \circ H\!$$ is itself a dyadic relation over the same space $$X,\!$$ in other words, $$G \circ H \subseteq X \times X,\!$$ and this means that $$G \circ H\!$$ must be amenable to being written as a logical sum of the following form:

In this formula, $$(G \circ H)_{ij}\!$$ is the coefficient of $$G \circ H\!$$ with respect to the elementary relation $$i\!:\!j.\!$$

One of the best ways to reason out what $$G \circ H\!$$ should be is to ask oneself what its coefficient $$(G \circ H)_{ij}\!$$ should be for each of the elementary relations $$i\!:\!j\!$$ in turn.

So let us pose the question:

In order to answer this question, it helps to realize that the indicated product given above can be written in the following equivalent form:

A moment's thought will tell us that $$(G \circ H)_{ij} = 1\!$$ if and only if there is an element $$k\!$$ in $$X\!$$ such that $$G_{ik} = 1\!$$ and $$H_{kj} = 1.\!$$

Consequently, we have the result:

This follows from the properties of boolean arithmetic, specifically, from the fact that the product $$G_{ik} H_{kj}\!$$ is $$1\!$$ if and only if both $$G_{ik}\!$$ and $$H_{kj}\!$$ are $$1\!$$ and from the fact that $$\textstyle\sum_{k} F_{k}\!$$ is equal to $$1\!$$ just in case some $$F_{k}\!$$ is $$1.\!$$

All that remains in order to obtain a computational formula for the relational composite $$G \circ H\!$$ of the dyadic relations $$G\!$$ and $$H\!$$ is to collect the coefficients $$(G \circ H)_{ij}\!$$ as $$i\!$$ and $$j\!$$ range over $$X.\!$$

This is the logical analogue of matrix multiplication in linear algebra, the difference in the logical setting being that all of the operations performed on coefficients take place in a system of boolean arithmetic where summation corresponds to logical disjunction and multiplication corresponds to logical conjunction.

By way of disentangling this formula, one may notice that the form $$\textstyle \sum_{k} G_{ik} H_{kj}\!$$ is what is usually called a scalar product. In this case it is the scalar product of the $$i^\text{th}\!$$ row of $$G\!$$ with the $$j^\text{th}\!$$ column of $$H.\!$$

To make this statement more concrete, let us go back to the examples of $$G\!$$ and $$H\!$$ we came in with:

The formula for computing $$G \circ H\!$$ says the following:

As it happens, it is possible to make exceedingly light work of this example, since there is only one row of $$G\!$$ and one column of $$H\!$$ that are not all zeroes. Taking the scalar product, in a logical way, of the fourth row of $$G\!$$ with the fourth column of $$H\!$$ produces the sole non-zero entry for the matrix of $$G \circ H.\!$$
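The computation just described can be checked mechanically with a boolean matrix multiplication. The sketch below (our own code) builds $$7 \times 7\!$$ logical matrices whose only non-zero entries lie in the fourth row of $$G\!$$ and the fourth column of $$H,\!$$ and confirms that the product has a single non-zero entry.

```python
def bool_matmul(G, H):
    # (G o H)_ij = OR over k of (G_ik AND H_kj), in boolean arithmetic.
    n = len(G)
    return [[int(any(G[i][k] and H[k][j] for k in range(n)))
             for j in range(n)]
            for i in range(n)]

n = 7
G = [[0] * n for _ in range(n)]
H = [[0] * n for _ in range(n)]
# Toy entries chosen to match the stated pattern: only row 4 of G and
# column 4 of H are non-zero (rows/columns 1..7 map to indices 0..6).
for j in (3, 4, 5):
    G[4 - 1][j - 1] = 1   # G contains 4:3, 4:4, 4:5
    H[j - 1][4 - 1] = 1   # H contains 3:4, 4:4, 5:4

GH = bool_matmul(G, H)
print(GH[4 - 1][4 - 1])    # 1, the sole non-zero entry: G o H = 4:4
print(sum(map(sum, GH)))   # 1
```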

Graph-theoretic picture
There is another form of representation for dyadic relations that is useful to keep in mind, especially for its ability to render the logic of many complex formulas almost instantly understandable to the mind's eye. This is the representation in terms of bipartite graphs, or bigraphs for short.

Here is what $$G\!$$ and $$H\!$$ look like in the bigraph picture:

These graphs may be read to say:

To form the composite relation $$G \circ H,\!$$ one simply follows the bigraph for $$G\!$$ by the bigraph for $$H,\!$$ here arranging the bigraphs in order down the page, and then treats any non-empty set of paths of length two between two nodes as being equivalent to a single directed edge between those nodes in the composite bigraph for $$G \circ H.\!$$

Here's how it looks in pictures:

Once again we find that $$G \circ H = 4:4.\!$$

We have now seen three different representations of dyadic relations. If one has a strong preference for letters, or numbers, or pictures, then one may be tempted to take one or another of these as being canonical, but each of them will be found to have its peculiar advantages and disadvantages in any given application, and the maximum advantage is therefore approached by keeping all three of them in mind.

To see the promised utility of the bigraph picture of dyadic relations, let us devise a slightly more complex example of a composition problem, and use it to illustrate the logic of the matrix multiplication formula.

Keeping to the same space $$X = \{ 1, 2, 3, 4, 5, 6, 7 \},\!$$ define the dyadic relations $$M, N \subseteq X \times X\!$$ as follows:

Here are the bigraph pictures:

To form the composite relation $$M \circ N,\!$$ one simply follows the bigraph for $$M\!$$ by the bigraph for $$N,\!$$ arranging the bigraphs in order down the page, and then counts any non-empty set of paths of length two between two nodes as being equivalent to a single directed edge between those two nodes in the composite bigraph for $$M \circ N.\!$$

Here's how it looks in pictures:

Let us hark back to that mysterious matrix multiplication formula, and see how it appears in the light of the bigraph representation.

The coefficient of the composition $$M \circ N\!$$ between $$i\!$$ and $$j\!$$ in $$X\!$$ is given as follows:

Graphically interpreted, this is a sum over paths. Starting at the node $$i,\!$$ $$M_{ik}\!$$ being $$1\!$$ indicates that there is an edge in the bigraph of $$M\!$$ from node $$i\!$$ to node $$k\!$$ and $$N_{kj}\!$$ being $$1\!$$ indicates that there is an edge in the bigraph of $$N\!$$ from node $$k\!$$ to node $$j.\!$$ So the $$\textstyle\sum_{k}\!$$ ranges over all possible intermediaries $$k,\!$$ ascending from $$0\!$$ to $$1\!$$ just as soon as there happens to be a path of length two between nodes $$i\!$$ and $$j.\!$$

It is instructive at this point to compute the other possible composition that can be formed from $$M\!$$ and $$N,\!$$ namely, the composition $$N \circ M,\!$$ that takes $$M\!$$ and $$N\!$$ in the opposite order. Here is the graphic computation:

In sum, $$N \circ M = 0.\!$$ This example affords sufficient evidence that relational composition, just like its kindred, matrix multiplication, is a non-commutative algebraic operation.
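The path-through-a-middle-node reading translates directly into code. The toy relations below are our own, chosen so that one order of composition is non-empty while the other is empty, echoing the non-commutativity just observed.

```python
def compose(P, Q):
    # (i, j) is in P o Q when some k gives an edge i -> k in P and an
    # edge k -> j in Q, i.e. when a path of length two joins i to j.
    return {(i, j) for (i, k) in P for (m, j) in Q if k == m}

M = {(1, 2), (2, 3)}
N = {(2, 4), (3, 4)}
print(compose(M, N))  # {(1, 4), (2, 4)} : paths 1-2-4 and 2-3-4
print(compose(N, M))  # set() : no length-two paths in this order
```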


Logical operators

 * Exclusive disjunction
 * Logical conjunction
 * Logical disjunction
 * Logical equality


 * Logical implication
 * Logical NAND
 * Logical NNOR
 * Negation

Related topics

 * Ampheck
 * Boolean domain
 * Boolean function
 * Boolean-valued function
 * Differential logic


 * Logical graph
 * Minimal negation operator
 * Multigrade operator
 * Parametric operator
 * Peirce's law


 * Propositional calculus
 * Sole sufficient operator
 * Truth table
 * Universe of discourse
 * Zeroth order logic

Relational concepts

 * Continuous predicate
 * Hypostatic abstraction
 * Logic of relatives
 * Logical matrix


 * Relation
 * Relation composition
 * Relation construction
 * Relation reduction


 * Relation theory
 * Relative term
 * Sign relation
 * Triadic relation

Information, Inquiry

 * Inquiry
 * Dynamics of inquiry


 * Semeiotic
 * Logic of information


 * Descriptive science
 * Normative science


 * Pragmatic maxim
 * Truth theory

Related articles

 * Cactus Language
 * Futures Of Logical Graphs
 * Propositional Equation Reasoning Systems


 * Differential Logic : Introduction
 * Differential Propositional Calculus
 * Differential Logic and Dynamic Systems


 * Introduction to Inquiry Driven Systems
 * Prospects for Inquiry Driven Systems
 * Inquiry Driven Systems : Inquiry Into Inquiry

Document history
Portions of the above article were adapted from the following sources under the GNU Free Documentation License, under other applicable licenses, or by permission of the copyright holders.


 * Relation Composition, InterSciWiki
 * Relation Composition, Semantic Web
 * Relation Composition, Subject Wikis
 * Relation Composition, Wikinfo
 * Relation Composition, Wikiversity
 * Relation Composition, Wikiversity Beta
 * Relation Composition, Wikipedia