Sunday, October 30, 2011

On a Combinatory Logic of Modal Operators

We can start with a simple identity or constancy operator, T(_). We could do I(_) instead, but we'll write it T(_) here. T(_) is a one-place (or, if you like, no-place) operator: it simply returns whatever is in its scope, and it can be added to anything you're working with just by taking that thing in its scope. So, for instance:

T(T) = T

And we add to this F(_), which is also a unary/nullary operator, with the following two properties when interacting with T(_):

F(T) = F
F(F) = T

It's our opposition or negation operator; it returns the opposite of whatever falls within its scope. We could write it N(_) instead, but we'll write it F(_) here. We can nest things, of course, so that, for instance, we have F(F(T)), which can easily be shown to be equivalent to T.
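
None of this is hard to check mechanically. Just for illustration, here is a bare-bones sketch in Haskell of the two operators so far; the type Op and the lowercase names t and f are mine, not part of the notation above:

-- Results of fully reduced operator strings: T or F.
data Op = T | F deriving (Show, Eq)

-- T(_): simply returns whatever is in its scope.
t :: Op -> Op
t x = x

-- F(_): returns the opposite of whatever is in its scope.
f :: Op -> Op
f T = F
f F = T

main :: IO ()
main = print (f (f T))   -- F(F(T)) reduces, in two steps, to T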

To this we can add binary operators. For instance, K(_)(_) is the operator that returns T if both things in its scope are T, and F otherwise:

K(T)(T) = T
K(x)(F) = K(F)(x) = F

In the same way, others can be defined. For instance, A(_)(_) returns T if either thing in its scope is T, and F otherwise. We could add others, like C(_)(_) and E(_)(_), but I won't go into that here.
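
Continuing the little Haskell sketch from above (again, the lowercase names are just mine), K and A come out as:

-- K(_)(_): T just when both things in its scope are T.
k :: Op -> Op -> Op
k T T = T
k _ _ = F

-- A(_)(_): T just when at least one thing in its scope is T.
a :: Op -> Op -> Op
a F F = F
a _ _ = T

-- e.g. k T (f T) reduces to F, while a T (f T) reduces to T.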

Of course, all of this assumes that all operators of a kind are interchangeable. But suppose you want to index them to some particular kind of thing? Then we can mark off differently indexed operators with numerical subscripts, like so:

K(T1)(T2)

We could, if we wished, modify the rules for T and F slightly:

Tn(Tn) = Tn
F(Tn) = Fn
F(Fn) = Tn

And if we took the index numbers to represent different propositions, we would have standard propositional logic, as the astute reader has no doubt already recognized. Standard propositional logic is really just that fragmentary variant of a sort of combinatory logic of modal operators (or truth values, which are the same thing) in which at least some modal operators are indexed to propositions. A lot of what we do in standard propositional logic is just follow procedures for taking longer strings of combined operators and reducing them to the shortest string we can -- preferably, of course, T or F.
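
That reduction procedure is itself easy to mimic. Continuing the sketch, here is one purely illustrative way to do it: strings of combined operators are built out of indexed T's and the operators above, and they reduce to T or F once we say which indexed T's currently reduce to T (the function I'm calling val below is just that assignment, and all the names are mine):

-- Strings of combined operators over indexed T's.
data Expr = TIdx Int | FOp Expr | KOp Expr Expr | AOp Expr Expr

-- Reduce an expression to T or F, given which indexed T's reduce to T.
reduce :: (Int -> Op) -> Expr -> Op
reduce val (TIdx n)   = val n
reduce val (FOp e)    = f (reduce val e)
reduce val (KOp e e') = k (reduce val e) (reduce val e')
reduce val (AOp e e') = a (reduce val e) (reduce val e')

-- e.g. reduce (\n -> if n == 1 then T else F) (KOp (TIdx 1) (FOp (TIdx 2)))
-- reduces K(T1)(F(T2)) to T, taking T1 as T and T2 as F.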

Needless to say, we could also add in other operators. For instance, we often have need for an ineliminability operator, or Box, and a consistency operator, or Diamond, which are, like T and F, nullary/unary, and are related to each other in this way:

F(□) = ◊(F)
□(F) = F(◊)
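
With something actually in their scope, say T1, these come to the familiar readings: 'not ineliminably' is 'consistently not', and 'ineliminably not' is 'not consistently':

F(□(T1)) = ◊(F(T1))
□(F(T1)) = F(◊(T1))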

Which is nice. But, of course, what one really wants is to have Boxes and Diamonds indexed to certain behaviors, doesn't one? Of course one does. We could do subscripts again, but I think superscripts end up being less confusing in the long run, because we might want to index an operator both to a kind of Box or Diamond and to a proposition, to get all the flavors of modal logic. Of course, there's no reason why Box and Diamond should have all the fun; you can index T and F, thus (for instance) yielding certain kinds of paraconsistent logic.

But there's no reason, either, to think that we can only index to propositions. We can do so just as easily with terms. And, as I have pointed out before, quantifiers are modal operators, so the universal quantifier and its dual are simply one kind of Box and Diamond, □ and ◊. And so on, and so forth, and so forth beyond that. It's all modal operators.
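
The universal quantifier and its dual, written ∀ and ∃, show exactly the same pattern of duality as Box and Diamond above:

F(∀) = ∃(F)
∀(F) = F(∃)

That is, 'not everything is thus-and-so' is 'something is not thus-and-so', and 'everything fails to be thus-and-so' is 'nothing is thus-and-so'.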