Programming Languages in Haskell

Support for EECS 662 at KU

One way to think about the world of languages (or the world in general) is in terms of formal systems. Attributed to David Hilbert and Gottlob Frege, a formal system provides mechanisms for representing and reasoning about systems. The term formal implies principled, in a mathematical sense.

Syntax

The syntax of a formal system defines the form of its expressions: what we are allowed to write in the system.

Syntax is frequently defined using a grammar as defined in Chapter 1.

The alphabet of a grammar is a set of atomic symbols representing syntax elements that cannot be decomposed further. The rules of a grammar define legal orderings of symbols. The set of strings that are in the closure of the alphabet with respect to application of grammar rules is defined as the formal language described by the grammar.

As an example, consider the grammar of a subset of propositional logic:

$$t ::= \ttrue \mid \ffalse \mid \ID \mid t \wedge t \mid t \vee t \mid t \Rightarrow t \mid t \Leftrightarrow t \mid \neg t$$

This format should be familiar from the previous chapter. The alphabet includes terminal symbols such as $\ttrue$ and $\ffalse$, but also symbols such as $\wedge$ and $\vee$. The $\ID$ term is a shorthand for all identifiers representing propositions. Grammar rules define $\Rightarrow$, $\Leftrightarrow$, $\vee$, and $\wedge$ as binary operations and $\neg$ as a unary operator. The recursive nature of grammar rules over $t$ allows arbitrary nesting of terms.
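
Because this book uses Haskell, here is a minimal sketch of how this grammar might be rendered as a Haskell data type; the type and constructor names are illustrative choices, not fixed by the grammar:

```haskell
-- Abstract syntax for the propositional logic subset above:
-- one constructor per grammar rule, with identifiers as strings.
data Term = TTrue
          | FFalse
          | Id String
          | And Term Term
          | Or Term Term
          | Implies Term Term
          | Iff Term Term
          | Neg Term
          deriving (Show, Eq)

-- The nested term (A /\ B) => B, using the recursion over t:
example :: Term
example = Implies (And (Id "A") (Id "B")) (Id "B")
```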

Inference System

Inference systems are defined by axioms and inference rules. They allow derivation of true statements from other true statements. You do this whenever you use mathematics to simplify a formula or solve an equation for a quantity. We all know, for example, that $(x+y)^2 = x^2+2xy+y^2$ holds in algebra regardless of what $x$ and $y$ represent. We could define this relationship using inference rules:

$$\frac{(x+y)^2}{x^2+2xy+y^2} \qquad \frac{x^2+2xy+y^2}{(x+y)^2}$$

or an equivalence:

$$(x+y)^2 \equiv x^2+2xy+y^2$$

The classical notation for inference rules was defined in the previous chapter, with axioms simply defined as inference rules with no antecedents. Looking at propositional logic, we start with one axiom stating that $\ttrue$ is always true:

$$\frac{}{\ttrue}$$

Nothing need be known to assert $\ttrue$ in a proof.

Other inference rules define introduction and elimination rules for various operators. Introduction rules introduce their associated operator in an expression. The introduction rule for $X\wedge Y$ is:

$$\frac{X \quad Y}{X \wedge Y}$$

If $X$ and $Y$ are both known, then $X\wedge Y$ is immediately true.

Elimination rules are the inverse of introduction rules. There are two for $X\wedge Y$:

$$\frac{X \wedge Y}{X} \qquad \frac{X \wedge Y}{Y}$$

Each rule allows one conjunct to be inferred from the conjunction: the first gives the left conjunct and the second the right.

The elimination rule for $\neg$ is the double-negation rule, which should be intuitive:

$$\frac{\neg \neg X}{X}$$

The introduction rule for $\neg$ is more interesting, as a derivation is one of its antecedents. The notation $X\vdash Y$ says that $Y$ is derivable from $X$. The antecedents of the introduction rule say that assuming $X$ gives $Y$, and that $\neg Y$ is also known. This is a contradiction, because $Y$ and $\neg Y$ cannot be simultaneously true. Thus, $X$ must be false:

$$\frac{X \vdash Y \quad \neg Y}{\neg X}$$

This is the classic proof by contradiction. It also suggests that if $\ffalse$ is ever true, we have big problems because we can derive anything.
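
Haskell can express this observation directly. Under a propositions-as-types reading, the uninhabited type Void from Data.Void plays the role of $\ffalse$, and its eliminator absurd lets us conclude anything at all from a contradiction. A minimal sketch:

```haskell
import Data.Void (Void, absurd)

-- A value of type Void is a proof of falsehood; no such value
-- exists, but if we ever held one we could conclude anything.
anything :: Void -> a
anything contradiction = absurd contradiction
```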

The rules for implication again eliminate and introduce $X\Rightarrow Y$. The elimination rule is known as modus ponens and says that if $X$ and $X\Rightarrow Y$ are known, then $Y$ is also known:

$$\frac{X \quad X \Rightarrow Y}{Y}$$

The introduction rule again has a derivation in the antecedent. If assuming $X$ allows us to derive $Y$, then we also know that $X\Rightarrow Y$:

$$\frac{X \vdash Y}{X \Rightarrow Y}$$

Finally, we have introduction and elimination rules for logical equivalence:

$$\frac{X \Rightarrow Y \quad Y \Rightarrow X}{X \Leftrightarrow Y} \qquad \frac{X \Leftrightarrow Y}{X \Rightarrow Y} \qquad \frac{X \Leftrightarrow Y}{Y \Rightarrow X}$$

Now we have a subset of the inference rules for propositional logic. How do we use them? Let’s do a quick proof showing that $B\wedge A$ follows from $A\wedge B$:

1. $A\wedge B$ (assumption)
2. $A$ ($\wedge$-elimination, 1)
3. $B$ ($\wedge$-elimination, 1)
4. $B\wedge A$ ($\wedge$-introduction, 3, 2)

Now we can say $A\wedge B \vdash B\wedge A$. We can also prove the converse:

1. $B\wedge A$ (assumption)
2. $B$ ($\wedge$-elimination, 1)
3. $A$ ($\wedge$-elimination, 1)
4. $A\wedge B$ ($\wedge$-introduction, 3, 2)

Now we can say $B\wedge A \vdash A\wedge B$. Using the implication introduction rule we can go a step further and prove logical equivalence: the two derivations give $A\wedge B \Rightarrow B\wedge A$ and $B\wedge A \Rightarrow A\wedge B$, and $\Leftrightarrow$-introduction then yields:

$$A\wedge B \Leftrightarrow B\wedge A$$
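
Under the same propositions-as-types reading, this proof is also a Haskell program: pairs model conjunction, fst and snd are the two elimination rules, pairing is introduction, and function application is modus ponens. A sketch, with names of my own choosing:

```haskell
-- Proving A /\ B |- B /\ A is writing a total function
-- from (a, b) to (b, a): both elimination rules, then introduction.
andCommutes :: (a, b) -> (b, a)
andCommutes p = (snd p, fst p)

-- Modus ponens is function application.
modusPonens :: (a -> b) -> a -> b
modusPonens f x = f x

-- Logical equivalence is a pair of implications.
iffIntro :: (a -> b) -> (b -> a) -> (a -> b, b -> a)
iffIntro f g = (f, g)
```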

There is much more we can do with inference rules and systems, but this brief demonstration should give you an idea of how these things define formal reasoning processes. Just like Lego, simple things fit together in simple ways to develop complex and elegant systems.

Semantics

A language’s semantics gives its structures meaning. When we used inference rules to define how we reason about propositional logic, we provided a reasoning mechanism without regard to meaning. We could have changed the inference rules in a very simple way and gotten something that is not at all propositional logic. Let’s say we defined a completely wrong rule for implication like this:

$$\frac{Y \quad X \Rightarrow Y}{X}$$

Clearly this is not how implication works, but it is a perfectly well-formed rule and we can reach conclusions from it. What makes it incorrect is the semantics of propositional logic. Semantics defines the meanings of language expressions using another mathematical system.

For propositional logic we can use the common notion of a truth table to define our operations:

X  Y  X $\wedge$ Y
F  F  F
F  T  F
T  F  F
T  T  T

X  Y  X $\vee$ Y
F  F  F
F  T  T
T  F  T
T  T  T

X  Y  X $\Rightarrow$ Y
F  F  T
F  T  T
T  F  F
T  T  T

X  Y  X $\Leftrightarrow$ Y
F  F  T
F  T  F
T  F  F
T  T  T

X  $\neg$ X
T  F
F  T

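These tables translate directly into a Haskell evaluator over the Term type sketched in the syntax section; eval, Env, and the treatment of unbound identifiers are my own illustrative choices:

```haskell
-- An environment assigns truth values to proposition identifiers.
type Env = [(String, Bool)]

-- Evaluate a term in an environment, reading each operator's
-- meaning straight off its truth table.
eval :: Env -> Term -> Bool
eval _   TTrue         = True
eval _   FFalse        = False
eval env (Id x)        = maybe False id (lookup x env)  -- unbound defaults to False
eval env (And l r)     = eval env l && eval env r
eval env (Or l r)      = eval env l || eval env r
eval env (Implies l r) = not (eval env l) || eval env r
eval env (Iff l r)     = eval env l == eval env r
eval env (Neg t)       = not (eval env t)
```
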
We can’t easily derive new truths using simple truth tables, but we can with the inference system. To ensure the inference system only produces correct results, we can compare it with what the truth tables specify. Let’s look again at our broken rule for implication:

$$\frac{Y \quad X \Rightarrow Y}{X}$$

The rule says that if $Y$ and $X\Rightarrow Y$ are both true, then $X$ must also be true. The truth table for $\Rightarrow$ clearly says otherwise: in its second row $Y$ is true and $X\Rightarrow Y$ is true, yet $X$ is false.
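
With the evaluator sketched above we can check the counterexample mechanically; the environment and names are again illustrative:

```haskell
-- X = False, Y = True: both antecedents of the broken rule hold...
antecedents :: Bool
antecedents = eval env (Id "Y") && eval env (Implies (Id "X") (Id "Y"))
  where env = [("X", False), ("Y", True)]

-- ...but the conclusion X evaluates to False, refuting the rule.
conclusion :: Bool
conclusion = eval [("X", False), ("Y", True)] (Id "X")
```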

Discussion

Thinking of languages and mathematical systems as formal systems will serve you well. Throughout this writing we will think of languages in terms of what they look like (syntax), how to manipulate them (inference system), and what they mean (semantics). At times the Hilbert system structure will be difficult to see, but it is always there.

Exercises