Gottlob
May 19, 2020 18:25 · 1920 words · 10 minute read
Hey, everyone, this talk is about programming from an alternate timeline. So first, some background. I’m Matt Dockrey, also known as Fish. In my day job I work on software that controls industrial measuring systems. Otherwise I’m a metalworker and a kinetic sculptor who’s been doing a lot of public art installations over the last few years. But computers were my first passion, and I certainly still love making them do weird things. So, I’ve recently been reading a bunch of primary documents in the history of math.
00:37 - And one of the things this quickly teaches you is that we’re very lucky to have notational systems as standardized as they are today. As a programmer, I was particularly interested in tracing the development of logical notation. It’s Boolean logic, so maybe it goes all the way back to George Boole himself? Not really. The notation in his works is explicitly algebraic. He’s really developing a concept of set theory and probability more than mathematical logic. He uses multiplication to mean both logical AND as well as set intersection. Addition is set union, and so on. One of the first things he derives from this is his “Fundamental law of thought”: if x and y are identical, then x times y equals x, which is the same as x squared equals x. The only solutions to this are x = 1 and x = 0, which he interprets as “everything” and “nothing”. I can’t say I’m sure I agree with his proof, but given how thoroughly zero-equals-false and one-equals-true dominate our lives today, he does seem to have stumbled into something important.
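Restating his little derivation in modern notation (my rendering, not Boole’s own):

```latex
\begin{align*}
x \cdot y &= x && \text{when $x$ and $y$ are identical, so}\\
x^2 &= x\\
x^2 - x &= 0\\
x(x - 1) &= 0\\
x &= 0 \ \text{or}\ x = 1 && \text{Boole's ``nothing'' and ``everything''}
\end{align*}
```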
01:36 - Skipping over the rest of the 19th century for a moment, the next work that dominates the logical landscape is Russell and Whitehead’s Principia Mathematica. This is an infamously complex and lengthy work, more influential in concept than in practice, but it’s also important for helping solidify notation. It rejects parentheses for grouping, however, instead using a tiered system of dots. But a lot happened between these two, including a glorious system that is almost entirely forgotten today. In 1879 Gottlob Frege published his Begriffsschrift, or “concept writing”.
02:09 - This was one of the first stabs at trying to derive basic mathematical concepts from logic, and in that sense it was immensely influential. It was also an almost complete break from any notational system that came before or after it. Often described as a flowchart or as being “two dimensional”, it really is worth studying even if you have no interest in logicism or mathematical philosophy. He describes “judgments” which are either “affirmed” or “denied”. If a statement is a judgment, it is preceded by this thick vertical line, which shows it to be something known to be affirmed.
02:42 - Without this stroke, it is a “mere combination of ideas” to be considered. Everything connected to the horizontal line is included as part of this judgment. Expressions can be negated using this symbol, the only part of his system that was adopted elsewhere. He also has “letters”, which represent a value or a judgment. They’re basic variables, except they’re not variable. Okay, now into the fun stuff. This is “conditionality”.
03:06 - To define it he sets out a truth table and says this symbol “stands for the judgment that the third of these possibilities does not take place, but one of the other three does.” While not otherwise naming it as such, he is describing the logical operator of implication here. While this is sometimes written as “If B then A”, it’s important to note that it is not the same thing as an if statement. Implication is only false if B is true but A is not. If B is false, the truth value of the operation is always true. That is, a false statement logically implies everything. Consider the statement “if it is raining, then I am wet”. It only deals with the situation where it is raining, in which case I must be wet for the statement to be true. If it isn’t raining, however, it has nothing to say. Maybe I’m dry, and maybe someone just dumped a bucket over my head.
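To pin that down, here’s implication as a quick JavaScript helper (the name `implies` is mine, not anything from Gottlob), run through the rain example:

```javascript
// Material implication "B implies A": denied only when B is affirmed and A is denied.
const implies = (b, a) => !b || a;

// "If it is raining, then I am wet."
implies(true,  true);   // raining and wet       -> true
implies(true,  false);  // raining and dry       -> false (the only false case)
implies(false, true);   // not raining, but wet  -> true
implies(false, false);  // not raining and dry   -> true
```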
03:53 - In either case the original statement is still true. I find it best to think of implication in modern terms of NOT-B OR A to keep from getting confused. Like all early logicians, Frege used this as his only logical operator beyond simple negation. And that works fine; you can formulate any statement you want with just those two. But it does seem deeply clumsy to the modern reader. This is one of the things that struck me most deeply when learning the early history: good old AND and OR aren’t that old! Mathematical logic developed out of philosophical logic, and that meant Aristotelian syllogisms. For a long time, all they were trying to do was form a rigorous version of “1. All men are mortal. 2. Socrates is a man. 3. Therefore, Socrates is mortal.” It wasn’t until mathematical logic started to be explored for its own sake that the greater expressiveness of AND and OR came to be appreciated. There was also the “generality”. In Frege’s words, “this stands for the judgment that, whatever we may take for its argument, the function is a fact.”
04:51 - This works exactly like the “universal quantifier” or “turned A” notation of predicate calculus. They’re both nothing more than a formal way to write Aristotle’s “all men are mortal”. There isn’t, as you might be expecting, an existential quantifier, such as the backward E of predicate calculus. This isn’t strictly needed, as it can be phrased in terms Frege has defined. To say “there is an immortal frog” is the same as saying “it is not true that for all frogs f, f is not immortal”. This lack was common for Frege’s era.
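As a throwaway JavaScript illustration (the frog data is obviously made up), the rephrasing is just a negation wrapped around a universal check:

```javascript
// "There is an immortal frog" written without an existential quantifier:
// it is NOT true that, for all frogs f, f is NOT immortal.
const frogs = [{ name: "Kermit", immortal: false }, { name: "Ranidas", immortal: true }];

const existsImmortalFrog = !frogs.every(f => !f.immortal);  // true; same as frogs.some(f => f.immortal)
```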
05:21 - The existential quantifier was another bit of syntactic sugar that took a while to be considered as a basic operator in its own right. The Begriffsschrift goes on to develop other terminology in terms of the above, but those are the basics. And reading it, while struggling through the interpretation, I couldn’t help but feel a bit sad that this amazing notation had never been picked up by other mathematicians. What if Russell and/or Whitehead had used it to write the Principia? What if Turing had used it when outlining the next 50 years of computer science? What would programming look like in that world? I decided I had to find out. The end result is Gottlob, my esoteric programming language.
05:57 - It is implemented in Javascript so it can be run in any browser. This was kind of a cheat, since it meant I didn’t have to write a lexer/parser – and I really love writing lexers/parsers! But since it was always going to need a custom editor for the notation, I thought it was best to keep it as accessible for people as possible. Just like there isn’t a 1-to-1 mapping between mathematical notation and real programming languages, Frege’s system had to be adapted to turn it into something that can do things instead of just proving them. I made the letters be actually variable, for instance. Following his example, these are always single characters, typeset in Fraktur.
06:33 - They must be declared using the judgment stroke. Also following his example, functions are typically named with Greek letters, but this isn’t enforced. The namespace of Greek letters is limited enough that all the built-in utility functions are given normal English names. Conditionalities remained more or less the same, just interpreted in a functional context. B is evaluated, and only if it is affirmed is A evaluated.
06:55 - The entire thing remains a logical operator, however, whose judgment is always affirmed unless B is affirmed and A is denied. Generalities were obviously destined to become a loop operator. However, they couldn’t actually be universal quantifiers; forcing these statements to loop over all the integers would present both implementation and application difficulties. In Gottlob the quantifier statement uses the pre-existing value of a and loops over every element, if it’s an array, or from 1 up to and including its value, if it’s an integer.
07:26 - Like conditionalities, the generality remains a logical operator. Its judgment is affirmed if and only if phi(a) is affirmed for each and every value of a. I also added a “matrix generality”, which instead of returning a judgment returns an array made up of the return value of every iteration. This was a complete fabrication on my part, but it proved useful and seemed true to the spirit of the alternate timeline I was trying to channel. The rest is more or less a standard programming language that uses standard mathematical notation wherever possible.
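Here’s a rough JavaScript sketch of how I think of those three constructs; the function names are mine, and this is only an approximation of what the real implementation does:

```javascript
// Conditionality: B is evaluated first, and A is evaluated only if B is affirmed.
// The whole thing is affirmed unless B is affirmed and A is denied.
function conditionality(b, a) {
  return b() ? a() : true;   // b and a are thunks so A stays unevaluated when B is denied
}

// Shared iteration rule: an array is looped over element by element,
// an integer n is looped over as 1 up to and including n.
function rangeOf(a) {
  return Array.isArray(a) ? a : Array.from({ length: a }, (_, i) => i + 1);
}

// Generality: affirmed if and only if phi is affirmed for every value of a.
function generality(a, phi) {
  return rangeOf(a).every(phi);
}

// Matrix generality: same iteration, but return the array of phi's results
// instead of a single affirmed/denied judgment.
function matrixGenerality(a, phi) {
  return rangeOf(a).map(phi);
}
```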
07:56 - The editor allows direct editing of any part of the program, with statements being entered either from the menu on the left or via keyboard shortcuts. Basic arithmetical operations are entered as Javascript, just to keep it simple. So what can you do with it? Here is a simple Sieve of Eratosthenes prime number function. The original value is saved in a letter c, which a generality then loops over. So for every number c less than or equal to A, the conditionalities must be affirmed. If c is 1 or A itself, those branches evaluate to denied, and a conditionality is always affirmed if that branch is denied. That just leaves the check that c does not divide evenly into A. If it doesn’t for all values of c, then A must be a prime.
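Here is a hedged JavaScript rendering of that structure (helper names mine; the real thing is drawn in Frege’s notation, not typed like this):

```javascript
const implies = (b, a) => !b || a;

// For every c from 1 up to and including A, the nested conditionality
//   (c is not 1) -> ((c is not A) -> (c does not divide A))
// must be affirmed; if any instance is denied, A is not prime.
// (Note: as written this also reports 1 as prime; the talk doesn't address that edge case.)
function isPrime(A) {
  for (let c = 1; c <= A; c++) {
    if (!implies(c !== 1, implies(c !== A, A % c !== 0))) {
      return false;
    }
  }
  return true;
}
```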
08:30 - This is the classic Gregorian leap year calculator. A year is a leap year if it is divisible by 4, unless it is divisible by 100, unless it is also divisible by 400. This is compactly phrased in Gottlob as such, which makes sense if you pick it apart. The inner conditionality will only be denied in the situation when the year is divisible by 4 but not by 100. Thus the outer conditionality will be denied when the year is not divisible by 4, and also when it is divisible by 100 but not by 400.
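Read that way, the whole expression can be sketched in JavaScript as one nested implication (again, my rendering of the description, not the actual Gottlob source):

```javascript
const implies = (b, a) => !b || a;

// (divisible by 4 -> divisible by 100) -> divisible by 400
// Denied exactly when the year is not divisible by 4, or is divisible by 100
// but not by 400; affirmed exactly when the year is a leap year.
function isLeapYear(y) {
  return implies(implies(y % 4 === 0, y % 100 === 0), y % 400 === 0);
}

// isLeapYear(2000) -> true, isLeapYear(1900) -> false,
// isLeapYear(2004) -> true, isLeapYear(2001) -> false
```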
So what did I learn from all this? Well, I appreciate the link between classical logic, modern logic, and programming languages a lot more now. Like a lot of science and math, it all tends to get taught in a very ahistorical way, and I wish I had learned about those connections earlier. And I definitely appreciate modern notation a lot more now, particularly the dominance of AND and OR over IMPLIES.
09:28 - I’m sure I’d get more comfortable with it if I used it all the time, but even after playing around all this time I do still find it fairly unintuitive. Discovering simplifications like that is an unsung glory of mathematical formalization. It’s easy to dismiss that as an academic thing, but writing things down in abstract notation can lead you to important discoveries. People had been working with classical syllogisms for literally two thousand years and they never noticed how much simpler it would be with AND and OR, but within decades of Boole’s first attempt those started to emerge. So, abstract notation can free you from assumptions you didn’t even know you had. Thank you.