Where matter and pattern meet

operads
applied category theory
language
Published 2022-11-07

Abstract

Etymologically, the word matter comes from mother and the word pattern comes from father. Like two parents, matter and pattern represent a fundamental dichotomy: matter is the pure material, unconcerned with our ideas about it; pattern is pure structure, unconcerned with what substantiates it. Considering this dichotomy brings up thoughts of Cartesian dualism. Like Descartes, we need to think about where matter and pattern meet, and hopefully we can do better than “the pineal gland” as an answer. In this post, I’ll discuss how all this relates to language and compositionality, and hence to category theory, as well as explain my hope that we may someday find a dynamic operad or other categorical framework that can account for why the meeting place of matter & pattern seems to condense over time.

1 Introduction

At a Computational Theology workshop, held a few months ago in Austin, TX, a number of us—Scott Garrabrant, Sophie Libkind (who also helped me refine the ideas for this post), Eliana Lorch, Anna Salamon, and I—spent several hours trying to work out how math and matter relate. How does the math fact that 4+3=7 relate to the paper-and-ink display “4+3=7”, or to some person’s vocal cords vibrating to create the sound “four plus three equals seven”? The math is pure pattern, and the sound, as vibrating air, is pure matter. What’s going on with the link between them?
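To make the pattern side of that fact concrete, here is a minimal example of it recorded in the proof assistant Lean (any proof assistant would do); the statement is pure pattern, yet checking it requires electrons moving through a physical machine:

```lean
-- The arithmetic fact 4 + 3 = 7, stated and checked as pure pattern.
-- `rfl` asks Lean to confirm that both sides compute to the same numeral.
example : 4 + 3 = 7 := rfl
```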

Eventually, we came upon the transistor as an excellent example of where math and matter meet.

A picture of an NPN transistor and its circuit diagram

On the left we see the transistor as matter, a thing in the material world. On the right we see the transistor as pattern, a logical idea. John Vervaeke says that the term symbol is appropriate here—that the symbol is not just a signifier of some transformation but also an active participant in materially achieving that transformation, i.e. that it’s both pattern and matter—though I don’t know how widespread that terminology is. It seems a little more on the “pattern” side, but I need a word, so I’ll go with it throughout this post, saying that the transistor is a symbol in this sense. If you think of a better word than “symbol” for referring to the fulcrum or Janus-point which, like a transistor, is both matter and pattern, please put it in the comment section below!

It is now believed that all of math can be recorded and developed within a proof assistant like Agda, Coq, HOL, or Lean. These programs run on computers, computers are made of logic gates, logic gates are made of NAND gates, and NAND gates are formed by attaching two transistors like so:

A circuit diagram for a NAND gate
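To spell out the “pattern” half of that claim, here is a small Python sketch (just the Boolean logic, not the electronics) of the standard fact that NOT, AND, and OR—and hence every Boolean function—can be wired up from NAND alone:

```python
def nand(a: bool, b: bool) -> bool:
    """The NAND gate as pure pattern: true unless both inputs are true."""
    return not (a and b)

# Every other Boolean operation can be built from NAND alone.
def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return nand(nand(a, b), nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(nand(a, a), nand(b, b))

# Exhaustive check over all inputs.
bits = (False, True)
assert all(not_(a) == (not a) for a in bits)
assert all(and_(a, b) == (a and b) for a in bits for b in bits)
assert all(or_(a, b) == (a or b) for a in bits for b in bits)
```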

One can imagine the connection between math and matter, via proof assistants that run on transistors, as taking place in a kind of hour-glass shape.

We do math purely conceptually, and yet these concepts need to turn into material action in order to affect the world. The transistor is a symbol that serves in a dual role, as both logic and matter, and it is close to the atomic scale in both respects: logically, it is half of a NAND gate (upon which all the logic necessary for a computer can be built); materially, as of late 2022 it is about 2 nanometers (the width of roughly 10 silicon atoms) in length.

In the next section I’ll discuss other examples of symbols that serve in a dual role of matter and pattern, and that like the transistor are as condensed as possible. Then I’ll say what this has to do with category theory and its role in the world.

2 Examples

Besides transistors, what other examples can we find of “symbols”, i.e. condensed instantiations of both matter and pattern? And why are they important?

  1. DNA. As an acidic molecule in 3D space that acts according to physical laws, DNA is matter; but as an ordered sequence of four letters whose three-letter words code for amino acids (see the short codon-reading sketch just after this list), it is pattern. Reading the DNA and elaborating its meaning involves many other parts of the cell, and we could consider that whole complex to be both matter and pattern, but DNA is more condensed, more symbolic. Is DNA matter or pattern?

  2. A signature. As the process of moving a pen across paper to form an ink stain, a signature is matter; but as a token of people’s agreement to regulate their behavior, it is pattern. When we sign a contract in good faith, we have set up our internal states in such a way that we think it likely that our actions in the material world will follow the pattern dictated by the contract. Is a signature matter or pattern?

  3. Grandma neuron. Neuroscientists say that a single neuron can code for a single concept, e.g. neuron X fires if and only if you recognize your own grandma. As a carbon-based object, a neuron is matter; but as representing your grandma, it is pattern. If you look at just the grandma neuron, would you call it matter or pattern?
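As promised in the DNA example above, here is a toy Python sketch of reading a DNA string as pattern: split it into three-letter words and look each one up. The codon table shown is only a small, illustrative fragment of the standard genetic code (the real table has 64 entries).

```python
# A small fragment of the standard genetic code, keyed by DNA codons.
CODON_TABLE = {
    "ATG": "Met", "TGG": "Trp", "TTT": "Phe",
    "GAA": "Glu", "AAA": "Lys", "GGC": "Gly",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna: str) -> list:
    """Read a DNA string as three-letter words, mapping each codon to an amino acid."""
    amino_acids = []
    for i in range(0, len(dna) - 2, 3):
        residue = CODON_TABLE.get(dna[i:i + 3], "?")
        if residue == "STOP":
            break
        amino_acids.append(residue)
    return amino_acids

print(translate("ATGTTTGGCAAATAA"))  # ['Met', 'Phe', 'Gly', 'Lys']
```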

There are many examples, and some are clearer than others. But one thing that strikes me in all this is that there is some process by which examples keep being created, symbols keep getting formed and condensed in our world. Probably early life’s control mechanism was far less condensed than “modern” DNA. Whatever happened in that prebiotic chemistry on the early earth, the robust and effective language for building custom proteins must have been repeatedly refined over eons. Similarly, coordination of animal activity is ancient, but the condensation of this coordination into a binding contract, or the symbolic signature itself, is quite new. And a similar thing could be said about the grandma neuron or the transistor. In each case there seems to be some sort of natural push or process urging the formation of smaller, more concentrated symbols to instantiate pattern as matter. More condensed symbols seem to work better.

What is this natural process? How does it work in so many domains and on so many scales at once? And why is condensing and compressing these symbols somehow “preferred” by evolution? I would love to know.

3 Compositionality and language formation

Transistors wouldn’t be nearly as important in our world today if they didn’t form NAND gates, and hence logic gates, adder circuits, CPUs, etc. But all that is purely conceptual, i.e. pattern. The thing that makes this grammar work is that it simultaneously fits the material embodiment. The material transistors can be arranged according to the conceptual pattern, and there is a kind of “functoriality” there: just as the matter instantiates the pattern at the lowest level, so does the matter instantiate the pattern at the higher levels too. Logic gates are made of NAND gates wired together, both in pattern and in matter, and this analogy keeps holding all the way up. Your computer has been programmed to run the programs that let you read this, but it operates on physics in the material world, thanks to the robustness and compositionality of the transistor as symbol.
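A minimal sketch of that compositionality, again in Python rather than electronics: a half adder built entirely from the NAND gate shown earlier. The same wiring can be read as a pattern of Boolean functions or as instructions for physically connecting transistors.

```python
def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def xor(a: bool, b: bool) -> bool:
    # The classic four-NAND construction of XOR.
    m = nand(a, b)
    return nand(nand(a, m), nand(b, m))

def half_adder(a: bool, b: bool) -> tuple:
    """Add two bits, returning (sum, carry); every gate here is made of NANDs."""
    carry = nand(nand(a, b), nand(a, b))  # AND built from NAND
    return xor(a, b), carry

# The composite behaves as the arithmetic pattern predicts: 1 + 1 = 10 in binary.
assert half_adder(True, True) == (False, True)
assert half_adder(True, False) == (True, False)
```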

Similarly, a single contract is important, but an organization works with a whole slew of contracts. When each one is being decided upon, the composition of the whole—how all those agreements are going to be carried out within the one organization—requires a conceptual understanding of roles and activities, all of which will be instantiated by human bodies. The composition of agreements within the company reflects the composition of movements by its employees.

The same can be said for DNA or neurons. A single nucleotide does not define DNA, nor does a single neuron define the brain. The organization of nucleotides in DNA or of neurons in the brain is extremely important for determining what the DNA or brain function will actually be. The point is that in each case the function is carried out materially, and this works because the analogy is compositional with respect to the patterns of connection in these organized systems.

So we’re not only interested in the symbol at the center of matter and pattern, but also in the compositionality—the grammar—by which the composition of conceptual patterns and the composition of material flows maintain their alignment.

We can see that language occurs materially and affects the material world, and it is also patterned in that it follows conceptual grammatical rules. When I say “pass the salt”, 10^{20} atoms including your arm and a salt shaker move through space, resulting in some sodium chloride crystals arriving in my bowl of soup. Language controls so much of our world, especially if you see things like DNA as language. And just as important as the fundamental symbols at the base of language is the fact that language is compositional.

Thus the drive toward finding symbols, which bridge matter & pattern, and do so compositionally, seems to be the same as, or at least tightly linked with, language formation. So it’s interesting to ask: by what process is language—including the language of thought, the language of computation, the language of life, etc.—formed? This formation appears to be a natural multi-scale process, beginning at least as far back as DNA, and continuing to this day.

4 An applied category theory question

Though I’ve used different terms for it over the years, I’ve been interested in using category theory to study collective intelligence and sense-making for just over 15 years now. A current instantiation of that interest is the question at the heart of this blog post:

Can we use CT to formalize a process by which language formation—i.e. the condensation of compositional matter-pattern symbols—would naturally occur?

I would love to help answer this question.

Recently, Brandon Shapiro and I developed something that might be relevant: a category-theoretic framework called Dynamic Organizational Structures, e.g. dynamic operads, dynamic categories, etc. We have four examples: deep learning, prediction markets, non-cooperative strategic games, and Hebbian learning (an example Sophie and I developed). In each case, there is a multi-scale system for updating the ways parts interact to form wholes.
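For readers who haven’t met operads, it may help to recall the basic (non-dynamic) structure being refined here; this is just the standard definition, sketched loosely. An operad O assigns to each arity n a collection O(n) of ways to combine n parts into a whole, together with a substitution operation

$$\gamma \colon \mathcal{O}(n) \times \mathcal{O}(k_1) \times \cdots \times \mathcal{O}(k_n) \longrightarrow \mathcal{O}(k_1 + \cdots + k_n),$$

satisfying associativity and unit laws: an n-ary way of combining, with each of its inputs filled by a smaller combination, yields one big combination. The dynamic versions, roughly, allow these ways-of-combining to themselves update over time, as in the examples just mentioned.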

Perhaps the sort of natural process by which matter-pattern symbols are condensed is scale-invariant enough that it would fit into this framework. In other words, I’m wondering: is there a dynamic operad for the language-formation process? This is of course a very loose question, but I think that’s okay; it’s only meant to inspire people to consider and work on it. Indeed, the Hebbian learning example above was only loosely inspired by spike-timing-dependent plasticity, but it still serves to give us some insight into the grammar of Hebbian learners. Applied category theory is very broad, and there’s no reason one needs to use dynamic operads, though they may be useful or at least inspire a more useful framework. I’m not really hoping for a definitive category-theoretic answer to the question of how language or symbols are repeatedly formed, though it’d be interesting to hear if someone thought they had such a thing; instead, I’m looking to consider the question and make progress on it.

Category theory itself is language, and it’s really condensed and powerful. And with tools like Catlab, it’s moving closer to the matter side of the divide: the categorical patterns are being instantiated in silicon and metal. Is it just a coincidence that we see this condensation happening yet again? Imagine how powerful it would be to instantiate a compositional language-formation process within computational category theory software. If it’s true that nature keeps making this sort of thing happen, as though by some “invisible hand”, then such an implementation may well be in our future. The more well-rounded our understanding of it is, the more elegantly the system can be designed, and the more frictionless I imagine the result will be.