Paige North (2nd of February)
In this talk, I will report on progress developing a fuzzy type theory, a project that started as part of the ACT 2022 Adjoint School. The motivation is to develop a logic which can model opinions, and we do this by generalizing Martin-Löf type theory. Martin-Löf type theory provides a system in which one can construct a proof (aka a term) of a proposition (aka a type), and we usually interpret such a term as saying that the proposition holds. Fuzzy type theory is a similar system in which one can provide or construct evidence (aka a fuzzy term) to support an opinion (aka a type), but the evidence (fuzzy term) comes with a parameter, for instance a real number between 0 and 1, which expresses to what extent the opinion holds. Martin-Löf type theory is closely related to category theory: from such a type system one can construct a category in which (very roughly) the types become objects and the terms become morphisms, and this can be made part of an equivalence. Thus, we develop fuzzy type theory as the system that corresponds to categories enriched in a category of fuzzy sets in the same way that Martin-Löf type theory corresponds to categories (enriched in sets).
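As a rough illustration of the graded judgement involved (a sketch of my own; the judgement form and the choice of how grades combine are assumptions made for exposition, not necessarily the authors' definitions), one can imagine evidence annotated by a grade that combines under composition via a t-norm:

```latex
% A fuzzy term a supporting the opinion A to extent r:
\[
  \Gamma \vdash a :_{r} A, \qquad r \in [0,1].
\]
% Grades might combine under composition/substitution via a t-norm, e.g.
\[
  r \otimes s = \min(r,s) \qquad \text{or} \qquad r \otimes s = r \cdot s .
\]
```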
This is joint work with Shreya Arya, Greta Coraglia, Sean O’Connor, Hans Riess, and Ana Tenório.
Prakash Panangaden (9th of February)
This talk is based on 25-year-old material developed in joint work with Samson Abramsky and Richard Blute. I hope that this talk will stimulate others (and myself) to look at these topics in light of modern developments. Our goal was to develop quantitative analogues of the concept of binary relations. In particular, we were interested in finding a suitable definition of probabilistic relations.
The formulation that we came up with arose by generalizing the notion of nuclear maps from functional analysis by defining nuclear ideals in monoidal *-categories. The compact closed structure associated with the category of relations does not generalize directly; instead one obtains nuclear ideals.
Many such categories have a large class of morphisms which behave as if they were part of a compact closed category, i.e. they allow one to transfer variables between the domain and the codomain. We introduce the notion of nuclear ideals to analyze these classes of morphisms. In compact closed categories all morphisms are nuclear, and in the category of Hilbert spaces, the nuclear morphisms are the Hilbert-Schmidt maps.
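For reference, the standard definition invoked here (not specific to the talk): a bounded operator f : H → K between Hilbert spaces is Hilbert-Schmidt when its Hilbert-Schmidt norm is finite,

```latex
\[
  \|f\|_{HS}^{2} \;=\; \sum_{i} \|f e_i\|^{2} \;<\; \infty
  \qquad \text{for some (equivalently, any) orthonormal basis } (e_i) \text{ of } H.
\]
```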
We also introduce two new examples of monoidal *-categories, in which integration plays the role of composition. In the first, morphisms are a special class of distributions, which we call tame distributions. The second example is based on measure theory and probability, and serves as a possible candidate for probabilistic relations.
Since our original paper was published, other examples of this phenomenon have been discovered. We also explored concepts associated with trace ideals; I will briefly talk about these.
Sergey Goncharov (16th of February)
Compositionality proofs in higher-order languages are notoriously involved, and general semantic frameworks guaranteeing compositionality are hard to come by. In particular, Turi and Plotkin’s bialgebraic abstract GSOS framework, which has been successfully applied to obtain off-the-shelf compositionality results for first-order languages, has so far not been applicable to higher-order languages. I will present a recently emerged development of the theory of abstract GSOS specifications for higher-order languages, in effect transferring the core principles of Turi and Plotkin’s framework to a higher-order setting. In this framework, the operational semantics of higher-order languages is represented by certain dinatural transformations, which we dub higher-order GSOS laws. I will present a general compositionality result, with respect to the strong variant of Abramsky’s applicative bisimilarity, that applies to all systems specified in this way. For presentation purposes, I will use a variant of combinatory logic as the main vehicle.
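For background (a standard definition from the first-order setting, quoted for orientation rather than taken from the talk): in Turi and Plotkin's framework, an abstract GSOS law for a syntax functor Σ with free monad Σ* and a behaviour functor B is a natural transformation

```latex
\[
  \rho_X \;\colon\; \Sigma\bigl(X \times B X\bigr) \;\longrightarrow\; B\bigl(\Sigma^{*} X\bigr).
\]
```

Roughly, the higher-order GSOS laws mentioned above generalize this shape by allowing the behaviour functor to be a mixed-variance bifunctor, with naturality relaxed to dinaturality.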
The talk is based on a recent POPL'23 paper, which is joint work with Stefan Milius, Lutz Schröder, Stelios Tsampas, and Henning Urbat.
Justin Curry (2nd of March)
In this talk I will describe recent and ongoing work dedicated to developing scalable autonomous routing protocols for a future solar-system-wide internet. At the heart of the talk will be a description of a new coordinate system (based on cosheaves and persistence) for time-varying graphs. In this new coordinate system, we can describe distances between various space networking scenarios, as well as model routing (with propagation delay) using matrix multiplication in a modified coefficient system valued in semirings. To demonstrate these models in practice, we use simulations of Earth-Moon-Mars systems generated in SOAP (Satellite Orbital Analysis Program) ranging from tens to hundreds of nodes in size. These simulations are part of a growing codebase being developed by SUNY Albany and NASA Glenn Research Center under the TIMAEUS project.
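To make the semiring point concrete, here is a minimal sketch (my own illustration, not code from the TIMAEUS project) of routing by matrix multiplication over the min-plus (tropical) semiring, where "addition" is min and "multiplication" is +, so that iterating a one-hop delay matrix propagates the best cumulative delay along multi-hop paths:

```python
import numpy as np

INF = np.inf

def minplus_matmul(A, B):
    """Matrix product over the min-plus semiring:
    (A (x) B)[i, j] = min_k (A[i, k] + B[k, j])."""
    C = np.full((A.shape[0], B.shape[1]), INF)
    for i in range(A.shape[0]):
        for j in range(B.shape[1]):
            C[i, j] = np.min(A[i, :] + B[:, j])
    return C

# One-hop propagation delays between three nodes (INF = no direct contact).
D = np.array([[0.0, 1.3, INF],
              [1.3, 0.0, 4.0],
              [INF, 4.0, 0.0]])

# Iterated products give the best delay achievable with more and more hops.
best = minplus_matmul(D, D)
print(best[0, 2])  # 5.3: node 0 reaches node 2 through node 1
```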
David Corfield (9th of March)
For the whole length of my academic career, I have looked to make philosophical sense of (higher) category theory. My earliest interests concerned category theory as a new structuralist foundational language for mathematics and for mathematical physics. Later at the n-Category Café there were also attempts to make category-theoretic sense of probability theory, learning theory and diagrammatic reasoning. Today, alongside successes in logic, mathematics and physics, we find a flourishing world of applied category theory, involving work in probability theory, causal reasoning, learning theory, natural language processing, and so on. These are all topics of profound interest to philosophy. In this talk, I will discuss ways in which philosophy can come into a fruitful relationship with category theory.
Riehl, Bradley, Cheng, Dancstep, and Lugg (16th of March)
Category theory is a wonderful subject, deep and broad, spanning the breadth of mathematics and having applications throughout science, engineering, technology, and the arts. But for people outside of academia, it can be a difficult subject to learn. Topos Institute is hosting a panel discussion, moderated by Emily Riehl, featuring panelists who are actively involved in producing category theory books and videos for a non-expert audience. The panelists will discuss their philosophy and techniques, and provide support and encouragement for others to join in this important work. They will also take questions from viewers to help people get a better handle on how they may begin to learn the subject and to help category theorists understand what they can do to facilitate this process.
Christina Vasilakopoulou (23rd of March)
In this talk, we will provide a detailed overview of what is sometimes called "Sweedler theory" for algebras and modules. This begins by establishing an enrichment of the category of algebras in the category of coalgebras, as well as an enrichment of a global category of modules in a global category of comodules, giving rise to a structure described as an enriched fibration. Moreover, by investigating a many-object generalization involving categories and modules, we will discuss further directions and applications of this framework to operadic structures.
Simon Willerton (30th of March)
A certain notion of convexity of sets can be captured by a monad, known as a convexity monad or barycentric monad; this is a finite version of so-called probability monads. Various authors (including Mardare-Panangaden-Plotkin, and Fritz-Perrone) have looked at convexity/probability monads on categories of metric spaces. The work of Fritz-Perrone can be recast in terms of enriched categories if you consider metric spaces as categories enriched over the quantale of extended non-negative real numbers.
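For readers unfamiliar with the enriched viewpoint, the standard observation here (due to Lawvere) is that a generalized metric space is exactly a category enriched over the quantale [0, ∞], ordered by ≥ and with monoidal structure given by addition: enriched composition and identities then amount to

```latex
\[
  d(x,y) + d(y,z) \;\ge\; d(x,z)
  \qquad\text{and}\qquad
  0 \;\ge\; d(x,x).
\]
```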
One can then do a similar thing for any 'suitably convex' quantale R and define a convexity monad on the category of R-categories. In particular, if R is the extended real line [-∞, ∞] with the opposite order to that used in metric spaces, then R-categories are what Lawvere called 'entropic spaces' and argued gave a necessary structure for state spaces in thermodynamics. The category of strict algebras with lax algebra maps for the convexity monad in this case is the category of convex entropic spaces with concave maps. The hope is that this connects the Lawvere approach to thermodynamics with the recent approach of Baez-Lynch-Moeller which involves convex spaces and concave maps (without the entropic structure).
Urs Schreiber (13th of April)
The intricacies of realistic — namely: of classically controlled and (topologically) error-protected — quantum algorithms arguably make computer-assisted verification a practical necessity; and yet a satisfactory theory of dependent quantum data types had been missing, certainly one that would be aware of topological error-protection.
To solve this problem we present Linear homotopy type theory (LHoTT) as a programming and certification language for quantum computers with classical control and topologically protected quantum gates, focusing on (1.) its categorical semantics, which is a homotopy-theoretic extension of that of Proto-Quipper and a parameterized extension of Abramsky et al.'s quantum protocols, (2.) its expression of quantum measurement as a computational effect induced from dependent linear type formation and reminiscent of Lee et al.'s dynamic lifting monad but recovering the interacting systems of Coecke et al.'s "classical structures" monads.
Namely, we have recently shown that classical dependent type theory in its novel but mature full-blown form of Homotopy Type Theory (HoTT) is naturally a certification language for realistic topological logic gates. But given that categorical semantics of HoTT is famously provided by parameterized homotopy theory, we had argued earlier [Sc14] for a quantum enhancement LHoTT of classical HoTT, now with semantics in parameterized stable homotopy theory. This linear homotopy type theory LHoTT has meanwhile been formally described; here we explain it as the previously missing certified quantum language with monadic dynamic lifting, as previously announced.
Concretely, we observe that besides its support, inherited from HoTT, for topological logic gates, LHoTT intrinsically provides a system of monadic computational effects which realize what in algebraic topology is known as the ambidextrous form of Grothendieck’s "Motivic Yoga"; and we show how this naturally serves to code quantum circuits subject to classical control implemented via computational effects. Logically this emerges as a linearly-typed quantum version of epistemic modal logic inside LHoTT, which besides providing a philosophically satisfactory formulation of quantum measurement, makes the language validate the quantum programming language axioms proposed by Staton; notably the deferred measurement principle is verified by LHoTT.
Finally we indicate the syntax of a domain-specific programming language QS (an abbreviation both for "Quantum Systems" and for "QS^0 -modules" aka spectra) which sugars LHoTT to a practical quantum programming language with all these features; and we showcase QS-pseudocode for simple forms of key algorithm classes, such as quantum teleportation, quantum error-correction and repeat-until-success quantum gates. (This is joint work with D. J. Myers, M. Riley and H. Sati. Slides will be available at: ncatlab.org/schreiber/show/Quantum+Certification+via+Linear+Homotopy+Types)
Davide Trotta (20th of April)
This talk presents an overview of the "generalized existential completion" of Lawvere doctrines and its applications. I will present this construction, which freely adds (generalized) existential quantifiers to a given doctrine, and I will provide an intrinsic characterization of this notion based on an algebraic description of the logical concept of existential-free formulas. This characterization provides a useful tool for recognizing a wide variety of examples of doctrines arising as generalized existential completions. These include the subobjects doctrine and the weak subobjects doctrine, as well as all realizability triposes and, among localic triposes, only the supercoherent ones.
I will also present some applications of the construction to the dialectica interpretation, showing how our algebraic description of quantifier-free formulas allows us to prove that the logical principles involved in the dialectica interpretation are satisfied in the categorical setting, establishing a tight correspondence between the logical system and the categorical framework given by dialectica doctrines.
Georges Gonthier (27th of April)
While mathematics is amongst the best organized forms of knowledge, the level of detail of computer-assisted formal mathematics demands an even higher degree of structuring. The cathedrals that are the computer proofs of famous results such as the Four Color or the Odd Order theorems rest on the foothills of large, architected libraries of prerequisites, ranging from the trivial and elementary to undergraduate and graduate curricula. While these can often be crafted to resemble traditional course material, the extra attention to detail also motivates differences, some examples of which we will highlight.
Michael Barany (4th of May)
I will situate categories and related mathematical principles in the history of modern mathematics from the late nineteenth century to the present. The worldviews and perspectives adopted by the Topos Institute derive from specific transformations in the nature and scale of mathematical research over the last century-and-a-bit. These connect the ideas of modern mathematics to its people, institutions, and infrastructures. I will consider two main senses in which categories "come to matter": first, how categories of various kinds became meaningful, salient, and important as ways of seeing the mathematical world and the worlds of mathematicians; and second, how such categories were encoded, translated, reproduced, and made to move on and across various kinds of material media (i.e. physical matter) such as blackboards, index cards, and printed journals. I contend that these two senses are historically connected, in ways that relate to the continuing goals and challenges of the Topos community.
Chad Giusti (11th of May)
Persistent homology is the central tool in modern applied topology. Briefly, given a (finite) sample from a distribution on a topological space embedded in a metric space, one builds a multi-scale combinatorial representation of the sample called a filtered Vietoris-Rips complex. Applying homology, we summarize this multi-scale structure as an object called a persistence module. When the samples come from real-world data, practitioners apply domain knowledge and ad hoc reasoning to understand how the persistence module reflects organizational principles of the system of interest.
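As a toy illustration of the first step of this pipeline (a sketch of my own, not the speaker's code): the 1-skeleton of the filtered Vietoris-Rips complex is determined entirely by pairwise distances, with the edge {i, j} entering the filtration at scale d(x_i, x_j); applying homology to the resulting filtration, e.g. with an off-the-shelf persistent homology library, then yields the persistence module.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Thirty noisy samples from a circle: a standard example with one persistent 1-cycle.
theta = rng.uniform(0, 2 * np.pi, size=30)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((30, 2))

# Pairwise distances; edge {i, j} enters the Vietoris-Rips filtration at scale D[i, j].
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
births = sorted(D[i, j] for i, j in combinations(range(len(X)), 2))

for scale in (0.1, 0.3, 0.5):
    n_edges = sum(1 for b in births if b <= scale)
    print(f"scale {scale}: {n_edges} edges present in the 1-skeleton")
```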
If we wish to formalize this reasoning process, or apply more sophisticated methods from algebraic topology to data analysis, a first step is to develop a notion of functoriality for persistent homology. This pipeline inherits the usual functoriality of homology, taking maps of simplicial complexes to maps on persistence modules, and substantial progress has been made in cycle registration, a formalism for comparing persistence modules using this structure. However, in practice one usually compares samples from unknown spaces; there are relatively few settings where the required map of the combinatorial encodings (or underlying spaces) is likely to be known. In this talk, we describe our recent efforts to develop techniques that provide a notion of "induced map" using the kind of data reasonably available in applied settings, and the challenges that remain in developing a full categorical framework for applied topology. Various parts of this work are joint with Iris Yoon, Robert Ghrist, Niko Schonsheck, Gregory Henselman-Petrusek, and Lori Ziegelmeier, amongst others.
Clark Barrett (18th of May)
Satisfiability modulo theories (SMT) is an automated reasoning paradigm that can automatically prove a wide variety of first-order logic theorems relating to theories that commonly occur when reasoning about computer systems (e.g., arithmetic, arrays, strings, etc.). Together with my colleagues at the University of Iowa, Bar-Ilan University, and the Federal University of Minas Gerais (UFMG), we have for the past few years been developing a way to produce proof certificates from the cvc5 SMT solver. These proof certificates provide enough detail to allow an independent proof checker to confirm the correctness of the theorem proved. The obvious benefit is that this drastically reduces the code that must be trusted to ensure correct results. A less obvious benefit is that proof certificates open up the possibility of integrating SMT solvers into skeptical proof assistants such as Coq, Isabelle/HOL, or Lean. In this talk, I will give an overview of the proof production and proof checking mechanisms we are developing for cvc5 and explain how we are leveraging these to provide trusted SMT automation in proof assistants.
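For readers new to SMT, here is a tiny example of the kind of query such solvers discharge automatically. It is written against the z3 solver's Python bindings purely because they make a compact illustration; it is not cvc5's API, though cvc5 offers a similar interface and, as described above, can additionally emit a checkable proof certificate when it reports unsatisfiability.

```python
from z3 import Ints, Solver, sat

x, y = Ints('x y')
s = Solver()
# Are these linear integer constraints jointly satisfiable?
s.add(x + 2 * y == 7, x > y, y >= 0)
if s.check() == sat:
    print(s.model())  # one satisfying assignment, e.g. x = 7, y = 0
else:
    print("unsat -- this is where a proof certificate could be checked")
```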
Elaine Landry (25th of May)
The aim of this talk is to show that when we shift our focus from solving philosophical problems to solving mathematical ones, we see that an as-if interpretation of mathematics can be used to provide an account of both the practice and the applicability of mathematics. I begin with Plato to show that much philosophical milk has been spilt owing to our conflating the method of mathematics with the method of philosophy. I then use my reading of Plato to develop what I call as-ifism, the view that, in mathematics, we treat our hypotheses as if they were true first principles and we do this with the purpose of solving mathematical problems, not philosophical ones. I next extend as-ifism to modern mathematics wherein the method of mathematics becomes the axiomatic method, noting that this engenders a shift from as-if hypotheses to as-if axioms. I next distinguish as-ifism from if-thenism, and use this to develop my structural as-ifist position. I end by showing that, by taking a methodological as-ifist route and placing our focus on what is needed for the practice and applicability of mathematics, we are committed neither to the unconditional consistency of our mathematical axioms nor to the unconditional truth of our background meta-mathematical theory. Simply, it is methodological considerations, and not metaphysical ones, that "condition" our as-if assumptions of both the consistency of our mathematical axioms and the truth of our background meta-mathematical theory. Finally, I use my methodological as-ifism to reconsider the "foundations debate", specifically, that between set theory and category theory.
Alex Martsinkovsky (1st of June)
Two important features of a linear control system are its controllability and observability. From the algebraic perspective, a system is a module over a ring of differential operators (with constant, polynomial, or analytic coefficients). The part of the system that cannot be controlled is called the autonomy of the system. In the algebraic language, this is just the torsion submodule of the module. Thus the controllable part is described by the torsion-free quotient of the module.
There is a duality between the controllability of the system and the observability of the dual system. The goal of this talk is to propose a conjectural algebraic analog of the observability, called cotorsion. As a justification for the conjecture, we will see a duality between cotorsion and torsion, effected by the Auslander-Gruson-Jensen functor. The latter seems to be a recurring theme, found in a variety of functor categories. It is to be hoped that, despite the algebraic nature of this talk, the non-algebraic members of the audience would recognize familiar (enriched) categorical patterns.
Tom Hirschowitz (8th of June)
Research in programming languages mostly proceeds language by language. This in particular means that key ideas are often introduced for one typical language, and must then be adapted to other languages.
A prominent example of this is Howe's method for proving that applicative bisimilarity, a notion of program equivalence in call-by-name λ-calculus, is a congruence. This method has been adapted to call-by-value λ-calculus, PCF, λ-calculus with delimited continuations, higher-order π-calculus,…
In this work, using category theory, we establish an abstract congruence theorem for applicative bisimilarity, of which most existing adaptations of Howe's method are instances.
This is joint work with Ambroise Lafont.
See also Ugo Dal Lago, Francesco Gavazzo, and Paul Levy. Effectful applicative bisimilarity: Monads, relators, and Howe’s method. LICS 2017. Extended version: arXiv:1704.04647.
Taco Cohen (15th of June)
Categorical systems theory is a general framework for modelling systems whose state evolves depending on some inputs or actions, and which produce some observable outputs. This framework is general enough to describe arbitrary classical physical systems, and generalizes the partially observable Markov decision processes (POMDPs) commonly used in AI. Causal models, formalized as string diagrams in CD-categories, provide a convenient high-level framework for reasoning about how intervening on some outcome variables would affect others. In this talk I will provide a high-level introduction to both frameworks, and discuss their relations. In particular, we ask when a system can be accurately described by a causal model, and show that this is not generally possible, thus highlighting the limitations of existing causal modelling frameworks.
Jared Culbertson (22nd of June)
Embracing a categorical perspective has proven to be a useful way to formalize many intuitive and ad hoc compositional approaches across a diverse set of practical fields. In this talk, we will discuss work in three primary domains including probabilistic modeling, hierarchical clustering, and robotics. After briefly surveying prior work in the first two, we will focus on formalizing what it means to "apply a behavior" in a very concrete context of legged robots whose controllers are modeled as hybrid dynamical systems. Central to this approach is developing a common framework for describing (i) sequential composition of hybrid systems (enabled by a cospan description of interfaces and a generalization of Conley's (epsilon,T)-chains to the hybrid setting); (ii) transformations of hybrid systems (modeled as hybrid semiconjugacies and integrated with sequential composition via a double category); and (iii) hierarchical composition of hybrid systems (where hybrid subdivisions satisfying a certain fiber product condition enable composing spans of template/anchor pairs). Throughout, we will highlight some specific practical lessons that were learned in applying categorical formalisms to these real-world problem domains.
Nathaniel Osgood (29th of June)
For decades, System Dynamics (SD) modeling has served as a prominent, diagram-centric methodology used for public health modeling. Much of its strength arises from its versatile use of three types of diagrams, with each serving both to elevate transparency across the interdisciplinary teams responsible for most impactful models, and to reason about patterns of system behavior. Causal loop diagrams (CLDs) are used semi-qualitatively early in the modeling process and seek to support insight into feedback structure, behavioral modes, and leverage points. As modeling proceeds, system structure diagrams further distinguish stocks (accumulations) from flows and material from informational dependencies. Stock & flow diagrams build on that representation to characterize mathematical dependencies, quantify parameters and initial values for stocks, and have been particularly widely used in scenario simulation in public health and mathematical epidemiology. While ubiquitous use of diagrams renders SD modeling markedly effective in supporting team science and shaping stakeholders’ mental models, existing tools suffer from a number of shortcomings. These include poor support for modularity, cumbersome and obscurant model stratification, and an inability to capture the relationships between the three diagram types. Within this talk, we describe initial progress towards creating a framework for compositional System Dynamics, including theory, API support via StockFlow.jl within AlgebraicJulia, and ModelCollab -- a real-time collaborative tool to support interdisciplinary teams in modularly building, composing and flexibly analyzing Stock & Flow diagrams. Our approach separates syntax from semantics, and characterizes diagrams using copresheaves on a schema category. Diagram composition draws on the theory of structured cospans and undirected wiring diagrams, and employs pullbacks for model stratification. Model interpretation is achieved via functorial semantics, with ordinary differential equations being just one of several semantic domains supported. After describing the current state of implementation, we describe plans for future work, including enriching support for CLDs, and adding support for several computational statistics algorithms and additional types of structurally-informed model analyses. This is joint work with John Baez, Evan Patterson, Nicholas Meadows, Sophie Libkind, Alex Alegre and Eric Redekopp.
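To give a flavour of the ODE semantics of a stock and flow diagram (an illustrative hand-written sketch, not the StockFlow.jl API): stocks become state variables, flows become rate expressions, and each stock's derivative is the sum of its inflows minus its outflows. For the classic SIR epidemic model:

```python
from scipy.integrate import solve_ivp

beta, gamma, N = 0.3, 0.1, 1000.0     # contact rate, recovery rate, total population

def sir(t, y):
    S, I, R = y                       # stocks
    infection = beta * S * I / N      # flow: S -> I
    recovery = gamma * I              # flow: I -> R
    return [-infection,               # dS/dt = -outflow
            infection - recovery,     # dI/dt = inflow - outflow
            recovery]                 # dR/dt = inflow

sol = solve_ivp(sir, (0.0, 160.0), [N - 1.0, 1.0, 0.0])
print(sol.y[:, -1])                   # final sizes of the S, I, R stocks
```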
Urs Schreiber (24th of August)
The intricacies of realistic — namely: of classically controlled and (topologically) error-protected — quantum algorithms arguably make computer-assisted verification a practical necessity; and yet a satisfactory theory of dependent quantum data types had been missing, certainly one that would be aware of topological error-protection.
To solve this problem we present Linear homotopy type theory (LHoTT) as a programming and certification language for quantum computers with classical control and topologically protected quantum gates, focusing on (1.) its categorical semantics, which is a homotopy-theoretic extension of that of Proto-Quipper and a parameterized extension of Abramsky et al.'s quantum protocols, (2.) its expression of quantum measurement as a computational effect induced from dependent linear type formation and reminiscent of Lee et al.'s dynamic lifting monad but recovering the interacting systems of Coecke et al.'s "classical structures" monads.
Namely, we have recently shown that classical dependent type theory in its novel but mature full-blown form of Homotopy Type Theory (HoTT) is naturally a certification language for realistic topological logic gates. But given that categorical semantics of HoTT is famously provided by parameterized homotopy theory, we had argued earlier [Sc14] for a quantum enhancement LHoTT of classical HoTT, now with semantics in parameterized stable homotopy theory. This linear homotopy type theory LHoTT has meanwhile been formally described; here we explain it as the previously missing certified quantum language with monadic dynamic lifting, as previously announced.
Concretely, we observe that besides its support, inherited from HoTT, for topological logic gates, LHoTT intrinsically provides a system of monadic computational effects which realize what in algebraic topology is known as the ambidextrous form of Grothendieck’s “Motivic Yoga”; and we show how this naturally serves to code quantum circuits subject to classical control implemented via computational effects. Logically this emerges as a linearly-typed quantum version of epistemic modal logic inside LHoTT, which besides providing a philosophically satisfactory formulation of quantum measurement, makes the language validate the quantum programming language axioms proposed by Staton; notably the deferred measurement principle is verified by LHoTT.
Finally we indicate the syntax of a domain-specific programming language QS (an abbreviation both for “Quantum Systems” and for “QS^0 -modules” aka spectra) which sugars LHoTT to a practical quantum programming language with all these features; and we showcase QS-pseudocode for simple forms of key algorithm classes, such as quantum teleportation, quantum error-correction and repeat-until-success quantum gates.
(This is joint work with D. J. Myers, M. Riley and H. Sati.)
Nat Shankar (31st of August)
Logic has always played a central role in computing. One successful application of logic is its use in specifying and modeling computational behavior, and in proving properties of computation systems. SRI's PVS was released thirty years ago with the goal of democratizing interactive theorem proving by combining an expressive formalism with powerful automated reasoning tools for building and maintaining complex formalizations and proofs. We describe some of the features of PVS for defining and reasoning with mathematical abstractions and bridging the gap between informal and formalized mathematical discourse. PVS features an expressive higher-order logic with algebraic/coalgebraic datatypes, dependent predicate subtypes, and parametric theories. The interactive theorem prover employs a range of automated proof strategies for simplification, rewriting, and case analysis, along with built-in decision procedures for SAT and SMT solving. The applicative fragment of PVS can be viewed as a functional programming language, and efficient executable code can be generated in Common Lisp and C, among other languages. PVS includes extensive libraries spanning a range of topics in mathematics and computing. The talk is an informal overview of the underlying theoretical foundations, and the proof and code generation capabilities of PVS.
Bio: Dr. Natarajan Shankar is a Distinguished Senior Scientist and SRI Fellow at the SRI Computer Science Laboratory. He received a B.Tech. degree in Electrical Engineering from the Indian Institute of Technology, Madras, and Ph.D. in Computer Science from the University of Texas at Austin. He is the author of the book, "Metamathematics, Machines, and Gödel's Proof", published by Cambridge University Press. Dr. Shankar is the co-developer of a number of technologies including the PVS interactive proof assistant, the SAL model checker, the Yices SMT solver, and the Arsenal semantic parser. He is a co-recipient of the 2012 CAV Award and the recipient of the 2022 Herbrand Award.
Leonardo de Moura (7th of September)
This talk presents Lean 4, the latest version of the Lean proof assistant, and its impact on the mathematical community. We first introduce the project's design and objectives, followed by the mission of the newly established Lean Focused Research Organization (FRO). The advent of Lean and similar proof assistants has sparked a transformation in mathematical practice, an era we refer to as the "Formal Mathematics Revolution". We'll explore how Lean 4 contributes to this revolution, with its tools and structures enabling mathematicians to formalize complex theories and proofs with unprecedented ease. A key aspect of our philosophy is facilitating decentralized innovation. We discuss the strategies employed to empower a diverse community of researchers, developers, and enthusiasts to contribute to formalized mathematics. We will also delve into the usage of Lean as a functional programming language. With Lean 4, we have not only created an environment for formalizing mathematics but also an effective tool for writing software, enabling a smooth interaction between mathematical and computational aspects. Finally, we will look ahead, sharing our vision and planned steps for the future of Lean 4 and the Lean FRO. We'll discuss how we aim to further grow the user base, continue improving usability, and deepen the reach of formal methods into mathematics and computer science.
Michael Levin (21st of September)
We are composites, made out of an agential material. In this talk, I will describe how molecular pathways, cells, tissues, and organs display intelligence - problem-solving and creative behavior in a variety of diverse problem spaces. I will describe evolutionary aspects of this amazing multiscale competency architecture, and how the cognition of cells scales up to the grandiose goals of morphogenetic, behavioral, and linguistic collectives. I will show data from my lab that uses developmental biophysics, computer science, and behavioral science to understand diverse intelligence and work towards practical, biomedical applications of these ideas (in areas of regeneration, birth defects, and cancer). I will also describe our synthetic life forms, which we use to understand the latent space of goals for collective systems without an evolutionary history.
Jonathan Sterling (28th of September)
It is easy to teach a student how to give a naïve denotational semantics to a typed lambda calculus without recursion, and then use it to reason about the equational theory: a type might as well be a set, and a program might as well be a function, and equational adequacy at base type is established using a logical relation between the initial model and the category of sets. Adding any non-trivial feature to this language (e.g. general recursion, polymorphism, state, etc.) immediately increases the difficulty beyond the facility of a beginner: to add recursion, one must replace sets and functions with domains and continuous maps, and to accommodate polymorphism and state, one must pass to increasingly inaccessible variations on this basic picture.
The dream of the 1990s was to find a category that behaves like SET in which even general recursive and effectful programming languages could be given naïve denotational semantics, where types are interpreted as “sets” and programs are interpreted as “functions”, without needing to check any arduous technical conditions like continuity. The benefit of this synthetic domain theory is not only that it looks “easy” for beginners, but also that more expert-level constructions like powerdomains or even domain equations for recursively defined semantic worlds become simple and direct. Although there have been starts and stops, the dream of synthetic domain theory is alive and well in the 21st Century. Today’s synthetic domain theory is, however, both more modular and more powerful than ever before, and has yielded significant results in programming language semantics, including simple denotational semantics for a state-of-the-art programming language with higher-order polymorphism, dependent types, recursive types, general reference types, and first-class module packages that can be stored in the heap.
In this talk, I will explain some important classical results in synthetic domain theory as well as more recent results that illustrate the potential impact of “naïve denotational semantics” on the life of a workaday computer scientist.
Dominic Orchard (12th of October)
Climate change is one of the greatest challenges of our time. Assessing its risks and our progress towards mitigating its worst effects requires a wealth of data about our natural environment that we rapidly process into accurate indicators, assessments, and predictions, with sufficient trust in the resulting insights to make decisions that affect the lives of billions worldwide, both now and in the future. However, in the last decade, climate modelling has faced diminishing returns from current hardware trends and software techniques. Furthermore, developing the required models and analysis tools to effectively process, explore, archive, and derive policy decisions, with a high degree of transparency and trust, remains difficult. I argue that more cross-disciplinary effort between mathematics, computer science, software engineering, and data science is needed to help close the gap between where we are and where we need to be. I will discuss our work at the Institute of Computing for Climate Science and the critical role of programming in addressing the needs of climate science. I will also make connections to relevant areas of category theory which could be leveraged to develop more flexible climate models in the future.
Simona Paoli (19th of October)
Many structures in higher category theory have been described using the combinatorics of simplicial objects, based on the simplicial delta (the simplex category). A prototype example is that a category can be described as a simplicial set satisfying appropriate conditions (the so-called Segal conditions) via the nerve functor. In dimension two, simplicial objects in Cat can be used to describe strict 2-categories and double categories. Among the latter, one can identify the category of weakly globular double categories, which gives a model of weak 2-categories via a new paradigm for weakening higher categorical structures: the notion of weak globularity. The fat delta, introduced by Joachim Kock, carries some intuition similar to the simplicial delta, yet it has a different and rich structure. Kock used it to define fair 2-categories, encoding weak 2-categories with strict composition laws. In this talk I illustrate a direct comparison between fair 2-categories and weakly globular double categories which enables us to interpret the weak globularity condition in terms of weak units. This comparison is based on a rich interplay between the simplicial delta and the fat delta, and on several novel properties of the latter. I will finally explain the significance of this result in terms of potential higher dimensional generalizations.
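For concreteness, the Segal conditions referred to here say that a simplicial set X is (isomorphic to) the nerve of a category exactly when every Segal map is a bijection:

```latex
\[
  X_n \;\xrightarrow{\ \cong\ }\;
  \underbrace{X_1 \times_{X_0} X_1 \times_{X_0} \cdots \times_{X_0} X_1}_{n \text{ factors}},
  \qquad n \ge 2 .
\]
```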
Tom Leinster (26th of October)
Ecologists have been debating the best way to measure diversity for more than 70 years. The concept of diversity is relevant not only in ecology, but also in other fields such as genetics and economics, as well as being closely related to information entropy.
The question of how best to quantify diversity has surprising mathematical depth. Indeed, a general study of invariants resembling cardinality and Euler characteristic led to the unifying notion of the magnitude of an enriched category - a quantity which is also closely related to the maximum diversity of a community of prescribed species. I will give a high-level overview of the concepts of entropy, diversity and magnitude, and how they fit together.
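As a small concrete illustration of magnitude (my own example, not taken from the talk): for a finite metric space with distance matrix D, the magnitude is commonly computed as the sum of all entries of the inverse of the similarity matrix Z with Z_ij = exp(-D_ij), whenever that inverse exists. For two points at distance t this gives 2/(1 + e^{-t}), interpolating between "effectively one point" and "two points" as t grows:

```python
import numpy as np

def magnitude(D):
    """Magnitude of a finite metric space with distance matrix D:
    the sum of all entries of Z^{-1}, where Z = exp(-D) entrywise
    (assuming Z is invertible)."""
    return np.linalg.inv(np.exp(-D)).sum()

for t in (0.01, 1.0, 10.0):
    D = np.array([[0.0, t], [t, 0.0]])
    print(t, magnitude(D))   # approaches 1 as t -> 0 and 2 as t -> infinity
```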
Brandon Shapiro (2nd of November)
The double category Cat^#, which can be built out of polynomial comonads, provides a computation-friendly mathematical language for categorical database theory, effects handlers in programming, and discrete open dynamical systems. It has categories as objects, cofunctors as arrows, and prafunctors as pro-arrows. Monads among prafunctors, such as the free category monad on graphs, have algebras including categories, n-categories (strict or weak), double categories, multicategories (symmetric or plain), monoids, monoidal categories (of nearly any variety), and in fact any algebraic notion of higher category whose definition is of a particular form. I will introduce the Cat^# approach to defining (higher) categories and survey some constructions from (higher) category theory that can be expressed in the language of Cat^#, including higher categorical nerves and "algebraic prafunctors" such as the free monoidal category monad on Cat.
Maurice Chiodo (16th of November)
In the past 10 years there has been a significant increase in the level of attention on issues of ethical and responsible development in mathematics. "Ethics in Mathematics" is now a recognised area of study, and there are now substantially more resources to consult on this. However, such understanding has not yet permeated into the minds of the bulk of mathematicians. Most either have minimal realisation of the need to consider ethical and societal issues when carrying out mathematical work, or lack the skill set needed to carry out such thinking in a thorough and systematic way. In short: most mathematicians either don't know how to spot ethical issues, or don't know what to do when they have spotted one.
In this talk I will aim to address both of these 'blind spots' for mathematicians. I will present my "Teaching Resources for Embedding Ethics in Mathematics" (arXiv:2310.08467), which is a tool to embed ethics in mathematics teaching by way of mathematical exercises that normalise ethical considerations. I will also present my "Manifesto for the Responsible Development of Mathematical Works" (arXiv:2306.09131), which is a tool to help mathematicians dissect their workflow and identify the points at which their actions and choices may lead to harmful consequences. In short: train mathematicians to think about ethics all the time, then educate them on what to do when they identify such issues.
These are joint works with Dennis Müller, and form part of the Cambridge University Ethics in Mathematics Project (ethics.maths.cam.ac.uk).
Minhyong Kim (30th of November)
At a recent conference on the global history of mathematics, a question was raised about the recurrence of Euclid in a number of the talks and the 'Western' bias that seemed to appear in a meeting that was concerned with global history. In this talk, I will discuss the misconceptions around the identities of historical figures like Euclid, the deep-rooted confusion surrounding ancient identities in general, and why it might be important for mathematicians of our times to be aware of them.
Julie Bergner (7th of December)
There are two ways of turning Segal spaces into models for up-to-homotopy categories, or (∞,1)-categories: either asking that the space of objects be discrete, or requiring Rezk's completeness condition. When generalizing to higher (∞,n)-categories, both of these approaches have been applied to multisimplicial models, in the form of Segal n-categories and n-fold complete Segal spaces, but models given by Θ_n-diagrams have focused on the completeness condition. In this talk, we'll discuss how to get a Θ_n-model with discreteness conditions, but also address the question of when these conditions can be mixed and matched with one another.
André Joyal (14th of December)
Lurie's higher topos theory is a vast extension of Grothendieck's topos theory. I will compare the two theories, stressing similarities and differences. The fact that the ∞-category of parametrised spectra is a higher topos has no analog in Grothendieck topos theory. It is the first stage of the Goodwillie tower associated to any left exact localization of a higher topos. The tower can be understood in analogy with the completion tower of an ideal in a commutative ring. For many purposes, a topos has the dual aspects of a space (the topos) and of a ring (the logos). The category of logoi (=the opposite of the category of topoi) has many properties in common with the category of commutative rings.
In collaboration with Mathieu Anel, Georg Biedermann and Eric Finster.