The history of evolutionary thought might be described as the development of theories that help us count living things. In the natural world, questions of numerosity are therefore closely related to the challenges of classification. When we ask how many animals or trees there are, do we mean individuals, species, families, or some other level of description? Before Darwin, naturalists counted using phenotypic traits; after Darwin, they used phylogenetic trees. We still do not agree on the fundamental units of numerosity. I shall discuss arguments about counting in nature that resemble arguments about counting in mathematics and physics, related to parsimony, forces and fields, and disputes about the role of unified theories. I shall end by describing the challenge of counting individuals, which is necessary for defining life.
Bio David Krakauer is an evolutionary biologist with degrees in mathematics and computer science; he is currently the President and William H. Miller Professor of Complex Systems at the Santa Fe Institute. David’s research focuses on the evolutionary history of information-processing mechanisms in biology and culture, including genetic, neural, linguistic, and cultural mechanisms. The research spans multiple levels of organization, seeking analogous patterns and principles in genetics, cell biology, microbiology, and in organismal behavior and society. At the cellular level, David has been interested in molecular processes that rely on volatile, error-prone, asynchronous mechanisms, which can serve as a basis for decision making and patterning. David also investigates how signaling interactions at higher levels, including the microbial and organismal, are used to coordinate complex life cycles and social systems, and under what conditions we observe the emergence of proto-grammars. Much of this work is motivated by the search for 'noisy-design' principles in biology and culture emerging through evolutionary dynamics that span hierarchical structures.
Natural science uses the language of Mathematics to formulate its theories. Many different mathematical models are used across the branches of science, but they are all assumed to fit into a unified mathematical framework. For example, when the state space of a quantum mechanical system is modeled as the collection of one-dimensional subspaces of a separable infinite-dimensional Hilbert space, there is an implicit assumption that the object called “infinite-dimensional separable Hilbert space” is uniquely defined and that its properties are completely determined. But the study of the foundations of Mathematics is challenged by independence: many questions about mathematical objects like the real numbers, Hilbert spaces, and topological spaces are independent of the usual axioms in which the whole venture of Mathematics is formulated. So when we formulate a scientific theory in terms of mathematical objects, are we aware that there are “properties” of these objects which are not decided by the mathematical framework? One may claim that these “independent” properties are not relevant to Science, but we shall try to argue that it is conceivable that the phenomenon of independence is relevant to the mathematical models used in Science, and that the adoption of a mathematical language is far from unique.
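The canonical example of the independence the abstract describes is the continuum hypothesis, sketched here in brief (Gödel showed CH consistent with ZFC in 1940; Cohen showed its negation consistent in 1963, so ZFC decides neither):

```latex
% The continuum hypothesis (CH): no set has cardinality strictly
% between that of the naturals and that of the reals.
\mathrm{CH}:\quad \neg\,\exists A \;\; \aleph_0 < |A| < 2^{\aleph_0}
\qquad\text{equivalently (under AC)}\qquad 2^{\aleph_0} = \aleph_1
```

A question phrased entirely in terms of the real numbers, an object scientific theories use constantly, is thus left open by the standard axiomatic framework.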
Materials · Menachem Magidor's Publications
Bio Menachem Magidor is a professor of Mathematics (emeritus) at the Hebrew University of Jerusalem. He mainly works in Mathematical logic, especially in Set Theory, Model theory and applications of Logic to Computer Science. He served as the president of the Association of Symbolic Logic (1996-1998), and as the president of the division of Logic, Methodology and Philosophy of Science and Technology of the International Union of History and Philosophy of Science (2016-2020). He also served as the president of the Hebrew University of Jerusalem (1997-2009).
Artificial neural networks are becoming increasingly popular models of how the brain processes information. They've been shown capable of playing games at human level, predicting neural activity in response to real-world images, and capturing basic dynamics of decision making. In the majority of these networks, individual neurons can take any non-negative real value. Yet in the history of cognitive science, discrete and symbolic processing has been highlighted as a way of describing the mind. I will compare and contrast these views, explaining why continuous values are necessary for neural networks and how people are aiming to connect these two seemingly disparate approaches. Questions will include: What counts as a symbol? How can symbolic/discrete processing arise from continuous neurons? Will different or hybrid structures ultimately be needed to model the mind/brain? Do we need to reconcile these views or is the success of neural networks enough to support them as models on their own?
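The contrast between continuous neural codes and discrete symbols can be made concrete in a few lines. This is a minimal illustrative sketch (not code from the talk, and the toy sizes are assumptions): ReLU units take arbitrary non-negative real values, and one common bridge to symbolic processing collapses that continuous code to a discrete index.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy layer of "neurons": ReLU units whose activations can take
# any non-negative real value, as in the networks described above.
x = rng.normal(size=4)                 # input vector
W = rng.normal(size=(3, 4))            # weight matrix (3 neurons)
activations = np.maximum(0.0, W @ x)   # continuous, non-negative code

# One simple bridge to discrete/symbolic processing: collapse the
# continuous vector to a single symbol by taking the argmax.
symbol = int(np.argmax(activations))

print(activations)  # real-valued vector
print(symbol)       # a discrete index into a symbol inventory
```

The open questions in the abstract concern whether such a readout step is enough, or whether genuinely symbolic structure requires different machinery.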
Materials · Letting Structure Emerge: Connectionist and Dynamical Systems Approaches to Cognition · Convolutional Neural Networks as a Model of the Visual System: Past, Present, and Future · On the Binding Problem in Artificial Neural Networks · Symbolic Behaviour in Artificial Intelligence
Bio Grace Lindsay is a computational neuroscientist using artificial neural networks to understand the visual system. She is also the author of the book Models of the Mind: How Physics, Engineering, and Mathematics have Shaped our Understanding of the Brain.
Are numbers a necessary feature of building up anything like mathematics or science? Or are they only a feature of the particular history of human mathematics and science? I'll discuss this question in the context of what I've learned from many years of exploring the computational universe of possible programs, as well as my experiences in computational language design, our recent Wolfram Physics Project, and its potential application to the foundations of metamathematics.
Materials · How Inevitable Is the Concept of Numbers? (Article written in connection with the session) · Showing Off to the Universe: Beacons for the Afterlife of Our Civilization · The Empirical Metamathematics of Euclid and Beyond · Systems Based on Numbers · Advance of the Data Civilization: A Timeline · A New Kind of Science · Finally We May Have a Path to the Fundamental Theory of Physics… and It’s Beautiful · The Wolfram Physics Project: A One-Year Update · What Is Consciousness? Some New Perspectives from Our Physics Project
Bio Stephen Wolfram is the creator of Mathematica, Wolfram|Alpha and the Wolfram Language; the author of A New Kind of Science; and the founder and CEO of Wolfram Research. Stephen has worked extensively on many fields of research including high energy physics, complexity theory, computer science and experimental mathematics.
Physics is formulated in terms of timeless axiomatic mathematics. However, time is essential in all our stories, in particular in physics. For example, to think of an event is to think of something in time. A formulation of physics based on intuitionism, a constructive form of mathematics built on time-evolving processes, would offer a perspective that is closer to our experience of physical reality and may help bridge the gap between static relativity and quantum indeterminacy. Historically, intuitionistic mathematics was introduced by L.E.J. Brouwer from a very subjectivist viewpoint, in which an idealized mathematician continually produces new information by solving conjectures. Here, in contrast, I’ll introduce intuitionism as an objective mathematics that incorporates a dynamical/creative time and an open future. Standard (classical) mathematics appears as the view from the “end of time”, and the usual real numbers appear as the hidden variables of classical physics. Similarly, determinism appears as indeterminism seen from the “end of time”. Relativity is often presented as incompatible with indeterminism. Hence, at the end of this presentation I’ll argue that these incompatibility arguments are based on unjustified assumptions and present the “relativity of indeterminacy”.
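The claim that "real numbers are the hidden variables of classical physics" has a standard numerical illustration (a sketch, not code from the talk): in a chaotic system, determinism consumes the digits of the initial condition, so a state specified with only finite information behaves indeterministically.

```python
# Fully chaotic logistic map x -> r*x*(1-x) with r = 4. Two states
# agreeing to 12 decimal places diverge macroscopically within ~50
# steps: the "extra" digits of an exact real initial condition are
# exactly the hidden information that would restore determinism.
r = 4.0
x, y = 0.3, 0.3 + 1e-12   # identical to 12 decimal places

max_gap = 0.0
for _ in range(100):
    x = r * x * (1.0 - x)
    y = r * y * (1.0 - y)
    max_gap = max(max_gap, abs(x - y))

print(max_gap)  # order-one divergence despite near-identical starts
```

On an intuitionist reading, those unrealized digits do not pre-exist; the future of the trajectory is genuinely open.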
Materials · Mathematical Intuitionism · Indeterminism in Physics, Classical Chaos and Bohmian Mechanics. Are Real Numbers Really Real? · Real Numbers are the Hidden Variables of Classical Mechanics · Physics without Determinism: Alternative Interpretations of Classical Physics · Mathematical Languages Shape our Understanding of Time in Physics · Indeterminism in Physics and Intuitionistic Mathematics · The Relativity of Indeterminacy
Bio Prof. Nicolas Gisin was born in Geneva, Switzerland, in 1952. He received his Ph.D. degree in theoretical physics from the University of Geneva in 1981. After a post-doc at the University of Rochester, NY, and four years in industry, he joined the Group of Applied Physics at the University of Geneva, where he has led the optics section since 1988. His activities range from the foundations of quantum physics to applications in quantum communications. In 2001 he co-founded the company IDQ, world leader in commercial quantum communications. He received two consecutive ERC Advanced Grants. In 2009 he was the first awardee of the John Stewart Bell Prize, and in 2014 he received the Swiss Science Prize awarded by the Marcel Benoist Foundation. In 2021 he joined the Schaffhausen Institute of Technology part time.
Numbers are perhaps the longest-lived cultural system the world has ever known, but we still do not know how old numbers might be, or where and why they might have emerged, or in what form. Investigating these matters is necessarily an inferential endeavor. However, archaeological evidence alone is likely insufficient. Insights from psychology, linguistics, and ethnography have implications for how we might examine and interpret the archaeological record for signs of prehistoric numeracy, perhaps even calling into question some of the assumptions and methods currently in use. Some of the most pressing issues are reviewed, and recommendations are offered for incorporating interdisciplinary data and insights into archaeological investigations and interpretations.
Bio I am interested in how societies become numerate and literate by using and modifying material forms over generations of collaborative effort, the effect this elaborational mechanism has on conceptual content, how material forms become increasingly refined to elicit specific behavioral and psychological responses, and what this might augur about the future of human cognition. I view cognition as embodied, embedded, extended, enacted, and evolving (5E). I have also written on how Neanderthal cognition differed from that of our ancestors, as well as the literary works of Jane Austen.
What is the origin of numbers? Some mathematicians have pointed to formal definitions and axiomatic systems, while other scholars have claimed that some numbers are God-given. Ultimately, these accounts do not provide answers that are consistent with what we know today about the natural world (which includes the human brain and mind). In the natural sciences, a widely accepted view in cognitive neuroscience, child psychology, and animal cognition posits that humans (and many nonhuman animals) have a biologically endowed capacity specific to number and arithmetic. However, data from various sources (humans from non-industrialized cultures, trained nonhuman animals in captivity, and the neuroscience of symbol processing in schooled participants) do not support this view. The use of loose and misleading technical terminology in the field of "numerical cognition" has facilitated the elaboration of teleological arguments which underlie the above view. To understand this, a crucial distinction between quantical and numerical cognition is necessary: biologically evolved preconditions (BEPs) for quantification do exist (quantical cognition), but the emergence of exact symbolic quantification and arithmetic proper (numerical cognition), which is absent in nonhuman animals, has materialized via human cultural preoccupations and practices that, supported by language and symbolic reference, are crucial dimensions lying largely outside natural selection. In this talk I’ll discuss the biological enculturation hypothesis, which attempts to explain the complex passage from quantical to numerical cognition in (some) humans, and in the process, gain insight into where numbers come from.
Bio Rafael Núñez is Professor at the Department of Cognitive Sciences of the University of California, San Diego. Born and raised in Chile, he obtained his doctoral degree in Switzerland, and completed his post-doctoral work at Stanford and UC Berkeley. He investigates the development and evolution of everyday and technical cognition (such as mathematics)—especially conceptual systems, symbolization, and abstraction—and their biologically enculturated underpinnings. His multidisciplinary approach uses methods such as psycholinguistic experiments, gesture studies, brain imaging, and field research with isolated indigenous groups. His 2001 best-selling book, Where Mathematics Comes From: How the Embodied Mind Brings Mathematics into Being (co-authored with UC Berkeley linguist George Lakoff), presents a new theoretical framework for understanding the human bio-cultural nature of mathematics and its foundations. Rafael Núñez is on the Scientific Advisory Board of the Australian Centre of Excellence for the Dynamics of Language, an active faculty member of the Center for Academic Research and Training in Anthropogeny (CARTA), devoted to promoting transdisciplinary research into human origins, and a fellow of the Wissenschaftskolleg zu Berlin. He is one of the four PIs of the recently awarded European Research Council Synergy Grant QUANTA, designed to investigate the bio-cultural evolution of quantification.
Any philosophical theory of mathematical knowledge needs to provide a suitable account of natural numbers. Despite their central role in our theoretical and practical life, definitions have varied significantly, depending on whether philosophers have conceived of naturals as ordinal or cardinal numbers, emphasized the role of pure or applied mathematics, conceived of the subject-matter of mathematics in terms of objects or structures (or nothing at all). Disconcerting as it may seem, such disagreement points to some crucial philosophical issues. We'll survey some of the major philosophies of arithmetic and their core definitions, and we'll try to understand why "What are (natural) numbers?" is still an open question in philosophy.
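The ordinal/cardinal divide the abstract mentions is visible already in how the number 2 gets defined. As a brief sketch of two classical set-theoretic proposals:

```latex
% As a von Neumann ordinal: each number is the set of its predecessors.
2 = \{0, 1\} = \{\varnothing, \{\varnothing\}\}
% As a Frege-Russell cardinal: the class of all two-element sets.
2 = \{\, X \mid \exists x\,\exists y\,(x \neq y \,\wedge\, X = \{x, y\})\,\}
```

Both candidates validate the arithmetic of 2, which is part of why (as Benacerraf famously argued) the question "which object *is* the number 2?" remains contested.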
Materials · The Conceptual Basis of Numerical Abilities: One-to-one Correspondence versus the Successor Relation · Real Numbers, Quantities, and Measurement · Definitions of Numbers and Their Applications (Abstractionism, pp. 333–348) · Frege’s Constraint and the Nature of Frege’s Foundational Program · Neologicism, Frege’s Constraint, and the Frege-Heck Condition · On the Philosophical Significance of Frege’s Constraint · Neo-Fregean Foundations for Real Analysis: Some Reflections on Frege’s Constraint
Bio Andrea Sereni is Associate Professor at the School for Advanced Studies IUSS Pavia. His teaching and research focus on epistemology, philosophy of mathematics, and philosophy of logic. He has published in international journals (among which Philosophia Mathematica, Synthese, Inquiry, and the Review of Symbolic Logic) on themes such as logicism and neo-logicism, Frege's philosophy, the indispensability argument, Frege's applicability constraint, platonism, Benacerraf's problem, mathematical explanation, and logical pluralism. He co-authored (with M. Panza) Plato's Problem: An Introduction to Mathematical Platonism (Palgrave, 2013). He co-edited (with D. Molinini and F. Pataut) the Synthese special issue Indispensability and Explanation (2016); (with F. Boccuni) Objectivity, Realism and Proof: FilMat Studies in the Philosophy of Mathematics (Springer, BSPHS, 2016); and (with F. Ferrari, N.J. Pedersen and S. Moruzzi) Inquiry's special issue Logical Pluralism and Normativity. He is co-editing (with F. Boccuni) Origins and Varieties of Logicism, forthcoming from Routledge. He coordinates the Italian Network for the Philosophy of Mathematics (FilMat).
Downward causation is the controversial idea that ‘higher’ levels of organization can causally influence behaviour at ‘lower’ levels of organization. Here I propose that we can gain traction on downward causation by being operational and examining how adaptive systems identify regularities in evolutionary or learning time and use these regularities to guide behaviour. I suggest that in many adaptive systems components collectively compute their macroscopic worlds through coarse-graining. I further suggest we move from simple feedback to downward causation when components tune behaviour in response to estimates of collectively computed macroscopic properties. I introduce a weak and strong notion of downward causation and discuss the role the strong form plays in the origins of new organizational levels. I illustrate these points with examples from the study of biological, social and artificial systems.
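The feedback loop sketched above, components coarse-graining their microscopic states into a macroscopic estimate and then tuning behaviour in response to it, can be caricatured in a few lines. This is a hypothetical toy model, not the speaker's formalism:

```python
import random

random.seed(1)

# Microscopic level: 100 binary components (e.g. individual decisions).
states = [random.choice([0, 1]) for _ in range(100)]

for _ in range(10):
    # Coarse-graining: components collectively compute a macroscopic
    # summary of their world, here just the population average.
    macro = sum(states) / len(states)
    # Downward influence: each component tunes its next state in
    # response to the collectively computed macroscopic estimate,
    # adopting state 1 with probability equal to that estimate.
    states = [1 if random.random() < macro else 0 for _ in states]

print(sum(states) / len(states))  # macroscopic state after feedback
```

In the abstract's terms, the interesting move is from this kind of simple feedback to cases where the macroscopic estimate is itself a slow, robust variable that constrains component behaviour, which is where the weak and strong notions of downward causation come apart.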
Materials · Downward Causation · Life's Information Hierarchy · Biology Breakthroughs - Information Theory of Individuality · Collective Computation of Adaptive Social Structure · Collective Computation in Animal Fission Fusion Dynamics · A Family of Algorithms for Computing Consensus about Node State from Network Data
Bio Jessica Flack is a professor at the Santa Fe Institute, director of SFI's Collective Computation Group (C4), chief editor of the new, transdisciplinary journal Collective Intelligence, and, previously, was founding director of University of Wisconsin-Madison's Center for Complexity and Collective Computation in the Wisconsin Institutes for Discovery. Flack is interested in the roles of information processing and collective computation in the emergence of robust but evolvable structure and function in biological and social systems. This work sits at the intersection of evolutionary theory, statistical mechanics, information theory, theoretical computer science, and cognitive science. Goals include identifying the computational principles that allow nature to overcome subjectivity due to information processing to produce ordered states, and accounting for the origins of space and time in biological systems. A central idea is that noisy information processors reduce uncertainty about the future by computing their macroscopic worlds through collective coarse-graining in evolutionary and/or learning time; in other words, that the appropriate aggregation of information accumulated by individuals making decisions under uncertainty can produce good collective forecasts. Flack's work has been covered in many publications and media outlets, including the BBC, NPR, Nature, Science, The Economist, New Scientist, Current Biology, The Atlantic, and Quanta Magazine. Flack also writes popular science articles on collective behavior and complexity science for magazines like Aeon. In 2020 her work with several collaborators, including Nihat Ay and David Krakauer, on the information theory of individuality was chosen as a science breakthrough of the year by Quanta Magazine.
Arithmetical competence plays an important role in most contemporary societies. Recent advances in philosophy of cognition and the cognitive sciences have contributed to a better understanding of the developmental trajectory of arithmetical cognition, leading from proto-arithmetical capacities (quantity approximation and subitising) to counting and arithmetical capacities. In the first part of this talk, I will argue that recent work on enculturation (e.g., Menary, 2015; Pantsar, 2019; Jones, 2020; Fabry & Pantsar, 2021) can help us provide an empirically plausible account of this trajectory. In the present context, the concept of ‘enculturation’ captures the transformation of individual cognitive capacities through the acquisition of culturally evolved, embodied arithmetical practices. In the second part of this talk, I will explore the following question: how can cases of dysculturation (developmental deficits in cognitive capacities) enrich our understanding of the relationship between capacities in proto-arithmetic, counting, and arithmetic? To this end, I will consider cases of dyscalculia, a developmental disorder that affects 3-7% of the general population and is associated with deficits in arithmetic capacities. I will discuss the implications of empirical work on this disorder for philosophical research on the ontogenetic development of arithmetical practices.
Bio Regina Fabry is a philosopher of mind and cognition. She is currently a postdoctoral research fellow in the Department of Philosophy II at Ruhr University Bochum. Her expertise lies in research on enculturation, predictive processing, and 4E cognition and she has published widely in these fields. Thematically, she is particularly interested in mathematical cognition, literacy, narrative practices, mental disorders, and mind-wandering. For more information, see http://www.reginaefabry.de/.
In this talk I will explain how computer scientists have managed to grow numbers in the lab, devoid of all the baggage which humans have attached to them over the last ten thousand years or so. The process of teaching a dumb machine what a number is will force us to think carefully ourselves about the nature of number. I will discuss how to convince the computer that 2 + 2 = 4. I will go on to explain how mathematicians are slowly beginning to learn how to use these synthetic numbers to do the kinds of things which they used to do on blackboards.
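To give a flavor of what "growing numbers in the lab" looks like, here is a sketch in the Lean theorem prover (an illustration in this spirit, not necessarily the exact code from the talk): the naturals are built from nothing but zero and a successor operation, addition is defined by recursion, and 2 + 2 = 4 is checked by the machine.

```lean
-- Natural numbers built from scratch: zero and successor.
inductive MyNat where
  | zero : MyNat
  | succ : MyNat → MyNat

open MyNat

-- Addition defined by recursion on the second argument.
def add : MyNat → MyNat → MyNat
  | a, zero   => a
  | a, succ b => succ (add a b)

-- 2 + 2 = 4: both sides unfold, step by step, to
-- succ (succ (succ (succ zero))), so the proof is by reflexivity.
example : add (succ (succ zero)) (succ (succ zero))
        = succ (succ (succ (succ zero))) := rfl
```

Nothing about the machine's `MyNat` presupposes decimal notation, counting words, or any of the other cultural baggage; the number is exactly what the two constructors and the recursion make it.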
Bio Kevin Buzzard is a professor of pure mathematics at Imperial College London. He is a number theorist by training, but recently has become interested in computer theorem provers.