Each of us has made the remarkable journey from a single cell (a quiescent oocyte) to a complex embodied mind. How do cells, which were once independent organisms, work together to pursue the anatomical and physiological goals that enable form and function to reliably self-assemble? In this talk, I will tell the story of the collective intelligence of cellular swarms that embodies William James' definition of intelligence: same ends by different means. I will describe the amazing competencies of the morphogenetic process that builds bodies and minds, and our discoveries on bioelectricity - the cognitive glue that implements embryogenesis, regeneration, and cancer suppression. I will end with a perspective on how biophysical, informational, and behavioral sciences are coming together to redefine the boundaries of the possible in biomedicine and beyond.
Materials · Darwin’s agential materials: evolutionary implications of multiscale competency in developmental biology · Regulative development as a model for origin of life and artificial life studies · Bioelectric networks: the cognitive glue enabling evolutionary scaling from physiology to mind · There’s Plenty of Room Right Here: Biological Systems as Evolved, Overloaded, Multi-Scale Machines · Endless forms most beautiful 2.0: teleonomy and the bioengineering of chimaeric and synthetic organisms · Competency in Navigating Arbitrary Spaces as an Invariant for Analyzing Cognition in Diverse Embodiments · Scales and Science Fiction with Biologist Michael Levin [video]
Roundtable debate in the School about academic-enterprise links. Our aim is to give a brief overview of the topic for anyone interested, and to help with any doubts or questions you might have about moving from an academic to a professional profile.
Bad incentives, outdated technology, persistent identifiers that don't persist, bureaucracy, bureaucracy, bureaucracy. Many things could be improved about the way we currently share knowledge. We've started building solutions for this problem space that I would love to show and get critical feedback on from everyone at SEMF! We, DeSci Labs, are building a new kind of preprint solution that integrates paper, data, and code into reproducible Nodes, with each component within a Node citable and importable via a few lines of Python code. This is not a preprint server in the sense that you must publish here: you can publish anywhere, and your paper will be linked back to your fully reproducible Node on DeSci Nodes. And this link won't break, because we use content-addressed storage to ensure persistence, creating a new persistent identifier for science that is, dare I say, better than the DOI! In this session I'll explain our tech, give a demo, and then happily engage in critical discussion and feedback! We are quite early in our development roadmap and are always looking for input from the scientific community to make sure we are building in line with your expectations and needs. Learn more about us at desci.com
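To make the persistence claim concrete, here is a minimal Python sketch of the content-addressing idea (illustrative only, not DeSci's actual code or API): because the identifier is computed from the bytes themselves, it cannot point at the wrong version of an artifact and cannot silently break the way a mutable URL can.

```python
import hashlib

def content_id(data: bytes) -> str:
    """Derive an identifier from the content itself (SHA-256 here;
    systems like IPFS use multihash-based CIDs)."""
    return hashlib.sha256(data).hexdigest()

paper = b"final_manuscript_v3 ..."
cid = content_id(paper)

# The same bytes always resolve to the same identifier, and any
# change to the bytes yields a different identifier.
assert content_id(paper) == cid
assert content_id(paper + b"!") != cid
```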
In this short course, which served as preamble to “Computation, Causality and Compositionality”, we introduce the notion of rewrite system as a common foundation of mathematical and computational practice as well as the notion of higher-order network. We present some basic ideas about hypergraphs and adjacency matrices. We conclude by introducing the general concept of symbolic hypergraph rewriting and recover category theory as a particular case.
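As a taste of the objects involved, here is a toy Python sketch (my own illustration, not the course's formalism): a hypergraph stored as a list of hyperedges, its incidence matrix, and a single symbolic rewrite step.

```python
# A hypergraph as a list of hyperedges (ordered tuples of vertices).
hypergraph = [(1, 2), (2, 3), (3, 1)]

def incidence_matrix(edges):
    """Rows are vertices, columns are hyperedges; an entry is 1
    when the vertex occurs in the hyperedge."""
    vertices = sorted({v for e in edges for v in e})
    return [[int(v in e) for e in edges] for v in vertices]

def rewrite_once(edges, fresh):
    """One step of the symbolic rule {(x, y)} -> {(x, y), (y, z)},
    with z a fresh vertex, applied to the first edge."""
    x, y = edges[0]
    return edges[1:] + [(x, y), (y, fresh)]

print(incidence_matrix(hypergraph))       # [[1, 0, 1], [1, 1, 0], [0, 1, 1]]
print(rewrite_once(hypergraph, fresh=4))  # [(2, 3), (3, 1), (1, 2), (2, 4)]
```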
In the 19th century, invisible local inertial frames replaced Newton’s equally invisible and unsatisfactory absolute time and space. Both could only be defined by the requirement that Newton’s laws hold in them. Einstein’s local freely-falling frames are defined analogously by the field equations of general relativity holding in them. Leibniz, a convinced relationist, said each body’s position is defined by its distances to all the other bodies in the universe. Space, he said, is the order of coexisting things and time their succession. But he failed the real challenge: to define change of position relationally. Mach did little better in the 19th century despite arguing persuasively for the emergence of local inertial frames within a relational universe. One problem was and is how to describe holistically a universe of infinitely many bodies. Newton’s absolute framework, the basis of reductionist science, allowed him to avoid this problem, which persists in field theory’s infinity of degrees of freedom. My aim in this talk is to remove every last trace of Newton’s absolutes in a minimal model with the simplest possible ontology: gravitating point particles, aiming to address the problem of infinity by allowing their number to be arbitrarily large. I adopt Poincaré’s proposal in Science and Hypothesis that a relational law of the universe should, like so many well established local laws, express causality and determinism through differential equations of the second order. The essential content of Newton’s 2nd law remains but with initial positions and their rates of change expressed relationally.
This mini-course explores storytelling, semiotic, and interpretational approaches to mathematics that uncover forgotten and oft-ignored literary aspects of the mathematical practice. Half of the course will investigate ideas in existing literature while the other half will present advances in ongoing research. This is both a lecture course and a discussion forum. Participants are not expected to be mathematicians, literary experts, or philosophers. Everyone is welcome: those who just wish to listen and those who would like to engage in the discussion. On a meta-level, the course is also meant to disseminate ideas that might take root in different guises across different fields.
A previous experimental study showed that neurons in the prefrontal cortex (PFC) resolve context-based competition between two sensory modalities through enhanced oscillatory activity at beta frequencies when the context and the sensory dominance were aligned. In contrast, no apparent difference at beta frequencies was found when context pointed to the non-dominant sensory modality, even though the subjects performed the task similarly well. In addition, pre-stimulus alpha-band activity emerged in non-dominant trials in the prefrontal neurons encoding the dominant dimension. Two critical yet unresolved questions arise from this study: first, if not the PFC, what region of the brain resolves non-dominant trials? And second, what is the role of alpha oscillations in the PFC, specifically present in dominant-selective neurons when these neurons do not carry the relevant information? Here, we build upon a previous computational model of corticostriatal processing to test the hypothesis that the conflict between context and dominance may be resolved downstream in the striatum. Results from our computational model show that pre-stimulus alpha inputs from cortex trigger a temporal asymmetry in the striatal microcircuit through short-term synaptic depression: on stimulus presentation, even though equally strong beta inputs target the dominant and the non-dominant striatal populations, their responses become transiently different since the two depart from different states. Dominant-selective striatal neurons depart from a highly active alpha-triggered regime and therefore are in a depressed state, whereas non-dominant-selective neurons depart from silence and hence are not depressed. This creates a window of opportunity to propagate the activation of the non-dominant response that is capable of resolving the conflict between dominance and context present in the PFC.
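The proposed mechanism can be illustrated with a toy simulation (a sketch of short-term depression in the Tsodyks-Markram style, not the authors' corticostriatal model): a pathway that has been firing under pre-stimulus alpha drive starts the stimulus period with depleted synaptic resources, so identical beta inputs are transiently transmitted more strongly by the silent, non-depressed pathway.

```python
import numpy as np

dt, tau_rec, U = 1e-3, 0.5, 0.3       # step (s), recovery time (s), release prob.
T = int(1.0 / dt)                     # simulate 1 s
t_stim = T // 2                       # identical "beta" drive from 500 ms

rates = {"dominant": np.zeros(T), "nondominant": np.zeros(T)}
rates["dominant"][:t_stim] = 20.0     # pre-stimulus alpha-driven firing
for r in rates.values():
    r[t_stim:] = 50.0                 # equal drive to both after onset

drive = {}
for name, r in rates.items():
    x, eff = 1.0, np.empty(T)         # x: fraction of available resources
    for i in range(T):
        eff[i] = U * x * r[i]         # effective transmitted drive
        x += dt * ((1.0 - x) / tau_rec - U * x * r[i])
    drive[name] = eff

# Just after onset the non-depressed (non-dominant) pathway transmits
# more strongly, opening the window of opportunity described above.
print(drive["nondominant"][t_stim] > drive["dominant"][t_stim])  # True
```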
While for a single particle the (classical) random walk on a generic graph is well understood, for a many-body system, where mutual interactions of the walkers play a significant role, the corresponding problem is much more involved. Field-theoretical formulation of many-body random walks offers a systematic approach that allows for an efficient description of interparticle interactions, and the ensuing non-linear collective behaviour. It is essentially inspired by the method of second quantization, which transforms a many-body quantum-mechanical system into a quantum field theory model. To this end creation and annihilation operators are introduced together with their commutation relations, and the system's dynamics is captured by evolution equations for operators governed by the "second-quantized" Hamiltonian, rather than by time evolution of many-particle probability distributions. The aim of this talk is to show that methods of quantum field theory can be employed to describe a classical system of interacting random walkers, with the prospect of taking advantage of the vast machinery of quantum field theory in a classical stochastic context.
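To fix notation with the simplest example (following the standard Doi-style presentation, as in the Baez-Biamonte text listed below): probabilities over occupation numbers are packaged into a state vector, and the master equation becomes a Schrödinger-like evolution generated by a "second-quantized" Hamiltonian.

```latex
\[
  |\psi(t)\rangle = \sum_{\mathbf n} P(\mathbf n, t)
      \prod_i (a_i^\dagger)^{n_i} |0\rangle ,
  \qquad [\,a_i , a_j^\dagger\,] = \delta_{ij},
\]
\[
  \frac{d}{dt}\,|\psi(t)\rangle = H\,|\psi(t)\rangle ,
  \qquad
  H = \sum_{i \to j} w_{ij}\,\bigl( a_j^\dagger a_i - a_i^\dagger a_i \bigr)
\]
% for independent walkers hopping from vertex i to j at rate w_ij;
% interactions between walkers add higher-order terms in the a_i and
% a_i^\dagger to H.
```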
Materials · Slides (unfinished) · Quantum Techniques for Stochastic Mechanics, John C. Baez and Jacob D. Biamonte
Entropy increases. Mechanical work irreversibly turns into heat. The Second Law of thermodynamics is considered one of the great general principles of physical science. But 150 years after it was first introduced, there’s still something deeply mysterious about the Second Law. It almost seems like it’s going to be “provably true”. But one never quite gets there; it always seems to need something extra. Sometimes textbooks will gloss over everything; sometimes they’ll give some kind of “common-sense-but-outside-of-physics argument”. But the mystery of the Second Law has never gone away. Why does the Second Law work? And does it even in fact always work, or is it actually sometimes violated? What does it really depend on? What would be needed to “prove it”? For me personally the quest to understand the Second Law has been no less than a 50-year story. But back in the 1980s, as I began to explore the computational universe of simple programs, I discovered a fundamental phenomenon that was immediately reminiscent of the Second Law. And in the 1990s I started to map out just how this phenomenon might finally be able to demystify the Second Law. But it is only now—with ideas that have emerged from our Physics Project—that I think I can pull all the pieces together and finally be able to construct a proper framework to explain why—and to what extent—the Second Law is true.
Ars Scientia is a new journal and emerging network focusing on the entwinement of scientific and artistic practice. Ars Scientia focuses on raw synthetic practices that mobilize figural and metaphorical thinking in scientific inquiry, and, conversely, formalizability and generalizability in artistic practice. Ars Scientia explores this artistic-scientific conjecture through essays, graphics and visual art. Ars Scientia is issued in an online format and is edited by Johanna Owen (US) and Mandus Ridefelt (SE). The launch will introduce the intention and thoughts behind the first issue, give an overview of the contributions, and feature two short readings by Jovana Maksic and Mandus Ridefelt. Contributors: Nolan Allen, Alex Boland, Carlo Carnevale, Sarah Chekfa, Cerme Ersalan, Cristian Hernandez, Petros Lales, Moselle K, Jovana Maksic, Amar Priganica, Amitai Romm, Chiara Salmini, Mert Yıkılmaz.
In this three-part interactive seminar, we will try to carve out the space for emotions in science using debates from the philosophy of science as a background for our inquiry. In the first session, we will discuss the questions of objectivity and rationality of science. In the second session, we will talk about the relevance of emotions to some scientific products and elements of practice. Finally, on the last day, we will discuss the possible impact of scientific emotions on scientists’ mental health.
The science I enjoy looks at things in new ways and challenges conventions. I have a particular fondness for showing that when somebody says something is impossible, the claim is often just a failure of imagination, and for reconciling 'apparently incompatible' views. For example, Richard Dawkins has explicitly stated the widely-held view that, aside from evolution by natural selection, no other naturally occurring mechanism of adaptation could possibly exist anywhere in the universe (hence Universal Darwinism). But I’ve shown that there is at least one other mechanism, 'natural induction', that is quite different, does not need natural selection, and can produce adaptation at least as good as natural selection. This course will explore such ideas.
Residuality theory is a revolutionary new theory of software design that aims to make it easier to design software systems for complex business environments. Residuality theory models software systems as interconnected residues - an alternative to component and process modeling that uses applied complexity science to make managing uncertainty a fundamental part of the design process.
Understanding consciousness has challenged humanity for as long as we have reflected on it. Besides philosophical, medical, biological, cognitive and neuroscientific perspectives, one approach has been to differentiate between “normal” consciousness and “altered” states of consciousness. This talk will give an overview of diverse altered states of consciousness, including those induced spontaneously, physiologically, by disease, psychologically and pharmacologically. We will explore different types of psychoactive substances with a focus on classic psychedelics and the unique experiences they induce.
A roundtable with researchers from different fields discussing current issues of scientific practice: fast science, publish-or-perish, the replication crisis, questionable research practices (QRPs) and many others. The aim is to gather their perspectives and experiences on these questions, and to explore potential steps to take in order to alleviate them. In the first session we will introduce ourselves and I will present the main questions that I would like to address during the debates. At this point, the participants can share their own experiences of the problems described. The second session will be focused on exchanging everyone's views on those questions in detail. Finally, in the third session we will sum up the course and discuss how the scientific panorama could be rid of these problems, attending to different types of possible solutions.
In this mini-series we introduce the simply typed lambda calculus, a venerable computational formalism almost one hundred years old, describe how it is used to model computer programs, and explore its connections to logic and category theory. We describe how types differ from sets, how typing judgments are inductively defined using inference rules, and how a typing judgment M : A can be understood as meaning both that M is a program (term/expression) of type A and that M is a proof of formula (proposition) A, a correspondence known as the Curry-Howard isomorphism. We describe the syntax and operational semantics of both the simply typed lambda calculus and the polymorphic lambda calculus, the latter of which is used in real-world compilers such as the GHC Haskell compiler. Additional topics covered include the categorical semantics of the simply typed lambda calculus in cartesian closed categories, Church encodings of natural numbers, lists, and other data types, as well as the basics of type inference (which is used in many real-world compilers). Ideal background: some familiarity with functional programming or logic.
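Church encodings, one of the listed topics, are easy to play with in any language with first-class functions. A small Python sketch (untyped, whereas the course works with typed calculi): the numeral n is the function that applies f to x exactly n times.

```python
# Church numerals: n = \f. \x. f (f (... (f x))) with n applications.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
plus = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mult = lambda m: lambda n: lambda f: m(n(f))

def to_int(n):
    """Decode a Church numeral by counting applications."""
    return n(lambda k: k + 1)(0)

two, three = succ(succ(zero)), succ(succ(succ(zero)))
print(to_int(plus(two)(three)))  # 5
print(to_int(mult(two)(three)))  # 6
```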
Materials · Introduction to Type Theory, Herman Geuvers (slide version 1/3) · Introduction to Type Theory, Herman Geuvers (slide version 2/3) · Introduction to Type Theory, Herman Geuvers (slide version 3/3) · Introduction to Type Theory, Herman Geuvers (long form text version) · Supplementary material for category-theoretic connections

Effective interoperation between multiple scientific disciplines is crucial to systems engineering. Can the study of interoperability---the working negotiations and hand-offs between theories and models---itself be made into a hard science? Hard sciences are based on mathematics, so this would require a mathematics of interoperability, a mathematics whose subject consists of the bridges and analogies that make data- and model-integration actually work. I propose that category theory serves this purpose exceptionally well. In this talk, I will give evidence for the above claim, without assuming the audience has seen any category theory before. I will focus on operads, which offer a framework for various forms of compositionality. In particular, I will discuss how operads model the interconnection of dynamical systems, provide a new method for solving systems of nonlinear equations, and explain how these two issues are connected category-theoretically. Finally, I'll explain how all this fits into a larger mathematical approach to interdisciplinarity.
Rewriting systems can be represented/visualised as multiway systems, which can in turn be represented/visualised as directed graphs (with each vertex being a “state” and each edge being an “event”). Rewriting systems can also be equipped with a causal structure, wherein one decomposes each state into “tokens”, and one looks at which tokens are “destroyed” vs. which tokens are “created” by each event. If the tokens created by one event intersect with the tokens destroyed by another, then those events are “causally related”. Those causal relationships can be represented as directed edges between events in a causal graph. The first part of the course will deal with multicomputational irreducibility from a functorial perspective, specifically in relation to how we formulate multiway systems, (multi)computation and related concepts in a compositional way in terms of symmetric monoidal categories. The notions of causal structure that I’ll be using will show the relationship between causality, rewriting and category theory.
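The token-based causal criterion is concrete enough to run. A toy Python sketch (my illustration, not Wolfram-model code): a sequential string rewriting run in which each character occurrence is a token with a unique id, and an event e1 causally precedes e2 whenever a token created by e1 is destroyed by e2.

```python
def run(state, steps):
    """Apply the rule "AB" -> "BA" repeatedly, recording for each
    event the token ids it destroys and creates."""
    next_id = len(state)
    tokens = list(range(len(state)))       # token ids, left to right
    chars = dict(zip(tokens, state))       # id -> character
    events = []                            # (destroyed, created)
    for _ in range(steps):
        for i in range(len(tokens) - 1):
            a, b = tokens[i], tokens[i + 1]
            if chars[a] + chars[b] == "AB":
                new_b, new_a = next_id, next_id + 1
                next_id += 2
                chars[new_b], chars[new_a] = "B", "A"
                tokens[i:i + 2] = [new_b, new_a]
                events.append((frozenset({a, b}),
                               frozenset({new_b, new_a})))
                break
    return events

events = run("AAB", steps=3)

# Causal edge e1 -> e2 iff a token created by e1 is destroyed by e2.
edges = [(i, j) for i, (_, c1) in enumerate(events)
                for j, (d2, _) in enumerate(events) if c1 & d2]
print(edges)  # [(0, 1)]: the first rewrite enables the second
```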
Materials · A Functorial Perspective on (Multi)computational Irreducibility · Fast Automated Reasoning over String Diagrams using Multiway Causal Structure
The central concern of computational complexity theory is the minimal "resource costs" needed to perform a given computation on a given type of computer. In the real world, some of the most important resource costs of performing a computation are thermodynamic, e.g., the amount of heat it produces. In this talk I will summarize recent results on how thermodynamic resource costs depend on the computation being performed and the computer being used to perform it. I will start with some new results concerning the thermodynamic costs of performing a given computation in a (loop-free and branch-free) digital circuit. Next I will summarize some results concerning deterministic finite automata (DFA). After that I will review results on how considering the minimal entropy production (EP) of computing a desired output on a TM, rather than the minimal size of an input string that causes the TM to produce that output (i.e., the output's Kolmogorov complexity), results in a correction term to Kolmogorov complexity. I will end by describing the vast new set of research issues at the intersection of stochastic thermodynamics and computer science theory, issues that expand both fields.
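Two quantitative anchors for this line of work, stated with the caveat that the talk's own results go well beyond them: Landauer's bound on the heat cost of erasure, and the "mismatch cost" result of Kolchinsky and Wolpert relating extra entropy production to the gap between the input distribution a device was optimized for and the one it actually receives.

```latex
% Landauer: erasing one bit into a bath at temperature T dissipates
\[
  Q \;\ge\; k_B T \ln 2 .
\]
% Mismatch cost: a device thermodynamically optimal for input
% distribution q, run on actual inputs p, pays extra entropy
% production equal to the drop in relative entropy under the
% dynamics \Lambda:
\[
  \Delta S_{\mathrm{EP}} \;=\; D(p \,\|\, q) - D(\Lambda p \,\|\, \Lambda q).
\]
```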
Lysergic acid diethylamide (LSD) has had a turbulent history since its discovery in 1938. It was synthesized in the search for a circulatory stimulant, and its extraordinary effects on subjective perception were discovered only by accident five years later. In the decades that followed, LSD was used as a psychosis model by psychiatrists, for creative inspiration by artists, as a political symbol by the counterculture and as a psychotherapeutic adjunct by therapists. Yet its effects remain little explored by modern science. This talk will present Brazil's first randomized, placebo-controlled, double-blind study on LSD. It will show the effects of a low to moderate dose of LSD on human perception, cognition and behaviour. We will focus on creativity and hypnosis-related techniques and include practical exercises for self-experience.
The workshop is based around the question of how to locate and render movable the interplay between analytic and aesthetic modes of thinking in scientific inquiry and artistic practice. The workshop will start with a few thought-provoking examples (scientific fraud, underdetermination, ethics) from Mandus Ridefelt and Johanna Owen on how to make this space of parallelism tangible and why operationalizing this intertwinement is a core issue of the contemporary. By posing a set of questions and prompts we will attempt to identify, characterize and taxonomize different situations in the participants' disciplines (or beyond) where these concerns are core determinants. The goal of the workshop is to collectively develop methods and awareness around the ways in which analytic and aesthetic concerns co-constitute the ethical motion of knowledge itself. This might be too grand, but let’s give it a shot.
The first part of the talk will present the team behind Holon Labs; we will then describe our motivation and work to develop a new type of computer: the collective computer, a critical technology for the harmony of our future. We will describe how our industrial backgrounds brought us to the ideas behind, and commitment to, this effort, and the function and design approach we are following to materialize it, including our criticism of the fixation on intelligence as a target for technology and our approach to engineering computers and technology in general. The talk will conclude by describing the challenges of even attempting to build a collective computer like this one within our existing socio-economic systems of organization, research, liability and governance.
Classical mechanics and quantum mechanics are both "driven by energy", in the sense that one has a free choice of Hamiltonian and then the dynamics is a function of that. Statistical mechanics, on the other hand, is "driven by entropy", in that equilibria and the distributions of ensembles are derived from the entropy maximization principle. In the real world, however, systems behave in both ways; they are driven both by energy and by entropy. How can we model this within mathematical physics? In this talk, I will dig into this question in more detail and discuss some partial answers.
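The contrast can be stated in two standard formulas: energy-driven dynamics follows from a choice of Hamiltonian, while entropy-driven equilibria follow from maximizing entropy at fixed mean energy.

```latex
% Energy-driven: choose H, and the dynamics is determined,
\[
  \dot q = \frac{\partial H}{\partial p}, \qquad
  \dot p = -\frac{\partial H}{\partial q}
  \qquad \text{(or } i\hbar\,\partial_t \psi = \hat H \psi\text{)}.
\]
% Entropy-driven: maximize S = -k_B \sum_i p_i \ln p_i at fixed
% mean energy, which yields the Gibbs ensemble
\[
  p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}.
\]
```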
I will introduce Shannon’s entropy in a classical setting, using biased coins as an example. With a toy model, I will remind us how coarse-graining generates uncertainty (entropy), as in thermodynamics, and how such uncertainty satisfies first-law-like relations. In the second part, I will introduce the axioms of quantum mechanics and show how the notion of a reduced density matrix captures the concept of coarse-graining introduced earlier. The role of entanglement will be stressed in this part. In the third part, we introduce locality in order to describe quantum field theory. We will derive the Unruh effect as a canonical example of how coarse-graining in quantum field theory can generate thermal behaviour. Time permitting, we shall close with black holes. We will connect the first law they satisfy with covariant phase space methods and interpret the result in terms of asymptotic observers. We will briefly mention Jacobson’s derivation of Einstein’s equations based on the Clausius relation and the Unruh effect. We will mention how AdS/CFT captures all these results and, very tangentially, which directions are currently being explored.
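The opening example in a formula: a coin with bias p carries Shannon entropy

```latex
\[
  H(p) = -\,p \log_2 p \;-\; (1 - p)\log_2(1 - p),
\]
% zero bits for a deterministic coin (p = 0 or 1), one bit at p = 1/2.
```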
Joel Dietz presents on interdisciplinary approaches to math, physics, and music composition, chronicling various attempts by composers to represent spacetime and the current state of the art of the field, including the usage and results of software that allows transmedia encoding (i.e. the transfer of content from one media type to another). A lecture will be given and various results will be shown in a video reel and in a live audio-reactive VJing setup.
From a physics perspective, the brain is an ideal example of a complex system, and certainly the most intriguing one. Employing concepts from physics and information theory, we analyze functional brain connectivity both in the ordinary state and under the influence of a psychedelic brew from an Amazonian indigenous culture. I will describe our journey in developing tools to decode this data and discuss how these studies can contribute to our understanding of the human brain and other complex systems.
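For a concrete starting point, one standard (and deliberately simple) functional-connectivity estimate is the matrix of pairwise correlations between channel time series; the study's actual information-theoretic tools are considerably richer than this. A minimal Python sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_samples = 8, 1000
common = rng.standard_normal(n_samples)   # shared drive across channels
signals = 0.5 * common + rng.standard_normal((n_channels, n_samples))

fc = np.corrcoef(signals)                 # channels x channels matrix
np.fill_diagonal(fc, 0.0)

# A crude summary of network integration: mean pairwise correlation.
mean_fc = fc.sum() / (n_channels * (n_channels - 1))
print("mean off-diagonal correlation:", round(mean_fc, 3))
```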
Carl Jung’s influence on popular culture, psychology, and religious studies is without comparison. The 2009 publication of his visionary journal Liber Novus (‘The Red Book’) has given scholars a new understanding of the role of altered states of consciousness in his works, underscoring his relevance to ongoing discussions of nonordinary states of consciousness. In this talk, I explore the influence of visionary experiences on Jung's life and work. I analyse the relevance of The Red Book and later works for interpreting and integrating altered states of consciousness. I also situate Jung's outlook in the discourse on psychedelic substances during the 1960s. Drawing on relevant literature, I argue that in exploring visionary realms Jung's writings bridge the gap between naturally occurring altered states and those induced by substances. His example sets altered states into a context which promotes deeper examination of their meaning and relevance in the life of the individual. Furthermore, his outlook emphasizes the role of personal mythology, symbols and art as means of navigating the unconscious. Jung regarded exceptional experiences, which he called 'confrontations with the unconscious', as a vital tool for regulating the psyche, but also warned against the dangers of disillusionment. Through detailed examination of his dreams, visions, and intuitions Jung provides a holistic and soulful perspective on the current discussion of psychedelic experiences. His approach makes central the question of the inherent needs of each person, and the responsibility that encounters with the unconscious bring about.
In this talk we describe a category-theoretic approach to data representation, migration, and integration, contrast it to the conventional relational approach, and provide examples of applying the categorical approach to real-world problems in industry. We describe how schemas are defined as co-presheaves, databases as set-valued functors, and queries as profunctors. We describe how queries may be both evaluated and co-evaluated, how natural transformations can be understood as data integrity constraints, and how the Grothendieck (category of elements) construction can be used to convert between schema and data. We describe the bi-cartesian closed structure of the category of categories and the topos structure of co-presheaves on a category, and use this structure to give a schema and database-centric semantics to the simply typed lambda calculus, and describe how this structure supports schema and data integration through co-limits. We conclude with a brief description of the open-source CQL tool available at categoricaldata.net and describe some real-world uses of CQL. Ideal background: some familiarity with databases and category theory.
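The slogan "a database is a set-valued functor" can be made concrete in a few lines (a sketch of the idea only, not CQL code): a schema is a category presented by a graph, an instance assigns a set to each object and a function to each arrow, and functoriality is exactly data integrity.

```python
# Schema: two objects and two arrows (a graph presenting a category).
schema_arrows = {"worksIn": ("Employee", "Department"),
                 "manager": ("Employee", "Employee")}

# Instance: a set per object, a function (dict) per arrow.
instance_sets = {"Employee": {"alice", "bob"},
                 "Department": {"math", "cs"}}
instance_funcs = {"worksIn": {"alice": "math", "bob": "cs"},
                  "manager": {"alice": "alice", "bob": "alice"}}

# Integrity check: each arrow must be a total function landing in
# the target set -- the categorical face of foreign-key constraints.
for arrow, (src, tgt) in schema_arrows.items():
    f = instance_funcs[arrow]
    assert set(f) == instance_sets[src], f"{arrow} not total"
    assert set(f.values()) <= instance_sets[tgt], f"{arrow} ill-typed"
print("instance is a well-defined set-valued functor")
```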
Materials · Slides
Way-making is a philosophy of cognition that applies across species, disciplines, and scales. It posits cognition as making way through encounters. I use ‘encounter’ because way-making is movement in all realms (mental, physical, virtual, emotional, etc.), and each path is continuous. In this theory, the ways humans and other beings move (or are moved) such as walking, swimming, and crawling are taken as first-order cognition: they are not taken as ‘thought’ or ‘mind’; they are understood as the ‘ways we make’ which can scale into this form of cognition we call ‘thought’ or ‘mind’. Forms of cognition we call ‘thought’ and ‘memory’ are understood as means of finding our way through encountered sensory regularities, along paths, just as any other form of way-making (such as walking, swimming, etc.). Hippocampal research (or more specifically, research on the hippocampal formation and entorhinal cortex) has expanded greatly over the past century and now shows us how knowledge acquisition, memory, and spatiotemporal navigation might be understood as a common process. Likewise, there are new ways of looking at navigation within biology (the work of Michael Levin being the essential example) which also open towards a new understanding of cognition. I am formulating these findings into a general framework that can be used heuristically to understand cognition as a common process across species, and that alleviates traditional philosophical dichotomies. Complexity science and recent developments in mathematics and the coding community seem to offer ways of modeling and visualizing this new philosophy of cognition. This talk presents Way-making and asks for your ideas and input (especially relative to hypernetworks, hypergraphs, the ruliad, etc.) about how we might model a new take on cognition, so as to more clearly visualize its nested, multi-scale, multi-dimensional process.
Materials · Scales and Science Fiction with Biologist Michael Levin [video]
Meditation is a set of practices that have recently been popularised in scientific thought, despite their intrinsic existence in most of the world's cultures. Current empirical evidence seems to point to what age-old wisdom has been telling us for millennia: meditation and mindfulness are key aspects of a happy and fulfilling life. Still, they can both seem difficult and inaccessible to most people, due to their association with ascetic practices. This lecture explores mindfulness and meditation through an interdisciplinary lens, focusing on neuroscience, psychology, theology, philosophy, anthropology, physiology and genetics! The aim here is to give a holistic approach to defining the concepts of meditation and mindfulness, explore recent scientific findings and paradigms, and offer the philosophical grounding of these habits. Hopefully, this talk should motivate people and give them the necessary tools to explore this life-changing habit!
The theory of centripetal (inwards) and centrifugal (outwards) force has been applied to the ‘self’ by Erik Davis and to psychedelic praxis by Christopher Partridge. I claim that these terms also apply to two types of mysticism in the 1960s. This paper explores the dissonances and resonances, the contradictions and complementarities, between two mystics of the 1960s: the activist, friend of Martin Luther King, and scholar of Judaism, Abraham Joshua Heschel; and the Buddhist populariser, once-Anglican-chaplain, and hippy paragon, Alan Watts. First I will examine the social theory behind centripetality and centrifugality, and then I will engage in an analysis of these two types of mysticism to produce a comparative philosophy. Insofar as these two mysticisms are polar, they are dialectic and complementary. In the wake of recent studies in the anthropology of human politics (by Mark Fisher and David Graeber) I suggest that, if mystical experiences can empower and fuel praxis, a one-sided diet of mystical language and interpretation can and has hampered the political potential – a potential which is far from inevitable – of psychedelic states of consciousness. Opening mysticism to a way of being in between extreme heights of experience, via Heschel, provides empowering insights for what is to be enacted from experience.