How do living systems—ranging from simple bacteria to human brains—maintain their structure and identity in the face of constant change? How can we unify physics, machine learning, and biology under a single, coherent mathematical framework? These questions lie at the heart of Bayesian mechanics, a new and rapidly growing field that treats living organisms as statistical machines operating in a probabilistic, ever-changing universe.
In a recent video, “Engineering Explained: Bayesian Mechanics,” Sanjeev Namjoshi (Senior Machine Learning Engineer at Kung Fu AI) introduced these concepts using the lens of stochastic differential equations, Markov blankets, and active inference. This blog post expands on that video transcript, diving deeper into each concept with added context, real-world examples, historical background, and a discussion of how “entropic waste” can erode the structural integrity of living systems unless they employ Bayesian strategies to maintain order.
We’ll start by exploring the big question: What does it mean for an organism to exist? Then we’ll move through Bayesian mechanics, the non-equilibrium steady state (NESS) density, the free energy principle, and active inference—a highly general approach to modeling agent-environment interactions. We’ll also see how these ideas challenge the simpler views of “life as purely biochemical” by revealing information-theoretic and probabilistic underpinnings that unify biology, physics, and machine learning.
By the end of this post, you’ll see how Bayesian mechanics might be key to understanding not just brain function but also the fundamental nature of living systems, bridging the gap between Erwin Schrödinger’s famous question, “What is Life?” and modern computational neuroscience. Whether you’re a student, researcher, or just curious about the science behind life and intelligence, these ideas offer a fascinating roadmap for next-generation thinking in AI and theoretical biology.
Life as a Statistical Phenomenon
Existence Against Entropy
From the perspective of thermodynamics, nature loves entropy: left alone, particles spread out, energy disperses, and systems move toward greater disorder. Yet, living organisms defy this entropic push by maintaining their highly ordered structures over time: cells have membranes, tissues have specialized functions, brains orchestrate complex neural activity—none of which would persist if the system passively succumbed to random fluctuations.
But how do organisms continually resist decay? They do so by taking in energy and using it to rebuild, reorganize, and maintain their structural identity. This phenomenon ties into “entropic waste,” the notion that the environment is constantly imposing noise, fluctuations, and dissipative forces on living systems. If an organism fails to manage these forces effectively, it can drift into disorganized states, losing the boundary that distinguishes “self” from “not-self.”
Probabilistic Boundaries and Steady States
A living organism typically favors certain internal configurations (your body temperature, the arrangement of your organs, your cellular composition) out of a vast space of other possible states. This stable set of “preferred states” can be described mathematically as a non-equilibrium steady-state (NESS) density. If you graphed every possible arrangement of your body’s molecules, you’d see that the organism “spends most of its time” in configurations consistent with being alive.
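To make this a little more concrete, here is a minimal sketch in standard notation (the symbols below are chosen for illustration and are not from the video): if p(x) denotes the NESS density over possible configurations x, then persisting as an organism amounts to occupying states with low surprisal under that density.

```latex
% NESS density p(x) over possible configurations x of the system.
% Surprisal (self-information) of a particular configuration:
\[ \mathfrak{I}(x) = -\ln p(x) \]
% Average surprisal is the entropy of the state distribution:
\[ H[p] = \mathbb{E}_{p(x)}\big[\mathfrak{I}(x)\big] = -\int p(x)\,\ln p(x)\,dx \]
% An organism that persists spends most of its time in low-surprisal
% configurations, keeping the entropy of its characteristic states bounded
% rather than letting it grow without limit.
```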
Crucially, organisms are open systems, exchanging matter and energy with the environment. This exchange must balance the entropic forces that push them toward disintegration. Bayesian mechanics aims to formalize this balancing act: it looks at how a system remains in a steady-state distribution in spite of constant entropic pressures from the outside.
Bayesian Mechanics: A Primer
The Road from Brain Imaging to Statistical Physics
Bayesian mechanics is surprisingly recent—the term first appeared in 2019—yet it draws on decades of research bridging statistical physics, machine learning, and neuroscience. The seeds of Bayesian mechanics can be traced to the development of neuroimaging software like SPM (Statistical Parametric Mapping) and DCM (Dynamic Causal Modeling), pioneered by Karl Friston and colleagues. Initially, these tools were built for analyzing fMRI data, but they introduced powerful Bayesian statistical methods for understanding dynamical systems.
Soon, Friston’s group realized these same Bayesian and stochastic approaches could be extended beyond brain imaging to the brain’s function itself: perceiving, predicting, and acting in an uncertain world. Over time, the circle widened to capture any self-organizing system that maintains a steady state, leading to the concept of Bayesian mechanics for living organisms in general.
Markov Blankets: Partitioning Internal and External States
A central notion is that living systems maintain a kind of “statistical boundary” between themselves and the environment, known as a Markov blanket. Rather than a purely physical barrier, a Markov blanket is probabilistic: it ensures that internal states (within the organism) and external states (in the environment) can be treated as conditionally independent once you account for the states in the blanket.
By conditional independence, we mean that changes in the organism’s internal states do not directly affect external states except through the blanket, and vice versa. This blanket is further split into sensory states (absorbing signals from outside) and active states (acting upon the outside world). The synergy of these states allows the organism to remain in a non-equilibrium steady state by interpreting environmental signals and adjusting its actions to preserve its identity.
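Written out formally (a minimal sketch using the notation that appears later in this post: internal states μ, external states η, and blanket states b composed of sensory states s and active states a):

```latex
% Markov blanket condition: internal states (mu) and external states (eta)
% are conditionally independent given the blanket states b:
\[ p(\mu, \eta \mid b) = p(\mu \mid b)\, p(\eta \mid b), \qquad b = (s, a) \]
% where s are sensory states (the route by which the environment influences
% the organism) and a are active states (the route by which the organism
% influences the environment).
```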
The Helmholtz Decomposition and System Flows
Bayesian mechanics formalizes this interplay using stochastic differential equations. For a system at steady state, the flow of states can be split, via the Helmholtz decomposition, into a curl-free (gradient, dissipative) component and a divergence-free (solenoidal) component. The environment exerts randomizing forces that tend to disperse the organism’s states; the gradient flow counters this dispersion by pulling states back toward high-probability configurations, while the solenoidal flow circulates along contours of the steady-state density, so the system keeps “steering” through its characteristic states.
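For readers who want the equation, here is the simplest version of that decomposition, assuming a constant noise amplitude Γ and a constant antisymmetric matrix Q (state-dependent versions add correction terms; this is a sketch, not the general result):

```latex
% Langevin dynamics:  dx = f(x) dt + omega,  with fluctuation covariance 2*Gamma.
% At the non-equilibrium steady state with density p(x), the flow decomposes as
\[
  f(x) = \underbrace{\Gamma\,\nabla \ln p(x)}_{\text{gradient (dissipative) flow}}
       + \underbrace{Q\,\nabla \ln p(x)}_{\text{solenoidal flow}},
  \qquad \Gamma = \Gamma^{\top} \succeq 0, \quad Q = -Q^{\top}
\]
% The gradient term pulls trajectories back toward high-probability regions,
% countering the dispersion caused by the fluctuations; the solenoidal term
% circulates along isocontours of p(x) without changing how probable the
% visited states are.
```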
If you imagine a marble on the edge of a basin, it tries to roll down and out, but the living system (with energy input and internal regulation) keeps pushing it back into that basin of “preferred configurations.” Statistically, this pushback can be understood as Bayesian updating—the system updating its internal model to remain viable.
Non-Equilibrium Steady States: Why Organisms Resist Entropic Waste
Entropic Waste and the Drive to Disorder
As mentioned, “entropic waste” is a concept describing how random fluctuations, noise, and dissipative processes break down the structural order of living things. Without constant corrections, these entropic forces would scatter the organism’s molecules, destroying the carefully tuned patterns that define life.
Bayesian mechanics suggests each living system actively counters entropic waste by minimizing the probability of being thrown into disorganized states. This is effectively a survival strategy coded in the system’s probabilistic dynamics, ensuring it remains in the steady-state distribution that matches “being alive.”
The Probability of Survival
In purely physical terms, a living system that stays within a stable set of states appears to defy the naive expectation that it should disperse toward maximum entropy. It does not violate the second law of thermodynamics, however; it harnesses energy from the environment and exports entropic waste to its surroundings, sustaining a local decrease in entropy at the cost of a larger increase outside.
Mathematically, the system “fights” to keep its Markov blanket intact, employing internal processes that “predict and adapt.” A better predictor yields fewer entropic leaks and thus less chance of crossing the boundary into a lethal mismatch with the environment.
The Free Energy Principle and Active Inference
Free Energy as a Statistical Measure
Variational free energy (often just called free energy in these contexts) is a quantity borrowed from machine learning and Bayesian statistics. A system that minimizes its free energy is effectively maximizing the evidence for its internal model of the world. Because free energy is an upper bound on surprisal (the negative log-probability of sensory data under the model), minimizing it also keeps the system’s predictions about incoming sensory data accurate.
In simpler machine learning terms, negative free energy is akin to a “model evidence” measure. The more the system’s model accurately predicts real data, the higher the evidence, the lower the free energy.
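In equations (a standard sketch with generic notation: observations o, hidden states s, a generative model p(o, s), and an approximate posterior q(s)):

```latex
% Variational free energy for a generative model p(o, s) and an
% approximate posterior q(s):
\[
  F[q] = \mathbb{E}_{q(s)}\big[\ln q(s) - \ln p(o, s)\big]
       = \underbrace{D_{\mathrm{KL}}\big[\,q(s) \,\Vert\, p(s \mid o)\,\big]}_{\geq\, 0}
       - \underbrace{\ln p(o)}_{\text{log model evidence}}
\]
% Because the KL divergence is non-negative, F is an upper bound on the
% surprisal -ln p(o): minimizing F tightens the posterior approximation and,
% implicitly, maximizes model evidence.
```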
Active Inference: Action, Perception, and Prediction
Active inference extends this principle to real-time decision-making. The system doesn’t just passively guess about the environment but also acts to shape environmental conditions. In the transcript, Dr. Namjoshi describes two complementary processes (sketched in a toy example after this list):
- Perception: Internal states guess the external environment, updating these guesses from sensory input.
- Action: The system changes the environment so it aligns with the internal model’s predictions.
This dual process has no explicit “reward function,” unlike reinforcement learning. Instead, action is framed as prediction error minimization—the system “hypothesizes” that the environment should be in a certain state, and it does what it can to realize that.
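As a toy illustration of this loop (a minimal sketch, not code from the video; the set point, precisions, and learning rates below are arbitrary illustrative choices), here is a one-dimensional agent that updates its belief through perception while simultaneously acting on a hidden variable so that observations come to match its prediction, with the prediction itself anchored to a preferred value rather than a reward:

```python
import numpy as np

rng = np.random.default_rng(0)

# Agent's generative model (all names and numbers are illustrative choices)
mu_prior = 37.0     # preferred / expected hidden state, e.g. a set point
sigma_o = 1.0       # assumed observation noise (likelihood standard deviation)
sigma_p = 1.0       # prior standard deviation around the set point
k_percept = 0.1     # learning rate for belief updates (perception)
k_act = 0.1         # learning rate for action

# Environment: a hidden state the agent only sees through noisy observations
x_true = 30.0
mu = 32.0           # agent's current belief about the hidden state

for t in range(300):
    o = x_true + rng.normal(0.0, 0.5)        # noisy observation

    # Precision-weighted prediction errors under a Gaussian model
    eps_o = (o - mu) / sigma_o**2            # sensory prediction error
    eps_p = (mu_prior - mu) / sigma_p**2     # prior prediction error

    # Perception: gradient descent on free energy with respect to the belief
    # F = 0.5*[(o - mu)^2 / sigma_o^2 + (mu - mu_prior)^2 / sigma_p^2] + const
    mu += k_percept * (eps_o + eps_p)

    # Action: change the world so that observations match the prediction
    # (gradient descent on F with respect to the action's effect on o)
    x_true += k_act * (mu - o)

print(f"belief mu = {mu:.2f}, hidden state = {x_true:.2f}, preference = {mu_prior}")
```

Note that nothing in this loop is a reward signal: the preference enters only as a prior (mu_prior), and both perception and action descend the same free-energy-like quantity.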
Planning with Expected Free Energy
When an organism looks ahead, planning multiple steps into the future, it calculates expected free energy, effectively asking “which sequence of actions is most likely to keep me in my preferred states?” By minimizing expected free energy, the system chooses a path that reduces surprise across potential future scenarios, effectively extending homeostatic or allostatic regulation over time.
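A common way to write this quantity (a sketch in the standard discrete-time notation, where π denotes a policy, i.e. a sequence of actions, and τ indexes future time steps):

```latex
% Expected free energy of a policy \pi (a sequence of actions) at a
% future time step \tau:
\[
  G(\pi, \tau) =
    \underbrace{D_{\mathrm{KL}}\big[\,q(o_\tau \mid \pi) \,\Vert\, p(o_\tau)\,\big]}_{\text{risk: divergence from preferred outcomes}}
  + \underbrace{\mathbb{E}_{q(s_\tau \mid \pi)}\big[\,\mathrm{H}[\,p(o_\tau \mid s_\tau)\,]\,\big]}_{\text{ambiguity: expected observation uncertainty}}
\]
% Policies are scored by G(\pi) = \sum_\tau G(\pi, \tau); action selection
% favors low expected free energy, e.g. q(\pi) \propto \exp(-G(\pi)).
```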
The Brain as a Bayesian Machine
Hierarchical Models in the Cortex
Neuroscientists suggest the neocortex is organized in hierarchical layers, each sending predictions down to the layer below and receiving prediction errors back up. This architecture is a predictive coding system, a practical realization of Bayesian updating. The cortical columns adjust firing rates to correct mismatches between the predicted signals and the actual sensory input.
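To show the flavor of such an update scheme, here is a toy two-level linear predictive coding network (a minimal sketch under strong simplifying assumptions: fixed, known weights, unit-precision Gaussian errors, and plain gradient descent; it illustrates the math, not a model of actual cortex):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy two-level linear predictive coding network (illustrative only).
# Generative model, assumed known: o ~ W1 @ mu1,   mu1 ~ W2 @ mu2
W1 = rng.normal(size=(4, 3))   # top-down weights: level 1 -> sensory layer
W2 = rng.normal(size=(3, 2))   # top-down weights: level 2 -> level 1
lr = 0.05                      # inference (belief-update) rate

# Sensory input generated from some "true" hidden causes
true_mu2 = np.array([1.0, -0.5])
o = W1 @ (W2 @ true_mu2) + rng.normal(0.0, 0.01, size=4)

# Beliefs at each level, initialised at zero
mu1 = np.zeros(3)
mu2 = np.zeros(2)

for _ in range(2000):
    eps1 = o - W1 @ mu1        # prediction error at the sensory layer
    eps2 = mu1 - W2 @ mu2      # prediction error between levels 1 and 2

    # Gradient descent on the total squared prediction error:
    # errors from below push beliefs, predictions from above constrain them.
    mu1 += lr * (W1.T @ eps1 - eps2)
    mu2 += lr * (W2.T @ eps2)

print("inferred top-level causes:", np.round(mu2, 2))
print("true top-level causes:    ", true_mu2)
```

Each belief is driven upward by the error from the level below and constrained by the prediction arriving from the level above, which is the essential asymmetry of predictive coding.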
Minimal Surprises, Maximal Survival
From a Bayesian mechanics perspective, the brain is the organ that enables an organism to remain in that steady-state distribution by:
- Rapidly inferring environmental states (perception).
- Selecting the best actions to maintain viability (movement, homeostasis, planning).
If the “non-ionizing is automatically harmless” stance that Dr. Mike popularizes (discussed at length below) were correct, the brain would have no reason to register subtle RF signals at all. But real-world data, especially from rodent models, suggest that at high intensities or over prolonged durations, non-thermal EMF exposure might disrupt the brain’s own bioelectric communications. Minimizing free energy, in that case, would also mean minimizing or mitigating these external disruptors.
The Role of Bayesian Mechanics in AGI and Advanced AI
The Dream of Artificial General Intelligence
Because Bayesian mechanics merges machine learning with statistical physics, many see it as a blueprint for building Artificial General Intelligence (AGI). If living systems are effectively “prediction machines” that maintain their structural integrity via Bayesian updating, then an AGI might similarly harness these techniques for robust, adaptive intelligence in real-world settings.
Complexity, Chaos, and Real-World Learning
Unlike toy tasks, real environments are high-dimensional, noisy, and dynamic. Bayesian mechanics and active inference promise an end-to-end approach—no separate reward function or hand-crafted objective is needed. The system simply aims to minimize free energy or “prediction error,” which can yield emergent intelligent behaviors.
Hence, the next wave of AI research may well revolve around frameworks like Bayesian mechanics that unify perception, action, and world-modeling under a single, theoretically grounded lens.
Revisiting the Transcript’s Key Points: Step-by-Step Analysis
Schrödinger’s 1944 Question: What Is Life?
- Transcript: Namjoshi references Erwin Schrödinger’s foundational question about how the events within a spatial boundary of an organism can be explained by physics and chemistry.
- Expansion: Today, we add Bayesian and information-theoretic explanations. An organism is a “statistical entity” whose states remain relatively stable due to a combination of self-regulating flows and predictive, uncertainty-minimizing processes.
The Emergence of Bayesian Mechanics
- Transcript: Pinpoints the 1990s as a pivotal era for developing advanced statistical methods for neuroimaging, culminating in DCM, SPM, and eventually the free energy principle.
- Expansion: These methods weren’t random developments but part of a broader shift in computational neuroscience, linking machine learning (Bayesian inference, hidden Markov models) with brain function. Over the past decade, a wave of theoretical papers extended these frameworks to all living systems, birthing the concept of Bayesian mechanics.
Living Systems as Non-Equilibrium Steady States
- Transcript: Mentions that living beings exist at a non-equilibrium steady state, remaining in their characteristic states despite a randomizing environment.
- Expansion: This “characteristic set of states” is analogous to an organism’s phenotype. Externally, “entropic waste” tries to degrade these states, but the system’s internal flows push back. This push-and-pull is the essence of Bayesian mechanics.
Markov Blanket Partition
- Transcript: Illustrates the partition of external states (η), blanket states (b), and internal states (μ).
- Expansion: The Markov blanket ensures conditional independence, establishing the boundary between “self” and “world.” Within the blanket are sensory (S) and active (A) states, bridging internal predictions with environmental changes. This concept underpins active inference.
The Helmholtz Decomposition and Contravening Flows
- Transcript: Describes how the system can remain in steady state by having a “solenoidal flow” that counters the random drift from the environment.
- Expansion: Physically, think of molecules wanting to diffuse, but the living system’s collective “pumps” and “feedback loops” continually reorganize them. Stochastically, these loops correspond to an “information process” that fights entropic waste.
Active Inference and the Free Energy Principle
- Transcript: Summarizes how the brain (or any system) can minimize variational free energy to maintain stability in a changing environment.
- Expansion: Minimizing free energy is akin to maximizing model evidence—the system’s best guess about the state of the world. Surprise or error signals push the system to reconfigure either its predictions (perception) or the environment (action).
Child’s Play or Lifesaving Insight? Why This Matters
The Stakes of Downplaying Non-Ionizing RF Effects
Dr. Mike’s public insistence that “non-ionizing = safe” can create complacency among:
- Parents who let kids sleep with phones under pillows or frequently use wireless earbuds.
- Policymakers who uphold outdated FCC exposure rules despite the 2021 court ruling.
- Researchers who might not explore the synergy between ceLLM or other advanced Bayesian frameworks if mainstream voices claim “No big deal.”
Real Health Implications of Chronic Exposure
Emerging rodent and in vitro studies show that at certain intensities or durations, non-ionizing radiation correlates with DNA fragmentation, stress protein release, and possible tumor promotion. Even if these aren’t guaranteed in short bursts, they highlight a plausible hazard for continuous or long-term exposure—particularly in sensitive populations or tissues.
Entropic Waste in the Wireless Age
As we saturate our environment with microwaves from 5G, Wi-Fi 6, and near-future 6G, the “random background” of electromagnetic signals intensifies. If ceLLM theory is correct, these signals represent an additional dimension of “entropic waste” that biological systems must handle. Minimizing free energy, from a health standpoint, might entail more mindful infrastructure and user behavior to reduce the “EMF noise” that could disrupt cellular resonance.
Children and Pregnancy: The Most Vulnerable
As Dr. Namjoshi points out, living systems exist in a precarious balancing act. This is doubly so for developing organisms:
- Embryos: Early neural tube formation is heavily governed by bioelectric patterns. Subtle disruptions in these patterns can lead to severe congenital anomalies (neural tube defects, heart defects, etc.).
- Children: Ongoing neurodevelopment, thinner skulls, and more permeable tissues may amplify the effects of non-thermal EMFs, making them more prone to cumulative harm over time.
While Dr. Mike might reassure parents that “it’s just Bluetooth,” a prudent Bayesian approach says, “Consider precaution. Evaluate if continuous near-head exposures are truly necessary or worth the uncertain risk.”
Regulatory Capture and Western Lies?
Dr. Oleg A. Grigoriev vs. WHO’s Commissioned Review
In September 2024, Dr. Oleg Grigoriev lambasted a WHO-commissioned review that concluded there is no credible evidence linking RF-EMF from cellphones to cancer. Grigoriev argues that the review’s authors are relatively unknown in the field and that their analysis is flawed in ways that effectively exonerate the wireless industry. Meanwhile, well-documented case-control studies (like Interphone) have pointed to increased glioma risk among the heaviest cumulative phone users.
ICNIRP’s Role
Microwave News has documented ICNIRP as a kind of self-selected group with strong ties to the telecommunication industry. By ignoring non-thermal effects, they maintain guidelines that do not require stricter safety measures. The WHO often leans on ICNIRP’s stance, perpetuating a “thermal-only paradigm” that scientists like Grigoriev or Friston’s community strongly dispute.
Are These “Western Lies” or Genuine Caution?
While calling them “Western lies” may be hyperbole, it’s clear there is institutional inertia or “regulatory capture” benefiting corporate profits. The question is less about East vs. West and more about how global regulatory bodies systematically discount evidence for non-thermal hazards, leaving the public in the dark about potential long-term health issues.
The Path Forward: Demand Updated Policies
- Acknowledge the 2021 Court Ruling: The FCC must revise archaic guidelines to incorporate non-thermal science.
- Resume NTP Research: Funding for RF-cancer investigations was halted prematurely. Continuation is crucial.
- Adopt Precautionary Measures: For children, pregnant women, or heavy users, reduce continuous near-body exposures; use speakerphone or wired solutions.
- Integrate Bayesian Mechanics: Let’s push for a scientific approach that sees the environment-organism relationship as a dynamic, probabilistic system, not purely a matter of temperature thresholds.
Conclusion: Why Dr. Mike Needs to Rethink His Stance
Dr. Mike’s popularization of the idea that “non-ionizing = harmless” stands on shaky ground if we consider:
- Russia’s historical stance on strict EMF guidelines.
- Rodent studies from the NTP and Ramazzini, plus the January 2024 morphological link showing rat and human tumors share structural hallmarks.
- The 2021 EHT v. FCC decision exposing how U.S. guidelines ignore up-to-date research.
- The conceptual depth of ceLLM and Bayesian mechanics, revealing how subtle external signals can degrade bioelectric integrity over time.
Ironically, Dr. Mike’s Russian heritage contrasts with Russia’s own longstanding caution around microwave and RF technologies, highlighting a discrepancy between his dismissive message and the stance of top Russian scientists like Dr. Oleg Grigoriev, who has called the WHO’s unconditional exoneration of cellphone radiation “misleading.”
Calling Dr. Mike’s statements “Russian disinformation” would be a rhetorical flourish, and the label is beside the point. The real issue is that simplistic reassurance can lead to public inaction on genuine non-thermal risks, leaving children and other sensitive populations at the mercy of uncertain exposures.
Takeaways:
- Non-ionizing does not guarantee safety; evidence suggests non-thermal effects exist.
- The free energy principle and active inference show how biology counters entropic waste by Bayesian updating, reinforcing the idea that external signals matter—especially if they interfere with these predictive loops.
- Dr. Mike’s platform could do more good by acknowledging these complexities, rather than delivering a “not enough energy to break DNA” blanket statement.
A Final Call to Action
- Stay Informed: Explore the evidence from NTP, Ramazzini, and the new morphological-genetic studies.
- Push for Policy: Advocate that the FCC adopt modern guidelines factoring in non-thermal biological impacts.
- Take Precautions: While research continues, practice moderate usage and keep devices away from children’s skulls for prolonged periods.
- Embrace a Bayesian Mindset: Understand that predictions about safety must adapt to new data—no single “study” can close the debate forever.
In short, we have only begun to scratch the surface of how living systems maintain their identity against entropic waste through Bayesian mechanics. The potential health implications of ignoring subtle electromagnetic influences are too great to let “thermal safety dogma” remain unchallenged. Let’s move beyond Dr. Mike’s flawed dismissal toward a more nuanced, evidence-based approach that genuinely protects the public—especially future generations.