The First Embodied Whole-Brain Emulation: How a Startup Put a Copy of a Fly's Brain Inside a Virtual Body
Eon Systems claims to have achieved the first multi-behavior embodiment of a whole-brain emulation, connecting a 125,000-neuron Drosophila brain model to a physics-simulated body. We examine the science behind the demo, what it actually shows, and the vast gap between a fruit fly and the human brain the company ultimately wants to copy.
Key Takeaways
Eon Systems demonstrated an embodied whole-brain emulation of Drosophila melanogaster by integrating a connectome-derived computational model (125,000+ neurons, 50 million synapses) with a MuJoCo physics-simulated body to produce multiple distinct behaviors. While this represents a genuine milestone, the simplistic leaky integrate-and-fire neuron models, absence of neuromodulation and glial cell dynamics, and the 560× gap to a mouse brain (70 million neurons) highlight the enormous challenges that remain on the path to mammalian and human emulation.
On March 7, 2026, Eon Systems PBC — a small San Francisco–based startup with a mission statement that reads like science fiction — released a 41-second video that it says shows the world's first embodied whole-brain emulation producing multiple distinct behaviors. A computational copy of the entire adult Drosophila melanogaster brain, wired neuron-to-neuron from electron microscopy data, drives a physics-simulated fly body through walking, grooming, and coordinated limb movements. If the claim holds up, it represents a qualitative threshold in neuroscience: the first time a complete emulated brain, derived from a biological connectome rather than trained by reinforcement learning, has closed the sensorimotor loop in a virtual body.
But 'if the claim holds up' is doing heavy lifting. To assess what Eon actually demonstrated — and to distinguish genuine scientific progress from venture-funded marketing — requires examining the peer-reviewed research underpinning the demo, the known limitations of the models involved, and the staggering distance between a fruit fly and the company's stated endpoint: a digital copy of the human brain.
The Science: From Connectome to Computation
The foundation of Eon's demo rests on a 2024 Nature paper by Philip Shiu and collaborators [1], titled 'A Drosophila computational brain model reveals sensorimotor processing.' Shiu, a senior scientist at Eon and former postdoctoral researcher at UC Berkeley, built a leaky integrate-and-fire (LIF) computational model of the entire adult Drosophila central brain. The model contains more than 125,000 neurons and 50 million synaptic connections, reconstructed from the FlyWire connectome — the first complete wiring diagram of an adult insect brain, mapped from electron microscopy data by an international consortium and published in Nature in October 2024.
What makes Shiu's model noteworthy is not just its scale but its predictive power. By assigning each neuron a neurotransmitter identity (excitatory, inhibitory, or modulatory) predicted by machine learning, the researchers showed that computationally activating sugar-sensing gustatory neurons accurately predicted which downstream neurons would fire — matching experimental data from optogenetic studies. The model also correctly predicted feeding motor outputs and described how different taste modalities interact at the circuit level. For mechanosensory circuits, activating virtual touch sensors on the antennae reproduced the known antennal grooming circuit, achieving 95% accuracy in predicting motor behavior.
This is the paper's genuine contribution: demonstrating that connectivity plus neurotransmitter identity alone — without tuning individual synaptic weights or modeling detailed biophysics — can produce functionally meaningful circuit-level predictions. It ran on a laptop. That alone is remarkable.
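The core modeling move can be illustrated in a few lines. The sketch below is not Shiu et al.'s code; it is a toy stand-in showing the idea that a weight matrix built from synapse counts and ML-predicted neurotransmitter signs — with no per-synapse tuning — already determines which downstream neurons a sensory stimulus drives. All arrays here are random placeholders for the FlyWire-derived data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # toy network; the real model has 125,000+ neurons

# Connectome gives synapse counts; ML predicts each presynaptic neuron's sign.
synapse_counts = rng.poisson(0.5, size=(n, n))        # counts[i, j]: synapses from j onto i
nt_sign = rng.choice([+1, -1], size=n, p=[0.7, 0.3])  # +1 excitatory, -1 inhibitory

# The key simplification: weight = synapse count x sign, no individual tuning.
W = synapse_counts * nt_sign[np.newaxis, :]

# "Activate" a set of sensory neurons and read out the net drive downstream.
stimulated = np.zeros(n)
stimulated[:10] = 1.0                      # e.g. sugar-sensing gustatory neurons
downstream_drive = W @ stimulated          # net synaptic input per neuron
print("most strongly driven neurons:", np.argsort(downstream_drive)[-5:])
```

In the real model this linear readout is replaced by spiking dynamics, but the structural prior — connectivity plus sign — is the same.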
The Embodiment: Brain Meets Body
The March 2026 demo goes further. Eon claims to have taken Shiu's disembodied brain model and connected it to NeuroMechFly v2, an embodied simulation framework developed by Pavan Ramdya's group at EPFL and published in Nature Methods in 2024 [2]. NeuroMechFly v2 provides a biomechanically accurate digital twin of the adult fruit fly — built from micro-CT scans of a real fly — that can simulate vision, olfaction, leg adhesion, and complex terrain navigation inside the MuJoCo physics engine. The framework includes compound eyes with simulated ommatidia, odor receptors, and a hierarchical brain-VNC (ventral nerve cord) control architecture analogous to the vertebrate brain-spinal cord system.
Eon also references work by Pembe Gizem Özdil and colleagues at EPFL, published on bioRxiv in December 2024, which identified centralized interneurons and shared premotor neurons in the Drosophila connectome that synchronize neck, antennal, and foreleg motor networks during grooming. This research provides a circuit-level explanation for multi–body-part coordination — exactly the kind of behavior Eon's demo claims to reproduce.
The integration, as described by Eon, works like this: sensory input flows into the emulated brain, neural activity propagates through the complete connectome, motor commands flow out to the virtual body, and the physically simulated body executes the movements. The result, shown in the video, appears to be multiple naturalistic behaviors — not a single hand-tuned motion, but emergent multi-behavior output from the brain model's own dynamics.
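The loop Eon describes can be sketched schematically. The class and method names below (`BrainModel`, `FlyBody`, the 42-channel motor vector) are illustrative placeholders, not Eon's or NeuroMechFly's actual API; the point is only the shape of the closed sensorimotor loop.

```python
class BrainModel:
    """Stand-in for a connectome-derived spiking brain model."""
    def step(self, sensory_input):
        # Propagate activity through the connectome for one timestep
        # and return the firing of identified motor neurons.
        return [0.0] * 42  # placeholder motor output

class FlyBody:
    """Stand-in for a physics-simulated body (e.g. MuJoCo-backed)."""
    def actuate(self, motor_commands):
        # Apply actuation derived from motor neuron activity, advance the
        # physics one step, and return sensory feedback from the world.
        return {"proprioception": [0.0] * 42, "vision": None}

brain, body = BrainModel(), FlyBody()
feedback = body.actuate([0.0] * 42)   # bootstrap with a neutral pose
for t in range(1000):                 # 1,000 physics steps
    motor = brain.step(feedback)      # sensory in -> neural dynamics -> motor out
    feedback = body.actuate(motor)    # body executes, environment responds
```

The distinguishing claim is that the `step` function is not a trained policy but the connectome's own dynamics — which is exactly what the unpublished integration details would need to confirm.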
What Makes This Different from DeepMind's Virtual Fly
The comparison that best illustrates the significance — and the limitations — of Eon's approach is with DeepMind and Janelia Research Campus's 'flybody' project. The DeepMind/Janelia team built an anatomically detailed MuJoCo model of a fruit fly with 67 body parts, 66 joints, and 102 degrees of freedom. Their virtual fly can walk, fly, and navigate complex trajectories with remarkable realism.
But the controller is fundamentally different: DeepMind trained neural network policies using reinforcement learning from real fly behavior data. The 'brain' driving their virtual fly is a deep learning model optimized to reproduce behavior — not a biological circuit map. It learns what a fly should do, not how a fly's brain actually computes what to do. The distinction matters enormously: RL policies can achieve impressive behavioral fidelity without telling you anything about the biological mechanisms generating the behavior. Eon's approach, by contrast, starts from the biological wiring and asks whether that wiring alone is sufficient to produce behavior when coupled to a body.
| Feature | Eon Systems (Connectome-Based) | DeepMind/Janelia (RL-Based) |
|---|---|---|
| Brain model source | FlyWire connectome (electron microscopy) | Deep RL training on behavior data |
| Neuron count | 125,000+ biological neurons | Artificial neural network (policy) |
| Behavior generation | Emergent from circuit dynamics | Learned from reward optimization |
| Neurotransmitter modeling | Yes (ML-predicted identities) | N/A |
| Synaptic weight tuning | No (connectivity only) | Trained weights |
| Biological insight | Predicts circuit mechanisms | Reproduces behavior patterns |
| Physics engine | MuJoCo (via NeuroMechFly v2) | MuJoCo (native flybody) |
The Limitations Nobody Is Advertising
The LIF model used in Shiu's paper — and presumably in Eon's demo — is among the simplest representations of a neuron that still captures spiking behavior. It reduces each neuron to a single voltage variable governed by a first-order differential equation with a leak term and threshold. Real neurons have thousands of ion channels operating across multiple timescales, dendritic arbors that perform local computation, gap junctions that the connectome does not fully capture, and stochastic release mechanisms at synapses.
More critically, the model lacks several categories of biological dynamics that neuroscientists consider essential for brain function:
- Neuromodulation: Dopamine, serotonin, octopamine, and other neuromodulators can globally alter circuit behavior. The connectome captures synaptic wiring but not the diffuse, volume-transmission signaling of neuromodulators.
- Glial cells: Astrocytes and other glia modulate synaptic transmission, regulate ion concentrations, and participate in information processing. They are entirely absent from the model.
- Synaptic plasticity: The model uses fixed connectivity. Real brains constantly modify synaptic strength through learning, habituation, and homeostatic regulation.
- Vasculature and metabolism: Biological brains depend on tightly regulated blood flow and energy delivery, which computational models omit entirely.
- Non-spiking neurons: A significant fraction of insect brain neurons communicate through graded potentials, not action potentials. LIF models cannot represent this.
The 95% accuracy figure from the Shiu paper, while impressive, merits context: it refers to the model's ability to predict which motor neurons activate given specific sensory inputs, not to the behavioral fidelity of a fully embodied simulation. There is a massive gap between 'predicting which neurons fire' and 'producing naturalistic behavior in a physics-simulated body.' Eon's demo video is 41 seconds long, and the company has not yet published peer-reviewed results on the embodied integration.
The OpenWorm Precedent: A Cautionary Tale
Anyone evaluating Eon's claims should also consider the history of OpenWorm, the open-science project that has spent over a decade attempting to create a complete computer model of the C. elegans nematode — an organism with just 302 neurons and a fully mapped connectome that has been available since 1986. Despite its comparatively tiny nervous system, OpenWorm has not yet achieved a fully functional embodied simulation that reproduces the worm's complete behavioral repertoire.
The reasons are instructive. Knowing the wiring diagram is necessary but not sufficient: the functional properties of each synapse (strength, dynamics, plasticity), the influence of neuromodulators, and the coupling between neural activity and body mechanics all require additional data and modeling effort. C. elegans also complicates matters because many of its neurons communicate via graded potentials rather than spikes. The BAAIWorm project, building on OpenWorm's tools, reported improvements in late 2024 using more detailed neural models trained from both neuron- and network-level data with closed-loop sensory feedback — but a complete simulation remains elusive.
If 302 neurons remain incompletely simulated after 10+ years of international effort, the claim that 125,000 can be 'embodied' in a meaningful way deserves scrutiny. Of course, the Drosophila brain has an advantage: its neurons are predominantly spiking, making LIF models a better fit than they are for C. elegans. But the scale of unknowns is proportionally larger.
The Road to Mouse — and Human
Eon's stated mission is to produce the world's largest connectome and highest-fidelity brain emulation, with an explicit trajectory from fly to mouse to human. The company, founded by Michael Andregg — who previously co-founded Halcyon Molecular (DNA sequencing) and Fathom Computing (optics-based AI hardware) — has assembled an advisory board that includes George Church (Harvard geneticist), Stephen Wolfram (Mathematica/Wolfram Alpha), Konrad Körding (Penn neuroscience and AI), Anders Sandberg (Future of Humanity Institute), Robin Hanson (George Mason economist and 'Age of Em' author), Stephen Larson (OpenWorm co-founder), and Alexander Huth (UT Austin computational neuroscience and fMRI).
The company says it is combining expansion microscopy — a technique that physically enlarges tissue samples to achieve nanometer-scale imaging of neural connections — with tens of thousands of hours of calcium and voltage imaging to capture how neural networks activate in living tissue. The goal: a complete connectome and functional model of a mouse brain.
A mouse brain contains roughly 70 million neurons — 560 times the fly's count. The human brain has 86 billion neurons and approximately 100 trillion synapses. Eon frames the jump from fly to mouse as 'a question of scale, not of kind,' but most neuroscientists would disagree. Mammalian brains have six-layered cortices, long-range white matter tracts absent in insects, fundamentally different neuromodulatory systems, and vastly more complex synaptic physiology. The wiring principles that make an LIF fly model work may not transfer.
Even the data acquisition challenge is formidable. The FlyWire connectome — mapping a brain roughly the size of a poppy seed — required years of effort by hundreds of researchers using automated AI segmentation of electron microscopy images plus extensive manual proofreading. A full mouse brain connectome at synaptic resolution would generate exabytes of imaging data. No current pipeline can process that volume, though expansion microscopy and serial-section techniques are advancing rapidly.
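A rough back-of-envelope calculation shows where the exabyte figure comes from. The voxel size and brain volume below are approximate literature values, not Eon's numbers; treat the result as an order-of-magnitude guide only.

```python
# Rough estimate of raw EM data volume for a mouse brain at synaptic
# resolution. Input figures are approximate; result is order-of-magnitude.
voxel_nm = (4, 4, 40)      # typical serial-section EM voxel (x, y, z), in nm
brain_mm3 = 500            # approximate mouse brain volume, mm^3

nm3_per_mm3 = (1e6) ** 3   # 1 mm = 1e6 nm, so 1 mm^3 = 1e18 nm^3
voxels = brain_mm3 * nm3_per_mm3 / (voxel_nm[0] * voxel_nm[1] * voxel_nm[2])
bytes_raw = voxels * 1     # 1 byte per voxel (8-bit grayscale)

print(f"~{voxels:.1e} voxels, ~{bytes_raw / 1e18:.2f} exabytes raw")
```

That is on the order of an exabyte before segmentation, proofreading, or redundancy — roughly a million times the FlyWire dataset's scale.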
What the Demo Actually Shows
Setting aside the marketing language, what Eon demonstrated in March 2026 is genuinely significant in a narrow technical sense: a computational brain model derived from a complete biological connectome, running in real time, driving a physically simulated body through multiple behavioral patterns. No one else has done exactly this at this scale.
But significant caveats apply. The demo is a video, not a peer-reviewed publication. The details of the integration — how sensory feedback was implemented, what simplifications were made, whether the behaviors were cherry-picked from many runs — remain unpublished. The LIF model, while surprisingly effective, is a cartoon of a neuron. And the distance from here to a mouse, let alone a human, is not merely 'scale' — it likely requires fundamental advances in neuroscience, imaging, computation, and modeling theory that do not yet exist.
The Bigger Picture: Two Paths to Understanding the Brain
What Eon's demo, DeepMind's virtual fly, and the broader connectomics revolution collectively illustrate is that neuroscience is converging on a central question from two directions. From the bottom up, connectome-based models like Shiu's ask: does the wiring diagram contain enough information to explain behavior? From the top down, RL-based models like DeepMind's ask: can we reproduce behavior without understanding the wiring? Each approach has strengths the other lacks, and neither alone is sufficient.
The most productive path likely lies in their intersection. A 2024 Nature paper by Pillow and colleagues [3] proposed the concept of an 'effectome' — a causal model that uses the connectome as a structural prior and combines it with optogenetic perturbation data to estimate how strongly neurons actually influence each other in vivo. Their analysis found that the fly brain's dynamics are dominated by many small, largely independent circuits — suggesting that a causal brain model is feasible for the fly, even if the current LIF model is only a first approximation.
Disclosure and Financial Context
Several aspects merit transparency. Michael Andregg, the company's CEO and founder, published the original announcement on his personal Substack — not in a peer-reviewed venue. He disclosed a financial interest in Eon. The company is a PBC (Public Benefit Corporation), a legal structure that allows pursuit of public benefit alongside profit, but it is still a private company seeking investment. The advisory board, while individually distinguished, does not constitute peer review of the demo itself.
Eon's team page lists job openings and advisors but provides limited information about its actual staff size, funding, or technical infrastructure. The gap between the peer-reviewed foundation (the Nature paper) and the commercial claim (the embodied demo) has not yet been bridged by independent scientific evaluation.
What Comes Next
The Drosophila whole-brain emulation is a genuine scientific milestone built on years of connectomics research by hundreds of scientists worldwide. Eon's contribution — coupling an existing brain model to an existing embodiment framework to produce multi-behavior output — is an engineering achievement that, if validated by peer review, advances the field. But the breathless framing ('the machine is becoming the ghost') obscures the vast distance between a simplified model of a fruit fly brain and anything resembling conscious experience or human-level intelligence.
The real test will come when Eon publishes its integration methods for independent scrutiny, when other groups attempt to reproduce the results, and when the community can evaluate whether the observed behaviors are genuinely emergent from brain dynamics or artifacts of implementation choices. Until then, the demo is a promising start — not a terminus — on a road that may be far longer than any current map suggests.
📚 Sources & References
| # | Source |
|---|---|
| [1] | A Drosophila computational brain model reveals sensorimotor processing |
| [2] | NeuroMechFly v2: simulating embodied sensorimotor control in adult Drosophila |
| [3] | The fly connectome reveals a path to the effectome |
| [4] | Whole-body simulation of realistic fruit fly locomotion with deep reinforcement learning |