Milestone: 80-Billion-Parameter Brain Model Validated on Marlowe
Professor Dan Yamins’ team completed a proof-of-concept hero run on Marlowe, validating that an 80-billion-parameter brain-inspired neural network can train at scale across 24 nodes (192 H100 GPUs). The 24-hour run achieved 45% model FLOPS utilization (MFU), confirming that capability-scale computational neuroscience training is feasible on Stanford’s GPU computational instrument.
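To put the 45% MFU figure in context, here is a minimal sketch of how model FLOPS utilization is typically estimated for dense transformer-style training. It assumes the common ~6 × parameters FLOPs-per-token approximation and H100 dense BF16 peak throughput of roughly 989 TFLOPS; both are standard rules of thumb, not numbers reported for this run, and a brain-inspired architecture may deviate from them.

```python
def mfu(params: float, tokens_per_sec: float, n_gpus: int,
        peak_flops_per_gpu: float = 989e12) -> float:
    """Estimate model FLOPS utilization (MFU).

    Uses the common ~6 * params FLOPs-per-token approximation for
    dense training; the peak defaults to H100 BF16 dense throughput
    (~989 TFLOPS). Both figures are illustrative assumptions, not
    values from the Marlowe run itself.
    """
    achieved_flops_per_sec = 6 * params * tokens_per_sec
    return achieved_flops_per_sec / (n_gpus * peak_flops_per_gpu)

# Hypothetical back-of-envelope: what training throughput would
# 45% MFU imply for an 80B-parameter model on 192 H100s?
implied_tokens_per_sec = 0.45 * 192 * 989e12 / (6 * 80e9)
print(f"{implied_tokens_per_sec:,.0f} tokens/sec")
```

Under these assumptions the run would correspond to roughly 178,000 tokens per second of training throughput, though the actual figure depends on the model's true per-token FLOP count.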
The network, called the Counterfactual World Model, is designed to study how biological neural circuits give rise to cognitive function. A full 30-day training campaign is planned to produce a fully trained model.