Welcome to Infinite Curiosity, a newsletter that explores the intersection of Artificial Intelligence and Startups. Tech enthusiasts across 200 countries have been reading what I write. Subscribe to this newsletter for free to directly receive it in your inbox:
Here are 5 offbeat, quirky wins scored by AlphaEvolve that caught my attention:
1. Reclaiming a Data Center’s Worth of Compute Daily
Most AI systems just burn through GPU hours and call it a day. AlphaEvolve does the opposite: it gives silicon back. Google's team deployed a scheduling heuristic it discovered inside Borg, the internal cluster manager.
Every minute it reshuffles tasks, squeezes gaps, and powers down stray chips no job really needs. That tiny-looking gain, about 0.7% of Google's worldwide compute recovered, recurs across millions of cores, so every 24 hours the system essentially hands Google a full extra data center of spare GPU time.
Those free cycles get recycled into fresh experiments instead of fresh invoices. Efficiency at this scale isn’t a rounding error. It’s a moat.
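The packing intuition can be sketched in a few lines. This is a toy bin-packing heuristic, not Borg's actual scheduler; the capacities and task sizes below are made up purely for illustration:

```python
def machines_needed(tasks, capacity):
    """Greedy first-fit: place each task on the first machine with room."""
    machines = []  # remaining capacity of each machine in use
    for t in tasks:
        for i, free in enumerate(machines):
            if t <= free:
                machines[i] -= t
                break
        else:
            machines.append(capacity - t)  # open a new machine
    return len(machines)

tasks = [4, 4, 4, 6, 6, 6]  # CPU-core requests (illustrative)
naive = machines_needed(tasks, capacity=10)                # arrival order
packed = machines_needed(sorted(tasks, reverse=True), 10)  # largest-first
print(naive, packed)  # arrival order needs 4 machines; largest-first needs 3
```

The punchline: the exact same workload fits on fewer machines just by changing the placement order. That is the kind of low-hanging fruit an evolutionary search harvests automatically, at a scale no human scheduler team could.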
2. Shattering a 56‑Year‑Old Mathematics Record
Matrix multiplication has been poked at by mathematicians since the 1960s because a faster method unlocks cheaper graphics, physics, and AI. Strassen's 1969 trick multiplies 2x2 matrices with 7 scalar multiplications instead of 8; applied recursively, it handles 4x4 matrices in 49. Nobody could beat that ceiling for more than half a century.
AlphaEvolve was left to breed code in simulation, and it found a way to do it in 48 (for complex-valued matrices). The discovery surprised even the Google researchers watching the run. The algorithm's steps looked more like poetry than algebra. In practice it shaves a few percent off training time for transformer layers. Small per batch, massive at cloud scale.
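For context, here is Strassen's original 2x2 trick, the seed of that 49-multiplication ceiling, in plain Python (AlphaEvolve's 48-multiplication scheme is far too long to reproduce here):

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices with 7 scalar multiplications (m1..m7)
    instead of the naive 8. Recursing on 2x2 blocks gives 7*7 = 49
    multiplications for 4x4 matrices, the record AlphaEvolve cut to 48."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

Note the trade: each saved multiplication costs extra additions, which is why these schemes pay off on the huge matrices AI workloads chew through, not on tiny ones.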
3. Beating the Compiler at Its Own Game
FlashAttention is already the Formula 1 engine of transformer inference, tuned by hand down to the register level. AlphaEvolve treated it like a starting point, not a finish line.
By mutating the kernel over thousands of generations, it surfaced a variant that runs 32% faster on A100s and 28% faster on TPU v5e, all while producing the exact same output. No new hardware, no exotic instruction sets. Just smarter ordering of loads and stores that the human eye missed.
Drop that kernel into your stack and you need fewer machines to serve the same traffic. Cheaper, greener, and a solid flex for any team.
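You can get a feel for the mutate-and-measure loop with a toy version. Real AlphaEvolve mutates actual kernel code and times it on real chips; this sketch just tunes a single hypothetical tile-size knob against a made-up cost model:

```python
def cost(tile):
    """Pretend runtime: penalize tiles that spill a 64-entry cache and
    tiny tiles that pay loop overhead. Purely illustrative numbers."""
    cache_spill = max(0, tile - 64) * 5
    loop_overhead = 256 / tile
    return cache_spill + loop_overhead

def evolve(tile, generations=40):
    """Greedy local search: each generation tries six mutations of the
    tile size, keeps the best one, and stops when nothing improves."""
    best = tile
    for _ in range(generations):
        neighbors = [max(1, best + d) for d in (-8, -4, -2, 2, 4, 8)]
        candidate = min(neighbors, key=cost)
        if cost(candidate) < cost(best):
            best = candidate
        else:
            break  # local optimum reached
    return best

print(evolve(tile=4))  # climbs from a bad tile size to the sweet spot, 64
```

The real system layers far richer mutations and real benchmarks on top, but the skeleton is the same: propose a tweak, measure it, keep it only if it wins.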
4. Nudging a 300‑Year‑Old Geometry Puzzle Forward
The “kissing number” asks how many identical spheres can touch one central sphere without overlap. Humans cracked the answer in 1, 2, 3, 4, 8, and 24 dimensions. But the wild middle grounds stayed murky.
AlphaEvolve pointed its search at the ugly duckling: 11-dimensional space. It spat out a configuration of 593 spheres, up from the previous best of 592. Mathematicians haven't proved optimality yet, but the jump already feeds better high-dimensional hashing and error-correcting codes.
Every extra sphere tightens those schemes, meaning fewer bits lost in storage or radio. Sometimes progress hides inside abstract mathematics, waiting to power future protocols too.
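Checking a candidate configuration is the mechanical half of the problem: for unit spheres, every outer center must sit at distance 2 from the origin and at least 2 from every other center. Here's a sketch that validates the classic 2-D answer of 6:

```python
import math

def is_valid_kissing(points, eps=1e-9):
    """True if the points are valid centers of unit spheres kissing a
    central unit sphere: each at distance 2 from the origin, pairwise
    distances at least 2 (no overlaps)."""
    for p in points:
        if abs(math.dist(p, [0.0] * len(p)) - 2.0) > eps:
            return False  # doesn't touch the central sphere
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) < 2.0 - eps:
                return False  # two outer spheres overlap
    return True

hexagon = [[2 * math.cos(k * math.pi / 3), 2 * math.sin(k * math.pi / 3)]
           for k in range(6)]
print(is_valid_kissing(hexagon))  # the 2-D kissing number is 6
```

Verifying a configuration is easy; finding one with an extra sphere in 11 dimensions is the hard part, which is exactly where the evolutionary search earned its keep.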
5. A Two‑Brain Workflow That Never Sleeps
AlphaEvolve isn’t a single model. It’s a tag‑team. Gemini Flash plays scout by blasting out thousands of quick‑and‑dirty ideas. Gemini Pro follows as critic by grading every snippet for accuracy, speed, and security.
The best candidates breed, the weak ones vanish, and the cycle restarts in minutes. This buddy system means AlphaEvolve improves while you do your other work. By the time you push new code at lunch, the platform has likely found a faster version. Continuous evolution feels less like software and more like compound interest for your compute budget.
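A stripped-down analogue of that tag team fits in a few lines. The "scout" and "critic" below are toy stand-ins (bitstrings and a trivial score), not the Gemini models themselves:

```python
import random

def scout(parent, rng, n_children=20):
    """Cheap idea generator: each child flips one random bit of the parent."""
    children = []
    for _ in range(n_children):
        i = rng.randrange(len(parent))
        children.append(parent[:i] + [1 - parent[i]] + parent[i + 1:])
    return children

def critic(candidate):
    """Expensive grader (here: trivially count the 1s)."""
    return sum(candidate)

def evolve(length=16, generations=30, seed=0):
    rng = random.Random(seed)
    best = [0] * length
    for _ in range(generations):
        pool = scout(best, rng) + [best]  # elitism: parent stays in the race
        best = max(pool, key=critic)      # only the top scorer breeds on
    return best

best = evolve()
print(critic(best))
```

The division of labor is the point: generating ideas is cheap, so do it in bulk; judging them well is expensive, so spend the big model only on candidates that survive the cull.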
If you're a founder or an investor who has been thinking about this, I'd love to hear from you.
If you are getting value from this newsletter, consider subscribing for free and sharing it with 1 friend who’s curious about AI: