Infra Threads I've Been Noodling On
List of infra product threads / ideas I've been thinking about in an AI-native world
Welcome to Infinite Curiosity, a newsletter that explores the intersection of Artificial Intelligence and Startups. Tech enthusiasts across 200 countries have been reading what I write. Subscribe to this newsletter for free to receive it directly in your inbox:
What does infra look like in an AI-native world? Here’s a list of threads / ideas I've been noodling on:
1. Verticalized AI Cloud
Serverless, domain-specific clouds. One tuned for text, another for generative media, another for hard science. Let teams pay only for the exact compute they consume. They'll leap past generic GPU products by packaging models, data, and orchestration as one-click "labs" for each vertical.
2. AI Scaffolding
A model call is now just an HTTP request. But the real magic is the guardrails, retries, routing, and evals that wrap it. Whoever owns this scaffolding layer will own the runtime for modern software.
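To make that concrete, here's a minimal sketch of what that scaffolding might look like. The `call_model` callable, the provider names, and the guardrail are hypothetical placeholders, not any specific vendor's API:

```python
import random
import time

# A minimal sketch of a scaffolding layer around a model call: routing
# across providers, retries with backoff, and an output guardrail.

def call_with_scaffolding(prompt, call_model,
                          providers=("primary", "fallback"),
                          max_retries=3, guardrail=None):
    """Route a prompt across providers, retry failures, and validate the output."""
    for provider in providers:                  # routing: try providers in order
        for attempt in range(max_retries):      # retries with jittered backoff
            try:
                output = call_model(provider, prompt)
            except Exception:
                time.sleep(2 ** attempt + random.random())
                continue
            if guardrail is None or guardrail(output):  # guardrail: accept or reject
                return {"provider": provider, "output": output}
    raise RuntimeError("all providers and retries exhausted")


if __name__ == "__main__":
    # Stub model call and a trivial length guardrail for demonstration.
    def fake_model(provider, prompt):
        return f"[{provider}] echo: {prompt}"

    print(call_with_scaffolding("hello", fake_model,
                                guardrail=lambda out: len(out) < 500))
```

The wrapper is boring on purpose: the value isn't in any single line, it's in owning the layer every model call has to pass through.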
3. Agent-native Data Infra
We need a database built for messy human-plus-agent conversations, not just pristine tables. This means dynamic policy engines, self-curating lakehouses, and query routers that auto-rewrite SQL to hit the right store without you noticing.
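As a toy illustration of that last idea, here's a sketch of a query router that rewrites SQL to hit the right backend. The table-to-store mapping and the "warehouse" / "vector_store" names are made up for illustration:

```python
import re

# A toy query router: inspect the table a query touches, pick a backend,
# and rewrite the SQL so the caller never has to know where it ran.

TABLE_TO_STORE = {
    "orders": "warehouse",           # pristine analytics tables
    "conversations": "vector_store", # messy human-plus-agent logs
}

def route_query(sql):
    """Pick a store for the query and rewrite the table reference to point at it."""
    match = re.search(r"\bFROM\s+(\w+)", sql, re.IGNORECASE)
    if not match:
        raise ValueError("no table found in query")
    table = match.group(1).lower()
    store = TABLE_TO_STORE.get(table, "warehouse")   # default backend
    rewritten = re.sub(r"\bFROM\s+" + re.escape(match.group(1)),
                       f"FROM {store}.{table}", sql, count=1)
    return store, rewritten

# The caller never sees which backend actually serves the query.
print(route_query("SELECT * FROM conversations WHERE user_id = 42"))
```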
4. IDE Where Agents Are the Coders
Humans stop grinding out code and start curating backlogs while agent swarms push commits. Picture Jira's kanban fused with VS Code panes and GitHub Actions logs. Then let the agents take over.
5. AI-Native Git
Next-gen version control starts as an Apache project that treats merges, diffs, and lineage as LLM-first primitives. A venture-backed layer will commercialize it, then build an agent-centric IDE on top. History repeating itself, but faster.
6. Outer Loop of Software Development
AI code review is a starting point. Every policy gate, from hosting to SOC 2, gets an autonomous copilot. The dev outer loop becomes a continuous negotiation between code, compliance, and bots.
7. LLM-Optimized File Systems
LLM-optimized file systems like 3FS and Rust-style languages purpose-built for tensor ops will displace yesterday’s POSIX baggage. Add system-prompt learning pipelines and you’ve got the new ops tier for AI.
8. AI for Mathematics
When machines crack formal proofs, they unlock shortcuts in computational fluid dynamics, quant finance, and simulated worlds we can barely model today. Lean and AI toolkits built on top of it, like LeanDojo, are shaping that revolution. Mathematics is the next frontier for compilers and GPUs.
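For a sense of what "machines cracking formal proofs" actually operate on, here's a tiny Lean 4 example; AI provers like the ones built with LeanDojo search for proofs of far harder statements than this:

```lean
-- A tiny Lean 4 theorem: addition of natural numbers is commutative.
-- The proof just invokes the library lemma Nat.add_comm; an AI prover's
-- job is to find steps like this one automatically.
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```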
If you're a founder or an investor who has been thinking about this, I'd love to hear from you.
If you are getting value from this newsletter, consider subscribing for free and sharing it with one friend who's curious about AI: