Weekly 3x3: CapEx Boom. Agentic Checkouts. Quantum Optimization in Portfolio Allocation
Big Tech’s $725B infrastructure bet is crushing free cash flow as "Agentic Commerce" protocols move from pilot to production.
Trillion-dollar infrastructure bets are under pressure as firms learn that scaling spend doesn’t scale performance without better ways to manage what models actually see.
A volatile week across markets and AI. Tech valuations wobble, central banks stay cautious, Big Tech defends trillion-dollar AI bets, and new research questions how we measure progress. Signals are getting noisier — not clearer.
The contemporary narrative around AI is dominated by exponential capabilities: the emergence of reasoning, the human-level performance on standardized tests, the spectacular hallucinations. This focus on performance has obscured a far more fundamental and immediate constraint: the economics of production. We have industrialized a new class of cognitive labor, transforming…
Markets rally on US–China trade optimism as stocks hit new highs, but risks linger beneath the surface. AI spending shifts from hype to ROI, while new research flags both accelerating scientific automation and growing concerns over concentration, stability, and control.
I was in Tokyo this week for the STAC Summit. By coincidence, the World Athletics Championships were happening at the same time. One evening I was at the Japan National Stadium watching sprinters and distance runners push human limits. The next morning I was at our Summit, where firms shared…
This week, a team led by Dan Qiao from Tsinghua University achieved something that seemed impossible: they broke through a computational barrier that had stood for four decades. Their breakthrough was a new shortest path algorithm that runs faster than the legendary Dijkstra's algorithm—the gold standard that…
The AI arms race is shaping up like a three-way tug-of-war — and it’s not yet clear who’s going to fall in the mud first. The US, the Middle East, and China each bring something essential to the table. But none of them hold all the cards. That’s…
The data warehouse is no longer the centre of gravity. It’s just one node in a growing constellation. Modern data strategies aren’t about centralising everything in one place. They’re about composability—the ability to mix and match components, adapt to change, and build around the edges. At…
In financial markets, fairness is easy to believe in but hard to define — especially when you're dealing with electronic trading systems that operate faster than the blink of an eye. For most of history, fairness in trading meant access: if you could make it to the floor of…
The data lakehouse is no longer an experiment. It’s fast becoming the blueprint for enterprise data architecture. At the center of this shift are two open technologies: Apache Parquet and Apache Iceberg. Understanding their role today — and where they’re headed — is critical for sizing the opportunity. Parquet: The…
I've often found explainability in AI to be an elusive concept—essential, yet difficult to pin down. It underpins trust, adoption, and regulatory compliance, but how does it actually work? A recent paper, Explainable Artificial Intelligence (XAI): From Inherent Explainability to Large Language Models by Fuseini Mumuni and…