Fast and curious: A 40-year barrier falls in computer science
Tsinghua researchers beat Dijkstra’s legendary shortest path algorithm, opening the door to faster routing everywhere.
This week, a team led by Dan Qiao from Tsinghua University achieved something that seemed impossible: they broke through a computational barrier that had stood for four decades.
Their breakthrough is a new shortest-path algorithm that runs asymptotically faster than the legendary Dijkstra's algorithm, the gold standard devised by Edsger Dijkstra in 1956 and taught in computer science textbooks ever since.
Shortest path algorithms are everywhere. Every time your GPS calculates a route, every time data travels across the internet, every time a delivery company optimizes routes, these algorithms work behind the scenes, running countless times every day across countless applications.
The new algorithm solves the single-source shortest paths problem on directed graphs in O(m log^(2/3) n) time, where n is the number of nodes and m the number of edges. For non-computer scientists, this means it can find the shortest route from one point to every other point in a network faster than previous methods, with the advantage growing as networks get larger.
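For context, the baseline being beaten is classic Dijkstra. A minimal sketch with a binary heap (the adjacency-list format and names here are illustrative, not taken from the paper):

```python
import heapq

def dijkstra(graph, source):
    """Classic Dijkstra with a binary heap.

    `graph` maps each node to a list of (neighbor, weight) pairs
    with non-negative weights. Returns a dict of shortest
    distances from `source` to every reachable node.
    """
    dist = {source: 0}
    heap = [(0, source)]          # (distance, node) priority queue
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue              # stale queue entry, skip it
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd      # found a shorter path to v
                heapq.heappush(heap, (nd, v))
    return dist

# Tiny example network
graph = {
    "a": [("b", 4), ("c", 1)],
    "c": [("b", 2), ("d", 5)],
    "b": [("d", 1)],
}
print(dijkstra(graph, "a"))  # {'a': 0, 'b': 3, 'c': 1, 'd': 4}
```

The heap repeatedly extracts the closest unsettled node, and that sorting-like extraction step is exactly the bottleneck the new algorithm sidesteps.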
The 40-year wait
Dijkstra's algorithm was revolutionary when it was created, and its fastest implementation, running in O(m + n log n) time with Fibonacci heaps (Fredman and Tarjan, 1984), set the theoretical ceiling that researchers have been trying to break ever since; that is where the four decades come from. Various improvements and optimizations have been made over the years, but none broke through that barrier for general directed graphs until now.
When dealing with massive networks, such as global internet infrastructure, transportation systems, or social networks with billions of connections, even small improvements in efficiency translate into dramatic real-world benefits. The practical implications are substantial, for example in:
- Navigation and logistics: GPS systems and delivery route optimization could become noticeably faster, especially for complex multi-stop routes or real-time traffic adjustments.
- Network infrastructure: Internet routing protocols could handle traffic more efficiently, potentially improving connection speeds and reliability across the web.
- Graph analysis: Social networks, recommendation systems, and any application that analyzes relationships between data points could see significant performance improvements.
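To see why the exponent matters at these scales, compare the log n factor in Dijkstra's priority-queue bound against the new algorithm's log^(2/3) n factor as networks grow. This back-of-the-envelope comparison ignores the constant factors hidden by asymptotic notation, so it shows growth rates, not real-world running times:

```python
import math

# Per-node cost factor: Dijkstra's bound carries a log n term,
# the new algorithm a log^(2/3) n term. Watch the gap widen.
for n in (10**3, 10**6, 10**9):
    log_n = math.log2(n)
    new_factor = log_n ** (2 / 3)
    print(f"n = {n:>13,}   log n = {log_n:5.1f}   "
          f"log^(2/3) n = {new_factor:5.1f}   "
          f"ratio = {log_n / new_factor:.1f}x")
```

The ratio keeps growing with n, which is why the improvement matters most for exactly the billion-node networks listed above.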
Optimization, optimization, everywhere
My favourite thing about this breakthrough is that it shows that there's enormous room for optimization even in fields we assume are already mature. So what about AI? Well, the DeepSeek moment taught us that throwing more compute at problems isn't the only path forward. When DeepSeek achieved competitive performance with dramatically fewer resources, it revealed that the AI field is still ripe for algorithmic breakthroughs rather than just scaling existing approaches.
We're often told that progress requires exponentially more data, compute, and energy. But instead of accepting that we need more powerful hardware to handle larger networks, algorithmic researchers focus on finding fundamentally better ways to solve problems.
In my view, there are likely countless algorithmic improvements waiting to be discovered. Consider how much room there might be for optimization in:
- Training efficiency: Better algorithms could reduce the massive compute requirements for training large models
- Inference optimization: Smarter approaches to running AI models could make them accessible to far more people
- Architecture design: We may still be in the early days of discovering optimal neural network structures
If 40 years' worth of untapped improvement was hiding in a problem as well-studied as shortest paths, imagine what breakthroughs are waiting in areas we've only been seriously working on for a decade.