Moore's Law Explained

Posted on June 11, 2025

Moore's Law is named for Intel co-founder Gordon Moore, who observed in 1965 that the number of transistors on a microchip was doubling on a regular schedule while the cost per transistor fell. He initially put the doubling at once a year and revised it to roughly every two years in 1975, which is the version quoted today. This observation has driven the exponential growth in computing power that has transformed society over the past half-century, though its continuation faces increasing physical and economic challenges.
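
Stated as a formula, the trend is simple exponential growth: N(t) = N(t0) * 2^((t - t0) / 2). The short Python sketch below illustrates that idealized curve; the Intel 4004 baseline (1971, roughly 2,300 transistors) is real, but the function names, the perfectly steady two-year cadence, and the relative-cost curve are simplifying assumptions for illustration, not industry data.

```python
def projected_transistors(year, base_year=1971, base_count=2300, doubling_period=2.0):
    """Idealized Moore's Law transistor count.

    Defaults use the Intel 4004 (1971, roughly 2,300 transistors) as a
    baseline and assume a perfectly steady two-year doubling, which real
    chips only approximate.
    """
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings


def projected_relative_cost(year, base_year=1971, halving_period=2.0):
    """Idealized cost per transistor, relative to the baseline year (= 1.0)."""
    halvings = (year - base_year) / halving_period
    return 0.5 ** halvings


if __name__ == "__main__":
    for year in (1971, 1991, 2011, 2025):
        count = projected_transistors(year)
        cost = projected_relative_cost(year)
        print(f"{year}: ~{count:,.0f} transistors, relative cost per transistor {cost:.1e}")
```

On this idealized curve, the 27 doublings between 1971 and 2025 multiply the count by about 2^27, yielding hundreds of billions of transistors, the same order of magnitude as the largest chips shipping today.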

The law isn't a law of nature but rather an observation about the semiconductor industry's innovation pace. It became a self-fulfilling prophecy as chip manufacturers used it to set research and development targets. This exponential improvement meant that computers became dramatically faster, smaller, and cheaper at a predictable rate. A smartphone today has millions of times more computing power than the room-sized computers of the 1960s.

The implications extend far beyond just faster processors. Moore's Law enabled the personal computer revolution, the internet, smartphones, and artificial intelligence. It made computation so cheap that we now embed processors in everything from toasters to cars. The predictability of improvement allowed companies to plan products years in advance, knowing what computational power would be available.

However, Moore's Law is reaching physical limits. As transistors approach atomic scales, quantum effects and heat dissipation become major challenges. Manufacturing costs are also skyrocketing: a leading-edge chip fabrication plant now costs over $20 billion. The industry is exploring alternatives such as 3D chip architectures, quantum computing, and specialized processors for AI. While the exact formulation of Moore's Law may end, the drive for computational improvement continues, just through different means. Understanding this historical trend helps contextualize the rapid technological change we've experienced and what might come next.