
Moore's Law is dead, but innovation will continue

The "silicon area", that the base of the production of semiconductors present in our computers, is no longer able to take the step established by the famous "Moore's Law", with the theory that is soon destined to be buried. This was stated by a substantial deepening of Nature.com site, which claims that soon the company will stop industry to pursue the dictates of the old theory, that only last year celebrated its fiftieth year of birth.

Moore's Law was first formulated in 1965 by the legendary Intel co-founder Gordon E. Moore. It has been stated in various ways over the years, but in its simplest form the concept expressed by its author is that "the complexity of a chip, measured for example by the number of transistors per chip," would double from year to year. It is an assumption that the companies in the sector, Intel among them, have managed to keep up with, considering the 1970s revision of the law, which changed the prediction to a doubling every eighteen months.
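To illustrate what the doubling rule actually predicts, here is a minimal sketch (not part of the original article) that projects transistor counts under the original annual doubling and the later 18-month revision. The 1971 starting point of roughly 2,300 transistors for the Intel 4004 is an assumed reference figure used only for illustration.

```python
# Minimal sketch of Moore's Law as an exponential doubling rule.
# Assumed reference point: the Intel 4004 (1971) with ~2,300 transistors.

def projected_transistors(base_count, years_elapsed, doubling_period_years):
    """Project a transistor count after `years_elapsed` years,
    assuming one doubling every `doubling_period_years` years."""
    return base_count * 2 ** (years_elapsed / doubling_period_years)

base = 2_300  # transistors in the assumed reference chip
for year in (1981, 1991, 2001, 2011):
    elapsed = year - 1971
    yearly = projected_transistors(base, elapsed, 1.0)   # original 1965 form
    revised = projected_transistors(base, elapsed, 1.5)  # 18-month revision
    print(f"{year}: ~{yearly:,.0f} (annual doubling) vs ~{revised:,.0f} (every 18 months)")
```

Even under the more conservative 18-month rule, the projection grows by roughly a factor of 100 per decade, which is why sustaining it requires relentless miniaturization.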

Nature's long analysis, however, describes a different reality for the coming years, one the semiconductor industry has anticipated for some time. Modern chips are already extremely complex in their implementation and rely on production processes that are already highly miniaturized, to the point that increasing their density is becoming ever more difficult and expensive. Given this, doubling capacity every 18 months looks very hard to achieve today.

The most advanced chips currently on the market use 14nm production processes, where a nanometer is a billionth of a meter. At this scale the individual features that make up a chip are smaller than a typical viral particle and comparable in size to the outer membrane of a bacterium. Printing a 14nm chip has become so complex and expensive that to date only four companies manage to keep pace: Intel, Samsung, TSMC and GlobalFoundries.
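To put those scales in perspective, here is a small illustrative calculation (not from the article); the ~100nm virus diameter used for comparison is an assumed "typical" figure.

```python
# Illustrative unit comparison: 14 nm chip features versus a typical virus.
NM_PER_M = 1e9                 # one nanometer is a billionth of a meter

feature_size_nm = 14           # current leading-edge process node
virus_diameter_nm = 100        # assumed typical virus diameter (illustrative)

print(f"14 nm = {feature_size_nm / NM_PER_M:.1e} m")
print(f"A 14 nm feature is roughly {virus_diameter_nm / feature_size_nm:.0f}x "
      f"smaller than a ~{virus_diameter_nm} nm virus particle")
```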

By comparison, only 10 years ago there were no fewer than 18 such companies on the market. The next step for the market will be 10nm, which should arrive in consumer chips in 2016. After that will come 7nm, expected in 2018 or 2019; TSMC has already begun work on a 7nm manufacturing process with the goal of having the first prototypes ready for next year. It is at 5nm that the first physical limits will appear, limits that are hard to get around with current know-how.
