AI is reshaping chip design tools, and the results are impossible to ignore

Bob O'Donnell

Why it matters: As powerful as AI may be, many industries are still struggling to find clear-cut applications that make a measurable, demonstrable difference. Thankfully, that is not the case when it comes to chip design software. In fact, since their introduction just a few years ago, AI-powered features have become a mainstay of EDA (Electronic Design Automation) tools from companies such as Cadence and Synopsys.

Silicon designers quickly discovered that many of the complex yet often tedious tasks involved in their process – particularly the "grunt work" – could be automated or dramatically simplified by AI algorithms. From the automated layout of certain IP blocks to more efficient IP block interconnects, these AI features accelerate the less creative (but still critical) parts of the workflow, allowing designers to focus on the more interesting and innovative aspects of chip development.

In addition, AI-powered tools can drive impressive improvements in chip performance and energy efficiency. Case in point: vendors like Cadence have reported performance improvements of up to 60% on specific blocks within a chip as a result of AI enhancements.

Power improvements of up to 38% have also been reported thanks to these tools. Along the way, silicon engineers discovered that AI-powered features could shorten the time needed to finish a chip design – in some cases by as much as 10×.

In short, these AI-powered EDA programs deliver exactly the AI-enhanced scenario many organizations are looking for: increased productivity and more engaging work.

Not surprisingly, this has also led to significant growth in the use of AI-powered capabilities in modern chip design tools. In fact, based on public data regarding the number of chip design tapeouts disclosed by major companies like Cadence and Synopsys, as well as their estimates of AI feature adoption, the industry is now crossing a critical threshold.

Specifically, just over 50% of advanced silicon designs (those built with 28nm process technologies and smaller) are now believed to be AI-assisted. Looking ahead, it's easy to predict that this percentage will continue to grow significantly over the next few years.

Given that there were zero AI-assisted tapeouts just four years ago, that's impressive progress. More importantly, it's a great example of how practical applications of AI technology can have a profound impact on a business's evolution. The fact that it happens to be in the chip industry (and, appropriately, likely involves a significant percentage of chips designed to accelerate AI computing!) makes the moment even more relevant and consequential.

According to Cadence, these AI features can reduce chip design times by as much as a month, which is a significant positive impact. Plus, as mentioned earlier, it's a benefit that can be directly tied to the AI features – about as concrete an example of the technology's benefits as you could ever want.

The power and performance improvements alone make the enhancements enabled by AI incredibly valuable. However, toss in the increased efficiency of the work that silicon engineers can achieve with these tools, and the story gets that much stronger.

It's easy to see why so many people in the world of semiconductor design – including industry leaders like Nvidia, AMD, Qualcomm, MediaTek, Samsung Semiconductor, Marvell, and Broadcom – are so excited about the possibilities for AI in their product creation tools (as well as for the AI accelerators they're going to be designing with those tools!).

The timing of the crossover point also ties in nicely with a number of other semiconductor industry developments. Most notably, the past few years have seen a big increase in the number and variety of companies working on advanced chip designs.

From cloud computing providers such as Google, Microsoft, and Amazon's AWS to device makers like Apple, Samsung, and more, many organizations are pursuing custom silicon as a critical means of differentiation. However, the number of skilled chip designers in the world is still relatively limited, so AI tools that enable even junior designers – or others with limited experience – to take on more sophisticated chip layout tasks are critically important to keep the semiconductor industry advancing.

Even for the long-time semiconductor players, these enhancements create new possibilities, including the ability to create more designs, build more customized options, and run more projects in parallel. Creating more customized designs, in particular, is something that many in the chip industry (and their chip-buying clients) have wanted for a very long time; the practical realities of doing so with traditional design tools, however, had kept it out of reach. Now all of these capabilities can translate into opportunities to build on the rapid growth the semiconductor industry has seen over the last few years.

Another important point is that as semiconductor designs move to smaller and smaller process nodes and the number of transistors per chip continues to expand, AI chip design features are quickly evolving from a nicety to a necessity. The number of factors, permutations, and connections that chip designers face is growing rapidly, and the work of creating these sophisticated new chips demands the enhanced guidance that a well-designed AI-powered tool can provide, as the rough sketch below illustrates.
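
To get a sense of that combinatorial growth, here's a back-of-the-envelope sketch in Python (a toy illustration of the math, not a figure or method from Cadence, Synopsys, or any EDA vendor). Even the drastically simplified problem of choosing an ordering for N circuit blocks has N! candidates, which is why exhaustive search is hopeless and heuristic – increasingly AI-guided – search is the practical route:

```python
import math

# Toy model: ordering N circuit blocks yields N! candidate arrangements.
# Real placement/routing problems carry far more constraints (timing,
# power, routing); this just shows how fast even the simplest search
# space blows up.
for n in (10, 20, 30):
    print(f"{n} blocks -> {float(math.factorial(n)):.3e} possible orderings")
```

Even at 30 blocks, the raw space exceeds 10^32 arrangements; actual chips contain billions of transistors, so design tools must prune intelligently rather than enumerate.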

While AI adoption in certain industries hasn't been as fast, or its influence as profound, as many first expected, it's also becoming very clear that in targeted applications, it's proving even more impactful than many hoped.

With the transition to AI-enhanced chip designs crossing over this important 50% barrier, it's apparent that EDA tools are unquestioned beneficiaries of these advances. From a semiconductor industry perspective, it's also clear we're entering an exciting new AI era.

Bob O'Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on X.

I can see AI speeding up the iteration process a lot, but take the performance and power efficiency gains with a big grain of salt. Cutting edge silicon is already hyper optimised. Most of the gains will come from the engineers being able to iterate a lot more on the design.

The problem is that even with AI being used, we are seeing a huge increase in the release time/cadence between new generations of CPUs and GPUs, especially GPUs.
 
*ML, Not AI.
Exactly! Organisations have been using ML for these sorts of purposes – pattern recognition, image manipulation, protein folding, etc. – for decades. It has nothing to do with AI. I'm so sick of this AI misnomer and the hype/bullsh1t around the current brute-force technology used in what we suddenly started calling 'AI'. The main problem is that it will hide the true AIs, which are on their way in the next few years. They are a very different proposition, and we should be extremely wary of how they are built and especially who is in charge of them. Utterly amoral companies like Google and Meta being in charge will be the death of us all.
 
ML [...] has nothing to do with AI
ML is a subset of the broader field known as AI. I'm also tired of seeing the term "AI" thrown around in every possible context as if it were just invented yesterday. But let's be honest: the progress in this field over the past 10 years has been remarkable, and we owe that to the people who saw its potential and worked hard to get us here. Unlike them, I used to laugh at the idea of AI – 25 years ago I was building and training small neural networks at school and thought it was kind of a joke.
 
AI/ML/DL have been making a serious difference for years in science, medicine, and engineering, so this is not surprising. People are trying to equate desktop plagiarising engines with serious AI being used in research and development.
 