AI-Powered Chip Design Goes Mainstream
AI in chip design has gone mainstream, with Synopsys’ AI-powered DSO (design space optimization) tool reaching 100 commercial tapeouts.
The company’s DSO was released in 2020, with its first commercial tapeout in August 2021. The tool uses reinforcement learning to tackle physical layout optimization—that is, all the physical aspects of chip design.
“This technology has seen very fast adoption over the past two and a half years,” Stelios Diamantidis, distinguished architect at Synopsys, told EE Times. “Importantly, [the 100 tapeouts] are across design nodes, across vertical markets and with very high coverage of the semiconductor leaders.”
Synopsys expects “hundreds” more tapeouts this year, according to Diamantidis. The EDA company is working with customers designing cutting-edge chips, including CPUs, GPUs and AI accelerators, with new applications continuing to emerge.
Synopsys customer SK Hynix not only increased productivity using DSO but also reduced die size for one of its designs by 5%. That 5% silicon-area improvement can translate into a considerable economic benefit, given that memory products are produced in the tens or hundreds of millions of units, Diamantidis said.
The tool’s first 100 tapeouts include the first commercial tapeout using the cloud-based version of DSO: an STMicroelectronics 7-nm FinFET design. ST said that, using DSO, physical layout optimization for a new Arm core was 3× faster in terms of both time-to-result and engineering time spent on the design.
“We used the tools to accelerate the physical design of a key IP block that we integrated into an ASIC,” Indavong Vongsavady, methods, tools and infrastructure director of the RF and communication division at STMicroelectronics, told EE Times. “As is true for highly complex designs, we don’t have a one-size-fits-all approach to the design. We use different tools and approaches for different pieces and in this case, given DSO’s ability to let us perform lots of experiments in a short period of time to optimize performance, power and area, it made sense to use it on a particular block of IP.”
Using DSO during the design authoring process allowed ST to achieve a higher level of optimization than usual, he said, adding that ST is confident that the product will meet quality criteria during the normal verification and final sign-off process.
“We think that AI/ML algorithms are a good fit when we have extremely challenging optimization problems to solve,” Vongsavady added. “The best way to solve these is using a huge number of experiments, and this is where the DSO.ai tool shines because it offloads the burden of mechanical tasks and gives our engineers more time for creative thinking.”
The decision to run DSO in Microsoft’s Azure cloud was made largely for flexibility and convenience; ST performed tens to hundreds of parallel runs, each on 16 or 32 CPU cores, peaking at a few thousand CPU cores in total.
“The cloud enhances our storage and compute capacity, and we exploited that elasticity in our use of the tool,” Vongsavady said. “We can access as much server and CPU capacity as we need for massive parallel experiments without the risk of impacting other ST projects running on-premises.”
DSO uses a technique called reinforcement learning to tackle physical layout. The problem is complex: place and route alone has a state space on the order of 10^90,000. Reinforcement learning is a good fit because it doesn’t require masses of training data; gathering such data would mean access to Synopsys’ customers’ chip designs, which are proprietary. Instead, the algorithm can be trained by designing chips from scratch and receiving a score for how well it does. Over many iterations, it learns to achieve a better score.
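DSO’s internals are proprietary, but the loop described here can be sketched in a few dozen lines. The toy Python example below is entirely hypothetical: a six-cell netlist placed on a 4×4 grid, scored by wirelength and optimized with a simple policy-gradient method. It is only meant to show how an agent can improve a layout score through repeated trials, without any pre-existing training data.

```python
# Minimal, illustrative sketch of reinforcement learning on a toy placement
# problem. This is NOT DSO (whose internals are proprietary); it only shows
# the loop described above: an agent places cells, receives a score (here,
# negative wirelength) and, over many attempts, learns to improve that score.
import numpy as np

rng = np.random.default_rng(0)

N_CELLS, GRID = 6, 4                                      # 6 cells on a 4x4 grid
NETS = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (0, 5)]   # toy netlist: pairs of connected cells

# Policy: one softmax over grid slots per cell, parameterized by a logits table.
logits = np.zeros((N_CELLS, GRID * GRID))

def wirelength(placement):
    """Total Manhattan wirelength of the toy netlist for a given placement."""
    xy = np.array([(slot % GRID, slot // GRID) for slot in placement])
    return sum(abs(xy[a] - xy[b]).sum() for a, b in NETS)

def sample_episode():
    """Place cells one at a time, sampling each slot from the current policy."""
    placement, free, trace = [], list(range(GRID * GRID)), []
    for cell in range(N_CELLS):
        masked = logits[cell, free]
        probs = np.exp(masked - masked.max())
        probs /= probs.sum()
        idx = rng.choice(len(free), p=probs)
        trace.append((cell, list(free), probs, idx))
        placement.append(free.pop(idx))
    return placement, trace

# REINFORCE: reward = -wirelength; nudge the policy toward better-than-average placements.
lr, baseline = 0.05, None
for step in range(3000):
    placement, trace = sample_episode()
    reward = -wirelength(placement)
    baseline = reward if baseline is None else 0.95 * baseline + 0.05 * reward
    advantage = reward - baseline
    for cell, free, probs, idx in trace:
        grad = -probs
        grad[idx] += 1.0                                  # gradient of log prob(chosen slot)
        logits[cell, free] += lr * advantage * grad

final_placement, _ = sample_episode()
print("final wirelength:", wirelength(final_placement))
```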
Part of DSO’s secret sauce is the ability to also learn from completed designs. In this case, customers use their own existing designs to train their version of the algorithm via supervised learning, bringing knowledge of prior designs to bear. This can help the AI converge faster on an optimal design.
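In machine-learning terms, this is akin to behavioral cloning: supervised updates bias the policy toward choices seen in prior designs before reinforcement learning takes over. Continuing the toy sketch above, with a hypothetical data format and helper (Synopsys has not published how DSO ingests a customer’s existing designs):

```python
# Illustrative warm-start of the toy placement policy from prior placements
# (behavioral cloning). The data shape and helper are hypothetical, not
# Synopsys' actual pipeline.
import numpy as np

def warm_start(logits, prior_placements, lr=0.1, epochs=50):
    """Cross-entropy updates that push each cell's policy toward the slot it
    occupied in prior designs, before any reinforcement learning runs."""
    for _ in range(epochs):
        for placement in prior_placements:          # each maps cell index -> grid slot
            for cell, slot in enumerate(placement):
                probs = np.exp(logits[cell] - logits[cell].max())
                probs /= probs.sum()
                grad = -probs
                grad[slot] += 1.0                   # gradient of log prob(observed slot)
                logits[cell] += lr * grad
    return logits

# Standalone demo with the same 6-cell / 4x4-grid shape as the sketch above;
# in practice the warm-started logits would replace the zero initialization
# before the reinforcement-learning loop runs.
prior = [[0, 1, 5, 6, 10, 11], [1, 2, 6, 7, 11, 12]]
logits = warm_start(np.zeros((6, 16)), prior)
print(logits.argmax(axis=1))                        # slots the cloned policy now prefers
```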
While much of the AI industry’s recent focus has been on transformers and large language models, Synopsys’ Diamantidis said reinforcement learning remains the best fit for this application.
“We continue to see tremendous value in reinforcement learning,” he said. “Comparing DSO in 2020 to the 2023 version, there is a clear evolution of technology as we ourselves become smarter and more research emerges that we can leverage.”
While reinforcement learning will continue to be impactful both in EDA and other applications, it isn’t the only horse in the race. Synopsys is interested in possible combinations with other learning paradigms, such as generative AI and graph networks, but reinforcement learning will continue to be the backbone, Diamantidis said.
“We may not have big data, which is generated out in the wild and we get to go out and mine it, but we do have what [machine learning pioneer] Andrew Ng calls ‘good data,’ which is we understand the origin of the data and can make some assumptions about its classification and behavior and that gives us some advantages that again play to reinforcement learning’s strengths.”
Synopsys will also continue to develop its statistical learning (non-AI) tools, according to Diamantidis.
One of the key downstream effects of AI-powered chip design is to make the ASIC model more attractive, since customized versions of ASICs for particular verticals or customers can be taped out more quickly and easily. Other downstream effects of designing better-performing chips on vastly condensed timescales with fewer engineering resources include prolonging Moore’s law by squeezing better performance from older process nodes, and accelerating market entry for companies without chip design experience.
This is what DSO can achieve today, but what’s next for AI-powered EDA tools? Synopsys CEO Aart de Geus gave some hints in his keynote at ISSCC 2022, in which he talked about using AI to more efficiently remaster silicon designs for different process nodes or different foundries. (This is a future technology; DSO’s remastering capability doesn’t have any commercial tapeouts yet.)
In an interview with EE Times at the time, de Geus said that AI-powered remastering has the potential to help ease fab capacity issues. The trick is using learning from the original chip design when remastering the chip for a different node, rather than starting from scratch.
“It was quite remarkable how the learning from [a tapeout at] N40 was absolutely still applicable in N10, which is 3 nodes more advanced,” he said. “This tells me that certain characteristics of the design really determine its essence: its bottlenecks, what makes the speed, what makes the power and so on.”
Compared with a human-designed chip remastered for N10, the AI-designed version was faster and more power-efficient, and it also took less time to design.
“Something else is coming out of these experiments: the notion that if you are using high tech, which is fast-moving, but for products that have a long life cycle, you should start designing some of those products such that if you had to make a new version of it, you already captured the learning or the spec in such a way that in seven years it can be designed more easily,” he added.
Not only can AI design better chips than human engineers, but we will need AI to keep up with future increases in device complexity—just like we needed design automation 35 years ago, de Geus said. The number of dimensions that need to be optimized is growing, including speed, area, dynamic power, static power, reliability, manufacturability and yield.
“One way that humans design when they don’t know the exact answer is they put in margin,” he said. “AI can find itsy bitsy little margin pieces all over the place, and so maybe it’s a very small piece that it finds, but it finds 3 billion of them, and you can see the human challenge with that.”
Sally Ward-Foxton covers AI for EETimes.com and EE Times Europe magazine. Sally has spent the last 18 years writing about the electronics industry from London. She has written for Electronic Design, ECN, Electronic Specifier: Design, Components in Electronics, and many more news publications. She holds a master’s degree in electrical and electronic engineering from the University of Cambridge. Follow Sally on LinkedIn.