**London**: As semiconductor verification faces increasing complexity, experts highlight the need for smarter methodologies, automation, and the integration of AI tools. Discussions focus on optimising testing processes to enhance efficiency and performance, while addressing challenges such as coverage closure and targeted test prioritisation.
In the rapidly evolving semiconductor industry, verification teams face increasing complexity in their tasks, demanding substantial advances in methodology and automation. Because many of the tools currently available run on single processor cores, functional simulation capacity remains limited. The challenge lies in balancing total simulation throughput against the need for fast turnaround during debug cycles.
During a recent discussion, Frank Schirrmeister, executive director for strategic programs and systems solutions at Synopsys, highlighted three primary areas of improvement within semiconductor verification. He noted, “There is the base speed of the engines, and that is increasing. The second thing we refer to as ‘smarter verification.’… The third is automation and AI.” He likened this triad to a “three-legged stool”: progress depends on engine speed, smarter methodology, and automation advancing together.
Industry experts agree that working more intelligently and applying automation can significantly improve performance. Bryan Ramirez, director of solutions management at Siemens EDA, emphasised the importance of understanding which tests have been run. “Knowing what you’ve tested is critical,” he remarked, pointing out that aligning test scripts with design specifications is an area of robust industry investment. Even so, effective coverage closure remains a challenge, despite improvements in automating coverage collection across the various verification engines.
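To make that alignment concrete, the following is a minimal sketch of requirement-to-test traceability in Python. The requirement IDs, test names, and the `TEST_MAP` structure are hypothetical stand-ins for what a real flow would extract from test metadata or a coverage database.

```python
# Minimal sketch: map tests to the spec requirements they exercise and
# report coverage gaps. All names and data here are hypothetical.
REQUIREMENTS = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}

TEST_MAP = {
    "test_reset_sequence": {"REQ-001"},
    "test_bus_arbitration": {"REQ-002", "REQ-003"},
}

def coverage_report(requirements, test_map):
    """Report which requirements are exercised and which remain open."""
    covered = set().union(*test_map.values()) if test_map else set()
    return {
        "covered": sorted(covered & requirements),
        "uncovered": sorted(requirements - covered),
    }

if __name__ == "__main__":
    report = coverage_report(REQUIREMENTS, TEST_MAP)
    print(f"Covered:   {report['covered']}")
    print(f"Uncovered: {report['uncovered']}")  # REQ-004 has no test yet
```

The uncovered list is exactly what coverage-closure effort then targets: requirements no test in the suite currently touches.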
The question of which tests to prioritise in a regression suite is crucial. As Bradley Geden, director of product marketing at Synopsys, elaborated, “When you make a change to your RTL, do you need to run the whole regression suite?” He noted that the objective of automation is to identify high-value tests that should be executed first. This strategic approach enables engineers to focus on eliminating bugs, while also potentially speeding up the entire verification process.
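As an illustration of that idea, here is a minimal sketch of change-based test prioritisation, assuming we already know which RTL modules each test exercises and each test’s historical failure rate. The module names, test records, and scoring heuristic are all hypothetical.

```python
# Minimal sketch: order regression tests so those touching changed RTL
# (and historically fragile ones) run first. Data is hypothetical.

def prioritise(tests, changed_modules):
    """Sort tests by overlap with the change, then by past failure rate."""
    def score(test):
        overlap = len(test["modules"] & changed_modules)
        return (overlap, test["fail_rate"])  # higher scores run first
    return sorted(tests, key=score, reverse=True)

tests = [
    {"name": "test_dma", "modules": {"dma", "bus"}, "fail_rate": 0.10},
    {"name": "test_uart", "modules": {"uart"}, "fail_rate": 0.02},
    {"name": "test_bus_stress", "modules": {"bus"}, "fail_rate": 0.30},
]

for t in prioritise(tests, changed_modules={"bus"}):
    print(t["name"])  # test_bus_stress, test_dma, then test_uart
```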
One of the notable challenges is determining which tests can be bypassed altogether. Ramirez remarked, “The fastest simulation cycle is one that you don’t have to run at all,” pointing to an area for improvement in helping teams identify when simulations can be skipped, thereby increasing overall efficiency.
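One common way to decide that a simulation can be skipped is content fingerprinting: if nothing a test depends on has changed since its last passing run, rerunning it adds no information. The sketch below assumes a simple file-based dependency list and a JSON cache; both are hypothetical.

```python
# Minimal sketch: skip a test when its dependency fingerprint matches
# the one recorded at its last passing run. Paths are hypothetical.
import hashlib
import json
from pathlib import Path

CACHE = Path("regression_cache.json")  # hypothetical cache location

def fingerprint(dep_paths):
    """Stable hash over the contents of a test's dependencies."""
    h = hashlib.sha256()
    for p in sorted(dep_paths):
        h.update(Path(p).read_bytes())
    return h.hexdigest()

def should_run(test_name, dep_paths):
    """True unless nothing the test sees has changed since it last passed."""
    cache = json.loads(CACHE.read_text()) if CACHE.exists() else {}
    return cache.get(test_name) != fingerprint(dep_paths)

def record_pass(test_name, dep_paths):
    """Update the cache only after a pass, so failing tests always rerun."""
    cache = json.loads(CACHE.read_text()) if CACHE.exists() else {}
    cache[test_name] = fingerprint(dep_paths)
    CACHE.write_text(json.dumps(cache))
```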
AI has emerged as a pivotal tool within the verification process. Matt Graham, product management group director at Cadence, commented on the impact of AI on test pattern generation, stating, “AI can help optimize wasted cycles and redundant tests.” AI’s influence is not uniformly positive, however. Ramirez warned that while generative AI can mean faster code creation, it may also yield lower-quality code and, in turn, a heavier verification burden.
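One way redundant tests can be pruned, with or without AI in the loop, is by comparing the coverage each test contributes. The sketch below shows a greedy pass that keeps only tests adding new coverage points; the coverage points and test names are hypothetical placeholders for data a real flow would pull from the simulator’s coverage database.

```python
# Minimal sketch: greedily keep tests that add new coverage, dropping
# those whose coverage is already subsumed. Data is hypothetical.

def prune_redundant(tests):
    """tests: list of (name, frozenset of coverage points)."""
    # Consider the broadest tests first so subsumed ones drop out.
    kept, seen = [], set()
    for name, cov in sorted(tests, key=lambda t: len(t[1]), reverse=True):
        if cov - seen:              # contributes at least one new point
            kept.append(name)
            seen |= cov
    return kept

tests = [
    ("test_full_bus", frozenset({"cp1", "cp2", "cp3"})),
    ("test_bus_read", frozenset({"cp1"})),   # subsumed: dropped
    ("test_irq", frozenset({"cp4"})),
]
print(prune_redundant(tests))  # ['test_full_bus', 'test_irq']
```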
When discussing verification stages, Schirrmeister pointed out the necessity of finding bugs early in the development pipeline, describing the frustration that arises when issues discovered during integration could have been caught earlier. The discussion underscored the need for tailored methodologies that fit the contours of the design process, given the disparity between abstraction levels in the development of complex systems.
Bernie Delay, senior engineering director at Synopsys, noted the need for higher-level abstractions focused on particular scenarios, especially at the system-on-chip (SoC) level. “The scenarios you’re trying to create are not lower-level IP scenarios,” he said, advocating methodologies that concentrate on specific system-level issues, such as cache coherency.
Experts also called attention to the integration of different abstraction levels, as models become more critical in managing design complexity. Knowing which models to apply, and when, can drive significant performance gains. Ramirez noted that top-down development methodologies are increasingly common, with the focus shifting towards dynamically deciding when to use abstract models based on what the test is actually exercising at a given moment in simulation.
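A minimal sketch of that kind of dynamic model selection is shown below: blocks under active test get a detailed model, while everything else uses a fast abstract one. The class and block names are hypothetical placeholders, not any vendor’s API.

```python
# Minimal sketch: instantiate the detailed model only where the test is
# looking, keeping the rest of the system fast. Names are hypothetical.

class AbstractCache:
    """Fast functional model: returns correct data, models no timing."""
    def read(self, addr):
        return f"data@{addr:#x}"

class DetailedCache:
    """Stand-in for a slower cycle-accurate, coherency-aware model."""
    def read(self, addr):
        # A real model would track coherency state, latencies, etc.
        return f"data@{addr:#x}"

def build_testbench(blocks, focus):
    """Use the detailed model only for blocks the test is targeting."""
    return {name: DetailedCache() if name in focus else AbstractCache()
            for name in blocks}

tb = build_testbench(blocks=["l1_cache", "l2_cache", "dma"],
                     focus={"l2_cache"})  # a coherency test targeting L2
print({name: type(model).__name__ for name, model in tb.items()})
```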
Ultimately, the recurring theme of the discussion was that verification is a process moulded by the specific needs of each design project. Delay encapsulated this sentiment, stating, “If you’re looking for one silver bullet, I don’t think you’re going to find that.” He advocated a multifaceted approach that combines AI/ML techniques with other methodologies to achieve timely, confident chip designs.
As the semiconductor industry continues to grow and evolve, the advancement of verification strategies plays a critical role. The ongoing improvements in methodologies and tools, coupled with the growing integration of AI, promise to significantly enhance capabilities and efficiencies within verification teams.
Source: Noah Wire Services