Highlights of DVCon EU 2021


Another virtual edition of DVCon Europe (October 26-27, 2021) has just come to an end.

The technical content of the conference was as diverse as always, but over recent years the scope of the event has been successfully broadened, with the ongoing aim of strengthening collaboration between software, hardware and system-level design and verification experts.

AMIQ was involved on multiple levels: sponsoring, exhibiting and attending technical presentations.

Some of the highlights of the technical presentations attended and enjoyed by the AMIQ team are presented below.


Hot Topics

There were two hot topics that caught our attention: Machine Learning (ML) and elastic verification infrastructure.

ML is sexy, catchy, hot and powerful, so many are tempted to use it to solve verification-related problems. Besides the multitude of papers, it was also addressed by a panel, Can ML Be The Driver Of Next-Generation Verification?, with panelists from Qualcomm, Cadence, IBM Research, Verifyter, Breker Verification Systems and Renesas. The panelists approached the discussion from different perspectives, showing some of their successes and failures. It was a landmark panel discussion for the ML topic.

The elastic verification infrastructure was addressed by the Anatomy of a verification flow panel which brought together people from AMD, ARM, SiFive and Mythic AI. The discussion started by enumerating some of the mandatory verification flow components, but as it unfolded, one requirement emerged: EDA vendors should provide elastic verification tools (e.g. simulation, emulation etc.) and infrastructure.

The industry is asking the EDA oligopoly to provide its application suites in the cloud (e.g. AWS, Microsoft Azure or proprietary) together with an updated business model. Although it might be doable today, some of the panelists highlighted the current business model as the bottleneck. The industry is asking for a pay-per-use model or a mixed model (i.e. a base yearly floating-license package plus pay-per-use), which adapts naturally to the project’s needs. This way the panelists expect to save some of the costs associated with licenses sitting unused outside the project peaks.
The moderator, Jean-Marie Brunet, highlighted that an elastic infrastructure solution can be very cost-efficient for start-ups (i.e. it avoids upfront infrastructure spend), while for established companies it might be more expensive than it is today. But an elastic infrastructure with an attractive business model is not enough for a cost-efficient verification flow; the verification environments and methodology must improve as well. Companies need to improve verification environment performance, improve debug flows, adapt regressions to their goals (e.g. debug, metrics collection, maintenance etc.), prioritize verification jobs better and, last but not least, do better verification planning and partitioning. Maybe companies should already start preparing their flows for the coming cloud-based solutions.
[Update 2021/11/10] I took the time to look deeper into the EDA vendors’ offerings, and all three vendors have a cloud-based solution. I am a bit puzzled: were the panelists not aware of the existing solutions? Or was this another PR panel to boost the “need” for cloud-based solutions?

AI, Machine Learning, Deep Learning

Artificial Intelligence In ASIC/SOC Verification (Paul Kaunds – P&C)

This tutorial presented an overview of different approaches in which ML techniques are used to improve different stages of the functional verification process.

The main focus was on reducing coverage closure time and resources by training artificial neural networks or deploying genetic algorithms.

At the end, the presenter outlined the major challenges faced in each of these research directions.

Although it showed several different strategies used across the industry, the attendees seemed more interested in specific examples and numbers, which were not included.

Machine Learning for Coverage Analysis in Design Verification (Jayasree Venkatesh – Qualcomm India)

An interesting presentation of a framework that reduces the code coverage closure time by harnessing the power of artificial neural networks.

The audience appreciated the level of detail regarding the mathematical model and the numbers that accompanied the conclusions. Different parameter values can be chosen in the early stages to adjust the impact of either the simulation time or the initial coverage percentage. This is quite convenient, since different projects have different requirements and use-case conditions.

SimPy and Chips: A Discrete Event Simulation framework in python for large scale architectural modelling of machine intelligence accelerators (Hachem Yassine, Daniel Wilkinson, Graham Cunnigham, Iason Myttas – Graphcore)

The intention was to develop a tool that can assess the challenges for SoC verification tasks across different topologies.

The paper proposed a Python framework built on SimPy, a discrete event simulation library, that aims to reduce the simulation time for complex SoCs. It has dedicated functions for modelling latencies, protocols, and pipeline stages.

The major limitation of SimPy was that it cannot schedule events in the future, which required some framework workarounds to broaden support. Also, a long sequence of events can significantly slow down simulation because the framework ends up spawning multiple events in the event heap memory.
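To illustrate the discrete-event style such a framework builds on, here is a minimal heap-based event scheduler in plain Python (no SimPy dependency; the fixed-latency link example and all names are hypothetical, not from the paper):

```python
import heapq

class EventSim:
    """Minimal discrete-event simulator: a time-ordered event heap."""
    def __init__(self):
        self.now = 0
        self._heap = []
        self._seq = 0  # tie-breaker so the heap never compares callbacks

    def schedule(self, delay, callback):
        """Schedule callback at now + delay (i.e. in the future)."""
        heapq.heappush(self._heap, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self):
        while self._heap:
            self.now, _, callback = heapq.heappop(self._heap)
            callback()

# Hypothetical example: packets traverse a link with a 5-cycle latency.
sim = EventSim()
arrivals = []

def send(pkt_id):
    sim.schedule(5, lambda: arrivals.append((sim.now, pkt_id)))

sim.schedule(0, lambda: send(0))
sim.schedule(2, lambda: send(1))
sim.run()
print(arrivals)  # [(5, 0), (7, 1)]
```

Each scheduled event sits on the heap until its timestamp becomes the smallest, which is also why long chains of fine-grained events inflate the heap and slow simulation down, as noted above.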

Nevertheless, this research direction is welcome and might have a future impact on the open-source community.

Optimizing Design Verification using Machine Learning: Doing better than Random (William Hughes, Sandeep Srinivasan, Rohit Suvarna – VerifAI & Maithilee Kulkarni – VerifAI/Xilinx)

The authors presented findings on using a Reinforcement Learning Agent (RLA) for optimizing FIFO depths for different DUTs.

The paper described how to fill code coverage using a Deep Q-Network (DQN), based on a small example with three levels of nested ‘if’ statements. The RLA learned to hit an average code depth of 1.8 after 500 episodes and 2.5+ after 2500 episodes.

The use cases focused on different DUTs like a MESI Cache Controller and a RISC V system.

The solution can be scaled for large projects with a very large number of coverpoints and bins by deploying different artificial neural networks in parallel.
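The nested-if example can be illustrated in a much-simplified form, with tabular Q-learning standing in for the paper's DQN (a deliberate swap to keep the sketch self-contained; the toy DUT and all names are hypothetical):

```python
import random

def dut_depth(x):
    """Toy DUT: three nested 'if' levels; returns the deepest level reached."""
    depth = 0
    if x % 2 == 0:
        depth = 1
        if x % 4 == 0:
            depth = 2
            if x % 8 == 0:
                depth = 3
    return depth

# Tabular Q-learning over a small discrete action space (input values 0..7).
random.seed(0)
q = [0.0] * 8           # single state; one Q-value per candidate input
alpha, eps = 0.1, 0.2   # learning rate, exploration rate

for episode in range(2000):
    if random.random() < eps:
        a = random.randrange(8)                    # explore
    else:
        a = max(range(8), key=lambda i: q[i])      # exploit
    r = dut_depth(a)            # reward = code depth reached
    q[a] += alpha * (r - q[a])  # single-step update (no next state)

best = max(range(8), key=lambda i: q[i])
print(best, dut_depth(best))  # the agent converges on an input reaching depth 3
```

The real work replaces the lookup table with a neural network so the approach scales to input spaces far too large to enumerate.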

Machine Learning based Structure Recognition in Analog Schematics for Constraints Generation (Rituj Patel, Husni Habal & Konda Reddy Venkata – Infineon Technologies)

A major issue for analog circuits is the manual identification of different analog blocks and stages in the circuit. The authors developed a machine learning solution that can automatically recognize the structures and estimate their number by analyzing the circuit meta-data.

The paper uses K-means clustering, an unsupervised ML approach, to identify the target structures, together with a Graph Convolutional Neural Network implemented in TensorFlow.
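As an illustration of the unsupervised half of this flow, here is a minimal K-means sketch in plain Python; the "circuit meta-data" features below are invented for the example and are not from the paper:

```python
import random

def kmeans(points, k, iters=20, seed=1):
    """Plain K-means: assign each point to its nearest centroid, then recenter."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # recenter on the mean of the assigned points
                centroids[i] = tuple(sum(xs) / len(cl) for xs in zip(*cl))
    return centroids, clusters

# Hypothetical 2D meta-data features for two structure types
# (e.g. normalized device count vs. fan-out), two well-separated groups:
group_a = [(1.0, 1.2), (0.9, 1.0), (1.1, 0.8)]
group_b = [(5.0, 5.1), (4.8, 5.3), (5.2, 4.9)]
centroids, clusters = kmeans(group_a + group_b, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]
```

On real schematics the features would come from netlist meta-data rather than hand-picked coordinates, and the cluster labels would feed the constraint generation step.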

SystemVerilog and UVM

Testbench flexibility as a foundation for success (Ana Sanz Carretero, Katherine Garden & Wei Wei Cheong – Xilinx)

This paper presented a successful approach for managing a multitude of testbench architectures.

It was a very technical presentation with implementation details of each component of a flexible verification architecture and coding methodology (with strong OOP foundations) that allows a single UVM testbench to adapt and support all levels of integration (unit-level, top-level, system level, etc.).

A highly recommended read if you are working with testbenches at different verification levels.

Handling Asynchronous Reset(s) Testing by building reset-awareness into UVM testbench components (Wei Wei Cheong, Katherine Garden & Ana Sanz Carretero – Xilinx)

Another interesting paper from the same authors (see above), about reset verification: a common and crucial procedure that verifies the DUT can enter and exit the reset phase cleanly and that its performance still conforms to the specification after reset.

Several testbench architectures and techniques for reset handling have been presented over the years, but this paper proposes an alternative approach: scheduling one or more asynchronous resets throughout a simulation and handling these random reset events in the UVM testbench components using the fork-disable_fork construct.

One Testbench To Rule Them All! (Salman Tanvir, Markus Brosch & Amer Siddiqi – Infineon Technologies)

Developing testbenches to support both vertical and horizontal reuse seems to be a real challenge in the SV/UVM world. The authors of this paper focused on defining the requirements for a testbench structure with common base code and approach.

This approach maximizes reuse, decreases the debug effort and simplifies the integration of an IP verification environment into higher verification levels (sub-system or chip level).

An example of testbenches which are similar in their needs, but different in their approach can be observed by comparing this paper with Testbench flexibility as a foundation for success.

Test driven Hardware Design and Verification (Bodo Hoppe, Holger Horbach – IBM Research & Development, Georg Gläser – IMMS, Tudor Timisescu – Infineon Technologies, Matthew Ballance – Siemens EDA, Philipp Wagner – FOSSi Foundation)

Although classified as a tutorial, this session was a fishbowl discussion with one moderator and five main participants. Everyone was invited to actively participate, with around 10 of the 50 attendees having had contact with or used Test-Driven Development (TDD).

The shared experiences covered different industry perspectives: one RTL designer used TDD to ensure his design works as intended, while others talked about their beginner experience with TDD or asked questions.

The discussion felt a bit too general and vague. The setup of the meeting was interesting, since you could actually contribute and steer the discussion, but the lack of code or concrete examples took away some of the interest.

Generating Stimulus

No Country For Old Men – A Modern Take on Metrics Driven Verification (Svetlomir Hristozkov, James Pallister & Richard Porter – Graphcore)

A simple mix of Python/C++/SQL/JavaScript/SV to improve our lives.

The presentation started with a discussion about the increasing demand for simulation power and resources. Since usage always increases, projects end up chasing something that can never be fully reached and exploited.

The proposal is to adapt the method of creating, reviewing and collecting coverage to speed up time to tapeout.

The most interesting idea in the paper was collecting coverage prior to testbench development. Both the constrained randomization and the coverage model were implemented in C++/Python, and coverage was collected and analyzed before development even started. Unit testing in Python was also part of the project model presented by the author.
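The idea of modeling constraints and coverage before any testbench exists can be sketched in Python; the packet fields, bins and constraint below are hypothetical stand-ins, not the authors' model:

```python
import random

# Hypothetical coverage model: cross of packet-length bucket and opcode.
LEN_BINS = [(1, 64), (65, 512), (513, 1500)]
OPCODES = ["read", "write", "flush"]

def len_bin(length):
    """Map a packet length onto its coverage bin index."""
    for i, (lo, hi) in enumerate(LEN_BINS):
        if lo <= length <= hi:
            return i
    return None

def constrained_random(rng):
    """Stand-in for SV constrained randomization: flush packets stay short."""
    op = rng.choice(OPCODES)
    length = rng.randint(1, 64) if op == "flush" else rng.randint(1, 1500)
    return length, op

rng = random.Random(42)
hit = set()
for _ in range(1000):
    length, op = constrained_random(rng)
    hit.add((len_bin(length), op))

total = len(LEN_BINS) * len(OPCODES)
print(f"coverage: {len(hit)}/{total} cross bins")
```

Running the model immediately exposes structurally unreachable crosses (here, long flush packets) before a single line of SystemVerilog is written, which is exactly the kind of early analysis the paper advocates.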

The presentation was delivered as a very well done demo; both the code and the continuous integration tools were shown with visible results. The only downside was that the time allocated to the session was too short for the complexity of the lecture.

Accelerated Coverage Closure by Utilizing Local Structure in the RTL Code (Rhys Buggy, Gokce Sarar, Guillaume Shippee, Han Nuon, Vishal Karna & Tushit Jain – Qualcomm Technologies)

The paper used genetic algorithms to reduce the number of stimuli required for coverage closure, with two IPs having different levels of complexity for comparison.
The results were very interesting, showing a coverage closure time 44x shorter than with the pure random approach.
The solution relied on identifying the space of valid parameter values that maximize coverage.
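As a rough illustration of the genetic-algorithm approach (not the authors' implementation), here is a toy Python GA in which fitness is a stand-in coverage count over hypothetical bins, one per genome bit:

```python
import random

random.seed(3)
TARGET_BINS = 16  # toy coverage space: one bin per bit position

def fitness(genome):
    """Coverage proxy: number of bins (bits) this stimulus hits."""
    return sum(genome)

def evolve(pop_size=20, genome_len=16, generations=30):
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genome_len)  # single-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(genome_len)       # point mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best), "/", TARGET_BINS)
```

In the real flow the fitness function would run a simulation and count actual coverage bins, which is also where the 44x reduction in stimulus count pays off: far fewer simulations are needed than with pure random generation.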


Accelerate Signoff with JasperGold RTL Designer Apps (Tutorial) (Bijitendra Mittra, Kanwarpal Singh – Cadence)

This tutorial took the attendees through the Cadence JasperGold Superlint and CDC applications, which add formal verification technology and functional checks to the traditional structural checks for LINT, CDC and RDC.

The apps aim to help designers identify the real problem violations, confirm fixes, and justify waiving the violations that are not problematic.

Attendees learned how this combination is able to “shift left” the discussed checks, providing a much more complete level of automated verification.

Automated Code Checks To Accelerate Top-Level Design Verification (Tutorial) (Nicolae Tusinschi – OneSpin Solutions)

A tutorial based on 360 DV-Inspect from OneSpin, an automatic code inspection tool which aims to help designers detect bugs before the functional verification process.

It covers three different verification perspectives: Structural Analysis (semantic analysis, i.e. a code linter), Safety Checks (assertions based on RTL synthesis) and Activation Analysis (code reachability, FSM and dead-code checks).

The tutorial focused on presenting the types of checks performed by the tool and their specific user-experience items.

Detection Of Glitch-Prone Clock And Reset Propagation With Automated Formal Analysis (Kaushal Shah, Sulabh Kumar Khare – Siemens EDA)

This paper described a new technology for detecting glitches on clock and reset paths, which provides debug triage and reduces violation noise. It highlighted the possible causes of glitches in a design, with emphasis on the issues introduced by the synthesis optimization step.

The proposed solution was applied to two different designs, one with 10M gates and the other with 2M gates. Glitch reports included the source, converging point, destination nodes, propagation conditions and the glitch schematic.

The presentation was more about the problem and the challenges (simulation is not exhaustive, glitches may occur where there are no assertions, detection is noisy) than about the solution and the results.

SystemC, Virtual Platforms, System Modeling

SystemC Evolution Day (October 28, 2021) was a full-day, technical workshop on the evolution of SystemC standards to advance the SystemC ecosystem. Current and future standardization topics around SystemC were discussed in order to accelerate their progress for inclusion in Accellera/IEEE standards.

Accelerating Analog/Mixed-Signal Design And Verification Through Integrated Rapid Analysis (Tutorial) (Ganesh Rathinavel – MathWorks)

An interesting tutorial on how Simulink was used alongside Virtuoso to create behavioral models for the early stages of system-level verification.
It would have been interesting to see whether this setup can be used to create real-number models for analog modules, since these usually bring a lot of challenges when trading off accuracy against simulation runtime.

The charts that could be generated using Virtuoso provided a comprehensive overview of the entire design exploration phase. This is quite useful for debugging and improving the models, as well as optimizing some key steps of the modeling process.

AI/ML Accelerator Verification Tutorial: High-Level Verification of C-Level Design (Tutorial) (David Aerne, Jonathan Craft – Siemens EDA)

This tutorial made the case for using a high-abstraction language to implement AI solutions in hardware.

The accelerators implemented using SystemC can overcome the challenges fueled by conflicting requirements and inefficient resource allocation.

The presentation continued as a workshop, demonstrating the rapid implementation of high-abstraction models without departing from the typical verification techniques.

Using Hardware-Aware, Model-Based Software Development To Speed Up Embedded Designs (Tutorial) (Irina Costachescu & Razvan Chivu – NXP Semiconductors, Mauro Fusco & Cristian Macario – MathWorks)

This tutorial presented a toolbox dedicated to model-based design development on top of MATLAB & Simulink. It was an actual tutorial: specific applications were created from scratch using the Model-Based Design Toolbox (MBDT), and the presenters showed how they can be validated.

Deep Cycle HW/SW Verification Using High-Performance Prototyping Systems (Tutorial) (Andy Jolley – Synopsys)

A tutorial centered around ideas and growing technologies for debugging and validation of hardware through prototyping, focused on high performance FPGAs.

The presentation showed proprietary technology in use, but also touched on general notions and went in-depth on what new features are available when it comes to debugging and overall visibility inside the DUT, during different types of validation stages.

The main take-away from the presentation was the increasing need for event detection during prototyping and the way recent software advances allow for that to happen.

Functional safety

Collaborative, Advanced Fault Analysis: Addressing The Functional Safety Verification Challenges From The Accellera Functional Safety (Tutorial) (Sesha Sai Kumar – Optima)

The presentation outlined challenges with the recently introduced Accellera Functional Safety White Paper as well as suggestions on how to address them using today’s tools and methodologies.

The major challenge was choosing the right order when using the functional safety tools. In this sense, the author recommended developing and using a single platform that can be easily set up.

Another major issue was choosing an intuitive classification of the identified faults since this directly affects the quality of the verification process. For maintainability reasons, the fault classification spans across different analysis stages of the functional safety verification tasks.

Best Paper and Poster Awards

This year’s Best Paper Award went to:
Testbench flexibility as a foundation for success (Ana Sanz Carretero, Katherine Garden & Wei Wei Cheong – Xilinx)

The Best Poster Award went to:
An Analysis of Stimulus Techniques for Efficient Functional Coverage Closure (Caglayan Yalcin & Aileen McCabe – QT Technologies)

Virtual experience

Although it lacked the ambience of the live conference, Virtual DVCon EU 2021 provided a platform filled with opportunities to engage.

Registered attendees had the ability to post publicly viewable questions that the presenters could answer. Virtual engagements allowed for live discussions with exhibitors and other participants.

Another great feature of virtual conferences is that you can go back and watch the recordings of all the sessions at your own pace.

The organizers mentioned that the platform will be available through November 26, along with all the recordings and saved discussions.


The conference once again set the bar high in terms of technical program diversity and premium content.

This article is a collaborative effort. A big thank you to my colleagues for providing their insight into this conference.

Let me know your thoughts on the conference in the comments section below.

