Functional Verification

What is functional verification?

Functional verification is a step in the EDA process that checks whether a logic design conforms to its specification, that is, whether the design behaves as intended. Companies pour millions of dollars into this step because a missed bug can be a significant breaking point, and despite that investment, companies still encounter major hardware bugs that can end a platform. Functional verification is difficult because the space of possible behaviors is enormous; covering it exhaustively can mean trillions of test cases on sufficiently large designs. The challenge is therefore to build tools and methodologies that test a design quickly and efficiently without leaving anything important out. Significant subfields of functional verification, such as simulation, emulation, and formal verification, address the different angles from which a circuit must be checked.

Functional verification also builds on the Universal Verification Methodology (UVM) to create reusable verification structures. UVM and functional verification go hand in hand: UVM layers on top of the SystemVerilog code used to build verification testbenches and check that the design does what it is supposed to do. The main challenge with functional verification is identifying what to test, because testing everything is not possible.

Missing a single test can let a bug escape into the final design. Hardware bugs are notoriously expensive to fix, so it makes sense to catch them as early as possible in the integrated circuit design process.


Why is functional verification important?

Functional verification ensures you release a product that works the way you designed it, by checking your logic design as thoroughly as possible. As products become more complex, so do system design specs. Understanding how the various pieces of the puzzle fit together is critical, and coordinating a large team of people toward that goal can get cumbersome. Functional verification is valuable because it provides the frameworks and tools to systematically tackle the big question: does your design work the way you intended? A common framework lets a large team work on the design and verification parts of a project as seamlessly as possible, and the task can be split so that multiple people tackle individual steps. Because functional verification is a crucial step in the chip-building industry, your engineers must understand how to do it well.

As mentioned, missing a crucial functional verification test can cost a company billions of dollars. The core problem is that it is impossible to test every possible combination in complex designs, so a systematic method for deciding which tests to run is also necessary. These decisions require experience and insight, which is one of the many reasons companies invest significant resources in effective functional verification.

Functional verification is also important for performance optimization. You might verify a product only to realize it doesn't perform as well as you want it to; you can then continue the verification process to understand how to improve its performance. Lessons learned from functional verification feed back into the design process and forward into other verification steps later in the workflow.

How does functional verification work?

Simulating all possible test cases in functional verification is impossible. Therefore, companies verify an integrated circuit design by approaching it from several complementary angles, which gives the best possible chance of thorough functional verification. Some test cases are worth simulating directly, while others are better handled by methods that make them easier and more accessible for your engineers to complete.

SystemVerilog is the most common language used for functional verification. As discussed, many teams also use UVM on top of it: the two are complementary. SystemVerilog is the base language, and UVM layers on top to add reusable testbench structure and functionality that the verification engineer controls and extends.
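
As a rough illustration of that layering, here is a minimal sketch of a UVM test sitting on top of plain SystemVerilog. The interface and names used (simple_if, simple_test, tb_top) are hypothetical examples invented for this sketch, not part of any particular design or flow.

`include "uvm_macros.svh"
import uvm_pkg::*;

// Plain SystemVerilog: an interface connecting the testbench to the DUT
interface simple_if(input logic clk);
  logic [7:0] data;
  logic       valid;
endinterface

// UVM layer: a test component built on top of the SystemVerilog base
class simple_test extends uvm_test;
  `uvm_component_utils(simple_test)

  virtual simple_if vif;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // Fetch the interface handle that the top module placed in the config DB
    if (!uvm_config_db#(virtual simple_if)::get(this, "", "vif", vif))
      `uvm_fatal("NOVIF", "No virtual interface found in config DB")
  endfunction

  task run_phase(uvm_phase phase);
    phase.raise_objection(this);
    // Drive one hypothetical transaction into the DUT
    @(posedge vif.clk);
    vif.data  <= 8'hA5;
    vif.valid <= 1'b1;
    @(posedge vif.clk);
    vif.valid <= 1'b0;
    phase.drop_objection(this);
  endtask
endclass

// Plain SystemVerilog top module that starts the UVM test
module tb_top;
  logic clk = 0;
  always #5 clk = ~clk;

  simple_if sif(clk);

  initial begin
    uvm_config_db#(virtual simple_if)::set(null, "*", "vif", sif);
    run_test("simple_test");
  end
endmodule

In a real environment, the UVM layer would grow to include agents, sequences, and scoreboards, but the division of labor stays the same: SystemVerilog describes the connections and the raw stimulus, while UVM supplies the reusable structure around them.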

Here are the main methods used to verify a logic design:

Logic Simulation

One way of achieving functional verification is logic simulation. You can simulate the separate logical blocks that make up a functional unit before the complete unit is built, which lets you parallelize the work and cover the relevant cases for each block thoroughly. Logic simulation also helps you build a reliable code base of logical elements for future projects: most complex integrated circuits are made from the same basic logic elements, so it doesn't make sense to reinvent the wheel when you have a proven, reusable component.
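
As a rough illustration, here is a minimal sketch of block-level logic simulation using a hypothetical 4-bit adder as the logical element under test. Because the block is small, its input space can be covered exhaustively even though the full design's cannot.

// Hypothetical small logic block: a 4-bit adder
module adder4 (
  input  logic [3:0] a,
  input  logic [3:0] b,
  output logic [4:0] sum
);
  assign sum = a + b;
endmodule

// Self-checking testbench that simulates the block on its own
module tb_adder4;
  logic [3:0] a, b;
  logic [4:0] sum;

  adder4 dut (.a(a), .b(b), .sum(sum));

  initial begin
    // The block is small enough to cover all 256 input combinations
    for (int i = 0; i < 16; i++) begin
      for (int j = 0; j < 16; j++) begin
        a = i[3:0];
        b = j[3:0];
        #1;
        if (sum !== i + j)
          $error("Mismatch: %0d + %0d gave %0d", i, j, sum);
      end
    end
    $display("adder4 block-level simulation complete");
    $finish;
  end
endmodule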

Emulation

Emulation is where you build a working version of your logic design on an FPGA or other programmable logic device. This method is expensive, and it runs slower than the final silicon, but it is much faster than simulation, and you can use it to boot real software, for example, the operating system you plan to run on the physical hardware. The major benefit of emulation is that you create an accurate, running version of your logic design, so obvious bugs show up at this step of the process. Finding bugs here is much better than letting them make it into the final hardware.

Formal Verification

Formal verification uses mathematical methods to check the logic design. Rather than simulating individual cases, you prove that specific requirements are always met by the design. You can also use this method to prove that deadlocks cannot happen in your design. It is better to catch such problems here than at later stages in the design process.
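
As a rough illustration of the kind of requirement a formal tool can prove, here is a minimal sketch of SystemVerilog Assertions for a hypothetical request/grant handshake; the signal names (req, gnt, clk, rst_n) are illustrative assumptions, not taken from any particular design.

// Hypothetical request/grant handshake properties for a formal tool to prove
module handshake_props (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic gnt
);
  // Safety requirement: a grant never appears without a pending request
  assert property (@(posedge clk) disable iff (!rst_n)
                   gnt |-> req);

  // Deadlock-style check: every request is eventually granted
  assert property (@(posedge clk) disable iff (!rst_n)
                   req |-> s_eventually gnt);
endmodule

In practice, a property module like this is typically bound to the design, and the formal tool then attempts to prove that each assertion holds for every reachable state rather than only for the cases a simulation happens to exercise.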

Functional verification with Cadence

Cadence verification comprises core engines and applications that increase design quality and throughput, fulfilling verification requirements for a wide variety of applications and vertical segments.

Cadence Xcelium Logic Simulator provides best-in-class core engine performance for SystemVerilog, VHDL, SystemC®, e, UVM, mixed-signal, low power, and X-propagation. It leverages a set of domain-specific apps, including mixed-signal, machine learning-based test compression, and functional safety, which enable design teams to achieve verification closure early for IP and SoC designs.

Cadence emulation and prototyping systems provide comprehensive IP/SoC design verification, system validation, hardware and software regressions, and early software development. They comprise a dynamic duo of tightly integrated systems: Cadence Palladium Z2 Enterprise Emulation, optimized for rapid, predictable hardware debug, and Cadence Protium X2 Enterprise Prototyping, optimized for the highest-performance multi-billion-gate software validation.

The Cadence Jasper Formal Verification Platform consists of formal verification apps at the C/C++ and RTL level. They use smart proof technology and machine learning to find and fix bugs and improve verification productivity early in the design cycle.

Cadence Verisium Manager automates end-to-end management of complex verification projects from planning to closure. Verisium Manager tightly integrates with the Cadence Verisium Artificial Intelligence (AI)-Driven Verification Platform, leveraging big data and AI to reduce silicon bugs and accelerate time to market. It is built on the Cadence.AI Generative AI Platform, providing the best multi-engine, multi-run, multi-user, and multi-site verification management capabilities.