
1.1 The Evolution of Verification

This section traces the evolution from a pure HDL-based verification methodology to a complete verification automation system. Figure 1-1 shows the stages of this evolution in verification environments.

Figure 1-1. Evolution of Verification


The following subsections explain these phases of evolution in greater detail and describe the challenges faced at each stage.

1.1.1 HDL-Based Verification

With the introduction of hardware description languages (HDLs), it became common to describe both the Device Under Test (DUT) and the test environment in VHDL or Verilog. In a typical HDL test environment:

Figure 1-2 shows the HDL task-based strategy.

Figure 1-2. HDL Task-Based Strategy


This approach broke down for large, complex designs because:

1.1.2 Object-Oriented Verification

To make the test environment more reusable and readable, some verification engineers began to write the tests and the test environment code in an object-oriented programming language like C++.

The test writing effort was reduced because object-oriented programming facilitated modeling the DUT's input and output data at a high level of abstraction. The engineer created abstract data models and let the environment create the bit-level representations of those models.
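
As an illustration (not taken from the original text), the following C++ sketch shows what such an abstract data model might have looked like: a hypothetical Packet struct described in terms of fields, with a pack() routine that derives the bit-level representation the environment actually drives into the DUT. All names and field layouts here are invented for illustration.

    #include <cstdint>
    #include <vector>

    // Hypothetical abstract data model for a DUT input packet.
    // The test writer thinks in terms of fields, not bits.
    struct Packet {
        uint8_t  kind;                 // e.g., 0 = data, 1 = control
        uint16_t length;               // payload length in bytes
        std::vector<uint8_t> payload;

        // The environment, not the test writer, turns the abstract
        // model into the byte/bit stream applied to the DUT pins.
        std::vector<uint8_t> pack() const {
            std::vector<uint8_t> bits;
            bits.push_back(kind);
            bits.push_back(static_cast<uint8_t>(length >> 8));
            bits.push_back(static_cast<uint8_t>(length & 0xFF));
            bits.insert(bits.end(), payload.begin(), payload.end());
            return bits;
        }
    };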

Although the amount of time spent creating individual tests was reduced, the time spent on other verification tasks increased. The test environment became more complex because new utilities, such as a simulator interface, were required. The time required to build the test environment was substantial, often overwhelming the time saved in test creation, so the overall productivity gain was not sufficient to handle the increasing design complexity.

1.1.3 Random Generation

As object-oriented verification took root, verification engineers also realized the need to reduce the effort required to create directed tests, which were tedious, time-consuming, and difficult to maintain. They therefore began to use random generators to automatically select input stimuli. By writing a single test and running it multiple times with different seeds, an engineer could, in effect, use the environment to create multiple tests.
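
A minimal sketch of this idea follows, assuming a hypothetical drive_packet() routine provided by the environment: one test body, parameterized only by a seed, so each new seed effectively yields a new test.

    #include <cstdint>
    #include <cstdio>
    #include <random>

    // Placeholder for the environment routine that applies one stimulus
    // item to the DUT (hypothetical; a real environment would drive pins).
    void drive_packet(uint8_t kind, uint16_t length) {
        std::printf("packet kind=%u length=%u\n",
                    static_cast<unsigned>(kind),
                    static_cast<unsigned>(length));
    }

    // One "test": the same code run with different seeds produces
    // different stimulus sequences -- in effect, many tests for the
    // price of one.
    void run_random_test(uint32_t seed, int num_items) {
        std::mt19937 rng(seed);
        std::uniform_int_distribution<int> kind_dist(0, 3);
        std::uniform_int_distribution<int> len_dist(0, 2047);

        for (int i = 0; i < num_items; ++i) {
            drive_packet(static_cast<uint8_t>(kind_dist(rng)),
                         static_cast<uint16_t>(len_dist(rng)));
        }
    }

    int main() {
        run_random_test(1, 5);   // in practice the seed comes from the run command
        run_random_test(2, 5);   // same test, different seed, different stimuli
        return 0;
    }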

However, fully random generation produced many illegal stimuli. To avoid uninteresting or redundant tests, it was necessary to build a custom generator, and creating and maintaining that generator proved to be a difficult challenge.
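
To make the difficulty concrete, here is a hedged sketch of what such a custom generator tends to look like: the DUT's legality rules are hard-coded into the generation loop, so every specification change means editing the generator. The rules shown are invented purely for illustration.

    #include <cstdint>
    #include <random>

    struct Stimulus {
        uint8_t  kind;
        uint16_t length;
    };

    // Hand-written custom generator: the legality rules of the DUT are
    // hard-coded here and must be updated whenever the spec changes.
    class CustomGenerator {
    public:
        explicit CustomGenerator(uint32_t seed) : rng_(seed) {}

        Stimulus next() {
            Stimulus s;
            do {
                s.kind   = static_cast<uint8_t>(rng_() % 4);
                s.length = static_cast<uint16_t>(rng_() % 4096);
            } while (!is_legal(s));   // re-draw until the stimulus is legal
            return s;
        }

    private:
        // Invented legality rules, for illustration only.
        static bool is_legal(const Stimulus& s) {
            if (s.kind == 3) return false;                     // kind 3 is reserved
            if (s.kind == 1 && s.length > 256) return false;   // control packets are short
            return s.length >= 4;                              // minimum frame size
        }

        std::mt19937 rng_;
    };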

In addition, random generation introduced new requirements:

Thus, although the test writing effort in this environment was greatly reduced, the additional work to maintain the generator meant that the overall productivity gain was not sufficient to handle the increasing design complexity.

1.1.4 Testbench Tool

At this point, there was strong motivation to reduce the effort spent creating the complex utilities in the test environment, such as the simulator interface and the custom generator. These utilities were typically difficult to maintain when design specifications changed during the verification process. Moving to a different simulator or a different version of the design also required significant rework in the verification environment.

Typically, a testbench tool would reduce the effort required to build the test environment. However, testbench tools often lacked the sophisticated constraint-resolution and language-layering capabilities that would allow the test writer to specify a test at an abstract level, without detailed knowledge of the structure of the test environment. Therefore, the test writing effort (the most frequent activity during simulation) was still not as efficient as it could be.
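
To suggest what "language layering" means in practice, here is a rough C++ approximation: the environment exposes a small hook that the test writer overrides to bound generation, without touching the rest of the environment. A declarative constraint language does this far more concisely; the class and method names below are invented.

    #include <cstdint>
    #include <random>

    // Environment-provided base generator (names are illustrative).
    class PacketGenerator {
    public:
        explicit PacketGenerator(uint32_t seed) : rng_(seed) {}
        virtual ~PacketGenerator() = default;

        uint16_t next_length() {
            std::uniform_int_distribution<int> dist(min_length(), max_length());
            return static_cast<uint16_t>(dist(rng_));
        }

    protected:
        // Hooks the test writer may override -- a crude, procedural
        // stand-in for what a declarative constraint layer provides.
        virtual int min_length() const { return 4; }
        virtual int max_length() const { return 2047; }

    private:
        std::mt19937 rng_;
    };

    // The "test" is a thin layer over the environment: it narrows the
    // legal range without knowing how generation or driving works.
    class ShortPacketTest : public PacketGenerator {
    public:
        using PacketGenerator::PacketGenerator;
    protected:
        int max_length() const override { return 64; }
    };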

In addition, testbench tools did not provide a way to specify arbitrary temporal sequences and relationships. Therefore, checking requirements and complex coverage scenarios were very difficult to capture in the test environment.
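
As a rough indication of what had to be coded by hand, the sketch below checks one simple temporal relationship: a request must be followed by a grant within a fixed number of cycles. Without language support for temporal expressions, each such rule is written out procedurally; the per-cycle sampling interface shown is hypothetical.

    #include <cstdio>

    // Hand-coded temporal check: every request must be followed by a
    // grant within max_latency cycles.
    class ReqGntChecker {
    public:
        explicit ReqGntChecker(int max_latency) : max_latency_(max_latency) {}

        // Called once per clock cycle with the sampled signal values
        // (a hypothetical interface to the simulation).
        void on_clock(bool req, bool gnt) {
            ++cycle_;
            if (waiting_ && gnt) {
                waiting_ = false;                           // request satisfied
            } else if (waiting_ && cycle_ - req_cycle_ > max_latency_) {
                std::printf("ERROR: no grant within %d cycles of the request "
                            "at cycle %ld\n", max_latency_, req_cycle_);
                waiting_ = false;
            }
            if (req && !waiting_) {                         // start tracking a new request
                waiting_   = true;
                req_cycle_ = cycle_;
            }
        }

    private:
        int  max_latency_;
        long cycle_     = 0;
        long req_cycle_ = 0;
        bool waiting_   = false;
    };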

1.1.5 Complete Verification Automation System

Because isolated testbench tools had these limitations, a complete verification automation system combining several enabling technologies was needed to produce a significant boost in productivity. The verification automation environment needed to have the following characteristics:

A complete verification automation system increased the overall productivity of a verification environment by helping the engineer efficiently perform the following tasks:

The following section describes how e can be used to build a complete verification automation system.
