This section traces the evolution from a pure HDL-based verification methodology to a testbench automation system. Figure 1-1 shows the stages of this evolution in verification environments.
The following subsections explain these phases of evolution in greater detail, along with the challenges faced at each stage.
With the introduction of hardware description languages (HDLs), it became common to describe both the Device Under Test (DUT) and the test environment in VHDL or Verilog. In a typical HDL test environment:
The testbench consisted of HDL procedures that wrote data to the DUT or read data from it.
The tests called the testbench procedures in sequence, applying manually selected input stimuli to the DUT and checking the results. Each test was directed at specific features of the design.
Figure 1-2 shows the HDL task-based strategy.
This approach broke down for large, complex designs because:
The tests were tedious and time consuming to create.
The tests were difficult to read and maintain.
There were too many corner cases for the available labor.
The environment became difficult to create and maintain because it used little shared code. HDLs did not provide the robust features necessary to model complex environments.
To make the test environment more reusable and readable, some verification engineers began to write the tests and the test environment code in an object-oriented programming language like C++.
The test writing effort was reduced because object-oriented programming made it possible to model the input and output data of the DUT at a high level of abstraction. The engineer created abstract data models and let the environment derive the bit-level representations of those models.
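Those environments were written in C++, but the idea is easy to illustrate in e, the language this book covers. The sketch below assumes a hypothetical eth_frame struct: the engineer describes the frame in terms of meaningful fields, and the predefined pack() routine derives the bit-level form.

<'
// Hypothetical abstract data model: the frame is described in
// terms of meaningful fields, not bits.
struct eth_frame {
    dest_addr : uint(bits: 48);
    src_addr  : uint(bits: 48);
    payload   : list of byte;
};

extend sys {
    run() is also {
        var frame : eth_frame;
        gen frame;
        -- The bit-level representation is derived automatically
        -- by the predefined pack() routine.
        var bits : list of bit = pack(packing.high, frame);
    };
};
'>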
Although the amount of time spent creating individual tests was reduced, the time spent on other verification tasks increased. The test environment became more complex because new utilities, such as a simulator interface, were required. The time required to build the test environment was substantial, often outweighing the time saved in test creation, so the overall productivity gain was not sufficient to handle the increasing design complexity.
As object-oriented verification took root, verification engineers also recognized the need to reduce the effort required to create directed tests, which were tedious, time-consuming, and difficult to maintain. Therefore, they began to use random generators to automatically select input stimuli. By writing a single test and running it multiple times with different seeds, an engineer could, in effect, use the environment to create multiple tests.
However, fully random generation produced many illegal stimuli. To avoid generating illegal, uninteresting, or redundant stimuli, it was necessary to build a custom generator, and creating and maintaining that generator proved to be a difficult challenge.
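The constraint-based approach that later emerged, and that e provides, makes such a generator declarative rather than hand-built. A minimal sketch, assuming a hypothetical packet struct: the keep constraints encode the legality rules, so random generation produces only legal stimuli, and each seed yields a different test.

<'
// Hypothetical packet with declarative legality constraints.
struct packet {
    len     : uint;
    payload : list of byte;
    keep len in [1..64];          -- legal length range
    keep payload.size() == len;   -- payload must match the length field
};

extend sys {
    run() is also {
        var p : packet;
        gen p;    -- the result varies with the random seed
        print p;
    };
};
'>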
In addition, random generation introduced new requirements:
Checking the test results became more difficult: because the input stimuli could differ each time the test was run, explicit expected results could not be defined before the run.
Functional test coverage became a requirement: without analyzing the tests to see which input stimuli were applied, the engineer could not tell which tests verified which design requirements. (A coverage sketch follows this list.)
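In e, such coverage goals are declared directly on the data model. A minimal sketch, extending the hypothetical packet struct from the earlier example: whenever the (hypothetical) pkt_done event fires, the packet length is sampled into named ranges, so progress can be measured instead of inferred.

<'
// Hypothetical coverage definition: sample the packet length
// whenever a packet completes, bucketed into named ranges.
extend packet {
    event pkt_done;
    cover pkt_done is {
        item len using ranges = {
            range([1..8],   "short");
            range([9..32],  "medium");
            range([33..64], "long");
        };
    };
};
'>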
Thus, although the test writing effort in this environment was greatly reduced, the additional work to maintain the generator meant that the overall productivity gain was not sufficient to handle the increasing design complexity.
At this point, there was a strong motivation to reduce the amount of effort spent in creating the complex utilities in the test environment, such as the simulator interface and the custom generator. These utilities were typically difficult to maintain when design specifications changed during the verification process. Moving to a different simulator or a different version of the design also required significant rework in the verification environment.
Typically, a testbench tool reduced the effort required to build the test environment. However, testbench tools often lacked the sophisticated constraint-resolution and language-layering capabilities that allow a test writer to specify a test at an abstract level, without detailed knowledge of the structure of the test environment. Therefore, test writing, the most frequent activity during simulation, was still not as efficient as it could be.
In addition, testbench tools had no way to specify all kinds of temporal sequences and relationships. Therefore, checking requirements and complex coverage scenarios were very difficult to capture in the test environment.
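e addresses this with temporal expressions. A minimal sketch, assuming hypothetical signals top.req and top.ack and a simple handshake rule: every request must be acknowledged within three clock cycles. The rule is stated declaratively rather than coded as a procedural monitor.

<'
// Hypothetical protocol rule captured declaratively: every
// request must be acknowledged within three clock cycles.
unit protocol_checker {
    event clk is rise('top.clk') @sim;
    event req is rise('top.req') @clk;
    event ack is rise('top.ack') @clk;

    expect req_gets_ack is
        @req => {[..2]; @ack} @clk
        else dut_error("req not acknowledged within 3 cycles");
};
'>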
Since isolated testbench tools had these limitations, a complete verification automation system combining several enabling technologies was needed to produce a significant boost in productivity. The verification automation environment needed the following characteristics:
A language that allows objects in the verification environment to be extended for a particular test, rather than derived by inheritance; this extension capability enables a 90% reduction in test writing labor. (A sketch follows this list.)
A language for expressing constraints in a verification environment, because a constraint-based approach is more powerful for testbench description. A sophisticated constraint solver and generator are needed to resolve constraints between items in different objects.
A coverage engine that allows goals to be defined for complex test scenarios.
A temporal engine that lets the engineer capture protocol rules in a concise, declarative syntax.
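The first characteristic, extensibility, is what most distinguishes e from class-based languages. A minimal sketch: a test file layers a constraint onto the hypothetical packet struct from the earlier examples, without subclassing it and without touching the code that instantiates it.

<'
// Hypothetical test file: extend the existing packet in place,
// adding a test-specific constraint. No subclass is derived, and
// the environment that generates packets is unchanged.
extend packet {
    keep len in [60..64];   -- this test targets maximum-length packets
};
'>

With inheritance, the same test would require deriving a new subclass and modifying the environment to instantiate it; with extension, the test simply loads on top of the existing environment.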
A complete verification automation system increased the overall productivity of the verification effort by helping the engineer efficiently perform the following tasks:
Defining a test plan
Writing and maintaining the testbench environment
Selecting test vectors (input stimuli)
Checking results
Measuring progress against the test plan (coverage)
The following section describes how e can be used to build a complete verification automation system.