Verification


In electronic design automation, functional verification is the task of verifying that the logic design conforms to its specification. The purpose of verification is to ensure that the result of some transformation is as intended or as expected. RTL coding from a specification, insertion of a scan chain, synthesis of RTL code into a gate-level netlist, and layout of a gate-level netlist are some of the transformations performed in a hardware design project. Designers deploy a verification environment using the C/C++ and SystemC languages, follow a clearly predefined verification strategy, and simulate a complex SoC together with its embedded software.

Verification strategies
***********************

Testcases can be either white-box or black-box, depending on the visibility and knowledge you have of the internal implementation of each unit under verification.

With higher levels of abstraction, you have less control over the timing and coordination of stimulus and response, but it is easier to generate a large amount of stimulus and observe the response over a long period of time. If detailed control is required to perform certain testcases, it may be necessary to work at a lower level of abstraction. For example, verifying a processor interface can be accomplished at the level of individual read and write cycles, but that requires each testcase to have intimate knowledge of the memory-mapped registers and how to program them. The same interface could instead be driven at the device-driver level, where the testcase has access to a set of high-level procedural calls that perform complete operations.

You must plan how you will determine the expected response, and then how to verify that the design provided the response you expected. For example, verifying a graphics engine involves checking the output picture for expected content. A self-checking simulation is very good at verifying individual pixels in the picture, but a human is more efficient at recognizing a filled red circle.

BFM: The Bus Functional Model (BFM) for a device interacts with the DUT by both driving and sampling the DUT signals. A bus functional model provides a task or procedural interface for specifying bus operations of a defined bus protocol. For a memory DUT, transactions usually take the form of read and write operations. Bus functional models are easy to use and provide good performance. A BFM has to follow the timing protocol of the DUT interface: it describes the functionality and provides a cycle-accurate interface to the DUT, modeling the external behavior of the device. For reusability, the implementation of the BFM functionality should be kept as independent of the communication to the BFM as possible.

A protocol monitor does not drive any signals; it observes the DUT outputs, identifies all the transactions, and reports any protocol violations. Again taking a packet protocol as an example, the monitor extracts information from each packet, such as its length and address.
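As a concrete illustration of the BFM's procedural interface described above, here is a minimal C++ sketch. The Bus and MemoryBfm names and the single-cycle handshake are assumptions made up for illustration; they do not correspond to any particular bus standard.

#include <cstdint>

// Toy DUT interface, stood in for by a plain struct. In a real
// environment the BFM would drive and sample simulator signals.
struct Bus {
    bool     valid = false;
    bool     write = false;
    uint32_t addr  = 0;
    uint32_t wdata = 0;
    uint32_t rdata = 0;
};

// Bus functional model: callers say WHAT transaction to perform;
// the BFM handles HOW (the cycle-by-cycle signal protocol).
class MemoryBfm {
public:
    explicit MemoryBfm(Bus& bus) : bus_(bus) {}

    void write(uint32_t addr, uint32_t data) {
        bus_.valid = true;            // drive the request
        bus_.write = true;
        bus_.addr  = addr;
        bus_.wdata = data;
        step();                       // one "cycle" of the toy protocol
        bus_.valid = false;           // release the bus
    }

    uint32_t read(uint32_t addr) {
        bus_.valid = true;
        bus_.write = false;
        bus_.addr  = addr;
        step();
        bus_.valid = false;
        return bus_.rdata;            // sample the response
    }

private:
    void step() { /* in a real BFM: wait for a clock edge */ }
    Bus& bus_;
};

A testcase then calls bfm.write(0x10, 0xCAFE) with no knowledge of the signal timing; keeping the transaction interface separate from the signal driving is what makes the BFM and its testcases reusable.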
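A monitor, by contrast, only samples. The sketch below assumes a made-up packet format (one length byte followed by a four-byte address) purely to show the shape of a passive checker.

#include <cstdint>
#include <cstdio>
#include <vector>

// Passive protocol monitor: it samples DUT outputs, reconstructs
// transactions, and reports violations. It never drives a signal.
class PacketMonitor {
public:
    // Assumed toy format: byte 0 = total length, bytes 1..4 = address.
    void observe(const std::vector<uint8_t>& pkt) {
        if (pkt.size() < 5) {
            std::printf("VIOLATION: packet shorter than its header\n");
            return;
        }
        uint8_t  length = pkt[0];
        uint32_t addr   = pkt[1] | (pkt[2] << 8) | (pkt[3] << 16) |
                          (uint32_t(pkt[4]) << 24);
        if (static_cast<size_t>(length) != pkt.size())
            std::printf("VIOLATION: length field %u != actual size %zu\n",
                        unsigned(length), pkt.size());
        std::printf("packet: len=%u addr=0x%08x\n",
                    unsigned(length), unsigned(addr));
    }
};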
The following types of test bench are the most common:

Stimulus only: contains only the stimulus driver and DUT; does not contain any results verification.
Full test bench: contains the stimulus driver, known good results, and results comparison.
Simulator specific: the test bench is written in a simulator-specific format.
Hybrid test bench: combines techniques from more than one test bench style.
Fast test bench: a test bench written to get the ultimate speed from simulation.

In its most common use, equivalence checking compares two netlists to ensure that some netlist post-processing, such as scan-chain insertion, clock-tree synthesis or manual modification, did not change the functionality of the circuit. Another popular use of equivalence checking is to verify that the netlist correctly implements the original RTL code. (A brute-force sketch of the underlying idea appears at the end of this section.)

Property checking can, for example, verify that all state machines in a design are free of unreachable or isolated states. A more powerful property checker may be able to determine whether deadlock conditions can occur.

Functional verification can show the presence of bugs, but it cannot prove their absence.

Black box: with a black-box approach, functional verification is performed without any knowledge of the actual implementation of the design. All verification is accomplished through the available interfaces, without direct access to the internal state of the design and without knowledge of its structure or implementation.

White box: a white-box approach has full visibility into, and controllability of, the internal structure and implementation of the design being verified.

Grey box: grey-box verification is a compromise between the aloofness of black-box verification and the dependence on the implementation of white-box verification.

Verification approaches
***********************

Top-down testing is an approach to integration testing where the top-level integrated modules are tested first, and each branch of the module is then tested step by step until the end of the related module is reached. If coverage is missing, it usually indicates either unused code or incomplete tests. Coverage types: branch, statement, path, expression, and toggle coverage.

Bottom-up testing is an approach to integration testing where the lowest-level components are tested first and then used to facilitate the testing of higher-level components. The process is repeated until the component at the top of the hierarchy has been tested.

A co-simulation environment employs an ISS (instruction set simulator) that models the processor and a SystemC/HDL simulator based on an FPGA. A co-verification environment employs an FPGA-based hardware emulator and an actual embedded processor; it is used to verify the whole SoC integration before the fabrication of the target SoC.

A co-simulation environment mainly consists of two parts: an ISS and a hardware simulator. The ISS executes the software design of the target SoC, while the hardware simulator implements the hardware part using SystemC and HDL. (A simplified co-simulation main loop is also sketched at the end of this section.)

We employ a co-verification framework to validate the whole SoC system integrated in a single hardware platform, which contains an embedded processor core and a fast hardware emulator employing a hardware accelerator to reduce verification time. Co-verification has some advantages compared with verifying hardware and software separately: the key concept behind co-verification is to merge the respective debug environments used by the hardware and software teams into a single framework, giving designers early access to both the hardware and software components of the design.
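Returning to equivalence checking, the brute-force sketch promised above: for a tiny combinational block, treat both implementations as Boolean functions, compare them on every input, and report any mismatch as a counterexample. Real equivalence checkers do this symbolically (with BDDs or SAT) rather than by enumeration; the two adder variants here are made-up examples.

#include <cstdint>
#include <cstdio>

// "Golden" RTL-level adder and a restructured "netlist" version.
uint8_t rtl(uint8_t a, uint8_t b) { return uint8_t(a + b); }
uint8_t netlist(uint8_t a, uint8_t b) {
    // Same function, different structure: a + b == (a ^ b) + 2*(a & b).
    return uint8_t((a ^ b) + 2 * (a & b));
}

int main() {
    // Miter check: compare the two implementations on every input.
    // Any mismatch is a counterexample; none means they are equivalent.
    for (int a = 0; a < 256; ++a)
        for (int b = 0; b < 256; ++b)
            if (rtl(uint8_t(a), uint8_t(b)) != netlist(uint8_t(a), uint8_t(b))) {
                std::printf("NOT equivalent: a=%d b=%d\n", a, b);
                return 1;
            }
    std::printf("equivalent\n");
    return 0;
}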
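And the simplified co-simulation loop: the ISS executes one instruction at a time, any access that falls in hardware address space is forwarded to the hardware model, and hardware time is advanced in lockstep. Iss, HwSim, the address map, and the cycle counts are all assumptions made up for this sketch.

#include <cstdint>
#include <cstdio>

// Hardware side: stands in for the SystemC/HDL simulator.
struct HwSim {
    uint32_t reg = 0;
    void     write(uint32_t /*addr*/, uint32_t v) { reg = v; }
    uint32_t read(uint32_t /*addr*/) const { return reg; }
    void     advance(int /*cycles*/) { /* run HDL time forward */ }
};

// Software side: stands in for the instruction set simulator (ISS).
struct Iss {
    int  pc = 0;
    bool done() const { return pc >= 3; }

    // Executes one instruction; returns true if it touched hardware
    // address space (here, a single register at 0x40000000).
    bool step(HwSim& hw) {
        switch (pc++) {
            case 0:  hw.write(0x40000000u, 0xCAFE); return true;
            case 1:  std::printf("read back 0x%x\n",
                                 unsigned(hw.read(0x40000000u)));
                     return true;
            default: return false;   // pure CPU-side instruction
        }
    }
};

int main() {
    HwSim hw;
    Iss   cpu;
    while (!cpu.done()) {
        bool touched_hw = cpu.step(hw);   // one ISS instruction
        hw.advance(touched_hw ? 4 : 1);   // keep hardware time in lockstep
    }
}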