Are you new to functional verification? Or do you need a refresher? This presentation takes you through the basics of functional verification - overall scope and process with examples. Also included are some tips on do's and don'ts!
Contents
▪ Verification Philosophy
▪ Familiarity
▪ Scope of Verification
▪ Functional Verification as a Career
▪ Overview of Process
▪ Some Examples: The 3 Musketeers
▪ Do’s and Don’ts
▪ State of the Art
The Verification Philosophy
KNOW EVOLUTION to understand why
things are the way they are!
GET THE DNA to evolve further
GO BEYOND basics to get productive
Basic Lab Verification
How did you verify the ICs used in the Digital Design Lab?
- Checked against the truth table by applying various inputs and observing the outputs
- Downsides:
  - Time consuming
  - Error prone
  - Repeatability is not automatic
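The manual truth-table check above is exactly what a testbench automates. A minimal sketch in Python (the `and_gate` function is a stand-in for the DUT IC; in a real flow this would be an HDL simulation):

```python
def and_gate(a, b):
    """Stand-in for the DUT: a 2-input AND IC."""
    return a & b

def check_truth_table(dut, table):
    """Apply every input combination and compare against the expected output."""
    failures = []
    for (a, b), expected in table.items():
        got = dut(a, b)
        if got != expected:
            failures.append((a, b, expected, got))
    return failures

# Expected behavior of a 2-input AND gate
truth_table = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}

# Fast, repeatable, and not error prone - unlike manual probing in the lab
assert check_truth_table(and_gate, truth_table) == []
```

Running the same check script again reproduces the exact same result, which is the repeatability the manual approach lacks.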
Scope of Verification (1/2)
• Four major types of verification:
1. Functional
2. Timing
3. Test
4. Equivalence
• Functional verification ensures correct logical behavior under all conditions
• About 70% of the chip design cycle is spent on verification, and more than 70% of re-spins are due to functional bugs
• Functional verification is very time consuming and is therefore always on the critical path
Scope of Verification (2/2)
• Design technologies are evolving at a Moore's-Law-friendly pace
• Functional verification technologies are yet to catch up with the advances happening in the area of design
• Functional verification tools and methodologies attempt to reduce overall verification time by enabling parallelism of effort, higher abstraction levels, and automation
• There is no single formula for success
Hats Worn by a Verification Engineer
• Knows the design
• Contributes to architecture
• Gives feedback as the first user
• Is creative, logical, and lateral
• Programmer
• "Quality cop"
• Bridges specification, architecture & design
Process of Functional Verification
1. Verification plan
2. Test bench
3. Bring-up
4. Debug
5. Regressions
6. Sign-off
Verification Plan (1/2)
• Create varied stimuli and check whether the response is as expected
• Ensure the design is not doing anything unexpected
• Look at both functional specifications and micro-architecture specifications (black box & white box)
[Diagram: Stimuli → DUT → Response]
Verification Plan (2/2)
• Test plan: create the list of programmable parameters and features
1. Normal operation: basic & concurrent
2. Corner cases
3. Error cases
4. Negative cases
5. Design stress cases
• Functional coverage plan: capture the important features and parameters
• Strategy to execute the test plan
Test Bench - Architecture
Every test bench should have the following characteristics:
• Ability to exercise all features and parameters
• Ability to achieve the desired coverage
• Necessary abstraction
• Scalability
• Balance between ease of verification and realism
• Ease of understanding and maintenance
• Ease of debugging, for both developer and user
  • The developer is interested in the internal architecture
  • The user is interested only in the black-box view
• Easily controllable and observable
Test Bench - Components (1/3)
External interfaces of the DUT:
• Reset and clock generators
• Bus functional models (BFMs), drivers, and monitors
[Diagram: BFM/driver connected to the DUT ports]
Test Bench - Components (2/3)
Functionality:
• API or transactors
• Scoreboard
• Golden models
• End of test
• Assertions
• Functional coverage
[Diagram: scoreboard, register, and transfer layers on top of the BFM/driver and DUT ports]
Test Bench - Components (3/3)
Stimuli:
• Randomized parameters
• Constrained random generators
• Self-checking directed tests
[Diagram: CFG, GEN, and TEST layers on top of the scoreboard, BFM/driver, and DUT]
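The CFG/GEN/TEST layering above can be sketched in Python. All class and method names here are illustrative, not from any real framework; the point is how a configuration object, a constrained-random generator, and a driver stack on top of each other:

```python
import random

class Config:
    """CFG: randomized parameters, with knobs to pin them for directed tests."""
    def __init__(self, min_len=1, max_len=8, fixed_len=None):
        self.min_len, self.max_len, self.fixed_len = min_len, max_len, fixed_len

class Generator:
    """GEN: constrained-random stimulus; honors directed overrides from CFG."""
    def __init__(self, cfg):
        self.cfg = cfg
    def next_packet(self):
        # Directed override wins; otherwise pick a length within the constraint
        if self.cfg.fixed_len is not None:
            n = self.cfg.fixed_len
        else:
            n = random.randint(self.cfg.min_len, self.cfg.max_len)
        return [random.randint(0, 255) for _ in range(n)]

class Driver:
    """DRV/BFM: converts packets into pin-level activity (here just recorded)."""
    def __init__(self):
        self.sent = []
    def drive(self, pkt):
        self.sent.append(pkt)

# A random test just wires the layers together
cfg, drv = Config(), Driver()
gen = Generator(cfg)
for _ in range(5):
    drv.drive(gen.next_packet())
assert all(cfg.min_len <= len(p) <= cfg.max_len for p in drv.sent)
```

Setting `fixed_len` turns the same generator into a directed one, which is the "easily controllable" property the architecture slide asks for.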
Design and Test Bench Bring-Up
• Get the clocks and resets up
• No 'X' & 'Z' on critical inputs; tie them off if needed
• Get all critical interfaces of the DUT up
• Start enabling features in their basic form and introduce variables one at a time
Debug
• Done when things are not going as expected
• Takes up 30% of the functional verification activity
• The first step is to isolate the failure to either the DUT or the TB
• Isolation is done with the help of logging
  • Use regular-expression-friendly logging
  • Extract stimuli and response from the log
• If it cannot be isolated from the log, look at the waves
• The benchmark of good debug is being able to suggest the design fix
• The verification engineer owns a bug until it is fixed
Regressions
• Features intersect with each other
• Introducing new features can cause a domino effect of failures in previously passing ones
• Protect what has been working by running a set of critical tests before every update
• Periodically run all tests to maintain the health of the environment
• Run random tests with multiple seeds
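Seeding is what makes random regressions debuggable: the same seed reproduces the exact same stimulus. A minimal sketch (the `random_test` function is a toy stand-in for a seeded simulation run):

```python
import random

def random_test(seed):
    """A toy random test: the seed fully determines the generated stimulus."""
    rng = random.Random(seed)  # per-test RNG, so runs don't interfere
    return [rng.randint(0, 255) for _ in range(8)]

# A regression is the same test run under many seeds to explore the space
regression = {seed: random_test(seed) for seed in range(50)}

# When, say, seed 17 fails, rerunning with seed 17 reproduces the exact
# failing stimulus for debug
assert random_test(17) == regression[17]
```

This is why the sign-off criterion later talks about "seeded random regressions": each clean seed is a distinct, reproducible exploration of the stimulus space.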
Sign Off
• Verification never really completes
• Minimize risk using metrics:
  • 100% code coverage
  • 100% functional coverage
  • All directed tests passing
  • Random tests showing zero failures, or an acceptable failure rate, over a few weeks
  • No critical bugs found over the last few weeks
The 3 Musketeers
• Action begins…
• Let's demonstrate what we have learnt by attacking The Three Musketeers described below
• Each of these takes you through the complete flow and also focuses on clarifying one aspect covered in the previous slides
1. Counters – 2-bit and 256-bit
  • Verification strategy
2. FIFO verification
  • Test cases and assertions
3. UART verification
  • Test bench architecture
Counters: 2-Bit Counter Verification Strategy
Test plan
▪ Make sure that on reset assertion the count resets to 0
▪ The counter rolls over to 0 after reaching the maximum
▪ The counter can count through all counts
Verification strategy
▪ Do we need random?
▪ Directed tests suffice
Test bench
▪ Clock and reset generator
▪ 2 directed tests
Counters: 256-Bit Counter Verification Strategy (1/3)
Test plan:
▪ Make sure that on reset assertion the count resets to 0
▪ The counter rolls over to 0 after reaching the maximum
▪ The counter can count through all counts
Verification strategy:
▪ Do we need random?
▪ Sequentially counting through all 2 ** 256 ≈ 1.2e77 values would take practically forever
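The slide's estimate is easy to check with a quick back-of-the-envelope calculation in Python; even an absurdly optimistic simulation rate of one count per nanosecond leaves an astronomically long runtime:

```python
# Total distinct values of a 256-bit counter
total = 2 ** 256
assert total > 10 ** 77  # about 1.16e77

# Assume an (unrealistically fast) effective rate of 1 count per nanosecond
seconds = total / 1e9
years = seconds / (365 * 24 * 3600)
assert years > 1e60  # vastly longer than the age of the universe (~1.4e10 years)
```

This is exactly why exhaustive counting is off the table and random pre-load plus targeted directed tests are used instead.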
Counters: 256-Bit Counter Verification Strategy (2/3)
Bring up
▪ Start off with the pre-load count as 0
▪ Disable the intermediate reset generation
▪ Make sure the counter can count up to a random count after the initial reset
Enabling random
▪ Enable the random pre-load count
▪ Enable intermediate reset generation
▪ Complete the directed negative test
Closure
▪ Code and functional coverage defined above are met
▪ 50 seeded random regressions are clean over a couple of weeks
Counters: 256-Bit Counter Verification Strategy (3/3)
• Start from a random value and count up to another random value
• Divide the count range into 4 and make sure each range covers some counts
• All the bits should toggle
• Count from max to 0 (rollover)
• Reset during a random count
• Bias the randomization toward interesting pre-load values
• How do you handle the condition of reset and enable asserted together?
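The "divide the count range into 4" item is a functional coverage point. A Python sketch of such a range-bin coverage collector (class and method names are illustrative; in SystemVerilog this would be a covergroup with bins):

```python
import random

NBITS = 256
MAXVAL = 2 ** NBITS

class RangeCoverage:
    """Functional coverage: divide the count range into bins and record hits."""
    def __init__(self, nbins=4):
        self.nbins = nbins
        self.hits = [0] * nbins
    def sample(self, value):
        # Map a 256-bit value to its quarter of the range
        self.hits[value * self.nbins // MAXVAL] += 1
    def all_covered(self):
        return all(h > 0 for h in self.hits)

cov = RangeCoverage()
rng = random.Random(7)  # seeded so the run is reproducible
for _ in range(100):
    # Random pre-load values exercise all four quarters of the range
    cov.sample(rng.randrange(MAXVAL))
assert cov.all_covered()
```

If a bin stays empty after a regression, that is the signal to bias the pre-load randomization toward the uncovered range.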
Counters: 256-Bit Counter Test Bench
• Identify variables that need randomization
  – Random pre-load count
  – Random count-up value to end the test
  – Control to enable/disable intermediate reset generation
  – Random count at which reset needs to be generated
  ▫ All random stimulus should have a control to make it directed if needed
• Clock generator
• Counter model with load and reset control - this will act as a scoreboard
• The model's output will be used for checking - comparing at every clock ensures that bugs are caught immediately
Counters: 256-Bit Counter DUT vs Model

  count <= count + 1

• Models are behavioral code
• Simpler to write
• Smaller than the DUT
• Quicker to write
• Less code means fewer bugs!
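A behavioral golden model of the counter can be sketched in Python (standing in for the behavioral HDL model the slide describes; in the testbench, `model.count` would be compared against the DUT count on every clock):

```python
MAXVAL = 2 ** 256  # counter width: 256 bits

class CounterModel:
    """Behavioral golden model: reset, pre-load, and count with wrap-around."""
    def __init__(self):
        self.count = 0
    def reset(self):
        self.count = 0
    def load(self, value):
        self.count = value % MAXVAL
    def tick(self):
        # One clock of counting: count <= count + 1, wrapping at 2**256
        self.count = (self.count + 1) % MAXVAL

m = CounterModel()
m.load(MAXVAL - 1)
m.tick()
assert m.count == 0  # rolls over to 0 after max
m.tick()
assert m.count == 1
m.reset()
assert m.count == 0
```

The model is a handful of lines against a full RTL implementation, which is exactly the "less code means fewer bugs" argument.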
Synchronous FIFO - Test Plan
Normal operation
▪ The order of data pushed should match the POP order
▪ After N PUSHes and 0 POPs, FULL assertion
▪ When the number of PUSHes equals the number of POPs, EMPTY assertion
▪ Stress: PUSH and POP at the same time
▪ Reset intermediately
Negative case
▪ PUSH on FULL and POP on EMPTY
Implementation specific
▪ Internal counter alignments for FULL/EMPTY generation
Always-true conditions [assertions]
▪ FULL and EMPTY should never be asserted together
▪ Number of clocks between the assertion of write on the input and the de-assertion of EMPTY on the output - this is also indicative of the FIFO's performance
Synchronous FIFO – Verification Strategy
Normal FIFO operation should be covered with the random approach
▪ Random delay on the POP side will lead to FULL conditions on the PUSH side
▪ Random delays on the PUSH and POP sides together will exercise the FULL/EMPTY checks at various read and write counter alignments
▪ The data pushed should be randomized - cover all bits toggling
Negative and intermediate-reset cases can be covered as directed cases
▪ PUSH when full and POP when empty
▪ Reset during the various combinations of full/empty
Synchronous FIFO – Test Bench (1/4)
[Diagram: RND TESTS and DIR TESTS drive GEN PUSH and GEN POP, which use DRV PUSH and DRV POP on the DUT interfaces; a SCOREBOARD observes both sides]
Synchronous FIFO – Test Bench (2/4)
DRV PUSH
▪ Handles the push-side interface of the FIFO
▪ push_data(data)/is_full()
DRV POP
▪ Handles the pop-side interface of the FIFO
▪ pop_data(data)/is_empty()
GEN PUSH
▪ Uses the PUSH DRV and generates the specified number of PUSHes
▪ Injects random delay between the PUSHes
GEN POP
▪ Uses the POP DRV and generates the specified number of POPs
▪ Injects random delay between the POPs
Synchronous FIFO – Test Bench (3/4)
Scoreboard
▪ Model of the FIFO, implemented in behavioral code
▪ Connected to the test bench environment using callbacks
▪ Drivers notify the scoreboard when a PUSH or POP happens
▪ On PUSH, the scoreboard stores the data pushed - this is the "golden data"
▪ On POP, the scoreboard compares the data popped from the design with the data stored internally and flags any mismatches - this verifies both ordering and data integrity
▪ The scoreboard should have the capability to be disabled - this is used during some of the directed tests for negative cases and resets
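The scoreboard just described can be sketched in Python (names like `on_push`/`on_pop` are illustrative stand-ins for the driver callbacks; a behavioral queue is the golden model):

```python
from collections import deque

class FifoScoreboard:
    """Golden FIFO model driven by push/pop callbacks from the drivers."""
    def __init__(self):
        self.expected = deque()   # golden data, in push order
        self.enabled = True       # directed negative/reset tests disable checks
        self.mismatches = 0
    def on_push(self, data):
        if self.enabled:
            self.expected.append(data)
    def on_pop(self, data):
        if not self.enabled:
            return
        golden = self.expected.popleft()
        if data != golden:        # checks both ordering and data integrity
            self.mismatches += 1

sb = FifoScoreboard()
for d in (0xA, 0xB, 0xC):
    sb.on_push(d)                # driver reports each PUSH
for d in (0xA, 0xB, 0xC):
    sb.on_pop(d)                 # DUT pops in order, so no mismatches
assert sb.mismatches == 0 and not sb.expected
```

A reordering or data-corruption bug in the DUT shows up immediately as a nonzero `mismatches` count, and setting `enabled = False` is the disable hook used by the negative and reset directed tests.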
Synchronous FIFO – Test Bench (4/4)
Random tests
▪ The simplest level just sets up the counts of FIFO operations
▪ Set the delays between PUSH/POP
▪ Run with and without random resets enabled
Directed tests
▪ Cover negative cases directly using the APIs of DRV PUSH and DRV POP
End of test
▪ Random tests check that the specified count completed
▪ Make sure that at the end the FIFO is in its default state
Synchronous FIFO – Bring Up
• Confirm clocks and resets are running and there is no 'X' on any input
• Start off with a simple directed test that disables the scoreboard and does a couple of pushes and pops
• Then enable the scoreboard in the directed test and get the basic case fully working
Synchronous FIFO – Closure
Full feature enabling
▪ Start off the normal-operation random test with smaller counts of operations
▪ Gradually increase the count of operations to a reasonable number
▪ Do multiple seeded regressions of the random-operation command line
▪ Write the various negative-condition and corner-case directed tests
Closure
▪ Code and functional coverage met
▪ 50 seeded random regressions are clean over a couple of weeks
UART Test Bench (1/2)
[Diagram: TESTS use the REG API, INTR API, and BFM API; TX GEN and RX GEN generate traffic; the HOST BUS BFM and UART BFM connect to the DUT, with a SCOREBOARD comparing both directions]
UART Test Bench (2/2)
UART BFM
▪ Behavioral model of a UART, modeling only the bus operation and the Tx and Rx pins
▪ Does not need to implement all the internal registers, but does need to provide the necessary controls
▪ Does not need to implement the interrupts
▪ set_configuration()/tx_bfm_data()/rx_bfm_data()
▪ Provides support for Tx error injection
▪ Does framing checks on Rx
UART Functional Layer (1/2)
REG API
▪ All the register-access details are implemented here
▪ DUT initialization is done here
▪ Transmitting/receiving data through the UART involves a series of steps; that sequence is implemented here
Interrupt API
▪ Implements the servicing of the UART interrupts
▪ Can be for popping received data or for handling error conditions
UART Functional Layer (2/2)
BFM API
▪ Sets up the BFM configuration to match the DUT configuration
▪ Pops the data received from the BFM
▪ Provides hooks for the TB to inject errors
TX GEN/RX GEN
▪ Generates the specified number of data packets
▪ Injects random delay between the packets
UART Scoreboard
• Extracts the data the HOST BFM sends for DUT transmission and compares it, in order, with the data received by the UART BFM on its Rx
• Extracts the data the UART BFM transmits on its Tx and compares it, in order, with the data received by the HOST BFM on the DUT Rx path
Do’s and Don’ts
• Focus on optimum verification to achieve quality
• The design evolves - make sure the verification plan evolves along with it
• Get plugged into the various sources of information to track changes
• Use emails or, better, bug reports
• Document and archive important discussions
• Avoid bypasses and shortcuts; if you really need to take one, always comment/document it clearly
State of the Art – SystemVerilog
• The constrained-random approach has been found to be very effective in functional verification
• SystemVerilog has OOP support to enhance reusability, plus special constructs for randomization and coverage
State of the Art – Verification Methodologies
• Common good practices of constrained-random verification environment development have been standardized as methodologies
• There are multiple such industry-standard verification methodologies, such as VMM, OVM, and UVM
• These methodologies also provide base class libraries and utilities that are very useful across the entire spectrum of verification activities
End of Document
Visit us on:
http://www.arrowdevices.com