3. • Company introduction
• Testing & verification
• Black box vs. White box approach
• Unit tests
• Functional models
• Automated testing (Nightly builds)
• Performance tests
Overview
4. • European vendor of FPGA-based network solutions
• Formerly the FPGA department of Invea-Tech
• First to introduce a 100GE NIC (PCI-E form factor)
Vast experience with FPGA technology since 2002
Xilinx Alliance program partner
PCI-SIG® member company
• Primary focus
Low-latency electronic trading
High-speed packet capture
Smart traffic filtering
FPGA firmware development kit
Company introduction
5. • FPGA-based low-latency trading solution
Pre-built blocks for communication with an exchange
FIX/FAST and binary market data protocols
Full order book, aggregated level book
Order session management (FIX, ArcaDirect, OUCH, …)
User trading strategy in C/C++ (optional)
Latency-optimized API
Tradecope
6. • Requirements for HW solutions
Reliability
Time-to-market
• Design vs. verification (testing) effort: roughly 30% vs. 70%
• Black box approach
Setting inputs, checking outputs
Errors must propagate to the outputs before they can be observed
• White box approach
Mechanisms for checking internal state
Automatic check and reporting of defined properties
Testing of HW systems
7. • The simplest white-box approach in HW design
• Statements that must always hold
“Things that won’t ever happen”:
No reading from an empty FIFO
No multiple 1 bits in a one-hot encoding
• Asserts fire during simulation
• Helps localize the bug (see the sketch below)
• Does not affect hardware performance
Asserts are checked only in simulation
In HW – interrupts (report & log the problem)
Assertions
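The slide shows no code; as a minimal sketch of the same idea (the Fifo class and check_one_hot helper below are hypothetical, and in RTL these invariants would typically be written as SystemVerilog assertions rather than C++), the C++ functional model can guard the same “things that won’t ever happen” with plain asserts that compile away in release builds:

    #include <cassert>
    #include <cstdint>
    #include <deque>

    // Hypothetical FIFO in the C++ functional model. The invariants mirror
    // RTL assertions: they must always hold, and a violation points
    // directly at the offending unit.
    class Fifo {
    public:
        void push(uint64_t word) { data_.push_back(word); }

        uint64_t pop() {
            // Reading from an empty FIFO must never happen.
            assert(!data_.empty() && "read from empty FIFO");
            uint64_t word = data_.front();
            data_.pop_front();
            return word;
        }

    private:
        std::deque<uint64_t> data_;
    };

    // A one-hot value must have exactly one bit set.
    inline void check_one_hot(uint32_t v) {
        assert(v != 0 && (v & (v - 1)) == 0 && "value is not one-hot");
    }

Like simulation-only RTL asserts, these checks disappear in NDEBUG builds, so they cost nothing in production.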
8. • Testbenches for each unit
• UVM/OVM methodology
• SystemVerilog verification
• Pseudo-random generation of inputs (see the sketch below)
• Checking of outputs
Functional model in SystemVerilog
Object-oriented programming, high-level description
• Simulation can run for days
Basic test has 10,000 transactions
• Monitoring of coverage
Did we cover all input combinations?
Unit tests
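The real testbenches are UVM/SystemVerilog; purely to illustrate the loop structure in one self-contained snippet (the Transaction struct and the dut/model stand-ins below are invented), the flow is: drive pseudo-random transactions, compare the DUT against the functional model, and keep coverage bookkeeping:

    #include <cstdint>
    #include <cstdio>
    #include <random>
    #include <set>

    struct Transaction { uint8_t opcode; uint16_t length; };

    // Stand-ins for the device under test and the reference model.
    uint32_t dut(const Transaction& t)   { return t.opcode * 65536u + t.length; }
    uint32_t model(const Transaction& t) { return (uint32_t(t.opcode) << 16) | t.length; }

    int main() {
        std::mt19937 rng(42);                        // reproducible pseudo-random stimulus
        std::uniform_int_distribution<int> op(0, 3);
        std::uniform_int_distribution<int> len(1, 1500);
        std::set<int> covered;                       // crude coverage: which opcodes were hit?

        for (int i = 0; i < 10000; ++i) {            // a basic test: 10,000 transactions
            Transaction t{uint8_t(op(rng)), uint16_t(len(rng))};
            if (dut(t) != model(t)) {
                std::printf("MISMATCH at transaction %d\n", i);
                return 1;
            }
            covered.insert(t.opcode);
        }
        std::printf("coverage: %zu of 4 opcodes exercised\n", covered.size());
        return 0;
    }

Real coverage collection uses SystemVerilog covergroups; the std::set above only mimics the question “did we cover all input combinations?”.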
9. • HW pipeline implemented in SW (C++)
Same functionality
Used for verification or SW simulation
• Separate output for each stage
• Same data capture passed to both HW and SW
• Outputs compared for each stage (sketched below)
• Over 20 Mbps testing speed
Functional model
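A sketch of the per-stage comparison (StageOutput and compare_stages are illustrative names, not the actual interfaces): the firmware and the C++ model process the same capture, and the first differing stage pinpoints where the bug lives:

    #include <cstdint>
    #include <cstdio>
    #include <string>
    #include <vector>

    // One output record per pipeline stage (names illustrative).
    struct StageOutput {
        std::string name;            // e.g. "parser", "book builder"
        std::vector<uint8_t> bytes;  // serialized stage output
    };

    // Compare HW and SW outputs stage by stage and report the first
    // mismatch, localizing the defect to a single pipeline stage.
    bool compare_stages(const std::vector<StageOutput>& hw,
                        const std::vector<StageOutput>& sw) {
        for (size_t i = 0; i < hw.size() && i < sw.size(); ++i) {
            if (hw[i].bytes != sw[i].bytes) {
                std::printf("first mismatch in stage '%s'\n", hw[i].name.c_str());
                return false;
            }
        }
        return hw.size() == sw.size();
    }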
10. • How to test that a change didn’t break anything?
Too many configurations to test (exchanges, protocols)
• Automated testing using CDash (see the sketch below)
Nightly builds, SW model, live exchange access
• Git branching model (master vs. devel)
Automated testing
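The deck does not show the dashboard setup; as a hedged sketch, a CTest-driven project submits to CDash through a CTestConfig.cmake along these lines (the project name and server URL are placeholders), after which a nightly run is launched with "ctest -D Nightly":

    # CTestConfig.cmake (placeholder values, not taken from the deck)
    set(CTEST_PROJECT_NAME "Tradecope")
    set(CTEST_NIGHTLY_START_TIME "01:00:00 UTC")
    set(CTEST_DROP_METHOD "http")
    set(CTEST_DROP_SITE "cdash.example.com")
    set(CTEST_DROP_LOCATION "/submit.php?project=Tradecope")
    set(CTEST_DROP_SITE_CDASH TRUE)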
11. • Ensure performance parameters of the design
Low tick-to-trade latency
High throughput, zero packet loss
• Online latency reporting
Internal latency of each order reported to SW
Online analysis of results, statistics over time (sketched below)
• Black box measurements
Optical TAPs on input and output wires
Precise timestamps for market data & outgoing order
Independent latency measurement
Performance testing
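An illustrative sketch of the online analysis (the function and variable names are invented; the timestamps themselves come from the hardware): software reduces per-order tick-to-trade samples to the statistics worth tracking over time:

    #include <algorithm>
    #include <cassert>
    #include <cstdio>
    #include <vector>

    // Reduce per-order tick-to-trade latency samples (in nanoseconds)
    // to summary statistics; tracking these over time exposes regressions.
    void report_latency(std::vector<double> samples_ns) {
        assert(!samples_ns.empty());
        std::sort(samples_ns.begin(), samples_ns.end());
        auto pct = [&](double p) {
            return samples_ns[static_cast<size_t>(p * (samples_ns.size() - 1))];
        };
        std::printf("tick-to-trade: median %.0f ns, p99 %.0f ns, max %.0f ns\n",
                    pct(0.5), pct(0.99), samples_ns.back());
    }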
12. • Most of the time-to-market is spent in verification
Automate the process as much as possible
Functional models for your cores
• Simplify localizing and fixing bugs
Assertions and checks in the logic
Outputs from each stage
• Don’t forget about performance!
Automatic latency measurements and reporting
Sustainable throughput without packet loss
Conclusions