
How Formal Verification

Can Please Your Project


Manager (too)

Jasper User Group Meeting


Cupertino, October 2013

Laurent Arditi
ARM France
Manager’s satisfaction with formal
[Slide graphic: a stock-ticker-style chart, “ARMSDAQ: FV”, tracking manager satisfaction with formal from 2005 to 2015]
2 Jasper User Group Meeting 2013


Intro
 Managers are interested in quality, schedules…
and Return On Investment (ROI)
 So what’s the ROI of formal verification, and how to increase it?



Agenda (kind of)
 Demonstrate that these common statements about formal verification are lies:
“Formal is costly because it needs lots of constraints”
“Formal does not find significant or genuine bugs”
“Formal is limited in size and is not able to tackle much more than a
small arbiter”
 Show that there are approaches to get a very high ROI out of
formal by both
Reducing the human and infrastructure costs (of formal AND
of simulation)
Increasing the benefits
 Examples taken from real ARM projects (CPUs, GPUs, other
IPs) using formal as part of their validation strategy



Embedded (designers) properties in formal
 Written primarily for simulation |-> no extra cost
 Some rewrites to increase formal efficiency |=> can find
bugs escaping the standard assertions (both in simulation
and in formal). Build good know-how over time:
 Use of oracle and pseudo-constants
 Use structural symmetry
 Split into smaller assertions
 Write assertions as implications for coverage results
 Measure the implication ratio. Look for weak modules
 Push conditions to the left to make covers more difficult to reach
 Do not duplicate similar covers
 Disable assertions for similar instances
 Reduce cluster at a low risk
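Two of the items above — writing assertions as implications and measuring the implication ratio — can be illustrated with a toy trace checker. This is plain Python standing in for SVA, and all signal names are invented for the sketch:

```python
def check_implication(trace, antecedent, consequent):
    """Evaluate an implication-style assertion over a trace.

    Returns (failures, fired): the cycles where the antecedent held but
    the consequent did not, and how often the antecedent fired at all.
    A low 'fired' count flags a vacuous, weakly covered assertion.
    """
    failures, fired = [], 0
    for cycle, state in enumerate(trace):
        if antecedent(state):
            fired += 1
            if not consequent(state):
                failures.append(cycle)
    return failures, fired

# Toy trace: a request/grant handshake where grant must accompany request.
trace = [
    {"req": 1, "gnt": 1},
    {"req": 0, "gnt": 0},
    {"req": 1, "gnt": 0},  # violation: req without gnt
]
failures, fired = check_implication(trace,
                                    antecedent=lambda s: s["req"],
                                    consequent=lambda s: s["gnt"])
```

Dividing `fired` by the trace length gives a crude version of the implication ratio: it says how often the interesting precondition was actually exercised, which is the coverage signal the slide is after.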



X-propagation
 Simulation has issues with Xs:
 X-pessimism X && !X == X
 X-optimism if (X) … else …
 Simulators do not deal correctly with Xs: RTL vs gate differences
 Simulation tests masking possible X-propagations because of their
preamble
 X-propagation with JasperGold
 Few hand-written properties
but auto-generated ones are very good
 Finds many bugs
 Debugging is very easy, especially compared to simulation
 Some bugs are not real ones but it’s good practice to fix them anyway
 easier to fix than to analyze the potential implications
 a bug may mask another one
Now relying exclusively on JasperGold to clean X-propagation
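The two simulator behaviours above can be sketched with a toy three-valued model — an illustration of the idea, not the actual simulator semantics tables:

```python
def x_and(a, b):
    """AND over three-valued (0 / 1 / 'X') logic, as a simulator sees it."""
    if a == 0 or b == 0:
        return 0
    if a == 1 and b == 1:
        return 1
    return "X"

def x_not(a):
    """NOT over three-valued logic."""
    return "X" if a == "X" else 1 - a

# X-pessimism: the simulator reports X for (x && !x), yet real hardware
# resolves it to 0 whichever concrete value x takes.
pessimistic = x_and("X", x_not("X"))   # stays "X"

# X-optimism: 'if (x)' quietly takes the else branch when x is X in plain
# Verilog semantics, so the unknown is swallowed instead of propagated.
def if_x_optimistic(cond, then_val, else_val):
    return then_val if cond == 1 else else_val

optimistic = if_x_optimistic("X", 1, 0)  # the X is silently lost
```

Formal X-propagation checks avoid both distortions by tracking the unknown exactly, which is why the slide reports that debugging them is so much easier than in simulation.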
Clocking verification
Low-power designs have complex clocking schemes which are
difficult to verify with simulation
 Regional clock gating:
 Internal clock can be stopped only when nothing is pending
 Some clock relationships
 High-level clock gating:
 out must be stable when CLKEN is low
 in must not be sampled when CLKEN is low
[Slide diagram: Block A driving Block B through in/out pins; clock gates (CG) on a fast clock and a slow clock, controlled by a slow clock enable (CLKEN)]
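The two CLKEN properties can be sketched as a toy trace checker — plain Python standing in for the formal properties, with invented field names:

```python
def check_clken_properties(trace):
    """Check the two high-level clock-gating rules over a sampled trace:
    'out' must be stable while CLKEN is low, and 'in' must not be
    sampled (consumed) while CLKEN is low.

    Each trace entry is a dict with 'clken', 'out' and 'in_sampled'.
    Returns the offending cycle indices. Illustrative only: in the real
    flow these are formal properties, not a Python loop over one trace.
    """
    failures = []
    prev_out = None
    for cycle, s in enumerate(trace):
        if not s["clken"]:
            if prev_out is not None and s["out"] != prev_out:
                failures.append(cycle)   # out changed while gated
            if s["in_sampled"]:
                failures.append(cycle)   # input consumed while gated
        prev_out = s["out"]
    return failures

trace = [
    {"clken": 1, "out": 0, "in_sampled": True},   # normal operation
    {"clken": 0, "out": 0, "in_sampled": False},  # gated, out stable: OK
    {"clken": 0, "out": 1, "in_sampled": False},  # gated, out toggles: bug
]
violations = check_clken_properties(trace)
```

A simulation run only checks the traces it happens to produce; the formal version of the same properties covers all reachable traces, which is what makes the corner cases on the next slide findable.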



Clocking verification with formal
 Real validation problem shown on Cortex® A12
 Simulation
 Intrusive (X-injection). Needs testbench changes, slower simulation,
not reaching the corner-cases
 Could find 1 late bug
 JasperGold for regional clock gating and output stability
 Found many bugs: corner cases
 JasperGold SPV for input sampling
 Flow setup is generic
 Just need to list the clock enable signals and the associated pins
 Not intrusive
 “Exhaustive”, but almost no full proofs
 Could find many very late bugs
 Could be done using formal only => Reduce simulation cost
High-level properties
 Very generic because using generic formal components only
 So easily reusable
 Examples
 Cache line duplication. Looking at the content of the formal models
for the tagrams
 Coherency in multi-core CPUs
 Memory system architectural properties (PDF)

 Don’t try to get full proofs for these, nor expect to get very
deep exploration depths

 But with a bug hunting approach real RTL bugs are found



Free assertions
 Automatic assertion synthesis:
 Structural (JasperGold SPS)
 The cost is very low,
 But usually not showing real bugs
 Useful to release “clean code”

 Behavioural (JasperGold BPS)


 The tool must provide high quality assertions,
otherwise the triage work is too costly
 BPS has a good automatic ranking
 Still need to define when and where to apply



Formal on the farm
 Have many Jasper licenses (talk to K. for business model :-)
 Run on a big farm when there are hundreds of properties
 May require some hosts with huge memory: from 4 to 100 GB
 May require a retry mechanism for memory limitations
 Put extra effort on “important” properties
 Late soak with formal is not efficient
 Soak = run formal proofs for a very long time (e.g. 1 week)
 In many different configurations
 Trialled on Cortex® R7 maturity,
did not find any significant bug
 ROI (bugs found / cluster time) is not good
 Better spend time on new assertions, new engines,
new validation areas, new approaches, etc.
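The retry mechanism mentioned above could look like the following generic resubmission loop. Everything here is a sketch: the memory ladder and the job runner are hypothetical placeholders, not ARM's actual farm flow:

```python
def run_with_memory_retry(run_job, mem_steps_gb=(4, 16, 48, 100)):
    """Resubmit a proof job with a growing memory request until it no
    longer dies from memory exhaustion.

    'run_job(mem_gb)' is a placeholder for the real farm submission
    (e.g. a wrapper around the batch scheduler) and must return a
    status string such as 'done' or 'out_of_memory'.
    """
    for mem_gb in mem_steps_gb:
        status = run_job(mem_gb)
        if status != "out_of_memory":
            return mem_gb, status
    return mem_steps_gb[-1], "out_of_memory"

# Toy stand-in: pretend this proof needs at least 48 GB to complete.
attempts = []
def fake_job(mem_gb):
    attempts.append(mem_gb)
    return "done" if mem_gb >= 48 else "out_of_memory"

mem, status = run_with_memory_retry(fake_job)
```

Starting small and escalating keeps the big-memory hosts free for the few properties that genuinely need them, which matches the 4 GB to 100 GB spread quoted on the slide.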



Some other cost reduction techniques
 Use available Jasper ProofKits
 Reuse home-made formal components
 Generic set of abstractions
 Based on architectural items defined in ARM ARM
 And on our know-how
 Very few constraints at top-level in addition to the ProofKits
 Allow some overconstraints
 E.g. no config change
 Complete sets of properties for internal interfaces can be costly
 But good for [in]formal validation too
 Easier to change the design if interface changes first



Reduce the costs of simulation
 No X-propagation
 Design exploration using JasperGold first
 Code coverage explanation with JasperGold
 Correlate simulation covered/not covered with formal
covered/unreachables
 Found a big issue with our simulation tool (claiming something is
covered while formal – and humans – say it’s unreachable)
 Functional coverage explanation
 To be automated using UCDB - UCIS
 Too many assertions slow down the simulation speed
 So disable the proven asserts
 JasperGold commands for design exploration
 Really useful to build ad-hoc flows based on formal
 Removing a big effort from simulation and from humans
What is good for simulation is good for formal too
 Design mutations
 Reduce FIFO depths, change arbitration, etc. in order to maximize the
probability to get to “tricky cases”
 Very good for simulation – but must not always be enabled (increases
stalls, etc.)
 Very good for formal too – if implemented correctly (on/off
abstractions), always good
 Dynamic configurations
 Replace static configuration switches by extra (stable) pins
 Not always feasible (interface change, etc.)
 Allows running different configurations in simulation without recompiling
 Allows a single formal run instead of 1 per config



An efficient organization for formal

[Slide diagram: the Design team writes RTL and assertions and gets waveforms back; the Validation team provides setup, constraints and abstractions, and helps the design team; Central flows supply formal script libraries; formal proof (JasperGold) produces reports for Leads & managers via email, Excel, ValSpider and Jira]


Cortex® A12 http://www.arm.com/products/processors/cortex-a/cortex-a12-processor.php

“The Cortex®-A12 processor is the highest performance mid-range mobile processing solution
designed for mobile applications like the use in smartphones and tablets devices. The Cortex-
A12 processor is the successor to the highly successful Cortex-A9 processor and is optimized
for highest performance in the mainstream mobile power envelope leading to best-in-class
efficiency.”



Cortex® A12 formal verification
 JasperGold mostly running at top-level
 Completed all “must have” criteria:
 0 failures on embedded properties
+ protocol checkers + X-propagation
 Precondition coverage > 80%
 Other successful applications
 ECC matrix verification
 LSU FSM verification and debug
 Clocking schemes
 Code coverage help
 Strong effort on i-side
 Can be isolated
 High assertion density
 Strong effort on data engine
 Imported design with a new interface
 Assertions to understand the design



Cortex® A12 formal verification results
 Formal with Jasper accounts for less than 10% of the total verification costs
 Formal engineers for setup, runs, reports, support, maintenance, etc.
 Designers writing embedded assertions (done even if no formal)
 LSF footprint for top-level similar to a single block-level simulation TB

 Formal with Jasper found 18% of the real bugs in Jira (from October’12 to July’13), probably closer to 25%
 Reasonable “garbage” ratio: 20%, exactly similar to simulation. Means a low number of false negatives to debug
 Some bugs would never have been found without formal (clock issues, LSU FSM, X-propagation)
 Formal buildbot usually first to warn about RTL issues
 Overall ROI advantage is 2.5x for formal vs simulation
Design bring-up benefits (Bug Avoidance)

 Allowed validation to start before simulation testbench availability
 On a GPU design, a collection of new modules was verified
with formal only
 Saved resources for simulation on other modules
 And allowed to focus more on the integration verification
 Found just a couple of bugs during integration test, showing the
high RTL quality

“formal bringup was worth the effort and also saved us overall time”



Design bring-up benefits
 Another GPU example:
 No more, no fewer bugs, but the bugs are found much earlier
 So less RTL changes (code churn), especially late
Block | Formal bring-up usage | Bug density (total) | Code churn (total) | Bug density (late) | Code churn (late)
L2C   | High                  | 6                   | 242                | 1                  | 10
LSC   | Medium                | 8                   | 577                | 2                  | 34
LS    | No                    | 27                  | 460                | 13                 | 171
TEX   | No                    | 8                   | 369                | 4                  | 54
JM    | No                    | 6                   | 265                | 3                  | 53
HT    | No                    | 10                  | 254                | 4                  | 71
(Lower is better)
[Slide charts: cumulative bug counts over time, all bugs vs L2C bugs]



Bug find / fix cycles
 Early phase
 Unit level bugs found and fixed by (or
close to) designers
 Bugs easily identified and fixed
 Usually local scope, so global side effect
risk is minimized
 Short or non-existent review and tracking
process
 Late phase
 Bugs found by validation team
 Deeply buried unit level bugs are hard
to identify from system level
environment
 Global side effect risk from changes is
high
 More restrictive change process
(review, code control, re-validate)
CPU in development: L1 focus on Formal
 Memory system designed and verified using lots of formal:
 Design bring-up (designers)
 Embedded assertions (designers): mainstream formal
 PDF: High-level architectural assertions (verif engineers + designers)
 Designers used formal and fixed the bugs before checking
their code in
 Checked-in RTL was a lot cleaner
 Simulation cycles used more effectively finding “real” bugs
 A late major rework of the main interface was greatly eased
thanks to interface assertions and formal
 The bugs found by PDF are fundamental, and some would
probably not have been caught by simulation
 And this was mostly executed by graduate engineers (no PhD
needed)



CPU in development: L1 bugs found by methodology

[Slide chart: breakdown of L1 bugs found by simulation, by PDF (deep formal), and by mainstream formal]



Verification RoI Top Tip
 Get Designers to find and fix their own bugs
[Slide: Old Way vs New Way waveform comparison]
 Minimal trace length
 Highlight relevant logic
 Minimal trace activity
Under investigation: coverage

 Verify the Verification: how good is my verification, and when am I done?
 Techniques exist, based on code mutations:
 If the RTL code is modified, is the change detected by simulation
tests, or by assertion failures in formal?
 Needs lots of reruns. Impossible on big designs
 Instead, use the JasperGold coverage app to answer:
 What is statically covered by the asserts (COI)?
 What is really covered when proving the asserts (PC)?
 What is covered by the undetermined asserts (bound)?
 End result equivalent to mutations, but much more efficient and complete
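The mutation technique set aside above can be sketched in a few lines, which also makes the rerun cost visible: every mutant requires replaying the whole test list. Design, mutants and tests here are toys invented for the sketch:

```python
def run_mutation_campaign(design, mutants, tests):
    """Classic mutation coverage: rerun every test against every mutated
    design and count the mutants that some test detects (i.e. produces
    a result differing from the original design). The cost is up to
    len(mutants) * len(tests) reruns, which is what makes the approach
    impractical on big designs.
    """
    detected = 0
    for mutant in mutants:
        if any(test(mutant) != test(design) for test in tests):
            detected += 1
    return detected

# Toy 'design': a saturating increment. Mutants drop the saturation
# or use the wrong step.
design = lambda x: min(x + 1, 7)
mutants = [lambda x: x + 1,            # saturation removed
           lambda x: min(x + 2, 7)]    # wrong increment
tests = [lambda f: f(0), lambda f: f(7)]
detected = run_mutation_campaign(design, mutants, tests)
```

The coverage-app approach on the slide reaches an equivalent verdict statically — which logic the asserts can observe, and to what depth — without any of these reruns.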



What to show to your manager?
Nothing… until “The design is formally proven”



What to show to your manager?
 Collect data from formal on a regular basis, and show the
cost and benefit evolutions. Some useful numbers:
 Number and density of assertions per block
 Percentage of covered, proven, fails, etc.
 Highlight the covered and fails, not the proven (managers don’t
care about provens!)
 Number of bugs found
 Found early saves simulation time and debug burden... and stress
 Found late shows the superiority of formal against simulation for
corner cases
 Exact cost of engineering work, and of infrastructure
 Compare against simulation, emulation, FPGA, etc.
 Show that some bugs are impossible - or very difficult - to
catch with non-formal validation (the opposite is also true but
we know that already)
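Collecting the numbers above can be as simple as a small script over the property results. This is a sketch: the status labels and the data shape are invented for illustration, not an actual JasperGold report format:

```python
from collections import Counter

def summarise_properties(properties):
    """Turn raw assertion results into manager-facing numbers: count and
    percentage per status, with fails and covers listed first since
    those are what managers act on (not the provens!).

    'properties' is a list of (block, status) pairs.
    """
    by_status = Counter(status for _, status in properties)
    total = len(properties)
    order = ("fail", "covered", "undetermined", "proven")
    return {s: (by_status.get(s, 0), 100.0 * by_status.get(s, 0) / total)
            for s in order}

# Toy regression snapshot.
props = [("lsu", "proven"), ("lsu", "fail"),
         ("ifu", "covered"), ("ifu", "proven")]
summary = summarise_properties(props)
```

Emitting this on every regression run, rather than at milestones, is what makes the cost/benefit *evolution* the slide asks for visible.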
Current and next steps using
the new JasperGold apps
 Coverage
 Build a standard flow (COV): in progress
 Functional coverage and UCDB link (COV): not started
 Low-power verification
 Power-aware property verification (LPV): just starting
 Structural checks and auto assertions (LPV)
 Power-saving feature verification (clocking, etc.): not started
 Sequential equivalence checking (SEC)
 Standardise ad-hoc flows (FPV, SPV): in progress
 Security feature verification (SPV): evaluated
 Tackle security sensitive blocks (TrustZone)
 Secure cores: not started
 IPXACT Register validation (CSR): evaluated
 IPXACT Connectivity verification (CON): in progress


Thank you
Emperor Joseph II: My dear young man, don't take it too hard. Your
work is ingenious. It's quality work. And there are simply too many
notes, that's all. Just cut a few and it will be perfect.

Mozart: Which few did you have in mind, Majesty?

Quote from the Amadeus movie, by Milos Forman.


Thanks to Martin Lampard for pointing it out.
Verification Taxonomy: The Aha Moment
(spanning Architectural, RTL and Post-silicon phases)
 Micro-Architectural Analysis: proper module partitioning and modeling for formal verification; ensure clean interfaces
 Bug Avoidance: formal during RTL bring-up; catch bugs early; eliminate throw-away testbench creation effort
 Bug Hunting: find bugs at block and system level; automation and regression friendly on a server farm
 Bug Absence: prove critical properties to get 100% assurance; may require considerable user expertise and effort
 Bug Analysis: investigate late-cycle bugs; isolate corner-case bugs (observed in field, lab); confirm the correctness of the bug fix
