Sunday, February 22, 2015

DVCON: Enabling the O's of OODA loop in DV

Today the traditional methods of verifying a design struggle with complexity, demanding new leaps and bounds. We discussed in our last post why a new pair of eyes was added to the design flow, and since then we have continued not only to increase the number of those pairs but also to improve the lenses these eyes use to weed out bugs. These lenses are nothing but the flows and methodologies we have been introducing as the design challenges continue to unfold. Today, we have reached a point in verification where 'one size doesn't fit all'. While the nature of the design commands a customized process for verifying it, even within a given design, moving from block to subsystem (UVM centric) and on to SoC/top level (directed tests) requires changing the way we verify at each scope. Besides the level, certain categories of functions are best suited to a particular style of verification (read: formal). Beyond this, modelling the design and putting a reusable verification environment around it to accelerate development is another area that requires attention. With analog sitting next to digital on the same die, verifying the two together demands a unique approach. All in all, for the product to hit the market window at the right time you cannot just verify the design; you need to put in place a well-defined strategy to verify it in the fastest and best possible fashion.

So what is the OODA loop? From Wikipedia, the phrase OODA loop refers to the decision cycle of observe, orient, decide, and act, developed by military strategist and USAF Colonel John Boyd. According to Boyd, decision making occurs in a recurring cycle of observe-orient-decide-act. An entity (whether an individual or an organization) that can process this cycle quickly, observing and reacting to unfolding events more rapidly than an opponent, can thereby "get inside" the opponent's decision cycle and gain the advantage. The primary application of the OODA loop was at the strategic level in military operations. Since the concept is core to defining the right strategy, its application base continues to grow.

To some extent this OODA loop entered the DV cycle with the introduction of Constrained Random Verification (CRV) paired with coverage-driven verification closure. Constrained random regressions kicked off the process of observing the gaps, analyzing the holes, deciding if they need to be covered, and acting by refining the constraints further so as to direct the simulations to cover the holes.
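To make that loop concrete, here is a minimal SystemVerilog sketch of one iteration: coverage bins expose a hole (observe/orient), and a derived constraint biases generation toward it (decide/act). All names and bin ranges here (pkt_item, cg_len, the jumbo bin) are purely illustrative, not from any particular testbench.

```systemverilog
// One turn of the observe-orient-decide-act cycle in CRV.
class pkt_item;
  rand bit [7:0] length;

  // Original constraint: uniform stimulus across the legal space.
  constraint c_legal { length inside {[1:255]}; }

  // Observe/orient: the regression coverage report shows "jumbo" is never hit.
  covergroup cg_len;
    coverpoint length {
      bins small = {[1:63]};
      bins mid   = {[64:191]};
      bins jumbo = {[192:255]};  // the hole found in the coverage report
    }
  endgroup

  function new();
    cg_len = new();
  endfunction
endclass

// Decide/act: a directed-random refinement that biases generation
// toward the uncovered bin instead of writing a directed test.
class pkt_item_jumbo extends pkt_item;
  constraint c_bias { length dist { [1:191] := 1, [192:255] := 4 }; }
endclass

module tb;
  initial begin
    pkt_item_jumbo item = new();
    repeat (100) begin
      void'(item.randomize());
      item.cg_len.sample();
    end
    $display("length coverage after biasing = %0.1f%%", item.cg_len.get_coverage());
  end
endmodule
```

Each regression then feeds the next pass of the loop: rerun, re-read the coverage report, and refine the constraints again.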

Today, the need for applying the OODA loop is at a much higher level, i.e. to strategize on the right mix of tools, flows and methodologies to realize a winning edge. The outcome depends highly on the two O's, i.e. Observe & Orient. In order to maximize returns on these O's, one must be AWARE of –
1. The current process that is followed
2. Pain points in the current process and anticipated ones for the next design
3. Different means to address the pain points

Even to address the first two points above, one needs to know what to observe in the process and how to measure the pain points. While the EDA partners try to help in the best possible way, enabling teams with the right mix, it is important to understand what kind of challenges are keeping others in the industry busy and how they are solving these problems. One of the premier forums addressing this aspect is DVCON! Last year, DVCON extended its presence from the US to India and Europe. These events provide a unique opportunity to get involved and, in the process, connect, share and learn. Reiterating the words of Benjamin Franklin once again –

Tell me and I forget.
Teach me and I remember.
Involve me and I learn.

So this is your chance to contribute to enabling the fraternity with the O's of the OODA loop!

Relevant dates -
DVCON US – March 2-5, San Jose
DVCON India – September 10-11, Bangalore
DVCON Europe – November 11-12, Munich

Sunday, February 8, 2015

Designers should not verify their own code! REALLY?

Around two decades back, the demands from designs were relatively simple and the focus was on improving performance. Process nodes had longer lives, power optimization wasn't even discussed, and time-to-market pressure was relatively low given that the end products enjoyed long lives. In those days, it was the designer who would first design and later verify his own code, usually using the same HDL. Over the years, as complexity accelerated, a new breed of engineers entered the scene: the DV engineers! The rationale given was that an independent pair of eyes is needed to confirm the design is meeting the intent! Verification was still sequential to design in the early days of directed verification. Soon, there was a need for constrained random verification (CRV) and additional techniques to contain the growing verification challenge. Test bench development now started in parallel to the design, increasing the size of and need for verification teams further. With non-HDLs, i.e. HVLs, entering the scene, the need for DV engineers was inevitable. All these years, the rationale of having an additional pair of eyes continued to be heard, to the extent that we have started believing that designers should not verify their own code.

In my last post I emphasized the need for collaboration, wherein designers and verification engineers come together for faster verification closure. Neil Johnson recently shared his conclusions in a post on designers verifying their own code. My two cents on whether designers should not, or shall I say do not, verify their own code –

To start with, let's look at what verification involves. The adjoining figure summarizes the effort spent in verification, based on the study conducted by Wilson Research in 2012 (commissioned by Mentor). Clubbing some of the activities together, it is clear that ~40% of the time is spent on test planning, test bench development and other activities. The remaining ~60% of the effort is spent on debug and creating + running tests. The DUT here can be an IP or an SoC.

When an IP is under development or an SoC is getting integrated, the DV engineers are involved in the 40% of activities mentioned above. These are the tasks that actually fall in line with the statement of an additional pair of eyes validating design intent. They need to understand the architecture of the DUT, come up with a verification plan, and develop the verification environment and hooks to monitor progress. At this level, the design team's involvement starts with activities like test plan review, code coverage review, and inputs on corner cases and tests of interest.
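For illustration, below is a minimal sketch of the kind of environment skeleton and progress hook a DV engineer puts around the DUT, assuming a UVM testbench; all class names (bus_txn, bus_coverage, dut_env) are hypothetical boilerplate, not a prescribed structure.

```systemverilog
// A minimal environment skeleton plus a progress hook, assuming UVM.
import uvm_pkg::*;
`include "uvm_macros.svh"

// Transaction observed on the DUT interface.
class bus_txn extends uvm_sequence_item;
  `uvm_object_utils(bus_txn)
  rand bit [31:0] addr;
  rand bit [31:0] data;
  function new(string name = "bus_txn");
    super.new(name);
  endfunction
endclass

// The "hook to monitor progress": functional coverage on observed traffic.
class bus_coverage extends uvm_subscriber #(bus_txn);
  `uvm_component_utils(bus_coverage)
  bus_txn txn;
  covergroup cg;
    coverpoint txn.addr[31:28];  // coarse address-region coverage
  endgroup
  function new(string name, uvm_component parent);
    super.new(name, parent);
    cg = new();
  endfunction
  // Called through the monitor's analysis port for every observed transaction.
  function void write(bus_txn t);
    txn = t;
    cg.sample();
  endfunction
endclass

// Environment tying DUT observation to checking and coverage.
class dut_env extends uvm_env;
  `uvm_component_utils(dut_env)
  bus_coverage coverage;
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // A full env would also create an agent (driver + monitor) and a
    // scoreboard here, and connect the monitor to both subscribers.
    coverage = bus_coverage::type_id::create("coverage", this);
  endfunction
endclass
```

The coverage subscriber is what turns raw simulation into the measurable progress that both teams review together.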

So, once the design is alive on the testbench, do the designers just sit and watch the DV team validate the representation of the spec they developed? NO!

Debugging alone is a major activity that consumes an equal, and sometimes greater, effort from the designers to root-cause bugs. Apart from that, there is significant involvement of the design team during IP & SoC verification.

For IPs, CRV is the usual choice. The power of CRV lies in automating test generation using the testbench. A little additional automation enables the designers to generate constrained tests themselves. Assertions are another very important aspect of IP verification. With the introduction of assertion synthesis tools, the designers work on segregating the generated points into assertions or coverage. For SoCs, apart from reuse of CRV, directed verification is an obvious choice. New tools for graph-based verification help designers try out tests based on the test plan developed by the DV engineer. Apart from this, corner case analysis and use-case waveform reviews are other time-consuming contributions from designers towards verifying the DUT.
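As an example of the assertion side of this, here is a minimal SVA sketch of a handshake check a designer might contribute next to the RTL, paired with a cover property; the signals (clk, rst_n, req, ack) and the 4-cycle bound are hypothetical.

```systemverilog
// Hypothetical handshake checker a designer might add alongside the RTL.
module handshake_checks (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic ack
);
  // Every request must be acknowledged within 1 to 4 cycles.
  property p_req_ack;
    @(posedge clk) disable iff (!rst_n)
      $rose(req) |-> ##[1:4] ack;
  endproperty

  a_req_ack: assert property (p_req_ack)
    else $error("req not acknowledged within 4 cycles");

  // Companion coverage: prove the back-to-back request case was exercised.
  c_b2b_req: cover property (
    @(posedge clk) disable iff (!rst_n) req ##1 req
  );
endmodule
```

Such a checker module can be attached to the design with a bind statement, keeping the RTL itself untouched.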

Coming back to the rationale of having an independent pair of eyes verify the code: the implication was never that the designers shall not verify their own code. In fact, there is no way for the DV team to do it in a disjoint fashion. Today the verification engineer himself is designing a highly sophisticated test bench that is equivalent to a designer's code in complexity. So it would be rather apt to say that it is two designs striking against each other that enables verification, under the collaboration between design & verification teams!

What is your take on this? Drop a note below!