Saturday, October 30, 2010

EDA360: Realizing what it is?

A vision paper released by Cadence in early 2010 created a BUZZ in the EDA community, and the hot topic of this year’s CDNLive 2010 is EDA360. Here is a summary of the topic in a three-part series, concluding with what EDA360 brings to our plate as verification engineers.
Cadence has a history of creating a wave with whatever it does, so much so that it enters the DNA of the Cadence folks and the people associated with them. In the past, when Cadence released its Low Power CPF solution, my team was invited for a technical briefing on it at the Cadence premises. By that time we had heard a lot about CPF from the sales team, the AEs and friends working for Cadence, but we were amazed when, while completing the formalities to enter their building, even the security guard asked if we were there to discuss low power! I am sure that anyone visiting this time would be asked about EDA360 :)
WHAT IS EDA360?
For many years the semiconductor industry has religiously followed Moore’s Law. With only a few defined nodes left for shrinking geometries, an exponential increase in the cost of silicon at advanced nodes, saturation in SoC architectures for a given application, time-to-market challenges and a shrinking market window for any product, there is a need to CHANGE the way every entity associated with this industry has been operating. With software claiming the majority of product cost, innovation and differentiation revolve more around end-user applications (apps). This demands a SHIFT from the ‘traditional approach’ of developing HW, OS, SW and apps serially to an ‘application-driven model’, where the definition of a complete platform enables HW and SW development in parallel. The recent success of the iPhone (a hardware-dependent platform) and Android (a hardware-independent platform) over conventional players demonstrates the changing dynamics of the market. This calls for a HOLISTIC ADJUSTMENT among the partners in this ecosystem.
EDA360 is a potential answer to this CHANGE, this SHIFT and this HOLISTIC ADJUSTMENT. It addresses the needs of all stakeholders in the ecosystem, with the goal of improving the 3Ps (Productivity, Predictability and Profitability) for the 4th P, i.e. the Product. It is an ambitious attempt to connect the product development chain by understanding the customer’s customer (the end customer). Focusing on the ‘consumer’, EDA360 preaches a top-down, i.e. application-driven, approach to product development, highlighting the grey areas in the present model and showing how the new model can weed out the inefficiencies of the ecosystem, thereby improving the 3Ps.
Who does what?
It starts with a re-classification of developers into:
1. Creators – Organizations that keep innovating in silicon, chasing better performance, lower power and smaller die size. These teams would constantly feed silicon capacity, following Moore’s Law. Contributing little to end-product differentiation, only a handful of such companies can survive productively and profitably.
2. Integrators – Organizations that optimally integrate the products from creators to realize the end product. They would stick to mature process nodes and define application-focused platforms (software stacks) to serve application needs, with the goal of meeting quality, cost and time-to-market targets.
How will it be done?
To achieve the end goal, EDA360 brings forward three important capabilities:
- System Realization
- SOC Realization
- Silicon Realization
The semiconductor industry is at a juncture where certain critical decisions become inevitable. As Charles Darwin’s theory of evolution points out, beneficial genetic mutations are part of the random mutations occurring within an organism and are preserved because they aid survival; passed on to the next generation, their accumulated effect results in an entirely different organism adapted to the changes in its environment. Hopefully EDA360 rolls out in a series of such mutations, bringing about the much desired and demanded change essential for the evolution of our industry.

Sunday, October 10, 2010

PROOF of NO BUG... OR ...NO PROOF of BUG

To pursue a career in engineering, all of us have been through multiple courses in mathematics, and all of them involved proving equations or theorems. Given a relationship between variables, we are asked to prove that relationship. There are defined methods, and by following a step-by-step approach we are able to derive the desired proof. All along, we focus on proving LHS == RHS. Now imagine being asked to prove the equivalence (LHS == RHS) by showing that there is no way (LHS != RHS) in which the equation is non-equivalent.
Sounds like bullying?
Interestingly, the whole practice of verifying semiconductor designs is based on this puzzling statement.
'Verification', i.e. sid'dha-karana, is the noun for the verb 'verify', derived from Medieval Latin verificare: verus (true) + facere (to make). So the literal meaning of the verification engineer's job is to make it true.
In an ideal world, a fully verified design is one where we have a PROOF of NO BUG (PNB) in the design. But the first lesson imbibed by a verification engineer is that we cannot achieve 100% verification - an unconscious setback to the way one has always proved equality. The limitations (engineering and hardware resources, tools and schedule, to name a few) that laid the foundation of this unachievable 100% verification have tossed the regular equation and shifted our focus from PNB to NPB, i.e. NO PROOF of BUG. The verification engineer thus endeavors to pursue all means of making sure there is no proof of a bug (a daunting task if you still relate it to proving that there is no way the equation can be non-equivalent). With a set of tools, methodologies, languages and checklists, the hunt for bugs, i.e. for all possible ways of proving non-equivalence, begins. Slowly, as we approach verification closure, the consistently passing regressions and the diminishing bug rate strengthen our assumption that no more bugs are concealed. With verification sign-off, the design is labeled BUG FREE on the assumption that, if no more bugs are discovered, NPB == PNB. Silicon validation adds more credibility to this assumption, while the team lives with anxiety during silicon bring-up. If we are fortunate enough to have customers who explore the design the way we did in verification, the assumption becomes immortal.... Happy ending!
Of course, if we miss one of the ways of reaching a bug, the result is far more costly in cash and kind than our inability to prove an equation before our maths teacher.
Maybe that’s the risk we as verification engineers assume when we pick NPB over PNB for whatever reasons.
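
To make the difference concrete, here is a toy sketch in Python (the 4-bit adder, its planted bug and the test counts are all made up for illustration). Exhaustively checking every input combination of a small design amounts to a PROOF of NO BUG, while a handful of passing random tests only gives us NO PROOF of BUG:

```python
import random

def buggy_adder(a, b):
    """A 4-bit adder 'design' with a deliberately planted corner-case bug."""
    if a == 15 and b == 1:
        return 0              # wrong result hidden in one corner of the input space
    return (a + b) & 0xF      # correct 4-bit wrap-around everywhere else

def reference_adder(a, b):
    """Golden model the design is checked against."""
    return (a + b) & 0xF

def proof_of_no_bug():
    """PNB: enumerate all 16 x 16 inputs; a pass here really proves no bug."""
    return all(buggy_adder(a, b) == reference_adder(a, b)
               for a in range(16) for b in range(16))

def no_proof_of_bug(num_tests=20, seed=1):
    """NPB: a few random tests; a pass only means we found no proof of a bug."""
    rng = random.Random(seed)
    return all(buggy_adder(a, b) == reference_adder(a, b)
               for a, b in ((rng.randrange(16), rng.randrange(16))
                            for _ in range(num_tests)))

print("Exhaustive (PNB) verdict:", proof_of_no_bug())   # False: the bug is exposed
print("Random (NPB) verdict   :", no_proof_of_bug())    # most likely True: bug stays concealed
```

A real design has far too many states to enumerate, which is exactly why simulation settles for NPB and why formal techniques chase PNB only on the slices of the state space they can handle.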
Happy BUG hunting! :)

Sunday, October 3, 2010

Predictions for the future - the next decade!

Constraints & limitations of available resources lead to innovative thinking... to inventions... to evolution...

As we step into the next decade, the verification fraternity is all set to reap the benefits of the investments made in the present decade. Listed below are the technologies (in no specific order) that will define the future course of verification. [Watch out for details on each of them in future posts.]

1. UVM - The UVM standards committee is all set to roll out the UVM 1.0 standard. This methodology sets up a new base for reusability and coverage-driven verification, supports different HVLs, and curtails the confusion over which of the existing methodologies is better.

2. Formal verification - Faster simulation, exhaustive scenario generation, handling complexity: you name it, and formal is there for it, isn't it? Improved engines will handle larger databases and provide a defined way of moving from unit level to chip level with formal. The app-based approach further brings in the required focus and results.

3. Hardware accelerators - Running real-life applications on a pre-silicon database with fast turnaround is the need of the hour. Hardware accelerators will meet this demand, enabling all domains to exercise real-life data before moving to back-end implementation.

4. Unified coverage model - Slowly creeping into the mainstream, the working group is making progress on unifying the coverage database across different sources and simulators, thereby improving reusability at many levels (see the sketch after this list).

5. AMS - With analog claiming a bigger chunk of the mixed-signal die, verification is no longer limited to confirming correct integration; instead, measuring analog performance while harmonics from the digital side operate in tandem will become more prevalent.

6. Low power designs - The drive to go GREEN brings in new techniques to curb power. While verification ensures correct implementation, it will also move a level up and assist in estimating power figures for different application scenarios (use cases), thereby helping define the optimum system architecture.

7. Electronic System Level (ESL) - Dealing with complexity is easier at higher abstraction levels. System modeling is gaining traction, and so is system verification. Standards like TLM 2.0 have opened the gates for transaction-level modeling support in different languages, assisting both design and verification.

8. Cloud computing - The problems associated with verification aren't only in developing testbenches, debugging or reuse. Efficiently managing the compute infrastructure to get the maximum return on investments made in hardware and software is equally important. Cloud computing defines a new approach to this chronic techno-business problem.

9. Virtual platforms - A complete platform that engages all players of the ecosystem at the same time (application developers, embedded designers, and the ASIC design, verification and implementation teams) to enable faster time to market will be key to next-generation designs. Verification holds the key to defining and developing such platforms.

10. Courses - With so much going on in verification, don't be surprised if it emerges as a separate stream of specialization in master's degrees, adding more PhDs in this domain :)
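
On point 4, here is a minimal sketch of the idea behind a unified coverage model, in Python; the bin names, the per-run dictionaries and the merge rule are all hypothetical stand-ins for what the interoperability effort would standardize. Each tool reports hit counts per coverage bin, and a merge simply accumulates them so that closure can be judged across runs and simulators:

```python
from collections import Counter

def merge_coverage(*runs):
    """Merge per-bin hit counts coming from multiple runs/tools."""
    merged = Counter()
    for run in runs:
        merged.update(run)  # accumulate hits bin by bin
    return merged

def coverage_percent(merged, all_bins):
    """Coverage = fraction of the defined bins hit at least once."""
    hit = sum(1 for b in all_bins if merged.get(b, 0) > 0)
    return 100.0 * hit / len(all_bins)

# Hypothetical bins for a FIFO, exercised by two different tools
all_bins = ["fifo.empty", "fifo.full", "fifo.half", "fifo.overflow"]
run_sim_a = {"fifo.empty": 120, "fifo.half": 40}   # simulator A regression
run_sim_b = {"fifo.full": 7, "fifo.half": 3}       # simulator B regression

merged = merge_coverage(run_sim_a, run_sim_b)
print("Merged hits:", dict(merged))
print("Coverage   : %.0f%%" % coverage_percent(merged, all_bins))  # 75%: overflow still unhit
```

The hard part, of course, is not the arithmetic but agreeing on a common database format and common bin semantics across vendors, which is exactly what the unification effort targets.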

The above list is what I foresee as the key to NEXT GEN verification.
If you have any other exciting stuff that should unfold in the next decade, do drop in your comments.

Previous -
'Recap from the past - the last decade'
'Living the present - the current decade'