Sunday, October 3, 2010

Predictions for the future - the next decade!

Constraints & limitations of available resources lead to innovative thinking... to inventions... to evolution...

As we step into the next decade, the verification fraternity is all set to reap the benefits of the investments made in the present decade. Listed below are the technologies (in no specific order) that will define the future course of verification. [Watch out for details on each one of them in future posts.]

1. UVM - The UVM standards committee is all set to roll out the UVM 1.0 standard. This methodology sets up a new base for reusability and coverage driven verification, supports the different HVLs, and curtails the confusion over which one's better. A minimal flavour of it is sketched right below.
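
To give that flavour, here is a minimal sketch of a UVM 1.0 style test. This is illustrative only - the class name and message are made up, not from any real testbench.

import uvm_pkg::*;
`include "uvm_macros.svh"

// Minimal UVM test skeleton (illustrative names)
class my_test extends uvm_test;
  `uvm_component_utils(my_test)   // register with the UVM factory, key to reuse

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  task run_phase(uvm_phase phase);
    phase.raise_objection(this);                   // keep the simulation alive
    `uvm_info("TEST", "Hello UVM 1.0!", UVM_LOW)   // standard reporting
    phase.drop_objection(this);                    // allow graceful end of test
  endtask
endclass

Running it is just a run_test() call from an initial block, with +UVM_TESTNAME=my_test picking the test from the factory at the command line.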

2. Formal verification - Faster closure, exhaustive scenario analysis, handling complexity, you name it and formal is there for it. Improved engines will handle larger databases and provide a defined way of moving from unit to chip level with formal. The app-based approach further brings in the required focus & results. A taste of the assertions these engines consume follows below.
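
As a taste, the input to a formal engine is often just a SystemVerilog assertion like the sketch below; the signals (clk, rst_n, req, gnt) and the 1-to-3 cycle rule are hypothetical. The point is that the tool proves the property for all legal input sequences instead of simulating a few.

module handshake_checks(input bit clk, rst_n, req, gnt);
  // Hypothetical rule: every request must be granted within 1 to 3 cycles
  property p_req_gnt;
    @(posedge clk) disable iff (!rst_n)
    req |-> ##[1:3] gnt;
  endproperty

  a_req_gnt: assert property (p_req_gnt);   // proven exhaustively, or a counterexample trace is produced
  c_req_gnt: cover property (p_req_gnt);    // engine can also show a witness trace
endmodule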

3. Hardware accelerators - Running real-time applications on the pre-silicon design database with faster turnaround is the need of the hour. Hardware accelerators will be able to meet this demand, enabling all domains to exercise real-life data before moving to back-end implementation.

4. Unified coverage model - Slowly creeping into the mainstream, the working group is making progress on unifying the coverage database across different sources and simulators, thereby improving reusability at many levels. The kind of coverage data in question is sketched below.
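
What gets unified is the functional coverage that testbenches already collect. A tiny illustrative covergroup is shown below (the transaction fields tr_kind/tr_size are made up); today each simulator stores such bins in its own database format, and a unified model would let results from different tools and levels merge cleanly.

module cov_sketch(input bit clk, input int tr_kind, tr_size);
  // Illustrative coverage on a bus transaction
  covergroup cg_bus @(posedge clk);
    cp_kind     : coverpoint tr_kind { bins rd = {0}; bins wr = {1}; }
    cp_size     : coverpoint tr_size { bins small = {[1:4]}; bins big = {[5:16]}; }
    kind_x_size : cross cp_kind, cp_size;   // bins that would merge across runs & tools
  endgroup

  cg_bus cg = new();   // sample every clock; results land in the coverage database
endmodule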

5. AMS - With analog claiming a bigger chunk of the mixed-signal die, verification is no longer limited to checking correct integration; instead, measuring analog performance while harmonics from the digital side operate in tandem will be more prevalent.

6. Low power designs - The orientation to go GREEN brings in new techniques to curb power. While verification today ensures correct implementation of these techniques, it will move a level up & assist in estimating power figures under different application scenarios (use cases), thereby helping define the optimum system architecture. One such testbench-side check is sketched below.
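
The power intent itself typically lives in formats like UPF/CPF; on the testbench side, one structural check could be that whenever a domain is powered off, its outputs stay isolated and clamped. A sketch with assumed signal names (pwr_on, iso_en, dout):

module iso_check(input bit clk, pwr_on, iso_en, input logic [7:0] dout);
  // Assumed signals: pwr_on = domain power, iso_en = isolation enable, dout = domain output
  property p_iso_when_off;
    @(posedge clk) !pwr_on |-> (iso_en && dout == '0);
  endproperty

  a_iso_when_off: assert property (p_iso_when_off);  // an off domain must be isolated & clamped to 0
endmodule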

7. Electronic System Level (ESL) - Dealing with complexity is easier at higher abstraction levels. System modeling is gaining traction and so is system verification. Standards like TLM2.0 have opened the gates for transaction-level modeling in different languages, assisting in both design & verification, as the sketch below shows.
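
In fact UVM 1.0 itself picks this up, adding SystemVerilog versions of the TLM2.0 sockets and generic payload. Below is a rough sketch of an initiator issuing one blocking write; the class name is illustrative and the wiring to a target socket is omitted.

import uvm_pkg::*;
`include "uvm_macros.svh"

// Rough sketch: a TLM2.0 style blocking write from a UVM component
class mem_initiator extends uvm_component;
  `uvm_component_utils(mem_initiator)
  uvm_tlm_b_initiator_socket #(uvm_tlm_generic_payload) sock;

  function new(string name, uvm_component parent);
    super.new(name, parent);
    sock = new("sock", this);            // to be connected to a target socket elsewhere
  endfunction

  task run_phase(uvm_phase phase);
    uvm_tlm_generic_payload gp  = new("gp");
    uvm_tlm_time            del = new("del");
    gp.set_address(64'h1000);
    gp.set_command(UVM_TLM_WRITE_COMMAND);
    sock.b_transport(gp, del);           // blocking transport, as in SystemC TLM2.0
  endtask
endclass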

8. Cloud computing - Problems associated with verification aren't only in developing testbenches, debugging or reuse. Efficient management of the compute infrastructure, to get maximum return on the investments made in hardware & software, is equally important. Cloud computing defines a new approach to dealing with this chronic techno-business problem.

9. Virtual platforms - A complete platform that engages all players of the ecosystem at the same time (application developers, embedded designers, and the ASIC design, verification & implementation teams) to aid faster time to market will be the key to next-generation designs. Verification holds the key to defining & developing such platforms.

10. Courses - With so much going on in verification, don't be surprised if it emerges as a separate stream of specialization in master's degrees, adding more PhDs to this domain :)

The above list is what I foresee as the key to NEXT GEN verification.
If you have any other exciting stuff that should unfold in the next decade, do drop in your comments.

Previous -
'Recap from the past - the last decade'
'Living the present - the current decade'
