Sunday, December 28, 2014

Top 10 DV events of 2014


The semiconductor industry, while growing at a modest rate, is at a crossroads to define the next killer category of products. Growth in the phone and tablet segment is bound to follow in the footsteps of the PC industry. IoT (Internet of Things) has been the buzzword, but a clear road map is still fuzzy. Infrastructure, as always, would continue to grow due to the insatiable appetite for faster connectivity and the explosion in the amount of data coming in with cloud computing. Irrespective of the product definitions, the need for increased functionality on a smaller die size with low power and faster time to market at low cost will prevail. To respond to this demand, the DV community has been gearing up in small steps, and some of the important ones were taken in 2014. As the stage gets ready to bid farewell to 2014 and welcome the New Year, let's have a quick recap of these events, in no particular order –

1. UVM 1.2 – The final version of the Universal Verification Methodology was released by Accellera for public review before taking it forward for IEEE standardization. This finally marks an end to the methodology wars and the resulting confusion.

2. PSPWG – While UVM has been an anchor for IP verification and reuse, verifying complex scenarios at the SoC level is still a challenge, particularly when multiple processors are involved and we are looking for reuse across compute platforms. The Portable Stimulus Proposed Working Group kicked off this year, bringing stakeholders from across the industry together to brainstorm on a unified portable stimulus definition as a potential answer to the SoC verification challenge.

3. Formal – Cadence bought Jasper for $170M, claiming the largest formal R&D team on the planet. The move is expected to further help customers, as the best formal technologies from either side are integrated to combine with and complement the simulation and emulation platforms.

4. Verification Cockpit – Another interesting shift seen with the major EDA vendors is the pulling of all verification solutions under one umbrella: simulation, emulation, prototyping, formal, coverage, debug, VIPs etc. This is an important step for future SoCs, wherein the DV engineer can use the best solution for a given problem to achieve faster verification sign-off.

5. Emulation – The wait was finally over, with emulation solutions finding wide adoption across the industry. This year witnessed emulation become no longer a luxury but a necessity to meet product schedules, accelerating verification turnaround and enabling the shift left in the product development cycle.

6. DVCON – This year marked another milestone wherein DVCON expanded its reach, with Accellera sponsoring DVCON India and DVCON Europe, providing excellent platforms for engineers to connect and share their experiments and experiences.

7. AMS – Mentor Graphics acquired Berkeley Design Automation, Inc., bringing in the required flow addressing analog, mixed-signal and RF circuit verification. This move is a step towards integrating a sharply focused solution with the existing expertise at Mentor to address the next-generation verification challenges that will unfold as analog claims a larger share of the silicon.

8. VIP – This year, going with VIP solutions was a no-brainer. The build vs. buy pitch is long gone and the industry has embraced third-party VIPs with open arms. For a change, the non-occurrence of any major event in the VIP domain was itself an event!

9. X prop – Another verification solution that gathered steam this year is the X-prop technology from all vendors, enabling functional Xs to be weeded out of the design at the RTL level. This helped many projects reduce the turnaround time spent in GLS (see the short sketch after this list).

10. Standards – Accellera announced revisions of a few other standards, including the SystemC core language (SystemC 2.3.1), an update to the 2011 standard focusing on transaction-level modelling; the SystemC verification library (SCV 2.0), containing an implementation of the verification extensions; and Verilog-AMS 2.4, which includes extensions to benefit verification, behavioral modelling and compact modelling.
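
To see why functional Xs hide at the RTL level, consider this hand-written sketch (illustrative only, not tied to any vendor's X-prop tool):

module x_optimism (input logic sel, a, b, output logic y);
  // Classic RTL X-optimism: when sel is 1'bx, the if condition evaluates
  // as false and the else branch executes silently. RTL simulation quietly
  // returns b, while GLS would propagate an X onto y and expose the issue.
  always_comb begin
    if (sel) y = a;
    else     y = b;
  end
endmodule

X-prop solutions make RTL simulation behave pessimistically in such cases, flagging the X much earlier than GLS would.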

I hope you had an eventful 2014 with different tools, flows and methodologies. Drop in a comment on what you found most interesting in 2014 as we welcome 2015 and the events that will unfold with time.

Wish you all a Happy and Eventful 2015!

Sunday, October 26, 2014

Verification and Firecrackers

Last week, the festive season in India was at its peak with celebrations everywhere. Yes, it was Diwali - the biggest and the brightest of all festivals.
Diwali aka Deepawali means a 'row of lights' (deep = light and avali = a row). It's the time when every house, every street, every city, the whole country is illuminated with lights, filled with the aroma of incense sticks, delicious food and the sound of firecrackers all around. People rejoice, paying their obeisance to the Gods for showering health, wealth and prosperity. For many, besides the lights & food, it is the crackers that are of supreme interest. The variety ranges from the ones that light up the floor and the sky to the ones that generate a lot of sound. On the eve, while everyone was busy in full ecstasy, an interesting observation caught my attention, set the neurons in my brain connecting it to verification, and resulted in this post.

The Observation

While enjoying, the groups typically get identified based on age or preference for firecrackers. Usually the younger ones are busy with the stuff that sparkles while the elder ones get a kick out of the noisy ones. A few are in transition from light to sound with some basic items. The image below shows a specific type of firecracker, both bundled and dismantled.


[Image: a string of firecrackers, bundled and dismantled. Source: www.shutterstock.com]

The novices were busy with the single ones, bursting one at a time. A hundred such pieces would probably take 50 minutes or so, progressing almost linearly. If one of them didn't burst, they would actually probe it to see whether it still worked, or throw it away. On the other hand, the grownups were occupied with the bundled ones that, once fired, would go on till all of them gave out their light and sound. The more pieces, the longer it would take, but the time was much less than releasing individual pieces. It was hard to tell whether a 100-piece bundle actually resulted in 100 sounds, but overall the effect was better.

The Connection

Observing the above, I could quickly connect it to the learning we have been through as verification engineers. Initially it was directed tests, wherein we would develop one test at a time and get it going. The total time to complete all the tests was quite linear, and for a limited number of tests the milestones were visible. Slowly we incorporated randomness into verification, where a bundle of tests could run in a regression, hitting different features quickly. The run time depends on the number of times the tests run in the random regression. Yes, this also results in redundancy, but the gains are greater, especially when the number of targets is large.

The Conclusion

As for the firecrackers, the novices playing with individual ones may move to the bundles next year – a natural progression! This is an important conclusion, as it mirrors the learning steps for verification engineers too. Knowing SV & UVM is good, but that doesn't make one a verification engineer. A verification engineer needs to have a nose for bugs and must develop code that can hit them fast and hard. This learning is hidden in working on directed tests initially and transitioning to constrained random thereafter. You will appreciate the power of the latter in a better way!

Try extrapolating this to other aspects of verification and I am sure you will find the connections all throughout. Drop in a comment with your conclusions!

Disclaimer: The intention of the blog is not to promote bursting crackers. There are different views on the resulting pollution and environment hazards that the reader can view on the web and make an individual choice.

Sunday, October 12, 2014

Moving towards Context Aware Verification (CAV)

The race between the predictions and the achievement of Moore's law has had a multi-fold impact on the semiconductor industry. Reuse has come to the rescue, from both the design and the verification viewpoint, to help teams achieve added functionality on a given die size. This phenomenon led to the proliferation of the IP & VIP market. Standardization of interfaces further enables this move by shifting product differentiation towards architecture and a limited set of proprietary blocks. To enable continued returns and cater to different application segments, the IP needs to be highly configurable.

Verifying such flexible IP is a challenge; integrating it for a given application and ensuring that it works further complicates the problem. Given that verification already claims the majority of the design cycle effort, it is important to optimize the resources, be it tool licenses, the simulation platform or the engineers' bandwidth. A focused attempt is required to ensure that every effort converges towards the end application, i.e. the context in which the design would be used, irrespective of the flexibility that the silicon offers. This is the subject of Context Aware Verification (CAV)!

The verification space has been experiencing substantial momentum on multiple fronts, filling the arsenal of the verification engineer with all sorts of tactics for the challenges coming our way. While these developments are happening independently of each other, they seem to converge towards enabling CAV. Let's take a quick look at some of these techniques –

Traditionally, the test plan used to answer what is to be verified, until constrained random entered the scene and the verification plan also needed to address how to verify and when we are done. Today, verification planner tools enable us to develop an executable verification plan with the flexibility to tag features based on the engineer who owns them, on milestones, on priorities and, above all, on any other custom definition. This customization is useful to club features with reference to a particular context or configuration in which the IP can operate. With this information, the end user of the IP can channelize the effort in a particular direction rather than wandering everywhere, thereby realizing CAV in the larger scheme of things.

Apart from the coverage goals that get defined as part of the vplan, there is a need for a subset of tests that would achieve these goals faster. What this means is that the test definition needs to be –
- Scalable at different levels (IP, sub system & SoC)
- Portable across platforms (constrained random for block level, directed tests for SoC verification & validation) 
- Taggable w.r.t. a given configuration, viz. a set of valid paths that the design would traverse in the context of a given application.
Graph-based verification is a potential solution to all of this. There is a need to standardize these efforts, and to enable discussions in this direction Accellera has initiated the Portable Stimulus Proposed Working Group. Once there is consensus on the stimulus representation, selecting a subset of tests targeting a given configuration would further boost CAV.
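
For a flavor of graph-based stimulus, SystemVerilog's native randsequence construct already lets a test be described as weighted paths through a graph; portable stimulus tools build on far richer notations. A minimal sketch (cfg_dut, do_read, do_write and flush are hypothetical stubs):

module tb;
  task cfg_dut();  $display("configure"); endtask
  task do_read();  $display("read");      endtask
  task do_write(); $display("write");     endtask
  task flush();    $display("flush");     endtask

  initial begin
    repeat (4)
      randsequence (main)
        main    : setup traffic cleanup ;
        setup   : { cfg_dut(); } ;
        traffic : read := 2 | write := 3 ;  // weighted edges of the graph
        read    : { do_read(); } ;
        write   : { do_write(); } ;
        cleanup : { flush(); } ;
      endsequence
  end
endmodule

Restricting the weights and productions per configuration is one way to bias the generated paths towards a given context.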

With design sizes marching north, the simulation platform falls short of achieving verification closure in the given time. A variety of emulation (HW acceleration or prototyping) platforms provide an excellent option to speed up this process based on the design requirements. While the verification teams benefit from simulation acceleration, these boxes also help in early software development and validation. The shift-left approach in the industry is enabling basic bring-up of the OS, and even the running of real-time apps, on these platforms much before the silicon is back. The ability to run the end software on the RTL brings further focus and is an important step towards achieving CAV.

Once all these technologies reach maturity, a combined solution will bring in the required focus in the context of the end application.

As Chris Anderson said – In the world of infinite choice, context – not content – is king!

Our designs with myriad configurations are no different. It is the context that will bring in convergence faster, making the products that follow this flow king!

Saturday, October 4, 2014

DVCON India 2014 : Event Recap!

It's been a week and we are still receiving messages and emails on the success of the first edition of DVCON in India. The campaign kicked off a few months back and the team put in relentless efforts for the success of this event. There was an overwhelming response from the ESL and DV community once the call for abstracts was announced. Panelists in both tracks conducted multiple reviews for every paper, and after long discussions the chosen ones were notified for presentation. The content of the papers was a clear indication of the quality of work and due diligence put in by engineers in this geography. Abstracts were submitted by authors outside India too, and many traveled to present them on Sept 25-26, 2014.

When I reached early in the morning, the corridors, the halls and the stage were all set to kick-start the 2 days filled with learning, sharing and networking. While the (paid) registrations prior to the event indicated the expected numbers, scores of spot registrations further added icing to the cake. Interestingly, the halls were full by 9:30 AM and the program kicked off on time with the lamp lighting ceremony and a welcome note from Umesh Sisodia – Chair, DVCON India 2014. Dr. Walden C. Rhines – CEO, Mentor Graphics mesmerized the crowd with his keynote on 'Accelerating EDA innovation through SoC design methodology convergence'. We couldn't have asked for a better start! Next, Dr. Mahesh Mehendale – CTO, MCU at Texas Instruments offered excellent insight into 'Challenges in the design and verification of ultra-low power "more than Moore" systems'. The mood of the conference was set with the success of the first session. The conference bifurcated from here into ESL and DV tracks, with interesting topics discussed as part of the invited talks. During tea breaks, the lunch hour and the evening cocktails, the long galleries were full of chit-chat between engineers, exhibitors and poster presenters. After lunch, 4 tutorials ran in parallel, with industry pundits talking about technologies in mass adoption and what to look for next. All sessions were jam-packed, with the audience eager to ask questions and carrying that inquisitiveness till the day wrapped up with informal meetings over cocktails.

DAY 2 witnessed the same zeal and enthusiasm. It started off with Ajeetha Kumari – Vice Chair, DVCON India 2014 welcoming the audience, followed by Dennis Brophy – Vice Chairman, Accellera sharing insights on the different working groups within Accellera and inviting engineers to actively participate and contribute. Next was the guru Janick Bergeron – Verification Fellow, Synopsys talking about 'Where is the Next Level of Verification Productivity Coming from?' The morning session wrapped up with a keynote from Mr. Vishwas Vaidya – AGM, Electronics, Tata Motors discussing 'Automotive Embedded Systems: Opportunities and Challenges'. From there on, it was the papers & posters running in 4 to 5 parallel tracks, with engineers sharing the challenges they faced and the solutions they discovered along the way. Yes, there was an award for the same, and the judges were none other than the audience themselves, casting their votes by the end of all sessions. The crowd stayed back, anxious to know the results and to participate in the series of lucky draws that followed. While a wide variety of topics were covered on DAY 2, the results were all in favor of UVM papers, clearly confirming what Dr. Wally Rhines presented as a starting point: that India leads the world in the adoption of SystemVerilog and UVM (based on a recent survey).

By the evening of DAY 2, the corridors, the halls and the stage were silent again: maybe exhausted from the 2-day-long sessions, maybe enjoying the recap of the eagerness shown by 400+ delegates, maybe feeling proud of their contribution to history in hosting the first DVCON in India.

Many congratulations to those who were able to experience the event and be part of this historic moment. Live tweets from the event can be found by searching #DVConIndia on Twitter. The proceedings, photographs and videos of the event will be made available soon on the official website www.dvcon.in. If you missed it this year, make sure you make it to next year's event, which will surely be bigger & better!!!


Yours truly presented an invited talk “...from Nostalgia to Dreams ...the journey of verification” on DAY 1. Stay tuned for a few blogs from that discussion!

Saturday, September 20, 2014

DVCON India : Journey of Verification (preview)

It's hard to tell whether it's been 30 years or more since the initial footsteps were taken on this journey of verification. Whenever or wherever it took off, the caravan continues to grow. For sure it has been an exciting journey so far! The start of DVCON in India this year marks an important milestone for the troops settled in different geographies and connected by this common link we call Verification (Sid'dha-karana in the local language, Hindi). As an acknowledgement of the countless efforts put in by everyone on this journey, and to celebrate the contributions of each verification engineer towards silicon success, yours truly is presenting a talk on the Journey of Verification at DVCON India 2014 on Sept 25 in Bangalore.

The 2-day event touches all aspects of verification, with keynotes from industry luminaries, tutorials from subject matter gurus, invited talks from domain leaders, best-of-the-pick papers/posters, the opportunity to network with the who's who, and extended exhibition hours. In this mixed bag covering all technical areas of system development and DV, I plan to share a recap of everything around verification. The story starts off introducing the basic ingredients of verification and peeps into the need for a verification engineer, a verification team and an arsenal full of tools.

After introducing the star cast, the story moves on, touching areas like directed verification, constrained random, the language wars, the methodology wars, the format wars and the conclusions drawn from them. We quickly review the IP verification of today, aspects of SoC verification and the trends on where we are heading. The story winds up revisiting some myths that I come across while talking to verification engineers and managers as part of my job. I hope this discussion will help you remember those nostalgic experiences, learn something that was always considered implicit, and absorb all of it with humour.


It is an attempt to present my 2 cents on this magical world of Verification that gives us an opportunity to start our hunt for bugs every day and try out different arms in this quest. I hope to see you join us on Sept 25-26 at Bangalore.

- If you haven’t yet registered for DVCON India 2014, here is the link.
- If you are attending DVCON India, let’s meet!
- If you want to share your exciting moments in this journey of verification drop in a comment or email me. Everyone has a story to tell.... let’s share it here!

Suggested Reading -


Sunday, August 17, 2014

Taking a pulse of what's going on in Verification!

Design complexity today is marching forward at an accelerated speed, and its effect on verification is growing by equal leaps and bounds. The world of verification has grown multi-fold in the past 5 years on all fronts, covering what all to verify, how to verify and who all is required to verify. Today, conferences hosted by EDA partners are not just focusing on a simulator, a language, a methodology or a new tool, but instead bringing up discussions on a plethora of topics. It seems that, to address this growth, Cadence decided to have the 2nd day of CDNLive India 2014 mostly targeting the DV community, hosting multiple tracks on verification. With hundreds of footfalls, a mixed bag of papers on all aspects of verification, an extended exhibitor pavilion for partners, and lunch and tea sessions busy with networking, the event was truly in sync with the theme: connect... share... inspire!

A decade back, all such events focussed on sharing the upgrades in tools, particularly the simulator. The technology update included introductions to new features, added language support and improved performance. Today, no one talks about the simulator; instead, everything around it gets integrated under one hood, with features like –

  • A broad VIP catalog with added support for easy bring-up of test benches for IP/SoC, compliance test suites with ready-made coverage/assertions, and a new class of accelerated VIPs that interface easily with the emulators, giving a further boost to overall productivity.
  • Updates to power-aware verification involving multiple tools (formal, simulator & emulator), with support for different power formats/versions, auto-generation of relevant assertions and different aids to ease debugging of these scenarios.
  • Modelling of analog blocks to achieve high performance with electrical accuracy as part of the AMS support, with different languages supporting model development, coverage driven verification to validate these models, and reuse of the environment when integrated with digital blocks.
  • A portfolio of formal apps that can accomplish the job of static checks related to connectivity, power, registers, X-prop, protocol compliance checks etc.
  • Improved support for IPXACT based flows to enable register modelling, register verification and interface connectivity promoting IP reuse with minimum issues.
  • Integrated support for coverage collection and merging across different levels (IP, sub system & SoC) of verification involving different tools or flows used to achieve verification closure.
  • Verification management featuring executable verification plans, regression management, triaging and analysis with different views based on user’s role in the project.
  • Added debug tools and support to view transactions, and to filter or replay back and forth through the simulation logs.
  • Hardware accelerators with improved capacity, performance and features that enable detailed debugging, power aware support, assertions and coverage.
  • Prototyping platforms/emulators and how they enable boot-up of Android etc. much before the silicon arrives.
  • Improved virtual platforms and models, in sync with the above, to enable early software development, thereby shifting the whole product development cycle to the left.
  • Performance analysis of the SoC, confirming architecture stability or assisting in exploring alternatives promptly.

Yes! Verification has evolved into a GODZILLA beyond the control of one HVL, one methodology and one simulator. The rules of the game have changed and the industry is responding faster than ever to this change. Observing all these changes, I recollect the topic of EDA360, introduced by Cadence to the industry back in 2010. While the terminology might have lost its steam, the essence of the idea seems to have been realized quite substantially since then. For those who missed reading about it, please refer to the below posts summarizing EDA360. Believe me, it's worth reading!


Do you agree that the current state is in sync with EDA360? Leave a comment!

If you missed any of these events, don't miss DVCON India 2014 on Sept 25-26. Registrations are now open.

Monday, July 14, 2014

DVCON goes GLOBAL!

The Design and Verification Conference, popularly known as DVCON, is beginning to flex its muscles and move out of Silicon Valley to reach your friendly neighborhood this year. Yes, on top of the already concluded conference at San Jose, California, DVCON will be hosted at Bangalore, India and Munich, Germany in 2014. This blog post is a quick refresher about the event for fellow engineers who have been busy solving the most intricate verification challenges in the trenches, unaware that a platform exists to discuss those puzzles and hear how the community is handling them while sharing their own solutions. It's an avenue promoting collaboration among the design and verification community without a bias towards a particular tool, flow or methodology. It's a forum where everyone, a novice or a guru, a student or a professional, a beginner or an expert, comes together to discuss the takeaways that can be implemented as soon as they hit their workstations. It's a place where you learn, discuss, network, collaborate and connect..... and this time it's HERE (Bangalore & Munich)!

HISTORY of DVCON

It's been 25 years in spirit and 10 years by name for this conference. The origin can be traced back to the late 80s when VHDL started picking up; the user community started meeting twice a year under VUG (VHDL Users Group), leading to a conference by the name VIUF (VHDL International User Forum). Around the same time, Verilog too gained user traction, leading to IVC (International Verilog Conference) in the early 90s. While the two events continued to serve their respective communities, they joined hands in 1997, giving way to the combined IVC/VIUF conference, later termed HDLCon. Finally, in 2003, HDLCon became DVCON, giving it a legacy of 25 years and a brand that has continued to evolve for a decade now. In 2014, it extends its reach to India and Europe.

DVCON India

ISCUG (Indian SystemC User's Group) has for the past 2 years been hosting an event focused on accelerating the adoption of SystemC as an open source standard for ESL design. This platform now morphs into DVCON India, a 2-day conference in Bangalore on Sept 25-26, 2014, running Design Verification and ESL tracks in parallel. In the words of the committee, DVCon India is an attempt to bring a vendor-neutral international Design and Verification conference closer to home for the benefit of the broader engineering community. It is an excellent platform to share knowledge, experience and best practices covering ESL, design & verification for IP and SoC, VIP development, and virtual prototyping for embedded software development and debug. The conference provides multiple opportunities to interact with industry experts delivering keynote speeches, invited talks, tutorials, panel discussions, technical paper presentations, poster sessions and exhibits from ecosystem partners.

Call for abstracts – submission deadline JULY 21st, 2014.
Registrations soon to be opened.

DVCON Europe

A new chapter for the Design and Verification community in Europe starts at Munich, Germany this year on Oct 14-15, 2014. The focus of the conference will be on Electronic System Level (ESL) design, Verification & Validation, Analog/Mixed-Signal, IP reuse, Design Automation, and Low Power design and verification. To know more about what you'll see at the conference, hear it in the words of Martin Barnasconi, General Chair, DVCON Europe here.

Call for abstracts – submission deadline over.
Registrations – Open.

This new move from Accellera Systems Initiative opens up opportunities for the thousands of engineers who have long benefited from the proceedings of DVCON to get involved, contribute and learn for a better tomorrow. As Benjamin Franklin rightly said,

Tell me and I forget.
Teach me and I remember.
Involve me and I learn.

An excellent opportunity for all of us to get involved & learn!


Sunday, June 29, 2014

Shift left and Reuse in verification

SNUG India 2014 was a jam-packed event! All sessions, including the keynotes, papers, tutorials and the design expo, were full, with engineers pouring in in huge numbers. Follow-up questions during the presentations, and the curiosity of engineers at the expo to understand the solutions from partners, were clear indicators of the value these conferences bring, apart from the freebies :)

Papers on the first day of the Verification track focused on the 'Shift Left' approach, viz. confirming that the design is an exact representation of the spec by uncovering issues early in the design cycle. The first 3 papers discussed the involvement of acceleration/prototyping to enable early SW development and validation of the design in parallel with verification. The 4th paper talked about how the Xs of GLS can be simulated at RTL to save time & avoid ECOs. While shifting left helps in achieving the goals faster, it is equally important to deploy tools & flows that improve 'Reuse'. This was the gist of the papers presented on the second day, where users shared their experience with enhanced scalability and reusability across different aspects of the verification paradigm, be it formal, low power, or IP and SoC verification.

‘Shift left’ & ‘Reuse’ are interesting concepts sure to reap wonders when applied in conjunction! Really? Let’s see.


Reuse in verification is achieved predominantly through VIPs and test scenarios. 

Expectations from a VIP are no longer limited to reuse in the context of an IP at block- or SoC-level simulation. There is a need to have VIPs architected in such a way that they can be reused while porting the target design to a prototyping or emulation platform. If it is the system bus, the VIP needs to be partitioned such that the timed portion of the VIP can move into the box while the untimed portion continues to run on the workstation. The SCE-MI protocol enables the required communication between the host and the target box. If the VIP is a model for a peripheral sitting outside the chip, it can either fully reside inside the box, if everything is to be programmed through the corresponding host IP, or else have the same partitioning as the system bus VIP. While the benefits of moving to faster platforms are obvious, the challenges in enabling this transition are multi-fold. Porting the design to a target box demands a struggle with partitioning amidst the complexity arising from multiple clock and power domains. Though the industry has evolved to a large extent on these aspects, architecting and developing verification code that is portable between simulation and emulation is still in its infancy.
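
A rough sketch of this partitioning, with all names hypothetical: the timed BFM is written in a synthesizable subset and compiled into the box, while the untimed proxy stays on the workstation and reaches the BFM only through task calls that the SCE-MI infrastructure transports across the link.

import uvm_pkg::*;
`include "uvm_macros.svh"

class mem_txn extends uvm_sequence_item;       // illustrative transaction
  rand bit [31:0] addr, data;
  `uvm_object_utils(mem_txn)
  function new(string name = "mem_txn"); super.new(name); endfunction
endclass

interface mem_bfm (input bit clk);             // timed side, inside the box
  task automatic write_word(input bit [31:0] addr, input bit [31:0] data);
    // ...drive the pins over the required number of clocks...
  endtask
endinterface

class mem_proxy extends uvm_driver #(mem_txn); // untimed side, on the host
  `uvm_component_utils(mem_proxy)
  virtual mem_bfm bfm;
  function new(string name, uvm_component parent); super.new(name, parent); endfunction
  task run_phase(uvm_phase phase);
    forever begin
      seq_item_port.get_next_item(req);
      bfm.write_word(req.addr, req.data);      // call crosses the host/box boundary
      seq_item_port.item_done();
    end
  endtask
endclass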

VIPs are available off the shelf, but a lot of test development today still happens with a home-grown test plan and test suite, reusing what is provided as part of the VIP. Given that the ultimate verification goal is a functional SoC with minimal probability of a design bug, C-based tests play an important role in achieving it. These tests need to be defined & developed in a planned fashion so as to enable reuse in simulation, in emulation and even for Si bring-up. A well-thought-out strategy can actually enable the first level of test development right at the IP stage, where a model of the processor can be plugged into the IP test bench, enabling early test development. Once this suite is ready, complex cases can be covered either with a detailed use-case test plan and/or by deploying tools that enable test definition and development in an automated manner. With HW & SW teams working in silos, and probably in different geographies, a comprehensive solution that converges the two is quite difficult to achieve.

To realize the above, there is a need to conceive the end picture, enable development of the pieces and finally connect the dots to bring up multiple platforms early enough in the ASIC design cycle.

Sunday, June 15, 2014

Sequences in UVM1.2

This post concludes our date with sequences in UVM with a quick overview of the modifications related to sequences in UVM1.2. UVM, which came out with an EA version in 2010, received mass adoption and has matured by the day since then. The working group is all set with another major release (UVM1.2) that should be out anytime; it was opened up for public review during DVCON 2014. You may want to dabble with this release and share your feedback here. Given the wide adoption and maturity level, the plan further is to take it to the IEEE. Victor from EDA Playground shared interesting insights on what's new in the UVM1.2 release. Given below are 4 important changes related to sequences in UVM1.2 –

#1 Setting default sequence from command line [Mantis – 3741]

A new plusarg has been added to the command line to allow setting the default_sequence that the sequencer executes during a phase. With this change, the user can have one test whose default sequence is modified from the command line, thereby avoiding redundant test development just for changing the default sequence. You can also run regressions by replacing the default sequence with a sequence library (remember, a sequence library is derived from uvm_sequence).

How was this done in UVM1.1?

uvm_config_db#(uvm_object_wrapper)::set(this, "<seqr_path>.<phase>", "default_sequence", <sequence>::type_id::get());

Additional control in UVM1.2?

+uvm_set_default_sequence=<seqr>,<phase>,<sequence>

How does it affect current code?

Existing code works without change when you move to UVM1.2. Users are free to choose either of the above; if both are present in the code, the result may be a function of the phase.
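
For illustration only (the testbench path and sequence names here are hypothetical), the same test can now drive different stimulus across a regression purely from the command line:

<simulator> +UVM_TESTNAME=base_test +uvm_set_default_sequence=uvm_test_top.env.agt.sqr,main_phase,sanity_seq
<simulator> +UVM_TESTNAME=base_test +uvm_set_default_sequence=uvm_test_top.env.agt.sqr,main_phase,random_seq_lib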

#2 Guarded starting_phase [Mantis – 4431]

The uvm_sequence_base::starting_phase variable is now protected and accessible only via the set_starting_phase() and get_starting_phase() methods. Once the variable has been retrieved using the get method, it can no longer be set. The primary reason for this update was that modifications to the starting_phase variable in the code could lead to errors that were hard to debug.

How was this done in UVM1.1?

The starting_phase variable was accessed directly to set the phase, compare phases, and raise & drop objections.

How does it work in UVM1.2?

Use the pre-defined methods set_starting_phase() & get_starting_phase().
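
A before/after sketch of the typical objection idiom inside a sequence:

// UVM1.1 style: direct access to the variable
task pre_start();
  if (starting_phase != null)
    starting_phase.raise_objection(this);
endtask

// UVM1.2 style: access through the guarded methods
task pre_start();
  uvm_phase p = get_starting_phase();
  if (p != null)
    p.raise_objection(this);
endtask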

How does it affect current code?

While moving to UVM1.2, if the code uses the starting_phase variable directly, one can expect compile errors. The Accellera UVM team has been generous enough to provide a script as part of the release that searches for the starting_phase variable and replaces it with the pre-defined methods, based on how the variable is used in the code.

#3 Automatic raise & drop objection [Mantis – 4432]

The set_automatic_phase_objection() method has been added to raise and drop objections based on the starting_phase of the sequence, thereby avoiding manual code. This change is mainly for ease of use.

How was this done in UVM1.1?

starting_phase.raise_objection(this);
starting_phase.drop_objection(this);

How does it work in UVM1.2?

set_automatic_phase_objection(1); 
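
The method must be called before the sequence starts, so the constructor is a natural place for it; a minimal sketch (names illustrative):

class my_phase_seq extends uvm_sequence #(my_txn);
  `uvm_object_utils(my_phase_seq)
  function new(string name = "my_phase_seq");
    super.new(name);
    set_automatic_phase_objection(1);  // raise/drop now handled automatically
  endfunction
endclass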

How does it affect current code?

The existing code would need modification. As mentioned above, use the script provided by the Accellera UVM team as part of the release.

#4 enum changes [Mantis – 4269]

A lot of verification code today involves a mix of different languages, methodologies and code developed by different teams. In UVM1.1, the enum values related to sequences didn't have a unique signature, and this led to compilation errors when uvm_pkg was wildcard-imported into a scope that already had enum declarations with the same names. To avoid this, those enum values are now prefixed with the UVM_ string.

How was this done in UVM1.1?

The UVM_ prefix was missing in the enum values of uvm_sequence_state_enum and uvm_sequencer_arb_mode.

What changed in UVM1.2?

uvm_sequence_state_enum and uvm_sequencer_arb_mode enum values now have UVM_ prefix
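
For example (a sketch; seq, seqr and ok are illustrative handles):

// UVM1.1:
seqr.set_arbitration(SEQ_ARB_RANDOM);
ok = (seq.get_sequence_state() == FINISHED);

// UVM1.2:
seqr.set_arbitration(UVM_SEQ_ARB_RANDOM);
ok = (seq.get_sequence_state() == UVM_FINISHED);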

How does it affect current code?

The existing code would give compile errors and you would need to update the values manually.

So when should you move to UVM1.2?

If you are not an early adopter and don't enjoy playing with a new release, you can hold on for some time; otherwise go ahead, download the release and give it a shot with your code.

Recommended reading -

Sunday, April 20, 2014

Sequence Library in UVM

Looking into the history of verification, we learn that the test bench and test cases came into the picture when RTL was represented in the form of Verilog or VHDL. As complexity grew, there was a need for another pair of eyes to verify the code and release the designer from this task that continues to transform into a humongous problem. The slow and steady pace of directed verification couldn't cope with the rising demand, and constrained random verification (CRV) pitched in to fill the gap. Over the years, the industry played around with different HVLs and methodologies before settling down on SV-based UVM. The biggest advantage of CRV was the auto-generation of test cases to create scenarios that the human mind couldn't comprehend. Coverage driven verification (CDV) further complemented CRV in terms of converging this unbounded problem. In the struggle to hit the user-defined coverage goals, verification teams sometimes forget the core strength of CRV, i.e. finding hidden bugs by running random regressions. UVM provides an effective way to accomplish this aim through the use of a sequence library.

What is Sequence Library?

A sequence library is a conglomeration of registered sequence types derived from uvm_sequence

A sequence, once registered, shows up in the sequence queue of that sequence library. Reusability demands that a given IP be configurable so as to plug seamlessly into different SoCs catering to varied applications. The verification team can develop multiple sequence libraries to enable regressions for various configurations of the IP. Each of these libraries can be configured to execute sequences any number of times, in different orders, as configured by the MODE. The available modes in UVM include –

UVM_SEQ_LIB_RAND  : Randomly select any sequence from the queue
UVM_SEQ_LIB_RANDC : Randomly select from the queue without repeating till all sequences are exhausted
UVM_SEQ_LIB_ITEM  : Execute a single sequence item
UVM_SEQ_LIB_USER  : Call the select_sequence() method, whose definition can be overridden by the user

Steps to set up a Sequence Library

STEP 1 : Declare a sequence library. You can declare multiple libraries for a given verification environment.
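
A minimal sketch, assuming a hypothetical transaction type my_txn:

class my_seq_lib extends uvm_sequence_library #(my_txn);
  `uvm_object_utils(my_seq_lib)
  `uvm_sequence_library_utils(my_seq_lib)

  function new(string name = "my_seq_lib");
    super.new(name);
    init_sequence_library();  // populates the queue with registered sequences
  endfunction
endclass
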
STEP 2 : Add user defined & relevant sequences to the library. One sequence can be added to multiple sequence libraries.
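
One common way (a sketch continuing the hypothetical names above) is the `uvm_add_to_seq_lib macro inside the sequence definition; a sequence can also be added procedurally via my_seq_lib::add_typewide_sequence(my_seq::get_type()).

class my_seq extends uvm_sequence #(my_txn);
  `uvm_object_utils(my_seq)
  `uvm_add_to_seq_lib(my_seq, my_seq_lib)  // registers this sequence with the library

  function new(string name = "my_seq");
    super.new(name);
  endfunction

  virtual task body();
    // ...generate and send items...
  endtask
endclass
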
STEP 3 : Select one of the 4 modes given above based on the context, i.e. run fully random, run a basic sequence item for sanity testing, or use a user-defined mode as applicable, e.g. a set of sequences that exercise the part of the code where a bug was fixed.
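
A sketch of mode selection, either directly on a library instance or, for a library installed as the default sequence, via the config DB (paths hypothetical):

my_seq_lib lib = my_seq_lib::type_id::create("lib");
lib.selection_mode = UVM_SEQ_LIB_RANDC;  // every sequence once before any repeats

// or, from the test:
uvm_config_db #(uvm_sequence_lib_mode)::set(this,
    "env.agt.sqr.main_phase", "default_sequence.selection_mode", UVM_SEQ_LIB_RAND);
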
STEP 4 : Select the sequence library as the default sequence for a given sequencer from the test. This depends somewhat on STEP 3, i.e. the context.
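
From the test, the library is installed just like any default sequence (sequencer path hypothetical):

uvm_config_db #(uvm_object_wrapper)::set(this,
    "env.agt.sqr.main_phase", "default_sequence", my_seq_lib::get_type());
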
Advantage

A sequence library is one of the most simple and effective, yet sparingly used, mechanisms in UVM. A user can plan to use a sequence library, switching modes to achieve sanity testing, mini regressions and constrained random regressions from the same test. Further, the library helps in achieving the goal of CRV in terms of generating complex scenarios by calling sequences randomly, thereby finding those hidden bugs that would otherwise show up in SoC verification or in the field.

As Albert Einstein rightly said, "No amount of experimentation can ever prove me right; a single experiment can prove me wrong". It is important to run those experiments randomly enough so as to improve the probability of hitting that single experiment which proves that the RTL is not an exact representation of the specification. The sequence library in UVM does this effectively!!!


Previous posts -