Sunday, October 18, 2015

The magical chariot in verification!

As the holiday season kicked off in India, my brain's antennas twitched to intercept what is going on around me and relate it to verification. In the past, I have made an attempt to correlate off-topic subjects with verification around this time every year. Dropping the mainstream sometimes helps as it gives you a different perspective and a reason to think beyond normal, to think out of the box, to see solutions in a different context and apply them to yours, in this case verification! The problem statement being – driving verification closure with growing complexity and shrinking schedules.
 
Before I move forward, let me share the context of these holidays. This is the time in India when festivities are at their peak for a few weeks. Celebration is in the air and the diversity of the culture makes it even more fascinating. People from all over India celebrate this season and relate it to various mythological stories while worshipping different deities. The common theme across them is that in the war between good and evil, good finally prevails! What is interesting, though, are the different stories associated with each culture detailing these wars between good and evil. As evil grows and good evolves to fight it, both tend to acquire different weapons to attack as well as defend. And when the arsenal at both ends is equally matched, the launch-pad becomes a critical factor in deciding the outcome. Possibly, that is another reason why different deities ride different animals and why some of these stories talk about those magical chariots that made all the difference to the war.
 
So how does this relate to verification?
 
As verification engineers, our quest for bugs amidst growing complexity has made us acquire different skills. We started off with directed verification using HDLs/C/scripts and soon moved to constrained random verification. Next we picked up different coverage metrics, i.e. functional coverage, code coverage and assertions. As we marched further, we added formal apps to take care of the housekeeping items that every project needs. A new tool or flow gets added almost every couple of years, in line with Moore's law :). And now if we look back, the definition of verification as a non-overlapping concern (functional only) in the ASIC design cycle a few decades ago is all set to cross paths with the then-perceived orthogonal concerns (clock, power, security and software). While we continue to add a new flow, tool or methodology for each of these challenges that are rocking the verification boat, what hasn't changed much in all these years is the platform that verification teams continue to use. Yes, new tools and techniques are required, but are these additions bringing the next leap that is needed or are they just coping with the task at hand? Is it time to think different? Time to think beyond normal? Time to think out of the box? And if YES, what could be a potential direction?
 
This is where I come back to the mythological stories wherein, when the arsenal wasn't enough, it was the magical chariot that did the trick! Yes, maybe the answer lies in bringing the change in the platform – our SIMULATORS – the workhorse of verification! Interestingly, the answers do not need to be invented. There are alternate solutions available in the form of VIRTUAL PROTOTYPING or using HARDWARE ACCELERATORS/EMULATORS for RTL simulations. Adopting these solutions would give an edge over both the bugs causing the menace as well as the competition! And for those who think it is costly to adopt, a lost market window for the product could be even costlier!!!
 
As Harry Foster mentioned in his keynote at DVCon India 2015 – it's about time to bring in the paradigm shift from "increasing cycles of verification" TO "maximising verification per cycle". He also quoted Henry Ford, the legend who founded the Ford Motor Company and revolutionized transportation and American industry.
 
 
On that note, I wish you all a Happy Dussehra!!! Happy Holidays!!!


If you liked this post you would love -
Mythology & Verification
HANUMAN of verification team!
Trishool for verification
Verification and firecrackers

Friday, September 25, 2015

DVCON India turned 2!!!

Nothing beats nurturing a seed, an infant, an idea or an EVENT; watching it grow and clocking the milestones of its achievements. For many of us, the bright sunny day of Sept 10 brought the same feeling. Yes, this month DVCon India turned 2!
 
Sponsored by Accellera, the conference expanded to India last year. It is an excellent platform for the design and verification community working at IP, SoC and system level to discuss problems and alternate solutions and to contribute to standards. The ecosystem today has multiple EDA-driven forums showcasing the right and optimal usage of their respective tools. DVCon, being vendor neutral, focuses on the need for standards in languages and methodologies to overcome the challenges introduced by rising complexity, while emphasizing the right way of applying these standards.
 
History of DVCon
The history of DVCon can be traced back to the late 80s when VHDL users met twice a year under the name VUG. By the early 90s, it became an annual event called VIUF. Around the same time Verilog users also gathered annually for IVC. In the late 90s, these two events joined hands to form HDLCon. In 2003, it was re-branded as DVCon. Based on these facts, DVCon US has actually been serving the community for 25+ years. In 2014, it expanded globally to India & Europe.
 


The two-day conference was held at the Leela Palace, Bangalore on Sept 10-11, 2015. Riding on the success of DVCon India 2014, this year was planned to be bigger and better!!! A change in venue, a modified program for higher interaction among the participants, the addition of a Gala dinner and higher quality of content were the key highlights of this year's event. The program was put together keeping in mind the 4 Cs: Contribute, Collaborate, Connect & Celebrate - a clear reflection of the spirit of DVCon.
 
DAY 1
 
A packed hall with ~600 participants witnessing the lamp lighting ceremony was a clear indication of the enthusiasm that was set to unfold. Yours truly opened DAY 1, introducing the program and underlining the message that DVCon is all about active participation! Harry Foster from Mentor Graphics delivered the opening keynote 'From Growing Complexity to Faster Horses', citing interesting facts about the trends in design and verification. Vinay Shenoy shared an excellent insight as part of the invited keynote 'Perspective on Electronics Ecosystem in India', covering the history and initiatives under the 'Make in India' campaign. The rest of the day kept everyone busy with invited talks from subject matter experts, panels on upcoming technologies and tutorials around standards. The exhibitors kept the crowd involved throughout, sharing potential solutions to the challenges faced. Having been drenched in a rich rain of technical content throughout the day, it was time for some fun in the evening. The crowd came together to celebrate 10 years of SystemVerilog as a standard and the IEEE standardization of UVM. Amidst applause, pranks, music, dance and illusions, DAY 1 concluded with tweets and chirping over dinner & drinks.
 
DAY 2
 
An extended DAY 1 didn't stop the participants from shifting gears back to technical mode on DAY 2, with Ajeetha Kumari opening the day followed by Dennis Brophy sharing an overview of Accellera. Manoj Gandhi from Synopsys delivered the opening keynote 'Propelling the Next Generation of Verification Innovation', discussing how design and verification challenges have progressed and the need of the hour. Atul Bhatiya took the stage next as an invited keynote speaker, talking about 'Opportunities in Semiconductor Design in India' and encouraging the audience to envision and jump to where the ball would be rather than running after it. The rest of the day hosted different tracks of papers and posters shortlisted by the Technical Program Committee. By the evening, overwhelmed with the discussions, solutions and networking opportunities, the junta assembled again to appreciate the efforts put in by members of the DVCon India committees and congratulate the winners of the Best Paper & Poster awards!
 
As Day 2 concluded, the team that had put in stretched hours for almost a year was overjoyed with the grand success of the event. Those relentless efforts paid off well in taking the conference to the next level. Yes! The nurturing all these days, witnessing the growth and marking the achievement of DVCon India 2015 was all worth it!
 
 

Tuesday, August 11, 2015

101 with Richard Goering : The technical blogging guru

Richard Goering - retired EDA editor
The digital world has connected people across geographies without their meeting or talking in person. It is interesting to see the cross-pollination of ideas, thoughts and mentoring that travels across boundaries, flying on the wings of this connected world. The bond developed when connected on these platforms is no less than a real one. I happen to have such a bond with Richard Goering, having been a religious follower of his technical articles for more than a decade. So when Richard announced his retirement, I requested an interview with him to be published on this blog. Humble as he has been all these years, he accepted the request, and what follows is a short interview with the blogging guru whom I admire a lot for his succinct yet comprehensive posts.
 
Q: Richard, please share a brief introduction to your career?
 
I have always been a writer. I graduated from U.C. Berkeley with a degree in journalism in 1973. In 1974, living in what was to become Silicon Valley, I worked for a long-dead publication called Northern California Electronics News. I wrote an article that described electron beam lithography as the “next big thing” in semiconductor manufacturing. Today this technology is still emerging.
 
In the early 1980s I was a technical writer in Kansas City, Missouri for a company that made computer-controlled bare board testers. I took classes at the University of Missouri in Fortran, Pascal, and assembly language. I still remember going to the campus computer centre with a stack of punched cards, hoping that one error wouldn’t keep the whole program from compiling.
 
In 1984 I joined the staff of Computer Design magazine, and wrote several articles about test. Shortly afterwards I was asked to go cover a new area called “CAE” (computer-aided engineering). This was, of course, the discipline that became “EDA” and I have written about it ever since. I was the EDA editor for Computer Design (4 years) and then for EE Times (17 years). I worked for Cadence, primarily as a blogger, for the past 6 years.
 
Q: When did you realize it’s time to start blogging and why?
 
I actually had a blog during my final years at EE Times, which ended in 2007. At Cadence I wrote the Industry Insights blog. Today there are few traditional publications left, especially in print, and it appears that blogs are a primary source of information for design and verification engineers.
 
Q: What are the three key disruptive technologies you observed that had a high impact on the semiconductor industry?
 
From an EDA perspective, the most significant change was the move from gate-level schematics to RTL design with VHDL or Verilog. This move provided a huge leap in productivity. It also allowed verification engineers to work at a higher level of abstraction. Looking more closely at verification, there was a shift from directed testing to constrained-random test generation. This came along with coverage metrics, executable verification plans, and languages such as “e” from Verisity. I think a third disruptive technology is emerging just now – it’s the importance of software in SoC design, and the need for software-driven verification.
 
Q: When did you start hearing the need for a verification engineer in the ASIC design cycle?
 
I think this goes back many years. Most chip design companies have separate verification teams. Nowadays there’s a need for design and verification engineers to work more closely together, and for designers to do some top-level verification, often using formal or static techniques.
 
Q: Please share your experiences with the evolution of verification?
 
At EE Times, I wrote about many new verification companies and covered key product announcements. At Cadence I was more focused on Cadence products, but I continued to cover DVCon and other verification related industry events. 
 
Q: Do you believe that today verification accounts for 70% of the ASIC design cycle efforts?
 
I think we must be very careful with statements such as these. The question is, 70% of what? Are we looking at the entire ASIC/SoC design cycle, from software development through physical design? Or are we considering just “front end” hardware design? Are we talking about block-level verification or looking at the whole SoC and the integration between IP blocks? The 70% claim is about marketing, not engineering.
 
Q: What are the key technologies to look forward for in near future?
 
I think you’re going to see software-driven verification methodologies that employ “use case” testing. The idea here is to specify system-level verification scenarios that involve use cases, and to automatically generate portable, constrained-random tests. The tests are “software driven” because they can be applied through C code running on embedded processor models. Another emerging concept is the “formal app.” A formal app is an automated program that handles a specific task, such as X state propagation. Today most providers of formal verification offer formal apps.
 
Q: What is it that you would miss about our industry the most?
 
EDA is a dynamic industry. There is always something new and exciting. I will miss the constant innovation and the spirit that drives it.
 
Q: Words of wisdom to the readers?
 
Don’t be afraid to try something new. Increasing chip and system complexity will drive the need for more productive design and verification methodologies. Job descriptions will change as software, hardware, analog, digital, and verification engineers all need to work more closely together.
 
Thank you Richard for your answers.
 
Your writings have helped in spreading the technology and inspired many of us to do it ourselves too. Wish you happiness and good health!!!

Sunday, April 19, 2015

Moore's law - A journey of 50 years

50 years of innovation! 50 years of quest with complexity! 50 years of Moore's law! Yes, April 19th is an important date for the semiconductor industry. It was on this date in 1965 that a paper was published citing Gordon Moore's observation - the number of transistors on a given silicon area would double roughly every two years. The observation turned out to be a benchmark and later a self-fulfilling prophecy that is chanted by everyone, whether an aspirant wanting to be part of this industry or a veteran who has worked all throughout since the time when the law was still an observation! I myself remember my first interview as a fresh grad where I was asked the definition & implication of this law. It would be no surprise if a survey were done on the one name that people in this industry have read, heard or uttered the most in their careers, and the result came out unanimously as MOORE!
The below infographic from Intel would help you appreciate the complexity that we are talking about –


In this pursuit of doubling the number of transistors, the industry experienced major shifts. Let's have a look at the notable ones that had a major impact -
Birth of the EDA industry – As the numbers grew it became difficult to handle the design process manually and there was a need to automate the pieces. While the initial work along these lines happened inside the design houses, it was soon realized that re-inventing the wheel and maintaining proprietary flows without considerable differentiation to the end products wasn't very wise. This led to the birth of the design automation industry that today happens to be the lifeline of the SoC design cycle.
Birth of the fabless ecosystem – The initial design houses had the muscle to manufacture the end product while allowing some contract manufacturing for the smaller players. This setup had its own set of issues discouraging startups. Also, maintaining the existing node while investing in R&D for next-gen nodes was unsustainable. It was only in the late 80s, when Morris Chang introduced the foundry model, that the industry realized fabless was a possibility. Since then, all stakeholders of the ecosystem have collaborated towards realizing Moore's law.
Reuse – As transistor counts scaled, design turnaround time should have increased; to keep a check on it, reusability was adopted. This reuse was introduced at multiple levels. Different consortiums came forward to standardize design representations & hand-offs, and these standards helped promote reuse across the industry. Next was design reuse in the form of IPs. For standard protocols the IPs are reused across companies, while for proprietary ones reuse within the organization is highly encouraged. Reuse has played a significant role in sustaining the pace that Moore's law suggests.
Abstraction – When the observation was made, designs were still at the transistor level and layouts were done manually. To sustain the rising complexity, the industry moved to the next level of abstraction, i.e. logic gates, followed by the Register Transfer Level where the design is represented in HDLs and synthesized to gates. Today the industry is already talking about a still higher-level synthesizable language.
Specialization – The initial designs didn't require the variety of skill sets that is needed today. Given the evolution of the design cycle and the quantum of responsibility at every stage, there was a need to bring in specialists in each area. This led to RTL designers, verification engineers, gate-level implementation engineers and layout engineers. Today the overall team realizing a complex SoC runs into hundreds of engineers with varied skill sets, spanning EDA, foundry, reuse & abstraction.
Throughout these 50 years, there were many times when experts challenged the sustainability of Moore's law. Most of them had a scientific rationale endorsing their argument. However, the collective effort of the industry was always able to find an answer to those challenges – sometimes through science, sometimes through logic and sometimes through sheer conviction!
Long live Moore’s law!

Sunday, March 22, 2015

Is Shift Left just a marketing gimmick?

This year, DVCON in the US was a huge success, hosting 1200+ visitors busy connecting, sharing & learning! With the UVM adoption rate stabilizing, the talk of the event this year was 'Shift Left' – a discussion kicked off in a keynote by Aart J. De Geus, CEO of Synopsys. The reason for the interest generated is that there are gurus preaching it to be the next big thing and there are pundits predicting it to be a mere marketing buzzword. In reality, both are correct!

The term 'Shift Left' is relatively new and is interesting enough to create a buzz around the industry. Without the buzz there is no awareness, and without awareness, no adoption! However, the phenomenon itself, i.e. squeezing the development cycle aka 'Shift Left' for faster time to market, has been around for more than a decade.

In the 90s, hundreds of team members worked relentlessly for years to tape out 1 chip & were flown to destinations like Hawaii to celebrate it. Today this is no longer heard of, because every organization, or for that matter even the captive centres themselves, are taping out multiple chips per year. The celebration got squeezed down to a lunch/dinner - probably indicating a 'Shift Left' in celebrations too :)

Back in the 90s, the product was HW centric and the so-called ASIC design cycle was fairly simple owing to its sequential nature, where the next stage starts once the earlier one is done. The industry saw this as an opportunity and started working towards tools & flows that could bring in efficiency by parallelizing the efforts. The introduction of constrained random verification allowed verification efforts to proceed in parallel to RTL development, thereby stepping left. Early RTL releases to the implementation team helped parallelize the efforts towards floor planning, placement, die size estimation, package design etc. Reuse of IPs, VIPs, flows, methodologies etc. gave a further push, enabling an optimized design cycle. These efforts brought in the first level of the now-called 'Shift Left' in the design cycle.

In the later part of the last decade, 2 observations became evident to the industry -
1. The product is no longer HW alone but a conglomeration of HW & SW, with the latter adding further delays to the overall product development cycle.
2. The efficiency achieved from parallelism is limited by the longest pole among the divided tasks. In the ASIC design cycle, verification happens to be that pole, gating any further squeeze in the cycle.

This became the next focus area, and today, given that the solutions have reached some level of maturity, the buzzword that we call 'Shift Left' has finally found an identity! The key ideas that enable this shift left include –

- Formal APPS enabling faster, targeted verification of well-defined facets of any design. The static nature of the solution, wrapped up in the form of APPS, has piqued the interest of the design community to contribute to verification productivity by cleaning up the design before mainstream verification starts (a small sketch of such a targeted check follows this list). This leads to another buzzword, DFV - 'Design for Verification'.

- FPGA prototyping has always been there, but each organization was spending time & effort to define & develop its own prototyping board. Today, off-the-shelf solutions give the desired jump start to the prototyping process, enabling early SW development once the RTL is mature.

- To improve the speed of verification, hardware accelerators aka emulation platforms were introduced, and these solutions opened the gates for early software development even before the RTL freeze milestone.

- The improvement in speed with a higher level of abstraction was evident when the industry moved from gate level to RTL. The next move was planned with transaction level modelling. While high level synthesis is yet to witness mass adoption, its extension resulted in virtual prototyping platforms enabling architecture exploration, HW-SW partitioning and early SW development even before RTL design/integration starts.
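To make the formal APPS idea a little more concrete, here is a minimal SystemVerilog sketch of the kind of targeted check such flows revolve around: a simple req/ack handshake property that a formal tool can try to prove, or that can be reused as a simulation assertion, to clean the design before mainstream verification starts. The signal names and the 8-cycle bound are assumptions made purely for illustration, not tied to any particular tool or design.

// Hypothetical handshake checker for a req/ack interface
module req_ack_checker (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic ack
);

  // Once raised, 'req' must stay high until 'ack' arrives
  property p_req_stable_until_ack;
    @(posedge clk) disable iff (!rst_n)
      req && !ack |=> req;
  endproperty

  // 'ack' must follow a new 'req' within 8 cycles (the bound is an assumption)
  property p_ack_within_8;
    @(posedge clk) disable iff (!rst_n)
      $rose(req) |-> ##[1:8] ack;
  endproperty

  a_req_stable : assert property (p_req_stable_until_ack);
  a_ack_bound  : assert property (p_ack_within_8);

endmodule

The same properties, written once, can be exercised by both the formal engine and the simulator, which is what makes this style of check attractive for the DFV mindset mentioned above.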

In summary, the product development cycle is getting refined by the day. The industry is busy weeding out inefficiencies in the flow, automating everything possible to improve predictability and bringing in the required collaboration across the stakeholders for realizing better, faster & cheaper products. Yes, some call it the great SHIFT LEFT!

Sunday, February 22, 2015

DVCON : Enabling the O's of OODA loop in DV

Today the traditional methods of verifying a design struggle with complexity that keeps growing in leaps and bounds. We discussed in our last post why a new pair of eyes was added to the design flow, and since then we continue not only to increase the number of those pairs but also to improve the lenses that these eyes use to weed out bugs. These lenses are nothing but the flows and methodologies we have been introducing as the design challenges continue to unfold. Today, we have reached a point in verification where 'one size doesn't fit all'. While the nature of the design commands a customized process of verifying it, even for a given design, moving from block to sub-system (UVM centric) and on to SoC/top level (directed tests), we need to change the way we verify the scope. Besides the level, there are certain categories of functions that are best suited to a particular way of verification (read formal). Beyond this, modelling the design and putting a reusable verification environment around it to accelerate development is another area that requires attention. With analog sitting next to digital on the same die, verifying the two together demands a unique approach. All in all, for the product to hit the market window at the right time, you cannot just verify the design; you need to put in place a well-defined strategy to verify it in the fastest and best possible fashion.

So what is the OODA loop? From Wikipedia: the phrase OODA loop refers to the decision cycle of observe, orient, decide, and act, developed by military strategist and USAF Colonel John Boyd. According to Boyd, decision making occurs in a recurring cycle of observe-orient-decide-act. An entity (whether an individual or an organization) that can process this cycle quickly, observing and reacting to unfolding events more rapidly than an opponent, can thereby "get inside" the opponent's decision cycle and gain the advantage. The primary application of the OODA loop was at the strategic level in military operations. Since the concept is core to defining the right strategy, its application base continues to grow.

To some extent, this OODA loop entered the DV cycle with the introduction of constrained random verification paired with coverage driven verification closure. Constrained random regressions kick off the process: observe the gaps, orient by analyzing the holes, decide if they need to be covered, and act by refining the constraints further so as to direct the simulations to cover the holes.
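As a minimal SystemVerilog sketch of that loop (the class, field and bin names below are assumptions made purely for illustration), a covergroup does the observing, the coverage report drives the orienting and deciding, and a refined constraint in a derived class is the act that steers the next regression towards the hole:

class packet;
  rand bit [7:0] length;
  constraint c_legal { length inside {[1:255]}; }

  // Observe: functional coverage on packet length buckets
  // (the monitor would call cg_len.sample() for every observed packet)
  covergroup cg_len;
    coverpoint length {
      bins small  = {[1:15]};
      bins medium = {[16:127]};
      bins jumbo  = {[128:255]};  // suppose the regression report shows this bin is rarely hit
    }
  endgroup

  function new();
    cg_len = new();
  endfunction
endclass

// Act: bias stimulus towards the uncovered 'jumbo' bucket in the next regression
class jumbo_biased_packet extends packet;
  constraint c_bias { length dist { [1:127] := 1, [128:255] := 4 }; }
endclass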

Today, the need for applying the OODA loop is at a much higher level, i.e. to strategize on the right mix of tools, flows and methodologies to realize a winning edge. The outcome depends highly on the 2 O's, i.e. Observe & Orient. In order to maximize returns on these O's, one must be AWARE of –
1. The current process that is followed
2. Pain points in the current process and anticipated ones for the next design
3. Different means to address the pain points

Even to address the first 2 points mentioned above, one needs to be aware of what to observe in the process and how to measure the pain points. While the EDA partners try to help out in the best possible way, enabling the teams with the right mix, it is important to understand what kind of challenges are keeping others in the industry busy and how they are solving these problems. One of the premier forums addressing this aspect is DVCON! Last year, DVCON extended its presence from the US to India and Europe. These events provide a unique opportunity to get involved and, in the process, connect, share and learn. Re-iterating the words of Benjamin Franklin once again –

Tell me and I forget.
Teach me and I remember.
Involve me and I learn.

So this is your chance to contribute to enabling the fraternity with the Os of OODA loop!

Relevant dates -
DVCON US – March 2-5, San Jose
DVCON India – September 10-11, Bangalore
DVCON Europe – November 11-12, Munich



Sunday, February 8, 2015

Designers should not verify their own code! REALLY?

Around 2 decades back, the demands on designs were relatively simple and the focus was on improving performance. Process nodes had longer lives, power optimization wasn't even discussed and time-to-market pressure was relatively low given that the end products enjoyed long lives. In those days, it was the designer who would first design and later verify his own code, usually using the same HDL. Over the years, as complexity accelerated, a new breed of engineers entered the scene, the DV engineers! The rationale given was that an independent pair of eyes is needed to confirm that the design meets the intent! Verification was still sequential to design in the early days of directed verification. Soon, there was a need for constrained random verification (CRV) and additional techniques to contain the growing verification challenge. Test bench development now started in parallel to the design, increasing the size of and need for verification teams further. With non-HDLs, i.e. HVLs, entering the scene, the need for DV engineers was inevitable. All these years, the rationale of having an additional pair of eyes continued to be heard, to the extent that we have started believing that designers should not verify their own code.

In my last post I emphasized the need for collaboration, wherein designers and verification engineers need to come together for faster verification closure. Neil Johnson recently shared his conclusions in a post on designers verifying their own code. My 2 cents on whether designers should not, or shall I say do not, verify their own code –

To start with, let's look at what all verification involves. The adjoining figure is a summary of the effort spent in verification, based on the study conducted by Wilson Research in 2012 (commissioned by Mentor). Clubbing some of the activities together, it is clear that ~40% of the time is spent on test planning, test bench development and other activities. The remaining ~60% of the effort is spent on debug and creating + running tests. The DUT here can be an IP or an SoC.

When an IP is under development or an SoC is getting integrated, the DV engineers are involved in the 40% of activities mentioned above. These are the tasks that actually fall in line with the statement of an additional pair of eyes validating design intent. They need to understand the architecture of the DUT, come up with a verification plan, and develop the verification environment and the hooks to monitor progress. At this level, the involvement of the design team starts with activities like test plan reviews, code coverage reviews, and inputs on corner cases and tests of interest.

So, once the design is alive on the testbench, do the designers just sit & watch the DV team validate the representation of the spec they developed? NO!

Debugging alone is a major activity that consumes an equal amount of effort, or sometimes more, from the designers to root-cause a bug. Apart from it, there is significant involvement of the design team during IP & SoC verification.

For IPs, CRV is the usual choice. The power of CRV lies in automating test generation using the testbench. A little additional automation enables the designers to generate constrained tests themselves (a small sketch follows this paragraph). Assertions are another very important aspect of IP verification. With the introduction of assertion synthesis tools, the designers work on segregating the generated points into assertions or coverage. For SoCs, apart from the reuse of CRV, directed verification is an obvious choice. The introduction of new tools for graph-based verification helps designers try out tests based on the test plan developed by the DV engineer. Apart from this, corner case analysis and use-case waveform reviews are other time-consuming contributions put in by designers towards verifying the DUT.
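As a small, hypothetical sketch of what "designers generating constrained tests themselves" can look like (the class and field names below are assumptions, not from any specific environment), the designer simply layers a corner-case constraint on top of the DV team's base transaction and lets the existing testbench machinery do the rest:

// Base transaction maintained by the DV team
class axi_lite_txn;
  rand bit [31:0] addr;
  rand bit [31:0] data;
  rand bit        is_write;
  constraint c_addr_range { addr inside {[32'h0000_0000:32'h0000_FFFF]}; }
endclass

// Corner case the designer wants to exercise: writes at the top of the
// address window with all-ones data
class boundary_write_txn extends axi_lite_txn;
  constraint c_corner {
    is_write == 1;
    addr inside {[32'h0000_FFF0:32'h0000_FFFF]};
    data == 32'hFFFF_FFFF;
  }
endclass

module designer_test;
  initial begin
    boundary_write_txn txn = new();
    repeat (10) begin
      if (!txn.randomize()) $error("randomization failed");
      $display("write addr=%h data=%h", txn.addr, txn.data);
      // in the real flow the transaction would be handed to the existing
      // driver/sequencer instead of just being displayed
    end
  end
endmodule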

Coming back to the rationale of having an independent pair of eyes verify the code, the implication was never that the designers shall not verify their own code. In fact, there is no way for the DV team to do it in a disjoint fashion. Today the verification engineer himself is designing a highly sophisticated test bench that is actually equivalent to a designer's code in complexity. So it would be rather apt to say that it is two designs striking against each other to enable verification, under the collaboration between the design & verification teams!

What is your take on this? Drop a note below!

Sunday, January 25, 2015

The 4th C in Verification

The 3 C's of verification, i.e. Constraints, Checkers & Coverage, have been playing an important role in enabling faster verification closure (a quick sketch of the three appears below). With growing complexity and shrinking market windows, it is important to introduce the 4th C that can be a game changer in actually differentiating your product development life cycle. Interestingly, the 4th C is less technical but highly effective in its results. It is agnostic to tool, flow or methodology, but if introduced and practiced diligently it would surely result in multi-fold returns. Since verification claims almost 70% of the ASIC design cycle, it is evident that a timely sign-off on DV is the key to faster time to market for the product. Yes, the 4th C I am referring to is Collaboration!
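For readers newer to the first three C's, here is a compact, hypothetical SystemVerilog sketch of how they typically sit side by side (all names are illustrative assumptions): a constraint shaping the stimulus, a checker guarding a design rule, and coverage recording what was actually exercised.

class bus_item;
  rand bit [3:0]  id;
  rand bit [15:0] len;

  // Constraint: shape the stimulus to legal values
  constraint c_len { len inside {[1:1024]}; }

  // Coverage: record which ids and length ranges were actually exercised
  covergroup cg;
    coverpoint id;
    coverpoint len { bins short_pkt = {[1:64]}; bins long_pkt = {[65:1024]}; }
  endgroup

  function new();
    cg = new();
  endfunction
endclass

// Checker: a grant must never appear without a pending request
module grant_checker(input logic clk, rst_n, req, gnt);
  a_no_spurious_gnt : assert property
    (@(posedge clk) disable iff (!rst_n) gnt |-> req);
endmodule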

UVM demonstrates a perfect example of collaboration within the verification fraternity to converge on a methodology that benefits everyone. Verification today spreads beyond RTL simulations to high level model validation, virtual platform based design validation, analog model validation, static checks, timing simulations, FPGA prototyping/emulation and post silicon validation. What this means is that we need to step out and collaborate with different stakeholders to enable faster closure.

The first & foremost are the architecture team, the RTL designers & the analog designers who conceive the design, realize it in some form or the other, and many a time fall short on accurate documentation. The architecture team can help to a large extent in defining the context under which the verification needs to be carried out, thereby narrowing down the scope. With a variety of tools available, the DV teams can work closely with the designers to clean up the RTL, removing obvious issues that would otherwise stall simulation progress. Further, assertion synthesis and coverage closure would help in closing verification at different levels smoothly. Working with the analog designers can help tune the models and their validation process w.r.t. the circuit representation of the design. This enables faster closure of designs that see an increased share of analog on silicon.

Next are the tools that we use. It is important to collaborate with the EDA vendors - not just being users of the tools but working closely with them in anticipating the challenges expected in the next design and being early adopters of the tools to flush out the flows and get ready for the real drill. Similarly, joining hands with the IP & VIP vendors is equally crucial. Setting the right expectations with the IP vendors on the deliverables from a verification viewpoint, i.e. coverage metrics, test plans, integration guide, integration tests etc., would help in faster closure of SoC verification. Working with the VIP vendors to define how best to leverage the VIP components, sequences, tests & coverage at block and SoC level avoids redundant efforts and helps in closing verification faster.

The design service providers augment the existing teams, bringing the required elasticity to project needs, or take up ownership of derivatives and execute them. These engineers are exposed to a variety of flows and methodologies while contributing to different projects. They can help introduce efficiency into the existing ways of accomplishing tasks. Auditing existing flows and porting legacy environments to better ones is another way these groups can contribute effectively if partnered with aptly.

Finally, there are the software teams that bring life to the HW we verify. In my last blog I highlighted the need for the HW & SW teams to work more closely and how the verification team acts as a bridge between the two. Working closely with the SW teams can improve reusability and eliminate redundancies in effort.


Collaboration today is the need of the hour! We need to be open in recognizing the efforts put in by the different stakeholders of the ecosystem to realize a product. Collaboration improves reuse and avoids a lot of wasted effort in terms of repeated work or incorrect understanding of intent. Above all, the camaraderie developed as part of this process ensures that any or all of these folks are available at any time to jump in at the hour of need and cover for the unforeseen effects of Murphy's law.

Drop in your experiences & views with collaboration in the comments section.

Sunday, January 11, 2015

HW - SW : Yes, Steve Jobs was right!

The start of the year marked another step forward towards the NEXT BIG THING in the semiconductor space, with a fleet of companies showcasing interesting products at CES in Las Vegas. In parallel, the VLSI Conference 2015 at Bangalore also focused on the Internet of Things, with industry luminaries sharing their views and many local start-ups busy demonstrating their products. As we march forward to enable everything around us with sensors, integrating connectivity through gateways and associated analytics in the cloud, the need for smaller form factors, low power, efficient performance and high security at the lowest possible cost in limited time is felt more than ever. While there has been remarkable progress in SoC development targeting these goals, the end product landing with the users doesn't always reflect the perceived outcome. What this means is that we are at a point where HW and SW cannot work in silos anymore; they need multiple degrees of collaboration.

To enable this next generation of product suites, there is a lot of debate going on around CLOSED & OPEN source development. Every discussion refers to the stories of Apple vs Microsoft/Google or iOS vs Android etc. While open source definitely accelerates development in different dimensions, we would all agree that some of the major user experiences were delivered in a closed system. Interestingly, this debate is more philosophical! At a fundamental level, the reason for closed development was to ensure the HW and SW teams are tightly bound – a doctrine strongly preached by Steve Jobs. From an engineering standpoint, with the limited infrastructure available around that time, a closed approach was an outcome of this thought process. Today, times have changed and there are multiple options available at different abstraction levels to enable close knitting of HW and SW.

To start with, the basic architecture exploration phase of partitioning the HW and SW can be enabled with virtual platforms. With the availability of high level models, one can quickly build up a desired system to analyze the bottlenecks and performance parameters. There is work in progress to bring power modelling to this level for early power estimation. A transition to cycle accurate models on this platform further enables early software development in parallel to the SoC design cycle.

Once the RTL is ready, the emulation platforms accelerate the verification cycle by facilitating testing with external devices imitating real peripherals. This platform also enables the SW teams to try out their code on the actual RTL that would go onto the silicon. The emulators support performance and power analysis that further aids in ensuring that the target specification for the end product is achieved.

Next, the advancements in the FPGA prototyping space finally give the required boost to have the entire design run faster, ensuring that the complete OS can be booted with use-cases running much ahead of the Si tape-out, providing insurance for the end product realization.

This new generation of EDA solutions is enabling bare metal software development to work in complete conjunction with the hardware, thereby exploiting every single aspect of the latter. It is the verification team that is morphing itself into a bridge between the HW and SW teams, enabling the SHIFT LEFT in the product development cycle. While the industry pundits can continue to debate the closed vs open philosophy, the stage is all set to enable HW-SW co-development in close proximity under either of these cases.

As Steve Jobs believed, the differentiation between a GOOD and a GREAT product is the tight coupling of the underlying HW with the associated SW, topped with simplicity of use. Yes, Steve Jobs was right and today we see technology enabling his vision for everyone!

Wish you & your loved ones a Happy and Prosperous 2015!

Disclaimer: The thoughts shared are an individual opinion of the author and not influenced by any corporate.