Last week, the festive season in India was at its peak with celebrations everywhere. Yes, it was Diwali - the biggest and the brightest of all festivals.
Diwali, aka Deepawali, means a 'row of lights' (deep = light and avali = a row). It's the time when every house, every street, every city, indeed the whole country, is illuminated with lights, filled with the aroma of incense sticks, delicious food and the sound of firecrackers all around. People rejoice, paying obeisance to the Gods for showering health, wealth and prosperity. For many, besides the lights and food, it is the crackers that are of supreme interest. The variety ranges from the ones that light up the ground and the sky to the ones that generate a lot of sound. On the eve, while everyone was busy celebrating in full ecstasy, an interesting observation caught my attention, set the neurons in my brain connecting it to verification, and resulted in this post.
While enjoying the celebrations, groups typically form based on age or preference for firecrackers. Usually the younger ones are busy with the stuff that sparkles, while the elder ones get a kick out of the noisy ones. A few are in transition from light to sound with some basic items. The image below shows a specific type of firecracker, both bundled and dismantled.
The novices were busy with the single ones, bursting one at a time. A hundred such pieces would take 50 minutes or so, progressing almost linearly. If one of them didn't burst, they would probe it to see if it still worked, or throw it away. On the other hand, the grown-ups were occupied with the bundled ones that, once fired, would go on until all of them gave out their light and sound. The more pieces, the longer it would take, but the time was much less than releasing individual pieces. It was hard to tell whether a 100-piece bundle actually resulted in 100 sounds, but overall the effect was better.
Observing the above, I could quickly connect it to the learning we have been through as verification engineers. Initially it was directed tests, wherein we would develop one test at a time and get it going. The total time to complete all the tests grew quite linearly, and for a limited number of tests the milestones were visible. Slowly we incorporated randomness into verification, where a bundle of tests could run in a regression, hitting different features quickly. The run time depends on the number of times the tests run in a random regression. Yes, this also results in redundancy, but the gains outweigh it, especially when the number of targets is large.
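The trade-off can be modeled with a tiny sketch, written in Python rather than SystemVerilog for brevity, with hypothetical names not taken from the post. Directed testing covers exactly one target per hand-written test, while a random "bundle" revisits targets redundantly yet needs no per-target effort:

```python
import random

N_FEATURES = 100
FEATURES = set(range(N_FEATURES))  # hypothetical set of verification targets

def directed_runs():
    """Directed: one hand-written test per feature; coverage grows linearly."""
    covered = set()
    runs = 0
    for f in FEATURES:
        covered.add(f)  # each run is guaranteed to hit a new target
        runs += 1
    return runs  # exactly N_FEATURES runs, but N_FEATURES tests to write

def random_runs(seed=1):
    """Random bundle: each run hits one randomly picked feature.
    Redundant hits occur, but only one test needs to be written."""
    rng = random.Random(seed)
    covered = set()
    runs = 0
    while covered != FEATURES:
        covered.add(rng.randrange(N_FEATURES))  # one random pick per run
        runs += 1
    return runs

print(directed_runs())  # 100 runs, zero redundancy
print(random_runs())    # typically several hundred runs (coupon-collector effect)
```

The random approach burns extra simulation cycles on repeats, but a machine absorbs that cost cheaply, whereas the directed approach's cost is engineer time, one test per target.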
As for the firecrackers, the novices playing with individual ones may move to the bundles next year, a natural progression! This is an important conclusion, as it mirrors the learning steps for verification engineers too. Knowing SV & UVM is good, but that alone doesn't make one a verification engineer. A verification engineer needs to have a nose for bugs and must develop code that can hit them fast and hard. This learning is hidden in working on directed tests initially and transitioning to constrained random thereafter. You will appreciate the power of the latter much more that way!
Try extrapolating this further to other aspects of verification, and I am sure you will find the connections throughout. Drop in a comment with your conclusions!
Disclaimer: The intention of this blog is not to promote bursting crackers. There are different views on the resulting pollution and environmental hazards; the reader can explore them on the web and make an individual choice.