Friday, July 2, 2010

SystemVerilog OVM’s apply_config_settings – why & where?

Arayik Babayan, a friend of mine from Armenia, asked me about the use of “apply_config_settings” in OVM. As you may be aware, SystemVerilog is a flexible language that can be used for building highly configurable and scalable verification environments. OVM adds a great deal of capability on top of plain SystemVerilog to make that task a lot easier. One such capability is the configuration mechanism – usually we use the set_config* and get_config* calls. Internally, OVM’s build() takes care of “applying” these settings automatically. But what if you want to change a few settings across the env after the build? That’s when you call apply_config_settings explicitly – it internally calls set_*_local for the modified settings and voilà – you are ready to go!
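Here is a minimal sketch, assuming OVM 2.x; the class, instance and field names (my_env, env0, my_test, frame_count) are all hypothetical, chosen only to show the build-time versus post-build flow:

import ovm_pkg::*;
`include "ovm_macros.svh"

class my_env extends ovm_env;
  int frame_count = 10;   // must be registered below for the config machinery to see it

  `ovm_component_utils_begin(my_env)
    `ovm_field_int(frame_count, OVM_ALL_ON)
  `ovm_component_utils_end

  function new(string name, ovm_component parent = null);
    super.new(name, parent);
  endfunction
endclass

class my_test extends ovm_test;
  my_env env0;
  `ovm_component_utils(my_test)

  function new(string name, ovm_component parent = null);
    super.new(name, parent);
  endfunction

  function void build();
    super.build();               // settings made before/at build time get applied here
    env0 = new("env0", this);
  endfunction

  task run();
    // A late configuration change, well after build() is over:
    set_config_int("env0", "frame_count", 50);
    // The new value just sits in the config table until we re-apply it explicitly;
    // internally this boils down to set_int_local("frame_count", 50) on env0.
    env0.apply_config_settings(1);   // verbose = 1 logs the fields that get updated
    global_stop_request();
  endtask
endclass

module top;
  initial run_test("my_test");
endmodule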

That’s “advanced OVM” for this week!

 

Enjoy OVMing..

TeamCVC

www.cvcblr.com/blog

Tuesday, June 29, 2010

Dealing with SystemVerilog constraint solver failures – the Questa way

… Tuesday Technote on Solver Debug, Jijo PS, Srini TeamCVC www.cvcblr.com

Dealing with a simple solver failure – and looking for really “quick help”. The environment is layered SystemVerilog code for a SAN Router. An inherited constraint in a testcase caused a randomize() failure. Before you jump to conclusions about the simple nature of the problem – consider that this is the first time ever I have looked at this design/env, as the original author moved out of the company (a sign of good times :-) ?) and I have been asked to fix the code ASAP – in the next 15 minutes, that is (sounds way too familiar, huh?).

 

# Number of fware xactn 19
# ** Fatal: [Time 0 ns] Test cfg Solver failure
#    Time: 0 ns  Scope: san_rt_top.san_rt_test_pgm_0.b1.lp.a1 File: ../test/san_rt_test_03.sv Line: 83
# ** Note: Data structure takes 9699728 bytes of memory
#          Process time 0.03 seconds
#          $finish    : ../test/san_rt_test_03.sv(83)
#    Time: 0 ns  Iteration: 2  Instance: /san_rt_top/san_rt_test_pgm_0

 

So what next? Consult our friendly Questa solver-failure debug aid: add -solvefaildebug to the vsim command line and bang, off you go…
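A minimal invocation sketch – the top-level name san_rt_top is taken from the transcript below; the -c/-do batch-run arguments are just one typical way to drive Questa and are an assumption on my part:

vsim -c -solvefaildebug san_rt_top -do "run -all; quit -f"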

It does 2 things:

  1. It prints the minimal set of conflicting constraints, and
  2. It creates a stand-alone testcase that reproduces the failure crisply. See below:

 

Minimal set of constraints from user-code

# ../test/san_rt_test_03.sv(82): randomize() failed due to conflicts between the following constraints:
#     ../test/san_rt_test_03.sv(59): san_rt_test_cfg_0.cst_reasonable_fw_xactns_1 { (san_rt_test_cfg_0.no_of_fware_xactions > 32'h00001360); }
#     ../src/san_rt_fware_gen.sv(42): san_rt_test_cfg_0.cst_reasonable_fw_xactns { (san_rt_test_cfg_0.no_of_fware_xactions < 32'h00000032); }
# ** Fatal: [Time 0 ns] Test cfg Solver failure
#    Time: 0 ns  Scope: san_rt_top.san_rt_test_pgm_0.b1.lp.a1 File: ../test/san_rt_test_03.sv Line: 83
# ** Note: Data structure takes 9699728 bytes of memory
#          Process time 0.02 seconds
#          $finish    : ../test/san_rt_test_03.sv(83)

Testcase created by Questa (SystemVerilog code, can be run stand-alone)

 

# ../test/san_rt_test_03.sv(82): randomize() failed; generating simplified testcase scenario...
# ----- begin testcase -----
# module top;
#
# class TFoo;
#     rand bit [15:0] \san_rt_test_cfg_0.no_of_fware_xactions ;
#     constraint all_constraints {
#         // ../src/san_rt_fware_gen.sv(42): san_rt_test_cfg_0.cst_reasonable_fw_xactns { (san_rt_test_cfg_0.no_of_fware_xactions < 32'h00000032); }
#         (\san_rt_test_cfg_0.no_of_fware_xactions  < 32'h00000032);
#         // ../test/san_rt_test_03.sv(62): san_rt_test_cfg_0.small_tst_cst { (san_rt_test_cfg_0.no_of_fware_xactions < 32'h000013ec); }
#         (\san_rt_test_cfg_0.no_of_fware_xactions  < 32'h000013ec);
#         // ../test/san_rt_test_03.sv(59): san_rt_test_cfg_0.cst_reasonable_fw_xactns_1 { (san_rt_test_cfg_0.no_of_fware_xactions > 32'h00001360); }
#         (\san_rt_test_cfg_0.no_of_fware_xactions  > 32'h00001360);
#     }
# endclass
#
# TFoo f = new;
# int status;
#
# initial begin
#     status = f.randomize();
#     $display(status);
# end
#
# endmodule
# ----- end testcase -----
#

 

Now that was easy to fix: simply override the test-specific constraint in the inherited test_cfg rather than “adding to it”, as sketched below. Glad I met my deadline for today!
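A minimal sketch of the fix – the field and constraint names are from the solver report above, while the base-class name and exact hierarchy are my assumptions. Re-declaring the constraint with the same name in the derived cfg replaces the inherited bound; giving it a new name (cst_reasonable_fw_xactns_1, as the original test did) merely piles a conflicting bound on top of it:

class san_rt_fware_cfg;                          // base cfg, per ../src/san_rt_fware_gen.sv
  rand bit [15:0] no_of_fware_xactions;
  constraint cst_reasonable_fw_xactns { no_of_fware_xactions < 32'h0000_0032; }
endclass

class san_rt_test_03_cfg extends san_rt_fware_cfg;   // test-specific cfg
  constraint small_tst_cst { no_of_fware_xactions < 32'h0000_13ec; }
  // Same name as the base-class constraint ==> this OVERRIDES the "< 32'h32" bound
  // instead of conflicting with it:
  constraint cst_reasonable_fw_xactns { no_of_fware_xactions > 32'h0000_1360; }
endclass

With this, randomize() solves to a value between 32'h1361 and 32'h13eb instead of failing.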

 

Hats off to Questa – I only wish it printed the vsim -solvefaildebug output to the log file automatically on such failures.

TeamCVC

www.cvcblr.com/blog

Monday, June 7, 2010

Pre-DAC round-up of Verification technologies

Given the business climate and local commitments, it is hard for me to be at DAC. But with our keen focus on Verification, it is important for CVC (www.cvcblr.com) to share our thoughts on the fresh ideas/technologies in Verification being demoed at DAC-2010 (www.dac.com). Leaving the BIG-3 out (I hope to blog separately about what we see as “updates” from them prior to DAC), here is a quick round-up of what we see as promising solutions that any DAC attendee in the Verification domain might be interested in. Feel free to comment via our blog @ www.cvcblr.com/blog – we would love to hear from you!

NextOP

One of the most promising start-ups in the assertion-based verification domain. They have been in stealth mode for a few years; only recently has quite a bit of information been let out about their technology. It all started with an eval report from a real user and active follow-ups since then – see: http://www.cvcblr.com/blog/?p=147

Ben Cohen (www.systemverilog.us) recently had some good discussions about this technology based on our DVCon-2010 paper on SVA (contact us to get a copy: http://www.cvcblr.com/about_us). It did find an interesting bug via simulation run –> property extraction –> coverage hole –> bug! It is a somewhat long route, but an interesting approach nonetheless. See details at: http://www.cvcblr.com/blog/?p=163

Make sure you visit their booth @DAC (NextOp exhibits at Booth #1442) to learn more. In a nutshell, their technology analyzes the existing RTL and testbench+testcases (via regression) and extracts quality properties for your design – then it is up to the RTL designers to qualify these “properties” as assertions, coverage or don’t-cares. Their promise is minimal noise, but your mileage may vary!

Vennsa’s OnPoint

If you ask anyone in the EDA/semiconductor industry about the “elephant in the room” problem in front-end VLSI, the answer is loud and clear: DEBUG! Besides SpringSoft/Novas, no one seemed to have the perseverance needed to sail through tough times trying to address that problem. (Remember Veritools, anyone, BTW?) Now we have a genuine attempt to automate debug – Vennsa’s OnPoint. Not much is known about it yet, but here is a picture (Copyright Vennsa, http://www.vennsa.com/):

[Image: Vennsa OnPoint screenshot]

This actually fits very nicely with our Unique workshop on “Debug” (see: www.cvcblr.com/trainings) – wherein we look at some of the common debug problems and demonstrate how little tricks with TCL, GUI/Markers etc. can save you hours if not days!

Look at some of our earlier tweets on OnPoint at www.twitter.com/sricvc to get some more info.

I’m sure we will hear more about it in coming weeks/months.

Jasper’s ActiveDesign

One of the most charismatic EDA tools that I’ve come across so far – that is, if they really deliver on the “Twitter of RTL Design” expectation that has been set for it. A picture is worth more than…here you go:

[Image: Jasper ActiveDesign screenshot]

Read more about it at: http://www.cvcblr.com/blog/?p=144

Zocalo-tech

Do you care to approach your ABV adoption more methodically? Quoting Harry Foster, an all-time ABV promoter, from his invited tutorial entitled “Assertion-Based Verification: Industry Myths to Realities”:

“…what differentiates a successful team from an unsuccessful team is process and adoption of new verification methods. Unsuccessful teams tend to approach development in an ad hoc fashion, while successful teams employ a more mature level of methodology that is systematic…”

Now Zocalo is one vendor trying to address that “methodology” aspect of ABV – primarily via their Bird-dog product. We looked at their Zazz-OVL, and even during today’s SVA training locally (http://www.cvcblr.com/trng_profiles/CVC_LG_SVA_profile.pdf) we were discussing how complex some of the OVL choices can be; I mentioned ZazzOVL – as the Dutch put it, it is “jammer” (pronounce it as “yammer”, see: http://forum.wordreference.com/showthread.php?t=359560) that we didn’t have the tool handy to show off its value (during the lab session, I mean). So make no mistake – their ZazzOVL is very, very handy indeed – if you are adding OVLs, that is.

Coming back to their offerings – Bird-dog is a very interesting approach, very much for those assertion enthusiasts who ask “where is the maximum ROI in adding assertions?”. Their Visual-SVA is like a “temporal GUI/editor” for complex SVA coding – not my personal cup of tea, but I do see value in it for some. However, generating “traces” for assertions within Visual-SVA is certainly a good attempt. Let’s see how they fare in real-life usage! Visit Zocalo Tech at Booth #1509.

The all-new UVM (a la the erstwhile OVM)

Sure you have heard of it – UVM, a sincere effort from Accellera to arrive at a “Universal” methodology from the seemingly competing OVM & VMM. Unless you want to risk your company not paying off your DAC bills, you wouldn’t want to miss the UVM booth :-) Honestly – I believe everyone is looking forward to it. As the Accellera PR puts it:

Accellera's DAC breakfast, sponsored by Cadence, Mentor and Synopsys, will feature a standards update with an overview of how the Universal Verification Methodology (UVM) standard supports verification tool interoperability and gives IP and EDA users more choices, and a panel on "UVM: Charting the New Territory." This event continues the celebration of Accellera's 10 years of standards excellence.

For the first time, all 3 major vendors “sponsor” one event promoting ONE methodology – great news indeed for the users. BTW, Aldec is catching up on SystemVerilog support with its Riviera-Pro product line. Ask them for VMM/OVM/UVM support updates at: http://www.aldec.com/registration/dac

Agnisys's OVM/UVM management kits

A young EDA company based in Noida, India with a solid EDA background (Anupam). They have two products, iDesignSpec & iVerifySpec – one for register automation and the other for overall verification management. Register automation has been long awaited and wished for; almost 8 years back we at Intel used Perl+DOC (tables) for something similar – glad to see a much more finished end product now. It can emit VMM-RAL and OVM code, and soon perhaps UVM code too.

Sapient-Inc's IC management

Another young EDA company. According to the founder, Subash, a long-time chip designer/manager:

I started Sapient-IC from the pain and frustration of managing IC products. The die size grows, the schedule slips, the VP yells at everybody. This is what I want to address: analytics for decision makers, from comparative analysis of design choices to financial analysis.

Breker’s Trek

A not-so-young EDA company (compared to the likes of NextOp/Zocalo etc.) with some interesting success stories with NVidia and STMicro. Their Trek is certainly a refreshing approach to testcase writing – especially for SoC verification. See: http://www.cvcblr.com/blog/?p=148 for more details.

RealIntent’s Ascent

So much has been said and written about linters – yet their adoption has been hampered heavily by the amount of “noise” they create. RealIntent’s Ascent claims to produce less of it – and that is their primary selling point. I am not sure how they achieve that, given the natural side effect of trying to “find faults” with any given code.

SpringSoft

Check with them on what’s up with their Certess/Certitude – it is an innovative approach for sure: mutation-based TB qualification. From what we have heard locally, there have been successes and also some additional “noise”.

Monday, May 24, 2010

NextOp's assertion synthesis and our recent FIFO experience


Based on our DVCon-2010 paper on SystemVerilog Assertions - 2009 (see www.cvcblr.com --> Publications), we recently ran our FIFO model through NextOp's BugScope tool. It produced some interesting stuff. The main one I liked is:

pop |-> full;

This is an eye-opener of a property – it should never be the case! But BugScope indeed indicated that we are missing it – either as an assert or as a cover. Obviously it is not a good assert, so when we analyzed deeper, it turned out to be "valid coverage" given the RTL as written. Details at:

http://verificationguild.com/modules.php?name=Forums&file=viewtopic&p=18073#18073

So essentially we did have a coverage hole – and when that hole was analyzed, we got a design error/bug! What an interesting roundabout way of detecting bugs – but who cares: as long as the bug detection is automatic, it is good!
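For the record, here is a minimal sketch of how that property lands in the testbench – the module, port and bind-target names are assumed; only the expression pop |-> full comes from BugScope:

module fifo_props (input logic clk, pop, full);
  property p_pop_when_full;
    @(posedge clk) pop |-> full;
  endproperty

  // As an assertion this is clearly wrong – pops legally happen when the FIFO is
  // not full – so it stays commented out:
  // a_pop_when_full: assert property (p_pop_when_full);

  // As coverage it is exactly the hole BugScope pointed at:
  c_pop_when_full: cover property (p_pop_when_full);
endmodule

// Hypothetical hook-up to the DUT:
// bind fifo fifo_props u_fifo_props (.clk(clk), .pop(pop), .full(full));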

Ajeetha,

http://www.cvcblr.com/blog/?p=163

Monday, May 17, 2010

Welcome the next generation Verification Methodology – UVM

For all those SystemVerilog geeks, lovers and followers, here is a BIG sigh of relief – at last we have a UNIVERSAL Verification Methodology that all 3 major EDA vendors will openly support (and hopefully promote as well). As we speak, the UVM-EA (Early Adopter) release is now available. Take a look at it on the Accellera site.

CVC (www.cvcblr.com) has been closely following this release and is about to roll out fresh trainings on UVM. After all, it is based on OVM 2.0.*, on which we have delivered successful trainings to several customers locally – the most recent one just over the last weekend! (Yeah, we do have weekend classes as well.)

So, what are you waiting for? Go ahead and ask about our upcoming UVM class via training@cvcblr.com or call us at +91-9620209226.

Talk to you soon on UVM!

CVC Team

www.cvcblr.com

Saturday, April 24, 2010

A glimpse of our DVAudit – what goes on @CVC’s TDG

Many have asked us the following:

  • Is CVC a training company? I see: www.cvcblr.com/trainings
  • Do you work on “live” projects?
  • What is your TDG really doing?

and more. Usually these questions come more from students/RCGs (Recent College Graduates) than from the experienced lot – the experienced lot is well networked with the CVC founders (www.linkedin.com/in/svenka3, www.linkedin.com/in/ajeetha) and hence knows us well.

Our honest answer to such questions is “all of the above” :-) That is: YES, we ARE proud to be a training company (www.cvcblr.com/trainings) – simply b’cos we know why we are doing it. We work on “live” projects – and we constantly upgrade to next-generation technology, most recently SystemVerilog VMM/OVM, low power etc.

What makes us really different and keeps us constantly innovating is the thirst for “doing better”. This is the core of our PDG – Product Division Group (yet to be formally announced on our website): we look for ways to enhance productivity. For instance, when there is a customer deliverable of verification code, “Team CVC” spends quality time together doing thorough reviews, code walkthroughs, custom lints etc. Here is our latest weekend edition: an unmoderated, live-from-the-board glimpse of a DVAudit review done on a customer deliverable.

[Image: DVAudit review whiteboard]

 

For the experienced lot reading this entry (BTW, thanks for getting this far :-)), this is such a common part of your tech life. For the uninitiated, this is how the industry works – “writing a piece of code” is just A part; there is a lot more to making it customer-ready/production-ready.

Now what’s innovative about the above “glimpse”? If you read it carefully you can observe that we are creating a thorough checklist – an “executable” process for Design-Verification Audit. It is something CVC has been doing for its corporate customers behind the scenes for many years. Now it is slowly taking shape as part of our PDG – stay tuned for more.

Wednesday, March 17, 2010

A modern approach to SoC level verification

 

Verifying an SoC is fun and tedious. With so many buzzwords flying around, it is quite easy to get lost in the maze and miss the goal. At the end, one may feel that the plain old wisdom of a whiteboard-based testcase plan/review is (or was) a lot more controllable & observable. We did that back in 2000 @ Realchip Communications and yes, it worked really well. But with shrinking schedules and mounting complexity, is that really fast enough? And before I hear “constrained-random”, pause for a moment – how random do you want your end-to-end data flow in and out of the ASIC/SoC to be?

We at CVC (www.cvcblr.com) take pride in partnering with all major EDA vendors (http://www.cvcblr.com/partners) – big & small – to look for the best possible solution to each problem rather than suggesting a “one-size-fits-all” solution.

Here is a relevant thread @Vguild: http://www.verificationguild.com/modules.php?name=Forums&file=viewtopic&p=17615#17615

I am due to start work on an ASIC, and am wondering about a suitable verification strategy. The ASIC consists of a data path, with continuous data input from ADCs and continuous output to DACs, and a couple of embedded processors utilising external flash and SRAM.

So the interfaces to the ASIC are pretty much:
(1) parallel data bus in
(2) parallel data bus out
(3) external memory interface for CPUs

And here is our own experience/view of an emerging approach to this problem – we don’t claim to have solved it completely, but we seem to be making good progress towards addressing it in a methodical and controllable (yet scalable) manner.

 

Hi Siskin,
Good question/topic. While the value of OVM/VMM is very profound at block level, their usage at SoC level, where end-to-end data flow is being checked, is not (yet) very well reported in the literature. Needless to say, they are far better than inventing your own. Especially if you have block-to-system reuse of these VIP components, they definitely come in very handy. The virtual sequences/multi-stream scenarios do assist, but IMHO they come with heavy workouts. Instead, what we promote to our customers here and have been prototyping with at CVC is the solution from Breker, called Trek. It can work on top of any existing TB - Verilog/VHDL/TCL/VMM/OVM, you name it.
The idea is to reuse the block-level components to do what they do best and build tests at a higher level - in this case using graphs, nodes etc. I tend to like this as I liked Petri nets during my post-graduation days (though I didn't follow up on that interest afterwards).
My first impression was to use Trek simply as a testcase-creation engine, but slowly I'm getting convinced it is useful as a "checker" as well - especially for the end-to-end checks.
You are absolutely right - use assertions at the IP interface level and use some sort of higher-level stimulus. Where I see Trek useful in SoC verification is the ability to describe your "flow of data through the SoC" as a graph and let the tool generate tests for you. I even jokingly say one could use a palmtop/PDA to draw these graphs during travel, convert them to a Trek graph (somehow - I haven't chased that dream yet) and have tests ready while on the move - flight/train/bus, whatever it may be! On a serious note, this is quite similar to how we used to discuss our testplans on a whiteboard during our Realchip days (a communication startup in 2000-2001), now becoming "executable" :-)
See ST's usage of Trek @
http://www10.edacafe.com/nbc/articles/view_article.php?articleid=787856
Feel free to contact me offline if you need further assistance with Trek. We have our 2nd successful project finishing up using Trek, though these are small/medium-scale ones.
My 2 cents!
Srini
www.cvcblr.com
_________________
Srinivasan Venkataramanan
Chief Technology Officer, CVC www.cvcblr.com
A Pragmatic Approach to VMM Adoption
SystemVerilog Assertions Handbook
Using PSL/SUGAR 2nd Edition.
Contributor: The functional verification of electronic systems