Archive for September, 2013
Tuesday, September 24th, 2013
Jim Lewis, VHDL Training Expert at SynthWorks (and a founding member of OSVVM, of which Aldec was an early adopter), was kind enough to author a guest blog for Aldec. Here’s an excerpt:
After presenting a conference paper on how to do OSVVM-style constrained random and intelligent coverage (randomization based on functional coverage holes), I received a great question, “Why Randomize?”
The easiest way to answer this is with an example, so let’s look at a FIFO test. To test a FIFO, you write to it, read from it, write and read simultaneously, fill it and verify that additional writes are held off, and empty it and verify that additional reads are held off.
A FIFO can certainly be tested using a directed test (just code, no randomization). The following simulation waveform shows diffcount (the number of words in the FIFO) for a directed test. The lowest value is empty; the highest is full. Using this, you can visually check off each of the required conditions and see that the FIFO is indeed tested.
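The intelligent-coverage idea mentioned above — randomizing based on functional coverage holes — can be sketched as follows. This is a conceptual Python model, not OSVVM's actual CoveragePkg API; the bin names and FIFO occupancy ranges are assumptions for illustration.

```python
import random

# Hypothetical functional-coverage bins for FIFO occupancy (depth 8).
# Not OSVVM's API -- a conceptual model only.
bins = {"empty": range(0, 1), "low": range(1, 3),
        "mid": range(3, 6), "high": range(6, 8), "full": range(8, 9)}
covered = set()

def intelligent_randomize():
    """Pick a target occupancy only from bins that are still uncovered."""
    hole = random.choice([b for b in bins if b not in covered])
    return random.choice(list(bins[hole])), hole

iterations = 0
while len(covered) < len(bins):
    occupancy, bin_hit = intelligent_randomize()
    covered.add(bin_hit)   # record the bin this stimulus hit
    iterations += 1

# Because each pick targets a remaining hole, all bins close in exactly
# len(bins) iterations; plain randomization may revisit covered bins many times.
print(iterations)  # -> 5
```

The payoff is the last comment: restricting randomization to coverage holes bounds the number of stimuli needed to reach coverage closure, which is the core argument for "Why Randomize?" with intelligent coverage rather than purely directed tests.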
For the rest of this article, visit the Aldec Design and Verification Blog.
Tags: Aldec, coverage, fifo test, functional coverage holes, intelligent coverage, os-vvm, osvvm-style constrained random, randomization, systemverilog, VHDL, vhdl testbench techniques
Monday, September 23rd, 2013
If DO-254 is both the mission and the map required to achieve compliance, then traceability represents the roads on that map. Consider this.
– Roads connect two or more places on a map; traceability connects two or more elements in a project (such as functions, requirements, concept, design, verification data and test results).
– Road names help identify the specific places a road links; traceability names help identify the specific project elements a trace links.
– In the absence of roads, reaching your destination is practically impossible; in the absence of traceability, achieving compliance is also practically impossible.
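The road analogy above can be made concrete with a minimal sketch of traceability links. The element names below (requirement, design element, test case, test result) are hypothetical, not taken from any DO-254 project or from Spec-TRACER.

```python
# A minimal sketch of traceability links, with hypothetical element names.
# Each link connects two project elements, like a road connecting two places.
links = [
    ("REQ-001", "DESIGN-FIFO"),         # requirement -> design element
    ("DESIGN-FIFO", "TEST-FIFO-FULL"),  # design -> verification test case
    ("TEST-FIFO-FULL", "RESULT-PASS"),  # test case -> test result
]

def trace(start, links):
    """Follow links from `start` and return every reachable element."""
    reached, frontier = set(), [start]
    while frontier:
        node = frontier.pop()
        for src, dst in links:
            if src == node and dst not in reached:
                reached.add(dst)
                frontier.append(dst)
    return reached

# A requirement is traceable to compliance evidence if a test result
# is reachable from it by following the links.
print(sorted(trace("REQ-001", links)))
```

Walking the links from a requirement down to a test result is exactly the "reaching your destination" step: if any link is missing, the walk dead-ends and the compliance evidence for that requirement cannot be demonstrated.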
Tags: Aldec, cca, circuit card assembly, conceptual design, detailed design, do-254, FPGA, fpga requirements, hardware design process, hardware requirements, HDL Design, implementation, individual system requirements, requirements capture, spec-tracer, test results, Traceability, verification results, verification test cases
Wednesday, September 18th, 2013
Today’s System-on-Chip verification teams are moving up in levels of abstraction to increase the degree of coverage in the system design. As designs grow larger, test times within HDL simulations increase. Engineers can use hardware-assisted approaches such as simulation acceleration, transaction-level co-emulation, and prototyping to combat the growing run times of an RTL simulator. In this article, we’ll dive much deeper into the transaction-level co-emulation methodology.
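The intuition behind transaction-level co-emulation can be sketched numerically. The model below is conceptual, and the cycle and transfer counts are assumptions for illustration, not measured data from any emulator.

```python
# Conceptual model of why transaction-level co-emulation beats signal-level
# co-simulation: it reduces testbench <-> emulator interface crossings.
# The counts below are assumed for illustration, not measured data.

CYCLES_PER_TRANSFER = 10   # e.g. a 10-beat bus burst driven pin by pin
TRANSFERS = 1000

# Signal-level: the simulator crosses the interface on every clock cycle.
signal_level_crossings = TRANSFERS * CYCLES_PER_TRANSFER

# Transaction-level (SCE-MI / DPI style): one function call per transaction;
# a bus-functional model inside the emulator expands it into bus cycles.
transaction_level_crossings = TRANSFERS

print(signal_level_crossings, transaction_level_crossings)  # -> 10000 1000
```

In this toy model the transaction-level interface makes ten times fewer crossings, and each crossing carries more work to the hardware side — which is the basic reason methodologies like SCE-MI and the SystemVerilog DPI move the testbench conversation from pins and cycles to transactions.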
(more…)
Tags: accelera, Aldec, co-simulation, dpi, Emulation, FPGA, function-based, hardware, hardware emulation platform, hardware-assisted verification method, hardware-assisted verification solution, hdl simulations, high-level testbenches, macro-based, pipes-based, prototyping, rtl simulator, sce-mi, simulation acceleration, SoC, SoC and ASIC Prototyping, soc designs, standard for co-emulation modeling interface, system-on-chip verification, systemverilog direct programming interface, systemverilog lrm, transaction-level co-emulation, transaction-level co-emulation methodology, Validation, verification
Monday, September 16th, 2013
It occurred to me that it has been a few months since we shared an update on HiPer Simulation A/MS. Following DAC 2013 and Daniel Payne’s posts at SemiWiki (post 1, post 2), we at Aldec and Tanner EDA have received many inquiries from the field, conducted a number of evaluations, and deployed our analog/mixed-signal (AMS) design flow with our first mutual customers. In this article, I’ll share more about the mixed-signal simulation methodology and highlight some of the Verilog-AMS use cases that we have seen in the field.
Digital & Analog HDLs
The Verilog and VHDL languages were designed to handle discrete signals, where the number of possible signal values is limited (e.g. 1, 0, X, Z). Verilog-A, by contrast, was designed to handle continuous-time (analog) signals, which can take any value from a continuous range at any point in time.
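The discrete/continuous distinction can be sketched as follows. This is a conceptual Python model, not HDL; the 1 kHz, 1 V sine is an assumed example waveform.

```python
import math

# Discrete digital signal: values come from a small enumerated set, as in
# Verilog/VHDL ('0', '1', 'X', 'Z' shown here; VHDL's std_logic has nine).
DIGITAL_VALUES = {"0", "1", "X", "Z"}
digital_sample = "1"
assert digital_sample in DIGITAL_VALUES   # only enumerated values are legal

# Continuous-time analog signal, as Verilog-A models it: the value at any
# instant t comes from a continuous range. A 1 kHz, 1 V sine is assumed.
def analog_signal(t):
    return 1.0 * math.sin(2 * math.pi * 1000 * t)

# Sampling at any instant yields a valid real value -- no enumeration.
print(round(analog_signal(0.00025), 6))  # quarter period of 1 kHz -> 1.0
```

A mixed-signal simulator's job is to bridge exactly these two worlds: event-driven evaluation over an enumerated value set on the digital side, and continuous-time solution of real-valued equations on the analog side.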
For the rest of this article, visit the Aldec Design and Verification Blog.
Tags: Aldec, analog, co-simulation, digital, hiper simulation a/ms, mixed-level simulation, mixed-signal, mixed-signal design approach, Riviera-PRO, safety-critical, simulation-based verification, tanner eda, transistor-level implementation, verilog-ams simulators, vhdl languages
Monday, September 9th, 2013
You look confused. Perhaps I owe you an explanation. Anyone familiar with hardware design flow knows that it starts with specification and ends with implementation. The specification in this flow is the “What” – it defines what needs to be designed. The process for implementation is the “How” – it defines how you are going to achieve it.
Let’s break down just one part of the “How,” or implementation: the Design Process. For many years, hand-coded RTL has been the de facto method for implementation, and it is still the predominant method for designing cutting-edge hardware. But does it follow that it is the most efficient method? I would say probably not, especially given the ever-growing complexity of hardware.
For the rest of this article, visit the Aldec Design and Verification Blog.
Tags: Aldec, clocks, design, hand-coded rtl, hardware design flow, hierarchies, high level synthesis, hls tool, processes, rtl, SoC, SoC and ASIC Prototyping, systemc, technology, Validation, verification