EDACafe Editorial
Peggy Aycinena is a contributing editor for EDACafe.Com

At the Verification Bar: Solutions from Here to Eternity

 
September 26th, 2013 by Peggy Aycinena

A Professor, a Sage, and a Guru walked into a bar. Brian the Bartender greeted them: “What’ll it be, boys?”

The Professor said, “We need some help, Brian, settling an argument.”

“No problema,” Brian the Bartender said. “I’ve got an answer for everything.”

“Well,” the Professor said, “I think ESL’s not going to happen in our lifetime, but the Guru here says it’s just around the corner now that he and his have finally got all the pieces of the flow in place.”

Brian the Bartender laughed, “Yeah, the Guru’s been saying that since the dawn of mankind!”

“Exactly,” the Professor said.

Again Brian the Bartender laughed, “Guru, can you defend yourself? And don’t even think about plunking your wordy White Paper down on the bar. This is a public house, not a public library.”

The Guru laughed, “Fine, but it’s not my problem if none of you know how to read. And okay, perhaps I was off by a decade when I predicted that ESL was only 5 years away at the end of the last century. Nonetheless, now it actually is upon us thanks to a lot of work with my guys this year at DAC in Austin.

“Of course, our success there was predicated on the pain point we reached in 2011 when the complexity of mainstream design drove us in earnest to ESL. The pain was there in 2011, but there was no flow.

“By 2012, however, we’d discovered the flow – and its missing parts – when we realized there was not enough consideration of power in the flow. That’s when we let timing concerns drop to number two.

“Now here in 2013, I’ve been working with a bunch of people – Atrenta, in particular – and it’s dawned on us that the big problem is the need for acceleration and emulation: adding those things into the flow to let the software guys know what’s going on. And that’s what happened at DAC this year – enough of us got together to figure out how to get the thing to work.

“I’m a software guy,” the Guru acknowledged, “so over a nice lunch with Jason and Frank we completed the flow. Of course, if you could read, you would see that architecture is still being left out of the picture. Clearly, we need more standards and lots more tools to get things to work.”

The Guru whipped a laminated card out of his pocket protector and waved it at the Professor and the Sage. “Look at the names on this card,” he said. “These are the main ESL players who are working with us to make it all happen.”

The Professor and the Sage looked at Brian the Bartender, but he offered no help.

The Guru droned through the list while his companions sipped their beers. “Green Hills, MathWorks, Wind River by way of Intel, Calypto, Forte, Cadence, ARM, Synopsys, Mentor, Duolog, Docea, Apache, Oasys, Imperas, Real Intent, Jasper, OneSpin, Atrenta,” he intoned. “These are the guys making it work.”

The Professor and the Sage both started to say something, but the Guru was not done: “By the way, as we move towards better tools, EDA is gobbling up the embedded guys. Next we’ll run into the mechanical guys, so stay tuned because over the next 3 years who knows who might be eating who.”

Brian the Bartender pushed a bowl of peanuts toward the Guru and changed the topic. “It’s obvious,” he said, “that there’s a need for more verification cycles, but how do I know when I’ve executed the right cycles?”

The Professor looked grateful for a chance to opine and started to speak. He was too slow, however, because a guy at the end of the bar piped up. “Hi, I’m Ziv and I can answer that. The general method is about using coverage, especially when you go to system verification for the SoC. Our box for emulation, for instance – and that of others – helps you identify the metrics you need to determine what to verify and when.”

The Professor let Ziv finish his thought and then finally jumped in. “One thing I’d like to stress is the bottom-up nature of the verification process,” the Professor said.

“Up till now, we have concentrated on exhaustive verification, but some of the things we spend time verifying can never actually happen. So it’s only by thinking about the problem from the top down that we can identify what actually has to be verified. Refine that process and you’ll find that the important stuff is verified first. Then, if need be, we can run a few more test cases.”

The Sage, who had listened patiently and was now well into his second beer, at last spoke up.

“I have kind of had my hand on this stuff for a long time,” he started, self-effacingly, “and I agree with the Professor. Currently, we are verifying things that won’t ever happen, which is why the notion of virtual prototyping synthesizable behavior is so important.”

Brian the Bartender rolled his eyes. He knew this discombobulated conversation between the Professor, the Sage, and the Guru would only become more so, the longer the beer flowed. Nevertheless, he knew these guys left generous tips, so he poured another round and let them continue.

“Back in ’96,” the Guru said, “I was on a panel and we were asked if formal is the answer to verification. Of course, we all said no and it was then that the idea of the intelligent testbench came up.

“Especially because it was in that time frame that Intel was known to have verified one particular block in a design 96 different times. We said, put aside the blocks that have already been verified and stop the madness. Mentor has since developed some good tools to make that happen, and so has the company Breker. Today, the intelligent testbench does work, although we still need to work on the big probe.”

Ziv again piped up from his end of the bar, “IP verification is more and more about verifying the blocks of a system. You have to verify IP exhaustively and it is okay to use formal, but many blocks are actually too big for formal.

“Some approaches say, though, that the key is not to verify exhaustively, but to define the use cases – and not in some abstract way, but by guaranteeing that the same requirements are validated through the process.

“There are three areas here which are important,” Ziv continued. “They include components, function, and …”

Brian the Bartender jumped in before Ziv could complete his list. “Ten years ago,” Brian remembered, “the design team was so much bigger than the verification team. Now, they’re equal in size. What’s that all about?”

The Professor concurred, “Yeah, and if we don’t change things fast, verification will actually swamp design, so we need to put executable requirements and use cases in place. We should never, ever let the verification team get bigger than the design team.

“Having said that,” the Professor added, “today’s chips are moving towards having no unique content whatsoever, in which case the verification team could conceivably get bigger.”

Brian the Bartender glanced at the Guru, “You’ve got that pit bull look again!”

The Guru responded, “We’ve just got to stop looking at the Design Team and the Verification Team separately, and instead realize that today we’re doing IP-based design. It’s very hierarchical – up to 6 different levels of hierarchy – so what we have now are Assembly Teams assembling the blocks.

“And by the way,” he added, “in the ideal case, there would be no more than 5 blocks!”

The Professor and the Sage chuckled. “Yeah right,” they both chortled simultaneously.

The Professor looked around the bar – a crowd had gathered – and asked, “Anybody here using just 5 blocks of IP?”

When nobody raised a hand, the Professor looked at the Guru. “Sorry,” he said, but he didn’t look sorry.

“Whatever,” the Guru conceded quickly. “The truth is that any design with over 35 blocks cannot be verified, so you have to look at the Assembly Team. If that team works on a design with 25 to 35 blocks, it’s probably a manageable number.

“And yet, nobody is talking about verification as a hierarchical process, even though they should be. In fact, right now we have to verify the design one step at a time. But as we go up the chain of abstraction in the design, we actually are looking at application-specific designs. It’s no longer about general-purpose chips, but about chips designed for a specific application.”

Brian the Bartender turned to the Sage. “You build companies – what do you think?”

The Sage replied, “Yes, and one of my companies is using IP from a bunch of different places. From an economics point of view, we just can’t let the team get any bigger, because at some point throwing more monkeys at the problem only produces more monkey poop.

“So we try to parse the problem into blocks that are all about the same size, and then we distribute them to different teams, working hard to keep the teams synchronized. This is particularly useful for managing designs for an application-specific cell phone, for converging on a chip that will work in that environment.”

The Professor commiserated, “But in the verification flow, we don’t have enough communication.”

The Sage replied, “We’re always asking, is the stuff I have in front of me relevant to the problem?”

Ziv spoke up from his end of the bar, “And the verification problem continues to grow in leaps and bounds. No wonder EDA is growing into embedded software, because that’s where the headcount is going.

“Especially because these days, most design is derivative. It’s built on earlier designs with only a small percentage of change. In fact, only about 10 percent of a design today is actually new.

“We still see verification and design oriented toward the old process, even though we all know that today it’s an assembly problem – the needs are focused on managing the integration of the IP.”

The Professor attempted again to clarify, “You’re saying it’s an integration problem, because you’re accepting a bottom-up flow. But the Sage and I are both saying it needs to be a top-down solution. Which use case should be run is decided best by a top-down point of view.”

Ziv emphasized his take on the problem: “IP blocks, software, Android-based systems – they are all reusing IP, reconfigured somewhat differently. It’s bottom-up for the blocks, but top-down for the use cases. It’s important to realize that both points of view need to co-exist.”

The Sage countered, “If your team has 50 or 60 guys, at least 20 of those guys have to be working on the software. Meanwhile, we know that EDA guys will work like sons of a gun to give us the verification tools we need. But the thing that’s going to kill us is the software that’s sitting on the hardware.”

Ziv pushed back, “But most simulation at the SoC level is done on an emulator. Coverage and debug mean it has to work on that machine. We need to take those machines and make them hybrids, able to serve the whole OS, the software, and the whole stack.”

The Sage may not have heard him: “We see every year that some accelerator is built on a parallel processor. Just look at the latest version built on an Nvidia processor – 128 cores within a single processor, for example.

“The thinking was that we could put that out on The Cloud and get everybody together, but it failed due to the cost of the thing. Plus,” the Sage added, “nobody wanted their data outside of their own private cloud. Simulation will never answer the questions that emulation can answer.

“Everybody will try to kill me when I say it, but emulation is always going to be ubiquitous. You’re always going to have to have it!”

The Professor looked up from his beer, “RTL simulation hit the wall when processor scaling outgrew its capacity. Before that, no one believed that emulation would grow powerful enough, but now it’s completely moved to the ESL level. The models are becoming ubiquitous and the cost savings are making it worthwhile. It’s clearly showing us how to make the best of all possible worlds!”

Brian the Bartender looked around the room: “Are there questions from the floor?”

A stranger called over from his table, “Before, when we were verifying for the mil-aero guys, everything was highly constrained, but now it’s about all consumer-oriented things.”

Ziv offered randomly from the end of the bar, “Testing will still be manual, but most verification will move up to the software and the systems.”

The Guru also randomized, “The embedded tool market has been in crisis for 5 years. There’s been no decent work there since Linux ended the days when the RTOS was the source of all the money, so there’s been no go-to tool. Now Apple’s finally working on it, because the iPhone apps were draining the phone in 90 minutes.”

The Sage looked up from his beer and peered at Brian the Bartender, “This may end up being the last time I’m invited in here,” he said, “but I met a guy recently who does those insurance tables. I think they’re called actuaries.

“He told me that if I live until 80, the chances are that I will live forever. And the best way to live to 80 is to stop showering, because if I fall in the shower and break my hip I’ll probably die from the side effects. So I’ve decided to stop showering, so I can live forever.”

Brian the Bartender looked pained.

“It’s the same with verification,” the Sage continued. “The only thing it really does is build your confidence that you’re doing the right stuff.

“But you would be far better off pursuing a tactic and philosophy of getting it right at the specification step in the design. If I have a $180 million design and spend 40 percent of that cost on verification, that’s a lot of money. Wouldn’t it be better to get the design right in the first place?

“And also, rather than spend money on verifying the hardware, why not let software upgrades fix the system on the fly? I bought an iPhone 4s and the thing chewed up all of my battery power in 2 hours. I made an appointment with one of those Apple Geniuses and he said I needed to shut off all of the functions I had running to save power.

“Instead, I did a software upgrade. I uploaded a filter that fixed the OS and made the thing last 8 hours between charges. It was easy.

“It’s the same with the Tesla. It’s got a new dashboard every morning, because Mother Tesla is talking to the car during the night and implementing fixes.

“And how does this affect the verification problem? I’m just saying it’s better during the design to just get to a ‘sufficiently good’ point and then say, whatever problems are left, we’ll fix them later through the software. That’s how verification should happen today.”

The Sage stopped speaking and there was a silence in the room. The Professor looked at Brian the Bartender, the Guru looked at Brian the Bartender, Ziv looked at Brian the Bartender, and so did everyone else in the room – except for the Sage. He looked at his shoes.

Brian the Bartender looked back at everyone and then reached across the bar. He patted the Sage on the shoulder and said, “Look. These are tough questions and tough times. I appreciate your point of view, and I’m sure everybody in the house does as well.

“No matter that you’re no longer showering and that you think all problems can be fixed by software. It’s been a long evening, and we’re not going to hold you to anything that’s been said here over beer among friends. And just to prove there are no hard feelings, the drinks are on the house.”

The silence in the room erupted into bedlam. People leaped up from their chairs and rushed to the bar to congratulate the Professor, the Guru, and Ziv for their wisdom and work.

Of course, the Sage got the biggest pats on the back as everyone knew it was his honesty and forthrightness that produced the results that really mattered – drinks on the house and systems that self-correct by pulling solutions out of The Cloud.

Night and Day. Day and Night. From Here to Eternity.

***************

This little narrative might have been inspired by an afternoon panel today at the Cadence System-to-Silicon Verification Summit. Any resemblance to real people, however, is purely coincidental.

***************


One Response to “At the Verification Bar: Solutions from Here to Eternity”

  1. John Swan says:

    I do agree with “Sage” in that we are too focused on design by verification – and that at RTL – instead of fixing the flow to do more validation and verification at ESL where the process runs much faster.

    There are several flow improvements that can be made. There is growing support for virtual prototyping.

    Over 10 years ago I oversaw a rapid flow from behavioral function to emulation by using HLS. Doing this would complement virtual prototyping for early SW/firmware development.
