July 28, 2003
The Root of all Evil
For thinking souls, the overlap between philosophy and technology is an intriguing one. As such, it would appear that Anthony Mark Jones is a thinking soul. I had a chance to talk with him recently and enjoyed hearing a complex set of development theories laid out against a complex backdrop of design realities.
Jones has an MA in theoretical physics from Oxford and has worked extensively on chip architectures at Inmos, Micron Technology, and Advanced Hardware Architectures. He has been working hard of late to articulate the overlap between philosophy and technology, and to use those results to conceive of a new strategy to more efficiently design today's big chips - in particular, the systems that use them. He says he's taking a “holistic approach” in this effort, trying to find the unifying issues that include the technology, the specification, and the verification of big chips.
He says, “If you look at a 1-billion transistor chip, you have to ask yourself - How on earth do we execute a design on such a thing? Since I'm interested in taking a holistic approach to the problem, it really comes down to how we manage the process of building these devices.”
“With regards to the technology itself, there are huge numbers of transistors, huge complexity in the wiring, and a fundamental bottleneck in the state machines themselves. Clearly these issues are problems for everyone attempting to design these devices.”
According to Jones, if technology is the problem, specification should provide the solution.
“One of the interesting things I've noticed [at the outset of a project] is that the System Guy comes in, does the design and the development in software, finishes the thing, packages it with a big red bow and then hands it over to the Hardware Guy. Then the Hardware Guy [basically] redesigns the whole thing all over again, but in a different modality. Meanwhile, ninety percent of the time, the Software Guy and the Hardware Guy are bickering between themselves and having to compromise to get the thing to really work.”
“We need to bridge this gap through a complete modification of the way we do the specification, particularly at the transistor level, while also keeping the analog issues in the forefront. I'm not talking about hardware/software co-design here. I'm talking about a fundamentally different way of doing the specification. The key question is - Why are we doing something different in the hardware domain than we're doing in the software domain? If we're really trying to deal with managing the complexity here, we need to move beyond the separation inherent in co-design and co-simulation.”
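One concrete way to picture a single specification serving both domains - a sketch of the general idea, not necessarily the method Jones has in mind - is a hardware block described once as an ordinary software object. The hypothetical class below models a 4-tap moving-average filter in bit-exact integer arithmetic, so the same code could drive a software prototype and serve as the golden reference for a hardware implementation:

```cpp
#include <cstdint>

// Hypothetical example: a 4-tap moving-average filter specified once,
// as a plain object. The arithmetic is integer and truncating, so a
// hardware datapath (adders plus a 2-bit shift) could match it bit-exactly.
class MovingAverage4 {
public:
    // Push one sample; return the average of the last four samples.
    std::uint32_t push(std::uint32_t sample) {
        taps_[idx_] = sample;
        idx_ = (idx_ + 1) % 4;
        std::uint32_t sum = 0;
        for (std::uint32_t t : taps_) sum += t;
        return sum / 4;  // truncating divide, as a shifter would implement it
    }

private:
    std::uint32_t taps_[4] = {0, 0, 0, 0};
    std::uint32_t idx_ = 0;
};
```

Because the specification is executable, the Hardware Guy's redesign can be checked against it cycle by cycle instead of being re-derived from a paper document.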
“I've been thinking about this for a long time and have a number of ideas - some of them not fully formed - that fundamentally establish that any level of design should be independent of the context. Absolutely independent, if this is possible.”
“Take software specification as the obvious example here. There was a huge negative reaction in the industry when academia came up with Object-Oriented Programming. The academics said you couldn't really solve complex systems top-down. The best anyone can do is manage from the bottom up. Initially, the negative reaction was strong, and people didn't like the apparently unorganized and unstructured evolution. But there are many examples, such as the Linux libraries, STL, and Microsoft foundation libraries which have all proved the naysayers wrong.”
“Now people have gotten over their prejudices about doing software design this way. I think Object-Oriented Programming has been very successful, but [historically] it took a long time to be accepted because of the orthodoxy of corporate organizations and their built-in infrastructure. Essentially, this emerged as a very good example of how you could trade off what looked like a negative connotation - too much memory would be required, too much processor power - and create, instead, a positive connotation through the realization that well-designed objects provide an ability to construct a system in a context-free manner.”
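The "context-free" property Jones credits to well-designed objects can be shown in miniature. The hypothetical component below, written in the STL spirit, depends only on its type parameter and its own state - no globals, no knowledge of its caller - so it composes into any system unchanged:

```cpp
#include <cstddef>
#include <map>

// Illustrative sketch of a context-free component: Histogram carries
// all of its own state and makes no assumptions about the system
// around it, so it can be verified once and reused anywhere.
template <typename T>
class Histogram {
public:
    void add(const T& value) { ++counts_[value]; }

    std::size_t count(const T& value) const {
        auto it = counts_.find(value);
        return it == counts_.end() ? 0 : it->second;
    }

private:
    std::map<T, std::size_t> counts_;
};
```

The memory and CPU cost of such abstractions is exactly the "negative connotation" Jones describes; the trade is that the object behaves identically in every context it is dropped into.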
Beyond the issues of technology and the specification, however, Jones says you ultimately have to face the most daunting issue of all - verification.
He says, “I think there are a number of examples here. IBM regularly builds large SoCs using IP that's all done, finished, and proven in silicon. However, it still takes them 18 months to get a chip out. You would think that with the quality of the IP being used, the process would be much faster. But on the last three big chips I have worked on, the verification has taken 6x the time, computer resources, and man-hours that the design itself needed.”
“Years ago, I was involved in the design of a huge 3D graphics machine with embedded DRAM. Verification was at least 90 percent of the effort on that project. Today's systems are hugely complex, the IP is not well encapsulated, and it's not separable from the overall design. It may be possible to segregate the IP, but it involves systems not currently being used.”
According to Jones, all of this verification mess could be alleviated if specification, design, and most importantly, IP were to be developed context-free.
“There's a common message here. The reason why verification has exploded is that things don't talk together very well. You need huge amounts of glue logic and there are always communication bottlenecks. It's a very unnatural way to enable isolation, if you need to do true hierarchical verification. The consequence of this is that the behavior of the IP can change depending on how it's being used. Simply isolating IP or signing off on pieces of IP is not enough.”
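One way the encapsulation Jones describes is commonly approached - offered here as an illustration, with hypothetical names, rather than as his proposal - is to pin an IP block's observable behavior to a narrow, fully specified interface. Anything behind the contract can then be verified once, in isolation, and cannot change meaning with its surroundings:

```cpp
#include <cstdint>

// Hypothetical contract for a checksum IP block: the interface fully
// defines the block's observable behavior, so any conforming
// implementation can be verified in isolation, independent of context.
struct ChecksumUnit {
    virtual ~ChecksumUnit() = default;
    virtual void reset() = 0;
    virtual void feed(std::uint8_t byte) = 0;
    virtual std::uint8_t value() const = 0;
};

// One conforming implementation: a simple XOR checksum standing in
// for a real CRC datapath.
class XorChecksum : public ChecksumUnit {
public:
    void reset() override { acc_ = 0; }
    void feed(std::uint8_t byte) override { acc_ ^= byte; }
    std::uint8_t value() const override { return acc_; }

private:
    std::uint8_t acc_ = 0;
};
```

When IP is signed off only at this boundary, glue logic shrinks to wiring between contracts - which is precisely the hierarchical isolation Jones argues today's designs lack.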
“Think of the human brain. This is a device whose I/O bandwidth is quite limited. It reacts slowly to inputs. Yet, the brain has evolved a number of complex heuristic algorithms, which allow it to behave as if it had much higher bandwidth. The brain has been developed context free. It has the ability to overcome its basic constraint, which is the physical nature of the body itself.”
“[...] that process and figuring out how to push that complexity down into the hierarchy.”
“Today's systems are so complicated that we're being stifled by our limited ability to do these things and I think we are considering the wrong costs. The methodologies are dictating what we can do in the marketplace.”
“The imperative here is balance. What happens if you go through a top-down decomposition is that you're going to have to build a very complex system to figure it out. You cannot be heavily optimized everywhere because local optimization, rather than relieving the pressure, creates wholly unexpected constraints. I think that [Stanford University Professor] Donald Knuth said it best when he said that premature optimization is the root of all evil.”
Root of all evil, indeed!
Clearly, my conversation with Mark Jones was an open-ended one - as well as one that involved philosophies difficult at times to comprehend - as he laid out multiple ideas and orthogonal (a great word!) strategies. Ultimately, I came away with the impression that perhaps someday, with help from Jones and other freethinkers, designers will be allowed to build balanced systems that can evolve organically, rather than being dictated by the whims of the methods used to create them.
-- Peggy Aycinena, EDACafe.com Contributing Editor.