November 24, 2003
Were you to ask your prototypical software guy, he'd quickly point out the error of your ways if you made the laughable suggestion that multi-tasking and multi-threading are one and the same.
He'd say (matter-of-factly), “Multi-tasking is when an operating system makes it look like the computer is running multiple processes at the same time - separate programs, or two instances of the same program. When two of those processes are separate instances of the same program, multi-threading allows one copy of that program in memory to support both instances without their conflicting with each other. Without multi-threading, you'd need two separate copies of the same program in memory at the same time to support those two separate instances.”
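The distinction the software guy is drawing can be sketched in a few lines of Python (a hypothetical illustration, nothing MIPS-specific): two threads run the same code against one in-memory copy of the program's state, with a lock keeping them from trampling each other.

```python
import threading

shared_total = 0                 # one copy in memory, visible to both threads
lock = threading.Lock()

def tally(amount):
    # Both threads execute the same code against the same in-memory
    # state; the lock keeps their updates from conflicting.
    global shared_total
    with lock:
        shared_total += amount

threads = [threading.Thread(target=tally, args=(10,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(shared_total)  # 20: two "instances" shared a single program image
```

Two separate processes, by contrast, would each get their own copy of `shared_total`, and neither would see the other's update.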
That's when you'd put your fingertips together, tap yourself on the forehead (hard), and say out loud, “Of course! What was I thinking? Right! Of course, multi-tasking isn't multi-threading. I knew that!”
Then you'd tiptoe off to your cubicle, secretly pick up the phone and call Tom Peterson at MIPS Technologies to start the whole conversation all over again. Tom's the Director of Product Marketing at MIPS, responsible for the strategy and marketing of the company's synthesizable 32- and 64-bit product lines (including system software). Surely Tom would be able to explain multi-threading. After all, MIPS just made a big announcement last month introducing the MIPS MT ASE (Application Specific Extension), a new multi-threading (MT) extension to the company's “signature” architecture.
Tom would try: “Multi-threading is the ability to take advantage of inherent parallelism commonly found in many applications. Designers are able to share CPU compute resources across multiple threads of data such that concurrent data streams can run in less time and/or the same amount of work can be completed by a smaller number of processors. By introducing an MT capability on an industry-standard architecture such as MIPS, companies are ensured of maintaining compatibility with the software already optimized for their application.”
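Tom's claim that concurrent data streams can run in less time is easy to sketch with OS-level threads in Python (an illustrative stand-in for hardware multi-threading; the stream names and timings are made up): while one stream stalls, another makes progress, so two streams finish in roughly the time of one.

```python
import concurrent.futures
import time

def process_stream(name, items):
    # Stand-in for one data stream's work; each sleep models a stall
    # (e.g. waiting on memory or I/O) that another thread could hide.
    for _ in range(items):
        time.sleep(0.05)
    return name

start = time.monotonic()
with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(process_stream, ["audio", "video"], [3, 3]))
elapsed = time.monotonic() - start

print(results)  # ['audio', 'video']
# elapsed lands near one stream's 0.15 s, not the serial sum of 0.30 s
```

Hardware MT applies the same idea one level down: the core's execution units are shared across thread contexts, so stalls in one thread are hidden by progress in another.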
“We are announcing the MT ASE technology right now, because MIPS Technologies constantly works with lead customers on architectural enhancements and core products to ensure our roadmap solves the design challenges of our customers. Accordingly, due to the ongoing increase in SoC design costs, OEMs and silicon vendors constantly need new ways to reduce costs while delivering new, compelling products to market. The customer response so far has indicated to us that they love it.”
Tom would offer an example, audio processing on a set-top-box: “Threading can allocate bandwidth on the standard CPU, but still deal with the real-time requirements of audio processing.”
But let's say, hypothetically, that you'd been staring at your wall calendar the whole time Tom had been talking and that you found yourself short the courage needed to ask him to start all over again, because you still didn't quite get it. Instead you would hint at your dilemma by asking a question like, “So, how much education is required on the part of your customers to actually understand all of this?”
You would wait, hoping to hear that the answer was, “Lots!”
Instead Tom would say, “One of the advantages of the MIPS IP cores is the ease of use and the out-of-the-box user experience. By developing complete reference design flows with partners such as Cadence, Synopsys, and others, as well as developing configuration and build-time options, we have minimized the learning curve for our customers to help accelerate the product's time to market. In addition, we provide extensive options that give the designer flexibility in configuring the implementation to match the demands of the target application.”
Somewhere in there you'd think you heard Tom reference a “classic customer arrangement” between MIPS and QuickLogic. So, still staring hard at your wall calendar, you'd ask him to explain the “classic” part.
Tom's answer would come back to QuickLogic's QuickMIPS devices, which pair a MIPS core with embedded programmable logic, “with the ability to add instructions to the standard instruction set for even greater flexibility.”
Suddenly a thought would occur to you. Why not ask Tom if he could refer you to someone at QuickLogic, who would in turn allow you to go back to Square One and ask (again), “So what's up with this multi-threading thing?”
Tom, never suspecting your ongoing befuddlement, would kindly send you to talk to Ian Ferguson, Vice President and General Manager for QuickMIPS Products at QuickLogic. You'd ply yourself with caffeine in anticipation of another intense phone call, ring through to Ian, and hope that he would succeed in getting through to you where others had failed. This time, undoubtedly because of the caffeine, you would not be disappointed.
Ian would say, “Here is a good analogy - consider that you have a clerk in your Accounting Department, and the clerk is asked to look at expense reports, tally up the totals, and approve them to be paid. If you're in charge of Accounting, you want to keep feeding expense reports to that clerk, or you're just spending money on the clerk's salary, but not getting any useful work for the money spent.”
“When the clerk is working slowly, it's relatively easy for someone from another department to bring a batch of work over and put it in the Accounting clerk's inbox. However, if the clerk in Accounting starts to work faster, it's harder and harder to keep the inbox filled with work. The clerk will be sitting and waiting for paperwork to arrive from another department, unless employees from other departments start running to bring batches of work over to the clerk's inbox. However, if the Accounting clerk is really fast, it's simply not possible for employees from other departments to run fast enough to bring batches of work into Accounting to keep the clerk's inbox filled.”
“So, you assign the Accounting clerk to handle more tasks - more 'threads' - than just processing expense reports. You arrange to fill the clerk's inbox with tasks to keep the clerk productive when there are gaps between batches of expense reports. The clerk in Accounting could be doing payroll, for instance, when there are no expense reports to process. In fact, you could prioritize the work - expense reports always come first, but if there are none of those, the clerk should flick over and start doing some useful work on the payroll until the next batch of expense reports is placed in the inbox.”
Like the clerk, Ian would conclude, a multi-threaded CPU keeps doing useful work at all times. “You're reducing the amount of dead time, and maximizing use of the CPU.”
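Ian's prioritized-inbox scheme maps naturally onto a priority queue. A minimal Python sketch (the job names are invented for illustration):

```python
import queue

# Priority 0 jobs (expense reports) always drain before priority 1 (payroll).
inbox = queue.PriorityQueue()
for job in ["expense-1", "expense-2"]:
    inbox.put((0, job))
for job in ["payroll-1", "payroll-2", "payroll-3"]:
    inbox.put((1, job))
inbox.put((0, "expense-3"))  # a late expense report still jumps the queue

done = []
while not inbox.empty():
    _, job = inbox.get()
    done.append(job)

print(done)  # every expense report comes out before any payroll task
```

In hardware MT the "clerk" is the core's execution pipeline and the priorities are thread scheduling policies, but the principle is the same: never let the worker idle while lower-priority work is available.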
-- Peggy Aycinena, EDACafe.com Contributing Editor.