The Dominion of Design
The Future of Quantum Computing is Counted in Qubits
May 2nd, 2018 by intel
Intel Explainer: ‘Tangle Lake,’ Intel’s 49-Qubit Processor
At CES 2018 in January, Intel CEO Brian Krzanich predicted that quantum computing will solve problems that today take months or years for our most powerful supercomputers to resolve. Krzanich then unveiled Intel’s 49-qubit superconducting quantum test chip, code-named “Tangle Lake.”

Quantum computing is heralded for its potential. Leaders in scientific and industrial fields are hopeful quantum computing will speed advances in chemistry, drug development, financial modeling and climate change.

More: Quantum Computing at Intel | A Quantum Computing Primer | More Intel Explainers

Quantum computations use quantum bits (qubits), which can be in multiple states at the same time – quite different from digital computing’s requirement that data be in either one state or another (0 or 1, for example). Running a large number of calculations in parallel opens a future where complex problems can be solved in much less time on a quantum computer than on a traditional digital device.

Read the rest of The Future of Quantum Computing is Counted in Qubits

Intel’s Drone and Artificial Intelligence Technology to Help Restore China’s Great Wall
May 2nd, 2018 by intel
What’s New: Intel and the China Foundation for Cultural Heritage Conservation have formed a partnership to protect and restore the Great Wall of China. “Using drones, we are able to inspect multiple aspects of the structure, including areas that are quite inaccessible. We continue to be excited about the future of inspections being automated all the way from drone data capture to data processing, analysis and insights. We look forward to leveraging our technology to aid in the preservation of more world heritage sites in the future.”

Why It Matters: The Great Wall’s Jiankou section is among its most famous stretches, as well as its steepest. Set in thick vegetation, this section of the wall, which dates to the third century B.C., has weathered naturally and requires repair. Intel’s AI and Falcon 8+ drone technologies will be used to remotely inspect and map the Jiankou section, which has been difficult for repair teams to reach.

Intel Saffron AI Speeds Issue Resolution for Manufacturing, Software and Aerospace
April 30th, 2018 by intel
What’s New: Intel today released the Intel® Saffron™ AI Quality and Maintenance Decision Support Suite – a suite of artificial intelligence (AI)-powered software applications that use associative memory learning and reasoning to facilitate faster issue resolution.

“Customers including Accenture, a major aircraft manufacturing company and even Intel are already receiving tremendous value from Intel Saffron AI software. It digs into disparate data sources to surface customers’ best practices, providing them with the meaningful insights needed to resolve issues faster.”

What It Includes: The Intel Saffron AI Quality and Maintenance Decision Support Suite comprises two software applications:
One Use Case: Accenture*, a global professional services company, is already using Intel Saffron AI to help clients resolve issues faster and reduce wasted effort in product testing and defect resolution.

Read the rest of Intel Saffron AI Speeds Issue Resolution for Manufacturing, Software and Aerospace

Autonomous Driving – Hands on the Wheel or No Wheel at All
April 11th, 2018 by intel
Intel Explainer: 6 Levels of Autonomous Driving
Vehicles on the road today are getting smarter, safer and more capable. But even the newest vehicles vary widely in their advanced driver assistance systems (ADAS), which aim to enhance safety and make driving more comfortable. Add to that the global race to fully self-driving vehicles, which will take the driver out of the equation completely.

Vehicles can be categorized according to the ADAS features they offer, and the Society of Automotive Engineers defines six levels of driving automation, explained here.

Read the rest of Autonomous Driving – Hands on the Wheel or No Wheel at All

Intel Creates Neuromorphic Research Community to Advance ‘Loihi’ Test Chip
March 1st, 2018 by intel
Members Will Receive Resources for Exploring Neuromorphic Computing Use Cases
By Dr. Michael Mayberry

This week, we hosted the Neuro Inspired Computational Elements (NICE) workshop at our Oregon campus with the goal of bringing together researchers from different scientific disciplines to discuss and explore the development of next-generation computing architectures, including neuromorphic computing. Today at the workshop, we provided an update on Intel’s neuromorphic research and announced a collaborative research initiative to encourage experimentation with our Loihi neuromorphic test chip. Here’s a status of our neuromorphic computing efforts and details on this new research community.

Where We Are

Fabrication and packaging of our Loihi test chip were completed in early November, and we began power-on and validation. We were pleased to find 100 percent functionality, a wide operating margin and few bugs overall. The small-scale demonstrations we had prepared on our emulator worked as expected on the real silicon, though, of course, they ran orders of magnitude faster.

Our equivalent of a “Hello World” application is recognizing a 3-D object from multiple viewing angles, structured after the COIL-20 example from Columbia University. As measured in our lab, this particular application uses less than 1 percent of Loihi, learns the training set in seconds and consumes tens of milliwatts.

We shared Loihi architectural details in a paper that IEEE Micro recently published, and we presented those details and several demos to NICE workshop attendees this week. We have delivered the first developer systems to select research collaborators who are working on a variety of applications, including sensing, motor control and information processing. Software development tools remain one of our focus areas, and we’re looking forward to running much larger-scale applications in conjunction with research collaborators.
As we learn more together, we expect progress to accelerate, and that’s where today’s announcement comes in.

Read the rest of Intel Creates Neuromorphic Research Community to Advance ‘Loihi’ Test Chip

CES Panel: Autonomous Vehicles in the Cities of Tomorrow
January 18th, 2018 by Sanjay Gangal
Where do autonomous vehicles stand today, and when will they be ready? How will they operate in connected cities, and will consumers be ready to use them? Listen to this panel of experts working on autonomy share their perspectives on the current and future state of self-driving technology.
Read the rest of CES Panel: Autonomous Vehicles in the Cities of Tomorrow

CES 2018 Press Conference: Qualcomm – Inventing the Path to 5G
January 17th, 2018 by Sanjay Gangal
Qualcomm President Cristiano Amon is at CES to showcase the company’s latest inventions that are leading the world to 5G in industries from IoT to automotive.
Read the rest of CES 2018 Press Conference: Qualcomm – Inventing the Path to 5G

Median Income of Electrotechnology, IT Professionals Rises to $130,000 for Largest Gain in Past Five Years
November 17th, 2015 by Sanjay Gangal
Article source: IEEE

Median income for electrotechnology and information technology professionals jumped by more than 4 percent in 2014, the largest increase in the past five years, according to the 2015 IEEE-USA Salary & Benefits Survey. Median incomes from primary sources (salary, commissions, bonuses and net self-employment income) for U.S. IEEE members working full-time in their primary area of technical competence (job specialty) rose from $124,700 in the 2013 tax year to $130,000 in 2014. The 4.25 percent increase comes a year after median income rose by its smallest percentage in the past five years, 0.56 percent. The results are based on survey responses from 10,215 people.

Here are median incomes since 2009:
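The headline 4.25 percent figure follows directly from the two reported medians; a quick sketch (the dollar amounts are from the survey above, the helper function is just for illustration):

```python
# Percent change in median income, using the figures reported above:
# $124,700 for tax year 2013 and $130,000 for 2014.
def percent_change(old, new):
    """Return the percentage increase from old to new."""
    return (new - old) / old * 100

increase = percent_change(124_700, 130_000)
print(f"{increase:.2f}%")  # prints "4.25%"
```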
Those employed in communications technology once again enjoyed the highest median earnings ($150,000), followed by circuits and devices ($143,008) and signals and applications ($141,062).

Asymptotic or Divergent: Three Verification Managers Look to the Future at DAC
May 5th, 2014 by Sanjay Gangal
What would the Design Automation Conference (DAC) be without a verification panel or two? This year, one in particular takes a look at a variety of verification technologies. Titled “The Asymptote of Verification,” it will be moderated by Bryon Moyer of EE Journal and held Monday, June 2, from 5:15 p.m. until 6 p.m. in the Pavilion (Booth #313) on the exhibit floor.

Proposed and organized by Graham Bell of Real Intent, the panel is made up of users: Brian Hunter of Cavium, Holger Busch of Infineon Technologies and Bill Steinmetz of NVIDIA. Special thanks go to Breker, OneSpin and Real Intent for securing these three experts, who will share their real-world experiences with formal verification, static RTL analysis and graph-based verification. Oh yes, they are users of Breker, OneSpin and Real Intent tools.

Read the rest of Asymptotic or Divergent: Three Verification Managers Look to the Future at DAC