SAN JOSE, Calif.--(BUSINESS WIRE)--Feb. 4, 2002--Philips Semiconductors, a division of Royal Philips Electronics (NYSE: PHG; AEX: PHI), and MobilEye BV, a privately held company, today announced a strategic partnership to manufacture a highly integrated System-on-Chip solution for automotive driver assistance applications, taking a first step towards the development of autonomous driving systems.
Philips Semiconductors and MobilEye will leverage their respective expertise in IC creation and driver assistance systems to develop an ASIC design for applications such as Adaptive Cruise Control (maintaining a safe headway distance in cruise-control mode), Lane Departure Warning, Forward Collision Warning, and sensory fusion applications for collision mitigation and active safety. Following Philips Semiconductors' leading role in the Safe-by-Wire and FlexRay consortia, this announcement strengthens the company's position as a leader in the development of automotive safety, autonomous driving and X-by-Wire systems.
"This is a great development in bringing active safety devices into the car," said Pascal Langlois, vice-president for Philips Semiconductors' Global Market Segment Automotive. "We are delighted to have teamed up with MobilEye to really speed development of cost-effective electronic safety systems which will ultimately make the driving experience far safer. Additionally, the ASIC SoC represents some key technological challenges which Philips Semiconductors and MobilEye are happy to resolve."
"We are jointly developing an ASIC SoC of extreme importance for the automotive market," said Ziv Aviram, president and chief executive of MobilEye BV. "Philips Semiconductors is a leading player in automotive electronics and ASIC SoC and we were extremely keen to work with them on this project. Together we are confident of being able to bring this key automotive safety development to the mass market, with fast production times."
The System-on-Chip solutions will deliver computationally intensive real-time applications for visual recognition and scene interpretation, customized for use in intelligent vehicle systems. The chip architecture is designed to maximize cost performance by placing a fully-fledged application, such as a low-cost version of Adaptive Cruise Control driven by a single video source, on a single chip. Using its sensors, the system can make intelligent interpretations of the visual field, such as detecting vehicles, pedestrians and road signs, to provide an intelligent driver assistance system.
Even though the chip architecture is designed to carry a fully-fledged application on a single chip, it is sufficiently flexible and programmable to accommodate a wide range of visual processing applications beyond the automobile.
The pattern classification module is application-specific yet based on general principles, so it can accommodate other classes of objects, such as human faces and pedestrians. Automotive applications provide a rich context for deploying this architecture because of the growing need for sensors that can make intelligent interpretations of the visual field, such as detecting vehicles, pedestrians and road signs. Other, non-automotive applications, such as home entertainment and surveillance systems, are expected to become relevant soon afterwards.
The System-on-Chip's functional capabilities include proprietary pattern identification techniques for segmenting vehicles out of the background scene under static and dynamic conditions; visual motion analysis techniques for isolating dynamically moving patterns, such as passing and crossing vehicles, and for estimating the host vehicle's yaw and pitch rates; and image processing techniques for lane following and road path prediction. Unlike conventional approaches, the architecture is designed to deliver this full range of capabilities from a monocular (single camera) video stream, in either the visible or the infrared spectrum. At the same time, the chip architecture is designed to accept multiple sensory inputs, such as millimeter-wave or laser radar vehicle tracks, for sensory fusion applications.
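The staged, monocular-first design described above can be illustrated in software. The following Python sketch is purely hypothetical: the release describes these capabilities as hardware modules on the chip, and every function name and data shape here is an illustrative assumption, not a MobilEye interface.

```python
# Hypothetical sketch of the described processing stages. The real SoC
# implements these in dedicated hardware; names and shapes are illustrative.

def segment_vehicles(frame):
    """Pattern identification: separate vehicle-like regions from background.

    Placeholder criterion: keep regions whose 'contrast' score exceeds 0.5.
    """
    return [r for r in frame["regions"] if r["contrast"] > 0.5]

def motion_analysis(prev_frame, frame):
    """Visual motion analysis: estimate host-vehicle yaw and pitch rates
    from consecutive frames (here, a trivial frame-to-frame difference)."""
    return {
        "yaw_rate": frame["yaw"] - prev_frame["yaw"],
        "pitch_rate": frame["pitch"] - prev_frame["pitch"],
    }

def fuse(vision_tracks, radar_tracks):
    """Sensory fusion: merge monocular vision tracks with radar
    (e.g. millimeter-wave or laser) vehicle tracks, keyed by track id."""
    fused = {t["id"]: t for t in radar_tracks}
    for t in vision_tracks:
        fused.setdefault(t["id"], t)
    return list(fused.values())
```

In this sketch, radar and vision tracks that share an id are treated as one object, which is one simple way to read the release's "sensory fusion" of vehicle tracks.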
The System-on-Chip architecture offers a high level of cost performance, targeting high-volume penetration of the growing market for intelligent on-board driving assistance systems. The architecture includes multiple programmable ARM946 microprocessor cores for general-purpose computation and application-level programming, and four application-specific modules for image pre-processing, motion analysis, pattern recognition, and lane following. It also includes 2.2 Mbit of on-chip SRAM for efficient image memory management.
To maximize cost performance, peripheral circuits are integrated on-chip, including dual CAN, PROM, and SDRAM controllers, parallel I/O, and image data input units. The System-on-Chip will be manufactured in leading-edge 0.18-micron CMOS technology, as installed in several Philips-owned wafer fabs. The product will receive full cabin-grade automotive qualification.
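Taken together, the resources named in the last two paragraphs can be collected into a simple specification record. This Python snippet only restates the release's figures; the record structure and field names are illustrative assumptions.

```python
# Summary of the SoC resources as stated in the release.
# Module and peripheral names come from the text; the dict layout is ours.
soc = {
    "cpus": "multiple programmable ARM946 cores",
    "modules": ["image pre-processing", "motion analysis",
                "pattern recognition", "lane following"],
    "on_chip_sram": "2.2 Mbit",
    "peripherals": ["dual CAN controllers", "PROM controller",
                    "SDRAM controller", "parallel I/O",
                    "image data input units"],
    "process": "0.18 micron CMOS",
    "qualification": "full cabin-grade automotive",
}
```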
First silicon samples are scheduled for release for testing by the end of 2002, with the target of deployment in 2005 car models.
About Philips Semiconductors
Philips Semiconductors, with revenues of US$6.3 billion in 2000, is a world leader in silicon systems and standard products for wireless communications, digital entertainment, computing and automotive applications. The organization designs, develops and manufactures silicon solutions based on its innovative Nexperia(TM) architecture to create living technology for its customers building products, for service providers using those products, and for consumers enjoying the resulting products and services. For more information: www.semiconductors.philips.com.
About MobilEye BV.
MobilEye BV is privately held and headquartered in the Netherlands, with subsidiaries in Jerusalem, Israel, and Mountain View, Calif., in the US. The company was founded by Mr. Ziv Aviram and Prof. Amnon Shashua of the Computer Science department of the Hebrew University of Jerusalem; Prof. Shashua is currently on sabbatical as a visiting professor at Stanford University in the US.
The company has developed a number of leading proprietary algorithms and reference platforms for processing a monocular visual stream into visual interpretations of complex environments. Much of the core technology is based on computer vision and machine learning algorithms that borrow heavily from principles of human perception and visual interpretation. MobilEye's algorithms run today on a proprietary custom board installed at auto maker sites in the US, Europe and Japan for 2005 car model qualifications.
To learn more about MobilEye BV, visit
Philips Semiconductors
Paul Morrison, 408/474-5065 (USA)
Robyn Kao, +886 2 2134 2968 (Asia Pacific)
or
The Hoffman Agency
Natalie Kessler, 408/975-3032 (USA)
or
Warman & Bannister
Birgit van Gellecom, +31 40 214 60 14 (Europe)