
Safety is why Intel acquired Mobileye

20 June 2018


A presence in 27 million cars helped too

A year ago, Intel acquired Mobileye for $15.3 billion, and Intel's CEO has explained why it happened. Safety is on the company's mind, but Brian Krzanich was honest enough to acknowledge that Mobileye fills a large gap that Intel had before the acquisition.

Speaking at the Automobil Elektronik automotive electronics congress in Ludwigsburg, Germany, Krzanich addressed the audience on the need for an end-to-end autonomous driving platform.

Intel acquired Mobileye to save lives and, of course, to strengthen its presence in the future of ADAS and self-driving, a lucrative area.

The Mobileye acquisition completes this end-to-end platform: Intel has quite robust technology with its Xeon processors, which can handle the training necessary for self-driving to succeed, while Mobileye has the EyeQ SoC that covers the ADAS (advanced driver assistance systems) part.

Data is a pile of mess until trained

Krzanich states that data is the most valuable resource; if you ask us, we would pick clean water first, but let's assume he was thinking about our industry. Here at Fudzilla we strongly believe that data is a big, unstructured hoarder's mess, which is why everyone today is rushing into AI and machine learning to make any sense of it.

Equip a training car with eight cameras, lidar and radar, drive it around, and you will get huge amounts of data. Once you have the data, you need to process it, in Intel's case with Xeon and Xeon Phi processors, to run the training phase. What gets trained then needs to run inference, and in Intel's case that means an ADAS chip that recognizes cars, pedestrians, velocity, traffic signs, animals and so on.
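To make that split concrete, here is a minimal sketch of the workflow Krzanich describes: compute-heavy training offline on datacenter-class hardware, then a compact trained model running inference in the car. The function names, the toy logistic-regression "detector" and the labels are our own illustration, not Intel's or Mobileye's actual stack.

```python
import numpy as np

# --- Data collection (in the car): cameras, lidar and radar produce raw frames ---
def collect_drive_data(num_frames=1000, num_features=64):
    """Stand-in for hours of sensor logs; real data would be camera/lidar/radar frames."""
    rng = np.random.default_rng(0)
    frames = rng.normal(size=(num_frames, num_features))
    # Hypothetical labels: 1 = "pedestrian ahead", 0 = "clear road"
    labels = (frames[:, 0] + frames[:, 1] > 0).astype(np.float64)
    return frames, labels

# --- Training phase (offline, on Xeon-class machines) ---
def train_model(frames, labels, epochs=200, lr=0.1):
    """Tiny logistic-regression 'detector' standing in for a real neural network."""
    weights = np.zeros(frames.shape[1])
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-(frames @ weights)))
        grad = frames.T @ (preds - labels) / len(labels)
        weights -= lr * grad
    return weights

# --- Inference phase (online, on an ADAS SoC such as EyeQ) ---
def infer(weights, live_frame):
    """Run the trained model on a single live frame and return a detection score."""
    return 1.0 / (1.0 + np.exp(-(live_frame @ weights)))

if __name__ == "__main__":
    frames, labels = collect_drive_data()
    weights = train_model(frames, labels)   # offline, compute-heavy
    score = infer(weights, frames[0])       # online, lightweight
    print(f"Detection score for first frame: {score:.2f}")
```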

Krzanich mentioned that Mobileye has sold its solutions into 27 million cars and has a long list of clients. Both Intel and its now-integrated Mobileye unit want to focus on Responsibility-Sensitive Safety (RSS). RSS formally defines what it means to drive safely and where human-like assertive driving crosses into unsafe driving, adding a non-proprietary, well-defined safety and transparency layer that verifies the decisions proposed by any developer's driving policy (planning) software, which are typically probabilistic and opaque.

Intel's idea is that RSS should keep you safe all the way from A to B. Driving comes down to one key thing: trying not to hit anything or anyone while staying on the road.
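As an illustration of how "don't hit anything" can be turned into a formal rule, the published RSS paper defines a minimum safe following distance for two cars in the same lane. The sketch below computes that distance; the default parameter values (reaction time, acceleration and braking limits) are assumptions for the example, not Mobileye's calibrated numbers.

```python
def rss_safe_longitudinal_distance(v_rear, v_front, rho=0.5,
                                   a_max_accel=3.5, b_min_brake=4.0, b_max_brake=8.0):
    """Minimum safe following distance (metres) per the RSS longitudinal rule.

    v_rear / v_front: speeds of the rear and front car in m/s.
    rho: response time of the rear car in seconds (assumed value).
    a_max_accel: worst-case acceleration of the rear car during the response time.
    b_min_brake: the gentlest braking the rear car commits to applying afterwards.
    b_max_brake: the hardest braking the front car might apply.
    The default values here are illustrative assumptions, not calibrated figures.
    """
    v_rear_after = v_rear + rho * a_max_accel   # rear car speed after the response time
    d = (v_rear * rho
         + 0.5 * a_max_accel * rho ** 2
         + v_rear_after ** 2 / (2 * b_min_brake)
         - v_front ** 2 / (2 * b_max_brake))
    return max(d, 0.0)

# Example: both cars doing 20 m/s (72 km/h); keep at least this many metres behind.
print(round(rss_safe_longitudinal_distance(20.0, 20.0), 1))
```

If the actual gap ever drops below this value, RSS says the following car is in a dangerous situation and must brake to restore it, which is exactly the kind of verifiable, non-probabilistic check the layer is meant to provide.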

We need to come back to Mobileye and its EyeQ solutions that are already shipping in many of these 27 million cars. Back in 2014, Mobileye announced the EyeQ3, a 2.5 W 40 nm CMOS chip delivering 0.256 TOPS (trillion operations per second), which enabled an industry-first camera-only AEB (autonomous emergency braking), along with traffic sign detection, holistic path planning, road profile reconstruction and suspension adjustment.
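To give a feel for the kind of decision a camera-only AEB feature has to make, here is a toy time-to-collision check. It is a generic textbook heuristic, not Mobileye's actual EyeQ3 algorithm, and the thresholds are invented for the example.

```python
def ttc_seconds(distance_m, closing_speed_mps):
    """Time to collision: how long until we reach the obstacle at the current closing speed."""
    if closing_speed_mps <= 0:          # not closing in, so no collision course
        return float("inf")
    return distance_m / closing_speed_mps

def aeb_decision(distance_m, ego_speed_mps, obstacle_speed_mps,
                 warn_ttc=2.5, brake_ttc=1.2):
    """Return 'none', 'warn' or 'brake' based on time-to-collision thresholds (illustrative)."""
    ttc = ttc_seconds(distance_m, ego_speed_mps - obstacle_speed_mps)
    if ttc < brake_ttc:
        return "brake"
    if ttc < warn_ttc:
        return "warn"
    return "none"

# A stationary obstacle 25 m ahead while driving at 15 m/s (54 km/h): TTC is about 1.7 s
print(aeb_decision(25.0, 15.0, 0.0))    # -> "warn"
```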

All these new features were added on top of the lane departure warning, active high-beam control, traffic sign recognition, adaptive cruise control, traffic jam assist and forward collision warning already integrated in the EyeQ1 and EyeQ2 SoCs manufactured and shipped before. Most of these features are very likely in the 2017 European Honda Civic, and below is a video of what Honda calls Honda Sensing, most likely based on EyeQ solutions. Mobileye has listed Honda as a partner, but Honda has never confirmed which SoC enables the Honda Sensing features.

Mobileye's EyeQ3 should be enough to support autonomous Level 2, while the EyeQ4, a 28 nm chip that started shipping in 2018, is expected to reach Level 3 support.

One thing is very clear: Intel has a good shot with Mobileye, a technology that is gradually expanding its software and hardware capabilities. With Intel behind it, Mobileye can build a solid automotive solution.
