
Automated driving will support "future safety" as a matter of course

"We will make humans and vehicles

cooperate in real time.

Omron will make sensors that

only Omron can create

by incorporating AI."

MOBILITY

Physical Sensing Lab., Sensing Technology Research Center, Technology and Intellectual Property H.Q.

Takashi Iketani

The future mobility that automated driving aims for is "a mobile private room".
You can enjoy going wherever you want, whenever you want, on a route of your choice, while reading a book, watching a movie, having dinner, or enjoying conversation with your loved ones...
Mobility will become more fun and bring you value that was not imaginable before.

In redefining automated driving for the future to come, one of the answers we have reached is that "cars can understand humans".

"Driver monitoring", a world-first technology that only Omron can provide, is indispensable for making the inevitable switching between automated and manual driving safe.

"We want to eliminate accidents caused by driver's negligence"


Although vehicle safety performance has improved, the number of traffic accident deaths worldwide is still increasing and is projected to reach 1.9 million in 2020. In fact, about 80% of those deaths are caused by human factors such as delays in sensing danger or errors in judgment by drivers.
It is believed that driving operations on expressways can be entrusted to automated driving by 2018. However, big challenges remain in how "safely" driving can be handed back from automated to manual control.

With the number of accidents still increasing, Takashi Iketani keeps asking a fundamental question: "What is automated driving?"


Worldwide, more than 3,400 people lose their lives in traffic accidents every day, and tens of millions of people are injured or disabled in traffic accidents (according to WHO)

"For a car to move, the relationship between the car, people, and the surroundings is essential. From the point of view of automated driving, people tend to focus on the relationship between the "car and surroundings", how the vehicle should adapt to the external environment to move. But Omron's development team believed that we should focus on the relationship between "people and cars" even in automated driving. The reason is, no matter how much automated driving advances, there still has to be cooperation between people and cars. Then we came up with the idea that we can get closer to solving the problem of traffic accidents as a social issue, by reducing the human factors in accidents by sensing the driver and letting the vehicle understand the human. The idea triggered the development of "driver monitoring"".

"Driver monitoring" is the world's first technology that captures human driver's motions and conditions based on the camera images of the driver and classifies the degree of concentration of the driving real-time.

The monitoring system can instantaneously judge the driver's status, for example as level 1 if he or she is inattentive, level 2 if operating a smartphone, or level 3 if drowsy or panicked. By linking that judgment to vehicle control, the system can warn the driver or, rather than switching from automated to manual driving, stop the vehicle safely to avoid the risk, so that traffic accidents caused by human factors can be prevented.
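
As a rough sketch of how such a level judgment might be linked to vehicle control, the following Python fragment maps a driver state to a control decision. The state labels, levels, and actions are illustrative assumptions based on the examples above, not Omron's actual interface.

```python
from enum import IntEnum

class DriverState(IntEnum):
    # Hypothetical labels; the article cites inattention, smartphone use,
    # and drowsiness/panic as example conditions.
    ATTENTIVE = 0
    INATTENTIVE = 1
    USING_SMARTPHONE = 2
    DROWSY_OR_PANICKED = 3

def decide_vehicle_action(state: DriverState) -> str:
    """Link the estimated driver state to vehicle control: warn the driver,
    or keep automated control and stop safely instead of handing driving
    back to a driver who is not fit to take over."""
    level = int(state)  # here the risk level simply mirrors the state label
    if level == 0:
        return "hand over control as planned"
    if level == 1:
        return "warn the driver before handover"
    if level == 2:
        return "issue a strong warning and postpone handover"
    return "stay in automated driving and stop the vehicle safely"

# Example: a drowsy or panicked driver is never handed manual control.
print(decide_vehicle_action(DriverState.DROWSY_OR_PANICKED))
```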

"How we can overcome the limit of on-vehicle environment"


"Driver monitoring" senses an image of driver's upper body. A big hurdle lies ahead because the sensing must be real-time and continuous without any interruption, as it affects driving the vehicle.

By combining time-series deep learning with image sensing technologies, drivers' various motions and conditions can be classified in real time.


"In general, advanced and complex sensing requires a large-scale computer but an on-vehicle device must be small, which means limited processing capability. Cloud processing on the Internet cannot respond to the driving conditions that change by the moment. A key to the development is how we can overcome the limit of on-vehicle environment."

We found a way out in "OKAO", the high-precision facial image sensing technology that Omron has been developing for more than 20 years, together with time-series deep learning, a cutting-edge AI technology. "Driver monitoring" is a fusion of these two technologies. An engine based on the knowledge accumulated through "OKAO" estimates facial expressions from the conditions of facial parts such as the eyes and mouth. The challenge of limited processing capability was also overcome through a process of trial and error, and Iketani says great ingenuity went into it.
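
As a loose illustration of how per-frame facial features and time-series deep learning might be combined, here is a minimal PyTorch sketch in which feature vectors from a face-sensing engine are stacked into a short sequence and classified into driver-state categories. The feature dimension, model size, and number of states are assumptions for illustration only, not Omron's actual design.

```python
import torch
import torch.nn as nn

class DriverStateClassifier(nn.Module):
    """Toy time-series classifier over per-frame facial feature vectors."""
    def __init__(self, feature_dim: int = 32, hidden_dim: int = 64, num_states: int = 4):
        super().__init__()
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_states)

    def forward(self, frame_features: torch.Tensor) -> torch.Tensor:
        # frame_features: (batch, time, feature_dim), one vector per video frame,
        # e.g. eye openness, gaze direction, and mouth shape from a face engine.
        _, (last_hidden, _) = self.lstm(frame_features)
        return self.head(last_hidden[-1])   # logits over driver states

model = DriverStateClassifier()
window = torch.randn(1, 30, 32)              # roughly one second of frames at 30 fps
print(model(window).shape)                   # torch.Size([1, 4])
```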

"The key is to process an input image from one camera by separating it into a detailed facial image and a rough motion image. By combining high-resolution facial images and minimum resolution motion images with an exquisite balance, both high-precision sensing and compactification have been achieved."

"The technology of cars that understand humans will become standard in the future"


Conventional sensors can detect individual conditions such as dozing or inattentive driving, but they cannot judge the extent to which such a condition may affect driving. What is truly epoch-making about "driver monitoring" is that it can judge the risk posed by every possible driver condition. Why was Omron able to reach this stage?

"Driver monitoring" can estimate the degree of the driver's various driving risks with a single sensor

"Driver monitoring" can estimate the degree of driver's various driving risks with one sensor

"We had thorough discussions on what should be detected from the driver. There are a lot of things we should take into account when leaving the driver in charge of driving a vehicle: if he/she is sleeping, or he/she is facing forward, and so on. But does the vehicle need such individual pieces of information? What does the vehicle want to understand in order to safely hand over the driving to the human driver? We set up a hypothesis that it is a "degree of driver's concentration". A driver may be under various conditions while the vehicle must be controlled based on them. If they can be connected by the driver's concentration degree, the machine can understand the driver's condition, leading to better cooperation between people and cars. In sensing technologies, we believe that defining an index such as "driver concentration" is very important. In fact, we have materialized the sensing of "driver concentration", and "driver monitoring" through connection with control. We believe that Omron has reached the solution because the company covers not only sensing but also control."

Just as automatic braking for collision avoidance has become a common on-vehicle technology, "driver monitoring" will be ubiquitous in the not-so-distant future.

Iketani's team is developing the technology, hoping to put it into practical use for automated driving on the highway in 2018.

The technology developed by Omron for understanding humans from various kinds of sensing data supports the safety and well-being of an autonomous mobility society.
With this strong belief, we will keep leading future mobility.
