
Mitsubishi Electric to introduce EMIRAI 3 xDAS assisted-driving concept car at Tokyo Motor Show

Mitsubishi Electric Corporation will introduce the EMIRAI 3 xDAS concept car featuring next-generation driving-assistance technology at the 44th Tokyo Motor Show 2015 later this month. Building on the EMIRAI 2 xDAS, which was introduced at the 2013 Tokyo Motor Show, the EMIRAI 3 xDAS features evolved technologies for human-machine interface (HMI), driver sensing, telematics, and light control.

The HMI features a multi-bonding display—LCD panels on the dashboard and center console are laminated with an optical bonding process for high visibility and operability, as well as aesthetic harmony with vehicle interiors. Larger images are achieved without larger LCDs by narrowing the gaps between separate displays.

Top: EMIRAI 3 xDAS. Bottom: EMIRAI 3 xDAS interior.

The high-visibility panels reduce reflections thanks to the optical-bonding and optical-design technologies. Display items can be changed according to user preferences. Cloud content synchronization and selectable contents layouts enable drivers to create highly personalized interiors.

In-vehicle equipment can be operated without having to look at the display. The operator’s hand profile and motions are detected for simplified adjustment of air temperature and music volume.
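The gesture-based control described above can be sketched as a simple mapping from recognized hand motions to cabin adjustments. This is a minimal illustration only; the gesture names, controls, and step sizes are hypothetical, not Mitsubishi Electric's actual implementation.

```python
# Hypothetical mapping from detected hand gestures to cabin controls.
# Assumes an upstream sensor that reports gesture names as strings.
GESTURE_ACTIONS = {
    "swipe_up": ("volume", +1),
    "swipe_down": ("volume", -1),
    "rotate_cw": ("temperature", +0.5),
    "rotate_ccw": ("temperature", -0.5),
}

def apply_gesture(state, gesture):
    """Adjust cabin state for a recognized gesture; ignore unknown ones."""
    if gesture in GESTURE_ACTIONS:
        control, delta = GESTURE_ACTIONS[gesture]
        state[control] = state.get(control, 0) + delta
    return state

state = {"volume": 5, "temperature": 22.0}
apply_gesture(state, "swipe_up")    # volume: 5 -> 6
apply_gesture(state, "rotate_ccw")  # temperature: 22.0 -> 21.5
```

Because each gesture maps to a small relative adjustment, the driver gets immediate feedback without needing to glance at a screen to confirm a target value.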

A wearable device vibrates to notify specific passengers of information as required. An overwrite input function, co-developed with Tokyo University of Agriculture and Technology, allows the driver to write characters on top of one another without confirming each entry.

A 3D head-up display (HUD) provides three-dimensional images of objects appearing 10 meters or more ahead of the driver, so that the driver can keep his or her eyes on the road ahead.

Using binocular disparity for 3D imaging, the HUD adjusts the display's apparent position according to the driving situation, such as when turning or driving on an expressway, for safer, easier driving.

The driver’s physical condition is sensed with a camera and a non-contact cardiograph co-developed with the National University Corporation Kyushu Institute of Technology. The camera also detects the driver’s face direction and line of sight.

This provides safe-driving support and predictive assistance based on driver behavior. Proactive analysis of map data, for example, can identify intersections with poor visibility, and then display side-camera views looking up and down the cross street. Further, the system learns to react automatically whenever the same location/situation is reencountered.
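The learning behavior described above—reacting automatically when the same location is reencountered—amounts to remembering which assistance action was taken at which place. A minimal sketch, with hypothetical location identifiers and action names:

```python
# Hedged sketch: remembers the assistance action taken at a location
# (e.g. showing side-camera views at a poor-visibility intersection)
# so it can be repeated automatically on the next encounter.
class AssistanceMemory:
    def __init__(self):
        self._learned = {}  # location id -> assistance action

    def record(self, location, action):
        """Store the action taken at this location."""
        self._learned[location] = action

    def recall(self, location):
        """Return the learned action for this location, or None."""
        return self._learned.get(location)

memory = AssistanceMemory()
memory.record("intersection_42", "show_side_camera")
memory.recall("intersection_42")  # -> "show_side_camera"
memory.recall("intersection_99")  # -> None (no history here)
```

A production system would key on map-matched coordinates and fold in context such as time of day, but the core idea is the same lookup: location in, learned reaction out.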

A cloud-based application analyzes the driver’s physical condition by comparing current behavior with past behavioral data stored in the cloud. If fatigue is detected, suitable rest stops are recommended.
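The comparison of current behavior against stored history can be illustrated as a simple statistical outlier check. This is an assumption-laden sketch—the article does not disclose the actual analysis—where behavior is reduced to a single numeric score (for example, a steering-correction rate) and fatigue is flagged when the current score deviates strongly from the driver's own baseline.

```python
from statistics import mean, stdev

def is_fatigued(past_scores, current_score, z_threshold=2.0):
    """Flag fatigue when the current behavior score is an outlier
    relative to the driver's own historical baseline (z-score test)."""
    baseline = mean(past_scores)
    spread = stdev(past_scores)
    if spread == 0:
        # No historical variation: any deviation is notable.
        return current_score != baseline
    return (current_score - baseline) / spread > z_threshold

history = [0.9, 1.1, 1.0, 1.2, 0.8]   # past per-trip scores (hypothetical)
is_fatigued(history, 1.1)  # within normal range -> False
is_fatigued(history, 2.5)  # far above baseline  -> True
```

Keeping the baseline per driver, as the cloud storage in the article implies, matters because "normal" behavior differs from person to person; a shared threshold would misclassify drivers at either extreme.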

The system collects and shares information on dangerous locations. It also provides remote control of home appliances.
