
Augmented Reality and Representation in Vehicle

for Safe Driving at Night


Byoung-Jun Park, Jeong-Woo Lee, Changrak Yoon, Kyong-Ho Kim
IT Convergence Technology Research Laboratory
Electronics and Telecommunications Research Institute
Daejeon, Republic of Korea
{bj_park, jeow7, cryoon, kkh}@etri.re.kr

Abstract— To reduce traffic accidents in night driving, we introduce an augmented reality system using a head-up display (HUD) in a vehicle and discuss the effectiveness and usefulness of the proposed system together with its representation strategy. The proposed system embraces recognition of dynamic objects by radar-vision fusion, forward collision warning by threat assessment, and a representation strategy using the HUD. To warn the driver and prevent collisions in night driving, the proposed system shows augmented reality information aligned to the driver's eye and overlapped with the real world, and the usefulness of the system is validated in experiments.

Keywords— Augmented reality, forward collision warning, time to collision, head-up display, night driving

I. INTRODUCTION

Night driving requires a great deal of concentration, which can be tiring, and presents very different challenges from daytime driving. The human eye requires light to see, and an estimated 90 percent of all driver decisions are based on what drivers see. Night driving is therefore a leading cause of accidents. To avoid accidents at night, vision-based vehicle detection systems have been developed [1]-[3], and the in-vehicle head-up display (HUD) was introduced to implement augmented reality, which is currently being applied in many areas of the real world [4][5]. Recently, car manufacturers around the world have announced development and commercialization plans for HUD technology that offers the driver various safety- and convenience-related information such as velocity, driving direction, and warning messages [6][7].

In this paper, we deal with an augmented reality system using a HUD that provides collision warning information on the windshield, overlaid on the real world, while driving at night. The proposed system consists of sensor, decision, and representation parts. In the sensor part, radar-vision fusion is used to detect and recognize pedestrians and vehicles from data obtained at night from night-vision and radar sensors. In the decision part, the danger level is decided by forward collision warning (FCW) using time to collision (TTC), and augmented reality information is presented on the windshield using the HUD according to the representation strategy for the determined information. We set up an indoor test bed with the prototype system and consider the potential improvement in driver safety from augmented reality technology that projects danger information onto the windshield, superimposed on the real world.

II. AUGMENTED REALITY SYSTEM USING HEAD-UP DISPLAY IN VEHICLE

A. Overview of Augmented Reality System in Vehicle

Driving at night is one of the most dangerous activities. It may be difficult to tell whether a black shape in front of your headlights is a pedestrian when you are driving late at night. According to the National Safety Council, vehicle death rates at night are three times higher than during the day. One problem is vision: almost 90% of a driver's reactions depend on it. In addition, a driver must attend to vehicles, lanes, pedestrians, and traffic signs while controlling vehicle speed and direction. All of these tasks increase physical and mental workload and can induce dangerous situations.

To prevent accidents while driving at night, the proposed system offers forward collision warning information through augmented reality implemented with a HUD. The proposed system extracts dynamic pedestrians and vehicles, recognizes dangerous situations, and then provides the driver with warning information by projecting virtual warning signs overlapped with the real world and mapped to the driver's viewpoint. This helps the driver detect hazardous events. The proposed augmented reality system consists of sensor, decision, and presentation parts.

B. Recognition of Dynamic Object at Nighttime

The sensor part of the proposed system acquires the driving situation, such as the scene, velocity, and position, using night vision, radar, CAN, and GPS, and detects pedestrians and vehicles in front of the host vehicle. For night vision, we use a FLIR PathFindIR camera, a thermal imaging system that displays cold objects as black and hot objects as white. The radar is a Delphi 76 GHz ESR, used for long-range coverage.

Radar-vision fusion is used to detect and recognize pedestrians and vehicles at nighttime. The distance and angle of an object extracted from the radar are overlapped with an object detected in the night-vision image, and the final result is defined as shown in Fig. 1. Fig. 1(a) is an image as seen by the driver and (b) is the night-vision image of the proposed system. The range selected by the radar is shown in (c), and the pedestrian detected by radar-vision fusion is shown in (d).
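As a rough illustration, the radar-vision association described above can be sketched as follows. The image width, field of view, radar target format, and bounding-box format are all assumptions for illustration; the paper does not specify them.

```python
# Hypothetical sketch of radar-vision fusion: the radar reports (range, azimuth)
# targets; the night-vision detector reports bounding boxes. Each radar target
# is projected to a horizontal pixel position and matched to any box whose
# x-extent contains it. Camera parameters below are assumed values.

IMAGE_WIDTH = 640   # assumed night-vision image width in pixels
HFOV_DEG = 36.0     # assumed horizontal field of view of the camera

def radar_to_pixel_x(azimuth_deg):
    """Map a radar azimuth angle (0 = straight ahead) to a pixel column."""
    half_fov = HFOV_DEG / 2.0
    # linear mapping: -half_fov -> 0, +half_fov -> IMAGE_WIDTH
    return (azimuth_deg + half_fov) / HFOV_DEG * IMAGE_WIDTH

def fuse(radar_targets, vision_boxes):
    """Attach the nearest matching radar range to each vision box;
    unmatched boxes keep range None."""
    fused = []
    for (x1, y1, x2, y2) in vision_boxes:
        best = None
        for (rng, az) in radar_targets:
            px = radar_to_pixel_x(az)
            if x1 <= px <= x2:
                best = rng if best is None else min(best, rng)
        fused.append({"box": (x1, y1, x2, y2), "range_m": best})
    return fused

# One radar target at 25 m, 2 degrees right of center; one pedestrian box.
result = fuse([(25.0, 2.0)], [(300, 100, 380, 260)])
print(result)
```

A real implementation would also account for the mounting offset between radar and camera and gate the match by range-dependent box size, but the overlap test above captures the basic idea of the fusion step.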

978-1-4673-7116-2/15/$31.00 ©2015 IEEE 1261 ICTC 2015


(Figure 1 panels: (a) RGB image, (b) Night vision image, (c) Detection by radar, (d) Result of the final detection)
Figure 1. Recognition of Dynamic Object at Nighttime

C. Forward Collision Warning

FCW is an automobile safety module designed to prevent rear-end accidents. The importance of keeping sufficient headway to reduce accidents is recognized by traffic authorities worldwide. The proposed system provides FCW in situations where the host vehicle approaches a preceding vehicle with a high closing rate at night. To decide the warning information and its danger level, FCW uses the state and driving information of the vehicle, including distance, position, TTC, direction of movement, and velocity:

    SafetyDistance = velocity^2 / 100

    BrakeDistance = velocity^2 / (2 * mu * g)

    TTC = Distance / RelativeVelocity  [s]

where mu is the coefficient of friction and g is the acceleration of gravity. The RelativeVelocity is computed from the variation of the headway distance with respect to time. The distance is obtained from the vehicles and pedestrians detected by the radar-vision fusion.

To define the danger level by threat assessment in FCW, as shown in Table I, we experimented on the TTC threshold. The database for extracting the TTC threshold was built from about 18 hours of data, covering about 500 vehicles at velocities below 60 km/h and distances below 100 m, on roads and highways. In addition, we consider the position and movement of the front vehicle when determining the danger level: if the front vehicle is not in the driving lane, the danger level is lowered from the previous level, and if a vehicle cuts in, the danger level is raised above the previous level.

D. Augmented Reality and Representation Strategy

It is important to provide the necessary warning information to the driver through an intuitive and helpful representation. To improve driving safety and minimize driving workload, the augmented reality warning information should be represented in a way that is easily understood. It is therefore necessary to configure the characteristics of the augmented reality information and to find a way to represent it according to those characteristics. In addition, the proposed system must consider both the amount of information that minimizes the driver's cognitive load and the types of representation that minimize the driver's visual load.

The result of a feasibility study using a driving simulator was used to configure the visual properties for representing the augmented reality information [8]. For the representation defined for each danger level, which is displayed on the windshield of a vehicle using the HUD, Table I illustrates the visual properties, including the visual information of the threat level, shape, color, transparency, position, size, icon, and text. The definition of the threat assessment is also shown in Table I.

TABLE I. REPRESENTATION FOR AUGMENTED REALITY

               Danger Level 1   Danger Level 2   Danger Level 3   Danger Level 4
  Icon         (icon)           (icon)           (icon)           (icon)
  Pedestrian   (image)          (image)          (image)          (image)
  Vehicle      (image)          (image)          NONE             NONE
  TA           Dist <= BrDist   TTC < 2.4        Others           Dist >= SftDist

  TA: Threat Assessment, Dist: Distance, BrDist: Brake Distance, SftDist: Safety Distance

(Figure 2 image)
Figure 2. Results of Augmented Reality in Vehicle for Vehicle Ahead

Fig. 2 shows the augmented reality information displayed on the windshield from the driver's viewpoint. The yellow box is the augmented reality zone, implemented by a projection-type HUD with a 50-inch virtual image 7.5 m in front of the driver. The red box is the augmented reality warning information and indicates the detected pedestrian at danger level 1. Augmented reality information is presented on the windshield for each level according to the representation strategy, so the driver can easily and intuitively perceive the warning information presented to prevent a collision.
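The threat assessment of Section II-C and Table I can be sketched as follows. The friction coefficient, the unit conventions (the safety-distance rule of thumb uses km/h while the braking-distance physics uses m/s), and the order in which the conditions are evaluated are assumptions where the paper does not state them.

```python
G = 9.81  # acceleration of gravity [m/s^2]

def safety_distance(v_kmh):
    """Rule-of-thumb safety distance in metres: (speed in km/h)^2 / 100."""
    return v_kmh ** 2 / 100.0

def brake_distance(v_ms, mu=0.7):
    """Braking distance v^2 / (2*mu*g), with v in m/s; mu = 0.7 is an
    assumed friction coefficient for a dry road."""
    return v_ms ** 2 / (2.0 * mu * G)

def time_to_collision(dist_m, closing_speed_ms):
    """TTC = distance / closing speed [s]; infinite if not closing."""
    return dist_m / closing_speed_ms if closing_speed_ms > 0 else float("inf")

def danger_level(dist_m, v_kmh, closing_speed_ms, ttc_threshold=2.4):
    """Danger level per Table I: 1 (most urgent) .. 4 (safe)."""
    v_ms = v_kmh / 3.6
    if dist_m >= safety_distance(v_kmh):
        return 4                      # beyond the safety distance
    if dist_m <= brake_distance(v_ms):
        return 1                      # inside the braking distance
    if time_to_collision(dist_m, closing_speed_ms) < ttc_threshold:
        return 2                      # closing fast enough to hit soon
    return 3                          # "Others" in Table I

# Host at 60 km/h, closing at 10 m/s on a target 30 m ahead:
print(danger_level(30.0, 60.0, 10.0))
```

The cut-in and out-of-lane adjustments described in Section II-C would then shift the returned level up or down by one relative to the previous frame; that stateful step is omitted here for brevity.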

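The viewpoint alignment used by the test bed described in the experiments below (transferring sensor-image detections into three-dimensional coordinates based on the driver's eyes and projecting them onto the windshield) can be sketched, under simple pinhole-projection assumptions, as:

```python
# Hypothetical sketch of the viewpoint transform: a detection expressed in
# vehicle coordinates is re-expressed relative to the driver's eye and
# perspective-projected onto the virtual HUD plane 7.5 m ahead (the focal
# plane of the projection-type HUD). The eye position is an assumed value.

HUD_PLANE_Z = 7.5                # virtual image distance [m], from the paper
EYE_OFFSET = (-0.4, 1.2, 0.0)    # assumed driver-eye position in vehicle
                                 # coordinates (x right, y up, z forward) [m]

def to_eye_coords(point_vehicle):
    """Translate a vehicle-frame point into the driver-eye frame."""
    ex, ey, ez = EYE_OFFSET
    x, y, z = point_vehicle
    return (x - ex, y - ey, z - ez)

def project_to_hud(point_vehicle):
    """Project an eye-frame point onto the HUD plane at z = 7.5 m, returning
    (x, y) on that plane in metres; None if the point is behind the driver."""
    x, y, z = to_eye_coords(point_vehicle)
    if z <= 0:
        return None
    scale = HUD_PLANE_Z / z
    return (x * scale, y * scale)

# A pedestrian 30 m ahead and 2 m to the right, at ground level:
print(project_to_hud((2.0, 0.0, 30.0)))
```

A real system would additionally correct for windshield curvature and track the driver's actual head position; the scale factor above only shows why a distant object maps to a small offset on the virtual image plane.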
III. EXPERIMENTS

To implement augmented reality in the vehicle, we developed a projection-type HUD with a 7.5 m focal length and a 50-inch screen, and used the windshield of a Genesis DH model (Hyundai Motor). We also built an indoor test bed that is similar to real driving environments and tested the developed system using a dataset collected during real driving.

The augmented reality system on the indoor test bed determines whether to provide warning information through the integrated software installed on a PC, based on the recognition results from the data collected by the various sensors. The determined warning information is then represented through the HUD as augmented reality. The augmented reality information is transformed from the coordinates of the image acquired by the sensor into three-dimensional virtual coordinates based on the driver's eyes, and projected onto the windshield of the vehicle to overlap with the real world.

(Figure 3 panels: (a) Level 4, (b) Level 3, (c) Level 2, (d) Level 1)
Figure 3. Results of Augmented Reality in Vehicle for Crossing Pedestrian

(Figure 4 panels: (a) Level 4, (b) Level 3, (c) Level 2, (d) Level 1)
Figure 4. Results of Augmented Reality in Vehicle for Vehicle Ahead

Fig. 3 shows the results of collision warning information expressed as augmented reality, aligned to the driver's eye, for a crossing pedestrian at nighttime and in rain. The proposed system recognized the pedestrian while driving at night on the road and then determined a danger level for the recognized object. The augmented reality warning information with its danger level is presented by the HUD according to the representation strategy shown in Table I. For the vehicle ahead, the results of FCW according to the danger level are shown in Fig. 4. The proposed system helps the driver recognize objects clearly in total darkness and in the glare of oncoming headlights.

IV. CONCLUSIONS

To provide forward collision warning information in night driving, we developed an in-vehicle augmented reality system using a head-up display. Recognition of dynamic objects and FCW at nighttime were discussed, and augmented reality information based on a projection-type HUD was implemented to provide warning information matched to the driver's eye. In addition, we defined an information representation that considers both the driving situation and the drivers themselves, to offer warning information to drivers in an efficient way, and carried out tests to prove the usefulness of the proposed system.

The results of the experiments support the effectiveness and validity of the proposed in-vehicle augmented reality system as an effective representation method based on driver characteristics under low-visibility environments such as darkness and unfavorable weather. We have confirmed experimentally that it can improve the driver's intuitive cognition and reduce driver distraction. As future research, we will pursue studies on performance improvement and ergonomic information representation.

ACKNOWLEDGMENT

This work was funded by the Industrial Strategic Technology Development Program of MOTIE [10040927, Driver-oriented vehicle augmented reality system based on head up display for the driving safety and convenience].

REFERENCES

[1] K.-H. Choi, D.-H. Kim, K.-S. Kim, J.-W. Kwon, S.-I. Lee, K. Chen, and J.-H. Park, "State Machine and Downhill Simplex Approach for Vision-Based Nighttime Vehicle Detection," ETRI Journal, vol. 36, pp. 439-449, 2014.
[2] J. C. McCall and M. M. Trivedi, "Video-Based Lane Estimation and Tracking for Driver Assistance: Survey, System, and Evaluation," IEEE Trans. Intell. Transp. Syst., vol. 7, no. 1, pp. 20-37, 2006.
[3] Y.-L. Chen et al., "Nighttime Vehicle Detection for Driver Assistance and Autonomous Vehicles," Int. Conf. Pattern Recognition, pp. 687-690, 2006.
[4] H. S. Park, M. W. Park, K. H. Won, K. H. Kim, and S. K. Jung, "In-Vehicle AR-HUD System to Provide Driving-Safety Information," ETRI Journal, vol. 35, pp. 1038-1047, 2013.
[5] H. S. Park and K. H. Kim, "Adaptive Multimodal In-Vehicle Information System for Safe Driving," ETRI Journal, vol. 37, pp. 626-636, 2015.
[6] V. Charissis and S. Papanastasiou, "Human-Machine Collaboration through Vehicle Head Up Display Interface," Cognition, Technology & Work, vol. 12, pp. 41-50, 2010.
[7] http://www.autoevolution.com/news/gms-full-windshield-hud-technology-explained-18454.html
[8] H. S. Park and K. H. Kim, "Efficient Information Representation Method for Driver-centered AR-HUD System," in HCI Int., part 3, 2013, pp. 393-400.

