
A 3D Full Windshield Head Up Display.

INTRODUCTION

Head-Up Displays (HUDs) have been used on board aircraft for several decades. Initially designed for military targeting applications, they are now used to improve situational awareness, thanks to their ability to superimpose information in conformity with the external landscape. The HUD was thus the first application of what we now call Augmented Reality (AR).

To reduce eye re-focusing time and parallax issues, HUD images are collimated, so that symbols are perceived at a far distance.

However, the projection unit required to generate a bright collimated image is very bulky, and the associated complex optical engine is very expensive.

Collimated images are also only visible within a very small eye-box, which degrades the user experience.

In this paper, we present a novel system using simulated collimation, Holographic Optical Elements and spectral interference filters. By applying these technologies to the HUD, we remove the need for complex refractive optics and offer a theoretically infinite eye-box, for an outstanding user experience.

BACKGROUND

Head-Up Displays have, for decades, retained the same main architecture: source, projector and combiner. The projector creates an intermediate image of the object source, which is collimated to infinity and superimposed on the external world by the combiner.

At first, HUDs used a CRT and a partially reflective plane combiner, then a curved holographic combiner. Later generations introduced LCD displays with CCFL backlights, then LED backlights, associated with dielectric-coated combiners. The latest generation introduces diffractive optics and a thin combiner based on pupil expansion. Throughout these generations, the functional role of each element has remained the same.

Avionics HUDs are collimated to a distance long enough to be perceived as infinity. Symbology such as the velocity vector, horizontal situation, synthetic vision and enhanced vision is then superimposed on the real world without parallax issues when the pilot moves his head inside the eyebox, offering an augmented reality function. In addition, the combiner, located near the pilot's eyes, provides a large field of view, generally more than 30° horizontally and 20° vertically. These characteristics improve the user experience and increase the pilot's situational awareness in critical phases of flight such as approach, landing and taxiing.

On the other hand, a large projector space envelope, limited head clearance and a limited eye motion box are the drawbacks of current avionics HUDs.

To address these drawbacks, using the windshield surface has been envisioned. Windshield projection has been explored for years within the automotive market.

Automotive Head-Up Displays are mainly used to display information while keeping the eyes on the road. This includes speed, navigation directions, warnings and information about the car's condition.

This information is not displayed in conformity with the landscape, generating parallax errors between symbology displayed a few meters away and the real world in the background.

The automotive HUD field of view (FOV) is limited by the size of the HUD box; this limitation comes from the large mirror required to reflect the image onto the windshield, as shown in Figure 4.

Augmented Reality applications will require a very large field of view. But this critical parameter has a strong impact on the HUD box size, mainly through the horizontal (X) and vertical (Y) mirror dimensions, as shown in Figure 5.

With this optical architecture, a HUD with a field of view of 14°x6° will require a mirror of 325 mm x 175 mm.

For avionics applications, 35° horizontally and 25° vertically are common values. A HUD box architecture would then require mirrors larger than 500 mm, which is not compatible with cockpit installation constraints; hence the different optical architecture currently used in avionics, which results in a limited eyebox.
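As a rough, purely illustrative check of this scaling, the short Python sketch below estimates the width of the last HUD mirror as the eyebox width plus the ray fan subtended by the horizontal FOV over the optical path; the 130 mm eyebox width and 800 mm unfolded optical path are assumed values for illustration, not figures from this paper.

    import math

    def mirror_width_mm(hfov_deg, emb_width_mm, path_mm):
        """Approximate width of the last HUD mirror: it must intercept the
        eyebox footprint plus the ray fan subtended by the horizontal FOV
        over the optical path between the eyebox and the mirror."""
        return emb_width_mm + 2.0 * path_mm * math.tan(math.radians(hfov_deg / 2.0))

    # Illustrative assumptions (not from the paper): 130 mm eyebox width,
    # ~800 mm unfolded optical path between the eyebox and the mirror.
    for hfov in (14, 35):
        print(f"HFOV {hfov} deg -> mirror width ~ {mirror_width_mm(hfov, 130, 800):.0f} mm")
    # ~326 mm at 14 deg and ~634 mm at 35 deg, consistent with the trend of a
    # 325 mm mirror growing beyond 500 mm for avionics fields of view.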

A full-windshield head-up display using direct projection onto a transparent screen is an alternative that breaks with current HUD architectures. Papers [4] and [5] present such a HUD and a corresponding HUD simulator.

DIRECT WINDSHIELD PROJECTION

Direct projection onto the windshield reduces installation constraints, thanks to the use of small projectors, and removes the need for a separate combiner. Paper [6] presents such a system, using a small UV laser projector and a windshield that includes a transparent phosphor layer.

With this concept, a bulky HUD system could be replaced by a basic video projector with a reduced footprint and a transparent screen embedded in the windshield. But it raises several issues, particularly the mismatch between the real world and images presented at a short distance from the eyes, which results in a poor user experience for augmented reality applications (Figure 6).

In summary, direct projection on a transparent screen could be an excellent solution for an eyes-out display as it solves most space envelope issues. But the lack of collimation and the vergence-accommodation conflict are critical showstoppers [7].

Simulated Collimation

An efficient augmented reality (AR) system should superimpose virtual objects on the real world.

For an AR HUD, the horizon line, velocity vector, and synthetic or enhanced vision shall be displayed in conformity with the real world, without accommodation effort or vergence issues.

This has historically been achieved using complex and bulky collimating optics inside the projector. But the collimated rays give a tunneling effect, and the image is only visible inside a small eye-box.

With direct projection onto a transparent screen, the image is formed on the screen at a finite distance. Superposition with the real world cannot be achieved without some form of collimation.

With the transparent display, we propose to simulate collimation using stereoscopy (Figure 7).

A stereoscopic projector displays the left-eye and right-eye images on the transparent screen, which is, in fact, a windshield with an added optical layer.

When the separation between the two images on the screen matches the inter-pupillary distance, the virtual object is seen at infinity. At the same time, since the 3D glasses allow each eye to see only its own image, the double-vision issue is solved.
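The required on-screen separation follows from similar triangles: for eyes at distance d from the screen and a virtual point at distance D, the separation is IPD x (1 - d/D). The sketch below, which assumes an illustrative 63 mm inter-pupillary distance and a windshield 0.8 m from the eyes (neither value comes from this paper), shows the separation tending towards the full IPD as the virtual distance goes to infinity.

    def on_screen_separation_mm(ipd_mm, screen_m, virtual_m):
        """Horizontal separation between the left-eye and right-eye image points
        on a screen at screen_m metres, for a virtual point at virtual_m metres
        (similar-triangle geometry, eyes and virtual point on the same axis)."""
        return ipd_mm * (1.0 - screen_m / virtual_m)

    ipd = 63.0      # assumed mean inter-pupillary distance, mm
    screen = 0.8    # assumed eye-to-windshield distance, m
    for d in (2.0, 10.0, 100.0, float("inf")):
        print(f"virtual distance {d:>6} m -> separation {on_screen_separation_mm(ipd, screen, d):5.1f} mm")
    # The separation reaches the full IPD (63 mm) only at infinity, which is the
    # "simulated collimation" condition described above.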

This system offers unlimited viewing directions, without constraints such as the small eyebox and unique viewing point of a traditional HUD.

3D Transparent Screen Used as HUD

The user wears a pair of 3D glasses, and camera-based head tracking is used to compensate for the parallax error caused by head motion (Figure 8).
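As an illustration of this compensation, the sketch below recomputes the left-eye and right-eye image points for a symbol registered with a fixed external direction when the tracked head position shifts laterally. For simplicity it assumes a flat screen plane 0.8 m from the nominal eye position and a 63 mm inter-pupillary distance; both are illustrative values, and a real windshield is curved.

    import numpy as np

    SCREEN_Z_M = 0.8   # assumed flat screen plane z = 0.8 m (a real windshield is curved)

    def screen_point(eye_xyz, direction):
        """Where the ray from one eye, along a fixed real-world viewing direction
        (symbol placed at infinity), crosses the screen plane z = SCREEN_Z_M."""
        eye = np.asarray(eye_xyz, dtype=float)
        v = np.asarray(direction, dtype=float)
        v = v / np.linalg.norm(v)
        t = (SCREEN_Z_M - eye[2]) / v[2]
        return eye + t * v

    # Symbol to be overlaid on the horizon, 5 degrees to the right of the axis.
    direction = (np.sin(np.radians(5.0)), 0.0, np.cos(np.radians(5.0)))

    # Tracked head position before and after a 40 mm lateral head motion.
    for head_x in (0.0, 0.04):
        left = screen_point((head_x - 0.0315, 0.0, 0.0), direction)
        right = screen_point((head_x + 0.0315, 0.0, 0.0), direction)
        print(f"head x = {head_x * 1000:4.0f} mm -> left image x = {left[0] * 1000:6.1f} mm, "
              f"right image x = {right[0] * 1000:6.1f} mm")
    # Both image points must shift with the head so that the symbol stays
    # registered with the same external direction; this is the correction the
    # head tracker drives.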

3D Principle for Windshield Projection

A 3D system using polarized goggles cuts off at least 50% of the external luminance, due to light polarization.

An active-shutter 3D system achieving more than 50% transmission of external luminance would require a complex synchronization mechanism between the projection source and the temporal eye shuttering, making the headwear tricky and bulky.

The most suitable setup for a see-through system seems to be the wavelength multiplexing approach [8]. By using a bandpass filter for each color and each eye, it is possible to achieve a transmission rate of 60% (and more if narrow-band sources are used) while using only a light, passive component on the head.

For an avionics Head-Up Display, displayed images are green, as required by current avionics standards [9].

This principle, adapted to our proposed application, removes only a small part of the visible spectrum for the goggle notches, typically less than 30 nm of green at 520 nm and at 540 nm, resulting in an outstanding photopic transmission of more than 75%.
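This figure can be cross-checked with a rough numerical estimate. The sketch below uses a Gaussian stand-in for the CIE photopic sensitivity V(lambda) and ideal, sharp-edged 30 nm notches; a real design would use the tabulated V(lambda) data and measured filter curves, so the result is only indicative.

    import numpy as np

    # Rough Gaussian stand-in for the CIE photopic sensitivity V(lambda)
    # (peak at 555 nm, FWHM of about 100 nm).
    lam = np.arange(380.0, 781.0, 1.0)
    V = np.exp(-0.5 * ((lam - 555.0) / 42.5) ** 2)

    def photopic_transmission(notch_center_nm, notch_width_nm):
        """See-through photopic transmission of goggles with one ideal
        (fully blocking, sharp-edged) notch of the given centre and width."""
        t = np.ones_like(lam)
        t[np.abs(lam - notch_center_nm) <= notch_width_nm / 2.0] = 0.0
        return np.trapz(V * t, lam) / np.trapz(V, lam)

    print(f"eye with a 30 nm notch at 520 nm: {photopic_transmission(520.0, 30.0):.0%}")
    print(f"eye with a 30 nm notch at 540 nm: {photopic_transmission(540.0, 30.0):.0%}")
    # Both eyes come out around 75-80% under these crude assumptions, consistent
    # with the photopic transmission figure quoted above.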

As these two notches are applied respectively to the left and right eyes, the entire spectrum is visible to both eyes, except for the notch wavelengths, each of which is visible to only one eye. In practice, in order to avoid any crosstalk between the two images (i.e., part or all of the left-eye image being seen by the right eye), it can be convenient to design both notches with a small overlap. Placing this overlap around 532 nm would also help reduce the hazards associated with green laser illumination.

In order to achieve spectral multiplexing, it is possible to use two filtered projectors, but such a system is bulky and not compatible with the short-range projection required in aircraft or car cockpits.

The use of a single short-throw projector operating in temporal mode is the preferred solution, but it requires doubling the image rate. In this case, a specific active spectral modulator is needed, able to switch accurately and efficiently between the two bands, as described in [10].

Remaining Issues

With this novel system, collimation is simulated through stereoscopy, head-motion tracking and 3D glasses, mimicking the view through a legacy HUD but without the risk of losing the image due to a small eyebox.

Accommodation Vergence versus Viewing Distance

When the left and right images are displayed on the windshield, accommodation is still performed on the glass while vergence is at infinity. This generates an accommodation-vergence conflict, well known in 3D projection [11][12].

In our system, as the pilot looks outside, vergence is naturally performed at infinity.

To solve the accommodation-vergence issue, pilots must see both the landscape and the symbology without any accommodation effort.

As these two objects are not at the same distance, they should both lie within the eye's depth of field to avoid an accommodation mismatch.

It has been shown that, if specific rules are applied, and at the cost of a small loss of visual acuity, accommodation-vergence issues can be dealt with [13].

This figure shows the result of an experiment measuring the relative acuity of normal subjects reading symbols presented on a screen at a virtual infinite distance. Normal acuity is obtained when the subject is far enough from the screen: at distances greater than 1.6 m, there is no difficulty in reading the symbols. At shorter distances, acuity is degraded, down to 2/10 when the screen is around 50 cm away.
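The driver of this behaviour is the dioptric mismatch between accommodation (on the screen) and vergence (at infinity), which is simply the reciprocal of the screen distance. A minimal illustration:

    def vergence_accommodation_mismatch_diopters(screen_distance_m):
        """Dioptric conflict when the eyes accommodate on the screen while
        converging at infinity: 1/d_screen - 1/infinity = 1/d_screen."""
        return 1.0 / screen_distance_m

    for d in (0.5, 0.8, 1.0, 1.6, 2.5):
        print(f"screen at {d:3.1f} m -> mismatch {vergence_accommodation_mismatch_diopters(d):.2f} D")
    # The mismatch falls to about 0.6 D at 1.6 m, the distance beyond which the
    # acuity measurements reported above recover their normal value.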

WINDSHIELD HUD CHALLENGES

In this concept, the left-eye and right-eye images are projected onto a screen located at a finite distance in front of the pilot or driver. To avoid installing an additional optical component in the cockpit (with its mechanical interfaces, positioning and safety mechanisms), the preferred solution is to use the windshield itself as the projection screen.

The windshield screen must then present the following capabilities:

* It must allow a see-through view of the external scene, with as high a transmission factor as possible, without degrading the resolution of the see-through image as seen by the pilot.

* When illuminated by the projector, each point of the screen has to re-emit the received energy into a solid angle compatible with the required Eye Motion Box (EMB) dimensions, with a luminance compliant with the HUD contrast requirement. In practice, to ensure the legibility of the symbols presented by the HUD, the luminance of the HUD image must reach at least 30% of the luminance of the background as transmitted through the windshield (a short sizing sketch follows this list).
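As a minimal sizing sketch of this 30% rule (the background luminances and the 80% windshield transmission below are illustrative assumptions, not requirements from this paper):

    def required_hud_luminance_cd_m2(background_cd_m2, windshield_transmission, contrast_floor=0.30):
        """Minimum HUD image luminance so that the symbols reach at least 30% of
        the background luminance as seen through the windshield."""
        return contrast_floor * windshield_transmission * background_cd_m2

    # Illustrative backgrounds with an assumed 80% windshield transmission.
    for name, bg in (("overcast sky", 3000), ("sunlit ground", 10000), ("sunlit cloud", 30000)):
        print(f"{name:13s}: >= {required_hud_luminance_cd_m2(bg, 0.8):6.0f} cd/m2")
    # The re-emitted luminance budget of the screen, and hence the projector
    # power, is driven by the brightest expected background.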

HOLOGRAPHIC WINDSHIELD DISPLAY APPROACH

One way to improve the compromise between the required transmission and reflection properties of the windshield screen is to work on the spectral dimension of light. The external scene is illuminated by the sun over a wide spectral range, limited here to the visible domain by the pilot's visual physiology.

We can then take advantage of a narrow-bandwidth illumination source for the HUD by matching the reflection efficiency of the screen to this specific narrow band.

As can be seen in Figure 12, only a small portion of the spectrum is removed in transmission, providing a high see-through photopic transmission. On the other hand, when the HUD illumination spectrum is matched to this reflection peak (both in position and width), most of the energy coming from the projector is reflected towards the pilot, with limited loss in transmission.

One remaining issue is to achieve the required "reflective-diffusive" function of the screen. This can be done through the use of customized Holographic Optical Elements (HOEs), such as the one developed by LightTrans [14] (see Figure 13).

At each point on the screen, a specific optical function spreads the incoming light over the required solid angle at a given wavelength. This function is recorded as a volume phase hologram in a photosensitive medium. At playback, the function is activated for any light satisfying the local Bragg condition between wavelength and angle of incidence. Away from the Bragg condition, the function is inactive and the screen behaves as a passive transmissive component.
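A short numerical illustration of the Bragg condition for a reflection volume hologram is given below; the fringe period and mean index are illustrative values chosen so that the replay wavelength falls in the green.

    import math

    def bragg_wavelength_nm(period_nm, n_medium, angle_deg_from_fringe_normal):
        """Playback wavelength satisfying the Bragg condition of a reflection
        volume hologram: lambda = 2 * n * Lambda * cos(theta), with theta the
        in-medium angle between the ray and the fringe-plane normal."""
        return 2.0 * n_medium * period_nm * math.cos(math.radians(angle_deg_from_fringe_normal))

    # Illustrative values: a 175 nm fringe period and n = 1.5 give a green
    # replay wavelength near 525 nm at normal incidence.
    for theta in (0, 10, 20, 30):
        print(f"theta = {theta:2d} deg -> Bragg wavelength {bragg_wavelength_nm(175.0, 1.5, theta):.0f} nm")
    # Light that does not satisfy the designed wavelength/angle pair misses the
    # Bragg condition and simply passes through the layer, which is why the
    # screen remains transparent to the outside scene.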

In terms of manufacturing process, two steps are required to make the HOE on a plane substrate. In the first step, a surface-relief diffractive structure is generated as a master using an e-beam process. This structure is an engineered arrangement of micro-gratings designed to ensure uniform filling of the Eye Motion Box when illuminated from the intended projector position. In the second step, the master function is replicated as a volume hologram by recording, in a photopolymer layer, the interference between an incident coherent beam and its reflection from the master.

Typically, for a component with high diffraction efficiency, the spectral bandwidth of the HOE is proportional to the index modulation within the volume phase hologram. With the photopolymers available today, the spectral bandwidth is between 8 and 10 nm.
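This proportionality can be illustrated with the usual coupled-wave estimate for a thick reflection grating, delta_lambda ~ lambda x n1 / n, where n1 is the index-modulation amplitude; the values of n1 used below are plausible assumptions for current photopolymers, not measured data, and finite layer thickness adds sidelobes and some extra broadening.

    def reflection_hoe_bandwidth_nm(wavelength_nm, index_modulation, n_medium=1.5):
        """Coupled-wave estimate of the reflection band width of a thick
        reflection hologram: delta_lambda ~ lambda * n1 / n."""
        return wavelength_nm * index_modulation / n_medium

    for n1 in (0.023, 0.025, 0.028):
        print(f"index modulation n1 = {n1:.3f} -> bandwidth ~ {reflection_hoe_bandwidth_nm(530.0, n1):.1f} nm")
    # Index modulations in the 0.023-0.028 range reproduce the 8-10 nm
    # bandwidth quoted above for a green replay wavelength.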

The photopolymer itself, with a thickness of around 20 µm, can easily be embedded inside the windshield or laminated onto a PET sheet.

For this 3D HUD concept, the windshield screen is composed of two HOEs, one active for the right-eye image and one for the left-eye image. It is also possible to record the two wavelengths in the same HOE, using two different laser sources (Figure 14).

With a curved windshield, the curvature is generally such that the photopolymer can be laminated onto the curved (developable) surface. In that case, the design of the master diffractive structure takes this curvature into account so as to ensure a correct "reflective-diffusive" function in the final assembly.

DISCUSSION

The use of stereoscopy to simulate collimation is achievable using high-transmission spectral glasses and Holographic Optical Elements acting as a transparent screen on the windshield.

With the spectral goggles, a see-through photopic transmission of 75% can be achieved, with the additional possibility of blocking unwanted rays from laser pointers at 532 nm.

With the HOE, the reflection efficiency is more than 85%, while a see-through transmission of more than 90% is maintained. This far outperforms the very poor efficiency of today's car HUDs: less than 20% of the image luminance is reflected towards the driver, since these systems rely only on the reflection losses of an uncoated transparent glass or plastic component at high angles of incidence (cf. the Fresnel equations).
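For comparison, the Fresnel reflectance of a single uncoated air/glass interface can be evaluated directly; the index of 1.52 and the angles of incidence below are illustrative of automotive installations, not values from this paper.

    import numpy as np

    def fresnel_reflectance_unpolarized(aoi_deg, n1=1.0, n2=1.52):
        """Average of the s- and p-polarized Fresnel reflectances for a single
        uncoated air/glass interface at the given angle of incidence."""
        ti = np.radians(aoi_deg)
        tt = np.arcsin(n1 * np.sin(ti) / n2)          # Snell's law
        rs = (n1 * np.cos(ti) - n2 * np.cos(tt)) / (n1 * np.cos(ti) + n2 * np.cos(tt))
        rp = (n2 * np.cos(ti) - n1 * np.cos(tt)) / (n2 * np.cos(ti) + n1 * np.cos(tt))
        return 0.5 * (rs ** 2 + rp ** 2)

    for aoi in (55.0, 60.0):
        r = fresnel_reflectance_unpolarized(aoi)
        print(f"AOI {aoi:.0f} deg: ~{r:.1%} per surface, ~{2 * r:.0%} for the two glass surfaces")
    # Even counting both glass surfaces, the reflected fraction stays on the
    # order of 15-20%, in line with the "less than 20%" figure quoted above.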

For automotive applications, color is highly desirable, and today's high-end HUDs are full color. On such devices, color images are reflected by the windshield (used as a half-mirror). For higher efficiency and transparency, we use two HOEs operating at two green wavelengths. Displaying a color image is possible, and would require two HOEs per color component (red, green, blue), one for each eye. The tunable bandpass filter would be more complex, with three bands (one per primary color), like the filters used for 3D projection [4], with tighter constraints on the spectral position and bandwidth of each band, but still enabling the replacement of the bulky rotating wheel. Such a solution would require the use of wide-spectrum color sources. Today, compatible LED sources are only available in green for single-chip components; for blue and red, the use of two bins per color is a possible solution.

The major advantage of the proposed concept of projecting the HUD image on a transparent screen comes from the way the geometric etendue of the complete system is addressed.

From the observer's point of view, the geometric etendue of the HUD image can be assessed summarily as the product of the solid angle associated with the accessible Field Of View (FOV) and the transverse dimensions of the Eye Motion Box (EMB). The natural trend, and customer requirement, is to increase both FOV and EMB, resulting in a huge etendue, ever bulkier optical systems and critical space envelope issues.
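To give an order of magnitude, the sketch below compares the observer-space etendue of a conventional avionics HUD with that of a full-windshield display; all eyebox and FOV dimensions are illustrative assumptions, not specifications from this paper.

    import math

    def etendue_mm2_sr(fov_h_deg, fov_v_deg, emb_w_mm, emb_h_mm):
        """First-order geometric etendue on the observer side: EMB area times the
        solid angle of the FOV (small-angle product approximation)."""
        omega_sr = math.radians(fov_h_deg) * math.radians(fov_v_deg)
        return emb_w_mm * emb_h_mm * omega_sr

    # All dimensions below are illustrative assumptions.
    classic = etendue_mm2_sr(35, 25, 120, 60)        # conventional avionics HUD eyebox
    windshield = etendue_mm2_sr(90, 40, 400, 300)    # full-windshield FOV, whole head-motion box
    print(f"conventional HUD : {classic:8.0f} mm^2.sr")
    print(f"full windshield  : {windshield:8.0f} mm^2.sr")
    print(f"ratio            : ~x{windshield / classic:.0f}")
    # The projector, by contrast, only has to fill the screen area with its own
    # small projection cone, which is the much smaller object-space etendue
    # discussed below.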

With current HUD architectures, whether for avionics or automotive, a single (complex) optical system is used to transport the image from the object source to the pilot. The laws of optics require that the etendue remain the same from one end of the optical system to the other: if we try to use a small object source, etendue conservation will enforce an impossibly large entrance pupil, and vice versa.

The proposed concept breaks this etendue constraint: there are now two separate optical systems.

In the first optical system, the image is projected onto the screen. As in standard projection systems (e.g., cinema), a small object source can be used with a short-focal-length lens to generate a large image. The screen itself plays the role of the second optical system: every point on the screen redirects the energy received from the projector into a solid angle corresponding to the EMB.

Finally, this concept provides a huge etendue in the pilot's space while working with a relatively small etendue in the object space. This may look like magic, but it is not, and it works!

Even though we manage to break the etendue constraint, the conservation of energy still applies: the power needed to reach the required luminance level at the pilot's eyes comes only from the projector. The key issue of this concept is therefore the way the energy coming from the projector is handled.

SUMMARY/CONCLUSIONS

A new HUD architecture was presented, using the windshield as a transparent screen together with Holographic Optical Elements. This new concept breaks the size limitation of today's HUD box to deliver an outstanding field of view and an eyebox as large as the full windshield, a highly desirable performance to improve the user experience for tomorrow's Augmented Reality applications.

REFERENCES

(1.) Peng Haichao, Cheng Dewen, Han Jian, Xu Chen, Song Weitao, Ha Liuzhu, Yang Jian, Hu Quanxing, and Wang Yongtian, "Design and fabrication of a holographic head-up display with asymmetric field of view," Appl. Opt. 53, H177-H185 (2014)

(2.) Texas Instruments, "DLP® Technology: Solving design challenges in next generation of automotive head-up display systems" (White paper)

(3.) Texas Instruments, "Enabling the Next Generation of Automotive Head-Up Display Systems" (Application report)

(4.) Takaki Y., Urano Y., Kashiwada S., Ando H., et al., "Super multi-view windshield display for long-distance image information presentation.", Opt. Express 19, 704-716 (2011)

(5.) Park M W., Jung S K., "A Projector-based Full Windshield HUD Simulator to Evaluate the Visualization Methods", The Sixth IEEE International Conference on Ubiquitous and Future Networks (ICUFN 2014)

(6.) Wu W., Seder T., Cui D., "A Prototype of Landmark Based Car Navigation Using a Full Windshield HeadUp Display System", AMC '09 Proceedings of the 2009 workshop on Ambient media computing (p. 21-28)

(7.) Coni, P., Hourlier, S., Servantie, X., Laluque, L. et al., "A 3D Head Up Display with Simulated Collimation," SAE Technical Paper 2016-01-1978, 2016, doi:10.4271/2016-01-1978.

(8.) Jorke H., Simon A., "New high efficiency interference filter characteristics for stereoscopic imaging," Proc. SPIE 8288, Stereoscopic Displays and Applications XXIII, 82881D (February 6, 2012), doi:10.1117/12.910512.

(9.) SAE International Aerospace Standard, "Minimum Performance Standard for Airborne Head Up Display (HUD)," SAE Standard AS8055, Ref. Jan. 2008.

(10.) Coni P., Bardon J.L., Gueguen A., Grossetete M., "Development of a 3D HUD using a tunable bandpass filter for wavelength multiplexing," Proceedings of Display Week, Los Angeles (May 2017)

(11.) Hoffman D M., Girshick A R., Akeley K., Banks M S., "Vergence-accommodation conflicts hinder visual performance and cause visual fatigue", Journal of Vision March 2008, Vol.8, 33.

(12.) Shibata T., Kim J., Hoffman D. M., Banks M S., "Visual discomfort with stereo displays: Effects of viewing distance and direction of vergence-accommodation conflict", Proceedings of SPIE-IS&T Electronic Imaging, SPIE Vol. 7863, 78630P.

(13.) Coni P., Hourlier S., Gueguen A., Servantie X., et al., "A Full Windshield Head-Up Display using Simulated Collimation," Proceedings of Display Week, San Francisco (May 2016)

(14.) LightTrans, "Customized Holographic Screens for HUD Applications," http://www.lighttrans.com/1033.html

CONTACT INFORMATION

Philippe Coni

Display Expert

THALES Avionics SAS

philippe.coni@fr.thalesgroup.com

ACKNOWLEDGMENTS

This work was performed within the framework of several studies funded by the DGAC (Direction Générale de l'Aviation Civile, the French civil aviation authority).

The authors are grateful to Tim PIKE for his careful review of the manuscript.

DEFINITIONS/ABBREVIATIONS

AR-HUD - Augmented Reality HUD

3D - Three-Dimensional

AOI - Angle Of Incidence

AR - Augmented Reality

CCFL - Cold Cathode Fluorescent Lamp

CRT - Cathode-Ray Tube

EMB - Eye Motion Box

FOV - Field Of View

HOE - Holographic Optical Element

HUD - Head Up Display

LCD - Liquid Crystal Display

RGB - Red Green Blue

UV - Ultraviolet (Radiation)

APPENDIX

Philippe Coni, Jean Luc Bardon, and Xavier Servantie

Thales Avionics

doi:10.4271/2017-01-2156