
VividQ--Real-Time Holographic 3D Displays to Become Reality.

Holography has come a long way since its invention by Dennis Gabor in 1947, with today's holography encompassing many different areas that at the time of its invention seemed unimaginable.

One such area is real-time holographic 3D displays, often considered the pinnacle of holography. Whilst academic researchers and companies across the globe have continued to invest in developing 3D or pseudo-holographic display technologies, it is only in the last few years that commercial interests have begun to explore real-time holographic 3D displays. This is in part due to the enormous computational power (often called the computational hurdle) that real-time holographic 3D displays demand, which in turn has slowed the development of the technology.

However, all this is about to change with new UK start-up company VividQ. It has developed ground-breaking software for diffractive holography and 3D point cloud data processing to make real-time holographic 3D display screens a commercial reality using standard computing power.

Holography News® caught up with VividQ to find out more about the company, its ground-breaking technology and future applications.

Background

The journey began in 2016 with an idea for analysing large data sets and reducing their size. Computer-generated holograms (CGHs), or digital holograms, were investigated because of their large data sets, and a software system was built that sped up the generation of real-time CGHs by over 1,000 times compared with established methods.

A year later, with the help of angel investors, VividQ was founded in February 2017 in Cambridge by a technical team of PhD graduates from the Universities of Oxford, St Andrews and Cambridge, led by co-founder and CEO Darran Milne.

Since then the company has continued developing its software products for commercial applications, has grown its commercial and technical teams in Cambridge and London, and has made a full release of the software framework. It has also established a holographic display network through partnerships with chipmakers, consumer electronics manufacturers, and display and hardware manufacturers in the US, Taiwan and Europe, to enable its software to be integrated into devices and to make holography a viable commercial display solution for the first time.

The company currently employs 17 people between offices in central Cambridge and London and is looking to expand to 28 employees by the end of 2019. The company also recently secured additional investment funding to further develop and enhance its products.

Current technologies

Existing 3D-like/pseudo holographic technologies that are used in devices such as 3D glasses, head-up-displays (HUDs) or concert projections are partial solutions to the '3D problem', but are intrinsically limited in the content they can display and the level of realism they can achieve.

The Pepper's Ghost effect is often marketed as a hologram but is in fact nothing of the sort. Whilst the effect is impressive, it is simply a 2D image reflected off a transparent surface. The disadvantage of Pepper's Ghost projections is that they only look realistic at a certain distance and must be viewed head on, otherwise the flat nature of the image becomes very obvious.

Another technology is stereoscopy, which is often used in 3D films and presents an illusion of 3D depth, but again there is nothing 3D or holographic involved. Instead the technique presents two slightly offset images on the same screen, each of which is viewed by one of the viewer's eyes to create a perception of 3D depth.

As the viewer's eyes are set a certain distance apart, they view the same screen from two slightly different angles, presenting two different images to the brain to create an illusory 3D effect.
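The geometry behind this convergence can be sketched numerically. Assuming a typical adult interpupillary distance of about 63 mm (an illustrative figure, not from the article), the angle between the two eyes' lines of sight grows sharply as an object gets closer:

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Angle between the two eyes' lines of sight when both converge
    on an object straight ahead at distance_m (symmetric geometry)."""
    return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

# A nearby object demands a much larger convergence angle than a far one.
print(round(vergence_angle_deg(0.5), 2))  # object at 50 cm -> ~7.2 degrees
print(round(vergence_angle_deg(3.0), 2))  # object at 3 m  -> ~1.2 degrees
```

The brain uses this angle as a depth cue, which is why presenting each eye a correctly offset image is enough to create an impression of depth.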

Problems with AR and VR headsets

Both stereoscopy and Pepper's Ghost effects have found their way into a number of Augmented Reality (AR) and Virtual Reality (VR) headsets. Neither technology achieves natural depth perception, as both display virtual objects in only one depth plane, meaning the headset shows all virtual objects in focus simultaneously, as on a TV screen.

As a consequence, this lack of realistic depth causes side effects such as nausea, headaches and eye strain for headset users over extended periods of use. The theory behind these side effects is known as the Vergence-Accommodation Conflict (VAC), summarised below.

Vergence-Accommodation Conflict

We'll begin with a couple of definitions:

Accommodation is the ability of the eye to change its focus from distant to near objects: the lens changes shape to keep an object in focus and to focus light onto the retina, providing an important depth cue.

Vergence can be described as the simultaneous movement of both eyes in opposite directions to obtain or maintain single binocular vision. Thus, when looking at a nearby object, the eyes converge on it, which is another depth cue.

When an observer is looking at a natural scene, vergence and accommodation are always synchronised. The VAC issue arises when we look at virtual information that is all in focus, i.e. in one depth plane. For example, a virtual world experienced through a VR headset display usually appears around 3m away. The actual screen is of course much closer, but optics in the headset make it appear as if it were at that distance. Your eyes remain focused on the screen the whole time, accommodating the near-distance screen while converging on the virtual world 3m away. Our eyes adapt to make sense of the virtual view, but after a while the side effects of headaches, nausea and fatigue ensue.
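The size of the conflict is commonly expressed in diopters (inverse metres). A minimal sketch, assuming a headset focal plane fixed at 2 m (a representative value for illustration, not a quoted specification):

```python
def vac_mismatch_diopters(vergence_dist_m, accommodation_dist_m):
    """Vergence-accommodation mismatch in diopters (1/m): the gap between
    where the eyes converge and where the optics force them to focus."""
    return abs(1 / vergence_dist_m - 1 / accommodation_dist_m)

focal_plane_m = 2.0  # assumed fixed focal distance of the headset optics
for obj_m in (3.0, 1.0, 0.4):
    print(f"virtual object at {obj_m} m: "
          f"{vac_mismatch_diopters(obj_m, focal_plane_m):.2f} D mismatch")
```

The mismatch grows fastest for near virtual objects, which is why close-range interaction in headsets is the most fatiguing; a display that places each object at its true optical depth keeps this value at zero.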

In the case of other display technologies such as 3D cinema or 3D TVs, VAC-related symptoms can also occur, though usually to a slightly lesser degree, as these displays do not completely block out the real world in the way VR does. Genuine real-time 3D holographic projection displays avoid the VAC issue entirely, as they recreate the light reflected from a real-world scene, ensuring all depth cues are synchronised and providing a more realistic and natural viewing experience.

VividQ's solution for full-3D holography

As previously described, up until now the computational hurdle and the VAC issue have prevented the generation of true real-time holographic 3D displays, thwarting the commercial and practical viability of the technology. VividQ has now solved both issues using a patented interoperable software framework that satisfies all visual depth cues, removes the side effects caused by VAC, and thereby enables true real-time holography to become a commercially viable product.

The proprietary software framework uses state-of-the-art algorithms that can generate holograms with up to 250 depth planes, and with up to 60 depth planes in real time, in milliseconds, on standard computing hardware. This enables 3D holographic images to be generated in real time for users to view and interact with. The software comprises four modules, referred to as capture, squeeze, core and view.

The software captures the 3D and point cloud data and converts it into a holographic format (capture), compresses the data (squeeze), and generates the holographic images in real time using standard GPUs (graphics processing units) and computing power (core), prior to displaying the diffractive holograms generated from the 3D models and point cloud data on a spatial light modulator (SLM).

SLMs are found in many of the AR/MR solutions available in the market today, but are typically used as high-resolution 2D displays. Using view, they can be repurposed to provide a full 3D visual experience and also correct for aberrations/defects.
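VividQ's algorithms are proprietary, but the basic idea of turning a point cloud into a diffractive hologram can be illustrated with the classic (and much slower) point-source method: superpose a spherical wavefront from every 3D point at every SLM pixel and keep only the resulting phase. The wavelength, pixel pitch and resolution below are illustrative assumptions, not VividQ parameters.

```python
import cmath
import math

def point_cloud_hologram(points, wavelength=520e-9, pitch=8e-6, n=32):
    """Naive point-source CGH: for each SLM pixel, sum the spherical waves
    emitted by every (x, y, z, amplitude) point, then keep only the phase
    (a phase-only SLM displays exactly one phase value per pixel)."""
    k = 2 * math.pi / wavelength          # wavenumber
    half = n / 2
    hologram = []
    for row in range(n):
        y = (row - half) * pitch          # pixel y-coordinate on the SLM
        line = []
        for col in range(n):
            x = (col - half) * pitch      # pixel x-coordinate on the SLM
            field = 0j
            for px, py, pz, amp in points:
                r = math.sqrt((x - px) ** 2 + (y - py) ** 2 + pz ** 2)
                field += amp * cmath.exp(1j * k * r) / r  # spherical wave
            line.append(cmath.phase(field))
        hologram.append(line)
    return hologram

# Two points at different depths (10 cm and 15 cm) encoded in one hologram.
pts = [(0.0, 0.0, 0.10, 1.0), (1e-4, 0.0, 0.15, 1.0)]
holo = point_cloud_hologram(pts)
print(len(holo), len(holo[0]))  # 32 32
```

This brute-force approach scales as pixels times points, which is precisely the computational hurdle described earlier; real-time systems replace it with smarter algorithms and heavy GPU parallelism.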

The ecosystem--supply chain partnerships

In addition to interoperability, the company believes its software will be at the heart of future 3D displays and has built an ecosystem of expert hardware companies to enable this. These companies include leading manufacturers of liquid crystal on silicon (LCoS) displays, compute platforms (chipsets and graphics processing units) and optical engines.

The company's extensive ecosystem partnerships with many hardware organisations enables the software to be tested and tailored as required, with the aim of making VividQ's software the standard for original equipment manufacturers to develop fully holographic augmented reality smart glasses, head-up displays and consumer electronics.

Real-time holographic display advantages

Some of the advantages of real-time 3D holographic displays can be summarised as follows:

* Simple optical design and minimal hardware footprint--ideal for mobile and wearable devices;

* Aberration correction--imperfections in the hardware can be corrected for in the software;

* Comfortable viewing experience--avoiding VAC-related side effects such as nausea;

* Efficient--light is steered to where it is needed, and no light is wasted or blocked out.

Future applications

The current primary application is augmented reality devices, but future applications are vast and include desktop displays, consumer electronics, smartphones, tablets, TVs, HUDs, medical devices, depth-sensing cameras, immersive simulated environments, CAD and Internet of Things devices, among others.

Holography News looks forward to updating our subscribers as VividQ moves forward with its business and real-time 3D holographic displays become a reality.

www.vivid-q.com | https://medium.com/vividq-holographyblog

Caption: Pepper's Ghost: A viewer looking through the red rectangle sees a ghost floating next to the table. The illusion is created by a large piece of glass or a half-silvered mirror, situated between viewer and scene (green outline). The glass reflects a mirror-image room (left) that is hidden from the viewer. If the mirror-image room (left) is darkened, it does not reflect well in the glass. The empty room (top) is brightly lit, making it very visible to the viewer. By adjusting the light up or down, the ghost can appear and disappear.

Caption: Left image: AR and VR one depth plane. Right image: holographic display with full depth perception.

Caption: The extent that eyes converge depends on the distance from your eyes to the object.

Caption: An example of VividQ's ecosystem.

Caption: Examples of a real-time 3D holographic display using VividQ's software when viewed through a headset. The elephant in the rear plane appears out of focus at a different depth plane. As the elephant moves forward towards the viewer it comes into focus with all its three-dimensionality and natural depth perception.

Please Note: Illustration(s) are not available due to copyright restrictions.
COPYRIGHT 2019 Reconnaissance International
No portion of this article can be reproduced without the express written permission from the copyright holder.

Publication: Holography News | Date: Jul 1, 2019