Demonstrating Portable Holograms

Over the past months we have been busy integrating the new Intel cameras, the Microsoft HoloLens and the Unreal Engine with our existing streaming platform. After a summer of troubles we now have a moderately stable base platform for our future case studies and developments. To celebrate this, we were invited to demonstrate the general technology at the New Time Music conference in Turku, Finland, on 30 September. The video illustrates the portable single-camera setup used for this demonstration. The event was a great success, although interest there was inevitably focused on the audio, which we have not yet developed sufficiently. Following the demonstration we were invited to another conference in February.

Our initial technology relies on active depth cameras, in this case the Intel RealSense L515, which has unfortunately been discontinued. This camera scans the scene with an infrared laser to provide a reasonable estimate of each pixel's depth in meters. However, it is sensitive to sunlight, reflective surfaces and certain material properties, which requires the surrounding room to be largely windowless, with soft fabric or other non-reflective surfaces. Other cameras may handle more challenging environments a little better. Ordinarily you would want a minimum of three such cameras, although that is less portable and less straightforward to set up. In the single-camera case it works much like a webcam: plug it in and it aligns and calibrates automatically. Prior to using these cameras we relied upon high-quality stereo cameras to produce a sharper capture; being passive, these cameras do not interfere with each other. However, the stereo option was not portable, as it needed a GPU per camera pair.
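To give a feel for what the capture side involves, the sketch below reads a depth frame from a RealSense camera using Intel's pyrealsense2 library and converts it into a point cloud. This is a minimal single-camera illustration rather than our actual pipeline; the stream resolution and frame rate are assumptions for the example.

```python
# Minimal single-camera depth capture with Intel's pyrealsense2 library.
# A sketch only; the resolution and frame rate are assumptions, and this
# is not our production pipeline.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# The L515 streams 16-bit depth; here we assume 640x480 at 30 fps.
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

pc = rs.pointcloud()  # converts a depth frame into 3D vertices

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    # Depth is reported in meters at any pixel, e.g. the image center:
    center_m = depth.get_distance(320, 240)
    print(f"Depth at image center: {center_m:.2f} m")
    # The same frame as a point cloud, one vertex per pixel:
    points = pc.calculate(depth)
    print(f"Point cloud with {points.size()} vertices")
finally:
    pipeline.stop()
```

In the multi-camera setup the same per-camera point clouds would additionally need to be registered into a common coordinate frame, which is where the extra calibration effort comes in.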

The other key component for this project is the Microsoft HoloLens 2, a mixed-reality headset with transparent glasses that allow the user to see the surrounding room and, in principle, allow others to see their eyes. Whilst the headset is lightweight and comfortable, it is still difficult to see the person's eyes when they are wearing it, and the field-of-view is limited. We are still awaiting a better hardware solution. Despite some troubles in getting our technology to integrate, it does produce the experience we were looking for. The symmetric experience, where both sides wear a headset, is still not ideal. Other research groups take various approaches to resolving this, for example by trying to remove the headset virtually.

Holographic person

After streaming over the network via our web services, the 3D hologram appears as shown in the picture. At present it remains a little unstable at the edges, especially when only a single camera is used. Our next steps are to develop or integrate algorithms that improve image stability, and to co-design the general interface for using this technology as an immersive telepresence solution. However, what is shown here is the baseline for our future case studies, since this project is not devoted to advancing the fundamental technology much further than this.
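Much of the edge instability comes from noisy depth values around object boundaries. One direction, sketched below, is the chain of post-processing filters that pyrealsense2 already provides (spatial, temporal and hole-filling). This illustrates the general approach only, not our final stabilisation algorithm.

```python
# Sketch: smoothing depth-edge noise with pyrealsense2's built-in
# post-processing filters. One possible approach, not our final
# stabilisation algorithm.
import pyrealsense2 as rs

spatial = rs.spatial_filter()            # edge-preserving spatial smoothing
temporal = rs.temporal_filter()          # averages depth over recent frames
hole_filling = rs.hole_filling_filter()  # fills small depth dropouts

def stabilise(depth_frame):
    """Apply the filter chain to a raw depth frame."""
    f = spatial.process(depth_frame)
    f = temporal.process(f)
    f = hole_filling.process(f)
    return f
```

These filters trade latency for smoothness, and the temporal filter in particular can smear fast motion, so any real deployment would need the parameters tuned per scene.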