Note: The Apollo 11 demo was intended to be presented onstage at Microsoft Build. The live showcase has been postponed due to unforeseen onsite technical issues. Although we were unable to show the Apollo 11 experience onstage today, we're excited to help others understand the potential of using HoloLens 2 to learn and share stories in ways that were never possible before.
Creative communities across entertainment, visualization, design, manufacturing, and education eagerly anticipate Unreal Engine 4 native support for HoloLens 2, which Epic has confirmed will be released by the end of May. To kick off Microsoft Build, the Unreal Engine team unveiled a remarkable interactive visualization of the Apollo 11 lunar landing, which celebrates its 50th anniversary this year.
ILM Chief Creative Officer John Knoll and Andrew Chaikin, space historian and author of A Man on the Moon, present the multi-user HoloLens 2 demonstration, which recreates the historic 1969 event in meticulous detail. The demo presents a vision for the future of computing in which manipulating high-quality 3D content using a headset is as accessible as checking email on a smartphone.
The lifelike experience offers a bird’s-eye view of many aspects of the Apollo 11 mission, including the launch itself, an accurate model of the Saturn V, a detailed reenactment of the lunar landing, and a look at Neil Armstrong’s first steps on the moon reconstructed based on data and footage from the mission.
Highlights that would be impossible to convey with this level of detail in any other medium include the three stages of the Saturn V, the form-follows-function design of the Eagle lander, and the lunar module’s suspenseful descent to the moon’s surface.
“When we combine the power of HoloLens with the power of Azure, our partners can deliver transformative solutions. From automotive to manufacturing, from architecture to healthcare, our customers need highly precise and detailed representation of their 3D content,” said Alex Kipman, technical fellow in Microsoft’s Cloud and AI group. “Epic just showed us how to directly stream high-polygon content, with no decimation, to HoloLens. Unreal Engine enables HoloLens 2 to display holograms of infinite detail, far in excess of what is possible with edge compute and rendering alone.”
The demo’s visuals, rendered with Unreal Engine 4.22 on networked PCs, stream wirelessly to two HoloLens 2 devices, while Azure Spatial Anchors creates a shared experience between Knoll and Chaikin – offering a glimpse into the potential of photorealistic, social mixed reality experiences. The two presenters collaborated in the environment, interacting with the same holograms in the same space, an exchange that looked and felt simple and seamless but was in fact highly complex.
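The shared-hologram flow generally works like this: one device creates a spatial anchor at a real-world pose and uploads it to the cloud, then passes the anchor's identifier to the other device, which resolves the same anchor and positions its holograms relative to it. The sketch below models that flow with simplified mock classes; the names and methods are illustrative stand-ins, not the real Azure Spatial Anchors SDK.

```python
import uuid

class MockAnchorService:
    """Illustrative stand-in for a cloud spatial-anchor service (hypothetical)."""
    def __init__(self):
        self._anchors = {}  # anchor id -> real-world pose

    def upload(self, pose):
        # Device A creates an anchor at a real-world pose and uploads it,
        # receiving a shareable identifier back.
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = pose
        return anchor_id

    def resolve(self, anchor_id):
        # Device B resolves the same anchor by its identifier.
        return self._anchors[anchor_id]

def hologram_position(anchor_pose, offset):
    # Both devices place the hologram at the same offset from the shared
    # anchor, so it appears at the same real-world spot for every participant.
    return tuple(a + o for a, o in zip(anchor_pose, offset))

# Device A anchors the scene and shares the id.
service = MockAnchorService()
anchor_id = service.upload(pose=(1.0, 0.0, 2.0))

# Device B receives the id (e.g., over the session's network layer),
# resolves the anchor, and places the hologram at the agreed offset.
resolved_pose = service.resolve(anchor_id)
print(hologram_position(resolved_pose, offset=(0.0, 0.5, 0.0)))  # (1.0, 0.5, 2.0)
```

The key design point is that only the small anchor identifier travels between devices; each headset maps the identifier back to the same physical location, which is what lets two users see one hologram occupy one spot in the room.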
The demo also takes advantage of HoloLens 2’s instinctual interactions, which let users naturally move their heads and hands to touch and manipulate the holograms in front of them – for example, bringing their hands together and pushing them apart to see the unique components of the Saturn V rocket as detached, individual units.
Finally, Unreal Engine 4’s support for Holographic Remoting brings high-end PC graphics to HoloLens devices. The Apollo 11 demo features a staggering 15 million polygons in a physically based rendering environment with fully dynamic lighting and shadows, multi-layered materials, and volumetric effects.