Graz (A), December 2019 - Dieter Schmalstieg, a researcher at Graz University of Technology, is working on a method combining the advantages of cloud computing and virtual reality. This method will allow computer games to be displayed on inexpensive VR headsets in unsurpassed quality.
Streaming services such as Netflix or Amazon Prime are widely used; however, the next wave of digital media - cloud gaming - is imminent. This technology is similar to video-on-demand services. A computer game is run on a server in the cloud. The players access the server via an internet connection and receive an audio-video stream on their personal devices. Players no longer have to own a powerful gaming device; instead, they just need a fast internet connection capable of streaming large amounts of data from the cloud with low latency.
Cloud computing has the potential to elevate VR games to the next level. However, the bandwidth requirements are still challenging. Compared with rendering for a conventional flat screen, a fluid VR display requires up to ten times more computational performance to generate enough pixels and enough frames per second. Traditional video transmission is easily pushed beyond its limits. Dieter Schmalstieg, head of the Institute of Computer Graphics and Vision, and his team have developed a novel method with breakthrough potential for untethered VR experiences.
Their method, called "Shading Atlas Streaming", can deliver a compelling VR experience with significantly fewer bits per second transmitted over the network. Schmalstieg explains, "We are not streaming videos, but geometrically encoded data, which is decoded on the VR headset and converted into an image."
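The core idea can be illustrated with a toy sketch. Here, the server shades small surface patches into fixed-size blocks of a shared texture atlas and sends the atlas plus compact geometry; the client maps the blocks back onto the geometry to produce the final image. All names, sizes, and the simple grid packing below are illustrative assumptions, not the authors' actual implementation:

```python
# Toy illustration of packing shaded surface patches into a texture atlas.
# Block size and grid layout are illustrative assumptions only.

def pack_atlas(patch_ids, atlas_size=1024, block=32):
    """Assign each patch a fixed-size block in a square texture atlas.

    Returns a dict mapping patch id -> (u, v) pixel origin of its block.
    """
    blocks_per_row = atlas_size // block
    capacity = blocks_per_row * blocks_per_row
    if len(patch_ids) > capacity:
        raise ValueError("atlas full: %d patches, %d blocks"
                         % (len(patch_ids), capacity))
    placements = {}
    for i, pid in enumerate(patch_ids):
        row, col = divmod(i, blocks_per_row)
        placements[pid] = (col * block, row * block)
    return placements


# The server shades each patch into its block; the client looks up the
# same block when rasterizing that patch's triangles on the headset.
layout = pack_atlas(["floor", "wall", "chair"])
```

Because each patch keeps its atlas position from frame to frame, consecutive atlas images stay temporally coherent, which is favorable for standard video compression.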
Latency – the temporal delay caused by signal transmission, storage, or processing of data packets – is compensated for by the system. "It is physically impossible to remove all latency, but our encoding allows correct images to be predicted for a small temporal window into the future. As a result, physical latency is compensated for, and the user does not perceive any delays," says Schmalstieg. Only a few pixel errors from mispredictions remain – too few for users to perceive.
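A minimal sketch of such forward prediction, assuming simple linear extrapolation of the head position from its current velocity (the function name and the prediction model are illustrative assumptions; the actual system's predictor is not described here):

```python
def predict_position(pos, vel, latency_s):
    """Extrapolate a head position a short time into the future.

    pos: current (x, y, z) position in meters
    vel: current (vx, vy, vz) velocity in m/s
    latency_s: expected end-to-end latency in seconds
    """
    return tuple(p + v * latency_s for p, v in zip(pos, vel))


# Render the frame for where the head is expected to be when the frame
# actually reaches the display, so the transmission delay is hidden.
future = predict_position((0.0, 1.6, 0.0), (0.2, 0.0, 0.0), 0.05)
```

If the prediction is slightly wrong, only small regions of the image differ from the correct view, which matches the "few pixel errors" the researchers describe.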
In practice, it is important to be able to integrate the new technology into existing infrastructure. For this purpose, the researchers use conventional MPEG video compression to encode and transmit the data. MPEG decoding capabilities already exist in VR headsets; therefore, Shading Atlas Streaming can be used without investing in new hardware.
Shading Atlas Streaming is generally applicable to all areas involving 3D data and VR headsets. The researchers are working with US chip manufacturer Qualcomm on commercial exploitation of their research results.