Remote Rendering for VR

Abstract: The aim of this thesis is to study and advance technology relating to remote rendering of Virtual Reality (VR). In remote rendering, rendered content is commonly streamed as video images in network packets from a server to a client. Experiments are conducted throughout this work with varying networks and configurations, as well as with different technologies that enable or improve remote VR experiences. As an introduction to the field, the thesis begins with related studies on 360-video. Here, a statistic based on throughput alone is proposed for lightweight performance monitoring of encrypted HTTPS 360-video streams. The statistic indicates the potential for stalls in the video stream, which may be of use to network providers wanting to allocate bandwidth optimally. Moving on from 360-video to real-time remote rendering, a wireless VR adapter, TPCAST, is studied, and a method for monitoring the input and video throughput of this device is proposed and implemented. With the monitoring tool it is possible, for example, to identify video stalls that occur in TPCAST and thus establish a baseline of its robustness in terms of video delivery. Having determined this baseline, we move on to developing a prototype remote rendering system for VR. The prototype has so far been used to study the bitrate requirements of remote VR and to develop a novel technology that reduces the image size from a codec perspective by utilizing the Hidden Area Mesh (HAM) that is unique to VR. By reducing the image size, codecs can run faster, saving time each frame and potentially reducing the latency of the system.
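The abstract does not spell out how the HAM-based image-size reduction works. Purely as an illustration, the sketch below assumes the reduction amounts to cropping each rendered eye image to the bounding box of its visible (non-HAM) pixels before handing it to the encoder; the function name, mask representation, and resolution are hypothetical and not taken from the thesis.

```python
import numpy as np

def crop_outside_ham(frame: np.ndarray, ham_mask: np.ndarray):
    """Crop a rendered eye image to the bounding box of pixels that lie
    outside the Hidden Area Mesh (HAM), i.e. pixels the lenses can show.

    frame    -- H x W x 3 rendered eye image
    ham_mask -- H x W boolean array, True where a pixel is hidden by the HAM
    Returns the cropped image and the (top, left) offset needed to place
    it back into a full-size frame on the client.
    """
    visible = ~ham_mask
    rows = np.flatnonzero(visible.any(axis=1))
    cols = np.flatnonzero(visible.any(axis=0))
    top, bottom = rows[0], rows[-1] + 1
    left, right = cols[0], cols[-1] + 1
    return frame[top:bottom, left:right], (top, left)

# Hypothetical example: a 1440 x 1600 eye image where the HAM hides
# horizontal bands at the top and bottom of the frame.
frame = np.zeros((1600, 1440, 3), dtype=np.uint8)
ham_mask = np.zeros((1600, 1440), dtype=bool)
ham_mask[:100, :] = True      # assumed hidden band at the top
ham_mask[-100:, :] = True     # assumed hidden band at the bottom
cropped, offset = crop_outside_ham(frame, ham_mask)
print(cropped.shape, offset)  # smaller image handed to the encoder
```

Whatever the exact mechanism, the motivation stated in the abstract is the same: a smaller input lets the encoder and decoder finish sooner each frame, which is where the potential latency saving comes from.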
