Streaming VR video

What is VR video? Is it VR video when you watch 360 content in an Oculus headset? Or is it VR video when your right and left eyes see slightly different images and you can actually perceive depth? Both are required for a real VR experience, but how do you stream them to your headset?


Video streaming is something that we experience every day. There is interactive streaming, which we experience in Zoom, and on-demand streaming, which we experience in Netflix or YouTube.

But there is also something that is called VR video streaming.

Most of the time, by VR video people mean that you can move your head while wearing a head-mounted display (Oculus, Vive, ...) and look freely in all directions, but without control over the camera's location in the virtual world. This is also known as 360 video.

Other times, people want to see real depth, and that means that slightly different images must be presented for the left and right eyes, and they call that VR video as well. This is known as stereoscopic video.

Is it easy to bring these two ideas together? The answer is: not really, because 360 video requires a camera that rotates but stays in one place, whereas stereoscopic video requires two cameras that look in the same direction but from slightly different places.
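A back-of-the-envelope calculation illustrates why depth needs two viewpoints: the angular disparity between the eyes shrinks quickly with distance. The sketch below is only an illustration, not part of any particular pipeline; the 6.4 cm baseline is an assumed, typical interpupillary distance.

```python
import math

def angular_disparity(baseline_m: float, distance_m: float) -> float:
    """Angle (in degrees) between the two eyes' lines of sight to a
    point straight ahead at the given distance -- the depth cue that a
    single 360 camera, rotating in place, cannot provide."""
    return math.degrees(2 * math.atan((baseline_m / 2) / distance_m))

# Assumed interpupillary distance of 6.4 cm.
for d in (0.5, 2.0, 10.0):
    print(f"{d:4.1f} m -> {angular_disparity(0.064, d):.2f} deg")
```

At half a metre the disparity is several degrees; at ten metres it is a fraction of a degree, which is why stereo matters most for nearby objects.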

If we want both, we must bring at least two 360 cameras and compute (inter- or extrapolate) the stereoscopic video in real time: this allows horizontal head rotation by several degrees. We can cover 360 horizontal degrees if we use three cameras, and we can also go vertical with four 360 cameras arranged in a pyramid.
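Where such an inter-/extrapolation module would sit can be shown with a deliberately naive stand-in: cross-fading the frames of two neighbouring cameras as a function of the viewpoint's position between them. Real view synthesis would warp pixels along disparity vectors rather than blend them; all names here are hypothetical.

```python
import numpy as np

def blend_views(left: np.ndarray, right: np.ndarray, t: float) -> np.ndarray:
    """Naive intermediate-view synthesis by cross-fading two frames.
    t = 0 yields the left camera's frame, t = 1 the right one; a real
    interpolator would use per-pixel disparity instead of a blend."""
    assert left.shape == right.shape and 0.0 <= t <= 1.0
    mix = (1.0 - t) * left.astype(np.float32) + t * right.astype(np.float32)
    return mix.astype(left.dtype)

# Two dummy 2x2 grayscale frames standing in for captures from two
# 360 cameras a few centimetres apart.
a = np.zeros((2, 2), dtype=np.uint8)
b = np.full((2, 2), 100, dtype=np.uint8)
print(blend_views(a, b, 0.5))  # halfway viewpoint: all pixels 50
```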

But how do we record and compress this kind of video data from up to four 360 cameras? All of these video frames are extremely similar, so we should be able to compress them relative to each other, inspired by MPEG's coding for stereoscopic videos. And perhaps we can code in a different manner and make it easier for the client to show stereoscopic frames?
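The potential gain from coding the views relative to each other can be demonstrated with a toy experiment: compress a second view independently, then compress only its residual against the base view. zlib stands in for a real video codec and the synthetic frames are assumptions, but the effect mirrors the disparity-compensated prediction used in MPEG's multiview coding.

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)
# Base view: a noisy synthetic "frame"; second view: nearly identical,
# mimicking the strong similarity between neighbouring 360 cameras.
view_a = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noise = rng.integers(-2, 3, size=view_a.shape)
view_b = (view_a.astype(np.int16) + noise).clip(0, 255).astype(np.uint8)

independent = len(zlib.compress(view_b.tobytes()))
# Inter-view prediction: store only the residual against the base view.
residual = (view_b.astype(np.int16) - view_a.astype(np.int16)).astype(np.int8)
predicted = len(zlib.compress(residual.tobytes()))
print(independent, predicted)  # the residual stream is far smaller
```

The residual is small-valued and low-entropy, so it compresses far better than the second view on its own, and the decoder can reconstruct the view exactly by adding the residual back to the base view.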

In this thesis, we explore the means of compression, transfer, decompression and rendering.

Learning outcome

  • Understand bandwidth adaption for streaming video over the Internet.
  • Learn details about video coding, compression, and inter- and extrapolation.
  • Understand how to conduct user studies to assess whether a video coding and compression method works.
  • Learn how to use CUDA for hardware compression, decompression, interpolation and extrapolation.
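The bandwidth adaptation mentioned in the first point above can be sketched as a minimal rate-based heuristic: pick the highest rung of a bitrate ladder that fits within a safety fraction of the measured throughput. The ladder values and the 0.8 margin are illustrative, not taken from any particular player.

```python
def pick_representation(throughput_kbps: float, ladder_kbps: list,
                        safety: float = 0.8) -> float:
    """Rate-based adaptation: choose the highest bitrate that stays
    within a safety fraction of the measured network throughput."""
    budget = throughput_kbps * safety
    feasible = [r for r in sorted(ladder_kbps) if r <= budget]
    return feasible[-1] if feasible else min(ladder_kbps)

ladder = [500, 1500, 3000, 6000, 12000]  # hypothetical bitrates (kbit/s)
print(pick_representation(5000, ladder))  # budget 4000 -> picks 3000
```

Real players additionally smooth throughput estimates and consider buffer occupancy, but this captures the basic decision each time a new segment is requested.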


Qualifications
  • IN2140 or equivalent

  • have taken IN3230 or take IN4230, or, when available, take a course specializing in mobile communication

  • take IN5050 and IN5060

Published 18 Oct. 2021 13:55 - Last modified 18 Oct. 2021 16:59
