
Cloud-based 360° Video Playout on TV

Presenter: Louay Bassbouss

Demo from Web & Networks Interest Group

The Cloud-based 360° Video Playout allows the viewing of high quality 360° videos on devices with constrained capabilities such as TV sets. It reduces the required bandwidth and processing resources by rendering the field of view (FOV) in the cloud in advance and streaming only the selected FOV to the client. Each FOV video is provided as an individual stream. Transition videos, which enable a smooth switch between FOVs, are generated and provided as individual streams as well.

The 360° player implementation in the browser uses multiple Web APIs like fetch (with readable streams), MSE, EME and the video element. The player requests video segments of the current FOV until the user switches to another view. In that case, the MSE buffer that contains video segments of the current FOV is emptied (from a certain position), and video segments of the transition video or of the new FOV video are fetched from the CDN and appended to the MSE buffer. Information about network latency and actual throughput is essential to determine the bitrate level of the new segments to fetch, so that they arrive at the right time, and to calculate the position at which the buffer will be emptied.
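As an illustration of this buffer management, here is a minimal sketch of how an FOV switch could be handled with fetch and MSE; the segment URL layout, segment duration and helper names are assumptions for this sketch, not details of the actual player.

```typescript
// Illustrative sketch only: emptying the MSE buffer from a computed position and
// appending segments of the new FOV stream. URL layout, segment duration and
// helper names are assumptions, not taken from the actual implementation.

const SEGMENT_DURATION = 1; // seconds per DASH segment (assumed)

async function fetchSegment(streamId: string, index: number): Promise<ArrayBuffer> {
  // Hypothetical CDN layout, e.g. https://cdn.example.com/fov-90/segment-7.m4s
  const res = await fetch(`https://cdn.example.com/${streamId}/segment-${index}.m4s`);
  if (!res.ok) throw new Error(`segment ${index} of ${streamId} not available`);
  return res.arrayBuffer();
}

function whenIdle(sb: SourceBuffer): Promise<void> {
  return sb.updating
    ? new Promise((resolve) => sb.addEventListener('updateend', () => resolve(), { once: true }))
    : Promise.resolve();
}

// On an FOV switch: keep everything that will play before the new segments can
// arrive (based on the latency/throughput estimate), remove the rest, then
// append segments of the transition or new FOV stream instead.
async function switchFov(video: HTMLVideoElement, sb: SourceBuffer,
                         newStreamId: string, estimatedFetchDelay: number): Promise<void> {
  const switchTime = video.currentTime + estimatedFetchDelay;
  const firstNewSegment = Math.ceil(switchTime / SEGMENT_DURATION);
  const removeFrom = firstNewSegment * SEGMENT_DURATION;

  await whenIdle(sb);
  const bufferedEnd = sb.buffered.length ? sb.buffered.end(sb.buffered.length - 1) : 0;
  if (bufferedEnd > removeFrom) {
    sb.remove(removeFrom, bufferedEnd); // empty the buffer from that position onwards
    await whenIdle(sb);
  }

  sb.appendBuffer(await fetchSegment(newStreamId, firstNewSegment));
}
```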



Transcript

© ESA/NASA
Fraunhofer FOKUS, Institute for Open Communication Systems
Cloud-based 360° Video Playout on TV
Web & Networks Breakout Session Demo
Dr. Louay Bassbouss | TPAC 2020 | 26-30 October 2020

Hello, everyone.

My name is Louay Bassbouss, and I work for Fraunhofer Institute for Open Communication Systems.

In this video, I am going to demonstrate our solution, Cloud-based 360° Video Playout on TV, as one of the Web and Networks Breakout Session demos.

If you want to know more about the solution, please visit the link displayed on this slide.

Cloud-based 360° Video Playout on TV
☞ www.fokus.fraunhofer.de/go/360
- Enables 360° video playback on devices with limited processing capabilities like TVs by rendering FOV videos in advance
- Reduces bandwidth consumption by 80-90% by streaming only the field of view (FOV) to the client
- Supports multiple video codecs and streaming formats like DASH
- Uses existing streaming infrastructures, CDNs and Web APIs (MSE, EME, fetch, ReadableStream) for delivery and playback
- Enables 4K FOV resolution from 24K 360° equirectangular videos

Now let's start with the benefits of the Cloud-based 360° Video Playout.

First, it enables 360° video playback on devices with limited processing capabilities like TVs by rendering the field of view videos in advance.

It reduces the bandwidth consumption by 80 to 90%, since only the current field of view is streamed to the client and not the entire 360° video.

It also supports multiple video codecs and streaming formats like MPEG-DASH.

It uses existing streaming infrastructures, and Web APIs for the delivery and playback.

And finally, it enables the playback of high quality 360° videos.

4K FOV / 24K Equirectangular Frame

You can see on this slide, a 24K Equirectangular Frame.

4K FOV Frame

And on this slide, the corresponding field of view frame with 4K resolution, rendered from the previous 24K equirectangular frame.
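(As a rough, illustrative calculation that is not part of the talk: a 24K equirectangular frame is about 24,576 pixels wide and covers the full 360°, so a horizontal field of view of roughly 60° corresponds to 24,576 × 60 / 360 ≈ 4,096 pixels, i.e. 4K width for the rendered FOV.)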

How it works?
1. FOV videos are pre-rendered in the cloud from the input 360° video (Live or VoD)
2. Each FOV video is packaged as a DASH stream and stored in a cloud storage (CDN Origin)
3. FOV DASH streams are delivered to clients through CDNs
4. The Web player uses the fetch and MSE APIs to download the DASH segments and append them to the SourceBuffer
360° Camera → 360° Video → FOV Video Renderer → FOV Video Storage → CDN → Player

Let's take a look at how the cloud pre-rendering approach works.

In the first step, the field of view videos are pre-rendered in the cloud.

Then the pre-rendered videos are packaged as DASH streams and made available in a cloud storage.

In the third step, the DASH streams are delivered to clients via CDNs.

In the last step, the DASH streams are played back in a dedicated player, where the player can use, for example, the fetch and MSE APIs to request and play the DASH segments.
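A minimal sketch of this last step, assuming a hypothetical CDN layout and MIME type: the player attaches a MediaSource to the video element, fetches the DASH initialization and media segments, and appends them to the SourceBuffer.

```typescript
// Minimal, illustrative sketch of step 4 (MIME type and URLs are assumptions).

const video = document.querySelector('video')!;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.640028"');

  // Append one buffer and wait until the SourceBuffer has finished updating.
  const append = (buf: ArrayBuffer) =>
    new Promise<void>((resolve) => {
      sb.addEventListener('updateend', () => resolve(), { once: true });
      sb.appendBuffer(buf);
    });

  const get = async (url: string) => (await fetch(url)).arrayBuffer();

  // Initialization segment of the current FOV stream, then media segments in order.
  await append(await get('https://cdn.example.com/fov-0/init.mp4'));
  for (let i = 1; i <= 8; i++) {
    await append(await get(`https://cdn.example.com/fov-0/segment-${i}.m4s`));
  }
});
```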

Let's take a closer look at the structure of the generated videos.

Let's take a closer look at the pre-rendered videos.
Segments → 1 2 3 4 5 6 7 8
Static FOV Videos: 0°, 90°, 180°, 270°
Transition Videos Left: 0°, 90°, 180°, 270°
Transition Videos Right: 0°, 90°, 180°, 270°
Two types of videos are pre-rendered:
Static FOV Videos: record the FOV at fixed positions and create a DASH stream for each FOV video.
Transition Videos: rotate the virtual camera with a constant speed in different directions (left, right, up, down), starting at each FOV position, and create a DASH stream for each transition video. Transition videos enable a smooth navigation between the FOVs.
The example on this slide shows the generated FOV and Transition videos with a 90° step and for the left and right directions → 12 DASH streams are generated in total: 4 Static FOV Videos, 4 Transition Videos Left, 4 Transition Videos Right.

So we have two types of pre-rendered videos. Static Field of View Videos are recordings of the field of view with the virtual camera fixed at predefined static positions; in the example shown on this slide, we have four static positions: zero degrees, 90 degrees, 180 degrees and 270 degrees. Transition Videos are required to enable a smooth transition between the static fields of view; these are generated by rotating the virtual camera at a constant speed, starting from each static field of view position and in different directions.

The example on this slide considers only the left and right directions.

In this example, 12 DASH streams are generated in total, four Static Field of View Videos, four Transition Videos to the Left, and four Transition Videos to the Right.
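To make that stream layout concrete, here is a small, purely illustrative enumeration of those 12 streams; the manifest URL scheme is a hypothetical example, not the actual one.

```typescript
// Illustrative enumeration of the 12 DASH streams in the 90°-step example.
const FOV_POSITIONS = [0, 90, 180, 270] as const;

type StreamKind = 'static' | 'transition-left' | 'transition-right';

interface FovStream {
  kind: StreamKind;
  startPosition: number; // FOV position in degrees at which the stream starts
  manifestUrl: string;   // DASH MPD for this stream (hypothetical layout)
}

const streams: FovStream[] = FOV_POSITIONS.flatMap((pos): FovStream[] => [
  { kind: 'static',           startPosition: pos, manifestUrl: `/360/static/${pos}.mpd` },
  { kind: 'transition-left',  startPosition: pos, manifestUrl: `/360/left/${pos}.mpd` },
  { kind: 'transition-right', startPosition: pos, manifestUrl: `/360/right/${pos}.mpd` },
]);

console.log(streams.length); // 12 streams: 4 static + 4 left + 4 right
```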

Pilots and Live Cases on TV
Biathlon 2019 - ZDF (LIVE)
Fernsehgarten 2019 - ZDF (LIVE)
360° View - ERT (VOD)
FIFA World Cup 2018 - ERT (VOD)
Kumpel-Tag mit Andy - WDR (VOD)
Basketball 360° - DT (VOD)

The solution is already used in production by many broadcasters, as part of their HbbTV services.

Let's have a closer look at the Biathlon 360° live streaming on TV, which was provided by the German public broadcaster ZDF during the Biathlon World Cup last year.

TV screen with German audio comments: the German public broadcaster ZDF offers an HbbTV application that can be launched via the Red or Green remote control buttons.

Watch Video on YouTube → www.youtube.com/watch?v=Bvmih-PbIwA

TV screen shows the intro page to 360° Live Streaming on TV during the Biathlon World Cup in Oberhof, Germany.

TV screen with German explanations on how to use the 360° live stream: the launched HbbTV application integrates the 360° Web player using the cloud rendering approach.

Let's take a closer look at the web player.
Static FOV Videos: 0°, 90°, 180°, 270°
Transition Videos Left: 0°, 90°, 180°, 270°
Transition Videos Right: 0°, 90°, 180°, 270°
Segments 1-10 in the MSE SourceBuffer of the TV/HbbTV Web Player; TV remote control for navigation; Left Button Pressed.

Now let's take a closer look at the web player, with focus on the buffering algorithm.

We will use the same example shown before with four Static Field of View positions, and with Left and Right Transition Videos.

This is by the way, the same configuration used for the Biathlon live stream.

The player starts playing the field of view video at the static position of zero degrees and keeps one additional segment in the MSE SourceBuffer.

During the playback of the third field of view segment, the user presses the left button, which causes the player to start buffering left transition video segments, starting with segment number four.

The user releases the button during the playback of segment number five, but since segment number six of the new field of view cannot be downloaded in time, segment number seven will be requested instead.

So now let's watch the sequence of this example at normal speed.
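The following sketch only illustrates the shape of that navigation logic; the stream identifiers and the bufferFrom helper are hypothetical, and the real player's internals are not shown in this talk.

```typescript
// Illustrative navigation logic for the buffering example above (hypothetical names).

type PlayerState =
  | { mode: 'static'; fov: number }                                        // e.g. FOV at 0°
  | { mode: 'transition'; direction: 'left' | 'right'; startFov: number };

let state: PlayerState = { mode: 'static', fov: 0 };

// Stub helper for this sketch: in the real player this would fetch segments of the
// named stream and append them to the MSE SourceBuffer (see the earlier sketches).
function bufferFrom(streamId: string, index: number): void {
  console.log(`buffer ${streamId} starting at segment #${index}`);
}

function onLeftButtonPressed(nextSegment: number): void {
  if (state.mode !== 'static') return;
  const startFov = state.fov;
  // Switch buffering to the left transition stream that starts at the current FOV,
  // e.g. segment #4 in the example.
  state = { mode: 'transition', direction: 'left', startFov };
  bufferFrom(`transition-left-${startFov}`, nextSegment);
}

function onLeftButtonReleased(reachedFov: number, nextSegment: number): void {
  if (state.mode !== 'transition') return;
  // Continue with the static stream of the FOV reached on release, e.g. segment #7
  // if segment #6 cannot be downloaded in time.
  state = { mode: 'static', fov: reachedFov };
  bufferFrom(`static-${reachedFov}`, nextSegment);
}
```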

Challenges
Accurate prediction of the timing for downloading the next segment after a user interaction is very important → already identified as an important requirement in the Web & Networks IG.
Segments 1-10: Static FOV Video 0° → Transition Video → Static FOV Video 270°
Press Left Button → Make Decision → Download Seg.: #4 Transition Left at 270°, #5 Transition Left at 0°
Release Left Button → Make Decision → Download Seg.: #6 Static FOV 180°, #7 Static FOV 270°

The most important challenge of this approach is the accurate prediction of the timing for downloading video segments.

This is already identified as an important requirement in the Web and Networks group.

Let's have a look again at the example, to see where the prediction is required.

When the user presses the left button while playing segment number three, the player decides to download segment four of the left transition video at 270 degrees and not segment number five of the left transition video at zero degrees.

Similarly, when the user releases the left button while playing segment number five, the player decides to download segment number seven of the Static Field of View video at 270 degrees.
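As a hedged sketch of what such a timing decision could look like, the function below picks the first segment whose playback deadline can still be met, given estimated network latency and throughput; the formula, names and numbers are illustrative assumptions, not the actual player's logic.

```typescript
// Illustrative timing decision: find the first segment (1-based, segment k covering
// seconds (k-1)*d to k*d) that can still be downloaded before its playback deadline.

interface NetworkEstimate {
  latencyS: number;        // estimated request latency in seconds
  throughputBps: number;   // estimated download throughput in bits per second
}

function firstReachableSegment(
  currentTime: number,      // current playback position in seconds
  segmentDuration: number,  // seconds per segment
  segmentSizeBits: number,  // expected segment size at the chosen bitrate level
  net: NetworkEstimate
): number {
  const downloadTime = net.latencyS + segmentSizeBits / net.throughputBps;
  const earliestArrival = currentTime + downloadTime;
  // Segment k must be fully buffered before its start time (k - 1) * segmentDuration.
  return Math.ceil(earliestArrival / segmentDuration) + 1;
}

// User releases the button while segment #5 (seconds 4..5) is playing.
firstReachableSegment(4.3, 1, 4_000_000, { latencyS: 0.08, throughputBps: 20_000_000 }); // -> 6
firstReachableSegment(4.3, 1, 4_000_000, { latencyS: 0.30, throughputBps: 8_000_000 });  // -> 7
```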

Fraunhofer Institute for Open Communication Systems FOKUS
Kaiserin-Augusta-Allee 31, 10589 Berlin, Germany
www.fokus.fraunhofer.de
Dr. Louay Bassbouss, louay.bassbouss@fokus.fraunhofer.de
For more details → www.fokus.fraunhofer.de/go/360

Thank you for watching this video.

If you have questions, please visit the solution page or contact me.

