Video Optimization
This chapter describes how to optimize video across your network. Video is one of the fastest growing technologies in business today, supporting an increasing number of applications and business processes, from training content to live broadcasts. Whether as video-heavy applications or streaming media, video frequently consumes 30 to 60 percent of WAN bandwidth and can comprise more than half of total traffic.
This chapter includes the following sections:
•  Overview of Video Optimization
•  HTTP Stream Splitting
•  Video On-Demand with HTTP Prepopulation
•  On-Demand Video Caching
Overview of Video Optimization
You can use video to disseminate information either through live or recorded content. As of RiOS 9.1, these are the solutions for video optimization:
•  Live streaming with stream splitting - Live streams are available only at a specific time. Examples include video streams of a live sporting event or a broadcast of an executive event to the workforce. In RiOS 7.0 or later, you can optimize live streaming video with HTTP stream splitting.
For more information about HTTP stream splitting, see HTTP Stream Splitting.
•  On-demand with HTTP prepopulation - On-demand streams are stored on a server and transmitted when requested by a user: for example, training videos. In RiOS 7.0 or later, you can optimize pre-recorded video by using HTTP prepopulation.
For more information about on-demand with HTTP prepopulation, see Video On-Demand with HTTP Prepopulation and HTTP Prepopulation.
•  On-demand video caching - On-demand video streams with cache eligible headers are stored as entire objects by the web proxy feature on the client-side SteelHead, enabling subsequent requests for the same video to be served locally from the cache.
For more information about web proxy, see Overview of the Web Proxy Feature and the SteelCentral Controller for SteelHead Deployment Guide. For more information about on-demand video caching, see On-Demand Video Caching.
Distribution of video impacts not only overall bandwidth use but also other services running on the network. In the case of live broadcasting, you might need multiple simultaneous streams per broadcast to support multiple bit rates (for example, remote or wireless workers might watch video at a lower bit rate, while viewers in the office might watch video at a higher bit rate for a larger screen). As the number of streams increases, the likelihood increases that other services, such as business applications, are affected.
Depending on the codec, resolution, and software in use, video distribution can use anywhere from 16 Kbps (for a low-resolution, highly compressed video stream optimized for a small display) to 15 Mbps or more (for a high-resolution HDTV stream) per stream. Typical enterprise video streams use between 350 and 500 Kbps to support a single user desktop application.
Compare these speeds to typical branch office connectivity, in which each user draws a separate stream, and it becomes clear that connectivity can be overwhelmed quickly and does not scale well. For example, a typical T1 circuit running at 1.544 Mbps (or an E1 running at 2.048 Mbps) might be able to serve users' needs for many business applications, but it cannot support more than 4 or 5 concurrent video streams. When the entire company might be watching a video broadcast or video conference at the same time, the need for efficient delivery of video across the WAN becomes clear.
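The capacity arithmetic above can be checked with a short calculation. This is a rough sizing sketch using the chapter's own figures; a real deployment should also reserve headroom for other business applications.

```python
# Rough capacity estimate: how many concurrent video streams fit on a link.

def max_concurrent_streams(link_kbps: float, stream_kbps: float,
                           reserved_fraction: float = 0.0) -> int:
    """Return the number of whole streams that fit on the link after
    optionally reserving a fraction of capacity for other traffic."""
    usable = link_kbps * (1.0 - reserved_fraction)
    return int(usable // stream_kbps)

# A T1 (1.544 Mbps) carrying typical 350-500 Kbps enterprise streams:
t1_kbps = 1544
print(max_concurrent_streams(t1_kbps, 350))  # 4
print(max_concurrent_streams(t1_kbps, 500))  # 3
```

At 350 Kbps per stream, the T1 tops out at four concurrent viewers, matching the "4 or 5 concurrent video streams" figure above.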
When you optimize video streams of either type, you reduce the overall impact of video traffic on the WAN, thereby ensuring service continuity and optimal user experience.
HTTP Stream Splitting
RiOS uses HTTP stream splitting to optimize the following different live video technologies:
•  Microsoft Silverlight (RiOS 7.0 or later)
•  Adobe HTTP Dynamic Streaming (RiOS 7.0 or later)
•  Apple HTTP Live Streaming (RiOS 8.5 or later)
You can also optimize these video technologies for on-demand delivery. For details, see Video On-Demand with HTTP Prepopulation.
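The technologies above all advertise their media fragments through plain-text manifests; Apple HTTP Live Streaming, for example, uses the M3U8 format. The manifest text below is a made-up example in standard M3U8 form, shown only to illustrate how segment URIs are listed.

```python
# Sketch: extracting media segment URIs from an HLS (M3U8) manifest.
manifest = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:10.0,
segment42.ts
#EXTINF:10.0,
segment43.ts
"""

def segment_uris(m3u8_text: str) -> list[str]:
    """Return the segment URIs: non-blank lines that are not # tags."""
    return [line.strip() for line in m3u8_text.splitlines()
            if line.strip() and not line.startswith("#")]

print(segment_uris(manifest))  # ['segment42.ts', 'segment43.ts']
```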
Note: This section requires that you be familiar with your origin server and video encoder.
An unoptimized live video stream can saturate a T1 link with as few as four viewers. Normally, each client that connects to view video draws its own stream, quickly exhausting the resources of smaller branch office links. With stream splitting, one stream is sent from the data center to each branch office, and software at each of the branch offices splits or replicates the stream for each individual client connecting.
You can use HTTP stream splitting to reduce the redundancy of streams operating between the head-end video server and the branch office clients. When you enable stream splitting, the first request for a video stream is sent over the WAN, and subsequent redundant requests are served locally by the SteelHead once the first request is complete. As a result, only one copy of the stream is sent across the WAN, no matter how many viewers are tuned in to the live stream. In RiOS 9.2 and later, the SteelHead identifies these streams using the video manifest file or video header metadata.
Figure: Video Streaming Before and After HTTP Splitting
You can deliver seamless live and on-demand video. By using new streaming technology, you can ensure viewers always get the quality of video best suited for their conditions. Viewers with more bandwidth and processing power receive a higher-quality video stream than viewers with less bandwidth and processing power.
Note: Stream splitting saves bandwidth even though all clients maintain an active connection to the video source; only the redundant streams are removed from the WAN.
RiOS 9.1 and later improve live video stream splitting with the following enhancements:
•  The stream splitting cache holds more video fragments for a longer period of time to account for clients that could be out of sync or slower to play back.
•  A new report plots the cache hit count over time for a particular live video, indicating the number of video requests that were served locally from the cache instead of being fetched over the WAN. The graph also includes a plot for the total number of live video sessions intercepted.
•  The ability to enable video stream splitting on a per-host basis. The ability to selectively enable stream splitting on a particular host ensures that the cache does not fill up with recreational content.
For more information about these enhancements, see the SteelHead Management Console User’s Guide.
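The per-host control described in the enhancements above amounts to an allowlist check before splitting a stream. The host names below are hypothetical; the actual configuration is done through the SteelHead Management Console, not code.

```python
# Sketch of per-host stream splitting: only allowlisted hosts get their live
# streams split and cached, so recreational content does not fill the cache.
allowed_hosts = {"video.corp.example.com", "townhall.example.com"}

def should_split(request_host: str) -> bool:
    """Return True if stream splitting is enabled for this host."""
    return request_host.lower() in allowed_hosts

print(should_split("video.corp.example.com"))  # True
print(should_split("youtube.com"))             # False
```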
To enable HTTP stream splitting
1. Set up the video origin server.
2. On the SteelHead Management Console, choose Optimization > Protocols: HTTP.
Figure: Microsoft Silverlight Stream Splitting on the HTTP Page
3. Select Enable HTTP Stream Splitting.
Your video is now automatically optimized.
For CLI commands associated with this feature, see the Riverbed Command-Line Interface Reference Manual.
The following resources provide more information:
•  For information about Microsoft Silverlight smooth streaming, go to
•  For more information about Adobe dynamic HTTP streaming, go to
•  For more information about Apple HTTP live streaming, go to
•  For more information about video solutions, see the white paper Video Architectures with Riverbed.
Video On-Demand with HTTP Prepopulation
Company updates and new internal training videos are examples of video content that can cause bursts of traffic when they are first made available. Video content is generally accessible only from web servers and can only be accessed using HTTP. Prewarming RiOS data store during off hours helps to reduce WAN bandwidth consumption during peak hours. While HTTP prepopulation enables you to prepopulate any information over HTTP, this feature is geared toward video optimization.
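The prewarming idea above can be sketched as a scheduled script that fetches video URLs over HTTP during off hours so the content crosses the WAN before users request it. The URL list is an assumption for illustration; RiOS provides its own HTTP prepopulation mechanism (see HTTP Prepopulation), and this sketch only mirrors the concept.

```python
# Sketch: off-hours prewarming by streaming video URLs over HTTP.
import urllib.request

video_urls = [
    "http://intranet.example.com/videos/training-q1.mp4",   # hypothetical
    "http://intranet.example.com/videos/company-update.mp4",
]

def prewarm(urls):
    """Fetch each URL, streaming and discarding the bytes so the transfer
    itself crosses the WAN; unreachable URLs are skipped, not fatal."""
    for url in urls:
        try:
            with urllib.request.urlopen(url) as resp:
                while resp.read(64 * 1024):
                    pass
        except OSError as exc:
            print(f"skipped {url}: {exc}")
```

Run from cron or a task scheduler during off-peak hours so the transfers do not compete with business traffic.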
For more information about HTTP prepopulation, see HTTP Prepopulation.
On-Demand Video Caching
You enable the web proxy feature from the SCC, and it requires the SteelHead xx70 hardware platform running RiOS 9.1. You can configure rules to select the traffic that is cached on the client-side SteelHead. You must mark the content you want to cache for web proxy so that the video files are stored as web objects on the disk.
Using the web proxy feature achieves the same benefits as HTTP prepopulation, but web proxy uses a different area of disk inside the SteelHead, enabling larger objects to be cached. Any subsequent requests for the same content that is in the web proxy cache are served, in their entirety, from the cache. Serving the video locally removes the overhead of transferring the redundant request over the network.
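The phrase "cache eligible headers" above follows standard HTTP caching semantics. The check below is a simplified reading of the Cache-Control header as an illustration, not the SteelHead's actual eligibility logic.

```python
# Sketch: is a response cache-eligible, per common Cache-Control semantics?

def cache_eligible(headers: dict[str, str]) -> bool:
    cc = headers.get("Cache-Control", "").lower()
    directives = {d.strip() for d in cc.split(",")}
    if "no-store" in directives or "private" in directives:
        return False
    # Eligible when the origin gives an explicit freshness lifetime.
    return any(d.startswith("max-age=") for d in directives) \
        or "Expires" in headers

print(cache_eligible({"Cache-Control": "public, max-age=86400"}))  # True
print(cache_eligible({"Cache-Control": "no-store"}))               # False
```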
For more information about web proxy, see Overview of the Web Proxy Feature of the SteelCentral Controller for SteelHead Deployment Guide, the SteelCentral Controller for SteelHead User’s Guide, and the SteelHead Management Console User’s Guide.