Making movies

by Tim Anderson

Microsoft, Adobe and Intel each offer powerful video solutions. We find out what’s available, and talk to Intel about its latest set of tools.

HardCopy Issue: 65 | Published: February 27, 2015

Demand for video has never been greater, and not just for entertainment. Training, surveillance systems, communications, marketing, news: all these and more require high quality video, often combined with the need to support a diversity of clients, from low-end mobile devices to high-definition displays.

Today’s developers benefit from a rich array of tools for building video applications and preparing the content, with or without content protection. At a high level, it has never been easier to integrate video into your business. Microsoft’s Office 365, for example, includes a video portal (in preview) that lets you upload videos with drag and drop, assign them to channels, and set permissions for who else can view and upload.

Both Adobe Flash and Microsoft Silverlight remain useful as video clients, especially since they enable developers to bypass the complexities of browser compatibility. That said, many mobile clients cannot use plug-ins and the trend is towards either HTML5-based solutions, or dedicated mobile apps.

 

Azure Media Services

The video services Microsoft itself offers, such as the Office 365 video portal, are backed by Azure Media Services, and you can create your own custom applications based on the same back-end. Azure Media Services supports streaming in MPEG DASH, Apple HLS, Adobe HDS, and Microsoft’s own Smooth Streaming protocol.

Adaptive Streaming Standards

Adaptive Streaming is a technique for delivering media at a bit-rate that varies according to network conditions. Content is delivered in small segments, and the client requests each successive segment at the best quality the network will allow without stalling playback.
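
To make this concrete, here is a minimal sketch in C++ of the selection step an adaptive streaming client might perform. The rendition ladder, throughput estimate and safety factor are illustrative assumptions rather than part of any particular protocol or SDK.

```
#include <cstdio>
#include <vector>

// One available quality level (rendition) advertised in the stream's manifest.
struct Rendition {
    int bitrateKbps;   // encoded bit-rate of this quality level
};

// Pick the highest-quality rendition whose bit-rate fits within the measured
// network throughput, scaled by a safety margin so that a brief dip in
// bandwidth does not stall playback. Assumes renditions are sorted by bit-rate.
const Rendition& chooseNextSegment(const std::vector<Rendition>& renditions,
                                   double measuredThroughputKbps,
                                   double safetyFactor = 0.8)
{
    const double budget = measuredThroughputKbps * safetyFactor;
    const Rendition* best = &renditions.front();   // lowest quality as a fallback
    for (const Rendition& r : renditions) {
        if (r.bitrateKbps <= budget) {
            best = &r;                             // still within budget: take the better quality
        }
    }
    return *best;
}

int main() {
    std::vector<Rendition> ladder = { {400}, {1200}, {2500}, {5000} };
    const Rendition& next = chooseNextSegment(ladder, 3000.0);   // ~3 Mbit/s measured
    std::printf("next segment at %d kbps\n", next.bitrateKbps);  // prints 1200: 2500 exceeds the 2400 kbps budget
    return 0;
}
```

Real clients also weigh the state of the playback buffer and how quickly recent segments arrived, but the principle is the same across the protocols described below.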

MPEG DASH (Dynamic Adaptive Streaming over HTTP) is an adaptive streaming standard for delivering media over HTTP. It became an international standard in 2011, though browser support is still limited because the standard is relatively new. However, MPEG DASH has momentum and is likely to become the preferred standard for browser video streaming. Currently Internet Explorer 11 supports it with H.264, Google Chrome with H.264 or WebM, and Opera 20 or higher with WebM. Scripts are available which play MPEG-DASH through HTML5 where it is supported, falling back where necessary to Adobe Flash, which also supports the format. Microsoft’s PlayReady SDK supports MPEG DASH on iOS and Android.

Apple HLS (HTTP Live Streaming) is another adaptive streaming protocol which is used by Apple in its QuickTime media player and Safari browser. If you are targeting Apple devices, HLS has the best support.

Adobe HDS (HTTP Dynamic Streaming) is a protocol used by Adobe Primetime and supported by Adobe Flash Player at up to 1080p resolution. Adobe also now supports MPEG DASH alongside HDS.

Smooth Streaming is Microsoft’s adaptive streaming protocol, supporting 1080p video, with clients including the Silverlight plug-in, Windows 8 Store and Universal apps, and Windows Phone.

Developers have a variety of options for building clients for Azure Media Services or the related IIS Media Services (for on-premises deployment). There is a Smooth Streaming Client SDK for Windows Store apps and Windows Phone, Silverlight support for Windows and Mac, an Xbox Application Development Kit, a Smooth Streaming SDK for Apple iOS, and a plug-in for Adobe Flash. There is also a Porting Kit which includes generic C++ code that can be compiled for any platform, including embedded devices.

The Smooth Streaming development kits also work with Microsoft’s PlayReady SDK, which adds support for protected content. Using this, your content can be encrypted so that it can only be played once a licence or key has been issued to the client. The SDK supports iOS and Android as well as Windows Phone and Windows Store apps, Internet Explorer, Silverlight, Xbox One and Xbox 360. The PlayReady porting kit can be compiled for any platform, x86 or ARM, including embedded devices.

If you want to include advertising in video delivered by Microsoft’s platform, you do so via player APIs, though not all clients include this support. Those that do include Windows 8, Silverlight, Windows Phone 8 and iOS.

Azure Media Services also includes encoding services, with support for a variety of input formats including H.264, MPEG-1, MPEG-2, MPEG-4, VC-1, DV and WMV, as well as uncompressed formats. You can upload your content for encoding and then encrypt it if required for PlayReady, all under programmatic control.
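
As a rough illustration of what ‘under programmatic control’ looks like, the sketch below walks through the typical sequence: create an asset, upload the source file, submit an encoding job, apply PlayReady protection and publish a streaming URL. The helper functions are hypothetical stand-ins for calls to the Media Services REST or .NET APIs, not the actual API surface, and the preset name is a placeholder.

```
#include <cstdio>
#include <string>

// Hypothetical placeholders for Azure Media Services operations. The real
// service exposes these steps over REST and via a .NET SDK; the names and
// signatures here are illustrative only.
std::string createAsset(const std::string& name) {
    std::printf("create asset '%s'\n", name.c_str());
    return "source-asset-id";
}
void uploadFile(const std::string& assetId, const std::string& path) {
    std::printf("upload %s into %s\n", path.c_str(), assetId.c_str());
}
std::string submitEncodeJob(const std::string& assetId, const std::string& preset) {
    std::printf("encode %s with preset '%s'\n", assetId.c_str(), preset.c_str());
    return "encoded-asset-id";
}
void applyPlayReadyProtection(const std::string& assetId) {
    std::printf("apply PlayReady policy to %s\n", assetId.c_str());
}
std::string publishStreamingUrl(const std::string& assetId) {
    return "https://example-streaming-endpoint/" + assetId + "/manifest";
}

int main() {
    // 1. Create an asset and upload the source video into it.
    std::string source = createAsset("training-video");
    uploadFile(source, "training-video.mp4");

    // 2. Submit an encoding job, for example to an adaptive-bitrate H.264 preset.
    std::string encoded = submitEncodeJob(source, "adaptive-bitrate-h264");

    // 3. Optionally encrypt the encoded output for PlayReady before publishing.
    applyPlayReadyProtection(encoded);

    // 4. Publish and hand the URL to Smooth Streaming, HLS or DASH clients.
    std::printf("stream available at %s\n", publishStreamingUrl(encoded).c_str());
    return 0;
}
```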

 

Adobe video support

Adobe has a long history of supporting streaming video, both on the server and in clients such as its Flash player. The company also offers strong video creation and editing tools in its Creative Cloud suite, which includes Premiere Pro CC and After Effects CC.

Comcast uses Adobe Primetime to deliver TV experiences to its set top boxes.

Adobe Media Server (AMS), formerly Flash Media Server, is now at version 5.0 and runs on Windows or Linux. It supports Flash and AIR (Adobe Integrated Runtime) as well as Apple HLS for iOS clients. You can protect content with Adobe Access, and offer protected HTTP streaming for both Flash and Apple iOS clients. Android support is primarily via Adobe AIR, which lets you compile a Flash application for Android. For cloud delivery, Adobe offers pre-configured virtual machines on Amazon Web Services with which you can stream media stored in Amazon’s S3 storage service. You can also scale the service with Amazon CloudFront.

AMS supports RTMP (Real Time Messaging Protocol), a Flash-specific messaging protocol, as well as RTMFP (Real Time Media Flow Protocol), which supports peer-to-peer networking and communication.

Using RTMFP you can create communication applications including features such as real-time chat, as well as saving server bandwidth on video delivery.

Developers get a head start creating Flash clients thanks to Strobe Media Playback, an open source Flash media player.

AMS and Adobe’s Primetime overlap in functionality, but are targeted at different categories of user. AMS is a Flash Platform technology providing a streaming back-end, including content protection, that can be scripted and controlled from Flash applications, making it suitable for multi-user interactive video applications, chat, gaming platforms and so on.

By contrast, Primetime is aimed at the video business as a whole rather than the Flash Platform, and offers more than just video streaming. It streams using HLS and MPEG-DASH, but not Flash RTMP. It provides components not only for video but also for multi-language support, ad insertion and decisioning, quality-of-service analytics, DRM licensing, and connections through to Adobe Analytics. Primetime suits businesses that are focused on ad targeting or audience segmentation, and that want the same video capabilities across multiple devices without any Flash dependency. That said, you can use AMS as the media server for Primetime, so the two approaches are not mutually exclusive.

 

Intel Media Server Studio 2015

Intel’s Media Server Studio is a new suite of products for developers of enterprise media solutions, and for companies building their own transcoding systems or video management tools. It includes a wide range of tools for analysing and optimising video performance, which are of general value to developers. It is cross-platform, working with both Windows and Linux.

The heart of Media Server Studio is the software development kit for Intel’s graphics and CPU platform. In addition there are codecs for H.265 video, AAC audio and MPEG audio. VTune Amplifier, also in the suite, is a profiling tool which can monitor both CPU and GPU activity in order to identify bottlenecks. Other tools include the Visual Quality Caliper, which lets you inspect encoded or decoded video streams for low-level analysis; the Premium Telecine Interlace Reverser development library, for de-interlace processing; the Video Pro Analyzer, for analysing H.265 and VP9 streams; and the Intel Stress Bitstreams and Encoder, available for H.265 and VP9, for validating and debugging a video playback solution.

The OpenCL Code Builder is also included, letting you write general-purpose software that runs on the GPU or on co-processors such as Xeon Phi; it is suited to graphics and video processing workloads that benefit from parallel programming. Metrics Monitor, a tool in preview, monitors GPU load. This is a complicated product, so we spoke to Mark Buxton, Director of Media Development Products, who told us more.

“Media Server Studio started off as a fork from our original client-focused media efforts at Intel,” said Buxton. “We’ve had for many years Intel HD Graphics focused on delivering consumer grade experiences at low cost points, and QSV or Quick Sync Video [encoding and decoding built into CPUs]. We’re pushing how we integrated video IP blocks into all of our SSEs [Streaming SIMD extensions] at this point. But they’ve taken a bit longer to get into our server products and our high-end embedded products.

“If you are really going to build a professional-grade video encoder you need to focus a bit less on performance and quite a bit more on quality. We put a lot of effort recently into trying to improve the quality, scalability and the tools that have to surround those. This product works on Linux, and it’s stressed an enormous amount to ensure that it doesn’t have any memory leaks, any resource leaks, any defects or any behavioural problem that could compromise very high reliability servers.”

Intel’s Video Pro Analyzer, an analysis tool for H.265 and VP9.

Media Server Studio is available in two editions, Essentials and Professional. Buxton explains: “The Essentials edition is a way for us to distribute high performance graphics drivers, patching infrastructure, APIs and developer SDKs. It’s a low-cost, royalty-free component that lets people deploy runtime ingredients such as our hardware-accelerated AVC [Advanced Video Coding] encoders, or MPEG 2 decoders, or special de-interlace technologies.

“The Professional version adds a lot of soft codecs and hybrid codecs, things that leverage the GPU and the CPU together. A great example would be our HEVC [High Efficiency Video Coding] encoder and some of our audio components. Our customers in this space often take our encoder or decoder or some of our components and build them into their own transcode pipeline. An example of that might be Thomson Video Networks. Their ViBE XT1000 Xtream Transcoder is using ingredients from our products to develop its video transcode pipeline.”

One of Intel’s goals is to provide strong support for cloud-hosted transcoding. “The product has had weaknesses in its inability to scale to the number of cores that we’ve had on our Xeon E5 roadmap and its focus on graphics-enabled silicon. We’ve found that to address those segments we have to take the hardware-oriented solutions we’ve had in the past and provide good quality software versions that work and scale properly in virtualised environments on large numbers of cores, such as 24 cores per E5 server.”

Buxton explains that some customers use Media Server Studio for its tools rather than its SDK. “Application developers can incorporate our soft-code engine in their applications, but they tend to use our tools quite a bit more. The Video Pro Analyzer, and the Stress Bitstreams and Encoder, are validation and analysis tools that look very deep into the video quality associated with, for example, an HEVC bitstream.

Video standard wars

Developing video applications would be easy if there were a single format to deal with, but of course that is not the case. Technology never stands still, and there are legacy, current and cutting-edge formats all of which require support.

The big divide today is between H.264 and H.265 (also known as HEVC or High Efficiency Video Coding) on one side, and Google’s VP8 and VP9 (both supported by the WebM project) on the other, all of which are video compression standards. VP8 and VP9 are royalty-free, whereas H.264 and its successor H.265 command commercial licences. VP8 and VP9 are supported in Google’s Chrome web browser, though to date the H.264 and H.265 standards have benefited from better hardware support. Google’s standards are important though, not least because of the dominance of YouTube in web video, and chipset vendors are hastening to provide strong support in future.

A key factor in the adoption of either HEVC or VP9 is that their efficient compression reduces the bandwidth and storage requirements for 4K video. The term ‘4K’ implies a resolution of 4096 pixels per horizontal line, though in practice it includes UHDTV (Ultra High Definition TV) at 3840 by 2160 pixels as well as full DCI (Digital Cinema Initiatives) 4K, which is 4096 by 2160.

Despite the advantages of the newest formats, there is still high demand for older formats which are widely compatible and typically used at lower resolutions. MPEG-1 dates from 1992 and supports video and stereo audio. MPEG-2 adds support for interlaced video and multi-channel audio, and is the format used by DVD. MPEG-4 was introduced in 1998 and achieves higher compression ratios, as well as supporting Intellectual Property Management and Protection (IPMP). MPEG-4 AVC is the same codec as H.264 and is used by Blu-ray.

“Our Stress Bitstreams and Encoder product has been taken up broadly by the ecosystem and we’re one of the only suppliers of VP9 conformance bitstreams. VP9 is the format that Google announced to replace VP8. It is an HEVC competitor and provides similar quality in a less complicated codec. Because of the uptake by YouTube, and the expected future uptake within WebRTC, we’re seeing a fair degree of interest in VP9, though it does not compare with the massive shift that’s happening in HEVC.”

Buxton says that the Stress Bitstreams and Encoder has some unique benefits. “Most people look at the coverage of the syntax elements in a bitstream, they try to make sure that they’re covering every element. But you really have to go beyond that and make sure that you are looking at the intermediate values of the bits to make sure that all the different stages in the processing pipeline are being fully exercised by the stress content. We have a carefully instrumented decoder and we package that with our Stress Bitstreams product so people can closely examine the coverage that they’re getting. We also package with it a kind of encoder so that customers can build their own performance and stress content. The two things together let people very quickly figure out there’s a defect somewhere in their decode process.”

Why does Intel include the OpenCL Code Builder in the suite? “It’s the next level of detail down,” says Buxton. “What we find is that most of our top-tier customers need to do a considerable amount of analysis on video. Even basic video transcode applications can optimise their quality by doing special casing, let’s say doing face detection for people and putting extra resolution where the faces are, stealing bits from less interesting content. OpenCL is the perfect complement. It’s the language that we use to program our GPUs. Code Builder has optimisation techniques that help a developer do something that’s quite hard today, which is to make some of the decisions that OpenCL exposes to the developer more intelligently. A good example would be figuring out how to properly thread or break down a frame into composable units, so that your code runs optimally. Code Builder provides the ability to experiment with a number of different combinations that will help you tune to get the most out of your code on a particular piece of hardware.”
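
To give a flavour of the kind of GPU work Buxton describes, here is a small, generic OpenCL example in C++ that splits a frame’s luma plane into one work-item per pixel and applies a simple brightness adjustment on the GPU. It is a hand-written sketch, not output from the Code Builder, and the frame size and adjustment are purely illustrative.

```
// Generic OpenCL 1.x host code; build with, for example: g++ brighten.cpp -lOpenCL
#include <CL/cl.h>
#include <cstdio>
#include <vector>

// Kernel source: one work-item per pixel of the luma plane.
static const char* kSource = R"CLC(
__kernel void brighten(__global uchar* luma, int count, int delta) {
    int i = get_global_id(0);
    if (i < count) {
        int v = luma[i] + delta;
        luma[i] = (uchar)(v > 255 ? 255 : v);   // clamp to the 8-bit range
    }
}
)CLC";

int main() {
    const int width = 1920, height = 1080;                 // illustrative frame size
    const int count = width * height;
    const int delta = 32;                                  // brightness offset
    std::vector<unsigned char> frame(count, 64);           // dummy luma plane

    // Pick the first platform and device; on Intel hardware this can be the GPU.
    cl_platform_id platform; cl_device_id device; cl_int err;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, nullptr);

    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

    // Build the kernel and copy the frame into device memory.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, nullptr, &err);
    clBuildProgram(prog, 1, &device, "", nullptr, nullptr);
    cl_kernel kernel = clCreateKernel(prog, "brighten", &err);
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                frame.size(), frame.data(), &err);

    clSetKernelArg(kernel, 0, sizeof(buf), &buf);
    clSetKernelArg(kernel, 1, sizeof(count), &count);
    clSetKernelArg(kernel, 2, sizeof(delta), &delta);

    // One work-item per pixel; the runtime chooses how to group them.
    size_t globalSize = static_cast<size_t>(count);
    clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &globalSize, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, frame.size(), frame.data(), 0, nullptr, nullptr);

    std::printf("first pixel after brighten: %d\n", frame[0]);  // 64 + 32 = 96

    clReleaseMemObject(buf); clReleaseKernel(kernel); clReleaseProgram(prog);
    clReleaseCommandQueue(queue); clReleaseContext(ctx);
    return 0;
}
```

The tuning decisions Buxton mentions, such as how to break a frame into work-groups for a particular GPU, are exactly the parameters left open here: the local work size is passed as null and chosen by the runtime.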

Do you have to use Intel’s C++ Compiler with Media Server Studio? “We use it to build all of our runtime components, because it provides in my experience about a 20 to 25 per cent boost on top of everything else out there,” claims Buxton. “But we don’t have a strict runtime dependency on the compiler so if you’re a developer we provide compatible C APIs so you can get at our components. You can use GCC, you can use the Intel C Compiler, you can use pretty much anything. The only runtime component that we offer in the suite today is OpenCL because it’s the optimal way to get access to Intel’s GPUs.”

Find Out More

More details on the Grey Matter website at www.greymatter.com. If you’d like to discuss your media solutions further, call Grey Matter on 01364 654100 or email maildesk@greymatter.com.
