An increased need to test and measure



In the good old days of analogue, broadcast test equipment was typically an elementary electronic device: a waveform monitor, vectorscope or audio level meter in post-production; a spectrum analyser or field-strength meter in broadcast transmission.

Traditionally, it was the task of broadcast engineers to ensure that our workflow signals were within spec and met all standards. In today’s post-production and broadcast environment, we work with multiple digital signals, and the ever-evolving complexity of production workflows – streaming video, UHD, 4K and high frame rates – makes test and measurement even more important in the broadcast workplace. As a result, the equipment has changed from hardware-centric devices to software-based systems, with more and more intelligence incorporated into the products. The engineer sitting at a bench in the workshop has been largely replaced by software tools such as automated quality control (QC) that meet the needs and workload of multi-platform delivery.

The original use of test and measurement equipment like the waveform monitor (WFM) and vectorscope was to line up videotape-based equipment and check a few basic parameters of the recorded video: black level, peak white, colour phase, noise, colour gamut and timing. Videotape technology could suffer from alignment issues, head clogs and other problems that impacted video quality. Although tape was very reliable, each record operation still needed to be checked at least at its start, middle and end. QC was easy then, and the general process was to spot-check a process or watch a screen – glancing momentarily at the waveform monitor and back to the CRT monitor, watching with an eagle eye for ill-timed videotape dropouts.

The move to file-based workflows eliminated a number of those parameters and, with them, the need to test for analogue-based faults. However, because of the complexities of digital workflows, the number of tests and measurements that must be performed to ensure the content delivered to the consumer is of suitable quality has increased. The sheer scale of measurement required has naturally led to software-based digital test equipment and automated quality control systems.

It was the EBU (European Broadcasting Union) that recognised the need for quality control (QC) in file-based broadcast workflows back in 2010, commenting that “broadcasters moving to file-based production facilities have to consider how to implement and use automated quality control (QC) systems. Manual quality control is simply not adequate anymore.”

In most countries or regions where content will be shown, there are regulatory requirements covering several aspects of produced content: audio loudness levels should comply with the CALM Act in the USA or EBU R128 in Europe, for example. Closed captions or subtitles must be present, sometimes in multiple languages and formats, while in the UK and Japan content must be tested to ensure the absence of flashing patterns that may trigger photosensitive epilepsy (PSE) in susceptible viewers.
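As a sketch of what a loudness compliance check amounts to, the hypothetical helper below compares already-measured values against the two specs mentioned above. It assumes integrated loudness and true peak have been measured upstream by an ITU-R BS.1770-style meter; the targets used here – EBU R128 at -23 LUFS with a -1 dBTP ceiling, ATSC A/85 (the CALM Act’s reference) at -24 LKFS with a -2 dBTP ceiling – are commonly cited figures, and the tolerances are illustrative rather than definitive:

```python
def check_loudness(integrated_lufs, true_peak_dbtp, spec="EBU R128"):
    """Compare measured loudness values against a delivery spec.

    Hypothetical helper: assumes the measurements already exist
    (e.g. from a BS.1770-compliant meter). Returns a list of
    human-readable problems; an empty list means the mix passes.
    """
    # (target level, tolerance in LU/dB, true-peak ceiling in dBTP)
    targets = {
        "EBU R128": (-23.0, 0.5, -1.0),
        "ATSC A/85": (-24.0, 2.0, -2.0),
    }
    target, tolerance, peak_ceiling = targets[spec]
    errors = []
    if abs(integrated_lufs - target) > tolerance:
        errors.append(f"integrated loudness {integrated_lufs} LUFS "
                      f"outside {target} +/- {tolerance}")
    if true_peak_dbtp > peak_ceiling:
        errors.append(f"true peak {true_peak_dbtp} dBTP "
                      f"exceeds {peak_ceiling} dBTP ceiling")
    return errors

# A mix measured at -22.9 LUFS / -1.5 dBTP passes R128...
print(check_loudness(-22.9, -1.5))   # → []
# ...while one at -20.0 LUFS / -0.5 dBTP fails on both counts.
print(check_loudness(-20.0, -0.5))
```

A real QC engine does the measurement itself, of course, but the pass/fail logic at the end of the chain is essentially this simple comparison.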

A human carrying out a QC test can verify audio quality and language, check for visual video artefacts and make the call whether to pass or fail the content. What humans can’t see, though, is the mass of ancillary data (in digital form) that makes the digital media file valid. Automated QC uses computers and software to check technical parameters that can’t readily (or at all) be examined by a human, and it can augment the work of expert QC viewers by flagging issues for those experts to examine. Automated QC systems are now the only practical way to validate that a file is correctly constructed according to the requirements of the target platform – resolution, format, bitrates and file syntax – a task beyond the ability of even the most diligent human reviewer.
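To make this concrete, here is a minimal sketch of that kind of structural validation. The `meta` dictionary mimics the JSON a tool such as `ffprobe -print_format json -show_format -show_streams` produces; `validate_container` and the `spec` requirements dict are invented for illustration, not a real platform schema:

```python
def validate_container(meta, spec):
    """Check ffprobe-style metadata against a per-platform delivery spec.

    Hypothetical sketch: `meta` mimics ffprobe's JSON output and
    `spec` is an invented requirements dict, not a real schema.
    """
    problems = []
    video = [s for s in meta["streams"] if s["codec_type"] == "video"]
    audio = [s for s in meta["streams"] if s["codec_type"] == "audio"]

    if len(video) != spec["video_streams"]:
        problems.append(f"expected {spec['video_streams']} video "
                        f"stream(s), found {len(video)}")
    if len(audio) < spec["min_audio_streams"]:
        problems.append(f"expected >= {spec['min_audio_streams']} audio "
                        f"stream(s), found {len(audio)}")
    for s in video:
        if s["codec_name"] not in spec["video_codecs"]:
            problems.append(f"video codec {s['codec_name']} not accepted")
        if (s["width"], s["height"]) != spec["resolution"]:
            problems.append(f"resolution {s['width']}x{s['height']} "
                            f"does not match spec")
    if int(meta["format"]["bit_rate"]) > spec["max_bit_rate"]:
        problems.append("overall bit rate exceeds platform maximum")
    return problems

# Example: a 1080p H.264 file against a hypothetical HD delivery spec.
meta = {
    "format": {"bit_rate": "25000000"},
    "streams": [
        {"codec_type": "video", "codec_name": "h264",
         "width": 1920, "height": 1080},
        {"codec_type": "audio", "codec_name": "pcm_s24le"},
        {"codec_type": "audio", "codec_name": "pcm_s24le"},
    ],
}
spec = {"video_streams": 1, "min_audio_streams": 2,
        "video_codecs": {"h264"}, "resolution": (1920, 1080),
        "max_bit_rate": 50_000_000}
print(validate_container(meta, spec))  # → []
```

Production QC tools run hundreds of such rules, down to byte-level file syntax, but each rule reduces to this shape: extract a parameter, compare it against the delivery spec, report the mismatch.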

The multi-platform, multi-screen media world of today offers content owners and content distributors a host of opportunities to develop substantial new revenue streams. With the large and varied amount of material being produced, automated QC systems need correspondingly broad functionality. Most systems have been developed with wide file format support covering everything used in broadcast and post-production, as well as support for streaming and network sources, while some even offer RAW file support.

A system will first run a container check to ‘recognise’ the file format and establish how many video and audio streams there are, the bitrate, the start timecode and the duration. After that, it checks the video codec, frame size and frame rate, as well as the frame aspect and pixel aspect ratios. The system then verifies that all video and audio levels conform to the required standards, and the more intelligent systems will automatically correct chroma, black and RGB gamut levels that fall outside the limits (some will even correct PSE flashing errors). Audio problems such as clipping are easily observable in the decoded stream, and QC systems can determine whether loudness limits, peak limits, instantaneous peaks and true-peak value limits have been exceeded, as well as measuring long-term loudness over the span of the content. Other checkable baseband audio flaws include silence caused by audio dropouts.
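The two baseband audio checks just mentioned – clipping and dropout silence – can be sketched as a single scan over decoded samples. This is an illustrative toy, not any vendor’s algorithm: it treats a run of consecutive full-scale samples as clipping and a long stretch below a dB threshold as silence, with thresholds chosen for demonstration only:

```python
def find_audio_faults(samples, rate, clip_run=4,
                      silence_db=-60.0, silence_secs=1.0):
    """Scan normalised PCM samples (floats in [-1, 1]) for two common
    baseband faults: clipping (clip_run consecutive full-scale samples)
    and silence (a stretch below silence_db lasting silence_secs, as a
    dropout leaves behind). Thresholds are illustrative, not standard.
    Returns (fault_type, start_sample_index) tuples."""
    silence_floor = 10 ** (silence_db / 20)  # dBFS -> linear amplitude
    faults, clip, quiet_start = [], 0, None
    for i, s in enumerate(samples):
        # Count consecutive samples pinned at (or nearly at) full scale.
        clip = clip + 1 if abs(s) >= 0.999 else 0
        if clip == clip_run:
            faults.append(("clipping", i - clip_run + 1))
        # Track runs of near-silence and report the long ones.
        if abs(s) < silence_floor:
            if quiet_start is None:
                quiet_start = i
        else:
            if quiet_start is not None and \
                    i - quiet_start >= silence_secs * rate:
                faults.append(("silence", quiet_start))
            quiet_start = None
    if quiet_start is not None and \
            len(samples) - quiet_start >= silence_secs * rate:
        faults.append(("silence", quiet_start))
    return faults

# Four full-scale samples, then a 1.2 s gap, at a toy 10 Hz sample rate:
print(find_audio_faults([0.5]*5 + [1.0]*4 + [0.0]*12 + [0.5], rate=10))
# → [('clipping', 5), ('silence', 9)]
```

Real QC engines add considerable sophistication on top (true-peak oversampling, per-channel analysis, loudness gating), but the underlying pattern of sliding over decoded samples and flagging threshold violations is the same.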

Automated QC systems are not foolproof, however, and human intervention is required from time to time. A correctly set up and administered system running automation will get over 90 per cent of file-based work done. The balance of the process will rely on human input, because there are creative interpretations in both audio and video that an automated system may fail to identify through misinterpretation. I have had content ‘failed’ because a close-up shot of a zebra was misinterpreted by a machine as an excessive moiré pattern. A 5.1 surround soundscape mix for a scene shot at the height of cicada breeding season was rejected because the audio track appeared to contain continuous DC electrical buzz. My submissions required human intervention and, once passed, I was assured that the system had now learned what a close-up of a zebra looked like and what a cicada sounded like!

The media industry has been revolutionised by the adoption of digital file-based workflows – and understanding the functions that make up those workflows, and what needs to be tested, is essential to implementing quality control effectively. Broadcasting success relies on quality and, most importantly, on consistency in the processes that produce quality. That is what broadcasting has been about since day one, and – thanks to some automation, artificial intelligence and a whole lot of human chutzpah – it always will be.
