
Ian Dormer

Ian Dormer was born in Zimbabwe and has been in the TV business since the 1980s, having served in various positions at the SABC, M-Net and SuperSport. Ian currently works and resides in New Zealand.

Media Asset Storage: it’s in our DNA


When it comes to storing media assets (or any data, for that matter), whether on-premises or in the cloud, the immediate future of data storage remains magnetic tape. Recent technological advancements have given new life to hard drives, but for long-term archiving of assets, the tape or hard drive of the future could be something very old, something that everyone has inside them: DNA.

The first commercial digital-tape storage system, IBM’s Model 726, could store about 1.1 megabytes on one reel of tape. Today, a modern LTO-8 cartridge can hold up to 30 terabytes (compressed; around 12 terabytes native). Meanwhile, a single robotic tape library can contain up to 556 petabytes of data. While tape doesn’t offer the fast read/write speeds of hard drives, the medium’s advantages are many.
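That growth is easy to quantify. A quick back-of-envelope calculation, using the capacities quoted above, shows roughly how far tape has come:

```python
# Back-of-envelope growth from IBM's Model 726 (~1.1 MB per reel)
# to a modern LTO cartridge (~30 TB, compressed capacity).
MODEL_726_BYTES = 1.1e6   # ~1.1 megabytes
LTO_BYTES = 30e12         # ~30 terabytes

growth_factor = LTO_BYTES / MODEL_726_BYTES
print(f"Capacity growth: ~{growth_factor:,.0f}x")  # roughly 27 million-fold
```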

For starters, tape is reliable, with error rates four to five orders of magnitude lower than those of hard drives. Tape is also energy efficient: once all the data has been written, the cartridge simply sits in a slot in a tape library without consuming any power until it’s needed again. And tape is very secure, with built-in, on-the-fly encryption; if a cartridge isn’t mounted in a drive, its data cannot be accessed or modified. The main reason tape is so popular, though, is simple economics. Tape storage costs around one-sixth of what you’d pay to keep the same amount of data on disk, which is why you find tape systems almost anywhere massive amounts of data are being stored.
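The economics can be sketched in a few lines. The dollar figure below is a placeholder assumption purely for illustration; only the one-sixth ratio comes from the text:

```python
# Sketch: relative cost of archiving on tape vs disk, using the article's
# one-sixth ratio. The absolute $/TB figure is a hypothetical placeholder.
DISK_COST_PER_TB = 20.0                  # assumed $/TB/year, for illustration
TAPE_COST_PER_TB = DISK_COST_PER_TB / 6  # tape costs one-sixth of disk

def archive_cost(petabytes: float, cost_per_tb: float) -> float:
    """Annual cost in dollars for a given archive size."""
    return petabytes * 1000 * cost_per_tb

print(archive_cost(10, DISK_COST_PER_TB))  # disk: 200000.0
print(archive_cost(10, TAPE_COST_PER_TB))  # tape: ~33333.3
```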

But, as mentioned, tape is slow, and so the development of hard drive technology continues. The longevity of hard disks, and the rapid rise of solid-state drives (SSDs), can be attributed to a continual improvement process to minimise the drawbacks of tape technology. The hard disk game changed dramatically in 2005 with perpendicular magnetic recording (PMR), where, broadly speaking, magnetised bits stand perpendicular to the head of the hard disk platter instead of lying down, making room for more bits. However, after years of data density improvements using PMR (densities doubled between 2009 and 2015), researchers are once again hitting the physical limits: each magnetic ‘bit’ is becoming too small to reliably hold its data, increasing the potential for corruption.

Shingled magnetic recording (SMR), introduced by Seagate in 2014, is one way to fit more data on a disk’s platter. In an SMR disk, when the write head writes a data track, the new track will overlap part of the previously written track, reducing its width and meaning more tracks can fit on a platter. The thinner track can still be read, as read heads can be physically thinner than write heads. Western Digital launched a 15TB SMR hard drive in 2018 targeting data centres, with plans to increase the capacity per rack by up to 60TB soon.

The next big thing is two-dimensional magnetic recording (TDMR). This is another Seagate technology, and aims to solve the problem of reading data from tightly packed hard disk tracks, where the read head picks up interference from tracks around the one being read. TDMR disks use multiple read heads to pick up data from several tracks at a time, then work out which data is needed, turning the noise into useful data that can be analysed and then discarded when not required. 14 and 16TB TDMR drives came onto the market in 2019.

The multiple read heads of TDMR disks can improve read speeds, but to improve write speeds while increasing data density you need to move away from SMR to the latest hard disk technology: heat-assisted magnetic recording (HAMR). This aims to overcome the compromise of SMR by changing the material of the hard disk platter, to one where each bit will maintain its magnetic data integrity at a smaller size. As HAMR’s name implies, the solution is to use a laser to heat up part of the hard disk platter before the data is written. This lowers the material’s coercivity enough for the data to be written, before the heated section cools and the coercivity rises again to make the data secure. HAMR has the potential to increase hard disk density tenfold.

Therefore, both hard drive and magnetic tape technologies work for the storage and retrieval of data assets – but the trouble is that technology is battling to keep up with the continual flood of data currently being generated, and forecast to be generated in the future. What’s the solution? The hard drive of the future could actually be something very old, something that is inside every person reading this: DNA.

Deoxyribonucleic acid, or DNA, is the molecule that dictates how an organism develops. DNA can also hold a staggering amount of information: 215 petabytes (1 petabyte is about 1 million gigabytes) of data on a single gram. Just as impressive is its longevity. Traditional media like magnetic tape and flash memory tend to degrade, whether through repeated use or simply time. DNA degrades, too, but at a significantly slower rate: depending on the storage conditions, it can last thousands, or even tens of thousands, of years.
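The usual illustration of DNA data storage is a simple two-bits-per-nucleotide mapping, where each base (A, C, G, T) encodes two bits. Real schemes layer error correction on top and avoid long runs of the same base, but a minimal sketch looks like this:

```python
# Minimal two-bits-per-base encoding often used to illustrate DNA storage.
# (Real systems add error correction and avoid homopolymer runs.)
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS_FOR_BASE = {v: k for k, v in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA strand: 4 bases per byte."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE_FOR_BITS[bits[i:i+2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand."""
    bits = "".join(BITS_FOR_BASE[base] for base in strand)
    return bytes(int(bits[i:i+8], 2) for i in range(0, len(bits), 8))

strand = encode(b"DNA")
assert decode(strand) == b"DNA"
print(strand)  # 12 bases: 4 per byte
```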

The idea of storing data on DNA was proposed back in the 1960s by Soviet scientist Mikhail Neiman. In the decades since, researchers have made great strides in making it achievable – though at a price. Currently, the most cost-effective DNA storage technique costs about US$3,500 per MB to write the data and US$1,000 per MB to read it, so don’t retire your LTO or hard drive array just yet.
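At those rates, the arithmetic for even a modest media asset is sobering:

```python
# Cost of round-tripping a 1 GB asset through DNA storage at the
# quoted rates (~US$3,500/MB to write, ~US$1,000/MB to read).
WRITE_COST_PER_MB = 3500
READ_COST_PER_MB = 1000

asset_mb = 1024  # a 1 GB media asset
write_cost = asset_mb * WRITE_COST_PER_MB
read_cost = asset_mb * READ_COST_PER_MB
print(f"Write: ${write_cost:,}  Read: ${read_cost:,}")
# Write: $3,584,000  Read: $1,024,000
```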

DNA’s storage capabilities, however, are intriguing and have huge potential for computing in the future. For years, technology roughly followed the path laid out by Moore’s Law, which stated that every two years or so, we could double the number of transistors that fit on a microchip. However, computer chips have become so small these days that it’s increasingly unlikely we can continue to squeeze more transistors in there. Essentially, Moore’s Law is dead, but DNA-based computing for the future is very much alive and well.
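Moore’s Law, as stated above, is just exponential doubling, which a one-line function captures:

```python
# Moore's Law as described in the text: transistor count doubles
# roughly every two years.
def transistors(start_count: int, years: float, doubling_years: float = 2.0) -> float:
    return start_count * 2 ** (years / doubling_years)

# A chip with 1 billion transistors, 10 years (five doublings) on:
print(transistors(1_000_000_000, 10))  # → 32000000000.0
```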

Graphic Content


In a changing broadcast environment, where streaming content directly to the viewer is becoming the new norm, the need for quality yet cost-effective on-air graphics for live streams has given rise to a new, out-of-the-box solution that matches TV-level production at a fraction of the price. Better still, the workflow is unlike anything seen before: a self-service platform that delivers TV-quality results with no expensive hardware, software or specialist staff, and that anyone, yes anyone, can use to manage every on-air graphic aspect of their broadcasts and live streams.

There are two parts to a sports game on TV: the actual play, and the stats and graphics that enrich the action. A big part of creating professional and engaging sports broadcasts is live on-air graphics – they tell a story and show relevant information, from game scores to player and team stats. Graphics also display in-game advertisements from sponsors who support clubs, broadcasters and leagues, all of which seek to target highly-defined sports audiences.

For smaller teams, clubs, and schools, producing sophisticated graphics — live and in-game — is prohibitively expensive. Hardware is costly; software, difficult to master. Skilled staff are required to operate the consoles, and designers are needed to create the graphics. Matching TV-quality production traditionally costs tens of thousands of dollars, and that’s per game.

As live streaming has become more popular and affordable, the need to produce quality on-air graphics on a professional level has become a necessity, and all too often smaller sporting organisations and production companies are hard-pressed to compete with the major broadcasters, as traditional graphic and titling solutions are unrealistically expensive.

However, a few years back a group of Australia-based innovators formed a company called Live Graphic Systems. They put their heads together and came up with a new breed of automated graphics and broadcast management software that offered TV-quality sports graphics but with no need for hardware, designers, skilled graphics operators or developers. The system is called LIGR.

Since its inception, LIGR has produced graphics for over 10,000 sports games across the globe, with over 1,000 sponsor ad-sets uploaded to its platform. To date, it’s estimated that LIGR has saved over 100,000 hours of graphics operation, design and development – equating to more than $20 million saved. LIGR provides out-of-the-box, ESPN-quality graphics templates that let you produce a professional live stream in a few simple steps. Just log in, select your sport and choose your graphics theme. Then upload your team, player and sponsor assets one by one (or upload your fixtures and squads in one hit), and watch as professional automated graphics animate over your live streams.

The system includes a sponsorship management tool that weaves brand and sponsor assets into the live stream graphics, and a sponsor reporting dashboard that provides metrics and insights on when and how each sponsor’s information was displayed during the broadcast. LIGR provides a complete asset management system (AMS), where users upload their assets, which are mapped to their teams and then assigned to the schedule of games. Each league has its own assets, such as logos, watermarks, naming conventions and colours, and the user sets all of this up pre-season or pre-game. When games are scheduled in the LIGR scheduler, all assets map automatically to each competition and to each home and away team.
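The cascade described above, with league-level assets resolving automatically onto each scheduled fixture, can be sketched as a simple data structure. None of the names below come from LIGR’s actual API; this is purely illustrative:

```python
# Hypothetical sketch of league/team asset mapping resolved per fixture.
league_assets = {
    "Premier League": {"watermark": "pl.png", "colour": "#3D195B"},
}
team_assets = {
    "Home FC": {"logo": "home_fc.png"},
    "Away United": {"logo": "away_united.png"},
}

def build_fixture(league: str, home: str, away: str) -> dict:
    """Resolve the full graphics package for one scheduled game."""
    return {
        "league": league_assets[league],
        "home": team_assets[home],
        "away": team_assets[away],
    }

fixture = build_fixture("Premier League", "Home FC", "Away United")
print(fixture["home"]["logo"])  # home_fc.png
```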

Being an HTML5 solution, LIGR works with all production workflows. It integrates with any software vision mixer that supports web browser inputs, such as OBS and vMix, as well as cloud-based vision mixing and distribution platforms and automated camera systems (such as Pixellot) that support web browser graphics overlays.

In April this year, LIGR is set to release the LIGR Deck system, which automatically creates an entire game’s live graphics simply by live scoring the game. Users can use the LIGR Live Scoring App on a smartphone, or connect to the official scoring system or an external live score data source such as Stats Perform, Genius Sports or Sportradar – either way, manual graphics operation is removed entirely.

Users can upload and manage the assets they need in an easy-to-use dashboard, and the graphics animate on- and off-screen automatically, based on the live match data. Last year, LIGR was set a challenge by Cricket Australia to produce professional, data-integrated graphics for over 120 live-streamed games across the season. The system had to integrate with Cricket Australia’s OptaPro system (which provides analytical data) in real time while connecting brands to specific triggered events in-game. It had to run graphics in a combination of auto and manual modes, with graphics overlaid off-site in a cloud-based master control room (MCR) and direction fed by an on-site producer to the off-site graphics operator. LIGR built a unique workflow combining traditional, automated and remote workflows for the local JLT Cup and the Women’s Big Bash League, as well as for international non-televised live streams.

Interest in live graphic systems technology is gaining momentum, and not only in the streaming environment – it is also attracting attention from mainstream broadcasters. LIGR really is the easiest way to deliver a consistent and professional level of live streaming quality graphic content, for all games, across all age groups, to all fans – only without the hard work.

Using lighting to stream data


When is technology development ever going to slow down? Just when you thought that we had high-speed wireless communication all sorted, somebody comes up with an idea that is potentially better – a technology that allows LED lighting to become internet and broadcast data transmitters, creating a new form of high-speed, optical wireless communication that leverages the visible and infrared (IR) light spectrum. This new form of transmission could be of use within the broadcast and media environment.

Back in 2011, in a world where the space for radio frequencies was already becoming over-saturated, University of Edinburgh professor of mobile communications Harald Haas demonstrated something amazing: Light Fidelity (Li-Fi). This is a means of sending and receiving data by modulating light at rates imperceptible to the human eye, and LED lighting emerged as the perfect medium around which to develop the technology.

Because LEDs are semiconductors, they can be switched on and off up to a million times per second, enabling the diodes to send data quickly. In Professor Haas’s Li-Fi installation, a digital signal processor integrated into (or attached to) an LED driver takes data from a network, server or the internet and converts it into a digital signal: essentially a sequence of discrete voltage levels. The LED driver in each fixture converts the digital signal into a photonic signal, transmitting it at a very high frequency as an Orthogonal Frequency Division Multiplexing (OFDM) signal.
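Full OFDM is beyond a short sketch, but the core idea of encoding bits as rapid changes in light level can be illustrated with simple on-off keying, a deliberate simplification of what Li-Fi actually uses:

```python
# On-off keying sketch: each bit maps to an LED intensity level.
# (Li-Fi itself uses OFDM; this only illustrates the bits-to-light idea.)
def modulate(data: bytes) -> list[int]:
    """Map each bit to an LED level: 1 = on, 0 = off."""
    return [int(bit) for byte in data for bit in f"{byte:08b}"]

def demodulate(levels: list[int]) -> bytes:
    """Reassemble received light levels back into bytes."""
    bits = "".join(str(level) for level in levels)
    return bytes(int(bits[i:i+8], 2) for i in range(0, len(bits), 8))

signal = modulate(b"Hi")
assert demodulate(signal) == b"Hi"
print(len(signal))  # 16 light pulses for two bytes
```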

OFDM signals are also employed in 4G LTE, 5G and most Wi-Fi technologies because they use many small-bandwidth channels collectively rather than a single large-bandwidth channel. The decoder on the receiving device – say, a computer or smartphone – then translates the OFDM signals into data for the user. The radio spectrum allotted for wireless communications spans only 300 gigahertz, while the visible light spectrum spans 300 terahertz, from red light at around 400 terahertz to violet at around 700 terahertz. Light therefore offers 1,000 times more spectrum for wireless communications than radio. Li-Fi has the potential to achieve much higher data rates than Wi-Fi currently manages, with speeds of tens of gigabits per second having been achieved under controlled laboratory conditions.
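The spectrum comparison above is simple arithmetic:

```python
# Ratio of available visible-light spectrum to allotted radio spectrum,
# using the figures quoted in the text.
RADIO_SPECTRUM_HZ = 300e9     # ~300 GHz allotted to radio communications
VISIBLE_SPECTRUM_HZ = 300e12  # ~300 THz of visible light (400-700 THz band)

print(VISIBLE_SPECTRUM_HZ / RADIO_SPECTRUM_HZ)  # 1000.0
```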

Unlike Wi-Fi, Li-Fi transmits data to devices via direct and incident light without risk of disrupting sensitive electronics, such as those found in hospitals or on airplanes. And because Li-Fi is directional, requiring a line of sight between the transmitter and the receiving device, it is far harder to intercept. Li-Fi’s directionality also reduces the risk of interference with other devices vying for a connection: the co-channel interference and electromagnetic noise that affect radio waves are absent, making Li-Fi a very stable and reliable technology.

Apart from the obvious home-use advantages, researchers are looking at developing Li-Fi as a means of providing network communications in television production studios and for OB van setups in stadiums. Developers have already tested the technology as a replacement for traditional Wi-Fi in aircraft with great success. It is being tested in vehicles that communicate with one another via front and rear lights to improve road safety, and is being integrated into streetlights and traffic signals to deliver information about current road conditions to users’ smartphones.

Li-Fi can be used anywhere that can be outfitted with wireless communications and electric lighting: commercial buildings, retail environments, smart cities and in-vehicle data transmission. As the Internet of Things (IoT) and smart applications gain traction, Li-Fi offers the omnipresent wireless connections these devices and applications need. Along with bi-directional internet communications, Li-Fi holds promise in cataloguing and entertainment applications. Toric and Luciom, a French VLC company acquired by Philips Lighting, refer to these three applications as the Li-Fi Internet, Li-Fi Tag and Li-Fi Broadcast categories. Li-Fi Tag uses a router to broadcast the same tag repeatedly, for example to confirm the specific row and shelf of a product in a store or distribution centre; the data flows one way, from the emitter to a receiver such as an inventory-tracking device. Li-Fi Broadcast uses a router to transmit data, videos, music and shopping coupons one-way to consumer devices such as smartphones.

Li-Fi is already being developed as a complement to the new 5G network. Though the technology is prohibitively expensive in its early stages, that isn’t stopping developers from seeing its advantages – and the more money that’s put into development, the cheaper the end cost will become. An Australian start-up is developing and integrating Li-Fi into its lighting kits for film and television production: the individual lighting components all communicate with each other via the light network they produce, so the rigger adjusts settings on a smartphone app and each light self-adjusts accordingly.

Along with all these benefits, a Li-Fi connection has some disadvantages. Since it uses visible light to transmit data, Li-Fi is rather useless where there is no light – that means no internet while lying in bed at night with the lights off. Similarly, a Wi-Fi router installed in one room of your house lets you connect devices anywhere in the house, but this is not the case with Li-Fi: since visible light cannot pass through walls, you have to be in the immediate vicinity of the light source to access the internet on your device, which may not sound particularly convenient to many people.

Whilst the technology is not here to replace Wi-Fi, a recent article posted on Tweet Tech stated that Li-Fi is on track to become a $113bn industry by 2022. If you really want to see the advantages of Li-Fi, visit Dubai: last year the city became the first smart Li-Fi city in the world, rolling out internet access, live-streamed television channels and the city’s infrastructure programme over its massive street lighting grid.

Li-Fi is impressive technology that, from a tech professional’s perspective, should be embraced rather than feared. As Plato is often quoted as saying: “We can easily forgive a child who is afraid of the dark; the real tragedy of life is when men are afraid of the light.”

Studio tech rising up


The broadcast and media technology industry sectors are enormous, covering everything from cameras and studios through to network technologies, post-production services and live streaming video. Last year we saw incredible advances, with the wider availability of machine learning and Augmented Reality (AR). These developments promise a new wave of automated production processes and the rise of virtual studio technology.

Augmented Reality and virtual studio technology changed the face of broadcast television in 2019, through the creation of virtual objects within a live studio environment. We saw photorealistic representations of football players during a match build-up, or a journalist standing in a digital representation of parliament as voting occurred during an election. This year, thanks also to the rapid development of 5G technology, will see AR used more in studio setups, especially in sports broadcasting with the 2020 Olympics on the horizon. Virtual set solutions, powered by games engines, proved a big draw for the live broadcast as well as the scripted broadcast markets. The fusion of games-engine renders with live broadcast has taken virtual set solutions to another level, with photorealistic 3D graphic objects appearing indistinguishable from reality. The technology is not only being used in existing studio set-ups, but is spurring new growth in the studio facility market worldwide.

One example is dock10, the United Kingdom’s leading television facility, which houses ten television studios offering a wide range of production services and recently chalked up a new milestone with its latest virtual studio capability. dock10 called in Zero Density, provider of the cutting-edge Reality Engine, to power the brand-new 4K UHD virtual studio housed in MediaCityUK and used by shows such as BBC Sport’s landmark programme Match of the Day (MOTD). The soccer show draws around seven million viewers each weekend, and BBC Sport has been on air with the virtual studio since August last year. MOTD2, Football Focus and Final Score will also broadcast from the same virtual studio. The studio has five Reality Engines deployed on five cameras (three pedestals, one crane and one railcam).

Founded in Istanbul in 2014, Zero Density is an international technology company dedicated to developing creative products for the broadcast, augmented reality, live event and eSports industries. The founders, who shared a common background in broadcast and media, came together on a mission to use their creativity to develop innovative virtual production solutions for live broadcast. The company’s headquarters remain in Turkey, but it now has an extensive network of clients all over the world. Zero Density offers the next level of virtual studio production with real-time visual effects: its Unreal Engine-native platform, Reality Engine, combines advanced real-time compositing tools with its proprietary keying technology, Reality Keyer. The company bills Reality as the most photorealistic real-time 3D virtual studio and augmented reality platform in the industry.

Another adopter of Zero Density technology is Sky News Arabia (SNA), which has employed the latest virtual and augmented reality technologies as part of a massive editorial and technical revamp aimed at staying at the forefront of news reporting.

Sky News Arabia (SNA), a joint venture between Abu Dhabi Media Investment Corporation (ADMIC) and Sky Limited, recently completed an editorial overhaul with a fresh line-up of presenters and a special selection of digital-only programming. The revamp saw the launch of one of the most advanced newsrooms in the region, with the addition of a brand-new wing that is now home to Studio B, a 400sqm studio boasting the UAE’s largest chromakey background at 15x11m. The large chroma area gives the broadcaster ample room to use a lighting grid for clean keying wherever presenters stand on the VR set. The facility also includes an outdoor landscape set designed for the morning show.

In the US market, Zero Density partnered with The Weather Channel Television Network late last year as a virtual studio technology provider for immersive mixed reality content. The Weather Channel uses the technology to create awareness among its audiences with visually striking disaster scenarios, with Zero Density’s hyper-realistic graphic renderings and Hollywood-level visual effects communicating the channel’s vital message more effectively.

Virtual studio technology doesn’t necessarily mean that studio floor space is no longer required. Media giant Sky announced in December 2019 that it is to build huge new film and television studios near the existing Elstree production site just outside London. As the battle between Netflix, Amazon and other streaming services intensifies, the need for more VR and AR studio space has also grown. Sky Studios Elstree will become the European production base for Sky and NBCUniversal, which owns Universal Studios, as both are owned by the US pay-TV giant Comcast. Sky says the studio complex, which it hopes will also become a base for other companies’ projects, will result in £3bn being spent on TV and film productions in its first five years. The company has made the move to develop its own production base as the UK increasingly faces a squeeze on studio space, with TV companies, film studios and the streaming giants pouring billions into making new content. Sky Studios Elstree will have 14 stages, the smallest of which will be 1,800 square metres, with enough space to shoot several films and TV shows at the same time.

Television broadcast of the Olympic Games has, over the decades, become a showcase for new technologies. Virtual reality (VR) and 5G innovations will add value to the Olympics output and are examples of how the event has become not only a fortnight of elite sport, but a test-bed for new technology ideas.

To put this into practice, the IOC is leaning on its in-house Olympic Broadcasting Services (OBS) media arm to incorporate innovation into the Games, and there’s no doubt that virtual studio technology will be in the mix.

LIVE Submersible Broadcasting


At last year’s IBC (September 2019), Associated Press (AP) was awarded the prestigious IBC Innovation Award for its undersea reporting during the First Descent mission in the Seychelles. The technical award recognised the innovative storytelling and creativity made possible by the partnership between AP, Nekton, Sonardyne, LiveU and Inmarsat. What made the award extra special was that this was the first full HD quality multi-camera live signal from the depths of the ocean using optical video transmission, in which pictures are carried through the water on visible light.

The mission’s broadcasts were part of the Nekton Deep Ocean Research Institute’s First Descent expedition, which is exploring some of the least explored areas of the Indian Ocean as part of a project to increase understanding, and aid protection, of the marine life it contains. Very little research has been undertaken below 30 metres (scuba depth) across the Seychelles’ vast ocean territory of 1.37 million square kilometres. The objective is to help establish a baseline of marine life and the state of the ocean in the Seychelles. Research focuses on the waters from the surface down into the Bathyal Zone (200m to 3,000m), home to some of the greatest concentrations of biodiversity and where human activity has a marked impact on these vital ecosystems. Supported by 13 scientists based on the mother ship, the Ocean Zephyr, Nekton’s goal is to undertake at least 50 “first descents” into these waters to collect and generate data whilst live streaming the missions to interested parties.

Previous real-time video transmissions from the world’s deep oceans came from remotely operated unmanned subsea vehicles, with video and audio fed via a fixed or tethered video or fibre-optic cable, or sent over a low-quality digital live stream that likewise required underwater cables and carried a significant time delay. Tethered cables restricted the vehicle’s freedom of movement and often ended in sliced or broken cables. When planning these missions, the Nekton Deep Ocean Research Institute and Associated Press contracted, amongst others, pioneering subsea communications technology company Sonardyne to provide a means of transmitting live video from the depths to the research vessel, Ocean Zephyr, without restricting the free-roaming submersibles.

Sonardyne, which specialises in acoustic and non-acoustic technologies for marine environments, had the ideal solution in a recently developed product: the BlueComm free-space optical modem. With a depth rating of 4,000m and a data rate of up to 10Mbps, BlueComm modems use an array of high-power light-emitting diodes (LEDs) that are rapidly modulated to transmit data, with a separate photomultiplier tube as the receiving element. BlueComm operates using visible light, which can travel significant distances through water, making it an excellent tool for wireless video transfer. The general rule of thumb is that whatever colour the water appears is the colour that is least absorbed; blue is the least absorbed wavelength in clean water, making it the obvious choice for BlueComm’s lights. Because it uses visible light, the system is most effective in low ambient light, such as deep water or shallow-water night-time operations, and since Nekton was focusing on the Bathyal Zone, the technology was a perfect fit. For shallower and turbid waters, Sonardyne has developed an ultraviolet (UV) version, which improves transmission where there is a lot of ambient light but reduces the maximum range to about 80 metres. Optical data transmission is highly energy efficient, enabling more than nine gigabytes of data to be transferred on a single lithium D-sized battery cell.
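Taking the quoted figures at face value (a 10Mbps link and roughly nine gigabytes per battery cell), one battery’s worth of data works out to about two hours of continuous transmission:

```python
# Rough numbers from the text: a 10 Mbps optical link and ~9 GB
# transferable per lithium D-sized battery cell.
LINK_BITS_PER_SEC = 10e6
BYTES_PER_CELL = 9e9

transfer_seconds = BYTES_PER_CELL * 8 / LINK_BITS_PER_SEC
print(f"{transfer_seconds / 3600:.1f} hours per battery")  # 2.0 hours per battery
```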

Onboard the Ocean Zephyr, the decoded video was uplinked using Inmarsat’s high bandwidth SAILOR 100 GX compact one-metre Ka-band terminal, with the back-up of FleetBroadband, allowing AP to send live footage from First Descent’s mother ship to their production hub in London and on to hundreds of broadcasters and digital publishers across the globe. Over 70 hours of live content was transmitted during First Descent’s mission in the Seychelles, including nine hours of prime-time television broadcast on Sky, and two-way interviews with the submersible crews. At one point, pictures were even beamed to the giant screens positioned above the concourses of London’s major railway stations, offering commuters a live ‘feed’ to events unfolding deep beneath the waves on the other side of the world. Sky News and Sky Atlantic, as part of Sky Ocean Rescue, which have also joined the mission, plan to broadcast more live subsea programmes in the future as the project develops.

Meanwhile, the potential of the Nekton Mission continues to unfold. Nekton has teamed up with the University of Oxford to develop artificial intelligence tools, for example, to accelerate analysis and publication. Data and video will be made available through OCTOPUS – Ocean Tool for Public Understanding and Science – to provide a holistic and dynamic view of the changing state of the Indian Ocean, its biodiversity and human impacts. Better connectivity can also increase participation and improved real-time communication opens the door for experts from developing nations to join the scientific exploration of the oceans. In fact, promoting local engagement is one of the Nekton Mission’s broader objectives and the project organisers made sure to create opportunities for marine scientists based in the Seychelles to participate in all aspects of the expedition.

Together with datasets and research findings emerging from the expedition, this inclusive approach is intended to support the Seychelles in implementing a Marine Spatial Plan, which will see around one-third of its national waters protected as part of building a sustainable Blue Economy. This is important because the way the Indian Ocean changes in the coming decades will profoundly affect the lives, livelihoods and wellbeing of the 2.5 billion people living in the Indian Ocean region. The next mission is in the Maldives around April this year, before moving on to the Mozambique Channel – all coming to you live from under the sea thanks to groundbreaking technology, a whole bunch of flashing blue lights…and a little bit of magic!

LiveU is represented by Concilium Technologies in South Africa.

2019 – a year of change and the need for speed


2019 has been a year of technology firsts in the media and entertainment industry. Multi-platform content saw a meteoric rise, Artificial Intelligence (AI) gained ground across many fronts and connectivity, devices and personalisation changed market dynamics significantly.

We have seen the development of new formats, transmission networks and content management systems, and there is no doubt that 5G technology will have a profound impact on the industry as a whole when it is fully rolled out. This year has truly been one of change and adaptation, and some of these shifts may be bigger than we think…


One of the star trends this year has been Artificial Intelligence. Every day it becomes more necessary to have tools that allow us not only to automate tasks to gain efficiency and speed, but also to rely on ‘intelligent automation’ in our daily tasks and workflows. Analyst firm IDC Research has already forecast investment of $77.6 billion in cognitive AI systems within the media and entertainment world for the year 2022.

Devoncroft’s Big Broadcast Survey for 2019 presented some initial data on investment trends in broadcast technology, with AI and machine learning already occupying seventh place in the list of trends considered most important by broadcast industry professionals. There is still a long way to go, both in the evolution of these tools and in their implementation. Artificial Intelligence offers very interesting capabilities, such as automatic cataloguing of metadata, advanced content searches adapted to the specific needs of users, facial recognition, object detection, audio effects detection, speech-to-text, sentiment analysis of images, and even automatic transcription and translation of texts.

The move to IP

One of the big technology drivers this year has been the adoption of new connectivity. IP technology grew out of the development of new video formats, and this form of content transmission not only allows post-production houses and broadcasters to work with higher-quality formats, but also enables immediate content delivery and, more importantly, remote production. It’s probably one of the most disruptive changes in broadcast history, not only because of the technical challenges, but also because of the adaptation it demands in the way the industry works. Despite all the advantages it presents, therefore, its adoption rates are still quite low.

A thriving eSports market

From its inception back in the 1990s, eSports was virtually unheard of and battled to get any mainstream coverage. This year, the situation has changed dramatically. The tournaments have become increasingly popular, prize pools run to hundreds of millions, and more ‘traditional’ sport structures have made eSports attractive to broadcasters; as a result, this form of entertainment is attracting money, sponsors and new audiences at an incredible rate.

As more people consume eSports content, the hours spent watching eSports videos also increase every year. In 2012, people spent only about 1.3 billion hours watching these videos. Now, in 2019, the increase has been drastic: viewer numbers have grown to 380 million, watching 6.6 billion hours of eSports videos worldwide. It has caught the attention of the broadcast sector, which is investing in its future.

Multi-Platform Content

Despite rumours of its supposed irrelevance, traditional TV is alive and thriving, and predicted to keep growing. It’s a completely brand-safe environment, offering reach, high-quality execution, measurability and storytelling opportunities – but we have entered a new generation of TV.

As online video has matured, creating a cross-screen video experience has become a key focus for broadcasters – and 2019 has seen a meteoric rise in multi-platform content across the media and entertainment industry. There has been a marked increase in demand for multi-platform content producers in the worldwide job market, and technology providers such as Avid, Adobe and Blackmagic Design have all updated their software with enhanced AI-based multi-platform export features within their editing platforms.

Industry Convergence

The distinctions between print and digital, video games and sports, wireless and fixed internet access, pay-TV and OTT, social and traditional media are all blurring. Streaming services, TV companies and social networks are now competing over both conventional sports and eSports rights. We are also seeing TV broadcasters, telecoms, tech companies, OTT operators and film studios all competing to provide TV content. Radio stations, podcast companies and streaming services are all competing to provide radio and podcast content.

It’s no surprise, therefore, that 2019 saw a raft of mergers and acquisitions making the headlines. The AT&T–Time Warner merger and Disney’s acquisition of 21st Century Fox, completed in March, are just two examples. It’s hard to believe that just a few years ago, streaming was a supplement to traditional broadcast television and pay-TV. There were issues with quality of service and quality of experience, and a belief that any major event – like the World Cup, or even concerts streamed live – would ‘break’ the internet. Netflix proved everybody wrong, becoming the world’s largest global TV channel in 2019, and its success has spurred many others to join the streaming party. Experts predict that data traffic will grow to over 397.8 trillion megabytes by 2022.

Fast as 5G

5G technology is being developed at a furious pace and the roll-out of 5G wireless networks is inevitably going to change the way media is both broadcast and received. 5G was well represented at both NAB and IBC, offering insights as to how the technology will influence the future of broadcasting.

Something that I am following with much interest is a Chinese 5G technology-driven project, unveiled at the 2019 Qingdao International Film and Television Expo back in August. Government officials and industry insiders recognise that the ground-breaking nature of 5G could inject fresh life into the broadcasting and TV sectors in China, and are willing to throw millions into its development. The establishment of a 5G High- and New-Tech Video Pilot Park in Qingdao will see the roll-out of the first batch of high- and new-tech video products as soon as the end of this year, adopting state-of-the-art technologies such as virtual reality, augmented reality, mixed reality, 4K/8K Ultra High Definition, high frame rate and wide colour – all delivered over 5G. It’s a sign of what’s to come, adding new impetus to the world of broadcast, media and entertainment beyond 2019.

IBC 2019: It’s a record


It’s a place where cutting-edge innovations and creative ideas are shared and relationships are formed. Each year IBC continues to grow, and this year attendance hit a record number! The world’s media, entertainment and technology industry once again gathered in Amsterdam in droves, and the total attendance figure of 56,390 included a record number of next-generation (18-35) attendees, demonstrating the vital role that the show plays in the broadcast and entertainment industry.

IBC CEO Michael Crimp was delighted to see audience growth in key target areas, “particularly in welcoming more young people, senior-level executives and overseas visitors,” he said. “While this gives us a focus to build on next year, our metrics for success also include crucial elements like quality of experience, audience engagement and IBC’s influence on the industry, and our conversations with exhibitors and attendees tell us that these have all improved on 2018.”

This year’s show was indeed jam-packed with the technology and trends of tomorrow and perhaps the biggest highlight for most was the first-ever IBC Esports Showcase live tournament. I think it highlighted just how gripping and entertaining Esports can be and why the media and broadcast industry should be getting involved. Esports is an incredibly fast-growing movement and IBC attendees saw it first-hand, with two professional teams from ESL’s network of National Championships across Europe going head-to-head in the classic FPS multiplayer Counter-Strike: Global Offensive.

The broadcast, media and entertainment industry’s sense of social responsibility is stronger than ever. Movements championing diversity and inclusivity are gathering momentum and there is a conscious increase in company initiatives making a positive impact in the workplace and community. To reflect this, and its commitment to driving change in the industry, IBC has for the first time recognised social responsibility as part of its prestigious awards programme with a stand-alone award: the Social Impact Award. Competition for this award was so intense that the judges awarded not one, but three trophies: to Turkish broadcaster TRT for its World Citizen programme; Sagar Vani, an Indian initiative, which is an omni-channel citizen engagement platform; and finally Chouette Films, an initiative of the University of London’s School of Oriental and African Studies, whose aims are to produce academic and informative content with the smallest of environmental footprints.

The conference sessions provided attendees with much food for thought. Google’s Android TV and Roku were singled out as two of the most transformative technologies at IBC by Accedo’s Fredrik Andersson in a What Caught My Eye conference session on innovation. A big topic of conversation on the main IBC stage was change: changing monetisation models, changing consumer habits, even changing content expectations (have we reached Peak Content yet?). Something that never changes, though, is the ever-spectacular Big Screen events. As always, the IBC Big Screen, which is equipped with Dolby Vision and Dolby Atmos, delivered a stunning programme of events and screenings. An exclusive cinematic screening of Game of Thrones’ epic Season 8 battle episode drew a huge crowd, as did a session on the stories behind the edit and the music of the Elton John biopic, Rocketman.

The IBC2019 exhibition featured 1,700 exhibitors over 15 halls offering attendees the opportunity to discover all the latest trends and technologies at their own pace. In the post-production environment, Adobe used the show to unveil Auto Reframe, a new feature for its Premiere Pro video editing software that is powered by the company’s Adobe Sensei artificial intelligence (AI) and machine learning (ML) platform. Auto Reframe automatically reframes and reformats video content so that the same project can be published in different aspect ratios, from square to vertical to cinematic 16:9 versions. Avid used the opening day of IBC to share that its Media Composer video editing software offering will now be able to deliver native support for Apple’s ProRes RAW camera codec, and will support ProRes playback and encoding on Windows.

The exponential growth in video consumption worldwide is a challenge, as consumer demands and expectations increase. Taking notice of market trends, Nikon opportunely unveiled its all-in mirrorless moviemaking set-up: the Nikon Z6 Essential Movie Kit, built around the video-friendly 24.5MP full-frame 4K Nikon Z6 body. Comprising filmmaking essentials such as the Atomos Ninja V monitor, SmallRig camera cage and spare batteries, Nikon describes the Movie Kit as “providing the pure essentials to get rolling quickly, with all the core tools to make high-quality movies,” while “leaving filmmakers free to customise further components to suit their personal preferences.” The package will cost aspiring filmmakers around US$3800.

6K was a buzzword often dropped into conversations at IBC, and Blackmagic Design used the occasion to announce the Blackmagic Pocket Cinema Camera 6K, a new handheld digital film camera with a full Super 35 size 6K HDR image sensor. There are no surprises when it comes to Blackmagic Design: the Aussie company continues to impress with their amazing technology, and of great interest was the release of their ATEM Mini, a new low-cost live production switcher specifically designed to allow live streaming to YouTube and business presentations via Skype.

As always, Sony showed off their prowess in the industry, unveiling a whole new range of products, solutions and services. The highlight for many a DoP had to be the new PXW-FX9 XDCAM camera, featuring Sony’s newly developed Exmor R 6K full-frame sensor and Fast Hybrid Auto Focus system. Building upon the success of the PXW-FS7 and PXW-FS7M2, and inheriting its colour science from the VENICE digital motion picture camera, the new camera offers greater creative freedom to capture stunning images, and represents the ultimate tool of choice for documentaries, music videos, drama productions and event shooting. Also of interest to shooters was the launch of Sony’s new full-frame E-mount FE C 16-35mm T3.1 G cinema lens, an ideal match for large-format cameras such as the PXW-FX9 and VENICE, with a wide-angle zoom combining advanced optical performance, operability and intelligent shooting functions. For the audiophiles, it was great to see an impressive third generation of the DWX digital wireless microphone system, with the compact DWT-B30 bodypack transmitter catching my eye.

For an industry that is constantly changing, being able to experience all the latest tech and hear about the challenges and opportunities facing the industry from key industry players all in one place is invaluable. IBC Director Imran Sroya commented: “IBC continues to succeed because we work hard to present the most knowledgeable speakers, the most topical sessions and the technology of tomorrow, providing a global meeting point that enables industry professionals to get together and share vital information about all aspects of media, entertainment and technology.”

It really is the place to be and the place to meet, and I am already looking forward to seeing what IBC has up its sleeve for next year!

A whole lot of firsts for Rugby World Cup


The first broadcast of a Rugby Union international was radio coverage of England versus Wales at Twickenham in the United Kingdom, back in January 1927. Some 40 years later came the first rugby match broadcast on television in colour: a highly charged third test between England and the New Zealand All Blacks in 1967, also at Twickenham.

The idea of a Rugby World Cup had been suggested on numerous occasions going back to the 1950s, but it met with opposition from most unions until 1985, when it was finally agreed upon, with the inaugural tournament played in 1987. It is now the third-largest sports event in the world, after the Summer Olympics and the Football World Cup, and this year the ninth Rugby World Cup takes place in Japan, with a whole host of firsts when it comes to broadcast technology.

Rugby has been played in Japan since at least 1866, when the Yokohama Football Club was founded. It’s fitting, therefore, that Yokohama, which borders Tokyo, will host this year’s final. Japan is also set to break new ground as the host of the first tournament in Asia. International Games Broadcast Services (IGBS), a joint venture between HBS and IMG Media, has been appointed host broadcaster. The decision to appoint a specialised host broadcaster for the first time reflects World Rugby’s commitment to the highest standards of ground-breaking technical production and consistency between tournaments.

Also for the first time, all 48 matches of a Rugby World Cup will be produced in multiple formats. The UHD standard is 4K SDR 2160p/59.94, while the HD standards are 1080p/59.94 and 1080i/59.94. In addition to the traditional World Feed, Rights Holding Broadcasters (RHBs) will have access to uninterrupted live feeds to complement their studio operations, plus access to action clips during the match to enhance their analysis and programming. Dedicated ENG crews will provide content from around the country, the tournament and the competing teams. All the live and ENG content will be available via the World Rugby Media Server, supplied by EVS, with logged rushes plus post-produced features available for RHBs’ programming – be it a traditional broadcast or an online offering – either at the International Broadcast Centre or remotely at their home studio.

IGBS is also introducing another first, the Match Day Preview Show, which will look ahead to the next day’s games. Combined with the live matches and the daily highlights show, this will offer broadcasters access to all-day programming. In a world that now needs to feed social media, World Rugby has introduced a specific content package to promote the event across a variety of social media platforms. The social media content production team will enable rights holders to easily populate their own streams, websites and apps with high-quality content, with short-form content, infographics and 360° virtual reality (VR) clips all made available.

Production teams have been drawn from France, the UK, South Africa, New Zealand and Australia by the host broadcaster in order to maintain the highest standards throughout the six-week tournament. A number of first-time innovations have been introduced to enhance the coverage this year. Depending on the rating of the match, there will be 23, 28 or 32 cameras, as well as corner-flag cameras, and Spidercam will be operational at 34 of the 48 matches. Hawkeye will be providing facilities for the Television Match Official (TMO) and for Citing and Head Injury Assessment.

NHK, Japan’s sole public broadcaster, will provide 8K Super Hi-Vision coverage to the domestic market, with unprecedented free-to-air coverage offering an opportunity to use rugby’s biggest event to reach the widest possible audience. Under the IGBS umbrella, NHK will use nine cameras, together with some host 4K cameras up-converted. It plans to broadcast 31 of the 48 matches in 8K, and will use Japanese UHD graphics created on-site, as well as Augmented Graphics in conjunction with Spidercam, to add to the 8K spectacle.

All the broadcast title and programme graphics will be run by Alston Elliot, who will also provide the official data throughout the tournament. Alston Elliot is a graphics production company that specialises in televised sports graphics and data systems, and also serves as a technology partner to broadcasters, if required, by supplying turnkey graphics systems and custom output software. In particular, football competitions such as the English Premier League, FA Cup, Europa League and FA Women’s Super League have adopted their turnkey services. Other sports they supply include golf, motorsports, athletics, tennis, hockey, fishing and, of course, IPL cricket. Their technical innovations for rugby include scrum analysis, play patterns, try origins, team trends, ruck analysis, tackle analysis and field position analysis.

The company’s graphics creation workflows are based mainly on Vizrt and ChyronHego software and they have come a long way since they started out in the UK back in 1992. The company now has offices in South Africa, India and also Australia, where they recently designed and supplied a ground-breaking broadcast graphics package for Augmented Reality on Spidercam for the National Rugby League.

One of the major challenges facing the broadcasters is a strange one: remarkably, there are four varieties of local mains power in Japan – 200V/60Hz, 100V/50Hz, 200V/50Hz and 100V/60Hz – depending on the stadium location. These challenges have been overcome, however, with the appointment of Aggreko, a UK-based company that will provide critical power systems and distribution for all broadcasts at the various stadiums, as well as backup systems for the 12 venues across Japan.

Meanwhile, world lighting leader Signify has installed its connected lighting system, Interact Sports, at the Toyota Stadium in Aichi, Japan – the first outdoor stadium in Japan to install connected LED pitch lighting in combination with high-performance Philips ArenaVision LEDs. The new lighting meets the stringent broadcast standards for flicker-free Ultra-HD 4K television and super slow-motion action replays. Viewers at home will clearly see every detail and emotion on the pitch in a tournament that is bound to provide us with the best that sports broadcasting has to offer – and the most exciting rugby we have seen…ever!

It’s IBC time again!

September is nearly upon us – where does the time go? It’s a month of changes: for us in the southern hemisphere it’s the announcement of spring; for our friends up north, the start of autumn; and, more importantly, the start of the world’s most influential media, entertainment and technology show – IBC2019.

The IBC theme this year is ‘See it differently’, an apt theme considering that broadcasting is going through an immense amount of disruption, thanks largely to new technology and changing consumer viewing habits.

This year’s show is slightly different, in that IBC has decided to align the dates of the exhibition and conference so that both will now take place from Friday, 13 September through to Tuesday, 17 September 2019. Until now, the IBC Conference has always started a day earlier than the Exhibition, on the preceding Thursday. Over the five-day conference, 1,700 delegates and guests from across the globe will hear from an outstanding line-up of 300-plus speakers, enjoy fantastic networking opportunities and be inspired to embrace the changes in our industry together. The exhibition expects to draw a crowd of over 55,000 attendees.

This year’s IBC conference programme features some of the foremost thought-leaders, innovators and policy-makers in their fields and covers a wide breadth of topics. The programme will explore new strategies, business disruptors and future technological progress, and will hopefully reveal the future roadmap of the industry.

Top of my list from the conference sessions is a peek into business disruptor YouTube and how it creates and monetises content, with a keynote speech from Cecile Frot-Coutaz, Head of YouTube EMEA. YouTube has largely been about user-generated content shot on mobile phones, but large viewing figures and big sponsorship deals have seen some YouTube stars become increasingly professional in their approach to video production, creating competition for traditional broadcasters. The desire for instant-access content on YouTube, as well as on a growing number of other platforms, can be an opportunity for broadcasters, giving them a new outlet and way of engaging their audience.

Another not-to-miss from the conference programme will be the Global Gamechangers session on Friday 13 September. Here, you’ll have the opportunity to meet the pioneers of creativity and innovation and be inspired and informed by the greatest creative, innovative and future-facing talents making headlines across the global stage. The Global Gamechangers will debate the future of the industry as they consider where revenues will come from, the creative challenges facing content makers and how broadcasters can remain relevant in a future dominated by digital media.

Always fully booked in advance, the IBC2019 Big Screen Programme focuses on how innovation in technology is allowing us to bring stories to life like never before. This year you will be able to gain insights from creative and technical deep dives with everyone from cinematographers to colourists involved in the production of hits like Toy Story 4 and Game of Thrones. A world-class forum where creativity meets tech, the Big Screen programme allows us to hear from the talent behind the camera on everything from cinema and big-event programming to boxset dramas, with an in-depth look at the tech bringing this content to our screens.

Beyond the IBC2019 conference doors, the exhibition – spanning over 50,000 square metres, with over 1,700 exhibitors and over 55,000 attendees made up of innovators, key decision-makers and press – offers the opportunity to discover adjacent technologies and sectors, and to catch up with the latest developments in broadcast and how they can fit into your future media plans.

One of the major areas of interest will be Artificial Intelligence. Is AI still hype or is it really the next big thing? At last year’s IBC, the Future Zone looked at how Augmented Reality (AR) and Virtual Reality (VR) had already had an impact on the broadcast, media and entertainment industries. This year, with both technologies having dramatically improved, there’s a wider look at existing projects and new ways that AR and VR can impact broadcasting, for creators and viewers alike. From the creation of virtual objects in a TV studio, to complete virtual sets, AR is already a big part of many broadcasts. Looking to the future, many believe that high-speed 5G mobile networks will create new opportunities for AR and VR, creating new ways of telling stories and delivering immersive narrative experiences.

Other hot topics will no doubt be Cloud Production, Cyber Security, High Dynamic Range (HDR) and one of my personal favourites, Esports. Esports is already a billion-dollar industry and all signs point to it growing rapidly over the coming years. With this potential comes fresh challenges, such as how to create interesting stories from in-game streaming. Esports was introduced at IBC2018, but the focus will be much larger at this year’s show. For the first time, IBC2019 is hosting the IBC Esports Showcase designed to give attendees an insight into this growing area. From managing the complexity of production to delivering an Esports broadcast, the Esports Showcase will host a live Esports tournament to demonstrate the techniques, trends and technologies required to bring this exciting new form of entertainment to life.

Something that has been in the headlines – be it good or bad – is the implementation of Mobile 5G networks. 5G networks are starting to be switched on across Europe, with plans to rapidly expand coverage. Offering broadband-like speeds, 5G is a revolutionary new type of mobile network that makes high-speed internet access possible for mobile devices. For broadcasters, 5G can offer a complete portable transmission solution, even delivering 4K video streams. For consumers, 5G can be used for streaming high-capacity content, such as with Barcelona Football Club, which has used 5G to embed wireless 360-degree cameras throughout the Nou Camp stadium, streaming the video to home fans using VR headsets. I have no doubt that 5G will be bigger than ever and a talking-point long after IBC finishes.

IBC organisers say that this event is the world’s most influential media, entertainment and technology show – and they aren’t wrong. It’s set to be a goodie, and offers everything from new product launches to opportunities to engage with customers old and new and to meet up with your broadcast colleagues as well as all the industry leaders. IBC 2019 is heading to Amsterdam, and – as they say in Dutch – “zie je daar.”

An increased need to test and measure


In the good old days of analogue, broadcast test equipment was traditionally an elementary electronic device, such as a waveform monitor, vectorscope or audio level meter (in post-production) and spectrum analysers and field-strength meters (in broadcast transmission).

Traditionally, it was a task for broadcast engineers to ensure that our workflow signals were within spec and met all standards. In today’s post-production and broadcast environment, we now work with multiple digital signals. The ever-evolving complexity of production workflows such as streaming video, UHD, 4K and high frame rates, makes test and measurement even more important in the broadcast workplace. As a result, the equipment has changed from a hardware-centric environment to software-based systems, with more and more intelligence incorporated into the product. The engineer sitting at a bench in the workshop has been largely replaced with software tools like automated quality control (QC) to meet the needs and workload of multi-platform delivery.

The original use of test and measurement equipment like the waveform monitor (WFM) and vectorscope was to line up videotape-based equipment and check a few basic parameters of the recorded video: black level, peak white, colour phase, noise, colour gamut and timing. Videotape technology could suffer from alignment issues, head clogs and other problems that impacted video quality. Although tape was very reliable, each record operation still needed checks at the start, middle and end. QC was easy then, and the general process was to spot-check a process or watch a screen – glancing momentarily at the waveform monitor and back to the CRT monitor, with an eagle eye watching for ill-timed videotape dropouts.

The move to file-based workflows eliminated a number of these parameters and, with them, the need to test for analogue-based faults. However, due to the complexities of digital workflows, the number of tests and measurements needed to ensure that the content delivered to the consumer is of suitable quality has increased. The sheer number of measurements that must be performed has naturally led to software-based digital test equipment and automated quality control systems.

It was the EBU (European Broadcasting Union) that recognised the importance of quality control (QC) in file-based broadcast workflows back in 2010, commenting that “broadcasters moving to file-based production facilities have to consider how to implement and use automated quality control (QC) systems. Manual quality control is simply not adequate anymore.”

In most countries or regions where content will be shown, there are regulatory requirements covering several aspects of produced content: audio loudness levels should comply with the CALM Act in the USA or EBU R128 in Europe, for example. Closed captions or subtitles must be present, sometimes in multiple languages and formats, while in the UK and Japan content must be tested to ensure the absence of flashing patterns that may trigger PSE (photosensitive epilepsy) in susceptible viewers.
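As a minimal sketch of how such a loudness compliance check might be expressed in software – assuming the commonly cited targets of -23 LUFS for EBU R128 and -24 LKFS for the ATSC A/85 practice referenced by the CALM Act, with illustrative tolerances; the function and spec table are hypothetical, not any particular QC product’s API:

```python
# Hypothetical sketch: comparing a measured integrated loudness value
# against regional delivery targets. Targets reflect EBU R128 (Europe)
# and ATSC A/85 / CALM Act (USA); tolerances are illustrative.

LOUDNESS_SPECS = {
    "EBU_R128": {"target": -23.0, "tolerance": 0.5},   # LUFS, +/- LU
    "ATSC_A85": {"target": -24.0, "tolerance": 2.0},   # LKFS, +/- dB
}

def check_loudness(measured, spec_name):
    """Return (passed, deviation) for a measured integrated loudness."""
    spec = LOUDNESS_SPECS[spec_name]
    deviation = measured - spec["target"]
    return abs(deviation) <= spec["tolerance"], deviation
```

A programme measured at -22.4 LUFS would fail the R128 check (0.6 LU above target) yet pass the looser ATSC tolerance, which is why the same master often needs region-specific loudness normalisation before delivery.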

A human carrying out a QC test can verify audio quality and language, check for visual video artefacts and make the call on whether to pass or fail the content. What humans can’t see, though, is the mass of ancillary data (in digital form) that makes the digital media file valid. Automated QC uses computers and software to check technical parameters that can’t readily (or at all) be examined by a human, and can augment the work of expert QC viewers by flagging issues for their review. Automated QC systems are now the only practical way to validate that a file is correctly constructed according to the requirements of the target platform – including resolution, format, bitrates and file syntax – a task beyond the ability of most technical experts.

The multi-platform, multi-screen media world of today offers content owners and content distributors a host of opportunities to develop substantial new revenue streams. With the large and varied amount of data being produced, automated QC systems need correspondingly broad functionality. Most systems have been developed with wide file format support, covering everything used in broadcast and post-production, as well as support for streaming and network sources, while some even offer RAW file support.

Systems will first perform a container check to ‘recognise’ the file format and establish how many video and audio streams there are, the bitrate, the start timecode and the duration. After that, they check the video codec, frame size and frame rate, as well as the frame and pixel aspect ratios. The system checks that all video and audio levels conform to the required standards, and the more intelligent systems will automatically correct chroma, black and RGB gamut levels if they fall outside the limits (and will even correct PSE flashing errors). Audio problems such as clipping are easily observable in the decoded stream, and QC systems can determine whether loudness limits, peak limits, instantaneous peaks and true-peak value limits have been exceeded, as well as measuring long-term loudness over the span of the content. Other checkable baseband audio flaws include silence due to audio dropouts.
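The check order described above can be sketched as a simple pipeline. This is only the shape of the logic, with hypothetical field names and a made-up delivery profile – a real QC system would probe these values from the file itself (with tooling in the vein of FFprobe or MediaInfo) and check far more parameters:

```python
from dataclasses import dataclass, field

@dataclass
class MediaFile:
    """Properties a QC system would probe from the actual file."""
    container: str            # e.g. "MXF", "MOV"
    video_codec: str
    frame_size: tuple         # (width, height)
    frame_rate: float
    integrated_lufs: float    # measured programme loudness

@dataclass
class QCReport:
    errors: list = field(default_factory=list)

    def fail(self, msg):
        self.errors.append(msg)

    @property
    def passed(self):
        return not self.errors

def run_qc(f, profile):
    """Run container, codec and level checks against a delivery profile."""
    report = QCReport()
    # 1. Container check: is the wrapper one the target platform accepts?
    if f.container not in profile["containers"]:
        report.fail(f"container {f.container} not accepted")
    # 2. Codec, frame size and frame rate against the delivery spec.
    if f.video_codec != profile["codec"]:
        report.fail(f"codec {f.video_codec} != {profile['codec']}")
    if f.frame_size != profile["frame_size"]:
        report.fail(f"frame size {f.frame_size} out of spec")
    if f.frame_rate != profile["frame_rate"]:
        report.fail(f"frame rate {f.frame_rate} out of spec")
    # 3. Level checks, e.g. a loudness ceiling for the platform.
    if f.integrated_lufs > profile["max_lufs"]:
        report.fail("loudness above limit")
    return report
```

Note that a failing file gets a report listing every violation rather than stopping at the first, mirroring how real QC systems produce a full error log for the operator.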

Automated QC systems are not foolproof, however, and human intervention is required from time to time. A correctly set-up and administered system running automation will get over 90 percent of file-based work done. The balance of the process relies on human input, because there are creative interpretations in both audio and video that an automated system may fail to identify through misinterpretation. I have had content ‘failed’ because a close-up shot of a zebra was misinterpreted by a machine as an excessive moiré pattern. A 5.1 surround soundscape mix in a scene shot at the height of cicada breeding season was rejected because the audio track supposedly contained continuous electrical buzz. My submissions required human intervention and, once passed, I was assured that the system had now learned what a close-up of a zebra looked like and what a cicada sounded like!

The media industry has been revolutionised by the adoption of digital file-based workflows – and having an understanding of the functions that make up file-based workflows, and what needs to be tested, is essential for knowing how to effectively implement quality control. Broadcasting success relies on quality and, most importantly, consistency in the processes that produce quality. It’s what broadcasting has been about since day one, and – thanks to some automation, artificial intelligence and a whole lot of human chutzpah – it always will be.
