
The making of an African Sundance classic

SCREEN AFRICA EXCLUSIVE:

Lemohang Jeremiah Mosese is a self-taught filmmaker and visual artist from Lesotho, now based in Berlin. His film, Mother, I Am Suffocating. This Is My Last Film About You, was selected for Final Cut in Venice in 2018, where it won six awards, and premiered in the Berlinale Forum in 2019.

Mosese was one of three filmmakers selected for Biennale College Cinema with his feature film This Is Not A Burial, It’s A Resurrection. The film had its international premiere at the 2020 Sundance Film Festival, where it won the Special Jury Award for Visionary Filmmaking in the World Cinema Dramatic competition. Global sales rights have now been picked up by Memento Film’s cinema arthouse label, Artscope.

Set amongst the mountains of land-locked Lesotho, This Is Not A Burial, It’s A Resurrection follows Mantoa, an 80-year-old widow, as she winds up her earthly affairs, makes arrangements for her burial and prepares to die. But when her village is threatened with forced resettlement due to the construction of a reservoir, she finds a new will to live and ignites a collective spirit of defiance within her community. In the final dramatic moments of her life, Mantoa’s legend is forged and made eternal.

Screen Africa spoke to Mosese about the making of his Sundance-award-winning film…

Congratulations on This Is Not A Burial, It’s A Resurrection premiering and winning at Sundance this year! Did you expect to receive such recognition for the film?

I was definitely not expecting to get such recognition in general! You do this work, and you just hope that it resonates with people. My work is very personal; it’s all reactionary; it’s not for everyone. It’s the kind of film I would personally seek out at a festival. We had only six months to create this film from scratch. There are many things I wish I had done differently, but I have come to learn to let go. It’s not mine anymore, it has its own life.

Why is this film so personal to you?

I drew on real-life events for the film, which is set in my native home, Lesotho. Lesotho is a tiny country completely enveloped by South Africa. Its behemoth mountain ranges make up nearly three-quarters of its terrain and these are responsible for the abundance of water in the country, believed to be among the highest quality in the world. Lesotho annually exports an estimated 780 million cubic metres of water to South Africa; this marks Africa’s largest water transfer scheme in history.

As more and more reservoirs are built, thousands of highland villagers are forcibly removed from their land and are relocated to urban living environments, where they not only lose their livestock, crops and way of life, but also their individual and collective identity. Most liken the process of displacement to a death. More and more forests, villages and family relics are being erased in the name of progress. Destroyed and forgotten in a soulless march towards futurity. I am personally not for or against progress. I am more interested in interrogating the psychological, spiritual and social elements that attend it.

When I was a child, my family was evicted from our home. My grandmother’s village is undergoing forced resettlement right now. I still know every texture of her house’s walls, its thatched roof, the smell of oak trees after rain, the stone kraal. Soon this will be razed and flooded and water will be channelled into the heart of South Africa. Let’s just say in every scene and every character there is me; my conflicts, my struggles with faith, my fears, my hopes and dreams.

Can you tell us more about the filmmaking process?

The filmmaking process was definitely challenging. As I said, we had to complete the film in just six months, so we had a very packed shooting schedule. The film was shot on location in the remote mountains of Lesotho, where running water and electricity are scarce. Equipment, vehicles, crew and other resources were brought into the country from South Africa. The tiny crew of just fifteen people endured extreme weather conditions while shooting in areas with no road access. Equipment and cast were often transported on horseback and on mules.

Apart from the leads, the cast is made up almost entirely of actual residents from the village where photography took place. It’s really interesting because they knew nothing about cinema: they had never seen something like this in their village before, so they had no preconceived ideas about acting. Everything was really natural and we made sure the set design was always as minimalistic as possible. I didn’t want anyone to feel like they were being watched or under scrutiny.

I come from a visual world. As much as I love language, visuals take precedence in my work. I knew the texture, I knew the composition, pace, tone and the feel I wanted. And I tried to find a playful space on set, to stumble upon things. Pierre de Villiers, my DoP, and I have a synchronised love and passion for beauty. His way of seeing light is just amazing. He comes from the commercial world and it was his first film – it’s incredible. I also trusted him with the choice of camera we used, which was the Sony Venice. It served us best in low light conditions.

What was the decision-making behind editing the film yourself?

As a director, I’m obsessed with how to tell the story through the edit of a film. Editing is the same as writing; it’s so important for really bringing all your ideas to the screen. But, of course, sometimes it’s nice to bring someone external in to edit your film – they’ll bring a new perspective to it.

We had some very poignant scenes and my aim was to make sure the edit was paced to reflect those scenes. It was built around them and the framing, so everything was as visual and slow as possible, then got faster around the tension.

What software was used to edit the film and what was the editing process like?

I now use DaVinci Resolve to edit and grade all of my films. I first saw DaVinci Resolve as an NLE solution when Apple brought out Final Cut Pro X. I’d been a Final Cut user but I just didn’t love their redesign; it made everything harder for me. I’m not even a purist! But the UI just didn’t feel very playful to me anymore. I stopped editing completely when they introduced it. Then I bought a Blackmagic camera – one of the old ones – and it came with Resolve. I fell in love with it! The UI is a whole other world; it’s aesthetically pleasing and makes so much sense to me creatively.

It was a tricky process as we were high up in the mountains and it was really remote. My assistant and I acquired a generator for power and set up a temporary edit suite on location using quad-core Mac Pros. It was here that we transcoded all of the rushes and assembled a rough cut. The edit was then finalised at Uhuru Productions in Cape Town. The grade was delivered by colourist Nic Apostoli at nearby Comfort and Fame Studios.

Can you tell us more about the grading process?

Basically, I wanted a 16mm look. I love that look, and I wanted this film to look like this vintage piece that survived, not a new feature. What I love about DaVinci is you can combine both scanned film and digital, and you won’t be able to see the difference. With my last film, Mother, I used 8mm and integrated that perfectly into Resolve. With this film, I wanted to turn it on its head and do a film emulation look. I basically use LUTs, and play with 16 and 35mm emulation LUTs, and work with colourists to build on them. I always play around, there’s no strict template.
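The idea behind an emulation LUT can be illustrated with a toy sketch: a lookup table maps input code values to output values, interpolating between entries. Real film-emulation LUTs are usually 3D cubes, and the curve below is invented purely for illustration:

```python
# Toy illustration (not Resolve's implementation): a 1D LUT maps input
# code values to output values, interpolating between table entries.
# A real film-emulation LUT is typically a 3D cube; the principle is the same.

def apply_1d_lut(value, lut):
    """Map a normalised 0-1 input through a 1D LUT with linear interpolation."""
    if not 0.0 <= value <= 1.0:
        raise ValueError("expected a normalised code value")
    pos = value * (len(lut) - 1)       # fractional index into the table
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# A crude "faded film" curve: lifted blacks, rolled-off highlights.
film_lut = [0.05, 0.2, 0.45, 0.7, 0.85, 0.95]

print(apply_1d_lut(0.0, film_lut))   # lifted black point: 0.05
print(apply_1d_lut(1.0, film_lut))   # rolled-off white point: 0.95
```

Stacking a curve like this on top of scanned-grain and halation passes is roughly how a "vintage piece that survived" look gets built up in the grade.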

What advice would you give to up-and-coming filmmakers?

Keep making films, fail, and succeed. We are so privileged right now to be in a time when you can literally make films with anything. There are cameras all over the place and software programs like Resolve that let you do all of the post-production in one place. All the information is out there, you just need to go and grab it.

The Refinery colourist Kyle Stroebel talks working on Universal’s Bulletproof 2

SCREEN AFRICA EXCLUSIVE:

Cape Town-based post-production and visual effects house The Refinery recently completed the conform and grading work on Universal’s upcoming Bulletproof 2. Directed by Don Michael Paul and starring Faizon Love, Kirk Fox, Tony Todd and Cassie Clare, the film is due for theatrical release in January 2020.

We caught up with Kyle Stroebel, colourist at The Refinery, to find out what his process was like working on Bulletproof 2.

“I’ve been a colourist for more than 13 years now,” comments Stroebel. “That term has taken on many forms over the years, from the telecine days of film negative to the more modern digital intermediate suites.

“I made my way through the early days of grading rushes and handling dailies to now finishing commercials, long form feature and series work. Refinery has a great relationship with Universal, and we have finished a lot of their movies shot locally in the last few years. When the opportunity came to be a part of Bulletproof 2, it was very exciting.”

What was your brief when it came to the final grade of the film?

My brief was to “make it pop.” Think Michael Bay, think Tony Scott, think strong colours, punchy contrast and a luxurious Miami setting. This kind of grade means adjusting for blue skies and warm skin tones, and makes the final result look like one audiences would expect from a big budget blockbuster.

What was it like to work with director Don Michael Paul and DoP Michael Swan?

Don is a great character inasmuch as he’s incredibly expressive. He uses emotional gestures and adjectives to give you a sense of what he’s looking for. He always wants to go extreme and further, which I love.

After we completed the DI (Digital Intermediate), Don had two days with Roundabout Post LA to finesse what I had finished, so if he needed to push it even further, then he could. Michael’s images were beautiful from the get go. I could tell Don had had that very same brief with him, and I was just continuing that vision.

Do you have any favourite scenes or sequences from a grading perspective? Why is this?

There’s a gunfight in a strip club lit with neon blues and purples. Local DoPs may go quite conservative in their approach, or overly embrace the neon and lose a bit of the sense of reality. Michael Swan got this balance just right. It’s dingy but beautiful. It’s colourful, so the cool blues and purples contrast the flames and gunfire so magically.

What hardware did you use to complete your work on the film?

We finished on a custom Linux box running DaVinci Resolve. This is a bit of a change-up for me, as I work almost exclusively on Baselight nowadays.

This was fitted with two Nvidia RTX 2080 Ti graphics cards and a ton of RAM. We used my Sony PVM OLED monitor, and because this station sat literally next to my Baselight panel, we opted for the Blackmagic Mini panel on the desk to save space.

What formats and codecs were you working with?

The shoot was very action-driven, which means there were terabytes of footage from, I think, around 15 different cameras sitting on multiple RAIDs – everything from an Alexa Mini shooting ProRes, to crash-cam H.264, and a bunch of S-Log Sony material thrown in. For this reason, we actually conformed everything and then managed it out as 12-bit DPX files. This made the most sense, as it was what the VFX vendor was finishing in anyway.
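For a sense of why a conform like this runs into terabytes, the payload of an uncompressed 12-bit RGB DPX frame can be sketched with some back-of-envelope arithmetic. The 4K DCI resolution and 24 fps figures below are illustrative assumptions, not details from the production, and real DPX files add headers and packing padding on top:

```python
# Back-of-envelope sketch (assumed numbers, not from the production):
# payload size of an uncompressed 12-bit RGB DPX frame, ignoring headers
# and the padding that real DPX packing adds.

def dpx_frame_bytes(width, height, bits_per_channel=12, channels=3):
    return width * height * channels * bits_per_channel // 8

def hours_to_bytes(hours, fps, frame_bytes):
    return int(hours * 3600 * fps * frame_bytes)

frame = dpx_frame_bytes(4096, 2160)          # 4K DCI, assumed
per_hour = hours_to_bytes(1, 24, frame)
print(f"{frame / 1e6:.1f} MB per frame")     # ~39.8 MB
print(f"{per_hour / 1e12:.2f} TB per hour")  # ~3.44 TB
```

At several terabytes per hour of conformed material, handing the VFX vendor the same DPX sequences rather than re-rendering a second mezzanine format saves both storage and a generation of processing.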

What were your biggest challenges working on the film?

Continuity. The shooting schedule was really quick and the weather often didn’t play ball. There’s a scene on a dock where it cuts from a midday shot where it’s pouring with rain, almost immediately to a shot in late afternoon sunset.

The joy of modern filmmaking is that this can’t interrupt shooting, so the colourist has to sort it out as best as possible through the process. It’s such a fine balance.

Refinery used DaVinci Resolve throughout the conform and grade – can you tell us why, and whether you found any specific tools helpful throughout your work?

This was actually a request from the studio. Like I said earlier, Don had a couple of days to finesse with a colourist in LA at Roundabout Post, which is a Resolve-based facility. I found the whole process very easy and intuitive.

There’s such an advanced toolset within Resolve that is really easily implemented. The individual channel mixer is something that I use a lot, and with so much of the film being shot later in the day and needing to match the material shot earlier on, the noise reduction on the fly helps a ton. Plus the keyer is incredibly quick to refine, so those blue skies can happen with little effort.
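Conceptually, a channel mixer makes each output channel a weighted blend of the three input channels. A minimal sketch of that idea follows; the `sky_boost` weights are invented for illustration and are not Resolve's internals:

```python
# Conceptual sketch of an RGB channel mixer (not Resolve's implementation):
# each output channel is a weighted sum of the three input channels.

def channel_mix(rgb, matrix):
    """Apply a 3x3 mixing matrix to an (r, g, b) tuple."""
    return tuple(
        sum(matrix[row][col] * rgb[col] for col in range(3))
        for row in range(3)
    )

# Identity leaves the pixel untouched.
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

# A mix that borrows some green into the blue channel, as one might
# when pushing towards those Miami blue skies.
sky_boost = [[1.0, 0.0, 0.0],
             [0.0, 1.0, 0.0],
             [0.0, 0.2, 0.8]]

pixel = (0.4, 0.5, 0.6)
print(channel_mix(pixel, identity))   # (0.4, 0.5, 0.6)
print(channel_mix(pixel, sky_boost))
```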

What is your take on the current state of the industry in South Africa, and what trends are you watching closely?

I’ve been doing this long enough now to know there will always be stages of flux and movement. The bedrock of commercial content produced five to seven years ago doesn’t have the same budgets in our contemporary era.

There’s been a move to digital online content in a deteriorating economy which has forced agencies and content producers to be clever. I think commercials now are as strong as they have ever been, they just may be slightly fewer and further between.

However, I think the long form industry is in a very good place. Slowly locals are beginning to consume more local content. And there is a drive for local narrative entertainment. Netflix has commissioned two Original series (one of which I am currently finishing) and the feature department is in a very buoyant position. Much of that is foreign investment in local talent, seeing the skills and creativity that our industry offers. This is helped by a government rebate program which also directly affects post-production.

One way or another, Refinery is busier than we have ever been, and I work seven days a week trying to accommodate all our clients’ needs.

What advice would you give to aspiring colourists hoping to follow in your footsteps?

Get your hands on anything you can. Start grading material that’s shot on anything. Learn what the balls on your panels do. How they react under different settings. A large part of our job is problem solving, and the only way you’re going to learn how to solve those problems is to be presented with them. And then be prepared to start at the bottom. I guess like any key crew you don’t just start as a director or editor or colourist. You work your way there.

It took me five years before I finished my first commercial. If you aren’t prepared to work crazy hours doing the less glamorous stuff, then this probably isn’t for you. I’m lucky that I love what I do. If you don’t truly love this and don’t want to dedicate a large portion of your life to it (mixed with all kinds of sacrifices), then walk away now. But if you do love it, I think it will drive you. It’s certainly driven me.

Rob Legato uses Blackmagic Design to create virtual production for The Lion King

Visual effects supervisor Rob Legato used a wide variety of Blackmagic Design products to create the virtual production environment for the live-action version of Disney’s The Lion King. The film, which was released in July this year, was directed by Jon Favreau and features the voices of Donald Glover and Beyoncé Knowles-Carter.

With the technology available today, producing a 3D animated feature film doesn’t have to be a process of waiting for test animations from an animation team. Visual effects supervisor Rob Legato, an Academy Award winner for films such as Hugo and The Jungle Book, wanted to take the technology to a new level, and create a space where traditional filmmakers could work in a digital environment, using the familiar tools found on live action sets. “The goal wasn’t to generate each shot in the computer,” said Legato, “but to photograph the digital environment as if it were a real set.”

Bringing beloved characters back to the big screen in a whole new way, the story journeys to the African savanna where a future king must overcome betrayal and tragedy to assume his rightful place on Pride Rock. Like the original 1994 movie from Disney Animation, which was for its time an amazing accomplishment in 2D animation, the 2019 version pushed the abilities of modern technology once more, this time utilising advanced computer graphics to create a never before seen photorealistic style. But beyond the final look, the project embraced new technology throughout, including during production, utilising a cutting edge virtual environment.

The production stage where The Lion King was shot might look strange, with unusual devices filling the main floor and an array of technicians behind computers around the perimeter, but these were just the bones of the process. To begin shooting, director Jon Favreau and cinematographer Caleb Deschanel wore headsets that placed them in the virtual world of Mufasa and Simba.

Rather than forcing the filmmakers to adapt to digital tools, Legato modified physical filmmaking devices to work within the virtual world. A crane unit was fitted with tracking devices so that its motion could be recreated precisely in the computer. Even a Steadicam was brought in, allowing Deschanel to move the camera virtually with the same tools as a live action shoot. The goal was to let production create in a traditional way, using standard tools that existed not just on a stage but in the computer. “In traditional previs you would move the camera entirely within the computer,” said Legato. “But in our virtual environment, we literally laid down dolly track on the stage, and it was represented accurately on the digital set.”

Blackmagic Design was not simply a part of the system, but the backbone of the process, providing the infrastructure for the virtual world as well as the studio as a whole. “We used Blackmagic products first as video routing for the entire building,” said visual effects producer Matt Rubin, “and at every stage of handling video: capturing footage shot by the team using DeckLink cards, Micro Studio Camera 4Ks as witness cameras, Teranex standards converters, and various ATEM video switchers such as the ATEM Production Studio 4K and ATEM Television Studio HD.”

Editorial and visual effects were networked together via Smart Videohub routers to give both departments access to the screening room, and to serve as sources for screening shots. During virtual production, as the computers generated the virtual environment, DeckLink capture and playback cards captured the footage, played it through a video network into a control station, and recorded it on HyperDeck Studio Minis.

Blackmagic Design announces RAW 1.5

Blackmagic Design used IBC 2019 as the perfect platform to announce Blackmagic RAW 1.5, a new software update with support for Adobe Premiere Pro and Avid Media Composer, plus Blackmagic RAW Speed Test for Mac, Windows and Linux, so customers can work with their Blackmagic RAW files on a wider range of platforms and editing software. Blackmagic RAW 1.5 is available for download now from the Blackmagic Design website.

The new Blackmagic RAW 1.5 update includes Blackmagic RAW Speed Test, which is now available on Windows and Linux for the first time. Blackmagic RAW Speed Test is a CPU and GPU benchmarking tool for testing how fast a system can decode full resolution Blackmagic RAW frames. Multiple CPU cores and GPUs are automatically detected and used during the test so that customers get accurate and realistic results. Simply select Blackmagic RAW constant bitrate 3:1, 5:1, 8:1 or 12:1 and the desired resolution to perform the test. Results are displayed in an easy to read table that shows how many frames per second the computer can decode at each supported resolution.
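The general shape of a decode benchmark like this can be sketched in a few lines: decode a batch of frames, time the batch, and report frames per second. The `fake_decode` stand-in below is purely illustrative; the real tool decodes actual Blackmagic RAW frames on the CPU and GPU:

```python
# General shape of a decode benchmark (a sketch, not Blackmagic's tool):
# time how long it takes to decode a batch of frames, report frames/sec.
import time

def benchmark(decode_frame, n_frames=240):
    start = time.perf_counter()
    for i in range(n_frames):
        decode_frame(i)                 # stand-in for real RAW decoding
    elapsed = time.perf_counter() - start
    return n_frames / elapsed

# Dummy "decoder" so the sketch runs; a real benchmark would decode
# actual Blackmagic RAW frames at each compression ratio and resolution.
def fake_decode(frame_index):
    return sum(range(1000))             # fixed busywork per frame

fps = benchmark(fake_decode)
print(f"{fps:.0f} frames/sec")
```

Running the same loop per compression ratio and resolution, then tabulating the results, gives the kind of easy-to-read table the tool displays.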

Editors working in Adobe Premiere Pro and Avid Media Composer can now work with Blackmagic RAW files using the free plug-ins found in Blackmagic RAW 1.5. These new plug-ins enable editors to work with Blackmagic RAW directly, so they no longer have to transcode files. That means camera original Blackmagic RAW files can be used throughout the entire workflow. There is no longer a need to create proxy files and conform edits for finishing. These plug-ins bring the quality of RAW in small, modern, GPU and CPU accelerated files that are faster and easier to work with than any other video format.

Best of all, when projects are moved from Premiere Pro or Media Composer into DaVinci Resolve for color correction and finishing, all of the camera RAW metadata and image quality is still there.

“Blackmagic RAW is now available for editors working on all major professional NLEs,” said Grant Petty, Blackmagic Design CEO. “It’s exciting because you can now edit native Blackmagic RAW files in Premiere Pro and Media Composer and then finish them in DaVinci Resolve without needing to create proxy files, all without ever losing quality!”

Blackmagic RAW 1.5 features

  • Includes Blackmagic RAW Speed Test for Mac, Windows and Linux.
  • Adds support for Adobe Premiere Pro and Avid Media Composer.
  • Performance improvements and minor bug fixes.

About Blackmagic Design

Blackmagic Design creates the world’s highest quality video editing products, digital film cameras, color correctors, video converters, video monitoring, routers, live production switchers, disk recorders, waveform monitors and real time film scanners for the feature film, post production and television broadcast industries. For more information, please go to www.blackmagicdesign.com.

A South African Horror: Man Makes a Picture on the making of 8

SCREEN AFRICA EXCLUSIVE:

Despite having only just completed its second feature, the team at Man Makes a Picture is quickly rising to stardom. Their first film, The Recce, won Best Foreign Feature at the Idyllwild Film Festival in California in March and a Silver Remi at WorldFest in Houston in April. Now their second film, 8, has been taken to market at Cannes after being signed by LA-based sales agent Rock Salt Releasing.

Playing on African folklore and mythology, 8 has been described as horrific on a primordial level: it portrays what happens when an old man, fated to collect souls for eternity, seeks atonement after trading his daughter’s soul. The film stars award-winning actor Tsamano Sebe (Of Good Report), Igne Beckmann (Escape Room), Garth Breytenbach (Troy: Fall of a City) and upcoming star, Keita Luna.

To find out more, we caught up with the team behind the film about everything from why they chose the horror genre to the way they designed the cinematography, sound and post on an indie budget.

What made this project – 8 – important to the team at Man Makes a Picture?

Jac Williams, producer: The director, Harold Hölscher, and I have been developing this project for the last two years. It’s a proper South African horror story – a period piece that takes place in the late 1970s. We liked the script and the whole Man Makes a Picture team was very excited to jump on this project. The horror genre seemed like a good route, as we were looking for a project that would be quicker to distribute internationally than The Recce – an Afrikaans film that is much more difficult to sell abroad. I think we made the right choice, seeing that we recently signed a worldwide sales deal with a company in Los Angeles and attended the Cannes film market with them in May.

What cameras did you use on the film?

Jacques van Tonder, technical producer: We worked with BMD URSA Mini 4.6Ks on The Recce and, as far as I was concerned, they were battle-tested in terms of reliability. We were very pleased with their performance then, and our opinion was validated with a nomination at the Camerimage Film Festival in Poland in the Best Cinematography in a Debut Feature category.

Budget is always a concern on independent projects, and with the BMD being really cost-efficient, we were able to have two camera bodies shooting multicam scenes, as well as have a gimbal pre-rigged at times or have a splinter unit to go out and pick up shots. The 4.6K resolution also gave our VFX team good resolution to work with. Considering it is a totally independently-funded film, we had to keep all options on the table for possible distribution and the camera being on the Netflix-approved list was a big selling-point. Through our experience we knew that shooting RAW would give us a lot of flexibility in the grade. These cameras really provide excellent value for money.

Dave Pienaar, director of photography: In the end, the Blackmagic cameras made sense. Two cameras gave security as the shoot was far away from any backup rental house. It was handy to have a second body available for gimbal and second-angle setups. I’m not really a fan of shooting two cameras on character-driven scenes, as I feel the lighting and composition suffer, but on more technical scenes it definitely helped. I felt they handled the shoot well and was quite pleased with the cinematic quality they lent to the picture.

What lenses and rigs did you use on the cameras? What effect did this bring to the final cinematography?

Pienaar: We used the new Sigma Cine range of lenses, which actually complemented the Blackmagic cameras really well. I was worried about the combination of a digital camera and super-sharp lenses. But they somehow seemed nice and organic on the Blackmagic. The lenses have a great range of focal lengths and are nice and fast, which helped with the night scenes. I was able to shoot them pretty close to wide open, and they still kept it together. I quite like to fog the lens a little, as I feel it takes the edge off the super-clean modern lens and adds an emotive, organic quality to the picture.

I also used a long ARRI Alura Zoom which was a lot of fun. Originally I thought it was going to be more of a technical lens instead of a storytelling lens, but I loved using its slow, creeping zooms to build tension.

Could you describe the shooting style that you opted for?

Pienaar: We went with a more classic operating style on this film. Tripod and slider most of the shoot. I felt the ‘grounded’ feel of the camera with long, eerie shots added to the suspense. A gimbal was used on some shots to move the camera without it looking too handheld. I really loved shooting the more intimate character-driven scenes with just one or two characters, as opposed to the larger, more technical scenes with many characters.

Tell us more about how the cameras were rated, and whether a LUT was used during filming?

Pienaar: We acquired in RAW at the lowest compression to get the best out of the cameras. But I was terrified of underexposing the dark, contrasting night material. I rated the cameras at 800 ISO on the day scenes and quite often rated them at 200 ISO on the night shots. I felt that as long as I didn’t overexpose the highlights – a real danger when shooting at 200 ISO – there would be more information in the rushes, as I would effectively be overexposing by two stops.
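The two-stop figure can be checked with a quick sketch: the difference in stops between two ISO ratings is the base-2 logarithm of their ratio (the 800 native rating here is taken from the interview, not a claim about the camera's internals):

```python
# Quick check of the reasoning above: the difference in stops between two
# ISO ratings is log2 of their ratio, so rating an 800-rated sensor at
# 200 ISO exposes it two stops brighter.
import math

def stops_between(native_iso, rated_iso):
    return math.log2(native_iso / rated_iso)

print(stops_between(800, 200))   # 2.0 stops of extra exposure
print(stops_between(800, 800))   # 0.0
```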

van Tonder: The cameras in combination with Resolve software make it really easy to create and load LUTs for custom looks straight out of camera. We ended up with a generic shooting LUT that worked for both day and night scenes. On a tight schedule and with a small crew, it was great to be able to customise the look without additional crew or equipment apart from our DIT setup.

What was the approach to lighting and audio?

Pienaar: I had a lot of fun on this film. I almost exclusively shoot commercials and so quite often feel restricted by client and agency on a commercial job. It was liberating to be able to light it the way I wanted to. Toby Smuts, the gaffer, was great at coming up with suggestions and had an amazing wireless LED DMX system which was a treat on a budget-restricted shoot like this. I loved shooting the Shed Night Interior scenes.

Adriaan Drotsch, sound recordist and audio post supervisor: It was an absolute honour to work on my very first horror film, as sound plays such an important role in this genre. We recorded on a Zaxcom Nomad 10 and used a Zaxcom ZMT3 wireless kit with Sanken COS-11D microphones. The Zaxcom kit comes with a never-clip function, which literally means you can’t overdrive the input. Never-clip helped us get around tricky scenes where a performer is whispering and then, all of a sudden, goes into a frantic scream.

For the on-boom microphone we went with a more vintage feel – a Sennheiser MKH 416, a Sennheiser MKH 816 and, for all the inside scenes, the trusty Sennheiser MKH 50. The biggest challenge on the production was to minimise post ADR recording and get as much on location as possible, with the whole crew spending 24 hours a day on location.

The real fun started in post: we played around with awesome ideas to see which one would give the biggest fright, while still keeping you interested for more. The director was very involved and had specific ideas about what the characters needed to sound like.

The awesome team from Sound and Motion Studios really intensified the film through sound and did a brilliant job creating it. 

How did you approach the post production for 8?

van Tonder: We started our post process with a DaVinci Resolve-based lab for dailies and offline processing. We had daily rushes viewing in the lab. It is a great combination, working with the URSA Minis and Resolve. With our setup, we could view rushes in the native RAW format with LUT applied and retain full image quality while working on looks. Our editor could then begin assembly and QC within a few hours of the scene being shot.

Jacques le Roux, editor & Harold Hölscher, director: Editing was done in Premiere Pro with the camera originals being transcoded to ProRes proxy at HD resolution. We spent a lot of time shaping the performances so that the characters were believable, and ensured we always got the pacing and suspense just right. Next, we removed redundant scenes or parts of scenes: sometimes what works great in the script is not necessarily reflected on screen. We wanted to keep the suspense and horror elements as real as possible, so we did most of the effects techniques in-camera, with minimal SFX in mind. We had to focus more on classical cutting to convey emotion and fear. The style of the film is very romantic and old school, so the editing and post production had to keep that style intact.

Artificially Intelligent Media

SCREEN AFRICA EXCLUSIVE:

There is no doubt that artificial intelligence (AI) will touch every aspect of business across all industries in the years ahead. In broadcasting and media, it is already having a profound effect. The technology is widely used to analyse and understand video content, speeding up processes like searching and logging, for example. AI is now developing into an intelligent video creation tool, able to film and edit complete productions thanks to machine learning algorithms.

Media, in general, holds large amounts of unstructured data that requires human effort to understand. Tasks like content management, processing, interpretation and quality checking all take a lot of time and effort. However, current AI and machine learning (ML) algorithms have reached a level of accuracy close to human capability, which means many labour-intensive processes can now be handed over to AI.

All major cloud providers are offering varying forms of AI to assist with post-production. From shot logging and speech-to-text, to scene and object identification, AI augments human logging, providing richer metadata for each scene and shot. Some post-production software integrates directly with cloud AI for a seamless in-application experience.

Over the past few months, most major post-production edit platforms have incorporated some form of AI. Blackmagic Design’s DaVinci Resolve, for example, introduced the DaVinci Neural Engine, which uses deep neural networks, machine learning and artificial intelligence to power new features like speed warp motion estimation for retiming, super scale for up-scaling footage, and auto colour and colour matching, and to take on repetitive, time-consuming tasks such as sorting clips into bins based on who appears in each shot.

Avid’s new AI tools are available through Avid | AI, which is also part of the Avid | On Demand cloud services. Avid | AI is a set of cloud services (a combination of Avid-developed tools and tools from Microsoft Cognitive Services) that utilise machine learning, including facial and scene recognition and text and audio analysis. Also released recently was Avid | Transformation, a new suite of automated services including auto-transcoding, watermarking and content repackaging for delivery to any device, anywhere.

Adobe has also updated its video editing applications with useful new features for both After Effects and Premiere Pro users, and some really cool Adobe Sensei AI integration specifically for Premiere Pro. First and foremost, the new Colour Match feature leverages the Adobe Sensei AI to automatically apply the colour grade of one shot to another. This feature comes complete with Face Detection, so Premiere can match skin tones where necessary, and a new split-view allows you to see the results of your colour grade as you go – either as an interactive slider, or as a side-by-side comparison.

In addition to Colour Match and Split View, Adobe has used its Sensei AI to make some audio improvements as well. Autoducking automatically turns down your music when dialogue or sound effects are present, generating keyframes right on the audio track so you can easily override the automatic ducking, or simply adjust individual keyframes as needed.
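Conceptually, that ducking pass reduces to keyframe generation: detect where dialogue is active and drop the music gain across those spans. Here is a minimal Python sketch of that idea (illustrative only; Adobe’s Sensei implementation is not public, and the function name, duck depth and frame rate below are assumptions):

```python
# Illustrative sketch of auto-ducking as keyframe generation. The function
# name, duck depth and frame rate are assumptions; Adobe's Sensei-based
# implementation is not public.

def ducking_keyframes(dialogue_active, duck_db=-12.0, fps=25):
    """Return (time_seconds, gain_db) keyframes for a music track,
    ducking whenever per-frame dialogue activity switches on."""
    keyframes = []
    prev = False
    for frame, active in enumerate(dialogue_active):
        if active != prev:  # dialogue just started or stopped
            keyframes.append((frame / fps, duck_db if active else 0.0))
            prev = active
    return keyframes

# Dialogue detected on frames 2-4 of a short clip at 25 fps:
print(ducking_keyframes([False, False, True, True, True, False]))
# -> [(0.08, -12.0), (0.2, 0.0)]
```

Generating keyframes rather than baking the gain into the audio is what lets an editor override or nudge individual points afterwards, as the feature allows.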

Adobe After Effects, meanwhile, rolled out a new feature that can automatically remove objects from a video. While Adobe Photoshop has long offered a tool that can conceal areas of a still image with a camouflage fill, the software giant said the ability to do so across multiple frames was made possible by improvements to its machine learning platform, Adobe Sensei. The feature is the latest example of how artificial intelligence is transforming the video production process, making professional content quicker and easier to produce at scale. The new tool is able to track a discrete object across a given clip, remove it and fill the space it occupied with pixels that blend with the surrounding imagery. Adobe suggests that it can be used for anything from removing anachronistic giveaways within a period piece to erasing a stray boom mic.

AI has suddenly become one of the most important technologies and the most in-demand tool for the video creation market owing to its ability to sense, reason, act and adapt. The general popularity of automation (in various business practices) is another contributing factor. But do we think that AI will ever replace human input?

There are many applications that are starting to hint that it is possible. An early example has to be GoPro’s QuickStories, a quirky piece of software that copies the latest footage from your camera to your phone and – using advanced algorithms – automatically edits it into an awesome video. Another intriguing piece of kit is SOLOSHOT3. Described as ‘your robot cameraman’, SOLOSHOT3 is a 4K camera on a tripod that automatically tracks a subject wearing a tag, keeping them perfectly in frame and in focus whilst recording the action. SOLOSHOT3 can quickly produce an edited and shareable video of highlights using its automated editing tools and post the video online – with no human intervention required.

The BBC’s Research and Development arm has been experimenting with how machine learning and AI could be used both to automate live production and to search the broadcaster’s vast archives. These experiments resulted in a documentary, screened late last year, made entirely by algorithms – and while it wasn’t the best bit of television ever made, it was a pioneering achievement from a machine learning perspective.

In Tel Aviv, Israel, a company called Minute has developed a deep learning AI video optimisation tool that automatically generates highlights from full-length videos. Minute’s AI-powered deep learning technology analyses video content to identify peak moments, allowing the system to automatically generate teasers from any video content with simple, seamless integration. Whilst pessimists claim this kind of application could one day replace humans altogether, the developers at Minute believe that their technology complements, rather than replaces, content creators and storytellers.
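Stripped of the deep learning, the core idea can be sketched as a windowed peak search: score every window of the video and keep the highest-scoring, non-overlapping ones. A toy Python version, using audio energy as a stand-in "peak moment" signal (Minute’s actual model is proprietary and far richer, so every name and number here is illustrative):

```python
# Toy highlight selection: score fixed-length windows and keep the best
# non-overlapping ones. Real systems score with learned models; here we
# simply use summed audio energy as the stand-in signal.

def top_highlights(energy, window=3, count=2):
    """Return sorted start indices of the `count` highest-energy,
    non-overlapping windows of length `window`."""
    scores = [(sum(energy[i:i + window]), i)
              for i in range(len(energy) - window + 1)]
    scores.sort(reverse=True)  # best-scoring windows first
    picked = []
    for _, start in scores:
        if all(abs(start - p) >= window for p in picked):  # skip overlaps
            picked.append(start)
        if len(picked) == count:
            break
    return sorted(picked)

# Per-second loudness of a 12-second clip with two bursts of action:
energy = [1, 1, 9, 8, 7, 1, 1, 1, 6, 7, 8, 1]
print(top_highlights(energy))  # -> [2, 8]
```

The greedy non-overlap check is the simplest possible teaser assembler; the interesting engineering in a product like Minute’s is in producing a good score, not in this selection step.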

Most organisations today are exploring how they can best leverage and embrace these new technologies. This technology is also proving to be a boon for video editors and production teams. It enables professionals to focus more on artistic aspects rather than editing, which is considered a rather boring and mechanical task by many. Learning how AI technologies can help the entire production chain by improving quality and efficiency should benefit everyone. New things shouldn’t frighten us, they should excite. Two decades ago, we were all worried about non-linear editing – and look what happened to that concern!

Around the world with Timeline’s Remote Production

In the world of outdoor broadcast, what do you do when you have an event in South Africa one day, and – let’s say – in the UK the next? It would be a logistical nightmare, not to mention expensive, to ship an OB van with all its kit and a whole production crew between locations in such a short space of time.

We spoke to broadcast services specialist, Timeline TV, about the innovative remote production solution it has designed exactly for this purpose.

Initially devised for live match coverage of the Women’s Super League (WSL) – the top league for women’s football in the United Kingdom – the system has also been used on international events such as the Landmarks Half Marathon, Vitality Big Half Marathon, Formula 1, Sail GP, World Superbikes and various other news and sporting events.

With just the camera operators needing to travel to events, Timeline’s system can be deployed at any venue – or, in the case of the WSL, any football ground in the UK – while delivering all production requirements from one central location.

Timeline’s managing director, Dan McDonnell, explains that the team chose Blackmagic URSA Broadcast cameras for the system: “We pride ourselves on delivering broadcast quality output for our creative partners, and the URSA Broadcast, when paired with our B4 lenses, delivers excellent images both on and off the pitch.”

When it comes to controlling the cameras, Timeline’s IP engineering team had to get creative. “Remote camera racking was essential and required a new approach to data transmission whilst keeping the on-site operation as simple as possible,” explains Dan. “There was nothing else on the market that would afford us the flexibility, quality or control unless we deployed a full OB.”

Built around the Blackmagic 3G-SDI Arduino Shield and the manufacturer’s SDI camera control protocol, Timeline’s engineers developed an H.265-based 4G bonding and IP transmission solution that resides in a backpack. It integrates a transmitter and the camera control interface systems needed for acquisition, with live transmission back to the company’s Ealing Studios headquarters in London, where coverage is produced.

“Essentially, we’ve eliminated the requirement for a full, static router, which normally is in place for a camera control unit to feed through. As well as the high resolution camera signals, an additional low res feed is sent carrying all the necessary camera settings,” Dan explains. “Working over IP channels provides reliable and consistent remote signal feeds from wherever we are around the country, and this, combined with Blackmagic’s open protocol, has given us the flexibility to devise a high-quality remote production solution that meets broadcasters’ strict standards for live sports coverage.”
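Blackmagic publishes the SDI Camera Control Protocol that the Arduino Shield speaks, so the kind of message Timeline tunnels over IP can be sketched. The packing below follows the public document (a four-byte header, then category, parameter, data type and operation, with values in signed 5.11 fixed point and the whole packet padded to a 32-bit boundary); the category and parameter numbers are taken from that document, but this is an illustrative reconstruction in Python, not Timeline’s code:

```python
import struct

def pack_camera_command(destination, category, parameter, values):
    """Pack a 'change configuration' camera control message whose data
    is fixed16 (signed 5.11 fixed point, i.e. value * 2048)."""
    data = b"".join(struct.pack("<h", int(round(v * 2048))) for v in values)
    # category, parameter, data type (128 = fixed16), operation (0 = assign)
    payload = bytes([category, parameter, 128, 0]) + data
    # destination device, command length, command id (0 = change config), reserved
    header = bytes([destination, len(payload), 0, 0])
    packet = header + payload
    packet += b"\x00" * (-len(packet) % 4)  # pad to a 32-bit boundary
    return packet

# Set camera 1's colour-corrector gain (category 8, parameter 2 in the
# published protocol) to unity on all four channels (R, G, B, Y = 1.0):
msg = pack_camera_command(1, 8, 2, [1.0, 1.0, 1.0, 1.0])
print(msg.hex())
```

In Timeline’s rig the equivalent bytes would be generated on the shield via Blackmagic’s Arduino library and embedded in the SDI return feed; sending the compact control data over a low-bitrate IP channel is what makes the backpack approach practical.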

Timeline is continuing to provide all Women’s Super League footage with the remote solution, but its efficiency and technical capabilities have been, and can be, easily transferred to other sporting leagues, tournaments and touring events.

“The ability to control an entire multi camera system via IP means that we can produce comprehensive coverage of high-profile events from anywhere in the world, without having to send a huge amount of staff or equipment out to the venues,” Dan concludes. “Once the operators are on site, it’s simply a matter of switching on the receivers and the cameras, and we’re ready to go.”

Blackmagic Design announces new Teranex Mini SDI to DisplayPort 8K HDR

Blackmagic Design has announced Teranex Mini SDI to DisplayPort 8K HDR, an advanced 8K DisplayPort monitoring solution with dual on-screen scope overlays, HDR, 33 point 3D LUTs and monitor calibration. It has been designed specifically for the professional film and television market, and to take advantage of a new generation of monitors such as the Pro Display XDR. Teranex Mini SDI to DisplayPort 8K HDR will be available in October 2019 from Blackmagic Design resellers worldwide.

Teranex Mini SDI to DisplayPort 8K HDR is an advanced 8K monitoring solution for DisplayPort computer displays or high quality monitors such as the new Pro Display XDR. Unlike basic converters, Teranex Mini SDI to DisplayPort 8K HDR can use third party calibration probes to accurately align connected displays for precise colour. There are even two on-screen scopes, selectable between waveform, parade, vectorscope and histogram displays. Teranex Mini SDI to DisplayPort 8K HDR is perfect for film studios and broadcasters who need professional but affordable colour accurate monitoring. Customers also get an elegant design with a colour LCD for monitoring and control of settings.

The front panel includes controls and a colour display for input video, audio meters and the video standard indicator. The rear panel has Quad Link 12G-SDI for HD, Ultra HD as well as 8K formats. There are two DisplayPort connections for regular computer monitors or USB-C style DisplayPort monitors such as the new Pro Display XDR. The built-in scaler ensures the input video standard is scaled to the native resolution of the connected DisplayPort monitor. Customers can even connect either 2SI or Square Division inputs.

Teranex Mini SDI to DisplayPort 8K HDR has everything for the latest HDR workflows. All that’s required is to connect an HDR-compatible DisplayPort monitor to enable HDR SDI monitoring. Static metadata PQ and Hybrid Log Gamma (HLG) formats in the VPID are handled according to the ST2108-1, ST2084 and ST425 standards. ST425 defines two new bits in the VPID to indicate a transfer characteristic of SDR, HLG or PQ, while the ST2108-1 standard defines how to transport HDR static or dynamic metadata over SDI. There is also support for ST2082-10 for 12G-SDI as well as ST425 for 3G-SDI sources. Both Rec.2020 and Rec.709 colour spaces are supported, along with 100% of the DCI-P3 format.

Two fully independent on-screen scopes are included, so compliance with broadcast standards is easy when doing critical high end work. Scopes are overlaid on screen so customers can customise their position, size and opacity. Customers can select from a range of scopes, including a waveform for displaying the luminance levels of the input signal. The vectorscope display allows customers to see the intensity of colour at 100% SDI reference levels. Customers also get RGB and YUV parade displays, which are ideal for colour correction and checking for illegal levels. The histogram shows the distribution of white to black detail in images, along with highlight or shadow clipping.

Teranex Mini SDI to DisplayPort 8K HDR includes the same high quality 33 point 3D LUTs as used in the film industry. It’s even possible to calibrate the connected display by connecting a third party USB colour probe and Teranex Mini SDI to DisplayPort 8K HDR will analyse the monitor and generate a 3D LUT to correct for colour differences between displays. Two independent 3D LUTs can be loaded and customers can select between them from the front panel.

It’s always difficult to know if a monitor is showing correct colour because every display is slightly different, even between units of the same model. Teranex Mini SDI to DisplayPort 8K HDR solves this problem as it can use a third party USB probe to automatically align the monitor. SpectraCal C6, X-Rite i1 Display Pro or Klein K10-A probes are supported and plug into the front of the converter. The converter takes care of all the work and will automatically generate test signals on the monitor during the calibration process.

All Quad Link 12G-SDI inputs have outputs for looping to other equipment. Plus all HD, Ultra HD and 8K standards are supported, allowing broadcast or film industry use. In 720p, customers get support for 50p, 59.94p and 60p. In 1080i formats, they get 50i, 59.94i and 60i. 1080p, 1080PsF as well as 2160p formats are supported from 23.98 to 60 fps. Customers even get support for 2K and 4K DCI film formats from 23.98p to 60p. 4320p 8K formats are supported at 23.98, 24, 25, 29.97, 50 and 59.94 fps. With 2SI to Square Division conversion built in, an 8K source will be automatically converted for the monitor. Teranex Mini SDI to DisplayPort 8K HDR even handles both Level A and B 3G-SDI plus YUV and RGB SDI formats.

The front panel LCD provides confidence monitoring with both images and accurate audio level meters. There are menus for all functions and it’s easy to scroll through menu “pages” to find settings that need changing. The 3D LUTs can be enabled just by pressing the 1 or 2 buttons. Calibration is also started via the menus and customers simply follow the prompts to calibrate their display. The audio meters can even be switched between VU or PPM ballistics. There are settings for configuring scopes, their on screen location or opacity. Customers can even view and edit network settings.

For rack mounting, customers can simply add a Teranex Mini Rack Shelf. This consists of a metal tray that holds the converter and allows it to be screwed down before the whole shelf is bolted into the rack. Of course Teranex Mini SDI to DisplayPort 8K HDR includes rubber feet that customers can attach to the underside of the converter if customers want to place it on a desktop.

“We are excited to announce the new Teranex Mini SDI to DisplayPort 8K HDR for customers working with the new Pro Display XDR,” said Grant Petty, Blackmagic Design CEO. “It provides advanced HDR and color critical monitoring features such as built in scopes, 33 point 3D LUT support, automatic probe based calibration and native 8K for the latest customer workflows!”

Teranex Mini SDI to DisplayPort 8K HDR Features

  • Includes support for HDR via SDI and DisplayPort.
  • 2 built in scopes live overlaid on monitor.
  • Film industry quality 33 point 3D LUTs.
  • Supports automatic monitor calibration using colour probes.
  • Advanced Quad Link 12G-SDI inputs for 8K.
  • Scales input video to the native monitor resolution.
  • Includes LCD for monitoring and menu settings.
  • Utility software included for Mac and Windows.
  • Supports latest 8K DisplayPort monitors and displays.
  • Can be used on a desktop or rack mounted.

Meet the Editor: Haiko Boldt

Haiko Boldt is a freelance video editor and graphic designer currently living in Namibia, who worked as editor and cinematographer on #LANDoftheBRAVEfilm. He has received a Namibian Film Award for his editing work and is the owner of Thunderboldt Design & Post Production.

#LANDoftheBRAVEfilm, produced in Namibia, traces the journey of policewoman Meisie Willemse: a tough cop with an illustrious career, who hides a dodgy past. As she solves one of the biggest cases of her career, she is forced to face herself; the closer she gets to catching the killer, the more the dark secrets of her past are revealed, ultimately derailing her life.

Screen Africa chatted to Haiko about the interesting story, the equipment used to capture the epic Namibian landscapes and the need to “kill your darlings” in the editing room…

How did you first become involved in working on the film?

My involvement came about through the director, Tim Huebschle, and producer David Benade. We had worked together on short films and various other projects over the years. I was given the opportunity to read one of the early drafts of the script and I enjoyed the story immensely. It is a local story, which I hoped would turn into a local production and would give me the chance to stretch my creative muscles as an editor. Initially, my involvement was only meant to be as editor of the film, but during the pre-production phase Tim and David approached me and asked if I would like to do the cinematography for the film as well. This was an amazing opportunity, and a daunting one, but of course I accepted.

The teasers all look amazing – especially the wide-angle shots. What cameras and lens set-ups were used on the film?

The early teasers were filmed on a Canon 60D with the 18-135mm kit lens. Some of the aerials were done with the Phantom 2 and later with the DJI Inspire One. For principal photography, we used the Sony A7s II with a Metabones adaptor and Canon lenses (EF 16-35mm, EF 24-105mm and the EF 100-400mm), as well as a Sigma Macro lens. We recorded onto the Blackmagic Design 4K Video Assist from the Sony A7s II HDMI out, to be able to capture in 4K (UHD) and Apple ProRes 422 HQ.

The director’s monitor on set was an identical 4K Video Assist. In the beginning of the production, the director’s monitor would get its feed via SDI cable. We did have a wireless video transmission option but could not use it because it only worked with HD and we were shooting in UHD. A bit into the production, however, I found a new solution after the launch of a few products from Blackmagic Design. Among these was the Mini Converter SDI to HDMI 4K. This was used to down-convert the UHD to HD and to connect the wireless video transmitter. Luckily, the device is small and compact and did not add too much weight to the camera rig. The whole setup was powered from a V-Mount battery plate system via 12V and D-Tap cables to the devices. Aerials were filmed with the DJI Inspire One and the DJI Ronin MX gimbal was used for the motion shots.

Can you talk us through your choice to complete full picture post in Resolve? What advantages did this bring?

Working on other projects, we had gotten into a bit of a workflow issue between FCP X and the export and import of the audio through third-party software for audio post.

I always assumed that the colour grade was going to happen in DaVinci Resolve, which led us to choose Resolve for editing as well. Because of the free version, we could install DaVinci Resolve, give it a try and see if it would be the right fit. The advantage was clearly the fact that the entire workflow happens within one ecosystem. It is so easy to switch from edit to audio to colour workflow with a click of the mouse. This became evident when, during the rough cut of the film, the sound designer would come in over the weekends and clean up the audio for the past week’s rough cut. This meant that by the time we had a first full rough cut, we had clean audio, too.

Recording on the 4K Video Assist in ProRes HQ, which is optimised for DaVinci Resolve, also meant no transcoding. It was liberating, as we edited in full resolution – and so what we saw on screen was what we would get out, and it didn’t slow down the computer.

What was your Resolve setup, including panels, hardware, monitoring and more?

The studio setup consisted of an entry-level iMac Pro running DaVinci Resolve and a 16TB G-Tech RAID drive for the media. Connected to this was a Blackmagic UltraStudio 4K. We opted for a BenQ SW271 monitor and the Blackmagic Design Micro Panel for the colour grading. During editing, the audio was taken out of the UltraStudio and sent into a small mixer and out to two Yamaha speakers. We also had a big screen TV for viewing in studio.

Can you describe the brief for the edit? What were you trying to achieve with it?

The brief was to forget what I had seen on location, and craft the edit from the footage that was there. The director and I started to make editing decisions as early on as the storyboarding stage. The edit was quite predetermined in a way, as we had created an extensive storyboard with more than 700 boards. We spent about five weeks on it and had already made decisions on shot sizes and angles, and had specified the shots we would use for specific parts of the dialogue. This helped us immensely on set, but also set the tone for the edit.

The story is really interesting – and the female lead, too. Did you face challenges in telling her story through your work?

‘To kill your darlings’, like the director said, was and is probably always a challenge. We all loved some shots for whatever reason, but for the greater good of the story, some of them didn’t fit or just felt a bit ‘out.’ Letting go of shots like these can only make the film better.

Do you have a favourite scene or sequence? Can you tell us why, and give us an insight into the workflow behind creating it?

In #LANDoftheBRAVEfilm, my favourites are the shots that involved small edit decisions. Omitting certain parts of the story, or even just seconds of footage – small decisions that ultimately strengthened the film. Always remember that sometimes the things you don’t say also have an impact. For a specific example, I really like one of the city skyline shots. We kept the bottom of the shot at normal speed and sped up the top part to get some movement into the clouds, to create the effect that time was passing.
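The skyline effect Haiko describes is, at its core, two time-remapped layers split by a horizontal matte: the lower half of each output frame comes from the normal-speed footage, while the upper half (the sky) is the same footage sampled at a multiple of the frame rate. A toy NumPy sketch of the frame maths (a real grade would feather the seam between the layers; the function and values here are illustrative, not taken from the production):

```python
import numpy as np

def split_speed(frames, sky_rows, speedup=4):
    """frames: (n, h, w) array. The top `sky_rows` rows of each output
    frame run `speedup` times faster than the rest of the image."""
    n = len(frames) // speedup          # output length limited by the fast layer
    out = frames[:n].copy()             # bottom: normal speed
    out[:, :sky_rows] = frames[::speedup][:n, :sky_rows]  # top: sped up
    return out

# Eight dummy 4x4 grayscale frames whose pixel value encodes the frame index:
frames = np.stack([np.full((4, 4), i) for i in range(8)])
out = split_speed(frames, sky_rows=2, speedup=4)
print(out[1, 0, 0], out[1, 3, 0])  # sky shows frame 4, ground shows frame 1
```

Keeping the ground at normal speed anchors the shot, so only the accelerated clouds read as time passing, which is exactly the effect described above.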


Blackmagic Design releases new Teranex Mini SDI to HDMI 8K

Blackmagic Design recently announced Teranex Mini SDI to HDMI 8K HDR, a new advanced 8K HDMI monitoring solution with dual on screen scope overlays, HDR, 33 point 3D LUTs and monitor calibration for professional, colour accurate SDI monitoring on HDMI 8K screens. Teranex Mini SDI to HDMI 8K HDR is now available from Blackmagic Design resellers worldwide.

Teranex Mini SDI to HDMI 8K HDR is an advanced 8K monitoring solution for large screen televisions and video projectors. Unlike basic converters, Teranex Mini can use third party calibration probes to accurately align connected displays for precise colour. There are two on-screen scopes, selectable between waveform, parade, vectorscope and histogram displays. Teranex Mini SDI to HDMI 8K HDR is perfect for film studios and broadcasters who need professional but affordable colour accurate monitoring. Customers also get an elegant design with a colour LCD for monitoring and control of settings.

The front panel feels elegant when placed on a desktop, and includes buttons and a colour display for video monitoring with audio meters and a video standard indicator. The rear panel has Quad Link 12G-SDI for HD, Ultra HD as well as 8K formats. There are also four HDMI outputs, allowing use with 8K televisions that feature quad HDMI inputs, plus a down converter for using 8K sources on Ultra HD or HD televisions. Customers can even convert between 2SI and Square Division automatically.

Teranex Mini SDI to HDMI 8K HDR has everything for the latest HDR workflows. All that’s required is to connect an HDMI display to get HDR SDI monitoring. Static metadata PQ and Hybrid Log Gamma (HLG) formats in the VPID are handled according to the ST2108-1, ST2084 and ST425 standards. ST425 defines two new bits in the VPID to indicate a transfer characteristic of SDR, HLG or PQ, while the ST2108-1 standard defines how to transport HDR static or dynamic metadata over SDI. There is also support for ST2082-10 for 12G-SDI as well as ST425 for 3G-SDI sources. Both Rec.2020 and Rec.709 colour spaces are supported, along with 100% of the DCI-P3 format.

Two fully independent on-screen scopes are included, so compliance with broadcast standards is easy when doing critical high end work. Scopes are overlaid on screen so customers can customise their position, size and opacity. Customers can select from a range of scopes, including a waveform for displaying the luminance levels of the input signal. The vectorscope display allows customers to see the intensity of colour at 100% SDI reference levels. Customers also get RGB and YUV parade displays, which are ideal for colour correction and checking for illegal levels. The histogram shows the distribution of white to black detail in images, along with highlight or shadow clipping.

Teranex Mini SDI to HDMI 8K HDR includes the same high quality 33 point 3D LUTs as used in the film industry. It’s even possible to calibrate the connected display by connecting a third party USB colour probe and Teranex Mini SDI to HDMI 8K HDR will analyse the monitor and generate a 3D LUT to correct for colour differences between displays. Two independent 3D LUTs can be loaded and customers can select between them from the front panel.

It’s always difficult to know if a monitor is showing correct colour because every display is slightly different, even between units of the same model. Teranex Mini SDI to HDMI 8K HDR solves this problem as it can use a third party USB probe to automatically align the monitor. SpectraCal C6, X-Rite i1 Display Pro or Klein K10-A probes are supported and plug into the front of the converter. The converter takes care of all the work and will automatically generate test signals on the monitor during the calibration process.

All Quad Link 12G-SDI inputs have outputs for looping to other equipment. Plus all HD, Ultra HD and 8K standards are supported, allowing broadcast or film industry use. In 720p, customers get support for 50p, 59.94p and 60p. In 1080i formats, customers get 50i, 59.94i and 60i. 1080p, 1080PsF as well as 2160p formats are supported from 23.98 to 60 fps. Customers even get support for 2K and 4K DCI film formats from 23.98p to 60p. 4320p 8K formats are supported at 23.98, 24, 25, 29.97, 50 and 59.94 fps. With 2SI to Square Division conversion built in, an 8K source will be automatically converted for the monitor. Teranex Mini SDI to HDMI 8K HDR even handles both Level A and B 3G-SDI plus YUV and RGB SDI formats.

The front panel LCD provides confidence monitoring with both images and accurate audio level meters. There are menus for all functions and it’s easy to scroll through menu “pages” to find settings that need changing. The 3D LUTs can be enabled just by pressing the 1 or 2 buttons. Calibration is also started via the menus and customers simply follow the prompts to calibrate their display. The audio meters can even be switched between VU or PPM ballistics. HDMI instant lock can be enabled to ensure the HDMI display locks instantly if the input video is interrupted. There are settings for configuring the scopes, their on screen location and opacity. Customers can even view and edit network settings.

For rack mounting customers can simply add a Teranex Mini Rack Shelf. This consists of a metal tray that holds the converter and allows it to be screwed down before the whole shelf is bolted into the rack. Of course Teranex Mini SDI to HDMI 8K HDR includes rubber feet that customers can attach to the underside of the converter if customers want to place it on a desktop.

