
Creating a Character: Of Colour Correction, Creating Smoke and High Speed Punch Effect

The Easter break is over and the final sprint towards the deadline has begun. As I’ve blogged about previously, our project still had to be filmed and edited, which is a major feat given the amount of editing required. But I will come to that in a bit.

Wednesday, 24 April 2019

On Wednesday, we thus filmed the project again, this time with more time at hand (4.5 hours) and an improved storyboard that also contained enough ‘empty’ space in the right places for me to add visual effects in post.

With this reshoot, I was again working as 2nd AC, again making sure that the green screen was safely rigged and evenly lit. I also ensured that each and every take was slated correctly and documented in the production log, to ease and speed up my workflow later in the Editsuite.

This time round, however, we also filmed with some other improvements. One of them was recording in the camera’s ‘film’ dynamic range instead of ‘video’, which gives me a couple more stops to play with in the edit, especially around the green screen.

We furthermore progressed to using the Spydercheckr colour chart for each and every shot containing the green screen, to make the colour correction of these shots easier. This was a result of learning from a mistake I made on the Poppie project: watching the footage of our Invisible Man back, we noticed slight changes in lighting within shots and between shots.

Whilst I later realised that these changes resulted from the light reflections of cars passing by, they were not visible to the naked eye on set. Using the Spydercheckr colour chart to match these shots would have helped to solve this issue at least partially between shots, and might have helped to balance the colour correction a bit better. Hence my intention to use the Spydercheckr with every green screen shot.

After filming our film a second time, I sat down the very same evening, transferred the footage onto the Editshare and watched (and listened) back through all the clips, creating an EDL while also organising my Avid desktop as well as the files themselves. With this preparation done, I was able to start editing the next day.

Thursday, 25 April 2019

Now Thursday began with a different challenge, as I finally needed to settle on the exact order of my workflow (and editing programmes) for post-production.

My Research

Researching the matter of the ’correct workflow’, I quickly realised that there is no single right way, but many different ways to go about it, depending on what exactly you need to get done.

The film editor John Byron Hanby suggested the following order in his blog entries ‘Become a Workflow Master, Part 1: Before the Edit’ and ‘Become a Workflow Master, Part 2: Edit to Final Delivery’, which can be considered the standard procedure:

  1. Media Ingestion and Management (Media Transfer)

  2. Organization (File and Programme Organisation) - organizing the media on your computer to prepare for editing

  3. Importing (File Import into Software)

  4. Editing (quite self-explanatory)

  5. Color Grading (and I would add Colour Correction before that)

  6. Special Effects (Modification of Footage)

  7. Final Delivery (Export in the right format, resolution, and bit-rates)

This is a pretty standard, widely used workflow. For me, it would mean that, with the required programmes at hand, I would start in Avid, move on to DaVinci, then back into Avid, before moving the footage into After Effects and finally returning to Avid for the export.

But because I need to add the background first in order to choose the colour grading for the layer containing our actor, and because I therefore also need the colour chart that I filmed at the beginning of each clip together with the slate before I can create the rough cut, I provisionally decided on the order After Effects, then DaVinci, then Avid.

However, after much deliberation and a bit more research into the matter, I found this guy, who gave a valuable tip regarding the editing workflow with a green screen:

He changed my mind about the workflow. I settled on doing the colour correction in DaVinci first, then the effects work in After Effects, and then the rough cut in Avid. I did this because AE, if it is the last programme in the chain, tends to have issues exporting files correctly.

The same sometimes goes for Avid if the files ’started off’ in Avid, moved through various programmes and were later reintegrated into Avid again. Even when watching out for the correct codecs, these AAF round trips can fail at the last step, consuming a lot of time that we certainly do not have at our disposal.

Thus, I decided to have each clip pass through DaVinci and then through After Effects, first settling on the correct colour correction and then adding the effects, before bringing the clips into Avid to cut them to their proper length.

Furthermore, because of the restricted access to the Editsuite, and because a lot of transfers between editing programmes are not only time-consuming but also increase the risk of a mistake corrupting a required file, I decided to keep the number of transfers as low as possible, cutting out the intermediate returns to Avid suggested by Hanby.

Starting the Post-Production

I then started out by importing all relevant clips into DaVinci and sequencing them in the right order in the Edit Mode, as I have previously described in my blog entry ‘Introduction into DaVinci Resolve’.

I then moved on to the Color Mode, colour correcting all green screen clips with the help of the Spydercheckr colour chart.

For this, I selected the colour chart tool from the Monitor Window and placed it exactly over the relevant area of the image to be matched, before clicking on the colour chart icon in the top left corner of the bottom window and selecting the Datacolor SpyderCheckr as the default chart.

I already summarised this process in the abovementioned blog entry:

“Matching the colours to the colour chart on the clip now entails that DaVinci will try to change the image according to the colour chart depicted. However, since this is not a fool-proof automated process, DaVinci will also give you a percentage of how much each individual colour is off.”
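To make the ‘percentage off’ idea a bit more concrete, here is a minimal sketch of the underlying principle: each sampled chart patch is compared against the chart’s known reference value and the deviation is reported as a percentage. The patch names and RGB values below are made up for illustration, and this is not DaVinci’s actual matching algorithm.

```python
# Minimal illustration of colour-chart matching: compare each sampled patch
# against its known reference value and report how far off it is.
# The reference and sampled RGB values below are hypothetical, not real
# SpyderCheckr data, and this is not DaVinci Resolve's actual algorithm.

# name: (reference RGB, sampled RGB from the footage), values in 0-255
patches = {
    "neutral grey": ((128, 128, 128), (121, 126, 137)),
    "skin tone":    ((194, 150, 130), (201, 144, 122)),
    "primary blue": ((56, 61, 150),  (49, 66, 158)),
}

def percent_off(reference, sampled):
    """Average absolute deviation per channel, as a percentage of full scale."""
    diffs = [abs(r - s) / 255 * 100 for r, s in zip(reference, sampled)]
    return sum(diffs) / len(diffs)

for name, (ref, got) in patches.items():
    print(f"{name:13s}: {percent_off(ref, got):4.1f}% off")
```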

With this procedure done, only three clips did not match the colour correction result of the rest. Checking my documentation, though, I realised that all relevant clips had been filmed with the same lens (50mm), the same exposure (a half stop between f/2 and f/2.8) and the same lighting (6,500 K).

So the process really isn’t always fool-proof. I therefore tried adjusting the colour correction of these rogue clips by using another reference frame of the Spydercheckr within the same clip.

And after a bit of playing around, two of the three clips did turn out very close to the rest of the footage. It turns out that the angle of the colour chart has an impact, as it can reflect more or less of the lighting and so influence how the software interprets the exposure. Much like a light meter, the colour chart therefore needs to be directed at the lens in a way that catches the relevant light of the scene.

With the third clip, however, this trick did not work properly. No matter which reference frame I took, the clip turned out much darker than the rest, despite having the same exposure. Thus, in order to bring these three clips closer to the rest (and to simplify my green screen work later on), I changed their overall brightness to match, as also described in my previous DaVinci blog entry.

I then moved on to all the clips that we had filmed in the infinity cove. This turned out to be a much bigger feat, as we had not used the Spydercheckr back then. Furthermore, the image was very slightly overexposed and our actor was wearing colours very similar to the background, which made the manual colour correction difficult.

In order to get a bright white, I had to dial down the slightly orange cast of the image and then bring up the blacks (which also gave a bit more structure to some surfaces), as the increased white now washed out a lot of detail.

The correction of the midtones was made more difficult by the fact that our actor has a rather rosy complexion and wore a pale pink T-shirt, making it really difficult to pull up the pinks in his shirt without also pulling up the pinks in his skin.

After two hours of trial-and-error experimentation, I finally found what I considered the optimal balance between the factors at hand and exported these colour correction settings as a LUT. I did this by right-clicking on the relevant image and clicking Generate a 3D LUT, saving it into a folder, adding it to the list of LUTs in the Project Settings menu and selecting it as a default LUT.

After that, and for every other clip shot in the infinity cove, I created another node in the Node Panel, reimported the freshly created LUT as a preset and applied that preset to all the clips. And because the exposure was the same throughout these shots, this procedure worked fine for most of them.
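For anyone wondering what the exported LUT actually contains: a 3D LUT is essentially a lattice of input-to-output colour mappings, and applying it means looking up (and, in practice, interpolating between) the lattice points closest to each pixel’s colour. The sketch below shows the idea with a tiny, made-up two-point lattice and nearest-neighbour lookup; a real LUT such as the one DaVinci generates uses a much denser lattice and trilinear interpolation.

```python
# Conceptual sketch of how a 3D LUT maps colours: a small lattice of
# input -> output RGB values, applied here with simple nearest-neighbour
# lookup. Real LUTs (e.g. a 33-point cube from DaVinci) are far denser and
# interpolated, but the principle is the same. All values are made up.

SIZE = 2  # lattice points per axis (a real LUT would typically use 17, 33 or 65)

# lut[r_index][g_index][b_index] = output RGB in 0.0-1.0
# This toy LUT simply warms the image slightly: it lifts red and lowers blue.
lut = [[[(min(r + 0.05, 1.0), g, max(b - 0.05, 0.0))
         for b in (0.0, 1.0)]
        for g in (0.0, 1.0)]
       for r in (0.0, 1.0)]

def apply_lut(rgb):
    """Map an input colour (0.0-1.0 floats) through the LUT lattice."""
    r_i, g_i, b_i = [round(channel * (SIZE - 1)) for channel in rgb]
    return lut[r_i][g_i][b_i]

print(apply_lut((0.9, 0.2, 0.8)))  # -> nearest lattice point, slightly warmed
```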

After that, I exported all my colour corrected clips as .mov files so that they could be imported into After Effects the next day, as described in my blog entry ‘Introduction into DaVinci Resolve’.

I then watched the exported clips back in a separate player, renaming them to their relevant clip names for organisational purposes. I deliberately did not review them in DaVinci, because I wanted to make sure that the export itself had worked without any issues.

However, as it turned out, I will have to go back and adjust three clips from the Character Creation/Infinity Cove scene, as my LUT did not work out perfectly for all of these shots: three of the clips had a rather pink tinge after exporting, and one of them was also slightly overexposed.

Today, Friday, 26 April 2019

The Plan

On today’s schedule in the Editsuite stood the digital preparation of the green screen clips – as described in my blog entry ‘Let me Introduce You to… Adobe After Effects and Green Screens’ – as well as embedding the relevant video backgrounds, followed by the proper colour grading. Furthermore, I aimed to get the Dragon Ball Effect done and to add the visual nametag at the end, for when our character Tyler is asked for his name.

The Lag

An ambitious plan, which was further squeezed by the fact that our entire Editsuite was lagging again: no video file – neither in DaVinci nor in After Effects – could be played at its original speed, not even with the playback quality reduced to a quarter. It went so far that not only the correct timing of effects, but even a simple playback in the VLC media player was all but impossible.

Whilst I was able to export 20 colour corrected clips from DaVinci in roughly 23 minutes yesterday, today the programme took 110 minutes to export the last three clips that I had colour corrected again. Since every editing machine but one in the Editsuite was affected by this lag, across three programmes, we asked our technician demonstrator to have a look at the issue.

After some time spent freeing up additional space and deleting now-unnecessary temp files from the shared drives, the lag abated somewhat. It was not yet perfect, but it was more workable. Still, the issue delayed my work by roughly three hours, which I immediately felt in my schedule.

Missing Video and Sound Files

Another problem that squeezed this schedule even more was the fact that the background clips I was supposed to chroma-key, as well as the sound effects I required, had not been provided yet. While waiting for the background clips to be delivered ASAP, I immediately started preparing the relevant scenes for chroma keying using the Keylight 1.2 effect and the technique I described in the blog entry mentioned above. After a short while, I had these shots fully prepared and just had to wait for the relevant backgrounds to be put in.
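Keylight itself is a fairly sophisticated keyer, but the basic principle behind any green screen key can be sketched in a few lines: the more a pixel’s green channel dominates its red and blue channels, the more transparent that pixel becomes. The sketch below is only a conceptual illustration of that principle, with arbitrary threshold values; it is not how Keylight 1.2 works internally.

```python
# Conceptual green-screen key: derive an alpha matte from how strongly the
# green channel dominates red and blue in each pixel. This is a toy
# illustration of the idea behind chroma keying, not Keylight 1.2's
# actual algorithm; the threshold values are arbitrary.
import numpy as np

def simple_green_key(frame, low=0.1, high=0.4):
    """frame: float array of shape (H, W, 3) with values in 0.0-1.0.
    Returns an alpha matte of shape (H, W): 1.0 = keep, 0.0 = fully keyed out."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    green_dominance = g - np.maximum(r, b)   # how 'green screen'-like a pixel is
    alpha = 1.0 - (green_dominance - low) / (high - low)
    return np.clip(alpha, 0.0, 1.0)

# Tiny 1x2 'frame': a saturated green-screen pixel next to a skin-tone pixel.
frame = np.array([[[0.1, 0.8, 0.1], [0.8, 0.6, 0.5]]])
print(simple_green_key(frame))   # green pixel -> 0.0, skin tone -> 1.0
```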

Overhauling the Three Rogue Clips

Thus, in the meantime, I started overhauling the three clips that I had not managed to finish in time yesterday. For the bits that were slightly off, I decided to manually change the overall brightness of the clips until they matched the rest of the footage.

The Dragon Ball Effect

Then I moved on to preparing the High Speed Punch or Dragon Ball Effect scene according to a tutorial that I found a while back:

This was especially challenging, since the lag (and After Effects’ general RAM requirements) made proper, reliable timing really difficult. It required a lot of checks on the timing, without my ever being fully convinced that it would still be correct once rendered.

With this effect, you would essentially start out by filming an actor air punching the camera for roughly a minute. In post, you would then place a smoke effect around the extended fist at the peak of every punch.

After that, you would collate the clip and all its individual smoke effects into one composition and speed it up using the Timewarp effect, before adding some blur to suggest motion blur from the speed of the punches. To create the unnatural anime look known from the Dragon Ball films, you would then exempt the head from this motion blur by masking it out.

And tadaa! You would get an effect like in this video:

Creating My Own Smoke Effects

However, in order for this effect to work, I first needed access to a small variety of smoke effects. Since our version of After Effects does not come with its own smoke effect, I quickly looked online to see whether I could find a couple of free ones that would be suitable in terms of their style and – more importantly – their copyright.

As this was not the case, I decided to create my own smoke effect in After Effects and researched some tutorials. From the wealth of information out there, I found this tutorial to be the most comprehensive and concise with regard to the variety of its effect and our intended purpose:

Following the steps in the tutorial, I added a Particle Playground effect to a solid layer in my composition and tweaked the values for Barrel Radius, Particles Per Second, Direction, Velocity and Particle Radius to shape the puff of smoke to my liking.

Since the tutorial’s effect is intended to recreate gun smoke, some of the values had to be changed to imitate a normal puff of smoke. I furthermore decided to add a Camera Lens Blur and a Fast Box Blur to make the smoke appear more ethereal.
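To give a rough feel for what those controls do, here is a tiny stand-alone particle sketch whose parameters are named after their approximate Particle Playground counterparts. The names, values and the simulation itself are my own simplified illustration, not the plugin’s code: the barrel radius spreads the spawn positions, the particles-per-second value sets the density, and the velocity with a per-particle random variation gives the puff its billow.

```python
# Toy particle emitter illustrating what the Particle Playground controls
# roughly correspond to. Parameter names mirror the plugin's controls, but
# the values and the simulation itself are made up for illustration only.
import random

barrel_radius = 10.0        # spread of the spawn positions (pixels)
particles_per_second = 60   # emission density
direction_deg = 90.0        # emission direction; unused in this simplified sketch
velocity = 120.0            # average initial speed (pixels/second)
particle_radius = 4.0       # drawn size of each particle

def spawn_puff(duration=0.5):
    """Spawn one puff's worth of particles with randomised position and speed."""
    count = int(particles_per_second * duration)
    particles = []
    for _ in range(count):
        offset = random.uniform(-barrel_radius, barrel_radius)
        speed = velocity * random.uniform(0.5, 1.5)   # variation makes it billow
        particles.append({"x_offset": offset, "speed": speed,
                          "radius": particle_radius})
    return particles

puff = spawn_puff()
print(f"{len(puff)} particles, e.g. {puff[0]}")
```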

I then decided to settle on grey instead of the white suggested by the tutorial, as I felt it looked more naturally like smoke with the way I had designed my first puff. Duplicating this layer/effect, I tweaked the properties of each copy a bit more to create a variety of smoke puffs and started applying them to the peak of every punch.

However, given the lag and After Effects’ rendering issues, I quickly realised that, in order to make the puffs work, I would often have to insert them just slightly before the punch hits its peak. This, I realised, is because the puffs do not really depict smoke in the classic sense, but imitate compressed air that builds up before the punch is released, indicating a supersonic punch.

Man, these anime artists have imagination!
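Since the puffs are placed at the original speed and the speed-up only comes afterwards, the timing of each puff in the final composition boils down to simple arithmetic: its original timestamp divided by the speed factor. Here is a small sketch of that relationship; the punch times, the 4x speed factor and the small lead-in before each peak are made-up example values, not anything taken from the actual project or exposed by After Effects.

```python
# Timing sketch for the sped-up punch scene: smoke puffs are placed at the
# peak of each punch in the original-speed clip, then the whole composition
# is time-warped. Punch times and the 4x speed factor are made-up examples.

speed_factor = 4.0          # e.g. a Timewarp speed-up of 400%
lead_in = 0.05              # place each puff slightly BEFORE the peak (seconds),
                            # so it reads as air compressing ahead of the punch

punch_peaks = [1.2, 2.8, 4.1, 5.9]   # seconds in the original-speed clip

for peak in punch_peaks:
    puff_start = peak - lead_in                  # where the puff layer begins
    final_time = puff_start / speed_factor       # where it lands after the speed-up
    print(f"peak {peak:4.1f}s -> puff at {puff_start:4.2f}s, "
          f"appears at {final_time:4.2f}s in the sped-up comp")
```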

With this effect created, I managed to apply my own puffs to the first 23 seconds of our one-minute clip before I gained access to the background videos. Whilst there are still two thirds of the clip to do, I am already looking forward to the result.

Enter: Issues with the Video Backgrounds

When our director/producer returned with the relevant video files for the backgrounds, I paused my work on the Dragon Ball Effect scene and immediately started incorporating these files into their respective compositions. As I did so, however, I quickly realised that all backgrounds but one had been provided in the wrong file format, resulting in a highly pixelated look once I enlarged them to the target resolution of 1080p.

Since I could not use these files, I asked our director/producer for versions in the correct resolution, which he said he could only provide by Monday, as he could not access the files from uni. This, however, might have massive implications for applying multi-layered green screens within the time restrictions at hand, as masking out several background layers and adding them to the composition takes time. In the meantime, I was restricted to experimenting with the only background that worked so far: the post-apocalyptic one.

Thus, I started colour grading the front layer containing our actor by adding a Lumetri Color effect and tweaking the values for saturation as well as for the highlights, shadows and midtones.

The intermediate clip looked like this:

Whilst the overall tones were now closing in on the colour spectrum of the background, the highlights were still not right: they did not replicate the cool light of street lamps at night, but looked too hard and too red. So I tweaked the colour temperature of the highlights and moved them into the cool blue range:

Which gave the highlights a softer tinge:

Whilst the colour grading is not yet perfect, it is much better than in the previous image. I chose bluish highlights because – judging from the car in the background, whose front is vaguely lit in blue – there seems to be a matching light source intended in the background footage as well. Since the highlights on our actor were rather strong anyway, I used this indication of light as a basis for adjusting their colour temperature to something similar, imitating a lamppost that would – from the audience’s point of view – stand behind the audience itself, slightly illuminating our actor’s face.
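The underlying operation – shifting only the highlights towards blue while leaving the shadows and midtones alone – can be thought of as a luminance-weighted colour shift. The snippet below is a conceptual stand-in for what the Lumetri highlight controls achieve, with arbitrary threshold and shift values; it is not Adobe’s actual implementation.

```python
# Conceptual 'cool the highlights' grade: pixels are shifted towards blue in
# proportion to how far their brightness sits above a threshold. This mimics
# the intent of a highlight colour-temperature adjustment, not Lumetri's
# actual maths; the threshold and shift amounts are arbitrary.
import numpy as np

def cool_highlights(frame, threshold=0.7, shift=0.1):
    """frame: float array (H, W, 3) in 0.0-1.0. Returns the graded frame."""
    # Rec. 709 luma as a simple brightness measure
    luma = 0.2126 * frame[..., 0] + 0.7152 * frame[..., 1] + 0.0722 * frame[..., 2]
    # 0 below the threshold, ramping up to 1 for the brightest pixels
    weight = np.clip((luma - threshold) / (1.0 - threshold), 0.0, 1.0)
    graded = frame.copy()
    graded[..., 0] -= shift * weight   # pull red down in the highlights
    graded[..., 2] += shift * weight   # push blue up in the highlights
    return np.clip(graded, 0.0, 1.0)

# One bright, slightly red highlight pixel and one midtone pixel.
frame = np.array([[[0.95, 0.85, 0.75], [0.4, 0.35, 0.3]]])
print(cool_highlights(frame))   # only the bright pixel shifts towards blue
```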

Legal Restrictions Popping Up

When I asked him about the copyright clearance, I was informed that this had all been checked and cleared for our purposes and could also be traced in the documentation he provided. Not wanting to lose more time on the booked editing machine, I immediately started incorporating the background into its relevant composition and began adjusting our filmed clips in terms of composition (rule of thirds) as well as colour grading, deciding to take care of the legal documentation (attributing the creator(s)) later at home.

However, it immediately turned out that aligning our actor screen right – according to the rule of thirds – was impossible for that shot, as he repeatedly dipped out of frame on the right, visibly cutting off his arm. Because of this, I had to place him back where he originally was in the frame, which more or less worked with the background provided, although he ended up further right in the frame than I would have liked.

Another issue with the background then appeared: a watermark that had been inserted at the bottom of the video. This perplexed me, as I had previously been informed that the footage was cleared for our purposes. I double-checked the copyright with our producer and he confirmed again that he had checked the case and that we were free to go. However, since the watermark sat at the very bottom of the image – and we had collectively decided early on to letterbox our project – it would not be visible in the finished project.

Later, while typing up this blog entry, the issue still bothered me, so I tried to retrace the copyright of this particular video myself. And indeed: after a quick search, I found that this background was not free to use at all, as it belonged to the computer game Homefront Revolution and had been taken from an ASMR video on YouTube:

With this information in hand, I contacted our producer/director, informed him of the update regarding the copyright and asked him to find an alternative that would have to be ready by Monday morning, when I intend to incorporate and adjust the background videos for good.

My Learning of This Week

  1. Using a Spydercheckr is a lifesaver and a timesaver! It saves so much time that would otherwise go into trial-and-error experimentation and has a roughly 90% chance of immediate success without any tweaks. Note: this is also helpful for shots without green screen elements but with subtly changing lighting (such as filming outside).

  2. Even if you feel stressed and pressured, dedicate just a couple of minutes to researching/retracing the copyright of any file you have been given, especially if you feel that something might be off! Had I done this immediately after I felt doubts about the copyright, I would have saved 1.5 hours of editing time.

  3. Ensure, double-check and even triple-check that special video and audio files have been handed to you in time, and remind the relevant people in good time. Not having the files at your disposal while you are sitting at a booked-out Editsuite machine results in an unnecessary loss of time.

  4. I now know how to create smoke in After Effects.

  5. Editing and post-production take ten times longer than you planned for.

References:

Abdullah Yameen. (2018) After Effects Tutorial - How To Make Smoke [online] https://www.youtube.com/watch?v=EfyWOlKqksU [Accessed on 26 April 2019]

Avid Community. (2015) Avid-After Effects Workflow? [online] https://community.avid.com/forums/p/135197/768767.aspx [Accessed on 14 April 2019]

Clinttill. (2016) How to Export an Avid Sequence for Use in After Effects [online] http://clinttill.net/blog/2016/2/26/how-to-export-an-avid-sequence-for-use-in-after-effects [Accessed on 14 April 2019]

Cristi Rosiek. (2014) After Effects Tutorial - How To Make Smoke [online] https://www.youtube.com/watch?v=UoNFREV6oXY [Accessed on 26 April 2019]

Eurogamer. (2014) Super Smash Bros. – Mii Character Announce Trailer – E3 2014 – Eurogamer [online] https://www.youtube.com/watch?v=YdDYoCU2kv0&frags=pl%2Cwn [Accessed on 3 March]

Film Learnin. (2014) Film Learnin: Anime speed fighting effect! [online] https://www.youtube.com/watch?v=Y_5q0Sln8TU&list=PLRG4t0YYtkIzejIqbZ-qKOLjQchEri9ev&index=120&t=1s [Accessed on ]

Hanby, J. (2017) Become a Workflow Master, Part 1: Before the Edit [online] https://nofilmschool.com/2017/08/become-a-post-production-workflow-master-part-one [Accessed on 14 April 2019]

Hanby, J. (2017) Become a Workflow Master, Part 2: Edit to Final Delivery [online] https://nofilmschool.com/2017/08/become-a-post-production-workflow-master-part-two [Accessed on 14 April 2019]

Hartle, S. (2018) Introduction into DaVinci Resolve. 19 October 2018. SveaExMachina’s Blog [online] https://sveahartle.wixsite.com/sveaexmachina/single-post/2018/10/19/Introduction-into-DaVinci-Resolve?fbclid=IwAR3CMMuNz1gGvpXgmtnkPMG9yPzYrsebzWR5jRcKQegjCqDtRVRCgTJyd8k [Accessed on 26 April 2019]

Ignace Aleya. (2017) How To Fake Atmospheric Smoke Animation Effects in Adobe After Effects using Fractal Noise Tutorial [online] https://www.youtube.com/watch?v=6jYqFBehSmE [Accessed on 26 April 2019]

Mendelovich, J. (2017) 13 Essentials Steps to Supercharge Your Editing Workflow [online] https://www.cinema5d.com/the-methodology-and-psychology-of-editing/ [Accessed on 14 April 2019]

MiesnerMedia. (2015) Chroma Keying Workflow - DaVinci Resolve and After Effects Tutorial [online] https://www.youtube.com/watch?v=ngQyu1UjIL8 [Accessed on 25 April 2019]

Nedomansky, V. (2014) Perfecting the Film Post Production Workflow [online] http://vashivisuals.com/grind-perfecting-post-production-workflow/ [Accessed on 14 April 2019]

Sudhakaran, S. (2013) The Avid to After Effects Workflow [online] https://wolfcrow.com/the-avid-to-after-effects-workflow/ [Accessed on 14 April 2019]

Video Game Ambience Asmr. (2018) Video Game Ambience Asmr - (Homefront Revolution) Rainy Post Apocalyptic Alleyway [online] https://www.youtube.com/watch?v=R14nb32v14E&list=PLRG4t0YYtkIzejIqbZ-qKOLjQchEri9ev&index=15&t=383s [Accessed on 26 April 2019]

Visionary Fire. (2016) Particular SMOKE Tutorial | Ultra Realistic! [online] https://www.youtube.com/watch?v=LDXD6j0760Y [Accessed on 26 April 2019]


