
Mosaic Effect in Fusion

I’ve recorded another video tutorial for your viewing pleasure. It’s just a tiny effect, but the video shows that you can take even little tasks like this a bit further.

https://www.youtube.com/watch?v=NL99jnhOzWk

Rocket Launch Reference Footage

This is an awesome video about an impressive feat of engineering. But I don’t just love the video and its voiceover from a technology-geek point of view. It’s also great reference footage for rocket exhaust and the mayhem it causes. I would love to see more science-fiction authors take this into account when they let spaceships launch and land as effortlessly as a taxi pulling up.

Watch it full-screen and turn on HD 720p!

Here’s another one: the soundtrack of a ride on a space shuttle booster rocket, from launch to its impact in the ocean.

Thanks to my colleague who showed me the aptly named tumblr page FUCK YEAH SPACE EXPLORATION 🙂

Correcting Rolling Shutter in Fusion

eyeon has recently released its Dimension optical flow and stereo disparity toolset. I haven’t had a chance to test the stereo part yet, but since you can now calculate nice and crisp motion vectors, I’ve written a Fuse that corrects rolling shutter artifacts. Here’s a quick demo video:

You can get real-time performance on your GPU using OpenCL, and as usual, I’ve licensed my source code BSD-style if you want to tinker with it.

I haven’t had enough footage from different cameras to test the plugin thoroughly (just my iPhone and a Canon 5D Mark II). I also haven’t tested any motion vector generators other than Dimension. Maybe Twixtor works as well, we’ll see…
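
If you want to play with the idea outside of Fusion, here’s a minimal NumPy sketch of the core warp (my own illustration, not the Fuse’s actual code): every scanline gets pulled back along the motion vectors by a fraction of the frame’s motion that corresponds to its readout delay.

```python
import numpy as np

def correct_rolling_shutter(frame, motion, shutter=0.5, center_fixed=True):
    """Shift each scanline back along its motion vectors.

    frame:        (H, W, 3) image array
    motion:       (H, W, 2) motion vectors in pixels per frame (x, y)
    shutter:      fraction of the frame interval the sensor needs to read out
    center_fixed: keep the center scanline untouched (Syntheyes convention)
    """
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Readout delay of each line, expressed as a fraction of the frame time.
    t = (ys / (h - 1) - (0.5 if center_fixed else 0.0)) * shutter
    # Sample the source image where each pixel "should" have been.
    sx = np.clip(xs - motion[..., 0] * t, 0, w - 1)
    sy = np.clip(ys - motion[..., 1] * t, 0, h - 1)
    # Nearest-neighbor lookup keeps the sketch short; the real Fuse
    # interpolates and runs on the GPU via OpenCL.
    return frame[sy.round().astype(int), sx.round().astype(int)]
```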

Updated 2012-10-09: supports the rolling shutter method used by Syntheyes (center scanline is fixed)

Download the plugin here: RollingShutter_v1_6.Fuse or head over to Vfxpedia.
Photo credits for icon: CC-BY Nayu Kim

Syntheyes Lens Distortion to 3DEqualizer

I crunched a few numbers yesterday, and I think I came up with a solution for converting the lens distortion coefficient calculated by Syntheyes to the one used by 3DEqualizer.

This is necessary since, up until now, Fusion’s native LensDistort tool only supports the latter. Unfortunately, only the first coefficient can be used since 3DEqualizer’s formula differs for the higher-order coefficients. You can download a tool script that does the calculations for you here:

LensDistort for Syntheyes.py

Here’s the math behind it:
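
In sketch form (and purely as an illustration, not the script’s actual math): suppose both solvers used the same one-parameter cubic model u′ = u · (1 + k · r²) and differed only in how they normalize the radius r. Then the coefficient would simply rescale by the squared ratio of the normalization radii:

```python
import math

# Illustration only: if solver A normalizes the radius by R_a and solver B
# by R_b, equating the physical distortion u' = u * (1 + k * (r/R)^2)
# gives  k_b = k_a * (R_b / R_a)^2.
def convert_k(k_a, r_norm_a, r_norm_b):
    return k_a * (r_norm_b / r_norm_a) ** 2

# Placeholder radii: half-height vs. half-diagonal of a 16:9 frame.
half_height = 0.5
half_diagonal = math.hypot(16 / 9 * 0.5, 0.5)
print(convert_k(-0.05, half_height, half_diagonal))
```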

Rendering rec709 in Fusion

This is part of a series of articles dealing with color workflows in Fusion. Other parts:
Linear Gamma Workflow in Fusion Part 1 and Part 2.

I’ve noticed a couple of Google hits for this topic. Maybe you’re interested because Fusion seems to lack a color space selector like Nuke has in its Write node? In Fusion, the workflow boils down to using the Gamut tool, where rec709 is called by its official name, “ITU-R BT.709”. But as with everything in compositing, it makes sense to understand what you’re actually doing, since in the end the “how do I output rec709” question is just a matter of giving your client what he expects.

The rec709 standard encompasses much more than just the gray-scale gamma curve. It also specifies color primaries, the white point and so on, which affect the hue of your pixels as well. But usually, if somebody hands you his footage telling you it’s in “rec709”, he probably means that he exported it from his black-box editing thingy and wants the result to look neither brighter nor darker. So I’m mostly dealing with the proper gamma curve for rec709 in this post, not the colorimetric mumbo jumbo that I’m still figuring out myself and that mostly concerns DOPs or colorists.
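
For reference, the gray-scale curve itself is well defined. Here’s the rec709 encoding function straight from the standard, as a small NumPy sketch (not Fusion’s actual implementation):

```python
import numpy as np

def rec709_encode(lin):
    """ITU-R BT.709 transfer function: scene-linear values -> encoded values."""
    lin = np.asarray(lin, dtype=np.float64)
    # Linear segment near black, power curve above 0.018.
    return np.where(lin < 0.018,
                    4.5 * lin,
                    1.099 * np.power(lin, 0.45) - 0.099)

print(rec709_encode([0.0, 0.018, 0.18, 1.0]))
```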

Option 1:

Your source footage is a mov that is supposedly rec709, and you’re not using a linear gamma workflow. Fusion won’t do anything to footage it loads in or saves out, so your workflow will simply be this:

Your end result will look the same as your source footage (except for the effect you’ve added): rec709 in, rec709 out; sRGB in, sRGB out. If it doesn’t, it might be because you’re writing out movs in different codecs and QuickTime does its own unpredictable color shifting. Use file sequences instead. Also, if you’re comparing Fusion’s output to something else in Nuke, make sure both Read nodes are set to the same color space! By default, image sequences are interpreted as sRGB but movs as something else.

Option 2:

Same as before, but you’re working with linear gamma:

The first Gamut tool also allows you to bring in footage from a different color space (sRGB jpegs from a camera come to mind). This is the proper workflow for compositing and also the one used in Nuke.

Note: Fusion 6.4 added a new display-referred rec709 space that is used by ARRI’s Alexa color science. The regular rec709 color space has now been renamed to ITU-R BT.709 (scene).

Option 3:

Linear CGI to rec709. The first graph is for a linear workflow, the second one for compositing in rec709 directly. The latter isn’t correct, but it’s useful if you’re afraid of this whole LUT thing…

Option 4:

Your source footage is a dpx sequence in a logarithmic format (film, or something new like Alexa Log C). Your client wants rec709 for reviews and editing (regardless of what the final deliverable will be). You really should be working in a linear gamma workflow here:

To convert from logarithmic to linear, you’ll have to set the Loader’s conversion values to those specified by the guy who made the dpx files. If you’ve received Alexa Log C, there’s a special conversion that needs to be done. Fusion 6.4 supports it in the Loader and CineonLog tools. In earlier versions, you have to bypass Fusion’s log-to-lin conversion that happens in the Loader. Check this forum thread for an implementation of the gamma curve that Nuke suggests for Log C, or create your own conversion LUTs on ARRI’s site.
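
If you’re curious what the Log C conversion actually does, here’s a sketch of the decode (log-to-lin) step using ARRI’s published Log C parameters for EI 800. Other exposure indices use different constants, so double-check against ARRI’s white papers before relying on these numbers:

```python
import numpy as np

# ARRI Alexa Log C parameters for EI 800 (from ARRI's published formula).
cut, a, b = 0.010591, 5.555556, 0.052272
c, d = 0.247190, 0.385537
e, f = 5.367655, 0.092809

def logc_to_lin(t):
    """Decode Log C code values (0..1) to scene-linear values."""
    t = np.asarray(t, dtype=np.float64)
    return np.where(t > e * cut + f,
                    (np.power(10.0, (t - d) / c) - b) / a,
                    (t - f) / e)

print(logc_to_lin(0.391))  # 18% gray encodes to roughly 0.391
```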

The result of the Loader will be linear, so a Gamut tool with its output space set to ITU-R BT.709 and “add gamma” enabled results in rec709. Alexa footage will look less saturated than what an editor might see using his ARRI LUT. To fix this, set the Gamut tool’s source color space to Alexa’s wide gamut (requires Fusion 6.4 or later).

Option 4b:

You have also been given a film LUT, which the editor would use to convert logarithmic footage to rec709 on his side. It does more than just a gamma correction, so if you’re not doing the same, he’ll think you screwed up because saturation and highlights look different. This flow assumes that his LUT expects logarithmic input, which it converts to rec709:

Don’t use his LUT to convert your footage to rec709 before linearizing it. You’d lose the dynamic range you could get from the dpx. Instead, guess some settings for the log2lin conversion in the Loader. As long as you use the same values in the CineonLog tool (which goes lin to log), you’re kinda safe.
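
To see why matching values matter, here’s a sketch of a classic Cineon-style log2lin/lin2log pair. The 95/685 black/white points and 0.6 gamma are just the common defaults, standing in for whatever values you settle on; as long as both directions share the same numbers, the round trip gives back exactly what you put in:

```python
import numpy as np

BLACK, WHITE, GAMMA = 95.0, 685.0, 0.6  # common stand-in values

def log2lin(code10):
    """Cineon-style conversion: 10-bit log code values -> linear."""
    floor = 10.0 ** ((BLACK - WHITE) * 0.002 / GAMMA)
    return (10.0 ** ((code10 - WHITE) * 0.002 / GAMMA) - floor) / (1.0 - floor)

def lin2log(lin):
    """Inverse conversion; it must use the same black/white/gamma values."""
    floor = 10.0 ** ((BLACK - WHITE) * 0.002 / GAMMA)
    return WHITE + np.log10(lin * (1.0 - floor) + floor) * GAMMA / 0.002

codes = np.array([95.0, 470.0, 685.0])
print(lin2log(log2lin(codes)))  # round trip: 95, 470, 685 again
```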

Update: Ken Turner has more on how to use LUTs in Fusion.

Linear Gamma in Fusion Part 2

In part one, I wrote about viewer LUTs that aid my linear gamma workflow. This part provides an example of the complete workflow (without CGI) and covers a bit of the basics. For starters, lots of people have already explained the issue of gamma quite nicely, so I’ll keep it short, skip the history and relate things to Fusion.

This article is part of a series of articles dealing with color workflows in Fusion. Other parts:
Linear Gamma Workflow Part 1 and Rendering rec709.

How to set up a linear workflow in Fusion – update: you can also use the “Select View LUT Plugin” comp script to save your LUT settings

Image Editing is Math

Color corrections, image warping… everything. In math, you rely on basic assumptions, for example that 1 + 1 = 2. A more specific example that relates to VFX: if you take two photos with a camera and for the second one you keep the shutter open twice as long, twice the amount of light will be “recorded” and the resulting image will be twice as bright. In other words, every pixel’s value should be twice as high as before. Fusion uses algorithms that expect this to be the case – as does every other image editing and rendering software out there. They expect a pixel that is twice as bright to have twice the RGB values.

So what’s the big deal? Well, the problem is that no image that looks alright inside an image viewer, a web browser, Photoshop, Premiere or Fusion obeys this premise. These files all have their RGB values warped (for reasons going back to the beginnings of computer graphics and CRT monitors) to look normal to our eyes when viewed on a monitor. But when computer programs look at the raw numbers, assuming that twice the RGB means twice the brightness, they are mistaken. Their algorithms end up computing 1 + 1 = 3.

Try it yourself: increase the gain of a photo by a factor of two. All the color values will be twice as high, but the image will look very blown out – not like a photo that has been taken with half the shutter speed (or twice the aperture). This is why you’ve been adding one color correction after another, tweaking curves, creating masks for highlight areas and doing all kinds of nasty stuff to an image. You’ve unknowingly been doing math in a universe where 1 + 1 = 3.
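
A few lines of NumPy make the difference obvious. The sRGB formulas below are the standard ones; they’re only here for the demonstration:

```python
import numpy as np

def srgb_decode(v):
    """Encoded sRGB values -> linear light."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_encode(lin):
    """Linear light -> encoded sRGB values."""
    lin = np.asarray(lin, dtype=np.float64)
    return np.where(lin <= 0.0031308, 12.92 * lin,
                    1.055 * np.power(lin, 1 / 2.4) - 0.055)

pixel = 0.5
print(2 * pixel)                            # naive gain: 1.0, blown out
print(srgb_encode(2 * srgb_decode(pixel)))  # twice the light: about 0.69
```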

Gamma Correction

I haven’t used the word gamma yet. In very simple terms, it’s a measure of how much the color values of an image have been warped, and if you read up on it you’ll quickly end up with figures of curved lines, formulas and jargon. Let’s use the language a compositor would use (but don’t try to impress your TD, colorist or resident math geek with it):

An image like the first one to the right, one that looks good on your monitor by default, i.e. one with warped RGB values, has “gamma”. You need to “remove gamma” or apply a “gamma correction” using a BrightnessContrast or Gamut tool to bring the values back into a domain where the image is “linear”, i.e. where 1 + 1 = 2. Fusion’s algorithms are happy now, your color corrections and defocus effects look much more realistic, but oh my… the image is way too dark. Nevertheless, in this dark image, an object that has twice the RGB values really has emitted twice the amount of light towards the camera. Again, this is simplified since digital still cameras are made to produce visually pleasing jpegs, not physically correct ones.

At the end of your comp, you’ll have to “add gamma” or convert the image back into a “color space” that is suitable for your target audience, who will need to see the image with warped RGB values so it appears correct to human eyes.
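
Stripped of the jargon, that whole dance is just an exponent applied twice in opposite directions. A plain power of 2.2 is the crude version of what the Gamut tool does with the proper sRGB or rec709 formulas:

```python
import numpy as np

encoded = np.array([0.2, 0.5, 0.8])  # an image that "has gamma"

linear = encoded ** 2.2              # "remove gamma": much darker values,
                                     # but now 1 + 1 = 2 holds
# ...compositing happens here, in linear space...
delivery = linear ** (1 / 2.2)       # "add gamma" for the target audience

print(linear)    # [0.029 0.218 0.612] (rounded)
print(delivery)  # matches the input again
```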

Working Linear by Example

The workflow problem that arises is obvious: How do you work on an image if what is being displayed is unnaturally dark? Enter the LUT button. A LUT is a color correction that is applied on the fly, just for the viewers, usually in real time on your GPU. It doesn’t modify the image data of your comp in any way. There are two gamma corrections that make sense for a basic linear workflow: sRGB and ITU-R BT.709 (rec709). Both brighten up a linear image for human consumption and differ slightly in perceived brightness (or rather “gamma”).

You should use the same LUT that was used to linearize your footage. If there are no specifications from your grading or editing department, you can’t go wrong with sRGB – although rec709 makes the image appear a bit darker and will prevent you from crushing the blacks too much. But unless you’re working on a calibrated broadcast monitor and know exactly what your editing or grading department and the broadcast station do to your images, neither LUT will give you certainty about how it will look to other people. If your task is simply to add something to an otherwise unchanged plate, it also doesn’t matter which LUT you’re using, as long as you’ve linearized your footage and you’re working in floating point.

Here’s an example comp that demonstrates a feasible linear workflow. It uses a photograph to show how a defocus produces photorealistic results when applied with linear gamma. It also includes the ViewShaders from my previous post. The next part of this series will discuss CGI and “illegal” operations in linear space.

Space Stallions: Mind = Blown

This one’s for fans of the sci-fi cartoon shows of the ’80s and ’90s: Thundercats + Galaxy Rangers + Saber Rider = Space Stallions!

https://www.youtube.com/watch?v=5Otaq2tmNMM

A bachelor project from the Danish Animation Workshop.