Style Transfer With EbSynth For Compositing

A large part of a modern compositor's job boils down to something called “style transfer”: we are given a paintover, reference image, or some other piece of media, and asked to replicate that look in comp. Sometimes this is a “simple” matter of stylistically colour grading the image, and sometimes it means fundamentally changing every single pixel in the frame. Since every project has unique stylization goals, I doubt there is a single way to do this… however, one of the most versatile tools in our toolbelt is tracking. EbSynth offers a unique example-based approach to tracking, using optical flow to remap and track stylized keyframes onto a moving image. This sounds pretty promising, but how does it work in practice? It turns out: pretty well, but only for short durations and for simple animations.
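
To make the core idea concrete, here is a minimal toy sketch of “tracking a stylized keyframe with optical flow.” This is not EbSynth's actual algorithm (EbSynth does guided patch-based synthesis, which is far more sophisticated); it just illustrates the concept using OpenCV's Farnebäck flow. File paths are hypothetical, and the stylized keyframe is assumed to match the footage resolution.

```python
# Toy illustration: remap a single stylized keyframe onto later frames
# by computing optical flow from each frame back to the first frame.
# NOT EbSynth's algorithm; paths and names are hypothetical.
import cv2
import numpy as np

cap = cv2.VideoCapture("plate.mp4")          # original footage
styled = cv2.imread("keyframe_styled.png")   # stylized paintover of frame 1

ok, first = cap.read()
first_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
h, w = first_gray.shape
grid_x, grid_y = np.meshgrid(np.arange(w, dtype=np.float32),
                             np.arange(h, dtype=np.float32))

frame_idx = 1
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame_idx += 1
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Backward flow: for each pixel in the current frame, where that
    # content was in the keyframe. Accuracy degrades as motion
    # accumulates, which is exactly why longer shots need more keyframes.
    flow = cv2.calcOpticalFlowFarneback(gray, first_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    # Pull stylized pixels along the flow so the style "sticks" to the motion
    warped = cv2.remap(styled, grid_x + flow[..., 0],
                       grid_y + flow[..., 1], cv2.INTER_LINEAR)
    cv2.imwrite(f"styled_{frame_idx:04d}.png", warped)
```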

Below is a breakdown of my test using a single keyframe to re-stylize a shot from Adventure Time in EbSynth.


Results

In this test I made a quick stylized keyframe of what I wanted the end result to look like, then fed that keyframe into EbSynth along with the original video. Since the animation in this shot is fairly simple, and BMO's movement is limited to a 2D plane, the results were quite successful. You can see that the glows, scanlines, and edge breakup, as well as the overall colour gradation, were tracked correctly… however, the shadow on the ground behind BMO had to be hand animated, as EbSynth has no concept of lighting or shadows (yet).
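
The desktop app handles the per-frame synthesis internally, but the open-source EbSynth CLI (github.com/jamriska/ebsynth) exposes roughly the same operation one frame at a time, which is handy if you want to script it into a pipeline. A hedged sketch of what that batch loop might look like, with hypothetical frame paths and a keyframe painted over frame 1:

```python
# Sketch: drive the open-source ebsynth CLI across a frame sequence.
# Assumes frames are pre-extracted as PNGs; all paths are hypothetical.
import subprocess
from pathlib import Path

frames = sorted(Path("plate").glob("frame_*.png"))  # original footage frames
key_src = "plate/frame_0001.png"     # the source frame the keyframe was painted over
key_style = "keyframe_styled.png"    # the hand-painted stylized keyframe

Path("out").mkdir(exist_ok=True)
for frame in frames:
    out = Path("out") / frame.name
    # -style: the painted keyframe
    # -guide: source frame -> target frame pair that drives the synthesis
    subprocess.run([
        "ebsynth",
        "-style", key_style,
        "-guide", key_src, str(frame),
        "-output", str(out),
    ], check=True)
```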

Pros

  • Very fast tracking that fully utilizes the GPU; much faster than Nuke's Smart Vectors

  • Simple interface, well designed

  • Great at 2D planar, surface, and corner-pin style tracking

  • Flexible: anything that can be drawn as a keyframe can be tracked in

Cons

  • Complex animations require many more keyframes, and at a certain point it is faster to just do it the old-fashioned way

  • Artifacts can appear, with no easy way to clean them up inside EbSynth

  • Although the interface is simple, the software might as well be a black box for more advanced users

Conclusion

I tried several tests ranging in complexity, and the one above was the only one that really worked “out of the box.” Even then, I don't think it was much faster than doing it the old-fashioned way with keyframes, trackers, etc… however, the stylization I was applying was very rudimentary, and I suspect a more complex test, like turning BMO into a watercolour painting, would be a lot quicker and easier than previous methods… maybe I will try that in the future. I will also continue comparing this to tools like Nuke's Smart Vectors, as I do think EbSynth is much faster than those and could serve as a valuable alternative there.


For a more in-depth tutorial, I recommend checking out Joel Haver’s video on how he uses EbSynth to create hilarious animations for YouTube!
