Real-time motion fx experiments
This demo contains some real-time effects reacting to the motion of talented dancers. Even though it shows pre-recorded footage, it would work the same way with any live video, as only RGB data is used as input (no additional sensors were used). These effects mostly rely on motion vectors generated by moving objects. All fx were recorded in real time using TouchDesigner.
I am also sharing the project file in case someone would like to play with it. It is not optimized in any way, as I was just trying to quickly prototype some concepts (consider it a very rough sketch). In my case it ran at about 25 fps (the same rate as the source videos I used), but with careful optimization it should be possible to reach 60 fps.
Proper optical flow detection (like the one by NVIDIA) might improve the results, but for the time being I was happy to experiment with a (somewhat fake) implementation of optical flow in TD. The scene contains GPU particles with a customized emitter that emits particles only from the moving parts of the image and at the same time uses the motion vectors as a particle force.
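To make the idea concrete, here is a minimal conceptual sketch in plain Python/NumPy (not the actual TouchDesigner network, which works with TOPs and GPU particles): approximate per-pixel motion vectors from two consecutive frames, emit particles only where motion exceeds a threshold, and reuse the same vectors as a force on the particles. All function names, thresholds, and constants below are hypothetical, and the motion estimate is a deliberately crude stand-in for real optical flow.

```python
import numpy as np

def motion_vectors(prev_gray, curr_gray):
    """Very rough motion estimate: gradient-weighted frame difference.

    Returns an (H, W, 2) array of approximate motion vectors. This is a
    stand-in for the 'somewhat fake' optical flow, not a proper algorithm.
    """
    diff = curr_gray - prev_gray
    gy, gx = np.gradient(curr_gray)
    mag2 = gx * gx + gy * gy + 1e-6
    # Push along the image gradient, scaled by how much the pixel changed.
    return np.stack((-diff * gx / mag2, -diff * gy / mag2), axis=-1)

def step_particles(pos, vel, flow, motion_threshold=0.05,
                   emit_per_frame=200, force_scale=5.0, dt=1.0 / 25.0):
    """Emit particles from moving pixels and advect them by the motion field."""
    h, w, _ = flow.shape
    speed = np.linalg.norm(flow, axis=-1)

    # Emit only from pixels whose motion magnitude crosses the threshold.
    ys, xs = np.nonzero(speed > motion_threshold)
    if len(xs) > 0:
        idx = np.random.choice(len(xs), size=min(emit_per_frame, len(xs)),
                               replace=False)
        new_pos = np.stack((xs[idx], ys[idx]), axis=-1).astype(np.float64)
        pos = np.vstack((pos, new_pos))
        vel = np.vstack((vel, np.zeros_like(new_pos)))

    # Sample the motion field at each particle and apply it as a force.
    xi = np.clip(pos[:, 0].astype(int), 0, w - 1)
    yi = np.clip(pos[:, 1].astype(int), 0, h - 1)
    vel += force_scale * flow[yi, xi] * dt
    pos += vel * dt
    return pos, vel
```

In the actual project the equivalent logic runs on the GPU, but the two roles of the motion vectors are the same: an emission mask and a force field.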
Thanks go to Otec Mirec for his videos and inspiration.

Dancers:
Otec Mirec
Simon
Kika
Martina
Viktor
DOP:
Barbara Koll