Technical research on virtual film production
I plan to build a virtual film production camera rig and set up a game engine so that the two work together smoothly. The goal is a working system for students that won't cost millions.
The camera rig should hold a camera, a screen, and one or more HTC Vive trackers/controllers. This way the camera's pan, tilt, and roll, as well as its movement along the x, y, and z axes, will be tracked and fed into the game engine for realtime video rendering. The screen on the rig should display the output of the computer so that the camera operator can see the 3D-rendered imagery, preferably transmitted wirelessly. Vive controllers could be used for virtual camera zoom or focus.
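To make the tracking idea concrete, here is a minimal sketch of how a tracker pose could be turned into the camera values described above. It assumes an OpenVR-style 3x4 row-major pose matrix (rotation block plus translation column) and a yaw-pitch-roll Euler decomposition; the matrix layout and angle convention are my assumptions, not something a specific SDK dictates here.

```python
import numpy as np

def decompose_pose(mat):
    """Split a 3x4 tracker pose matrix (rotation | translation)
    into position (x, y, z) and pan/tilt/roll in degrees.
    Assumes a row-major OpenVR-style layout, Y-up coordinates,
    and a Y-X-Z (pan-tilt-roll) Euler convention."""
    m = np.asarray(mat, dtype=float)
    position = m[:, 3]          # last column: translation in metres
    r = m[:, :3]                # 3x3 rotation block
    tilt = np.degrees(np.arcsin(np.clip(-r[1, 2], -1.0, 1.0)))
    pan = np.degrees(np.arctan2(r[0, 2], r[2, 2]))
    roll = np.degrees(np.arctan2(r[1, 0], r[1, 1]))
    return position, pan, tilt, roll

# Identity rotation, camera one metre up and two metres back
pose = [[1, 0, 0, 0.0],
        [0, 1, 0, 1.0],
        [0, 0, 1, 2.0]]
pos, pan, tilt, roll = decompose_pose(pose)
```

In a running rig, the engine would call this every frame on the tracker pose and drive the virtual camera with the result.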
Having a green/bluescreen as a background and replacing it in realtime with a virtual 3D-rendered background allows the director, cameraman, and gaffer to reference the surroundings and light in a more intuitive way. Since LIV VR no longer works together with Tilt Brush, I'll dig into the LIV VR SDK (Software Development Kit) for Unity.
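The realtime background replacement boils down to chroma keying: masking out green-dominant pixels and compositing the rendered background behind the live image. The following is a deliberately crude NumPy sketch of that idea (the threshold rule and function name are mine; a production system would do this as a GPU shader with soft edges and spill suppression).

```python
import numpy as np

def chroma_key(frame, background, threshold=40):
    """Replace green-dominant pixels in `frame` with `background`.
    frame, background: HxWx3 uint8 RGB arrays of equal shape.
    A pixel is keyed out when its green channel exceeds both red
    and blue by more than `threshold`."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    mask = (g - r > threshold) & (g - b > threshold)
    out = frame.copy()
    out[mask] = background[mask]   # composite rendered background
    return out

# Tiny 1x2 example: one pure-green pixel, one grey pixel
frame = np.array([[[0, 255, 0], [128, 128, 128]]], dtype=np.uint8)
bg = np.array([[[10, 20, 30], [10, 20, 30]]], dtype=np.uint8)
keyed = chroma_key(frame, bg)
```

Running this per frame on the camera feed, with the background rendered by the game engine from the tracked camera pose, gives the director a composited preview.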
Filming LED (light-emitting diode) screens:
Filming LED screens and using them as light sources has been a development in film production ever since Gravity (2013). This technology has become more sophisticated and is now used for television shows:
Obviously, this technique only makes sense in particular cases. Closer shots with the background out of focus are one of them. I'm very excited to test the DBL screen wall with this technique.