For my Master’s thesis I investigated existing depth-of-field simulation methods for VFX applications, considering both object- and image-space approaches. As part of this work, a prototype Nuke plugin was developed in C++.
Most available image-based depth-of-field simulation approaches do not allow for spatially varying bokeh shapes, so effects like “Cat’s Eye Bokeh” cannot be simulated with these methods. The prototype Nuke plugin adds this functionality by interpolating PSFs.
Principle of Operation
The principle of operation is based on the research by Thomas Hach et al. as presented in “Cinematic Bokeh rendering for real scenes”.
For the prototype, point spread functions (PSFs) were acquired manually with a photo camera for a specific lens and sensor, using a small LED light source at a given focus distance. At fixed distance intervals from the camera, a center and an outer PSF were captured.
The plugin prototype reads these PSFs from disk and interpolates between them to synthesize a PSF for every pixel in an input image, based on that pixel’s distance from the camera. The image below shows such an interpolation result: the PSFs in the green circles were measured, while all others were synthesized through interpolation.
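The interpolation step described above can be sketched as a pixel-wise blend between two measured kernels, followed by renormalization so the synthesized PSF preserves image energy. This is a minimal illustration, not the plugin’s actual code: the `PSF` struct and the `interpolatePSF` function are hypothetical names, and only one interpolation axis is shown, whereas the prototype interpolates along both the depth axis and the radial (center/outer) axis.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// A PSF stored as a flat, row-major kernel (hypothetical representation).
struct PSF {
    int size;                 // kernel is size x size
    std::vector<float> data;  // row-major weights
};

// Normalize the kernel so its weights sum to one (energy preservation).
void normalize(PSF& p) {
    float sum = 0.f;
    for (float v : p.data) sum += v;
    if (sum > 0.f)
        for (float& v : p.data) v /= sum;
}

// Linearly blend two measured PSFs of equal size; t = 0 yields a, t = 1 yields b.
PSF interpolatePSF(const PSF& a, const PSF& b, float t) {
    PSF out{a.size, std::vector<float>(a.data.size())};
    for (std::size_t i = 0; i < a.data.size(); ++i)
        out.data[i] = (1.f - t) * a.data[i] + t * b.data[i];
    normalize(out);
    return out;
}
```

In the prototype, the blend weight would be derived from a pixel’s depth relative to the distances at which the neighboring PSFs were measured.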
The interpolated PSFs are then used as filter kernels for convolution. Two different approaches were tested for this thesis: the gathering method and the spreading method. The image below shows the results of both methods as straightforward implementations. Spatially varying bokeh shapes can be seen in the starry background; these are the result of the PSF interpolation. Both images show severe depth-discontinuity and intensity-leakage artefacts. The 3D models and textures in the test images, showing the ISS and the Kepler observatory, were obtained from NASA’s extensive 3D resource library. Images and the depth pass were rendered with Mental Ray for Autodesk Maya.
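The two convolution strategies mentioned above can be sketched as follows. This is a simplified illustration, not the plugin’s implementation: a single fixed kernel stands in for the per-pixel, depth-driven PSFs, and the `Image`, `gather`, and `spread` names are hypothetical. Gathering weights each output pixel’s neighborhood by the kernel of the *output* pixel; spreading splats each input pixel into the output, weighted by the kernel of the *input* pixel.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Minimal grayscale image (hypothetical stand-in for a Nuke row/tile).
struct Image {
    int w, h;
    std::vector<float> px; // row-major
    float at(int x, int y) const {
        if (x < 0 || y < 0 || x >= w || y >= h) return 0.f;
        return px[y * w + x];
    }
};

// Gathering: each output pixel collects neighbors weighted by the
// kernel associated with the output pixel.
Image gather(const Image& in, const std::vector<float>& k, int r) {
    Image out{in.w, in.h, std::vector<float>(in.px.size(), 0.f)};
    for (int y = 0; y < in.h; ++y)
        for (int x = 0; x < in.w; ++x)
            for (int dy = -r; dy <= r; ++dy)
                for (int dx = -r; dx <= r; ++dx)
                    out.px[y * in.w + x] +=
                        k[(dy + r) * (2 * r + 1) + (dx + r)] * in.at(x + dx, y + dy);
    return out;
}

// Spreading: each input pixel splats its energy into the output,
// weighted by the kernel associated with the input pixel.
Image spread(const Image& in, const std::vector<float>& k, int r) {
    Image out{in.w, in.h, std::vector<float>(in.px.size(), 0.f)};
    for (int y = 0; y < in.h; ++y)
        for (int x = 0; x < in.w; ++x)
            for (int dy = -r; dy <= r; ++dy)
                for (int dx = -r; dx <= r; ++dx) {
                    int ox = x + dx, oy = y + dy;
                    if (ox < 0 || oy < 0 || ox >= in.w || oy >= in.h) continue;
                    out.px[oy * in.w + ox] +=
                        k[(dy + r) * (2 * r + 1) + (dx + r)] * in.at(x, y);
                }
    return out;
}
```

With a spatially uniform, symmetric kernel the two methods produce identical results; their behavior only diverges once the kernel varies per pixel, which is exactly where the depth-discontinuity and intensity-leakage artefacts mentioned above originate.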