Dome light, or BG plate, that is not affected by the camera's depth of field
nasty-command
I notice that when I insert a 3D object into a live-action scene, the plate (in RS, I've been using the BG plate function in the dome light) gets blurred by the camera's depth of field. This doesn't actually work in a helpful way, because the footage was already shot with its own depth of field, and that is what I am trying to match. It seems like we should have the ability to work with a backdrop (and/or other objects) that isn't affected by the camera's DOF.
Is this possible?
Dr. Sassi
This is a general question and difficult to answer. Or the answer is simple: I know of no way to exclude DOF blur for a back plate within the same render session.
Yes, doubling up blur effects is not how it should work. Since I am writing in a forum, let me list some points related to this discussion. You may already be aware of all of them, so bear with me.
I avoid baking background footage or images into the rendering for many reasons:
The lens-distortion workflow would be the first point, as the key is to keep the original footage as it is. Undistorting will limit the quality, though with today's 12K cameras we might revisit that idea.
The footage is always 2D (as opposed to LiDAR or point-cloud projections of it). The image is therefore blurred as a whole, with no depth information that would allow DOF to be resampled, not to mention that the footage might already contain DOF with a specific bokeh aesthetic. (Some use those two terms interchangeably, which I think is unsuitable for this discussion.)
Lights in the footage are all seen (inside the 3D project) at the same distance. Typically that is not a big problem unless they are reflected on a concave or convex surface, or refracted, since distance is the key there as well.
Baked-in footage might later turn out to be a take that was not supposed to be used and needs to be replaced. With a locked-off or motion-controlled camera, that can easily be swapped out in compositing, but if it is baked in, not so quickly.
The footage might be based on filtration, e.g., diffusion, or have other color-grading influences already. How would one match this in 3DCG apps?
Zoom, camera shake, and similar elements are a question of motion tracking and of matching the resulting effects.
I hope this short list is enough to show that a general answer might not be possible and a case-by-case solution will be needed.
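To make the DOF point above concrete: the blur per pixel depends on how far that pixel's subject is from the focal plane, and a flat 2D plate no longer carries that depth. A minimal sketch of the classic thin-lens circle-of-confusion formula (the function name and the sample values are illustrative assumptions, not anything from Redshift):

```python
def circle_of_confusion(focal_mm, f_stop, focus_m, subject_m):
    """Thin-lens circle of confusion (diameter in mm) for a subject
    at subject_m metres when the lens is focused at focus_m metres."""
    aperture = focal_mm / f_stop          # entrance pupil diameter, mm
    focus = focus_m * 1000.0              # metres -> mm
    subject = subject_m * 1000.0
    # CoC = A * f * |S - D| / (S * (D - f))
    return aperture * focal_mm * abs(subject - focus) / (subject * (focus - focal_mm))

# A subject on the focal plane is sharp; moving it off the plane blurs it.
on_plane = circle_of_confusion(50, 2.8, 3.0, 3.0)   # 0.0 mm
behind   = circle_of_confusion(50, 2.8, 3.0, 6.0)   # > 0 mm
```

The point is simply that `subject_m` varies per pixel in a real scene, which is exactly the information a rendered or photographed 2D plate has discarded.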
There might be more, but this is the core. I wanted to explain why I suggest keeping the footage in compositing. The practical camera footage of the back plate should be used only for the affected parts; if so, create a puzzle matte to provide the compositing department with the correct information.
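The idea behind handing the plate to compositing is that the untouched footage sits underneath the CG render, so any DOF it already has stays exactly as shot. A rough sketch of the standard "over" operation, assuming a premultiplied CG layer plus its alpha (the NumPy arrays and names are illustrative, not any particular package's API):

```python
import numpy as np

def comp_over(cg_premult, cg_alpha, plate):
    """Standard 'over': premultiplied CG on top of the camera plate.
    The plate is never resampled, so its DOF and bokeh survive intact."""
    return cg_premult + plate * (1.0 - cg_alpha)

# Toy 1x2-pixel example: left pixel fully CG, right pixel pure plate.
cg    = np.array([[[0.8, 0.2, 0.1], [0.0, 0.0, 0.0]]])
alpha = np.array([[[1.0], [0.0]]])
plate = np.array([[[0.3, 0.3, 0.3], [0.3, 0.3, 0.3]]])
out = comp_over(cg, alpha, plate)  # left = CG colour, right = plate colour
```

A puzzle (ID) matte works the same way per object or material: each matte channel tells compositing which pixels belong to which element, so only those regions are touched.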
Another puzzle would be how the information of a back plate should be treated if it is seen through a nearly transparent ice sculpture. Is it then back plate or 3D object?
How about motion blur? That would be another question. It is not simple to apply DOF blur and motion blur outside the 3D app once everything is rendered as 2D footage. As with any photographic process that leads to 2D results, information about obscured parts is naturally excluded, yet precisely those parts feed into DOF, and motion blur interacts with DOF in much the same way. That is another argument for having the footage inside the render. So, inside or outside? More often than not, I guess, it is both.
I will add this post to a feature request I made, and fingers crossed, the fantastic team at Redshift 3D will find a way; after all, we can already exclude the camera motion blur.
I hope I have provided enough information to show that this is not a simple theme. But thanks for asking and for pointing out that this needs to be addressed in more detail.
Enjoy your weekend