Most rendering systems generate images with the entire scene in sharp focus, mimicking a pinhole camera. However, real cameras contain multi-lens assemblies with finite apertures and exhibit different imaging characteristics such as limited depth of field, field distortion, vignetting and spatially varying exposure. In this assignment, you'll extend pbrt with support for a more realistic camera model that accurately simulates these effects. Specifically, we will provide you with specifications of real wide-angle, normal and telephoto lenses, each composed of multiple lens elements. You will build a camera plugin for pbrt that simulates the traversal of light through these lens assemblies onto the film plane of a virtual camera. With this camera simulator, you'll explore the effects of focus, aperture and exposure. Once you have a working camera simulator, you will add simple auto-focus capabilities to your camera.
Before beginning this assignment you should read the paper "A Realistic Camera Model for Computer Graphics" by Kolb, Mitchell, and Hanrahan. You may also want to review parts of Chapter 6 in pbrt.
Starter code and data files for Lab 2 are located here. In addition to source code, this archive contains the pbrt scene files you will render in this assignment, a collection of lens data files (*.dat), and auto-focus zone info files (*.txt).
To integrate the starter code into pbrt, unpack the archive into the
root of your pbrt installation: copy the archive there, then invoke
the following command.
gunzip -c lab2_src.tgz | tar xf -
You'll need to make a few modifications to pbrt before building the realistic camera class. First, add the following virtual method to the Camera class declaration (in core/camera.h):

virtual void AutoFocus(const Scene *scene, const SamplerRenderer *sr,
                       Sample *sample) { }

Also add the following #include before the class declaration:

#include "renderers/samplerrenderer.h"
Finally, in SamplerRenderer::Render() (in renderers/samplerrenderer.cpp), call AutoFocus once the main Sample has been allocated:

Sample *sample = new Sample(sampler, surfaceIntegrator,
                            volumeIntegrator, scene);
camera->AutoFocus(scene, this, sample);
At this point, you should be able to build the code simply by invoking make from the src/ directory. The Makefile provided with pbrt does not need any changes.
In this assignment you will implement the RealisticCamera class defined in cameras/realistic.cpp. The other source files provided in the archive are helper classes that are useful when implementing auto-focus, and are discussed in the auto-focus section below.
The pbrt scene files (in lab2scenes/ ) in this assignment specify that rendering should use the "realistic" camera plugin. The realistic camera accepts a number of parameters from the scene file including the name of a lens data file, the distance between the film plane and the location of the back lens element (the one closest to the film), the diameter of the aperture stop, and the length of the film diagonal (distance from top left corner to bottom right corner of the film). The values of these parameters are passed in to the constructor of the RealisticCamera class. All values are in units of millimeters. For example, a scene file might specify the following camera.
Camera "realistic"
"string specfile" "dgauss.50mm.dat"
"float filmdistance" 36.77
"float aperture_diameter" 17.1
"float filmdiag" 70
The .dat files included with the starter code (also in lab2scenes/) describe camera lenses using the format described in Figure 1 of the Kolb paper. The RealisticCamera class must read and parse the specified lens data file. In pbrt, a camera's viewing direction is the positive z-direction in camera space. Therefore, your camera should be looking directly down the z-axis. The first lens element listed in the file (the lens element closest to the world, and farthest from the film plane) should be located at the origin in camera space with the rest of the lens system and film plane extending in the negative-z direction. Each line in the file contains the following information about one spherical lens interface.
lens_radius z-axis_intercept index_of_refraction aperture
More precisely: lens_radius is the signed radius of the spherical interface; z-axis_intercept positions the interface along the optical axis; index_of_refraction describes the medium on the film side of the interface; and aperture is the diameter of the interface. (These correspond to the columns of the lens tables in Figure 1 of the Kolb paper.)
Note that exactly one of the lines in the data file will have lens_radius = 0. This is the aperture stop of the camera. The maximum size is given by the aperture value on this line. The actual size of the aperture stop is given as a parameter to the realistic camera from the pbrt scene file. Also note that the index of refraction of the world side of the first lens element is 1 (it's air).
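As a concrete sketch of the parsing step, reading one interface per line might look like the following. LensInterface and ParseLensLine are illustrative names, not part of pbrt, and the interpretation of the fields follows the format line above.

```cpp
#include <cassert>
#include <cmath>
#include <cstdio>
#include <string>

// Hypothetical record for one spherical interface from the lens .dat file.
struct LensInterface {
    float radius;      // 0 => this line is the aperture stop
    float zIntercept;  // z-axis intercept (see the format above)
    float n;           // index of refraction on the film side
    float aperture;    // diameter of the interface
};

// Parse one line of the .dat file; returns false for comments/blank lines.
// Field order follows the format line: lens_radius, z-axis_intercept,
// index_of_refraction, aperture.
bool ParseLensLine(const std::string &line, LensInterface *out) {
    const char *s = line.c_str();
    while (*s == ' ' || *s == '\t') ++s;
    if (*s == '#' || *s == '\0') return false;  // comment or empty line
    return std::sscanf(s, "%f %f %f %f", &out->radius, &out->zIntercept,
                       &out->n, &out->aperture) == 4;
}
```

In the real constructor you would call this for each line of the file and accumulate the interfaces into a vector ordered from the world side toward the film.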
Next you'll need to implement the RealisticCamera::GenerateRay function. The GenerateRay function takes a sample position in image space (given by sample.imageX and sample.imageY) as an argument and should return a random ray into the scene. To the rest of pbrt, your camera looks just like any other camera; it just takes a sample position and returns a ray from the camera out into the world. The main steps are roughly:

1. Map the sample position from raster space to a point on the film plane, using the film diagonal and film distance given in the scene file.
2. Choose a sample point on the disk of the back lens element (the element closest to the film) and construct a ray from the film point toward it.
3. Trace that ray through the lens system, intersecting each spherical interface in turn and refracting according to Snell's law; rays blocked by the aperture stop or by an element's housing are terminated.
4. Transform the surviving ray from camera space to world space (using CameraToWorld) and return it, along with an appropriate weight.
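Tracing a ray through the lens stack means intersecting each spherical interface in turn and bending the ray with Snell's law; a ray that undergoes total internal reflection should be discarded. Below is a minimal, self-contained sketch of the refraction step; pbrt's own Vector class and geometry routines would replace this toy Vec3.

```cpp
#include <cassert>
#include <cmath>

// Minimal 3-vector for illustration only (pbrt has its own Vector class).
struct Vec3 { float x, y, z; };

static Vec3 operator*(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Refract unit direction d through a surface with unit normal n (pointing
// toward the incident side), where eta = n_incident / n_transmitted.
// Returns false on total internal reflection, in which case the ray
// should be terminated (returned with weight 0).
bool Refract(Vec3 d, Vec3 n, float eta, Vec3 *t) {
    float cosI = -Dot(d, n);
    float sin2T = eta * eta * (1.f - cosI * cosI);
    if (sin2T > 1.f) return false;  // total internal reflection
    float cosT = std::sqrt(1.f - sin2T);
    *t = eta * d + (eta * cosI - cosT) * n;
    return true;
}
```

At each interface you would compute the normal from the sphere center and hit point, look up the indices of refraction on either side, and call a routine like this; a ray that falls outside the interface's aperture diameter is likewise terminated.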
Note that the above images were rendered with an earlier version of pbrt.
The materials in your rendered images will look a little different since the
provided scene files use different materials.
Also, pbrt will generate warning messages about a scaling in the
camera-to-world transformation. You can safely ignore these messages.
For rays that terminate at the aperture stop, return a ray with a weight of 0 -- pbrt tests for such a case and will terminate the ray instead of sending it out into the scene.
You'll need to submit renderings for each of the four scenes at both 4 and 512 samples per pixel. You should also re-render lab2_telephoto.pbrt with the aperture radius decreased by half. What are the two main effects you expect to see? Does your camera simulation produce this result? By halving the aperture radius, by how many stops have you decreased the resulting photograph's exposure?
You should also submit a writeup which thoroughly describes your camera implementation and answers the following questions.
The auto-focus mechanism in a modern digital camera samples light incident on subregions of the sensor (film) plane and analyzes light in these subregions to compute focus. These regions of the frame are called auto-focus zones (AF zones). For example, an auto-focus algorithm might look for the presence of high frequencies (sharp edges in the image) within an AF zone to determine that the image is in focus. You may have noticed the AF zones in the viewfinder of your own camera. As an example, the AF zones used by the auto-focus system in the Nikon D200 are shown below.
In this part of the assignment, you'll be implementing an
auto-focus algorithm for your RealisticCamera. We will
provide you a scene and a set of AF zones, and you will need to use
these zones to automatically determine the film depth for your camera
so that the scene is in focus. Notice that in
lab2_afdgauss_closeup.pbrt, the camera description contains an
extra parameter af_zones. This parameter specifies the text file that
contains a list of AF zones. Each line in the file defines the bottom
left and top right of a rectangular zone using four floating point
numbers:
xleft xright ytop ybottom

These coordinates are relative to the top left corner of the film (the
numbers will fall between 0.0 and 1.0). For example, a zone spanning
the entire film plane would be given by 0.0 1.0 0.0 1.0. A zone
spanning the top left quadrant of the film is 0.0 0.5 0.0 0.5.
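Converting one of these normalized zones into integer pixel bounds on the film is a small but easy-to-get-wrong step. A hypothetical helper (the names are illustrative, not part of the starter code) might look like:

```cpp
#include <cassert>
#include <cmath>

// Pixel-space bounds of an AF zone: [x0, x1) x [y0, y1).
struct PixelZone { int x0, x1, y0, y1; };

// Map a zone given as fractions of the film (xleft xright ytop ybottom,
// relative to the top-left corner) onto an xres x yres raster.
PixelZone ZoneToPixels(float xl, float xr, float yt, float yb,
                       int xres, int yres) {
    PixelZone p;
    p.x0 = (int)std::floor(xl * xres);
    p.x1 = (int)std::ceil(xr * xres);
    p.y0 = (int)std::floor(yt * yres);
    p.y1 = (int)std::ceil(yb * yres);
    return p;
}
```

Rounding outward (floor/ceil) guarantees the pixel window fully covers the requested zone.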
You will now need to implement the AutoFocus method of the RealisticCamera class. In this method, the camera should modify its film depth so that the scene is in focus.
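At its core, AutoFocus can be structured as a one-dimensional search over candidate film depths. A minimal sketch under the assumption that some scoring function is available; FindBestFilmDepth is an illustrative name, and the score callback stands in for rendering an AF zone at a given depth and measuring its sharpness:

```cpp
#include <cassert>
#include <cmath>
#include <functional>

// Sweep candidate film depths in [lo, hi] (millimeters), score each one,
// and return the depth with the highest score.  A coarse sweep like this
// can be followed by a finer search around the winner.
float FindBestFilmDepth(float lo, float hi, int steps,
                        const std::function<float(float)> &score) {
    float best = lo, bestScore = score(lo);
    for (int i = 1; i <= steps; ++i) {
        float d = lo + (hi - lo) * i / steps;
        float s = score(d);
        if (s > bestScore) { bestScore = s; best = d; }
    }
    return best;
}
```

Since each score evaluation here means tracing many rays, in practice you would keep the number of candidate depths small, or use a smarter bracketing search than a uniform sweep.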
There are many ways to go about implementing this part of the assignment. One approach is to shoot rays from within AF zones on the film plane out into the scene (essentially rendering a small part of the image) and then analyze the subimage to determine if it is in focus. The starter code provided is intended to help you implement auto-focus in this manner; look through the helper classes included in the archive before you begin.
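For the "is it in focus?" analysis, one common choice is the sum of squared finite-difference gradients over the rendered sub-image, which peaks when edges are sharp. A self-contained sketch; both the name and the choice of metric are illustrative, not part of the starter code:

```cpp
#include <cassert>
#include <vector>

// Score the sharpness of a grayscale patch stored row-major as
// width x height luminance values.  Sharper (in-focus) patches produce
// larger horizontal and vertical differences, so they score higher.
double FocusMeasure(const std::vector<float> &img, int width, int height) {
    double score = 0.0;
    for (int y = 0; y < height; ++y)
        for (int x = 0; x + 1 < width; ++x) {
            float dx = img[y * width + x + 1] - img[y * width + x];
            score += dx * dx;
        }
    for (int y = 0; y + 1 < height; ++y)
        for (int x = 0; x < width; ++x) {
            float dy = img[(y + 1) * width + x] - img[y * width + x];
            score += dy * dy;
        }
    return score;
}
```

Rendering the same AF zone at several film depths and picking the depth that maximizes a metric like this is one straightforward way to structure the whole algorithm.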
To test your auto-focusing algorithm, we provide three scenes that require the camera to focus using a single AF zone. The images resulting from proper focusing on lab2_afdgauss_closeup.pbrt, lab2_afdgauss_bg.pbrt, and lab2_aftelephoto.pbrt are shown below (rendered at 512 samples per pixel). The location of the AF zone in each image is shown as a white rectangle.
We have also provided scenes lab2_afspheres.pbrt and lab2_bunnies.pbrt that are constructed so that there is a choice of which object to bring into focus. We have defined multiple auto-focus zones for these scenes. How might you modify your auto-focus algorithm to account for input from multiple zones? Many cameras choose to focus on the closest object they can bring into focus or have "modes" that allow the user to hint at where focus should be set. For example, you might want to add an additional parameter to your camera that designates whether to focus on close up or far away objects in the scene.
Note that the spheres scene uses different material properties, so your images will look a little different. The bunnies scene is unaltered.
Please submit renderings for lab2_afdgauss_closeup.pbrt, lab2_afdgauss_bg.pbrt, and lab2_aftelephoto.pbrt at the film depth computed by your auto-focus routine, with at least 256 samples per pixel.
In addition, please add a description of your auto-focus implementation and the film plane depths your algorithm computed for each of the three scenes to your writeup.
All files should be contained in a single ZIP file. Make sure your code, images, and report are contained in their own separate folders within the ZIP file.
We expect a good student to have to work approximately 20 hours on this assignment.
This assignment will be graded on a 4-point scale:
NOTE: We will NOT (under any circumstance) accept labs that only run under Windows or that have porting problems. If you develop on Windows, you are on your own. There will be absolutely NO exceptions.