Interactive Image-Based Relighting with Spatially-Varying Lights
SIGGRAPH 2009 Poster
Borom Tunwattanapong     Paul Debevec
USC Institute for Creative Technologies


Novel lighting conditions. Left: Two sets of lighting constraints combined to form a more complicated lighting condition. Middle: The face lit from per-pixel, view-dependent reflection angles. Right: The face lit from the whole sphere of directions at once.

Introduction:

We present an interactive relighting technique in which different areas of the image can be illuminated with combinations of different lighting directions. The approach is to first capture a 4D reflectance field using a light stage, then interpolate user-specified light constraints using radial basis functions, and finally render the resulting illumination in real time on the GPU. The application can simulate impossible but artistically useful lighting conditions, ranging from subtle embellishments for cinematography to non-photorealistic rendering effects as in [Akers et al. 2003]. For example, the user can sculpt a point light (rather than an area light) which illuminates a face from the whole sphere of directions at once, or which lights every pixel from a grazing angle relative to its surface normal (Fig. 2).
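The per-pixel, view-dependent reflection lighting mentioned above can be sketched with the standard mirror-reflection formula r = 2(n·v)n - v. This is a minimal illustration, not the poster's implementation; the function name and the assumption that a per-pixel normal map is available are ours:

```python
import numpy as np

def reflection_directions(normals, view=(0.0, 0.0, 1.0)):
    """Per-pixel mirror-reflection light directions r = 2(n.v)n - v,
    i.e. lighting every pixel from its view-dependent reflection angle.
    normals: (H, W, 3) unit surface normals; view: direction toward camera.
    (Illustrative sketch; the poster does not give this formula explicitly.)"""
    v = np.asarray(view, dtype=float)
    v /= np.linalg.norm(v)
    ndotv = (normals * v).sum(axis=-1, keepdims=True)  # per-pixel n.v
    return 2.0 * ndotv * normals - v
```

A normal facing the camera reflects the view direction straight back, while a normal tilted 45 degrees sends the lookup direction off at 90 degrees, which is what makes this lighting condition impossible to achieve with any single physical light.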

Capturing and storing reflectance data:

We have principally used 4D reflectance fields captured in Light Stage 2 [Hawkins et al. 2004], where a rotating semicircle of strobe lights is synchronized to a Thomson Viper digital motion picture camera. By sequentially strobing lights on the subject from 32 directions in longitude by 15 in latitude, we capture 480 lighting directions of the stationary subject in eight seconds. We load half-HD resolution data (540 x 960) in 8-bit sRGB color space into CPU memory and quarter-HD resolution data (270 x 480) in 8-bit sRGB color space into GPU memory, leaving the original full-HD 10-bit log data on disk.
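As a back-of-envelope check of these storage choices (assuming 8 bits per channel and no compression; the on-disk original is 10-bit log, so its actual size differs somewhat):

```python
# Approximate memory for a 4D reflectance field: one basis image per
# lighting direction (480 directions, 3 color channels, 1 byte/channel).
DIRECTIONS, CHANNELS = 480, 3

def field_bytes(width, height):
    """Bytes for a full reflectance field at the given image resolution."""
    return width * height * DIRECTIONS * CHANNELS

print(field_bytes(960, 540) / 2**20)    # ~712 MiB, half-HD in CPU memory
print(field_bytes(480, 270) / 2**20)    # ~178 MiB, quarter-HD in GPU memory
print(field_bytes(1920, 1080) / 2**30)  # ~2.8 GiB, full-HD on disk
```

The full-HD figure matches the "up to 3GB reflectance field" loaded from disk for final rendering, and the quarter-HD copy is what makes the interactive GPU preview fit comfortably in 2009-era video memory.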

GPU-accelerated RBF interpolation of lighting constraints:

The user sculpts the illumination by applying light constraints at selected image locations. Each constraint specifies light parameters, including direction, color, and intensity, at its image location, and a GPU program uses Gaussian radial basis functions to interpolate these parameters across the whole image from all user-specified constraints. The GPU greatly accelerates the evaluation of the Gaussian exponential functions. The interpolation coefficients can be visualized interactively as a false-color image of the directions by showing the (X,Y,Z) lighting direction vector components as (R,G,B) colors. Users can additively combine multiple, independently constrained lights to create more complex lighting conditions.
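A minimal CPU sketch of this interpolation pass, using normalized Gaussian RBFs over image position. The normalized [0,1] coordinates and the Gaussian width `sigma` are illustrative choices; the poster does not specify these parameters, and the real implementation runs per fragment on the GPU:

```python
import numpy as np

def interpolate_directions(constraints, height, width, sigma=0.25):
    """Interpolate per-pixel light directions from sparse user constraints
    with normalized Gaussian radial basis functions.
    constraints: list of ((cx, cy), direction) with positions in [0, 1]."""
    ys, xs = np.mgrid[0:height, 0:width]
    # Normalize pixel coordinates so sigma is resolution-independent.
    px = np.stack([xs / width, ys / height], axis=-1)       # (H, W, 2)
    weights, dirs = [], []
    for (cx, cy), direction in constraints:
        d2 = ((px - np.array([cx, cy])) ** 2).sum(axis=-1)  # squared distance
        weights.append(np.exp(-d2 / (2.0 * sigma ** 2)))    # Gaussian RBF
        dirs.append(np.asarray(direction, dtype=float))
    w = np.stack(weights, axis=0)                           # (K, H, W)
    w /= w.sum(axis=0, keepdims=True)                       # partition of unity
    field = np.einsum('khw,kc->hwc', w, np.stack(dirs))     # blend directions
    # Re-normalize so every pixel gets a unit lighting direction.
    return field / np.linalg.norm(field, axis=-1, keepdims=True)
```

Near each constraint the field matches that constraint's direction, and it blends smoothly in between; interpolating color and intensity works the same way without the final unit-length normalization.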

Final rendering:

The interpolated lighting direction constraints yield per-pixel lookup values into the reflectance field, computed with sub-pixel accuracy using bilinear interpolation. For a single set of lighting constraints, each pixel of the final result is lit from one direction of light; with multiple sets of constraints, a pixel can be lit by several lights with different directions, colors, and intensities. Users can enable or disable sets of lighting constraints to see the effect of an individual set. The application renders a 480 x 270 pixel result in real time using the GPU. The program also employs a multi-threaded system to compute a 960 x 540 pixel result in the background, taking just a second or two to provide a high-resolution update. For final output, the application can render a high-resolution result at 1920 x 1080 pixels, loading up to a 3 GB reflectance field from disk, within three to four minutes.
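The per-pixel lookup can be sketched as follows, reading the reflectance field as a grid of basis images indexed by lighting direction and bilinearly blending the four nearest basis images at each pixel. The array layout and interface here are assumptions for illustration, not the poster's data structures:

```python
import numpy as np

def relight(refl_field, u, v):
    """Per-pixel lookup into a 4D reflectance field with bilinear
    interpolation between the four nearest lighting directions.
    refl_field: (L, M, H, W, 3) basis images over an L x M grid of
    lighting directions; u, v: per-pixel fractional direction-grid
    coordinates, each of shape (H, W).  (CPU sketch of the GPU lookup.)"""
    L, M, H, W, _ = refl_field.shape
    u0 = np.clip(np.floor(u).astype(int), 0, L - 2)
    v0 = np.clip(np.floor(v).astype(int), 0, M - 2)
    fu = (u - u0)[..., None]  # sub-cell fractions for the blend weights
    fv = (v - v0)[..., None]
    ys, xs = np.mgrid[0:H, 0:W]
    # Gather the four surrounding basis images per pixel and blend them.
    c00 = refl_field[u0,     v0,     ys, xs]
    c10 = refl_field[u0 + 1, v0,     ys, xs]
    c01 = refl_field[u0,     v0 + 1, ys, xs]
    c11 = refl_field[u0 + 1, v0 + 1, ys, xs]
    return ((1 - fu) * (1 - fv) * c00 + fu * (1 - fv) * c10
            + (1 - fu) * fv * c01 + fu * fv * c11)
```

Summing `relight` results over several interpolated constraint sets, each scaled by its interpolated color and intensity, gives the additive combination of lights described above; a real implementation would also handle the longitudinal wrap-around of the direction grid.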


Material:

SIGGRAPH 2009 Poster:

SIGGRAPH 2009 Video (4:59 min):


Related Projects: