Capturing and Rendering With Incident Light Fields (bibtex)
by J. Unger, Andreas Wenger, Tim Hawkins, Andrew Gardner, Paul Debevec
Abstract:
This paper presents a process for capturing spatially and directionally varying illumination from a real-world scene and using this lighting to illuminate computer-generated objects. We use two devices for capturing such illumination. In the first we photograph an array of mirrored spheres in high dynamic range to capture the spatially varying illumination. In the second, we obtain higher resolution data by capturing images with a high dynamic range omnidirectional camera as it traverses across a plane. For both methods we apply the light field technique to extrapolate the incident illumination to a volume. We render computer-generated objects as illuminated by this captured illumination using a custom shader within an existing global illumination rendering system. To demonstrate our technique we capture several spatially-varying lighting environments with spotlights, shadows, and dappled lighting and use them to illuminate synthetic scenes. We also show comparisons to real objects under the same illumination.
Reference:
Capturing and Rendering With Incident Light Fields (J. Unger, Andreas Wenger, Tim Hawkins, Andrew Gardner, Paul Debevec), In Proceedings of the 14th Eurographics workshop on Rendering, 2003.
Bibtex Entry:
@inproceedings{unger_capturing_2003,
	title = {Capturing and {Rendering} {With} {Incident} {Light} {Fields}},
	url = {http://ict.usc.edu/pubs/Capturing%20and%20Rendering%20With%20Incident%20Light%20Fields.pdf},
	abstract = {This paper presents a process for capturing spatially and directionally varying illumination from a real-world scene and using this lighting to illuminate computer-generated objects. We use two devices for capturing such illumination. In the first we photograph an array of mirrored spheres in high dynamic range to capture the spatially varying illumination. In the second, we obtain higher resolution data by capturing images with a high dynamic range omnidirectional camera as it traverses across a plane. For both methods we apply the light field technique to extrapolate the incident illumination to a volume. We render computer-generated objects as illuminated by this captured illumination using a custom shader within an existing global illumination rendering system. To demonstrate our technique we capture several spatially-varying lighting environments with spotlights, shadows, and dappled lighting and use them to illuminate synthetic scenes. We also show comparisons to real objects under the same illumination.},
	booktitle = {Proceedings of the 14th {Eurographics} workshop on {Rendering}},
	author = {Unger, J. and Wenger, Andreas and Hawkins, Tim and Gardner, Andrew and Debevec, Paul},
	year = {2003},
	keywords = {Graphics}
}