Yizhou Yu | Jitendra Malik
We present a new approach to producing photorealistic computer renderings of real architectural scenes under novel lighting conditions, such as at different times of day, starting from a small set of photographs of the real scene. Traditional texture-mapping approaches to image-based modeling and rendering are unable to do this because texture maps are the product of the interaction between lighting and surface reflectance, and one cannot handle novel lighting without separating their respective contributions. To obtain this decomposition into lighting and reflectance, our basic approach is to solve a series of optimization problems that find the parameters of appropriate lighting and reflectance models best explaining the measured values in the various photographs of the scene. The lighting models include the radiance distributions of the sun, the sky, and the surrounding landscape, so that secondary illumination from the environment is taken into account. The reflectance models describe the surfaces of the architecture. Photographs of the sun, the sky, the landscape, and the architecture are taken at a few different times of day to collect enough data for recovering these lighting and reflectance models. We can then predict novel illumination conditions with the recovered lighting models and use them, together with the recovered reflectance values, to produce renderings of the scene. Our results show that photorealistic renderings of real architectural scenes under novel lighting conditions can be generated in this way.
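The central step the abstract describes, fitting reflectance parameters so that a lighting-times-reflectance model best explains the measured pixel values, can be illustrated with a minimal least-squares sketch. This is not the paper's actual pipeline: it assumes a single purely Lambertian surface patch whose incident irradiance at each photographed time of day has already been estimated from recovered sun and sky models, and all names and numbers below are illustrative.

    import numpy as np
    from scipy.optimize import least_squares

    def fit_albedo(irradiance, observed):
        # Recover one patch's diffuse albedo from T photographs.
        #   irradiance : (T,) estimated incident irradiance (sun + sky) per photo
        #   observed   : (T,) measured radiance of the patch in each photo
        residual = lambda a: a[0] * irradiance - observed
        return least_squares(residual, x0=[0.5], bounds=(0.0, 1.0)).x[0]

    # Toy example: one wall patch seen in three photographs at different times of day.
    irradiance = np.array([900.0, 400.0, 150.0])   # hypothetical irradiance values
    observed = np.array([315.0, 140.0, 52.5])      # radiance measured in the same units
    print(fit_albedo(irradiance, observed))        # prints roughly 0.35

In the same spirit, once such reflectance values are recovered, re-rendering under novel lighting amounts to evaluating the reflectance model with a different predicted irradiance; the paper's full formulation handles more general reflectance and mutual illumination than this toy per-patch fit.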
Paper PostScript and PDF files as they appear in the proceedings of SIGGRAPH '98.