Basic Concepts of Physically-Based Rendering

    Tuesday, 19 November

    09:00 - 12:45

    Room S426 + S427

    This tutorial will cover the basics of physically-based rendering, such as reflection models (BRDFs), volume scattering (phase functions), and optical phenomena (dispersion and polarization). It will also cover image formation via basic camera models. A brief survey of popular algorithms will follow, including radiosity, path tracing, photon tracing, and Metropolis Light Transport. The course will end with a more detailed description of adjoint photon tracing so that attendees can later implement their own physically-based renderer.
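    As a taste of what "physically-based" means in practice, the sketch below (an illustrative example, not course material) Monte Carlo-estimates the radiance reflected from a Lambertian surface, whose BRDF is the constant albedo/π, under a uniform sky. The function names and the choice of uniform hemisphere sampling are assumptions made for this example.

    ```python
    import math
    import random

    def sample_uniform_hemisphere(rng):
        # Uniform direction on the upper hemisphere (z axis is the surface normal).
        # By Archimedes' hat-box theorem, a uniform z = cos(theta) in [0, 1)
        # gives directions uniform in solid angle.
        u1, u2 = rng.random(), rng.random()
        z = u1
        r = math.sqrt(max(0.0, 1.0 - z * z))
        phi = 2.0 * math.pi * u2
        return (r * math.cos(phi), r * math.sin(phi), z)

    def lambertian_brdf(albedo):
        # A Lambertian BRDF is constant: albedo / pi.
        return albedo / math.pi

    def estimate_reflected_radiance(albedo, sky_radiance, n_samples, seed=0):
        """Monte Carlo estimate of the radiance reflected from a Lambertian
        surface lit by a constant 'sky' of the given radiance.
        The analytic answer is albedo * sky_radiance."""
        rng = random.Random(seed)
        pdf = 1.0 / (2.0 * math.pi)  # uniform-hemisphere pdf over solid angle
        total = 0.0
        for _ in range(n_samples):
            wi = sample_uniform_hemisphere(rng)
            cos_theta = wi[2]  # angle between incoming direction and normal
            total += lambertian_brdf(albedo) * sky_radiance * cos_theta / pdf
        return total / n_samples
    ```

    The same estimator structure (BRDF × incoming radiance × cosine, divided by the sampling pdf) is the core loop of the path tracing and photon tracing algorithms surveyed in the tutorial.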



    Intended Audience

    People with little familiarity with physically-based rendering who want a quick entry into writing a physically-based renderer.


    Basic graphics background.


    Peter Shirley, NVIDIA

    Dr. Peter Shirley has worked in computer graphics for over two decades. He has a B.A. in physics from Reed College and a Ph.D. in computer science from the University of Illinois at Urbana-Champaign. He is the coauthor of three books and dozens of technical articles. He has worked at Indiana University, Cornell University, the University of Utah, and NVIDIA Research. His professional interests include interactive and realistic rendering, statistical computing, visualization, and immersive environments.