Spectra

Curtis Hu, Ashvin Verma, Toben Main, Viktor Mooren

Summary

This project aims to develop a physically based spectral rendering system capable of accurately simulating thin-film interference effects, such as those seen in soap bubbles, oil slicks, and similar iridescent coatings on opaque surfaces. The system will address the challenges of modeling wavelength-dependent light transport and spectral interference, exploring solutions by extending an existing ray tracer.

Problem Description

Traditional RGB rendering (using only red, green, and blue channels as color components), while widely successful in computer graphics, has fundamental limitations in representing phenomena that shift a light source's spectral power distribution. In a standard renderer, each RGB channel stands in for a broad band of wavelengths, so fine spectral interactions are averaged out. Phenomena like thin-film interference, where certain wavelengths reinforce or cancel due to path-length differences, therefore cannot be replicated with simple RGB approximations. An RGB-based renderer fails to produce the distinctive rainbow shimmer on a soap bubble, the shifting colors of an oil slick, or the iridescence of a coated leather couch, because it lacks the spectral resolution to model how different wavelengths interfere.

Spectral rendering is important for simulating these physical phenomena because it treats light as a spectrum of wavelengths rather than just three values. A classic example is a soap bubble: it shows iridescent colors that change with viewing angle and light direction due to thin-film interference. Light reflecting off the thin soap film undergoes interference between the reflections from the front and back surfaces of the film, leading to vivid colors that depend on the film’s thickness and the incoming light’s wavelength and angle. Similarly, oil on water displays rainbow patches for the same reason. Capturing these effects requires computing physics at the wavelength level. Dispersion is another wavelength-dependent effect, in which materials like prisms or certain glasses split white light into a spectrum of colors. Both dispersion and thin-film interference are beyond the reach of an RGB renderer but can be handled by a spectral renderer. The core challenge in this project is to simulate both dispersion and thin-film interference within a rendering system using accurate wavelength sampling and interference modeling. This involves extending the light transport simulation to sample many wavelengths (or a continuous representation of the spectrum) and computing interference based on phase differences for those wavelengths. Integrating this into a renderer will require careful design to manage the additional computational cost. Our current plan is to extend our existing ray tracer from HW3, though we may need to restructure its core to support these features; the decision will be based on feasibility and which approach allows easier integration of spectral physics. Either way, the goal is to overcome the RGB limitations by incorporating true spectral light transport and thin-film interference modeling into the rendering pipeline.
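
For concreteness, the interference condition we plan to implement is the standard thin-film optics result (not anything specific to our renderer): for a film of index $n_f$ and thickness $d$, the reflections from the top and bottom surfaces differ by an optical path length of $2 n_f d \cos\theta_f$, where $\theta_f$ is the refraction angle inside the film. For a soap film in air, the top reflection also picks up a half-wavelength phase shift, so reflected light interferes constructively when

$$2 n_f d \cos\theta_f = \left(m + \tfrac{1}{2}\right)\lambda, \qquad m = 0, 1, 2, \dots$$

and destructively when $2 n_f d \cos\theta_f = m\lambda$. Because the condition depends on $\lambda$, different wavelengths peak at different angles and thicknesses, which is exactly the effect RGB rendering averages away.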

Goals and Deliverables

  1. MVP (What we plan to deliver):
    • Spectral Rendering Engine: A functioning spectral renderer capable of producing images of soap bubbles (or similar thin films) with realistic thin-film interference patterns. This renderer will use physically accurate spectral sampling of light, rather than RGB, as the foundation.
    • Spectral Light Representation: Implement a method for representing and sampling light spectra instead of just three RGB channels. This could involve Monte Carlo sampling of individual wavelengths across the visible spectrum or using a fixed spectral basis. The aim is to ensure the renderer can handle continuous wavelength data for light sources and surfaces. One candidate in-code representation is sketched after this list.
    • Thin-Film Interference Model: Implement thin-film interference effects using physics-based optics models. This includes using Fresnel equations for reflectance at interfaces and accounting for the phase shifts that occur when light waves reflect within a thin film layer. By combining reflections from the top and bottom surfaces of a film with the appropriate phase difference, the renderer will simulate constructive and destructive interference at different wavelengths. A minimal reflectance computation is sketched after the figure below.
    • Rendered Demonstrations: Output a set of example images demonstrating the effects. For instance, render a scene with a soap bubble under various lighting and viewing angles, showing the changing iridescent colors. These images will serve as validation that our spectral rendering and thin-film interference model are working correctly and producing visually convincing results.
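
As a hedged illustration of the spectral light representation above, here is one way the per-ray spectrum could look in C++; the bin count, wavelength bounds, and names are our own placeholder assumptions, not final design decisions:

```cpp
#include <array>

// Candidate in-code spectrum: radiance sampled at a fixed set of wavelength
// bins carried along each ray. 16 bins over 380-700 nm is an assumption.
constexpr int    kSpectralSamples = 16;
constexpr double kLambdaMin = 380.0;  // nm
constexpr double kLambdaMax = 700.0;  // nm

struct SampledSpectrum {
    std::array<double, kSpectralSamples> v{};  // radiance per wavelength bin

    // Wavelength (nm) at the center of bin i.
    static double wavelength(int i) {
        return kLambdaMin + (i + 0.5) * (kLambdaMax - kLambdaMin) / kSpectralSamples;
    }

    // Component-wise product, e.g. attenuating a ray's spectrum by a
    // surface's spectral reflectance at each bin.
    SampledSpectrum operator*(const SampledSpectrum& o) const {
        SampledSpectrum r;
        for (int i = 0; i < kSpectralSamples; ++i) r.v[i] = v[i] * o.v[i];
        return r;
    }
};
```

The alternative implied by the Monte Carlo option above is to carry only one (or a few) randomly sampled wavelengths per ray and accumulate into the image with the appropriate probability weights.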

[Figure: soap bubble reference images showing thin-film iridescence]
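
To make the thin-film interference model concrete, below is a minimal single-wavelength reflectance sketch using the standard two-interface Airy summation with Fresnel amplitude coefficients. It assumes s-polarization only and non-absorbing media; the function and parameter names are our own:

```cpp
#include <algorithm>
#include <cmath>
#include <complex>

constexpr double kPi = 3.14159265358979323846;

// Fresnel amplitude reflection coefficient, s-polarization only for brevity
// (a full model would average the s and p polarizations).
static double fresnelRs(double n1, double cos1, double n2, double cos2) {
    return (n1 * cos1 - n2 * cos2) / (n1 * cos1 + n2 * cos2);
}

// Reflectance of a single thin film (medium 1 / film 2 / substrate 3) at one
// wavelength, via the Airy two-interface formula.
//   lambdaNm: vacuum wavelength in nanometers
//   dNm:      film thickness in nanometers
//   cos1:     cosine of the incidence angle in medium 1
double thinFilmReflectance(double lambdaNm, double dNm,
                           double n1, double n2, double n3, double cos1) {
    // Snell's law gives the propagation angles in the film and substrate.
    double sin1 = std::sqrt(std::max(0.0, 1.0 - cos1 * cos1));
    double sin2 = n1 * sin1 / n2;
    double sin3 = n1 * sin1 / n3;
    double cos2 = std::sqrt(std::max(0.0, 1.0 - sin2 * sin2));
    double cos3 = std::sqrt(std::max(0.0, 1.0 - sin3 * sin3));

    double r12 = fresnelRs(n1, cos1, n2, cos2);  // top interface
    double r23 = fresnelRs(n2, cos2, n3, cos3);  // bottom interface

    // Phase accumulated over one round trip through the film.
    double phi = 4.0 * kPi * n2 * dNm * cos2 / lambdaNm;

    std::complex<double> e(std::cos(phi), std::sin(phi));
    std::complex<double> r = (r12 + r23 * e) / (1.0 + r12 * r23 * e);
    return std::norm(r);  // |r|^2
}
```

For a soap bubble one would use n1 = n3 = 1.0 (air) and n2 ≈ 1.33 (soapy water); sweeping lambdaNm from 380 to 700 at a few hundred nanometers of thickness reproduces the oscillating reflectance behind the iridescent colors.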

  2. Stretch Goal (What we hope to deliver):
    • Extended Phenomena: Extend the spectral renderer to handle additional phenomena beyond soap bubbles. Examples include rendering oil slicks on water (another thin-film case with spatially varying thickness), prism dispersion (splitting white light into a spectrum), rainbow caustics through dispersive optics, or a thin-film coating on leather producing iridescence. This will demonstrate the generality of our spectral approach for various wavelength-dependent effects.
    • Spectral Sampling Strategies: Evaluate and compare different spectral sampling models for efficiency and accuracy. For instance, we might compare straightforward Monte Carlo sampling of wavelengths versus using a precomputed spectral basis or fewer “hero” wavelengths to approximate the full spectrum. This analysis will help determine the trade-offs in image quality versus performance for different spectral rendering techniques. A hero-wavelength sketch appears after this list.
    • White-Light Caustics: Show support for rendering caustics and dispersion under white light illumination. For example, a glass object or prism that creates a spectrum of colors on a surface (a dispersive caustic) would be an impressive demonstration. Our system should be able to simulate how a single white light beam can produce a multi-colored pattern after passing through or reflecting off a dispersive material.
    • Real-World Spectral Data Integration: Increase realism by using real spectral data. This could include using standard illumination spectra (such as the CIE D65 daylight spectrum) instead of an idealized white light, or incorporating measured thin-film thickness maps or measured index-of-refraction spectra for materials. By plugging in real-world data, we can make our rendered results more physically accurate and directly comparable to real photographs.
    • Performance Analysis and Optimization: Perform a basic performance analysis of our spectral renderer and implement acceleration techniques to make it more practical. Potential optimizations include importance sampling of wavelengths (focusing samples on wavelengths that contribute most to the final image), caching reusable spectral results (for example, caching interference results for certain thicknesses or materials), or adaptive sampling where the renderer allocates more samples to regions with high spectral variation. We will document how these techniques improve rendering times or reduce noise, and ensure that the enhanced renderer is still delivering physically accurate results.
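
As a hedged sketch of the “hero” wavelength strategy mentioned above (after Wilkie et al.'s hero wavelength spectral sampling; the constants and names here are our own assumptions):

```cpp
#include <array>

constexpr double kLambdaMin = 380.0;   // nm
constexpr double kLambdaMax = 700.0;   // nm
constexpr int    kNWavelengths = 4;    // hero + 3 companions per path

// Sample one uniformly distributed hero wavelength from a random number
// u in [0,1), then rotate three companions through the visible range so a
// single traced path evaluates four wavelengths at once.
std::array<double, kNWavelengths> sampleHeroWavelengths(double u) {
    const double range = kLambdaMax - kLambdaMin;
    const double hero  = kLambdaMin + u * range;
    std::array<double, kNWavelengths> lambdas;
    for (int i = 0; i < kNWavelengths; ++i) {
        double l = hero + i * range / kNWavelengths;
        if (l >= kLambdaMax) l -= range;  // wrap back into the visible range
        lambdas[i] = l;
    }
    return lambdas;
}
```

Because all four wavelengths share one geometric path, this reduces color noise for smooth spectra at little extra cost; at strongly dispersive or interference-driven events, the renderer can fall back to tracing the hero wavelength alone. Comparing this against one-wavelength-per-ray sampling is exactly the trade-off study described above.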

Schedule

  • Week 1: Research & Planning – Investigate the physics and math of thin-film interference and dispersion in detail. This involves reading physics literature on thin films (to understand equations for interference based on film thickness and refractive indices) and reviewing existing work on spectral rendering in graphics. We will also survey existing spectral rendering engines or frameworks (for example, check if PBRT or other open-source ray tracers have spectral modes) to gauge what solutions already exist. By the end of this week, we aim to have a clear understanding of the theoretical models and a plan for the renderer’s architecture.
  • Week 2: Setup & Spectral Foundation – Decide on the renderer codebase. We will choose whether to extend an existing ray tracer (such as PBRT-v4 or the course-provided ray tracing framework) or start building a new one tailored for spectral rendering. Once decided, begin implementation of the spectral framework: define how spectra are represented in the code (e.g., as an array of sampled wavelengths, or an analytic function, or using a small set of basis functions). Implement the ability to sample random wavelengths for rays (if using Monte Carlo spectral sampling) and propagate those through the rendering pipeline. By the end of Week 2, we expect to have basic non-interference spectral rendering working (e.g., rendering a scene with dispersion or just showing that a prism splits light correctly, to verify our spectral sampling works; see the dispersion sketch after this schedule).
  • Week 3: Thin-Film Interference Implementation – Integrate a thin-film interference model into the renderer. This includes coding the Fresnel reflectance at both film interfaces (air/film and film/substrate) and computing the per-wavelength phase shift due to the film’s thickness. We will likely start with a simple scenario (a single thin film on a surface) and assume uniform thickness, then possibly extend to spatially varying thickness. After implementing the interference calculations, render test scenes focused on soap bubbles or thin-film plates. We’ll experiment with different film thicknesses and view angles to ensure the interference colors appear as expected. By the end of this week, we should have initial images of a spectral soap bubble with noticeable iridescent coloring. Any mismatches or physical inaccuracies will be debugged during this stage.
  • Week 4: Optimization and Extensions – Refine the renderer for quality and speed. This includes optimizing spectral integration (for example, reducing noise in the images by smarter wavelength sampling) and ensuring that adding spectral effects hasn’t made the render unbearably slow. We will also, time permitting, implement one or two aspirational features: for instance, try rendering an oil slick scene or a prism with caustics to see our renderer handle those cases. We’ll spend some time analyzing performance (comparing render times with different numbers of wavelength samples, etc.) and perhaps implement an importance sampling strategy if noise is an issue. The final part of the week is dedicated to preparing the final demo, images, and presentation. We will create side-by-side comparisons (if possible) of RGB vs spectral renderings to highlight the improvements, and assemble our findings into a coherent presentation for the class. Since members of our team have experience in GPU programming, we might write kernels and accelerate our ray tracer.
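
For the Week 2 prism sanity check, a wavelength-dependent index of refraction is enough to see dispersion. Below is a minimal sketch using Cauchy's empirical equation, with approximate BK7-like coefficients (the exact values would come from measured data; treat these as placeholders):

```cpp
// Cauchy's empirical dispersion equation: n(lambda) = A + B / lambda^2,
// with lambda in micrometers. A and B are approximate values commonly
// quoted for BK7 glass, used here only for illustration.
double cauchyIOR(double lambdaNm, double A = 1.5046, double B = 0.00420) {
    double lambdaUm = lambdaNm * 1e-3;  // nanometers -> micrometers
    return A + B / (lambdaUm * lambdaUm);
}

// Example: cauchyIOR(400.0) is about 1.531 and cauchyIOR(700.0) is about
// 1.513, so blue refracts more strongly than red, splitting white light.
```

Plugging this index into the existing refraction code, with one wavelength per ray, should make a white beam through a prism fan out into a visible spectrum, which is the verification target for the end of Week 2.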

Resources

  • https://larswander.com/writing/spectral-ray-tracing/
  • David Murray, Alban Fichet, and Romain Pacanowski, “Efficient Spectral Rendering on the GPU for Predictive Rendering,” Institut d’Optique Graduate School, CNRS, INRIA
  • https://www.dgp.toronto.edu/~nmorris/CSC2530/Project/morris.pdf
  • https://www-users.cse.umn.edu/~gmeyer/papers/gondek-meyer-siggraph-1994.pdf

Computing Platforms: macOS, Windows 11

Starter Code: We are considering using PBRT-v3/v4 or the CS184-provided ray tracing base code as a starting point. PBRT-v4, for instance, already supports spectral rendering modes, which could save time; we would then focus on the thin-film interference model itself. The project will be implemented in C++ for performance and to allow integration with existing graphics code. If we base our work on PBRT or another renderer, we will work within that C++ framework. We might also use GLSL or CUDA for any GPU-accelerated portions or shader-based experiments (for instance, if we attempt a real-time thin-film shader as a stretch goal). The choice of platform will be finalized in Week 2 when we decide between extending an existing renderer and writing one from scratch.

We plan to use our personal computers for development and testing. Some team members have a laptop or desktop with a capable GPU (e.g., an NVIDIA graphics card), which will speed up rendering and can be shared with the rest of the team for potential GPU implementations.