Our presentation can be found here.

Our short movies can be found here and here.

Our goal was to render an electric storm striking an ocean. We both took several years of ancient Greek in high school, so we decided to theme the picture along those lines; the scene is supposed to represent a Greek ship lost on its way to battle. Here are a couple of lightning photographs that we found inspirational while working on the project:

The image consists of several fairly independent components:

### Experiments

We explored several types of sky:
- 2D-photon mapping
- heightfield-based geometry
- flat procedural texturing
- metaballs

### Final Sky

Here are some sample images from our experimental sky techniques:

Flat Texture | Pure Volume |

Metaballs Volume | Heightfield |

We decided to go with the final approach of **3D-photon mapping** [1]. The water reflects the sky, and so finding a physically-accurate model for the sky required a lot of attention. Although some of the other methods looked pretty good, volume rendering the 3D photon map added the most realism to the scene. The density function for the sky is 3D Perlin noise.
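As an illustration of the density function's role, here is a sketch of a smooth 3D noise field with fBm octaves. The project used 3D Perlin noise; a simple hash-based value noise stands in for it below (same role: a smooth pseudo-random density), and all constants are illustrative.

```python
import math

# Illustrative stand-in for 3D Perlin noise: hash-based value noise with
# smoothstep interpolation, summed over fBm octaves as a cloud density.

def _hash(ix, iy, iz, seed=1234):
    h = (ix * 73856093) ^ (iy * 19349663) ^ (iz * 83492791) ^ seed
    h = (h ^ (h >> 13)) & 0xFFFFFFFF
    return (h * 2654435761 % 0xFFFFFFFF) / 0xFFFFFFFF  # pseudo-random in [0, 1)

def _smooth(t):
    return t * t * (3.0 - 2.0 * t)  # smoothstep interpolant

def value_noise(x, y, z):
    ix, iy, iz = math.floor(x), math.floor(y), math.floor(z)
    fx, fy, fz = x - ix, y - iy, z - iz
    u, v, w = _smooth(fx), _smooth(fy), _smooth(fz)
    def lerp(a, b, t):
        return a + (b - a) * t
    # corner values of the cell, interpolated along z, then y, then x
    c = [[[_hash(ix + i, iy + j, iz + k) for k in (0, 1)]
          for j in (0, 1)] for i in (0, 1)]
    return lerp(
        lerp(lerp(c[0][0][0], c[0][0][1], w), lerp(c[0][1][0], c[0][1][1], w), v),
        lerp(lerp(c[1][0][0], c[1][0][1], w), lerp(c[1][1][0], c[1][1][1], w), v),
        u)

def cloud_density(p, octaves=4):
    """fBm: sum octaves of noise at doubling frequency, halving amplitude."""
    x, y, z = p
    total, amp, freq = 0.0, 0.5, 1.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq, y * freq, z * freq)
        amp *= 0.5
        freq *= 2.0
    return total
```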

We traced 10 million photons through the volume of the clouds to generate a volumetric photon map. For space-efficiency and access time, we stored the resulting data as a 3D texture map. This optimization circumvented the costly nearest-neighbor search that a conventional photon map requires.
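A minimal sketch of the voxelized map, assuming a regular grid over the cloud bounds; the grid resolution, bounds, and photon count below are illustrative, not the project's actual values.

```python
import random

# Hedged sketch: photon energy is splatted into a regular 3D grid (the
# "3D texture"), so a radiance lookup is a constant-time array index
# instead of a nearest-neighbor search over a photon kd-tree.

RES = 32        # voxels per axis (illustrative)
BOUNDS = 10.0   # cloud volume spans [0, BOUNDS)^3 (illustrative)

def make_grid():
    return [[[0.0 for _ in range(RES)] for _ in range(RES)] for _ in range(RES)]

def deposit(grid, pos, energy):
    """Accumulate a photon's energy into the voxel containing pos."""
    i, j, k = (min(int(c / BOUNDS * RES), RES - 1) for c in pos)
    grid[i][j][k] += energy

def radiance_lookup(grid, pos):
    """O(1) lookup replacing the costly photon-map search."""
    i, j, k = (min(int(c / BOUNDS * RES), RES - 1) for c in pos)
    return grid[i][j][k]

grid = make_grid()
rng = random.Random(0)
for _ in range(10_000):  # small stand-in for the 10 million photons
    p = (rng.uniform(0, BOUNDS), rng.uniform(0, BOUNDS), rng.uniform(0, BOUNDS))
    deposit(grid, p, 1.0)
```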

Next, we used Peter Williams' "Volume Density Optical Model" [2] to render the photon map as a luminous volume. The photon map saved us the work of recursively tracing shadow and scatter rays from our primary ray-march, converting the exponential general-case volume rendering integration into a linear time algorithm.
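A sketch of such a single linear-time march under an emission/absorption optical model; `density` and `radiance` here are hypothetical stand-ins for the noise field and the photon-map lookup, and since the photon map already holds the scattered light, no recursive shadow or scatter rays are spawned.

```python
import math

# Hedged sketch of front-to-back volume integration: accumulate emitted
# radiance weighted by transmittance, attenuate by Beer-Lambert absorption.

def march(origin, direction, density, radiance, t_max=10.0, steps=100):
    dt = t_max / steps
    color, transmittance = 0.0, 1.0
    for s in range(steps):
        t = (s + 0.5) * dt
        p = tuple(o + t * d for o, d in zip(origin, direction))
        sigma = density(p)                      # extinction coefficient
        emitted = radiance(p)                   # photon-map in-scattered light
        color += transmittance * emitted * sigma * dt
        transmittance *= math.exp(-sigma * dt)  # Beer-Lambert absorption
        if transmittance < 1e-4:                # early ray termination
            break
    return color
```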

A slice of the 3D photon map | The resulting image |

### Ocean

Our ocean surface combines two kinds of waves:

- Linear waves: The lines of curvature are orthogonal and straight. There are globally two principal directions. These waves are the sort that a steady wind will cause.
- Circular waves: The lines of curvature are still orthogonal, but are not all straight. The first set of lines of curvature forms concentric circles around a given point, while the second set consists of radial spokes from that point. Waves of this kind occur in nature after specific events (for example, when a large fish makes a splash).
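As an illustration, the two wave families can be written as analytic heightfields; every parameter below is invented, not a value from the scene.

```python
import math

# Hedged sketch of the two wave families as analytic heightfields.

def linear_wave(x, z, t, amp=0.4, k=1.5, speed=2.0, direction=(1.0, 0.0)):
    """Straight, parallel crests: height varies along one global direction."""
    dx, dz = direction
    phase = k * (x * dx + z * dz) - speed * t
    return amp * math.sin(phase)

def circular_wave(x, z, t, center=(0.0, 0.0), amp=0.4, k=1.5, speed=2.0):
    """Concentric crests around a point (e.g. a splash), decaying with radius."""
    r = math.hypot(x - center[0], z - center[1])
    return amp * math.sin(k * r - speed * t) / (1.0 + r)

def ocean_height(x, z, t):
    """Superpose one wind wave and one splash wave (illustrative mix)."""
    return linear_wave(x, z, t) + circular_wave(x, z, t, center=(3.0, -2.0))
```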

The ocean is raytraced using Musgrave's QAEB-tracing algorithm [5]. Precise procedural normals are available from the underlying analytically defined surface. Reflection rays are generated upon collision, and weighted by a function of their incident angle. We did not trace any rays underwater, but we did use an approximation to the subsurface scattering caused by the bright lightning:
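We have not stated which angular weighting function we used for the reflection rays; the Schlick approximation to the Fresnel equations is one standard choice for an air/water boundary and serves here as a sketch.

```python
# Hedged sketch: weight a reflection ray by the Schlick approximation to
# the Fresnel reflectance (a stand-in, not necessarily the exact function
# used in the project).

def schlick_reflectance(cos_theta, n1=1.0, n2=1.33):
    """Fraction of light reflected at an air/water interface (n2 = 1.33)."""
    r0 = ((n1 - n2) / (n1 + n2)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5
```

Note the behavior this captures: reflectance is small at normal incidence and rises to 1 at grazing angles, which is why distant water mirrors the sky so strongly.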

No Subsurface Scattering | With Subsurface Scattering |

Since the water is defined by simple wave functions, it was possible to accurately solve the governing partial differential equations for its motion. Using d'Alembert's solution to the wave equation [7], we animated the water in a physically accurate fashion without having to numerically solve PDEs.
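A minimal demonstration of the idea: in d'Alembert's form u(x, t) = f(x - ct) + g(x + ct), animating the surface is just evaluating phase-shifted profiles, and the 1D wave equation u_tt = c² u_xx is satisfied exactly. The profiles below are illustrative sines.

```python
import math

# Hedged sketch of d'Alembert's solution: a right-travelling and a
# left-travelling component, each a fixed profile shifted over time.

C = 2.0  # wave speed (illustrative)

def f(x):
    return 0.3 * math.sin(1.5 * x)   # right-travelling component

def g(x):
    return 0.1 * math.sin(0.7 * x)   # left-travelling component

def height(x, t):
    """Exact solution of the wave equation: no numerical PDE solve needed."""
    return f(x - C * t) + g(x + C * t)
```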

### Geometry

We used a stochastic process to simulate realistic lightning geometry as a phenomenological model. The program steps through the lightning in a random direction determined by parameters to a recursive function, rotates a random amount, and branches a random distance. We control all the randomness with constraints and heuristics; for example, the lightning branches less at the top and more towards the end. To add curvature to the lightning, we used sine waves with constrained random parameters. Additional measures were taken to force the main branch to always reach the surface of the water. As output, the lightning geometry generation program exports line segments, which can feed either into our Modeling Studio or into the raytracer program.
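A sketch of such a branching walk in 2D; every constant below (step sizes, jitter angles, probabilities) is an invented stand-in for the project's tuned parameters. Branching probability grows as the bolt nears the water, side branches may die out early, and the main branch is forced all the way to the surface.

```python
import math
import random

# Hedged sketch of the stochastic lightning generator.

def generate_lightning(start, top_y, depth=0, max_depth=6, rng=None):
    """Return line segments ((x1, y1), (x2, y2)) of a 2D bolt; y = 0 is water."""
    rng = rng or random.Random(42)
    segs = []
    x, y = start
    while y > 0.0 and len(segs) < 500:
        angle = -math.pi / 2 + rng.uniform(-0.6, 0.6)   # jittered downward step
        step = rng.uniform(0.3, 0.8)
        nx = x + step * math.cos(angle)
        ny = max(y + step * math.sin(angle), 0.0)       # clamp at the water
        segs.append(((x, y), (nx, ny)))
        # branch more often near the bottom, as described in the write-up
        if depth < max_depth and rng.random() < 0.25 * (1.0 - y / top_y):
            segs += generate_lightning((nx, ny), top_y, depth + 1, max_depth, rng)
        if depth > 0 and rng.random() < 0.15:
            break   # side branches may fizzle out; the main branch never does
        x, y = nx, ny
    if depth == 0 and y > 0.0:
        segs.append(((x, y), (x, 0.0)))  # force the main branch to the surface
    return segs

bolt = generate_lightning((0.0, 10.0), top_y=10.0)
```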

### Rendering

The lightning is rendered as a linear light source. Since it has negligible thickness, rays cannot collide with it,
so to make it visible to the viewer, we had to render the entire atmosphere as a (very low-density) volume.
Atmospheric particles near the lightning reflect a large amount of light to the viewer,
while particles distant from the lightning reflect little light.
This produces the glowing visual effect for the lightning
(in particular, the glow was NOT done as an image-space operation or camera effect).
Since we wanted the atmosphere to appear perfectly smooth,
we decided to use fixed sampling along the lightning rather than Monte Carlo integration for this portion of the rendering.
This caused the noise in the atmosphere to be extremely low, at the cost of a small bias.

Although the geometries tend to look fine with no tweaking, we also decided to implement a Lightning Modeling Studio GUI to perfect the geometries before rendering. We implemented the Lightning Studio in YZ-Windows.

To get a sense of the temporal aspects of lightning storms, we watched a couple of videos of lightning (from wvlightning.com) and noticed that the geometries repeated for several seconds. We thus decided to use the same geometry for multiple flashes in our short video. Here is a lightning scientist's explanation of this effect:

To speed the volume integration up for the homogeneous low-density atmospheric medium, we used an approximation to the volume rendering equation and derived a closed-form expression for the contribution of a lightning segment to the glow along a ray. This saved us the expense of marching along each ray through the expansive atmospheric volume. The mathematics of the optimization are described in this brief document (PDF).
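The actual derivation lives in the PDF linked above; as an illustration of why a closed form exists at all, the glow contributed by a single emitter point along a ray, assuming inverse-square falloff and negligible absorption in the thin atmosphere, integrates to an arctangent. The functions below are illustrative, not the project's exact expression.

```python
import math

# Hedged illustration: for an emitter at perpendicular distance h from a
# ray, with foot-of-perpendicular parameter t_foot, the glow over the ray
# interval [t0, t1] is
#     integral of I / (h^2 + (t - t_foot)^2) dt
#   = (I / h) * (atan((t1 - t_foot) / h) - atan((t0 - t_foot) / h)),
# so no per-ray marching through the atmosphere is required.

def glow_closed_form(I, h, t0, t1, t_foot=0.0):
    return (I / h) * (math.atan((t1 - t_foot) / h) - math.atan((t0 - t_foot) / h))

def glow_marched(I, h, t0, t1, t_foot=0.0, steps=100_000):
    """Brute-force midpoint quadrature, for comparison only."""
    dt = (t1 - t0) / steps
    return sum(I / (h * h + (t0 + (s + 0.5) * dt - t_foot) ** 2) * dt
               for s in range(steps))
```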

The lightning's reflection through the clouds was accomplished through a photon map, as described above; we used uniform random sampling of the lightning to generate photon initial positions, and traced their paths through the clouds to generate the photon map. Diffuse shading calculations integrated the lightning uniformly by arc length.
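Uniform-by-arc-length sampling of a polyline can be sketched as: pick a segment with probability proportional to its length, then pick a uniform point along it.

```python
import math
import random

# Hedged sketch of uniform-by-arc-length sampling over line segments,
# as used to choose photon origins on the bolt.

def sample_on_polyline(segments, rng):
    """segments: list of (a, b) endpoint tuples; returns a point tuple."""
    lengths = [math.dist(a, b) for a, b in segments]
    total = sum(lengths)
    u = rng.uniform(0.0, total)          # position along total arc length
    for (a, b), L in zip(segments, lengths):
        if u <= L:
            t = u / L
            return tuple(pa + t * (pb - pa) for pa, pb in zip(a, b))
        u -= L
    return segments[-1][1]               # guard against rounding at the end
```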

Our code only supports these lights in the form of discrete line segments, but our technique is general and may easily be applied to an arbitrary 1-manifold with boundary. The linear light sources can also define color and brightness as a function along the path. A useful and relatively simple optimization to consider for the future is importance sampling the lightning's geometry when performing diffuse shading calculations. This could be accomplished by using the combined brightness and projected angle of a lightning segment as a probability distribution function.
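A sketch of that proposed importance sampling: treat each segment's weight (hypothetically, brightness times length times a projected-angle factor) as an unnormalized probability, build a discrete CDF, and draw segments from it instead of uniformly.

```python
import random

# Hedged sketch of discrete importance sampling over lightning segments.
# The weights are illustrative; in the proposed scheme they would combine
# each segment's brightness and projected angle.

def build_cdf(weights):
    """Normalize weights into a cumulative distribution."""
    total = sum(weights)
    cdf, acc = [], 0.0
    for w in weights:
        acc += w / total
        cdf.append(acc)
    return cdf

def sample_index(cdf, rng):
    """Draw a segment index with probability proportional to its weight."""
    u = rng.random()
    for i, c in enumerate(cdf):
        if u <= c:
            return i
    return len(cdf) - 1
```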

The final images were rendered at 3600 by 2700 pixels with 100 shadow rays per eye ray, and then resampled down to 1280 by 960 to reduce aliasing. The close-up image of the boat with the purple lightning took about 8 hours of rendering time, and the distant views took about 1 hour each.

1. Jensen, Henrik W.: "Realistic Image Synthesis Using Photon Mapping." AK Peters, 2001.
2. Williams, Peter L.: "A Volume Density Optical Model." ACM Press, New York, 1992.
3. Ebert, David S.: "Procedural Volumetric Cloud Modeling, Animation, and Real-time Techniques."
4. Dobashi, Y. et al.: "Efficient Rendering of Lightning taking into Account Scattering Effects due to Clouds and Atmospheric Particles."
5. Musgrave, F. Kenton: "Quasi-Analytic Error-Bounded Ray Tracing of Procedural Height Fields."
6. Finch, M.: "Effective Water Simulation from Physical Models," in __GPU Gems__. Addison-Wesley, Boston, 2004.
7. Strauss, Walter A.: "Partial Differential Equations: An Introduction." John Wiley & Sons, New York, 1992.
8. Vaisala Thunderstorm: "Frequently Asked Questions About Lightning." http://www.lightningstorm.com/tux/jsp/faq/index.jsp, 2004.