Planetary Atmospheric Scattering
Atmospheric scattering is a complex process that occurs when sunlight interacts with particles in a planetary atmosphere, resulting in visually striking phenomena such as the blue sky and colorful sunsets.
Atmospheric scattering involves intricate interactions between light and the various particles and gases within a planet's atmosphere, and the light transport equation in a participating medium is very difficult to solve when applied to the atmosphere. As a result, earlier works made many compromises to render atmospheric phenomena such as sunsets, sky colors, and aerial perspectives in real time. Eric Bruneton and Fabrice Neyret's approach overcomes these challenges by employing precomputed atmospheric scattering techniques: the method significantly reduces the computational demands commonly associated with atmospheric rendering by using precomputed look-up tables for key components such as transmittance, single scattering, multiple scattering, and ground irradiance.
In this post, we discuss a project Jerry Lu and I completed for the Advanced Computer Graphics course at Rensselaer Polytechnic Institute, building upon Bruneton and Neyret's work. Our project involved experimenting with and implementing their method, which precomputes solutions to the light transport equation so that light scattering can be simulated in real time for arbitrary viewpoints, view directions, and sun directions. We employed precomputed look-up tables (LUTs) for the essential components: transmittance, single and multiple scattering, and ground irradiance. This approach made our implementation highly efficient, maintaining impressive performance while delivering visually striking results.
Due to time constraints, our implementation does not include the complete methodology described in the original paper; specifically, we omitted the illuminance computation as well as the shadows and light-shafts components. Despite these omissions, our implementation still captures the essence of the original work and produces accurate and pleasing renderings of the planetary atmosphere without compromising the overall results.
Atmospheric models
Atmospheric scattering primarily consists of two physical phenomena, Rayleigh and Mie scattering, which are the main contributors to the colors of our sky. Rayleigh scattering is the optical phenomenon predominantly responsible for clear blue skies and vibrant orange-red sunsets. As light passes through the air, it interacts with molecules much smaller than its wavelength, resulting in strongly wavelength-dependent scattering. In contrast, Mie scattering dominates in hazy or overcast weather and creates the distinctive Tyndall effect. It occurs when the size of airborne particles, such as those found in fog, smoke, and dust, is approximately equal to or greater than the wavelength of light. Unlike Rayleigh scattering, Mie scattering is not wavelength-sensitive and has a limited capacity to alter the color of incoming light. A more detailed overview of atmospheric scattering is shown in Figure 1.
Physical model
Incorporating the aforementioned scattering types, we can develop a physical model consisting of air molecules and aerosol particles in a thin spherical layer of exponentially decreasing density between the ground radius Rg and the top atmosphere radius Rt. At any given point, the proportion of light scattered at an angle θ from its incident direction is determined by the product of a scattering coefficient β and a phase function P. The particle density influences β, while P delineates the angular dependency. For the smaller air molecules, both β and P are defined by Rayleigh theory:

βR(h, λ) = (8π³ (n² − 1)²) / (3 N λ⁴) · e^(−h/HR),    PR(θ) = 3/(16π) · (1 + cos²θ)

where h is the altitude, λ is the wavelength, n is the index of refraction of air, N is the molecular density at sea level, and HR is the thickness the atmosphere would have if its density were uniform (the Rayleigh height scale). Similarly, Mie theory defines a scattering coefficient βM and a phase function PM for the larger aerosols, with a smaller height scale HM and a likewise exponentially decreasing density:

βM(h, λ) = βM(0, λ) · e^(−h/HM),    PM(θ) = 3/(8π) · (1 − g²)(1 + cos²θ) / ((2 + g²)(1 + g² − 2g cosθ)^(3/2))

where PM is the Cornette-Shanks approximation with asymmetry factor g.
Aerosols also absorb a fraction of the incident light, measured by an absorption coefficient βa. Combining the absorption and scattering coefficients gives us the extinction coefficient βe = βs + βa (with βe = βs for air molecules). The ozone layer also has a large impact on the appearance of the Earth's atmosphere, and ozone absorption contributes to the sky's blue hue when the sun is near the horizon. We approximate the ozone absorption coefficient as a constant per-wavelength spectrum scaled by a density profile that peaks in the ozone layer and falls off linearly above and below it.
In real-time rendering, we cannot afford to compute light scattering for each individual wavelength. Instead, we approximate the energy distribution across the spectrum using the three RGB components, evaluating every wavelength-dependent term at one representative wavelength per channel. This yields per-channel scattering coefficients βs, extinction coefficients βe, and phase functions P, which we will use for calculating atmospheric scattering in the sections below.
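To make this concrete, here is a minimal Python sketch of these quantities. The sea-level coefficients, height scales, and the asymmetry factor `g` are assumed Earth-like values commonly paired with Bruneton's model, not numbers taken from our implementation:

```python
import math

# Assumed Earth-like constants commonly used with Bruneton's model
# (illustrative values, not taken from our implementation).
BETA_RAYLEIGH_0 = (5.8e-6, 13.5e-6, 33.1e-6)  # sea level, per RGB channel, 1/m
BETA_MIE_0 = 21e-6   # sea level, wavelength-independent, 1/m
H_R = 8000.0         # Rayleigh height scale, m
H_M = 1200.0         # Mie height scale, m

def rayleigh_scattering(h):
    """Rayleigh scattering coefficient at altitude h, per RGB channel."""
    falloff = math.exp(-h / H_R)
    return tuple(b * falloff for b in BETA_RAYLEIGH_0)

def mie_scattering(h):
    """Mie scattering coefficient at altitude h (same for every channel)."""
    return BETA_MIE_0 * math.exp(-h / H_M)

def rayleigh_phase(mu):
    """Rayleigh phase function; mu is the cosine of the scattering angle."""
    return 3.0 / (16.0 * math.pi) * (1.0 + mu * mu)

def mie_phase(mu, g=0.8):
    """Cornette-Shanks approximation of the Mie phase function."""
    num = 3.0 * (1.0 - g * g) * (1.0 + mu * mu)
    den = 8.0 * math.pi * (2.0 + g * g) * (1.0 + g * g - 2.0 * g * mu) ** 1.5
    return num / den
```

Note that the phase functions are normalized so that they integrate to one over the sphere of directions.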
Rendering equation
In the work by Eric Bruneton and Fabrice Neyret, the rendering equation is presented as follows:

L(x, v, s) = L0(x, v, s) + R[L](x, v, s) + S[L](x, v, s)
where L(x, v, s) represents the radiance of light reaching the point x from the viewing direction v when the Sun is in direction s (see Figure 1). In order to better understand the rendering equation, we need to specify a few additional terms:
Assume the ray x + tv extends to its first intersection x0 with either the ground or the top atmosphere boundary. Then T(x, x0) represents the transmittance between x and x0, I[L](x0) the radiance of light reflected at x0, and J[L](y) the radiance of light scattered at a point y towards x. With these notions, the terms in the rendering equation can be expanded as follows: L0 is the attenuated direct sunlight T(x, x0) Lsun reaching x via x0 (non-zero only when v points at the Sun); R[L] = T(x, x0) I[L](x0) denotes the light reflected at x0, attenuated before it reaches x; and S[L] = ∫ T(x, y) J[L](y) dy is the light scattered towards x at each point y between x and x0.
Transmittance
The transmittance represents the proportion of light that remains unabsorbed and unscattered after a beam travels from one point to another. By definition, T depends only on the positions of the two points and is independent of terrain. Therefore, we can solve for T under the assumption of a smooth planetary surface (i.e., all distances from the surface to the sphere's center are equal to the ground radius Rg). Upon determining T, we store it in a precomputed texture that is accessed during the rendering process.
Transmittance model
In terms of computation, the transmittance between two points x and y is the transmittance between x and the nearest intersection x0 of the half-line [x, y) with either the top or bottom atmospheric boundary, divided by the transmittance between y and x0 (or 0 if the segment intersects the ground; see Figure 2). Because the transmittance from x to y and from y to x are the same, it is sufficient to know the transmittance between a point in the atmosphere and points on the top atmosphere boundary to compute the transmittance between arbitrary points. The transmittance then depends on only two parameters, which can be taken as the radius r (the distance from the point to the planet's center) and the cosine of the "view zenith angle", μ = cos θ. Based on the definition and the Beer-Lambert law, solving for the transmittance involves integrating the densities of air molecules, aerosols, and absorbing gases such as ozone along the segment.
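As a rough sketch of this Beer-Lambert integral, the following Python computes the optical depth of one exponentially distributed constituent along a ray to the top boundary, and the resulting transmittance. The radii and height scale are assumed Earth-like values, and the ray is assumed not to hit the ground:

```python
import math

# Assumed Earth-like geometry (meters); rays are assumed not to hit the ground.
R_G = 6360e3   # planet (ground) radius
R_T = 6420e3   # top-of-atmosphere radius
H_R = 8000.0   # Rayleigh height scale

def distance_to_top(r, mu):
    """Distance from a point at radius r, view-zenith cosine mu, to the top boundary."""
    disc = r * r * (mu * mu - 1.0) + R_T * R_T
    return max(0.0, -r * mu + math.sqrt(max(0.0, disc)))

def optical_depth(r, mu, scale_height=H_R, samples=500):
    """Trapezoidal integral of the density exp(-h/H) along the ray to the top."""
    d = distance_to_top(r, mu)
    dt = d / samples
    total = 0.0
    for i in range(samples + 1):
        t = i * dt
        radius = math.sqrt(t * t + 2.0 * r * mu * t + r * r)
        density = math.exp(-(radius - R_G) / scale_height)
        weight = 0.5 if i in (0, samples) else 1.0
        total += weight * density * dt
    return total

def transmittance_to_top(r, mu, beta0):
    """Beer-Lambert transmittance toward the top boundary for one constituent
    with sea-level extinction coefficient beta0 (1/m)."""
    return math.exp(-beta0 * optical_depth(r, mu))
```

A full implementation would sum the optical depths of all constituents (Rayleigh, Mie, ozone), each with its own density profile.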
Transmittance precomputation
To store the calculated transmittance, we need to map the two parameters (r, μ) to texture coordinates (u, v), and perform the inverse mapping when retrieving the solution during the rendering process. The original paper provides a generic mapping that works for any atmosphere and increases the sampling rate near the horizon: v is derived from ρ = √(r² − Rg²), the distance to the horizon, normalized by its maximum value H = √(Rt² − Rg²), while u maps the distance d to the top atmosphere boundary into a value between 0 and 1 between its minimum dmin = Rt − r and maximum dmax = ρ + H (see Figure 2).
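A minimal sketch of such a mapping in Python, assuming Earth-like radii and omitting the texel-center offsets a real implementation needs:

```python
import math

R_G = 6360e3  # assumed ground radius, m
R_T = 6420e3  # assumed top-of-atmosphere radius, m

def transmittance_uv(r, mu):
    """Map (r, mu) to transmittance-texture coordinates in [0, 1]^2."""
    big_h = math.sqrt(R_T * R_T - R_G * R_G)      # horizon distance from the top
    rho = math.sqrt(max(0.0, r * r - R_G * R_G))  # horizon distance from radius r
    # Distance to the top boundary along the direction with cosine mu.
    disc = r * r * (mu * mu - 1.0) + R_T * R_T
    d = -r * mu + math.sqrt(max(0.0, disc))
    d_min = R_T - r        # looking straight up
    d_max = rho + big_h    # grazing the horizon
    u = (d - d_min) / (d_max - d_min) if d_max > d_min else 0.0
    v = rho / big_h
    return u, v
```

Because d changes fastest for nearly horizontal rays, parameterizing by distance rather than by μ directly concentrates samples near the horizon, where the transmittance varies most.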
Transmittance look-up
While rendering, we can get the transmittance between two arbitrary points x and y inside the atmosphere using just two texture look-ups: the transmittance between x and y equals the transmittance between x and the top atmospheric boundary divided by the transmittance between y and the top atmospheric boundary, and vice versa when the ray points the other way. It is worth mentioning that we also need the atmospheric transmittance to the Sun when calculating single scattering and sky irradiance. In that step, we approximate the transmittance to the Sun by integrating over its disc, assuming constant transmittance except below the horizon; we compute it as the transmittance to the top atmosphere boundary multiplied by the visible fraction of the Sun disc. It is important to note that we do not perform the ray-ground intersection test inside this function. Instead, we implement the function so that the caller supplies that information, because results can be inaccurate for rays close to the horizon, where finite precision and rounding errors in floating-point operations cause discrepancies.
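The two-look-up ratio can be sketched as a tiny helper operating on two per-channel look-up results; the clamp guards against rounding pushing the ratio above one:

```python
def transmittance_between(t_x_top, t_y_top):
    """T(x, y) from two look-ups T(x, top) and T(y, top), per RGB channel.
    Assumes y lies between x and the top boundary along the same ray."""
    # Clamp: finite precision can push the ratio slightly above one.
    return tuple(min(1.0, tx / ty) if ty > 0.0 else 0.0
                 for tx, ty in zip(t_x_top, t_y_top))
```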
Single scattering
By single scattering, we mean that the light has undergone exactly one scattering event between leaving the light source and reaching the observer's eye. The single-scattered radiance is responsible for the majority of the color we see because the atmosphere scatters light at a relatively low rate.
Single scattering model
Consider sunlight scattered at a point q by air molecules or aerosol particles before arriving at another point p; the radiance arriving at p is the product of:
- the solar irradiance at the top of the atmosphere,
- the fraction of the sunlight at the top of the atmosphere that reaches q,
- the Rayleigh or Mie scattering coefficient at q,
- the Rayleigh or Mie phase function, and
- the fraction of the light scattered at q towards p that reaches p.
where a total of 4 parameters are needed (see Figure 3):
- r, the distance from p to the center of the sphere.
- μ, the cosine of the angle between the zenith direction at p (the radius through p extended to the atmosphere boundary) and the segment connecting p and q.
- μs, the cosine of the angle between the zenith direction at p and the unit direction vector ωs towards the Sun.
- ν, the cosine of the angle between ωs and the segment connecting p and q.
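Putting the factors and parameters together, here is a coarse, hedged Python sketch of the Rayleigh part of this integral. Aerosols, the ground-intersection case, and the Sun-disc fraction are all omitted, and the constants are assumed Earth-like values rather than our implementation's:

```python
import math

# Assumed Earth-like constants; Rayleigh only, ray assumed not to hit the ground.
R_G, R_T = 6360e3, 6420e3
H_R = 8000.0
BETA_R = (5.8e-6, 13.5e-6, 33.1e-6)   # sea-level Rayleigh coefficients, 1/m
SUN_IRRADIANCE = (1.0, 1.0, 1.0)      # placeholder solar irradiance

def _dist_to_top(r, mu):
    disc = r * r * (mu * mu - 1.0) + R_T * R_T
    return -r * mu + math.sqrt(max(0.0, disc))

def _optical_depth(r, mu, samples=50):
    d = _dist_to_top(r, mu)
    dt = d / samples
    total = 0.0
    for i in range(samples + 1):
        t = i * dt
        radius = math.sqrt(t * t + 2.0 * r * mu * t + r * r)
        w = 0.5 if i in (0, samples) else 1.0
        total += w * math.exp(-(radius - R_G) / H_R) * dt
    return total

def single_scattering(r, mu, mu_s, nu, samples=50):
    """Rayleigh single-scattered radiance toward the viewer.

    r: viewer radius; mu, mu_s: view and sun zenith cosines at the viewer;
    nu: cosine of the angle between the view and sun directions.
    """
    d = _dist_to_top(r, mu)
    dt = d / samples
    phase = 3.0 / (16.0 * math.pi) * (1.0 + nu * nu)
    od_x = _optical_depth(r, mu)
    result = [0.0, 0.0, 0.0]
    for i in range(samples + 1):
        t = i * dt
        radius = math.sqrt(t * t + 2.0 * r * mu * t + r * r)
        mu_q = (r * mu + t) / radius           # view-zenith cosine at the sample
        mu_s_q = (r * mu_s + t * nu) / radius  # sun-zenith cosine at the sample
        density = math.exp(-(radius - R_G) / H_R)
        # Optical depth from the viewer to the sample plus sample to the Sun.
        tau = (od_x - _optical_depth(radius, mu_q)) + _optical_depth(radius, mu_s_q)
        w = 0.5 if i in (0, samples) else 1.0
        for c in range(3):
            trans = math.exp(-BETA_R[c] * tau)
            result[c] += w * SUN_IRRADIANCE[c] * BETA_R[c] * density * trans * phase * dt
    return tuple(result)
```

Even in this toy form, the result exhibits the expected behavior: the blue channel scatters more strongly than the red one.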
Single scattering precomputation
Single scattering is quite expensive to evaluate, and many evaluations are needed to compute multiple scattering due to the recursive nature of the rendering equation. We therefore want to precompute the solution and save the results to a texture, which requires a mapping from the 4 function parameters (r, μ, μs, ν) to texture coordinates. In principle this calls for a 4D texture and a corresponding mapping function to 4D texture coordinates.
The mapping for r roughly follows the same procedure we mentioned in the previous section. However, the mappings for the angle parameters are more complex. The mapping for μ takes the minimal distance to the nearest atmosphere boundary into account to map it to the [0, 1] interval. The mapping for μs, on the other hand, relies on the distance to the top atmosphere boundary and uses a configurable parameter, but still maintains a higher sampling rate near the horizon.
Single scattering look-up
With the help of the precomputed texture generated using the method discussed previously, we can now get the scattering between a point and the nearest atmosphere boundary with two texture look-ups in real time, which guarantees efficient multiple scattering calculations as described in the next section. In particular, we employ two 3D texture look-ups to emulate a single 4D texture look-up with quadrilinear interpolation.
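The idea can be sketched in plain Python on a nested-list "texture" (a hypothetical layout, not our actual shader code): take two trilinear samples at the adjacent slices of the fourth parameter and blend them, which reproduces quadrilinear interpolation:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def sample_4d(table, u, v, w, s):
    """Emulate one quadrilinear 4D look-up with two trilinear 3D look-ups.

    `table` is a nested list indexed as table[i][j][k][l]; the fourth axis
    plays the role of the parameter packed into texture layers. Coordinates
    are given directly in index space for simplicity.
    """
    n4 = len(table[0][0][0])
    s = max(0.0, min(s, n4 - 1.000001))
    l0 = int(s)
    frac = s - l0

    def trilinear(l):
        # Standard trilinear interpolation over the first three axes.
        i0, j0, k0 = int(u), int(v), int(w)
        fu, fv, fw = u - i0, v - j0, w - k0
        def at(i, j, k):
            return table[i][j][k][l]
        c00 = lerp(at(i0, j0, k0), at(i0 + 1, j0, k0), fu)
        c10 = lerp(at(i0, j0 + 1, k0), at(i0 + 1, j0 + 1, k0), fu)
        c01 = lerp(at(i0, j0, k0 + 1), at(i0 + 1, j0, k0 + 1), fu)
        c11 = lerp(at(i0, j0 + 1, k0 + 1), at(i0 + 1, j0 + 1, k0 + 1), fu)
        return lerp(lerp(c00, c10, fv), lerp(c01, c11, fv), fw)

    # Two 3D look-ups at adjacent layers, blended along the fourth axis.
    return lerp(trilinear(l0), trilinear(l0 + 1), frac)
```

On the GPU, the two trilinear samples come from the hardware texture unit, so only the final blend is done in the shader.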
Multiple scattering
By multiple scattering, we mean that the light from the Sun reaches the observer’s eye after undergoing two or more bounces within the atmosphere. These bounces consist of scattering events, where light interacts with atmospheric particles, or reflections off the ground.
Multiple scattering model
Multiple scattering can be broken down into the sum of double scattering, triple scattering, and so on, with each term representing light reaching a point in the atmosphere after exactly two, three, etc., bounces. In addition, each term can be calculated from the previous one: the light arriving at some point p from direction ω after n bounces is an integral over all possible points q of the last bounce, which involves the light arriving at q from any direction after n − 1 bounces.
Unfortunately, the calculation at each scattering order requires a triple integral based on the previous scattering order: one integral over all the points q on the line segment from p to the nearest atmosphere boundary in direction ω, and a nested double integral over all directions at each point q. Performing this calculation naively would be extremely inefficient. As suggested by Eric Bruneton and Fabrice Neyret in their work, we use the following algorithm when dealing with multiple scattering:
# Precompute single scattering in a texture.
if scattering_order >= 2:
    for q in points:
        for omega in directions:
            # Look up the (n-1)-th order scattering texture and
            # compute the light scattered at q towards omega.
    for p in points:
        for omega in directions:
            # Look up the texture computed in the loop above and
            # compute the light scattered at p towards omega.
where the calculations in the first nested loop involve only a double integral (over the directions at each point q), and the calculations in the second involve only a single integral (along the ray from p), since they read from the texture precomputed in the previous step.
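To illustrate the order-by-order scheme, here is a self-contained toy in Python: a 1-D two-stream slab with isotropic scattering (all parameters assumed, unrelated to our actual LUTs), where each order is gathered from a table of the previous order exactly as in the algorithm above:

```python
import math

# Toy 1-D two-stream slab (assumed parameters) illustrating the algorithm:
# each scattering order is computed by integrating source terms looked up
# from the table of the previous order.
TAU = 1.0      # total optical depth of the slab
ALBEDO = 0.9   # single-scattering albedo
N = 64         # number of depth samples
DTAU = TAU / N

def order1():
    """Order-1 'texture': single-scattered radiance per depth and direction."""
    up, down = [0.0] * (N + 1), [0.0] * (N + 1)
    for i in range(N + 1):
        acc = 0.0
        for j in range(i, N + 1):  # gather sunlight scattered below, going up
            acc += 0.5 * ALBEDO * math.exp(-j * DTAU) * math.exp(-(j - i) * DTAU) * DTAU
        up[i] = acc
        acc = 0.0
        for j in range(0, i + 1):  # gather sunlight scattered above, going down
            acc += 0.5 * ALBEDO * math.exp(-j * DTAU) * math.exp(-(i - j) * DTAU) * DTAU
        down[i] = acc
    return up, down

def next_order(prev_up, prev_down):
    """Order-n table from the order-(n-1) table (the two nested loops above)."""
    up, down = [0.0] * (N + 1), [0.0] * (N + 1)
    for i in range(N + 1):
        acc = 0.0
        for j in range(i, N + 1):
            source = 0.5 * ALBEDO * (prev_up[j] + prev_down[j])  # table look-up
            acc += source * math.exp(-(j - i) * DTAU) * DTAU
        up[i] = acc
        acc = 0.0
        for j in range(0, i + 1):
            source = 0.5 * ALBEDO * (prev_up[j] + prev_down[j])
            acc += source * math.exp(-(i - j) * DTAU) * DTAU
        down[i] = acc
    return up, down
```

Each call to `next_order` corresponds to one pass of the algorithm, and the energy in each successive order shrinks, which is why truncating at a small order (four in our case) is acceptable.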
Multiple scattering precomputation
To save computation time when calculating the next order, we must precompute each scattering order in a texture, which requires a mapping from function parameters to texture coordinates. Fortunately, all scattering orders rely on the same parameters as the single scattering described before. Therefore, we can conveniently reuse the mappings defined for single scattering.
Multiple scattering look-up
Likewise, we can simply reuse the same look-up procedure for single scattering to read a value from the precomputed textures for multiple scattering.
Ground irradiance
Ground irradiance, the sunlight received at the planet's surface after a number of bounces, plays an important role in precomputing the higher orders of scattering, where it accounts for the contributions of light paths that bounce off the ground. In this case, we require ground irradiance only for horizontal surfaces at the bottom of the atmosphere of a uniform-albedo spherical planet. Furthermore, during the rendering process, we need to compute the contribution of light paths whose final bounce is on the ground to achieve an accurate result. Unlike the previous case, this requires ground irradiance for varying altitudes and surface normals, and precomputation is desired for efficiency. In line with the original work, we precompute irradiance only for horizontal surfaces at any altitude, using 2D textures as a substitute for the more complex 4D textures required in the general case.
Ground irradiance model
Irradiance is calculated as the integral over a hemisphere of the incident radiance, multiplied by a cosine factor. For the ground irradiance computation, we divide it into direct and indirect components. The direct ground irradiance is determined by the Sun's incident radiance at the top of the atmosphere, multiplied by the transmittance through the atmosphere. Given the small solid angle of the Sun, we can approximate the transmittance as a constant and move it outside the irradiance integral, which should be performed over the visible fraction of the Sun's disc rather than the entire hemisphere. For the indirect ground irradiance, the integral over the hemisphere must be calculated numerically: we integrate, over all directions ω of the hemisphere, the product of the radiance arriving from direction ω after n bounces and the cosine of the angle between ω and the surface normal.
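A hedged sketch of both components in Python; `radiance` is any callable standing in for a look-up into the previous-order scattering texture:

```python
import math

def direct_irradiance(sun_irradiance, transmittance, mu_s):
    """Direct component: solar irradiance attenuated by the atmosphere and
    projected onto a horizontal surface (Sun-disc fraction ignored here)."""
    return sun_irradiance * transmittance * max(0.0, mu_s)

def indirect_irradiance(radiance, n_theta=64, n_phi=128):
    """Numerical hemisphere integral E = integral of L(w) cos(theta) dw,
    using a midpoint rule.

    `radiance(theta, phi)` is any callable returning the radiance arriving
    from zenith angle theta and azimuth phi, standing in for a look-up into
    the previous-order scattering texture."""
    total = 0.0
    d_theta = (math.pi / 2.0) / n_theta
    d_phi = (2.0 * math.pi) / n_phi
    for i in range(n_theta):
        theta = (i + 0.5) * d_theta
        for j in range(n_phi):
            phi = (j + 0.5) * d_phi
            # dw = sin(theta) dtheta dphi; cos(theta) projects onto the normal.
            total += (radiance(theta, phi) * math.cos(theta) * math.sin(theta)
                      * d_theta * d_phi)
    return total
```

As a sanity check, a constant unit radiance over the hemisphere yields an irradiance of π.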
Ground irradiance precomputation
As mentioned in the previous section, the irradiance depends only on r and μs because we precompute the ground irradiance only for horizontal surfaces. Therefore, we only need a mapping from the ground irradiance parameters (r, μs) to texture coordinates (u, v), and a simple affine mapping is sufficient because the ground irradiance function is smooth.
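For illustration, such an affine mapping might look like the following (the radii are assumed values and the texel-center offsets are omitted):

```python
def irradiance_uv(r, mu_s, r_g=6360e3, r_t=6420e3):
    """Affine map from the irradiance parameters (r, mu_s) to texture
    coordinates in [0, 1]^2. mu_s spans [-1, 1]; r spans [r_g, r_t]."""
    u = mu_s * 0.5 + 0.5
    v = (r - r_g) / (r_t - r_g)
    return u, v
```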
Ground irradiance look-up
The ground irradiance look-up is relatively straightforward, as it can be accomplished with just a single texture look-up similar to what we did for transmittance textures.
Precomputed LUTs
After implementing the methodology outlined above, we are able to precompute all necessary Look-Up Tables (LUTs) for real-time rendering. These LUTs appear as follows:
Results and discussion
We have developed two distinct engines to showcase the outcomes of our project: a command-line-based engine and a graphical user interface (GUI) based engine. The first, command-line-based variant generates a two-dimensional image of a predefined scene in real time, with users able to customize elements such as camera angle and position. The second engine, with its graphical interface, offers users an enhanced level of control over the scene, including adjustments to the sun position, color balance, and even the vertex and fragment shaders within the engine. These features let users manipulate a range of aspects, creating a more immersive and interactive experience. Both engines utilize the precomputed LUTs from the atmosphere model we created.
All the renderings here are produced by the CPU renderer, which samples the entire spectrum to give a radiance spectrum per pixel. Based on this radiance spectrum, we can calculate pixel colors in different ways. Unless explicitly stated otherwise, we first integrate the radiance spectrum against the CIE XYZ basis functions, converting it to CIE XYZ luminance and then to linear sRGB luminance. The output images are saved as PNGs with 8 bits per channel, after the luminance is gamma corrected and tone mapped. Figure 4 (a) shows a less accurate result with only 3 radiance samples and no conversion to sRGB. The GPU renderer aims for real-time performance by sacrificing color accuracy, sampling radiance only at the three RGB wavelengths.
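The XYZ-to-sRGB step can be sketched as follows; the matrix is the standard D65 sRGB one, while the gamma exponent of 2.2 is an assumed value for illustration, not necessarily the one our renderer uses:

```python
# Standard CIE XYZ -> linear sRGB (D65) matrix.
XYZ_TO_SRGB = (
    ( 3.2406, -1.5372, -0.4986),
    (-0.9689,  1.8758,  0.0415),
    ( 0.0557, -0.2040,  1.0570),
)

def xyz_to_linear_srgb(x, y, z):
    """Convert a CIE XYZ triple to linear sRGB."""
    return tuple(m[0] * x + m[1] * y + m[2] * z for m in XYZ_TO_SRGB)

def to_8bit(rgb_linear, gamma=2.2):
    """Clamp, gamma-correct, and quantize a linear RGB triple to 8 bits.
    The exponent 2.2 is an assumption, not taken from our renderer."""
    out = []
    for c in rgb_linear:
        c = max(0.0, min(1.0, c)) ** (1.0 / gamma)
        out.append(round(255.0 * c))
    return tuple(out)
```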
Precomputation takes about 200 seconds to generate all the LUTs using 20 threads, with multiple scattering up to the fourth order. Once generated, the LUTs are simply loaded into memory and can be used for rendering immediately. The GPU renderer can read half-precision LUTs and render the atmosphere at more than 144 frames per second, while the camera and the position of the sun are adjusted in real time.
Acknowledgments
This project, conducted for RPI Advanced Computer Graphics Spring 2023, is primarily built upon the prior works of Eric Bruneton and Fabrice Neyret. We express our gratitude to them for supplying valuable references.