Chapter 12: Monte Carlo Integration
Monte Carlo integration is crucial in rendering: it uses random sampling to approximate complex lighting effects such as global illumination and soft shadows by tracing randomly chosen light paths. Because it only requires point evaluations of the integrand, it can estimate how light interacts with surfaces even in the high-dimensional, intricate integrals that arise in realistic scenes.
Probability Theory Basics
Probability Theory is the branch of mathematics that deals with randomness and uncertainty. In ray tracing, we utilize random sampling to estimate lighting, color, and material properties.
Probability Density Function (PDF)
A Probability Density Function (PDF) describes the relative likelihood of a random variable taking on a particular value. For a continuous random variable \(X\), the PDF \(p(x)\) must be non-negative everywhere and satisfy:
\[
\int_{-\infty}^{\infty} p(x) \, dx = 1
\]
This means that the total area under the PDF curve equals 1.
Example: PDF of Uniform Distribution
For a uniform distribution defined over the interval \([a, b]\):
\[
p(x) = \frac{1}{b - a} \quad \text{for } a \leq x \leq b
\]
This means that every value in this range has an equal probability of being sampled.
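As a quick sketch, the uniform PDF above can be written as a small JavaScript helper (the names `uniformPdf` and `pdfArea` are illustrative, not from the text), along with a crude Riemann-sum check that the area under the curve is 1:

```javascript
// Hypothetical helper: PDF of a uniform distribution on [a, b].
// Returns 0 outside the interval so the total area is exactly 1.
function uniformPdf(x, a, b) {
  return (x >= a && x <= b) ? 1 / (b - a) : 0;
}

// Rough numerical check that the area under the PDF is 1,
// using a midpoint Riemann sum over [a, b].
function pdfArea(a, b, steps = 1000) {
  const dx = (b - a) / steps;
  let area = 0;
  for (let i = 0; i < steps; i++) {
    area += uniformPdf(a + (i + 0.5) * dx, a, b) * dx;
  }
  return area;
}

console.log(pdfArea(2, 5)); // ≈ 1
```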
Cumulative Distribution Function (CDF)
The Cumulative Distribution Function (CDF), \(F(x)\), gives the probability that the random variable \(X\) is less than or equal to \(x\):
\[
F(x) = \int_{-\infty}^{x} p(t) \, dt
\]
Example: CDF for Uniform Distribution
For the uniform distribution defined above, the CDF can be expressed as:
\[
F(x) = \begin{cases}
0 & x < a \\
\frac{x-a}{b-a} & a \leq x < b \\
1 & x \geq b
\end{cases}
\]
This CDF shows that for any value of \(x\) less than \(a\), the probability is 0; for \(x\) in the range \([a, b]\), the probability increases linearly, and for \(x\) greater than \(b\), the probability is 1.
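The piecewise CDF translates directly into code. A minimal sketch (the name `uniformCdf` is an assumption for illustration):

```javascript
// Hypothetical helper: CDF of a uniform distribution on [a, b],
// following the piecewise definition above.
function uniformCdf(x, a, b) {
  if (x < a) return 0;          // no mass below a
  if (x >= b) return 1;         // all mass at or above b
  return (x - a) / (b - a);     // linear ramp on [a, b)
}

console.log(uniformCdf(3.5, 2, 5)); // 0.5: halfway through [2, 5]
```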
The Monte Carlo Estimator
The Monte Carlo estimator is a statistical method for estimating the value of an integral. It relies on random sampling to approximate integrals of functions over a domain \(D\).
Monte Carlo Estimator Equation
The integral \(I\) of a function \(f(x)\) over a domain \(D\) can be estimated using \(N\) random samples \(x_i\) drawn from a probability density function \(p(x)\):
\[
I \approx \frac{1}{N} \sum_{i=1}^{N} \frac{f(x_i)}{p(x_i)}
\]
This equation provides a way to compute the integral by weighting the function value \(f(x_i)\) by the inverse of the probability density \(p(x_i)\) at that point, which corrects for the likelihood of sampling from the distribution.
Example Code Snippet
Here's how to implement a Monte Carlo estimator in JavaScript:
```javascript
function monteCarloEstimator(f, p, numSamples) {
  let sum = 0;
  for (let i = 0; i < numSamples; i++) {
    const x = sampleFromDistribution(p); // Sample from the distribution
    sum += f(x) / p(x);                  // Weight the function value by the inverse PDF
  }
  return sum / numSamples; // Average over all samples
}

// Sample from the distribution; here we assume p(x) is uniform over [0, 1]
function sampleFromDistribution(p) {
  return Math.random();
}

// Example function to integrate: f(x) = x^2; its integral over [0, 1] is 1/3
function f(x) {
  return x * x;
}

// Run the estimator with p(x) = 1, the uniform PDF on [0, 1]
const numSamples = 10000;
const result = monteCarloEstimator(f, x => 1, numSamples);
console.log("Estimated integral:", result); // Should approximate 1/3
```
In this example, we define a function \(f(x) = x^2\) and use the Monte Carlo estimator to approximate the integral over the interval \([0, 1]\).
Sampling Random Variables
Sampling Techniques
Sampling is the process of selecting points from a probability distribution. Various techniques can be employed to sample random variables effectively.
Inverse Transform Sampling is a widely used method where we first sample from a uniform distribution and then transform that sample using the inverse of the CDF.
If \(u\) is drawn from \(U(0, 1)\), then the corresponding sample from the distribution with CDF \(F\) can be computed as:
\[
x = F^{-1}(u)
\]
For a uniform distribution over \([a, b]\):
1. Generate a random value \(u \sim U(0, 1)\).
2. Compute \(x = a + (b - a) \cdot u\).
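The two steps above can be sketched in JavaScript, using `Math.random()` as the \(U(0, 1)\) source. The exponential example is an extra illustration (not from the text) of inverting a nontrivial CDF:

```javascript
// Inverse transform sampling for a uniform distribution on [a, b]:
// the inverse CDF is x = a + (b - a) * u.
function sampleUniform(a, b) {
  const u = Math.random(); // u ~ U(0, 1)
  return a + (b - a) * u;
}

// Illustrative extra case: exponential distribution with rate lambda.
// Its CDF is F(x) = 1 - e^(-lambda * x), so F^{-1}(u) = -ln(1 - u) / lambda.
function sampleExponential(lambda) {
  const u = Math.random();
  return -Math.log(1 - u) / lambda;
}

console.log(sampleUniform(2, 5));     // a value in [2, 5)
console.log(sampleExponential(1.5));  // a non-negative value
```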