Signals: Random Signal Analysis
Unlike deterministic signals, which can be precisely predicted, random signals are fundamentally unpredictable and must be described using the tools of probability and statistics. Whether analyzing communication noise, financial market fluctuations, or seismic data, you cannot work with a single equation; you must characterize the signal's average behavior and its inherent randomness. This field provides the mathematical framework to quantify uncertainty, predict average system responses, and design filters to extract signals from noise.
Statistical Description of Random Processes
A random process is not a single signal but an entire ensemble, or collection, of all possible signal realizations. For example, imagine recording the voltage across a resistor due to thermal noise every day at 9 AM. Each day's recording is one realization. The collection of all possible daily recordings is the random process.
To describe it, we use statistical moments. The most fundamental are the mean and mean-squared value. The mean function, $\mu_X(t) = E[X(t)]$, is the average value across the ensemble at a specific time instant $t$. The variance, $\sigma_X^2(t) = E[(X(t) - \mu_X(t))^2]$, measures the spread or power of the fluctuations around the mean at time $t$. The square root of the variance is the standard deviation. For many engineering analyses, particularly involving power, the mean-squared value $E[X^2(t)]$ is crucial, as it represents the total average power (for a signal with units like volts, across a 1-ohm resistor).
Ensemble Averages and Correlation
Calculating expectations like the mean requires ensemble averaging. You take the value at a fixed time from every possible realization in the process, sum them, and divide by the number of realizations. This is distinct from time-averaging a single signal.
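The distinction can be made concrete with a short simulation. The sketch below builds a hypothetical ensemble (Gaussian noise with a DC offset; the sizes, seed, and offset are illustrative assumptions, not from the text) and computes both kinds of average:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: 2000 realizations, each 500 samples of zero-mean
# Gaussian noise riding on a DC offset of 1.5 (all values are illustrative).
n_realizations, n_samples = 2000, 500
offset = 1.5
ensemble = offset + rng.normal(0.0, 1.0, size=(n_realizations, n_samples))

# Ensemble average: fix a time index, average across ALL realizations.
t0 = 100
ensemble_mean_at_t0 = ensemble[:, t0].mean()

# Time average: fix ONE realization, average across time.
time_mean_of_first = ensemble[0, :].mean()

print(f"ensemble mean at t0: {ensemble_mean_at_t0:.3f}")
print(f"time mean of one realization: {time_mean_of_first:.3f}")
```

Both averages come out near 1.5 here, but only because this particular process happens to be ergodic; in general, a time average over one realization need not equal the ensemble mean.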
A far more revealing measure is the autocorrelation function. It quantifies the "self-similarity" of a process across two different times. For a random process $X(t)$, it is defined as $$R_X(t_1, t_2) = E[X(t_1)\,X(t_2)].$$ This expectation tells you how correlated the signal is with a delayed version of itself. If $R_X(t_1, t_2)$ is large, knowing the value at $t_1$ gives you a good estimate of the value at $t_2$. If it is near zero, the two time samples are largely unrelated. This function is central to defining a crucial simplifying property: wide-sense stationarity.
Wide-Sense Stationarity (WSS)
A random process is wide-sense stationary if two conditions hold. First, its mean is constant for all time: $E[X(t)] = \mu_X$, a constant. Second, its autocorrelation function depends only on the time difference $\tau = t_2 - t_1$, not on the absolute times. So for a WSS process, $$R_X(t_1, t_2) = R_X(\tau).$$ This is a powerful simplification. It means the process's statistical properties are shift-invariant in time. Most man-made signals and many natural phenomena are modeled as WSS, at least over reasonable observation intervals. For a WSS process, the variance is also constant and given by $\sigma_X^2 = R_X(0) - \mu_X^2$.
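The second WSS condition can be checked empirically: estimate the correlation at a fixed lag by ensemble averaging at several different absolute times and see that the estimates agree. A minimal sketch, assuming a hypothetical stationary process built by smoothing white noise with a short moving average (all parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical WSS process: white Gaussian noise passed through a 5-tap
# moving-average filter (filtering white noise yields a stationary output).
n_realizations, n_samples = 5000, 300
white = rng.normal(size=(n_realizations, n_samples))
h = np.ones(5) / 5.0
x = np.apply_along_axis(lambda r: np.convolve(r, h, mode="same"), 1, white)

# Estimate R_X(t, t + lag) by ensemble averaging at different absolute times t.
# For a WSS process these estimates should all be (approximately) equal.
lag = 2
for t in (50, 150, 250):
    r_hat = np.mean(x[:, t] * x[:, t + lag])
    print(f"t = {t:3d}: R_hat = {r_hat:.4f}")
```

For this filter the theoretical value at lag 2 is $\sum_i h_i h_{i+2} = 3/25 = 0.12$, independent of $t$, which is what the three estimates converge toward.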
Power Spectral Density (PSD) and the Wiener-Khinchin Theorem
While autocorrelation describes temporal relationships, engineers often need to know how signal power is distributed across frequency. The Power Spectral Density (PSD), denoted $S_X(f)$, provides exactly this. It tells you the average power per unit frequency (e.g., watts per hertz).
The profound link between the time and frequency-domain descriptions of a WSS process is given by the Wiener-Khinchin theorem. It states that for a WSS process, the PSD is the Fourier transform of its autocorrelation function: $$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j 2\pi f \tau}\, d\tau.$$ Conversely, the autocorrelation is the inverse Fourier transform of the PSD: $$R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j 2\pi f \tau}\, df.$$ This theorem is the cornerstone of random signal analysis in frequency. It means you can find the average power in a frequency band by integrating the PSD over that band. The total average power is $E[X^2(t)] = R_X(0) = \int_{-\infty}^{\infty} S_X(f)\, df$.
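The theorem can be sanity-checked numerically on a classic textbook pair (an assumption for illustration, not a process from this text): $R_X(\tau) = e^{-a|\tau|}$ has $S_X(f) = 2a/(a^2 + (2\pi f)^2)$. A sketch that approximates the forward transform by a Riemann sum:

```python
import numpy as np

# Classic transform pair used for the check (illustrative choice of a):
#   R_X(tau) = exp(-a*|tau|)   <-->   S_X(f) = 2a / (a^2 + (2*pi*f)^2)
a = 3.0
tau = np.linspace(-40.0, 40.0, 400001)   # wide enough that R has decayed
dt = tau[1] - tau[0]
R = np.exp(-a * np.abs(tau))

def psd_numeric(f):
    """Approximate S_X(f) = integral of R_X(tau)*exp(-j*2*pi*f*tau) d tau."""
    return float(np.sum(R * np.exp(-1j * 2 * np.pi * f * tau)).real * dt)

for f in (0.0, 0.5, 2.0):
    exact = 2 * a / (a**2 + (2 * np.pi * f) ** 2)
    print(f"f = {f:.1f}: numeric = {psd_numeric(f):.5f}, exact = {exact:.5f}")
```

The numeric and closed-form values agree to a few decimal places, and $R_X(0) = 1$ equals the total area under $S_X(f)$, consistent with the total-power relation above.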
Random Signals Through LTI Systems
A key application is predicting how a random input signal is transformed by a Linear Time-Invariant (LTI) system. If a WSS process $X(t)$ with PSD $S_X(f)$ is input to an LTI system with impulse response $h(t)$ and frequency response $H(f)$, the output $Y(t)$ is also WSS.
The analysis is elegant in both domains. In the frequency domain, the output PSD is simply the input PSD scaled by the squared magnitude of the system's frequency response: $$S_Y(f) = |H(f)|^2\, S_X(f).$$ In the time domain, the output autocorrelation is related through convolution: $R_Y(\tau) = h(\tau) * h(-\tau) * R_X(\tau)$. The mean of the output is $\mu_Y = \mu_X H(0)$. These relationships allow you to design filters to shape noise or predict the power of a signal after amplification, filtering, or transmission.
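A quick discrete-time sketch of the power prediction: for white input noise, integrating $|H(f)|^2 S_X(f)$ reduces (by Parseval's relation) to $\sigma^2 \sum_n h[n]^2$. The filter, noise level, and sample count below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# White noise (flat PSD, total power sigma^2) through an 8-tap FIR averager.
# For white input, predicted output power = sigma^2 * sum(h^2).
sigma = 2.0
x = rng.normal(0.0, sigma, size=1_000_000)
h = np.ones(8) / 8.0                      # illustrative moving-average filter

y = np.convolve(x, h, mode="valid")

predicted = sigma**2 * np.sum(h**2)       # = 4 * (8 / 64) = 0.5
measured = y.var()
print(f"predicted output power: {predicted:.4f}")
print(f"measured  output power: {measured:.4f}")
```

The measured output power matches the $|H(f)|^2$-weighted prediction; using $H(f)$ without squaring would predict the wrong value by a large factor.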
Common Pitfalls
- Confusing Ensemble and Time Averages: A common error is to compute a time average from a single realization and assume it equals the ensemble mean. They are only guaranteed to be equal for a subclass of WSS processes called ergodic processes. Always verify ergodicity before using time averages to estimate ensemble properties.
- Misapplying the Wiener-Khinchin Theorem: This theorem applies only to wide-sense stationary processes. Attempting to take the Fourier transform of the autocorrelation function of a non-stationary process will not yield a valid PSD in the standard sense.
- Forgetting the Mean in Power Calculations: The total average power is $E[X^2] = \sigma_X^2 + \mu_X^2$. The AC power (power of the fluctuating part) is the variance $\sigma_X^2$. Using the total power $E[X^2]$ alone as the AC power when the mean is non-zero is incorrect.
- Incorrect PSD Scaling through Systems: Remember that the system scales the PSD by $|H(f)|^2$, not the spectrum itself. Writing $S_Y(f) = H(f)\, S_X(f)$ is a critical mistake. The magnitude must be squared because the PSD is a power measure.
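The DC/AC power split from the third pitfall is easy to verify numerically. A minimal sketch, assuming an illustrative signal with mean 3 and unit variance, so that total power $= \mu^2 + \sigma^2 = 10$:

```python
import numpy as np

rng = np.random.default_rng(3)

# Signal with a non-zero mean: total power = DC power + AC power.
mu, sigma = 3.0, 1.0                       # illustrative values
x = mu + rng.normal(0.0, sigma, size=500_000)

total_power = np.mean(x**2)                # E[X^2]
dc_power = np.mean(x) ** 2                 # mu^2
ac_power = np.var(x)                       # sigma^2

print(f"total = {total_power:.3f}, dc = {dc_power:.3f}, ac = {ac_power:.3f}")
```

The estimates come out near 10, 9, and 1 respectively, confirming $E[X^2] = \mu_X^2 + \sigma_X^2$; reporting 10 as the "AC power" would overstate the fluctuation power tenfold.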
Summary
- Random signals are described probabilistically as random processes, requiring statistical measures like the mean $\mu_X(t)$, variance $\sigma_X^2(t)$, and autocorrelation function $R_X(t_1, t_2)$.
- Wide-sense stationarity is a key practical assumption where the mean is constant and the autocorrelation depends only on the time lag $\tau$, simplifying analysis dramatically.
- The Wiener-Khinchin theorem establishes that for a WSS process, the Power Spectral Density (PSD) is the Fourier transform of its autocorrelation function, linking time-domain correlation to frequency-domain power distribution.
- When a WSS random signal passes through an LTI system, the output PSD is the input PSD multiplied by $|H(f)|^2$, a fundamental result for analyzing noise in engineered systems.
- Avoid critical errors by distinguishing between ensemble and time averages, ensuring stationarity before applying key theorems, and correctly accounting for the mean in power calculations.