Fourier series clearly open the frequency domain as an interesting and useful way of determining how circuits and systems respond to periodic input signals. Can we use similar techniques for nonperiodic signals? What is the response of a filter to a single pulse? Addressing these issues requires a definition for the Fourier spectrum that applies to all signals, periodic or not. This spectrum is calculated by what is known as the Fourier transform.
Let $s_T(t)$ be a periodic signal having period $T$. We want to consider what happens to this signal's spectrum as we let the period become longer and longer. We denote the spectrum for any assumed value of the period by $c_k(T)$. We calculate the spectrum according to the familiar formula
$$c_k(T) = \frac{1}{T}\int_{-T/2}^{T/2} s_T(t)\, e^{-j 2\pi k t / T}\, dt$$
Let's calculate the Fourier transform of the pulse signal $p(t)$, which equals one over the interval $[0, \Delta]$ and zero elsewhere:
$$P(f) = \int_{-\infty}^{\infty} p(t)\, e^{-j2\pi f t}\, dt = \int_{0}^{\Delta} e^{-j2\pi f t}\, dt = \frac{1 - e^{-j2\pi f \Delta}}{j2\pi f} = e^{-j\pi f \Delta}\,\frac{\sin(\pi f \Delta)}{\pi f}$$
Note how closely this result resembles the expression for Fourier series coefficients of the periodic pulse signal.
[link] shows how increasing the period does indeed lead to a continuum of coefficients, and that the Fourier transform does correspond to what the continuum becomes. The quantity $\frac{\sin t}{t}$ has a special name, the sinc (pronounced "sink") function, and is denoted by $\operatorname{sinc}(t)$. Thus, the magnitude of the pulse's Fourier transform equals $|\Delta\operatorname{sinc}(\pi f \Delta)|$.
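As a numerical sanity check of this result, the sketch below (the pulse width $\Delta$ and the grids are illustrative choices, not values from the text) approximates the Fourier integral of the pulse by a trapezoidal sum and compares it with the closed form $\Delta e^{-j\pi f\Delta}\operatorname{sinc}(\pi f\Delta)$:

```python
import numpy as np

# Approximate the Fourier transform of the width-Delta pulse (assumed to be 1
# on [0, Delta] and 0 elsewhere) and compare with the closed form
#   P(f) = Delta * exp(-j*pi*f*Delta) * sin(pi*f*Delta)/(pi*f*Delta).
# Note np.sinc(x) computes sin(pi*x)/(pi*x), so sinc(pi*f*Delta) is np.sinc(f*Delta).
Delta = 0.5                               # pulse width (illustrative)
t = np.linspace(0.0, Delta, 4001)         # integration grid over the pulse
f = np.linspace(-10.0, 10.0, 201)         # frequencies at which to evaluate P(f)

# Trapezoidal approximation of P(f) = integral_0^Delta e^{-j 2 pi f t} dt
vals = np.exp(-2j * np.pi * np.outer(f, t))
P_numeric = 0.5 * (vals[:, :-1] + vals[:, 1:]).sum(axis=1) * (t[1] - t[0])
P_exact = Delta * np.exp(-1j * np.pi * f * Delta) * np.sinc(f * Delta)

assert np.max(np.abs(P_numeric - P_exact)) < 1e-4
```

At $f = 0$ the magnitude equals the pulse area $\Delta$, as the sinc expression predicts.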
The Fourier transform relates a signal's time and frequency domain representations to each other. The direct Fourier transform (or simply the Fourier transform) calculates a signal's frequency domain representation from its time-domain variant ([link]):
$$S(f) = \int_{-\infty}^{\infty} s(t)\, e^{-j2\pi f t}\, dt$$
The inverse Fourier transform ([link]) finds the time-domain representation from the frequency domain:
$$s(t) = \int_{-\infty}^{\infty} S(f)\, e^{+j2\pi f t}\, df$$
Rather than explicitly writing the required integral, we often symbolically express these transform calculations as $\mathcal{F}(s)$ and $\mathcal{F}^{-1}(S)$, respectively.
The differing exponent signs mean that some curious results occur when we use the wrong sign. What is $\mathcal{F}\left(\mathcal{F}\left(s(t)\right)\right)$? In other words, use the wrong exponent sign in evaluating the inverse Fourier transform.
Properties of the Fourier transform and some useful transform pairs are provided in the accompanying tables ([link] and [link]). Especially important among these properties is Parseval's Theorem, which states that power computed in either domain equals the power in the other.
How many Fourier transform operations need to be applied to get the original signal back: $\mathcal{F}\left(\cdots\mathcal{F}\left(s(t)\right)\cdots\right) = s(t)$?
$\mathcal{F}\left(\mathcal{F}\left(s(t)\right)\right) = s(-t)$. We know that
$$\mathcal{F}\left(S(f)\right) = \int_{-\infty}^{\infty} S(f)\, e^{-j2\pi f t}\, df = \int_{-\infty}^{\infty} S(f)\, e^{+j2\pi f (-t)}\, df = s(-t)$$
Therefore, two Fourier transforms applied to $s(t)$ yield $s(-t)$. We need two more to get us back where we started: four Fourier transforms in all.
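This time-reversal property can be checked numerically. The sketch below (the odd test signal $s(t) = t e^{-\pi t^2}$ and the grids are illustrative choices) applies the forward transform twice as a Riemann sum and confirms that the result is $s(-t)$, which for an odd signal equals $-s(t)$:

```python
import numpy as np

# Apply the *forward* Fourier transform twice to an odd, rapidly decaying
# signal and check that the result is the time-reversed signal s(-t) = -s(t).
t = np.linspace(-8.0, 8.0, 801)
dt = t[1] - t[0]
s = t * np.exp(-np.pi * t**2)             # odd test signal (illustrative)

# Forward transform: S(f) = sum_n s(t_n) e^{-j 2 pi f t_n} dt, on the same grid
f = t
S = np.exp(-2j * np.pi * np.outer(f, t)) @ s * dt
# Forward transform again (same exponent sign), evaluated on the time grid
ss = np.exp(-2j * np.pi * np.outer(t, f)) @ S * dt

assert np.allclose(ss.real, -s, atol=1e-6)    # F(F(s))(t) = s(-t) = -s(t)
assert np.max(np.abs(ss.imag)) < 1e-6
```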
Note that the mathematical relationships between the time domain and frequency domain versions of the same signal are termed transforms. We are transforming (in the nontechnical meaning of the word) a signal from one representation to another. We express Fourier transform pairs as $s(t) \leftrightarrow S(f)$. A signal's time and frequency domain representations are uniquely related to each other. A signal thus "exists" in both the time and frequency domains, with the Fourier transform bridging between the two. We can define an information-carrying signal in either the time or frequency domains; it behooves the wise engineer to use the simpler of the two.
A common misunderstanding is that while a signal exists in both the time and frequency domains, a single formula expressing a signal must contain only time or frequency: both cannot be present simultaneously. This situation mirrors what happens with complex amplitudes in circuits: impedances depend on frequency and the time variable cannot appear. As we reveal how communications systems work and are designed, we will define signals entirely in the frequency domain without explicitly finding their time-domain variants; this idea is shown in another module where we define Fourier series coefficients according to the letter to be transmitted. Thus, a signal, though most familiarly defined in the time domain, really can be defined equally well (and sometimes more easily) in the frequency domain.
We will learn that a linear, time-invariant system's output can be most easily calculated by determining the input signal's spectrum, performing a simple calculation in the frequency domain, and inverse transforming the result. Furthermore, understanding communications and information processing systems requires a thorough understanding of signal structure and of how systems work in both the time and frequency domains.
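The transform-multiply-inverse-transform recipe can be sketched for sampled signals using the FFT. In this illustration (the sample rate, test signal, and 50 Hz cutoff are my own choices), an ideal lowpass frequency response removes the high-frequency component of a two-tone input:

```python
import numpy as np

# "Transform, multiply, inverse transform": filter a sampled signal by zeroing
# its high-frequency content in the frequency domain.
fs = 1000.0                               # sample rate in Hz (illustrative)
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 200 * t)

X = np.fft.rfft(x)                        # input spectrum
f = np.fft.rfftfreq(t.size, 1.0 / fs)
H = (f <= 50.0).astype(float)             # ideal lowpass, 50 Hz cutoff
y = np.fft.irfft(X * H, n=t.size)         # output = inverse transform of H(f) X(f)

# The 10 Hz component survives; the 200 Hz component is gone.
assert np.allclose(y, np.sin(2 * np.pi * 10 * t), atol=1e-8)
```

Both tones complete an integer number of cycles in the one-second record, so the DFT represents each exactly in a single frequency bin and the filtering is exact.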
The only difficulty in calculating the Fourier transform of any signal occurs when we have periodic signals (in either domain). Realizing that the Fourier series is a special case of the Fourier transform, we simply calculate the Fourier series coefficients instead, and plot them along with the spectra of nonperiodic signals on the same frequency axis.
| | Time-Domain | Frequency Domain |
|---|---|---|
| Linearity | $a_1 s_1(t) + a_2 s_2(t)$ | $a_1 S_1(f) + a_2 S_2(f)$ |
| Conjugate Symmetry | $s(t)$ real | $S(f) = S^{*}(-f)$ |
| Even Symmetry | $s(t) = s(-t)$ | $S(f) = S(-f)$ |
| Odd Symmetry | $s(t) = -s(-t)$ | $S(f) = -S(-f)$ |
| Scale Change | $s(at)$ | $\frac{1}{\lvert a\rvert} S\!\left(\frac{f}{a}\right)$ |
| Time Delay | $s(t - \tau)$ | $e^{-j2\pi f \tau} S(f)$ |
| Complex Modulation | $e^{j2\pi f_0 t} s(t)$ | $S(f - f_0)$ |
| Amplitude Modulation by Cosine | $s(t)\cos(2\pi f_0 t)$ | $\frac{S(f - f_0) + S(f + f_0)}{2}$ |
| Amplitude Modulation by Sine | $s(t)\sin(2\pi f_0 t)$ | $\frac{S(f - f_0) - S(f + f_0)}{2j}$ |
| Differentiation | $\frac{d s(t)}{dt}$ | $j2\pi f S(f)$ |
| Integration | $\int_{-\infty}^{t} s(\alpha)\, d\alpha$ | $\frac{1}{j2\pi f} S(f)$ if $S(0) = 0$ |
| Multiplication by $t$ | $t\, s(t)$ | $\frac{1}{-j2\pi}\frac{dS(f)}{df}$ |
| Area | $\int_{-\infty}^{\infty} s(t)\, dt$ | $S(0)$ |
| Value at Origin | $s(0)$ | $\int_{-\infty}^{\infty} S(f)\, df$ |
| Parseval's Theorem | $\int_{-\infty}^{\infty} \lvert s(t)\rvert^{2}\, dt$ | $\int_{-\infty}^{\infty} \lvert S(f)\rvert^{2}\, df$ |
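Parseval's Theorem can be verified numerically for the pulse signal. The sketch below (the pulse width and the frequency grid limits are illustrative choices) computes the pulse's energy in the time domain, which is simply $\Delta$, and compares it with the integral of its squared spectral magnitude $|\Delta\operatorname{sinc}(\pi f\Delta)|^2$:

```python
import numpy as np

# Check Parseval's Theorem for the width-Delta pulse: the time-domain energy,
# integral of |p(t)|^2 = Delta, must equal the frequency-domain energy,
# integral of |Delta sinc(pi f Delta)|^2.
Delta = 1.0
f = np.linspace(-400.0, 400.0, 800001)     # wide grid; sinc^2 tails decay as 1/f^2
df = f[1] - f[0]

# np.sinc(x) = sin(pi x)/(pi x), so sinc(pi f Delta) is np.sinc(f * Delta)
energy_freq = np.sum((Delta * np.sinc(f * Delta))**2) * df
energy_time = Delta                        # integral of 1^2 over the pulse's support

assert abs(energy_freq - energy_time) < 1e-2
```

The small residual comes from truncating the slowly decaying $\operatorname{sinc}^2$ tails at finite frequency.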
In communications, a very important operation on a signal $s(t)$ is to amplitude modulate it. Using this operation more as an example rather than elaborating the communications aspects here, we want to compute the Fourier transform (the spectrum) of
$$(1 + s(t))\cos(2\pi f_c t)$$
Thus,
$$(1 + s(t))\cos(2\pi f_c t) = \cos(2\pi f_c t) + s(t)\cos(2\pi f_c t)$$
For the spectrum of $\cos(2\pi f_c t)$, we use the Fourier series. Its period is $1/f_c$, and its only nonzero Fourier coefficients are $c_{\pm 1} = \frac{1}{2}$. The second term is not periodic unless $s(t)$ has the same period as the sinusoid. Using Euler's relation, the spectrum of the second term can be derived from the inverse transform
$$s(t) = \int_{-\infty}^{\infty} S(f)\, e^{+j2\pi f t}\, df$$
Using Euler's relation for the cosine,
$$s(t)\cos(2\pi f_c t) = \frac{1}{2}\int_{-\infty}^{\infty} S(f)\, e^{+j2\pi (f + f_c) t}\, df + \frac{1}{2}\int_{-\infty}^{\infty} S(f)\, e^{+j2\pi (f - f_c) t}\, df = \int_{-\infty}^{\infty} \frac{S(f - f_c) + S(f + f_c)}{2}\, e^{+j2\pi f t}\, df$$
Exploiting the uniqueness property of the Fourier transform, we have
$$\mathcal{F}\left(s(t)\cos(2\pi f_c t)\right) = \frac{S(f - f_c) + S(f + f_c)}{2}$$
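This modulation property can be checked on a signal with a known transform. In the sketch below (the Gaussian test signal, carrier frequency, and grids are illustrative choices), $s(t) = e^{-\pi t^2}$ is used because its transform is the closed form $S(f) = e^{-\pi f^2}$:

```python
import numpy as np

# Verify F(s(t) cos(2 pi fc t)) = (S(f - fc) + S(f + fc)) / 2 for a Gaussian,
# whose Fourier transform is known exactly: s(t)=e^{-pi t^2} <-> S(f)=e^{-pi f^2}.
fc = 5.0                                   # carrier frequency (illustrative)
t = np.linspace(-8.0, 8.0, 1601)
f = np.linspace(-12.0, 12.0, 241)

s_mod = np.exp(-np.pi * t**2) * np.cos(2 * np.pi * fc * t)
# Riemann-sum approximation of the Fourier integral of the modulated signal
X = np.exp(-2j * np.pi * np.outer(f, t)) @ s_mod * (t[1] - t[0])
X_theory = (np.exp(-np.pi * (f - fc)**2) + np.exp(-np.pi * (f + fc)**2)) / 2

assert np.max(np.abs(X - X_theory)) < 1e-6
```

The computed spectrum shows the baseband Gaussian split into two half-height copies centered at $\pm f_c$, exactly as the derivation predicts.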
Note how in this figure the signal is defined in the frequency domain. To find its time domain representation, we simply use the inverse Fourier transform.
What is the signal that corresponds to the spectrum shown in the upper panel of [link]?
The signal is the inverse Fourier transform of the triangularly shaped spectrum, and equals
$$s(t) = W\left(\frac{\sin(\pi W t)}{\pi W t}\right)^{2} = W\operatorname{sinc}^{2}(\pi W t)$$
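A numerical check of this answer (assuming, as in the figure, a unit-height triangular spectrum on $[-W, W]$; the value of $W$ and the grids are illustrative choices):

```python
import numpy as np

# Inverse-transform a unit-height triangular spectrum of half-width W and
# compare with the stated answer s(t) = W (sin(pi W t) / (pi W t))^2.
W = 2.0
f = np.linspace(-W, W, 16001)
t = np.linspace(-4.0, 4.0, 161)

S = 1.0 - np.abs(f) / W                    # triangular spectrum, zero outside [-W, W]
# Riemann-sum approximation of s(t) = integral S(f) e^{+j 2 pi f t} df
s_num = np.exp(2j * np.pi * np.outer(t, f)) @ S * (f[1] - f[0])
s_exact = W * np.sinc(W * t)**2            # np.sinc(x) = sin(pi x)/(pi x)

assert np.max(np.abs(s_num - s_exact)) < 1e-4
```

At $t = 0$ the result equals the area under the triangle, $W$, matching $W\operatorname{sinc}^2(0) = W$.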
What is the power in $x(t)$, the amplitude-modulated signal? Try the calculation in both the time and frequency domains.
The result is most easily found in the spectrum's formula: the power in the signal-related part of $x(t)$ is half the power of the signal $s(t)$.
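The factor of one-half can be illustrated numerically. In the sketch below (the Gaussian test signal and carrier are my own choices, and energy stands in for power since the test signal has finite energy), modulating by a cosine whose frequency far exceeds the signal's bandwidth halves the energy:

```python
import numpy as np

# Check that cosine modulation halves a baseband signal's energy when the
# carrier frequency greatly exceeds the signal's bandwidth.
fc = 5.0                                   # carrier frequency (illustrative)
t = np.linspace(-8.0, 8.0, 16001)
dt = t[1] - t[0]
s = np.exp(-np.pi * t**2)                  # baseband test signal (illustrative)

energy_s = np.sum(s**2) * dt
energy_mod = np.sum((s * np.cos(2 * np.pi * fc * t))**2) * dt

assert abs(energy_mod - energy_s / 2) < 1e-6
```

The identity $\cos^2\theta = \frac{1}{2}(1 + \cos 2\theta)$ explains the result: the $\cos 2\theta$ term contributes almost nothing because it oscillates far faster than the slowly varying signal.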
In this example, we call the signal $s(t)$ a baseband signal because its power is contained at low frequencies. Signals such as speech and the Dow Jones averages are baseband signals. The baseband signal's bandwidth equals $W$, the highest frequency at which it has power. Since $x(t)$'s spectrum is confined to a frequency band not close to the origin (we assume $f_c \gg W$), we have a bandpass signal. The bandwidth of a bandpass signal is not its highest frequency, but the range of positive frequencies where the signal has power. Thus, in this example, the bandwidth is $2W$. Why a signal's bandwidth should depend on its spectral shape will become clear once we develop communications systems.