[link] shows how this
sequence of signals portrays the signal more accurately
as more terms are added.
Fourier Series spectrum of a half-wave rectified sine wave
The Fourier series spectrum of a half-wave rectified sinusoid
is shown in the upper portion. The index indicates the
multiple of the fundamental frequency at which the signal has
energy. The cumulative effect of adding terms to the Fourier
series for the half-wave rectified sine wave is shown in the
bottom portion. The dashed line is the actual signal, with
the solid line showing the finite series approximation to the
indicated number of terms.
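The bottom panel of that figure can be reproduced with a short numerical sketch. The following is our own construction, not code from the text: the period T = 1, the grid sizes, and the name partial_sum are arbitrary choices, and the Fourier coefficients are estimated by averaging over one sampled period rather than taken from closed-form expressions.

```python
import numpy as np

# Half-wave rectified sinusoid with period T = 1 (an arbitrary choice).
T = 1.0
t = np.linspace(0.0, 2 * T, 2000)                      # two periods, for plotting
s = np.where(np.sin(2 * np.pi * t / T) > 0,
             np.sin(2 * np.pi * t / T), 0.0)

# One period sampled finely, used to estimate the Fourier coefficients.
tp = np.linspace(0.0, T, 10000, endpoint=False)
sp = np.where(np.sin(2 * np.pi * tp / T) > 0,
              np.sin(2 * np.pi * tp / T), 0.0)

def partial_sum(K):
    """Constant term plus the first K harmonics of the Fourier series."""
    approx = np.full_like(t, sp.mean())                 # a0, the average value
    for k in range(1, K + 1):
        ak = 2 * np.mean(sp * np.cos(2 * np.pi * k * tp / T))
        bk = 2 * np.mean(sp * np.sin(2 * np.pi * k * tp / T))
        approx += (ak * np.cos(2 * np.pi * k * t / T)
                   + bk * np.sin(2 * np.pi * k * t / T))
    return approx

# The approximation hugs the signal more closely as terms are added.
for K in (1, 2, 4, 8):
    print(f"K = {K}: worst-case deviation = {np.max(np.abs(partial_sum(K) - s)):.4f}")
```

Averaging the product of the signal and each sinusoid over one uniformly sampled period approximates the defining integrals of the coefficients, which is all the construction requires.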
We need to assess quantitatively the accuracy of the
Fourier series approximation so that we can judge how rapidly
the series approaches the signal. When we use a
K-term series, the error, the difference between
the signal and the K-term series, corresponds to the unused terms from
the series:

$$\epsilon_K(t) = \sum_{k=K+1}^{\infty}\left(a_k \cos\frac{2\pi k t}{T} + b_k \sin\frac{2\pi k t}{T}\right)$$
To find the rms error, we must square this expression and
integrate it over a period. Again, the integral of most
cross-terms is zero, leaving

$$\mathrm{rms}\left(\epsilon_K\right) = \sqrt{\frac{1}{2}\sum_{k=K+1}^{\infty}\left(a_k^2 + b_k^2\right)}$$
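As a rough check on this expression, the sketch below (our own, with K = 4 and a 200-harmonic cutoff for the tail chosen arbitrarily) computes the rms error of a four-term approximation of the half-wave rectified sinusoid two ways: directly from the error signal, and from the unused coefficients.

```python
import numpy as np

# Half-wave rectified sinusoid over one period (T = 1), sampled uniformly.
t = np.linspace(0.0, 1.0, 20000, endpoint=False)
s = np.where(np.sin(2 * np.pi * t) > 0, np.sin(2 * np.pi * t), 0.0)

def coeffs(k):
    """Trigonometric Fourier coefficients a_k, b_k, estimated by averaging."""
    return (2 * np.mean(s * np.cos(2 * np.pi * k * t)),
            2 * np.mean(s * np.sin(2 * np.pi * k * t)))

K = 4
approx = np.full_like(t, s.mean())                      # a0 term
for k in range(1, K + 1):
    ak, bk = coeffs(k)
    approx += ak * np.cos(2 * np.pi * k * t) + bk * np.sin(2 * np.pi * k * t)

direct = np.sqrt(np.mean((s - approx) ** 2))            # rms of the error signal
tail = np.sqrt(sum(0.5 * (coeffs(k)[0] ** 2 + coeffs(k)[1] ** 2)
                   for k in range(K + 1, 200)))         # sqrt of half the unused power
print(direct, tail)                                     # the two values agree closely
print(direct / np.sqrt(np.mean(s ** 2)))                # normalized by the signal's rms value
```

The last line is the normalized rms error plotted in the figure below; for four terms it comes out to a few percent, consistent with the discussion that follows.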
[link] shows how the error in the
Fourier series for the half-wave rectified sinusoid decreases as
more terms are incorporated. In particular, the use of four
terms, as shown in the bottom plot of [link], has an rms error (relative
to the rms value of the signal) of about 3%. The Fourier series
in this case converges quickly to the signal.
Approximation error for a half-wave rectified sinusoid
The rms error calculated according to
[link]
is shown as a function of the number of terms in the
series for the half-wave rectified sinusoid.
The error has been normalized by the rms value of the
signal.
We can look at [link] to
see the power spectrum and the rms approximation error for the
square wave.
Power spectrum and approximation error for a square wave
The upper plot shows the power spectrum of the square wave,
and the lower plot the rms error of the finite-length
Fourier series approximation to the square wave. The
asterisk denotes the rms error when the number of terms
in the Fourier series equals 99.
Because the Fourier coefficients decay more slowly here than for
the half-wave rectified sinusoid, the rms error is not
decreasing quickly. Said another way, the square wave's
spectrum contains more power at higher frequencies than does the
half-wave rectified sinusoid's. This difference between the two
Fourier series results because the half-wave rectified
sinusoid's Fourier coefficients are proportional to 1/k²
while those of the square wave are proportional to
1/k. In fact, after 99 terms of the square wave's
approximation, the error is bigger than after 10 terms of the
approximation for the half-wave rectified sinusoid.
Mathematicians have shown that no signal has an rms
approximation error that decays more slowly than it does for the
square wave.
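A quick numerical comparison bears this out. The sketch below is our own (the grid size, the term counts, and the name normalized_rms_error are arbitrary choices); it computes the normalized rms error for 99 terms of the square wave's series and for 10 terms of the half-wave rectified sinusoid's.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 50000, endpoint=False)
square = np.sign(np.sin(2 * np.pi * t))                        # amplitude +/- 1
halfwave = np.where(np.sin(2 * np.pi * t) > 0, np.sin(2 * np.pi * t), 0.0)

def normalized_rms_error(s, K):
    """rms of (signal minus K-term series), divided by the signal's rms value."""
    approx = np.full_like(t, s.mean())
    for k in range(1, K + 1):
        ak = 2 * np.mean(s * np.cos(2 * np.pi * k * t))
        bk = 2 * np.mean(s * np.sin(2 * np.pi * k * t))
        approx += ak * np.cos(2 * np.pi * k * t) + bk * np.sin(2 * np.pi * k * t)
    return np.sqrt(np.mean((s - approx) ** 2)) / np.sqrt(np.mean(s ** 2))

print("square wave, K = 99:", normalized_rms_error(square, 99))
print("half-wave,   K = 10:", normalized_rms_error(halfwave, 10))
```

The square wave's error after 99 terms remains several times larger than the half-wave rectified sinusoid's after only 10, exactly the contrast described above.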
Calculate the harmonic distortion for the square wave.
Total harmonic distortion in the square wave is 1 − 8/π² ≈ 20%.
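A short check of this number, assuming total harmonic distortion means the fraction of the signal's power lying outside the fundamental (the coefficients 4/(πk) for odd k are the standard ones for a ±1 square wave):

```python
import numpy as np

# Power in the harmonics k >= 3 (odd only), each contributing (1/2) * (4/(pi k))^2,
# summed far enough out that the truncation error is negligible.
harmonic_power = sum(0.5 * (4 / (np.pi * k)) ** 2 for k in range(3, 10001, 2))
total_power = 1.0                      # a +/- 1 square wave has unit power
print(harmonic_power / total_power)    # about 0.189
print(1 - 8 / np.pi ** 2)              # closed form: 1 - 8/pi^2, roughly 20%
```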
More than just decaying slowly, the Fourier series approximation
shown in [link] exhibits
interesting behavior.
Fourier series approximation of a square wave
Fourier series approximation to the square wave.
The number of terms in the Fourier sum is indicated in each
plot, and the square wave is shown as a dashed line over two
periods.
Although the square wave's Fourier series requires more terms
for a given representation accuracy, when comparing plots it is
not clear that the two are equal. Does the Fourier series
really equal the square wave at all values
of t? In particular, at each step-change
in the square wave, the Fourier series exhibits a peak followed
by rapid oscillations. As more terms are added to the series,
the oscillations seem to become more rapid and smaller, but the
peaks are not decreasing. For the Fourier series approximation for the half-wave
rectified sinusoid, no such behavior occurs. What is
happening?
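One way to see what is happening is to measure the peak of the ringing directly. The sketch below is our own (the grid resolution and the particular values of K are arbitrary choices); it evaluates the square wave's partial sums and reports how far the first peak after a jump rises above the +1 level. The overshoot refuses to shrink as terms are added, settling near 9% of the jump's size.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 200001)

def square_partial_sum(K):
    """Partial Fourier series of a +/-1 square wave: (4/(pi k)) sin(2 pi k t), odd k <= K."""
    s = np.zeros_like(t)
    for k in range(1, K + 1, 2):
        s += 4 / (np.pi * k) * np.sin(2 * np.pi * k * t)
    return s

for K in (9, 49, 99, 499):
    overshoot = square_partial_sum(K).max() - 1.0       # height of the peak above +1
    print(f"K = {K:3d}: peak exceeds +1 by {overshoot:.4f} "
          f"({overshoot / 2:.1%} of the jump from -1 to +1)")
```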
Consider this mathematical question intuitively: Can a
discontinuous function, like the square wave, be expressed as a
sum, even an infinite one, of continuous signals? One should at
least be suspicious, and in fact, it can't be thus
expressed. This issue brought
Fourier
much criticism from the French Academy of Sciences (Laplace,
Lagrange, Monge, and Lacroix comprised the review committee) for
several years after its presentation in 1807. It was not
resolved for almost a century, and its resolution is interesting
and important to understand from a practical viewpoint.
The extraneous peaks in the square wave's Fourier series
never disappear; they are termed
Gibbs' phenomenon after the American physicist
Josiah Willard Gibbs. They occur whenever the signal is
discontinuous, and will always be present whenever the signal
has jumps.
Let's return to the question of equality; how can the
equal sign in the
definition of
the Fourier series
be justified? The partial answer is that pointwise equality,
equality at each and every value of t,
is not guaranteed. However, mathematicians
later in the nineteenth century showed that the rms error of
the Fourier series was always zero.
What this means is that the error between a signal and its
Fourier series approximation may not be zero, but that its rms
value will be zero! It is through the eyes of the rms value
that we redefine equality: The usual definition of equality is
called pointwise equality: Two signals s1(t) and s2(t)
are said to be equal pointwise if s1(t) = s2(t)
for all values of t. A new
definition of equality is mean-square equality: Two
signals are said to be equal in the mean square if
rms(s1(t) − s2(t)) = 0. For Fourier series, Gibbs' phenomenon peaks have
finite height and zero width. The error differs from zero only
at isolated points—whenever the periodic signal contains
discontinuities—and equals about 9% of the size of the
discontinuity. The value of a function at a finite set of points
does not affect its integral. This fact underlies why defining
the value of a discontinuous function at its discontinuity,
something we refrained from doing in defining the step function,
is meaningless. Whatever you pick for a value has
no practical relevance for either the signal's spectrum or for
how a system responds to the signal. The Fourier series value
"at" the discontinuity is the average of the values on either
side of the jump.
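A closing sketch (again our own, with arbitrary grid and term counts) makes the distinction concrete: the rms error of the square wave's partial sums keeps falling toward zero, which is what mean-square equality requires, even though the overshoot at the jumps measured earlier never goes away.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 200000, endpoint=False)
square = np.sign(np.sin(2 * np.pi * t))

def partial_sum(K):
    """Partial Fourier series of the +/-1 square wave using odd harmonics up to K."""
    s = np.zeros_like(t)
    for k in range(1, K + 1, 2):
        s += 4 / (np.pi * k) * np.sin(2 * np.pi * k * t)
    return s

# The rms of the difference shrinks with K, so the series and the square wave
# are equal in the mean-square sense even though they never agree at every point.
for K in (9, 99, 999):
    rms = np.sqrt(np.mean((partial_sum(K) - square) ** 2))
    print(f"K = {K:3d}: rms error = {rms:.4f}")
```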