System Identification
Ali Karimpour, Assistant Professor, Ferdowsi University of Mashhad
Reference: “System Identification: Theory for the User”, Lennart Ljung (1999)
Lecture 2
Topics to be covered include: impulse responses and transfer functions; frequency-domain expressions; stochastic processes; signal spectra; disturbances; ergodicity.
Impulse responses (sampling)
Most often, the input signal u(t) is kept constant between the sampling instants: u(t) = u(kT) for kT ≤ t < (k + 1)T. It is well known that a linear, time-invariant, causal system can be described by its impulse response:
y(t) = ∫_0^∞ g(τ) u(t − τ) dτ
For ease of notation, assume that T is one time unit and use t to enumerate the sampling instants. The sampled-data description is then
y(t) = Σ_{k=1}^{∞} g(k) u(t − k), t = 0, 1, 2, …
Transfer functions
Define the forward and backward shift operators q and q⁻¹ as
q u(t) = u(t + 1),  q⁻¹ u(t) = u(t − 1)
Now we can write the output as
y(t) = Σ_{k=1}^{∞} g(k) u(t − k) = G(q) u(t),  where G(q) = Σ_{k=1}^{∞} g(k) q^{-k}
G(q) is the transfer operator or transfer function. Similarly, for the disturbance we have v(t) = H(q) e(t). So the basic description for a linear system with additive disturbance is:
y(t) = G(q) u(t) + H(q) e(t)
Transfer functions
Some terminology: G(q) is the transfer operator or transfer function. We shall say that the transfer function G(q) is stable if
Σ_{k=1}^{∞} |g(k)| < ∞
This means that G(z) is analytic on and outside the unit circle. We shall say that the filter H(q) is monic if h(0) = 1:
H(q) = 1 + Σ_{k=1}^{∞} h(k) q^{-k}
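The stability condition above can be checked numerically from the impulse response. A minimal sketch in Python, using a hypothetical first-order system G(q) = b q⁻¹ / (1 − a q⁻¹) (not taken from the text):

```python
import numpy as np
from scipy.signal import lfilter

# Hypothetical first-order system G(q) = b*q^-1 / (1 - a*q^-1); |a| < 1 makes it stable.
a, b = 0.8, 1.0

# Impulse response g(k): feed a unit pulse through the difference equation
# y(t) = a*y(t-1) + b*u(t-1).
impulse = np.r_[1.0, np.zeros(49)]
g = lfilter([0.0, b], [1.0, -a], impulse)

# g(k) = b * a^(k-1) for k >= 1, so sum_k |g(k)| converges: G(q) is stable.
print(np.allclose(g[1:6], b * a ** np.arange(5)))  # True
print(np.sum(np.abs(g)) < b / (1 - a))             # True
```

The geometric decay of g(k) is exactly what makes the sum Σ|g(k)| finite, which is the stability criterion stated above.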
Frequency-domain expressions
Let u(t) = cos(ωt). It will be convenient to write u(t) = Re{e^{iωt}}. For a stable system we can now write the output as
y(t) = Re{G(e^{iω}) e^{iωt}}
So we have, in steady state,
y(t) = |G(e^{iω})| cos(ωt + φ),  φ = arg G(e^{iω})
The complex number G(e^{iω}), ω ∈ [−π, π], is called the frequency function of the system.
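The steady-state sinusoid response described above can be sketched numerically; the first-order system below is a hypothetical example, not one from the text:

```python
import numpy as np
from scipy.signal import lfilter

# Hypothetical stable system G(q) = 1 / (1 - 0.5*q^-1) driven by u(t) = cos(w0*t).
w0 = 0.3
t = np.arange(5000)
u = np.cos(w0 * t)
y = lfilter([1.0], [1.0, -0.5], u)

# Predicted steady state: |G(e^{i*w0})| * cos(w0*t + arg G(e^{i*w0})).
G = 1.0 / (1.0 - 0.5 * np.exp(-1j * w0))
y_ss = np.abs(G) * np.cos(w0 * t + np.angle(G))

# After the transient dies out, simulation and frequency-domain prediction agree.
print(np.allclose(y[1000:], y_ss[1000:], atol=1e-6))  # True
```

The transient decays like 0.5^t, so well before t = 1000 the simulated output is indistinguishable from the amplitude-scaled, phase-shifted sinusoid.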
Periodograms of signals over finite intervals
For a signal u(t), t = 1, 2, …, N, define the discrete Fourier transform (DFT)
U_N(ω) = (1/√N) Σ_{t=1}^{N} u(t) e^{-iωt}
Exercise 1: Show that u(t) can be recovered by inserting the values U_N(2πk/N) into the inversion formula
u(t) = (1/√N) Σ_{k=1}^{N} U_N(2πk/N) e^{i2πkt/N}
Periodograms of signals over finite intervals
Some properties of U_N(ω): it is periodic with period 2π, and for real u(t) we have U_N(−ω) = conj(U_N(ω)). The function U_N(ω) is therefore uniquely defined by its values over the interval [0, 2π]. It is, however, customary to consider U_N(ω) over the interval [−π, π]. So u(t) can be defined by its frequency components via the inversion formula. The number U_N(ω) tells us the weight that the frequency ω carries in the decomposition. So
|U_N(ω)|²
is known as the periodogram of the signal u(t), t = 1, 2, 3, …
Parseval's relationship: Σ_{t=1}^{N} u²(t) = Σ_{k=1}^{N} |U_N(2πk/N)|²
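Parseval's relationship is easy to verify numerically with the 1/√N normalization of the DFT used above; a small sketch (the test signal is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64
u = rng.standard_normal(N)  # arbitrary test signal

# Ljung-style DFT normalization: U_N(w) = N^{-1/2} * sum_t u(t) e^{-iwt}.
U = np.fft.fft(u) / np.sqrt(N)

# Parseval: sum_t u(t)^2 equals sum_k |U_N(2*pi*k/N)|^2 with this scaling.
print(np.allclose(np.sum(u**2), np.sum(np.abs(U)**2)))  # True
```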
Periodograms of signals over finite intervals
Example: periodogram of a sinusoid.
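As a sketch of this example, a sinusoid placed exactly on a DFT grid frequency concentrates its periodogram in the two bins at ±ω0 (the particular N and frequency here are arbitrary choices):

```python
import numpy as np

N = 256
t = np.arange(N)
w0 = 2 * np.pi * 32 / N         # place the sinusoid exactly on a DFT grid frequency
u = np.cos(w0 * t)

U = np.fft.fft(u) / np.sqrt(N)  # Ljung-normalized DFT
per = np.abs(U) ** 2            # periodogram |U_N(w)|^2

# All energy sits in the two bins at +-w0, each of height N/4.
print(np.isclose(per[32], N / 4))      # True
print(np.isclose(per[N - 32], N / 4))  # True
print(np.isclose(per.sum(), N / 2))    # True (Parseval: sum of cos^2 over a period)
```

Off-grid frequencies would instead smear energy over neighboring bins (leakage), which is why the on-grid case gives such clean impulses.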
Discrete Fourier transform (DFT) Periodograms of signals over finite intervals The periodogram defines, in a sense, the frequency contents of a signal over a finite time interval. But we seek for a definition of a similar concept for signals over the interval [1, ∞). 12 But this limits fail to exist for many signals of practical interest. 12
Transformation of periodograms
As a signal is filtered through a linear system, its periodogram changes. Let
y(t) = G(q) u(t), with G(q) stable.
Claim: Y_N(ω) = G(e^{iω}) U_N(ω) + R_N(ω), where |R_N(ω)| ≤ 2 C_G C_u / √N,
with C_u = max_t |u(t)| and C_G = Σ_{k=1}^{∞} k |g(k)|, so the remainder vanishes as N → ∞.
Proof sketch: write Y_N(ω) as the DFT of Σ_k g(k) u(t − k), interchange the order of summation over t and k, and bound the boundary terms that distinguish the finite sum from G(e^{iω}) U_N(ω).
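The claim can be illustrated numerically: filter a long random signal through a stable first-order system (a hypothetical choice) and compare the DFT of the output with G(e^{iω}) times the DFT of the input; the discrepancy R_N shrinks as N grows:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(2)
N = 4096
u = rng.standard_normal(N)

# Hypothetical stable system G(q) = 1 / (1 - 0.5*q^-1); y = G(q) u.
y = lfilter([1.0], [1.0, -0.5], u)

# Ljung-normalized DFTs of input and output.
U = np.fft.fft(u) / np.sqrt(N)
Y = np.fft.fft(y) / np.sqrt(N)
w = 2 * np.pi * np.arange(N) / N
G = 1.0 / (1.0 - 0.5 * np.exp(-1j * w))

# Y_N(w) = G(e^{iw}) U_N(w) + R_N(w); the remainder is O(1/sqrt(N)).
err = np.max(np.abs(Y - G * U))
print(err < 0.5)
```

For this system C_G = Σ k (0.5)^k = 2, so the theoretical bound 2 C_G C_u / √N already guarantees a small remainder at N = 4096; the observed error is typically far below the bound.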
Stochastic Processes
A random variable (RV) is a rule (or function) that assigns a real number to every outcome of a random experiment. Its distribution is described by a probability density function (PDF) f_e; if e may assume a certain value with nonzero probability, then f_e contains a δ function. (Example of a random signal: the closing price of the Iranian power market observed from Apr. 15 to Sep. 22, 2009.)
Definition: The expectation E[e] of a random variable e is E[e] = ∫ x f_e(x) dx (for a scalar RV; componentwise for a vector RV).
Definition: The variance (covariance), Cov[e], of a random variable e is Cov[e] = E[(e − E[e])(e − E[e])^T].
Two random variables e1 and e2 are independent if their joint PDF factors: f(x1, x2) = f_{e1}(x1) f_{e2}(x2).
Stochastic Processes
A stochastic process is a rule (or function) that assigns a time function to every outcome of a random experiment.
• Consider the random experiment of rolling a die at t = 0 and observing the number on the top face.
• The sample space of this experiment consists of the outcomes {1, 2, 3, …, 6}.
• For each outcome of the experiment, let us arbitrarily assign a function of time t.
• The set of functions {x1(t), x2(t), …, x6(t)} represents a stochastic process.
Stochastic Processes
The mean of a random process X(t) is m_X(t) = E[X(t)]. In general, m_X(t) is a function of time.
The correlation R_X(t1, t2) of a random process X(t) is R_X(t1, t2) = E[X(t1) X(t2)]. Note that R_X(t1, t2) is a function of t1 and t2.
The autocovariance C_X(t1, t2) of a random process X(t) is defined as the covariance of X(t1) and X(t2):
C_X(t1, t2) = E[(X(t1) − m_X(t1))(X(t2) − m_X(t2))] = R_X(t1, t2) − m_X(t1) m_X(t2)
In particular, when t1 = t2 = t, we have C_X(t, t) = Var[X(t)].
Stochastic Processes
Example: sinusoid with random amplitude, x(t) = A cos(ωt), where A is a random variable. Here both the mean E[A] cos(ωt) and the second moment E[A²] cos²(ωt) depend on t.
Stochastic Processes
Example: sinusoid with random phase, x(t) = cos(ωt + φ), with φ uniform on [0, 2π). Then m_x(t) = 0 and R_x(t1, t2) = (1/2) cos(ω(t1 − t2)), which depends only on t1 − t2.
Stochastic Processes
x(t) is (wide-sense) stationary if m_x(t) is constant in t and R_x(t1, t2) depends only on t1 − t2.
Example: the sinusoid with random phase is clearly stationary (WSS).
Example: the sinusoid with random amplitude is clearly not stationary.
Stationarity, however, may be too limiting a definition for our purposes.
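A Monte Carlo sketch of the two examples, estimating ensemble statistics over many realizations (the sample sizes and ω = 1 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
M, T = 200000, 8      # M realizations, a handful of time points
t = np.arange(T)

# Sinusoid with random phase: x(t) = cos(t + phi), phi ~ U[0, 2*pi).
phi = rng.uniform(0, 2 * np.pi, size=(M, 1))
x_phase = np.cos(t + phi)

# Sinusoid with random amplitude: x(t) = A*cos(t), A ~ N(0, 1).
A = rng.standard_normal((M, 1))
x_amp = A * np.cos(t)

# Ensemble statistics over the M realizations:
m_phase = x_phase.mean(axis=0)    # ~0 for every t (consistent with stationarity)
v_amp = (x_amp**2).mean(axis=0)   # ~cos^2(t): second moment depends on t

print(np.allclose(m_phase, 0, atol=0.02))           # True
print(np.allclose(v_amp, np.cos(t)**2, atol=0.02))  # True
```

The random-phase ensemble looks the same at every t, while the random-amplitude ensemble has a time-varying second moment, matching the stationarity verdicts above.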
Signal Spectra
A common framework for deterministic and stochastic signals: with a deterministic input u(t), the output y(t) = G(q)u(t) + v(t) is not a stationary process, so stationarity is too limiting a definition. To deal with this problem, we introduce the following definition: quasi-stationary signals.
Quasi-stationary signals
A signal {s(t)} is said to be quasi-stationary if it is subject to:
(i) E[s(t)] = m_s(t), with |m_s(t)| ≤ C for all t, and
(ii) E[s(t)s(r)] = R_s(t, r), with |R_s(t, r)| ≤ C, such that the limit
R̄_s(τ) = lim_{N→∞} (1/N) Σ_{t=1}^{N} R_s(t, t − τ) exists for every τ.
If {s(t)} is a deterministic sequence, this means that {s(t)} is a bounded sequence and that the time averages (1/N) Σ_{t=1}^{N} s(t)s(t − τ) converge.
If {s(t)} is a stationary stochastic process, it is quasi-stationary, since R_s(t, t − τ) = R_s(τ) does not depend on t.
Signal Spectra
Notation: Ē[f(t)] = lim_{N→∞} (1/N) Σ_{t=1}^{N} E[f(t)], where the notation implies that the limit exists. In this notation, R̄_s(τ) = Ē[s(t)s(t − τ)].
Sometimes, with some abuse of notation, we call R̄_s(τ) the covariance function of s.
Exercise 2: Show that it sometimes is exactly the covariance function of s.
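For a deterministic quasi-stationary signal, R̄_s(τ) is a plain time average, which can be sketched directly (the sinusoid here is an arbitrary test signal):

```python
import numpy as np

# Time-average covariance of a deterministic quasi-stationary signal:
# Rbar_s(tau) = lim_N (1/N) sum_t s(t) s(t - tau).
# For s(t) = cos(w0*t) this limit is cos(w0*tau) / 2.
N, w0 = 100000, 0.7
t = np.arange(N)
s = np.cos(w0 * t)

def Rbar(s, tau):
    # Finite-N approximation of the limit above.
    return np.mean(s[tau:] * s[:len(s) - tau]) if tau else np.mean(s * s)

print(np.isclose(Rbar(s, 0), 0.5, atol=1e-3))                 # True
print(np.isclose(Rbar(s, 5), np.cos(w0 * 5) / 2, atol=1e-3))  # True
```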
Signal Spectra
Two signals {s(t)} and {w(t)} are jointly quasi-stationary if:
1- they both are quasi-stationary, and
2- the cross-covariance function R̄_sw(τ) = Ē[s(t)w(t − τ)] exists.
If R̄_sw(τ) ≡ 0, the signals are said to be uncorrelated.
Signal Spectra
The periodogram defines the frequency content of a signal over a finite time interval, but its limit as N → ∞ fails to exist for many signals of practical interest. So we shall develop a framework for describing signals and their spectra that is applicable to deterministic as well as stochastic signals.
Signal Spectra
We use the Fourier transform of the covariance function (the spectrum, or spectral density). We define the (power) spectrum of {s(t)} as
Φ_s(ω) = Σ_{τ=−∞}^{∞} R̄_s(τ) e^{-iτω}, when this limit exists,
and the cross spectrum between {s(t)} and {w(t)} as
Φ_sw(ω) = Σ_{τ=−∞}^{∞} R̄_sw(τ) e^{-iτω}, when this limit exists.
Exercise 3: Show that the spectrum is always a real function, but that the cross spectrum is in general a complex-valued function.
Signal Spectra
Exercise 4 (spectrum of a periodic signal): Consider a deterministic, periodic signal with period M, i.e., s(t) = s(t + M). Show that R̄_s(τ) is also periodic with period M, and finally show that the spectrum Φ_s(ω) consists of δ impulses at the frequencies ω = 2πk/M, k = 0, 1, …, M − 1.
Signal Spectra
Exercise 5 (spectrum of a sinusoid): Consider a sinusoid of frequency ω0. Show that its spectrum consists of δ impulses at ω = ±ω0.
Signal Spectra
Example (stationary stochastic process): Consider v(t) = H(q)e(t) as a stationary stochastic process. We will assume that e(t) has zero mean and variance λ. It is clear that
R̄_v(τ) = λ Σ_k h(k) h(k + τ)
The spectrum is:
Φ_v(ω) = λ |H(e^{iω})|²    (I)
Exercise 6: Show (I).
Signal Spectra
Spectrum of a stationary stochastic process: the stochastic process described by v(t) = H(q)e(t), where {e(t)} is a sequence of independent random variables with zero mean and variance λ, has the spectrum Φ_v(ω) = λ |H(e^{iω})|².
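This spectral formula can be checked against a nonparametric estimate. A sketch assuming a hypothetical monic FIR filter H(q) = 1 + 0.5 q⁻¹ and scipy's Welch estimator; with fs = 1 the two-sided Welch density matches Φ_v(ω) = λ|H(e^{iω})|² up to estimation error:

```python
import numpy as np
from scipy.signal import lfilter, welch

rng = np.random.default_rng(4)
N, lam = 200000, 1.0
# White noise e(t): zero mean, variance lam.
e = np.sqrt(lam) * rng.standard_normal(N)

# v(t) = H(q) e(t) with a hypothetical monic, stable H(q) = 1 + 0.5*q^-1.
v = lfilter([1.0, 0.5], [1.0], e)

# Welch estimate of the two-sided PSD; with fs = 1 it estimates Phi_v(2*pi*f).
f, Pxx = welch(v, fs=1.0, nperseg=1024, return_onesided=False)
w = 2 * np.pi * f
H2 = np.abs(1.0 + 0.5 * np.exp(-1j * w)) ** 2

# Compare with the theoretical spectrum lam * |H(e^{iw})|^2.
rel = np.mean(np.abs(Pxx - lam * H2) / (lam * H2))
print(rel < 0.15)
```

With ~390 averaged segments the per-bin relative error of the Welch estimate is a few percent, so the average deviation from λ|H|² is small.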
Signal Spectra
Spectrum of a mixed deterministic and stochastic signal: let s(t) = u(t) + v(t), where u(t) is deterministic and v(t) is stationary with zero mean. Then Φ_s(ω) = Φ_u(ω) + Φ_v(ω).
Exercise 7: Prove it.
Transformation of Spectra by Linear Systems
Theorem: Let {w(t)} be quasi-stationary with spectrum Φ_w(ω), and let G(q) be a stable transfer function. Let s(t) = G(q)w(t). Then {s(t)} is also quasi-stationary, and
Φ_s(ω) = |G(e^{iω})|² Φ_w(ω)
Φ_sw(ω) = G(e^{iω}) Φ_w(ω)
Disturbances
There are always signals beyond our control that also affect the system. We assume that such effects can be lumped into an additive term v(t) at the output:
y(t) = G(q) u(t) + v(t)
There are many sources and causes for such a disturbance term:
• Measurement noise.
• Uncontrollable inputs (e.g., a person in a room produces about 100 W of heat).
Disturbances
Characterization of disturbances:
• Its value is not known beforehand.
• Making qualified guesses about future values is possible.
• It is natural to employ a probabilistic framework to describe future disturbances.
We put ourselves at time t and would like to know the disturbance at t + k, k ≥ 1, so we use the following approach:
v(t) = Σ_{k=0}^{∞} h(k) e(t − k)
where e(t) is white noise. This description does not allow a completely general characterization of all possible probabilistic disturbances, but it is versatile enough.
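The disturbance model v(t) = Σ h(k) e(t − k) is straightforward to simulate; the geometrically decaying h(k) below is a hypothetical choice, not one from the text:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 500
e = rng.standard_normal(N)   # white noise: zero mean, unit variance

# v(t) = sum_{k>=0} h(k) e(t-k); hypothetical decaying response h(k) = 0.7^k.
h = 0.7 ** np.arange(50)     # truncated: the tail beyond k = 49 is negligible
v = np.convolve(e, h)[:N]

# The first samples follow directly from the convolution sum.
print(np.isclose(v[0], e[0]))               # True
print(np.isclose(v[1], e[1] + 0.7 * e[0]))  # True
```

Varying how slowly h(k) decays changes how strongly past noise samples persist in v(t), i.e., how "smooth" the realization looks.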
Disturbances
Consider, for example, a PDF for e(t) under which e(t) is nonzero only with probability μ. Small values of μ are suitable to describe classical disturbance patterns: steps, pulses, sinusoids, and ramps. (Figure: a realization of v(t) for the proposed e(t).)
Exercise 8: Derive the above figure for μ = 0.1 and μ = 0.9 and a suitable h(k).
Disturbances
On the other hand, a PDF parameterized by δ gives disturbances of a different character. (Figure: a realization of v(t) for the proposed e(t).) Often we only specify the second-order properties of the sequence {e(t)}, that is, the mean and the variances.
Exercise 9: What is white noise?
Exercise 10: Derive the above figure for δ = 0.1 and δ = 0.9 and a suitable h(k).
Disturbances
We will assume that e(t) has zero mean and variance λ. Now we want to know the characteristics of v(t):
Mean: E[v(t)] = Σ_{k=0}^{∞} h(k) E[e(t − k)] = 0
Covariance: E[v(t)v(t − τ)] = Σ_k Σ_l h(k) h(l) E[e(t − k) e(t − τ − l)] = λ Σ_k h(k) h(k − τ)
Since the mean and covariance do not depend on t, the process is said to be stationary.
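The covariance computation above reduces to the sum λ Σ_k h(k)h(k − τ); for a geometric h(k) this sum has a closed form, which gives a quick numerical check (λ and the decay rate are arbitrary choices):

```python
import numpy as np

# Covariance of v(t) = sum_k h(k) e(t-k): R_v(tau) = lam * sum_k h(k) h(k + tau),
# which depends only on tau, so v is (wide-sense) stationary.
lam, c = 2.0, 0.6            # arbitrary noise variance and decay rate
h = c ** np.arange(200)      # hypothetical h(k) = c^k, truncated

def Rv(tau):
    return lam * np.sum(h[:len(h) - tau] * h[tau:])

# Closed form for this h: R_v(tau) = lam * c^tau / (1 - c^2).
print(np.isclose(Rv(0), lam / (1 - c**2)))         # True
print(np.isclose(Rv(3), lam * c**3 / (1 - c**2)))  # True
```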
Ergodicity
Suppose you are concerned with determining which the most visited parks in a city are.
• One idea is to take a momentary snapshot: see how many people are at this moment in park A, how many are in park B, and so on.
• Another idea is to look at one individual (or a few of them) and to follow him for a certain period of time, e.g., a year.
The first may not be representative for a longer period of time, while the second may not be representative for all the people. The idea is that an ensemble is ergodic if the two types of statistics give the same result. Many ensembles, like human populations, are not ergodic.
Ergodicity
Let x(t) be a stochastic process. Most of our computations will depend on a given realization of a quasi-stationary process; ergodicity will allow us to make statements about repeated experiments from a single realization.
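The park analogy translates directly into code: for an ergodic process (here simply i.i.d. noise), the time average along one realization and the ensemble average across realizations at one instant estimate the same mean. A sketch with arbitrary sample sizes:

```python
import numpy as np

rng = np.random.default_rng(6)

# Rows are independent realizations, columns are time.  For this i.i.d. process
# the ensemble is ergodic: time and ensemble averages estimate the same mean.
x = rng.standard_normal((2000, 2000))

time_avg = x[0].mean()    # one realization, averaged over time ("follow one person")
ens_avg = x[:, 0].mean()  # many realizations, one instant ("momentary snapshot")

print(abs(time_avg) < 0.15 and abs(ens_avg) < 0.15)  # both near the true mean, 0
```

A non-ergodic counterexample is the random-amplitude sinusoid from earlier: a time average along one realization depends on the drawn amplitude, so it does not match the ensemble average.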