Lectures on Calculus Multivariable Differentiation
by William M. Faucette, University of West Georgia
Adapted from Calculus on Manifolds by Michael Spivak
Multivariable Differentiation Recall that a function $f\colon \mathbb{R} \to \mathbb{R}$ is differentiable at $a \in \mathbb{R}$ if there is a number $f'(a)$ such that
$$f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}.$$
Multivariable Differentiation This definition makes no sense for functions $f\colon \mathbb{R}^n \to \mathbb{R}^m$ for several reasons, not the least of which is that you cannot divide by a vector.
Multivariable Differentiation However, we can rewrite this definition so that it can be generalized to several variables. First, rewrite the definition this way:
$$\lim_{h \to 0} \frac{f(a+h) - f(a) - f'(a)h}{h} = 0.$$
Multivariable Differentiation Notice that the function taking $h$ to $f'(a)h$ is a linear transformation from $\mathbb{R}$ to $\mathbb{R}$. So we can view $f'(a)$ as being a linear transformation, at least in the one dimensional case.
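As a quick check (the function $f(x) = x^2$ here is chosen purely for illustration), the rewritten definition recovers the familiar derivative: with $f'(a) = 2a$,
$$\lim_{h \to 0} \frac{f(a+h) - f(a) - 2ah}{h} = \lim_{h \to 0} \frac{(a+h)^2 - a^2 - 2ah}{h} = \lim_{h \to 0} \frac{h^2}{h} = 0,$$
and the map $h \mapsto 2ah$ is the associated linear transformation.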
Multivariable Differentiation So, we define a function $f\colon \mathbb{R}^n \to \mathbb{R}^m$ to be differentiable at $a \in \mathbb{R}^n$ if there exists a linear transformation $\lambda$ from $\mathbb{R}^n$ to $\mathbb{R}^m$ so that
$$\lim_{h \to 0} \frac{|f(a+h) - f(a) - \lambda(h)|}{|h|} = 0.$$
Multivariable Differentiation Notice that taking the length here is essential, since the numerator is a vector in $\mathbb{R}^m$ and the denominator is a vector in $\mathbb{R}^n$.
Multivariable Differentiation Definition: The linear transformation $\lambda$ is denoted $Df(a)$ and called the derivative of $f$ at $a$, provided
$$\lim_{h \to 0} \frac{|f(a+h) - f(a) - \lambda(h)|}{|h|} = 0.$$
Multivariable Differentiation Notice that for $f\colon \mathbb{R}^n \to \mathbb{R}^m$, the derivative $Df(a)\colon \mathbb{R}^n \to \mathbb{R}^m$ is a linear transformation. $Df(a)$ is the linear transformation most closely approximating the map $f$ at $a$, in the sense that
$$\lim_{h \to 0} \frac{|f(a+h) - f(a) - Df(a)(h)|}{|h|} = 0.$$
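As a sketch of the definition in action (this particular $f$ is an illustrative choice), take $f\colon \mathbb{R}^2 \to \mathbb{R}$ with $f(x,y) = x^2 + y^2$ and $a = (a_1, a_2)$. The candidate derivative is the linear map $\lambda(h_1, h_2) = 2a_1 h_1 + 2a_2 h_2$, and
$$\frac{|f(a+h) - f(a) - \lambda(h)|}{|h|} = \frac{h_1^2 + h_2^2}{|h|} = |h| \longrightarrow 0,$$
so $Df(a) = \lambda$.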
Multivariable Differentiation For a function $f\colon \mathbb{R}^n \to \mathbb{R}^m$, the derivative $Df(a)$ is unique if it exists. This result will follow from what we do later.
Multivariable Differentiation Since $Df(a)$ is a linear transformation, we can give its matrix with respect to the standard bases on $\mathbb{R}^n$ and $\mathbb{R}^m$. This matrix is an $m \times n$ matrix called the Jacobian matrix of $f$ at $a$. We will see how to compute this matrix shortly.
Lemma 1 Lemma: If $f\colon \mathbb{R}^n \to \mathbb{R}^m$ is a linear transformation, then $Df(a) = f$.
Lemma 1 Proof: Let $\lambda = f$. Then
$$\lim_{h \to 0} \frac{|f(a+h) - f(a) - \lambda(h)|}{|h|} = \lim_{h \to 0} \frac{|f(a) + f(h) - f(a) - f(h)|}{|h|} = 0. \qquad \text{QED}$$
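For instance (an illustrative example), the linear map $f(x,y) = (2x + y,\; x - 3y)$ is its own derivative at every point: $Df(a)(h_1, h_2) = (2h_1 + h_2,\; h_1 - 3h_2)$ for all $a \in \mathbb{R}^2$, so its Jacobian matrix at every point is $\begin{pmatrix} 2 & 1 \\ 1 & -3 \end{pmatrix}$.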
Lemma 2 Lemma: Let $T\colon \mathbb{R}^m \to \mathbb{R}^n$ be a linear transformation. Then there is a number $M$ such that $|T(h)| \le M|h|$ for $h \in \mathbb{R}^m$.
Lemma 2 Proof: Let $A$ be the matrix of $T$ with respect to the standard bases for $\mathbb{R}^m$ and $\mathbb{R}^n$, so $A$ is an $n \times m$ matrix $[a_{ij}]$. If $A$ is the zero matrix, then $T$ is the zero linear transformation and there is nothing to prove. So assume $A \ne 0$. Let $K = \max\{|a_{ij}|\} > 0$.
Lemma 2 Proof: Then, writing $h = (h_1, \dots, h_m)$, each component of $T(h)$ satisfies
$$|(T(h))_i| = \Big|\sum_{j=1}^m a_{ij} h_j\Big| \le \sum_{j=1}^m |a_{ij}|\,|h_j| \le Km|h|,$$
so $|T(h)| \le \sqrt{n}\,Km|h|$. So, we need only let $M = Km\sqrt{n}$. QED
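As a quick numerical sketch (the map below is an illustrative choice), for $T(h_1, h_2) = (h_1 + 2h_2,\; 3h_1 - h_2)$ we have $K = 3$ and $m = n = 2$, so the proof's estimate gives
$$|T(h)| \le 3 \cdot 2 \cdot \sqrt{2}\,|h| = 6\sqrt{2}\,|h|.$$
Any such $M$ works for the lemma; it need not be the smallest possible bound.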
The Chain Rule Theorem (Chain Rule): If $f\colon \mathbb{R}^n \to \mathbb{R}^m$ is differentiable at $a$, and $g\colon \mathbb{R}^m \to \mathbb{R}^p$ is differentiable at $f(a)$, then the composition $g \circ f\colon \mathbb{R}^n \to \mathbb{R}^p$ is differentiable at $a$ and
$$D(g \circ f)(a) = Dg(f(a)) \circ Df(a).$$
The Chain Rule In this expression, the right side is the composition of linear transformations, which, of course, corresponds to the product of the corresponding Jacobians at the respective points.
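To make the Jacobian product concrete (these particular maps are chosen only as an illustration; their Jacobian entries are the partial derivatives computed as in the later slides), take $f(x,y) = (xy,\; x + y^2)$, $g(u,v) = (u+v,\; uv)$, and $a = (1,2)$, so $f(a) = (2,5)$:
$$Df(a) = \begin{pmatrix} 2 & 1 \\ 1 & 4 \end{pmatrix}, \qquad Dg(f(a)) = \begin{pmatrix} 1 & 1 \\ 5 & 2 \end{pmatrix}, \qquad Dg(f(a))\,Df(a) = \begin{pmatrix} 3 & 5 \\ 12 & 13 \end{pmatrix},$$
which agrees with differentiating $(g \circ f)(x,y) = (xy + x + y^2,\; x^2 y + x y^3)$ directly at $(1,2)$.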
The Chain Rule Proof: Let $b = f(a)$, let $\lambda = Df(a)$, and let $\mu = Dg(f(a))$. Define
$$\varphi(x) = f(x) - f(a) - \lambda(x - a),$$
$$\psi(y) = g(y) - g(b) - \mu(y - b),$$
$$\rho(x) = (g \circ f)(x) - (g \circ f)(a) - (\mu \circ \lambda)(x - a).$$
The Chain Rule Since $f$ is differentiable at $a$, and $\lambda$ is the derivative of $f$ at $a$, we have
$$\lim_{x \to a} \frac{|\varphi(x)|}{|x - a|} = 0.$$
The Chain Rule Similarly, since $g$ is differentiable at $b$, and $\mu$ is the derivative of $g$ at $b$, we have
$$\lim_{y \to b} \frac{|\psi(y)|}{|y - b|} = 0.$$
The Chain Rule To show that $g \circ f$ is differentiable with derivative $\mu \circ \lambda$, we must show that
$$\lim_{x \to a} \frac{|\rho(x)|}{|x - a|} = 0.$$
The Chain Rule Recall that $\varphi(x) = f(x) - f(a) - \lambda(x - a)$ and that $\mu$ is a linear transformation. Then we have
$$\begin{aligned}
\rho(x) &= g(f(x)) - g(b) - \mu(\lambda(x - a)) \\
&= g(f(x)) - g(b) - \mu\big(f(x) - f(a) - \varphi(x)\big) \\
&= \big[g(f(x)) - g(b) - \mu(f(x) - b)\big] + \mu(\varphi(x)).
\end{aligned}$$
The Chain Rule Next, recall that $\psi(y) = g(y) - g(b) - \mu(y - b)$. Then we have
$$\rho(x) = \psi(f(x)) + \mu(\varphi(x)).$$
The Chain Rule From the preceding slide, we have $\rho(x) = \psi(f(x)) + \mu(\varphi(x))$. So, we must show that
$$\lim_{x \to a} \frac{|\psi(f(x))|}{|x - a|} = 0 \qquad \text{and} \qquad \lim_{x \to a} \frac{|\mu(\varphi(x))|}{|x - a|} = 0.$$
The Chain Rule Recall that $\lim_{y \to b} \frac{|\psi(y)|}{|y - b|} = 0$. Given $\varepsilon > 0$, we can find $\delta_1 > 0$ so that
$$|\psi(f(x))| \le \varepsilon\,|f(x) - b|,$$
which is true provided that $|x - a| < \delta_1$, since $f$ must be continuous at $a$.
The Chain Rule Then
$$|\psi(f(x))| \le \varepsilon\,|f(x) - b| = \varepsilon\,|\varphi(x) + \lambda(x - a)| \le \varepsilon\,|\varphi(x)| + \varepsilon M |x - a|.$$
Here, we’ve used Lemma 2 to find $M$ so that $|\lambda(x - a)| \le M|x - a|$.
The Chain Rule Dividing by $|x - a|$ and taking a limit, we get
$$\limsup_{x \to a} \frac{|\psi(f(x))|}{|x - a|} \le \varepsilon \lim_{x \to a} \frac{|\varphi(x)|}{|x - a|} + \varepsilon M = \varepsilon M.$$
The Chain Rule Since $\varepsilon > 0$ is arbitrary, we have
$$\lim_{x \to a} \frac{|\psi(f(x))|}{|x - a|} = 0,$$
which is what we needed to show first.
The Chain Rule Recall that $\lim_{x \to a} \frac{|\varphi(x)|}{|x - a|} = 0$. Given $\varepsilon > 0$, we can find $\delta_2 > 0$ so that
$$|\varphi(x)| \le \varepsilon\,|x - a| \qquad \text{whenever } |x - a| < \delta_2.$$
The Chain Rule By Lemma 2, we can find $M$ so that $|\mu(\varphi(x))| \le M|\varphi(x)|$. Hence
$$\frac{|\mu(\varphi(x))|}{|x - a|} \le \frac{M|\varphi(x)|}{|x - a|} \le M\varepsilon \qquad \text{whenever } 0 < |x - a| < \delta_2.$$
The Chain Rule Since $\varepsilon > 0$ is arbitrary, we have
$$\lim_{x \to a} \frac{|\mu(\varphi(x))|}{|x - a|} = 0,$$
which is what we needed to show second. QED
The Derivative of $f\colon \mathbb{R}^n \to \mathbb{R}^m$ Let $f$ be given by $m$ coordinate functions $f^1, \dots, f^m$. We can first make a reduction to the case where $m = 1$ using the following theorem.
The Derivative of $f\colon \mathbb{R}^n \to \mathbb{R}^m$ Theorem: If $f\colon \mathbb{R}^n \to \mathbb{R}^m$, then $f$ is differentiable at $a \in \mathbb{R}^n$ if and only if each $f^i$ is differentiable at $a \in \mathbb{R}^n$, and
$$Df(a) = \big(Df^1(a), \dots, Df^m(a)\big).$$
The Derivative of $f\colon \mathbb{R}^n \to \mathbb{R}^m$ Proof: One direction is easy. Suppose $f$ is differentiable. Let $\pi^i\colon \mathbb{R}^m \to \mathbb{R}$ be projection onto the $i$th coordinate. Then $f^i = \pi^i \circ f$. Since $\pi^i$ is a linear transformation, by Lemma 1 it is differentiable and is its own derivative. Hence, by the Chain Rule, $f^i = \pi^i \circ f$ is differentiable and $Df^i(a)$ is the $i$th component of $Df(a)$.
The Derivative of $f\colon \mathbb{R}^n \to \mathbb{R}^m$ Proof: Conversely, suppose each $f^i$ is differentiable at $a$ with derivative $Df^i(a)$. Set
$$\lambda(h) = \big(Df^1(a)(h), \dots, Df^m(a)(h)\big).$$
Then
$$f(a+h) - f(a) - \lambda(h) = \big(f^1(a+h) - f^1(a) - Df^1(a)(h), \;\dots,\; f^m(a+h) - f^m(a) - Df^m(a)(h)\big).$$
The Derivative of $f\colon \mathbb{R}^n \to \mathbb{R}^m$ Proof: By the definition of the derivative, we have, for each $i$,
$$\lim_{h \to 0} \frac{|f^i(a+h) - f^i(a) - Df^i(a)(h)|}{|h|} = 0.$$
The Derivative of $f\colon \mathbb{R}^n \to \mathbb{R}^m$ Proof: Then
$$\lim_{h \to 0} \frac{|f(a+h) - f(a) - \lambda(h)|}{|h|} \le \lim_{h \to 0} \sum_{i=1}^m \frac{|f^i(a+h) - f^i(a) - Df^i(a)(h)|}{|h|} = 0.$$
This concludes the proof. QED
The Derivative of $f\colon \mathbb{R}^n \to \mathbb{R}^m$ The preceding theorem reduces differentiating $f\colon \mathbb{R}^n \to \mathbb{R}^m$ to finding the derivative of each component function $f^i\colon \mathbb{R}^n \to \mathbb{R}$. Now we’ll work on this problem.
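For example (an illustrative curve), $f\colon \mathbb{R} \to \mathbb{R}^2$ with $f(t) = (\cos t,\; \sin t)$ has differentiable components, so by the theorem $Df(t)(h) = (-h\sin t,\; h\cos t)$ and the Jacobian matrix is the $2 \times 1$ column $\begin{pmatrix} -\sin t \\ \cos t \end{pmatrix}$.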
Partial Derivatives Let $f\colon \mathbb{R}^n \to \mathbb{R}$ and $a \in \mathbb{R}^n$. We define the $i$th partial derivative of $f$ at $a$ by
$$D_i f(a) = \lim_{h \to 0} \frac{f(a^1, \dots, a^i + h, \dots, a^n) - f(a^1, \dots, a^n)}{h}.$$
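For a quick worked instance (the function here is an illustrative choice), let $f(x,y) = x^2 y + y^3$. From the limit definition,
$$D_1 f(a,b) = \lim_{h \to 0} \frac{(a+h)^2 b + b^3 - (a^2 b + b^3)}{h} = \lim_{h \to 0} \frac{2abh + bh^2}{h} = 2ab,$$
and similarly $D_2 f(a,b) = a^2 + 3b^2$: holding the other variable fixed reduces each partial derivative to an ordinary one-variable derivative.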
The Derivative of $f\colon \mathbb{R}^n \to \mathbb{R}^m$ Theorem: If $f\colon \mathbb{R}^n \to \mathbb{R}^m$ is differentiable at $a$, then $D_j f^i(a)$ exists for $1 \le i \le m$, $1 \le j \le n$, and the Jacobian matrix $f'(a)$ is the $m \times n$ matrix $\big(D_j f^i(a)\big)$.
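As an illustration of the theorem (this $f$ is chosen only as an example), $f\colon \mathbb{R}^2 \to \mathbb{R}^3$ with $f(x,y) = (x^2 y,\; \sin x,\; x + y)$ has Jacobian matrix
$$f'(x,y) = \begin{pmatrix} 2xy & x^2 \\ \cos x & 0 \\ 1 & 1 \end{pmatrix},$$
with the $i$th row collecting the partial derivatives of the component function $f^i$.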
The Derivative of $f\colon \mathbb{R}^n \to \mathbb{R}^m$ Proof: Suppose first that $m = 1$, so that $f\colon \mathbb{R}^n \to \mathbb{R}$. Define $h\colon \mathbb{R} \to \mathbb{R}^n$ by $h(x) = (a^1, \dots, x, \dots, a^n)$, with $x$ in the $j$th place. Then
$$D_j f(a) = (f \circ h)'(a^j).$$
The Derivative of $f\colon \mathbb{R}^n \to \mathbb{R}^m$ Proof: Hence, by the Chain Rule, we have
$$(f \circ h)'(a^j) = Df(h(a^j)) \circ Dh(a^j) = Df(a) \circ Dh(a^j).$$