Ben Gurion University of the Negev
www.bgu.ac.il/atomchip, www.bgu.ac.il/nanocenter

Physics 3 for Electrical Engineering
Lecturers: Daniel Rohrlich, Ron Folman
Teaching Assistants: Daniel Ariad, Barukh Dolgin

Week 7. Quantum mechanics
• scalar product of wave functions
• Hermitian operators, eigenvalues and eigenfunctions
• expectation values
• eigenfunction expansions
• Dirac’s δ-function
• commutators
• generalized uncertainty principle

Sources: Merzbacher (2nd edition) Chap. 8; Merzbacher (3rd edition) Chap. 3 Sects. 3–4, Chap. 4 Sects. 1–4, Chap. 10 Sect. 5 and Appendix 1.
We have seen a few solutions of Schrödinger’s equation, and at the same time we are still trying to understand it. Let’s take a fresh look at the time-independent Schrödinger equation:

Ĥ ψE(x) = E ψE(x) ,  where  Ĥ = −(ℏ²/2m) d²/dx² + V(x)  in 1D.

The solutions are the energy eigenfunctions ψE(x). What kind of equation is that?
Similarly, if we define the momentum operator

p̂ = −iℏ d/dx ,

then the solutions e^{±ikx} of the free time-independent Schrödinger equation satisfy

p̂ e^{±ikx} = ±ℏk e^{±ikx} .

What kind of equation is that?
One more: Let Π̂ be the parity operation, defined by Π̂ψ(x) = ψ(−x); then the solutions ψ(x) for a symmetric 1D square well satisfy

Π̂ψ(x) = ±ψ(x) .

What kind of equation is that?
Hermitian operators, eigenvalues and eigenfunctions

Ĥ, p̂ and Π̂ are examples of linear operators: If Âψ = aψ, for some scalar a and some nonzero ψ, then we call a the eigenvalue of Â and we call ψ the (corresponding) eigenvector of Â. Dirac defined a “ket” vector notation: |ψ⟩, or even |a⟩. He also defined a “bra” vector, which looks like this: ⟨ψ|.
Scalar product of wave functions

A vector space has a “scalar product” of two wave functions ψ(x) and φ(x); it is

⟨ψ|φ⟩ = ∫ ψ*(x) φ(x) dx

in Dirac’s “bra-ket” notation. Note that the scalar product of ψ(x) and φ(x) depends on their order: ⟨φ|ψ⟩ = ⟨ψ|φ⟩*. The scalar product of ψ(x) with itself,

⟨ψ|ψ⟩ = ∫ |ψ(x)|² dx ,

is real and non-negative.
Ĥ, p̂ and Π̂ are not only linear operators, they are also physical operators (“observables”) that represent physical quantities. Therefore their eigenvalues must be real. The set of all eigenvectors of an observable forms the basis of a vector space. This vector space contains all possible states of the system. It is complete.

Example: The eigenvectors of p̂ are all the functions e^{ikx}. Any function on the line can be written as a linear combination of these eigenvectors (Fourier analysis on a line).

Example: The energy eigenvectors of an infinite 1D square well between 0 and L can be written

ψn(x) = √(2/L) sin(nπx/L) ,  n = 1, 2, 3, … ,

and any function that vanishes at x = 0 and x = L can be written as a linear sum of these ψn(x) (Fourier analysis on an interval).

But how do we know which operators represent physical quantities? How do we know whether an operator has only real eigenvalues? A partial answer: any observable Â must be “Hermitian”:

⟨φ|Âψ⟩ = ⟨Âφ|ψ⟩ ,  i.e.  ∫ φ*(x) [Âψ(x)] dx = ∫ [Âφ(x)]* ψ(x) dx ,

where φ(x) and ψ(x) are any two wave functions.
Why Hermitian? We can prove that if Â is Hermitian then
1. the eigenvalues ai of Â are real, and
2. the eigenvectors ψi of Â can be chosen orthogonal.
Proof:
1. ai ⟨ψi|ψi⟩ = ⟨ψi|Âψi⟩ = ⟨Âψi|ψi⟩ = ai* ⟨ψi|ψi⟩, hence every ai is real;
2. (ai − aj) ⟨ψj|ψi⟩ = ⟨ψj|Âψi⟩ − ⟨Âψj|ψi⟩ = 0, hence either ai = aj or ⟨ψj|ψi⟩ = 0, i.e. ψi and ψj are orthogonal. (If ai = aj then we can choose linear combinations of ψi and ψj that are eigenvectors of Â and orthogonal.)
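Both conclusions can be illustrated numerically. The sketch below assumes NumPy and uses a random finite-dimensional Hermitian matrix as a stand-in for an observable; `eigh` is NumPy’s eigensolver for Hermitian matrices.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random Hermitian matrix A = M + M† (finite-dimensional stand-in for an observable).
M = rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5))
A = M + M.conj().T

vals, vecs = np.linalg.eigh(A)   # eigh assumes a Hermitian matrix

# 1. The eigenvalues come out real.
print(np.isrealobj(vals))                               # True
# 2. The eigenvectors are orthonormal: V† V = I.
print(np.allclose(vecs.conj().T @ vecs, np.eye(5)))     # True
```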
The observable Ĥ has a special name: it is the “Hamiltonian”. Let us show that the Hamiltonian is a Hermitian operator, i.e. that ⟨φ|Ĥψ⟩ = ⟨Ĥφ|ψ⟩. We assume that all wave functions φ(x) and ψ(x) vanish for |x| → ∞. Now

⟨φ|Ĥψ⟩ = ∫ φ*(x) [ −(ℏ²/2m) d²ψ/dx² + V(x) ψ(x) ] dx .

Integrating by parts twice (the boundary terms vanish), we get

∫ φ*(x) (d²ψ/dx²) dx = ∫ (d²φ*/dx²) ψ(x) dx .
Thus

⟨φ|Ĥψ⟩ = ∫ [ −(ℏ²/2m) d²φ/dx² + V(x) φ(x) ]* ψ(x) dx = ⟨Ĥφ|ψ⟩ ,

and Ĥ is Hermitian. The fact that Ĥ is Hermitian leads to two important conclusions:
1. Ei ⟨ψi|ψi⟩ = ⟨ψi|Ĥψi⟩ = ⟨Ĥψi|ψi⟩ = Ei* ⟨ψi|ψi⟩, hence every Ei is real;
2. (Ei − Ej) ⟨ψj|ψi⟩ = 0, hence either Ei = Ej or ⟨ψj|ψi⟩ = 0, i.e. ψi and ψj are orthogonal.
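As an illustration, the Hermiticity of Ĥ can be checked on a grid. This is a minimal sketch assuming NumPy, with ℏ = m = 1; the harmonic potential and the two wave packets are arbitrary choices, not part of the lecture.

```python
import numpy as np

# Discretize H = -(hbar^2/2m) d^2/dx^2 + V(x), with wave functions
# vanishing at the grid edges (hbar = m = 1).
N, L = 200, 10.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# Second-derivative matrix (central differences, Dirichlet boundaries).
D2 = (np.diag(np.full(N - 1, 1.0), -1)
      - 2.0 * np.eye(N)
      + np.diag(np.full(N - 1, 1.0), 1)) / dx**2
H = -0.5 * D2 + np.diag(0.5 * x**2)      # harmonic V(x) as an example

# Two arbitrary (complex) wave functions that vanish at the edges.
phi = np.exp(-x**2) * np.exp(1j * x)
psi = np.exp(-(x - 1.0)**2)

# <phi|H psi> and <H phi|psi> agree, so H acts as a Hermitian operator.
lhs = np.sum(phi.conj() * (H @ psi)) * dx
rhs = np.sum((H @ phi).conj() * psi) * dx
print(np.isclose(lhs, rhs))              # True
```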
In fact, we can always form a complete orthonormal basis {ψi(x)} for the vector space of states out of eigenstates of Ĥ. For any ψ(x) we can write

ψ(x) = Σi ci ψi(x) .

“Ortho” means that ⟨ψi|ψj⟩ = 0 if i ≠ j, and “normal” means that ⟨ψi|ψi⟩ = 1 for every i; together, ⟨ψi|ψj⟩ = δij. Assuming that ψ(x) is normalized, too, we find

1 = ⟨ψ|ψ⟩ = Σij ci* cj ⟨ψi|ψj⟩ = Σi |ci|² .
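This expansion can be tested numerically for the infinite square well of the earlier example. A sketch assuming NumPy; the test state x(L − x) is an arbitrary choice of a function vanishing at both walls.

```python
import numpy as np

# Infinite square well on [0, L]: psi_n(x) = sqrt(2/L) sin(n pi x / L).
L, N = 1.0, 2000
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]

def psi_n(n):
    return np.sqrt(2.0 / L) * np.sin(n * np.pi * x / L)

# A normalized test state that vanishes at x = 0 and x = L
# (the shape x(L - x) is an arbitrary choice).
psi = x * (L - x)
psi /= np.sqrt(np.sum(psi**2) * dx)

# Coefficients c_n = <psi_n|psi>; by completeness, sum |c_n|^2 = 1.
c = np.array([np.sum(psi_n(n) * psi) * dx for n in range(1, 50)])
print(np.isclose(np.sum(c**2), 1.0, atol=1e-4))   # True
```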
And now we can generalize our probability rule: Not only is |φ(x)|² the probability density to find a particle in the state φ(x) at the point x, but also |⟨ψj|ψ⟩|² = |cj|² is the probability that a particle in the state ψ will be found in the state ψj. If the energy Ej is non-degenerate, then |cj|² is the probability to find the particle with energy Ej. From here it is an easy step to define an expectation value:

⟨E⟩ = Σj |cj|² Ej = ⟨ψ|Ĥψ⟩

is the average energy in the state ψ.
Expectation values

Whatever we just proved about the Hamiltonian applies also to every other observable, because every physical operator is Hermitian. For example, the expectation value of momentum in the state ψ(x) is

⟨p⟩ = ⟨ψ|p̂ψ⟩ = ∫ ψ*(x) (−iℏ d/dx) ψ(x) dx .

The expectation value (average value) of the position in the state ψ(x) is

⟨x⟩ = ∫ ψ*(x) x ψ(x) dx = ∫ x |ψ(x)|² dx ,

which is just an ordinary common-sense average. So the position operator x̂ is simply multiplication by x: x̂ψ(x) = xψ(x).
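Both expectation values can be evaluated numerically for a Gaussian wave packet. A sketch assuming NumPy, with ℏ = 1; the packet parameters x0, k0, σ are arbitrary choices. For such a packet, ⟨x⟩ = x0 and ⟨p⟩ = ℏk0.

```python
import numpy as np

hbar = 1.0
N, L = 4000, 40.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# Gaussian wave packet centered at x0 with mean momentum k0 (assumed values).
x0, k0, sigma = 1.5, 2.0, 1.0
psi = np.exp(-(x - x0)**2 / (4 * sigma**2)) * np.exp(1j * k0 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# <x> = integral of x |psi|^2 dx  (an ordinary average over |psi|^2).
mean_x = np.sum(np.abs(psi)**2 * x) * dx

# <p> = integral of psi* (-i hbar d/dx) psi dx  (central differences).
dpsi = np.gradient(psi, dx)
mean_p = np.real(np.sum(psi.conj() * (-1j * hbar) * dpsi) * dx)

print(np.isclose(mean_x, x0, atol=1e-4))           # True
print(np.isclose(mean_p, hbar * k0, atol=1e-3))    # True
```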
Eigenfunction expansions

We saw that every state ψ(x) has an expansion in eigenstates of the Hamiltonian. The same is true for any other observable Â: the state ψ(x) has an expansion in eigenstates ψj(x) of Â:

ψ(x) = Σj cj ψj(x) ,  where  Âψj(x) = aj ψj(x) ,

and the coefficients cj = ⟨ψj|ψ⟩ depend on ψ.
Dirac’s δ-function

What about x? We declared x̂ = x to be an operator, but what is its eigenfunction? Apparently the eigenfunction of x̂ should be a wave function that is zero everywhere except at some point a. We can approximate such a function in many ways, e.g. as a very thin gaussian function (ε → 0):

(1/√(πε)) e^{−(x−a)²/ε} .

Dirac instead defined a “δ-function” δ(x−a) with the following properties: δ(x−a) = 0 for x ≠ a, and for any function f(x),

∫ f(x) δ(x−a) dx = f(a) .
Dirac’s δ-function

Dirac’s δ-function is an eigenfunction of x̂, since x̂ δ(x−a) = a δ(x−a), with “δ-function normalization”:

∫ δ(x−a) δ(x−a′) dx = δ(a−a′) .

Using Dirac’s δ-function, we can write an eigenfunction expansion for φ(x) as follows:

φ(x) = ∫ φ(a) δ(x−a) da ,

where we regard φ(a) as the “coefficient” of the eigenfunction δ(x−a).
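The thin-gaussian picture of δ(x−a) can be checked numerically: as ε shrinks, ∫ f(x) δε(x−a) dx approaches f(a). A sketch assuming NumPy, with f(x) = cos x as an arbitrary test function.

```python
import numpy as np

# delta(x - a) as the limit of a thin normalized gaussian of width eps.
def delta_eps(x, a, eps):
    return np.exp(-(x - a)**2 / eps**2) / (eps * np.sqrt(np.pi))

x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]
f = np.cos            # arbitrary smooth test function
a = 0.5

# Sifting property: the integral of f(x) delta_eps(x - a) tends to f(a).
errors = []
for eps in (1.0, 0.3, 0.1):
    approx = np.sum(f(x) * delta_eps(x, a, eps)) * dx
    errors.append(abs(approx - f(a)))

print(errors[0] > errors[1] > errors[2])   # True: converging to f(a)
```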
Dirac’s δ-function

Dirac’s “δ-function normalization” applies also to momentum eigenfunctions e^{ik1x} and e^{ik2x}, because their scalar product is

∫ e^{−ik1x} e^{ik2x} dx = 2π δ(k1−k2) .

(The integral is infinitely bigger for k1 = k2 than for k1 ≠ k2.)
Dirac’s δ-function

One way to “tame” the integral ∫ e^{ikx} dx is to treat it as

∫ e^{ikx − ε|x|} dx = 2ε/(k² + ε²)

in the limit ε → 0. Multiplying this by f(k) and integrating over k, we get

∫ f(k) [2ε/(k² + ε²)] dk → 2π f(0)  as ε → 0,

consistent with ∫ e^{ikx} dx = 2π δ(k).
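This regularization can be checked numerically: the Lorentzian 2ε/(k² + ε²), integrated against a smooth f(k), approaches 2π f(0) as ε → 0. A sketch assuming NumPy, with f(k) = e^{−k²} (so f(0) = 1) as an arbitrary test function.

```python
import numpy as np

# Regularized integral of e^{ikx}: 2*eps/(k^2 + eps^2), a Lorentzian
# that behaves like 2*pi*delta(k) as eps -> 0.
k = np.linspace(-200.0, 200.0, 400001)
dk = k[1] - k[0]
f = lambda kk: np.exp(-kk**2)    # arbitrary smooth test function, f(0) = 1

vals = []
for eps in (1.0, 0.1, 0.01):
    lorentzian = 2 * eps / (k**2 + eps**2)
    vals.append(np.sum(f(k) * lorentzian) * dk / (2 * np.pi))

# As eps shrinks, the value approaches f(0) = 1.
print(abs(vals[-1] - 1.0) < 0.02)   # True
```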
Commutators

Operators are different from numbers, and in particular, they don’t obey the rule xy = yx that real and complex numbers x, y obey. If we apply two operators Â and B̂ to a state ψ, we may get a different result than if we apply them in the opposite order: ÂB̂ψ ≠ B̂Âψ, in general. The commutator of Â and B̂ is defined as

[Â, B̂] = ÂB̂ − B̂Â ,

operating on a state ψ.
Commutators

If a Hermitian operator Â commutes with Ĥ, i.e. [Â, Ĥ] = 0, then we can find eigenstates of Ĥ that are also eigenstates of Â.

Example: If Ĥ commutes with the parity operator Π̂, then for any state ψ, ĤΠ̂ψ = Π̂Ĥψ. So if ψ is an eigenvector of Ĥ with eigenvalue E, i.e. Ĥψ = Eψ, then so is Π̂ψ:

Ĥ(Π̂ψ) = Π̂Ĥψ = E(Π̂ψ) .

If E is nondegenerate, then Π̂ψ = αψ for some number α, so ψ is an eigenstate of Π̂. If E is degenerate, then ψ ± Π̂ψ are eigenstates of Π̂, since

Π̂(ψ ± Π̂ψ) = Π̂ψ ± Π̂²ψ = ±(ψ ± Π̂ψ) .
Let’s calculate the commutator [x̂, p̂] by applying it to a wave function ψ(x):

[x̂, p̂] ψ(x) = x (−iℏ dψ/dx) − (−iℏ) d(xψ)/dx = −iℏ x ψ′ + iℏ (ψ + x ψ′) = iℏ ψ(x) ,

hence [x̂, p̂] = iℏ.
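The same calculation can be mimicked on a grid: applying x̂p̂ − p̂x̂ to a test function via central differences returns approximately iℏψ(x). A sketch assuming NumPy, with ℏ = 1 and an arbitrary test function.

```python
import numpy as np

hbar = 1.0
N, L = 2000, 20.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# A smooth test wave function (negligible at the grid edges).
psi = np.exp(-x**2) * np.sin(x)

def p(f):
    """Momentum operator -i*hbar d/dx by central differences."""
    return -1j * hbar * np.gradient(f, dx)

# [x, p] psi = x p(psi) - p(x psi); compare with i*hbar*psi
# away from the grid edges, where gradient is one-sided.
commutator = x * p(psi) - p(x * psi)
interior = slice(10, -10)
print(np.allclose(commutator[interior], 1j * hbar * psi[interior], atol=1e-3))  # True
```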
Generalized uncertainty principle

Definition: (ΔA)² = ⟨ψ|(Â − ⟨A⟩)² ψ⟩, where ⟨A⟩ = ⟨ψ|Âψ⟩, etc.

Theorem (Schwarz inequality): |⟨φ|ψ⟩|² ≤ ⟨φ|φ⟩ ⟨ψ|ψ⟩.

Proof: We have ψ = cφ + χ for some scalar c, where χ is a vector orthogonal to φ, and if we apply the bra ⟨φ| to each side we obtain c = ⟨φ|ψ⟩/⟨φ|φ⟩. If we take the scalar product of each side with itself, we obtain

⟨ψ|ψ⟩ = |c|² ⟨φ|φ⟩ + ⟨χ|χ⟩ ≥ |⟨φ|ψ⟩|² / ⟨φ|φ⟩ .
Generalized uncertainty principle

Suppose ψA = (Â − ⟨A⟩)ψ and ψB = (B̂ − ⟨B⟩)ψ, where ψA and ψB are different, in general. Then (ΔA)² = ⟨ψA|ψA⟩ and (ΔB)² = ⟨ψB|ψB⟩, and the Schwarz inequality gives

(ΔA)² (ΔB)² = ⟨ψA|ψA⟩ ⟨ψB|ψB⟩ ≥ |⟨ψA|ψB⟩|² ≥ [Im ⟨ψA|ψB⟩]² .

Now if we calculate ⟨ψA|ψB⟩ − ⟨ψB|ψA⟩ we have

⟨ψA|ψB⟩ − ⟨ψB|ψA⟩ = ⟨ψ|[Â, B̂] ψ⟩ ,

and therefore

ΔA ΔB ≥ ½ |⟨[Â, B̂]⟩| .
Application to position and momentum uncertainty (Heisenberg’s uncertainty relation for Δx and Δp): with Â = x̂ and B̂ = p̂ we have [x̂, p̂] = iℏ, so

Δx Δp ≥ ½ |⟨[x̂, p̂]⟩| = ℏ/2 .
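A Gaussian wave packet saturates this bound: Δx Δp = ℏ/2. A numerical sketch assuming NumPy, with ℏ = 1 and an arbitrary width σ; for a real ψ, ⟨p⟩ = 0 and (Δp)² = ℏ² ∫ (ψ′)² dx after integrating by parts.

```python
import numpy as np

hbar = 1.0
N, L = 4000, 40.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# A Gaussian is the minimum-uncertainty state.
sigma = 0.7                       # arbitrary width
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(psi**2) * dx)

mean_x = np.sum(psi**2 * x) * dx
var_x = np.sum(psi**2 * (x - mean_x)**2) * dx      # (Delta x)^2

# For a real psi, <p> = 0 and (Delta p)^2 = hbar^2 * integral of (psi')^2.
dpsi = np.gradient(psi, dx)
var_p = hbar**2 * np.sum(dpsi**2) * dx             # (Delta p)^2

print(np.isclose(np.sqrt(var_x * var_p), hbar / 2, atol=1e-3))   # True
```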