Least squares method

Presentation Transcript


  1. Least squares method
  Let the adjustable parameters for structure refinement be $u_j$. Then if
  $R = \sum_{hkl} w(hkl)\,(|F_{obs}| - |F_{calc}|)^2 = \sum_{hkl} w\,\Delta^2$
  we must have $\partial R / \partial u_i = 0$, one equation per parameter.

  2. Least squares method
  Let the adjustable parameters for structure refinement be $u_j$. Then if
  $R = \sum_{hkl} w(hkl)\,(|F_{obs}| - |F_{calc}|)^2 = \sum_{hkl} w\,\Delta^2$
  we must have $\partial R / \partial u_i = 0$, one equation per parameter. Then
  $\sum_{hkl} w\,\Delta\,\partial|F_c| / \partial u_i = 0$
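
To make the refinement criterion on slides 1–2 concrete, here is a minimal Python sketch. The function calc_F is a hypothetical stand-in for a real structure-factor calculation (not the crystallographic model from the presentation), and the derivatives are taken numerically rather than analytically.

```python
import numpy as np

def calc_F(params, hkl):
    """Hypothetical stand-in for a structure-factor model |F_calc|(u; hkl)."""
    h = np.asarray(hkl, dtype=float)
    # toy model: |F_calc| varies smoothly with the parameters u_j
    return np.abs(params[0] * np.cos(h @ params[1:]) + params[0])

def weighted_R(params, hkl_list, F_obs, w):
    """R = sum_hkl w(hkl) * (|F_obs| - |F_calc|)^2."""
    F_calc = np.array([calc_F(params, hkl) for hkl in hkl_list])
    delta = F_obs - F_calc
    return np.sum(w * delta**2)

def grad_R(params, hkl_list, F_obs, w, eps=1e-6):
    """Finite-difference derivatives dR/du_i; refinement drives these to zero."""
    g = np.zeros_like(params)
    for i in range(len(params)):
        p_plus, p_minus = params.copy(), params.copy()
        p_plus[i] += eps
        p_minus[i] -= eps
        g[i] = (weighted_R(p_plus, hkl_list, F_obs, w)
                - weighted_R(p_minus, hkl_list, F_obs, w)) / (2 * eps)
    return g

# Illustrative values only
hkl_list = [(1, 0, 0), (1, 1, 0), (2, 1, 1)]
u = np.array([1.0, 0.2, 0.3, 0.1])   # trial parameters u_j
F_obs = np.array([1.9, 1.4, 0.8])
w = np.ones(3)                       # unit weights for the sketch
print(weighted_R(u, hkl_list, F_obs, w))
print(grad_R(u, hkl_list, F_obs, w))
```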

  3. Least squares
  Simple example, again. To solve simultaneous linear equations:
  $a_{11}x_1 + a_{12}x_2 + \dots = y_1$
  $a_{21}x_1 + a_{22}x_2 + \dots = y_2$
  If $A = (a_{ij})$, $x = (x_1, x_2, \dots)^T$, and $y = (y_1, y_2, \dots)^T$, then the simultaneous equations are given by $A x = y$.

  4. Least squares
  Suppose:
  $a_{11}x_1 + a_{12}x_2 + \dots \approx y_1$
  $a_{21}x_1 + a_{22}x_2 + \dots \approx y_2$
  Then:
  $a_{11}x_1 + a_{12}x_2 + \dots - y_1 = e_1$
  $a_{21}x_1 + a_{22}x_2 + \dots - y_2 = e_2$
  There is no exact solution as before, but we can get the best solution by minimizing $\sum_i e_i^2$.

  5. Least squares
  $a_{11}x_1 + a_{12}x_2 + \dots - y_1 = e_1$
  $a_{21}x_1 + a_{22}x_2 + \dots - y_2 = e_2$
  There is no exact solution as before, but we can get the best solution by minimizing $\sum_i e_i^2$.
  Also, note that the number of observations exceeds the number of variable parameters ($n > m$).

  6. Least squares
  Minimize:
  $\sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( \sum_{j=1}^{m} a_{ij}x_j - y_i \right)^2$

  7. Least squares
  To illustrate the calculation, let $n = m = 2$:
  $(a_{11}x_1 + a_{12}x_2 - y_1)^2 = e_1^2$
  $(a_{21}x_1 + a_{22}x_2 - y_2)^2 = e_2^2$
  Take the partial derivative with respect to $x_1$ and set it to 0. The derivative contributes one term from each residual:
  $(a_{11}x_1 + a_{12}x_2 - y_1)\,a_{11}$
  $(a_{21}x_1 + a_{22}x_2 - y_2)\,a_{21}$

  8. Least squares
  To illustrate the calculation, let $n = m = 2$:
  $(a_{11}x_1 + a_{12}x_2 - y_1)^2 = e_1^2$
  $(a_{21}x_1 + a_{22}x_2 - y_2)^2 = e_2^2$
  Take the partial derivative with respect to $x_1$ and set it to 0. Expanding each term:
  $a_{11}a_{11}\,x_1 + a_{11}a_{12}\,x_2 - a_{11}\,y_1$
  $a_{21}a_{21}\,x_1 + a_{21}a_{22}\,x_2 - a_{21}\,y_2$
  Their sum must vanish, giving
  $(a_{11}a_{11} + a_{21}a_{21})\,x_1 + (a_{11}a_{12} + a_{21}a_{22})\,x_2 = a_{11}y_1 + a_{21}y_2$

  9. Least squares
  $(a_{11}a_{11} + a_{21}a_{21})\,x_1 + (a_{11}a_{12} + a_{21}a_{22})\,x_2 = a_{11}y_1 + a_{21}y_2$
  i.e.
  $x_1 \sum_{i=1}^{2} a_{i1}^2 + x_2 \sum_{i=1}^{2} a_{i1}a_{i2} = \sum_{i=1}^{2} a_{i1}y_i$

  10. Least squares
  $x_1 \sum_{i=1}^{2} a_{i1}^2 + x_2 \sum_{i=1}^{2} a_{i1}a_{i2} = \sum_{i=1}^{2} a_{i1}y_i$
  Now consider the matrix $A = (a_{ij})$ and its transpose $A^T$.

  11. Least squares
  $x_1 \sum_{i=1}^{2} a_{i1}^2 + x_2 \sum_{i=1}^{2} a_{i1}a_{i2} = \sum_{i=1}^{2} a_{i1}y_i$
  Now consider the product $A^T A$: its entries are exactly these sums, $(A^T A)_{kj} = \sum_i a_{ik}a_{ij}$.

  12. Least squares
  $x_1 \sum_{i=1}^{2} a_{i1}^2 + x_2 \sum_{i=1}^{2} a_{i1}a_{i2} = \sum_{i=1}^{2} a_{i1}y_i$
  Now consider the product $A^T A$. In matrix form, these normal equations are:
  $(A^T A)\,x = A^T y$
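
As a sanity check on the two-observation example of slides 7–12, the sketch below (with arbitrary illustrative coefficients, nothing from the presentation) builds the first normal equation element by element and confirms that it matches the matrix form $(A^T A)x = A^T y$.

```python
import numpy as np

# Arbitrary 2x2 example coefficients (illustrative values only)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
y = np.array([5.0, 10.0])

# Normal equation built element by element, as on the slides:
# (a11*a11 + a21*a21) x1 + (a11*a12 + a21*a22) x2 = a11*y1 + a21*y2
lhs_row1 = [A[0, 0]**2 + A[1, 0]**2, A[0, 0]*A[0, 1] + A[1, 0]*A[1, 1]]
rhs_row1 = A[0, 0]*y[0] + A[1, 0]*y[1]

# Matrix form: (A^T A) x = A^T y
AtA = A.T @ A
Aty = A.T @ y

print(np.allclose(AtA[0], lhs_row1), np.isclose(Aty[0], rhs_row1))  # True True
print(np.linalg.solve(AtA, Aty))   # solution of the normal equations
```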

  13. Least squares
  In general, for $n$ observations and $m$ unknowns ($k = 1, \dots, m$):
  $\sum_{j=1}^{m} \left( \sum_{i=1}^{n} a_{ik} a_{ij} \right) x_j = \sum_{i=1}^{n} a_{ik} y_i$

  14. Least squares
  In general, for $n$ observations and $m$ unknowns ($k = 1, \dots, m$):
  $\sum_{j=1}^{m} \left( \sum_{i=1}^{n} a_{ik} a_{ij} \right) x_j = \sum_{i=1}^{n} a_{ik} y_i$
  In matrix form: $(A^T A)\,x = A^T y$

  15. Least squares
  In general:
  $(A^T A)\,x = A^T y$
  $x = (A^T A)^{-1} A^T y$
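
A minimal NumPy sketch of the general solution on slide 15, using randomly generated demonstration data (nothing here comes from the presentation). The explicit normal-equation formula is compared with np.linalg.lstsq, which solves the same problem in a numerically more stable way.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 10, 3                      # more observations than parameters (n > m)
A = rng.normal(size=(n, m))       # design matrix of coefficients a_ij
x_true = np.array([1.0, -2.0, 0.5])
y = A @ x_true + 0.01 * rng.normal(size=n)   # observations with small errors

# Normal-equation solution: x = (A^T A)^{-1} A^T y
x_normal = np.linalg.inv(A.T @ A) @ (A.T @ y)

# Library solution (QR/SVD based, preferred in practice)
x_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)

print(x_normal)
print(x_lstsq)   # both should be close to x_true
```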

  16. Least squares
  Again: suppose the observational equations are $f_i(x_1, \dots, x_m) \approx y_i$, where the $f_i$ are not linear in the $x_j$.

  17. Least squares
  Again: the $f_i$ are not linear in the $x_j$.
  Expand the $f_i$ in a Taylor series.

  18. Least squares
  Again: the $f_i$ are not linear in the $x_j$.
  Expand the $f_i$ in a Taylor series about trial values $x_j^0$, keeping only the linear terms:
  $f_i(x) \approx f_i(x^0) + \sum_{j=1}^{m} \left( \partial f_i / \partial x_j \right)_{x^0} \Delta x_j$

  19. Least squares
  Solve, as before, treating the linearized equations as a linear system in the shifts $\Delta x_j$:
  $\sum_{j=1}^{m} \left( \partial f_i / \partial x_j \right)_{x^0} \Delta x_j \approx y_i - f_i(x^0)$

  20. Least squares
  Solve, as before, with $a_{ij} = \left( \partial f_i / \partial x_j \right)_{x^0}$ and right-hand side $\Delta y_i = y_i - f_i(x^0)$:
  $(A^T A)\,\Delta x = A^T \Delta y$

  21. Least squares
  Solve, as before:
  $(A^T A)\,\Delta x = A^T \Delta y$
  $\Delta x = (A^T A)^{-1} A^T \Delta y$
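
A sketch of one "solve as before" cycle for a nonlinear model, assuming a simple hypothetical function $f(x; t) = x_1 e^{-x_2 t}$ in place of the structure-factor expression. The Jacobian is taken by finite differences and the shifts come from the normal equations of slides 19–21.

```python
import numpy as np

def model(params, t):
    """Hypothetical nonlinear model f_i(x) = x1 * exp(-x2 * t_i)."""
    return params[0] * np.exp(-params[1] * t)

def jacobian(params, t, eps=1e-7):
    """Finite-difference Jacobian A_ij = (d f_i / d x_j) at the trial parameters."""
    A = np.zeros((len(t), len(params)))
    for j in range(len(params)):
        p = params.copy()
        p[j] += eps
        A[:, j] = (model(p, t) - model(params, t)) / eps
    return A

def shift_step(params0, t, y_obs):
    """One least-squares cycle: solve (A^T A) dx = A^T dy for the shifts dx."""
    A = jacobian(params0, t)
    dy = y_obs - model(params0, t)           # residuals at the trial parameters
    dx = np.linalg.solve(A.T @ A, A.T @ dy)  # normal equations for the shifts
    return dx

# Illustrative data generated from "true" parameters [2.0, 0.5]
t = np.linspace(0.0, 4.0, 20)
y_obs = 2.0 * np.exp(-0.5 * t)
x0 = np.array([1.5, 0.8])                    # initial (trial) parameters x_j^0
print(shift_step(x0, t, y_obs))              # shifts to add to x0
```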

  22. Least squares
  Weighting factors matrix: $W = \mathrm{diag}(w_1, \dots, w_n)$, so the normal equations become
  $(A^T W A)\,\Delta x = A^T W \Delta y$
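
The same step with weights, assuming a diagonal weighting matrix W (for example $w_i = 1/\sigma_i^2$); the numbers here are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 12, 3
A = rng.normal(size=(n, m))          # Jacobian / design matrix
dy = rng.normal(size=n)              # residuals y_i - f_i(x^0)
w = 1.0 / (0.1 + rng.random(n))      # per-observation weights, e.g. 1/sigma_i^2
W = np.diag(w)                       # weighting factors matrix

# Weighted normal equations: (A^T W A) dx = A^T W dy
dx = np.linalg.solve(A.T @ W @ A, A.T @ W @ dy)
print(dx)
```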

  23. Least squares
  So:
  We need a set of initial parameters $x_j^0$.
  Problem: the solution gives shifts $\Delta x_j$, not the $x_j$ themselves.

  24. Least squares
  So:
  We need a set of initial parameters $x_j^0$.
  Problem: the solution gives shifts $\Delta x_j$, not the $x_j$ themselves.
  The equations are not exact, so the refinement requires a number of cycles to complete.
  Add the shifts $\Delta x_j$ to the $x_j^0$ for each new refinement cycle.
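
A sketch of the full refinement loop of slides 23–24: start from initial parameters, solve for the shifts each cycle, and add them to the current parameters until the shifts become negligible. The model function is the same hypothetical exponential used in the earlier sketch, not the crystallographic model.

```python
import numpy as np

def model(params, t):
    """Hypothetical nonlinear model, as in the earlier sketch."""
    return params[0] * np.exp(-params[1] * t)

def refine(x0, t, y_obs, cycles=10, eps=1e-7):
    """Iterative refinement: each cycle solves for shifts and adds them to the parameters."""
    x = np.array(x0, dtype=float)
    for _ in range(cycles):
        # finite-difference Jacobian at the current parameters
        A = np.column_stack([(model(x + eps * np.eye(len(x))[j], t) - model(x, t)) / eps
                             for j in range(len(x))])
        dy = y_obs - model(x, t)
        dx = np.linalg.solve(A.T @ A, A.T @ dy)   # shifts for this cycle
        x += dx                                   # add shifts to the current parameters
        if np.max(np.abs(dx)) < 1e-10:            # stop when the shifts become negligible
            break
    return x

t = np.linspace(0.0, 4.0, 20)
y_obs = 2.0 * np.exp(-0.5 * t)
print(refine([1.5, 0.8], t, y_obs))   # should converge near [2.0, 0.5]
```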

  25. Least squares
  How good are the final parameters? Use the usual procedure to calculate standard deviations:
  $\sigma^2(x_j) = \left[ (A^T W A)^{-1} \right]_{jj} \dfrac{\sum_i w_i\,\Delta_i^2}{n - m}$
  where $n$ = number of observations and $m$ = number of parameters.
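
A sketch of this variance estimate, assuming the formula quoted above (diagonal of the inverse normal matrix scaled by the weighted residual sum over $n - m$); the data are synthetic and only for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 15, 3
A = rng.normal(size=(n, m))                  # Jacobian at convergence
w = np.ones(n)                               # unit weights for simplicity
y = A @ np.array([1.0, 2.0, 3.0]) + 0.05 * rng.normal(size=n)

# Weighted least-squares solution and residuals
M = A.T @ (w[:, None] * A)                   # normal matrix A^T W A
x = np.linalg.solve(M, A.T @ (w * y))
resid = y - A @ x

# Estimated variances: sigma^2(x_j) = [M^{-1}]_jj * sum(w e^2) / (n - m)
Minv = np.linalg.inv(M)
sigma2 = np.diag(Minv) * np.sum(w * resid**2) / (n - m)
print(np.sqrt(sigma2))                       # standard deviations s(x_j)
```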

  26. Least squares
  Warning: frequently, all parameters cannot be “let go” at the same time.
  How to tell which parameters can be refined simultaneously?

  27. Least squares
  Warning: frequently, all parameters cannot be “let go” at the same time.
  How to tell which parameters can be refined simultaneously? Use the correlation matrix:
  Calculate the correlation matrix for each refinement cycle.
  Look for strong interactions ($r_{ij} > +0.5$ or $r_{ij} < -0.5$, roughly).
  If two parameters interact, hold one constant.
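
A sketch of the correlation-matrix check on slide 27, using a synthetic design matrix in which two columns are deliberately made nearly dependent so that a strong interaction ($|r_{ij}| > 0.5$) shows up.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 20, 4
A = rng.normal(size=(n, m))
A[:, 3] = A[:, 2] + 0.05 * rng.normal(size=n)   # two nearly dependent parameters

# Covariance matrix (up to a common scale factor) and correlation matrix
cov = np.linalg.inv(A.T @ A)
d = np.sqrt(np.diag(cov))
corr = cov / np.outer(d, d)                     # r_ij = cov_ij / (s_i * s_j)

# Flag strongly interacting parameter pairs (|r_ij| > 0.5, roughly)
for i in range(m):
    for j in range(i + 1, m):
        if abs(corr[i, j]) > 0.5:
            print(f"parameters {i} and {j} interact: r = {corr[i, j]:+.2f}")
```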
