This presentation covers optimization on graphs, illustrated by isotonic regression: predicting a child's height from the mother's height. The objective is to fit an increasing function that satisfies the pairwise order constraints encoded by a graph, minimizing a cost function subject to a constraint for every edge.
Optimization on Graphs
Graph G = (V, E), with edge constraints encoding the pairwise interactions: x_i ≤ x_j for all (i, j) ∈ E.
Objective: minimize a sum of convex functions over the edges, subject to the edge constraints.
Isotonic Regression
Predict a child's height from the mother's height. What model? An increasing function: a taller mother should never predict a shorter child. (Plot: height of child vs. height of mother.)
(Plot: observed data points, height of child vs. height of mother.)
Isotonic Regression
Sort the data points by mother's height and fit predicted values x_1, …, x_n to the observed child heights y_1, …, y_n.
Constraint: x_i ≤ x_j for all i ≤ j.
Cost: Σ_i (x_i − y_i)².
(Plot: increasing fit over the data, height of child vs. height of mother.)
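As a sketch of this one-dimensional problem, the classic pool-adjacent-violators algorithm (PAVA) computes the increasing fit minimizing the squared cost. This minimal implementation (not from the slides) assumes the observations are already sorted by mother's height.

```python
# Minimal pool-adjacent-violators (PAVA) sketch for 1-D isotonic
# regression: given y sorted by the covariate (mother's height),
# find increasing x minimizing sum_i (x_i - y_i)^2.
def isotonic_fit(y):
    # Each block stores [sum of values, count]; adjacent blocks are
    # merged whenever their averages would violate monotonicity.
    blocks = []
    for v in y:
        blocks.append([v, 1])
        while len(blocks) > 1 and \
                blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    fit = []
    for total, count in blocks:
        fit.extend([total / count] * count)
    return fit

print(isotonic_fit([1.0, 3.0, 2.0, 4.0]))  # [1.0, 2.5, 2.5, 4.0]
```

The 3.0 and 2.0 violate monotonicity, so PAVA pools them to their average 2.5.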
Isotonic Regression: Constraint Graph
With two covariates (height of mother, age of child), draw an edge from point i to point j whenever j has both a taller mother AND an older child.
Constraint: x_i ≤ x_j for all edges (i, j) of the constraint graph.
Cost: Σ_i (x_i − y_i)².
Optimization on Graphs
Graph G = (V, E); x_v is the predicted height of child v; functions and constraints live on pairs of vertices.
Constraint: x_i ≤ x_j for all (i, j) ∈ E.
Cost: Σ_i (x_i − y_i)².
Barrier trick: replace each hard constraint x_i ≤ x_j by a penalty term −λ log(x_j − x_i), yielding an unconstrained problem with a modified objective.
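The barrier trick can be sketched numerically: each hard constraint becomes a logarithmic penalty that blows up as the constraint tightens. The chain graph and the value of λ below are illustrative assumptions, not from the slides.

```python
import math

# Log-barrier sketch: hard constraints x_i <= x_j for edges (i, j)
# become -lam * log(x_j - x_i) penalty terms, turning the constrained
# problem into an unconstrained one with a modified objective.
def barrier_objective(x, y, edges, lam=0.1):
    cost = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    for i, j in edges:
        gap = x[j] - x[i]
        if gap <= 0:
            return math.inf  # outside the feasible region
        cost -= lam * math.log(gap)
    return cost

y = [1.0, 3.0, 2.0]          # observed child heights (illustrative)
edges = [(0, 1), (1, 2)]     # chain constraint x_0 <= x_1 <= x_2
print(barrier_objective([1.0, 2.0, 3.0], y, edges))  # 2.0
```

With unit gaps the log terms vanish, so the value here is just the squared cost; an infeasible point returns infinity.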
Optimization Primer: Second-Order Methods ("Newton Steps")
Approximate the objective near the current point by its Taylor expansion: the 1st-order model is linear, the 2nd-order model is quadratic:
f(x + d) ≈ f(x) + ∇f(x)ᵀ d + ½ dᵀ ∇²f(x) d.
Minimizing the quadratic model gives the Newton step d = −(∇²f(x))⁻¹ ∇f(x): a good update step!
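The Newton step above can be sketched in one dimension, where the Hessian is just the second derivative. The example function (cosh, minimized at 0) and step count are illustrative choices.

```python
import math

# Newton's method sketch: repeatedly minimize the local quadratic
# model f(x + d) ≈ f(x) + f'(x) d + f''(x) d^2 / 2, i.e. take the
# step d = -f'(x) / f''(x).
def newton_minimize(fprime, fsecond, x0, steps=10):
    x = x0
    for _ in range(steps):
        x -= fprime(x) / fsecond(x)
    return x

# f(x) = cosh(x), with f'(x) = sinh(x), f''(x) = cosh(x); minimum at x = 0
x_star = newton_minimize(math.sinh, math.cosh, x0=1.0)
print(x_star)  # very close to 0
```

Each iteration roughly squares the error (quadratic convergence), which is why a handful of Newton steps suffices here.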
Second-Order Methods
Usually, finding the step that solves the linear equation ∇²f(x) d = −∇f(x) is too expensive. But for optimization on graphs, we can find the Newton step much faster than we can solve general linear equations.
Graphs and Hessian Linear Equations
Newton step: find d such that ∇²f(x) d = −∇f(x).
Gaussian elimination: O(n³) time for an n × n matrix; "faster" matrix-multiplication-based methods: roughly O(n^2.37) time.
A Hessian arising as a sum of convex functions of two variables is a symmetric M-matrix; a Laplacian is a special case.
Spielman-Teng '04: Laplacian linear equations can be solved in nearly-linear time.
Daitch-Spielman '08: symmetric M-matrix linear equations can be solved in nearly-linear time.
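To make these matrix classes concrete, here is a minimal sketch (not from the slides) that builds the graph Laplacian L = D − A of a tiny path graph and checks the symmetric M-matrix sign pattern together with diagonal dominance.

```python
# Sketch: build the Laplacian L = D - A of a small undirected graph
# and check the symmetric M-matrix sign pattern (nonnegative
# diagonal, nonpositive off-diagonals) plus row diagonal dominance.
def laplacian(n, edges):
    L = [[0.0] * n for _ in range(n)]
    for i, j in edges:
        L[i][i] += 1.0
        L[j][j] += 1.0
        L[i][j] -= 1.0
        L[j][i] -= 1.0
    return L

def is_m_matrix_pattern(L):
    n = len(L)
    for i in range(n):
        if L[i][i] < 0:
            return False
        if any(L[i][j] > 0 for j in range(n) if j != i):
            return False
        # each row of a Laplacian sums to zero, so dominance holds
        if L[i][i] < sum(-L[i][j] for j in range(n) if j != i):
            return False
    return True

L = laplacian(3, [(0, 1), (1, 2)])  # path graph 0 - 1 - 2
print(is_m_matrix_pattern(L))  # True
```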
Convex Functions
The Hessian of a convex function has no negative eigenvalues: it is positive semidefinite.
Hessians & Graphs
Graph-structured cost function: a sum of convex terms, each depending on only two variables, one term per edge.
Second Derivatives
Each pairwise term contributes a 2-by-2 PSD block (non-negative eigenvalues). If every term looks like c · (e_i − e_j)(e_i − e_j)ᵀ with c ≥ 0, the sum is a symmetric M-matrix, and the Newton step can be computed in nearly linear time!
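The block structure above can be sketched directly: each pairwise term contributes a 2-by-2 Hessian block c · [[1, −1], [−1, 1]] with eigenvalues 0 and 2c, so every block is PSD, and summing the blocks over the edges assembles a symmetric M-matrix. The edge weights below are illustrative assumptions.

```python
# Sketch: assemble the Hessian of a graph-structured cost from its
# pairwise 2-by-2 blocks c * [[1, -1], [-1, 1]] (c >= 0).  Each block
# has eigenvalues 0 and 2c, so it is PSD; the assembled sum is a
# symmetric M-matrix (here, a weighted graph Laplacian).
def block_eigenvalues(c):
    # closed-form eigenvalues of c * [[1, -1], [-1, 1]]
    return (0.0, 2.0 * c)

def hessian_from_blocks(n, edge_weights):
    H = [[0.0] * n for _ in range(n)]
    for (i, j), c in edge_weights.items():
        H[i][i] += c
        H[j][j] += c
        H[i][j] -= c
        H[j][i] -= c
    return H

# weights chosen for illustration only
H = hessian_from_blocks(3, {(0, 1): 2.0, (1, 2): 1.0})
print(H)  # [[2.0, -2.0, 0.0], [-2.0, 3.0, -1.0], [0.0, -1.0, 1.0]]
```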