A Generalization of PCA to the Exponential Family
Collins, Dasgupta and Schapire
Presented by Guy Lebanon
Two Viewpoints of PCA
• Algebraic: Given data points x1, …, xn in R^d, find a rank-K linear projection that minimizes the sum of squared distances between the points and their projections (over all such linear transformations).
• Statistical: Given data x1, …, xn, assume each point xi is a Gaussian random variable with mean θi. Find the maximum likelihood estimator of θ1, …, θn under the constraint that the θi lie in a K-dimensional subspace and are linearly related to the data.
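The algebraic viewpoint can be illustrated in a few lines of NumPy: by the Eckart–Young theorem, the best rank-K approximation in squared error is the truncated SVD of the (centered) data matrix. A minimal sketch (data and variable names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # 100 data points in R^5
X = X - X.mean(axis=0)          # center the data
K = 2                           # target subspace dimension

# Truncated SVD gives the best rank-K approximation in squared error
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_hat = (U[:, :K] * s[:K]) @ Vt[:K, :]

# The residual squared error equals the sum of the discarded
# squared singular values
err = np.sum((X - X_hat) ** 2)
print(np.isclose(err, np.sum(s[K:] ** 2)))
```

The statistical viewpoint gives the same answer: under a Gaussian model, maximizing the likelihood of the θi is exactly minimizing this squared reconstruction error.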
The Gaussian assumption may be inappropriate, especially if the data is binary-valued or non-negative, for example.
• Suggestion: replace the Gaussian distribution with any exponential-family distribution. Given data x1, …, xn such that each point xi comes from an exponential-family distribution with natural parameter θi, find the MLE for θ1, …, θn under the assumption that the θi lie in a low-dimensional subspace.
The new algorithm finds a linear subspace in the natural parameter space, which corresponds to a nonlinear subspace in the original coordinates.
• The loss functions may be cast in terms of Bregman distances.
• The loss function is not convex in the general case, although it is convex in each factor separately.
• The authors use an alternating minimization algorithm (in the style of Csiszár and Tusnády) to compute the transformation.
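The alternating scheme can be sketched for the Bernoulli case: factor the natural parameters as Theta = A V and alternate updates of A (with V fixed) and V (with A fixed). This is a simplified illustration using gradient steps for each block, whereas the paper solves each convex subproblem exactly; all names and constants below are illustrative:

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def loss(X, A, V):
    """Bernoulli negative log-likelihood of binary X under Theta = A @ V."""
    Theta = A @ V
    return np.sum(np.logaddexp(0.0, Theta) - X * Theta)

rng = np.random.default_rng(0)
X = (rng.random((20, 8)) > 0.5).astype(float)  # binary data, 20 points in {0,1}^8
K, lr = 2, 0.1
A = rng.normal(scale=0.1, size=(20, K))        # per-point coefficients
V = rng.normal(scale=0.1, size=(K, 8))         # basis of the parameter subspace

history = []
for _ in range(200):
    # Gradient of the loss w.r.t. Theta is sigmoid(Theta) - X
    G = sigmoid(A @ V) - X
    A -= lr * G @ V.T          # update A with V held fixed
    G = sigmoid(A @ V) - X
    V -= lr * A.T @ G          # update V with A held fixed
    history.append(loss(X, A, V))
```

Because the loss is convex in each factor separately, each half-step makes progress on its own subproblem, and the overall loss decreases across iterations even though the joint problem is non-convex.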