Non-negative Matrix Factorization with Sparseness Constraints • Patrik O. Hoyer, Journal of Machine Learning Research, 2004 • Presented by Jain-De Lee
Outline • Introduction • Adding Sparseness Constraints to NMF • Experiments with Sparseness Constraints • Conclusions
Introduction • Non-negative matrix factorization (NMF) • A useful representation typically makes latent structure in the data explicit • Reduces the dimensionality of the data • The non-negativity constraints make the representation purely additive
Introduction • Sparse representation • A sparse representation encodes much of the data using only a few 'active' components • The sparseness given by NMF is somewhat of a side-effect rather than a goal • This work adds the option to control sparseness explicitly
Adding Sparseness Constraints to NMF • The concept of sparse coding • Only a few units are effectively used to represent typical data vectors [Figure: illustration of various degrees of sparseness]
Adding Sparseness Constraints to NMF • Sparseness measure • Based on the relationship between the L1 norm and the L2 norm: $\text{sparseness}(\mathbf{x}) = \left(\sqrt{n} - \frac{\sum_i |x_i|}{\sqrt{\sum_i x_i^2}}\right) \Big/ \left(\sqrt{n} - 1\right)$, where n is the dimensionality of x
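As a concreteness check, here is a minimal NumPy sketch of this measure (the helper name `sparseness` is my own): it evaluates to 1 for a vector with a single nonzero component and to 0 for a vector whose components are all equal and nonzero.

```python
import numpy as np

def sparseness(x):
    """Hoyer's sparseness measure, interpolating smoothly between
    0 (all components equal) and 1 (a single nonzero component).
    Assumes x is a nonzero 1-D array with more than one entry."""
    n = x.size
    l1 = np.abs(x).sum()
    l2 = np.sqrt((x ** 2).sum())
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)
```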
Adding Sparseness Constraints to NMF • What exactly should be sparse? The columns of W (the basis vectors) and/or the rows of H (the coefficients) • Minimize $E(W, H) = \|V - WH\|^2$ under the optional constraints $\text{sparseness}(\mathbf{w}_i) = S_w$ and $\text{sparseness}(\mathbf{h}_i) = S_h$, $\forall i$, where $\mathbf{w}_i$ is the i-th column of W, $\mathbf{h}_i$ is the i-th row of H, and $S_w$ and $S_h$ are the desired sparsenesses of W and H (respectively)
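Spelled out, the unconstrained part of the problem is ordinary NMF; a one-line NumPy sketch of the objective being minimized (assuming V, W, H are non-negative arrays with compatible shapes):

```python
import numpy as np

def reconstruction_error(V, W, H):
    """Squared Frobenius reconstruction error E(W, H) = ||V - WH||^2."""
    return np.sum((V - W @ H) ** 2)
```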
Adding Sparseness Constraints to NMF • Projected gradient descent algorithm for NMF with sparseness constraints, consisting of three phases: Initialize, Project, and Iterate
Adding Sparseness Constraints to NMF • Project • If sparseness constraints on W: • Project each column of W to be non-negative • Keep the L2 norm unchanged; set the L1 norm to achieve the desired sparseness • If sparseness constraints on H: • Project each row of H to be non-negative • Keep the L2 norm unchanged; set the L1 norm to achieve the desired sparseness
Adding Sparseness Constraints to NMF • Iterate • If sparseness constraints on W (or H) apply: • Set $W := W - \mu_W (WH - V)H^T$ or $H := H - \mu_H W^T(WH - V)$, where $\mu_W$ and $\mu_H$ are small positive constants (step sizes) • Project (as above) • else take a standard multiplicative step
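Below is a hedged sketch of one full iteration for the case where only W is constrained: a plain gradient step on W followed by re-projection of each column, and a standard multiplicative update for H. The fixed step size `mu_w` is a simplification (the paper adapts the step size), and `project` refers to the projection operator sketched after the algorithm listing below.

```python
import numpy as np

def nmf_sparseness_step(V, W, H, sw, mu_w=1e-4, eps=1e-9):
    """One iteration of NMF with a sparseness constraint on the columns of W.
    sw is the desired sparseness; project(x, l1, l2) is defined separately."""
    n = W.shape[0]
    # Gradient step on W for the objective ||V - WH||^2
    W = W - mu_w * (W @ H - V) @ H.T
    # Re-project each column: keep its L2 norm, set the L1 norm so that
    # sparseness(w_i) = sw, i.e. L1 = L2 * (sqrt(n) - sw * (sqrt(n) - 1))
    for i in range(W.shape[1]):
        l2 = np.linalg.norm(W[:, i])
        l1 = l2 * (np.sqrt(n) - sw * (np.sqrt(n) - 1))
        W[:, i] = project(W[:, i], l1, l2)
    # H is unconstrained here: standard multiplicative update
    H = H * (W.T @ V) / (W.T @ W @ H + eps)
    return W, H
```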
Adding Sparseness Constraints to NMF • Projection operator • Problem: given any vector x, find the closest non-negative vector s with a given L1 norm and a given L2 norm
Adding Sparseness Constraints to NMF • Algorithm • Set $s_i := x_i + (L_1 - \sum_i x_i)/n$, $\forall i$ • Set $Z := \{\}$ • Iterate: • Set $m_i := L_1/(n - |Z|)$ if $i \notin Z$, $m_i := 0$ if $i \in Z$ • Set $s := m + \alpha(s - m)$, where $\alpha \ge 0$ is chosen so that $\|s\|_2 = L_2$ (the non-negative root of a quadratic equation) • If all components of s are non-negative, return s; end • Set $Z := Z \cup \{i : s_i < 0\}$ • Set $s_i := 0$, $\forall i \in Z$ • Calculate $c := (\sum_i s_i - L_1)/(n - |Z|)$ • Set $s_i := s_i - c$, $\forall i \notin Z$ • Go to the first step of the iteration
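A direct NumPy transcription of the steps above, assuming a 1-D input x and targets (l1, l2) that are jointly feasible; the value of $\alpha$ comes from expanding $\|m + \alpha(s - m)\|_2^2 = L_2^2$ into a quadratic and taking its non-negative root.

```python
import numpy as np

def project(x, l1, l2):
    """Find the non-negative vector s closest to x with ||s||_1 = l1
    and ||s||_2 = l2, following the projection steps listed above."""
    n = x.size
    # Start at the point on the L1 hyperplane closest to x
    s = x + (l1 - x.sum()) / n
    fixed = np.zeros(n, dtype=bool)  # components pinned at zero (the set Z)
    while True:
        # Midpoint m of the current face of the L1 simplex
        m = np.where(fixed, 0.0, l1 / (n - fixed.sum()))
        # Solve ||m + alpha*(s - m)||^2 = l2^2 for alpha >= 0
        d = s - m
        a = (d ** 2).sum()
        b = 2.0 * (m * d).sum()
        c = (m ** 2).sum() - l2 ** 2
        alpha = (-b + np.sqrt(max(b * b - 4.0 * a * c, 0.0))) / (2.0 * a)
        s = m + alpha * d
        if (s >= 0).all():
            return s
        # Pin negative components at zero and shift the rest
        # back onto the L1 hyperplane
        fixed |= s < 0
        s[fixed] = 0.0
        shift = (s.sum() - l1) / (n - fixed.sum())
        s[~fixed] -= shift
```

Each pass pins at least one additional component at zero, so the loop terminates after at most n iterations.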
Experiments with Sparseness Constraints • NMF applied to various image data sets (Figure 1): (a) basis images given by NMF applied to face image data from the CBCL database; (b) basis images derived from the ORL face image database; (c) basis vectors from NMF applied to ON/OFF-contrast filtered natural image data
Experiments with Sparseness Constraints Features learned from the CBCL face image database using NMF with sparseness constraints
Experiments with Sparseness Constraints • Features learned from the ORL face image database using NMF with three levels of sparseness constraints: (a) 0.5, (b) 0.6, (c) 0.75
Experiments with Sparseness Constraints • Features learned with the sparseness of the coefficients fixed at 0.85, compared against standard NMF (Figure 1c)
Experiments with Sparseness Constraints Number of iterations required for the projection algorithm to converge
Conclusions • It is useful to control the degree of sparseness explicitly • Described a projection operator that enforces desired L1 and L2 norms simultaneously • Showed its use in the NMF framework for learning representations that could not be obtained by regular NMF