Matrix Factorization, Algorithms, Applications, and Available packages

Chase the wind and the moon without pause; where the flat wilds end, the spring mountains begin. This post mainly covers Matrix Factorization, Algorithms, Applications, and Available packages, and I hope it is of help to you.
Source: http://www.cvchina.info/2011/09/05/matrix-factorization-jungle/
Some diligent folks in the US have collected nearly all of the matrix factorization algorithms and applications out there. Since the original page sits on the other side of a certain magical barrier, it is reposted here; the original address is linked above.
Matrix decompositions have a long history and generally center around a set of known factorizations such as LU, QR, SVD and eigendecompositions. More recent factorizations saw the light of day with the advent of NMF, k-means and related algorithms [1]. However, with newer methods based on random projections and convex optimization, which started in part in the compressive sensing literature, we are seeing another surge of very diverse algorithms dedicated to many different kinds of matrix factorizations with new constraints based on rank and/or positivity and/or sparsity. As a result of this large increase in interest, I have decided to keep a list of them here, following the success of the big picture in compressive sensing.
The sources for this list include the following most excellent sites: Stephen Becker's page, Raghunandan H. Keshavan's page, Nuclear Norm and Matrix Recovery through SDP by Christoph Helmberg, and Arvind Ganesh's Low-Rank Matrix Recovery and Completion via Convex Optimization, which provide more in-depth additional information. Additional codes were also featured on Nuit Blanche. The following people provided additional input: Olivier Grisel, Matthieu Puigt.
Most of the algorithms listed below rely on the nuclear norm as a proxy for the rank functional, which may not be optimal. Currently, CVX (Michael Grant and Stephen Boyd) allows one to explore other proxies for the rank functional, such as the log-det heuristic found by Maryam Fazel, Haitham Hindi and Stephen Boyd. ** marks algorithms that use a heuristic other than the nuclear norm.
In terms of notation, A refers to a matrix, L to a low-rank matrix, S to a sparse one and N to a noisy one. This page lists the different codes that implement the following matrix factorizations: Matrix Completion, Robust PCA, Noisy Robust PCA, Sparse PCA, NMF, Dictionary Learning, MMV, Randomized Algorithms and other factorizations. Some of these toolboxes implement several of these decompositions and are listed accordingly. Before I list an algorithm here, I generally feature it on Nuit Blanche under the MF tag: http://nuit-blanche.blogspot.com/search/label/MF. You can also subscribe to the Nuit Blanche feed.
Matrix Completion: A = H.*L with H a known mask and L unknown; solve for L of lowest possible rank.
The idea of this approach is to complete the unknown coefficients of a matrix based on the fact that the matrix is low rank:

  • OptSpace:  Matrix Completion from a Few Entries  by  Raghunandan H. Keshavan,  Andrea Montanari, and  Sewoong Oh
  • LMaFit: Low-Rank Matrix Fitting
  • ** Penalty Decomposition Methods for Rank Minimization by Zhaosong Lu and Yong Zhang. The attendant MATLAB code is here.
  • Jellyfish:  Parallel Stochastic Gradient Algorithms for Large-Scale Matrix Completion, B. Recht, C. Re, Apr 2011
  • GROUSE: Online Identification and Tracking of Subspaces from Highly Incomplete Information, L. Balzano, R. Nowak, B. Recht, 2010
  • SVP:  Guaranteed Rank Minimization via Singular Value Projection, R. Meka, P. Jain, I.S.Dhillon, 2009
  • SET: SET: an algorithm for consistent matrix completion, W. Dai, O. Milenkovic, 2009
  • NNLS: An accelerated proximal gradient algorithm for nuclear norm regularized least squares problems, K. Toh, S. Yun, 2009
  • FPCA: Fixed point and Bregman iterative methods for matrix rank minimization, S. Ma, D. Goldfarb, L. Chen, 2009
  • SVT: A singular value thresholding algorithm for matrix completion, J-F Cai, E.J. Candes, Z. Shen, 2008
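As a quick illustration of the completion model above, here is a minimal NumPy sketch of the singular-value-thresholding idea behind SVT. The function name, step size and iteration count are illustrative assumptions, not taken from any of the packages listed; use the packages above for real problems.

```python
import numpy as np

def svt_complete(A_obs, mask, tau=None, step=1.2, iters=300):
    """Sketch of SVT-style matrix completion.
    A_obs: matrix with observed entries filled in (zeros elsewhere),
    mask:  boolean array, True where an entry is observed."""
    m, n = A_obs.shape
    if tau is None:
        tau = 5.0 * np.sqrt(m * n)                  # threshold in the spirit of the SVT paper
    Y = np.zeros((m, n))                            # dual variable
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        L = (U * np.maximum(s - tau, 0.0)) @ Vt     # shrink the singular values
        Y += step * mask * (A_obs - L)              # ascent step on observed entries only
    return L

# toy run: recover a rank-2 matrix from roughly 60% of its entries
rng = np.random.default_rng(0)
M = rng.standard_normal((60, 2)) @ rng.standard_normal((2, 50))
mask = rng.random(M.shape) < 0.6
print(np.linalg.norm(svt_complete(M * mask, mask) - M) / np.linalg.norm(M))
```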
Noisy Robust PCA: A = L + S + N with L, S, N unknown; solve for L low rank, S sparse, N noise.
  • GoDec  : Randomized Low-rank and Sparse Matrix Decomposition in Noisy Case
  • ReProCS: The  Recursive Projected Compressive Sensing code  (example)
Robust PCA: A = L + S with L, S unknown; solve for L low rank, S sparse.
  • Robust PCA: Two codes that go with the paper “Two Proposals for Robust PCA Using Semidefinite Programming” by Michael McCoy and Joel Tropp
  • SPAMS (SPArse Modeling Software)
  • ADMM: Alternating Direction Method of Multipliers, “Fast Automatic Background Extraction via Robust PCA” by Ivan Papusha. The poster is here. The MATLAB implementation is here.
  • PCP:  Generalized Principal Component Pursuit
  • Augmented Lagrange Multiplier (ALM) Method [exact ALM - MATLAB zip] [inexact ALM - MATLAB zip], Reference - The Augmented Lagrange Multiplier Method for Exact Recovery of Corrupted Low-Rank Matrices, Z. Lin, M. Chen, L. Wu, and Y. Ma (UIUC Technical Report UILU-ENG-09-2215, November 2009)
  • Accelerated Proximal Gradient , Reference -  Fast Convex Optimization Algorithms for Exact Recovery of a Corrupted Low-Rank Matrix, Z. Lin, A. Ganesh, J. Wright, L. Wu, M. Chen, and Y. Ma (UIUC Technical Report UILU-ENG-09-2214, August 2009)[full SVD version - MATLAB  zip] [partial SVD version - MATLAB  zip]
  • Dual Method [MATLAB  zip], Reference -  Fast Convex Optimization Algorithms for Exact Recovery of a Corrupted Low-Rank Matrix, Z. Lin, A. Ganesh, J. Wright, L. Wu, M. Chen, and Y. Ma (UIUC Technical Report UILU-ENG-09-2214, August 2009).
  • Singular Value Thresholding [MATLAB  zip]. Reference -  A Singular Value Thresholding Algorithm for Matrix Completion, J. -F. Cai, E. J. Candès, and Z. Shen (2008).
  • Alternating Direction Method [MATLAB  zip] , Reference -  Sparse and Low-Rank Matrix Decomposition via Alternating Direction Methods, X. Yuan, and J. Yang (2009).
  • LMaFit: Low-Rank Matrix Fitting
  • Bayesian robust PCA
  • Compressive-Projection PCA  (CPPCA)
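For the A = L + S model, the sketch below alternates singular-value thresholding for L with entrywise soft-thresholding for S, in the spirit of the inexact ALM / alternating-direction codes listed above. The λ = 1/sqrt(max(m, n)) default is the standard principal component pursuit choice; the function name, μ and stopping rule are illustrative assumptions rather than any particular package's settings.

```python
import numpy as np

def rpca_sketch(A, lam=None, mu=None, iters=500, tol=1e-7):
    """Minimal inexact-ALM-style sketch for A = L (low rank) + S (sparse)."""
    m, n = A.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))              # standard PCP weight on ||S||_1
    if mu is None:
        mu = 0.25 * m * n / (np.abs(A).sum() + 1e-12)
    S = np.zeros((m, n))
    Y = np.zeros((m, n))                            # Lagrange multiplier
    norm_A = np.linalg.norm(A, 'fro')
    for _ in range(iters):
        # L-update: singular value thresholding of A - S + Y/mu
        U, s, Vt = np.linalg.svd(A - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # S-update: entrywise soft-thresholding of A - L + Y/mu
        R = A - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # dual ascent and stopping test on the residual
        Y += mu * (A - L - S)
        if np.linalg.norm(A - L - S, 'fro') < tol * norm_A:
            break
    return L, S
```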
Sparse PCA: A = DX with D and X unknown; solve for sparse D.
Sparse PCA  on wikipedia
  • R. Jenatton, G. Obozinski, F. Bach. Structured Sparse Principal Component Analysis. International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf] [code]
  • SPAMs
  • DSPCA:  Sparse PCA using SDP  . Code is  here.
  • PathPCA: A fast greedy algorithm for Sparse PCA. The code is  here.
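For a quick experiment with the A = DX model with sparse loadings, scikit-learn (listed under frameworks further down) provides a SparsePCA estimator; the toy data and the alpha value below are made up for illustration only.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))            # 200 samples, 30 variables (toy data)

# alpha controls the l1 penalty, hence how sparse the loadings become
spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
codes = spca.fit_transform(X)                 # X is approximated by codes @ spca.components_
print((spca.components_ == 0).mean())         # fraction of exactly-zero loadings
```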
Dictionary Learning: A = DX with D and X unknown; solve for sparse X.
Some implementations of dictionary learning also implement NMF.
  • Online Learning for Matrix Factorization and Sparse Coding by Julien Mairal, Francis Bach, Jean Ponce, Guillermo Sapiro [The code is released as SPArse Modeling Software or SPAMS]
  • Dictionary Learning Algorithms for Sparse Representation  (Matlab implementation of  FOCUSS/FOCUSS-CNDL is here)
  • Multiscale sparse image representation with learned dictionaries  [Matlab implementation of the  K-SVD algorithm is here, a newer implementation by Ron Rubinstein is  here  ]
  • Efficient sparse coding algorithms  [ Matlab  code is here  ]
  • Shift Invariant Sparse Coding of Image and Music Data. Matlab implementation is here
  • Shift-invariant dictionary learning for sparse representations: extending K-SVD.
  • Thresholded Smoothed-L0 (SL0) Dictionary Learning for Sparse Representations  by Hadi Zayyani,  Massoud Babaie-Zadeh  and  Remi Gribonval.
  • Non-negative Sparse Modeling of Textures (NMF)  [Matlab implementation of  NMF (Non-negative Matrix Factorization) and NTF (Non-negative Tensor), a faster implementation of NMF can be found  here, here is a more recent  Non-Negative Tensor Factorizations package]
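Most of the dictionary-learning codes above alternate between a sparse-coding step and a dictionary-update step. The bare-bones sketch below uses ISTA for the codes X and a regularized least-squares update with renormalized atoms for D; the function name, hyperparameters and the update scheme itself are simplified assumptions, not a re-implementation of any package above.

```python
import numpy as np

def dict_learn_sketch(A, n_atoms=32, lam=0.1, outer=30, inner=50, seed=0):
    """Toy alternating minimization for A ≈ D X with sparse X."""
    m, n = A.shape
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((m, n_atoms))
    D /= np.linalg.norm(D, axis=0)                       # unit-norm atoms
    X = np.zeros((n_atoms, n))
    for _ in range(outer):
        # sparse coding: ISTA on 0.5*||A - D X||_F^2 + lam*||X||_1
        step = 1.0 / (np.linalg.norm(D, 2) ** 2 + 1e-12)
        for _ in range(inner):
            X -= step * (D.T @ (D @ X - A))
            X = np.sign(X) * np.maximum(np.abs(X) - step * lam, 0.0)
        # dictionary update: regularized least squares, then renormalize the atoms
        D = A @ X.T @ np.linalg.inv(X @ X.T + 1e-6 * np.eye(n_atoms))
        D /= np.linalg.norm(D, axis=0) + 1e-12
    return D, X
```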
NMF: A = DX with D and X unknown; solve for D, X with elements > 0.
Non-negative Matrix Factorization (NMF) on wikipedia
  • HALS: Accelerated Multiplicative Updates and Hierarchical ALS Algorithms for Nonnegative Matrix Factorization by Nicolas Gillis, François Glineur.
  • SPAMS (SPArse Modeling Software)  by  Julien Mairal,  Francis Bach,  Jean Ponce,Guillermo Sapiro
  • NMF:  C.-J. Lin.  Projected gradient methods for non-negative matrix factorization.  Neural Computation, 19(2007), 2756-2779.
  • Non-Negative Matrix Factorization: This page contains an optimized C implementation of the Non-Negative Matrix Factorization (NMF) algorithm, described in [Lee & Seung 2001]. We implement the update rules that minimize a weighted SSD error metric. A detailed description of weighted NMF can be found in [Peers et al. 2006].
  • NTFLAB for Signal Processing, Toolboxes for NMF (Non-negative Matrix Factorization) and NTF (Non-negative Tensor Factorization) for BSS (Blind Source Separation)
  • Non-negative Sparse Modeling of Textures (NMF)  [Matlab implementation of  NMF (Non-negative Matrix Factorization) and NTF (Non-negative Tensor), a faster implementation of NMF can be found  here, here is a more recent  Non-Negative Tensor Factorizations package]
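The Lee & Seung multiplicative updates referenced above fit in a few lines of NumPy; this is a sketch of the Frobenius-norm variant only, with an arbitrary rank and iteration count.

```python
import numpy as np

def nmf_mu(A, rank, iters=500, eps=1e-10):
    """Multiplicative-update sketch for A ≈ D X with D, X >= 0 (Frobenius objective)."""
    m, n = A.shape
    rng = np.random.default_rng(0)
    D = rng.random((m, rank)) + eps
    X = rng.random((rank, n)) + eps
    for _ in range(iters):
        X *= (D.T @ A) / (D.T @ D @ X + eps)    # update the codes
        D *= (A @ X.T) / (D @ X @ X.T + eps)    # update the basis
    return D, X

# toy run on a random non-negative matrix
A = np.abs(np.random.default_rng(1).standard_normal((40, 60)))
D, X = nmf_mu(A, rank=5)
print(np.linalg.norm(A - D @ X) / np.linalg.norm(A))
```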
Multiple Measurement Vector (MMV): Y = A X with X unknown and the rows of X sparse.
  • T-MSBL/T-SBL  by  Zhilin Zhang
  • Compressive MUSIC with optimized partial support for joint sparse recovery  by  Jong Min Kim,  Ok Kyun Lee,  Jong Chul Ye  [no code]
  • The REMBO Algorithm Accelerated Recovery of Jointly Sparse Vectors  by  Moshe Mishali and Yonina C. Eldar [ no code]
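A simple baseline for the MMV model is simultaneous OMP: at each step, pick the atom most correlated with the residual across all columns of Y, then refit the selected atoms by least squares. The sketch below (function name, fixed number of selected atoms, roughly unit-norm columns of A) is an illustrative assumption, not one of the codes referenced above.

```python
import numpy as np

def somp(A, Y, n_atoms):
    """Simultaneous OMP sketch for Y = A X with jointly row-sparse X.
    Assumes the columns of A are roughly unit norm."""
    support = []
    R = Y.copy()
    for _ in range(n_atoms):
        corr = np.linalg.norm(A.T @ R, axis=1)      # aggregate correlation per atom
        corr[support] = -np.inf                     # never reselect an atom
        support.append(int(np.argmax(corr)))
        X_s, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        R = Y - A[:, support] @ X_s                 # update the residual
    X = np.zeros((A.shape[1], Y.shape[1]))
    X[support] = X_s
    return X
```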
Blind Source Separation (BSS): Y = A X with A and X unknown, and statistical independence between the columns of X or between subspaces of columns of X.
This includes Independent Component Analysis (ICA), Independent Subspace Analysis (ISA), and Sparse Component Analysis (SCA). There are many available codes for ICA and some for SCA. Here is a non-exhaustive list of some famous ones (which are not limited to linear instantaneous mixtures). TBC
ICA:
  • ICALab:  http://www.bsp.brain.riken.jp/ICALAB/
  • BLISS softwares:  http://www.lis.inpg.fr/pages_perso/bliss/deliverables/d20.html
  • MISEP:  http://www.lx.it.pt/~lbalmeida/ica/mitoolbox.html
  • Parra and Spence's frequency-domain convolutive ICA: http://people.kyb.tuebingen.mpg.de/harmeling/code/convbss-0.1.tar
  • C-FICA:  http://www.ast.obs-mip.fr/c-fica
SCA:
  • DUET:  http://sparse.ucd.ie/publications/rickard07duet.pdf  (the matlab code is given at the end of this pdf document)
  • LI-TIFROM:  http://www.ast.obs-mip.fr/li-tifrom
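As a toy example of the linear instantaneous case, scikit-learn's FastICA separates two synthetic sources; the sources and the mixing matrix below are invented purely for illustration.

```python
import numpy as np
from sklearn.decomposition import FastICA

# two independent toy sources (a sinusoid and a square wave), mixed linearly
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t))]
A_mix = np.array([[1.0, 0.5],
                  [0.4, 1.2]])                  # unknown mixing matrix
Y = S @ A_mix.T                                 # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(Y)                    # recovered sources, up to scale and permutation
```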
Randomized Algorithms
These algorithms generally use random projections to shrink very large problems into smaller ones that are amenable to traditional matrix factorization methods.
Resource
Randomized algorithms for matrices and data  by Michael W. Mahoney
Randomized Algorithms for Low-Rank Matrix Decomposition
  • Randomized PCA
  • Randomized Least Squares: Blendenpik (http://pdos.csail.mit.edu/~petar/papers/blendenpik-v1.pdf)
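The common core of these methods is a randomized range finder followed by a factorization of a much smaller matrix, as in "Finding Structure With Randomness" (linked below). Here is a minimal sketch in that spirit; the function name, oversampling and power-iteration counts are chosen arbitrarily.

```python
import numpy as np

def randomized_svd_sketch(A, rank, oversample=10, n_power=2, seed=0):
    """Randomized range finder + SVD of the projected matrix (Halko/Martinsson/Tropp style)."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], rank + oversample))   # random test matrix
    Y = A @ Omega                                                  # sample the range of A
    for _ in range(n_power):
        Y = A @ (A.T @ Y)                                          # power iterations sharpen the spectrum
    Q, _ = np.linalg.qr(Y)                                         # orthonormal basis for the range
    B = Q.T @ A                                                    # small projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]
```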
Other factorization
D(T(.)) = L + E with L, E and the transformation T unknown; solve for the transformation T, the low-rank L and the noise E.
  • RASL: Robust Batch Alignment of Images by Sparse and Low-Rank Decomposition
  • TILT: Transform Invariant Low-rank Textures
Frameworks featuring advanced Matrix factorizations
For the time being, few have integrated the most recent factorizations.
  • Scikit Learn  (python)
  • Matlab Toolbox for Dimensionality Reduction  (Probabilistic PCA,  Factor Analysis (FA)…)
  • Orange  (Python)
  • pcaMethods: a Bioconductor package providing PCA methods for incomplete data (R language)
GraphLab / Hadoop
  • Danny Bickson  keeps a  blog on GraphLab.
Books
  • Matrix Factorizations on Amazon.
Example of use
  • CS: Low Rank Compressive Spectral Imaging and a multishot CASSI
  • CS: Heuristics for Rank Proxy and how it changes everything….
  • Tennis Players are Sparse !
Sources
Arvind Ganesh’s  Low-Rank Matrix Recovery and Completion via Convex Optimization
  • Raghunandan H. Keshavan's list
  • Stephen Becker’s list
  • Nuclear Norm and Matrix Recovery  through SDP by  Christoph Helmberg
  • Nuit Blanche
Relevant links
  • Welcome to the Matrix Factorization Jungle
  • Finding Structure With Randomness
Reference:
A Unified View of Matrix Factorization Models by Ajit P. Singh and Geoffrey J. Gordon


http://blog.sciencenet.cn/blog-242887-483128.html 





