What is SVD and NMF? (Learning from Computational Linear Algebra course)

fastai
linear_algebra
Author

Kurian Benoy

Published

February 11, 2019

Singular-value decomposition (SVD)

Singular-value decomposition (SVD) is a factorization of a real or complex matrix. It is the generalization of the eigendecomposition of a positive semidefinite normal matrix (for example, a symmetric matrix with positive eigenvalues) to any m × n matrix via an extension of the polar decomposition. It has many useful applications in signal processing and statistics.

Formally, the singular-value decomposition of an m × n real or complex matrix M is a factorization of the form M = UΣV*, where U is an m × m real or complex unitary matrix, Σ is an m × n rectangular diagonal matrix with non-negative real numbers on the diagonal, and V is an n × n real or complex unitary matrix. The diagonal entries σᵢ of Σ are known as the singular values of M. The columns of U and the columns of V are called the left-singular vectors and right-singular vectors of M, respectively.

The singular-value decomposition can be computed using the following observations:

- The left-singular vectors of M are a set of orthonormal eigenvectors of MM*.
- The right-singular vectors of M are a set of orthonormal eigenvectors of M*M.
- The non-zero singular values of M (found on the diagonal entries of Σ) are the square roots of the non-zero eigenvalues of both M*M and MM*.

(source)
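To make those observations concrete, here is a minimal NumPy sketch (my own illustration, not from the course notebooks) that checks numerically that the squared singular values of a matrix equal the eigenvalues of MᵀM:

```python
import numpy as np

np.random.seed(0)
M = np.random.rand(4, 3)          # a small random real matrix

# Reduced SVD: s holds the singular values in descending order.
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Eigenvalues of M^T M, sorted in descending order to match.
eigvals = np.sort(np.linalg.eigvalsh(M.T @ M))[::-1]

# The non-zero singular values are the square roots of these eigenvalues.
print(np.allclose(s ** 2, eigvals))   # True
```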

You may be bored with this Wikipedia definition. Yet what if I told you that singular value decomposition is one of the most popular algorithms, one that has influenced our society and is used in many fields like medicine, data science, space science and much more, alongside famous algorithms like the Fast Fourier Transform, Monte Carlo integration and quicksort. But according to math_rachael, the algorithm has not yet received the attention it really deserves.

SVD is basically a small equation:

A [data matrix] = U [left singular vectors] * Σ [diagonal of singular values] * Vᵀ [right singular vectors, transposed]

where:

- A is an m × n data matrix
- U is an m × n orthogonal matrix
- Σ (often written S in code) is an n × n diagonal matrix of singular values
- V is an n × n orthogonal matrix

These are the dimensions of the reduced SVD returned by most libraries; in the full SVD from the definition above, U is m × m and Σ is m × n.
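For a quick sanity check, here is a small NumPy sketch (an illustration under the reduced-SVD convention above, not code from the course) showing that multiplying the three factors back together recovers the data matrix:

```python
import numpy as np

np.random.seed(0)
A = np.random.rand(5, 3)                 # the data matrix

# Reduced SVD: U is 5x3, S holds the 3 singular values, Vt is V transposed (3x3).
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstruct A from its factors.
A_reconstructed = U @ np.diag(S) @ Vt
print(np.allclose(A, A_reconstructed))   # True
```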

To dive deeper into SVD, check the topic modelling lesson from the fast.ai Computational Linear Algebra course.

Non-negative Matrix Factorization

Rather than constraining our factors to be orthogonal, another idea would be to constrain them to be non-negative. NMF is a factorization of a non-negative data set V:

V=WH

into non-negative matrices W,H. Often positive factors will be more easily interpretable (and this is the reason behind NMF’s popularity).
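As a rough sketch of what this looks like in practice, here is a small example using scikit-learn's NMF implementation (the data matrix below is made up purely for illustration):

```python
import numpy as np
from sklearn.decomposition import NMF

# A small non-negative data matrix V (6 samples x 4 features), made up for illustration.
V = np.abs(np.random.RandomState(0).rand(6, 4))

model = NMF(n_components=2, init='random', random_state=0)
W = model.fit_transform(V)    # 6 x 2, non-negative
H = model.components_         # 2 x 4, non-negative

# W @ H is a low-rank, non-negative approximation of V.
print(np.round(W @ H, 2))
```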

NMF is a state-of-the-art feature detection algorithm. NMF is useful when there are many attributes and each attribute has weak predictability on its own. By combining attributes, NMF can produce meaningful patterns, topics, or themes. NMF is often useful in text mining, where the same word can occur in different places in a document with different meanings.

NMF is commonly used in:

- Topic modelling
- Face decomposition
- Audio source separation
- Chemistry
- Bioinformatics

and much more. A small topic-modelling sketch follows below.
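Since topic modelling is the first use case above, here is a hedged sketch of how NMF can pull topics out of a tiny toy corpus with scikit-learn (the documents and the number of topics are assumptions for illustration; `get_feature_names_out` needs a recent scikit-learn, older versions call it `get_feature_names`):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

# A tiny toy corpus, invented for illustration.
docs = [
    "the cat sat on the mat",
    "dogs and cats are common pets",
    "stock markets fell sharply today",
    "investors worry about the stock price",
]

# Build a non-negative term-document matrix, so NMF can factor it into topics.
vectorizer = TfidfVectorizer(stop_words='english')
V = vectorizer.fit_transform(docs)

nmf = NMF(n_components=2, random_state=0)
W = nmf.fit_transform(V)      # document-topic weights
H = nmf.components_           # topic-term weights

# Show the top words for each discovered topic.
terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(H):
    top_words = [terms[j] for j in topic.argsort()[-3:][::-1]]
    print(f"Topic {i}: {top_words}")
```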

Let’s dive more into NMF with a face decomposition example: link