Singular Value Decomposition (SVD) in MATLAB Tutorial
1. Introduction to Singular Value Decomposition (SVD)
Singular Value Decomposition (SVD) is a fundamental matrix factorization technique that decomposes any given real or complex matrix into a product of three other matrices. It’s a powerful tool with widespread applications across various scientific and engineering disciplines, including signal processing, statistics, machine learning, computer vision, and image processing. SVD is particularly valuable because it can be applied to any matrix, regardless of whether it is square or rectangular, or if it possesses full rank.
The core idea behind SVD is to break down a matrix A into constituent parts that reveal its underlying structure and properties. This decomposition often simplifies complex matrix operations and provides insights into the data represented by the matrix.
2. Mathematical Foundation of SVD
For any matrix A of size m x n, the Singular Value Decomposition is given by:
A = U Σ V^T
Where:
* U is an m x m orthogonal matrix (if A is real) or unitary matrix (if A is complex). Its columns are known as the left-singular vectors of A. These vectors are orthonormal eigenvectors of A A^T.
* Σ (Sigma) is an m x n rectangular diagonal matrix. The diagonal entries, denoted as σ_i, are the singular values of A. These singular values are real, non-negative, and conventionally ordered in a non-increasing sequence (σ_1 ≥ σ_2 ≥ ... ≥ σ_min(m,n) ≥ 0). The singular values are the square roots of the eigenvalues of A A^T (or A^T A).
* V is an n x n orthogonal matrix (or unitary matrix). Its columns are known as the right-singular vectors of A. These vectors are orthonormal eigenvectors of A^T A. V^T denotes the transpose of V (or conjugate transpose if A is complex).
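As a quick numerical check of these relationships, the sketch below applies svd to an arbitrary example matrix and verifies that U and V are orthogonal and that the squared singular values match the eigenvalues of A^T A (the matrix used here is just an illustrative choice):

```matlab
% Minimal sketch: verify the SVD relationships on an arbitrary example matrix
A = [3 1 1; -1 3 1];          % a 2x3 example matrix (illustrative choice)

[U, S, V] = svd(A);

% U and V should be orthogonal: U'*U and V'*V are identity matrices
disp(norm(U'*U - eye(size(U,2))));   % should be close to 0
disp(norm(V'*V - eye(size(V,2))));   % should be close to 0

% Squared singular values should equal the leading eigenvalues of A'*A
sigma_squared = diag(S).^2;                  % diag of a rectangular S gives min(m,n) values
eig_AtA = sort(eig(A'*A), 'descend');
disp([sigma_squared, eig_AtA(1:numel(sigma_squared))]);  % the two columns should agree
```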
3. Performing SVD in MATLAB
MATLAB provides a highly optimized built-in function, svd, to compute the Singular Value Decomposition.
svd Function Syntax:
The svd function offers several ways to compute the decomposition:
- `s = svd(A)`: Returns a column vector `s` containing the singular values of matrix `A` in descending order. This is useful when you only need the singular values themselves.
- `[U, S, V] = svd(A)`: The most common form, returning the full decomposition. `U` is an m x m orthogonal matrix, `S` is an m x n diagonal matrix containing the singular values of `A` on its main diagonal, and `V` is an n x n orthogonal matrix whose transpose `V'` appears in the decomposition `A = U * S * V'`.
- `[U, S, V] = svd(A, 'econ')`: Computes the "economy-size" SVD, which can be more efficient when m and n differ significantly.
  - If m > n, `U` is m x n, and `S` and `V` are n x n; only the first n columns of `U` are computed.
  - If m < n, `V` is n x m, and `S` and `U` are m x m; only the first m columns of `V` are computed.
  - If m = n, this call is equivalent to the full SVD `[U, S, V] = svd(A)`.
  - The economy-size SVD retains all of the singular values and the singular vectors needed to reconstruct the original matrix exactly; only columns that would multiply zero blocks of `S` are omitted.
Example 1: Basic SVD
Let’s demonstrate with a simple rectangular matrix:
```matlab
% Define a sample matrix A
A = [1 2 3; 4 5 6; 7 8 9; 10 11 12]; % A is a 4x3 matrix

% Compute the full SVD
[U, S, V] = svd(A);

fprintf('Matrix A (%dx%d):\n', size(A,1), size(A,2));
disp(A);
fprintf('\nMatrix U (%dx%d):\n', size(U,1), size(U,2));
disp(U);
fprintf('\nMatrix S (singular values) (%dx%d):\n', size(S,1), size(S,2));
disp(S);
fprintf('\nMatrix V (%dx%d):\n', size(V,1), size(V,2));
disp(V);

% Verify the decomposition: A should be approximately U * S * V'
reconstructed_A = U * S * V';
fprintf('\nReconstructed A (U * S * V''):\n');
disp(reconstructed_A);

% Check the difference (should be very close to zero, up to floating-point precision)
fprintf('\nDifference between A and reconstructed_A:\n');
disp(A - reconstructed_A);
```
Example 2: Economy-size SVD
Using the same matrix A for the economy-size SVD:
```matlab
A = [1 2 3; 4 5 6; 7 8 9; 10 11 12]; % A is a 4x3 matrix

% Compute the economy-size SVD
[U_econ, S_econ, V_econ] = svd(A, 'econ');

fprintf('Economy-size U (%dx%d):\n', size(U_econ,1), size(U_econ,2));
disp(U_econ);
fprintf('\nEconomy-size S (%dx%d):\n', size(S_econ,1), size(S_econ,2));
disp(S_econ);
fprintf('\nEconomy-size V (%dx%d):\n', size(V_econ,1), size(V_econ,2));
disp(V_econ);

% Verify the decomposition for economy-size SVD
reconstructed_A_econ = U_econ * S_econ * V_econ';
fprintf('\nReconstructed A (economy-size) (U_econ * S_econ * V_econ''):\n');
disp(reconstructed_A_econ);
```
Notice that U_econ is 4x3, S_econ is 3x3, and V_econ is 3x3, as m > n (4 > 3). The reconstruction is still accurate.
4. Key Applications of SVD
SVD’s versatility stems from its ability to reveal important characteristics of a matrix. Some prominent applications include:
- Dimensionality Reduction (Principal Component Analysis, PCA): SVD is intimately linked to PCA. By selecting only a subset of the largest singular values and their corresponding singular vectors, one can create a low-rank approximation of the original matrix. This effectively reduces the dimensionality of the data while preserving most of its significant variance and information.
- Image Compression: A classic application where an image (represented as a matrix of pixel intensities) can be compressed by applying SVD. By reconstructing the image using only the most significant singular values (and their associated vectors), substantial compression can be achieved with minimal loss of perceived quality.
- Noise Reduction: Small singular values often correspond to noise in the data. By setting these small singular values to zero or discarding them, SVD can be used as an effective noise filtering technique.
- Solving Linear Equations and Pseudoinverse: SVD provides a robust method to compute the pseudoinverse of a matrix. This is crucial for solving systems of linear equations, especially when the matrix is singular, ill-conditioned, or rectangular (e.g., in least squares problems); see the sketch after this list.
- Recommender Systems: SVD is a core algorithm in collaborative filtering. It helps uncover latent factors that explain user-item interactions, which are then used to predict user preferences and provide recommendations.
- Latent Semantic Analysis (LSA): In natural language processing, SVD is used to reduce the dimensionality of term-document matrices, uncovering latent semantic relationships between words and documents.
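As an illustration of the pseudoinverse application above, here is a minimal sketch that builds the Moore-Penrose pseudoinverse from the economy-size SVD and uses it to solve a least-squares problem. The matrix, right-hand side, and tolerance are illustrative choices; in practice MATLAB's built-in pinv or the backslash operator would typically be used.

```matlab
% Minimal sketch: Moore-Penrose pseudoinverse via SVD (arbitrary example data)
A = [1 2; 3 4; 5 6];          % tall example matrix
b = [1; 0; 2];                % arbitrary right-hand side

[U, S, V] = svd(A, 'econ');   % economy-size SVD: U is 3x2, S and V are 2x2

% Invert only the singular values above a tolerance (guards against rank deficiency)
s = diag(S);
tol = max(size(A)) * eps(max(s));      % tolerance similar in spirit to pinv's default
s_inv = zeros(size(s));
s_inv(s > tol) = 1 ./ s(s > tol);

A_pinv = V * diag(s_inv) * U';         % pseudoinverse of A

x_svd = A_pinv * b;                    % least-squares solution via SVD
x_ref = A \ b;                         % MATLAB's own least-squares solution

disp([x_svd, x_ref]);                  % the two columns should agree closely
```

Inverting only the singular values above a tolerance is what makes this approach robust when the matrix is singular or ill-conditioned.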
5. Practical Example: Image Compression using SVD
Let’s illustrate SVD’s power with an image compression example in MATLAB. We will load a grayscale image, perform SVD, and then reconstruct it using progressively fewer singular values to observe the compression effect.
```matlab
% Load an example image.
% MATLAB ships 'cameraman.tif' with the Image Processing Toolbox.
% If it is not available, fall back to 'peppers.png' converted to grayscale.
imagePath = 'cameraman.tif';
if exist(imagePath, 'file')
    img = imread(imagePath);
else
    warning('cameraman.tif not found. Using peppers.png converted to grayscale instead.');
    img = rgb2gray(imread('peppers.png'));
end

% Convert the image to double for numerical computations with SVD
A = double(img);

% Perform SVD on the image matrix
[U, S, V] = svd(A);

% Display the original image
figure;
subplot(2, 2, 1);
imshow(uint8(A)); % Convert back to uint8 for display
title('Original Image');

% Define different numbers of singular values to retain for compression
ranks_to_keep = [10, 50, 100]; % Experiment with these values

for i = 1:length(ranks_to_keep)
    k = ranks_to_keep(i);

    % Reconstruct the image using only the first 'k' singular values:
    % the first k columns of U, the top-left k-by-k block of S,
    % and the first k columns of V.
    A_compressed = U(:, 1:k) * S(1:k, 1:k) * V(:, 1:k)';

    subplot(2, 2, i + 1);
    imshow(uint8(A_compressed)); % Display the compressed image
    title(sprintf('Compressed with k = %d singular values', k));

    % Optional: calculate and display the (approximate) compression ratio
    % Original storage: m*n values
    % Compressed storage: m*k (columns of U) + k (singular values) + n*k (columns of V)
    [m, n] = size(A);
    original_storage = m * n;
    compressed_storage = (m * k) + k + (n * k); % counts only the retained values
    compression_ratio = original_storage / compressed_storage;
    fprintf('k = %d: Approx. compression ratio = %.2f\n', k, compression_ratio);

    % Optional: calculate the Frobenius norm of the reconstruction error
    error_norm = norm(A - A_compressed, 'fro');
    fprintf('k = %d: Frobenius norm error = %f\n', k, error_norm);
end
```
In this example, as k decreases, the image becomes more compressed but also loses detail. Conversely, a larger k retains more detail but offers less compression. The choice of k depends on the desired balance between image quality and file size.
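A common heuristic for picking k is to look at how much of the total "energy" (the sum of squared singular values) the leading singular values capture. The sketch below assumes the image matrix A from the compression example above is still in the workspace; the 99% threshold is an arbitrary illustrative choice.

```matlab
% Sketch: choose k from the fraction of energy captured by the leading singular values.
% Assumes the image matrix A from the compression example above.
s = svd(A);                          % singular values only, in descending order

energy = cumsum(s.^2) / sum(s.^2);   % cumulative fraction of squared singular values

% Smallest k capturing at least 99% of the energy
k99 = find(energy >= 0.99, 1, 'first');
fprintf('k = %d of %d singular values capture 99%% of the energy\n', k99, numel(s));

% Plot the cumulative energy curve to visualize the trade-off
figure;
plot(energy, 'LineWidth', 1.5);
xlabel('Number of singular values retained (k)');
ylabel('Fraction of energy captured');
title('Cumulative singular value energy');
grid on;
```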
6. Conclusion
Singular Value Decomposition is an indispensable tool in linear algebra and its applications. MATLAB’s svd function makes it readily accessible for various tasks, from theoretical explorations to practical implementations like data compression and dimensionality reduction. Understanding SVD and its capabilities is crucial for anyone working with data analysis, signal processing, and machine learning. By mastering this technique, you can unlock deeper insights into your data and develop more efficient algorithms.