<p><em>Note to readers: this post may at first seem unrelated to the topic, but please refer to the discussion in the comments above.</em></p>

<p>The following is my attempt at implementing the <strong><a href="http://en.wikipedia.org/wiki/Spectral_clustering" rel="nofollow noreferrer">Spectral Clustering</a></strong> algorithm applied to image pixels in <a href="http://en.wikipedia.org/wiki/MATLAB" rel="nofollow noreferrer">MATLAB</a>. I followed exactly the <a href="http://ai.stanford.edu/~ang/papers/nips01-spectral.pdf" rel="nofollow noreferrer">paper</a> mentioned by @Andriyev:</p>

<blockquote>
<p>Andrew Ng, Michael Jordan, and Yair Weiss (2002). On spectral clustering: analysis and an algorithm. In T. Dietterich, S. Becker, and Z. Ghahramani (Eds.), Advances in Neural Information Processing Systems 14. MIT Press.</p>
</blockquote>

<p>The code:</p>

<pre><code>%# parameters to tune
SIGMA = 2e-3;        %# controls Gaussian kernel width
NUM_CLUSTERS = 4;    %# specify number of clusters

%% Loading and preparing a sample image
%# read RGB image, and make it smaller for fast processing
I0 = im2double(imread('house.png'));
I0 = imresize(I0, 0.1);
[r,c,~] = size(I0);

%# reshape into one row per pixel: r*c-by-3
%# (with pixels traversed in column-wise order)
I = reshape(I0, [r*c 3]);

%% 1) Compute affinity matrix
%# for each pair of pixels, apply a Gaussian kernel
%# to obtain a measure of similarity
A = exp(-SIGMA * squareform(pdist(I,'euclidean')).^2);

%# and we plot the matrix obtained
imagesc(A)
axis xy; colorbar; colormap(hot)

%% 2) Compute the Laplacian matrix L
D = diag( 1 ./ sqrt(sum(A,2)) );
L = D*A*D;

%% 3) Perform an eigendecomposition of the Laplacian matrix L
[V,d] = eig(L);

%# sort the eigenvalues and the eigenvectors in descending order
[d,order] = sort(real(diag(d)), 'descend');
V = V(:,order);

%# keep only the largest k eigenvectors
%# in this case 4 vectors are enough to explain 99.999% of the variance
NUM_VECTORS = sum(cumsum(d)./sum(d) &lt; 0.99999) + 1;
V = V(:, 1:NUM_VECTORS);

%% 4) Renormalize rows of V to unit length
VV = bsxfun(@rdivide, V, sqrt(sum(V.^2,2)));

%% 5) Cluster rows of VV using K-Means
opts = statset('MaxIter',100, 'Display','iter');
[clustIDX,clusters] = kmeans(VV, NUM_CLUSTERS, 'options',opts, ...
    'distance','sqEuclidean', 'EmptyAction','singleton');

%% 6) Assign pixels to clusters and show the results
%# assign to each pixel the color of the cluster it belongs to
clr = lines(NUM_CLUSTERS);
J = reshape(clr(clustIDX,:), [r c 3]);

%# show results
figure('Name',sprintf('Clustering into K=%d clusters',NUM_CLUSTERS))
subplot(121), imshow(I0), title('original image')
subplot(122), imshow(J), title({'clustered pixels' '(color-coded classes)'})
</code></pre>

<p>... and using a simple house image I drew in Paint, the results were:</p>

<p><img src="https://i.stack.imgur.com/MhTCz.png" alt="laplacian matrix"> <img src="https://i.stack.imgur.com/AD9yO.png" alt="image clustered"></p>

<p>By the way, the first 4 eigenvalues used were:</p>

<pre><code>1.0000
0.0014
0.0004
0.0002
</code></pre>

<p>and the corresponding eigenvectors (columns of length r*c = 400):</p>

<pre><code>-0.0500    0.0572   -0.0112   -0.0200
-0.0500    0.0553    0.0275    0.0135
-0.0500    0.0560    0.0130    0.0009
-0.0500    0.0572   -0.0122   -0.0209
-0.0500    0.0570   -0.0101   -0.0191
-0.0500    0.0562   -0.0094   -0.0184
......
</code></pre>

<p>Note that there are steps performed above which you didn't mention in your question (computing the Laplacian matrix, and normalizing the rows of the eigenvector matrix).</p>
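<p>A side note: on larger images the dense <code>eig(L)</code> call quickly becomes the bottleneck, since the affinity matrix is r*c-by-r*c. If you fix the number of eigenvectors up front (instead of deriving it from the explained-variance criterion above), you can ask for just the top-k eigenpairs with <code>eigs</code>. This is only a sketch under that assumption, reusing <code>L</code> and <code>NUM_CLUSTERS</code> from the code above:</p>

<pre><code>%# sketch: compute only the k largest-magnitude eigenpairs of L,
%# rather than the full spectrum (assumes k is chosen in advance)
k = NUM_CLUSTERS;
[Vk, dk] = eigs(L, k, 'lm');

%# renormalize rows to unit length, exactly as in step 4 above
VVk = bsxfun(@rdivide, Vk, sqrt(sum(Vk.^2,2)));

%# VVk can then be fed to kmeans as in step 5
</code></pre>

<p>The results should match the <code>eig</code>-based version up to the usual sign/ordering ambiguity of eigenvectors, which K-Means is insensitive to.</p>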