Tuesday, December 25, 2007

Monday Morning Algorithm 8: An Example of Compressed Measurements Speeding Up Diffusion Maps


In the previous installment of the Monday Morning Algorithm, we featured diffusion maps. I also mentioned that one could use compressed sensing and obtain the same result, i.e., the ability to sort a set of high dimensional images in which only one parameter is being varied. When going through this exercise, one notes that one of the most expensive operations is computing the pairwise distances between all the elements of the set: if every picture is 10 MP, one has to evaluate differences between elements made of ten million numbers. Compressed sensing teaches us that by projecting a large image onto the right kind of random vectors, one can retain the information contained in the original image. We are going to reuse our last algorithm, but we will first project the original images of Buzz Lightyear onto a smaller space, using a Gaussian ensemble to go from images of dimension 57600 down to 100 compressed measurements. Let us note that this is not an efficient operation unless one has direct access to the compressed measurements (say, straight from the acquisition hardware) or performs further operations on them.
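Before diving into the script, here is a minimal sketch (not from the original post) of why this works: a Gaussian projection scaled by 1/sqrt(m) approximately preserves distances between high dimensional vectors, in the spirit of the Johnson-Lindenstrauss lemma. The vectors x and y below are random stand-ins for two vectorized images.

% Minimal sketch: distance preservation under a Gaussian projection
n = 57600; % ambient dimension (size of a vectorized image)
m = 100; % number of compressed measurements
x = rand(n,1); y = rand(n,1); % stand-ins for two vectorized images
A = randn(m,n)/sqrt(m); % scaled Gaussian ensemble
ratio = norm(A*(x-y))/norm(x-y) % typically within ~10% of 1 for m = 100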


clear
% Reference: Stephane Lafon, "Diffusion maps and geometric harmonics",
% Ph.D. dissertation, Yale University (May 2004)
% written by Cable Kurwitz and Igor Carron
% This algorithm shows how the random projections
% used in compressed sensing provide the same
% results as using the original images.
%
% Number of random projections
number_RP = 100;
% get all the jpg files and echo their names
d = dir('buzz*.jpg')
ntotal = size(d,1)
for i=1:ntotal
d(i).name
end
Img = imread(d(1).name);
s=size(Img,1); % image height
w=size(Img,2); % image width
hd=size(Img,3); % number of color channels
swhd=s*w*hd; % length of one vectorized image
%reshape each image into one column vector
b1 = zeros(swhd,ntotal);
for ii=1:ntotal
Img = imread(d(ii).name);
b1(:,ii) = double(reshape(Img, swhd,1));
end
%
% Compressed sensing part: build a Gaussian measurement matrix
% with unit-norm columns and project every image onto it
sa=randn(number_RP,swhd);
for i=1:swhd
sa(:,i)=sa(:,i)./norm(sa(:,i));
end
b=sa*b1; % compressed measurements, size number_RP x ntotal
%b=b1; % uncomment to run the same analysis on the original images
%
%
% pairwise Euclidean distances between the (projected) images
D1 = zeros(ntotal,ntotal);
for i=1:ntotal
for j=1:ntotal
D1(i,j)=norm(b(:,i)-b(:,j),2);
end
end
%
% epsilon is set to one tenth of the mean distance to the
% nearest neighbor
n1=ntotal;
D_sum=0;
for l = 1:n1
D_sum = D_sum + min(nonzeros(D1(l,:)));
end
epsilon = D_sum/n1/10;
%
% Step 1: exponential kernel built from the distance matrix
%
kernel_1 = exp(-D1./epsilon);
%
% Step 2: vector of ones used for row sums
%
one_V = ones(1,n1);
%
% Step 3: density normalization of the kernel
%
p = kernel_1*one_V';
kernel_2 = kernel_1./(p*p');
%
% Step 4: normalization factor for the graph Laplacian
%
v = sqrt(kernel_2*one_V');
%
% Step 5: form the symmetric matrix K
%
K = kernel_2./(v*v');
%
% Step 6: Singular Value Decomposition
%
[U,S,V] = svd(K);
% Normalization of the eigenvectors by the first one
phi = zeros(ntotal,ntotal-1);
for i=1:ntotal-1
phi(:,i) = U(:,i)./U(:,1);
end
eig_funct = phi(:,1:6);
%
% The eigenvalues of the Laplace-Beltrami operator \Delta are
% approximated by those of K, and its eigenfunctions phi_i are
% approximated by phi(:,i) = U(:,i)./U(:,1)
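% In the diffusion map embedding, image i is mapped to the point
% (lambda_1*phi_1(i), lambda_2*phi_2(i), ...); since only one
% parameter varies across this set of images, sorting along the
% first coordinate recovers the ordering of that parameter.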
%
vv=eig(K); % eigenvalues of the symmetric K, in ascending order

% Initial order of the Buzz images (the 3x4 subplot grid
% assumes at most 12 images)
figure(1)
for i=1:ntotal
subplot(3,4,i)
imshow(d(i).name)
end

% first diffusion coordinate: second largest eigenvalue times
% the first nontrivial eigenfunction
X = vv(ntotal-1).*phi(:,2);
% sort the Buzz images along the first nontrivial
% eigenfunction of the Laplace-Beltrami operator
[Y,I] = sort(X);
% d(I).name
%
% Buzz images reordered along the diffusion coordinate
figure(2)
for i=1:ntotal
subplot(3,4,i)
imshow(d(I(i)).name)
end
%
figure(3)
plot(vv(ntotal-1).*phi(:,2),'.')
title(' Buzz Lightyear')
xlabel('Index')
ylabel('Lambda_1 phi_1')
%
figure(4)
plot(vv(ntotal-1).*phi(:,2),vv(ntotal-2).*phi(:,3),'.')
title(' Buzz Lightyear')
xlabel('Lambda_1 phi_1')
ylabel('Lambda_2 phi_2')
%
figure(5)
plot(vv,'*')
xlabel('Index of eigenvalue')
ylabel('Value of eigenvalue')
title(' Eigenvalues of K')
%
It looks like 100 projections are enough to recover the same results as the analysis performed on the original images. Note that figuring out how many random projections are actually needed would require the kind of analysis that evaluates the intrinsic dimensionality of the set, as mentioned here by the folks at Rice.
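In the meantime, one can get an empirical feel for that number. The sketch below (not from the original post; it assumes b1, swhd and ntotal from the script above are still in the workspace) recomputes the pairwise distances for increasing numbers of projections and tracks the worst relative distortion with respect to the distances in the original space; the point where the curve flattens out suggests a sufficient number of projections. Here the Gaussian matrix is scaled by 1/sqrt(m) instead of having its columns normalized as in the script, which preserves distances in expectation.

% Sketch: empirical choice of the number of random projections
ntrial = [10 25 50 100 200];
% exact pairwise distances in the original space
Dfull = zeros(ntotal,ntotal);
for i=1:ntotal
for j=1:ntotal
Dfull(i,j) = norm(b1(:,i)-b1(:,j));
end
end
distortion = zeros(size(ntrial));
for k=1:length(ntrial)
m = ntrial(k);
A = randn(m,swhd)/sqrt(m); % scaled Gaussian ensemble
bp = A*b1; % compressed measurements
Dp = zeros(ntotal,ntotal);
for i=1:ntotal
for j=1:ntotal
Dp(i,j) = norm(bp(:,i)-bp(:,j));
end
end
% worst relative distance distortion over all pairs
% (the identity on the diagonal avoids dividing zero by zero)
distortion(k) = max(max(abs(Dp-Dfull)./(Dfull+eye(ntotal))));
end
figure(6)
plot(ntrial,distortion,'-o')
xlabel('Number of random projections')
ylabel('Max relative distance distortion')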

All the Monday Morning Algorithms (and attendant codes/files) are listed here.
