ValueError: Contour levels must be increasing - Contour plot in Python

I am trying to plot density estimates using a contour plot and getting the following error.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import multivariate_normal

A = np.random.uniform(size=(100, 2))
#mean = np.mean(x)
#cov = np.cov(x)
mean = np.array([0.5, 0.1])
cov = np.array([[0.1, 0.0], [0.0, 1.5,]])
B = multivariate_normal.pdf(A, mean=mean, cov=cov)
# visualize
contours = plt.contour(A, B, linewidths=2)
plt.clabel(contours, inline=True, fontsize=12)
#plt.plot(x, y)
plt.colorbar();

For anyone in the future who hits this problem in seaborn: I discovered that my data had some extreme outliers, meaning there was effectively no density to plot because 99% of the samples were around the origin. Using the 'clip' functionality in kdeplot reduced the axis range and thus plotted the actual levels.
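As a rough sketch of that workaround (assuming a recent seaborn, 0.11 or later; the toy data and clip range below are made up for illustration), kdeplot's clip argument restricts where the density is evaluated:
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt

# toy data: most samples near the origin, plus a few extreme outliers
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.1, size=(990, 2)),
                  rng.normal(0, 100, size=(10, 2))])

# clip the estimate to the region that actually contains density
sns.kdeplot(x=data[:, 0], y=data[:, 1], clip=((-1, 1), (-1, 1)), fill=True)
plt.show()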

Related

How to plot Hog features in the image

I am trying to extract features from an image at certain points. This is my first time using HOG feature extraction. When I plot the features and valid points, the results do not land on those points. I will use these features for training later on. For example, my points lie on a straight line; shouldn't my features appear where my points are on that line? I am a little confused about the concept. I used [features,validPoints] = extractHOGFeatures(I,points), where x and y are my 10 positions in the image. In this case, how does the feature extraction work?
[features,validPoints] = extractHOGFeatures(I,[x,y]);
figure;
imshow(I);
hold on;
plot(features, 'ro');
plot(validPoints,'go');
Thank you
The function's documentation explains it all clearly.
validPoints is an n-by-2 matrix of (x,y) coordinates, so you should use plot(x,y) instead of plot(x) to plot it.
features is a matrix of the HOG features of each point, so simply plotting it with plot(features, 'ro') will not produce any reasonable output.
However, you can obtain the third output (visualization) from extractHOGFeatures and plot that instead:
I = im2double(imread('cameraman.tif'));
% desired points
n = 20;
x = randi(size(I,2), [n 1]);
y = randi(size(I,1), [n 1]);
% extract features + visualization object
[features,validPoints,visualization] = extractHOGFeatures(I,[x,y]);
% show image and features
figure;
imshow(I);
hold on;
plot(visualization,'Color','r');
% plot valid points
plot(validPoints(:,1),validPoints(:,2),'go');

Matlab: Extract Image in polar representation from Cartesian

I am trying to find an efficient way to transform an image from Cartesian coordinates into a polar representation. I know functions such as ImToPolar do this, and they work perfectly, but they take considerable time for big images, especially when they need to be processed back and forth.
Here's my input image:
I then generate a polar mesh using a Cartesian mesh centered at 0 and the function cart2pol(). Finally, I plot my image using mesh(theta, r, Input).
And here's what I obtain:
It's exactly the image I need, and it's the same as ImToPolar or maybe better.
Since MATLAB knows how to compute it, does anybody know how to extract a matrix in polar representation from this output? Or maybe a fast (as in fast Fourier transform) way to compute a polar transform (and its inverse) in MATLAB?
pol2cart, meshgrid, and interp2 are sufficient to create the result:
I = imread('http://i.stack.imgur.com/HYSyb.png');
[r, c, ~] = size(I);
% the RGB image can be converted to an indexed image to prevent excessive computation for each color
[idx, mp] = rgb2ind(I,32);
% add offset to image coordinates
x = (1:c)-(c/2);
y = (1:r)-(r/2);
% create destination coordinates in polar form so the image values can be interpolated at those coordinates
% angle ranges from 0 to 2*pi and radius is assumed to range from 0 to 400
% linspace(0,2*pi, 200) leads to a stretched image; try it!
[xp, yp] = meshgrid(linspace(0,2*pi), linspace(0,400));
% translate coordinates from polar to image coordinates
[xx, yy] = pol2cart(xp,yp);
% interpolate pixel values at the unknown coordinates (interp2 needs floating-point sample values)
out = interp2(x, y, double(idx), xx, yy);
% save the result to a file (convert back to uint8 indices into the colormap)
imwrite(uint8(out), mp, 'result.png')

Camera calibration: 3D to 2D points mapping

I am working on a problem related to camera calibration. In the image below, we consider a world coordinate system with the X-axis going leftward, the Y-axis rightward and the Z-axis upward. We select 15 points (x,y,z) distributed uniformly across the 3 planes. The distance between grid lines is 1 inch. We also obtain the MATLAB coordinates of the 15 pixels (u,v). The objective is to obtain the 3x4 camera matrix (M) using homogeneous linear least squares and then project the world points (x,y,z) to the image (u',v') using M. I have written code to do this, but the coordinates (u',v') I obtain seem to be very small in magnitude compared to the actual coordinates (u,v). The RMS error is too large, and the projected points don't even map onto the image anywhere near the actual points. Is there any scaling that I need to do to convert them to MATLAB coordinates? I am also including my code, which isn't very well written since I am relatively new to MATLAB.
P = [];  % 2n-by-12 matrix, here 30x12
for i = 1:15  % compute P
    world_row = world_coords(i,:);  % 3D homogeneous coordinates (x,y,z,1)
    zeroelem = repelem(0,4);
    image_coord = image_coords(i,:);
    img_u = image_coord(1);
    prod = -img_u*world_row;
    row1 = [world_row,zeroelem,prod];
    zeroelem = repelem(0,3);
    img_v = image_coord(2);
    prod = -img_v*world_row;
    row2 = [0,world_row,zeroelem,prod];
    P = [P;row1;row2];
end
var1 = P'*P;
[V,D] = eig(var1');  % compute eigenvector corresponding to the least eigenvalue
m = V(:,1);          % unit vector of norm 1
M = reshape(m,3,4);  % camera matrix of size 3x4
% get projected points
proj = M*world_coords';
U = proj(1,:);
V = proj(2,:);
W = proj(3,:);
for i = 1:15
    U(i) = U(i)/W(i);
    V(i) = V(i)/W(i);
end
final = [U;V];  % (u',v')
I am also including the image with the 15 points I have selected. Take P1(u,v) = (286,260) and P1(x,y,z) = (4,0,3). The (u',v') I obtained for this point has very low values. Can anyone point out what I'm doing wrong?
It was a silly error on my part that was giving me the wrong camera matrix: I noted down the world coordinates of one of the points incorrectly ((7,0,1) instead of (1,0,1)). This led to a wrongly formed 30x12 matrix, which is used to set up the equation solved by homogeneous linear least squares. After correcting this mistake, I obtained a calibration matrix that projects the 3D points with a low RMS error.
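For reference, here is a minimal sketch of the homogeneous least-squares step via svd, assuming world_coords is a 15x4 matrix of homogeneous points [x y z 1] and image_coords is a 15x2 matrix of pixel coordinates [u v] (the variable names follow the question; the row layout is the standard DLT form):
% Sketch: standard DLT with homogeneous least squares solved by SVD.
% Assumes world_coords is 15x4 ([x y z 1]) and image_coords is 15x2 ([u v]).
n = size(world_coords, 1);
P = zeros(2*n, 12);
for i = 1:n
    X = world_coords(i,:);                 % 1x4 homogeneous world point
    u = image_coords(i,1);
    v = image_coords(i,2);
    P(2*i-1,:) = [X, zeros(1,4), -u*X];    % row for the u equation
    P(2*i,  :) = [zeros(1,4), X, -v*X];    % row for the v equation
end
[~, ~, V] = svd(P);                        % right singular vectors of P
m = V(:,end);                              % singular vector of the smallest singular value
M = reshape(m, 4, 3)';                     % 3x4 camera matrix with rows p1, p2, p3
% reproject and convert back from homogeneous image coordinates
proj = M * world_coords';
uv = proj(1:2,:) ./ proj(3,:);             % needs implicit expansion (R2016b+)
Using svd directly on P avoids forming P'*P, and the last column of V is guaranteed to correspond to the smallest singular value, whereas eig does not guarantee any particular ordering of the eigenvalues.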

Are there MATLAB scripts that will give me a quantitative/visual depiction of all the colors in an image?

I have tried the following:
b=imread('/home2/s163720/lebron.jpg');
hsv = rgb2hsv(b);
h = hsv(:,:,1);
imhist(h,16)
However, it does not give me quite what I'm looking for. It would be great to see a counter of some sort for different hues, or maybe even a distribution of the colors.
This would be greatly appreciated.
I think this may be along the lines of what you're looking for.
%Read the image
img = imread('/home2/s163720/lebron.jpg');
% convert to HSV and reshape to an N x 3 matrix
hsv = rgb2hsv(img);
hsv2 = squeeze(reshape(hsv, [], 1, 3));
%Extract hue (and convert to an angle) and saturation
Hue = 2*pi*hsv2(:,1);
Saturation = hsv2(:,2);
%The number of bins in the hue and saturation directions
RadialColorBins = 50;
AngularColorBins = 50;
% Where the edges of the bins are
edges = {linspace(0, 1, RadialColorBins), linspace(0, 2*pi, AngularColorBins) - 2*pi / AngularColorBins};
%bin the data
[heights,centers] = hist3([Saturation, Hue],'Edges',edges);
%Extract the centers
radius = centers{1};
angle = centers{2};
%Force periodicity
angle = [angle, angle(1)];
heights = [heights, heights(:, 1)]';
%Mesh the r and theta components
[Radius, Angle] = meshgrid(radius, angle);
%Make a color map for the polar plot
CMap = hsv2rgb(cat(3, Angle/(2*pi), Radius, ones(size(Radius))));  % hsv2rgb expects an m-by-n-by-3 array
figure(1)
imshow(img)
%polar histogram in s-h space
figure(2)
surf(Radius.*cos(Angle), Radius.*sin(Angle), heights, CMap,'EdgeColor','none');
I know this is supposed to be where the answers go, but I am quite sure there is no generic answer to your question. However, there are a couple of possibilities; maybe you or someone else can come up with more.
The basic problem, since you want a histogram, is that you have to choose some representation of the color as a single number, which is quite a difficult problem.
The first solution could be to transform the RGB color to a wavelength and then ignore the intensity of the image. The problem with this idea is that RGB defines more colors than a wavelength alone can describe. See: http://jp.mathworks.com/matlabcentral/newsreader/view_thread/313712
A second solution could be to define a number such as A = sum(rgb.*[1,10,100]); and then use this number as your representation of the color.
A third solution would be to convert the hexadecimal representation of the color to a decimal number and then use that number.
Once you have a representation for every color of every pixel, you simply reshape the matrix into a vector and use the standard hist command to plot it (see the sketch below). But as mentioned, maybe someone has a better idea for representing a color as a single number.
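As an illustration of the second idea, here is a minimal sketch (the file path is taken from the question; the weights [1,10,100] are just the example weights suggested above, not a standard color code):
% collapse each pixel's RGB triple to a single number and histogram it
img = imread('/home2/s163720/lebron.jpg');
rgb = double(reshape(img, [], 3));   % N x 3 list of pixel colors
code = rgb * [1; 10; 100];           % vectorized form of sum(rgb.*[1,10,100]) per pixel
hist(code, 64);                      % or histogram(code, 64) on newer MATLAB releases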

Vector decomposition in matlab

This is my situation: I have a 30x30 image and I want to calculate the radial and tangential components of the gradient at each point (pixel) along the straight line passing through the centre of the image (15,15) and that (i,j) point.
[dx, dy] = gradient(img);
for i=1:30
for j=1:30
pt = [dx(i, j), dy(i,j)];
line = [i-15, j-15];
costh = dot(line, pt)/(norm(line)*norm(pt));
par(i,j) = norm(costh*line);
tang(i,j) = norm(sin(acos(costh))*line);
end
end
Is this code correct?
I think there is a conceptual error in your code. I tried to get your results with a different approach; see how it compares to yours.
[dy, dx] = gradient(img);
I swapped x and y because the usual convention in MATLAB is to have the first dimension run along the rows of a matrix, while gradient does the opposite.
I created an array of the same size as img but with each pixel containing the angle of the vector from the center of the image to this point:
[I,J] = ind2sub(size(img), 1:numel(img));
theta=reshape(atan2d(I-ceil(size(img,1)/2), J-ceil(size(img,2)/2)), size(img))+180;
The function atan2d ensures that the 4 quadrants give distinct angle values.
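For instance, the four quadrants map to distinct values (before the +180 offset used above):
atan2d( 1,  1)   % returns 45
atan2d( 1, -1)   % returns 135
atan2d(-1, -1)   % returns -135
atan2d(-1,  1)   % returns -45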
Now the projection of the x and y components can be obtained with trigonometry:
par=dx.*sind(theta)+dy.*cosd(theta);
tang=dx.*cosd(theta)+dy.*sind(theta);
Note the use of .* to achieve element-wise multiplication; this is a big advantage of MATLAB's array operations, and it saves you a loop.
Here's an example with a well-defined input image (no gradient along the rows and a constant gradient along the columns):
img=repmat(1:30, [30 1]);
The results:
subplot(1,2,1)
imagesc(par)
subplot(1,2,2)
imagesc(tang)
colorbar
