Count the number of non-gray pixels in an RGB image in MATLAB

I have an RGB image that should contain only black and white squares, and I want to count the number of non-gray pixels in it. I am new to MATLAB. I want to check the quality of the image, since it should only contain black and white pixels. I have undistorted this image, and because of that some coloured fringes have appeared. I want to know how many colours were introduced so I can check the quality of the image.

Using MATLAB to get counts of specific pixel values in an image:
Images are RGBA <512x512x4 uint8> when read into MATLAB (although we can disregard the alpha channel).
Something like this
mask = im(:, :, 1) == 255 & im(:, :, 2) == 255 & im(:, :, 3) == 255;
count = sum(mask(:));
will give you the count of such pixels (here, pure white). Use find(mask) instead of sum if you need the indices of those pixels.
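For example, a small sketch reusing the mask above to also recover where those pixels are:
idx = find(mask);           %// linear indices of the matching pixels
[rows, cols] = find(mask);  %// or their row/column coordinates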

A pixel is said to be gray if its R, G, and B components are all the same.
Using this logic
%// Check for equality of the R, G, B values at each pixel
B = any(diff(im,[],3),3); %// true only for non-gray pixels
count = sum(B(:));        %// number of non-gray pixels
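If the coloured fringes are subtle (e.g. from interpolation or compression), an exact-equality test may be too strict. A minimal sketch of a tolerance-based variant (the tolerance value is an assumption to tune for your image):
tol = 5;                                  %// assumed tolerance in intensity levels
d = double(im);                           %// avoid uint8 arithmetic saturation
spread = max(d, [], 3) - min(d, [], 3);   %// per-pixel spread across R, G, B
B_tol = spread > tol;                     %// non-gray beyond the tolerance
count_tol = sum(B_tol(:));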

Related

Find number of white/black pixels in binary image

I have a binary word image divided into a grid, and I want to count the foreground pixels (the number of black and white pixels) in each segment.
So I just wanted to get the total number of black pixels for an image, and I didn't want to segment it or learn how to segment a matrix just for this.
Easy fix to get the sum of the black pixels:
sum(sum(bw == 0))
Conversely to get the white pixels
sum(sum(bw == 1))
sum(bw == 0) on its own sums down each column, giving a 1-by-n row vector, so you need the outer sum to total over the whole image.
I first thought cumsum would do the trick, but sum(sum()) is simpler for this case. Note that I'm just using this to determine the area coverage of a binarized image, but it should be useful for other purposes too.
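As an aside, equivalent one-liners (a small sketch, assuming bw is a logical image):
numBlack = nnz(~bw);   % count of 0 (black) pixels
numWhite = nnz(bw);    % count of 1 (white) pixels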
Assume your black and white image is bw
sum(bw(:)==0) % number of black pixels
sum(bw(:)==1) % number of white pixels
For each grid cell, find its range in both directions, then apply the same idea.
For example, if one cell spans rows x1 to x2 and columns y1 to y2:
sum(sum(bw(x1:x2, y1:y2) == 0)) % gives the count of black pixels in that cell
That is easy with logical indexing.
For each segment, do:
n_white=sum(segment(:)==1);
n_black=sum(segment(:)==0);
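Putting it together, a minimal sketch that splits a binary image bw into an assumed regular grid and counts black and white pixels per cell (the 4-by-4 grid size is an assumption):
[rows, cols] = size(bw);
nR = 4; nC = 4;                             % assumed number of grid cells per dimension
rEdges = round(linspace(1, rows+1, nR+1));  % row boundaries of the cells
cEdges = round(linspace(1, cols+1, nC+1));  % column boundaries of the cells
counts = zeros(nR, nC, 2);                  % (:,:,1) = black counts, (:,:,2) = white counts
for i = 1:nR
    for j = 1:nC
        seg = bw(rEdges(i):rEdges(i+1)-1, cEdges(j):cEdges(j+1)-1);
        counts(i, j, 1) = sum(seg(:) == 0);
        counts(i, j, 2) = sum(seg(:) == 1);
    end
end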

Turn black pixel on the first image to white if it is also black on a second co-located image

I have two co-located images, both created in a similar way, and both with a size of 7,221 x 119 pixels.
I want to write a logic like this:
If the R,G,B values of a certain pixel (call it x) in image 1 are 0,0,0 (black), and the R,G,B values of pixel x in image 2 are also 0,0,0 (black), then change the R,G,B values of pixel x in image 1 to 255,255,255 (white); otherwise leave it unchanged.
How can I do this in either Matlab or Python?
You should be able to do this in Python with the Pillow package. You need to read the pixel at the same location in both images, check whether all of its color channels are 0 in each, and if so set that pixel to 255,255,255 in the first image, then save the image again. In Python 0 is interpreted as False, so not any(vals) will be True when vals contains only zeros.
from PIL import Image
im1 = Image.open("image1.jpg")
im2 = Image.open("image2.jpg")
pixel = (0, 0)            # location to test
newcolor = (255,) * 3     # pure white
if not any(im1.getpixel(pixel)) and not any(im2.getpixel(pixel)):
    im1.putpixel(pixel, newcolor)
im1.save('image1conv.jpg')
Note that not any(im1.getpixel(pixel)) and not any(im2.getpixel(pixel)) could be rewritten as not any(im1.getpixel(pixel) + im2.getpixel(pixel)), but I think the first way has clearer logic.
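Since the question also asks about MATLAB, here is a minimal vectorized sketch that applies the rule to every pixel at once (assuming im1 and im2 are same-sized uint8 RGB arrays already read with imread):
blackInBoth = all(im1 == 0, 3) & all(im2 == 0, 3);   % true where the pixel is pure black in both images
mask = repmat(blackInBoth, [1 1 3]);                 % replicate the mask across the three channels
im1(mask) = 255;                                     % turn those pixels white in image 1
imwrite(im1, 'image1conv.png');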

Coloring an 8-bit grayscale image in MATLAB

I have an 8-bit grayscale image with different values (0, 1, 2, 3, ..., 255). What I want to do is colour the grayscale image with colours like blue, red, etc. Until now I have only been recolouring it within greyscale. How can I do it with actual colours?
Here is the code I have written so far. This is where I am searching for all values that are white in an image and replacing them with a darkish gray:
for k = 1:length(tifFiles)
    baseFileName = tifFiles(k).name;
    fullFileName = fullfile(myFolder, baseFileName);
    fprintf(1, 'Now reading %s\n', fullFileName);
    imageArray = imread(fullFileName);
    %// Replace white grayscale values with a darkish gray
    ind_plain = find(imageArray == 255);
    imageArray(ind_plain) = 50;
    imwrite(imageArray, fullFileName);
end
What you are asking is to perform a pseudo colouring of an image. Doing this in MATLAB is actually quite easy. You can use the grayscale intensities as an index into a colour map, and each intensity would generate a unique colour. First, what you need to do is create a colour map that is 256 elements long, then use ind2rgb to create your colour image given the grayscale intensities / indices of your image.
There are many different colour maps available to you in MATLAB. Here are the colour maps currently available in MATLAB, excluding the recently added Parula colour map that was introduced in R2014b:
How the colour maps work is that lower indices / grayscale values have colours that move toward the left side of the spectrum and higher indices / grayscale values have colours that move toward the right side of the spectrum.
If you want to create a colour map with 256 elements, you simply use any one of those colour maps as a function and specify 256 as the input parameter to generate a 256 element colour map for you. For example, if you wanted to use the HSV colour map, you would do this in MATLAB:
cmap = hsv(256);
Now, given your grayscale image in your MATLAB workspace is stored in imageArray, simply use ind2rgb this way:
colourArray = ind2rgb(double(imageArray)+1, cmap);
The first argument is the grayscale image you want to pseudocolour, and the second input is the colour map produced by any one of MATLAB's colour mapping functions. colourArray will contain your pseudo coloured image. Take note that we offset the grayscale image by 1 and also cast to double. The reason for this is because MATLAB is a 1-indexed programming language, so we have to start indexing into arrays / matrices starting at 1. Because your intensities range from [0,255], and we want to use this to index into the colour map, we must make this go from [1,256] to allow the indexing. In addition, you are most likely using uint8 images, and so adding 1 to a uint8 will simply saturate any values that are already at 255 to 255. We won't be able to go to 256. Therefore, you need to cast the image temporarily to double so that we can increase the precision of the image and then add 1 to allow the image to go to 256 if merited.
Here's an example using the cameraman.tif image that's part of the image processing toolbox. This is what it looks like:
So we can load in that image in MATLAB like so:
imageArray = imread('cameraman.tif');
Next, we can use the above image, generate a HSV colour map then pseudocolour the image:
cmap = hsv(256);
colourArray = ind2rgb(double(imageArray)+1, cmap);
We get:
Take note that you don't have to use any of the colour maps that MATLAB provides. In fact, you can create your own colour map. All you have to do is create a 256 x 3 matrix where each column denotes the proportion of red (first column), green (second column) and blue (third column) per intensity. Therefore, the first row gives you the colour that is mapped to intensity 0, the second row gives you the colour that is mapped to intensity 1, and so on. Also, you need to make sure that the colour map values are floating-point and range from [0,1]. For example, these are the first 10 rows of the HSV colour map generated above:
>> cmap(1:10,:)
ans =
1.0000 0 0
1.0000 0.0234 0
1.0000 0.0469 0
1.0000 0.0703 0
1.0000 0.0938 0
1.0000 0.1172 0
1.0000 0.1406 0
1.0000 0.1641 0
1.0000 0.1875 0
1.0000 0.2109 0
You can then use this custom colour map with ind2rgb to pseudocolour your image.
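For instance, a minimal sketch of a custom map (here an assumed black-to-blue ramp) used the same way:
cmapCustom = [zeros(256,1), zeros(256,1), linspace(0,1,256)'];  % R, G, B columns in [0,1]
colourArray = ind2rgb(double(imageArray)+1, cmapCustom);        % intensity 0 -> black, 255 -> pure blue
imshow(colourArray);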
Good luck and have fun!

how to detect edges in an image having only red object

I have an image in which all red objects have been detected.
Here's an example image:
http://img.weiku.com/waterpicture/2011/10/30/18/road_Traffic_signs_634577283637977297_4.jpg
But when I run that image through an edge detection method, I get an output that is only black. However, I want to detect the edges of that red object.
r=im(:,:,1); g=im(:,:,2); b=im(:,:,3);
diff=imsubtract(r,rgb2gray(im));
bw=im2bw(diff,0.18);
area=bwareaopen(bw,300);
rm=immultiply(area,r); gm=g.*0; bm=b.*0;
image=cat(3,rm,gm,bm);
axes(handles.Image);
imshow(image);
I=image;
Thresholding=im2bw(I);
axes(handles.Image);
imshow(Thresholding)
fontSize=20;
edgeimage=Thresholding;
BW = edge(edgeimage,'canny');
axes(handles.Image);
imshow(BW);
When you apply im2bw you want to use only the red channel of I (i.e. the 1st channel), using this command:
Thresholding =im2bw(I(:,:,1));
for example yields this output:
Just FYI for anyone else that manages to stumble here. The HSV colorspace is better suited than the RGB colorspace for detecting colors. A good example is in gnovice's answer. The main reason for this is that there are colors which contain a full 255 red value but aren't actually red (yellow can be formed from (255,255,0), white from (255,255,255), magenta from (255,0,255), etc).
I modified his code for your purpose below:
cdata = imread('roadsign.jpg');
hsvImage = rgb2hsv(cdata); %# Convert the image to HSV space
hPlane = 360.*hsvImage(:,:,1); %# Get the hue plane scaled from 0 to 360
sPlane = hsvImage(:,:,2); %# Get the saturation plane
bPlane = hsvImage(:,:,3); %# Get the brightness plane
% Must get colors with high brightness and saturation of the red color
redIndex = ((hPlane <= 20) | (hPlane >= 340)) & sPlane >= 0.7 & bPlane >= 0.7;
% Show edges
imshow(edge(redIndex));
Output:
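To see where those edges fall on the original sign, a small optional sketch (reusing cdata and redIndex from above) that draws the detected edges in green:
edges = edge(redIndex);                        % binary edge map of the red mask
R = cdata(:,:,1); G = cdata(:,:,2); B = cdata(:,:,3);
R(edges) = 0; G(edges) = 255; B(edges) = 0;    % paint the edge pixels green
imshow(cat(3, R, G, B));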

To convert only black color to white in Matlab

I know this thread about converting black color to white and white to black simultaneously.
I would like to convert only black to white.
I know this thread about doing what I am asking, but I do not understand what goes wrong.
Picture
Code
rgbImage = imread('ecg.png');
grayImage = rgb2gray(rgbImage); % for non-indexed images
level = graythresh(grayImage); % threshold for converting image to binary,
binaryImage = im2bw(grayImage, level);
% Extract the individual red, green, and blue color channels.
redChannel = rgbImage(:, :, 1);
greenChannel = rgbImage(:, :, 2);
blueChannel = rgbImage(:, :, 3);
% Make the black parts pure red.
redChannel(~binaryImage) = 255;
greenChannel(~binaryImage) = 0;
blueChannel(~binaryImage) = 0;
% Now recombine to form the output image.
rgbImageOut = cat(3, redChannel, greenChannel, blueChannel);
imshow(rgbImageOut);
Which gives
where something seems to be wrong in the red color channel.
Black is just (0,0,0) in RGB, so removing it should mean turning every (0,0,0) pixel to white (255,255,255).
Doing this idea with
redChannel(~binaryImage) = 255;
greenChannel(~binaryImage) = 255;
blueChannel(~binaryImage) = 255;
Gives
So I must have misunderstood something in Matlab. The blue color should not have any black. So this last image is strange.
How can you turn only black color to white?
I want to keep the blue color of the ECG.
If I understand you properly, you want to extract out the blue ECG plot while removing the text and axes. The best way to do that would be to examine the HSV colour space of the image. The HSV colour space is great for discerning colours just like the way humans do. We can clearly see that there are two distinct colours in the image.
We can convert the image to HSV using rgb2hsv and we can examine the components separately. The hue component represents the dominant colour of the pixel, the saturation denotes the purity or how much white light there is in the pixel and the value represents the intensity or strength of the pixel.
Try visualizing each channel doing:
im = imread('http://i.stack.imgur.com/cFOSp.png'); %// Read in your image
hsv = rgb2hsv(im);
figure;
subplot(1,3,1); imshow(hsv(:,:,1)); title('Hue');
subplot(1,3,2); imshow(hsv(:,:,2)); title('Saturation');
subplot(1,3,3); imshow(hsv(:,:,3)); title('Value');
Hmm... well the hue and saturation don't help us at all. It's telling us the dominant colour and saturation are the same... but what sets them apart is the value. If you take a look at the image on the right, we can tell them apart by the strength of the colour itself. So what it's telling us is that the "black" pixels are actually blue but with almost no strength associated to it.
We can actually use this to our advantage. Any pixels whose values are above a certain value are the values we want to keep.
Try setting a threshold... something like 0.75. MATLAB's dynamic range for HSV values is [0,1], so:
mask = hsv(:,:,3) > 0.75;
When we threshold the value component, this is what we get:
There's obviously a bit of quantization noise... especially around the axes and font. What I'm going to do next is perform a morphological erosion so that I can eliminate the quantization noise that's around each of the numbers and the axes. I'm going to make the structuring element a bit larger to ensure that I remove this noise. Using the image processing toolbox:
se = strel('square', 5);
mask_erode = imerode(mask, se);
We get this:
Great, so what I'm going to do now is make a copy of your original image, then set any pixel that is false (black) in the mask I derived above to white in the final image. All of the other pixels should remain intact. This way, we can remove any text and the axes seen in your image:
im_final = im;
mask_final = repmat(mask_erode, [1 1 3]);
im_final(~mask_final) = 255;
I need to replicate the mask in the third dimension because this is a colour image and I need to set each channel to 255 simultaneously in the same spatial locations.
When I do that, this is what I get:
Now you'll notice that there are gaps in the graph... which is to be expected due to quantization noise. We can do something further by converting this image to grayscale, thresholding it, then joining the edges together with a morphological dilation. This is safe because we have already eliminated the axes and text. We can then use this as a mask to index into the original image to obtain our final graph.
Something like this:
im2 = rgb2gray(im_final);
thresh = im2 < 200;
se = strel('line', 10, 90);
im_dilate = imdilate(thresh, se);
mask2 = repmat(im_dilate, [1 1 3]);
im_final_final = 255*ones(size(im), class(im));
im_final_final(mask2) = im(mask2);
I threshold the previous image that we got without the text and axes after I convert it to grayscale, and then I perform dilation with a line structuring element that is 90 degrees in order to connect those lines that were originally disconnected. This thresholded image will contain the pixels that we ultimately need to sample from the original image so that we can get the graph data we need.
I then take this mask, replicate it, make a completely white image and then sample from the original image and place the locations we want from the original image in the white image.
This is our final image:
Very nice! I had to do all of that image processing because your image basically has quantization noise to begin with, so it's going to be a bit harder to get the graph entirely. Ander Biguri in his answer explained in more detail about colour quantization noise so certainly check out his post for more details.
However, as a qualitative measure, we can subtract this image from the original image and see what is remaining:
imshow(rgb2gray(abs(double(im) - double(im_final_final))));
We get:
So it looks like the axes and text are removed fine, but there are some traces in the graph that we didn't capture from the original image and that makes sense. It all has to do with the proper thresholds you want to select in order to get the graph data. There are some trouble spots near the beginning of the graph, and that's probably due to the morphological processing that I did. This image you provided is quite tricky with the quantization noise, so it's going to be very difficult to get a perfect result. Also, these thresholds unfortunately are all heuristic, so play around with the thresholds until you get something that agrees with you.
Good luck!
What's the problem?
You want to detect all black parts of the image, but they are not really black
Example:
Your idea (or your code):
You first binarize the image, selecting the pixels that ARE something against the pixels that are not. In short, you do: if pixel>level; pixel is something
Therefore there is a small misconception you have here! when you write
% Make the black parts pure red.
it should read
% Make every pixel that is something (not background) pure red.
Therefore, when you do
redChannel(~binaryImage) = 255;
greenChannel(~binaryImage) = 255;
blueChannel(~binaryImage) = 255;
You are doing
% Make every pixel that is something (not background) white
% (or what it is the same in this case, delete them).
Therefore what you should get is a completely white image. The image is not completely white because some pixels were labelled as "not something, part of the background" by the value of level, which in the case of your image is around 0.6.
A solution one could think of is manually setting the level to 0.05 or similar, so only black pixels will be selected in the gray-to-binary thresholding. But this will not work 100%: as you can see, the numbers have some very non-black values.
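For reference, a minimal sketch of that manual-threshold idea (the 0.05 level is just the assumed value mentioned above):
grayImage = rgb2gray(imread('ecg.png'));
binaryImage = im2bw(grayImage, 0.05);   % true for pixels brighter than the 0.05 level
darkMask = ~binaryImage;                % marks only the near-black pixels
% darkMask can then replace ~binaryImage in the original channel-wise code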
How would I try to solve the problem:
I would try to find the colour you want, extract just that colour from the image, and then delete outliers.
Extract blue using HSV (I believe I answered you somewhere else how to use HSV).
rgbImage = imread('ecg.png');
hsvImage=rgb2hsv(rgbImage);
I=rgbImage;
R=I(:,:,1);
G=I(:,:,2);
B=I(:,:,3);
th=0.1;
R((hsvImage(:,:,1)>(280/360))|(hsvImage(:,:,1)<(200/360)))=255;
G((hsvImage(:,:,1)>(280/360))|(hsvImage(:,:,1)<(200/360)))=255;
B((hsvImage(:,:,1)>(280/360))|(hsvImage(:,:,1)<(200/360)))=255;
I2= cat(3, R, G, B);
imshow(I2)
Once here, we would like to get the biggest blue part, which will be our signal. Therefore the best approach seems to be to first binarize the image, taking all the blue pixels:
% Binarize image, getting all the pixels that are "blue"
bw=im2bw(rgb2gray(I2),0.9999);
And then using bwlabel, label all the independent pixel "islands".
% Label each "blob"
lbl=bwlabel(~bw);
The most repeated label will be the signal, so we find it and separate the signal from the background using that label.
% Find the blob with the highest amount of data. That will be your signal.
r=histc(lbl(:),1:max(lbl(:)));
[~,idxmax]=max(r);
% Profit!
signal=rgbImage;
signal(repmat((lbl~=idxmax),[1 1 3]))=255;
background=rgbImage;
background(repmat((lbl==idxmax),[1 1 3]))=255;
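An equivalent way to pick the most frequent label (a small sketch reusing lbl from above) is to take the mode of the non-zero label values:
idxmax = mode(lbl(lbl > 0));   % label value that occurs most often, i.e. the signal blob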
Here is a plot with the signal, background and difference (using the same equation as @rayryeng used).
Here is a variation on @rayryeng's solution to extract the blue signal:
%// retrieve picture
imgRGB = imread('http://i.stack.imgur.com/cFOSp.png');
%// detect axis lines and labels
imgHSV = rgb2hsv(imgRGB);
BW = (imgHSV(:,:,3) < 1);
BW = imclose(imclose(BW, strel('line',40,0)), strel('line',10,90));
%// clear those masked pixels by setting them to background white color
imgRGB2 = imgRGB;
imgRGB2(repmat(BW,[1 1 3])) = 255;
%// show extracted signal
imshow(imgRGB2)
To get a better view, here is the detected mask overlayed on top of the original image (I'm using imoverlay function from the File Exchange):
figure
imshow(imoverlay(imgRGB, BW, uint8([255,0,0])))
Here is code for this:
rgbImage = imread('ecg.png');
redChannel = rgbImage(:, :, 1);
greenChannel = rgbImage(:, :, 2);
blueChannel = rgbImage(:, :, 3);
black = ~redChannel&~greenChannel&~blueChannel;
redChannel(black) = 255;
greenChannel(black) = 255;
blueChannel(black) = 255;
rgbImageOut = cat(3, redChannel, greenChannel, blueChannel);
imshow(rgbImageOut);
black is the mask of the black pixels. These pixels are set to white in each color channel.
In your code you use a threshold and a grayscale image, so of course a much bigger area of pixels gets set to white (or red, respectively). In this code only pixels that contain absolutely no red, green or blue are set to white.
The following code does the same with a threshold for each color channel:
rgbImage = imread('ecg.png');
redChannel = rgbImage(:, :, 1);
greenChannel = rgbImage(:, :, 2);
blueChannel = rgbImage(:, :, 3);
black = (redChannel<150)&(greenChannel<150)&(blueChannel<150);
redChannel(black) = 255;
greenChannel(black) = 255;
blueChannel(black) = 255;
rgbImageOut = cat(3, redChannel, greenChannel, blueChannel);
imshow(rgbImageOut);
