I want to compute the bigram-to-bigram co-occurrence matrix. This post shows how to get the word-to-word co-occurrence matrix: "How do I calculate a word-word co-occurrence matrix with sklearn?". However, that method does not work for bigrams. Any thoughts?
I know that the MSE between two 2d matrices, A and B, of shape pxq, can be calculated in matrix terms via the difference D = A - B as follows:
1/n tr(DtD)
The nice thing about this expression is that the matrices Dt and D are conformable for matrix multiplication, and their product is a square matrix, which has a well-defined trace.
But if we have two 3d tensors A and B of shape pxqxr, then I don't understand how to form a product between them that yields a square matrix, so that the MSE can be written in terms of a trace.
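One way that seems to work is to unfold each tensor into a p x (q*r) matrix; the 2d trace formula then applies unchanged to the unfolded difference. A quick NumPy check with random tensors (sizes arbitrary):

```python
import numpy as np

p, q, r = 3, 4, 5
rng = np.random.default_rng(0)
A = rng.normal(size=(p, q, r))
B = rng.normal(size=(p, q, r))

# Unfold the difference into a p x (q*r) matrix (mode-1 unfolding).
D = (A - B).reshape(p, q * r)
n = A.size                             # n = p*q*r

mse_trace = np.trace(D.T @ D) / n      # trace form on the unfolded difference
mse_direct = np.mean((A - B) ** 2)     # elementwise definition

assert np.isclose(mse_trace, mse_direct)
```

The choice of unfolding does not matter: any reshape that keeps all p*q*r entries gives the same sum of squared differences.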
I want to plot multiple ROC curves from a matrix of predictions and labels. I have > 100 samples, each with its own predictions and labels, and the samples have different lengths. How could I design a single matrix for all the samples and get multiple ROC curves in a single plot? I would appreciate any suggestions. Thanks.
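Since the samples have different lengths, one option is to skip the single matrix entirely and keep a list of (labels, predictions) pairs, calling sklearn's roc_curve once per sample. A sketch with synthetic data (the sample sizes and score model are made up):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")                  # headless backend; use plt.show() interactively
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)

# Hypothetical data: each sample has its own length, so keep a list, not a matrix.
samples = []
for n in (40, 60, 80):
    labels = rng.integers(0, 2, size=n)
    scores = labels * 1.0 + rng.normal(size=n)   # imperfect predictions
    samples.append((labels, scores))

# One ROC curve per sample, all on the same axes.
for i, (labels, scores) in enumerate(samples):
    fpr, tpr, _ = roc_curve(labels, scores)
    plt.plot(fpr, tpr, label=f"sample {i} (AUC = {auc(fpr, tpr):.2f})")

plt.plot([0, 1], [0, 1], "k--", label="chance")
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend()
plt.savefig("roc_curves.png")
```

Ragged data padded into one rectangular matrix would need a sentinel value and masking; a plain list avoids that entirely.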
I want to plot a 3D figure from a 1×n (column vector) containing numeric data. I have used ListPlot3D, but it is not working, as I need to convert the 1×n (column vector) into an n×n matrix first before I can use the command
ListPlot3D[{{x1,y1,z1},{x2,y2,z2},…}]
Please guide me on how I can convert the 1×n (column vector) into a matrix of order n×n, or whether there is any other way to get a 3D plot in Mathematica. I am very new to Mathematica programming and need your help to sort out my problem. Highly appreciated!
I assume that your column vector contains groups of three values that constitute the coordinates of the surface points. In the simplest case, these values are contiguous. In that case, simply use "Partition", like so:
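For example, assuming the flat list vec is {x1, y1, z1, x2, y2, z2, …} (the name vec is illustrative):

```mathematica
(* Group the flat list into {x, y, z} triples, then plot the surface *)
pts = Partition[vec, 3];
ListPlot3D[pts]
```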
If more rearranging is necessary, use the tools for rearranging lists, as shown here: https://reference.wolfram.com/language/guide/RearrangingAndRestructuringLists.html
The distance transform provides the distance of each pixel from the nearest boundary/contour/background pixel. I don't want the closest distance; instead, I want some sort of average measure of the pixel's distance from the boundary/contour in all directions. Any suggestions for computing this distance transform would be appreciated. If there are any existing algorithms and/or efficient C++ code available to compute such a distance transform, that would be wonderful too.
If you have a binary image of the contours, then you can count the number of boundary pixels around each pixel within some window (using e.g. the integral image, or cv::blur). This would give you something like what you want.
You might be able to combine that with a normalized distance transform to approximate average distances.
If you want the "average measure of the pixel's distance from the boundary/contour in all directions", then I am afraid you have to extract the contour and, for each pixel inside the pattern, compute the average distance to the pixels belonging to the contour.
A heuristic for a rough approximation would be to compute several distance maps from source points (they could be the pattern extremities) and, for each pixel inside the pattern, sum the distances from all the distance maps. To get the exact measure, you would have to compute as many distance maps as there are pixels on the contour, but if an approximation is okay, this speeds up the processing.
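For reference, the exact version described above can be brute-forced in a few lines. This sketch uses NumPy/SciPy on a made-up square mask (the mask and the 4-neighbour contour definition are assumptions about the input):

```python
import numpy as np
from scipy.spatial.distance import cdist

# Hypothetical binary mask: 1 inside the pattern, 0 outside.
mask = np.zeros((32, 32), dtype=np.uint8)
mask[8:24, 8:24] = 1

# Contour pixels: inside pixels with at least one background 4-neighbour.
padded = np.pad(mask, 1)
all_neighbours_inside = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                         & padded[1:-1, :-2] & padded[1:-1, 2:])
contour = mask.astype(bool) & ~all_neighbours_inside.astype(bool)

inside = np.argwhere(mask == 1)
border = np.argwhere(contour)

# Exact (brute-force) average distance from each inside pixel to the contour.
avg_dist = cdist(inside, border).mean(axis=1)

out = np.zeros(mask.shape)
out[inside[:, 0], inside[:, 1]] = avg_dist
```

This is O(inside pixels × contour pixels), which is exactly the cost the heuristic above trades away for speed.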
I am doing a very interesting Computer Vision project about how to create images "manually" with Matlab.
The teacher gave me three matrices: the illuminant matrix (called E), the camera sensitivity matrix (called R) and finally, the surface reflectance matrix (called S).
The matrix dimensions are as follows:
S: 31x512x512 (reflectance samples x x-dimension x y-dimension)
R: 31x3
E: 31x1
The teacher gave me also the following relationship:
P=transpose(C)*R=transpose(S)*diagonal(E)*R
where C is the color matrix and P is the sensor response matrix.
The goal is to display the image formed by all the previous matrices. Therefore, we have to compute the P matrix.
The class of all the matrices is double.
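Before debugging the MATLAB attempt below, the relationship itself can be sanity-checked in NumPy with small random matrices (the 512×512 image shrunk to 4×4; all values here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n_wl, h, w = 31, 4, 4                  # 31 wavelengths; reduced spatial size

S = rng.random((n_wl, h, w))           # surface reflectance (31 x h x w)
R = rng.random((n_wl, 3))              # camera sensitivities (31 x 3)
E = rng.random(n_wl)                   # illuminant (31,)

# Flatten the spatial dimensions so S becomes (31, h*w).
S_flat = S.reshape(n_wl, h * w)

# P = transpose(S) * diagonal(E) * R: one RGB triple per pixel.
P = S_flat.T @ np.diag(E) @ R          # (h*w, 3)
image = P.reshape(h, w, 3)
```

Each row of P is the RGB response of one pixel, so the final reshape back to (h, w, 3) recovers the image.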
This is what I have done:
Diag_E = diag(E); % 31x31 diagonal matrix built from the illuminant E
S_reshaped = reshape(S, 31, 512*512); % flatten the two spatial dimensions
S_permute = permute(S_reshaped, [2 1]); % the output is a 262144x31 matrix
Color_Signal_D65_buffer = S_permute*Diag_E; % color signal: transpose(S)*diagonal(E)
Color_Signal_D65 = reshape(Color_Signal_D65_buffer, [512 512 31]); % this is the final color matrix
Image_D65_buffer = reshape(Color_Signal_D65, [512*512 31])*R; % apply the given formula: P = transpose(C)*R
Image_D65 = reshape(Image_D65_buffer, [512 512 3]); % image formation
Image_D65_norm = sqrt(sum(Image_D65.^2, 3)); % per-pixel norm of Image_D65
Image_D65_Normalized = bsxfun(@rdivide, Image_D65, Image_D65_norm); % divide each pixel by its norm to normalize the image
figure
imshow(Image_D65_Normalized) % display the image
However, it did not work at all. The output is an image, but the colors are completely wrong (there is too much blue in the image).
I think it could be a matrix-reshaping problem, but I have tried all the possible combinations and nothing works.
Thank you so much for your help.
I've finally found the error. It was a problem in the normalization process: I was using the wrong formula.