Putting two curves in one graph with cOutVector - OMNeT++

I have two parameters (parameter 1 and parameter 2) that I want to compare over time and examine their effect on each other. I am recording them with objects of class cOutVector (using the record() method).
I want to plot the values of these two parameters in one graph, with both on the Y axis and time on the X axis.
The second type of graph would have parameter 1 on the X axis and parameter 2 on the Y axis. Is it possible to create this type of graph?
Thanks for your help!

Have you tried using R? It is very well suited to statistical computing and to plotting your data.
Try it - you'll solve your problem with it. OMNeT++ provides an extra package for the R environment, which helps you analyse the cOutVector output.
A tutorial can be found here
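For the recording side, here is a minimal sketch of keeping the two parameters in two separate cOutVector objects, so both vectors share the same simulation-time axis in the result file (the module name and the computeParameter*() helpers are placeholders, not part of the question):

#include <omnetpp.h>

using namespace omnetpp;

// Hypothetical module: records both parameters each time an event is handled.
class ParamRecorder : public cSimpleModule
{
  private:
    cOutVector param1Vector;
    cOutVector param2Vector;

  protected:
    virtual void initialize() override {
        param1Vector.setName("parameter1");
        param2Vector.setName("parameter2");
    }

    virtual void handleMessage(cMessage *msg) override {
        double p1 = computeParameter1();   // placeholder for your own logic
        double p2 = computeParameter2();   // placeholder for your own logic
        param1Vector.record(p1);           // recorded at the current simTime()
        param2Vector.record(p2);
        delete msg;
    }

    double computeParameter1() { return 0.0; }   // stub
    double computeParameter2() { return 0.0; }   // stub
};

Define_Module(ParamRecorder);

With both vectors in the same result file, the time plot and the parameter-1-vs-parameter-2 plot are then a matter of the analysis tool (R, as suggested above, or the OMNeT++ IDE's analysis editor).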

Related

Is this the correct way to compute distance between two nodes in Omnet++/veins?

I am trying to compute the distance between two mobile nodes in Veins, and I am using the following method:
WaveShortMessage * pos = new WaveShortMessage();
Coord senderPosition = pos->getSenderPos();
Coord receiverPosition = traci->getPositionAt(simTime());
double distance = senderPosition.distance(receiverPosition);
I want to know if this is the correct method for computing the distance between two nodes, and whether the returned distance is in meters or centimeters; the value it returns is very large, so I am assuming it is in centimeters. Thanks in advance.
Yes, the Coord::distance method is the right one to call for calculating how far away two points are.
Note, though, that you are not initializing the sender position before reading it: you call getSenderPos() on a newly constructed WaveShortMessage, so you are likely reading random, uninitialized values.
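For illustration, a sketch of the intended pattern, reading the sender position from a received message rather than a new one; the handler signature follows common Veins usage (an onWSM override in a BaseWaveApplLayer subclass) and may differ in your Veins version:

// Fragment inside a hypothetical Veins application module (class declaration omitted).
void MyApplLayer::onWSM(WaveShortMessage* wsm)
{
    // The sender filled in its own position before sending, so read it from
    // the received message instead of from a newly constructed one.
    Coord senderPosition = wsm->getSenderPos();
    Coord receiverPosition = mobility->getPositionAt(simTime());   // own position via TraCIMobility
    double distance = senderPosition.distance(receiverPosition);   // SUMO coordinates are in meters
    EV << "Distance to sender: " << distance << " m\n";
}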

Efficiently calculating a segmented regression on a large dataset

I currently have a large data set, for which I need to calculate a segmented regression (or fit a piecewise linear function in some similar way). However, I have both a very large data set and a very large number of pieces.
Currently I have the following approach:
Let s_i be the end of segment i.
Let (x_i, y_i) denote the i-th data point.
Assume the data point x_k lies within segment j; then I can create a vector from x_k as
(s_1, s_2 - s_1, s_3 - s_2, ..., x_k - s_{j-1}, 0, 0, ...)
To do a segmented regression on the data point, I can do a normal linear regression on each of these vectors.
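A minimal sketch of this feature-vector construction (assuming the first segment starts at 0 and the boundaries are sorted in ascending order; names are placeholders):

#include <vector>
#include <cstddef>
#include <algorithm>

// Build the regression feature vector for one data point x, given the segment
// end positions s[0..m-1]: full widths for completed segments, the partial
// width for the segment containing x, and zeros afterwards.
std::vector<double> segmentFeatures(double x, const std::vector<double>& s)
{
    std::vector<double> features(s.size(), 0.0);
    double prevEnd = 0.0;                              // assumes the first segment starts at 0
    for (std::size_t i = 0; i < s.size(); ++i) {
        if (x >= s[i]) {
            features[i] = s[i] - prevEnd;              // point lies beyond this segment: full width
        } else {
            features[i] = std::max(0.0, x - prevEnd);  // partial width; remaining entries stay 0
            break;
        }
        prevEnd = s[i];
    }
    return features;
}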
However, my current estimates show that if I define the problem that way, I will get about 600,000 vectors with about 2,000 components each. I haven't benchmarked it yet, but I don't think my computer will be able to solve such a large regression problem in any acceptable time.
Is there a better way to calculate this kind of regression problem? One idea was to maybe use some kind of hierarchical approach, i.e. calculate one regression problem by combining multiple segments, so that I can determine start and endpoints for this set. Then calculate an individual segmented regression for this set of segments. However, I cannot figure out how to calculate the regression for this set of segments, so that the endpoints match (I can only match start or endpoint by fixing the intercept but not both).
Another idea I had was to calculate an individual regression for each of the segments and then only use the slope for that segment. However, with that approach, errors might start to accumulate, and I have no way to control for this kind of error accumulation.
Yet another idea is to do an individual regression for each segment but fix the intercept to the endpoint of the previous segment. However, I am still not sure whether I would get some kind of error accumulation this way.
Clarification
Not sure if this was clear from the rest of the question: I know where the segments start and end. The most important part is that each line segment has to intersect the next segment at the segment boundary.
EDIT
Maybe another fact that could help. All points have different x values.
I would group the points into rectangular grid areas based on their position, so that you process this task on smaller datasets and then merge the results together when all are done.
I would process each group like this:
1. Compute a histogram of angles.
2. Take only the most frequently occurring angles; their count determines the number of line segments present in the group.
3. Do the regression/line fit for these angles. See this Answer; it does something very similar (just for a single line).
4. Compute the intersection points between line segments to get the endpoints of your piecewise polyline, and also the connectivity info (join the closest endpoints).
[edit1] after OP edit
You know the edge x coordinates of all segments (x0, x1, ...), so just compute the average y coordinate of the points near each segment edge (gray area, green points) and you have the segment line endpoints (blue points). Of course this is no fit or regression, because it discards all the other points, so it leads to bigger errors (unless the segment x coordinates correspond to the regressed lines ...), but there is no way around it with the constraints you have on the solution (at least I do not see any).
Because if you use regression on each segment's data, then you cannot connect it to the other segments, and if you try to merge them, you get almost the same result as this:
The size of the gray area determines the output ... so play with it a bit ...
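A rough sketch of this boundary-averaging idea (names are placeholders; halfWidth is the size of the gray area mentioned above):

#include <vector>
#include <cstddef>

struct Point { double x, y; };

// Estimate the polyline endpoint at each segment boundary by averaging the y
// values of the points whose x lies within +/- halfWidth of the boundary
// ("gray area"). boundaries must be sorted; halfWidth is the tuning parameter.
std::vector<Point> boundaryEndpoints(const std::vector<Point>& data,
                                     const std::vector<double>& boundaries,
                                     double halfWidth)
{
    std::vector<Point> endpoints;
    for (double b : boundaries) {
        double sumY = 0.0;
        std::size_t count = 0;
        for (const Point& p : data) {
            if (p.x >= b - halfWidth && p.x <= b + halfWidth) {
                sumY += p.y;
                ++count;
            }
        }
        if (count > 0)
            endpoints.push_back({b, sumY / count});   // averaged endpoint at this boundary
        // If no points fall in the window, the boundary is skipped here; a real
        // implementation would widen the window or interpolate instead.
    }
    return endpoints;
}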

Comparing position of two sets of points on 2D image

I've got a question about algorithms to check whether two sets of points are in a similar place on an image.
They don't form regular shapes like circles, rectangles etc., but are more like irregular clouds.
For example:
The first cloud of points is the learning set for the desired area on the image, and we are checking whether the second cloud is in a similar position.
I was thinking of drawing simple shapes around the points (like rectangles that enclose all the points) and checking whether one lies inside the other, or comparing the distance between the centers of the figures, but this method doesn't seem very accurate.
Are there better algorithms to solve this problem?
Image Moments
Don't worry about the fancy name, it's just a standard method in image processing to do exactly what you require.
The image moment of order n w.r.t. x and m w.r.t. y is the integration of (pixel value * xPosition^n * yPosition^m) over the entire image.
So the (0, 0)-th order moment, i.e. moment(0, 0), is the area of the cloud.
Similarly, moment(1, 0)/moment(0, 0) is the X coordinate of the centroid of the cloud.
And moment(0, 1)/moment(0, 0) is the Y coordinate of the centroid of the cloud.
Higher order moments give additional features/information peculiar to shape of the clouds.
Now you can easily compare the arbitrary shapes.
These functions are available in OpenCV and MATLAB.
Hope this helps.
Good luck.
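For a plain point cloud (each point treated as unit mass), the raw moments and the centroid reduce to simple sums; a minimal sketch, with names chosen here for illustration:

#include <vector>
#include <cmath>

struct Pt { double x, y; };

// Raw moment of order (n, m) of a point cloud, treating each point as a
// unit-mass "pixel": sum over points of x^n * y^m.
double rawMoment(const std::vector<Pt>& cloud, int n, int m)
{
    double moment = 0.0;
    for (const Pt& p : cloud)
        moment += std::pow(p.x, n) * std::pow(p.y, m);
    return moment;
}

// moment(0, 0) is the number of points ("area");
// moment(1, 0)/moment(0, 0) and moment(0, 1)/moment(0, 0) give the centroid.
// Assumes a non-empty cloud (m00 > 0).
Pt centroid(const std::vector<Pt>& cloud)
{
    double m00 = rawMoment(cloud, 0, 0);
    return { rawMoment(cloud, 1, 0) / m00, rawMoment(cloud, 0, 1) / m00 };
}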
The sets will have quite similar shapes (it will be a set of points of a human skeleton from a Kinect sensor), and I want to check whether the person is sitting in the same place as was learned in the first place.
Then you will probably be able to derive a correspondence between the two sets of points (i.e. you will know that a given point is SHOULDER_RIGHT or ELBOW_LEFT or ...). If that is the case, you can simply calculate SUM(SQRT((Xi1-Xi2)^2 + (Yi1-Yi2)^2)) over each i-th pair of points (Xi1, Yi1) and (Xi2, Yi2) (the same goes if you can obtain the third dimension Z).
The value thus obtained will have a minimum of zero when the two sets of points are perfectly coinciding.
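A small sketch of that sum, assuming both sets store corresponding joints at the same indices (names are placeholders):

#include <vector>
#include <cmath>
#include <cstddef>

struct Joint { double x, y; };

// Total displacement between two skeletons whose joints are stored in the
// same order (the i-th entry of each vector is the same body part).
double totalDisplacement(const std::vector<Joint>& a, const std::vector<Joint>& b)
{
    double sum = 0.0;
    for (std::size_t i = 0; i < a.size() && i < b.size(); ++i) {
        double dx = a[i].x - b[i].x;
        double dy = a[i].y - b[i].y;
        sum += std::sqrt(dx * dx + dy * dy);   // per-joint Euclidean distance
    }
    return sum;   // zero when the two poses coincide exactly
}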

Calculate Mapping of Nearest Points of 2 matrices

I have two matrices A and B. Each of them has 2 columns holding the coordinates of a point (x, y).
I need to compute a mapping of points from A to B such that the mapped points have the least Euclidean distance between them.
Essentially I am trying to emulate what SIFT does on images, but I will not carry out the steps that SIFT does for matching the points...
Thus, for all points in A, I compute the Euclidean distance to all points in B, take the two points with the least distance as a mapping, and remove them. Then I continue doing this until A and B are both empty.
Could someone tell me what could be the most efficient way of doing this ?
EDIT
Can somebody help me ... The issue I am facing is that I need to compute all-vs-all distances before selecting the minimum of them as the first mapping. Then I need to do this all over again, making the computation really long...
Is there any way this can be done efficiently in MATLAB?
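For reference, the greedy procedure described above looks roughly like this (sketched in C++ for illustration rather than MATLAB; it is the expensive all-vs-all baseline, not an optimized solution, and the names are placeholders):

#include <vector>
#include <cmath>
#include <limits>
#include <utility>
#include <cstddef>

struct P2 { double x, y; };

// Repeatedly pair and remove the globally closest (a, b) pair until one set is empty.
std::vector<std::pair<P2, P2>> greedyMatch(std::vector<P2> A, std::vector<P2> B)
{
    std::vector<std::pair<P2, P2>> mapping;
    while (!A.empty() && !B.empty()) {
        double best = std::numeric_limits<double>::max();
        std::size_t bi = 0, bj = 0;
        for (std::size_t i = 0; i < A.size(); ++i) {
            for (std::size_t j = 0; j < B.size(); ++j) {
                double dx = A[i].x - B[j].x, dy = A[i].y - B[j].y;
                double d = dx * dx + dy * dy;         // squared distance is enough for comparison
                if (d < best) { best = d; bi = i; bj = j; }
            }
        }
        mapping.emplace_back(A[bi], B[bj]);
        A.erase(A.begin() + bi);                      // remove the matched pair and repeat
        B.erase(B.begin() + bj);
    }
    return mapping;
}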
Are you referring to the Procrustes distance between the two different configurations of points? If so, MATLAB has a built-in function that computes the smallest-norm transformation that brings the points into alignment (this is the Procrustes distance).
See this documentation for how to use it. If you don't have the Statistics Toolbox, then you should check the MATLAB Central File Exchange first to see if anyone has written a non-toolbox version of the procrustes() function before seeking to write your own.

resampling a series of points

I have an array of points in 3D (imagine the trajectory of a ball) with X samples.
Now I want to resample these points so that I have a new array of positions with Y samples.
Y can be bigger or smaller than X, but not smaller than 1; there will always be at least 1 sample.
What would an algorithm look like to resample the original array into a new one? Thanks!
The basic idea is to take your X points and plot them on a graph. Then interpolate between them using some reasonable interpolation function. You could use linear interpolation, quadratic B-splines, etc. Generally, unless you have a specific reason to believe the points represent a higher-order function (e.g. a fourth-order curve), you want to stick to a relatively low-order interpolation function.
Once you've done that, you have (essentially) a continuous line on your graph. To get your Y points, you just select Y points equally spaced along the graph's X axis.
You have to select some kind of interpolation/approximation function based on the original X samples (e.g. some kind of spline). Then you can evaluate this function at Y (equally spaced, if desired) points to get your new samples.
For the math, you can use the Wikipedia article about spline interpolation as starting point.
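A minimal sketch of the linear-interpolation variant (the simplest of the interpolation functions mentioned above), mapping X input samples to Y output samples; names are placeholders:

#include <vector>
#include <cmath>
#include <cstddef>

struct Vec3 { double x, y, z; };

// Resample a trajectory of input.size() samples to `count` samples (count >= 1)
// using piecewise linear interpolation along the sample index.
std::vector<Vec3> resampleLinear(const std::vector<Vec3>& input, std::size_t count)
{
    std::vector<Vec3> output;
    if (input.empty() || count == 0)
        return output;
    if (count == 1) {
        output.push_back(input.front());       // design choice: a single output sample is the first point
        return output;
    }
    if (input.size() == 1) {
        output.assign(count, input.front());   // only one input sample: repeat it
        return output;
    }
    for (std::size_t i = 0; i < count; ++i) {
        // Map output index i to a fractional position in the input array.
        double t = static_cast<double>(i) * (input.size() - 1) / (count - 1);
        std::size_t k = static_cast<std::size_t>(std::floor(t));
        if (k >= input.size() - 1) k = input.size() - 2;   // clamp at the last interval
        double f = t - k;                                  // fractional part within [k, k+1]
        const Vec3& a = input[k];
        const Vec3& b = input[k + 1];
        output.push_back({ a.x + (b.x - a.x) * f,
                           a.y + (b.y - a.y) * f,
                           a.z + (b.z - a.z) * f });
    }
    return output;
}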
