I have an array of 20,000 GPS locations.
They represent points on a forest path that need to be checked. I need to figure out how many km of forest path needs checking.
Group the points into routes.
Measure the shortest path of each route.
Which algorithms should I consider, and in which order?
Should I find the shortest path first and break it up into routes, or find the routes first and then the shortest path of each?
This solution assumes that you only have the points and do not know which forest path the points lie on, in which order, and so on.
I would try it this way:
1. Connect each node to every other node with a link, and as link weight use the distance (or better, the number of seconds needed to cover that distance at 2 km/h: a low speed, assuming walking through the wood is slower than walking on an existing forest road).
2. If the forest has difficulties (mountains, valleys, rivers):
2a. Ascent/descent: raise the link weight using the altitude difference; check outdoor route-planning resources for how many metres of ascent add how much travelling time (300 m of ascent costing one additional hour is a rough estimate).
2b. Valleys, rivers or other barriers: either raise the weight again, or remove the link entirely if one cannot go directly from one point to the other (e.g. draw the valley as a polygon and remove all links that cross the polygon).
Are there already paths/forest roads in the wood?
If yes, draw them as links in the model (graph) and give them a lower link weight, e.g. based on a 5 km/h walking speed.
Now you have a graph whose nodes and links carry (hopefully realistic) link weights related to the travelling time between nodes.
Now use shortest path (Dijkstra's algorithm) and a travelling salesman algorithm.
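A minimal sketch of step 1 plus the Dijkstra/TSP step, assuming the points are (lat, lon) tuples and that networkx is available. It is shown on a tiny sample, because a complete graph over 20,000 points (about 200 million edges) would need pruning in practice, e.g. to the k nearest neighbours of each point; terrain penalties (step 2) would be added on top of the weights.

```python
import math
import networkx as nx
from networkx.algorithms.approximation import traveling_salesman_problem

def haversine_m(p, q):
    """Great-circle distance between two (lat, lon) points in metres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(a))

def build_graph(points, speed_kmh=2.0):
    """Complete graph over the points; edge weight = walking time in seconds."""
    g = nx.Graph()
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            seconds = haversine_m(points[i], points[j]) / (speed_kmh / 3.6)
            g.add_edge(i, j, weight=seconds)
    return g

# Tiny example: walking time between two check points, then an approximate tour.
points = [(48.137, 11.575), (48.139, 11.580), (48.141, 11.571)]
g = build_graph(points)
print(nx.dijkstra_path_length(g, 0, 2, weight="weight"))
print(traveling_salesman_problem(g, weight="weight"))
```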
If all that is too much work (it could take some months, even for somebody with a degree in computer science), plan it manually: draw a 1000 × 1000 m grid and let human intelligence do its job.
Since checking 20,000 points on foot is a big effort, it is also worth evaluating automatic planning against human experience. Try both variants and see which is more efficient.
(I think that people with outdoor experience, given a good map with contour lines and the check points on it, will do a better job, assuming the work is pre-organized by assigning points to quadrants and quadrants to people.)
My other solution:
This one assumes you have more information than you have posted so far:
You probably have more info than just the coordinates of the points.
Who created these points? In your graphic, they look as if they lie on a path.
Were they recorded while driving along that path with a vehicle? Then you have a timestamp, and therefore an order of points in sequence, and therefore the points are already related to a path.
So the first step would be to assign the points to a path.
(You could also draw all known forest paths as vectors on a digital map and match the points to the nearest path automatically.)
You need the paths whenever you cannot reach each node directly in a straight line (e.g. when driving by vehicle, or when walking in the wood and a river blocks the direct straight-line route).
Then, once you have a graph with nodes on links, use a minimum spanning tree to calculate the sum of the path lengths in kilometres.
To visit the points you will often have to return to a branch, so a travelling salesman algorithm will help give the kilometres needed to visit all nodes.
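A minimal sketch of the MST and tour-length steps, assuming the path network has already been loaded into a networkx graph whose edge weights are metres:

```python
import networkx as nx
from networkx.algorithms.approximation import traveling_salesman_problem

def network_length_km(g):
    """Total length of path network to check: weight of its minimum spanning tree."""
    mst = nx.minimum_spanning_tree(g, weight="weight")
    return mst.size(weight="weight") / 1000.0

def tour_length_km(g):
    """Kilometres actually walked/driven to visit every node (TSP heuristic)."""
    tour = traveling_salesman_problem(g, weight="weight", cycle=True)
    metres = sum(nx.dijkstra_path_length(g, a, b, weight="weight")
                 for a, b in zip(tour, tour[1:]))
    return metres / 1000.0
```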
The question seems similar to a constrained vehicle routing problem. You can try a heuristic, for example the savings algorithm: http://neo.lcc.uma.es/vrp/solution-methods/heuristics/savings-algorithms/.
I asked this question three days ago and I got burned by contributors because I didn't include enough information. I am sorry about that.
I have a 2D matrix where each array position holds the depth of water in a channel. I was hoping to apply Dijkstra's or a similar "least cost path" algorithm to find out the least amount of concrete needed to build a bridge across the water.
It took some time to format the data into a clean version, so I've learned some rudimentary MATLAB skills doing that. I have removed most of the land so that the shoreline is now standardised to a certain value. My plan is to loop through each "pixel" on the "west" shore, run a least-cost algorithm from it to the closest "east" shore, and move through the entire mesh, ultimately finding the least-cost crossing.
My problem is fitting the data to any of the algorithms. Unfortunately I get overwhelmed by the options and different formats, because the other examples are for other use cases.
My other consideration is that the calculated least-cost path will be a jagged line, which would not be suitable for a bridge, so I need to constrain the bend radius of the path if at all possible, and I don't know how to go about doing that.
A picture of the channel:
Any advice on an approach would be great. I just need to know whether someone has a method that should work; then I will spend the time learning how to fit the data.
You can apply Dijkstra to your problem in this way:
the two "dry" regions you want to connect correspond to matrix entries with value 0; the other cells have a positive value designating the depth (or the cost of filling this place with concrete)
your edges are the connections of neighbouring cells in your matrix. (It can be a 4- or 8-neighbourhood.) The weight of the edge is the arithmetic mean of the values of the connected cells.
then you apply the Dijkstra algorithm with a starting point in one "dry" region and an end point in the other "dry" region.
The cheapest path will connect two cells of value 0, and its weight will correspond to the sum of the costs of the cells visited. (One half of each cell's weight comes from the edge entering the cell, the other half from the edge leaving it.)
This way you will get a possibly rather crooked path leading over the water, which may be a helpful hint for where to build a cheap bridge.
You can speed up the calculation by using the A* algorithm. For that you need, for each cell, a lower bound on the remaining cost of reaching the other side. Such a lower bound can be calculated by examining the "concentric rings" around a point, for as long as the rings do not contain a 0-cell of the other side. The sum of the minimal cell values over those rings is then a lower bound on the remaining cost.
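A minimal sketch of the grid-as-graph idea described above, assuming NumPy is available and the channel fits in a small array for the example. 0 marks dry land, positive values are depth (cost), and a 4-neighbourhood is used; the toy shores and values are illustrative only.

```python
import heapq
import numpy as np

def cheapest_crossing(depth, sources, targets):
    """Dijkstra from any source cell to the nearest target cell."""
    rows, cols = depth.shape
    dist = np.full(depth.shape, np.inf)
    heap = [(0.0, s) for s in sources]
    for _, s in heap:
        dist[s] = 0.0
    heapq.heapify(heap)
    targets = set(targets)
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) in targets:
            return d                      # cost of the cheapest crossing found
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # Edge weight = arithmetic mean of the two connected cells.
                nd = d + (depth[r, c] + depth[nr, nc]) / 2.0
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return None

# Toy example: west shore is column 0, east shore is the last column.
depth = np.array([[0, 4, 9, 2, 0],
                  [0, 3, 8, 3, 0],
                  [0, 5, 1, 1, 0]], dtype=float)
west = [(r, 0) for r in range(depth.shape[0])]
east = [(r, depth.shape[1] - 1) for r in range(depth.shape[0])]
print(cheapest_crossing(depth, west, east))
```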
An alternative approach, which emphasizes the constraint that you require a non-jagged shape for your bridge, would be to use Monte Carlo, simulated annealing or a genetic algorithm, where the initial "bridge" consists of a simple spline curve between two randomly chosen end points (one on each side of the chasm), plus a small number of randomly chosen intermediate points in the chasm. You would end up with a physically realistic bridge and a reasonably optimized cost of concrete.
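A rough, much-simplified sketch of that idea: a polyline with fixed end points rather than a spline with random end points, a cost that samples the depth along the line and penalizes sharp bends, and a simulated-annealing loop that perturbs the intermediate points. All parameters here are illustrative.

```python
import math
import random
import numpy as np

def path_cost(depth, pts, bend_penalty=5.0, samples=20):
    cost = 0.0
    # Depth sampled along every segment of the polyline.
    for (r0, c0), (r1, c1) in zip(pts, pts[1:]):
        for t in np.linspace(0.0, 1.0, samples):
            cost += depth[int(round(r0 + t * (r1 - r0))),
                          int(round(c0 + t * (c1 - c0)))] / samples
    # Penalise direction changes so the result stays bridge-like.
    for a, b, c in zip(pts, pts[1:], pts[2:]):
        v1 = (b[0] - a[0], b[1] - a[1])
        v2 = (c[0] - b[0], c[1] - b[1])
        angle = math.atan2(v1[0] * v2[1] - v1[1] * v2[0],
                           v1[0] * v2[0] + v1[1] * v2[1])
        cost += bend_penalty * abs(angle)
    return cost

def anneal(depth, start, end, n_mid=3, steps=5000, temp0=10.0):
    rows, cols = depth.shape
    pts = [start] + [(random.randrange(rows), random.randrange(cols))
                     for _ in range(n_mid)] + [end]
    cur = path_cost(depth, pts)
    best, best_pts = cur, list(pts)
    for step in range(steps):
        temp = temp0 * (1.0 - step / steps) + 1e-9
        i = random.randrange(1, n_mid + 1)            # never move the end points
        old = pts[i]
        pts[i] = (min(rows - 1, max(0, old[0] + random.randint(-1, 1))),
                  min(cols - 1, max(0, old[1] + random.randint(-1, 1))))
        new = path_cost(depth, pts)
        if new < cur or random.random() < math.exp((cur - new) / temp):
            cur = new
            if cur < best:
                best, best_pts = cur, list(pts)
        else:
            pts[i] = old                               # reject the move
    return best_pts, best
```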
I have a quadcopter with some sensors, and I want to measure values at a set of points on the map (a 2D problem).
Every measurement takes 30 seconds, and I assume the copter has a constant speed of 60 km/h.
It can fly for 20 minutes at a time and then needs to land and charge for an hour.
I would like to write an algorithm that automatically computes flight paths and minimizes the time to take all the samples.
I can represent the points as a complete graph (I assume I am flying high enough that there are no obstacles). The time to reach a point is then the cost on the edge, but I also have a cost for visiting each vertex and limited "fuel". It is some generalization of TSP or VRP, but I am not sure which one.
There are also path problems with gas stations (refuelling), but those usually find a path between two points.
Can you name an algorithm that could solve this, or suggest something similar? It is NP-hard, but there could be some nice approximate solutions.
The problem isn't easy to solve because there is also the fuel constraint and you need to find groups of points. You can use a combination of a brute-force algorithm and a heuristic. For example, a quadtree or a spatial index (Hilbert curve) can reduce the dimensionality and the search space. It looks similar to the capacitated vehicle routing problem.
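As an illustration of how the capacity constraint could be handled, here is a simple nearest-neighbour sketch (not the savings algorithm or a full CVRP solver): routes are built greedily from the charging point, and a new route starts whenever the 20-minute budget would be exceeded. All names and parameters are assumptions.

```python
import math

SPEED = 60_000 / 3600          # 60 km/h in m/s
MEASURE = 30.0                 # seconds per sample
BUDGET = 20 * 60               # seconds of flight per charge

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def plan_routes(base, points):
    remaining = list(points)
    routes = []
    while remaining:
        route, pos, used = [], base, 0.0
        while remaining:
            nxt = min(remaining, key=lambda p: dist(pos, p))
            extra = dist(pos, nxt) / SPEED + MEASURE
            if used + extra + dist(nxt, base) / SPEED > BUDGET:
                break                      # cannot reach it and still fly home
            route.append(nxt)
            remaining.remove(nxt)
            used += extra
            pos = nxt
        if not route:
            raise ValueError("a point is unreachable within one charge")
        routes.append(route)
    return routes

# Example: points on a 2 km square grid, base at the origin.
pts = [(x, y) for x in range(0, 2001, 500) for y in range(0, 2001, 500)]
print(len(plan_routes((0, 0), pts)), "flights needed")
```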
I have a floorplan with lots of d3.js polygon objects on it that represent booths. I am wondering what the best approach is to finding a path between two objects that doesn't overlap other objects. The use case is that we have booths and want to show the user the most efficient way to walk from point A to point B. We can assume the path must contain only 90- or 45-degree turns.
We took a shot at using Dijkstra, but the scale of it seems to be getting away from us.
The example snapshot of our system:
Our constraint is that this needs to run in the browser. It would be nice if it worked well with d3.js.
Since the layout is a matrix (or nested matrices), this is not a Dijkstra problem; it is simpler than that. The technical name for the problem is "Manhattan routing". Rather than give a code algorithm, I will show you an example of the optimum route (the blue line) in the following diagram. From this it should be obvious what the algorithm is:
Note that there is a subtle nuance here, and that is that you always want to maximize the number of jogs because even though the overall shape is a matrix, at each corner the person will actually walk diagonally (think of a person cutting diagonally across a four-way intersection). Therefore, simply going north, then west is wrong, because you would only get to cut one corner, but on the route shown you get to cut 5 corners.
This problem is known as finding the shortest path between two points among polygonal obstacles, and it has been studied extensively in the literature. See here for one example. All algorithms for this work by converting the problem to a graph problem and then running Dijkstra. To do this:
Each vertex of every polygon is a vertex in your graph.
The start point and end point are also vertices in the graph.
Between two vertices there is an edge if they are visible to each other; to determine visibility we can use triangulation algorithms.
The weight of each edge is the Euclidean distance between its two endpoints.
Now we are ready to run any shortest-path algorithm. The hard part is the triangulation; I think the Triangle library fits your requirements. An easier way is to search the web for the keywords in the first line to find an implementation. I didn't link to any particular implementation because I think describing the approach algorithmically is more useful to future readers.
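For readers who do want a concrete starting point, here is a small sketch of the visibility-graph idea. It uses brute-force segment/obstacle intersection tests (via shapely) instead of triangulation, which is simpler but slower, and assumes networkx for the shortest-path step; booths are assumed to be given as shapely Polygons.

```python
import networkx as nx
from shapely.geometry import LineString, Polygon

def visibility_route(start, goal, obstacles):
    shrunk = [o.buffer(-1e-9) for o in obstacles]      # allow grazing along edges
    nodes = [start, goal]
    for o in obstacles:
        nodes.extend(o.exterior.coords[:-1])           # polygon corners
    g = nx.Graph()
    for i, a in enumerate(nodes):
        for j in range(i + 1, len(nodes)):
            seg = LineString([a, nodes[j]])
            # Edge only if the segment does not cut through any obstacle interior.
            if not any(seg.intersects(s) for s in shrunk):
                g.add_edge(i, j, weight=seg.length)
    return [nodes[i] for i in nx.dijkstra_path(g, 0, 1, weight="weight")]

# Toy example: route around one square booth.
booth = Polygon([(2, 2), (4, 2), (4, 4), (2, 4)])
print(visibility_route((0, 0), (6, 6), [booth]))
```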
I'm facing a hard problem:
Imagine I have a map of an entire country, represented by a huge matrix of cells. Each cell represents one square metre of territory and holds a double value between 0 and 1 that represents the cost of traversing it.
The map obviously does not fit in memory.
I am trying to wrap my mind around a way to calculate the optimal path for a robot from a start point to an end position. The first idea I had was to use a TCP-like moving window, with a minimap of the real map around the moving robot, and to run the A* algorithm inside it, but I'm facing problems with maps with huge walls, bad pathfinding, and so on.
I have been searching the literature on A*-like algorithms, but I could not picture what a good solution to this problem would look like.
I'm wondering if someone has faced a similar problem or can help with an idea for a possible solution!
Thanks in advance :)
Since I do not know your exact data, here is some information that could be useful:
A partial path of a shortest path is itself a shortest path, i.e. you might split your matrix into submatrices and find (all) shortest paths within them. Note that you do not have to store all the results: you can save memory by not saving a complete path but just the information "a path goes from A to B". The intermediate nodes can be recomputed later or stored in a file. You might even be able to precompute some shortest paths for certain areas.
Another approach is to compress your matrix in some way: if you have large areas consisting of one and the same number, it might be enough to store just that number and the dimensions of the area.
Another approach (in connection with precomputing shortest paths) is to generate different levels of detail for your map. For a map of the USA this might look as follows: the coarsest level of detail contains just the cities New York, Los Angeles, Chicago, Dallas, Philadelphia, Houston and Phoenix. The finer the levels get, the more cities they contain, but on the other hand each one shows a smaller area of your whole map.
Does your problem have any special structure, e.g., does the triangle inequality hold/can you guarantee that the shortest path doesn't jog back and forth? Do you want to perform the query many times? (If so you can do pre-processing that will amortize over multiple queries.) Do you need the exact minimum solution, or will something within an epsilon factor be OK?
One thought is that you can coarsen the matrix: form 100 metre by 100 metre squares, and determine the shortest-path distances through each 100 × 100 square. This will now fit in memory (about 1 gigabyte); you can run Dijkstra on the coarse grid, and then expand each step through its 100 × 100 square.
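A small sketch of that coarsening step, assuming the cell costs are available as a NumPy array (or a chunked/memory-mapped equivalent) whose dimensions are multiples of the block size:

```python
import numpy as np

def coarsen(cost, block=100):
    """Average block x block cells of the 1 m grid into one coarse cell."""
    rows, cols = cost.shape
    return (cost.reshape(rows // block, block, cols // block, block)
                .mean(axis=(1, 3)))

# Dijkstra/A* is then run on the coarse grid; afterwards the path is refined at
# full resolution, but only inside the corridor of coarse cells it passes through.
coarse = coarsen(np.random.rand(1000, 1000))
print(coarse.shape)   # (10, 10) coarse cells, each covering 100 x 100 metres
```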
Also, have you tried running a forward-backward version of Dijkstra's algorithm? I.e., expand from the source and search for the sink at the same time, and stop when there is an intersection.
Incidentally, why do you need such a fine level of granularity?
Here are some ideas that may work:
You can model your path as a piecewise linear curve. If the end points are fixed and you use 31 line segments, the curve is fully described by 60 numbers (the coordinates of the 30 interior points). Each possible curve has a cost, so the cost is a function of the following form:
cost(x1, x2, x3, ..., x60)
Now your problem is to find the global optimum of a function of 60 variables. You can use standard methods to do this. One idea is to use genetic algorithms. Another is to use a Monte Carlo method such as parallel tempering:
http://en.wikipedia.org/wiki/Parallel_tempering
Whenever you have a promising path, you can use it as a starting point for finding a local minimum of the cost function. Maybe you can use some interpolation to make your cost function differentiable; then you can use Newton's method (or rather BFGS) to find local minima of the cost function.
http://en.wikipedia.org/wiki/Local_minimum
http://en.wikipedia.org/wiki/BFGS
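A minimal sketch of the "path as a vector of variables" idea with SciPy's BFGS, assuming the cost map can be sampled with linear interpolation so the objective is smooth enough for a quasi-Newton method; all names and sizes here are illustrative.

```python
import numpy as np
from scipy.ndimage import map_coordinates
from scipy.optimize import minimize

terrain = np.random.rand(500, 500)           # stand-in for the real cost map
start, goal = np.array([10.0, 10.0]), np.array([480.0, 470.0])
n_free = 30                                   # 30 free points -> 60 variables

def cost(x):
    pts = np.vstack([start, x.reshape(n_free, 2), goal])
    # Sample the interpolated terrain densely along every segment.
    samples = np.concatenate(
        [np.linspace(a, b, 20) for a, b in zip(pts[:-1], pts[1:])])
    values = map_coordinates(terrain, samples.T, order=1, mode="nearest")
    length = np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()
    return values.mean() * length             # cheap cells and short paths win

x0 = np.linspace(start, goal, n_free + 2)[1:-1].ravel()   # straight-line start
result = minimize(cost, x0, method="BFGS")
print(result.fun)
```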
Your problem is somewhat similar to the problem of finding reaction paths in chemical systems. Maybe you can find some inspiration in the book "Energy Landscapes" by David Wales.
But I also have some questions:
Is it necessary for you to find the optimal path, or are you just looking for a path that is OK?
How much computer power and time do you have at hand?
Can the robot make sharp turns, or do you need extra physics modelling to improve the cost function?
In my app, the GPS picks up the location of the vehicle. The app is then supposed to put markers at all points where the vehicle could be if it drives 1 km in any direction (note that the roads may fork many times within its 1 km reach).
Can someone suggest how to do this? Thanks in advance.
This is a very tricky problem to solve with the Google Maps API. The following is one method that you may want to consider:
You can easily calculate a bounding circle of 1 km around your GPS point, and it is also easy to calculate points that fall on the circumference of this circle for any angle. This distance will be "as the crow flies" rather than the actual road distance, but you may want to check out the following Stack Overflow post for a concrete implementation:
How to calculate the latlng of a point a certain distance away from another?
Screenshot with markers at 20 degree intervals on a bounding circle with a 1km radius:
There is also a trick to snap these points to the nearest street. You can check out Mike Williams' Snap point to street examples for a good implementation of this.
Calculating the road distance from your GPS point to each snapped road point could be done with the directions service of the Google Maps API. Note that this will only work in countries that support directions in Google Maps, but more importantly, the road distance will almost always be greater than 1km, because our bounding circle has a 1km radius "as the crow flies". However if you can work with approximate information, this may already be one possible solution.
You can also consider starting with the above solution (1 km bounding circle, calculate x points on the circumference, and snap them to the closest road), then calculating the road distance of each path (from your GPS point to each snapped point), and then repeating this recursively for each path, each time using a smaller bounding circle, until you reach a road distance close to 1 km. You can decrease the bounding circle in each recursion in proportion to the error margin, to make your algorithm more efficient.
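For the first step (points on the circumference of the 1 km circle), here is a small sketch using the standard great-circle destination-point formula; the snapping and directions-service calls described above would follow and are not shown.

```python
import math

def destination(lat, lng, bearing_deg, distance_m, radius=6371000.0):
    """Point reached from (lat, lng) after distance_m metres on bearing_deg."""
    d, brg = distance_m / radius, math.radians(bearing_deg)
    lat1, lng1 = math.radians(lat), math.radians(lng)
    lat2 = math.asin(math.sin(lat1) * math.cos(d)
                     + math.cos(lat1) * math.sin(d) * math.cos(brg))
    lng2 = lng1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lng2)

# Markers every 20 degrees on a 1 km circle around the vehicle.
vehicle = (51.5074, -0.1278)
circle = [destination(*vehicle, bearing, 1000) for bearing in range(0, 360, 20)]
```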
UPDATE:
I found a very neat implementation which appears to be using a similar method to the one I described above:
Driving Radius (Multiple destinations)
Note how you can change the interval of degrees from the top. With a wide interval you'll get fast results, but you could easily miss a few routes.
A natural brute-force algorithm is to build a list of all possible nodes, taking into account each possible decision at every crossroad.
I doubt that within 1 km you would get more than 10 crossroads on average, and assuming an average of 3 choices per crossroad you would end up with 3^10, around 59,049, end nodes (note that you would need 10 crossroads on every branch of the road to reach the full number).
In reality the number would be lower, and I would assume that reaching the same node by different routes would not be uncommon, especially in cities.
This approach gives you an exact answer (provided you have a good street map as input). It is exponential in the number of crossroads, but that number does not seem to be very high here, so it might be practical.
Further improvements and optimizations might be possible depending on what you need these nodes for (or which kinds of scenarios you would consider similar enough to prune).
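A small sketch of this expansion, assuming the street map is already available as an adjacency list with edge lengths in metres. It prunes repeat visits by keeping only the shortest known distance to each node, and records where on an edge the 1 km budget runs out (those tips are where the markers would go); the toy network is illustrative only.

```python
import heapq

def reachable_within(graph, start, budget_m=1000.0):
    best = {start: 0.0}                # shortest known distance to each node
    heap = [(0.0, start)]
    frontier = []                      # (node, neighbour, fraction of edge covered)
    while heap:
        d, node = heapq.heappop(heap)
        if d > best.get(node, float("inf")):
            continue
        for nbr, length in graph[node]:
            nd = d + length
            if nd <= budget_m:
                if nd < best.get(nbr, float("inf")):
                    best[nbr] = nd
                    heapq.heappush(heap, (nd, nbr))
            else:
                # Budget runs out partway along this edge; in practice you would
                # filter tips that point back toward already-reached nodes.
                frontier.append((node, nbr, (budget_m - d) / length))
    return best, frontier

# Toy road network: a fork 600 m from the start.
roads = {"A": [("B", 600)], "B": [("A", 600), ("C", 700), ("D", 300)],
         "C": [("B", 700)], "D": [("B", 300)]}
print(reachable_within(roads, "A"))
```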
Elaborating a bit on Daniel's approach above: first find all the points within a straight-line radius of your origin. That's your starting set of nodes. Now include ALL edges incident to those nodes and to other nodes in your starting set. Check that the nodes are connected and that there aren't any nodes floating around that you can't reach. Then create a "shortest path tree" starting from your vehicle node.
The tree gives you the shortest paths from your starting node to all other nodes. Note that if you start by creating paths to the furthest nodes, any sub-paths are also shortest paths to the nodes along the way. Make sure to label the nodes on those sub-paths as you go so you don't need to compute them again. In the worst case you need to develop a shortest path for every node, but in practice this should take much less time.
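If the road graph is available as a networkx graph, the truncated shortest path tree described above is essentially a one-liner; this is a sketch on a toy graph, and "length" is an assumed edge attribute name.

```python
import networkx as nx

g = nx.Graph()
g.add_weighted_edges_from([("A", "B", 600), ("B", "C", 700), ("B", "D", 300)],
                          weight="length")
# Shortest path tree from the vehicle node, truncated at the 1 km budget.
within_1km = nx.single_source_dijkstra_path_length(
    g, "A", cutoff=1000, weight="length")
print(within_1km)    # {'A': 0, 'B': 600, 'D': 900} -- C is beyond the budget
```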
List all possible nodes, taking into account each possible decision at every crossroad.
(But how to do that automatically?)
Use Dijkstra's algorithm to find the closest route to all points.
Visualize the data.
(That is a little bit tricky, because there can be unreachable areas inside the reachable area.)