I have been given a strictly convex polygon of S sides and Q queries to process.
All polygon vertices and query points are given as (x, y) pairs. The vertices of the polygon are given in anti-clockwise order.
The constraints are 1 <= S <= 10^6, 1 <= Q <= 10^5, and 1 <= |x|, |y| <= 10^9.
For each query I should output Yes if the given point lies inside the polygon; otherwise, No.
I tried an O(S)-per-query inclusion test (ray casting); it timed out on the bigger test cases and also failed some of the preliminary ones.
Evidently my implementation didn't cover all the edge cases. I have since learned that there is an algorithm for this problem that answers each query in O(log S) using binary search, but I can't figure out how to implement it from the pseudocode (this is my first time doing computational geometry).
Could anyone describe an algorithm that covers all the edge cases within the required O(Q log S) time, or point me to a page or paper that implements it?
First, you can split your convex polygon into a left part and a right part, both starting at the topmost vertex and ending at the bottommost vertex. The vertices in both parts are already sorted by y-coordinate.
Assume the query point has coordinates (qx, qy). Now, using binary search, find the segment of the left part and the segment of the right part that intersect the line y = qy. If both segments exist and qx lies between the x-coordinates of the segments' intersections with the line y = qy, the point is inside the polygon.
The complexity of the query is O(log(S)).
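Here is a hedged C++ sketch of that approach, not a reference implementation. Boundary points are counted as inside, and the lexicographic (y, x) comparison is only there so that a horizontal top or bottom edge does not break the strict monotonicity of the two chains:

    #include <bits/stdc++.h>
    using namespace std;
    using ll = long long;

    struct Pt { ll x, y; };

    // 2D cross product of (b-a) and (c-a); > 0 means c lies to the left of a->b.
    ll cross(const Pt& a, const Pt& b, const Pt& c) {
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    }

    // Lexicographic (y, x) comparison: keeps both chains strictly monotone
    // even when the polygon has a horizontal top or bottom edge.
    bool keyLess(const Pt& a, const Pt& b) {
        return a.y != b.y ? a.y < b.y : a.x < b.x;
    }

    struct ConvexLocator {
        vector<Pt> left, right;   // left chain (top -> bottom), right chain (bottom -> top)

        explicit ConvexLocator(const vector<Pt>& poly) {   // vertices in CCW order
            int n = poly.size(), top = 0, bot = 0;
            for (int i = 1; i < n; ++i) {
                if (keyLess(poly[top], poly[i])) top = i;
                if (keyLess(poly[i], poly[bot])) bot = i;
            }
            for (int i = top; ; i = (i + 1) % n) {         // CCW walk: top -> bottom
                left.push_back(poly[i]);
                if (i == bot) break;
            }
            for (int i = bot; ; i = (i + 1) % n) {         // CCW walk: bottom -> top
                right.push_back(poly[i]);
                if (i == top) break;
            }
        }

        // true if q is inside the polygon or on its boundary
        bool contains(const Pt& q) const {
            if (keyLess(left.front(), q) || keyLess(q, left.back())) return false;
            // left chain: keys strictly decrease; find the edge whose keys bracket q's key
            int lo = 0, hi = (int)left.size() - 1;
            while (hi - lo > 1) {
                int mid = (lo + hi) / 2;
                if (keyLess(q, left[mid])) lo = mid; else hi = mid;
            }
            if (cross(left[lo], left[lo + 1], q) < 0) return false;   // q is outside the left chain
            // right chain: keys strictly increase
            lo = 0; hi = (int)right.size() - 1;
            while (hi - lo > 1) {
                int mid = (lo + hi) / 2;
                if (keyLess(right[mid], q)) lo = mid; else hi = mid;
            }
            return cross(right[lo], right[lo + 1], q) >= 0;           // q is also inside the right chain
        }
    };

Build the locator once in O(S), answer each query with contains() in O(log S), and print "Yes"/"No"; if the problem wants strictly interior points, tighten the two cross-product comparisons.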
You can do a scan line algorithm.
You need to sort the Q points by their x coordinate.
Then find the polygon vertex with the lowest x and imagine a vertical line moving along the x axis. You need to track the two edges of the polygon (the upper one and the lower one) that the line currently crosses.
Then advance through the polygon vertices and the set of Q query points in ascending x. For every query point you now only have to check whether it lies between the two edges you are tracking.
The complexity is O(Q log Q + S) if Q is not sorted and O(Q + S) if Q is already sorted.
There is no need to sort; a convex polygon is already sorted!
For a convex polygon, point location is quick and easy: split the polygon in two using a straight line between vertex 0 and vertex S/2. The signed area test tells you on which side of that line the test point lies, and hence which half to keep (each half is again a convex polygon).
Continue recursively until you are left with a triangle (S = 3), then compare the point against the supporting line of the remaining side.
That is O(log S) signed-area tests in total per query.
(The numbers show the order of the splits.)
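In code this halving is usually written as an iterative binary search over the fan of triangles spanned from vertex 0, with the sign of the cross product (twice the signed area) deciding the side at each step. A sketch, with boundary points counted as inside:

    #include <bits/stdc++.h>
    using namespace std;
    using ll = long long;

    struct Pt { ll x, y; };

    ll cross(const Pt& a, const Pt& b, const Pt& c) {   // (b-a) x (c-a)
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    }

    // poly is strictly convex and listed in counter-clockwise order
    bool insideConvex(const vector<Pt>& poly, const Pt& q) {
        int n = poly.size();
        // q must lie inside the angular wedge at poly[0] spanned by poly[1] and poly[n-1]
        if (cross(poly[0], poly[1], q) < 0) return false;
        if (cross(poly[0], poly[n - 1], q) > 0) return false;
        // binary search for the fan triangle (0, lo, lo+1) whose wedge contains q
        int lo = 1, hi = n - 1;
        while (hi - lo > 1) {
            int mid = (lo + hi) / 2;
            if (cross(poly[0], poly[mid], q) >= 0) lo = mid; else hi = mid;
        }
        // q lies between rays 0->lo and 0->lo+1; it is inside the polygon
        // iff it is on the interior side of the edge lo -> lo+1
        return cross(poly[lo], poly[lo + 1], q) >= 0;
    }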
I am trying to find a way to determine the rectilinear polygon from a set of integer points (indicated by the red dots in the pictures below). The image below shows what I would like to achieve:
(Image 1: the desired rectilinear boundary.)
I need only the minimal set of points that define the boundary of the rectilinear polygon. Most hull algorithms I can find do not respect the orthogonal nature of this problem; for example, the gift-wrapping algorithm produces the following result (which is not what I want):
(Image 2: the convex hull produced by gift wrapping.)
How can I get the set of points that defines the boundary shown in image 1?
Update:
Figure 1 is no longer referred to as convex.
Following the definition from Wikipedia, it is rather easy to create a fast algorithm.
1. Start constructing the upper hull from the leftmost point (the uppermost one if there are several). Add this point to a list.
2. Find the next point: among all the points with both coordinates strictly greater than those of the current point, choose the one with the minimal x coordinate. Add this point to your list and continue from it.
3. Keep adding points as in step 2 for as long as you can.
4. Repeat the same from the rightmost point (the uppermost one if there are several), but going to the left: each time choose the next point with greater y and smaller x, keeping the difference in x minimal.
5. Merge the two lists you got from steps 3 and 4; together they form the upper hull.
6. Do steps 1-5 analogously for the lower hull.
7. Merge the upper and lower hulls found at steps 5 and 6.
In order to find the next point quickly, just sort your points by x coordinate. For example, when building the very first right-up chain, you sort by x increasing. Then iterate over all points. For each point check if its y coordinate is greater than the current value. If yes, add the point to the list and make it current.
Overall complexity would be O(N log N) for sorting.
EDIT: The description above only shows how to trace the main vertices of the hull. If you want a full rectilinear polygon (with line segments between consecutive points), then you have to add an additional point to your chain each time you find the next point. For example, when building the right-up chain, if you reach a point (x2, y2) from the current point (x1, y1), you have to add (x2, y1) and then (x2, y2) to the current chain list (in this order).
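A sketch of the very first right-up chain described above, including the extra corner vertices from the EDIT; the other three chains can be built the same way with the sort order and/or comparison flipped. The tie-break on equal x is an assumption made to match "uppermost among such":

    #include <bits/stdc++.h>
    using namespace std;

    struct Pt { long long x, y; };

    // Right-up chain of the rectilinear hull: starting from the leftmost
    // (uppermost among such) point, keep every point whose y strictly exceeds
    // the y of the last kept point, inserting the axis-aligned corner between them.
    vector<Pt> rightUpChain(vector<Pt> pts) {
        sort(pts.begin(), pts.end(), [](const Pt& a, const Pt& b) {
            return a.x != b.x ? a.x < b.x : a.y > b.y;   // leftmost first; uppermost first on ties
        });
        vector<Pt> chain;
        for (const Pt& p : pts) {
            if (chain.empty()) { chain.push_back(p); continue; }
            Pt cur = chain.back();
            if (p.y > cur.y) {                    // p continues the staircase upward
                chain.push_back({p.x, cur.y});    // corner (x2, y1)
                chain.push_back(p);               // hull vertex (x2, y2)
            }
        }
        return chain;
    }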
I think what you want to compute is the Rectilinear Convex Hull (or Orthogonal Convex Hull) of the set of points. The rectilinear convex hull is an ortho-convex shape, that is, the intersection of the shape with any horizontal or vertical line results in an empty set, a point, or a line segment.
The vertices of the rectilinear convex hull are the set of maximal points under vector dominance. The rectilinear convex hull can then be computed in optimal O(n log n) time. A very simple algorithm is presented in Preparata's book on Computational Geometry (see section 4.1.3).
I don't know of any standard algorithm for this but it doesn't seem too complicated to define:
Assuming each point in the grid has at least 2 neighbors (otherwise there is no solution):
1. p = a point with only two neighbors
2. while p isn't null:
   2a. mark p as visited
   2b. next = the unmarked neighbor of p with the fewest neighbors
   2c. next.parent = p
   2d. p = next
   done
Suppose I have a box with a lot of points. I need to be able to calculate the minimum and maximum angles over all lines that go through all possible pairs of the points. I can do it in O(n^2) time by just pairing every point with every other point, but is there a faster algorithm?
Taking the dual-plane idea proposed by Evgeny Kluev, and my comment about finding the leftmost intersection point, I'll try to give an equivalent direct solution without any dual space.
The solution is simple: sort your points by (x, y) lexicographically, then draw a line through each two adjacent points in the sorted order. It can be proved that the minimal angle is achieved by one of these lines. In order to get the maximal angle, sort by (x, -y) lexicographically instead, and again check only adjacent pairs of points.
Let's prove the claim for the minimal angle. Consider the two points A and B that yield the minimal possible angle. Among such pairs, choose the one with the minimal difference of x-coordinates.
Suppose they have the same y. If there is no other point between them, then they are adjacent. If there are points between them, then clearly at least one of them is adjacent to A in our order, and all of them yield the same angle.
Now suppose there exists a point P with x-coordinate strictly between A and B, i.e. Ax < Px < Bx. If P lies on AB, then AP has the same angle but a smaller difference of x-coordinates, hence a contradiction. If P is not on AB, then either AP or PB gives a smaller angle, which is also a contradiction.
So A and B lie on two adjacent vertical lines, with no other points between these lines. If A and B are the only points on their vertical lines, then the pair AB is clearly adjacent in sorted order, QED. If there are many points on these lines, the minimal angle is obviously achieved by taking the highest point on the left vertical line (which must be A) and the lowest point on the right vertical line (which must be B). Since we sort points of equal x by y, these two points are also adjacent.
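A sketch of the whole procedure, under the assumption that "angle" here means the signed slope angle atan2(dy, dx) of the line through a pair of points, i.e. a value in (-90, +90] degrees, with vertical lines taken as +90 degrees (both functions assume at least two distinct points):

    #include <bits/stdc++.h>
    using namespace std;

    struct Pt { double x, y; };

    // signed slope angle of the (undirected) line through a and b
    double slopeAngle(const Pt& a, const Pt& b) {
        double dx = b.x - a.x, dy = b.y - a.y;
        if (dx < 0) { dx = -dx; dy = -dy; }          // orient the pair so dx >= 0
        if (dx == 0) return atan2(1.0, 0.0);         // vertical line: +90 degrees by convention
        return atan2(dy, dx);
    }

    // minimal angle: sort lexicographically by (x, y), scan adjacent pairs
    double minAngle(vector<Pt> p) {
        sort(p.begin(), p.end(), [](const Pt& a, const Pt& b) {
            return a.x != b.x ? a.x < b.x : a.y < b.y;
        });
        double best = numeric_limits<double>::infinity();
        for (size_t i = 1; i < p.size(); ++i)
            best = min(best, slopeAngle(p[i - 1], p[i]));
        return best;
    }

    // maximal angle: the same scan after sorting by (x, -y)
    double maxAngle(vector<Pt> p) {
        sort(p.begin(), p.end(), [](const Pt& a, const Pt& b) {
            return a.x != b.x ? a.x < b.x : a.y > b.y;
        });
        double best = -numeric_limits<double>::infinity();
        for (size_t i = 1; i < p.size(); ++i)
            best = max(best, slopeAngle(p[i - 1], p[i]));
        return best;
    }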
Sort the points (or use a hash map) to find out whether there are any horizontal lines.
Then solve this problem in the dual plane. Here you only need to find the leftmost and the rightmost intersection points. Use binary searches to find a pair of horizontal coordinates such that all intersection points lie between them. (You could quickly get approximate results just by continuing the binary searches from these coordinates.)
Then sort the lines according to their slopes in the dual plane, and for pairs of adjacent lines in this sorted order find the intersections closest to those horizontal coordinates. This does not guarantee good complexity in the worst case (when some lines in the primal plane are almost horizontal), but in most cases the time complexity is determined by the sorting: O(N log N) + O(binary_search_complexity).
Assume that we have been given a set S of n points and an arbitrary query line l. Do some preprocessing (other than duality) so that we can answer the nearest (closest) point (of S) to l in O(log n) time (no restriction on space).
You say "no restriction on space", which implies no restriction on preprocessing time.
Consider the sorted abscissas of the sites after rotation by an arbitrary angle: the site closest to a vertical line is found by dichotomy (binary search) after O(log N) comparisons.
Now consider the continuous set of rotations: you can partition it into angle ranges within which the order of the sorted abscissas does not change.
So you can find all the limiting angles by taking the sites in pairs (two sites swap order exactly at the angle where their projections coincide), and store each angle value together with the corresponding ordering of the rotated abscissas.
For a query, find the enclosing angle interval by a first binary search (among O(N²) angles), then the closest site by a search on the rotated abscissas (binary search among O(N) abscissas).
Done the straightforward way, this will require O(N³) storage.
Given that the ordering permutations for two consecutive angles just differ by a single swap, it is not unthinkable that O(N²) storage can be achieved by a suitable data structure.
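The per-query primitive this scheme relies on is just a binary search over the sorted (rotated) abscissas; a small sketch:

    #include <bits/stdc++.h>
    using namespace std;

    // xs: the sites' abscissas along the current direction, sorted ascending and non-empty
    // c : the abscissa of the query line along the same direction
    // returns the index of the site closest to the line (its distance is |xs[i] - c|)
    size_t closestToLine(const vector<double>& xs, double c) {
        size_t i = lower_bound(xs.begin(), xs.end(), c) - xs.begin();
        if (i == 0) return 0;
        if (i == xs.size()) return xs.size() - 1;
        return (c - xs[i - 1] <= xs[i] - c) ? i - 1 : i;   // pick the nearer neighbour
    }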
The distance from the line y = mx + c to a point (xi, yi) is d = |yi - m*xi - c| / sqrt(1 + m^2).
Since the denominator is the same for every point, we only need to minimize F(x, y) = (y - mx - c)^2 over the points of S.
The conditions for (xi, yi) to be the nearest point,
F(xi, yi) <= F(x1, y1), F(x2, y2), ...,
are quadratic inequalities in (m, c).
Suppose you have a solver for such systems; then you would get the regions of the (m, c) plane where each condition is satisfied.
You can then use an interval tree to look up the region in which a query line (m, c) lies.
You need to find the line that goes through I and is perpendicular to the given line, then solve the pair of equations of the two lines to get their intersection. That intersection is the closest point of the initial line to I, but it is not necessarily in S. Let's call the intersection I'. If the elements of S are ordered by x, you can simply do a binary search to get the x in S closest to I'.x and return the point having that x.
You don't have to phrase it directly in terms of duality, but the key observation is that for two points, the boundary between the lines closer to one point and the lines closer to the other point is a set of lines satisfying certain inequalities on the slope and intercept. So if you use these inequalities, then in a sense you are treating the line as a "point" (a pair of numbers that must satisfy certain inequalities to find the nearest point) no matter what you do. It seems like the only other option is to do some preprocessing so you can quickly find the closest projection of your points onto an arbitrary given line (e.g. by computing a small number of projections and eliminating the rest from consideration), but that seems hard to make run in guaranteed O(log n) time per query line.
This ought to work:
Preprocess with a rotating sweep line through angles 0 to pi: project all the points onto the sweep line and record the sweep line angle theta together with the parameters of the projected points, doing this once each time two projected points become coincident, i.e. "cross over each other". By "parameters" I mean: pick any fixed origin A and record (p - A) dot [cos theta, sin theta] for every point p. There will be O(n^2) of these sweep line records, so they can be searched by angle in O(log n) time.

Given a query line Q, use binary search to find the two recorded sweep lines that bracket Q's perpendicular in angle. The recorded projections have the same ordering of points as the ordering of the points projected onto Q's perpendicular. Now find the parameter of the point R that is the projection of Q onto its own perpendicular through A. Use one more binary search in the bracketing sweep lines to find the point whose parameter is closest to R's. This is the closest point to Q.
I have two sets of 2D points, separated from each other by a line in the plane. I'd like to efficiently find the pair of points, consisting of one point from each set, with the minimum distance between them. There's a really convenient looking paper by Radu Litiu, Closest Pair for Two Separated Sets of Points, but it uses an L1 (Manhattan) distance metric instead of Euclidean distance.
Does anyone know of a similar algorithm which works with Euclidean distance?
I can just about see an extension of the standard divide & conquer closest pair algorithm -- divide the two sets by a median line perpendicular to the original splitting line, recurse on the two sides, then look for a closer pair consisting of one point from each side of the median. If the minimal distance from the recursive step is d, then the companion for a point on one side of the median must lie within a box of dimensions 2d*d. But unlike with the original algorithm, I can't see any way to bound the number of points within that box, so the algorithm as a whole just becomes O(m*n).
Any ideas?
Evgeny's answer works, but it's a lot of effort without library support: compute a full Voronoi diagram plus an additional sweep line algorithm. It's easier to enumerate for both sets of points the points whose Voronoi cells intersect the separating line, in order, and then test all pairs of points whose cells intersect via a linear-time merge step.
To compute the needed fragment of the Voronoi diagram, assume that the x-axis is the separating line. Sort the points in the set by x-coordinate, discarding points with larger y than some other point with equal x. Begin scanning the points in order of x-coordinate, pushing them onto a stack. Between pushes, if the stack has at least three points, say p, q, r, with r most recently pushed, test whether the line bisecting pq intersects the separating line after the line bisecting qr. If so, discard q, and repeat the test with the new top three. Crude ASCII art:
Case 1: retain q
------1-2-------------- separating line
     /  |
 p  /   |
  \     |
    q-------r
Case 2: discard q
--2---1---------------- separating line
   \ /
 p  X  r
   \ /
    q
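A small helper for the stack test above (a sketch, with the separating line taken as the x-axis as in the description): it returns where the perpendicular bisector of two points crosses that axis, and q is discarded when bisectorHitsAxis(p, q) > bisectorHitsAxis(q, r). It assumes p and q have different x, which holds here because the scan keeps only one point per x value.

    struct Pt { double x, y; };

    // x-coordinate where the perpendicular bisector of p and q crosses the x-axis:
    // set y = 0 in |(x,0)-p|^2 = |(x,0)-q|^2 and solve for x.
    double bisectorHitsAxis(const Pt& p, const Pt& q) {
        return (q.x * q.x + q.y * q.y - p.x * p.x - p.y * p.y) / (2.0 * (q.x - p.x));
    }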
For each point of one set, find the closest point in the other set; while doing this, keep only the one pair of points with the minimal distance between them. This reduces the given problem to another one: "for every point in set A, find its nearest neighbor in set B", which can be solved with a sweep line algorithm over (1) one set of points and (2) the Voronoi diagram of the other set.
The complexity of this algorithm is O((M+N) log M). Note that it does not use the fact that the two sets of points are separated from each other by a line.
Well, what about this:

Determine on which side of the line each point is:

let P be your points (P0, ..., Pi, ..., Pn)
let A, B be the start and end points of the separating line
so: side(Pi) = signum of cross(B-A, Pi-A)
This is based on the simple fact that the sign of the 2D cross product tells you on which side of the directed line A->B a point lies (see the triangle/polygon winding rule for more info).

Find the minimal distance over all pairs (Pi, Pj) with side(Pi) != side(Pj):

first compute the sides of all points, O(N)
then cycle through all Pi and, inside that loop,
cycle through all Pj, keeping the minimal distance found so far.
If the Pi and Pj groups are of approximately equal size, this is O((N/2)^2).

You can further optimize the search by 'sorting' the points Pi, Pj by their 'distance' from AB.
You can use another dot product to do that: this time, instead of (B-A),
use a vector perpendicular to it, say (C-A).
Discard every point P(i2) (and similarly every P(j2))
for which ((B-A).(P(i1)-A)) is close to ((B-A).(P(i2)-A))
and |((C-A).(P(i1)-A))| << |((C-A).(P(i2)-A))| for some P(i1),
because that means that P(i2) is behind P(i1) (farther from AB)
and close to the normal of AB passing near P(i1).

The complexity after this optimization depends strongly on the dataset;
it should be O(N + Ni*Nj), where Ni and Nj are the numbers of remaining points Pi and Pj.
You need 2N dot products and Ni*Nj distance comparisons (which do not need to be sqrt-ed).
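A minimal sketch of the unoptimized version of this: classify every point by the sign of the cross product with respect to the directed line A->B, then do the O(Ni*Nj) scan over cross-side pairs (the pruning described above is left out):

    #include <bits/stdc++.h>
    using namespace std;

    struct Pt { double x, y; };

    // sign of cross(B-A, P-A): +1 left of A->B, -1 right of A->B, 0 on the line
    int side(const Pt& A, const Pt& B, const Pt& P) {
        double c = (B.x - A.x) * (P.y - A.y) - (B.y - A.y) * (P.x - A.x);
        return (c > 0) - (c < 0);
    }

    double minDistAcross(const vector<Pt>& pts, const Pt& A, const Pt& B) {
        vector<Pt> leftSide, rightSide;
        for (const Pt& p : pts)                          // points on the line go with the left group
            (side(A, B, p) >= 0 ? leftSide : rightSide).push_back(p);
        double best = numeric_limits<double>::infinity();
        for (const Pt& p : leftSide)
            for (const Pt& q : rightSide) {
                double dx = p.x - q.x, dy = p.y - q.y;
                best = min(best, dx * dx + dy * dy);     // compare squared distances
            }
        return sqrt(best);                               // stays infinity if one side is empty
    }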
A typical approach to this problem is a sweep-line algorithm. Suppose you have a coordinate system that contains all points and the line separating points from different sets. Now imagine a line perpendicular to the separating line hopping from point to point in ascending order. For convenience, you may rotate and translate the point set and the separating line such that the separating line equals the x-axis. The sweep-line is now parallel with the y-axis.
Hopping from point to point with the sweep-line, keep track of the shortest distance of two points from different sets. If the first few points are all from the same set, it's easy to find a formula that will tell you which one you'll have to remember until you hit the first point from the other set.
Suppose you have a total of N points. You will have to sort all points in O(N*log(N)). The sweep-line algorithm itself will run in O(N).
(I'm not sure if this bears any similarity to David's idea... I only saw it now after I logged in to post my thoughts.) For the sake of argument, let's say we have transposed everything so that the dividing line is the x-axis, and sorted our points by x-coordinate. Assuming N is not too large, we can scan along the x-axis (that is, traverse our sorted lists of a's and b's), keeping a record of the overall minimum and two lists of passed points. The current point in the scan is tested against the passed points from the other list, most recent first, stopping as soon as the distance from the passed point to (x-coordinate of our scan, 0) is greater than or equal to the overall minimum. In the example below, when reaching b2, we can stop testing at a2.
  scan ->
Y
|         a2
|
|   a1               a3
X--------------------------
|       b1           b3
|           b2
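A sketch of that scan (assuming the dividing line has already been transposed to the x-axis). It uses a slightly simpler stopping rule, breaking out as soon as the x-gap alone reaches the best distance found so far, which is safe because the x-gap never exceeds the true distance. As the question notes for its own approach, the worst case is still O(m*n); the pruning only helps on friendly data.

    #include <bits/stdc++.h>
    using namespace std;

    struct Pt { double x, y; };

    double dist(const Pt& a, const Pt& b) { return hypot(a.x - b.x, a.y - b.y); }

    // a: points above the x-axis, b: points below it (or vice versa)
    double closestAcross(vector<Pt> a, vector<Pt> b) {
        auto byX = [](const Pt& p, const Pt& q) { return p.x < q.x; };
        sort(a.begin(), a.end(), byX);
        sort(b.begin(), b.end(), byX);
        double best = numeric_limits<double>::infinity();

        // test p against the already passed points of the other list, newest first
        auto scanBack = [&](const Pt& p, const vector<Pt>& other, size_t passed) {
            for (size_t j = passed; j-- > 0; ) {
                if (p.x - other[j].x >= best) break;     // x-gap alone already too large
                best = min(best, dist(p, other[j]));
            }
        };

        size_t i = 0, j = 0;
        while (i < a.size() || j < b.size()) {           // merge-scan both lists by x
            bool takeA = j == b.size() || (i < a.size() && a[i].x <= b[j].x);
            if (takeA) { scanBack(a[i], b, j); ++i; }
            else       { scanBack(b[j], a, i); ++j; }
        }
        return best;
    }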
I'm looking for an algorithm to solve this problem:
Given N rectangles in the Cartesian plane, find out whether the intersection of those rectangles is empty or not. Each rectangle can lie in any orientation (its edges are not necessarily parallel to Ox and Oy).
Do you have any suggestions for solving this problem? :) I can think of testing the intersection of each rectangle pair, but that is O(N*N) and quite slow :(
Abstract
Either sort the rectangles by their smallest X value, or store your rectangles in an R-tree and search it.
Straight-forward approach (with sorting)
Let us denote low_x() - the smallest (leftmost) X value of a rectangle, and high_x() - the highest (rightmost) X value of a rectangle.
Algorithm:
Sort the rectangles according to low_x().                    # O(n log n)
For each rectangle in the sorted array:                      # O(n)
    Find its highest X point, high_x().                      # O(1)
    Compare it with all rectangles whose low_x() is smaller  # O(log n) to locate them
        than this.high_x()
Complexity analysis
This should run in O(n log n) on uniformly distributed rectangles.
The worst case would be O(n^2), for example when the rectangles don't overlap but are one above another. In this case, generalize the algorithm to have low_y() and high_y() too.
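A sketch of this sweep; the Rect fields and the exact pairwise test are placeholders (the version shown only handles axis-aligned boxes, so for rotated rectangles you would store the x-extent of each rectangle and swap in, e.g., a separating-axis test):

    #include <bits/stdc++.h>
    using namespace std;

    struct Rect { double lowX, highX, lowY, highY; };   // placeholder: axis-aligned box

    // placeholder exact test; replace with a rotated-rectangle test if needed
    bool exactlyIntersect(const Rect& a, const Rect& b) {
        return a.lowX <= b.highX && b.lowX <= a.highX &&
               a.lowY <= b.highY && b.lowY <= a.highY;
    }

    bool anyPairIntersects(vector<Rect> rects) {
        sort(rects.begin(), rects.end(),
             [](const Rect& a, const Rect& b) { return a.lowX < b.lowX; });   // by low_x()
        vector<const Rect*> active;               // rectangles whose x-range may still overlap
        for (const Rect& r : rects) {
            // drop rectangles that end before the current one starts
            active.erase(remove_if(active.begin(), active.end(),
                                   [&](const Rect* a) { return a->highX < r.lowX; }),
                         active.end());
            for (const Rect* a : active)
                if (exactlyIntersect(*a, r)) return true;
            active.push_back(&r);
        }
        return false;
    }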
Data-structure approach: R-Trees
R-trees (a spatial generalization of B-trees) are one of the best ways to store geospatial data, and can be useful in this problem. Simply store your rectangles in an R-tree, and you can spot intersections with a straightforward O(n log n) complexity. (n searches, log n time for each).
Observation 1: given a polygon A and a rectangle B, the intersection A ∩ B can be computed by 4 intersections with the half-planes corresponding to the edges of B.
Observation 2: intersecting a convex polygon with a half-plane yields a convex polygon. The first rectangle is a convex polygon, and each such cut increases the number of vertices by at most 1.
Observation 3: the signed distance of the vertices of a convex polygon to a straight line is a unimodal function.
Here is a sketch of the algorithm:
Maintain the current partial intersection D in a balanced binary tree, in CCW order.
When cutting with a half-plane defined by a line L, find the two edges of D that intersect L. This can be done in logarithmic time through some clever binary or ternary search exploiting the unimodality of the signed distance to L. (This is the part I don't exactly remember.) Remove all the vertices of D on the discarded side of L, and insert the intersection points into D.
Repeat for all edges L of all rectangles.
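The balanced-tree step is the fiddly part; as a reference point, here is the plain linear-time version of the cut in Observation 2 (keep the part of a convex CCW polygon on the left of a directed line a->b). Applying it to all 4N rectangle edges gives a simple, if slower, variant of the same algorithm:

    #include <bits/stdc++.h>
    using namespace std;

    struct Pt { double x, y; };

    double cross(const Pt& a, const Pt& b, const Pt& c) {   // (b-a) x (c-a)
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    }

    // intersection of segment p-q with the (infinite) line through a and b,
    // assuming p and q lie strictly on opposite sides of that line
    Pt lineIntersect(const Pt& p, const Pt& q, const Pt& a, const Pt& b) {
        double t = cross(a, b, p) / (cross(a, b, p) - cross(a, b, q));
        return {p.x + t * (q.x - p.x), p.y + t * (q.y - p.y)};
    }

    // clip the convex CCW polygon `poly` against the half-plane to the left of a->b
    vector<Pt> clipHalfPlane(const vector<Pt>& poly, const Pt& a, const Pt& b) {
        vector<Pt> out;
        int n = poly.size();
        for (int i = 0; i < n; ++i) {
            const Pt& cur = poly[i];
            const Pt& nxt = poly[(i + 1) % n];
            double cc = cross(a, b, cur), cn = cross(a, b, nxt);
            if (cc >= 0) out.push_back(cur);                  // cur is on the kept side
            if ((cc > 0 && cn < 0) || (cc < 0 && cn > 0))     // the edge crosses the cutting line
                out.push_back(lineIntersect(cur, nxt, a, b));
        }
        return out;   // an empty result means the running intersection is empty
    }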
This seems like a good application of Klee's measure problem. If you read http://en.wikipedia.org/wiki/Klee%27s_measure_problem, there are O(n log n) lower bounds on the runtime of the best algorithms for such rectilinear intersection problems.
I think you should use something like the sweep line algorithm: finding intersections is one of its applications. Also, have a look at the answers to this question.
Since the rectangles need not be axis-parallel, it is easier to transform the problem into an already solved one: computing the intersections of the rectangles' borders.
Build a set S which contains all border segments together with the rectangle each belongs to; you get a set of tuples of the form ((x_start, y_start), (x_end, y_end), r_n), where r_n is of course the ID of the corresponding rectangle.
Now use a sweep line algorithm to find the intersections of those segments:
The sweep line stops at every x-coordinate in S, i.e. at all start values and all end values. For every new start coordinate, put the corresponding segment into a temporary set I. For every new end coordinate, remove the corresponding segment from I.
In addition to adding new segments to I, you can check for each new segment whether it intersects one of the segments currently in I. If they do, the corresponding rectangles intersect, too.
You can find a detailed explanation of this algorithm here.
The runtime is O(n*log(n) + c*log(n)), where c is the number of intersection points of the lines in I.
Pick the smallest rectangle from the set (or any rectangle) and go over each point within it. If one of its points also lies inside all the other rectangles, the intersection is not empty. If none of its points lies inside all the other rectangles, the intersection is empty.