I am aware that we can use the Fast Fourier Transform to multiply two polynomials of degree n in O(n log n) time, a big saving over the brute-force O(n^2) approach. Is it possible to generalize this result to two polynomials of different degrees?
Clearly it can be done in O(n log n) where n is the larger of the two degrees, but I'm looking for a bound that depends on both n and m.
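For reference, here is a minimal sketch of the FFT approach (using numpy; the function name is mine): the transform length only has to cover the n + m + 1 coefficients of the product, which is exactly where a bound in both n and m would come from.

```python
import numpy as np

def fft_multiply(a, b):
    """Multiply two polynomials given as coefficient lists (lowest degree
    first); the rounding at the end assumes integer coefficients."""
    size = len(a) + len(b) - 1            # the product has n + m + 1 coefficients
    fa = np.fft.rfft(a, n=size)           # zero-pads both inputs to `size`
    fb = np.fft.rfft(b, n=size)
    return np.round(np.fft.irfft(fa * fb, n=size)).astype(int)
```

Padding both inputs to length n + m + 1 makes the circular convolution computed by the FFT coincide with ordinary polynomial multiplication; for example, fft_multiply([1, 1], [1, 2, 1]) returns [1, 3, 3, 1].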
Related
I'm trying to figure out if there's a way to simplify O(n^c) · O(c^n) but couldn't come up with anything. Is this equal to infinity, or would it fall under one of the more common types of complexity (O(n), O(n^c), O(c^n), O(n log n), O(log n) or O(n!))?
O(n^c) · O(c^n) means the set of functions that can be expressed as f(n) · g(n) where f(n) ∈ O(n^c) and g(n) ∈ O(c^n).
Since n^c ∈ O((1+δ)^n) for arbitrarily small positive δ, we have O(1^n) ⊂ O(n^c) ⊂ O((1+δ)^n). (Do you see why?)
Additionally, it's a general property of big-O notation that O(foo) · O(bar) = O(foo · bar); so we can always move multiplication inside the O (or pull multiplication outside the O). (Do you see why?)
Combining these two observations, we have O(c^n) ⊂ O(n^c) · O(c^n) ⊂ O((c+δ)^n) for arbitrarily small positive δ. So you can reasonably simplify O(n^c) · O(c^n) to O((c+δ)^n). This is analogous to how O(n log n) is sometimes called "quasilinear", because it's a subset of O(n^(1+δ)) for arbitrarily small positive δ.
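To see the upper bound directly: for any fixed δ > 0,

n^c · c^n / (c+δ)^n = n^c · (c/(c+δ))^n → 0 as n → ∞,

since a geometric factor with ratio c/(c+δ) < 1 eventually shrinks faster than any polynomial grows; hence n^c · c^n ∈ O((c+δ)^n).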
e^N grows faster than all of the functions on your list of candidate complexities.
To find out whether g(N) = e^N · N^e and f(N) = e^N have the same order of growth, we can look at the limit of g(N)/f(N) as N → ∞. But g(N)/f(N) = N^e → ∞, so g grows strictly faster than f.
So the given function doesn't fall under any of the more common types of complexity (O(n), O(n^c), O(c^n), O(n log n), O(log n)): it grows faster than all of them.
A log-log plot of the functions mentioned illustrates and compares their growth rates.
Some algorithms have an exponential brute-force solution but can be simplified (by adding restrictions) to become faster. The brute-force solution of the traveling salesman problem is O(n!), which is approximately O(n^n).
Nowadays an input with N = 30 is solvable in minutes for a 2^N algorithm. For N = 1..30 you can observe the behavior of c^N · N^c and c^N on a log-log plot:
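A short matplotlib sketch (with c = 2 chosen for illustration) produces this kind of log-log comparison:

```python
import numpy as np
import matplotlib.pyplot as plt

N = np.arange(1, 31)
c = 2.0
plt.loglog(N, c**N * N**c, label="c^N * N^c")
plt.loglog(N, c**N, label="c^N")
plt.loglog(N, np.e**N, label="e^N")
plt.xlabel("N")
plt.ylabel("value")
plt.legend()
plt.show()
```

On these axes the extra polynomial factor N^c shows up as a widening gap between c^N · N^c and c^N, while e^N eventually dominates both.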
I'm trying to understand whether O(n*m) is considered polynomial, given that m and n are the sizes of two independent inputs.
I just want to clarify the concept of polynomial time here, and to know whether O(n*m) has a different name for its type of complexity. How do we represent it on a graph?
Let's say that m > n. Then O(n*m) ⊂ O(m^2), and the latter is obviously polynomial. So O(n*m) is polynomial as well.
Yes, it is polynomial. Basically, as long as n and m don't appear as exponents in the big-O expression, it is polynomial.
You can see it like this:
Polynomial. When the complexity of the algorithm is described by some polynomial function (e.g. O(n*m), O(n^3 * log m), etc.)
Exponential. When the complexity of the algorithm is described by some exponential function (e.g. O(m * 2^n), O(3^n * log m), etc.)
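As a toy illustration (hypothetical code, not from the question): an algorithm that examines every pair drawn from two independent inputs does exactly n·m units of work, and since n·m ≤ (n+m)^2 that is polynomial in the total input size.

```python
def count_common_pairs(xs, ys):
    """Count pairs (x, y) with x == y; runs in O(n*m) time,
    where n = len(xs) and m = len(ys)."""
    count = 0
    for x in xs:          # n iterations
        for y in ys:      # m iterations each
            if x == y:
                count += 1
    return count
```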
I am wondering if there are fast algorithms to do polynomial multiplication truncated at degree N, i.e. given two polynomials of degree N,
p(x) = a_0 + a_1 x + ... + a_N x^N and q(x) = b_0 + b_1 x + ... + b_N x^N,
I am interested in their product but only up to degree N, i.e.
Σ_{k=0..N} ( Σ_{j=0..k} a_j · b_{k−j} ) x^k.
Note that the k sum only goes to N (not to 2N as it would in normal polynomial multiplication). For example, given (say) p(x) = 1 + x + x^2 and q(x) = 1 + 2x + 3x^2,
the result should be: 1 + 3x + 6x^2 (the x^3 and x^4 terms of the full product are dropped).
I am aware that there are fast algorithms to do polynomial multiplication but I wonder if they can be applied to the restricted polynomial multiplication I am interested in.
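For concreteness, a minimal sketch of that truncated ("short") product, assuming coefficients are stored lowest degree first (the function name is mine). The naive loop is O(N^2); simply truncating the output of a fast full multiplication, e.g. FFT-based, already achieves O(N log N).

```python
def short_product(a, b, N):
    """Return (a * b) mod x^(N+1): only the coefficients of x^0 .. x^N."""
    c = [0] * (N + 1)
    for i, ai in enumerate(a):
        if i > N:
            break
        for j, bj in enumerate(b[:N - i + 1]):  # only terms with i + j <= N
            c[i + j] += ai * bj
    return c
```

E.g. short_product([1, 1, 1], [1, 2, 3], 2) returns [1, 3, 6], matching the example above.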
When dividing two long polynomials of degree n − 1, the remainder's coefficients can obviously be calculated in O(n * n). I want to know if there is any faster algorithm to obtain the remainder, e.g. O(n log n).
P.S. (updated to clarify): if the polynomials p, q have different degrees, with deg(p) > deg(q), is it possible to find the remainder of p / q faster than O((deg(p) − deg(q)) * deg(p)) without losing accuracy?
If the polynomials are of the same degree, you can determine with one operation the number of times one goes into the other (if the coefficients are integers, this will in general be a rational number). Then you can multiply that fraction by one polynomial and subtract it from the other. This takes O(n) field operations, which is optimal, since the answer has length Θ(n) in the average and worst case.
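A tiny sketch of that single elimination step, assuming integer coefficients and lists stored lowest degree first (the names are mine):

```python
from fractions import Fraction

def eliminate_leading(p, q):
    """For p, q of the same degree, return p - (lead(p)/lead(q)) * q.
    The leading terms cancel, so the result has strictly lower degree."""
    ratio = Fraction(p[-1], q[-1])        # one division of leading coefficients
    return [a - ratio * b for a, b in zip(p, q)][:-1]
```

For example, eliminate_leading([2, 3, 1], [1, 1, 1]) computes (x^2 + 3x + 2) − (x^2 + x + 1) = 2x + 1, returned as [1, 2].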
Yes, there is a faster algorithm, although it only pays off for very large degrees. First reverse the polynomials, then use fast (Newton-based) Taylor series division to compute the reversed quotient; reverse the quotient and compute the remainder using fast multiplication.
The first step costs as much as a multiplication of polynomials of the quotient's degree, the second as much as a multiplication of polynomials of the divisor's degree.
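A rough, self-contained Python sketch of that recipe (the helper names are mine, and the schoolbook mul stands in for the fast multiplication this answer assumes, so the stated asymptotics only materialize once it is replaced):

```python
def mul(a, b):
    """Schoolbook product; swap in an FFT-based multiply for O(n log n)."""
    res = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            res[i + j] += x * y
    return res

def inverse(f, k):
    """Power series inverse of f modulo x^k via Newton iteration
    g <- g * (2 - f * g), doubling the precision each round."""
    g = [1 / f[0]]            # floats for simplicity; use Fraction for exactness
    prec = 1
    while prec < k:
        prec = min(2 * prec, k)
        fg = mul(f[:prec], g)[:prec]
        t = [-c for c in fg]
        t[0] += 2             # t = 2 - f * g  (mod x^prec)
        g = mul(g, t)[:prec]
    return g

def divmod_poly(p, d):
    """Quotient and remainder of p / d via the reversal trick.
    Coefficient lists, lowest degree first."""
    n, m = len(p) - 1, len(d) - 1         # degrees
    if n < m:
        return [0], p
    k = n - m + 1                         # number of quotient coefficients
    q_rev = mul(p[::-1][:k], inverse(d[::-1], k))[:k]
    q = q_rev[::-1]
    qd = mul(q, d)
    r = [pc - qc for pc, qc in zip(p, qd)][:m]
    return q, r
```

For instance, divmod_poly([2, 3, 1], [1, 1]) divides x^2 + 3x + 2 by x + 1 and returns the quotient [2, 1] (i.e. x + 2) and remainder [0].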
As homework, I should implement integer multiplication on numbers of 1,000 digits using a divide-and-conquer approach that works below O(n). What algorithm should I look into?
The Schönhage–Strassen algorithm is one of the fastest multiplication algorithms known. It takes O(n log n log log n) time.
Fürer's algorithm is the fastest large-number multiplication algorithm known so far and takes O(n * log n * 2^(O(log* n))) time, where log* is the iterated logarithm.
I don't think any multiplication algorithm could take less than O(n), or even exactly O(n), time: at minimum the algorithm has to read all n digits of its input. You are most likely expected to beat the naive O(n^2) approach instead.
Take a look at the Karatsuba algorithm. It involves a recursion step which you can easily model with divide-and-conquer.
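A compact sketch of Karatsuba on Python integers (illustrative only, since Python's built-in * is already asymptotically fast): splitting each factor at m bits turns one product into three recursive products, which is what yields the O(n^1.585) bound.

```python
def karatsuba(x, y):
    """Multiply nonnegative integers x and y with three recursive products."""
    if x < 10 or y < 10:                            # small base case
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2    # split point in bits
    high_x, low_x = x >> m, x & ((1 << m) - 1)      # x = high_x * 2^m + low_x
    high_y, low_y = y >> m, y & ((1 << m) - 1)
    z0 = karatsuba(low_x, low_y)
    z2 = karatsuba(high_x, high_y)
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2
    return (z2 << (2 * m)) + (z1 << m) + z0
```

The middle term z1 reuses z0 and z2, replacing the fourth multiplication of the naive scheme with additions and subtractions.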