Which recursive formula is more complex?

T(n) = 4T(n/2) + n
= O(n^2) using the master theorem.
Is the above more complex than the one below?
T(n) = 3T(n/4) + n^2
Both are O(n^2) by the master theorem,
but I do not know how to compare the constant factors.

Hint: An easier question: which one has higher complexity, 4n^2 or 5n^2?
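The hint can be checked by iterating both recurrences numerically. A rough sketch in Python (the function names and the base case T(1) = 1 are my own assumptions):

```python
from functools import lru_cache

# Both recurrences are Theta(n^2) by the master theorem, so only the
# constant factor differs.  Base case T(1) = 1 assumed.

@lru_cache(maxsize=None)
def t_first(n):
    # T(n) = 4T(n/2) + n
    if n <= 1:
        return 1
    return 4 * t_first(n // 2) + n

@lru_cache(maxsize=None)
def t_second(n):
    # T(n) = 3T(n/4) + n^2
    if n <= 1:
        return 1
    return 3 * t_second(n // 4) + n * n

for k in (10, 14, 18):
    n = 2 ** k
    print(n, t_first(n) / n ** 2, t_second(n) / n ** 2)
```

With this base case the first ratio approaches 2 (in fact T(n) = 2n^2 - n exactly for powers of 2), while the second approaches 16/13 ≈ 1.23, so the second recurrence has the smaller constant.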

Related

Solving by the master theorem and the recursion tree gives different answers

Suppose we have been given the recurrence relation T(n) = 2T(n/3) + n and need to find the time complexity.
My problem is that I get different answers from the master theorem and the recursion tree method.
By the master theorem for the equation T(n) = aT(n/b) + O(n^k log^p n):
here a = 2, b = 3, p = 0, k = 1, and since a < b^k, i.e. 2 < 3^1, the master theorem gives T(n) = O(n).
Here is my recursion tree method
And at the end I seriously don't know what I found out. It looks something like O(n·2^(log n)).
Surely that's not the answer and I have messed it up somewhere, but I don't see where I went wrong. What is the right approach?
Also, I wanted to ask: are the recurrences T(n) = 2T(n/3) + n and T(n) = 2T(n/3) + C the same, where C is a constant in the 2nd equation?
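The recursion-tree sum for T(n) = 2T(n/3) + n can be sketched numerically (the helper name is my own, and n is assumed to be a power of 3):

```python
# Level i of the recursion tree for T(n) = 2T(n/3) + n has 2^i subproblems
# of size n/3^i, so the level's total cost is (2/3)^i * n -- a decreasing
# geometric series dominated by the root.

def level_costs(n):
    costs = []
    i = 0
    while n / 3 ** i >= 1:
        costs.append((2 / 3) ** i * n)
        i += 1
    return costs

n = 3 ** 12
print(sum(level_costs(n)) / n)  # approaches 1 / (1 - 2/3) = 3, i.e. Theta(n)
```

Note the contrast with T(n) = 2T(n/3) + C: there every node costs a constant, the n^(log_3 2) leaves dominate, and the total is Θ(n^(log_3 2)) ≈ Θ(n^0.63), so the two recurrences are not the same.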

Solving the recurrence T(n) = T(n/3) + O(log n) + n by giving a tight bound

Is it OK to drop the lower-order terms when solving recurrences? In this case I decided to drop the O(log n).
Here is my attempt at solving the recurrence (please excuse the bad handwriting!):
You can rewrite O(log n) + n = Theta(n) and continue on your merry way with the Master Theorem to get a bound of Theta(n).
If you want an even better bound, you can go to the trouble of verifying T(n) = 3n/2 + c log^2 n for some fixed constant c using the substitution method.
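A quick numeric check of the Θ(n) bound, as a sketch (the function name, the base case T(1) = 1, and taking log base 2 for the O(log n) term are my own assumptions):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def t(n):
    # T(n) = T(n/3) + log n + n, with T(1) = 1 assumed
    if n <= 1:
        return 1
    return t(n // 3) + math.log2(n) + n

for k in (8, 12, 16):
    n = 3 ** k
    print(n, t(n) / n)  # settles near 3/2, the constant from the answer
```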

Master theorem for subproblems of different sizes

The Master theorem's generic form mentions that:
it is assumed that all subproblems are essentially the same size
The Akra–Bazzi method is applied when:
the sub-problems have substantially different sizes
But what are the criteria for substantially different? For example I have a recurrence relation like:
T(n) = T(n/4) + T(3n/4) + cn
(c is some constant)
Can I still use the master theorem to solve this relation (for instance approximating it as T(n) = 2T(3n/4) + cn)? Or, in other words, are these subproblem sizes "essentially the same" or are they already "substantially different"?
Assuming c is some constant, you have: T(n) = T(n/4) + T(3n/4) + O(n)
Solving this with the Akra-Bazzi method gives Θ(n log n): the exponent p satisfies (1/4)^p + (3/4)^p = 1, so p = 1, and the integral contributes the log factor.
Solving it by assuming T(n) = 2T(3n/4) + O(n) gives O(n^2.4094) (exp. rounded to 4 dp)
So just by trying it out, you can confirm that they are already substantially different.
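The gap is easy to see numerically; a sketch in Python (the function names and the base case T(1) = 1 are my own assumptions):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def exact(n):
    # T(n) = T(n/4) + T(3n/4) + n
    if n <= 1:
        return 1
    return exact(n // 4) + exact(3 * n // 4) + n

@lru_cache(maxsize=None)
def approx(n):
    # the approximation T(n) = 2T(3n/4) + n
    if n <= 1:
        return 1
    return 2 * approx(3 * n // 4) + n

for k in (10, 14, 18):
    n = 2 ** k
    # first ratio stays roughly flat, second ratio keeps growing
    print(n, exact(n) / (n * math.log2(n)), approx(n) / exact(n))
```

Since approx(n)/exact(n) keeps growing, replacing T(n/4) + T(3n/4) by 2T(3n/4) changes the asymptotics; that is, the subproblem sizes are already substantially different.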

Recurrence: T(n) = (2+1/log n)T(n/2)

I have to solve this recurrence relation with tree method, because Master theorem does not apply.
T(n) = (2+1/log n) T(n/2)
After some thought I cannot come up with an exact solution. The master theorem does not work here and unrolling the tree has not given me anything reasonable. So I will just estimate the complexity in the following way.
For any reasonably big n you can estimate 0 < 1/log n < 1. So you can get:
T1(n) = 2 * T1(n/2)
T2(n) = 3 * T2(n/2)
and T1(n) ≤ T(n) ≤ T2(n). You can find the complexity of both recurrences using the master theorem: T1 is Θ(n) and T2 is Θ(n^(log_2 3)).
So you can be sure that the complexity of your recurrence is bigger than Θ(n) and smaller than Θ(n^1.585), so less than quadratic.
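The sandwich can be checked numerically; a sketch (the base case T(2) = 1 is my own assumption): T(n)/n keeps growing while T(n)/n^(log_2 3) shrinks, consistent with the bounds above.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def t(n):
    # T(n) = (2 + 1/log n) * T(n/2), with T(2) = 1 assumed
    if n <= 2:
        return 1.0
    return (2 + 1 / math.log2(n)) * t(n // 2)

for k in (10, 20, 30):
    n = 2 ** k
    print(n, t(n) / n, t(n) / n ** math.log2(3))
```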

Solving T(n) = √2·T(n/2) + log n using the master theorem

The question is :
T(n) = √2*T(n/2) + log n
I'm not sure whether the master theorem works here, and kinda stuck.
This looks more like a job for the Akra-Bazzi theorem (http://en.wikipedia.org/wiki/Akra%E2%80%93Bazzi_method#The_formula) with k=1, h=0, g(n)=log n, a=2^{1/2}, b=1/2. In that case p=1/2 and you need to evaluate the integral \int_1^x log(u)/u^{3/2} du. You can use integration by parts, or a symbolic integrator. Wolfram Alpha tells me the indefinite integral is -2(log u + 2)/u^{1/2} + C, so the definite integral is 4 - 2(log x + 2)/x^{1/2}. Adding 1 and multiplying by x^{1/2}, we get T(x) = \Theta(5x^{1/2} - 2 log x - 4), i.e. \Theta(\sqrt{x}).
The master theorem only has constraints on a and b, which hold in your case. The fact that a is irrational and that your f(n) is log n has no bearing on it.
So in your case c = log2(sqrt(2)) = 1/2. Since n^c grows faster than log(n), the complexity of the recursion is O(sqrt(n)).
P.S. Danyal's solution is wrong, as the complexity is not n log n; Edward Doolittle's solution is correct, though it is overkill in this simple case.
As per the master theorem, f(n) should be a polynomial, but here
f(n) = log n,
which is not a polynomial, so by that rule it cannot be solved with the master theorem. However, I read somewhere about a fourth case as well, which I should mention.
It is also discussed here:
Master's theorem with f(n)=log n
However, there is a limited "fourth case" for the master theorem, which allows it to apply to polylogarithmic functions.
If
f(n) = O(n^(log_b a) · log^k n), then T(n) = O(n^(log_b a) · log^(k+1) n).
In other words, suppose you have T(n) = 2T(n/2) + n log n. f(n) isn't a polynomial, but f(n) = n log n with k = 1. Therefore, T(n) = O(n log^2 n).
See this handout for more information: http://cse.unl.edu/~choueiry/S06-235/files/MasterTheorem-HandoutNoNotes.pdf
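The example above can be checked numerically; a sketch (base case T(1) = 1 assumed, log taken base 2): the ratio T(n)/(n log^2 n) levels off, as the fourth case predicts.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def t(n):
    # T(n) = 2T(n/2) + n log n, with T(1) = 1 assumed
    if n <= 1:
        return 1
    return 2 * t(n // 2) + n * math.log2(n)

for k in (10, 15, 20):
    n = 2 ** k
    print(n, t(n) / (n * math.log2(n) ** 2))  # levels off near 1/2
```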
