Complexity of T(n)=16T(n/4)-2n^2 - algorithm

I'm having difficulties solving this recurrence relation: T(n) = 16T(n/4) - 2n^2. As I understand it, I can't use the master theorem because f(n) has to be asymptotically positive, and here f(n) = -2n^2 is negative. Because of that I tried the iteration (unrolling) method:
T(n) = 16T(n/4)-2n^2
T(n) = 16(16T(n/16)-2(n/4)^2)-2n^2
T(n) = 16(16(16T(n/64)-2(n/16)^2)-2(n/4)^2)-2n^2
Since 16 * 2 * (n/4)^2 = 2n^2, each level of the expansion contributes another -2n^2, so this brings me to:
T(n) = 16^i * T(n/4^i) - 2*i*n^2
The recursion reaches its base case when n/4^i = 1, so i = log4(n).
When I insert i = log4(n) into the equation (assuming T(1) = 1) I get:
T(n) = 16^(log4(n)) - 2*n^2*log4(n) = n^2 - 2*n^2*log4(n)
So would this bring me to a complexity of
T(n) = Θ(n^2 * log4(n)) ?
Thanks so much!
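
Edit: to sanity-check the unrolled formula I tried it numerically (a minimal Python sketch, assuming T(1) = 1, which I didn't state above). For n a power of 4 the direct recursion matches n^2 - 2*n^2*log4(n), whose magnitude grows like n^2*log(n):

import math

def T(n):
    if n == 1:
        return 1                                    # assumed base case T(1) = 1
    return 16 * T(n // 4) - 2 * n**2

for k in range(1, 8):
    n = 4**k
    closed_form = n**2 - 2 * n**2 * math.log(n, 4)  # = 16^log4(n) - 2*n^2*log4(n)
    print(n, T(n), closed_form)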

Related

Solve the recurrence equation T(n) = T(n/3) + O(1) using iteration or substitution

I realize that solving this with the master theorem gives the answer Θ(log n). However, I want to know more and find the base of the logarithm. I tried reading more about the master theorem to find out about the base, but could not find more information on Wikipedia (https://en.wikipedia.org/wiki/Master_theorem_(analysis_of_algorithms)).
How would I solve this using recursion tree or substitution method for solving recurrences?
You can assume n = 2^K and T(0) = 0.
Don't set n = 2^k but n = 3^k;
thus T(3^k) = T(3^(k-1)) + c.
Writing w_k = T(3^k), the recurrence becomes w_k = w_(k-1) + c.
Assuming T(1) = 1, i.e. w_0 = 1,
the general term is w_k = ck + 1.
Substituting k = log_3(n), you conclude T(n) = c*log_3(n) + 1,
and thus T(n) = O(log_3(n)).
T(n) = T(n/3) + O(1) = T(n/9) + O(1) + O(1) = T(n/27) + O(1) + O(1) + O(1) = …
After log_3(n) steps the recursive term reaches the base case, leaving log_3(n) copies of the O(1) work, so T(n) = O(log n).
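
If it helps to see the unrolling concretely, here is a minimal sketch (assuming T(1) = 1 and taking the O(1) work to be a concrete constant c = 1) that evaluates the recurrence for n = 3^k and compares it with c*log_3(n) + 1:

import math

def T(n, c=1):
    # unroll T(n) = T(n/3) + c down to the assumed base case T(1) = 1
    if n <= 1:
        return 1
    return T(n // 3, c) + c

for k in range(1, 8):
    n = 3**k
    print(n, T(n), math.log(n, 3) + 1)   # matches c*log_3(n) + 1 with c = 1 (up to float rounding)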

Calculate the time complexity of a recurrence relation, Big-Oh notation

I'm trying to find the Big-Oh of this recurrence relation:
T(N) = 4T(N/2) + N^2.
T(1) = 1
From the master theorem we can say T(N) = Θ(N^2 log N), since N^2 = Θ(N^(log2 4)) puts us in case 2.
So the answer to the recurrence relation is:
O(N^2 log N)
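
As a sanity check, here is a minimal sketch that evaluates the recurrence directly for N a power of 2 with the given base case T(1) = 1; the exact solution turns out to be N^2 * (log2(N) + 1), which is Θ(N^2 log N):

import math

def T(N):
    # direct evaluation of T(N) = 4*T(N/2) + N^2 with the given T(1) = 1
    if N == 1:
        return 1
    return 4 * T(N // 2) + N**2

for k in range(1, 8):
    N = 2**k
    print(N, T(N), N**2 * (math.log2(N) + 1))   # the two columns agree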

T(n) = T(n/2) + T(n/4) solve this recurrence using iterative method

How to solve the recurrence equation
T(n) = T(n/2) + T(n/4)
The base case is T(n) = 1 (for small n).
I have already checked this and this for clues, but they didn't help me solve it using the iterative method.
I just need to get to the general equation for this.
I don't see how to use the "iterative method" to solve this.
But there's a similarity between the recurrence relation you have and the one for the Fibonacci numbers, and that can be used to find a solution.
Write n = 2^k: then T(2^k) = T(2^(k-1)) + T(2^(k-2)). So assuming T(1) = T(2) = 1, T(2^k) = Fib(k). So for n a power of 2, T(n) = Fib(lg(n)). Since Fib(k) = Theta(phi^k), T(n) = Theta(phi^(lg n)) = Theta(n ^ lg(phi)) ≈ n^0.694
Here Fib(n) is the n'th Fibonacci number, and phi = (1 + sqrt(5))/2.
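
A minimal sketch of this (assuming T(1) = T(2) = 1, as above): compute T(2^k) from the recurrence and divide by n^(lg phi); the ratio stays bounded, consistent with T(n) = Theta(n^(lg phi)):

import math
from functools import lru_cache

PHI = (1 + math.sqrt(5)) / 2

@lru_cache(maxsize=None)
def T(n):
    # assumed base cases T(1) = T(2) = 1, as in the answer above
    if n <= 2:
        return 1
    return T(n // 2) + T(n // 4)

for k in range(2, 41, 6):
    n = 2**k
    print(k, T(n), T(n) / n**math.log2(PHI))   # ratio approaches 1/sqrt(5) ~ 0.447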

master theorem base case is constant?

Does the Master Theorem assume T(1) is constant? Say I have an algorithm with recurrence T(n) = 2T(n/2) + O(1) and T(1) = O(log n); what is the time complexity of this algorithm?
For the recurrence relation: T(n) = 2T(n/2) + O(1), we have
a = 2
b = 2
f(n) = O(1), the cost of the work done outside the recursive calls
Since O(1) = O(n^(log2(2) - ε)) for some ε > 0, master theorem case 1 applies, and we have:
T(n) ∈ Θ(n ^ log2(2)) ⇒
T(n) ∈ Θ(n)
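
A minimal numeric sketch of that conclusion (taking the O(1) work to be exactly 1 and assuming T(1) = 1): the exact solution is 2n - 1, i.e. Θ(n):

def T(n):
    # T(n) = 2*T(n/2) + 1 with assumed T(1) = 1
    if n == 1:
        return 1
    return 2 * T(n // 2) + 1

for k in range(1, 8):
    n = 2**k
    print(n, T(n), 2 * n - 1)   # the two columns match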
A recurrence relation defines a sequence based on an initial term. If a problem of size 1 is solved by the base case T(1) = f(n), where f ∈ O(log n), the value of T(1) can't be determined, i.e. it makes no sense as a recurrence relation.
Your statement T(1) = O(log n) does not make any sense. You are basically stating that some quantity that does not depend on n somehow has logarithmic complexity (and thus depends on n in a logarithmic way).
T(1), T(2), T(532143243) are boundary conditions and cannot depend on any parameter. They should be numbers (5, pi/e, sqrt(5) - i).
Sometimes it's best just to try things out rather than relying on a Theorem.
T(m) = 2T(m/2) + O(1)
T(1) = O(logn)
T(2) = 2T(1) = 2log(n)
T(4) = 2T(2) = 4log(n)
T(8) = 2T(4) = 8log(n)
T(16) = 2T(8) = 16log(n)
T(32) = 2T(16) = 32log(n)
In general, T(m) = 2T(m/2) = m*log(n) (the dropped O(1) terms only add lower-order work).
In conclusion, your initial question is indeed nonsensical as others have pointed out because you are attempting to calculate T(n) when the same n is used in T(1) = O(logn). But we can answer your second question that you have added as a comment.
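
To make the table above concrete, here is a minimal sketch that fixes an arbitrary n (n = 1024 is just a choice for the example), sets T(1) = log2(n), unrolls T(m) = 2T(m/2) for m a power of 2 (dropping the O(1) term per level, as the table does), and checks that the result is m*log2(n):

import math

def T(m, n):
    # T(1) = log2(n); each level doubles the previous value, O(1) term dropped as in the table
    if m == 1:
        return math.log2(n)
    return 2 * T(m // 2, n)

n = 1024                          # arbitrary fixed n, just for the example
for k in range(0, 6):
    m = 2**k
    print(m, T(m, n), m * math.log2(n))   # both columns equal m*log2(n)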

T(n) = T(n/2) + T(n/4) + O(1), what is T(n)?

How to solve this recurrence: T(n) = T(n/2) + T(n/4) + O(1)
It doesn't seem like Master Method will help, as this is not in the form of T(n) = aT(n/b) + f(n). And I got stuck for quite a while.
Akra–Bazzi is a much more powerful method than the Master method.
Since the 'non-recursive' term is O(1), it amounts to solving the equation
1/2^p + 1/4^p = 1
And the answer you get will be T(n) = Theta(n^p)
I believe solving the above (a quadratic in x = 1/2^p: x + x^2 = 1, so x = (sqrt(5) - 1)/2 = 1/phi and hence 2^p = phi) gives us p = log_2(phi), where phi is the golden ratio.
Computing that gives us T(n) = Theta(n^0.694...)
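
For what it's worth, a minimal sketch that solves 1/2^p + 1/4^p = 1 numerically by bisection and compares the root with log_2(phi):

import math

def f(p):
    # f is decreasing in p; f(0) = 1 > 0 and f(1) = -0.25 < 0, so the root lies in (0, 1)
    return 0.5**p + 0.25**p - 1

lo, hi = 0.0, 1.0
for _ in range(60):               # bisection
    mid = (lo + hi) / 2
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid

phi = (1 + math.sqrt(5)) / 2
print(lo, math.log2(phi))         # both print ~0.69424...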
