Calculate the time complexity of a recurrence relation, Big-Oh notation - algorithm

I'm trying to find the Big-Oh of this recurrence relation:
T(N) = 4T(N/2) + N^2.
T(1) = 1

From the master theorem (case 2) we can say T(N) = Θ(N^2 log N).

The solution of the recurrence relation is therefore:
O(N^2 log N)
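A quick numeric check of this answer (a sketch, using the base case T(1) = 1 from the question): for N a power of two the recurrence solves exactly to N^2 · (log2(N) + 1), which is Θ(N^2 log N).

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Recurrence from the question: T(1) = 1, T(N) = 4T(N/2) + N^2
    if n == 1:
        return 1
    return 4 * T(n // 2) + n * n

# For N = 2^k the exact solution is N^2 * (log2(N) + 1):
for k in range(1, 8):
    n = 2 ** k
    assert T(n) == n * n * (k + 1)
    print(n, T(n))
```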

Related

Complexity of T(n)=16T(n/4)-2n^2

I'm having difficulties solving this recurrence relation: T(n) = 16T(n/4) - 2n^2. From my understanding I can't use the master theorem, because it requires f(n) to be positive. Because of that I tried iterating the recurrence:
T(n) = 16T(n/4)-2n^2
T(n) = 16(16T(n/16)-2(n/4)^2)-2n^2
T(n) = 16(16(16T(n/64)-2(n/16)^2)-2(n/4)^2)-2n^2
So this brings me to:
T(n) = 16^i * T(n/4^i) - (i*2*n^2)
T(n) terminates when n/4^i=1 so i = log4(n)
When I insert this into the equation I get:
T(n) = 16^(log4(n)) * T(1) - 2*n^2*log4(n) = n^2 * T(1) - 2*n^2*log4(n)
So would this bring me to a complexity of
T(n) = θ(n^2*log4(n)) ?
Thanks so much!
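The unrolling in the question can be checked numerically (a sketch, assuming a base case T(1) = 1, which the question leaves unspecified). Since 16^(log4 n) = n^2, the unrolled form is n^2 · (T(1) − 2·log4(n)), which quickly goes negative — a hint that this recurrence does not describe a running time:

```python
def T(n, t1=1):
    # Recurrence from the question, with an assumed base case T(1) = t1
    if n == 1:
        return t1
    return 16 * T(n // 4, t1) - 2 * n * n

# Unrolled form: T(n) = 16^i * T(1) - 2*n^2*i with n = 4^i, i = log4(n);
# since 16^i = n^2, this is T(n) = n^2 * (T(1) - 2*i)
for i in range(0, 6):
    n = 4 ** i
    assert T(n) == n * n * (1 - 2 * i)
    print(n, T(n))
```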

I tried Solving a recurrence by guessing then proving by induction, but something went wrong

For the following recurrence I have to find the Big-O complexity O().
T(n) = 2T(n/2) + cn
T(1)=c1
I guessed it's T(n) = O(n).
Induction hypothesis : T(n)<= ak for all k < n .
T(n) = 2T(n/2) + c*n
T(n) <= 2(a*(n/2)) + c*n
<= an +cn
=O(n)
It looks completely correct to me, but my TA graded it 0. Where do you think I went wrong?
The so-called Master Theorem, case 2, states that
T(n) = Theta(n log n)
for your example. Furthermore, if T(n) = O(n) were true, merge sort (whose running time satisfies the above recurrence) would have a linear runtime complexity, which is also not the case.
Concerning your argument, apparently you state that there exists a constant a such that
T(n) <= a*n
holds. Consequently, the induction hypothesis should be as follows.
T(k) <= a*k for each k < n
But even if this is assumed, the induction step proves that
T(n) <= (a+c)*n
holds; however, this does not prove the desired property as it does not prove that
T(n) <= a*n
holds.
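To see concretely why O(n) can't be right, one can tabulate the recurrence (a sketch, assuming c = c1 = 1): the ratio T(n)/n grows by 1 with every doubling of n, so no constant a with T(n) ≤ a·n can exist.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Recurrence from the question with assumed constants c = c1 = 1
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# For n a power of two, T(n)/n equals log2(n) + 1, i.e. it is unbounded:
for k in range(1, 8):
    n = 2 ** k
    assert T(n) == n * (k + 1)
    print(n, T(n), T(n) // n)
```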

How to calculate algorithm time complexity

I am trying to multiply two big integers with the Karatsuba algorithm.
I know that O(n) denotes an asymptotic upper bound and T(n) is the worst-case running time.
Can someone please explain why:
T(n) = 4T(n/2) + O(n) is O(n^2)
And
T(n) = 3T(n/2) + O(n) is O(n^1.59)
T(n) = 4T(n/2) + O(n)
According to the Master theorem:
T(n) is O(n^log_2(4)) = O(n^2)
and
T(n) = 3T(n/2) + O(n)
is
T(n) = O(n^log_2(3)) ≈ O(n^1.585)
which is commonly rounded to O(n^1.59).
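The difference between the two recurrences can be made concrete (a sketch, assuming T(1) = 1 in both): iterating each recurrence and dividing by its claimed growth rate gives a ratio that settles to a constant, confirming the exponents.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T4(n):
    # T(n) = 4T(n/2) + n (four half-size multiplications), assuming T(1) = 1
    return 1 if n == 1 else 4 * T4(n // 2) + n

@lru_cache(maxsize=None)
def T3(n):
    # T(n) = 3T(n/2) + n (Karatsuba's three multiplications), assuming T(1) = 1
    return 1 if n == 1 else 3 * T3(n // 2) + n

# For n a power of two, T4(n) solves exactly to 2n^2 - n, and
# T3(n) to 3*n^log2(3) - 2n, i.e. O(n^2) versus O(n^1.585):
for k in (4, 8, 12):
    n = 2 ** k
    print(n, T4(n) / n**2, T3(n) / n**math.log2(3))
```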

master theorem base case is constant?

Does the Master Theorem assume T(1) is constant? Say I have an algorithm with time complexity T(n) = 2T(n/2) + O(1) and T(1) = O(logn); what is the time complexity of this algorithm?
For the recurrence relation: T(n) = 2T(n/2) + O(1), we have
a = 2
b = 2
f(n) = O(1) cost for the work done outside the recursion
therefore the master theorem case 1 applies, and we have:
T(n) ∈ Θ(n ^ log2(2)) ⇒
T(n) ∈ Θ(n)
A recurrence relation defines a sequence starting from an initial term. If the base case is given as T(1) = f(n) with f ∈ O(logn), the value of T(1) can't be determined, so it makes no sense as a recurrence relation.
Your statement T(1) = O(logn) does not make any sense: it claims that a quantity which does not depend on n somehow has logarithmic complexity (and thus does depend on n after all).
T(1), T(2), T(532143243) are boundary conditions and cannot depend on any parameter. Each should be a fixed number (5, pi/e, sqrt(5) - i).
Sometimes it's best just to try things out rather than relying on a Theorem.
T(m) = 2T(m/2) + O(1)
T(1) = O(logn)
T(2) = 2T(1) = 2log(n)
T(4) = 2T(2) = 4log(n)
T(8) = 2T(4) = 8log(n)
T(16) = 2T(8) = 16log(n)
T(32) = 2T(16) = 32log(n)
T(m) = 2T(m/2) = mlog(n)
In conclusion, your initial question is indeed nonsensical as others have pointed out because you are attempting to calculate T(n) when the same n is used in T(1) = O(logn). But we can answer your second question that you have added as a comment.
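The experiment above can be written out (a sketch, with the O(1) term taken as a constant c = 1 and an arbitrary number standing in for the ill-posed base case): whatever value the base case has gets multiplied by m, just as in the table.

```python
def T(m, base, c=1):
    # Recurrence T(m) = 2T(m/2) + c, with an arbitrary value `base`
    # standing in for the (ill-posed) base case T(1)
    if m == 1:
        return base
    return 2 * T(m // 2, base, c) + c

# For m a power of two, T(m) = m*base + (m - 1)*c: the base case's
# value is multiplied by m, matching the table above
for k in range(1, 6):
    m = 2 ** k
    assert T(m, base=10) == m * 10 + (m - 1)
    print(m, T(m, base=10))
```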

Which recursive formula is more complex?

T(n) = 4T(n/2) + n
= O(n^2) using the master theorem.
Is the above more complex than the one below?
T(n) = 3T(n/4) + n^2
Both are O(n^2) using the master theorem,
but I do not know how to check the constant.
Hint: here's an easier question: which one has higher complexity, 4N^2 or 5N^2?
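The hidden constants can be estimated by iterating both recurrences (a sketch, assuming T(1) = 1 in both). Under these assumptions T(n) = 4T(n/2) + n tends to 2·n^2 while T(n) = 3T(n/4) + n^2 tends to (16/13)·n^2 ≈ 1.23·n^2, so the first is the more expensive of the two Θ(n^2) recurrences:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def Ta(n):
    # T(n) = 4T(n/2) + n, assuming T(1) = 1
    return 1 if n == 1 else 4 * Ta(n // 2) + n

@lru_cache(maxsize=None)
def Tb(n):
    # T(n) = 3T(n/4) + n^2, assuming T(1) = 1
    return 1 if n == 1 else 3 * Tb(n // 4) + n * n

# Powers of 4 so both recurrences bottom out cleanly at n = 1.
# Ta(n)/n^2 approaches 2; Tb(n)/n^2 approaches 16/13 ≈ 1.23:
for k in (2, 4, 6, 8):
    n = 4 ** k
    print(n, Ta(n) / n**2, Tb(n) / n**2)
```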