I know what O(lg n) and T(n) mean, but in algorithm analysis I don't know how to solve T(n) = 3T(n/3) + O(lg n). Should I expand it?
Just like:
T(n) = 3^2 * T(n/3^2) + 3*O(lg(n/3)) + O(lg n), and so on...
then I get
T(n) = 3^(log_3 n) * T(1) + 3^(log_3 n - 1) * lg(n/3^(log_3 n - 1)) + ... + 3*lg(n/3) + lg n
But how can I get the right answer, and can I get an easy way to find it out?
I think you can use the Master theorem.
T(n)=aT(n/b) + f(n)
Here a = 3, b = 3 and f(n) = O(log n).
Since log n = O(n^(1-ε)) for any 0 < ε < 1, f(n) is polynomially smaller than n^(log_3 3) = n, so case 1 of the theorem applies,
which implies the answer is BigTheta(n).
For the Master theorem formula, please see Wikipedia. There are three cases and they are quite simple.
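As a sanity check on the Θ(n) answer, here is a small numeric sketch (my own, not from the question) that evaluates the recurrence with an assumed base case T(1) = 1, restricted to powers of 3 so n/3 stays exact. The ratio T(n)/n should settle near a constant:

```python
import math
from functools import lru_cache

# Evaluate T(n) = 3*T(n/3) + lg n with the assumed base case T(1) = 1.
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return 3 * T(n // 3) + math.log2(n)

for p in (5, 8, 11):
    n = 3 ** p
    print(n, T(n) / n)  # the ratio stabilizes, consistent with T(n) = Theta(n)
```

The lg n term contributes only a constant factor to the linear growth, which is exactly what case 1 of the Master theorem predicts.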
I am trying to find the time complexity (Big-Θ) of this algorithm:
Recursion(n):
    while n > 1:
        n = floor(n/2)
        Recursion(n)
I have found an upper bound of O(n) by considering the worst case which is when n is a power of 2.
However, I am having trouble finding a lower bound (Big-Ω) for this. My intuition is that this is Ω(n) as well, but I am not sure how to show this with the floor function in the way.
Any suggestions? Thank you!
EDIT: the main thing I'm not convinced of is that T(n/2) is equivalent to T(floor(n/2)). How would one prove this for this algorithm?
The floor function runs in constant time, O(1), so you can fold it into the constant work done per call. Modeling each call as one recursive call on floor(n/2) plus O(1) work, the recurrence is:
T(n) = T(n/2) + 1
T(n/2) = T(n/4) + 1
...
T(2) = T(1) + 1 --> T(1) = constant
T(n) = T(n/4) + 2
T(n) = T(n/8) + 3
...
T(n) = T(n/2^k) + k
Setting 2^k = n, i.e. k = log(n), gives:
T(n) = T(1) + log(n)
T(n) = log(n)
We can conclude that T(n) ∈ Θ(log n).
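To address the worry in the EDIT about floor(n/2) versus n/2: under the T(n) = T(n/2) + 1 model above, the number of halvings with floor is exactly floor(log2 n), so the floor changes nothing asymptotically. A quick check (my own sketch):

```python
import math

# Count how many times n is halved (with floor) before reaching 1,
# i.e. how many +1 terms the recurrence T(n) = T(floor(n/2)) + 1 accumulates.
def halvings(n):
    steps = 0
    while n > 1:
        n = n // 2      # floor(n/2); the floor itself is O(1)
        steps += 1
    return steps

for n in (10, 1000, 10**6):
    print(n, halvings(n), math.floor(math.log2(n)))  # the two counts match
```

Since halvings(n) = floor(log2 n) exactly, T(floor(n/2)) and T(n/2) lead to the same Θ(log n) bound.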
I am trying to multiply two big integers with the Karatsuba algorithm.
I know that O(n) is a time-complexity bound and T(n) is the worst-case time complexity.
Can someone please explain why:
T(n) = 4T(n/2) + O(n) is O(n^2)
And
T(n) = 3T(n/2) + O(n) is O(n^1.59)
T(n) = 4T(n/2) + O(n)
According to the Master theorem:
T(n) is O(n^log_2(4)) = O(n^2)
and
T(n) = 3T(n/2) + O(n)
is
T(n) is O(n^log_2(3)) ≈ O(n^1.585),
which is commonly rounded to n^1.59.
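The two recurrences correspond to the number of recursive multiplications: schoolbook splitting needs 4 half-size products (hence 4T(n/2), i.e. O(n^2)), while Karatsuba gets away with 3 (hence 3T(n/2), i.e. O(n^log_2 3)). A minimal sketch of the 3-multiplication trick on Python ints (function name and split point are my own choices, not from the question):

```python
# Karatsuba sketch: x = xh*2^m + xl, y = yh*2^m + yl, then
# x*y = a*2^(2m) + c*2^m + b using only THREE recursive products.
def karatsuba(x, y):
    if x < 2**32 or y < 2**32:
        return x * y                       # base case: machine-sized multiply
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)
    yh, yl = y >> m, y & ((1 << m) - 1)
    a = karatsuba(xh, yh)                  # high * high
    b = karatsuba(xl, yl)                  # low * low
    c = karatsuba(xh + xl, yh + yl) - a - b  # cross terms, one multiply
    return (a << (2 * m)) + (c << m) + b

x, y = 12345678901234567890, 98765432109876543210
print(karatsuba(x, y) == x * y)  # True
```

Dropping the fourth product is what turns log_2(4) = 2 in the exponent into log_2(3) ≈ 1.585.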
If T(n) = n·√n, then:
1. T(n) = O(n)
2. T(n) = O(n log n)
3. T(n) = O(n^2)
4. None of the above
Which of the above options is correct? How to find the order in this case?
The order is Θ(n^1.5), since n·√n = n^(3/2).
O(n) < O(n log n) < O(n^1.5) < O(n^2).
Note that T(n) = O(n^2) is technically a valid (but loose) upper bound; as a tight order, though, n^1.5 matches none of the first three options, so the answer is 4.
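A quick numeric check (my own sketch) that the ordering above is strict, by watching the ratio between consecutive classes grow without bound:

```python
import math

# Ratios between consecutive growth classes; each one grows with n,
# confirming O(n) < O(n log n) < O(n^1.5) < O(n^2) is strict.
for n in (10**2, 10**4, 10**6):
    print(n,
          (n * math.log2(n)) / n,        # = log2(n), grows
          n**1.5 / (n * math.log2(n)),   # = sqrt(n)/log2(n), grows
          n**2 / n**1.5)                 # = sqrt(n), grows
```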
I am looking for a recursive algorithm to evaluate what I call Factorial(m, n) = m·(m+1)·...·n, for every m ≤ n.
I appreciate any help.
What is the complexity of this algorithm?
Let T(n, m) be the time complexity of Factorial(n, m).
Let g(n) = Factorial(1, n) (= n!) and T"(n) be the time complexity of g(n), then:
T(n, m) <= T"(n) + T"(m - 1) for any n, m
and T"(n) = T"(n - 1) + O(1) which is O(n).
To sum up, T(n, m) = O(n) + O(m - 1) = O(n + m)
It will have the recurrence T(n) = T(n-1) + O(1) in the case of the function call Factorial(1, n).
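A direct recursive sketch matching the O(n − m) analysis above (the function name is my own; it assumes m ≤ n):

```python
# Factorial(m, n) = m*(m+1)*...*n as a simple recursion.
# Each call peels off one factor, so there are n - m + 2 calls: O(n - m) time.
def factorial_range(m, n):
    if m > n:
        return 1                      # empty product
    return m * factorial_range(m + 1, n)

print(factorial_range(3, 6))   # 3*4*5*6 = 360
print(factorial_range(1, 5))   # 5! = 120
```

With m = 1 this is just the usual recursive factorial, which is the T(n) = T(n-1) + O(1) case.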
I would like to solve the following recurrence relation:
T(n) = 2T(√n);
I'm guessing that T(n) = O(log log n), but I'm not sure how to prove this. How would I show that this recurrence solves to O(log log n)?
One idea would be to simplify the recurrence by introducing a new variable k such that 2^k = n. Then, the recurrence relation works out to
T(2^k) = 2T(2^(k/2))
If you then let S(k) = T(2^k), you get the recurrence
S(k) = 2S(k/2)
Note that this is equivalent to
S(k) = 2S(k/2) + O(1)
since 0 = O(1). Therefore, by the Master Theorem, we get that S(k) = Θ(k), since we have that a = 2, b = 2, and d = 0 and log_b a > d.
Since S(k) = Θ(k) and S(k) = T(2^k) = T(n), we get that T(n) = Θ(k). Since we picked 2^k = n, this means that k = log n, so T(n) = Θ(log n). This means that your initial guess of O(log log n) is incorrect and that the runtime is only logarithmic, not doubly-logarithmic. If there was only one recursive call being made, though, you would be right that the runtime would be O(log log n).
Hope this helps!
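You can also check the Θ(log n) conclusion empirically by counting the calls the recurrence makes. A small sketch (mine, with an assumed base case of n ≤ 2, and n chosen as 2^p with p a power of two so the repeated square roots stay exact in floating point):

```python
import math

# Count calls made by T(n) = 2T(sqrt(n)) with base case n <= 2.
# For n = 2^p the count comes out to exactly p = log2(n).
def calls(n):
    if n <= 2:
        return 1
    return 2 * calls(math.sqrt(n))

for p in (8, 16, 32):
    n = 2.0 ** p
    print(p, calls(n))  # calls(n) equals p, i.e. log2(n)
```

The call count grows like log n, not log log n; only the recursion *depth* is log log n, and with a single recursive call per level that depth would have been the whole cost.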
You can solve this easily by unrolling the recursion:
T(n) = 2T(n^(1/2)) = 4T(n^(1/4)) = ... = 2^k · T(n^(1/2^k))
Now the recurrence will finish when T(a) = n^(1/2^k) hits a constant a, and you can find the appropriate a. When a = 0 or 1 it does not make sense (n^(1/2^k) approaches 1 but never reaches it), but when a = 2 you will get:
n^(1/2^k) = 2  ⇒  2^k = lg n  ⇒  k = lg lg n
Substituting this k into the last part of the first equation gives T(n) = 2^(lg lg n) · T(2) = lg(n) · T(2), i.e. a complexity of O(log(n)).
Check other similar recurrences here:
T(n) = 2T(n^(1/2)) + log n
T(n) = T(n^(1/2)) + Θ(lg lg n)
T(n) = T(n^(1/2)) + 1