Analysis of an algorithm and finding its time complexity

If T(n) = n√n, then:
1. T(n) = O(n)
2. T(n) = O(n log n)
3. T(n) = O(n^2)
4. None of the above
Which of the above options is correct? How do you find the order in this case?

The order is O(n^1.5), since n√n = n * n^0.5 = n^1.5.
In increasing order of growth: O(n) < O(n log n) < O(n^1.5) < O(n^2).
The tight bound n^1.5 is not among the first three options, so the answer is option 4 (none of the above).
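If you want a quick numeric sanity check, a small sketch like the following (my own illustration, not part of the original answer) prints the three bounds side by side and shows n√n = n^1.5 landing strictly between n log n and n^2 as n grows:

public class GrowthCheck {
    public static void main(String[] args) {
        for (long n : new long[]{1_000, 1_000_000, 1_000_000_000L}) {
            double nLogN = n * (Math.log(n) / Math.log(2)); // n log2 n
            double nSqrtN = n * Math.sqrt(n);               // n * sqrt(n) = n^1.5
            double nSquared = (double) n * n;               // n^2
            System.out.printf("n=%d: n log n=%.3e  n^1.5=%.3e  n^2=%.3e%n",
                    n, nLogN, nSqrtN, nSquared);
        }
    }
}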

Related

What will be the time complexity of this code?

I have this code:
int[] arr;
QuickSort(arr); // O(log n)
for (int i = 0; i < arr.length - 1; i++) {
    print(i); // O(n)
}
What is the time complexity of this? Is it O(n * log n) or O(n + log n)?
O(log n) + O(n) =
O(n + log n) =
O(n)
In complexity analysis you keep only the dominant term: since log n grows negligibly compared to n as n goes to infinity, the log n term disappears.
On the other hand, quicksort is an O(n²) algorithm in the worst case (O(n log n) on average), so I guess the real complexity should be:
O(n²) + O(n) =
O(n² + n) =
O(n²)
Where n is the size of the array.
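For reference, here is a runnable version of the snippet above (a sketch of mine, using the JDK's Arrays.sort on an int[] as a stand-in for the QuickSort call), with the usual bounds in the comments: the sort dominates at O(n log n) on average (O(n^2) worst case for a plain quicksort), and the O(n) print loop is absorbed into that bound.

import java.util.Arrays;

public class SortThenPrint {
    public static void main(String[] args) {
        int[] arr = {5, 2, 9, 1, 7};
        Arrays.sort(arr);                  // sorting: O(n log n) comparisons on average
        for (int i = 0; i < arr.length; i++) {
            System.out.println(arr[i]);    // the whole loop is O(n)
        }
        // Total: O(n log n) + O(n) = O(n log n)
    }
}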

How to calculate algorithm time complexity

I am trying to multiply two big integers with the Karatsuba algorithm.
I know that O(n) denotes time complexity and T(n) denotes the worst-case running time.
Can someone please explain why:
T(n) = 4T(n/2) + O(n) is O(n^2)
And
T(n) = 3T(n/2) + O(n) is O(n^1.59)
T(n) = 4T(n/2) + O(n)
According to the Master theorem:
T(n) is O(n^log_2(4)) = O(n^2)
and
T(n) = 3T(n/2) + O(n)
is
T(n) = O(n^log_2(3)) ≈ O(n^1.585)
so you can round it to O(n^1.59).
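To see where the recurrence T(n) = 3T(n/2) + O(n) comes from, here is a minimal Karatsuba sketch (my own illustration for nonnegative inputs, not code from the question; class and method names are made up) using java.math.BigInteger: each call performs three recursive multiplications on roughly half-size numbers, and the splits, shifts, additions and subtractions all take time linear in the number of bits.

import java.math.BigInteger;

public class Karatsuba {
    private static final int CUTOFF = 64; // below this, a plain multiply is fine

    static BigInteger multiply(BigInteger x, BigInteger y) {
        int n = Math.max(x.bitLength(), y.bitLength());
        if (n <= CUTOFF) {
            return x.multiply(y); // base case
        }
        int half = n / 2;
        // Split x = x1 * 2^half + x0 and y = y1 * 2^half + y0
        BigInteger x1 = x.shiftRight(half);
        BigInteger x0 = x.subtract(x1.shiftLeft(half));
        BigInteger y1 = y.shiftRight(half);
        BigInteger y0 = y.subtract(y1.shiftLeft(half));

        BigInteger high = multiply(x1, y1);                // 1st recursive call
        BigInteger low  = multiply(x0, y0);                // 2nd recursive call
        BigInteger mid  = multiply(x1.add(x0), y1.add(y0)) // 3rd recursive call
                .subtract(high).subtract(low);             // = x1*y0 + x0*y1

        // Recombine with O(n) shifts and additions:
        // x*y = high * 2^(2*half) + mid * 2^half + low
        return high.shiftLeft(2 * half).add(mid.shiftLeft(half)).add(low);
    }

    public static void main(String[] args) {
        BigInteger a = new BigInteger("123456789123456789123456789");
        BigInteger b = new BigInteger("987654321987654321987654321");
        System.out.println(multiply(a, b).equals(a.multiply(b))); // prints true
    }
}

With four recursive calls instead of three (multiplying x1*y1, x1*y0, x0*y1 and x0*y0 separately) you get T(n) = 4T(n/2) + O(n) and hence O(n^2), which is exactly the difference between the two recurrences above.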

Ordering functions by Big O complexity

I'm trying to order the following functions in terms of Big O complexity from low complexity to high complexity: 4^(log(N)), 2N, 3^100, log(log(N)), 5N, N!, (log(N))^2
This:
3^100
log(log(N))
2N
5N
(log(N))^2
4^(log(N))
N!
I figured this out just by using the chart given on Wikipedia. Is there a way of verifying the answer?
3^100 = O(1)
log log N = O(log log N)
(log N)^2 = O((log N)^2)
N, 2N, 5N = O(N)
4^(log N) = N^(log_2 4) = O(N^2) (taking log to base 2 and using a^(log_b N) = N^(log_b a))
N! = O(N!)
You made just one small mistake: (log N)^2 grows more slowly than any linear function, so it belongs before 2N and 5N. The list above is the right order.
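One of the trickier entries is 4^(log N). A tiny sketch of mine verifies the identity 4^(log_2 N) = N^(log_2 4) = N^2 numerically, which is what places it above the linear terms and just below N!:

public class IdentityCheck {
    public static void main(String[] args) {
        // Check 4^(log2 N) against N^2 for a few values of N
        for (double N : new double[]{16, 1_024, 1_000_000}) {
            double log2N = Math.log(N) / Math.log(2);
            System.out.printf("N=%.0f: 4^(log2 N)=%.1f, N^2=%.1f%n",
                    N, Math.pow(4, log2N), N * N);
        }
    }
}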

How to calculate T(n) = 3T(n/3) + O(lg n)

I know what O(lg n) and T(n) mean, but I don't know how to solve T(n) = 3T(n/3) + O(lg n). Should I expand it?
Just like:
T(n) = 3^2 * T(n/3^2) + 3*O(lg(n/3)) + O(lg n), and so on...
then I get
T(n) = 3^(log_3 n) * T(1) + 3^(log_3 n - 1) * lg(n/3^(log_3 n - 1)) + ... + 3*O(lg(n/3)) + O(lg n)
But how can I get the right answer, and is there an easier way to find it?
I think you can use the Master theorem.
T(n) = aT(n/b) + f(n)
Here a = 3, b = 3 and f(n) = O(log n).
Since n^(log_b a) = n^(log_3 3) = n, and f(n) = O(log n) grows more slowly than n^(1 - ε) for some ε > 0, case 1 of the theorem applies,
which gives the answer BigTheta(n).
For the Master theorem formula please see Wikipedia. There are three cases and they are quite simple.
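As a sanity check on the BigTheta(n) answer, here is a small sketch of mine that evaluates the recurrence numerically, taking T(1) = 1 and the O(lg n) term as exactly log2 n, and prints T(n)/n; the ratio settles toward a constant, which is what BigTheta(n) predicts:

public class RecurrenceCheck {
    static double T(long n) {
        if (n <= 1) return 1;
        return 3 * T(n / 3) + Math.log(n) / Math.log(2); // 3T(n/3) + lg n
    }

    public static void main(String[] args) {
        for (long n = 3; n <= 14_348_907; n *= 3) { // powers of 3 up to 3^15
            System.out.printf("n = %-10d T(n)/n = %.4f%n", n, T(n) / n);
        }
    }
}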

Big-O notation

Hey, I have a question.
Say t(n) = O(n log(n)) and you know that this is true.
Then you're given these statements and told to say whether they must be true or false: t(n) = n^4 and t(n) = O(n^4).
The statement t(n) = n^4 is false, while the statement t(n) = O(n^4) is true. Why?
You have to remember that when you write t(n) = O(n log(n)) and t(n) = O(n^4), what it actually means is that t(n) is in O(...), not that it is equal to it (O(...) is a set of functions, and a function cannot be equal to a set of functions). However, when you write t(n) = n^4, that means t(n) is equal to n^4.
Now if t(n) is in O(n log n), it is also in O(n^4), because O(n^4) is a superset of O(n log n). However, t(n) cannot be equal to n^4, because n^4 is not in O(n log n).
The idea of Big-O notation is that it represents an abstracted function of time: it focuses on the dominant part of your algorithm and ignores details that affect the actual execution time t(n) but don't change its growth rate.
For example, if your function works on a set of items of size n and just loops through them performing some calculation, you'd say t(n) = O(n). If you performed some operation on only a few of the elements according to some criterion, you would still say t(n) = O(n), but the actual time taken t(n) would not be a direct function of n, so an exact formula such as t(n) = n * x would not hold.
Look at the formal definition of Big-O notation. From the definition, t(n) = n^4 = O(n^4) is immediate.
On the other hand, n^4 = O(n log n) is false, so t(n) cannot be n^4: for every M > 0 and every x there exists n > x with n^4 > M(n log n).
(For n > M we have n^4 > M * n^3 = M * n * n^2 > M * n * log n = M(n log n), since n^2 > log n for all n ≥ 1.)
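To make the quantifiers concrete, here is a small sketch of mine that, for a few values of M, finds the point past which n^4 exceeds M(n log n); because n^3 / log n is increasing, the inequality keeps holding from that point on:

public class BigODemo {
    public static void main(String[] args) {
        for (double M : new double[]{10, 1_000, 1_000_000}) {
            long n = 2;
            while (Math.pow(n, 4) <= M * n * (Math.log(n) / Math.log(2))) {
                n++;
            }
            System.out.printf("M=%.0f: n^4 > M * n * log2(n) for all n >= %d%n", M, n);
        }
    }
}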
