What is the time complexity of the following equation - algorithm

I'm having a few problems working out the time complexity using Big-O notation.
This is the equation:
88n^2 logn + 81 + 3n^3 + 12n
I can't figure it out; I'm guessing it's something like:
O(n^2 logn) or O(n^3)
Thanks in advance.

As you know, n grows faster than log n.
We also know that multiplying both sides of a comparison by the same factor preserves it.
So, multiplying both sides by n^2, we can say n^3 grows faster than n^2 log n.
=> O(n^3)

Since the growth rate of n is greater than the growth rate of log(n),
we can say n^3 grows faster than n^2 log(n).
So 88n^2 log n + 81 + 3n^3 + 12n => O(n^3)
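To see the domination concretely, here is a quick numeric check (a sketch in Python; the term labels are just for illustration):

import math

# Evaluate each term of 88n^2 log n + 81 + 3n^3 + 12n and print the
# share of the total that each term contributes as n grows.
for n in (10, 1_000, 100_000):
    terms = {
        "88n^2 log n": 88 * n**2 * math.log(n),
        "81": 81,
        "3n^3": 3 * n**3,
        "12n": 12 * n,
    }
    total = sum(terms.values())
    print(n, {name: f"{value / total:.1%}" for name, value in terms.items()})

# By n = 100,000 the 3n^3 term accounts for over 99% of the sum, which is
# why the whole expression collapses to O(n^3).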

Related

Explanation for Big-O notation comparison in different complexity class

Why can't Big-O notation compare algorithms in the same complexity class? Please explain; I cannot find any detailed explanation.
Saying an algorithm is O(n^2) only says that it requires at most on the order of n^2 operations. So, suppose you have algorithm A, which requires f(n) = 1000n^2 + 2000n + 3000 operations, and algorithm B, which requires g(n) = n^2 + 10^20 operations. They're both O(n^2).
For small n the first algorithm performs better than the second one. For big n the second algorithm looks better, since its leading term is 1 * n^2 while the first's is 1000 * n^2.
Also, h(n) = n is O(n^2) and k(n) = 5 is O(n^2). So, I can say that k(n) is better than h(n), because I know what these functions look like.
Now consider the case where I don't know what k(n) and h(n) look like. The only thing I'm given is k(n) ~ O(n^2) and h(n) ~ O(n^2). Can I say which function is better? No.
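To make this concrete, here is a minimal sketch in Python using the operation counts f and g from above; it shows where the two hypothetical algorithms trade places:

def f(n):  # operations performed by algorithm A
    return 1000 * n**2 + 2000 * n + 3000

def g(n):  # operations performed by algorithm B
    return n**2 + 10**20

# Both counts are O(n^2), yet which is cheaper depends entirely on n.
for n in (10**3, 10**8, 10**9):
    better = "A" if f(n) < g(n) else "B"
    print(f"n = {n:>13,}: algorithm {better} does fewer operations")

# Prints A, A, B: the crossover sits near n = 3.2 * 10^8, a fact Big O
# notation alone cannot reveal.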
Summary
You can't say which function is better, because Big O notation stands for "less than or equal". And the following is true:
O(1) is O(n^2)
O(n) is O(n^2)
How to compare functions?
There is Big Omega notation, which stands for "greater than or equal"; for example, f(n) = n^2 + n + 1 is Omega(n^2), Omega(n), and Omega(1). When a function's complexity matches an asymptotic bound exactly, Big Theta is used, so for the f(n) described above we can say that:
f(n) is O(n^3)
f(n) is O(n^2)
f(n) is Omega(n^2)
f(n) is Omega(n)
f(n) is Theta(n^2) // this is the only way we can describe f(n) using Theta notation
So, to compare the asymptotics of functions you need to use Theta instead of Big O or Omega.
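For reference, the definitions these statements rest on can be written out compactly (standard textbook definitions, in LaTeX):

f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \le c\,g(n) \ \text{for all}\ n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c\,g(n) \ \text{for all}\ n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))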

What is the Big-O of the function n^2/log(n)?

The time complexity of an algorithm is given by n^2/log(n).
What is that in Big-O notation? Just n^2, or do we keep the log?
Since n^2 / (n^2/log(n)) = log(n) goes to infinity as n grows, n^2/log(n) = o(n^2) (little-oh). Therefore, n^2/log(n) is not equivalent to n^2: keep the log and write O(n^2/log(n)).
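Spelled out as a limit, using the ratio from the answer above:

\lim_{n \to \infty} \frac{n^2}{n^2/\log n} = \lim_{n \to \infty} \log n = \infty
\quad \Longrightarrow \quad \frac{n^2}{\log n} = o(n^2)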

Time complexity of algorithm that has run time proportional to sum of logs [duplicate]

This question already has answers here:
Is log(n!) = Θ(n·log(n))?
(10 answers)
Closed 5 years ago.
I have an algorithm that has a run time proportional to log(1) + log(2) + ... + log(N). Clearly this algorithm runs in O(N log(N)) time. However, I have an intuition that there might be a tighter bound, because the one I've produced uses the value of the largest logarithm term to bound all of the logarithm terms, even though many of the terms are much smaller. Am I correct? Is there a tighter bound that is still simple to express arithmetically?
log(1) + log(2) + ... + log(N) will be log(1*2*3*4*5*...*N),
which is equal to log(N!).
By Stirling's approximation, log(N!) can be equated to log(N^N) up to lower-order terms,
and log(N^N) = N log N, since log(a^b) = b log(a),
so the sum is equivalent to N log N.
And to answer your other question: I can't see any tighter bound in this case.
I hope this helps.
Since log(a) + log(b) = log(ab), the complexity would be log(n!).
log(n!) = n log(n) - n + O(log(n))
For large n, the right side is dominated by the term n log(n). That implies that log(n!) = Theta(n log(n)).
We can prove log(n!) = O(n log n) using Stirling's approximation; you can read further at Why is log(n!) O(nlogn)?
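Here is a small numeric check of this (a sketch using Python's math.lgamma, which returns log(n!) without ever forming the huge factorial):

import math

# Compare the exact log(n!) with the leading Stirling terms n*log(n) - n.
for n in (10, 100, 10_000):
    exact = math.lgamma(n + 1)       # lgamma(n + 1) == log(n!)
    stirling = n * math.log(n) - n   # leading terms of Stirling's formula
    print(f"n = {n:>6}: log(n!) = {exact:,.1f}, "
          f"n log n - n = {stirling:,.1f}, ratio = {exact / stirling:.4f}")

# The ratio tends to 1, illustrating log(n!) = Theta(n log n).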

Time complexity of one recursive algorithm

here we have an algorithm
T(n) = n-1 + T(i-1) + T(n-i)
T(1) = 1
How do we calculate its time complexity?
i is between 1 and n
I can recognise this as the quicksort recurrence (randomized quicksort).
I am sure the question somehow missed the summation part (averaging over all pivot positions i).
Okay! You can use the substitution method here: check with O(n^2), and you will see that O(n^2) is the worst-case time complexity.
The average case is a bit tricky: the pivot can be any element from 1 to n, so analyse the expectation. Here also you can apply substitution with T(n) = O(n log n).
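To make the substitution step concrete for the worst case, guess T(n) <= c n^2 and verify it inductively; a sketch for the fixed split i = 2, with c = 1:

T(n) = n - 1 + T(1) + T(n-2) \le n + c(n-2)^2 = c n^2 - (4cn - 4c - n) \le c n^2 \quad \text{for } n \ge 2,

and the base case holds since T(1) = 1 \le c \cdot 1^2.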
I think we should solve it like this:
if i = 2, then we have
T(n) = n - 1 + T(1) + T(n-2) = Theta(n^2)
if i = n/2, then we have
T(n) = n - 1 + T(n/2 - 1) + T(n/2) = Theta(n log n)
so we have the upper bound O(n^2), and the algorithm is in the order of O(n^2)
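Both cases can also be checked numerically. Below is a rough simulation sketch in Python (the base case T(n <= 1) = 1 and the helper names t_worst/t_balanced are assumptions for illustration):

import sys
from functools import lru_cache

sys.setrecursionlimit(100_000)  # t_worst recurses to depth ~n/2

@lru_cache(maxsize=None)
def t_worst(n):  # fixed split i = 2: T(n) = n - 1 + T(1) + T(n - 2)
    if n <= 1:
        return 1
    return n - 1 + t_worst(1) + t_worst(n - 2)

@lru_cache(maxsize=None)
def t_balanced(n):  # balanced split i = n/2: T(n) = n - 1 + T(n/2 - 1) + T(n/2)
    if n <= 1:
        return 1
    return n - 1 + t_balanced(n // 2 - 1) + t_balanced(n // 2)

for n in (100, 1_000, 10_000):
    print(f"n = {n:>6}: worst = {t_worst(n):>10,}, balanced = {t_balanced(n):>8,}")

# The worst column grows roughly quadratically (Theta(n^2)); the balanced
# column grows like n log n (Theta(n log n)).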

Ordering functions by Big O complexity

I'm trying to order the following functions in terms of Big O complexity from low complexity to high complexity: 4^(log(N)), 2N, 3^100, log(log(N)), 5N, N!, (log(N))^2
This:
3^100
log(log(N))
2N
5N
(log(N))^2
4^(log(N))
N!
I figured this out just by using the chart given on Wikipedia. Is there a way of verifying the answer?
3^100 = O(1)
log log N = O(log log N)
(log N)^2 = O((log N)^2)
N, 2N, 5N = O(N)
4^(log N) = N^(log 4), i.e. O(N^2) when the log is base 2
N! = O(N!)
You made just one small mistake: (log(N))^2 grows more slowly than 2N and 5N, so it belongs before them. This is the right order.
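One way to verify the order without trusting a chart: compare how much each function grows when N doubles. The sketch below works with the logarithm of each function so that N! never overflows (an assumption here: the logs inside the original expressions are base 2):

import math

# Each lambda returns log(f(N)); the difference log f(2N) - log f(N) is 0
# for a constant and is largest for the fastest-growing function.
log_funcs = {
    "3^100": lambda n: 100 * math.log(3),
    "log(log(N))": lambda n: math.log(math.log2(math.log2(n))),
    "(log(N))^2": lambda n: 2 * math.log(math.log2(n)),
    "2N": lambda n: math.log(2 * n),
    "5N": lambda n: math.log(5 * n),
    "4^(log(N))": lambda n: math.log2(n) * math.log(4),
    "N!": lambda n: math.lgamma(n + 1),
}

n = 10**6
growth = {name: f(2 * n) - f(n) for name, f in log_funcs.items()}
for name in sorted(growth, key=growth.get):
    print(f"{name:>12}: log-growth on doubling = {growth[name]:.4f}")

# The printed order matches the answer above; 2N and 5N tie because they
# are in the same O(N) class.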
