Comparing big theta values [closed] - asymptotic-complexity

I am trying to order these different big theta values from largest to smallest:
Θ(n²)
Θ(2n log n)
Θ(n log n²)
Θ(2n²)
Θ(log n)
Θ(n log 2n)
Θ(k²)
Θ(22n)
Θ(n³)
Θ(n)
Θ(2n)
Θ(n^1.5)
Θ(√n)
Θ(2n²)
and some of the values are equivalent. In particular, I want to know whether a constant factor makes one big-theta value larger than the same big-theta term without the constant (for example, are Θ(22n) and Θ(n) equivalent?).

Θ(log n)
Θ(√n) = Θ(n^(1/2))
Θ(n) = Θ(2n) = Θ(22n)
Θ(n log n) = Θ(2n log n) = Θ(n log n²) = Θ(n log 2n)
Θ(n^1.5)
Θ(n²) = Θ(2n²)
Θ(n³)
Considering your comment:
n log 2n = n (log 2 + log n) = n log 2 + n log n
log 2 is a constant non-zero value, so:
Θ(n log 2n) = Θ(n log 2 + n log n) = Θ(n + n log n) = Θ(n log n)
See the sum and multiplication by a constant properties of the big-{O,Theta, Omega}-notations.
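The identity can also be sanity-checked numerically: if f = Θ(g), the ratio f(n)/g(n) should settle near a positive constant as n grows. A minimal Python sketch (the names f and g are just illustrative):

```python
import math

def f(n):
    return n * math.log(2 * n)   # n log 2n

def g(n):
    return n * math.log(n)       # n log n

for n in (10**3, 10**6, 10**9):
    # the ratio drifts down toward 1, consistent with Θ(n log 2n) = Θ(n log n)
    print(n, f(n) / g(n))
```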

If you try replacing n with a huge value, you can work the ordering out yourself. From slowest-growing to fastest:
O(1)
O(log log n)
O(log n)
O(n^c) for a constant 0 < c < 1
O(n)
O(n log n)
O(n^2)
O(c^n) for a constant c > 1
O(n!)
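Plugging in a large n makes the hierarchy concrete. A minimal sketch, comparing in log space so that 2^n and n! do not overflow a float (the choices c = 0.5 and c = 2 are just illustrative):

```python
import math

n = 10**6
# log of each class's value at n, listed in the claimed order
log_vals = [
    ("O(1)",         0.0),
    ("O(log log n)", math.log(math.log(math.log(n)))),
    ("O(log n)",     math.log(math.log(n))),
    ("O(n^c)",       0.5 * math.log(n)),        # c = 0.5
    ("O(n)",         math.log(n)),
    ("O(n log n)",   math.log(n * math.log(n))),
    ("O(n^2)",       2 * math.log(n)),
    ("O(c^n)",       n * math.log(2)),          # c = 2
    ("O(n!)",        math.lgamma(n + 1)),       # log(n!)
]
# verify the list really is sorted by growth at this n
assert [v for _, v in log_vals] == sorted(v for _, v in log_vals)
```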

Is O(logn) = O(2^O(log logn))?
I tried taking the log of both sides:
log(log n) = log(2^(log log n))
log log n = (log log n)(log 2)
We can find a constant C > log 2 such that C·log log n > (log log n)(log 2).
So they are equal to each other. Am I right?
I think what you meant to ask is whether log n = O(2^(log log n)).
Think of O (big-O) as a <= operator, but the comparison is made asymptotically.
Now, to answer your question, we have to compare log n and 2^(log log n).
Asymptotic notation describes how a function scales as the input grows large.
Here, log usually means log base 2. By the definition of the logarithm, 2^(log₂ x) = x for any x > 0, so 2^(log log n) = log n: the two functions are identical, not just comparable.
It follows immediately that log n = O(2^(log log n)) (and the reverse bound holds too). More generally, the inner O in 2^(O(log log n)) admits 2^(c·log log n) = (log n)^c for a constant c, and log n = O((log n)^c) for every c ≥ 1. If you want to convince yourself, try computing both functions for very large values of n (like 10000 or 100000000).
NOTE: We do not compare asymptotic notations like you asked (O(logn) = O(2^O(log logn))). We compare functions (like log n) using these notations.
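With base-2 logarithms the two sides can be compared directly in code. A quick check that 2^(log₂ log₂ n) equals log₂ n (up to floating-point error), which is exactly the claimed O-bound:

```python
import math

for n in (2**10, 2**20, 2**40):
    # 2^(log2 x) = x, applied to x = log2(n)
    assert math.isclose(math.log2(n), 2 ** math.log2(math.log2(n)))
```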

How many elements can be sorted in Θ(log n) time using heap sort? [closed]

How many elements can be sorted in Θ(log n) time using heap sort?
When we do a heapsort, building the heap takes Θ(n) time and the sort itself takes O(n log n). I understand that concept. But for this question, we cannot even build a heap of n elements in Θ(log n) time. So is the answer O(1), taking the input size to be n?
I have also seen a different explanation which derives the complexity as Θ(log n / log log n) by taking the input size to be log n. I don't quite follow that method either. So which is the correct answer, and why?
I think the question is "assuming that there is a known value of n somewhere, what is the asymptotic bound on the size of an array, as a function of n, where sorting that array with heapsort will take time Θ(log n)?"
Sorting an array with k elements takes time Θ(k log k) as k grows. You want to choose k such that Θ(k log k) = Θ(log n). Choosing k = Θ(log n) doesn't necessarily work, since Θ(k log k) = Θ(log n log log n) ≠ Θ(log n). On the other hand, if you choose k = Θ(log n / log log n), then the runtime of the sort will be
Θ((log n / log log n) log (log n / log log n))
= Θ((log n / log log n) (log log n - log log log n))
= Θ(log n - log n log log log n / log log n)
= Θ(log n (1 - log log log n / log log n))
Notice that 1 - log log log n / log log n tends toward 1 as n goes to infinity, so the above expression actually is Θ(log n), as required.
Therefore, if you try to sort an array of size Θ(log n / log log n) using heap sort, as a function of n, the runtime is Θ(log n).
Hope this helps!
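A quick numeric sketch of the identity in the derivation above: with k = log n / log log n, (k log k) / log n = 1 − (log log log n)/(log log n), which tends to 1 very slowly. Parametrizing by L = log n lets astronomically large n fit in a float:

```python
import math

def ratio(L):                      # (k log k) / log n, with L = log n
    k = L / math.log(L)
    return k * math.log(k) / L

def predicted(L):                  # 1 - (log log log n)/(log log n)
    ll = math.log(L)
    return 1 - math.log(ll) / ll

for L in (1e2, 1e6, 1e18):
    assert math.isclose(ratio(L), predicted(L))
    print(L, ratio(L))             # creeps up toward 1 as L grows
```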
To sort N elements, heapsort takes O(N log N) time.
Loosely speaking, that means each element accounts for O(log N) of the work on average.

When c > 0, log(n) = O(n^c)? Not sure why it isn't O(log n)

In my homework, the question asks to determine the asymptotic complexity of n^0.99999 · log(n). I figured that it would be closer to O(n log n), but the answer key suggests that when c > 0, log n = O(n^c). I'm not quite sure why that is; could someone provide an explanation?
It's also true that lg n = O(n^k) (in fact, it is o(n^k); did the hint actually say that, perhaps?) for any constant k > 0, not just 1. Now consider k = 0.00001. Then n^0.99999 · lg n = O(n^0.99999 · n^0.00001) = O(n). Note that this bound is not tight, since I could choose an even smaller k, so it's perfectly fine to say that n^0.99999 · lg n is O(n^0.99999 · lg n), just as we say n lg n is O(n lg n).
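The bound can be illustrated numerically. The ratio r(n) = (n^0.99999 · lg n)/n = lg(n)/n^0.00001 does go to 0, but only after growing to an enormous peak first, which is why the bound feels surprising. Writing n = 10^e keeps huge n representable (r is a name chosen here for the sketch):

```python
import math

def r(e):
    # r(10**e) = lg(10**e) / (10**e)**0.00001 = e*log2(10) / 10**(0.00001*e)
    return e * math.log2(10) / 10 ** (0.00001 * e)

print(r(6))        # still growing at n = 10**6
print(r(43429))    # near the peak (e ~ 1/(0.00001 * ln 10))
print(r(10**6))    # far past the crossover, heading to 0
```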

Big Oh Classification [closed]

Say that I have a function that has the following growth:
2000n^2 + log n
Is it possible for me to conclude that the function is part of the set of functions that fall into the category of O(n)?
O(some function) describes the limiting behavior of a function.
Does there exist some constant C such that C·n is an upper bound on your function for all sufficiently large n?
No: whatever C you pick, 2000n² + log n eventually exceeds C·n, because 2000n²/n = 2000n grows without bound.
So it is not O(n).
No. Since log n grows more slowly than n^x for any fixed x > 0, O(2000n² + log n) = O(n²).
An easier way to see this: because log n = O(n²), we have 2000n² + log n ≤ 2000n² + n² = 2001n² for large n, so 2000n² + log n = O(n²). And since 2000n² + log n contains an n² term, it is at least as big as n², i.e. it is also Ω(n²). Putting the two together, 2000n² + log n = Θ(n²).
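The conclusion is easy to check numerically: the ratio of the function to n² should approach the constant 2000, which is exactly the Θ(n²) statement (f is just an illustrative name):

```python
import math

def f(n):
    return 2000 * n**2 + math.log(n)

for n in (10, 10**3, 10**6):
    # the log n term vanishes relative to n^2, so the ratio tends to 2000
    print(n, f(n) / n**2)
```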

Why is the following sequence of functions ordered by asymptotic growth rates? [closed]

Order the following expressions in increasing Θ-order. If two functions are of the same order of growth, you should state this fact.
n log n, n⁻¹, log n, n^(log n), 10n + n^(3/2), π^n, 2^n, 2^(log n), 2^(2^(log n)), log n!
Can someone explain to me why the following answer is correct?
n⁻¹ ≪ log n ≪ 2^(log n) ≪ n log n = log n! ≪ 10n + n^(3/2) ≪ n^(log n) ≪ 2^n = 2^(2^(log n)) ≪ π^n
You should use the facts that:
lim(n→∞) f(n)/g(n) = 0 gives you Θ(f(n)) < Θ(g(n))
lim(n→∞) f(n)/g(n) = c, c > 0 gives you Θ(f(n)) = Θ(g(n))
lim(n→∞) f(n)/g(n) = ∞ gives you Θ(f(n)) > Θ(g(n))
Now using that you get:
lim(n→∞) n⁻¹ / log n = lim(n→∞) 1/(n log n) = 0.
This immediately gives you Θ(n⁻¹) < Θ(log n).
Go on with the remaining.
For some of the calculations you might find L'Hôpital's rule helpful.
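The limit test can also be run numerically on a few pairs from the list. A minimal sketch (natural logs; log n! computed via math.lgamma): a ratio heading to 0 means the numerator is in a strictly smaller Θ-class, while a ratio near a positive constant means the same class.

```python
import math

pairs = [
    ("n^-1 / log n",     lambda n: (1 / n) / math.log(n)),
    ("log n / n",        lambda n: math.log(n) / n),     # 2^(log2 n) = n
    ("n log n / log n!", lambda n: n * math.log(n) / math.lgamma(n + 1)),
]
for name, ratio in pairs:
    # first two ratios tend to 0; the last tends to 1 (Stirling)
    print(name, ratio(10**9))
```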
It has been a long time since I thought about these concepts (and I'm sure others will correct me if this is wrong), but I don't agree with your answer, reading the expressions literally, without implied exponents. First off, Θ means bounded above and below by that function, so constant factors don't matter. On that reading, 10n + (3/2)n, 2n, and πn are all in the same Θ class, and log n, 2 log n, and 22 log n are also all in the same class. To see that n log n is in the same class as log n!, use Stirling's approximation. Thus you get:
log n = 2 log n = 22 log n ≪ n − 1 = 10n + (3/2)n = 2n = πn ≪ n log n = log n!
Now if 'n3/2' means n^(3/2) rather than (3/2)·n, then the order would be:
log n = 2 log n = 22 log n ≪ n − 1 = 2n = πn ≪ n log n = log n! ≪ 10n + n^(3/2)
