Ordering functions by asymptotic growth rate (Big-O)

Here is how I have ordered the functions, in increasing order of asymptotic growth rate. I have also simplified some of them by applying logarithm rules.
log( log n )
sqrt( log n )
log n^3 (which equals 3 log n, i.e. Θ(log n))
n^(2/3)
2^(log n) (which equals n when the log is base 2)
n log n
n^2
Is this order correct? Or am I missing something?

O( A(n) ) < O( B(n) ) holds iff A(n) / B(n) approaches 0 as n goes to infinity.
You can check your table here: https://www.wolframalpha.com/input/?i=limit+log%28log%28n%29%29+%2F+sqrt%28log%28n%29%29%2C+n+to+infinity
For example
log(log(n)) / sqrt(log(n)) -> 0 for n -> inf
Hence O(log(log(n))) < O(sqrt(log(n))).
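If you would rather check the whole table offline, here is a small sketch (assuming Python with SymPy is available; the variable names are mine) that applies the same ratio test to each adjacent pair in the proposed ordering:

    # Sketch: apply the ratio test lim A(n)/B(n) -> 0 to each adjacent
    # pair in the proposed ordering (requires SymPy).
    from sympy import Rational, limit, log, oo, sqrt, symbols

    n = symbols('n', positive=True)

    ordering = [
        log(log(n)),         # log(log n)
        sqrt(log(n)),        # sqrt(log n)
        log(n),              # log n^3 simplifies to 3 log n
        n**Rational(2, 3),   # n^(2/3)
        n,                   # 2^(log n) simplifies to n
        n*log(n),            # n log n
        n**2,                # n^2
    ]

    for a, b in zip(ordering, ordering[1:]):
        # A limit of 0 confirms that a grows strictly slower than b.
        print(a, '/', b, '->', limit(a/b, n, oo))

Every printed limit should come out as 0, which confirms the ordering.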

Related

Question regarding asymptotic runtime behavior

We know that log(n) = O(sqrt n).
I am wondering whether it is valid to say that log(n) is Theta(sqrt n).
Numerically, I convinced myself that it is right, yet I am not too sure about it.
I would like some help.
log n is NOT in Theta(sqrt n), since sqrt n is asymptotically greater than log n, meaning that log n isn't in Omega(sqrt n). In other words, sqrt n cannot be an asymptotic lower bound for log n.
Refer to this definition of big theta. Substitute sqrt n for g(n) and log n for f(n) in the definition and you will see that you can easily find a k2 and n0 such that the definition is satisfied (which is why log n is in O(sqrt n)), while finding a suitable k1 will prove impossible (which is why log n is NOT in Omega(sqrt n)).
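A quick numerical sketch of why no suitable k1 exists (plain Python; the constants tried are arbitrary): whatever k1 > 0 you pick, log n eventually falls below k1*sqrt(n), so sqrt n cannot be an asymptotic lower bound.

    # Sketch: for any candidate lower-bound constant k1 > 0, log(n)
    # eventually drops below k1*sqrt(n), so log n is not Omega(sqrt n).
    import math

    for k1 in (1.0, 0.1, 0.01):
        n = 2
        while math.log(n) >= k1 * math.sqrt(n):
            n *= 2
        print(f"k1 = {k1}: log(n) < k1*sqrt(n) already at n = {n}")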

O( log n! ) and O( (log n )! )

To find their relation I substituted log n = x and used log n! ≈ n(log n), so with base a, O( log n! ) became a^x * x and (log n)! became x(x-1)(x-2)....
Now I think the first one has the higher growth rate. But can you help me relate them to O(n^2)?
Actually x(x-1)(x-2)···1 has x factors, so it grows like x^x (up to lower-order factors). This means that O((log n)!) has the higher growth rate.
Also, if log(n) := x, then n = 2^x, and n^2 becomes (2^x)^2 = 2^(2x), which grows more slowly than x^x.
Summary
O(log n!) < O(n^2) < O((log n)!)
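A numeric spot-check of that summary, as a sketch only (I use base-2 logs, an arbitrary choice, and read (log n)! through the gamma function since log n is rarely an integer):

    # Sketch: spot-check O(log n!) < O(n^2) < O((log n)!) at a few points.
    # Base-2 logs throughout; (log n)! is interpreted via the gamma function.
    import math

    for n in (10**3, 10**6, 10**9):
        log_fact_n = math.lgamma(n + 1) / math.log(2)   # log2(n!) ~ n*log2(n)
        n_squared = float(n)**2                         # n^2
        fact_log_n = math.gamma(math.log2(n) + 1)       # (log2 n)!
        print(f"n = {n:>10}: log(n!) = {log_fact_n:.3g}, "
              f"n^2 = {n_squared:.3g}, (log n)! = {fact_log_n:.3g}")

At each of these sample points the three values already appear in the claimed order.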

Is complexity O(log(n)) equivalent to O(sqrt(n))?

My professor just taught us that, as a rule of thumb, any operation that halves the length of the input has O(log(n)) complexity. Why is it not O(sqrt(n))? Aren't the two equivalent?
They are not equivalent: sqrt(N) will increase a lot more quickly than log2(N). There is no constant C such that you would have sqrt(N) < C·log(N) for all values of N greater than some minimum value.
An easy way to grasp this, is that log2(N) will be a value close to the number of (binary) digits of N, while sqrt(N) will be a number that has itself half the number of digits that N has. Or, to state that with an equality:
        log2(N) = 2log2(sqrt(N))
So you need to take the logarithm(!) of sqrt(N) to bring it down to the same order of complexity as log2(N).
For example, for a binary number with 11 digits, 0b10000000000 (= 2^10 = 1024), the square root is 0b100000 (= 32), but the logarithm is only 10.
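To see that digit-count intuition in code, here is a tiny sketch (plain Python; the sample sizes are mine): the logarithm tracks the number of binary digits, while the square root still has roughly half as many digits as N.

    # Sketch: log2(N) is close to the number of binary digits of N,
    # while sqrt(N) itself still has about half as many digits as N.
    import math

    for bits in (11, 21, 41):
        N = 1 << (bits - 1)          # smallest N with this many binary digits
        r = math.isqrt(N)
        print(f"N has {N.bit_length()} bits, log2(N) = {math.log2(N):.0f}, "
              f"sqrt(N) = {r} ({r.bit_length()} bits)")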
Assuming natural logarithms (otherwise just multiply by a constant), we have
lim {n->inf} log n / sqrt(n) = (inf / inf)
= lim {n->inf} 1/n / 1/(2*sqrt(n)) (by L'Hospital)
= lim {n->inf} 2*sqrt(n)/n
= lim {n->inf} 2/sqrt(n)
= 0 < inf
Refer to https://en.wikipedia.org/wiki/Big_O_notation for an alternative definition of O(.); from the limit above we can say log n = O(sqrt(n)).
Also, comparing the growth of the two functions directly, log n is upper bounded by sqrt(n) for all n > 0.
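If you want to see that bound numerically, here is a short sketch (plain Python, natural logs as above):

    # Sketch: the ratio log(n)/sqrt(n) from the limit above, evaluated
    # at a few points; it visibly heads toward 0 as n grows.
    import math

    for n in (10, 10**3, 10**6, 10**9):
        print(f"n = {n:>10}: log(n)/sqrt(n) = {math.log(n)/math.sqrt(n):.5f}")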
Just compare the two functions:
sqrt(n) ---------- log(n)
n^(1/2) ---------- log(n)
Take the log of both:
log( n^(1/2) ) --- log( log(n) )
(1/2) log(n) ----- log( log(n) )
It is clear that: const · log(n) > log(log(n))
No, it's not equivalent.
@trincot gave an excellent explanation with an example in his answer. I'm adding one more point. Your professor taught you that
any operation that halves the length of the input has an O(log(n)) complexity
It's also true that,
any operation that reduces the length of the input by 2/3rds has an O(log3(n)) complexity
any operation that reduces the length of the input by 3/4ths has an O(log4(n)) complexity
any operation that reduces the length of the input by 4/5ths has an O(log5(n)) complexity
and so on...
It is true in general: any operation that reduces the length of the input by (B-1)/B has a complexity of O(logB(n)).
N.B.: logB(n) means the base-B logarithm of n.
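A small sketch of that claim (plain Python; the function name and the test value are mine): count how many steps it takes to shrink the input to size 1 when each step keeps only 1/B of it.

    # Sketch: counting reduction steps; the count tracks log_B(n).
    import math

    def steps_to_one(n, B):
        steps = 0
        while n > 1:
            n //= B        # keep 1/B of the input, i.e. discard (B-1)/B of it
            steps += 1
        return steps

    n = 1_000_000
    for B in (2, 3, 4, 5):
        print(f"B = {B}: {steps_to_one(n, B)} steps, "
              f"log_{B}(n) = {math.log(n, B):.1f}")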
One way to approach the problem is to compare the rates of growth of the two functions, i.e. their derivatives:
(1) d/dn sqrt(n) = 1/(2*sqrt(n))
(2) d/dn log(n) = 1/n
As n increases we see that (2) is less than (1): when n = 10,000, (1) equals 0.005 while (2) equals 0.0001.
Hence log(n) grows more slowly, and is the better complexity, as n increases.
No, they are not equivalent; you can even prove that
O(n**k) > O(log(n, base))
for any k > 0 and base > 1 (k = 1/2 in case of sqrt).
When talking about O(f(n)) we want to investigate the behaviour for large n;
limits are a good means for that. Suppose both big Os were equivalent:
O(n**k) = O(log(n, base))
which would mean there is some finite constant C such that
n**k <= C * log(n, base)
starting from some large enough n; to put it in other terms (log(n, base) is not 0 for large n, and both functions are continuously differentiable):
lim(n**k/log(n, base)) = C
n->+inf
To find the limit's value we can use L'Hospital's rule, i.e. take the derivatives of numerator and denominator and divide them (note that d/dn log(n, base) = 1/(n*ln(base))):
lim(n**k/log(n, base)) =
lim([k*n**(k-1)]/[1/(n*ln(base))]) =
ln(base) * k * lim(n**k) = +infinity
so we can conclude that there is no finite constant C such that n**k <= C*log(n, base) for all large n, or in other words
O(n**k) > O(log(n, base))
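A quick numeric sketch of that conclusion (plain Python; k and the base are the ones from the sqrt case): the ratio n**k / log(n, base) just keeps growing.

    # Sketch: even for k = 1/2 the ratio n**k / log(n, base) grows
    # without bound (base 2 here), matching the limit above.
    import math

    k, base = 0.5, 2
    for n in (10**2, 10**4, 10**8, 10**16):
        ratio = n**k / math.log(n, base)
        print(f"n = 10^{round(math.log10(n)):<2}  n**k / log(n, base) = {ratio:,.1f}")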
No, it isn't.
When we are dealing with time complexity, we think of the input as a very large number. So let's take n = 2^18. For sqrt(n) the number of operations will be 2^9, and for log(n) it will be 18 (taking log base 2 here). Clearly 2^9 is much, much greater than 18.
So, we can say that O(log n) is smaller than O(sqrt n).
To prove that sqrt(n) grows faster than lg n (base 2), you can take the limit of the second over the first and show that it approaches 0 as n approaches infinity.
lim(n->inf) of (lg n / sqrt(n))
Applying L'Hopital's rule:
= lim(n->inf) of (2 / (sqrt(n)*ln 2))
Since sqrt(n) increases without bound as n increases, while 2 and ln 2 are constants, this proves
lim(n->inf) of (2 / (sqrt(n)*ln 2)) = 0

Can O(log(n*log n)) be considered as O(log n)?

Suppose I get f(n) = log(n*log n). Should I say that it's O(log(n*log n))?
Or should I write log(n*log n) = log n + log(log n) and then say that the function f(n) is O(log n)?
First of all, as you have observed:
log(n*log n) = log(n) + log(log(n))
but think about log(log N) as N gets large (as Floris suggests).
For example, let N = 1000; then log N = 3 (a small number) and log(3) is even smaller.
This keeps holding as N gets huge, i.e. way beyond the number of instructions your code could ever execute.
Thus, O(log(n * log n)) = O(log n + log(log n)) = O(log n), since the log(log n) term is dominated by log n.
Another way to look at it: n * log n << n^2, so in the worst case
log(n^2) >= log(n * log n)
So 2*log(n) is an upper bound, and O(log(n * log n)) = O(log n).
Use the definition. If f(n) = O(log(n*log(n))), then there must exist a positive constant M and real n0 such that:
|f(n)| ≤ M |log(n*log(n))|
for all n > n0.
Now let's assume (without loss of generality) that n0 ≥ 2. Then
log(n) ≥ log(log(n))
for all n > n0.
From this, we have:
log(n*log(n)) = log(n) + log(log(n)) ≤ 2 * log(n)
Substituting, we find that
|f(n)| ≤ 2*M*|log(n)| for all n > n0
Since 2*M is also a positive constant, it immediately follows that f(n) = O(log(n)).
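For a sanity check of the inequality that does the work here, a tiny sketch (plain Python, base-2 logs, and my own sample points):

    # Sketch: log(n*log(n)) <= 2*log(n) at a few points (base-2 logs);
    # this is the bound used with the constant 2*M above.
    import math

    for n in (2, 16, 1024, 10**6, 10**12):
        lhs = math.log2(n * math.log2(n))
        rhs = 2 * math.log2(n)
        print(f"n = {n:>13}: log(n*log n) = {lhs:8.3f}  <=  2*log(n) = {rhs:8.3f}")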
Of course in this case simple transformations show both functions differ by a constant factor asymptotically, as shown.
However, I feel it is worthwhile to recall a classic test for analyzing how two functions relate to each other asymptotically. So here is a slightly more formal proof.
You can check how f(x) relates to g(x) by analyzing lim f(x)/g(x) as x->infinity.
There are 3 cases:
lim = infinity <=> O(f(x)) > O(g(x))
inf > lim > 0 <=> O(f(x)) = O(g(x))
lim = 0 <=> O(f(x)) < O(g(x))
So
lim ( log( n * log(n) ) / log n ) =
lim ( log n + log log (n) ) / log n =
lim 1 + log log (n) / log n =
1 + 0 = 1
Note: I took lim log log(n) / log(n) = 0 as known, but you can verify it with l'Hospital's rule.
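The same limit can also be eyeballed numerically; a small sketch (plain Python, natural logs, sample points mine):

    # Sketch: the ratio log(n*log n) / log(n); it drifts toward 1
    # (slowly, since log log n / log n shrinks slowly).
    import math

    for n in (10**2, 10**4, 10**8, 10**16):
        ratio = math.log(n * math.log(n)) / math.log(n)
        print(f"n = 10^{round(math.log10(n)):<2}  ratio = {ratio:.4f}")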

Big O Question - Algorithmic Analysis

I am revising for an exam and found this problem on the internet; I was wondering how I would go about solving it.
(With base 2 logs)
Prove that log(2n) is a member of O(log n).
I have given it a go but am not sure if I am right as no answer has been provided. Could you please help?
Here is my attempt:
log 2n - c log n ≤ 0
log 2 + log n - c log n ≤ 0
1 + (1-c) log n ≤ 0
(I then divided by the log n.)
Example: n = 8 and c = 10 evaluates to less than zero. Therefore it is true.
My questions are:
Am I doing this right?
Can my answer be simplified further?
lg(2n) = lg(2) + lg(n).
lg(2) is a constant. See Wikipedia, Logarithmic identities.
The long answer is that
lim n->infinity log(2n)/log(n) = lim (log(2) + log(n))/log(n) = lim (log(2)/log(n) + 1) = 0 + 1 = 1
Because the limit of the ratio of the two functions exists and is a finite nonzero constant, they have the same asymptotic complexity.
In the same way, to prove that O(n^2) is not O(n), you would do
lim n->infinity (n^2 / n) = lim n which tends to infinity
Doing this for O(n) vs. O(log n) requires more work because
lim n->infinity (n / log n)
needs to be handled somehow. The trick is that you can use the derivatives instead, as the derivatives in the limit also need to be asymptotically related (otherwise their integrals, i.e. the original functions, are not). You take the derivative of n, which is 1, and that of log n, which is n^(-1) = 1/n, after which
lim n->infinity (1 / (1 / n)) = lim n which tends to infinity
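To tie the three comparisons together, a short sketch (plain Python, base-2 logs, sample points mine) of the ratios discussed above:

    # Sketch: log(2n)/log(n) settles near 1, while n^2/n and n/log(n)
    # keep growing, mirroring the three limits above.
    import math

    for n in (10**2, 10**6, 10**12):
        print(f"n = 10^{round(math.log10(n)):<3}"
              f" log(2n)/log(n) = {math.log2(2*n)/math.log2(n):.3f}"
              f"  n^2/n = {float(n):.1e}"
              f"  n/log(n) = {n/math.log2(n):.3e}")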
