Is O(log n) = O(2^O(log log n))?
I tried to take the log of both sides:
log log n = log(2^(log log n))
log log n = log log n · log 2
We can find a constant C > log 2 such that C · log log n > log log n · log 2.
So they are equal to each other. Am I right?
I think what you want to ask is whether log n = O(2^(log log n)).
Think of O (big-O) as a <= operator, but the comparison is made asymptotically.
Now, to answer your question, we have to compare log n and 2^(log log n).
We use asymptotic notation when we want to see how much an algorithm will scale as the input grows very large.
log n is a logarithmic function.
2^(log log n) looks exponential, but notice that its exponent is only log log n, which grows extremely slowly. In fact, with base-2 logarithms, 2^(log log n) = log n exactly, because 2^(log2(x)) = x for any x.
So the two functions coincide, and in particular each is big-O of the other. If you want to convince yourself, try computing both functions for very large values of n (like 10000 or 100000000).
So it can be very easily inferred that log n = O(2^(log log n)).
NOTE: We do not nest asymptotic notations inside one another like you asked (O(log n) = O(2^O(log log n))). We compare functions (like log n) using these notations.
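As a quick numeric sanity check (a Python sketch of my own, using only the standard math module): with base-2 logarithms, 2^(log2(log2(n))) works out to exactly log2(n).

```python
import math

# With base-2 logarithms, 2**(log2(log2(n))) collapses to log2(n),
# so log n and 2**(log log n) are the same function.
for n in [2**8, 2**16, 2**32, 2**64]:
    lhs = math.log2(n)                  # log n
    rhs = 2 ** math.log2(math.log2(n))  # 2^(log log n)
    print(n, lhs, rhs)
```

For n = 2^16, for example, both sides come out to 16.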
Related
I am designing an algorithm for a problem where I have two sets A and B of points, with n and m points respectively. I have two algorithms for the sets, with complexities O(n log n) and O(m), and I am now wondering whether the complexity of both algorithms combined is O(n log n) or O(m).
Basically, I am wondering whether there is some relation between m and n which would result in O(m).
If m and n are truly independent of one another and neither quantity influences the other, then the runtime of running an O(n log n)-time algorithm and then an O(m)-time algorithm will be O(n log n + m). Neither term dominates the other in general - if n gets huge compared to m, the n log n part dominates, and if m is huge relative to n, the m term dominates.
This gets more complicated if you know how m and n relate to one another in some way. Many graph algorithms, for example, use m to denote the number of edges and n to denote the number of nodes. In those cases, you can sometimes simplify these expressions, but sometimes cannot. For example, the cost of implementing Dijkstra’s algorithm with a Fibonacci heap is O(m + n log n), the same as what we have above.
The size of your input is x := m + n.
The complexity of the combined algorithm (assuming each sub-algorithm is performed at most a constant number of times) is:
O(n log n) + O(m) = O(x log x) + O(x) = O(x log x).
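To see why the x log x bound absorbs both terms, here is a small numeric sanity check (my own sketch, not part of the answer): with x = n + m, we always have n·log2(n) + m ≤ 2·x·log2(x), since n·log2(n) ≤ x·log2(x) and m ≤ x ≤ x·log2(x) once x ≥ 2.

```python
import math
import random

# Check n*log2(n) + m <= 2 * x * log2(x) with x = n + m,
# for a range of random (n, m) pairs.
random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10**6)
    m = random.randint(1, 10**6)
    x = n + m
    combined = n * math.log2(n) + m     # n >= 1, and log2(1) = 0 is fine
    bound = 2 * x * math.log2(x)
    assert combined <= bound, (n, m)
print("bound holds for all samples")
```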
Yes: if m ~ n^n, then O(log m) = O(n log n).
There is a log identity:
log(b^c) = c · log(b), so log(n^n) = n · log(n).
EDIT:
For both algorithms combined, the big-O is always the larger term, because we are concerned with the asymptotic upper bound.
So it will depend on the values of n and m. E.g.: while n^n < m, the complexity is O(log(m)); after that it becomes O(n log(n)).
For big-O notation we are only concerned about large values, so if n^n >>>> m then it is O(n log(n)), else if m >>>> n^n then it is O(log(m)).
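The identity log(b^c) = c·log(b) is easy to check numerically (a quick sketch of my own; Python's math.log accepts big integers, so we can compare log(n^n) against n·log(n) directly even when n^n overflows a float).

```python
import math

# log(n^n) = n * log(n): compare the log of the exact big integer n**n
# with n * log(n) computed directly.
for n in [10, 50, 200]:
    exact = math.log(n ** n)    # n**n is a big int; math.log handles it
    ident = n * math.log(n)
    print(n, exact, ident)
```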
We know that log(n) = O(sqrt(n)).
I am wondering if it is valid to say that log(n) is Theta(sqrt(n)).
Numerically, I convinced myself that it is right, yet I am not too sure about it.
I would like some help.
log n is NOT in Theta(sqrt n), since sqrt n is asymptotically greater than log n, meaning that log n isn't in Omega(sqrt n). In other words, sqrt n cannot be an asymptotic lower bound for log n.
Refer to this definition of big-Theta. Substitute sqrt n for g(n) and log n for f(n) in the definition and you will see that you can easily find a k2 and n0 such that the definition is satisfied (which is why log n is in O(sqrt n)), while finding a suitable k1 will prove impossible (which is why log n is NOT in Omega(sqrt n)).
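A numeric illustration (my own sketch): the ratio sqrt(n)/log(n) keeps growing, so no constant k1 with k1·sqrt(n) ≤ log(n) can hold for all large n.

```python
import math

# sqrt(n) / log(n) grows without bound, so sqrt(n) cannot be an
# asymptotic lower bound (up to a constant) for log(n).
for n in [10**2, 10**4, 10**8, 10**12]:
    print(n, math.sqrt(n) / math.log(n))
# Each ratio is larger than the previous one.
```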
If I have an algorithm that runs in log(n^(5/4)!) time, how can I represent this in terms of log(n)? I know that log(n!) is asymptotically equal to n log(n), but does the 5/4 change anything, and if so, how?
Good question! As you noted log(n!) = O(n log n). From this it follows that
log(n^{5/4}!) = O(n^{5/4} log n^{5/4}) = O(n^{5/4} log n)
The last equality follows because log n^{5/4} = (5/4)*log n.
So you can simplify the expression to O(n^{5/4} log n).
The answer is yes, the factor 5/4 in the exponent matters: the function n^{5/4} grows asymptotically faster than n, so you can't ignore it. (This follows from the fact that n^{5/4}/n = n^{1/4}, which is unbounded.)
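You can check this numerically with math.lgamma, since lgamma(N + 1) = log(N!) (a sketch of my own, not part of the answer): the ratio log(n^{5/4}!) / (n^{5/4} · log n) creeps up toward 5/4 as n grows.

```python
import math

# lgamma(N + 1) == log(N!), so we can evaluate log((n^(5/4))!) without
# building the factorial. The ratio against n^(5/4) * log(n) tends to 5/4.
for n in [10**3, 10**6, 10**9]:
    N = n ** 1.25
    ratio = math.lgamma(N + 1) / (N * math.log(n))
    print(n, ratio)
```

The convergence is slow (roughly 5/4 - 1/ln n), which is typical for Stirling-style approximations.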
Considering
log(sqrt(n)) = (1/2) · log(n)
and that in asymptotic analysis we don't consider constant factors,
is O(log(sqrt(n))) as good as O(log(n))?
As per my understanding, log(sqrt(n)) will grow more slowly than log(n) as n increases. But I am not able to understand the catch in moving the power (1/2) out front. Is it just that the factor 1/2 only slows down the rate?
Consider also the case where log(n·n) is represented as 2·log(n), compared with log(n).
It is the same asymptotically:
O(log(sqrt(n))) = O(log(n^(1/2))) = O((1/2) · log(n)) = O(log(n))
You are right, O(log(sqrt(n))) is the same as O(log(n)) by the reasoning given in your question.
Time(A) = log n
Time(B) = log sqrt(n) = log n^(1/2) = 1/2 log n
Asymptotically the same
O(Time(A)) = O(log n)
O(Time(B)) = O(1/2 log n) = O(log n)
O(Time(A)) = O(Time(B))
Insignificantly different
Time(A) = 1 * log n
Time(B) = 1/2 * log n
Time(A) > Time(B)
Time(A) = 2 * Time(B)
Conclusion
log n = 2 log sqrt(n)
Although the difference between log n and log sqrt(n) is asymptotically insignificant, log n will always take double the amount of time that log sqrt(n) takes.
The big-O notation ignores any constant multiplier.
O(500000·N) is O(N), and so is O(0.00001·N).
For the same reason, O(Log(Sqrt(N))) is O(1/2.Log(N)) is O(Log(N)), and that in any base.
The big-O notation is not about the speed of your program, it is about the growth of the running time as N increases.
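The constant factor here is exactly 1/2 for every n, which a one-line check confirms (a quick Python sketch of my own):

```python
import math

# log(sqrt(n)) / log(n) is exactly 1/2 for every n > 1:
# the constant factor never changes, which is why big-O discards it.
for n in [10, 1000, 10**9, 10**18]:
    print(n, math.log(math.sqrt(n)) / math.log(n))
```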
I am just a bit confused. If the time complexity of an algorithm is given by n^2 · log(n),
what is that in big-O notation? Just O(n^2), or do we keep the log?
If that's the time-complexity of the algorithm, then it is in big-O notation already, so, yes, keep the log. Asymptotically, there is a difference between O(n^2) and O((n^2)*log(n)).
A formal mathematical proof would be nice here.
Let's define following variables and functions:
N - input length of the algorithm,
f(N) = N^2*ln(N) - a function that computes algorithm's execution time.
Let's determine whether the growth of this function is asymptotically bounded by N^2.
According to the definition of the asymptotic notation [1], g(x) is an asymptotic bound for f(x) if and only if: for all sufficiently large values of x, the absolute value of f(x) is at most a positive constant multiple of g(x). That is, f(x) = O(g(x)) if and only if there exists a positive real number M and a real number x0 such that
|f(x)| <= M*g(x) for all x >= x0 (1)
In our case, there must exists a positive real number M and a real number N0 such that:
|N^2*ln(N)| <= M*N^2 for all N >= N0 (2)
Obviously, such M and N0 do not exist, because for any arbitrarily large M there is an N0 such that
ln(N) > M for all N >= N0 (3)
Thus, we have proved that N^2*ln(N) is not O(N^2).
References:
[1] https://en.wikipedia.org/wiki/Big_O_notation
A simple way to understand big-O notation is to divide the actual number of atomic steps by the term within the big-O and check that you get a constant (or a value that is bounded by some constant).
For example, if your algorithm does 10n²·log n steps:
10n²·log n / n² = 10·log n → not constant in n → 10n²·log n is not O(n²)
10n²·log n / (n²·log n) = 10 → constant in n → 10n²·log n is O(n²·log n)
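Running that division numerically (a sketch of my own) shows the first ratio growing without bound while the second stays pinned at 10:

```python
import math

# steps(n) = 10 * n^2 * log2(n), the hypothetical step count from the example
def steps(n):
    return 10 * n**2 * math.log2(n)

for n in [10**2, 10**4, 10**6]:
    r1 = steps(n) / n**2                   # grows like 10*log2(n): not constant
    r2 = steps(n) / (n**2 * math.log2(n))  # always 10: constant
    print(n, r1, r2)
```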
You do keep the log because log(n) will increase as n increases and will in turn increase your overall complexity since it is multiplied.
As a general rule, you only remove constants. So for example, if you had O(2 * n^2), you would just say the complexity is O(n^2), because running it on a machine that is twice as powerful shouldn't influence the complexity.
In the same way, if you had complexity O(n^2 + n^2), you would reduce to the above case and just say it's O(n^2). Since log(n) grows more slowly than n^2, if you had O(n^2 + log(n)) you would say the complexity is O(n^2), because it's even less than having O(2 * n^2).
O(n^2 * log(n)) does not fall into the above situation so you should not simplify it.
If the complexity of some algorithm is O(n^2), it can be written as O(n*n). Is that O(n)? Absolutely not. By the same token, O(n^2 * log n) is not O(n^2). What you may want to know is that O(n^2 + log n) = O(n^2).
A simple explanation:
O(n^2 + n) can be written as O(n^2), because as we increase n the n term becomes negligible next to n^2. Thus it can be written O(n^2).
Meanwhile, in O(n^2 * log n), the gap between n^2 and n^2 * log n keeps growing as n increases, unlike the above case.
Therefore, the log n stays.