Difference between complexity log n and log(sqrt(n)) - algorithm

Considering that
log(sqrt(n)) = (1/2) log(n)
and that in asymptotic analysis we don't consider constant factors,
is O(log(sqrt(n))) just as good as O(log(n))?
As per my understanding, log(sqrt(n)) will grow more slowly than log(n) as n increases, but I am not able to understand the catch in moving the power of (1/2) to the front. Is it just that the factor 1/2 slows down the rate of growth?
Also consider the case where log(n*n) is written as 2 log(n) and compared with log(n).

It is the same asymptotically:
O(log(sqrt(n))) = O(log(n^(1/2))) = O((1/2) log(n)) = O(log(n))

You are right, O(log(sqrt(n))) is the same as O(log(n)) by the reasoning given in your question.

Time(A) = log n
Time(B) = log sqrt(n) = log n^(1/2) = 1/2 log n
Asymptotically the same
O(Time(A)) = O(log n)
O(Time(B)) = O(1/2 log n) = O(log n)
O(Time(A)) = O(Time(B))
Insignificantly different
Time(A) = 1 * log n
Time(B) = 1/2 * log n
Time(A) > Time(B)
Time(A) = 2 * Time(B)
Conclusion
log n = 2 log sqrt(n)
Although the difference between log n and log sqrt(n) is asymptotically insignificant, log n will always take double the time that log sqrt(n) takes.
Visual (graph omitted)
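Since the plot isn't reproduced here, a small Python sketch (values of n chosen arbitrarily) makes the same point numerically: log(sqrt(n)) is always exactly half of log(n), a constant factor that big-O discards.

import math

# log(sqrt(n)) is always exactly (1/2) * log(n), regardless of how large n gets.
for n in (10, 10**3, 10**6, 10**9, 10**12):
    a = math.log(n)              # log n
    b = math.log(math.sqrt(n))   # log sqrt(n)
    print(f"n = {n:>15,}   log n = {a:8.3f}   log sqrt(n) = {b:8.3f}   ratio = {b / a:.2f}")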

The big-O notation ignores any constant multiplier.
O(500000*N) is O(N), and so is O(0.00001*N).
For the same reason, O(log(sqrt(N))) is O((1/2)*log(N)), which is O(log(N)), and that holds in any base.
The big-O notation is not about the speed of your program, it is about the growth of the running time as N increases.

Related

Time Complexity in asymptotic analysis log n and log (n+m)

Just some interesting discussion inspired by a conversation in my class.
There are two algorithms; one has time complexity log n and the other log (n+m).
Am I correct to argue that for average cases the log (n+m) one will perform faster, while asymptotically they make no difference in running time? Taking the limit of their ratio and applying f1'/f2' results in a constant, therefore they have the same order of growth.
Thanks!
As I can see from the question, n and m are independent variables. So when stating that
O(m + n) = O(n)
it should hold for any m, which it does not: a counterexample is
m = exp(n)
O(log(m + n)) = O(log(n + exp(n))) = O(log(exp(n))) = O(n) > O(log(n))
That's why in the general case we can only say that
O(log(m + n)) >= O(log(n))
An interesting question is when O(log(m + n)) = O(log(n)) does hold. If m grows no faster than some polynomial in n, i.e. O(m) <= O(P(n)), then
O(log(m + n)) = O(log(P(n) + n)) = O(log(P(n))) = O(k * log(n)) = O(log(n))
where k is the degree of P.
In the case of (multi)graphs we seldom have that many edges (O(m) > P(n)): even the complete graph Kn contains only m = n * (n - 1) / 2 = P(n) edges. That's why
O(log(m + n)) = O(log(n))
holds for an ordinary graph (no parallel/multiple edges, no loops).
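A small Python illustration of this (m = n^2 and m = e^n are just example choices for a polynomially bounded and an exponentially growing m): the ratio log(n + m) / log(n) stays bounded in the first case and keeps growing in the second.

import math

# Ratio log(n + m) / log(n) for a polynomially bounded m and an exponential m.
for n in (10, 50, 100, 500):
    poly = math.log(n + n**2) / math.log(n)         # tends to a constant (2)
    expo = math.log(n + math.exp(n)) / math.log(n)  # grows roughly like n / log(n)
    print(f"n = {n:>4}   m = n^2: ratio = {poly:5.2f}   m = e^n: ratio = {expo:7.2f}")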

Prove that (log n)! = O(n^k)

I need help proving this:
(log n)! = O(n^k)
I started with n log n <= c*n^k but could not arrive at the desired solution.
Note that n^k = (e^k)^(ln n), and a factorial grows faster than an exponential (a product of growing factors vs. a product of constant factors), so (log n)! eventually outgrows n^k.
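A quick numerical check of that hint (natural log and k = 3 are arbitrary illustrative choices): for small n the polynomial n^k is ahead, but (ln n)! overtakes it once n gets large enough.

import math

k = 3
for d in (5, 10, 20, 30, 40, 50):                      # n = 10^d
    n = 10 ** d
    fact = math.factorial(math.floor(math.log(n)))     # (ln n)!, floored to an integer
    poly = n ** k                                      # n^k
    print(f"n = 10^{d}:   (ln n)! / n^k = {fact / poly:.3e}")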

O(n^2) vs O(n(log n)^2)

Is time complexity O(n^2) or O(n(log n)^2) better?
I know that when we simplify it, it becomes
O(n) vs O((log n)^2)
and log n < n, but what about (log n)^2?
n is only less than (log n)^2 for values of n less than about 0.49...
So in general (log n)^2 is better for large n...
But since these O(something) notations always leave out constant factors, in your case it might not be possible to say for sure which algorithm is better...
Here's a graph (image omitted; the blue line is n and the green line is (log n)^2):
Notice how the difference for small values of n isn't so big and might easily be dwarfed by the constant factors not included in the big-O notation.
But for large n, (log n)^2 wins hands down (second graph omitted).
For every constant k, asymptotically log(n)^k < n.
The proof is simple: take the log of both sides of the inequality, and you get
k * log(log(n)) < log(n)
It is easy to see that asymptotically this is correct.
Semantic note: I'm assuming here that log(n)^k means log(n) * log(n) * ... * log(n) (k times), and NOT log(log(log(...log(n))...)) (k times), as the notation is sometimes also used.
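A quick numeric check of that inequality, with k = 3 as an arbitrary example: k * log(log(n)) falls further and further below log(n) as n grows.

import math

k = 3
for n in (10**2, 10**4, 10**8, 10**16, 10**32):
    lhs = k * math.log(math.log(n))   # k * log(log n)
    rhs = math.log(n)                 # log n
    print(f"n = 10^{round(math.log10(n)):>2}   k*log(log n) = {lhs:6.2f}   log n = {rhs:7.2f}")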
O(n^2) vs. O(n*log(n)^2)
<=> O(n) vs. O(log(n)^2) (divide by n)
<=> O(sqrt(n)) vs. O(log(n)) (square root)
<=> polynomial vs. logarithmic
Logarithmic wins.
(log n)^2 is better, because if you substitute n = exp(m), the comparison becomes m^2 versus exp(m), and m^2 is better.
(log n)^2 is also < n.
Take an example (base-10 logarithms):
n = 5
log n = 0.6989...
(log n)^2 = 0.4885...
You can see, (log n)^2 is smaller still.
Even if you take a much bigger value of n, e.g. 1,000,000,000, then
log n = 9
(log n)^2 = 81
which is far less than n.
O(n(log n)^2) is better (faster) for large n!
Take the log of both sides:
log(n^2) = 2 log(n)
log(n(log n)^2) = log(n) + 2 log(log(n))
lim (n -> infinity) [log(n) + 2 log(log(n))] / [2 log(n)] = 0.5 (use l'Hôpital's rule: http://en.wikipedia.org/wiki/L'H%C3%B4pital's_rule)
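And a direct numeric look at the question in this thread (base-2 logarithms chosen arbitrarily; the base only changes a constant factor): the ratio n*(log n)^2 / n^2 = (log n)^2 / n tends to 0, so the n*(log n)^2 algorithm is asymptotically faster.

import math

for n in (2**4, 2**10, 2**20, 2**30):
    a = float(n * n)              # n^2
    b = n * math.log2(n) ** 2     # n * (log2 n)^2
    print(f"n = 2^{int(math.log2(n)):>2}   n^2 = {a:.3e}   n*(log2 n)^2 = {b:.3e}   ratio = {b / a:.2e}")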

Ordering functions by Big O complexity

I'm trying to order the following functions in terms of Big O complexity from low complexity to high complexity: 4^(log(N)), 2N, 3^100, log(log(N)), 5N, N!, (log(N))^2
This:
3^100
log(log(N))
2N
5N
(log(N))^2
4^(log(N))
N!
I figured this out just by using the chart given on Wikipedia. Is there a way of verifying the answer?
3^100 = O(1)
log log N = O(log log N)
(log N)^2 = O((log N)^2)
N, 2N, 5N = O(N)
4^(log N) = N^(log 4), which is O(N^2) for log base 2
N! = O(N!)
You made just one small mistake: (log N)^2 grows more slowly than 2N and 5N, so it comes before them. The above is the right order.
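As for verifying the answer: plugging in a single N can mislead (the constant 3^100 dwarfs everything else for any practical N even though it is O(1)), but looking at how much each function grows as N increases is a reasonable sanity check. A rough Python sketch, with every value handled in log space (and N! via lgamma) to keep the numbers finite:

import math

N1, N2 = 10**3, 10**6

def log10_growth(ln_f):
    # ln_f returns the natural log of the function's value; the difference is
    # the log of the growth factor when N goes from N1 to N2.
    return (ln_f(N2) - ln_f(N1)) / math.log(10)

funcs = [
    ("3^100",       lambda n: 100 * math.log(3)),
    ("log(log(N))", lambda n: math.log(math.log(math.log(n)))),
    ("(log(N))^2",  lambda n: 2 * math.log(math.log(n))),
    ("2N",          lambda n: math.log(2 * n)),
    ("5N",          lambda n: math.log(5 * n)),
    ("4^(log2 N)",  lambda n: math.log2(n) * math.log(4)),   # ln(N^2)
    ("N!",          lambda n: math.lgamma(n + 1)),            # ln(N!)
]
for name, ln_f in funcs:
    print(f"{name:>12}: grows by a factor of about 10^{log10_growth(ln_f):.2f}")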

Find the big-O of a function

Please help me with the following two functions; I need to simplify them.
O(nlogn + n^1.01)
O(log (n^2))
My current idea is
O(nlogn + n^1.01) = O(nlogn)
O(log (n^2)) = O (log (n^2))
Please kindly help me with these two simplifications and briefly explain, thanks.
For the second, you have O(lg(n²)) = O(2lg(n)) = O(lg(n)).
For the first, you have O(n lg(n) + n^(1.01)) = O(n (lg(n) + n^(0.01))); you have to decide whether lg(n) or n^(0.01) grows faster.
For that purpose, you can look at the derivative of n^(0.01) - lg(n) and see whether, in the limit as n -> infinity, it is positive or negative: it is 0.01/x^(0.99) - 1/x, and since x grows faster than x^(0.99), the difference is eventually positive. Thus n^(0.01) grows asymptotically faster than lg(n), and the complexity is O(n^(1.01)).
Remember:
log (x * y) = log x + log y
and n^k always grows faster than log n for any k>0.
Putting things together, for the first question O(n*log(n) + n^1.01): by the rule above, n^0.01 eventually grows faster than log(n) (although only for astronomically large n), so the second summand dominates and the expression simplifies to O(n^1.01).
In the second case use the formula mentioned by KennyTM, so we get
O(log(n^2)) = O(log(n*n)) = O(log(n)+log(n)) = O(2*log(n)) = O(log(n))
because constant terms can be ignored.
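A small numerical footnote on why that crossover is easy to miss (the sample values of n below are arbitrary): deciding between n*log(n) and n^1.01 amounts to comparing log(n) with n^0.01, and n^0.01 only takes the lead at astronomically large n.

import math

# Compare log(n) with n^0.01: the power eventually wins, but only for huge n.
for exp in (10, 100, 200, 250, 280, 300):
    n = 10.0 ** exp
    print(f"n = 1e{exp}:   log(n) = {math.log(n):8.1f}   n^0.01 = {n ** 0.01:8.1f}")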
