Can someone please clarify this solution a little more?
T(n) = 2T(n^(1/2)) + log n
Solution:
Let k = log n, so that n = 2^k. Then
T(n) = T(2^k) = 2T(2^(k/2)) + k
Substituting S(k) = T(2^k) into this, we get
S(k) = 2S(k/2) + k
This recurrence now fits the master theorem (case 2), which gives
S(k) = O(k log k). Substituting back k = log n implies T(n) is O(log n * log log n).
How many times can you keep dividing n by 2? log_2(n) times, because log_2(n) is the power to which you must raise 2 to get n.
Similarly, log(log(n)) is roughly how many times you can take the square root of n before it drops to a constant, so if you know this, the substitution isn't strictly necessary.
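If it helps to sanity-check the claimed bound numerically, here is a small Python sketch (the base case T(n) = 1 for n <= 2 is an assumption of mine; the question doesn't give one):

    import math

    def T(n):
        # T(n) = 2*T(sqrt(n)) + log2(n), with an assumed base case T(n) = 1 for n <= 2
        if n <= 2:
            return 1.0
        return 2 * T(math.sqrt(n)) + math.log2(n)

    # Compare T(n) against the claimed bound log(n) * log(log(n))
    for n in [2**8, 2**16, 2**32, 2**64]:
        bound = math.log2(n) * math.log2(math.log2(n))
        print(n, round(T(n), 1), round(bound, 1), round(T(n) / bound, 3))

The ratio in the last column stays bounded as n grows, which is what T(n) = O(log n * log log n) predicts.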
In a data structures textbook, the author uses this to argue that O(log^c(n)) is efficient, because that complexity is very close to constant; I don't quite understand the equation.
The intuitive reason why this is true is that log is the inverse of e^x. Just as the exponential function grows faster than x^k for any k, its inverse must grow slower than x^(1/k) for any k. (Draw the pictures and flip the x and y axes to get this intuition.)
However, intuition does not amount to a formal proof.
So first, convince yourself that log(log(n)) = o(log(n)).
From that, for any given c > 0, there is an N such that for all n > N, log(log(n)) < c*log(n). Now exponentiate both sides and you find that, for sufficiently large n, log(n) < n^c. Therefore log(n) = O(n^c) for any given c.
But that is big-O, and we wanted little-o. Well, log(n) = O(n^(c/2)), and since n^(c/2)/n^c tends to 0, that means log(n) is actually in o(n^c). And now we're done.
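A quick numeric illustration (not part of the proof) of log(n) = o(n^c), here with the small exponent c = 0.1:

    import math

    # The ratio log(n) / n^c shrinks toward 0 as n grows, even for c = 0.1.
    c = 0.1
    for n in [10**6, 10**12, 10**24, 10**48]:
        print(n, math.log(n) / n**c)

The ratio heads toward 0, which is exactly what little-o asserts.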
I am trying to prove that n^2 + 5 log(n) = O(n^2), O representing big-O notation. I am not great with proofs and any help would be appreciated.
Informally, big-O keeps only the fastest-growing term as n grows arbitrarily large. Since n^2 grows much faster than log(n), the result should be clear.
More formally, f(n) = O(g(n)) holds whenever the ratio f(n)/g(n) stays bounded as n approaches infinity (a finite limit is enough; a limit of exactly 1 gives the stronger statement that the two functions are asymptotically equivalent). Here lim(n->inf)((n^2 + 5*log(n))/n^2) = 1, which is finite, so n^2 + 5*log(n) = O(n^2) (in fact it is Theta(n^2)).
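If an argument straight from the definition is easier to follow: for n >= 1 we have log(n) <= n <= n^2, so n^2 + 5*log(n) <= n^2 + 5*n^2 = 6*n^2, i.e. the definition holds with the witnesses c = 6 and n0 = 1. A tiny Python check of that bound (purely illustrative):

    import math

    # Check the witnesses c = 6, n0 = 1 for n^2 + 5*log(n) <= c * n^2.
    for n in range(1, 10001):
        assert n**2 + 5 * math.log(n) <= 6 * n**2
    print("n^2 + 5*log(n) <= 6*n^2 holds for 1 <= n <= 10000")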
I've searched online for this, but I only seem to find answers for a similar recurrence:
T(n) = T(n/3) + T(2n/3) + cn
But the one I'm trying to solve is:
T(n) = T(n/3) + T(2n/3)
Base case: We can assume T(a) = Theta(1) for any constant a.
I've succeeded in proving (by induction) that T(n) = O(n*log(n)). I thought the answer should be Theta(n*log(n)), but I cannot prove that T(n) = Omega(n*log(n)).
So my question is: am I correct that the answer is O(n*log(n)) and NOT Theta(n*log(n))? If that's true, that would really be great...
If I'm wrong, I will of course explain where I'm stuck in the induction process...
Thanks!
P.S. If you need to, please try to explain using induction, because I haven't learned all methods for solving these problems yet.
You can't prove that it's Omega(n log n), because T(n) = n satisfies both the base case and the recurrence (indeed, n/3 + 2n/3 = n).
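A small numeric illustration of that point, with an assumed integer base case T(n) = 1 for n <= 1: with no additive term, T(n) just counts the leaves of the recursion tree, so T(n)/n stays bounded by a constant while T(n)/(n*log n) keeps shrinking.

    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = T(n/3) + T(2n/3) with integer division and T(n) = 1 for n <= 1 (assumed)
        if n <= 1:
            return 1
        return T(n // 3) + T(2 * n // 3)

    for n in [10**3, 10**4, 10**5, 10**6]:
        print(n, T(n), round(T(n) / n, 3), round(T(n) / (n * math.log2(n)), 4))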
In big-O notation, is O((log n)^k) = O(log n), where k is some constant? So what happens with the (log n)^k when k >= 0?
Perhaps this is the source of the misunderstanding:
log(n^k) = k * log(n), but no such simplification works for log(n)^k = (log(n))^k.
O((log n) * k) == O(log n), but (log n)^k is definitely not the same thing. I believe you're thinking of multiplication by a constant, which can be dropped in big-O notation. Raising f(n) to a power, however, changes its growth rate. It's the same reason that O(n) is different from O(n^2).
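One quick way to see that, for example, (log n)^2 is not in O(log n): the ratio (log n)^2 / log n = log n grows without bound, so no constant c can satisfy the definition. A small illustration:

    import math

    # The ratio (log n)^k / log n = (log n)^(k-1) is unbounded for k > 1.
    k = 2
    for n in [10**2, 10**4, 10**8, 10**16]:
        log_n = math.log2(n)
        print(n, log_n**k / log_n)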
Some standard books on algorithms give this definition:
0 ≤ f(n) ≤ c⋅g(n) for all n > n0
In the definition of big-O, can anyone explain what this means, using a strong example that helps me visualize and understand big-O more precisely?
Assume you have a function f(n) and you are trying to classify it: is it big-O of some other function g(n)?
The definition basically says that f(n) is in O(g(n)) if there exist two constants c, N such that
f(n) <= c * g(n) for each n > N
Now, let's understand what it means.
Start with the n > N part. It means we do not "care" about low values of n, only about high values; if some (finite number of) low values do not satisfy the criterion, we can silently ignore them by choosing N bigger than all of them.
Consider, for example, f(n) = 10*n*log(n) and g(n) = n^2 (with base-10 logarithms). For low values of n we have n^2 < 10*n*log(n), but n^2 quickly catches up, and for all n > 10 the claim 10*n*log(n) < n^2 holds; thus 10*n*log(n) is in O(n^2) with N = 10.
The constant c means we can also tolerate multiplication by a constant factor and still accept it as the desired behavior. This is useful, for example, to show that 5*n is O(n): without it we could never find an N such that 5n < n for each n > N, but with the constant c we can use c = 6, note that 5n < 6n, and conclude that 5n is in O(n).
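To make the definition concrete in code, here is a small Python check of the witnesses for that example (c = 1 and N = 10, with base-10 logarithms; these are just one valid choice of constants):

    import math

    # f(n) = 10*n*log10(n) is in O(n^2): with c = 1 and N = 10,
    # f(n) <= c * g(n) holds for every n > N (checked here up to 100000).
    def f(n):
        return 10 * n * math.log10(n)

    def g(n):
        return n * n

    for n in range(11, 100001):
        assert f(n) <= 1 * g(n)
    print("10*n*log10(n) <= n^2 for all 11 <= n <= 100000")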
This question is a math problem, not an algorithmic one.
You can find a definition and a good example here: https://math.stackexchange.com/questions/259063/big-o-interpretation
As @Thomas pointed out, Wikipedia also has a good article on this: http://en.wikipedia.org/wiki/Big_O_notation
If you need more details, try to ask a more specific question.