I am currently taking an algorithms class and have been asked to prove the following:
Prove: ((n^2 / log n) + 10^5 * n * sqrt(n)) / n^2 = O(n^2 / log n)
I have come up with n0 = 1 and c = 5; when solving it, I end up with 1 <= 5. I just wanted to see if someone could verify this for me.
I'm not sure if this is the right forum to post in; if it isn't, I apologize, and a pointer in the right direction would be wonderful.
If I am not wrong, you have to prove that n^2 / log n is an upper bound of the given function.
That is the case if, for very large values of n,
n^2 / log n >= n * sqrt(n)
which, after dividing both sides by n and multiplying by log n, becomes
n >= sqrt(n) * log(n)
Since log(n) < sqrt(n) for large n, we get sqrt(n) * log(n) < sqrt(n) * sqrt(n) = n, so the inequality holds. Hence the upper bound is O(n^2 / log n).
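If it helps, here is a quick numerical sanity check of that inequality in Python (an illustration, not a proof; math.log is the natural log, and the base only changes constant factors):

```python
import math

# Sanity check (not a proof): for large n, sqrt(n) * log(n) < n,
# which is exactly the inequality the argument above relies on.
for n in (10, 100, 10_000, 10**6, 10**9):
    print(n,
          math.sqrt(n) * math.log(n) < n,              # the inequality itself
          (n * math.sqrt(n)) / (n**2 / math.log(n)))   # n*sqrt(n) vs n^2/log n, shrinks toward 0
```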
You can use the limit method. The solution for your case should look like this:
Assuming functions f and g are increasing, f(x) = O(g(x)) holds whenever the limit of f(x)/g(x) as x -> infinity exists and is finite. If you substitute your functions for f and g and simplify the expression, you will see that the limit comes out to be 0.
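If you want to see that limit computed symbolically (just a check; I'm reading f exactly as written in the question and taking g = n^2 / log n), sympy will do it:

```python
from sympy import symbols, limit, log, sqrt, oo

n = symbols('n', positive=True)
f = (n**2 / log(n) + 10**5 * n * sqrt(n)) / n**2   # the function exactly as written in the question
g = n**2 / log(n)

print(limit(f / g, n, oo))   # 0, which is finite, so f = O(g)
```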
I am very new to this topic and I am trying to grasp everything related to the asymptotic notations. I want to ask for your opinion on the following question:
If we have, for an algorithm, that T(n)=n!, then we can say for its time complexity that:
1 x 1 x 1 x ... x 1 <= n! <= n x n x n x ... x n
This relation means that n! = O(n^n) and n! = Ω(1). However, can't we do better? We want the big-oh to be as close as we can to the function T(n). If we do the following:
n! <= 1 x 2 x 3 x 4 ... x n x n
That is, for the second to last factor, we replace (n-1) with n. Now isn't this relation true? So isn't it true that n! = O(1 x 2 ... x n x n)? Something similar can be said for the lower bound Ω.
I am not sure if there is an error in my thought process, so I would really appreciate your input. Thanks in advance.
The mathematical statement n! = O(1 x 2 ... x n x n) is true. But it is also not terribly helpful or enlightening. In what situations do you want to write n! = O(...)?
Either you are satisfied with n! = n!, and then you don't need to write n! = O(1 x 2 ... x n x n). Or you are not satisfied with n! = n!; you want something that explains better exactly how large n! is; but then you shouldn't be satisfied with n! = O(1 x 2 ... x n x n) either, as it is not any easier to understand.
Personally, I am satisfied with polynomials, like n^2. I am satisfied with exponentials, like 2^n. I am also somewhat satisfied with n^n, because I know n^n = 2^(n log n), and I also know I can't hope to find a better expression for n^n.
But I am not satisfied with n!. I would like to be able to compare it to exponentials.
Here are two comparisons:
n! < n^n
2^n < n!
The first one is obtained by upper-bounding every factor in the product by n; the second one (valid for n >= 4) follows because each of the factors 2, 3, ..., n is at least 2, and the factor 4 contributes an extra 2 to compensate for the leading factor 1.
That's already pretty good; it tells us that n! is somewhere between the exponential 2^n and the superexponential n^n.
But you can easily tell that the upper bound n^n is too high; for instance, you can find the following tighter bounds quite easily:
n! < n^(n-1)
n! < 2 * n^(n-2)
n! < 6 * n^(n-3)
Note that n^(n-3) is a lot smaller than n^n when n is big! This is slightly better, but still not satisfying.
You could go even further, and notice that half the factors are at most n/2, thus:
n! < (n/2)^(n/2) * n^(n/2) = (1/2)^(n/2) * n^n = (n / sqrt(2))^n =~ (0.7 n)^n
This is a slightly tighter upper bound! But can we do even better? I am still not satisfied.
If you are not satisfied either, I encourage you to read: https://en.wikipedia.org/wiki/Stirling%27s_approximation
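Before (or after) diving into Stirling, here is how those bounds compare numerically (just an illustration, not a proof; the last column uses the standard sqrt(2*pi*n) * (n/e)^n approximation):

```python
import math

# Numerical illustration (not a proof) of the bounds discussed above:
#   2^n < n! < (n/sqrt(2))^n < n^n,   with Stirling ~ sqrt(2*pi*n) * (n/e)^n
for n in (5, 10, 20, 40):
    stirling = math.sqrt(2 * math.pi * n) * (n / math.e) ** n
    print(f"n={n:2d}  2^n={2**n:.2e}  n!={math.factorial(n):.2e}  "
          f"(n/sqrt2)^n={(n / math.sqrt(2)) ** n:.2e}  n^n={n**n:.2e}  "
          f"Stirling~{stirling:.2e}")
```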
Are these two equal? I read somewhere that O(2lg n) = O(n). Going by this observation, I'm guessing the answer would be no, but I'm not entirely sure. I'd appreciate any help.
Firstly, O(2 * log(n)) isn't equal to O(n).
To use big O notation, you would find a function that represents the complexity of your algorithm, then you would find the term in that function with the largest growth rate. Finally, you would eliminate any constant factors you could.
e.g. say your algorithm iterates 4n^2 + 5n + 1 times, where n is the size of the input. First, you would take the term with the highest growth rate, in this case 4n^2, then remove any constant factors, leaving O(n^2) complexity.
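To connect that example to the formal definition, here is one possible choice of witnesses (the constant c = 10 and threshold n >= 1 are just one option, picked for illustration):

```python
# One possible pair of big-O witnesses for 4n^2 + 5n + 1 = O(n^2):
# with c = 10 and n >= 1, the lower-order terms are absorbed by c * n^2.
c = 10
print(all(4 * n**2 + 5 * n + 1 <= c * n**2 for n in range(1, 10_000)))  # True
```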
In your example, O(2 * log(n)) simplifies to O(log(n)).
Now on to your question.
In computer science, unless specified otherwise, you can generally assume that log(n) actually means the log of n, base 2.
This means, using log laws, that 2^log(n) simplifies to n, so O(2^log(n)) = O(n).
Proof:
y = 2^log(n)
log(y) = log(2^log(n))
log(y) = log(n) * log(2)   [log(2) = 1 since we are talking about base 2 here]
log(y) = log(n)
y = n
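A quick numerical check of that identity (assuming base-2 logs, as in the proof above):

```python
import math

# 2 ** log2(n) is just n (up to floating-point rounding),
# which is why O(2^log n) = O(n) when the log is base 2.
for n in (8, 100, 1_000_000):
    print(n, 2 ** math.log2(n))
```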
I'm taking Data Structures and Algorithm course and I'm stuck at this recursive equation:
T(n) = logn*T(logn) + n
Obviously this can't be handled with the Master Theorem, so I was wondering if anybody has any ideas for solving this recurrence. I'm pretty sure it should be solved with a change of variables, like considering n to be 2^m, but I couldn't manage to find any good fix.
The answer is Theta(n). To prove something is Theta(n), you have to show it is Omega(n) and O(n). Omega(n) in this case is obvious because T(n)>=n. To show that T(n)=O(n), first
Pick a large finite value N such that log(n)^2 < n/100 for all n>N. This is possible because log(n)^2=o(n).
Pick a constant C>100 such that T(n)<Cn for all n<=N. This is possible due to the fact that N is finite.
We will show inductively that T(n) < Cn for all n > N. For such n, log(n) < n, so T(log(n)) < C * log(n) holds either by the choice of C (if log(n) <= N) or by the induction hypothesis (if log(n) > N). Therefore:
T(n) < n + log(n) * C * log(n)
= n + C * log(n)^2
< n + (C/100) * n
= C * (1/100 + 1/C) * n
< (C/50) * n
< C * n
In fact, for this function it is even possible to show that T(n) = n + o(n) using a similar argument.
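If you want to convince yourself numerically, you can evaluate the recurrence directly; this sketch uses T(n) = n for n <= 2 as an arbitrary base case (the base case only affects lower-order terms) and shows T(n)/n approaching 1:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # The recurrence from the question, with T(n) = n for n <= 2
    # as an arbitrary base case.
    if n <= 2:
        return n
    return math.log2(n) * T(math.log2(n)) + n

for k in (10, 20, 40, 80, 160):
    n = 2.0 ** k
    print(f"n = 2^{k:<3d}   T(n)/n = {T(n) / n:.9f}")
```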
This is by no means a formal proof, but I think it goes like this.
The key is the + n part. Because of this, T is bounded below by Omega(n). So let's assume that T(n) = O(n) and have a go at that.
Substituting into the original relation:
T(n) = (log n)O(log n) + n
= O(log^2(n)) + O(n)
= O(n)
So it still holds.
I have this as a homework question: show that 5n is O(n log n). I don't remember learning it in class. Can someone point me in the right direction, or does anyone have documentation on how to solve these types of problems?
More formally...
First, we prove that if f(n) = 5n, then f ∈ O(n). In order to show this, we must show that there is a constant c and a sufficiently large k such that f(i) ≤ c * i for all i ≥ k. Fortunately, c = 5 makes this trivial.
Next, I'll prove that every f ∈ O(n) is also in O(n * log n). We must show that for some sufficiently large k and all i ≥ k, f(i) ≤ c * i * log i. If we let k be large enough that f(i) ≤ c * i holds and also k ≥ 2, the result follows, since log i ≥ 1 there (with base-2 logs) and hence c * i ≤ c * i * log i.
QED.
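If it helps to see the witnesses from that argument concretely (c = 5 and the threshold i >= 2, with base-2 logs), here is a quick check; it is an illustration, not part of the proof:

```python
import math

# Check the witnesses from the argument above: with c = 5 and i >= 2
# (base-2 logs), 5*i <= c * i * log2(i), because log2(i) >= 1 there.
c = 5
print(all(5 * i <= c * i * math.log2(i) for i in range(2, 10_000)))  # True
```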
Look into the definition of big-O notation. 5n = O(n log n) means that 5n grows no faster than n log n; in other words, n log n is an upper bound on the number of operations to be performed. That is true here.
You can prove it by applying L'Hôpital's rule to lim n -> infinity of 5n / (n log n).
g(n) = 5n and f(n) = n log n
Differentiate g(n) and f(n); with the natural log, the quotient of the derivatives is
5 / (log(n) + 1)
which tends to 5/infinity = 0, so 5n = O(n log n) is true.
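For completeness, here is the same limit checked symbolically with sympy (sympy's log is the natural log, matching the derivative above):

```python
from sympy import symbols, limit, log, diff, oo

n = symbols('n', positive=True)
f, g = 5 * n, n * log(n)

print(diff(f, n), diff(g, n))   # 5 and log(n) + 1: the pieces of the L'Hopital quotient
print(limit(f / g, n, oo))      # 0, so 5n = O(n log n)
```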
I don't remember the wording of the formal definition, but what you have to show is:
5 * n <= c * n * log(n) for all n > n0
for some constant c and threshold n0 that you get to choose. Exhibit such a c and n0 (for example, c = 5 and n0 = 2 work, since log(n) >= 1 from there on with base-2 logs), and you're done.
It's been three years since I touched big-O stuff, but I think you can try to show this:
O(5n) = O(n) ⊆ O(n log n)
O(5n) = O(n) is very easy to show, so all you have to do now is to show O(n) ⊆ O(n log n), which shouldn't be too hard either.
Hi, can someone explain to me how to solve this homework problem?
(n + log n) * 3^n = O(4^n / n)
I think it is the same as solving the inequality (n + log n) * 3^n <= c * (4^n / n) for large enough n.
Thanks in advance
You need to find a c (as you mentioned in your problem), and you need to show that the inequality holds for all n greater than some k.
By showing that you can find the c and k in question, then by definition you've proved the big-O bound.
Conversely, if you can't find such a c and k, this is because the function on the left is not really upper-bounded by the function on the right. That shouldn't be the case here, though (and you'll know you're getting a more intuitive understanding of asymptotic growth/bounding when you can articulate exactly why).
By definition, f(n) = O(g(n)) is true if there exists a constant M such that |f(n)| <= M * |g(n)| for every sufficiently large n. In computer science the functions involved are nonnegative, so this amounts to finding an M such that f(n) / g(n) <= M for large n.
This, in turn, can be done by proving that f(n) / g(n) has a finite limit as n increases towards infinity. In the case of your ratio, (n^2 + n log n) * (3/4)^n, this is pretty obvious from how exponential functions work: the (3/4)^n factor decays exponentially, so the limit is 0.
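If you want to watch that ratio collapse numerically (an illustration, not the proof itself), you can tabulate it:

```python
import math

# The ratio f(n)/g(n) = (n + log n) * 3^n / (4^n / n) = (n^2 + n*log(n)) * (3/4)^n.
# It tends to 0, so it is bounded, and any M above its maximum works as the big-O constant.
for n in (1, 5, 10, 50, 100, 200):
    print(n, (n**2 + n * math.log(n)) * (3 / 4) ** n)
```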