Is $n^{3+\epsilon} \log(n^{3+\epsilon}) = O(n^{3+\epsilon})$?
(for $\epsilon > 0$)
I am not sure whether the extra $\log(n^{3+\epsilon})$ factor affects the $O(n^{3+\epsilon})$ term.
No. Suppose such a constant c existed; then you'd have:
n^(3 + ε) * log(n^(3 + ε)) <= c * n^(3 + ε)
Dividing both sides by n^(3 + ε):
log(n^(3 + ε)) <= c
Clearly no constant c will do, since the left-hand side grows without bound as n increases.
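If it helps to see this concretely, here is a minimal numeric sketch (the value eps = 0.5 is an arbitrary choice for illustration): the would-be constant would have to exceed log(n^(3 + eps)) = (3 + eps) * log(n), which keeps growing.

#include <math.h>
#include <stdio.h>

int main(void) {
    const double eps = 0.5;  /* arbitrary epsilon > 0, for illustration */
    for (double n = 10; n <= 1e12; n *= 100) {
        /* f(n)/g(n) = log(n^(3 + eps)) = (3 + eps) * log(n) */
        printf("n = %.0e  f(n)/g(n) = %.2f\n", n, (3.0 + eps) * log(n));
    }
    return 0;
}

The printed ratio f(n)/g(n) climbs past any fixed c, which is exactly why the bound fails.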
Hi all, I'm having problems calculating the complexity of the following recurrence:
T(n) = { O(1),                      if n <= 2
       { 2*T(n^(1/2)) + O(log n),   if n > 2
I got to a probable conclusion of O(2^n * n log n). If anyone has any clue, I'd be happy. Thank you.
Suppose for now that n > 2 is a power of two, so that you can write n = 2^m. Also, let's write the constant in your O(log n) term explicitly as c*log2(n).
Then, unravelling the recursion gives us:
T(2^m) <= 2*T((2^m)^(1/2)) + c*log2(2^m)
= 2*T(2^(m/2)) + c*m
<= 2*( 2*T((2^(m/2))^(1/2)) + c*log2(2^(m/2)) ) + c*m
= 4*T(2^(m/4)) + 2*c*m
<= 4*( 2*T((2^(m/4))^(1/2)) + c*log2(2^(m/4)) ) + 2*c*m
= 8*T(2^(m/8)) + 3*c*m
<= ...
= (2^log2(m))*T(2^1) + log2(m)*c*m
= m*T(2) + c*m*log2(m)
= log2(n)*T(2) + c*log2(n)*log2(log2(n))
= O(log2(n)*log2(log2(n)))
The term log2(m) comes from the fact that we divide m by two at each new recursion level, and so it will take (at most) log2(m) divisions before m <= 1.
Now if n is not a power of two, you can notice that there exists some number r which is a power of two such that n <= r < 2*n. And you can then write T(n) <= T(r) = O(log2(r)*log2(log2(r))) = O(log2(2*n)*log2(log2(2*n))) = O(log2(n)*log2(log2(n))).
So the overall answer is
T(n) = O(log2(n)*log2(log2(n)))
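As a sanity check, here is a small sketch of the recurrence (assuming the concrete constants T(n) = 1 for n <= 2 and c = 1, and treating n as a real number) next to the claimed bound:

#include <math.h>
#include <stdio.h>

/* T(n) = 2*T(sqrt(n)) + log2(n), with T(n) = 1 for n <= 2. */
double T(double n) {
    if (n <= 2.0) return 1.0;
    return 2.0 * T(sqrt(n)) + log2(n);
}

int main(void) {
    /* Squaring n at each step gives n = 16, 256, 65536, ... */
    for (double n = 16; n <= 1e18; n *= n) {
        printf("n = %.3e  T(n) = %8.2f  log2(n)*log2(log2(n)) = %8.2f\n",
               n, T(n), log2(n) * log2(log2(n)));
    }
    return 0;
}

The two columns grow at the same rate, consistent with T(n) = O(log2(n)*log2(log2(n))).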
So I have to figure out whether n^(1/2) is Big Omega of log(n)^3. I am pretty sure that it is not, since n^(1/2) is not even within the bounds of log(n)^3; but I do not know how to prove it without limits. The definition without limits that I know is
g(n) is big Omega of f(n) iff there is a constant c > 0 and an
integer constant n0 >= 1 such that f(n) >= c*g(n) for n >= n0
But can I really always find a constant c that will satisfy this?
For instance, for log(n)^3 >= c*n^(1/2): if c = 0.1 and n = 10, then we get 1 >= 0.316.
When comparing sqrt(n) with ln(n)^3 what happens is that
ln(n)^3 <= sqrt(n) ; for all n >= N0
How do I know? Because I printed out enough samples of both expressions to convince myself which one dominated the other.
To see this more formally, let's first assume that we have already found N0 (we will do that later) and let's prove by induction that if the inequality holds for n >= N0, it will also hold for n+1.
Note that I'm using ln (the logarithm in base e) for the sake of simplicity.
Since ln(n)^3 <= sqrt(n) is equivalent (taking cube roots of both nonnegative sides) to ln(n) <= n^(1/6), we have to show that
ln(n + 1) <= (n + 1)^(1/6)
Now
ln(n + 1) = ln(n + 1) - ln(n) + ln(n)
= ln(1 + 1/n) + ln(n)
<= ln(1 + 1/n) + n^(1/6) ; inductive hypothesis
From the definition of e we know
e = lim (1 + 1/n)^n   ; as n → ∞
taking logarithms
1 = lim n*ln(1 + 1/n)
Therefore, there exists N0 such that
n*ln(1 + 1/n) <= 2 ; for all n >= N0
so
ln(1 + 1/n) <= 2/n
Using this above, we get
ln(n + 1) <= 2/n + n^(1/6)
<= (n + 1)^(1/6)
as we wanted. (The last step holds for all sufficiently large n: by the mean value theorem, (n + 1)^(1/6) - n^(1/6) >= (1/6)*(n + 1)^(-5/6), which eventually exceeds 2/n; enlarge N0 if necessary so that this holds for all n >= N0.)
We are now left with the task of finding some N0 such that
ln(N0) <= N0^(1/6)
let's take N0 = e^(6k) for some value of k that we are about to find. We get
ln(N0) = 6k
N0^(1/6) = e^k
so, we only need to pick k such that 6k < e^k, which is possible because the right hand side grows much faster than the left.
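For completeness, here is the kind of sampling mentioned above, as a small sketch in C; it also locates a concrete crossover region (k = 3 above already works, giving N0 = e^18 ≈ 6.6*10^7):

#include <math.h>
#include <stdio.h>

int main(void) {
    /* Sample ln(n)^3 versus sqrt(n) to see where sqrt(n) takes over. */
    for (double n = 1e2; n <= 1e14; n *= 100) {
        double lhs = pow(log(n), 3.0);
        double rhs = sqrt(n);
        printf("n = %.0e  ln(n)^3 = %12.1f  sqrt(n) = %14.1f  %s\n",
               n, lhs, rhs, lhs <= rhs ? "<=" : "> ");
    }
    return 0;
}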
We're asked to prove that $n + 4\lfloor\sqrt{n}\rfloor = O(n)$ with good argumentation and a logical build-up for it, but it's not said what good argumentation would look like. I can check that the inequality holds for small values like n = 1, but I wouldn't know how to argue about it or how to logically build it up, since I just thought about it and it happened to be true. Can someone help out with this example so I would know how to do it?
You should look at the following site: https://en.wikipedia.org/wiki/Big_O_notation
For big-O notation we would say that a function such as x^3 + x^2 + 100x is O(x^3). The idea is that as x grows very large, the x^3 term becomes the dominant factor in the expression.
You can apply the same logic to your expression: which term becomes dominant?
If this is not clear, you should try to plot both terms and see how they scale. That could make it clearer.
A proof is a convincing, logical argument. When in doubt, a good way to write a convincing, logical argument is to use an accepted template for your argument. Then, others can simply check that you have used the template correctly and, if so, the validity of your argument follows.
A useful template for showing asymptotic bounds is mathematical induction. To use this, you show that what you are trying to prove is true for specific simple cases, called base cases; then you assume it is true in all cases up to a certain size (the induction hypothesis) and you finish the proof by showing the hypothesis implies the claim is true for cases of the very next size. If done correctly, you will have shown the claim (parameterized by a natural number n) is true for a fixed n and for all larger n. This is exactly what is required for proving asymptotic bounds.
In your case, we want to show that n + 4 * sqrt(n) = O(n). Recall one formal definition of big-O:
A function f is bounded from above by a function g, written f(n) = O(g(n)), if there exist constants c > 0 and n0 > 0 such that for all n > n0, f(n) <= c * g(n).
Consider the case n = 0. We have n + 4 * sqrt(n) = 0 + 4 * 0 = 0 <= 0 = c * 0 = c * n for any constant c. If we now assume the claim is true for all n up to and including k, can we show it is true for n = k + 1? This would require (k + 1) + 4 * sqrt(k + 1) <= c * (k + 1). There are now two cases:
k + 1 is not a perfect square. Since we are doing analysis of algorithms it is implied that we are using integer math, so sqrt(k + 1) = sqrt(k) in this case. Therefore, (k + 1) + 4 * sqrt(k + 1) = (k + 4 * sqrt(k)) + 1 <= (c * k) + 1 <= c * (k + 1) by the induction hypothesis provided that c > 1.
k + 1 is a perfect square. Since we are doing analysis of algorithms it is implied that we are using integer math, so sqrt(k + 1) = sqrt(k) + 1 in this case. Therefore, (k + 1) + 4 * sqrt(k + 1) = (k + 4 * sqrt(k)) + 5 <= (c * k) + 5 <= c * (k + 1) by the induction hypothesis provided that c >= 5.
Because these two cases cover all possibilities and in each case the claim is true for n = k + 1 when we choose c >= 5, we see that n + 4 * sqrt(n) <= 5 * n for all n >= 0 = n0. This concludes the proof that n + 4 * sqrt(n) = O(n).
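A quick brute-force check of the final bound (a sketch, using an integer square root to match the integer math assumed in the proof):

#include <stdio.h>

/* Integer square root: the largest r with r*r <= n. */
static unsigned long isqrt(unsigned long n) {
    unsigned long r = 0;
    while ((r + 1) * (r + 1) <= n)
        r++;
    return r;
}

int main(void) {
    /* Verify n + 4*sqrt(n) <= 5*n over a range of n. */
    for (unsigned long n = 0; n <= 100000; n++) {
        if (n + 4 * isqrt(n) > 5 * n) {
            printf("bound fails at n = %lu\n", n);
            return 1;
        }
    }
    printf("n + 4*sqrt(n) <= 5*n holds for all n in [0, 100000]\n");
    return 0;
}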
Prove 5n^2+ 2n -1 = O(n^2).
This is what I have attempted so far:
5n^2 + 2n - 1 <= 5n^2 + 2n^2 - n^2 for n > 1
5n^2+ 2n -1 <= 6n^2 for n>1
5n^2 + 2n - 1 = O(n^2) [ c = 6, n0 = 1 ]
Is this the right way of proving Big O notation?
To prove that your expression is O(n^2), you need to show that it is bounded by M*n^2, for some constant M and some minimum n value. By inspection, we can show that your expression is bounded by 10*n^2, for n=10:
For n = 10:
5n^2 + 2n -1 <= 10*n^2
500 + 20 - 1 <= 1000
519 <= 1000
We can also show that the expression is bounded by 10*n^2 for any value n greater than 10:
For n > 10:
5n^2 + 2n -1 <= 10*n^2
5*(10+i)^2 + 2*(10+i) -1 <= 10*(10+i)^2
5*(i^2 + 20i + 100) + 2i + 19 <= 10*(i^2 + 20i + 100)
2i + 19 <= 5*(i^2 + 20i + 100)
2i + 19 <= 5i^2 + 100i + 500
5i^2 + 98i + 481 >= 0, which is true for all `i >= 0`
Here is a link to the Wikipedia article on Big-O notation:
https://en.m.wikipedia.org/wiki/Big_O_notation
Update:
Note that in practice, in order to label your expression O(n^2), we won't resort to such a laboriously written-out proof. Instead, we will just recognize that the n^2 term dominates the expression for large n, so the overall behavior is O(n^2). (And since big-O is only an upper bound, your expression is also O(n^3), O(n^4), etc.)
It looks fine. If you are doing it for an assignment or other formal work, you can also select the value of the constant c in a more systematic way, for example by examining the ratio f(n)/g(n). Otherwise it looks correct.
We have f(n) = 5*n^2+2*n-1 and g(n) = n^2
In order to prove f(n) = O(g(n)), we just need to find two constants, namely c > 0 and n0, such that f(n) <= c*g(n) for all n >= n0.
Let's choose some value of c, say c = 5.5. Plotting f(n) and c*g(n) makes the bound visible, and we can also show it theoretically: 5.5*n^2 - (5*n^2 + 2*n - 1) = n^2/2 - 2*n + 1 = (n^2 - 4*n + 2)/2 = ((n-2)^2 - 2)/2 >= 0 for all n >= 4. This implies that 5*n^2 + 2*n - 1 <= 5.5*n^2 for all n >= n0 = 4. Hence, f(n) = O(g(n)). (proved)
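Following the suggestion above of examining f(n)/g(n), here is a small sketch showing the ratio settling toward 5, which is why any constant above 5 (5.5, 6, or 10) eventually works:

#include <stdio.h>

int main(void) {
    /* f(n)/g(n) = (5n^2 + 2n - 1) / n^2 tends to 5 as n grows. */
    for (double n = 1; n <= 1e6; n *= 10) {
        double f = 5 * n * n + 2 * n - 1;
        printf("n = %8.0f  f(n)/g(n) = %.6f\n", n, f / (n * n));
    }
    return 0;
}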
1. Given that T(0) = 1 and T(n) = T([2n/3]) + c (where [2n/3] rounds 2n/3 down), what is the big-Θ bound for T(n)? Is it simply log base 3/2 of n? Please tell me how to get the result.
2. Given the code
void mystery(int n) {
    if (n < 2)
        return;

    /* Four recursive calls (i = 1, 3, 5, 7), each on input n/3. */
    for (int i = 1; i <= 8; i += 2) {
        mystery(n / 3);
    }

    /* Theta(n^2) of non-recursive work. */
    int count = 0;
    for (int i = 1; i < n * n; i++) {
        count = count + 1;
    }
}
According to the master theorem, the big-O bound is n^2, but my result is n^2 * log3(n) (log base 3). I'm not sure of my result, and I don't really know how to deal with the runtime of recursive functions. Is it simply the log function?
Or what about a recurrence like T(n) = 4*T(n/3) + n^2, as in this code?
Cheers.
For (1), the recurrence solves to c*log_{3/2}(n) + c. To see this, you can use the iteration method to expand out a few terms of the recurrence and spot a pattern:
T(n) = T(2n/3) + c
= T(4n/9) + 2c
= T(8n/27) + 3c
= T((2/3)^k * n) + kc
Assuming that T(1) = c and solving for the choice of k that makes the expression inside the parentheses equal to 1, we get that
1 = (2/3)^k * n
(3/2)^k = n
k = log_{3/2}(n)
Plugging in this choice of k into the above expression gives the final result.
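To make the pattern concrete, here is a sketch (taking c = 1) that counts the iterations of n -> 2n/3 and compares the count with log_{3/2}(n):

#include <math.h>
#include <stdio.h>

int main(void) {
    /* With c = 1, T(n) is roughly the number of times n shrinks
     * by a factor of 2/3 before reaching 1. */
    for (double n = 10; n <= 1e12; n *= 1000) {
        int steps = 0;
        for (double m = n; m > 1.0; m *= 2.0 / 3.0)
            steps++;
        printf("n = %.0e  steps = %3d  log_{3/2}(n) = %6.2f\n",
               n, steps, log(n) / log(1.5));
    }
    return 0;
}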
For (2), you have the recurrence relation
T(n) = 4T(n/3) + n^2
Using the master theorem with a = 4, b = 3, and d = 2, we see that log_b(a) = log_3(4) ≈ 1.26 < d, so this solves to O(n^2). Here's one way to see this. At the top level, you do n^2 work. At the layer below that, you have four calls each doing n^2/9 work, so you do 4n^2/9 work, less than the top layer. The layer below that does 16 calls that each do n^2/81 work, for a total of 16n^2/81 work, again much less work than the layer above. Overall, each layer does exponentially less work than the layer above it, so the top layer ends up dominating all the other ones asymptotically.
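Here is that layer-by-layer argument as a short numeric sketch (n = 10^6 is an arbitrary example size): level k contributes 4^k * (n/3^k)^2 = n^2 * (4/9)^k, and the geometric series sums to 9/5 of the top level.

#include <stdio.h>

int main(void) {
    double n = 1e6;             /* arbitrary example size */
    double level_work = n * n;  /* level k does n^2 * (4/9)^k work */
    double total = 0.0;
    for (int k = 0; level_work >= 1.0; k++) {
        total += level_work;
        if (k < 4)
            printf("level %d: work = %.3e\n", k, level_work);
        level_work *= 4.0 / 9.0;
    }
    printf("total = %.3e, about %.3f times the top level's n^2\n",
           total, total / (n * n));
    return 0;
}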
Let's do some complexity analysis, and we'll find that the asymptotic behavior of T(n) depends on the constants of the recursion.
Given T(n) = A T(n*p) + C, with A,C>0 and p<1, let's first try to prove T(n)=O(n log n). We try to find D such that for large enough n
T(n) <= D * n * log(n)
This yields
A * D * (n*p) * log(n*p) + C <= D * n * log(n)
Looking at the higher order terms, this results in
A*D*p <= D
So, if A*p <= 1, this works, and T(n)=O(n log n).
In the special case that A<=1 we can do better, and prove that T(n)=O(log n):
T(n) <= D log(n)
Yields
A * D * log(n*p) + C <= D * log(n)
Looking at the higher order terms, this results in
A * D * log(n) + C + A * D * log(p) <= D * log(n)
Which is true for large enough D and n since A<=1 and log(p)<0.
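As a sanity check of this special case, here is a sketch with the concrete (hypothetical) choices A = 1, p = 1/2, C = 1, i.e. T(n) = T(n/2) + 1:

#include <math.h>
#include <stdio.h>

/* T(n) = T(n/2) + 1 with T(n) = 1 for n <= 1: an instance of the
 * A <= 1 case, which should track log(n). */
double T(double n) {
    if (n <= 1.0) return 1.0;
    return T(n / 2.0) + 1.0;
}

int main(void) {
    for (double n = 1e2; n <= 1e12; n *= 100) {
        printf("n = %.0e  T(n) = %5.1f  log2(n) = %5.1f\n",
               n, T(n), log2(n));
    }
    return 0;
}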
Otherwise, if A*p>1, let's find the minimal value of q such that T(n)=O(n^q). We try to find the minimal q such that there exists D for which
T(n) <= D n^q
This yields
A * D * p^q * n^q + C <= D * n^q
Looking at the higher order terms, this results in
A*D*p^q <= D
The minimal q that satisfies this is defined by
A*p^q = 1
So we conclude that T(n)=O(n^q) for q = - log(A) / log(p).
Now, given T(n) = A T(n*p) + B n^a + C, with A,B,C>0 and p<1, let's try to prove that T(n)=O(n^q) for some q. We try to find the minimal q>=a such that for some D>0,
T(n) <= D n^q
This yields
A * D * p^q * n^q + B * n^a + C <= D * n^q
Trying q = a, this will work only if
A * D * p^a + B <= D
I.e. T(n)=O(n^a) if Ap^a < 1.
Otherwise we get to Ap^q = 1 as before, which means T(n)=O(n^q) for q = - log(A) / log(p).
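Finally, a numeric sketch of that last conclusion, with the hypothetical constants A = 4, p = 1/3, a = 1, B = C = 1 (so A*p^a = 4/3 > 1 and q = log(4)/log(3) ≈ 1.26); the ratio T(n)/n^q should level off at a constant:

#include <math.h>
#include <stdio.h>

/* T(n) = A*T(n*p) + B*n^a + C, with T(n) = 1 for n <= 1. */
double T(double n, double A, double p, double B, double a, double C) {
    if (n <= 1.0) return 1.0;
    return A * T(n * p, A, p, B, a, C) + B * pow(n, a) + C;
}

int main(void) {
    double A = 4, p = 1.0 / 3.0, B = 1, a = 1, C = 1;
    double q = -log(A) / log(p);  /* = log(4)/log(3), about 1.26 */
    for (double n = 1e3; n <= 1e12; n *= 1000) {
        printf("n = %.0e  T(n)/n^q = %.4f\n",
               n, T(n, A, p, B, a, C) / pow(n, q));
    }
    return 0;
}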