Prove or disprove the following implication (Big O Notation)

I couldn't prove this:
f(n) = O(g(n)) implies f(n)^k = O(g(n)^k)
where k is a positive natural number.
I've found similar examples on the internet, but I'm not sure whether it's right to apply those solutions to this example.

Go back to the definition of big-o.
f(n) = O(g(n)) ⇔ ∃M ∈ ℝ⁺,
∃n₀ ∈ ℕ,
such that:
∀n > n₀,
|f(n)| ≤ M·|g(n)|
If k > 0, then |f(n)|^k ≤ (M·|g(n)|)^k = M^k·|g(n)|^k for all n > n₀, so the constant M^k witnesses f(n)^k = O(g(n)^k).
If k < 0, the inequality is reversed.
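As a quick numeric sanity check (not part of the proof; the choices of f, g, M, n0, and k below are arbitrary illustrative ones), the same constant M^k witnesses the bound for the k-th powers:

```python
# Sanity check: if |f(n)| <= M*|g(n)| for n > n0, then
# |f(n)|^k <= M^k * |g(n)|^k for the same n0 (k > 0).
# f, g, M, n0, k are arbitrary illustrative choices.
f = lambda n: 3 * n + 7        # f(n) = O(g(n)) with M = 4, n0 = 7
g = lambda n: n
M, n0, k = 4, 7, 3

for n in range(n0 + 1, 1000):
    assert abs(f(n)) <= M * abs(g(n))              # original bound
    assert abs(f(n))**k <= (M**k) * abs(g(n))**k   # powered bound, constant M^k
```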

Related

Is log(f(n)^c)=Ω(log(g(n))) or not

You are given functions f and g such that f(n)=Ω(g(n)). Is log(f(n)^c)=Ω(log(g(n)))? (Here c is some positive constant.) You should assume that f and g are non-decreasing and always bigger than 1.
This is a question from my algorithms course, and I can't figure out whether it's true or false, or whether it depends on the constant or on the functions f and g.
It's straightforward to prove. f(n) = Ω(g(n)) means liminf_{n→∞} f(n)/g(n) > 0.
As f and g are non-decreasing and greater than 1, and log is an increasing function, liminf_{n→∞} log(f(n))/log(g(n)) > 0. Hence, log(f(n)) = Ω(log(g(n))).
On the other hand, log(f(n)^c) = c·log(f(n)). As c is a positive constant factor, log(f(n)^c) is Ω(log(g(n))) as well, and your claim is correct.
First, I point out that instead of this notation f(n) = Ω(g(n)) I use this f(n) ∈ Ω(g(n))
From the Omega definition we have:
f(n) ∈ Ω(g(n)) <=> ∃s,k > 0 | f(n) >= s*g(n) ∀n >= k
Taking logarithms (log is increasing and both sides are positive):
log(f(n)) >= log(s) + log(g(n)) ∀n >= k
Multiplying by the constant c > 0:
log(f(n)^c) = c*log(f(n)) >= c*log(g(n)) + c*log(s) ∀n >= k
If g is unbounded, then log(g(n)) → ∞, so for large enough n we have c*log(s) >= -(c/2)*log(g(n)), and therefore log(f(n)^c) >= (c/2)*log(g(n)). If g is bounded, log(g(n)) is bounded above while c*log(f(n)) is eventually bounded below by a positive constant (f is non-decreasing and greater than 1), so a suitable constant exists as well.
Either way, log(f(n)^c) ∈ Ω(log(g(n))), and in particular log(f(n)^c) ∈ Ω(c*log(g(n))).
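A small numeric illustration (not a proof; the concrete f, g, and c below are arbitrary choices) showing log(f(n)^c) staying above a constant multiple of log(g(n)) when f(n) ∈ Ω(g(n)):

```python
import math

# Illustrative choices: f(n) = n^3 is Omega(g(n)) = Omega(2n^2) with s = 1
# for n >= 2; we check log(f(n)^c) >= 1 * log(g(n)) on a sample range.
f = lambda n: n**3
g = lambda n: 2 * n**2
c = 5

for n in range(2, 10_000):
    lhs = c * math.log(f(n))     # log(f(n)^c) = c*log(f(n))
    rhs = math.log(g(n))
    assert lhs >= rhs            # witness constant 1 works for these choices
```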

Asymptotic bounds and Big Θ notation

Suppose that f(n)=4^n and g(n)=n^n, will it be right to conclude that f(n)=Θ(g(n)).
In my opinion it's a correct claim but I'm not 100% sure.
It is incorrect. f(n) = Theta(g(n)) if and only if both f(n) = O(g(n)) and g(n) = O(f(n)). It is true that f(n) = O(g(n)). We will show that it is not the case that g(n) = O(f(n)).
Assume g(n) = O(f(n)). Then there exists a positive real constant c and a positive natural number n0 such that for all n > n0, g(n) <= c * f(n). For our functions, this implies n^n <= c * 4^n. Taking the nth root of both sides of this inequality gives n <= 4 * c^(1/n). We are free to assume c >= 1 and n0 >= 1, since if a smaller value worked, a larger value would work too. For all c >= 1 and n > 1, c^(1/n) <= c, so 4 * c^(1/n) <= 4c. But then if we choose n > 4c, the inequality n <= 4 * c^(1/n) is false. So, there cannot be an n0 such that for all n at least n0 the condition holds. This is a contradiction; our initial assumption is disproven.
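A quick numeric illustration of why no constant can work (the candidate constant below is an arbitrary choice):

```python
# n^n / 4^n is unbounded, so no constant c can satisfy n^n <= c * 4^n
# for all large n.
def ratio(n):
    return n**n / 4**n

# The ratio keeps growing...
assert ratio(10) < ratio(20) < ratio(40)

# ...so any candidate constant c (here an arbitrary 10^6) is eventually exceeded.
c = 10**6
assert any(n**n > c * 4**n for n in range(1, 100))
```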

f(n) = log(n)^m is O(n) for all natural numbers m?

A TA told me that this is true today but I was unable to verify this by googling. This is saying functions like log(n)^2, log(n)^3, ... , log(n)^m are all O(n).
Is this true?
Claim
The function f(n) = log(n)^m, for any natural number m > 2 (m ∈ ℕ+), is in
O(n).
I.e., there exist positive constants c and n0 such that
the following holds:
log(n)^m < c · n, for all n > n0, { m > 2 | m ∈ ℕ+ } (+)
Proof
Assume that (+) does not hold, and denote this assumption as (*).
I.e., given (*), there exist no positive constants c and n0 such that (+) holds for a given value of m > 2. Under this assumption, the following holds: for all positive constants c and n0, there exists an n > n0 such that (thanks @Oriol):
log(n)^m ≥ c · n, { m > 2 | m ∈ ℕ+ } (++)
Now, if (++) holds, then the inequality in (++) will also hold after applying any monotonically increasing function to both sides of the inequality. One such function is, conveniently, the log function itself.
Hence, under the assumption that (++) holds, for all positive constants c and n0, there exists an n > n0 such that the following holds
log(log(n)^m) ≥ log(c · n), { m > 2 | m ∈ ℕ+ }
m · log(log(n)) ≥ log(c · n), { m > 2 | m ∈ ℕ+ } (+++)
However, (+++) is obviously a contradiction: since log(n) dominates (w.r.t. growth) over log(log(n)),
we can—for any given value of m > 2—always find a set of constants c and n0 such that (+++) (and hence (++)) is violated for all n > n0.
Hence, assumption (*) is proven false by contradiction, and hence, (+) holds.
=> for f(n) = log(n)^m, for any finite integer m > 2, it holds that f ∈ O(n).
Yes. If the function is written f(n), then m is a parameter and f does not depend on it. In fact, we have a different function f_m for each m.
f_m(n) = log(n)^m
Then it's easy. Given m ∈ ℕ, apply L'Hôpital's rule repeatedly:
limsup_{n→∞} f_m(n)/n = limsup_{n→∞} log(n)^m/n
= limsup_{n→∞} m·log(n)^(m-1)/n
= limsup_{n→∞} m·(m-1)·log(n)^(m-2)/n
= … = limsup_{n→∞} m!/n = 0
Therefore, f_m ∈ O(n).
Of course, it would be different if we had f(m,n) = log(n)^m. For example, taking m = n:
limsup_{n→∞} f(n,n)/n = limsup_{n→∞} log(n)^n/n = ∞
Then f ∉ O(n).
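Both behaviours are easy to observe numerically (m = 5 and the sample points are arbitrary choices):

```python
import math

# For fixed m, log(n)^m / n shrinks toward 0; with m tied to n,
# log(n)^n / n blows up. m = 5 is an arbitrary illustrative choice.
m = 5
fixed = [math.log(n)**m / n for n in (10**3, 10**6, 10**9)]
assert fixed[0] > fixed[1] > fixed[2]          # ratio is decreasing

tied = [math.log(n)**n / n for n in (10, 20, 30)]
assert tied[0] < tied[1] < tied[2]             # ratio is growing
```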
In many ways it is more intuitive that for any positive integer m we have:
x^m = O(e^x)
This says that exponential growth dominates polynomial growth (which is why exponential time algorithms are bad news in computer programming).
Assuming that this is true, simply take x = log(n) and use the fact that then x tends to infinity if and only if n tends to infinity and that e^x and log(x) are inverses:
log(n)^m = O(e^log(n)) = O(n)
Finally, since for any natural number m, the root function n => n^(1/m) is increasing, we can rewrite the result as
log(n) = O(n^(1/m))
This way of writing it says that log(n) grows slower than any root (square, cube, etc.) of n, which more obviously corresponds to e^n growing faster than any power of n.
On Edit: the above showed that log(n)^m = O(n) follows from the more familiar x^m = O(e^x). To make this a self-contained proof, we can also show the latter somewhat directly.
Start with the Taylor series for e^x:
e^x = 1 + x/1! + x^2/2! + x^3/3! + ... + x^n/n! + ...
This is known to converge for all real numbers x. If a positive integer m is given, let K = (m+1)!. Then, if x > K we have 1/x < 1/(m+1)!, hence
x^m = x^(m+1)/x < x^(m+1)/(m+1)! < e^x
which implies x^m = O(e^x). (The last inequality in the above is true since all terms in the expansion for e^x are strictly positive if x>0 and x^(m+1)/(m+1)! is just one of those terms.)
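The inequality chain can be spot-checked numerically (m = 3 is an arbitrary choice):

```python
import math

# For x > K = (m+1)!, check x^m < x^(m+1)/(m+1)! < e^x.
m = 3
K = math.factorial(m + 1)      # K = 24 for m = 3

for x in range(K + 1, K + 200):
    assert x**m < x**(m + 1) / math.factorial(m + 1) < math.exp(x)
```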

Prove that f(n) = Θ(g(n)) iff g(n) = Θ(f(n))

I have been given the problem:
f(n) and g(n) are asymptotically positive functions. Prove f(n) = Θ(g(n)) iff g(n) = Θ(f(n)).
Everything I have found points to this statement being invalid. For example an answer I've come across states:
f(n) = O(g(n)) implies g(n) = O(f(n))
f(n) = O(g(n)) means g(n) grows at least as fast as f(n). It cannot imply that f(n) grows
at least as fast as g(n). Hence not true.
Another states:
If f(n) = O(g(n)) then g(n) = O(f(n)). This is false. If f(n) = 1 and g(n) = n
for all natural numbers n, then f(n) <= g(n) for all natural numbers n, so
f(n) = O(g(n)). However, suppose g(n) = O(f(n)). Then there are a natural
number n0 and a constant c > 0 such that g(n) = n <= c·f(n) = c for all n >=
n0, which is impossible.
I understand that there are slight differences between my exact question and the examples I have found, but I've only been able to come up with solutions that do not prove it. Am I correct in thinking that it cannot be proved, or am I overlooking some detail?
You can start from here:
Formal Definition: f(n) = Θ (g(n)) means there are positive constants c1, c2, and k, such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ k.
Because you have that iff, you need to start from the left side and to prove the right side, and then start from the right side and prove the left side.
Left -> right
We consider that:
f(n) = Θ(g(n))
and we want to prove that
g(n) = Θ(f(n))
So, we have some positive constants c1, c2 and k such that:
0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n), for all n ≥ k
The first relation between f and g is:
c1*g(n) ≤ f(n) => g(n) ≤ 1/c1*f(n) (1)
The second relation between f and g is:
f(n) ≤ c2*g(n) => 1/c2*f(n) ≤ g(n) (2)
If we combine (1) and (2), we obtain:
1/c2*f(n) ≤ g(n) ≤ 1/c1*f(n)
If you consider c3 = 1/c2 and c4 = 1/c1, they exist and are positive (because the denominators are positive). And this is true for all n ≥ k (where k can be the same).
So, we have some positive constants c3, c4, k such that:
c3*f(n) ≤ g(n) ≤ c4*f(n), for all n ≥ k
which means that g(n) = Θ(f(n)).
Analogous for right -> left.
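The constant swap can be checked on a concrete pair of functions (f, g, c1, c2, and k below are illustrative choices, not part of the proof):

```python
# If c1*g(n) <= f(n) <= c2*g(n) for n >= k, then the derived constants
# c3 = 1/c2 and c4 = 1/c1 give c3*f(n) <= g(n) <= c4*f(n).
f = lambda n: 2 * n + 3
g = lambda n: n
c1, c2, k = 2, 5, 2            # 2n <= 2n+3 <= 5n holds for n >= 2

c3, c4 = 1 / c2, 1 / c1
for n in range(k, 1000):
    assert c1 * g(n) <= f(n) <= c2 * g(n)    # f(n) = Theta(g(n))
    assert c3 * f(n) <= g(n) <= c4 * f(n)    # g(n) = Theta(f(n))
```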

Asymptotic proof examples

I came across two asymptotic function proofs.
1. f(n) = O(g(n)) implies 2^f(n) = O(2^g(n))
Given: f(n) ≤ C1 g(n)
So, 2^f(n) ≤ 2^C1 g(n) --(i)
Now, 2^f(n) = O(2^g(n)) → 2^f(n) ≤ C2 2^g(n) --(ii)
From,(i) we find that (ii) will be true.
Hence 2^f(n) = O(2^g(n)) is TRUE.
Can you tell me if this proof is right? Is there any other way to solve this?
2. f(n) = O((f(n))^2)
How can I prove the second example? Here I consider two cases: one where f(n) < 1 and the other where f(n) > 1.
Note: None of them are homework questions.
The attempted-proof for example 1 looks well-intentioned but is flawed. First, “2^f(n) ≤ 2^C1 g(n)” means 2^f(n) ≤ (2^C1)*g(n), which in general is false. It should have been written 2^f(n) ≤ 2^(C1*g(n)). In the line beginning with “Now”, you should explicitly say C2 = 2^C1. The claim “(ii) will be true” is vacuous (there is no (ii)).
A function like f(n) = 1/n disproves the claim in example 2 because there are no constants N and C such that for all n > N, f(n) < C*(f(n))². Proof: Let some N and C be given. Choose n>N, n>C. f(n) = 1/n = n*(1/n²) > C*(1/n²) = C*(f(n))². Because N and C were arbitrarily chosen, this shows that there are no fixed values of N and C such that for all n > N, f(n) < C*(f(n))², QED.
Saying that “f(n) ≥ 1” is not enough to allow proving the second claim; but if you write “f(n) ≥ 1 for all n” or “f() ≥ 1” it is provable. For example, if f(n) = 1/n for odd n and 1+n for even n, we have f(n) > 1 for even n > 0, and less than 1 for odd n. To prove that f(n) = O((f(n))²) is false, use the same proof as in the previous paragraph but with the additional provision that n is even.
Actually, “f(n) ≥ 1 for all n” is stronger than necessary to ensure f(n) = O((f(n))²). Let ε be any fixed positive value. No matter how small ε is, “f(n) ≥ ε for all n > N'” ensures f(n) = O((f(n))²). To prove this, take C = max(1, 1/ε) and N=N'.
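The counterexample argument for f(n) = 1/n can be checked numerically (the (C, N) pairs sampled below are arbitrary):

```python
# With f(n) = 1/n, for any candidate C and N there is an n > N with
# f(n) > C * f(n)^2 -- picking n > max(N, C) works, since 1/n > C/n^2
# whenever n > C.
def f(n):
    return 1 / n

for C, N in [(1, 10), (100, 5), (10**6, 10**3)]:
    n = max(N, C) + 1
    assert f(n) > C * f(n) ** 2
```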
