Prove that f(n) = Θ(g(n)) iff g(n) = Θ(f(n))

I have been given the problem:
f(n) and g(n) are asymptotically positive functions. Prove f(n) = Θ(g(n)) iff g(n) = Θ(f(n)).
Everything I have found points to this statement being invalid. For example, one answer I've come across states:
f(n) = O(g(n)) implies g(n) = O(f(n))
f(n) = O(g(n)) means g(n) grows at least as fast as f(n). It does not follow that f(n) also grows at least as fast as g(n). Hence the implication is not true.
Another states:
If f(n) = O(g(n)) then g(n) = O(f(n)). This is false. If f(n) = 1 and g(n) = n
for all natural numbers n, then f(n) ≤ g(n) for all natural numbers n, so
f(n) = O(g(n)). However, suppose g(n) = O(f(n)). Then there are a natural
number n0 and a constant c > 0 such that n = g(n) ≤ c*f(n) = c for all
n ≥ n0, which is impossible.
I understand that there are slight differences between my exact question and the examples I have found, but I have only been able to come up with solutions that do not prove it. Am I correct in thinking that it cannot be proved, or am I overlooking some detail?

You can start from here:
Formal definition: f(n) = Θ(g(n)) means there are positive constants c1, c2, and k such that 0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n) for all n ≥ k.
Because the statement is an "iff", you need to prove both directions: assume the left side and derive the right side, then assume the right side and derive the left side.
Left -> right
We consider that:
f(n) = Θ(g(n))
and we want to prove that
g(n) = Θ(f(n))
So, we have some positive constants c1, c2 and k such that:
0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n), for all n ≥ k
The first relation between f and g is:
c1*g(n) ≤ f(n) => g(n) ≤ (1/c1)*f(n) (1)
The second relation between f and g is:
f(n) ≤ c2*g(n) => (1/c2)*f(n) ≤ g(n) (2)
If we combine (1) and (2), we obtain:
(1/c2)*f(n) ≤ g(n) ≤ (1/c1)*f(n)
If you set c3 = 1/c2 and c4 = 1/c1, these constants exist and are positive (because c1 and c2 are positive), and the inequalities hold for all n ≥ k (the same k works).
So, we have some positive constants c3, c4, k such that:
c3*f(n) ≤ g(n) ≤ c4*f(n), for all n ≥ k
which means that g(n) = Θ(f(n)).
Right -> left is analogous: swap the roles of f and g in the argument above.
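As a quick numeric sanity check of the constant-flipping argument (a minimal sketch; the functions f(n) = 3n^2 + n, g(n) = n^2 and the constants c1 = 3, c2 = 4, k = 1 are my own illustrative picks, not part of the question):

    # If c1*g(n) <= f(n) <= c2*g(n) for n >= k, then flipping the
    # constants should give (1/c2)*f(n) <= g(n) <= (1/c1)*f(n).
    def f(n): return 3 * n**2 + n
    def g(n): return n**2

    c1, c2, k = 3, 4, 1  # 3n^2 <= 3n^2 + n <= 4n^2 holds for n >= 1

    for n in range(k, 1000):
        assert c1 * g(n) <= f(n) <= c2 * g(n)   # f(n) = Theta(g(n))
        assert f(n) / c2 <= g(n) <= f(n) / c1   # g(n) = Theta(f(n)), same k
    print("both directions hold with c3 = 1/c2 and c4 = 1/c1")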

Related

Asymptotic bounds and Big Θ notation

Suppose that f(n) = 4^n and g(n) = n^n. Is it correct to conclude that f(n) = Θ(g(n))?
In my opinion it is a correct claim, but I'm not 100% sure.
It is incorrect. f(n) = Θ(g(n)) if and only if both f(n) = O(g(n)) and g(n) = O(f(n)). It is true that f(n) = O(g(n)). We will show that it is not the case that g(n) = O(f(n)).
Assume g(n) = O(f(n)). Then there exist a positive real constant c and a positive natural number n0 such that for all n > n0, g(n) ≤ c*f(n). For our functions, this implies n^n ≤ c*4^n. If we take the nth root of both sides of this inequality, we get n ≤ 4*c^(1/n). We are free to assume c > 1 and n0 ≥ 1, since if smaller values worked, larger values would work too. For all c > 1 and n > 1, 4*c^(1/n) is strictly less than 4c. But then if we choose n > 4c, the inequality is false. So there cannot be an n0 such that the condition holds for all n at least n0. This is a contradiction, so our initial assumption is disproven.
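To see the divergence concretely (a small sketch; the sample points are arbitrary):

    # n^n / 4^n grows without bound, so no constant c can make
    # n^n <= c * 4^n hold for all large n.
    for n in [2, 4, 8, 16, 32]:
        print(n, n**n / 4**n)
    # ratios: 0.25, 1.0, 256.0, ~4.3e9, ~7.9e28 -- unbounded growth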

Finding the values in Big Oh

I am going through the asymptotic notations from here. I am reading the condition f(n) ≤ c g(n).
For example, if f(n) = 2n + 2, can we satisfy f(n) = O(g(n)) by adjusting the values of c and k any way we like, or is there a specific rule or formula for selecting them? Will k always be 1?
There is no formula per se. You can find the formal definition here:
f(n) = O(g(n)) means there are positive constants c and k, such that 0 ≤ f(n) ≤ cg(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n. (big-O notation).
What I understood from your question is that you are not getting the essence of big-O notation. If your complexity is, for example, O(n^2), then you are guaranteed that there is some threshold k after which f(n) will never exceed c g(n).
Let's try to prove f(n) = 2n + 2 is O(n):
As the function itself suggests, you cannot set c = 2: you want f(n) ≤ c g(n), and if you plug in c = 2 you have to find a k such that f(n) ≤ c g(n) for all n ≥ k. Clearly, there is no n for which 2n ≥ 2n + 2. So we move on to c = 3.
Now, let's find the value of k. So, we solve the equation 3n ≥ 2n + 2. Solving it:
3n ≥ 2n + 2
=> 3n - 2n ≥ 2
=> n ≥ 2
Therefore, for c = 3, we found the value k = 2 (so f(n) ≤ 3n for all n ≥ k).
You must also understand that your function isn't just O(n). It is also O(n^2), O(n^3), O(n^4), and so on, because corresponding values of c and k exist for g(n) = n^2, g(n) = n^3, and g(n) = n^4.
Hope it helps.
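To make this concrete, here is a small check of the constants found above (a sketch; the tested range is arbitrary):

    # c = 3, k = 2 witness f(n) = 2n + 2 = O(n); c = 2 fails for every n.
    def f(n): return 2 * n + 2

    assert all(f(n) <= 3 * n for n in range(2, 10_000))  # c = 3 works from k = 2
    assert all(f(n) > 2 * n for n in range(0, 10_000))   # c = 2 never works
    print("f(n) <= 3n for all n >= 2")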

If f(n) is Θ(h(n)) and g(n) = O(h(n)) then f(n) + g(n) is Θ(h(n)). True or False

I have been trying to prove/disprove the above.
I have proved that if f(n) is Θ(h(n)) and g(n) = O(h(n)), then f(n) + g(n) is O(h(n)),
but now, when I try to prove/disprove that f(n) + g(n) is also Ω(h(n)), I am facing a problem. Below is my approach.
From given,
There exist b, c > 0 such that b·h(n) ≤ f(n) ≤ c·h(n), and there exists a > 0 such that g(n) ≤ a·h(n).
I proved O(h(n)) by adding the two inequalities above, but to prove/disprove the lower bound formally I am stuck, since I have a lower bound on f(n) but no lower bound on g(n).
Also, I am getting confused about whether these notations always involve strict inequalities or not, e.g., if f(n) is Θ(h(n)), does the following statement hold:
there exist b, c > 0 such that b·h(n) ≤ f(n) ≤ c·h(n)?
Thank you.
Assuming f and g are positive,
f + g ≥ f,
and combining this with the lower bound b·h(n) ≤ f(n) from the Θ hypothesis gives b·h(n) ≤ f(n) + g(n), which implies
f + g = Ω(h(n)).
(As for the side question: the inequalities in the definition are non-strict, exactly as in the formal definition quoted at the top.)
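As a quick numeric illustration (a sketch; the functions h(n) = n^2, f(n) = 2n^2, g(n) = n and the constants are my own picks satisfying the hypotheses):

    # h(n) = n^2; f(n) = 2n^2 gives f = Theta(h) with b = c = 2;
    # g(n) = n gives g = O(h) with a = 1 (for n >= 1).
    def h(n): return n**2
    def f(n): return 2 * n**2
    def g(n): return n

    b, c, a = 2, 2, 1
    for n in range(1, 1000):
        s = f(n) + g(n)
        assert b * h(n) <= s           # lower bound: f + g >= f >= b*h
        assert s <= (c + a) * h(n)     # upper bound: add the two inequalities
    print("f + g = Theta(h) with constants b and c + a")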

Proving if g(n) is o(f(n)), then f(n) + g(n) is Theta(f(n))

So I'm struggling with proving (or disproving) the above question. I feel like it is true, but I'm not sure how to show it.
Again, the question is if g(n) is o(f(n)), then f(n) + g(n) is Theta(f(n))
Note, that is a little-o, not a big-o!!!
So far, I've managed to (easily) show that:
g(n) = o(f(n)) -> g(n) < c*f(n) for every c > 0 and all sufficiently large n
Then g(n) + f(n) < (c+1)*f(n) -> (g(n) + f(n)) = O(f(n))
However, for showing Big Omega, I'm not sure what to do there.
Am I going about this right?
EDIT: Everyone provided great help, but I could only mark one. THANK YOU.
One option would be to take the limit of (f(n) + g(n)) / f(n) as n tends toward infinity. If this converges to a finite, nonzero value, then f(n) + g(n) = Θ(f(n)).
Assuming that f(n) is nonzero for sufficiently large n, the above ratio, in the limit, is
(f(n) + g(n)) / f(n)
= f(n) / f(n) + g(n) / f(n)
= 1 + g(n) / f(n).
Therefore, taking the limit as n goes to infinity, the above expression converges to 1, because the ratio g(n)/f(n) goes to zero (this is what it means for g(n) to be o(f(n))).
So far so good.
For the next step, recall that 0 ≤ g(n); this gives you a lower bound on g(n) + f(n), namely f(n) + g(n) ≥ f(n).
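To watch the limit argument numerically (a sketch; f(n) = n^2 and g(n) = n·log n are my own illustrative choices with g = o(f)):

    import math

    # With g = o(f), the ratio (f(n) + g(n)) / f(n) = 1 + g(n)/f(n) -> 1.
    def f(n): return n**2
    def g(n): return n * math.log(n)

    for n in [10, 100, 1000, 10**6]:
        print(n, (f(n) + g(n)) / f(n))
    # printed ratios approach 1 as n grows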
Before we begin, let's first state what the little-o and Big-Θ notations mean:
Little-o notation
Formally, g(n) = o(f(n)) (or g(n) ∈ o(f(n))) means that for every positive
constant ε there exists a constant N such that
|g(n)| ≤ ε*|f(n)|, for all n > N (+)
From https://en.wikipedia.org/wiki/Big_O_notation#Little-o_notation.
Big-Θ notation
h(n) = Θ(f(n)) means there exist positive constants k_1, k_2
and N such that k_1 · |f(n)| and k_2 · |f(n)| are a lower bound
and an upper bound on |h(n)|, respectively, for n > N, i.e.
k_1 · |f(n)| ≤ |h(n)| ≤ k_2 · |f(n)|, for all n > N (++)
From https://www.khanacademy.org/computing/computer-science/algorithms/asymptotic-notation/a/big-big-theta-notation.
Given: g(n) ∈ o(f(n))
Hence, in our case, for every ε > 0 we can find some constant N such that (+) holds for our functions g(n) and f(n); that is, for all n > N we have
|g(n)| ≤ ε*|f(n)|
Choose a constant ε < 1 (recall, the above holds for all ε > 0),
with its accompanying constant N.
Then the following holds for all n>N
ε(|g(n)| + |f(n)|) ≤ 2|f(n)| ≤ 2(|g(n)| + |f(n)|) ≤ 4*|f(n)| (*)
Stripping away the left-most inequality in (*) and dividing by 2, we have:
|f(n)| ≤ |g(n)| + |f(n)| ≤ 2*|f(n)|, n>N (**)
We see that this is the very definition of Big-Θ notation, as presented in (++), with constants k_1 = 1, k_2 = 2 and h(n) = g(n) + f(n). Hence
(**) => g(n) + f(n) is in Θ(f(n))
and we have shown that g(n) ∈ o(f(n)) implies (g(n) + f(n)) ∈ Θ(f(n)).
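The constants in (**) can also be checked directly on a concrete pair (a sketch; f(n) = n^2, g(n) = n·log n, and the threshold N = 1 are my own picks; for these functions ε = 0.5 < 1 works):

    import math

    def f(n): return n**2
    def g(n): return n * math.log(n)

    N = 1  # for n > N, g(n) <= 0.5 * f(n), i.e. (+) holds with eps = 0.5 < 1
    for n in range(N + 1, 10_000):
        assert g(n) <= 0.5 * f(n)               # (+) with eps = 0.5
        assert f(n) <= f(n) + g(n) <= 2 * f(n)  # (**) with k_1 = 1, k_2 = 2
    print("g + f is in Theta(f) with k_1 = 1, k_2 = 2")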

Asymptotic proof examples

I came across two asymptotic function proofs.
1. f(n) = O(g(n)) implies 2^f(n) = O(2^g(n))
Given: f(n) ≤ C1 g(n)
So, 2^f(n) ≤ 2^C1 g(n) --(i)
Now, 2^f(n) = O(2^g(n)) → 2^f(n) ≤ C2 2^g(n) --(ii)
From (i) we find that (ii) will be true.
Hence 2^f(n) = O(2^g(n)) is TRUE.
Can you tell me if this proof is right? Is there any other way to solve this?
2. f(n) = O((f(n))^2)
How do I prove the second example? Here I considered two cases: one where f(n) < 1 and the other where f(n) ≥ 1.
Note: None of them are homework questions.
The attempted proof for example 1 is well-intentioned but flawed. First, "2^f(n) ≤ 2^C1 g(n)" reads as 2^f(n) ≤ (2^C1)*g(n), which in general is false; it should have been written 2^f(n) ≤ 2^(C1*g(n)). Second, in the line beginning with "Now", the claim "(ii) will be true" is asserted without justification: (ii) is what you are trying to prove, so you would at least have to exhibit an explicit C2. More importantly, no choice of C2 can work, because the claim itself is false in general: take f(n) = 2n and g(n) = n; then f(n) = O(g(n)), but 2^f(n) = 4^n is not O(2^n).
A function like f(n) = 1/n disproves the claim in example 2, because there are no constants N and C such that f(n) < C*(f(n))² for all n > N. Proof: let some N and C be given, and choose n > N with n > C. Then f(n) = 1/n = n*(1/n²) > C*(1/n²) = C*(f(n))². Because N and C were arbitrary, there are no fixed values of N and C such that f(n) < C*(f(n))² for all n > N, QED.
Saying that "f(n) ≥ 1" (at some point) is not enough to allow proving the second claim; but if you write "f(n) ≥ 1 for all n" (or "f(·) ≥ 1"), it is provable. For example, if f(n) = 1/n for odd n and 1 + n for even n, we have f(n) > 1 for even n > 0, and f(n) < 1 for odd n. To prove that f(n) = O((f(n))²) is false for this f, use the same proof as in the previous paragraph, with the additional provision that n is odd.
Actually, “f(n) ≥ 1 for all n” is stronger than necessary to ensure f(n) = O((f(n))²). Let ε be any fixed positive value. No matter how small ε is, “f(n) ≥ ε for all n > N'” ensures f(n) = O((f(n))²). To prove this, take C = max(1, 1/ε) and N=N'.
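A numeric illustration of both halves of this answer (a sketch; the constants C = 100 and ε = 0.25, and the helper f2, are arbitrary picks for demonstration):

    # Part 1: f(n) = 1/n is not O(f(n)^2): for any C, once n > C we have
    # f(n) = 1/n > C/n^2 = C * f(n)^2.
    C = 100
    for n in [101, 1000, 10**6]:
        fn = 1 / n
        assert fn > C * fn**2

    # Part 2: if f(n) >= eps for all n, then C = max(1, 1/eps) works.
    eps = 0.25
    def f2(n): return eps + 1 / n  # stays >= eps for all n >= 1
    C2 = max(1, 1 / eps)
    assert all(f2(n) <= C2 * f2(n)**2 for n in range(1, 10_000))
    print("counterexample and epsilon-bounded case both check out")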
