Prove c·f(n) is in Θ(f(n)) for c > 0

c·f(n) is in Θ(f(n)) for c > 0
I know that c is a constant, and that if I can prove c·f(n) is in O(f(n)) and in Ω(f(n)) simultaneously, then it is also in Θ(f(n)). But how can I prove it? I'm confused.

c·f(n) is O(f(n)) because there is a constant k such that:
|c·f(n)| ≤ k·|f(n)| for all sufficiently large n
Hence, |c|·|f(n)| ≤ k·|f(n)|.
Dividing both sides by |f(n)| (for n where f(n) ≠ 0), we get |c| ≤ k.
So, any value of k at least |c| satisfies this condition. Therefore, c·f(n) is O(f(n)).
You can use the same method to show that c·f(n) is also Ω(f(n)), and therefore it is Θ(f(n)).
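A quick numeric sanity check of the argument above (a sketch; the choices f(n) = n·log n and c = 5 are arbitrary, since the ratio c·f(n)/f(n) is the constant c for any positive f):

```python
import math

def f(n):
    # Arbitrary positive example function; any positive f(n) behaves the
    # same, because the ratio c*f(n)/f(n) is identically c.
    return n * math.log(n)

c = 5.0  # arbitrary constant c > 0

for n in [10, 100, 10_000, 1_000_000]:
    ratio = (c * f(n)) / f(n)
    # Any k >= c witnesses the O bound; any 0 < k <= c witnesses Omega.
    assert math.isclose(ratio, c)
    print(n, ratio)
```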

Related

How to prove/disprove if (2^{n})^{1/3} is in Θ(2^{n}) using Big-O, Big-Omega and Big-Theta

So I understand Big-O, Big-Omega and Big-Theta conceptually, but I'm not sure how to prove Big-Omega and Big-Theta.
function f is in Big-O(g) if and only if there exists some constant c > 0 and some constant n_0 ≥ 1 such that for all n ≥ n_0, the expression f(n) ≤ c·g(n) is true.
Big-Omega is the opposite, c·g(n) ≤ f(n).
Big-Theta sandwiches: f is in Θ(g) iff c1·g(n) ≤ f(n) ≤ c2·g(n) for some constants c1, c2 > 0 and all n ≥ n_0.
I need to prove/disprove if (2^{n})^{1/3} ∈ Θ(2^{n}) by using all three notations.
What I have so far:
Big-O : (2^{n})^{1/3} ≤ c·2^{n} when c=1 and n_0 = 1, so (2^{n})^{1/3} ∈ O(2^{n})
Big-Omega : We can rewrite (2^{n})^{1/3} = (1/(2^{2n/3}))·(2^n). For c·g(n) ≤ f(n) to hold, c would have to be ≤ 1/(2^{2n/3}) for all n ≥ n_0, which is impossible for a fixed c > 0 since 1/(2^{2n/3}) → 0. So, there does not exist a c > 0 that satisfies c·g(n) ≤ f(n), and thus (2^{n})^{1/3} ∉ Ω(2^{n})
Big-Theta : Since (2^{n})^{1/3} ∉ Ω(2^{n}), there is no lower bound c1·g(n) ≤ f(n). Therefore, (2^{n})^{1/3} ∉ Θ(2^{n})
Is this how you are supposed to prove it?
First simplify f(n) = (2^n)^(1/3) to f(n) = 2^(n/3). Then take the limit of f(n)/g(n) as n → ∞, where g(n) = 2^n:
lim_{n→∞} 2^(n/3) / 2^n = lim_{n→∞} 1 / 2^(2n/3) = 0
Hence, f(n) = o(g(n)) (little-oh), which means f(n) is not in Θ(g(n)). Notice that f(n) = O(g(n)) (Big-Oh) as well, since little-oh implies Big-Oh.
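If it helps to see the limit numerically, here is a minimal sketch (plain Python; no assumptions beyond the functions named in the question):

```python
# The ratio 2^(n/3) / 2^n = 1 / 2^(2n/3) shrinks toward 0 as n grows,
# which is exactly the little-oh condition f(n) = o(g(n)).
for n in [3, 9, 30, 90]:
    print(n, 2 ** (n / 3) / 2 ** n)
```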

Big O notation constants

Given the following functions f(n) and g(n), is f in O(g(n)) or is f in Θ(g(n)),
or both? If true, specify a constant c and point n0, if false, briefly specify why.
(a) f(n) = 2n , g(n) = 2^2n
(b) f(n) = n!, g(n) = 2^n
I do understand for (a) that f(n) = O(g(n)) because g(n) upper-bounds f(n),
and for (b) that g(n) = O(f(n)) because of dominance, from the fact that n! > 2^n eventually.
I have done some research but could not find much on how to calculate the constants c and n0 for this type of question. Thanks for the reply :)
a) f(n) = 2n , g(n) = 2^(2n)
(I added parentheses.)
f(n) = O(g(n)) iff | f(n) | <= C * | g(n) | for some C>0 and all n > n0
2n <= 1*(2^(2n)) for n>1
Therefore 2n = O(2^(2n)). My constants are C=1 and n0 = 1. But others work too.
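To see both parts with concrete constants, here is a small check (a sketch; for (b) the crossover point n0 = 4 is found by inspection, not stated in the original answer):

```python
import math

# Part (a): 2n <= 1 * 2^(2n) for n > 1, so C = 1 and n0 = 1 work.
for n in range(2, 12):
    assert 2 * n <= 2 ** (2 * n)

# Part (b): n! overtakes 2^n, so 2^n = O(n!); print the comparison.
for n in range(1, 8):
    print(n, math.factorial(n), 2 ** n, math.factorial(n) > 2 ** n)
# n! > 2^n holds from n = 4 onward, so C = 1 and n0 = 4 witness 2^n = O(n!).
```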

How to prove the complexity of a logarithmic function?

Let's say you were given two logarithmic functions
and you were asked to find whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)). How would you go about it? I found questions like these easier when comparing two polynomial functions, because for example with x(n) = n^2 and p(n) = n^2 you could find a c > 0 (e.g. 3) where x(n) <= c·p(n) for all n greater than some n_0 > 0, and that would prove that x(n) = O(p(n)). However, comparing two logarithmic functions seems much more difficult for some reason. Any help is appreciated, thanks!
f(n) is O(g(n)) iff there is a constant c and n_0 such that f(n) <= c * g(n) for each n >= n_0.
f(n) is Ω(g(n)) iff there is a constant c and n_0 such that f(n) >= c * g(n) for each n >= n_0.
Now, f(n) is Θ(g(n)) iff f(n) is O(g(n)) and f(n) is Ω(g(n)).
So, in your cases, we have:
f(n) = log(n^2) = 2·log n
which means g(n) is log n and c = 2: f(n) <= 2·log n and f(n) >= 2·log n, which makes it Ω(log n) as well as O(log n), i.e. Θ(log n).
Btw, it is also true that f(n) <= n and f(n) >= 1, so f(n) is O(n) and Ω(1), but we don't use those, since we can find a tighter O(g(n)). Those two bounds don't use the same function g(n) in both notations, so for those choices we get no Θ. We just need one choice of g(n) that works in both directions to declare Θ; in cases where we can't find one, we say it is not Θ. Note the word "we say".
In the second case, we care only about the fastest-growing term, the log n part. Now, c = 1 and g(n) = log n, so in this case it is also Ω(log n).
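A one-line numeric check of the first case (a sketch; only f(n) = log(n^2) from the answer above is assumed):

```python
import math

# log(n^2) / log(n) is exactly 2 for every n > 1, so c = 2 witnesses
# both the upper and the lower bound: f(n) = Theta(log n).
for n in [10, 1_000, 10 ** 6]:
    print(n, math.log(n ** 2) / math.log(n))
```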

Prove that f(n) = Θ(g(n)) iff g(n) = Θ(f(n))

I have been given the problem:
f(n) and g(n) are asymptotically positive functions. Prove f(n) = Θ(g(n)) iff g(n) = Θ(f(n)).
Everything I have found points to this statement being invalid. For example an answer I've come across states:
f(n) = O(g(n)) implies g(n) = O(f(n))
f(n) = O(g(n)) means g(n) grows at least as fast as f(n). That cannot imply that f(n) also grows at least as fast as g(n). Hence not true.
Another states:
If f(n) = O(g(n)) then g(n) = O(f(n)). This is false. If f(n) = 1 and g(n) = n
for all natural numbers n, then f(n) <= g(n) for all natural numbers n, so
f(n) = O(g(n)). However, suppose g(n) = O(f(n)). Then there are a natural
number n0 and a constant c > 0 such that g(n) = n <= c·f(n) = c for all n >= n0,
which is impossible.
I understand that there are slight differences between my exact question and the examples I have found, but I've only been able to come up with solutions that do not prove it. Am I correct in thinking that it cannot be proved, or am I overlooking some detail?
You can start from here:
Formal Definition: f(n) = Θ (g(n)) means there are positive constants c1, c2, and k, such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ k.
Because you have that iff, you need to prove both directions: start from the left side and prove the right side, and then start from the right side and prove the left side.
Left -> right
We consider that:
f(n) = Θ(g(n))
and we want to prove that
g(n) = Θ(f(n))
So, we have some positive constants c1, c2 and k such that:
0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n), for all n ≥ k
The first relation between f and g is:
c1*g(n) ≤ f(n) => g(n) ≤ 1/c1*f(n) (1)
The second relation between f and g is:
f(n) ≤ c2*g(n) => 1/c2*f(n) ≤ g(n) (2)
If we combine (1) and (2), we obtain:
1/c2*f(n) ≤ g(n) ≤ 1/c1*f(n)
If you consider c3 = 1/c2 and c4 = 1/c1, they exist and are positive (because the denominators are positive). And this is true for all n ≥ k (where k can be the same).
So, we have some positive constants c3, c4, k such that:
c3*f(n) ≤ g(n) ≤ c4*f(n), for all n ≥ k
which means that g(n) = Θ(f(n)).
Analogous for right -> left.
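A small sketch of the constant-flipping step above (the functions f(n) = 3n^2 + n, g(n) = n^2 and the witnesses c1 = 3, c2 = 4, k = 1 are arbitrary example choices, not from the original question):

```python
# f = Theta(g) with witnesses c1, c2, k; flipping them gives g = Theta(f).
f = lambda n: 3 * n ** 2 + n
g = lambda n: n ** 2
c1, c2, k = 3, 4, 1

c3, c4 = 1 / c2, 1 / c1  # the flipped constants from the proof

for n in range(k, 100):
    assert c1 * g(n) <= f(n) <= c2 * g(n)  # f(n) = Theta(g(n))
    assert c3 * f(n) <= g(n) <= c4 * f(n)  # g(n) = Theta(f(n))
print("both sandwiches hold for n in [1, 99]")
```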

Proving if g(n) is o(f(n)), then f(n) + g(n) is Theta(f(n))

So I'm struggling with proving (or disproving) the statement above. I feel like it is true, but I'm not sure how to show it.
Again, the question is if g(n) is o(f(n)), then f(n) + g(n) is Theta(f(n))
Note, that is a little-o, not a big-o!!!
So far, I've managed to (easily) show that:
g(n) = o(f(n)) -> g(n) < c*f(n) for every c > 0 and all sufficiently large n
Then g(n) + f(n) < (c+1)*f(n) -> (g(n) + f(n)) = O(f(n))
However, for showing Big Omega, I'm not sure what to do there.
Am I going about this right?
EDIT: Everyone provided great help, but I could only mark one. THANK YOU.
One option would be to take the limit of (f(n) + g(n)) / f(n) as n tends toward infinity. If this converges to a finite, nonzero value, then f(n) + g(n) = Θ(f(n)).
Assuming that f(n) is nonzero for sufficiently large n, the above ratio, in the limit, is
(f(n) + g(n)) / f(n)
= f(n) / f(n) + g(n) / f(n)
= 1 + g(n) / f(n).
Therefore, taking the limit as n goes to infinity, the above expression converges to 1, because the ratio g(n)/f(n) goes to zero (this is what it means for g(n) to be o(f(n))).
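A numeric look at that limit (a sketch; the choices f(n) = n^2 and g(n) = n, for which g(n) is o(f(n)), are arbitrary):

```python
# (f(n) + g(n)) / f(n) = 1 + g(n)/f(n) tends to 1, a finite nonzero value.
for n in [1, 10, 1_000, 10 ** 6]:
    f, g = n ** 2, n
    print(n, (f + g) / f)
```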
So far so good.
For the next step, note that if 0 <= g(n), then f(n) <= g(n) + f(n); this should get you a lower bound on g(n) + f(n).
Before we begin, let's first state what the little-o and Big-Theta notations mean:
Little-o notation
Formally, g(n) = o(f(n)) (or g(n) ∈ o(f(n))) means that for every positive
constant ε there exists a constant N such that
|g(n)| ≤ ε*|f(n)|, for all n > N (+)
From https://en.wikipedia.org/wiki/Big_O_notation#Little-o_notation.
Big-Θ notation
h(n) = Θ(f(n)) means there exist positive constants k_1, k_2
and N, such that k_1 · |f(n)| and k_2 · |f(n)| are a lower bound
and an upper bound on |h(n)|, respectively, for n > N, i.e.
k_1 · |f(n)| ≤ |h(n)| ≤ k_2 · |f(n)|, for all n > N (++)
From https://www.khanacademy.org/computing/computer-science/algorithms/asymptotic-notation/a/big-big-theta-notation.
Given: g(n) ∈ o(f(n))
Hence, in our case, for every ε > 0 we can find some constant N such that (+) holds for our functions g(n) and f(n). That is, for any fixed ε > 0 we have
|g(n)| ≤ ε*|f(n)|, for all n > N
Choose a constant ε < 1 (recall, the above holds for every ε > 0), with its accompanying constant N.
Then the following holds for all n > N (the outer inequalities use |g(n)| ≤ ε·|f(n)| ≤ |f(n)|):
ε(|g(n)| + |f(n)|) ≤ 2|f(n)| ≤ 2(|g(n)| + |f(n)|) ≤ 4*|f(n)| (*)
Stripping away the left-most inequality in (*) and dividing by 2, we have:
|f(n)| ≤ |g(n)| + |f(n)| ≤ 2*|f(n)|, n>N (**)
We see that this is the very definition of Big-Θ notation, as presented in (++), with constants k_1 = 1, k_2 = 2 and h(n) = g(n)+f(n). Hence
(**) => g(n) + f(n) is in Θ(f(n))
And we have shown that g(n) ∈ o(f(n)) implies (g(n) + f(n)) ∈ Θ(f(n)).
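To make (**) concrete, a minimal numeric sketch (the functions f(n) = 2^n and g(n) = n^2, with g ∈ o(f), are arbitrary example choices):

```python
# The sandwich (**) with k_1 = 1 and k_2 = 2: it holds once n is large
# enough that |g(n)| <= |f(n)|, here n >= 4 since n^2 <= 2^n for n >= 4.
f = lambda n: 2 ** n
g = lambda n: n ** 2

for n in range(4, 60):
    h = f(n) + g(n)
    assert f(n) <= h <= 2 * f(n)
print("|f(n)| <= |g(n) + f(n)| <= 2*|f(n)| holds for n in [4, 59]")
```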
