Proof of f(n) + o(f(n)) = Θ(f(n))

Can you please help me prove this? I am trying to set g(n) = o(f(n)) and then solve the equation
f(n) + g(n) = Θ(f(n)), but I don't know if that is the correct approach, and if it is, I don't know how to continue my solution. Thank you.

Assume that all the functions are non-negative (otherwise you need to adjust the proof and definitions below to cope with signs).
Suppose g(n) = o(f(n)). That means that for every c > 0 there is an N such that n > N implies g(n) < cf(n). So in particular there is an N such that n > N implies g(n) < f(n) (i.e., pick c = 1 in the definition).
Since the functions are non-negative, we also have f(n) <= f(n) + g(n).
Then, for all n > N, f(n) <= f(n) + g(n) < 2f(n). Thus f(n) + g(n) = Θ(f(n)).
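As a quick numerical illustration of this sandwich (not a proof), here is a small Python sketch; the concrete choices f(n) = n^2 and g(n) = n are assumptions picked so that g(n) = o(f(n)) and N = 1 works:

# Numerical illustration (not a proof) of f(n) <= f(n) + g(n) < 2*f(n) for n > N.
# Hypothetical example functions: f(n) = n**2 and g(n) = n, so g(n) = o(f(n)) and N = 1 works.
def f(n): return n * n
def g(n): return n

N = 1
for n in range(N + 1, 100):
    assert f(n) <= f(n) + g(n) < 2 * f(n)
print("f(n) <= f(n) + g(n) < 2*f(n) holds for 1 < n < 100")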

Related

Asymptotic Growth: Understanding the specific proof of f(n) + little-o(f(n)) = Theta(f(n))?

I'm working through the proof of f(n) + o(f(n)) = Theta(f(n)) and I came across a part that I am having trouble understanding.
We let f(n) and g(n) be asymptotically positive functions and assume g(n) = O(f(n)).
In the proof, it states that since we know that f(n) + g(n) ≥ f(n) for all n, we can conclude that f(n) + g(n) = Ω(f(n)).
We can also conclude similarly that f(n) + g(n) ≤ 2 f(n). Therefore f(n) + g(n) = O(f(n)).
I am having trouble understanding why f(n) + g(n) = Ω(f(n)) and f(n) + g(n) = O(f(n)) are true. How can we prove that the tight lower bound still holds when we add g(n) to f(n)? What exactly are we concluding from the value of g(n)?
One way of proving that f(n) is Θ(g(n)) is to prove two separate statements: that f(n) is Ω(g(n)) and that f(n) is O(g(n)). It is clear from the definitions of those notations that this approach is correct.
In this exact problem, if we choose the constant c to be 1, we have for every n that f(n) + g(n) >= c * f(n), which by definition shows that f(n) + g(n) is Ω(f(n)). For the O(f(n)) part, choose the constant c to be 2: we need to show that there exists some n0 such that f(n) + g(n) <= c * f(n) for every n > n0, which is equivalent to g(n) <= f(n) for every n > n0, and that is exactly the definition of g(n) = O(f(n)) given in the problem statement.
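To make this concrete, here is a tiny Python helper that spot-checks candidate Θ constants numerically (an illustration only: it tests finitely many n, so it can refute a guess but never prove one); the example functions f(n) = n^2 and g(n) = n are assumptions:

# Spot-check candidate Theta constants (c1, c2, n0) over a finite range of n.
# This can only refute a guess, never prove it, since it checks finitely many n.
def looks_like_theta(h, f, c1, c2, n0, n_max=10_000):
    return all(c1 * f(n) <= h(n) <= c2 * f(n) for n in range(n0, n_max))

# Hypothetical example functions: f(n) = n**2 and g(n) = n, with h(n) = f(n) + g(n).
def f(n): return n * n
def g(n): return n
def h(n): return f(n) + g(n)

print(looks_like_theta(h, f, c1=1, c2=2, n0=1))  # True: c1 = 1, c2 = 2, n0 = 1 work here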
Hope this helps.

How to prove the complexity of a logarithmic function?

Let's say you were given two logarithmic functions, and you were asked to find whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)). How would you go about it? I found questions like these easier when comparing two polynomial functions: for example, with x(n) = n^2 and p(n) = n^2 you could find a c > 0 (e.g. 3) such that x(n) <= c * p(n) for all n greater than some n0 > 0, and that would prove that x(n) = O(p(n)). However, comparing two logarithmic functions seems much more difficult for some reason. Any help is appreciated, thanks!
f(n) is O(g(n)) iff there is a constant c and n_0 such that f(n) <= c * g(n) for each n >= n_0.
f(n) is Ω(g(n)) iff there is a constant c and n_0 such that f(n) >= c * g(n) for each n >= n_0.
Now, f(n) is Θ(g(n)) iff f(n) is O(g(n)) and f(n) is Ω(g(n)).
So, in your cases, we have:
f(n) = log(n^2) = 2 log n
which means g(n) is log n and c = 2, so f(n) <= 2 * log n and f(n) >= 2 * log n, which makes it Ω(log n) (and, since both bounds hold with the same g(n), also Θ(log n)).
Btw., it's also true that f(n) <= n and f(n) >= 1, so f(n) could be called O(n), but we don't use that, since we can find a better O(g(n)). In that case we don't have the same function in both notations, so those bounds don't give us Θ. However, we just need one choice of g(n) that works in both to declare Θ; in cases where we can't find one, we say it's not Θ. Note the word "we say".
In the second case, we care only about the fastest-growing term, the log n part. Now c = 1 and g(n) = log n, so in this case it is also Ω(log n).
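Since log(n^2) is exactly 2 * log(n), the ratio of the two functions is the constant 2, which is precisely what makes c = 2 work. A small Python illustration (the sample values of n are arbitrary):

# The ratio log(n^2) / log(n) is exactly 2 for every n > 1,
# so c = 2 witnesses both f(n) <= 2*log(n) and f(n) >= 2*log(n).
import math

for n in [10, 100, 1000, 10**6]:
    f = math.log(n ** 2)
    print(n, f / math.log(n))  # prints 2.0 (up to floating point) each time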

Proving if g(n) is o(f(n)), then f(n) + g(n) is Theta(f(n))

So I'm struggling with proving (or disproving) the above question. I feel like it is true, but I'm not sure how to show it.
Again, the question is if g(n) is o(f(n)), then f(n) + g(n) is Theta(f(n))
Note, that is a little-o, not a big-o!!!
So far, I've managed to (easily) show that:
g(n) = o(f(n)) -> g(n) < c*f(n)
Then g(n) + f(n) < (c+1)*f(n) -> (g(n) + f(n)) = O(f(n))
However, for showing Big Omega, I'm not sure what to do there.
Am I going about this right?
EDIT: Everyone provided great help, but I could only mark one. THANK YOU.
One option would be to take the limit of (f(n) + g(n)) / f(n) as n tends toward infinity. If this converges to a finite, nonzero value, then f(n) + g(n) = Θ(f(n)).
Assuming that f(n) is nonzero for sufficiently large n, the above ratio, in the limit, is
(f(n) + g(n)) / f(n)
= f(n) / f(n) + g(n) / f(n)
= 1 + g(n) / f(n).
Therefore, taking the limit as n goes to infinity, the above expression converges to 1, because the ratio g(n)/f(n) goes to zero (this is what it means for g(n) to be o(f(n))).
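The same limit can be checked symbolically. A minimal SymPy sketch, with the hypothetical example functions f(n) = n^2 and g(n) = n (so that g(n) = o(f(n))):

# Symbolic check of the limit argument with assumed example functions.
import sympy as sp

n = sp.symbols('n', positive=True)
f = n**2
g = n

print(sp.limit(g / f, n, sp.oo))        # 0: confirms g(n) = o(f(n)) for these examples
print(sp.limit((f + g) / f, n, sp.oo))  # 1: finite and nonzero, so f(n) + g(n) = Theta(f(n))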
So far so good.
For the next step, recall that 0 <= g(n); this gives you a lower bound on g(n) + f(n), namely f(n) itself.
Before we begin, let's first state what the little-o and Big-Θ notations mean:
Little-o notation
Formally, g(n) = o(f(n)) (or g(n) ∈ o(f(n))) means that for every positive constant ε
there exists a constant N such that
|g(n)| ≤ ε*|f(n)|, for all n > N (+)
From https://en.wikipedia.org/wiki/Big_O_notation#Little-o_notation.
Big-Θ notation
h(n) = Θ(f(n)) means there exist positive constants k_1, k_2
and N such that k_1 · |f(n)| and k_2 · |f(n)| are a lower bound
and an upper bound on |h(n)|, respectively, for n > N, i.e.
k_1 · |f(n)| ≤ |h(n)| ≤ k_2 · |f(n)|, for all n > N (++)
From https://www.khanacademy.org/computing/computer-science/algorithms/asymptotic-notation/a/big-big-theta-notation.
Given: g(n) ∈ o(f(n))
Hence, in our case, for every ε > 0 we can find some constant N such that (+) holds for our functions g(n) and f(n), i.e.
|g(n)| ≤ ε*|f(n)|, for all n > N
Choose a constant ε < 1 (recall, the above holds for all ε > 0), with its accompanying constant N.
Then the following holds for all n>N
ε(|g(n)| + |f(n)|) ≤ 2|f(n)| ≤ 2(|g(n)| + |f(n)|) ≤ 4*|f(n)| (*)
Stripping away the left-most inequality in (*) and dividing by 2, we have:
|f(n)| ≤ |g(n)| + |f(n)| ≤ 2*|f(n)|, n>N (**)
We see that this is the very definition of Big-Θ notation, as presented in (++), with constants k_1 = 1, k_2 = 2 and h(n) = g(n) + f(n). Hence
(**) => g(n) + f(n) is in Θ(f(n))
And we have shown that g(n) ∈ o(f(n)) implies (g(n) + f(n)) ∈ Θ(f(n)).
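To make the recipe in (**) tangible, here is a small Python sketch under assumed example functions (f(n) = n^2, g(n) = 10n); the values ε = 1/2 and N = 20 below are specific to these examples, not part of the general proof:

# Pick eps < 1, find an N beyond which g(n) <= eps*f(n),
# then check the sandwich f(n) <= f(n) + g(n) <= 2*f(n) for n > N.
def f(n): return n * n
def g(n): return 10 * n

eps = 0.5  # any eps < 1 works in the proof; 0.5 is an arbitrary choice
N = 20     # for these examples, 10*n <= 0.5*n*n exactly when n >= 20

for n in range(N + 1, 10_000):
    assert g(n) <= eps * f(n)
    assert f(n) <= f(n) + g(n) <= 2 * f(n)
print("Theta sandwich verified for 20 < n < 10000")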

Showing f(n) = O(f(n) + g(n))?

I was wondering what the proof for the following Big-O comparison is:
f(n) is O(f(n) + g(n))
I understand that we could use:
f(n) ≤ constant * (f(n) + g(n))
But I don't know how to follow up.
What about the case where we replace big-O with big-Ω?
If you know that the function g(n) is nonnegative, then note that
f(n) ≤ f(n) + g(n) = 1 · (f(n) + g(n))
Given this, could you use the formal definition of big-O notation to show that f(n) = O(f(n) + g(n))?
If g(n) isn't necessarily nonnegative, then this result isn't necessarily true. For example, take f(n) = n and g(n) = -n. Then f(n) + g(n) = 0, and it's not true that f(n) = O(0).
As for the Ω case, are you sure this result is necessarily true? As a hint, try picking f(n) = n and g(n) = 2^n. Is f(n) really Ω(f(n) + g(n)) here?
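With f(n) = n and a faster-growing g(n) such as g(n) = 2^n, the ratio f(n)/(f(n) + g(n)) shrinks toward 0, so no fixed c > 0 can satisfy f(n) >= c * (f(n) + g(n)) for all large n. A tiny Python sketch of that ratio:

# When g grows strictly faster than f, f(n)/(f(n) + g(n)) tends to 0,
# so no positive constant c can keep f(n) >= c*(f(n) + g(n)) for all large n.
for n in [1, 5, 10, 20, 30]:
    f, g = n, 2 ** n
    print(n, f / (f + g))  # 0.333..., 0.135..., 0.0097..., rapidly approaching 0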
Hope this helps!

Prove f(n) + g(n) is O(max(f(n),g(n)))

Hello I am having a bit of difficulty proving the following.
f(n) + g(n) is O(max(f(n),g(n)))
This makes logical sense, and by looking at it I can tell that it's correct, but I'm having trouble coming up with a proof.
Here is what I have so far:
c * (max(f(n),g(n))) > f(n) + g(n) for n > N
But I'm not sure how to pick a c and N to fit the definition because I don't know what f(n) and g(n) are.
Any help is appreciated.
f(n) + g(n) <= 2* max{f(n),g(n)}
(for each n > 0, assuming f(n), g(n) are non-negative functions)
Thus, for N=1, for all n>N: f(n) + g(n) <= 2*max{f(n),g(n)}, and we can say by definition of big O that f(n) + g(n) is in O(max{f(n),g(n)})
So basically, we use N=1, c=2 for the formal proof by definition.
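A quick numerical spot-check of the choice c = 2, N = 1 (the example functions below are assumptions; the inequality holds for any non-negative f and g):

# Check f(n) + g(n) <= 2*max(f(n), g(n)) with c = 2, N = 1.
# Hypothetical example functions: the max switches from g(n) to f(n) at n = 100.
def f(n): return n * n
def g(n): return 100 * n

for n in range(1, 10_000):
    assert f(n) + g(n) <= 2 * max(f(n), g(n))
print("f(n) + g(n) <= 2*max(f(n), g(n)) holds for 1 <= n < 10000")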
