Showing f(n) = O(f(n) + g(n))?

I was wondering what the proof for the following Big-O comparison is:
f(n) is O(f(n) + g(n))
I understand that we could use:
f(n) ≤ constant * (f(n) + g(n))
But I don't know how to proceed from there.
What about the case where we replace big-O with big-Ω?

If you know that the function g(n) is nonnegative, then note that
f(n) ≤ f(n) + g(n) = 1 · (f(n) + g(n))
Given this, could you use the formal definition of big-O notation to show that f(n) = O(f(n) + g(n))?
If g(n) isn't necessarily nonnegative, then this result isn't necessarily true. For example, take f(n) = n and g(n) = -n. Then f(n) + g(n) = 0, and it's not true that f(n) = O(0).
As for the Ω case, are you sure this result is necessarily true? As a hint, try picking f(n) = n and g(n) = n². Is f(n) really Ω(f(n) + g(n)) = Ω(n²) here?
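If it helps to see the hint concretely, here is a small Python check using that example pair (a numeric illustration, not a proof):

```python
# Numeric illustration (not a proof) of the hint above, using the
# example pair f(n) = n and g(n) = n**2.
def f(n): return n
def g(n): return n ** 2

for n in [10, 100, 1000, 10_000]:
    # Upper bound: f(n) <= 1 * (f(n) + g(n)) holds since g(n) >= 0.
    assert f(n) <= 1 * (f(n) + g(n))
    # Lower bound: the ratio f(n) / (f(n) + g(n)) shrinks toward 0,
    # so no fixed c > 0 can satisfy f(n) >= c * (f(n) + g(n)) for all
    # large n, which is why f(n) is not Omega(f(n) + g(n)) here.
    print(f"n={n:>6}: f(n)/(f(n)+g(n)) = {f(n) / (f(n) + g(n)):.6f}")
```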
Hope this helps!

Related

Proof of f(n) + o(f(n)) = Θ(f(n))

Can you please help me prove this? I am trying to set g(n) = o(f(n)) and then prove the equation
f(n) + g(n) = Θ(f(n)), but I don't know if that is the correct approach, and if it is, I don't know how to continue my solution. Thank you.
Assuming that all the functions are non-negative (otherwise you need to adjust the proof and definitions below to cope with signs):
Suppose g(n) = o(f(n)). That means that for every c > 0 there is an N such that n > N implies g(n) < c·f(n). In particular, there is an N such that n > N implies g(n) < f(n) (i.e., pick c = 1 in the definition).
From the assumption that the functions are non-negative, we also have f(n) ≤ f(n) + g(n).
Then for all n > N, f(n) ≤ f(n) + g(n) < 2f(n). Thus f(n) + g(n) = Θ(f(n)).
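If a concrete check helps, here is a quick Python sketch of the sandwich above, using an assumed example pair f(n) = n^2, g(n) = n (so g(n) = o(f(n)); it only illustrates the inequalities, it does not prove them):

```python
# Example pair (assumed for illustration): f(n) = n**2, g(n) = n,
# so g(n) = o(f(n)). Past N = 1 we have g(n) < f(n), hence the
# sandwich f(n) <= f(n) + g(n) < 2 * f(n) from the proof above.
def f(n): return n ** 2
def g(n): return n

for n in [2, 10, 100, 1000]:
    total = f(n) + g(n)
    print(n, f(n) <= total < 2 * f(n))  # True for every n > 1
```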

How to operate on Asymptotic Notation Function Sets, i.e., Big-O + Big-Omega?

I'm trying to determine if the following statement is true or false.
If f(n) ∈ O(n) and g(n) ∈ Ω(n), then f(n) + g(n) ∈ Θ(n).
I think I understand adding within the same asymptotic class: O(n) + O(n) = O(n).
However, I am unsure about adding or operating on the others combined.
For example:
If f(n) ∈ Θ(n log n), then f(n) * n = ?
Could the answer be both O(n^2 log n) and Θ(n^2 log n)?
Thank you in advance!
You can use the definitions of these symbols and try to find either a proof or a counterexample.
If f(n) = O(n) and g(n) = Ω(n), then f(n) + g(n) is not necessarily in Θ(n). As a counterexample, if f(n) = n and g(n) = n^2, then f(n) + g(n) = Θ(n^2). On the other hand, if f(n) = n and g(n) = n, then f(n) + g(n) = Θ(n). Hence, you can only say f(n) + g(n) = Ω(n) and nothing more.
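These two cases are easy to eyeball numerically; here is a tiny Python sketch (an illustration, not a proof):

```python
# Case 1: f(n) = n, g(n) = n**2. The ratio (f+g)(n) / n**2 approaches
# 1, so the sum grows like n**2, i.e. Theta(n**2) rather than Theta(n).
# Case 2: f(n) = n, g(n) = n. The ratio (f+g)(n) / n is exactly 2,
# so the sum is Theta(n).
for n in [10, 100, 1000, 10_000]:
    print(n, (n + n**2) / n**2, (n + n) / n)
```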

Asymptotic Growth: Understanding the specific proof of f(n) + o(f(n)) = Θ(f(n))?

I'm working through a proof of f(n) + o(f(n)) = Θ(f(n)) and I came across a part of the proof that I am having trouble understanding.
We let f(n) and g(n) be asymptotically positive functions and assume g(n) = o(f(n)).
The proof states that since f(n) + g(n) ≥ f(n) for all n, we can conclude that f(n) + g(n) = Ω(f(n)).
We can similarly conclude that f(n) + g(n) ≤ 2f(n) for sufficiently large n. Therefore f(n) + g(n) = O(f(n)).
I am having trouble understanding why f(n) + g(n) = Ω(f(n)) and f(n) + g(n) = O(f(n)) are true. How does adding g(n) to f(n) preserve a tight bound of f(n), and what exactly are we using about g(n)?
One way of proving that f(n) is Θ(g(n)) is to prove two separate statements: that f(n) is Ω(g(n)), and that f(n) is O(g(n)). That this suffices follows directly from the definitions of those notations.
In this problem, if we choose the constant c to be 1, then for every n we have f(n) + g(n) ≥ c · f(n), which by definition shows that f(n) + g(n) is Ω(f(n)). For the O(f(n)) part, choose the constant c to be 2: we need some n0 such that f(n) + g(n) ≤ c · f(n) for every n > n0, which is equivalent to g(n) ≤ f(n) for every n > n0, and that follows from the assumption g(n) = o(f(n)) (take c = 1 in the little-o definition).
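To make the two constants concrete, here is a small Python check on an assumed example pair f(n) = n^2, g(n) = n (a numeric illustration, not a proof):

```python
# Lower bound with c = 1 (holds for every n since g is non-negative),
# upper bound with c = 2 (holds once g(n) <= f(n); here n0 = 1 works).
def f(n): return n ** 2
def g(n): return n

for n in [1, 5, 50, 500]:
    print(n,
          1 * f(n) <= f(n) + g(n),   # Omega(f(n)) direction
          f(n) + g(n) <= 2 * f(n))   # O(f(n)) direction
```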
Hope this helps.

To prove or disprove the given Asymptotic notations

Prove or Disprove
1) max{f(n), g(n)} = O(f(n) + g(n)), where f(n) and g(n) are positive functions.
2) min{f(n), g(n)} = Ω(f(n) + g(n)), where f(n) and g(n) are positive functions.
My proof for the first question is something like: max{f(n), g(n)} ≤ f(n) + g(n), so setting the constant c to 1 makes the Big-O condition hold. Is this the right way to go about it? Also, I'm not sure how to prove the second question, so any help with that would be much appreciated.
The first one is correct, since max(f(n), g(n)) ≤ f(n) + g(n). The second one is incorrect, and you can show that with a counterexample. Suppose f(n) = log(n) and g(n) = n. Then min(f(n), g(n)) = log(n), but log(n) is not Ω(n + log(n)).
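A quick numeric look at this counterexample (illustration only, not a proof):

```python
import math

# With f(n) = log(n) and g(n) = n, the ratio min(f, g) / (f + g)
# tends to 0, so no constant c > 0 can witness
# min(f(n), g(n)) >= c * (f(n) + g(n)) for all large n.
for n in [10, 1000, 100_000, 10_000_000]:
    ratio = min(math.log(n), n) / (math.log(n) + n)
    print(f"n={n:>10}: {ratio:.8f}")
```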

Prove f(n) + g(n) is O(max(f(n),g(n)))

Hello, I am having a bit of difficulty proving the following.
f(n) + g(n) is O(max(f(n),g(n)))
This makes logical sense, and by looking at it I can tell you that it's correct, but I'm having trouble coming up with a proof.
Here is what I have so far:
c * max(f(n), g(n)) ≥ f(n) + g(n) for n > N
But I'm not sure how to pick a c and N to fit the definition because I don't know what f(n) and g(n) are.
Any help is appreciated.
f(n) + g(n) ≤ 2 · max{f(n), g(n)}
(for each n > 0, assuming f(n) and g(n) are non-negative functions)
Thus, for N = 1 and all n > N: f(n) + g(n) ≤ 2 · max{f(n), g(n)}, and by the definition of Big-O we can say that f(n) + g(n) is in O(max{f(n), g(n)}).
So basically, we use N = 1 and c = 2 for the formal proof by definition.
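For a concrete sanity check of these constants, here is a short Python sketch (the particular f and g are arbitrary sample choices, assumed only for illustration):

```python
import math

# Sample non-negative functions (placeholder choices for illustration).
def f(n): return n * math.log2(n)
def g(n): return n ** 1.5

# f(n) + g(n) <= 2 * max(f(n), g(n)) holds because each of f(n), g(n)
# is at most max(f(n), g(n)).
for n in [2, 16, 256, 4096]:
    print(n, f(n) + g(n) <= 2 * max(f(n), g(n)))  # True for all n
```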
