Proof of closure of asymptotically bounded functions under addition - complexity-theory

I would like to prove this statement:
If f(n) is Theta(h(n)) and g(n) = O(h(n)), then f(n) + g(n) is O(h(n)).
All functions are assumed to be non-negative and monotonically non-decreasing.
My attempted proof notes that since f(n) is Theta(h(n)) it is in particular O(h(n)), so both f(n) and g(n) are O(h(n)) and there exist:
positive constants n0, c0 such that f(n) <= c0*h(n) for all n >= n0
positive constants n1, c1 such that g(n) <= c1*h(n) for all n >= n1
Therefore it can be deduced that f(n) + g(n) <= (c0 + c1) * h(n) for all n >= max(n0, n1), meaning that f(n) + g(n) is O(h(n)).
Is this correct? Is there a more rigorous proof ... for example, by contradiction?

Your proof is correct and rigorous. It is a direct and constructive proof: you not only show that suitable constants c and n0 exist, you also show how to calculate them from the assumptions. No simpler method of proof comes to mind here.
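As a quick sanity check of the constructive argument, here is a minimal Python sketch. The concrete f, g, h and the constants c0, n0, c1, n1 are illustrative assumptions, not part of the question:

    # Minimal numeric sanity check of the constructive proof.
    # The concrete f, g, h and constants are illustrative assumptions.

    def h(n): return n * n
    def f(n): return 3 * n * n + 7      # f(n) = Theta(h(n)); f(n) <= 4*h(n) for n >= 3
    def g(n): return 2 * n * n + 5 * n  # g(n) = O(h(n));     g(n) <= 3*h(n) for n >= 5

    c0, n0 = 4, 3
    c1, n1 = 3, 5

    # The proof predicts f(n) + g(n) <= (c0 + c1) * h(n) for all n >= max(n0, n1).
    for n in range(max(n0, n1), 10_000):
        assert f(n) + g(n) <= (c0 + c1) * h(n)
    print("bound holds on the tested range")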

Related

Proof of f(n) + o(f(n)) = Θ(f(n))

Can you please help me prove this? I am trying to set g(n) = o(f(n)) and then solve the equation
f(n) + g(n) = Θ(f(n)), but I don't know if this is the correct approach, and if it is, I don't know how to continue my solution. Thank you.
Assume that all the functions are non-negative (otherwise you need to adjust the proof and definitions below to cope with signs).
Suppose g(n) = o(f(n)). That means that for every c > 0 there is an N such that n > N implies g(n) < cf(n). So in particular there is an N such that n > N implies g(n) < f(n) (i.e., pick c = 1 in the definition).
We also have, from the assumption that the functions are non-negative, that f(n) <= f(n) + g(n).
Then for all n > N, f(n) <= f(n) + g(n) < 2f(n). Thus f(n) + g(n) = Θ(f(n)).
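To see the sandwich concretely, here is a small Python check. The specific choices f(n) = n and g(n) = sqrt(n) are an illustrative assumption; any g(n) = o(f(n)) behaves the same way once n is large enough:

    import math

    # Illustrative assumption: f(n) = n and g(n) = sqrt(n), so g(n) = o(f(n)).
    def f(n): return n
    def g(n): return math.sqrt(n)

    # g(n) < f(n) holds once sqrt(n) < n, i.e. for n > 1, so take N = 1.
    N = 1
    for n in range(N + 1, 10_000):
        assert f(n) <= f(n) + g(n) < 2 * f(n)
    print("f(n) <= f(n) + g(n) < 2*f(n) holds on the tested range")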

Asymptotic Growth: Understanding the specific proof of f(n) + little o(f(n)) = theta (f(n))?

I'm working through the proof of f(n) + o(f(n)) = Θ(f(n)) and I came across a part that I am having trouble understanding.
We let f(n) and g(n) be asymptotically positive functions and assume g(n) = o(f(n)).
In the proof, it states that since f(n) + g(n) ≥ f(n) for all n, we can conclude that f(n) + g(n) = Ω(f(n)).
We can also conclude, similarly, that f(n) + g(n) ≤ 2f(n) for sufficiently large n. Therefore f(n) + g(n) = O(f(n)).
I am having trouble understanding why f(n) + g(n) = Ω(f(n)) and f(n) + g(n) = O(f(n)) are true. How can we prove that the bound stays tight when we add g(n) to f(n)? What exactly are we concluding from the value of g(n)?
One way of proving that f(n) is Θ(g(n)) is to prove two separate statements: that f(n) is Ω(g(n)), and that f(n) is O(g(n)). It is clear from the definitions of those notations that this approach is correct.
In this exact problem, if we choose the constant c = 1, then for every n we have f(n) + g(n) >= c * f(n), which by definition shows that f(n) + g(n) is Ω(f(n)). For the O(f(n)) part, choose c = 2: we need to show that there exists some n0 such that f(n) + g(n) <= c * f(n) for every n > n0, which is equivalent to g(n) <= f(n) for every n > n0, and that follows from g(n) = o(f(n)) by taking c = 1 in its definition.
Hope this helps.
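The two directions can also be checked separately with exactly the constants from the answer, c = 1 for the Ω part and c = 2 for the O part. The choice f(n) = n and g(n) = log n below is an illustrative assumption:

    import math

    # Illustrative assumption: f(n) = n and g(n) = log n, so g(n) = o(f(n)).
    def f(n): return n
    def g(n): return math.log(n)

    # Omega part, c = 1: f(n) + g(n) >= f(n) because g(n) >= 0 for n >= 1.
    # O part,     c = 2: f(n) + g(n) <= 2*f(n) reduces to g(n) <= f(n),
    #                    which holds here for every n >= 1.
    for n in range(1, 10_000):
        assert f(n) + g(n) >= 1 * f(n)   # Omega(f(n)) with c = 1
        assert f(n) + g(n) <= 2 * f(n)   # O(f(n)) with c = 2
    print("both directions hold on the tested range")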

How to prove the complexity of a logarithmic function?

Let's say you were given two logarithmic functions f(n) and g(n)
and you were asked to find whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)). How would you go about it? I found questions like these easier when comparing two polynomial functions, because, for example, with x(n) = n^2 and p(n) = n^2 you could find a c > 0 (e.g., 3) where x(n) <= c*p(n) for all n greater than some n0 > 0, and that would prove that x(n) = O(p(n)). However, comparing two logarithmic functions seems much more difficult for some reason. Any help is appreciated, thanks!
f(n) is O(g(n)) iff there are constants c and n_0 such that f(n) <= c * g(n) for each n >= n_0.
f(n) is Ω(g(n)) iff there are constants c and n_0 such that f(n) >= c * g(n) for each n >= n_0.
Now, f(n) is Θ(g(n)) iff f(n) is O(g(n)) and f(n) is Ω(g(n)).
So, in your cases, we have:
f(n) = log(n^2) = 2 log n
which means we can take g(n) = log n and c = 2: then f(n) <= 2 log n and f(n) >= 2 log n both hold, so f(n) is O(log n) and Ω(log n), i.e. Θ(log n).
By the way, f(n) <= n and f(n) >= 1 also hold for large enough n, so f(n) is O(n) too, but we don't use that bound, since we can find a tighter O(g(n)). In that case we don't have the same function in both notations, so for g(n) = n we don't have Ω(g(n)). However, we just need one valid option for g(n) to declare Ω; in cases where we can't find one, we say it's not Ω. Note the words "we say".
In the second case, only the fastest-growing term matters, the log n part. Now c = 1 and g(n) = log n, so in this case it is also Ω(log n).
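For the first case you can watch the ratio settle at exactly the constant c = 2. This sketch assumes the first function is f(n) = log(n^2), as used in the answer:

    import math

    # Assumes the first function is f(n) = log(n^2), as in the answer.
    def f(n): return math.log(n ** 2)
    def g(n): return math.log(n)

    # log(n^2) = 2*log(n), so f(n)/g(n) is the constant 2 for every n > 1
    # (up to floating-point rounding), giving both f <= 2g and f >= 2g,
    # i.e. Theta(log n).
    for n in (2, 10, 1_000, 10 ** 6):
        print(n, f(n) / g(n))   # ratio is 2 each time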

big-o and big-omega related to big-theta recursion

Let's suppose that a recursive formula is O(n^2) and, at the same time, Ω(n^2). Does this imply that the recursion is Θ(n^2)?
To make a long story short: the answer is yes, it does. See the proof below.
Though everybody has heard about big-O notation, let's recall what exactly these notations mean, with the help of Introduction to Algorithms. In the general case one writes Ο(g(n)), Ω(g(n)), Θ(g(n)), but we will consider your specific case.
Ο(n^2)
The Ο(n^2) notation defines a set of functions, each of which satisfies the following statement: there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c*n^2 holds for all n ≥ n0.
So f(n) is just a function from Ο(n^2). Examples: 13n, 5, 4n^2 + 5. All of these belong to Ο(n^2).
Ω(n^2)
The Ω(n^2) notation defines a set of functions, each of which satisfies the following statement: there exist positive constants c and n0 such that 0 ≤ c*n^2 ≤ f(n) holds for all n ≥ n0.
So f(n) is just a function from Ω(n^2). Examples: n^4 + n - 1, 3^n, n^2 - 12. All of these belong to Ω(n^2).
Θ(n^2)
The Θ(n^2) notation defines a set of functions, each of which satisfies the following statement: there exist positive constants c1, c2, and n0 such that 0 ≤ c1*n^2 ≤ f(n) ≤ c2*n^2 holds for all n ≥ n0.
Again, f(n) is just a function from Θ(n^2). Representatives are n^2/2 + 3 and 5n^2.
Proof
I take "a recursive formula is O(n^2) and at the same time Ω(n^2)" to mean that there is a function (let's call it f(n)) belonging to both
Ω(n^2) and Ο(n^2).
From Ω(n^2) we get positive constants c1 and n1 such that c1*n^2 ≤ f(n) holds for all n ≥ n1. From Ο(n^2) we get positive constants c2 and n2 such that f(n) ≤ c2*n^2 holds for all n ≥ n2. Taking n0 = max(n1, n2), we have c1*n^2 ≤ f(n) ≤ c2*n^2 for all n ≥ n0, which is exactly what Θ(n^2) requires.
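A quick numeric illustration with one of the Θ(n^2) representatives listed above; the constants c1 = 1/2, c2 = 1, n0 = 3 are my own illustrative choices:

    # Check c1*n^2 <= f(n) <= c2*n^2 for f(n) = n^2/2 + 3, one of the
    # Theta(n^2) representatives above. The constants are illustrative.
    def f(n): return n * n / 2 + 3

    c1, c2, n0 = 0.5, 1.0, 3
    for n in range(n0, 10_000):
        assert c1 * n * n <= f(n) <= c2 * n * n
    print("c1*n^2 <= f(n) <= c2*n^2 holds on the tested range")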

Prove f(n) + g(n) is O(max(f(n),g(n)))

Hello, I am having a bit of difficulty proving the following:
f(n) + g(n) is O(max(f(n), g(n)))
This makes logical sense, and by looking at it I can tell that it's correct, but I'm having trouble coming up with a proof.
Here is what I have so far:
c * max(f(n), g(n)) > f(n) + g(n) for n > N
But I'm not sure how to pick a c and N to fit the definition because I don't know what f(n) and g(n) are.
Any help is appreciated.
f(n) + g(n) <= 2 * max{f(n), g(n)},
since each of f(n) and g(n) is at most max{f(n), g(n)} (for each n > 0, assuming f(n), g(n) are non-negative functions).
Thus, for N = 1 and all n > N: f(n) + g(n) <= 2 * max{f(n), g(n)}, and we can say by the definition of big O that f(n) + g(n) is in O(max{f(n), g(n)}).
So basically, we use N = 1, c = 2 for the formal proof by definition.
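Here is the same argument as a runnable check; the concrete f and g are illustrative assumptions, since the inequality holds for any non-negative pair:

    # Numeric check of f(n) + g(n) <= 2*max(f(n), g(n)) with N = 1, c = 2.
    # The concrete f and g are illustrative; any non-negative pair works.
    def f(n): return n * n
    def g(n): return 10 * n + 100

    N, c = 1, 2
    for n in range(N + 1, 10_000):
        assert f(n) + g(n) <= c * max(f(n), g(n))
    print("f(n) + g(n) <= 2*max(f(n), g(n)) holds on the tested range")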
