Let's say you were given two logarithmic functions like
and you were asked to find whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)), how would you go about it? I found questions like these easier when comparing two polynomial functions, because for example with x(n) = n^2 and p(n) = n^2 you could find a c > 0 (e.g. c = 3) where x(n) <= c*p(n) for all n greater than some n0 > 0, and that would prove that x(n) = O(p(n)). However, comparing two logarithmic functions seems much more difficult for some reason. Any help is appreciated, thanks!
f(n) is O(g(n)) iff there are constants c > 0 and n_0 such that f(n) <= c * g(n) for each n >= n_0.
f(n) is Ω(g(n)) iff there are constants c > 0 and n_0 such that f(n) >= c * g(n) for each n >= n_0.
Now, f(n) is Θ(g(n)) iff f(n) is O(g(n)) and f(n) is Ω(g(n)).
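As a rough numerical illustration of these definitions (not a proof, since it only samples finitely many n), you could spot-check a candidate c and n_0 in Python; holds_upper_bound is just a name I made up for this sketch:

    def holds_upper_bound(f, g, c, n0, n_max=10_000):
        # Spot-check f(n) <= c * g(n) for n0 <= n < n_max.
        # A False result refutes the chosen (c, n0); a True result is only
        # evidence, never a proof of the asymptotic claim.
        return all(f(n) <= c * g(n) for n in range(n0, n_max))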
So, in your cases, we have:
f(n) = log(n^2) = 2 log n
so we can take g(n) = log n and c = 2: then f(n) <= 2 * log n and f(n) >= 2 * log n, which makes f(n) both O(log n) and Ω(log n), i.e. Θ(log n).
Btw, it is also true that f(n) <= n and f(n) >= 1, so f(n) is O(n) and Ω(1) as well, but we don't usually quote those bounds, since we can find a tighter g(n). For that pair the function is not the same in both notations, so those choices do not give us a Θ bound; to declare Θ(g(n)) we just need one g(n) that works for both directions, and when we cannot find such a g(n), we say there is no Θ bound.
In the second case, we only care about the fastest-growing term, the log n part. Now, c = 1 and g(n) = log n, so this case is also Ω(log n).
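For a quick (non-rigorous) sanity check of the first case with c = 2 in Python:

    import math

    f = lambda n: math.log(n ** 2)   # = 2 * log n
    g = lambda n: math.log(n)

    # f(n) equals 2 * g(n) (up to floating-point noise) for every sampled n >= 2,
    # consistent with f(n) being both O(log n) and Omega(log n), i.e. Theta(log n).
    print(all(abs(f(n) - 2 * g(n)) < 1e-9 for n in range(2, 10_000)))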
Can you please help me prove this? I am trying to set g(n) = o(f(n)) and then show that
f(n) + g(n) = Θ(f(n)), but I don't know if that is the correct approach, and if it is, I don't know how to continue my solution. Thank you.
Assuming that all the functions are non-negative (otherwise you need to adjust the below proof and definitions to cope with signs).
Suppose g(n) = o(f(n)). That means that for all c>0, there's an N such that n>N implies g(n) < cf(n). So in particular, there's an N such that n>N implies g(n) < f(n) (ie: pick c=1 in the definition).
We also have from the assumption that the functions are non-negative that f(n) <= f(n) + g(n).
Then, for all n > N, f(n) <= f(n) + g(n) < 2f(n). Thus f(n) + g(n) = Θ(f(n)).
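A concrete numeric illustration (my own choice of f(n) = n^2 and g(n) = n, so that g(n) = o(f(n))):

    f = lambda n: n ** 2
    g = lambda n: n          # g(n) = o(f(n)) for this pair

    # For n >= 2 we expect f(n) <= f(n) + g(n) < 2 * f(n),
    # i.e. f(n) + g(n) is squeezed between 1 * f(n) and 2 * f(n).
    print(all(f(n) <= f(n) + g(n) < 2 * f(n) for n in range(2, 10_000)))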
I am learning about complexity theory and have a question asking to show truth/falsehood of a number of Big-O statements.
I've done the first few, e.g. showing 2^(n+1) is in O(2^n) by finding a constant c and an N value. But now they are asking more abstract things, for example:
If f(n) is O(g(n)), is log f(n) in O(log g(n))?
Is 2^(f(n)) in O(2^(g(n)))?
These both seem like they would be true, but I don't know how to express them formally with a constant and an N value. If you can give an example of how I could show these, I can go do the rest of the problems.
Here are some notes along the lines you are probably looking for.
Assume f(n) is O(g(n)). Then there exist n0 and c such that f(n) <= c*g(n) for n >= n0; we may as well take c >= 1. Take the logarithm of both sides: log(f(n)) <= log(c*g(n)). We can use the laws of logarithms to rewrite this as log(f(n)) <= log(c) + log(g(n)). If g(n) >= 2 for all large n, then log(g(n)) >= 1, so log(c) + log(g(n)) <= log(c)*log(g(n)) + log(g(n)) = (1 + log(c))*log(g(n)); hence we can choose c' = 1 + log(c) and get the desired result. Note that some such lower bound on g(n) is needed: if g(n) stays at 1 while f(n) stays at 2, then f(n) is O(g(n)), but log(f(n)) is a positive constant while log(g(n)) = 0, so log(f(n)) is not O(log(g(n))).
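As a quick sanity check of those constants (my own example, using base-2 logarithms, with f(n) = 8n and g(n) = n so that c = 8 works):

    from math import log2

    f = lambda n: 8 * n      # f(n) <= 8 * g(n) for all n, so c = 8, n0 = 1
    g = lambda n: n

    c_prime = 1 + log2(8)    # c' = 1 + log(c) = 4

    # Expect log(f(n)) <= c' * log(g(n)) once g(n) >= 2, i.e. n >= 2.
    print(all(log2(f(n)) <= c_prime * log2(g(n)) for n in range(2, 10_000)))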
The second one is not true. Choose f(n) = 2n and g(n) = n. We see f(n) is O(g(n)) by choosing c = 3. However, 2^(2n) = 4^n is not O(2^n). To see that, assume we had n0 and c. Then 4^n <= c*2^n. Dividing by 2^n gives 2^n <= c. But this can't be correct since n can increase indefinitely whereas c is fixed.
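You can also see the failure numerically: for any fixed c (c = 1000 below is an arbitrary choice), 4^n eventually exceeds c * 2^n:

    c = 1000   # any fixed constant; the argument works for every choice

    # 4**n <= c * 2**n would require 2**n <= c, which fails once n > log2(c).
    violations = [n for n in range(1, 30) if 4 ** n > c * 2 ** n]
    print(violations[:5])   # non-empty, e.g. starts at n = 10 for c = 1000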
I'm working through the proof of f(n) + o(f(n)) = Θ(f(n)) and I came across a part of the proof that I am having trouble understanding.
We let f(n) and g(n) be asymptotically positive functions and assume g(n) = O(f(n)).
In the proof, it states that since we know that f(n) + g(n) ≥ f(n) for all n, we can conclude that f(n) + g(n) = Ω(f(n)).
We can also conclude similarly that f(n) + g(n) ≤ 2 f(n). Therefore f(n) + g(n) = O(f(n)).
I am having trouble understanding why f(n) + g(n) = Ω(f(n)) and f(n) + g(n) = O(f(n)) would be true. How can we prove that these bounds still hold when we add g(n) to f(n)? What exactly are we using about g(n)?
One way of proving that f(n) is Θ(g(n)) is to prove two separate statements: that f(n) is Ω(g(n)), and that f(n) is O(g(n)). It's pretty clear from the definitions of those notations that this way of proving it is correct.
In this exact problem, if we choose the constant c to be 1, we have f(n) + g(n) >= c * f(n) for every n, which by definition shows that f(n) + g(n) is Ω(f(n)). For the O(f(n)) part, if we choose the constant c to be 2, we need to show that there exists some n0 such that f(n) + g(n) <= c * f(n) for every n > n0, which is equivalent to g(n) <= f(n) for every n > n0. That last inequality follows from g(n) = o(f(n)) (take c = 1 in the little-o definition); if all you are given is g(n) = O(f(n)) with some constant c0, then choose c = 1 + c0 instead, since f(n) + g(n) <= (1 + c0) * f(n) for large n.
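A small numeric illustration (my own choice of f(n) = n^3 and g(n) = 10n, so that g(n) = o(f(n)) and g(n) <= f(n) once n >= 4):

    f = lambda n: n ** 3
    g = lambda n: 10 * n     # g(n) = o(f(n))

    # Omega part: 1 * f(n) <= f(n) + g(n) for every n.
    # O part:     f(n) + g(n) <= 2 * f(n) once g(n) <= f(n), i.e. n >= 4 here.
    print(all(f(n) <= f(n) + g(n) <= 2 * f(n) for n in range(4, 10_000)))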
Hope this helps.
I am trying to prove that if f(n) and g(n) are asymptotically positive functions, then:
f(n) = O((f(n))^2)
f(n) = O(g(n)) implies 2^(f(n)) = O(2^(g(n)))
f(n) = O(g(n)) implies g(n) = O(f(n))
1) Theorem: If f(n) is an asymptotically positive function from natural numbers to natural numbers, then f(n) = O((f(n))^2) (note I have added an extra, perhaps implied, assumption).
Proof: Because f(n) is an asymptotically positive function from natural numbers to natural numbers, it is guaranteed that for all natural numbers n greater than or equal to some natural number n0, f(n) > 0, hence f(n) >= 1. Because f(n) is guaranteed to be positive we are free to multiply both sides of the inequality by f(n) without changing the direction to get f(n)^2 >= f(n). Therefore, we can choose c = 1 and use the n0 from the assumption to show that f(n) = O((f(n))^2). (Recall that by the definition of Big-Oh, f(n) = O(g(n)) if and only if there exist constants c > 0, n0 such that for n >= n0, f(n) <= c * g(n)).
2) Theorem: if f(n) and g(n) are asymptotically positive functions from natural numbers to natural numbers and f(n) = O(g(n)), then it is not necessarily true that 2^(f(n)) = O(2^(g(n))).
Proof: The proof is by counterexample. It can be shown that 4n = O(2n); 4n and 2n are both asymptotically positive functions from naturals to naturals. However, it can also be shown that 2^(4n) = 16^n is not O(2^(2n)) = O(4^n).
3) Theorem: if f(n) and g(n) are asymptotically positive functions from natural numbers to natural numbers and f(n) = O(g(n)), then it is not necessarily true that g(n) = O(f(n)).
Proof: The proof is by counterexample. It can be shown that n = O(n^2); n and n^2 are both asymptotically positive functions from naturals to naturals. However, it can also be shown that n^2 is not O(n).
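A quick empirical look at the three examples used above (finite samples only, not proofs):

    ns = range(1, 30)

    # (1) f(n) = O((f(n))^2): with f(n) = n, n <= 1 * n^2 for every n >= 1.
    print(all(n <= n ** 2 for n in ns))

    # (2) 16^n is not O(4^n): the ratio 16^n / 4^n = 4^n grows without bound.
    print([16 ** n // 4 ** n for n in range(1, 6)])   # [4, 16, 64, 256, 1024]

    # (3) n^2 is not O(n): the ratio n^2 / n = n grows without bound.
    print([n ** 2 // n for n in range(1, 6)])         # [1, 2, 3, 4, 5]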
f(n) = O((f(n))^2)
Any function is by default big-O of itself, i.e. we can use some constant c_big such that f(n) <= c_big * f(n).
Thus,
if f(n) is less than or equal to c_big * f(n),
then f(n) will definitely be less than or equal to c_big * f(n) * f(n), provided f(n) >= 1 for all sufficiently large n.
Mathematically, f(n) = O(f(n) * f(n)) = O((f(n))^2) is then true.
f(n) = O(g(n)) implies 2^(f(n)) = O(2^(g(n)))
f(n) = O(g(n)) implies that f(n) <= c * g(n) for some constant c and all large n (not simply f(n) <= g(n)).
Also, if some positive number n is less than m, then 2^n will be less than 2^m.
Using 1. and 2. above, the most we can conclude is 2^(f(n)) <= 2^(c * g(n)) = (2^(g(n)))^c, which is O(2^(g(n))) only when c <= 1. In general the implication fails, as the 16^n vs. 4^n counterexample in the previous answer shows.
f(n) = O(g(n)) implies g(n) = O(f(n))
This one is wrong.
f(n) = O(g(n)) implies g(n) = Ω(f(n)).
If f(n) = O(g(n)), then f(n) is bounded above by a constant multiple of g(n), which means that g(n) is bounded below by a constant multiple of f(n); therefore g(n) = Ω(f(n)).
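For example (an illustration of my own), take f(n) = n and g(n) = n^2:

    f = lambda n: n
    g = lambda n: n ** 2

    # The single inequality f(n) <= 1 * g(n) witnesses both
    # f(n) = O(g(n)) and g(n) = Omega(f(n)) with c = 1, n0 = 1.
    print(all(f(n) <= g(n) for n in range(1, 10_000)))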
Hello, I am having a bit of difficulty proving the following.
f(n) + g(n) is O(max(f(n),g(n)))
This makes logical sense, and by looking at it I can tell it's correct, but I'm having trouble coming up with a proof.
Here is what I have so far:
c * (max(f(n),g(n))) > f(n) + g(n) for n > N
But I'm not sure how to pick a c and N to fit the definition because I don't know what f(n) and g(n) are.
Any help is appreciated.
f(n) + g(n) <= 2 * max{f(n), g(n)}
(for each n > 0, assuming f(n), g(n) are non-negative functions)
Thus, for N=1, for all n>N: f(n) + g(n) <= 2*max{f(n),g(n)}, and we can say by definition of big O that f(n) + g(n) is in O(max{f(n),g(n)})
So basically, we use N=1, c=2 for the formal proof by definition.
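A quick spot check of exactly those constants (with f(n) = n^2 and g(n) = 3n as arbitrary non-negative example functions):

    f = lambda n: n ** 2
    g = lambda n: 3 * n      # any non-negative functions would do

    # With c = 2 and N = 1: f(n) + g(n) <= 2 * max(f(n), g(n)) for all n > N.
    print(all(f(n) + g(n) <= 2 * max(f(n), g(n)) for n in range(2, 10_000)))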