I have two functions, f(n) = log₂ n and g(n) = log₁₀ n. I am trying to decide whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)). I think I should take the limit of f(n)/g(n) as n goes to infinity, and I think that limit is a constant, so f(n) must be Θ(g(n)).
Am I right?
log₂ n = log₁₀ n / log₁₀ 2 (by the change-of-base formula)
So f(n) = g(n) / log₁₀ 2.
So f(n) and g(n) only differ by a constant factor (since log₁₀ 2 is a constant).
So, from the definitions of O(x), Ω(x) and Θ(x), I'd say:
f(n) ∈ O(g(n)),
f(n) ∈ Ω(g(n)),
f(n) ∈ Θ(g(n))
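To line this up with the formal definitions (a short sketch of the witnessing constants): since f(n) = c · g(n) exactly, with c = 1/log₁₀ 2 = log₂ 10 ≈ 3.32, the single constant c serves as both the lower-bound constant (giving f(n) ∈ Ω(g(n))) and the upper-bound constant (giving f(n) ∈ O(g(n))), and the two together give f(n) ∈ Θ(g(n))).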
Yes, you are right. From a complexity point of view (at least a big-O point of view), it doesn't matter whether it is log₂ or log₁₀.
f(n) is both O(g(n)) and Ω(g(n)), so f(n) is Θ(g(n)).
As the limit is a constant, you are right that f(n) ∈ Θ(g(n)). Also, of course, g(n) ∈ Θ(f(n)).
BTW: not only is the limit of the ratio a constant, but log₂ n / log₁₀ n is the same constant (log₂ 10) for every n.
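A quick numeric check (a minimal sketch using Python's standard math module):

    import math

    # log2(n) / log10(n) should equal log2(10) ≈ 3.3219 for every n > 1.
    for n in [2, 10, 1000, 10**6, 10**12]:
        print(n, math.log2(n) / math.log10(n))

    print("log2(10) =", math.log2(10))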
I know that f(n) grows more slowly than g(n), but could f(n) have the same growth rate as g(n), given that there is an equality sign?
Based on the big-O definition, yes. For example, n is in O(n) as well. In that case f(n) = n and g(n) = n are even equal, which is a stronger relation than having the same growth rate.
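Checking n ∈ O(n) against the formal definition directly: n ≤ 1 · n holds for every n ≥ 1, so the constants c = 1 and n0 = 1 witness it.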
I came across this statement. As per my understanding, Θ lies between big O and Ω, but I am unable to understand why an intersection appears here. Can I get a mathematical as well as an intuitive understanding of Θ(g(n)) = O(g(n)) ∩ Ω(g(n))?
Θ(g(n)) means that the function is bounded both above and below by constant multiples of g(n).
Mathematically, if a function f(n) is Θ(g(n)), then there are positive constants c1 and c2 such that
0 ≤ c1 · g(n) ≤ f(n) ≤ c2 · g(n) for all n greater than some constant k.
Now,
O(g(n)) is an upper bound, so a function that is O(g(n)) is bounded above by a constant multiple of g(n).
Ω(g(n)) is a lower bound, so a function that is Ω(g(n)) is bounded below by a constant multiple of g(n).
O(g(n)) ∩ Ω(g(n)) therefore consists of the functions sandwiched between constant multiples of g(n) from both above and below, which by definition is Θ(g(n)).
Mathematically, that means 0 ≤ c1 · g(n) ≤ f(n) ≤ c2 · g(n) for all sufficiently large n.
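A concrete instance (my own example, not from the question): take f(n) = 3n^2 + 5n and g(n) = n^2. For all n ≥ 1, 3 · n^2 ≤ 3n^2 + 5n ≤ 8 · n^2, so c1 = 3 witnesses f(n) ∈ Ω(g(n)), c2 = 8 witnesses f(n) ∈ O(g(n)), and together they give f(n) ∈ Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).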
If a function f(n) grows more slowly than a function g(n), why is f(n) = O(g(n))?
e.g. if f(n) is 4n^4 and g(n) is log((4n)^(n^4))
My book says f = O(g(n)) because g = n^4 * log(4n) = n^4 (log n + log 4) = O(n^4 * log n). I understand why g = O(n^4 * log n), but I'm not sure how they reached the conclusion that f = O(g(n)) from big O of g.
I understand that f(n) grows more slowly than g(n) just by thinking about the graphs, but I'm having trouble understanding asymptotic behavior and why f=O(g(n)) in general.
The formal definition of big-O notation is that
f(n) = O(g(n)) if there are constants n0 and c such that for any n ≥ n0, we have f(n) ≤ c · g(n).
In other words, f(n) = O(g(n)) if for sufficiently large values of n, the value of f(n) is upper-bounded by some constant multiple of g(n).
Notice that this just says that f(n) is eventually upper-bounded by a constant multiple of g(n), not that f(n)'s rate of growth is the same as g(n)'s rate of growth. In that sense, you can think of f(n) = O(g(n)) as akin to saying something like "f ≤ g": f doesn't grow faster than g, leaving open the possibility that g grows a lot faster than f does.
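Applying the definition to the question's functions (a sketch; I'm using natural log, but any base only changes the constant): with f(n) = 4n^4 and g(n) = n^4 · log(4n), we have log(4n) ≥ log 4 > 1 for all n ≥ 1, so f(n) = 4n^4 ≤ 4 · n^4 · log(4n) = 4 · g(n). Hence c = 4 and n0 = 1 witness f(n) = O(g(n)).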
Are there functions f(n) and g(n) such that both
f(n) != O(g(n)) and
g(n) != O(f(n))?
f(n) = n and g(n) = n^(1 + sin(n)).
f(n) is not O(g(n)) and g(n) is not O(f(n)).
Refer to http://c2.com/cgi/wiki?BigOh
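A quick numerical illustration of that example (my own sketch; the sample points are chosen to land near the peaks and troughs of sin):

    import math

    # g(n)/f(n) = n**sin(n): near sin(n) ≈ +1 the ratio grows like n,
    # near sin(n) ≈ -1 it shrinks like 1/n, so neither function bounds
    # the other by a constant factor.
    for k in range(1, 6):
        hi = round(2 * math.pi * k + math.pi / 2)  # sin(hi) ≈ +1
        lo = round(2 * math.pi * k - math.pi / 2)  # sin(lo) ≈ -1
        print(hi, hi ** math.sin(hi), lo, lo ** math.sin(lo))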
Consider:
f(n) = 0 if n is odd, else n*n
g(n) = n
Then for odd values g(n) exceeds every constant multiple of f(n) (and so g(n) is not O(f(n))), while for even values f(n) eventually exceeds every constant multiple of g(n) (and so f(n) is not O(g(n))).
Observe that f(n) does not have a limit as n approaches infinity, so in some sense this is a cheap example. But you could fix that by replacing 0, n, n*n with n, n*n, n*n*n.
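A small runnable sketch of this example (the function names are mine):

    def f(n):
        # 0 on odd inputs, n*n on even inputs
        return 0 if n % 2 == 1 else n * n

    def g(n):
        return n

    # On odd n, g(n) exceeds every constant multiple of f(n) = 0;
    # on even n, f(n)/g(n) = n grows without bound.
    for n in range(10, 16):
        print(n, f(n), g(n))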
I think if two non-negative functions have the property that f(n)/g(n) has a (perhaps infinite) limit as n approaches infinity, then it follows that one of them is big-O of the other. If the limit is 0 then f(n) is O(g(n)); if the limit is finite and nonzero then each is big-O of the other; and if the limit is infinite then g(n) is O(f(n)). But I'm too lazy to confirm by writing a proof.
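The skipped proof is short (a sketch): if f(n)/g(n) → L with L finite, then for all sufficiently large n the ratio is at most L + 1, i.e. f(n) ≤ (L + 1) · g(n), so f(n) is O(g(n)). If the limit is infinite, then eventually f(n)/g(n) ≥ 1, i.e. g(n) ≤ f(n), so g(n) is O(f(n)). When the limit L is finite and nonzero, the ratio eventually stays between L/2 and L + 1, so each function is big-O of the other.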
I am wondering if the following are true.
If f(n) is O(g(n)) and f(n) is also Ω(g(n)), does that mean
f(n) is also Θ(g(n))? Also, if either of the two above is false, is f(n) not Θ(g(n))?
If f(n) is O(g(n)) and f(n) is also Ω(g(n)), does that mean f(n) is also Θ(g(n))?
Yes. That is the definition of big theta.
Also, if either of the two above is false, is f(n) not Θ(g(n))?
Yes. The implication goes both ways: f(n) ∈ Θ(g(n)) if and only if f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n)).
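Spelling out why the two one-sided bounds combine (a sketch with the constants named): the O bound gives f(n) ≤ c2 · g(n) for all n ≥ n1, and the Ω bound gives f(n) ≥ c1 · g(n) for all n ≥ n2. Both hold simultaneously for all n ≥ max(n1, n2), which is exactly the Θ condition c1 · g(n) ≤ f(n) ≤ c2 · g(n). Conversely, the Θ condition contains both one-sided bounds, so dropping either side recovers O or Ω.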
If we know f1(n) is Θ(g(n)) and f2(n) is Θ(g(n)), does that mean f1(n) + f2(n) is Θ(g(n))? Why?
Because f1(n) is roughly c1*g(n) and f2(n) is roughly c2*g(n), f1(n) + f2(n) is roughly (c1 + c2)*g(n), so any linear combination with non-negative coefficients preserves that relationship.
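To make the "roughly" precise (a sketch with explicit constants): the two Θ assumptions give a · g(n) ≤ f1(n) ≤ b · g(n) and c · g(n) ≤ f2(n) ≤ d · g(n) for all sufficiently large n. Adding the two chains of inequalities gives (a + c) · g(n) ≤ f1(n) + f2(n) ≤ (b + d) · g(n), which is exactly the Θ(g(n)) condition for f1(n) + f2(n).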
Since big-omega is a lower bound, big-O is an upper bound, and big-theta is both an upper and a lower bound, it stands to reason that the answer is yes.