Both little o and Big Omega on the same functions?

I have two functions, f(n) and g(n), such that f(n) = o(g(n)).
To be clear, I'm talking about little o.
Given only that information, is it possible that f(n) = Omega(g(n))?
It sounds impossible to me, since the little-o definition says that
for every c > 0, f(n) < c * g(n) for all sufficiently large n.
Thanks!

Let's assume that both f and g are strictly positive.
f(n) = o(g(n)) means f(n)/g(n) -> 0 as n tends to infinity.
f(n) = Ω(g(n)) means (assuming the Knuth definition of Ω) that g(n) = O(f(n)), i.e. there is a c > 0 such that g(n) <= c*f(n) for all large enough n. But then, for large enough n, f(n)/g(n) >= 1/c > 0. So f(n)/g(n) cannot tend to 0 as n tends to infinity, which means it is impossible to have both f(n) = Ω(g(n)) and f(n) = o(g(n)).
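As a quick numerical illustration of that ratio argument (not a proof), here is a minimal Python sketch using a pair of my own choosing, f(n) = n and g(n) = n^2, for which f = o(g): the ratio f(n)/g(n) shrinks toward 0, so it can never stay above a fixed 1/c as an Ω bound would require.

    # Example functions chosen for illustration: f(n) = n is o(n^2).
    def f(n):
        return n

    def g(n):
        return n * n

    # The ratio tends to 0, so no constant 1/c > 0 can bound it from below.
    for n in [10, 100, 1000, 10_000, 100_000]:
        print(n, f(n) / g(n))   # 0.1, 0.01, 0.001, ...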

No, it is not. A big-O bound sometimes coincides with an Omega(g(n)) bound (that is the Θ case), but not always, and a little-o bound never does.

Related

Given that f(n) = 10000000n and g(n) = n^2, why is f(n) O(g(n))?

I'm trying to solve the following problem but am unsure about the explanation given in the solution. f2(n) appears to be O(n) and f4(n) appears to be O(n^2). Why, then, is f2(n) = O(f4(n))?
Your current statement is not true. A counterexample is f2(n) = n = O(n) and f4(n) = 1 = O(n^2); here f2(n) is not O(f4(n)).
However, since, as the answer mentions, f4(n) is quadratic and f2(n) is linear, the definition of the big-O symbol does give f2(n) = O(f4(n)).
Read the wikipedia page about Big O notation and you will understand it better.
Informally, a description of a function in terms of big O notation usually only provides an upper bound on the growth rate of the function.
Given that f(n) = 10000000n and g(n) = n², saying that f(n) is O(g(n)) means there exist c > 0 and n0 such that f(n) ≤ cg(n) whenever n ≥ n0. Here c = 1 and n0 = 10000000 work, since 10000000n ≤ n² as soon as n ≥ 10000000.
If f2 is O(f4), it means that f2 grows asymptotically no faster than f4.
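A minimal sketch of checking those witnesses numerically (c = 1 comes from the answer above; n0 = 10000000 and the sampled points are my own additions):

    # f(n) = 10000000*n, g(n) = n^2; check f(n) <= c*g(n) at a few n >= n0.
    c, n0 = 1, 10_000_000

    def f(n):
        return 10_000_000 * n

    def g(n):
        return n ** 2

    for n in [n0, 2 * n0, 10 * n0]:
        assert f(n) <= c * g(n)   # 10000000*n <= n^2 holds once n >= 10000000
    print("f(n) <= c*g(n) held at every sampled n >= n0")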

How to prove the complexity of a logarithmic function?

Let's say you were given two logarithmic functions like
and you were asked to find whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)), how would you go about it? I found questions like these easier when comparing two polynomial functions: for example, with x(n) = n^2 and p(n) = n^2 you could find a c > 0 (e.g. 3) where x(n) <= c*p(n) for all n greater than some n0 > 0, and that would prove x(n) = O(p(n)). However, comparing two logarithmic functions seems much more difficult for some reason. Any help is appreciated, thanks!
f(n) is O(g(n)) iff there is a constant c and n_0 such that f(n) <= c * g(n) for each n >= n_0.
f(n) is Ω(g(n)) iff there is a constant c and n_0 such that f(n) >= c * g(n) for each n >= n_0.
Now, f(n) is Θ(g(n)) iff f(n) is O(g(n)) and f(n) is Ω(g(n)).
So, in your cases, we have:
f(n) = log (n^2) = 2logn
which means g(n) is log n and c = 2: we have both f(n) <= 2 * log n and f(n) >= 2 * log n, so f(n) is O(log n) and Ω(log n) at the same time, i.e. Θ(log n).
Btw, f(n) <= n and f(n) >= 1 also hold (for large enough n), so f(n) is O(n) too, but we don't use that bound, since we can find a tighter O(g(n)). With g(n) = n the upper bound has no matching lower bound, so that choice of g does not give Ω(n); however, we only need one suitable g(n) to declare Ω, and when we cannot find any, we say it is not Ω.
In the second case, we care only about the fastest-growing term, the log n part. Now c = 1 and g(n) = log n, so that function is also Ω(log n).
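As a small sanity check of the first case (a sketch only; the second function from the question was not reproduced above, so only f(n) = log(n^2) is checked here):

    import math

    # f(n) = log(n^2) = 2*log(n): the ratio f(n)/log(n) is exactly 2,
    # so c1 = c2 = 2 witness both O(log n) and Omega(log n), i.e. Theta(log n).
    def f(n):
        return math.log(n ** 2)

    for n in [10, 1000, 10 ** 6]:
        assert abs(f(n) / math.log(n) - 2) < 1e-9
    print("f(n)/log(n) == 2 at every sampled n")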

When to use Big O instead of theta or little o

A question about asymptotic notation. I've seen a lot of explanations of asymptotic notation say:
θ(...) is analogous to =
O(...) is analogous to <=
o(...) is analogous to <
Which would seem to imply that if f(n) = O(g(n)), then either f(n) = θ(g(n)) or f(n) = o(g(n)).
Is it possible to have f(n) = O(g(n)) such that neither f(n) = θ(g(n)) nor f(n) = o(g(n))? If so, what is an example of this? And if not, then why would we ever use O(...) when θ(...) or o(...) are stronger descriptors?
Let f(n) = k!, where k is the largest integer such that k! <= n (so k! <= n < (k+1)!).
Then f(n) is not Θ(n), since at n = (k+1)! - 1 we have f(n)/n = k!/((k+1)! - 1), which tends to 0, so no positive constant lower-bounds f(n)/n. It is not o(n) either, since f(k!) = k!, so f(n)/n = 1 infinitely often. But clearly f(n) = O(n), as f(n) <= n.
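Here is a small sketch of that oscillation (my own illustration of the construction above), sampling f(n)/n at the two interesting points in each block, n = k! and n = (k+1)! - 1:

    import math

    def f(n):
        # k! for the largest k with k! <= n
        k = 1
        while math.factorial(k + 1) <= n:
            k += 1
        return math.factorial(k)

    for k in range(2, 8):
        lo, hi = math.factorial(k), math.factorial(k + 1) - 1
        # First ratio is exactly 1 (so f is not o(n)); the second shrinks
        # toward 0 (so f is not Omega(n), hence not Theta(n)).
        print(k, f(lo) / lo, f(hi) / hi)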

Complexity from a recent exam that confused people

Do you think the following information is true?
If Θ(f(n)) = Θ(g(n)) AND g(n) > 0 everywhere THEN f(n)/g(n) ∈ Θ(1)
We are having a bit of an argument with our prof.
f(n) = Θ(g(n)) means there's c, d, n0 such that cg(n) <= f(n) <= dg(n) for n > n0.
Then, since g(n) > 0, c <= f(n)/g(n) <= d for n > n0.
So f(n)/g(n) = Θ(1).
Dividing functions f(n),g(n) is not the same as dividing their Big-O. For example let:
f(n) = n^3 + n^2 + n
g(n) = n^3
so:
f(n) = O(n^3)
g(n) = O(n^3)
but:
f(n)/g(n) = 1 + 1/n + 1/n^2 != constant !!!
[Edit1]
but as kfx pointed out, you are comparing complexities, so you want:
O(f(n)/g(n)) = O(1 + 1/n + 1/n^2) = O(1)
So the answer is Yes.
But beware: complexity theory is not really my cup of tea, and I also do not have any context for your question.
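A small numerical sketch of that example (my own check): the ratio is not literally constant, but it stays squeezed between two constants, which is all Θ(1) asks for.

    # f(n)/g(n) = 1 + 1/n + 1/n^2 is not constant, yet it lies in (1, 3]
    # for every n >= 1 and tends to 1, so it is Theta(1).
    def f(n):
        return n ** 3 + n ** 2 + n

    def g(n):
        return n ** 3

    for n in [1, 10, 100, 1000, 10 ** 6]:
        r = f(n) / g(n)
        assert 1 < r <= 3
        print(n, r)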
Using the definitions of Landau notation (https://en.wikipedia.org/wiki/Big_O_notation), it's easy to conclude that this is true: the ratio f(n)/g(n) must eventually stay below some finite constant and above some positive constant.
It does not have to be exactly 1, but it has to be bounded between positive constants, which is exactly Θ(1).
A counterexample would be nice, and should be easy to give if the statement weren't true. A rigorous positive proof would probably need to start from the definition of the limit of a sequence and establish the equivalence of the formal and limit-based definitions.
I use this definition and haven't seen it proven wrong. I suppose the disagreement might lie in the exact definition of Θ; people use these notations colloquially with minor differences, especially big O. Or maybe in some tricky cases. For positive functions and sequences, I don't think it fails.
Basically, for any pair of functions f, g whose ratio f(n)/g(n) has a limit (possibly infinite), there are three options: either the first grows asymptotically slower and we write f=o(g) (notice I'm using small o), or the first grows asymptotically faster: f=ω(g) (again, small omega), or they are asymptotically tightly bound: f=Θ(g).
What f=o(g) means is stricter than big O in that it doesn't allow f=Θ(g) to be true; f=Θ(g) implies both f=O(g) and f=Ω(g), but o, Θ and ω are mutually exclusive.
To find out whether f=o(g), it's sufficient to evaluate the limit of f(n)/g(n) as n goes to infinity: if it is zero, f=o(g) is true; if it is infinity, f=ω(g) is true; and if it is any finite positive number, f=Θ(g) is your answer. This is not a definition, but merely a way to evaluate a statement. (One assumption I made here was that both f and g are positive.)
A special case: if the limit of f(n)/1 = f(n) as n goes to infinity is a finite positive number, then f(n)=Θ(1) (basically we chose the constant function for g).
Now we're getting to your problem: since f=Θ(g) implies f=O(g), we know there exist c > 0 and n0 such that f(n) <= c*g(n) for all n > n0. Thus f(n)/g(n) <= (c*g(n))/g(n) = c for all n > n0. The same can be done for Ω, just with the opposite inequality sign. So f(n)/g(n) lies between constants c1 and c2 from some n0 on, and those are finite positive numbers by the way Θ is defined. Because our new function f(n)/g(n) is trapped between two positive constants, it is bounded above and bounded away from zero, which is exactly what f(n)/g(n) = Θ(1) means.
Conclusion: I believe you were right, and I would like your professor to offer a counterexample to disprove the statement. If something didn't make sense, feel free to ask more in the comments and I'll try to clarify.
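If you want to experiment with that limit test, here is a minimal sketch (assuming SymPy is available and that the ratio actually has a limit; the helper name classify is my own):

    import sympy as sp

    n = sp.symbols("n", positive=True)

    def classify(f, g):
        # Classify f vs g by the limit of f/g as n -> infinity (assumes the limit exists).
        L = sp.limit(f / g, n, sp.oo)
        if L == 0:
            return "f = o(g)"
        if L == sp.oo:
            return "f = omega(g)"
        return "f = Theta(g)"

    print(classify(n**2 + n, n**2))    # Theta
    print(classify(sp.log(n), n))      # o
    print(classify(n * sp.log(n), n))  # omega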

Why is f=O(g) if f(n) grows more slowly than g(n)?

If a function f(n) grows more slowly than a function g(n), why is f(n) = O(g(n))?
e.g. if f(n) is 4n^4 and g(n) is log((4n)^(n^4))
My book says f=O(g(n)) because g=n^4*log(4n)=n^4(logn + log4)=O(n^4*logn). I understand why g=O(n^4*logn), but I'm not sure how they reached the conclusion that f=O(g(n)) from big O of g.
I understand that f(n) grows more slowly than g(n) just by thinking about the graphs, but I'm having trouble understanding asymptotic behavior and why f=O(g(n)) in general.
The formal definition of big-O notation is that
f(n) = O(g(n)) if there are constants n0 and c such that for any n ≥ n0, we have f(n) ≤ c · g(n).
In other words, f(n) = O(g(n)) if for sufficiently large values of n, the value of f(n) is upper-bounded by some constant multiple of g(n).
Notice that this just says that f(n) is upper-bounded by g(n), not that f(n)'s rate of growth is the same as g(n)'s rate of growth. In that sense, you can think of f(n) = O(g(n)) as akin to saying something like "f ≤ g," that f doesn't grow faster than g, leaving open the possibility that g grows a lot faster than f does.
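To connect the definition to the book's example, here is a minimal numeric sketch; the witnesses c = 4 and n0 = 1 are my own choice (using natural logarithms):

    import math

    # f(n) = 4n^4, g(n) = log((4n)^(n^4)) = n^4 * log(4n)
    def f(n):
        return 4 * n ** 4

    def g(n):
        return n ** 4 * math.log(4 * n)

    c, n0 = 4, 1
    for n in [1, 10, 100, 1000]:
        assert f(n) <= c * g(n)   # 4n^4 <= 4*n^4*log(4n) once log(4n) >= 1, i.e. n >= 1
    print("f(n) <= 4*g(n) held at every sampled n >= 1")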
