When to use Big O instead of theta or little o

A question about asymptotic notation. I've seen a lot of explanations of asymptotic notation say:
Θ(...) is analogous to =
O(...) is analogous to <=
o(...) is analogous to <
That would seem to imply that if f(n) = O(g(n)), then either f(n) = Θ(g(n)) or f(n) = o(g(n)).
Is it possible to have f(n) = O(g(n)) such that neither f(n) = Θ(g(n)) nor f(n) = o(g(n)) holds? If so, what is an example of this? And if not, why would we ever use O(...) when Θ(...) or o(...) are stronger descriptors?

Let f(n) = k!, where k is the largest integer such that k! ≤ n.
Then f(n) is not Θ(n) (at n = (k+1)! − 1 the ratio f(n)/n = k!/((k+1)! − 1) tends to 0), nor is it o(n) (at n = k! we have f(n) = k!, so f(n)/n = 1 infinitely often), but clearly f(n) = O(n) (as f(n) ≤ n by construction).
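A quick numeric check makes the oscillation of f(n)/n visible; here is a minimal Python sketch (the helper f just transcribes the definition above):

```python
import math

def f(n):
    # k! for the largest k with k! <= n (the construction above)
    k = 1
    while math.factorial(k + 1) <= n:
        k += 1
    return math.factorial(k)

# f(n)/n is exactly 1 whenever n is a factorial, but drops to roughly
# 1/(k+1) just before the next factorial: f(n) <= n everywhere (so O(n)),
# yet the ratio neither stays above a fixed c > 0 (not Theta) nor tends
# to 0 (not little o).
for k in range(2, 8):
    n1 = math.factorial(k)           # ratio is 1 here
    n2 = math.factorial(k + 1) - 1   # ratio is ~ 1/(k+1) here
    print(f"k={k}: f/n at {n1} = {f(n1)/n1:.4f}, at {n2} = {f(n2)/n2:.4f}")
```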

Related

f(n) is in O(g(n)), could it have same growth as g(n)?

I know that f(n) grows slower than g(n), but could f(n) have the same growth rate as g(n), since there is an equality sign?
Based on the Big-O definition, yes. For example, n is in O(n) as well. In this case f(n) = n and g(n) = n are even equal, which is a stronger relation than merely having the same growth rate.

Algorithms - Both Little o and Big Omega on the same functions?

I have two functions, f(n) and g(n), such that f(n) = o(g(n)).
To be clear, I'm talking about little o.
Is it possible, with the information given, that f(n) = Omega(g(n))?
To me it sounds like it's not possible, since the little-o definition says that
for every c > 0, f(n) < c·g(n) for all sufficiently large n.
Thanks!
Let's assume that both f and g are strictly positive.
f(n) = o(g(n)) means f(n)/g(n) -> 0 as n tends to infinity.
f(n) = Ω(g(n)) means (assuming the Knuth definition of Ω) that g(n) = O(f(n)), i.e. there is a c > 0 such that, for large enough n, g(n) <= c·f(n). But then, for large enough n, f(n)/g(n) >= 1/c > 0, so f(n)/g(n) cannot tend to 0. Hence it is impossible that f(n) = Ω(g(n)) and f(n) = o(g(n)) hold at the same time.
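To make this concrete, take f(n) = n and g(n) = n², for which f(n) = o(g(n)); a small Python sketch (the functions are my own illustrative choices) shows the vanishing ratio defeating any candidate c:

```python
# f(n) = n is o(n^2): the ratio f/g tends to 0, so for ANY c > 0 the
# Omega-style inequality f(n) >= c*g(n) must eventually fail.
f = lambda n: n
g = lambda n: n * n

for n in (10, 100, 1000, 10_000):
    print(n, f(n) / g(n))                    # 0.1, 0.01, 0.001, 0.0001

c = 0.001                                    # fails as soon as n > 1/c
print(all(f(n) >= c * g(n) for n in range(1001, 1100)))   # False
```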
No, it is not possible. A big-O bound can coincide with an Ω bound for some functions (that is exactly when f(n) = Θ(g(n))), but a little-o bound rules Ω out, as shown above.

Clarification for Theta notation in complexity analysis. Θ(g)

When we talk about Θ(g), are we referring to the highest-order term of g(n) or to g(n) exactly as it is?
For example, if f(n) = n³ and g(n) = 1000n³ + n, does Θ(g) mean Θ(1000n³ + n) or Θ(n³)?
In this scenario, can we say that f(n) is Θ(g)?
Θ(g) yields a set of functions that are all of the same complexity class. Θ(1000n³ + n) is equal to Θ(n³) because both denote the same set.
For simplicity's sake one usually drops the non-significant terms and multiplicative constants. The lower-order additive terms don't change the complexity class, nor do any constant multipliers, so there's no reason to write them out.
Since Θ(g) is a set, you would say that f(n) ∈ Θ(g).
NB: Many CS teachers, textbooks, and other resources muddy the waters by using imprecise notation. Lots of people say that f(n) = n³ is O(n³), rather than that f(n) = n³ is in O(n³). They write = when they mean ∈.
Θ(g(n)) lies between O(g(n)) and Ω(g(n)).
If g(n) = 1000n³ + n:
First, let's find O(g(n)), an upper bound.
It could be O(n³), O(n⁴), O(n⁵), ... but we choose the tightest one, which is O(n³).
O(n³) is valid because we can find a constant c such that, for all n beyond some n₀,
1000n³ + n ≤ c·n³.
Second, let's find Ω(g(n)), a lower bound.
Ω(n³) requires a constant c > 0 such that, for all n beyond some n₀,
1000n³ + n ≥ c·n³,
and such a c clearly exists.
Now we have an upper bound of O(n³) and a lower bound of Ω(n³),
so we have a Θ bound, which bounds from above and below using the same function.
By the rule: if f(n) = O(g(n)) and f(n) = Ω(g(n)), then f(n) = Θ(g(n)). Therefore
1000n³ + n = Θ(n³).
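As a sanity check on the constants, one pair of witnesses that works (my choice for illustration; many others do too) is c = 1001 with n₀ = 1 for the upper bound and c = 1000 with n₀ = 1 for the lower bound:

```python
def g(n):
    return 1000 * n**3 + n

# Upper bound: 1000n^3 + n <= 1001*n^3 for all n >= 1 (since n <= n^3).
# Lower bound: 1000n^3 + n >= 1000*n^3 for all n >= 1.
# Together they witness g(n) = Theta(n^3).
assert all(g(n) <= 1001 * n**3 for n in range(1, 10_000))
assert all(g(n) >= 1000 * n**3 for n in range(1, 10_000))
print("c = 1001 (upper) and c = 1000 (lower) work with n0 = 1")
```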

Complexity analysis of logarithms

I have two functions, f(n) = log₂n and g(n) = log₁₀n. I am trying to decide whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)). I think I should take the limit of f(n)/g(n) as n goes to infinity, and I think that limit is a constant, so f(n) must be Θ(g(n)).
Am I right?
log₂n = log₁₀n / log₁₀2
So f(n) = g(n) / log₁₀2.
So f(n) and g(n) differ only by a constant factor (since log₁₀2 is a constant).
So, from the definitions of O(x), Ω(x) and Θ(x), I'd say:
f(n) ∈ O(g(n)),
f(n) ∈ Ω(g(n)),
f(n) ∈ Θ(g(n))
Yes, you are right. From a complexity point of view (at least a big-O point of view) it doesn't matter whether the base is 2 or 10:
f(n) is O(g(n)), f(n) is Ω(g(n)), and therefore f(n) is Θ(g(n)).
As the limit is a nonzero constant, you are right that f(n) ∈ Θ(g(n)). Also, of course, g(n) ∈ Θ(f(n)).
BTW: not only is the limit of the ratio a constant, but log₂n / log₁₀n is the same constant (log₂10) for every n > 1.
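To see this numerically, here is a minimal sketch using the standard library:

```python
import math

# log2(n) / log10(n) equals the constant log2(10) for every n > 1,
# so the two functions differ only by a constant factor.
for n in (10, 1_000, 10**6, 10**12):
    print(n, math.log2(n) / math.log10(n))   # always ~3.321928...
print(math.log2(10))                         # the same constant
```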

Determining Asymptotic Notation

I have a set of problems where I am given f(n) and g(n) and am supposed to determine whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)).
I must also determine the constant(s) c and n₀ for the correct relationship.
How do I get started on a problem like this?
Here's an example of the kind of problem I am given:
f(n) = lg(n²), g(n) = n·lg(n)
You need to reduce f(n) to a form that makes it easy to compare to g(n). For your case:
f(n) = lg(n²)
f(n) = 2·lg(n)
That should be enough to answer the problem for that example; the process is going to be pretty much the same for the rest of the set.
You can do this using limits, as follows.
Take the limit, as n tends to infinity, of f(n)/g(n).
If the value obtained is
a nonzero constant, then f(n) = Θ(g(n));
infinity, then f(n) = Ω(g(n)) (in fact ω(g(n)));
zero, then f(n) = O(g(n)) (in fact o(g(n))).
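Here is a rough numeric version of that limit test, applied to the earlier example f(n) = lg(n²), g(n) = n·lg(n). The sampling point and thresholds are arbitrary choices for illustration; a real limit needs analysis, not sampling:

```python
import math

def classify(f, g, n=10**9, tiny=1e-6, huge=1e6):
    # Crude stand-in for lim f(n)/g(n): sample the ratio at one large n.
    r = f(n) / g(n)
    if r < tiny:
        return "f = O(g)   (ratio -> 0, in fact f = o(g))"
    if r > huge:
        return "f = Omega(g)   (ratio -> infinity)"
    return "f = Theta(g)   (ratio -> nonzero constant)"

f = lambda n: math.log2(n * n)    # lg(n^2) = 2*lg(n)
g = lambda n: n * math.log2(n)    # n*lg(n)

print(classify(f, g))   # ratio = 2/n -> 0, so f = O(g) (indeed o(g))
```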
