Determining Asymptotic Notation - asymptotic-complexity

I have a set of problems where I am given an f(n) and g(n) and I am supposed to determine whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)).
I must also determine the constant(s) c and n0 for the correct relationship.
How do I get started on a problem like this?
Here's an example for the kind of problem I am given
f(n) = lg(n^2), g(n) = n lg(n)

You need to reduce f(n) to a form that makes it easy to compare to g(n). For your case:
f(n) = lg(n^2)
f(n) = 2 lg(n)
That should be enough to answer your problem for that example - the process is going to be pretty much the same for the rest of the set.
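For the c and n0 part (a sketch I'm adding, not part of the original answer): once you have f(n) = 2 lg(n), note that 2 lg(n) <= n lg(n) whenever n >= 2, so c = 1 and n0 = 2 witness f(n) = O(g(n)). A quick numeric spot-check in Python:

    import math

    def f(n):
        return math.log2(n ** 2)   # lg(n^2) = 2 lg(n)

    def g(n):
        return n * math.log2(n)    # n lg(n)

    # Claimed witnesses for f(n) = O(g(n)): c = 1, n0 = 2
    c, n0 = 1, 2
    assert all(f(n) <= c * g(n) for n in range(n0, 10_000))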

You can do this using limits, as follows.
Take the limit as n tends to infinity of f(n)/g(n) (sorry, I have no idea how to produce mathematical equations here).
If the value obtained is:
A finite nonzero constant, then f(n) = Θ(g(n))
Infinity, then f(n) = Ω(g(n))
Zero, then f(n) = O(g(n))
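If you want to check a limit mechanically (my own sketch using sympy, not part of the answer above), the example from the original question gives a limit of 0, i.e. f(n) = O(g(n)):

    import sympy

    n = sympy.symbols('n', positive=True)
    f = sympy.log(n ** 2, 2)   # lg(n^2) = 2 lg(n)
    g = n * sympy.log(n, 2)    # n lg(n)

    # f/g simplifies to 2/n, so the limit as n -> infinity is 0
    print(sympy.limit(f / g, n, sympy.oo))   # prints: 0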

Related

f(n) is in O(g(n)), could it have same growth as g(n)?

I know that f(n) grows slower than g(n), but could f(n) have the same growth rate as g(n), since there is an equality sign?
Based on the Big-O definition, yes. For example, n is in O(n) as well. In this case, f(n) = n and g(n) = n are even equal, a stronger relation than having the same growth rate.

When to use Big O instead of theta or little o

A question about asymptotic notation. I've seen a lot of explanations of asymptotic notation say:
Θ(...) is analogous to =
O(...) is analogous to <=
o(...) is analogous to <
Which would seem to imply that if f(n) = O(g(n)), then either f(n) = Θ(g(n)) or f(n) = o(g(n)).
Is it possible to have f(n) = O(g(n)) such that neither f(n) = Θ(g(n)) nor f(n) = o(g(n))? If so, what is an example of this? And if not, then why would we ever use O(...) when Θ(...) or o(...) are stronger descriptors?
Let f(n) = k!, where k is the largest integer such that k! <= n.
Then f(n) is not Θ(n) (since f((k+1)! - 1)/((k+1)! - 1) is roughly 1/(k+1), which tends to zero), nor is it o(n) (since f(k!) = k!), but clearly f(n) = O(n) (as f(n) <= n).
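To see this behaviour numerically (an illustrative sketch of the construction above, not part of the original answer): f(n) equals n exactly at the factorial points, but the ratio f(n)/n gets arbitrarily small just below the next factorial, so no single Θ bound can hold:

    import math

    def f(n):
        # largest k! with k! <= n
        k = 1
        while math.factorial(k + 1) <= n:
            k += 1
        return math.factorial(k)

    for k in range(2, 8):
        at_fact = math.factorial(k)             # n = k!: ratio is exactly 1
        below_next = math.factorial(k + 1) - 1  # n = (k+1)! - 1: ratio ~ 1/(k+1)
        print(k, f(at_fact) / at_fact, f(below_next) / below_next)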

Clarification for Theta notation in complexity analysis. Θ(g)

When we talk about Θ(g), are we referring to the highest-order term of g(n), or to g(n) exactly as it is?
For example, if f(n) = n^3 and g(n) = 1000n^3 + n, does Θ(g) mean Θ(1000n^3 + n) or Θ(n^3)?
In this scenario, can we say that f(n) is Θ(g)?
Θ(g) yields a set of functions that are all of the same complexity class. Θ(1000n^3 + n) is equal to Θ(n^3) because both of these result in the same set.
For simplicity's sake one will usually drop the non-significant terms and multiplicative constants. The lower-order additive terms don't change the complexity, nor do any multipliers, so there's no reason to write them out.
Since Θ(g) is a set, you would say that f(n) ∈ Θ(g).
NB: Many CS teachers, textbooks, and other resources muddy the waters by using imprecise notation. Lots of people say that f(n) = n^3 is O(n^3), rather than f(n) = n^3 is in O(n^3). They use = when they mean ∈.
Θ(g(n)) lies between O(g(n)) and Ω(g(n)).
If g(n) = 1000n^3 + n:
First, let's find O(g(n)), the upper bound.
It could be n^3, n^4, or n^5, but we choose the tightest one, which is O(n^3).
O(n^3) is valid because we can find a constant c such that, for all sufficiently large n,
1000n^3 + n <= c*n^3
Second, let's look at Ω(g(n)), which is the lower bound.
Ω says f(n) >= c*g(n), so we can find a constant c such that
1000n^3 + n >= c*n^3
Now we have an upper bound, O(n^3), and a lower bound, Ω(n^3).
Therefore we have a Θ bound, since the same function bounds g(n) both above and below.
By rule: if f(n) = O(g(n)) and f(n) = Ω(g(n)), then f(n) = Θ(g(n)).
1000n^3 + n = Θ(n^3)
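Concrete constants make this explicit (my own sketch, not from the answer above): 1000n^3 <= 1000n^3 + n <= 1001n^3 for all n >= 1, so c1 = 1000, c2 = 1001, and n0 = 1 witness the Θ bound. A quick check:

    def g(n):
        return 1000 * n ** 3 + n

    # Theta witnesses: c1 * n^3 <= g(n) <= c2 * n^3 for all n >= n0
    c1, c2, n0 = 1000, 1001, 1
    assert all(c1 * n ** 3 <= g(n) <= c2 * n ** 3 for n in range(n0, 10_000))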

Complexity analysis of logarithms

I have two functions, f(n) = log_2(n) and g(n) = log_10(n). I am trying to decide whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)). I think I should take the limit of f(n)/g(n) as n goes to infinity, and I think that limit is a constant, so f(n) must be Θ(n).
Am I right?
log_2(n) = log_10(n) / log_10(2) (by the change-of-base identity)
So f(n) = g(n) / log_10(2)
So f(n) and g(n) only differ by a constant factor (since log_10(2) is a constant).
So, from the definitions of O(x), Ω(x) and Θ(x), I'd say:
f(n) ∈ O(g(n)),
f(n) ∈ Ω(g(n)),
f(n) ∈ Θ(g(n))
Yes, you are right. From a complexity point of view (at least a big-O point of view), it doesn't matter whether it is log_2 or log_10.
f(n) is both O(g(n)) and Ω(g(n)), so f(n) is Θ(g(n)).
As the limit is constant, you are right that f(n) ∈ Θ(g(n)) (assuming you have a typo in the question). Also of course g(n) ∈ Θ(f(n)).
BTW: Not only is the limit of the ratio a constant, but log_2(n)/log_10(n) is itself always the same constant (log_2(10)).
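A quick numeric illustration (my addition, not part of the answers above): the ratio is log_2(10) ≈ 3.32 for every n > 1:

    import math

    for n in (10, 1000, 10**6, 10**12):
        print(math.log2(n) / math.log10(n))   # always 3.3219... == log_2(10)

    print(math.log2(10))                      # 3.3219...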

Growth functions of Algorithm?

Well, I have two questions here:
If f(n) is a function whose growth rate is to be found, will g(n) be the same for all three notations, i.e. the same g(n) in f(n) = O(g(n)) and similarly for Ω and Θ?
Theta notation is "Omega and Oh". If in some case the O and Ω functions are different, how will we find the Θ function there?
Thanks :)
O, Θ and Ω notation represent related but very different concepts. O-notation expresses an asymptotic upper bound on the growth rate of a function; it says that the function is eventually bounded from above by some constant multiple of some other function. Ω notation is similar, but gives a lower bound. Θ notation gives an asymptotic tight bound - for sufficiently large inputs, the algorithm grows at a rate that is bounded from both above and below by a constant multiple of a function.
If f(n) = O(g(n)), it is not necessarily true that f(n) = Ω(g(n)) or that f(n) = Θ(g(n)). For example, 1 = O(n), but 1 ≠ Ω(n) because n grows strictly faster than 1.
If you find that f(n) = O(g(n)) and Ω(h(n)), where g(n) ≠ h(n), you may have to do a more precise analysis to determine a function j(n) such that f(n) = Θ(j(n)). If g(n) = Θ(h(n)), then you can conclude that f(n) = Θ(g(n)), but if the upper and lower bounds are different there is no mechanical way to determine the Θ growth rate of the function.
Hope this helps!
f(n)=O(g(n)) means that n>N => |f(n)|≤C|g(n)| for some constants N and C.
f(n)=Ω(g(n)) means that n>N => |f(n)|≥C|g(n)| for some constants N and C.
f(n)=Θ(g(n)) means that f(n)=O(g(n)) and f(n)=Ω(g(n)).
It is not possible for all f to find a g such that f(n)=Θ(g(n)) if we want g to be a "good" function (i.e. something like n^r*Log(n)^s). For instance, if f(n)=cos(n)²*n+sin(n)²*n², we have f(n)=O(n²) and f(n)=Ω(n) but we can't find a "good" g such that f(n)=Θ(g(n)).
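To see why no clean Θ bound exists for that last example (an illustrative sketch, not part of the original answer): f(n) hugs n when sin(n)^2 is near 0 and hugs n^2 when cos(n)^2 is near 0, so the ratios f(n)/n and f(n)/n^2 keep oscillating:

    import math

    def f(n):
        return math.cos(n) ** 2 * n + math.sin(n) ** 2 * n ** 2

    # f is squeezed between n and n^2: f = O(n^2) and f = Omega(n),
    # but neither ratio settles, so no single Theta(g) with a "good" g fits.
    for n in range(1, 20):
        print(n, round(f(n) / n, 3), round(f(n) / n ** 2, 3))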
