Checking big theta, little oh and little omega with limits?

Say we have two functions f(n) and g(n). If we wanted to check whether f(n) is o(g(n)) (little-oh of g(n)), would it be valid to do the following:
compute lim n->∞ f(n)/g(n), where the result would have to equal 0?
So if the above comes out to 0, does it mean f(n) is o(g(n))? And how can we check big theta and little omega with limits?

Yes.
o(g(n)) = { f(n) : for all constants c > 0, there exists a constant n0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }.
Equivalently: lim n->∞ f(n)/g(n) = 0.
The other limit tests work the same way: f(n) ∈ ω(g(n)) iff lim n->∞ f(n)/g(n) = ∞, and if the limit is a constant c with 0 < c < ∞, then f(n) ∈ Θ(g(n)). (When the limit does not exist, these tests are inconclusive and you must argue from the definitions.)
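As a quick sanity check (not a proof), you can have a CAS evaluate the limit for you. Here is a minimal Python sketch using sympy, assuming the three limit rules above; the helper name classify and the example functions are my own illustrations:

    # Heuristic classification via L = lim n->oo f/g (sympy evaluates the limit).
    from sympy import symbols, limit, oo

    n = symbols('n', positive=True)

    def classify(f, g):
        """Apply the limit tests: 0 -> little-oh, oo -> little-omega, finite positive -> Theta."""
        L = limit(f / g, n, oo)
        if L == 0:
            return "f is o(g)"       # f grows strictly slower than g
        if L == oo:
            return "f is omega(g)"   # f grows strictly faster than g
        if L.is_positive and L.is_finite:
            return "f is Theta(g)"   # same growth rate up to constant factors
        return "inconclusive (limit may not exist)"

    print(classify(n**2, n**3))         # f is o(g)
    print(classify(n**3, n**2))         # f is omega(g)
    print(classify(3*n**2 + n, n**2))   # f is Theta(g)

Remember the caveat: for oscillating functions the limit may not exist, and then the tests say nothing.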

If f(n) = Θ(g(n)), does that mean f(n) is asymptotically equal to g(n)?

I'm fairly certain that if f(n) = Θ(g(n)) is true, then f(n) is asymptotically equal to g(n). However, I'm concerned I might be overlooking something. Am I correct in thinking that f(n) = Θ(g(n)) implies f(n) is asymptotically equal to g(n)?
I'm trying to compare different algorithms with respective runtimes of f(n) and g(n) and prove that f(n) = Θ(g(n)), but I'm not sure whether I'm on the right track.
A. f(n) = log(n^100), g(n) = log(n^2)
lim n->∞ f(n)/g(n) = lim n->∞ log(n^100)/log(n^2) = lim n->∞ (100 log n)/(2 log n) = 50
Since the result is a finite positive constant, we conclude that f(n) ∈ Θ(g(n)), hence f(n) = Θ(g(n)).
B. f(n) = sqrt(n), g(n) = log(n)
lim n->∞ f(n)/g(n) = lim n->∞ sqrt(n)/log(n) = ∞ (both functions are positive, so the limit is +∞), hence f(n) ≠ Θ(g(n)).
C. f(n) = 3^n, g(n) = 5^n
lim n->∞ f(n)/g(n) = lim n->∞ 3^n/5^n = lim n->∞ (3/5)^n = 0, hence f(n) ≠ Θ(g(n)).
D. f(n) = sin(n)+3, g(n) = cos(n)+1
lim n->∞ f(n)/g(n) = lim n->∞ (sin(n)+3)/(cos(n)+1) does not exist: the ratio oscillates and is even unbounded, since cos(n)+1 comes arbitrarily close to 0 while sin(n)+3 ≥ 2. Hence f(n) ≠ Θ(g(n)).
Please tell me, am I on the right track?
Am I correct in thinking that if f(n) = Θ(g(n)) then f(n) is asymptotically equal to g(n)?
No, asymptotic equality is a stronger claim than an asymptotic (Θ) bound. The converse is true: when f(n) is asymptotically equal to g(n), then f(n) is Θ(g(n)).
As defined on Wikipedia - Asymptotic Analysis:
The functions f and g are said to be asymptotically equivalent if and only if
lim n->∞ f(n)/g(n) = 1.
For the first example, where f(n) = log(n^100) and g(n) = log(n^2), this equivalence does not hold:
log(n^100) / log(n^2) = (100 log n) / (2 log n) = 50, so the limit is 50 rather than 1, and f(n) and g(n) are not asymptotically equal (even though f(n) = Θ(g(n))).
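As a concrete check (my own sketch, not part of the original answer), sympy should confirm that the limit for this example is 50 rather than 1:

    # Asymptotic equivalence requires lim f/g = 1; here the limit is 50.
    from sympy import symbols, log, limit, oo

    n = symbols('n', positive=True)
    L = limit(log(n**100) / log(n**2), n, oo)
    print(L)       # 50 -> f(n) = Theta(g(n)), but f and g are NOT asymptotically equal
    print(L == 1)  # False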
See also Wikipedia - Family of Bachmann–Landau notations

Big O in Algorithms

I was reading an article and came across the following :
Informally, O(g(n)) can be defined as the set of mathematical functions that contains all functions that don’t “grow faster” than g(n). Thus, all functions below are in the set O(n²):
f(n) = n² + 3n + 2, f(n) = n log(n), f(n) = 3n + 1.
Can anyone please tell me how f(n) = n² + 3n + 2 grows faster than g(n)?
Can anyone please tell me how f(n) = n² + 3n + 2 grows faster than g(n)?
Here is one way to understand it (a bit informal, but I find it more intuitive).
Let L be the limit, as n goes to infinity, of f(n)/g(n).
If L is infinity, then f(n) grows faster than g(n) (the numerator overwhelms the denominator).
If L is 0, then f(n) grows slower than g(n) (the denominator overwhelms the numerator).
If L is a finite positive number, then they have the same (comparable) growth rate; the sketch below illustrates all three cases.
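Here is that sketch: a small numeric sampler (illustrative only; floating-point samples suggest the trend of L, they do not prove it), applied to ratios like the ones in the question above:

    # Sample f(n)/g(n) at growing n to see which way the ratio trends.
    import math

    examples = {
        "sqrt(n) / log(n)":  lambda n: math.sqrt(n) / math.log(n),  # trends to infinity
        "3^n / 5^n":         lambda n: (3 / 5) ** n,                # trends to 0
        "(n^2+3n+2) / n^2":  lambda n: (n**2 + 3*n + 2) / n**2,     # trends to 1 (finite)
    }

    for name, ratio in examples.items():
        print(name, [round(ratio(n), 4) for n in (10, 1_000, 100_000)])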
We can define O(g(n)) as the following set:
O(g(n)) = { f(n) : ∃ c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ c·g(n), ∀ n ≥ n0 }
This means O(g(n)) is the set of all functions f(n) that, from some point n0 on, are bounded above by c·g(n) for some constant c. To find n0 and c we use a justification like the following:
n² + 3n + 2 ≤ n² + 3n² + 2n²  (valid for n ≥ 1, since then 3n ≤ 3n² and 2 ≤ 2n²)
n² + 3n + 2 ≤ 6n²  for c = 6 and n ≥ 1
Now if you compare f(n) = n² + 3n + 2 against g(n) = n² directly, f(n) is of course the larger of the two for every n; but by choosing the value of c appropriately, c·g(n) bounds f(n) from above for all n ≥ n0.
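To sanity-check a chosen pair (c, n0) mechanically, here is a tiny sketch (the values c = 6 and n0 = 1 come from the derivation above; the sampled range is arbitrary):

    # Verify n^2 + 3n + 2 <= 6*n^2 for every sampled n >= 1.
    c, n0 = 6, 1
    assert all(n**2 + 3*n + 2 <= c * n**2 for n in range(n0, 10_000)), "bound fails"
    print("n^2 + 3n + 2 <= %d*n^2 holds for all sampled n >= %d" % (c, n0))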

How to prove/disprove whether (2^{n})^{1/3} is in Θ(2^{n}) using Big-O, Big-Omega and Big-Theta

So I understand Big-O, Big-Omega and Big-Theta conceptually, but I'm not sure how to prove Big-Omega and Big-Theta.
A function f is in O(g) if and only if there exist constants c > 0 and n_0 ≥ 1 such that f(n) ≤ c·g(n) for all n ≥ n_0.
Big-Omega is the opposite direction: c·g(n) ≤ f(n).
Big-Theta sandwiches f between the two: c1·g(n) ≤ f(n) ≤ c2·g(n).
I need to prove/disprove if (2^{n})^{1/3} ∈ Θ(2^{n}) by using all three notations.
What I have so far:
Big-O : (2^{n})^{1/3} ≤ c·2^{n} when c=1 and n_0 = 1, so (2^{n})^{1/3} ∈ O(2^{n})
Big-Omega : We can rewrite (2^{n})^{1/3} = (1/(2^{2n/3}))·2^n. For c·g(n) ≤ f(n) to hold, c would have to satisfy c ≤ 1/(2^{2n/3}) for all n ≥ n0, which is impossible: the right-hand side tends to 0, so no fixed constant c > 0 stays below it. Hence (2^{n})^{1/3} ∉ Ω(2^{n})
Big-Theta : Since (2^{n})^{1/3} ∉ Ω(2^{n}), the lower-bound half c1·g(n) ≤ f(n) of the sandwich fails. Therefore, (2^{n})^{1/3} ∉ Θ(2^{n})
Is this how you are supposed to prove it?
First simplify f(n) = (2^n)^(1/3) to f(n) = 2^(n/3). Then take the limit of f(n)/g(n) with g(n) = 2^n:
lim n->∞ 2^(n/3) / 2^n = lim n->∞ 1 / 2^(2n/3) = 0
Hence f(n) = o(g(n)) (little-oh), which means f(n) is not in Θ(g(n)). Notice that f(n) = O(g(n)) (Big-Oh) as well, since any function in o(g(n)) is also in O(g(n)).
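The limit itself can be checked mechanically; a one-liner sketch with sympy (illustrative only):

    # Confirm lim n->oo 2^(n/3) / 2^n = 0, i.e. (2^n)^(1/3) is o(2^n).
    from sympy import symbols, limit, oo

    n = symbols('n', positive=True)
    print(limit(2**(n/3) / 2**n, n, oo))  # 0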

Asymptotic bounds and Big Θ notation

Suppose that f(n) = 4^n and g(n) = n^n. Would it be right to conclude that f(n) = Θ(g(n))?
In my opinion it's a correct claim but I'm not 100% sure.
It is incorrect. f(n) = Θ(g(n)) if and only if both f(n) = O(g(n)) and g(n) = O(f(n)). It is true that f(n) = O(g(n)); we will show that g(n) = O(f(n)) does not hold.
Assume g(n) = O(f(n)). Then there exist a positive real constant c and a positive natural number n0 such that g(n) <= c * f(n) for all n > n0. For our functions, this implies n^n <= c * 4^n. Taking the nth root of both sides gives n <= 4*c^(1/n). We are free to assume c >= 1 and n0 >= 1, since if a smaller value worked, a larger value would work too. For all c >= 1 and n >= 1, c^(1/n) <= c, so 4*c^(1/n) <= 4c. But then choosing n > 4c makes the inequality n <= 4*c^(1/n) false. So there cannot be an n0 such that the condition holds for all n at least n0. This contradiction disproves our initial assumption.
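To make the contradiction concrete, note that the c required at each n is n^n / 4^n = (n/4)^n, which grows without bound. A small sketch (my own illustration):

    # The constant c would need c >= (n/4)^n for all n >= n0 -- it grows without bound.
    for n in (4, 8, 16, 32):
        print(n, (n / 4) ** n)  # required c: 1.0, 256.0, ~4.29e9, ~7.92e28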

Big O notation constants

Given the following functions f(n) and g(n), is f in O(g(n)) or is f in Θ(g(n)),
or both? If true, specify a constant c and a point n0; if false, briefly explain why.
(a) f(n) = 2n, g(n) = 2^2n
(b) f(n) = n!, g(n) = 2^n
I do understand that for (a), f(n) = O(g(n)) because g(n) upper-bounds f(n),
and that for (b), g(n) = O(f(n)) because of the dominance relation, given that n! > 2^n for large n.
I have done some research but could not find much on how to calculate the constants c and n0 for this type of question. Thanks for the reply :)
a) f(n) = 2n , g(n) = 2^(2n)
(I added parentheses.)
f(n) = O(g(n)) iff | f(n) | <= C * | g(n) | for some C>0 and all n > n0
2n <= 1*(2^(2n)) for n>1
Therefore 2n = O(2^(2n)). My constants are C=1 and n0 = 1. But others work too.
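A tiny check of those constants (a sketch; the sampled range is arbitrary):

    # Verify 2n <= 1 * 2^(2n) for all sampled n > 1 (in fact it holds from n = 1 on).
    C, n0 = 1, 1
    assert all(2*n <= C * 2**(2*n) for n in range(n0 + 1, 200))
    print("2n <= 2^(2n) holds for all sampled n >", n0)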
