Is there a proof that this big-O statement is wrong?

I've been trying to find examples showing that this is wrong, because it is supposed to be wrong(?).
if g = O(f) and s = O(r) then g/s = O(f/r)
Is there a counterexample that proves this big-O division rule wrong?

Take g(n) = n^2, f(n)=n^3 and s(n) = 1, r(n)=n^2.
You can see that g = O(f) and s = O(r), but g/s = n^2 is not O(f/r) = O(n^3/n^2) = O(n).
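A quick numeric sanity check of this counterexample (sampling a few points is only a sketch, not a proof):

```python
# If g/s were O(f/r), the ratio (g/s) / (f/r) would stay bounded.
# For this counterexample it equals n, so it grows without bound.
def g(n): return n**2
def f(n): return n**3
def s(n): return 1
def r(n): return n**2

for n in [10, 100, 1000, 10000]:
    print(n, (g(n) / s(n)) / (f(n) / r(n)))  # prints n itself each time
```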

Easy:
g(n) = n²
f(n) = n²
s(n) = n
r(n) = n²
Then:
g ∈ Ο(n²) is true
s ∈ Ο(n²) is true
g/s = n²/n = n ∈ Ο(n²/n²) = Ο(1) is false

The idea is: if f is O(g) then f is bounded above by g, but not necessarily tightly bounded. That is, if f is O(n) then it is also O(n^2), O(n^3), etc.
You can use this idea to find examples by making the denominator function large using a larger bound than necessary, thus making the fraction small.

Big-O is just an upper bound.
Let f = g = 1 and let s grow slower than r (e.g. s = n, r = n^2). Then 1/s grows faster than 1/r, so g/s = 1/s is not O(f/r) = O(1/r).

Let g = n^2, r = n^100, s = n and f = n^2. Clearly the statement is not true: g/s = n and f/r = 1/n^98, and n is not in O(1/n^98).
One intuitive way to think of big-O is "grows no faster than". However, when you take the reciprocal, if f grows no faster than g, then 1/f will grow no slower than 1/g.
A correct statement would be that if g=O(f) and s=O(r), then g/r=O(f/s). Note that in this case I swapped the pairs.
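A one-line derivation of that swapped statement, assuming all four functions are eventually positive:

```latex
g \le c_1 f \ \text{ and } \ s \le c_2 r
\;\Longrightarrow\; \frac{1}{r} \le \frac{c_2}{s}
\;\Longrightarrow\; \frac{g}{r} \le c_1 c_2 \cdot \frac{f}{s},
\qquad \text{i.e. } \frac{g}{r} = O\!\left(\frac{f}{s}\right).
```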

Related

For two non-negative functions f and g, prove or disprove: if f = O(g) and g = O(f) and ∀n, f(n) > g(n), then f − g = O(1)

I am a bit confused about how to utilize asymptotic analysis to prove this statement. I've tried to use the definitions of f = O(g) and g = O(f), namely 0 < f(n) <= c1*g(n) and 0 < g(n) <= c2*f(n); however, I can't deduce what will happen for f(n) − g(n). Can someone help me out on this?
You can make a lot of counterexamples. A simple one is f(n) = 2n and g(n) = n. You can see 2n ∈ O(n) and n ∈ O(2n) by definition. But f(n) − g(n) = n, which is obviously not in O(1).
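As a quick sanity check of that counterexample (sampled values, not a proof):

```python
# f(n) = 2n and g(n) = n satisfy f = O(g) (take c = 2) and g = O(f)
# (take c = 1), yet f(n) - g(n) = n exceeds any fixed constant.
f = lambda n: 2 * n
g = lambda n: n

for n in [1, 10, 100, 1000]:
    print(n, f(n) - g(n))  # 1, 10, 100, 1000: unbounded, so not O(1)
```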

How to solve a problem on relative asymptotic growth (table) from CLRS?

I struggle to fill this table in even though I took calculus recently and am good at math. It is only specified in the chapter how to deal with lim(n^k/c^n), but I have no idea how to compare the other functions. I checked the solution manual and found no info on that, only a table with answers, which provides little insight.
When I solve these I don't really think about limits -- I lean on a couple of facts and some well-known properties of big-O notation.
Fact 1: for all functions f and g and all exponents p > 0, we have f(n) = O(g(n)) if and only if f(n)^p = O(g(n)^p), and likewise with o, Ω, ω, and Θ. This has a straightforward proof from the definition; you just have to raise the constant c to the power p as well.
Fact 2: for all exponents ε > 0, the function lg(n) is o(n^ε). This follows from l'Hôpital's rule for limits: lim lg(n)/n^ε = lim (lg(e)/n)/(ε n^(ε−1)) = (lg(e)/ε) lim n^(−ε) = 0.
Fact 3:
If f(n) ≤ g(n) + O(1), then 2^f(n) = O(2^g(n)).
If f(n) ≤ g(n) − ω(1), then 2^f(n) = o(2^g(n)).
If f(n) ≥ g(n) − O(1), then 2^f(n) = Ω(2^g(n)).
If f(n) ≥ g(n) + ω(1), then 2^f(n) = ω(2^g(n)).
Fact 4: lg(n!) = Θ(n lg(n)). The proof uses Stirling's approximation.
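A rough numeric illustration of Facts 2 and 4 (sampling points proves nothing asymptotically, but it matches the claims):

```python
import math

# Fact 2: lg(n) / n^eps -> 0 for any eps > 0; here eps = 0.5.
eps = 0.5
for n in [10**3, 10**6, 10**9]:
    print(n, math.log2(n) / n**eps)  # shrinks toward 0

# Fact 4: lg(n!) / (n lg n) -> 1, consistent with lg(n!) = Theta(n lg n).
# math.lgamma(n + 1) equals ln(n!); dividing by ln(2) converts to base 2.
for n in [10, 100, 1000, 10000]:
    lg_factorial = math.lgamma(n + 1) / math.log(2)
    print(n, lg_factorial / (n * math.log2(n)))  # climbs toward 1
```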
To solve (a), use Fact 1 to raise both sides to the power of 1/k and apply Fact 2.
To solve (b), rewrite n^k = 2^(k lg(n)) and c^n = 2^(n lg(c)), prove that n lg(c) − k lg(n) = ω(1), and apply Fact 3.
(c) is special. n^sin(n) oscillates between n^(−1) and n. Since it dips to n^(−1) = o(√n) and climbs to n = ω(√n) infinitely often, that's a solid row of NO.
To solve (d), observe that n ≥ n/2 + ω(1) and apply Fact 3.
To solve (e), rewrite n^lg(c) = 2^(lg(n)·lg(c)) = 2^(lg(c)·lg(n)) = c^lg(n).
To solve (f), use Fact 4 and find that lg(n!) = Θ(n lg(n)) = Θ(lg(n^n)).
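The identity behind (e) is easy to spot-check numerically (floating point, so expect tiny rounding differences):

```python
import math

# n^lg(c) = 2^(lg(n) * lg(c)) = c^lg(n); check with c = 3.
c = 3.0
for n in [2.0, 16.0, 1024.0]:
    print(n, n ** math.log2(c), c ** math.log2(n))  # equal up to rounding
```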

Given that f(n) = 10000000n and g(n) = n^2, why is f(n) O(g(n))?

I'm trying to solve the following problem but am unsure about the explanation given in the solution. f2(n) appears to be O(n) and f4(n) appears to be O(n^2). Why, then, is f2(n) O(f4(n))?
Your current statement is not true in general. A counterexample is f2(n) = n = O(n) and f4(n) = 1 = O(n^2); here f2(n) is not O(f4(n)).
However, as mentioned in the answer, f4(n) is quadratic and f2(n) is linear, so by the definition of big-O we can say f2(n) = O(f4(n)).
Read the Wikipedia page on Big O notation and you will understand it better.
Informally, a description of a function in terms of big O notation usually only provides an upper bound on the growth rate of the function.
Given that f(n) = 10000000n and g(n) = n², when you say that f(n) is O(g(n)), it means there exist c > 0 and n0 such that f(n) ≤ c·g(n) whenever n ≥ n0 (e.g., c = 1 and n0 = 10^7).
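A quick check of those constants (the inequality itself is simple algebra: 10000000·n ≤ n² exactly when n ≥ 10^7):

```python
# With c = 1, f(n) <= g(n) first holds at n0 = 10**7.
f = lambda n: 10_000_000 * n
g = lambda n: n * n

for n in [10**6, 10**7 - 1, 10**7, 10**8]:
    print(n, f(n) <= g(n))  # False, False, True, True
```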
If f2 is O(f4), it means that f2 grows asymptotically no faster than f4.

Is it valid to say the function f(n) = n is theta(n)?

I am taking a class and we're reviewing time complexity information.
I understand that big o is an upper bound, and omega is a lower bound, and if those are the same then the function is theta(that bound).
Let's say I have the function f(n) = n. Can we say that it is theta(n)?
I think it is, because it is O(n) and Omega(n) with C = 1 for k >= 1, but I wanted to ask to be sure.
Yes, that is correct. It is a common definition to say that f ∈ Θ(g) iff f ∈ Ω(g) and f ∈ O(g).
Here f(n) = n and g(n) = n.
To prove both individual parts: liminf f(n)/g(n) = liminf 1 = 1 > 0 and limsup f(n)/g(n) = limsup 1 = 1 < ∞.
In particular, f ∈ Θ(f) for all functions f.
Note, however, that the notation usually uses a big Θ, not a small theta.
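Spelled out against the constant-based definition of Θ, the instantiation is direct:

```latex
c_1\, g(n) \;\le\; f(n) \;\le\; c_2\, g(n) \quad \text{for all } n \ge n_0,
\qquad \text{satisfied here by } f(n) = g(n) = n,\; c_1 = c_2 = 1,\; n_0 = 1.
```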

algorithm analysis - orders of growth question

I'm studying orders of growth: "big oh", "big omega", and "big theta". Since I can't type the little symbols for these I will denote them as follows:
ORDER = big oh
OMEGA = big omega
THETA = big theta
For example I'll say n = ORDER(n^2) to mean that the function n is in the order of n^2 (n grows at most as fast as n^2).
Ok for the most part I understand these:
n = ORDER(n^2) //n grows at most as fast as n^2
n^2 = OMEGA(n) //n^2 grows at least as fast as n
8n^2 + 1000 = THETA(n^2) //same order of growth
Ok here comes the example that confuses me:
what is n(n+1) vs n^2
I realize that n(n+1) = n^2 + n; I would say it has the same order of growth as n^2; therefore I would say
n(n+1) = THETA(n^2)
but my question is, would it also be correct to say:
n(n+1) = ORDER(n^2)
please help because this is confusing to me. thanks.
Thank you guys!!
just to make sure I understand correctly, are these all true:
n^2+n = ORDER(2000n^2)
n^2+n = THETA(2000n^2)
n^2+n = OMEGA(2000n^2)
2000n^2 = ORDER(n^2+n)
2000n^2 = THETA(n^2+n)
2000n^2 = OMEGA(n^2+n)
So if f = THETA(g), then f = ORDER(g) and f = OMEGA(g) are also true.
Yes, n(n+1) = Order(n^2) is correct.
If f = Theta(g) then f = Order(g) and g = Order(f) are both true.
Moron is correct, and that is the easiest way to think about it.
But to understand it, return to the definition for f(n) = O(g(n)): there exists a positive M and n0 such that, for all n > n0, f(n) <= Mg(n).
Suppose M=2. Can you find a value, n0, such that for all n > n0, n^2+n <= M(n^2)?
(Plot both functions with pen and paper to understand how they grow in relation to one another.)
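If you'd rather check numerically than plot, a few sample points (suggestive, not a proof):

```python
# With M = 2: n^2 + n <= 2*n^2 simplifies to n <= n^2, which holds
# for every n >= 1, so n0 = 1 works.
for n in [1, 2, 10, 1000]:
    print(n, n**2 + n <= 2 * n**2)  # True each time
```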
You can use this simple table to get an easy and intuitive understanding of what these symbols mean:
If f(n) and g(n) are two functions, then their growth rates compare as follows:
if f(n) = Θ(g(n)) then growth rate of f(n) = growth rate of g(n)
if f(n) = O(g(n)) then growth rate of f(n) ≤ growth rate of g(n)
if f(n) = Ω(g(n)) then growth rate of f(n) ≥ growth rate of g(n)
if f(n) = o(g(n)) then growth rate of f(n) < growth rate of g(n)
if f(n) = ω(g(n)) then growth rate of f(n) > growth rate of g(n)
Also, the order is always written in terms of the highest-order term, i.e., if the order is O(n^2 + n + 1), then we simply write it as O(n^2), since n^2 is the highest-order term.
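That simplification is justified by absorbing the lower-order terms into the constant:

```latex
n^2 + n + 1 \;\le\; n^2 + n^2 + n^2 \;=\; 3n^2 \quad \text{for all } n \ge 1,
\qquad \text{so } n^2 + n + 1 = O(n^2).
```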
