algorithm analysis - orders of growth question

I'm studying orders of growth "big oh", "big omega", and "big theta". Since I can't type the symbols for these I will denote them as follows:
ORDER = big oh
OMEGA = big omega
THETA = big theta
For example I'll say n = ORDER(n^2) to mean that the function n is in the order of n^2 (n grows at most as fast as n^2).
Ok for the most part I understand these:
n = ORDER(n^2) //n grows at most as fast as n^2
n^2 = OMEGA(n) //n^2 grows at least as fast as n
8n^2 + 1000 = THETA(n^2) //same order of growth
Ok here comes the example that confuses me:
what is n(n+1) vs n^2
I realize that n(n+1) = n^2 + n; I would say it has the same order of growth as n^2; therefore I would say
n(n+1) = THETA(n^2)
but my question is, would it also be correct to say:
n(n+1) = ORDER(n^2)
please help because this is confusing to me. thanks.
Thank you guys!!
just to make sure I understand correctly, are these all true:
n^2+n = ORDER(2000n^2)
n^2+n = THETA(2000n^2)
n^2+n = OMEGA(2000n^2)
2000n^2 = ORDER(n^2+n)
2000n^2 = THETA(n^2+n)
2000n^2 = OMEGA(n^2+n)
So if f = THETA(g) then f=ORDER(g) and f=OMEGA(g) are also true.

Yes, n(n+1) = Order(n^2) is correct.
If f = Theta(g) then f = Order(g) and g = Order(f) are both true.
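To make the THETA claim from the follow-up concrete (the constants here are just one choice that happens to work), note that for all n >= 1:

    n^2 <= n^2 + n <= 2n^2, so (1/2000)*(2000n^2) <= n^2 + n <= (1/1000)*(2000n^2)

which is exactly the THETA definition with c1 = 1/2000, c2 = 1/1000, and n0 = 1. The lower inequality gives the OMEGA claim and the upper one gives the ORDER claim.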

Moron is correct, and that is the easiest way to think about it.
But to understand it, return to the definition for f(n) = O(g(n)): there exists a positive M and n0 such that, for all n > n0, f(n) <= Mg(n).
Suppose M=2. Can you find a value, n0, such that for all n > n0, n^2+n <= M(n^2)?
(Plot both functions with pen and paper to understand how they grow in relation to one another.)
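A quick numeric sanity check of that exercise (taking n0 = 1, which is one value that works) might look like this:

    # Check the definition f(n) <= M*g(n) with f(n) = n^2 + n, g(n) = n^2,
    # M = 2 and n0 = 1: since n <= n^2 whenever n >= 1, the inequality holds.
    M, n0 = 2, 1
    assert all(n * n + n <= M * n * n for n in range(n0 + 1, 100000))
    print("n^2 + n <= 2*n^2 for every tested n >", n0)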

You can use this simple table to get an easy and intuitive understanding of what these symbols mean:
If f(n) and g(n) are two functions then
Growth Rate
if f(n) = Θ(g(n)) then growth rate of f(n) = growth rate of g(n)
if f(n) = O(g(n)) then growth rate of f(n) ≤ growth rate of g(n)
if f(n) = Ω(g(n)) then growth rate of f(n) ≥ growth rate of g(n)
if f(n) = o(g(n)) then growth rate of f(n) < growth rate of g(n)
if f(n) = ω(g(n)) then growth rate of f(n) > growth rate of g(n)
Also, the order is always written in terms of the highest-order term, i.e. if the order is O(n^2 + n + 1) then we simply write it as O(n^2), since n^2 is the highest-order term.
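For example, the lower-order terms can always be absorbed into the constant (the constant 3 below is just one choice that works): for all n >= 1,

    n^2 + n + 1 <= n^2 + n^2 + n^2 = 3n^2

so n^2 + n + 1 = O(n^2) with M = 3 and n0 = 1.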

Related

How do I find the relation between these two functions when considering time complexity?

For example, for the question below, can I solve it just by doing the limit comparison test?
Let f(n) = n · (4^n) and let g(n) = 2^(3n)
which relation best applies:
f(n) ≤ O(g(n)), f(n) ≥ Ω(g(n)), or f(n) = Θ(g(n))?
Yes, the limit comparison test works here, and it follows directly from the definition of this notation.
For the provided example |f(n)|/g(n) is n · 2^(-n), so f(n) = O(g(n)) is true, but f(n) = Ω(g(n)) is false, and therefore f(n) = Θ(g(n)) is also false.
As we know 2^(3n) = 8^n:
lim_{n \to \infty} f(n)/g(n) = lim_{n \to \infty} (n · 4^n) / 8^n = lim_{n \to \infty} n / 2^n
Since 2^n grows faster than n, the limit above is zero. Hence, f(n) = o(g(n)) (little-oh).
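A quick numeric illustration (not a substitute for the limit argument) shows the ratio collapsing:

    # f(n)/g(n) = n * 4^n / 8^n = n / 2^n shrinks toward 0 as n grows,
    # which is what f(n) = o(g(n)) predicts.
    for n in (1, 5, 10, 20, 40):
        print(n, n * 4**n / 8**n, n / 2**n)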

Given that f(n) = 10000000n and g(n) = n^2, why is f(n) O(g(n))?

I'm trying to solve the following problem but am unsure about the explanation given in the solution. f2(n) appears to be O(n) and f4(n) appears to be O(n^2). Why, then, is f2(n) O(f4(n))?
Your current statement is not true. A counterexample is f2(n) = n = O(n) and f4(n) = 1 = O(n^2), but f2(n) is not O(f4(n)).
However, if, as mentioned in the answer, f4(n) is quadratic and f2(n) is linear, then by the definition of the big-Oh symbol we can say f2(n) = O(f4(n)).
Read the wikipedia page about Big O notation and you will understand it better.
Informally, a description of a function in terms of big O notation usually only provides an upper bound on the growth rate of the function.
Given that f(n) = 10000000n and g(n) = n², when you say that f(n) is O(g(n)), it means there exists c > 0 (e.g., c = 1) and n0 such that f(n) ≤ cg(n) whenever n ≥ n0.
If f2 is O(f4) it means that f2 grows asymptotically no faster than f4.
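For the 10000000n vs n^2 case specifically, one concrete choice of constants (mine, for illustration) is c = 1 and n0 = 10000000, since 10000000 · n <= n · n exactly when n >= 10000000:

    # With c = 1, f(n) <= c*g(n) holds for every n >= n0 = 10_000_000.
    f = lambda n: 10_000_000 * n
    g = lambda n: n * n
    c, n0 = 1, 10_000_000
    assert all(f(n) <= c * g(n) for n in range(n0, n0 + 1000))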

if f(n) = 2n^2 and g(n) = 1.01^n, is f(n) = O(g(n))? Is f(n) = Ω(g(n))?

Let f(n) = 2n^2 and g(n) = 1.01^n. Is f(n) = O(g(n))? Is f(n) = Ω(g(n))? Justify your answers with a proof.
Think about what the graphs of those functions look like for very large n. Which one grows faster (i.e. overtakes the other in the long run)? Time complexities denote the asymptotic running-time of an algorithm.
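A rough numeric sketch (only intuition, not the requested proof; the crossover value it prints is just an observation about these two particular functions) is to find where 1.01^n first overtakes 2n^2:

    # Scan for the first n where 1.01^n exceeds 2*n^2.
    n = 1
    while 2 * n * n >= 1.01 ** n:
        n += 1
    print("1.01^n first exceeds 2*n^2 at about n =", n)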

The Mathematical Relationship Between Big-Oh Classes

My textbook describes the relationship as follows:
There is a very nice mathematical intuition which describes these classes too. Suppose we have an algorithm which has running time N0 when given an input of size n, and a running time of N1 on an input of size 2n. We can characterize the rates of growth in terms of the relationship between N0 and N1:
Big-Oh     Relationship
O(log n)   N1 ≈ N0 + c
O(n)       N1 ≈ 2N0
O(n²)      N1 ≈ 4N0
O(2ⁿ)      N1 ≈ (N0)²
Why is this?
That is because if f(n) is in O(g(n)) then it can be thought of as acting like k * g(n) for some k.
So for example if f(n) = O(log(n)) then it acts like k log(n), and now f(2n) ≈ k log(2n) = k (log(2) + log(n)) = k log(2) + k log(n) ≈ k log(2) + f(n) and that is your desired equation with c = k log(2).
Note that this is a rough intuition only. An example of where it breaks down is that f(n) = (2 + sin(n)) log(n) = O(log(n)). The oscillating 2 + sin(n) bit means that f(2n)-f(n) can be basically anything.
I personally find this kind of rough intuition to be misleading and therefore worse than useless. Others find it very helpful. Decide for yourself how much weight you give it.
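A small numeric check of that counterexample (purely illustrative) shows how much f(2n) - f(n) can swing:

    import math

    # f(n) = (2 + sin(n)) * log(n) is O(log n), but the oscillating factor
    # keeps f(2n) - f(n) from settling near any constant.
    f = lambda n: (2 + math.sin(n)) * math.log(n)
    for n in (10, 100, 1000, 10000, 100000):
        print(n, round(f(2 * n) - f(n), 3))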
Basically what they are trying to show is just basic algebra after substituting 2n for n in the functions.
O(log n)
log(2n) = log(2) + log(n)
N1 ≈ c + N0
O(n)
2n = 2(n)
N1 ≈ 2N0
O(n²)
(2n)^2 = 4n^2 = 4(n^2)
N1 ≈ 4N0
O(2ⁿ)
2^(2n) = 2^(n*2) = (2^n)^2
N1 ≈ (N0)²
Since O(f(n)) ~ k * f(n) (almost by definition), you want to look at what happens when you put 2n in for n. In each case:
N1 ≈ k*log 2n = k*(log 2 + log n) = k*log n + k*log 2 ≈ N0 + c where c = k*log 2
N1 ≈ k*(2n) = 2*k*n ≈ 2N0
N1 ≈ k*(2n)^2 = 4*k*n^2 ≈ 4N0
N1 ≈ k*2^(2n) = k*(2^n)^2 ≈ N0*2^n ≈ N0^2/k
So the last one is not quite right, anyway. Keep in mind that these relationships are only true asymptotically, so the approximations will be more accurate as n gets larger. Also, f(n) = O(g(n)) only means g(n) is an upper bound for f(n) for large enough n. So f(n) = O(g(n)) does not necessarily mean f(n) ~ k*g(n). Ideally, you want that to be true, since your big-O bound will be tight when that is the case.
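A quick way to see these doubling rules numerically (model functions only; 2^n is left out because it overflows for large n):

    import math

    # For each model running time f, compare f(2n) with f(n) at a large n.
    funcs = {"log n": math.log, "n": lambda n: n, "n^2": lambda n: n * n}
    n = 10**6
    for name, f in funcs.items():
        print(name, "f(2n) - f(n) =", round(f(2 * n) - f(n), 3),
              "| f(2n) / f(n) =", round(f(2 * n) / f(n), 3))

The printed difference for log n sits near log 2 ≈ 0.693 (the additive constant c), while the ratios for n and n^2 come out near 2 and 4.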

Complexity analysis after multiplication of two functions

Given F(n) = Θ(n)
H(n) = O(n)
G(n) = Ω(n)
then what will be the order of F(n) + [G(n) · H(n)]?
edit: F(n) = Θ(n), not Q(n)
There isn't enough information to say anything about the function P(n) = G(n)*H(n). All we know is that G grows at least linearly; it could be growing quadratically, cubically, even exponentially. Likewise, we only know that H grows at most linearly; it could only be growing logarithmically, or be constant, or even be decreasing. As a result, P(n) itself could be decreasing or increasing without bound, which means the sum F(n) + P(n) could also be decreasing or increasing without bound.
Suppose, though, that we could assume that H(n) = Ω(1) (i.e., it is eventually bounded below by some positive constant C1). Now we can say the following about P(n):
P(n) = H(n) * G(n)
     >= C1 * G(n)           (since H(n) >= C1 for large enough n)
     = Ω(G(n)) = Ω(n)
P(n) = H(n) * G(n)
     <= C2 * n * G(n)       (since H(n) = O(n) means H(n) <= C2 * n for large enough n)
     = O(n * G(n))
Thus F(n) + P(n) = Ω(n) and F(n) + P(n) = O(n*G(n)), but nothing more can be said; both bounds are as tight as we can make them without more information about H or G.
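Two concrete instantiations (my own choices, but consistent with the stated constraints) show that both bounds really can be attained:

    F(n) = n, H(n) = 1, G(n) = n   gives F(n) + G(n)·H(n) = 2n = Θ(n), matching the Ω(n) lower bound.
    F(n) = n, H(n) = n, G(n) = n^2 gives F(n) + G(n)·H(n) = n + n^3 = Θ(n·G(n)), matching the O(n·G(n)) upper bound.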
