Big O in Algorithms

I was reading an article and came across the following:
Informally, O(g(n)) can be defined as the set of mathematical functions that contains all functions that don’t “grow faster” than g(n). Thus, all functions below are in the set O(n²):
f(n) = n² + 3n + 2, f(n) = n log(n), f(n) = 3n + 1.
Can anyone please tell me how f(n) = n² + 3n + 2 grows faster than g(n)?

Here is one way to understand it (a bit informal, but I find it more intuitive).
Let L be the limit of f(n)/g(n) as n goes to infinity.
If L is infinite, then f(n) grows faster than g(n) (the numerator overwhelms the denominator).
If L is 0, then f(n) grows slower than g(n) (the denominator overwhelms the numerator).
If L is a finite, nonzero number, then they have the same (comparable) growth rate.
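Here is a quick numeric illustration of that rule (a sketch of my own, in Python; the lambdas are just the three example functions from the question, compared against g(n) = n²):

    # Estimate lim f(n)/g(n) by evaluating the ratio at a large n.
    import math

    def ratio(f, g, n=10**6):
        return f(n) / g(n)

    g = lambda n: n**2

    print(ratio(lambda n: n**2 + 3*n + 2, g))   # ~1.000003: finite, same growth
    print(ratio(lambda n: n * math.log(n), g))  # ~0.0000138: tends to 0, grows slower
    print(ratio(lambda n: 3*n + 1, g))          # ~0.000003: tends to 0, grows slower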

We can define O(g(n)) as the following set:
O(g(n)) = { f(n) ∶ ∃ c > 0 and n₀ > 0 | 0 ≤ f(n) ≤ c ⋅ g(n), ∀ n ≥ n₀ }
This means O(g(n)) is the set of all functions f(n) that are eventually bounded above by c ⋅ g(n), for some constant c and all n ≥ n₀. In order to find n₀ and c, we use a justification like the following:
n² + 3n + 2 ≤ n² + 3n² + 2n² (for n ≥ 1)
n² + 3n + 2 ≤ 6n², so c = 6 and n₀ = 1
Now if you compare against g(n) = n² alone, f(n) = n² + 3n + 2 is obviously the larger of the two; but by choosing the value of c correctly, c ⋅ g(n) will dominate f(n) for all n ≥ n₀.
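To make the witness constants concrete, here is a small check I am adding (not part of the original answer), in Python:

    # Verify the witnesses c = 6, n0 = 1 for f(n) = n^2 + 3n + 2 in O(n^2).
    f = lambda n: n**2 + 3*n + 2
    c, n0 = 6, 1
    assert all(f(n) <= c * n**2 for n in range(n0, 10**5))
    print("f(n) <= 6*n^2 holds for every tested n >= 1")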

Related

Finding the values in Big Oh

I am going through the asymptotic notations from here. I am reading this: f(n) ≤ c ⋅ g(n).
For example, if f(n) = 2n + 2, we can always satisfy f(n) = O(g(n)) by adjusting the values of c and n. Is there any specific rule or formula for selecting the values of c and n? Will n₀ always be 1?
There is no formula per se. You can find the formal definition here:
f(n) = O(g(n)) means there are positive constants c and k, such that 0 ≤ f(n) ≤ cg(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n. (big-O notation).
What I understood from your question is that you are not getting the essence of big-O notation. If your complexity is, for example, O(n^2), then you can guarantee that there is some value of n (greater than k) after which f(n) will in no case exceed c ⋅ g(n).
Let's try to prove f(n) = 2n + 2 is O(n):
As it seems from the function itself, you cannot set the value of c equal to 2, since you want f(n) ≤ c ⋅ g(n): if you plug in c = 2, you have to find k such that f(n) ≤ c ⋅ g(n) for n ≥ k, and clearly there is no n for which 2n ≥ 2n + 2. So, we move on to c = 3.
Now, let's find the value of k. So, we solve the equation 3n ≥ 2n + 2. Solving it:
3n ≥ 2n + 2
=> 3n - 2n ≥ 2
=> n ≥ 2
Therefore, for c = 3, we found k = 2 (the inequality holds for all n ≥ k).
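As a quick sanity check (my addition, not part of the original answer), this Python sketch confirms both that c = 3 works from k = 2 and that c = 2 never works:

    # f(n) = 2n + 2 <= 3n holds for all n >= 2 (c = 3, k = 2),
    # while 2n + 2 <= 2n holds for no n at all (c = 2 fails).
    f = lambda n: 2*n + 2
    assert all(f(n) <= 3*n for n in range(2, 10**5))
    assert not any(f(n) <= 2*n for n in range(1, 10**5))
    print("c = 3, k = 2 works; c = 2 never does")
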
You must also understand that your function isn't just O(n). It is also O(n^2), O(n^3), O(n^4), and so on, because corresponding values of c and k exist for g(n) = n^2, g(n) = n^3, and g(n) = n^4.
Hope it helps.

Which pair of functions satisfy f(N) ∼ g(N)?

a. f(N) = N and g(N) = N + N²
b. f(N) = 2N and g(N) = √N
c. f(N) = N log N + N and g(N) = 2N log N + N
d. f(N) = 2√N + N and g(N) = √N + N
What is the best way of comparing these functions? I have tried plugging values into them, but some of them are very close in value and I am not sure which one to pick.
Calculate f(N)/g(N) in the limit N → ∞.
If f(N)/g(N) approaches a positive constant α in the limit N → ∞, then f(N) ~ α ⋅ g(N). In particular, f(N) ∼ g(N) holds exactly when that limit is 1.
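As a practical shortcut (a sketch I am adding, not from the original answer), you can evaluate the ratios at increasingly large N and watch where they settle:

    # Evaluate f(N)/g(N) at growing N for each pair; a ratio
    # converging to 1 indicates f(N) ~ g(N).
    import math

    pairs = {
        "a": (lambda N: N,                  lambda N: N + N**2),
        "b": (lambda N: 2*N,                lambda N: math.sqrt(N)),
        "c": (lambda N: N*math.log(N) + N,  lambda N: 2*N*math.log(N) + N),
        "d": (lambda N: 2*math.sqrt(N) + N, lambda N: math.sqrt(N) + N),
    }

    for name, (f, g) in pairs.items():
        print(name, [round(f(N) / g(N), 4) for N in (10**2, 10**4, 10**6)])
    # a -> 0, b -> infinity, c -> 1/2, d -> 1; only d satisfies f(N) ~ g(N)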

How to prove the complexity of a logarithmic function?

Let's say you were given two logarithmic functions like
and you were asked to find whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)): how would you go about it? I found questions like these easier when comparing two polynomial functions, because, for example, with x(n) = n² and p(n) = n² you could find a c > 0 (e.g. 3) where x(n) <= c ⋅ p(n) for all n greater than some n₀ > 0, and that would prove that x(n) = O(p(n)). However, comparing two logarithmic functions seems much more difficult for some reason. Any help is appreciated, thanks!
f(n) is O(g(n)) iff there is a constant c and n_0 such that f(n) <= c * g(n) for each n >= n_0.
f(n) is Ω(g(n)) iff there is a constant c and n_0 such that f(n) >= c * g(n) for each n >= n_0.
Now, f(n) is Θ(g(n)) iff f(n) is O(g(n)) and f(n) is Ω(g(n)).
So, in your cases, we have:
f(n) = log(n^2) = 2 log n
which means g(n) is log n and c = 2: we have both f(n) <= 2 * log n and f(n) >= 2 * log n, which makes f(n) Ω(log n) (and O(log n), hence Θ(log n)).
Btw., it is also true that f(n) <= n and f(n) >= 1 for large n, so f(n) is also O(n) and Ω(1); but we don't use those, since we can find tighter bounds. With those choices we don't have the same function in both notations, so they don't combine into a Θ. However, we just need one choice of g(n) that witnesses both to declare Θ; in cases where we can't find one, we say it's not Θ. Note the word "we say".
In the second case, we care only about the fastest-growing term, the log n part. Now c = 1 and g(n) = log n, so in this case it is also Ω(log n).
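To ground the first case numerically (my own sketch in Python, assuming f(n) = log(n^2) and g(n) = log n as above):

    # f(n) = log(n^2) = 2*log(n), so f(n)/g(n) is exactly 2 for every n > 1,
    # witnessing f(n) <= 2*log(n) and f(n) >= 2*log(n) simultaneously.
    import math

    f = lambda n: math.log(n**2)
    g = lambda n: math.log(n)
    print([f(n) / g(n) for n in (10, 10**3, 10**6)])  # [2.0, 2.0, 2.0]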

Proving if g(n) is o(f(n)), then f(n) + g(n) is Theta(f(n))

So I'm struggling with proving (or disproving) the above question. I feel like it is true, but I'm not sure how to show it.
Again, the question is if g(n) is o(f(n)), then f(n) + g(n) is Theta(f(n))
Note, that is a little-o, not a big-o!!!
So far, I've managed to (easily) show that:
g(n) = o(f(n)) -> g(n) < c*f(n) (for every c > 0 and all sufficiently large n)
Then g(n) + f(n) < (c+1)*f(n) -> (g(n) + f(n)) = O(f(n))
However, for showing Big Omega, I'm not sure what to do there.
Am I going about this right?
EDIT: Everyone provided great help, but I could only mark one. THANK YOU.
One option would be to take the limit of (f(n) + g(n)) / f(n) as n tends toward infinity. If this converges to a finite, nonzero value, then f(n) + g(n) = Θ(f(n)).
Assuming that f(n) is nonzero for sufficiently large n, the above ratio, in the limit, is
(f(n) + g(n)) / f(n)
= f(n) / f(n) + g(n) / f(n)
= 1 + g(n) / f(n).
Therefore, taking the limit as n goes to infinity, the above expression converges to 1, because the ratio g(n) / f(n) goes to zero (this is what it means for g(n) to be o(f(n))).
So far so good.
For the next step, recall that, at the very least, 0 <= g(n); this should get you a lower bound on g(n) + f(n).
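Here is a tiny numeric illustration of both halves (my addition, with f(n) = n² and g(n) = n chosen arbitrarily so that g(n) = o(f(n))):

    # With g(n) = o(f(n)) and g(n) >= 0, we get f(n) <= f(n) + g(n)
    # and (f(n) + g(n)) / f(n) -> 1 as n grows.
    f = lambda n: n**2
    g = lambda n: n  # n = o(n^2)

    for n in (10, 10**3, 10**6):
        print(n, (f(n) + g(n)) / f(n))  # 1.1, 1.001, 1.000001 -> tends to 1
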
Before we begin, let's first state what the little-o and Big-Θ notations mean:
Little-o notation
Formally, g(n) = o(f(n)) (or g(n) ∈ o(f(n))) means that for every
positive constant ε there exists a constant N such that
|g(n)| ≤ ε ⋅ |f(n)|, for all n > N (+)
From https://en.wikipedia.org/wiki/Big_O_notation#Little-o_notation.
Big-Θ notation
h(n) = Θ(f(n)) means there exist positive constants k_1, k_2
and N such that k_1 ⋅ |f(n)| and k_2 ⋅ |f(n)| are a lower bound
and an upper bound on |h(n)|, respectively, for n > N, i.e.
k_1 ⋅ |f(n)| ≤ |h(n)| ≤ k_2 ⋅ |f(n)|, for all n > N (++)
From https://www.khanacademy.org/computing/computer-science/algorithms/asymptotic-notation/a/big-big-theta-notation.
Given: g(n) ∈ o(f(n))
Hence, in our case, for every ε > 0 we can find some constant N such that (+) holds for our functions g(n) and f(n). In particular, for such a pair of ε and N, we have
|g(n)| ≤ ε ⋅ |f(n)|, for all n > N
Choose a constant ε < 1 (recall that the above holds for all ε > 0),
with its accompanying constant N.
Then the following holds for all n>N
ε ⋅ (|g(n)| + |f(n)|) ≤ 2 ⋅ |f(n)| ≤ 2 ⋅ (|g(n)| + |f(n)|) ≤ 4 ⋅ |f(n)| (*)
(The outer inequalities use |g(n)| ≤ ε ⋅ |f(n)| ≤ |f(n)|, and the middle one uses |g(n)| ≥ 0.)
Stripping away the left-most inequality in (*) and dividing by 2, we have:
|f(n)| ≤ |g(n)| + |f(n)| ≤ 2*|f(n)|, n>N (**)
We see that this is the very definition of Big-Θ notation, as presented in (++), with constants k_1 = 1, k_2 = 2 and h(n) = g(n) + f(n). Hence
(**) => g(n) + f(n) is in Θ(f(n))
And we have shown that g(n) ∈ o(f(n)) implies (g(n) + f(n)) ∈ Θ(f(n)).
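As an empirical companion to the proof (a sketch of my own, using g(n) = n ⋅ log n and f(n) = n², for which g(n) ∈ o(f(n))), the sandwich (**) can be spot-checked in Python:

    # Check (**): f(n) <= g(n) + f(n) <= 2*f(n), with g(n) = n*log(n), f(n) = n^2.
    # For these functions it already holds from n = 2 onward, since log(n) <= n.
    import math

    f = lambda n: n**2
    g = lambda n: n * math.log(n)

    assert all(f(n) <= g(n) + f(n) <= 2*f(n) for n in range(2, 10**4))
    print("f(n) <= g(n) + f(n) <= 2*f(n) holds for every tested n >= 2")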

Comparing complexities

I have these three questions for an exam review:
If f(n) = 2n - 3 give two different functions g(n) and h(n) (so g(n) doesn't equal h(n)) such that f(n) = O(g(n)) and f(n) = O(h(n))
Now do the same again with functions g'(n) and h'(n), but this time the functions should satisfy
g'(n) = Θ(f(n)) and f(n) = o(h'(n))
Is it possible for a function to satisfy both f(n) = O(g(n)) and f(n) = Ω(g(n))?
I know that a function is O of another if it is less than or equal to the other function (asymptotically). So I think for 1. you could have g(n) = 2n² - 3 and h(n) = 2n² - 10.
I also know that a function is Θ of another if it is basically equal to the other function (we can ignore constants), and o of it if it is strictly less than the function, so for 2. I think you could have g'(n) = 2n - 15 and h'(n) = 2n.
For 3.: it is possible for a function to be both O(g(n)) and Ω(g(n)), because both notations allow the function to be the same as the given function, so you could have a function g(n) that equals f(n) and satisfies the rules for being both O and Ω.
Can someone please tell me if this is correct?
Your answers are mostly right. But I would like to add some points:
Given is f(n) = 2n - 3
With g(n) = 2n² - 3 and h(n) = 2n² - 10, f(n) is in O(g(n)) and in O(h(n)). But your g(n) and h(n) are basically the same; at least, they are both in Θ(n²). There exist many other functions that would also work, e.g.:
f(n) ∈ O(n) ⇒ g(n) = n
f(n) ∈ O(nᵏ) ⇒ g(n) = nᵏ ∀ k ≥ 1
f(n) ∈ O(2ⁿ) ⇒ g(n) = 2ⁿ
g'(n) = 2n - 15 reduces to g'(n) = n if we think in complexities, and this is right. In fact, up to constant factors, it is the only possible answer.
But f(n) ∈ o(h'(n)) does not hold for h'(n) = 2n. Little-o means that
lim_{n → ∞} |f(n)/g(n)| = 0 ⇔ f(n) ∈ o(g(n))
So you can choose h'(n) = n², or more generally h'(n) = nᵏ ∀ k > 1, or h'(n) = cⁿ for a constant c > 1.
Yes, it is possible, and you can take it also as a definition of Θ(g(n)):
f(n) ∈ Θ(g(n)) ⇔ f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
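To double-check those claims numerically (my own sketch; the sample points are arbitrary):

    # Ratios against f(n) = 2n - 3 for the suggested answers: g'(n) = 2n - 15
    # (ratio tends to 1, Theta) and h'(n) = n^2 (ratio tends to 0, little-o).
    f  = lambda n: 2*n - 3
    gp = lambda n: 2*n - 15
    hp = lambda n: n**2

    for n in (10**2, 10**4, 10**6):
        print(n, f(n) / gp(n), f(n) / hp(n))
    # f/g' -> 1, so g'(n) = Θ(f(n)); f/h' -> 0, so f(n) = o(h'(n))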
