What is the order relationship between f(n) = 10n and g(n) = n^(n mod 6)?
I know that I can think of f(n) as just n, but thinking about g(n) confuses me, because won't n mod 6 change with the different values of n? For example, n = 6 would make g(n) = n^0 = 1, but when n = 5, g(n) = n^5. How can I think of this with respect to the Big-Oh, Big-Theta, and Big-Omega relationships?
(n mod 6) can only take values from 0 to 5, so g(n) is bounded above by n^5 and below by 1 (for n ≥ 1). Hence g(n) = O(n^5) and g(n) = Ω(1). Because the exponent keeps cycling back to both extremes, neither bound can be tightened, and no single Big-Theta exists.
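A quick numeric sketch (Python, purely for illustration, not part of the original answer) makes the oscillation concrete:

```python
# g(n) = n^(n mod 6): the exponent cycles through 0..5 forever,
# so g keeps returning to 1 while also matching n^5 infinitely often.
def g(n):
    return n ** (n % 6)

for n in range(10, 16):
    print(n, n % 6, g(n))
# 11 -> 11^5 = 161051, but 12 -> 12^0 = 1: no single Theta can sandwich this.
```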
I struggle to fill in this table even though I took calculus recently and am good at math. The chapter only explains how to deal with lim(n^k / c^n); I have no idea how to compare the other functions. I checked the solution manual and found no explanation, only a table with answers, which provides little insight.
When I solve these I don't really think about limits -- I lean on a couple of facts and some well-known properties of big-O notation.
Fact 1: for all functions f and g and all exponents p > 0, we have f(n) = O(g(n)) if and only if f(n)^p = O(g(n)^p), and likewise for o, Ω, ω, and Θ. This has a straightforward proof from the definition; you just have to raise the constant c to the power p as well.
Fact 2: for all exponents ε > 0, the function lg(n) is o(n^ε). This follows from l'Hôpital's rule for limits: lim lg(n)/n^ε = lim (lg(e)/n)/(ε·n^(ε−1)) = (lg(e)/ε)·lim n^(−ε) = 0.
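If you want to see Fact 2 numerically, here is a tiny check (my own sketch; the very slow decay for small ε is exactly why a limit argument, not plugging in values, is what proves it):

```python
import math

# lg(n) / n^eps -> 0 for any eps > 0, though very slowly for small eps.
eps = 0.1
for n in [10**3, 10**9, 10**15, 10**21, 10**27]:
    print(n, math.log2(n) / n**eps)
```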
Fact 3:
If f(n) ≤ g(n) + O(1), then 2^f(n) = O(2^g(n)).
If f(n) ≤ g(n) − ω(1), then 2^f(n) = o(2^g(n)).
If f(n) ≥ g(n) − O(1), then 2^f(n) = Ω(2^g(n)).
If f(n) ≥ g(n) + ω(1), then 2^f(n) = ω(2^g(n)).
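As a concrete instance of the first case, take f(n) = n + 3 and g(n) = n (a toy example of mine, not from the problem set):

```python
# f(n) = n + 3 satisfies f(n) <= g(n) + O(1) with g(n) = n, and indeed
# 2^f(n) = 2^(n+3) = 8 * 2^n, a constant multiple of 2^g(n).
for n in range(1, 20):
    assert 2 ** (n + 3) == 8 * 2 ** n
```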
Fact 4: lg(n!) = Θ(n lg(n)). The proof uses Stirling's approximation.
To solve (a), use Fact 1 to raise both sides to the power of 1/k and apply Fact 2.
To solve (b), rewrite n^k = 2^(k·lg(n)) and c^n = 2^(n·lg(c)), prove that n·lg(c) − k·lg(n) = ω(1), and apply Fact 3.
(c) is special. Since sin(n) comes arbitrarily close to both −1 and 1 infinitely often, n^sin(n) keeps dipping down toward n^(−1) and climbing back up toward n. Since n^(−1) is o(√n) and n is ω(√n), that's a solid row of NO.
To solve (d), observe that n ≥ n/2 + ω(1) and apply Fact 3.
To solve (e), rewrite n^(lg c) = 2^(lg(n)·lg(c)) = 2^(lg(c)·lg(n)) = c^(lg n).
To solve (f), use Fact 4 and note that n lg(n) = lg(n^n), so lg(n!) = Θ(lg(n^n)).
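If you want numeric reassurance for (f), here is a small check (a sketch only, borrowing ln(n!) from math.lgamma; this is not part of the proof):

```python
import math

# Numeric view of Fact 4 / part (f): lg(n!) / (n lg n) creeps toward 1,
# consistent with lg(n!) = Theta(n lg n). math.lgamma(n + 1) = ln(n!).
for n in [10, 100, 1000, 10**4, 10**5]:
    lg_fact = math.lgamma(n + 1) / math.log(2)  # log2(n!)
    print(n, lg_fact / (n * math.log2(n)))
```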
Can I solve questions like the one below just by doing the limit comparison test?
Let f(n) = n · (4^n) and let g(n) = 2^(3n).
Which relation best applies:
f(n) = O(g(n)), f(n) = Ω(g(n)), or f(n) = Θ(g(n))?
Yes, limit tests like this work; they correspond directly to the definitions of this notation: if f(n)/g(n) stays bounded, then f(n) = O(g(n)); if it stays bounded away from zero, then f(n) = Ω(g(n)); and if it tends to zero, then f(n) = o(g(n)).
For the provided example, f(n)/g(n) = n·4^n/8^n = n·2^(−n), which tends to zero. So f(n) = O(g(n)) is true, but f(n) = Ω(g(n)) is false, and therefore f(n) = Θ(g(n)) is false as well.
As we know 2^(3n) = 8^n:
lim_{n→∞} f(n)/g(n) = lim_{n→∞} n·4^n/8^n = lim_{n→∞} n/2^n
Since 2^n grows faster than n, the above limit is zero. Hence, f(n) = o(g(n)) (little-oh).
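Numerically the collapse is immediate (a throwaway sketch):

```python
# f(n)/g(n) = n * 4^n / 8^n = n / 2^n heads to 0 very fast.
for n in [1, 5, 10, 20, 30]:
    print(n, n / 2 ** n)
```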
I am going through the asymptotic notations from here. I am reading this: f(n) ≤ c·g(n).
For example, if f(n) = 2n + 2, we can satisfy f(n) ≤ c·g(n) in many ways by adjusting the values of c and n0. Is there any specific rule or formula for selecting the values of c and n0? Will n0 always be 1?
There is no formula per se. You can find the formal definition here:
f(n) = O(g(n)) means there are positive constants c and k, such that 0 ≤ f(n) ≤ cg(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n. (big-O notation).
What I understood from your question is that you are not getting the essence of big-O notation. If your complexity is, for example, O(n^2), then you can guarantee that there is some point k after which f(n) will in no case exceed c·g(n).
Let's try to prove f(n) = 2n + 2 is O(n):
As it seems from the function itself, you cannot set c = 2: if you plug in c = 2, you have to find k such that f(n) ≤ c·g(n) for n ≥ k, but clearly there is no n for which 2n ≥ 2n + 2. So we move on to c = 3.
Now, let's find the value of k by solving the inequality 3n ≥ 2n + 2:
3n ≥ 2n + 2
=> 3n - 2n ≥ 2
=> n ≥ 2
Therefore, for c = 3, we found the value k = 2 (f(n) ≤ 3n holds for all n ≥ 2).
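You can spot-check these constants mechanically (a sketch; it only verifies a finite range, of course):

```python
# Verify f(n) = 2n + 2 <= 3n for every n >= k = 2, up to 10^5.
assert all(2 * n + 2 <= 3 * n for n in range(2, 10**5))
```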
You must also understand that your function isn't just O(n). It is also O(n^2), O(n^3), O(n^4), and so on, because corresponding values of c and k exist for g(n) = n^2, g(n) = n^3, g(n) = n^4, and so forth.
Hope it helps.
Given f(n) = n^((1 + sin(n * pi / 2)) / 2) and g(n) = n^0.5, how do I prove that f(n) = O(g(n)), f(n) = Omega(g(n)), or f(n) = Theta(g(n))?
I have worked out that f(n) doesn't seem to settle toward any single bound, as the function grows bigger and smaller as n grows large (I plotted the graph here):
https://www.desmos.com/calculator/xtrh124rjb
So how would one justify which class it belongs to?
Or does it belong to none of them, since it doesn't settle toward a bound at all?
Consider the sequence 1, 5, 9, …, 4k + 1, … For these values of n, (1 + sin(n * pi / 2)) / 2 = 1. Therefore, for these values of n, your function is identical to the function A(n) = n^1 = n. Note that A(n) = n is NOT O(g(n)) = O(n^0.5); n grows asymptotically faster than n^0.5.
Consider the sequence 3, 7, 11, …, 4k + 3, … For these values of n, (1 + sin(n * pi / 2)) / 2 = 0. Therefore, for these values of n, your function is identical to the function B(n) = n^0 = 1. Note that B(n) = 1 is NOT Omega(g(n)) = Omega(n^0.5); n^0.5 grows asymptotically faster than the constant 1 (which doesn't grow at all).
Either f(n) not being O(g(n)) OR f(n) not being Omega(g(n)) would have already disqualified f(n) from being Theta(g(n)).
Note: f(n) = O(A(n)) and f(n) = Omega(B(n)). f(n) = Theta(h(n)) where h(n) is any function which oscillates in step with f(n), growing just as fast on the upper swings and staying at a constant on the lower ones.
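You can watch the two subsequences directly (a sketch; sin(n*pi/2) picks up small floating-point error, so the exponents are only approximately 1 and 0):

```python
import math

# f(n) = n^((1 + sin(n*pi/2)) / 2): exponent is ~1 on n = 4k+1 and ~0 on
# n = 4k+3 (approximately, due to floating-point rounding of pi).
def f(n):
    return n ** ((1 + math.sin(n * math.pi / 2)) / 2)

print([round(f(n), 3) for n in (1, 5, 9, 13)])   # tracks n: 1, 5, 9, 13
print([round(f(n), 3) for n in (3, 7, 11, 15)])  # pinned near 1
```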
Let's say you were given two logarithmic functions like f(n) = log(n^2) and g(n) = log(n),
and you were asked to find whether f(n) is O(g(n)), Ω(g(n)), or Θ(g(n)), how would you go about it? I found questions like these easier when comparing two polynomial functions, because, for example, with x(n) = n^2 and p(n) = n^2 you could find a c > 0 (e.g., c = 3) where x(n) <= c * p(n) for all n greater than some n_0 > 0, and that would prove that x(n) = O(p(n)). However, comparing two logarithmic functions seems much more difficult for some reason. Any help is appreciated, thanks!
f(n) is O(g(n)) iff there is a constant c and n_0 such that f(n) <= c * g(n) for each n >= n_0.
f(n) is Ω(g(n)) iff there is a constant c and n_0 such that f(n) >= c * g(n) for each n >= n_0.
Now, f(n) is Θ(g(n)) iff f(n) is O(g(n)) and f(n) is Ω(g(n)).
So, in your cases, we have:
f(n) = log(n^2) = 2 * log n
which means we can take g(n) = log n and c = 2: then f(n) <= 2 * log n and f(n) >= 2 * log n both hold, which makes it Ω(log n) as well as O(log n), hence Θ(log n).
Btw, it also holds that f(n) <= n and f(n) >= 1, so f(n) can be called O(n), but we don't use that, since we can find a tighter O(g(n)). In that case the upper and lower bounds don't use the same function (f(n) is not Ω(n)), so those choices don't give us Ω. However, we just need one valid choice of g(n) to declare Ω; in cases where we can't find one, we say it's not Ω. Note the word "we say".
In the second case, we care only about the fastest-growing term, the log n part. Now c = 1 and g(n) = log n, so in this case it is also Ω(log n).
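A tiny numeric confirmation of the first case (a sketch):

```python
import math

# f(n) = log(n^2) is exactly 2 * log(n) for every n > 1, so the ratio
# f(n)/g(n) is the constant 2: both O(log n) and Omega(log n), hence Theta.
for n in [10, 100, 10**6]:
    print(n, math.log(n ** 2) / math.log(n))
```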