Proving a given function equals o(N) - algorithm

I am trying to prove that for any constant k, log^k N = o(N) (little o of N).
All that I know about little o is that it has the form T(n) = o(p(n)), where T(n) grows at a rate slower than p(n). Also, I can't really take a limit and use L'Hopital's rule because I don't know what f(n) or g(n) is. Can someone please help me get started?

You need to show that
lim (log^k N)/N = 0
N -> infinity
Write N = e^x, and that becomes
lim (x^k)/(e^x) = 0
Now, use the power series expansion of the exponential function,
e^x = ∑ (x^n)/n!
to get an estimate for that quotient.
Spoiler: Picking the term with n = k+1 gives e^x > x^(k+1)/(k+1)! and from that easily follows (x^k)/(e^x) < (k+1)!/x.
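For a quick numerical sanity check (not a proof), you can watch the ratio shrink; the choice k = 3 below is arbitrary, and any fixed k behaves the same way:

```python
# Numerical illustration that (log N)^k / N -> 0 for a fixed constant k.
import math

k = 3  # arbitrary constant
for exp in (10, 20, 40, 80):
    N = 10 ** exp
    ratio = math.log(N) ** k / N
    print(f"N = 1e{exp:<3} (log N)^{k} / N = {ratio:.3e}")
```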


How to solve a problem on relative asymptotic growth (table) from CLRS?

I struggle to fill in this table even though I took calculus recently and am good at math. The chapter only explains how to deal with lim(n^k/c^n), and I have no idea how to compare the other functions. I checked the solution manual and there is no explanation, only a table of answers, which provides little insight.
When I solve these I don't really think about limits -- I lean on a couple facts and some well-known properties of big-O notation.
Fact 1: for all functions f and g and all exponents p > 0, we have f(n) = O(g(n)) if and only if f(n)^p = O(g(n)^p), and likewise with o, Ω, ω, and Θ respectively. This has a straightforward proof from the definition; you just have to raise the constant c to the power p as well.
Fact 2: for all exponents ε > 0, the function lg(n) is o(n^ε). This follows from l'Hôpital's rule for limits: lim lg(n)/n^ε = lim (lg(e)/n)/(ε n^(ε−1)) = (lg(e)/ε) lim n^(−ε) = 0.
Fact 3:
If f(n) ≤ g(n) + O(1), then 2f(n) = O(2g(n)).
If f(n) ≤ g(n) − ω(1), then 2f(n) = o(2g(n)).
If f(n) ≥ g(n) − O(1), then 2f(n) = Ω(2g(n)).
If f(n) ≥ g(n) + ω(1), then 2f(n) = ω(2g(n)).
Fact 4: lg(n!) = Θ(n lg(n)). The proof uses Stirling's approximation.
To solve (a), use Fact 1 to raise both sides to the power of 1/k and apply Fact 2.
To solve (b), rewrite n^k = 2^(lg(n) * k) and c^n = 2^(lg(c) * n), prove that lg(c) * n − lg(n) * k = ω(1), and apply Fact 3.
(c) is special. n^sin(n) oscillates anywhere between n^(−1) and n. Since n^(−1) is o(√n) and n is ω(√n), that's a solid row of NOs.
To solve (d), observe that n ≥ n/2 + ω(1) and apply Fact 3.
To solve (e), rewrite n^lg(c) = 2^(lg(n) * lg(c)) = 2^(lg(c) * lg(n)) = c^lg(n).
To solve (f), use Fact 4 and find that lg(n!) = Θ(n lg(n)) = Θ(lg(n^n)).
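If you want to convince yourself numerically before writing the proofs, a small script can spot-check a couple of the rows. The picks k = 2, eps = 0.5, and c = 2 below are arbitrary (they are not the book's values), and row (b) is compared in log space only to avoid overflow:

```python
# Spot checks for rows (a) and (b); the Facts above are what prove the answers.
import math

k, eps, c = 2, 0.5, 2.0   # arbitrary choices of the constants

for n in (10**2, 10**4, 10**6, 10**8):
    # (a) lg(n)^k vs n^eps: the ratio should head toward 0.
    a = math.log2(n) ** k / n ** eps
    # (b) n^k vs c^n: compare logarithms so nothing overflows; this
    #     difference heading to -infinity means n^k = o(c^n).
    b = k * math.log2(n) - n * math.log2(c)
    print(f"n = {n:>9}   lg(n)^k / n^eps = {a:8.3f}   lg(n^k / c^n) = {b:.0f}")
```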

Algorithm complexity, solving recursive equation

I'm taking a Data Structures and Algorithms course and I'm stuck on this recurrence:
T(n) = log(n) * T(log(n)) + n
Obviously this can't be handled with the Master Theorem, so I was wondering if anybody has any ideas for solving it. I'm pretty sure it should be solved with a change of variable, like considering n to be 2^m, but I couldn't manage to find any good fix.
The answer is Theta(n). To prove something is Theta(n), you have to show it is both Omega(n) and O(n). Omega(n) is obvious here because T(n) >= n. To show that T(n) = O(n), first:
Pick a large finite value N such that log(n)^2 < n/100 for all n > N. This is possible because log(n)^2 = o(n).
Pick a constant C > 100 such that T(n) < Cn for all n <= N. This is possible because N is finite.
We will show inductively that T(n) < Cn for all n > N. Since log(n) < n, the bound T(log(n)) < C log(n) holds either by the induction hypothesis or by the choice of C (when log(n) <= N), so we have:
T(n) < n + log(n) * C * log(n)
     = n + C * log(n)^2
     < n + (C/100) * n
     = C * (1/100 + 1/C) * n
     < (C/50) * n
     < C * n
In fact, for this function it is even possible to show that T(n) = n + o(n) using a similar argument.
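A quick numerical experiment (not a proof) agrees with this: iterating the recurrence and watching T(n)/n, the ratio drifts down toward 1. The base case T(n) = 1 for n <= 2 below is an arbitrary assumption; the asymptotics do not depend on it.

```python
# Iterate T(n) = log(n) * T(log(n)) + n numerically and watch T(n)/n.
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: float) -> float:
    if n <= 2:
        return 1.0          # arbitrary base case
    L = math.log(n)
    return L * T(L) + n     # the recursion bottoms out after O(log* n) steps

for exp in (3, 6, 12, 24, 48):
    n = 10.0 ** exp
    print(f"n = 1e{exp:<2}  T(n)/n = {T(n) / n:.6f}")
```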
This is by no means an official proof but I think it goes like this.
The key is the + n term. Because of it, T(n) >= n, so T is bounded below by Omega(n). So let's assume that T(n) = O(n) and have a go at that.
Substitute into the original relation
T(n) = (log n)O(log n) + n
= O(log^2(n)) + O(n)
= O(n)
So it still holds.

When f(n) = n^.1 and g(n) = log(n)^10, does f(n) = Ω(g)?

I was told that "any exponential trumps any logarithm".
But when the exponent is between zero and one, doesn't the logarithm grow much faster? By that logic it would be f = O(g).
I'm having trouble choosing whether to follow my intuition or what I've been told, but what I've been told may not have been totally accurate.
Let's try out some math here. One important fact is that the logarithm function is monotonically increasing, which means that if
log f(x) ≤ log g(x)
then
f(x) ≤ g(x)
Now, let's see what that does here. We have two functions, x^0.1 and log^10 x. If we take their logs, we get
log (x^0.1) = 0.1 log x
and
log (log^10 x) = 10 log log x
Since log log x grows much more slowly than log x, intuitively we can see that the function x^0.1 is going to eventually overtake log^10 x.
Now, let's formalize this. We want to find some value of x such that
x^0.1 > log^10 x
Let's suppose that these are base-10 logarithms just to make the math easier. If we assume that x = 10^k for some k, we get
(10^k)^0.1 > log^10(10^k)
10^(0.1k) > (log 10^k)^10
10^(0.1k) > k^10
Taking base-10 logs of both sides, this holds whenever 0.1k > 10 log k, that is, whenever k > 100 log k. Now, take k = 1000. Then we have
10^100 > 1000^10 = 10^30
which is clearly true. Since k outgrows 100 log k from that point on, the inequality keeps holding, so for all x ≥ 10^1000 it is true that
x^0.1 > log^10 x
which means that it is not true that x^0.1 = O(log^10 x).
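If you want to see where the crossover actually happens, compare the two functions at x = 10^k in log space (evaluating them directly would overflow floating point long before the interesting region):

```python
# Compare x^0.1 with (log10 x)^10 at x = 10**k by comparing their base-10 logs.
import math

for k in (10, 100, 238, 1000, 10_000):
    lhs = 0.1 * k               # log10 of x^0.1
    rhs = 10 * math.log10(k)    # log10 of (log10 x)^10
    winner = ">" if lhs > rhs else "<="
    print(f"x = 10^{k:<6}  x^0.1 {winner} (log10 x)^10")
```

The switch shows up somewhere around x ≈ 10^238, comfortably below the generous x ≥ 10^1000 bound used above.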
Hope this helps!
Asymptotic analysis is really focused on the long-run relationship: as n takes on larger and larger values, how do the values of the functions compare? It also disregards constant factors, which is why you sometimes see strange-looking statements like f(x) = 10000000*x = O(x^2).
For large values of n, f(n) > g(n), which is all that really matters here.
Another way to verify that n^0.1 = big omega(log^10(n)) is to use the limit rule.
The limit rule is:
limit as n->infinity of f(n)/g(n)
if the limit is positive infinity, then f(n) != O(g(n)) and g(n) = O(f(n)), i.e. f(n) = big omega(g(n))
if the limit is 0, then f(n) = O(g(n)) and g(n) != O(f(n))
if the limit is a positive real number, then f(n) = O(g(n)) and g(n) = O(f(n)), i.e. f(n) = big theta(g(n))
For this problem:
let f(n) = n^0.1 and let g(n) = log^10(n)
That gives us the limit:
limit as n->infinity (n^0.1)/(log^10(n))
Using L'Hospital's rule on the limit 10 times we get:
limit as n->infinity ((0.1)^10 * ln^10(b) * n^0.1)/(10!) where b is the base of the log
Since the n term is only in the numerator, the limit approaches infinity.
By the limit rule,
log^10(n) = O(n^0.1) and n^0.1 != O(log^10(n)), i.e. n^0.1 = big omega(log^10(n)).
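If you have SymPy available, you can also let it evaluate the same limit symbolically as a cross-check of the L'Hospital computation above:

```python
# Symbolic evaluation of lim n^0.1 / log(n)^10 as n -> infinity.
import sympy as sp

n = sp.symbols('n', positive=True)
expr = n ** sp.Rational(1, 10) / sp.log(n) ** 10
print(sp.limit(expr, n, sp.oo))   # prints oo, i.e. the limit is +infinity
```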

time complexity

Hi,
I have a question:
Consider T(n) = m * n^2 (with n < m).
Is it correct to write T(n) = O(m)? My reasoning was: T(n) = m*n*n, so because n < m we have T(n) = O(m).
Thanks
No, the best thing you can do is to write T(n, m) = O(m^3). n < m is a very weak condition and basically just gives you n in O(m). For example, n could always be m-1, in which case T = m*(m-1)^2 = Θ(m^3).
Edit: My first answer was imprecise, as T was only written as a function of n. If m is a constant, the answer still holds, but then O(m^3) is just O(1).
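A tiny numeric illustration of why O(m^3) is the honest bound when all you know is n < m (the m values below are arbitrary):

```python
# With n as large as the constraint allows (n = m - 1), T = m * n^2 grows like m^3.
for m in (10, 100, 1000):
    n = m - 1
    T = m * n ** 2
    print(f"m = {m:<5} T = {T:<12} m^3 = {m**3:<12} T / m^3 = {T / m**3:.3f}")
```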
I believe in this case T(n) = O(n^2), treating m as a constant.
The formal definition of big-O:
f(x) = O(g(x)) if and only if there exists a positive real number M and a real number x0 such that |f(x)| ≤ M|g(x)| for all x > x0.
In your case, T(n) will always be less than or equal to m * n^2.

algorithm analysis - orders of growth question

I'm studying orders of growth: "big oh", "big omega", and "big theta". Since I can't type the symbols for these, I will denote them as follows:
ORDER = big oh
OMEGA = big omega
THETA = big theta
For example, I'll say n = ORDER(n^2) to mean that the function n is in the order of n^2 (n grows at most as fast as n^2).
Ok for the most part I understand these:
n = ORDER(n^2) //n grows at most as fast as n^2
n^2 = OMEGA(n) //n^2 grows at least as fast as n
8n^2 + 1000 = THETA(n^2) //same order of growth
Ok here comes the example that confuses me:
what is n(n+1) vs n^2
I realize that n(n+1) = n^2 + n; I would say it has the same order of growth as n^2; therefore I would say
n(n+1) = THETA(n^2)
but my question is, would it also be correct to say:
n(n+1) = ORDER(n^2)
please help because this is confusing to me. thanks.
Thank you guys!!
just to make sure I understand correctly, are these all true:
n^2+n = ORDER(2000n^2)
n^2+n = THETA(2000n^2)
n^2+n = OMEGA(2000n^2)
2000n^2 = ORDER(n^2+n)
2000n^2 = THETA(n^2+n)
2000n^2 = OMEGA(n^2+n)
So if f = THETA(g) then f=ORDER(g) and f=OMEGA(g) are also true.
Yes, n(n+1) = Order(n^2) is correct.
If f = Theta(g) then f = Order(g) and g = Order(f) are both true.
Moron is correct, and that is the easiest way to think about it.
But to understand it, return to the definition for f(n) = O(g(n)): there exists a positive M and n0 such that, for all n > n0, f(n) <= Mg(n).
Suppose M=2. Can you find a value, n0, such that for all n > n0, n^2+n <= M(n^2)?
(Plot both functions with pen and paper to understand how they grow in relation to one another.)
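If you would rather let the computer do the plotting, a few sample values already make the relationship with M = 2 visible (a quick check, not a proof):

```python
# Check n^2 + n <= 2 * n^2 for a few values of n; it holds for every n >= 1.
for n in (1, 2, 5, 10, 100, 1000):
    print(f"n = {n:<5} n^2+n = {n**2 + n:<10} 2n^2 = {2 * n**2:<10} "
          f"holds: {n**2 + n <= 2 * n**2}")
```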
You can use this simple table to get an easy and intuitive understanding of what these symbols mean:
If f(n) and g(n) are two functions, then the growth rates compare as follows:
if f(n) = Θ(g(n)) then growth rate of f(n) = growth rate of g(n)
if f(n) = O(g(n)) then growth rate of f(n) ≤ growth rate of g(n)
if f(n) = Ω(g(n)) then growth rate of f(n) ≥ growth rate of g(n)
if f(n) = o(g(n)) then growth rate of f(n) < growth rate of g(n)
if f(n) = ω(g(n)) then growth rate of f(n) > growth rate of g(n)
Also, the order is always written in terms of the highest-order term, i.e. if the order is O(n^2 + n + 1), then we simply write it as O(n^2), since n^2 is the highest-order term.
