I'm attempting to guess and prove the Big O for:
f(n) = n^3 - 7n^2 + nlg(n) + 10
I guess that the big O is n^3, as it is the term with the largest order of growth.
However, I'm having trouble proving it. My unsuccessful attempt follows:
f(n) <= cg(n)
f(n) <= n^3 - 7n^2 + nlg(n) + 10 <= cn^3
f(n) <= n^3 + (n^3)*lg(n) + 10n^3 <= cn^3
f(n) <= n^3(11 + lg(n)) <= cn^3
so 11 + lg(n) = c
But this can't be right because c must be constant. What am I doing wrong?
For any base b, we know that there always exists an n0 > 0 such that
log(n)/log(b) < n whenever n >= n0
Thus,
n^3 - 7n^2 + nlg(n) + 10 < n^3 - 7n^2 + n^2 + 10 when n >= n0.
You can finish the proof from there.
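If it helps, here is a quick numeric spot-check of that logarithm bound in Python, taking base 2 as a concrete case (this only samples values, it is not part of the proof):

```python
# Spot-check (not a proof): lg(n) < n, and hence n*lg(n) < n^2, on sample values.
from math import log2

for n in [2, 10, 100, 10**6]:
    assert log2(n) < n
    assert n * log2(n) < n**2
print("lg(n) < n and n*lg(n) < n^2 hold for all sampled n")
```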
For your question, the proof of O(n^3) should look something like this:
f(n) <= n^3 + 7n^2 + nlg(n) + 10 for (n > 0)
f(n) <= n^3 + 7n^3 + nlg(n) + 10 for (n > 1)
f(n) <= n^3 + 7n^3 + n*n^2 + 10 for (n > 2)
f(n) <= n^3 + 7n^3 + n^3 + 10 for (n > 2)
f(n) <= n^3 + 7n^3 + n^3 + n^3 for (n > 3)
f(n) <= 10n^3 for (n > 3)
Therefore f(n) is O(n^3), with c = 10 and n0 = 3 (i.e. for n > 3).
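A quick empirical sanity check of the final bound in Python (this only samples values, it is not a substitute for the derivation above):

```python
# Spot-check (not a proof): f(n) <= 10*n^3 for n > 3.
from math import log2

def f(n):
    return n**3 - 7*n**2 + n*log2(n) + 10

for n in range(4, 1001):
    assert f(n) <= 10 * n**3, n
print("f(n) <= 10*n^3 holds for all n in 4..1000")
```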
I want to prove some asymptotic notations and want to get the values of c1, c2, and n0.
Prove that
f(n) = n^4 + 3n^3 = θ(n^4)
f(n) = n^4 + 3n^3 ≠ θ(n^3)
How can I get the values of c1, c2, and n0?
First one
n^4 + 3n^3 = Theta(n^4)
Guess c1 = 1/2 and c2 = 2.
Find n0 that works.
(1/2)n^4 <= n^4 + 3n^3
0 <= (1/2)n^4 + 3n^3
0 <= (1/2)n + 3 ; after dividing by n^3 (valid since n > 0)
-6 <= n
Any choice for n0 works there.
n^4 + 3n^3 <= 2n^4
3 <= n ; subtract n^4, then divide by n^3
The smallest choice for n0 is 3.
Combined, guess n0 = 3.
Proof is by induction
Base: (1/2)3^4 <= 3^4+(3)3^3 <= (2)3^4
Hypothesis: true up to k
Step:
(1/2)(k+1)^4 <= (k+1)^4 + 3(k+1)^3 <= 2(k+1)^4
(1/2)(k+1) <= (k+1) + 3 <= 2(k+1) ; after dividing by (k+1)^3
(1/2)k + 1/2 <= (k+3)+1 <= 2k + 2
From the hypothesis (divided by k^3): (1/2)k <= k+3 <= 2k
also 1/2 <= 1 <= 2
So checks out.
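As a quick sanity check of the claimed constants in Python (sampling values only, not a proof):

```python
# Spot-check (not a proof): (1/2)*n^4 <= n^4 + 3n^3 <= 2*n^4 for n >= 3.
for n in range(3, 1001):
    f = n**4 + 3*n**3
    assert 0.5 * n**4 <= f <= 2 * n**4, n
print("c1 = 1/2, c2 = 2, n0 = 3 hold for all n in 3..1000")
```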
Second one
n^4 + 3n^3 is not Theta(n^3)
The proof of the first one is a proof of this too:
a function can't be Theta of two functions that aren't Theta of each other.
For a direct proof: the function is clearly Omega(n^3),
so it is enough to show it's not O(n^3).
Assume it were
Then there exist c, n0 such that
n^4 + 3n^3 <= cn^3 for n >= n0
n + 3 <= c for n >= n0 ; after dividing by n^3
But then for any constant c we can just choose some n' >= n0 with n' > c - 3,
which gives a contradiction.
The only assumption was that f(n) = O(n^3),
so that assumption must have been wrong.
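To make the contradiction concrete, here is a small Python illustration (again, a numeric illustration only, not the proof itself):

```python
# Illustration (not a proof): for any candidate constant c,
# n^4 + 3n^3 > c*n^3 as soon as n > c - 3, so no constant c can work.
for c in [10, 100, 10**6]:
    n = c  # any n > c - 3 works; n = c is certainly enough
    assert n**4 + 3*n**3 > c * n**3
    print(f"c = {c}: n^4 + 3n^3 already exceeds c*n^3 at n = {n}")
```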
I think log(n*n!) is O(n*log(n)), but I am not sure.
I tried log(n*n!) = log(n * (n*(n-1)*(n-2)* ... *1)) = n*log(n) + log(n) + log(n-1) + ... + log(1) <= n*log(n) + n*log(n) = 2n*log(n)
Can someone explain if this is correct?
Upper bound
log(n*n!) = log(n) + log(n!)
= log(n) + log(n) + log(n-1) + ... + log(2)
<= log(n) + (n-1)log(n)
= n*log(n)
Lower bound
log(n*n!) = log(n) + log(n) + log(n-1) + ... + log(2)
>= log(n) + (n-1)/2*log(n/2) ; 1st half >= log(n/2), 2nd >= 0
>= log(n/2) + (n-1)/2*log(n/2)
= (n+1)/2*log(n/2)
>= (n/2)log(n/2)
Note: Here I'm assuming log(2) > 0, which is true for base-2 logarithms (or any base greater than 1). This is a valid assumption because logarithms of different bases differ only by a constant factor, and constant factors don't affect big-O.
Intuitively, we see that this lower bound is still of order n*log(n), right? But why is this true?
To see the reason we need to find C > 0 and N0 such that
(n/2)*log(n/2) >= C*n*log(n) for all n >= N0,
which reduces to:
log(n/2)/log(n) >= 2*C
or
1 - log(2)/log(n) >= 2*C
The left-hand side grows with n, so choosing C < 1/2, e.g. C = 1/4, the value of N0 only needs to satisfy:
log(N0) >= 2*log(2) = log(4),
so it is enough to pick N0 = 4.
Note that we had one inequality for two constants C and N0. This is why we had to pick one (any small enough value was good) and then deduce the other.
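Here is a quick numerical sanity check of both bounds in Python (sampling a few n only, not a proof); base-2 logs are assumed throughout:

```python
# Spot-check (not a proof): (n/2)*log2(n/2) <= log2(n*n!) <= n*log2(n),
# and the lower bound is itself >= (1/4)*n*log2(n) for n >= 4.
from math import log2

def log2_n_times_nfact(n):
    # log2(n * n!) = log2(n) + sum of log2(k) for k = 2..n
    return log2(n) + sum(log2(k) for k in range(2, n + 1))

for n in [4, 16, 100, 1000]:
    v = log2_n_times_nfact(n)
    upper = n * log2(n)
    lower = (n / 2) * log2(n / 2)
    assert lower <= v <= upper, n
    assert lower >= 0.25 * n * log2(n), n   # C = 1/4, N0 = 4
    print(f"n={n}: {lower:.1f} <= log2(n*n!) = {v:.1f} <= {upper:.1f}")
```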
t(n) = 1000n + 283n^2 + 4n^3
Why is the largest valid bound for t(n) n^4? When adding the terms up, don't you select the biggest among them, which is n^3?
I'm still new to this, thanks for helping out.
t(n) = 1000n + 283n^2 + 4n^3
<= n*n + n * n^2 + 4n^3 ; for n >= 1000
= n^2 + n^3 + 4n^3
<= n^3 + n^3 + 4n^3 ; because n^2 <= n^3
= 6n^3
= O(n^3)
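A quick numeric spot-check of that bound in Python (sampling values only, not a proof):

```python
# Spot-check (not a proof): t(n) <= 6*n^3 for n >= 1000.
def t(n):
    return 1000*n + 283*n**2 + 4*n**3

for n in range(1000, 2001):
    assert t(n) <= 6 * n**3, n
print("t(n) <= 6*n^3 holds for all n in 1000..2000")
```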
Prove 5n^2+ 2n -1 = O(n^2).
This is what I have attempted so far:
5n^2 + 2n - 1 <= 5n^2 + 2n^2 - n^2 for n > 1
5n^2 + 2n - 1 <= 6n^2 for n > 1
5n^2 + 2n - 1 = O(n^2) [ c = 6, n0 = 1 ]
Is this the right way of proving Big O notation?
To prove that your expression is O(n^2), you need to show that it is bounded by M*n^2, for some constant M and some minimum n value. By inspection, we can show that your expression is bounded by 10*n^2, for n=10:
For n = 10:
5n^2 + 2n -1 <= 10*n^2
500 + 20 - 1 <= 1000
519 <= 1000
We can also show that the expression is bounded by 10*n^2 for any value n greater than 10:
For n > 10:
5n^2 + 2n -1 <= 10*n^2
5*(10+i)^2 + 2*(10+i) - 1 <= 10*(10+i)^2 ; substituting n = 10 + i, with i > 0
5*(i^2 + 20i + 100) + 2i + 19 <= 10*(i^2 + 20i + 100)
2i + 19 <= 5*(i^2 + 20i + 100)
2i + 19 <= 5i^2 + 100i + 500
5i^2 + 98i + 481 >= 0, which is true for `i > 0`
Here is a link to the Wikipedia article on Big-O notation:
https://en.m.wikipedia.org/wiki/Big_O_notation
Update:
Note that in practice, in order to label your expression O(n^2), we won't resort to such a laboriously written-out proof. Instead, we will just recognize that the n^2 term dominates the expression for large n, so the overall behavior is O(n^2). And your expression is also O(n^3) (and O(n^4), etc.), since big O only gives an upper bound.
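If you want to check the constants numerically, here is a short Python spot-check (sampling values only, not a proof):

```python
# Spot-check (not a proof): 5n^2 + 2n - 1 <= 10*n^2 for n >= 10.
for n in range(10, 10001):
    assert 5*n**2 + 2*n - 1 <= 10 * n**2, n
print("5n^2 + 2n - 1 <= 10*n^2 holds for all n in 10..10000")
```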
It looks fine. If you are doing it for an assignment or other formal work, you can also do it in a more formal way, for example selecting the value of the constant c by examining f(n)/g(n). Otherwise it looks correct.
We have f(n) = 5*n^2+2*n-1 and g(n) = n^2
In order to prove f(n) = O(g(n)), we just need to find two constants, namely c > 0 and n0, such that f(n) <= c*g(n) for all n >= n0.
Let's choose some value of c, say c = 5.5, and compare f(n) with c*g(n). Plotting the two suggests the bound, and we can also show it theoretically: 5.5*n^2 - f(n) = n^2/2 - 2*n + 1 = (n^2 - 4*n + 2)/2 = ((n-2)^2 - 2)/2 >= 0 for all n >= 4. This implies 5*n^2 + 2*n - 1 <= 5.5*n^2 for all n >= n0 = 4. Hence, f(n) = O(g(n)). (proved)
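And a quick numeric spot-check of this tighter choice of constants in Python (sampling only, not a proof):

```python
# Spot-check (not a proof): 5n^2 + 2n - 1 <= 5.5*n^2 for n >= 4.
for n in range(4, 10001):
    assert 5*n**2 + 2*n - 1 <= 5.5 * n**2, n
print("5n^2 + 2n - 1 <= 5.5*n^2 holds for all n in 4..10000")
```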
I am new to running-time analysis and couldn't solve this one.
Given
f(n) = log n^2 and g(n) = log n + 5
prove
f(n) = theta(g(n)).
Can anyone help me?
This is trivial. From basic logarithm properties
f(n) = log(n^2) = 2*log(n)
You need
C1*g(n) <= f(n) <= C2*g(n)
i.e.
C1*(log(n) + 5) <= 2*log(n) (1)
and
2*log(n) <= C2*(log(n) + 5) (2)
If you rewrite (1) as
C1*log(n) + C1*5 <= log(n) + log(n)
it looks like C1 == 1 would be a good choice, so we get:
log(n) + 5 <= log(n) + log(n)
5 <= log(n)
n >= 32 ; assuming base-2 logarithms (another base only changes the threshold)
So with C1 == 1, (1) holds for each n >= 32.
For (2), it's obvious that you can choose C2 == 2. At the end you get
for each n >= 32
g(n) <= f(n) <= 2*g(n)
QED.
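As a numeric sanity check of those constants in Python (sampling values only, not a proof); base-2 logarithms are assumed, matching the n >= 32 threshold above:

```python
# Spot-check (not a proof): g(n) <= f(n) <= 2*g(n) for n >= 32,
# where f(n) = log2(n^2) = 2*log2(n) and g(n) = log2(n) + 5.
from math import log2

for n in range(32, 10001):
    f = log2(n**2)
    g = log2(n) + 5
    assert g <= f <= 2 * g, n
print("g(n) <= f(n) <= 2*g(n) holds for all n in 32..10000")
```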