Adding Big O notation Example - big-o

t(n) = 1000n + 283n^2+4n^3
Why is n^4 given as the largest valid bound for t(n)? When adding the terms up, don't you just take the biggest of them, which would make it n^3?
I'm still new to this, thanks for helping out.

t(n) = 1000n + 283n^2 + 4n^3
<= n*n + n*n^2 + 4n^3 ; for n >= 1000
= n^2 + n^3 + 4n^3
<= n^3 + n^3 + 4n^3 ; because n^2 <= n^3 for n >= 1
= 6n^3
= O(n^3)
Since n^3 <= n^4 for n >= 1, t(n) is also O(n^4); that bound is valid, just not tight. Taking the dominant term gives the tight bound O(n^3).
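A quick numerical check can make the constant concrete. Below is a minimal Python sketch (my own addition, not part of the original answer) verifying that t(n) <= 6n^3 once n >= 1000:

```python
# Sketch: check that t(n) = 1000n + 283n^2 + 4n^3 <= 6n^3 for n >= 1000.
def t(n):
    return 1000 * n + 283 * n**2 + 4 * n**3

for n in (1000, 10_000, 100_000, 1_000_000):
    assert t(n) <= 6 * n**3, n
    print(n, t(n) / n**3)  # the ratio falls toward 4, the leading coefficient
```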

Related

Solving recurrence using the iteration method

I need help solving T(n) = T(n/4) + T(n/3) + 2n using the iteration method (recursion tree). I am thinking it would be either Θ(2n) or Θ(n)?
It's straightforward. Assuming T is non-decreasing (so T(n/4) <= T(n/3)), we have the following two inequalities:
T(n) >= 2T(n/4) + 2n
and
T(n) <= 2T(n/3) + 2n
Now, try to find upper bound and lower bound by expansion. Based on both cases, you will find that T(n) = Theta(n).
For example, for T'(n) = 2T'(n/3) + 2n we have the following expansion:
T'(n) = 2T'(n/3) + 2n = 2^2 T'(n/3^2) + (1 + 2/3) * 2n
By induction we can show that:
T'(n) = 2^(log_3(n)) T'(1) + (1 + 2/3 + 2^2/3^2 + ...) * 2n
< n + 6n = 7n
because 2^(log_3(n)) < 2^(log_2(n)) = n (treating the base case T'(1) as 1), and (1 + 2/3 + 2^2/3^2 + ...) is a geometric series with ratio 2/3, so its sum is at most 1/(1 - 2/3) = 3.
You can do the same analysis for the lower bound of T(n).
Therefore, as c_1 * n <= T(n) <= c_2 * n for suitable constants c_1, c_2 > 0, we can conclude that T(n) is in Theta(n).
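To see the Theta(n) behaviour concretely, here is a small Python sketch (my own addition; it uses integer division for n/4 and n/3 and assumes a base case T(n) = 1 for n <= 1, which the question leaves unspecified):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Assumed base case; the question does not specify one.
    if n <= 1:
        return 1
    return T(n // 4) + T(n // 3) + 2 * n

for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, T(n) / n)  # the ratio levels off near 2 / (1 - 1/4 - 1/3) = 4.8
```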

Complexity of T(n) = 2T(n/2) + n/2 (without master's theorem)?

I am looking at the best-case running time for merge sort and have found the following recurrence relation: T(n) = 2T(n/2) + n/2. I am aware that merge sort is Theta(n*log(n)) in all cases. In attempting to solve this recurrence, I used telescoping:
T(n) = 2*T(n/2) + n/2
T(n) = 2^2*T(n/4) + n/4 + n/2
T(n) = 2^k*T(1) + (n/2 + n/4 + ... + n/2^k)
2^k = n -> log_2(n) = k
T(n) = n + n(1/2 + 1/4 + ... + 1/n)
I am unsure how to solve the summation in the last part... I'm not even sure that it is correct. My thinking is that there would be log_2(n) total terms in the summation. How can I show that T(n) = 2T(n/2) + n/2 is Theta(n*log(n)) without using the master theorem?
As pointed out in the comment, your calculation seems to be wrong.
T(n) = 2*T(n/2) + n/2
T(n) = 2*(2*T(n/4) + n/4) + n/2 = 4*T(n/4) + 2*(n/4) + n/2 = 4*T(n/4) + 2*(n/2)
T(n) = 4*(2*T(n/8) + n/8) + 2*(n/2) = 8*T(n/8) + (n/2) + 2*(n/2) = 8*T(n/8) + 3*(n/2)
...
T(n) = 2^k * T(n / 2^k) + k*(n/2), 2^k = n ---> k = log(n)
T(n) = n * T(1) + log(n) * (n/2)
T(n) = n + n*log(n)/2 ; taking T(1) = 1
Therefore the time complexity of merge sort is Theta(n*log(n)).
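The corrected telescoping can be checked exactly for powers of two. A minimal Python sketch (my own addition, assuming the base case T(1) = 1):

```python
from math import log2

def T(n):
    # T(n) = 2T(n/2) + n/2, with assumed base case T(1) = 1.
    if n == 1:
        return 1
    return 2 * T(n // 2) + n // 2

for k in range(1, 11):
    n = 2 ** k
    closed_form = n + n * log2(n) / 2  # n*T(1) + log(n)*(n/2)
    assert T(n) == closed_form, (n, T(n), closed_form)
    print(n, T(n))
```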

Time complexity using recursion tree method

I've been trying to solve the given problem using the recursion tree method, but my answer does not come out in the same form.
T(n) = 8T(n/2) + n^2
The given answer to the problem is Theta(n^3).
Try to expand the equation:
T(n) = 8 T(n/2) + n^2
T(n) = 8(8T(n/4) + (n/2)^2) + n^2 = 8^2 T(n/4) + n^2 + 8(n/2)^2
T(n) = 8^3T(n/8) + n^2 + 8 (n/2)^2 + 8^2 (n/4)^2
Now you can generalize the above sum (taking the base case T(1) = 1):
T(n) = sum 8^i (n/2^i)^2 for i from 0 to log(n)
Simplify:
T(n) = sum 2^(3i) n^2/2^(2i) for i from 0 to log(n)
T(n) = sum 2^i n^2 for i from 0 to log(n)
T(n) = n^2 (sum 2^i for i from 0 to log(n))
T(n) = n^2 * (2^(log(n)+1) - 1) = n^2 * (2n - 1) = Theta(n^3)
In the above, you should be aware that sum 2^i for i from 0 to log(n) is 1 + 2 + 2^2 + ... + 2^(log(n)) = 2^(log(n) + 1) - 1 = 2n - 1.
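As a numerical cross-check, here is a short Python sketch (my own addition; base case T(1) = 1, n a power of two) comparing the recurrence against the closed form n^2*(2n - 1):

```python
def T(n):
    # T(n) = 8T(n/2) + n^2, with assumed base case T(1) = 1.
    if n == 1:
        return 1
    return 8 * T(n // 2) + n * n

for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * n * (2 * n - 1), n
    print(n, T(n) / n**3)  # the ratio tends to 2, confirming Theta(n^3)
```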

What is the asymptotic time-complexity (big theta) of T(n) = log(n*n!)?

I think it is O(n*log(n)) but I am not sure.
I tried log(n*n!) = log(n * (n*(n-1)*(n-2)*...*1)) = log(n) + log(n) + log(n-1) + ... + log(1) <= n*log(n) + n*log(n) = 2n*log(n)
Can someone explain if this is correct?
Upper bound
log(n*n!) = log(n) + log(n!)
= log(n) + log(n) + log(n-1) + ... + log(2)
<= log(n) + (n-1)log(n)
= n*log(n)
Lower bound
log(n*n!) = log(n) + log(n) + log(n-1) + ... + log(2)
>= log(n) + (n-1)/2*log(n/2) ; each of the first (n-1)/2 terms is >= log(n/2), the rest are >= 0
>= log(n/2) + (n-1)/2*log(n/2)
= (n+1)/2*log(n/2)
>= (n/2)log(n/2)
Note: here I'm assuming log(2) > 0, which is true for base-2 logarithms. The assumption is harmless because logarithms of different bases differ only by a constant factor, and constant factors are absorbed by big-O.
Intuitively, we can see that (n/2)log(n/2) is Theta(n*log(n)). But why is this true?
To see why, we need to find C > 0 and N0 such that
(n/2)log(n/2) >= C*n*log(n) for all n >= N0
which reduces to:
log(n/2)/log(n) >= 2*C
or
1 - log(2)/log(n) >= 2*C
So, choosing C < 1/2, e.g. C = 1/4, N0 only needs to satisfy:
log(N0) >= 2*log(2) = log(4)
so it is enough to pick N0 = 4 (the left-hand side only grows as n increases).
Note that we had one inequality for two constants, C and N0. This is why we had to pick one (any good-enough value) and then deduce the other.
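These bounds are easy to check numerically. A small Python sketch (my own addition, using base-2 logarithms as in the note above; the helper name is my own):

```python
from math import log2

def log2_n_times_factorial(n):
    # log2(n * n!) = log2(n) + log2(n) + log2(n-1) + ... + log2(2)
    return log2(n) + sum(log2(k) for k in range(2, n + 1))

for n in (4, 16, 256, 4096):
    v = log2_n_times_factorial(n)
    # Sandwich: (n/2)*log2(n/2) <= log2(n*n!) <= n*log2(n)
    assert (n / 2) * log2(n / 2) <= v <= n * log2(n), n
    print(n, v / (n * log2(n)))  # ratio stays bounded and tends to 1
```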

Simple Big O with lg(n) proof

I'm attempting to guess and prove the Big O for:
f(n) = n^3 - 7n^2 + nlg(n) + 10
I guess that the big O is n^3, as that is the term with the largest order of growth.
However, I'm having trouble proving it. My unsuccessful attempt follows:
f(n) <= cg(n)
f(n) <= n^3 - 7n^2 + nlg(n) + 10 <= cn^3
f(n) <= n^3 + (n^3)*lg(n) + 10n^3 <= cn^3
f(n) <= n^3(11 + lg(n)) <= cn^3
so 11 + lg(n) = c
But this can't be right because c must be constant. What am I doing wrong?
For any base b, there exists an n0 > 0 such that
log(n)/log(b) < n whenever n >= n0
Thus,
n^3 - 7n^2 + nlg(n) + 10 < n^3 - 7n^2 + n^2 + 10 when n >= n0.
You can solve from there.
For your question, the proof of O(n^3) should look something like this:
f(n) <= n^3 + 7n^2 + nlg(n) + 10 for (n > 0)
f(n) <= n^3 + 7n^3 + nlg(n) + 10 for (n > 1)
f(n) <= n^3 + 7n^3 + n*n^2 + 10 for (n > 2)
f(n) <= n^3 + 7n^3 + n^3 + 10 for (n > 2)
f(n) <= n^3 + 7n^3 + n^3 + n^3 for (n > 3)
f(n) <= 10n^3 for (n > 3)
Therefore f(n) is O(n^3), with constant k = 10 for all n > 3.
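The final inequality is easy to spot-check. A minimal Python sketch (my own addition; lg taken as log base 2):

```python
from math import log2

def f(n):
    # f(n) = n^3 - 7n^2 + n*lg(n) + 10
    return n**3 - 7 * n**2 + n * log2(n) + 10

# Check f(n) <= 10*n^3 for a range of n > 3.
for n in range(4, 10_000):
    assert f(n) <= 10 * n**3, n
print("f(n) <= 10*n^3 held for all tested n > 3")
```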
