Summation function: prove it is Big-O and Big-Theta

I don't get how to solve this summation problem: proving that it is Big-O of n^4 and Big-Omega of n^4.
The problem is this:
f(n) = Σ(i=1 to n) Σ(j=1 to i) Σ(k=1 to j) k
I wrote the code in C++ for what I think the summation is saying.
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= j; k++)
            // something O(1)
I know that I need to prove it is Big-O and Big-Omega of n^4.

Your code does not reflect the sum: the last part of the summation formula is k, while your code assumes a constant inner part ("something O(1)"). The code should be:
int sum = 0;
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= j; k++)
            for (int m = 1; m <= k; m++)
                sum++;
The innermost loop looks like overkill, because it could be replaced with
sum += k;
...but written this way, you can translate the problem into counting how many times sum++ is executed in the code.
Imagine you have an array of the values 1, 2, ..., n, and you should pick four numbers from it (picking the same number again is allowed), where the order of picking is not important. Then you can pick:
1, 1, 1, 1
2, 1, 1, 1
2, 2, 1, 1
2, 2, 2, 1
2, 2, 2, 2
3, 1, 1, 1
3, 2, 1, 1
...
...etc. You would not count {1, 2, 1, 1}, as that is one you already counted with {2, 1, 1, 1}: order is not distinguished. So we only count picks where the chosen numbers are in non-increasing order.
Now notice how the four nested loops in this (corrected) code do exactly that: they iterate over such combinations, avoiding counting a set twice (by keeping i >= j >= k >= m).
So given that the inner task has constant time complexity, this problem boils down to: how many such combinations exist?
This is a combination with repetition, denoted C((n, m)); in our case m = 4, so we count the number of 4-multisubsets, C((n, 4)) ("n multichoose 4"). This number is equal to
n(n+1)(n+2)(n+3)/4!
This is evidently O(n^4).
There is no way there can be fewer (or more) executions of the inner part of the nested loops, so this is also a lower bound: Ω(n^4).
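If you want to convince yourself numerically, here is a minimal sketch of my own (not part of the original answer) that counts the sum++ executions and compares them with the closed form n(n+1)(n+2)(n+3)/4! = n(n+1)(n+2)(n+3)/24:
#include <iostream>

// Compare the number of times sum++ runs with the closed form
// n(n+1)(n+2)(n+3)/24 for small values of n.
int main() {
    for (long long n = 1; n <= 10; n++) {
        long long sum = 0;
        for (long long i = 1; i <= n; i++)
            for (long long j = 1; j <= i; j++)
                for (long long k = 1; k <= j; k++)
                    for (long long m = 1; m <= k; m++)
                        sum++;
        std::cout << "n=" << n << "  count=" << sum
                  << "  closed form=" << n * (n + 1) * (n + 2) * (n + 3) / 24 << "\n";
    }
    return 0;
}
For every n the two numbers agree, which is exactly the Θ(n^4) behavior.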

Related

Time complexity of nested while with changing condition

I'm trying to work out the complexity of this loop:
for (int i = 0; i < n; i++) {
    c = i;
    while (c > 1) {
        // O(1) work
        c = c / 2;
    }
}
As the while condition changes on every pass of the for loop, I don't know how to calculate that strange series.
I mean, if the loop were
for (int i = 0; i < n; i++) {
    c = n;
    while (c > 1) {
        // O(1) work
        c = c / 2;
    }
}
I know the while loop has a complexity of O(log n) and it repeats n times, so the complexity would be O(n log n).
The problem I have with the previous loop is c = i. As c = i, the first time (c = 0) the while loop runs 0 times, when c = 1 it runs 0 times again, when c = 2 it runs 1 time, and the series continues: 0, 0, 1, 2, 2, 3, 3, ... (while-loop repetitions on each pass of the for loop).
The O(log n) part does not repeat n times; it repeats a number of times I can't come up with, so I don't know how to solve it.
This needs a bit of math. Given that, for positive a and b:
log(a) + log(b) = log(ab)
here you have
log(1) + log(2) + ... + log(n) = log(1 · 2 · ... · n) = log(n!)
There is a mathematical approximation for log(n!) (Stirling's approximation), namely
log(n!) ≈ n·log(n) - n + 1
which reveals O(log(n!)) = O(n·log(n)).
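As a quick empirical check (a sketch of my own, not from the original answer), you can count the actual while-loop iterations and compare them with n·log2(n):
#include <cmath>
#include <iostream>

// Count the total number of while-loop iterations of the original code
// and compare it with the n*log2(n) upper bound.
int main() {
    for (int n : {10, 100, 1000, 10000}) {
        long long total = 0;
        for (int i = 0; i < n; i++) {
            int c = i;
            while (c > 1) {   // runs about log2(i) times
                c = c / 2;
                total++;
            }
        }
        std::cout << "n=" << n << "  iterations=" << total
                  << "  n*log2(n)=" << n * std::log2(n) << "\n";
    }
    return 0;
}
The ratio between the two columns approaches a constant as n grows, which is the O(n log n) behavior.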

Algorithmic complexity of this nested loop

Is the code below O(n) or O(n·log n)?
for (int j = n, sum = 0; j > 0; j--)
    for (int k = j; k > 0; k--)
        sum++;
List of iterations for n = 5:
j = 5: k = 5, 4, 3, 2, 1
j = 4: k = 4, 3, 2, 1,
j = 3: k = 3, 2, 1
j = 2: k = 2, 1
j = 1: k = 1
We have 15 iterations in total.
But if it were O(n), there would have to be only about 5 iterations.
And if it were O(n·log n), the answer would be only around 11-12 iterations.
It's O(n^2). Why? The inner loop runs j times for each j = n, n-1, ..., 1, so the total number of iterations is n + (n-1) + ... + 1 = n(n+1)/2.
For n = 5 that is indeed 15. On the other hand, for n = 100 it will be 5050, which is far away from 100·log(100), which is around 460.
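Here is a minimal counting sketch (my own addition) that compares the measured sum++ executions with the triangular number n(n+1)/2:
#include <iostream>

// Compare the measured iteration count with n(n+1)/2.
int main() {
    for (long long n : {5, 100, 1000}) {
        long long count = 0;
        for (long long j = n; j > 0; j--)
            for (long long k = j; k > 0; k--)
                count++;
        std::cout << "n=" << n << "  count=" << count
                  << "  n(n+1)/2=" << n * (n + 1) / 2 << "\n";
    }
    return 0;
}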
According to Wikipedia:
Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. It is a member of a family of notations invented by Paul Bachmann, Edmund Landau, and others, collectively called Bachmann-Landau notation or asymptotic notation.

What is the time complexity for this Algorithm?

for i = 1 to n do
    j = 2
    while j < i do
        j = j * j
I think its time complexity is log(n!) = n·log(n), but the solution says it is n·log(log(n)) and I don't understand why.
In the explanation below, I assume that all arithmetic and comparison operations are O(1).
for i = 1 to n do
The body below is repeated n times, which gives the n· part of the solution.
j = 2
while j < i do
    j = j * j
The above calculates the first number of the following sequence that's >= i:
2 = 2^(2^0)
4 = 2^(2^1)
16 = 2^(2^2)
256 = 2^(2^3)
65536 = 2^(2^4)
...
So the only thing you need to do is find how many steps this sequence needs to reach i. After k iterations j = 2^(2^k), and log(log(2^(2^k))) = log(2^k) = k, so the loop stops after about log(log(i)) iterations.
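To make the doubly exponential growth visible, here is a tiny sketch (my own) that prints the sequence of j values together with the step index k:
#include <iostream>

// Print j = 2^(2^k) for the first few squaring steps.
int main() {
    unsigned long long j = 2;
    for (int k = 0; k <= 5; k++) {
        std::cout << "k=" << k << "  j=" << j << "\n";  // j equals 2^(2^k)
        if (k < 5) j = j * j;  // squaring doubles the exponent of 2
    }
    return 0;
}
Reading the table backwards, the step index k is log(log(j)), which is why the inner loop takes about log(log(i)) iterations.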
Let's break it down and work from the inside out.
Imagine the following:
j = 2
while j < n do
    j = j * 2
j goes 2, 4, 8, 16..., so if n doubles in size, it only takes roughly one more iteration for j to surpass it. That's basically the definition of logarithmic.
The inner loop in your case is a bit different:
j = 2
while j < n do
    j = j * j
Now j goes 2, 4, 16, 256, 65536... and surpasses n even more easily. In the first case, j was growing exponentially per iteration; now it's growing doubly exponentially. But we're interested in the inverse: j surpasses n in log(log(n)) steps.
Then the outer loop just means you do that n times.
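A short sketch (my own, not from the answer) that counts those steps for several values of n:
#include <cmath>
#include <iostream>

// Count how many squarings j needs to surpass n, and compare
// the count with log2(log2(n)).
int main() {
    for (unsigned long long n : {16ULL, 256ULL, 65536ULL, 4294967296ULL}) {
        unsigned long long j = 2;
        int steps = 0;
        while (j < n) {
            j = j * j;   // 2, 4, 16, 256, 65536, ...
            steps++;
        }
        std::cout << "n=" << n << "  steps=" << steps
                  << "  log2(log2(n))=" << std::log2(std::log2((double)n)) << "\n";
    }
    return 0;
}
For these powers of two the two columns match exactly; in general they differ by at most a small constant, which does not affect the asymptotic bound.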

Find the time complexity of the algorithm?

I think it is O(log(log n)) because the loop repeats log(log n) times...
j = 1;
i = 2;
while (i <= n) do {
    B[j] = A[i];
    j = j + 1;
    i = i * i;
}
You are right: it is O(lg(lg n)), where lg stands for the base-2 logarithm.
The reason is that the sequence of values of i obeys the rule i = prev(i) * prev(i), which turns out to be 2, 2^2, 2^4, 2^8, ... at steps 1, 2, 3, 4, .... In other words, the value of i after k iterations is 2^{2^k}.
Thus, the loop will stop as soon as 2^{2^k} > n, i.e. k > lg(lg(n)). (Just take lg twice on both sides of the inequality; it remains valid because lg is an increasing function.)
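Here is a small sketch (my own) that simulates the loop and reports the iteration count next to lg(lg(n)); the extra guard is only there to avoid 64-bit overflow before squaring:
#include <cmath>
#include <iostream>

// Simulate the loop, counting iterations, and compare with lg(lg(n)).
int main() {
    for (unsigned long long n : {256ULL, 65536ULL, 1ULL << 62}) {
        unsigned long long i = 2;
        int iterations = 0;
        while (i <= n) {
            iterations++;          // stands in for B[j] = A[i]; j = j + 1;
            if (i > n / i) break;  // i*i would exceed n (and may overflow)
            i = i * i;
        }
        std::cout << "n=" << n << "  iterations=" << iterations
                  << "  lg(lg(n))=" << std::log2(std::log2((double)n)) << "\n";
    }
    return 0;
}
The measured count exceeds lg(lg(n)) by at most a small constant, which does not change the O(lg(lg n)) bound.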

Complexity of a code fragment

for (int i = 0; i < N; i++)
    if (i < 2 || i > N - 3)
        for (int j = 1; j <= 10 * N; j++)
            a[i] = a[j - 1] / 2;
So the answer is N(1 + 10N·1) = N + 10N^2, right? Or is it N?
Please explain.
This looks O(N) to me.
The if statement is true for i = 0, 1, N-2, N-1, which is a constant number of cases.
Your conclusion is wrong. Although the outer for loop runs N times, the if condition is true in only 4 cases (i = 0, 1, N-2, N-1). So the total running time is rather N + 4·10·N, which is in O(N).
If you want an asymptotic upper bound... O(n^2). If you want to be pickier than that, we need to define computational weights for individual instructions.
Edit: Yeah, it's O(n). I read it wrong the first time.
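To see this concretely, here is a small counting sketch (my own) showing that the inner statement runs only 40·N times:
#include <iostream>

// The if guard holds for just four values of i, so the inner loop body
// runs 4 * 10N = 40N times in total (for N >= 4).
int main() {
    for (long long N : {10, 100, 1000}) {
        long long count = 0;
        for (long long i = 0; i < N; i++)
            if (i < 2 || i > N - 3)
                for (long long j = 1; j <= 10 * N; j++)
                    count++;   // stands in for a[i] = a[j - 1] / 2;
        std::cout << "N=" << N << "  inner executions=" << count
                  << "  40*N=" << 40 * N << "\n";
    }
    return 0;
}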
