What is the runtime of this pseudo-code?

i = 1;
while (i <= n)
    j = i;
    x = x + A[i];
    while (j > 0)
        y = x / (2*j);
        j = j / 2; // Assume here that this returns the floor of the quotient
    i = 2*i;
return y;
I'm not sure about my answer; I got O(n^2).

Let's remove the x and y variables, because they don't affect the complexity.
i = 1;
while (i <= n)
    j = i;
    while (j > 0)
        j = j / 2;
    i = 2*i;
In the inner loop you divide j by 2 every time, so it is not linear; it is O(log n). For example, when j is 16 you will do 5 (log2(16) + 1) steps: 8, 4, 2, 1, 0.
In the outer loop you multiply i by 2 every time, so it also runs O(log n) times.
So the overall complexity is O(log n * log n) = O(log² n).


Need help analyzing Big-O complexity for these for loops

I need to analyze the O-notation complexity of the following code. I think I have some idea, but need some help. Are my O-notations for each loop correct?
int sum = 0;
for (int i = 1; i <= n*2; i++) // N operations, ignore the multiplication by 2
    sum++;
O(N)

int sum = 0;
for (int i = 1; i <= n; i++)       // N operations
    for (int j = n; j > 0; j /= 2) // N operations, ignore division by constant?
        sum++;
O(N^2)

int sum = 0;
for (int i = 1; i <= n; i++)        // N ops
    for (int j = i; j <= n; j += 2) // N ops
        sum++;
O(N^2)

int sum = 0;
for (int i = 1; i <= n * n; i++) // N * N ops
    for (int j = 1; j < i; j++)  // N ops
        sum++;
O(N^3)
1. You iterate through 2*N elements, so it's O(2*N) = O(N).
2. You iterate through N elements, and in each iteration you go through log N elements, so it's O(N*log N). This time it's different from 1., because you divide the counter j, not the limiting variable N. If N is 20, you get:
first iteration: j = 20
second iteration: j = 10
third iteration: j = 5
fourth iteration: j = 2
fifth iteration: j = 1
So it's log2(20) = 4.something, which means 5 iterations. It depends on where the calculation is performed: in the limiting factor or in the iteration step.
3. This is O(N^2): the outer for loop has n iterations, and in each iteration you have about n/2 inner iterations, so it's O(N * N / 2) = O(N^2).
4. In the first loop you have N*N iterations. In each iteration the inner loop runs up to i, which gives:
i=1 => no iterations in the inner loop
i=2 => 1 iteration
i=3 => 2 iterations
i=4 => 3 iterations
...
i=n*n => n*n - 1 iterations
The sum of these is:
0 + 1 + 2 + 3 + ... + (m-1) = m*(m-1)/2, where m = n*n
So it's O(m^2) = O(N^4).

Trouble figuring out these tough Big-O examples

I'm trying to study for an upcoming quiz about Big-O notation. I've got a few examples here, but they're giving me trouble; they seem a little too advanced for the basic examples you find online to be much help. Here are the problems I'm stuck on.
1.
for (i = 1; i <= n/2; i = i * 2) {
    sum = sum + product;
    for (j = 1; j < i*i*i; j = j + 2) {
        sum++;
        product += sum;
    }
}
For this one, the i = i * 2 in the outer loop implies O(log(n)), and I don't think the i <= n/2 condition changes anything because we ignore constants, so the outer loop stays O(log(n)). The inner loop's condition j < i*i*i confuses me because it's in terms of i and not n. Would the Big-O of this inner loop then be O(i^3), and thus the Big-O for the entire problem O(i^3 * log(n))?
2.
for (i = n; i >= 1; i = i / 2) {
    sum = sum + product;
    for (j = 1; j < i*i; j = j + 2) {
        sum++;
        for (k = 1; k < i*i*j; k++)
            product *= i * j;
    }
}
For this one, the outermost loop implies O(log(n)). The middle loop implies (again, I'm unsure) O(i^2)? And the innermost loop implies O(i^2 * j)? I've never seen examples like this before, so I'm almost guessing at this point. Would the Big-O notation for this problem be O(i^4 * n * j)?
3.
for (i = 1; i < n*n; i = i*2) {
    for (j = 0; j < i*i; j++) {
        sum++;
        for (k = i*j; k > 0; k = k - 2)
            product *= i * j;
    }
}
The outermost loop for this one has an n^2 condition, but also a logarithmic increment, so I think that cancels out to just regular O(n). The middle loop is O(i^2), and the innermost loop is, I think, just O(n) and trying to trick you. So for this problem the Big-O notation would be O(n^2 * i^2)?
4.
int i = 1, j = 2;
while (i <= n) {
    sum += 1;
    i = i * j;
    j = j * 2;
}
For this one I did a few iterations to better see what was happening:
i = 1, j = 2
i = 2, j = 4
i = 8, j = 8
i = 64, j = 16
i = 1024, j = 32
So clearly, 'i' grows very quickly, and thus the condition is met very quickly. However I'm not sure just what kind of Big-O notation this is.
Any pointers or hints you can give are greatly appreciated, thanks guys.
You can't put i or j into the O-notation; everything must be converted to terms of n.
For the first one:
Let k = log2(i). Then the inner loop runs 2^(3k)/2 = 2^(3k-1) times for each iteration of the outer loop, and k goes from 1 to log2(n).
So the total number of iterations is the sum of 2^(3k-1) for k from 1 to log2(n), which is (4/7)(n^3 - 1) according to Wolfram Alpha, and that is O(n^3).
For the last one, after k iterations i = j1*j2*...*jk, where jm = 2^m, so
i = 2^1 * 2^2 * ... * 2^k = 2^(1+2+...+k).
The loop ends when the exponent reaches log2(n):
1 + 2 + 3 + ... + k = k(k+1)/2 = log2(n)
which means the number of iterations k is O(sqrt(log n)).
BTW, log(n^2) is not n; it is 2*log(n).
This question is better to ask at computer science than here.

Big-O analysis for a loop

I've got to analyze this loop, among others, and determine its running time using Big-O notation.
for ( int i = 0; i < n; i += 4 )
    for ( int j = 0; j < n; j++ )
        for ( int k = 1; k < j*j; k *= 2 )
Here's what I have so far:
for ( int i = 0; i < n; i += 4 ) = n
for ( int j = 0; j < n; j++ ) = n
for ( int k = 1; k < j*j; k *= 2 ) = log^2 n
Now the problem I'm coming to is the final running time of the loop. My best guess is O(n^2), however I am uncertain if this correct. Can anyone help?
Edit: sorry about the Oh -> O thing. My textbook uses "Big-Oh"
First note that the outer loop is independent from the remaining two: it simply adds an n/4 multiplier. We will account for that later.
Now let's consider the complexity of
for ( int j = 0; j < n; j++ )
for ( int k = 1; k < j*j; k *= 2 )
We have the following sum:
0 + log2(1*1) + log2(2*2) + ... + log2(n*n)
It is good to note that log2(n^2) = 2 * log2(n). Thus we re-factor the sum to:
2 * (0 + log2(1) + log2(2) + ... + log2(n))
It is not very easy to analyze this sum, but take a look at this post. Using Stirling's approximation one can show that it belongs to O(n*log(n)). Thus the overall complexity is O((n/4) * 2 * n * log(n)) = O(n^2 * log(n)).
In terms of j, the inner loop takes O(log2(j^2)) time, but since log2(j^2) = 2*log(j), it is actually O(log(j)).
Each iteration of the middle loop takes O(log(j)) time (to do the inner loop), so we need to sum:
sum { log(j) | j = 1, ..., n-1 } = log(1) + log(2) + ... + log(n-1) = log((n-1)!)
And since log((n-1)!) is in O((n-1)*log(n-1)) = O(n*log(n)), we can conclude the middle loop takes O(n*log(n)) operations.
Note that both the middle and inner loops are independent of i, so to get the total complexity we can just multiply n/4 (the number of repeats of the outer loop) by the complexity of the middle loop:
O(n/4 * n*log(n)) = O(n^2 * log(n))
So the total complexity of this code is O(n^2 * log(n)).
The time complexity of a loop is O(n) if the loop variable is incremented/decremented by a constant amount (c in the examples below):
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i -= c) {
    // some O(1) expressions
}
The time complexity of nested loops is equal to the number of times the innermost statement is executed. For example, the following loops have O(n²) time complexity:
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
for (int i = n; i > 0; i -= c) {
    for (int j = i+1; j <= n; j += c) {
        // some O(1) expressions
    }
}
The time complexity of a loop is O(log n) if the loop variable is divided/multiplied by a constant amount:
for (int i = 1; i <= n; i *= c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i /= c) {
    // some O(1) expressions
}
Now we have:
for ( int i = 0; i < n; i += 4 )            <----- runs n/4 = O(n) times
    for ( int j = 0; j < n; j++ )           <----- for every i, runs n times
        for ( int k = 1; k < j*j; k *= 2 )  <----- for every j, runs a logarithmic number of times
So the complexity is O(n² log m) where m is n², which simplifies to O(n² log n) because n² log m = n² log n² = n² * 2 log n ~ n² log n.

Euler's Sieve for Totient Calculation

Here is an algorithm for finding phi(i) = Euler's totient of i for all 1 <= i <= n.
int phi[n + 1];
for (int i = 1; i <= n; ++i) phi[i] = i;
//invariant: phi(x) for all 1 <= x < d is calculated during the
//start of the dth iteration.
for (int d = 1; d <= n; ++d) { //divisors
for (int j = 2 * d; j <= n; j += d) phi[j] -= phi[d];
}
How does the formula Sigma[d|n] phi(d) = n help us implement the above algorithm?
Starting from the formula you gave, if we take phi(n) out of the sigma, we get:
Sigma[d|n,d!=n]phi(d) + phi(n) = n
Therefore:
phi(n) = n - Sigma[d|n,d!=n]phi(d)
And this is what the algorithm does: For each n, it starts with a value of n and subtracts phi(d) for each divisor d of n except n itself. Note that this is done in a different order, by iterating over d in the outer loop and n in the inner one, because it's faster to find a number's multiples than to find its divisors.

How to calculate Time Complexity for a given algorithm

i, j, N, and sum are all of integer type. N is the input.
( Code1 )
i = N;
while (i > 1)
{
    i = i / 2;
    for (j = 0; j < 1000000; j++)
    {
        sum = sum + j;
    }
}
( Code2 )
sum = 0;
d = 1;
d = d << (N-1);
for (i = 0; i < d; i++)
{
    for (j = 0; j < 1000000; j++)
    {
        sum = sum + i;
    }
}
How to calculate step count and time complexity for a Code1, Code2?
To calculate the time complexity, try to understand what takes how much time, and in terms of which input you are counting.
If we say that an addition ("+") takes O(1) steps, then we can check how many times it is done in terms of N.
The first code divides i by 2 in each step, so it does log(N) outer steps. The time complexity is therefore
O(log(N) * 1000000) = O(log(N))
The second code goes from 0 to 2 to the power of N-1, so its complexity is:
O(2^(N-1) * 1000000) = O(2^(N-1))
But this is just theory, because d is capped at 2^32 or 2^64 (or some other fixed width), so it might not be O(2^(N-1)) in practice.
