Time Complexity Review

What would be the time complexity for the following code snippet?
int[][] A = new int[n][];
for (int i = 0; i < n; i++) {
    if (i % 2 == 0) // i is a multiple of 2
        A[i] = new int[n];
    else
        A[i] = new int[1];
}
for (int i = 0; i < A.length; i++)
    for (int j = 0; j < A[i].length; j++)
        sum = sum + A[i][j];
I understand the first for loop runs n times; then there will be n/2 rows of the matrix of length n, and n/2 rows of length 1. Would the total time be n^2?

Yes, the complexity will be O(n²).
How?
Half of the time (i.e. n/2 iterations), you will iterate through n elements: (n/2) * n = n²/2.
The other half of the time (again, n/2 iterations), you will have only one element to iterate over: (n/2) * 1 = n/2.
Therefore, the overall complexity = O(n²/2 + n/2) = O(n²)
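If you want to confirm that count empirically, here is a minimal throwaway sketch (the countVisits helper is mine, purely for illustration) that tallies how many elements the second pair of loops visits:

public class JaggedArrayCount {
    // Builds the same jagged array and counts element visits in the summing loops.
    static long countVisits(int n) {
        int[][] a = new int[n][];
        for (int i = 0; i < n; i++) {
            a[i] = (i % 2 == 0) ? new int[n] : new int[1];
        }
        long visits = 0;
        for (int i = 0; i < a.length; i++) {
            visits += a[i].length; // the inner loop body runs a[i].length times
        }
        return visits;
    }

    public static void main(String[] args) {
        for (int n : new int[] {10, 100, 1000}) {
            // Expect n²/2 + n/2, e.g. 55 for n = 10.
            System.out.println(n + " -> " + countVisits(n));
        }
    }
}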

First, let's settle on terminology: say that every single operation costs 1. Now let's take your code (just to be consistent, we will call it a method) and go through it line by line.
int[][] A = new int [n][];
This counts as 1.
for (int i=0; i<n; i++) {
Here we have a loop; in the worst case it runs n times.
if (i % 2 == 0) // 1
    A[i] = new int[n]; // 1
else
    A[i] = new int[1]; // 1
}
Each of the operations above counts as 1.
for (int i=0; i<A.length; i++)
This loop runs n times, once per row.
for (int j=0; j<A[i].length; j++)
The inner loop is again n in the worst case.
sum = sum + A[i][j];
Again, this counts as 1.
The nested loops multiply, so you are correct; note that in big-O notation this is exactly O(n²).

Related

How to find the time complexity of these two programs?

int sum = 0;
for (int i = 1; i < n; i++) {
    for (int j = 1; j < i * i; j++) {
        if (j % i == 0) {
            for (int k = 0; k < j; k++) {
                sum++;
            }
        }
    }
}
I don't understand how when j = i, 2i, 3i... the last for loop runs n times. I guess I just don't understand how we came to that conclusion based on the if statement.
Edit: I know how to compute the complexity for all the loops except for why the last loop executes i times based on the mod operator... I just don't see how it's i. Basically, why can't j % i go up to i * i rather than i?
Let's label the loops A, B and C:
int sum = 0;
// loop A
for (int i = 1; i < n; i++) {
    // loop B
    for (int j = 1; j < i * i; j++) {
        if (j % i == 0) {
            // loop C
            for (int k = 0; k < j; k++) {
                sum++;
            }
        }
    }
}
Loop A iterates O(n) times.
Loop B iterates O(i²) times per iteration of A. For each of these iterations:
j % i == 0 is evaluated, which takes O(1) time.
On 1/i of these iterations, loop C iterates j times, doing O(1) work per iteration. Since j is O(i²) on average, and this is only done for 1/i iterations of loop B, the average cost is O(i²/i) = O(i).
Multiplying all of this together, we get O(n × i² × (1 + i)) = O(n × i³). Since i is on average O(n), this is O(n⁴).
The tricky part of this is saying that the if condition is only true 1/i of the time:
Basically, why can't j % i go up to i * i rather than i?
In fact, j does go up to j < i * i, not just up to j < i. But the condition j % i == 0 is true if and only if j is a multiple of i.
The multiples of i within the range are i, 2*i, 3*i, ..., (i-1) * i. There are i - 1 of these, so loop C is reached i - 1 times despite loop B iterating i * i - 1 times.
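If that count feels unintuitive, a quick throwaway check (illustrative only) tallies how often the condition fires for small i:

public class MultipleCount {
    public static void main(String[] args) {
        // For each i, count how many j in [1, i*i) satisfy j % i == 0.
        for (int i = 2; i <= 6; i++) {
            int hits = 0;
            for (int j = 1; j < i * i; j++) {
                if (j % i == 0) hits++;
            }
            System.out.println("i = " + i + ": condition true " + hits + " times; i - 1 = " + (i - 1));
        }
    }
}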
The first loop consumes n iterations.
The second loop consumes n*n iterations. Imagine the case when i=n, then j=n*n.
The third loop contributes another factor of n on average: it is entered only on every i-th iteration of the second loop, but when entered it runs up to j = n*n steps, which averages out to about n steps per iteration of the second loop (since i is bounded by n in the worst case).
Thus, the code complexity is O(n×n×n×n).
I hope this helps you understand.
All the other answers are correct, I just want to amend the following.
I wanted to see whether the reduction in executions of the inner k-loop was sufficient to reduce the actual complexity below O(n⁴). So I wrote the following:
for (int n = 1; n < 363; ++n) {
    int sum = 0;
    for (int i = 1; i < n; ++i) {
        for (int j = 1; j < i * i; ++j) {
            if (j % i == 0) {
                for (int k = 0; k < j; ++k) {
                    sum++;
                }
            }
        }
    }
    long cubic = (long) Math.pow(n, 3);
    long hypCubic = (long) Math.pow(n, 4);
    double relative = (double) (sum / (double) hypCubic);
    System.out.println("n = " + n + ": iterations = " + sum +
            ", n³ = " + cubic + ", n⁴ = " + hypCubic + ", rel = " + relative);
}
After executing this, it becomes obvious that the complexity is in fact n⁴. The last lines of the output look like this:
n = 356: iterations = 1989000035, n³ = 45118016, n⁴ = 16062013696, rel = 0.12383254507467704
n = 357: iterations = 2011495675, n³ = 45499293, n⁴ = 16243247601, rel = 0.12383580700180696
n = 358: iterations = 2034181597, n³ = 45882712, n⁴ = 16426010896, rel = 0.12383905075183874
n = 359: iterations = 2057058871, n³ = 46268279, n⁴ = 16610312161, rel = 0.12384227647628734
n = 360: iterations = 2080128570, n³ = 46656000, n⁴ = 16796160000, rel = 0.12384548432498857
n = 361: iterations = 2103391770, n³ = 47045881, n⁴ = 16983563041, rel = 0.12384867444612208
n = 362: iterations = 2126849550, n³ = 47437928, n⁴ = 17172529936, rel = 0.1238518469862343
What this shows is that the ratio between the actual iteration count and n⁴ is asymptotic towards a value around 0.124... (actually 0.125). While it does not give us the exact value by itself, we can deduce the following:
Time complexity is n⁴/8 ~ f(n), where f is your function/method.
The Wikipedia page on Big O notation states, in the 'Family of Bachmann–Landau notations' table, that ~ means the limits of the two operand sides are equal. Or:
f is equal to g asymptotically
(I chose 363 as the excluded upper bound because n = 362 is the last value for which we get a sensible result. After that, the int sum overflows and the relative value becomes negative.)
User kaya3 figured out the following:
The asymptotic constant is exactly 1/8 = 0.125, by the way; here's the exact formula via Wolfram Alpha.
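The 1/8 also falls out of a rough hand calculation (a sketch keeping only leading terms; m here stands for the multiplier j/i, so the inner k-loop runs j = m·i times):

\sum_{i=1}^{n-1} \sum_{m=1}^{i-1} m \cdot i
    = \sum_{i=1}^{n-1} i \cdot \frac{(i-1)\,i}{2}
    \approx \sum_{i=1}^{n-1} \frac{i^3}{2}
    \approx \frac{1}{2} \cdot \frac{n^4}{4}
    = \frac{n^4}{8}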
Remove if and modulo without changing the complexity
Here's the original method:
public static long f(int n) {
    int sum = 0;
    for (int i = 1; i < n; i++) {
        for (int j = 1; j < i * i; j++) {
            if (j % i == 0) {
                for (int k = 0; k < j; k++) {
                    sum++;
                }
            }
        }
    }
    return sum;
}
If you're confused by the if and modulo, you can just refactor them away, with j jumping directly from i to 2*i to 3*i ... :
public static long f2(int n) {
    int sum = 0;
    for (int i = 1; i < n; i++) {
        for (int j = i; j < i * i; j = j + i) {
            for (int k = 0; k < j; k++) {
                sum++;
            }
        }
    }
    return sum;
}
To make it even easier to calculate the complexity, you can introduce an intermediary j2 variable, so that every loop variable is incremented by 1 at each iteration:
public static long f3(int n) {
    int sum = 0;
    for (int i = 1; i < n; i++) {
        for (int j2 = 1; j2 < i; j2++) {
            int j = j2 * i;
            for (int k = 0; k < j; k++) {
                sum++;
            }
        }
    }
    return sum;
}
You can use a debugger or old-school System.out.println calls to check that the (i, j, k) triplets are the same in each method.
Closed form expression
As mentioned by others, you can use the fact that the sum of the first n integers is equal to n * (n+1) / 2 (see triangular numbers). If you use this simplification for every loop, you get:
public static long f4(int n) {
    return (n - 1) * n * (n - 2) * (3 * n - 1) / 24;
}
It is obviously not the same complexity as the original code but it does return the same values.
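A quick way to verify that claim (a throwaway sketch reusing f and f4 from above):

for (int n = 0; n <= 20; n++) {
    if (f(n) != f4(n)) {
        System.out.println("mismatch at n = " + n);
    }
}
// Prints nothing: the closed form matches the loop count for these n.
// (f4 uses int arithmetic internally, so keep n small here.)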
If you google the first terms, you can notice that 0 0 0 2 11 35 85 175 322 546 870 1320 1925 2717 3731 appear in "Stirling numbers of the first kind: s(n+2, n).", with two 0s added at the beginning. It means that sum is the Stirling number of the first kind s(n, n-2).
Let's have a look at the first two loops.
The first one is simple, it's looping from 1 to n. The second one is more interesting. It goes from 1 to i squared. Let's see some examples:
e.g. n = 4
i = 1
j loops from 1 to 1^2
i = 2
j loops from 1 to 2^2
i = 3
j loops from 1 to 3^2
In total, the i and j loops combined perform 1² + 2² + 3² iterations.
There is a formula for the sum of first n squares, n * (n+1) * (2n + 1) / 6, which is roughly O(n^3).
You have one last k loop, which loops from 0 to j if and only if j % i == 0. Since j goes from 1 to i², j % i == 0 is true for about i values of j, and when it is true the k loop runs j = O(i²) steps; averaged over the whole j loop, that is one extra factor of O(i). Since i is bounded by n, you get one extra O(n).
So you have O(n³) from the i and j loops and another factor of O(n) from the k loop, for a grand total of O(n⁴).

What is the time complexity of Stable Selection Sort algorithm?

static void stableSelectionSort(int[] a, int n)
{
    for (int i = 0; i < n - 1; i++)
    {
        int min = i;
        for (int j = i + 1; j < n; j++)
            if (a[min] > a[j])
                min = j;
        // Move the minimum element to position i.
        int key = a[min];
        while (min > i)
        {
            a[min] = a[min - 1];
            min--;
        }
        a[i] = key;
    }
}
What will be the time complexity of the stable selection sort algorithm? Will it be the same as for selection sort?
So the outer loop runs n - 1 times. The first inner loop runs from i + 1 to n, i.e. n - 1 times on the first pass, then n - 2, then n - 3, ..., down to 1. For the second inner loop, in the worst case (the minimum sitting at the end of the unsorted part every time) it shifts all the way from min down to i. Adding the two inner loops, each pass does on the order of n work, so the worst-case time complexity is O(n^2), the same as ordinary selection sort.
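To watch those counts directly, here is a lightly instrumented variant of the method (the comparison and shift counters are additions of mine, purely for illustration):

static void stableSelectionSortCounted(int[] a, int n) {
    long comparisons = 0, shifts = 0;
    for (int i = 0; i < n - 1; i++) {
        int min = i;
        for (int j = i + 1; j < n; j++) {
            comparisons++; // always exactly n(n-1)/2 in total
            if (a[min] > a[j])
                min = j;
        }
        int key = a[min];
        while (min > i) { // at most n - 1 shifts per pass
            a[min] = a[min - 1];
            min--;
            shifts++;
        }
        a[i] = key;
    }
    System.out.println("comparisons = " + comparisons + ", shifts = " + shifts);
}

The comparison count alone is already Θ(n^2), regardless of how many shifts happen, so the stable variant has the same time complexity as ordinary selection sort.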

Time Complexity for for loop with if-else block

I want to find the time complexity of the code below. Here's my understanding:
The outer for loop will loop 2n times, and in the worst case, when i == n, we enter the if block, where the nested for loops have complexity O(n^2); counting the outer for loop, the time complexity of the code block would be O(n^3).
In the best case, when i != n, the else branch has complexity O(n), and with the outer for loop at O(n) that makes the best-case complexity O(n^2).
Am I correct or am I missing something here?
for (int i = 0; i < 2*n; i++)
{
    if (i == n)
    {
        for (int j = 0; j < i; j++)
            for (int k = 0; k < i; k++)
                O(1)
    }
    else
    {
        for (int j = 0; j < i; j++)
            O(1)
    }
}
No.
The question "what is T(n)?".
What you are saying is "if i=n, then O(n^3), else O(n^2)".
But there is no i in the question, only n.
Think of a similar question:
"During a week, Pete works 10 hours on Wednesday, and 1 hour on every other day, what is the total time Pete works in a week?".
You don't really answer "if the week is Wednesday, then X, otherwise Y".
Your answer has to include the work time on Wednesday and on every other day as well.
Back in your original question, Wednesday is the case when i=n, and all other days are the case when i!=n.
We have to sum them all up to find the answer.
This is a question of how many times the O(1) statement executes in total. The time complexity is a function of n, not i; that is, "how many times does O(1) execute for a given n?"
There is one run of an O(n^2) loop, when i == n.
There are (2n - 2) instances of the O(n) loop in all other cases.
Therefore, the time complexity is O((2n - 2) * n + 1 * n^2) = O(3n^2 - 2n) = O(n^2).
I've written a C program to print n^2, the actual count, 3n^2 - 2n, and n^3 for the first few values of n, to illustrate the difference:
#include <stdio.h>

int count(int n) {
    int ctr = 0;
    for (int i = 0; i < 2*n; i++) {
        if (i == n)
            for (int j = 0; j < i; j++)
                for (int k = 0; k < i; k++)
                    ctr++;
        else
            for (int j = 0; j < i; j++)
                ctr++;
    }
    return ctr;
}

int main() {
    for (int i = 1; i <= 20; i++) {
        printf(
            "%d\t%d\t%d\t%d\n",
            i*i, count(i), 3*i*i - 2*i, i*i*i
        );
    }
}
(You can paste it into Excel to plot the values.)
The first loop is repeated 2n times:
for (int i = 0; i < 2*n; i++)
{
    // some code
}
This part occurs just once, when i == n, and its time complexity is O(n^2):
if (i == n)
{
    for (int j = 0; j < i; j++)
        for (int k = 0; k < i; k++)
            O(1)
}
And this part depends on i:
else
{
    for (int j = 0; j < i; j++)
        O(1)
}
Consider i:
i = 0: the loop is repeated 0 times
i = 1: the loop is repeated 1 time
i = 2: the loop is repeated 2 times
...
i = 2n - 1: the loop is repeated 2n - 1 times (i runs up to 2n here).
So the else loop is repeated about (2n * (2n + 1)) / 2 times in total, minus the n iterations that are skipped when i == n (the else part does not run then), and the time complexity is O(n^2).
Now we sum all of these parts: O(n^2) (first part) + O(n^2) (second part). Because the first part occurs only once, the total is not O(n^3). The time complexity is O(n^2).
Based on @Gassa's answer, let's sum it all up:
O(n^2) + O((2n)^2) = O(n^2) + O(4n^2) = O(n^2)
The if branch contributes O(n^2) exactly once, the else branches together contribute O((2n)^2) = O(4n^2), and big-O notation allows us to throw out the constant factor 4.

Troubles with Big O Estimate

I'm asked to give big-O estimates for some pieces of code, but I'm having a little bit of trouble.
int sum = 0;
for (int i = 0; i < n; i = i + 2) {
    for (int j = 0; j < 10; j++)
        sum = sum + i + j;
}
I'm thinking that the worst case would be O(n/2), because the outer for loop goes from 0 up to n. However, I'm not sure whether the inner loop affects the big O.
int sum = 0;
for (int i = n; i > n/2; i--) {
    for (int j = 0; j < n; j++)
        sum = sum + i + j;
}
For this one, I'm thinking it would be O(n^2/2), because the inner loop goes from 0 to n and the outer loop runs from n down to n/2, which gives me n*(n/2).
int sum = 0;
for (int i = n; i > n - 2; i--) {
    for (int j = 0; j < n; j += 5)
        sum = sum + i + j;
}
I'm pretty lost on this one. My guess is O(n^2-2/5)
Your running times for the first two examples are correct.
For the first example, the inner loop of course always executes 10 times. So we can say the total running time is O(10n/2).
For the last example, the outer loop only executes twice, and the inner loop n/5 times, so the total running time is O(2n/5).
Note that, because of the way big-O complexity is defined, constant factors and asymptotically smaller terms are negligible, so your complexities can / should be simplified to:
O(n)
O(n²)
O(n)
If you were to take into account constant factors (using something other than big-O notation of course - perhaps ~-notation), you may have to be explicit about what constitutes a unit of work - perhaps sum = sum + i + j constitutes 2 units of work instead of just 1, since there are 2 addition operations.
You're NOT running nested loops:
for (int i = 0; i < n; i = i + 2);
^----
That semicolon TERMINATES the loop definition, so the i loop just counts from 0 to n in steps of 2 without doing anything. The j loop is completely independent of the i loop; both simply depend on n for their execution time.
For the above algorithms, worst case and best case are the same.
With big-O notation, lower-order terms and the coefficient of the highest-order term can be ignored, because big O describes an asymptotic upper bound.
int sum = 0;
for (int i = 0; i < n; i = i + 2) {
    for (int j = 0; j < 10; j++)
        sum = sum + i + j;
}
Total number of outer loop iterations = n/2. For each iteration of the outer loop, the number of inner loop iterations is 10, so the total number of inner loop iterations = 10 * n/2 = 5n. So clearly it is O(n).
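A throwaway count confirms the 5n figure (the value of n here is arbitrary):

int n = 1000, count = 0;
for (int i = 0; i < n; i = i + 2)
    for (int j = 0; j < 10; j++)
        count++;
System.out.println(count); // prints 5000 = 5n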
Now think about the remaining two programs and determine their time complexities on your own.

Determining the big-O runtimes of these different loops?

I have a series of questions on which I need feedback and answers. I will comment on what I think; this is not a homework assignment but rather preparation for my exam.
My main problem is determining the number of iterations of a loop in different cases. How would I go about figuring that out?
Evaluate Running time.
Q2.
for (int i = 0; i <= n; i++) // runs n times
    for (int j = 1; j <= i * i; j++) // same reasoning as 1. n^2
        if (j % i == 0)
            for (int k = 0; k < j; k++) // runs n^2 times? <- same reasoning as above.
                sum++;
Correct Answer: N × N² × N = O(N⁴)
For the following Questions below, I do not have the correct answers.
Q3. a)
int x = 0; //constant
for (int i = 4*n; i >= 1; i--) //runs n times, disregard the constant
    x = x + 2*i;
My Answer: O(n)
b) Assume for simplicity that n = 3^k
int z = 0;
int x = 0;
for (int i = 1; i <= n; i = i*3) { // runs n/3 times? how does it affect the final answer?
    z = z + 5;
    z++;
    x = 2*x;
}
My Answer: O(n)
c) Assume for simplicity that n = k^2,
int y = 0;
for (int j = 1; j*j <= n; j++) //runs O(logn)? j <= (n)^1/2
    y++; //constant
My Answer: O(logn)
d)
int b = 0; //constant
for (int i = n; i > 0; i--) //n times
    for (int j = 0; j < i; j++) // runs n + n-1 + ... + 1. O(n^2)
        b = b + 5;
My Answer: O(n^3)
(e)
int y = 1;
int j = 0;
for (j = 1; j <= 2*n; j = j + 2) //runs n times
    y = y + i;
int s = 0;
for (i = 1; i <= j; i++) // runs n times
    s++;
My Answer: O(n)
(f)
int b = 0;
for (int i = 0; i < n; i++) //runs n times
    for (int j = 0; j < i*n; j++) //runs n^2 x n times?
        b = b + 5;
My Answer: O(n^4)
(g) Assume for simplicity that n = 3^k, for some positive integer k.
int x = 0;
for (int i = 1; i <= n; i = i*3) { //runs 1, 3, 9, 27... for values of i.
    if (i % 2 != 0) //will always be true for values above
        for (int j = 0; j < i; j++) // runs n times
            x++;
}
My Answer: O (n x log base 3 n? )
(h) Assume for simplicity that n = k^2, for some positive integer k.
int t = 0;
for (int i = 1; i <= n; i++) //runs n times
    for (int j = 0; j*j < 4*n; j++) //runs O(logn)
        for (int k = 1; k*k <= 9*n; k++) //runs O(logn)
            t++;
My Answer: n x logn x log n = O(n log n^2)
(i) Assume for simplicity that n = 2^s, for some positive integer s.
int a = 0;
int k = n*n;
while (k > 1) //runs n^2
{
    for (int j = 0; j < n*n; j++) //runs n^2
    { a++; }
    k = k/2;
}
My Answer: O(n^4)
(j)
int i = 0, j = 0, y = 0, s = 0;
for (j = 0; j < n+1; j++) //runs n times
    y = y + j; //y equals n(n+1)/2 ~ O(n^2)
for (i = 1; i <= y; i++) // runs n^2 times
    s++;
My Answer: O(n^3)
(k)
int i = 1, z = 0;
while (z < n*(n+1)/2) { //arithmetic series, runs n times
    z += i; i++;
}
My Answer: O(n)
(m) Assume for simplicity that n = 2^s, for some positive integer s.
int a = 0;
int k = n*n*n;
while (k > 1) //runs O(logn) complexity
{
    for (int j = 0; j < k; j++) //runs n^3 times
    { a--; }
    k = k/2;
}
My Answer: O(n^3 log n)
Question 4
a) True - since its bounded below by n^2
b) False - f(n) not strictly smaller than g(n)
c) True
d) True -bounded by n^10
e) False - f(n) not strictly smaller than g(n)
f) True
g) True
h) false - since does not equal O(nlogn)
i) true
j) not sure
k) not sure
l) not sure - how should I even attempt these?
Let's go through these one at a time.
Part (a)
int x = 0; //constant
for (int i = 4*n; i >= 1; i--) //runs n times, disregard the constant
    x = x + 2*i;
My Answer: O(n)
Yep! That's correct. The loop runs O(n) times and does O(1) work per iteration.
Part (b)
int z = 0;
int x = 0;
for (int i = 1; i <= n; i = i*3) { // runs n/3 times? how does it affect the final answer?
    z = z + 5;
    z++;
    x = 2*x;
}
My Answer: O(n)
Not quite. Think about the values of i as the loop progresses. It will take on the series of values 1, 3, 9, 27, 81, 243, ..., 3^k. Since i is tripling on each iteration, it takes on successive powers of three.
The loop clearly only does O(1) work per iteration, so the main question here is how many total iterations there will be. The loop will stop when i > n. If we let k be some arbitrary iteration of the loop, the value of i on iteration k will be 3^k. The loop stops when 3^k > n, which happens when k > log₃ n. Therefore, the number of iterations is only O(log n), so the total complexity is O(log n).
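A quick empirical check of the iteration count (a throwaway sketch; the n values are chosen as powers of three):

for (int n : new int[] {9, 81, 6561}) {
    int iters = 0;
    for (int i = 1; i <= n; i = i * 3)
        iters++;
    // Expect log₃(n) + 1: prints 3, 5, 9.
    System.out.println("n = " + n + ": " + iters + " iterations");
}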
Part (c)
int y = 0;
for (int j = 1; j*j <= n; j++) //runs O(logn)? j <= (n)^1/2
    y++; //constant
My Answer: O(logn)
Not quite. Notice that j is still growing linearly, but the loop runs as long as j² ≤ n. This means that as soon as j exceeds √n, the loop will stop. Therefore, there will only be O(√n) iterations of the loop, and since each one does O(1) work, the total work done is O(√n).
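The same kind of throwaway check works here (sketch):

for (int n : new int[] {100, 10000, 1000000}) {
    int iters = 0;
    for (int j = 1; j * j <= n; j++)
        iters++;
    // Expect √n: prints 10, 100, 1000.
    System.out.println("n = " + n + ": " + iters + " iterations");
}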
Part (d)
int b = 0; //constant
for (int i = n; i > 0; i--) //n times
    for (int j = 0; j < i; j++) // runs n + n-1 + ... + 1. O(n^2)
        b = b + 5;
My Answer: O(n^3)
Not quite. You're actually double-counting a lot of the work. You're correct that the inner loop will run n + (n-1) + (n-2) + ... + 1 times, which is O(n²) times, but that sum already accounts for all iterations of the outer loop. You don't need to multiply that value by O(n) one more time. The most accurate answer would be O(n²).
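Written as a single sum (the outer loop's iterations are already folded in, which is why no extra factor of n appears):

\sum_{i=1}^{n} i = \frac{n(n+1)}{2} = \Theta(n^2)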
Part (e)
int y = 1;
int j = 0;
for (j = 1; j <= 2*n; j = j + 2) //runs n times
    y = y + i;
int s = 0;
for (i = 1; i <= j; i++) // runs n times
    s++;
My Answer: O(n)
Yep! Exactly right.
Part (f)
int b = 0;
for (int i = 0; i < n; i++) //runs n times
    for (int j = 0; j < i*n; j++) //runs n^2 x n times?
        b = b + 5;
My Answer: O(n^4)
Again, I believe you're overcounting. The inner loop will run 0 + n + 2n + 3n + 4n + ... + n(n-1) = n(0 + 1 + 2 + ... + (n-1)) times, so the total work done is O(n³). You shouldn't multiply by the number of times the outer loop runs, because you're already summing up across all of its iterations. The most accurate runtime would be O(n³).
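As a single sum (again with the outer loop already folded in):

\sum_{i=0}^{n-1} i \cdot n = n \cdot \frac{(n-1)\,n}{2} = \Theta(n^3)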
Part (g)
int x = 0;
for (int i = 1; i <= n; i = i*3) { //runs 1, 3, 9, 27... for values of i.
    if (i % 2 != 0) //will always be true for values above
        for (int j = 0; j < i; j++) // runs n times
            x++;
}
My Answer: O (n x log base 3 n? )
The outer loop here will indeed run O(log n) times, but let's see how much work the inner loop does. You're correct that the if statement always evaluates to true. This means that the inner loop will do 1 + 3 + 9 + 27 + ... + 3^(log₃ n) work. This summation works out to (3^(log₃ n + 1) - 1) / 2 = (3n - 1) / 2. Therefore, the total work done here is just O(n).
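Spelling out that geometric series (using 3^{\log_3 n} = n):

\sum_{t=0}^{\log_3 n} 3^t = \frac{3^{\log_3 n + 1} - 1}{2} = \frac{3n - 1}{2} = \Theta(n)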
Part (h)
int t = 0;
for (int i = 1; i <= n; i++) //runs n times
    for (int j = 0; j*j < 4*n; j++) //runs O(logn)
        for (int k = 1; k*k <= 9*n; k++) //runs O(logn)
            t++;
My Answer: n x logn x log n = O(n log n^2)
Not quite. Look at the second loop. By the same logic as in part (c), it actually runs O(√n) times. The third inner loop also runs O(√n) times, so the total work done will be O(n²).
Part (i)
int a = 0;
int k = n*n;
while (k > 1) //runs n^2
{
    for (int j = 0; j < n*n; j++) //runs n^2
    { a++; }
    k = k/2;
}
My Answer: O(n^4)
Not quite. The outer loop starts with k initialized to n², but notice that k is halved on each iteration. This means that the number of iterations of the outer loop will be log(n²) = 2 log n = O(log n), so the outer loop runs only O(log n) times. The inner loop does O(n²) work each time, so the total runtime is O(n² log n).
Part (j)
int i = 0, j = 0, y = 0, s = 0;
for (j = 0; j < n+1; j++) //runs n times
    y = y + j; //y equals n(n+1)/2 ~ O(n^2)
for (i = 1; i <= y; i++) // runs n^2 times
    s++;
My Answer: O(n^3)
Close, but not quite! The first loop runs in time O(n), and by the time it's done, the value of y is Θ(n²). This means that the second loop runs for time Θ(n²), so the total time spent is Θ(n²).
Part (k)
int i = 1, z = 0;
while (z < n*(n+1)/2) { //arithmetic series, runs n times
    z += i; i++;
}
My Answer: O(n)
That's correct!
Part (l)
That's odd, there is no part (l).
Part (m)
int a = 0;
int k = n*n*n;
while (k > 1) //runs O(logn) complexity
{
    for (int j = 0; j < k; j++) //runs n^3 times
    { a--; }
    k = k/2;
}
My Answer: O(n^3 log n)
Close, but not quite. You're right that the outer loop runs O(log n) times and that the inner loop does O(n³) work on the first iteration. However, look at the number of iterations of the inner loop more closely:
n³ + n³/2 + n³/4 + n³/8 + ...
= n³ · (1 + 1/2 + 1/4 + 1/8 + ...)
≤ 2n³
So the total work done here is actually only O(n³), even though there are log n iterations.
Question 4
Your answers are all correct except for these:
f) True
This is actually false. The expression on the left is
(3/2)n^(3/2) + 5n² + lg n
which is not Ω(n²√n) = Ω(n^(5/2)).
For (j), note that log n⁶ = 6 log n. Does that help?
For (k), ask whether both sides are O and Ω of one another. What do you find?
For (l), use the fact that a^(log_b c) = c^(log_b a). Does that help?
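One way to see that identity (a short derivation):

a^{\log_b c} = \left(b^{\log_b a}\right)^{\log_b c} = b^{(\log_b a)(\log_b c)} = \left(b^{\log_b c}\right)^{\log_b a} = c^{\log_b a}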
Hope this helps!
