Can anyone help me identify the steps for analyzing the following example and give more explanation of how its Big-O notation is determined? Is it O(2^n)?
int i, j = 1;
for (i = 1; i <= n; i++)
{
    j = j * 2;
}
for (i = 1; i <= j; i++)
{
    cout << j << "\n";
}
thank you in advance
The first loop has n iterations and assigns 2^n to j.
The second loop has j = 2^n iterations.
Each cout prints the value j = 2^n, which has O(log j) = O(n) digits, so each print costs O(n).
Hence the overall complexity is O(n * 2^n), which is strictly larger than O(2^n).
Related
I want to find the time complexity of the code below. Here's my understanding:
The outer for loop runs 2n times. In the worst case, when i == n, we enter the if block, where the nested for loops have complexity O(n^2); counting the outer for loop, the time complexity of the code block would be O(n^3).
In the best case, when i != n, the else branch has complexity O(n), and with the outer for loop at O(n), the best-case complexity would be O(n^2).
Am I correct or am I missing something here?
for (int i = 0; i < 2*n; i++)
{
    if (i == n)
    {
        for (int j = 0; j < i; j++)
            for (int k = 0; k < i; k++)
                O(1)
    }
    else
    {
        for (int j = 0; j < i; j++)
            O(1)
    }
}
No.
The question is "what is T(n)?".
What you are saying is "if i=n, then O(n^3), else O(n^2)".
But there is no i in the question, only n.
Think of a similar question:
"During a week, Pete works 10 hours on Wednesday, and 1 hour on every other day, what is the total time Pete works in a week?".
You don't really answer "if the day is Wednesday, then X, otherwise Y".
Your answer has to include the work time on Wednesday and on every other day as well.
Back in your original question, Wednesday is the case when i=n, and all other days are the case when i!=n.
We have to sum them all up to find the answer.
This is a question of how many times the O(1) statement is executed in total. The time complexity is a function of n, not i. That is, "how many times is O(1) executed for a given n?"
There is one run of an O(n^2) loop, when i == n.
The other iterations run the else loop, and their iteration counts sum to exactly (2n - 2) * n.
Therefore, the time complexity is O((2n - 2) * n + 1 * n^2) = O(3n^2 - 2n) = O(n^2).
I've written a C program to spit out the first few values of n^2, the actual count, 3n^2 - 2n, and n^3 to illustrate the difference:
#include <stdio.h>

/* Count how many times the innermost statement runs for a given n. */
int count(int n) {
    int ctr = 0;
    for (int i = 0; i < 2*n; i++) {
        if (i == n) {
            for (int j = 0; j < i; j++)
                for (int k = 0; k < i; k++)
                    ctr++;
        } else {
            for (int j = 0; j < i; j++)
                ctr++;
        }
    }
    return ctr;
}

int main() {
    /* Columns: n^2, actual count, 3n^2 - 2n, n^3 */
    for (int i = 1; i <= 20; i++) {
        printf("%d\t%d\t%d\t%d\n", i*i, count(i), 3*i*i - 2*i, i*i*i);
    }
}
(You can paste it into Excel to plot the values.)
The first loop is repeated 2n times:
for (int i = 0; i < 2*n; i++)
{
    // some code
}
This part occurs just once, when i == n, and its time complexity is O(n^2):
if (i == n)
{
    for (int j = 0; j < i; j++)
        for (int k = 0; k < i; k++)
            O(1)
}
And this part depends on i:
else
{
    for (int j = 0; j < i; j++)
        O(1)
}
Consider the values of i:
i = 0: the loop is repeated 0 times
i = 1: the loop is repeated 1 time
i = 2: the loop is repeated 2 times
.
.
i = 2n - 1: the loop is repeated 2n - 1 times (the outer loop runs i up to 2n - 1)
So the else loop body runs about (2n - 1) * 2n / 2 times in total; when i == n the else part does not run, so we subtract n, and the time complexity of this part is still O(n^2).
Now we sum all of these parts: O(n^2) (first part) + O(n^2) (second part). Because the first part occurs only once, the total is not O(n^3). The time complexity is O(n^2).
Based on @Gassa's answer, let's sum it all up:
O(n^2) + O((2n)^2) = O(n^2) + O(4n^2) = O(n^2)
The if branch contributes a single O(n^2) run, and the else branches together contribute at most (2n)^2 steps; Big-O notation allows us to throw out the constant factor 4.
I have been trying to calculate the time complexity of this algorithm, but I don't think I am right.
Since I thought it can't be n^2, I came up with a formula of O(n * (j * (1 + j)) * 50), but I am still not sure enough.
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= 100; k++)
            cout << "Hello";
Any help would be appreciated.
This is O(n²) indeed. The inner loop runs in constant time, so the code is the same as
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++) {
        cout << "Hello";
        cout << "Hello";
        cout << "Hello";
        cout << "Hello";
        /* repeats 96 more times */
    }
More specifically, you can calculate the number of steps as
T(n) = 1 + 2 + 3 + ... + n
     = n(1 + n)/2
     = (n² + n)/2
Constant factors don't matter, so this function grows in O(n² + n), which is simply O(n²).
Instead of unrolling the inner loop, you could multiply everything by 100, but this wouldn't change the complexity.
Can someone please explain how the worst-case running time is O(N) and not O(N^2) in the following exercise? There is a double for loop where, for every i, we compare j to i, do sum++, then increment and repeat the operation until we reach N.
What is the order of growth of the worst case running time of the following code fragment
as a function of N?
int sum = 0;
for (int i = 1; i <= N; i = i*2)
    for (int j = 0; j < i; j++)
        sum++;
Question Explanation
The answer is: N
The body of the inner loop is executed 1 + 2 + 4 + 8 + ... + N ~ 2N times.
I think you already stated the answer in your question: the inner loop is executed about 2N times, which is O(N). In asymptotic (big-O) notation, constant multiples are dropped, because for very, very large values the graph of 2N looks just like N, so the factor isn't considered significant. In this case, the complexity of the problem equals the number of times sum++ is called, because the algorithm is so simple. Does that make sense?
Complexity doesn't simply depend on the number of nested loops (it is not automatically O(N^c) for c nested loops). The time complexity of nested loops equals the number of times the innermost statement is executed. For example, the following sample loops both have O(N^2) time complexity:
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}

for (int i = n; i > 0; i -= c) {
    for (int j = i+1; j <= n; j += c) {
        // some O(1) expressions
    }
}
For each of the following algorithms, identify and state the running time using Big-O.
// i
for (int i = 0; Math.sqrt(i) < n; i++)
    cout << i << endl;

// ii
for (int i = 0; i < n; i++) {
    cout << i << endl;
    int k = n;
    while (k > 0)
    {
        k /= 2;
        cout << k << endl;
    } // while
}

// iii
int k = 1;
for (int i = 0; i < n; i++)
    k = k * 2;
for (int j = 0; j < k; j++)
    cout << j << endl;
I've calculated the loop counts for the first fragment using n = 1 and n = 2; the loop in (i) seems to run n^2 - 1 times. Please help and guide me to identify the Big-O notation.
(i)
for (int i = 0; Math.sqrt(i) < n; i++)
    cout << i << endl;
The loop runs while squareRoot(i) < N, i.e. while i < N^2. Thus the running time is O(N^2), i.e. quadratic.
(ii)
for (int i = 0; i < n; i++) {
    cout << i << endl;
    int k = n;
    while (k > 0)
    {
        k /= 2;
        cout << k << endl;
    } // while
}
The outer loop runs for N iterations. The inner loop runs for log N iterations, because it runs for k = N, N/2, N/(2^2), N/(2^3), ..., which is log N halvings. Thus the running time is O(N log N), i.e. linearithmic.
(iii)
int k = 1;
for (int i = 0; i < n; i++)
    k = k * 2;
for (int j = 0; j < k; j++)
    cout << j << endl;
The value of k after the execution of the first loop will be 2^n as k is multiplied by 2 n times. The second loop runs k times. Thus it will run for 2^n iterations. Running time is O(2^N), ie. exponential.
For the first question, the loop runs until Math.sqrt(i) >= n, which means it stops when i >= n*n; thus the first program runs in O(n^2).
For the second question, the outer loop executes n times, and the inner loop repeatedly halves k (which is initially equal to n). So the inner loop executes log n times, and the total time complexity is O(n log n).
For the third question, the first loop executes n times, doubling the value of k (initially 1) on each iteration. After the loop terminates, you will have k = 2^n, and the second loop executes k times, so the total complexity is O(2^n).
A couple of hints may allow you to solve most running-time complexity problems in CS tests/homework.
If something decreases by a factor of 2 on each iteration, that's a log(N). In your second case, the inner loop index is halved each time.
Geometric series:
a r^0 + a r^1 + ... + a r^(n-1) = a (r^n - 1) / (r - 1).
Write out the third problem:
2 + 4 + 8 + 16 + ... = 2^1 + 2^2 + 2^3 + 2^4 + ...
and use the closed-form formula.
Generally it helps to look for log2 factors and to write out a few terms to see if there is a repeating pattern.
Other common questions require you to know factorials and their approximation (Stirling's approximation).
Using Sigma notation, you can formally obtain the following results:
(i) Σ_{i=0}^{n²−1} 1 = n², which is O(n²)
(ii) Σ_{i=0}^{n−1} (1 + (⌊log₂ n⌋ + 1)) = n + n(⌊log₂ n⌋ + 1), which is O(n log n)
(iii) n + Σ_{j=0}^{2ⁿ−1} 1 = n + 2ⁿ, which is O(2ⁿ)
for (int i = 1; i < str1.length(); i++)
{
    for (int j = 1; j < str2.length(); j++)
    {
        if (str1[i] == str2[j]) c[i][j] = c[i-1][j-1] + 1;
        else c[i][j] = max(c[i-1][j], c[i][j-1]);
    }
}
This is the algorithm; I need to find its time complexity. Let's assume that:
m = str1.length()
n = str2.length()
Thus it's obviously O(m*n).
But I have a couple of questions:
considering that comparisons and assignments each cost 1 unit of time, is it true that I'll have
O( 3*m*n*(3+1+2+2) )
time complexity?
Is it possible to determine worst-case, best-case, or average-case complexity here, given that the array has to be fully assigned?