Identify and state the running time using Big-O

For each of the following algorithms, identify and state the running time using Big-O.
// i
for (int i = 0; Math.sqrt(i) < n; i++)
    cout << i << endl;

// ii
for (int i = 0; i < n; i++) {
    cout << i << endl;
    int k = n;
    while (k > 0)
    {
        k /= 2;
        cout << k << endl;
    } // while
}

// iii
int k = 1;
for (int i = 0; i < n; i++)
    k = k * 2;
for (int j = 0; j < k; j++)
    cout << j << endl;
I've counted the loop iterations for the first snippet using n=1 and n=2; it looks like the loop in (i) runs about n^2 times. Please help and guide me in identifying the Big-O notation.

(i)
for (int i = 0; Math.sqrt(i) < n; i++)
    cout << i << endl;
The loop runs while sqrt(i) < N, i.e. while i < N^2. Thus the running time is O(N^2), i.e. quadratic.
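To sanity-check this, here is a small C++ sketch (the helper name countLoop1 is invented for illustration) that counts the iterations of loop (i) and confirms the count equals n^2 for small n:

```cpp
#include <cmath>

// Counts how many times the body of loop (i) executes:
// the condition sqrt(i) < n holds exactly while i < n*n.
long long countLoop1(long long n) {
    long long iterations = 0;
    for (long long i = 0; std::sqrt((double)i) < (double)n; i++)
        ++iterations;
    return iterations;  // == n * n
}
```

For n = 3 the loop body runs for i = 0 .. 8, i.e. 9 = 3^2 times, matching the O(N^2) analysis.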
(ii)
for (int i = 0; i < n; i++) {
    cout << i << endl;
    int k = n;
    while (k > 0)
    {
        k /= 2;
        cout << k << endl;
    } // while
}
The outer loop runs for N iterations. The inner loop runs for about log N iterations, because k takes the values N, N/2, N/4, ..., reaching 0 after roughly log2 N halvings. Thus the running time is O(N log N), i.e. linearithmic.
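As a quick empirical check (helper name invented for illustration), counting the total inner-loop steps shows the n * (floor(log2 n) + 1) pattern:

```cpp
// Counts the total number of inner-loop steps in (ii):
// for each of the n outer iterations, k is halved until it hits 0,
// which takes floor(log2(n)) + 1 steps.
long long countLoop2(long long n) {
    long long iterations = 0;
    for (long long i = 0; i < n; i++) {
        long long k = n;
        while (k > 0) {
            k /= 2;
            ++iterations;
        }
    }
    return iterations;
}
```

For n = 8 the inner loop does 4 halvings (8 -> 4 -> 2 -> 1 -> 0) per outer iteration, 32 steps in total.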
(iii)
int k = 1;
for (int i = 0; i < n; i++)
    k = k * 2;
for (int j = 0; j < k; j++)
    cout << j << endl;
After the first loop finishes, k = 2^n, since k is multiplied by 2 a total of n times. The second loop runs k times, i.e. 2^n iterations. The running time is O(2^N), i.e. exponential.
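The same counting trick works here (helper name made up for illustration): after the first loop, k == 2^n, and the second loop runs exactly that many times:

```cpp
// Counts the iterations of the second loop in (iii);
// after the first loop, k == 2^n.
long long countLoop3(int n) {
    long long k = 1;
    for (int i = 0; i < n; i++)
        k = k * 2;              // k doubles n times -> 2^n
    long long iterations = 0;
    for (long long j = 0; j < k; j++)
        ++iterations;
    return iterations;          // == 2^n
}
```

For n = 5 this gives 32 = 2^5 iterations, confirming the exponential bound.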

For the first question, the loop keeps going until Math.sqrt(i) >= n, which means it stops when i >= n*n; thus the first program runs in O(n^2).
For the second question, the outer loop executes n times, and the inner loop repeatedly halves k (which is initially equal to n). So the inner loop executes log n times, and the total time complexity is O(n log n).
For the third question, the first loop executes n times, doubling k (initially 1) on each iteration. After the loop terminates, k = 2^n, and the second loop executes k times, so the total complexity is O(2^n).

A couple of hints may allow you to solve most running-time complexity problems in CS tests/homework.
If something decreases by a factor of 2 on each iteration, that's a log(N) factor. In your second case the inner loop variable is halved each time.
Geometric series (first n terms):
a r^0 + a r^1 + ... + a r^(n-1) = a (r^n - 1) / (r - 1).
Write out third problem:
2 + 4 + 8 + 16 ... = 2^1 + 2^2 + 2^3 + 2^4 + ...
and use the closed form formula.
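Plugging a = 2 and r = 2 into the closed form turns the doubling pattern above into a concrete bound:

```latex
\sum_{t=1}^{n} 2^t = 2 \cdot \frac{2^n - 1}{2 - 1} = 2^{n+1} - 2 = O(2^n)
```

So even if you summed all the doubling steps rather than just the final loop length, the result is still exponential.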
Generally it helps to look for factors of log2 and to write out a few terms to see if there is a repeating pattern.
Other common questions require you to know factorials and their approximation (Stirling's approximation).

Using sigma notation, you can formally obtain the following results:
(i) sum_{i=0}^{n^2 - 1} 1 = n^2, so O(n^2)
(ii) sum_{i=0}^{n-1} (floor(log2 n) + 1) = n (floor(log2 n) + 1), so O(n log n)
(iii) n + sum_{j=0}^{2^n - 1} 1 = n + 2^n, so O(2^n)


Time Complexity for for loop with if-else block

I want to find the time complexity of the code below. Here's my understanding:
The outer for loop runs 2n times. In the worst case, when i == n, we enter the if block, where the nested for loops have complexity O(n^2); counting the outer loop, the time complexity would be O(n^3).
In the best case, when i != n, the else branch is O(n) and the outer loop is O(n), which makes the best-case complexity O(n^2).
Am I correct, or am I missing something here?
for (int i = 0; i < 2*n; i++)
{
    if (i == n)
    {
        for (int j = 0; j < i; j++)
            for (int k = 0; k < i; k++)
                O(1)
    }
    else
    {
        for (int j = 0; j < i; j++)
            O(1)
    }
}
No.
The question is: what is T(n)?
What you are saying is "if i=n, then O(n^3), else O(n^2)".
But there is no i in the question, only n.
Think of a similar question:
"During a week, Pete works 10 hours on Wednesday, and 1 hour on every other day, what is the total time Pete works in a week?".
You don't really answer "if the week is Wednesday, then X, otherwise Y".
Your answer has to include the work time on Wednesday and on every other day as well.
Back in your original question, Wednesday is the case when i=n, and all other days are the case when i!=n.
We have to sum them all up to find the answer.
This is a question of how many times O(1) is executed in total. The time complexity is a function of n, not i; that is, "How many times is O(1) executed for a given n?"
There is one run of the O(n^2) double loop, when i == n.
The else loop runs for every other value of i, and its iterations sum to 0 + 1 + ... + (2n - 1) minus the skipped term n, i.e. 2n^2 - 2n.
Therefore the total is (2n^2 - 2n) + n^2 = 3n^2 - 2n, and the time complexity is O(n^2).
I've written a C program to print the first few values of n^2, the actual count, 3n^2 - 2n, and n^3 to illustrate the difference:
#include <stdio.h>

int count(int n) {
    int ctr = 0;
    for (int i = 0; i < 2*n; i++) {
        if (i == n)
            for (int j = 0; j < i; j++)
                for (int k = 0; k < i; k++)
                    ctr++;
        else
            for (int j = 0; j < i; j++)
                ctr++;
    }
    return ctr;
}

int main() {
    for (int i = 1; i <= 20; i++) {
        printf(
            "%d\t%d\t%d\t%d\n",
            i*i, count(i), 3*i*i - 2*i, i*i*i
        );
    }
}
(You can paste it into Excel to plot the values.)
The first loop is repeated 2n times:
for (int i = 0; i < 2*n; i++)
{
    // some code
}
This part occurs just once, when i == n, and its time complexity is O(n^2):
if (i == n)
{
    for (int j = 0; j < i; j++)
        for (int k = 0; k < i; k++)
            O(1)
}
And this part depends on i:
else
{
    for (int j = 0; j < i; j++)
        O(1)
}
Consider i as it grows:
i = 0: the loop is repeated 0 times
i = 1: the loop is repeated 1 time
i = 2: the loop is repeated 2 times
...
i = 2n - 1: the loop is repeated 2n - 1 times
So the loop would be repeated 0 + 1 + ... + (2n - 1) = (2n - 1) * 2n / 2 times in total, but when i == n the else part does not run, so we subtract n, leaving 2n^2 - 2n, and the time complexity is O(n^2).
Now we sum up all of these parts: O(n^2) (first part) + O(n^2) (second part). Because the first part occurs only once, the total is not O(n^3). The time complexity is O(n^2).
Based on Gassa's answer, let's sum it all up:
O(n^2) + O((2n)^2) = O(n^2) + O(4n^2) = O(n^2)
Big-O notation allows us to throw out the constant factor 4.

Big-O notation for an algorithm's efficiency

Can anyone help me identify the steps for the following example, and give more explanation of the steps that determine its Big-O notation, which is said to be O(2^n)?
int i, j = 1;
for (i = 1; i <= n; i++)
{
    j = j * 2;
}
for (i = 1; i <= j; i++)
{
    cout << j << "\n";
}
Thank you in advance.
The first loop has n iterations and leaves j = 2^n.
The second loop then has j = 2^n iterations.
Each cout prints the digits of j, which takes O(log j) = O(n) time.
Hence the overall complexity is O(n * 2^n), which is strictly larger than O(2^n).
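A small sketch (function name invented for illustration, using digit count as a stand-in for the printing cost) shows where the extra factor of n comes from: each of the 2^n iterations prints the roughly n-digit-long number j:

```cpp
#include <string>

// Total characters printed by the second loop: 2^n prints of the
// number j == 2^n, each costing O(log j) = O(n) digits.
long long printedDigits(int n) {
    long long j = 1;
    for (int i = 1; i <= n; i++)
        j = j * 2;                                    // j == 2^n
    long long chars = 0;
    for (long long i = 1; i <= j; i++)
        chars += (long long)std::to_string(j).size(); // digits per cout
    return chars;
}
```

For n = 10 this is 1024 prints of the 4-digit number 1024, i.e. 4096 characters, growing like n * 2^n rather than 2^n.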

What would be the time complexity of this code?

What would be the time complexity of this code?
i = n;
while (i > 1) {
    j = i;
    while (j < n) {
        k = 0;
        while (k < n) {
            k = k + 2;
        }
        j = j * 2;
    }
    i = i / 2;
}
I tried analyzing this code and got a complexity of n * log^2(n). I rewrote the code in for-loop form to make it easier to see:
for (i = n; i > 1; i = i / 2) // about log2(n) times
{
    for (j = i; j < n; j = j * 2) // about log2(n) times
    {
        for (k = 0; k < n; k = k + 2) // about n/2 times, i.e. O(n)
        {
            cout << "I = " << i << " J = " << j << " and K = " << k << endl;
        }
        cout << endl;
    }
}
Is that correct? If not, why? I am new to algorithms and trying to understand but don't know where else to ask, sorry.
Yes, your answer is correct. The variable i is halved at every step, making the outer loop O(log n); j doubles at every step, making that loop O(log n); and the innermost k loop advances by a constant step, making it O(n). Multiplying these together gives O(n log² n).
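You can confirm the combined count empirically with a sketch like this (helper name invented for illustration):

```cpp
// Counts the executions of the innermost statement for the
// three nested loops in the question.
long long countTriple(long long n) {
    long long iterations = 0;
    for (long long i = n; i > 1; i /= 2)          // O(log n) values of i
        for (long long j = i; j < n; j *= 2)      // O(log n) values of j
            for (long long k = 0; k < n; k += 2)  // O(n) steps of k
                ++iterations;
    return iterations;
}
```

For n = 16 the i-loop runs 4 times (16, 8, 4, 2), the j-loops total 0 + 1 + 2 + 3 = 6 passes, and each k-loop takes 8 steps, for 48 executions, consistent with n log² n growth up to constant factors.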

How do I calculate the time complexity of this algorithm?

I have been trying to calculate the time complexity of this algorithm, but I don't think I'm right.
Since it can't be n^2, I came up with a formula like O(n * j * (1 + j) * 50), but I am still not sure.
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= 100; k++)
            cout << "Hello";
Any help would be appreciated.
This is O(n²) indeed. The inner loop runs in constant time. It is the same as
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++) {
        cout << "Hello";
        cout << "Hello";
        cout << "Hello";
        cout << "Hello";
        /* repeats 96 more times */
    }
More specifically, you can count the number of (i, j) pairs as
T(n) = 1 + 2 + 3 + ... + n
     = n(1 + n)/2
     = (n² + n)/2
Each pair does 100 constant-time prints, and constants don't matter, so this function grows as O(n² + n), which is simply O(n²).
Instead of unrolling the inner loop, you could multiply everything by 100, but this wouldn't change the complexity.
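A direct count (helper name made up for illustration) matches the T(n) formula times the constant 100:

```cpp
// Counts the "Hello" prints: 100 per (i, j) pair,
// and there are n(n+1)/2 such pairs.
long long countHello(long long n) {
    long long count = 0;
    for (long long i = 1; i <= n; i++)
        for (long long j = 1; j <= i; j++)
            for (int k = 1; k <= 100; k++)
                ++count;
    return count;   // == 100 * n * (n + 1) / 2
}
```

For n = 10 there are 55 pairs, so 5500 prints; the factor 100 never changes the O(n²) growth.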

Figuring out Big(o)

for (int i = 0; i < n; i++)
{
    for (int j = 0; j < i*i; j++)
    {
        cout << j << endl;
        result++;
    }
}
Running this code for, say, n = 5, it runs a total of 30 times. I know the outer loop runs N times; the inner loop, though, is giving me some trouble, since its bound is i*i rather than n*n, and I haven't seen one like this before when trying to figure out T(n) and Big-O.
This algorithm is O(n^3): To realise this we have to figure out how often the inner code
cout << j << endl;
result++;
is executed. For this we sum up 1*1 + 2*2 + ... + n*n = n^3/3 + n^2/2 + n/6, which is a well-known result (see e.g. Sum of the Squares of the First n Natural Numbers; the loop actually sums i = 0 ... n-1, which differs only in lower-order terms). Thus O(T(n)) = O(1*1 + 2*2 + ... + n*n) = O(n^3), and the (time) complexity of the algorithm is therefore O(n^3).
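A quick counting sketch (helper name invented for illustration) reproduces the question's observation of 30 executions for n = 5:

```cpp
// Counts executions of the inner body: the sum of i*i
// for i = 0 .. n-1.
long long countInner(long long n) {
    long long result = 0;
    for (long long i = 0; i < n; i++)
        for (long long j = 0; j < i * i; j++)
            ++result;
    return result;
}
```

For n = 5 this is 0 + 1 + 4 + 9 + 16 = 30, and the sum of squares grows cubically.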
Edit: If you're wondering why this is sufficient (see also Example 4 in Time complexity with examples), it helps to rewrite your code as a single loop, so we can see that the loops only add a constant amount of instructions per run of the inner code:
int i = 0;
int j = 0;
while (i < n) {
    if (j < i * i) {   // we're still in the inner loop
        cout << j << endl;
        result++;
        j++;
    } else {           // start the next iteration of the outer loop
        j = 0;
        i++;
    }
}
Thus the two loops only 'add' a comparison and an if-statement per step, which simply makes the conditional jumps and their effects more explicit.
