What's the Complexity of this Algorithm? (Big O)

I have the following algorithm:
for (i = 0; i <= n-1; i++) {
    for (j = i; j <= n-1; j++) {
        sum = 0;
        for (k = i; k <= j; k++) {
            sum = sum + v[k];
            if (sum > max) max = sum;
        }
    }
}
The outer loop runs n times; for a given i, the middle loop runs n-i times; and for given i and j, the inner loop runs j-i+1 times.
I know O(n^3) is an upper bound. But what is the tight bound we can actually assume as the complexity of this algorithm? Is it O(n^3)?
Thank you.

I think O(n^3); the exact limits of the iterations don't change the asymptotic bound.

It's Θ(n^3) in every case, not just the worst: the loop bounds depend only on n, never on the data, so the innermost statement executes exactly n(n+1)(n+2)/6 times regardless of the contents of v.
Since you have to be prepared for worst-case data anyway (and that is the main reason for examining complexity), you should assume O(n^3).
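As a sanity check, here is a minimal counting sketch (a hypothetical harness, not from the original question) that tallies how many times the innermost statement runs; the count matches n(n+1)(n+2)/6 exactly:

#include <stdio.h>

int main(void) {
    for (int n = 10; n <= 1000; n *= 10) {
        long long count = 0;
        for (int i = 0; i <= n - 1; i++)
            for (int j = i; j <= n - 1; j++)
                for (int k = i; k <= j; k++)
                    count++;                    /* one unit of innermost work */
        printf("n=%4d  count=%lld  n(n+1)(n+2)/6=%lld\n",
               n, count, (long long)n * (n + 1) * (n + 2) / 6);
    }
    return 0;
}

Since the count depends only on n, best case and worst case coincide here.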

What is the worst-case asymptotic cost of this algorithm?

void Ordena(TipoItem *A, int n)
{
    TipoItem x;
    int i, j;
    for (i = 1; i < n; i++)
    {
        for (j = n; j > i; j--)
        {
            if (A[j].chave < A[j - 1].chave)
            {
                x = A[j];
                A[j] = A[j - 1];
                A[j - 1] = x;
            }
        }
    }
}
I believe the worst case is when the array is in descending order, am I right?
About the asymptotic cost in terms of number of movements, is it O(n²) or O(2n²) ?
I've just started learning about asymptotic cost (as you can tell).
As you said, the worst case here is the one where the array is in descending order, because the body of the if statement executes every single time. However, since we are talking about asymptotic notation, whether the if body executes is largely irrelevant: the cost of those three instructions is constant, i.e., O(1). The important thing is how many times you loop over the elements of the array, and if you do the math that happens exactly (n-1) + (n-2) + ... + 1 = n(n-1)/2 = n^2/2 - n/2 times. So the computational complexity is O(n^2), because the dominant term is n^2/2 and the asymptotic notation doesn't take into account the multiplicative factor 1/2, even if such factors can influence the actual execution time.
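Here is a minimal sketch (assuming a simplified TipoItem with only an int chave field, and the 1-based indexing the original code implies) that counts comparisons and swaps for a descending array; both come out to n(n-1)/2:

#include <stdio.h>

typedef struct { int chave; } TipoItem;     /* simplified stand-in */

int main(void) {
    enum { N = 100 };
    TipoItem A[N + 1];                      /* 1-based, as in the original */
    for (int i = 1; i <= N; i++)
        A[i].chave = N - i;                 /* strictly descending: worst case */

    long long comparisons = 0, swaps = 0;
    for (int i = 1; i < N; i++)
        for (int j = N; j > i; j--) {
            comparisons++;
            if (A[j].chave < A[j - 1].chave) {
                TipoItem x = A[j];
                A[j] = A[j - 1];
                A[j - 1] = x;
                swaps++;
            }
        }
    printf("n=%d  comparisons=%lld  swaps=%lld  n(n-1)/2=%lld\n",
           N, comparisons, swaps, (long long)N * (N - 1) / 2);
    return 0;
}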

Time complexity of an algorithm - embedded loop

I'm trying to work out the complexity of this algorithm, but I'm not sure.
Here is the code:
for (i = 0; i < N; i++) {
    for (j = 0; j < i/2; j++) {
        S.O.P("");
    }
}
the S.O.P stands for the instructions given to the CPU.
I don't know what S.O.P. stands for, but for the sake of the question, I'll suppose it takes a fixed amount of time - O(1).
Therefore, the only thing left is defining the time the for loops take to run.
for (i = 0; i < N; i++) {          // runs n times
    for (j = 0; j < i/2; j++) {    // runs i/2 times, at most n/2
        S.O.P("");                 // fixed time, O(1)
    }
}
Given that, we can calculate:
T(n) = (sum for i from 0 to n-1 of i/2) = (1/2) * (n*(n-1)/2) = n*(n-1)/4
T(n) = (n*(n-1)/4) * O(1) = O(n^2/4) = O(n^2)
So the final time complexity would be O(n^2) (O of n squared).
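If you want to convince yourself empirically, here is a minimal counting sketch (a hypothetical harness; count++ stands in for the S.O.P call). Because of the integer division in i/2 the exact count sits slightly below n*(n-1)/4, but the quadratic growth is clear:

#include <stdio.h>

int main(void) {
    for (int n = 10; n <= 10000; n *= 10) {
        long long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i / 2; j++)
                count++;                    /* one S.O.P execution */
        printf("n=%5d  count=%10lld  n*(n-1)/4=%10.1f\n",
               n, count, n * (n - 1) / 4.0);
    }
    return 0;
}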

Time Complexity of series of algorithms

I have some doubts about the time complexities of these algorithms:
Algorithm 1
int i = 0, j = 0, sum = 0;
while (i*i < n) {
    while (j*j < n) {
        sum = sum + i * j;
        j = j + 2;
    }
    i = i + 5;
}
//I don't really know what this one might be, but my poor logic suggests O(sqrt(n))
Algorithm 2
sum = 0;
for (i = 0; i <= n; i++) {
    j = n;
    while (j > i) {
        sum = sum + j - i;
        j = j - 1;
    }
}
//While I know this algorithm has some nested loops, I don't know if the actual answer is O(n^2)
Algorithm 3
for (int i = 0; i <= n; i++)
    for (int j = 0; j <= n; j++) {
        k = 0;
        while (k < n) {
            c = c + 1;
            k = k + 100;
        }
    }
//I believe this one has a time complexity of O(n^3) since it has 3 nested loops
Algorithm 4
Algorithm int f(int n) {
    if (n == 0 or n == 1)
        return n;
    else
        return f(n-2) + f(n-1);
}
//I know for a fact this algorithm is O(2^n) since it is the naive implementation of Fibonacci
I think I have an idea of what the answers might be but I would like to get a second opinion on them. Thanks in advance.
Okay, so this is what I think.
Algorithm 1
I think this is the most interesting algorithm. Let's build it up from a simple case.
Temporarily assume the two nested loops simply checked i < n and j < n rather than the squares of those values. The fact that i and j get incremented by 5 and 2 respectively is inconsequential (constant factors), so the complexity would have been O(n^2).
Now factor in that the checks are really i*i < n and j*j < n. This means each loop stops once its counter reaches sqrt(n), so a single such loop would be O(sqrt(n)), and two nested ones give O(sqrt(n) * sqrt(n)) = O(n).
One caveat: this assumes j is reset to 0 at the top of each outer iteration. As the code is actually written, j is initialized only once, so the inner loop does real work only during the first outer pass, and the whole thing degenerates to O(sqrt(n)), which matches your guess.
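Here is a minimal counting sketch (a hypothetical harness) that shows the difference between the two readings:

#include <stdio.h>

int main(void) {
    for (long long n = 100; n <= 100000000; n *= 100) {
        long long as_written = 0, with_reset = 0;

        /* as written: j is initialized once and never reset */
        long long j = 0;
        for (long long i = 0; i * i < n; i += 5)
            while (j * j < n) { as_written++; j += 2; }

        /* variant: j reset to 0 at the top of each outer iteration */
        for (long long i = 0; i * i < n; i += 5)
            for (long long j2 = 0; j2 * j2 < n; j2 += 2)
                with_reset++;

        printf("n=%-10lld  as_written=%-8lld  with_reset=%lld\n",
               n, as_written, with_reset);
    }
    return 0;
}

as_written grows like sqrt(n), while with_reset grows linearly in n.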
Algorithm 2
Your reasoning is right. It is O(n^2).
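To see it, count the inner iterations: for a given i, the while loop runs from j = n down to i+1, i.e., n - i times, so the total is n + (n-1) + ... + 1 + 0 = n(n+1)/2, which is Θ(n^2).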
Algorithm 3
Your reasoning is right. It is O(n^3).
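More precisely, the while loop runs ceil(n/100) times no matter what i and j are, so the total work is (n+1) * (n+1) * ceil(n/100); the step of 100 is just a constant factor, giving Θ(n^3).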
Algorithm 4
This is in fact the naive Fibonacci recursion, and your time complexity guess is correct. Note that the argument is an index, not a Fibonacci value: f(n) = f(n-1) + f(n-2) with f(0) = 0 and f(1) = 1 computes the n-th Fibonacci number, so sending in something like 10 is perfectly fine even though 10 isn't itself a Fibonacci number.
A nice way to think of this is as a binary tree. Every call to f whose argument is not 0 or 1 spawns 2 children. Each call decreases the initial value n by at least 1, so the tree has at most n levels. The maximum number of nodes in a binary tree of depth n is 2^n - 1, therefore O(2^n) is right.
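A minimal sketch (a hypothetical counting wrapper) makes the exponential blow-up visible by counting the calls:

#include <stdio.h>

static long long calls = 0;

long long f(int n) {
    calls++;                        /* count every invocation */
    if (n == 0 || n == 1)
        return n;
    return f(n - 2) + f(n - 1);
}

int main(void) {
    for (int n = 10; n <= 30; n += 10) {
        calls = 0;
        long long value = f(n);
        printf("f(%2d) = %-8lld  calls = %lld\n", n, value, calls);
    }
    return 0;
}

The call count actually grows like 1.618^n (the golden ratio), which is still exponential and is bounded above by 2^n.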

Time in Big O notation for if(N^2%N==0)

I wanted to understand how to calculate time complexity on if statements.
I got this problem:
sum = 0;
for (i = 0; i < n; i++)
{
    for (j = 1; j < i*i; j++)
    {
        if (j % i == 0)
        {
            for (k = 0; k < j; k++)
            {
                sum++;
            }
        }
    }
}
Now, I understand that the two outer loops give n^3 in total, and according to my professor the total time is n^4. I also see that the if statement checks whether j is a multiple of i, and the innermost for loop should, in my opinion, be n^2. But I don't know how to calculate it so that the if statement and the innermost loop together contribute O(n) per iteration of the j loop. Any help is welcome. Thanks in advance.
Let's compute sum by rewriting the program and observing some math facts:
Phase 1 (only multiples of i pass the if, and for such a j the inner k loop adds exactly j to sum, so step j by i and add j directly):
for (i = 0; i < n; i++) {
    for (j = 0; j < i*i; j += i) {
        sum += j;
    }
}
Phase 2 (use the arithmetic progression 0 + i + 2i + ... + i(i-1) = i * i(i-1)/2):
for (i = 0; i < n; i++) {
    sum += i*i * (i - 1) / 2;
}
Phase 3:
A sum of cubes is a 4th-degree polynomial; the dominant term of the sum of i^2(i-1)/2 over i < n is n^4/8.
So sum = Θ(n^4). The original program reaches that value by adding 1 at a time, so it needs Θ(n^4) additions.
There are 6 lines of actual code here. (You had them all on one line which was very hard to read. John Odom fixed that.)
1) Runs in O(1)
2) Runs in O(n), total is O(n)
3) Runs in O(n^2), total is O(n^3)
4) Runs in O(1), this filters the incoming from O(n^3) to O(n^2)
Edit: I missed the fact that this loop went to n^2 instead of n.
5) Runs in O(n) but this n is the square of the original n, thus it's really O(n^2), total is O(n^4)
6) Runs O(1), total is O(n^4)
Thus the total time is O(n^4)
Note, however:
for(j=1;j<i*i;j++)
{
if(j%i==0)
This would be much better rewritten as
for(j=i;j<i*i;j+=i)
(hopefully I got the syntax right, it's been ages since I've done C.)
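For what it's worth, here is a minimal sketch (a hypothetical test harness) checking that the stride-i rewrite does exactly the same work as the original if-filtered loop:

#include <stdio.h>

int main(void) {
    int n = 100;
    long long original = 0, rewritten = 0;

    for (int i = 0; i < n; i++)
        for (int j = 1; j < i*i; j++)
            if (j % i == 0)             /* never reached when i <= 1: j < i*i fails first */
                for (int k = 0; k < j; k++)
                    original++;         /* the original adds 1, j times */

    for (int i = 0; i < n; i++)
        for (int j = i; j < i*i; j += i)
            rewritten += j;             /* the rewrite adds j once */

    printf("original=%lld rewritten=%lld\n", original, rewritten);
    return 0;
}

Both counters agree, and both grow like n^4/8.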

Complexity of algorithm

The complexity given for the following problem is O(n). Shouldn't it be O(n^2)? The outer loop is O(n) and the inner loop is also O(n), therefore n*n = O(n^2).
The answer sheet for this question states that the answer is O(n). How is that possible?
public static void q1d(int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        count++;
        for (int j = 0; j < n; j++) {
            count++;
        }
    }
}
The complexity for the following problem is O(n^2). How can you obtain that? Can someone please elaborate?
public static void q1E(int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        count++;
        for (int j = 0; j < n/2; j++) {
            count++;
        }
    }
}
Thanks
The first example is O(n^2), so it seems they've made a mistake. To calculate (informally) the second example, we can do n * (n/2) = (n^2)/2 = O(n^2). If this doesn't make sense, you need to go and brush up on what it means for something to be O(n^k).
The complexity of both pieces of code is O(n*n).
FIRST
The outer loop runs n times and the inner loop runs n times on every pass
so
total = n + n + ... + n (n times) = n * n
which is O(n*n)
SECOND
The outer loop runs n times and the inner loop runs n/2 times on every pass
so
total = n/2 + n/2 + ... + n/2 (n times) = (n * n) / 2
which is also O(n*n)
First case is definitely O(n^2).
The second is O(n^2) as well, because you omit constant factors when calculating big O.
Your answer sheet is wrong, the first algorithm is clearly O(n^2).
When calculating a Big-Oh value we generally ignore multiplications and divisions by constants, because Big-Oh describes how the running time grows with n, not the exact operation count.
That being said, your second example is also O(n^2) because, although the inner loop runs "only" n/2 times, n is still the bounding factor. In practice the second algorithm performs about half as many operations, but Big-Oh is an asymptotic (maximal bounding) measurement, so the exact number of operations is ignored in favor of how the algorithm behaves as n approaches infinity.
Both are O(n^2). Your answer is wrong. Or you may have written the question incorrectly.
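If you want to see the constant factor directly, here is a minimal sketch (a hypothetical C port of the two methods) that reports the final count for each; both grow quadratically, with the second roughly half the first:

#include <stdio.h>

long long q1d(int n) {                  /* count = n + n*n */
    long long count = 0;
    for (int i = 0; i < n; i++) {
        count++;
        for (int j = 0; j < n; j++)
            count++;
    }
    return count;
}

long long q1E(int n) {                  /* count = n + n*(n/2) */
    long long count = 0;
    for (int i = 0; i < n; i++) {
        count++;
        for (int j = 0; j < n / 2; j++)
            count++;
    }
    return count;
}

int main(void) {
    for (int n = 100; n <= 10000; n *= 10)
        printf("n=%6d  q1d=%12lld  q1E=%12lld\n", n, q1d(n), q1E(n));
    return 0;
}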
