Time complexity for two nested loops for a specific case - algorithm

I have a function test(arr, x), where arr is an array of finite size and x is a finite positive number. It contains two nested for loops, as below:
function test(arr, x) {
    for (let i = 0; i < arr.length; i++) {   // executes arr.length times
        for (let j = 1; j <= x; j++) {       // executes x times
            // code goes here
        }
    }
}
Question:
What is the time complexity for this case? I'm confused between O(n) and O(n^2).
Any help will be appreciated. Thanks in advance!

Related

two or more for loops time complexity

If we assume the statements inside a for loop are O(1):
for (i = 0; i < N; i++) {
    sequence of statements
}
Then the above time complexity should be O(N).
Another example:
for (i = 0; i < N; i++) {
    for (j = i+1; j < N; j++) {
        sequence of statements
    }
}
The above time complexity should be O(N^2).
It seems like each for loop contributes a factor of N. However, I've seen some discussions saying it isn't that simple: sometimes there are two for loops, but that doesn't mean the time complexity must be O(N^2).
Can somebody give me an example of two or more nested for loops with only O(N) time complexity (an explanation would be much appreciated)?
This occurs when either the outer or the inner loop is bounded by a fixed (constant) number of iterations. It comes up in many problems, such as bit manipulation. An example:
for (int i = 0; i < n; i++) {
    int x = ...;                     // some constant-time operations to determine int x
    for (int j = 0; x != 0; j++) {   // runs once per bit of x
        x >>>= 1;                    // unsigned right shift drops the lowest bit
    }
}
The inner loop is limited by the number of bits in an int. In Java that is 32, so the inner loop runs at most 32 iterations and is constant time. The overall complexity is therefore linear time, O(n).
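To make the bound concrete, here is a small self-contained check (the class name and test values are just for illustration) that counts how many times the inner loop runs for a few values of x:

public class BitLoopBound {
    public static void main(String[] args) {
        for (int x : new int[] { 0, 1, 1024, Integer.MAX_VALUE, -1 }) {
            int iterations = 0;
            int v = x;
            while (v != 0) {     // same termination condition as the inner loop
                v >>>= 1;
                iterations++;
            }
            System.out.println(x + " -> " + iterations + " iterations");
        }
    }
}

For -1 (all 32 bits set) the loop runs exactly 32 times, which is the worst case; the count never exceeds 32.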
A loop takes constant time if its bound is a constant. So for these nested loops, where k is a constant:
// k is a constant
for (int i = 0; i < N; i++) {
    for (int j = 0; j < k; j++) {
    }
}
// or this
for (int i = 0; i < k; i++) {
    for (int j = 0; j < N; j++) {
    }
}
Both are O(kN), which is the same as O(N) because k is a constant.
The code in the post has a time complexity of:
(N-1) + (N-2) + ... + 1 + 0 = (N^2 - N)/2
Given that this is a second-order polynomial, you correctly assessed the asymptotic time complexity of your example as O(N^2).
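As a quick sanity check, this sketch (with an arbitrarily chosen N; the names are illustrative) counts how often the inner "sequence of statements" of the second example would run and compares it with (N^2 - N)/2:

public class TriangularCount {
    public static void main(String[] args) {
        int n = 10;
        long count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i + 1; j < n; j++) {
                count++;                       // stands in for "sequence of statements"
            }
        }
        System.out.println(count);             // 45
        System.out.println((n * n - n) / 2);   // 45 as well
    }
}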

Run time and theta notation

For the following code:
for(i=0;i<5;i++)
for(j=2;j<n;j++)
{
c[i][j]=0;
for(k=0;k<n;k++)
c[i][j]=a[i][k]*b[k][j];
}
I would say the run time is theta(n^3): in the k loop I see two factors of n, and the other loop contributes another n, making it n^3. Would this be right, or what did I miss?
Thank you.
Here is your code, formatted:
for (i = 0; i < 5; i++) {
    for (j = 2; j < n; j++) {
        c[i][j] = 0;
        for (k = 0; k < n; k++)
            c[i][j] = a[i][k] * b[k][j];
    }
}
The outer loop in i only iterates 5 times, and so can be treated as a constant factor as far as complexity is concerned. The two inner loops in j and k have bounds independent of each other, and each is O(n). We can therefore just multiply the complexities to get O(n^2) for the overall running time as a function of n.
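To convince yourself, you can count the innermost assignments directly; the sketch below (n chosen arbitrarily for illustration) shows the count equals 5·(n-2)·n, which grows as n^2:

public class CountInnerOps {
    public static void main(String[] args) {
        int n = 100;
        long innerBody = 0;
        for (int i = 0; i < 5; i++) {
            for (int j = 2; j < n; j++) {
                for (int k = 0; k < n; k++) {
                    innerBody++;                  // one multiply-and-assign in the original
                }
            }
        }
        System.out.println(innerBody);            // 49000
        System.out.println(5L * (n - 2) * n);     // 49000: the 5 is constant, so Theta(n^2)
    }
}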

big O for an algorithm

I have a question about this algorithm; I am having a hard time understanding its complexity.
while (input) {
    ...
    array to check
    ...
    for (int i = 0; i < arraysize; i++) {
        create first subarray;
        create second subarray;
    }
    for (int i = 0; i < firstsubarraysize; i++) {
        add input to array;
        for (int j = 0; j < secondsubarraysize; j++) {
            calculate array[j] * input;
        }
    }
} // end while
So, if I call the number of inputs M and the array size N, would the big O be O(M·(N log N)) or O(N^2)?
The subarrays are not always N/2.
I apologize if I'm not being very clear.
Generally, each level of nested looping over the input contributes one factor of N to the big-O time complexity.
Here, assume the outermost while loop runs M times. Inside it, treating the subarray sizes as O(N), the second for loop with its nested inner loop costs O(N^2) per while iteration, so the total is O(M·N^2).
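A minimal sketch of the structure (hypothetical names, and assuming both subarrays are on the order of the array size N) shows where the M·N^2 comes from:

// Hypothetical skeleton: only the loop structure matters for the estimate.
static long countWork(int m, int n) {
    long work = 0;
    for (int input = 0; input < m; input++) {   // while (input): M iterations
        for (int i = 0; i < n; i++) {           // build the two subarrays: O(N)
            work++;
        }
        for (int i = 0; i < n; i++) {           // first subarray: O(N)
            for (int j = 0; j < n; j++) {       // second subarray: O(N)
                work++;                         // array[j] * input
            }
        }
    }
    return work;                                // M * (N + N^2), i.e. O(M·N^2)
}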

Big-O complexity of algorithm

What's the total Big-O complexity of an algorithm like the one below?
void a(int n, int p) {
    for (int i = 0; i < n; i++) {
        print(i);
        for (int j = 0; j < p; j++) {
            print(i + j);
        }
    }
}
A formal and efficient way is to count the constant-time work done by each iteration directly:
T(n, p) = n · (c + p·c) = c·n + c·n·p = O(n·p)
where c is a constant.
Note that this is expressed in terms of the values n and p themselves, not the (logarithmic) length of their representations.
If I have n = 5 and p = 10, how many prints would this give?
The inner print(i+j) runs 50 times (plus the 5 prints of i from the outer loop), so n·p is the dominant term.
Think about it: for each of the n outer iterations you do p inner iterations, so O(n·p).
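For a concrete check, this small sketch (names chosen for illustration) counts the prints instead of producing them:

public class PrintCount {
    public static void main(String[] args) {
        int n = 5, p = 10;
        int outerPrints = 0, innerPrints = 0;
        for (int i = 0; i < n; i++) {
            outerPrints++;                  // print(i)
            for (int j = 0; j < p; j++) {
                innerPrints++;              // print(i + j)
            }
        }
        System.out.println(outerPrints);    // 5
        System.out.println(innerPrints);    // 50 -> the n*p term dominates, O(n*p)
    }
}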

time complexity for code and an order of magnitude improvement

I have the following problem:
For the following code, with reason, give the time complexity of the function.
Write a function which performs the same task but which is an order-of-magnitude improvement in time complexity. A function with greater (time or space) complexity will not get credit.
Code:
void something(int[] a) {
    int n = a.length;
    for (int i = 0; i < n; i++)
        if (a[i] % 2 == 0) {
            int temp = a[i];
            for (int j = i; j > 0; j--)
                a[j] = a[j - 1];
            a[0] = temp;
        }
}
I'm thinking that since the temp = a[i] assignment is done n times in the worst case, it contributes n. The shift a[j] = a[j-1] runs at most (n-1) + (n-2) + ... + 1 = (n^2 - n)/2 times. Summing gives n + (n^2 - n)/2 = 0.5n^2 + 0.5n; dropping the constant factors and the lower-order term leaves n^2, so the complexity is O(n^2).
For the order-of-magnitude improvement:
void something(int[] a) {
    int n = a.length;
    String answer = "";
    for (int i = 0; i < n; i++) {
        if (a[i] % 2 == 0) answer = a[i] + answer;
        else answer = answer + a[i];
    }
    for (int i = 0; i < n; i++)
        a[i] = answer.charAt(i);
}
The body of the first for loop executes n times and the body of the second for loop another n times, summing to a figure of 2n.
Is this correct, or am I doing something wrong?
I suppose your function is meant to rearrange the list so that all the even numbers come first, followed by the odd numbers.
For the first function the complexity is O(n^2), as you have worked out.
The second function is O(n) only if the + operator used for appending runs in constant time. Concatenating immutable Java Strings with + is not constant time, but the same idea with a StringBuilder or an auxiliary array does give constant-time appends, so the second approach can indeed run in O(n).
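Here is a minimal sketch of that O(n) idea using an auxiliary int array instead of a String (the ordering within each group may differ slightly from the shifting version, and like the String version it uses O(n) extra space; this is an illustration, not the graded answer):

static void something(int[] a) {
    int n = a.length;
    int[] result = new int[n];
    int idx = 0;
    for (int value : a)                         // first pass: collect the evens
        if (value % 2 == 0) result[idx++] = value;
    for (int value : a)                         // second pass: collect the odds
        if (value % 2 != 0) result[idx++] = value;
    System.arraycopy(result, 0, a, 0, n);       // two passes plus one copy: O(n) time
}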
