Complexity and Big-O Notation

What is the worst-case time complexity of the following two algorithms, assuming items (an ArrayList&lt;Integer&gt;) has enough unused space that it never needs to be resized? My initial guess is that A would run slower because it has to shift every element over to add the new one at index 0. I think B is O(N^2) in the worst case, but I am not sure.
A.
for (int i = 0; i < N; i++)
items.add(0, new Integer(i));
and B.
for (int i = 0; i < N; i++)
items.add(new Integer(i));

If your question is about Java, then the first version is slower and has complexity O(N^2) for the very reason you mention, while B has complexity O(N).

Implementation A could, assuming the items array is sufficiently large, be implemented as:
for (int i = 0; i < n; i++) {
    for (int j = items.size; j > 0; j--) {
        items[j] = items[j-1];
    }
    items[0] = i;
    items.size++;
}
The total number of operations executed in this case (assuming m was the initial size of the items list) would be:
m + (m+1) + (m+2) + ... + (m+n-1) = m*n + n*(n-1)/2
This has the complexity O(n^2).
Option B, on the other hand, can be implemented as
for (int i = 0; i < n; i++) {
items[items.size] = i;
items.size++;
}
and the number of operations executed in this case will be
2n (one write and one size increment per iteration)
This has the complexity O(n).

In A, each insertion must shift every existing item one position to the right in the array list's internal array, so completing all the insertions is O(n^2). In the second case, no shifting is needed, so it is O(n). In A, you are doing a lot of unnecessary and expensive work.
I am assuming, as you stipulated, that the internal array is never resized.
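To see the difference concretely, here is a small counter of my own (not from the original post) that simulates both insertion patterns and tallies how many element moves each one performs:

```java
// Hypothetical counter that simulates both insertion patterns on an
// array-backed list and counts how many existing elements must move.
public class InsertCost {
    // add(0, v): every element already in the list shifts one slot right.
    static long movesAddAtFront(int n) {
        long moves = 0;
        int size = 0;
        for (int i = 0; i < n; i++) {
            moves += size; // current size elements shift right
            size++;
        }
        return moves; // 0 + 1 + ... + (n-1) = n(n-1)/2
    }

    // add(v) at the end: nothing ever shifts.
    static long movesAddAtEnd(int n) {
        return 0;
    }

    public static void main(String[] args) {
        System.out.println(movesAddAtFront(1000)); // 499500: quadratic growth
        System.out.println(movesAddAtEnd(1000));   // 0
    }
}
```

The front-insertion count grows as n(n-1)/2, which is exactly the quadratic term derived above.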

Related

two or more for loops time complexity

If we assume the statements inside a for loop are O(1):
for (i = 0; i < N; i++) {
sequence of statements
}
Then the above time complexity should be O(N).
Another example:
for (i = 0; i < N; i++) {
for (j = i+1; j < N; j++) {
sequence of statements
}
}
The above time complexity should be O(N^2).
It seems as if each for loop contributes a factor of N.
However, I've seen discussions saying that it's not that simple:
sometimes there are two for loops, but that doesn't mean the time complexity must be O(N^2).
Can somebody give an example of two or more nested for loops with only O(N) time complexity (an explanation would be much appreciated)?
This occurs when either the outer or the inner loop is bounded by a fixed (constant) limit on iterations. It happens in many problems, such as bit manipulation. An example:
for (int i = 0; i < n; i++) {
int x = ...; //some constant time operations to determine int x
for (int j = 0; x != 0; j++) {
x >>>= 1;
}
}
The inner loop is limited by the number of bits in an int. For Java that is 32, so the inner loop is strictly limited to 32 iterations and is constant time. The overall complexity is therefore linear, O(n).
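Another common O(n) pattern with two nested loops, sketched here with names of my own choosing, is the two-pointer scan: the inner index never resets, so the inner loop advances at most n times in total across the whole run.

```java
// Two nested loops, yet O(n) overall: the inner index j only ever moves
// forward across the entire run, never back to the start.
public class TwoPointer {
    // Counts pairs (i, j), i < j, with a[j] - a[i] <= t in a sorted array.
    static long countClosePairs(int[] a, int t) {
        long pairs = 0;
        int j = 0;
        for (int i = 0; i < a.length; i++) {
            if (j < i) j = i;
            while (j + 1 < a.length && a[j + 1] - a[i] <= t) {
                j++; // inner loop: j only advances, so at most n steps total
            }
            pairs += j - i;
        }
        return pairs;
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 4, 7, 8};
        System.out.println(countClosePairs(a, 3)); // 5
    }
}
```

Both indices traverse the array at most once, so the combined work is at most 2n steps, i.e. O(n) despite the nesting.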
Loops can have fixed time if the bounds are fixed. So for these nested loops, where k is a constant:
// k is a constant
for (int i=0; i<N; i++) {
for (int j=0; j<k; j++) {
}
}
// or this
for (int i=0; i<k; i++) {
for (int j=0; j<N; j++) {
}
}
Both are O(kN) = O(N), since k is a constant.
The code in the post has a time complexity of:
(N-1) + (N-2) + ... + 1 + 0 = N(N-1)/2 = (N^2 - N)/2
Since this is a second-order polynomial, you correctly assessed the asymptotic time complexity of your example as O(N^2).
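That count can be checked empirically with a small harness of my own (not from the post), tallying how often the innermost statement of the j = i+1 loop runs:

```java
// Verifies that the inner statement in "for (j = i+1; j < N; j++)"
// executes exactly N(N-1)/2 times.
public class PairLoopCount {
    static long count(int n) {
        long c = 0;
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n; j++)
                c++; // stands in for the "sequence of statements"
        return c;
    }

    public static void main(String[] args) {
        int n = 100;
        System.out.println(count(n));               // 4950
        System.out.println((long) n * (n - 1) / 2); // matches the formula
    }
}
```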

Big O Time Complexity for nested loops

What is the time complexity for the below code?
int i = 0;
while(i*i <=N) {
for(int j = 0; j <=N; j++) {
for(int k = 0; k <=N; k++, i++) {
//O(1) operation
}
}
i++;
}
In nested loops, if outer loop 1 takes O(1) time, inner loop 2 takes O(log n) time, and inner loop 3 takes O(n) time,
is the total time complexity O(1) * O(log n) * O(n) = O(n log n)? Is that true?
Please explain.
tl;dr: This code runs in O(n^2).
Detailed answer:
The outer "loop", while (i*i <= N), is a distraction.
Because i is incremented on every iteration of the innermost loop, by the time the while condition is re-evaluated the value of i is already (N+1)^2. This means the outer loop executes its body only once.
Once you see that, it is easy to tell that the time complexity of the remaining code is O(N^2), since the fragment can basically be rewritten as:
int i = 0;
if (i * i <= N) { // Since the while loop does not repeat more than once
for(int j = 0; j <=N; j++) {
for(int k = 0; k <=N; k++, i++) {
//O(1) operation
}
}
i++;
}
Note: this answer assumes the variables do not overflow; in particular, i does not overflow.
The operation "//O(1) operation" is executed (N+1)^2 times, and the number of operations performed by the loops themselves (e.g. evaluating the test j <= N) is also quadratic (N^2 + a*N + b for some constants a and b).
So the time complexity is O(N^2).
You can test that by extending your program the following way:
int i=0;
int count=0;
while(i*i <= N)
{
for(int j=0; j<=N; j++)
{
for(int k=0; k<=N; k++, i++)
{
count++;
}
}
i++;
printf("i after the while() loop: %d\n",i);
}
printf("Number of steps: %d\n",count);
You test your program with different values of N and you can see that:
i is (N+1)^2 + 1 after the first pass of the while() loop. This means the condition i*i <= N becomes ((N+1)^2 + 1)^2 <= N, which is false for every N >= 0, so the loop body never runs a second time.
The operation count++ (representing our O(1) operation) is executed (N+1)^2 times.
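The same instrumentation can be written in Java; this sketch of mine uses a long-typed comparison so the check itself cannot overflow, matching the no-overflow assumption above:

```java
// Confirms the while body runs exactly once and the O(1) operation
// runs (N+1)^2 times. The condition is evaluated in long arithmetic
// so that i*i cannot overflow during the check.
public class WhileOnce {
    static long[] run(int n) {
        long count = 0, passes = 0;
        int i = 0;
        while ((long) i * i <= n) {
            passes++;
            for (int j = 0; j <= n; j++)
                for (int k = 0; k <= n; k++, i++)
                    count++;
            i++;
        }
        return new long[]{passes, count};
    }

    public static void main(String[] args) {
        long[] r = run(1000);
        System.out.println(r[0]); // 1 -> the while body executes only once
        System.out.println(r[1]); // 1002001 = (1000 + 1)^2
    }
}
```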

big O for an algorithm

I have a question about this algorithm; I am having a hard time understanding its complexity.
While(input) {
...
array to check
...
for (int i=0; i< arraysize; i++) {
creation first subarray;
creation second subarray;
}
for (int i=0; i < firstsubarraysize; i++) {
addinput to array;
for( int j = 0; j < secondsubarraysize; j++) {
calculate array[j] * input;
}
}
}// endwhile
So, treating the number of while iterations as M and the array size as N, would the big O be O(M(N log N)) or O(N^2)?
The subarrays are not always of size N/2.
I apologize if I'm not being very clear.
Generally, each level of loop nesting contributes one factor of N to the big O time complexity.
Here, assume the outermost while loop runs M times; inside it the dominant part is the doubly nested for loop, so the total is O(M * N^2).
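A rough operation count makes this concrete. The sketch below is mine, and it assumes for simplicity that both subarrays have the same size N as the input array (the original pseudocode leaves their sizes unspecified):

```java
// Counts the operations of the pseudocode above under the simplifying
// assumption that both subarrays have size N.
public class NestedCount {
    static long operations(int m, int n) {
        long ops = 0;
        for (int w = 0; w < m; w++) {        // while(input): assumed M iterations
            for (int i = 0; i < n; i++)      // building the two subarrays
                ops++;
            for (int i = 0; i < n; i++)      // first subarray
                for (int j = 0; j < n; j++)  // second subarray
                    ops++;
        }
        return ops; // M * (N + N^2), i.e. O(M * N^2)
    }

    public static void main(String[] args) {
        System.out.println(operations(3, 10)); // 3 * (10 + 100) = 330
    }
}
```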

worst case runtime of the double for loop

Can someone please explain how the worst-case running time is O(N) and not O(N^2) in the following exercise. There is a double for loop where, for every i, we compare j to i, execute sum++, increment j, and repeat until we reach N.
What is the order of growth of the worst case running time of the following code fragment
as a function of N?
int sum = 0;
for (int i = 1; i <= N; i = i*2)
for (int j = 0; j < i; j++)
sum++;
Question Explanation
The answer is: N.
The body of the inner loop is executed 1 + 2 + 4 + 8 + ... + N ~ 2N times.
I think you already stated the answer in your question -- the inner loop body executes about 2N times, which is O(N). In asymptotic (big-O) notation, constant multiples are dropped because for very large values the graph of 2N looks just like N, so the factor isn't considered significant. In this case, the complexity equals the number of times sum++ is called, because the algorithm is so simple. Does that make sense?
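A quick check of my own confirms the geometric-series count: the body executes 1 + 2 + 4 + ... up to N times, which always stays below 2N.

```java
// Counts how often sum++ runs in the doubling loop; the total is
// 1 + 2 + 4 + ... <= 2N, hence O(N).
public class DoublingLoop {
    static long bodyCount(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i *= 2)
            for (int j = 0; j < i; j++)
                sum++;
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(bodyCount(16)); // 1+2+4+8+16 = 31, less than 2*16
    }
}
```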
Complexity doesn't depend on the number of nested loops.
The time complexity of nested loops equals the number of times the innermost statement is executed. For example, the following sample loops both have O(N^2) time complexity:
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
for (int i = n; i > 0; i -= c) {
    for (int j = i+1; j <= n; j += c) {
        // some O(1) expressions
    }
}

time complexity for code and an order of magnitude improvement

I have the following problem:
For the following code, with reason, give the time complexity of the function.
Write a function which performs the same task but which is an order-of magnitude improvement in time complexity. A function with greater (time or space) complexity will not get credit.
Code:
void something(int[] a) {
    int n = a.length;
    for (int i = 0; i < n; i++)
        if (a[i] % 2 == 0) {
            int temp = a[i];
            for (int j = i; j > 0; j--)
                a[j] = a[j-1];
            a[0] = temp;
        }
}
I'm thinking that since the temp = a[i] assignment is done at most n times in the worst case, it contributes n; and a[j] = a[j-1] runs at most n(n+1)/2 times, contributing (n^2+n)/2. Summing gives n + (n^2+n)/2; dropping constants and lower-order terms leaves a complexity of n^2.
For the order of magnitude improvement:
void something(int[] a) {
    int n = a.length;
    String answer = "";
    for (int i = 0; i < n; i++) {
        if (a[i] % 2 == 0) answer = a[i] + answer;
        else answer = answer + a[i];
    }
    for (int i = 0; i < n; i++)
        a[i] = answer.charAt(i);
}
The code inside the first for loop executes n times, and the second for loop executes another n times; summing gives a figure of 2n.
Is this correct, or am I doing something wrong?
I suppose your function is meant to arrange the list with all the even numbers at the beginning, followed by the odd numbers.
For the first function the complexity is O(n^2), as you determined.
For the second function, however, be careful: the complexity would be O(n) only if the + operator used for appending were a constant-time operation. In Java it is not: String concatenation copies the whole string, so each `answer = a[i] + answer` costs time proportional to the current length of answer, and the loop as written is actually O(n^2). To genuinely get O(n) you need a structure with constant-time insertion at both ends.
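One way to get a true O(n) rearrangement, sketched here with my own names rather than taken from the question, is an ArrayDeque: addFirst and addLast are both O(1), and the result matches the original function's ordering (evens in reverse encounter order, then odds in encounter order):

```java
import java.util.ArrayDeque;
import java.util.Arrays;

// O(n) alternative: a deque gives O(1) insertion at both ends, avoiding
// the O(n)-per-step cost of Java String concatenation.
public class EvenFirst {
    static int[] evensFirst(int[] a) {
        ArrayDeque<Integer> d = new ArrayDeque<>();
        for (int v : a) {
            if (v % 2 == 0) d.addFirst(v); // evens go to the front
            else d.addLast(v);             // odds keep their order at the back
        }
        int[] out = new int[a.length];
        int i = 0;
        for (int v : d) out[i++] = v;
        return out;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(evensFirst(new int[]{1, 2, 3, 4, 5})));
        // prints [4, 2, 1, 3, 5]
    }
}
```

Each element is touched a constant number of times, so both loops together are O(n).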
