Big O for an algorithm

I have a question about this algorithm; I am having a hard time understanding its complexity.
while (input) {
    ...
    // array to check
    ...
    for (int i = 0; i < arraysize; i++) {
        // creation of first subarray
        // creation of second subarray
    }
    for (int i = 0; i < firstsubarraysize; i++) {
        // add input to array
        for (int j = 0; j < secondsubarraysize; j++) {
            // calculate array[j] * input
        }
    }
} // end while
So, considering the number of inputs to be M and the array size to be N, would the big O be O(M·N·log N) or O(N^2)?
The subarrays are not always N/2.
I apologize if I'm not very clear.

Generally, each level of loop nesting contributes one factor of N to the big O time complexity.
Here, assume the outermost while loop runs M times; inside it, the dominant cost is the pair of nested for loops, which is at most O(N^2) per iteration when the subarray sizes are proportional to N. So the total is O(M·N^2).
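As a rough sanity check of that bound, here is a minimal counting sketch in Java; the sizes M and N, and the assumption that both subarrays are proportional to N, are illustrative and not taken from the post. The counter grows like M*N^2:
// Sketch only: M, N, and the subarray sizes below are illustrative assumptions.
public class NestedLoopCount {
    public static void main(String[] args) {
        int M = 100;   // number of inputs consumed by the while loop (assumed)
        int N = 50;    // array size (assumed)
        long count = 0;
        for (int input = 0; input < M; input++) {   // stands in for while (input)
            for (int i = 0; i < N; i++) {
                // creation of first and second subarray: O(N) per input
            }
            for (int i = 0; i < N; i++) {           // first subarray, assumed proportional to N
                for (int j = 0; j < N; j++) {       // second subarray, assumed proportional to N
                    count++;                        // the array[j] * input work
                }
            }
        }
        System.out.println(count + " operations vs M*N*N = " + (long) M * N * N);
    }
}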

Related

Running time of nested loops when the inner loop condition is false

01.) Considering the growth rate: when the condition of the inner loop of a nested loop is always false, is its running time O(n)?
ex:
for (int i=0; i<n; i++){
    for (int j=0; j>n; i++){
        //some code
    }
}
02.) Considering the growth rate: when the inner loop of a nested loop is bounded by another array's length 'm', is its running time O(n·m)?
ex:
for (int i=0; i<n; i++){
    for (int j=0; j<m; i++){
        //some code
    }
}
What is the running time of the following code? (please explain with steps)
for (int i = 0; i < n; i++) {
    for (int j = 0; j > n; j++) {
        for (int k = 0; k > n; k++) {
            System.out.println("*");
        }
    }
}
Thank You.
Yes. If it is absolutely clear that the inner loop will not be executed because its condition is false regardless of the input size, then the time complexity will be O(n).
Yes, a detailed analysis can be found below. However, I assume you meant to increment j, not i, in your inner loop.
for (int i=0; i<n; i++){
    for (int j=0; j<m; j++){ // j++
        //some code
    }
}
Number of operations performed by the inner loop in each iteration of the outer loop: m
Number of iterations of the outer loop: n
Total number of operations: n*m
Time complexity f(n) ∈ O(n*m)
If the code you posted is exactly as intended, then the time complexity analysis is:
for (int i=0; i<n; i++){
    for (int j=0; j<m; i++){
        //some code
    }
}
j = 0 --> j < m (but only i++ is executed, so j never changes)
If 0 < m, then it is an infinite loop, and a time complexity analysis is pointless.
If 0 >= m, then the inner loop is never executed, therefore the time complexity will be O(n).
Detailed analysis (assuming the comparisons were meant to be j < n and k < n; as written with >, the inner loops never run and the answer to 01 applies):
for (int i = 0; i < n; i++) {         // n many times
    for (int j = 0; j < n; j++) {     // n many times
        for (int k = 0; k < n; k++) { // n many times
            System.out.println("*");
        }
    }
}
Each of the loops iterates n times. The innermost loop performs n operations; the middle loop repeats it n times, which gives n*n; and the outermost loop repeats that n times, which gives n*n*n in total.
Time complexity f(n) ∈ O(n^3)
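For a quick numeric check, here is a small counting sketch (again assuming the inner conditions were meant to be j < n and k < n; the value of n is arbitrary):
public class TripleLoopCount {
    public static void main(String[] args) {
        int n = 20;    // illustrative input size
        long count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {           // '<' assumed instead of '>'
                for (int k = 0; k < n; k++) {
                    count++;                        // stands in for System.out.println("*")
                }
            }
        }
        System.out.println(count + " vs n^3 = " + (long) n * n * n);
    }
}
The printed count equals n^3 exactly, matching the O(n^3) analysis.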

two or more for loops time complexity

If we assume the statements inside a for loop are O(1):
for (i = 0; i < N; i++) {
    sequence of statements
}
Then the above time complexity should be O(n).
Another example is:
for (i = 0; i < N; i++) {
    for (j = i+1; j < N; j++) {
        sequence of statements
    }
}
The above time complexity should be O(n^2).
It seems like each for loop contributes a factor of n.
However, I've seen some discussions saying it's not that simple:
sometimes there are two for loops, but that doesn't mean the time complexity must be O(n^2).
Can somebody give me an example of two or more for loops with only O(n) time complexity (an explanation would be much better)?
This occurs when either the outer or the inner loop is bounded by a fixed (constant) number of iterations. It happens in many problems, for example bit manipulation. An example looks like this:
for (int i = 0; i < n; i++) {
    int x = ...; //some constant time operations to determine int x
    for (int j = 0; x != 0; j++) {
        x >>>= 1;
    }
}
The inner loop is limited by the number of bits in an int. In Java that is 32, so the inner loop runs at most 32 iterations and is constant time. The overall complexity is therefore linear, i.e. O(n).
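Here is a self-contained version of that idea; the choice x = i is just an illustrative stand-in for the constant-time computation of x, and the counter shows the inner loop does at most 32 iterations per outer iteration:
public class BitLoopBound {
    public static void main(String[] args) {
        int n = 1_000_000;                     // illustrative input size
        long innerIterations = 0;
        for (int i = 0; i < n; i++) {
            int x = i;                         // stand-in for the constant-time computation of x
            for (int j = 0; x != 0; j++) {     // at most 32 iterations for a Java int
                x >>>= 1;
                innerIterations++;
            }
        }
        // Total inner work is at most 32*n, so the whole thing is O(n).
        System.out.println("inner iterations: " + innerIterations + " <= 32*n = " + 32L * n);
    }
}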
Loops can have fixed time if the bounds are fixed. So for these nested loops, where k is a constant:
// k is a constant
for (int i=0; i<N; i++) {
    for (int j=0; j<k; j++) {
    }
}
// or this
for (int i=0; i<k; i++) {
    for (int j=0; j<N; j++) {
    }
}
Both are O(kN), which is the same as O(N) since k is a constant.
The code in the post executes its inner statements (N-1) + (N-2) + ... + 1 + 0 = N(N-1)/2 = (N^2 - N)/2 times. Given this is a second-order polynomial, you correctly assessed the asymptotic time complexity of your example as O(N^2).
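If it helps to verify that count, here is a small sketch (the value of N is arbitrary) that counts the inner iterations of the j = i + 1 loop and compares them with (N^2 - N)/2:
public class TriangularCount {
    public static void main(String[] args) {
        int N = 100;   // illustrative size
        long count = 0;
        for (int i = 0; i < N; i++) {
            for (int j = i + 1; j < N; j++) {
                count++;                       // the "sequence of statements"
            }
        }
        System.out.println(count + " vs (N^2 - N)/2 = " + ((long) N * N - N) / 2);
    }
}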

Big O Time Complexity for nested loops

What is the time complexity for the below code?
int i = 0;
while (i*i <= N) {
    for (int j = 0; j <= N; j++) {
        for (int k = 0; k <= N; k++, i++) {
            //O(1) operation
        }
    }
    i++;
}
In nested loops, if outer loop 1 takes O(1) time, inner loop 2 takes O(log n) time, and inner loop 3 takes O(n) time, then the total time complexity is O(1)·O(log n)·O(n) = O(n log n). Is that true?
Please explain.
tl;dr: This code runs in O(n^2).
Detailed answer:
The outer "loop", while (i*i <= N), is a distraction.
Because i is incremented on every iteration of the innermost loop, by the time the while condition is re-evaluated the value of i is already about N*N, so the outer loop runs only once.
Once you ignore it, it is easy to see that the time complexity of the remaining code is O(N^2), since it can basically be rewritten as:
int i = 0;
if (i * i <= N) { // since the while loop does not repeat more than once
    for (int j = 0; j <= N; j++) {
        for (int k = 0; k <= N; k++, i++) {
            //O(1) operation
        }
    }
    i++;
}
Note: This answer assumes the variables, and in particular i, do not overflow.
The operation "//O(1) operation" is executed (N+1)^2 times, and the number of calculations done by the loops themselves (e.g. performing the test j <= N) is also quadratic (N^2 + a*N + b for some constants a and b).
So the time complexity is O(N^2).
You can test that by extending your program the following way:
int i = 0;
int count = 0;
while (i*i <= N)
{
    for (int j = 0; j <= N; j++)
    {
        for (int k = 0; k <= N; k++, i++)
        {
            count++;
        }
    }
    i++;
    printf("i after the while() loop: %d\n", i);
}
printf("Number of steps: %d\n", count);
If you test the program with different values of N, you can see that:
i is (N+1)^2 + 1 after the first pass of the while() loop. This means the condition i*i <= N is equivalent to (N+1)^4 + 2*N^2 + 3*N <= -3, which is always false for N >= 0.
The operation count++ (representing our O(1) operation) is done (N+1)^2 times.
Such questions are typically asked on the CS Stack Exchange web site, not on StackOverflow.
StackOverflow is intended for questions about writing computer programs only.
(Note that StackOverflow is part of the StackExchange network and you can use all StackExchange web sites with your user account.)

Run time and theta notation

for the following code:
for (i = 0; i < 5; i++)
    for (j = 2; j < n; j++)
    {
        c[i][j] = 0;
        for (k = 0; k < n; k++)
            c[i][j] = a[i][k]*b[k][j];
    }
I would say the run time is Θ(n^3): in the k loop I see two n's, and the other loop contributes another n, making it n^3 overall. Would this be right, or what did I miss?
Thank you
Here is your code, formatted:
for (i=0; i < 5; i++) {
    for (j=2; j < n; j++) {
        c[i][j] = 0;
        for (k=0; k < n; k++)
            c[i][j] = a[i][k]*b[k][j];
    }
}
The outer loop in i only iterates 5 times, and so can just be treated as a constant factor as far as complexity is concerned. The inner two loops in j and k are independent of each other, and both are O(n). We can therefore just multiply the complexities to get O(n^2) for the overall running time as a function of n.
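To make the count explicit: for n > 2 the j loop runs n - 2 times and the k loop runs n times, so the multiplication line executes 5 * (n - 2) * n = 5n^2 - 10n times, which is Θ(n^2) as a function of n, not Θ(n^3).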

Big O Notation for a For-Loop

How do I find the Big O notation for this for-loop line of code?
for (int j = 0; pow(j,2) < n; j++)
Does anyone know?
I have read a little on Big O notation and it's a very confusing topic to understand. I know that usually a for-loop like this one → for (int n = 0; n < 20; ++n) has a Big O notation of O(1); as the input increases by 13, so does its output, with linear complexity. Is that the same situation as above?
A loop like this:
for (int i = 0; i < n; ++i) {
    doSomething(i);
}
iterates n times, so if doSomething has O(1) running time, then the loop as a whole has O(n) running time.
Similarly, a loop like this:
for (int j = 0; pow(j, 2) < n; j++) {
    doSomething(j);
}
iterates ⌈√n⌉ times, so if doSomething has O(1) running time, then the loop as a whole has O(√n) running time.
By the way, note that although pow(j, 2) is O(1) running time — so it doesn't affect your loop's asymptotic complexity — it's nonetheless pretty slow, because it involves logarithms and exponentiation. For most purposes, I'd recommend this instead:
for (int j = 0; j * j < n; j++) {
    doSomething(j);
}
or perhaps this:
for (int j = 0; 1.0 * j * j < n; j++) {
    doSomething(j);
}
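As a quick check of the ⌈√n⌉ claim, here is a small counting sketch (the value of n is arbitrary):
public class SqrtLoopCount {
    public static void main(String[] args) {
        int n = 1000;  // illustrative input size
        int count = 0;
        for (int j = 0; (long) j * j < n; j++) {   // same test as j*j < n, written to avoid overflow
            count++;                               // stands in for doSomething(j)
        }
        System.out.println(count + " iterations vs ceil(sqrt(n)) = " + (int) Math.ceil(Math.sqrt(n)));
    }
}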

Resources