What is the complexity of these nested for loops:
for(int p=2; p<=LENGTH; p++){
    for(int i=1; i*p <= LENGTH; i++)
    {
        //some code
    }
}
In Big-Θ notation, analyze the running time of the following three pieces of code/pseudo-code, describing it as a function of the input, n.
1.
void f1(int n)
{
    int t = sqrt(n);
    for(int i = 0; i < n; i++){
        for(int j = 0; j < n; j++){
            // do something O(1)
        }
        n -= t;
    }
}
2.
Assume A is an array of size n+1.
void f2(int* A, int n)
{
    for(int i=1; i <= n; i++){
        for(int k=1; k <= n; k++){
            if( A[k] == i){
                for(int m=1; m <= n; m=m+m){
                    // do something that takes O(1) time
                    // Assume the contents of the A[] array are not changed
                }
            }
        }
    }
}
3.
void f3(int* A, int n)
{
    if(n <= 1) return;
    else {
        f3(A, n-2);
        // do something that takes O(1) time
        f3(A, n-2);
    }
}
Any help would be appreciated.
01.) Considering the growth of running time: when the condition of the inner loop in a nested loop is never true, is the complexity O(n)?
ex:
for (int i=0; i<n; i++){
    for (int j=0; j>n; i++){
        //some code
    }
}
02.) Considering the growth of running time: when the inner loop of a nested loop is bounded by a different length variable m, is the complexity O(n*m)?
ex:
for (int i=0; i<n; i++){
    for (int j=0; j<m; i++){
        //some code
    }
}
What is the running time of the following code? (please explain with steps)
for (int i = 0; i < n; i++) {
    for (int j = 0; j > n; j++) {
        for (int k = 0; k > n; k++) {
            System.out.println("*");
        }
    }
}
Thank You.
Yes. If it is clear that the inner loop will never be executed because its condition is false regardless of the input size, then the time complexity is O(n).
Yes, a detailed analysis can be found below. However, I assume you incorrectly incremented i in your inner loop.
for (int i=0; i<n; i++){
    for (int j=0; j<m; j++){ // j++
        //some code
    }
}
Number of operations performed by the inner loop in each iteration of the outer loop: m
Number of iterations of the outer loop: n
Total number of operations: n*m
Time complexity: f(n) ∈ O(n*m)
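A quick way to check the n*m count empirically is a minimal Java sketch like the one below (not from the original post; the sizes are arbitrary examples):
public class ProductCount {
    public static void main(String[] args) {
        int n = 4, m = 7;               // arbitrary example sizes
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < m; j++)
                count++;                // stands in for "some code"
        System.out.println(count + " == " + (n * m));  // prints 28 == 28
    }
}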
If the code you posted is exactly what you intended, then the time complexity analysis is:
for (int i=0; i<n; i++){
    for (int j=0; j<m; i++){ // note: i is incremented here, j never changes
        //some code
    }
}
j starts at 0 and the loop runs while j < m, but only i is incremented, so j never changes.
If m > 0, the inner loop never terminates, so a time complexity analysis is meaningless.
If m <= 0, the inner loop body is never executed, and the time complexity is O(n).
Detailed analysis, assuming the inner comparisons were meant to be j < n and k < n (as written, with j > n and k > n, the inner loops never execute, so the whole thing would only be O(n)):
for (int i = 0; i < n; i++) {         // n iterations
    for (int j = 0; j < n; j++) {     // n iterations
        for (int k = 0; k < n; k++) { // n iterations
            System.out.println("*");
        }
    }
}
Each loop iterates n times. The innermost loop performs n operations; the middle loop runs it n times, giving n*n; and the outermost loop runs that n times, giving n*n*n in total.
Time complexity f(n) ∈ O(n^3)
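As a quick sanity check of the n^3 count, here is a minimal Java sketch (not from the original post) that counts the innermost executions for a small n:
public class CubeCount {
    public static void main(String[] args) {
        int n = 10;
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                for (int k = 0; k < n; k++)
                    count++;            // stands in for the println
        System.out.println(count);      // prints 1000, i.e. 10^3
    }
}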
What will be the time complexity of the code below, and why?
public static int[] Shuffle(int[] nums, int n)
{
    int len = nums.Length;
    int[] final = new int[2 * n];
    int counter = 0;
    for (int i = 0, j = n; i < n; i++, j++)
    {
        final[counter++] = nums[i];
        final[counter++] = nums[j];
    }
    return final;
}
If we had two nested loops as below, it would be considered O(n^2):
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
    }
}
Complexity is O(n) because the loop runs from i = 0 up to i = n-1; j simply advances in step with i. The number of loop variables doesn't matter for time complexity (it only affects space, and only by a constant). Be careful, though:
for (int i = 0, j = n; i < n; i++, j++)
is completely different from
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
    }
}
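To make that difference concrete, here is a minimal Java counting sketch (not from the original posts) that tallies how many times each loop body executes:
public class LoopShapes {
    public static void main(String[] args) {
        int n = 1000;
        long single = 0, nested = 0;
        // one loop advancing two indices together: n iterations -> O(n)
        for (int i = 0, j = n; i < n; i++, j++) {
            single++;
        }
        // two nested loops over n: n * n iterations -> O(n^2)
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                nested++;
            }
        }
        System.out.println(single);   // 1000
        System.out.println(nested);   // 1000000
    }
}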
I have this code and cannot work out its Big-O... Thanks
for(i = 0; i<n; i++){
    for(j = i; j<n; j++){
        if (arr[j]%2!=0){
            if (minodd > arr[j]){
            }
        }
    }
}
One of the best ways to approach this problem is to break it down into smaller parts.
First, let's look at your inner loop:
for(j = i; j<n; j++){
    if (arr[j]%2!=0){       // O(1)
        if (minodd > arr[j]){ // O(1)
        }
    }
}
The if-statements are O(1), i.e. constant time, so we can ignore them and are left with just the inner for loop:
for(j = i; j<n; j++){
    ... // O(1) + O(1)
}
In the worst case (when i = 0) this loop runs n times, each iteration doing O(1) + O(1) work, so the inner loop is O(n), which is called linear time.
Next, let's zoom out and replace the inner loop with this result:
for(i = 0; i<n; i++){
    for(j = i; j<n; j++){
        if (arr[j]%2!=0){
            if (minodd > arr[j]){
            }
        }
    }
}
becomes:
for(i = 0; i<n; i++){
    O(n)
}
Since the outer loop cycles n times in the worst case and the inner loop does O(n) work each time, we get O(n x n), or O(n²), also known as quadratic time. (More precisely, the inner loop runs n - i times, so the total work is n + (n-1) + ... + 1 = n(n+1)/2, which is still O(n²).)
Doesn't this just go on forever?
You have i < n in your inner loop, so I think it's O(inf).
Now that you've updated the loop, I think #e2-e4 is right:
#include <stdio.h>
/* eqn(n) = n + (n-1) + ... + 1 = n*(n+1)/2 */
int eqn(int n)
{
    return n > 0 ? n + eqn(n - 1) : 0;
}
int main(int argc, char **argv)
{
    int i, j, n, v;
    v = 0;
    n = 5;
    for (i = 0; i < n; i++) {
        for (j = i; j < n; j++) {
            v++;    /* count every execution of the inner body */
        }
    }
    /* both print 15 for n = 5: the loops run n + (n-1) + ... + 1 times */
    printf("v = %d ? %d\n", v, eqn(n));
    return 0;
}
I have the following code. I need to calculate this algorithm's complexity, but I have no idea where to start. It has three nested loops, so I guess its complexity is n^3, or am I wrong?
public static void RadixSort(DataArray data)
{
    IList<IList<int>> digits = new List<IList<int>>();
    for (int i = 0; i < 10; i++)
    {
        digits.Add(new List<int>());
    }
    for (int i = 0; i < data.Length; i++)
    {
        for (int j = 0; j < data.Length; j++)
        {
            int digit = (int)((data[j] % Math.Pow(10, i + 1)) / Math.Pow(10, i));
            digits[digit].Add((int)data[j]);
        }
        int index = 0;
        for (int k = 0; k < digits.Count; k++)
        {
            IList<int> selDigit = digits[k];
            for (int l = 0; l < selDigit.Count; l++)
            {
                data.Swap(index++, selDigit[l]);
                //data[index++] = selDigit[l];
            }
        }
        for (int k = 0; k < digits.Count; k++)
        {
            digits[k].Clear();
        }
    }
}
Calculating complexity is more involved than just looking at the number of nested loops. If you have a triple nested loop like this:
for(int i=0; i<n; i++)
    for(int j=0; j<n; j++)
        for(int k=0; k<n; k++)
it will be O(n³), assuming n is not changing in the loop. However, if you consider your case:
for(int i=0; i<n; i++)
    for(int j=0; j<m; j++)
        for(int k=0; k<m; k++)
the time complexity will instead be O(m²n).
And even the simplest sorting algorithms, like bubble sort, selection sort and insertion sort, are O(n²), so if your implementation is worse than that you're doing something wrong. The time complexity of radix sort is O(wn), where w is a measure of the size of the elements (for decimal integers, the number of digits).
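To illustrate the O(wn) behaviour, here is a minimal LSD radix sort sketch in Java (not the poster's implementation; the input values are just an example) that counts how many element visits the distribution passes make. The count comes out to w * n, where w is the number of decimal digits of the largest value:
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class RadixCount {
    public static void main(String[] args) {
        int[] data = {170, 45, 75, 90, 802, 24, 2, 66};
        long visits = 0;
        int max = 0;
        for (int v : data) max = Math.max(max, v);
        // one pass per decimal digit of the largest value: w passes
        for (long exp = 1; max / exp > 0; exp *= 10) {
            List<List<Integer>> buckets = new ArrayList<>();
            for (int d = 0; d < 10; d++) buckets.add(new ArrayList<>());
            for (int v : data) {                  // n element visits per pass
                buckets.get((int) (v / exp % 10)).add(v);
                visits++;
            }
            int idx = 0;
            for (List<Integer> b : buckets)
                for (int v : b) data[idx++] = v;  // collect back in order
        }
        // w = 3 digits, n = 8 elements -> 24 visits in the distribution passes
        System.out.println(visits + " visits, sorted: " + Arrays.toString(data));
    }
}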
When uncertain about complexity, a reasonable approach is to add counters to the inner-loop code and at the end of the routine print out the counts. Next, vary the size of the input to see how the results change. The empirical results can immediately confirm or deny your analytic or intuited results.
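For example, a minimal Java sketch of that approach (the loop shape here is just an example): double n a few times and watch how the counter grows. For two nested loops the count should roughly quadruple each time n doubles:
public class GrowthCheck {
    public static void main(String[] args) {
        for (int n = 1000; n <= 8000; n *= 2) {
            long count = 0;
            for (int i = 0; i < n; i++) {
                for (int j = i; j < n; j++) {
                    count++;            // counter in the innermost body
                }
            }
            // count = n*(n+1)/2, so it roughly quadruples when n doubles
            System.out.println("n = " + n + "  count = " + count);
        }
    }
}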