Questions on analyzing time complexity - algorithm

Hi, I have some doubts about my analysis of the following two code fragments:
1)
for (i = 1; i <= 1.5 * n; i++)
    for (j = n; j >= 2; j--)
        cout << i << j;
The outer loop will be executed 1.5n times, and the inner loop will be executed n-1 times (j runs from n down to 2). Therefore, the complexity is O(1.5n*(n-1)) = O(n^2)?
2)
j = 1;
while (j < n/2) {
    i = 1;
    while (i < j) {
        cout << j << i;
        i++;
    }
    j++;
}
The outer while loop will be executed about n/2 times, and for each j the inner while loop will be executed j-1 times, so about 0+1+2+...+(n/2-1) times in total. Therefore, the complexity is also O(n^2)?
I was not so sure about my analysis of the second problem.
Thank you so much for the help!

You're right. The way to work these out is to count iterations.
Note that:
j = 0;
while (j < n/2) {
    j++;
}
has O(n) complexity, whereas:
j = 1;
while (j < n) {
    j *= 2;
}
has O(log n) complexity.
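To double-check both fragments empirically, here is a minimal counting sketch (my own illustration, not from the original thread): it counts how many times the innermost statement runs and divides by n^2. The ratios should settle near constants, about 1.5 and 1/8 by the sums above, which is exactly what quadratic growth looks like.
#include <iostream>
using namespace std;

int main() {
    for (long long n = 1000; n <= 8000; n *= 2) {
        // Fragment 1: the outer loop runs about 1.5n times and the inner
        // loop runs n-1 times; 1.5n is written as 3*n/2 to stay in integers.
        long long count1 = 0;
        for (long long i = 1; i <= 3 * n / 2; i++)
            for (long long j = n; j >= 2; j--)
                count1++;

        // Fragment 2: for each j the inner body runs j-1 times, so the
        // total is 0 + 1 + ... + (n/2 - 1), roughly n^2/8.
        long long count2 = 0;
        for (long long j = 1; j < n / 2; j++)
            for (long long i = 1; i < j; i++)
                count2++;

        cout << "n=" << n
             << "  count1/n^2=" << (double)count1 / (n * n)
             << "  count2/n^2=" << (double)count2 / (n * n) << "\n";
    }
}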

Related

What’s the time complexity of this function with a nested 'for' loop?

Consider:
int bar(int a, int n) {
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < i; ++j)
            a = a * (i + j);
    return a;
}
Find the time complexity for the above function.
The time complexity is O(n^2).
Since the inner for loop runs from 0 to i-1 and the outer for loop runs from 0 to n-1, the inner body follows the pattern 0 + 1 + ... + (n-1), which sums to n*(n-1)/2 and is therefore O(n^2).
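To see that count concretely, here is a small instrumented sketch (my own addition, not from the original post; the helper name countBarIterations is just illustrative, and passing n in explicitly is an assumption). It counts the multiply statement and prints the closed form next to it:
#include <iostream>
using namespace std;

// Counts how many times the statement a = a * (i + j) would run in bar.
long long countBarIterations(int n) {
    long long count = 0;
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < i; ++j)
            ++count;
    return count;
}

int main() {
    for (int n = 1000; n <= 8000; n *= 2) {
        // The count matches 0 + 1 + ... + (n-1) = n*(n-1)/2 exactly.
        cout << "n=" << n
             << "  count=" << countBarIterations(n)
             << "  n*(n-1)/2=" << 1LL * n * (n - 1) / 2 << "\n";
    }
}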

Time complexity question 3 loops + if statement

I have some trouble finding the time complexity of the code below. I figured that the if statement will run approximately n times; however, I could not manage to describe it mathematically. Thanks in advance.
int sum = 0;
for (int i = 1; i < n; i++) {
    for (int j = 1; j < i*i; j++) {
        if (j % i == 0) {
            for (int k = 0; k < j; k++) {
                sum++;
            }
        }
    }
}
Outer loop
Well, it's clear that it's O(n) because i is bounded by n.
Inner loops
If we take a look at the second loop alone, then it looks as follows:
...
for (int j = 1; j < i*i; j++) {
...
j is bounded by i*i or simply n^2.
However, the innermost loop won't be executed for every j, but only for the values of j that are divisible by i, because that's what the constraint j % i == 0 means. Since j goes up to about i*i, there are only about i such values, and each one triggers an innermost loop of up to i*i iterations. So the number of iterations in the inner loops is bounded by i * i^2 = i^3, or simply n^3.
Result
Hence, summing the i^3 bound over the n values of i, the overall time complexity is O(n^4).
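As a rough empirical check (my own sketch, not from the original answer), the snippet below counts the middle-loop iterations and the sum++ executions separately. The sum++ count divided by n^4 should settle near a constant (my calculation suggests about 1/8), while the middle-loop count becomes negligible, which supports the O(n^4) bound.
#include <iostream>
using namespace std;

int main() {
    for (int n = 50; n <= 200; n *= 2) {
        long long middle = 0;    // iterations of the j loop (condition checks)
        long long innermost = 0; // executions of sum++
        for (int i = 1; i < n; i++) {
            for (int j = 1; j < i * i; j++) {
                middle++;
                if (j % i == 0) {
                    for (int k = 0; k < j; k++) {
                        innermost++;
                    }
                }
            }
        }
        double n4 = 1.0 * n * n * n * n;
        cout << "n=" << n
             << "  innermost/n^4=" << innermost / n4
             << "  middle/n^4=" << middle / n4 << "\n";
    }
}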

Worst-case running time using Big-Θ notation

I understand that the innermost for loop is Θ(log n), and the two outermost for loops are Θ(n^2) because it's an arithmetic sum. The if-statement is my main problem. Does anyone know how to solve this?
int tally = 0;
for (int i = 1; i < n; i++)
{
    for (int j = i; j < n; j++)
    {
        if (j % i == 0)
        {
            for (int k = 1; k < n; k *= 2)
            {
                tally++;
            }
        }
    }
}
Edit:
Now I noticed the loop order: i before j.
In this case, for a given value of i, j varies from i to n-1, and there are about n/i successful if-conditions.
So the program will reach the innermost loop about
n/1 + n/2 + n/3 + ... + n/n
times. This is n times the n-th harmonic number, which grows like n*ln(n).
Since each visit to the innermost loop takes about log2(n) steps, the innermost statement executes about n*log^2(n) times.
As you wrote, the two outermost loops give O(n^2) complexity, so the overall complexity is O(n^2 + n*log^2(n)); the first term dominates the second, and the overall complexity is therefore quadratic.
int tally = 0;
for (int i = 1; i < n; i++)
{
    // N TIMES
    for (int j = i; j < n; j++)
    {
        // N*N/2 TIMES
        if (j % i == 0)
        {
            // NlogN TIMES
            for (int k = 1; k < n; k *= 2)
            {
                // N*logN*logN
                tally++;
            }
        }
    }
}
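If it helps, here is a small counting sketch (my addition, not from the original answer) that tallies the successful if-conditions and the innermost increments: the first should track n*ln(n) and the second n*ln(n)*log2(n), both of which are dwarfed by the roughly n^2/2 iterations of the j loop.
#include <cmath>
#include <iostream>
using namespace std;

int main() {
    for (int n = 1000; n <= 8000; n *= 2) {
        long long hits = 0;  // successful if-conditions
        long long tally = 0; // innermost increments
        for (int i = 1; i < n; i++) {
            for (int j = i; j < n; j++) {
                if (j % i == 0) {
                    hits++;
                    for (int k = 1; k < n; k *= 2) {
                        tally++;
                    }
                }
            }
        }
        // Both ratios should hover near a constant as n grows.
        cout << "n=" << n
             << "  hits/(n*ln n)=" << hits / (n * log(n))
             << "  tally/(n*ln n*log2 n)="
             << tally / (n * log(n) * log2(n)) << "\n";
    }
}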
Old answer (wrong)
This complexity is linked to the sum of the sigma_0(n) function (the number-of-divisors function), represented as sequence A006218 (the Dirichlet divisor problem).
An approximation for the sum of the divisor counts for values up to n is
n * (log(n) + 2*gamma - 1) + O(sqrt(n)),
so the average number of successful if-conditions for loop counter j is ~log(j).

Run time Complexity

I believe that the following code is big theta of n^3; is this correct?
for (int i = 0; i < n; i++)
{   // A is an array of integers
    if (A[i] == 0) {
        for (int j = 0; j <= i; j++) {
            if (A[i] == 0) {
                for (int k = 0; k <= j; k++) {
                    A[i] = 1;
                }
            }
        }
    }
}
And that the following is big theta of n log(n):
for (int i = 1; i < n; i *= 2)
{
    func(i);
}

void func(int x) {
    if (x <= 1) return;
    func(x - 1);
}
because the for loop would run log(n) times, and func makes at most n recursive calls.
Thanks for the help!
Your intuition looks correct. Note that for the first snippet, if the input contains no zeros at all, the time complexity drops down to big-theta(n). If you remove the checks, it would definitely be big-theta(n^3).
For the second snippet, note that the loop calls func(1), func(2), func(4), ..., and func(x) makes about x recursive calls, so the total work is a geometric sum 1 + 2 + 4 + ... that is less than 2n; the tight bound is therefore Big-Theta(n), and O(n log n) is a valid but loose upper bound. The first snippet, meanwhile, is not Big-Theta(n^3). It is not even O(n^3)! The key observation is: for each i, the innermost loop will execute at most once.
Obviously, the worst case is when the array contains only zeros. However, A[i] will be set to 1 in the first pass of the innermost loop, and all subsequent checks of if (A[i] == 0) for the same i will evaluate to false, so the innermost loop will not run again until i increments. Therefore, the middle loop performs a total of 1 + 2 + 3 + ... + n = n * (n + 1) / 2 iterations, so the time complexity of the first snippet is O(n^2).
Hope this helps!
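For the skeptical, here is a counting sketch (my own, not from the original answer) that runs the first snippet on an all-zero array, the worst case: the middle-loop count matches n*(n+1)/2 exactly and the innermost loop fires only once per i, confirming the quadratic bound.
#include <iostream>
#include <vector>
using namespace std;

int main() {
    for (int n = 1000; n <= 8000; n *= 2) {
        vector<int> A(n, 0);     // worst case: all zeros
        long long middle = 0;    // iterations of the j loop
        long long innermost = 0; // iterations of the k loop
        for (int i = 0; i < n; i++) {
            if (A[i] == 0) {
                for (int j = 0; j <= i; j++) {
                    middle++;
                    if (A[i] == 0) {
                        for (int k = 0; k <= j; k++) {
                            innermost++;
                            A[i] = 1;
                        }
                    }
                }
            }
        }
        cout << "n=" << n << "  middle=" << middle
             << "  n*(n+1)/2=" << 1LL * n * (n + 1) / 2
             << "  innermost=" << innermost << "\n";
    }
}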

Show that the time complexity of this double loop is O(n)

How would you show that this entire sequence of loops is O(n)? Or is it even O(n)? At first glance, just looking at the double loop one might think it's O(n^2), but I don't think it is...
int i = 0;
int j = 0;
int arr[N];
int idx = 0;
for (i = 0; i < 2; i++)
{
    for (j = 0; j < N/2; j++)
    {
        idx = (i * N/2) + j;
        foo(arr[idx]);
    }
}
The outer loop runs a constant number of times (twice), so the loop is the same as doing:
for (j = 0; j < N/2; j++)
{
    idx = (0 * N/2) + j;
    foo(arr[idx]);
    idx = (1 * N/2) + j;
    foo(arr[idx]);
}
This also makes the intent of the code a lot clearer, and as you can see, the number of operations scales linearly with N, so it's O(N) complexity rather than quadratic growth. I can't explain it any better as it's 8.30am and I haven't slept, but I imagine you get my gist.
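To make the linear count explicit, here is a trivial sketch (my own; the counter standing in for the foo calls is an assumption about what is being measured). It shows the loop body executes exactly 2 * (N/2) = N times.
#include <iostream>
using namespace std;

int main() {
    const int N = 1000;  // illustrative size; any even N works
    long long calls = 0; // stands in for the calls to foo(arr[idx])

    for (int i = 0; i < 2; i++) {
        for (int j = 0; j < N / 2; j++) {
            int idx = (i * N / 2) + j; // visits each index 0..N-1 exactly once
            (void)idx;                 // idx is not otherwise needed here
            calls++;
        }
    }

    // 2 * (N/2) = N calls in total, i.e. linear in N.
    cout << "N=" << N << "  calls=" << calls << "\n";
}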
