Sum of proper divisors up to N - number-theory

I have to find the sum of proper divisors for every number up to N (N <= 20000000). I precompute this function with an O(N * log(N)) sieve, which takes about 4 seconds. How can I optimize it? Any alternative solution would be greatly appreciated.
Example: for N = 10 the answer is 86; for N = 1000 the answer is 823080.
typedef long long LL;
const LL m = 20000000;        // m = N from the problem statement
LL ans[m + 1];                // divisor sums, then prefix sums

void f(){
    // Sieve: add every i to each of its multiples j.
    for(LL i = 1; i <= m; i++){
        for(LL j = i; j <= m; j += i){
            ans[j] += i;
        }
    }
    // Prefix sums: ans[i] becomes the cumulative answer for 1..i.
    for(LL i = 2; i <= m; i++) ans[i] += ans[i - 1];
}
I also tried prime factorization with the divisor-sum formula, but it takes more time than the algorithm above.

I can't edit my previous comment, so just to note: the problem referenced in that comment is slightly different, but with a minor adjustment it also answers this question.
Just multiply by the divisor itself inside the loop.
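
One way to read that hint, if a single value of N suffices rather than the whole prefix table: the cumulative divisor sum satisfies sum_{i=1..N} sigma(i) = sum_{d=1..N} d * floor(N/d), and grouping the divisors d that share the same quotient floor(N/d) evaluates this in O(sqrt(N)). A minimal sketch (the function name divisorSumUpTo is made up for illustration):

#include <stdio.h>
typedef long long LL;

// Sum of sigma(i) for i = 1..n in O(sqrt(n)) via the identity
// sum_{i<=n} sigma(i) = sum_{d<=n} d * floor(n/d),
// grouping all divisors d that share the same quotient floor(n/d).
LL divisorSumUpTo(LL n){
    LL total = 0;
    LL d = 1;
    while(d <= n){
        LL q = n / d;          // floor(n/d) is constant on the block [d, last]
        LL last = n / q;
        // q times the sum of the arithmetic progression d + ... + last
        total += q * ((d + last) * (last - d + 1) / 2);
        d = last + 1;
    }
    return total;
}

int main(){
    // Counts every divisor of every i, including i itself; subtract
    // N*(N+1)/2 if "proper" is meant to exclude the number itself.
    printf("%lld\n", divisorSumUpTo(1000));
    return 0;
}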

Related

How to calculate time complexity for this algorithm?

I get a series of roots from 0 to (n-1), but I don't understand how to count them. Am I thinking in the wrong direction?
for (auto i = 0; i < n; ++i) {
    int j = 0;
    while (j * j < i) {   // runs about sqrt(i) times
        ++j;
    }
}
The sum sqrt(1) + ... + sqrt(n) can be viewed as the generalized harmonic number of order -1/2 of n.
The Euler-Maclaurin formula gives the sum as approximately
(2/3) * n^(3/2) + (1/2) * sqrt(n) + C + O(1/sqrt(n))
for some constant C. It is easy to see that this is bounded by O(n*sqrt(n)).
You can learn more about it here.
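
A quick empirical sanity check of that bound (a sketch I'm adding; it just counts how many times ++j executes and compares the count with the leading term (2/3) * n^(3/2)):

#include <stdio.h>
#include <math.h>

int main(){
    // Count the inner-loop iterations of the snippet above and compare
    // them with the leading Euler-Maclaurin term (2/3) * n^(3/2).
    for(long long n = 1000; n <= 1000000; n *= 10){
        long long count = 0;
        for(long long i = 0; i < n; ++i)
            for(long long j = 0; j * j < i; ++j)
                ++count;
        printf("n = %7lld  count = %12lld  (2/3)*n^1.5 = %.0f\n",
               n, count, 2.0 / 3.0 * pow((double)n, 1.5));
    }
    return 0;
}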

Time complexity for a for loop with an if-else block

I want to find the time complexity of the code below. Here's my understanding:
The outer for loop runs 2n times. In the case when i == n, we enter the if block, where the nested for loops have complexity O(n^2); counting the outer loop, that would make the whole block O(n^3).
In the case when i != n, the else branch has complexity O(n), and with the outer O(n) loop that gives O(n^2).
Am I correct, or am I missing something here?
for (int i = 0; i < 2*n; i++)
{
    if (i == n)
    {
        for (int j = 0; j < i; j++)
            for (int k = 0; k < i; k++)
                O(1)
    }
    else
    {
        for (int j = 0; j < i; j++)
            O(1)
    }
}
No.
The question is: what is T(n)?
What you are saying is "if i == n, then O(n^3), else O(n^2)".
But there is no i in the question, only n.
Think of a similar question:
"During a week, Pete works 10 hours on Wednesday, and 1 hour on every other day, what is the total time Pete works in a week?".
You don't really answer "if the day is Wednesday, then X, otherwise Y".
Your answer has to include the work time on Wednesday and on every other day as well.
Back in your original question, Wednesday is the case when i=n, and all other days are the case when i!=n.
We have to sum them all up to find the answer.
This is a question of how many times the O(1) body is executed in total. The time complexity is a function of n, not i; that is, "how many times is the O(1) body executed for a given n?"
There is one run of an O(n^2) loop, when i == n.
There are (2n - 2) instances of the O(n) loop in all other cases.
Therefore, the time complexity is O((2n - 2) * n + 1 * n^2) = O(3n^2 - 2n) = O(n^2).
I've written a C program to print the first few values of n^2, the actual count, 3n^2 - 2n, and n^3 to illustrate the difference:
#include <stdio.h>

int count(int n){
    int ctr = 0;
    for (int i = 0; i < 2*n; i++){
        if (i == n){
            for (int j = 0; j < i; j++)
                for (int k = 0; k < i; k++)
                    ctr++;
        } else {
            for (int j = 0; j < i; j++)
                ctr++;
        }
    }
    return ctr;
}

int main(){
    for (int i = 1; i <= 20; i++){
        printf(
            "%d\t%d\t%d\t%d\n",
            i*i, count(i), 3*i*i - 2*i, i*i*i
        );
    }
}
(You can paste the output into Excel to plot the values.)
The first loop is repeated 2*n times:
for (int i = 0; i < 2*n; i++)
{
    // some code
}
This part occurs just once, when i == n, and its time complexity is O(n^2):
if (i == n)
{
    for (int j = 0; j < i; j++)
        for (int k = 0; k < i; k++)
            O(1)
}
And this part depends on i:
else
{
    for (int j = 0; j < i; j++)
        O(1)
}
Consider how many times this loop runs:
i = 0: the loop is repeated 0 times
i = 1: the loop is repeated 1 time
i = 2: the loop is repeated 2 times
.
.
i = 2n-1: the loop is repeated 2n-1 times.
So over all iterations the loop runs 0 + 1 + ... + (2n-1) = ((2n-1) * 2n) / 2 times in total, minus the n iterations skipped when i == n, which is still O(n^2).
Now we sum all of these parts: O(n^2) (first part) + O(n^2) (second part). Because the first part occurs only once, the total is not O(n^3). The time complexity is O(n^2).
Based on @Gassa's answer, let's sum it all up:
O(n^2) + O((2n)^2) = O(n^2) + O(4n^2) = O(n^2) + 4*O(n^2) = O(n^2)
The single i == n pass contributes the first n^2; the remaining 2n passes contribute at most 2n each, giving the 4n^2. Big O notation allows us to throw out the constant factor 4.

Complexity analysis of while(key <= N*N) loop using Big-Oh

I'm kind of confused about how to conclude the Big-O notation for this while loop, where N is the input size:
int array[N][N] = {0};   // assuming N is a compile-time constant
array[0][(N-1)/2] = 1;   // start in the middle of the top row
int key = 2, k, l;
int i = 0;
int j = (N-1)/2;
while(key <= N*N)
{
    // wrap one step up...
    if(i <= 0)
        k = N-1;
    else
        k = i-1;
    // ...and one step left
    if(j <= 0)
        l = N-1;
    else
        l = j-1;
    if(array[k][l])
        i = (i+1)%N;     // cell already filled: move down instead
    else
    {
        i = k;
        j = l;
    }
    array[i][j] = key;   // place the next value
    key++;
}
I concluded it is O(N^2), because when N = 5 the loop iterates until key reaches N*N, i.e. 5*5 = 25 times. But I'm still confused by the rest of the code inside the loop, and would really appreciate a step-by-step explanation. This loop is just part of a bigger function with 4 more loops, which I understood, but not this one.
What you should actually care about is how key changes. It grows by one in each iteration, there are no shortcuts, and everything else in the loop body is O(1) work.
So it's just O(N^2).
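
To see this concretely, here is a runnable sketch (the code appears to be the Siamese method for filling an odd-order magic square, though that reading is my own): it counts the loop iterations and confirms there are exactly N*N - 1 of them, one per value of key, each doing constant work.

#include <stdio.h>
#define N 5   // assumed odd for this sketch

int main(){
    int array[N][N] = {0};
    array[0][(N-1)/2] = 1;
    int key = 2, k, l;
    int i = 0, j = (N-1)/2;
    int iterations = 0;
    while(key <= N*N){
        k = (i <= 0) ? N-1 : i-1;   // one step up, wrapping
        l = (j <= 0) ? N-1 : j-1;   // one step left, wrapping
        if(array[k][l])
            i = (i+1)%N;            // occupied: move down
        else {
            i = k;
            j = l;
        }
        array[i][j] = key;
        key++;
        iterations++;               // constant work per pass
    }
    printf("iterations = %d (N*N - 1 = %d)\n", iterations, N*N - 1);
    for(int r = 0; r < N; r++){
        for(int c = 0; c < N; c++)
            printf("%3d", array[r][c]);
        printf("\n");
    }
    return 0;
}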

Big-O for-loop runtime analysis [duplicate]

This question already has answers here:
Big O, how do you calculate/approximate it?
(24 answers)
Closed 8 years ago.
int n = 500;
for(int i = 0; i < n; i++)
    for(int j = 0; j < i; j++)
        sum++;
My guess is this is simply O(N^2), but the j < i is giving me doubts.
int n = 500;
for(int i = 0; i < n; i++)
    for(int j = 0; j < i*i; j++)
        sum++;
Seems like an O(N^3)
int n = 500;
for(int i = 0; i < n; i++)
    for(int j = 0; j < i*i; j++)
        if( j % i == 0 )
            for( k = 0; k < j; k++ )
                sum++;
O(N^5)?
So in each loop j has a different bound. If it were j < n*n, it would have been more straightforward, but this one is tricky, so please help. Thanks.
In the first case sum++ executes 0 + 1 + ... + (n-1) times. If you apply the arithmetic progression formula, you'll get n(n-1)/2, which is O(n^2).
In the second case we'll have 0 + 1 + 4 + 9 + ... + (n-1)^2, which is the sum of squares of the first n-1 numbers, and there's a formula for it: (n-1)n(2n-1)/6, which is O(n^3).
The last one is interesting. You can see, actually, that the innermost for loop is entered only when j is a multiple of i, so you can rewrite the program as follows:
int n = 500;
for(int i = 0; i < n; i++) {
    for(int m = 0; m < i; m++) {
        int j = m * i;           // the multiples of i below i*i
        for(int k = 0; k < j; k++)
            sum++;
    }
}
It's easier to work with math notation:
sum_{i=0}^{n-1} sum_{m=0}^{i-1} m*i = sum_{i=0}^{n-1} i^2 * (i-1) / 2 = Theta(n^4)
The formula is derived from the code by analysis: we can see that sum++ gets called j = m*i times in the innermost loop, which is called i times, which is called n times. In the end, the problem boils down to a sum of cubes of the first n numbers (divided by two) plus lower-order terms, which do not affect the asymptotics.
One final note: it looks obvious, but I'd like to show that, in general, the sum of the d-th powers of the first N natural numbers is Ω(N^(d+1)) (see Wikipedia for Big-Omega notation); that is, it grows no slower than that function. One quick way to see it: the last N/2 terms are each at least (N/2)^d, so the sum is at least (N/2)^(d+1) = N^(d+1)/2^(d+1). You can apply similar reasoning to prove the stronger statement that the sum is Θ(N^(d+1)), which combines both Ω and O.
You are right for all but the last one, which has a tighter bound of O(n^4): note that the innermost for loop is only executed when j is a multiple of i. There are x / i multiples of i less than or equal to x, and i * i / i = i. So the innermost loop is only executed for i values of j out of the i * i.
Note that big-oh gives an upper bound, so i*i vs n*n makes little difference. Strictly speaking, saying they are all O(n^2015) is also correct (because that is a valid upper bound), but it's hardly helpful, so in practice a tight bound is usually used.
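
A quick empirical check of that tighter bound (a sketch; countOps is a made-up helper name, and it skips i = 0 so the j % i test never divides by zero; in the original snippet the j loop never runs for i = 0 anyway, since j < 0*0 is immediately false):

#include <stdio.h>

// Counts how many times sum++ runs in the third snippet for a given n.
long long countOps(int n){
    long long sum = 0;
    for(int i = 1; i < n; i++)   // i = 0 contributes nothing (j < 0*0)
        for(long long j = 0; j < (long long)i * i; j++)
            if(j % i == 0)
                for(long long k = 0; k < j; k++)
                    sum++;
    return sum;
}

int main(){
    // The ratio count / n^4 settles near 1/8, consistent with Theta(n^4).
    for(int n = 10; n <= 160; n *= 2){
        long long c = countOps(n);
        double n4 = (double)n * n * n * n;
        printf("n = %3d  count = %10lld  count/n^4 = %.4f\n", n, c, c / n4);
    }
    return 0;
}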
IVlad already gave the correct answer.
I think what confuses you is the "Big Oh" definition.
N^2 is O(N^2).
(1/2)N^2 is O(N^2).
(1/2)N^2 + c*N + b is also O(N^2), given that c and b are constants independent of N.
Check the Big-Oh definition here.

Running time of algorithm does not match reality

I have the following algorithm:
int sum = 0;
for( i=0 ; i<n ; i++ )
    for( j=i ; j<n ; j++ )
        for( k=0 ; k<j ; k++ )
            for( h=0 ; h<i ; h++ )
                sum ++;
I analyzed this algorithm as follows:
the outer loop on i runs at most n times,
the loop on j goes from i to n, so it also runs at most n times,
and applying the same bound to the two remaining loops gives 4 nested loops of at most n iterations each, so the running time would be O(n^4).
But when I run this code for different input sizes, the measured counts grow much closer to n^3 than to n^4. Can anyone explain why this happens, or what is wrong with my analysis that makes the bound so loose?
Formally, you may proceed as follows, using Sigma notation, to obtain the order of growth of your algorithm:
T(n) = sum_{i=0}^{n-1} sum_{j=i}^{n-1} sum_{k=0}^{j-1} sum_{h=0}^{i-1} 1
     = ( (n(n-1)/2)^2 + (n-1)n(2n-1)/6 ) / 2
     = n^4/8 + lower-order terms
So the algorithm is Theta(n^4), but with the small leading constant 1/8, which is why measurements at small n can look deceptively close to n^3. Moreover, the closed form gives the exact number of iterations executed by the innermost statement:
int sum = 0;
for( i=0 ; i<n ; i++ )
    for( j=i ; j<n ; j++ )
        for( k=0 ; k<j ; k++ )
            for( h=0 ; h<i ; h++ )
                sum ++;
printf("\nsum = %d", sum);
For n = 10 the formula above gives T(10) = 1155, and the program indeed prints sum = 1155.
I'm sure there's a conceptual way to see why, but you can prove by induction that the variant below executes (n + 2) * (n + 1) * n * (n - 1) / 24 increments. Proof left to the reader.
In other words, it is indeed O(n^4).
Edit: your count increases more often than this formula predicts, because your innermost loop starts h at 0 rather than at k. Simply try this code to count the number of increments:
public class CountLoops {
    public static void main(String[] args) {
        for (int n = 0; n < 30; n++) {
            int sum = 0;
            for (int i = 0; i < n; i++) {
                for (int j = i; j < n; j++) {
                    for (int k = 0; k < j; k++) {
                        for (int h = k; h < i; h++) {  // h starts at k, not 0
                            sum++;
                        }
                    }
                }
            }
            System.out.println(n + ": " + sum + " = "
                    + (n + 2) * (n + 1) * n * (n - 1) / 24);
        }
    }
}
You have a rather complex algorithm. The number of operations is clearly less than n^4, but it isn't at all obvious how much less than n^4, or whether it is O(n^3) or not.
Checking the values n = 1 to 9 and making a guess based on the results is rather pointless.
To get a slightly better idea, assume that the number of steps is either c * n^3 or d * n^4, and make a table of the values c and d for 1 <= n <= 1,000. That might give you a better idea. It's not a foolproof method; there are algorithms that change their behaviour dramatically well past n = 1,000.
The best method is, of course, a proof. Just remember that O(n^4) doesn't mean "approximately n^4 operations"; it means "at most c * n^4 operations, for some constant c". Sometimes c is small.
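
A sketch of that table-building idea (my addition; the shortcut relies on the fact that, for fixed i and j, the two innermost loops together execute exactly j * i increments, so the exact count can be accumulated without running them):

#include <stdio.h>

// Exact sum++ count of the quadruple loop: for fixed i and j, the two
// innermost loops (k and h) together contribute exactly j * i increments.
long long T(int n){
    long long sum = 0;
    for(int i = 0; i < n; i++)
        for(int j = i; j < n; j++)
            sum += (long long)j * i;
    return sum;
}

int main(){
    // If T(n) were c * n^3, the first ratio would level off; instead it
    // keeps growing, while T(n) / n^4 settles toward the constant 1/8.
    for(int n = 50; n <= 1600; n *= 2){
        long long s = T(n);
        double n3 = (double)n * n * n, n4 = n3 * n;
        printf("n = %5d  T/n^3 = %9.3f  T/n^4 = %.5f\n", n, s / n3, s / n4);
    }
    return 0;
}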
