Asymptotic running time of loop

function func5(n)
    s = 0;
    for i = 1 to 3n^2 do
        for j = 1 to floor(2n^3/i) do
            s = s + i - j;
    return(s);
What is the asymptotic running time of the above algorithm, in Theta notation?

Expressing this in summation form:

    T(n) = Sum_{i=1}^{3n^2} floor(2n^3/i) ≈ 2n^3 * Sum_{i=1}^{3n^2} 1/i

The summation term is a harmonic series. From the Wikipedia page it can be approximated as:

    H_m = Sum_{i=1}^{m} 1/i ≈ ln(m) + γ

where γ is the Euler-Mascheroni constant (≈ 0.577). Therefore the overall complexity is:

    T(n) ≈ 2n^3 * (ln(3n^2) + γ) = Θ(n^3 log n)
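As a sanity check (a sketch, not part of the original answer; the class and method names are made up), the Java snippet below counts the actual inner-loop iterations of func5 and compares them with the predicted 2n^3 * (ln(3n^2) + γ):

```java
// Count the inner-loop iterations of func5 exactly and compare against
// the harmonic-series prediction 2n^3 * (ln(3n^2) + gamma).
public class Func5Count {
    static final double GAMMA = 0.5772156649; // Euler-Mascheroni constant

    static long iterations(int n) {
        long count = 0;
        for (long i = 1; i <= 3L * n * n; i++) {
            count += (2L * n * n * n) / i;   // floor(2n^3 / i) inner iterations
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 20, 40}) {
            long actual = iterations(n);
            double predicted = 2.0 * n * n * n * (Math.log(3.0 * n * n) + GAMMA);
            System.out.printf("n=%d actual=%d predicted=%.0f ratio=%.3f%n",
                    n, actual, predicted, actual / predicted);
        }
    }
}
```

The ratio approaches 1 as n grows, which supports the Θ(n^3 log n) bound.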

Related

Precise Θ notation bound for the running time as a function

I'm studying for an exam, and I've come across the following question:
Provide a precise (Θ notation) bound for the running time as a
function of n for the following function
for i = 1 to n {
    j = i
    while j < n {
        j = j + 4
    }
}
I believe the answer would be O(n^2). I'm certainly an amateur at the subject, but my reasoning is that the outer loop takes O(n) and the inner loop takes O(n/4), resulting in O(n^2/4); since the constant can be dropped, it simplifies to O(n^2).
Any clarification would be appreciated.
If you proceed using Sigma notation, and obtain T(n) equals something, then you get Big Theta.
If T(n) is less or equal, then it's Big O.
If T(n) is greater or equal, then it's Big Omega.
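Following the Sigma-notation suggestion: the inner while loop runs ceil((n-i)/4) times for each i, so the total is Sum_{i=1}^{n} ceil((n-i)/4) ≈ n^2/8, which is Θ(n^2). A quick empirical check (a hypothetical helper, names are illustrative):

```java
// Count the exact number of inner-loop iterations of the nested loops
// above and compare with the predicted n^2 / 8.
public class QuarterLoop {
    static long iterations(int n) {
        long count = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = i; j < n; j += 4) {
                count++;                      // one inner-loop iteration
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 400;
        System.out.println(iterations(n) + " vs " + (n * (double) n / 8));
    }
}
```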

Big-Oh and theta notation of a specific function... Running time

If evaluating f(n) is Θ(n):
i = 1;
sum = 0;
while (i <= n) do
    if (f(i) > k) then
        sum += f(i);
    i = 2*i;
Would the running time of this be O(n^3) because of the n times the functions are possibly being called or would it be O(n)? Or is it something in terms of theta since that's the information we know? I am very lost on this...
The i variable doubles each time, so it reaches n after log2(n) iterations.
The evaluation of f is done log2(n) times; if each evaluation costs O(n), the time complexity is O(n log n).
In fact, if computing f(i) has complexity O(i), then the time complexity is:
1 + 2 + 4 + ... + 2^(log2(n)) <= 2n (a geometric series with about log2(n) terms) => O(n)
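That geometric series can be verified directly (a sketch with made-up names, assuming evaluating f(i) costs about i steps):

```java
// Sum the assumed Θ(i) cost of evaluating f(i) as i doubles up to n.
// The total stays below 2n, confirming the O(n) bound.
public class DoublingCost {
    static long totalCost(long n) {
        long cost = 0;
        for (long i = 1; i <= n; i *= 2) {
            cost += i;                 // evaluating f(i) costs ~i
        }
        return cost;
    }

    public static void main(String[] args) {
        long n = 1L << 20;
        System.out.println(totalCost(n) + " <= " + 2 * n);
    }
}
```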

∑ Log(i) = big theta(f(n))?

Given S(n) = ∑ Log(i) such that sigma runs from i=1 to n
What is a simple function f(n) so that the sum S(n) is in big Theta of f(n)?
I am thinking about f(n) = log(log n), because I believe it lies between the first term of the summation, which is log 1 = 0, and the last term, which is log n.
Hence it would satisfy the definition of Big Theta.
Is this right? Otherwise, please help.
Draw a picture and convince yourself of the following.
The integral from 1 to N of log(x) is <= ∑ Log(i), which is <= the integral from 1 to N+1 of log(x).
Since the integral from 1 to M of log(x) is M log(M) - M + 1, this gives N log(N) - N + 1 <= ∑ Log(i) <= (N+1) log(N+1) - (N+1) + 1.
Both bounds are big Theta of N log(N).
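The integral bounds are easy to check numerically (a sketch; the helper names are made up):

```java
// Check: integral_1^N ln x dx <= Sum_{i=1}^{N} ln i <= integral_1^{N+1} ln x dx,
// where integral_1^M ln x dx = M ln M - M + 1.
public class LogSumBounds {
    static double logSum(int n) {
        double s = 0;
        for (int i = 1; i <= n; i++) s += Math.log(i);
        return s;
    }

    static double integral(double m) {   // integral of ln x from 1 to m
        return m * Math.log(m) - m + 1;
    }

    public static void main(String[] args) {
        int n = 1000;
        System.out.println(integral(n) + " <= " + logSum(n) + " <= " + integral(n + 1));
    }
}
```

Both bounds grow like N ln N, which is why the sum is Θ(N log N).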

What is the Big-O of this Code?

I thought the Big-O of this code would be n^3, but the output (the final count) does not even come close to matching my estimate:
int bigO(int[] myArray, int x) {
    int count = 0;
    for (int i = 0; i < x; i++)
        for (int j = i + 1; j < x; j++)
            for (int k = j + 1; k < x; k++) {
                System.out.println(myArray[i] + ", " + myArray[j] + ", " + myArray[k]);
                count++;
            }
    return count;
}
My Apologies, I should have "x" instead of "n"
That's because your function does not perform exactly n^3 operations.
Actually, it does f(n) = (1/6)*n^3 - (1/2)*n^2 + (1/3)*n operations (found it using polynomial fitting).
But, by the definition, f(n) is O(n^3). The intuition behind this is:
(1/6)*n^3 is the dominant factor
(1/6)*n^3 grows within a constant factor of n^3.
Here's a static analysis of your code. Because the loops all have different iteration ranges, it's best to start with the innermost loop and work your way outwards.
The innermost for loop has n-j-1 iterations.
So if you look at the two inner loops, you have Sum (n-j-1) iterations (for j in the interval [i+1; n-1]). That is (n-(i+1)-1) + (n-(i+2)-1) + ... + (n-(n-1)-1) iterations, which equals (n-i-2) + (n-i-3) + ... + 1 + 0 — an arithmetic series with sum (n-i-2)*(n-i-1)/2.
Now loop over the outer loop: Sum (n-i-2)*(n-i-1)/2 iterations (for i in the interval [0; n-1]). This equals 1/2*Sum(i^2) + (-n+3/2)*Sum(i) + (n^2/2 - 3n/2 + 1)*Sum(1). These sums are easy to calculate, and after a bit of rearranging you get n^3/6 - n^2/2 + n/3, the same formula as @JuanLopes's.
Since n^3/6 - n^2/2 + n/3 = O(n^3), your function is O(n^3), but your code doesn't perform exactly n^3 iterations. The dominant term is n^3/6, and you will see about that many iterations.
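Both derivations can be checked by comparing the actual loop count with the closed form n^3/6 - n^2/2 + n/3 = C(n, 3), the number of 3-element subsets of n items (a sketch; the printing is omitted and the names are made up):

```java
// Count the triple-loop iterations (without the printing) and compare
// with n*(n-1)*(n-2)/6, i.e. n^3/6 - n^2/2 + n/3.
public class TripleLoopCount {
    static long count(int n) {
        long c = 0;
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n; j++)
                for (int k = j + 1; k < n; k++)
                    c++;
        return c;
    }

    static long closedForm(long n) {
        return n * (n - 1) * (n - 2) / 6;   // = n^3/6 - n^2/2 + n/3
    }

    public static void main(String[] args) {
        System.out.println(count(10) + " == " + closedForm(10));
    }
}
```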
Big-O notation is not a per se feature of an algorithm! It describes how the running time (or space) grows with respect to the size of the input. Define the "size" of your input and you can compute the Big-O complexity of it.
As soon as you change the definition of the "size" of your input, you get totally different complexities.
Example:
An algorithm that applies a Gaussian filter to a set of images of size X*Y:
With respect to the number of images, the algorithm runs in linear time.
With respect to the global number of pixels to process, the algorithm is quadratic.
So the answer is: you didn't define your N :-)

Calculating execution time of an algorithm

I have this algorithm:
S(n)
    if n = 1 then return(0)
    else
        S(n/3)
        x <- 0
        while x <= 3n^3 do
            x <- x + 3
        S(n/3)
Is 2 * T(n/3) + n^3 the recurrence relation?
Is T(n) = O(n^3) the execution time?
The recurrence expression is correct. The time complexity of the algorithm is O(n^3).
The recurrence stops at T(1).
Running an example for n = 27 helps derive a general expression:
T(n) = 2*T(n/3)+n^3 =
= 2*(2*T(n/9)+(n/3)^3)+n^3 =
= 2*(2*(2*T(n/27)+(n/9)^3)+(n/3)^3)+n^3 =
= ... =
= 2*(2*2*T(n/27)+2*(n/9)^3+(n/3)^3)+n^3 =
= 2*2*2*T(n/27)+2*2*(n/9)^3+2*(n/3)^3+n^3
From this example we can see that the general expression is given by:

    T(n) = 2^(log_3(n)) * T(1) + Sum_{j=0}^{log_3(n)-1} 2^j * (n/3^j)^3

Which is equivalent to:

    T(n) = 2^(log_3(n)) * T(1) + n^3 * Sum_{j=0}^{log_3(n)-1} (2/27)^j

Which, in turn, can be solved to the following closed form:

    T(n) = 2^(log_3(n)) * T(1) + (27/25) * n^3 * (1 - (2/27)^(log_3(n)))
The dominating term in this expression is (27/25)*n^3 (note that 2^(log_3(n)) = n^(log_3(2)) ≈ n^0.63, which is O(n)), thus the recurrence is O(n^3).
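The closed form can be cross-checked numerically for powers of 3 (a sketch with made-up names; T(1) is taken as 0, matching return(0), so the 2^(log_3(n)) * T(1) term vanishes):

```java
// Unroll T(n) = 2T(n/3) + n^3, T(1) = 0, and compare with the closed
// form (27/25) * n^3 * (1 - (2/27)^(log_3 n)) for n a power of 3.
public class RecurrenceCheck {
    static long t(long n) {
        if (n == 1) return 0;
        return 2 * t(n / 3) + n * n * n;
    }

    static double closedForm(long n) {
        double k = Math.log(n) / Math.log(3);        // log_3(n)
        return 27.0 / 25.0 * n * n * n * (1 - Math.pow(2.0 / 27.0, k));
    }

    public static void main(String[] args) {
        long n = 27;
        System.out.println(t(n) + " vs " + closedForm(n));
    }
}
```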
2 * T(n/3) + n^3
Yes, I think this is a correct recurrence relation.
Time complexity:
while x<= 3n^3 do
x <- x+3
This has a time complexity of O(n^3). Also, at each step the function calls itself twice with n/3. So the problem sizes are
n, n/3, n/9, ...
and the total work, adding up each depth, is
n^3 + (2/27)*n^3 + (4/729)*n^3 + ...
This series is bounded by k*n^3 where k is a constant.
Proof: if the series is bounded by a geometric series with ratio 1/2, the sum
becomes 2*n^3. The actual ratio between consecutive terms is 2/27, which is
less than 1/2, hence the upper bound 2*n^3 holds.
So in my opinion the complexity = O(n^3).
