Please help me by giving the time complexity analysis of the following function.
function(n)
{
    for (i = 1; i <= n; i++)
    {
        for (j = 1; j <= n; j += i)
        {
            print("*");
        }
    }
}
I feel it should be n + n*log(n).
A single loop is O(n); a loop nested inside another loop is O(n^2).
In this case, though, the inner loop steps j by i, so on the i-th pass of the outer loop it runs about n/i times. Summing over the outer iterations gives n/1 + n/2 + n/3 + ... + n/n; the number of iterations shrinks as i grows, and this harmonic sum comes to about n*log(n). On top of that there are the n iterations of the outer loop itself.
So the final answer is: n + n*log(n), which is O(n*log(n)).
I checked this reasoning with a simple test on a few values of n, and the counts match.
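As a quick empirical check (my own sketch, not part of the original answer), the following counts the actual number of prints and compares it with n + n*ln(n); the ratio should sit near 1 and drift toward it as n grows:

#include <math.h>
#include <stdio.h>

int main(void)
{
    for (int n = 100; n <= 100000; n *= 10) {
        long long count = 0;
        for (long long i = 1; i <= n; i++)          /* outer loop: n iterations */
            for (long long j = 1; j <= n; j += i)   /* inner loop: about n/i iterations */
                count++;                            /* stands in for print("*") */
        double estimate = n + n * log((double)n);   /* n + n*ln(n) */
        printf("n = %6d  actual = %8lld  n + n*ln(n) = %10.0f  ratio = %.3f\n",
               n, count, estimate, count / estimate);
    }
    return 0;
}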
I have a situation where I am looping N times and, on each of the N iterations, sorting M elements using merge sort. M can differ between iterations, i.e., it depends on which of the N iterations we're in.
I came up with O(n*m*log(m)), where n is the number of outer iterations and m is the average number of inner elements, but this doesn't sound right.
All you can say is n times the average of m_i*log(m_i), for which there is no simple formula. You could express this as n · m* · log(m*), where m* is the value that solves m* · log(m*) = avg(m_i*log(m_i)), but this is even less tractable.
As the function x*log(x) is concave upward (i.e., convex), m* will be somewhat above M := avg(m_i).
If the coefficient of variation of the m_i is small, you can use the decomposition m_i = M + δ_i and take the average of
    (M + δ_i)*(log(M) + log(1 + δ_i/M)) ≈ (M + δ_i)*(log(M) + δ_i/M).
When you average, the terms linear in δ_i cancel out (the δ_i average to zero by the definition of M), and what remains is of the order of M*log(M) + σ²/M, where σ² is the variance of the m_i. Hence the total is O(N*M*log(M) + N*σ²/M), which is O(N*M*log(M)).
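As a numerical sanity check (my own sketch; the inner sizes m_i below are made-up), the average of m_i*log(m_i) should sit within about σ²/M of M*log(M):

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* hypothetical inner sizes, mean M = 1000 */
    double m[] = { 800, 900, 1000, 1100, 1200 };
    int n = sizeof m / sizeof m[0];

    double M = 0.0;
    for (int i = 0; i < n; i++) M += m[i];
    M /= n;

    double avg_cost = 0.0, var = 0.0;
    for (int i = 0; i < n; i++) {
        avg_cost += m[i] * log(m[i]);       /* m_i * log(m_i) */
        var += (m[i] - M) * (m[i] - M);
    }
    avg_cost /= n;
    var /= n;                               /* sigma^2 */

    printf("avg(m_i log m_i) = %.2f\n", avg_cost);
    printf("M log M          = %.2f\n", M * log(M));
    printf("sigma^2 / M      = %.2f\n", var / M);   /* order of the gap */
    return 0;
}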
I have an algorithm with this order:
O((m^2)/n) + O(mn)
I want to know: is it equal to O(mn)? In other words, is O((m^2)/n) bigger than O(mn), or smaller?
You should just say the complexity is O(m^2/n + mn).
Let's see when they're equal:
(m^2)/n = mn
m^2 = m(n^2)
m = n^2
So, if m = n^2, they are equal;
when m > n^2, m^2/n is dominant;
when m < n^2, mn is dominant.
Thus neither term is always greater than the other, so we can't drop either one.
Dimensionally speaking, the two terms cannot be compared directly: if m and n are both measured in the same UNIT, then (m^2)/n is measured in UNIT while mn is measured in UNIT^2 (UNIT squared).
For example, if you take m = n, then m^2/n is just n while mn is n^2; so when m and n are of the same order of magnitude, the complexity is O(mn).
More generally, the comparison depends on how m relates to n^2:
if m < n^2, then mn dominates and the complexity is O(mn);
if m > n^2, then m^2/n dominates and the complexity is O(m^2/n).
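To make the crossover concrete, here is a small check (my own illustration, with arbitrary values) showing which term dominates on either side of m = n^2:

#include <stdio.h>

int main(void)
{
    long long n = 100;                         /* so n^2 = 10000 */
    long long ms[] = { 1000, 10000, 100000 };  /* below, at, above n^2 */
    for (int i = 0; i < 3; i++) {
        long long m = ms[i];
        printf("n = %lld, m = %6lld:  m^2/n = %9lld   mn = %9lld\n",
               n, m, m * m / n, m * n);
    }
    return 0;
}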
Here's the algorithm:
Let a = 30, i = 1
While i < n
    For j = i+1 to n
        If ABS[(j*i)+20] < a then a = ABS[(j*i)+20]
    i = i + 1
Return k
What's the number of steps this algorithm will take in the general case, where the input is of size n? How do you work that out?
Also, does this algorithm fall into the quadratic complexity class?
I think this is O(n^2).
We have
(n-1) + (n-2) + (n-3) + ... + 2 + 1   [one term for each value of i]
inner-loop iterations in total. If we calculate it, it comes to
0.5*(n^2 - n)
which is in the quadratic complexity class.
Let f(i) denote the number of times the inner for loop runs for a given i, with j going from i+1 to n. For example f(5) = n - 5, since j goes through 6, 7, ..., n. So we want f(1) + f(2) + f(3) + ... + f(n-1). Compute each f(i) and then sum them to see the exact answer.
In general there is an outer loop that runs at most n times, and for each pass the inner loop runs at most n times, so the complexity is upper bounded by O(n*n) = O(n^2).
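If you want to verify the closed form, here is a direct count (my own sketch, not part of the original answers) of the inner-loop executions against n(n-1)/2:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    for (int n = 10; n <= 1000; n *= 10) {
        long long steps = 0;
        int a = 30;
        for (int i = 1; i < n; i++)             /* While i < n */
            for (int j = i + 1; j <= n; j++) {  /* For j = i+1 to n */
                if (abs(j * i + 20) < a)
                    a = abs(j * i + 20);
                steps++;                        /* one inner-loop execution */
            }
        printf("n = %4d: steps = %7lld, n(n-1)/2 = %7lld\n",
               n, steps, (long long)n * (n - 1) / 2);
    }
    return 0;
}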
If I were a compiler, I would notice that this code only changes i, j, and a, which are local variables, and that the only variable whose value is subsequently used is k. So I would gradually optimize away everything but this:
Return k
and the computation would be constant time, just a few machine instructions; that is, of course, also within quadratic time.
There is an algorithm which has the time complexity
T(n) = T(n-1) + 1/n    if n > 1
T(n) = 1               otherwise
I am solving for its asymptotic complexity and getting order n, but the answer given is log n. Is that correct? If it is log n, then why?
It can easily be seen (or proven formally with induction) that T(n) is the sum of 1/k for k from 1 to n. This is the nth harmonic number, H_n = 1 + 1/2 + 1/3 + ... + 1/n.
Asymptotically, the harmonic numbers grow on the order of log(n), because the sum is close in value to the integral of 1/x from 1 to n, which is equal to the natural logarithm of n. In fact, H_n = ln(n) + γ + O(1/n), where γ is a constant (the Euler-Mascheroni constant, about 0.5772). From this it is easy to show that T(n) = Θ(log(n)).
For more details:
With H(N) = 1 + 1/2 + 1/3 + ... + 1/N,
the function x ↦ 1/x is decreasing, so for every k:
    integral from k to k+1 of dx/x  <=  1/k  <=  integral from k-1 to k of dx/x   (the upper bound for k >= 2)
Summing the left-hand inequality for k = 1 to N, and summing the right-hand inequality for k = 2 to N and adding 1 (the k = 1 term), we get:
    ln(N+1) <= H(N) <= 1 + ln(N)
This implies H(N)/ln(N) -> 1, hence H(N) = Θ(log(N)).
(from http://fr.wikipedia.org/wiki/S%C3%A9rie_harmonique#.C3.89quivalent_de_Hn)
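For a concrete check (my own sketch), you can unroll the recurrence numerically and watch T(n) track ln(n) + γ:

#include <math.h>
#include <stdio.h>

int main(void)
{
    const double euler_gamma = 0.5772156649;
    double t = 1.0;                     /* T(1) = 1 */
    for (int n = 2; n <= 1000000; n++) {
        t += 1.0 / n;                   /* T(n) = T(n-1) + 1/n */
        if (n % 200000 == 0)
            printf("n = %7d  T(n) = %.6f  ln(n) + gamma = %.6f\n",
                   n, t, log((double)n) + euler_gamma);
    }
    return 0;
}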
How can you prove that n^2 is not O(n*log(n)), using only the definition of O()?
You can prove it by contradiction. Assume that n^2 is O(n*log(n)). By definition, this means there is a fixed, finite real number c such that
n^2 <= c * n * log(n)
for every n bigger than some finite number n0.
Dividing both sides by n*log(n), you arrive at c >= n/log(n). As n -> INF, n/log(n) -> INF, so c would have to be infinite, which is obviously impossible.
So you conclude that n^2 is not O(n*log(n)).
You want to calculate the limit of
(n * log(n)) / (n^2) = log(n) / n
as n approaches infinity. The limit is 0, because log(n) grows more slowly than n. Since the ratio tends to 0, n*log(n) grows strictly more slowly than n^2, and therefore n^2 cannot be O(n*log(n)).
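To see why no constant c can work, here is a small illustration (my own, not from the answers) of how fast n^2 / (n*log(n)) = n/log(n) grows:

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* any candidate constant c would have to exceed n/log(n) for all large n */
    for (double n = 10; n <= 1e9; n *= 100)
        printf("n = %12.0f   n/log(n) = %12.1f\n", n, n / log(n));
    return 0;
}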