Suppose I have a m x n array and want to perform an operation on just its perimeter as follows:
for (let i = 0; i < m; i++) {
    for (let j = 0; j < n; j++) {
        if (i * j === 0 || i === m - 1 || j === n - 1) {
            // some function that has O(m * n) time complexity
        }
    }
}
My calculation for the (worst case) time complexity is:
O(top + right + bottom + left)
= O(2*top + 2*right)
= O(2*top) + O(2*right)
= O(top) + O(right)
= O(n * m * n) + O(m * m * n)
= O(m * n^2) + O(n * m^2)
Is this correct, or am I missing something?
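As a quick sanity check on the top/right/bottom/left breakdown, here is a small counting sketch (a Java rendering of the loop above; m = 5 and n = 7 are arbitrary example dimensions, and the expensive inner work is replaced by a counter). For m, n >= 2 it confirms that the condition selects exactly 2m + 2n - 4 perimeter cells:
// Counting sketch: tallies how many (i, j) pairs satisfy the perimeter
// condition, with the O(m * n) inner work stubbed out by a counter.
public class PerimeterCount {
    public static void main(String[] args) {
        int m = 5, n = 7;                  // arbitrary example dimensions
        long cells = 0;
        for (int i = 0; i < m; i++) {
            for (int j = 0; j < n; j++) {
                if (i * j == 0 || i == m - 1 || j == n - 1) {
                    cells++;               // stand-in for the O(m * n) work
                }
            }
        }
        // For m, n >= 2 this prints 2*m + 2*n - 4 (here: 20).
        System.out.println(cells + " perimeter cells, expected " + (2 * m + 2 * n - 4));
    }
}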
Related
Why is the time complexity O(n*n!) and not o(n^n) for this program?
void perm(String str, String prefix) {
    if (str.length() == 0) {
        System.out.println(prefix);
    } else {
        for (int i = 0; i < str.length(); i++) {
            String rem = str.substring(0, i) + str.substring(i + 1);
            perm(rem, prefix + str.charAt(i));
        }
    }
}
It is both. We have
n! = 1 * 2 * ... * n <= 1 * n * ... * n = n^(n-1)
so n * n! <= n^n, i.e. n * n! = O(n^n). For little o, things look a bit different, because we have to prove that for any constant factor c there is an n0 such that c * n * n! < n^n for all n >= n0.
But this isn't complicated either. Pick an arbitrary c; then (for n >= 3):
(n * n!)/(n^n) = n!/n^(n-1) = 2/n * 3/n * ... * (n-1)/n * (n/n) <= 2/n,
since every factor after 2/n is at most 1.
So c * n * n! < n^n as soon as 2/n < 1/c, i.e. for all n > 2c. Thus n * n! = o(n^n), and your algorithm is both O(n * n!) and o(n^n).
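As a numeric illustration (not a proof), the sketch below computes the ratio (n * n!) / n^n in doubles to avoid overflow; for n >= 3 it stays at or below 2/n and in fact shrinks much faster:
// Sanity check: (n * n!) / n^n, built up as a product of factors k/n so the
// intermediate values never overflow.
public class LittleOCheck {
    public static void main(String[] args) {
        for (int n = 2; n <= 30; n += 4) {
            double ratio = 1.0;
            for (int k = 1; k <= n; k++) {
                ratio *= (double) k / n;   // after the loop: n! / n^n
            }
            ratio *= n;                    // now (n * n!) / n^n
            System.out.printf("n = %2d   (n * n!)/n^n = %.6f   2/n = %.6f%n",
                    n, ratio, 2.0 / n);
        }
    }
}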
The answer is that the string gets shorter.
I will assume the following program, without the System.out.println:
void perm(String str, String prefix) {
    if (str.length() == 0) {
        // base case: nothing left to permute (the println has been removed)
    } else {
        for (int i = 0; i < str.length(); i++) {
            String rem = str.substring(0, i) + str.substring(i + 1);
            perm(rem, prefix + str.charAt(i));
        }
    }
}
Let's assume the string has zero characters. Then the runtime is just the if check, which is O(1) = O(c).
For a string of length n the runtime satisfies, abusing notation a bit, T(n) = O(c1 + n * (c2 * n + T(n - 1))), where c2 * n covers the substring and concatenation work of one loop iteration. This recurrence solves to O(n!), which is in particular also O(n * n!) and O(n^n), since n! <= n * n! <= n^n. In little-o terms, however, these bounds do differ.
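To make that concrete, here is a small sketch that evaluates the recurrence T(n) = c1 + n * (c2 * n + T(n - 1)) numerically, with both constants set to 1 purely for illustration; dividing by n! shows the ratio settling towards a constant, which is what the O(n!) claim for the print-free version amounts to:
// Evaluates T(n) = c1 + n * (c2 * n + T(n - 1)), T(0) = c1, with c1 = c2 = 1
// (illustrative constants only), and compares it to n!.
public class PermRecurrence {
    public static void main(String[] args) {
        double t = 1.0;                    // T(0)
        double factorial = 1.0;            // 0!
        for (int n = 1; n <= 15; n++) {
            t = 1.0 + n * (1.0 * n + t);   // T(n)
            factorial *= n;                // n!
            System.out.printf("n = %2d   T(n) / n! = %.4f%n", n, t / factorial);
        }
    }
}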
int i = 0;
int n = 20;
while (i < n)
{
    i++;
    int j = i;
    while (i < n)
    {
        printf("this is %d", i);
        i++;
    }
    i = j;
}
So, to estimate the time complexity of this function, my approach is that the outer loop runs n times. The inner loop runs n - 1 times? So would the time complexity for this nested loop be O(n^2)?
You can rewrite the initial code
int n = 20;
int i = 0;
while (i < n)
{
    i++;
    int j = i;
    while (i < n)
    {
        printf("this is %d", i);
        i++;
    }
    i = j;
}
into its equivalent:
int n = 20;
for (int i = 1; i < n; ++i)
    for (int j = i; j < n; ++j)
        printf("this is %d", j);
Now it's evident that you have O(n**2) time complexity: you have
(n - 1) + (n - 2) + (n - 3) + ... + 3 + 2 + 1 = n * (n - 1) / 2
operations (printf(...)) and
O(n * (n - 1) / 2) = O(n**2 / 2 - n / 2) = O(n**2)
In subsequent iterations, the inner loop runs n-1, n-2, n-3, ..., 1 times. The sum is n(n-1)/2, which gives an asymptotic time complexity of O(n^2).
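If you prefer counting to algebra: a direct tally of the inner statement for the n = 20 from the question gives exactly n * (n - 1) / 2 = 190 executions. A minimal sketch, using the rewritten loops from above:
// Counts how many times the innermost statement runs for n = 20.
public class QuadraticCount {
    public static void main(String[] args) {
        int n = 20;
        long count = 0;
        for (int i = 1; i < n; ++i) {
            for (int j = i; j < n; ++j) {
                count++;                   // stand-in for the printf(...)
            }
        }
        System.out.println(count + " iterations, expected " + n * (n - 1) / 2);
    }
}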
Outer loop is O(n), 2nd loop is O(n^2) and 3rd loop is also O(n^2), but the 3rd loop is conditional.
Does that mean the 3rd loop only runs 1/n of the time (once every n iterations), and therefore the total big O is O(n^4)?
for (int i = 1; i < n; i++) {
    for (int j = 1; j < (n*n); j++) {
        if (j % i == 0) {
            for (int k = 1; k < (n*n); k++) {
                // Simple computation
            }
        }
    }
}
For any given value of i between 1 and n, the complexity of this part:
for (int j = 1; j < (n*n); j++) {
    if (j % i == 0) {
        for (int k = 1; k < (n*n); k++) {
            // Simple computation
        }
    }
}
is O(n^4/i), because the if-condition is true one ith of the time. (Note: if i could be larger than n, then we'd need to write O(n^4/i + n^2) to include the cost of the loop iterations where the if-condition was false; but since i is known to be small enough that n^4/i >= n^2, we don't need to worry about that.)
So the total complexity of your code, adding together the different loop iterations across all values of i, is O(n^4/1 + n^4/2 + n^4/3 + ... + n^4/n) = O(n^4 * (1/1 + 1/2 + 1/3 + ... + 1/n)) = O(n^4 log n).
(That last bit relies on the fact that, since ln(n) is the integral of 1/x from 1 to n, and 1/x is decreasing over that interval, we have ln(n) < ln(n+1) < (1/1 + 1/2 + 1/3 + ⋯ + 1/n) < 1 + ln(n).)
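As a rough empirical check (n = 60 below is an arbitrary test size), counting the executions of the innermost computation and dividing by n^4 * ln(n) gives a value within a small constant factor of 1, consistent with the O(n^4 log n) bound:
// Counts the innermost "simple computation" executions for a single n and
// compares the count against n^4 * ln(n).
public class HarmonicLoops {
    public static void main(String[] args) {
        int n = 60;                        // arbitrary test size
        long count = 0;
        for (int i = 1; i < n; i++) {
            for (int j = 1; j < n * n; j++) {
                if (j % i == 0) {
                    for (int k = 1; k < n * n; k++) {
                        count++;           // the simple computation
                    }
                }
            }
        }
        double bound = Math.pow(n, 4) * Math.log(n);
        System.out.printf("count = %d   count / (n^4 * ln n) = %.3f%n", count, count / bound);
    }
}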
I've got to analyze this loop, among others, and determine its running time using Big-O notation.
for ( int i = 0; i < n; i += 4 )
    for ( int j = 0; j < n; j++ )
        for ( int k = 1; k < j*j; k *= 2 )
Here's what I have so far:
for ( int i = 0; i < n; i += 4 ) = n
for ( int j = 0; j < n; j++ ) = n
for ( int k = 1; k < j*j; k *= 2 ) = log^2 n
Now the problem I'm coming to is the final running time of the loop. My best guess is O(n^2), however I am uncertain if this correct. Can anyone help?
Edit: sorry about the Oh -> O thing. My textbook uses "Big-Oh"
First note that the outer loop is independent of the remaining two: it simply contributes a factor of n/4, which we will account for later.
Now let's consider the complexity of
for ( int j = 0; j < n; j++ )
for ( int k = 1; k < j*j; k *= 2 )
We have the following sum:
0 + log2(1) + log2(2 * 2) + ... + log2(n*n)
It is good to note that log2(n^2) = 2 * log2(n). Thus we re-factor the sum to:
2 * (0 + log2(1) + log2(2) + ... + log2(n))
It is not very easy to analyze this sum directly, but take a look at this post: using Stirling's approximation one can show that it belongs to O(n*log(n)). Thus the overall complexity is O((n/4) * 2 * n * log(n)) = O(n^2 * log(n)).
In terms of j, the inner loop takes O(log_2(j^2)) time, but since log_2(j^2) = 2 log(j), it is actually O(log(j)).
For each iteration of the middle loop, it takes O(log(j)) time (to do the inner loop), so we need to sum:
sum { log(j) | j = 1, ..., n-1 } = log(1) + log(2) + ... + log(n-1) = log((n-1)!)
And since log((n-1)!) is in O((n-1)log(n-1)) = O(nlogn), we can conclude that the middle loop takes O(nlogn) operations.
Note that both the middle and inner loops are independent of i, so to get the total complexity we can just multiply n/4 (the number of repeats of the outer loop) by the complexity of the middle loop, and get:
O(n/4 * nlogn) = O(n^2logn)
So, total complexity of this code is O(n^2 * log(n))
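The same conclusion can be sanity-checked empirically; the sketch below (n = 500 is an arbitrary choice) counts the innermost iterations and divides by (n/4) * n * log2(n), giving a modest constant in line with the factor 2 that comes from log2(j^2) = 2 log2(j):
// Counts iterations of the innermost k-loop and compares the total against
// (n / 4) * n * log2(n).
public class TripleLoopCount {
    public static void main(String[] args) {
        int n = 500;                       // arbitrary test size
        long count = 0;
        for (int i = 0; i < n; i += 4) {
            for (int j = 0; j < n; j++) {
                for (long k = 1; k < (long) j * j; k *= 2) {
                    count++;
                }
            }
        }
        double bound = (n / 4.0) * n * (Math.log(n) / Math.log(2));
        System.out.printf("count = %d   count / ((n/4) * n * log2 n) = %.3f%n", count, count / bound);
    }
}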
The time complexity of a loop is considered O(n) if the loop variable is incremented/decremented by a constant amount (which is c in the examples below):
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i -= c) {
    // some O(1) expressions
}
The time complexity of nested loops is equal to the number of times the innermost statement is executed. For example, the following sample loops have O(n²) time complexity:
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
for (int i = n; i > 0; i -= c) {
    for (int j = i + 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
The time complexity of a loop is considered O(log n) if the loop variable is divided/multiplied by a constant amount:
for (int i = 1; i <= n; i *= c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i /= c) {
    // some O(1) expressions
}
Now we have:
for ( int i = 0; i < n; i += 4 )          <----- runs n times
    for ( int j = 0; j < n; j++ )         <----- for every i, again runs n times
        for ( int k = 1; k < j*j; k *= 2 )    <----- for every j, runs a logarithmic number of times
So the complexity is O(n² log m) where m is n², which simplifies to O(n² log n), because n² log m = n² log n² = n² * 2 log n ~ n² log n.
I have the following algorithm:
for (i = 1; i < n; i++) {
    SmallPos = i;
    Smallest = Array[SmallPos];
    for (j = i + 1; j <= n; j++)
        if (Array[j] < Smallest) {
            SmallPos = j;
            Smallest = Array[SmallPos];
        }
    Array[SmallPos] = Array[i];
    Array[i] = Smallest;
}
Here is my calculation :
For the nested loop, I find a time complexity of
1 ("int i = 0") + n+1 ("i < n") + n ("i++")
* [1 ("j = i + 1") + n+1 ("j < n") + n ("j++")]
+ 3n (for the if-statement and the statements in it)
+ 4n (the 2 statements before the inner for-loop and the final 2 statements after the inner for-loop).
This is (1 + n + 1 + n)(1 + 1 + n + n) + 7n = (2 + 2n)(2 + 2n) + 7n = 4n^2 + 15n + 4.
But unfortunately, the textbook got T(n) = 2n^2 + 4n - 5.
Please, anyone care to explain to me where I got it wrong?
Here is a formal way to represent your algorithm mathematically (sigma notation):
Replace c with the number of operations in the outer loop, and c' with the number of operations in the inner loop.
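The formula itself did not survive here; given the loop bounds in the code above, it presumably amounts to something like
T(n) = \sum_{i=1}^{n-1} \left( c + \sum_{j=i+1}^{n} c' \right) = c\,(n-1) + c' \sum_{i=1}^{n-1} (n-i) = c\,(n-1) + c'\,\frac{n(n-1)}{2},
which is the usual arithmetic-series argument giving a quadratic T(n), in line with the textbook's 2n^2 + 4n - 5.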