What is the complexity of the following method? - complexity-theory

I'm still learning about complexity measurement using Big O notation, and I was wondering if I'm correct to say that the following method's complexity is O(n*log_4(n)), where the "4" is a subscript.
public static void f(int n)
{
    for (int i = n; i > 0; i--)
    {
        int j = n;
        while (j > 0)
            j = j / 4;
    }
}

Yes, you are correct that the complexity of the function is O(n*log_4(n)).
Since log_4(n) = ln(n) / ln(4) and ln(4) is a constant, a function with complexity O(n*log_4(n)) also has complexity O(n*ln(n)); changing the base of the logarithm only changes the constant factor.
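To see the log base in action, here is a minimal instrumented sketch (the class and method names are mine, not from the question) that counts how many times the inner while-loop body runs:

```java
// A minimal sketch (class and method names are mine, not from the question).
public class LogBaseDemo {

    // Counts how many times the inner while-loop body from the question runs.
    static int innerSteps(int n) {
        int count = 0;
        int j = n;
        while (j > 0) {
            j = j / 4; // each pass divides j by 4, so about log_4(n) passes
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(innerSteps(64));  // 64 -> 16 -> 4 -> 1 -> 0: 4 passes
        System.out.println(innerSteps(256)); // one more factor of 4: 5 passes
    }
}
```

Going from n = 64 to n = 256 (a factor of 4) adds exactly one pass, which is the signature of a log_4(n) loop.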

Did you mean
public static void f(int n)
{
    for (int i = n; i > 0; i--)
    {
        int j = i; // Not j = n.
        while (j > 0)
            j = j / 4;
    }
}
?
In that case, too, you are correct: it is O(n log n). Using the 4 as a subscript is correct as well, but it only makes the notation more cumbersome to write.
Even with the j = n line, it is O(n log n).
In fact, to be more precise, you can say it is Theta(n log n).

Yes, you are right: the complexity is O(n*log_4(n)).

Related

Is an algorithm whose time complexity increases with input but with a limit considered O(n)?

If the number of statements a function executes increases with the input but only up to a fixed limit, would it be considered O(n) or O(1)?
for example:
void func(int n)
{
    if (n > 1000)
    {
        for (int i = 0; i < 1000; i++)
        {
            // do thing
        }
    }
    else
    {
        for (int i = 0; i < n; i++)
        {
            // do same thing
        }
    }
}
is this function O(n) or O(1)?
It is O(1), not O(n).
Big-O analysis is asymptotic: it intentionally ignores an arbitrarily large initial section of the performance function, in order to accurately describe the large-scale behavior.
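This can be checked empirically with a hypothetical instrumented version of the function above (the class and helper names are mine): the operation count grows with n only up to the cutoff, then stays flat at 1000.

```java
// Hypothetical instrumented version of the function above (names are mine).
public class CappedLoopDemo {

    // Returns how many times the loop body ("do thing") executes for a given n.
    static int work(int n) {
        int ops = 0;
        if (n > 1000) {
            for (int i = 0; i < 1000; i++) ops++; // capped at 1000 iterations
        } else {
            for (int i = 0; i < n; i++) ops++;    // grows with n below the cutoff
        }
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(work(10));      // 10: still growing with n
        System.out.println(work(5000000)); // 1000: flat, no matter how large n gets
    }
}
```

Past n = 1000 the count never changes, which is exactly why the asymptotic classification is O(1).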

Why are the Big O Notations for the following codes like this?

Question 1.
My Answer: 1 + n^2 * n = n^3
Correct Answer: O(n)
int f(int n) {
    if (n < 1000000)
        return 0;
    for (int i = 0; i < (n * n); i++)
        return f(n - 1);
    return 0; // needed to compile: the compiler can't prove the loop runs
}
Question 2.
My Answer: n * n * logn = n^2*logn
Correct Answer: O(n^3)
int f(int n) {
    for (int i = 1; i < (n / 2); i++)
        for (double j = 0; j < 100; j += (100.0 / n))
            doSomething(n / 2); // doSomething(m) has time complexity of O(m)
    return 0;
}
Question 3.
My Answer: 1 + n * (logn + 1) = O(nlogn)
Correct Answer: O(logn)
void f(int n) {
    if (n < 1) return;
    g(n - 1);
}
void g(int n) {
    f(n / 2);
    doOne(); // doOne has time complexity O(1)
}
Question 1
int f(int n) {
    if (n < 1000000)
        return 0;
    for (int i = 0; i < (n * n); i++)
        return f(n - 1);
    return 0; // needed to compile: the compiler can't prove the loop runs
}
The for loop never actually loops: its body is a return statement, so at most one iteration runs. This means you can simplify this code to:
int f(int n) {
    if (n <= 0)
        return 0;
    return f(n - 1);
}
(simplified with respect to the O(n) analysis)
Here you can see why it is O(n): the recursion counts down by 1 until it hits the stop condition. The fact that there is a "high" cutoff check n < 1000000 doesn't matter asymptotically when you call it with something like f(5*10^300).
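A quick sanity check is to count the calls directly (a hypothetical instrumented sketch, not from the original post; the class name and counter are mine): the number of calls grows linearly with n above the cutoff.

```java
// Hypothetical instrumented sketch (names are mine).
public class CountdownDemo {
    static long calls = 0;

    // Instrumented version of the recursion: each call decrements n by 1
    // until the cutoff condition stops it.
    static void f(long n) {
        calls++;
        if (n < 1000000) return;
        f(n - 1);
    }

    public static void main(String[] args) {
        calls = 0;
        f(1000010);
        System.out.println(calls); // 12: one call per value from 1000010 down to 999999
    }
}
```

Raising the argument by k adds exactly k more calls, the hallmark of O(n).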
Question 2
int f(int n) {
    for (int i = 1; i < (n / 2); i++)
        for (double j = 0; j < 100; j += (100.0 / n))
            doSomething(n / 2); // doSomething(m) has time complexity of O(m)
    return 0;
}
With respect to the O(n) analysis you can simplify some lines:
for (int i = 1; i < (n / 2); i++)
This can be simplified to:
for (int i = 1; i < n; i++)
Therefore it's O(n), as you already identified.
for (double j = 0; j < 100; j += (100.0 / n))
This can be simplified (divide the bound and the step by 100, then multiply both by n) to:
for (double j = 0; j < n; j += 1.0)
And again, a simple O(n) loop.
doSomething(n / 2);
That is, by the given definition, an O(n) statement.
So in total you have O(n) * O(n) * O(n), which is O(n^3).
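One way to convince yourself is to tally the total work and watch it scale like n^3. This is a hypothetical instrumented sketch (names are mine); doSomething's O(m) cost is modeled as adding m units to a counter.

```java
// Hypothetical instrumented sketch (names are mine): doSomething(m) is O(m),
// so its cost is modeled as adding m units of work to a counter.
public class TripleProductDemo {
    static long ops = 0;

    static void doSomething(int m) {
        ops += m; // model O(m) cost as m units of work
    }

    static void f(int n) {
        for (int i = 1; i < (n / 2); i++)                 // ~n/2 iterations
            for (double j = 0; j < 100; j += (100.0 / n)) // ~n iterations
                doSomething(n / 2);                       // ~n/2 units each
    }

    public static void main(String[] args) {
        ops = 0;
        f(100);
        long atN = ops;  // 49 * 100 * 50 = 245000
        ops = 0;
        f(200);
        long at2N = ops; // 99 * 200 * 100 = 1980000
        System.out.println((double) at2N / atN); // close to 8 = 2^3, as O(n^3) predicts
    }
}
```

Doubling n multiplies the work by roughly 2^3 = 8, which is the signature of cubic growth. (The step sizes 100.0/100 and 100.0/200 are exact powers of two in binary, so the iteration counts here are exact.)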
Question 3
void f(int n) {
    if (n < 1) return;
    g(n - 1);
}
void g(int n) {
    f(n / 2);
    doOne(); // doOne has time complexity O(1)
}
Not sure how you got O(n*log(n)) here, because not every value of n is visited. You start at n and the arguments shrink roughly as n/2, n/4, n/8, n/16, n/32, n/64, n/128, ... until the termination condition if (n < 1) is reached. That is a simple O(log(n)) recursion.
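Counting the calls makes the halving visible (a hypothetical instrumented sketch; the class name and counter are mine, and doOne() is represented by a comment since only its O(1) cost matters):

```java
// Hypothetical instrumented sketch (names are mine): counts calls to f in the
// mutually recursive pair, which roughly halves its argument each round trip.
public class HalvingDemo {
    static int calls = 0;

    static void f(long n) {
        calls++;
        if (n < 1) return;
        g(n - 1);
    }

    static void g(long n) {
        f(n / 2);
        // doOne() would be O(1) work here
    }

    public static void main(String[] args) {
        calls = 0;
        f(1000);
        System.out.println(calls); // 10 calls to f, close to log2(1000) ~ 9.97
    }
}
```

For n = 1000 the arguments of f are 1000, 499, 249, 124, 61, 30, 14, 6, 2, 0: ten calls, matching the O(log n) bound.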

Complexity of this algorithm?

What's the complexity of func1 if func2 = O(n) and func3 = O(n^2) ?
void func1(int n) {
    int i, j, k;
    for (k = 0; k < n; k++)          // O(n)
        printf("%d", k);
    i = 0;
    while (i < 2 * n) {              // O(n)
        for (j = i; j > 1; j--)      // Not executed
            for (k = 15; k > 0; k--)
                func2(n);
        for (j = n; j > 1; j--)      // O(n)
            func3(n);                // O(n^2)
        i++;
    }
}
So that's O(n) * O(n) * O(n^2) + O(n) = max(O(n^4), O(n)) = O(n^4)?
Thanks!
void func1(int n) {
    int i, j, k;
    for (k = 0; k < n; k++)
        printf("%d", k);
    i = 0;
    while (i < 2 * n) {              // O(2*n) = O(n)
        for (j = i; j > 1; j--)      // O(n)
            for (k = 15; k > 0; k--) // O(15) = O(1)
                func2(n);            // O(n)
        for (j = n; j > 1; j--)      // O(n)
            func3(n);                // O(n^2)
        i++;
    }
}
For code in sequence, take the maximum of the steps.
As a rule of thumb, nested loops multiply, but if the loop ranges are not independent you need to examine them carefully to make sure this still holds (see Paul's comment for an example).
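As an example of dependent ranges, here is a small sketch (the class and method names are mine) of a triangular loop whose inner bound depends on the outer index. The exact count is n(n-1)/2, which is still Theta(n^2), so the multiply rule gives the right order in this case, but that had to be checked rather than assumed.

```java
// Hypothetical sketch (names are mine): inner bound depends on the outer
// index, yet the total n(n-1)/2 is still Theta(n^2).
public class TriangularDemo {

    static long count(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i; j++) // range depends on i
                ops++;
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(count(10));  // 45 = 10 * 9 / 2
        System.out.println(count(100)); // 4950 = 100 * 99 / 2
    }
}
```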

Big-O complexity of algorithm

Whats the total BigO complexity of an algorithm like the one below:
void a(int n, int p) {
    for (int i = 0; i < n; i++) {
        print(i);
        for (int j = 0; j < p; j++) {
            print(i + j);
        }
    }
}
A formal way is to sum a constant cost c over both loop levels:
T(n, p) = sum over i from 0 to n-1 of (c + sum over j from 0 to p-1 of c) = n*c + n*p*c = O(n*p)
where c is a constant. Note that the complexity is expressed in terms of the values of n and p themselves, not their (logarithmic) bit lengths.
If I have n = 5 and p = 10, how many prints would this give?
50 from the inner print (plus 5 from the outer print). Either way the n*p term dominates, so O(n*p) is correct.
Think about it... you do n times p loops. So O(n*p).
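The counting argument can be verified with an instrumented sketch (the class name and counters are mine; each print is replaced by incrementing a counter so the totals can be checked directly):

```java
// Hypothetical instrumented sketch (names are mine): each print is replaced
// by a counter increment.
public class NestedCountDemo {
    static int outerPrints = 0;
    static int innerPrints = 0;

    static void a(int n, int p) {
        for (int i = 0; i < n; i++) {
            outerPrints++;           // print(i): n of these
            for (int j = 0; j < p; j++) {
                innerPrints++;       // print(i + j): n * p of these
            }
        }
    }

    public static void main(String[] args) {
        a(5, 10);
        System.out.println(innerPrints); // 50 = 5 * 10
        System.out.println(outerPrints); // 5, dominated by the inner loop
    }
}
```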

If a function has 2 for loops inside of it (not nested), is bigOh still O(n) for the entire function?

Having different loops inside of a function doesn't cause the BigOh to be multiplied together, right?
Example:
function() {
    for (int i = 0; i < n; i++) {
        // logic here
    }
    for (int i = 0; i < n; i++) {
        // logic here
    }
}
This is pretty well discussed (see the general reference question), but yes, you are correct: the function in your question would be O(n).
Technically it is O(2n), which reduces to O(n).
Yes, it's still O(n), because the two loops in sequence give O(n + n) = O(2n), and we can drop the constant 2 since it has no effect asymptotically. But if you had
for (...) {
    for (...) {
        // code here
    }
}
Then it would be O(n^2)
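The difference can be made concrete with a small counting sketch (the class and helper names are mine): loops in sequence add their counts, nested loops multiply them.

```java
// Hypothetical counting sketch (names are mine): sequential loops add,
// nested loops multiply.
public class SeqVsNestedDemo {

    static long sequential(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) ops++; // first loop: n
        for (int i = 0; i < n; i++) ops++; // second loop: another n
        return ops;                        // 2n total -> O(n)
    }

    static long nested(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;
        return ops;                        // n * n total -> O(n^2)
    }

    public static void main(String[] args) {
        System.out.println(sequential(1000)); // 2000
        System.out.println(nested(1000));     // 1000000
    }
}
```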