Big-Theta running time of this recursive function

I need a little help on figuring out the Big-Theta running time for this function.
int recursive(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++)
        sum++;
    if (n > 1)
        return sum + recursive(n - 1);
    else
        return n;
}
I know what the running time of this function would be if the for loop weren't in the function, but the loop is throwing me off a little bit. Any advice?

If it were just the for loop, not recursive, the function would be O(n).
If it were just recursive, without the for loop, it would also be O(n).
But here it does n recursive steps (which we know is O(n) calls), and it runs an O(n) for loop at each of those steps. The loop in the call with argument k runs k times, so the total is n + (n - 1) + ... + 1 = n(n + 1)/2, which is Θ(n^2).
So... does that help?

Related

Complexity of an else block containing a loop, inside a loop

I'm trying to find the frequency of each statement and the Big-O of this method, but I'm struggling with the else part. I know that we take the worse complexity of the if and else branches.
But logically, in this case, do I have to multiply the frequency of the outer loop (n) by the frequency of the else loop (n + 1)? I know the else block will be executed only once, when i = 0, but if we follow the rules mechanically we have to multiply.
So I'm stuck here and I don't know what to do in this case; I hope you could help me.
Thanks!
int i, j, sum = 0;
for (i = 0; i < n; i++)
    if (i != 0)
        sum += i;
    else
        for (j = 0; j < n; j++)
            sum += j;
So, you have an O(n) loop with an O(n) else branch inside it.
But in your code, we go through the else branch only once.
Inside the for loop, the if branch does O(1) work (n - 1) times, and the else branch does O(n) work exactly once (when i == 0).
So the total work is (n - 1) * O(1) + 1 * O(n) ~ O(2n), which is O(n), because we drop constant factors when doing asymptotic analysis.

Big-O & Exact Runtime

I am learning about Big-O, and although I'm starting to understand it, I still can't correctly determine the Big-O of an algorithm.
I've got a code:
int n = 10;
int count = 0;
int k = 0;
for (int i = 1; i < n; i++)
{
for (int p = 200; p > 2*i; p--)
{
int j = i;
while (j < n)
{
do
{
count++;
k = count * j;
} while (k > j);
j++;
}
}
}
for which I have to determine the Big-O and the exact runtime.
Let me start: the first for loop is O(n) because it depends on the variable n.
The second for loop is nested, which makes the Big-O so far O(n^2).
So how do we account for the while (j < n) (three loops so far), and how do we account for the do ... while (k > j), which makes four loops in this case?
A thorough explanation would be really helpful.
Thank you.
Unless I'm much mistaken, this program has an infinite loop, and therefore its time complexity cannot usefully be analyzed.
In particular
do
{
count++;
k = count * j;
} while (k > j);
as soon as this loop is entered for the second time (with count = 1 and j = 2), k will be set to a value greater than j, and it will remain greater indefinitely (ignoring integer overflow, which will happen pretty quickly).
I understand that you're learning Big-Oh notation, but creating toy examples like this probably isn't the best way to understand Big-Oh. I would recommend reading a well-known algorithms textbook where they walk you through new algorithms, explaining and analyzing the time and space complexity as they do so.
I am assuming that the while loop should be:
while (k < j)
Now, in this case, the first for loop would take O(n) time. The second loop would take O(p) time.
Now, for the third loop,
int j = i;
while (j < n){
...
j++;
}
could be rewritten as
for(j=i;j<n;j++)
meaning it shall take O(n) time.
For the last loop, the value of k increases exponentially.
Consider it to be same as
for(k = count*j ;k<j ;j++,count++)
Hence it shall take O(logn) time.
The total time complexity is O(n^2 * p * log n).

How is runtime affected when for() loop is inside an if() statement?

Say we have this code:
void doSomething(int n) {
for(int i = 0; i < n; i++) {
if (i % 7 == 0) {
for(int j = 0; j < n; j++) {
print("*");
}
}
}
}
What are the big-O and big-omega runtimes (with proofs/work shown)?
My mind is being blown by the if() statement and how to prove the big-omega (since for big-O we can just ignore the condition because it's an upper bound).
Any help is much appreciated. Thanks!
Let's begin by trying to rewrite things in a way that more clearly exposes the runtime. For example, that inner loop has a runtime of Θ(n), so let's rewrite the code like this:
for(int i = 0; i < n; i++) {
if (i % 7 == 0) {
do Θ(n) work
}
}
Now, think about what happens when this loop runs. One out of every seven iterations will trigger that inner loop and do Θ(n) work, and the remaining six-sevenths won't and will do Θ(1) work per iteration. This means that the total work done is
n * ((1/7) * Θ(n) + (6/7) * Θ(1))
= n * (Θ(n) + Θ(1))
= n * Θ(n)
= Θ(n^2).
In other words, the total work done here is Θ(n^2). And that makes sense, too, since essentially that if statement just reduces the work done by a constant factor of 1/7, and big-O notation ignores constant factors.
You had originally written the question for this code:
for(int i = 0; i < n; i++) {
if (n % 7 == 0) {
for(int j = 0; j < n; j++) {
print("*");
}
}
}
Here's my original answer:
Let's begin by trying to rewrite things in a way that more clearly exposes the runtime. For example, that inner loop has a runtime of Θ(n), so let's rewrite the code like this:
for(int i = 0; i < n; i++) {
if (n % 7 == 0) {
do Θ(n) work
}
}
So now let's think about what's happening here. There are two possibilities for what could happen. First, it might be the case that n is a multiple of seven. In that case, the if statement triggers every time, and on each of the n outer loop iterations we do Θ(n) work. Therefore, we can say that the total work done in the worst case will be Θ(n^2). We can't tighten that bound, because as n gets larger and larger we keep running into more and more multiples of seven. Second, it might be the case that n isn't a multiple of seven. When that happens, we do Θ(n) loop iterations that each do Θ(1) work, so in the best case we'll do Θ(n) total work.
Overall, this means that in the worst case we do Θ(n^2) work, and in the best case we do Θ(n) work. We can therefore say that the runtime of the function is O(n^2) and Ω(n), though I think that the more precise descriptions of the best- and worst-case runtimes are a little more informative.
The key to the analysis here is realizing that that if statement is either always going to fire or never going to fire. That makes it easier for us to split the reasoning into two separate cases.
Remember that a big-O analysis isn't just about multiplying a bunch of things together. It's about thinking about what the program will actually do if we were to run it, then thinking through how the logic will play out. You'll rarely be wasting your time if you try approaching big-O analysis this way.

How to calculate time complexity?

I've gone through some basic concepts of calculating time complexities. I would like to know the time complexity of the code that follows.
I think the time complexity would be O(log3(n) * n^2), but that may be wrong; I want to know the exact answer and how to arrive at it. Thank you :)
function(int n) {
    if (n == 1) return;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            printf("*");
    function(n - 3);
}
Two nested loops with n iterations each give O(n^2). The recursion decreases n by the constant 3, so the function is called n/3 + 1 = O(n) times. In total, that's O(n) calls doing O(n^2) work each, i.e. O(n^3).
The logarithm in your guess would appear only if the function called itself with n/3 rather than n - 3.

Time Complexity of an Algorithm

Here is a problem in which we have to calculate the time complexity of the given function
f(i) = 2*f(i+1) + 3*f(i+2)
for (int i = 0; i < n; i++)
    f[i] = 2*f[i+1]
What I think is that the complexity of this algorithm is O(2^n) + O(n), which ultimately is O(2^n).
Please correct me if I am wrong.
Firstly, all the information you require to work these out in future is here.
To answer your question: because you have not provided a definition of f(i) in terms of i itself, it is impossible to determine the actual complexity from what you have written above. However, in general, for loops like
for (i = 0; i < N; i++) {
sequence of statements
}
executes N times, so the sequence of statements also executes N times. If we assume the statements are O(1), the total time for the for loop is N * O(1), which is O(N) overall. In your case above, if I take the liberty of re-writing it as
f(0) = 0;
f(1) = 1;
f(i+2) = 2*f(i+1) + 3*f(i)
for (int i=0; i < n; i++)
f[i] = 2*f[i+2]
Then we have a well-defined sequence of operations, and it should be clear that the complexity of the n operations is, like the example I have given above, n * O(1), which is O(n).
I hope this helps.
