How does time complexity O(n) work? - data-structures

What is the time complexity of 3k + 1?
If I am not wrong, the time complexity of this expression is O(n):
3k + 1 -> 3k
-> O(n)
How is the complexity O(n)? Please explain.

3n + 1 ≈ 3n : O(n)
Let n = 5. Then
3*5 means 3 + 3 + 3 + 3 + 3 (5 operations here).
That is linear complexity: the operation count grows in direct proportion to n, and Big-O drops the constant factor (the 3) and the constant term (the +1).
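As a sketch of the idea (my own illustrative example, not code from the question): a loop that does three constant-time steps per element, plus one setup step, performs exactly 3n + 1 operations, which is O(n).

```python
def count_ops(n):
    """A hypothetical loop costing 3 operations per element plus 1 setup step."""
    ops = 1          # one-time setup operation
    for _ in range(n):
        ops += 3     # three constant-time operations per iteration
    return ops

print(count_ops(5))    # 3*5 + 1 = 16
print(count_ops(500))  # 3*500 + 1 = 1501: the count grows linearly in n
```

The exact counts 16 and 1501 differ by constant factors only; both are Θ(n).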

Related

How do you find the complexity of an algorithm given the number of computations performed each iteration?

Say there is an algorithm with input of size n. On the first iteration, it performs n computations, then is left with a problem instance of size floor(n/2) - for the worst case. Now it performs floor(n/2) computations. So, for example, an input of n=25 would see it perform 25+12+6+3+1 computations until an answer is reached, which is 47 total computations. How do you put this into Big O form to find worst case complexity?
You just need to write the corresponding recurrence formally:
T(n) = T(n/2) + n = n + n/2 + n/4 + ... + 1
     = n(1 + 1/2 + 1/4 + ... + 1/n) < 2n
=> T(n) = O(n)
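This can be checked directly (a minimal sketch, simulating the pass sizes described in the question):

```python
def total_computations(n):
    """Sum the work of each pass: n, then n // 2, then n // 4, ... down to 1."""
    total = 0
    while n > 0:
        total += n
        n //= 2
    return total

print(total_computations(25))  # 25 + 12 + 6 + 3 + 1 = 47, as in the example
# The total never exceeds 2n, matching T(n) = T(n/2) + n = O(n).
assert all(total_computations(n) < 2 * n for n in range(1, 10000))
```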

How is the time complexity of the following code O(N)?

I was solving a time-complexity question on Interview Bit, which is given below in the image.
The correct answer to this question is O(N). But according to me, the answer should be O(NlogN). Since the complexity for the first "for loop" should be O(logN) because the variable i is divided by 2 in each iteration and I have studied that whenever the loop variables are either multiplied or divided by 2, then the time complexity is O(logN). Now, for the second "for loop", the complexity should be O(N), therefore, the final complexity should be O(N*logN).
Can anyone please explain where I am wrong?
Do the actual math:
T(N) = N + N/2 + N/4 + ... + 1 (log_2 N terms in the sum)
This is a geometric series with ratio 1/2, so the sum equals:
T(N) = N * [1 - (1/2)^(log_2 N)] / (1 - 1/2)
     = [N - N / 2^(log_2 N)] / 0.5
     = (N - 1) / 0.5          (since 2^(log_2 N) = N)
     = 2(N - 1)
So T(N) is O(N). The point is that the halving outer loop does not multiply the inner loop's cost, because the inner loop gets cheaper on every pass.
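A quick way to see this empirically (illustrative sketch; the question's image is not reproduced here, so the loop structure below is my assumption about it):

```python
def count_inner_iterations(n):
    """Total work of an outer loop that halves i each pass while the
    inner loop runs i times (assumed structure of the code in the image)."""
    ops = 0
    i = n
    while i >= 1:
        ops += i        # the inner loop body executes i times
        i //= 2
    return ops

# The total stays below 2N even though the outer loop runs ~log2(N) times.
for n in [16, 1024, 10**6]:
    print(n, count_inner_iterations(n), "<", 2 * n)
```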

Time complexity of one recursive algorithm

Here is a recurrence:
T(n) = n-1 + T(i-1) + T(n-i)
T(1) = 1
How do I calculate its time complexity? i is between 1 and n.
I recognise this as the quicksort recurrence (randomized quicksort). I am sure the question somehow missed the summation (averaging over i) part.
Okay! You can use the substitution method here: check with O(n^2), and you will see that O(n^2) is the worst-case time complexity.
The average case is a bit tricky, since the pivot can be any element from 1 to n; analyse it by averaging over i. Here too you can apply substitution, with T(n) = O(n log n).
I think we should solve it like this:
If i = 2 every time, then
T(n) = n - 1 + T(1) + T(n-2) ≈ n + T(n-2), which gives Theta(n^2).
If i = n/2 every time, then
T(n) = n - 1 + T(n/2 - 1) + T(n/2), which gives Theta(n log n).
So the upper bound is O(n^2), and the algorithm is O(n^2) in the worst case.
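A small numerical check (my own sketch, not from the answers): evaluate the recurrence T(n) = n-1 + T(i-1) + T(n-i) for a fixed bad pivot (i = 2) and a balanced pivot (i = n/2), and compare against n^2 and n log n.

```python
import math
import sys
from functools import lru_cache

sys.setrecursionlimit(10000)  # the unbalanced case recurses ~n/2 deep

@lru_cache(maxsize=None)
def t_unbalanced(n):
    """T(n) = n-1 + T(i-1) + T(n-i) with a fixed bad pivot i = 2."""
    if n <= 1:
        return 1
    return n - 1 + t_unbalanced(1) + t_unbalanced(n - 2)

@lru_cache(maxsize=None)
def t_balanced(n):
    """The same recurrence with a balanced pivot i = n // 2."""
    if n <= 1:
        return 1
    i = n // 2
    return n - 1 + t_balanced(i - 1) + t_balanced(n - i)

n = 4096
print(t_unbalanced(n) / n**2)              # settles near a constant -> Theta(n^2)
print(t_balanced(n) / (n * math.log2(n)))  # settles near a constant -> Theta(n log n)
```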

What is the complexity of multiple runs of an O(n log n) algorithm?

If the problem size is n, and every time an algorithm reduces the problem size by half, I believe the complexity is O(n log n), e.g. merge sort. So, basically you are doing n comparisons at each of the log(n) levels...
Now the problem is, if I have a problem of size n. My algorithm is able to reduce the size by half in a run and each run takes O(n log n). What is the complexity in this case?
If the problem takes n steps at size n, plus an additional run at size floor(n/2) when n > 1, then it takes O(n) time in total: n + n/2 + n/4 + ... =~ 2n = O(n).
Similarly, if each run takes time O(n log n) and an additional run at size floor(n/2) when n > 1, the total time is O(n log n).
Since the size of the problem gets halved in each iteration and the time taken at each level is n log n, the recurrence relation is
T(n) = T(n/2) + n log n
Applying the Master theorem: comparing with T(n) = a T(n/b) + f(n), we have a = 1 and b = 2.
Hence n^(log_b a) = n^(log_2 1) = n^0 = 1.
Thus f(n) = n log n grows polynomially faster than n^(log_b a) = 1, so case 3 of the Master theorem gives T(n) = Θ(f(n)) = Θ(n log n).
Hence the complexity is T(n) = Θ(n log n).
EDIT after comments:
If the size of the problem halves at every run, you'll have log(n) runs to complete it. Since every run takes at most n*log(n) time, a simple (loose) upper bound is log(n) times n*log(n):
O(n log(n)^2)
If I don't misunderstand the question, the first run completes in time proportional to n log n. The second run has only n/2 left, so it completes in (n/2) log(n/2), and so on.
For large n, which is what you assume when analyzing time complexity, log(n/2) = log n - log 2 can be replaced by log n.
Summing over "all" steps:
log(n) * (n + n/2 + n/4 + ...) = 2n log(n), i.e. time complexity O(n log n).
In other words: the total is the same order as your first/basic step; all the others together "only" contribute the same amount once more.
I'm pretty sure that comes to O(n log n). You get a geometric series of n + n/2 + n/4 + ... = 2n (for large n); you drop the coefficient and just keep the n, and each unit of work carries at most a log n factor.
This is fine unless you mean the inner n log n to keep the same n value as the outer n on every run.
Edit:
I think that what the OP means here is that on each run the n inside n log n also gets halved. In other words,
n log n + (n/2) log(n/2) + (n/4) log(n/4) + ...
One thing to consider is that after k halvings the size is n/2^k, and once 2^k exceeds n the log breaks down (the log of a number between 0 and 1 is negative). That happens after about log n / log 2 = log_2 n iterations, so the sum has only about log_2 n meaningful terms, and by then there is just 1 operation left per run.
Bounding every log factor by log n, the sum is at most
log n * (n + n/2 + n/4 + ... + 1) < 2n log n,
so this series, too, is O(n log n).
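The recurrence T(n) = T(n/2) + n log n from the Master-theorem answer can also be checked numerically (an illustrative sketch of mine, not from the answers):

```python
import math

def t(n):
    """Iteratively evaluate T(n) = T(n // 2) + n * log2(n), with T(1) = 1."""
    total = 0
    while n > 1:
        total += n * math.log2(n)
        n //= 2
    return total + 1

# The ratio T(n) / (n log2 n) stays below 2, consistent with Theta(n log n).
for n in [2**10, 2**16, 2**20]:
    print(n, t(n) / (n * math.log2(n)))
```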

What is the time complexity of the following equation

I'm having a few problems working out the time complexity using Big-O notation.
This is the equation:
88n^2 logn + 81 + 3n^3 + 12n
I can't figure it out; I'm guessing it's something like:
O(n^2 logn) or O(n^3)
Thanks in advance.
As you know, n grows faster than log n.
Multiplying both sides of that comparison by the same factor (here n^2) preserves it.
So we can simply say n^3 = n^2 * n grows faster than n^2 log n.
=> O(n^3)
Since the growth rate of n is greater than the growth rate of log(n),
we can say n^3 grows faster than n^2 log(n).
So 88n^2 logn + 81 + 3n^3 + 12n => O(n^3)
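An illustrative check (my own sketch): evaluate each term separately for growing n and watch 3n^3 take over the sum.

```python
import math

def terms(n):
    """Each term of 88 n^2 log n + 81 + 3 n^3 + 12 n, evaluated separately."""
    return {
        "88 n^2 log n": 88 * n**2 * math.log2(n),
        "81": 81.0,
        "3 n^3": 3.0 * n**3,
        "12 n": 12.0 * n,
    }

# For small n (e.g. n = 100) the 88 n^2 log n term is actually the largest,
# but as n grows, 3 n^3 dominates the whole sum -- hence O(n^3).
for n in [100, 10**4, 10**6]:
    t = terms(n)
    print(n, max(t, key=t.get), t["3 n^3"] / sum(t.values()))
```

This also shows why constants are dropped: the 88 only delays the crossover, it cannot prevent it.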
