How to prove this statement of big O notation? [closed]

Closed. This question is off-topic. It is not currently accepting answers. Closed 11 years ago.
How to prove this:
3n^2 + 6n is O(n^2)
Do I have to choose 6 as the constant?

To prove that, you need to show that there exist constants M and x0 for which |3x^2 + 6x| <= M|x^2| for all x > x0. For example, M = 9 and x0 = 1 work: for x >= 1 we have 6x <= 6x^2, and therefore 3x^2 + 6x <= 3x^2 + 6x^2 = 9x^2.

Related

How to solve: T(n)=T(n^0.5)+n^2 [closed]

Closed. This question was caused by a typo or a problem that can no longer be reproduced. It is not currently accepting answers. Closed 2 years ago.
How can I prove the following?
T(n) = T(n^0.5) + n^2 = Θ(n^2)
I tried to expand the recurrence step by step, but it got complicated and I got stuck!
We can expand the recurrence:
T(n) = n^2 + n + n^(1/2) + n^(1/4) + ...
The exponents halve at every step after the first, so every term after n^2 is at most n, and the argument drops below 2 after only O(log log n) steps. For large n the tail therefore sums to at most n^2, giving:
n^2 <= T(n) <= 2n^2
So:
T(n) = Θ(n^2)

How to calc worst case complexity? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers. Closed 6 years ago.
How can I calculate the worst-case complexity of this code using big O notation?
int a = 0, b = 0;
int i, j;
for (i = 0; i < N; i++) {
    a = a + 1;
}
for (j = 0; j < M; j++) {
    b = b + j;
}
The complexity is linear. The first loop runs N times and the second runs M times, so the total work is Θ(N + M), which is the same as O(max(N, M)) since the larger of the two dominates.

What is time-complexity of T(N)=4T(N/2)+(N^2)/logN [closed]

Closed. This question needs details or clarity. It is not currently accepting answers. Closed 2 years ago.
This question was given in an MIT video on analysis of algorithms. The following recurrence cannot be solved using the master method, but it can be solved using a recursion tree.
Can anyone please tell me the solution?
The basic master theorem in fact does not apply here. With a = 4 and b = 2 we get n^(log_b a) = n^2, and f(n) = n^2/log n is smaller than n^2 by only a logarithmic factor, not by a polynomial factor n^ε, so none of the three cases holds. This is the "gap" between cases 1 and 2.
A recursion tree does work: level i has 4^i subproblems of size n/2^i, contributing 4^i * (n/2^i)^2 / log(n/2^i) = n^2 / (log n - i). Summing over the log n levels gives n^2 * (1/log n + 1/(log n - 1) + ... + 1) = n^2 * H_(log n) = Θ(n^2 log log n).

Analysis of algorithm about log log n [closed]

Closed. This question needs details or clarity. It is not currently accepting answers. Closed 8 years ago.
Consider f(n) = log log(n) and g(n) = 10^10^10^10^10^10. Then f(n) is O(g(n)).
Is the above statement true or false? Please mention the reasons.
f(n) grows without bound (albeit very slowly), while g(n) is a constant.
lim n->inf f(n)/g(n) = inf, so f(n) is not O(g(n)) and the statement is false. (Concretely, with base-10 logs, log log n exceeds the constant g(n) as soon as n > 10^(10^g(n)); that threshold is astronomically large, but big O only cares that it exists.)

Method to solve the stated recurrence? [closed]

Closed. This question is off-topic. It is not currently accepting answers. Closed 10 years ago.
Need help finding a method for solving the following:
Given f(n) = 9f(n/3) + n^2 * log_3(n) for all n > 1, and f(1) = 1, solve for f(n).
I tried the master theorem, but none of the 3 cases seemed to fit. My guess would be the substitution method, but I am not sure how to apply it.
Use the substitution f(n) = n^2 * g(n).
Dividing the recurrence by n^2 gives g(n) = g(n/3) + log_3(n).
Unrolling, g(n) = log_3(n) + log_3(n/3) + log_3(n/9) + ... = Θ(log^2 n), and so f(n) = Θ(n^2 * log^2 n).
