Big O notation of simple expressions [closed] - big-o

Why is
2n^2 = O(n^3)?
The definition says
f(n) = O(g(n)) if f(n) <= c·g(n)
for some constants c > 0, n0 > 0
and for all n > n0.
And since there can be many upper bounds,
is any other, tighter upper bound also valid?

Definition from Skiena:
f(n)=O(g(n)) means c·g(n) is an upper bound on f(n). Thus there exists
some constant c such that always f(n) ≤ c·g(n), for large enough n
(i.e., n ≥ n0 for some constant n0).
Here f(n) = 2n^2, g(n) = n^3
Let's take the constant c = 2. Then 2n^2 <= 2n^3 for all n >= 1, so the statement is true.
Of course, you can show in the same way that it is O(n^2) for the same c = 2.
From wiki:
A description of a function in terms of big O notation usually only
provides an upper bound on the growth rate of the function.

The big O notation only provides an upper bound, so...
2n² = O(n²), 2n² = O(n³), ..., 2n² = O(anything that grows at least as fast as n²)
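To make the definition concrete, here is a minimal sketch in Python (the helper name is hypothetical) that spot-checks a claimed witness pair (c, n0) on sampled values. It can refute a claimed bound but never prove one, since Big-O quantifies over all n >= n0:

    def satisfies_big_o(f, g, c, n0, samples=range(1, 10_000)):
        """Spot-check f(n) <= c * g(n) for every sampled n >= n0."""
        return all(f(n) <= c * g(n) for n in samples if n >= n0)

    # 2n^2 = O(n^3) with witness c = 2, n0 = 1 ...
    print(satisfies_big_o(lambda n: 2 * n**2, lambda n: n**3, c=2, n0=1))  # True
    # ... and the tighter bound 2n^2 = O(n^2) holds with the same witness.
    print(satisfies_big_o(lambda n: 2 * n**2, lambda n: n**2, c=2, n0=1))  # True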


What kind of algorithm does this recurrence relation represent? [closed]

Take a look at this recurrence relation for algorithm complexity:
T(n) = 2 T(n-1) - 1
What kind of algorithm does this recurrence relation represent? Notice that there's a minus instead of a plus, so it can't be a divide-and-conquer algorithm.
What kind of algorithm will have complexity with this as its recurrence relation?
T(n) = 2 T(n-1)-1
T(n) = 4 T(n-2)-3
T(n) = 8 T(n-3)-7
T(n) = 16 T(n-4)-15
...
T(n) = 2^k T(n-k) - (2^k - 1)
If, for example, T(1) = O(1), then
T(n) = 2^(n-1) O(1) - (2^(n-1) - 1) = O(2^(n-1)) = O(2^n),
which is exponential growth.
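As a sanity check, here is a short Python sketch (the thread itself has no code, so the language and names are assumptions) comparing the recurrence against the closed form derived above:

    def t_recursive(n, t1=2):
        """T(n) = 2*T(n-1) - 1 with base case T(1) = t1."""
        return t1 if n == 1 else 2 * t_recursive(n - 1, t1) - 1

    def t_closed(n, t1=2):
        """Closed form from the unrolling: T(n) = 2^(n-1)*T(1) - (2^(n-1) - 1)."""
        return 2 ** (n - 1) * t1 - (2 ** (n - 1) - 1)

    # Note: T(1) = 1 is a fixed point (then T(n) = 1 for all n), so we pick
    # T(1) = 2 to make the exponential growth visible.
    for n in range(1, 20):
        assert t_recursive(n) == t_closed(n)
    print("closed form matches the recurrence for n = 1..19")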
Now let's see that O(1) - 1 = O(1). From CLRS:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= f(n) <= c g(n) for all n >= n0 }
Thus, to remove the effect of the -1, we just need to increase the hidden constant c by one.
So, as long as your base case has a complexity like O(1) or O(n), you shouldn't worry about the -1. In other words, if your base case makes the recurrence T(n) = 2 T(n-1) at least exponential in n, you don't care about this -1.
Example: imagine that you are asked to tell whether a string S of n characters contains a specified character, and you proceed like this: run the algorithm recursively on S[0..n-2] and S[1..n-1], and stop the recursion when the string is one character long, at which point you just check that character.
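A minimal sketch of that example in Python (the function name is hypothetical); it makes the T(n) = 2T(n-1) + O(1) shape of the recursion visible:

    def contains(s, ch):
        """Deliberately naive exponential search: T(n) = 2*T(n-1) + O(1).

        It recurses on the two overlapping substrings s[:-1] and s[1:], so
        the work roughly doubles each time the length shrinks by one; in the
        worst case (character absent) this makes about 2^n calls.
        """
        if len(s) == 1:
            return s[0] == ch
        return contains(s[:-1], ch) or contains(s[1:], ch)

    print(contains("algorithm", "g"))  # True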
Based on the time complexity given, it is an exponential algorithm.
For a reduction of the input size by 1, the time is (approximately) multiplied by 2.
So it does not come under any polynomial-time algorithmic paradigm like divide and conquer, dynamic programming, and so on.

What is the upper bound of function f(n) = n in Big-O notation and why? [closed]

I was reading the book Algorithm by Karumanchi. In one of the examples, it is given that for the function f(n) = n the Big-O notation is O(n^2). But why is that, and why isn't it O(n) with c = 2 and n0 = 1?
f(n) = O(g(n)) sets an upper limit on the function f(n), but this upper limit need not be tight.
So for the function f(n) = n, we can say f(n) = O(n),
and also f(n) = O(n^2), f(n) = O(n^3), and so on. The definition of Big-O doesn't say anything about the tightness of the bound.
Let's first be sure we understand what Karumanchi was saying. First, on page 61, he states that big-O notation "gives the tight upper bound of the given function." (his emphasis). So if O(n) is correct, then O(n^2) is incorrect by his definition.
Then, on page 62, we get the example you cite. He justifies O(n^2) by stating that n <= n^2 for all n >= 1. This is true.
But it is also true that n <= 2n for all n >= 1. (OP's constants.) That justifies the statement n = O(n) with c = 2 and n0 = 1.
So why did he say it's O(n^2)? Who knows? The book is wrong.

Time complexity of Introduction to Algorithms [closed]

I'm learning from Introduction to Algorithms and I'm confused about the answers to these exercises:
10*log n = O(log n^2)
// I think it should be 10*log n = Theta(log n^2)
n^1.01 = Omega( n* (log n)^2 )
// I think should be n^1.01 = O( n* (log n)^2 )
(log n)^log n = Omega ( n / log n )
// I think should be (log n)^log n = O ( n / log n )
n*2^n = O (3^n)
// I don't know how to prove this.....
Is my thinking correct? I would appreciate it if you could provide proofs for these four statements.
Thanks indeed.
I think you are confusing things. Equality (=) in complexity theory must be read as "belongs to the class", not "equals". You also have to clearly understand the meaning of Big-O notation (and the other omegas and thetas...). For example, O(n) represents ALL functions that grow no faster than a linear function. More formally, if f(n) = O(n) (read: "f(n) belongs to the class O(n)"), there exists a constant c such that for any n: f(n) < c*n. For instance, both f(n) = n and f(n) = log(n) belong to O(n) (i.e., they grow no faster than linear).
Let us consider one of your examples:
n*2^n = O(3^n).
In order to prove that we must find some constant c such that:
n*2^n < c * 3^n;
Some math:
n*2^n < c * 3^n => n < c * (1.5)^n;
You can easily see that this holds even for c = 1, which proves the statement.
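A quick numeric spot-check of that inequality, as a Python sketch (the thread itself contains no code); it is evidence for the bound, not a proof:

    # Spot-check n * 2^n <= c * 3^n with c = 1 (equivalently n <= 1.5^n).
    # Finitely many checks are only evidence; the algebra above is the proof.
    for n in range(1, 50):
        assert n * 2**n <= 3**n, f"fails at n = {n}"
    print("n * 2^n <= 3^n holds for n = 1..49")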
Again, be sure you understand the complexity terminology well.

Solving recurrence: T(n)=sqrt(2)T(n/2)+log(n) [closed]

Given the recurrence T(n) = sqrt(2)·T(n/2) + log(n).
The solution points to Case 1 of the Master Theorem, with a complexity class of O(sqrt(n)). However, to my understanding, log(n) is polynomially greater than sqrt(n). Am I missing something?
I used the definition as follows: n^e with e = log_b(a), where a = sqrt(2) and b = 2. This gives me e = 1/2 < 1. As I see it, log n is obviously polynomially greater than n^e.
No. log_x n is not greater than √n.
Consider n = 256: √n = 16, while log_2 256 = 8 (let us assume base x = 2, as with many computational problems).
In your recurrence,
T(n)= √2 T(n/2) + log(n)
a = √2, b = 2 and f(n) = log(n)
log_b a = log_2 √2 = 1/2.
Since log n < n^a for any constant a > 0 (for large enough n), we have Case 1 of the Master Theorem.
Therefore T(n) = Θ(√n).
Using the Master Theorem you get a = sqrt(2), b = 2, and therefore c = log_b(a) = 1/2. Your f(n) = log(n), and therefore you fall into the first case.
So your complexity is O(sqrt(n)).
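For intuition, here is a small Python sketch (the names and the T(1) = 1 base case are assumptions) that evaluates the recurrence numerically; if T(n) = Θ(√n), the ratio T(n)/√n should settle near a constant:

    from functools import lru_cache
    from math import log2, sqrt

    @lru_cache(maxsize=None)
    def t(n):
        """T(n) = sqrt(2) * T(n/2) + log2(n), with T(1) = 1 assumed."""
        if n <= 1:
            return 1.0
        return sqrt(2) * t(n // 2) + log2(n)

    for k in (10, 20, 30, 40):
        n = 2 ** k
        print(f"n = 2^{k}: T(n) / sqrt(n) = {t(n) / sqrt(n):.3f}")
    # The printed ratio approaches a constant, consistent with T(n) = Θ(√n).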

Lower bound: resource required by an algorithm for some class of input size n [closed]

I've been asked to find the lower bound of the following:
T(n) = 23n^3 - n^2 - n.
Here is how I proceeded, and I don't know whether I'm tackling it the proper way:
T(n) >= c(23n^2 - n^2) for all n >= n0
23n^3 - n^2 - n >= 22n^2 for all n >= 2
T(n) >= c·n^2 for all n >= 2
with c = 22, n0 = 2.
So T(n) is in Big Omega of n^2.
HELP PLEASE!
Note that n^3 >= n^2 for n >= 1. So, -n^3 <= -n^2 for n >= 1.
Note that n^3 >= n for n >= 1. So, -n^3 <= -n for n >= 1.
So
23n^3 - n^2 - n >= 23n^3 - n^3 - n^3 = 21n^3.
Thus, 21n^3 is a decent lower bound.
Intuitively this makes sense, as 23n^3 - n^2 - n is clearly cubic in nature and thus should have a lower bound and an upper bound of cn^3 for some c (a different c for the lower bound than for the upper bound).
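A quick spot-check of that bound, as a Python sketch (evidence for finitely many n, not a proof):

    # Spot-check 23n^3 - n^2 - n >= 21n^3 for n >= 1.
    # Equality holds at n = 1 (23 - 1 - 1 = 21), and the gap grows with n.
    for n in range(1, 10_000):
        assert 23 * n**3 - n**2 - n >= 21 * n**3, f"fails at n = {n}"
    print("23n^3 - n^2 - n >= 21n^3 holds for n = 1..9999")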
