Which algorithm for the knapsack problem has O(2^n * n) complexity?
I've been asked to implement a solution for the knapsack problem.
I'm familiar with programming but not with asymptotic notation.
Can anybody advise me on which algorithm has O(2^n * n) complexity?
O(n * 2^n) is the running time of the brute-force algorithm (= just try all 2^n combinations of items, spending O(n) on each to total its weight and value), see http://en.wikipedia.org/wiki/Knapsack_problem#Meet-in-the-Middle_Algorithm
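For illustration, here is a minimal Python sketch of that brute-force enumeration (the function name and sample data are my own, not from the linked article):

```python
def knapsack_bruteforce(weights, values, capacity):
    """Enumerate all 2^n subsets as bitmasks; evaluating each subset
    costs O(n), for O(n * 2^n) total."""
    n = len(weights)
    best = 0
    for mask in range(1 << n):          # 2^n subsets
        total_weight = total_value = 0
        for i in range(n):              # O(n) work per subset
            if mask & (1 << i):
                total_weight += weights[i]
                total_value += values[i]
        if total_weight <= capacity:
            best = max(best, total_value)
    return best

print(knapsack_bruteforce([2, 3, 4], [3, 4, 5], 5))  # 7: take the first two items
```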
Related
I was learning about time complexity recently, and I was wondering about the time complexity of an algorithm to compute a^n. My answer would be O(n).
However, I was thinking about the divide-and-conquer method: since a^(n/2) * a^(n/2) = a^n, is it possible to turn the O(n) algorithm for a^n into an algorithm with O(log n) time complexity? I was stuck thinking about what such an algorithm would look like and how it would work.
I would greatly appreciate it if anyone could share their thoughts with me.
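The divide-and-conquer idea described above is usually called exponentiation by squaring. A minimal Python sketch (the function name is mine):

```python
def fast_pow(a, n):
    """Exponentiation by squaring: the exponent halves on each call,
    so only O(log n) multiplications are performed."""
    if n == 0:
        return 1
    half = fast_pow(a, n // 2)   # compute a^(n//2) once and reuse it
    if n % 2 == 0:
        return half * half       # a^n = a^(n/2) * a^(n/2)
    return half * half * a       # odd n: one extra factor of a

print(fast_pow(2, 10))  # 1024
```

The key point is that `half` is computed once and then squared; recursing twice for the two halves would bring the complexity back up to O(n).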
I'm having problems calculating the complexity of the following recurrence equation.
It's quite difficult for me to solve. Could anyone help me solve this problem? Thanks in advance.
This is the same recurrence as the one for the average-case complexity of quicksort, with solution
T(n) = O(n log n)
derivation here
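A sketch of the derivation, assuming the recurrence in question is the standard quicksort average-case one (the recurrence itself is not shown above, so this is an assumption):

```latex
% Assumed recurrence (standard quicksort average case):
T(n) = (n - 1) + \frac{2}{n} \sum_{i=0}^{n-1} T(i)
% Multiply through by n, subtract the same identity written for n-1:
n\,T(n) = (n + 1)\,T(n - 1) + 2(n - 1)
% Divide by n(n+1) and telescope; the sum behaves like a harmonic series:
\frac{T(n)}{n + 1} = \frac{T(n - 1)}{n} + \frac{2(n - 1)}{n(n + 1)}
% so T(n)/(n+1) = O(\log n), i.e. T(n) = O(n \log n).
```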
If a polynomial-time algorithm for an NP-Complete problem is found, let's say it's O(n^2) hypothetically, does this imply that there is an O(n^2) solution for every NP problem? I know this would imply that there is a polynomial solution for all NP problems, but would it necessarily be O(n^2)?
Not necessarily
A problem x that is in NP is also in NP-Complete if and only if every other problem in NP can be quickly (ie. in polynomial time) transformed into x.
Therefore an algorithm that solves one NP-Complete problem means we can solve any other problem in NP by transforming it into an instance of that problem and solving that instance. The transformation can be of any complexity; as long as it is polynomial, we satisfy the condition.
So the answer to your question is no: an O(N^2) algorithm for an NP-Complete problem does not imply that all NP problems can be solved in O(N^2) time; it only guarantees that a polynomial-time algorithm exists for each of them.
I.e., the total cost is T(N) + O(M^2), where T(N) is the (polynomial) time to transform the problem instance and M is the size of the transformed instance, which may itself be polynomially larger than N. For example, a reduction running in O(N^3) that outputs an instance of size M = O(N^3) gives a total of O(N^3) + O((N^3)^2) = O(N^6): polynomial, but not O(N^2).
I have a 0-1 second-order cone (SOC) problem and I need to know the complexity of solving this problem when the branch-and-cut (B&C) method is used. The way I addressed this question is as follows:
The 0-1 SOC problem can be solved using the B&C method, which has exponential worst-case complexity, i.e., O(2^n). At each node of the B&C tree, the relaxed problem is an SOC problem, which can be solved using an interior-point method with polynomial-time complexity. However, I do not yet have an expression for the complexity of the interior-point method. Assuming this complexity is O(n), I can then claim that the complexity of solving the 0-1 problem using the B&C method is O(2^n) times O(n).
I don't think so. Along each of the 2^n branches you solve up to n nodes, each with complexity O(n). By my calculation it comes out to O(2^n * n^2).
When I was reading about quantum algorithms I came across the Deutsch-Jozsa algorithm. I see that if we want to solve that problem with a non-quantum algorithm, our algorithm would have exponential time complexity. Now I want to know: what is the time complexity of the Deutsch-Jozsa algorithm as a quantum algorithm on quantum computers?
According to Wikipedia, the complexity of the quantum algorithm is constant:
The Deutsch-Jozsa quantum algorithm produces an answer that is always correct with a single evaluation of f.
The algorithm itself is just some calculations on quantum states, without any iterations/..., so the complexity is O(1).
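For contrast, here is my own minimal sketch of the deterministic classical approach, which is what the single quantum evaluation of f replaces:

```python
def classical_deutsch_jozsa(f, n):
    """Classical deterministic test for an n-bit oracle f promised to be
    constant or balanced: in the worst case 2^(n-1) + 1 queries are needed
    before the answer is certain, hence the exponential classical cost."""
    first = f(0)
    for x in range(1, 2 ** (n - 1) + 1):   # up to 2^(n-1) further queries
        if f(x) != first:
            return "balanced"              # two differing outputs settle it
    return "constant"                      # a majority of inputs agree

print(classical_deutsch_jozsa(lambda x: x & 1, 3))  # balanced (parity of low bit)
```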