Algorithm - worst case time complexity considering space complexity [closed]

To solve a particular problem, two algorithms are available:
Algo 1 takes O(n) time and O(n) space in the worst case
Algo 2 takes O(n log n) time and O(1) space in the worst case
What is the worst-case time complexity to solve the problem, using algorithm 1 or 2, and why?

If you're asking which worst case is worse: Algo 2's worst-case time is worse, because n log n grows faster than n.
Edit (to answer the question raised in comments):
If you're asking which worst case is best: Algo 1's worst-case time is best, again because n log n grows faster than n.
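As a rough illustration (a hypothetical Python tabulation, not part of the original answer), here is how the two worst-case bounds grow once constant factors are ignored:

```python
import math

# Algo 1's worst case grows like n; Algo 2's like n * log2(n).
for n in (10, 1_000, 1_000_000):
    print(f"n = {n:>9}   Algo 1 ~ {n:>12,}   Algo 2 ~ {round(n * math.log2(n)):>14,}")
```

At n = 1,000,000 the n log n bound is already about 20 times larger, which is why Algo 2's worst-case time is the worse of the two.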

Related

N*2^N vs N*N Time complexity [closed]

Which time complexity is better N*(2^N) or N^2 and why?
N*(2^N) or N^2
N*(2^N) is exponential.
If you take n=10, for example, you get 10240
N^2 is merely polynomial.
If you take n=10, for example, you get 100
Exponential is worse than polynomial for large N, and even for moderate N, as in your example. To see it intuitively, imagine increasing N by 1. In the polynomial case the result grows by the factor ((N+1)/N)^2, which is barely more than 1 for large N. In the exponential case, increasing N by 1 more than doubles the result (the 2^N factor alone doubles).
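A rough tabulation (hypothetical Python, just to make the growth concrete):

```python
# Compare N * 2^N (exponential) with N^2 (polynomial) for a few values of N.
for n in (5, 10, 20, 30):
    print(f"N = {n:>2}   N*2^N = {n * 2 ** n:>15,}   N^2 = {n ** 2:>4}")
```

By N = 30 the exponential expression already exceeds 32 billion, while N^2 is only 900.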

Why is the time complexity of Heapsort O(n logn) and not O(log(n!))? [closed]

In Heapsort, each iteration reduces the heap size by one, so the total work should be the sum over i = 1 to N of O(log i), which is O(log(n!)). So why can't we just report the time complexity of Heapsort as O(log(n!))?
I came across Stirling's approximation while trying to answer this, and realised that log(n!) tends to n log n as n tends to infinity. Is that the reason we agree on O(n log n) instead of O(log(n!)), even though log(n!) is smaller than n log n for a wide range of values?
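For reference, a short standard derivation (not part of the original thread) of why log(n!) and n log n agree up to constant factors:

```latex
% Upper bound: each of the n terms is at most log n
\log(n!) \;=\; \sum_{i=1}^{n} \log i \;\le\; n \log n
% Lower bound: the largest n/2 terms are each at least log(n/2)
\log(n!) \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} \log i \;\ge\; \frac{n}{2}\,\log\frac{n}{2} \;=\; \Omega(n \log n)
```

So log(n!) = Θ(n log n): O(log(n!)) and O(n log n) describe the same class, and the simpler form is the one conventionally reported.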

Advantage of using Dinic's O((V^2)E) algorithm over Edmonds-Karp algorithm O(V(E^2)) [closed]

Is there any advantage of using Dinic's O((V^2)E) algorithm over the Edmonds-Karp O(V(E^2)) algorithm?
In other words, I want to know how O((V^2)E) is better than O(V(E^2)) from a competitive programming point of view.
Let's say the total number of vertices is n. Usually, the number of edges in a connected graph lies somewhere between n and n^2.
The input graphs are mostly not very sparse, so in the majority of cases the number of edges is larger than n (perhaps around n log n, or n^2 in the worst case).
So in the worst case of a dense graph with E = Θ(n^2), O(V^2 * E) is O(n^4), whereas O(V * E^2) is O(n^5). Hence the advantage of the O(V^2 * E) algorithm over the O(V * E^2) one.
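A tiny sketch (hypothetical Python, only to illustrate the arithmetic above) comparing the two bounds on a dense graph with V = n and E ≈ n^2:

```python
# For a dense graph, take V = n and E = n*n, then compare the two bounds.
for n in (100, 1_000):
    V, E = n, n * n
    dinic = V * V * E         # O(V^2 * E) -> about n^4
    edmonds_karp = V * E * E  # O(V * E^2) -> about n^5
    print(f"n = {n:>5}   V^2*E = {dinic:.1e}   V*E^2 = {edmonds_karp:.1e}")
```

On a dense graph the ratio between the two bounds is E/V ≈ n, so the O(V^2 * E) bound wins by roughly a factor of n.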

Prove n^2 + 5 log(n) = O(n^2) [closed]

I am trying to prove that n^2 + 5 log(n) = O(n^2), O representing big-O notation. I am not great with proofs and any help would be appreciated.
Informally, the big-O of a sum is governed by its fastest-growing term as n grows arbitrarily large. Since n^2 grows much faster than log(n), the result should be clear.
More formally, f(n) = O(g(n)) means there exist constants c > 0 and n0 such that f(n) <= c*g(n) for all n >= n0. A convenient sufficient check is that lim(n->inf) f(n)/g(n) is finite. Here lim(n->inf) (n^2 + 5 log(n))/n^2 = 1, because 5 log(n)/n^2 -> 0, so the ratio is bounded and the big-O bound holds.
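A compact direct proof from the definition (the constants c = 6 and n_0 = 1 below are just one convenient choice):

```latex
% For all n >= 1 we have \log n \le n \le n^2, hence
n^2 + 5\log n \;\le\; n^2 + 5n^2 \;=\; 6n^2 \qquad (n \ge 1).
% The definition of big-O is therefore satisfied with c = 6 and n_0 = 1:
n^2 + 5\log n \;\le\; c\,n^2 \quad \text{for all } n \ge n_0 \;\Longrightarrow\; n^2 + 5\log n = O(n^2).
```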

Same space but different time complexity [closed]

I compared two algorithms, A and B. I found that their time complexities are different, O(n^3) and O(n^4) respectively, but the space complexity of both A and B is the same, O(n). Is that possible?
That is possible, yes. The time complexity and the space complexity are not necessarily related to each other.
As an example, think of the common sorting algorithms: both insertion sort and heapsort have a space complexity of O(1), while their time complexities are different (O(n^2) versus O(n log n)).
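A minimal sketch (hypothetical Python, not from the original answer) of two solutions to the same problem that use the same O(1) extra space but have different time complexities:

```python
# Problem: does a sorted list contain two elements that sum to `target`?

def has_pair_quadratic(a, target):
    """O(n^2) time, O(1) extra space: check every pair."""
    n = len(a)
    for i in range(n):
        for j in range(i + 1, n):
            if a[i] + a[j] == target:
                return True
    return False

def has_pair_linear(a, target):
    """O(n) time, O(1) extra space: two pointers on the sorted input."""
    lo, hi = 0, len(a) - 1
    while lo < hi:
        s = a[lo] + a[hi]
        if s == target:
            return True
        if s < target:
            lo += 1
        else:
            hi -= 1
    return False

print(has_pair_quadratic([1, 3, 4, 8], 7), has_pair_linear([1, 3, 4, 8], 7))  # True True
```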
