I was learning about time complexity recently, and I was wondering about the time complexity of an algorithm to compute a^n. My answer would be O(n), since the naive approach multiplies a by itself n times.
However, I was thinking about the divide-and-conquer method, since a^(n/2) * a^(n/2) = a^n. Is it possible to turn this into an algorithm with O(log n) time complexity? I was stuck thinking about what such an algorithm would look like and how it would work.
I would greatly appreciate it if anyone could share their thoughts with me.
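The divide-and-conquer idea in the question can be sketched like this (a minimal illustration, assuming n is a non-negative integer; `fast_pow` is a made-up name). The key is to compute a^(n/2) once and square it, rather than computing it twice, which gives the recurrence T(n) = T(n/2) + O(1), i.e. O(log n) multiplications:

```python
def fast_pow(a, n):
    """Compute a**n by repeated squaring in O(log n) multiplications."""
    if n == 0:
        return 1
    half = fast_pow(a, n // 2)   # a**(n//2), computed once and reused
    if n % 2 == 0:
        return half * half        # even n: a**n = a**(n/2) * a**(n/2)
    return half * half * a        # odd n: one extra factor of a

print(fast_pow(2, 10))  # 1024
```

Note that if you instead wrote `fast_pow(a, n // 2) * fast_pow(a, n // 2)`, each level would spawn two recursive calls and the total work would be O(n) again, so reusing the half-result is the whole point.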
Related
How do you develop programs with as little time complexity as possible?
There is no proven method to come up with the optimal algorithm for a given problem.
There are many problems for which it is likely that no one has yet come up with the most efficient algorithm.
For example: what is the best possible time complexity for matrix multiplication? At the time of writing, no one knows what that theoretical best time complexity is, let alone whether anyone has designed an algorithm that achieves it.
Dear friends, I have some confusion regarding the time complexity of my algorithm. My algorithm has time complexity 3^(0.5n). Is it correct to write 3^(0.5n) as (3^0.5)^n? I saw this in a thesis.
Yes, that is correct. It follows from the standard identity for exponentiation:
(a^b)^c = a^(b*c)
Here a = 3, b = 0.5, and c = n, so 3^(0.5n) = (3^0.5)^n ≈ 1.732^n.
But what is the relation of the math formula to programming?
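One practical connection: the identity lets you rewrite a bound of the form 3^(0.5n) as c^n with a single base c = 3^0.5 ≈ 1.732, which makes it easier to compare against other exponential running times. A quick numerical check (illustrative only, with an arbitrary n):

```python
import math

n = 10
# 3**(0.5*n) and (3**0.5)**n are the same number, so a running time of
# 3**(0.5*n) can equally be described as c**n with c = sqrt(3) ~ 1.732.
assert math.isclose(3 ** (0.5 * n), (3 ** 0.5) ** n)
print(3 ** 0.5)  # the base c ~ 1.7320508...
```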
Hey just a quick question,
I've just started looking into algorithm analysis and I'm attempting to learn Big-Oh notation.
The algorithm I'm looking at uses a quicksort (of complexity O(n log n)) to sort a dataset, and then the part that operates on the set itself has a worst-case run-time of n/10, i.e. complexity O(n).
I believe that the overall complexity of the algorithm would just be O(n), because it's the highest-order term, which makes the complexity of the quicksort redundant. Could someone confirm this or tell me if I'm doing something wrong?
Wrong.
Quicksort has worst-case complexity O(n^2). But even if you use a sorting algorithm that is O(n log n) in the worst case, that still grows faster than O(n), so the sort is the dominant term: the overall complexity is O(n log n), not O(n).
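To see why the sort dominates, here is a hypothetical two-phase algorithm with the same shape (a made-up example, not the asker's code): an O(n log n) sort followed by an O(n) scan, for an overall O(n log n):

```python
def max_gap(xs):
    """Largest gap between consecutive values: O(n log n) + O(n) = O(n log n)."""
    xs = sorted(xs)                                 # dominant term: O(n log n)
    return max(b - a for a, b in zip(xs, xs[1:]))   # linear pass: O(n)

print(max_gap([3, 9, 1, 6]))  # 3  (the gap between 6 and 9)
```

When summing the phases of an algorithm, only the fastest-growing term survives in the Big-O, and here that term is the n log n of the sort.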
I was wondering if anyone could explain the A* time complexity.
I am using a heuristic based on Euclidean distance for the weight estimate. There are no loops in the heuristic function.
So I think that the time complexity of the heuristic is O(1).
Taking this into account, what would the A* complexity be and how is that derived?
You can find an answer here:
Why is the complexity of A* exponential in memory?
The reasoning for time complexity is analogous to that for memory complexity: in the worst case A* may expand every node it generates, so its running time is exponential in the depth of the solution, O(b^d) for branching factor b and solution depth d, even with an O(1) heuristic.
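To make the exponential bound concrete: with a heuristic that prunes nothing, A* can expand on the order of b^d nodes, where b is the branching factor and d is the depth of the shallowest solution. A small illustrative count (values chosen arbitrarily):

```python
def worst_case_nodes(b, d):
    """Total nodes in a uniform tree of branching factor b up to depth d:
    1 + b + b^2 + ... + b^d, which is O(b^d)."""
    return sum(b ** i for i in range(d + 1))

print(worst_case_nodes(3, 5))   # 364
print(worst_case_nodes(3, 10))  # 88573
```

Doubling the depth multiplies the worst-case node count by roughly b^d again, which is why both the time and the memory of A* blow up exponentially in the worst case.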
If we know the lower bound for the time complexity of a problem is Ω(n^2), am I correct in thinking it is not possible to have an algorithm with worst-case time complexity O(n log n)?
If the lower bound for the time complexity of a problem is Ω(n^2), that means any algorithm solving the problem must take at least C*n^2 time, for some constant C > 0 and all sufficiently large n.
On the other hand, suppose you had an algorithm with worst-case time complexity O(n log n): it takes at most K*n*log n time.
For large enough n, K*n*log n < C*n^2, so this algorithm would finish before doing the n^2 amount of work the problem provably requires.
Therefore it is impossible for such an algorithm to solve this problem. You are correct.
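One way to see the "for large enough n" caveat concretely: even if the Ω(n^2) constant is tiny and the O(n log n) constant is huge, n^2 overtakes n log n eventually. A small illustrative script (the constants are chosen arbitrarily):

```python
import math

C, K = 1.0, 100.0   # arbitrary constants for illustration
# Even with a 100x larger constant on the n*log n side, n^2 eventually wins.
n = 2
while C * n * n <= K * n * math.log(n):
    n += 1
print(n)  # first n at which C*n^2 exceeds K*n*log(n)
```

Past that crossover point, the gap only widens, which is exactly why the asymptotic lower bound rules the O(n log n) algorithm out.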