Describing space complexity of algorithms

I am currently a CS major, and I am practising different algorithmic questions. I make sure I try to analyse the time and space complexity of every question.
However, I have a question:
If an algorithm contains two steps that each call a recursive function on an input of a different size, i.e.
int a = findAns(arr1);
int b = findAns(arr2);
return max(a, b);
Would the worst-case time complexity of this be O(N1) + O(N2), or simply O(max(N1, N2))? I ask because at any one time we are calling the function with only a single input array.
While calculating worst-case space complexity, suppose it comes out to be O(N) + O(logN). Since N > logN, would we discard the O(logN) term, even though logN also depends on N, and say the worst-case space complexity is O(N) only?
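(For what it's worth, the two forms for the time complexity agree up to a constant factor: since max(N1, N2) <= N1 + N2 <= 2*max(N1, N2), we have O(N1) + O(N2) = O(N1 + N2) = O(max(N1, N2)). As a minimal sketch of the counting argument, with findAns replaced by a hypothetical linear-time stand-in:)

    /* Hypothetical linear-time stand-in for findAns; `ops` counts the
       basic operations so the totals can be compared directly. */
    static long long ops = 0;

    int findAns(const int *arr, int n)
    {
        int best = arr[0];
        for (int i = 1; i < n; i++) {  /* n - 1 iterations: O(n) */
            ops++;
            if (arr[i] > best)
                best = arr[i];
        }
        return best;
    }
    /* Calling findAns on arr1 (size N1) and then on arr2 (size N2) performs
       about N1 + N2 operations in total, i.e. O(N1 + N2) = O(max(N1, N2)). */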

Related

Complexity of two Methods

If I have a method that inserts an element into a heap with the following code:
(1) If the array is full, create a new array of size original.length * 2, then copy each element from the original array to the new one.
(2) To restore the heap property, percolate each element up/down to its proper position.
So the worst-case complexities are O(n) for (1) and O(logn) for (2).
My question is: what is the sum of the two complexities? How do I calculate the worst-case complexity of this algorithm?
Thanks!
For situations like this, if you follow the textbook approach, the worst-case complexity of the algorithm would be
= O(n) + O(logn)
= O(n)
So, the complexity would be O(n).
Actually, the name "worst case complexity" gives you the answer. You should ask yourself the question:
Is there any case where the complexity is O(n)?
If yes, then that's the worst-case complexity.
If you are inserting N elements one by one, then the siftUp/siftDown process executes every time, and the time dedicated to these procedures is O(NlogN) (as the sum log1 + log2 + ... + log(N-1) + logN = log(N!), which is O(NlogN)).
But array expansion happens seldom. The last expansion takes N steps, the previous one N/2 steps, and so on. The time dedicated to these procedures is
N + N/2 + N/4 + ... + 1 = N*(1 + 1/2 + 1/4 + ...) = 2N = O(N)
So the amortized time for the expansion part is O(1) per element, and the amortized time for the insertion part is O(logN).
The overall complexity for N elements is
O(N) + O(NlogN) = O(NlogN)
or
O(logN) per element.
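To make the two parts of the insert concrete, here is a minimal sketch of a binary max-heap insert with array doubling (the Heap struct and its field names are our own, it assumes the heap starts with a nonzero capacity, and error checking is omitted):

    #include <stdlib.h>

    typedef struct {
        int *data;
        int size;      /* elements currently stored */
        int capacity;  /* allocated slots */
    } Heap;

    void heap_insert(Heap *h, int value)
    {
        /* (1) Expansion: triggered rarely; the total copying over N inserts
           is N + N/2 + N/4 + ... = O(N), i.e. amortized O(1) per insert. */
        if (h->size == h->capacity) {
            h->capacity *= 2;
            h->data = realloc(h->data, h->capacity * sizeof *h->data);
        }
        /* (2) Percolate up: at most log(n) swaps, i.e. O(logn) per insert. */
        int i = h->size++;
        h->data[i] = value;
        while (i > 0 && h->data[(i - 1) / 2] < h->data[i]) {
            int parent = (i - 1) / 2;
            int tmp = h->data[i];
            h->data[i] = h->data[parent];
            h->data[parent] = tmp;
            i = parent;
        }
    }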

How to add Big O and Big Omega

If an algorithm has two sub-algorithms, and the input that is the best case for sub-algorithm A1 is the worst case for sub-algorithm A2, how can I find the overall complexity of the algorithm?
Put simply: Ω(N) + O(N) = ?
I know that if the algorithms execute in sequential order the overall complexity is O(N) + O(N), and in nested order it is O(N) * O(N).
Please tell me about both cases: sequential and nested order.
Essentially, Ω(N) + O(N) = Ω(N), because O(N) is of lower (or at most the same) order than Ω(N). When they are summed, the lower-order term can be omitted.
If your algorithm includes one operation which takes (for example) O(N) time, and another which takes O(N^2) time, then the overall complexity is O(N^2). There's no such thing as O(N^2 + N). The same goes for Ω(). This answers your question about "sequential executing order".
If your algorithm includes N operations, each of which takes O(N) time, then the overall complexity is O(N^2). The same goes for Ω(). You just multiply the polynomials, and take the term which grows most quickly with increasing N. This answers your question about "nested execution order".
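A minimal sketch of the two compositions, using plain counting loops as stand-ins for the sub-algorithms:

    /* Sequential: an O(N) pass followed by an O(N^2) pass.
       Total work is N + N^2, and the N^2 term dominates: O(N^2). */
    long long sequential(int n)
    {
        long long ops = 0;
        for (int i = 0; i < n; i++)
            ops++;                      /* O(N) */
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;                  /* O(N^2) */
        return ops;                     /* n + n*n */
    }

    /* Nested: N iterations, each doing O(N) work -> O(N) * O(N) = O(N^2). */
    long long nested(int n)
    {
        long long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;
        return ops;                     /* n*n */
    }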

Time and space complexity

I have a question related to time and space complexity in the following two cases.
Case I:
Recursion: factorial calculation.
int fact(int n)
{
    if (n == 0)
        return 1;
    else
        return n * fact(n - 1);
}
Here, how come the time complexity becomes 2*n and the space complexity is proportional to n?
and
Case II:
Iterative:
int fact(int n)
{
    int i, result = 1;
    if (n == 0)
        result = 1;
    else
    {
        for (i = 1; i <= n; i++)
            result *= i;
    }
    return result;
}
Here the time complexity is proportional to n and the space complexity is constant.
This has always been confusing to me.
If my reasoning is faulty, somebody please correct me :)
For the space complexity, here's my explanation:
For the recursive solution, there will be n recursive calls, so there will be n stack frames in use, one for each call. Hence the O(n) space. This is not the case for the iterative solution: there's just one stack frame, and you're not even using an array; there is only one variable. So the space complexity is O(1).
For the time complexity of the iterative solution, you have n multiplications in the for loop, so a loose bound will be O(n). Every other operation can be assumed to take unit or constant time, with no bearing on the overall efficiency of the algorithm. For the recursive solution, I am not so sure; if there were two recursive calls at each step, you could think of the entire call tree as a balanced binary tree with a total of about 2*n - 1 nodes. Here, though, each step makes a single recursive call, so the call chain is just a straight line of n + 1 calls, giving O(n) time.
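To illustrate the two-recursive-calls case mentioned above, here is a hedged sketch of a hypothetical divide-and-conquer sum (not from the question): each call on more than one element makes two recursive calls on the halves, so for n leaves the call tree is a balanced binary tree with 2*n - 1 nodes, while the stack depth is only O(logn).

    static int calls = 0;

    /* Sums a[lo..hi) by splitting in half; two recursive calls per step. */
    long long dc_sum(const int *a, int lo, int hi)
    {
        calls++;
        if (hi - lo == 1)
            return a[lo];
        int mid = lo + (hi - lo) / 2;
        return dc_sum(a, lo, mid) + dc_sum(a, mid, hi);
    }
    /* For an array of n = 8 elements, calls ends up as 15 = 2*8 - 1. */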
From: https://cs.nyu.edu/~acase/fa14/CS2/module_extras.php
Space Complexity
Below we're going to compare three different calls to both the iterative and recursive factorial functions and see how memory gets used. Keep in mind that every variable we declare has to have space reserved in memory to store its data. So the space complexity of an algorithm in its simplest form is the number of variables in use, and in this simplest situation we can calculate approximate space complexity using this equation:
space complexity = number of function calls * number of variables per call
Time Complexity: The number of (machine) instructions which a program executes during its running time is called its time complexity in computer science.
Space Complexity: This is essentially the number of memory cells which an algorithm needs.
Case 1: The program calculates the factorial recursively, so there are n direct calls going down to the base case and then n returns backtracking up, which is why the time complexity becomes 2*n.
Talking about the space complexity, there will be n stack frames live at the deepest point of execution, so it is n.
Case 2: This case is pretty simple: here you have n iterations inside the for loop, so the time complexity is n.
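A quick way to convince yourself of both counts in Case 1 is to instrument the recursive function; this is a minimal sketch where the calls/depth counters are our own additions, not part of the original code:

    #include <stdio.h>

    static int calls = 0;      /* total invocations of fact */
    static int depth = 0;      /* current recursion depth */
    static int max_depth = 0;  /* deepest point reached = stack frames in use */

    long long fact(int n)
    {
        calls++;
        if (++depth > max_depth)
            max_depth = depth;
        long long result = (n == 0) ? 1 : n * fact(n - 1);
        depth--;
        return result;
    }

    int main(void)
    {
        printf("fact(10) = %lld\n", fact(10));
        printf("calls = %d, max depth = %d\n", calls, max_depth); /* 11 and 11 */
        return 0;
    }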

Average Time Complexity of a sorting algorithm

I have a treesort function which performs two distinct tasks, each with its own time complexity. I figured out the average-case time complexity of the two tasks, but how do I find the overall complexity of the algorithm?
For example, the algorithm takes in a random list of "n" keys x:
Sort(x):
    Insert(x)      # Time complexity of O(n log(n))
    Traverse(x)    # Time complexity of O(n)
Do I just add the two complexities together to give O(n + n log(n)), or do I take the dominant task (in this case Insert) and end up with an overall complexity of O(n log(n))?
In a simple case like this,
O(n) + O(n log(n)) = O(n + n log(n))
                   = O(n (log(n) + 1))
                   = O(n log(n))
or do I take the dominant task (in this case Insert) and end up with an overall complexity of O(n log(n))
That's right. As n grows, the first term in the O(n + n log(n)) sum will become less and less significant. Thus, for sufficiently large n, its contribution can be ignored.
You need to take the dominant one.
The whole idea of measuring complexity this way is based on the assumption that you want to know what happens for large values of n.
So if you have a polynomial, you can discard all but the highest order element, if you have a logarithm, you can ignore the base and so on.
In everyday practice however, these differences may start to matter, so it's sometimes good to have a more precise picture of your algorithm's complexity, even down to the level where you assign different weights to different operations.
(Returning to your original question: assuming you're using base-2 logarithms, at n = 1048576 the difference between n + n*log(n) and n*log(n) is around 5%, which is probably not really worth worrying about.)

Complexity for recursive functions - Time and Space

I was interested in knowing how to calculate the time and space complexity of recursive functions like permutation and Fibonacci (described here).
In general we can have recursion in many more places than just permutations or Fibonacci, so I am looking for the approach generally followed to calculate the time and space complexity of recursive functions.
Thank you
Take a look at http://www.cs.duke.edu/~ola/ap/recurrence.html
Time complexity and space complexity are the two things that characterize the performance of an algorithm. Nowadays, as space is relatively inexpensive, people mostly worry about time complexity, and time complexity is mostly expressed in terms of a recurrence relation.
Consider the binary search algorithm (searching for an element in a sorted array): each time, the middle element (mid) is selected and compared with the element (x) being searched for. If mid > x, search the lower sub-array; otherwise search the upper sub-array. If there are n elements in the array, and T(n) represents the time complexity of the algorithm, then
T(n) = T(n/2) + c, where c is a constant. With the given boundary condition T(1) = 1, we can solve for T(n); in this case it comes out to T(n) = O(log n).
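A minimal sketch of the recursive binary search that the recurrence describes (assuming a sorted array; the half-open index convention is ours):

    /* Searches for x in the sorted range a[lo..hi); returns its index or -1. */
    int binary_search(const int *a, int lo, int hi, int x)
    {
        if (lo >= hi)
            return -1;                 /* empty range: not found */
        int mid = lo + (hi - lo) / 2;  /* the constant-time work, c */
        if (a[mid] == x)
            return mid;
        if (a[mid] > x)
            return binary_search(a, lo, mid, x);      /* lower sub-array */
        return binary_search(a, mid + 1, hi, x);      /* upper sub-array */
    }

Each call does constant work and recurses on half the range, giving T(n) = T(n/2) + c = O(log n); since the recursion depth is also log n, the stack space is O(log n) as well.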
