Calculating Algorithm Efficiency and Complexity - performance

My instructor taught us how to calculate the computational complexity of an algorithm, but she only did so very briefly and not so well. More specifically, could someone help me calculate the computational complexity of the following:
While (PLength > 0)
    chartest = sHex(i)
    ascvalue = Strings.Asc(chartest)
    decvalue = Convert.ToDecimal(ascvalue)
    shiftdecvalue = decvalue + 1
    asc = ChrW(shiftdecvalue)
    emptychararray(i) = asc
    i = i + 1
    PLength = PLength - 1
End While
The way I've thought about it, it just comes out to be T(n) = C1*n + C2*(n-1) + C3*(n-1) + C4*(n-1) + C5*(n-1) + C6*(n-1) + C7*(n-1) + C8*(n-1) + C9*(n-1) + C10*(n-1)
But I feel like I might be oversimplifying it. Furthermore, how do I get the big O notation for this? I've been looking online for resources on how to teach myself this, so if you have any recommendations, those would be greatly appreciated. Thank you in advance.

First note that T(n) can be rewritten as T(n) = C1*n + C2 by collecting the constants. Then note that big O notation ignores constant factors and lower-order terms, so the complexity is O(n).
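To see this concretely, here is a Python sketch of the loop (a translation for illustration, not the original VB.NET), with the per-iteration cost noted in comments:

```python
def shift_chars(s_hex):
    """Shift each character's code point up by one, as in the VB.NET
    loop above. Each iteration does a constant amount of work, so the
    whole loop runs in O(n) time, where n = len(s_hex)."""
    out = []
    for ch in s_hex:                   # executes n times
        out.append(chr(ord(ch) + 1))   # constant work per character
    return "".join(out)

print(shift_chars("abc"))  # prints "bcd"
```

Whatever the constants C1..C10 are, the total work is (sum of constants) * n plus a fixed setup cost, which is exactly the T(n) = C1*n + C2 form.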

Related

Order of growth determination

I am practicing order of growth and I was having trouble determining the order of growth for the following function:
def ri(na):
    if na <= 1:
        return na
    def han(na):
        i = 1
        while i < na:
            i *= 2
        return i
    return ri(na/2) + ri(na/2) + han(na-2)
I believe the function han has an order of growth Θ(log n), but I'm not sure how to think about this when adding the ri(na/2) calls. I would appreciate it if anyone could help me figure out how to compute the run time. Thanks so much!
The time complexity of the han function is Θ(log n), since i is multiplied by 2 on each iteration. Hence the recurrence for ri is T(n) = 2T(n/2) + Θ(log n). Using the master theorem (case 1: log n = O(n^(1-ε)) for, e.g., ε = 1/2, so the leaves dominate), T(n) = Θ(n).
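You can check both claims empirically. The sketch below (my own instrumentation, not from the question) counts the doubling steps in han and evaluates the recurrence T(n) = 2T(n/2) + log2(n) numerically to see that it grows linearly:

```python
import math

def han_steps(na):
    # count how many times i doubles before reaching na: ~log2(na) steps
    i, steps = 1, 0
    while i < na:
        i *= 2
        steps += 1
    return steps

def t(n):
    # evaluate T(n) = 2*T(n/2) + log2(n) with the base case T(1) = 1
    if n <= 1:
        return 1
    return 2 * t(n // 2) + math.log2(n)

print(han_steps(1024))          # prints 10: log2(1024) doublings
print(t(1 << 16) / t(1 << 15))  # close to 2: doubling n doubles T(n)
```

The ratio approaching 2 is what Θ(n) growth looks like; if T(n) were Θ(n log n), the ratio would drift above 2 as n grows.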

Solve recurrence T(n) = T(6n/5) + 1

So I'm preparing for the Algorithms exam and I don't know how to solve the recurrence T(n) = T(6n/5) + 1, since b = 5/6 < 1 and the Master theorem cannot be applied.
I hope that someone can give me a hint on how to solve this. :)
Given just that recurrence relation (and no additional information such as T(x) = 1 when x > 100), an algorithm whose running time is described by the relation will never terminate, since the argument grows at each call.
T(n) = T(6n/5) + 1
= T(36n/25) + 2
= T(216n/125) + 3
= ...
You can see that the amount of work increases each call, and that it's not going to have a limit as to how much it increases by. As a result, there is no bound on the time complexity of the function.
We can even (informally) argue that such an algorithm cannot exist: growing the input by a factor of 6/5 at each call requires producing at least 0.2n additional data, which is Ω(n) work, but the recurrence claims the cost of each step is 1, i.e. O(1). So no algorithm can be described by this exact recurrence (though recurrences like T(n) = T(6n/5) + n are fine).
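If you do add a base case of the kind the answer mentions, say T(x) = 1 once x reaches some cap N (the cap is an assumption for illustration, not part of the original problem), the expansion terminates after about log_{6/5}(N/n) steps:

```python
def steps_to_cap(n, cap):
    # count the +1 terms accumulated before n grows past the cap;
    # n is multiplied by 6/5 each step, so this is roughly
    # log base 6/5 of (cap / n)
    steps = 0
    while n < cap:
        n = 6 * n / 5
        steps += 1
    return steps
```

For example, steps_to_cap(1, 100) comes out in the mid-20s, matching log_{1.2}(100) ≈ 25.3, so with a base case the recurrence would resolve to O(log N).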

Recurrence: T(n) = 3T(n/2) + n^2(lgn)

Here is the full question...
Analysis of recurrence trees. Find the nice nonrecursive function f(n) such that T(n) = Θ(f(n)). Show your work: what is the number of levels, the number of instances on each level, the work of each instance, and the total work on that level.
This is a homework question so I do not expect exact answers, but I would like some guidance because I have no idea where to start. Here is part a:
a) T(n) = 3T(n/2) + n^2(lgn)
I really have no idea where to begin.
Recurrences of this type are solved with the Master theorem.
In your case a = 3, b = 2, and therefore c = log2(3) ≈ 1.585 < 2.
So you are in the third case (f(n) = n^2 lg n grows polynomially faster than n^c, and the regularity condition holds), and your complexity is Θ(n^2 log n).
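You can sanity-check the master-theorem answer by evaluating the recurrence numerically (a rough sketch; the base case T(1) = 1 is an assumption):

```python
import math

def t(n):
    # evaluate T(n) = 3*T(n/2) + n^2 * log2(n) with T(1) = 1
    if n <= 1:
        return 1
    return 3 * t(n // 2) + n * n * math.log2(n)

# if T(n) = Theta(n^2 log n), doubling n should multiply T(n) by
# roughly 4 * log(2n)/log(n), i.e. a bit more than 4
print(t(1 << 14) / t(1 << 13))
```

The ratio sits just above 4, consistent with Θ(n^2 log n); a Θ(n^(log2 3)) solution (case 1) would give a ratio near 3 instead.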

How to convert time analysis to O(n)?

Rookie computer science student here, have a question I'm having some trouble answering.
I have a tree traversal algorithm, the time performance of which is O(bm), where b is the branching factor and m is the max depth of the tree. I was wondering how one takes this and converts it into standard asymptotic time analysis (i.e. O(n), O(n^2), etc.).
Same question for a different algorithm I have which is O(b^m).
I have gone through my textbook extensively and not found a clear answer about this. Asymptotic time analysis usually relates to input (n) but I'm not sure what n would mean in this instance. I suppose it would be m?
In general, what do you do when you have multiple inputs?
Thank you for your time.
You should start by building a recurrence. For example, consider binary search. The recurrence is T(n) = T(n/2) + c. When you solve it, you get
T(n) = T(n/2) + c
= T(n/4) + c + c
= T(n/8) + c + c + c
...
= T(n/2^k) + kc
The expansion stops when n = 2^k, i.e. k = log_2(n). So the complexity is c*log_2(n), which is O(log n).
Now, let us look at another situation where the input is divided into 5 parts, and the results combined in linear time. This recurrence will be
T(n) = 5T(n/5) + n
= 5^2T(n/5^2) + 2n
...
= 5^kT(n/5^k) + kn
This stops when n = 5^k, i.e. k = log_5(n). Substituting above, the complexity is n*log_5(n), which is O(n log n).
I guess you should be able to take it from here on.
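To make the binary-search example concrete, here is a sketch with a step counter added for illustration:

```python
def binary_search(a, target):
    """Standard binary search over a sorted list. The `steps` counter
    records how many times the range is halved, illustrating the
    recurrence T(n) = T(n/2) + c, i.e. O(log n) steps."""
    lo, hi, steps = 0, len(a), 0
    while lo < hi:
        steps += 1
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid, steps
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return -1, steps

# a 1024-element list is halved about log2(1024) = 10 times
print(binary_search(list(range(1024)), -1))
```

Note that `steps` is the k in the expansion above: the loop runs until the range n/2^k shrinks to nothing.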

Rules of Big O - Question

If I have the following closed form solution for a recurrence relation, how can I simplify it under big O:
f(n) = 3^n + n.9^n
I would hazard a guess at:
f(n) is a member of O(9^n)? I am not sure if this is right. Could someone please let me know how to simplify the above equation under big O, and also state which rule you used...
Thanks in advance
http://en.wikipedia.org/wiki/Big_O_notation
If f(x) is a sum of several terms, the one with the largest growth rate is kept, and all others omitted.
So O(n * 9^n), assuming that by n.9^n you meant n * 9^n. The n * 9^n term has the largest growth rate, so the 3^n term is dropped.
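You can see the dominance numerically: the fraction of f(n) contributed by the 3^n term vanishes as n grows (a quick check of the keep-the-largest-term rule quoted above):

```python
def fraction_from_3n(n):
    # share of f(n) = 3**n + n * 9**n contributed by the 3**n term;
    # this equals 1 / (1 + n * 3**n), which shrinks to 0 rapidly,
    # so f(n) = Theta(n * 9**n)
    return 3**n / (3**n + n * 9**n)

print(fraction_from_3n(10))  # already far below 0.001%
```

This also shows why O(9^n) alone is not tight: dropping the n factor loses a genuinely growing term, not just a constant.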
A simple chain of relations that helps here is:
O(1) < O(log N) < O(N^ε) < O(N) < O(N log N) < O(N^c) < O(c^N) < O(N!) < O(N^N)
for c > 1 and 0 < ε < 1.
See the Wikipedia article on big O notation for a better understanding.
