What are the most important algorithms? [closed]

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 12 years ago.
Christoph Koutschan has set up an interesting survey that tries to identify the most important algorithms "in the world". Since one of the criteria is that "the algorithm has to be widely used", I thought that extending the survey to the huge group of users at Stack Overflow would be a natural thing to do.
So, what do you think? Which algorithms deserve a place in the Algorithm Hall of Fame?

I somewhat like this algorithm:
1. Write code.
2. Test code. If buggy, go to step 3. If not, go to step 4.
3. Rewrite code, then go back to step 2.
4. Get somebody else to test your code. If they discover any bugs, return to step 3, otherwise go to step 5.
5. Congratulations, your code has no obvious bugs! Now you wait for a user to stumble upon a hidden one, in which case you return to step 3 once again, unless you're lucky and are no longer providing support for the code in question.

I'd say binary search since it's usually the first algorithm people learn. And the RSA encryption algorithms are pretty important.
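In case it helps jog anyone's memory, here is a minimal binary search in Python (just an illustrative sketch, not part of the original answer):

    def binary_search(sorted_items, target):
        # Return the index of target in sorted_items, or -1 if it is absent.
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            elif sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    print(binary_search([1, 3, 5, 7, 11], 7))  # prints 3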

Hashing, since it's the basis for so much in security, data structures, etc. Hashing algorithms have generated a lot of Ph.D. dissertations.
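To make the idea concrete, here is a toy chained hash table in Python. The class name, bucket count, and methods are made up for illustration; real hash tables (and cryptographic hash functions) are far more sophisticated:

    class ToyHashTable:
        # A hash function maps each key to a bucket index; colliding keys
        # simply share a bucket (separate chaining).
        def __init__(self, num_buckets=16):
            self.buckets = [[] for _ in range(num_buckets)]

        def _index(self, key):
            return hash(key) % len(self.buckets)

        def put(self, key, value):
            bucket = self.buckets[self._index(key)]
            for i, (k, _) in enumerate(bucket):
                if k == key:  # key already present: overwrite its value
                    bucket[i] = (key, value)
                    return
            bucket.append((key, value))

        def get(self, key):
            for k, v in self.buckets[self._index(key)]:
                if k == key:
                    return v
            raise KeyError(key)

    table = ToyHashTable()
    table.put("answer", 42)
    print(table.get("answer"))  # prints 42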

Related

Random Mutation Hill Climber & Simulated Annealing - Which is Fastest? [closed]

Closed 10 years ago.
I have used a random mutation hill climbing algorithm as part of a project that I am working on, but was wondering whether it would be better to use simulated annealing to minimise the chance of getting stuck in any local optima.
The question I have is: which one tends to be faster in general, in your experience? Obviously there is a huge wealth of applications for both algorithms; this is more of a generalised pondering, if you like.
Thank you.
There's no way to tell in advance (unless your project is a 100% match to a well-studied academic problem like a pure TSP, and even then ...). It depends on your project's constraints and size (and on whether you implement the algorithms correctly).
So, to be sure, you have to implement both algorithms (and many others, like Tabu Search, ...) and use a benchmarker like this one to compare them.
That being said, I'd put my money on Simulated Annealing over Random Mutation Hill Climbing any day :)
Note: Simulated Annealing is a short but difficult algorithm: I only got it right in my third implementation, and I've seen plenty of wrong implementations (that still output a pretty OK solution) in blogs, etc. It's easier to just reuse existing optimization implementations.
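For what it's worth, here is a minimal sketch of the acceptance loop in Python. The cost and random_neighbor callables are placeholders you would supply for your own problem, and the geometric cooling schedule is just one common choice, not the only correct one:

    import math
    import random

    def simulated_annealing(initial_state, cost, random_neighbor,
                            start_temp=1.0, end_temp=1e-3, cooling=0.995):
        state, state_cost = initial_state, cost(initial_state)
        best, best_cost = state, state_cost
        temp = start_temp
        while temp > end_temp:
            candidate = random_neighbor(state)
            candidate_cost = cost(candidate)
            delta = candidate_cost - state_cost
            # Always accept improvements; accept worse candidates with
            # probability exp(-delta / temp), which shrinks as temp falls.
            if delta <= 0 or random.random() < math.exp(-delta / temp):
                state, state_cost = candidate, candidate_cost
                if state_cost < best_cost:
                    best, best_cost = state, state_cost
            temp *= cooling
        return best, best_cost

    # Toy usage: find x minimising (x - 3)^2.
    best_x, _ = simulated_annealing(
        0.0,
        cost=lambda x: (x - 3) ** 2,
        random_neighbor=lambda x: x + random.uniform(-1.0, 1.0),
    )

The classic mistake, compared with plain hill climbing, is mishandling the "sometimes accept a worse candidate" rule, which quietly turns the whole thing back into a hill climber.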

Where can I find algorithm resources (scheduling algorithms and so on)? [closed]

Closed 10 years ago.
I wonder if there is a place that provides many algorithms.
I want to know the details of some process scheduling algorithms.
For example, if I want to get some information about networking, I check out the RFC documents. I want to know if there is something like the RFCs in the field of OS algorithms.
Furthermore, is there a place where I can read lots of algorithms from many fields? In my view, reading algorithms from many fields can help me a lot; someday, maybe I can combine two algorithms to solve one particular problem.
Thanks.
How about this: List of Algorithms. You can also study Donald Knuth's The Art of Computer Programming, Vols. 1-4.
Wikipedia has lots of them. I don't think there is any organization that provides algorithms specifically for operating systems.
Wikipedia lists a lot of algorithms.
Use the "See also" section there.

Sorting a linked list - why not? [closed]

Closed 11 years ago.
I was recently reading an article which mentioned:
For God's sake, don't try sorting a linked list during the interview.
Is there any reason why the author wrote this? The reason is not immediately clear. I am aware that merge sort works on linked lists in O(n log n) time - what's wrong with that? Am I missing something obvious?
EDIT:
Any reason why this question is being voted to close? I'm honestly curious and merely looking for some answers or interesting points.
I have no way of knowing why the author of the blog wrote what he did. If I had to guess, I'd say what was really meant was something along the lines of:
Don't assume that efficiently sorting a linked list would be as easy as sorting a data structure that provides random access to its elements. If you do end up relying on being able to sort a linked list, be prepared to explain what a suitable algorithm might be, and to discuss its complexity.
I think you'll find that, although it's possible to sort a linked list using merge sort, the code to do so efficiently is somewhat involved. It's not something you'd want to develop while standing at the white board in the middle of an interview.
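To give a sense of what "somewhat involved" means, here is one way to sketch merge sort on a singly linked list in Python; the Node class is hypothetical and the code is illustrative rather than production-grade:

    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    def merge_sort(head):
        # Sort a singly linked list in O(n log n) time.
        if head is None or head.next is None:
            return head
        # Split the list in half using the slow/fast pointer trick.
        slow, fast = head, head.next
        while fast is not None and fast.next is not None:
            slow = slow.next
            fast = fast.next.next
        mid, slow.next = slow.next, None
        left, right = merge_sort(head), merge_sort(mid)
        # Merge the two sorted halves behind a dummy head node.
        dummy = tail = Node(None)
        while left is not None and right is not None:
            if left.value <= right.value:
                tail.next, left = left, left.next
            else:
                tail.next, right = right, right.next
            tail = tail.next
        tail.next = left if left is not None else right
        return dummy.next

Even in this compact form you need the split, the recursion, and a careful merge, which is quite a lot to produce flawlessly at a whiteboard.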
Getting and setting elements at specific indices is an operation that most sorting algorithms use, and it needs to be fast for those algorithms to be fast. It is normally O(1) for, say, an array-backed list, but for a linked list it is O(n), which makes the sort terribly inefficient. Perhaps this captures the reasoning behind your quote.

Analysis of algorithm complexity on linked lists [closed]

Closed 11 years ago.
Please suggest some good materials, books, or links that explain how to find the complexity of algorithms that use linked lists in their implementation. My question may sound silly to some of you, but please reply. Please help.
Introduction to Algorithms is the canonical textbook.
It is possible to find a complete PDF of this book online. I'm not going to provide a link, though, because I'm not sure if these copies are legal.
If all you're after is a quick reference, then Wikipedia is the best place to start. For instance, see the table at http://en.wikipedia.org/wiki/Linked_list#Tradeoffs.
Gautam, I have a feeling that you are trying to skip understanding 'complexity analysis' in general and jump straight to linked-list complexity analysis.
Just so you know, if you want to understand it truly, you have to understand two parts:
1. How to compute the complexity of an algorithm.
2. The flow of the algorithm in question.
Linked-list complexity analysis is not something you can understand without understanding (1); the example below gives a taste of what that means.
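Here is a tiny worked example of (1), not from the original answer, applied to a hypothetical singly linked list: count how often each line can execute during a linear search.

    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    def contains(head, target):
        node = head
        while node is not None:        # runs at most n + 1 times for n nodes
            if node.value == target:   # runs at most n times
                return True
            node = node.next           # runs at most n times
        return False

    # Every line does a constant amount of work, so the total work grows
    # linearly with the list length n: the search is O(n).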
That said, if you want a ready answer anyway (trust me, that wouldn't help you in interviews), you can refer to the book "Data Structures, Algorithms, and Applications in C++" (http://www.mhhe.com/engcs/compsci/sahni/).
Highly recommended for beginners.
Immensely boring, and I doubt you will read it again. :P
You will find answers to your specific questions there.
After you are done with that book, I would suggest going with Data Structures Using C and C++ by Langsam and Tenenbaum.

What's the fastest way to brush up on algorithms for a technical interview (on Monday)? [closed]

Closed 11 years ago.
I have a technical interview on Monday and they were kind enough to give me a heads-up to brush up on my basic algorithms. It's been years since I looked at that kind of stuff and I'm pretty weak on it to begin with so I generally have a bad feeling about this. What's the best way to review the basics and get some practice in before Monday?
Starting Project Euler might help you; also try picking up Algorithms in a Nutshell and working through its examples. That should be doable in a weekend.
TopCoder Algorithm Tutorials
Get the Algorithm Design Manual and look at the reference section. It has a nice "Problem -> Algorithm" cheat sheet.
Also take a look at questions on StackOverflow that are tagged Algorithm.
They might actually turn up in the interview ;)
Best of Luck!
This SO question would be helpful. Also, you should mostly know about:
* Sorting
* Searching
* Inserting and removing from various data structures
as these are the algorithms most commonly asked about in interviews.
Note: this is from my personal experience and it may differ from person to person.
http://en.wikipedia.org/wiki/List_of_algorithms
Especially the Search, Item Search and Sorting sections.
