When we start getting into algorithm design and more discrete computer science topics, we end up having to prove things all of the time. Every time I've seen somebody ask how to become really good at proofs, the common (and possibly lazy) answer is "practice".
Practicing is all fine if you have the basics down, but how do you get into the mindset for mathematical proofs? When did induction click? What resources are best for teaching these topics? What foundation topics should be researched before diving into proof-writing?
They aren't being lazy; practice really is the only way. Take classes that require proofs, and look online for lecture notes and old exams with solutions from other universities that cover proofs.
I'll start off my answer by admitting that as a CS student, I had a really tough time grasping a formal way of thinking, and it's never easy, unless you have a talent for it.
I'm afraid there is no better answer than practice and study.
A formal mathematical and algorithmic way of thinking about and visualizing problems is a skill which first demands a very deep understanding of the subjects you are dealing with. Second, it requires a good knowledge of existing proofs. Try to envision yourself as one of the great scientists who came up with the algorithms you are studying. Think about how you would have tried to tackle that specific problem. Then see how they proved the correctness of their algorithm.
I can only recommend the greatest textbook on this subject, Introduction to Algorithms by CLRS. If you go through it from start to finish, including every exercise, you will enhance your skills.
Practice is really the only way, but it can be helped along by reading proofs as well. I won't touch on practice because the other answerers have covered everything that I can think of, so I'll just talk about what I mean by reading.
Textbooks are very fond of writing out the "important" proofs. It's very nice, because they often prove very powerful statements and are really fancy. But just as you shouldn't learn to be a world-class gymnast on day 1 by emulating an Olympian (as in, you'll probably break your spine), you shouldn't read any really big proofs (at first). What I found helpful was reading smaller proofs, usually from returned homework (I assume you're a student) or occasionally from a textbook sensible enough to include them.
The reason why I think reading proofs is helpful is because there are a small set of "tricks" or "ideas" that constitute huge chunks of schoolwork proofs, and even more advanced ones. Data structure qualities and recurrence relations usually involve thinking related to proof by induction, proofs involving computability with finite state machines sometimes use the pigeonhole principle, and more rarely the idea of diagonalization (very infrequent, don't worry about it). And of course, just about every other proof uses proof by contradiction. I'm sure there are other handy tools that have slipped my mind, but I hope you get the idea.
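To make the induction "trick" concrete, here is the classic toy example written out in full (my own illustration, not taken from any particular textbook), so you can see the base-case / inductive-step skeleton that most induction proofs share:

```latex
\textbf{Claim.} For all $n \ge 1$, $\; 1 + 2 + \dots + n = \frac{n(n+1)}{2}$.

\textbf{Base case} ($n = 1$): the left-hand side is $1$ and the right-hand side is
$\frac{1 \cdot 2}{2} = 1$, so the claim holds.

\textbf{Inductive step:} assume the claim holds for $n = k$, i.e.
$1 + 2 + \dots + k = \frac{k(k+1)}{2}$. Then
\[
  1 + 2 + \dots + k + (k+1) = \frac{k(k+1)}{2} + (k+1) = \frac{(k+1)(k+2)}{2},
\]
which is exactly the claim for $n = k+1$. By induction, the formula holds for all $n \ge 1$.
```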
Figuring out when, how, and why you'd approach a problem with one particular method or another is what takes practice and experience. I suggest reading proofs in addition to practice because it can often show you creative ways of using a proving method you've already encountered.
As a final note, try to remember when you first learned to program. How did you get better? Proving things and programming things are not too dissimilar, in my opinion. :)
You get into the mindset for doing mathematical proofs by becoming a mathematician. I don't mean that last statement in a tautological way, but realize that a mathematical proof, as published in a mathematical journal, is something of a rhetorical artifact; i.e., it is a proof because a body of mathematicians agree that it is a proof. Ideally, the arguments in the proof could all be reduced to symbolic logic, but this is not how it is done in practice. The utter failure of computer-generated proofs to do valuable mathematics provides some evidence for this.
I get into the mindset by doing proofs and having them accepted by other mathematicians. I agree with the others that "practice" is essential. You don't do proofs unless you try, try, and try. Often the light dawns slowly.
The best resources are, of course, other mathematicians, and reading proofs. There are very few, if any, who can do true mathematical proofs without being part of the mathematical community.
I'm afraid that "practice" really is the best answer here.
It's very similar to programming: once you get the hang of it, you find patterns which solve problems particularly well, and you can create a picture of the high-level design of novel systems which you've never implemented before. However, neophyte programmers aren't aware of patterns: they hack away at code until they accidentally stumble on some solution which appears to "work".
When you're given a problem to prove, you can usually identify properties ("Do I have a set of distinct objects?", "Am I generating permutations?", "Am I looking to minimize/maximize some value?", etc.). Sooner or later, proofs will clump together into vaguely similar groups, where techniques used to solve one problem can easily be applied to novel variations.
Recommended reading:
The Algorithm Design Manual by Steven Skiena.
I have no idea. Probably the same way you get good at composing music.
When I try to prove something I'm not following some fixed strategy, I just think about the problem. Then [undefined amount of time] later, my mind returns a result and I jump up to write it down.
But practicing definitely helps. When I started trying to prove extremely simple statements, like De Morgan's laws, I was completely hopeless. So I sat down and did the fifty or so optional example problems on a worksheet we were given. Now it feels natural to prove something.
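As a small aside, statements that simple can even be checked by brute force before you try to prove them. Here is an illustrative Python sketch (my own, not from any worksheet) that verifies De Morgan's laws over all Boolean inputs - a sanity check, not a proof:

```python
from itertools import product

# Check De Morgan's laws for every combination of truth values:
#   not (p and q) == (not p) or (not q)
#   not (p or q)  == (not p) and (not q)
for p, q in product([False, True], repeat=2):
    assert (not (p and q)) == ((not p) or (not q))
    assert (not (p or q)) == ((not p) and (not q))

print("De Morgan's laws hold for all Boolean inputs")
```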
Practice and study makes perfect sense, agreed. Some tricks that I found useful:
Make notes on everything you study (I've tried just reading books -- a lot of material just passes through).
In addition to the previous point: do all (or most) proofs by yourself, using book/lecture notes as a guide; a lot of proofs contain phrases like "we can now see that XXX", and XXX is not always a trivial conclusion.
Do exercises; the CLRS book, for example, has dozens of them. Exercises are a good way to absorb the ideas behind algorithms and correct proofs.
If you want to better understand the internals of an algorithm, consider participating in online programming contests like UVa's.
Related
I am currently a freshman in college doing some self-study on the different sorting algorithms.
My study source does provide the code, and I have practiced with it (coding the sorts based on the concepts behind them). At the moment, I can write selection sort with just a little bit of trouble (coding it, that is).
Am I required to memorize the code? I know the differences between the sorts and the concepts behind them. Do I need to memorize the pseudocode as well? Will interviewers ever ask you to produce the code on the spot?
There is no need to memorize the exact code syntax, but it is important to understand the logic behind the sorting algorithms (i.e. be able to explain them using pseudocode).
I've been asked in interviews how to do some basic sorting algorithms like a bubble sort, but nothing very complex. I was not required to write the exact "code" in any particular language, just to show that I know the logic and can explain how it works.
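For what it's worth, the kind of thing interviewers mean by "explain how it works" fits in a few lines. Here is a rough bubble sort sketch (Python, purely illustrative) - the point being that you can reconstruct it from the idea "repeatedly swap adjacent out-of-order elements" rather than memorize it:

```python
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # After each pass, the largest remaining element has bubbled to the end.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps means the list is already sorted
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```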
Hello and welcome to Stack Overflow. I'm answering your post below, even though it will be closed as too broad or primarily opinion-based, because this Q&A site is oriented towards coding questions. You might want to ask this on another Q&A site, like Programmers Stack Exchange.
Do I need to memorize the Pseudocodes behind it as well?
Not really: in every decent language you'll have a standard library that offers state-of-the-art implementations. What you really need to remember is the complexity and mechanism of each sorting algorithm, so you can choose the best fit for your dataset when you need it.
Otherwise, when you really need to dig into the pseudocode again, there are books (like The Art of Computer Programming by Donald Knuth), Wikipedia, and many other resources online.
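To illustrate the standard-library point with one language (Python chosen arbitrarily): the built-in sort is already a stable O(n log n) implementation (Timsort in CPython), so in day-to-day code you mostly just supply a key rather than an algorithm:

```python
people = [("alice", 32), ("bob", 25), ("carol", 29)]

# The built-in sort is stable and runs in O(n log n);
# you normally only decide what to sort by, not how to sort.
by_age = sorted(people, key=lambda person: person[1])
print(by_age)  # [('bob', 25), ('carol', 29), ('alice', 32)]
```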
Will interviewers ever ask you to produce the codes on the spot?
Yes, they will. It has happened to me at least five times. Most of the time, though, they'll understand that you might not remember the full pseudocode on the spot, but they will expect you to know the mechanism and complexity, and to be able to reinvent the algorithm.
That said, when you do interviews, you're usually compared to other people taking the same interview, and between two candidates who pass, they'll choose the one who did best on the tests… so you might lose a job opportunity because someone else remembered those algorithms better.
"Designing the right algorithm for a given application is a difficult job. It requires a major creative act, taking a problem and pulling a solution out of the ether. This is much more difficult than taking someone else's idea and modifying it or tweaking it to make it a little better. The space of choices you can make in algorithm design is enormous, enough to leave you plenty of freedom to hang yourself".
I have studied several basic design techniques of algorithms like Divide and Conquer, Dynamic Programming, greedy, backtracking etc.
But I always fail to recognize which principles to apply when I come across certain programming problems. I want to master the design of algorithms.
So can anyone suggest a good place to understand the principles of algorithm design in depth?
I suggest Programming Pearls, 2nd edition, by Jon Bentley. He talks a lot about algorithm design techniques and provides examples of real world problems, how they were solved, and how different algorithms affected the runtime.
Throughout the book, you learn algorithm design techniques, program verification methods to ensure your algorithms are correct, and you also learn a little bit about data structures. It's a very good book and I recommend it to anyone who wants to master algorithms. Go read the reviews on Amazon: http://www.amazon.com/Programming-Pearls-2nd-Edition-Bentley/dp/0201657880
You can have a look at some of the book's contents here: http://netlib.bell-labs.com/cm/cs/pearls/
Enjoy!
You can't learn algorithm design just from reading books. Certainly, books can help. Books like Programming Pearls as suggested in another answer are great because they give you problems to work. Each problem forces you to think about how to solve a particular type of problem.
The idea is that you expose yourself to many different types of problems and their solutions. In doing so, you learn how to examine a problem and see if it shares anything in common with problems you've already seen. In that regard, it's not a whole lot different than the way you learned how to solve "word problems" in math class. Granted, most algorithm problems are more complex than having to figure out where on the tracks the two trains will collide, but the way you learn how to solve the problems is the same. You learn common techniques used to solve simple problems, then combine those techniques to solve more complex problems, etc.
Read, practice, lather, rinse, repeat.
In addition to books like Programming Pearls, there are sites online that post different programming challenges that you can test yourself on. It helps if you have friends or co-workers who also are interested in algorithms, because you can bounce ideas off each other and pose interesting challenges, or work together to come up with solutions to problems.
Did I mention that it takes practice?
"Mastering" anything takes time. A long time. A popular theory is that it takes 10,000 hours of practice to become an expert at anything. There's some dispute about that for particular endeavors, but in general it's true. You don't master anything overnight. You have to study. And practice. And read what others have done. Study some more and practice some more.
A good book about algorithm design is Algorithm Design by Kleinberg and Tardos. Every design technique depends on the problem that you are going to tackle. It is very important to do the exercises in the algorithm books and to get feedback from teachers on them.
If there exists a locally optimal choice that leads to the globally optimal solution, you can use a greedy algorithm.
If the problem has optimal substructure, you can use dynamic programming.
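As a rough illustration of those two properties (my own sketch, not from any of the books mentioned): making change is a problem where the greedy choice works for some coin systems, while the general case needs dynamic programming, which exploits optimal substructure.

```python
def greedy_change(amount, coins):
    """Greedy: always take the largest coin that fits.
    Optimal for 'canonical' systems like (25, 10, 5, 1), but not in general."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c
        amount %= c
    return count if amount == 0 else None

def dp_change(amount, coins):
    """Dynamic programming: the best way to make `amount` extends the best way
    to make `amount - c` for some coin c (optimal substructure)."""
    INF = float("inf")
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        best[a] = min((best[a - c] + 1 for c in coins if c <= a), default=INF)
    return best[amount] if best[amount] != INF else None

print(greedy_change(30, [25, 10, 1]))  # 6 coins (25 + 5*1) -- greedy goes wrong here
print(dp_change(30, [25, 10, 1]))      # 3 coins (10 + 10 + 10)
```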
I've taken everything up to pre-calculus in college, but when trying to get through things like the Donald Knuth books, or even things like this link: http://en.wikipedia.org/wiki/Self-balancing_binary_search_tree I wind up looking at math that means absolutely nothing to me. I'm not looking for magic, I don't expect to make sense of this in a week, I'm just looking for a good graduated plan of things to read / explore to get me there. Any pointers are welcome, after 20+ years as a professional programmer, I feel it would be nice to have this under my belt. Thanks in advance to everyone! :-)
I actually recommend taking a discrete mathematics course at your local university. This helped me out tremendously. Until I had this, I did not understand recursion (which is based on mathematical induction). There are a number of other concepts which you will learn in a good discrete mathematics course which are extremely, extremely helpful (graph theory, asymptotic notation, combinatorics...)
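To make that recursion/induction connection concrete, here is a tiny illustrative sketch: a recursive definition has the same base-case-plus-step shape as an inductive proof.

```python
def factorial(n):
    # Base case: mirrors the base case of an inductive proof.
    if n == 0:
        return 1
    # Inductive step: assume factorial(n - 1) is correct (the induction
    # hypothesis) and build the answer for n from it.
    return n * factorial(n - 1)

print(factorial(5))  # 120
```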
I also recommend taking the class for a grade. I have always noticed that this makes people take the course more seriously, even if it is not in line with a degree path or anything past the grade.
If your local university is good, they will likely have tutoring sessions and office hours available that you can go to in order to ask questions and get clarification. These are really, really valuable and helped me learn things in a deeper manner, and more quickly, than I ever could have on my own.
You may need to take calculus in order to meet the prerequisites, but that is something I would also recommend if you'd like to increase your mathematical literacy. This 'answer' will take at least a semester, and more like two, but I think this is the way to go. It's not an immediate solution, but you will become better at math if you perform well in these two classes (and you have a good university close by.)
Your profile says you are in Dallas. I found this course (with no prerequisites!) for you. The syllabus looks like it covered a lot of good material, and the course met at 5:30 p.m. (good for working people!). If they are offering anything similar next semester, I'd consider it. If you call up the instructor, I'm sure he'd be happy to talk with you about what he knows for summer and fall scheduling.
This path has worked well for me.
Good luck!
You can try this: http://www.amazon.com/Concrete-Mathematics-Foundation-Computer-Science/dp/0201558025
There's a PDF version of this available online; you can easily find it with a quick search.
Many of my friends who are great programmers recommended it.
A lot of talented programmers understand algorithms before understanding the maths behind them. The maths is only there to help; it won't make you understand everything on its own. You will need to spend more time reading about algorithms and complexity, and then you might get a sense of how to evaluate them.
I recommend reading more books about algorithm complexity.
In your long experience as a professional programmer, there surely are topics and sub-domains that you are most curious about. My advice is: identify those areas and go after them. It might be code-breaking, number theory, recursion, functional programming, computational origami, logical puzzles, crystal structures, graphs, genetic algorithms, splines...
Take your own remark to heart:
but when trying to get through things like the Donald Knuth books, or even
things like this link:...I wind up looking at math that means
absolutely nothing to me
What sort of math fascinates you?
I could say there are lots of intriguing puzzles at Project Euler. After you solve a programming challenge, you have access to a forum in which other folks share their solutions and occasionally refer to some body of knowledge they were drawing on. I love it. But what matters is what you like. Your own interests are the key to your learning.
If math and programming no longer have any appeal--you don't like doing them in your spare time--find something else to get into: acting, foreign languages, travel, French cooking, biking. Who knows, maybe you're burned out.
I'd say get a good book in discrete math and one in combinatorics as well. Here are a few I've liked. The Rosen book is good place to start.
http://www.amazon.com/Course-Combinatorics-J-van-Lint/dp/0521006015
http://www.amazon.com/Discrete-Mathematics-Applications-Kenneth-Rosen/dp/0073229725/ref=sr_1_2?s=books&ie=UTF8&qid=1305304408&sr=1-2
http://www.amazon.com/Introductory-Combinatorics-5th-Richard-Brualdi/dp/0136020402/ref=sr_1_7?s=books&ie=UTF8&qid=1305304434&sr=1-7
In line with what Vincent said, I recommend Algorithms in a Nutshell from O'Reilly (here).
There are plenty of good video lectures on discrete math, calculus and applied math. Just watch them every evening, make notes and try to solve simple problems. To prepare yourself for Knuth, try "Discrete Mathematics". To understand more deeply what math is and how all things in the universe are interconnected (including algorithms), try "Joy of Mathematics".
I was looking for just the same thing. I couldn't afford any of the material suggested here so far so here's a link to a YouTube lecture series on Discrete Mathematics. I wish there was a playlist but unfortunately there is not.
The videos were uploaded from http://www.aduni.org, which asks for a donation of 25c per video to cover operating costs.
Is it important for developers to know discrete math? Most of the books about algorithms and analysis have at least some references to math. I can easily understand the algorithms in principle and can implement them without any problem, but when it comes to the math parts I get stuck. Is it generally assumed that developers will have deep knowledge of math to understand algorithms and methods?
It depends on the kind of developer you're talking about and the kind of math you're talking about. I'm pretty sure that most "ordinary" developers don't need to know much about math. But do you want to be an "ordinary" developer?
If you're developing web applications that just display and allow editing of data in database, then you'll probably never need any math.
On the other hand, if you're developing a GPS system that shows a path to the target (or some other application that does more complicated calculations) then discrete math is going to be useful.
It doesn't necessarily have to be discrete math, though - for example, in the finance industry, people need probability and statistics far more often.
That said, knowing math will definitely make you a better developer, because it trains your mind in a way that is useful not only for solving specific (math) problems, but also teaches you how to think about problems in a more formal fashion (which, I believe, is important for writing correct code).
Yes.
I find that discrete math is fairly core to computer science. Understanding set theory, boolean algebra, maps, etc. are all beneficial to a developer and are all part of discrete math.
Of course, the concepts won't always be applicable in the most academic sense. You will almost never open your discrete math textbook and copy something into your code to solve a problem. However, understanding those concepts will help developers write better code, better algorithms, and use design patterns more effectively.
It probably depends on what the developer is doing. If you are doing web work, probably not, though maybe a bit when it comes to security - like estimating the time a brute-force attack takes to recover a key of a certain number of bits under a certain hash. If you are doing graphics for high-end games, you probably need to understand quite a bit of maths and the pros and cons of the methods. As a DB or network admin, you shouldn't need it.
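A back-of-the-envelope version of that kind of security estimate (illustrative numbers only, not a real benchmark): count the keyspace and divide by a guess rate.

```python
def expected_bruteforce_years(bits, guesses_per_second):
    # A key of n bits has 2**n possible values; on average an attacker
    # tries half of them before hitting the right one.
    seconds = (2 ** bits / 2) / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

# Hypothetical rate of one billion guesses per second:
print(expected_bruteforce_years(40, 1e9))   # ~1.7e-5 years, i.e. a few minutes
print(expected_bruteforce_years(128, 1e9))  # ~5e21 years, i.e. astronomically long
```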
The kinds of problems that you get the chance to solve depends on what you know.
If you only know 4th grade math, you'll only be asked to solve problems that involve math that's at a 4th grade level or less.
If you aspire to do more or understand the underpinnings of other algorithms, you'll have to learn whatever math is necessary.
I think you'll find that working through the points where you get stuck will improve your math, deepen your appreciation for the problems you solve, and give you a better chance of extending the math you learn to new areas.
It nauseates me to hear people immediately disparage an area that they find difficult, as if to justify their unwillingness to push past the pain of ignorance and struggle. Learning any new skill requires that you push through that barrier, be it math or anything else. I'd advise you to stay with it and prove to yourself that you can master something difficult by resisting the urge to give up.
You have already found out that the results of discrete mathematics are useful in programming. My experience is that understanding why something works, rather than attempting to simply follow it by rote, allows you to find and fix many errors and misunderstandings. It also allows you to handle situations that are almost, but not quite, the way they are in the textbook, and to realise when the textbook answer no longer applies. Time spent understanding even a small part of something that you might use or work on will not be wasted.
If your job is pure CS, like Google search, where you invent new algorithms, then yes, you'll need to be able to analyse running times quite well; the same goes for anything science-heavy like physics simulations.
If you're a 'normal' developer then you'll need to know about running times and what they mean and their impact on your application.
My experience is this:
Knowing something about discrete mathematics is something you will never regret. It will make your job easier in many instances, even in mundane tasks, because you will be familiar enough with various concepts that will at least allow you to construct a smarter google query. In depth familiarity and ability to do these things by rote is probably not helpful to most programmers, but a passing familiarity definitely is.
That said, most programmers in industry that I've encountered (and even some in academia!) know almost nothing about this stuff, so not knowing it is unlikely to place you at a significant disadvantage outside of a few specialist programming sub-disciplines.
You're generally not expected to have a deep knowledge of maths, unless the application requires it - e.g. you're writing financial software, doing some 3D modelling, load balancing on aircraft, writing some tailored compression algorithm, etc. I've worked with great developers who struggled with simple maths. And knowing discrete maths seems very specific. Having an understanding of how a variety of algorithms work can be helpful, and if you can do that it doesn't much matter that you can't construct a proof of their optimality.
To be honest, what I think is most important is understanding the business you're building for, and how you approach writing code (readability, modularity, etc.).
Some knowledge of discrete math might someday stop you before you spend a lot of time attempting to code up something that is mathematically impossible or intractable (e.g. NP-hard) to solve. You will gain a much better "feel" for when a proposed software problem or solution path more resembles an easy homework assignment, or one of those term projects which no one in your class ever completed.
It depends on which part of discrete math you are talking about. Of course knowing math will always be an advantage... but I think that some parts of discrete math are not just an advantage, but essential for a developer (it depends, of course, on the projects he/she is working on).
But topics like:
Set theory
Graph theory
Alg. Analysis
Alg. Complexity
Sorting
etc...
are essential for a developer.
I was thinking about ways to improve my ability to find algorithmic solutions to a problem. I have thought of solving math problems from various areas such as discrete mathematics or linear algebra. After "googling" a bit, I read an article that claimed learning game programming is needed to achieve this, and it seems logical to me.
Do you have (or have you had) the same concerns as me, or do you have any ideas on this? I am looking forward to hearing them.
Thank you all, in advance.
P.S.1: I want to say that I already know about programming and how to program (although I am at an amateur level :-) ) and I just want to improve at this specific issue, NOT to start learning programming from scratch.
P.S.2: I think that it's a useful topic for future reference, so I checked the community wiki box.
Solve problems on a daily basis. Watch traffic lights and ask yourself, "How can these be synced to optimize the flow of traffic? Or to optimize the flow of pedestrians? What is the best solution for both?". Look at elevators and ask yourself "Why should these elevators use different rules than the elevators in that other building I visited yesterday? How is it actually implemented? How can it be improved?".
Try to see a problem everywhere, even if it is solved already. Reflect on the solution. Ask yourself why your own supposedly superior solution probably isn't as good as the one in use - what are you missing?
And so on. Every day. All of the time.
The idea is that almost everything can be viewed as an algorithm (a goal that has some kind of meaning to somebody, and a method with which to achieve it). Try to have that in mind next time you watch a gameshow on TV, or when you read the news coverage of the latest bank robbery. Ask yourself "What is the goal?", "Whose goal is it?" and "What is the method?".
It can easily be mistaken for critical thinking but is more about questioning your own solutions, rather than the solutions you try to understand and improve.
First of all, and most important: practice. Think of solutions to everything, all the time. It doesn't have to be at your computer, programming; any algorithm will do. For example: when you used to trade cards, how did you compare your deck and your friend's to determine the best way for both of you to trade? How can you work out the maximum number of trades you can make so that neither of you ends up with a repeated card?
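Here is a minimal sketch of that card-trading question, under the simplifying assumption of one-for-one trades of distinct cards: the cards each side can usefully offer are just set differences, and the number of useful trades is bounded by the smaller of the two.

```python
# Toy model: each deck is a set of distinct cards, trades are one-for-one,
# and a trade is useful only if each side receives a card it doesn't own.
my_deck = {"dragon", "goblin", "wizard", "knight"}
friend_deck = {"goblin", "elf", "troll"}

i_can_offer = my_deck - friend_deck        # cards my friend doesn't have
friend_can_offer = friend_deck - my_deck   # cards I don't have

max_useful_trades = min(len(i_can_offer), len(friend_can_offer))
print(i_can_offer, friend_can_offer, max_useful_trades)
```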
Use problem databases and online judges like http://uva.onlinejudge.org/index.php, which has hundreds of problems concerning general algorithms. You don't need to be an expert programmer at all to solve any of them; what you need is a good ability with logic and math. There, you can find problems ranging from the simplest to the most challenging. Most of them come from programming marathons.
You can then implement them in C, C++, Java or Pascal and submit them to the online judge. If you have a good algorithm, it will be accepted. Otherwise, the judge will say your algorithm gave the wrong answer to the problem, or that it took too long to solve.
Reading about algorithms helps, but don't waste too much time on it... reading won't help as much as trying to solve the problems by yourself. Maybe you can read the problem, try to figure out a solution for yourself, compare it with the solution proposed by the source, and see what you missed. Don't try to memorize solutions. If you have learned the concept well, you can implement it anywhere. Understanding is the hardest part for most of them.
Polya's "How To Solve It" is a great book for thinking about how to solve mathematical problems and do proofs, and I'd recommend it for anyone who does problem solving.
But! It doesn't really address the excitement that happens when the real world provides input to your system, via channel noise, user wackiness, other programs grabbing resources, etc. For that it is worth looking at algorithms that get applied to real-world input (obligatory and deserved nod to Knuth's collection), and systems which are fairly robust in the face of same (TCP, kernel internals). Part of coming up with good algorithmic solutions is to know what already exists.
And alongside reading all that, of course practice practice practice.
You should check out Mathematics and Plausible Reasoning by G. Polya. It is a rare math book, which actually deals with the thought process involved in making mathematical discoveries. I think it is the same thought process that is involved in coming up with algorithms.
The saying "practice makes perfect" definitely applies. I'm tutoring a friend of mine in programming, and I remind him that "if you don't know how to ride a bike, you could read every book about it but it doesn't mean you'll be better than Lance Armstrong tomorrow - you have to practice".
In your case, how about trying the problems in Project Euler? http://projecteuler.net
There are a ton of problems there, and for each one you could practice at developing an algorithm. Once you get a good-enough implementation, you can access other people's solutions (for a particular problem) and see how others have done it. Don't think of it as math problems, but rather as problems in creating algorithms for solving math problems.
In university, I actually took a class in algorithm design and analysis, and there is definitely a lot of theory behind it. You may hear people talking about "big-O" complexity and stuff like that - there are quite a lot of different properties about algorithms themselves which can lead to greater understanding of what constitutes a "good" algorithm. You can study quite a bit in this regard as well for the long-term.
Check out some online judges and TopCoder (which has algorithm tutorials). Take an algorithms book (CLRS, Skiena) and do the harder exercises. Practice a lot.
I would suggest this path for you:
1. First learn the elementary parts of a language.
2. Then learn some basic maths.
3. Move to TopCoder Div 2 easy problems. Usually, if you cannot score 250 pts. on any given day, it means you need a lot of practice - keep practising.
4. Now is the time to learn some of a programmer's tools: take a good book like The Algorithm Design Manual by Steven Skiena and learn about dynamic programming and the greedy approach.
5. Now move on to marathons. Don't be discouraged if you cannot solve problems quickly; improvement will not happen overnight, and you will have to patiently keep working hard.
6. Continue step 5 from now on and you will become a better programmer.
Learning about game programming will probably lead you to good algorithms for game programming, but not necessarily to better algorithms in general.
It's a good start, but I think that the best way to learn and apply algorithmic knowledge is
Learn about good algorithms that currently exist for your area of interest
Expand your knowledge by looking at other areas; for example, what kinds of algorithms are required when working on genetic analysis? What's the best approach for determining run-off potential as it relates to flooding?
Read about problems in other domains and attempt to use the algorithms that you're familiar with to see if they fit. If they don't, try to break the problem down and see if you can come up with your own algorithm.
A few more books worth reading (in no particular order):
Aha! Insight (Martin Gardner)
Any of the Programming Pearls books (Jon Bentley)
Concrete Mathematics (Graham, Knuth, and Patashnik)
A Mathematical Theory of Communication (Claude Shannon)
Of course, most of those are just samples -- other books and papers by the same authors are usually quite good as well (e.g. Shannon wrote a lot that's well worth reading, and far too few people give it the attention it deserves).
Read SICP / Structure and Interpretation of Computer Programs and work all the problems; then read The Art of Computer Programming (all volumes), working all the exercises as you go; then work through all the problems at Project Euler.
If you aren't damned good at algorithms after that, there is probably no hope for you. LOL!
P.S. SICP is available freely online, but you have to buy AoCP (get the international, not-for-release-in-north-america edition used for 30 USD). And I haven't done this yet myself (I'm trying when I have free time).
I can recommend the book "Introductory Logic and Sets for Computer Scientists" by Nimal Nissanke (Addison Wesley). The focus is on set theory, predicate logic etc. Basically the maths of solving problems in code if you will. Good stuff and not too difficult to work through.
Good luck...Kevin
Great suggestion about trying the Project Euler problems mentioned above.
Ok, so to sum up the suggestions:
The most effective way to improve this ability is to solve problems as frequently as possible - either real-world problems (such as the elevator "algorithm" already suggested) or exercises from books like CLRS (great, I already own it :-)). But I didn't see comments about maths, and I don't know what to conclude from that (whether you agree or not). :-s
The links were great. I will definitely use them. I also think that it will be a good exercise to solve problems from national/international informatics contests, or to read the way a mathematician proves a theorem.
Thank you all again. Feel free to suggest more, although I am already satisfied with the solutions mentioned.