How To Learn HPC? [closed] - parallel-processing

So I want to learn HPC, but I couldn't find a good list of resources.
Of course there is "Awesome HPC", but its last update was three years ago.
My main question is how to learn HPC.
What are the prerequisites?
What programming languages should I know?
And if you have any other advice, I'm all ears.

There are plenty of books about parallel computing, parallel algorithms, scientific computing, OpenMP and MPI, not to mention books on HPC itself.
Some good books are even free and a good place to start, such as Victor Eijkhout's HPC book.
With respect to programming languages, C, C++ and Fortran are generally used to write HPC applications. CUDA and OpenCL are worth knowing for GPU programming.
Note that the HPC landscape has not changed much in the last three years, so it is fine to read tutorials of that age.
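To give a first taste of what such code looks like, here is a minimal OpenMP sketch in C++ (an illustrative example, not taken from any of the books above; it assumes a compiler with OpenMP support, e.g. built with g++ -fopenmp):

```cpp
// Minimal OpenMP sketch: parallel sum of a large array.
// Build with OpenMP enabled, e.g.  g++ -O2 -fopenmp sum.cpp
#include <cstdio>
#include <vector>
#include <omp.h>

int main() {
    const std::size_t n = 10'000'000;
    std::vector<double> a(n, 1.0);

    double sum = 0.0;
    // Iterations are split across threads; the reduction clause gives each
    // thread a private partial sum and combines them at the end.
    #pragma omp parallel for reduction(+ : sum)
    for (long long i = 0; i < (long long)n; ++i)
        sum += a[i];

    std::printf("sum = %.0f, using up to %d threads\n", sum, omp_get_max_threads());
    return 0;
}
```

MPI code looks quite different (explicit message passing between processes), but the idea of starting from a correct serial loop and then distributing the work is the same.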

I would say it is very much about understanding the principles of computer architecture; from there on you will be able to write efficient code.
First, one has to be able to write good serial code, and only then should you have a look at parallelizing it.
I'm a big fan of the book "Introduction to High Performance Computing for Scientists and Engineers" by Hager and Wellein. It takes you on the full journey from understanding the hardware, to writing fast code, to making that fast code even faster by parallelizing it. By the way, it's from 2010, but everything in it still holds ;-)
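To give a concrete flavour of that hardware awareness, here is a small illustrative C++ example (my own sketch, not taken from the book): the same sum computed with two different loop orders over a row-major matrix. The row-wise traversal walks memory contiguously and is typically several times faster, purely because of how caches work; exact numbers depend on your machine.

```cpp
// Illustrative sketch: the effect of memory access order on a simple sum.
// Both variants do the same arithmetic; only the traversal order differs.
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    const std::size_t n = 4096;            // n x n matrix, stored row-major
    std::vector<double> m(n * n, 1.0);

    auto time_sum = [&](bool row_major) {
        auto t0 = std::chrono::steady_clock::now();
        double sum = 0.0;
        for (std::size_t i = 0; i < n; ++i)
            for (std::size_t j = 0; j < n; ++j)
                sum += row_major ? m[i * n + j]   // contiguous, cache-friendly
                                 : m[j * n + i];  // strided, cache-unfriendly
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - t0).count();
        std::printf("%s: sum=%.0f, %lld ms\n",
                    row_major ? "row-major   " : "column-major", sum, (long long)ms);
    };

    time_sum(true);    // walks memory in storage order
    time_sum(false);   // jumps by n * sizeof(double) on every access
    return 0;
}
```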

Related

Prolog as first programming language [closed]

I'm a computer engineering student, but I've never programmed in my life (I've only studied physics, chemistry, control systems, etc.). Since I know a lot more about math than about programming languages, and I'm studying logic on my own right now (I read it can be useful for artificial intelligence), I was thinking about learning Prolog as a first programming language. I tried to find some information about it on the internet but couldn't really find much; all I discovered is that it's not very useful for landing a job, but that it can give you a different "mindset". Do you think it would be worth learning, or would it be better to learn something like C, Python, etc.?
Thank you!
Prolog is indeed a wonderful language, and it makes you think in a very different mode from other languages. As for making it your first, I think that's a bold move, and I suspect it will make learning further languages a bit of a challenge. If your intent is to learn software engineering, I'm quite sure you'll eventually learn another one anyway.
I'd start with Python but, since you're curious already, learn Prolog on the side.

How do Hardware Description Languages differ from General Purpose languages at the low level? [closed]

Question:
How do hardware description languages (HDLs) differ from general-purpose languages such as Python, Java, etc.? In particular, what is the primary trade-off that makes general-purpose languages sub-optimal for FPGAs compared to VHDL and Verilog?
Context:
I'm a programmer, but I definitely work at a high level of abstraction: JavaScript, tinkering with APIs, etc. My low-level knowledge is very limited, but I am playing around with an FPGA and have some novice questions that I cannot answer with Google or wikis.
Considering I am a novice, please do not vote harshly against this post. Just state your suggestions for the question and I will happily revise! :)
Example:
For example, why isn't everyone just coding FPGAs and ASICs with Python or C# instead of Verilog or VHDL? I understand that there are some Python libraries, but I have read that they are limited in their viable use cases. I would greatly appreciate someone shedding some light on why HDLs are necessary and beneficial, and why general-purpose languages are not optimal in comparison for these scenarios.
Thanks in advance!
This is a broad, opinion-based question, but I think there is a short answer. In some sense, they are all programming languages, i.e. text descriptions that get compiled into a set of machine instructions to be executed on a host machine (software).
But an HDL is a text description that gets compiled into a description of another machine to be built (hardware).
Technically, any programming language could be used to describe hardware (SystemC in C++, for example), but Verilog and VHDL were specifically developed to model and simulate hardware most efficiently.
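To make that concrete, here is a small sketch of describing hardware in C++ with SystemC (an illustrative example; it assumes the SystemC class library is installed and linked). Note that the code describes and simulates a circuit, a 4-bit counter, rather than computing anything directly:

```cpp
// Illustrative SystemC sketch: a 4-bit counter described in C++.
// Requires the SystemC library; compile and link against it.
#include <systemc.h>
#include <iostream>

SC_MODULE(Counter) {
    sc_in<bool> clk;            // clock input port
    sc_in<bool> reset;          // synchronous reset
    sc_out<sc_uint<4>> value;   // 4-bit counter output

    sc_uint<4> count;

    void tick() {
        if (reset.read()) count = 0;
        else              count = count + 1;
        value.write(count);
    }

    SC_CTOR(Counter) : count(0) {
        SC_METHOD(tick);
        sensitive << clk.pos();  // run on every rising clock edge
    }
};

int sc_main(int, char*[]) {
    sc_clock clk("clk", 10, SC_NS);
    sc_signal<bool> reset;
    sc_signal<sc_uint<4>> value;

    Counter counter("counter");
    counter.clk(clk);
    counter.reset(reset);
    counter.value(value);

    reset = true;
    sc_start(20, SC_NS);    // hold reset for two cycles
    reset = false;
    sc_start(100, SC_NS);   // let the counter run

    std::cout << "count = " << value.read() << std::endl;
    return 0;
}
```

A synthesizable Verilog or VHDL description of the same counter would be shorter and would map directly onto registers and wires, which is essentially the trade-off the question asks about.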

Advanced Rudimentary Computing? [closed]

Let's say that my definition of 'rudimentary programming' refers to the fundamental tools employed for a computer to perform a task.
Considering programming rudiments, the learning spectrum usually looks something like this:
Variables, data types and variable memory
Arrays/Lists and their manipulation
Looping and conditionals
Functions
Classes
Multi threading/processing
Streams (hard-disk and web)
My question is, have I missed any of the major rudiments? Is there a 'next' to the spectrum that still eludes me?
I think you missed the most important one: algorithms. Understanding their complexity, knowing when to use them, why to use them and, more importantly, how to implement them.
I'm pretty sure you already know a lot about algorithms, but if you think your tool knowledge (i.e. the programming languages) is good enough, you should start focusing more on algorithms.
A great book to start with is Introduction to Algorithms by Thomas H. Cormen et al.
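As a small illustration of why that matters (my own example, not taken from the book): searching a sorted array linearly costs O(n) comparisons, while binary search costs O(log n), a difference that dwarfs any constant-factor tuning once the input gets large.

```cpp
// Illustrative sketch: linear vs. binary search in a sorted vector.
#include <cstdio>
#include <vector>

// O(n): inspect elements one by one until the target is found.
long long linear_search(const std::vector<int>& v, int target) {
    for (std::size_t i = 0; i < v.size(); ++i)
        if (v[i] == target) return (long long)i;
    return -1;
}

// O(log n): halve the search interval at each step (requires sorted input).
long long binary_search_idx(const std::vector<int>& v, int target) {
    std::size_t lo = 0, hi = v.size();           // half-open interval [lo, hi)
    while (lo < hi) {
        std::size_t mid = lo + (hi - lo) / 2;
        if (v[mid] == target)     return (long long)mid;
        else if (v[mid] < target) lo = mid + 1;
        else                      hi = mid;
    }
    return -1;
}

int main() {
    std::vector<int> v;
    for (int i = 0; i < 1'000'000; ++i) v.push_back(2 * i);  // sorted even numbers

    std::printf("linear: index %lld\n", linear_search(v, 999998));      // ~500,000 comparisons
    std::printf("binary: index %lld\n", binary_search_idx(v, 999998));  // ~20 comparisons
    return 0;
}
```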

How do people solve programming competitions so fast? [closed]

I hope this is not a vague/broad/subjective question. If it is, please close it.
Anyway, at several programming competitions (like Google's Code Jam, Facebook's Hacker Cup and so on), by the time I've successfully understood a problem and have an inkling of how to approach it, I see that half the questions are already solved by many people.
My question is, how do these people get so good? Is it pure genius? Is it experience? Is it the ability to think really fast? How would you suggest I improve my skills? I would say I'm a competent programmer. I can eventually solve some of those questions.
Additionally, whenever I inspect the code of winners, I see a LOT of macros being used. This implies to me that they sort of have a template (like #define for loops to some abbreviated version) which they use to program faster. Does this make a significant difference?
The thing is, you're competing against people who've spent massive amounts of time mastering their skill to compete in these competitions. You're unlikely to catch up any time soon, but...
How do these people get so good?
Have the theoretical knowledge to solve the problems and practice, practice, practice.
Is it pure genius?
It can be, but practice can to a reasonable extent make up for it.
Is it experience?
Yes.
Is it the ability to think really fast?
Not really. Practice allows you to approach the problem correctly and skip insignificant details in the problem statement.
How would you suggest I improve my skills?
Get the theoretical knowledge and practice.
Do macros make a significant difference?
It may cut 10% off of your time, but probably not much more.
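For reference, the macro-heavy templates look roughly like the sketch below (an illustrative, typical example, not any specific winner's template; <bits/stdc++.h> is a GCC-specific catch-all header that is common in contests). The abbreviations save keystrokes, not thinking time.

```cpp
// Illustrative competitive-programming template: read n numbers, sort them,
// print their sum. The macros only shorten boilerplate typing.
#include <bits/stdc++.h>
using namespace std;

typedef long long ll;
#define FOR(i, n) for (int i = 0; i < (int)(n); ++i)
#define ALL(x) (x).begin(), (x).end()

int main() {
    ios::sync_with_stdio(false);   // faster I/O, a common contest idiom
    cin.tie(nullptr);

    int n;
    cin >> n;
    vector<ll> a(n);
    FOR(i, n) cin >> a[i];

    sort(ALL(a));
    cout << accumulate(ALL(a), 0LL) << "\n";
    return 0;
}
```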
Statistically speaking, any programming competition with a large enough audience will attract super-talents who can churn out nice and elegant code at super-speed. It's like running the marathon. Running it in 4 hours is really good, even if the world record is around 2 hours. Don't worry about it.
Focus on code quality and elegance instead of being able to churn out code at super-speed. Practise, have fun, and don't look too much at how fast other people are working.

automated theorem proving program - where to start? [closed]

I'm a second-year student, and my Discrete Mathematics 2 assignment is to make an automated theorem prover. I have to write a simple prover program that works on propositional logic within 4 weeks (assuming that the proof always exists). I've googled, but the material out there is really hard to understand in 4 weeks. Can anyone recommend a book/site/open-source code aimed at beginners, or some useful hints to start with? Thank you in advance.
Note: I flagged this to be moved to the Computer Science site because they are much more on top of ATP over there.
It would be nice if you could include what you have looked at and why it does not help you. Then we can figure out what might be better for you. Also, if you have to write a program, then knowing what languages you know will help. Most of what I do with this is done in a functional language such as OCaml or F#, or a logic language such as Prolog or Mercury.
Have you seen "Handbook of Practical Logic and Automated Reasoning" (WorldCat) by John Harrison? I included the (WorldCat) link so you can find the book in a local library rather than waiting to buy it, which would eat up most of your time.
If you look, you will find the OCaml code at the bottom of the page, and there are F# and Haskell ports as well.
In case you haven't seen them, the Wikipedia articles on automated theorem proving and proof assistants may give you leads to some code and papers.
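Just to show the scale of what you are building (a sketch of my own, in C++ rather than the functional languages recommended above): the core data structure is a small formula AST, and even a brute-force truth-table checker for propositional validity fits in a few dozen lines. A real prover replaces the truth-table loop with an actual proof calculus, but starts from the same representation.

```cpp
// Illustrative sketch: formula AST plus a brute-force tautology check.
#include <algorithm>
#include <iostream>
#include <map>
#include <memory>
#include <string>
#include <vector>

struct Formula {
    enum Kind { Var, Not, And, Or, Implies } kind;
    std::string name;                      // used only when kind == Var
    std::shared_ptr<Formula> left, right;  // right is unused for Not
};
using F = std::shared_ptr<Formula>;

F var(const std::string& n) { return std::make_shared<Formula>(Formula{Formula::Var, n, nullptr, nullptr}); }
F mk(Formula::Kind k, F a, F b = nullptr) { return std::make_shared<Formula>(Formula{k, "", a, b}); }

// Evaluate a formula under a truth assignment for its variables.
bool eval(const F& f, const std::map<std::string, bool>& env) {
    switch (f->kind) {
        case Formula::Var:     return env.at(f->name);
        case Formula::Not:     return !eval(f->left, env);
        case Formula::And:     return eval(f->left, env) && eval(f->right, env);
        case Formula::Or:      return eval(f->left, env) || eval(f->right, env);
        case Formula::Implies: return !eval(f->left, env) || eval(f->right, env);
    }
    return false;  // unreachable
}

// Collect the distinct variable names occurring in the formula.
void collect(const F& f, std::vector<std::string>& vars) {
    if (f->kind == Formula::Var) {
        if (std::find(vars.begin(), vars.end(), f->name) == vars.end()) vars.push_back(f->name);
    } else {
        collect(f->left, vars);
        if (f->right) collect(f->right, vars);
    }
}

// A formula is a tautology iff it evaluates to true under every assignment.
bool isTautology(const F& f) {
    std::vector<std::string> vars;
    collect(f, vars);
    for (unsigned long mask = 0; mask < (1UL << vars.size()); ++mask) {
        std::map<std::string, bool> env;
        for (std::size_t i = 0; i < vars.size(); ++i) env[vars[i]] = (mask >> i) & 1;
        if (!eval(f, env)) return false;
    }
    return true;
}

int main() {
    // (p -> q) -> (~q -> ~p): contraposition, expected to be a tautology.
    F p = var("p"), q = var("q");
    F fml = mk(Formula::Implies,
               mk(Formula::Implies, p, q),
               mk(Formula::Implies, mk(Formula::Not, q), mk(Formula::Not, p)));
    std::cout << (isTautology(fml) ? "tautology" : "not a tautology") << std::endl;
    return 0;
}
```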
