I was wondering if folks use VHDL/FPGAs in scientific computing.
An example scenario I was thinking of:
Construct an arbitrary precision floating point adder
Configure an FPGA board to then add such numbers
So I was looking for references (example code) where VHDL/FPGAs have been used in scientific computing.
Thanks in advance.
There are several vendors who build heterogeneous computing systems using FPGAs. I doubt you'll find complete source code for such systems.
SRC Computing
Convey Computer
Mitrionics. A reseller of other systems.
Novo-G. An academic project.
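That said, the usual first step before writing any VHDL is a bit-accurate software reference model that the hardware can later be verified against. Here is a minimal sketch in Python (my own illustration, not code from any of the systems above) of exact arbitrary-precision binary floating-point addition, with numbers stored as (sign, exponent, mantissa) triples:

    # Minimal reference model: value = (-1)**sign * mantissa * 2**exponent,
    # with Python's unbounded ints supplying the arbitrary precision.
    # Addition is exact here; a real FPGA adder would additionally have to
    # normalize and round to a fixed mantissa width.
    def fp_add(a, b):
        sa, ea, ma = a
        sb, eb, mb = b
        e = min(ea, eb)                        # align to the smaller exponent
        va = (-1) ** sa * ma * 2 ** (ea - e)
        vb = (-1) ** sb * mb * 2 ** (eb - e)
        v = va + vb
        return (1 if v < 0 else 0, e, abs(v))

    # 1.5 (3 * 2**-1) + 0.25 (1 * 2**-2) = 1.75 (7 * 2**-2)
    print(fp_add((0, -1, 3), (0, -2, 1)))      # (0, -2, 7)

A VHDL implementation would then be tested by driving the same operand pairs through both the hardware and a model like this and comparing the outputs.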
Look into radio astronomy. In arrays such as the VLA and ALMA, the massively parallel correlator is arguably the most important component. These typically use FPGAs, though custom-designed chips can deliver extreme performance at higher cost.
Some fine reading:
https://science.nrao.edu/facilities/cdl/digital-signal-processing
http://web.njit.edu/~gary/728/Lecture8.html
I have a question. Quantum computers will presumably emerge someday, so nowadays making algorithms efficient is important; I mean, optimizing them to run as fast as possible. But once quantum computers emerge, will algorithm performance improvements still matter?
Cheers
Quantum algorithms can search a solution space dramatically faster than classical ones, though not literally in one go: Grover's algorithm, for example, searches an unstructured space of N possibilities with only about √N queries. And the algorithm you choose still determines how many of those steps are needed, and whether you can pack the whole space with useful inputs.
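For a sense of scale, here is the standard back-of-the-envelope comparison (a sketch in Python; the constants are the textbook figures for Grover's algorithm, not a simulation of one):

    import math

    # Unstructured search over N items: a classical scan needs ~N/2 lookups
    # on average, while Grover's algorithm needs ~(pi/4) * sqrt(N) oracle
    # calls -- a quadratic speedup, not a constant number of steps.
    for n in (10**6, 10**9, 10**12):
        print(f"N={n}: classical ~{n / 2:.2e}, Grover ~{math.pi / 4 * math.sqrt(n):.2e}")

So even in the quantum case, a better algorithm, one that exploits structure in the problem, still beats brute force.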
I think it's too early to worry about it. It also might be that quantum computers are never made.
Has anyone tried to implement Numenta's most recent cortical learning theory (http://www.numenta.com)? I'm working on it and would like to share experience.
I think the ideas from Numenta are very promising. But as with any company that wants to make money, they are not forthcoming enough about the technology for anyone to re-implement their system (at least from what I have seen so far). It is probably not in their interest that one could simply rebuild the system they plan to make money from.
Also, their system is very general and complex, so unless you have a lot of experience with other kinds of neural networks and learning algorithms, I would not recommend experimenting with their ideas. First try backpropagation, and maybe some of the simpler temporal learning methods, until you are really familiar with those.
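For reference, plain backpropagation fits in a page. A minimal sketch (assuming numpy; one hidden layer trained with gradient descent on XOR), the kind of exercise worth doing before attempting HTM:

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
    sigmoid = lambda z: 1 / (1 + np.exp(-z))

    for step in range(5000):
        h = sigmoid(X @ W1 + b1)             # forward pass
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)  # chain rule, output layer
        d_h = (d_out @ W2.T) * h * (1 - h)   # chain rule, hidden layer
        W2 -= 0.5 * h.T @ d_out              # gradient descent updates
        b2 -= 0.5 * d_out.sum(axis=0)
        W1 -= 0.5 * X.T @ d_h
        b1 -= 0.5 * d_h.sum(axis=0)

    print(np.round(out.ravel(), 2))          # should approach [0, 1, 1, 0]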
I want to know whether there is any specific algorithm that can be followed to understand the meaning of a word/sentence/paragraph. Basically, I want to write a program that takes text as input, tries to work out what it means, and thereby highlights the emotions within the text.
Also, if there is an algorithm for understanding things, can the same algorithm be applied to itself? That reduces the quest further to a point where we become interested in the meaning of meaning, or rather the definition of definition.
You want Natural Language Processing and semantic technology. This is still a flourishing area of computer science. Look at things such as semantic reasoners; you can start with Jena. There are also academic theses worth reading.
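On the "highlight the emotions" part specifically: full semantic understanding is an open problem, but emotion detection is usually approximated with sentiment analysis, and the simplest version is a lexicon lookup. A hedged sketch in Python (the tiny word lists are illustrative stand-ins for a real sentiment lexicon):

    POSITIVE = {"good", "great", "happy", "love", "wonderful"}
    NEGATIVE = {"bad", "sad", "hate", "terrible", "awful"}

    def sentiment(text):
        # Strip basic punctuation, lower-case, and count lexicon hits.
        words = [w.strip(".,!?").lower() for w in text.split()]
        score = sum((w in POSITIVE) - (w in NEGATIVE) for w in words)
        return "positive" if score > 0 else "negative" if score < 0 else "neutral"

    print(sentiment("I love this wonderful place"))   # positive
    print(sentiment("What a terrible, sad story"))    # negative

Real systems refine this with negation handling, word-sense disambiguation, and machine-learned classifiers, which is where the NLP literature comes in.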
When dealing with a scripting engine, I'd expect it to be slower than code compiled to assembly. What sort of efficiency numbers are there for the major scripting languages (if any)?
Or is this a futile question?
Thanks.
Go to http://shootout.alioth.debian.org/ for actual numbers.
As you can see, languages that are usually compiled (e.g. C, C++) destroy interpreted languages in terms of performance (both running time and memory).
But the question is odd.
Any scripting language can be made compilable into native code (i.e. assembly), and vice versa (e.g. HipHop, a PHP-to-C++ compiler).
And language aside, some compilers are much better than others because they know how to optimize the code to run faster natively. And they also differ between single-core and multi-core systems.
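And whatever the published benchmarks say, the number that matters is your own workload's. A minimal measurement sketch in Python (the workload function is a hypothetical stand-in for your real hot path):

    import timeit

    def workload():
        # Stand-in for whatever your real inner loop does.
        return sum(i * i for i in range(10_000))

    # Best of 5 runs of 100 calls each; take the minimum to reduce noise.
    best = min(timeit.repeat(workload, number=100, repeat=5))
    print(f"{best / 100 * 1e6:.1f} microseconds per call")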
So if I may take a guess: if you're making a decision on what language to use based on performance (especially... ESPECIALLY if you're talking about scripting languages), you're probably making a mistake. There are many considerations beyond performance that affect the choice of programming language for a project.
If I guessed wrong, sorry!
I recently asked a question about parallel programming algorithms, which was closed quite fast due to my poor ability to communicate my intent:
https://stackoverflow.com/questions/2407631/what-is-the-most-useful-parallel-programming-algorithm-closed
I had also recently asked another question, specifically:
Is MapReduce just a generalisation of another programming principle?
The other question was specifically about MapReduce, asking whether MapReduce is a more specific version of some other concept in parallel programming. This question (about a useful parallel programming algorithm) is about the whole family of algorithms for parallel programming. You will have to excuse me, though, as I am quite new to parallel programming, so maybe MapReduce, or something that is a more general form of MapReduce, is the "only" parallel programming construct available, in which case I apologise for my ignorance.
There are probably two "main" parallel programming constructs.
Map/Reduce is one. At a high, ultra-generic level, it's just parallel divide-and-conquer. Send out the individual bits to parallel handlers, and combine the results when they arrive.
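A minimal map/reduce sketch using Python's standard library (my own illustration, not any particular framework): square the inputs in parallel, then combine the results:

    from functools import reduce
    from multiprocessing import Pool

    def square(x):                  # the per-item "map" work
        return x * x

    if __name__ == "__main__":
        with Pool(4) as pool:
            mapped = pool.map(square, range(10))     # scatter to workers
        total = reduce(lambda a, b: a + b, mapped)   # combine the results
        print(total)                                 # 285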
The other main parallel programming construct is the pipeline... pieces of work go through a series of stages, each of which can be run in a parallel thread.
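A sketch of the same idea, again in plain Python: queues between threaded stages, with None as an end-of-stream marker:

    import queue
    import threading

    def stage(func, inbox, outbox):
        # Each stage runs in its own thread, pulling work from its inbox.
        while (item := inbox.get()) is not None:
            outbox.put(func(item))
        outbox.put(None)            # pass the shutdown marker downstream

    q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
    threading.Thread(target=stage, args=(lambda x: x + 1, q1, q2)).start()
    threading.Thread(target=stage, args=(lambda x: x * 2, q2, q3)).start()

    for i in range(5):
        q1.put(i)
    q1.put(None)

    while (result := q3.get()) is not None:
        print(result)               # (i + 1) * 2 -> 2 4 6 8 10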
I think that just about any parallelization algorithm is going to boil down to one of those two. I could be wrong, of course.