Predictability of Pseudo-Random Generators by Quantum Computers

Might classical pseudo-random generators be predictable by powerful quantum computers in the future, or is it proven that this is impossible?
If they are predictable, do scientists know whether there exist PRGs that are unpredictable by quantum computers?

The security of a classical Cryptographic Pseudo-Random Number Generator (CPRNG) is always based on some hardness assumption, such as "factoring is hard" or "colliding the SHA-256 function is hard".
Quantum computers make some computational problems easier. That violates some of the old hardness assumptions. But not all of them.
For example, Blum Blum Shub is likely broken by quantum computers, but no one knows how to break lattice-based cryptography with them. Showing that quantum computers can break all classical CPRNGs would be tantamount to showing that NP is contained in BQP, which is not expected to be the case.
Even if quantum computers did break all classical CPRNGs, they would also fill the hole they created: they enable the creation of "Einstein-certified" random numbers.
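To make the first claim concrete, here is a toy Python sketch of Blum Blum Shub; the tiny primes are illustrative assumptions only, not secure parameters. Its security rests on the hardness of factoring N = p*q, which is exactly what Shor's algorithm undermines:

```python
def bbs_bits(seed, n_bits, p=499, q=547):
    """Toy Blum Blum Shub: x_{k+1} = x_k^2 mod N, output the low bit.

    p and q must be primes congruent to 3 mod 4; these tiny values are
    for illustration only. A real deployment needs primes hundreds of
    digits long, and its security reduces to the hardness of factoring N.
    """
    N = p * q
    x = seed % N
    bits = []
    for _ in range(n_bits):
        x = (x * x) % N          # repeated squaring modulo N
        bits.append(x & 1)       # emit the least significant bit
    return bits
```

An attacker who can factor N can recover the internal state and predict every future bit, which is why a quantum factoring algorithm breaks this generator.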

Related

Most suitable pseudo random number generators for Metropolis–Hastings MCMC

I am doing a lot of Metropolis-Hastings Markov chain Monte Carlo (MCMC).
Most of the codes I use rely on the Mersenne Twister (MT) as their pseudo-random number generator (PRNG).
However, I recently read that MT is outdated and probably shouldn't be used anymore, as it fails some statistical tests and is relatively slow. So I am willing to switch.
NumPy now defaults to PCG (https://www.pcg-random.org/), which claims to be good. Other sites are rather critical, e.g. http://pcg.di.unimi.it/pcg.php.
It seems everyone praises their own work.
There is some good information already here: Pseudo-random number generator
But many answers there are already a bit dated, so I want to formulate my question more specifically.
As I said: the main use case is Metropolis-Hastings MCMC.
Therefore, I need:
uniformly distributed numbers in half-open and open intervals
around 2^50 samples; by the usual rule of thumb, the PRNG should therefore have a period of at least 2^128
sufficient quality of random numbers (whatever that might mean)
a reasonably fast PRNG (for a fixed runtime, faster code means more accuracy for the MCMC)
I do not need:
cryptographic security
As I am by no means an expert, usability also counts. I would welcome an available C++ implementation (C++ seems to be the standard here) that is sufficiently easy for a novice to use.
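As a point of reference rather than an endorsement: NumPy's default generator is PCG64, whose period of 2^128 meets the rule of thumb above. A minimal Python sketch of drawing uniform samples on half-open and open intervals:

```python
import numpy as np

rng = np.random.default_rng(seed=42)   # PCG64 underneath, period 2^128

# Uniform on the half-open interval [0.0, 1.0).
u_half_open = rng.random(1000)

# For an open interval (0.0, 1.0), nudge any exact zeros up to the
# smallest positive double (a simple rejection-free trick).
u = rng.random(1000)
u_open = np.where(u == 0.0, np.nextafter(0.0, 1.0), u)
```

The same `Generator` API also accepts other bit generators (e.g. `np.random.Philox`), so the PRNG can be swapped without changing the sampling code.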

Computers not using binary?

I don't know if this is the right place to ask such a question, but: ALL computers seem to use binary; when you program, the compiler always turns your code into a "01" form.
But, are there computers that do NOT work on that principle?
If yes, what kind of computers are they? What can they do, and how were they created?
If no, why can't such computers be made?
Thanks, and I hope this is useful for anyone looking for a similar answer.
Yes, but they are esoteric. Modern computers are digital, use base two, are electronic, and use classical systems largely due to the success of integrated digital logic circuits, but it was a long road and other types of computers have been invented along the way.
Digital computers in other bases
A handful of computers were made based on ternary (base 3) logic. Some of these used bipolar voltages to represent values in balanced ternary. The general consensus is that if you are going to use digital circuitry, binary is the best option, so this is mostly a historical curiosity.
See: https://en.wikipedia.org/wiki/Ternary_computer
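To illustrate the balanced-ternary representation those machines used (digits -1, 0, +1), here is a small Python sketch; the function name is my own invention:

```python
def to_balanced_ternary(n):
    """Return the balanced-ternary digits of n, most significant first.

    Each digit is -1, 0, or +1; e.g. 5 = 1*9 + (-1)*3 + (-1)*1.
    Works for negative n too, with no separate sign bit needed.
    """
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:       # represent a digit 2 as 3 - 1: carry one upward
            r = -1
            n += 1
        digits.append(r)
        n //= 3
    return digits[::-1]
```

The absence of a sign bit (negation just flips every digit) is one of the elegances that made balanced ternary attractive.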
Analog and mechanical computers
Analog computers use continuous voltages to represent values. Mechanical computers use the positions of physical parts, such as gear shafts, to represent values. There is some renewed interest in analog computers for problems such as image analysis, where researchers suspect that the benefits of performance outweigh the disadvantages of accumulated error. Whether analog computers see a resurgence is unknown.
See: https://en.wikipedia.org/wiki/Analog_computer
Quantum computers
Quantum computers represent values using superpositions of quantum states. Building a large quantum computer is an open problem, but a number of algorithms have already been devised for quantum computers that would outperform their classical counterparts, Shor's algorithm being one example.
See: https://en.wikipedia.org/wiki/Quantum_computing
Technically speaking, conventional computers represent bits as voltage levels:
0 = no voltage
1 = voltage present
However, there are new quantum computers, still far from complete, that do not use 0s and 1s; instead they use quantum bits (qubits).
Here is more information on the topic:
https://en.wikipedia.org/wiki/Quantum_computing

Is Mersenne Twister a good binary RNG?

I'm trying to find an RNG to generate a stream of pseudorandom bits. I have found that the Mersenne Twister (MT19937) is a widely used RNG that generates good 32-bit unsigned integers, and that implementations exist that generate apparently good double-precision floats (from a 53-bit integer). But I can't seem to find any references to it being well-behaved at the bit level.
Marsaglia expressed some concerns about the randomness of the Mersenne Twister that make me hesitant to use it.
Does anybody know if Mersenne Twister has a significant bias used to generate pseudorandom bits? If it is the case, does anyone know a good pseudorandom bit generator?
All pseudorandom generators strive for a high degree of unpredictability per bit. There is currently no way to predict a bit of the Mersenne Twister's output substantially better than random chance, until you have observed 624 values.
All questions of the form "is X a good RNG" must be answered with: "what are you doing with it?" The Mersenne Twister has had GREAT success in simulations because of its excellent frequency distributions. In cryptographic situations, it is completely and utterly devoid of value: the internal state can be recovered from any 624 contiguous outputs. Blum Blum Shub is very strong in cryptographic situations, but it runs unacceptably slowly for use in simulations.
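That 624-output state recovery is easy to demonstrate with Python's `random` module, which uses MT19937. The untempering helpers below are a standard trick, not part of any library API; only `getrandbits` and `setstate` are real `random` calls:

```python
import random

def undo_xorshift_right(y, shift):
    # Invert the tempering step y ^= y >> shift (32-bit words).
    result = y
    for _ in range(32):
        result = y ^ (result >> shift)
    return result

def undo_xorshift_left(y, shift, mask):
    # Invert the tempering step y ^= (y << shift) & mask (32-bit words).
    result = y
    for _ in range(32):
        result = y ^ ((result << shift) & mask)
    return result & 0xFFFFFFFF

def untemper(y):
    # Reverse MT19937's output tempering to recover a raw state word.
    y = undo_xorshift_right(y, 18)
    y = undo_xorshift_left(y, 15, 0xEFC60000)
    y = undo_xorshift_left(y, 7, 0x9D2C5680)
    y = undo_xorshift_right(y, 11)
    return y

victim = random.Random(12345)
outputs = [victim.getrandbits(32) for _ in range(624)]

# Rebuild the full 624-word state and load it into a fresh generator.
clone = random.Random()
clone.setstate((3, tuple(untemper(o) for o in outputs) + (624,), None))

# The clone now predicts the victim's future outputs exactly.
assert clone.getrandbits(32) == victim.getrandbits(32)
```

This is exactly why MT19937 must never be used where an observer of past outputs must not predict future ones.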
No.
Nobody should be choosing a Mersenne Twister to generate randomness unless it's built-in, and if you are using randomness extensively you should be replacing it anyway. The Mersenne Twister fails basic statistical randomness tests that far simpler, far faster algorithms do not, and is generally just a bit disappointing.
The insecure, non-cryptographic pseudo-random number generators I recommend nowadays are the xoshiro/xoroshiro family and the PCG family. xoroshiro is faster and purported to be slightly higher quality, but the PCG family comes with a more complete library and fills more roles.
However, modern cryptographic randomness can be more than fast enough. Rust's rand library used ISAAC by default at the time (current versions default to a ChaCha-based generator), and other choices exist. This should be your default choice in all but the most exceptional cases.

Prime Factorization

I have recently been reading about the general use of prime factors within cryptography. Everywhere I read, it states that there is no published algorithm that runs in polynomial time (as opposed to exponential time) for finding the prime factors of a key.
If an algorithm that runs in polynomial time were discovered or published, how would this impact the real-world computing environment, as opposed to the world of theory and computer science? Considering the extent to which we depend on cryptography, would the world suddenly come to a halt?
With this in mind, if P = NP is true, what might happen? How much do we depend on the fact that it is as yet unproved?
I'm a beginner, so please forgive any mistakes in my question, but I think you'll get my general gist.
With this in mind, if P = NP is true, would they ever tell us?
Who are “they”? If it were true, we would know. The computer scientists? That’s us. The cryptographers and mathematicians? The professionals? The experts? People like us. Users of the Internet, even of Stack Overflow.
We wouldn’t need to be told. We’d do the telling.
Science and research isn’t done behind closed doors. If someone finds out that P = NP, this couldn’t be kept secret, simply because of the way that research is published. In principle, everyone has access to such research.
It depends on who discovers it.
NSA and other organizations that research cryptography under state sponsorship, contrary to Konrad's assertion, do research and science behind closed doors—and guns. And they have "scooped" published academic researchers on some important discoveries. Finally, they have a history of withholding cryptanalytic advances for years after they are independently discovered by academic researchers.
I'm not big into conspiracy theories. But I'd be very surprised if a lot of "black" money hasn't been spent by governments on the factorization problem. And if any results are obtained, they would be kept secret. A lot of criticism has been leveled at agencies in the U.S. for failing to coordinate with each other to avert terrorism. It might be that notifying the FBI of information gathered by the NSA would reveal "too much" about the NSA's capabilities.
You might find the first question posed to Bruce Schneier in this interview interesting. The upshot is that NSA will always have an edge over academia, but that margin is shrinking.
For what it is worth, the NSA recommends the use of elliptic curve Diffie-Hellman key agreement, not RSA encryption. Do they like the smaller keys? Are they looking ahead to quantum computing? Or … ?
Keep in mind that factoring is not known to be (and is conjectured not to be) NP-complete, thus demonstrating a P algorithm for factoring will not imply P=NP. Presumably we could switch the foundation of our encryption algorithms to some NP-complete problem instead.
Here's an article about P = NP from the ACM: http://cacm.acm.org/magazines/2009/9/38904-the-status-of-the-p-versus-np-problem/fulltext
From the link:
Many focus on the negative, that if P = NP then public-key cryptography becomes impossible. True, but what we will gain from P = NP will make the whole Internet look like a footnote in history.
Since all the NP-complete optimization problems become easy, everything will be much more efficient. Transportation of all forms will be scheduled optimally to move people and goods around quicker and cheaper. Manufacturers can improve their production to increase speed and create less waste. And I'm just scratching the surface.
Given this quote, I'm sure they would tell the world.
I think there were researchers in Canada(?) that were having good luck factoring large numbers with GPUs (or clusters of GPUs). It doesn't mean they were factored in polynomial time but the chip architecture was more favorable to factorization.
If a truly efficient algorithm for factoring composite numbers were discovered, I think the biggest immediate impact would be on e-commerce. Specifically, it would grind to a halt until a form of encryption was deployed that doesn't rely on multiplication being a one-way function.
There has been a lot of research into cryptography in the private sector for the past four decades. This was a big switch from the previous era, where crypto was largely in the purview of the military and secret government agencies. Those secret agencies definitely tried to resist this change, but once knowledge is discovered, it's very hard to keep it under wraps. With that in mind, I don't think a solution to the P = NP problem would remain a secret for long, despite any ramifications it might have in this one area. The potential benefits would be in a much wider range of applications.
Incidentally, there has been some research into quantum cryptography, which
relies on the foundations of quantum mechanics, in contrast to traditional public key cryptography which relies on the computational difficulty of certain mathematical functions, and cannot provide any indication of eavesdropping or guarantee of key security.
The first practical network using this technology went online in 2008.
As a side note, if you enter the realm of quantum computing, you can factor in polynomial time using Shor's algorithm; see Rob Pike's notes from his talk on quantum computing, page 25.
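For contrast with the polynomial-time quantum case, the naive classical approach is trial division, whose cost grows like sqrt(N), i.e. exponentially in the bit length of N; a minimal sketch:

```python
def trial_division(n):
    """Factor n by trial division.

    Performs O(sqrt(n)) trial divisions, which is exponential in the
    number of bits of n -- hopeless for the 2048-bit moduli used by RSA.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:            # whatever remains is itself prime
        factors.append(n)
    return factors
```

Better classical algorithms (the general number field sieve) are sub-exponential but still super-polynomial, which is the gap Shor's algorithm closes.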

Can quantum algorithms be used for encryption?

Can quantum algorithms be useful?
Has any one been successful in putting quantum algorithms to any use?
"Quantum algorithms" are algorithms to be run on quantum computers.
There are things that can be done quickly in the quantum computation model that are not known (or believed) to be possible with classical computation: Discrete logarithm and Integer factorisation (see Shor's algorithm) are in BQP, but not believed to be in P (or BPP). Thus when/if a quantum computer is built, it is known that it can break RSA and most current cryptography.
However,
quantum computers cannot (are not believed to, I mean) solve NP-complete problems in polynomial time, and more importantly,
no one has built a quantum computer yet, and it is not even clear if it will be possible to build one -- avoiding decoherence, etc. (There have been claims of quantum computers with a limited number of qubits -- 5 to 10, but clearly they are not useful for anything much.)
"Well, there's a quantum computer that can factor 15, so those of you using 4-bit RSA should worry." -- Bruce Schneier
[There is also the idea of quantum cryptography, which is cryptography over a quantum channel, and is something quite different from quantum computation.]
The only logical answer is that they are both useful and not useful. ;-)
My understanding is that current quantum computing capabilities can be used to exchange keys securely. The exchanged keys can then be used to perform traditional cryptography.
From what I know about quantum computing and algorithms, quantum algorithms see plenty of use in cryptography; if you are really interested in cryptography, do look into them. It basically all comes down to how well you know the fundamentals of quantum mechanics and discrete mathematics. For example, you will encounter difficult algorithms like Shor's algorithm, which is basically integer factorization. Classically there are well-known factoring algorithms (algebraic-group factorization, Fermat's factorization method, etc.), but quantum computing is a totally different setting: you are running things on a quantum computer, so the algorithms change and you use ones like Shor's.
Basically, build a good understanding of quantum computing first, and then look at quantum algorithms.
There is also some research into whether quantum computing can be used to solve hard problems, such as factoring large numbers (if this was feasible it would break current encryption techniques).
Stackoverflow runs on a quantum computer of sorts.
Feynman has implied the possibility that quantum probability is the source of human creativity.
Individuals in the crowd present answers and vote on them with only a probability of being correct. Only by sampling the crowd many times can the probability be raised to a confident level.
So maybe Stackoverflow exemplifies a successful quantum algorithm implementation.
What do you think?
One good use of a quantum device that is achievable with current technology is a random number generator.
Generating truly random bits is an important cryptographic primitive, used, for example, in the RSA algorithm to generate the private key. The default random number generator on a PC is deterministic: unless it is seeded from a source with real entropy, its output isn't really random at all.
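In practice, operating systems mitigate this by pooling physical entropy (interrupt timings, hardware noise) and exposing it to programs; in Python, the standard-library `secrets` module draws from that pool:

```python
import secrets

# 256 bits drawn from the OS entropy pool (e.g. /dev/urandom on Linux);
# suitable for cryptographic keys, unlike a seeded deterministic PRNG.
key_material = secrets.token_bytes(32)
```

A dedicated quantum device goes one step further by making the entropy source itself physically unpredictable rather than merely hard to observe.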
