Is there any open-source Post-Quantum Cryptography (PQC) or quantum-safe symmetric cryptography algorithm available for the Microsoft Quantum Development Kit?
You may find the https://github.com/microsoft/grover-blocks project from Microsoft Research of interest. That repository looks at cost estimates for many different components of AES, such as the SBox step, using the resources estimator provided with the Quantum Development Kit.
PQC is not the same thing as "quantum safe"; PQC can be regarded as a subset of quantum safe, because quantum safe is a broader concept that also covers things like Quantum Key Distribution.
It is the asymmetric-key algorithms we use today that are under threat from quantum computers.
So the PQC effort, as driven by the NIST standardization process, focuses only on asymmetric algorithms for key exchange and digital signatures.
Doubling the key size of today's symmetric keys is considered enough to keep them quantum resistant, because Grover's algorithm only halves the effective key strength (for example, AES-256 retains roughly 128-bit security against a quantum attacker).
The Microsoft QDK is for implementing quantum computing algorithms; it has nothing to do with PQC. If you need symmetric-key support, consider C#: Q# quantum applications built with the QDK can integrate with C# for symmetric-key encryption.
So the QDK has no PQC support.
I want to know if there is a successor to the X9.31 (AES-based) generators in the Crypto++ library, since X9.31 cannot be used after December 2015 (NIST SP 800-131A).
If you're looking for an approved CSPRNG, then you want either the HMAC DRBG or the Hash DRBG, both of which are approved by NIST and are in Crypto++. You will want to seed it using the OS CSPRNG, which is also available in Crypto++.
The HMAC DRBG has been found to be slightly more likely to resist attacks, but the Hash DRBG is going to be faster in a typical implementation. There are no known practical attacks on either when used correctly.
If you don't have a strong need for compliance, I'd just use the OS PRNG. It is robust and secure, and even if you picked another algorithm you'd still need to seed it from the OS PRNG anyway.
If you do need to pick an approved algorithm for compliance reasons but don't otherwise have a strong preference, use an HMAC DRBG with SHA-512, seeded from the OS RNG, which provides the best possible security and the best possible performance for an HMAC DRBG on 64-bit systems.
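For illustration only, here is a minimal Python sketch of the SP 800-90A HMAC_DRBG construction with SHA-512, seeded from the OS RNG. This shows the mechanism described above, not the Crypto++ API, and it omits the nonce, reseed counter, and additional-input handling that the standard requires:

import hmac, hashlib, os

class HmacDrbg:
    # Minimal sketch of the SP 800-90A HMAC_DRBG construction using SHA-512.
    # For illustration only: no nonce, no reseed counter, no additional input.
    def __init__(self, entropy, personalization=b""):
        self.K = b"\x00" * 64          # SHA-512 output length is 64 bytes
        self.V = b"\x01" * 64
        self._update(entropy + personalization)

    def _hmac(self, key, data):
        return hmac.new(key, data, hashlib.sha512).digest()

    def _update(self, provided=b""):
        self.K = self._hmac(self.K, self.V + b"\x00" + provided)
        self.V = self._hmac(self.K, self.V)
        if provided:
            self.K = self._hmac(self.K, self.V + b"\x01" + provided)
            self.V = self._hmac(self.K, self.V)

    def generate(self, n):
        out = b""
        while len(out) < n:
            self.V = self._hmac(self.K, self.V)
            out += self.V
        self._update()
        return out[:n]

# Seed once from the OS CSPRNG, then draw deterministic output.
drbg = HmacDrbg(entropy=os.urandom(48))
key = drbg.generate(32)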
I don't know if this is the right place to ask such a question, but ALL computers use binary: when you program, the compiler always turns your code into a form made of 0s and 1s.
But, are there computers that do NOT work on that principle?
If yes, what kind of computers they are? What can they do and how were they created?
If no, why can't such computers be made?
Thanks, and I hope the answers here are useful to anyone looking for similar information.
Yes, but they are esoteric. Modern computers are digital, use base two, are electronic, and use classical systems largely due to the success of integrated digital logic circuits, but it was a long road and other types of computers have been invented along the way.
Digital computers in other bases
A handful of computers were made based on ternary (base 3) logic. Some of these used bipolar voltages to represent values in balanced ternary. The general consensus is that if you are going to use digital circuitry, binary is the best option, so this is mostly a historical curiosity.
See: https://en.wikipedia.org/wiki/Ternary_computer
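As a quick illustration of the balanced-ternary idea, each digit is -1, 0, or +1, which maps naturally onto negative, zero, and positive voltage. A small Python sketch of the conversion:

def to_balanced_ternary(n):
    # Return the balanced-ternary digits of n (most significant first),
    # each digit being -1, 0 or +1.
    digits = []
    while n != 0:
        r = n % 3
        n //= 3
        if r == 2:             # a digit of 2 becomes -1 with a carry into the next place
            r, n = -1, n + 1
        digits.append(r)
    return digits[::-1] or [0]

print(to_balanced_ternary(5))  # [1, -1, -1]  ->  9 - 3 - 1 = 5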
Analog and mechanical computers
Analog computers use continuous voltages to represent values. Mechanical computers use the positions of physical parts, such as gear shafts, to represent values. There is some renewed interest in analog computers for problems such as image analysis, where researchers suspect that the benefits of performance outweigh the disadvantages of accumulated error. Whether analog computers see a resurgence is unknown.
See: https://en.wikipedia.org/wiki/Analog_computer
Quantum computers
Quantum computers represent values using superpositions of quantum states. Creating a large quantum computer is an open problem, but a number of algorithms have already been written for quantum computers that would outperform classical implementations, Shor's algorithm being the best-known example.
See: https://en.wikipedia.org/wiki/Quantum_computing
Technically speaking, all conventional computers are based on two voltage levels:
0 = no voltage
1 = voltage present
However, there are new "quantum computers", still far from mature, that do not use plain 0/1 bits; instead they use quantum bits (qubits).
Here is more information on the topic:
https://en.wikipedia.org/wiki/Quantum_computing
Might classical pseudorandom generators be predictable by powerful quantum computers in the future, or is it proven that this is not possible?
If they are predictable, do scientists know whether there exist PRGs that are unpredictable by quantum computers?
The security of a classical Cryptographic Pseudo-Random Number Generator (CPRNG) is always based on some hardness assumption, such as "factoring is hard" or "colliding the SHA-256 function is hard".
Quantum computers make some computational problems easier. That violates some of the old hardness assumptions. But not all of them.
For example, Blum Blum Shub is likely broken by quantum computers, but no one knows how to break lattice-based cryptography with quantum computers. Showing you can break all classical CPRNGs with quantum computers would be tantamount to showing something like NP ⊆ BQP, which is not expected to be the case.
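For concreteness, here is a toy Python sketch of Blum Blum Shub. The primes below are ridiculously small and chosen only for illustration; the state is squared modulo N = p*q, and the generator's security rests on the hardness of factoring N, exactly the assumption Shor's algorithm would remove:

p, q = 10007, 10039            # toy Blum primes (both congruent to 3 mod 4); real ones are enormous
N = p * q
x = 123457                     # seed; must be coprime to N

def bbs_bits(x, count):
    bits = []
    for _ in range(count):
        x = (x * x) % N        # the whole generator is just repeated squaring mod N
        bits.append(x & 1)     # output the least significant bit of each state
    return bits, x

bits, x = bbs_bits(x, 32)
print(bits)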
Even if quantum computers did break all classical CPRNGs, they happen to also fill that hole. They enable the creation of "Einstein-certified" random numbers.
Can quantum algorithms be useful?
Has anyone been successful in putting quantum algorithms to any use?
"Quantum algorithms" are algorithms to be run on quantum computers.
There are things that can be done quickly in the quantum computation model that are not known (or believed) to be possible with classical computation: discrete logarithm and integer factorisation (see Shor's algorithm) are in BQP, but not believed to be in P (or BPP). Thus when/if a quantum computer is built, it is known that it will be able to break RSA and most public-key cryptography in current use.
However,
quantum computers cannot (are not believed to, I mean) solve NP-complete problems in polynomial time, and more importantly,
no one has built a quantum computer yet, and it is not even clear if it will be possible to build one -- avoiding decoherence, etc. (There have been claims of quantum computers with a limited number of qubits -- 5 to 10, but clearly they are not useful for anything much.)
"Well, there's a quantum computer that can factor 15, so those of you using
4-bit RSA should worry." -- Bruce Schneier
[There is also the idea of quantum cryptography, which is cryptography over a quantum channel, and is something quite different from quantum computation.]
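To make "break RSA" a bit more concrete: the quantum part of Shor's algorithm only finds the period r of f(x) = a^x mod N; everything else is classical. A hedged Python sketch of that classical post-processing, with the period found by brute force as a stand-in for the quantum step:

from math import gcd

def shor_classical_part(N, a):
    # Factor N given the period r of a^x mod N. A quantum computer finds r
    # efficiently; here we brute-force it, which is the exponentially hard part.
    g = gcd(a, N)
    if g != 1:
        return g, N // g                     # lucky guess: a already shares a factor with N
    r = 1
    while pow(a, r, N) != 1:                 # brute-force period finding
        r += 1
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                          # unlucky choice of a; try another
    g = gcd(pow(a, r // 2, N) - 1, N)
    return g, N // g

print(shor_classical_part(15, 7))            # (3, 5)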
The only logical answer is that they are both useful and not useful. ;-)
My understanding is that current quantum technology can already be used to exchange keys securely (quantum key distribution). The exchanged keys can then be used with traditional cryptography.
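The best-known example is the BB84 protocol. Its basis-sifting idea can be sketched classically; this toy Python simulation assumes an honest channel with no eavesdropper and, of course, provides none of the physical guarantees that make QKD interesting:

import secrets

n = 32
# Alice picks a random bit and a random basis ('+' or 'x') for each photon.
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.choice("+x") for _ in range(n)]
# Bob measures each photon in a randomly chosen basis.
bob_bases   = [secrets.choice("+x") for _ in range(n)]
# When Bob guesses the basis correctly he reads Alice's bit; otherwise the
# result is random, so those positions are discarded during public sifting.
bob_bits    = [a if ab == bb else secrets.randbelow(2)
               for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
shared_key  = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
# An eavesdropper measuring in the wrong basis would disturb the states,
# which Alice and Bob can detect by comparing a sample of the sifted bits.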
As far as I know about quantum computing and algorithms, quantum algorithms see quite a lot of use in cryptography, so if you are really interested in cryptography, do look into them. It largely comes down to how well you know the basics of quantum mechanics and discrete mathematics. For example, you will come across difficult-looking algorithms like Shor's algorithm, which is basically integer factorization. Integer factorization can be done with classical algorithms (algebraic-group factorization, Fermat's factorization method, etc.), but on a quantum computer the situation is totally different: the algorithms change, and you use ones like Shor's instead.
Basically, build a good understanding of quantum computing first, and then look at quantum algorithms.
There is also some research into whether quantum computing can be used to solve hard problems, such as factoring large numbers (if this were feasible, it would break current encryption techniques).
Stack Overflow runs on a quantum computer of sorts.
Feynman has implied the possibility that quantum probability is the source of human creativity.
Individuals in the crowd present answers and vote on them with only a probability of being correct. Only by sampling the crowd many times can the probability be raised to a confident level.
So maybe Stack Overflow exemplifies a successful quantum algorithm implementation.
What do you think?
One good use of a quantum device that is possible with current technology is a random number generator.
Generating truly random bits is an important cryptographic primitive, used, for example, to generate the private key in the RSA algorithm. On an ordinary PC the "random" number generator is a deterministic algorithm: the source has little or no entropy in it, and therefore the output isn't really random at all.
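As a small illustration of the difference, a seeded software PRNG is completely determined by its seed, while key material should come from the operating system's entropy-backed interface, which a hardware or quantum source can feed. A minimal Python sketch:

import os, random

# A deterministic PRNG: anyone who learns the seed can reproduce every output.
prng = random.Random(42)
print([prng.randrange(256) for _ in range(4)])   # the same list on every run

# Key material should instead come from the OS interface that mixes in
# hardware entropy (or, in principle, a quantum random number generator).
print(os.urandom(16).hex())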
We have some heavy security requirements on our project, and we need to do a lot of encryption that is highly performant.
I think I know that PKI is much slower and more complex than symmetric encryption, but I can't find the numbers to back up that feeling.
Yes, purely asymmetric encryption is much slower than symmetric cyphers (like DES or AES), which is why real applications use hybrid cryptography: the expensive public-key operations are performed only to encrypt (and exchange) an encryption key for the symmetric algorithm that is going to be used for encrypting the real message.
The problem that public-key cryptography solves is that there is no shared secret. With symmetric encryption you have to trust all involved parties to keep the key secret. This issue should be a much bigger concern than performance (which can be mitigated with a hybrid approach).
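A minimal sketch of that hybrid pattern, assuming the third-party Python cryptography package is available: one expensive RSA-OAEP operation wraps a fresh AES key, and fast AES-GCM protects the actual message.

import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Recipient's long-term key pair (normally generated once and distributed via a PKI).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender: fresh symmetric key for this message, AES-GCM for the bulk data,
# and a single RSA operation just to wrap the 32-byte key.
aes_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"the actual (possibly huge) message", None)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(aes_key, oaep)

# Recipient: one RSA decryption recovers the AES key, then fast symmetric decryption.
recovered_key = private_key.decrypt(wrapped_key, oaep)
plaintext = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)

The RSA cost here is paid once per message (or per session), no matter how large the message itself is.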
On a Macbook running OS X 10.5.5 and a stock build of OpenSSL, "openssl speed" clocks AES-128-CBC at 46,000 1024-byte blocks per second. That same box clocks 1024-bit RSA at 169 signatures per second. AES-128-CBC is the "textbook" block encryption algorithm, and RSA 1024 is the "textbook" public key algorithm. It's apples-to-oranges, but the answer is: RSA is much, much slower.
That's not why you shouldn't be using public key encryption, however. Here are the real reasons:
Public key crypto operations aren't intended for raw data encryption. Algorithms like Diffie-Hellman and RSA were devised as a way of exchanging keys for block crypto algorithms. So, for instance, you'd use a secure random number generator to generate a 128 bit random key for AES, and encrypt those 16 bytes with RSA.
Algorithms like RSA are much less "user-friendly" than AES. With a random key, a plaintext block you feed to AES is going to come out random to anyone without the key. That is actually not the case with RSA, which is, even more so than AES, just a math equation. So in addition to storing and managing keys properly, you have to be extremely careful with the way you format your RSA plaintext blocks, or you end up with vulnerabilities (see the toy example after this list of reasons).
Public key doesn't work without a key management infrastructure. If you don't have a scheme to verify public keys, attackers can substitute their own keypairs for the real ones to launch "man in the middle" attacks. This is why SSL forces you to go through the rigamarole of certificates. Block crypto algorithms like AES do suffer from this problem too, but without a PKI, AES is no less safe than RSA.
Public key crypto operations are susceptible to more implementation vulnerabilities than AES. For example, both sides of an RSA transaction have to agree on parameters, which are numbers fed to the RSA equation. There are evil values attackers can substitute in to silently disable encryption. The same goes for Diffie Hellman and even more so for Elliptic Curve. Another example is the RSA Signature Forgery vulnerability that occurred 2 years ago in multiple high-end SSL implementations.
Using public key is evidence that you're doing something "out of the ordinary". Out of the ordinary is exactly what you never want to be with cryptography; beyond just the algorithms, crypto designs are audited and tested for years before they're considered safe.
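To illustrate the second point above: textbook RSA is nothing but modular exponentiation, so without randomized padding the same plaintext always produces the same ciphertext. A toy Python example with the classic tiny parameters, useless for real security and shown only to make the point:

# Toy "textbook" RSA with tiny primes -- purely to show that raw RSA is
# deterministic math, which is why schemes like OAEP add randomized padding.
p, q = 61, 53
n = p * q                     # 3233
e, d = 17, 2753               # e*d = 1 mod lcm(p-1, q-1)

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

print(enc(65), enc(65))       # 2790 2790 -- identical plaintexts leak equality
print(dec(enc(65)))           # 65

Real schemes avoid this by applying randomized padding such as OAEP before the exponentiation.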
To our clients who want to use cryptography in their applications, we make two recommendations:
For "data at rest", use PGP. Really! PGP has been beat up for more than a decade and is considered safe from dumb implementation mistakes. There are open source and commercial variants of it.
For "data in flight", use TLS/SSL. No security protocol in the world is better understood and better tested than TLS; financial institutions everywhere accept it as a secure method to move the most sensitive data.
Here's a decent writeup [matasano.com] that Nate Lawson, a professional cryptographer, and I wrote a few years back. It covers these points in more detail.
Use the OpenSSL speed subcommand to benchmark the algorithms and see for yourself.
[dave@hal9000 ~]$ openssl speed aes-128-cbc
Doing aes-128 cbc for 3s on 16 size blocks: 26126940 aes-128 cbc's in 3.00s
Doing aes-128 cbc for 3s on 64 size blocks: 7160075 aes-128 cbc's in 3.00s
...
The 'numbers' are in 1000s of bytes per second processed.
type 16 bytes 64 bytes 256 bytes 1024 bytes 8192 bytes
aes-128 cbc 139343.68k 152748.27k 155215.70k 155745.61k 157196.29k
[dave@hal9000 ~]$ openssl speed rsa2048
Doing 2048 bit private rsa's for 10s: 9267 2048 bit private RSA's in 9.99s
Doing 2048 bit public rsa's for 10s: 299665 2048 bit public RSA's in 9.99s
...
sign verify sign/s verify/s
rsa 2048 bits 0.001078s 0.000033s 927.6 29996.5
Practical PKI-based encryption systems use asymmetric encryption to encrypt a symmetric key, and then symmetric encryption with that key to encrypt the data (having said that, someone will point out a counter-example).
So the additional overhead imposed by asymmetric crypto algorithms over that of symmetric is fixed - it doesn't depend on the data size, just on the key sizes.
Last time I tested this, validating a chain of 3 or so X.509 certificates [edit to add: and the data they were signing] was taking a fraction of a second on an ARM running at 100MHz or so (averaged over many repetitions, obviously). I can't remember how small - not negligible, but well under a second.
Sorry I can't remember the exact details, but the summary is that unless you're on a very restricted system or doing a huge amount of encryption (like a server that wants to accept as many SSL connections per second as possible), NIST-approved asymmetric encryption methods are fast enough.
Apparently it is about 1000x slower (http://windowsitpro.com/article/articleid/93787/symmetric-vs-asymmetric-ciphers.html). But unless you're really working through a lot of data, it isn't going to matter. What you can do is use asymmetric encryption to exchange a symmetric encryption key.
Perhaps you can add some details about your project so that you get better quality answers. What are you trying to secure? From whom? If you could explain the requirements of your security, you'll get a much better answer. Performance doesn't mean much if the encryption mechanism isn't protecting what you think it is.
For instance, X.509 certs are an industry-standard way of securing client/server endpoints. PGP armoring can be used to secure license files. For simplicity, cipher block chaining with Blowfish (and a host of other ciphers) is easy to use in Perl or Java, if you control both endpoints.
Thanks.
Yes, the hybrid encryption offered by standardized cryptographic schemes like PGP, TLS, and CMS does impose a fixed performance cost on each message or session. How big that impact is depends on the algorithms selected and which operation you are talking about.
For RSA, decryption and signing operations are relatively slow, because they require modular exponentiation with a large private exponent. RSA encryption and signature verification, on the other hand, are very fast, because they use the small public exponent. The gap widens further as the key length grows.
Under ECC, because peers are doing the same math with keys of similar size, operations are more balanced than with RSA. In an integrated encryption scheme, an ephemeral EC key can be generated and used in a key-agreement algorithm; that requires a little extra work for the message sender. ECDH key agreement is much, much slower than RSA encryption (the public-key operation), but much faster than RSA decryption.
In terms of relative numbers, decrypting with AES might be 100,000x faster than decrypting with RSA. In terms of absolute numbers, depending heavily on hardware, AES might take a few nanoseconds per block, while RSA takes a millisecond or two. And that prompts the question, why would anyone use asymmetric algorithms, ever?
The answer is that these algorithms are used together, for different purposes, in hybrid encryption schemes. Fast, symmetric algorithms like AES are used to protect the message itself, and slow, asymmetric algorithms like RSA are used in turn to protect the keys needed by the symmetric algorithms. This is what allows parties that have never previously shared any secret information, like you and your search engine, to communicate securely with each other.