Implication of lacking IND-CCA security - public-key-encryption

I was wondering if somebody could tell me what the implication is of an encryption scheme lacking IND-CCA or IND-CPA security. For example, the ElGamal scheme is known to lack IND-CCA security. The Helios voting protocol uses ElGamal to encrypt each ballot - what I am having a difficult time understanding is what the implication is of there being an efficient IND-CCA adversary against ElGamal while ElGamal is used in Helios.
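To make the missing property concrete, here is a toy sketch of textbook multiplicative ElGamal over a tiny prime group (the prime, generator, and message values are illustrative assumptions, nothing like real Helios parameters). It shows the malleability that rules out IND-CCA security: anyone can turn a ciphertext of m into a valid ciphertext of m * delta without knowing the private key, so a protocol built on plain ElGamal has to exclude that kind of manipulation by other means.

    # Toy multiplicative ElGamal over a tiny prime group (illustrative only).
    # Demonstrates the ciphertext malleability that rules out IND-CCA security.
    import random

    p = 23          # toy safe prime (2*11 + 1); real systems use >= 2048-bit groups
    g = 5           # generator of the multiplicative group mod p

    x = random.randrange(2, p - 1)   # private key
    h = pow(g, x, p)                 # public key

    def encrypt(m):
        k = random.randrange(2, p - 1)
        return (pow(g, k, p), (m * pow(h, k, p)) % p)

    def decrypt(c1, c2):
        s = pow(c1, x, p)
        return (c2 * pow(s, -1, p)) % p   # pow(s, -1, p) needs Python 3.8+

    m = 9
    c1, c2 = encrypt(m)

    # Anyone can multiply the second component by delta; the result decrypts
    # to m * delta mod p, a validly formed ciphertext the sender never made.
    delta = 2
    c2_prime = (c2 * delta) % p

    assert decrypt(c1, c2) == m
    assert decrypt(c1, c2_prime) == (m * delta) % p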

Related

ANSI X9.31 PRNG successor

I want to know if there is a successor for the X9.31 (AES-based) generator in the Crypto++ library, since X9.31 cannot be used after December 2015 (NIST SP 800-131A)?
If you're looking for an approved CSPRNG, then you want either the HMAC DRBG or the Hash DRBG, both of which are approved by NIST and are in Crypto++. You will want to seed it using the OS CSPRNG, which is also available in Crypto++.
The HMAC DRBG has been found to be slightly more likely to resist attacks, but the Hash DRBG is going to be faster in a typical implementation. There are no known practical attacks on either when used correctly.
If you don't have a strong need for compliance, I'd just use the OS PRNG and stick with that. The OS PRNG is going to be robust and secure, and even if you picked another algorithm, you'd still need to seed it from the OS PRNG anyway.
If you do need to pick an approved algorithm for compliance reasons but don't otherwise have a strong preference, use an HMAC DRBG with SHA-512, seeded from the OS RNG, which provides the best possible security and the best possible performance for an HMAC DRBG on 64-bit systems.
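For a rough picture of what the HMAC DRBG actually does, here is a minimal Python sketch of the SP 800-90A HMAC_DRBG update/generate flow, seeded from the OS CSPRNG via os.urandom. It is meant to illustrate the construction, not the Crypto++ API, and it omits reseed counters, personalization strings, and prediction resistance.

    # Minimal HMAC_DRBG (SP 800-90A style) sketch using SHA-256, seeded from the OS CSPRNG.
    # Omits reseeding limits, personalization strings, and prediction resistance.
    import hashlib
    import hmac
    import os

    class HmacDrbg:
        def __init__(self, seed_material: bytes):
            self.K = b"\x00" * 32
            self.V = b"\x01" * 32
            self._update(seed_material)

        def _hmac(self, key: bytes, data: bytes) -> bytes:
            return hmac.new(key, data, hashlib.sha256).digest()

        def _update(self, provided: bytes = b"") -> None:
            self.K = self._hmac(self.K, self.V + b"\x00" + provided)
            self.V = self._hmac(self.K, self.V)
            if provided:
                self.K = self._hmac(self.K, self.V + b"\x01" + provided)
                self.V = self._hmac(self.K, self.V)

        def generate(self, n: int) -> bytes:
            out = b""
            while len(out) < n:
                self.V = self._hmac(self.K, self.V)
                out += self.V
            self._update()   # refresh state after each request (backtracking resistance)
            return out[:n]

    drbg = HmacDrbg(os.urandom(48))   # entropy + nonce taken from the OS CSPRNG
    print(drbg.generate(32).hex())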

Does winapi's bcrypt.h actually support bcrypt hashing?

This may sound like a strange question, and it feels a bit bizarre that I actually have to ask this, but after spending a couple hours looking over the MSDN documentation for the bcrypt routines that were added in Vista, I've almost reached the conclusion that there is no actual bcrypt support!
According to Wikipedia:
bcrypt is an adaptive cryptographic hash function for passwords ... based on the Blowfish cipher ... Besides incorporating a salt to protect against rainbow table attacks, bcrypt is an adaptive hash: over time it can be made slower and slower so it remains resistant to specific brute-force search attacks against the hash and the salt.
However, from the documentation on MSDN, the "bcrypt" library is apparently actually a generic interface for encryption and hashing. You have to obtain a handle to an "algorithm provider" via the BCryptOpenAlgorithmProvider function, which has several built-in algorithms to choose from. But the word "blowfish" does not appear anywhere in the list.
So am I missing something? Am I reading this wrong? Or does Windows's "bcrypt" library not actually support bcrypt at all?
In the context of MSDN, BCrypt is short for "BestCrypt", but the PR name for it is:
Cryptography API: Next Generation (Cng)
It is implemented in bcrypt.dll.
BestCrypt/BCrypt/Cng is the successor to the older CryptoAPI.
Microsoft is slowly removing references to "BestCrypt" from their site, but you can still see it in some pages like:
SHA256Cng Class
This algorithm is for hashing only and does not provide any encryption or decryption. It uses the BCrypt (BestCrypt) layer CNG.
It's interesting (to me anyway) that the .NET Framework can generally provide three implementations of each kind of crypto algorithm. For example, for SHA-2 hashing, there is:
SHA256Managed: an implementation written purely in managed code
SHA256CryptoServiceProvider: a wrapper around the native Cryptographic Service Provider (CSP) implementation
SHA256Cng: a wrapper around Cryptography Next Gen (Cng) implementation
Short version
No, BCrypt is short for BestCrypt. And, no, it doesn't support bcrypt (Blowfish crypt) password hashing.
The BCrypt APIs are generic and support various cryptographic hash algorithms, but bcrypt is not one of them. The "B" prefix seems to be just a way to distinguish the older APIs from the Next Generation ones.
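If what you actually need is bcrypt (Blowfish-based) password hashing, you will have to reach for a dedicated library rather than CNG. As one hedged illustration, the third-party Python bcrypt package exposes the salt and tunable cost factor described in the quote above; the package choice and the rounds value here are example assumptions, not anything Windows provides.

    # bcrypt password hashing via the third-party "bcrypt" package, not CNG.
    # gensalt() embeds a random salt; rounds is the adjustable work factor.
    import bcrypt

    password = b"correct horse battery staple"
    hashed = bcrypt.hashpw(password, bcrypt.gensalt(rounds=12))

    print(hashed)                                   # e.g. b"$2b$12$..."
    print(bcrypt.checkpw(password, hashed))         # True
    print(bcrypt.checkpw(b"wrong guess", hashed))   # False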

Feistel and non-Feistel ciphers

While googling I could find only Feistel ciphers and didn't find any relevant information on non-Feistel ciphers. Can someone suggest some good non-Feistel ciphers?
And yes this is homework.
There's way more than Feistel ciphers. :)
The simple answers: no stream ciphers, such as RC4, are Feistel ciphers. No public-key ciphers, such as RSA or ElGamal, are Feistel ciphers.
And the perhaps-surprising counter-example: Rijndael (the new AES), despite being a block cipher, isn't Feistel.
If you're really interested in Cryptography, I strongly recommend reading Handbook of Applied Cryptography, freely available and significantly better than most undergraduate texts. Schneier's "Applied Cryptography" is decent enough, an excellent introduction, but doesn't get into as much detail as one might like.
Rijndael, Square, Serpent, IDEA, Noekeon, etc. Wikipedia has a list of block ciphers, and the structure (Feistel, Feistel-like such as unbalanced Feistel, substitution-permutation network (SPN), etc.) is mentioned in each entry. SPN and Feistel are the most common, as those designs make it obvious that the function will be invertible. Designs other than these are rarer, but do occur. All the ciphers in the standards (like SSL/TLS, SSH, etc.) are of one of these two types.
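To see why the Feistel structure is trivially invertible no matter what the round function does, here is a toy sketch; the round function, key schedule, and block size are illustrative assumptions, not any standardized cipher. Decryption runs the same network with the round keys in reverse order, which is exactly why the round function itself never needs to be invertible.

    # Toy balanced Feistel network: decryption reuses the same structure,
    # even though the round function F is not invertible on its own.
    import hashlib

    def F(half: bytes, round_key: bytes) -> bytes:
        # Any keyed function will do here; it never has to be inverted.
        return hashlib.sha256(round_key + half).digest()[:len(half)]

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def feistel_encrypt(block: bytes, round_keys):
        L, R = block[:len(block) // 2], block[len(block) // 2:]
        for k in round_keys:
            L, R = R, xor(L, F(R, k))
        return L + R

    def feistel_decrypt(block: bytes, round_keys):
        L, R = block[:len(block) // 2], block[len(block) // 2:]
        for k in reversed(round_keys):
            L, R = xor(R, F(L, k)), L
        return L + R

    round_keys = [bytes([i]) * 16 for i in range(1, 9)]   # toy 8-round key schedule
    plaintext = b"16-byte message!"                       # one 128-bit block
    ciphertext = feistel_encrypt(plaintext, round_keys)
    assert feistel_decrypt(ciphertext, round_keys) == plaintext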
I suggest putting in a little more effort. Even a cursory online search turns up definitions of "Feistel cipher", as well as descriptions of a wide variety of cipher procedures. It should not be too hard to tell which are clearly not Feistel ciphers.
I further recommend finding a good book on the subject, such as Bruce Schneier's "Applied Cryptography" (either edition).

Can the Diffie-Hellman protocol be used as a base for digital signatures?

I am implementing a custom crypto library using the Diffie-Hellman protocol (yes, I know about RSA/SSL and the like - I am using it for specific purposes), and so far it has turned out better than I originally expected - using GMP, it's very fast.
My question is, besides the obvious key exchange part, if this protocol can be used for digital signatures as well.
I have looked at quite a few resources online, but so far my search has been fruitless.
Is this at all possible?
Any (serious) ideas are welcome.
Update:
Thanks for the comments. And for the more curious people:
My DH implementation is meant - among other things - to distribute encrypted "resources" to client-side applications. Both are, for the most part, my own code.
Every client has a DH key pair, and I use it along with my server's public key to generate the shared keys. In turn, I use them for HMACs and symmetric encryption.
DH keys are built anywhere from 128 up to 512 bits, using safe primes as the modulus.
I realize that "pure" DH alone can't be used for signatures; I was hoping for something close to it (or as simple).
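As a sketch of the setup described above (DH key pairs on both sides, a shared secret, and separate keys derived from it for HMAC and symmetric encryption), here is roughly what it looks like with the Python cryptography package. The parameter size, HKDF label, and key split are assumptions for illustration, not the asker's library.

    # Sketch: DH key agreement, then derive distinct encryption and HMAC keys
    # from the shared secret with HKDF (illustrative parameters and labels).
    from cryptography.hazmat.primitives import hashes, hmac
    from cryptography.hazmat.primitives.asymmetric import dh
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    parameters = dh.generate_parameters(generator=2, key_size=2048)  # slow; reuse in practice

    server_priv = parameters.generate_private_key()
    client_priv = parameters.generate_private_key()

    # Each side combines its own private key with the peer's public key.
    server_secret = server_priv.exchange(client_priv.public_key())
    client_secret = client_priv.exchange(server_priv.public_key())
    assert server_secret == client_secret

    okm = HKDF(algorithm=hashes.SHA256(), length=64, salt=None,
               info=b"resource-distribution v1").derive(server_secret)
    enc_key, mac_key = okm[:32], okm[32:]

    tag = hmac.HMAC(mac_key, hashes.SHA256())
    tag.update(b"some resource bytes")
    print(tag.finalize().hex())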
It would appear this is feasible: http://www.quadibloc.com/crypto/pk050302.htm.
I would question why you are doing this, though. The first rule of implementing crypto is: don't implement crypto. Plenty of libraries already exist, and you would probably be better off leveraging those; crypto code is notoriously hard to get right even if you understand the science behind it.
DSA is the standard way to make digital signatures based on the discrete logarithm problem.
And to answer a potential future question, ephemeral-static Diffie-Hellman is the standard way to implement asymmetric encryption (to send messages where you know and trust the recipient's public key, for example through a certificate, but the recipient does not know your key).
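For completeness, here is what the DSA route looks like as a minimal sketch with the Python cryptography package; the key size and hash choice are just example values.

    # Minimal DSA sign/verify sketch (discrete-log-based signatures).
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import dsa

    private_key = dsa.generate_private_key(key_size=2048)
    public_key = private_key.public_key()

    message = b"signed resource manifest"
    signature = private_key.sign(message, hashes.SHA256())

    try:
        public_key.verify(signature, message, hashes.SHA256())
        print("signature OK")
    except InvalidSignature:
        print("signature rejected")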

What is the performance difference between PKI and symmetric encryption?

We have some heavy security requirements on our project, and we need to do a lot of encryption that is highly performant.
I believe that PKI is much slower and more complex than symmetric encryption, but I can't find the numbers to back up that feeling.
Yes, purely asymmetric encryption is much slower than symmetric ciphers (like DES or AES), which is why real applications use hybrid cryptography: the expensive public-key operations are performed only to encrypt (and exchange) an encryption key for the symmetric algorithm that is going to be used for encrypting the real message.
The problem that public-key cryptography solves is that there is no shared secret. With symmetric encryption you have to trust all involved parties to keep the key secret. This issue should be a much bigger concern than performance (which can be mitigated with a hybrid approach).
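As a concrete, hedged sketch of the hybrid approach described above: generate a random symmetric key, encrypt the bulk data with it, and use the public-key algorithm only to wrap that key. The library, RSA-OAEP padding, and AES-GCM choices are illustrative assumptions, not the only way to do it; the fixed cost is the single RSA operation per message, while the per-byte cost is all AES.

    # Hybrid encryption sketch: RSA-OAEP wraps a random AES key,
    # AES-GCM protects the actual message.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Sender: random symmetric key and nonce for the bulk data.
    aes_key = AESGCM.generate_key(bit_length=128)
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, b"the actual, possibly large, message", None)
    wrapped_key = recipient_key.public_key().encrypt(aes_key, oaep)

    # Recipient: unwrap the AES key, then decrypt the message.
    unwrapped = recipient_key.decrypt(wrapped_key, oaep)
    plaintext = AESGCM(unwrapped).decrypt(nonce, ciphertext, None)
    print(plaintext)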
On a Macbook running OS X 10.5.5 and a stock build of OpenSSL, "openssl speed" clocks AES-128-CBC at 46,000 1024 bit blocks per second. That same box clocks 1024 bit RSA at 169 signatures per second. AES-128-CBC is the "textbook" block encryption algorithm, and RSA 1024 is the "textbook" public key algorithm. It's apples-to-oranges, but the answer is: RSA is much, much slower.
That's not why you shouldn't be using public key encryption, however. Here's the real reasons:
Public key crypto operations aren't intended for raw data encryption. Algorithms like Diffie-Hellman and RSA were devised as a way of exchanging keys for block crypto algorithms. So, for instance, you'd use a secure random number generator to generate a 128 bit random key for AES, and encrypt those 16 bytes with RSA.
Algorithms like RSA are much less "user-friendly" than AES. With a random key, a plaintext block you feed to AES is going to come out random to anyone without the key. That is actually not the case with RSA, which is --- more so than AES --- just a math equation. So in addition to storing and managing keys properly, you have to be extremely careful with the way you format your RSA plaintext blocks, or you end up with vulnerabilities.
Public key doesn't work without a key management infrastructure. If you don't have a scheme to verify public keys, attackers can substitute their own keypairs for the real ones to launch "man in the middle" attacks. This is why SSL forces you to go through the rigamarole of certificates. Block crypto algorithms like AES do suffer from this problem too, but without a PKI, AES is no less safe than RSA.
Public key crypto operations are susceptible to more implementation vulnerabilities than AES. For example, both sides of an RSA transaction have to agree on parameters, which are numbers fed to the RSA equation. There are evil values attackers can substitute in to silently disable encryption. The same goes for Diffie Hellman and even more so for Elliptic Curve. Another example is the RSA Signature Forgery vulnerability that occurred 2 years ago in multiple high-end SSL implementations.
Using public key is evidence that you're doing something "out of the ordinary". Out of the ordinary is exactly what you never want to be with cryptography; beyond just the algorithms, crypto designs are audited and tested for years before they're considered safe.
To our clients who want to use cryptography in their applications, we make two recommendations:
For "data at rest", use PGP. Really! PGP has been beat up for more than a decade and is considered safe from dumb implementation mistakes. There are open source and commercial variants of it.
For "data in flight", use TLS/SSL. No security protocol in the world is better understood and better tested than TLS; financial institutions everywhere accept it as a secure method to move the most sensitive data.
Here's a decent writeup [matasano.com] that Nate Lawson, a professional cryptographer, and I wrote a few years back. It covers these points in more detail.
Use the OpenSSL speed subcommand to benchmark the algorithms and see for yourself.
[dave@hal9000 ~]$ openssl speed aes-128-cbc
Doing aes-128 cbc for 3s on 16 size blocks: 26126940 aes-128 cbc's in 3.00s
Doing aes-128 cbc for 3s on 64 size blocks: 7160075 aes-128 cbc's in 3.00s
...
The 'numbers' are in 1000s of bytes per second processed.
type             16 bytes     64 bytes    256 bytes   1024 bytes   8192 bytes
aes-128 cbc     139343.68k   152748.27k   155215.70k   155745.61k   157196.29k
[dave@hal9000 ~]$ openssl speed rsa2048
Doing 2048 bit private rsa's for 10s: 9267 2048 bit private RSA's in 9.99s
Doing 2048 bit public rsa's for 10s: 299665 2048 bit public RSA's in 9.99s
...
                   sign     verify    sign/s  verify/s
rsa 2048 bits  0.001078s  0.000033s    927.6   29996.5
Practical PKI-based encryption systems use asymmetric encryption to encrypt a symmetric key, and then symmetric encryption with that key to encrypt the data (having said that, someone will point out a counter-example).
So the additional overhead imposed by asymmetric crypto algorithms over that of symmetric is fixed - it doesn't depend on the data size, just on the key sizes.
Last time I tested this, validating a chain of 3 or so X.509 certificates [edit to add: and the data they were signing] was taking a fraction of a second on an ARM running at 100MHz or so (averaged over many repetitions, obviously). I can't remember how small - not negligible, but well under a second.
Sorry I can't remember the exact details, but the summary is that unless you're on a very restricted system or doing a lot of encryption (like accepting as many SSL connections per second as possible), NIST-approved asymmetric encryption methods are fast.
Apparently it is 1000x worse. (http://windowsitpro.com/article/articleid/93787/symmetric-vs-asymmetric-ciphers.html). But unless you're really working through a lot of data it isn't going to matter. What you can do is use asymmetric encryption to exchange a symmetric encryption key.
Perhaps you can add some details about your project so that you get better quality answers. What are you trying to secure? From whom? If you could explain the requirements of your security, you'll get a much better answer. Performance doesn't mean much if the encryption mechanism isn't protecting what you think it is.
For instance, X.509 certs are an industry-standard way of securing client/server endpoints. PGP armoring can be used to secure license files. For simplicity, cipher block chaining with Blowfish (and a host of other ciphers) is easy to use in Perl or Java, if you control both endpoints.
Thanks.
Yes, the hybrid encryption offered by standardized cryptographic schemes like PGP, TLS, and CMS does impose a fixed performance cost on each message or session. How big that impact is depends on the algorithms selected and which operation you are talking about.
For RSA, decryption and signing operations are relatively slow, because they require modular exponentiation with a large private exponent. RSA encryption and signature verification, on the other hand, are very fast, because they use the small public exponent. This gap widens as the key length grows.
Under ECC, because peers are doing the same math with keys of similar size, operations are more balanced than with RSA. In an integrated encryption scheme, an ephemeral EC key can be generated and used in a key agreement algorithm; that requires a little extra work for the message sender. ECDH key agreement is much, much slower than RSA encryption, but much faster than RSA decryption.
In terms of relative numbers, decrypting with AES might be 100,000x faster than decrypting with RSA. In terms of absolute numbers, depending heavily on hardware, AES might take a few nanoseconds per block, while RSA takes a millisecond or two. And that prompts the question, why would anyone use asymmetric algorithms, ever?
The answer is that these algorithms are used together, for different purposes, in hybrid encryption schemes. Fast, symmetric algorithms like AES are used to protect the message itself, and slow, asymmetric algorithms like RSA are used in turn to protect the keys needed by the symmetric algorithms. This is what allows parties that have never previously shared any secret information, like you and your search engine, to communicate securely with each other.
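If you want numbers for your own hardware rather than the ballpark figures above, one hedged way is to time the two operations directly. This sketch uses the Python cryptography package with arbitrary iteration counts, so the absolute results will differ from the openssl speed output earlier in the thread.

    # Rough timing comparison: AES-GCM encryption of a 16-byte block versus
    # one RSA-2048 private-key decryption. Results are machine-dependent.
    import os
    import time
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    aes = AESGCM(AESGCM.generate_key(bit_length=128))
    nonce = os.urandom(12)   # nonce reuse is acceptable here only because nothing is being protected

    rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped = rsa_key.public_key().encrypt(os.urandom(16), oaep)

    start = time.perf_counter()
    for _ in range(10_000):
        aes.encrypt(nonce, b"0123456789abcdef", None)
    aes_per_op = (time.perf_counter() - start) / 10_000

    start = time.perf_counter()
    for _ in range(100):
        rsa_key.decrypt(wrapped, oaep)
    rsa_per_op = (time.perf_counter() - start) / 100

    print(f"AES-GCM encrypt : {aes_per_op * 1e6:.2f} us/op")
    print(f"RSA-2048 decrypt: {rsa_per_op * 1e6:.2f} us/op "
          f"(~{rsa_per_op / aes_per_op:.0f}x slower)")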

Resources