Comments

jedisct1 • 16 points • 2024-02-12

ECC: smaller keys, smaller signatures, fast key generation, faster signing, slower verification (compared to RSA).

RSA: bigger keys, bigger signatures, slow key generation, slow signing, but fast signature verification.

Suby81 • 7 points • 2024-02-12

I would add that ECC security per curve is easier to estimate correctly than RSA security per key size. This is because GNFS complexity for real key sizes is not easy to estimate.
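For reference, these are the asymptotics being compared (heuristic; the hidden o(1) term in the GNFS estimate is precisely what makes concrete RSA key-size recommendations shaky, while the generic ECDLP cost is exact up to small constants):

```latex
% Heuristic GNFS cost to factor an RSA modulus N:
L_N\!\left[\tfrac{1}{3},\,\sqrt[3]{64/9}\right]
  = \exp\!\left(\left((64/9)^{1/3} + o(1)\right)(\ln N)^{1/3}(\ln\ln N)^{2/3}\right)

% Best known generic attack (Pollard rho) on an ECDLP
% in a group of prime order \ell:
\sqrt{\pi \ell / 4}\ \text{group operations}
```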

atoponce • 4 points • 2024-02-12

Going further, attacks on multiplicative group primitives, which include RSA, Diffie-Hellman, DSA, and presumably ElGamal, are proceeding faster than progress against elliptic curves.

upofadown • 1 point • 2024-02-13

It isn’t like, say, 2048-bit RSA is in the same solar system as the best that can be done with the GNFS. That estimate would have to be really bad to pose a threat.

atoponce • 7 points • 2024-02-12

Also

  • ECC: The onus of correctness lies on the cryptographer.

  • RSA: The onus of correctness lies on the developer.

upofadown • 2 points • 2024-02-12

Libraries for RSA exist…

You don’t have to implement RSA from scratch. Is that even a common thing to do?

atoponce • 8 points • 2024-02-12

I think you might be missing the point. RSA ships far more footguns than ECC. Although RSA libraries exist, RSA still required the developers of those libraries to implement RSA correctly.

As an example of a couple of footguns, RSA drags you towards “backwards compatibility” with insecure systems. ECC generally doesn’t need to worry about accidentally accepting 768-bit parameters.
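A sketch of the kind of guard this implies; the function names, threshold, and curve list are illustrative, not from any particular library:

```python
# Illustrative parameter validation: with RSA you must actively refuse weak
# sizes, while an ECC API typically exposes only a fixed menu of named curves,
# so there is no 768-bit option to accept by accident.
MIN_RSA_MODULUS_BITS = 2048                     # assumed policy floor
ALLOWED_CURVES = {"P-256", "P-384", "X25519"}   # nothing weaker exists to pick

def check_rsa_modulus(n: int) -> None:
    """Reject undersized RSA moduli instead of silently accepting them."""
    if n.bit_length() < MIN_RSA_MODULUS_BITS:
        raise ValueError(f"RSA modulus too small: {n.bit_length()} bits")

def check_curve(name: str) -> None:
    if name not in ALLOWED_CURVES:
        raise ValueError(f"unsupported curve: {name}")
```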

RSA also begs implementors to encrypt directly with its public key primitive, which is usually not what you want to do. Not only does accidentally designing with RSA encryption usually forfeit forward-secrecy, but it also exposes you to new classes of implementation bugs. Elliptic curve systems don’t promote this particular foot-gun.

The weight of correctness/safety in ECC falls primarily on cryptographers, who must provide a set of curve parameters optimized for security at a particular performance level; once that happens, there aren’t many knobs for implementors to turn that can subvert security.

The opposite is true in RSA. Even if you use RSA-KEM or RSA-OAEP, there are additional parameters to supply and things you have to know to get right.
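To make "additional parameters" concrete, here is a sketch of just the EME-OAEP padding layer in the style of RFC 8017 (stdlib only, no key operations). Every choice visible here — the label, the main hash, the MGF1 hash, the length checks — is something the implementor must pin down and match on both sides:

```python
import hashlib
import secrets

# Sketch of EME-OAEP encoding/decoding (RFC 8017 style), to show the knobs an
# RSA implementor has to get right. Not a hardened implementation.
HASH = hashlib.sha256
H_LEN = HASH().digest_size

def mgf1(seed: bytes, length: int) -> bytes:
    # MGF1: concatenate HASH(seed || counter) until enough mask bytes exist.
    out = b""
    for counter in range((length + H_LEN - 1) // H_LEN):
        out += HASH(seed + counter.to_bytes(4, "big")).digest()
    return out[:length]

def oaep_encode(msg: bytes, k: int, label: bytes = b"") -> bytes:
    # k is the modulus length in bytes; msg must leave room for two hashes.
    if len(msg) > k - 2 * H_LEN - 2:
        raise ValueError("message too long")
    l_hash = HASH(label).digest()
    ps = b"\x00" * (k - len(msg) - 2 * H_LEN - 2)
    db = l_hash + ps + b"\x01" + msg
    seed = secrets.token_bytes(H_LEN)
    masked_db = bytes(x ^ y for x, y in zip(db, mgf1(seed, k - H_LEN - 1)))
    masked_seed = bytes(x ^ y for x, y in zip(seed, mgf1(masked_db, H_LEN)))
    return b"\x00" + masked_seed + masked_db

def oaep_decode(em: bytes, k: int, label: bytes = b"") -> bytes:
    masked_seed, masked_db = em[1:1 + H_LEN], em[1 + H_LEN:]
    seed = bytes(x ^ y for x, y in zip(masked_seed, mgf1(masked_db, H_LEN)))
    db = bytes(x ^ y for x, y in zip(masked_db, mgf1(seed, k - H_LEN - 1)))
    l_hash, rest = db[:H_LEN], db[H_LEN:]
    sep = rest.index(b"\x01")  # real code must locate this in constant time
    if em[0] != 0 or l_hash != HASH(label).digest() or any(rest[:sep]):
        raise ValueError("decryption error")  # single opaque error in real code
    return rest[sep + 1:]
```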

upofadown • 1 point • 2024-02-12

RSA still required the developers of those libraries to implement RSA correctly.

Are we talking about the method or the company?

Even if you use RSA-KEM or RSA-OAEP, there are additional parameters to supply and things you have to know to get right.

Like the key size? … and maybe the exponent?

atoponce • 4 points • 2024-02-12

The thread context is obviously about the public key encryption algorithm, not the company.

The issues with implementing RSA securely go well beyond just key size and exponent. Here’s one paper from Dan Boneh that surveys 20 years of attacks on RSA: https://crypto.stanford.edu/~dabo/papers/RSA-survey.pdf

We categorized attacks on RSA into four categories: (1) elementary attacks that exploit blatant misuse of the system, (2) low private exponent attacks serious enough that a low private exponent should never be used, (3) low public exponent attacks, and (4) attacks on the implementation.
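Category (3) in miniature: with textbook (unpadded) RSA, e = 3, and a message short enough that m³ < n, the "ciphertext" never wraps the modulus, so anyone can recover m with an integer cube root, no factoring required. A stdlib-only sketch (the modulus here is a stand-in, not a real RSA key):

```python
# Low-public-exponent pitfall: if m**3 < n, then pow(m, 3, n) == m**3 exactly,
# and "decryption" is just a cube root over the integers.
def icbrt(n: int) -> int:
    """Integer (floor) cube root by Newton iteration."""
    x = 1 << -(-n.bit_length() // 3)  # initial guess >= cbrt(n)
    while True:
        y = (2 * x + n // (x * x)) // 3
        if y >= x:
            return x
        x = y

n = 1 << 2048                  # stand-in for a 2048-bit modulus (illustrative)
e = 3
m = int.from_bytes(b"short secret", "big")  # 96-bit message, m**3 << n
c = pow(m, e, n)               # no modular wraparound happens
recovered = icbrt(c)           # attacker recovers m without the private key
```

This is exactly why padding (OAEP) exists: it randomizes and lengthens the encoded message so m³ < n can never occur.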

upofadown • 0 points • 2024-02-12

That paper is great. It is basically a checklist for getting RSA right. The other paragraph in the conclusion:

Two decades of research into inverting the RSA function produced some insightful attacks, but no devastating attack has ever been found. The attacks discovered so far mainly illustrate the pitfalls to be avoided when implementing RSA. At the moment it appears that proper implementations can be trusted to provide security in the digital world.

breakwaterlabs • 1 point • 2024-02-12

ECC generally doesn’t need to worry about accidentally accepting 768-bit parameters.

To this point: I believe that AD CS configured to issue ECDSA at a minimum of 256-bit strength will also happily issue 256-bit RSA certs, if asked to do so.

plastikbenny • 6 points • 2024-02-12

Some later ECC standards have implementation safeties built in, specifically RFC 7748 ECDH based on Curve25519 and RFC 8032 EdDSA signatures, and are believed to be unencumbered by the NSA.

Other ECC standards have implementation dangers/pitfalls similar to RSA.

The downside to ECC is that the math is not very approachable unless you want to dedicate a substantial part of your life to the study of elliptic curves, and unlocking good runtime performance requires a multiple-precision integer implementation that is crafted specifically to take advantage of tricks related to the prime of the finite field, i.e. 2^255 - 19, and the algorithm it needs to support. Writing a fast custom multiple-precision integer implementation is very challenging.
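One example of the prime-specific trick alluded to above: for p = 2^255 - 19, the identity 2^255 ≡ 19 (mod p) lets you reduce a wide product by folding the high bits down with a multiply-by-19 instead of a general division. A Python sketch of the idea (real implementations do this over fixed-width limbs, not arbitrary-precision ints):

```python
# Pseudo-Mersenne reduction for p = 2**255 - 19: fold x = hi * 2**255 + lo
# into lo + 19 * hi, repeating until the value fits in 255 bits, then do one
# final conditional subtraction.
P = (1 << 255) - 19
MASK = (1 << 255) - 1

def reduce_25519(x: int) -> int:
    while x >> 255:
        x = (x & MASK) + 19 * (x >> 255)
    return x - P if x >= P else x
```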

Some high-level languages have big-number implementations in their standard API that are built specifically for RSA and are even hardware accelerated. However, many RSA implementations are legacy in that they don’t support blinding of the private key, don’t implement the latest RFC 8017 and modern padding, and have leaking issues because they cut validation short when there are errors.
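For readers unfamiliar with the blinding mentioned above, the algebra is small enough to show on a textbook-sized toy key (the classic p = 61, q = 53 example, obviously not secure): multiply the ciphertext by r^e before the private-key exponentiation, so the value that the timing-sensitive operation sees is unrelated to the attacker-chosen input, then strip r afterwards.

```python
import math
import secrets

# Toy RSA key (textbook numbers, illustration only).
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)

def decrypt_blinded(c: int) -> int:
    # Pick a random r coprime to n.
    while True:
        r = secrets.randbelow(n - 2) + 2
        if math.gcd(r, n) == 1:
            break
    blinded = (c * pow(r, e, n)) % n
    m_blinded = pow(blinded, d, n)       # the protected private-key operation
    return (m_blinded * pow(r, -1, n)) % n  # unblind with r^-1
```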

bascule • 10 points • 2024-02-12

As someone who works on production-quality implementations of ECC and almost-production-quality implementations of RSA, I would definitely say to prefer ECC wherever possible: not only is it faster, but it’s much, much easier to implement correctly.

RSA is extremely difficult to implement correctly, e.g. in constant-time for a given key size. In fact, many RSA implementations are currently broken in such a way that attackers that can measure network timing information from a computer performing RSA private key operations can decrypt ciphertexts and forge signatures: https://people.redhat.com/~hkario/marvin/

A big part of the problem is the RSA ecosystem is heavily dependent on a legacy padding mode called PKCS#1 v1.5, which is used for both encryption and signatures. Despite still being commonly used, this mode has been broken over and over again, with the Marvin Attack being the latest iteration. Most RSA implementations do not handle padding verification failures in constant-time (i.e. via a rejection symbol), which leaks timing information which can be used to recover RSA plaintexts or forge signatures. Newer RSA padding modes (OAEP/PSS) do help here.
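The "rejection symbol" structure looks roughly like this sketch, on a deliberately toy RSA key built from two Mersenne primes (NOT a real key, and Python itself cannot guarantee constant time; the point is the shape of the code: a decoy drawn unconditionally, and one exit that selects between the real plaintext and the decoy):

```python
import secrets

# Toy RSA key from Mersenne primes, for structure only.
p, q, e = (1 << 127) - 1, (1 << 521) - 1, 65537
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))
k = (n.bit_length() + 7) // 8            # modulus size in bytes

def encrypt_pkcs1v15(msg: bytes) -> bytes:
    # EM = 0x00 0x02 || nonzero padding || 0x00 || msg
    ps = bytes(b % 255 + 1 for b in secrets.token_bytes(k - len(msg) - 3))
    em = b"\x00\x02" + ps + b"\x00" + msg
    return pow(int.from_bytes(em, "big"), e, n).to_bytes(k, "big")

def decrypt_with_rejection(ct: bytes, msg_len: int) -> bytes:
    decoy = secrets.token_bytes(msg_len)  # drawn before any check, every time
    em = pow(int.from_bytes(ct, "big"), d, n).to_bytes(k, "big")
    sep = k - msg_len - 1
    ok = (em[0] == 0) & (em[1] == 2) & (em[sep] == 0) \
        & all(b != 0 for b in em[2:sep])
    # In C this check is written branch-free and `ok` arithmetically selects
    # between the message bytes and the decoy, with no error path at all.
    return em[sep + 1:] if ok else decoy
```

A caller that proceeds with the decoy behaves identically for valid and invalid padding, which is what denies the attacker their oracle.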

Beyond that, from an implementer’s perspective, ECC uses fixed-size big integers and is generally amenable to an expert implementer carefully writing everything by hand. RSA uses not only larger integers, but also ones whose size may vary based on the key size, which has led many RSA implementers to use general-purpose numerical/“bignum” libraries for the purposes of implementing RSA. But such libraries are unsuitable: for example, many will look for leading zeros on numbers after performing an arithmetic operation and strip them in order to improve performance, but this introduces a data-dependent timing sidechannel of the sort mentioned above which can be used to decrypt RSA ciphertexts/forge signatures.

The solution there is to carefully build a numerical library specifically for cryptographic use cases like RSA, which always uses constant time operations, and always pads the number of “limbs” used in bignum libraries to a constant number or fixed multiples of the key size.
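A sketch of that fixed-shape representation (names and limb count are illustrative): a value is always exactly NLIMBS limbs, least-significant first, leading-zero limbs are never stripped, and loops always run a fixed number of iterations, so nothing about the data's shape or the trip count depends on the value.

```python
# Fixed-width limb representation for constant-shape bignum arithmetic.
LIMB_BITS = 64
LIMB_MASK = (1 << LIMB_BITS) - 1
NLIMBS = 32                               # e.g. 2048-bit operands

def to_limbs(x: int) -> list:
    # Always exactly NLIMBS limbs: zero limbs are kept, never trimmed.
    return [(x >> (LIMB_BITS * i)) & LIMB_MASK for i in range(NLIMBS)]

def from_limbs(limbs: list) -> int:
    return sum(l << (LIMB_BITS * i) for i, l in enumerate(limbs))

def limb_add(a: list, b: list):
    """Fixed-trip-count addition: always NLIMBS iterations, carry out last."""
    out, carry = [], 0
    for x, y in zip(a, b):
        s = x + y + carry
        out.append(s & LIMB_MASK)
        carry = s >> LIMB_BITS
    return out, carry
```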

Coffee_Ops • 1 point • 2024-02-12

The Marvin attack writeup talks out of both sides of its mouth. It suggests (as you do) that this is focused on RSA, but then states that ECDSA is also vulnerable.

It also notes in its Q&A that having RSA disabled (e.g. per SSL Labs reports) protects you, but only if your key isn’t used somewhere else. That doesn’t make sense to me: the key you’re using on an RSA-disabled site has to be a non-RSA key, doesn’t it?

Sorry if this is a bit of a tangent but this is interesting to me.

bascule • 2 points • 2024-02-13

Nowhere does it say ECDSA is vulnerable to the Marvin Attack.

It talks about the semi-related Minerva attack against ECDSA, which also exploits high order bits being zero.

Suby81 • 1 point • 2024-02-13

I agree RSA padding sucks, but handling the exceptions of an ECC addition formula isn’t simple either. Also, you have several ways of doing scalar multiplication, while binary exponentiation is much simpler.

Natanael_L • 2 points • 2024-02-13

These days most curves and schemes use complete formulas with no exceptions, especially abstracted versions like Ristretto which hides all major potential footguns, so you only have to follow the spec carefully and make sure the tests pass and then you’re done. You’re not supposed to compose your own curve.

Suby81 • 1 point • 2024-02-13

Just because those schemes exist doesn’t mean they are widespread. There are a lot of things still relying on P-256.

bascule • 3 points • 2024-02-13

Complete prime order addition formulas can be used with any prime order elliptic curve, including P-256.

Yes, there are several implementations which use e.g. incomplete Jacobian formulas, but there are also a lot of sidechannel-ridden implementations of RSA too which aren’t properly constant-time, especially in regard to padding rejection.

A clean room implementation of P-256 can be relatively simple. The same cannot be said for RSA.

Natanael_L • 1 point • 2024-02-13

In which case they should use a library written by a professional cryptographer

bascule • 2 points • 2024-02-13

I responded here re: complete addition formulas, but I wanted to address this too:

binary exponentiation is much simpler.

I’ll assume you’re talking about modular exponentiation, which is not easy to implement efficiently in constant-time.

One of the most common modern ways to implement it is as a sequence of what are called “Almost Montgomery Multiplications” (AMM). These are modular multiplications that get reduced modulo the bit size of the modulus, but might still overflow the modulus.

So you still need everything you would for Montgomery form, just like you would for implementing field multiplication for an elliptic curve, but you also need AMM, and then you still need to deal with actually reducing the output of AMM into a proper Montgomery representative.

The code required to implement modular exponentiation using AMM winds up being roughly about as complicated as the code required to implement scalar multiplications using complete prime order addition formulas with elliptic curve field elements using an internal Montgomery representation, but with more annoying corner cases to deal with.
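For readers who haven't seen it, here is the algebra of Montgomery-form modular exponentiation in a stdlib-only sketch. Real RSA code uses fixed windows and constant-time (A)MM over fixed-width limbs; Python big ints are neither, so this shows only the structure (and the final conditional subtraction in `redc` is exactly the step AMM defers or merges):

```python
# Montgomery-form square-and-multiply modular exponentiation (sketch).
def mont_params(n: int):
    assert n % 2 == 1                # Montgomery form needs an odd modulus
    k = n.bit_length()
    R = 1 << k                       # R = 2^k > n, gcd(R, n) = 1
    n_prime = pow(-n, -1, R)         # n' = -n^-1 mod R
    return k, R, n_prime

def redc(T: int, n: int, k: int, R: int, n_prime: int) -> int:
    """Montgomery reduction: returns T * R^-1 mod n, for T < n * R."""
    m = (T * n_prime) & (R - 1)      # m = T * n' mod R
    t = (T + m * n) >> k             # exact division: T + m*n ≡ 0 (mod R)
    return t - n if t >= n else t    # AMM would defer this data-dependent step

def mont_pow(base: int, exp: int, n: int) -> int:
    k, R, n_prime = mont_params(n)
    x = (base * R) % n               # into Montgomery form
    acc = R % n                      # Montgomery form of 1
    for bit in bin(exp)[2:]:         # left-to-right square-and-multiply
        acc = redc(acc * acc, n, k, R, n_prime)
        if bit == "1":
            acc = redc(acc * x, n, k, R, n_prime)
    return redc(acc, n, k, R, n_prime)  # out of Montgomery form
```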

Suby81 • 1 point • 2024-02-14

I really don’t see your point. For ECC you have to implement field operations, much more than you would need for RSA, plus you need the point arithmetic.

Then you are willing to pay the latency cost for complete formulas, but not for plain MM (not AMM), which is simpler to implement.

Also, you mentioned that ECC is going to use Montgomery internally, so why is it OK for ECC but not for RSA?

Also, ECC has 3 possible scalar multiplications: fixed point, variable point, double base. Isn’t all of this adding complexity?

bascule • 1 point • 2024-02-14

For ECC you have to implement field operations, much more than you would need for RSA

Not really. You end up needing quite a lot of the same things, including modular inversions, in a full implementation of RSA that supports keygen in addition to modular exponentiation for encryption/decryption. As it happens, I just worked on an implementation of Bernstein-Yang that we share between ECC and RSA implementations.

Then you are willing to pay the latency cost for complete formulas but not for a simpler MM (not AMM) which is simpler to implement.

The practical performance overhead of using complete prime-order formulas is negligible, whereas AMM is a large win: more than twice as fast as typical Montgomery multiplication.

Also you mentioned that ECC is going to use internally Montgomery so why is ok for ECC but not for RSA?

RSA needs AMM for modular exponentiation in addition to regular Montgomery arithmetic for e.g. keygen. I’m saying when you toss that all in, it winds up being nearly as complex as implementing scalar multiplication using Montgomery-based field elements for ECC.

ECC has 3 possible scalar multiplications: fixed point, variable point, double base. Isn’t all of this adding complexity?

“Fixed base scalar multiplication” is really talking about having support for precomputed basepoint tables, which don’t require that much code. Double base chains are also an interesting optimization, but again one that doesn’t add considerable code and scalar multiplication incorporating them winds up being roughly as complicated as modular exponentiation using AMM.
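The fixed-base idea is small enough to show on a toy curve (y² = x³ + 2x + 3 over F_97 with G = (3, 6); these numbers are purely illustrative, and real code uses complete formulas and constant-time table lookups): tabulate 2^i·G once, then any multiple of G needs only additions, no runtime doublings.

```python
# Toy short-Weierstrass curve for illustrating fixed-base precomputation.
P, A = 97, 2           # field prime and curve coefficient a
G = (3, 6)             # base point; None represents the point at infinity

def add(p1, p2):
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                  # inverse points
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def scalar_mul(k, pt):                 # generic double-and-add, for reference
    acc = None
    for bit in bin(k)[2:]:
        acc = add(acc, acc)
        if bit == "1":
            acc = add(acc, pt)
    return acc

TABLE = [G]                            # TABLE[i] = 2^i * G, computed once
for _ in range(7):
    TABLE.append(add(TABLE[-1], TABLE[-1]))

def fixed_base_mul(k):                 # additions only: one per set scalar bit
    acc = None
    for i in range(8):
        if (k >> i) & 1:
            acc = add(acc, TABLE[i])
    return acc
```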

leeharrison1984 • 4 points • 2024-02-12

Everyone has pointed out key length, so I’ll add that, depending on the application, you may see greater compatibility between systems with RSA than with ECC.

wwabbbitt • 2 points • 2024-02-12

The size efficiency of ECC keys and signatures is enough to make me pick ECC over RSA wherever ECC is available.

Kryptochef • 2 points • 2024-02-12

One seldom-mentioned advantage of ECC is that it’s a little harder to understand (or rather, to think you understand it). RSA seems so simple that many people think “oh, I can do this” and, instead of using a library, build it on their own. And then they inevitably fall into one of the pitfalls, whose number is about as large as the modulus you need.

(This isn’t meant too seriously - complexity is also a downside, of course, as the relevant math might be understood slightly less. Though elliptic curves are of course well on the “tried and tested” side of things by now.)

ramriot • 2 points • 2024-02-12

Something not so far mentioned as a potential advantage for ECC asymmetric encryption/signatures is that in many ECC schemes the private key can be generated deterministically with minimal computation, unlike the primes used in RSA.

This allows for an interesting feature where trust rings can be implemented by a chain of keys, where the private key of each lower ring is generated by applying (for example) a cryptographic hash function or a realm-keyed HMAC to the next higher private key.
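The chain described above can be sketched with nothing but stdlib HMAC (the scheme and the realm labels here are hypothetical, not any deployed passkey protocol; a real scheme would also reduce each digest into the curve's scalar range before using it as a private key):

```python
import hashlib
import hmac

# Hypothetical hierarchical derivation: each lower-ring secret is an HMAC of
# the next-higher secret over a realm string, so only the root is ever stored.
def derive_child(parent_secret: bytes, realm: str) -> bytes:
    return hmac.new(parent_secret, realm.encode(), hashlib.sha256).digest()

supermaster = b"\x00" * 32                        # placeholder root secret
master = derive_child(supermaster, "master/v1")
site_key = derive_child(master, "site:example.com")

# Rotation after a master-key leak: bump the realm label at the supermaster
# level and every downstream key changes, without touching the supermaster.
new_master = derive_child(supermaster, "master/v2")
```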

All this without needing to store and sync multiple master, system, or site keys; a burden that currently exists with such things as passkeys and is thus a key-management scaling limitation.

A use case for this would be pseudonymous zero-knowledge-proof authentication schemes where a passkey master is used to generate site keys. Should the master key be leaked through a system compromise, there exists in effect a supermaster that can be used to revoke and replace the master, and by extension all site keys, globally; something the master key itself is prohibited from doing by agreed implementation rules.

a2800276 • 4 points • 2024-02-12

In regards to what? You’ll almost certainly get better answers if you put a bit more effort into your question. Or ask ChatGPT.

HashMapsData2Value • 1 point • 2024-02-12

ECC gets you higher security for smaller keys.

breakwaterlabs • 1 point • 2024-02-12

RSA tends to have higher resistance to current quantum attacks at the same Lenstra bit strength, requiring roughly a single order of magnitude increase in qubits to crack. This becomes more significant at RSA4096 and above and probably is not relevant in most cases especially with PQC contenders right around the corner, but I didn’t see it mentioned and it is a difference. Note that this is also very heavily dependent on the current state of the art in quantum attacks.

Beyond that, ECDSA support can be spotty in enterprise apps and is sometimes poorly documented. RSA can also perform verification faster (by around an order of magnitude) at the same Lenstra strength, so from an administrator’s perspective there are some reasons to use RSA.

On the flip side, I understand RSA to have lots of implementation gremlins that make the theoretical arguments less important. Additionally, ECDSA can be multiple orders of magnitude faster to sign, especially as you move towards RSA7168 and higher: to reach 256-bit strength with RSA you’re looking at some truly hideous issuance times.

I see ECDSA recommended a lot more than RSA, but I think “which is better” is going to depend a lot on your use case. If you’re using old-school big enterprise apps from vendors like Cisco and VMware, you may find that RSA is a lot less painful to deal with. If you’re in an environment that issues very short-lifetime certs, RSA’s slow signing speed may be an issue (or not, if you use 2048). And if you’re subject to minimum Lenstra strengths like 128+, you’ll probably find that RSA’s key lengths rapidly become infeasible compared with ECDSA.

EvrenselKisilik • 1 point • 2024-02-15

I don’t know much about how ECC works, but I feel like RSA is more secure than ECC if we’re talking strictly about “security”. You can use 4096-bit keys, and we know that’s very secure today. If you don’t really need a speed that RSA couldn’t give but ECC could, RSA is better.