Discussion:
[curves] Million Dollar Curve
Alex
2016-02-24 03:53:10 UTC
Permalink
What do you guys think of this?:

http://cryptoexperts.github.io/million-dollar-curve/
--
Alex
Nathaniel McCallum
2016-02-24 16:34:14 UTC
Permalink
Post by Alex
http://cryptoexperts.github.io/million-dollar-curve/
FYI, python-rubenesque added support for MDC201601 today.
https://github.com/npmccallum/python-rubenesque/commit/3aee9560ca58ff7fd573bae841abb3eecda677b7

However, we'd love to see some additional test vectors.
Ray Dillinger
2016-02-24 18:18:28 UTC
Permalink
Post by Alex
http://cryptoexperts.github.io/million-dollar-curve/
I think it's an awesome idea.

Bear
Salz, Rich
2016-02-24 18:20:02 UTC
Permalink
Post by Alex
http://cryptoexperts.github.io/million-dollar-curve/
Who are these folks? What is wrong with 25519 and/or 448?

My answers: I don't know and nothing.

So why do I want this?
Thomas Ptacek
2016-02-24 18:28:32 UTC
Permalink
1. CryptoExperts is like a European version of Cryptography Research; it’s people from Gemalto, INRIA, Antoine Joux, Louis Goubin, their grad students, &c. They’re not randos.

2. Their paper doesn’t claim anything is wrong with 25519. They’re just proposing a random Edwards curve alternative to 25519, with a cute trick to generate credible random parameters. Their paper rejects the Brainpool-y parameters-from-math-constants approach, citing Bernstein’s BADA55 argument, in favor of pure random parameters. 

-- 
Thomas Ptacek
Salz, Rich
2016-02-24 18:30:58 UTC
Permalink
2. Their paper doesn’t claim anything is wrong with 25519. They’re just proposing a random Edwards curve alternative to 25519
Which brings me back to the million-dollar question: why do I want this?
Thomas Ptacek
2016-02-24 18:35:25 UTC
Permalink
One reason might be: because you like almost everything else about Curve25519 other than the specially chosen sparse prime, aren’t especially performance sensitive, and your application is cryptographically very conservative, so you’re willing to trade off performance for totally unstructured and “provably” random parameters.

Alyssa Rowan suggested on HN yesterday that a plausible (but weird) scenario for that would be that you’re reusing RSA hardware for your ECC stuff, want all the security benefits of Curve25519, but 2^255-19 might be leak-prone on that hardware.

(I am parroting some of this from a brief conversation with one of the paper authors, which set me off on a reading jag yesterday, and while I don’t find the argument especially persuasive, it at least makes sense to me now.)

-- 
Thomas Ptacek
312-231-7805
Salz, Rich
2016-02-24 18:37:50 UTC
Permalink
Thanks! Interesting, although not enough to overcome my skepticism. I appreciate your feedback!
--
Senior Architect, Akamai Technologies
IM: ***@jabber.at Twitter: RichSalz
D. J. Bernstein
2016-02-25 01:08:11 UTC
Permalink
“provably” random parameters
Consider the October 2015 conviction of an insider who successfully
compromised the security of a typical modern lottery "by using his
privileged access to an MUSL facility to install a rootkit on the
computer containing Hot Lotto's random number generator":

https://en.wikipedia.org/wiki/Hot_Lotto_fraud_scandal

He won $14.3 million. He was caught only because he didn't realize in
advance that Iowa required winners to be identified publicly---he had
foolishly bought the ticket himself and didn't manage to prepare his
accomplices adequately for a retroactive coverup:

http://www.thedailybeast.com/articles/2015/07/07/inside-the-biggest-lottery-scam-ever.html

There are many other documented examples of people who successfully
broke lottery security and then were caught purely by luck. Presumably
there are also many people who successfully broke lottery security and
then _weren't_ caught.

How much would it cost for a serious attacker to quietly manipulate all
of the "last" lotteries used in the Million Dollar Curve? Not much, and
then the attacker has tremendous flexibility to choose the resulting
curve. Presumably the same attacker also has massive computer power, and
can therefore target a weakness in an incredibly small fraction of all
curves, far beyond a Brainpool user's worst nightmares.

For comparison, https://bada55.cr.yp.to/bada55-20150927.pdf points to a
small amount of wiggle room in the "fastest curve" approach, since the
curve generator can change the choice of curve by changing security
criteria (e.g., is 2^255-19 big enough?). The Million Dollar Curve
attempts to eliminate this wiggle room by having the curve chosen by
uncontrollable lotteries after the security criteria are specified. The
big problem is that lotteries are actually controllable, so the Million
Dollar Curve ends up giving the attacker vastly _more_ wiggle room.

---Dan
Gregory Maxwell
2016-02-25 01:27:22 UTC
Permalink
Post by D. J. Bernstein
“provably” random parameters
Consider the October 2015 conviction of an insider who successfully
compromised the security of a typical modern lottery "by using his
privileged access to an MUSL facility to install a rootkit on the
Beat me to the link. There are, IMO, better schemes for NUMs.

My "favorite"* is a scheme where:

(1) Rigidly define the use of the resulting randomness and agree on
it, in some document with known hash. (Ideally, you provide a program
that writes the implementations and papers on the resulting
cryptosystem given a seed)

(2) Pay bitcoin to N respected parties who have randomly generated
their own keys (identified in (1)).

(3) Perform a transaction committing to the scheme hash.

(4) Once that settles in the Bitcoin network, the N parties sweep
their coins and publish their private keys.

NUMS initialization is a random hash (specified in (1)) of the block
data where settlement happened in (4), along with the private keys.
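
A rough sketch of that last step (hypothetical names; SHA-256 standing
in for the "random hash" fixed in step (1)):

    import hashlib

    def nums_seed(settlement_block: bytes, revealed_keys) -> bytes:
        # Concatenate the block data from step (4) with the N revealed
        # private keys, in the order fixed by the document in step (1),
        # then hash. The digest is the NUMS seed consumed by the
        # pre-committed parameter-generation program.
        material = settlement_block + b"".join(revealed_keys)
        return hashlib.sha256(material).digest()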

Grinding the NUMS requires 2^70 SHA2 executions per try on average,
knowledge of all N of the secrets, and a grinder needs to win the
block race. Any party who knows any of the N secrets could also steal
the coins on deposit instead of trying to grind; incentivizing the
respected parties to choose unpredictable secrets and keep them
secret.

I believe the primary vectors for biasing the outcomes are in the
construction of the usage in step (1), and the potential for
participants to bias the outcome by 'losing' their private keys if it
doesn't go their way. The latter could be discouraged by the loss of
funds (esp if the release requires signatures by all parties), which
could be quite large... but perhaps still not high enough to thwart
nation-state attackers.

The primary argument for this scheme is that you can argue its
security from several distinct angles: the personal integrity of any
one of the N secret holders, the computational challenge of the
hashcash inner-loop, the bonding to encourage secrecy, or the
competition of the Bitcoin block race.

*"favorite" in the sense that I think NUMS schemes should be avoided
at all cost; if for no other reason than they invite precisely this
sort of endless shed painting!
Nathaniel McCallum
2016-02-24 18:48:37 UTC
Permalink
Post by Alex
http://cryptoexperts.github.io/million-dollar-curve/
Who are these folks? What is wrong with 25519 and/or 448?
From the paper:

Q2. Is there anything wrong with Curve25519?

No. We, at CryptoExperts, actually use Curve25519 and recommend it to
our partners. Yet, we think that people should not rely on the same few
safe curves that are currently out. Our methodology allows to easily
produce safe alternatives.

Q3. Curve25519 vs. Million Dollar Curve

Curve25519 was designed to be as fast as possible, with no security
compromise. This is both a strength and a potential weakness:
    – a strength because it gives a valid argument that no trapdoor was
      introduced in the design,
    – a potential weakness because Curve25519 uses a very specific
      prime field.

As of now, no attack exploiting this specificity is known. For
applications where speed is paramount, Curve25519 is probably the best
option. But for most applications, where losing a little on the
efficiency side is “not a big deal”, Million Dollar Curve is probably
the safest choice.

See also the answer by Ruggero on Stack Exchange.
Krisztián Pintér
2016-02-24 22:36:06 UTC
Permalink
Post by Nathaniel McCallum
    – a potential weakness because Curve25519 uses a very specific
      prime field.
as well as every other curve on the planet. even nist curves use
special primes.
Post by Nathaniel McCallum
applications where speed is paramount, Curve25519 is probably the best
not where it is paramount. this wording suggests that for most
applications, speed is not an issue. the world is very different from
this picture. namely:

* we don't want a whole bunch of curves. we want only a handful,
ideally two, one regular size and one larger. adding more curves is a
disservice.

* speed is pretty much always an issue if one participant is a busy
server.

* we definitely want code simplicity. good curves are designed to have
simple and safe implementations. curve-specific implementations are
always simpler. less potential for errors, less code to audit.
Trevor Perrin
2016-02-25 00:17:30 UTC
Permalink
Post by Krisztián Pintér
Post by Nathaniel McCallum
– a potential weakness because Curve25519 uses a very specific
prime field.
as well as every other curve on the planet. even nist curves use
special primes.
No, Brainpool curves and million dollar curve use "randomly"-chosen primes.

Yes, this incurs a slowdown (~2x). Some would argue it's worth it
because randomly-chosen primes might be more conservative than
special-form primes. Others would argue that if you want to spend
extra cycles in pursuit of security, you're better off with
special-form primes but a larger curve (eg 448).
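
(For intuition, a toy, non-constant-time sketch of where that factor
comes from: with p = 2^255 - 19 the high half of a double-width product
folds back in via 2^255 = 19 (mod p), whereas a "random" prime leaves
you with a generic division or Barrett/Montgomery reduction.)

    P = 2**255 - 19

    def reduce_special(x):
        # Repeatedly fold x = hi*2^255 + lo into 19*hi + lo (mod p).
        while x >= 2**255:
            hi, lo = x >> 255, x & (2**255 - 1)
            x = 19 * hi + lo
        return x - P if x >= P else x

    def reduce_generic(x, p):
        # No such shortcut for an arbitrary prime: a full division
        # (or Barrett/Montgomery machinery) is needed.
        return x % p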

This has been debated many times, here and elsewhere. Lots of people
(or at least me) seem happy with special-form primes.

Anyways, if this thread continues, hopefully someone can point out
interesting aspects of this new curve or make some novel arguments,
let's not repeat old debates.

Trevor
Thomas Baigneres
2016-03-01 16:41:09 UTC
Permalink
Hello Krisztián,

we were watching the discussion on Million Dollar Curve and felt the need to react to your message because we do not agree with some of your views.
Post by Krisztián Pintér
Post by Nathaniel McCallum
– a potential weakness because Curve25519 uses a very specific
prime field.
as well as every other curve on the planet. even nist curves use
special primes.
Post by Nathaniel McCallum
applications where speed is paramount, Curve25519 is probably the best
not where it is paramount. this wording suggests that for most
applications, speed is not an issue. the world is very different than
* we don't want a whole bunch of curves. we want only a handful,
ideally two, one regular size and one larger. adding more curves is a
disservice.
We believe that, in general, relying on a single solution for cryptography always increases the risk. In case one solution gets attacked, it’s better to have a backup plan at hand than having to find one in a hurry. We agree that if Curve25519 ever gets attacked, it is likely that many other curves (including Million Dollar Curve) will also suffer from this attack, but you never know.
Post by Krisztián Pintér
* speed is pretty much always and issue if one participant is a busy
server.
We agree that for busy servers speed is an issue. Still, most “busy” servers on the planet still use RSA over ECC. And the gap between RSA2048 and Million Dollar Curve is much bigger than between Million Dollar Curve and Curve25519.
Post by Krisztián Pintér
* we definitely want code simplicity. good curves are designed to have
simple and safe implementations. curve-specific implementations are
always simpler. less potential for errors, less code to audit.
We suppose that this matter was already extensively discussed on this mailing list during the “random” vs. “optimized” feud. Our opinion is that a generic implementation of an Edwards curve (like Million Dollar Curve) is much simpler than an optimized implementation of Curve25519. Of course this generic implementation can directly be used for Curve25519 too, but people implementing Curve25519 will most certainly want to optimize the code. In the end, they will have much more complex and error-prone code. As an example, you can have a look at Nathaniel McCallum’s straightforward implementation of Million Dollar Curve (https://github.com/npmccallum/python-rubenesque/blob/master/rubenesque/curves/mdc.py) and compare it to any optimized implementation of Curve25519.
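
To make the point concrete, here is a bare-bones (and deliberately
non-constant-time) sketch of unified Edwards addition that works for
any (p, a, d); this is an illustration only, not the rubenesque code
itself:

    def edwards_add(P1, P2, p, a, d):
        # Unified addition on a*x^2 + y^2 = 1 + d*x^2*y^2 over GF(p);
        # a = 1 gives a plain Edwards curve such as Million Dollar Curve.
        (x1, y1), (x2, y2) = P1, P2
        t = d * x1 * x2 * y1 * y2 % p
        x3 = (x1 * y2 + y1 * x2) * pow(1 + t, p - 2, p) % p
        y3 = (y1 * y2 - a * x1 * x2) * pow(1 - t, p - 2, p) % p
        return (x3, y3)

    def scalar_mul(k, P, p, a, d):
        # Schoolbook double-and-add; (0, 1) is the neutral element.
        R = (0, 1)
        while k:
            if k & 1:
                R = edwards_add(R, P, p, a, d)
            P = edwards_add(P, P, p, a, d)
            k >>= 1
        return R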

For the record, we do believe Curve25519 is a great work and we will ourselves continue to use it as a backup plan for Million Dollar Curve ;-)

— The Million Dollar Curve Team
Nathaniel McCallum
2016-03-01 16:46:53 UTC
Permalink
Post by Thomas Baigneres
[...] Of course this generic implementation can directly be used for
Curve25519 too, but people implementing Curve25519 will most certainly
want to optimize the code. In the end, they will have much more complex
and error-prone code.
... and unauditable.
Tony Arcieri
2016-03-01 16:51:03 UTC
Permalink
On Tue, Mar 1, 2016 at 8:41 AM, Thomas Baigneres wrote:
Post by Thomas Baigneres
We believe that, in general, relying on a single solution for cryptography
always increases the risk. In case one solution gets attacked, it’s better
to have a backup plan at hand than having to find one in a hurry.
The CFRG already standardized a backup curve for Curve25519:
Ed448-Goldilocks
--
Tony Arcieri
Nathaniel McCallum
2016-03-01 16:51:52 UTC
Permalink
Post by Thomas Baigneres
[...]
Thomas, can we get some test vectors for MDC? Thanks!
Watson Ladd
2016-03-01 16:54:05 UTC
Permalink
On Tue, Mar 1, 2016 at 8:41 AM, Thomas Baigneres wrote:
Post by Thomas Baigneres
[...] As an example, you can have a look at Nathaniel McCallum’s
straightforward implementation of Million Dollar Curve
(https://github.com/npmccallum/python-rubenesque/blob/master/rubenesque/curves/mdc.py)
and compare it to any optimized implementation of Curve25519.
Like https://github.com/npmccallum/python-rubenesque/blob/master/rubenesque/curves/cfrg.py?

Oh, you wanted to compare apples to oranges?
--
"Man is born free, but everywhere he is in chains".
--Rousseau.
Ben Harris
2016-03-01 19:42:30 UTC
Permalink
On 2 Mar 2016 3:41 am, "Thomas Baigneres" wrote:
We agree that for busy servers speed is an issue. Still, most “busy”
servers on the planet still use RSA over ECC. And the gap between RSA2048
and Million Dollar Curve is much bigger than between Million Dollar Curve
and Curve25519.

Have you actually benchmarked MDC signature verification vs RSA signature
verification?
Krisztián Pintér
2016-03-01 22:16:34 UTC
Permalink
Post by Thomas Baigneres
We believe that, in general, relying on a single solution for
cryptography always increases the risk.
surely. also reduces the cost. the question is where is the balance.
as things are now, having our myriad of software implement a primitive
is an enormous undertaking. not only do you need to get it into a number
of libraries, but then you need to standardize it into rfcs, and add it
to servers and clients. it is not something we want if there is no
serious reason for it.

and i claim that having two similar curves is not a serious reason. if
we want a backup, pick some very different algo. pick something post
quantum; it does not matter if it is inefficient, nobody will use it, it
is just a backup.

but i think since we have RSA/DSA as backup, we can go five more years
without something new.
Post by Thomas Baigneres
We agree that for busy servers speed is an issue. Still, most
“busy” servers on the planet still use RSA over ECC.
i don't find this a compelling argument. most servers use old
primitives because of the cost of transition, not because they don't
care enough about performance.

this is, again, an optimization problem. how much does verifiable
randomness weigh against 2x speed? if i'm the buyer, sign me up for
the 2x speed.
Post by Thomas Baigneres
Our opinion is that a generic implementation of an Edwards Curve (like
Million Dollar Curve) is much simpler than an optimized
implementation of Curve25519.
i don't find this argument compelling either. what we want is both
performance and simplicity at the same time. they are, of course,
contradictory to each other. you can also count safety (e.g. side
channel) as a third factor. it is easy to excel in one aspect. it is
easy to improve one at the expense of another. what is hard is to
improve all. yet, that is what we want.

and of course you can add verifiable randomness as a fourth variable.
but the question is, again, what is the value of that? what weight
will it have compared to the other three?
Post by Thomas Baigneres
For the record, we do believe Curve25519 is a great work and we
will ourselves continue to use it as a backup plan for Million Dollar Curve ;-)
for the record: i don't think there is anything wrong with MDC, or its
random generation method (although i haven't looked into the details).
what i say is this: the benefits are lower, and the cost is higher
than advertised.

(also some very unfortunate wording on the website)

Watson Ladd
2016-02-24 18:24:12 UTC
Permalink
Again? I really don't want to waste another 2 years of my life arguing
against random curves.
Post by Alex
http://cryptoexperts.github.io/million-dollar-curve/
--
Alex
--
"Man is born free, but everywhere he is in chains".
--Rousseau.
Krisztián Pintér
2016-02-24 19:32:22 UTC
Permalink
Post by Alex
http://cryptoexperts.github.io/million-dollar-curve/
this is another case of solving a nonexistent problem.
Post by Alex
"ANSSI FRP256v1, NIST P-256, NIST P-384, Curve25519, secp256k1,
brainpoolP256t1, Curve1174 and a few others. However, several of
these curves parameters generation processes contain unjustified
choices
yeah, several. but not all! so why put together a list of a few safe
and a few unsafe curves, and then complain about the lack of security of
some? the fact is, there are curves with a verifiable parameter choice.

what can we randomize? we can't randomize the prime. we need a very
carefully crafted prime to enable fast calculation modulo that prime.
we need primes very close to powers of two, with the differences being
at very specific locations. just check the goldilocks paper to see how
hard it is to find a good prime. we don't have too many of them. the
curve form (edwards, etc.) also comes from the same rationale.

goldilocks paper for the lazy: https://eprint.iacr.org/2015/625.pdf
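
(to make that concrete, the special primes people actually use look
like this; the whole trick is that the correction terms are tiny and
sit at convenient bit positions, so reduction is just shifts, adds and
multiplications by small constants:)

    p25519     = 2**255 - 19                            # curve25519
    goldilocks = 2**448 - 2**224 - 1                    # ed448-goldilocks
    p256       = 2**256 - 2**224 + 2**192 + 2**96 - 1   # nist p-256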

we can randomize the curve coefficient, like d for an edwards curve (or
A for a montgomery curve). however, minimizing the constant has the same
effect.

we can randomize the generator, but it does not make a whole lot of
difference, and minimizing has the same effect.

so please explain to me, how randomizing improves security in any
meaningful way. it does not.