Discussion:
proposal: commercial data recovery
Adam Back
1997-10-14 09:37:21 UTC
If we take the design goal of designing a commercial data recovery
system which is not GAK compliant, we can most succinctly state this
design goal as the task of ensuring that:

- at no point will any data transferred over communications links be
accessible to anyone other than the sender and recipient without
also obtaining data on the recipient's and/or sender's disks


I think we can distill the design principles required to meet the
design goal of a non-GAK compliant Corporate Data Recovery (CDR) down
to ensuring that:

1. no keys used to secure communications in any part of the system are
a-priori escrowed with third parties

2. second crypto recipients on encrypted communications are not
used to allow access to third parties who are not messaging
recipients manually selected by the sender

3. communications should be encrypted to the minimum number of
recipients (typically one), and those keys should have as short a
life time as is practically possible

Included in 2) is the principle of not re-transmitting, over
communication channels, keys or data re-encrypted to third parties
after receipt -- that is just structuring, and violates design
principle 2.


With these three principles you still have lots of flexibility because
you can escrow storage keys, do secret splitting of storage keys, and
store keys encrypted to second storage accessors on the local disk for
stored data recovery purposes.
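The secret splitting of storage keys mentioned above can be sketched with a simple two-share XOR split. This is a hypothetical illustration, not anything from PGP's products: both shares are required to reconstruct the storage key, so neither holder alone can read the stored data.

```python
import secrets

def split_key(key: bytes):
    """2-of-2 secret split: XOR the key with a one-time random pad.

    Neither share alone reveals anything about the key; both
    together recover it exactly.
    """
    pad = secrets.token_bytes(len(key))
    other = bytes(a ^ b for a, b in zip(key, pad))
    return pad, other

def recover_key(share1: bytes, share2: bytes) -> bytes:
    """XOR the two shares back together to recover the key."""
    return bytes(a ^ b for a, b in zip(share1, share2))

storage_key = secrets.token_bytes(16)
share_a, share_b = split_key(storage_key)
assert recover_key(share_a, share_b) == storage_key
```

Each share can be held by a different party (say, the user and a company officer), so stored-data recovery needs both to cooperate.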

As an additional bonus, principle 3 adds extra security against
attackers gaining access to encrypted traffic after the fact -- the
recipient no longer has the key -- this is a form of forward secrecy.


Systems designed to the CDR design principles are of significantly
less use to GAKkers than PGP Inc's GAK compliant Commercial Message
Recovery (CMR) design. The CDR design significantly hinders the take
up of GAK if widely deployed.

Design principle 3 -- forward secrecy -- is inherently hostile to
GAKkers, and is the strongest statement you can make against GAK: you
are purposely _destroying_ communications keys at the earliest
possible moment to ensure that GAKkers cannot obtain the keys by
legal and extra-legal coercion, blackmail, and rubber hose
cryptanalysis.

The whole system translates into the Feds having to come and
physically take your disk to obtain information about you, which is
much better than GAK, and not what the GAKkers are interested in at
this point. The GAKkers would like to install themselves, and coerce
companies into installing for them (via GAKker/USG/NSA/NIST organised
efforts such as the 56 bit export permit for companies installing key
escrow; and efforts such as the Key Recovery Partners Alliance (KRAP)).
I fear that PGP Inc's CMR proposal inadvertently meets most of the
NIST/NSA specified KRAP requirements.

What the GAKkers want is systems where they can perform routine
keyword scanning and fishing expeditions into your communications from
the comfort of their offices, without your knowledge. This is push
button Orwellian government snooping.

Within the constraints imposed by the CDR design principles, there is
more than enough flexibility to achieve the commercial data recovery
functionality to similarly weak levels of enforceability as achieved by
the CMR design. Weak levels of enforceability are appropriate because
there are other exceedingly easy bypass routes: super-encryption, and
walking out of the office with a DAT tape.


I would like to organise a collaborative effort to write a white paper
discussing how to implement various functionalities using the CDR
design principles.

Then I would like to see discussion of which set of these
functionalities best achieves the user requirement for company
data recovery.

Lastly I would like to see a collaborative development effort to
provide an example implementation of a CDR system which can be used as
a discussion point for the OpenPGP standardisation process.

I suppose the best place to discuss this process is the IETF forum for
discussion of the OpenPGP standard, the OpenPGP mailing list
(subscribe by sending message body "subscribe ietf-open-pgp" to
"***@imc.org").

I have already had people express in email to me their interest in
doing this. Those people can speak up if they want to. Technical
input is sought from people opposed to GAK compliant software, and
from PGP Inc, and others defending PGP's GAK compliant CMR proposal.

Adam
--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
Bill Frantz
1997-10-14 15:53:56 UTC
At 2:37 AM -0700 10/14/97, Adam Back wrote:
>...
>2. second crypto recipients on encrypted communications are not
> used to allow access to third parties who are not messaging
> recipients manually selected by the sender
>...
>
>Included in 2) is the principle of not re-transmitting over
>communication channels keys or data re-encrypted to third parties
>after receipt -- that is just structuring -- and violates design
>principle 2.

This requirement tries to enforce something which cannot be enforced by
technical means. That is, when you send another person some data, there is
no technical way you can prevent them from using it however they want. For
example, a user can always program his filters (given something like
procmail) to send decrypted data anywhere he wants.

The idea that you can control what users do with data through
technical means is the most common flaw I see when people think about
security.


N.B. I applaud Adam's direction of building the data recovery businesses
need without helping 3rd parties engage in undetected snooping. Keeping
the decryption keys (if data is not stored in the clear) near the
legitimate copies seems to be a useful step in this direction.


-------------------------------------------------------------------------
Bill Frantz | Internal surveillance | Periwinkle -- Consulting
(408)356-8506 | helped make the USSR the | 16345 Englewood Ave.
***@netcom.com | nation it is today. | Los Gatos, CA 95032, USA
Adam Back
1997-10-14 18:11:18 UTC
Bill Frantz <***@netcom.com> writes:
> At 2:37 AM -0700 10/14/97, Adam Back wrote:
> >...
> >2. second crypto recipients on encrypted communications are not
> > used to allow access to third parties who are not messaging
> > recipients manually selected by the sender
> >...
> >
> >Included in 2) is the principle of not re-transmitting over
> >communication channels keys or data re-encrypted to third parties
> >after receipt -- that is just structuring -- and violates design
> >principle 2.
>
> This requirement tries to enforce something which can not be enforced by
> technical means. That is, when you send another person some data, there is
> no technical way you can prevent them from using it however they want. For
> example, a user can always program his filters (given something like
> procmail) to send decrypted data anywhere he wants.

I agree with that statement entirely.

The principle has deeper meaning than your point. It acknowledges
that there are limits to what can be done to enforce things in
software. What it argues is that you should enforce what you can,
where this helps to make your software less useful to GAKkers
without modifications. So if this means that the GAKkers can't use
your software without getting you to re-write it, or without making
the modifications themselves, this is good, because fielded systems
have inertia. It takes time and costs money to make software updates,
doubly so where people will be hostile to those updates. People who
would otherwise not care about GAK will suddenly "care" because they
are too lazy to update their system, or because the update will cost
them money.


Btw. Lest it is not clear, when I say "should" in this discussion of
the anti-GAK protocol design process I mean "should" according to my
CDR or "anti-GAK" protocol design principles. Following these
principles may sometimes give you non-functioning or unsaleable
designs; when this occurs you should still try to violate as few of
the design principles as possible.


So for your .procmail filter example: what the principle says is that
you should make it as non-automatable as possible in your software to
do this redirection in electronic form. The danger with automated
redirection in electronic form is that there will be a nice little box
for you to type into: "***@lazarus", where lazarus is the
address of the recovery machine on your LAN; but then the GAKkers can
pass laws, and all the people who bought your software will
automatically be able to comply by filling in that box with
"***@nsa.gov". Alternately the GAKkers may buy your software
for re-sale in Iraq and then fill in the field with
"***@mil.iq", with the result that Iraqis' mail is opened to
their government.

To continue with your .procmail example: the email client should
decrypt the traffic with short lived keys to provide forward secrecy,
and re-encrypt the plaintext for storage in your mail folder with a
recoverable key (presuming you want corporate data recovery of your
mail folder in case your dog chews your smart card key token).

Forwarding of email at the .procmail level won't help the GAKkers in
this case because the email is still encrypted; and the anti-GAK
protocol design principles state that the encrypted message should be
encrypted to one recipient only: you.
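The receive-then-re-encrypt flow above can be sketched as follows. This is a toy illustration under my own assumptions (a throwaway XOR stream cipher built from SHAKE-256, and made-up key names), not OpenPGP's actual message format; a real client would use a proper authenticated cipher.

```python
import hashlib
import secrets

def toy_encrypt(key: bytes, msg: bytes) -> bytes:
    # Toy XOR stream cipher (SHAKE-256 keystream) -- illustration only,
    # with no authentication.
    nonce = secrets.token_bytes(16)
    stream = hashlib.shake_256(key + nonce).digest(len(msg))
    return nonce + bytes(a ^ b for a, b in zip(msg, stream))

def toy_decrypt(key: bytes, ct: bytes) -> bytes:
    nonce, body = ct[:16], ct[16:]
    stream = hashlib.shake_256(key + nonce).digest(len(body))
    return bytes(a ^ b for a, b in zip(body, stream))

comm_key = secrets.token_bytes(32)     # short-lived: destroyed after receipt
storage_key = secrets.token_bytes(32)  # long-lived: recoverable via CDR

# Sender encrypts to the single recipient; the wire sees comm_key traffic only.
wire = toy_encrypt(comm_key, b"meeting moved to 3pm")

# Recipient decrypts, re-encrypts for the mail folder, destroys comm_key.
plaintext = toy_decrypt(comm_key, wire)
folder_copy = toy_encrypt(storage_key, plaintext)
comm_key = None  # forward secrecy: the wire ciphertext is now unreadable
```

Forwarding `wire` at the .procmail level gains an eavesdropper nothing once `comm_key` is gone; only the local `folder_copy` under the recoverable storage key remains readable.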

The anti-GAK design principles also mean that you should offer no
tools to decrypt from the command line. This ensures that GAKkers
will be hindered from using software provided by you to cobble
together a GAK system, without writing and distributing software
themselves.

Think of the CDR or anti-GAK software principles as attempting to
codify your natural predilections as a GAK hating protocol designer.
They codify how best to design your software to hinder GAKkers.


Clearly the GAKkers can pass a law saying that you must manually
forward each of your emails after decryption to them, but if the
software provides no easy way to automate this process they are asking
the impossible if there are 10 million US citizens using the software.

Contrast this to PGP Inc's CMR design where all that is required is a
change to one field for completely automated over the wire key word
searches.

> N.B. I applaud Adam's direction of building the data recovery businesses
> need without helping 3rd parties engage in undetected snooping. Keeping
> the decryption keys (if data is not stored in the clear) near the
> legitimate copies seems to be a useful step in this direction.

I like your locality point.

It is something I had not previously considered to have more than
binary meaning (communicated or not ever communicated). However it does.
The Frantz corollary to the anti-GAK protocol design principles is
then:

i) recovery information should be kept as close to the data as possible

ii) if recovery information is moved, it should preferably not be
transferred over communications networks, and should
not be transferred automatically by the software without requiring
human interaction.


You can see that this design principle leads to some at first
apparently absurd requirements, but actually they are all sensible and
pertinent. If the software requires user interaction to transmit
recovery information, it follows that it should not be possible, with
the software as is, to write automatic scripts; this contributes to
the uselessness of the software to the GAKkers.

Of course there are ways around everything (eg. scripting software
which allows mouse and keyboard actions to be automated); but we are
trying to avoid easy cobbling-together actions by the GAKkers,
allowing them to convert our software, designed under anti-GAK
principles, into an automated GAK system.


Also, clearly, there is some designer intelligence required in the
design process to work out which functionalities are worth forgoing,
by comparing their potential value to the GAKkers against their
ergonomic or utility advantage to the user.

Adam
Hal Finney
1997-10-14 16:29:10 UTC
Adam Back <***@dcs.ex.ac.uk> writes:

> If we take the design goal of designing a commercial data recovery
> system which is not GAK compliant, we can most succinctly state this
> design goal as the task of ensuring that:
>
> - at no point will any data transferred over communications links be
> accessible to anyone other than the sender and recipient with out
> also obtaining data on the recipient and/or senders disks

You do a good job of saying what you want to avoid, but not much about
what you want to accomplish! You are doing this to satisfy some kind
of market need for commercial access to data. What are the requirements
your design will satisfy? What do the customers need? You can't design
a system based solely on what it doesn't do.

PGP, Inc. has made its design choices based on interaction and discussion
between people responsible for security, human interface, sales/marketing
and customer support, as well as the customers themselves. The lessons
which have come out of this include an emphasis on solutions which
people can use and which will work in the field. This means they have
to be transparent, easy to learn, fit into existing procedures, and
be reliable, as well as being secure. Systems which require extensive
retraining, new operational procedures, or which will hurt reliability
are not acceptable - they simply won't be deployed. That's the reality.

A system with forward secrecy on email and rapid turnover of
communications keys has some attractive security properties, but you
need to look at the other aspects of using it. How will it respond to
errors? How can companies guarantee that they will be able to decrypt
data when they need it? Can you assure that no data will ever be around
which requires a key which no longer exists, and can you do so not in
a theoretical world where everything works perfectly, but in a real
company where people make mistakes and forget things?

If you were to talk to people responsible for human engineering and
deployment you'd find that there are many more considerations which go
into a successful system than theoretical security properties. Otherwise
we'd all be using one time pads.

With our current design, as I described earlier, we have data structures
which will enable people to easily replace encryption keys, if a system
can be designed to solve the usability problems in a practical way.
It looks to me like our current model supports the necessary feature
set for what you want to do, so I don't see any need for changes.

The one new suggestion has been to have separate communication and storage
keys, but this has several problematic aspects. Unlike the distinction
between encryption and signature keys, where it is obvious whether a key
is being used properly, it is not at all clear whether a particular
function represents communication or storage. Consider the case of a
remotely mounted file system, where what looks like a storage operation
is actually travelling across a (possibly insecure) network.

Another example comes from the initial version of the Eudora API. When
that makes calls out to the crypto library to encrypt messages, Eudora
actually stores the message in a disk file, has the library encrypt that
to another disk file, then constructs the message from that output file.
It will look to the library like it is encrypting a disk file using a
communications key, a violation of this proposed security policy.

In fact, in most cases a storage key will not be a public key at all,
but will be a symmetric cipher key, such as with our own PGPDisk product.
This allows transparent disk encryption and decryption. In that case you
really don't have "storage keys" as such in the public key infrastructure,
so there is no need for a distinction.

As it stands, you could design a system to use rapid replacement of
email encryption keys in order to provide forward secrecy, without any
changes whatsoever to the existing key structure as it is documented
in the current draft spec. The operational problems of creating such a
system which is reliable and usable are going to be very severe, but the
support is there in the Open-PGP data formats, and other companies and
groups are welcome to tackle them. However I don't think that Open-PGP
is the right forum for the discussion, since the issues are largely
matters of user interface and customer support.

Hal Finney
***@pgp.com
***@rain.org
Adam Back
1997-10-14 19:44:30 UTC
Hal Finney <***@rain.org> writes:
> Adam Back <***@dcs.ex.ac.uk> writes:
> > [CDR or "anti-GAK" design principles]
>
> You do a good job of saying what you want to avoid, but not much about
> what you want to accomplish! You are doing this to satisfy some kind
> of market need for commercial access to data. What are the requirements
> your design will satisfy? What do the customers need? You can't design
> a system based solely on what it doesn't do.

So? Of course you can't design a system based solely on don't-dos.

The instructions for using the principles are:

The principles are a set of negative "don't dos" to overlay on to the
normal cryptographic protocol and software architecture design
process.

Use your ingenuity and existing knowledge of cryptographic
primitives and good software design, user ergonomics, and user
requirements to design a system which meets your user requirements
while breaking as few as possible of the principles.

What the principles attempt to do is codify the pro-privacy
cryptographer's natural tendencies to want to avoid implementing and
fielding systems which can be of use to GAKkers.

I have not seen any refutation of my demonstrations that pretty much
all of PGP's current functionality can be achieved within anti-GAK
design principles.

The most worrying of the uses that your software could be put to by
GAKkers is as an implementation of a fully functioning GAK system in a
country like Iraq where the thought police will literally torture
citizens to death for non-government approved thoughts.

PGP Inc's CMR design, it is I think reasonable to argue, is a
ready-to-roll GAK system. It could be installed by the Iraqi
government with force of law tomorrow. Each service provider installs
the PGP policy enforcer and configures the "message recovery" key to
be "***@mil.iq"; each business with a leased line is ordered
to do as the ISPs, and individuals are required to use pgp5.0 or
pgp5.5 (either of which is GAK compliant).

Now the secret police can read all their citizens' mail, and PGP will
have provided the technology. Does that make your collective chests
swell with pride? No it doesn't. It fills you with despair, and
disgust that someone would so abuse your software, and so ignore your
well meaning recommendations about ethical business behaviour.

However your disgust and despair, and well meaning, won't help
prevent the civil rights violations occurring in Iraq using your
software. Your well meaning won't help people in the US either, if
the USG adopts PGP and passes laws to do the same as I described for
Iraq. It could happen. If PGP is successful, and the OpenPGP standard
makes it to Netscape SSL-like acceptance levels, and the OpenPGP
standard includes the CMR functionality, well, you will have built the
GAKkers their dream system. Something with a smooth migration path
for GAK.

You will notice that Bruce Schneier forwarded a message to cypherpunks
where the GAKkers are quoted already praising the functionality of
pgp 5.5 in providing them with GAKware.

If PGP on the other hand were to adopt the CDR system, and put their
energies into working out the design details, they would find firstly
that they can achieve all the necessary functionality, and secondly
that I and Bruce Schneier, and Peter Trei, and lots of other people
would stop saying rude things about them. But more importantly, they
would be fielding a non-GAK compliant system, and a non-GAK compliant
protocol in OpenPGP, and that will do much more to save our collective
bacon from having our comms keyword searched by the GAKkers than PGP
Inc's indubitably well meaning words of assurance of their good intentions.
Really, once you strip out the sarcasm, I am not at all denying PGP
Inc's good intentions.

The problem is: good intentions don't prevent GAK.

Albert Einstein's work made the nuke possible. He was a nice enough
guy, who wouldn't have advocated nuking millions of people, but that
sure didn't help prevent Hiroshima.

> PGP, Inc. has made its design choices based on interaction and discussion
> between people responsible for security, human interface, sales/marketing
> and customer support, as well as the customers themselves. The lessons
> which have come out of this include an emphasis on solutions which
> people can use and which will work in the field. This means they have
> to be transparent, easy to learn, fit into existing procedures, and
> be reliable, as well as being secure. Systems which require extensive
> retraining, new operational procedures, or which will hurt reliability
> are not acceptable - they simply won't be deployed. That's the reality.

Of course, and you would use those considerations to balance the
"don'ts" in the design principles. An important consideration of the
design principles is that deployment of anti-GAK software wins;
therefore the design principles also state that you should do your
best to design systems which your users will be pleased with, which
they will buy. As a pro-privacy advocate in the crypto software
industry you therefore have a dual moral obligation:

- to design systems which aren't GAK compliant

- to work your butt off making your user ergonomics so excellent that
you wipe out the competition

This is because your concern should be to ensure that the majority of
users are using non-GAK compliant software, so that the GAKkers will
find it more logistically difficult to install GAK; so that they will
have a hard time persuading users to switch to TIS
"voluntary/mandatory" GAKware. They won't switch because they love the
slick GUI, they won't switch because they don't want to be bothered
replacing their software, and a few won't switch because they realise
what the GAKkers are up to.

> A system with forward secrecy on email and rapid turnover of
> communications keys has some attractive security properties, but you
> need to look at the other aspects of using it. How will it respond to
> errors? How can companies guarantee that they will be able to decrypt
> data when they need it? Can you assure that no data will ever be around
> which requires a key which no longer exists, and can you do so not in
> a theoretical world where everything works perfectly, but in a real
> company where people make mistakes and forget things?

The forward secrecy is a separable issue, actually. And is a good
example of balancing anti-GAK design principles against ergonomics
etc. Personally I think you could achieve it without that much extra
effort with minimal to no ergonomic impact. We can argue about this
separately as it is not really the central issue.

The key points are that there should be only one crypto recipient of
a message, and that compromise of the sender's and/or the recipient's
systems should be required to compromise the message.

This clearly allows recovery of the message contents, because the
communications key can be stored encrypted to a recovery key on the
recipient's disk.

> With our current design, as I described earlier, we have data structures
> which will enable people to easily replace encryption keys,

So? That is just as easy to do in any number of possible design
permutations without having GAK compliancy.

> It looks to me like our current model supports the necessary feature
> set for what you want to do, so I don't see any need for changes.

What _I_ want to do is completely irrelevant. It matters not a fig what
I do tinkering with some scripts for my own amusement, or in selling
to 1% of the market. What matters is what PGP does in selling to 50%+
of the market, and what the OpenPGP standard says in being used by 90%
of the market. If the OpenPGP standard requires GAK compliancy we
have a serious problem.

I'm much more interested in persuading you personally, and PGP Inc
that what they are doing is GAK compliant, and that by switching to a
CDR design they can avoid this GAK compliancy.

My interest in OpenPGP is that, in the event that we are unsuccessful
in persuading PGP of the dangers of their CMR system, we have a fall
back in still being able to make OpenPGP compatible clients which
aren't GAK compliant, so that other companies can field systems
which achieve their corporate data recovery in morally responsible
ways. In this event I would be vying for the other companies to
flourish at the expense of PGP Inc going bankrupt.

> The one new suggestion has been to have separate communication and storage
> keys, but this has several problematic aspects. Unlike the distinction
> between encryption and signature keys, where it is obvious whether a key
> is being used properly, it is not at all clear whether a particular
> function represents communication or storage. Consider the case of a
> remotely mounted file system, where what looks like a storage operation
> is actually travelling across a (possibly insecure) network.

That has nothing to do with the way PGP implements the OpenPGP
standard.

What that says is that if you are involved in the design of VPN or
IPSEC products, they should also be designed with anti-GAK design
principles.

You have to develop a sense of proportionality about the significance
of each anti-GAK principle as applied to each communication.

> Another example comes from the initial version of the Eudora API. When
> that makes calls out to the crypto library to encrypt messages, Eudora
> actually stores the message in a disk file, has the library encrypt that
> to another disk file, then constructs the message from that output file.
> It will look to the library like it is encrypting a disk file using a
> communications key, a violation of this proposed security policy.

If the disk is NFS mounted it would be.

You have to apply some critical thinking to designing any system. So
it is with designing according to the anti-GAK principles. Having
sound design principles doesn't negate the need for critical thought
in any design field.

What are you thinking above? That GAK might be implemented by
governments forcing us to use GAK compliant replacements for SSH, for
IPSEC and VPN software, and thereby be able to snoop on your email?

Firstly, it sounds like a round-about approach; and secondly, the
correct place to solve this problem is in the purchase of non-GAK
compliant IPSEC and VPN software -- or, if PGP gets around to IPSEC,
in the design of non-GAK compliant IPSEC software. (Don't tell me: PGP
would design IPSEC packets with GAK compliant CMR also?)

You have to apply the above sort of critical reasoning, and a
balanced evaluation of the weak points of the system.

> In fact, in most cases a storage key will not be a public key at all,
> but will be a symmetric cipher key, such as with our own PGPDisk product.

When I was arguing for a separate storage key, I was thinking that you
would want a symmetric storage key. Perhaps pgp -c, or pgp -cs
functionality.

If you use the CDR method, and use symmetric storage keys, with
multiple storage recipients (one symmetric key (you), one asymmetric
company recovery key) for the stored mail folder, you have message
recovery.

Actually you could get away without storage keys as such, in that you
could store the recovery information for your encryption key on the
local disk. If you were to use this simple change and discard the
second enforced crypto recipient field (which breaks the second
anti-GAK principle) you could have the minimal set of modifications
needed to convert pgp 5.5 into a non-GAK compliant system.

See any specific flaws in that?

The only thing it does not allow is corporate message snooping. I
have yet to see a PGP employee admit that pgp5.5 is designed to meet
the user requirement of message snooping.

> This allows transparent disk encryption and decryption. In that case you
> really don't have "storage keys" as such in the public key infrastructure,
> so there is no need for a distinction.

Yes. And that is the ideal system in my view. However I notice that
PGP doesn't have a PGPdisk for the PC, nor for any versions of unix.

I don't have a Mac. Macs are nice machines and all, but they don't
exactly have the lion's share of the market as IBM compatible PCs do. I
encourage you to develop disk systems for other platforms.

Adam
n***@ceddec.com
1997-10-17 23:02:56 UTC
On Tue, 14 Oct 1997, Hal Finney wrote:

> You do a good job of saying what you want to avoid, but not much about
> what you want to accomplish! You are doing this to satisfy some kind
> of market need for commercial access to data. What are the requirements
> your design will satisfy? What do the customers need? You can't design
> a system based solely on what it doesn't do.
>
> PGP, Inc. has made its design choices based on interaction and discussion
> between people responsible for security, human interface, sales/marketing
> and customer support, as well as the customers themselves. The lessons
> which have come out of this include an emphasis on solutions which
> people can use and which will work in the field. This means they have
> to be transparent, easy to learn, fit into existing procedures, and
> be reliable, as well as being secure. Systems which require extensive
> retraining, new operational procedures, or which will hurt reliability
> are not acceptable - they simply won't be deployed. That's the reality.

One thing PGP has not looked into would be having a master key such that N
bits will be the same from the corporate master key to any user's key.
Then you could recover any user key from the corporate key, but it would
require some expense and effort, so it would prevent casual
eavesdropping by the administrators.

A second method would require N (typically 3) escrow keys to recreate the
access key, and would work best if it only decrypted one user's messages.

I would rather not trust a single party within a corporation.
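The several-escrow-holders idea can be realised with a threshold scheme such as Shamir secret sharing. A minimal sketch follows; the 2-of-3 parameters and the field prime are arbitrary choices for illustration, not a feature of any shipping product.

```python
import secrets

P = 2**127 - 1  # a Mersenne prime; the field for share arithmetic

def make_shares(secret: int, k: int = 2, n: int = 3):
    """Shamir threshold sharing: any k of the n shares recover secret."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the shared secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

escrow_secret = secrets.randbelow(P)  # e.g. the access key being escrowed
shares = make_shares(escrow_secret)   # one share per escrow holder
assert recover(shares[:2]) == escrow_secret        # any 2 of 3 suffice
assert recover([shares[0], shares[2]]) == escrow_secret
```

No single escrow holder learns anything about the key; only a quorum can reconstruct it, which addresses the single-trusted-party concern.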

Right now, whoever has the corporate key can read everyone's email. What
happens when there is an insider trading lawsuit when the CIO reads the
CEO's "private" email? I can think of other examples. And if the
corporate key is compromised, I assume that compromises every piece of
email up to that point?

But let me ask a question about PGP, Inc. - Do they use the PGP 5.5
version with corporate key recovery internally?
Jon Callas
1997-10-18 00:14:12 UTC
At 07:02 PM 10/17/97 -0400, nospam-***@ceddec.com wrote:
> On Tue, 14 Oct 1997, Hal Finney wrote:
>
> One thing PGP has not looked into would be having a master key such that N
> bits will be the same from the corporate master key to any user's key.
> Then you could recover any user key from the corporate key, but it would
> require some expense and effort, so it would prevent casual
> eavesdropping by the administrators.
>
> A second method would require N (typically 3) escrow keys to recreate the
> access key, and would work best if it only decrypted one user's messages.
>
> I would rather not trust a single party within a corporation.

Take a look at my "Why CMR isn't Key Escrow" essay. In it I recommend not
using a single CMRK, and mention a couple of easy-to-implement,
fault-tolerant uses.

Our plans for the next release include putting in secret sharing for a
number of purposes. I'm in favor of requiring that a CMRK be
secret-shared, even if a tin-pot dictator can easily own all the shares.
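The secret sharing discussed here (N shares, typically 3, required to recreate the access key) can be sketched as a toy (t, n) Shamir threshold scheme. This is illustrative standard-library code, not PGP's implementation; the prime and parameters are arbitrary choices.

```python
# Illustrative (t, n) Shamir secret sharing over a prime field -- a sketch
# of the kind of CMRK secret-sharing discussed above, not PGP's code.
import random

P = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte secret

def split(secret: int, t: int, n: int):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def combine(shares):
    """Lagrange interpolation at x=0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789, t=3, n=5)
assert combine(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
assert combine(shares[1:4]) == 123456789
```

Fewer than t shares reveal nothing about the secret, which is what makes the scheme preferable to simply copying the corporate key to three custodians.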

Right now, whoever has the corporate key can read everyone's email. What
happens when there is an insider trading lawsuit when the CIO reads the
CEO's "private" email? I can think of other examples. And if the
corporate key is compromised, I assume that compromises every piece of
email up to that point?

I don't think you've been reading the descriptions of how it works. You're
also focusing on using it with a single key. Every user can have a
different key. No user MUST have a key.

But let me ask a question about PGP, Inc. - Do they use the PGP 5.5
version with corporate key recovery internally?

No, we don't. We have no need to. It would be inappropriate for our
environment.

Jon



-----
Jon Callas ***@pgp.com
Chief Scientist 555 Twin Dolphin Drive
Pretty Good Privacy, Inc. Suite 570
(415) 596-1960 Redwood Shores, CA 94065
Fingerprints: D1EC 3C51 FCB1 67F8 4345 4A04 7DF9 C2E6 F129 27A9 (DSS)
665B 797F 37D1 C240 53AC 6D87 3A60 4628 (RSA)
Adam Back
1997-10-18 07:17:06 UTC
Permalink
Jon Callas <***@pgp.com> writes:
> Right now, whoever has the corporate key can read everyone's email. What
> happens when there is an insider trading lawsuit when the CIO reads the
> CEO's "private" email? I can think of other examples. And if the
> corporate key is compromised, I assume that compromises every piece of
> email up to that point?
>
> I don't think you've been reading the descriptions of how it works. You're
> also focusing on using it with a single key. Every user can have a
> different key. No user MUST have a key.

The fact that it is all optional does not mean that a company won't
choose to use it in exactly that way.

I suspect that many companies with their strict property ownership
opinions will have one CMR key, and use the pgp5.5 for business
framework to enforce that all users use it.

> But let me ask a question about PGP, Inc. - Do they use the PGP 5.5
> version with corporate key recovery internally?
>
> No, we don't. We have no need to. It would be inappropriate for our
> environment.

Most companies aren't as progressive as PGP, and most companies have a
corporate property ownership attitude even if they similarly have no
need for the actual functionality.

Adam
--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
Bill Stewart
1997-10-18 09:30:01 UTC
Permalink
At 05:14 PM 10/17/1997 -0700, Jon Callas wrote:
> > But let me ask a question about PGP, Inc. - Do they use the PGP 5.5
> > version with corporate key recovery internally?
>No, we don't. We have no need to.
>It would be inappropriate for our environment.

You've got applications it would be fine for:
- tech support, so people can encrypt messages to
somebody at tech support and other people
could read it if it's their turn on support that day.
- secretaries having access to their bosses' email
Thanks!
Bill
Bill Stewart, ***@ix.netcom.com
Regular Key PGP Fingerprint D454 E202 CBC8 40BF 3C85 B884 0ABE 4639
Rick Smith
1997-10-14 21:42:32 UTC
Permalink
It should be clear by now that privacy is not the only security objective
sought by customers of information security products, nor even by all
customers of crypto products. Practical users rarely pursue privacy at all
costs, nor do they pursue accountability and traffic visibility at all
costs. Many must find a balance between two fundamentally conflicting goals.

Regarding the practical uses of e-mail key disclosure, let me include one
from the guard/firewall world that I haven't seen mentioned yet:

We've been shipping products since 1994 that scan the contents of e-mail
messages and reject contents that violate specified filtering criteria.
Sites use it to block importation of viruses or other inappropriate
attachments, and to block the export of improperly released information.
Most of these systems have been sold to the government and use the Message
Security Protocol to encrypt data. The system rejects messages that don't
contain an extra key so that the firewall can scan message contents.

This violates the assumed requirement that the contents of an e-mail
message must not be viewed by anyone except the message's author and
recipient.

However, it's a security trade-off that some organizations want to make for
certain applications.

PGP's key recovery protocol isn't the perfect solution, but it would help
resolve a big problem. To send mail through these systems, the users must
be trained to include the firewalls as message recipients -- this produces
a copy of the symmetric key encrypted with the firewalls' individual PKs.
If a user forgets, then the message cannot pass through. The PGP approach
of warning or demanding another PK token would help solve that problem, at
least in simple cases.
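The firewall behaviour described here (bounce mail that lacks the extra recipient, since its contents cannot be scanned) might look roughly like this; the key ID and the scanner hook are hypothetical placeholders, not part of any real product:

```python
# A sketch of the firewall-side gate described above: a message whose
# session key is not also encrypted to the firewall's key is bounced,
# because the content filter cannot scan it. Key IDs are made up.
FIREWALL_KEY_ID = "0xFEEDFACE"

def gate_message(recipient_key_ids, body_scanner):
    """Pass the message only if the firewall can decrypt and scan it."""
    if FIREWALL_KEY_ID not in recipient_key_ids:
        return "REJECT: no firewall recipient; cannot scan contents"
    if not body_scanner():
        return "REJECT: content filter matched"
    return "PASS"

assert gate_message(["0x12345678"], lambda: True).startswith("REJECT")
assert gate_message(["0x12345678", "0xFEEDFACE"], lambda: True) == "PASS"
```

The point of the PGP-style warning is to move the first check from the firewall (where the message is lost) to the client (where the user can fix it before sending).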

ObPolitics: Personally, I think it's too soon to tell if PGP's
implementation would benefit the FBI in its pursuit of wiretapping keys. At
most it might resolve whether such mechanisms are in fact a practical
technology. I'm not yet convinced.

Also, if commercial sites have already co-opted PGP's recovery key for
their own uses, it's not clear that the FBI will be able to use it for
clandestine investigations. If they approach the site's IS managers to
acquire copies of the firewall keys, there's a good chance a rumor will get
back to the people being targeted for surveillance. Also, I believe the
overhead for separate eavesdropping keys would produce too clear a sign to
everyone that the FBI is listening. There is no precedent for such a thing
and even if it's adopted temporarily I doubt it will persist. People will
notice, and it will make them mad -- it will show them that the FBI is
indeed under everyone's bed. Even the FBI can't stand up against broadly
based grassroots pressure. Of course, I've been wrong before about politics.

Rick.
***@securecomputing.com Secure Computing Corporation
"Internet Cryptography" now in bookstores http://www.visi.com/crypto/
Adam Back
1997-10-14 22:42:08 UTC
Permalink
Rick Smith <***@securecomputing.com>
> Regarding the practical uses of e-mail key disclosure, let me include one
> from the guard/firewall world that I haven't seen mentioned yet:
>
> We've been shipping products since 1994 that scan the contents of e-mail
> messages and reject contents that violate specified filtering criteria.
> Sites use it to block importation of viruses or other inappropriate
> attachments, and to block the export of improperly released information.
> Most of these systems have been sold to the government and use the Message
> Security Protocol to encrypt data. The system rejects messages that don't
> contain an extra key so that the firewall can scan message contents.

I'm not sure whether pgp5.5 has this ability to screen messages prior
to reading, and also not sure whether it has built in the ability to
snoop messages prior to sending.

The basis is there for the functionality in the enforced second
recipient, but I'm not sure whether the client, or SNMP management
tools, or SMTP policy enforcer implement this functionality.

I'd welcome clarification from PGP Inc. employees, or from anyone in
the US who has tried the pgp5.5 for business suite.

> This violates the assumed requirement that the contents of an e-mail
> message must not be viewed by anyone except the message's author and
> recipient.

It does, yes. And that demonstrates that the principle is achieving
its function of highlighting areas of a design which could be used
for GAK purposes.

> However, it's a security trade-off that some organizations want to
> make for certain applications.

Absolutely. You should try hard though to see if there are any
software hacks or protocol reorganisations you can do to make the
system unusable for GAK, or as close to unusable as you can.

> PGP's key recovery protocol isn't the perfect solution, but it would help
> resolve a big problem.

Where that problem is whether messages can be screened by automated
processes? One solution, if you restrict yourself to processes, is to
move the function to the client and scan after decryption. This isn't as
easy to integrate into existing MUAs, but would be possible for fresh
re-writes like the pgp5.x clients.

If you want the ability for humans to scan messages manually as well, and
you can't live with the modifications in the client, you can do this
within the CDR design principles:

Have all email received by the company encrypted to the company's
public key. Have your virus filter/human screener check the email,
and then pass it on to the user in clear, or pass it on to the user
encrypted with the user's key.

Actually this is more secure than your solution, because the client
already has an in memory master key; and now you have one crypto
recipient rather than two to blow your security for you by allowing
inadvertent key compromise.

For outgoing email, configure the mail client plugin to either send in
plain-text, or to sign only (one hopes with non-escrowed signature
keys, else your MIL friends will be able to forge each other's mail),
or to encrypt to the outgoing filtering agent, and have the outbound
mail hub filter the content, or forward it to a human screener for checking.

CDR compliant. Has some extra resistance to GAK corruption.
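The inbound hub flow sketched above could look something like this. The XOR "cipher" is a toy stand-in for PGP encryption, and the key names and scan rule are invented purely to show the single-recipient structure:

```python
# A sketch of the CDR-compliant mail-hub flow above: inbound mail is
# encrypted to the company key only; the hub decrypts, screens, and
# re-encrypts to the one intended user. XOR is a toy stand-in for PGP.
def toy_encrypt(key: bytes, data: bytes) -> bytes:   # placeholder for PGP
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

toy_decrypt = toy_encrypt  # XOR is its own inverse

COMPANY_KEY = b"company-secret"
USER_KEYS = {"alice": b"alice-secret"}

def hub_deliver(ciphertext: bytes, user: str):
    """Decrypt with the company key, screen, re-encrypt to the one user."""
    plaintext = toy_decrypt(COMPANY_KEY, ciphertext)
    if b"virus" in plaintext:            # stand-in content filter
        return None                      # blocked at the hub
    return toy_encrypt(USER_KEYS[user], plaintext)

inbound = toy_encrypt(COMPANY_KEY, b"quarterly report")
delivered = hub_deliver(inbound, "alice")
assert toy_decrypt(USER_KEYS["alice"], delivered) == b"quarterly report"
assert hub_deliver(toy_encrypt(COMPANY_KEY, b"a virus"), "alice") is None
```

Note there is only ever one crypto recipient per hop, which is the design-principle-2 property the paragraph above is arguing for.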

As an additional application of CDR anti-GAK principles you could do
the small software hack of allowing the clients and/or mail filtering
agents to only communicate with each other if the machine IP addresses
look like they are on the same mail hub.
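The same-mail-hub IP check suggested above is nearly a one-liner with a modern standard library; the /24 network here is an assumed example, not anything from the original design:

```python
# A sketch of the "same mail hub" software hack above: refuse to talk to
# a peer whose address falls outside the local network. The /24 boundary
# is an illustrative assumption.
import ipaddress

LOCAL_NET = ipaddress.ip_network("10.1.2.0/24")

def peer_allowed(peer_ip: str) -> bool:
    """Only accept peers that look like they share our mail hub's subnet."""
    return ipaddress.ip_address(peer_ip) in LOCAL_NET

assert peer_allowed("10.1.2.77")
assert not peer_allowed("192.0.2.1")
```

As the text says, this is resistance rather than prevention: an attacker who controls the software can remove the check, which is why it is paired with the no-source suggestion.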

Don't provide source. (I suspect your company doesn't anyway for this
kind of app, or do MIL people like to inspect source?)

So now your system has a number of extra protections against being
used for GAK. They are not perfect, but you've done what you can, and
it's a definite improvement over what you had before.


However I'm not sure it matters for your application really whether it
is GAK compliant or not. This is because your application sounds like
it is mainly for defense contractors, and MIL or NSA type use. As far
as I'm concerned that lot deserve GAK. Keep themselves honest :-)

> To send mail through these systems, the users must be trained to
> include the firewalls as message recipients -- this produces a copy
> of the symmetric key encrypted with the firewalls' individual PKs.
> If a user forgets, then the message can not pass through. The PGP
> approach of warning or demanding another PK token would help solve
> that problem at least in simple cases.

You can achieve the same functionality by insisting on mail being
encrypted to the firewall. The firewall can then re-encrypt it for the
intended user.

> ObPolitics: Personally, I think it's too soon to tell if PGP's
> implementation would benefit the FBI in its pursuit of wiretapping keys. At
> most it might resolve whether such mechanisms are in fact a practical
> technology. I'm not yet convinced.

I think you have a point there and that it's not entirely clear how
this would work out.

> Also, if commercial sites have already co-opted PGP's recovery key for
> their own uses, it's not clear that the FBI will be able to use it for
> clandestine investigations. If they approach the site's IS managers to
> acquire copies of the firewall keys, there's a good chance a rumor will get
> back to the people being targeted for surveillance.

I don't think that's the way it will work. What they'll do is require
SEC cleared companies to provide this key as a matter of law to the
government. If cheating is detected the bosses will get prison terms.
Then they'll slowly spread it from SEC cleared firms to other
companies. Maybe public companies. Then they'll create
Reichstag-fire-style, FBI-constructed publicity cases; perhaps money
laundering, or a purported mafia front business, as an example of a
public threat from allowing companies to communicate without escrow.
Individuals too: some terrorist cases, a few more large-scale
bombings. No problem.

If PGP provided facility for multiple enforced extra crypto
recipients, it would be even more GAK compliant, as companies could
just put a NSA key into the recipient box.


The CDR design principles also apply to standards. Standards are very
powerful things. If OpenPGP requires capability to understand and
encode to second crypto recipients which are not message recipients
for conformancy, we are in trouble also because all clients will then
be required to make GAK easier to enforce; the capability is right
there in pgp5.0 and pgp5.5 now.

If on the other hand the OpenPGP standard does not include the second
non-message-recipient crypto recipient feature, then companies who do
choose to implement it must build up their own client base outside of
the installed client base, because they won't be able to build
GAK and have interoperability at the same time.

(We'll see whether or not PGP argue for this feature to go in the
standard in a bit, when the draft standard is released, and
discussions commence).

Also remember that another danger with GAK software is that your
country might be too liberal for the government to get away with
various things, but others aren't.

We don't want to encourage civil rights abuses. In some countries
saying non-government approved thoughts results in prison terms,
torture, and painful death.

> Also, I believe the overhead for separate eavesdropping keys would
> produce too clear a sign to everyone that the FBI is
> listening. There is no precedent for such a thing and even if it's
> adopted temporarily I doubt it will persist. People will notice, and
> it will make them mad -- it will show them that the FBI is indeed
> under everyone's bed. Even the FBI can't stand up against broadly
> based grassroots pressure. Of course, I've been wrong before about
> politics.

Well I hope you're right of course, but I feel PGP could do more to
prevent the scenario I present above, which I feel is fairly
realistic.

I also hope I have provided some worked examples of the logic in
applying the CDR design principles.

(They are counter-intuitive sorts of principles to work with because
they are all negatives: don't do this, don't do that, no
recommendations as such on what _to do_. This is because what you do
is explore the solution space outside of the restrictions they apply to
it. They are also strange in that it is unusual to see formally
codified cryptographic design principles which reflect an entirely
political issue. Must be a first :-)

Adam
--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
Rick Smith
1997-10-15 17:25:05 UTC
Permalink
In response to some things I said about crypto and firewalls, Adam Back wrote:

>I'm not sure whether pgp5.5 has this ability to screen messages prior
>to reading, and also not sure whether it has built in the ability to
>snoop messages prior to sending.

It's not a feature of 5.5, it's a separate activity that relies on the
presence of a third party key. The actual inspection occurs at a security
choke point like a firewall. That way the tests are applied consistently
regardless of mistakes made by clients in software installation or
configuration.

You noted that it might in fact be possible to get the same result with a
more GAK-resistant implementation, and I agree. For example, if the
principals all update their public keys fairly often, then there's no
relatively long-lived key that's easy to escrow. The practical version of
forward secrecy, as it were. I have no objection to that. As far as I can
tell, there's nothing intrinsic in the current PGP proposals to prevent
such an implementation.
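The key-updating idea can be sketched as a short-lifetime key holder that simply forgets the old key at rollover; the random bytes stand in for real keypairs, and the lifetime value is an arbitrary example:

```python
# A sketch of frequent key updates as practical forward secrecy: each
# key lives for a short window and is destroyed at rollover, so there
# is no long-lived key worth escrowing. os.urandom stands in for real
# keypair generation.
import os

class RotatingKey:
    def __init__(self, lifetime: float, now: float = 0.0):
        self.lifetime = lifetime
        self._roll(now)

    def _roll(self, now: float):
        self.key = os.urandom(32)        # placeholder for a fresh keypair
        self.expires = now + self.lifetime

    def current(self, now: float) -> bytes:
        if now >= self.expires:
            self._roll(now)              # old private key is forgotten
        return self.key

rk = RotatingKey(lifetime=60.0)
k1 = rk.current(now=10.0)
assert rk.current(now=30.0) == k1        # same window, same key
assert rk.current(now=75.0) != k1        # rolled over; old key is gone
```

Anyone escrowing a copy of `k1` holds a key that stops mattering one lifetime later, which is exactly why short key lifetimes are hostile to escrow.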

Concerning your other examples, I agree that there are ways to trade off
between degrees of GAK sensitivity and various customer security
objectives. However, this whole discussion revolves around the fact that
people rely heavily on off the shelf solutions, and there aren't the right
tools to do everything you're suggesting. I'd like to see things in the
crypto marketplace be as flexible as possible. Just because a few provide
some features that could in theory be applied to GAK doesn't IMHO mean that
those will manage to prevail. A diverse marketplace of crypto devices is
essential for it to flourish -- one that gives every customer the flavor
they want. That way the general public will develop a working understanding
of it. And we'll all learn what really works and what doesn't.

Let's face it, crypto is at about the same stage as cars were when Henry
Ford started building them. We really know as much about how crypto will
change the future as he did about the effects of cars. We can't start
engineering in governors to restrict our speed, or restricting our
designs to governor-proof transmissions.

I regret I'll have to bow out of this discussion because I'll be on travel
for a couple of weeks. Have fun everyone.

Rick.
***@securecomputing.com Secure Computing Corporation
"Internet Cryptography" now in bookstores http://www.visi.com/crypto/
Anthony E. Greene
1997-10-14 21:43:25 UTC
Permalink
-----BEGIN PGP SIGNED MESSAGE-----

At 15:42 1997-10-14 -0600, Rick Smith wrote:
>ObPolitics: Personally, I think it's too soon to tell if PGP's
>implementation would benefit the FBI in its pursuit of wiretapping keys. At
>most it might resolve whether such mechanisms are in fact a practical
>technology. I'm not yet convinced.

A major reason GAK runs into such strong opposition is the gut feeling
that it violates the expectation of privacy. If the communication in
question is already accessible by a third party, it can be argued that
that expectation no longer exists. In this environment a government
might succeed in passing GAK legislation that requires organizations
that have third-party access to encrypted communications to provide the
plaintext to the government on issue of a subpoena or court order.

>Also, if commercial sites have already co-opted PGP's recovery key for
>their own uses, it's not clear that the FBI will be able to use it for
>clandestine investigations. If they approach the site's IS managers to
>acquire copies of the firewall keys, there's a good chance a rumor will get
>back to the people being targeted for surveillance.

Assuming GAK legislation is passed, any such action on the part of an IS
manager would be illegal.
Anonymous Remailer
1997-10-15 00:17:54 UTC
Permalink
On Tue, Oct 14, 1997 at 10:37:21AM +0100, Adam Back wrote:
>
> If we take the design goal of designing a commercial data recovery
> system which is not GAK compliant, we can most succinctly state this
> design goal as the task of ensuring that:

I take a certain sad pleasure in reminding people that among my first
posts on this list I warned that cypherpunks had better design a
commercial key recovery system, or one they didn't like would be
forced down their throats. Attila now spouts my line exactly, Adam is
busy designing (though he still can't bring himself to use the term
'key recovery', being meme-dominated as he is), and Tim nods
approvingly.

Good luck, gentlemen -- you should have been doing this a year ago.

--
Kent Crispin "No reason to get excited",
***@songbird.com the thief he kindly spoke...
PGP fingerprint: B1 8B 72 ED 55 21 5E 44 61 F4 58 0F 72 10 65 55
http://songbird.com/kent/pgp_key.html



This message was automatically remailed. The sender is unknown, unlogged,
and nonreplyable. Send complaints and blocking requests to
<***@juno.com>.
Marshall Clow
1997-10-15 05:04:30 UTC
Permalink
Anthony Green may have written:
>At 15:42 1997-10-14 -0600, Rick Smith wrote:
>>ObPolitics: Personally, I think it's too soon to tell if PGP's
>>implementation would benefit the FBI in its pursuit of wiretapping keys. At
>>most it might resolve whether such mechanisms are in fact a practical
>>technology. I'm not yet convinced.
>
>A major reason GAK runs into such strong opposition is the gut feeling
>that it violates the expectation of privacy. If the communication in
>question is already accessible by a third party, it can be argued that
>that expectation no longer exists. In this environment a government
>might succeed in passing GAK legislation that requires organizations
>that have third-party access to encrypted communications to provide the
>plaintext to the government on issue of a subpoena or court order.
>
This is a very important point.
From what I read, the law about privacy is falling into two extremes:
"private", or "no expectation of privacy"; i.e., public.

Cell phones and cordless phones, for example, fall into the "no expectation"
class.

Calling patterns (numbers dialed, and length of calls) also have "no
expectation of privacy", because you disclose them to a third party
(the phone company). You can't avoid disclosing them, but that doesn't
seem to matter. :-(

I can see a court deciding that since you encrypted to your company's CMR key,
your email is not private. Even if you are a company officer discussing company
business with the corporate attorney. IANAL, though.

This is not a weakness in CMR per se, so much as a weakness in any
"encrypt to additional recipients" scheme.

-- Marshall

Marshall Clow Aladdin Systems <mailto:***@mailhost2.csusm.edu>

"In Washington DC, officials from the White House, federal agencies and
Congress say regulations may be necessary to promote a free-market
system." -- CommunicationsWeek International April 21, 1997
Will Price
1997-10-15 09:34:01 UTC
Permalink
Adam:

First, let me state some overriding design goals of a data recovery system
required to ensure privacy: the sender must know and consent to every key
that will be able to read the message during its lifetime, the encryption
must be end-to-end, and the recipient must know exactly who else can
decrypt the message. The sender's privacy is paramount as it is their data
which is being trusted to the system. These are basic principles not only
of a data recovery system, but for any cryptosystem.

The design you have been espousing for the last week or so in your many
messages takes the power out of the hands of the sender and encourages
automated violations of the sender's privacy by the recipient (perhaps even
unbeknownst to the recipient). In your model, the recipient automatically
decrypts and then re-encrypts to a data recovery key -- even though
end-user computers are likely to be insecure thus making this decrypt &
reencrypt step rather specious at best. The only information the sender
has before sending the message is "your message might be able to be read"
by someone else, or more likely no information whatsoever as there is no
need to put such information in the protocol as far as the format is
concerned. Either way, the sender is thus easily led into a false
assumption of security. The encryption is not end-to-end but rather is
completely unwrapped in the middle and then rewrapped introducing serious
security flaws, and the sender has no idea to whom the message will be
auto-reencrypted by the receiver.

As an actual data recovery system, it also fails fundamental tests. If I
encrypt critical data to a colleague, wiping it from my system after
sending, and the colleague is incapacitated before receipt and processing
of the message, then the data can never be retrieved. A data recovery system
must solve this kind of issue -- data recovery here means that from
end-to-end the data is recoverable in case of emergency. One cannot ignore
message transit time in this -- it can take days for a message to travel
from AOL to the outside world. If you don't need data recovery, don't use
it, but at least respect the people who do need it and need it to actually
work at all points.

>With these three principles you still have lots of flexibility because
>you can escrow storage keys

I'm truly amazed that you would attack in such a spiteful fashion a simple
system which adds a recipient-requested, sender-approved extra recipient
which is end-to-end wherein all recipients are under the sender's control
and each recipient knows who can read the message with no key escrow using
the same old PGP message format we all know and love without change, and
yet you propose a much less secure system which allows hiding critical
information from the sender and does not adequately perform its stated
purpose of data recovery.

- -Will


Will Price, Architect/Sr. Mgr.
Pretty Good Privacy, Inc.
555 Twin Dolphin Dr, Ste.570
Redwood Shores, CA 94065
Direct (650)596-1956
Main (650)572-0430
Fax (650)631-1033
Pager (310)247-6595
***@pgp.com
Internet Text Paging: <mailto:***@roam.pagemart.net>
<pgpfone://clotho.pgp.com>
<http://www.pgp.com>

PGPkey: <http://pgpkeys.mit.edu:11371/pks/lookup?op=get&search=0x5797A80B>
Ian Brown
1997-10-15 11:01:51 UTC
Permalink
-----BEGIN PGP SIGNED MESSAGE-----

> The design you have been espousing for the last week or so in your many
> messages takes the power out of the hands of the sender and encourages
> automated violations of the sender's privacy by the recipient (perhaps even
> unbeknownst to the recipient).

Whatever you do, the recipient has the plaintext (has the argument
really descended to this level?) As Ian Grigg has pointed out, there are
*no* technical means a message sender can employ to stop the recipient
'violating their privacy'. You are splitting hairs again. As Adam has
repeatedly pointed out, there is no problem with flagging on a key that
the message may be read by someone else. This is actually more honest
than a scheme where 'this message can be read by key X' flags are used;
once the recipients have the plaintext, they can give it to whoever they
like to read. Adam's scheme does not put in place an infrastructure
which encourages automated snooping. It leaves it entirely up to
separate organisations as to whether they implement data recovery. Have
you *read* Bruce Schneier's post on how quickly GAK proponents in
Washington have picked up on this? Are you proud to have provided an
argument for S909? Are you happy to have the NSA using you as an
argument that GAK works?!!!

> The NSA states that key recovery is doable and will not jeopardize
> national security. And there is an existence proof for key recovery
> software in the new PGP release.

Adam's design does NOT "encourage automated violations of the sender's
privacy" - that I would reserve for PGP 5.5. You split hairs yet again
by claiming PGP 5.5 is "a simple system... wherein all recipients are
under the sender's control". As Adam has pointed out in his "many
posts", it's not much use letting the sender remove the extra recipient
if they know the message will then simply be bounced. Adam's request for
you to remove these fields makes the system simpler. His communications
key idea adds additional security, but that is the only reason for the
increased complexity - an *additional* security feature.

This argument is exhausting. You may not give two hoots what I, or Adam
Back, or any number of people say. But could you not at least listen to
Schneier, who you must admit is quite an authority in this field? Even
if you *were* right, you have not managed to convince him. Do you really
think it's going to be good for PGP Inc if he recommends that clients
and anyone else who asks should not use PGP Inc products?

Ian.
Adam Back
1997-10-15 22:45:01 UTC
Permalink
Part of the problem in this debate I think is that I have proposed
many alternate designs, with varying degrees of GAK-hostility.

Also I have been accused of using "lots of anti GAK rhetoric, but
giving no proposals" by Kent. I reject that claim. (I did use lots
of rhetoric, but this was to try to impress upon those arguing for CMR
its dangers. They do not seem to acknowledge them.) I'll try in
this post to steer clear of anti-GAK rhetoric. We'll instead take it
as a given that pgp5.5 and pgp5.0 are GAK compliant because of CMR and
that this is a bad thing.

Will is correct on one point: at the beginning I had not properly
thought one aspect through: my earlier decrypt & re-encrypt construct
violates anti-GAK design principle 2; namely it is guilty of the same
violation as CMR: it has an effective second crypto recipient outside
of the sender's control. I agree with Will Price's comments on the GAK
pervertability tendencies of this construct. I spotted the error of
my ways while clarifying my thoughts on the subject by constructing
the more formalised anti-GAK design principles. Readers will see my
codification of my recognition of the dangers of the re-encrypting
construct in corollary 1 of the anti-GAK design principles (copy of
principles below [1]). The fact that I posted this corollary in
updated copies of the design principles prior to Will's post should
show that this claim is sincere and not an attempt at side stepping
Will's observation: he is correct, I agree.

The rest of Will's post seems to miss the point, so it seems to me
that the best way to transfer understanding of how to use the anti-GAK
design principles to design less GAK friendly systems is to present a
worked example.

This first simple CDR replacement for PGP's CMR method also attempts
to keep the changes necessary to implementations and packet formats to
a minimum. Please understand that it violates one of the major design
principles in order to achieve this simplicity; objections that it is
not that much more valuable than CMR will be countered by showing how
to achieve an even more GAK-hostile design by removing the violation
of design principle 3, albeit with more coding effort and design
modifications on PGP's part. Nevertheless it is already much harder to
pervert for GAK than CMR.

Design 1.

Instructions:

- scrap the CMR key extension

- store a copy of the private half of the user's PGP encryption key
encrypted to the company data recovery key on the user's disk.

- (optional) design the software to make it hard to copy the data
recovery packet from the disk: hide the data, bury it in keyrings,
stego-encode it, whatever, use your imagination. This is to attempt
to restrict a third party's ability to bypass the principle of
non-communication of recovery information.


Recovery method:

Custodian of recovery key inserts recovery floppy disk in machine,
decrypts the copy of the user's private key, hands control back to the user to
choose new passphrase.
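Design 1 can be sketched as follows. The XOR function is a toy stand-in for encrypting to the company recovery key, and every name here is illustrative, not part of any PGP format:

```python
# A sketch of Design 1 above: the user's private key is stored on the
# LOCAL disk encrypted to the company recovery key, so recovery requires
# access to that disk -- nothing travels over a communications link.
# XOR is a toy stand-in for public-key encryption.
import os

def toy_encrypt(key: bytes, data: bytes) -> bytes:  # placeholder for RSA/Elgamal
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

toy_decrypt = toy_encrypt  # XOR is its own inverse

recovery_key = os.urandom(32)            # held offline by the custodian
user_private_key = os.urandom(32)

# Written to the user's disk at key-generation time.
recovery_packet = toy_encrypt(recovery_key, user_private_key)

# Recovery: the custodian brings the recovery key to the machine,
# decrypts the packet, and the user chooses a new passphrase.
assert toy_decrypt(recovery_key, recovery_packet) == user_private_key
```

The GAK-hostile property is structural: the recovery packet never appears in any message, so a third party wanting access must obtain the disk, per the stated design goal.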


Possible objections:

objection #1. what if disk burns?
counter #1: backup your disk

objection #2: users don't back up disks
counter #2: that is a good way to lose data :-) if they don't have
the data, the key protecting the data won't help them

GAK-hostility rating:

Harder to pervert for GAK than pgp5.5 / pgp5.0 CMR design.


I'd be interested to see Will's, or Hal's, or other PGPers' criticisms of this
simple modification; perhaps criticisms could most constructively answer:

- what is stopping you implementing this
- are there any plug ins which can't cope with this
- are there user requirements which it can't meet
- is there some fundamental flaw you think I have missed
- can you see ways that this could be perverted to implement GAK
(yes I can too, btw, but...)
- are those ways logistically harder for GAKkers to achieve than for CMR

Please be specific: no general waffle about understanding the
complexities of balancing user ergonomics, user requirements, etc.
That is a no-brainer; you need to do this analysis, but the cost function
for evaluating such design issues is now expressed explicitly in design
principle 4 rather than being assumed. List problems and explain the
significance of the all-important deployability criteria.

Cryptographic protocol designs are very flexible; most design goals
can, I claim, be met or worked around within the positive GAK-hostility
side of the cryptographic protocol and product design solution space.

Lastly, I would encourage readers to be critical of the GAK-hostile design
principles themselves:

- can you see any aspects which inaccurately reflect trade-offs
- can you see methods to bypass the design, inadvertently or
deliberately, that might require another corollary to correct

In anticipation of constructive criticism,

Adam

[1]
==============================8<==============================
GAK-hostile design principles

If we take the design goal of designing systems including
confidentiality which are not GAK compliant, we can most succinctly
state this design goal as the task of ensuring that:

- at no point will any data transferred over communications links be
accessible to anyone other than the sender and recipient without
also obtaining data on the recipient's and/or sender's disks


We can then derive the design principles required to meet the design
goal of a non-GAK compliant system with confidentiality services down
to ensuring that:

principle 1:
no keys used to secure communications in any part of the system are
a-priori escrowed with third parties

principle 2:
second crypto recipients on encrypted communications are not
used to allow access to third parties who are not messaging
recipients manually selected by the sender

principle 3:
communications should be encrypted to the minimum number of
recipients (typically one), and those keys should have as short a
life time as is practically possible

principle 4:
deployment wins. violating any of principles 1 to 3 whilst
still retaining some GAK-hostility can be justified where
deployment is thereby increased to the extent that the violations
increase the overall degree of GAK-hostility in the target
jurisdictions

Corollary 1: Included in design principle 2 is the principle of not
re-transmitting keys or data over communication channels after
decryption, re-encrypted to third parties -- that is just structuring,
and violates design principle 2.

Corollary 2: where communications are transmitted which violate
principles 1, 2 or 3 it is in general more GAK hostile to enforce as
far as possible that the recovery or escrow information remains as
close to the data as possible.

Corollary 3: where communications are transmitted which violate
principles 1, 2 or 3 it is in general more GAK hostile to make these
communications as difficult to automate as possible. For example, no
scripting support is given, to enforce that GUI user interaction is
required; and/or the process is made artificially time consuming;
and/or the communication must not use electronic communication
channels

==============================8<==============================
Gene Hoffman
1997-10-15 23:11:09 UTC
On Wed, 15 Oct 1997, Adam Back wrote:

>
> - store a copy of the private half of the users PGP encryption key
> encrypted to the company data recovery key on the users disk.
>

You would rather have PGP implement private key escrow?

Gene Hoffman
PGP, Inc.
Adam Back
1997-10-16 07:11:28 UTC
Gene Hoffman <***@pgp.com> writes:
> On Wed, 15 Oct 1997, Adam Back wrote:
>
> >
> > - store a copy of the private half of the users PGP encryption key
> > encrypted to the company data recovery key on the users disk.
> >
>
> You would rather have PGP implement private key escrow?

Yes.

This is less GAK friendly than the way that PGP are implementing CMR.

In worked example #2, and perhaps a #3 as well, I will as promised
show you how to apply the design principles to achieve greater
GAK-hostility than example #1, which you objected to above.

However, in the mean time, I would like you and other PGPers to
re-read my post and answer the questions contained in it:

> - can you see ways that this could be perverted to implement GAK
> (yes I can too, btw, but...)
> - are those ways logistically harder for GAKkers to achieve than for CMR

You appear to claim that your answer to the second question is no.

I would like to see you explain your reasoning for why this is so.

You may find it constructive to re-read some of Tim May's recent posts
as he explains the logic of this fairly clearly. Tim May does not
need the anti-GAK design principles to think in a critical,
GAK-hostile way.

PGP Inc does appear to need them because their design principles are
currently at best GAK-neutral, and appear to be largely based on
woolly, ill-thought-out pro-privacy / liberal thinking.

You have to think in a crypto-anarchist, saboteur mindset to maximise
your ability to prevent mandatory GAK becoming reality. The anti-GAK
design principles are a codification of the crypto-anarchist GAK
saboteur's natural predilections to want to prevent the GAKkers.

I have in waiting some other design principles which codify more
general crypto-anarchist design principles. I will not be adding
these to the anti-GAK design principles at this stage for fear of
confusing the first issue: how best to prevent GAK occurring in our
and other countries.

Adam
--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
Kent Crispin
1997-10-16 03:35:09 UTC
On Wed, Oct 15, 1997 at 11:45:01PM +0100, Adam Back wrote:
>
>
> Part of the problem in this debate I think is that I have proposed
> many alternate designs, with varying degrees of GAK-hostility.
>
> Also I have been accused of using "lots of anti GAK rhetoric, but
> giving no proposals" by Kent.

Adam, you've tossed out half-baked ideas buried in several thousand
lines of anti-GAK rant. None of them were thought through in terms of
infrastructure impact. The idea of reencrypting the data strikes me
as half-baked, as well -- I sit and wonder about the pass-phrase
handling for the transient encryption keys that are changing on a daily
or weekly basis -- or is there no pass-phrase -- is the key just
stored on disk with no protection


> I reject that claim. (I did use lots
> of rhetoric, but this was to try to impress upon those arguing for CMR
> of its dangers. They do not seem to acknowledge them.)

The evidence seems to suggest that the PGP folks agonized pretty
heavily over their design. A stupid attack such as yours is far more
likely to cement resistance than it is likely to win cooperation.

> I'll try in
> this post to steer clear of anti-GAK rhetoric. We'll instead take it
> as a given that pgp5.5 and pgp5.0 are GAK compliant because of CMR and
> that this is a bad thing.

Trying real hard...

> Will is correct on one point: at the beginning I had not properly
> thought one aspect through:

I suspect there are several other flaws you are now quite aware
of...too bad, I hoped you had something.

[...]

> Design 1.
>
> Instructions:
>
> - scrap the CMR key extension
>
> - store a copy of the private half of the users PGP encryption key
> encrypted to the company data recovery key on the users disk.

I work for a large organization, I have a unix workstation, an
xterminal booting off a departmental server, and a Mac in my office.
As is typical in large organizations, a system admin team takes care
of all routine administration of my systems. They all have root, of
course, and routinely do system upgrades and software installs on my
Mac.

Your solution doesn't seem to fit this environment very well...
[...]

> Recovery method:
>
> Custodian of recovery key inserts recovery floppy disk in machine,
> decrypts copy of users private key, hands control back to user to
> choose new passphrase.

Must be a very special boot floppy, of course, otherwise I just
subvert the floppy driver, feign forgetting my passphrase, and
collect the corporate crown jewels. Or I hack into somebody else's
system and corrupt their key...

[...]
>
> - what is stopping you implementing this

It's completely unrealistic.

> - are there any plug ins which can't cope with this
> - are there user requirements which it can't meet
> - is there some fundamental flaw you think I have missed
> - can you see ways that this could be perverted to implement GAK
> (yes I can too, btw, but...)
> - are those ways logistically harder for GAKkers to achieve than for CMR
>
> Please be specific, no general waffle about understanding the
> complexities of balancing user ergonomics, user requirements etc.

Unfortunately, for real products you do have to consider these
factors.

[...]
>
> Adam
>
> [1]
> ==============================8<==============================
> GAK-hostile design principles
>
> If we take the design goal of designing systems including
> confidentiality which are not GAK compliant, we can most succinctly
> state this design goal as the task of ensuring that:
>
> - at no point will any data transferred over communications links be
> accessible to anyone other than the sender and recipient without
> also obtaining data on the recipient and/or senders disks

This is great.


> We can then derive the design principles required to meet the design
> goal of a non-GAK compliant system with confidentiality services down
> to ensuring that:
>
> principle 1:
> no keys used to secure communications in any part of the system are
> a-priori escrowed with third parties
>
> principle 2:
> second crypto recipients on encrypted communications are not
> used to allow access to third parties who are not messaging
> recipients manually selected by the sender
>
> principle 3:
> communications should be encrypted to the minimum number of
> recipients (typically one), and those keys should have as short a
> life time as is practically possible

Key lifetime is a major issue. Keys are either protected by
pass-phrase, or vulnerable. Think about how you are going to
generate new keys every day, or every week...

Think about off-line composition of email -- I have a laptop, download
my mail from the pop server, compose email. Now I can't store my
friends public keys on my disk, because they expire every day. So I
have to go to the public keyserver for every correspondent's public
key -- if the keyserver is unaccessible I'm out of luck. This
radically changes the expected semantics of email.

--
Kent Crispin "No reason to get excited",
***@songbird.com the thief he kindly spoke...
PGP fingerprint: B1 8B 72 ED 55 21 5E 44 61 F4 58 0F 72 10 65 55
http://songbird.com/kent/pgp_key.html
Adam Back
1997-10-16 13:39:16 UTC
Kent Crispin has a tenacious ability to continue to think logically
and critically, and not be drawn into emotional exchanges in the face
of jibes and hostilities (I am referring to previous unpleasantries
on cypherpunks). I think we all could take a leaf out of his book,
and I am going to attempt to do so myself from this point on in the
CMR vs CDR argument. (I accept Jon Callas' comments along similar
lines.)

Kent Crispin <***@songbird.com> writes:
> On Wed, Oct 15, 1997 at 11:45:01PM +0100, Adam Back wrote:
> >
> >
> > Part of the problem in this debate I think is that I have proposed
> > many alternate designs, with varying degrees of GAK-hostility.
> >
> > Also I have been accused of using "lots of anti GAK rhetoric, but
> > giving no proposals" by Kent.
>
> Adam, you've tossed out half-baked ideas buried in several thousand
> lines of anti-GAK rant. None of them were thought through in terms of
> infrastructure impact.

I have resolved to repent my ways with regard to unconstructive
ranting and rudeness. This should have the positive side effect of
reducing the length of my posts, and ensuring that more people read
them critically.

That the ideas seemed initially fuzzy is a reflection of the fact that
I have, as I presume others have also, been gradually improving my
understanding of these complex issues.

I found the exercise in attempting to codify what I consider to be
optimal design decisions from the point of view of maximising the GAK
resistance properties of protocol designs, software implementations
and communications standards useful in clarifying my thoughts.

I feel confident in my ability to demonstrate that my GR (GAK
resistant) design principles can be usefully applied to increase the
GAK resistance of the design of systems using confidentiality and
communications.

I will comment on your specific comments on this aspect below.

> The idea of reencrypting the data strikes me as half-baked, as well
> -- I sit and wonder about the pass-phrase handling for the transient
> encryption keys that are changing on a daily or weekly basis -- or
> is there no pass-phrase -- is the key just stored on disk with no
> protection

Your suggestion that the re-encryption construct is weak from a GAK
resistance (GR) perspective is correct. See my earlier comments on
realising the danger of that construct.

> > I reject that claim. (I did use lots
> > of rhetoric, but this was to try to impress upon those arguing for CMR
> > of its dangers. They do not seem to acknowledge them.)
>
> The evidence seems to suggest that the PGP folks agonized pretty
> heavily over their design. A stupid attack such as yours is far more
> likely to cement resistance than it is likely to win cooperation.

I am forced to agree that emotional attacks are likely to hinder
cooperation. Therefore emotional attacks on this topic are themselves
likely to be counter to global GR optimisation. It is for this reason
that I will attempt from this point on in this discussion to swallow
my anger unless I can justify outbursts in terms of GR optimisation.
Readers are encouraged to remind me if I start slipping.

I can only offer as an excuse that it seems to me that PGP Inc could
be doing more to increase the GAK resistance of their product and
design within the financial and user requirement constraints they
face. Emotional appeals are, as you say, not likely to be the best
means to explain this belief.

I also offer as a mild excuse that in carrying out interactive list
discussions with people in the US who are on GMT-8 that my sleep
deprivation may have been showing through in irritability :-) I found
myself for instance going to bed at 8.30 am the other night.

> > I'll try in
> > this post to steer clear of anti-GAK rhetoric. We'll instead take it
> > as a given that pgp5.5 and pgp5.0 are GAK compliant because of CMR and
> > that this is a bad thing.
>
> Trying real hard...

Allow me to rephrase that in the light of my GR maximisation motivated
apparent character reform:

I believe that PGP Inc could make their design more resistant to GAK.

The reason I believe that PGP Inc has thus far arrived at different
design conclusions than I, Bruce Schneier, Tim May, Ian Brown, and
others, is that PGP Inc have differently prioritised the multiple
desirable properties of a socially progressive crypto design.

Desirable properties, in prioritised order of importance as I see
them:

1. preventing big brotherish governments enforcing GAK
2. discouraging little brotherish business practices
3. encouraging transparency of intent (marking keys with statements of
intent on handling of plaintext)
4. application ergonomics

I think that PGP Inc has transposed criteria 1 and 2 in their
prioritisation. I believe PGP Inc's design decisions and
recommendations to companies reflect honest attempts to discourage
little brother, and their use of statement of intent technology
demonstrates their commitment to preventing big brother. But I fear
that their prioritisation reduces the big brother resistance of their
CMR system because this resistance has been traded off to attempt to
provide little brother resistance.

If I understand correctly several PGP employees have claimed that you
should attempt to enforce the statement of intent principle with
protocol modifications. Whilst statement of intent is useful, and a
good innovation which I applaud, criteria 1 and 2 should take
precedence where protocol modifications which are thought to
strengthen statement of intent have a side effect of reducing GAK
resistance.

Independently I think that it is not semantically useful to try to
enforce statement of intent at the protocol level with the CMR method.
This is because having an enforced second recipient in no way
guarantees that the second recipient will read the message, or is able
to read the message (the second recipient might not receive the
ciphertext, or he might have lost the company access key).

> > Will is correct on one point: at the beginning I had not properly
> > thought one aspect through:
>
> I suspect there are several other flaws you are now quite aware
> of...too bad, I hoped you had something.

I believe that using the GAK resistant design principles allows one to
focus more clearly on the benefits of the various trade-offs possible
and to more accurately prioritise the multiple social criteria.

> > Design 1.
> >
> > Instructions:
> >
> > - scrap the CMR key extension
> >
> > - store a copy of the private half of the users PGP encryption key
> > encrypted to the company data recovery key on the users disk.
>
> I work for a large organization, I have a unix workstation, an
> xterminal booting off a departmental server, and a Mac in my office.
> As is typical in large organizations, a system admin team takes care
> of all routine administration of my systems. They all have root, of
> course, and routinely do system upgrades and software installs on my
> Mac.

Sounds like a good example to base discussions upon: X-terminals,
multi user unix work stations and remotely configurable PCs.

> Your solution doesn't seem to fit this environment very well...

I would take your point to be that you can't store the recovery key
on the local disk because the disk isn't local, and that when the
recovery key is stored on the local disk of a remotely administered
or multi-user workstation, this is less secure.

I agree. It is less secure.

(I have htmlized, and attempted to more clearly re-word the GR design
principles:

http://www.dcs.ex.ac.uk/~aba/gakresis/

I have also added a fourth corollary which you might like to comment
on (it's not relevant to the above point).)

To return to your criticisms based on the mish-mash of shared
multi-user machines and X-terminals typical of corporate
environments, corollary 2 expresses the best that can be done in
this scenario:

> Corollary 2: where communications are transmitted in ways which
> violate principles 1, 2 or 3 it is in general more GAK resistant to
> enforce as far as possible that the recovery or escrow information
> remains in as close proximity to the data as possible, and as much
> under the control of the user as possible.

So if you are using an X-terminal, your passphrase will be going over
the ethernet, and all the files will be on a unix box. About all you
can do about this to minimise security problems is to try to secure
your ethernet with IPSEC technology. This is not currently very
widely deployed especially for intranet use.

Certainly if you are using your X-terminal to connect to machines over
the internet many companies have taken steps to reduce the dangers of
this. VPN systems, and SSH achieve this kind of thing. IPSEC and VPN
technology are typically fairly GAK resistant anyway, because use of
forward secrecy is common.

The overall system is _still_ more GAK resistant than CMR for the
sorts of logistical reasons that Tim May has been describing. You may
like to comment on this claim which I am willing to defend.

> > Recovery method:
> >
> > Custodian of recovery key inserts recovery floppy disk in machine,
> > decrypts copy of users private key, hands control back to user to
> > choose new passphrase.
>
> Must be a very special boot floppy, of course, otherwise I just
> subvert the floppy driver, feign forgetting my passphrase, and
> collect the corporate crown jewels. Or I hack into somebody else's
> system and corrupt their key...

You can hack around it, and this is implicitly acknowledged. The
point is that in trying to design the system so that it must be hacked
around before allowing easy use in a mandatory GAK setting, you have
built in extra GAK resistance in the form of the deployment and
logistical problems the government will have in developing and
deploying patches and making sure everyone applies them.

> [...]
> >
> > - what is stopping you implementing this
>
> It's completely unrealistic.

It was stated in its simplest possible form for clarity.

It is however possible to build resistance into the system in the form
of inertia of the deployed code base in not providing automated ways
to access the recovery information outside of the software package.

Bill Stewart came up with the very good suggestion, for example, that
you only keep some of the bits of the recovery key, so that recovery
is made artificially time consuming. This means that without
replacing your software base, the government has a much harder time
installing GAK. It also has the social benefit of discouraging
companies from using what are intended to be data recovery features
as snooping features. This is exactly the sort of lateral thinking
that I am hoping to encourage.
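
Bill Stewart's idea might be sketched like this. A toy with
deliberately small parameters: a real deployment would withhold
enough bits to make the search take hours or days, and the hash
check used to recognise the right key is my own hypothetical
verification step, not part of his suggestion.

```python
import hashlib
import secrets

KEY_BITS = 32        # toy key size for the sketch
WITHHELD_BITS = 16   # bits deliberately not stored; tune for delay

full_key = secrets.randbits(KEY_BITS)

# The stored recovery record: the key with its low bits zeroed, plus
# a hash of the full key so a brute-force search can recognise success.
partial_key = (full_key >> WITHHELD_BITS) << WITHHELD_BITS
check = hashlib.sha256(full_key.to_bytes(8, "big")).digest()

def recover(partial: int, check: bytes) -> int:
    # Recovery remains possible but is artificially time consuming:
    # every one of the 2**WITHHELD_BITS candidates for the withheld
    # bits must be tried. This discourages casual snooping and means
    # GAK cannot be bolted on without replacing the software base.
    for low in range(1 << WITHHELD_BITS):
        candidate = partial | low
        if hashlib.sha256(candidate.to_bytes(8, "big")).digest() == check:
            return candidate
    raise ValueError("key not found")

assert recover(partial_key, check) == full_key
```

The cost of each recovery scales as 2**WITHHELD_BITS, so the delay
is a single tunable parameter.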

> > - are there any plug ins which can't cope with this
> > - are there user requirements which it can't meet
> > - is there some fundamental flaw you think I have missed
> > - can you see ways that this could be perverted to implement GAK
> > (yes I can too, btw, but...)
> > - are those ways logistically harder for GAKkers to achieve than for CMR
> >
> > Please be specific, no general waffle about understanding the
> > complexities of balancing user ergonomics, user requirements etc.
>
> Unfortunately, for real products you do have to consider these
> factors.

I fully agree that you have to acknowledge user ergonomics and user
requirements. What I was asking was that people in criticising my GR
design principles explain which user requirements they think can not
be met, or which user ergonomics features are hindered. I also
explicitly state in design principle 4 that you should balance these
considerations to maximise the global GAK resistance of the deployed
software and hardware in the target jurisdiction.

> > principle 3:
> > communications should be encrypted to the minimum number of
> > recipients (typically one), and those keys should have as short a
> > life time as is practically possible
>
> Key lifetime is a major issue. Keys are either protected by
> pass-phrase, or vulnerable. Think about how you are going to
> generate new keys every day, or every week...

Perhaps if I give some examples of how someone designing a protocol
according to these principles might proceed in this direction, it
would be clearer how to minimise the impact on ergonomics without
losing so much security that government access becomes possible
simply as a property of the induced weakness. (In other words, I am
willing to trade security for extra GAK resistance if it comes to
it, but this typically does not seem to be required, as GR design
principles 1, 2, and 3 are independently sound security objectives.)

Companies would protect all keys by password, or by smart card token,
or by secured facilities or just by the nature of it being sufficient
for their security requirements to rely on the same weak physical
security which protects their paper files.

So: the signature key does not have recovery information -- I think
this is agreed by all. The storage key used to encrypt information
on your disk has recovery information stored as described. The
shorter-lived forward secret keys could also be recovery protected
(during their lifetime). PGP 5.x already has this feature. Consider
the encryption keys to be your forward secret keys, and give them
shorter than normal expiry dates.
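
One way a client might schedule such short-lived encryption keys,
as a hypothetical sketch (the class and its names are mine, not
PGP's expiry mechanism):

```python
import secrets

class ShortLivedKeys:
    """Toy forward-secrecy schedule: one encryption key per time
    period; keys older than `lifetime` periods are destroyed, so
    traffic from those periods can no longer be decrypted by anyone,
    including the original recipient."""

    def __init__(self, lifetime: int = 7):
        self.lifetime = lifetime
        self.keys = {}  # period number -> key material

    def key_for(self, period: int) -> bytes:
        if period not in self.keys:
            self.keys[period] = secrets.token_bytes(32)
        # Destroying expired keys is what makes the scheme forward
        # secret -- and what makes it hostile to after-the-fact GAK.
        for old in [p for p in self.keys if p <= period - self.lifetime]:
            del self.keys[old]
        return self.keys[period]

ks = ShortLivedKeys(lifetime=7)
day0 = ks.key_for(0)
assert ks.key_for(0) == day0   # stable within its lifetime
ks.key_for(7)                  # a week later...
assert 0 not in ks.keys        # ...day 0's key is gone for good
```

The deleted key is the whole point: an attacker (or GAKker) who
seizes the machine later finds nothing that decrypts old traffic.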

Your next comment is very pertinent in demonstrating the sorts of
problems you must work around in attempting to maximise use of forward
secrecy.

> Think about off-line composition of email -- I have a laptop, download
> my mail from the pop server, compose email. Now I can't store my
> friends public keys on my disk, because they expire every day. So I
> have to go to the public keyserver for every correspondent's public
> key -- if the keyserver is unaccessible I'm out of luck. This
> radically changes the expected semantics of email.

I have 2 comments on this problem:

Consider: a system which automatically adapts, and is forward secret
when it is able but not otherwise, is more GAK resistant than one
which never uses forward secrecy at all because of the existence of
some situations where it is difficult to use.

Consider also: a system which is relatively forward secret (perhaps
with key updates every week, or month) is more GAK resistant than one
which makes no key-update recommendations, leaving people to choose
long expiries of a year, or, more dangerous still, no expiries at
all, with no comment on the risks.

Adam
--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
Jon Callas
1997-10-16 02:00:21 UTC
At 11:45 PM 10/15/97 +0100, Adam Back wrote:


Okay, Adam, I'll be civil here, but here's something I want to note:

You've ranted, raved, politicized, propagandized, given ad hominem attacks,
and stated the opinion that anyone who disagrees with you is evil. You've
sent flames to our internal development lists, which is at least impolite.
Yet you say, "constructive criticism only." Sure. I'd like an apology from
you, though. Deal?



> Also I have been accused of using "lots of anti GAK rhetoric, but
> giving no proposals" by Kent. I reject that claim. (I did use lots
> of rhetoric, but this was to try to impress upon those arguing for CMR
> of its dangers. They do not seem to acknowledge them.) I'll try in
> this post to steer clear of anti-GAK rhetoric. We'll instead take it
> as a given that pgp5.5 and pgp5.0 are GAK compliant because of CMR and
> that this is a bad thing.

Uh huh. Steer clear of rhetoric, but we'll take it as a given that you're
right and everyone else is wrong. At least this is a de-escalation.

> Design 1.
>
> Instructions:
>
> - scrap the CMR key extension
>
> - store a copy of the private half of the users PGP encryption key
> encrypted to the company data recovery key on the users disk.

Okay -- constructive criticism only. I sincerely hope I'm reading this
correctly. You're saying that someone's private key should be encrypted to
the corporation's key. This sounds like key escrow to me. How does this
differ from the overly strict, nit-picking, freedom-threatening definition
that I gave?

This is better than the throw-the-floppy-in-the-safe model in that the
company-readable version of your key is sitting on your machine. That's good.

I see a threat here that if the corporation backs up my disk, they
have my secret key and thus can read all files that key has ever
encrypted. This is bad. Normally, if they back up my system, they
have my secret key, but they have to crack its passphrase. Most
people's passphrases are easier to crack than a public key, but I
think this is worse.

With this system, the corporation can read everything I encrypt with that
key, because they effectively own it. Encrypting my secret key to them
essentially gives it to them. With CMR, I have the option of making some
files readable, and some not. This isn't necessarily a good thing -- some
companies want access to all data, and your proposal helps them.

I'm actually very surprised by this design of yours. On the scale of
property-balanced-with-privacy, you've come down hard on the side of
property. Your system makes it so that an employee of a company can *never*
use this key for a purpose the company can't snoop on. This isn't
necessarily bad, I think that people *should* have separate keys for work
and personal use. This just makes the work key definitely the work key. A
number of our customers will like that.

> - (optional) design the software to make it hard to copy the data
> recovery packet from the disk, hide the data, bury it in keyrings,
> stego encode it, whatever, use your imagination. This is to attempt
> to restrict the third party's ability to bypass the principle of
> non communication of recovery information

This is security through obscurity. We publish our source code, so this
won't work.

> Recovery method:
>
> Custodian of recovery key inserts recovery floppy disk in machine,
> decrypts copy of users private key, hands control back to user to
> choose new passphrase.

Choosing a new passphrase is not sufficient. If the custodian ever uses that
key, it *must* be revoked, a new encryption key issued, and all data
encrypted with it re-encrypted. There is also the problem of
re-distributing the revocation and new encryption key to all the people who
have your old one. This is no worse than any other revocation problem, but
CMR does not require revoking the user's key.

> Possible objections:
>
> objection #1. what if disk burns?
> counter #1: backup your disk
>
> objection #2: users don't back up disks
> counter #2: that is a good way to loose data :-) if they don't have
> the data the key protecting the data won't help them

This is no different with CMR. One of the design goals of CMR is to avoid
the myriad logistic and security problems associated with data archival.

> GAK-hostility rating:
>
> Harder to pervert for GAK than pgp5.5 / pgp5.0 CMR design.

Why? With your mechanism, if the G manages to A the K, then they can
decrypt every message that key has ever encrypted. I think this is a design
flaw.


> I'd be interested to see Will, or Hal, or other PGPers' criticisms of this
> simple modification; perhaps criticisms could most constructively answer:
>
> - what is stopping you implementing this
> - are there any plug ins which can't cope with this
> - are there user requirements which it can't meet
> - is there some fundamental flaw you think I have missed
> - can you see ways that this could be perverted to implement GAK
> (yes I can too, btw, but...)
> - are those ways logistically harder for GAKkers to achieve than for CMR
>
> Please be specific, no general waffle about understanding the
> complexities of balancing user ergonomics, user requirements etc.
> That is a no-brainer, you need to do this analysis, the cost function
> for evaluating such design issues is now expressed explicitly in design
> principle 4 rather than being assumed. List problems and explain the
> significance of the all important deployability criteria.
>
> Cryptographic protocol designs are very flexible; most design goals can
> be met, or worked around I claim within the positive GAK-hostility
> side of the cryptographic protocol and product design solution space.
>
> Lastly, I would encourage readers to be critical of the GAK-hostile design
> principles themselves:
>
> - can you see any aspects which inaccurately reflect trade-offs
> - can you see methods to bypass inadvertently or deliberately the design
> that might require another corollary to correct.
>
> In anticipation of constructive criticism,

Okay, general observations:

I'm really surprised at this. In the continuum between privacy and
property, you've come down hard on the side of property. You've said that a
key owned by a corporation is *fully* owned by the corporation, and any
employee who uses it for personal purposes is daft. This is not what I
expected you to be arguing.

Enforcement. Most corporations want some level of enforcement on their
policies. The enforcement we put in isn't fool-proof, but it's far easier
to comply than resist. This is a design goal. I have a concern that the
only enforcement that the corporation has is to take your private key. If
this is their only way to make you follow their rules, they'll do it. Many
of them will play nice if possible, but hardball if they have to.

Fair-warning. In my first missive, I talked about my own principles, and
one of them is the "fair-warning" principle. It states that users should
know what is going on. If you have a key that is used in this system, there
is nothing in it that tells me that your company can read a message I send
you. I see this as a flaw, and one that I consider to be a *very* big deal.
Full disclosure is one of my hot buttons.



I think this breaks a number of your principles.

Principle 1: The end-user's keys *are* escrowed with the company. If my
disk is ever backed up, then the corporation has my secret key. In order to
keep it from being implicitly escrowed, I have to put it someplace like
off-line media that can be gotten to if I'm hit by a bus. If you disagree,
please tell me how this is different from escrow.

Principle 2: The corporation is always a tacit crypto-recipient. It's no
different than CMR, and has the additional disadvantage that senders don't
know that the implicit receivers are there.

Principle 3: Again, the corporation is a tacit recipient in *all* uses of
the key. With CMR, they are an explicit recipient, and it's possible to
exclude them. There's no way to exclude the corporation here.

Principle 4: I don't see how this differs between your proposal and CMR.


Lastly, here's a summation of what I think.

I think it's an interesting proposal. You're much more of a
corporate-control proponent than I am. I think control and privacy have to
be balanced, whereas you obviously think corporate control is trump. We
disagree there.

I am uncomfortable at the ease with which the end user can lose their key.
The end user must somehow prevent the employer from even so much as backing
up their computer, or it's just plain escrow.

I am uncomfortable not only with your siding with the corporation against
the employee's privacy, but also with your siding against the privacy of
someone who sends a message to the employee. Furthermore, I think that the
absence of a disclosure mechanism in your protocol is for us, a fatal flaw.
We'd never implement a system that does not have disclosure.

I do not see how your system is GAK-hostile. I think it is no more
GAK-hostile than CMR, and potentially more GAK-friendly, because it is
based around manipulating the actual secret key material. The failure mode
of CMR is that an adversary can decrypt messages, whereas the failure mode
of your proposal is that the adversary gets the key.

Jon


Adam

[1]
==============================8<==============================
GAK-hostile design principles

If we take the design goal of designing systems including
confidentiality which are not GAK compliant, we can most succinctly
state this design goal as the task of ensuring that:

- at no point will any data transferred over communications links be
accessible to anyone other than the sender and recipient without
also obtaining data on the recipient's and/or sender's disks


We can then derive the design principles required to meet the design
goal of a non-GAK compliant system with confidentiality services down
to ensuring that:

principle 1:
no keys used to secure communications in any part of the system are
a-priori escrowed with third parties

principle 2:
second crypto recipients on encrypted communications are not
used to allow access to third parties who are not messaging
recipients manually selected by the sender

principle 3:
communications should be encrypted to the minimum number of
recipients (typically one), and those keys should have as short a
life time as is practically possible

principle 4:
deployment wins: violating any of principles 1 to 3 whilst
still retaining some GAK-hostility can be justified where
deployment is thereby increased to the extent that the violations
increase the degree of GAK hostility in the target jurisdictions
overall

Corollary 1: Included in design principle 2 is the principle of not
re-transmitting keys or data over communication channels after
decryption, re-encrypted to third parties -- that is just structuring,
and violates design principle 2.

Corollary 2: where communications are transmitted which violate
principles 1, 2 or 3, it is in general more GAK-hostile to enforce, as
far as possible, that the recovery or escrow information remains in as
close proximity to the data as possible.

Corollary 3: where communications are transmitted which violate
principles 1, 2 or 3, it is in general more GAK-hostile to make these
communications as difficult to automate as possible. For example:
provide no scripting support, so that GUI user interaction is
required; and/or make the process artificially time consuming; and/or
require that the communication not use electronic communication
channels

==============================8<==============================



-----
Jon Callas ***@pgp.com
Chief Scientist 555 Twin Dolphin Drive
Pretty Good Privacy, Inc. Suite 570
(415) 596-1960 Redwood Shores, CA 94065
Fingerprints: D1EC 3C51 FCB1 67F8 4345 4A04 7DF9 C2E6 F129 27A9 (DSS)
665B 797F 37D1 C240 53AC 6D87 3A60 4628 (RSA)
TruthMonger
1997-10-16 09:39:37 UTC
Permalink
Jon Callas wrote:
> At 11:45 PM 10/15/97 +0100, Adam Back wrote:

> Okay, Adam, I'll be civil here, but here's something I want to note:
>
> You've ranted, raved, politicized, propagandized, given ad hominem attacks,
> and stated the opinion that anyone who disagrees with you is evil. You've
> sent flames to our internal development lists, which is at least impolite.
> Yet you say, "constructive criticism only." Sure. I'd like an apology from
> you, though. Deal?

Praise the Lord!
The CypherPunks mailing list dialogue re: CMR/PGP is CypherPissing at
its finest.

As a cryptographer, I am pretty much a carpetbagging pretender, but,
up to now, I have managed to fool quite a number of people into thinking
that I may understand the issues involved in privacy, security and
encryption.

Now that the shit has seriously hit the fan, however, I find that I
am completely clueless as to the true import of the latest developments
which will decide the future of encryption. (And I suspect that I am
not alone in this.)
I believe that my philosophical viewpoints of encryption issues are
valid in many ways (and probably irrelevant in other ways), but the
current nadir point in encryption development is one in which there is
no possibility of many of us making sound decisions as to what position
we should ethically take, unless those who truly have a solid grounding
in the underlying technology manage to accurately explain the issues
involved to those of us who *don't* dream in algorithms.

I am extremely pleased with Adam Back's in-your-face, "I'm from
Missouri...show me!" attitude, since I think that this issue is
important enough that no one should give an inch of ground until
their philosophical opponents have given them valid cause for doing
so. I am also pleased that Adam is honestly and openly asking for
those who *can* 'show' him, to do so.
I am also every bit as interested in hearing and learning from the
position that Jon Callas is taking, based upon his own knowledge of
what CMR/PGP is, and is not.

I honestly do not care in the least whether Adam and Jon are 'both
right', whether they are both 'half-right', or none of the above.
What I *do* care about is that they both honestly state their case
to the extent that I have enough information to make my own decision
as to what future course of action I should take on these issues.

My depth of concern in this matter springs from the following:
I care...and I act. As a result, my actions have effects, for which
I consider myself responsible.
I truly believe that abortion results in the extinguishing/murder
of a divine spark of human/spiritual life energy. Yet I risked my
life and my freedom, helping my sister smuggle home-abortion
literature into a predominantly Catholic country behind the Iron
Curtain. Why? Because it is not up to me to make the decisions
for *everyone*, and I do not believe that it is in the interest
of humanity to have *two* spirits die because those who choose
to do home abortions do not have access to information that will
preserve their life.
The 'Right To Life' faction will publish *their* statistics and
opinions, as will the 'Pro Choice' faction, but I refuse to take
the easy way out and convince myself that I can flip a coin to
decide which faction will bear the responsibility for *my* own
decision in the matter.

The coming developments in information technology will undoubtedly
make George Orwell look like an optimist.
We have to make our decisions without having the benefit of hindsight
that history affords us. If Hitler had indeed only wanted 'Austria',
then the concessions that world leaders of the time made might have
proven to have saved many needless deaths. History has proven this to
be wrong, but those of us who did not live through that time would have
a difficult time divining who was 'honestly wrong' and who 'sold out.'

How many guilty men should go free in order to guarantee that a single
innocent man is not imprisoned?
My answer: "More than one, less than a million." (ymmv)

> Fair-warning. In my first missive, I talked about my own principles, and
> one of them is the "fair-warning" principle. It states that users should
> know what is going on. If you have a key that is used in this system, there
> is nothing in it that tells me that your company can read a message I send
> you. I see this as a flaw, and one that I consider to be a *very* big deal.
> Full disclosure is one of my hot buttons.

I could be wrong, *but*:
With PGP 5.0, I found that if someone sent me a message that
was encrypted to someone else, I would get a message telling me
that I didn't have the proper key, but would not tell me who the
message *was* encrypted to.
I could drop into PGP 2.6.2 and get a message saying (paraphrased),
"Encrypted to John Doe <***@dev.null>, you don't have that key."
(OK, *badly* paraphrased.)

With PGP 2.6.2, I routinely used a bogus password in my first pass at
decyphering messages, so that I could find out who all the message was
encrypted to. It makes me nervous that one has to 'make a mistake' in
order to get 'the rest of the story', rather than automatically be
informed when a message is also encrypted to others.

Also, as a 'teaser', I would like to announce to one and all that
the quickly closing saga of 'InfoWar' will include an epilogue
chapter titled, "I Broke PGP," written by myself.
Believe it or not, I speak the truth, although not in a way that
is direct and obvious.
If you think I am bullshitting, then stop washing your asshole,
starting today, because if you can show me I am wrong, I will kiss
your ugly, hairy ass.

Love and Kisses,
TruthMangler
Jon Callas
1997-10-16 17:40:56 UTC
Permalink
At 03:39 AM 10/16/97 -0600, TruthMonger wrote:

I could be wrong, *but*:
With PGP 5.0, I found that if someone sent me a message that
was encrypted to someone else, I would get a message telling me
that I didn't have the proper key, but would not tell me who the
message *was* encrypted to.

You're right, that was in PGP 5.0. It sucked. It's fixed in 5.5. 5.5 shows
you a nice little box on every message showing you who it is encrypted to.

Jon



-----
Jon Callas ***@pgp.com
Chief Scientist 555 Twin Dolphin Drive
Pretty Good Privacy, Inc. Suite 570
(415) 596-1960 Redwood Shores, CA 94065
Fingerprints: D1EC 3C51 FCB1 67F8 4345 4A04 7DF9 C2E6 F129 27A9 (DSS)
665B 797F 37D1 C240 53AC 6D87 3A60 4628 (RSA)
Adam Back
1997-10-16 17:58:39 UTC
Permalink
To reply to Jon Callas request for an apology for my rudeness, I'll
offer a deal:

- If I am unable to show you how to improve the GR (GAK resistance)
property of a fully viable, implementable email security system within
your choice of user requirement, ergonomic, and typical corporate
environment restrictions, I will publicly apologize for being
rude, and for wasting your time.

- However, in return: if I am successful in proving to PGP Inc that they
can improve the GR property of an email security system within their
requirements, I _don't_ want an apology.

I want something much more interesting:

I want PGP Inc to implement and deploy it.

- The group which will judge this process is PGP Inc.

Do we have a deal?

Adam
--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
Bill Stewart
1997-10-17 08:51:10 UTC
Permalink
At 11:45 PM 10/15/1997 +0100, Adam Back wrote:
>We'll instead take it as a given that
>pgp5.5 and pgp5.0 are GAK compliant because of CMR and
>that this is a bad thing.

I'll duck the 5.5 argument for now, but you're
incorrect on 5.0. The PGP 5.0 key format includes
separate keys for signature and privacy, which is a mostly good thing,
and includes the ability to associate a group of
keyIDs and flag bits with each privacy key.
(or each set of privacy key + signature key?? Jon?)
What semantics are attached to this association are dependent
on the computer program, and PGP 5.0 does nothing CMRish with them,
much less GAKish. OpenPGP could do anything it wants with the field,
such as use it for keyids (or IPv4 addresses, since it's 32 bits)
of keyservers with copies of the keys, or whatever.
PGP 5.6 could at least take the case of multiple keyids
and secret-share the session key between them rather than
allowing any single key to access it.
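(To make the secret-sharing idea concrete, here is a toy sketch -- not code from any PGP release -- of an n-of-n XOR split, where the session key can only be rebuilt when every named keyholder contributes a share:)

```python
import secrets

def split_key(session_key: bytes, n: int) -> list[bytes]:
    """Split a session key into n XOR shares; all n are needed to rebuild it."""
    shares = [secrets.token_bytes(len(session_key)) for _ in range(n - 1)]
    last = session_key
    for s in shares:                       # fold the random shares into the key
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def join_key(shares: list[bytes]) -> bytes:
    """XOR every share together to recover the original session key."""
    key = bytes(len(shares[0]))
    for s in shares:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

session_key = secrets.token_bytes(16)
shares = split_key(session_key, 3)         # e.g. recipient + two keyholders
assert join_key(shares) == session_key     # all three shares: key recovered
```

Any proper subset of the shares is statistically independent of the key, so no single CMRK holder (or pair of them, here) learns anything alone.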


>The rest of Will's post seems to miss the point, so it seems to me
>that the best way to transfer understanding of how to use the anti-GAK
>design principles to design less GAK friendly systems is to present a
>worked example.

Good - we'll attempt to nitpick it at least as thoroughly as
we're nitpicking PGP 5.5 :-)

>- store a copy of the private half of the users PGP encryption key
> encrypted to the company data recovery key on the users disk.
No. This is evil. Don't go there. Even with secret sharing,
and especially without.

>- (optional) design the software to make it hard to copy the data
> recovery packet from the disk, hide the data, bury it in keyrings,
> stego encode it, whatever, use your imagination.
This is security by obscurity; not highly useful.

>Possible objections:
>objection #1. what if disk burns?
>counter #1: backup your disk
>objection #2: users don't back up disks

Objection 3 - users _do_ back up disks -
that means every backup tape or disk or server volume
potentially contains your disk's encryption keys
CAKked up with the Corporate Master Key, which is Bad,
and wildly out of the user's control, violating
one of your principles.


Thanks!
Bill
Bill Stewart, ***@ix.netcom.com
Regular Key PGP Fingerprint D454 E202 CBC8 40BF 3C85 B884 0ABE 4639
Adam Back
1997-10-17 17:27:35 UTC
Permalink
Bill Stewart <***@ix.netcom.com> writes:
> I'll duck the 5.5 argument for now, but you're
> incorrect on 5.0. The PGP 5.0 key format includes
> separate keys for signature and privacy, which is a mostly good thing,

A very good thing.

> and includes the ability to associate a group of
> keyIDs and flag bits with each privacy key.

So can you use pgp5.0 to construct a CMR key which would
interoperate with pgp5.5 for business?

> (or each set of privacy key + signature key?? Jon?)
> What semantics are attached to this association are dependent
> on the computer program, and PGP 5.0 does nothing CMRish with them,
> much less GAKish.

Clearly pgp5.0 does nothing with these flags on receipt, but does it
understand them when sending?

Scenario: I work for Mega Corp, and it is using pgp5.5, and is using
CMR with companies key.

Does pgp5.0 reply encrypting to just me as individual, or two crypto
recipients me, and Mega Corp recovery key?

I would be interested on clarification of this point.

> OpenPGP could do anything it wants with the field,
> such as use it for keyids (or IPv4 addresses, since it's 32 bits)
> of keyservers with copies of the keys, or whatever.
> PGP 5.6 could at least take the case of multiple keyids
> and secret-share the session key between them rather than
> allowing any single key to access it.

Yes, I'm not so much arguing about the flexibility or the mechanism,
as what it is being used for. I consider there are more secure and
more GAK-resistant ways to achieve the same functionality.

> >- store a copy of the private half of the users PGP encryption key
> > encrypted to the company data recovery key on the users disk.
> No. This is evil. Don't go there. Even with secret sharing,
> and especially without.

It is evil. But it is not _as_ evil.

The reason for this is that government access to storage keys is not
as evil as government access to communications keys, because the
government has to come and capture the ciphertext (take your disk),
whereas with communications they can grab them via an arrangement with
your ISP.

> >- (optional) design the software to make it hard to copy the data
> > recovery packet from the disk, hide the data, bury it in keyrings,
> > stego encode it, whatever, use your imagination.
> This is secutity by obscurity; not highly useful.

Indeed. It is of only marginal use.

> >Possible objections:
> >objection #1. what if disk burns?
> >counter #1: backup your disk
> >objection #2: users don't back up disks
>
> Objection 3 - users _do_ back up disks -
> that means every backup tape or disk or server volume
> potentially contains your disk's encryption keys
> CAKked up with the Corporate Master Key, which is Bad,
> and wildly out of the user's control, violating
> one of your principles.

See point above.

This is not avoidable for storage ... if you are encrypting data on
disk, and if you want recovery information, you have no choice but to
allow company access.

The recovery information should be decentralised as much as possible.
(Perhaps this should be the Stewart corollary).

The point though is that storage recovery is a completely separable
issue from communications "recovery", which is a euphemism for
allowing companies to read, or snoop on, employees' email -- unless it
is being used solely for data recovery of mail stored in mail folders
(which seems to be what PGP Inc means by the CMR term), in which case
it is not necessary functionality, and can be better achieved by
encrypting the mail archive with a user symmetric key with company
storage recovery on that key.

I would be interested to hear novel ways to minimise the likelihood
that the government could walk into your offices, grab the CD
jukebox, grab the single corporate recovery key and walk off.

Decentralisation is one good way -- recovery keys are in hands of
department and group heads -- secret splitting is another good way --
and (your suggestion) omitting some of the recovery bits is a small
additional hindrance to intruders after the plaintext.

Adam
--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
Bill Stewart
1997-10-18 08:49:19 UTC
Permalink
At 06:27 PM 10/17/1997 +0100, Adam Back wrote:
>Bill Stewart <***@ix.netcom.com> writes:
>> I'll duck the 5.5 argument for now, but you're
>> incorrect on 5.0. The PGP 5.0 key format includes
>> separate keys for signature and privacy, which is a mostly good thing,
>A very good thing.

It's very good, but it does have its own risks.
If you use the same keys for signature and encrypting to you,
and the government wants to GAK your encryption keys,
they have to steal your signature keys also,
which just about everybody agrees is Bad, and it's
simply unacceptable in a business environment.
On the other hand, if the keys are separate, Louis Freeh
can tell the Congress that it's not a big problem,
he'd NEVER dream of GAKing your signature keys,
he only wants emergency access to your privacy keys --
this may give the GAK folks a better chance of getting it.
Similarly, your corporate security bureaucrats can understand
the concept that if they CAK your privacy keys,
they're risking having official company signatures get forged,
and they'll often do the right thing and desist,
but with separate keys that won't stop them.


> The PGP 5.0 key format [....]
>> and includes the ability to associate a group of
>> keyIDs and flag bits with each privacy key.
>So can you use pgp5.0 to construct a CMR key which would
>interoperate with pgp5.5 for business?

No. As far as I know (without reading 7000 pages of code :-),
pgp5.0 won't construct CMR key fields, and won't use them,
it just isn't supposed to die if it receives a key containing them.

>Clearly pgp5.0 does nothing with these flags on receipt,
>but does it understand them when sending?

If it receives a key containing CMRKs, it doesn't choke on the CMRKs,
and when you encrypt a message to the key, it ignores the CMRKs.
If it receives a message encrypted to both real people and CMRKers,
there's no way to tell which are which (though if you don't have
CMRKs in your key, they're obviously not your CMRKs.)

>Does pgp5.0 reply encrypting to just me as individual, or two crypto
>recipients me, and Mega Corp recovery key?
Just you.

>> >- store a copy of the private half of the users PGP encryption key
>> > encrypted to the company data recovery key on the users disk.
>> No. This is evil. Don't go there. Even with secret sharing,
>> and especially without.
>
>It is evil. But it is not _as_ evil.
>
>The reason for this is that government access to storage keys is not
>as evil as government access to communications keys, because the
>government has to come and capture the ciphertext (take your disk),
>whereas with communications they can grab them via an arrangement with
>your ISP.

First of all, the only reason for having a CMRK attached to your key
is that either your mail service will reject mail to you that doesn't
contain it, or your employer insists on it. In either case,
it can be done without a special CMRK field on your key --
PGP multiple recipients are enough to do that, and the sender
just has to remember to include the (no longer automagically attached) CMRK.
So leaving out the CMRK doesn't protect you.

Second, if the government is coming to get your disk anyway,
they can get themselves a court order to have you reveal the key,
and you can argue with the judge about whether you should be
compelled to reveal it, and at least in the US there's a
Fifth Amendment backing up your arguments (though like the
other amendments, it's weakened by the "except for drugs" clause...)
GAK asserts that the government has the right to your keys
before you get to court. Corporate access to storage keys,
on the other hand, is concerned with protecting the company's
information on the company's computers, and you can reasonably
negotiate how much of that you want to live with and comply with.
Some companies want to protect their information in case you
get hit by a bus, or a lawsuit; other companies don't even
have the sense to provide decent automated network-based backup
to protect their information from head crashes.

On yet another hand, while it may be obvious when the
government steals your disk and uses Storage-GAK,
companies using Storage-CAK or Storage-CMR can use it
just as well on the backup tapes without your notice
as on your disk drive. Furthermore, you can think of
the data backup process as communications from you
to the backupmeister, so Storage-CAK _is_ Message-CAK,
and Storage-CMR is Message-CMR.

But CAKing the disk doesn't protect the company's information,
and there's therefore no excuse for using it. Superencryption
is always possible, in messages as well as files, but with
message encryption the eavesdropping-prone corporation can
detect superencrypted messages going by (though not stego'd),
while PGP-encrypted files on your disk only show up _after_
you've been hit by the bus on your way to the headhunter's.

BTW, PGP5.5 CMR _is_ CMR'd storage encryption.
It's not as convenient as encrypted file systems
like PGPdisk and Secdev,
but people are using it to encrypt stored data,
including email and non-email files.

On a technical note, GAK for storage can be made less dangerous,
though not less offensive, by adding a layer of indirection -
use your public key to encrypt a symmetric key, store the encrypted
symmetric key on your disk, and then use the symmetric key for
encrypting the storage (or as a master key for encrypting the
per-file or per-block storage keys, if you're doing that,
which you probably should.) This means that a search warrant
which is required to itemize the things it's looking for can be
more effectively restricted to specific files rather than
cracking the whole disk and every other disk that uses the
same encryption keys.
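(That indirection layer can be sketched in a few lines. This is a toy illustration only -- the variable names are mine, and the hash-counter "cipher" stands in for whatever real symmetric cipher an implementation would use:)

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data with a SHA-256 counter keystream.
    Encryption and decryption are the same operation. Illustration only."""
    out = bytearray()
    for i in range(-(-len(data) // 32)):           # ceil(len/32) blocks
        pad = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        chunk = data[i * 32:(i + 1) * 32]
        out += bytes(a ^ b for a, b in zip(chunk, pad))
    return bytes(out)

# One master storage key, wrapped under the user's key and stored on disk;
# per-file keys are wrapped under the master key, never the user's key directly.
user_key = secrets.token_bytes(32)        # stands in for the public-key layer
master_key = secrets.token_bytes(32)
wrapped_master = keystream_xor(user_key, master_key)

file_key = secrets.token_bytes(32)
wrapped_file_key = keystream_xor(master_key, file_key)  # stored with the file
ciphertext = keystream_xor(file_key, b"quarterly report")

# A warrant scoped to one file needs only that file's key, not the whole disk:
recovered_master = keystream_xor(user_key, wrapped_master)
recovered_file_key = keystream_xor(recovered_master, wrapped_file_key)
assert keystream_xor(recovered_file_key, ciphertext) == b"quarterly report"
```

The point of the extra hop is exactly the one above: disclosure can be confined to individual per-file keys, so surrendering one key does not crack every disk that shares the same master.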

>This is not avoidable for storage ... if you are encrypting data on
>disk, and if you want recovery information, you have no choice but to
>allow company access. The recovery information should be
>decentralised as much as possible.

>The point though is that storage recovery is a completely separable
>issue from communications "recovery" which is a euphamism for allowing
>companies to read, or snoop, employees email, unless it is being used
>soley for data recovery of mail stored in mail folders (which seems to
>be what PGP Inc means by CMR term), in which case it is not necessary
>functionality, and can be better acheived by encrypting the mail
>archive with a user symmetric key with company storage recovery on
>that key.

Trust me - you _really_ don't want mailboxes encrypted,
recovery key or no recovery key, unless it's implemented very very well.
Microsoft Mail, and as near as I can tell Microsoft Exchange,
puts the user's entire mailbox, stored message folders and all,
in one big ugly cheaply-encrypted file. The encryption isn't
strong enough to keep the NSA out, but it's strong enough to keep
you from repairing the file if part of it gets damaged,
and enough to keep you from extracting the undamaged parts,
or accessing it with sorting tools not built into MSMail.
Combined with the Microslush Mail Mindset of never sending text
when a Microsoft Word file could do, and never sending Word
when an even-more-bloated PowerPoint Presentation can fit,
your mailbox easily expands to over 100MB, too big to fit
on a ZIP drive. Eventually, something always gets corrupted,
and you end up with Corporate Message Non-Recoverability.


Thanks!
Bill
Bill Stewart, ***@ix.netcom.com
Regular Key PGP Fingerprint D454 E202 CBC8 40BF 3C85 B884 0ABE 4639
Adam Back
1997-10-18 21:17:44 UTC
Permalink
Bill Stewart <***@ix.netcom.com> writes:
> > [separate signing and encryption keys]
>
> On the other hand, if the keys are separate, Louis Freeh
> can tell the Congress that it's not a big problem,
> he'd NEVER dream of GAKing your signature keys,

Valid argument with some value yes. Actually now that you bring it up
I dimly remember this aspect being raised on cypherpunks some time
back. Perhaps it was you who raised it even. Or perhaps it was Phil
Karn, or someone.

> Similarly, your corporate security bureaucrats can understand
> the concept that if they CAK your privacy keys,
> they're risking having official company signatures get forged,
> and they'll often do the right thing and desist,
> but with separate keys that won't stop them.

Hmmm. I think that if companies start encrypting anything, they're
going to need recovery of some sort. Else they just won't use it at
all.

Most of them aren't using storage encryption right now anyway, which
is why Tim May's suggestion to store emails after decrypt in the clear
makes a lot of sense as a simple interim way out of this problem until
the issues have been explored more.

This largely avoids company requirement to recover emails.

One valid objection to this approach is that if the employee forgets
their password whilst there are lots of emails queued up to receive
that those emails will be lost. Or if the employee gets hit by a
truck.

However I'm not so sure this is a big deal for a number of reasons:

1. how often do employees get hit by trucks?
2. how often do employees forget passwords? (all the time unfortunately)
3. things which are important, the sender is likely able to resend, because
he has them in the clear on disk
4. senders using the same software can have archives of what they have sent
and are easily able to resend.

> >Does pgp5.0 reply encrypting to just me as individual, or two crypto
> >recipients me, and Mega Corp recovery key?
> Just you.

If you are right, it will bounce when it hits an enforcer with the
strict setting turned on.

Is this what will happen?

This will mean that the users will have to manually figure out how to
solve (get enforcer key, multiple encrypt to that key, possibly by
cc'ing to enforcer email address/userID even if this bounces).

> >It is evil. But it is not _as_ evil.
> >
> >The reason for this is that government access to storage keys is not
> >as evil as government access to communications keys, because the
> >government has to come and capture the ciphertext (take your disk),
> >whereas with communications they can grab them via an arrangement with
> >your ISP.
>
> First of all, the only reason for having a CMRK attached to your key
> is that either your mail service will reject mail to you that doesn't
> contain it, or your employer insists on it. In either case,
> it can be done without a special CMRK field on your key --
> PGP multiple recipients are enough to do that, and the sender
> just has to remember to include the (no longer automagically attached) CMRK.
> So leaving out the CMRK doesn't protect you.

Lack of automation is some weak protection. What are users to do?
Send Cc: <***@nsa.gov>. What if they forget? Much more
plausible to forget, which makes strict penalties for forgetting
difficult in western countries. 5 years jail time for forgetting?
Don't think so.

What about the traffic? If you make it an invalid address, just to
pick up the key it'll flood email systems with bounces; every single
encrypted mail will get a bounce.

Not an overwhelming protection, but it may mean more people will
resist it, and more people will forget often, given the currently
deployed email MUA base.

With many people using CMR based pgp5.5, forgetting will be much less
plausible.

Putting in CMR encourages adoption of this kind of filtering/bouncing
approach.

I am interested in analysis and discussion of how significant this
difference is to user uptake of GAK in, say, the US (say some time in
1998, when many more businesses and individuals have upgraded to
pgp5.x). What differences in resistance are there between a pgp5.5
using CMR and a pgp5.6 using CDR (or storage-CAK, as Bill terms the
approach)?

> Second, if the government is coming to get your disk anyway,
> they can get themselves a court order to have you reveal the key,
> and you can argue with the judge about whether you should be
> compelled to reveal it, and at least in the US there's a
> Fifth Amendment backing up your arguments (though like the
> other amendments, it's weakened by the "except for drugs" clause...)

Yes. This is the kind of reason I argue that storage recovery is less
dangerous than communications recovery. They have to get the disk
first. And they can't tell whether you have used GAK until they get
it; if you suspect they will try this, you will ensure there is no GAK
access and use other software, as you have nothing to lose.

> GAK asserts that the government has the right to your keys
> before you get to court.

Well, that is the scenario Freeh is arguing for, yes. Faced with a
choice of giving him my comms keys or my storage keys, I'd go for
storage keys any day. I can lie to him and give him some random
numbers, and he'll never know. The point at which he will know is
after the dawn raid; if it gets to dawn raids you are in trouble
independently anyway.

> On yet another hand, while it may be obvious when the government
> steals your disk and uses Storage-GAK, companies using Storage-CAK
> or Storage-CMR can use it just as well on the backup tapes without
> your notice as on your disk drive. Furthermore, you can think of
> the data backup process as communications from you to the
> backupmeister, so Storage-CAK _is_ Message-CAK, and Storage-CMR is
> Message-CMR.

Technically yes. Practically no; there is a large difference. The
extra protection of the Storage-CAK method is the extra GAK resistance
due to the fact that availability of access to communications is
patchy and more expensive for the government to achieve, enforcement
is patchy, and even detection is patchy. This patchiness is good.
Mass keyword scanning is impossible on a wide scale. The government
can keyword scan some of you, but they can do that already at similar
cost levels: they can plant bugs, have an undercover federal
investigator infiltrate your company in the guise of an employee, etc.
The aim of the game is to make the cost higher than existing physical
attacks, or at least as high as possible.

> But CAKing the disk doesn't protect the company's information,
> and there's therefore no excuse for using it.

Surely it does?

Say you are in ACME Corp and they want all disks encrypted as a
security policy. They provide smart cards to employees, and
workstation data is inaccessible until the correct smart card is
inserted. An employee lets the dog chew on his smart card. With no
recovery, that data is irretrievably lost.

> Superencryption is always possible, in messages as well as files,
> but with message encryption the eavesdropping-prone corporation can
> detect superencrypted messages going by (though not stego'd), while
> PGP-encrypted files on your disk only show up _after_ you've been
> hit by the bus on your way to the headhunter's.

This is good. Both systems are hackable from all three directions
(individuals can hack around the system to increase privacy;
corporations can hack around it to decrease privacy (keyboard
sniffer); governments can too, by walking in, taking disks, and
threatening people with jail time for not handing over keys). Many
aspects of this are better with storage data recovery than with
communications recovery, especially the government aspects. The
company aspects are an almost neutral non-issue in my mind, given the
ease with which the company can remove your privacy: they own the
machines, and can do all sorts of things to your software and hardware
(video cam, bugged phone, keyboard sniffer, keyboard log, etc., etc.).

> BTW, PGP5.5 CMR _is_ CMR'd storage encryption.
> It's not as convenient as encrypted file systems
> like PGPdisk and Secdev,
> but people are using it to encrypt stored data,
> including email and non-email files.

Yes. pgp5.0, whose windows version I looked at (available on
ftp://ftp.replay.com/ (netherlands) somewhere, for other non-US
people), does file and email encryption. As does the linux version.

> On a technical note, GAK for storage can be made less dangerous,
> though not less offensive, by adding a layer of indirection -
> use your public key to encrypt a symmetric key, store the encrypted
> symmetric key on your disk, and then use the symmetric key for
> encrypting the storage (or as a master key for encrypting the
> per-file or per-block storage keys, if you're doing that,
> which you probably should.) This means that a search warrant
> which is required to itemize the things it's looking for can be
> more effectively restricted to specific files rather than
> cracking the whole disk and every other disk that uses the
> same encryption keys.

Good point.
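Bill's layer of indirection can be sketched concretely. The following
is a toy Python illustration, not anything from PGP: the XOR-based
wrap() is an insecure stand-in for real public-key and symmetric
encryption, and all the names (master_key, stored_file_key, etc.) are
invented for the example.

```python
import hashlib
import os

def keystream(key, nonce, n):
    # Toy keystream from SHA-256(key || nonce || counter); a stand-in
    # for a real cipher, NOT secure in itself.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def wrap(key, data, nonce):
    # XOR "encryption"; wrapping and unwrapping are the same operation.
    return bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))

# The master storage key is stored only wrapped under the user's
# public key (simulated here by another symmetric key).
public_key_equiv = os.urandom(32)
master_key = os.urandom(32)
stored_master = wrap(public_key_equiv, master_key, b"master")

# Each file gets its own key, stored wrapped under the master key.
file_key = os.urandom(32)
stored_file_key = wrap(master_key, file_key, b"file-0001")
ciphertext = wrap(file_key, b"quarterly figures", b"data-0001")

# A search warrant naming one file can be satisfied by disclosing
# file_key alone; the master key and every other file key stay private.
recovered = wrap(file_key, ciphertext, b"data-0001")
assert recovered == b"quarterly figures"
```

The point of the structure is that disclosure can be scoped to a
single per-file key rather than the key that cracks the whole disk.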

> >The point though is that storage recovery is a completely separable
> >issue from communications "recovery", which is a euphemism for allowing
> >companies to read, or snoop on, employees' email, unless it is being used
> >solely for data recovery of mail stored in mail folders (which seems to
> >be what PGP Inc means by the CMR term), in which case it is not necessary
> >functionality, and can be better achieved by encrypting the mail
> >archive with a user symmetric key with company storage recovery on
> >that key.
>
> Trust me - you _really_ don't want mailboxes encrypted,
> recovery key or no recovery key, unless it's implemented very very well.

PGP Inc does use this, otherwise there would be no argument about the
need for recovery information -- you don't need recovery information
for plaintext.

> [100Mb mail folder corruption nightmares]

Your example of corruption problems with large mail boxes is one
argument against this practice.

Adam
--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
Richard Johnson
1997-10-15 23:34:31 UTC
Permalink
At 02:34 -0700 on 10/15/97, Will Price wrote:
> Adam:
> ...
> I'm truly amazed that you would attack in such a spiteful fashion a simple
> system which adds a recipient-requested, sender-approved extra recipient
> which is end-to-end wherein all recipients are under the sender's control
> and each recipient knows who can read the message with no key escrow using
> the same old PGP message format we all know and love without change, and
> yet you propose a much less secure system which allows hiding critical
> information from the sender and does not adequately perform its stated
> purpose of data recovery.


I don't see Adam's proposals as spiteful attacks. His proposed
alternatives may or may not do the job, but I believe they are honest
attempts to provide for corporate data recovery without enabling a greater
problem.

Enforcing encryption to a 3rd-party key, in addition to the intended
recipient's key, is bad.

Doing so makes GAK easier for a government or other attacker to manage, as
they'll only have to handle thousands of corporate recovery keys, not
millions of individual keys.

Doing so also provides easy hooks to mandate such access. In the USA, for
example, broadcast messages usually have no legal expectation of privacy.
Encrypting to a general recipient may cause messages to fall into that
category, and thus require no warrant for interception (but IANAL).

Doing so also opens up new avenues for illicit third-party snooping. With
enforced (or merely requested) encryption to 3rd-party keys, it may be
possible, depending on the implementation, to engineer a "man-on-the-side"
attack in order to snoop on message content. At the very least, it's
something else for an implementation to trip over.

Finally, doing so creates a higher value target key, the recovery key. An
attacker who can, for example, social-engineer the passphrase for that key
out of an executive assistant can thus achieve a greater payoff.


For those basic reasons, I would prefer to keep open-pgp simple. Perhaps
specify that conforming implementations must not enforce encryption of
messages to 3rd-party keys, but at the very least simply leave any kind of
specific 3rd-party key specification and enforcement out of the standard.


Richard
Richard Johnson
1997-10-15 18:09:22 UTC
Permalink
At 02:34 -0700 on 10/15/97, Will Price wrote:
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA1
>
> Adam:
> ...
> The design you have been espousing for the last week or so in your many
> messages takes the power out of the hands of the sender and encourages
> automated violations of the sender's privacy by the recipient (perhaps even
> unbeknownst to the recipient). ...


This is simply a reflection of reality. The sender has little real control
over what the recipient actually does with any message. If the recipient
shares the information content, the sender is basically limited to civil or
criminal sanctions after that sharing becomes evident.

Designing a standard for encrypted communications that attempts to fight
that fact will likely be wasted effort.


Richard
s***@ix.netcom.com
1997-10-15 18:38:56 UTC
Permalink
At 02:34 AM 10/15/1997 -0700, Will Price, probably annoyed at the
flaming that PGP5.5 has received, flames out Adam Back's proposal
for not doing the right things for his perceived market.
But PGP5.5 doesn't do most of those same things either.

>First, let me state some overriding design goals of a
>data recovery system required to ensure privacy:
>the sender must know and consent to every key
>that will be able to read the message during its lifetime,

As Ian and Ian have pointed out, this is bogus -
the sender can't control the recipients' uses of the message.
At most the sender has a right to control who receives the
message _until_ the recipient gets it,
and if they don't trust the recipient, they shouldn't send it.

>the encryption must be end-to-end, and
This is fine.

>the recipient must know exactly who else can decrypt the message.

PGP 5.5 does not provide this; the message headers only provide KeyID
for each recipient, not the person using the keys, so the recipient
only knows those recipients whose KeyIDs he knows or can look up.
(Plus deadbeef attacks make even those suspect.)
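The KeyID point is visible in the message format itself: each
public-key-encrypted session key packet carries only an 8-octet KeyID,
not any user identity. A rough Python sketch of pulling KeyIDs out of
old-format packets follows; the packet bytes at the end are synthetic,
constructed purely for illustration.

```python
def pkesk_keyids(data):
    # Extract 8-octet KeyIDs from old-format PGP packets
    # (tag 1 = public-key-encrypted session key). Handles only
    # 1- and 2-octet length types; a sketch, not a full parser.
    keyids = []
    i = 0
    while i < len(data):
        ctb = data[i]
        if not ctb & 0x80:
            break  # not a valid packet header
        tag = (ctb >> 2) & 0x0F
        ltype = ctb & 0x03
        if ltype == 0:
            length = data[i + 1]
            i += 2
        elif ltype == 1:
            length = int.from_bytes(data[i + 1:i + 3], "big")
            i += 3
        else:
            break  # longer/indeterminate lengths omitted in this sketch
        body = data[i:i + length]
        if tag == 1 and len(body) >= 9:
            # body: version (1 octet), KeyID (8), algorithm (1), MPI(s)...
            keyids.append(body[1:9].hex())
        i += length
    return keyids

# Synthetic packet: tag 1, 1-octet length (10), version 3,
# KeyID 0xDEADBEEFDEADBEEF, algorithm 1 (RSA), no MPIs.
pkt = bytes([0x84, 10, 3]) + bytes.fromhex("deadbeefdeadbeef") + bytes([1])
print(pkesk_keyids(pkt))  # ['deadbeefdeadbeef']
```

All a recipient learns from the header is that hex string -- which, as
noted above, a deadbeef attack can forge anyway.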

>In your model, the recipient automatically
>decrypts and then re-encrypts to a data recovery key -- even though
>end-user computers are likely to be insecure thus making this decrypt &
>reencrypt step rather specious at best.

Again, this is bogus - if the recipient's computer is insecure,
then the data is insecure from the moment the recipient decrypts it,
a step that even PGP5.5 does not usually prevent :-)

>As an actual data recovery system, it also fails fundamental tests.
>If I encrypt critical data to a colleague wiping it from my system
>after sending, then the colleague is incapacitated before receipt
>and processing of the message, the data can never be retrieved.

Bogus. If you send critical data to a colleague entirely in the clear,
and sendmail eats it instead of delivering it to your colleague,
or the colleague's mailbox disk drive crashes before he reads it,
and you have wiped the only copy before confirming receipt of the message,
you lose and PGP5.5 won't help any, since neither PGP5.5 nor the
PGP SMTP filters cause extra copies to be created.
If you do this sort of thing often, you need a new definition of critical.

What you need is automated message receipts, and the decryption system
needs to offer your recipient a user-friendly way to send receipts
when he actually reads the message. PGP5.5 doesn't do this
(not its job - it's an encryption program, not a mail user agent)
and the mail client's receipts aren't enough, since they don't know
if your recipient could decrypt the message successfully
(nor whether they could read the language it was written in,
nor whether the contents made any sense,
nor whether the recipient agreed with the content of the message. :-)

>I'm truly amazed that you would attack in such a spiteful fashion

If you can't tell serious concerns from spite, your ego's in the way
(you've probably been reading too many negative reviews lately. :-)
There are serious problems with the PGP5.5 approach, even though it
does solve real business problems that some of your real customers have,
or at least think they have.

>a simple system which adds a recipient-requested, sender-approved
>extra recipient which is end-to-end wherein all recipients are
>under the sender's control and

Only the choice of keys is under the sender's control,
not the knowledge of what actual _people_ hold those keys,
what the CMRKers will do with the data, or even whether they
have received copies, unless she mails them copies directly.
In the PGP SMTP filter context, if the sender must include
certain keys to get the message delivered, that's a rather
limited definition of "under the sender's control".

> each recipient knows who can read the message with no key escrow

As above, the recipients don't know each other unless
they happen to have each others' KeyIDs on their keyservers,
and PGP5.5 "elegantly" doesn't indicate whether a recipient
is there because the sender wanted them or because they're CMRKers.
Sure, if there's only one real recipient, both sender and recipient
know the eavesdroppers, but if there's more than one real recipient,
there's no way to tell.

>using the same old PGP message format we all know and love without change,
If you count the 5.0 message format as "old" :-) And while the
CMRK fields apparently were in the 5.0 key record formats, they
weren't used, and the semantics are much different even if the syntax
is the same. Treating desired recipients and undesirable recipients
the same is one approach, but it doesn't accommodate people who want
secret sharing to prevent a single CMRKer from recovering the message.
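The secret sharing mentioned above can be as simple as an n-of-n XOR
split of the recovery key, so that no single CMRKer can recover
anything alone. A minimal Python sketch, with function names invented
for the example (a k-of-n threshold would need Shamir's scheme
instead):

```python
import os
from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret, n):
    # n-of-n XOR split: n-1 random shares, plus a final share equal to
    # secret XOR all the others. Any fewer than n shares reveal
    # nothing about the secret (each subset looks uniformly random).
    shares = [os.urandom(len(secret)) for _ in range(n - 1)]
    last = reduce(xor_bytes, shares, secret)
    return shares + [last]

def combine(shares):
    # XOR of all n shares recovers the secret.
    return reduce(xor_bytes, shares)

recovery_key = os.urandom(16)
shares = split(recovery_key, 3)
assert combine(shares) == recovery_key
```

With the shares held by three separate officers, recovery requires all
three to cooperate, which blunts both the insider-abuse and the
single-subpoena risks Bill alludes to.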

>and yet you propose a much less secure system which allows hiding
>critical information from the sender and does not adequately perform
>its stated purpose of data recovery.
I'm not flaming Adam's proposal here; that's a job for another message :-)
In particular, it seems to be evolving, and I haven't figured it out yet,
nor am I convinced there is a way to adequately perform the purposes
of data recovery.
Thanks!
Bill
Bill Stewart, ***@ix.netcom.com
Regular Key PGP Fingerprint D454 E202 CBC8 40BF 3C85 B884 0ABE 4639