RE: ISSUE-128: Strong / weak algorithms? [Techniques]

One thought is that a standards body takes an approach like the DoD
took and creates robustness standards. This way everyone knows what is
expected.

For US Federal systems (specifically DoD), robustness levels for
security capabilities were created and published in the document titled
DoD Instruction 8500.2. In this document, security capabilities
including ciphers and settings were grouped into basic, medium, and
high robustness.

This document is a nosebleed, but as far as I can tell it is a public
document and can be obtained through the following link. Could not find
any classification markings to note restrictions.

The most secure public or commercial systems seem to fall somewhere
between basic and medium robustness.


-----Original Message-----
[] On Behalf Of Luis Barriga
Sent: Wednesday, October 17, 2007 11:21 AM
Subject: RE: ISSUE-128: Strong / weak algorithms? [Techniques]

No, I don't want (WSC) to go there. Let experts do that if possible.
That's why I wrote at the end of the same email:

>>One way forward is to ask Ecrypt folks to produce such TLS suite
>>recommendation since the step is not that far using as baseline
>>their current crypto recommendations.

Thus, I agree that we should provide a reference to authoritative
documents recommending ciphers (e.g. IETF, ECRYPT) and cipher suites
(if/when available). We should not provide the cipher
recommendations themselves, except for general established security
principles such as "the weakest cipher in the suite determines the
overall cipher strength".


-----Original Message-----
From: Yngve Nysaeter Pettersen [] 
Sent: den 17 oktober 2007 16:09
To: Luis Barriga;;;
Subject: Re: ISSUE-128: Strong / weak algorithms? [Techniques]

On Wed, 17 Oct 2007 15:10:05 +0200, Luis Barriga
<> wrote:

> I checked those docs and they are still focused on each cipher
> independently. The Ecrypt paper is closer (compared to FIPS) to what I
> think we need but still not there.
> The point I'm trying to make is that we need some recommendations not
> at the *separate cipher* level, but at the *cipher suite* level so the
> combination of public and symmetric is also consistent. Correct me if
> I'm wrong...

Are you sure you want to go there? It could get messy. There was a
heavy discussion on the TLS WG last week about this issue.

The thing about the suites is that the symmetric cipher and the digest
operation are the only fixed-size items in the suite. The key exchange
method has no fixed size, save whatever minimum the exchange structure
causes the size of the key to be.
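
To make that concrete, here is a minimal sketch (my own illustration, not anything normative) that splits an IETF-style suite name such as TLS_DHE_RSA_WITH_AES_256_CBC_SHA into its key-exchange, cipher, and digest parts; note that the key exchange size appears nowhere in the name:

```python
def parse_suite(name):
    """Split an IETF-style cipher suite name into its components.

    Assumes the common TLS_<keyexchange>_WITH_<cipher>_<digest> pattern;
    the key exchange method carries no key size in the name.
    """
    kx, _, rest = name.partition("_WITH_")
    kx = kx.replace("TLS_", "", 1)
    cipher, _, digest = rest.rpartition("_")
    return {"key_exchange": kx, "cipher": cipher, "digest": digest}

print(parse_suite("TLS_DHE_RSA_WITH_AES_256_CBC_SHA"))
# {'key_exchange': 'DHE_RSA', 'cipher': 'AES_256_CBC', 'digest': 'SHA'}
```

The cipher (AES_256) and digest (SHA) fix their sizes; nothing in the name constrains the DHE or RSA key lengths.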

My approach to this is to say that the weakest method or keysize
(mapped into an equivalence map) trumps the security evaluation, meaning
that 1024-bit DHE with 512-bit RSA and 256-bit AES is considered very
insecure, while a similar 1024/1024/256 suite is (still) considered fairly
secure.
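
That weakest-link rule can be sketched in a few lines. The equivalence values below are illustrative approximations (loosely following published symmetric-equivalence tables such as NIST SP 800-57), not normative numbers:

```python
# Map asymmetric (RSA/DHE) key sizes to rough symmetric-equivalent
# strengths in bits; values are illustrative approximations only.
ASYM_EQUIV = {512: 56, 1024: 80, 2048: 112, 3072: 128}

def equivalent_bits(kind, size):
    """Approximate symmetric-equivalent strength of one suite component."""
    if kind in ("rsa", "dhe"):
        # Round down to the nearest known equivalence entry.
        known = [k for k in sorted(ASYM_EQUIV) if k <= size]
        return ASYM_EQUIV[known[-1]] if known else 0
    return size  # symmetric ciphers count at face value

def suite_strength(dhe_bits, rsa_bits, sym_bits):
    # The weakest component trumps the overall evaluation.
    return min(equivalent_bits("dhe", dhe_bits),
               equivalent_bits("rsa", rsa_bits),
               equivalent_bits("sym", sym_bits))

# 1024-bit DHE with a 512-bit RSA key and AES-256: the RSA key dominates.
print(suite_strength(1024, 512, 256))   # 56 -> very weak
# 1024/1024/256: still capped by the 1024-bit asymmetric keys.
print(suite_strength(1024, 1024, 256))  # 80 -> borderline
```

The point of the map is that a 256-bit symmetric cipher buys nothing when the key exchange or server key is the weak link.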

Then there is the fact that some security parameters are not
part of the ciphersuite selection, but are defined by the keys used in
the certificate chain.

IMO the best way to phrase a recommendation would be to reference the
most up-to-date version of these documents, and suggest that
applications only consider secure IETF-defined cipher suites that are
composed of methods (and associated keysizes) that are considered
secure for at least 10 or 15 years by these documents. We might want to
specifically discourage use of some suites, such as the anonymous and
authentication/integrity-only suites, just to be on the safe side.

Which methods can be considered secure is really a bit too flexible
to cast in stone, even with a lot of disclaimers.

> For example, the NIST document recommends the following ciphers
> ...
> One way forward is to ask Ecrypt folks to produce such TLS suite
> recommendation since the step is not that far using as baseline their
> current crypto recommendations.
> Luis
> -----Original Message-----
> From: 
> []
> On Behalf Of Yngve N. Pettersen (Developer Opera Software ASA)
> Sent: den 17 oktober 2007 13:25
> To: Luis Barriga;; 
> Subject: Re: ISSUE-128: Strong / weak algorithms? [Techniques]
> Any reason why the result of ACTION-285 doesn't suffice?
> On Wed, 17 Oct 2007 13:06:50 +0200, Luis Barriga 
> <> wrote:
>> FIPS main audience is *crypto* implementors. It seems too low level 
>> and thus doesn't seem to be the primary document to refer to.
>> We need to refer to some authoritative document(s) recommending TLS 
>> suites to web site *security* administrators so they can decide which
>> ones to enable/disable when deploying TLS-enabled web sites. I don't
>> think administrators would get that much help digging into FIPS.
>> NIST has such a document, but as I mentioned it is for governmental
>> use, which excludes RC4, which as far as I know (?) is widely
>> deployed due to its high performance.
>> Luis
>> -----Original Message-----
>> From: 
>> []
>> On Behalf Of
>> Sent: den 17 oktober 2007 00:02
>> To:;
>> Subject: RE: ISSUE-128: Strong / weak algorithms? [Techniques]
>> It might be better in a W3C standard to reference the international 
>> equivalents of FIPS 140.
>> The FIPS 140-1 equivalent is ISO/IEC FCD 19790 "Security requirements
>> for cryptographic modules".
>> Last I heard, FIPS 140-2 was the US input document to an NP recently
>> approved by CS1.  At that time it had not yet been assigned an 
>> ISO/IEC number, but maybe that has changed.
>> Mike
>> -----Original Message-----
>> From: 
>> []
>> On Behalf Of Anil Saldhana
>> Sent: Tuesday, October 16, 2007 3:08 PM
>> To: Web Security Context Working Group WG
>> Subject: Re: ISSUE-128: Strong / weak algorithms? [Techniques]
>> FIPS 140-2 is the defining standard for cryptology (at least in the US).
>> Maybe we can use that as the frame of reference in the rec doc?
>> Doyle, Bill wrote:
>>> There are a number of standards bodies that we can point to that note 
>>> recommended strengths.
>>> In the US, the National Institute of Standards and Technology (NIST)
>>> provides the clearing house for recommended practices. Systems
>>> follow Federal Information Processing Standards (FIPS) or FIPS
>>> publications.
>>>     *From:*
>>>     [] *On Behalf Of
>>>     Phillip
>>>     *Sent:* Tuesday, October 16, 2007 11:33 AM
>>>     *To:* Thomas Roessler
>>>     *Cc:* Luis Barriga; Web Security Context Working Group WG
>>>     *Subject:* RE: ISSUE-128: Strong / weak algorithms?
>>>     I would prefer not to make a recommendation here since it is
>>>     document that I would want to keep continuously updated.
>>>     There is a strong industry consensus here and what we need to
>>>     is to ensure that it is widely recognized as such and have a
>>>     mechanism to alert people when the consensus changes (e.g. the
>>>     results on SHA-1).
>>>     *From:* Thomas Roessler []
>>>     *Sent:* Tue 16/10/2007 4:08 AM
>>>     *To:* Hallam-Baker, Phillip
>>>     *Cc:* Luis Barriga; Web Security Context Working Group WG
>>>     *Subject:* Re: ISSUE-128: Strong / weak algorithms?
>>>     On 2007-10-15 20:26:04 -0700, Phillip Hallam-Baker wrote:
>>>     > I don't think we should write an exhaustive list of strong
>>>     > ciphers. The most we should do is to note that there is a set of
>>>     > ciphers that the consensus recognizes as being acceptably strong,
>>>     > which should be supported.
>>>     I'd rather we either reference some known-authoritative document
>>>     that is being maintained elsewhere (because I don't see us taking
>>>     on that kind of document maintenance role for this particular
>>>     problem).
>>>     The second-best approach might be to say "these are known bad
>>>     [REF] [REF] [REF], for the rest, please do your due diligence."
>>>     Regards,
>>>     --
>>>     Thomas Roessler, W3C  <>
>> --
>> Anil Saldhana
>> Project/Technical Lead,
>> JBoss Security & Identity Management
>> JBoss, A division of Red Hat Inc.

Yngve N. Pettersen
Senior Developer		                 Email:
Opera Software ASA         
Phone:  +47 24 16 42 60              Fax:    +47 24 16 40 01

Received on Wednesday, 17 October 2007 16:18:36 UTC