
Poupou's Corner of the Web

Looking for perfect security? Try a wireless brick.
Otherwise you may find some imperfect stuff here...


"New" TLS/SSL support in Mono 0.31

Mono 0.31 has just been released. And while this isn't the first release to support SSL/TLS*, it is the first time that this support is integrated into the WebRequest and WebResponse classes.

[*] Much of this work was contributed by Carlos Guzman Alvarez and resides in the Mono.Security.dll assembly. Carlos is currently adding server-side support for TLS/SSL. Carlos, you rock! (and you should blog ;-)
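To see the WebRequest/WebResponse integration at work, a minimal client is enough. This is only a sketch: the URL below is a placeholder, so point it at any TLS/SSL enabled server you have handy.

```csharp
using System;
using System.IO;
using System.Net;

class TlsGet {

	static void Main ()
	{
		// hypothetical URL - replace with any https:// server you want to test against
		HttpWebRequest req = (HttpWebRequest) WebRequest.Create ("https://www.example.com/");

		// the TLS/SSL handshake (and cipher negotiation) happens under the hood here
		HttpWebResponse resp = (HttpWebResponse) req.GetResponse ();
		StreamReader reader = new StreamReader (resp.GetResponseStream ());
		Console.WriteLine (reader.ReadToEnd ());
		resp.Close ();
	}
}
```

Nothing in the code is TLS-specific - the https scheme in the URL is all it takes, which is the whole point of having the support integrated into the WebRequest classes.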

TLS and SSL being such complex beasts, the term support needs to be qualified. In this case the complexity comes from the fact that TLS/SSL is a negotiating protocol - so here support equals supported ciphers. Here are the ciphers supported by the .NET framework 1.1 on Windows XP SP1 when using TLS with an OpenSSL server:

  • EXP1024-RC4-SHA
  • EXP-RC4-MD5
  • RC4-MD5
  • RC4-SHA

The [red] algorithms are not supported in Mono 0.31. Actually it is not so bad because (a) you don't want to use exportable algorithms (EXP) and (b) it's a negotiating protocol.

Now here is what Mono 0.31 supports:

  • AES256-SHA
  • AES128-SHA
  • RC4-SHA
  • RC4-MD5

The [green] algorithms are not supported on Windows XP / Fx 1.1. Yes, you read that correctly: Mono will try to use a 256-bit TLS connection if the server supports it. So who's missing the 40-56 bit exportable ciphers?

Now I won't encourage anyone to use exportable ciphers - because you shouldn't, unless you're legally required to do so :-(. As some people may not have the choice (server side), and because a client can't always choose its server, the next version of Mono (i.e. not the one just released, but the code is available in CVS if you can't wait for it) will add support for the following algorithms.

  • EXP-RC4-MD5 - 40 bits (effective key strength)
  • EXP-RC2-CBC-MD5 - 40 bits
  • EXP-DES-CBC-SHA - 40 bits
  • EXP1024-RC4-MD5 - 56 bits
  • EXP1024-RC4-SHA - 56 bits
  • EXP1024-DES-CBC-SHA - 56 bits, which is the maximum for DES anyway
  • EXP1024-RC2-CBC-MD5 - 56 bits

This should cover most of the unlucky earthlings. Frankly, unless a brick approaches too close to (or too fast toward) my head, I don't think I'll be adding new TLS/SSL algorithms soon. Sure there are others, mainly the ones using Diffie-Hellman (DH), but I don't think they're worth the time right now. Of course any contribution will be gladly accepted.

Interested? If so, Diffie-Hellman is already supported in Mono, and OpenSSL makes a good test server without too much trouble. In case you're curious, my version of OpenSSL (0.9.7a - which will soon be patched) supports:

  • AES256-SHA
  • AES128-SHA
  • EXP-KRB5-RC4-MD5
  • KRB5-RC4-MD5
  • KRB5-RC4-SHA
  • RC4-SHA
  • RC4-MD5
  • EXP1024-RC2-CBC-MD5
  • EXP1024-RC4-SHA
  • EXP1024-RC4-MD5
  • EXP-RC4-MD5

and now for the acronym-blind...

AES - Advanced Encryption Standard, a.k.a. Rijndael - see FIPS PUB 197
CBC - Cipher Block Chaining - see FIPS PUB 81
3DES - EDE with CBC - see FIPS PUB 46-3
DES - Data Encryption Standard - see FIPS PUB 46-2
DSS - Digital Signature Standard - see FIPS PUB 186-2
EDE - Encrypt (with first key), Decrypt (with second key), Encrypt (with third key)
EDH - Ephemeral Diffie-Hellman - see RFC 2631
EXP - Exportable (weak) algorithm - limited to 40 bits (original) or 56 bits (relaxed)
KRB5 - Kerberos 5 - see the IETF Kerberos Working Group
MD5 - Message Digest 5 - a hash algorithm described in RFC 1321
RC2 - Ron's Cipher #2 - a block cipher documented in RFC 2268
RC4 - Ron's Cipher #4 - a s3kr3t stream cipher
SHA - Secure Hash Algorithm - a hash algorithm described in FIPS PUB 180-1
128, 256, 1024 - those are called numbers, not acronyms ;-)

3/19/2004 10:59:55 | Comments | Permalink

Xml Digital Signature Status

After a few days of extreme frustration, Atsushi and I finally got some interesting results. All fifteen tests in Merlin's xmldsig test suite can now be validated successfully. This is funny because the current Microsoft implementation can only validate 14 of them, as it doesn't accept an X509Data element that contains both an X509Certificate and an X509CRL. This time interoperability with the W3C specification is more important than compatibility with the Microsoft implementation.
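For the curious, the validation loop for such tests is short. This is only a sketch - the input file name is an assumption; any enveloped xmldsig document (such as the ones in Merlin's suite) will do.

```csharp
using System;
using System.Xml;
using System.Security.Cryptography.Xml;

class XmldsigVerify {

	static void Main (string[] args)
	{
		// hypothetical input: any XML document containing an xmldsig Signature element
		XmlDocument doc = new XmlDocument ();
		doc.PreserveWhitespace = true;
		doc.Load (args [0]);

		XmlNodeList nodes = doc.GetElementsByTagName ("Signature", SignedXml.XmlDsigNamespaceUrl);
		SignedXml signed = new SignedXml (doc);
		signed.LoadXml ((XmlElement) nodes [0]);

		// the parameterless CheckSignature uses the KeyInfo (e.g. X509Data)
		// embedded inside the signature itself
		Console.WriteLine (signed.CheckSignature () ? "valid" : "INVALID");
	}
}
```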

A large part of the frustration came from the Phaos test suite. No matter what we did, we never got any signature to validate - not even when using the MS runtime! Now that we have Merlin's tests running, I'm almost convinced that the Phaos tests have some kind of encoding issue prior to (or while) being zipped.

During our difficulties I began to have some unfounded doubts about our C14N implementation (written by Aleksey Sanin of xmlsec). So I wrote a little tool that applies C14N to a file, so we could compare its results with Merlin's. As it may be useful for lots of things more productive than doubting our C14N implementation, like comparing XML documents, here's the source code:

//
// c14n.cs - C14N
//
// Author:
//	Sebastien Pouliot <sebastien@ximian.com>
//
// (C) 2004 Novell (http://www.novell.com)
//

using System;
using System.IO;
using System.Text;
using System.Xml;
using System.Security.Cryptography;
using System.Security.Cryptography.Xml;

public class C14N {

	// default transform
	static string url = "http://www.w3.org/TR/2001/REC-xml-c14n-20010315";

	public static void Usage (string error)
	{
		Console.WriteLine ("C14N - Copyright (C) 2004 Novell.{0}", Environment.NewLine);
		if (error != null) {
			Console.WriteLine ("{0}Error: {1}{0}", Environment.NewLine, error);
		}
		Console.WriteLine ("Usage: c14n input [transform_url] [element]");
		Console.WriteLine ("[input]        \tXML document to canonicalize");
		Console.WriteLine ("[transform_url]\tTransformation algorithm URL");
		Console.WriteLine ("               \tDefault is {0}", url);
		Console.WriteLine ("[element]      \tPartial C14N from this element and its children");
	}

	public static void Main (string[] args)
	{
		if (args.Length < 1) {
			Usage (null);
			return;
		}

		string filename = args [0];
		if (!File.Exists (filename)) {
			Usage (String.Format ("Missing file {0}", filename));
			return;
		}

		XmlDocument xml = new XmlDocument ();
		xml.PreserveWhitespace = true;
		xml.Load (filename);

		MemoryStream ms = new MemoryStream ();
		for (int i = 1; i < args.Length; i++) {
			if (args [i].StartsWith ("http://")) {
				url = args [i];
			} else {
				XmlNodeList xnl = xml.GetElementsByTagName (args [i], SignedXml.XmlDsigNamespaceUrl);
				byte[] si = Encoding.UTF8.GetBytes (xnl [0].OuterXml);
				ms.Write (si, 0, si.Length);
			}
		}
		if (ms.Position == 0) {
			// process the whole document
			xml.Save (ms);
		}
		ms.Position = 0;

		Transform t = (Transform) CryptoConfig.CreateFromName (url);
		if (t == null) {
			Usage (String.Format ("Unknown transformation algorithm {0}", url));
			return;
		}
		t.LoadInput (ms);
		StreamReader sr = new StreamReader ((Stream) t.GetOutput (), Encoding.UTF8);
		Console.Write (sr.ReadToEnd ());
	}
}

Side note: after hitting my head hard enough (guess on what?) I finally figured out that C14N could mean CrazySebastien and not Canonicalization - but it ran two letters short. Strangely I didn't have any more problems with C14N afterward...

3/16/2004 21:36:10 | Comments | Permalink

The downside of a fully managed world

Hey, you didn't actually believe there was no inconvenience, did you?

Have a look at the following sample:

// wow this is really fast !
RSACryptoServiceProvider rsa = new RSACryptoServiceProvider (2048);
// hey what's going on ???
string keypair = rsa.ToXmlString (true);

Bad news

If you try to run this code you will see that one line takes much longer than the other. Which one is it? Hint: it's not the comments. Well, you're both right and wrong - because it depends on your runtime.

Mono uses a totally managed implementation for RSA (and DSA, DH, ...). This has many advantages but also some inconveniences - mostly performance. Creating a new key pair is a very CPU-heavy process, so every processor cycle counts. A high-level language like C# on top of a JIT, even one as good as Mono's, can hardly compare to the hand-tuned assembly language often used for the job (or at least for key parts of it). The result is that it can take much more time to generate similar-sized key pairs in managed code.
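A quick (and unscientific) way to see the hit for yourself, using nothing newer than Fx 1.1 APIs - note that the explicit ToXmlString call is there on purpose, since under Mono the constructor alone doesn't generate anything:

```csharp
using System;
using System.Security.Cryptography;

class KeyGenTiming {

	static void Main ()
	{
		DateTime start = DateTime.UtcNow;
		RSACryptoServiceProvider rsa = new RSACryptoServiceProvider (2048);
		// under Mono the constructor returns immediately; exporting the
		// private key is what triggers the (expensive) key pair generation
		rsa.ToXmlString (true);
		TimeSpan elapsed = DateTime.UtcNow - start;
		Console.WriteLine ("2048 bits key pair generated in {0} seconds", elapsed.TotalSeconds);
	}
}
```

Run it a few times: key generation involves searching for random primes, so the timings vary wildly from one run to the next.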

Good news

However it may not be as bad as it seems in the sample...

Unlike the Microsoft implementation, Mono doesn't generate a new key pair in its constructor (when no CspParameters object is specified as a parameter). This is because it's a common, and very bad, pattern to create an RSACryptoServiceProvider object and then immediately import an existing public key (or key pair) into it. Sadly I've seen this in too many samples on the internet, and it simply kills performance on the Windows platform - in particular for server applications.

In order to avoid this pattern, Mono doesn't generate a new key pair until it is actually required. This is nice most of the time, but it also means that, when a new key pair really is needed, the delay of generating it moves to later in the application's lifetime (when the UI may not expect it).
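A sketch of the friendlier pattern: generate (or load) the key pair once, then import it where it is needed. Under Mono the second constructor below stays cheap precisely because no key pair is generated until one is required - and the import means one never is.

```csharp
using System;
using System.Security.Cryptography;

class ImportKey {

	static void Main ()
	{
		// generate a key pair once (the expensive part)...
		RSACryptoServiceProvider generator = new RSACryptoServiceProvider (1024);
		string keypair = generator.ToXmlString (true);

		// ...then import it wherever it is needed (the cheap part).
		// Under Mono this constructor does not generate a key pair, so
		// nothing is wasted; with the MS Fx 1.1 runtime it creates a key
		// that FromXmlString immediately throws away.
		RSACryptoServiceProvider rsa = new RSACryptoServiceProvider ();
		rsa.FromXmlString (keypair);
		Console.WriteLine ("imported a {0} bits key pair", rsa.KeySize);
	}
}
```

In real code the key pair would of course come from a file or a key store rather than from a freshly created instance.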

But this also means that calling another method requiring either the public or the private key will be much faster, because it doesn't require generating a new key pair (see the next sample). So it is only a one-time hit... Hopefully most applications don't need to create new key pairs very often, as opposed to signing/verifying or encrypting/decrypting.

// wow this is really really really fast !
RSACryptoServiceProvider rsa = new RSACryptoServiceProvider (16384);
// hey what's going on ???
// ... then one day later ...
string keypair = rsa.ToXmlString (true);
// we seem awake now!
string backup = rsa.ToXmlString (true);

Another piece of good news is that most optimizations made to the JIT will result in improved key generation performance. So it will keep getting faster without changing the code ;-)

3/10/2004 19:50:53 | Comments | Permalink

Living in a managed world

But not just yet...

One of the, let's hope temporary, problems of living in an early managed world is that it is still very difficult to build a complete and useful application without using lots of (sometimes hidden, sometimes in plain sight) unmanaged resources.

Newer (e.g. ClickOnce), sexier (e.g. C#), safer (e.g. CAS)... technologies may appear, but their impact won't really register on most radars until their potential can be fully realized outside whitepapers.

Let's choose a totally non-random example (btw I'm truly sorry if you got here by a truly random incident*). The current .NET framework's cryptographic support comes from the unmanaged CryptoAPI. No, this is not a (totally ;-) bad thing, as CryptoAPI is FIPS 140-2 certified and has not suffered many security issues in the past (excluding newer CryptoAPI 2 features like X.509 certificate support ;-). However it does add complexity to .NET application deployment, as CryptoAPI support varies across Windows operating systems. See Shawn Farkas' blog entry "Which Cryptographic Operations are Available?" for more details.

This is a real problem and, unlike many real-world problems, it has a solution, because a fully managed implementation is both possible and actually exists. By now you should have guessed I was going to talk about Mono, right?

So are we going to be living in a managed world? Well, I certainly hope so, as it sounds a lot better than living in a material world ;-)

* Want to return to the random noise? If so... please hit your head on the wireless brick and try again. I assure you, the harder you hit, the less chance you have of coming back here.

3/9/2004 22:36:19 | Comments | Permalink

The views expressed on this website/weblog are mine alone and do not necessarily reflect the views of my employer.