Unobstructed HTTPS

There’s an interesting discussion on Slashdot about SSL certificates. It brings up two valid points:

  1. Invalid certificates, while still providing a secure channel between client and server, are extremely annoying to deal with in Firefox 3 for many people because of the multi-step override process. Previously it was just a warning dialog.
  2. There are no free SSL certificates that are really “usable” (i.e. that don’t throw warnings in most browsers). CAcert.org has likely achieved the most inclusion, but it’s barely anywhere.

Certificates not signed by a trusted certificate authority (CA) trigger a warning because the premise is that a certificate authority verifies the certificate belongs to the person or organization whose name is on it. That premise was busted a while back when CAs started doing “domain validation” to offer lower prices. To “remedy” this, they created EV SSL. EV SSL requires more background checking, but at a higher cost. This means there are three tiers of SSL:

  1. Untrusted/Self-Signed – Free – The user is strongly discouraged from visiting a site with one of these. Indicates only that the channel is technically secure.
  2. Signed By CA – Variable Pricing – The user is told this is secure.
  3. EV SSL – Expensive – The user is told these sites are super awesomely amazing and can cure cancer.

Essentially EV SSL is nothing more than a scheme to charge more. EV SSL is supposed to do what a signed certificate should have been doing all along. By 2012 I’d bet there will be SEV SSL (Super Extended Validation SSL). Maybe that would require DNA and fingerprints to prove identity.

The Problem

It’s 2008 (actually more than halfway through it), and I still can’t use a secure HTTPS connection without either throwing an error at users (who are always confused by it) or paying a fee. It seems to me that it should be free to use HTTPS, without any barrier, for a purely technical level of security.

Why is “trust” bound so tightly to encryption? Why can’t a medium be encrypted without being trusted? The technology shouldn’t be tied to the business side of things the way it is.

Trust should be bound to encryption, but encryption should not be bound to trust. Trust is the “needy” individual in this relationship. Encryption is strong and confident. At least it should be…

A modest proposal

I propose that browsers should allow self-signed certificates to be used without any dialog, interstitial, or other obstruction, provided they are properly formed and not expired. The user interface should indicate that the channel is encrypted and that communication is unlikely to be intercepted between the user and the server. It should note if the certificate changes (just like SSH notifies the user if the host key changes between sessions). Other than that it should be transparent.
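To make the SSH comparison concrete, here’s a minimal sketch of the proposed behavior: remember a self-signed certificate’s fingerprint on first visit, stay silent while it matches, and warn only when it changes. The `knownHosts` store, the `checkCertificate` helper, and the return strings are all hypothetical, not any real browser API.

```javascript
// Hypothetical sketch: SSH-style fingerprint tracking for
// self-signed certificates. Nothing here is a real browser API.
var knownHosts = {}; // host -> previously seen certificate fingerprint

function checkCertificate(host, fingerprint) {
  if (!(host in knownHosts)) {
    // First visit: remember the fingerprint, show the page as encrypted.
    knownHosts[host] = fingerprint;
    return "encrypted-first-visit";
  }
  if (knownHosts[host] === fingerprint) {
    // Same certificate as last time: proceed silently.
    return "encrypted";
  }
  // Fingerprint changed: warn, just like SSH's host key warning.
  return "warning-certificate-changed";
}
```

The point of the sketch is that the warning fires on *change*, not on the mere absence of a CA signature.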

SSL certificates and EV SSL certificates should indicate in the user interface that the site being browsed is not only encrypted, but trusted by a third party the browser trusts. These are suitable for e-commerce, banking, etc.

This would accommodate intranets and other situations where encryption is desired, but paying a CA to verify identity is overkill and “domain validation” is just pointless.

Trust should be bound to encryption. Encryption shouldn’t be bound to trust. Encryption shouldn’t require verification. Encryption should be self-serve.

I’d be curious to know what others thought of the issue.

Zero Day Vulnerability

This really isn’t very accurate. I don’t know the details of the vulnerability, or even if there actually is one, but I question the marketing around the Zero Day Initiative’s vulnerability report. The big news seems to be that it came “only 5 hours” after the release.

This isn’t really accurate if you think about it. It would only be if Firefox 3 were a tightly controlled product that nobody could see a final version of. The reality is that the entire source code lives in CVS, there are nightly builds, and formal release candidates are posted. Could someone have downloaded it after release and found a security issue? Absolutely. Is the timing a little suspicious considering everything was done out in the open? Yes.

It wouldn’t have made any waves if a vulnerability was found in a release candidate. It would have just been patched and a new candidate posted.

The advantage of the open source development process is transparency through the entire process. The code in the release build isn’t remotely new or surprising. Many people had been running it for days prior to the actual release.

Again, it’s possible it all happened in 5 hours. But I doubt someone discovered a security hole, documented it, and had it verified and confirmed in just 5 hours, especially considering the open nature of the development process and how easy it is to check things out in advance.

8 Million Downloads In 24 Hours

It was a ton of fun to watch, absolutely addictive: 83 terabytes of data served just for downloads over 24 hours. There are still a ton of people left to update, since the auto-update functionality has yet to be triggered. You can now see the scale of what’s involved. John Lilly’s got some great statistics on what just happened.

According to Arbor Networks, yesterday’s U.S. Open played at Torrey Pines (featuring Tiger Woods and a bunch of guys pretty much nobody cares about) generated so much traffic that some ISPs thought it was a DDoS attack. There was a huge spike on TCP/1935. Coincidentally, this was about the same time Firefox 3 was unleashed. I wonder if that had any effect. Maybe next time, rather than a “world record,” the goal should simply be “wreak havoc on your ISP.”

Firefox 3.0 Is Out


So the servers had a giant meltdown. That’s hopefully history now. It’s out! Go download it. While you’re at it, spread the word and help break a world record. After all, how many world records have you participated in so far?

  • Awesomebar – Find what you want easier than ever.
  • Malware Protection – Stay safer when browsing the web.
  • Native UI Appearance – It looks better than ever.
  • Better Addons/Plugins Manager – Manage plugins with ease, find new addons.
  • Download Manager – Resumable downloads!
  • Smart Bookmarks – Most visited, recently bookmarked, recently tagged.
  • Better Memory Management – ’Nuff said.
  • Powered By Robots – Not only are they awesome, they obey the Three Laws of Robotics.

See Deb Richardson’s Field Guide to Firefox 3 for more details.

Rebreaking The Web

It’s happening again. Once upon a time, browser vendors started adding their own features without consulting each other or agreeing upon standards. What they created was a giant mess of inconsistencies across browsers and platforms that is still in effect today. Ask any web developer and they can tell you of the pains they have suffered trying to make seemingly trivial things work everywhere consistently. It’s no easy task. Before IE 7, even an Ajax request required something along the lines of:

var httpRequest;
if (window.XMLHttpRequest) { // Mozilla, Safari, …
    httpRequest = new XMLHttpRequest();
} else if (window.ActiveXObject) { // IE
    httpRequest = new ActiveXObject("Microsoft.XMLHTTP");
}

That’s right, IE 6 didn’t support the native XMLHttpRequest object (more here). This is just one of many examples in JavaScript and CSS. document.all, anyone?
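The document.all split forced the same kind of branching as the XHR snippet above. As a sketch (the `getElement` helper is mine, and passing the document in as a parameter is purely to keep the branching explicit; real pages would just use the global `document`):

```javascript
// Illustrative helper, not a real API: look up an element either the
// standards way (getElementById) or the pre-standards IE way (document.all).
function getElement(doc, id) {
  if (doc.getElementById) {
    return doc.getElementById(id); // DOM standard: Mozilla, Safari, Opera, later IE
  } else if (doc.all) {
    return doc.all[id]; // proprietary pre-standards IE collection
  }
  return null;
}
```

Every one of these forks is code the web developer writes, tests, and maintains just to get identical behavior everywhere.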

The end result of this problem came to be known as the “Web Standards” movement. Simply put, it’s the idea that code should follow a standard that results in consistent output across all browsers on various platforms. Write once, run anywhere. While it’s taken years for this to manifest, it’s slowly become a reality. Firefox, Safari, and Opera have fairly consistent rendering (at least in comparison to the mess of just a few years ago on the browser scene). IE 6 was fairly poor in terms of modern web development, but IE 7 made progress, and IE 8 is Microsoft’s greatest effort to date to bring their browser up to speed.

Continue reading