Categories
General Mozilla Software

On Deprecating HTTP

Mozilla announced:

There’s pretty broad agreement that HTTPS is the way forward for the web. In recent months, there have been statements from IETF, IAB (even the other IAB), W3C, and the US Government calling for universal use of encryption by Internet applications, which in the case of the web means HTTPS.

I’m on board with this development 100%. I say this as a web developer who has faced, and will face, some uphill battles bringing everything into HTTPS land. It won’t happen immediately, but the long-term plan is 100% HTTPS. It’s not the easiest move for the internet, but it’s undoubtedly the right one.

A brief history

The lack of encryption on the internet is not too different from the weaknesses in email and SMTP that make spam so prolific. Once upon a time the internet was mainly a tool of academics; trust was implicit and ethics were paramount. Nobody thought security was of major importance. Everything was done in plain text for performance and easy debugging. That’s why you can use telnet to debug most older popular protocols.

In 2015 the landscape has changed. Academic use of the internet is a small fraction of its traffic. Malicious traffic is a growing concern. Free sharing of information, the norm in the academic world, is the exception in some of the places the internet reaches.

Protecting the user

Users deserve to be protected as much as technology will allow. Some folks claim “non-sensitive” data exists. I disagree: sensitivity is subjective, a matter of personal perspective. What’s sensitive to someone in a certain situation is not sensitive to others. Topics that are normal and safe to discuss in most parts of the world are not safe in others. Certain search queries are more sensitive than others (medical questions, sensitive business research). A web developer doesn’t have a good grasp of what is or isn’t sensitive; it’s specific to the individual user. It’s not every network admin’s right to know if someone on their network browsed for and/or purchased pregnancy tests, or bought a book on parenting children with disabilities on Amazon. The former may not go over well at a “free” conservative school in the United States, for example. More than just credit card information is “sensitive data” in such cases. Nobody should be so arrogant as to think they understand how every person on earth might come across their website.

Google and Yahoo took the first step and moved search to HTTPS (oddly enough, Bing still seems to be using HTTP). This is the obvious second step to protecting the world’s internet users.

Protecting the website’s integrity

Michelangelo David - Censored

Unfortunately, as a web developer you can no longer be certain a user sees a website as you intended. Sorry, but it doesn’t work that way. For years ISPs have been testing the ability to do things like insert ads into webpages. As far as I’m aware, nothing in the U.S. explicitly prohibits replacing ads. Even net neutrality rules seem limited to degrading or discriminating against certain traffic, not modifying payloads.

I’m convinced the next iteration of the great firewall will not block content outright, but censor it in place. That will be harder to detect than simply being denied access to a website. The large-scale processing needed to do this is becoming practical: just remove the offending block of text or image. Citizens of oppressed nations may not notice a thing.

There have also been attempts to “optimize” images and video. Again, even net neutrality rules are not entirely clear here, assuming the practice isn’t targeted at competitors, for example.

But TLS isn’t perfect!

True, but let’s be honest, it’s 8,675,309 times better than using nothing. CAs are a vulnerability, a bottleneck, and a potential target for governments looking to control information. But browsers and OSes let you manage certificates; the ability to stop trusting a CA exists. Technology will improve over time. I don’t expect us to still be using TLS 1.1 and 1.2 in 2025. Hopefully substantial improvements get made over time. This argument is akin to not buying a computer because there will be a faster one next year. It’s the best option today, and we can replace it with better methods when they’re available.

SSL Certificates are expensive!

First of all, domain validation certificates can be found for as little as $10. Secondly, I fully expect these prices to drop as demand increases. Domain validation certificates have virtually no issuance cost, as the process is fully automated. The cheaper options will see substantial growth as demand grows. There’s no limit on “supply” except the computing power to generate them. A pricing war is inevitable. It would happen even faster if someone like Google bought a large CA and dropped prices to rock bottom. Certificates will get way cheaper before they’re essential. $10 is the early adopter fee.

But XYZ doesn’t support HTTPS!

True, not everyone supports it yet. That will change. It’s also true some (like CDNs) are still charging insane prices for HTTPS. It’s not practical for everyone to switch today, or even this year, but that too will change as demand increases. Encryption overhead is nominal. Once again, price wars will happen as soon as customers want more than just their shopping carts served over SSL. The problem today is that demand is minimal, yet those who need HTTPS must have it, so price gouging is the norm.

Seriously, we need to do this?

Yes, seriously. HTTPS is the right direction for the Internet. There are valid arguments for not switching your site over today, but those roadblocks will disappear, and you should re-evaluate where you stand periodically. I’ve moved a few sites, including this blog (SPDY for now, HTTP/2 soon), to experience what would happen. It was largely a smooth transition. I’ve got some sites still on HTTP. Some will stay on HTTP for the foreseeable future due to other circumstances; others will switch sooner. This doesn’t mean HTTP is dead tomorrow, or next year. It just means the future of the internet is HTTPS, and you should be part of it.
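For what it’s worth, the mechanical first step of such a move is usually just a permanent redirect. Here’s a minimal sketch in Node.js, assuming the HTTPS side of the site is already being served with a valid certificate (the setup is illustrative, not a full deployment):

// Redirect all plain-HTTP traffic to the HTTPS version of the same URL.
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(301, { 'Location': 'https://' + req.headers.host + req.url });
  res.end();
}).listen(80);

Once every page answers on HTTPS, the HTTP side becomes nothing but a signpost.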

Categories
Mozilla

Shumway

From Mozilla Research:

Shumway is an experimental web-native runtime implementation of the SWF file format. It is developed as a free and open source project sponsored by Mozilla Research…

I’m pretty amazed by this one. In 2009 JS was emulating the NES. In 2012 it’s running SWF. That’s really impressive if you think about it. JavaScript is slowly taking over the world.

Categories
Mozilla

Perception Of Performance

Google is persistent about associating Chrome with being fast. It was their primary pitch when they first announced it. Back when Firefox went 1.0, it wasn’t so much about speed as about “not sucking,” as all the geeks liked to say. Given IE 6 was the competition, that was likely the best marketing on earth. Sure it was faster, but sucking fast wasn’t nearly as good as not sucking. Not sucking encompassed the missing features, broken rendering, crashing, and constant parade of security problems. It summarized the product surprisingly well for something that was never an official slogan.

Google has now launched Chrome for iOS. On the desktop, Chrome and Safari both use WebKit; Chrome applies its own touches to make things faster, notably its own JS engine. Safari also has its own JS engine. This is the secret sauce of performance. In the iOS world, however, Apple, being the totalitarian dictator, decided that iOS will provide WebKit and JS. If your app has any web browser functionality, it must use these APIs and not implement its own engine. Verbatim:

2.17 Apps that browse the web must use the iOS WebKit framework and WebKit Javascript

Google Chrome for iOS, however, is Google integration bolted onto a reskinned Safari. It’s the same browser, just a new UI with some Google features integrated. It’s not a separate browser. It’s a UI.

That however doesn’t stop Google’s marketing machine (I’d argue Apple marketing’s top rival) from putting “fast” as the second word:

Browse fast with Chrome, now available on your iPhone, iPod touch and iPad. Sign in to sync your personalized Chrome experience from your computer, and bring it with you anywhere you go.

It goes on to clarify:

  • Search and navigate fast, directly from the same box. Choose from results that appear as you type.

So Google isn’t truly misleading. It’s just very strategic wording.

The truth of the matter however is that Google Chrome on iOS is substantially slower than Safari. Safari uses Nitro to accelerate JavaScript, which powers most of the complicated websites that will slow down a browser on any modern device. Apple however restricts Nitro to Safari, and doesn’t let third party apps like Google Chrome use it. This is still the case as of iOS 5, and I believe is the case in iOS 6, though I haven’t personally verified that.

How much slower is Google Chrome on iOS in comparison to Safari? Here’s a SunSpider test I did on my iPad 3:

Safari

============================================
RESULTS (means and 95% confidence intervals)
--------------------------------------------
Total: 1817.9ms +/- 0.2%
--------------------------------------------

3d: 214.7ms +/- 1.1%
cube: 72.3ms +/- 0.7%
morph: 57.9ms +/- 0.9%
raytrace: 84.5ms +/- 2.2%

access: 224.9ms +/- 0.6%
binary-trees: 44.4ms +/- 1.7%
fannkuch: 96.2ms +/- 0.6%
nbody: 56.0ms +/- 0.0%
nsieve: 28.3ms +/- 2.7%

bitops: 141.0ms +/- 0.4%
3bit-bits-in-byte: 23.4ms +/- 1.6%
bits-in-byte: 29.5ms +/- 1.3%
bitwise-and: 37.8ms +/- 1.5%
nsieve-bits: 50.3ms +/- 0.7%

controlflow: 15.7ms +/- 2.2%
recursive: 15.7ms +/- 2.2%

crypto: 123.3ms +/- 0.6%
aes: 70.5ms +/- 0.5%
md5: 29.4ms +/- 1.3%
sha1: 23.4ms +/- 1.6%

date: 274.4ms +/- 0.7%
format-tofte: 139.8ms +/- 1.1%
format-xparb: 134.6ms +/- 0.7%

math: 175.1ms +/- 0.3%
cordic: 61.5ms +/- 0.8%
partial-sums: 74.4ms +/- 0.7%
spectral-norm: 39.2ms +/- 0.8%

regexp: 70.8ms +/- 0.6%
dna: 70.8ms +/- 0.6%

string: 578.0ms +/- 0.5%
base64: 78.3ms +/- 1.9%
fasta: 68.1ms +/- 0.9%
tagcloud: 109.5ms +/- 1.2%
unpack-code: 207.5ms +/- 1.2%
validate-input: 114.6ms +/- 0.7%

Google Chrome

============================================
RESULTS (means and 95% confidence intervals)
--------------------------------------------
Total: 7221.0ms +/- 0.1%
--------------------------------------------

3d: 802.7ms +/- 0.2%
cube: 230.9ms +/- 0.6%
morph: 297.3ms +/- 0.5%
raytrace: 274.5ms +/- 0.1%

access: 1112.0ms +/- 0.2%
binary-trees: 98.4ms +/- 1.1%
fannkuch: 609.6ms +/- 0.2%
nbody: 247.9ms +/- 0.2%
nsieve: 156.1ms +/- 0.4%

bitops: 957.2ms +/- 0.2%
3bit-bits-in-byte: 210.4ms +/- 0.6%
bits-in-byte: 232.9ms +/- 0.2%
bitwise-and: 188.5ms +/- 0.4%
nsieve-bits: 325.4ms +/- 0.2%

controlflow: 129.5ms +/- 0.3%
recursive: 129.5ms +/- 0.3%

crypto: 493.3ms +/- 0.2%
aes: 214.3ms +/- 0.4%
md5: 140.2ms +/- 0.3%
sha1: 138.8ms +/- 0.5%

date: 381.1ms +/- 0.3%
format-tofte: 214.2ms +/- 0.2%
format-xparb: 166.9ms +/- 0.5%

math: 770.7ms +/- 0.2%
cordic: 316.6ms +/- 0.2%
partial-sums: 243.2ms +/- 0.3%
spectral-norm: 210.9ms +/- 0.4%

regexp: 1340.2ms +/- 0.2%
dna: 1340.2ms +/- 0.2%

string: 1234.3ms +/- 0.6%
base64: 175.7ms +/- 0.5%
fasta: 205.6ms +/- 0.2%
tagcloud: 284.0ms +/- 2.3%
unpack-code: 370.1ms +/- 0.9%
validate-input: 198.9ms +/- 0.6%

Quite a bit slower: roughly 4x on the total (7221.0ms vs. 1817.9ms).

So really, if you’re using Chrome on iOS, it’s because you absolutely love the design and integration with Google’s services, and are willing to trade off considerable JavaScript performance for those perks.

That however doesn’t stop many people from thinking it’s fast. Just in the past few minutes I was able to find these Tweets among the thousands streaming across the web. I won’t mention or link to them directly (you could find them if you wanted):

“Chrome for iOS is FAST, takes the mobile browsing experience to a new level.”

“I like it! It’s fast and can sync with Chrome desktop, which I use all of the time.”

“Liking #chrome on #iOS very slick, fast and clean looking”

“using chrome on my iphone right now.. cant believe how fast it is”

“That chrome for iOS is freaking fast but so basic. No tweet button, no add-on. Man I kinda disappointed. I give ’em 1 ‘fore the update”

“Chrome for iOS? Hell yes!! So fast! #chrome”

“Google Chrome for iOS is fast.”

“Holy hell Chrome is fast on the iPad.”

The most touted feature isn’t actually a feature. It’s technically not even there. The numbers and the technology say otherwise (they show it’s actually slower). But that’s what everyone is ranting and raving about. You could argue Google’s UI is faster, but I’d be highly skeptical that Google has found Cocoa tricks Apple engineers haven’t. Perhaps a UI transition or two makes you think it’s faster or more responsive, but even that I can’t find any evidence of.

All the hard work the Google engineers did squeezing their services into a compact, simple-to-use UI is ignored in favor of this non-existent feature. And as a developer who can’t ignore such a thing, I will say they did a great job with their UI.

I present to you, the power of marketing!

Categories
Mozilla

On Boot To Gecko

Always bet on JavaScript, always bet on the web. This is really the reason Boot to Gecko is so interesting. Microsoft is now learning this the hard way. If Apple isn’t careful they too will learn this the hard way.

There’s been a lot of talk today about Telefónica’s involvement, but it’s worth noting the Mozilla blog announcement also mentions Deutsche Telekom Innovation Labs will join the Boot to Gecko project with dedicated development resources. That’s a pretty big deal.

The ability to run on lower-end hardware, which is cheaper to produce in quantity, will make a huge difference. Tech in general tends to focus on North America, Europe, Japan, Korea, Brazil, and Australia as target markets. It does this because these are wealthy countries with reasonably free markets and similar tastes, and trade agreements make them favorable.

This, however, has a huge downside: it excludes a huge chunk of the world. China alone is about 1.3 billion people (CIA estimate, 2012) with a GNI per capita of $7,570 (compared to $47,120 for the US). And as large as Brazil is (~192.3 million people), it’s only half of the 385.7 million in South America.

Take a look at the map in terms of GNI:
World by GNI PPP Per Capita

Now compare that to population density. Pay close attention to Asia, South Asia and Western Africa:
Population Density

Who has dominated this market to date? For most of that time it’s been Symbian, since phones running that slim OS have been rather cost effective. More recently it’s become Android, now that older hardware exists that can be produced cheaply. Notice in the graph below where Android’s growth is coming from. Many would have you think it’s only Apple and Android out there, but that’s hardly where Android is gaining users. It’s taking market share from Symbian.


Mobile Market Share

That Apple iOS dip is likely the drop-off prior to the iPhone 4S shipping rather than losses to Android. The 4S was only released in October, after iPhone 4 sales stalled in anticipation. You see a similar dip in 2008; 2009 is likely offset by the iPad’s success.

Of course an OS that runs fast on slower ARM hardware will run blazing fast on more expensive, state-of-the-art hardware. So everyone benefits from being lean and fast.

This is about bringing the mobile internet to billions of people. It’s a big deal.

Hat Tip to Wikipedia for the maps and graph

Categories
Mozilla Open Source

Why Open Source Is Pretty Awesome

At some point I think it’s easy to take things for granted. Being able to alter software to meet your needs is an awesome power.

Today, a tweet rehashed an annoyance regarding a tactic some websites use: altering copy/paste to put a link with tracking code in your clipboard. I could opt out, but that doesn’t help when websites roll their own. It’s a fairly simple thing to implement. In my mind there’s little (read: no) legitimate justification for the oncopy, oncut, or onpaste events.
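For the curious, here’s a minimal sketch of the kind of script I mean, using the ClipboardEvent API (the tracking URL is made up, and real implementations vary):

// Clipboard hijacking: when the user copies text, quietly append
// a tracking link to whatever lands in the clipboard.
document.addEventListener('copy', function (e) {
  var selection = window.getSelection().toString();
  e.clipboardData.setData('text/plain',
    selection + '\n\nRead more: http://example.com/article?share=abc123');
  e.preventDefault(); // suppress the browser's normal copy behavior
});

A browser that simply refuses to fire these events neuters the whole technique.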

So I did an hg pull while working on some other stuff. I came back, wrote a quick patch, started compiling, and went back to work.

Then I came back to a shiny new Firefox build with a shiny new preference that disabled the offending functionality. A quick test against a few websites shows it works as intended, simply killing those events. You can’t do these things with closed source.

Of course I found the relevant bug and added a patch for anyone interested.

A 15 minute diversion and my web browsing experience got a little better. Sometimes I forget I’ve got experience on that side of the wire too 😉.

Categories
Mozilla

On Firefox Versioning

Writing software is actually quite easy. Writing good software is relatively harder, but still easy. Writing software to a programmer is like painting to a painter. Shipping software is an incredibly complicated task. It’s like getting a stadium full of babies to all have clean diapers at the same time with only one or two people to do the work. As soon as you fix one thing, you discover more crap. The process stinks and you’ll never reach the end. Those who do it either by printing a CD, uploading a binary, or pushing out changes to a tier of web servers know what I’m talking about.

It’s easy to write code to do things. It’s harder to build a product. It’s harder still to actually draw a line in the sand and decide when you’re “done”. The truth is all software ships with bugs; someone who tells you otherwise is an idiot. They almost certainly haven’t all been discovered yet (very likely some will be later), but they absolutely exist. The general consensus is you want no glaring bugs and no big bugs in common use cases. Obscure use cases will always be buggier. That’s the nature of the beast.

Knowing this, it’s easy to understand that changing release cycles is an arduous process with lots of details to think about. Not everything is quantitative or can be reduced to a math equation. How long is it worth waiting for a feature? Is the shiny button worth 3 days? 3 weeks? 3 months? Indefinite hold? Will it even work as we think? What bugs will it introduce? How long to deal with those? Not an easy decision. Even harder to reach a consensus on. The only thing certain is that the lack of a decision guarantees a failure to launch.

The Firefox Version Problem

Firefox is now on a 6-week release cycle. This means features get out the door soon after they are fully baked. That’s a very good thing: modern technologies and the latest in security reach users quickly. We all benefit from that.

The downside however is that upgrades are disruptive. They can break compatibility, and they require extensive testing in large deployments (big companies, educational institutions). That can be expensive and time consuming if you’re impacted.

The other side of this is that version numbers get blurred. 4.0, 5.0, 6.0… “WTF is the difference,” most users would think, given it looks largely the same. But is it really more like 4.0.1, 4.0.2, 4.0.3? As a web developer, what versions are you supporting? This is now much more complicated (don’t even get me started on testing).

Stable vs. Slipstream

My modest proposal is a Stable/Slipstream (I prefer “slipstream” vs. “bleeding edge”) model. For example:

Firefox 7.0 ships in 6 weeks, September 27 as of this blog post. From then on, every 6 weeks a new release ships and becomes 7.1, 7.2, 7.3, etc. For users, it’s just auto-updates every so often. These intermediate releases are disposable, as the users are on the slipstream: they update rapidly, and a matter of weeks after a release the previous one is unsupported. Previous releases are just a rumor, recognizable only as deja vu and dismissed just as quickly.[1] Users are oblivious to the concept of “versions” for the most part. After several release cycles (9-12 months), this becomes “stable” at 7.x. The next day 8.x starts and the process repeats.

From then on (I’d propose 12 months), only security fixes are provided to 7.x. Large deployments that need to do extensive QA adopt the stable branch once a year on a predictable schedule and stick to it. The vast majority of the internet adopts the slipstream (the default) and gets the latest release every 6 weeks. The stable branch is only around for a limited period of time before it moves to the next version. That last release cycle may be a bit more modest and lower-risk than the previous ones.

The end result is that nobody cares about a release older than 12 months. Generally speaking, only two matter. Slipstreamed users update rapidly (and will likely update even more rapidly as the process improves). Stable users have 12 months to hop to the next lily pad. This goes for IT, web developers, add-on developers, and browser developers.

In the long term (the next few years), I think web applications will become more agile and less rigid. Part of what things like HTML5 provide is a more standardized and less hacky way of doing things. That means fewer compatibility issues with untested browsers. As older applications are phased out, the test cycles for large deployments will shrink. Ideally some will eventually just migrate away from “stable”.

Version Numbers

Yes, version numbers still exist, but for most users they don’t mean much unless they have a problem or need to verify compatibility with something, in which case the major release number is likely the important one. They are still a necessary evil, and users do need to know how to find theirs, even if they don’t know it offhand. The browser version number is pretty much the first step of any diagnostics for a web application, as it’s the ultimate variable.

Just my thoughts on the last several weeks of debate.

[1] Men In Black (1997)

Categories
In The News Mozilla

Mork And Casey Anthony

Jamie Zawinski linked to a very interesting blog post about the forensics problem in the recent Casey Anthony trial. To summarize, she was using an older version of Firefox, which stores its history in a Mork DB. For those not familiar with Mozilla internals, Mork is (I’m quoting JWZ here):

…the single most braindamaged file format that I have ever seen in my nineteen year career.

That bug was actually one of two brushes I’ve had with Mork: that time learning about it, and another shortly afterwards, as part of a hack I was building and later abandoned, when I learned firsthand how impossible it really is to work with. Perhaps it was my inexperience at the time that made it impossible; perhaps it really was Mork.

Categories
Mozilla

Firefox 4


Firefox 4 is out! If for some reason you don’t know why you want it, here are a few things you’ll love about Firefox 4.0.

Congrats to everyone involved in shipping.

Categories
Mozilla

Things You’ll Love About Firefox 4.0

It’s that time again. Here’s my list of awesome things you’ll love about Firefox 4:

For Users

New Look For Tabs

New Tabs For Firefox 4
One of the first things you’ll notice is tabs on top. This paradigm really makes more sense, since the tab defines not just the content but the environment it’s viewed in (prev/next buttons, URL bar). It’s also just much sleeker looking. After a few minutes you’ll likely agree this is a better approach than tabs underneath.

Another nice touch is if you enter a URL that’s already open in another tab, you’ll be given the option to switch to that tab. Perfect for those of us who end up with 50 tabs by lunch time.

It also just feels tighter and less intrusive on the web browsing experience.

Categories
Mozilla Security Web Development

Wanted: Native JS Encryption

I’d like to challenge all browser vendors to put together a comprehensive JS API for encryption. I’ll use this blog post to explain why it’s necessary and why it would be a great move.

The Ultimate Security Model

I consider Mozilla Sync (formerly known as “Weave”) to have the ultimate security model. As a brief background, Mozilla Sync is a service that synchronizes your bookmarks, browsing history, etc. between computers using “the cloud”. Obviously this has privacy implications. The solution basically works as follows:

  1. Your data is created on your computer (obviously).
  2. Your data is encrypted on your computer.
  3. Your data is transmitted securely to servers in an encrypted state.
  4. Your data is retrieved and decrypted on your computer.

The only one who can ever decrypt your data is you. It’s the ultimate security model. The data on the server is encrypted and the server has no way to decrypt it. A typical web service works like this:

  1. Your data is created on your computer.
  2. Your data is transmitted securely to servers.
  3. Your data is transmitted securely back to you.

The whole time it’s on the remote servers, it could in theory be retrieved by criminals, nosy sysadmins, governments, etc. There are times when you want a server to read your data to do something useful, but there are times where it shouldn’t.

The Rise Of Cloud Data And HTML5

It’s no secret that people are moving more of their data into what salespeople call “the cloud” (Gmail, Dropbox, Remember The Milk, etc.). More and more personal data is out there in this maze of computers. I don’t need to dwell too much on the issues raised by personal data being stored in places where 4th amendment rights aren’t exactly clear in the US and may not exist in other locales. It’s been written about enough in the industry.

Additionally, newer features like Web Storage allow for 5-10 MB of client-side data storage, often used for “offline” versions of a site. This is really handy, but it makes any computer or cell phone a potential treasure trove of data if that storage isn’t correctly purged or protected. I expect that 5-10 MB ceiling to rise over time, just like disk cache. Even my cell phone can likely afford more than 5-10 MB; my digital camera can hold 16 GB in a card a little larger than my fingernail. Local storage is already pretty cheap these days, and will likely only get cheaper.
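To make the exposure concrete, here’s a trivial sketch; the key name and contents are made up, but this is all it takes to leave data sitting unencrypted on a device:

// Web Storage persists indefinitely, in plain text, on the device:
localStorage.setItem('draftMessage', 'Dear Dr. Smith, about my test results...');
// Anyone with access to the machine's profile can read it right back:
console.log(localStorage.getItem('draftMessage'));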

Mobile phones are hardly immune to any of this, as they feature increasingly robust browsers capable of all sorts of HTML5 magic. The rise of mobile “apps” is powered largely by offline abilities and storage functionality. Web Storage facilitates this in many ways but doesn’t provide any inherent security.

Again, I don’t need to dwell here, but people are leaving increasingly sensitive data on the devices and services they use. SSL protects them while data is moving over the wire, but does nothing for them once data reaches either end. The time spent over the wire is measured in milliseconds; the time spent at either end can be measured in years.

Enter JS Crypto

My proposal is that there’s a need for native JS cryptography implementing several popular algorithms, like AES, Serpent, Twofish, MD5 (I know it’s busted, but it could still be handy for legacy reasons), and SHA-256, expanding as cryptography matures. With this available, front end logic could easily and quickly encrypt data before storing or sending it.

For example, to protect data in Web Storage before actually saving it to globalStorage:

// Without encryption, the balance is stored in the clear:
globalStorage['mybank.com'].lastBalance = "0.50";
// With the proposed API, only ciphertext ever touches the disk:
globalStorage['mybank.com'].lastBalance = Crypto.AES.encrypt("0.50", password);

Using XMLHttpRequest or POST/GET, one could send encrypted payloads directly to the server, over HTTP or HTTPS, rather than raw data. This greatly facilitates the Mozilla Sync model of data security.

This could also be an interesting way to transmit select data securely while serving the rest of a site over HTTP, by just wrapping the XMLHttpRequest data in crypto (assuming a shared key).
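To make that concrete, here’s a minimal sketch of sending an encrypted payload, assuming the hypothetical Crypto.AES API above (the endpoint is made up, and password is the user-supplied secret, as in the earlier example):

// Encrypt the sensitive fields locally, then POST only the ciphertext.
var payload = Crypto.AES.encrypt(JSON.stringify({ balance: "0.50" }), password);

var xhr = new XMLHttpRequest();
xhr.open('POST', '/api/store', true);
xhr.setRequestHeader('Content-Type', 'text/plain');
xhr.send(payload);
// The server stores an opaque blob; only the user's password
// can ever turn it back into plaintext.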

I’m sure there are other uses that I haven’t even thought of.

Performance

JS libraries like Crypto-JS are pretty cool, but they aren’t ideal. We need something as fast and powerful as we can get. Like I said earlier, mobile is a big deal here, and mobile has performance and power constraints. Intel and AMD now have native AES instructions (AES-NI) in their desktop chips, and I suspect mobile chips that don’t have an equivalent will eventually get one. I don’t think any amount of JS optimization will get that far performance-wise. We’re talking 5-10 MB of client-side data today, and that will only grow. And that’s before we even talk about encrypting data for remote storage (which in theory can break the 10 MB limit).

Furthermore, most browsers already ship a Swiss Army knife of crypto support, just not exposed via JS in a nice friendly API. I don’t think any are currently using AES-NI when available, though that’s a pretty new feature and I’m sure in time someone will investigate it.

Providing a cryptography API would be a great way to encourage websites to step up their security model in an HTML5 world.

Wait a second…

Shouldn’t browsers just encrypt Web Storage, or let OS vendors turn on Full Disk Encryption (FDE)?

Sure, both are great, but web apps should be in control of their own security model regardless of what the terminal is doing. Even if the data is encrypted at rest, that doesn’t provide a great security model if the browser has one security model for Web Storage while the site has its own authentication system.

Don’t JS libraries already exist, and isn’t JS getting to the point of almost being native?

True, libraries do exist, and JS is getting amazingly fast, to the point of threatening native code. However, crypto is now being hardware accelerated. It’s also something that can be grossly simplified by getting rid of libraries. I view JS crypto libraries the way I view ExplorerCanvas: great, but I’d prefer a native implementation for its performance. These libraries still have a place bridging support for browsers without native support, in the form of a shim.
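As a sketch of that shim idea, assuming the hypothetical native Crypto.AES from earlier, a page could feature-detect it and fall back to a library like Crypto-JS:

// Prefer the (hypothetical) native API; fall back to the Crypto-JS library.
function encryptAES(plaintext, password) {
  if (window.Crypto && window.Crypto.AES) {
    return Crypto.AES.encrypt(plaintext, password); // native, possibly AES-NI backed
  }
  // CryptoJS.AES.encrypt(message, passphrase) is Crypto-JS's API;
  // toString() yields the ciphertext as a string.
  return CryptoJS.AES.encrypt(plaintext, password).toString();
}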

But if data is encrypted before sending to a server, the server can’t do anything with it

That’s the point! It isn’t ideal in all cases; for example, you can’t encrypt photos you intend to share on Facebook or Flickr. But a Dropbox-like service may be an ideal candidate for encryption.

What about export laws?

What about them? Browsers have been shipping cryptography for years. This is just exposing that cryptography so web developers can better take advantage of it and secure user data. If anything, JS crypto library implementations likely create a bigger legal issue regarding “exporting” cryptography for web developers.

You’re crazy!

Perhaps. To quote Apple’s Think Different campaign:

Here’s to the crazy ones. The misfits. The rebels. The troublemakers. The round pegs in the square holes.

The ones who see things differently. They’re not fond of rules. And they have no respect for the status quo. You can quote them, disagree with them, glorify or vilify them.

About the only thing you can’t do is ignore them. Because they change things. They invent. They imagine. They heal. They explore. They create. They inspire. They push the human race forward.

Maybe they have to be crazy.

How else can you stare at an empty canvas and see a work of art? Or sit in silence and hear a song that’s never been written? Or gaze at a red planet and see a laboratory on wheels?

While some see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world, are the ones who do.

Time to enable the crazy ones to do things in a more secure way.

Updated: Changed key to password to better reflect the likely implementation in the pseudocode.