Categories
Mozilla

On Firefox Versioning

Writing software is actually quite easy. Writing good software is harder, but still easy. Writing software, to a programmer, is like painting to a painter. Shipping software, though, is an incredibly complicated task. It’s like getting a stadium full of babies to all have clean diapers at the same time with only one or two people to do the work. As soon as you fix one thing, you discover more crap. The process stinks and you’ll never reach the end. Those who do it, whether by pressing a CD, uploading a binary, or pushing changes out to a tier of web servers, know what I’m talking about.

It’s easy to write code to do things. It’s harder to build a product. It’s harder still to actually draw a line in the sand and decide when you’re “done”. The truth is all software ships with bugs; someone who tells you otherwise is an idiot. They almost certainly haven’t all been discovered, and very likely more will be found, but they absolutely exist. The general consensus is you want no glaring bugs, and no big bugs in common use cases. Obscure use cases will always be more buggy. That’s the nature of the beast.

Knowing this, it’s easy to understand that changing release cycles will be an arduous process with lots of details to think about. Not everything is quantitative or can be reduced to a math equation. How long is it worth waiting for a feature? Is the shiny button worth 3 days? 3 weeks? 3 months? Indefinite hold? Will it even work as we think? What bugs will it introduce? How long to deal with those? Not an easy decision. Even harder to reach a consensus on. The only thing certain is the lack of a decision will guarantee a failure to launch.

The Firefox Version Problem

Firefox is now on a 6-week release cycle. This means features get out the door soon after they are fully baked. That’s a very good thing: adoption of modern technologies and the latest in security happens quickly. We all benefit from that.

The downside, however, is that upgrades are disruptive. They can break compatibility, and they require extensive testing in large deployments (big companies, educational institutions). That can be expensive and time consuming if you’re impacted.

The other side of this is that version numbers get blurred. 4.0, 5.0, 6.0… “WTF is the difference?” most users would think, given it looks largely the same. But is it really more like 4.0.1, 4.0.2, 4.0.3? As a web developer, what versions are you supporting? This is now much more complicated (don’t even get me started on testing).

Stable vs. Slipstream

My modest proposal is a Stable/Slipstream (I prefer “slipstream” vs. “bleeding edge”) model. For example:

Firefox 7.0 ships in 6 weeks, on September 27 as of this blog post. From then on, every 6 weeks a new release ships, becoming 7.1, 7.2, 7.3, etc. For users, it’s just auto-updates every so often. These intermediate releases are disposable, as the users are on the slipstream: they update rapidly, and a matter of weeks after a release the previous one is unsupported. Previous releases are just a rumor, recognizable only as déjà vu and dismissed just as quickly¹. Users are oblivious to the concept of “versions” for the most part. After several release cycles (9-12 months), this becomes “stable” as 7.x. The next day 8.x starts and the process starts over.

From then on (I’d propose for 12 months), only security fixes would be provided to 7.x. Large deployments that need to do extensive QA can adopt the stable branch once a year on a predictable schedule and stick to it. The vast majority of the internet adopts the slipstream (the default) and gets the latest release every 6 weeks. The stable branch is only around for a limited period of time before it moves to the next version. That last release cycle may be a bit more modest and lower risk than the previous ones.

The end result is that nobody cares about a release older than 12 months. Generally speaking, only two matter. Slipstreamed users are updating rapidly (and will likely update even more rapidly as the process improves). Stable users have 12 months to hop to the next lily pad. This goes for IT, web developers, add-on developers, and browser developers alike.

In the long term (the next few years), I think web applications will become more agile and less rigid. Part of what things like HTML5 provide is a more standardized and less hacky way of doing things. That means fewer compatibility issues with untested browsers. As those older applications are phased out, the test cycles for large deployments will decrease. Ideally some will eventually just migrate away from “stable”.

Version Numbers

Yes, version numbers still exist, but for most users they don’t mean terribly much unless they have a problem or need to verify compatibility with something, in which case the major release number is likely the important one. They are still a necessary evil, and users do need to know how to find it, even if they don’t need to know it offhand. The browser version number is pretty much the first step of any diagnostics for a web application, as it’s the ultimate variable.

Just my thoughts on the last several weeks of debate.

1. Men In Black (1997)

Categories
In The News Mozilla

Mork And Casey Anthony

Jamie Zawinski linked to a very interesting blog post about the forensics problem in the recent Casey Anthony trial. To summarize, she was using an older version of Firefox, which stores its history in a Mork DB. For those not familiar with Mozilla internals, Mork is (I’m quoting JWZ here):

…the single most braindamaged file format that I have ever seen in my nineteen year career.

That was actually one of two times I brushed up against Mork: that time learning about it, and another time shortly afterwards, when I learned first hand how impossible it really is to work with, as part of a hack I was trying to build and later abandoned. Perhaps it was my limited experience at the time that made it impossible; perhaps it really was Mork.

Categories
Mozilla

Firefox 4

Firefox 4 is out! If for some reason you don’t know why you want it, here are a few things you’ll love about Firefox 4.0.

Congrats to everyone involved in shipping.

Categories
Mozilla

Things You’ll Love About Firefox 4.0

It’s that time again. Here’s my list of awesome things you’ll love about Firefox 4:

For Users

New Look For Tabs

[Screenshot: the new tabs-on-top look in Firefox 4]
One of the first things you’ll notice is tabs on top. This paradigm really makes more sense, since a tab defines not just the content but the environment it’s viewed in (back/forward buttons, URL bar). It’s also just much sleeker looking. After a few minutes you’ll likely agree this is a better approach than tabs below.

Another nice touch is if you enter a URL that’s already open in another tab, you’ll be given the option to switch to that tab. Perfect for those of us who end up with 50 tabs by lunch time.

It also just feels tighter and less intrusive on the web browsing experience.

Categories
Mozilla Security Web Development

Wanted: Native JS Encryption

I’d like to challenge all browser vendors to put together a comprehensive JS API for encryption. I’ll use this blog post to make the case for why it’s necessary and why it would be a great move.

The Ultimate Security Model

I consider Mozilla Sync (formerly known as “Weave”) to have the ultimate security model. As a brief background, Mozilla Sync is a service that synchronizes your bookmarks, browsing history, etc. between computers using “the cloud”. Obviously this has privacy implications. The solution basically works as follows:

  1. Your data is created on your computer (obviously).
  2. Your data is encrypted on your computer.
  3. Your data is transmitted securely to servers in an encrypted state.
  4. Your data is retrieved and decrypted on your computer.

The only one who can ever decrypt your data is you. It’s the ultimate security model. The data on the server is encrypted and the server has no way to decrypt it. A typical web service works like this:

  1. Your data is created on your computer.
  2. Your data is transmitted securely to servers.
  3. Your data sits on those servers in a readable state.
  4. Your data is transmitted securely back to you.

The whole time it’s on the remote servers, it could in theory be retrieved by criminals, nosy sysadmins, governments, etc. There are times when you want a server to read your data to do something useful, but there are times when it shouldn’t be able to.

The Rise Of Cloud Data And HTML5

It’s no secret that people are moving more and more of their data into what sales people call “the cloud” (Gmail, Dropbox, Remember The Milk, etc). More and more of people’s data is out there in this maze of computers. I don’t need to dwell too much on the issues raised by personal data being stored in places where 4th amendment rights aren’t exactly clear in the US and may not exist in other locales. It’s been written about enough in the industry.

Additionally, newer features like Web Storage allow for 5-10 MB of client-side data storage, often used for “offline” versions of a site. This is really handy, but it makes any computer or cell phone a potential treasure trove of data if that data isn’t correctly purged or protected. I expect that 5-10 MB ceiling to rise over time just like disk cache. Even my cell phone can likely afford more than 5-10 MB. My digital camera can hold 16 GB in a card a little larger than my fingernail. Local storage is already pretty cheap these days, and will likely only get cheaper.

Mobile phones are hardly immune from all this as they feature increasingly robust browsers capable of all sorts of HTML5 magic. The rise of mobile “apps” is powered largely by the offline abilities and storage functionality. Web Storage facilitates this in many ways but doesn’t provide any inherent security.

Again, I don’t need to dwell here, but people are leaving increasingly sensitive data on devices they use, and services they use. SSL protects them while data is moving over the wire, but does nothing for them once data gets to either end. The time spent over the wire is measured in milliseconds, the time spent at either end can be measured in years.

Enter JS Crypto

My proposal is that there’s a need for a native JS cryptography API implementing several popular algorithms like AES, Serpent, Twofish, MD5 (I know it’s busted, but it could still be handy for legacy reasons), and SHA-256, expanding as cryptography matures. With that in place, front end logic could easily and quickly encrypt data before storing or sending it.

For example, to protect Web Storage data before actually saving it to globalStorage:

// Unprotected: the balance is saved in plain text.
globalStorage['mybank.com'].lastBalance = "0.50";

// Protected: encrypt first (Crypto.AES being the proposed native API).
globalStorage['mybank.com'].lastBalance = Crypto.AES.encrypt("0.50", password);

Using XMLHttpRequest or a plain POST/GET, one could then send encrypted payloads directly to the server over HTTP or HTTPS, rather than sending raw data. This greatly facilitates the Mozilla Sync model of data security.

This could also be an interesting way to transmit select data in a secure manner via XMLHttpRequest while serving the rest of a site over HTTP, by just wrapping the data in crypto before it goes over the wire (this assumes a shared key).
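As a quick sketch of the pattern (Crypto.AES.encrypt is the hypothetical native API proposed above; the endpoint and the user-supplied password variable are made up; everything else is standard):

var payload = JSON.stringify({ lastBalance: "0.50" });
// Encrypt locally; the password never leaves the client.
var ciphertext = Crypto.AES.encrypt(payload, password);

var xhr = new XMLHttpRequest();
xhr.open("POST", "/api/store", true);
xhr.setRequestHeader("Content-Type", "text/plain");
// The server stores (and later returns) data it has no way to read.
xhr.send(ciphertext);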

I’m sure there are other uses that I haven’t even thought of.

Performance

JS libraries like Crypto-JS are pretty cool, but they aren’t ideal. We need something as fast and powerful as we can get. Like I said earlier, mobile is a big deal here, and mobile has performance and power constraints. Intel and AMD now ship AES New Instructions (AES-NI) in their desktop chips, and I suspect mobile chips that lack an equivalent will eventually gain one. I don’t think any amount of JS optimization will match that performance. We’re talking 5-10 MB of client-side data today, and that will only grow. And that’s before we even talk about encrypting data bound for remote storage (which in theory can break the 10 MB limit).

Furthermore, most browsers already contain a Swiss Army knife of crypto support, just not exposed via JS in a nice friendly API. I don’t think any are currently using AES-NI when available, though that’s a pretty new feature and I’m sure someone will investigate it in time.

Providing a cryptography API would be a great way to encourage websites to up the security model in an HTML5 world.

Wait a second…

Shouldn’t browsers just encrypt Web Storage, or let OS vendors turn on Full Disk Encryption (FDE)?

Sure, both are great, but web apps should be in control of their own security model regardless of what the terminal is doing. Even if the local storage is encrypted, that doesn’t provide a great security model when the browser has one security model in place for Web Storage and the site has its own authentication system layered on top.

Don’t JS libraries already exist, and isn’t JS getting to the point of being almost as fast as native code?

True, libraries do exist, and JS is getting amazingly fast, to the point of threatening native code. However, crypto is now being hardware accelerated, and it’s something that can be greatly simplified by a native API rather than libraries. I view JS crypto libraries the way I view ExplorerCanvas: great, but I’d prefer a native implementation for its performance. These libraries do still have a place bridging support for browsers that don’t have native support, in the form of a shim.

But if data is encrypted before sending to a server, the server can’t do anything with it

That’s the point! This isn’t ideal in all cases; for example, you can’t encrypt photos you intend to share on Facebook or Flickr. But a Dropbox-like service may be an ideal candidate for encryption.

What about export laws?

What about them? Browsers have been shipping cryptography for years. This is just exposing that cryptography so web developers can better take advantage of it and secure user data. If anything, pure-JS crypto implementations likely create a bigger legal question regarding “exporting” cryptography for web developers.

You’re crazy!

Perhaps. To quote Apple’s Think Different campaign:

Here’s to the crazy ones. The misfits. The rebels. The troublemakers. The round pegs in the square holes.

The ones who see things differently. They’re not fond of rules. And they have no respect for the status quo. You can quote them, disagree with them, glorify or vilify them.

About the only thing you can’t do is ignore them. Because they change things. They invent. They imagine. They heal. They explore. They create. They inspire. They push the human race forward.

Maybe they have to be crazy.

How else can you stare at an empty canvas and see a work of art? Or sit in silence and hear a song that’s never been written? Or gaze at a red planet and see a laboratory on wheels?

While some see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world, are the ones who do.

Time to enable the crazy ones to do things in a more secure way.

Updated: Changed key to password to better reflect the likely implementation in the pseudocode.

Categories
Apple Google Mozilla

On Chrome Dropping H.264

The Chrome team announced they are dropping support for H.264.

WebM Support

WebM support will be growing quickly as Firefox 4 rolls out (Firefox upgrade adoption is legendary). Chrome commands a sizable market share and is pushing the Chrome OS platform. Opera is also supporting WebM.

Apple and Microsoft could join the party and bundle WebM support along with the other codecs they support at any time, though they are licensors for H.264 and wouldn’t benefit from WebM market penetration. Microsoft’s implementation does allow for VP8 support if a codec is installed. I’m not aware of anything for Safari and am rather certain nothing can be done for the iPhone without Apple intervening.

On the hardware side, AMD, ARM, and Nvidia are backing WebM. Broadcom announced support, as did Qualcomm and TI. These are major vendors for mobile chips. Intel is working on support too.

H.264 Trouble

H.264 is problematic and bad for the web for many reasons I’ve mentioned here before as well as great posts by roc and shaver. I’ll leave it at that rather than rehash.

There was buzz a while back about H.264 being “free” (quotes intentional), but it’s not really “free” if you read the fine print. As Peter Csathy of Sorenson Media notes:

But, you say, MPEG LA recently announced that it will no longer charge royalties for the use of H.264. Yes, it’s true – MPEG LA recently bowed to mounting pressure from, and press surrounding, WebM and announced something that kind of sounds that way. But, I caution you to read the not-too-fine print. H.264 is royalty-free only in one limited case – for Internet video that is delivered free to end users. Read again: for (1) Internet delivery that is (2) delivered free to end users. In the words of MPEG LA’s own press release, “Products and services other than [those] continue to be royalty-bearing.”

That’s hardly “free”. That’s just one potential use case that’s now royalty exempt. The reason they are doing that is presumably that if they can get H.264 adoption high enough, all the other use cases will be paying royalties and therefore subsidizing this one.

WebM is licensed a little differently: patent-wise, it’s irrevocably royalty free, and the license is about as liberal as you can get.

There are no proprietary HTML, CSS, or image formats (GIF was encumbered; that patent is now dead) in use across the web. Why should video be any different? The key to success and growth has always been an open platform that’s low cost and encourages innovation.

Implementing Today

For anyone who suggests that this further fragments the market, that’s not really true. Adobe Flash actually creates an excellent shim to help migrate away from Flash to <video/>. Allow me to explain:

Adobe will soon be supporting WebM in Flash, and Flash already supports H.264. For legacy browsers and those that won’t support WebM, you have the option of delivering a Flash experience, just like most websites do today; there are websites doing exactly this right now via Flash and H.264. For modern browsers you can just use <video/>. Once your non-WebM market share drops low enough, you can get rid of the Flash experience, and soon enough you’ll be able to push WebM to your Flash users as well. The benefit of switching your Flash experience to WebM as a middle step is one encoding for both delivery mechanisms vs. encoding H.264 and WebM in parallel. Of course, if you’re supporting mobile you’ll likely need H.264 a bit longer, though mobile consumption likely uses a smaller resolution and a different profile anyway.
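In markup, the dual path can look something like this (a sketch; the file names and Flash player are placeholders, and flashvars parameters vary by player):

<video controls width="640" height="360">
  <source src="clip.webm" type="video/webm" />
  <source src="clip.mp4" type="video/mp4" />
  <!-- Browsers without <video/> support ignore it entirely and render the Flash fallback -->
  <object type="application/x-shockwave-flash" data="player.swf" width="640" height="360">
    <param name="movie" value="player.swf" />
    <param name="flashvars" value="src=clip.mp4" />
  </object>
</video>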

No matter what there will be two delivery mechanisms for those looking to push video using HTML5 to users today. The only thing that changes is the lean towards standardizing on the actively developed WebM codec vs. H.264.

All new technology has speed bumps, that’s the cost of being on the bleeding edge. However this is a positive turn as things are now starting to line up. The most awesome thing is that the codec, HTML5 specs, and some of the most popular browsers in the world are open and inviting feedback and contributions to improve things.

Categories
Mozilla Security

Firesheep Is Just The Messenger

I must say that I’m glad to see there are no plans to pull Firesheep. Add-ons have a lot of power since they run in a privileged space: anything your browser can access, your add-ons can access. The point of being able to kill add-ons is to protect the user in situations where an add-on is either bundling malware or sending information without the user’s consent. Firesheep does none of that. It behaves exactly as advertised, and it causes no harm to the user or their computer.

Firesheep doesn’t do anything that couldn’t be done with a packet sniffer; it just makes it trivial enough that the average person can do it. It makes a flaw in many websites more visible. More technical folks have known about this for years; Firesheep is just the messenger. These insecure bits of traffic have traveled across the wire for a decade or more. On a shared Ethernet segment, all traffic is visible to all devices; that’s how Ethernet works, the network is a shared medium. It’s just a matter of looking at it. WiFi is a slightly different ballgame, but at the end of the day, if a wireless signal is unencrypted, it’s just a matter of listening.
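To underline how low the bar already was: on a shared network, a standard one-liner like this (the interface name is a stand-in for your own) prints unencrypted HTTP traffic, session cookies included:

tcpdump -A -n -i en0 'tcp port 80'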

I am not a lawyer (nor do I play one on TV) but from a legal perspective I suspect Gregg Keizer is correct in suggesting that it’s likely legal under federal wiretapping statutes (ethics is another debate). However a company likely can still fire you for using it, and a school likely can still kick you out for using it on their network. Private networks have their own rules and policies.

That covers the detection of a session. If you were to actually session jack, that would likely be considered fraud, hacking, identity theft, etc. depending on what you do. Generally speaking, unauthorized access to a computer system is illegal. If you are using someone else’s credentials, that’s by definition unauthorized access.

Electronic communications law is hardly considered developed or mature, but generally there isn’t an expectation of privacy when no encryption is used and transmission is done over a shared connection. It’s akin to speaking to someone on the street and being overheard. That said, if someone reads their credit card number aloud on a cell phone call and you use the information you overheard, it’s still fraud regardless of the interception method.

Bottom line: It’s time to start securing connections.

Categories
Mozilla Security

Firesheep Demonstrates The Need For SSL

There’s been a storm of discussion over the past 72 hours about Eric Butler’s Firefox extension Firesheep. To summarize: it’s a Firefox extension that facilitates session hijacking by packet sniffing for data from certain websites. As far as software goes, it’s more evolutionary than revolutionary; at its core it’s a packet sniffer. The evolution is the pretty UI, which makes it trivial to hijack someone’s session (he really did do a good job on the UI; it’s so easy a child could use it).

It’s actually surprising to me that so many people are shocked by what this demonstrates. Even those who claim to be technically literate seem taken aback. Insecure sites are, by definition, insecure. Anyone can read what’s going across the wire (that includes WiFi) when it’s sent unencrypted. If your browser can interpret and use the information to let you browse Facebook, Twitter, etc., so can any browser, on any computer. It’s that simple. Firesheep only supports a handful of sites, but adding support for more isn’t difficult. If your favorite website hasn’t been done yet, I expect it will be soon enough.

How Do You Protect Yourself?

The best way to protect yourself is to demand that websites holding private information use HTTPS from the moment you log in until you log out. Short of that, the best you can do is use a Firefox extension like EFF’s HTTPS Everywhere to force your browser to use HTTPS. This won’t work everywhere, as not every web server even has HTTPS working, but many quietly do. Sites sometimes use HTTPS for certain things like login, then insecure HTTP for the rest of your visit; that’s so your password isn’t transmitted in plain text. Protecting a password is important, but if the session is insecure, anyone can intercept what you do. HTTPS Everywhere works by rewriting all requests to many popular sites to use HTTPS, ensuring your privacy and security for the length of your visit. Some websites will have minor issues; for example, Facebook Chat is impossible to support right now because it doesn’t work over HTTPS. The rest of Facebook, however, works.

For more advanced users, HTTPS Everywhere lets you write your own rulesets for sites it doesn’t support.
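A minimal ruleset, using the extension’s XML format (example.com is a stand-in), looks something like this:

<ruleset name="Example">
  <target host="example.com" />
  <target host="www.example.com" />
  <rule from="^http://(www\.)?example\.com/" to="https://www.example.com/" />
</ruleset>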

How Do Websites Protect Their Users?

It’s very simple: use HTTPS for the entire period a user is logged in, not just when authenticating and submitting sensitive data. Sure it’s a little slower and requires more hardware, but scaling HTTPS these days isn’t nearly as difficult as it was just 5 years ago, and in 2 years it will be even easier. Google went as far as forcing HTTPS upon all Gmail users. Binding a session to an IP address is fussy and largely ineffective due to NAT, WiFi hotspots, and mobile services that can cause an IP to change with little or no notice. It’s better than nothing, but it’s not effective security and it’s not a fix.
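Session cookies themselves should also carry the Secure flag (and ideally HttpOnly), so the browser never sends them over plain HTTP in the first place. A hypothetical response header:

Set-Cookie: sessionid=d41d8cd98f; Secure; HttpOnly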

Google could make a huge difference by supporting SSL in Google AdSense, something I’ve called for since 2008. Google has supported SSL with Google Analytics for some time, but has lagged in rolling out support across other services. Lots of websites monetize with AdSense, and this is just another reason websites put off supporting SSL. Other ad networks should do the same. Google AdSense has the lowest barrier to entry since Google serves its text ads off of its own infrastructure, vs. creatives hosted by other parties as with some smaller ad networks. One could argue that having third-party code inserted on a page undermines security, but it would still be a major improvement over the current state of affairs and would prevent simple session jacking.

Categories
Mozilla Security

On HTML5 And The Future Of Privacy

Today’s alarmist, light-on-research news story is “New Web Code Draws Concern Over Risks to Privacy”, about HTML5 and its supposed threat to privacy. How evil of HTML5 and its creators.

The Real Deal

Persistent cookies are nothing new. Essentially the strategy works like this: store data in every location you can on the user’s system, and if it’s deleted from a few of those locations, copy it back from another one the next time you can. It’s regenerative by design. A popular example is evercookie, which uses:

  • Standard HTTP Cookies
  • Local Shared Objects (Flash Cookies)
  • Storing cookies in RGB values of auto-generated, force-cached PNGs using HTML5 Canvas tag to read pixels (cookies) back out
  • Storing cookies in and reading out Web History
  • Storing cookies in HTTP ETags
  • Internet Explorer userData storage
  • HTML5 Session Storage
  • HTML5 Local Storage
  • HTML5 Global Storage
  • HTML5 Database Storage via SQLite

Note that several of these aren’t HTML5 specific, and more than one of them isn’t cleared by just “erasing cookies”.
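To illustrate the regenerative design, here’s a toy sketch (not evercookie itself): it uses only standard cookies and localStorage, and the “uid” name is made up. Delete either copy and the next page load quietly restores it from the other.

function getCookie(name) {
  var match = document.cookie.match(new RegExp("(?:^|; )" + name + "=([^;]*)"));
  return match ? decodeURIComponent(match[1]) : null;
}

function setCookie(name, value) {
  // Two-year lifespan requested, far beyond the browser session.
  document.cookie = name + "=" + encodeURIComponent(value) + "; max-age=63072000; path=/";
}

function persistentId() {
  // Read the ID from whichever store survived the user's cleanup...
  var id = getCookie("uid") || localStorage.getItem("uid");
  if (!id) id = Math.random().toString(36).slice(2); // brand new visitor
  // ...then write it back everywhere, regenerating any deleted copies.
  setCookie("uid", id);
  localStorage.setItem("uid", id);
  return id;
}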

HTML5 does add a few new possibilities, but they are also, by design, as easy to control, monitor, and restrict as your browser (or a third-party add-on) will allow. HTML5 storage mechanisms are bound to the host that created them, making them as easy to search/sift/manage as HTTP cookies. Much worse are some of the more obscure cookie methods (Flash cookies, various history hacks). The HTML5 mechanisms don’t really pose any more of a privacy risk than what the browser has already been offering for the past decade.

To Shut Up The Geolocation Conspiracy Theorists

Before someone even attempts the “the Geolocation API lets advertisers know my location” myth, let’s get this out of the way. The specification explicitly states:

User agents must not send location information to Web sites without the express permission of the user. User agents must acquire permission through a user interface, unless they have prearranged trust relationships with users, as described below. The user interface must include the URI of the document origin [DOCUMENTORIGIN]. Those permissions that are acquired through the user interface and that are preserved beyond the current browsing session (i.e. beyond the time when the browsing context [BROWSINGCONTEXT] is navigated to another URL) must be revocable and user agents must respect revoked permissions.

Some user agents will have prearranged trust relationships that do not require such user interfaces. For example, while a Web browser will present a user interface when a Web site performs a geolocation request, a VOIP telephone may not present any user interface when using location information to perform an E911 function.

To my knowledge no user agent implements Geolocation without complying with these specifications. None.

No HTML5 Needed For Fingerprinting

Even if you do manage to wipe all of the above storage locations, you’re still not untraceable. Browser fingerprinting is the idea that your system configuration alone makes you unique enough to be traceable. This includes things like your browser version, platform, Flash version, and various other bits of data plugins may additionally leak. The EFF recently did a rather impressive study of the accuracy of this technique: computers with Flash and Java installed sport 18.8 bits of entropy, and 94.2% of such browsers were unique in the EFF study [cite, pdf]. Of course their data likely skewed towards more experienced web users, who are more likely to have an assortment of customizations on their computer (specific plugins, more variety in web browsers, operating systems, fonts) than the average internet user. I’d wager that their data downplays the effectiveness of this technique.
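A toy sketch of the kinds of attributes such a fingerprint combines (standard navigator and screen properties only; real studies fold in fonts and much more):

var attrs = [
  navigator.userAgent,
  navigator.language,
  screen.width + "x" + screen.height + "x" + screen.colorDepth,
  new Date().getTimezoneOffset(),
  Array.prototype.map.call(navigator.plugins, function (p) { return p.name; }).join(",")
];
// Concatenated (and typically hashed), this alone distinguishes most visitors.
var fingerprint = attrs.join("|");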

The idea that HTML5 is a privacy risk is FUD. It doesn’t pose any greater risk than anything else already out there, and it’s actually easier to counteract than what’s already being used, since it’s handled by the browser.

The Future

I still believe all browsers out there can do a much better job of protecting privacy when it comes to local data storage for the purpose of tracking. What I believe needs to happen is for web browsers to start moving away from the “cookie manager” interfaces that are now a decade-plus old and towards a “my data management” interface that lets users view and delete more than just cookies. It needs to encompass all the storage methods listed above, as supported by the browser. Hooks should also exist so that plug-ins that have data storage (like Flash) can be dealt with in the same UI.

Additionally it needs to be possible to control retention policies per website. For example I should be able to let Google storage persist indefinitely, Facebook for 2 weeks, and Yahoo for the length of my browser session should I wish.

My personal preference would be for the UI to denote the longest storage time requested by any object on a webpage. Clicking on it would give a breakdown of all the hostnames that make up the page and what they are storing, and let the user select their own policy. With 2 clicks I could then control my privacy on a granular level. For example, visiting SafePasswd.com would show me a [6] in the UI. Clicking would show me a panel like this:

+------------------------------------------------------------------------------+
| My Data Settings for SafePasswd.com:                                         |
|                                                                              |
|  Host                        Longest Requested Lifespan    Your Choice       |
|                                                                              |
| *safepasswd.com              2 years                       [site default]    |
| googleads.g.doubleclick.net  6 years                       [browser session] |
|                                                                              |
|                                                                              |
|                                                       (Done)  (Cancel)       |
+------------------------------------------------------------------------------+

I could then override googleads.g.doubleclick.net to the browser session via the drop down if that’s what I wanted, or forbid it from saving anything at all. I could optionally click through for more detail or view the data to help me make my decision. Perhaps this would also be a good place for P3P-like data to be available. One of the notable failures of P3P, and something that impeded usage, was that it was never easy to view, so it never caught on.

The browser would then remember I forbid googleads.g.doubleclick.net from storing data beyond my browser session. This would apply to googleads.g.doubleclick.net regardless of what website it was used on.

This model works better than the “click to confirm cookie” model that only a handful of people on earth ever had the patience for. It provides easy access to control and view information with minimal click-throughs.

It also makes a web page much more transparent to an end-user who could then easily see who they are interacting with when they visit one webpage with several ads, widgets, social media integration points etc.

One click to view data policies, two clicks to customize, three to save.

HTML5 is not a risk here. The web moving to HTML5 is like going from the lawless land to a civilized society where structure and order rule.

Categories
Mozilla Security

Decrypting The Internet

Bruce Schneier on the new wiretapping proposal:

Any surveillance system invites both criminal appropriation and government abuse. Function creep is the most obvious abuse: New police powers, enacted to fight terrorism, are already used in situations of conventional nonterrorist crime. Internet surveillance and control will be no different.

Official misuses are bad enough, but the unofficial uses are far more worrisome. An infrastructure conducive to surveillance and control invites surveillance and control, both by the people you expect and the people you don’t. Any surveillance and control system must itself be secured, and we’re not very good at that. Why does anyone think that only authorized law enforcement will mine collected internet data or eavesdrop on Skype and IM conversations?

I 100% agree here. A security vulnerability, intentional or not, is a vulnerability. Even systems with no known security holes are eventually broken. Look at the recent reverse engineering of HDCP, which was theorized as vulnerable in 2001 but not broken for several years afterwards; a pretty good run. Eventually all security mechanisms will be broken. Starting with something broken just increases the window of opportunity for abuse and misuse.

In theory this proposal could (I’m no lawyer, I don’t even play one on TV) even impact things like Firefox Sync (formerly Weave), which employs the best security mechanism I’ve seen in a service. To summarize, it works by encrypting your data before transmission to the server, but the key is never sent. That means even if the Gestapo took the servers with your data, they would still need to get the key from you, or do battle with the encryption, which isn’t easy. Even Mozilla can’t read your data unless a flaw were found in the encryption algorithm. The question is whether Sync would be considered to fall under “services that enable communications”. That seems broad enough to leave room to argue that Sync facilitates communication, since the browser is the ultimate communication client. The browser is also a valuable target, since it potentially has passwords, bookmarks, and history, giving a good motivation to make that argument. Argue that to a 75-year-old judge who has never used a computer and it might work.

Meanwhile, just weeks ago, the UAE was ironically criticized by the US for proposing a BlackBerry ban for the same reasons.