Categories
Google Mozilla Web Development

Full SPDY Ahead

For those not keeping score, Twitter and Facebook have both come out publicly in favor of SPDY. Twitter is already using it in production. It sounds like Facebook will be soon. Mozilla implemented it in Firefox. Opera has SPDY. Google, the author of SPDY, is using it in production.

This leaves Microsoft and Apple as the holdouts. Microsoft’s HTTP + Mobility is SPDY at its core. Microsoft hasn’t started supporting SPDY in any products, but it seems inevitable at some point. They are a holdout in implementation, but it seems they aren’t opposed to SPDY.

Apple is the last major holdout. SPDY hasn’t been announced for iOS 6 or Mac OS X 10.8. As far as I’m aware, Apple hasn’t made any statement suggesting support for or opposition to SPDY. However, I can’t see why they would oppose it. There’s nothing for them to disapprove of, other than that it’s not using their IP. I’d be surprised if they don’t want to implement it.

However, given SPDY is a rather backwards-compatible thing to support, I don’t see this holding back adoption. Nginx is adding support for SPDY (thanks to Automattic, the company behind WordPress.com), and Google is working on mod_spdy for Apache. That makes adoption possible for lots of large websites.

While the details of SPDY and the direction it will go are still in flux, it seems nearly certain that SPDY is the future of the web. Time to start digging into how to adopt it and ease the transition. The primary concerns I see are as follows:

  1. TLS Required – While not explicitly required by the spec, SPDY essentially builds on TLS, and virtually any real-world deployment needs it. This means purchasing SSL certificates for any website you wish to use SPDY with. Some have raised performance and scalability concerns, but Google, Facebook and Twitter all run SSL extensively on commodity hardware.
  2. IP Address – Unless you use Server Name Indication (SNI), which almost no websites do because of compatibility concerns, you need a dedicated IP address for every hostname you use TLS with (see the sketch after this list). That means until IPv6 is widely adopted, SPDY will put further strain on the remaining IPv4 pool.
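To illustrate what SNI buys you, here’s a minimal Node.js sketch of one IP and port serving a different certificate per hostname. The hostnames and certificate paths are hypothetical; this is just a sketch of the mechanism, not a production setup:

// One IP/port, many hostnames: the client sends the hostname it wants
// during the TLS handshake, and the server picks the matching certificate.
const tls = require('tls');
const fs = require('fs');

function contextFor(host) {
  // Hypothetical certificate paths.
  return tls.createSecureContext({
    key: fs.readFileSync('/etc/ssl/' + host + '.key'),
    cert: fs.readFileSync('/etc/ssl/' + host + '.crt')
  });
}

const server = tls.createServer({
  // Fallback certificate for old clients that don't send SNI.
  key: fs.readFileSync('/etc/ssl/default.key'),
  cert: fs.readFileSync('/etc/ssl/default.crt'),
  // Called with the hostname the client asked for.
  SNICallback: function (hostname, cb) {
    cb(null, contextFor(hostname));
  }
});

server.listen(443);

Without SNI, each of those hostnames would need its own IP address, which is exactly the strain on the IPv4 pool described above.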

Both of the above concerns increase the complexity and cost of building websites, both at scale and for those on a very tight budget (the rest of us will manage). Because of this, I don’t think we’ll see a 100% SPDY or HTTP 2.0 web for quite some time. Don’t expect SPDY on shared hosting sites anytime soon.

In a world of increasing surveillance and user data being integrated into everything, the benefits of TLS will be realized. Both Facebook and Twitter acknowledge its importance in preventing user data from getting into the wrong hands.

I, For One, Welcome Our New SPDY Overlord.

Categories
Internet Mozilla Security

Protecting Photo Privacy Via Browsers

Browsers can do more to protect users from inadvertently violating their own privacy. The NY Times today had an article about a topic that has been discussed in various circles several times now: the existence of geotagging data in photos. Many cameras, in particular smartphones like the iPhone, can tag photos with GPS data. This is pretty handy for various purposes, including organizing photos at a later date; iPhoto, for example, does a pretty nice job of it. Most photo applications, however, don’t make this information very visible. As a result, many users don’t even know it exists; others simply forget.

What the problem looks like

The data embedded in a photo looks something like this:

GPSLatitude                    : 57.64911
GPSLongitude                   : 10.40744
GPSPosition                    : 57.64911 10.40744

That’s an exact location I could map.

Proposal

I propose that browsers adopt a content policy for when users upload images, to better protect them from uploading information they may not even realize is there. Here’s what I’m imagining:

The first time a user attempts to upload a photo that has EXIF or XMP data containing location, they are prompted to choose whether they want it stripped from the image they are uploading. The original file remains unharmed; just the uploaded version won’t have the data. They can also choose to have the browser remember their preference to prevent being prompted in the future, and they can revise their choice in the preferences window later if they want. This isn’t too different from how popups are handled. I think that a per-site policy might be too confusing and not warranted, but perhaps I’m wrong.
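As a rough sketch of the effect (not how a browser would implement it internally), re-encoding an image through a canvas discards all of its metadata, GPS tags included. Something an add-on or a site could do today in plain JavaScript:

// Hypothetical sketch: strip EXIF/XMP by re-encoding the pixels through a canvas.
function stripMetadata(file, callback) {
  var img = new Image();
  var url = URL.createObjectURL(file);
  img.onload = function () {
    var canvas = document.createElement('canvas');
    canvas.width = img.naturalWidth;
    canvas.height = img.naturalHeight;
    canvas.getContext('2d').drawImage(img, 0, 0);
    URL.revokeObjectURL(url);
    // toBlob re-encodes the pixels only; EXIF/XMP never make it into the output.
    canvas.toBlob(callback, 'image/jpeg', 0.92);
  };
  img.src = url;
}

// Usage: stripMetadata(fileInput.files[0], function (clean) { /* upload clean */ });

The downside of re-encoding is a quality hit on JPEGs; a real implementation would rewrite the file and drop only the metadata segments.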

Warning users about hidden information they may be revealing is a worthwhile effort. It’s only a matter of time before someone uses a “contest” or some other form of social engineering to solicit pictures that may reveal location data for users. Evildoers always find creative ways to exploit people.

Caveat

There is a notable caveat to this approach: Flash uploaders would bypass this security measure. Individual uploaders could strip the data themselves, or Adobe could do it at the plugin level, but I don’t think this is enough of a strike against the approach. The same caveat applied to “private browsing” in browsers.

Prior Work

As far as I know, no browser actually implements a security feature like this yet. There are a few Firefox add-ons like Exif Viewer and FxIF (both written in pure JavaScript) that look at EXIF data, but nothing that intercepts uploads.

Who Can Do It First?

I’m curious who can do it first: by add-on (it seems like it should be possible, at least in Firefox), or, dare I say, in a browser itself? If this were earlier in the year I would have added this to the Summer of Code ideas list. Instead I’m just throwing it into the wind until 2011 rolls around.

Categories
Google Mozilla

WebM

In August 2009, after the On2 announcement, I suggested that Google might open source a codec in hopes of derailing Ogg Theora, which it feels is inferior, as well as H.264, which is patent-encumbered. Google took VP8, the successor to the popular VP7 codec, and started The WebM Project. To quote the project page:

WebM is an open, royalty-free, media file format designed for the web.

WebM defines the file container structure, video and audio formats. WebM files consist of video streams compressed with the VP8 video codec and audio streams compressed with the Vorbis audio codec. The WebM file structure is based on the Matroska container.

Google describes the license as “BSD-style”. A very good move, since it’s liberal enough to encourage widespread open and proprietary inclusion. GPL is too viral for some potential adopters.

Software Support

On the browser side, Chromium and Firefox nightly builds support WebM starting today, with Opera and Google Chrome to come shortly.
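For pages that want to serve WebM only where it will play, feature detection is nearly a one-liner. The codecs string below is the standard VP8/Vorbis pairing the project defines:

// canPlayType returns "probably", "maybe", or "" depending on support.
var video = document.createElement('video');
if (video.canPlayType('video/webm; codecs="vp8, vorbis"')) {
  // Serve the .webm encode; otherwise fall back to H.264 or Flash.
}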

Google also created patches against FFmpeg for encoding as well as decoding, and created DirectShow filters, which are available for download. I suspect that, by way of libavcodec, we’ll see support in lots of other products in the near future.

Microsoft will support VP8 in Internet Explorer 9 if you have the VP8 codec installed. Not quite “support”, but better than nothing.

Adobe is also supporting VP8 in Flash, which means content producers can eventually kill VP7 and VP6 encoding and use VP8 to reach most of their audience. This is very important, as encoding videos into several formats is costly and time-consuming (I know this very well).

Hardware Support

Google has already said they are working with video and silicon vendors to add VP8 hardware acceleration to their chipsets. I suspect newer phones will support it in the near future, especially if they run Android.

Content

Google is supporting WebM in the HTML5 test for YouTube, which I mentioned a few months ago. I suspect we’ll see lots more content support in the very near future.

Supporters

Even more telling of the potential than the above is the list of supporters, which contains some big names who can put a lot of weight behind hardware/software/content support. AMD (who owns ATI), NVIDIA, Marvell (lots of mobile chipsets), Qualcomm (think mobile chipsets), TI, Broadcom, and ARM on the hardware side alone make for an impressive list. If the majority of them add hardware support to their upcoming offerings, that will be game changing. On the software side, that leaves 1.5 holdouts in the web video world: Apple (1) and Microsoft (0.5).

This is a game changer.

Categories
Apple Internet

Opera Mini Approved For iPhone

I’ve yet to actually try it myself, but Opera Mini was approved today for the iPhone. While this is the first non-WebKit browser “on” the iPhone, it’s worth noting that the rendering engine isn’t actually on the phone. The rendering is done on a proxy server, which is how they save bandwidth and increase performance.

Interesting, but I’d still like to see other rendering engines on the iPhone.

Categories
Google Mozilla Web Development

Adventures With document.documentElement.firstChild

Here’s an interesting DOM test-case I ran across inadvertently yesterday.

For the purpose of this post assume the following markup:

<!DOCTYPE html>
<html>
<!-- i broke the dom -->
<head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>
    <title>Testcase</title>
</head>
<body>
<p>Something</p>
</body>
</html>

If I use document.documentElement.firstChild, I don’t get consistent behavior. In Firefox and IE I get the <head/> element, which is what I was initially expecting. In WebKit (Safari/Chrome) and Opera, I get the HTML comment, which I wasn’t.
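The difference is whether the engine keeps the comment node as a child of <html/>. If you need the first element consistently across engines, a small helper that skips non-element nodes works everywhere (or just use document.getElementsByTagName('head')[0]):

// Walk the children, skipping comments and other non-element nodes.
function firstElementChildOf(parent) {
  var node = parent.firstChild;
  while (node && node.nodeType !== 1) { // 1 === Node.ELEMENT_NODE
    node = node.nextSibling;
  }
  return node;
}

var head = firstElementChildOf(document.documentElement); // <head> in every engine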

Categories
Mozilla Web Development

Debating Ogg Theora and H.264

Since the big HTML 5 news that there will be no defined codec for <audio/> or <video/>, there has been a lot of discussion about the merits of such a decision and what led to it. To quote Ian Hickson’s email:

Apple refuses to implement Ogg Theora in Quicktime by default (as used by Safari), citing lack of hardware support and an uncertain patent landscape.

Google has implemented H.264 and Ogg Theora in Chrome, but cannot provide the H.264 codec license to third-party distributors of Chromium, and have indicated a belief that Ogg Theora’s quality-per-bit is not yet suitable for the volume handled by YouTube.

Opera refuses to implement H.264, citing the obscene cost of the relevant patent licenses.

Mozilla refuses to implement H.264, as they would not be able to obtain a license that covers their downstream distributors.

Microsoft has not commented on their intent to support

I think everyone agrees this is going nowhere and isn’t likely to change in the near future. For the sake of moving HTML5 forward, this is likely the best decision.

Here’s how I interpret everyone’s position:

Apple’s Argument

One of the undeniable perks of H.264 right now is that hardware decoding is available and used on certain devices, one of the most notable being the iPhone. Using hardware decoding means you’re not using the CPU, which results in better performance and, most importantly, better battery life.

Thus far there’s no hardware Theora decoder on the market (if you know of any, let me know; my research says none), which I suspect is why Apple is hesitant to jump on board. Until there’s hardware that’s proven to perform well, be cost-effective in the quantities Apple needs, and not be bombarded with patent infringement claims, I suspect they’d rather stick with H.264. The patent part is critical. Apple can update software to comply with patent disputes pretty quickly, as many other companies have done in the past. Hardware is not so easy. Last-minute hardware changes are harder to deal with than software changes because of the many things they impact, and the inability to update at a later date.

I’m almost positive the lack of hardware support is the exact same reason Apple has been so against Flash support. Remember, the YouTube application isn’t using VP6 like regular Flash; it’s using H.264 (that’s why it took so long for all of YouTube to be available on the iPhone).

If there’s enough Theora content out there, Theora decoder hardware will likely be made to meet market demand. Getting to that point will be difficult given the amount of VP6 (Flash) and H.264 content already on the web. H.264 alone has a major head start in applications, and VP6 has several years of video on the web now (and I still don’t think it has a hardware decoder on the market, though that might be due to licensing again).

In the long run, I think mobile technology will improve enough to make this a somewhat unnecessary constraint. Mobile CPUs and GPUs are just starting to reach the caliber needed for video. Performance per watt should improve. Battery technology is just starting to get pushed to its limits. This is a good thing for Theora in the long run, but the question is: how long?

Until it can be played with minimal impact on battery life, I don’t think any company that has a heavy investment in mobile will want to jump on board.

Google’s Argument

Google has money and can license H.264. Shocker. Google, however, has trouble when it comes to Chromium. I suspect Google doesn’t care too much about which way this goes, since what they support in Chrome doesn’t mandate that YouTube support it. However, if the encoding quality for a given bitrate is good enough, it becomes a viable option.

Regarding the quality argument, I’ll simply point to this comparison. I think the quality today is comparable already, and likely to get better as the encoders improve. I’ll leave this discussion here.

Opera’s Argument

Opera says H.264 is too expensive to license. I don’t know what the costs are, or what they would be for Opera, but I’ll take their word on it. After all, they do have a product available for free download. While commercial and closed source, they don’t have Google’s revenue stream, and I respect that.

Mozilla’s Argument

Mozilla can’t license H.264 for downstream Gecko use, etc. I’m sure a good part of the argument is also that requiring licensing fees to use <video/> is bad for the web and open source. I agree.

Microsoft’s Argument

No comment. Historically they implemented <marquee/> but not <blink/>. Make of that what you will.

<video/> could be supported by plugin if needed. I recall Adobe supporting SVG by plugin a few years ago.

Where to go from here?

I think there are a few possible outcomes. As for what I think are the most likely:

  1. There’s a push for hardware decoding that makes Theora on mobile technically possible and performing well. If Apple is legally satisfied and jumps on board, that changes the game. As I stated earlier, I think Google is mostly ambivalent since they support both right now. Opera doesn’t want H.264 anyway, so they are cool. IE 8 can likely be handled by a plugin. Apple really is the deciding factor. Theora is the future.
  2. See what the web does. I suspect at least for a long while the web will just stick with Flash since it works on almost all desktops. For mobile, the iPhone and Android make up pretty much the bulk of the mobile video market, and that doesn’t look like it’s changing so fast. Content providers that want mobile will encode for mobile. That means 3 target platforms: not ideal, but reasonable. H.264 and whatever Adobe adopts is the future. A sketch of what straddling the codecs looks like for a page follows this list.
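Either way, pages will be straddling codecs for a while. Here’s a minimal sketch (hypothetical file names) of picking whichever source the browser can actually play:

// Try each encode in order and use the first one the browser claims to play.
function pickSource(video) {
  var candidates = [
    { src: 'clip.ogv', type: 'video/ogg; codecs="theora, vorbis"' },
    { src: 'clip.mp4', type: 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"' }
  ];
  for (var i = 0; i < candidates.length; i++) {
    if (video.canPlayType(candidates[i].type)) {
      video.src = candidates[i].src;
      return true;
    }
  }
  return false; // neither codec supported: fall back to a Flash player
}

The <source/> element does the same thing declaratively; the scripted form just makes the Flash fallback explicit.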

I know how the media is interpreting all of this. How do other developers and open source folks see it?

Categories
Mozilla

Microsoft Cutting Back On IE?

Asa pointed out an interesting CNBC piece regarding cutbacks in what looks like contractors on the IE team:

One of the units already seeing cutbacks is Microsoft’s sagging browser business. A report in the Seattle Times says 180 contract workers were told last month that their services would not be renewed. Just yesterday, researcher Net Applications reported that Microsoft’s Internet Explorer browser registered 68 percent market share in December, down from 74 percent in May.

If this is true, and I think it is likely, as CNBC is a rather reputable source of business news, I predict Trident’s days are numbered. As I pointed out back in November, Ballmer suggested they might look at WebKit. I should note I do not think this will have any impact on IE 8, which is nearly complete. They could, of course, choose Gecko, which would save them from needing to work with Google and Apple (which might freak out some government regulators).

The other very real option is to either license Opera’s Presto engine, or simply buy Opera, which would give them some strength in the mobile market. I think Microsoft would prefer to buy, simply because of the mobile implications. Opera has a decent foothold in the mobile market. They would still have the expense of developing a rendering engine, but instead of playing catch-up they would be much more “ready to play”. This would save them the overhead of trying to cram several years of development into simply catching up to the other browsers. Since Presto is proprietary, they could still utilize their other proprietary technologies without leaking any code to the open source community. As I said in the past, keeping things proprietary is important to Microsoft’s web strategy.

Poor standards compliance, poor performance, bugs lingering for years, and security issues have all plagued this rendering engine. The final nail in the coffin might end up being a recession and the need to cut costs.

Of course, it’s possible Microsoft isn’t renewing these contractors simply because IE 8 is nearly done, which would just slow down IE 9 development, but I don’t think that’s likely considering the speed at which the competitors are moving. I don’t think Microsoft will fall asleep at the wheel a second time.

So I’d like to adjust my statements back in November regarding Microsoft’s use of WebKit. I said before that it was unlikely. If this news is true, I think it becomes very realistic that they will drop Trident. Maybe it really is as busted internally as we’ve all suspected for years.

There will still be fierce competition between WebKit, Gecko, and Presto regardless of what happens. Innovation and competition are essential to a healthy internet. This in fact makes it much more competitive since the one in last place in terms of supporting the latest in standards would suddenly catch up overnight.

Enough speculation for now. Let’s see what turns out to be fact, and what turns out to be CompSci Fiction.

Edit [1/3/2009 @ 9:40 PM EST]: Via Asa, apparently the layoffs were actually on the MSN Homepages team, not the IE team as CNBC suggested.

Categories
Internet Web Development

MAMA Scripting Analysis

Opera did some interesting research into JavaScript used on the web. As someone who writes a fair amount of JavaScript and reads through countless lines of other people’s scripts, I found this to be pretty interesting.

Overall none of the results were very surprising, though a few things did catch my eye:

  • Omniture/SiteCatalyst analytics ranks pretty high in the results. This suggests to me that the index of pages skews towards enterprise and large sites, since Omniture is a rather expensive service.
  • Google Analytics made the list, to nobody’s surprise. I am, however, surprised not to see Quantcast, which seems to be pretty popular now.
  • The popularity of window.open really hurts. Opening in a new window is so counter to how things are supposed to work. The user should decide on their own if they want to pop a new window (or tab). Most sites do this hoping the user forgets about the previous window, which improves their “average time on site” metric.
  • VBScript usage is slightly disturbing. Thankfully (in my experience) it’s most often found on older sites.

I wouldn’t mind knowing the popularity of scripts like SWFObject and Lightbox, assorted clones and PNGFix.

An analysis of graphics on the web could be interesting: GIF, JPEG, PNG; then an analysis of palette use for GIFs, JPEG compression levels, alpha transparency, interlacing, average file size, and the average number of images per page.

Categories
Mozilla

No Opera For iPhone

I’m not too thrilled to read this:

Mr. von Tetzchner said that Opera’s engineers have developed a version of Opera Mini that can run on an Apple iPhone, but Apple won’t let the company release it because it competes with Apple’s own Safari browser.

This isn’t news; it’s been known for a while. I’m honestly wondering why Opera invested the development time knowing this.

Apple’s going to learn the hard way that if it doesn’t drop this clause, it’s going to be subject to Android’s wrath. Android is going to take some time to gather steam (I’d guess at least 18 months before it can catch up to the iPhone, due to it still being pretty clunky and limited in availability), but when it does catch up, it could be problematic.

It would be great to see an iPhone version of Fennec, but until Apple wises up, it’s not going to happen.

I predict just like Apple initially had a “no third party applications” policy, this too will change once it becomes obvious that this will end up hurting them in the long run. The question remains: how long will that take?

Categories
Mozilla

Opera’s Evangelism

Opera is said to be sending evangelism emails to websites that have compatibility problems with its browser. What’s interesting is that they are customizing the emails with actual fixes for the problems. This is pretty clever. In theory it will reduce compatibility problems and make the web more standards compliant (which is where Opera excels).

One thing I do question is whether webmasters will read it, at least where it matters. Most large companies have a contact form or an email address, but it’s often forwarded to customer support, or sometimes just into a giant bin where only a handful get processed. Will the information get to the people who need it? I suspect it will for small companies, who read all the email they get from the web. For large companies, I doubt it, and that’s where I think it matters most: the bigger sites that the majority of the web visits.

Regardless, it’s interesting to see, for me in particular since I wrote Reporter. I suspect the best effort is still to encourage the industry as a whole to adopt best practices. Considering the move to mobile, and the need to be more flexible on the front-end, using standards is just becoming more of a requirement. I think that will ultimately end up being the winning effort. It’s already winning, as newer sites are generally pretty good when it comes to standards. The old ones will take time.

With Safari 3 and Opera 9.5 out, Firefox 3 taking off, and IE 8 coming soon, it’s pretty obvious that standards are the future.