From Mozilla Research:
There’s another attack on Java via a new zero-day flaw. This is why I no longer keep Java enabled in web browsers. If you still do, I’d suggest turning it off. There’s a good chance you won’t miss it.
I’ve yet to get there with Flash, but the day is coming. After the previous post a few months ago, I think I like the idea of a blacklist/whitelist for plugins in general that allows a user to enable them only for specific hostnames. That would make it a bit more intuitive to use plugins when they’re still needed, while gaining the security of not having them available for every hostname you happen to stumble upon. The options would be something like:
Enable [plugin name] on [hostname.tld] for: (This session only) (Forever) (Never)
For certain things like YouTube, you could enable Flash forever since Google is rather trustworthy. For other sites, perhaps just the session. For others, maybe never.
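The idea above amounts to a small default-deny policy table keyed on plugin and hostname. A minimal sketch of what such a policy could look like (hypothetical; the names and API here are invented for illustration, not any real browser interface):

```javascript
// Hypothetical per-hostname plugin policy with the three states
// described above: this session only, forever, or never.
const SESSION = "session", ALWAYS = "forever", NEVER = "never";

function makePolicy() {
  const rules = new Map(); // "plugin|hostname" -> state
  return {
    set(plugin, host, state) { rules.set(plugin + "|" + host, state); },
    // Default-deny: a plugin only runs where the user opted in.
    allowed(plugin, host, sessionActive = true) {
      const state = rules.get(plugin + "|" + host);
      if (state === ALWAYS) return true;
      if (state === SESSION) return sessionActive;
      return false; // NEVER, or no rule at all
    },
  };
}

const policy = makePolicy();
policy.set("Flash", "youtube.com", ALWAYS);
policy.set("Flash", "example.com", SESSION);
console.log(policy.allowed("Flash", "youtube.com"));     // true
console.log(policy.allowed("Flash", "random-site.com")); // false
```

The important design choice is the fall-through: any hostname without an explicit rule gets nothing, which is exactly the security property the whitelist approach is after.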
It’s becoming clear to me that despite Apple having a huge chunk of the mobile web, it still treats the web as a second-class citizen on iOS and Mac OS X. My latest battle is adaptive images, in particular for use on high-DPI devices (“Retina” on the Mac). High-DPI displays are awesome. I own an iPad 3, and it’s one of the greatest displays I’ve ever looked at. What I don’t get is why Apple is making it so difficult to take advantage of as a developer.
Currently, there’s no easy way to switch image resolutions based on the display being used. The basis of that isn’t Apple’s fault; nobody thought of the problem when HTML was first created. All of the methods have ugly tradeoffs. They are hacks. Even Apple.com doesn’t have a great solution: they were doing image replacement (easier to read version here). Apple does, however, have a solution called image-set, which looks like it will be in iOS 6 and Mac OS X 10.8.
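For reference, the proposal is a CSS function that hands the browser a list of assets and lets it pick the one matching the display density. Roughly like this (vendor-prefixed in WebKit; the exact syntax was still in flux at the time, so treat this as a sketch):

```css
/* Let the browser choose: 1x for normal displays, 2x for high-DPI. */
.hero {
  background-image: url(photo.png); /* fallback for browsers without image-set */
  background-image: -webkit-image-set(url(photo.png) 1x,
                                      url(photo@2x.png) 2x);
}
```

The second declaration simply overrides the first in browsers that understand it, so older browsers degrade gracefully to the 1x asset.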
That’s several months later. The iPad 3 was announced March 7, which was only about 2 weeks after the initial proposal. Why wasn’t a solution for web authors included when the iPad 3 shipped? It seems silly that there’s no API to properly interface with one of the most touted features of the new device. Of course there’s a way to take advantage of that brilliant display if you build a native app.
You could argue that Apple didn’t want to rush implementing something proprietary without discussing it with the community at large, but Apple has said in the past:
tantek (Mozilla): I think if you’re working on open standards, you should propose your features before you implement them and discuss that here.
smfr (Apple): We can’t do that.
sylvaing (Microsoft): We can’t do that either.
So Apple is pretty candid about reserving the right to implement features without discussion, yet nothing happened. It’s not that such a discussion would have been a “leak”. The iPhone 4’s display would be adequate justification for the feature. In fact it’s mentioned in the first sentence of that proposal in www-style by Edward O’Connor. So not disclosing the new product doesn’t seem to be the reason either. Apple could have done this a year ago without anyone being any wiser about the iPad 3’s display.
I can’t be the only one who’s scratching his head over this. Why didn’t the iPad 3 ship with a browser capable of providing an efficient way to switch images? The cynic in me would say “to encourage native app development”, but then why bother now?
The upside is that Apple products have high OS adoption rates. All those Retina iPad 3s will be running iOS 6 relatively quickly. If it were a popular Android device I’d be much more concerned, because we’d be dealing with two years of devices on a stale OS with no support. This is why we need more competition in mobile. We need web solutions to be a priority, not an afterthought.
As far as I’m aware, image-set is also prefixed, but that’s another rant.
For those not keeping score, Twitter and Facebook have both come out publicly in favor of SPDY. Twitter is already using it in production, and it sounds like Facebook will be soon. Mozilla implemented it in Firefox. Opera has SPDY. Google, the author of SPDY, is using it in production.
This leaves Microsoft and Apple as the holdouts. Microsoft’s HTTP + Mobility is SPDY at its core. Microsoft hasn’t started supporting SPDY in any products, but it seems inevitable at some point. They are a holdout in implementation, it seems, but not opposed to SPDY.
Apple is the last major holdout. SPDY hasn’t been announced for iOS 6 or Mac OS X 10.8, and as far as I’m aware Apple hasn’t made any statement suggesting support for or opposition to SPDY. However, I can’t see why they would oppose it. There’s nothing for them to disapprove of, other than that it’s not using their IP. I’d be surprised if they don’t want to implement it.
However, given that SPDY is a rather backwards-compatible thing to support, I don’t see this holding back adoption. Nginx is adding support for SPDY (thanks to Automattic, the company behind WordPress.com), and Google is working on mod_spdy for Apache. That makes adoption possible for lots of large websites.
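For the curious, once that nginx support lands, turning it on should amount to little more than a flag on the listen directive. A sketch, assuming nginx built with the SPDY patch (paths and hostname are placeholders; SPDY requires TLS, hence the certificate lines):

```nginx
server {
    # SPDY is negotiated over TLS (via NPN) on the same port as HTTPS,
    # so a certificate is required even for the SPDY-only case.
    listen 443 ssl spdy;
    server_name example.com;

    ssl_certificate     /etc/nginx/ssl/example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    location / {
        root /var/www/example.com;
        # Clients that don't speak SPDY transparently fall back to HTTPS.
    }
}
```

That fallback behavior is what makes SPDY "rather backwards compatible": the same port serves both protocols, and no client ever sees an error for lacking SPDY support.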
While the details of SPDY and the direction it will go are still in flux, it seems nearly certain that SPDY is the future of the web. Time to start digging into how to adopt it and ease the transition. The primary concerns I see are as follows:
Both of the above concerns increase complexity and cost of building websites at scale and for those who are on a very tight budget (the rest of us will manage). Because of this, I don’t think we’ll see a 100% SPDY or HTTP 2.0 web for quite some time. Don’t expect SPDY for shared hosting sites anytime soon.
In a world of increasing surveillance and user data being integrated into everything, the benefits of TLS will be realized. Both Facebook and Twitter acknowledge its importance in preventing user data from getting into the wrong hands.
I, For One, Welcome Our New SPDY Overlord.
Google is persistent about associating Chrome with being fast. It was their primary pitch when they first announced it. Back when Firefox hit 1.0, it wasn’t so much about speed but about “not sucking”, as all the geeks liked to say. Given that IE 6 was the competition, that was likely the best marketing on earth. Sure it was faster, but sucking fast wasn’t nearly as good as not sucking. Not sucking encompassed the missing features, broken rendering, crashing, and constant parade of security problems. It summarized the product surprisingly well for something that was never an official slogan.
Google has now launched Chrome for iOS. On the desktop, Chrome and Safari both use WebKit, but Chrome applies its own touches to make things faster; notably, it has its own JS engine. Safari also has its own JS engine. This is the secret sauce of performance. In the iOS world, however, Apple, being the totalitarian dictator, decided that iOS will provide WebKit and JS. If your app has any web browser functionality, it must use these APIs and not implement its own engine.
Google Chrome for iOS, however, is Google integration into a reskinned experience of Safari. It’s the same browser, just a new UI bolted on with some Google features integrated. It’s not a separate browser. It’s a UI.
That, however, doesn’t stop Google’s marketing machine (I’d argue Apple marketing’s top rival) from putting “fast” as the second word:
Browse fast with Chrome, now available on your iPhone, iPod touch and iPad. Sign in to sync your personalized Chrome experience from your computer, and bring it with you anywhere you go.
It goes on to clarify:
- Search and navigate fast, directly from the same box. Choose from results that appear as you type.
So Google isn’t truly misleading. It’s just very strategic wording.
How much slower is Google Chrome on iOS in comparison to Safari? Well, here’s a SunSpider test I did on my iPad 3:
Safari:

============================================
RESULTS (means and 95% confidence intervals)
--------------------------------------------
Total: 1817.9ms +/- 0.2%
--------------------------------------------
3d: 214.7ms +/- 1.1%
  cube: 72.3ms +/- 0.7%
  morph: 57.9ms +/- 0.9%
  raytrace: 84.5ms +/- 2.2%
access: 224.9ms +/- 0.6%
  binary-trees: 44.4ms +/- 1.7%
  fannkuch: 96.2ms +/- 0.6%
  nbody: 56.0ms +/- 0.0%
  nsieve: 28.3ms +/- 2.7%
bitops: 141.0ms +/- 0.4%
  3bit-bits-in-byte: 23.4ms +/- 1.6%
  bits-in-byte: 29.5ms +/- 1.3%
  bitwise-and: 37.8ms +/- 1.5%
  nsieve-bits: 50.3ms +/- 0.7%
controlflow: 15.7ms +/- 2.2%
  recursive: 15.7ms +/- 2.2%
crypto: 123.3ms +/- 0.6%
  aes: 70.5ms +/- 0.5%
  md5: 29.4ms +/- 1.3%
  sha1: 23.4ms +/- 1.6%
date: 274.4ms +/- 0.7%
  format-tofte: 139.8ms +/- 1.1%
  format-xparb: 134.6ms +/- 0.7%
math: 175.1ms +/- 0.3%
  cordic: 61.5ms +/- 0.8%
  partial-sums: 74.4ms +/- 0.7%
  spectral-norm: 39.2ms +/- 0.8%
regexp: 70.8ms +/- 0.6%
  dna: 70.8ms +/- 0.6%
string: 578.0ms +/- 0.5%
  base64: 78.3ms +/- 1.9%
  fasta: 68.1ms +/- 0.9%
  tagcloud: 109.5ms +/- 1.2%
  unpack-code: 207.5ms +/- 1.2%
  validate-input: 114.6ms +/- 0.7%
Chrome:

============================================
RESULTS (means and 95% confidence intervals)
--------------------------------------------
Total: 7221.0ms +/- 0.1%
--------------------------------------------
3d: 802.7ms +/- 0.2%
  cube: 230.9ms +/- 0.6%
  morph: 297.3ms +/- 0.5%
  raytrace: 274.5ms +/- 0.1%
access: 1112.0ms +/- 0.2%
  binary-trees: 98.4ms +/- 1.1%
  fannkuch: 609.6ms +/- 0.2%
  nbody: 247.9ms +/- 0.2%
  nsieve: 156.1ms +/- 0.4%
bitops: 957.2ms +/- 0.2%
  3bit-bits-in-byte: 210.4ms +/- 0.6%
  bits-in-byte: 232.9ms +/- 0.2%
  bitwise-and: 188.5ms +/- 0.4%
  nsieve-bits: 325.4ms +/- 0.2%
controlflow: 129.5ms +/- 0.3%
  recursive: 129.5ms +/- 0.3%
crypto: 493.3ms +/- 0.2%
  aes: 214.3ms +/- 0.4%
  md5: 140.2ms +/- 0.3%
  sha1: 138.8ms +/- 0.5%
date: 381.1ms +/- 0.3%
  format-tofte: 214.2ms +/- 0.2%
  format-xparb: 166.9ms +/- 0.5%
math: 770.7ms +/- 0.2%
  cordic: 316.6ms +/- 0.2%
  partial-sums: 243.2ms +/- 0.3%
  spectral-norm: 210.9ms +/- 0.4%
regexp: 1340.2ms +/- 0.2%
  dna: 1340.2ms +/- 0.2%
string: 1234.3ms +/- 0.6%
  base64: 175.7ms +/- 0.5%
  fasta: 205.6ms +/- 0.2%
  tagcloud: 284.0ms +/- 2.3%
  unpack-code: 370.1ms +/- 0.9%
  validate-input: 198.9ms +/- 0.6%
Quite a bit slower.
That, however, doesn’t stop many people from thinking it’s fast. In just the past few minutes I was able to find these tweets among the thousands streaming across the web. I won’t mention or link to them directly (you could find them if you wanted):
“Chrome for iOS is FAST, takes the mobile browsing experience to a new level.”
“I like it! It’s fast and can sync with Chrome desktop, which I use all of the time.”
“Liking #chrome on #iOS very slick, fast and clean looking”
“using chrome on my iphone right now.. cant believe how fast it is”
“That chrome for iOS is freaking fast but so basic. No tweet button, no add-on. Man I kinda disappointed. I give ‘em 1 ‘fore the update”
“Chrome for iOS? Hell yes!! So fast! #chrome”
“Google Chrome for iOS is fast.”
The most touted feature isn’t actually a feature. It’s technically not even there; the numbers and the technology prove it’s actually slower. But that’s what everyone is ranting and raving about. You could argue Google’s UI is faster, but I’d be highly skeptical that Google has found Cocoa tricks Apple’s engineers haven’t. Perhaps a UI transition or two makes it feel faster or more responsive, though even for that I can’t find any evidence.
All the hard work the Google engineers did squeezing their services into a compact, simple-to-use UI is ignored in favor of this non-existent feature. And as a developer who can’t ignore such a thing, I will say they did a great job with their UI.
I present to you, the power of marketing!
For the past two years now I’ve been browsing the web with Java disabled. I’ve had fewer than five situations where I needed to turn it on to do something, and all of those were situations with a limited audience (a very old technical tool, intranet applications). I’m of the opinion that you really don’t need it enabled to happily browse the web anymore. I can’t disable Flash yet, but Java I seem to be largely fine without. I still have it on my computer in case I need it, but that’s seldom.
Given the past security issues and the fact that Java is outright annoying UI-wise and slow to load, I don’t miss it at all. It served a purpose years ago, when it was difficult to build apps in a webpage, but those days are long gone. It’s amazing if you remember Java being used for mouseovers way back when.
Apple’s iPad 3 video starts off with what I think should be the guiding principle behind all user experience:
We believe technology is at its very best when it’s invisible. When you’re conscious only of what you’re doing, not the device you’re doing it with…
Apple is still a hardware company selling iPads, so they used the word “device”, but it’s safe to change this to “technology” and not lose anything. Go ahead, read that sentence again before continuing.
That principle is the reason the iPad is dominating the tablet market. That principle is the reason the iPhone sells so well despite its high price tag (in a bad economy no less) and being so locked down. If it wasn’t for that philosophy Apple would be in trouble. That principle is the explanation for everything that technology implementors just don’t get about Apple. Same goes for Facebook and even Google (to a degree). That principle is everything in consumer technology.
This is why I disagree with the “learn to code” mantra of 2012. It’s well-intentioned, but it shouldn’t be necessary. It violates this golden principle; it completely flips it upside down. It makes only the technology visible and abstracts away what you’re actually trying to accomplish. It’s the complete opposite of what users want and expect from technology. That is why programming never became mainstream. That’s why repairing your own car or home appliances isn’t mainstream. When you make the technology the focus, you lose.
We won’t have flying cars until the necessary technology is simplified to the point where it’s as simple as steering in the direction you want to go, plus some basic flying rules (which are etiquette more than technology limitations). You don’t expect people to understand lift coefficient (CL) or angle of attack (AOA) to go grocery shopping. That’s why we have pilots while everyone drives cars. I expect a pilot to understand these concepts and avoid a stall. When it’s Jetsons-simple, we’ll have flying cars.
Want to enable creation? Abstract the technology to the point where the user only focuses on content creation. There’s a reason email didn’t take off until AOL made a pretty easy-to-use client (by ’90s standards). There’s a reason photo sharing didn’t take off when you could just email photos to someone. There’s a reason people aren’t creating content outside walled gardens. People only care about the activity and the goals they have in mind, not the technology that makes it possible.
The last major innovation in web content creation outside a walled garden was the WYSIWYG editor. Look around; few still exist. The ones that do are focused on FTPing static pages to a web server. Not one that I’m aware of would let a user generate, for example, a WordPress or Drupal theme without touching code. Purely WYSIWYG. It’s 2012 and it’s not possible to create a blog theme without merging markup and some server-side code (PHP in this example). As a reference point, support for a handful of CMSs would cover a huge chunk of the web not owned by large companies. You shouldn’t need to understand CSS selectors to set a background color, and you shouldn’t need to know that #000000 is “black” (which can also be used).
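To make the point concrete, here is everything a non-technical user is currently expected to absorb just to make a page background black:

```css
/* The "technology-visible" version a user must learn: a selector,
   a property name, and a hex triplet (or the keyword "black"). */
body {
  background-color: #000000; /* equivalently: black */
}
```

A selector, a property, and hexadecimal color notation, all to express one intent a color picker could capture in a single click.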
The suggestion that users are in the wrong for not being willing or able to learn is invalid. They shouldn’t need to.
Enabling content creation needs to be done the same way enabling content consumption is done: by making it so the technology is invisible and task at hand is the sole focus. Why should creating a spreadsheet with my finances be less technically complicated than publishing a paragraph of text on the web?
We’ve failed if the only way to participate on the web is to fully understand the technology. Walled gardens have managed to abstract it fairly well. Surely there’s a better way1.
1. I’ve got more thoughts on that, but I’ll save it for another day/blog post.
Once again the debate over H.264 has come up in the Mozilla community. I’ve been a strong advocate of the WebM/VP8 codec given its liberal license and abilities, and I still am, but I agree H.264 needs to be supported. It’s a requirement for mobile (B2G), and it’s becoming necessary on the desktop.
A little over a year ago Chrome talked about dropping support for H.264. To date they have not done so, or given any indication that it’s even still in the plans as far as I know. In 2010 Adobe said they would be supporting WebM (link in that same blog post). They too have failed to live up to their promises. In either case I’ve found no indication on the internet that they ever plan to go forward with those plans.
I suspect in Google’s case they were pressured by various providers and mobile partners who don’t want to encode or support another encoding. Google’s been trying to woo anyone/everyone for the purposes of Google TV and presumably YouTube. It’s likely just not worth it for them to push. There are various theories floating around about Adobe including a lack of clear Flash strategy in an HTML5 world. Adobe does however have a “tools” strategy. Perhaps time will tell.
Furthermore, Apple and Microsoft are fundamentally opposed to WebM, as they are both licensors of H.264. The odds of them supporting something that hurts their bottom line, unless the rest of the web is threatening to leave them behind, are nearly zero.
I question, however, whether it should be bundled versus using system codecs. Windows XP aside, system codecs mean that Microsoft and Apple are essentially responsible for making them work, as well as for the expense. Plugins could be used for OSes that don’t ship with the appropriate codecs.
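However the codecs get shipped, web authors today hedge by listing multiple sources and letting each browser play the first type it can decode, roughly like this:

```html
<video controls>
  <!-- The browser plays the first source whose MIME type and codecs
       string it supports; order expresses the author's preference. -->
  <source src="clip.webm" type='video/webm; codecs="vp8, vorbis"'>
  <source src="clip.mp4"  type='video/mp4; codecs="avc1.42E01E, mp4a.40.2"'>
  Your browser doesn't support HTML5 video.
</video>
```

This works in every browser, but it is exactly the cost being debated: every video encoded and stored twice.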
The real issue is larger than the <video/> element. It’s software patents and their ability to undermine innovation and progress. It’s important to keep this in mind. Just look at mobile: it’s entirely possible that the whole mobile industry could come to a halt over patent lawsuits and fear of lawsuits. All it takes is a company willing to press the button. Google spent $12.5 billion on what is essentially the patent equivalent of nuclear proliferation. That’s how real the threat is perceived to be. H.264 is arguably a fart in a hurricane.
We do many things throughout the day. Most of the time we don’t give these things much thought. Often they are repetitive tasks we do every day. Our “routine”, we call it. It may be that bathroom break mid-day, or that coffee break. Or it might be those n Google searches throughout the day. You might be able to name some of them and put a count to them, but stop and think for a second. How many things do you actually know how many times you’ve performed? How much time was spent? How much energy or expense?
Companies collect this information, but strangely, individuals don’t. The companies we deal with often know more about us than we do. Google knows how many times you searched in a given day; it may (depending on your privacy settings) be able to recall every search you ever made, a feat I bet you can’t perform. Your credit card company knows how many times you purchased coffee at a given store in a given year. You quite possibly have no idea.
Stephen Wolfram has been analyzing his life for years. Just tiny aspects of it. The data is stunning. It makes you wonder why we don’t have more products out there that give us access to and control of our own data. Everyone else has more access to it than we have.
Collusion is a Firefox extension that gives another little bit of insight: who knows where you’ve been online. Try installing it and running it for a week. It’s fascinating to see. But still, so much in the browser isn’t exposed to the user. Your search history knows what you searched for. Your browser history knows when you browse the web and where you’re going. There’s a mountain of data there; the authorities use it when a crime is committed for a reason. about:me is a great extension for getting a little more of this information out of Firefox. It’s a fascinating area where I hope we’ll see more people spend time. The great thing about these is that they are client-side and private. You don’t need to give your data away to someone else if you want to learn about yourself.
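As a small taste of what exposing that mountain of data could look like: Firefox’s places.sqlite records one row per visit with a microsecond timestamp, and collapsing those into a visits-per-day tally takes only a few lines. A sketch, using a hard-coded sample array in place of the real database (the file paths and schema access are deliberately left out):

```javascript
// Sample visit times in microseconds since the epoch -- the unit
// Firefox's places.sqlite uses for moz_historyvisits.visit_date.
const sampleVisitsUs = [
  Date.UTC(2012, 6, 1, 9)  * 1000, // 2012-07-01 09:00 UTC
  Date.UTC(2012, 6, 1, 21) * 1000, // 2012-07-01 21:00 UTC
  Date.UTC(2012, 6, 2, 8)  * 1000, // 2012-07-02 08:00 UTC
];

// Collapse raw visit timestamps into a visits-per-day tally.
function visitsPerDay(visitsUs) {
  const counts = {};
  for (const us of visitsUs) {
    const day = new Date(us / 1000).toISOString().slice(0, 10); // YYYY-MM-DD
    counts[day] = (counts[day] || 0) + 1;
  }
  return counts;
}

console.log(visitsPerDay(sampleVisitsUs));
// { '2012-07-01': 2, '2012-07-02': 1 }
```

The point isn’t the code; it’s that the raw material for personal analytics is already sitting on your disk, private and client-side, waiting for tools to surface it.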
However, we’re still in the infancy of personal analytics. There are very few products out there to let us know what we do all day. FitBit can tell you when you sleep, when you’re active and how active you are, but not terribly much else about you. Your computer has a wealth of info, but really doesn’t tell you much. To even get a little out of it you need to be fairly technically adept.
I propose it’s time to encourage people to start learning more about themselves. Data is amazing and can change our behavior for the better. Data is all around us yet somehow it eludes us. Big companies know things about us that even we don’t know. Perhaps it’s time to change that?
There’s been a lot of talk today about Telefónica’s involvement, but it’s worth noting the Mozilla blog announcement also mentions Deutsche Telekom Innovation Labs will join the Boot to Gecko project with dedicated development resources. That’s a pretty big deal.
The ability to run on lower-end hardware, which is cheaper to produce in quantity, will make a huge difference. Tech in general tends to focus on North America, Europe, Japan, Korea, Brazil, and Australia as target markets. It does this because these are wealthy countries; reasonably free markets, similar tastes, and trade agreements make them favorable.
This, however, has a huge downside: overall it excludes a huge chunk of the world. China alone is about 1.3 billion people (CIA estimate, 2012) with a GNI per capita of $7,570 (compared to $47,120 for the US). As large as Brazil is (~192.3 million people), it’s only half of South America’s 385.7 million.
Take a look at the map in terms of GNI:
Now compare that to population density. Pay close attention to Asia, South Asia and Western Africa:
Who’s dominated this market to date? For most of that time it’s been Symbian, since phones that run that slim OS have been rather cost-effective. More recently it’s becoming Android, now that older hardware exists that can be produced cheaply. Notice in the graph below where Android’s growth is coming from. Many would like you to think it’s only Apple and Android out there, but that’s hardly where Android is growing its user base. It’s taking market share from Symbian.
That Apple iOS dip is likely the drop-off prior to the iPhone 4S shipping, rather than losses to Android. The 4S was only released in October, after iPhone 4 sales stalled in anticipation. You see a similar dip in 2008; 2009 is likely offset by the iPad’s success.
Of course an OS that runs fast on slower ARM hardware will run blazing fast on more expensive state of the art hardware. So everyone really benefits from being lean and fast.
This is about bringing the mobile internet to billions of people. It’s a big deal.
Hat tip to Wikipedia for the maps and graph.