Categories
Google Web Development

Where Is The Asynchronous AdSense, Google?

Anyone who cares about website performance comes to the quick realization that ads are a huge slowdown. They are also the source of income for many websites. I somewhat casually mentioned back in December that Google was beta testing an async Google Analytics call. I found a bug in the early version, but since it was updated it works extremely well and is non-blocking. It’s rather awesome.
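
For reference, the async Analytics snippet looks roughly like this (the account ID is a placeholder, and the exact boilerplate may differ slightly from Google’s current documentation):

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXX-X']); // placeholder account ID
_gaq.push(['_trackPageview']);

(function() {
    // Create the script element ourselves and mark it async so it
    // doesn't block the rest of the page from loading and rendering.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
             '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
})();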

Google AdSense, like most ad networks, is still a major slowdown. Ads are often implemented via a script which itself can include several other resources (including other JavaScript). Because it’s just a <script/>, it’s blocking. It would be nice if AdSense had an asynchronous version that let you specify, by passing an element object or id, an element where it could insert the ad once loaded. It could then be non-blocking and speed up a large number of websites thanks to its popularity.
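
To illustrate, here’s a minimal sketch of what such an API could look like. Everything in it (the loadAd function, the script URL, the slot parameter) is hypothetical and not a real AdSense API; it just shows the general non-blocking pattern:

// Hypothetical non-blocking ad loader. None of these names or URLs
// are real Google APIs; they only illustrate the pattern.
function loadAd(containerId, slot) {
    var script = document.createElement('script');
    script.async = true;
    // Imaginary endpoint that would return the ad for the given slot.
    script.src = 'http://ads.example.com/async.js?slot=' + slot +
                 '&container=' + encodeURIComponent(containerId);
    document.getElementsByTagName('head')[0].appendChild(script);
    // The returned script would insert its markup into
    // document.getElementById(containerId) once the ad is ready,
    // rather than document.write()-ing into the page inline.
}

loadAd('sidebar-ad', '1234567890');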

I can’t find any reason this wouldn’t technically be possible. It would make setup slightly more complicated, but the payoff for those who choose this implementation would be a faster website. Perhaps Steve Souders could poke the AdSense folks for a faster solution.

I’m also still hoping for SSL support. Another place where Google Analytics is ahead.

Categories
Web Development

Facebook’s New PHP “Runtime”

According to SDTimes, Facebook is about to release a new open source project in which it has either re-written the PHP runtime (unlikely) or built a PHP compiler (more likely).

There is another possibility. It could be a Zend extension acting as an opcode cache (APC, XCache, etc.) and a FastCGI replacement.

It’s also possible they used Quercus as either a starting point or inspiration and it’s actually Java based, but that sounds unlikely.

Regardless, it will be interesting to see what comes of this.

Categories
Google Security Web Development

The Future Of SSL

Google announced the other day that it will now enable HTTPS by default on Gmail. Previously a user had to either manually type in HTTPS or change a setting to default to it, something most people likely never bothered to do. Google says it’s not related, but it seems oddly coincidental that this change coincides with its China announcement.

However Gmail using HTTPS is not the big story here.

The big story is that HTTPS is now being used in places where it was previously considered excessive. Once upon a time only financial information was generally sent over HTTPS. As time went on, most website login pages followed, though the rest of those sites were often unencrypted. The reason for being so selective is that HTTPS is more costly to scale due to its CPU usage on the server side and its performance impact on the client side. These days CPU is becoming very cheap.

In the next few years I think we’ll see more and more of the web switch to HTTPS. If things like network neutrality don’t work out, this trend could accelerate at an even quicker rate, just as it did when P2P adopted MSE/PE to mask traffic.

Like I said, these days the CPU impact is pretty affordable; however, the performance impact of the SSL handshake can be pretty substantial. Minimizing HTTP requests obviously helps. HTTP keep-alive is a good solution, but it generally results in more child processes on the server since they aren’t freed as quickly (read: more memory needed).

Mobile is a whole different ballgame since CPU is still more limited. I’m not aware of any mobile devices that have hardware to specifically handle SSL, which does exist for servers. Add in the extra latency and mobile really suffers. Perhaps it’s time to re-examine how various crypto libraries are optimized for running on ARM hardware? I think the day will come when performance over SSL will matter as it becomes more ubiquitous.

Categories
Networking

802.11n Finalized

802.11n, something I was starting to think would never get beyond draft, is now approved. Having suffered through “compliant” 802.11b devices, I long ago decided wireless networking is fussy enough to warrant stricter standards. As a result I stuck to Wi-Fi Alliance certified 802.11g devices, and the results have been awesome. I’m still of the opinion that the difference between “compliant” and “certified” is gigantic. Certified 802.11n devices should start to appear in the next few months.

Looks like the goals for any 802.11n upgrade are MIMO (obviously) and preferably dual-band (2.4 GHz and 5 GHz). I can’t see why I would want anything else.

Considering most ISPs don’t yet provide the downstream or upstream bandwidth necessary to saturate a good 802.11g network, I’m not sure it’s really necessary to upgrade just yet. Thanks to a solid signal I can sustain up to about 19 Mbps over 802.11g, even with WPA2 overhead and slight signal degradation, with pings under 1 ms. My ISP currently offers up to 16 Mbps, with 12 Mbps plans for mortals. Rarely is that performance actually seen, thanks to “the Internets being a series of tubes”. At least for today, upgrading would only improve local network performance, not Internet performance, and most traffic is going outside the network anyway. 802.11n would bring capacity up to 130 Mbps, but since the uplink is still 12 Mbps, that provides no real performance boost.

For anyone who would argue that the faster CPUs on newer access points would improve performance: I’ve found that my current AP rarely sees more than a 2% load, with rare spikes up to about 40% capacity.

Of course hardware vendors and retail outlets will continue to tell people that downloading will be 6X faster1, but logic and common sense prove otherwise. It’s the equivalent of a Bugatti Veyron stuck behind a funeral procession.

That of course also assumes all devices are connecting via 802.11n. If you have both 802.11g and 802.11n devices connecting over 2.4 GHz, you’re going to be in mixed mode and slow down whenever the 802.11g devices send/receive anyway. As far as I know there’s no way around that.

Then there’s the issue of all the pre-N adapters sold in laptops over the past few years and their compatibility, which is generally pretty good, but not perfect when mixing vendors.

So despite the marketing getting even stronger, I don’t see how it would be really beneficial to upgrade just yet. The actual performance increase for most activity will be virtually non-existent until ISPs get faster. I’d rather wait until the hardware matures and prices drop more.

1. up to 6X faster, actual results may vary.

Categories
Mozilla Web Development

Optimizing @font-face For Performance

You want to use @font-face, then you realize it’s got some downsides. First of all, it’s another HTTP request, and we know that the golden rule of web performance is to keep HTTP requests to a minimum. Secondly, fonts aren’t even small files; they can be 50k+ in size. Lastly, because fonts load last, your page seems to morph into its final form.

Here’s a cool little optimization. By using a data: URL you can embed the font inline by encoding it in base64. For example:

@font-face {
    font-family: "My Font";
    src: url("data:font/opentype;base64,[base64-encoded font here]");
}

body {
    font-family: "My Font", serif;
}

You can see this in action here. This seems to work fine in Firefox 3.5 and Safari 4 (presumably any modern WebKit-based browser). Other browsers will simply act as if they don’t support @font-face.

In practice I’d recommend putting it in a separate stylesheet rather than inline CSS, so that your pages are smaller and the CSS can be cached for subsequent page views.
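
If you want to generate that stylesheet automatically, a tiny build step works. Here’s a minimal sketch, assuming Node.js is available; the file names and font-family are placeholders:

// build-font-css.js -- minimal sketch; file names are placeholders.
var fs = require('fs');

// Read the font file and base64-encode it.
var base64 = fs.readFileSync('MyFont.otf').toString('base64');

// Write a stylesheet with the font embedded as a data: URL.
var css = '@font-face {\n' +
          '    font-family: "My Font";\n' +
          '    src: url("data:font/opentype;base64,' + base64 + '");\n' +
          '}\n';

fs.writeFileSync('font.css', css);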

Data URLs are part of Acid2, which most modern browsers either pass or plan to pass. If you use an OpenType font you’d get pretty decent compatibility (IE only supports OpenType). Using TrueType you’d still get pretty good compatibility, sans IE. Check the @font-face page on MDC for more details. Unlike with images, browsers that support @font-face are likely to support data: URLs as well, making this a pretty good solution.

Special thanks to Open Font Library for having some nice free fonts with awesome licensing. This post was partially in response to a comment left the other day on my @font-face hacks blog post.

Categories
Mozilla Web Development

Geek Reading: High Performance Web Sites

So I decided to do a little book shopping a few weeks ago, and one thing I purchased was High Performance Web Sites: Essential Knowledge for Front-End Engineers (affiliate link). At its core it’s essentially a 14-step guide to making faster websites. I don’t think any of the steps are new or innovative, so anyone looking for something groundbreaking will be sorely disappointed. I don’t think the target audience has that expectation, though. It’s still a rather practical book for any developer who spends a lot of time on the front end of things.

It gives many great examples of how to implement each rule, as well as suggestions based on what some of the biggest sites on the web are doing (including Yahoo, the author’s employer). I found it pretty helpful because it saves hours’ worth of research into what other sites are doing to improve their performance. For that reason alone it’s a worthwhile book to check out. For each rule there’s enough discussion to help you decide whether you can implement an improvement on your own site or not. Most sites are limited in what they can actually do by their legacy systems, such as the CMS, their processes (including human ones), and their audience. Unless you’ve got a serious budget, you likely fail rule #2 (use a CDN) right off the bat. Regardless, there are likely a few tips you can take advantage of. It’s also a very fast book to get through.

Most steps are pretty quick to implement provided they are feasible in your situation. Overall it’s one of the best “make it better” tech books I’ve seen regarding web development, and one of the few that actually appeared worth purchasing (and I did). The majority of the tips require a somewhat tech-savvy approach to web development; the book isn’t oriented much towards web designers (with the notable exception of reducing the number of requests through better use of CSS and images) or casual webmasters. It’s best suited to those who understand the importance of HTTP headers but could use some help deciding on best practices, and to those who want to know how the small things can add up.

Interestingly enough, I learned about the book by trying the YSlow extension, which essentially evaluates a page against the rules suggested in the book. Interesting from a marketing perspective, I guess. Overall this blog evaluates OK (about as well as it ever will, considering I’m not putting it on a CDN anytime soon), though I guess I could add some Expires headers in a few places.

Categories
Apple Software

Mac OS X 10.5 Leopard

I got my copy of Mac OS X 10.5 earlier this week. I bought it from J&R (via Amazon) since it was $99 + shipping, less than Amazon itself was selling it for. For some reason both of them are able to undercut Apple (even with a corporate discount), which seems odd. Here’s my rundown of the new OS during the first 24 hours.

Categories
Around The Web Internet Mozilla Web Development

JavaScript Badges And Widgets Considered Harmful?

Jeremy Zawodny has a great post about common JavaScript usage where he concludes it’s harmful. Whether you agree or not, you have to admit it’s a great blog post. Here comes another long blog post.

Categories
Google Web Development

Optimizing Page Load

An awesome article on Optimizing Page Load Time from Aaron Hopkins (who works for Google). While his suggestions aren’t quite revolutionary, he’s got a lot of data and experience to back up his statements, which is really great when you’re looking to improve performance. To summarize the best point: high object counts hurt performance regardless of object size and broadband connectivity.

Categories
Software

Windows System Performance Ratings

If Microsoft were smart, they would have released a tool for Windows 2000/XP to test for the Windows System Performance Rating used in Windows Vista. That way users could test and start upgrading if necessary, so that when launch time comes around, their customer base is more ready to accept the new product.

Then again, if they are smart, they will offer massive discounts to upgraders, since I know I’m not alone in thinking that Windows XP may be more than all right for another few years. I’m not sure I’d pay $200+ for the privilege of installing an even more bloated OS on my computer. It could be a hard sell. It will be interesting to see how they actually do when launch time comes.

Edit [3/16/2006 @ 11:10 PM EST]: According to Engadget they will release a tool for Windows XP users to test with.