Stable WiFi Connections With Mac OS X

I’ve been digging into Mac OS X’s sometimes unstable WiFi connections for a while now, and have come to the conclusion that the Broadcom drivers in Mac OS X 10.6+ are either too fussy or just buggy, particularly when dealing with 802.11n.

Apple’s iOS drivers seem to be different, as few people see the same issues across Mac OS X and iOS. On the hardware side, the iPad 3 and iPhone 4S use a Broadcom BCM4330, while the slightly older iPhone 4 uses a BCM4750. The MacBook, MacBook Pro, and MacBook Air use a Broadcom BCM4331 these days; some older ones (pre-2010, I believe) used an Atheros AR5008. As you can see, the hardware is pretty similar, which suggests software is the discrepancy. Despite both running a Darwin-based OS, it makes sense to have slightly different drivers: these devices have very different needs in terms of data usage patterns and power consumption. iOS devices seem to use less power than their OS X based counterparts, which makes perfect sense. The question is how this impacts connectivity and what we can do about it.

Apple has recommendations for iOS. For the most part these are universally good recommendations; however, I’ve found a few things to be different:

  • 802.11 a/b/g/n – If you’ve got a broad set of clients, without question seek out a simultaneous dual-band wireless router. Not dual-band, simultaneous dual-band. This will save you a lot of headache and ensure good performance. Two radios are better than one.
  • Channel – Apple says to set it to “auto”, however I’ve found that if there are several access points on other channels nearby, this can be troublesome for OS X based clients on 802.11n in the 5 GHz spectrum. If you experience problems, you’re best off setting it to the most open channel and leaving it there. This alone will likely resolve many (if not all) connectivity issues in my experience. 2.4 GHz seems to do better with auto channel. I’m not entirely sure why this is, however I suspect it has to do with power saving strategies employed by the driver. This seems to be even more problematic with 40 MHz channel width, which sort of makes sense given they are related.
  • Set 5 GHz channel width to 20/40… maybe – Apple says to set the 5 GHz channel width to 20/40 MHz if supported because not all devices support 40 MHz, and this is most compatible. If you’ve got simultaneous dual band, you can consider setting one radio to 5 GHz 802.11n only with a 40 MHz channel width, and setting the other radio to 802.11b/g at 2.4 GHz with a 20 MHz channel width to serve as adequate backwards compatibility for non-40 MHz devices. I’ve run things both ways, and IMHO either will serve most needs well. It just depends on which devices you’re supporting.

This is pretty obvious in retrospect. The 5 GHz spectrum seems to have some funny business with channel selection and this can be solved by just being more strategic about your usage. If you’ve got an Apple device being fussy with network connections, this is the first thing to play with.
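
If you want to pick that “most open” channel systematically rather than by trial and error, a quick site survey helps. Here’s a minimal sketch in Python; the candidate channel list and survey data are hypothetical examples, and on OS X the neighbor list could come from the bundled `airport -s` scan utility:

```python
from collections import Counter

# Non-DFS 5 GHz channels commonly selectable on home routers (US).
CANDIDATE_CHANNELS = [36, 40, 44, 48, 149, 153, 157, 161]

def least_congested(neighbors):
    """Pick the candidate channel with the fewest neighboring APs on it.

    neighbors: list of (ssid, channel) tuples from a site survey.
    """
    counts = Counter(ch for _, ch in neighbors)
    # min() prefers earlier candidates on ties, so the ordering above matters.
    return min(CANDIDATE_CHANNELS, key=lambda ch: counts[ch])

survey = [("CoffeeShop", 36), ("Neighbor1", 36), ("Neighbor2", 149)]
print(least_congested(survey))  # 40: no APs were seen on it
```

Weighting by signal strength instead of a raw AP count would be a refinement, but even a simple count beats guessing.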

Number Based Consumerism

Number based consumerism is when a consumer bases their buying habits on one or more numbers, typically part of a product’s specifications. You likely see this all the time, and perhaps have even been guilty of it yourself. It’s most prevalent in technology, though it exists in other sectors.


802.11n Finalized

802.11n, something I was starting to think would never get beyond draft, is now approved. Having suffered through “compliant” 802.11b devices, I long ago decided wireless networking is fussy enough to warrant stricter standards. As a result I stuck to Wi-Fi Alliance certified 802.11g devices, and the results have been awesome. I’m still of the opinion that the difference between “compliant” and “certified” is gigantic. Certified 802.11n devices should start to appear in the next few months.

Looks like the goals for any 802.11n upgrade are MIMO (obviously) and preferably dual-band (2.4 GHz and 5 GHz). I can’t see why I would want to do anything otherwise.

Considering most ISPs don’t yet provide the downstream or upstream bandwidth necessary to saturate a good 802.11g network, I’m not sure it’s really necessary to upgrade just yet. Thanks to a solid signal I can sustain up to about 19 Mbps over 802.11g, even with WPA2 overhead and slight signal degradation, with sub-1 ms pings as well. My ISP currently offers up to 16 Mbps, with 12 Mbps plans for mortals. Rarely is that performance actually seen, thanks to “the Internets being a series of tubes”. At least for today, upgrading would only improve local network performance, not Internet performance, and most traffic is going outside the network anyway. 802.11n would bring capacity up to 130 Mbps, but since the uplink is still 12 Mbps, that provides no real performance boost.
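
The bottleneck argument here is just a min() over the hops in the path. A toy sketch, using the rough numbers from this post:

```python
def effective_throughput_mbps(*hops):
    """End-to-end throughput is capped by the slowest hop in the path."""
    return min(hops)

# 802.11g (~19 Mbps real-world) vs. 802.11n (~130 Mbps), same 12 Mbps uplink:
g = effective_throughput_mbps(19, 12)
n = effective_throughput_mbps(130, 12)
print(g, n)  # 12 12 -- upgrading the WLAN doesn't move the bottleneck
```

Only once the ISP link exceeds the WLAN’s real-world throughput does the wireless upgrade show up in Internet-bound traffic.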

For anyone who would argue that the faster CPUs on newer access points would improve performance: I’ve found that my current AP rarely sees more than a 2% load, with rare spikes up to about 40% capacity.

Of course hardware providers and retail outlets will continue to tell people that downloading will be 6X faster [1], but logic and common sense prove otherwise. It’s the equivalent of a Bugatti Veyron stuck behind a funeral procession.

That of course also assumes all devices are connecting via 802.11n. If you have both 802.11g and 802.11n devices connecting over 2.4 GHz, you’re going to be in mixed mode and slow down while the 802.11g devices send/receive anyway. As far as I know there’s no way around that.

Then there’s the issue of all the pre-N adapters sold in laptops over the past few years and their compatibility, which is generally pretty good, but not perfect when mixing vendors.

So despite the marketing getting even stronger, I don’t see how it would really be beneficial to upgrade just yet. The actual performance increase for most activity will be virtually non-existent until ISPs get faster. I’d rather wait until the hardware matures and prices drop more.

1. Up to 6X faster; actual results may vary.

WMM Slowdown

I turned on Wireless Multimedia (WMM) support the other day on my wireless network, figuring QoS for a wireless network would pretty much be a slam dunk. For those who don’t know, the four access categories it uses are:

  • voice
  • video
  • best effort
  • background
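
Under the hood, WMM derives those categories from the 802.1d user priority (0-7) tagged on each frame. A quick sketch of the standard mapping:

```python
# Standard WMM mapping from 802.1d user priority (0-7) to access category.
# Note that priority 0 (the default for untagged traffic) lands in best
# effort, above the "background" priorities 1 and 2.
UP_TO_AC = {
    1: "background", 2: "background",
    0: "best effort", 3: "best effort",
    4: "video", 5: "video",
    6: "voice", 7: "voice",
}

def access_category(user_priority):
    return UP_TO_AC[user_priority]

print(access_category(6))  # voice
```

In practice, traffic that nothing bothers to tag all lands in best effort, which limits how much good the prioritization can do on a typical home network.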

I was surprised to find, at least with the Netopia box, that this actually resulted in a significant slowdown in HTTP traffic, even when no other services were being used. To put some numbers out there, we’re talking 10,000 kbps with it enabled vs. 17,400 kbps when disabled (these aren’t scientific, they are just bandwidth tests). I think the performance hit negated any real benefit, at least in this case. The box doesn’t handle much VoIP, so it really doesn’t do much. Video is more about raw bandwidth these days than latency, thanks to CDNs becoming more common and reducing the bulk of the latency issue. Also interesting is that the CPU hit seems pretty minimal: the daily average increased from 2% to about 4%, which is double but really nothing serious. With it enabled it never spiked past 50%, and that was only one time.
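
For perspective, that’s a sizeable hit; a couple of lines to put those test numbers in relative terms:

```python
enabled_kbps, disabled_kbps = 10000, 17400  # rough bandwidth-test results
slowdown = (disabled_kbps - enabled_kbps) / disabled_kbps
print(f"{slowdown:.0%}")  # roughly a 43% throughput loss with WMM on
```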

So after a few days of testing, WMM is turned off. Seems QoS, at least in this case, doesn’t pay. I can’t complain: wireless performance (20 Mbps+) and signal strength are fantastic (when the microwave isn’t on) for an 802.11g network. Despite that, there’s always the desire to find ways to make it even better. The next step would be 802.11n, but I have a thing against uncertified gear. Once it’s standardized, I’d strongly consider it, especially if I can find a device that supports Linux firmware.

Experiment complete.