I turned on Wireless Multimedia (WMM) support the other day on my wireless network, figuring QoS for a wireless network would pretty much be a slam dunk. For those who don’t know, the four access categories it uses, from highest priority to lowest, are (the standard mapping is sketched just after the list):
- voice
- video
- best effort
- background
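
To make the categories concrete, here is a minimal sketch of the standard 802.1D user-priority to WMM access-category mapping. This is the generic mapping from the WMM spec, not anything pulled from the Netopia firmware, and the function name is just for illustration.

```python
# Sketch: standard 802.1D user-priority (0-7) -> WMM access category.
# Generic mapping only; not taken from the Netopia box's configuration.

WMM_ACCESS_CATEGORY = {
    1: "background",   # UP 1-2  -> AC_BK (lowest priority)
    2: "background",
    0: "best effort",  # UP 0, 3 -> AC_BE
    3: "best effort",
    4: "video",        # UP 4-5  -> AC_VI
    5: "video",
    6: "voice",        # UP 6-7  -> AC_VO (highest priority)
    7: "voice",
}

def access_category(user_priority: int) -> str:
    """Return the WMM access category for an 802.1D user priority."""
    return WMM_ACCESS_CATEGORY.get(user_priority, "best effort")

if __name__ == "__main__":
    for up in range(8):
        print(f"UP {up} -> {access_category(up)}")
```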
I was surprised to find, at least with the Netopia box, that this actually resulted in a significant slowdown in HTTP traffic, even when no other services were in use. To put some numbers out there: roughly 10,000 kbps with WMM enabled vs. 17,400 kbps with it disabled (nothing scientific, just quick bandwidth tests; a rough sketch of that kind of test is below). That performance hit negated any real benefit, at least in this case. The box doesn’t carry much VoIP, so the voice queue sees little use, and video these days is more about raw bandwidth than latency, since CDNs have become common enough to absorb most of the latency problem. Also interesting is that the CPU hit was minimal: the daily average went from about 2% to about 4%, which is double but nothing serious, and with WMM enabled it spiked past 50% only once.
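
For what it’s worth, the throughput numbers above came from crude download-and-time tests. Here’s a rough sketch of that kind of check, assuming you have some reasonably large file to pull; the URL is a placeholder, not the file I actually used.

```python
# Rough sketch of a crude HTTP throughput check: time one download and
# report kbps. The URL below is a placeholder for illustration only.
import time
import urllib.request

TEST_URL = "http://example.com/testfile.bin"  # placeholder test file

def measure_kbps(url: str) -> float:
    """Download the file once and return the observed throughput in kbps."""
    start = time.time()
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    elapsed = time.time() - start
    return (len(data) * 8 / 1000) / elapsed  # bits -> kilobits, per second

if __name__ == "__main__":
    print(f"{measure_kbps(TEST_URL):.0f} kbps")
```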
So after a few days of testing, WMM is turned off. It seems QoS, at least in this case, doesn’t pay. I can’t complain: wireless performance (20 Mbps+) and signal strength are fantastic (when the microwave isn’t on) for an 802.11g network. Still, there’s always the desire to find ways to make it even better. The next step would be 802.11n, but I have a thing against uncertified gear. Once it’s standardized, I’d strongly consider it, especially if I can find a device that supports Linux firmware.