Categories
Security

How To Configure SSL For Apache Securely

I’ve been doing some reading up on best practices for SSL. From what I can gather, and from seeing what other big sites are doing, this seems to be the best practice as of today. This assumes you’re in an OpenSSL 0.9.x (via mod_ssl) and Apache 2 world, which covers the majority of Linux/Unix-based environments. Use a 2048-bit, SHA1-signed certificate, which is now pretty much standard.

SSLHonorCipherOrder On
SSLProtocol -ALL +SSLv3 +TLSv1
SSLCipherSuite ECDHE-RSA-AES256-SHA384:AES256-SHA256:RC4:HIGH:!MD5:!aNULL:!EDH:!AESGCM
SetEnvIf User-Agent ".*MSIE.*" nokeepalive ssl-unclean-shutdown

That will disable potentially insecure ciphers and help mitigate the BEAST attack. Note that this disables SSL 2.0, which shouldn’t be a problem for the vast majority of visitors; I don’t think many websites still support it anyway.
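To double-check that the old protocol is really off after restarting Apache, something along these lines should do it (the hostname is a placeholder; these are standard flags of the openssl s_client tool shipped with OpenSSL 0.9.x):

# Placeholder hostname. With the configuration above, the -ssl2 handshake
# should fail, while -ssl3 and -tls1 should succeed.
openssl s_client -connect www.example.com:443 -ssl2
openssl s_client -connect www.example.com:443 -ssl3
openssl s_client -connect www.example.com:443 -tls1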

Categories
Software

AVG Wastes Bandwidth

AVG really needs to fix their “LinkScanner” product. It essentially scans pages for links and pre-downloads them to check for malware. If that doesn’t sound so bad, you’re obviously not paying for bandwidth or trying to keep your server load manageable. It means more traffic pegging servers and downloading pages, most of it a total waste.

This isn’t just bad for webmasters. The excess traffic also hogs ISPs’ networks (some of whom now plan to charge by the byte) and WiFi. In a country where we’re already tight on bandwidth, this is a pretty lousy implementation.

AVG even went so far as to use multiple user agents, all of which seem to spoof IE, making it more difficult to block.

The best way to block the bogus AVG traffic seems to be looking for the Accept-Encoding HTTP header, which could be done using an Apache rewrite rule if you can’t do it at the firewall or load balancer level.
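For what it’s worth, a mod_rewrite sketch along these lines could do the blocking. The exact Accept-Encoding signature LinkScanner sends is an assumption here; confirm it against your own access logs before deploying anything like this, since matching on such a generic header risks catching legitimate visitors too.

RewriteEngine On
# Hypothetical signature: an MSIE user agent combined with a blank
# Accept-Encoding header. Verify against real log entries first.
RewriteCond %{HTTP_USER_AGENT} MSIE [NC]
RewriteCond %{HTTP:Accept-Encoding} ^$
RewriteRule .* - [F]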

AVG really needs to reassess this poorly designed product. It’s unnecessarily taxing the web.

Categories
Mozilla Open Source

Open Source And Recessions

There’s an interesting blog post on Open Source and recessions worth reading. Essentially the question is this: Does a recession have a negative impact on open source?

I’d say the answer is somewhat more complex than a simple yes/no. There are many different types of projects out there with entirely different circumstances. However, I suspect a project’s impact could be gauged by a few key aspects of its operation:

Purpose – The purpose of the project is likely the most critical aspect. For example, I don’t think there would be any significant impact on projects like the Linux kernel, which is essential to many products out there, including the server infrastructure that powers much of the web and many companies’ computer systems, as well as consumer products like TiVo and Google Android. Because its purpose is so broad, there are enough people with a financial interest in seeing development continue. WebKit, Mozilla, and Apache are good examples of this: they have broad usage by many. Something specific to a more obscure task would have more trouble due to its more limited market.

Development Team – Of course, for a project to succeed it needs one or more developers. During a recession one could theorize that many would be less inclined to participate. That’s not necessarily so. First of all, quite a bit of open source development is loosely sponsored. Several projects have actual staff: paid employees who write open source code. Apple employs people to work on WebKit, Mozilla has staff working on Firefox, and there are people paid to work on Linux (Red Hat, IBM, Novell, etc.) and many other open source projects. There are also companies that contribute code of strategic value to them, and others simply willing to sponsor work they want to see happen. All of this funds developers of the larger open source projects. But would developers who aren’t sponsored or employed to code still participate? I theorize most still would: they don’t depend on it for income during good times, and presumably a job during a recession wouldn’t prohibit participation any more than a job during years of economic growth would. There’s also the impact of college students, who participate partly for the educational aspect. The early 2000s saw a recession and still showed a fair amount of growth in open source; in fact, many of today’s stars really started to take shape during that period.

Funding – Somewhat obvious: funding is key. Who pays the developers (partly the previous aspect I discussed)? Who pays for the project’s needs (servers, etc.)? Many of the more popular projects (almost all of the above) have either an organization or a for-profit company built around them, and that organization often sponsors the needs of the project. Unless that company’s product or service is no longer needed during the recession, funding likely remains. That’s partly the first aspect I discussed.

It’s my belief the larger and more popular open source projects would feel minimal impact during a recession. I think history has shown this, and common sense agrees. They are mostly low development cost, adequately funded (often from diverse sources), stable, and have a broad team of developers. The projects that are in trouble are the ones with very few developers or only one, worse if those developers share the same sponsor, and worse still if there is little community around the project. Most projects would generally experience a slight slowdown in development; the degree would depend on the factors above. A few may go dormant for a period of time. Thanks to things like GPL licensing, another developer can pick a project up should there still be a market for it in the open source ecosystem.

Overall I don’t think open source would be nearly as impacted as most businesses during a recession. The model is very different. Open source, when successful, has a community and many different sponsors. That diversity allows a project to survive even when a recession forces some sponsors to reduce or eliminate their involvement. Open source is also, by definition, used to this type of environment: developing on a budget, soliciting sponsors to help cover costs, and so on.

The interesting thing about a recession is that it impacts everyone, but the degree to which someone is impacted varies. For example, construction and housing are generally harder hit than other industries; people tend to cut back on new home purchases before they cut back on other things. Each of those industries has computing needs, sometimes met by open source. This all feeds into the open source ecosystem.

I’d suggest that all of the projects I’ve mentioned here will do OK during a recession. Many will see a slowdown, but all will continue as long as they provide value. One notable situation is Mozilla, whose income comes largely from Google and is therefore based on ad revenue. A recession and a bursting bubble would likely reduce that revenue dramatically. This isn’t being ignored. As the 2006 Financial FAQ states:

First, the cash reserve is of course a form of insurance against the loss of income. We will continue to maintain enough of a reserve to allow us flexibility in making product decisions….

It seems that an open source project with a diverse stream of funding from individuals and companies in various industries, as well as developers in different situations, is in the best position to survive.

It’s an interesting topic.

Categories
Apple Hardware

New Home Server

Over the past few weeks, I’ve been in the process of setting up a new home server. The previous one was an old Beige G3 (266MHz) running Mac OS X 10.2 that was starting to show its age. The new system is a much more capable B&W G3 (400MHz) running Mac OS X 10.4. Despite only a slight increase in clock speed, the B&W G3 has much more modern hardware (USB, FireWire), not to mention more room for storage. The opportunities are endless.

I decided to go with a multi-drive setup, considering the extra bays. The system had a still-usable 40GB Seagate Barracuda IV drive, which would make a perfect system disk for the OS and software. Installed via an ACard ATA/66 controller, it’s no speed demon, but for that purpose it’s fine. For the data drives I got a SIIG SATA card and a pair of Seagate SATA drives I found a good deal on at Best Buy. The drives were labeled Seagate ST303204N1A1AS, which corresponds to 320GB. Inside the boxes, as expected, were (the newer and better) ST3320620AS drives: Seagate Barracuda 7200.10s with firmware 3.AAE (not the AAK firmware people have had in the past). Perfect.

Next I wanted to replicate data across the drives on a cron schedule. Initially I was thinking rsync, since as of 10.4 it’s supposedly resource-fork aware. It turns out that’s not really true. I ended up going back to SuperDuper to copy between the drives. It only copies changed files, and once a week it deletes removed files (so if you accidentally delete something, there’s still a chance to recover it, unless you do so at the wrong time). Not a bad solution, IMHO, though I’d still prefer rsync. The initial backup took less than half an hour; after that, just a few minutes should be enough to keep the disks in sync. I briefly considered setting up RAID, but decided against it since RAID is not backup. It doesn’t protect against things like corruption.
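For reference, the rsync approach I’d originally had in mind would have been a crontab entry roughly like this (the volume names are placeholders, and -E is the extended-attributes flag in Tiger’s bundled rsync, which is the part that didn’t prove reliable with resource forks):

# Placeholder volume names; runs nightly at 3:30am.
# -a preserves ownership/timestamps; -E attempts extended attributes;
# --delete removes files immediately, unlike SuperDuper's weekly cleanup.
30 3 * * * /usr/bin/rsync -aE --delete /Volumes/Data/ /Volumes/Backup/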

Apple needs to kill off resource forks ASAP. They should have done so when moving to Mac OS X several years ago.

Next up, I tried installing a copy of TechTool Pro that I no longer use on my Mac Mini (since upgrading that system to Leopard), but that resulted in some drive problems I couldn’t resolve without uninstalling it. They seem to know about the problem, but haven’t fixed it. You see the following error repeatedly in system.log until you reboot:

kernel[0]: IOATAController device blocking bus.

Drag.

Also updated mrtg, and this time compiled GD, libpng, libjpeg, etc. by hand rather than using Fink. Last time I went with Fink, which saved me a few keystrokes, but when Fink stopped updating packages for 10.2, it left me high and dry. This time I’ll avoid it when possible. I need to try getting RRDtool set up at some point, since it’s so much better.
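The by-hand builds were just the usual configure/make routine, roughly like this (the version number and the /usr/local prefix are placeholders; repeat for libjpeg, GD, and friends):

# Placeholder version number; substitute whatever tarball you download.
tar xzf libpng-x.y.z.tar.gz
cd libpng-x.y.z
./configure --prefix=/usr/local
make && sudo make install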

I use a few PHP scripts for easy admin of the box, and decided PHP 4 wasn’t adequate since it’s pretty much discontinued. So I upgraded to PHP 5.2, and all seems good so far. I think Apache 1.3.33 will serve me just fine for the moment, so I’m not upgrading that.

I might give setting up BIND a try, since local DNS would be pretty handy for accessing the server without modifying the hosts file on every computer.
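If I do go the BIND route, a minimal named.conf zone entry would look something like this (the zone name and file path are made-up examples for a home LAN, and it needs a matching zone file with A records for the server):

// Hypothetical local zone for the home LAN.
zone "home.lan" {
    type master;
    file "/var/named/home.lan.zone";
};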

I also disabled things like Spotlight, which has absolutely no purpose on this box.
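In 10.4 that’s a quick trip to the terminal with mdutil (the volume path is a placeholder):

# Turn off Spotlight indexing for the data volume; repeat per volume.
sudo mdutil -i off /Volumes/Data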

On another note, glib for some reason won’t compile for me. No clue what’s going on. Overall, the box is looking pretty good and should be about ready for real use; I just want to make sure the backups work as expected.