IE7 Testing

A while back I complained about how, unlike Firefox or Opera, IE makes it hard to support multiple versions. Microsoft apparently did listen to developer complaints, and came out with a solution.

You can now download a free, time-limited Virtual PC image and run IE6 inside Virtual PC. The catch is the expiration date, though it seems they will release a new image before that time arrives.

This vastly improves the situation, but I wish they also offered IE7 virtualized, for those who can’t/won’t upgrade at this time (corporate IT policy, etc.). I also think there would be significant benefit in making IE 5.x available as well (since we still often have to support it). A VMware version for Linux (and eventually Mac) developers would also be great to see, though I won’t hold my breath on that one.

But regardless, it’s a great move. I just hope they keep coming out with updates to keep the program going. Hooray for solutions!

MacMarionette

This is one of the cooler hardware hacks I’ve run across in the past few weeks: turning a PowerBook into a marionette. Unfortunately it’s specific to that machine’s motion sensor, so there’s no real chance of it being ported to other devices.

I’m still waiting for someone to take one of those new Lenovo X60 Tablets and make it into a giant Etch A Sketch. Lenovo/IBM’s motion detector (ADXL320 managed through the embedded controller) may be up to the task, but nobody has really done anything with it thus far, other than Knock-Knock by IBM.

PortableApps Suite

If you’re a fan of running applications completely off your USB drive, check out the new PortableApps Suite. It’s really great. I’ve been using PortableApps for a while, and this is a very nice suite that makes things easier to access than ever. The next improvement I’d like to see is Mac versions becoming available. The ultimate solution would be Mac/Win/Linux versions that use the same data directory; at that point it would truly be portable. It also works with third-party applications, so you can add things not available through them, such as everyone’s favorite, PuTTY. Great for Firefox on the go.

The advantage of the suite is not only the easy install, but the ability to quickly open applications without navigating to them. There are other tools out there like PStart, but this is in my opinion a little more polished.

SafePasswd.com Update 11/19

I updated SafePasswd.com tonight. Updates include better generation of memorable passwords, an easier-to-use length selector, and a better quality bar. Most of the changes were to algorithms rather than major features, so hopefully the quality of generated passwords is now slightly improved. It’s not always about the big things; sometimes it’s the smaller refinements that need to be done.
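For the curious, a common way to generate memorable passwords is to alternate consonants and vowels so the result is pronounceable. This is just an illustrative sketch (not necessarily SafePasswd.com’s actual algorithm):

```python
import random

CONSONANTS = "bcdfghjklmnprstvwz"
VOWELS = "aeiou"

def memorable_password(length, rng=None):
    """Build a pronounceable password by alternating consonants and vowels."""
    rng = rng or random.SystemRandom()  # OS entropy, not the default PRNG
    chars = []
    for i in range(length):
        pool = CONSONANTS if i % 2 == 0 else VOWELS
        chars.append(rng.choice(pool))
    return "".join(chars)
```

The trade-off is obvious: a pronounceable alphabet has less entropy per character than full random ASCII, so a memorable password needs to be longer to reach the same strength.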

Phishing Unit Testing And Other Phishy Things

Seeing these results is pretty cool. I hope someone has or will come up with a way to run a test like this periodically (at least weekly, if not daily or multiple times a day), analyzing phishing sites and how many are being blocked. I’d presume Google and other data services would have some interest in this. It could be as simple as a browser extension (yes, IE too) that reads a feed, visits each site, and reports the results to a web service, running in a confined environment (a virtual machine or dedicated box) free of tampering. The real advantage would be seeing how effectiveness varies over time as phishers become more sophisticated.
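The scoring core of such a harness is tiny. A minimal sketch, with hypothetical names (a real version would drive actual browsers against a live phishing feed rather than a callable):

```python
def detection_rate(phishing_urls, is_blocked):
    """Given a feed of known-phishing URLs and a callable that reports
    whether a given browser/filter blocks a URL, return the fraction blocked."""
    if not phishing_urls:
        return 0.0
    blocked = sum(1 for url in phishing_urls if is_blocked(url))
    return blocked / len(phishing_urls)
```

Run the same feed through each browser’s checker on a schedule and log the results, and you get a detection-rate time series per vendor.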

Take spammers, for example. Spam started out pretty simple; now they are using animated GIFs, sophisticated techniques to poison Bayesian analysis, botnets, etc. I presume over time we’ll see the exact same thing with phishing attacks, and I doubt it’s going to get any better. On the positive side, phishing is still in its infancy, so we can start learning now and be more aggressive than people were about the spam problem, which got way out of hand before everyone realized it was really something to worry about.

I’d ultimately like to see percentages for the different anti-phishing blacklists/software updated frequently, so we can keep a running tally. Perhaps it would be a good indicator of when phishing tactics require a software or methodology update. Overall, I think everyone would benefit from some industry collaboration rather than competition. The problem with phishing is that to be effective your research must be good. To do good research you need to cast a wide net, capturing only one species of phish while not letting any dolphins get stuck in the net (sorry, couldn’t resist).

I’d be curious to know what others think of such testing and efforts (general users, as well as anti-phishing/spam vendors). Is the war against spam effective? Should the same techniques be used? Is it time for coalition building, or should we each go it alone? How do you monitor changes in the techniques phishers use?

I know Google is pretty serious about keeping up with the data in a very timely manner, and from what I can tell, most other vendors are as well. But I wonder how industry-wide statistics could further benefit everyone. Perhaps simply through the competition of trying to have a higher average score, or through the detection of changes in techniques (noted by everyone’s collective decline in detection rate).

I’d love to hear what others think of Phishing protection. It’s a rather interesting topic that many don’t give too much thought to, but it really is an important part of how browsers make the internet safer.

Some Changes

I just made some changes around here, cleaning up some of the older code. Most notable changes:

  • Comments – A new, lighter Ajax comment script is in place, which should make pages load a little faster (actually quite a bit if you’re on 56k). I could use a little help testing, so feel free to leave a comment; if it doesn’t work, contact me.
  • Menu Bar – Some CSS changes to make it look more consistent across browsers, and it now highlights the current section.

Some other, more obscure changes were also made. Let me know if there is trouble.

Moving Forward

It seems that since Firefox 2.0 shipped, everyone is really taking some time to think about the future. Not that it wasn’t on people’s minds before 2.0. For me, 2.0 was really a maintenance release. End users got some great new features and fixes, but all I really contributed was a small fix or two; most of the time I could allocate was spent on planning and server-side development (more on that some other time). Mike Connor and I seem to have overlapped on feelings towards future improvements:

  • Site compatibility – We’re doing pretty well, and marketshare helps, but we need to be better. We need to push Reporter, and put real time into analyzing the top sites showing up there. Sometimes it’s our fault and we need to prioritize bugfixing; sometimes it’s Tech Evangelism (and we need to get back to doing that too).

I mentioned a few weeks ago it’s important for end users to report problems, and got some traction. But I’m still looking for ways to get more casual users reporting problems they encounter. Anyone with ideas on how this could best be accomplished, without annoying the user or adding intrusive UI should let us know. Either leave a comment here or contact me. We can’t help what we don’t know.

To help improve the quality of analysis of reports, I have gotten pretty close to a new reporter webtool. This version has much more flexibility and allows for easier viewing and manipulation of data. I hope to give it more time in the next few weeks and make a drive to go live with it. It’s been delayed several times (my fault), but it’s now in the final stretch. Future revisions will be much more incremental to prevent such delays again.

To further help improve the quality of reports, Gavin Sharp wrote a patch to capture the character encoding of reported web pages, and I wrote one to allow users to send a screenshot of what they see. Both, I believe, should make 3.0. These changes should help in the long term: knowing the charset can help with character-related problems users experience (since charset detection is somewhat of a messy game), and having actual screenshots of what users see is of course beneficial for rendering issues.
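To see why the charset matters, consider that the HTTP Content-Type header and an in-page meta tag can disagree, and the browser has to pick one (the header takes precedence). A rough illustration of that precedence, written as a standalone sketch rather than the actual patch’s code:

```python
import re

def effective_charset(content_type_header, html_head):
    """Pick the charset a browser would likely use: the HTTP Content-Type
    header takes precedence over an in-page <meta> declaration."""
    # 1. Charset from the HTTP header, e.g. "text/html; charset=UTF-8"
    m = re.search(r'charset=["\']?([\w-]+)', content_type_header or "", re.I)
    if m:
        return m.group(1).lower()
    # 2. Fall back to a <meta> tag in the document head
    m = re.search(r'<meta[^>]+charset=["\']?([\w-]+)', html_head or "", re.I)
    if m:
        return m.group(1).lower()
    # 3. Neither declared: the browser falls back to sniffing/defaults
    return None
```

When a report says the page looked garbled and the charset captured with it is, say, iso-8859-1 while the content was actually UTF-8, the mismatch is immediately visible to whoever triages the report.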

Hopefully some of the bigger Gecko changes taking place on the trunk will further improve site compatibility. Of course, growing marketshare has helped and will continue to help websites adopt a policy of cross-browser compatibility; it has been, and will continue to be, a driving force. So remember: don’t spoof your user agent more than absolutely necessary. Make sure webmasters know who you are. “Stand up and be counted.”

My personal goals for Firefox 3.0 are these:

  • Get the new reporter webtool in place, the sooner the better. It’s been delayed too many times; at least now it’s close.
  • Get charset and screenshot support up and running. Investigate if there’s something else that would be really beneficial.
  • Find new ways to get more end users to let us know when they encounter a problem, rather than just keep quiet.
  • Keep reading, playing around and getting new ideas. IMHO that should be a goal for anyone doing anything in life.

I think that’s a rather attainable set of goals with a definite positive impact.

I’m also working on an update to MozPod to allow synchronizing your iPod calendar with Lightning (in Thunderbird) or Sunbird. It’s somewhat working but still rather buggy. There are also several fixes for other issues since the last release of MozPod. I hope to have that out by the end of the year (which would be one year after the last release).

On a more personal note, last month I accepted a position with CBS Digital Media as a developer at CBSNews.com. For those wondering: yes, I am working on improving the experience for Firefox (and Mac) users, among other development work. It’s not too bad right now (personal opinion), but better is of course welcome. The usual disclaimer that the views on this blog are mine alone and do not represent those of my employer of course applies (but I’m sure you knew that already).

So there you have it. I plan to write a few more posts in the next few weeks more specific to individual things discussed here, but for now that should give everyone an idea about what I’ve got cooking. It’s a rather interesting mix of things I get to work on.