Categories
Programming Tech (General)

Z2K9 Bug Strikes The Zune

From the company that brought you Windows ME and Windows Vista, Microsoft Corporation today introduced the world to the Z2K9 bug. Apparently all 30GB Zunes reboot and freeze due to a bug in the date/time drivers. Classic. Microsoft’s solution is to simply wait until 2009 (a few more hours). Even more classic.

This brings up one of every programmer’s biggest pet peeves: date/time code. I’ve mentioned my hatred of time before. It’s one of the most obnoxiously complicated things to work with, thanks to everything from leap seconds to leap years. If you need to do something involving old dates, it gets even more complicated. Remember that Julian Thursday, 4 October 1582 was followed by Gregorian Friday, 15 October 1582. Yes, you read that right. Also don’t forget that only certain countries (mostly those under strict influence of the Pope) switched on that date; there was dual dating for some time. Then you have timezones, which ideally would be geographically correct and 15° of longitude apart, but instead zigzag, often not even along territorial borders. Worst of all is daylight saving time. Not everyone participates in it, sometimes not every year, and rarely at the same time. Even individual states are split; just check out the chaos in Indiana.
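
Even the “simple” parts are easy to get wrong. As a refresher (my own sketch, nothing to do with the Zune firmware), here’s the full Gregorian leap year rule, which is more than the “divisible by 4” most people remember:

function isLeapYear(year) {
    // Divisible by 4, except century years, which must be divisible by 400.
    return (year % 4 === 0 && year % 100 !== 0) || (year % 400 === 0);
}

isLeapYear(2008); // true:  divisible by 4, not a century
isLeapYear(1900); // false: century year not divisible by 400
isLeapYear(2000); // true:  divisible by 400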

Griping aside, none of these likely caused the Zune bug. Since it’s a freeze, I’d guess it’s nothing more than an infinite loop or some other trivial programming error triggered by the leap year.
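
For illustration only (this is my guess at the general shape of such a bug, not the actual Zune driver code), here’s how a days-to-year conversion can hang on the last day of a leap year:

// isLeapYear is the function from the snippet above.
function daysToYear(days) {
    var year = 1980; // hypothetical device epoch, an assumption of this sketch
    while (days > 365) {
        if (isLeapYear(year)) {
            if (days > 366) {
                days -= 366;
                year += 1;
            }
            // When days === 366 (December 31st of a leap year, like 2008),
            // neither branch executes: days never shrinks, the loop
            // condition stays true, and the device hangs forever.
        } else {
            days -= 365;
            year += 1;
        }
    }
    return year;
}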

Everyone remembers the infamous Y2K bug. Many uneducated folks still claim it was nothing to worry about and overblown, but it cost somewhere between $300 and $600 billion depending on whose estimates you believe ($3.596 billion from the US military alone). Since a large portion of the cost was in the private sector, there’s no true tally.

The next big day to keep in mind is January 19, 2038 at 03:14:07 GMT. That’s when 32 bit computing will officially freak out, since most Unix-like systems store time as a signed 32 bit integer counting the seconds since January 1, 1970 (the Unix epoch). One second after that, the counter wraps around and we go back to 1901. There will likely be plenty of 32 bit computing left in 2038 considering how long embedded systems can be ignored while silently slaving away in the background. For reference, the B-52 Stratofortress entered operation in 1955 (they were built until 1962) and is expected to be taken out of service in 2040. That’s the exception among US military aircraft, but don’t think it’s the only old hardware out there. The Hubble Space Telescope, launched in 1990, has a 32 bit 486 processor; assuming the backup computer is functional, it will be serviced soon to extend its life by another few years, making its service life 20+ years. It’s unlikely Hubble will make it to 2038, but it shows how long expensive systems can survive in active use. This date is only 30 years away, and it will cost the world some serious cash.
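
JavaScript stores dates as 64 bit floats of milliseconds, so it doesn’t suffer from this itself, but it makes a handy calculator for demonstrating the wraparound (a quick sketch of mine):

var INT32_MAX = 2147483647; // largest signed 32 bit value

// The last second a signed 32 bit time_t can represent:
console.log(new Date(INT32_MAX * 1000).toUTCString());
// "Tue, 19 Jan 2038 03:14:07 GMT"

// One second later the integer wraps to -2147483648
// (the | 0 forces 32 bit integer wraparound in JavaScript):
var wrapped = (INT32_MAX + 1) | 0;
console.log(new Date(wrapped * 1000).toUTCString());
// "Fri, 13 Dec 1901 20:45:52 GMT"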

On the upside, according to Wikipedia, 64 bit systems will be good until Sunday, December 4, 292,277,026,596. Odds are that won’t be a concern for most people alive today.

Reassuring? Yes. But your Zune is still fried for a few more hours.

Update [1/5/2009]: Here’s some pretty detailed confirmation that it was indeed an infinite loop error. I know my crashes 😉.

Categories
Audio/Video Funny

Santa Roasting On An Open Fire

Burn Santa Burn

Santa roasting on an open fire. Isn’t this what the holidays are all about? Click the thumbnail for the pyro-loving video.

Categories
General Mozilla Personal

Happy Festivus

Festivus Pole

As everyone knows, today is Festivus. So happy Festivus.

Those who need some background can read my previous post from 2006.

Hopefully we can get bug 394616 fixed so Firefox doesn’t mark Festivus as a misspelled word. It’s getting annoying (I know I can add the word, but I periodically clean things up and then that goes away). That said, the spell checker’s ability to correct my amazingly wrong spelling of Hanukkah is impressive. No matter how badly I misspell it, and I assure you I do a great job of it, Firefox always knows what I want. If you haven’t tried that before, give it a go. Someone told me a long time ago that Hanukkah is one of the best words to test a spell checker with, either because it takes real skill for a spell checker to fix, or because it’s just so easy to misspell. Next time you need to review spell checkers, this is a test you can use. You’re welcome.

While I’m on the topic of holidays, here’s a Christmas message from a monkey, much like the one the Queen typically delivers. It’s pretty well done for a YouTube video.

Categories
Apple Software

SimCity For iPhone

EA Mobile released the classic SimCity for the iPhone. Having played SimCity Classic, SimCity 2000 and the Palm OS version, I knew as soon as I heard about this version that it would be on my short list of wanted apps. Truth be told, most of the games in the App Store are worthless, so spending $9.99 on a game series I’ve enjoyed before didn’t seem like a bad deal.

It’s essentially SimCity 2000 plus a few things. If you liked 2000, it’s a pretty safe bet the iPhone version won’t be a letdown. You can of course save your game and keep it going for months, since this isn’t the typical 5 minute iPhone game.

Categories
Mozilla Programming Rants

Object-Oriented Masturbation

Doing some research for an upcoming installment of an infamous series of blog posts (to be released at an undetermined date), I’ve come to notice this annoyance. In general I like object-oriented programming. I think it allows you to take a complicated problem and produce simple, more reusable, easier to maintain code. Assuming the programmer is a sane individual, and that sometimes is a leap of faith.

I’m not sure if there are programmers who feel the need to complicate things just for the sake of showing off (oh, look at me!), or if they legitimately don’t know any other way. I suspect a little of both. Perhaps the programmer thought the code might grow in complexity in the future and just wanted to prepare in advance. I don’t know. But it annoys me.

I consider this abuse of object-oriented programming to be object-oriented masturbation, since the only one who gets any enjoyment out of it is the developer who does it. Here’s a slightly exaggerated (though not far off) example of object-oriented masturbation, typical of what I’ve seen many times before:

Objective

Sum two numbers and print the result in the format “The answer is: X” where X is the sum of the two numbers.

Correct (sane) Answer

// This is the simple, obvious answer
function sum(x, y) {
    document.write('The answer is: ' + (x + y));
}
 
// To run:
sum(1, 1);

Arguably, you could do the math first and store the result in a variable if you’re squeamish about doing math on any line containing an output method. Regardless, this is dead obvious and simple. And yes, this technically still uses objects since it calls document.write().
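
For completeness, that variant would look something like this (trivial, but it shows there’s still no need for classes):

// The variant described above: compute first, then output.
function sum(x, y) {
    var answer = x + y;
    document.write('The answer is: ' + answer);
}
 
sum(1, 1);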

Insane (masturbatory) Answer

function MathLib(){
    this.answer = null;
}
 
MathLib.prototype.sum = function (x,y){
    this.answer = x + y;
}
 
MathLib.prototype.getAnswer = function(){
    return this.answer;
}
 
function Printer(){
    this.preText = '';
}
 
Printer.prototype.setPreText = function(str){
    this.preText = str;
}
 
Printer.prototype.out = function (str){
    document.write(this.preText + str);
}
 
// To run
var math = new MathLib();
var print = new Printer();
math.sum(1,1);
print.setPreText('The answer is: ');
print.out(math.answer);

I did the exact same thing in 1 logical line of code (3 if you include the function itself) up above. What was gained from the object-oriented design? I argue nothing. Rewriting the first example would take a minute should something need changing. If there were a need to localize that string, I could pull it out and either make it a third parameter or some global string the function can use (see the sketch below). There’s no benefit here. This is over-engineering. There’s nothing truly reusable here. The closest is the Printer.out() method, and in reality it’s a wrapper for document.write().
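
To be concrete, here’s how trivial that localization change would be (my sketch of the parameter approach mentioned above):

// The string becomes a parameter instead of being hardcoded.
function sum(x, y, label) {
    document.write(label + (x + y));
}
 
// English:
sum(1, 1, 'The answer is: ');
// Or any localized string:
sum(1, 1, 'La respuesta es: ');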

The bonus joke is that the object-oriented implementation is significantly slower. Timings from five runs (simple version first, object-oriented version second):

Run 1: 0.0266, 0.0451
Run 2: 0.0314, 0.0464
Run 3: 0.0329, 0.0462
Run 4: 0.0268, 0.0468
Run 5: 0.0274, 0.0475
Avg: 0.02902, 0.0464

The first example also runs in about 62.5% of the time of the object-oriented version in Firefox 3.0.4 (roughly 1.6× faster), since all that overhead is gone. If you have Firebug installed you can check that out here.
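
If you want to reproduce something like this yourself, here’s a rough sketch of one way to time the two versions in Firebug’s console (my construction, not necessarily how the numbers above were gathered; document.write is stubbed out so page output doesn’t dominate the measurement):

// Stub out document.write so DOM output doesn't dominate the timing.
document.write = function () {};
 
console.time('simple');
for (var i = 0; i < 100000; i++) {
    sum(1, 1);
}
console.timeEnd('simple');
 
console.time('oo');
for (var j = 0; j < 100000; j++) {
    var math = new MathLib();
    var printer = new Printer();
    math.sum(1, 1);
    printer.setPreText('The answer is: ');
    printer.out(math.answer);
}
console.timeEnd('oo');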

object-oriented masturbation
ob·jikt | o·ri·ent·ed | mas·tur·ba·tion

  1. the stimulation or manipulation of one’s own ego by way of using object-oriented code in places where it has no advantage, resulting instead in unnecessary complication and bloat.

If you’re guilty of doing this, please cut it out. It’s annoying. Thank you.

No, I will not single out links to the pages that inspired this, but rest assured it litters the web.

Categories
Internet Web Development

MAMA Scripting Analysis

Opera did some interesting research into JavaScript use on the web. As someone who writes a fair amount of JavaScript and reads through countless lines of other people’s scripts, I found this to be pretty interesting.

Overall none of the results were very surprising, though a few things did catch my eye:

  • Omniture/SiteCatalyst Analytics ranks pretty high in the results. This suggests to me that the index of pages skews towards enterprise and large sites, since Omniture is a rather expensive service.
  • Google Analytics made the list to nobody’s surprise. I am however surprised not to see Quantcast, which seems to be pretty popular now.
  • The popularity of window.open really hurts. Opening in a new window is so counter to how things are supposed to work. The user should decide on their own if they want to pop a new window (or tab). Most sites do this hoping the user forgets about the previous window, improving their “average time on site” metric. (A toy version of this kind of per-page check appears after this list.)
  • VBScript usage is slightly disturbing. Thankfully (in my experience) it’s most often found on older sites.
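
As promised above, here’s a toy version of the kind of per-page feature check a survey like MAMA performs (entirely my construction; Opera’s actual crawler is far more sophisticated):

// Naive check: does any inline script on this page mention a feature?
function usesFeature(feature) {
    var scripts = document.getElementsByTagName('script');
    for (var i = 0; i < scripts.length; i++) {
        if (scripts[i].text && scripts[i].text.indexOf(feature) !== -1) {
            return true;
        }
    }
    return false;
}
 
console.log('window.open used: ' + usesFeature('window.open'));
console.log('VBScript used: ' + usesFeature('VBScript'));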

I wouldn’t mind knowing the popularity of scripts like SWFObject, Lightbox and its assorted clones, and PNGFix.

An analysis of graphics on the web could also be interesting: GIF, JPEG, PNG. Then an analysis of the palette for GIF, JPEG compression levels, alpha transparency, interlacing, average file size and average number per page.

Categories
Apple Hardware

The Next Generation Of Computing

I got my current laptop in Oct 2005, though the model was released in spring/summer of that year. My Mac mini is a 1st Gen (G4 1.4GHz) from Jan 2005. Needless to say, my hardware at home is getting close to the point of needing an upgrade. Because my laptop was replaced unexpectedly, I have two computers reaching that magic point at almost the same time. Not much I can do about that. I’ve been thinking about the next generation and what I want to do. Ideally I’d like to simplify my setup, and hopefully in the process get more bang for the buck. In some ways I think I will; in others I won’t.

My initial thought is to eventually get a MacBook Pro once it’s truly 64 bit and supports at least 8GB RAM. GPU accelerated video decoding would also be nice. I like my computers expandable and built to last a while. In 3-4 years I think I’ll want more than 3GB of RAM, considering I think 2GB is the minimum today. Yes, the hardware they ship today technically supports this, but Apple’s firmware doesn’t, for reasons unknown. I’d also like one or more USB 3.0 ports, but we’ll see if that happens in the 2nd half of 2009 or not. I don’t think the lack of one would be a deal killer though. I think it will take quite a while for USB 3.0 speeds to become necessary enough to drive widespread adoption.

Laptops are great since they can be moved around, which is handy from time to time (though I use my laptop more at my desk than anywhere else), but they do lack the power I sometimes want. The Mac mini obviously never delivered what I really needed in that department. My thought is to build a desktop rig composed of a multi core CPU (whatever makes sense at the moment), a minimum of 4GB RAM and at least two SATA drives (primary/backup), dual booting Windows and Linux. This beast would be pretty much for when I need some real horsepower. By building it myself I could invest a little more wisely in a good case, power supply, etc. and upgrade the thing through several revisions for years to come rather than throwing it all away after a few years. I can also target my $ towards components I care about.

My primary (day to day) computer would be the MacBook Pro, likely with Parallels installed so I can run Windows if/when necessary (mainly since Quicken for Mac sucks last I checked, and so I can test web pages in Windows). When I need to do something laptops are bad at, whether due to small slow disks or just being slower, I’d have the desktop rig available.

One of the downsides here is that while my current display is VGA/DVI, both of these systems would be DVI. I could either degrade the signal to VGA and use my current KVM, or upgrade to a DVI capable KVM switch, which isn’t cheap (I haven’t seen less than $250 for something like an IOGEAR GCS1782). The DVI switches don’t support dual displays unless you drop some serious cash, so that’s pretty much out of the question. This adds to the complexity. Is building a KVM for DVI really so much more difficult that it warrants the price difference? Or are there simply not enough on the market to drive the price down?

When should I start this? What system should I target first? Which should that system replace? Who the hell knows. I’m thinking later next year. It’s not so much a “plan” as an idea. I know I need to upgrade to more modern hardware since I won’t be able to run Mac OS X 10.6, and XP is getting to me. Both machines use 2.5″ ATA/100 drives, which are becoming hard to find and are pretty small even when you do find them. Both are maxed out on RAM.

I’ll likely retire the ThinkPad to just travel and other silliness, and perhaps save the G4 Mac mini for some diabolical scheme. Apple even alludes to some of the possibilities on its site (see “Big Ideas” on the right rail of the Mac mini page).

It gets surprisingly complicated when you want it all and want it to fit on your desk.

Categories
Blog Mozilla

Wordle

A bunch of folks on Planet Mozilla are running Wordle on their blogs. I can’t resist. My apologies to all who hate these memes. I’m doing a little bit of a twist though: the first image is my blog as a whole, the second is only the Mozilla related posts, so it’s a bit more relevant to PMO.

Blog

Mozilla Posts

I’ve always had a little fascination with this stuff. I’ve had tag clouds on my blog archives page for years now. It’s an interesting way to look at text: it gives you a good feel for the content of a large body of text in just a quick glance, often better than any summary could.

Firefox Extension?

Places could be the data source, perhaps using the spell checking dictionary to pull words out of URLs, for example; page titles are easier to mine for obvious reasons. It would be interesting to be able to view your browsing history like this. I think it should take into account both the number of times a word is found and the number of times you visit a page containing that word, with occurrences getting a slightly higher weight (see the sketch below). It could be implemented using <canvas/>, as Benjamin Smedberg demonstrated in a similar exercise. The complexity here would likely be processing time. Anyone interested?
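
A sketch of that weighting idea (the numbers are my own invention, purely for illustration; “slightly higher” becomes a 1.5× factor here):

function wordWeight(occurrences, visits) {
    var OCCURRENCE_WEIGHT = 1.5; // "slightly higher" weight, my guess
    var VISIT_WEIGHT = 1.0;
    return OCCURRENCE_WEIGHT * occurrences + VISIT_WEIGHT * visits;
}
 
// Example: a word found 40 times across history, on pages visited
// 25 times in total, would be drawn at weight 85.
wordWeight(40, 25); // 85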

Categories
Google Internet

Technology Growing In Public Consciousness

Google Zeitgeist 2008 is out. As always it’s fun to read because it’s a recap of 2008. It also gives some pretty good insight into 2009, particularly for tech, since people tend to turn to Google to explain technology to them. I noticed some very interesting things:

What is…

  1. what is love
  2. what is life
  3. what is java
  4. what is sap
  5. what is rss
  6. what is scientology
  7. what is autism
  8. what is lupus
  9. what is 3g
  10. what is art

What Is RSS

Number 5, “what is rss”, is what’s really interesting. There’s long been speculation about whether RSS will ever move beyond a more technical audience. It powers many things on the internet and is possibly the most popular use of XML (I’ve got no data on that), but it’s never had tremendous adoption among mainstream users, who still Google for websites and read news off the homepages.

Steve Rubel thinks it’s peaked at 11%, citing Forrester research from back in October. Personally I think it’s still got a way to go. It will grow, but slowly. Here’s why I think this will turn out to be true:

  • RSS and “feed” are terms that are just now entering mainstream public consciousness. When it comes to technology, people tend to see things for a while before they care about them; they prefer not to waste time on fads. What we see above is evidence of a transitional step in this process.
  • Modern browsers finally offer a smoother process to help users take advantage of RSS. For most of its life, browsers simply showed feeds as raw XML on the screen, which was unusable for 99.5% of the population; it looked foreign and overwhelming. The newer interfaces, while they can still be improved upon, are much easier for users.
  • RSS readers are in their infancy. I’ve been an advocate of Google Reader since very early on. That said, I think there’s a lot that can be improved upon. They can become more usable for mainstream users, in particular when it comes to bucketing into folders and sifting/sorting. It’s the management of data, rather than just the display, that needs work.
  • A greater need to manage time. Lack of time and information overload have long been growing problems for people. RSS actually helps here (unless you’re Steve Rubel and addicted) by reducing the amount of time you need to access and digest information. I keep tabs on a few hundred sites with minimal effort throughout the day; it’s essentially a constantly evolving newspaper for me. People need to reduce the time they spend monitoring things they care about, and RSS is the leading candidate to help them in this task.

Because of the need, and the fact that RSS is a pretty good solution despite the lack of good interfaces to introduce it to users, I suspect there will still be growth as people overcome the barriers and take advantage of it. The only way that won’t happen is if a more disruptive technology comes along. Even if that growth doesn’t pan out, RSS will be with us for many years due to its pervasive use across the net.

What is 3G

I presume this is highly related to the iPhone 3G release. The term “iPhone” appears in the Zeitgeist for several countries but not the US. I suspect that’s because the iPhone itself is new to those countries; what’s new to the US is the iPhone 3G, and that’s what people wanted more information on.

What is Java, What is SAP

Enterprise IT departments love the Google. Enough said.

Other thoughts

I was a little disappointed not to see “who is rick astley”, or for that matter “i can has cheezburger?”.

Categories
Google Security

Google AdSense And SSL

Google’s implementation of AdSense never ceases to amaze me. AdSense has been a major source of revenue for many websites for a few years now, and has allowed many businesses to succeed where previously they would have had little chance. It’s a great program and I appreciate how it allows websites to monetize content quickly and with little effort. That said, I’m still confused by Google’s implementation. It just doesn’t make much sense.

Since July 2007, Google AdSense has been able to crawl login protected pages so that it can scan (and therefore provide relevant ads for) pages behind logins. This is great since many pages, in particular on social networks where the majority of page views happen post-login, can now be monetized.

Despite this progress, Google still doesn’t provide an SSL version of AdSense, so while the page itself can be served over SSL, the ad isn’t. This is problematic since the browser will alert the user that the page is not entirely secure. I really don’t understand why this can’t be done. Google does appear to scan these pages, as the ads are relevant, so I don’t think the crawler is the issue. They just don’t serve ads over SSL.

Come on Google, the web would be a much more secure place if AdSense supported SSL. It would remove a big reason for sites not to use SSL in places where they should.

For those who would argue that putting third party ads on an SSL page defeats the purpose, that’s only partially true. Yes, in an ideal world there’s no third party content on an SSL page. In the real world, Google already supports using SSL with Google Analytics (as do virtually all other analytics services), and you can bet almost any SSL page you access has some analytics on it already. This is no worse. If anything it’s better since, unlike Analytics, the nature of the service involves much less recording of user behavior.
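
For reference, Analytics handles this with a protocol check in its standard embed snippet, along these lines (quoted from memory of the ga.js snippet of this era, so treat the exact details as approximate):

var gaJsHost = (('https:' == document.location.protocol) ?
    'https://ssl.' : 'http://www.');
// Load ga.js from whichever host matches the page's own protocol,
// avoiding the browser's mixed content warning on SSL pages.
document.write(unescape('%3Cscript src="' + gaJsHost +
    'google-analytics.com/ga.js" type="text/javascript"%3E%3C/script%3E'));

There’s no obvious technical reason AdSense couldn’t offer an equivalent, which is exactly what makes the omission so puzzling.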

By not supporting SSL, Google is just encouraging sites not to use it in places where users’ privacy and security would be better off with it.