As many sites are reporting (I’ll just link Slashdot), DRM has been found in the new Intel Macs. So what? That doesn’t really tell us much. Mac OS X has always been tied to Mac hardware (with a small window for clones in the ’90s). It’s premature to assume the DRM will be used all over your computer. It’s likely because of this that those
Mac_OS_X_10.4_Tiger_x86.iso files haven’t made it onto the net yet; there’s no way they would work without the DRM. The good thing about DRM is that just because it exists doesn’t mean it has to be used. Apple has long locked its hardware with exclusive ROMs and special motherboards; now they’ve replaced all that with one chip. IMHO that’s just consolidation. If anything, it opens a few doors. Perhaps software makers can now let us activate software from the privacy of our own computers, without phoning home (something that has always bothered me a bit). Why do I have to tell Microsoft that I installed their software? Isn’t it enough that I bought it? Do I actually need to call them up and tell them? I’d rather have the DRM chip: they preserve their licensing, and I preserve the right not to initiate an electronic conversation with them just to let them know I installed Windows XP (and, obviously, hand over my IP address).
I don’t quite get the fuss. IMHO DRM in a hardware chip is much less invasive than most DRM methods currently in use (such as product activation). Why are we upset about something that could shield us from more privacy-invading techniques?