Categories
Mozilla Web Development

On Prefixing And Monobrowser Culture

I’ll say right off the bat that Daniel Glazman is right, and I fully support his message. The failure to alter the course of the web now will lead to headaches. Truthfully, it’s already a headache; it’s just going to get worse. The IE days were the dark ages of web development. I don’t want to go back to that.

In an ideal world, CSS prefixing wouldn’t be necessary. Browser vendors would spec things out, agree on a standard, and implement it. That, however, is too rational, so CSS prefixing is an unfortunate reality. By the admission of Microsoft and Apple (pointed out by bz), it outright won’t happen:

tantek (Mozilla): I think if you’re working on open standards, you should propose
your features before you implement them and discuss that here.
smfr (Apple): We can’t do that.
sylvaing (Microsoft): We can’t do that either.

Of course you can question whether there’s really a legitimate need to work on standards in private. I’m personally skeptical that a CSS property will leak the next iPhone.

It’s also worth noting that Apple and Microsoft are both OS vendors and (cynically speaking) have interests that are explicitly contrary to the internet being a universal platform. Fragmenting the web and making it a more difficult platform to develop on is potentially in their interest. Not too different from their stance on H.264, of which they are both licensors, which is why neither has implemented WebM.

I’m starting to second-guess the permanence of prefixes. I personally think that once there’s a standard, the first browser release 12 months after standardization should drop support for the prefix. Yes, this will break a few websites that never update. However, it’s almost always an easy fix; I’d venture that 95%+ of the time it could be done safely via a regex. The truth is you’re talking about 18-24 months from initial implementation in practice anyway, possibly longer. A website so stale it can’t manage to deal with this in 1.5-2 years is in pretty poor shape to begin with. LESS and Sass can also be a big help in automating this, and the W3C CSS Validator already errors on prefixes. The tools to deal with this are in place today.
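As a rough illustration of how mechanical that regex fix could be, here’s a minimal Python sketch that strips vendor-prefixed declarations. It’s deliberately naive: a real migration tool would first confirm the unprefixed property exists in the same rule, and the stylesheet here is just an example.

import re

css = """
-moz-border-radius: 3px;
-khtml-border-radius: 3px;
-webkit-border-radius: 3px;
border-radius: 3px;
"""

# Remove any declaration using a known vendor prefix. Naive on purpose:
# it assumes the unprefixed equivalent is already present.
prefixed = re.compile(
    r"^[ \t]*-(?:moz|webkit|khtml|ms|o)-[a-z-]+[ \t]*:[^;\n]*;[ \t]*\n",
    re.MULTILINE,
)
print(prefixed.sub("", css))  # leaves only "border-radius: 3px;"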

I should note that dropping prefixes is unlikely to happen, and is thus wishful thinking.

A large part of this issue is how many websites are built these days, especially “mobile sites,” which are typically separate sites bolted onto an API or even the backend database of a website. They are often built by third-party vendors for whom getting things passable and out the door is key. As a result, every shortcut in the book is taken, including the absolute minimum in testing and compatibility.

For what it’s worth, this blog has only one prefix in use, and it’s coded as:

-moz-border-radius: 3px;
-khtml-border-radius: 3px;
-webkit-border-radius: 3px;
border-radius: 3px;

That catches everyone, and takes all of 30 seconds at most to do.

Categories
Google Internet

Google Should Use Google Wave Against Facebook

(Image: “Help me Google; you’re my only hope.”)

Google should use Google Wave against Facebook.

It’s not as crazy as it sounds. I will be the first to say I was unimpressed by Google Wave from a user’s point of view. I should note that Google Wave was pitched as an email alternative, and it’s not great at that job; from a technical perspective, though, it was pretty impressive. It is, however, a potentially killer distributed social media network. It would take slight retooling to adjust it for the task, but it is already better suited to compete against Facebook than against email.

It would actually be a pretty good alternative if the UI were better tuned to the task. Allow me to explain:

It’s close feature-wise

I won’t go into point after point, but Google Wave can do many of the same things Facebook can. It’s a good way to communicate in an open or closed fashion, and each wave can already be granular in terms of privacy. It can be used to share much more than text, including photos and video. It can be extended by third parties using its API. It already has chat support. It’s built on XMPP. It can already mirror Facebook in almost every way, and it can be extended to do what it can’t today. Profiles are the biggest thing it lacks, and I suspect that wouldn’t take much to add; I’m thinking an extendable XMPP vCard on the technical side (a rough sketch below).
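XMPP already has a vCard format (XEP-0054, the vcard-temp namespace), so a hedged sketch of the idea, built with Python’s standard library, might look like the following. The urn:example:wave-profile namespace and its fields are entirely hypothetical; only the vcard-temp parts are real.

import xml.etree.ElementTree as ET

# Standard XEP-0054 vCard fields.
vcard = ET.Element("{vcard-temp}vCard")
ET.SubElement(vcard, "{vcard-temp}FN").text = "Jane Doe"
ET.SubElement(vcard, "{vcard-temp}NICKNAME").text = "jdoe"

# Hypothetical profile extension -- this namespace does not exist;
# it just shows how a vCard could carry richer social-profile data.
status = ET.SubElement(vcard, "{urn:example:wave-profile}status")
status.text = "At the beach"

print(ET.tostring(vcard, encoding="unicode"))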

It’s distributed

Google Wave is hosted by Google, but it’s also an open protocol, and Google is releasing chunks of its implementation. That means they can partner with other large companies (AOL, Yahoo, Microsoft, Apple, etc.) who can federate and let their users all instantly be part of one huge social network. Users already have “friends” via their address books for email, and importing from other sources is easy; just look at how Facebook did it. If Google got AOL, Yahoo, or Microsoft to partner with them, they would overnight reach a huge chunk of the Internet population via their email users.

For those who are going to try and argue that Facebook users don’t have email addresses: yes, they do. Email is a primary method of notifying users of things (other than SMS) and is required to sign up for an account.

This also means you can host yourself, or use the provider of your choice. You’re not subject to Facebook, or any one company, deciding your fate.

It would be more private

One of the primary gripes against Facebook is that its privacy measures are inadequate. Facebook has motives to force people to be more public, and there’s little incentive to help you stay private, since the alternatives are slim. With Google Wave hosted by several providers, they would need to give you more control, or you would just move to a provider that gives you the controls you want, just like with email. By using your own domain to point to a provider, you would have portability of your identity. Once again, Google Wave is by design more granular than Facebook; it’s already based around the concept of sharing data. What Google Wave really needs is a robust profile implementation with granular permissions and the ability to bucket contacts to make permissions more manageable.

Despite its UI and marketing pitch, it’s a surprisingly close Facebook competitor.

It would be a healthier ecosystem

Like I mentioned before, Google Wave has a fairly decent API already. What is great is that providers would be pressured to provide a robust enough API so that the killer apps exist on their platform. Again, no more reliance on a single source. By standardizing at least a subset of the API, developers can target multiple providers and implementations. It also means providers will need to allow more granular controls over privacy settings for third-party apps, or once again, people will be switching.

Google wins too – keeps them in the center of the universe

Google likes to be at the center of things, especially information. By doing this, Google would still be able to organize a user’s information in meaningful ways, which is really Google Wave’s main goal for Google; that’s a major win for them. Anyone a user trusts to index their information can do so. If users are paranoid, they can keep things totally private; if they really want privacy, they can run it on their own server. If you don’t trust Google, you can avoid them but still join the party.

It would be more permanent

Facebook is still not guaranteed to be around in 10 years. Email, however, is overwhelmingly likely to still be around, just like newsgroups and IRC still have their place even if they aren’t as mainstream anymore. Why? Because they are all open standards and not tied to one company’s profitability. I can still find and read old newsgroup posts from over 20 years ago. Feel that confident about Twitter? Facebook? foursquare? How much time do you invest in them?

What about diaspora or _______?

diaspora is a clever effort and a noble one, getting a lot of press today. It really is. But I think it’s too complex for real widespread adoption, especially in the era of easy-to-use web apps. It’s true that users flocked to P2P apps despite their complexity, but that’s because there were no lower-overhead alternatives. I’d give most of these efforts a 5% chance of any real success.

Star Wars is copyright Lucasfilm.

Categories
Internet Politics

Federal Support For RSS

An interesting little note going around the web today is the push for RSS/Atom feeds by the new administration. For example, the Initial Implementing Guidance for the American Recovery and Reinvestment Act of 2009 [PDF] specifically dictates that feeds are “required”:

For each of the near term reporting requirements (major communications, formula block grant allocations, weekly reports) agencies are required to provide a feed (preferred: Atom 1.0, acceptable: RSS) of the information so that content can be delivered via subscription. Note that the required information can be supplied in the feed or the feed can point to a file at the agency using the convention noted below. If an agency is immediately unable to publish feeds, the agency should post each near term information flow (major communications, formula block grant allocations, weekly reports) to a URL directory convention suggested below: www.agency.gov/recovery/year/month/date/reporttype. It is expected that the information files will be posted at the following URLs:

  • Major Communications: www.HUD.gov/recovery/2009/02/16/comms
  • Formula Block Grant Allocation: www.HUD.gov/recovery/2009/02/16/fbga
  • Weekly Report: www.HUD.gov/recovery/2009/03/01/weekly
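For what it’s worth, consuming such a feed is trivial. Here’s a minimal sketch using the third-party Python feedparser library, which handles both Atom and RSS; the feed URL is hypothetical, since the guidance doesn’t say where each agency’s feed would live.

import feedparser  # third-party: pip install feedparser

# Hypothetical feed URL -- the guidance doesn't specify one.
feed = feedparser.parse("https://www.hud.gov/recovery/feed.atom")

# Prints nothing if the URL doesn't resolve to a real feed.
for entry in feed.entries:
    print(entry.title, entry.link)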

I predicted a few months ago that there would be slow growth of feeds in the future, and this is just another example of what will be fueling that growth. While I won’t debate RSS vs. Atom here, it’s still interesting to see. It seems whitehouse.gov also prefers Atom, using it throughout its feeds.

Interestingly, a year ago when I profiled all the candidate websites, only Hillary Clinton (D), Tom Tancredo (R), and Ron Paul (R) preferred Atom. Everyone else used RSS. I couldn’t even find a feed on Barack Obama’s site.

I wonder if the federal government will ever have a syndication standard, either RSS or Atom. I’m guessing that decision comes down to the National Institute of Standards and Technology (NIST), which I don’t think has any standard for syndication; NIST itself uses RSS, as does NASA, among other government agencies with websites. Considering Atom has actually been through IETF standardization (RFC 4287), it might have an edge over RSS.

Categories
Hardware

Stick to Certified WiFi Gear

Gartner is warning against early adoption of 802.11n, citing the need for more testing and recommending waiting until the spec is truly finalized (likely 2007) before adopting.

I couldn’t agree with them more. 802.11 gear is only good if the devices are “Certified” (not to be confused with “Compatible”). I’d bet that 90% of the problems people have with wireless gear come down to choosing “Compatible” rather than “Certified” products: the former means the manufacturer feels it’s good enough; the latter means it actually meets the specifications.

I really don’t believe in “Turbo Mode” and all the other proprietary add-ons to WiFi hardware. The vendors can’t even get the basics right (look, many still aren’t Certified).

Early on (I think it was 2001) I started playing with some early, uncertified Linksys hardware. A real drag. As soon as I started putting Certified equipment in place, it worked flawlessly, while the non-certified gear still had occasional problems. Now I only buy certified hardware, and everything runs very nicely. You especially see problems with non-certified gear when mixing brands. Right now I have 3 different WiFi adapters connecting to an access point from yet another vendor. Not a problem.

A word to the wise: if you insist on reliability, always get certified. You can look up your products here to see if they are.

Categories
Internet

Private DNS Address Space

RFC 1918 defines the following IP blocks as designated for private intranets:

10.0.0.0 – 10.255.255.255 (10/8 prefix)
172.16.0.0 – 172.31.255.255 (172.16/12 prefix)
192.168.0.0 – 192.168.255.255 (192.168/16 prefix)
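For reference, here’s a minimal Python sketch that tests an address against those three blocks using the standard ipaddress module; the sample addresses are arbitrary.

import ipaddress

# The three RFC 1918 private blocks listed above.
RFC1918 = [ipaddress.ip_network(n)
           for n in ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]

def is_rfc1918(addr: str) -> bool:
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in RFC1918)

print(is_rfc1918("172.20.0.1"))  # True
print(is_rfc1918("8.8.8.8"))     # False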

I think it’s about time we have the same thing for DNS, for example:

.dev
.intra

The logic is as follows.

.dev for intranet-based development instances of a site. For example, this website’s dev instance on my intranet is robert.accettura.dev.

.intra for intranet URLs such as yourdomain.intra. This can be used for any intranet purpose (internal homepage, email system, blogs, wiki, etc.).

This is a much more logical system than using intranet DNS servers to hijack a domain for internal purposes, or reserving subdomains for the purpose.

Someone should go pester ICANN about such a standard. Btw: .local is stupid; if it’s local, it’s localhost.

Categories
Web Development

Holy poo, Yahoo’s Strict

Yeah, take a look at that source.

Interesting eh?

Categories
Software

Adobe DNG

Adobe today released the specs for a new image format: DNG.

I’m interested. The ideas behind it seem to make sense. JPEG sucks. That’s about all there is to it. It’s what we have, it’s what we use. But it stinks.

I’m curious if Adobe is going to be able to get DNG actually out there. It’s going to be an uphill battle for sure; JPEG is such a de facto standard that it’s going to be tough. But I’m welcoming the change.

Categories
Tech (General)

It’s time for an international standard on Instant Messaging

Well, actually it’s well past time. Instant messaging has all the earmarks of being the communication medium of the future, and it royally stinks.

Problems today:

  • Networks don’t communicate together, hence locking users in (MSN, AIM, Yahoo!)
  • Phones don’t text message (essentially the same as IM) across networks, and barely from net to phone.
  • Each has proprietary ‘extras’ (file transfer methods, voice chat, webcam, pictures, etc.). Far from standardized.

I think it’s time for the IETF to write up an official recommendation for Instant Messaging.

Here’s my wish list:

  • UTF-8 encoding for all messages
  • XML messages. Adds capabilities to easily integrate with other systems (since XML is the way of the future). Stylesheets define how it appears.
  • MathML support – for those wanting to get geeky.
  • SVG graphics – why not? Slim, clean, XML. This could be used for multiple things: emoticons :), for example, could be sent via SVG, and things like whiteboards (which let you draw and have the other party see what you draw) could be done in SVG.
  • Of course, an open standard, like Email. Cross platform, many clients, no licensing restrictions. So everyone can enjoy it.

With this, there’s a lot of flexibility. Using XML as a message format, rather than HTML, allows a stylesheet to render it pretty. A person with a vision impairment could have a product read the XML directly. You could honor a stylesheet provided by the person you are talking to, download stylesheets online, or create your own. Big text? Small text? Color contrast? All in your control. And SVG emoticons can resize appropriately without losing quality; phones can resize as necessary thanks to custom stylesheets.
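To make the idea concrete, here’s a minimal Python sketch of what such an XML message might look like. Every element and attribute name is hypothetical; no such standard exists.

import xml.etree.ElementTree as ET

# Hypothetical message schema -- purely illustrative.
msg = ET.Element("message", {"from": "alice@example.org",
                             "to": "bob@example.net"})
ET.SubElement(msg, "body").text = "Hello :)"  # UTF-8 text body
ET.SubElement(msg, "stylesheet",
              {"href": "https://example.org/pretty.css"})

print(ET.tostring(msg, encoding="unicode"))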

It’s a real shame it hasn’t happened yet. There are no great IM clients, and the protocols all have their limitations (AOL stinks behind firewalls, Yahoo has minimal users, MSN is spam-ridden). All the current systems stink, and their clients are even worse: AOL’s adware, MSN’s buggy client (and terrible Mac client), Yahoo’s terribly slow development.

Look at all the IRC clients available. So many, each with their own features, toys, and enhancements. All working together.

Yes, I do hate IM as of today. But imagine what could be done. It could be as universal as email: a secure, fast, flexible framework. Instead, we’ve got garbage to date.

The time for standards in IM is now. It’s only going to get more proprietary from here on out, locking users into their networks.

Oh… spam prevention built into the protocol would be nice. Let’s avoid another email-like spam epidemic.

Just my $0.02

Categories
Mozilla

David Hyatt’s right

David Hyatt seems to have generated some waves over his recent posts regarding Safari “bugs”.

I must say he’s 100% right, and I applaud him for taking a stand.

I’ve been a web developer for several years in one form or another, and I’ve had a web presence for several years now. I started out using just Netscape Communicator and Claris Homepage for website development, not knowing ANY HTML. Slowly I learned.

Over the past 2 years now, I’ve become almost obsessed with writing “the perfect code”. A website that looks good in all browsers, is small, and works perfectly every time.

What I’ve found is that you can save tons of time by just using valid code. Since I became an addict about validating my pages, I haven’t had rendering issues. I used to have many bugs that would drive me crazy: fix one thing, and it breaks in another browser. It was a mess. Since I went with good valid HTML (I still prefer HTML 4.01 Transitional with sparing use of CSS, since it breaks a little more gracefully in older browsers, IMHO), those problems have gone away.

And being a Mac lover and Mozilla addict means I’m also well aware of how webmasters ignore standards, as long as their bad code breaks the desired way in IE. It’s been driving me crazy for years.

I’m going to keep generating valid websites; I have no intention of going back to my old ways. Some pages will remain invalid for some time until their backends are updated appropriately, but all new systems and pages will generate valid HTML. That’s my plan from here on out. Why? Because it’s faster than patching for eternity, it works, and it’s the RIGHT way of doing things. It’s cheaper, really: cleaner code is less bandwidth and more efficient business-wise. Read here for more on that. Good companies want valid code as well.

I really hope some other webmasters besides myself start listening to Mr. Hyatt. He’s a wise authority on web development, given his experience with Mozilla and Safari. If there are webmasters out there who don’t monitor his blog(s), I encourage them to start doing so and to read up on his past posts in his archive (you can skip the pop culture ones if you want… but he’s got good taste in games, and especially TV, i.e. HBO).

Perhaps he’s just started a revolution that will change the web? Perhaps he’s only enlightened a handful of webmasters. Either way… Thank you.

Categories
Tech (General)

Microsoft and Standards

Microsoft recently redesigned their website (at least their homepage). As you may know, they aren’t a big backer of standards.

Look how well their page validates.

This concludes this episode of Microsoft-doesn’t-care-about-technology.

Validating as XHTML is even more fun.

Apple isn’t perfect, but isn’t too bad either.

This website is fine though :)

Go standards!