Google Public DNS Analysis

Google’s new Public DNS is interesting. They want to lower DNS latency in hopes of speeding up the web.

Awesome IP Address

This is the most interesting thing to me. I view IP addresses similar to the way Steve Wozniak views phone numbers, though I don’t collect them like he does phone numbers.

[Querying whois.arin.net]
[whois.arin.net]
Level 3 Communications, Inc. LVLT-ORG-8-8 (NET-8-0-0-0-1) 
                                  8.0.0.0 - 8.255.255.255
Google Incorporated LVLT-GOOGL-1-8-8-4 (NET-8-8-4-0-1) 
                                  8.8.4.0 - 8.8.4.255

# ARIN WHOIS database, last updated 2009-12-02 20:00
# Enter ? for additional hints on searching ARIN's WHOIS database.

Looks like Google is working with Level 3 (also their partner for Google Voice, I hear) to get an easy-to-remember IP. From what I can tell, it’s anycasted to a Google data center.

For what it’s worth, 6.6.6.6 is owned by the US Army. Make of that what you will.

NXDOMAIN

My first thought was that Google would hijack NXDOMAIN to show ads, like many ISPs and third-party DNS providers do. Instead, they explicitly state:

If you issue a query for a domain name that does not exist, Google Public DNS always returns an NXDOMAIN record, as per the DNS protocol standards. The browser should show this response as a DNS error. If, instead, you receive any response other than an error message (for example, you are redirected to another page), this could be the result of the following:

  • A client-side application such as a browser plug-in is displaying an alternate page for a non-existent domain.
  • Some ISPs may intercept and replace all NXDOMAIN responses with responses that lead to their own servers. If you are concerned that your ISP is intercepting Google Public DNS requests or responses, you should contact your ISP.

Good. Nobody should ever hijack NXDOMAIN. DNS should be handled per spec.
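If you want to check for NXDOMAIN hijacking yourself, the honest answer lives in the RCODE field of the raw DNS response header (RCODE 3 is NXDOMAIN per RFC 1035). A minimal sketch of parsing it; the sample bytes are fabricated for illustration:

```python
import struct

# A DNS header is six 16-bit big-endian fields: ID, flags, QDCOUNT,
# ANCOUNT, NSCOUNT, ARCOUNT. The response code (RCODE) is the low
# 4 bits of the flags word; 3 means NXDOMAIN.
def rcode(dns_response: bytes) -> int:
    (flags,) = struct.unpack("!H", dns_response[2:4])
    return flags & 0x000F

# Fabricated response header: ID=0x1234, flags with QR/RD/RA set and RCODE=3.
sample = struct.pack("!HHHHHH", 0x1234, 0x8183, 1, 0, 0, 0)
print(rcode(sample))  # 3 -> the resolver honestly reported NXDOMAIN
```

A resolver that rewrites NXDOMAIN will instead hand back RCODE 0 with an A record pointing at its ad server, which is exactly what Google says it won’t do.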

Performance Benefits

Google documented what they did to speed things up (some of it, anyway). The good news is that they will still obey TTLs, it seems. My paraphrasing:

  • Infrastructure – Tons of hardware/network capacity. No shocker.
  • Shared caching in the cluster – Pretty self explanatory.
  • Prefetching name resolutions – Google is using their web search index and DNS server logs to figure out which names to prefetch.
  • Anycast routing – Again obvious. They do note, however, that this can have negative consequences:

    Note, however, that because nameservers geolocate according to the resolver’s IP address rather than the user’s, Google Public DNS has the same limitations as other open DNS services: that is, the server to which a user is referred might be farther away than one to which a local DNS provider would have referred. This could cause a slower browsing experience for certain sites.

Google also discusses the practices they use to mitigate some common DNS security issues.
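The shared-caching and TTL points boil down to a cache that serves an answer only while its TTL is live. A minimal sketch of that idea, with names and structure of my own invention (not Google’s implementation):

```python
import time

class DnsCache:
    """Toy resolver cache that honors record TTLs."""

    def __init__(self):
        self._store = {}  # name -> (answer, expiry timestamp)

    def put(self, name, answer, ttl):
        self._store[name] = (answer, time.monotonic() + ttl)

    def get(self, name):
        entry = self._store.get(name)
        if entry is None:
            return None
        answer, expiry = entry
        if time.monotonic() >= expiry:
            # TTL expired: evict and force a fresh resolution upstream
            del self._store[name]
            return None
        return answer

cache = DnsCache()
cache.put("example.com", "93.184.216.34", ttl=300)
print(cache.get("example.com"))  # the cached answer, while the TTL is live
```

Prefetching just means calling `put()` proactively for popular names before anyone asks, so `get()` rarely misses.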

Privacy

In their privacy policy, Google says they erase any IP information after 24-48 hours. Assuming you trust Google, that may be better than what your ISP is doing, though your ISP could still log your lookups by monitoring DNS traffic over their network. As far as I’m aware, there are no US laws governing data retention, though they have been proposed several times.

I am curious how this will be treated in Europe, which does have some data retention laws for ISPs. Does providing DNS, traditionally an ISP activity, make you an ISP? Or do you need to handle transit as well? Does an ISP need to track the DNS queries of someone using a third-party DNS? Remember, recording IPs alone is not the same thing, thanks to virtual hosting: many websites can exist on one IP.

OpenDNS and others may have flown under the radar being smaller companies, but Google will attract more attention. I suspect it’s only a matter of time before someone raises this question.

Would I use it?

I haven’t seen any DNS-related problems personally. I’ve seen degraded routing from my ISP from time to time, and especially in those cases, my nearby ISP-provided DNS would be quicker than Google. I don’t really like that nameservers may geolocate me further away, but that’s not a deal killer. I don’t plan on switching, since I don’t see much of a benefit at this time.

Measurement Lab

Google today unwrapped Measurement Lab (M-Lab) via its blog. M-Lab can measure connection speed, run diagnostics, and check whether your ISP is blocking or throttling things.

In general it’s a good idea, but it’s nothing new. Sites like dslreports.com and SpeedTest.net have been measuring speed and providing diagnostics for years. The BitTorrent test, however, isn’t replicated by many yet.

One thing that isn’t addressed is how they will detect an ISP adjusting its routing to handle M-Lab servers specially. What stops an ISP from leaving traffic to one of Google’s “36 servers in 12 locations in the U.S. and Europe” unthrottled while throttling all other data? Perhaps Vint Cerf and friends have a plan in mind, but it seems to me this could become a cat-and-mouse game.

Postage for Email? My Internet != Your Internet?

There’s been a lot of buzz lately over AOL and Yahoo charging to email their customers. I think this quote most likely will end up being the future:

“AOL users will become dissatisfied when they don’t receive the e-mail that they want, and when they complain to the senders, they’ll be told, ‘it’s AOL’s fault,’ ” said Richi Jennings, an analyst at Ferris Research, which specializes in e-mail.

Well said. Just wait until AOL customers realize they aren’t getting order confirmations, notifications, and other emails because the sender won’t pay.

Another concern not really discussed is the possibility of a Level 3/Cogent-style battle, where one ISP refuses to let another email its customers because it isn’t getting paid what it feels it should.

Right now, email is essentially 100% peered. Everyone emails everyone; nobody charges. You pay your ISP to run the mail server, and that’s it. If commercial entities need to pay to email you, you’re going to get separate charges. Want an email when your order ships? Pay extra. Want an email when this item is back in stock? Pay extra.

This is a very slippery slope. Just one or two greedy ISPs is all it takes to ruin email. Once you can’t email reliably, the system is dead. Spam can reduce efficiency, but it can’t kill email. Remember, email is by far the most used protocol in business.

I doubt this system will do anything to reduce spam for AOL customers. It will, however, help AOL’s revenue, which I assume is the real goal. A slightly bold move, as AOL is betting its customers won’t mind missing legitimate email they would receive with a free Gmail or even Hotmail account.

There’s also a decent possibility that AOL customers might have to pay merchants an email fee when they buy products, to help cover that cost. Of course, merchants will eventually sneak in their own percentage, further hiking prices.

Personally, I think the biggest threat is a Level 3/Cogent-style dispute.

I should also note there’s currently a lot going on over Net Neutrality. Google’s been thrown into the middle of that merely because of how ubiquitous the company is. Vint Cerf’s letter on the topic is really a must-read. Paying for email is really just an inverted case of network neutrality: instead of the middleman dictating who you can and can’t communicate with, the next ISP down the line decides. That’s no better.

The Internet as an open medium could drastically change in the next few months if some of this stuff becomes reality. There are quite a few companies out there who believe the Internet is enough of a threat to their business that they want to go as far as crippling it.

Spammer Spot Checking

It’s pretty well known at this point that a rather large amount of spam comes through regular ISPs. There is a rather large debate over how to get rid of it. Some ISPs just ignore it. Some block port 25. But is there a better way?

I’m going to propose the following:

  • A random check of 1 out of every 100 emails sent through an ISP’s servers, or via port 25 (for ISPs that allow 3rd-party mail servers), gets checked by a spam filter (such as SpamAssassin).
  • If a user gets flagged, the user enters a “gray list,” in which their emails are checked at a shorter interval (1 out of 25) for the next several days.
  • If more than 10% get flagged (a rather large margin for today’s spam filters), the account should be suspended and investigated by the ISP before being re-enabled.
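The whole flow fits in a few lines of code. A sketch under my own assumptions: the rates, the `MIN_CHECKS` floor (added so one unlucky flag can’t suspend an account), and `is_spam()`, which stands in for a real filter such as SpamAssassin:

```python
import random

SAMPLE_RATE = 100      # spot-check 1 in 100 normally
GRAY_RATE = 25         # check 1 in 25 while gray-listed
GRAY_THRESHOLD = 0.10  # suspend when >10% of checked mail is flagged
MIN_CHECKS = 10        # my own addition: require several samples before suspending

class Account:
    """Hypothetical per-sender state tracked by the mail server."""
    def __init__(self):
        self.graylisted = False
        self.checked = 0
        self.flagged = 0
        self.suspended = False

def handle_outgoing(account, message, is_spam, rng=random):
    """Randomly sample an outgoing message and update the sender's state."""
    rate = GRAY_RATE if account.graylisted else SAMPLE_RATE
    if rng.randrange(rate) != 0:
        return  # not sampled; the mail just goes out untouched
    account.checked += 1
    if is_spam(message):
        account.flagged += 1
        account.graylisted = True
    if (account.graylisted and account.checked >= MIN_CHECKS
            and account.flagged / account.checked > GRAY_THRESHOLD):
        account.suspended = True
```

Note the message itself is never delayed; the sample runs after the mail is accepted, and only the account state changes.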

The vast majority of the above can be automated. But how would this cut down on spam?

Explanation

The vast majority of users send fewer than 100 emails a day, so the extra CPU required would be relatively minimal for each legitimate user an ISP has (only 1/100 of outgoing email would be scanned). Odds are the user will have one email scanned every 3-7 days (assuming they send between 15-20 emails a day). A spammer, or a computer infected with a Trojan, will be sending large sums of spam (perhaps hundreds an hour), so it’s rather likely that one falls into the group tested by the spam filter. Then, once the account falls into the gray list, it will become rather obvious whether it was a fluke (emailing a spouse about Viagra) or a spammer. Spammers need to send bulk amounts of mail to be profitable, since not many who receive it actually click and buy something.

Why would an ISP want to bother?

A spammer can not only put a large burden on a mail server (read: cost), but also cause an ISP to be blacklisted. This is a negative for any ISP, because it reduces the quality of service for legitimate users and could lead customers to feel they can get better service elsewhere. The best way to avoid being blacklisted is to keep your mail servers clean.

Wouldn’t this violate privacy policies?

Not likely. Many ISPs already scan incoming email for spam and viruses. This is simply applying the same scanning in reverse, so there are likely no additional privacy concerns.

Couldn’t this prevent many virus outbreaks?

Yes. The same approach could prevent virus outbreaks, simply by running the sampled mail through a virus scanner as well.

Could this be done without a “gray list” to make it easier to implement?

Yes, in theory it could. You could just flag an account so an admin is aware, or suspend it right away. Suspending on one catch may cause more false positives than you want, so I’d advise against it; I’d opt for flagging the account, or perhaps notifying an admin by email. If someone is a real spammer, they will fall into the random sampling a dozen or so times rather quickly, so it will be rather obvious. A “gray list” is more programming, but it makes the system more automatic and tolerant, providing a better experience for end users with less work for admins in the long run.

Where did 1 out of 100 come from?

It’s somewhat arbitrary, but it should prove effective; I’m sure some analysis could come up with an even better number. The goal is to prevent spam with minimal CPU. Odds are a spammer won’t send one email a day; they will send in volume (since the more they send, the higher the chance a consumer will bite). More often than not, one will fall into the filter. You can cut the interval in half (1 out of 50) to double your chances, at the expense of system resources.
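The math behind those odds is simple: if each email is sampled independently with probability 1/100, the chance that at least one of N emails gets checked is 1 − 0.99^N. The volumes below are my own illustrative numbers:

```python
def p_at_least_one_checked(n_emails, rate=100):
    """Chance that at least one of n_emails lands in a 1-in-`rate` sample."""
    return 1 - (1 - 1 / rate) ** n_emails

# A light user sending ~20 emails a day vs. an infected box sending
# 200 an hour for a day (4,800 messages):
print(round(p_at_least_one_checked(20), 3))    # ~0.182
print(round(p_at_least_one_checked(4800), 3))  # ~1.0
```

So a legitimate user is sampled only every few days, while a bulk sender is effectively guaranteed to be caught within hours; halving the rate to 1-in-50 just steepens that curve.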

Wouldn’t this just make email slower?

Not really. You can send the email before you scan it, so this doesn’t slow outbound email. It’s just taking a random sample at an interval and reacting based on the analysis. Even if the filter goes off, the mail should still be sent (it could be a false positive); only when the user is flagged as a spammer should the account be unable to send email. This results in minimal disruption of service, and for a spammer it should happen relatively quickly. Scanning 1% of outgoing email shouldn’t be too substantial a load. Assuming you keep an eye on your mail server anyway, this should only speed up the detection of a spammer using it. If you go to a 1:50 scanning ratio, you’ll only improve your odds and speed in catching spammers.

Has anyone implemented this? Is there a tutorial?

To the best of my knowledge, nobody has done this yet, at least not based on my theories. If you have done this and would like to contribute some code, information, or wisdom, or just mention who did it, let me know.

Why not just scan all outgoing email?

It’s just not practical for performance/resource reasons. Nor is it really necessary, since spammers need to send in bulk.

Couldn’t spammers work around this?

Well, they can space out when they send mail, say in batches of 50, but each message still has a 1-in-100 chance of being scanned. They could send less, but that would be costly: they need to send in bulk to get as many eyes on their offers as possible, so just sending less isn’t good business for them. This hits them where it hurts, by making their business model ineffective. If they can’t send the mail, they can’t profit.

Doesn’t this protect others, rather than myself?

Yes and no. We are a community, and communities look out for each other. If everyone did this, the load on incoming mail servers would be substantially lower. As said before, by catching your own spammers you avoid being blacklisted by the many blacklists out there, and that has a direct benefit to your business.

What about bounced email?

Those should be scanned as well, simply because a spammer can bounce their spam off of your mail servers to get around blacklists. If I email invalid@goodisp.com with a spoofed “From:” header, they will likely “bounce” that email to my real recipient (whoever I put in the “From:” header), quoting the message (my spam). By scanning these as well (1 out of 100), you can effectively cut down on this abuse by leeching spammers.

The bottom line

By using the above method of scanning outgoing email, you can effectively prevent spammers from profiting off of your mail servers. Spammers need to send in bulk, and the more they send, the easier they are to catch. This is an easy way for an ISP, web host, or mail provider to cripple a spammer’s business without harming legitimate email users.

ISPs should run BitTorrent Caches

I’ve gone on a bit about BitTorrent before, and in part it has happened (regarding Mozilla). We at least have torrents on the homepage!

Now to send a little message to ISPs:

BitTorrent could be an ISP’s best friend. Think networking basics for a minute: staying within the network is faster and more reliable. If a user subscribes to Comcast, their connection to Comcast’s network is optimal, theoretically faster than anything else they can access. Also, Comcast doesn’t need outbound bandwidth, whether peered or purchased, when a user is consuming internal content (savings).

If an ISP were to embrace something like BitTorrent, it would be a real advantage. When something new is released, such as a game, a Linux distro, or another large file, people go and download it all at once, and accommodating that takes some bandwidth. There’s no good reason an ISP can’t handle the bulk of that internally, providing faster downloads to their users (great marketing) and lowering operational costs.

If an ISP were to set up a cache, simply to provide fast internal downloading through a method like BitTorrent, there would be significant benefit to all parties: file hosts save bandwidth, consumers get files quicker, and ISPs relieve uplink bandwidth while getting something new to market.

Even if the cache only mirrored the very popular things, say the top 10 of the past 24 hours, it would make a significant difference.
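Deciding what to cache could be as simple as counting requests over a sliding window. A sketch, with a fabricated request log standing in for whatever the ISP actually logs:

```python
from collections import Counter

def torrents_to_cache(request_log, top_n=10):
    """Pick the most-requested files of the past 24h for the internal cache."""
    return [name for name, _ in Counter(request_log).most_common(top_n)]

# Hypothetical 24-hour request log, one entry per download started:
requests_24h = [
    "ubuntu.iso", "game-patch.bin", "ubuntu.iso", "fedora.iso",
    "ubuntu.iso", "game-patch.bin", "movie-trailer.mov",
]
print(torrents_to_cache(requests_24h, top_n=2))  # ['ubuntu.iso', 'game-patch.bin']
```

Since demand for large releases is extremely top-heavy, even a tiny `top_n` would absorb most of the uplink traffic.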

Netscape Desktop Navigator

As mentioned all around the web, Netscape is creating an ISP. In hopes of turning into an ISP/content-based brand, Netscape the other day released a beta of its Netscape Desktop Navigator program. So I decided to give it a quick whirl…

Overall Impressions

I found it to be quite useful, but not exactly perfect for my needs. For example, I’ll never use it to search the web, but that’s the focal point of the app. Nor will I shop or look at personals. IMHO, those can be removed.

Very useful are the weather features: weather in the system tray, and the app itself remembers my zip code. Very nice to have. Another useful feature is the movie showtimes; it’s now one click to see where a movie is playing, and at what time. Also very convenient are the maps and white pages. The TV guide is somewhat easy to use, though it would be nice if it remembered that I have cable and which provider I use, so I could see all my cable listings right in there.

News was a little disappointing. There’s too little of it to make it useful; I still find it easier to use Google’s news feed than this app. Too brief, IMHO.

Also wanted is some customization. Let me remove what I don’t want to have, for example search: I’ll never use it, but it takes up valuable space. I wouldn’t mind having more headlines there. Perhaps add sports and stocks? I would love to have those.

Hopefully they keep it ad free.

I’ll keep it on my computer; I find it rather useful, provided it doesn’t become adware. It’s a smooth little app that saves me a few clicks and puts things within my reach. It works rather well.

It would be nicer if it used Gecko and XUL… but then again, this is AOL we’re talking about. They sold their soul to the devil (quite literally).

Did someone leave the lights on?

In case the news hasn’t reached everyone yet, Netscape will update its release sometime early this summer, speculated to be based on Mozilla 1.7.

Personally, I wonder who will actually be coordinating this update. And secondly, why not wait until Firefox is released, market it as Netscape 8.0 Light, and make a pro version for use with the new ISP? At least that makes more logical sense from a business point of view. It would allow them to promote their new business model by using their old one. They could also advertise their service as having all the wonderful features Firefox has, and could release Mac/Linux versions as well to capture that part of the market.

Then again, who said AOL/Netscape made logical sense from a business point of view? After all, it’s now known as AOLTW for a good reason.

FBI Attempting to damage Business?

An interesting article via Slashdot discusses how the FBI decided to shut down an ISP, rather than take specific data, citing that it’s more “efficient.”

I wonder: if someone were hosted at a large hosting facility like one of Verio’s, or Sprint’s, etc., would they seize the entire facility, turning off the servers of Fortune 500 companies? Or do they only do this to small businesses? Perhaps if it happened…

Either way, I question the ethics of the FBI’s move. This undoubtedly did irreversible damage to the ISP, since I’m sure some customers will leave after the downtime, not to mention the harm to all of its customers.

Just in from Netscape

Just got this email from Netscape:

Coming Soon: All-New Netscape Internet Service!
As a Netscape Network user, we’d like to make you aware of a new Internet service Netscape will soon be releasing. With this new service you’ll enjoy unlimited internet access for a low price of $9.95 per month. Stay tuned for more details on this new offer. Or, if you want to be one of the first to try the new Netscape service, please go to the hyperlink below. You may either click on the link or copy and paste this address into your browser.

As mentioned earlier.

New Mozilla.org

As undoubtedly everyone will blog in the next 24 hours: the new mozilla.org has been launched. It looks great, and it has a wonderful end-user focus.

One thing I would do is make a subdomain for corporate customers: gear it toward the same information, but with the corporate advantages (why Mozilla is good for an organization). How to deploy? Security? Updating? Customizing? Branding? Etc. These corporate users involve thousands of seats per company. And remember: convince one company, and potentially thousands of users are exposed to Mozilla. That means some will undoubtedly download it for home use.

An ISP targeted subdomain may not be a bad idea either.

While not technically end users, these customers will advertise for the Mozilla project, and all have reason to consider Mozilla. For example, licensing: Mozilla is free to distribute on all OSes. Great for ISPs. It can be customized, etc.

Food for thought.