Coin Tosses Not Totally Random

It’s generally assumed that a coin toss is “fair” because it’s considered random, as long as you don’t use a double-headed coin. Professional sports like football rely on it. But as previously known, or at least suspected, it’s not so random: research shows a tossed coin carries about a 1% bias, making the odds 51-49, hopefully in my favor. The researchers were even able to build a machine that flips a coin predictably. (A quick simulation after the list below shows what a 1% edge looks like.)

James Devlin at Coding The Wheel has a great writeup simplified for those who don’t have a head for all the math (pun intended):

  1. If the coin is tossed and caught, it has about a 51% chance of landing on the same face it was launched from. (If it starts out as heads, there’s a 51% chance it will end as heads.)
  2. If the coin is spun, rather than tossed, it can have a much-larger-than-50% chance of ending with the heavier side down. Spun coins can exhibit “huge bias” (some spun coins will fall tails-up 80% of the time).
  3. If the coin is tossed and allowed to clatter to the floor, this probably adds randomness.
  4. If the coin is tossed and allowed to clatter to the floor where it spins, as will sometimes happen, the above spinning bias probably comes into play.
  5. A coin will land on its edge around 1 in 6000 throws, creating a flipistic singularity.
  6. The same initial coin-flipping conditions produce the same coin flip result. That is, there’s a certain amount of determinism to the coin flip.
  7. A more robust coin toss (more revolutions) decreases the bias.
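
To get a feel for point 1, here’s a minimal simulation sketch in awk (in the same spirit as the shell one-liners elsewhere on this blog); the 0.51 probability is the figure from the research, and the 100,000-toss count is an arbitrary choice of mine:

$ awk 'BEGIN {
    srand()                                   # seed the RNG from the current time
    n = 100000                                # number of simulated tosses (arbitrary)
    for (i = 1; i <= n; i++)
        if (rand() < 0.51) same++             # 51% chance of landing on the launch face
    printf "same face: %d of %d (%.2f%%)\n", same, n, 100 * same / n
}'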

The writeup also covers some potential strategy; it’s a worthwhile read.

The paper itself is also available, if you’re so inclined, though you’d need to be a real math/stats nerd to want to read it.

Redefining Broadband

For years the FCC has considered any connection faster than 200kbps to be broadband, which for the past several years has been pretty misleading. In addition, they only collect downstream speeds, not upstream, and they consider an entire zip code to have broadband if even one home in it can get it. That’s not very accurate, and it makes the broadband situation in the US look better than it really is.

The definition of broadband in the US is now being raised to 768kbps. The FCC will now collect upstream data as well, and use census-tract data instead of zip codes. This is a major win, since it will more accurately show how many people really do have broadband, and more importantly how many do not.

I personally disagree on the number and think it should be at least 2Mbps, but it’s a win regardless.

The Pacific Rim annihilates the United States when it comes to broadband. According to Akamai’s State Of The Internet report for Q1 2008, “high broadband” (greater than 5Mbps) is where we really start to show our deficiencies. Here’s a look at broadband, which they define as simply greater than 2Mbps:

Rank  Country         % >2Mbps  Change vs. Q4 '07
-     Global          55%       -2.0%
1     South Korea     93%       -1.5%
2     Belgium         90%       +1.5%
3     Switzerland     89%       +0.5%
4     Hong Kong       87%       -1.5%
5     Japan           87%       +1.0%
6     Norway          83%       -2.3%
7     Tunisia         82%       +29%
8     Slovakia        81%       +0.5%
9     Netherlands     78%       -2.6%
10    Bahamas         74%       -3.0%
24    United States   62%       -2.8%

Pretty pathetic considering our last Vice President invented the Internet 😉. We are one of the largest countries in terms of square miles, but when you consider US population density, the bulk of our land is very sparsely populated. 80.8% of the US population lives in an urban setting [Warning: PDF].

US Population Density

Japan by comparison has 66.0% of its population in an urban setting. Belgium has a surprising 91.5%, which may account for its #2 position. Switzerland has only 44.4%, yet takes 3rd place, threatening Belgium’s position.

I’m far from the first one to complain about the poor state of broadband. BusinessWeek and CNet both have relatively good discussions about the topic.

The future of media is clearly moving online, as people demand to consume it on their own schedule. Take a look at some of the statistics and it’s clearly a large industry. I suspect the lack of broadband infrastructure will become a real problem in the next several years, as the rest of the world becomes very easy to distribute media to while the US still faces challenges.

A solution? Highly debatable, but if so many other countries can do something about it, I suspect it’s achievable here in the US as well. I suspect the tax revenue from companies that do business on the internet, from e-commerce to advertising, would make this a decent investment for the US government to at least partially back. The more places companies make money, the more places the government does. That backing may be necessary, since not all markets are profitable enough for telcos to bother with. There have been various attempts to jumpstart this effort, but none to date have been successful.

It’s not just about having access; it’s also about the cost. As BusinessWeek points out in the article above, broadband in the US is not cheap.

Perhaps wireless will finally allow for competition and lower prices; at least that’s what everyone is hoping for. The questions are whether it will happen, whether the technology will be there (wireless is generally high latency), and whether it will be affordable for the common man.

I suspect that in the next 4 years this will become an even bigger topic of discussion as some of the top-ranking countries start to reach the point of saturation.

Strange Population Statistics

Yesterday the estimated world population passed 6,666,666,666. Interestingly (though it’s just coincidence), the estimated number of available IPv4 addresses was supposed to pass 666,666,666 at around the same time. Perhaps we are the beast?

An interesting thing to note is that the population is increasing at a very rapid rate. How long that’s sustainable before a Malthusian catastrophe is subject to debate: some say the industrial age freed us from that pending disaster, others say it just bought a little more time. By about 2024 there are expected to be 8 billion people. IPv6 can’t come soon enough.
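
As a back-of-the-envelope check (a sketch: 2^32 is the size of the full IPv4 space, and 666,666,666 is the estimate cited above):

$ awk 'BEGIN {
    total = 2 ^ 32                            # IPv4 is a 32-bit space: 4,294,967,296 addresses
    left  = 666666666                         # estimated addresses still available (figure above)
    printf "%.0f total, %.1f%% remaining\n", total, 100 * left / total
}'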

[Hat Tip: Slashdot]

Shell Stats

Since it seems like everyone else on Planet Mozilla is doing it… My twist: multiple systems. Actually, I found it interesting to see the variation based on what I use them for.
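
The one-liner is the same for each system below; roughly, it works like this:

# history prints numbered entries ("  123  command args..."), so field $2 is the command name
# awk tallies each command name in array a, then prints "count command" pairs
# sort -rn sorts numerically, highest count first; head keeps the top 10
$ history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head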

Home

$ history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head
114 ls
91 cd
63 sh
41 ssh
37 sudo
14 pico
13 exit
12 ping
10 ./gl_tail
8 top

Home Server

$ history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head
68 ls
64 dig
55 cd
45 whois
27 ps
24 clear
23 sudo
14 pico
12 top
12 exit

Work

$ history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head
67 top
65 ssh
31 ls
29 sudo
26 dig
23 cd
20 ps
19 svn
17 php
8 ping

DEMOfall 07 Browser/OS Statistics

Stephen Wildstrom did a little survey of demo machines at DEMOfall 07: 81% Windows, 19% Mac. He says that’s growth, and I’m not shocked to hear it. He also did a survey of browsers and found that all the Macs use Firefox over Safari, and he makes a reference to its skin (an interesting observation considering the current discussion over reskinning the browser for 3.0). Firefox did decently on the Windows front as well. I’ve noticed this myself: people seem to prefer demoing their web-based product in Firefox. Is it out of habit? Or because some Ajax-based websites feel slower in IE7?

Google Web Authoring Statistics

Google has some great Web Authoring Statistics. Very cool stuff.

Mozilla CVS Statistics

I ran some statistics on my CVS tree after a basic pull to build Firefox. There are several things to note:

  • gerv cheats (relicensing). 😉
  • we didn’t pull /mail, so mscott and bienvenu are underestimated
  • several people commit patches on behalf of non-CVS contributors, so not all code is owned by the committer
  • removing whitespace counts as a line, as does reformatting, so that inflates counts
  • copying code counts as new code, so forks, etc. inflate counts
  • removes aren’t counted accurately
  • this is just for fun, it’s not scientific; don’t use it for anything important
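
For the curious, here’s a rough sketch of how per-committer counts can be pulled out of a CVS tree, in the same spirit as the shell one-liners above. It’s illustrative rather than the exact method I used; it tallies the author field from cvs log output:

# each revision entry in cvs log has a line like:
#   date: 2008/06/20 12:34:56;  author: gerv;  state: Exp;  lines: +10 -2
$ cvs log 2>/dev/null \
    | awk '/^date: .*author:/ { sub(/;/, "", $5); a[$5]++ }
           END { for (i in a) print a[i] " " i }' \
    | sort -rn | head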