Categories
Space Tech (General)

Space Shuttle’s Y2K-like Problem

Here’s a curious tidbit from someone on reddit.com who identifies themselves as a Johnson Space Center employee:

The Shuttle suffers from its own Y2K problem. The system computers run clocks that are set for GMT days: I think today is GMT 49. Anyway, when it gets to December 31, it’s GMT day 365. When it moves to January 1, it goes to GMT 001. This screws up the flight computer. I don’t believe there has ever been a Shuttle flight over a new year. A software fix is possible, but it has never been worth the millions of dollars necessary to fix it.

This actually seems very believable. For a little background, the Space Shuttle originally flew a set of 5 IBM AP-101s. In 1991 they upgraded to the AP-101S, which has about 1 MB of semiconductor memory (as opposed to the core memory of the AP-101) and 3X the CPU speed. 4 run in sync, and 1 runs a separate set of software written independently for the ultimate in redundancy. They sit in two separate places in the orbiter and are quite rugged, and power-hungry at 550 W, which is substantial considering the processing power. They mainly handle number crunching for the orbiter’s thrusters and run through things like the launch sequence; they just need to be reliable. They are programmed in HAL/S. The original memory limitations are likely why the software uses GMT day-of-year dates, and the reason to avoid upgrading it is the complexity of the environment.
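
To make the quoted problem concrete, here’s a minimal sketch of the failure mode (plain Python for illustration, not the actual HAL/S flight code), showing how a timer based only on GMT day-of-year breaks when the counter wraps from 365 back to 001:

# Hypothetical illustration only, not Shuttle flight software: compute
# mission elapsed time from GMT day-of-year values alone.
def elapsed_days(launch_doy, current_doy):
    """Naive elapsed time between two GMT day-of-year values."""
    return current_doy - launch_doy

print(elapsed_days(49, 52))   # mid-year: 3 days, as expected
print(elapsed_days(364, 1))   # across New Year: -363 instead of 2
# Any sequencing or guidance logic that assumes time only moves forward
# suddenly sees the clock jump backwards at GMT 001.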

A software change would likely fix this issue, but changing something that needs to be this thoroughly tested would be insanely expensive.

Categories
Internet

The Ultimate Human Supercomputer

For most of my life there’s been a list known as the TOP500, a ranking of the fastest computers in the world. The fastest today is the K computer, built by Fujitsu. We refer to them as “computers”, but in reality they are large rooms filled with servers that operate in coordination with one another to solve a problem: individual nodes that work together.

I think this ranking is becoming increasingly irrelevant. Peak TFLOPS isn’t necessarily what matters anymore in terms of solving problems. I propose that the internet itself is the fastest and most powerful computer in terms of its ability to solve complicated problems. We haven’t figured out how to fully utilize it; we don’t even know its capacity or how to optimize it yet. But we already know it has amazing potential.

A photographer during the Vancouver riots (losing in hockey is a real first-world problem, isn’t it?) captured an anonymous kissing couple. In less than 24 hours they were identified. Someone requested help identifying the source of some amazing Nazi-era photographs on a Tuesday morning; by Tuesday evening the photographer and the backstory were coming into focus. There are no algorithms for these problems.

Things like tagging photos, status updates, tweets, and wikis, and the ability to index, search, and sort them in near real-time, are in their infancy. We’ve barely got the technology to handle data at this magnitude, much less optimize it and realize its full potential. Already it can solve things that you can’t brute-force with a supercomputer. At some point we’ll be able to put a question to this “machine” and it will know who would most likely know the answer and who shouldn’t be bothered with even seeing the question. This isn’t artificial intelligence; this is computer-network-assisted human intelligence collaboration.

The internet likely won’t compute π to a trillion places anytime soon, if ever; we’ll leave that dull task to a “simplistic” supercomputer. But it’s important not to underestimate the power of the collective human brain. It can already solve very obscure and complicated questions with ease, and it can be used without a degree in computer science. It’s only 2011. Just 10 years ago we couldn’t have solved the above questions nearly as quickly, or perhaps at all if the person(s) with an answer weren’t online. Imagine what 2021 will bring.

Categories
Funny

Hot Dog Bun Math

It’s the 4th of July weekend here in the US. Today’s BBQ1 got me thinking about that conspiracy theory from Father of the Bride regarding the mismatch between hot dog quantities and hot dog bun quantities. Steve Martin’s character goes nuts over the discrepancy.

This is really an exercise in Least Common Multiples that no teacher seems to exploit (at least none that I ever had).

Research tells me that the reality of this joke is a little more complicated than 8 hot dogs and 12 buns. It may even vary based on location. From what I can tell the most common hot dog packages are 8 and 10, while the most common bun packages are 10 and 12. That means a 10/10 purchase is a win in terms of efficiency. I suspect there are more combinations, but 8/10 and 10/12 seem to be the most common. Here’s a table of the possibilities:

Hot Dog Qty. | Hot Dog Bun Qty. | Least Common Multiple | Least Hot Dog Packages | Least Hot Dog Bun Packages
8            | 10               | 40                    | 5                      | 4
8            | 12               | 24                    | 3                      | 2
10           | 10               | 10                    | 1                      | 1
10           | 12               | 60                    | 6                      | 5
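
For the curious, the table falls out of a few lines of arithmetic: the least common multiple of the two package sizes is the smallest purchase that leaves no leftovers, and dividing it by each package size gives the package counts. A quick sketch in Python (packages_needed is just a name I made up for illustration):

from math import gcd

def packages_needed(dogs_per_pack, buns_per_pack):
    """Smallest no-leftover purchase for the given package sizes."""
    lcm = dogs_per_pack * buns_per_pack // gcd(dogs_per_pack, buns_per_pack)
    return lcm, lcm // dogs_per_pack, lcm // buns_per_pack

for dogs, buns in [(8, 10), (8, 12), (10, 10), (10, 12)]:
    lcm, dog_packs, bun_packs = packages_needed(dogs, buns)
    print(f"{dogs}/{buns}: LCM {lcm}, {dog_packs} hot dog packages, {bun_packs} bun packages")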

This leads me to a question: who profits more from this? To figure that out, we’d need to know people’s buying habits and the costs involved in producing, packaging, and shipping these goods. I don’t have that on hand, but I can draw a pretty graph of how many packages of each you’d need to buy to not waste food:

So it looks like we’ll be eating hot dogs in sandwich bread and making tiny sandwiches out of leftover hot dog buns for years to come.

1. Technically you grill hot dogs (hot and fast), not BBQ them (low and slow), but American etymology is funny.