Thursday, June 23, 2011

Weird Wednesday - Lifeboat Foundation


At first, I thought the Lifeboat Foundation was a made-up thing. Maybe some kind of J.J. Abrams-esque fictional website along the lines of the Hanso Foundation. But nope, it appears to be the real deal.

The purpose of the Lifeboat Foundation is to protect humanity from, well, just about everything. From nano-plagues to asteroidal extinction, they have a plan to save the day. Heck, they even have backups for backups.

Ark-I, a self-sustaining space colony, is their "fallback position in case programs such as our BioShield and NanoShield fail." That's some serious safety! The foundation is a treasure trove of sometimes weird, but always fascinating, emerging technologies such as smart dust, space guns, and space elevators. I also like how they apply Murphy's Law to artificial intelligence. You just know that can't end well.

And I love how they connect SETI with the reason why you really need to have somebody keeping an eye on you. Check out this quote on their Security Preserver page:
"There's a long-standing problem in astronomy called the Fermi Paradox, named for physicist Enrico Fermi who first proposed it in 1950. If the universe should be teeming with life, asked Fermi, then where are all the aliens? The question is even more vexing today: SETI, the search for extraterrestrial intelligence with radio telescopes, has utterly failed to turn up any sign of alien life forms. Why?

One chillingly likely possibility is that, as the ability to wreak damage on a grand scale becomes more readily available to individuals, soon enough just one malcontent, or one lunatic, will be able to destroy an entire world. Perhaps countless alien civilizations have already been wiped out by single terrorists who'd been left alone to work unmonitored in their private laboratories."
So the reason we are alone in the Universe is because ET mad scientists always blow everything up before they can say "Hi." If that isn't grist for the pulp mill, I don't know what is!

The quote is actually from science fiction writer Robert J. Sawyer; the Lifeboat Foundation uses it to illustrate its point that there needs to be a debate about the role of surveillance in the future.

One very interesting (heck, they are all interesting) LF program is the AI Shield. One of the greatest threats we may face is the age-old problem of Unintended Consequences:
"Consider the simple case of an AGI that has been given the uncontroversial goal of eradicating malaria. A reasonable human expectation would be that such an AI would complete its goal by conventional means: perhaps by developing a new anti-malarial drug, or by initiating a program of mosquito control. The problem is that there are many other ways of eradicating malaria, some of which are undesirable. For example, an AGI might choose to eradicate malaria by eradicating all mammals."
Thankfully, the Lifeboat Foundation is on the job to avoid just those kinds of situations. That may be bad news for HAL, Colossus, and Skynet, but good news for us. Umm, assuming you aren't an artificially intelligent silicon-based life form. And if you are, please like my Facebook page.

4 comments:

  1. Wow, that's fascinating. I have to go look them up.

  2. Yeah, it's a trip. They really have a lot of stuff going on...

  3. I was mildly intrigued by this, mostly trying to decide how to store the information away for a later date, until I hit the last line. That won me over! May I like your Facebook page if I am /not/ an artificially intelligent silicon-based life form?
