Friday, April 18, 2008

Heavy Thinking for a Friday Night

I don't write science fiction, partly because I've read so much of it that I know how much I do not know. Good science fiction can be predictive. One of my favorite authors in this genre is David Brin. He waxes political and scientific on his blog, Contrary Brin. Take this, for example:
I am provoked by two recent articles. The first describes the approaching completion, after 14 years and $8 billion, of CERN’s Large Hadron Collider, near Geneva, where high energy particle collisions should reveal much more about the underpinnings of natural law. And where - a few worry - there might be accidentally created some kind of planet-devouring monster. A “strangelet” or possibly a microscopic black hole that, contrary to present-day theory, might not dissipate, but instead absorb neighboring atoms and keep growing, voraciously, eventually gobbling up our world. Despite having written a best-seller around this notion a couple of decades ago, I have never credited the scenario with high likelihood. Still, it falls under the increasingly important field of Risk Analysis, dealing with unlikely but high-stakes threats. It seems our destiny, in this new century, to deal with many of these. To learn more, drop by the Lifeboat Foundation and see just how calm and rational - and even fun - it can be to wallow with the worrywarts. (I have my own low probability/high consequence cause.)

Indeed, I consider it premature to pick any one fear, or hope, to zero-in upon, monomaniacally. There are just too many possibilities, both up and down.

With so many ways for things to turn, it seems prudent to concentrate on choosing leaders who demonstrate some flexibility of mind, and who do not disparage intelligence as the very opposite of wisdom. Because, at the fringes of this topic, there lurks the Fermi Paradox -- the notion that the universe ought to be teeming with evidence for advanced interstellar civilizations, by this late date. And yet, there is no credible proof that we aren’t alone. Did some of those “high consequence” mistakes bring most of our predecessors down? Might it have been one particular error, so alluring that every species tries it, whenever they reach a certain level of development... just before they would have built starships?


I know. It's Friday. You want to relax, but think about that . . . we could be alone because other societies have made a critical error.
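For the numerically inclined, here's a back-of-the-envelope way to see why that idea has teeth. It's a toy Drake-style calculation in Python; every number in it is a placeholder I made up for illustration, and the filter_survival knob (the chance a civilization survives Brin's "one particular error") is my own framing, not his:

```python
# Toy Drake-style estimate. Every constant below is an illustrative
# assumption, not a measurement.

STARS_IN_GALAXY = 2e11        # rough star count for the Milky Way
FRAC_WITH_PLANETS = 0.5       # fraction of stars with planets (assumed)
HABITABLE_PER_SYSTEM = 0.1    # habitable worlds per planetary system (assumed)
FRAC_LIFE = 0.1               # habitable worlds where life arises (assumed)
FRAC_INTELLIGENT = 0.01       # life that becomes technological (assumed)

def expected_starfarers(filter_survival: float) -> float:
    """Expected number of civilizations that survive the one alluring
    error made just before they would have built starships."""
    return (STARS_IN_GALAXY
            * FRAC_WITH_PLANETS
            * HABITABLE_PER_SYSTEM
            * FRAC_LIFE
            * FRAC_INTELLIGENT
            * filter_survival)

for p in (1.0, 1e-3, 1e-6, 1e-9):
    print(f"survival probability {p:.0e} -> "
          f"{expected_starfarers(p):,.3f} expected starfaring civilizations")
```

With no filter at all, those (made-up) numbers give ten million starfaring civilizations in our galaxy alone; let the per-species survival probability drop to one in a billion and the expectation falls to 0.01 -- an empty sky. The arithmetic proves nothing, but it shows how a single sufficiently seductive mistake could make the great silence the expected outcome rather than a paradox.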