May 10, 2004



Tipping Point Management

Phil wrote last week about the "tipping point" that could lead directly to the Singularity – the advent of "machines that are able to create machines more complex than themselves."

There are tipping points, like the Industrial Revolution, that have worked in our favor and others, like Global Warming, that we want to avoid.

I am of two minds about "Global Warming." Many of those who are most vocal about the dangers of global warming are the same people who distrust economic and technological development. Most futurists, myself included, feel that increased technological development is the best hope for improving the environment.

For that reason I look with suspicion upon efforts like the Kyoto treaty. The cutbacks in carbon dioxide emissions required by Kyoto could push us over an economic tipping point into a Depression. But there was never much danger that the United States would ratify Kyoto.

I see democracy and capitalism as great allies in the real fight for a cleaner environment. Among developed nations, the freest also are the cleanest. There are those who see little promise in continued economic and technological development. But I see accelerating development as our best hope for dealing with the problem of global warming.

Humanity needs a much better understanding of the global climate system before it can accurately predict the effects of global warming. Many of the models used for past predictions have failed to take into account the planet's ability to correct for increased greenhouse gases.

On the other hand, many of my fellow conservatives have acted as though the Earth's ability to absorb abuse is infinite. I've actually heard Rush Limbaugh say that we couldn't destroy the planet if we wanted to. Of course we could. Any complex system has a tipping point - a point beyond which it is unable to cope with additional stress. Our problem is that we don't know where that point is.
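
To make that concrete, here is a toy sketch in Python - not a model of any real system, and every number in it is made up - of what a tipping point looks like: a resource that regrows on its own while being harvested at a steady rate. Below a critical harvest rate the system absorbs the stress and settles at a stable level; push the rate just past that threshold and the same system collapses.

# A minimal illustration of a "tipping point" (purely made-up numbers):
# a population that regrows logistically while being harvested at a
# constant rate h.  Below a critical harvest rate the population settles
# at a stable level; just above it, the same system collapses.

def simulate(harvest_rate, x0=1.0, dt=0.01, steps=5000):
    """Euler-integrate dx/dt = x*(1 - x) - h and return the final state."""
    x = x0
    for _ in range(steps):
        x += dt * (x * (1.0 - x) - harvest_rate)
        if x <= 0.0:          # population gone -- past the tipping point
            return 0.0
    return x

# For dx/dt = x*(1 - x) - h the critical harvest rate is h = 0.25.
for h in (0.10, 0.20, 0.24, 0.26, 0.30):
    print(f"harvest rate {h:.2f} -> long-run population {simulate(h):.3f}")

The unsettling part is how similar the inputs look on either side of the threshold: a harvest rate of 0.24 settles, 0.26 crashes.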

The environmentalists would argue that since we don't know where the environmental tipping point is, we should err on the side of caution and adopt Kyoto in the hope that we are not already too late. This is their best argument. Certainly my greatest concern with Global Warming is that we stumble over a tipping point unaware.

But a Kyoto-induced economic depression would also delay real technological progress. We are so close to another tipping point, the Technological Singularity, that there is real hope that our emissions problems can be mostly solved within a generation.

As for knowing where the Global Warming tipping point is, the answer is to apply greater computation to the problem. Today the world's fastest supercomputer, known as the Earth Simulator, is devoted to world climate simulations.
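
Even a toy model shows what such a simulation is hunting for. The sketch below is nothing like the physics the Earth Simulator runs - it is a zero-dimensional energy-balance model with a crude ice-albedo feedback, and every parameter is invented for illustration - but sweeping the extra greenhouse forcing shows the kind of answer we need: the model warms smoothly for a while, then jumps several degrees once the forcing crosses the model's own tipping point.

# A toy zero-dimensional energy-balance climate model.  All parameter
# values are made up for illustration; the point is only that the
# location of a tipping point is something you compute, not guess.

SOLAR = 1361.0 / 4.0          # globally averaged incoming solar flux, W/m^2
SIGMA = 5.67e-8               # Stefan-Boltzmann constant
EMISSIVITY = 0.5931           # tuned so the model sits near 288 K with no forcing

def albedo(temp_k):
    """Crude ice/snow feedback: albedo drops as the planet warms past 289 K."""
    if temp_k <= 289.0:
        return 0.32
    if temp_k >= 294.0:
        return 0.25
    return 0.32 - 0.014 * (temp_k - 289.0)   # linear ramp in between

def equilibrium_temp(forcing, temp_k=288.0, dt=0.01, steps=2000):
    """Euler-integrate dT/dt = absorbed - emitted + forcing (unit heat
    capacity, arbitrary time units) until the temperature settles."""
    for _ in range(steps):
        absorbed = (1.0 - albedo(temp_k)) * SOLAR
        emitted = EMISSIVITY * SIGMA * temp_k ** 4
        temp_k += dt * (absorbed - emitted + forcing)
    return temp_k

# Sweep the extra greenhouse forcing and watch for the jump.
for forcing in range(0, 9):
    print(f"forcing {forcing:2d} W/m^2 -> equilibrium ~{equilibrium_temp(forcing):6.1f} K")

A real climate model runs the same kind of sweep with enormously more physics and resolution, which is exactly why the computation matters.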

The ultimate key to solving these complex problems is tipping point management. We should avoid signing Kyoto, but push for technological developments that would allow us to meet the Kyoto goals anyway. We should push for incentives for businesses to adopt cleaner practices. We should also learn as much as we can through climate simulations so that we can make a full cost/benefit analysis of emissions.

Posted by Stephen Gordon at May 10, 2004 04:39 PM
Comments

John Smart suggests (with reservations) that our universe is biased towards certain kinds of tipping points and against others.

This is a controversial topic, so I will mention it only briefly, but suffice it to say that after extensive research I have concluded that no biological or nuclear destructive technologies that we can presently access, either as individuals or as nations, could ever scale up to "species killer" levels. All of them are sharply limited in their destructive effect, either by our far more complex, varied, and overpowering immune systems, in the biological case, or by intrinsic physical limits—combinatorial explosion of complexity in designing multistage fission-fusion devices—in the nuclear weapons case. These destructive limits may exist for reasons of deep universal design. A universe that allowed impulsive hominids like us an intelligence-killing destructive power wouldn't propagate very far along the timeline.

This is a slightly more (pardon my French) "subtle and nuanced" statement than Rush made. Failure is not impossible, but the dice seem to be strangely loaded in favor of success.

Posted by: Phil at May 10, 2004 06:03 PM

I don't think it's so strange. If it were easy to destroy the universe, for example, then someone would probably have done it by now. So there's an anthropic principle at work.

But I don't see how John Smart can conclude that the universe is predisposed to preventing species destruction. First, our immune system is rather sophisticated, but we've already developed diseases that can overcome it. Similarly, I've heard estimates that a 10,000 megaton nuclear bomb surrounded by ample cobalt (one of the fabled "Doomsday devices") could effectively kill off most large animals (including humans) on Earth and poison the ecosystem for a while (radioactive cobalt has a half-life of about five years), though with proper preparation I think humanity could survive such an event. Both the US and Russia have sufficient nuclear explosive power to give this a try, but neither has shown any inclination to be this colossally stupid.
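
Just to put numbers on the half-life point, here is a quick back-of-the-envelope Python calculation. It assumes pure exponential decay of a single isotope (cobalt-60, half-life roughly 5.27 years) and ignores weathering, dispersal, other fission products, and everything else that matters in practice.

import math

# Back-of-the-envelope decay timetable for cobalt-60 fallout: pure
# exponential decay of a single isotope, half-life roughly 5.27 years.

HALF_LIFE_YEARS = 5.27

def fraction_remaining(years):
    """Fraction of the original cobalt-60 activity left after `years` years."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

def years_until(fraction):
    """Years until the activity falls to `fraction` of its initial value."""
    return HALF_LIFE_YEARS * math.log(fraction) / math.log(0.5)

for f in (0.5, 0.1, 0.01, 0.001):
    print(f"down to {f:>6.1%} of initial activity after ~{years_until(f):4.1f} years")

print(f"after one generation (25 years), ~{fraction_remaining(25):.1%} remains")

By that crude measure the activity is down to about one percent after thirty-five years or so, which is roughly what "poison the ecosystem for a while" amounts to.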

Claiming the existence of "deep universal design" seems a little premature in my opinion. We've steadily ramped up in explosive power to the fusion bomb. I think further development has been hampered less by complexity than by the impracticality of such a device. Both the US and Russia have enough destructive power to obliterate any combination of countries they might choose. These devices have great strategic value. There's no incentive to make a bigger bang.

Instead, the value has been in designing more effective weapons (e.g., "smart bombs") that deliver less force, but do so more accurately. In the various recent wars, the US has dominated through use of these sorts of systems rather than through its powerful nuclear weapons.

Posted by: Karl Hallowell at May 11, 2004 09:28 AM

I like the term "anthropic principle" because it has its own best criticism built right into the name.

That criticism goes something like this: Of course it looks like the universe favors intelligent life from the point of view of an intelligent life form that's around to think about it. All those forms of intelligent life that would think differently are not around to voice an opinion. They didn't make it.

That said, it certainly seems that this universe favors the arrival of intelligent life in some form - if not specifically human intelligent life. The reasons why are worthy of another post (or a book or two).

Posted by: Stephen Gordon at May 11, 2004 10:32 AM

I'm ready to start my own political party but I'm not sure what to name it. The Good Old Common Sense Party with a Dash of Bright Ideas, maybe.

I just hope the Universe isn't listening to the news lately and regretting favoring our arrival.

Posted by: Kathy at May 11, 2004 03:24 PM
