Chris Phoenix on tipping points:
...[U]p to a certain point, we won't see the power of this technology, but the advantages will develop rapidly once that point is reached. This is not how humans expect things to work -- we expect a linear progression. But a tipping point is nonlinear. If we don't recognize it in advance and plan for it, it will take us by surprise.
Chris points out that we have experienced two such tipping points in the history of technological development: the first occurred when precision became a function of machinery (rather than human skill); the second occurred when the same thing happened with reliability. The next tipping point is yet to come, and it has to do with complexity. When machines can routinely create machines more complex than themselves, we're there.
Chris claims that this tipping point is at least as significant as the other two. I think it may turn out to be more significant than the two that preceded it, if only because it builds on them and helps to further exploit the advantages brought about by the other two. Moreover, it occurs to me that the ability of machines to produce ever more complex machines is an essential ingredient in the coming singularity.
By all means, read the whole thing.
Posted by Phil at May 2, 2004 10:16 AM

Yep, once machines can build more complex machines, the whole thing just snowballs. The only question is whether that effect will wipe us off the planet or whether we can coexist with what is essentially an alien intelligence. I think we just have to instill in them a sense of respect toward the beings that gave them life.
Posted by: ChefQuix at May 2, 2004 11:02 PM

ChefQuix:
Ray Kurzweil would give a third possibility: that "they" will be "us." That we will merge with our technology. In fact, he predicts that if we fail to merge with our technology we won't be able to co-exist.
Posted by: Stephen Gordon at May 3, 2004 05:09 AM

Precisely. In fact, it's the only way that we can possibly coexist with AI (in whatever form it takes). We must become one with the machine, because if we aren't a critical part of their being, there would be nothing to stop them from taking over the world. Evolution dictates that the strong survive in a scarce environment; just as we constantly battle viruses and disease, we too would battle computers. If machines are given the tools to be as flexible and adaptable as us, but without an appreciation of or a need for our existence, there would be no way to stop them, and rightly so: if one obeys the rules of evolution, then the strongest must take over. It'd be a shame to lose all that we've gained, but it would be necessary.
On the other hand, an integrated being would be an incredible evolutionary leap. Independent beings connected instantaneously to each other, with the precise, unfailing memetic recall of computers: this would give us enormous advantages. I can imagine that eventually some natural disaster will befall us here on Earth, and only with the aid of AI and robots would we manage to survive. Maybe I'm an optimistic pessimist.
Posted by: ChefQuix at May 4, 2004 12:14 AM

Precisely. In fact, it's the only way that we can possibly coexist with AI (in whatever form it takes).
Actually, another way is that we become parasites that the AI must support.
But what's in it for them? As they have the power of computers behind them, what's to stop them from removing us like salt on a leech if we become parasitic?
Posted by: ChefQuix at May 4, 2004 11:06 PM

But what's in it for them? As they have the power of computers behind them, what's to stop them from removing us like salt on a leech if we become parasitic?
Actually, there are a number of ways this could be done. They could be programmed so that they can't unparasitize us, or maybe serving humans feels really, really good. They could be subject to a sort of dead man's switch: if we go, then they go as well. Or maybe it's just too much work and bother to get rid of us.
Posted by: Karl Hallowell at May 9, 2004 06:41 PM