January 31, 2004

Nanotechnology vs. Cancer

Forbes reports that the National Institutes of Health have published a set of future-looking scenarios called the Roadmap for Medical Research. Nanotechnology figures heavily in the NIH's plans, particularly where the treatment of cancer is concerned:

[N]owhere is the use of nanotech in medical advances more critical than at the National Cancer Institute (NCI), which sees the potential for nanoscience to dramatically enhance our ability to effectively detect cancer, deliver targeted therapeutics and monitor the effectiveness of cancer interventions.

Read the whole article to get a better picture of the kinds of developments they're talking about. This section, in particular, got my attention:

Another entirely new platform for cancer therapy is being developed by James Baker, the director of the Center For Biologic Nanotechnology at the University of Michigan. His is based on dendrimers, molecules shaped like spheres and made up of nanoscale polymers in a very specific pattern, sometimes resembling a complex snowflake.

Baker has functionalized these dendrimers to create smart-therapeutic nanodevices that will be used to treat disease. One type seeks out and recognizes only cancer cells. Another type can diagnose what type of cancer it is, while a third type of dendrimer is able to deliver drugs to destroy it. A fourth type can report the location of the tumor to a doctor (a labeling molecule for X-ray or MRI), and yet another can confirm that a cancer cell has been killed. Typically each one of these processes is lengthy, expensive and indiscriminate towards healthy cells. Integrating them into one larger molecule creates a nanodevice able to perform them all at once while leaving healthy cells unscathed.

Think of it: a treatment for cancer that doesn't make you vomit or cause your hair to fall out. And that might actually work.

As I noted a while back, this would be what Glenn Reynolds would classify as a "major" (rather than a "spooky") development in the nanotechnology space. These are the kinds of breakthroughs that are going to lead us to true nanotechnology. (Including the spooky stuff.)

via KurzweilAI.net

Posted by Phil at 08:15 AM | Comments (0) | TrackBack

January 30, 2004

"Intellectual" Property

Here are a couple of headlines that I found dangerously close to each other:

Google Slaps Booble

Search engine giant Google has demanded that newly launched adult search site Booble take down its Web site, a Booble spokesman said Thursday.

Lindows Lose To Windows In Dutch Courts

Companies that resell Lindows, the Linux operating system, in the Netherlands have eight days to stop, since Windows successfully won a ruling in an Amsterdam court. The alternative system is said to be "profiting from the success of Windows".

It would be easy to dismiss one of these cases as trivial, but they both raise serious issues. Google claims that "Booble" is cashing in on its name and look and feel in order to peddle smut. Booble claims that its site is a parody. Google counters that Booble is a competitor, another search engine. So the question is: can company A go into the same business as company B while parodying the look and feel of company B?

My gut answer is "why not?" Nobody would ever confuse Google with Booble. So I don't see how Google is losing anything in the deal. If it's unfair that Booble is cashing in on a market that Google created while using a similar name, then why can't Yahoo! sue Google on the same grounds?

I'm not so sure about Windows and Lindows. Maybe in Europe, there really is some chance that the two could be confused. It's also unclear from the article how much Lindows operates like Windows, or whether the Dutch have any other option for getting Linux.

If I were a suspicious person, I would wonder whether Microsoft is less concerned about protecting their intellectual assets, and more concerned about finding novel ways to destroy competitors without having to beat them in the marketplace.

Posted by Phil at 12:57 PM | Comments (0) | TrackBack

Fountain of Youth for Cells?

Researchers in California have discovered a synthetic molecule that may be able to turn adult cells back into stem cells (or at least a convenient stem-cell substitute).

Sheng Ding and colleagues of the Scripps Research Institute in La Jolla discovered the molecule, which they named reversine. When they treated mouse muscle-forming cells with the drug, the cells apparently reverted to a 'blank' state capable of forming other kinds of tissues. The researchers were then able to guide the cells into becoming bone or fat cells instead.

These are preliminary findings, so obviously there are many, many questions yet to be answered. But I hope they're on to something. We need stem cells without all the ethical and legal problems.

via KurzweilAI.net

Posted by Phil at 11:31 AM | Comments (1) | TrackBack

January 29, 2004

Reading Matters

If you're looking for some good blog-based children's fiction, may I recommend this story by posse member Joanie? Even if you weren't looking for some good blog-based children's fiction, take the time to read it. It's wonderful.

Then again, if you're looking for some good blog-based fiction that probably isn't appropriate for children (little tykes, anyway) but that has children in it, might I recommend Stillness? This would be a great week to start reading our serialized novel about the end of the world.

Posted by Phil at 09:47 AM | TrackBack

Badges of Honor

A while back, a commenter tried to put me in my place by accusing me of being “sophomoric.” Unfortunately (for her), because of a certain sentimental association I have with the word, I can never be insulted via that particular term. On the contrary, I wear it as a badge of honor.

Now Glenn Reynolds reports that Mark Modzelewski of the Nano Business Alliance has had it with Glenn and with all of us “bloggers, Drexlerians, pseudo-pundits, panderers and other denizens of their mom’s basements.”

Well sticks and stones, Mr. Modzelewski.

Actually, that’s not such a bad list. Let’s examine the items one by one:



Bloggers

Yep, I’m one. I spend (sometimes) hours a day blogging. Quite proud of it, to tell you the truth. So thanks for noticing.



Drexlerians

Okay, you got me there. I’m a Drexlerian. I’m also a Copernican,* if you get my drift.



Pseudo-Pundits

Remember George Carlin (back when he was funny) and the “semi-boneless” ham?

“Is there a bone? There is a bone. And it’s a bone. It ain’t no semi-bone.”

Can you really have a pseudo-pundit? Isn’t some notion of “pseudo” or “quasi” built into the term? (Clearly, I’m not thinking of pundits of the Vodka, Daily, or Insta variety.) Or maybe Modzelewski is trying to describe a specialty. Whereas FuturePundit shares information and views on the future, a pseudo-pundit would share views and information that are highly specious, if not totally bogus.

I don’t know. That doesn’t sound too much like a blogger. It sounds more like a…hmmm…that’s a tough one. Oh, wait. I know.

It sounds like a “political damage control specialist.”



Panderers

Yes, I suppose I am a ponderer. I do like to ponder the unknown. That’s what being a Speculist is all about, really, pondering what might be, what might not be — how’s that?

Oh, pander.

I never pander, except maybe to gorgeous women or people who I think might give me money. (I did win a suck-up prize once, though.)


Denizens of Their Mom’s Basements

Look, I may live here, but I’m hardly a denizen. (Also, I think in fairness I should point out that it’s as much Dad’s basement as it is Mom’s.)

Modzelewski seems to be implying that people like us lack social development. But the truth is, I have the gang over to play Dungeons and Dragons every Saturday afternoon. And it’s not like I never go out. Heck, I went to see Return of the King four times last week.

It’s a great feeling: getting up every morning, sitting down at the computer still wearing my bathrobe, my Vulcan ears freshly affixed. But I do sometimes have a vague sense of discontent. If only I could be more…hip. More suave.

More clever.

Better looking.

Damn. If only I could be more like that Modzelewski guy.


* That is, a follower of Nicolaus Copernicus: Polish astronomer, 1473-1543. He was a proponent of a lot of crazy ideas, most notably the notion that the Earth orbits the Sun. Sadly, history provides scant mention of how much time he spent in his mom’s basement.

Posted by Phil at 08:57 AM | Comments (3) | TrackBack

Save the Hubble

I hope this movement gains momentum:

Astronomers were stunned when Nasa's chief, Sean O'Keefe, decided on 16 January to cancel the fifth, and final, visit of the space shuttle to service the Hubble Space Telescope (HST).

A service call is essential to ensure Hubble's smooth operation until the end of the decade.

The telescope has only three working gyroscopes, down from its complement of six, and cannot afford to lose any more.

O'Keefe decided that in the wake of the Columbia disaster it was unwise to send astronauts on a shuttle mission that could not reach the safety of the International Space Station in the event of a problem.

I find it unacceptable that a beautiful and elegant piece of technology like the Hubble should have its fate tied to that of the albatross space shuttle. The shuttle needs to go, the Hubble needs to stay, and the friends of the Hubble need to get past this kind of zero-sum-game thinking:

"Let the voters say: 'We don't want to go to the moon! We want to go to infinity and beyond!'," said [save the Hubble organizer] Mr Ribeiro.

I see no reason why we can't pursue both. Saving the Hubble would be a great project for the nascent private space industry. It might be a good way to move the robotics industry along, as well. How much would it cost to send a private, fully automated, unmanned service unit up to work on the Hubble? My guess is that it would cost a lot less than a shuttle launch. NASA (or realistically, someone else, since NASA isn't likely to do anything like this) should put up a reasonable amount of money and let the free market save the Hubble.

UPDATE: Speaking of the Shuttle (and reasons it has to go) Rand Simberg makes note of the interesting convergence of the anniversaries of the Apollo 1, Challenger, and Columbia tragedies.

Posted by Phil at 06:10 AM | Comments (10) | TrackBack

No Big Deal...

...just a new form of matter, if you happen to be interested in that sort of thing.

Q: How many forms of matter are there?

A: A couple more than I realized:

The new matter form is called a fermionic condensate and it is the sixth known form of matter -- after gases, solids, liquids, plasma and a Bose-Einstein condensate, created only in 1995.

Jin and her colleagues' cloud of supercooled potassium atoms is one step closer to an everyday, usable superconductor -- a material that conducts electricity without losing any of its energy.

"If you had a superconductor you could transmit electricity with no losses," Jin said. "Right now something like 10 percent of all electricity we produce in the United States is lost. It heats up wires. It doesn't do anybody any good."

Or superconductors could allow for the invention of magnetically levitated trains, she added. Free of friction they could glide along at high speeds using a fraction of the energy trains now use.

The first of my Seven Questions About the Future goes like this:

1. The present is the future relative to the past. What's the best thing about living here in the future?

And the answer is: stuff like this.

via KurzweilAI.net

Posted by Phil at 05:49 AM | Comments (2) | TrackBack

January 28, 2004

Cool Site

Check out the Incipient Posthuman.

I like the name. Up till now, I was thinking of myself as an underachieving posthuman. This is much better.

I like this essay, Being Dead Sucks. Man I wish I'd said that.

I can add one to the list, too. If you're dead, you can't blog.

Posted by Phil at 09:26 PM | Comments (1) | TrackBack

Mac Turns 20

Has it really been 20 years?

I've been a Mac fan from the beginning. As anyone who has read my bio knows, my first really real job was with a magazine related to all things Mac. But even before then, I spent many a long afternoon playing Wizardry and Radical Castle with my friend Mike (the guy over there in the sidebar who fantasizes about hitting robots upside the head with a baseball bat) on his Macintosh 512e (the e stood for enhanced) which was the top of the line at the time. The graphics were astounding; there had never been anything like it before. It's amazing to think about the hours we spent staring into that tiny little screen. But we were a lot younger then. And unemployed.

Oh, I mean students.

The Mac Plus — sporting an entire megabyte of RAM — came along a while later. Then came the SE and the Mac II, followed by the Newton, the PowerBook, some kinda questionable stuff, some kind of crappy stuff, yada yada yada, the triumphant return of Steve Jobs, the iMac, and on into the future. I bought the SpecuSpouse a sweet desktop OS X job for her last birthday. It's the first time I've had a Mac in the house in about ten years, and it feels pretty good. Even if it isn't mine.

The last Mac I used regularly was a PowerPC, and I mostly hated it. But that might have been due to the fact that I had to run Windows on it, and that was a long, painful transition for me. For me — as for a lot of folks, I suppose — Macintosh was much more than just a computer. It was a philosophy, a way of life. A creed. It was the computer for the rest of us, not those losers. Us. We were going to make sure 1984 wouldn't be like 1984.

I remember reading about John Sculley's interview with Steve Jobs before being hired on as CEO. Jobs told Sculley (who was at the time the head of Pepsi Cola) that he could spend the rest of his life selling sugar water, or he could come work for Apple and change the world.

I don't know if we Mac users — or Mac alumni like myself — ever got around to changing the world. I suppose we did, or at least the Mac did. These days it's hard to imagine working on a personal computer without a GUI and a mouse. But we were the ones who got there first, at least on a big scale. I got a handwritten note from Bill Gates in 1987 chastising me for not giving Excel its due in my write-up in the magazine. (Like Word, Excel appeared on the Mac long before it was even possible for it to run on a PC.)

Did I bother to hang on to that note? Of course not. It was of no particular significance, one of the Rest of Us setting another one of the Rest of Us straight. Bill Gates wasn't all that famous at the time, although he was greatly admired by the "Change the World" crowd. After all, he was the little guy, the guy who would dare to take on WordPerfect and Lotus 123 with his little graphical software programs.

It was a magical time. Anything seemed possible.

Posted by Phil at 12:01 AM | Comments (1) | TrackBack

January 27, 2004

Last-Ditch Effort

This seems like a long shot, but I hope it works. Since all else has failed, mission control for the ill-fated British Beagle 2 Mars lander is going to do what we all do when the system won't respond.

They're going to attempt a re-boot.

Posted by Phil at 07:28 AM | Comments (0) | TrackBack

Aubrey de Grey Update

FuturePundit reports on a new interview (currently running on Better Humans) with Cambridge University geneticist and life extension visionary Aubrey de Grey. Aubrey offers some reasons why progress in this area isn't happening as fast as some of us would like, but he also adds this hopeful note:

I think there will be only a short interval between the time when we first have genuine life extension treatments and the time when we're improving those treatments faster than we're aging[.]

Randall Parker comments:

There are enough multimillionaire and billionaire philanthropists that all the work could be done with private money if only enough wealthy people became interested. If you know any wealthy people then do us all a favor and send them Aubrey's interview and some of the articles from his web site.

I don't actually know any wealthy people, but I'm starting to think that (if I did) it might make more sense to try to persuade them to fund Aubrey's work than it would this project (worthy though it is).

By the way, speaking of Aubrey, he was the subject of our first-ever Speaking of the Future interview.

Posted by Phil at 07:17 AM | Comments (0) | TrackBack

Encyclopedia Galactica

Anyone who has read the Foundation series by Isaac Asimov knows that the story involves a secret society — actually, a couple or three of them — dedicated to keeping galactic history on course through the predictive power of "psychohistory." Now I'm not saying there is such a group (although one would do well to keep one's eye on this bunch), but I've often wondered whether a certain resource that I use all the time might not be a stubbed-out, mock-up, draft-one version of the first deliverable that the Foundation produced: The Encyclopedia Galactica.

The Encyclopedia Galactica was Asimov's vision of a single access point for all human knowledge. Well, maybe Wikipedia isn't quite there yet, but as Dan Gillmor reports, it's about to publish its 200,000th entry.

That's a start, folks. That's definitely a start.

via GeekPress

Posted by Phil at 07:04 AM | Comments (0) | TrackBack

January 26, 2004

War of the Worlds II

The Earthlings Strike Back!

Da Goddess points the way, while Citizen Smash has all the glorious details.

BTW, I love that accompanying artwork, Joanie:

Oh, dear. Now you have dis-integrated me. I do so hate being dis-integrated. You have made me very angry. Very angry indeed.

Kudos to Smash for his artwork as well. It seems to me that I've seen that photo somewhere before....

Posted by Phil at 08:15 AM | Comments (0) | TrackBack

January 24, 2004


Reader Kathy Hanson writes as follows:

This subject is pertinent to my new project, Apocalypse Garden. I'd like some help from the Posse. I'm at the stage where the characters are writing the story but I'm not convinced that I'm ready to tell it.

My lead characters are telepathic, but only with each other. It kind of leads to a love/hate relationship. Since everyone knows that brain waves don't work like radio waves, the characters want to find out what changed their brains. Could nanotechnology several generations earlier have manipulated their ancestors' genes creating a deliberate mutation that modified their brain waves?

Well, first off, Kathy — welcome to the FastForward Posse. *

I think it would be fun to have a contest to see who can come up with the best reason why Kathy's characters are telepathic. We'll call it the Help Kathy Figure Out Why Her Characters Are Telepathic contest. (I studied English as an undergrad; that's why I'm so good at naming things.)

Please provide your theories as to how the characters became telepathic in the Comments section. All Posse members are requested to propose at least one theory. Everyone else is welcome to submit theories, too. (Of course, anyone who does so runs the risk of being drafted into the Posse.)

The winner will be selected by Kathy Hanson, and will receive the honor of knowing that her or his theory is the one that made it into Kathy's story. Also, there will be a grand prize of $1 million if I can get anyone to donate it.

So what are you waiting for? Let's see those crackpo — um, I mean creative explanations!

* It's in the by-laws, Section 16, Paragraph 9, Item 3:

"Any non-Posse member who calls upon the Posse for help with a creative project which is, in the view of The Speculist, El Jefe Grande, any of the Posse Ringleaders, or any of the Posse Regulars sufficiently interesting to warrant the attention of the FastForward Posse in toto is declared, by virtue of presenting an item of sufficient interest, ipso facto a Regular member of the Posse in full standing."

Sorry, those are the rules. I don't make them.

(Oh, wait. Yes, I do. I'm always getting mixed up on that point.)

Posted by Phil at 01:38 PM | Comments (5) | TrackBack

January 23, 2004

Why Am I Not Surprised?

So the state of California has this little project where they claim to be preparing for the future (good idea). And so they decided to write up a report on Nanoscience and Nanotechnology (good idea). But then when they finished their report, it turns out that it doesn't actually say anything about nanotechnology, outside of a lone sci-fi scare scenario.

Not really unexpected, but depressing anyhow.

via Howard Lovy

Posted by Phil at 05:16 PM | Comments (0) | TrackBack

Mind Writing

Via GeekPress, here's an interesting article on the future of mind reading technology. All the scenarios are interesting, but I was most intrigued by the last one:

Nick’s estranged wife, Helen, stands with their son, Troy, at Nick’s bedside. Helen and Nick have had a volatile marriage, plagued by Nick’s alcoholism and occasional violent outbursts. They’ve lived apart for the past four years, but he’s dying and she’s returned to his side. (Scans have shown that Helen’s brain is unusually developed in an area linked to loyalty.) She is relieved that Troy has not inherited his dad’s genes for addictive tendencies, especially since it was shown in 2025 that susceptibility to nicotine addiction was not a discrete gene after all, but stemmed from a host of genetic and environmental factors.

“Dad sure looks peaceful, Mom,” says Troy. “I know it was hard, but you did the right thing with the pain-erase memory implant.”

Helen sighs. “You were right. No time for ancient history now. I saw my own father die, and he was so debilitated by his regrets and guilt. This is much better.”

“It’s the humane thing.”

Nick stirs in the bed. His eyes flutter open. “Helen,” he whispers, “we’ve had a wonderful life, haven’t we?”


“We were luckier than most people.”


“I just hope our son can look back someday and feel at least as much pride and satisfaction as I do right now.”

Troy steps forward and takes his hand. “Don’t worry, Dad. I can practically guarantee that I will.”

When such technology becomes widespread, it won't just be used by people on their deathbeds. Everyone has memories that they would just as soon erase. Whether we should be allowed to erase them raises all kinds of sticky ethical questions. Then there's a related question that I haven't seen discussed as much...what are the ethical considerations around deliberately implanting false memories?

This might be done for entertainment, as in the movie Total Recall. Or it could be done for more sinister reasons. It's been widely argued that — without the benefit of any advanced technologies — some overzealous prosecutors have been implanting false memories of molestation in children's minds for years. Think how much more difficult the truth would be to ascertain if memories could be implanted directly, rather than through persuasion.

Posted by Phil at 04:30 PM | Comments (5) | TrackBack

ITF #121

In the Future...

...pop-up, pop-under and interstitial web ads will get even more annoying and eat more of your bandwidth.

Futurist: Posse member Robert Hinkley.

Posted by Phil at 04:08 PM | Comments (0) | TrackBack

January 22, 2004

New Blog

Check this out: Responsible Nanotechnology. From (of all people) the Center for Responsible Nanotechnology. Lots of interesting material; well worth your time.

Posted by Phil at 02:39 PM | Comments (0) | TrackBack

Fighting the Real Enemy

A common theme in the comments posted to my recent essay on why I think death is such a bad thing was the idea that lengthening life doesn't work as an end unto itself. Life extension is meaningless without some assurances as to the quality of life. As reader Cybrludite put it, "It's not the years in your life, but the life in your years."

Three major problems that were raised with extending human lifespan were dementia, incontinence, and (for longer periods of life extension) boredom. I don't have much to say about incontinence at present — not really one of my favorite topics — and I think the debate about boredom has been sufficiently argued in the comments to the original post. (For another take on what to do with a long life, see the Aubrey de Grey quote in the sidebar of the Speculist home page.)

But there have been some developments on the dementia front. FuturePundit Randall Parker reports as follows:

Vitamin C, E In High Dose Combination May Protect Against Alzheimer's

Peter P. Zandi, Ph.D., of The Johns Hopkins University Bloomberg School of Public Health, Baltimore, and colleagues examined the relationship between antioxidant supplement use and risk of AD.

The researchers found the greatest reduction in both prevalence and incidence of AD in participants who used individual vitamin E and C supplements in combination, with or without an additional multivitamin. "Use of vitamin E and C (ascorbic acid) supplements in combination reduced AD prevalence [by about 78 percent] and incidence [by about 64 percent]," the authors write.

How about that. So big doses of Vitamin C and E taken together (they don't do you much good separately) can apparently make a significant difference in whether one contracts Alzheimer's. I've been taking them for years (among several other things). Now I need to remember to call my Mom and make sure she and Dad are taking them every day; my in-laws, too.

Randall also notes that how you take the supplements may play a role in how well they work:

If you want to take Vitamin E to reduce your risk of Alzheimer's Disease then be aware that it is best to take E with oil and perhaps a food grain for maximum absorption. Vitamin E with pasta and a pasta sauce with oil would probably be a great way to maximize absorption.

Waiter, I'll have the Linguini with antioxidants, please.

Need I say it? Read the whole thing.

Posted by Phil at 07:48 AM | Comments (13) | TrackBack

ITF #120

In the Future...

...the arms race will quickly escalate to the next step: the pop-up blocker blocker blocker.

via GeekPress

Posted by Phil at 06:58 AM | Comments (10) | TrackBack

January 21, 2004

Carnival Trek

Poliblog is taking us where no Carnival of the Vanities host has gone before. Don't miss it.

Posted by Phil at 01:05 PM | Comments (0) | TrackBack

January 20, 2004

Male Offspring

The newly touted ability of parents to choose the sex of their child will almost certainly have a huge impact on society. It's interesting that Newsweek chose a couple seeking to have a girl for their case study. In the West, we believe — or we like to tell ourselves that we believe — that one sex is as good as the other, and that couples are as likely to use this capability to select a girl as a boy. I note, though, that sex selection is banned in Europe, perhaps because the leaders there suspect that, in the eyes of many, all babies are (still) not created equal.

Where this development is really going to hit home, however, is Asia, where few uphold any pretense of believing that a girl is as good as a boy. China's one-child policy has led to vast numbers of abandoned baby girls (many of whom are adopted in the West) and a return to the ancient practice of gender selection through infanticide — which was strongly discouraged, but never fully eliminated, during the early years of the Communist regime. The impact of these crude forms of gender selection is shocking:

[In] September 1997, the World Health Organization's Regional Committee for the Western Pacific issued a report claiming that "more than 50 million women were estimated to be 'missing' in China because of the institutionalized killing and neglect of girls due to Beijing's population control program that limits parents to one child." (See Joseph Farah, "Cover-up of China's gender-cide", Western Journalism Center/FreeRepublic, September 29, 1997.) Farah referred to the gendercide as "the biggest single holocaust in human history."

And China is hardly alone. The report linked above also cites widespread infanticide in India, as well as rampant use of abortion to prevent unwanted (female) children. A while back, FuturePundit noted that in Taiwan, abortion has skewed the demographics of childbirth such that three boys are now born for every two girls.

With its $18,000 price tag, it's unlikely that the procedure outlined in the Newsweek story (linked above) will have much of a role to play in rural China or India any time soon. But we may see simplified, less expensive versions of the procedure available in developed areas in the region within a decade or so, and some kind of risky "bootleg" version universally available sometime thereafter.

In such a scenario, we can look forward to a sharp drop in child killing and abandonment in these areas. But the demographic woes will only be exacerbated. I have to wonder how the Chinese government will respond to the introduction of such technology. On the one hand, it would support the one-child policy and help eliminate the dreadful work-arounds that have developed to ensure male offspring. On the other hand, it would lead to a highly skewed sexual demographic. What would a society that was 70-80% men be like? Would it be stable? Efficient? Brutal? Way gay? Nobody knows; it's never been tried.

One thing is for sure. China's population growth problem would be solved. In fact, a new population difficulty would probably emerge: rapid decline.

Via KurzweilAI.net

Posted by Phil at 07:21 AM | Comments (3) | TrackBack

ITF #119

In the Future...

...tax men will be even more keen-eyed and vigilant.

Futurist: Posse member Robert Hinkley.

[ Or perhaps this development indicates that that particular species is evolving (devolving?) in the other direction. We can hope, can't we?

Also, since going on the record with my controversial notion that Death Sucks, I've had quite a few people write in to remind me that some personal circumstances are worse than death. I agree. I am surprised, however, to learn that some personal circumstances — e.g., working in the office mentioned in the linked article — are apparently indistinguishable from death. ]

Posted by Phil at 06:56 AM | Comments (0) | TrackBack

Busy Days

Due to some unforeseen circumstances, I wasn't able to do any blogging yesterday (reading or writing), so I missed the Carnival of the Capitalists. If you missed it, too, here's your second chance.

The workload and unexpected personal priorities continue today, so blogging will be light once again. In the interests of time, I'm going to skip a "This Week" summary.

Oh, all right. A quick one. This week in The Speculist:

De Nada.

Not much.

Stillness. (Maybe. I'm rewriting that chapter. It might be done on time.)

Futurist Grab-Bag.

Potpourri of Predictions.

Speculist Surprise.

And throughout the week we'll be blogging developments in nanotechnology, artificial intelligence, space exploration, and other future-impacting areas. There. That's better. It's good to have a plan.

Posted by Phil at 06:43 AM | Comments (0) | TrackBack

January 18, 2004

Future Roundup 01/18/04

Here are all the In the Future... predictions for about the past month. It's nice getting caught up. Many thanks to futurists Chris Hall, Robert Hinkley, and Joanie (¿Dónde está el Jefe?) for helping us to look ahead.

In the Future...

...Boy Scouts will be required to redo the Orienteering Merit Badge.

...personal trainers will also take robots through rigorous pilates routines and show them how to avoid injury on the bench press.

...we'll also be able to get Meditation Of The Day direct from the lord Buddha.

...in the celestial barbershop quartet, white dwarves, quasars, and pulsars will sing the other three parts.

...canny Brazilians will turn the tables on the dam piranhas and start farming them for the novelty food market.

...Robot John Carter will assist NASA in their exploration of the red planet.

...P300 enhancers will keep the coffee out of the cornflakes and the cat out of the washing machine.

...we will be *delighted* if a burglar poos in the wardrobe.

...soldiers will be able to download custom ringtones for their chinstraps and helmets as well as being able to set them on vibrate.

...more insightful research will reveal that getting up on Monday morning is widely perceived as a pain in the neck.

...we'll find a zombie in love to be more remarkable than the politics of the object of said zombie's affections.

... the boxes will be smarter than the boxcutters.

...a cheese-slicing laser will be an essential accessory for every dinner party host.

...the myriad health benefits of women and song will be available in convenient pill format.

Well, that does it for now. If you have a prediction you'd like to share, send it to speculis-at-speculist-dot-com. (Be sure to include the URL of a news story that corroborates your prognostication.)

And until next time, we'll see you in the future.

Posted by Phil at 06:34 AM | Comments (1) | TrackBack

The Big Day

Friday was our biggest day ever at The Speculist. I want to offer up 5,352 heartfelt thank-you's to Glenn Reynolds for pointing a few new folks my way, and 7,474 of the same to all of you for racking up so many page views.

Here are the totals:

Since it was such a hit, I'm thinking of following up Death Sucks with some equally controversial essays. How about Poop Stinks? Kittens Are Cute? Pie Is Yummy?

Posted by Phil at 06:20 AM | Comments (0) | TrackBack

January 16, 2004

Life, Summarized

Since I'm dealing with the big issues today, I didn't want my treatise on how death sucks to be the last word. Here, then, are some words, from my favorite goddess, that sum up an awful lot of what life's about:

A wonderful teacher once told me "only rocks and love last forever. Choose to be one of them." I've been a rock and done little. Now it's time to be love - or at least filled with love - and do something.

Life's not about fame or fortune. It's about leaving enough of an impression on others that they think of you for awhile. Or, more importantly, that you think of others for awhile.

Read the whole story. Godspeed, Joanie.

Posted by Phil at 12:46 PM | Comments (1) | TrackBack

Death Sucks

Reader Mary (Definitely on the Outer Ring) posed the following question in a recent comment:

Why are you so scared of dying?

(She wrote some other provocative questions as well, but I want to focus on this one for now.)

From the context, I'm going to assume that what Mary is asking is a philosophical question. She doesn't want to know why I would get out of the way of a speeding truck. All mentally healthy human beings are "scared of dying" in that sense; it's something we share with virtually every living being on the planet.

What Mary wants to know is this: why am I not resigned to my own mortality? Why would I want to engage in this unseemly practice of exploring alternatives to dying?

I'll tell you why, Mare.

Death sucks.

Some say that dying is as natural as being born. I say, so what? Vomiting is as natural as eating, but I happen to like eating a lot more.

Some say that death is a part of life. I contend that, by definition, it is not.

Some say that death is the threshold to the next stage of existence. I say maybe so. But this stage seems to have a natural built-in aversion to the threshold to that stage, and I'm going to go with that.

Many believe that the fear of death is a primitive relic, a lingering superstition. Fear of death, they will tell us, is what originally led humanity to irrational thinking. We invented gods and spirits primarily to assuage this fear. Now we live in an age when rational thinking might once again hold sway, although irrationalism persists all around. To differentiate themselves from the irrational throng, rational thinkers proudly state that they are not afraid of dying.

I remember years ago, when I went to see Scorsese's Last Temptation of Christ, there were two groups of sign-carrying protestors standing out front of the theatre. One group was Christian, the other was Atheist. The box office line was rather long, and those of us standing in it were stuck between these two groups: one warning us not to go see this shocking piece of blasphemy, the other encouraging our support of free speech. Needless to say, there was a good deal of verbal sparring between the two camps. Some comments were good-natured and even a little funny, but it got heated from time to time. I remember one exchange ended with these very words:

Yeah? Well, I'm not afraid of dying.

Hey, good one. Sign-carrying atheists, one; sign-carrying fundamentalists, zero.

Unfortunately, that's a load of crap. No, I don't mean that I doubt that guy's sincerity when he said that he was not afraid to die. I'm sure he meant it, and wasn't just trying to score points against those polyester-clad, big-haired fundamentalists in front of his cool sign-carrying atheist friends. But the notion that the fear of dying is uniquely linked with irrational thinking is just about as wrong as it can be.

Let's go back 50,000 years or so and take a look at our primitive ancestors. It's true that somewhere along the line they developed burial rituals and a belief in an afterlife. Maybe this was just an irrational response to their fear of death and the grief of losing a loved one. But it was just a small part of what they were doing. What, then, were they spending most of their time doing?

Figuring out how the world worked.

These plants will make you sick. These are good for food. Spears with sharp stone heads are better than pointed sticks at bringing down game and warding off predators. This is a good place to stay; predators don't usually come here. After the moon changes three more times, we'll start heading south. We used to wait until it got cold, but this way works better and we lose fewer members of the tribe.

Our ancestors relentlessly pursued an empirical investigation into the nature of...everything. Science didn't begin with Newton or Bacon or the ancient Greeks. It started way back when. All mathematics, physics, biology, astronomy — all rational human thought — has as its foundation the pioneering work of these our ancestors.

Now what do you suppose motivated them to do all this hard investigative work, to engage in all this rational thinking? Could it have been the fear of death?

Absolutely. They were besieged by threats on all sides. A rational, empirical approach to the world emerged as the soundest way of warding off those threats. If our fundamentalist-taunting friend could go back in time and somehow convey to a group of his ancestors his basic credo of intellectual superiority — "I'm not afraid of dying" — they'd think he was nuts. And not because they were so irrational.

But we're only halfway there.

Paradoxically, the self-satisfied volley of "I'm not afraid of dying" might just as easily have come from the religious side of the ticket line as from the non-believing side. Religious and spiritually oriented people are often quick to tell you that they have no fear of death. And if you really got it — whatever that means to the particular believer — you wouldn't be afraid of death, either. If you only understood about Jesus' victory on the cross, or reincarnation, or nirvana, or even just the Natural Order of Things, you would be as resigned to your own eventual demise as the rest of us.

Yeah, well, that's a load of crap, too.

I'm going to restate that so I'm not misunderstood. Any religion that teaches that you should be okay with the fact that you're going to die is a load of crap. Christianity (to use the religion I'm most familiar with) most assuredly does not teach this. As C. S. Lewis famously put it:

But here is something quite different. Here is something telling me -- well, what? Telling me that I must never, like the Stoics, say that death does not matter. Nothing is less Christian than that. Death which made Life Himself shed tears at the grave of Lazarus, and shed tears of blood in Gethsemane. This is an appalling horror; a stinking indignity. (You remember Thomas Browne's splendid remark: "I am not so much afraid of death, as ashamed of it.")

I believe that all human beings, including people of faith, share the same natural revulsion for death. We can blot these feelings out and cover them up, but to do so is to become like those rabbits in Watership Down who sang melancholy songs while trading their lives for some lettuce and carrots.

Those who claim to have no fear of death, whether they be an Objectivist or the Dalai Lama or some Palestinian strapping dynamite to his chest, have lost touch with a primary truth of human existence: a truth which has led us both to science and to faith. Those who seek to prolong human life — whether via antioxidants or cryonics or standard medical procedures — have tapped into that same fundamental truth:

Death sucks.

Posted by Phil at 09:32 AM | Comments (80) | TrackBack

ITF #118

In the Future...

...the myriad health benefits of women and song will be available in convenient pill format.

Futurist: Posse member Robert Hinkley.

Posted by Phil at 06:46 AM | Comments (0) | TrackBack

January 15, 2004

The Big Announcement

I have mixed feelings. Sure, as somebody pointed out, it's good to have the return to Space on the agenda. It's good that we're talking about going back. But I wanted a lot more than this. Hearing that we're going to go to the moon in 10+ years was something to get excited about in 1960. A permanent moonbase in 2020? A manned voyage to Mars in 2025? Yes, that would all have been very exciting, too...in 1960.

Anyway, I was looking through the Google News listings on this story and found a few interesting items:

Best headline:
Bush to Martians: here we come!

Lamest headline:
To boldly go ... Bush tells Nasa to build new shuttle for Mars

Thoughtful naysaying:
President shoots for moon, Mars LUNAR FOOTHOLD: Scientists disagree on value of U.S. return to Earth's satellite

Surly, knee-jerk naysaying:
Bush's Space Vision Thing

The all-important local angle:
Indiana likely to figure in Bush plan

But far and away the best analysis has got to be this piece from The Telegraph:

To boringly go where they've gone before

The moment the black-and-white pictures flashed up on the screens, the celebrations began. Whoops of joy, tears of relief, high-fives: the team of Nasa scientists at mission control in Pasadena, California, were jubilant at their success.

They had sent a probe 250 million miles to Mars, landed, and were now looking at pictures beamed back from its surface. Later this week, their Mars Rover Explorer will start trundling about on the Red Planet.

Such celebrations were clearly merited for this whole slew of "firsts" - except they were nothing of the sort. Nasa has been visiting the planet since the early 1960s, and has even landed on its surface several times before.

Okay. So far so good. I don't think anyone really said it was a first, but what the heck. However, we then come to this little shocker:

The pictures splashed across the world's front pages last week were indistinguishable from those sent back by Nasa's Viking Landers more than a quarter of a century ago. Not even the plan to put a man on Mars was new: Nasa pulled that one off back in 1997.

A while back, I got really upset with a guy who tried to tell me we never went to the moon. I've learned from my mistake. I'm not angry with Robert Matthews, who writes surly editorials for The Telegraph and who believes that the US sent a man to Mars in 1997.

I'm not angry with him at all.

Posted by Phil at 06:02 AM | Comments (0) | TrackBack

Choose Your Dreams

A Japanese company says they've figured out how:

Prospective dreamers are asked to look at a photo of what they would like to dream about and then record a story line into the Yumemi Kobo, or "dream workshop".

The machine uses the voice recording, along with lights, music and smells, to help them direct their own dreams during periods of rapid eye movement (REM) sleep, Takara Co said.

Then again, maybe they're just flakes:

Takara Co, which brought the world the "bowlingual" and "meowlingual" devices - which purport to translate your pet's communication - admitted the machine may still need refining.

via KurzweilAI.net

Posted by Phil at 05:14 AM | Comments (2) | TrackBack

January 14, 2004

I Like the Sound of That

Rocket Plane.

Say it with me: "Rocket Plane."

This is what I'm talking about, folks. It's 2004. We're supposed to have robots (dancing optional). We're supposed to have flying cars. And we're supposed to have rocket planes.

World peace and some of that other future stuff would have been good, too, but those three are the short list.

via Rand Simberg

Posted by Phil at 10:56 AM | Comments (3) | TrackBack

ITF #117

In the Future...

...a cheese-slicing laser will be an essential accessory for every dinner party host.

Futurist: Posse member Robert Hinkley.

Posted by Phil at 10:37 AM | Comments (0) | TrackBack

ITF #116

In the Future...

... the boxes will be smarter than the boxcutters.

Futurist: Posse member Chris Hall

Posted by Phil at 10:36 AM | Comments (0) | TrackBack

January 13, 2004

The Big Time

Via Paul Hsieh, here is what may be the first (or is at least one of the first) references to the Technological Singularity to appear in the mainstream press. The coverage is disappointing. The writer has no clue what she's writing about. She calls the Singularity "a kind of artificial intelligence." That's like calling World War II a "series of bad things that happened." And how about this little throw-away line in describing Eliezer Yudkowsky:

Like most transhumanists, he is Caucasian.

Oh, so it's just a bunch of white guys. Glad she pointed that out. For a minute there, I was almost ready to listen to what they had to say.

The whole thing is a smear job. This Danielle Egan essentially uses hipness as her yardstick for credibility; she thinks it's important to point out that Eliezer has bad posture and is a virgin, and she makes a nasty comment about "brown teeth" which I'm pretty sure is just plain wrong.

What a fascinating analysis. I wonder which high school she goes to?

Posted by Phil at 03:23 PM | Comments (3) | TrackBack

Space Cowboy

Glenn Reynolds, writing on whether opening up the Final Frontier might not be an appropriate legacy for an American President accused by many around the world of being a "cowboy," draws the following enticing scenario:

If you want settlement, and development, you need to give people an incentive. One possibility, discussed by space enthusiasts for some time, is a property-rights regime modeled on the American West, with land grants for those who actually establish a presence on the Moon or Mars. Some have, of course, derided the idea of a "Wild West" approach to space development, but other people like the idea of a "Moon Rush," which I suppose could be expanded in time to a "Mars Rush."

I think people will go even without land incentives, and even without clearly defined business plans, given the opportunity. They'll go for the adventure, which is probably the most motivating incentive of all.

Posted by Phil at 02:45 PM | Comments (0) | TrackBack

January 12, 2004

Yes, It Will Be on the Final

To begin a new term here at Speculist University, everyone should familiarize themselves with Kurzweil's law:

In an evolutionary process, positive feedback increases order exponentially. A correlate is that the "returns" of an evolutionary process (such as the speed, cost-effectiveness, or overall "power" of a process) increase exponentially over time -- both for life and technology.

This is why the world works the way it does. This is what the buzzkills just don't get. By all means, drop whatever it is you're supposed to be doing and read the whole thing.
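To make the compounding concrete, here's a back-of-the-envelope Python sketch (my own illustration, not anything from Kurzweil's essay) of the positive-feedback loop the law describes: each step's gain is proportional to the capability already accumulated, so "returns" grow exponentially rather than linearly. The function name and the 50% feedback rate are arbitrary choices for the example.

```python
def returns_over_time(initial=1.0, feedback=0.5, steps=10):
    """Simulate positive feedback: each step's gain is proportional
    to the capability already built, so growth compounds."""
    capability = initial
    history = [capability]
    for _ in range(steps):
        capability += feedback * capability  # the positive-feedback step
        history.append(capability)
    return history

# Ten steps at 50% feedback multiply capability roughly 57-fold
# (1.5**10), while a linear intuition would predict only 6-fold.
print(returns_over_time()[-1] / returns_over_time()[0])
```

The gap between the exponential result and the linear guess is exactly the gap between how the law says the world works and how "the buzzkills" expect it to.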

[Image: Speculist University shield]

Posted by Phil at 10:36 AM | Comments (0) | TrackBack

Here's a Trend...

...that we need to nip in the bud.

"Nip it!" as Barney Fife would say, "Nip it! Nip it! Nip it! In the bud.

Posted by Phil at 10:30 AM | Comments (0) | TrackBack

ITF #115

In the Future...

...we'll find a zombie in love to be more remarkable than the politics of the object of said zombie's affections.

Posted by Phil at 10:14 AM | Comments (0) | TrackBack

No Round-Up?

I know I'm a few weeks behind on my ITF Roundups. These past few weekends just haven't been good for blogging. But we'll get caught up soon, I promise.

Posted by Phil at 10:11 AM | Comments (1) | TrackBack

The Vision Thing

Howard Lovy:

My main argument is that U.S. policymakers need to rise above the commerce side of the debate and help encourage development of nanoscience without letting business interests become the sole driver of the research. Reading the nanotech bill alone, you'd think that the government's central goal was to spin off companies and develop new products. Is that it?

Sadly, as Rand Simberg will attest, our government agencies are (to say the least) a little deficient in their ability to crank out vision. At least where nanotech is concerned they are interested in commercialization of the technology. NASA, on the other hand, seems to want to keep space travel as a government monopoly forever.

Maybe a lack of vision is part of the price we pay for our vapid, superficial, "good guys vs. bad guys" approach to politics. Why would anyone capable of any depth of thought consider politics as a career?

Posted by Phil at 10:08 AM | Comments (0) | TrackBack

ITF #114

In the Future...

...more insightful research will reveal that getting up on Monday morning is widely perceived as a pain in the neck.

Futurist: Posse member Robert Hinkley.

Uh-oh. Here's another one I might have to explain. Unlike the US, where we all love our jobs and leap out of bed with unbounded enthusiasm each and every Monday morning, apparently some folks in the UK feel differently.

How peculiar.

Posted by Phil at 09:55 AM | Comments (0) | TrackBack

ITF #113

In the Future...

...soldiers will be able to download custom ringtones for their chinstraps and helmets as well as being able to set them on vibrate.

Futurist: Posse member Robert Hinkley.

Posted by Phil at 09:49 AM | Comments (0) | TrackBack

Birthdays and Time Travel

Glenn Reynolds shows how birthdays demonstrate the trip through time that we're all on:

JEFF BEZOS IS 40. (And dressed like Austin Powers.) Howard Stern is 50. Everybody gets a day older, every day. That's not news, but the results still have the capacity to surprise.

That's time travel at work, folks. Practical Time Travel.

Posted by Phil at 09:42 AM | Comments (1) | TrackBack

January 09, 2004

The Holy Land

I'm reminded of another topic I waxed enthusiastic about a while back, Robert Zubrin's novel, The Holy Land. It's a bit late for a full review now, so I'll just say that I found it to be a quick, fun read that didn't disappoint on the major premise. I found some of the dialog a little stilted, and I didn't buy into the love story. And some of the more outrageous situations struck me as being kind of silly (rather than funny.) But overall, I would say the book works. Dr. Zubrin accomplishes exactly what he sets out to do: a satirical recasting of the Israel/Palestine conflict which illuminates the outright absurdity of the situation.

Rand Simberg has some thoughts on the book as well, with links to the recent NRO review. Simberg may disagree with Zubrin's views on Mars, but he liked The Holy Land (even the love story that I couldn't quite choke down.) Professor Hall made note of the book the other day, and also had some comments on Zubrin's recent interview with Linda Seebach in the Rocky Mountain News.

Anyhow, if the Zubrin love-in gets to be a bit much for you, Posse member Joanie (Da Goddess Herself) had a somewhat different take on the book.

Posted by Phil at 05:56 AM | Comments (1) | TrackBack

January 08, 2004

Back to the Moon

President Bush is planning to announce a return to the Moon (and the establishment of a permanent base thereon) and a mission to Mars:

White House (AP) - President Bush will announce plans next week to send Americans to Mars and establish a permanent human presence on the moon, senior administration officials said Thursday night.

Bush won't propose sending Americans to Mars anytime soon; rather, he envisions preparing for the mission more than a decade from now, one official said.

In addition to proposing the first trip to the moon since December 1972, the president wants to build a permanent space station there.

This is all by way of some "unnamed senior officials," so we'll see if anything comes of it. But there have been rumblings of this for some time. If the President's proposal really does involve Mars, it will come as good news to our friend Robert Zubrin. And if it involves someone other than NASA being responsible for creating the infrastructure, Rand Simberg will be pleased. If the plan works for Professor Hall, too, we'll have ourselves a hat trick.

I must admit that I'm pretty psyched about it, whatever the details turn out to be.

UPDATE: Okay, the rush of excitement is over. Rand is underwhelmed by the idea. His skepticism resonates. And check out the very interesting discussion in the comments section.

Posted by Phil at 10:28 PM | Comments (7) | TrackBack

ITF #112

In the Future...

...we will be *delighted* if a burglar poos in the wardrobe.

Futurist: Posse member Robert Hinkley, who comments as follows on his use of quaint British vernacular:

Erm, that's "takes a dump in the closet" in American. Maybe it's just a uniquely British thing to be convinced that pooing in the wardrobe is standard operational procedure for burglars. Fertile ground, no doubt, for PhD study: "A cross-cultural analysis of burglar defecation behaviour and its perception".

Friend of mine comments: "I'm trying not to think about this too much while I'm eating my lunch, but wouldn't the DNA be from the food he's eaten, rather than him? Are the police now looking for a criminal tomato?"

I think your friend may be on to something. And while I don't generally try to assess the accuracy of these predictions, I can state with confidence that I will never be delighted to have a burglar take a dump in my closet.

Luckily, my wife was educated in the UK and was thus able to decode your strange prediction for me even without reference to your explanatory note.

Posted by Phil at 08:06 AM | Comments (0) | TrackBack

January 07, 2004

Real Writers

I just got an e-mail from an old friend who has finished the first draft of her novel. She first showed me some pages from it almost five years ago and now has a 500-page manuscript. I can't wait to see it.

Many of my friends are writers of one kind or another, and at least a few of them are "real" writers (meaning they make a living doing it.) Not to be a name-dropper or anything, but one of those friends is none other than romance/mystery writer Edie Claire. Hers is a particularly impressive name to drop because

  1. Her work is highly regarded and doing quite well.

  2. I once went out with her.

  3. She was (and, as you can see, still is) quite a babe.

I recommend all of Edie's books, naturally. Try the Leigh Koslow mysteries; they're a lot of fun: Never Buried, Never Sorry, Never Kissed Goodnight, Never Preach Past Noon, and Never Tease a Siamese. (I hope I got those in the right order; I'm working from memory.)

They're all good; you can't go wrong. But if I had to choose just one Edie Claire novel to recommend, it would have to be her latest, Long Time Coming, which I haven't even read yet. This one is my favorite because of the cover:

You might well wonder what's so special about the cover. Well, the house shown there is the one I grew up in, in Mayfield, Ky. Back in high school, Edie and my sister Ellen were good friends. The two spent a lot of time over at each other's houses, and it would appear that our place made quite an impression on Edie. It served as the inspiration for the setting of her latest offering. So if you want passion, intrigue, and some keen insights into the house I grew up in, this is definitely the book for you.

Oh, yeah — and speaking of writers — this is your chance to participate in the process of transforming me from a fake writer to a real one. How can you do that? Simple: start reading Stillness, our serialized novel which features passion, intrigue, and the End of the World (which is arguably as interesting as my childhood home.)


by Philip Bowermaster

Part I

Chapter 1, in which Reuben sees lights.

Chapter 2, in which Sergei gives advice.

Chapter 3, in which Ksenia looks at cars.

Chapter 4, in which Reuben falls.

Chapter 5, in which Reuben contends.

Chapter 6, in which Reuben recovers.

Chapter 7, in which Sergei explains some things.

Chapter 8, in which Betty explains the rest.

Chapter 9, in which Father Alexy saves the day.

Chapter 10, in which the old man speaks.

Chapter 11, in which Reuben obliges.

Part II

Chapter 12, in which Emmett goes to work.

Chapter 13, in which Frank has some news.

Chapter 14, in which Peggy opens a box.

Chapter 15, in which Emmett becomes confused.

Chapter 16, in which Rick offers some advice.

Chapter 17, in which two strangers arrive.

Part III

Chapter 18, in which Celia meets Corey.

Chapter 19, in which Grace wins a game.

Chapter 20, in which Celia remembers.

Chapter 21, in which Corey wishes.

Posted by Phil at 10:12 AM | Comments (0) | TrackBack

The Future of Wealth

Glenn Reynolds raises some interesting points about income inequality in his latest Tech Central column:

If the rich are getting richer on a steep curve, while the rest of us are getting richer on a less-steep curve, then if you project the trends far enough ahead we'll have Bill Gates owning entire solar systems while the likes of me make do with a Porsche. Not exactly a tragic scenario (though I'd prefer a Bugatti) but if wealth disparities are great enough, I suppose it becomes harder to maintain civil society, as the rich will have too little in common with the rest of us.

The response, of course, is that if you project any trend far enough into the future it leads to bad scenarios -- and usually worse scenarios than the one where I have to settle for a Porsche. But such projections rarely come true. Today's rich people might get richer than the rest of us, relatively, but it's not likely to turn them into Galactic Overlords. In fact, in terms of daily life experience, I suspect that today's rich are less different from, say, ordinary upper-middle-class Americans than their counterparts were a hundred years ago, or even twenty (a point buttressed by Easterbrook's new book, The Progress Paradox), and that doesn't seem likely to change in the foreseeable future.

I've heard it argued that a typical middle-class American enjoys a lifestyle as opulent as (or even more opulent than) that of a typical king in the Middle Ages. Like life expectancy, wealth is increasing with each generation. But because of inflation and fluctuations in currency value, wealth is harder to measure across generations than something as straightforward as lifespan.

I like Glenn's idea of comparing the difference in lifestyle between the average person and the richest of the rich. If that delta could be quantified, it would almost certainly show a consistent downward trend over the past two hundred years.


Is it because rich just doesn't buy you as much as it used to? Just the opposite, really. Poor buys a lot more than it used to.

A good way to measure the increase in wealth across the generations is via the accumulation of stuff. It's by this measure that I can declare myself richer than, say, William the Conqueror. Actually, that's a tough one. He had more land and horses than I do, and was much better off in the precious metals department. And in terms of being able to raise armed forces, I'll have to concede that my Posse might have a hard time with the Norman invading forces. Still, I sleep in a more comfortable bed, eat better food, and have TiVo. He never got to go to Starbucks for a Caramel Macchiato (hell, he never even got to go to Dairy Queen for a Peanut Buster Parfait), much less have a steam bath followed by an aromatherapy facial at Antoine Du Chez.

And even if some historical know-it-all wants to add a comment explaining that William actually did have access to the equivalent of any of the above, I would then point out that he was the King of Freaking England, whereas I'm the Guy with the Second Patchiest Lawn on the Block. Plus, I had hernia surgery a few years ago. Throw a medical condition into the mix and there's just no question as to who's better off.

I get to live like a king because there is so much more stuff than there used to be. That's why I think the lifestyle delta between average and richest-of-the-rich would be particularly interesting to track over the past 200 years. Since the beginning of the Industrial Age, the amount of stuff available to everyone (including the poorest of the poor) has increased exponentially. So even if — as Glenn points out — the richest are accumulating stuff on a much steeper slope than the rest of us, their relative edge in lifestyle is decreasing. Once you cross a certain threshold, the question of who is actually better off becomes harder and harder to determine (as it was with William the Conqueror and me.) There are people who have much less money than Bill Gates and yet who live a much more opulent lifestyle. Who's richer? If Bill Gates owns the whole solar system but wears dorky clothes and drives a dorky car, while Glenn drives a Porsche and wears marginally less dorky clothes...it almost becomes a matter of taste at that point.

About the only thing left that great wealth can buy is political influence, and even that (as Glenn explains) is becoming a shakier proposition.

Two (hypothetical) future developments promise to flatten the delta virtually out of existence. One of these is the universal assembler (third item), which uses nanotechnology to allow anybody to make — literally — anything they want, including their own universal assembler. In addition to closing the gap between the rich and the average, this device will eliminate any remaining gap between the average and the poor. Poverty won't exist any more.

The other development is full-immersion virtual reality, which will enable anyone to experience anything. Think of that scene in the first Matrix where they arm themselves by selecting weapons from an inexhaustible warehouse containing every firearm ever conceived. Now map that capability over to things like cars and vacations and (yes) romantic partners.

Who's richer, a guy with one real Porsche or a guy with a virtual collection of every Porsche model ever built? Assuming the VR is flawless and the experience of driving the virtual cars is identical to the real thing, I'm going to say the second guy. If this capability is ever realized, the day people generally agree with my answer is the day the concept of "wealth" ceases to exist.

Posted by Phil at 08:52 AM | Comments (15) | TrackBack

January 06, 2004

2003 Traffic Summary

Here are the web stats for December and the traffic summary for 2003. December was our biggest month yet, inching out September by 21 unique visitors.

So here's the story:

Looks like we're clipping along towards our 75,000th visitor, our 100,000th visit, and our 200,000th page view over the next few weeks.

Maybe it's not quite the big leagues just yet, but it feels like a lot to me!

Anyhow, thanks to all of you for stopping by in December, and for being a part of The Speculist in 2003. And now...on to the future.

Posted by Phil at 07:04 PM | Comments (0) | TrackBack

Final Cause

Steven Den Beste, explaining the three major forces at work in the current global struggle, identifies two distinct approaches to how we learn about and interact with the world around us. Conceived in ancient times and refined over the centuries, these two forces — which he has (uncomfortably) dubbed realism and idealism — have competed vigorously through the ages. Because it has fostered scientific, technological, and economic development, realism now has the upper hand on idealism, which has consistently stood in the way of these benefits. But idealism won't let go without a fight. Enter the three contenders in the current conflict:

Two contending factions are agnostic (but with some religious members), one is theistic (but with some agnostic members). Two are idealist, one is realist. None really like or trust any of the others, but the realists have been prospering while the others have failed, and so it is that the other two are afraid. In peaceful competition, they'll lose.

When the Islamists lashed out violently at the realists, the idealists tried (and failed) to prevent the realists from fighting back, and thus the lines in this war were drawn. The realists are engaged in a shooting war with the Islamists, and in diplomatic war with the idealists.

This is as good a summation of the philosophical bases of the War on Terror as you are likely to find anywhere. Den Beste's categories may not be perfect, but they are definitely illuminating.

The fundamental divide between the two camps grows out of the concept of teleology, which is the belief that the "final" cause of a phenomenon is more important than any of its "efficient" causes. Efficient causes are what we normally think of as causes. If I'm overweight, all those Snickers Bars I've been eating over the years are the efficient cause of my condition. But wait. Maybe I was tapped by the Fat Fairy early on; perhaps my heft is something that was Always Meant to Be. That built-in destiny, that ultimate condition that had to come about, is the final cause of my weight problem.

The difficulty that we have even grasping the notion of final causes is a testament to how thoroughly the realistic view has won out. Causes are things that make things happen; they precede effects. Efficient causes push time forward from past to future by small steps. Final causes paradoxically come after the effects. From the future, they pull time forward out of the past. That such future final causes can exist suggests a purpose to the universe, and ultimately a Cause or Designer behind that purpose.

In his book Biocosm, James N. Gardner points out that teleology's last stand in serious scientific discourse came about with the publication of William Paley's Natural Theology in 1802. Paley argued that, just as finding a pocket watch out in the woods implies that there was a designer and maker of that watch, finding a sparrow or other living creature (of greater complexity than a watch) implies that it, too, was designed and made by some greater intelligence. Paley's argument was shattered by the publication of Darwin's The Origin of Species.

In its classical form, teleology is now dead, although it does pop up from time to time in the writings of Creation Science advocates. But Gardner makes an interesting observation about a lesser-known passage in Paley's work:

Paley [notes] that there are only three extended spatial dimensions in our cosmos...yielding an inverse square law for the diminution of the force of gravity in three-dimensional space...[If] gravity had diminished in our cosmos with an inverse cube law (or indeed any inverse power law by which the force of gravity diminishes more rapidly than under the dictates of the inverse square) life as we know it could not exist on planetary surfaces.

Gardner points out that Paley's argument is in "striking conformity" with contemporary thinking related to the anthropic principle, which basically states that it's an enormous coincidence (some would argue too big a coincidence) that the universe ended up being capable of supporting beings like us. Is there a connection between the anthropic principle and teleology? The Wikipedia article linked above describes the anthropic principle as (potentially) having a "fatal tinge" of teleology, and there are no doubt some flavors of it that veer off in that direction.
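Paley's dimensional observation can actually be made quantitative. In D spatial dimensions, Gauss's law makes gravity fall off as the surface area of a sphere, and a standard effective-potential argument (my own sketch here, not something taken from Gardner's text) shows that stable orbits survive only when D is less than four:

```latex
F(r) \propto \frac{1}{r^{D-1}}
\qquad\Longrightarrow\qquad
V_{\mathrm{eff}}(r) = \frac{L^2}{2mr^2} \;-\; \frac{k}{(D-2)\,r^{D-2}}
\quad (D \neq 2)
```

A circular orbit at the minimum of the effective potential is stable only if the second derivative there is positive, which works out to the condition D < 4. So three spatial dimensions — and hence the inverse-square law — is the largest number that permits stable planetary orbits, which is precisely the coincidence Paley was pointing at.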

But certainly not all variations of the anthropic principle (even the "strong" version thereof) are teleological. Or it might make more sense to say that not all teleology is idealist in its formulation. In our recent discussion (yes, I'm referring to it again), John Smart talked about how the universe may encode emergent intelligence. In his model, intelligence is encoded in the physical laws of the universe in much the same way that intelligence is encoded in our genes. In both cases, it came to be there through an evolutionary process within multiple developmental cycles — meaning that it's no coincidence that the universe is highly tuned to support us, nor does that fact require a nod to idealism. In Smart's model, the universe doesn't need a "watchmaker" to account for its complexity, or for its accommodation of further complexity. Our universe has evolved from less complex models in the same way that we have evolved from more primitive forms of life.

I won't go into the specifics of how this works, and I don't bring it up in this context to discuss the relative merits of the theory. What interests me is that here's a model that allows for final causes without a rejection of realism. (Den Beste equated realism with empiricism; Smart's theory, as speculative as it may seem, is ultimately testable.) An emergent final cause may be permissible, as Den Beste suggests here in his description of the origin of realism:

It started with the question, "What is the universe like?" and came up with the answer, "I dunno; let's go look and see." It posits that there actually is an objective universe, and doesn't automatically assume that it has any kind of underlying purpose. If such a thing is present, it will become clear in due course, and in the mean time let's all look around to see what kind of place we're living in.

[Emphasis added.]

An emergent purpose is similar to a final cause in that both seem to work by drawing evolution or progress to a particular end. One important difference between the two is that we could never achieve any understanding of such a purpose via idealism. Only realism will get us to an emergent driving force. Another important difference is that we can't know in advance what the particular end is; it emerges in unexpected and unpredictable ways.

In her book The Future and Its Enemies, Virginia Postrel describes the conflict between dynamists, who are creating the future through creativity and experimentation, and the enemies of the future, whose model of the future varies from elaborate and detailed plans to certain disasters that must be avoided. Each group is dedicated to a notion of human progress derived from its particular worldview. So in a sense, both groups are "idealists" — both can be seen as being devoted to a higher purpose. For the dynamists, the ends are emergent; we're discovering them and understanding them better as we go. For the other group (will anyone object if I call them "buzzkills?"), the ends are pre-determined, handed down from those who know better.

Dynamism incorporates the best of what both realism and idealism have to offer. Like realism, it relies on trial-and-error to move ahead. Like idealism, it allows progress to be seen as purposeful. I think that Den Beste's model of idealists vs. realists provides a clear picture of where we've come from and where we are now. And I think Postrel's picture of dynamists vs. buzzkills shows how that old conflict is being redefined, and gives us an idea of where the lines will be drawn going forward.

Posted by Phil at 02:04 PM | Comments (4) | TrackBack

ITF #111

In the Future...

...P300 enhancers will keep the coffee out of the cornflakes and the cat out of the washing machine.

Futurist: Posse member Chris Hall

Posted by Phil at 08:03 AM | Comments (1) | TrackBack

January 05, 2004

Old Galaxies, Too

Following close on the heels of this morning's entry about how there are many sun-like stars in our galaxy, most of them quite a bit older than our own sun, here's a report about some very old galaxies that we didn't expect to find.

The universe is laden with massive galaxies that formed while the universe was just one billion years old, an era when such mature galaxies were not expected to exist.

Astronomers with the Gemini Deep Deep Survey have found an abundance of galaxies in the "redshift desert," a region of space thought to be sparse because of the time needed for massive galaxies to form. But a wealth of patience, combined with long telescope exposure times, has shed some new light on the matter.

What are we using as our estimate for the age of the universe these days...15 billion years? 20? Put a few sunlike stars in those ancient galaxies and you have the potential for intelligent life developing 10 billion years ago. So multiply everything I said about a head start by 10.

Contrary to what I said earlier about whether these ancient intelligences and we would have anything interesting to say to each other, I'm reminded that John Smart (in our recent interview) suggested that we will likely never find any of these intelligent forerunners:

As I've mentioned earlier, I think all universal intelligence follows a path of transcension, not expansion. This has to do with such issues as the nature of communication in complexity construction (two-way, with feedback, is relentlessly preferred), the large scale structure of the universe (which puts huge space buffers between intelligences) and the small scale structure of the universe (which rewards rapid compression of the matter, energy, space, and time necessary to do any computation).

Once our antennas are powerful enough to detect unintentional EM emissions from the closest few million stars, something that Frank Drake tells me is almost possible now with the closest of our neighboring stars, we'll begin to discover these unmistakable signatures of nonrandom intelligence. We will also notice that every year, a small fraction (roughly 1/200th) of these radio fossils suddenly stop sending signals. Like us, these will be civilizations whose science invariably discovers that the developmental future of universal intelligence is not outer space, but inner space.

By the way — if John is right — we've got about 100 years more of broadcasting before we've finished leaving our mark in outer space. Who knows? Maybe in a billion or three years, some upstart ET-come-latelies in a distant galaxy, surprised to discover that potentially life-sustaining stars existed long before their species came about, will point their radio telescopes at the sky and catch the last few minutes of the Humanity Show.
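Smart's roughly-1/200th-per-year figure implies a simple exponential fade-out of these radio fossils. A back-of-envelope sketch (my own arithmetic, not a calculation from the interview):

```python
import math

RATE = 1.0 / 200.0  # Smart's suggested annual transcension rate

def fraction_remaining(years, rate=RATE):
    """Fraction of radio-era civilizations still broadcasting after `years`."""
    return (1.0 - rate) ** years

# Years until half of the detectable radio fossils have gone silent
half_life = math.log(0.5) / math.log(1.0 - RATE)

print(f"half-life: {half_life:.0f} years")                 # ~138 years
print(f"still audible after a century: {fraction_remaining(100):.0%}")  # ~61%
```

At that rate, a century after we first catalog a population of radio fossils, roughly three in five would still be audible; after five centuries, fewer than one in ten. Short windows indeed, on cosmic timescales.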

Posted by Phil at 04:19 PM | Comments (0) | TrackBack

A Tale of Two Spacecraft

It was the best of times; it was the worst of times. Professor Hall has all the details.

Speaking of these developments, I think Jay Manifold has correctly identified Martian Soil as the go-to blog for Mars coverage.

Posted by Phil at 03:06 PM | Comments (0) | TrackBack

This Week 01/05/04

Today is my first day back at work, plus I'm in a state of quasi-mourning about this whole situation. Still, I expect that what with NASA showing us some exciting images of Mars, along with other developments, there should be plenty to speculate about.

Our 7 Questions and Speaking of the Future interview with Charles Murtaugh will probably not run before next week. So stay tuned on that one.

And, yes, Chapter 21 of Stillness will show up right on schedule on Wednesday.

Posted by Phil at 09:24 AM | Comments (0) | TrackBack

ITF #110

In the Future...

...Robot John Carter will assist NASA in their exploration of the red planet.

Posted by Phil at 09:12 AM | Comments (0) | TrackBack

One Out of Ten

Via KurzweilAI.net:

One tenth of the stars in our galaxy might provide the right conditions to support complex life, according to a new analysis by Australian researchers. And most of these stars are on average one billion years older than the Sun, allowing much more time, in theory, for any life to evolve.

Interesting. A billion years is a good sized head start. If there is anyone out there, they might be so far ahead of us that we wouldn't have anything interesting to say to each other.

Posted by Phil at 06:53 AM | Comments (0) | TrackBack

January 03, 2004

More on Calorie Restriction

I guess I kind of opened up a can of worms with my crack about "400 calories a day" yesterday. I was looking through some notes and found this golden oldie over on my previous blog; I thought it might shed some light (or at least annoy a few more people).


This is my top story on both blogs this morning. Check it out:

A new mouse study suggests fasting every other day can help fend off diabetes and protect brain neurons as well as or better than either vigorous exercise or caloric restriction. The findings also suggest that reduced meal frequency can produce these beneficial effects even if the animals gorged when they did eat, according to the investigators at the National Institute on Aging (NIA).

There's more.

Dr. Mattson and his colleagues are currently studying the effects of meal-skipping on the cardiovascular system in laboratory rats. The findings of this study, which compares the resting blood pressures and heart rates of rats that were fasted every other day for six months with rats allowed to eat unlimited amounts of food daily, should be available soon.

Now, we've known for a long time that caloric restriction is an effective means of extending lifespan, at least in rodents. It very likely works with humans, too. The trouble with these calorie restriction diets is that they're a huge drag. I read Beyond the 120 Year Diet, which is an expanded version of the diet book by Roy Walford based on his life-extension research with rodents. My normal pattern is to buy a diet book, read it, try the diet out for anywhere from 10 days to three weeks, get bored, and quit. But with the 120 Year Diet, I didn't even get that far. All I had to do was read it.

Essentially, you just eat one big salad every day.

Let's be real, here.

No way am I sticking to something like that. I mean sure, I'll eat the big salad every day. And then I'll follow it up with a well-marbled steak and a baked potato. And possibly a big dish of ice cream. Because, frankly, I don't care if some of the other mice do live longer. At least I had a decent dinner.

To be honest, I pretty much skip the baked potato and the ice cream these days. I've been living on a highly modified (read: I cheat a lot) version of the Atkins diet for several months and it seems to be working for me. I read the late Dr. Atkins' Age-Defying Diet and noted that his criticism of the calorie restriction diet was the same as my own: it's just too damn hard. Reading his book, I got the impression that Atkins wanted to make the assertion that his low-carb program has the same effect on the system as Walford's calorie restriction, but he didn't have anything to back it up.

Now with this new research, we have evidence that something other than calorie restriction might produce the same benefits, or at least some of the same benefits. The fasting mice were not on a calorie-restricted diet per se. They got to pig out every other day and ended up eating as much as (or more than) the well-fed mice. I don't know whether this every-other-day diet thing would be easier than the big-salad-a-day diet, but it might be worth a try. In fact, I did try an eat-every-other-day diet for about 10 days to three weeks back in the early eighties. I can't remember whether it worked very well. All I remember is that I was dizzy all the time.

However, there's something else in this new research:

...Dr. Mattson and his colleagues found mice that were fasted every other day but were allowed to eat unlimited amounts on intervening days had lower blood glucose and insulin levels than either a control group, which was allowed to feed freely, or a calorically restricted group, which was fed 30 percent fewer calories daily than the control group....[emphasis added]

Dr. Mattson's team found that nerve cells of the meal-skipping mice were more resistant to neurotoxin injury or death than nerve cells of the mice on either of the other diets.

The research implies that the control of blood glucose and insulin levels is what provided the mice with their enhanced resistance to diabetes. This would be some vindication for Dr. Atkins, since getting these levels under control is at the core of his diet. So is it possible that a low-carb diet would provide the same diabetes-resistance as eating every other day? Maybe so. What I find intriguing is the possibility that this same moderation of insulin and glucose levels might have provided the mice with their resistance to neurotoxin injury and the death of nerve cells. I have to be careful to point out that there's nothing in the research that makes this connection directly, but what if?

If such a link could be demonstrated, we would be well on our way to having people seek not only to lose weight, but actually to extend their lives by eating heavy cream and pork chops. Wherever Dr. Atkins is now, I have a feeling he's smiling.

Posted by Phil at 10:10 AM | Comments (1) | TrackBack

January 02, 2004

10 Predictions

The holidays have slowed us down a bit in our ongoing efforts to describe what the world might be like in the future, but never fear: Zombyboy presents 10 juicy predictions for the coming year. I'll just share one:

10. The serious blogging community will continue to grow, becoming something a little more like journalism and a little less like high school.

I don't know...for some reason, neither of those alternatives sounds particularly attractive. Journalism is an ism and I'm not fond of isms. On the other hand, I truly hated high school. I would propose the following:

10. The not-overly-serious blogging community will continue to grow, becoming something a little more like a huge, never-ending cocktail party with lots of interesting guests and a little less like high school.

It's just a thought.

UPDATE: Speaking of cocktails, Stephen Green has published a list of 50 predictions for the new year. I'll share just one of those:

Colorado will name the Citron martini "The Official State Cocktail."

Dream on, Steve. The official state cocktail is and always will be whatever you have handy to wash down those Rocky Mountain Oysters.

While visiting VodkaPundit, be sure to check out the most concise and accurate summation of 2003 available anywhere.

Posted by Phil at 11:50 AM | Comments (0) | TrackBack

Why it Works

Calorie restriction, that is.

I always thought this was kind of a bad deal. Starve yourself by eating a paltry few hundred calories a day (of very nutritious stuff like beets and kale and so forth) and take a lot of supplements and you have a good shot at significantly lengthening your lifespan. Great, but who wants to make it to 120 eating kale and beets?

Full disclosure: I love beets, especially the pickled kind. But you get my point. What do you get, like 400 calories? Hell, I eat more than that in the way of little "samples" that I take while cooking dinner! I'd never make it on the calorie restriction regime. I'm just not cut out for it, and I doubt that many are.

Well now, via Ray Kurzweil, here's some good news. Researchers at MIT have identified an enzyme, SIR2, whose activity rises when calories are restricted.

In previous research, Guarente found that rather than a slower metabolism leading to a slower rate of respiration, it turns out that respiration in yeast cells under calorie restriction goes up, not down. "A high respiration rate is intimately connected with calorie restriction in yeast," he said. "A high respiration rate activates SIR2. When respiration goes up, NADH goes down and SIR2 goes up. When SIR2 goes up, longevity happens."

This is good news, but these are early results. First off, the findings apply only to yeast. (Although it can be surprising to learn how closely related we humans are to what we would normally consider much lower forms of life.) Secondly, we're a long way from finding a way to increase SIR2 levels without the rabbit-food regimen.

But at least now we know what we're looking for.

Posted by Phil at 08:55 AM | Comments (4) | TrackBack

January 01, 2004

7Q's 2004 Edition

I'm updating my Seven Questions About the Future in order to keep them current with our ever-changing world. As of today, Question 7 will read as follows:

7. Why is it that in the year 2004 I still don't have a flying car? When do you think I'll be able to get one?

Please make a note of the change. Thank you.

Posted by Phil at 11:35 AM | Comments (6) | TrackBack

Birthday Greetings

I guess I wasn't particularly Quick about getting this up here, but happy birthday to the blogosphere! The terrible twos begin...

Posted by Phil at 11:00 AM | Comments (0) | TrackBack

Preventing Alzheimer's

I missed this the other day on FuturePundit: Myelin Cholesterol and Iron Build-Up Leads To Alzheimer's.

As the brain continues to develop in adulthood and as myelin is produced in greater and greater quantities, cholesterol levels in the brain grow and eventually promote the production of a toxic protein that attacks the brain. The protein attacks myelin, disrupts message transfer through the axons and eventually leads to the brain/mind-destroying plaques and tangles visible years later in the cortex of Alzheimer's patients.

The good news:

Preventive therapies worth investigating include cholesterol- and iron-lowering medications, anti-inflammatory medications, diet and exercise programs and possibly hormone replacement therapy designed to prevent menopause rather than simply ease the symptoms. In addition, education or other activities designed to keep the mind active may stimulate the production of myelin. Finally, there may be ways to address genetic and environmental factors that accelerate the degeneration process.

The not-so-great news:

This new model of brain development and degeneration suggests that the best time to address the inevitability of myelin breakdown is when it begins, in middle age. By the time the effects of Alzheimer's disease become apparent in a patient's 60s, 70s or 80s, it may be too late to reverse the course of the disease.

So let's get on it, then, shall we? What is it they say in the war blogs?

Faster, please.

Posted by Phil at 09:16 AM | Comments (2) | TrackBack

Farewell to Time

On this New Year's Day, why should we be content merely to say goodbye to the year just ended when we can say goodbye to the notion of time itself?

Brian Greene pens a lengthy op-ed in today's New York Times on the nature of time, and how our understanding of it has changed. According to Greene, our understanding is likely to change even more in the years to come:

Today's scientists seeking to combine quantum mechanics with Einstein's theory of gravity (the general theory of relativity) are convinced that we are on the verge of another major upheaval, one that will pinpoint the more elemental concepts from which time and space emerge. Many believe this will involve a radically new formulation of natural law in which scientists will be compelled to trade the space-time matrix within which they have worked for centuries for a more basic "realm" that is itself devoid of time and space.

This is such a perplexing idea that grasping it poses a substantial challenge, even for leading researchers. Broadly speaking, scientists envision that there will be no mention of time and space in the basic equations of the sought-for framework. And yet — just as clear, liquid water emerges from particular combinations of an enormous number of H2O molecules — time and space as we know them would emerge from particular combinations of some more basic, though still unidentified, entities. Time and space themselves, though, would be rendered secondary, derivative features that emerge only in suitable conditions (in the aftermath of the Big Bang, for example). As outrageous as it sounds, to many researchers, including me, such a departure of time and space from the ultimate laws of the universe seems inevitable.

The notion of describing the operation of the universe without reference to space or time put me in mind of Julian Barbour's The End of Time, a book which purports to do that very thing. Barbour describes space and time emerging as a waveform through a vast configuration space that contains what we might think of as perfect 3D still-life renderings of the universe. Every possible configuration of the universe exists in this space, and we might be tempted to think of time as stop-motion animation. Experiencing the configurations sequentially creates the illusion of the passage of time (and of motion, which Barbour also claims doesn't really exist, at least not the way we think it does.)

These stop-motion frames start to sound a little like the flip-book picture of time that Greene refutes in his op-ed:

For example, if you and I were sitting next to each other, our freeze-frame images of the present would be identical. But were you to start walking, the mathematics of relativity shows that the subsequent pages of your flip-book would rotate so that each one of your new pages would angle across many of mine; what you'd consider one moment in time — your new notion of the present — would include events I'd claim to have happened at different times, some earlier and some later.
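Greene's rotating flip-book pages are the relativity of simultaneity, and the effect is easy to compute directly from the Lorentz transformation. A toy calculation of my own (with the walking observer sped up to 0.6c to make the rotation visible):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_t(t, x, v, c=C):
    """Time coordinate of event (t, x) as seen from a frame moving at velocity v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * (t - v * x / c**2)

v = 0.6 * C  # the "walking" observer, exaggerated to 60% of light speed

# Two events the seated observer calls simultaneous (both at t = 0),
# separated by one light-second of distance
tA = lorentz_t(0.0, 0.0, v)      # event here
tB = lorentz_t(0.0, C * 1.0, v)  # event one light-second away

print(tA, tB)  # 0.0 and about -0.75: for the mover, event B happened earlier
```

One page of my flip-book — the two events at my t = 0 — angles across three-quarters of a second of the moving observer's pages, exactly the rotation Greene describes.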

Greene's objections to a stop-motion universe are well-taken when applied to a literal freeze-frame model, although I don't think that's exactly what Barbour proposes. The freeze-frame imagery is just a handy way to visualize the ideas in approximation. I'm pretty sure that Barbour wouldn't approve of my stop-motion animation analogy, relying as it does on some notion of an external clock. In Barbour's model, there are no clocks, and there is no time, no space (as we think of it), and no motion. These are all "optical illusions" that derive from the clustering of more and less probable configurations within the configuration space.

The problem with adopting this kind of model is that it flies directly in the face of experience. It will require an enormous change of perspective — bigger even than when Galileo told us that the sun doesn't actually rise or set — for us to accept the idea that time doesn't really exist. Still, whether through Barbour's model or some other, Brian Greene seems quite convinced that this is a change of perspective we will all eventually have to make.

What better time to start could there be than the beginning of a new year?

Posted by Phil at 08:55 AM | Comments (3) | TrackBack