The world lies buried in the darkness of the abyss in the instant before daybreak
Friday, August 31, 2007
Thursday, August 30, 2007
Quote on Religion
------ Heinrich Heine, Gedanken und Einfälle
Interpol -- Untitled
Wednesday, August 29, 2007
Modest Mouse -- Float On
Chinese or Western, Any Medicine That Cures Is Good Medicine
(Published in Lianhe Zaobao in 2005. The article drew many protests and responses at the time, especially from supporters of traditional Chinese medicine.)
Some of my friends are staunch supporters of traditional Chinese medicine (TCM) and insist on seeing a TCM practitioner whenever they fall ill. Their reasons can roughly be summed up in two points: (1) Chinese medicine has five thousand years of history; (2) Western medicine cannot compare with Chinese medicine, for the former merely suppresses symptoms with chemical compounds, while the latter cures disease by regulating the imbalanced "qi" within the body.
The Taiwanese writer Li Ao once criticized TCM as mere "improper practice" when measured against orthodox scientific medicine. Of course, Li Ao was not denouncing TCM as entirely useless. What he condemned was that so-called TCM theory does not accord with scientific principles: "Any drug that 'science' has not tasted, whether it was tasted by Shennong or by the Yellow Emperor, must count as a 'fake drug'." Anyone familiar with basic biology also knows that the notion of "qi" within the body was discarded by mainstream biologists some two hundred years ago, and that biology and medicine advanced precisely because of it. That a "reject" others have thrown away is treated here as a vital "principle of healing" is truly laughable.
Does this mean Chinese medicine is completely ineffective? Wasn't the efficacy of ephedra (mahuang) confirmed by research just a while ago? No, I have no wish to stir up an East-versus-West quarrel here; that framing is often just a misconception forced upon us by people with vested interests. Truth does not become truth because we pin a partisan label on it. In other words, a drug does not cure all ills because we call it "Chinese medicine" with five thousand years of history, nor is it effective simply because it is "Western medicine" imported from abroad. Whether a drug is effective depends on whether it can cure disease. How do we find out? As Li Ao said: let science have a taste.
A visit to the Ministry of Health's website shows its position on TCM to be roughly this: because TCM is part of Chinese culture, and because the public is in the habit of using Chinese medicine, the authorities treat TCM as a supplementary form of healthcare. In other words, they classify TCM as an "alternative medicine" outside orthodox medicine.
Yet "culture", "habit" and "tradition" cannot cure disease, just as the very notion of a supplementary "alternative medicine" is self-contradictory. The British scholar Richard Dawkins has pointed out that so-called alternative medicine is a contradiction in terms: "There should be no distinction between 'orthodox' and 'alternative' therapies in medicine, because the standard by which we judge any therapy should be the same. Any therapy that passes orthodox science's double-blind trials should be adopted without hesitation, no matter where it comes from."
Recently the Ministry of Health announced that it is considering relaxing its regulation of acupuncture, so that patients in "orthodox" medical facilities can receive "traditional" acupuncture treatment. Yet many studies abroad have found acupuncture ineffective for a great many conditions. Acupuncture has been shown to produce mild anaesthesia, but it is neither as effective nor as reliable as anaesthetic drugs. Anaesthetics do carry many side effects, so if acupuncture anaesthesia were really so miraculous, why hasn't everyone flocked to it? Like the classification of TCM as supplementary alternative medicine, this decision baffles me.
I must reiterate here that I am no worshipper of the West, nor am I saying that all Chinese medicine is ineffective. I believe only in results, in efficacy, and in truth. It is an indisputable fact that TCM's guiding therapeutic principle of "qi" is built on erroneous biological theory. I have often thought that those who insist on traditional treatment are buying not medical care but some inexplicable sense of cultural pride.
Perhaps it is precisely this inexplicable cultural pride that blinds so many eyes, leaving us unable to see the many self-contradictions in our society.
Modest Mouse -- Invisible
Antony and the Johnsons -- Hope There Is Someone
America to the Rescue!!!
The Daily Show with Jon Stewart - The Most Trusted Name in Fake News
The Withering of a Love
Their flawless love, like the tenderness of a black-and-white film
Roses on a Valentine's Day that promised eternity
Sealed the cruelty of reality away from her life
So that his every breath, like a breeze, soothed her in her loneliness
Their flawless love, like the roses of a black-and-white film
On Valentine's Day sealed the cruelty of reality away from her life
So that his every breath soothed her in her loneliness
He, heartless as a black rose
A lover's cruel parting
So that loneliness soothed her
He pitied cruelty
Parting left her lonely
He pitied her
He her
Tuesday, August 28, 2007
Out of Your Body...
===========================================================================
By SANDRA BLAKESLEE
Published: August 23, 2007
Using virtual reality goggles, a camera and a stick, scientists have induced out-of-body experiences — the sensation of drifting outside of one’s own body — in healthy people, according to experiments being published in the journal Science.
When people gaze at an illusory image of themselves through the goggles and are prodded in just the right way with the stick, they feel as if they have left their bodies.
The research reveals that “the sense of having a body, of being in a bodily self,” is actually constructed from multiple sensory streams, said Matthew Botvinick, an assistant professor of neuroscience at Princeton University, an expert on body and mind who was not involved in the experiments.
Usually these sensory streams, which include vision, touch, balance and the sense of where one’s body is positioned in space, work together seamlessly, Prof. Botvinick said. But when the information coming from the sensory sources does not match up, when they are thrown out of synchrony, the sense of being embodied as a whole comes apart.
The brain, which abhors ambiguity, then forces a decision that can, as the new experiments show, involve the sense of being in a different body.
The research provides a physical explanation for phenomena usually ascribed to other-worldly influences, said Peter Brugger, a neurologist at University Hospital in Zurich, Switzerland. After severe and sudden injuries, people often report the sensation of floating over their body, looking down, hearing what is said, and then, just as suddenly, find themselves back inside their body. Out-of-body experiences have also been reported to occur during sleep paralysis, the exertion of extreme sports and intense meditation practices.
The new research is a first step in figuring out exactly how the brain creates this sensation, he said.
The out-of-body experiments were conducted by two research groups using slightly different methods intended to expand the so-called rubber hand illusion.
In that illusion, people hide one hand in their lap and look at a rubber hand set on a table in front of them. As a researcher strokes the real hand and the rubber hand simultaneously with a stick, people have the vivid sense that the rubber hand is their own.
When the rubber hand is whacked with a hammer, people wince and sometimes cry out.
The illusion shows that body parts can be separated from the whole body by manipulating a mismatch between touch and vision. That is, when a person’s brain sees the fake hand being stroked and feels the same sensation, the sense of being touched is misattributed to the fake.
The new experiments were designed to create a whole body illusion with similar manipulations.
In Switzerland, Dr. Olaf Blanke, a neuroscientist at the École Polytechnique Fédérale in Lausanne, asked people to don virtual reality goggles while standing in an empty room. A camera projected an image of each person taken from the back and displayed 6 feet away. The subjects thus saw an illusory image of themselves standing in the distance.
Then Dr. Blanke stroked each person’s back for one minute with a stick while simultaneously projecting the image of the stick onto the illusory image of the person’s body.
When the strokes were synchronous, people reported the sensation of being momentarily within the illusory body. When the strokes were not synchronous, the illusion did not occur.
In another variation, Dr. Blanke projected a “rubber body” — a cheap mannequin bought on eBay and dressed in the same clothes as the subject — into the virtual reality goggles. With synchronous strokes of the stick, people’s sense of self drifted into the mannequin.
A separate set of experiments was carried out by Dr. Henrik Ehrsson, an assistant professor of neuroscience at the Karolinska Institute in Stockholm, Sweden.
Last year, when Dr. Ehrsson was, as he says, “a bored medical student at University College London”, he wondered, he said, “what would happen if you ‘took’ your eyes and moved them to a different part of a room? Would you see yourself from where your eyes were placed? Or from where your body was placed?”
To find out, Dr. Ehrsson asked people to sit on a chair and wear goggles connected to two video cameras placed 6 feet behind them. The left camera projected to the left eye. The right camera projected to the right eye. As a result, people saw their own backs from the perspective of a virtual person sitting behind them.
Using two sticks, Dr. Ehrsson stroked each person’s chest for two minutes with one stick while moving a second stick just under the camera lenses — as if it were touching the virtual body.
Again, when the stroking was synchronous people reported the sense of being outside their own bodies — in this case looking at themselves from a distance where their “eyes” were located.
Then Dr. Ehrsson grabbed a hammer. While people were experiencing the illusion, he pretended to smash the virtual body by waving the hammer just below the cameras. Immediately, the subjects registered a threat response as measured by sensors on their skin. They sweated and their pulses raced.
They also reacted emotionally, as if they were watching themselves get hurt, Dr. Ehrsson said.
People who participated in the experiments said that they felt a sense of drifting out of their bodies but not a strong sense of floating or rotating, as is common in full-blown out of body experiences, the researchers said.
The next set of experiments will involve decoupling not just touch and vision but other aspects of sensory embodiment, including the felt sense of the body position in space and balance, they said.
A representation of one of the scenarios that scientists used to study out-of-body experiences.
Sunday, August 26, 2007
Bill Maher's New Rules -- 24th August 2007
Another new season of Real Time with Bill Maher, another bunch of New Rules....
Saturday, August 25, 2007
Michael Shermer Interview
Wednesday, August 22, 2007
The Fake Can Never Become Real?
In recent months, two news-fabrication scandals broke in quick succession: the gangster-intimidation DVD in Taiwan and the cardboard-stuffed pork buns in China. The workings and integrity of the media have consequently become subjects of public attention and debate. Beyond asking whether these incidents stemmed from individual greed or from an unhealthy culture fostered by entire media organizations, people are also wondering whether the quality of news reporting can be maintained when chasing ratings is the guiding principle.
Of course, these two fabrications are not unprecedented; a few years ago Taiwan had its own fake-news scandal over "foot-end rice" (funeral food offerings allegedly recycled into braised pork rice). What these cases share is that their fabricated content was sensational, provoking a measure of public outrage, doubt and attention when reported. Because pork buns made from cardboard and braised pork rice made from funeral offerings sounded too incredible, and because hawkers protested, the truth finally emerged when others followed up on the stories.
That these fake stories were eventually exposed was only to be expected, and in a way it even bolstered public confidence in the news media, since the fabricators were punished. But when it comes to the danger of the media being awash with false information, such sensational stories are not what worry me most. What I find most frightening are claims that sound plausible but are not. These unsensational items are handled as ordinary news, seep imperceptibly into public consciousness, and, because they are rarely debunked, end up being taken for common knowledge.
The fake-news reports reminded me of an article I read in a local newspaper more than a year ago, about a foreign group preparing to offer courses here to treat autism and language disorders. The group's spokesperson claimed that since autism and language disorders are caused by brain abnormalities, and since research shows that training can alter the structure of the brain's cortex, they had developed a set of activities targeting the abnormal regions of patients' brains to help them along the road to recovery.
Reading the article, I sensed something was off. I am no expert on autism or language disorders, but within my limited knowledge of the brain and these conditions, the spokesperson's claims sounded dubious. Moreover, even if it were true both that autism and language disorders stem from brain abnormalities and that humans can alter the structure of the cortex through training, it would not follow that training can correct the abnormal regions of the brain.
To settle my doubts, I emailed a friend who works at a speech-and-language-disorder association and asked about the group's courses. Sure enough, she told me the association did not endorse the group's methods, because their efficacy had never been demonstrated. Her answer only raised more questions in my mind. Why had the newspaper not sought a second opinion? Was it deadline pressure that left no time for verification, or do reporters simply never question plausible-sounding news of this kind?
The case above is not the only one I have encountered while reading the papers or watching news and current-affairs programmes. Many of the educational packages and personal-potential courses flooding the market suffer the same flaw: they sound plausible to a degree, but the evidence supporting their claims is scant. Yet time and again I see similar reports in the news, or watch the vendors of these courses and packages being interviewed as "experts" on current-affairs programmes, their unsubstantiated claims discussed as though they were common knowledge.
Michael Shermer, founder of the American magazine Skeptic, once said that the hardest claims to deal with are not the outlandish and unbelievable ones but those hovering between truth and absurdity, because the public tends to accept them without a second thought. Even when evidence later shows such claims to be false, people do not change their minds, for the misinformation has already taken root in the public consciousness. Compared with fake news like the cardboard pork buns, exposed precisely because it was shocking, the social harm done by unsensational falsehoods seeping into public consciousness is surely far greater.
Friday, August 17, 2007
Rainy Day (雨天) --- 杨宗纬
Friday, August 10, 2007
Thursday, August 09, 2007
TDS on Windfarms
I am always amazed at how The Daily Show with Jon Stewart can come up with all these "interviews" exposing the hypocrisy of politicians.... They keep saying that they are not a news show, but I think they are much better than most news shows...
The Daily Show with Jon Stewart - The Most Trusted Name in Fake News
Wednesday, August 08, 2007
This Week In God: Mormon Edition
The Daily Show with Jon Stewart - The Most Trusted Name in Fake News
Tuesday, August 07, 2007
Lewis Black Interview on The Progressive
=================================================================
Lewis Black Interview
By Antonino D’Ambrosio
April 2007 Issue
There is an old southern Italian saying: dobbiamo ridere per mantenere via le rotture (we must laugh to keep away the tears). Comedian Lewis Black not only believes this, but also practices it every day by making people laugh at the absurdity and hypocrisy that dominate modern American politics. As one of America’s foremost social satirists, Black caught the American public’s attention with his volcanic, hands-trembling, ticking-time-bomb-like social “commentaries on everything” in the “Back in Black” segment on Comedy Central’s The Daily Show with Jon Stewart. Here’s a sample about the Enron scandal: “You don’t want another Enron? Here’s the law: If you have a company, and you can’t explain, in one sentence, what the fuck it does, it’s illegal!”
Lewis Black was born fifty-six years ago to a middle class Jewish family in Washington, D.C. During the McCarthy Era and the Vietnam War, Black’s mother, father, and grandfather would condemn—loudly and outrageously—the government’s misuse and abuse of power. He told me his dad would say, during Vietnam, “If I knew it was going to be like this, I would have stayed in Russia.”
As a young man, Black first turned his creative sights on theater, namely as a playwright. Influenced by the likes of Beolco’s commedia dell’arte, Moliere, and Nobel Prize-winning satirist Dario Fo, Black would tackle every genre of theater, ultimately writing forty plays. While serving as playwright-in-residence and associate artistic director of the West Bank Cafe Downstairs Theatre Bar in Hell’s Kitchen, New York City, Black began performing stand-up as an opening act and master of ceremonies before each play. Black’s theatrical background gave him a unique edge: He cast himself as both prankster and prophet in his own one-act show.
Black has recorded five albums to date, winning a 2007 Grammy for Best Comedy Album for The Carnegie Hall Performance. He is also a bestselling author (Nothing’s Sacred), star of two HBO specials, and an actor appearing in a number of films, from Hannah and Her Sisters to Man of the Year.
Fittingly, I first met Black at an event in New York City honoring Lenny Bruce and free speech. Some of America’s top comedians were slated to perform, including Sarah Silverman. Black was the night’s emcee but did perform a show-ending monologue that brought the packed house to its feet. Afterwards, when I spoke to him, he was not the raving on-stage persona that has caused many to worry for his physical and mental health but a subdued, thoughtful man who graciously offered to meet with me to discuss his work.
Q: You have often said that you don’t consider yourself a political comic but a social satirist. Why?
Lewis Black: The overall theme for me is social satire because my setup is information. I start with the person making a dopey statement like former Senator Rick Santorum saying that gay marriage and homosexuality are a threat to the American family. Then I tell the real story: How is this a threat to the American people? It’s a prejudice to believe that. It’s the same thing as remarks about Jews drinking the blood of Christian babies during Passover.
But the media doesn’t really report on things in a detailed and thoughtful way that allows people to understand what a particular piece of news is really about. For example, Dick Cheney took millions from Halliburton and put it in a trust. The Administration quickly came to Cheney’s defense, explaining that there is nothing really wrong with this and that there is no conflict of interest. The joke here is all set up through the information: The Vice President is the former CEO of Halliburton, a company that is receiving huge defense contracts for the war in Iraq. You either get to be Vice President or you don’t. You either get to keep the money or you don’t. So, by simple observational social satire you explain this to the audience and expose that yes, there is a clear conflict of interest.
Q: You describe yourself as a socialist.
Black: From the time I was kid, I saw the broader context of how we live here in the U.S. When I was twelve, I saw Edward R. Murrow’s Harvest of Shame and that was it. It led me to uncover the image versus the reality of how people live. I then learned to pronounce “apartheid” and saw the treatment of blacks here in this country as they struggled for civil rights. It made me question deeply and ask myself: How can people like migrant workers who are helping us eat not have a pot to piss in? I started learning about countries that have a “share-the-wealth” system and I said to myself, “There is nothing wrong with that. This makes sense.”
Capitalism’s problem is that it has nothing to say about how to combat greed. For all the moralizing this country does, people don’t get it: They’re greedy. And it’s gotten worse in my lifetime. You don’t even have to have socialism. I am talking about minimal things. Put money aside to fund playgrounds and high school football teams. Are you kidding me? The Grammy Awards has to make a plea to keep music in schools? I mean, what planet are we on? I guess I am asking another question in my work as well: What happened?
Q: What do you think happened?
Black: The false needs like the third house, the fourth boat, the most expensive hoo-ha. I might do, for me, two or three over-the-top things a year like get on a boat and take a cruise. I do that for a week and that’s enough. And now I have people that help me manage my money and recently my accountant said, “You need to do this to pay less in taxes.” My response was that I have been waiting my whole life to pay taxes. This is how it’s supposed to work. This is how we are able to fund the things that make this country work—like roads and schools.
Q: Your rise to the top of American comedians has coincided with Bush’s time in office. Is there a connection?
Black: Every day these guys do something more outrageous and full of hubris. There is a huge amount of material to pull from. You can’t even get to the joke. It could take weeks. I just start yelling about this stuff and there is my act. Like with this “nonbinding resolution” thing. What does that mean, nonbinding resolution? How did they even come up with that word? For me, it’s like if you went to a doctor for diarrhea and he prescribed Ex-Lax—now that’s nonbinding.
Q: What do you see as the main problem of this Administration?
Black: As much as the problem is the Iraq War, there is a bigger problem: There are unqualified, incompetent people within the government bureaucracy. There are people who spend their entire lives working within government and are experienced and know how to run FEMA, the FDA, the EPA. You can find these people and they will do their jobs well. It’s the Administration’s job to find them and put them in these positions. This is what is called maintenance. What fell apart is maintenance. I lived around the government when I was a kid and I know it’s got many problems but you don’t elect people who don’t like government because that’s what they’re in charge of.
Q: I know that Dick Cheney shooting someone in the face while quail hunting was your favorite moment in 2006. What are some other highlights?
Black: The Mark Foley scandal—that is a joke that tells itself. A Congressman who is on the committee to protect children from sexual predators is himself a sexual predator. The flag burning amendment last summer was really absurd. We have five million other things to worry about and this is what Congress is going to spend their time on? Ridiculous. I mean, were people running out of briquettes for barbeques and began using flags? Then there’s the proposal to build a wall on the Mexican border but they don’t vote on the money to build the wall. And if we can’t build levees and homes anymore for people who have been displaced by Katrina, what’s our ability to build a wall? And of course the natural, obvious joke is, they are going to use illegal immigrants to build the wall. Well, wouldn’t you know that a company down there got busted for really using illegal immigrants to build the wall? It just goes on and on and on.
Q: How do you respond to those politicians who trot out the excuse for Iraq: “If we knew then what we know now”?
Black: Oh boy. Really seems to be a standard line of the Democrats who voted for war, like Hillary. It just screams of incompetence. It’s cowardly. What did they need to know? But beyond that, what was the reason to attack him at that point in time anyway? There was no reason, none at all. They could have spent a year working on it. They could’ve trained people to actually speak the language.
Q: What is your take on Hillary’s Presidential bid?
Black: I can’t do it. The first time around they did a basically decent job of running government and keeping a stable atmosphere in the country. But they created a psychotic whirlwind around themselves, and I’m not ready to go back to that psychotic whirlwind. And people may have forgotten, but Hillary destroyed any possibility of a government-run health care program. She should go away for a little bit. Even Nixon went away for a little bit. You don’t get things the first go around. You have to work for it. And granted, living with Bill is working for it, but it’s not enough.
Q: And Obama?
Black: Would a few more years’ experience hurt? I mean, he beat Alan Keyes, for God’s sake. A Doberman could’ve beaten Keyes. Now, I think Obama’s very good and very smart and may bring something fresh, but I am not comfortable with anyone that age taking office in this current political climate. Ultimately, what I would really like to see is a Republican and a Democrat crossing over lines and running together and saying, “Fuck you!” to the status quo.
Q: Who would you see doing this?
Black: I put Senator Chuck Hagel, who I think really gets it, along with, I don’t know, pick one. They’re all barking the same tune. The thing that’s great about Hillary and Obama—a woman and a black man—is that we can save face a bit with the rest of the world. But is the country ready for a black President? Are you kidding me? I’m in Greenville, South Carolina, and I haven’t seen a black face yet. I don’t know where they are—I think they killed them.
Q: Speaking about South Carolina, what about McCain?
Black: He gave up the ghost. He had his shot but Bush beat him senseless in South Carolina and now he has put his tongue so far up Bush’s ass it’s incredible. I can’t trust someone like that. If you want the Presidency that much, you don’t get it. On top of that, this kowtowing has allowed him to lose all of those things that made him interesting to both sides.
Q: Giuliani?
Black: You got to be fucking kidding me! He was my mayor and that was enough. Trust me. His arrogance as mayor was awful. “It’s me, it’s me, look at me.” I’m sick of it.
Q: How do you develop your manic on-stage persona?
Black: I think out my routine and it’s exhausting. I only have seventy-five minutes or so to get it out so I go all the way and just go nuts. It’s like a workout for me. To tell you the truth, I don’t even know what I’m thinking when I’m on stage because I’m jumping up and down, yelling like a lunatic. I keep saying to people that the next time if I come back as a comedian I’m going to do it from a gurney with an IV drip.
Q: Many people are finding real news through fake news with shows like The Daily Show and The Colbert Report. Why do you think that is?
Black: I call this news by default. It starts with things like when The New York Times had to apologize for the lack of research they did reporting on the Iraq War. Whether anyone likes it or not, they are the paper of record and it was their responsibility to pay attention, and the reality is that they didn’t. And now it’s started again as they start quoting these “sources” in regards to Iran’s weapons. They report, “unnamed military sources.” I can’t believe they are doing this again. The New York Times said, and this is extraordinary, that this is a much better presentation this time around.
Q: Finally, you are often criticized, just as Richard Pryor was, more for your use of profanity than the content of your work. Pryor’s response to this criticism was, “A lie is profanity. . . . A lie is the worst thing in the world. Art is the ability to tell the truth.”
Black: In the end, I don’t set out to do this. What makes me funny is my anger. I find many things every day that set me off. I really don’t search for the lies because nowadays you don’t have to look hard.
Antonino D’Ambrosio is a writer and filmmaker based in New York City. He is the author of “Let Fury Have the Hour: The Punk Rock Politics of Joe Strummer,” soon to be a documentary produced by Tim Robbins and Amnesty International. He is also the author of the upcoming “Politics in the Drums: A People’s History of Political Popular Culture.” And he is the founder of La Lutta NMC (www.lalutta.org), a nonprofit documentary production group.
Science and the Islamic World
This is an article from Physics Today (here), which discusses science in the Islamic world.
==========================================================================
By Pervez Hoodbhoy
This article grew out of the Max von Laue Lecture that I delivered earlier this year to celebrate that eminent physicist and man of strong social conscience. When Adolf Hitler was on the ascendancy, Laue was one of the very few German physicists of stature who dared to defend Albert Einstein and the theory of relativity. It therefore seems appropriate that a matter concerning science and civilization should be my concern here.
The question I want to pose—perhaps as much to myself as to anyone else—is this: With well over a billion Muslims and extensive material resources, why is the Islamic world disengaged from science and the process of creating new knowledge? To be definite, I am here using the 57 countries of the Organization of the Islamic Conference (OIC) as a proxy for the Islamic world.
It was not always this way. Islam's magnificent Golden Age in the 9th–13th centuries brought about major advances in mathematics, science, and medicine. The Arabic language held sway in an age that created algebra, elucidated principles of optics, established the body's circulation of blood, named stars, and created universities. But with the end of that period, science in the Islamic world essentially collapsed. No major invention or discovery has emerged from the Muslim world for well over seven centuries now. That arrested scientific development is one important element—although by no means the only one—that contributes to the present marginalization of Muslims and a growing sense of injustice and victimhood.
Such negative feelings must be checked before the gulf widens further. A bloody clash of civilizations, should it actually transpire, will surely rank along with the two other most dangerous challenges to life on our planet—climate change and nuclear proliferation.
First encounters
Islam's encounter with science has had happy and unhappy periods. There was no science in Arab culture in the initial period of Islam, around 610 AD. But as Islam established itself politically and militarily, its territory expanded. In the mid-eighth century, Muslim conquerors came upon the ancient treasures of Greek learning. Translations from Greek into Arabic were ordered by liberal and enlightened caliphs, who filled their courts in Baghdad with visiting scholars from near and far. Politics was dominated by the rationalist Mutazilites, who sought to combine faith and reason in opposition to their rivals, the dogmatic Asharites. A generally tolerant and pluralistic Islamic culture allowed Muslims, Christians, and Jews to create new works of art and science together. But over time, the theological tensions between liberal and fundamentalist interpretations of Islam—such as on the issue of free will versus predestination—became intense and turned bloody. A resurgent religious orthodoxy eventually inflicted a crushing defeat on the Mutazilites. Thereafter, the open-minded pursuits of philosophy, mathematics, and science were increasingly relegated to the margins of Islam.1
Figure 1
The 20th century witnessed the end of European colonial rule and the emergence of several new independent Muslim states, all initially under secular national leaderships. A spurt toward modernization and the acquisition of technology followed. Many expected that a Muslim scientific renaissance would ensue. Clearly, it did not.
What ails science in the Muslim world?
Figure 2
Is boosting resource allocations enough to energize science, or are more fundamental changes required? Scholars of the 19th century, such as the pioneering sociologist Max Weber, claimed that Islam lacks an "idea system" critical for sustaining a scientific culture based on innovation, new experiences, quantification, and empirical verification. Fatalism and an orientation toward the past, they said, make progress difficult and even undesirable.
In the current epoch of growing antagonism between the Islamic and the Western worlds, most Muslims reject such charges with angry indignation. They feel those accusations add yet another excuse for the West to justify its ongoing cultural and military assaults on Muslim populations. Muslims bristle at any hint that Islam and science may be at odds, or that some underlying conflict between Islam and science may account for the slowness of progress. The Qur'an, being the unaltered word of God, cannot be at fault: Muslims believe that if there is a problem, it must come from their inability to properly interpret and implement the Qur'an's divine instructions.
In defending the compatibility of science and Islam, Muslims argue that Islam had sustained a vibrant intellectual culture throughout the European Dark Ages and thus, by extension, is also capable of sustaining a modern scientific culture. The Pakistani physics Nobel laureate Abdus Salam would stress to audiences that one-eighth of the Qur'an is a call for Muslims to seek Allah's signs in the universe and hence that science is a spiritual as well as a temporal duty for Muslims. Perhaps the most widely used argument one hears is that the Prophet Muhammad had exhorted his followers to "seek knowledge even if it is in China," which implies that a Muslim is duty-bound to search for secular knowledge.
Such arguments have been and will continue to be much debated, but they will not be pursued further here. Instead, let us seek to understand the state of science in the contemporary Islamic world. First, to the degree that available data allows, I will quantitatively assess the current state of science in Muslim countries. Then I will look at prevalent Muslim attitudes toward science, technology, and modernity, with an eye toward identifying specific cultural and social practices that work against progress. Finally, we can turn to the fundamental question: What will it take to bring science back into the Islamic world?
Measuring Muslim scientific progress
The metrics of scientific progress are neither precise nor unique. Science permeates our lives in myriad ways, means different things to different people, and has changed its content and scope drastically over the course of history. In addition, the paucity of reliable and current data makes the task of assessing scientific progress in Muslim countries still harder.
I will use the following reasonable set of four metrics:
- The quantity of scientific output, weighted by some reasonable measure of relevance and importance;
- The role played by science and technology in the national economies, funding for S&T, and the size of the national scientific enterprises;
- The extent and quality of higher education; and
- The degree to which science is present or absent in popular culture.
Scientific output
A useful, if imperfect, indicator of scientific output is the number of published scientific research papers, together with the citations to them. Table 1 shows the output of the seven most scientifically productive Muslim countries for physics papers, over the period from 1 January 1997 to 28 February 2007, together with the total number of publications in all scientific fields. A comparison with Brazil, India, China, and the US reveals significantly smaller numbers. A study by academics at the International Islamic University Malaysia2 showed that OIC countries have 8.5 scientists, engineers, and technicians per 1000 population, compared with a world average of 40.7, and 139.3 for countries of the Organisation for Economic Co-operation and Development. (For more on the OECD, see http://www.oecd.org.) Forty-six Muslim countries contributed 1.17% of the world's science literature, whereas 1.66% came from India alone and 1.48% from Spain. Twenty Arab countries contributed 0.55%, compared with 0.89% by Israel alone. The US NSF records that of the 28 lowest producers of scientific articles in 2003, half belong to the OIC.3
The situation may be even grimmer than the publication numbers or perhaps even the citation counts suggest. Assessing the scientific worth of publications—never an easy task—is complicated further by the rapid appearance of new international scientific journals that publish low-quality work. Many have poor editorial policies and refereeing procedures. Scientists in many developing countries, who are under pressure to publish, or who are attracted by strong government incentives, choose to follow the path of least resistance paved for them by the increasingly commercialized policies of journals. Prospective authors know that editors need to produce a journal of a certain thickness every month. In addition to considerable anecdotal evidence for these practices, there have been a few systematic studies. For example,4 chemistry publications by Iranian scientists tripled in five years, from 1040 in 1998 to 3277 in 2003. Many scientific papers that were claimed as original by their Iranian chemist authors, and that had been published in internationally peer-reviewed journals, had actually been published twice and sometimes thrice with identical or nearly identical contents by the same authors. Others were plagiarized papers that could have been easily detected by any reasonably careful referee.
The situation regarding patents is also discouraging: OIC countries produce a negligible number. According to official statistics, Pakistan has produced only eight patents in the past 43 years.
Islamic countries show a great diversity of cultures and levels of modernization and a correspondingly large spread in scientific productivity. Among the larger countries—in both population and political importance—Turkey, Iran, Egypt, and Pakistan are the most scientifically developed. Among the smaller countries, such as the central Asian republics, Uzbekistan and Kazakhstan rank considerably above Turkmenistan, Tajikistan, and Kyrgyzstan. Malaysia—a rather atypical Muslim country with a 40% non-Muslim minority—is much smaller than neighboring Indonesia but is nevertheless more productive. Kuwait, Saudi Arabia, Qatar, the UAE, and other states that have many foreign scientists are scientifically far ahead of other Arab states.
National scientific enterprises
Conventional wisdom suggests that bigger science budgets indicate, or will induce, greater scientific activity. On average, the 57 OIC states spend an estimated 0.3% of their gross national product on research and development, which is far below the global average of 2.4%. But the trend toward higher spending is unambiguous. Rulers in the UAE and Qatar are building several new universities with manpower imported from the West for both construction and staffing. In June 2006, Nigeria's president Olusegun Obasanjo announced he will plow $5 billion of oil money into R&D. Iran increased its R&D spending dramatically, from a pittance in 1988 at the end of the Iraq–Iran war, to a current level of 0.4% of its gross domestic product. Saudi Arabia announced that it spent 26% of its development budget on science and education in 2006, and sent 5000 students to US universities on full scholarships. Pakistan set a world record by increasing funding for higher education and science by an immense 800% over the past five years.
But bigger budgets by themselves are not a panacea. The capacity to put those funds to good use is crucial. One determining factor is the number of available scientists, engineers, and technicians. Those numbers are low for OIC countries, averaging around 400–500 per million people, while developed countries typically lie in the range of 3500–5000 per million. Even more important are the quality and level of professionalism, which are less easily quantifiable. But increasing funding without adequately addressing such crucial concerns can lead to a null correlation between scientific funding and performance.
The role played by science in creating high technology is an important science indicator. Comparing table 1 with table 2 shows there is little correlation between academic research papers and the role of S&T in the national economies of the seven listed countries. The anomalous position of Malaysia in table 2 has its explanation in the large direct investment made by multinational companies and in having trading partners that are overwhelmingly non-OIC countries.
[Figure 3]
Higher education
According to a recent survey, among the 57 member states of the OIC, there are approximately 1800 universities.5 Of those, only 312 publish journal articles. A ranking of the 50 most published among them yields these numbers: 26 are in Turkey, 9 in Iran, 3 each in Malaysia and Egypt, 2 in Pakistan, and 1 in each of Uganda, the UAE, Saudi Arabia, Lebanon, Kuwait, Jordan, and Azerbaijan. For the top 20 universities, the average yearly production of journal articles was about 1500, a small but reasonable number. However, the average citation per article is less than 1.0 (the survey report does not state whether self-citations were excluded). Fewer data are available for comparison against universities worldwide. Two Malaysian undergraduate institutions were in the top-200 list of the Times Higher Education Supplement in 2006 (available at http://www.thes.co.uk). No OIC university made the top-500 "Academic Ranking of World Universities" compiled by Shanghai Jiao Tong University (see http://ed.sjtu.edu.cn/en). This state of affairs led the director general of the OIC to issue an appeal for at least 20 OIC universities to be sufficiently elevated in quality to make the top-500 list. No action plan was specified, nor was the term "quality" defined.
An institution's quality is fundamental, but how is it to be defined? Providing more infrastructure and facilities is important but not key. Most universities in Islamic countries have a starkly inferior quality of teaching and learning, a tenuous connection to job skills, and research that is low in both quality and quantity. Poor teaching owes more to inappropriate attitudes than to material resources. Generally, obedience and rote learning are stressed, and the authority of the teacher is rarely challenged. Debate, analysis, and class discussions are infrequent.
Academic and cultural freedoms on campuses are highly restricted in most Muslim countries. At Quaid-i-Azam University in Islamabad, where I teach, the constraints are similar to those existing in most other Pakistani public-sector institutions. This university serves the typical middle-class Pakistani student and, according to the survey referred to earlier,5 ranks number two among OIC universities. Here, as in other Pakistani public universities, films, drama, and music are frowned on, and student vigilantes who believe that such pursuits violate Islamic norms sometimes even mount physical attacks. The campus has three mosques with a fourth one planned, but no bookstore. No Pakistani university, including QAU, allowed Abdus Salam to set foot on its campus, although he had received the Nobel Prize in 1979 for his role in formulating the standard model of particle physics. The Ahmedi sect to which he belonged, and which had earlier been considered to be Muslim, was officially declared heretical in 1974 by the Pakistani government.
[Figure 4]
One proclamation, reported in the Islamabad press, captures the climate:

"The government should abolish co-education. Quaid-i-Azam University has become a brothel. Its female professors and students roam in objectionable dresses. . . . Sportswomen are spreading nudity. I warn the sportswomen of Islamabad to stop participating in sports. . . . Our female students have not issued the threat of throwing acid on the uncovered faces of women. However, such a threat could be used for creating the fear of Islam among sinful women. There is no harm in it. There are far more horrible punishments in the hereafter for such women."6
The imposition of the veil makes a difference. My colleagues and I share a common observation that over time most students—particularly veiled females—have largely lapsed into becoming silent note-takers; they are increasingly timid and less inclined to ask questions or take part in discussions. This lack of self-expression and confidence leads most Pakistani university students, including those in their mid or late twenties, to refer to themselves as boys and girls rather than as men and women.
Science and religion still at odds
Science is under pressure globally, and from every religion. As science becomes an increasingly dominant part of human culture, its achievements inspire both awe and fear. Creationism and intelligent design, curbs on genetic research, pseudoscience, parapsychology, belief in UFOs, and so on are some of its manifestations in the West. Religious conservatives in the US have rallied against the teaching of Darwinian evolution. Extreme Hindu groups such as the Vishwa Hindu Parishad, which has called for ethnic cleansing of Christians and Muslims, have promoted various "temple miracles," including one in which an elephant-like god miraculously came alive and started drinking milk. Some extremist Jewish groups also derive additional political strength from antiscience movements. For example, certain American cattle tycoons have for years been working with Israeli counterparts to try to breed a pure red heifer in Israel, which, by their interpretation of chapter 19 of the Book of Numbers, will signal the coming of the building of the Third Temple,7 an event that would ignite the Middle East.
In the Islamic world, opposition to science in the public arena takes additional forms. Antiscience materials have an immense presence on the internet, with thousands of elaborately designed Islamic websites, some with view counters running into the hundreds of thousands. A typical and frequently visited one has the following banner: "Recently discovered astounding scientific facts, accurately described in the Muslim Holy Book and by the Prophet Muhammad (PBUH) 14 centuries ago." Here one will find that everything from quantum mechanics to black holes and genes was anticipated 1400 years ago.
Science, in the view of fundamentalists, is principally seen as valuable for establishing yet more proofs of God, proving the truth of Islam and the Qur'an, and showing that modern science would have been impossible but for Muslim discoveries. Antiquity alone seems to matter. One gets the impression that history's clock broke down somewhere during the 14th century and that plans for repair are, at best, vague. In that all-too-prevalent view, science is not about critical thought and awareness, creative uncertainties, or ceaseless explorations. Missing are websites or discussion groups dealing with the philosophical implications from the Islamic point of view of the theory of relativity, quantum mechanics, chaos theory, superstrings, stem cells, and other contemporary science issues.
Similarly, in the mass media of Muslim countries, discussions on "Islam and science" are common and welcomed only to the extent that belief in the status quo is reaffirmed rather than challenged. When the 2005 earthquake struck Pakistan, killing more than 90 000 people, no major scientist in the country publicly challenged the belief, freely propagated through the mass media, that the quake was God's punishment for sinful behavior. Mullahs ridiculed the notion that science could provide an explanation; they incited their followers to smash television sets, which, the mullahs claimed, had provoked Allah's anger and hence the earthquake. As several class discussions showed, an overwhelming majority of my university's science students accepted various divine-wrath explanations.
Why the slow development?
Although the relatively slow pace of scientific development in Muslim countries cannot be disputed, many explanations have been offered for it, and some common ones are plain wrong.
For example, it is a myth that women in Muslim countries are largely excluded from higher education. In fact, the numbers are similar to those in many Western countries: The percentage of women in the university student body is 35% in Egypt, 67% in Kuwait, 27% in Saudi Arabia, and 41% in Pakistan, for just a few examples. In the physical sciences and engineering, the proportion of women enrolled is roughly similar to that in the US. However, restrictions on the freedom of women leave them with far fewer choices, both in their personal lives and for professional advancement after graduation, relative to their male counterparts.
The near-absence of democracy in Muslim countries is also not an especially important reason for slow scientific development. It is certainly true that authoritarian regimes generally deny freedom of inquiry or dissent, cripple professional societies, intimidate universities, and limit contacts with the outside world. But no Muslim government today, even if dictatorial or imperfectly democratic, remotely approximates the terror of Hitler or Joseph Stalin—regimes in which science survived and could even advance.
Another myth is that the Muslim world rejects new technology. It does not. In earlier times, the orthodoxy had resisted new inventions such as the printing press, loudspeaker, and penicillin, but such rejection has all but vanished. The ubiquitous cell phone, that ultimate space-age device, epitomizes the surprisingly quick absorption of black-box technology into Islamic culture. For example, while driving in Islamabad, it would occasion no surprise if you were to receive an urgent SMS (short message service) requesting immediate prayers for helping Pakistan's cricket team win a match. Popular new Islamic cell-phone models now provide the exact GPS-based direction for Muslims to face while praying, certified translations of the Qur'an, and step-by-step instructions for performing the pilgrimages of Haj and Umrah. Digital Qur'ans are already popular, and prayer rugs with microchips (for counting bend-downs during prayers) have made their debut.
Some relatively more plausible reasons for the slow scientific development of Muslim countries have been offered. First, even though a handful of rich oil-producing Muslim countries have extravagant incomes, most are fairly poor and in the same boat as other developing countries. Indeed, the OIC average for per capita income is significantly less than the global average. Second, the inadequacy of traditional Islamic languages—Arabic, Persian, Urdu—is an important contributory reason. About 80% of the world's scientific literature appears first in English, and few traditional languages in the developing world have adequately adapted to new linguistic demands. With the exceptions of Iran and Turkey, translation rates are small. According to a 2002 United Nations report written by Arab intellectuals and released in Cairo, Egypt, "The entire Arab world translates about 330 books annually, one-fifth the number that Greece translates." The report adds that in the 1000 years since the reign of the caliph Maa'moun, the Arabs have translated as many books as Spain translates in just one year.8
It's the thought that counts
But the still deeper reasons are attitudinal, not material. At the base lies the yet unresolved tension between traditional and modern modes of thought and social behavior.
That assertion needs explanation. No grand dispute, such as between Galileo and Pope Urban VIII, is holding back the clock. Bread-and-butter science and technology requires learning complicated but mundane rules and procedures that place no strain on any reasonable individual's belief system. A bridge engineer, robotics expert, or microbiologist can certainly be a perfectly successful professional without pondering profound mysteries of the universe. Truly fundamental and ideology-laden issues confront only that tiny minority of scientists who grapple with cosmology, indeterminacy in quantum mechanical and chaotic systems, neuroscience, human evolution, and other such deep topics. Therefore, one could conclude that developing science is only a matter of setting up enough schools, universities, libraries, and laboratories, and purchasing the latest scientific tools and equipment.
But the above reasoning is superficial and misleading. Science is fundamentally an idea-system that has grown around a sort of skeleton wire frame—the scientific method. The deliberately cultivated scientific habit of mind is mandatory for successful work in all science and related fields where critical judgment is essential. Scientific progress constantly demands that facts and hypotheses be checked and rechecked, and is unmindful of authority. But there lies the problem: The scientific method is alien to traditional, unreformed religious thought. Only the exceptional individual is able to exercise such a mindset in a society in which absolute authority comes from above, questions are asked only with difficulty, the penalties for disbelief are severe, the intellect is denigrated, and a certainty exists that all answers are already known and must only be discovered.
Science finds every soil barren in which miracles are taken literally and seriously and revelation is considered to provide authentic knowledge of the physical world. If the scientific method is trashed, no amount of resources or loud declarations of intent to develop science can compensate. In those circumstances, scientific research becomes, at best, a kind of cataloging or "butterfly-collecting" activity. It cannot be a creative process of genuine inquiry in which bold hypotheses are made and checked.
Religious fundamentalism is always bad news for science. But what explains its meteoric rise in Islam over the past half century? In the mid-1950s all Muslim leaders were secular, and secularism in Islam was growing. What changed? Here the West must accept its share of responsibility for reversing the trend. Iran under Mohammed Mossadeq, Indonesia under Ahmed Sukarno, and Egypt under Gamal Abdel Nasser are examples of secular but nationalist governments that wanted to protect their national wealth. Western imperial greed, however, subverted and overthrew them. At the same time, conservative oil-rich Arab states—such as Saudi Arabia—that exported extreme versions of Islam were US clients. The fundamentalist Hamas organization was helped by Israel in its fight against the secular Palestine Liberation Organization as part of a deliberate Israeli strategy in the 1980s. Perhaps most important, following the Soviet invasion of Afghanistan in 1979, the US Central Intelligence Agency armed the fiercest and most ideologically charged Islamic fighters and brought them from distant Muslim countries into Afghanistan, thus helping to create an extensive globalized jihad network. Today, as secularism continues to retreat, Islamic fundamentalism fills the vacuum.
How science can return to the Islamic world
In the 1980s an imagined "Islamic science" was posed as an alternative to "Western science." The notion was widely propagated and received support from governments in Pakistan, Saudi Arabia, Egypt, and elsewhere. Muslim ideologues in the US, such as Ismail Faruqi and Syed Hossein Nasr, announced that a new science was about to be built on lofty moral principles such as tawheed (unity of God), ibadah (worship), khilafah (trusteeship), and rejection of zulm (tyranny), and that revelation rather than reason would be the ultimate guide to valid knowledge. Others took as literal statements of scientific fact verses from the Qur'an that related to descriptions of the physical world. Those attempts led to many elaborate and expensive Islamic science conferences around the world. Some scholars calculated the temperature of Hell, others the chemical composition of heavenly djinnis. None produced a new machine or instrument, conducted an experiment, or even formulated a single testable hypothesis.
A more pragmatic approach, which seeks promotion of regular science rather than Islamic science, is pursued by institutional bodies such as COMSTECH (Committee on Scientific and Technological Cooperation), which was established by the OIC's Islamic Summit in 1981. It joined the IAS (Islamic Academy of Sciences) and ISESCO (Islamic Educational, Scientific, and Cultural Organization) in serving the "ummah" (the global Muslim community). But a visit to the websites of those organizations reveals that over two decades, the combined sum of their activities amounts to sporadically held conferences on disparate subjects, a handful of research and travel grants, and small sums for repair of equipment and spare parts.
One almost despairs. Will science never return to the Islamic world? Shall the world always be split between those who have science and those who do not, with all the attendant consequences?
Progress will require behavioral changes. If Muslim societies are to develop technology instead of just using it, the ruthlessly competitive global marketplace will insist on not only high skill levels but also intense social work habits. The latter are not easily reconcilable with religious demands made on a fully observant Muslim's time, energy, and mental concentration: The faithful must participate in five daily congregational prayers, endure a month of fasting that taxes the body, recite daily from the Qur'an, and more. Although such duties orient believers admirably well toward success in the life hereafter, they make worldly success less likely. A more balanced approach will be needed.
Science can prosper among Muslims once again, but only with a willingness to accept certain basic philosophical and attitudinal changes—a Weltanschauung that shrugs off the dead hand of tradition, rejects fatalism and absolute belief in authority, accepts the legitimacy of temporal laws, values intellectual rigor and scientific honesty, and respects cultural and personal freedoms. The struggle to usher in science will have to go side-by-side with a much wider campaign to elbow out rigid orthodoxy and bring in modern thought, arts, philosophy, democracy, and pluralism.
Respected voices among believing Muslims see no incompatibility between the above requirements and true Islam as they understand it. For example, Abdolkarim Soroush, described as Islam's Martin Luther, was handpicked by Ayatollah Khomeini to lead the reform of Iran's universities in the early 1980s. His efforts led to the introduction of modern analytical philosophers such as Karl Popper and Bertrand Russell into the curricula of Iranian universities. Another influential modern reformer is Abdelwahab Meddeb, a Tunisian who grew up in France. Meddeb argues that as early as the middle of the eighth century, Islam had produced the premises of the Enlightenment, and that between 750 and 1050, Muslim authors made use of an astounding freedom of thought in their approach to religious belief. In their analyses, says Meddeb, they bowed to the primacy of reason, honoring one of the basic principles of the Enlightenment.
In the quest for modernity and science, internal struggles continue within the Islamic world. Progressive Muslim forces have recently been weakened, but not extinguished, as a consequence of the confrontation between Muslims and the West. On an ever-shrinking globe, there can be no winners in that conflict: It is time to calm the waters. We must learn to drop the pursuit of narrow nationalist and religious agendas, both in the West and among Muslims. In the long run, political boundaries should and can be treated as artificial and temporary, as shown by the successful creation of the European Union. Just as important, the practice of religion must be a matter of choice for the individual, not enforced by the state. This leaves secular humanism, based on common sense and the principles of logic and reason, as our only reasonable choice for governance and progress. Being scientists, we understand this easily. The task is to persuade those who do not.
Pervez Hoodbhoy is chair and professor in the department of physics at Quaid-i-Azam University in Islamabad, Pakistan, where he has taught for 34 years.
References
- 1. P. Hoodbhoy, Islam and Science—Religious Orthodoxy and the Battle for Rationality, Zed Books, London (1991).
- 2. M. A. Anwar, A. B. Abu Bakar, Scientometrics 40, 23 (1997).
- 3. For additional statistics, see the special issue "Islam and Science," Nature 444, 19 (2006).
- 4. M. Yalpani, A. Heydari, Chem. Biodivers. 2, 730 (2005).
- 5. Statistical, Economic and Social Research and Training Centre for Islamic Countries, Academic Rankings of Universities in the OIC Countries (April 2007), available at [LINK].
- 6. The News, Islamabad, 24 April 2007, available at [LINK].
- 7. For more information on the red heifer venture, see [LINK].
- 8. N. Fergany et al., Arab Human Development Report 2002, United Nations Development Programme, Arab Fund for Economic and Social Development, New York (2002), available at [LINK].
Figure 1. Ottoman Empire astronomers working in 1577 at an observatory in Istanbul. This painting accompanied an epic poem that honored Sultan Murad III, who ruled from 1574 to 1595. The observatory was demolished in 1580 after astronomers sighted a comet and predicted a military victory that failed to materialize. The poem was published a year later. (For more on ancient Islamic astronomy, see the American Institute of Physics online cosmology exhibit, http://www.aip.org/history/cosmology/tools/tools-nakedeyes.htm#astrolabe.)
Figure 2. A student working with a scanning electron microscope at the American University of Sharjah, United Arab Emirates. The Emirate's ruler recently created the Sharjah Academy of Scientific Research, where a nanotechnology center and central lab facility is being established. Scientific researchers require financial resources and equipment. But can they also exercise the intellectual freedom and questioning skepticism that they need even more?
Credit: Nasser Hamdan/AUS
Figure 3. One of Pakistan's missile launchers. Military technology is an area of investment in a few Muslim countries as in other developing countries. But such arms are more often a triumph of reverse engineering than of original research and development.
Credit: FEDERATION OF AMERICAN SCIENTISTS
Figure 4. Students of a seminary, Jamia Hafsa, in Islamabad, demonstrating for the enforcement of Islamic law, March 2007. The seminary's head, a government employee, issued a threat to all female students in Islamabad to be similarly veiled or else face consequences. Is this a climate that is conducive to scientific inquiry?
Credit: Ishaque Choudhry
Lewis Black on Racial Discrimination
The Daily Show with Jon Stewart - The Most Trusted Name in Fake News
Sunday, August 05, 2007
I Was Simpsonized!!!
Saturday, August 04, 2007
Argument Against Intelligent Design
===================================================================
The Great Mutator
By Jerry Coyne
I.
Browsing the websites of different colleges, a prospective biology student finds an unusual statement on the page of the Department of Biological Sciences at Lehigh University. It begins:
The faculty in the department of biological sciences is committed to the highest standards of scientific integrity and academic function. This commitment carries with it unwavering support for academic freedom and the free exchange of ideas. It also demands the utmost respect for the scientific method, integrity in the conduct of research, and recognition that the validity of any scientific model comes only as a result of rational hypothesis testing, sound experimentation, and findings that can be replicated by others.
So far, so good. After all, every science department should adhere to rigorous canons of research. But then comes a curious disclaimer:
The department faculty, then, are unequivocal in their support of evolutionary theory, which has its roots in the seminal work of Charles Darwin and has been supported by findings accumulated over 140 years. The sole dissenter from this position, Prof. Michael Behe, is a well-known proponent of "intelligent design." While we respect Prof. Behe's right to express his views, they are his alone and are in no way endorsed by the department. It is our collective position that intelligent design has no basis in science, has not been tested experimentally, and should not be regarded as scientific.
To my knowledge, such a statement is unique. Biology departments do not customarily assert publicly that they support a theory known for more than a century to be true. This is equivalent to a chemistry faculty announcing that "we are unequivocal in our support of atoms." Yet this disclaimer is perfectly understandable. For in this department resides Michael Behe -- that rara avis, a genuine biologist who is also an advocate of "intelligent design." And Lehigh University does not wish to lose prospective students who bridle at the thought of studying miracles in their science courses.
Intelligent design, or ID, is a modern form of creationism cleverly constructed to circumvent the many court decisions that have banned, on First Amendment grounds, the teaching of religious views in the science classroom. ID has shed many of the trappings that once cost creationists scientific and legal credibility, including explicit reference to God and the ludicrous idea that the Earth is only about ten thousand years old. Instead, God has been replaced by an unspecified "intelligent designer." Besides making the usual shopworn criticisms of evolutionary theory, IDers contend that some features of life are too complex to have evolved, and so required celestial intervention.
Behe has been an especially valuable ally of the IDers. Not only is he one of the few working scientists in their camp (he is a protein biochemist), thus giving them a smidgen of scientific respectability, but in 1996 he published Darwin's Black Box, a popular-science book that has become something of a manifesto for "intelligent design." In that book, Behe updated an old creationist chestnut: the assertion that some aspects of life could not have evolved by means of natural selection, because that evolution would have required untenable steps. Consider the eye, which consists of a number of interacting parts (such as the retina, the optic nerve, the lens, and the cornea) that work together to allow vision. How could such a complex feature have evolved gradually if it could not work unless all its components were already in place? Such features, said Behe, are "irreducibly complex": their evolution supposedly cannot be reduced to a sequential series of adaptive steps, as required by Darwinian natural selection.
Well, scientists already knew that "irreducibly complex" features can indeed be explained by natural selection; and Darwin himself had no trouble doing this for the eye in On the Origin of Species, describing a series of perfectly well-adapted living species, each of which had a slightly more advanced version of an eye. Behe's novelty was to extend this argument to complex biochemical pathways, the evolution of many of which we do not yet fully understand. It was in the complexity of metabolism, blood clotting, and immunology that Behe claimed to have found the hand of the Great Designer.
The reviews of Darwin's Black Box in the scientific community were uniformly negative, for two reasons. First, we do understand something about how these pathways might have evolved in stepwise fashion, though we are as yet admittedly ignorant of many details. (It is harder to reconstruct the evolution of biochemical pathways than the evolution of organisms themselves, because, unlike organisms, these pathways do not fossilize, and so their evolution must be reconstructed entirely from living species.) Second, in the scientific community a failure to understand something does not automatically count as evidence for divine creation. Science is littered with once-mysterious facts first imputed to God and later found out to be explicable solely through natural processes. This, in fact, is what Darwin's theory of natural selection did to the earlier idea that organisms were designed by a Creator.
More damaging than the scientific criticisms of Behe's work was the review that he got in 2005 from Judge John E. Jones III. After an ID textbook called Of Pandas and People was proposed for biology classes at a high school in Dover, Pennsylvania, a group of local parents brought suit against the Dover Area School District and some of its members. There followed a six-week trial in federal court in Harrisburg, Pennsylvania, with the plaintiffs supported by the ACLU and a Pennsylvania law firm, and the school district defended by a right-wing Christian law firm. The case of Kitzmiller et al. v. Dover Area School District et al., dubbed by some "the Scopes trial of our century," included luminaries from both the scientific camp and the ID camp battling it out in front of Judge Jones. With his scientific credentials, Behe was the key witness for the defense.
Jones's 139-page verdict for the plaintiffs was eloquent, strong, and unequivocal, especially coming from a churchgoing Republican. He ruled that "intelligent design" is not only unscientific, but a doctrine based firmly on religion. Jones called the introduction of the clandestinely creationist textbook at Dover High School an act of "breathtaking inanity." He also found Behe's testimony wholly unconvincing, noting that irreducible complexity was not evidence against evolution, and that the biochemical systems touted by Behe were not irreducibly complex anyway. Behe's credibility was damaged also by his admission that ID's definition of science was so loose that it could encompass astrology, and by his fatal assertion that the plausibility of the argument for ID depends upon the extent to which one believes in the existence of God.
But IDers, like all creationists, are never down for the count, because they see themselves as fighting for the Lord. So Behe is back now, with a new book and a brand-new theory that puts the Intelligent Designer back into biology. What has Behe now found to resurrect his campaign for ID? It's rather pathetic, really. Basically, he now admits that almost the entire edifice of evolutionary theory is true: evolution, natural selection, common ancestry. His one novel claim is that the genetic variation that fuels natural selection -- mutation -- is produced not by random changes in DNA, as evolutionists maintain, but by an Intelligent Designer. That is, he sees God as the Great Mutator.
II.
For a start, let us be clear about what Behe now accepts about evolutionary theory. He has no problem with a 4.5-billion-year-old Earth, nor with evolutionary change over time, nor apparently with its ample documentation through the fossil record, the geographical distribution of organisms, the existence of vestigial traits testifying to ancient ancestry, and the finding of fossil "missing links" that show common ancestry among major groups of organisms. Behe admits that most evolution is caused by natural selection, and that all species share common ancestors. He even accepts the one fact that most other IDers would rather die than admit: that humans shared a common ancestor with chimpanzees and other apes.
Why does Behe come clean about all this? The reason is plain. There is simply too much evidence for any scientist to deny these facts without losing all credibility. "Intelligent design" is desperate for scientific respectability, and you do not get that by fighting facts about which everybody agrees. But with most of evolutionary biology accepted, what's left for a good IDer to contest? Behe finds his bugbear in evolutionary theory's view that "random mutation" provides the raw material for evolutionary change. And to understand his critique, we first have to grasp how mutation fits into evolutionary theory, and what scientists mean when they say that mutations are "random."
If evolution is a car, then natural selection is the engine and mutation is the gas. Although evolutionary change can be driven by several processes, natural selection is almost certainly the main one -- and the only one that can adapt organisms to their environment, creating the misleading appearance of deliberate design. Yet natural selection, which is simply the preservation of genes that give their possessors greater reproductive success than their competitors, cannot take place without genetic variation. Although Darwin had no idea where this variation came from, we now know that it is produced by mutation -- accidental changes in the sequence of DNA that usually occur as copying errors when a molecule replicates during cell division. We also know that mutation-generated variation is pervasive: different forms of genes produced by mutation, for example, explain variation in human eye color, blood types, and much of our -- and other species' -- variation in height, weight, biochemistry, and innumerable other traits.
Once the variation exists, those genes that enhance an individual's "fitness" are preserved, and those that reduce it are discarded. (Natural selection is not really a "process," but simply a description of the differential and adaptive survival of genes.) The polar bear, for instance, has a white coat (its hairs actually lack pigment but appear white because they reflect light), and since this color is unique among bears, the polar bear presumably evolved from a dark-furred ancestor. The likely scenario is that mutations occurred that produced individuals varying in their coat color. Bears with a lighter coat had an advantage over others, for they would be more camouflaged against the Arctic ice and snow and better at sneaking up on seals. Lighter bears would then outcompete darker ones at getting food and thus produce more offspring, leaving more copies of the "light-coat" genes. Over time, the population of bears would evolve lighter and lighter coats until they were almost invisible against the snow.
On the basis of much evidence, scientists have concluded that mutations occur randomly. The term "random" here has a specific meaning that is often misunderstood, even by biologists. What we mean is that mutations occur irrespective of whether they would be useful to the organism. Mutations are simply errors in DNA replication. Most of them are harmful or neutral, but a few of them can turn out to be useful. And there is no known biological mechanism for jacking up the probability that a mutation will meet the current adaptive needs of the organism. Bears adapting to snowy terrain will not enjoy a higher probability of getting mutations producing lighter coats than will bears inhabiting non-snowy terrain.
What we do not mean by "random" is that all genes are equally likely to mutate (some are more mutable than others) or that all mutations are equally likely (some types of DNA change are more common than others). It is more accurate, then, to call mutations "indifferent" rather than "random": the chance of a mutation happening is indifferent to whether it would be helpful or harmful. Evolution by selection, then, is a combination of two steps: a "random" (or indifferent) step -- mutation -- that generates a panoply of genetic variants, both good and bad (in our example, a variety of new coat colors); and then a deterministic step -- natural selection -- that orders this variation, keeping the good and winnowing the bad (the retention of light-color genes at the expense of dark-color ones).
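The two-step logic described above -- an indifferent mutational step followed by a deterministic selective step -- can be sketched in a toy simulation. Everything here (population size, mutation size, the sort-and-duplicate form of selection) is an illustrative assumption, not biology from the review; the point is only that unbiased mutations plus consistent culling are enough to lighten the bears' coats over time.

```python
import random

random.seed(1)

def evolve(pop_size=200, generations=300, mutation_rate=0.05):
    # Each bear's coat is a number from 0.0 (white) to 1.0 (dark).
    population = [1.0] * pop_size  # start with uniformly dark ancestors
    for _ in range(generations):
        # Step 1 (random/indifferent): mutation nudges coat color up or
        # down with no bias toward what would help the bear.
        for i in range(pop_size):
            if random.random() < mutation_rate:
                shift = random.uniform(-0.2, 0.2)
                population[i] = min(1.0, max(0.0, population[i] + shift))
        # Step 2 (deterministic): lighter bears hunt better, so the
        # lighter half of the population leaves all the offspring.
        population.sort()                    # lightest first
        survivors = population[:pop_size // 2]
        population = survivors + survivors   # each survivor leaves two cubs
    return sum(population) / pop_size

final_mean = evolve()
print(f"mean coat darkness after selection: {final_mean:.3f}")  # near 0 (white)
```

Note that the mutational step never "knows" that lighter is better; the directionality comes entirely from the selective step that orders the variation.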
It is important to clarify these two steps because of the widespread misconception, promoted by creationists, that in evolution "everything happens by chance." Creationists equate the chance that evolution could produce a complex organism to the infinitesimal chance that a hurricane could sweep through a junkyard and randomly assemble the junk into a Boeing 747. But this analogy is specious. Evolution is manifestly not a chance process because of the order produced by natural selection -- order that can, over vast periods of time, result in complex organisms looking as if they were designed to fit their environment. Humans, the product of non-random natural selection, are the biological equivalent of a 747, and in some ways they are even more complex. The explanation of seeming design by solely materialistic processes was Darwin's greatest achievement, and a major source of discomfort for those holding the view that nature was designed by God.
III.
In a series of rather disconnected and scientifically dubious arguments, Behe tries to claim that random mutations cannot possibly be the building blocks of evolution. His main argument involves malaria, in particular the evolution of humans to resist infection by malaria, and the evolution of the malaria parasite itself to counteract the evolution of human resistance and the development of anti-malarial drugs.
Malaria actually provides a superb example of natural selection, and its story has some intriguing quirks. The disease is caused by a protozoan carried by mosquitoes, who act as flying syringes that inject the microbe into the human bloodstream. There it takes up residence in the liver and then in the red blood cells, multiplies prolifically, and can ultimately cause anemia, kidney failure, hemorrhage, and death. Residence in red blood cells and the liver is adaptive for the parasites, because in those spots they are hidden from the immune system that usually destroys invading microbes. Yet the human spleen can also detect and destroy circulating parasite-laden cells. To counter this tactic, the malaria parasite secretes proteins that cause its carrier blood cells to stick to the walls of blood vessels, avoiding the spleen (this sticking is what causes hemorrhage).
Here, then, is an arms race between a blood-loving parasite and a human body seeking to destroy it. Yet the story is even more complicated and interesting. In sub-Saharan Africa, where malaria is rampant, a mutation has arisen in the gene producing hemoglobin that helps ward off malaria. The striking thing about this mutation, known as the sickle-cell mutation, is that it somehow reduces the chances of contracting malaria when its carriers have one copy of the gene (like most organisms, we have two copies of every gene, one on each of our two sets of chromosomes), but it causes sickle-cell anemia when the carriers have two copies. In sickle-cell anemia, the red blood cells form clumps because of the altered hemoglobin they carry, causing a syndrome of complications that, without modern treatment, usually proves fatal before adulthood.
Thus we have the unusual situation in which heterozygotes, or individuals carrying both one "normal" and one "mutant" hemoglobin gene, are fitter than homozygous individuals, who carry either two "normal" genes (more susceptible to malaria) or two mutant genes (death from sickle-cell anemia). Evolutionary genetics tells us that in a case such as this one, both forms of the gene will remain in the population, ensuring some protection against malaria but also the continuing production of babies with sickle-cell anemia. Africans would be better off if everyone were a heterozygote, but that is impossible, because the two gene copies separate at reproduction and unite with other copies, necessarily producing some deleterious homozygotes.
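The balance described above is a standard result in population genetics: with heterozygote advantage, both gene forms are maintained at a stable equilibrium. If the fitnesses are 1-s for the "normal" homozygote (malaria-susceptible), 1 for the heterozygote, and 1-t for the sickle-cell homozygote, the sickle allele settles at frequency s/(s+t). The fitness values below are illustrative round numbers, not measured human figures:

```python
# Hedged sketch of heterozygote advantage with illustrative fitness costs
# (s and t are assumed values, not the measured human ones).
s, t = 0.15, 0.85   # fitness costs: "normal" homozygote 1-s, sickle homozygote 1-t

def next_freq(p):
    """One generation of selection; p is the sickle-allele frequency."""
    q = 1.0 - p
    w_AA, w_AS, w_SS = 1.0 - s, 1.0, 1.0 - t
    mean_w = q*q*w_AA + 2*p*q*w_AS + p*p*w_SS
    # Allele frequency after selection: weight each genotype's contribution
    # of sickle alleles by its fitness.
    return (p*q*w_AS + p*p*w_SS) / mean_w

p = 0.01                      # start with a rare sickle mutation
for _ in range(500):
    p = next_freq(p)

print(f"equilibrium sickle-allele frequency: {p:.3f}")
print(f"predicted s/(s+t):                   {s/(s+t):.3f}")
```

The iteration converges to the predicted equilibrium: neither allele can displace the other, which is exactly why the population keeps producing some sickle-cell homozygotes even though selection acts against them.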
This example shows that natural selection does not necessarily produce absolute perfection; it works with whatever mutations arise to create the best possible situation given the available raw material and the constraints of genetics. And finally, although the malaria parasite has been unable to counter-evolve resistance to heterozygotes for the sickle-cell gene, it has evolved, through mutation, resistance to various anti-malarial drugs devised by humans. This resistance has become so strong that some strains of malaria are completely resistant to drugs -- yet another example of successful natural selection.
Malaria, then, shows evolution acting on a number of levels. So how can it disprove Darwinian evolution? According to Behe, malaria shows that random mutation is insufficient to explain biological complexity. He disparages the defensive sickle-cell mutation and similar mutations, saying that they "are quintessentially hurtful mutations because they diminish the functioning of the human body" (does successfully resisting malaria really diminish the function of our body?) and that they are "not in the process of joining to build a complex, interactive biochemical system." And although the parasite and the humans are in the process of trying to outwit each other in an evolutionary arms race (including the development of drugs), Behe notes that this arms race has not prompted the evolution of biological complexity. Instead Behe sees the human/parasite battle as a mere "trench war of attrition."
In the end, after pages of rather tedious detail about malaria, Behe dismisses the evolutionary aspects of malaria as "chaotic and tangled," which, while showing random mutation and natural selection, are irrelevant to his main concerns: "In this book we are concerned with how machinery [i.e., complex organisms] can be built," he writes. "To build a complex machine many different pieces have to be brought together and fitted to one another." Behe buttresses his conclusion by describing how the AIDS virus evolved to outwit not only the strategies of the human immune system but also powerful anti-viral drugs. Again he sees little evolution of complexity: "HIV has killed millions of people, fended off the human immune system, and become resistant to whatever drug humanity could throw at it. Yet through all that, there have been no significant basic biochemical changes in the virus at all."
In light of evolutionary theory, these conclusions are truly bizarre. No sane evolutionist has ever claimed that an adaptation of a parasite to a host will produce complex biochemical changes. Evolutionary theory predicts only that parasites will adapt, not how they will adapt. In fact, both the malaria parasite and the HIV virus have undergone sufficient "biochemical change" to make them almost completely adapted to withstand both human drugs and the immune system. And humans have, through the sickle-cell mutation and other changes in hemoglobin, become somewhat resistant to malaria. Beyond that, what does Behe expect? A red blood cell with hands to throttle the parasite? A malaria parasite with a cunning brain to outwit the sickle-cell protein? HIV and malaria are doing pretty well at reproducing themselves -- sans new complex systems -- in their present environments. That is all evolution can do.
So what if the malaria parasite has not completely outwitted the sickle-cell mutation? No biologist, least of all Darwin, ever claimed that adaptation is always perfect. Every infection by a parasite, every disease, and every species that goes extinct represents a failure to adapt. Sometimes the right mutations do not arise or cannot arise because of the constraints of development; sometimes they do arise, but produce an imperfect adaptation. In his book The Causes of Evolution, the British biologist J.B.S. Haldane addressed this problem with tongue in cheek: "A selector of sufficient knowledge and power might perhaps obtain from the genes at present available in the human species a race combining an average intellect equal to that of Shakespeare with the stature of Carnera. But he could not produce a race of angels. For the moral character or for the wings, he would have to await or produce suitable mutations."
By disparaging the malaria system as mere "trench warfare" and characterizing the mutations involved as "not constructive" and causing "broken genes," Behe not only engages in sophistry, but shows an almost willful misunderstanding of Darwinism. Natural selection is not a one-way path to more complex adaptations or organisms. Organisms adapt to whatever environmental challenges they face, and those changes could require either small biochemical adjustments (as in the case of malaria), more extensive changes that require complex adaptation (as in the evolution of amphibians from fish), or even the evolution of less complexity. The tapeworm, for example, is considerably simpler than its ancestor: it has lost its nervous system, its digestive system, and most of its reproductive system, becoming in effect an absorptive sack of gonads. But it is nevertheless well adapted to its novel intestinal niche, where it can dispense with unnecessary features (its food, for one thing, is pre-digested). Using malaria and HIV to argue that random mutations cannot fuel the evolution of complexity is like displaying a crudely built footstool to prove that it is impossible to build a house with lumber, hammer, and nails.
IV.
The general reader, at whom The Edge of Evolution is aimed, is unlikely to find the scientific holes in its arguments. Behe writes clearly and engagingly, and someone lacking formal training in biochemistry and evolutionary biology may be easily snowed by his rhetoric. The snow falls most heavily when Behe writes about the complex biochemical adaptations of animals, such as the structure and the operation of cilia. Cilia are small, hairlike structures whose rhythmic beating propels microorganisms; they also help move things along in other species (cilia line the fallopian tubes of mammals, for example, where they sweep the egg into the uterus). Each cilium is built from more than two hundred different proteins, including those making up its structure and "motor proteins" that make it move. Moreover, when a cilium is damaged or a new one is built, an equally complex system of intraflagellar transport uses sixteen other proteins to carry new material out from the cell body for repair, rather like a molecular assembly line.
This description is entertaining and instructive, and those unacquainted with molecular biology will be wowed by the elegance of this adaptation. Indeed, such complex features were what lured many of us into biology, hoping to explain their evolution. But the purpose of Behe's exercise, beyond pedagogy, is simply to overwhelm the reader with nature's complexity, hoping to raise the question of how mutation and natural selection could possibly have built such a feature -- as if being wowed were the same as being persuaded. As Behe says, "The point is to see how elegant and interdependent the coherent system is -- to see how different it is from the broken genes and desperate measures that random mutation routinely involves." ("Broken genes and desperate measures" refers to the simple adaptations of the malaria parasite and its human opponent.) Surely, says Behe, a better theory is that cilia were created by the Intelligent Designer.
In Darwin's Black Box, Behe made exactly the same argument to show that a similar structure, the flagellum (a whiplike structure that propels microorganisms), could not have evolved by natural selection. But in this case Behe's claim that no intermediate stages could have existed was refuted. Ken Miller, a biologist at Brown University (and an observant Catholic), showed how flagella and cilia could have had their precursors in mechanisms that the cell uses to transport proteins, mechanisms that are co-opted to construct flagella. Indeed, the whole problem of the evolution of cilia was argued before Judge Jones in Harrisburg, who ruled that there was no convincing evidence that evolution could not have produced this structure, making legal doctrine from something biologists already knew. In his new book, however, Behe simply ignores his critics, repeating his bankrupt claim that there is "no Darwinian explanation for the step-by-step origin of the cilium."
If ID were science, we could make an equivalent demand of its advocates. We could ask Behe to produce a complete step-by-step accounting of what God (sorry, the Intelligent Designer) did when He designed the cilia. And of course Behe would not be able to do that -- nor does he even try. IDers never produce their own "scientific" explanation of life. They just carp about evolution. And as evolutionists explain one thing after another, IDers simply ignore these successes and move on to the ever-dwindling set of unsolved problems in which they continue to see the hand of God.
Behe's arguments from the gaps in scientific knowledge are fatuous. It is certainly true that we do not yet understand every step in the origin of the cilium, but these are early days. Molecular biology is a very young field, and molecular evolutionary biology is even younger. The way to understand the evolution of cilia is to get to work in the laboratory, not to throw up our hands and cry "design." Perhaps we will never understand every step in the evolution of a complex feature, just as we cannot know everything about the development of human civilization from archaeology. But is the incompleteness of our knowledge a reason to invoke God? The history of science shows us that patching the gaps in our knowledge with miracles creates a path that leads only to perpetual ignorance.
V.
Beyond his own incredulity, Behe has two other arguments against the efficacy of mutation and natural selection in creating complex features:
First, steps. The more intermediate evolutionary steps that must be climbed to achieve some biological goal without reaping a net benefit, the more unlikely a Darwinian explanation. Second, coherence. A telltale signature of planning is the coherent ordering of steps toward a goal. Random mutation, on the other hand, is incoherent; that is, any given evolutionary step taken by a population of organisms is unlikely to be connected to its predecessor.
These arguments betray a profound, almost willful ignorance of the evolutionary process. It is indeed true that natural selection cannot build any feature in which intermediate steps do not confer a net benefit on the organism. As Darwin wrote in On the Origin of Species, "If it could be demonstrated that any complex organ existed, which could not possibly have been formed by numerous, successive, slight modifications, my theory would absolutely break down. But I can find out no such case." A century and a half later, there is still no such case. Behe certainly fails to make one.
But he does try gamely, claiming that complex interactions between proteins are features that simply could not have evolved. Proteins represent strings of building blocks -- amino acids -- and the cooperation between some proteins requires that sets of amino acids in one protein interact rather precisely with sets in another. (For example, the two strings of amino acids making up human hemoglobin, the alpha and beta chains, are closely aligned, and work together in a coordinated way to take aboard oxygen and later relinquish it in the right tissues.) Such precise protein-protein interactions, says Behe, could not have been formed by "numerous, successive, slight steps," because such Darwinian evolution would be wildly improbable.
To demonstrate the improbability, Behe does some math. He calculates the probability that such interactions between amino acids could evolve, assuming that a precise set of amino acids is required. Not surprisingly, it turns out that the chance of getting by mutation a set of three to four amino acids required for only one protein-protein interaction is very low (mutations in the DNA affect the building blocks of proteins, since DNA codes for a sequence of amino acids). It is especially low because Behe requires all of the three or four mutations needed to create such an interaction to arise simultaneously. Since any one mutation is very rare (on the order of one in a billion in any given generation at a specified DNA site), the chance that a specified group of changes will arise in one fell swoop is unimaginably small. And this is for only a single pair of interacting proteins. When you consider the thousands of proteins in a cell that interact with others, some with as many as five or six others, evolution looks impossible.
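The "one fell swoop" arithmetic is easy to reproduce. The sketch below uses the per-site rate of about one in a billion quoted in the text; the population size is an assumed round number for illustration, not a figure of Behe's:

```python
mu = 1e-9  # per-generation mutation rate at a specified DNA site (from the text)

# Probability that k specified mutations arise in the SAME individual in
# the same generation -- the "simultaneous" requirement.
for k in (1, 2, 3, 4):
    p_simultaneous = mu ** k
    print(f"{k} simultaneous specified mutations: ~{p_simultaneous:.0e} per birth")

# Even a huge population barely helps. With an assumed 1e9 births per
# generation, the expected wait for a specified triple mutation is
# (1 / 1e-27) / 1e9 = 1e18 generations.
births_per_gen = 1e9
waiting_gens = (1 / mu**3) / births_per_gen
print(f"expected wait for 3 simultaneous mutations: ~{waiting_gens:.0e} generations")
```

The multiplication of tiny probabilities is what makes the requirement look fatal; the flaw, as the next paragraphs explain, is the requirement itself, not the arithmetic.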
Wrong. If it looks impossible, this is only because of Behe's bizarre and unrealistic assumption that for a protein-protein interaction to evolve, all mutations must occur simultaneously, because the step-by-step path is not adaptive. Yet Behe furnishes no proof, no convincing argument, that interactions cannot evolve gradually. In fact, interactions between proteins, like any complex interaction, were certainly built up step by mutational step, with each change producing an interaction scrutinized by selection and retained if it enhanced an organism's fitness. This process could have begun with weak protein-protein associations that were beneficial to the organism. These were then strengthened gradually, involving more and more amino acids to make the interaction stronger and more specific. At the end, you get what we see today: many proteins interacting strongly and specifically. What seems improbable in a single leap becomes much more likely when it evolves gradually, step by step.
A simple example shows this difference. Suppose a complex adaptation involves twenty parts, represented by twenty dice, each one showing a six. The adaptation is fueled by random mutation, represented by throwing the dice. Behe's way of getting this adaptation requires you to roll all twenty dice simultaneously, waiting until they all come up six (that is, all successful mutations must happen together). The probability of getting this outcome is very low; in fact, if you tossed the dice once per second, it would take about a hundred million years to get the right outcome. But now let us build the adaptation step by step, as evolutionary theory dictates. You start by rolling the first die, and keep rolling it until a six comes up. When it does, you keep that die (a successful first step in the adaptation) and move on to the next one. You toss the second die until it comes up six (the second step), and so on until all twenty dice show a six. On average, this would take about a hundred and twenty rolls, or a total of two minutes at one roll per second. This sequential way of getting twenty sixes is faster than Behe's method by a factor of roughly thirty trillion. And this is the way natural selection and mutation really work, not by the ludicrous scenario presented by Behe.
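The dice arithmetic can be checked exactly: each die is a geometric trial with success probability 1/6, so a single die takes 6 rolls on average, while all twenty showing six on one throw has probability (1/6) to the twentieth power.

```python
# Expected-rolls arithmetic for the two dice schemes described in the text.
n_dice = 20

# Sequential scheme: each die takes 6 rolls on average (geometric trial
# with success probability 1/6), and the dice are handled one at a time.
expected_sequential = n_dice * 6
print(f"sequential: ~{expected_sequential} rolls (~2 minutes at 1 roll/s)")

# Simultaneous scheme: all 20 dice must show six on the same throw, so
# the expected number of throws is 6**20.
expected_simultaneous = 6 ** n_dice
seconds_per_year = 365.25 * 24 * 3600
years = expected_simultaneous / seconds_per_year
print(f"simultaneous: 6**20 = {expected_simultaneous:.2e} throws "
      f"(~{years:.0e} years at 1 throw/s)")
```

The computed wait for the simultaneous scheme comes out to roughly a hundred million years at one throw per second, matching the figure in the text.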
As for Behe's assertion that mutation and selection cannot produce "coherence," it is absurd. Coherence is precisely the product of natural selection working with mutation. Yes, mutations are random in the sense I have described, but to say that an evolutionary step taken by an organism is unconnected to its predecessor completely ignores the fact that during evolution organisms are adapting to something in their environment, and that this adaptation can involve a coherent, coordinated response of many features. Consider the evolution of whales from terrestrial animals, now documented by a superb fossil record. The fossils show a wolf-like creature gradually becoming aquatic, with the hind limbs being reduced and finally lost, the forelimbs transformed into flippers, and the nostrils gradually moving atop the head to form the blowhole. How can anyone say that these changes (which of course look planned at the end) are unconnected or incoherent? They represent a case of natural selection eventually turning a land animal into a well-adapted aquatic one.
Not surprisingly, Behe ignores the fact that evolutionists have indeed determined whether mutations are random. Instead, he asserts that randomness is simply assumed by biologists because "the dominant theory [evolution] requires it." That is, all evolutionists are dupes. But for several decades molecular biologists have tested in the laboratory Behe's assumption of non-randomness: that the probability of a mutation being useful increases when a species is exposed to a new environment. These experiments have all failed. As far as we can see, mutations really are random with respect to the "needs" of the organism. There is no reason to assume otherwise.
To get an idea of the power of truly random mutations coupled with selection, we need only look at the successes of animal and plant breeders over the past few centuries. In that time, they have turned the wolf into breeds as diverse as the greyhound, the dachshund, and the chihuahua, and the wild mustard into cabbage, broccoli, cauliflower, and brussels sprouts. Virtually every fruit, vegetable, and meat that we eat has been drastically remodeled by the artificial selection of wild ancestors. All these changes have been immeasurably faster than evolution in the wild, which takes hundreds of thousands to millions of years. And all of these changes have involved selection of random mutations. After all, commercial corn, greyhounds, tomatoes, and turkeys were redesigned by humans, not the Intelligent Designer, and since humans cannot produce miracle mutations, we are limited to selecting whichever ones arise -- that is, random ones. To think otherwise would require the extraordinary assumption that the Designer foresaw the intentions of breeders and supplied them with the appropriate miraculous mutations. The success of animal and plant breeding, far outstripping the pace of evolution in nature, is a severe rebuke to Behe's view that evolution cannot work unless God helps it along by producing nonrandom mutations.
VI.
As the philosopher Philip Kitcher shows in his superb new book, Living With Darwin, the theory of intelligent design is a mixture of "dead science" and non-science. That is, insofar as ID makes scientific claims (for example, that natural selection cannot produce complexity), those claims not only are wrong, but were proved wrong years ago. And ID is deeply unscientific in its assertion that certain aspects of evolution (mutation, in Behe's case) required supernatural intervention. Behe's attacks on evolutionary theory are once again wrongheaded, but the intellectual situation grows far worse when we see what theory he offers in its place.
The first problem is that Behe's "scientific" ideas are offered to the public in a trade book, and have never gone through the usual process of vetting in peer-reviewed scientific journals. This was also the case with Darwin's Black Box. In fact, Behe has never published a paper supporting intelligent design in any scientific journal, despite his assertion in Darwin's Black Box that his own discovery of biochemical design "must be ranked as one of the greatest achievements in the history of science," rivaling "those of Newton and Einstein, Lavoisier and Schrödinger, Pasteur, and Darwin." Surely such an important theory deserves a place in the scientific literature! But the reason for the lack of peer review is obvious: Behe's ideas would never pass muster among scientists, despite the fact that anybody who really could disprove Darwinism would win great renown.
So let us put some empirical questions to Behe, since his theory is supposedly scientific. Which features of life were designed, as opposed to evolved? How exactly did the mutations responsible for design come about? Who was the Designer? To what end did the Designer work? If the goal was perfection, why are some features of life (such as our appendix or prostate gland) palpably imperfect?
In response to the question of what exactly was designed, Behe's answer seems to be: pretty much everything, including cells, biochemical systems, and the features distinguishing major groups of organisms, such as wings and warm-bloodedness. Behe's criterion is basically twofold: a feature of life must have been designed if it consists of a "purposeful arrangement of parts" and is composed of at least three parts.
But when we see something that looks designed, how do we know whether that design is "purposeful"? Natural selection displaced divine design precisely because it offered a naturalistic explanation for things that appeared to be purposeful. And why three components? This appears to come from Behe's claim that three amino acids are unlikely to change simultaneously in a protein. Well, that claim is true, but it has nothing to do with protein evolution, much less with the evolution of complex features. As we know from the fossil record, the multiple features of organisms that make them look designed -- say, the feathers, legs, and wings of birds -- did not appear instantly and simultaneously, but evolved gradually. There is not the slightest connection between the likelihood of three binding sites appearing simultaneously in a protein and the likelihood of three features of an organism evolving. Conflating these issues, and hoping that the reader will not notice, must be a deliberate rhetorical trick on Behe's part, for surely even he is not that ignorant of basic biology.
How did the non-random mutations come about? Well, they were obviously created by the Intelligent Designer. In Darwin's Black Box, Behe made the outrageous claim that the Designer might have engineered the first cell to contain all the mutations for the evolution of every species that would ever exist. This claim is manifestly false: such a cell would have unmanageably large amounts of DNA, and we see no evidence of "future DNA reserves" in primitive organisms such as bacteria. Behe still raises this possibility in his new book, but he also floats another idea: that mutations might not have been built into the first organism but could have occurred later, foreseen by God. In other words, they were miracles. So you can choose between the two possibilities: either mutations occurred in one big miracle or in millions of little miracles. The first claim is religious and false. The second claim is religious and untestable. Neither claim is scientific.
Who, precisely, was the Designer? Here Behe weasels a bit, as he should given the federal judiciary's dislike of religion in the science classroom. He mentions that the Christian God is only one of several possibilities. But you can bet that it was not Brahma, or the Bushmen's Kaang, or a space alien. As Jones remarked in Kitzmiller v. Dover: "Consider, to illustrate, that Professor Behe remarkably and unmistakably claims that the plausibility of the argument for ID depends upon the extent to which one believes in the existence of God." It is disingenuous of IDers to pretend that the Great Designer is unknown. Intelligent design has deep roots in fundamentalist Christianity, and its advocates are not fooling anyone.
And what were the Designer's goals? This is where Behe really gives away the game. He asserts that the goal was "intelligent life." Of course, what he really means is humans, presumably because Christians (Behe is a Catholic) feel that humans were made in the image of God: "What we sense, as elaborated through modern science's instruments and our reasoning, is that we live in a universe fine-tuned for intelligent life." And elsewhere: "Parts were moving into place over geological time for the subsequent, purposeful, planned emergence of intelligent life."
From God's mouth to Behe's ear! At this point we can simply stop asking whether Behe's theory is scientific, for he provides not the slightest evidence that evolution had any goal, much less one of intelligent life. In fact, every form of life on Earth, from humans to ferns to squirrels, can trace its ancestry back to the same single species that lived about three and a half billion years ago. In that sense, all species are equally evolved and equally removed in time from life's origin. Science long ago dispensed with the notion of the scala naturae: a progressive ladder of life with humans at the top. Rather, each existing species is at the tip of a branch on the tree of life. So what scientific reason can there be for singling out just one species as the Designer's goal? How do we know that the goal was not butterflies or sunflowers? Plainly, Behe is adopting religious dogma as part of his theory. Yet he continues to assert that "I regard design as a completely scientific conclusion."
And what about the features of organisms that do not look well designed, such as the appendix, the vestigial wings of the kiwi bird, or the vestigial pelvis of whales? In Darwin's Black Box, Behe punted and said that the Designer's goals were unknowable: "Features that strike us as odd in a design might have been placed there by the designer for a reason -- for artistic reasons, for variety, to show off, for some as-yet-undetected practical purpose, or for some unguessable reason -- or they might not." But if we do not know why the Designer did things, how can we possibly know that his goal was intelligent life?
Finally Behe gets to theodicy: why is there pain and evil in the world if the Designer is omnipotent? How come He/She/It allows innocent babies to get sickle-cell anemia? Behe's answer is that "maybe the Designer isn't all that beneficent or omnipotent. Science can't answer questions like that." But questions about the goals, the powers, and the limitations of the Designer are precisely what must be answered if ID is to become scientific. After all, we do know something about the power and the limitations of natural selection, a process that can explain pain and things that seem evil.
Is Behe's theory testable? Well, not really, since it consists not of positive assertions, but of criticisms of evolutionary theory and solemn declarations that it is powerless to explain complexity. And it is certainly true that scientists will never be able to give Darwinian explanations for the evolution of everything. The origins of many features, such as the bony plates on the back of the Stegosaurus, are lost in the irrecoverable past. But neither can archaeology unearth everything about ancient history. We do not maintain on these grounds that archaeology is not a science.
Behe waffles when confronted with the testability problem of ID and turns it back on evolutionists, saying that "coming from Darwinists, both objections [the lack of predictions and the untestability of ID] are instances of the pot calling the kettle black." He then waffles even more when implying that ID does not even need to be testable: "Both additional demands -- for hard-and-fast predictions or for direct evidence of a theory's fundamental principle -- are disingenuous. Philosophers have long known that no simple criterion, including prediction, automatically qualifies or disqualifies something as science, and fundamental entities invoked by a theory can remain mysterious for centuries, or indefinitely."
But who is being disingenuous here? Evolution has been tested, and confirmed, many times over. Every time we find an early human fossil dating back several million years, it confirms evolution. Every time a new transitional fossil is found, such as the recently discovered "missing links" between land animals and whales, it confirms evolution. Each time a bacterial strain becomes resistant to an antibiotic, it confirms evolution. And evolutionary biology makes predictions. Here is one that Darwin himself made: that the earliest human ancestors will be found in Africa. (That prediction was confirmed, of course.) Another was made by Neil Shubin at the University of Chicago: that transitional forms between fish and amphibians would be found in 370-million-year-old rocks. Sure enough, he discovered that there were rocks of that age in Canada, went and looked at them, and found the right fossils. Intelligent design, in contrast, makes no predictions. It is infinitely malleable in the face of counterevidence, cannot be refuted, and is therefore not science.
In the end, The Edge of Evolution is not an advance or a refinement of the theory of intelligent design, but a retreat from its original claims -- an act of desperation designed to maintain credibility in a world of scientific progress. But it is all for nothing, because Behe's new theory remains the same old mixture of dead science and thinly disguised theology. There is no evidence for his main claim of non-random mutation, and scientists have plenty of evidence against it. His arguments against the Darwinian evolution of complex organisms are flawed and misleading. And there is not a shred of evidence supporting his claim that the goal of evolution is intelligent life. In contrast to the feast of evidence that nourishes evolutionary theory, Behe gives us an empty plate.
The overweening strategy of IDers, and their creationist forebears, is to say that everything that we do not understand is evidence of the existence of God. I can imagine IDers of two centuries ago claiming that God made the sun shine, because until 1938 we had no idea where all that energy came from. It was not until quantum mechanics arrived out of left field that the physicist Hans Bethe was able to surmise, correctly, that the sun is a giant fusion reactor, converting hydrogen atoms into helium and energy. Who knew?
One of the great joys of science is that we never know what will happen next. Who could have guessed twenty years ago that dinosaurs probably became extinct after a giant meteorite collided with Earth and produced a "nuclear winter"? IDers would deprive us of this essential excitement, urging us to stop working when we come up against the hard problems and to ascribe our difficulties to God. They would have us join the herd of the benighted who proclaim so confidently that they have descried the bounds of our knowledge. But this attitude, this philosophy, was anticipated and unmasked by none other than Darwin himself, who was prescient not only about biology, but also about the nature of science: "Ignorance more frequently begets confidence than does knowledge: it is those who know little, and not those who know much, who so positively assert that this or that problem will never be solved by science."