Towards A Good Samaritan World

Sunday, February 18, 2007

New blog home:

Sunday, February 11, 2007


I read this interesting tidbit about job-hunting the other day:

Bad Rule No. 7: Clean up your online identity

Stop stressing about the stupid stuff you posted when you were drunk (or worse, not drunk). It's out of your control.

Instead, build a more current online identity that will pop up highest when an employer or recruiter does an online background check (which about 70 percent do). One way to get your new identity to the top of the search engines is to use Naymz, a service that helps control what people find out about you online.

Another way to control what people see about you is to blog. A blog can represent you effectively to the online world, and a good blog will show up higher in searches than almost any kind of page that could damage you.

Wow. 70 percent of employers do online background checks? And a blog could actually help you?

That makes me think it might be better to blog under my own name already. Also, I've been a little frustrated with Blogger in a couple of ways. First of all, I've long wanted some way to track the number of visitors to the blog. Second, Blogger doesn't support Trackback.

So I just started a new blog, and I hope my readers (of which I guess I have at least two or three) will follow. The new blog is, or just

Why "The Free Thinker?" If you do a Google search on "freethinker" you'll find lots of atheist and secular humanist sites, some quite fiercely anti-religion. But I think that conventional idea of a freethinker is overdue for retirement. If a freethinker is someone who questions dogma, why should that mean questioning only religious dogmas, and not the dogmas of physicalism, sovereignty, and the various pieties of left-liberalism? Anyway, I'll write more about this at the new blog if I get any visitors. Meanwhile, a few posts are already up.

Wednesday, February 07, 2007


My argument about mind-brain supervenience set off a huge discussion in the comments. It's times like this that I feel like I really learn something from blogging.

One thing is that I'm more and more convinced of my original claim, that is, of the necessity of agnosticism about mind-brain supervenience. With all due respect to my interlocutors, they're somewhat at a loss here. Either they re-assert physicalist tenets as dogmas, or they attack straw men while half-conceding the central claim.

But I also realize that my claim and the argument for it are more difficult to understand than I expected. Let me try to deal with some of the sources of confusion.

"Faith," induction, and Hume vs. Popper. A red herring in the discussion has been my occasional references to "faith."

I developed a somewhat idiosyncratic usage of this word in a previous post, "On Faith." Faith as I use the term is, to begin with, a belief-without-proof in the negation of two subversive possible views: skepticism-about-induction and solipsism. Induction skepticism and solipsism are positions that no one believes, but which (I claim) cannot be rationally refuted.

The idea of skepticism about induction comes from Hume's An Enquiry Concerning Human Understanding. As Wikipedia explains:

David Hume framed the problem [of induction]... Among his arguments, Hume asserted there is no logical necessity that the future will resemble the past. Justifying induction on the grounds that it has worked in the past, then, begs the question. It is using inductive reasoning to justify induction, and as such is a circular argument... By Hume's arguments, there also is no strictly logical basis for belief in the Principle of the Uniformity of Nature. Notably, Hume's stated position on the issue was that instead of unproductive radical skepticism about everything, he actually was advocating a practical skepticism based on common sense, where the inevitability of induction is accepted (but not explained). Hume noted that someone who insisted on sound deductive justifications for everything would starve to death, in that they would not, for example, assume that based on previous observations of, e.g., what time of year to plant seeds, or who has bread for sale, even that bread previously nourished them and others, that these inductions would likely continue to hold true...

My argument in favor of "faith" is a response (of sorts) to Hume's critique of induction. We all hold (I argue) a belief-- or "meta-belief"-- that there are patterns in the world. This meta-belief undergirds all induction. It itself cannot be justified, or even clearly stated. (What are "patterns?"). Yet we cannot-- we simply do not have the mental capacity to-- reject it. I use the term "faith" to describe this meta-belief because I see parallels between it and two other beliefs-- the belief that there are other people, and the belief in God-- which seem similarly unprovable and impossible even to state satisfactorily (what is a "person?" what is "God?") yet which are also universal or near-universal. (One can argue about how widespread, or how genuine, atheism is...)

Another response to Hume was offered by Karl Popper, in "The Problem of Induction" (1953). Popper actually denies that we use induction even in practice-- it is, he says, "a kind of optical illusion." Rather, we are continually framing hypotheses, each of which leads to predictions, and we accept these hypotheses as "conjectural knowledge" until they are falsified. Popper lays great emphasis on the point that, to be scientific, a hypothesis must be falsifiable. We must know exactly what a counter-example would look like, and be ready to abandon the hypothesis if we see one.

Now, while I think Popper's argument is an admirable contribution, and in particular a useful way to distinguish scientific from un-scientific hypotheses, I don't think, for reasons that go beyond the scope of this post, that Popper quite succeeds in solving Hume's problem. I think that "faith"-- the meta-belief in patterns-- is still necessary. However, my argument for agnosticism about mind-brain supervenience is independent of the issue of "faith." It is Popperian. We do not know what a counter-example to mind-brain supervenience would look like; therefore the hypothesis is un-falsifiable and illegitimate, and we must relinquish knowledge-claims about it.

Let me add, too, that the Humean critique of induction applies only to knowledge that is based on induction, mainly from sensory experience. I do accept subjective experience as a valid-- a "foundational"-- knowledge source, even if our reporting of that experience is, of course, imperfectly reliable.

Meta-induction and the motivation of physicalism. One response of physicalists to this argument is simply a dogmatic assertion of physicalist tenets. An e-mailer wrote: "I'd say that application of an epistemic falsifiability criterion on something like supervenience would not be appropriate. Supervenience is a means of setting the parameters of physicalism without metaphysical ghosts and spookies." Tom asserted that "in order for anything to have a cause and effect relationship it must be of the same system," i.e. the causal closure of the physical.

These are assertions rather than arguments and don't need refuting, but they raise the issue of why people embrace the physicalist dogma in the first place. I think the reason is a mis-application of induction. We have explained so much, the argument goes, within the physicalist parameters of the scientific method. So much that was once mysterious is now explained. Surely the natural sciences will eventually explain everything, conquer all the mysteries, will reduce everything to the building blocks of matter, energy and force!

We can easily see the flaw in this with the help of Karl Popper. What is the falsifiability-criterion of "science can explain everything in physicalist terms?" Suppose there is an unexplained phenomenon. Scientists try to explain it in physicalist terms. If they succeed, great, the physicalist hypothesis is vindicated! But if they fail, the physicalist hypothesis is not falsified or abandoned; scientists will keep looking for another physical explanation. It's "heads I win, tails we flip again." Since under no circumstances would scientists abandon the physicalist hypothesis, the physicalist hypothesis itself is un-falsifiable. It is the methodological assumption of the natural sciences, yet is itself unscientific.
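The "heads I win, tails we flip again" asymmetry can be caricatured in a few lines of code (a toy illustration I am inventing here, not a model of actual scientific practice): however the evidence comes out, the physicalist hypothesis survives, whereas a genuinely falsifiable hypothesis can be killed by a counter-example.

```python
# Toy sketch of the asymmetry: the physicalist hypothesis is retained
# under every possible evidence stream, so no observation can falsify it.

def physicalist_update(hypothesis_held: bool, explanation_found: bool) -> bool:
    """Whether the physicalist hypothesis is still held after one trial."""
    if explanation_found:
        return True   # "heads I win": the hypothesis is vindicated
    return True       # "tails we flip again": keep looking, never abandon

def falsifiable_update(hypothesis_held: bool, counterexample_seen: bool) -> bool:
    """A falsifiable hypothesis, by contrast, is abandoned on a counter-example."""
    return not counterexample_seen

held = True
for explanation_found in [True, False, False, True, False]:
    held = physicalist_update(held, explanation_found)
print(held)  # True -- no sequence of outcomes ever dislodges it
```

The point of the sketch is simply that the update rule ignores the evidence: that is what makes the hypothesis methodological rather than scientific.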

Although it's easy to refute, the physicalist fallacy has deep roots in our culture and is very hard to destroy. Like a weed, you can rip it up again and again, but it keeps popping back up.

Science and intersubjectivity. Another type of response to my argument (particularly by Nato) was to insist that only "entities with testable properties" deserve consideration by science. This type of response actually concedes the central point, agnosticism about mind-brain supervenience. Maybe, it says, there are "tiny angels making the neurons fire," or whatever. But if these non-physical entities don't have testable properties, science has nothing to say about them.

Now, for the phrase, "testable properties," we must substitute "intersubjectively testable properties." My mind, after all, has plenty of properties that are quite evident to me: I enjoy (or suffer from, as the case may be) an endless stream of subjective experience. But this subjective experience is private. I can describe it to others, but often I don't, and even when I try to, the result is very imperfect. When I am confronted with any physicalist account of the operation of the mind which has implications for my subjective experience, I can always tell the scientist, "No, my subjective experience is inconsistent with your theory." But he can't see that for himself. Maybe we are misunderstanding each other, or maybe I am lying. It's not like a chemistry lab, where he can see the litmus paper turn red.

Agnosticism about mind-brain supervenience implies that non-supervenient minds, or non-supervenient aspects of minds, may or may not exist, but we cannot rule them out. The claim that every thought corresponds to some firing neuron might be true, but we can never prove it, nor even frame it as a Popperian-style falsifiable hypothesis and call it "conjectural knowledge." We must abandon it. Nato seems to accept this as an ontological claim, but argues that science can and ought to ignore these entities that are not intersubjectively testable, even if they might exist.

Well, yes. The methodological policy of science is to focus on that which is intersubjectively observable; and as the human condition happens to be such that all intersubjectivity is mediated through the physical world, science focuses on that. But that is a truth about science, not about the world. It doesn't follow that the physical world is all there is, or all that we need to know about.

Minds exist. We know from subjective experience that they exist and something about what they are. Minds matter. We know there is, at the least, some kind of interface between mind and body. The natural sciences can study that interface and teach us some useful things about the mind. Minds may, indeed, be supervenient on the physical world, but we do not know that they are. The supervenience of the mind on the brain is a claim that can be uncritically accepted as dogma, but cannot meet the Popperian standard of falsifiability.

If minds are, or may be, non-supervenient, or partially non-supervenient, on the physical world, they still matter. To simply exclude them, or aspects of them, from consideration on methodological grounds, will not do. The loss of the comforting intersubjectivity that comes from assuming a physicalist ontology is a price we'll have to pay.

This is where it gets interesting. The possibility of non-supervenient minds becomes a gateway to a richer ontology, because it is a license to take our mental experience more seriously. Platonism, with its Ideas of which the things we see in the world are only shadows, flickers to life as a renewed possibility...

I wonder how much I would make if I applied for this job. I'm qualified... Hmm. The downside of the land of opportunity: so many roads not taken.

Tuesday, February 06, 2007


In Public View, Saudis Counter Iran in Region:

With the prospect of three civil wars looming over the Middle East — and Iran poised to gain from them all — Saudi Arabia has abandoned its behind-the-scenes checkbook diplomacy and taken on a central, aggressive role in reshaping the region’s conflicts.

On Tuesday, the kingdom is playing host in Mecca to the leaders of Hamas and Fatah, the two feuding Palestinian factions, in what both sides say could lead to a national unity government and reduced bloodshed. Last fall, senior Saudi officials met secretly with Israeli leaders about how to establish a Palestinian state.

In recent months, Saudi Arabia has also increased its public involvement in Iraq and its support of the Sunni-led government in Lebanon. The process is shaping up as a counteroffensive to efforts by Iran to establish itself as the regional superpower, according to diplomats, analysts and officials here and throughout the region. Some even say that the recent Saudi commitment to temper the price of oil is aimed at undermining Iran’s economy, although officials here deny that.

“We realized that we have to wake up,” said a high-ranking Saudi diplomat who spoke on the condition of anonymity because he was not authorized to speak to the news media. “Someone rang the bell, ‘Be careful, something is moving.’ ”

Previously, we were providing the stability in the Middle East that the Saudis need, so they felt free to let their own clerics and populace breathe poison against the West while making nice to us in public. Now that stability is slipping away, they've turned around and started to take the lead on the stability front. Meanwhile, we've done them the favor of getting al-Qaeda bogged down in a dirty intra-Islamic civil war. With the jihadist threat against the Kingdom largely taken out of commission, the Saudis have more room to maneuver.

How long will it take the conventional wisdom to catch up with all the ways in which the Iraq War was geostrategically smart?

Monday, February 05, 2007


McCain Blasts Iraq Resolution (Newsday).

Meanwhile, Iraqis are blaming the US for not getting the surge started fast enough:

A growing number of Iraqis are saying that the United States is to blame for creating conditions that led to the worst single suicide bombing in the war, which devastated a Shiite market in Baghdad on Saturday. They argued that the Americans had been slow in completing the vaunted new American security plan, making Shiite neighborhoods much more vulnerable to such horrific attacks.

A funeral was held in Najaf on Sunday for some of the victims. Many Shiites believe the Mahdi Army should be allowed to protect them.

The critics said the new plan, which the Americans have started to execute, had emasculated the Mahdi Army, the Shiite militia that is considered responsible for many attacks on Sunnis, but that many Shiites say had been the only effective deterrent against sectarian reprisal attacks in Baghdad's Shiite neighborhoods. Even some Iraqi supporters of the plan, like Hoshyar Zebari, the foreign minister who is a Kurd, said delays in carrying it out had caused great disappointment.

In advance of the plan, which would flood Baghdad with thousands of new American and Iraqi troops, many Mahdi Army checkpoints were dismantled and its leaders were either in hiding or under arrest, which was one of the plan’s intended goals to reduce sectarian fighting. But with no immediate influx of new security forces to fill the void, Shiites say, Sunni militants and other anti-Shiite forces have been emboldened to plot the type of attack that obliterated the bustling Sadriya market on Saturday, killing at least 135 people and wounding more than 300 from a suicide driver's truck bomb.

“A long time has passed since the plan was announced,” Basim Shareef, a Shiite member of Parliament, said Sunday. “But so far security has only deteriorated.”

American officials have said the new plan will take time, but new concerns emerged Sunday about the readiness of Iraqi military units that are supposed to work with the roughly 17,000 additional American soldiers who will be stationed in Baghdad under the plan, which President Bush announced last month.

It would be one thing to force a troop withdrawal by denying funding for the troops, as John Edwards is advocating. That would show courage, and it would be defensible as policy, since it would have an actual policy effect. A symbolic resolution, whose only practical effect can be to lower our morale and encourage butchers like those who struck the Sadriya market, just to pander to anti-war voters, is a disgrace.


In response to an extended discussion on philosophy of mind in the comments on a previous post...

Commenter froclown presents a good analogy for the physicalist view of the relationship between the mind and the brain: the brain is like a DVD, whose contents (e.g. Darth Vader) cannot be readily discerned from its physical form, but which nonetheless is ontologically reducible to its micro-physical properties. Similarly, the mass of grey cells doesn't look like it contains the thoughts we experience in our minds, but they are there nonetheless. In the technical language of philosophy, this view can be described as mind-brain supervenience, meaning that the mind, with its thoughts etc., is supervenient on the brain, i.e., all the mind's properties are mapped in the brain, and if any two brains have identical micro-physical properties, the contents/experiences of the corresponding minds (or, the "first-person perspective" on the contents of the brain) are also identical.
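In symbols, the supervenience claim just described (this is a standard textbook formulation, not froclown's wording) says that there can be no mental difference without a physical difference:

```latex
% Mind-brain supervenience: physically identical brains entail identical minds.
\forall b_1, b_2:\quad \mathrm{phys}(b_1) = \mathrm{phys}(b_2) \;\Rightarrow\; \mathrm{mind}(b_1) = \mathrm{mind}(b_2)
```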

I cannot disprove mind-brain supervenience, but I can prove that it can never be proven.

Or perhaps it is better to say that it can never be put to a Popperian test. For simply showing that mind-brain supervenience can't be proven is too easy. As Karl Popper has argued, in the strictest sense even the theory of gravity can't be proven. We can drop as many objects as we like, and observe that they accelerate downwards at 9.8 m/s², but that will never prove that all objects obey the law of gravity. Similarly, no matter how many thoughts we successfully mapped onto firing neurons, this would never prove that all thoughts could be mapped onto firing neurons.

But at least in the case of gravity we can achieve "conjectural knowledge," in the sense that we can run a huge number of experiments, clearly observe the results, and say with confidence that the theory of gravity is consistent with a huge body of experimental evidence and has never been falsified. (Maglev trains, feathers, and airships, and the modifications of the theory they necessitate, are of course a detour, which I won't go into.)

What would be a corresponding test of mind-brain supervenience? Suppose we develop a highly sophisticated brain-scanning device which allows us to make an extremely detailed three-dimensional model of the brain in our computers, and track all the activity in it. We then ask hundreds of subjects to think and describe their thoughts.

"I'm thinking about an ice cream cone," says the subject. "Now I'm thinking about a snake..." Our research assistants label the two-second sequences of brain activity "ice cream" and "snakes." We then compare these with hundreds of other brain-scans labeled "ice cream" and "snakes."

Very likely, we'll find some similarities between the "ice cream" brain scans. Perhaps the experience of thinking about ice cream will turn out to be similar to the experience of eating ice cream. Or perhaps for some subjects thinking about ice cream is similar to thinking about cake, while for others, it's similar to thinking about guilt and the need to go on a diet. But there will no doubt be a lot of errors, where the "ice cream" brain-activity sequence bears no resemblance to the other "ice cream" sequences.
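The comparison step this thought experiment imagines might look something like the following sketch (entirely hypothetical: the "activity sequences," labels, and similarity threshold are invented for illustration, and real neuroimaging analysis is far more involved):

```python
# Hypothetical sketch: compare brain-activity sequences that subjects'
# own reports have labeled "ice cream," looking for family resemblance.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two activity vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Toy "activity sequences," each labeled by the subject's verbal report:
ice_cream_scans = [
    [0.9, 0.1, 0.4],  # subject 1
    [0.8, 0.2, 0.5],  # subject 2: resembles subject 1
    [0.1, 0.9, 0.2],  # subject 3: an outlier -- misreport? lie? wandering mind?
]

reference = ice_cream_scans[0]
matches = [cosine_similarity(reference, s) > 0.9 for s in ice_cream_scans[1:]]
print(matches)  # some sequences resemble the reference, others do not
```

The outliers are the philosophical crux: the researchers can never tell, from the physical side alone, whether an outlier falsifies the mapping or merely reflects an unreliable report.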

It's the subjects, think the researchers. They're not reporting their thoughts accurately enough. Maybe a few were lying, because they didn't want to admit what they were really thinking about. And probably some of them said they were thinking about ice cream, when their thoughts had already moved on to something else. Well, can you blame them? After all, it's very hard to describe your own thoughts in real-time. And there are some thoughts you'd prefer to keep private. Talking also interferes with thought-processes, so that thought-processes that you are trying to describe in real-time will inevitably be different from normal thought-processes.

At the end of the day, researchers could never achieve the type of asymptotic "conjectural" knowledge that we can achieve in the case of the theory of gravity, because their only source of evidence on the subjective experience of thought is subjects' own reports, and these are very imperfectly reliable. Scanning brains to understand thought no doubt yields some real insights, but it not only can never "prove" that all thought is reducible to a micro-physical basis by accounting for all thought in micro-physical terms; it can never even make steady and generalized progress towards providing a comprehensive, physicalist picture of the operations of the mind.

Of course, we could easily provide a comprehensive, physicalist picture of the operations of a DVD.

It is the duty of any philosopher who is committed to the quest for truth to be agnostic about mind-brain supervenience. We cannot achieve any knowledge on the question of whether or not the subjective experience of the mind is comprehensively mapped onto the micro-physical properties of the brain, as the movie is mapped onto the DVD. Any knowledge-claims made here are arbitrary dogmatic pronouncements.

But 20th-century philosophy of mind has, by and large, failed in its duty to be agnostic here, and as a result, it is barren and has contributed nothing to civilization. As any belief survey will show, ordinary people prefer even the crudest forms of traditional or pseudo-traditional religion to anything that modern philosophy of mind has to offer. It was not always thus: at many periods in history, philosophy has had a broad and deep popular influence. It became useless in the 20th century because it sacrificed its raison d'etre, the quest for truth, to the physicalist dogma. Today the institutional apparatus of philosophy of mind is dominated by physicalist apparatchiks, and non-physicalists are deterred from entering the field. Hopefully, the stranglehold of physicalism will be broken, and philosophy will revive.

Sunday, February 04, 2007


Francis Fukuyama is right that Americans are unduly pessimistic. But he underestimates bin Laden:

[T]here is good reason to think that we have consistently overestimated threats to stability since 9/11 and that it is our reaction to this overestimation that has created special dangers. At the time of the Sept. 11 attacks, there were probably no more than a few dozen people in the world with the motivation and potential means to cause catastrophic harm to the United States. Once our mighty national security apparatus was turned to focus on this problem, the likelihood of a successful attack dropped dramatically.

Only a few dozen? Millions of people in the Muslim world regarded us (then as now) as the enemy. Bin Laden and the 19 hijackers showed the world that you can "cause catastrophic harm to the United States" with something as simple as box-cutters. Only a few dozen people had box-cutters?

Yes, we have a powerful military, but that's the whole point of terrorism: it's asymmetrical warfare. After 9/11, the men were dead, the myth lived. And bin Laden was, for Muslim radical youths from Pakistan to Palestine, a heroic symbol of defiance to the nefarious superpower of the West-- and of the corrupt and tyrannical rulers of the Muslim world itself. Bin Laden could have been a Che Guevara, a revolutionary hero-symbol, admired both in the Third World and the West.

It was when the Iraqi people welcomed Americans as liberators (yes, they did, we all saw it with our own eyes, and all the media sneers in the world can't change that fact) that bin Laden's mystique was shattered. Two elections and millions of defiant purple fingers showed the world that Iraqis-- who alone were in a position to express their real desires-- wanted, not the caliphate, but democracy. Al-Qaeda's reliance on asymmetrical warfare, waged on Iraqi soil, meant murdering thousands of innocent Muslims and forfeiting all hope of uniting the Muslim world behind their banners.

Yes, Americans' pessimism today is mistaken. But it's not despite Iraq, as Fukuyama would have us believe; rather, because of it. The surgery has been painful, but now the cancer is removed.

UPDATE: Fareed Zakaria makes a similar point, only better:

Osama bin Laden and Ayman Al-Zawahiri, both Sunnis, created Al Qaeda to be a Pan-Islamic organization, uniting all Muslims as it battled the West, Israel and Western-allied regimes like Saudi Arabia and Egypt. Neither Zawahiri nor bin Laden was animated by hatred of Shiites. In its original fatwas and other statements, Al Qaeda makes no mention of them, condemning only the "Crusaders" and "Jews." [...]

The trouble for Al Qaeda is that as a practical matter, loathing Shiites works in only a few places: principally Iraq, Pakistan, Saudi Arabia and some parts of the gulf. Most of the rest of the world's 1.3 billion Muslims are turned off by attacks on their co-religionists.

So, an organization that had hoped to rally the entire Muslim world to jihad against the West has been dragged instead into a dirty internal war within Islam. Bin Laden began his struggle hoping to topple the Saudi regime. He is now aligned with the Saudi monarchy as it organizes against Shiite domination. This necessarily limits Al Qaeda's broader appeal and complicates its basic anti-Western strategy.

Via Brothers Judd.

Saturday, February 03, 2007

David Warren of the Ottawa Citizen is a bracing columnist. An excerpt from Tribalism & Us:

Here is the paradox: that we cannot afford to abandon Iraq, for the very reason that things are so bad there. Were the Americans and allies to step out now, it would certainly become the staging area for both Shia and Sunni violence on a larger scale -- directed not only inward against each other, but outward across the Middle East, and given the huge Muslim diaspora now spread through Europe and North America, beyond.

Bear in mind, further, that Islamist terrorism against the West, feeds not only on Western weakness of will, but on this Muslim internecine strife. The most practical argument of the Islamists, to all their co-religionists, being: "We could unify ourselves if we all agreed to attack the West. We might even be able to destroy the West, because it has no idea how to defend itself."

It follows from this that a secret hope, not quite expressed in print (though sometimes expressed in the blogosphere), is vain. This is the hope that if Muslim fanatics are left to get on with killing each other, they will leave us alone. Like so many glib ideas, it sounds so plausible, but is the exact opposite of the truth.

An argument to back this claim up follows, but I don't quite understand it. However, I think Warren is partly right. That we have stirred up a Sunni-Shia split within the Muslim world could be to our advantage geostrategically. But it serves our interests best-- and is preferable for humanitarian reasons, too, of course-- that it be a cold war rather than a hot war. It's a good thing that Saudi Arabia (the great conservative money-power of the Sunni bloc) is hosting talks between Hamas and Abbas with a view (maybe) to Israel-Palestine peace, and trying to topple Ahmadinejad through an oil price war (which also gets us cheap gas at the pumps). But bloody sectarian warfare will leave behind a minefield of angry passions which can feed into future radicalism and terrorism.

The justification for the surge is that it might be able to stop the killing.

UPDATE: Fred Kaplan, long-time victim of Bush Derangement Syndrome, writes:

[I]n the unlikely event that the Bush administration succeeds in splitting the region along this sectarian divide, it will only harden tensions, inflame passions, and, by the way, do nothing to solve our immediate problems in Iraq.

This is why the Saudis and the Iranians are exploring common interests and seeking to mediate agreements—because the Americans, who used to do this sort of thing, have abdicated the role.

Wow, that's dumb. The Americans used to play the role of mediating agreements... with Iran?

Friday, February 02, 2007


The world has fewer officially atheist states now than it did twenty years ago, when the Soviet Union and the communist powers of the Warsaw Pact were still in business. One of those still standing is China. But religion is getting stronger there at the grassroots level, and it is also gaining more acceptance from the regime. The Economist reports:

The revival of the Black Dragon Temple's fortunes is part of a resurgence of religious or quasi-religious activity across China that—notwithstanding occasional crackdowns—is transforming the social and political landscape of many parts of the countryside. Religion is also attracting many people in the cities, where the party's atheist ideology has traditionally held stronger sway.

The resurgence encompasses ancient folk religions and ancestor worship, along with the organised religions of Buddhism, Taoism, Islam (among ethnic minorities) and, most strikingly, given its foreign origins and relatively short history in China, Christianity. In the face of this onslaught, the party is beginning to rethink its approach to religion. It now acknowledges that it may even have its uses...

Officially, the party regards folk religion as superstition, the public practice of which is illegal. But in many rural areas officials now bend the rules. In Yulin prefecture, with 3.4m people, there are 106 officially registered places of worship and many more that are not officially sanctioned...

Evidence of China's religious revival can be seen throughout the countryside in the form of lavish new temples, halls for ancestor worship, churches and mosques (except in the far western province of Xinjiang, where the government worries that Islam is intertwined with ethnic separatism and keeps tighter rein). Officially there are more than 100m religious believers in China (see table), or about 10% of the population. But experts say the real number is very much higher.

Christianity in particular is flourishing:

The Archbishop of Canterbury, Rowan Williams, who visited China in October, wrote afterwards in the Times that there was now a sense in China that civil society needed religion, with its motivated volunteers. During his trip he remarked on an “astonishing and quite unpredictable explosion” in Christian numbers in China in recent years.

If the growth of Christianity in China continues at the rates it has recently sustained, within our lifetimes China will be a predominantly Christian nation. A likely scenario? Perhaps not; but definitely possible.

In that case, would the American evangelicals-- who are already in a sense second-class citizens here, forced to pay, as it were, the jizya to an elite that teaches their own children in a fashion contrary to their faith-- begin to feel divided loyalties? It is incongruous that the world's most populous and dynamic (predominantly) Christian country is also the world's richest, when Jesus said that "it is easier for a camel to pass through the eye of a needle than for a rich man to enter the Kingdom of Heaven." In a newly converted China, American Christians could see much to admire.

Thursday, February 01, 2007

A brave commenter, "froclown," challenges me on physicalism and the brain:

If the information in your inner world is so private and unavailable to the public, not to mention as you say it's fundamentally non-physical, then how come it is possible to physically connect electrodes into your brain, and by use of a complex decoding program performed by a physical computer, the images in your mind's eye can be broadcast on a physical computer monitor for all to see?

How come NASA has had success with a program to decode and transmit spoken words from the brains of individuals who have only thought the words, not yet spoken them?

And other such direct brain-computer interfaces. We know computers are physical, if the brain is not, then how can the two grok via a purely physical interface?

OK, first, the brain is physical; it's the mind which is non-physical, or rather, not reducible to the physical. What I am denying is not that the mind has any physical aspect, but the supervenience of the mind on the brain.

To understand this, it's critical to take note of the human tendency to use the physical world as an aid to our own thoughts. If we're calculating a difficult sum, we might use a piece of scratch-paper as an aid. People sometimes talk to themselves out loud when they're excited about something or are grappling with a vexing question. (I do, anyway. It's said to be a sign of insanity, but that's just the voice of envy from people who don't have anything interesting to talk to themselves about.) People keep diaries, sometimes to remember the past, sometimes just to record the way they feel and straighten out their own thoughts.

It's not surprising, therefore, that the mind often jots down its thoughts in the neurons of the brain, where they can sometimes be observed by NASA, or wired into a computer interface. That you can learn something about my thoughts by attaching electronic sensors to my brain is parallel to your being able to learn something about my thoughts by reading my diary. In either case, my mind is using the physical world as an aid to its own thinking, and creating thought-traces which allow my thoughts to be partially observed by outsiders.

It does not follow that every thought is mapped onto the physical world, or that there is anything in the physical world corresponding to all the aspects of the mind, or that my subjective experiences are not fundamentally private (even if some imperfect communication between minds, intentional or unintentional, occurs).

A great post on evolution at Brothers Judd: THEY WOULDN'T EVEN HAVE 13% WITHOUT THE COERCION THE MONOPOLY PROVIDES. Brothers Judd often takes potshots at evolution, but this post is more substantive. They quote an article "Why Do We Evoke Darwin?" from a magazine called The Scientist: Magazine of the Life Sciences:

Darwin's theory of evolution offers a sweeping explanation of the history of life, from the earliest microscopic organisms billions of years ago to all the plants and animals around us today. Much of the evidence that might have established the theory on an unshakable empirical foundation, however, remains lost in the distant past. For instance, Darwin hoped we would discover transitional precursors to the animal forms that appear abruptly in the Cambrian strata. Since then we have found many ancient fossils – even exquisitely preserved soft-bodied creatures – but none are credible ancestors to the Cambrian animals.

Despite this and other difficulties, the modern form of Darwin's theory has been raised to its present high status because it's said to be the cornerstone of modern experimental biology. But is that correct? "While the great majority of biologists would probably agree with Theodosius Dobzhansky's dictum that 'nothing in biology makes sense except in the light of evolution,' most can conduct their work quite happily without particular reference to evolutionary ideas," A.S. Wilkins, editor of the journal BioEssays, wrote in 2000.[1] "Evolution would appear to be the indispensable unifying idea and, at the same time, a highly superfluous one."

I would tend to agree. Certainly, my own research with antibiotics during World War II received no guidance from insights provided by Darwinian evolution. Nor did Alexander Fleming's discovery of bacterial inhibition by penicillin. I recently asked more than 70 eminent researchers if they would have done their work differently if they had thought Darwin's theory was wrong. The responses were all the same: No.

I also examined the outstanding biodiscoveries of the past century: the discovery of the double helix; the characterization of the ribosome; the mapping of genomes; research on medications and drug reactions; improvements in food production and sanitation; the development of new surgeries; and others. I even queried biologists working in areas where one would expect the Darwinian paradigm to have most benefited research, such as the emergence of resistance to antibiotics and pesticides. Here, as elsewhere, I found that Darwin's theory had provided no discernible guidance, but was brought in, after the breakthroughs, as an interesting narrative gloss.

In the peer-reviewed literature, the word "evolution" often occurs as a sort of coda to academic papers in experimental biology. Is the term integral or superfluous to the substance of these papers? To find out, I substituted for "evolution" some other word – "Buddhism," "Aztec cosmology," or even "creationism." I found that the substitution never touched the paper's core. This did not surprise me. From my conversations with leading researchers it had become clear that modern experimental biology gains its strength from the availability of new instruments and methodologies, not from an immersion in historical biology.

The genuine scientific contribution of Darwinian thinking can be separated from Darwinist ideology/creation-mythology by replacing the "Darwinian theory of evolution" with a "Darwinian theory of imperfect ecosystemic homeostasis."

Homeostasis is a property of living organisms, namely that the atoms of which they are physically composed at any given time cycle in and out of them, yet the form of the organism maintains a high degree of continuity. As organisms exhibit homeostasis with respect to the atoms that compose them, so ecosystems exhibit homeostasis with respect to the organisms that compose them: trees, birds, insects, mammals are born, grow, and die, but the forest remains. This may be called ecosystemic homeostasis.

The Darwinian theory sheds light, both on (a) why the forest is able to maintain homeostasis, and (b) why the homeostasis is imperfect, i.e., the forest will not stay exactly the same forever (even putting to one side exogenous climatic or geological changes). The forest maintains homeostasis because organisms are adapted to their environments and achieve a sort of equilibrium; but it can change over time because the Mendelian genetics that underlies the forms of organisms, combined with natural selection and (very rarely) advantageous mutations, can enable organisms to upgrade, or to find new niches, creating ripple effects throughout the ecosystem. Experimental evidence that this kind of evolution can occur is not really needed; once the truths of Mendelian genetics (plus the possibility of mutation) are established, logic alone is enough to show that evolution is at least a possibility, though perhaps a vanishingly rare one. We don't know, based on logic alone, whether it has played a significant role in natural history or not. Perhaps fossil evidence suggests that it has played at least some role. (I'm not a paleontologist.)

That's as far as science, properly understood, can go. The unwarranted and superfluous further claim that this is how all life originated is our civilization's reigning creation-myth but is not science. We don't know how all life was created, and we probably never will. The evidence is just too scanty.