An argument for atheism that doesn’t work as well as its arguers seem to think

The, like, three of you who inexplicably read my blog may remember last year when I made it a Lenten discipline to write things and post them. I’m not officially doing that this year, but I am using the fact that it’s Lent to motivate me to post more often.

Quite a few of my friends are atheist or agnostic. Most of the ones who would say they’re agnostic are, for all intents and purposes, actually atheist, but that’s neither here nor there. Some of my atheist or agnostic friends are more tolerant of those who are religious; others are more militant in their atheism, or at least more insistent that those who don’t share their point of view are less intelligent. But there’s an argument I’ve heard several of them use at one time or another to support their atheism or agnosticism, and it has always bothered me–not because I disagree with their position, although I do, but because the argument itself doesn’t sit right.

The argument goes more or less like this:

“I believe in science and reason and in what can be proven by rational or scientific means, and only in those things. If the existence of God is proven scientifically or rationally, I will consider accepting that God exists, but until that happens, I will not.”

(I should note that, by “God,” I mean more or less the supernatural creator and lawgiver of the Abrahamic religions; for my purposes here, I don’t need to get more detailed than that.)

My friends who make this argument, or versions of it, are generally proud of their commitment to reason and science and of their intellectual orientation to life, and in fairness to them these are traits worth being proud of. Most likely they are less uniformly committed to science and reason in practice than this argument would suggest–most of them would be pretty upset if they were forced to acknowledge love as nothing more than a biochemical reaction, for example–but that’s not really why I have trouble accepting this argument.

No, my main problem with this argument is that it puts the burden of proof entirely on God.

In an American court of law, the burden of proof is on the prosecution. That is, a person accused of a crime doesn’t have to prove that she didn’t do it; she just has to show that the prosecution’s case is faulty. It’s the person who accuses her who has to prove that his accusation is well-founded. The rationale is that, if we make her accuser prove her guilt, she’s less likely to be convicted of and punished for something she didn’t do than she would be in a system where the accused had to prove her innocence. Thus there’s a good reason to put the burden of proof on the prosecution in a legal case: it protects the rights of the accused, especially the ones who are innocent.

But in the argument for atheism cited above, the rationale for putting the burden of proof entirely on God (or at least on those who believe he exists) is less clear. See, the problem there is that neither the existence of God nor the nonexistence of God has been scientifically, rationally proven. There are arguments on both sides, sure, but nothing amounting to conclusive proof. So why is only one of those propositions tasked with proving itself, and why is the other exempt?

“God exists” is a truth claim. But “God does not exist” is also a truth claim. Sure, you can treat those two statements as a proposition and its negation; we’ll call them p and ~p, respectively, since the second is in effect a negation of the first. The thing is, this particular ~p is also a proposition in its own right. We’ll call it q, and its negation ~q. That is, the proposition “God exists” is both p and ~q, and the proposition “God does not exist” is both ~p and q. Both statements say something about the universe: the former posits that it has a creator and lawgiver, while the latter posits that it exists and operates without one.
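
For the notationally inclined, the same point can be restated in bare propositional logic. This is a quick sketch using my labels above, not a proof of anything:

    % p = “God exists”, q = “God does not exist” (my labels from above).
    % By construction, each claim is simply the negation of the other:
    \[ q \equiv \lnot p \qquad\text{and}\qquad p \equiv \lnot q \]
    % Reading the turnstile loosely as “has been demonstrated,” the
    % current state of the evidence, on both sides, is:
    \[ \nvdash p \qquad\text{and}\qquad \nvdash q \]
    % Nothing in the notation privileges p over q or q over p.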

But neither p nor q has been proven or disproven. And that means that, strictly on the basis of logic, there is no rational basis for preferring one over the other–and no rational basis for putting the burden of proof entirely on one side or the other.

Now, at this point my atheist friends would argue that it is, in fact, somewhat more rational to demand proof for the existence of a thing than for its nonexistence, although for the reasons given above I have trouble seeing why. Nonetheless, I’ll offer a counterargument. And I’ll offer it not to place the burden of proof on the atheists–I believe theists and atheists should both be responsible for arguing their own cases as far as they are able–but to undermine the atheists’ argument that the burden of proof should rest solely with the theists.

Here it is:

“For most of known human history, the overwhelming majority of the world’s people–including the vast majority alive today–have believed in some sort of gods or supernatural beings. Their beliefs have varied, certainly, but those who say the natural world is all that exists have always been far outnumbered by those who say it’s presided over by some deity or deities. So is it more rational to put the burden of proof on the vast majority of humans throughout history, or is it more rational to make the relatively small minority prove their case against the majority whom they say are wrong?”

Some perspective on history and impermanence, courtesy of Friedman and Stephenson

In a recent column, New York Times columnist Thomas L. Friedman speculated that Iraq may have to break up into separate states for a while in order for Iraqis to figure out that they’re better off sharing a country. Leaving aside for now the question of whether Mr. Friedman is right–though he does usually have some good insights to offer into world affairs–his column and the prospect of the breakup of Iraq furnish me with an occasion to write about something that’s been on my mind lately: things that may be much more historically contingent than we assume.

The basic problem in Iraq has long been that various groups that otherwise wouldn’t be inclined to live together–the largest of which are Sunni Arabs, Shiite Arabs, and Kurds–have been forced to share a country. This problem is largely the fault of European powers that carved up the Middle East to serve their own interests with little regard for whom they were forcing to live with whom. (See also: Africa.) Strong leaders more or less managed to keep a lid on the tensions, usually at the expense of human rights for one group or another–but suppressing a problem isn’t the same as solving it, as became painfully obvious when Saddam Hussein was removed and sectarian hell broke loose. Now the nation-state of Iraq, not quite a century old, may well not survive another year.

Certainly the violence and bloodshed that have characterized this process are deeply, deeply tragic. But I’m not convinced that, in and of itself, the breakup of Iraq is tragic at all. It may be a good thing for the people or a bad thing for the people, or more likely good for some and bad for others. I don’t really know, and I don’t know enough to speculate (although the Kurds, for their part, seem to be on track to build a stable, inclusive government in the north). But whether the breakup of Iraq would be good or bad is beside my point; what isn’t beside my point is the impermanence of Iraq.

For most of the last century and a half or so, a huge majority of the earth’s land area has been carved up into states and territories with well-defined borders, to the extent that most people living today can be forgiven for assuming that such an arrangement is, if you’ll pardon the expression, the natural order of things. But countries haven’t always operated that way. For much of human history, international borders were unfixed; a country or kingdom or empire would claim whatever territory it could administer and defend, and that was that. Several modern European nations, including Germany and Italy, were motley collections of kingdoms, duchies, principalities, and city-states as recently as the nineteenth century. Defending the fixed boundary of Hadrian’s Wall required far more soldiers than Rome’s previous strategy of letting mobile detachments patrol its northern frontier in Britain as the generals thought appropriate. And of course kingdoms, republics, and other forms of government have appeared and disappeared and changed across the world throughout history.

In other words, perhaps it is not just Iraq that existed for a time and will eventually give way to something else; perhaps the same is true of the whole system of fixed-border nation-states.

A few months ago I re-read Neal Stephenson’s speculative fiction novel Anathem, and one of the things I enjoyed most was Stephenson’s descriptions of the rise and fall of civilizations over thousands of years, the waxing and waning of populations and polities, the flow and ebb of human activity. Usually these descriptions take the form of the protagonist’s observations of abandoned towns and fields, dilapidated buildings, and the like. The effect of these passages, for me, was to call to mind not so much the ruins that Greece or Machu Picchu became long ago but the ruins that New York or Seattle may become some centuries hence.

Think for a moment about the history books you’ve read. The history books of today can condense, say, Europe’s Late Middle Ages–the lives of millions of people and the cultures and politics of dozens of kingdoms, in all their vast complexity–to a single book, a single chapter, a single paragraph. A thousand years from now, what will the history books say about the twenty-first century? What institutions, what forms of culture and government, which of the things that we think of as enduring will be distilled to a paragraph that students will skim over in a vain attempt to pass a quiz? Liberal democracy? The United States? The modern university system? Labor unions?

This post isn’t meant to be a depressing rumination on the meaninglessness of the things we do. It’s meant to be a liberating reminder that the things we assume will last forever may not, and that we don’t need to wear ourselves out fighting to keep them around, because it turns out that, in all probability, human life will go on even if Iraq does not.

On rape, non-consensual sex, and the uses of language

In an April 28 column for the Washington Post, Petula Dvorak laments the use of the term “non-consensual sex” for acts that used to be called–or ought to be called–“rape.” The new label, she says, has “become part of the weaseling, whitewashing way we deal with sexual harassment, sexual assault and rape”; it makes it “too easy to minimize the scope of the problem” and threatens to allow predators and perpetrators to elude justice.

On one hand, she’s got a point. As labels go, “non-consensual sex” does sound euphemistic and insufficiently severe, rather like calling a robbery a “non-consensual donation” or something. It certainly fails to communicate the sense of trauma, horror, and violation that “rape” does. Swapping out “rape” for “non-consensual sex” in all cases and contexts would certainly be a serious mistake.

At the same time, whether Dvorak likes it or not, for most people the word “rape” carries connotations not merely of sex without consent but of sex by force. That means that, when most people hear the word “rape,” some of the things she wants to call “rape” don’t spring to mind. And that, in turn, means that some of the incidents Dvorak wants to call “rape” (putatively consensual sex between drunk people, for example, even though drunkenness nullifies putative consent) won’t get reported as rape or sexual assault.

“Non-consensual sex,” however, encompasses all forms of rape, sexual assault, drunk sex, and other sexual activity without consent, forcible or not. Its breadth invites people to recognize–and identify as wrong–a larger variety of sexual evils than most would identify as “rape.”

And that means that a label like “non-consensual sex” may even bring more incidents and patterns of sexual wrongdoing to light than a label like “rape” would, precisely because “non-consensual sex” is both broader and less freighted with specific kinds of imagery than “rape.” Dvorak’s own second paragraph supports this point, as does the Al Jazeera article linked therein:

The “non-consensual sex” rebranding is courtesy of Brett Sokolow, a lawyer who has been advising colleges and universities about dealing with rape on their campuses for the past 15 years. He told Al Jazeera America that college administrations don’t want to say the word “rape” and don’t want to believe their students could be rapists. But once he changed the term to “non-consensual sex,” the conversations were much easier. Focus groups loved it. Rape lite.

Certainly institutions invite PR nightmares when they take any action that appears to admit the existence of rape on their campuses, but “sexual assault” carries a bit less stigma, and “non-consensual sex” less still, so administrators can use those labels on their prevention-and-response programs with fewer worries. By the same token, the horror and violation inherent in the concept of “rape” keep plenty of victims from coming forward and plenty of perpetrators from accepting responsibility, facing justice, or reforming their ways.

And therein lies the tradeoff. Changing the language may trivialize evil that shouldn’t be trivialized, but it may also loosen some tongues that badly need to be loosened.

Thing is, despite Dvorak’s indignation at the change in terms, sometimes there is value in calling it “rape” and sometimes there’s value in calling it “non-consensual sex.” There’s probably value in using other terms sometimes, too; much depends on what the community–the whole community, including victims, administrators, perpetrators, and everyone else as well–needs in a given situation.

Sometimes it’s in everyone’s best interest to use the stronger terms to emphasize the evil of the actions and the severity of the problem. But sometimes it’s in everyone’s best interest to use the less freighted terms so people who might not otherwise talk about the problem finally will. Determining which situations call for which kind of term may be challenging, but discarding terms of either kind would be a mistake.

What every organist everywhere needs to, for the love of all that is good and right, STOP DOING

A few posts ago I wrote about certain problems common to leaders of musical worship, and I pointed out that organists are just as vulnerable to some temptations as are praise bands. Today I’ll write about another temptation peculiar to organists that I’ve noticed an awful lot.

That is the temptation to play any given hymn, response, musical setting, or other piece of music waaaayy too slowly.

If you’ve ever attended a church service that featured organ music, you’ve probably heard what I mean. Sometimes the “waaaayy” of “waaaayy too slowly” is an exaggeration, sometimes it’s not, but the “too slowly” is almost always present.

True story: When I was thirteen and my immediate family moved to a new state, we tried out a bunch of churches in our new town so we could decide which one we wanted to call home. We couldn’t just pick one from our old denomination, because our old denomination was Presbyterian (PCUSA, if you wanted that detail) and our new town was in New England, where Congregational churches abound and Presbyterian churches are few and far between. We eventually settled on our local United Methodist church for a variety of reasons, but one of the minor reasons was that it was the only one whose organist played the hymns at their proper speed.

Anyway, yeah, too often, organists play too slowly.

Sometimes I complain about this problem to a certain friend at my current church, who always counters that the organist is trying to play in a “stately” manner. In fairness to my friend, organs excel at playing in a stately manner, and the organist probably is trying to take advantage of that fact by giving every hymn, response, musical setting, and so on an air of stateliness. She may not even realize she’s doing it.

But there are at least two problems with taking that approach, and therefore at least two reasons I’d like organists everywhere to knock it off.

The first problem is that not every hymn, response, musical setting, or what have you is meant to sound stately. Sometimes a stately rendition is just plain wrong for a given piece. “Come Thou Fount of Every Blessing,” for example–or anything else set to the tune “Nettleton”–should sound sprightly, not stately. “Nettleton” can be lively or mellow, though it should be played at the same tempo either way. But it just plain doesn’t have the gravitas you need if you’re aiming for stateliness, so for ol’ John Wyeth’s sake don’t slow it down.

Try this, organists: Take the lyrics to any given hymn and sit down with them, or better yet stand up or go for a walk or something, and try singing them without an organ or any other instrument, at the tempo that feels most natural and right for that particular hymn. Then, when you return to your organ, play the hymn at that tempo instead of the one you usually use. Your congregation will thank you.

The second problem relates to a general principle of leading musical worship, which is that nothing kills a congregation’s enthusiasm like a song that drags on too long because it’s being led too slow. (This principle also holds for worship bands, youth group guitarists, and anyone else who leads group singing of any kind in any situation ever.) Even just a little too slow, so little you hardly notice if you’re the one leading the singing, can stretch a heretofore lively song out into an utter dirge once you multiply the slowness by three or four verses. When that happens, the congregation feels trapped in the song, and they stop thinking about God and start thinking about how the worship leader really should be playing faster. They stop closing their eyes in joyous rapture and start looking out the window to watch the glaciers whiz by.

The best tempo for any given hymn or musical setting or whatever is whatever tempo is proper for that hymn or musical setting or whatever. Usually that’s whatever tempo feels most natural, absent the limitations of the congregation or the instruments or whatever else might affect the way you lead singing. But I found, in my years leading musical worship for my college fellowship (yup, I was once that guy), that when you’re actually up there in front of the group, sometimes time distorts itself so the tempo that feels right to you then is actually a bit too slow.

The remedy is easy: Err on the side of playing a little too fast. Don’t race through it, of course; that doesn’t work either. Just play a little faster than you think you should. Sometimes that’ll mean that the song or hymn feels a little too fast, sure–but more often than not it’ll mean that you actually get the tempo just right. And your congregation will thank you.

Just please, please, please stop playing everything too slowly. Yes, this means you.

Thank you.

A few reflections for Good Friday

Today is Good Friday (or Holy Friday in some traditions), the day Christians commemorate the execution of Jesus. Rather than one long post on a single topic today, I thought I’d share a few brief(ish) observations on topics related in some way to the cross.

  1. Sometimes we spread the message of the cross in some really screwy ways. Mel Gibson’s 2004 splatter flick The Passion of the Christ, for example. When it first came out, an awful lot of voices in the Christian press were proclaiming how wonderful an opportunity it presented for evangelism. Take your friends to see it! Show them how much Jesus suffered for their sake!

    Except no. If you already know the story of Jesus, sure, you’ll have some context for the events of the movie–but if you don’t, all you really see is some poor guy getting wailed on for two hours. It’s basically torture porn. This is a message of love … how, exactly?

    In a way, what it reminded me of most was a movie they showed at a Maundy Thursday supper at our church when I was maybe three–my parents didn’t think to take me out of the room for it. It was one of those reel-to-reel jobs, and it must not have had sound, because I remember the pastor narrating (“This is Peter–crying, because Jesus is dead”). But there was a graphic depiction of Jesus’ crucifixion. I remember thinking for months after that, “I’d cry if I were nailed to a cross.” I also remember unsettling dreams about myself or my family members facing imminent crucifixion, usually as some sort of medical procedure whose purpose was never explained. Point is, I saw Jesus’ crucifixion in the context of his life and in a family and church community that would continue to care for me and help me understand, and it still kind of traumatized me. As essential a part of the Gospel as the crucifixion is, we should think more carefully than we sometimes do about how we talk about it with non-Christians, not-yet-Christians, and new Christians.

  2. The theology of the cross is another thing we do weirdly sometimes. About ten or twelve years ago I found myself in conversation with the pastor of a Reformed Presbyterian Church. He was a great guy, but it was clear from the beginning that his theological perspective was very different from mine. He talked about the relationships between grace and law or between the Old and New Testaments in more oppositional terms than I was comfortable with. It wasn’t that I disagreed with his theology entirely; I just thought he was allowing too little room for other interpretations to supplement or modulate it.

    As our conversation wrapped up he handed me two pamphlets he had on him that he said would help explain the matter further and give me something to think about. I gladly accepted, and when I got home I read the pamphlets. His good intentions notwithstanding, they were no help whatsoever.

    The problem was that they were written not to convince an outsider that the RPC’s theology of the cross was true but to convince other RPC folks that this or that soon-to-be-ex-RPC-pastor’s theology of the cross was false. Both pamphlets were strident in their defense of something they called “classical reformed covenant theology” (which I’ll abbreviate “CRCT”), but I only knew what that was from a college class on the Puritans I’d taken years before; the pamphlets themselves told me nothing about CRCT, except that so-and-so’s theology wasn’t it. We should defend truth against error, sure, but I was rather put off by the weak arguments (“He can’t be teaching CRCT because he’s not using the phrase ‘substitutionary atonement’” and that sort of thing), the theological rigidity, and the hostile tone of the pamphlets.

    To be fair to the pastor, he hadn’t meant to alienate me. I couldn’t help wondering whether he’d really meant to hand me those particular pamphlets, or whether, not having read them closely, he thought they were about something else entirely. I guess, if there’s a lesson here, it’s to know what you’re sharing with whom.

  3. One day during our junior year of high school (which is grade 11, if any non-Americans are reading this), one of my best friends gave me a tape she’d made for me of the soundtrack to the Andrew Lloyd Webber/Tim Rice rock opera Jesus Christ Superstar. I hadn’t seen a stage performance or movie of it–still haven’t, actually–so it took me a while to sort out which characters were singing some of the songs. But I liked the music well enough, so I kept listening to it.

    I was a bit bothered by Jesus’ apparent lack of concern with matters of salvation and his ignorance of why his Father wanted him to die. And worse, the play ends with Jesus’ burial and doesn’t include his resurrection. Seems rather a glaring omission.

    But as I listened to the tape again and again over the years, I came to appreciate what the opera does well. For example, the political side of what some people hoped and others feared that Jesus would do often gets ignored amid our discussions of spiritual salvation, but Superstar captures it very well. The disciples’ cluelessness about what Jesus is actually up to is evident throughout, as are their changing moods, as the jubilation of Palm Sunday gives way to the violence of Good Friday–and as an egotistical Jesus swings wildly (as they see it) from benevolence to wrath to despair and back again. Even if it takes some liberties with the Scriptural accounts, Superstar has forced me to think about the events of Holy Week, the role of politics, and the perspectives of the disciples more deeply than I might have otherwise. That tape my friend gave me remains a staple of my music collection. And I’ve been grateful to her ever since.

Me and the Eucharist: a brief history

Tomorrow–maybe today, by the time I get this posted–is the day that my church calls Maundy Thursday, some other churches call Holy Thursday, and pretty much all churches commemorate the Last Supper Jesus shared with his closest followers before he was arrested. So here’s how I came to think what I think about the Lord’s Supper–or Communion, or the Eucharist, or whatever you might call the thing with the bread and wine (or grape juice or whatever).

I grew up in a Presbyterian (PCUSA) church that served Communion on the first Sunday of every month and invited all baptized Christians, regardless of their denominational background. There was a brief liturgy, including prayers and responsive readings, and then the congregation stayed in the pews while the servers brought around plates piled with small cubes of white bread followed by plates filled with small plastic shot glasses of grape juice; we’d consume each element immediately upon receiving it.

Then my family moved and we began attending a local United Methodist church. Communion was also the first Sunday of the month there, again open to all baptized Christians, and again we’d have a brief liturgy of prayers and readings, but we’d receive the elements by going forward to kneel at the altar rail, where the servers would give us cubes of white bread and shot glasses of grape juice.

A few years later I attended my first Roman Catholic service: a Christmas Eve Midnight Mass with a friend’s family. My friend advised me that, since I wasn’t Catholic, when the time came to go forward for Communion I should remain seated. That was my introduction to the concept of closed Communion, which is when a church doesn’t serve Communion to people who aren’t of its denomination, and also to the practice of giving the congregation only the bread and not the wine.

That closed Communion experience followed me to an Anglican church I visited with some friends when I was in college. The service felt similar enough to the one at my other friends’ Catholic church that I assumed the same rules applied. When the time came for the Eucharist to be served, I went forward but crossed my arms over my chest, and the priest prayed a blessing over me. After the service, though, he asked if I was a believer and if I’d been baptized; when I said yes to both questions, he assured me that I could indeed receive Communion there–which, on subsequent visits, I happily did.

But during college I also attended a few other churches on occasion, including some non-denominational evangelical churches. The contrast between the way these churches did Communion and the way the Anglicans did it could hardly have been starker. Where the Anglicans had a complex, thorough, and well-ordered liturgy, these other churches had a Scripture reading and a brief extemporaneous prayer, and where the Anglicans made a point of serving Communion every week, these other churches served it once a month and in some cases only a few times a year. And I found that, where I left the Anglican service feeling like I had partaken of a sacrament infused with meaning and sacredness, I left these other churches feeling like I’d just eaten a crumb of bread and drunk a shot of grape juice and that was it.

Now, I’m not saying there wasn’t any spiritual significance in the way these other churches served Communion; I expect there was for most people in the congregation. And probably some folks would have found the Anglican liturgy impenetrable, restrictive, or otherwise off-putting. I’m just telling you how I felt about the services after I attended them. Your mileage is perfectly welcome to vary.

Anyway, after college I moved again and started attending another Presbyterian church (PCUSA again), but this one did Communion differently from the church of my childhood. Twice as often, for one thing: the first and third Sundays of the month, and a few years later they began serving Communion every week. For another thing, the congregation went up front to receive the elements: we’d tear pieces off a large loaf of bread–or break off a piece of matzo–rather than picking up pre-cut cubes, and we’d dunk the bread in goblets of grape juice rather than sipping the juice from individual shot glasses. On the whole they were a much more liturgical lot.

Then I moved again and started attending an Anglican church plant. The only thing that made their Communion feel any less meaningful than my earlier Anglican experience was the bread: instead of the usual wafers they had us tear off pieces of pita, and for some reason the pita they used was made with a grain that sent big cough-inducing chunks down our throats.

Another move saw me again make my church home at an Anglican church–though this one uses the wafers, so there aren’t any grain particles. Also, you have the option of sipping wine from the chalice or dunking your wafer. (I usually sip, unless I’m sick.)

But one of the things I’ve found, after going to Anglican churches regularly for the last few years, is that, at least for me, weekly Communion doesn’t dilute the meaning or sacredness of the sacrament the way you might assume. (More than likely, that very assumption is precisely what drives some churches’ decision to serve Communion only a few times a year: the less frequently Communion happens, the more it will probably mean to people when it does.) No, for me it’s quite the opposite. Weekly Communion becomes almost a form of sustenance, a frequent reminder of my dependence on a grace that I can’t earn but am still given and must still come forward to receive.

And tomorrow, on Maundy Thursday, I’ll go forward again.

Peace to you, whatever your faith, tradition, theology, or lack thereof, and whatever you’ll be doing on Thursday.

Why worship leaders should maybe stop listening to worship music

All right, despite my Lenten intentions, it’s been a few days since the last time I had a chance to post, and I’m not sure this week will be any better. It also remains to be seen what’ll happen once Lent ends, but maybe I’ll figure that out by then. But I’m here tonight, anyhow, so here’s a post.

A few of my recent posts have centered on music, so here’s one more. And this one will probably hold true for you regardless of what kind of worship music you prefer in your church services, whatever kind you connect with or whatever kind helps you connect with God.

A couple of months ago, a friend of mine posted a link to this article by Zac Hicks about how choirs in the Middle Ages and worship bands today too easily and too often slide from leading the congregation in worship to worshiping on behalf of the congregation. Sometimes that happens because, gradually and (one would hope) unconsciously, those entrusted with leading musical worship become enamored of ever more complex and interesting music–or of their own talents–and the congregation either can’t keep up or doesn’t see the point. Sometimes it happens because, gradually and (one would hope) unconsciously, the congregation takes an ever more passive role in musical worship, content to watch and listen to what happens up front rather than to participate themselves, and the worship leader eventually tires of cajoling them and stops trying. Sometimes these leader-side and congregation-side problems happen in tandem.

Thing is, despite Hicks’ focus on worship bands, worship bands aren’t the only ones subject to this problem. Sure, I’ve seen pop-star wannabes and rock-band theatrics that proved plenty of worship bands were more interested in their own music-making than the congregation’s. But I’ve also heard overwrought flourishes and awkward instrumental breaks that proved plenty of organists were more interested in their own music-making than the congregation’s. At root, the instruments they’re using and the style of music they’re making don’t end up mattering that much.

Several years ago, I had the privilege of attending services at the Duke University Chapel a few times. The impressive space and aesthetics of the building and the majestic sound of the organ complemented each other brilliantly. But the organist (whose name I no longer remember) insisted on adding more and more layers of harmony to the successive verses of every single hymn, bringing the sound to the brink of horrific discord by the end and making even singing the melody difficult. The music ended up being one of the reasons I didn’t worship there more often. (Admittedly, the lack of community in the large and anonymous crowd was another, but it was still worth going once or twice.)

A few years before that, I regularly attended a Presbyterian church (PCUSA, if you were curious) that had both “traditional” and “contemporary” services every week, though I generally opted for the “traditional” service. The organist in those days was plenty talented at playing the organ, but not so talented at leading a congregation in worship. Between the second-to-last and last verses of every hymn he insisted on inserting a few bars’ worth of instrumental solo. And every week at least a few people, especially if they were new, would start singing the last verse too early and get embarrassed. Every. Single. Time. I liked enough other things about the church that I stuck around, but the organist was horribly off-putting. (I should note that he’s stopped doing those instrumental solos since then and just plays the hymns straight.)

Now, the offertory, the music played while Communion is being served, and the other moments when the congregation isn’t expected to sing along anyway are all fine. And there’s nothing wrong with a recital or a concert either, whether by an organist or choir or worship band or anything else. I’m just saying that soloing, like anything else that smacks of showing off, has no place in the leading of what should be participatory musical worship. And that’s true no matter what musical styles or instruments are involved.

It does seem, though, that worship bands are vulnerable to the temptation to inappropriately solo in a way that organs and choirs may not be. That’s because much modern praise music is modeled after pop music, which often includes instrumental solos, usually before the last verse or refrain. And worship bands learn much of their repertoire by listening to CDs (What? I still own CDs.) recorded by Christian bands of one kind or another, and since the recording is usually just a band playing in a studio rather than a band leading a congregation, they leave the solo in. The worship leaders who listen to the recording think the solo sounds like a natural, normal part of the song, so when they’re rehearsing the song, they leave the solo in. And when they’re leading the congregation in singing the song, they leave the solo in.

Maybe organists and choir directors have a similar problem. I’ve heard a lot more recordings of modern praise music than of organ music or choir music, though, so I don’t know.

What I do know is that it’s dangerous for people entrusted with leading a congregation in musical worship to listen to recordings of worship music. It can be done, of course, if due care is given to preserve the participatory nature of congregational worship. But for far too many worship leaders, it’s too tempting to try to imitate the recording–the instrumentation, the structure of the song, and all other elements of it–too closely. For many worship leaders, the best course of action may be to sell, give away, throw out, burn, or otherwise get rid of their worship music CDs.

They’d be doing a great service to the congregations they lead.