Black Lives Matter and the Problem with Slogans

Slogans are often used as shorthand for a conjunctive set of ideological tenets. Affirmation of the slogan implies the affirmation of these tenets.[1] For example, let us suppose that Slogan S entails Tenet X and Tenet Y. Insofar as one rejects one of these tenets, one must also reject the slogan that represents them. More formally, this can be represented simply as:

  1. S if and only if X & Y
  2. Not-X
  3. Therefore, not-S.
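
For the formally inclined, the validity of this schema can even be machine-checked. The following is a minimal sketch in Lean 4 (my own illustration, not part of the original argument), with S, X, and Y as in the schema above; it verifies that premises 1 and 2 entail the conclusion:

  -- Premise 1: S holds if and only if X and Y both hold.
  -- Premise 2: X does not hold.
  -- Conclusion: S does not hold.
  example (S X Y : Prop) (h1 : S ↔ X ∧ Y) (h2 : ¬X) : ¬S :=
    fun hS => h2 (h1.mp hS).left

The proof is simply the contrapositive reading of premise 1: any affirmation of S would yield X, contradicting premise 2.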

Admittedly, amorphous entities such as movements or causes are often fluid with respect to the tenets comprising them, but this will suffice as a general characterization; for a slogan with no clear set of agreed-upon tenets cannot usefully represent a movement.

As with any movement, Black Lives Matter has been built around a set of official tenets. Whether one finds these tenets uncontroversial, objectionable, or some combination of the two will be governed by one’s worldview. Given the stated desire to “disrupt the Western-prescribed nuclear family structure” and “foster a queer-affirming network,” and the deliberate use of terms with Marxist undertones, such as “comrades,” “power,” and “liberation,” there is much to regard as objectionable from a Christian point of view. Therefore, though Christians believe that black lives matter, they are–and should be–reluctant to endorse BLM (as a movement). To formalize this along the same lines, we could say that:

  1. Endorsing BLM (as a movement) implies the beliefs that, among other things, black lives matter and that the “Western-prescribed nuclear family structure” should be disrupted.
  2. One should reject the belief that the “Western-prescribed nuclear family structure” should be disrupted.
  3. Therefore, BLM (as a movement) ought to be rejected.

The sort of reasoning that should compel us to distance ourselves from BLM is the same reasoning used when a member of Congress votes against a bill containing pork-barrel projects. If a bill called the “Roads Improvement Act” contains spending devoted to studies of mice, the bill ought to be rejected in principle.

Socially and publicly, this is not easy to do. In the case of BLM, the choice to make the movement’s slogan identical to a single, utterly uncontroversial tenet–namely, that black lives matter–is a powerful rhetorical device. This is the same kind of move made by numerous other movements, such as marketing abortion-on-demand as the uncontroversial “right to choose” or construing opposition to legalizing homosexual marriage as being in favor of “traditional family values.” This makes detractors from the movement as a whole easy to vilify, since anyone not willing to swallow the entire, jagged pill on account of controversial tenets is simply painted as a detractor from the uncontroversial tenets.

Most people repeating the phrase “black lives matter” probably do not mean to endorse BLM as a movement, but only to express their solidarity with the belief that black lives matter. Those falling into this camp might be tempted to think that the argument I am making here is an exercise in philosophical hair-splitting. If ideas do not have consequences, then it is indeed. But ideas do have consequences, and bad ideas have bad consequences. I submit that the pork-barrel ideology being subtly tethered to the idea that black lives matter is worse than “bad”; it is dangerous. If we are to “be wise as serpents and innocent as doves,”[2] we ought not be dismissive of our responsibility to think critically. We must separate the ethical wheat from the political chaff.

Ironically, it is because black lives matter that Christians ought not to align themselves with Black Lives Matter. As the human race has learned all too well,[3] a truth mixed with a falsehood is far more dangerous than a falsehood alone. Given that one of the real crises afflicting many black communities is fatherlessness,[4] one can only be appalled that the Black Lives Matter movement has as one of its stated aims the disruption of the “Western-prescribed nuclear family.” There are certainly many well-meaning Christians who wish to do good, to exercise compassion, solidarity, and kindness, who believe that all men are created equal, and yet have unwittingly aligned themselves with an organization they think has but a single tenet: that black lives matter. Yet, those of us who are fervently anti-racist, who want to empathize with those who are hurting, must nevertheless remind ourselves that, though we should respond with grace, it must not be at truth’s expense.


[1] This is true even if a person repeats a slogan without intending to endorse a broader set of tenets. One can only imagine how many German citizens, endorsing the Nazi party in 1920 because they agreed with the demand for “equality of rights for the German people in respect to the other nations,” came to later regret their vote on account of the outworking of another of the party’s tenets: “Only a member of the race can be a citizen. A member of the race can only be one who is of German blood, without consideration of creed. Consequently no Jew can be a member of the race.” See https://time.com/4282048/1920-hitler-political-platform/.

[2] Matt. 10:16

[3] Gen. 3:1

[4] See “About One-Third of U.S. Children Are Living with an Unmarried Parent,” https://www.pewresearch.org/fact-tank/2018/04/27/about-one-third-of-u-s-children-are-living-with-an-unmarried-parent/


"Money Doesn't Make One Happy:" Examining Christian Folk Wisdom on the Subject of Wealth

Casual conversation is replete with folk wisdom on the subject of money. One becomes accustomed to hearing such remarks as, “Oh, I would never want to be rich.” This is usually followed by some sort of qualifier (“I just want to be comfortable”), or a platitude (“Money doesn’t make one happy, you know”), or the unimaginative, “I wouldn’t even know what to do with a lot of money,” and so on. These, and other like phrases, comprise the sort of commonplace moralizing to which, through familiarity, one tends to become inured, and they are often reciprocated by their hearers with an approving nod, as if the speaker had just uttered a trivial truth, such as, “I just want to be a good person.”

Were such phrases uttered by someone who was not already wealthy, however, it would be much easier to regard them seriously; for phrases of this sort are, of course, the sorts of things only wealthy people say. Indeed, it is only in rare cases that our American wealth-eschewer does not at that very moment have (at least) a computer worth $500 to $1,000, otherwise known as a “smart” phone, on his person, own a vehicle, a home, a laptop or tablet, have access to air-conditioning, plumbing, running (drinkable) water, electricity, internet, cable television, a formal education, and a source of regular income.[1]

That a person engaged in such a prodigious standard of living can, without experiencing the slightest twinge of cognitive dissonance, blatantly denigrate the possession of wealth is, perhaps, evidence for just how exceptional the American economy is. Such statements reinforce the fact that, beyond the attainment of those necessities essential to survival, “comfort,” like wealth, is a relative state of affairs. The state of “being comfortable” means one thing to (say) a Papuan and, to the average American, another thing entirely. In the case of the latter, it usually means something that, both historically and globally, amounts to a standard of living beyond even that of ancient kings; for, despite Solomon’s unimaginable affluence,[2] no amount of gold could have afforded him the availability of electricity or the medical benefits of penicillin–advances that, in America, even the relatively poor take for granted. The standard of living most people in the history of the world would have considered “upper-class” is far exceeded by that which contemporary Americans now consider merely “comfortable.” But find me the man whose family is starving, or who is of financial necessity engaged in mundane or dangerous or backbreaking work, or the couple who is forced, for financial reasons, to allow their children to be raised by others, who will yet utter with a straight face such a phrase as, “money doesn’t make one happy,” and you will have discovered one who is either in denial of his predicament or does not understand happiness.

The Meaning of “Money Doesn’t Make One Happy”

Were one to ask ten people what “comfortable” or “happiness” means, or to define what (to them) constitutes “a lot” of money, however, one is likely to receive at least eleven different answers. (This is, after all, not a subject many of us have thought critically about.) The answers produced will inevitably be products of one’s upbringing, social context, and values. Thus, as in any discussion, and especially one at whose center are much-equivocated terms, if we are to avoid speaking in circles, we must clarify at the outset what we mean. Rather than attempt a lengthy exposition of the various meanings of “happiness,” or an analysis of its etymology, however, I shall simply say what I think most people intend by it. In the context of an assertion like “money doesn’t make one happy,” people cannot mean by this something that is obviously untrue, such as, “money can’t buy comfort” or “money can’t buy pleasure.” Rather, they seem to mean something akin to the following: “no amount of money can provide a holistic sense of satisfaction in life.” This is surely a claim to which only a thoroughgoing hedonist might object.[3] The rest of us are compelled to agree, else profound dissatisfaction would be absent among those with sufficient wealth (whatever that is); yet, there is clearly no shortage of those who are both rich and deeply unhappy. This is because inveterate unhappiness of the kind produced by meaninglessness often stems from becoming discontented with pleasure, rather than from the regular experience of pain.[4] Summarizing Augustine’s thoughts on the matter, Nicholas Wolterstorff notes–correctly, in my view–that, “Only enjoyment of God is worthy of desire for its own sake. If enjoyment of some earthly thing comes your way, praise God for it; but do not desire it, do not seek it for its own sake. That way lies unhappiness.”[5] Therefore, let us agree that material wealth cannot ultimately satisfy.

If this is indeed what people mean by the phrase, “Money doesn’t make one happy,” it is odd that they should utter it in the contexts they usually do; namely, as a reason for why they do not–or would not want to–make more money or build wealth. This is made all the more strange by the fact that the man who, in conversation, shuns building wealth is often the very same man who sends letters to friends and family asking for donations to fund his short-term missions trip, complains nightly to his wife that he is underappreciated and underpaid, prays that he might be granted a pay raise, and could not conceive of turning down a promotion, so long as it came with greater pay. He wants desperately to be financially prosperous, but not too much. By day, he rolls his eyes at lottery winners and “trust fund babies”; by night, he clips coupons and enters online raffle drawings. He wants to be handsomely rewarded, but not if it requires being too uncomfortable for too long.

The Wealth Paradox

It is here we encounter the existential paradox with which some Christians appear to struggle: “the love of money is the root of all kinds of evil” (1 Tim. 6:10), yet money is also clearly indispensable to a society and a means to many good things; it is a thing of which to be wary, yet one that cannot be escaped. Since it cannot be escaped entirely, apart from a retreat into monasticism, the solution some have adopted is to treat as an extravagance any material wealth beyond being merely “comfortable.” Never mind that, if we are to use our Lord as the paradigm, even this is an extravagance. Indeed, I have been somewhat haunted by C.S. Lewis’ thought on the question of how much we should give:

I do not believe one can settle how much we ought to give. I am afraid the only safe rule is to give more than we can spare. In other words, if our expenditure on comforts, luxuries, amusement, etc., is up to the standard common among those with the same income as our own, we are probably giving away too little. If our giving does not at all pinch or hamper us, I should say it is too small. There ought to be things we should like to do and cannot because our commitment to giving excludes them.[6]

I say “haunted” because my suspicion is that Lewis has said something right and difficult about the attitude we ought to cultivate with respect to our lifestyle and our giving.

In any case, the reason for this apparent paradox–acknowledging the dangers of money while recognizing its necessity–is, it seems to me, that many Christians tacitly assume that the only reason a person would work long and hard to create significant wealth is to spend it on oneself. If not for this assumption, comments like, “I would never want to be rich,” become virtually unintelligible, especially in view of the ironic fact that most of us will of necessity spend more waking hours attempting to earn money than we will spend doing anything else.

Likely, however, what people really mean by this is not that they do not want to become wealthier than they are, but that they either do not want to do what they believe is necessary to create greater wealth, or else they don’t want to become like a certain kind of person that they associate with affluence (i.e., the “rich man” archetype; cf. Luke 16:19-31). Regarding this latter concern, Lewis cautions that:

One of the dangers of having a lot of money is that you may be quite satisfied with the kinds of happiness money can give and so fail to realise your need for God. If everything seems to come simply by signing checks, you may forget that you are at every moment totally dependent on God.[7]

We would indeed do well to remember that God is ultimately the source of all good things, renouncing both an unhealthy emotional attachment to our possessions and any means of obtaining them that requires the oppression of others. But why think that working to create significant wealth is necessarily only a selfish endeavor, especially insofar as it allows one in the long run to spend one’s time in the pursuit of more worthwhile activities than earning a paycheck? On the contrary, why not deliberately delay one’s own gratification, not only for the sake of one’s own family’s future (which is a good in itself; see I Tim. 5:8), but for the sake of others? Correctly understood, and properly ordered among our values–our “loves,” if you will[8]–there need be no existential crisis with respect to creating wealth. Like a hammer, money is but a tool, and it is one’s character that will influence whether it is used for good or for ill: it can either buy slaves or fund an orphanage. As such, it is a useful servant, but a diabolical master.[9]

Wealth and Living Modestly

So long as it means living below one’s means in a manner devoid of vanity, living a modest lifestyle is commendable. But why assume that one must remain poor (relatively speaking) in order to live modestly? Need one assume that wanting to increase one’s wealth is indicative of disordered desires? If one wants to live on only $25,000 per year,[10] but has the means to create much greater wealth (all other things being equal), why not continue to live on $25,000 per year and, if one so chooses, give the rest away?[11]

Thought of in this way, it becomes clear that at the root of pious statements, such as, “I would never want to be rich,” is sometimes a kind of selfishness. To the extent that our own needs are met, we tend not to entertain thoughts of making ourselves uncomfortable in the short term for the benefit of others in the long term. Indeed, many people refuse to delay their gratification even for their own future well-being,[12] and it is surprising that some seem to think it an exercise in humility to deliberately work for little pay, while simultaneously accruing debt to buy things they cannot afford.[13]

The reason people say they need to “pray about” whether to give to meet a certain need or fund a worthwhile cause is not because there is uncertainty about the merit of the need–this is usually obvious–but because they lack the resources to assent without reservation; for a drowning man is ill-equipped to save another. There is often little material difference between someone’s needing to “pray about” whether to give to X and someone’s needing to “consider” giving to X. As such, this is often just a way of responding that intends to avoid challenge by one’s interlocutor; for who would dare suggest that one not pray about something?[14] In its worst form, much as we often take home leftover food knowing we will probably just toss it out later, responding this way is sometimes just a passive way of saying “no.” But we should caution ourselves against using God as a screen for our own dearth of confidence in asserting ourselves.

Some have justified deliberately remaining financially hand-to-mouth because of the extent of their giving to others, and of all the reasons to deliberately struggle financially, this is certainly the most laudable. Did not the widow who gave her only two coins effectively give more than even the rich?[15] As in many other cases, Jesus clearly sought to point out that the condition of the widow’s heart, combined with her obedience, was of more spiritual worth than the large gifts of the wealthy–presumably, not because there is something inherently holy about small gifts, but because there is something holy in obedience when obedience comes at significant cost to the giver.

It would be a mistake, however, to cite this passage as evidence that Jesus intended to praise the widow’s being poor. This is, in part, because there are throughout the Bible numerous commands to defend,[16] do justice to,[17] and be generous towards the poor,[18] as well as praises for God’s doing such things–presumably because being poor is something to be remedied, rather than sought, and because justice requires that we not neglect those in need.

The Distinction Between Jobs and Work

A friend of mine recently mentioned to a Christian acquaintance that he was actively pursuing financial independence in order to be able to focus on interests other than his job. His interlocutor responded by asking, “You know [Saint] Paul was a tentmaker, right?” This was as if to say, “if such a venerable figure as Paul remained employed in the course of his apostolic mission, who are you to seek to escape your job?” In addition to the stark assumption that my friend’s job was a good one, and therefore not worth leaving, there is at least one more unjustified assumption underlying this response–that the desire to escape one’s job is equivalent to the desire to escape work in general.

Because we colloquially refer to our jobs as our “work,” it is understandable that the distinction between the categories of “job” and “work,” of which jobs are a subset, has become blurred. There are clearly many endeavors that involve work, however, even arduous work, which would not constitute a job[19]–raising children being but one obvious example. Thus, while the desire to avoid work in favor of unadulterated leisure is rightly to be eschewed, it is nevertheless incorrect to equate the attempt to free oneself financially from one’s job with the attempt to free oneself from the responsibility of work in general.[20]

Implicit in such remarks is the assumption that to bend towards the ascetic is in some way to bend towards holiness. After all, if we desire to follow Jesus, are we not to heed the words of our Lord?

Then Jesus told his disciples, “If anyone would come after me, let him deny himself and take up his cross and follow me. For whoever would save his life will lose it, but whoever loses his life for my sake will find it. For what will it profit a man if he gains the whole world and forfeits his soul? Or what shall a man give in return for his soul?”[21]

Indeed we must. And it is on a certain understanding of this and other similar biblical passages[22] that the foundations of Christian monasticism were built; for monasticism is an attempt, among other things, to live consistently with Christ’s call to self-denial. This abnegation has historically involved the rejection of those worldly pleasures or drives that comprise ordinary human life in favor of abstinence, temperance, austerity, and, in its extreme forms, deliberately inflicting pain on oneself in the form of flagellation, sleep deprivation, or exposure to extreme temperatures. Hence, falling somewhere on the spectrum from complete self-indulgence to self-harm, asceticism (from the Greek “askesis”) tends to be a matter of degree, and there are many attempts at asceticism that fall short of something as all-encompassing as monasticism.[23]

In my own case, there was something in this call to self-denial that, even as a child, I felt compelled to observe. Though no one had taught me to do so, I recall that, for some time, as our family would pray before each meal, I would quietly hold my breath for the duration of the prayer, believing that, in virtue of God’s holiness, I did not deserve even to breathe in His presence. As one might suspect, I had cause to seriously question the feasibility of this practice if, on a given night, my father’s thankfulness happened to be particularly long-winded.

At some point–I do not remember when–I came to realize that this practice, while perhaps admirable in its attempt at reverence, was at least unnecessary. The same could be said for asceticism writ large. Denying oneself food for a time, in recognition of the fact that “man shall not live on bread alone, but by every word that comes from the mouth of God” (Deut. 8:3; Matt. 4:4) is good; forever denying oneself food–or air–is deadly. Retreating periodically into solitude to pray is an indispensable discipline; withdrawing permanently from society is to relinquish our responsibility to fulfill the Great Commission.[24] Likewise, while remaining ever-vigilant that our desires for comfort, leisure, or resources do not ascend in our hearts to the status of “loves”–of ends in themselves–we should also caution ourselves against using this concern as an excuse to remain complacent. Becoming wealthy does not guarantee happiness,[25] but neither does being poor.[26] Being wealthy does not entail being materialistic, nor does being poor entail being righteous. We might be called to endure suffering on behalf of Christ. To do so with the right attitude is virtuous. But to inflict suffering on ourselves, or our families, either directly or indirectly, is to be foolish–or worse. While I do not intend to suggest that building wealth or pursuing financial independence is a moral obligation, good stewardship of the opportunities and resources afforded us is. As Solomon reminds us:

Go to the ant, O sluggard;
    consider her ways, and be wise.
Without having any chief,
    officer, or ruler,
she prepares her bread in summer
    and gathers her food in harvest.
How long will you lie there, O sluggard?
    When will you arise from your sleep?
A little sleep, a little slumber,
    a little folding of the hands to rest,
and poverty will come upon you like a robber,
    and want like an armed man.[27]

Insofar as one has the means to do so, I submit that we would do well to consider the ways in which our temporary self-denial might be applied, not only for our own gain, but to the very great benefit of others.


[1] See https://www.heritage.org/poverty-and-inequality/report/understanding-poverty-the-united-states-surprising-facts-about

[2] See I Kings 10:14

[3] And even he might only grant that, holistic satisfaction being illusory, the pursuit of pleasure is just the closest one can come to genuine joy.

[4] Conversely, the regular experience of pain need not produce inveterate unhappiness.

[5] Nicholas Wolterstorff, Justice: Rights and Wrongs (Princeton, NJ: Princeton University Press, 2008), p. 144. See also Proverbs 21:17.

[6] C.S. Lewis, Mere Christianity (1952; repr. New York: Touchstone, 1996). See also Luke 21:1-4.

[7] C.S. Lewis, Mere Christianity (1952; repr. New York: HarperCollins, 2001), 213-214.

[8] In You Are What You Love: The Spiritual Power of Habit (Grand Rapids, MI: Brazos Press, 2016), James K. A. Smith makes a compelling case for the fact that humans are properly understood as beings that love, rather than merely beings that think.

[9] See Matthew 6:24

[10] The weighted average poverty threshold for the United States in 2018 for a family of four with two children was $25,465. See https://www.census.gov/data/tables/time-series/demo/income-poverty/historical-poverty-thresholds.html

[11] For now, I set aside those situations in which parents force their children to go without for the sake of their own ideals.

[12] See https://www.cnbc.com/2019/02/21/consumer-debt-hits-4-trillion.html

[13] For example, encountering a recent college graduate who is debt-free is now an anomaly. See https://www.forbes.com/sites/zackfriedman/2019/02/25/student-loan-debt-statistics-2019/#5493a32c133f

[14] Cf. 1 Thess. 5:16-18

[15] See Luke 21:1-4

[16] See Jeremiah 5:28, Proverbs 31:8-9

[17] Leviticus 19:15, Ezekiel 22:29, Jeremiah 5:28; 22:3, Isaiah 1:17, Psalm 140:12

[18] For example, see Deuteronomy 15:11, Isaiah 58:10, Proverbs 14:31; 19:17; 22:9, Matthew 5:42, Acts 20:35

[19] By which I mean a situation in which one trades time employing a skill for a wage.

[20] An additional problem with this reasoning is that it could be employed against any number of self-evident goods. Suppose one desired to avoid hitting an iceberg while on a cruise ship. Another might as well respond with “You know Paul was shipwrecked, right?”

[21] Matthew 16:24-26

[22] Cf. 1 Pet. 2:11; 1 Cor. 9:27

[23] For example, Christians are commanded to engage in regular, temporary acts of asceticism in the form of fasting, abstinence, etc. See, for example, Acts 13:3, Acts 14:23; 1 Cor. 7:5.

[24] Matthew 28:16-20

[25] Cf. Ecclesiastes

[26] Cf. Proverbs 10:15; 14:20

[27] See Proverbs 6:6-11


Grand Old Introspection


So Trump it is, then. The dismissive chuckling at the prospect of a Trump nomination emanating from the Nate Silvers and Jennifer Rubins of the world, among scores of others, has trailed off into the animated din of those attempting to appraise the strange and–some would say–unfortunate state of political affairs we presently find ourselves in. There does, after all, come a point at which genuflection to fact is the only reasonable course for even the most prescient. While registering my sympathies with the conservative among us–those who are at present engaged in collective hand-wringing over Trump’s bizarrely consistent successes–I here wish to consider what follows from the apparent fact that Trump’s (stated) values and policies are not as out of step with the majority of the Republican voting base as initially supposed. That is to say, if Trump has merely set sail atop a previously latent political undercurrent, in addition to supplying more than a little of his own hot wind, then the problem of Trump’s nomination is more sinister, for it is no fluke.

Conservatives, then, must contend with the reality that, to the shame of the Republican party, a candidate of Trump’s caliber–or anti-caliber, as it were–has by popular demand been given a realistic shot at assuming the highest office in the nation. Yes, that same gentleman and scholar who remarked that “You know, it really doesn’t matter what [the media] write as long as you’ve got a young, and beautiful, piece of ass” (Esquire, 1991), boasted about his phallic proportions in a presidential debate, pretended to be his own publicist, and defended the notion of lethal attacks on terrorists’ families is seeking to be, among other things, Commander-in-Chief of the armed forces of the United States. Were it not actually the case, this could easily be the beginning of a good joke. Though I have in past elections grown accustomed to being underwhelmed by the results of the Republican primaries, never before have I been so disinclined to associate myself with the Party. One cannot help but imagine a White House emblazoned–perhaps literally in Trump’s case–with a large, gilded “T”–the perfect realization of what had previously been but a Leftist caricature of the GOP. Unfortunately, however, no caricature is needed; or, rather, he supplies the necessary material himself. Thus, the GOP can no longer pretend that large swathes of its voting base are not as susceptible to cheap populist rhetoric as their progressive peers.

Though these forces have lain dormant for some time–or have at least been politically outmatched–the bloviating businessman’s puerility has apparently been sufficient to induce their emergence en masse from beneath the feelings of disenfranchisement that have heretofore characterized their apathetic relationship to Republican primaries (cf. Politico). If an unabashedly fluid opportunist like Trump can best unabashed constitutional conservatives, such as Ted Cruz and Rand Paul, the GOP has no choice but to perform its own autopsy with respect to conservatism. Indeed, Trump’s rise has merely confirmed the lingering suspicions of many that a capital “R” following one’s name cannot reasonably be assumed to indicate one’s solidarity with conservatism.

Though, like many of the best medicines, it terrorizes the tongue, I consider this latter effect to be among the positives of Trump’s electoral success. Trump, though a danger to conservatism, may catalyze a schism–and hence a purification–of the GOP, wherein the conservative wheat is separated from the non-conservative chaff. So long as they are all largely in one place, whether the chaff is blown in or blown out makes little difference.

The criticisms elucidated here should not be taken to indicate a tacit preference for Trump’s chief rival to the throne–the former Secretary of State, Hillary Clinton; for I am here taking her odious duplicity for granted. Indeed, her history of casuistry is an equine corpse I shall leave for others to beat. Despite this, the common assertion that one must vote for Trump in order to avoid the inevitable calamity of an H.R.C. presidency is a difficult sell on at least two counts. First, because it is not at all clear that the erratic Trump is preferable to a largely predictable, if thoroughly duplicitous, Clinton. The claim that Trump is the solution to the disastrous presidency of Barack Obama, or that he is clearly preferable to Clinton (though he may be), is a tiresome one, not least because it is naïve. From a conservative point of view, declarations of this kind are akin to the insistence that hemlock is clearly preferable to strychnine if taken with a bit of lemon. (This assumes, of course, that virtue–or, at least, a love of its pursuit–ought to be highly prized in a candidate.) However damaging Clinton’s proposed policies may be, we at least know, by and large, what they are; neither we nor Trump know what he will actually do as President, promises of “walls” and “deals” notwithstanding. Though a case can in certain contexts be made for tactical votes (i.e., votes intended primarily to keep a worse candidate from winning), there must be principled limits to such reasoning–limits which, in my opinion, Trump has far exceeded. (For instance, the candidate being voted for must not have vices which match or exceed the severity of those exhibited by his opponent. But I shall leave this aside.)

Second, insofar as the Trump-or-Hillary-ers began making their case long before Trump was the inevitable nominee, they showed themselves to be disingenuous as to their reasons, this lack of candor following from the apparent implication that, rather incredibly, there was not a single candidate among the very large initial field preferable to Trump. By almost any standard, such a notion is, to borrow Aquinas’ line, “repugnant to the intellect.” (That is, unless one’s standard for endorsement is, like Dr. Carson’s, the likelihood of being given a position in a candidate’s future administration.) Whatever the merits of such an argument now (that Trump is the inevitable nominee), it was thoroughly meritless then, and hence difficult to take seriously, especially since many of its purveyors are responsible for forcing the rest of us into this awkward predicament. As the Indian proverb goes: once you have cut off a person’s nose, there is no point giving them a rose to smell. As exemplary practitioners of this exercise in non sequitur, I have in mind such counterfeits as Sarah Palin and Dr. Carson, whose glowing endorsements of Trump are (to me) sufficient to justify their dismissal from any future conservative round-tables.

Assuming, as I am, that a Trump presidency is likely to be inimical to the values maintained by constitutional conservatives, we see exemplified in Trump’s current political success a potential continuation from the election of President Obama of what may be called the paradox of freedom; namely, that a free people is only truly free insofar as they are able to choose that which undermines their freedom. This point is given lucid treatment in Os Guinness’ A Free People’s Suicide: Sustainable Freedom and the American Future, the thesis of which is that freedom rests on three mutually-dependent legs: “Freedom requires virtue, which in turn requires faith of some sort, which in turn requires freedom. Only so can a free people remain ‘free always.’” If, as I suspect, Dr. Guinness has highlighted a profound insight, what are we to make of a voting public that, when pressed on the importance of virtue or faith in a leader, expends only what little effort it takes to raise its shoulders an inch or two? Though I hesitate to suggest what might reasonably be inferred of a nation of over three hundred million that pits a Hillary Clinton against a Donald Trump as the two best candidates to don that venerable mantle wrought by the likes of Washington, Jefferson, and Adams, Christian Smith, in his 2011 book Lost in Transition: The Dark Side of Emerging Adulthood, does not hesitate, summarizing what he takes to be the deeply rooted afflictions in American culture, of which the success of a candidate like Trump is now ostensibly a symptom (see The American Conservative):

In short, if our sociological analysis in this book is correct, the problem is not simply that youth are bad students or that adults are poor teachers. It is that American culture itself seems to be depleted of some important cultural resources that it would pass on to youth if it had them — and yet not just for “moral” but also for identifiable institutional reasons, as repeatedly noted above. In which case, not only emerging adulthood, but American culture itself also has a dark side as well.

This analysis strikes me as apropos. For good or for ill, we do indeed get the leaders we deserve. Dylan’s poetic, if technically trivial, observation in 1964 was that “the times they are a-changin’.” Given that a real-life parody like Trump can even come close to attaining the presidency, I say: “changin’ indeed.” If the real problem is not merely a rogue candidate, but rather a culture throughout which vice and ignorance have metastasized, the work we have before us is great, but not impossible. It is the wearyingly slow but vital work of grassroots evangelism, both political and spiritual. For such ailments, there are no quick fixes, no obvious panaceas for which we might campaign or lobby; nor is there some candidate whose election would constitute a remedy, for the malady is pandemic. In view of the vitriol associated with this election cycle, however, I would not be the least surprised to discover an intimate, if indirect, connection between the beginning of a shift in the current paradigm and a widespread loving of one’s neighbor as oneself.

Be Good for Goodness Sake?

…remind them that this phrase is a line from a Christmas song whose message is that you should be good so that you are rewarded for it on Christmas.

Source: Be Good for Goodness Sake?

Cecile the Lioness


No, not “Cecil”—that unfortunate feline whose death is the recent cause of a global (but undoubtedly faddish) uproar. Cecile, as in Mrs. Cecile Richards—the no less unfortunate president of Planned Parenthood.

This coincidental lexical similarity between the names of two major figures in separate, but heavily reported, current events is in this case more than a mere curiosity, for the apposition serves to illustrate a regrettable reality: a contemporary milieu which—if I may—doesn’t know its head from its ass, morally speaking. Though it is easy to over-generalize in such discussions, there is at least a prima facie truth to the morbidity lurking behind the apparent comedy currently unfolding in the media over the death of Cecil the lion. What is comedic is not the lion’s death, but the resulting overreaction (e.g., here and here); what is morbid is the relative quiet of those same incensed individuals with regard to recent footage (here & here) leaked from discussions with those in the upper echelons of Planned Parenthood and its affiliates, which at least appears to show them nonchalantly haggling over the price of aborted fetal body parts. Whether Planned Parenthood is guilty of such allegations is irrelevant to my point; for if they are even possibly guilty, then the case is worthy of our full attention. In any case, if abortion is in most instances but a particular brand of unjustifiable homicide—as it is in my view—then this latest scandal only renders more egregious the moral aberrations which comprise Planned Parenthood’s standard (and advertised) operating procedure. As aptly remarked by Brit Hume, these latest revelations have “parted the veil of antiseptic tidiness” behind which Planned Parenthood has couched its gruesome operation. But the real problem is not the sale of fetal body parts; it is that there are such parts to sell.

That a large segment of the population exhausts itself in paroxysmal fits over the killing of a large, if impressive, cat, yet barely manages to produce a stifled yawn over the killing (and possible sale) of human babies is nothing less than appalling. Jimmy Kimmel, while quite concerned to defend Cecil, has apparently not seen fit to devote any portion of his show to rousing the moral sensibilities of his audience with regard to the cavalier execution of underdeveloped children. Perhaps among his audience there are few such sensibilities left to rouse. I have no special distaste for Mr. Kimmel; I mention him as but one among a large swath of the population whose attitudes appear to confirm Francis Schaeffer’s observation that what was unthinkable a short time ago has not only become thinkable, but commonplace.

Likewise, National Geographic, despite being a longtime advocate for the oppressed around the world, aired a regal portrait of a male lion “in memory of Cecil” on its Instagram account, complete with an impassioned plea to stop the hunting of endangered animals—an entreaty any true conservationist could easily endorse. But when synchronically juxtaposed with the chorus of crickets surrounding the ongoing scandal at Planned Parenthood, signing a petition to “save the lions” is worse than hollow; it is evidence of a severe disorder among our moral priorities. If “lions are people, too,” perhaps it is time to remind ourselves of what ought to be a trivial truth: that “people are people, too.” To call this epidemic of moral confusion “unfortunate” insofar as it concerns the murder and mutilation of our young is an understatement on the order of calling Michelangelo’s painting in the Sistine Chapel “nice.”

It is in this vein that I have referred to Mrs. Richards as “unfortunate”: anyone who has convinced herself that an institution offering to screen you for cancer with the left hand and to crush your unborn child into pieces with the right is an important instrument in facilitating the common good is morally debased. Such a person is not to be hated, but pitied. I have no doubts that Planned Parenthood does provide services which are of benefit to various communities. Indeed, Mrs. Richards does not hesitate to remind us of this fact in an article—rife with euphemism of Orwellian proportions—that she penned for The Washington Post, lest we should undergo amnesia amidst all this malicious hullabaloo brought on by “the extremists.” Clearly, however, if abortion is the unjustified killing of a human being—and that is the issue—then there is no other healthcare “service” one may provide such that abortion is rendered morally justifiable. A single abortion is not made acceptable by performing a million successful STD screenings. Even including abortion under the “healthcare” umbrella is a sort of sick joke; for it does little for the health or the care of those aborted.

I wish to make one other point, and that is to register an observation regarding Mrs. Richards’s pejorative use of the term “extremists.” So long as they are willing to put forth an argument, two individuals might civilly disagree over the question of whether unborn humans possess an intrinsic right to life. But if the matter is epistemically unsettled (i.e., we do not know whether unborn humans possess an intrinsic right to life), then it is at least epistemically possible that unborn humans possess a right to life. If it is possible that unborn humans possess a right to life, then it is possible that killing them results in a moral transgression (i.e., it is possible that abortion is murder). In such a case, far from being an “extremist,” the person who maintains that unborn humans possess a right to life chooses the “safe” option; for if he is himself uncertain whether unborn humans possess a right to life, it is clearly preferable in the abstract to choose the option which is least likely to result in a moral transgression.

Moreover, if it is the deceptive methods used to obtain the footage in question that Mrs. Richards considers the criterion of “extremism,” I demur yet again. On the contrary, if a person believes that it is even possible that a moral transgression is taking place in the case of abortion, this is exactly the kind of activity in which he should engage. He should expose the practice for what it is. We laud (and ought to laud) the undercover operations of those involved in liberating women from the sex trade. Likewise, the person who sees abortion as a crime against human individuals has no recourse but to appeal to the moral sensibilities of his peers (if any remain) and to the Almighty. Even if we disagree with the conclusion of such a person, surely we must applaud his motives. Indeed, if an “extremist” is simply a committed abolitionist—someone whose actions reflect a serious commitment to ending the practice of feeding our children to the proverbial lions (or lionesses, as the case may be)—then I count myself among their number.


In Defense of Old Books

Thomas Hardy (Photo Credit: guardian.co.uk)

I have often wondered whether my opinion of the superiority of 19th century literature—and of significantly older literature in general—is in any way objectively justified, rather than simply finding its support in subjective notions of preference or bias. From the moment I began to regard the English language with an interest greater than that required by utility, I wondered at the great metamorphosis that has occurred in what is seemingly such a short period of time. (Being in no way a philologist, I do not know whether the evolution of English conforms to historical patterns and is therefore actually “short.”) Despite the acknowledgment that languages, being arbitrary conventions, must inevitably alter over time, I have still found it difficult to regard the differences between literary epochs as being merely that, and not evidence of some almost moral shift in the quality of what is considered good writing. Once having formed (and retained) a decided affinity for older works, I have since wondered self-critically whether this inclination is due to romantic notions of “the good ol’ days,” or whether it is actually possible for the general quality of literature to now be worse.

Given that we are daily reminded of the chaotic state of the world, our tendency to subconsciously attribute real historical credence to fictitious renderings of past decades is, perhaps, a pardonable delusion; and where our perceptions of the past are accurate, we insist on downplaying the unique problems that afflicted older generations. In reading the works of the Brontë family, say, one might be tempted to regard Victorian England in any number of charming lights, forgetting that all except the father died of illnesses before the age of 39—hardly a romantic state of affairs. Or, to take a more recent, personally incriminating, example: a soldier standing tediously in a Middle Eastern desert might naïvely claim to prefer to have been holed up in a nondescript cottage with the Greatest Generation in German-occupied France; this on account of having seen too many films and having thought only shallowly about what such an experience might have really been like. It is rarely true that the grass actually is greener on the inaccessible plot.

Yet whereas bygone times can be idealized by either the selective apprehension or ignorance of certain facts, works of literature may be evaluated solely on their internal merits. Though it would clearly be remiss to think all 19th century works artistic masterpieces, or even good—the “penny dreadfuls” were the 19th century equivalent of popular contemporary literary abominations, such as 50 Shades of Grey—it would seem fair to compare the most (or least) esteemed works of the period to those of our own.

But in order to have any sort of objective degeneration, there must be some fixed standard or criteria against which to uniformly criticize various works. This standard, whatever its properties, is at bottom called the English Language; though when it comes to art, this standard is clearly more complex than the sum of its lingual parts. The difficulty in attempting to conduct such a thing as literary criticism, as with art criticism, is not only that English speakers have very different ideas about what constitutes good or acceptable English (these discrepancies occur even in the highest echelons of the discipline), but that the language itself is malleable and sometimes vague, with a wide margin for style. The English language as it now exists is stylistically far removed from that of the Elizabethans, for example, and further removed still from Old English, but it is English nonetheless. The contemporary authorities on the subject would be at odds with their long-dead counterparts.

The slightly philosophical question that here presents itself is whether it is objectively possible to compare the language and style of two works from two significantly different literary periods with regard to hierarchical value. This would seem to involve discerning at what point a difference in style becomes evidence of either a greater or lesser ability to communicate. Authors have multifarious intentions, of course, but the question is at what point one author can be said to be definitively better or worse or more artful than another in communicating his intended message.

We might take an example: Thomas Hardy’s Far From the Madding Crowd. I do not intend to conduct a thorough analysis and comparison (even if I could) either of this or any other work, but a cursory glance will suffice:

“The night had a sinister aspect. A heated breeze from the south slowly fanned the summits of lofty objects, and in the sky dashes of buoyant cloud were sailing in a course at right angles to that of another stratum, neither of them in the direction of the breeze below. The moon, as seen through these films, had a lurid metallic look. The fields were sallow with the impure light, and all were tinged in monochrome, as if beheld through stained glass. The same evening the sheep had trailed homeward head to tail, the behaviour of the rooks had been confused, and the horses had moved with timidity and caution.” (Thomas Hardy, Far From the Madding Crowd, Wordsworth Editions: Ware, Hertfordshire, 1993, p. 189)

Compare with Ernest Hemingway’s The Old Man and the Sea:

“They walked up the road together to the old man’s shack and went in through its open door. The old man leaned the mast with its wrapped sail against the wall and the boy put the box and the other gear beside it. The mast was nearly as long as the one room of the shack. The shack was made of the tough budshields of the royal palm which are called guano and in it there was a bed, a table, one chair, and a place on the dirt floor to cook with charcoal. On the brown walls of the flattened, overlapping leaves of the sturdy fibered guano there was a picture in color of the Sacred Heart of Jesus and another of the Virgin of Cobre. These were relics of his wife. Once there had been a tinted photograph of his wife on the wall but he had taken it down because it made him too lonely to see it and it was on the shelf in the corner under his clean shirt.” (The Old Man and the Sea, p. 4)

I admit that the selection of these passages is somewhat arbitrary, but they are nevertheless characteristic of each respective author and therefore sufficient to make my point. In reading these two passages, it seems to me quite nearly an objective fact that Hardy’s descriptions possess an artfulness lacking in those of Hemingway, which by comparison appear bland and staccato, devoid of an alluring prose rhythm. Whatever Hemingway’s merits, they do not in my mind compete for dominance in the category of eloquence.

That many people deliberately prefer books that adopt a style akin to Hemingway’s, perhaps finding them less tiresome to read than the elaborate verbiage characteristic of many 19th century works, say, is a fact reaffirmed by the kinds of literature given the highest praise in contemporary literary circles. At the risk of sounding snobbish, I have routinely found it difficult to read the works contained in certain literary periodicals (Ploughshares, Writer’s Digest, The Sun), as well as some contemporary novels, without undergoing a slight cringe. Never mind the fact that a popular theme in many of these works is either gratuitous sexuality in one form or another, or some form of implicit nihilism; it is the way in which the content is generally either dully presented or, in attempting to be artful, instead creates the impression of one overacting a scene. Take a highly typical excerpt from a work of fiction in the July 2013 issue of The Sun magazine:

“It was my junior year of high school, and I was living in a Victorian on Beach Avenue with my sister, Alex, who is my twin but always somehow prettier and skinnier than me, and our grandmother, Zilpha. The house was old and handsome, like our grandmother, and it sat surrounded by perennial gardens on a grassy hill above the south shore of Lake Ontario in Rochester, New York.

My parents had been killed in a car accident when I was in seventh grade. They were driving home from a wine tasting in Buffalo and got caught in a sudden snowstorm that swept off Lake Erie, and their car skidded off the road and hit a tree. My boyfriend Rick’s father had died of cancer of the throat, and I think one reason Rick loved me was because my parents were dead. He and his mother lived on my grandmother’s street, which is how we met. As residents we had access to a private beach, where Alex and I liked to swim and sunbathe in the summer, and where Rick and I liked to screw around late at night when the moon was out and the water smelled sweet and clean and the Big Dipper hung low above the lake, way out over Canada.” (Christian Zwahlen, “It Must Have Been Beautiful, But Now It’s Gone,” The Sun, issue 451, July 2013, http://thesunmagazine.org/issues/451/it_must_have_been_beautiful_but_now_its_gone)

Even without drawing a comparison to another work, the above excerpt is at best only grammatically correct, having the sort of matter-of-fact descriptions one would expect of someone filing a police report. Curiously—but not, to my mind, surprisingly—the preceding excerpt is not altogether beneath the previous excerpt of Hemingway. This is the lamentable reality: with regard to prose style (not content), there is often only a subtly discernible gap in quality (if any) between lay writers and those highly esteemed. This exemplifies the vast shift in literary trends and in what is now considered writing worthy of publication. I grant that it is rather unfair to compare a famous work of old to an unknown piece by a contemporary author, so let us approach the issue from a fairer angle—by looking at a passage from George R. R. Martin’s famous series, A Song of Ice and Fire (1):

“The morning had dawned clear and cold, with a crispness that hinted at the end of summer. They set forth at daybreak to see a man beheaded, twenty in all, and Bran rode among them, nervous with excitement. This was the first time he had been deemed old enough to go with his lord father and his brothers to see the king’s justice done. It was the ninth year of summer, and the seventh of Bran’s life.

The man had been taken outside a small holdfast in the hills. Robb thought he was a wildling, his sword sworn to Mance Rayder, the King beyond-the-Wall. It made Bran’s skin prickle to think of it. He remembered the hearth tales Old Nan told them. The wildlings were cruel men, she said, slavers and slayers and thieves. They consorted with giants and ghouls, stole girl children in the dead of night, and drank blood from polished horns. And their women lay with the Others in the Long Night to sire terrible half-human children.

But the man they found bound hand and foot to the holdfast wall awaiting the king’s justice was old and scrawny, not much taller than Robb. He had lost both ears and a finger to frostbite, and he dressed all in black, the same as a brother of the Night’s Watch, except that his furs were ragged and greasy.” (George R. R. Martin, A Song of Ice and Fire: A Game of Thrones, p. 11)

This passage, while certainly not poorly written, and much better than Mr. Zwahlen’s, is rather uninteresting in its depictions. The story is fascinating, highly complex, and full of an engrossing intensity, but there is something un-compelling in the way in which Martin sets his scenes and develops his characters. I have begun reading this book on three separate occasions, making it several hundred pages deep, and each time finding the prose too uninspiring to be worth my time, though I may eventually return to it on account of the story. If only Thomas Hardy had contrived to write the story of the Starks and Lannisters…

Now take this excerpt from one of my favorite novels, Jane Eyre:

“From my discourse with Mr. Lloyd, and from the above reported conference between Bessie and Abbot, I gathered enough of hope to suffice as a motive for wishing to get well: a change seemed near,—I desired and waited it in silence. It tarried, however: days and weeks passed: I had regained my normal state of health but no new allusion was made to the subject over which I brooded. Mrs. Reed surveyed me at times with a severe eye, but seldom addressed me: since my illness, she had drawn a more marked line of separation than ever between me and her own children; appointing me a small closet to sleep in by myself, condemning me to take my meals alone, and pass all my time in the nursery, while my cousins were constantly in the drawing-room. Not a hint, however, did she drop about sending me to school: still I felt an instinctive certainty that she would not long endure me under the same roof with her; for her glance, now more than ever, when turned on me, expressed an insuperable and rooted aversion.” (Charlotte Brontë, Jane Eyre, Chapter IV, p. 41.)

The subjects of these respective passages are irrelevant; the latter is superior to the preceding two in its command of the English language and the artistry with which it communicates information about the characters to the reader. Even if one detests the subject of the story, or wishes, as one friend recently said to me of Charlotte Brontë’s characters, to “throw them all into a pit of snakes,” a person with any regard for good diction cannot but appreciate it. One may enjoy a work of literature, but fail to appreciate it. Likewise, one may thoroughly appreciate a work, hating its subject or philosophy or author all the while.

In what I regard as tantamount to literary blasphemy, Kurt Vonnegut famously espoused a sentiment that many authors writing today have either consciously or unconsciously put into practice: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you’ve been to college” (A Man Without a Country). I wish Mr. Vonnegut had ended with what would then have been a prophetic addition: “…or that one is capable of thoughts exceeding 140 characters.” Dispense with the semicolon and one gets Twitter; or, rather, one gets short stories and novels written as if Twitter’s 140-character limit were the primary constraint upon their composition.

It is, of course, perfectly acceptable for a person to prefer literature that is not of the highest literary quality, just as it is perfectly acceptable for a person to occasionally indulge in foods that are unhealthy. I only mean to suggest that even the most highly regarded recent works are inferior to those considered only modestly successful a hundred years ago. But the moment one begins to praise Rothko at Caravaggio’s expense, I feel I must lift a finger in protest. It is not that modern or contemporary works are entirely devoid of merit—far from it—but something has largely been lost and not merely changed. There is a point at which it is no longer a question of apples and oranges but of fruit-specific integrity. Modern apples, however, unlike modern fiction, have thus far, at least, retained their sweetness.

Notes:

1.) In citing this as an example of unimpressive writing, I realize I will now proceed to make enemies.


Racist Anti-Racism


In our day, it is not only people that are categorized according to race, but ideas. If recent history is any indication, even my attempt to broach the subject of the double standards common in discussions on racism will be met with (at least) skepticism by some who consider my pigmentation inherently disqualifying. After all, how might a white man understand the plight of blacks (1) who feel ostracized on account of their race? The answer is simple: racism is an ideology, which means it cannot be the prerogative of any particular race, and therefore does not operate in only one direction.

That the U.S. has been guilty of pervasive official discrimination against blacks in the past is an incontrovertible fact, and it is praiseworthy that those unjust legal biases have been eradicated. Though there still exist fringe groups that openly profess racial prejudice (e.g., the Ku Klux Klan), the kind and scope of persecution alleged to occur against the black community today is, I aver, not of the KKK brand. (Nobody attributes any credence to anything issuing forth from the obtuse mouths of Klan members, anyway.) I submit that it is not the prevalence of melanin in a person’s skin alone that fosters a tendency to form possibly unjust preconceptions, but that in conjunction with the perpetuation of what is called the “black community” or “black culture” (2). Discrimination against blacks, when it does actually occur, is rarely an aversion to the color of their skin, but often the result of a distaste for the substance of what is understood to be the associated subculture. This is not really racism, but a sort of anti-culturalism. Just what constitutes “black culture,” I do not presume to know; I know only that it is the subject of volumes of literature and consistent media attention, and that if the term were meaningless, it could hardly receive the press it now enjoys.

Even granting that there does exist such a thing as black culture, it is, of course, neither technically correct nor appropriate for the average person (3) to presuppose of anyone with very dark skin membership within that culture. This is true of any kind of stereotype one might be inclined to attribute to some race. Recognition of the fact that black culture is the recipient of much positive advertising in the media, however—particularly from prominent rap and hip hop artists and television (e.g., BET)—ought to diminish the swiftness with which blacks fire allegations of discrimination. Were I to don a tee-shirt emblazoned with only the word “Cornell,” a person might be forgiven for falsely assuming that I had actually attended the university.

Examples of the deliberate effort to cultivate this cultural distinction are numerous. There are nationally recognized organizations that specifically promote a kind of racial distinction: Miss Black USA, Ebony Magazine, Black Enterprise, and the—dare I say infamous—National Association for the Advancement of Colored People. Likewise, at my own undergraduate alma mater, there were at least two black interest groups: the Society of Black Engineers and The Society of Distinguished Black Women. That such groups exist simultaneously with loud efforts to end discrimination makes it difficult to avoid the sense that there is a glaring double-standard. I can vividly imagine the kind of vitriol one would inevitably receive, the innumerable accusations of hatred and prejudice, had one tried to start a parallel group in which one only exchanged the word “black” for “white.”

Though making racial distinctions is not necessarily wrong—after all, there are niche groups for almost everything—it is at least exceedingly counterproductive to the stated aims of such groups as the NAACP. If the ultimate goal is really a pervasive social “color blindness,” it is difficult to see how having a sort of “black pride” is in any way helpful. Such a sentiment is understandable in the context of the 1970s, when America was still plagued by the lingering prevalence of an anti-black milieu, but the circumstances now hardly resemble that unfortunate state of affairs. Having a black president was at that time unfeasible; now, the first black president is well into his second term. Whatever its current manifestations, racism against blacks is hardly what it once was.

The kind of rhetoric bandied about by the Al Sharptons and Jesse Jacksons of this country would seem to lead one to the opposite conclusion: that racism in America is still as prevalent as ever. Given the great improvements in the understanding of equality that have been made in the public consciousness, one tires of the impassioned speeches that predictably attempt to channel the moral indignation of the honorable Martin Luther King, Jr. The compelling desire to be needed, to spearhead a fight against an injustice which on a grand scale no longer exists, has the unfortunate effect of creating the very problem that needs solving—the purveyance of racism, albeit in the opposite direction; in effect, a sort of counter-racism. Some blacks have even gone so far as to suggest that it is impossible for them to be racist, quite simply because racism is in their minds a one-sided endeavor in which they are the sole victims.

The most prominent example of this divisiveness is witnessed in the circumstances surrounding the ongoing trial of George Zimmerman regarding the shooting of Trayvon Martin. Many people (blacks in particular) instantly accused George Zimmerman of having committed a racially motivated crime, well before the full facts of the case had been made public, and have since artificially manufactured the case into a civil rights issue. For many blacks (though certainly not all), Zimmerman is guilty and will remain so in their eyes even if he is acquitted. In a continual barrage of callow extortion taking place in the sophisticated realm of Twitter, Mr. Zimmerman has received numerous open death threats from some blacks claiming to be willing to take justice into their own hands—hardly an effective way to win acceptance in the public eye. If respect and acceptance are truly the goals of the black community, the volatile outrage that Trayvon Martin’s death ignited therein is achieving exactly the opposite effect, and it must therefore be condemned.

Further evidence that the purveyors of black culture are ideologically entrenched can be found in the strained—and, frankly, outrageous—lengths to which some writers have gone to defend Rachel Jeantel’s abysmal testimony in Mr. Zimmerman’s trial. Not only was Ms. Jeantel found to have lied on several occasions (at least once while under oath); her openly disrespectful attitude was hardly becoming. Her genuine ignorance and incivility can certainly be forgiven, but they cannot be respected. In an article titled “Why Black People Understand Rachel Jeantel,” author Christina Coleman begins, “If ever I thought myself objective and unbiased, the George Zimmerman trial is definitely not that moment.” (I suppose we must thank Ms. Coleman for saving us the trouble of having to read the entire article to detect her bias.) She goes on:

“But maybe the reason white people don’t understand Rachel Jeantel has something more to do with white privilege then [sic], what they would call, Rachel’s capricious nature. / Let’s for one second try to understand why Rachel is “angry” (read emotional), “hood” (read blunt), and “unintelligent” (read multilingual).”

That Ms. Coleman categorically attributes Ms. Jeantel’s faults to “white privilege” and understands the word “unintelligent” to be synonymous with “multilingual” is, I think, telling. The straw-grasping in an attempt to defend any member of the black community, no matter the apparent transgression, is precisely the best way to undermine any sympathies people may feel towards blacks. If I may be so bold, engaging in constructive criticism or condemnation when it is warranted would perhaps constitute a more effective PR campaign than hurried attempts to wave away any and every apparent vice.

Racism, if it is to go the way of smallpox and the dodo, must be attacked wherever it is manifested. There is no one who may by virtue of his race consider himself immune to even subtle prejudice or, worse, justified in engaging in open racial hostility. If America is to enjoy the richness that may be had as a result of being a true melting pot, no subculture may consider itself above pointed self-criticism; for by fancying himself invulnerable a man chinks his own armor.

Notes:

(1) I use the term “black” in distinction to African American, since it is possible to be an American with very dark skin and yet not necessarily be of African descent, as is the case with Belizeans.

(2) It must be noted that, wherever they find their origin, these terms have been adopted and perpetuated proudly by certain demographics within the black community.

(3) A case for racial profiling for purposes of security, as in an airport, can in my opinion be made on the grounds of valid statistics. The success of Israeli security, for example, is in no small part due to the fact that they unapologetically employ profiling techniques.


Private Milk and Social Vinegar


According to the sort of wisdom one gleans from overhearing people at social gatherings and coffee shops, as with milk and vinegar one ought never to mix religion and politics. The degree to which this is true depends heavily on what is meant by the word “mix”; for theocracies have certainly in most cases been of the most tyrannical variety, tending as they do to place a divine impetus behind any and every moral atrocity. C.S. Lewis observed that “theocracy has been rightly abolished not because it is bad that learned priests should govern ignorant laymen, but because priests are wicked men like the rest of us” (The Weight of Glory). Indeed, history makes a truly damning case against any system of governance that grants tremendous power to broken and afflicted men – which is to say all men.

It is, however, a great error to suppose that, as Mr. Hitchens bitterly put it, it is religion that poisons everything. This is an increasingly popular sentiment, and its refrain echoes down the halls of the university, reverberated by young atheists who have scarcely learned how to spell “Nietzsche.”

But even if Mr. Hitchens were correct, then surely none of us, not even the ostensibly impervious Mr. Hitchens himself, is exempt; for whatever ill effects we may be forced by circumstance to endure at the hands of others – as in a theocracy – the greater and necessarily more potent dose of poison is the one self-administered. Philosophy – for that is what a religion is, after all – cannot be escaped. If a person believes anything, he believes in a religion. As any parent will attest, this religion is from the earliest moments of childhood one of Self. It is only later that some children learn clever nomenclature by which to declare in exalted tones their religion of Self as being one of utilitarianism or humanism or collectivism – all various philosophical rearrangements of social self-service.

Contrary to what one might initially be inclined to suppose on the basis of the terminology, there is no atheistic utilitarianism, no atheistic system of ethics, that is not at its very core selfish. Materialism has the very unpleasant consequence of making selfish even the motivations for altruism. The opening line of Oscar Wilde’s 1891 essay, The Soul of Man Under Socialism, reads, “The chief advantage that would result from the establishment of Socialism is, undoubtedly, the fact that Socialism would relieve us from that sordid necessity of living for others which, in the present condition of things, presses so hardly upon almost everybody” (emphasis mine). Wilde’s honesty is exceedingly rare in such circles. Any notions of selflessness espoused by a materialist cannot be anything but farcical; and any materialist who is not a hedonist must be a very foolish materialist indeed, for he has been duped into thinking that there is some “greater good” that exists outside himself, beyond his short life, and for which he ought to sacrifice. Reflecting on his eventual abandonment of Leftist ideals in the conclusion to his memoir, Hitch-22, Christopher Hitchens writes:

I suspect that the hardest thing for the idealist to surrender is the teleological, or the sense that there is some feasible, lovelier future that can be brought nearer by exertions in the present, and for which “sacrifices” are justified. With some part of myself, I still “feel,” but no longer really think, that humanity would be the poorer without this fantastically potent illusion. “A map of the world that did not show Utopia,” said Oscar Wilde, “would not be worth consulting.” I used to adore that phrase, but now reflect more upon the shipwrecks and prison island to which the quest has led. (p. 420)

The only kind of sacrifice that a materialist can consistently endorse is the kind that has some prospect of benefit in the here and now. Any notions of the “greater good” or objective progress, political or otherwise, necessarily assume the existence of a kind of transcendence that can only reside in the supernatural. Thus, the professing materialist who fiercely defends selflessness as a virtue is not really a materialist at all.

It cannot be called clever – cute, perhaps – the way in which some atheists claim in bouts of counterproductive conceit to only “lack belief” in God, rather than believing positively that God does not exist. Never mind that this places them squarely on par with rocks, cats, and every other thing incapable of thought or rationality. Theists, on this view, are literally the only exception to that which may be called “atheist.” Yet how mendacious it would be of theists to espouse a lack of belief in atheism rather than to assume the burden of proof that necessarily accompanies every positive claim! Atheists ought not to get a pass in fancying themselves devoid of religion simply because they have defined religion in such a way as to exclude themselves. They are like the politician who fancies himself above politics by referring to himself as an “independent.” The person who considers himself a strictly objective observer simply because he subscribes to a form of philosophical materialism exhibits a very crass sort of disingenuousness, as if because of his rejection of the supernatural his actions (and therefore his politics) did not arise as a direct result of his deeply held beliefs.

The person who thinks it possible to separate religious conviction from political influence shows that he understands neither religion nor politics. One’s religion necessarily touches everything, or else it touches nothing, and therefore cannot be said to be a religion in any meaningful sense of the word. A professing Muslim, for example, who neither prays, nor exercises charity, nor fasts, nor makes the hajj is as good a Muslim as any atheist, and indeed–if he were honest with himself–might very well be one. He is like the atheist who acts selflessly.

Acting on the basis of religious conviction does not necessarily imply a coercive governmental system (i.e., theocracy). Not only is a person able to exercise his convictions without becoming onerous; he truly has no choice in the matter. He can choose not to oppress those with whom he disagrees, but he cannot choose to act apart from his convictions. Claims that any influence of religious conviction upon politics is oppressive (or preventable) cannot be regarded seriously. A person claiming to act in denial of his personal philosophy only serves to clarify just what his true philosophy is. A politician claiming, for instance, to value the lives of unborn children, yet who endorses legislation to the contrary by way of appeals to a Jeffersonian “wall of separation,” only shows that his true beliefs (and therefore his religion) afford a higher value to personal “liberty” – a truly vulgar use of the word in this context – than to the preservation of life. We may well argue the degree to which one’s beliefs ought to take shape in public policy – a distinction similar to that between applied ethics and morality – but it is not my intent to do so here; only to suggest that a complete detachment is impossible.

Thus, conceptions of politics that envision policy derived apart from one’s fundamental beliefs are illusory. In justification for such a notion, however, one often hears atheists make monopolistic claims upon “reason” as their sole guide. These atheists exhibit a fundamentalism worse than that of even the most legalistic Christian; for at least the Christian realizes he has a religion. This sort of atheist is so religious, so blindly devoted to reason, that he is not even aware of it. (Never mind that, despite Kant’s impressive efforts, virtue seems not to be explained–at the very least not without great difficulty–by the proposal that it is grounded in rational principle.)

But a person need not be aware of his own religion in order to exercise it, for it is out of the overflow of the heart that the mouth speaks (Mt. 12:34). Everyone lives his life by what he believes, consciously or not, to be true; that is his religion, and no amount of semantic squirming or disdain for the word is capable of severing the necessary dependence of his actions (and votes) upon it.


Language and the Progressive


If it is true that government is at best a necessary evil, then the state ought to be regarded with a wary eye, and likewise any political party or philosophy that seeks to elevate the state to a position of esteem. And yet, rather than with suspicion, the Progressive confers upon the state a role of immense honor and importance, attributing to it almost salvific powers by which he hopes society’s ills might be cured. Like the Israelites, who pleaded with God to give them a king that they might be “like all the nations,” the Left receives with open arms the ever-encroaching intrusions of the state. But God gave the Israelites a king not as a blessing but, granting their incessant pleas, as a curse upon their foolishness, for which they suffered immensely. The state, on the Progressive view, is not merely the reluctant by-product of flawed men, but almost a philanthropic entity all its own – it is not a government but The Government. “Once abolish the God,” wrote Chesterton, “and the state becomes the God.” Indeed, rather than consider some transcendent Authority, acknowledging with humility the inevitable tendency of all men towards a very real moral corruption in positions of power, the Progressive will in nearly every instance exhaust himself in defense of the state, often to the point of absurdity. Take, for example, MSNBC’s Chris Matthews’ likening of the President’s recent divisive inaugural address to Lincoln’s Gettysburg Address; or, worse, the ease with which the President’s extraordinarily massive and irresponsible spending has been consistently overlooked and explained away, the blame often being placed on the shoulders of his predecessor. Such examples are innumerable and can be obtained fresh from any cursory viewing of the evening news.

Desiring to make his political and moral infractions more palatable to those forced to abide by his decrees, and effectively providing for his would-be defenders a more plausible ground by which to make their case, a politician need only become an expert in Orwellian doublespeak. (The chicken-and-egg question of whether excellent doublespeakers tend to become politicians or vice versa is a sociological question I shall not attempt to untangle.) So long as the language is appropriately tailored to circumvent the conscience, placing the appropriate emphasis on the absolute necessity of a piece of legislation to secure safety or health or prosperity or some other such collective good, there is no absurd or immoral policy which cannot be foisted upon the citizenry. For example, it is very easy to sell such a concept as infanticide – one need only call an unborn child by a different name – a “fetus” – and proponents of the practice “Pro-Choice”; for who would dare oppose a person’s freedom to choose? Convincing a person to surrender his arms is equally simple – gun control need only be referred to matter-of-factly as “reducing gun violence”; for who would dare voice opposition to such a proposition? Forcibly taking a man’s money in order to give it to another need only be called “charity”; for surely none of us wishes to be thought miserly? The very term “Progressive” is itself exemplary of an attempt to rebrand old ideas.

Insofar as it concerns the passage of legislation or the attempt to persuade large groups to adopt some particular idea, the master of rhetoric need not be a master of anything else. Though the term “progressive” would seem on its face to suggest otherwise, this is not a new phenomenon. As Plato aptly observed, “In politics we presume that everyone who knows how to get votes knows how to administer a city or a state. When we are ill… we do not ask for the handsomest physician, or the most eloquent one.” The use of language as propaganda is by no means solely a Leftist tactic, but one used by anyone seeking to circumvent the trouble of engaging with detractors. It is the Left, however, that has operated primarily and consistently by a very astute method of language co-option; and it is quite clear that if one is able to demonize his opponents, shaming them into silence, one need not go through the troublesome task of addressing and refuting arguments. Expressing dissent on matters of Progressive policy involving the legalization of gay marriage, entitlement programs, gun control, and global warming (now, more flexibly, “climate change”) is tantamount to labeling oneself a hateful, miserly, cruel, uncaring, ignorant, “unbelievably stupid” (thank you, Mr. Morgan), child-hating bigot. It is, unfortunately, a tactic as effective as it is fallacious.

This apparent tactical difference between Conservatives and Progressives arises simply because the advancement of Progressive goals requires the sale of a host of ideas that often defy reason or conscience (or the Constitution). For example, inclusion among the American Progressive ranks evidently requires that one promote state-funded infanticide, high taxes, federally controlled (mandated) healthcare, and other such programs which could not be advanced or maintained without the prodigious use of smoke and mirrors to obscure their many unpleasant aspects from the public eye.

But the difference between Conservatives and Progressives can perhaps be observed most simply in how each regards the people – those under the domain of the state. Conservatives regard people in an optimistic light, generally believing that people are trustworthy, well-intentioned, astute, ingenious, and capable. The Conservative case for a small central government is erected upon the notion that people ought to possess the freedom to choose what is best for themselves, that the securing of liberty is morally and practically superior to any system that involves reaching into every corner of a man’s life and pocketbook. In stark contrast, rather than as a group of individuals, Progressives tend to view people as a collective mass that requires controlling, herding, and restricting, and whose hand needs and ought to be held at every opportunity. The common man has value, but only when considered as a part of the collective whole. Liberty, on the Progressive view, is only the smattering of crumbs left over after the state has gobbled up the many freedoms it deems necessary to sufficiently control what it regards to be a largely ignorant and volatile populace. Rather than a transcendent principle to be secured, Progressive “liberty” is instead condescendingly granted by the state; rights are demoted to privileges.

The rhetorician has reached the height of his craft when he finds it effortless to take certain words and alter their arrangement and context slightly, such that he means something quite different from the way in which they are normally understood. It is precisely an understanding of this keen ability that explains how President Obama can do everything in his limited power to effectively neuter the Second Amendment while simultaneously proclaiming his affirmation of it; how he can use words like “together” and “collective” and at the same time, under a façade of unity, deliberately slight swathes of those by whom he is employed. In Progressive hands it is only the language of the Constitution that remains – its meaning and intent are reversed, or at least severely disfigured; and it is by way of such semantic disfigurement, as well as ceaseless appeals to emotion, that Progressives seek to convince us of the state’s beneficence and efficacy, implying that we ought to put our trust in an elite few, bowing low to kiss the rings on the state’s compassionate hand.

It is not self-evident truths that must be couched in the vagueness of language, but only those ideas in which lurks something foul. A people may be led happily to their destruction so long as they are capable of taking the state at its word; but a simple question, uttered firmly and persistently, would undoubtedly be the undoing of the Progressive movement: “What do you mean by that?”