The death of rationalism & the rise of passion

President Andrew Shepherd: Look, if the people want to listen to —

Lewis Rothschild: They don’t have a choice! Bob Rumson is the only one doing the talking! People want leadership, Mr. President, and in the absence of genuine leadership, they’ll listen to anyone who steps up to the microphone. They want leadership. They’re so thirsty for it they’ll crawl through the desert toward a mirage, and when they discover there’s no water, they’ll drink the sand.

President Andrew Shepherd: Lewis, we’ve had presidents who were beloved, who couldn’t find a coherent sentence with two hands and a flashlight. People don’t drink the sand because they’re thirsty. They drink the sand because they don’t know the difference.

The American President, 1995

All the elegant, nuanced reasoning in the world will not make an iota of difference to the average Trump supporter. In fact, given Trump’s anti-intellectual stance, shared by those who want him in the White House, such nuanced arguments are much more likely to bolster their support for Trump than lead them to reconsider.

I find this a terrifying notion. Reason has always been subservient to emotion (as the Scottish philosopher David Hume argued), and the antagonism between these two states of mind has been well documented since Plato. Yet the prevailing view among philosophers has been that reason is the guiding principle toward which we strive. A rational life, philosophers have held, is a fuller life.



Brexit, the U.S. presidential race, & what they have in common with XBox

This is the script of a talk I gave today at my wife’s church, the ERUCC in Frederick.

Thank you for having me here today. Thank you, Barbara, for inviting me in a way that gently expressed I had no other option. What I’ve been asked to talk about is the U.S. presidential election and Brexit. So, you know, no biggie. I hope you’ll forgive me a little indulgence, and forgive me if I lapse into a little philosophy and perhaps talk a bit about what I see behind these two phenomena, rather than boring you with the sausage-making of the political process, which you probably already know.

But first: These are worrying times, aren’t they? These are uncertain times.

But first, I’m going to apologize. I’d like to say I’m some kind of expert on politics. Certainly, after more than 13 years covering politics in Frederick County from the ground up, which is to say, from the small cities and towns that make up where we live, then Frederick City Hall, then the marbled halls of Annapolis, you’d think I’d have some experience, some notion of how to frame this for you in a way that encapsulates the state of play, that tells you, well, yes, this happened because of that, and so on.

It’d be fair to ask me, pointedly: Weren’t you an opinion journalist? Isn’t it your business to communicate to us, the readers and watchers of politics, some predictions about how this whole insidious business will pan out? After all, to whom do we turn in times when we need assistance formulating our thoughts? People in the media are supposed to sound the warning bell, right? So you ask us: Why? How can this be? How did this happen?

The short answer is: We don’t know. Sorry.

The media, of which not too long ago I was a member, have tried to afflict the comfortable, as Finley Peter Dunne once wrote. Oh yes, we’ve tried. Thousands of words have been written about the candidates — one in particular, and I believe you know who — and yet these words have been like drops of rain running off a Rain-X-coated windshield. In fact, I think these thousands of words have only helped him.

The media is part of the “establishment,” a word I purposely have put in quotes; and Mr. Trump (in case you had any question who I was talking about) has made an art of engendering, representing and encouraging this idea that somehow he is outside of that establishment and, as such, is best placed to “do something” about it. And yes, I put “do something” in quotes also. The “do something” has so far lacked any specifics, other than whatever it is, it will be “great” or, possibly, “huge.” Whatever that means. It’s that anti-establishment posture I’ll be getting to in a minute when I attempt to illustrate a theory behind his success.

Perhaps part of my talking to you here today should be to “comfort the afflicted,” which is to say all of us here. Well, again, sorry. I really wish I could offer some comfort, truly. We’re all grown-ups; I am deeply aware of how intelligent and educated the people in this room are, so I won’t sugarcoat it. I don’t want to insult you with trite predictions. Things are bad. Really bad. The ancient Chinese passive-aggressive curse is, “May you live in interesting times.” “Interesting” seems inadequate.

If the Trump vs. Clinton race here in the U.S. weren’t awful enough, there was the Brexit vote in the U.K. to leave the European Union, which passed, leaving its supporters about as stunned as those who opposed it, and causing great uncertainty and not a small amount of swearing at my parents’ apartment in Naples, Florida, on June 23.

Politics is in chaos. Not just here, in this once-marvelous, still-boiling experiment in democracy, but in established, centuries-old systems of governance, such as Her Majesty’s Parliament.

All I can tell you is that we’re at a precipitous point in global history: either collapse or transformative revolution. Change is something we may all have to accept, and change, when it comes to the historical interplay and evolution of countries, is painful. Things may get worse before they get better. We’re not just divided, as some pundits would have us believe. We are fractured. Shootings of LGBT club-goers; shootings of people by police, simply because of the color of their skin; shootings of police, simply because of their profession. Race, gender, sexuality, religion, politics. These fault lines, exacerbated by ubiquitous, 24/7 information, are tearing holes in us. People are angry, so angry, and I honestly cannot tell you why. I personally look around and am, selfishly, quite content with my personal lot.

Now, to ease my mind, I play a lot of Xbox. Don’t laugh. I’m getting to my point. Game design is a very expensive, very complex business.

Complex situations arise in video games, board games and tabletop role-playing games from the interaction of relatively simple game mechanics.

This is a phenomenon known as “emergent gameplay.” In a nutshell, that’s where certain players innovate, entirely within the rules of the game, by creating new, unthought-of styles of play that conform to the game’s rules while exploiting their gray areas. Those individuals, often not governed by the common ethical rules that define society, are hated. (It should be pointed out that some of them should be.)

Survivor is the example most often given, and you may remember Richard Hatch, the winner of Survivor’s first season, and how he created the unforeseen alliance system that became so commonplace in later seasons. In fact, if you’ll indulge me in a theological moment, it is these rule benders, seemingly unfettered from societal rules, whom we see in numerous historical archetypes: Lucifer; Loki in the Norse pantheon; Set in the Egyptian; and so on. So, while I may be using modern language to describe the phenomenon, it’s actually a very old concept indeed.

But indulgences aside. Back to the point.

What the notion of “emergent gameplay” outlines is, I think, true for any system. Only, the bigger and more complex the system, the longer it takes and the harder it is for innovative gameplayers to emerge, so the innovations they introduce within the game structure tend to scale up. And when that happens, the system can be thrown into chaos. It is at this inflection point that the system can fail, if its parameters are not flexible enough to accommodate the new styles of play. Other players follow suit, mimicking the original innovator. Finally, the system once again reaches some equilibrium, but the game is changed forever, radically, until a new emergent stream of gameplay evolves.
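For the systems-minded, that cycle can be sketched as a toy simulation. This is entirely my own illustration, with made-up numbers, not anything from the game-design literature: one innovator adopts a rule-bending strategy, other players mimic it in proportion to how many have already adopted it, and adoption follows the slow start, rapid spread and eventual saturation of a logistic-style curve.

```python
def simulate(pop=1000, mimic_rate=0.5, steps=20):
    """Toy model of an innovation spreading through a population of players."""
    adopters = 1  # the original emergent gameplayer
    history = [adopters]
    for _ in range(steps):
        # Each step, some of the remaining players copy the adopters,
        # in proportion to how visible the innovation already is.
        converts = int(mimic_rate * adopters * (pop - adopters) / pop)
        adopters = min(pop, adopters + max(converts, 1))
        history.append(adopters)
    return history

curve = simulate()
print(curve)  # slow start, rapid middle, flattening end
```

The interesting region is the steep middle of the curve: the inflection point where the old equilibrium has broken down but a new one has not yet formed. That, roughly, is where I’d argue our politics now sits.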

I’m sure you realize what I’m getting at here, and where I’m drawing a link between Brexit and Mr. Trump. Both are emergent gameplayers; Trump, I believe, more intentionally so than Hillary Clinton. Bernie Sanders, too, in a way.

I may not have been entirely accurate when I told you at the beginning of this talk that I could offer you no hope. I can offer you some, although whether you take it as such is up to you. (I hope you do.) Sometimes those gameplayers, as innovative as they are, as original as they are, can, wittingly or not, change the system but lose the game. That loss comes with a price. Just look at the buyers’ remorse following the Brexit vote. Those who were central to the Leave campaign’s leadership are finding their careers in tatters: Former London mayor Boris Johnson’s prime ministerial ambitions were assassinated by his friend and colleague Michael Gove, who decided he would make a better candidate for the premiership, only to fail in his own campaign to take over the Tories. Nigel Farage, the outrageous, racist and toxic leader of the UK Independence Party, has resigned his post, saying he wants his life back.

Caveat emptor.

These players may leave us with a system that is chaotic or in crisis. The upside, however, is that emergent gamers may break the system, and I say “break” forcefully, with positive connotations, because the break can subsequently cause an awakening, an epiphany, in those of us who, until the revolution began, were lethargic and failed to nurture the system adequately. The loopholes, so to speak, that allowed the rules to be flexed or ignored may end up being closed. The paradigm will be reasserted, our complacency eradicated, and the whole made more robust than it was before. That’s democracy working.

Emergent gameplayers can shock the system enough to bring it back to life and propel it out of apathy. While Brexit and Mr. Trump may be causing enormous uncertainty, huge and great uncertainty, I’d submit to you that there’s hope in that. Chaos may not be the most comforting thought, but it’s how we as Americans respond to the crisis that counts.


Clever & still stupid

A wonderful article over on Scientific American Mind outlines “dysrationalia,”* a term coined by the author, Keith E. Stanovich, and the differences between being intelligent and acting rationally. In other words, it’s about clever people doing stupid things. IQ is an inadequate measure of our capacity for idiocy.

“It is useful to get a handle on dysrationalia and its causes because we are beset by problems that require increasingly more accurate, rational responses. In the 21st century, shallow processing can lead physicians to choose less effective medical treatments, can cause people to fail to adequately assess risks in their environment, can lead to the misuse of information in legal proceedings, and can make parents resist vaccinating their children. Millions of dollars are spent on unneeded projects by government and private industry when decision makers are dysrationalic, billions are wasted on quack remedies, unnecessary surgery is performed and costly financial misjudgments are made.”

Much of the crux of dysrationalia comes down to how we allocate our energy for thinking. Most of us default to being “cognitive misers,” in that we often allocate fewer resources than it takes to fully think through complex problems and, as a result, get the wrong answer. Thinking rationally also requires the right “tools”: “[R]ules, data, procedures, strategies and other cognitive tools (knowledge of probability, logic and scientific inference) that must be retrieved from memory to think rationally,” Stanovich notes.

Look through the test questions he gives, and wonder at your inability to think deeply. I still can’t get my head around the one about the bat and ball cost.
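If it helps, the bat-and-ball question runs like this: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball; how much is the ball? The intuitive answer, 10 cents, is wrong, as a few lines of Python (my own sketch, not from the article) confirm:

```python
# Work in cents to sidestep floating-point noise.
# The cognitive miser's intuitive answer: the ball costs 10 cents.
ball = 10
bat = ball + 100           # the bat costs $1.00 more than the ball
print(bat + ball)          # 120 cents: ten cents over the $1.10 total

# Solving both conditions: ball + (ball + 100) = 110, so ball = 5.
ball = (110 - 100) // 2
bat = ball + 100
print(ball, bat)           # 5 and 105, which do sum to 110
```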

* I’ll also be using this as the name for my metal band.


Distance diagnostics

Interesting post over on Mindhacks about “celebrity analysis,” or the practice of diagnosing from afar the state of mind or mental issues of people in the spotlight. The post covers some interesting history about how those professional standards arose.

I’ve come across this in my job as an editorial page editor — a columnist wanted to argue that a certain public official was a sociopath — and I argued against it. If mental health professionals are discouraged from doing it, how much more should we have safeguards against the average journalist making those assessments? Even more egregiously, those diagnoses are conducted on the basis of the flimsiest of familiarities, without any personal, intimate knowledge of the diagnosee.

Horrifying. And, in my opinion, potentially libelous, like referring to someone as “crazy” in print. And we all want to avoid a defamation lawsuit.

Frankly, the field of psychiatry is deeply complex, often defies common sense and isn’t easily encapsulated by know-nothing laymen with access to a search engine and a copy of the DSM-5. Those without proper training in mental health shouldn’t be making diagnoses any more than a journalist should do brain surgery.

Broad but not deep: An argument for becoming a news consumer

One of my biggest concerns about the Internet is how the immediate and relentless access to information — both received passively and sought out actively — has dampened our discriminative abilities.

Less prosaically — the batteries in our bullshit detectors are dead. Or if not dead, drained from constant overuse.

“The great irony of our time is that there is more information available at our fingertips than anytime in human history, but less and less confidence in that information. Rather than being better informed because of the proliferation of easily available information, studies show news consumers are less informed on key issues of public policy.”

That’s from a June 2014 paper by the Brookings Institution’s James Klurfeld and Howard Schneider, “News Literacy: Teaching the Internet Generation to Make Reliable Information Choices.” In brief, the paper discusses Stony Brook University’s decision (from an idea of Howard Schneider, the dean of the university’s School of Journalism) to create a “news literacy” course in 2006 “aimed at educating the next generation of news consumers on how to make reliable news and information choices.” Hahaha, I would have said at the time. Good luck.

I love the idea of teaching news literacy, although I have the slightly uneasy feeling that it is a skill parents should really be passing along to their children by example, rather than something introduced at the college level*.

Children of a certain generation — like, ahem, my own and those older than I — soaked up news literacy osmotically by watching our parents watch TV news and read papers. I fondly remember sitting with my grandfather through the afternoon’s news on BBC 1 (we didn’t subscribe to any papers). Of course, the pre-Internet outlets for information were much fewer, more trusted and much, much better funded, and there was much more of it. So, while we may not have learned the critical skills of dissecting news content, many of us learned to appreciate it, because adults we emulated and looked up to appreciated it. We also didn’t have much choice. A lack of selection forced us to engage with the news.

Now we spend a large and increasing majority of our time online primarily looking for something to entertain or otherwise distract us.** Being informed, i.e., being a discriminating reader of news, is part of being a responsible member of society. I could launch here into a disparate analysis of why having this information makes us more rounded people, or how it helps us define our ethical values, or how it widens our perspective, but that could consume another thousand words, and this post is long enough as it is. Suffice it to say, these are valuable skills that will benefit you in every area of your life.

From the Brookings report on Stony Brook, here’s Richard Hornik, a faculty member who will be teaching news literacy in China:

“The ability of the next generation of citizens to judge the reliability and relevance of information,” he says, “will be a leading indicator of the public health of civil societies around the world.”

Reading attentively and critically is tough to do. It’s work. It takes time not all of us have.

As with diet, we’re hard-wired for the easily obtainable sugary treats, and Facebook, Reddit, Tumblr and Twitter provide them in excess, especially when we’re tired. We prize convenience — the easy option — and are averse to the work of carving out the precious time to plan our meals and set aside time for the gym, even though the benefits are clearly proven. In the same way, we’re reluctant to engage with information that may challenge what we think, or may enrage or depress us.

Real news is the side of steamed broccoli on the plate: we may only ever brush up against its long-chain carbs by accident, when a headline posted by one of our friends shows up in our newsfeeds. We may even comment on it.

(I’m kicking myself for not saving the link, but I did read somewhere (probably on Facebook) that most people who comment under linked articles posted to Facebook never actually bother to click through and read them. It might have been NPR, now that I think about it. Whatever. The point is that even when we see news, we brush right past it.)

As I’ve written before (in “In the Land of the One-Eyed Men,” a title that would be equally applicable to this post), the Internet is impairing our ability to concentrate. Our attention spans are becoming shorter. The more we read Buzzfeed’s top-10 lists and other listicles, the more we’ll want to read Buzzfeed’s pablum. We’ll read and continue to read, almost certainly, especially if that reading engages and entertains us. But the consequences for long-form journalism — and at this point I mean anything over 500 words — are dire. Why? Because the information we most need to be informed often isn’t that entertaining. I mean, I could write several hundred words on the importance of exceeding maintenance-of-effort funding in Frederick County schools, for instance, but unless I can somehow tie it to “A group of shirtless guys remade Beyonce’s 7/11 while stuck in a snowstorm,”*** or you’re in the school system, it’s unlikely you’ll take the time to understand it.

It’s a vicious cycle. We only look at the information that entertains us, so information media provides us with the information that entertains us, and so we circle the bowl ever downward. We become less attuned to what’s important, and media providers are less likely to provide meaningful information because it doesn’t get eyeballs.

Thus we become more and more misinformed, even though, thanks to our knowledge grazing, we believe we’re not.

How much so? Let’s return to the Brookings report for a second:

“A University of Maryland study of voter knowledge in the 2010 congressional elections found voters substantially misinformed on issues from the impact of the economic stimulus package to whether scientists believe climate change is occurring to President Obama’s birthplace, to cite just a few examples. Said the report: ‘Voters’ misinformation included beliefs at odds with the conclusions of government agencies, generally regarded as non-partisan, consisting of professional economists and scientists.’ … The survey also reflected a widespread perception of bias in the media that has too often poisoned any reasonable dialogue on the difficult issues of our times.”

Here’s the point: Evolving — becoming better versions of ourselves — takes constant work, constant diligence, and it’s not, as the self-help genre would have us believe, all that easy. At some point, though, we have to make the decision that despite our crowded schedules, we’re not going to settle for the status quo — to go on that diet, sign up for a gym membership, refuse that donut our coworker so thoughtfully provided. We do this because we hope for a better future.

We have to train ourselves to want that news, to seek it out, and when we find it, engage with it in a meaningful way.

Being informed takes an act of willpower. One must commit to it.

So, what I propose is this: Pick a newspaper, any newspaper. A proper paper copy, not a digital subscription or free website version. Why? Because you won’t be tempted down the Internet rabbit hole by all the inline links, and because a printed newspaper, while never absolutely accurate, is about as reliably vetted as it can be. Carve out 20 minutes every morning — 10, if you really have to — lay out the paper at your dining room table, and read the front page. Flip to the inside past the jump. Trust me, just doing this will make you a better-informed person.

If you’re really serious about it, like me, get a pen and underline the important bits. Seriously, I do this. It helps me focus.

Get used to doing that. Then find another newspaper. Read that, too. Before you know it, you’ll be a news junkie.

(Lifehacker has some great suggestions for the mechanics of reading to become informed — maybe too many. You don’t have to be that tightly wound about it.)

* Universities, colleges and other bastions of higher education are, of course, often where students are first exposed to subject matter that requires deep, analytical thought, of which news literacy is a subset.

** Fun fact: According to Nielsen’s 2013 Social Media Report, a third of 18- to 34-year-olds were on social media sites while using the bathroom. Whether the majority were during No. 1s or No. 2s was not researched.

*** If you clicked through to that link, get the fuck off my blog.

**** It’s probably cheaper _and_ more effective than a Lumosity membership.

… after the fire a still small voice*


I’m somewhat amazed at the millions — maybe billions — of words written about Zen meditation. I’ve read a fraction of them and listened to some of the practice’s leading minds (Zoketsu Norman Fischer, for example, who is exceptional, and Dr. Dan Siegel’s more secular work on mindsight).

It’s somewhat ironic, because there seems to be this endless human need to use words to relate a non-linguistic experience that is best approached much more simply by doing — which is precisely what many of the best texts recommend. I guess it’s much like the field of writing, and the amount of writing on how to do writing, when it’s best to sit down at the keyboard and just write.

With that caveat in mind, I’m going to add my own small contribution to the literature — small, certainly, but, I feel, important (… and aren’t they all? other writers might grumble).

I believe the following illustrates what’s meant by trying to achieve a state of “not thinking” during zazen.** Rather than this being the simple notion of having a completely inert, blank mind, it is uncovering a state where one’s mind is in motion, i.e., thinking, but is not inducing thought.

This is where the individual allows his or her thoughts to arise naturally, but doesn’t provoke thinking intentionally.

Putting it conversationally, don’t actively try to think. If it just happens, that’s OK, but don’t work at thinking. Thoughts may come of their own volition. Just let them.

I believe this is what Zen teachers mean when they talk about sitting without “expectation.” Expecting something helps drive thinking; it’s an active process, a wanting process, a doing process. In the odd realm of mind, expecting something to happen — thought, feeling, imagining — makes it happen.

Think of the process of thought like combustion in an engine (the mind). Active thinking is pushing the mind’s accelerator — it’s an intentional action. The engine should be left idling in neutral during meditation.

Unbidden thoughts are allowed. Not bidden thoughts.



* You’ll forgive me for using this title, because I think it’s cool even while it slightly mischaracterizes what I’m trying to communicate here. “Voice” evokes some kind of linguistic communication, and thus the intentional act of thinking that language requires, whereas the experience of zazen should be the reverse. In my interpretation, however, the “voice” is more a state of passive, non-judgmental awareness, or a communing with the true, Buddha self. I know, I know. I’m stretching the analogy to justify using this title. Bah.

** Central to Zen Buddhism is the practice of zazen, or sitting meditation. This is not about achieving some otherworldly trance state, or contemplation, but an attitude of wakeful awareness in which one observes in a detached fashion the workings of one’s mind. It’s not as easy as it sounds.


In the land of the one-eyed men …


The Washington Post has a rather fascinating article on our growing inability to focus on long pieces of text. I know I’m heaping on the irony here, but here’s the nutgraf:

To cognitive neuroscientists, Handscombe’s experience is the subject of great fascination and growing alarm. Humans, they warn, seem to be developing digital brains with new circuits for skimming through the torrent of information online. This alternative way of reading is competing with traditional deep reading circuitry developed over several millennia.

This is something I’m particularly subject to — I find myself skimming quite a lot in my day-to-day job as I look for topics on which to editorialize and supporting information to help make my arguments. The rise of the Internet, then mobile devices, has only contributed to our inability to concentrate. And if we can’t read well, that is, take in long chunks of text information and retain key elements about what we’ve just read, what’s happening to our writing ability, which takes, I believe, considerably more concentration and cognition than reading?

[Maryanne] Wolf [a Tufts University cognitive neuroscientist and the author of “Proust and the Squid: The Story and Science of the Reading Brain”], one of the world’s foremost experts on the study of reading, was startled last year to discover her brain was apparently adapting, too. After a day of scrolling through the Web and hundreds of e-mails, she sat down one evening to read Hermann Hesse’s “The Glass Bead Game.”

“I’m not kidding: I couldn’t do it,” she said. “It was torture getting through the first page. I couldn’t force myself to slow down so that I wasn’t skimming, picking out key words, organizing my eye movements to generate the most information at the highest speed. I was so disgusted with myself.”

Food for thought: If we’ve experienced this amazing cognitive adaptation in such a short period of time — just a couple of decades — to the point where it’s overwriting even decades of pre-Web experience in deep, prolonged reading, such that researchers of the phenomenon are noticing it in themselves, imagine what life will be like for our children. Wolf had to retrain herself to read or, as she termed it, become “bi-literate”: able to skim and scan, but also to retain enough focus to read at length. The process took about two weeks, according to The Post.

“We can’t turn back,” Wolf said. “We should be simultaneously reading to children from books, giving them print, helping them learn this slower mode, and at the same time steadily increasing their immersion into the technological, digital age. It’s both. We have to ask the question: What do we want to preserve?”

That’s a fascinating question. Not only is being able to concentrate and focus invaluable for reading and processing information, it’s indispensable to a meaningful life. Hence the title — those who manage to retain their ability to focus and, by extension, think and remember more powerfully, will truly be the kings and queens in a technologically blind age of short-term memory and constant interruption.





Last Tuesday was not my best morning. For some reason, over the past couple of weeks I’ve been having trouble sleeping, which has led to trouble getting up. And when I’m dog-tired, depression is not far behind. Not serious depression, but that glum, want-to-be-miserableness that adds just a wee pinch of sand to the pistons of living.

I was steeping in that feeling while sitting on the couch, nursing a cup of coffee before the kids woke up. I’m not sure where my moment of enlightenment came from or what, if anything, triggered it, but it was transformative. By the time I was halfway through my cup, by the time my daughter was up and had cuddled next to me on the couch, my mood was completely different: cheerful, happy, even joyous. I was re-energized.

In that interim, I had begun to make comparisons.

Unlike most “positive thinking” exercises, where the comparisons are forced, these came naturally and fluidly to mind, almost unbidden, and countered each negative statement I found myself making with a simple “but …”

Ah, Christ, it’s going to snow again. I hate the cold.

BUT: I have a warm jacket, and my children have warm jackets. We never have to be cold.

Ah, shit, I’ve put on weight in the last few months.

BUT: I have access to good food, good friends with whom to eat it, and, more importantly, I can choose what I put in my mouth. I am never hungry, except through choice.

Yet another school lunch to put together; this is never ending.

BUT: My children don’t go hungry. I can afford food for them. I have the luxury of being lazy, once in a while, and “treating” them to a cafeteria lunch, once again, because I can afford it.

Life is so damned busy: work, the kids, and everything!

BUT: I have the luxury of flexibility. My wife and I communicate well and make time for each other and our interests. We have a good support system. We’re an equal partnership. So what if there are piles of laundry clogging up our bedroom? We have running water for the wash, and clothes.

What I found was that the connections branching off from each “but” led me on to other statements along a chain of thought, equally positive, creating an arena of gratitude.

Perhaps what primed the pump was a Lent exercise from my wife’s church that we did at dinner this weekend — an hour of gratitude at family dinner that shared poverty statistics from around the world. Perhaps it was the effect caffeine had on my tired system. Who knows. But it was a powerful experience that stayed with me for much of the day and completely banished the depression.

We normally experience the authority of “but” in its negative sense, when we’re trying to get out of something, for example, or when we make a positive mental statement that we immediately counter with a negative. In this case, I used it to reverse a negative statement into its positive corollary. What surprised me was that it’s normally so damn easy to be negative, but real work to stay positive. This time, thanks to coffee or closeness, it was the reverse.

What was the lesson? I was reminded that I have no significant challenges to overcome; in fact, I have significant advantages from my life’s socio-economic position. Certainly, people have it worse. They also have it better. But rather than see my life from the perspective of those completely fabricated people*, rather than measure what I have against an illusory set of standards, I need to compare what I have with the problems I identify in these thought statements, and the only genuine comparison I can make is with myself.



* What I mean by “fabricated” is not that these people don’t exist, but the human tendency to create imaginary groups rather than actual real individuals, e.g., “the rich,” “the talented,” etc., or whatever advertising or “reality” TV happens to be in the zeitgeist of the time. To take this point a little further, though, I believe we should also avoid comparing ourselves even to real people, because we often have illusory and totally arbitrary standards for what makes them better or worse than us.


Uncovering Zen

I’ve been experiencing for the past couple of months a growing interest in Zen Buddhism, specifically the branches from Japan more so than Korea (Seon) or China (Ch’an).

I can’t quite put my finger on what it is I find so appealing, but the draw is incontestable, and each layer peeled away from the onion only reveals something more fundamentally compelling.

I’ve always had a fascination with Japan in general, although mostly when I was much, much younger, through the physical aspects of the country’s martial arts. Of course, as with nearly everything oriental, the spiritual is never far away. Martial arts have their own spiritual heritage, whether it’s recognized or not.

Eastern cultures, especially the Chinese and Japanese, have a deep concern for the mind — “mind” being the best word Westerners will understand for our inner phenomenology.

Zen, in particular, is less a religion than a practice: it has fairly dogmatic rituals and hierarchies, but few dogmatic beliefs. To call it spiritual is to mischaracterize Zen. Central to the practice is zazen, or simply sitting in meditation. Its goal, more or less, is satori: enlightenment, or realization of the pre-linguistic nature of mind.

To put it in more familiar language, Zen practice leads the practitioner to an awakening of themselves, an awareness of the “I” behind the relentless stream of chatter, images and connections of memories constantly thrown up by the mind. Zen, then, if it has a purpose, which is debatable, seeks a connection with the fundamental reality of … well, I guess “is-ness” is as good a word as any. The Chinese would call it “tao”; in Sanskrit, tathata, or “suchness.” In Christian terms, “I am that I am.” I rather like Alan Watts’ almost-nonsense phrase: “The which than which there is no whicher.” It’s an experience of realness, of engaging with the world directly. In sitting, one allows the phenomenological contents and mental processes to present themselves in awareness — images, inner narrative, memories and fantasies about the future — simply allowing them to happen, to provoke the feelings they will, then letting them pass.

I’ve been doing this for a short time, and it’s far from being as easy as it sounds. Like a blind, deaf man who has his sight and hearing restored, once one really pays attention to the mind’s contents, the resulting influx of stimulus can be overwhelming.

Ultimately — spoilers ahead — we understand we were enlightened all along. Zazen merely takes the blinkers off the horse. Eventually.

Words, as you may have gathered, are a pretty poor way to define concepts that rely on a direct connection with, or unveiling of, the something-something. Zen teachers can sound maddeningly paradoxical on these experiences. Dogen, one of the world’s greatest Zen masters, described the mental state of zazen as “thinking not-thinking.”

The notion of satori may not be a singular, all-encompassingly profound revelation. It’s a matter on which Zen split into two schools, Soto and Rinzai: the former speaks of gradual enlightenment, the latter of a sudden awakening.

Modern American Zen practices even de-emphasize enlightenment as an oversold concept with “unreal expectations,” Robert Aitken is quoted as saying in “One Bird, One Stone,” by Sean Murphy. (Aitken is one of American Zen’s most noted teachers. He was one of the first Americans to visit Japan and export Zen westward.)

I digress. Most practitioners tend to agree that satori, however it’s conceived, is a point of beginning rather than a definitive end: a single term for innumerable personal experiences, both subtle and profound.

In any case, satori, enlightenment, realization or awakening is not something about which I’m qualified to expound. If I’ve had it, I’m not aware of it, so take everything I say with a grain of salt as I’m simply digesting and regurgitating thoughts from those who have thought about the experience. Or thought about thoughts about the written thoughts about the experiential action of Zen.

Ironically, for an art that relies so much on a pre-linguistic awareness of self, divested of language and outside logic, Zen teachers have written thousands of books. Some are simply stated, some are deeply obscure, but there’s no end to the human desire to communicate these revelations. Even the act of writing — which is to me a supremely logical thing — can become a meditation practice. Life is a zendo — a place for meditation, a crucible for Zen mind.


Objectivity vs. advocacy

There’s an interesting discussion underway at The New York Times on one of the fundamentals of journalism. Is objectivity an outdated concept (it is, after all, a relatively recent invention of modern journalism), or should we allow the opinions of the journalist to become part of the reporting process?

Glenn Greenwald, who’s broken a number of major stories from the information gathered by former NSA contractor Edward Snowden, believes objectivity’s time has passed, and that it serves as cover for reporters to write their opinions into their stories. Bill Keller disagrees, arguing it’s impartiality that allows journalists to get closer to the truth.

There are elements of value in both positions. Personally, I think the objectivity/neutrality/impartiality debate is a slightly imperfect one, resting on an incorrect conflation of balance and fairness.

Balance — in which equal time and space are allotted to all parties and points of view — is an external condition almost impossible to fulfill, even in the best of circumstances.

Fairness, however, is an internal standard that merely offers equal opportunity to all voices, parties and points of view. It’s then their responsibility to take you up on it.

