Saturday, November 5, 2011

Connecting the Dots

Distance does not make you falter,
now, arriving in magic, flying,
and finally, insane for the light,
you are the butterfly and you are gone. — from Goethe, The Holy Longing
On the evenings I step back inside my home from an hour at my local coffee-house, I often pause by a bookcase just inside the door. I pick a book at random, usually one I’ve not read for a while or have never read—having bought books over the years that I grow into eventually—and, opening it anywhere, take in the tone and cadence, the rhythm of the sentences, the delight of walking in on a conversation in full swing. Reading out of context breaks the mind out of dull expectation; it throws one almost violently into a world emerging into light, a creative disjunction, an optical bending of shapes into images. All that, and it’s fun, too.
I picked up Michael Meade’s Men and the Water of Life, an initiation into myth and storytelling, and found a poem by Goethe I’d not read before called “The Holy Longing,” which concludes with this:
And so long as you haven’t experienced
this: to die and so to grow,
you are only a troubled guest
on the dark earth.
Then I pulled down Colin Wilson’s brilliant work, The Outsider, written in 1956, when he was only 24. The Outsider traces the literary development of the alienated ones, the people just beyond the thinnest edge of the crowd, the ones who by their very nature neither fit nor conform to polite society. They cherish their aloneness, yet they need others to truly be themselves. And the first page I opened it to . . . contained the stanza above from Goethe’s poem.
These moments of serendipity are mysterious and welcome. For me, they happen often enough that I am not surprised, though I’m always grateful. They are one of the small wonders of the universe. It’s like coming upon a bonsai garden, the tiny, perfectly formed trees, sometimes hundreds of years old, that stand majestically in their created environments.
On my way up the hill to home, with the sound of endless traffic behind me and a moonless sky above, I was thinking of “home.” Not the domicile (from Latin, domus) where, as the thesaurus puts it, “whenever you are absent, you intend to return,” but this Earth, this world. Perhaps not just this third rock from the Sun, but more the world we both create and observe, the imaginative world within which we live and move and study ourselves. 
My students and I had been talking in philosophy class about freedom, free will, and determinism, the questions that ask whether we choose our actions, whether we are destined or fated, or whether we are simply flung upon this earth. The question I had put to them reflected our readings and our discussion:
The determinist says: Every event has its explanatory cause.
Some people say: Everything happens for a reason.
Is there a difference between these two positions?
The answers were thoughtful, wry, insightful, even humorous. One group stepped up vigorously and denied any differences. Cause and reason, they said, are different words for the same thing. We see an event: we trace it back to a cause. If everything happens for a reason then there must be a cause, since reason implies purpose, and purposes don’t come out of nowhere. 
Another group advanced more cautiously, working the knife in between the stones in the wall to find the differences. For them, ‘cause’ implied a point of origin, the initial shove that set something in motion. ‘Everything happens for a reason’ is the phrase people use in the aftermath of an event, when they’re trying to make sense of something. They say it over their shoulders as they doggedly trudge forward.
A smaller group saw it as the bridge between science and religion, since science seeks knowledge of events and religion looks to faith to interpret what cannot be solely based on facts. 
And all week I had been, in spare moments, reading Walter Isaacson’s new biography of Steve Jobs, both rich in detail and broad in its scope. It’s a fascinating work, not only because Jobs is a fascinating subject, but because Isaacson sees the relentless purity at the center of the man’s soul. Jobs was a man whose dark side got up every morning and went to work with a knife between his teeth. His light side appeared occasionally, smiling and charming, with the knife held loosely behind his back. He was the dazzling embodiment of Kierkegaard’s maxim, “Purity of heart is to will one thing.” And for him the one thing was found at the intersection of Art and Technology where extraordinary engineering met exquisite design. He could not bear any deviance from the path of simplicity that led to perfection. How deep were his flaws and how high his aspirations!
Such purity of heart is dangerous, a flame that consumes all and finally itself. Is this what it takes to make a dent in the universe? 
Tell a wise person, or else keep silent.
Because the massman will mock it right away.
I praise what is truly alive,
what longs to be burned to death.
Every event has a cause but not all events are visible. Everything happens for a reason but sometimes we only see it after it’s over. Looking back, we connect the dots.

Saturday, October 22, 2011

Gaddafi, Interrupted

“When the passions of the past blend with the prejudices of the present, human reality is reduced to a picture in black and white.” — Marc Bloch, The Historian’s Craft
In the moments before starting a class I was about to teach in bioethics, one of those moments in which some students stare into space while others read over the assignment, a large woman burst into the classroom with a shout, her face wreathed in smiles, arms over her head, body swaying in a herky-jerky dance: “Gaddafi is dead!” she sang out. “Gaddafi is dead!” Some in the classroom registered mild surprise, others merely nodded, one or two gasped; the majority simply smiled at the delight of this woman who continued to chatter amiably about the event.
The next morning I glimpsed the front page of the New York Times and saw a blurred photograph of Gaddafi, head bloodied, a rictus of terror on his face, surrounded by men with guns, under a blazing sun that cast everything into patterns of light and dark. The photo was taken from a video shot in the moment—no doubt with a cell phone—a video that the Times assured us was even now circling the globe. A hated dictator comes to his end, dragged out from a drainage ditch, spreadeagled across the hood of a car amidst a mob, and eventually shot in the head at point-blank range. That’s one version of the story, anyway. 
The moment of liberation has finally come, a moment which Gaddafi, for all his paranoid bluster and atavistic arrogance, must have imagined in his night-sweats while on the run. ‘Lo, how the mighty have fallen,’ came to mind, as did memories of Saddam Hussein, wild-eyed and disheveled, dragged like a maggot from his hole on his way to a quick finish at the end of a rope. These moments are preserved for us, first in pixels, then in memory. But as Susan Sontag reminds us in her On Photography, “The ethical content of photographs is fragile.” In time, these photos will fade, not just physically, but from our immediate consciousness also. “A photograph of 1900 that was affecting then because of its subject would, today, be more likely to move us because it is a photograph taken in 1900.”
How do we distinguish the moments in which the hinge of history slides ponderously open? That which the media chooses today as the key to the future becomes in that future a footnote to a larger event from the past. With time and patience the historian discovers a pattern among the bones. But I am fascinated by the need in us to create a meaning now, to build a house upon the sand for the purpose of selling the real estate before the tide turns and it is swept away. Thus, every spike in the EKG of world affairs draws out the pundits whose wisdom-for-hire keeps us amused and distracted. 
This is not to say that we shouldn’t put current events in context, or that we should refrain from trying to understand what’s going on. I think it’s something else entirely, this uneasiness I have about a beta version of historical meaning. I came across a quote recently which looks at this puzzle from the other end.
“It often happens that those who live at a later time are unable to grasp the point at which the great undertakings or actions of this world had their origin. And I, constantly seeking the reason for this phenomenon, could find no other answer than this, namely that all things (including those that at last come to triumph mightily) are at their beginnings so small and faint in outline that one cannot easily convince oneself that from them will grow matters of great moment” (Matteo Ricci).
We feel the need to know what will happen, so much so that we create a probable history in which our commitments of time, money, and political capital will find the greatest return on investment. We’re not very good at predicting the future. Who foresaw the Arab Spring? We’re much more sure about our ‘winter of discontent,’ as gas prices surge and ebb, as health care in this country leaves millions in the cold, and as the political campaigning runs its brutal, if predictable, course.
Journalists of the old school, used to finding the facts and delivering them with as little authorial inflection as possible, are now asked to render judgment on what they report. This is a waste of time and talent, but not of money. In this branding era a news staff comes to be known for its daring—not the courage of reporters entering a war zone or taking on the rich and powerful—but of those who turn headlines into questions that have no answer. “Which Dictator is Going Down Next?” “Is Cain Dead in the Water?” “Can Rick Perry Overcome His Debate Blunders?” and “Will the Murdoch Clan Survive?”
Marc Bloch, an eminent French historian, joined the Resistance against the Nazi occupation of France when he was nearly 60. He was later captured by the Nazis, tortured, and finally executed near Lyons with twenty-six other patriots on June 16, 1944. In 1941, having been forced out of his academic post because of his Jewish ancestry, he began a book, The Historian’s Craft, which was never finished because of his untimely death. In it he reports on an incident, “the airplane of Nuremberg,” in which rumors of a provocation by the French against the Germans were not only untrue but went undisputed because it was useful to believe them. “Of all the types of deception,” he says, “not the least frequent is that which we impose upon ourselves, and the word ‘sincerity’ has so broad a meaning that it cannot be used without admitting a great many shadings.” 
I like what Steve Jobs said in his now-famous Stanford Commencement address of 2005: “You can’t connect the dots looking forward. You can only connect them looking backwards.” Some think journalism is the first draft of history, but in a 24/7 news-cycle today’s news is tomorrow’s history—and that’s simply not enough time to connect the dots.

Saturday, October 15, 2011

Transcending Opinions

“It is not enough to relate our experiences: we must weigh them and group them; we must also have digested them and distilled them so as to draw out the reasons and conclusions they comport.” — Michel de Montaigne, The Art of Friendship
“This is only my opinion, but. . . .”  Lately, whenever I hear that in the classroom, in a conference, in a faculty meeting, or in casual conversation, I want to tear off all my clothes and start screaming. Since that is against most social norms and my better judgment, I signal my displeasure by the merest arching of an eyebrow. 


How did we come to this point in common discourse? Why is it that when we edge ever closer to subjects of significance and weight, points that ought to be argued, elements of life that divide and conquer people, we retreat with a disarming smile into a cloud of unknowing? 
The rules of engagement in these battles are followed to the letter. First, the disclaimer: “This is only my opinion. . . .” Translation: “I’m sorry if you take offense at anything I say, but everyone has the right to their own opinion.” This is followed by the actual opinion, which varies in its relevance to the discussion but usually reflects the unconscious prejudices of the opinionator. Finally, there is the indemnification clause, intended to protect against the disagreeable opinions of others fired at point-blank range: “You may disagree, that’s okay—everyone is entitled to their own opinion—but I’m just saying. . . .” Then the speaker usually lapses into passivity, content to have said his piece, but uninterested in any extension of the argument unless it challenges his right to express his opinion.
This signals the death of dialogue and the throttling of democracy, which relies on the free exchange of ideas. But how can ideas freely circulate when they come walled about with petulant assertions designed to shore up fragile egos? We have lost the art of “conversation,” a word which can be traced back to its Latin roots in the idea of living in company with others, literally, ‘to turn about with.’ Another ancient root, a scriptural meaning, relates conversation to a ‘manner of life,’ or a way of being, never merely as a means of communication. It signifies a willingness to trust one another, to extend to others the means of grace whereby genuine learning can take place. It assumes that conversation takes time, that it evolves, and that it is so much more than mere assertion. 
Robert Grudin places this squarely in the realm of liberty and calls these conversational skills the ‘arts of freedom.’ In a fascinating meditation entitled On Dialogue, Grudin says, “Once gained, moreover, the arts of freedom must be kept fresh by thought and action, taught to the young, bequeathed down generations.” Otherwise, he warns, the posturing demagogue and the ravenous mass-marketer “will turn liberty into its own caricature, a barbarous fool driven by fear and greed.” 
It might seem a long leap from a classroom discussion to the foundations of democracy. We must also be wary of blaming the end of civilization on the young and restless. But Grudin, a professor of English at the University of Oregon, believes that these arts can and should be taught. “The operative pedagogical philosophy is that skill in these arts will enable people to make decisions and follow courses of action beneficial to themselves and society. In other words, people can learn freedom. Freedom is useless without a rational and emotional instrumentation that gives it substance.”  
What I often see in classroom discussions is more a clash of egos than an exchange of ideas. Many times those who speak up are so eager to claim their point of view as theirs that the point—if there even was one—is lost. Teachers don’t help much either. When I worked in faculty development I saw many syllabi which laid out elaborate rules for classroom discussions. I was struck by the pervasive fear which ran through the assumptions behind these rules. Students had to be protected from the sharp edges of differences between them: once you entered the classroom there were no races, genders, or cultures. Reference to these social categories was taboo: each person was at once an individual so autonomous that he perceived reality in exclusively personal terms and a member of a massive, amorphous, egalitarian lump. No doubt the intention was that no student should feel discriminated against—something no one should have to suffer—but the effect was to limit discussion to the confident few who wielded their vorpal swords for sport. These parts of our identity help make us who we are and we ignore them at our peril. They come back as labels and epithets if we don’t take their influence into consideration.
We learn with each other; that’s what conversation means. We are social beings, which is to say we find out who we are through interaction with others as well as reflection by ourselves. Self-awareness and self-reflection, though, are learned behaviors, brought about through practice in hearing about ourselves from other people as we dialogue. When we don’t practice listening before we speak, we panic when spoken to. Our desire to be known for ourselves rises up, and before we know it we are chanting the mantra of the blindingly obvious: “Everyone is entitled to their own opinion. . . .” Whereupon we deliver our opinion as a verdict rather than an invitation.
I once went to a conference for men held at a large hall in downtown Washington, D.C. It was led by Robert Bly, a poet and self-styled men’s mentor, who had just published a book entitled Iron John. It was a manifesto on being a real man without becoming a slack-jawed, brutish jerk. During the course of his presentation he gave some time for statements and questions from the floor, but placed some conditions on the speakers.  They had to keep their contributions to three sentences in the interest of time and they could ask questions—but any sentence that was not a question had to be a simple, declarative sentence. It was issued as a challenge: say what’s on your heart without hedging it about with qualifiers. I took it as a request for open, sincere, and rugged conversation. Nobody could do it. Virtually everyone who spoke danced about their subjects, adding implied questions, footnotes, self-referential phrasing, and jargon. Bly was disgusted and berated us for our narcissism. 
I have often thought of that experience, for it revealed some principles I’d like to live by. We need to think before we speak; we need to listen to others; we need to give each other grace so that we have a space in which to learn from each other. That’s not my opinion; that’s my invitation.

Saturday, October 8, 2011

Jobs for Everyone

“One comfort is, that Great Men, taken up in any way, are profitable company. We cannot look, however imperfectly, upon a great man, without gaining something by him.”  — Thomas Carlyle, On Great Men

The front page of Apple’s website on Wednesday, October 5, 2011, featured a single image — a black and white portrait of Steve Jobs with the dates, 1955-2011. Simple, elegant, minimalist, the photograph had the classic style of an Apple ad. One almost expected to scroll down and see the words, “Think Different.” 
Rarely does a CEO garner such respect and affection, much less one who built and headed a corporation with more cash in the bank than some countries have. But then Steve Jobs was more than a businessman, more than an entrepreneur. His death at 56 cut short his arc of brilliance before it reached its apogee and robbed us of the chance to see what he might be like in 20 years. As the accolades poured in, and flowers, notes, and apples were laid at the doors of Apple stores around the world, we were reminded that this kind of attention usually follows the death of royalty (Diana) or rock stars (John Lennon).
But Jobs was neither. A man with a child’s sense of wonder, he was the quintessential American success story. Adopted at birth by a working-class couple, he dropped out of college 17 years later because it was eating up his parents’ life savings—and he didn’t have a clue what he wanted to be anyway. Three years later he and Steve Wozniak built a prototype of an Apple computer in his parents’ garage. Within 10 years he had a $2 billion company, he and his pirate team had built the Macintosh, and, in an ironic twist, he’d been fired from his own company by the man he brought in to help guide it. He went on to found NeXT and Pixar, and finally to return to Apple, where he brought out the iPod, the MacBook, iTunes, the iPad, and, most famously, the iPhone. He died at home, surrounded by family, the day after the latest iteration, the 4S, was announced.
By now the essential elements of Jobs’ professional life are well-known, much as we know the beginning and the end of the Beatles. Like the Beatles he lived most of his adult life in the hot glare of media attention while he guarded his private life with a tenacity rarely seen in the celebrity world. But when Walter Isaacson’s authorized biography of Jobs is released later this October, much of that life will no doubt be revealed at last. 
By many accounts Jobs was mercurial and ruthless, a perfectionist with an eye for detail  and the capacity to drive employees to despair with his demands. But many also speak of his kindness, his love for his wife of 20 years and their four children, his willingness to mentor those young entrepreneurs in whom he saw some of his early fire and brashness. 
When I saw the announcement of his death my eyes filled with tears. In the days since I’ve found myself returning time and again to his image and life in odd moments between classes or when I’ve been waiting at a stoplight. I’ve wondered what his children and wife are going through, how his closest colleagues will feel when they walk the halls of 1 Infinite Loop, the Cupertino headquarters of Apple, and especially, what he must have thought about in the last painful weeks of his life. I have asked myself why he holds such fascination for so many of us and who he will become in the psyche of 21st century people. 
Already he is spoken of in the same breath as Edison, Walt Disney, and Leonardo da Vinci. David Pogue, in his regular column on tech products in the New York Times, speculated on the chances that another young visionary like Jobs is even now working in a garage somewhere, and put the odds at “Zero. Absolute zero.” People like Pogue, who knew Jobs for decades and sometimes disagreed vociferously with decisions he made, see him as a rare creature, one of the few who deserve to be called “genius.”
First, Jobs brought together technology and art in ways that no one had thought of before. The products of his design teams were the result of his own visions and imagination. Someone recently described the process of design at Apple as stripping away layer after layer of clutter and chaos until the designers arrive at the luminescent, irreducible pearl at the center. Most corporations don’t allow that kind of time to be spent in reduction instead of addition, but then most corporations are content to repeat what works until well after it doesn’t anymore. To open the box on a new product from Apple is to witness the epitome of presentation. Every part of the packaging has a purpose, every part contributes to the whole, and the whole is much more than the sum of its parts. It makes you want to keep the packaging as art in itself.
Second, Jobs never looked back. It was his view that if you’d succeeded at something it was time to throw yourself into the deep end and splash around until you found a new lifeline. He grabbed the idea of the mouse from PARC, made it standard in the industry, and then moved us away from it to something even more intuitive and natural—the gestures of hands and fingers. He took away our CD-ROMs, our external hard drives, cables, and flash drives. In their place he gave us elegance and simplicity. “It just works” was a refrain that ran constantly through Apple’s marketing and advertising.
Third, no one in recent memory has both commanded a corporation and put himself in the skin of an average consumer. Jobs had an uncanny ability to think like a customer, to focus on the results wanted, and then to provide the means to get there. 
Fourth, Jobs could see not just what would sell, and not just what would make something good even better, but something that no one had thought of yet. Even back in the Apple II days, when most people couldn’t imagine computers doing more than keeping recipes and shopping lists, Jobs was designing a personal computer to be a “bicycle for the mind.” He had to wait for the rest of the world to catch up sometimes, but more often than not he made us want the future before we could understand it—and when we did it was so natural it made us feel like we’d invented it ourselves. That’s called vision, and almost nobody has it. Bush the First derided ‘the vision thing’ because he saw it, as most corporate managers do, as something a committee cuts and pastes together when it has run out of ideas.
Finally, Jobs raised the bar on performance so high that he made others want to do better. That’s charisma. Leaders want it, but it’s not for sale. Seldom seen, it’s the result, I think, of a person who embodies a cluster of paradoxes: power that surrounds itself with others who are brilliant; confidence without egotism; purpose with a sense of humor; and enthusiasm without mania. It’s the Tao in action. We are fortunate to have shared time and space with a man who found it—as we all may—inside himself. 

Saturday, October 1, 2011

Moneyball's Learning Curve

The fundamental mistake is in taking the patterns we observe around us as facts of nature. They are not; they are the result of rational individuals adjusting to a particular set of constraints. . . . Change the constraints and, given a little time to adjust, the patterns change. — David Friedman, Hidden Order: The Economics of Everyday Life
What’s the best way to get someone to change their behavior? Use a carrot? Use a stick? I’ve been interested in motivation ever since I became a teacher and discovered that teachers can’t motivate students. If you beat them with a stick, it doesn’t increase their skills, and they’ll come to hate the process of learning. If you entice them with the carrot, they’ll do just enough to get the carrot and no more. Unlike teachers, most students don’t enjoy learning for its own sake. Come to think of it, teachers don’t either: they learn in order to accomplish a goal. But one thing that separates teachers from students is that teachers can’t understand why the goals they love aren’t what students love.
And that brings us to Moneyball, the movie starring Brad Pitt and Jonah Hill, based on the book of the same title by Michael Lewis. Lewis has a talent for making economics interesting in such bestsellers as Liar’s Poker and The Big Short. With Moneyball he looks inside the economics of baseball. The conventional wisdom is that the big spenders (Yankees, Red Sox) buy the best players and the division titles. They may even win the World Series. Money equals wins. But Billy Beane, the general manager of the Oakland A’s, had little money—in fact, one of the lowest payrolls in the league.
His first move was to hire a shy, soft-spoken kid, an economics graduate from Yale, who lived and breathed statistics—baseball statistics. It was the kid’s idea that meticulous scrutiny of a player’s stats could reveal patterns of performance pointing to value that scouts and managers alike rarely saw. Most of the players Beane picks up in following this advice are the bargain-basement overlooked or the over-the-hill gang that no one wants.
Predictably, the A’s scouts, a group of leathery, tough old guys, can’t see the logic and don’t appreciate the implication that tables of stats can trump years of experience. Beane is too old to waste time on methods that no longer work and young enough that he’s willing to bet the farm—and his reputation—on unproven concepts. This isn’t a movie review and I’ll try not to spoil it for you, but this is the takeaway I received: what people are worth is the value they place on their integrity. 
Billy Beane takes a clutch of misfits, has-beens, and also-rans and turns them into a team that wins 20 in a row—an American League record—by thinking of them as parts in a system rather than individual stars. Without the money to buy a slugger he goes for the ones who get on base. He buys a pitcher whose delivery looks like a knuckle-dragging primate on speed, and turns a broken catcher, Scott Hatteberg, into a first baseman. “What’s your biggest fear?” asks David Justice, a veteran player. “That someone will hit the ball toward me,” breathes Hatteberg. After Justice stops chuckling, he says, “Good one! That’s funny. But seriously. . . .” “No, really,” says Hatteberg, looking away. “That is my biggest fear.”
By accepting the constraints and working to maximize the effects, Beane and his staff turned the club around and, some would argue, changed the game. He was hardly an inspirational speaker, at least as portrayed in the movie, and he seems to have deliberately distanced himself at first from the players. “That makes it easier for him to cut us, right?” asks a player of Beane’s assistant. But as the season grinds on with few wins Beane holds informal seminars on the method and gradually convinces the players that together they can win. A few men with journeyman talent and an ability to put ego aside can achieve more than the glittering superstars. He trades a player whose taste for the fast life is messing with his game, sends a rookie star to Philadelphia because of his attitude, and regretfully but firmly drops a player who cannot measure up.
What are we worth? Exactly what we contribute when we put our hearts into it. But there’s no gauzy optimism in the A’s locker room, and you’ll never hear “I Believe I Can Fly” blasting from the sound system. In a scene that would make motivational coaches and school counselors cringe, Beane strides into the locker room after yet another loss and berates the players for celebrating anyway. “Do you like losing?” he yells and flings a bat down a corridor. In the sudden silence the sound reverberates for a long, long time. “That’s what losing sounds like,” he snarls, and stalks out. In the economy of teams at the bottom only one effect can give rise to a new cause: you have to hate losing more than you care about winning. In Beane’s pedagogy that’s neither a carrot nor a stick: it’s self-respect coupled with realism. 
Carol Dweck is a psychologist whose 30-plus years of research into motivation among children seems to back up Beane’s intuitions. She notes in her book, Self-Theories, that “The hallmark of successful individuals is that they love learning, they seek challenges, they value effort, and they persist in the face of obstacles.” Moreover, she punctures beliefs that are prevalent in our society, such as the notion that those with high ability are more likely to be mastery-oriented, or that praising a student’s intelligence will encourage qualities of mastery. They’re not and it won’t, she says. Instead, the ones who succeed are most often those who persist with vigor and humility to overcome obstacles, and who believe that they can learn, that intelligence is not fixed at birth. That’s a cheer for the underdogs, but Dweck goes one better: her research shows that the students who easily pull A’s but collapse in frustration when up against something difficult can learn a new attitude. They can shift from avoiding anything that would spoil their record to enjoying the challenge of learning something new. In other words, they are motivated from within.
Perhaps, at the risk of over-simplification, this could be expressed as a set of goals: See your limitations as challenges. Learn to love the questions. Keep at it. Share what you know. 

Saturday, September 24, 2011

Understanding Backwards


Life must be lived forward, but can only be understood backwards. — Soren Kierkegaard
For a good part of my life I have seen religion as a duty which must be accomplished with dedication if not enjoyment. Since all people are sinners and sinners must seek salvation it did not occur to me that some people might not see the point in all this religion business. “Oh, I’m not religious,” some friends would say to me, as if it were genetically transmitted or perhaps an acquired taste. They would blithely go about their lives, unencumbered by guilt, enjoying their sins, and occasionally pausing to shake their heads at my dutifulness. “Why do you bother?” they would ask curiously. 
For my part, I could not understand how religion could be regarded as an accessory. It was core, at the heart, deep inside, that which guided and prompted all that was good and pure and true. One could no more shuck it off and live a decent upright life than one could see one’s hand in a room without light. There was one way to salvation and that was through obedience to the rules, as inexplicable as they appeared sometimes. And yet I continued to meet people who claimed no religious allegiance, but seemed to me honest and good, the kind of people who would take you in during a storm or give you a lift miles out of their way. It was disconcerting. Some of them even smoked.
So I tried harder, tried to be dutiful, tried to be aligned without completely losing myself. But my self would slip out of my grasp at the most inopportune moments and do something embarrassing, like refusing to stand and go forward for altar calls. Even if I had made a clear and heartfelt decision years before to join the side of the angels, I squirmed in the pew when the preacher began his pitch. I felt that I owed it to the unchurched and the disbelievers in the house to stand yet again and be a living example. Despite my inner diatribe that religion was personal and that honesty demanded a consonance between motivation, belief and action, I felt I was letting down the team.
And yet I was always fascinated by religion, or rather by the quest for God and transcendence. Growing up in California in the 60s, I was surrounded by those who sought a shortcut to enlightenment or at least bliss. I plodded along, waving as they roared past, secure that I had the safer path if by far the slower one. If it was there, I thought, I’d find it eventually by dint of just keeping at it, one foot in front of the other. But I didn’t. 
I studied theology, philosophy of religion, eventually got a doctorate and taught religion for some years. I had no doubt I should be there and yet I constantly felt like an imposter. I could not be like my colleagues, men who had signed up for the church for life and who seemingly could overlook all manner of missteps and outright lies on the part of the church. I struggled to understand how to avoid the sin of self-righteousness while side-stepping hypocrisy. But pride goes before a fall—and I took a fall of my own making. 
Years later I am seeing some things much differently. I am learning not to let the foibles of the official church body distract me from my own spiritual quest. I have met the enemy, like Pogo, and they are me. I know what I am capable of doing against my better judgment and where most of the fault lines appear in my foundations. 
And I have learned, or perhaps discovered, that signing up for a set of beliefs is not the point. Some beliefs fall away over the years because they never really found a place; I never really believed them. Others simply don’t make sense no matter how I’ve tried. But the vast majority of religious beliefs ought to be seen as practices. We practice them because in the practicing comes understanding, and with understanding comes the willingness to live in grace, to be in God. “Religion is a practical discipline that teaches us to discover new capacities of mind and heart,” says Karen Armstrong in The Case for God. “You will discover their truth—or lack of it—only if you translate these doctrines into ritual or ethical action.”

Orthopraxy over orthodoxy—right action over right belief is how I see it—but with two important caveats. First, we do not earn our way through “right” action because this is not a contractual relationship. God is in the giving business, not the litigation business. Thus, I have nothing to fear from him; I have no need to protect myself. Second, belief is not abandoned, but made firm through action. “Like any skill,” continues Armstrong, “religion requires perseverance, hard work, and discipline. Some people will be better at it than others, some appallingly inept, and some will miss the point entirely. But those who do not apply themselves will get nowhere at all.”
In the end—and in fact, in the beginning and in the middle—is grace. That is what makes this whole venture possible. Room to move, to experiment, to make mistakes and learn from them. Here is the mysterious presence of the Christ. T. S. Eliot knew something of this, laying down the lines in The Waste Land:
Who is the third who walks always beside you?
When I count, there are only you and I together
But when I look ahead up the white road
There is always another one walking beside you. . . .
And that is enough for the time being.

Saturday, September 17, 2011

9/11 and Counting


“Not until we are lost, in other words, not till we have lost the world, do we begin to find ourselves, and realize where we are and the infinite extent of our relations.” — Henry David Thoreau, Walden
My grandfather was an historian and a college teacher who filled me with a love for history and a respect for those who write it. I was raised by my grandparents, teachers both, and our home was packed with books, magazines, and journals, many of them about ancient history, medieval history, and modern history. My first understanding was that history told the story of what had happened long ago, that it was a true and valid record of those events, and that it stood in the same relation to the Eternal Verities as the Law. One learned and believed History and one kept the Law. Neither was profitably to be questioned.
These beliefs, solidified and tested by teachers in elementary school, were gently but firmly undermined by my grandfather’s tutelage. History, he said, was what historians reconstructed from written documents, eyewitness accounts, physical evidence and a sanctified imagination. The story could have been written another way; in fact, there were many ways to tell the story and most of them could be seen to fit the facts as they were known.
This was endlessly fascinating to me. It threw a relativism into the works that furnished my imagination with a constant stream of long shots and close-ups seen from different angles. Historical figures, outsized characters like Lord Nelson, Napoleon, and Winston Churchill, became, through my grandfather’s stories, people whose flaws were as tangible as their virtues. Neither was to be ignored nor were the flaws to define the person, as tempting as that was. History, in the way my grandfather taught it, was complex and multi-layered, a spider’s web of nobility, contrivances, deceit and bravery. It was not, as the old quip has it, ‘one damn thing after another,’ but a vast and ongoing story—a tale told with a point, freely offered up for scrutiny.
The events of the 60s, exploding over my generation, came so fast and furiously that the quip seemed justified. It was one damn thing after another. Apparently random events took on a sinister afterglow, conspiracy theories bred like fruit flies, and the Book of Revelation bookmarked the nightly news. And if journalism was the rough draft of history, then propaganda from all points on the political spectrum was the marketer’s flack, guaranteed to fill the worst with a passionate intensity.
My cognitive dissonance over America’s Manifest Destiny gone rancid in Vietnam was further jolted by the realization that Martin Luther King, Jr. was breaking the Law. He didn’t just break it, though; he first hauled it up from the depths, like some blinkered Morlock, where it could be seen for the poisonous creature it was. The social effect of his nonviolent resistance to institutional racism was the permanent dwarfing of the law. From that point on, certainly for my budding political awareness, the law no longer had the implicit seal of approval from on high. I saw it as a human construct, flawed and dangerous when it served only the interests of the powerful. For me this was a new experience: I found myself in a dark wood without the familiar landmarks, and the path I’d traveled daily suddenly looked alien and forbidding. Thoreau had been there too, literally if not metaphorically. The traveller in the forest looks around and “Though he knows that he has travelled it a thousand times, he cannot recognize a feature in it,” says Thoreau. “. . . .[I]t is as strange to him as if it were a road in Siberia.”
Perhaps that was our feeling on 9/11 when, on a perfect day in September, our predictable world turned dark and terrifying. How could this happen here? Ten years on from that day I wonder what we have learned. Are we still the adolescent nation, armed, fractious, and trigger-happy? Are we still oblivious to the effect our bumbling self-centeredness has on the world around us?  “Not till we have lost the world,” said Thoreau, “do we begin to find ourselves, and realize where we are and the infinite extent of our relations.”
Someone once remarked that “the world is passing strange and wondrous.” That it certainly is. There is mystery and wonder all around us—in the violent dislocations as well as in the harmonies we find. Our common histories quickly become uncommon when we make allowance for a shift in view.
“In rethinking our history,” says Howard Zinn, historian and author of A People’s History of the United States, “we are not just looking at the past, but at the present, and trying to look at it from the point of view of those who have been left out of the benefits of so-called civilization. It is a simple but profoundly important thing we are trying to accomplish, to look at the world from other points of view.”
That can be the legacy of 9/11.