Saturday, November 19, 2011

Shall We Let the Dogs of War Sleep?

Who will tell me
why I was born,
why this monstrosity
called life. — Anna Swir, from “Poetry Reading”
One of the unintended consequences of globalization is that no one is a bystander to world events anymore. A. C. Grayling, Master of the New College of the Humanities in London, philosopher, and frequent contributor to The Times, notes that “Saying that there are no bystanders any more means that everyone is involved in everything.” In Grayling’s words, running away from our knowledge of atrocities and terrorism “is a refusal to recognise, think through, and try to deal with the sources of that danger.” 
There have been plenty of opportunities to think through the atrocities of the twentieth century, the bloodiest in modern history, and one of them, the Khmer Rouge genocide against the middle class in Cambodia, surfaced this week in a story in the New York Times. A tribunal that is trying leaders of the Khmer Rouge has released one of the defendants, Ieng Thirith, 79, the most powerful woman in that government. Between 1975 and 1979 the Khmer Rouge government murdered 1.7 million people through “execution, torture, forced labor, starvation and disease.” Ms. Thirith, the former minister for social affairs, was charged with crimes against humanity in “planning, direction, coordination and ordering of widespread purges.” 
But the tribunal has recommended the immediate release of Thirith because she “lacks capacity to understand proceedings against her or to meaningfully participate in her own defense.” She exhibits symptoms of Alzheimer’s, is disoriented and forgetful, and sometimes talks to herself. Occasionally, she snaps in public and rants at the tribunal, proclaiming her innocence and expressing shock that she, the scion of a respectable family, should be hauled up on such outrageous charges as murder and genocide. Apparently, her powers of reasoning allow her to place the blame for murder on her compatriots, while she was only responsible for bureaucratic paper-shuffling. 
Her fellow defendants are, like herself, old people now, but once they were young revolutionaries who joined Pol Pot in turning Cambodia into the killing fields. Pol Pot died in 1998 without coming to trial. Should the international community forgive these people because it was a long time ago and the defendants are weak, powerless people with one foot in the grave? 
It is a mark of moral courage that courts such as the International Criminal Court even exist. The United States is one of only three countries worldwide to have unsigned itself from the Court, a step taken during the Bush era; it will not participate in any proceedings and will not allow its citizens to be brought up on charges. No doubt there are varied and complex reasons for this, but it smells bad. 
Since we are all participants and no longer bystanders, the action of the U.N. court in Cambodia raises all sorts of ethical questions. A humane society holds that no matter the culpability of a defendant, that person cannot be tried if he or she cannot, through mental incompetence, understand the charges. The presumption is that only the sane can be tried because only the sane are responsible for their crimes and for the acknowledgement of them. The banality of evil in people (the phrase is Hannah Arendt’s) means that a person can sign the death warrants of millions and go home to a loving family, a cozy dinner, and a satisfying sleep for a job well done. Thus, Ieng Thirith, no doubt as sane as any government official can be, could participate in genocide but cannot be held accountable for it years later because she now has the mental and moral capacity of a squirrel. 
Many of the 20th century’s war criminals have been indicted in their golden years only to die before a verdict could be reached. Slobodan Milosevic and Augusto Pinochet come to mind, while the early phase of Mubarak’s trial in Egypt was conducted while he lay in a hospital bed. No doubt Syria’s Assad, should he ever come to trial for crimes against humanity, will suffer a heart attack. I’m sure it’s all very stressful. On the other hand, rough justice of a sort caught up with Saddam, and Gaddafi, already indicted for war crimes before he met his ignoble end in the midst of an angry mob, might also have stood trial. 
Is it the sheer magnitude of their crimes, which sometimes beggars description, that fills us with revulsion? Is that why they should be brought to justice? What do we gain by sentencing a 70-year-old to 134 years in prison? Even if they are executed, that doesn’t serve as a deterrent to up-and-coming young dictators; each one seems to believe that he plays out his drama on a stage sequestered from the world. Can we make up for the loss of thousands, sometimes millions, of victims who will never live out their potential? Can one death redress the hurt of so many of the victims’ families? 
We know it can’t. But we’re also not willing to let these crimes pass by. Why do we pursue the perpetrators, spending years and sometimes millions of dollars tracking them down, producing witnesses, compiling evidence, and presenting the facts? 
Perhaps it is for two reasons: to honor the memory of those who were humiliated, displaced, tortured and executed, and to remember what it means to be human. Vengeance is God’s but honor remains to us, the living. We must carry on from day to day, fighting the impulse to strike back in like manner, and instead, through a scrupulously fair legal process, show that the poison of evil that pervades the human psyche does not define the human spirit. 

Saturday, November 12, 2011

Rick Perry and the Politics of Certainty

How it tilts while you are thinking,
and then you know. How it makes no difference
for a long time—then it does. — William Stafford, “Figuring Out How It Is”
This week Rick Perry cocked a finger at Ron Paul in another Republican debate and shot a blank. In a gaffe heard round the world, Perry couldn't come up with the third in a short list of federal agencies he'd throttle if he became president. In a single, riveting moment all his Texas-sized bravado farted out like an untied balloon. It was awful and cringeworthy and . . . there's a lesson in it for all of us.


The world is made up of two kinds of people: those who think they know and those who know they don’t. I am definitely in the second camp. . . I think. How can we even make definitive statements like the one above when we are “of two minds”? How can we know anything with certainty? 
I am fascinated by people who speak with absolute certainty, and slightly repelled also. I wonder how they can be so sure, why they think they have an inside track on knowledge, and most of all, do they ever admit to being wrong? Confucius said, “Do you know what true knowledge is? To know when you know a thing, and to know when you do not know a thing. That is true knowledge.”  Epictetus, that tough old Stoic, used to say, “You can’t teach a man something he thinks he already knows.” And therein lies the beginning of wisdom, without a doubt. . . 
It’s not easy being this way. For one thing, living in a state of doubt means constantly seeking evidence, testing, sifting, weighing what appears, until something emerges from this process that offers a glimmer of hope. There are facts, of course, and necessary truths, such as 2 + 2 = 4, and all those a priori truths that Kant lured out of the shadows. For the doubter, even these prompt at least a momentary pause (Whaddya mean these are axiomatic? Prove it!) until the mind overrules the emotions in the interest of saving time. 
Down at the level of leather-on-the-pavement this kind of epistemological suspicion can become quite inconvenient. For a while after the United States Postal Service misdirected a couple of bills and my electricity was cut off, I could not bring myself to drop any letters through a post box slot. Instead, I delivered the check in person, not trusting a service that daily delivers, with uncanny precision, tons of junk mail to each and every citizen with an address. I got over it. Eventually.  
For years I have wished that I could hold a viewpoint with confidence if not with complete assurance, for it would make life so much easier. Inevitably, I admit that an opposing perspective has its points, that in all honesty some of its points are better than mine, and after all, who am I to say that I stand upon the solid rock, while all around is shifting sand? Seeing multiple points of view often leads to double vision—and to vertigo—that existential disease that leaves one panting, hanging over the abyss while mice gnaw at the sleeve caught on a branch that soon will snap. Dubious workarounds present themselves in such desperate circumstances. One begins a sentence without knowing how it will end but the mind churns on, dredging up in nanoseconds all manner of rusty facts and anecdotes, the tires of memory lying at the bottom of our subconscious, the flotsam and jetsam of headlines and conversation. Occasionally, the will to power asserts itself, all niceties are sheared away, and the mind fastens, terrier-like, upon a position, any position that looks like it could stand an absent-minded glance if not a steely scrutiny. In those moments, one feels a giddiness that can be mistaken for  certainty until someone breaks the silence that follows with a sigh and a shake of the head. 
Time and time again I’ve had the experience of suddenly seeing something familiar shift ever so slightly and take on a new form. In those moments I wonder at the filters I’ve apparently installed that prevent me from seeing the full spectrum of visible light. Once the new thing has been seen it cannot be ignored, of course, and one is left to ponder how much else has been overlooked because it simply did not register on our consciousness. But selective perception is not the only constraint upon us. In a discussion I used to be the one who waited so long with a question or a comment that the general train of thought had hurtled over the horizon by the time I offered it up. I wanted to make sure that my question did not betray any lack of knowledge or foresight. Once I realized that recognizing our ignorance is the first movement toward learning, much of the ego simply melted away. 
So I bow to the idea that we are social animals and that we learn together. I’m rarely capable of doing a Descartes—shutting myself up in a little room and doubting my way down through the detritus to the solid foundation of indubitable existence. I learn faster when I’m with a group of people who have maximum curiosity and the willingness to share it. Most of what we know is handed to us, warm to the touch, from people like ourselves or sometimes from people we think we’d like to be. In those cases, having our doubts can be a good thing because they give us a moment to step back and look at the wide shot first. 
Humility and grace—the two virtues that free us up to learn. Of that I am certain.

Saturday, November 5, 2011

Connecting the Dots

Distance does not make you falter,
now, arriving in magic, flying,
and finally, insane for the light,
you are the butterfly and you are gone. — from Goethe, The Holy Longing
On the evenings I step back inside my home from an hour at my local coffee-house, I often pause by a bookcase just inside the door. I pick a book at random, usually one I’ve not read for a while or have never read—having bought books over the years that I grow into eventually—and opening it anywhere, take in the tone and cadence, the rhythm of the sentences, the delight of walking in on a conversation in full swing. Reading out of context breaks the mind out of dull expectation; it throws one almost violently into a world emerging into light, a creative disjunction, an optical bending of shapes into images. All that, and it’s fun, too. 
I picked up Michael Meade’s Men and the Water of Life, an initiation into myth and storytelling, and found a poem by Goethe I’d not read before called “The Holy Longing,” which concludes with this:
And so long as you haven’t experienced
this: to die and so to grow,
you are only a troubled guest
on the dark earth.
Then I pulled down Colin Wilson’s brilliant work, The Outsider, written when he was only 24, in 1956. The Outsider traces the literary development of the alienated ones, the people just beyond the thinnest edge of the crowd, the ones who by their very nature neither fit nor conform to polite society. They cherish their aloneness, yet they need others to truly be themselves. And the first page I opened it to . . . contained the stanza above from Goethe’s poem. 
These moments of serendipity are mysterious and welcome. For me, they happen often enough that I am not surprised, though I’m always grateful. They are one of the small wonders of the universe. It’s like coming upon a bonsai garden, the tiny, perfectly formed trees, sometimes hundreds of years old, that stand majestically in their created environments. 
On my way up the hill to home, with the sound of endless traffic behind me and a moonless sky above, I was thinking of “home.” Not the domicile (from Latin, domus) where, as the thesaurus puts it, “whenever you are absent, you intend to return,” but this Earth, this world. Perhaps not just this third rock from the Sun, but more the world we both create and observe, the imaginative world within which we live and move and study ourselves. 
My students and I had been talking in philosophy class about freedom, free will, and determinism, the questions that ask whether we choose our actions, whether we are destined or fated, or whether we are simply flung upon this earth. The question I had put to them reflected our readings and our discussion: 
The determinist says: Every event has its explanatory cause.
Some people say: Everything happens for a reason.
Is there a difference between these two positions?
The answers were thoughtful, wry, insightful, even humorous. One group stepped up vigorously and denied any differences. Cause and reason, they said, are different words for the same thing. We see an event: we trace it back to a cause. If everything happens for a reason then there must be a cause, since reason implies purpose, and purposes don’t come out of nowhere. 
Another group advanced more cautiously, working the knife in between the stones in the wall to find the differences. For them, ‘cause’ implied a point of origin, the initial shove that set something in motion. ‘Everything happens for a reason’ is the phrase that people use in the aftermath of an event when they’re trying to make sense of something. They say it over the shoulder as they doggedly trudge forward. 
A smaller group saw it as the bridge between science and religion, since science seeks knowledge of events and religion looks to faith to interpret what cannot be solely based on facts. 
And all week I had been, in spare moments, reading Walter Isaacson’s new biography of Steve Jobs, both rich in detail and broad in its scope. It’s a fascinating work, not only because Jobs is a fascinating subject, but because Isaacson sees the relentless purity at the center of the man’s soul. Jobs was a man whose dark side got up every morning and went to work with a knife between his teeth. His light side appeared occasionally, smiling and charming, with the knife held loosely behind his back. He was the dazzling embodiment of Kierkegaard’s maxim, “Purity of heart is to will one thing.” And for him the one thing was found at the intersection of Art and Technology where extraordinary engineering met exquisite design. He could not bear any deviance from the path of simplicity that led to perfection. How deep were his flaws and how high his aspirations!
Such purity of heart is dangerous, a flame that consumes all and finally itself. Is this what it takes to make a dent in the universe? 
Tell a wise person, or else keep silent.
Because the massman will mock it right away.
I praise what is truly alive,
what longs to be burned to death.
Every event has a cause but not all events are visible. Everything happens for a reason but sometimes we only see it after it’s over. Looking back, we connect the dots.

Saturday, October 22, 2011

Gaddafi, Interrupted

“When the passions of the past blend with the prejudices of the present, human reality is reduced to a picture in black and white.” — Marc Bloch, The Historian’s Craft
In the moments before the start of a bioethics class I was about to teach, one of those moments in which some students stare into space while others read over the assignment, a large woman burst into the classroom with a shout, her face wreathed in smiles, arms over her head, body swaying in a herky-jerky dance: “Gaddafi is dead!” she sang out. “Gaddafi is dead!” Some in the classroom registered mild surprise, others merely nodded, one or two gasped; the majority simply smiled at the delight of this woman who continued to chatter amiably about the event. 
The next morning I glimpsed the front page of the New York Times and saw a blurred photograph of Gaddafi, head bloodied, a rictus of terror on his face, surrounded by men with guns, under a blazing sun that cast everything into patterns of light and dark. The photo was taken from a video shot in the moment—no doubt with a cell phone—a video that the Times assured us was even now circling the globe. A hated dictator comes to his end, dragged out from a drainage ditch, spreadeagled across the hood of a car amidst a mob, and eventually shot in the head at point-blank range. That’s one version of the story, anyway. 
The moment of liberation has finally come, a moment which Gaddafi, for all his paranoid bluster and atavistic arrogance, must have imagined in his night-sweats while on the run. ‘Lo, how the mighty have fallen,’ came to mind, as did memories of Saddam Hussein, wild-eyed and disheveled, dragged like a maggot from his hole on his way to a quick finish at the end of a rope. These moments are preserved for us, first in pixels, then in memory. But as Susan Sontag reminds us in her On Photography, “The ethical content of photographs is fragile.” In time, these photos will fade, not just physically, but from our immediate consciousness also. “A photograph of 1900 that was affecting then because of its subject would, today, be more likely to move us because it is a photograph taken in 1900.”
How do we distinguish the moments in which the hinge of history slides ponderously open? That which the media chooses today as the key to the future becomes in that future a footnote to a larger event from the past. With time and patience the historian discovers a pattern among the bones. But I am fascinated by the need in us to create a meaning now, to build a house upon the sand for the purpose of selling the real estate before the tide turns and it is swept away. Thus, every spike in the EKG of world affairs draws out the pundits whose wisdom-for-hire keeps us amused and distracted. 
This is not to say that we shouldn’t put current events in context, or that we should refrain from trying to understand what’s going on. I think it’s something else entirely, this uneasiness I have about a beta version of historical meaning. I came across a quote recently which looks at this puzzle from the other end.
“It often happens that those who live at a later time are unable to grasp the point at which the great undertakings or actions of this world had their origin. And I, constantly seeking the reason for this phenomenon, could find no other answer than this, namely that all things (including those that at last come to triumph mightily) are at their beginnings so small and faint in outline that one cannot easily convince oneself that from them will grow matters of great moment” (Matteo Ricci).
We feel the need to know what will happen, so much so that we construct a probable history, the better to ensure that our commitments of time, money, and political capital will find the greatest return on investment. We’re not very good at predicting the future. Who foresaw the Arab Spring? We’re much more sure about our ‘winter of discontent,’ as gas prices surge and ebb, as health care in this country leaves millions in the cold, and as the political campaigning runs its brutal, if predictable, course. 
Journalists of the old school, used to finding the facts and delivering them with as little authorial inflection as possible, are now asked to render judgment on what they report. This is a waste of time and talent, but not of money. In this branding era a news staff comes to be known for its daring—not the courage of reporters entering a war zone or taking on the rich and powerful—but of those who turn headlines into questions that have no answer. “Which Dictator is Going Down Next?” “Is Cain Dead in the Water?” “Can Rick Perry Overcome His Debate Blunders?” and “Will the Murdoch Clan Survive?”
Marc Bloch, an eminent French historian, joined the Resistance against the Nazi occupation of France when he was nearly 60. He was later captured by the Nazis, tortured, and finally executed near Lyons with twenty-six other patriots on June 16, 1944. In 1941, having been forced out of his academic post because of his Jewish ancestry, he began a book, The Historian’s Craft, which was never finished because of his untimely death. In it he reports on an incident, “the airplane of Nuremberg,” in which rumors of a provocation by the French against the Germans were not only untrue but went undisputed because it was useful to believe them. “Of all the types of deception,” he says, “not the least frequent is that which we impose upon ourselves, and the word ‘sincerity’ has so broad a meaning that it cannot be used without admitting a great many shadings.” 
I like what Steve Jobs said in his now-famous Stanford Commencement address of 2005: “You can’t connect the dots looking forward. You can only connect them looking backwards.” Some think journalism is the first draft of history, but in a 24/7 news-cycle today’s news is tomorrow’s history—and that’s simply not enough time to connect the dots.

Saturday, October 15, 2011

Transcending Opinions

“It is not enough to relate our experiences: we must weigh them and group them; we must also have digested them and distilled them so as to draw out the reasons and conclusions they comport.” — Michel de Montaigne, The Art of Friendship
“This is only my opinion, but. . . .”  Lately, whenever I hear that in the classroom, in a conference, in a faculty meeting, or in casual conversation, I want to tear off all my clothes and start screaming. Since that is against most social norms and my better judgment, I signal my displeasure by the merest arching of an eyebrow. 


How did we come to this point in common discourse? Why is it that when we edge ever closer to subjects of significance and weight, points that ought to be argued, elements of life that divide and conquer people, we retreat with a disarming smile into a cloud of unknowing? 
The rules of engagement in these battles are followed to the letter. First, the disclaimer: “This is only my opinion. . . .” Translation: “I’m sorry if you take offense at anything I say, but everyone has the right to their own opinion.” This is followed by the actual opinion, which varies in its relevance to the discussion, but usually reflects the unconscious prejudices of the opinionator. Finally, there is the indemnification clause, intended to protect against the disagreeable opinions of others fired at point-blank range: “You may disagree, that’s okay—everyone is entitled to their own opinion—but I’m just saying. . . .” Then the speaker usually lapses into passivity, content to have said his piece, but uninterested in any extension of the argument unless it challenges his right to express his opinion. 
This signals the death of dialogue and the throttling of democracy, which relies on the free exchange of ideas. But how can ideas freely circulate when they come walled about with petulant assertions designed to shore up fragile egos? We have lost the art of “conversation,” a word which can be traced back to its Latin roots in the idea of living in company with others, literally, ‘to turn about with.’ Another ancient root, a scriptural meaning, relates conversation to a ‘manner of life,’ or a way of being, never merely as a means of communication. It signifies a willingness to trust one another, to extend to others the means of grace whereby genuine learning can take place. It assumes that conversation takes time, that it evolves, and that it is so much more than mere assertion. 
Robert Grudin places this squarely in the realm of liberty and calls these conversational skills the ‘arts of freedom.’ In a fascinating meditation entitled On Dialogue, Grudin says, “Once gained, moreover, the arts of freedom must be kept fresh by thought and action, taught to the young, bequeathed down generations.” Otherwise, he warns, the posturing demagogue and the ravenous mass-marketer “will turn liberty into its own caricature, a barbarous fool driven by fear and greed.” 
It might seem a long leap from a classroom discussion to the foundations of democracy. We must also be wary of blaming the end of civilization on the young and restless. But Grudin, a professor of English at the University of Oregon, believes that these arts can and should be taught. “The operative pedagogical philosophy is that skill in these arts will enable people to make decisions and follow courses of action beneficial to themselves and society. In other words, people can learn freedom. Freedom is useless without a rational and emotional instrumentation that gives it substance.”  
What I often see in classroom discussions is more a clash of egos than an exchange of ideas. Many times those who speak up are so eager to claim their point of view as theirs that the point—if there even was one—is lost. Teachers don’t help much either. When I worked in faculty development I saw many syllabi which laid out elaborate rules for classroom discussions. I was struck by the pervasive fear which ran through the assumptions behind these rules. Students had to be protected from the sharp edges of differences between them: once you entered the classroom there were no races, genders, or cultures. Reference to these social categories was taboo: each person was treated both as an individual so autonomous that he perceived reality in exclusively personal terms and as a member of a massive, amorphous, egalitarian lump. No doubt the intentions were that no student should feel discriminated against—something no one should have to suffer—but the effect was to limit discussion to the confident few who wielded their vorpal swords for sport. These parts of our identity help make us who we are and we ignore them at our peril. They come back as labels and epithets if we don’t take their influence into consideration.
We learn with each other; that’s what conversation means. We are social beings, which is to say we find out who we are through interaction with others as well as reflection by ourselves. Self-awareness and self-reflection, though, are learned behaviors, brought about through practice in hearing about ourselves from other people as we dialogue. When we don’t practice listening before we speak, we panic when spoken to. Our desire to be known for ourselves rises up and before we know it we are chanting the mantra of the blindingly obvious: “Everyone is entitled to their own opinion. . . .” Whereupon we deliver our opinion as a verdict rather than an invitation. 
I once went to a conference for men held at a large hall in downtown Washington, D.C. It was led by Robert Bly, a poet and self-styled men’s mentor, who had just published a book entitled Iron John. It was a manifesto on being a real man without becoming a slack-jawed, brutish jerk. During the course of his presentation he gave some time for statements and questions from the floor, but placed some conditions on the speakers.  They had to keep their contributions to three sentences in the interest of time and they could ask questions—but any sentence that was not a question had to be a simple, declarative sentence. It was issued as a challenge: say what’s on your heart without hedging it about with qualifiers. I took it as a request for open, sincere, and rugged conversation. Nobody could do it. Virtually everyone who spoke danced about their subjects, adding implied questions, footnotes, self-referential phrasing, and jargon. Bly was disgusted and berated us for our narcissism. 
I have often thought of that experience for it revealed some principles I’d like to live by. We need to think before we speak; we need to listen to others; we need to give each other grace so that we have a space in which to learn from each other. That’s not my opinion, that’s my invitation.  

Saturday, October 8, 2011

Jobs for Everyone

“One comfort is, that Great Men, taken up in any way, are profitable company. We cannot look, however imperfectly, upon a great man, without gaining something by him.”  — Thomas Carlyle, On Great Men

The front page of Apple’s website on Wednesday, October 5, 2011, featured a single image — a black and white portrait of Steve Jobs with the dates, 1955-2011. Simple, elegant, minimalist, the photograph had the classic style of an Apple ad. One almost expected to scroll down and see the words, “Think Different.” 
Rarely does a CEO garner such respect and affection, much less one who built and headed a corporation with more cash in the bank than some countries. But then Steve Jobs was more than a businessman, more than an entrepreneur. His death at 56 cut short his arc of brilliance before it reached its apogee and robbed us of the chance to see what he might be like in 20 years. As the accolades poured in, and the flowers, notes, and apples were laid at the doors of Apple stores around the world, we were reminded that this kind of attention usually follows the death of royalty (Diana) or rock stars (John Lennon). 
But Jobs was neither. A man with a child’s sense of wonder, he was the quintessential American success story. Adopted at birth by a working-class couple, he dropped out of college 17 years later because it was eating up his parents’ life savings—and he didn’t have a clue what he wanted to be anyway. Three years later he and Steve Wozniak built a prototype of an Apple computer in his parents’ garage. Within 10 years he had a $2 billion company, he and his pirate team had built the Macintosh, and in an ironic twist, he’d been fired from his own company by the man he brought in to help guide it. He went on to found NeXT and Pixar, and finally to return to Apple, where he brought out the iPod, the MacBook, iTunes, the iPad, and most famously, the iPhone. He died at home, surrounded by family, the day after the latest iteration, the 4S, was announced.  
By now the essential elements of Jobs’ professional life are well-known, much as we know the beginning and the end of the Beatles. Like the Beatles he lived most of his adult life in the hot glare of media attention while he guarded his private life with a tenacity rarely seen in the celebrity world. But when Walter Isaacson’s authorized biography of Jobs is released later this October, much of that life will no doubt be revealed at last. 
By many accounts Jobs was mercurial and ruthless, a perfectionist with an eye for detail  and the capacity to drive employees to despair with his demands. But many also speak of his kindness, his love for his wife of 20 years and their four children, his willingness to mentor those young entrepreneurs in whom he saw some of his early fire and brashness. 
When I saw the announcement of his death my eyes filled with tears. In the days since I’ve found myself returning time and again to his image and life in odd moments between classes or when I’ve been waiting at a stoplight. I’ve wondered what his children and wife are going through, how his closest colleagues will feel when they walk the halls of 1 Infinite Loop, the Cupertino headquarters of Apple, and especially, what he must have thought about in the last painful weeks of his life. I have asked myself why he holds such fascination for so many of us and who he will become in the psyche of 21st century people. 
Already he is spoken of in the same breath as Edison, Walt Disney, and Leonardo da Vinci. David Pogue, in his regular column on tech products in the New York Times, speculated on the chances that another young visionary like Jobs is even now working in a garage somewhere, and put the odds at “Zero. Absolute zero.” People like Pogue, who have known Jobs for decades and have sometimes disagreed vociferously with decisions he made, see him as a rare creature, one of the few who deserve to be called “genius.” 
First, Jobs brought together technology and art in ways that no one had thought of before. The products of his design teams were the result of his own visions and imagination. Someone recently described the process of design at Apple as stripping away layer after layer of clutter and chaos until they arrive at the luminescent, irreducible pearl in the center. Most corporations don’t allow that kind of time to be spent in reduction instead of addition, but then most corporations are content to repeat what works until well after it doesn’t anymore. To open the box on a new product from Apple is to witness the epitome of presentation. Every part of the packaging has a purpose, every part contributes to the whole, and the whole is much more than the sum of its parts. It makes you want to keep the packaging as art in itself.  
Second, Jobs never looked back. It was his view that if you’d succeeded at something it was time to throw yourself in the deep end and splash around until you found a new lifeline. He grabbed the idea of the mouse from PARC, made it standard in the industry and then moved us away from it to something ever more intuitive and natural—the gestures of hands and fingers. He took away our CD-ROMs, our external hard drives, cables, and flash drives. In their place he gave us elegance and simplicity. “It just works” was a refrain that constantly came through Apple’s marketing and advertising. 
Third, no one in recent memory has both commanded a corporation and put himself in the skin of an average consumer. Jobs had an uncanny ability to think like a customer, to focus on the results wanted, and then to provide the means to get there. 
Fourth, Jobs could see not just what would sell, and not just what would make something good even better, but something that no one had thought of yet. Even back in the Apple II days, when most people couldn’t imagine computers doing more than keeping recipes and shopping lists, Jobs was designing a personal computer to be a “bicycle for the mind.” He had to wait for the rest of the world to catch up sometimes, but more often than not he made us want the future before we could understand it—and when we did it was so natural it made us feel like we’d invented it ourselves. That’s called vision, and almost nobody has it. Bush the First derided ‘the vision thing’ because he saw it, as most corporate managers do, as something a committee cuts and pastes together when it has run out of ideas.
Finally, Jobs raised the bar on performance so high that he made others want to do better. That’s charisma. Leaders want it, but it’s not for sale. Seldom seen, it’s the result, I think, of a person who embodies a cluster of paradoxes: power that surrounds itself with others who are brilliant; confidence without egotism; purpose with a sense of humor; and enthusiasm without mania. It’s the Tao in action. We are fortunate to have shared time and space with a man who found it—as we all may—inside himself. 

Saturday, October 1, 2011

Moneyball's Learning Curve

The fundamental mistake is in taking the patterns we observe around us as facts of nature. They are not; they are the result of rational individuals adjusting to a particular set of constraints. . . . Change the constraints and, given a little time to adjust, the patterns change. — David Friedman, Hidden Order: The Economics of Everyday Life
What’s the best way to get someone to change their behavior? Use a carrot? Use a stick? I’ve been interested in motivation ever since I became a teacher and discovered that teachers can’t motivate students. If you beat them with a stick it doesn’t increase their skills and they’ll come to hate the process of learning. If you entice them with the carrot they’ll do just enough to get the carrot and no more. Unlike teachers, most students don’t enjoy learning for its own sake. Come to think of it, teachers don’t either: they learn in order to accomplish a goal. But one thing that separates teachers from students is that teachers can’t understand why the goals they love aren’t what students love. 
And that brings us to Moneyball, the movie starring Brad Pitt and Jonah Hill, based on the book of the same title by Michael Lewis. Lewis has a talent for making economics interesting in such bestsellers as Liar’s Poker and The Big Short. With Moneyball he looks inside the economics of baseball. The conventional wisdom is that the big spenders (Yankees, Red Sox) buy the best players and the division titles. They may even win the World Series. Money equals wins. But Billy Beane, the general manager of the Oakland A’s, had little money—in fact, one of the smallest payrolls of any team in the league. 
His first move was to hire a shy, soft-spoken kid, an economics graduate from Yale, who lived and breathed statistics—baseball statistics. It was the kid’s idea that meticulous scrutiny of a player’s stats could reveal patterns of performance that pointed to value rarely seen by scouts and managers alike. Most of the players Beane picks up in following this advice are the bargain-basement overlooked or the over-the-hill gang that no one wants.  
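To make the arithmetic behind that scrutiny concrete, here is a toy sketch in Python of the kind of calculation involved: on-base percentage (OBP), the statistic the A’s prized, weighed against salary. The OBP formula is the standard one; the player names, the numbers, and the crude value-per-dollar measure are invented for illustration and are not drawn from Lewis’s book or from the A’s actual models.

```python
# A toy sketch of valuing hitters by on-base percentage (OBP) per salary dollar.
# The players and figures below are invented; only the OBP formula is standard.

def obp(hits, walks, hbp, at_bats, sac_flies):
    """On-base percentage: times reached base divided by plate appearances that count."""
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

# name: (hits, walks, hit-by-pitch, at-bats, sacrifice flies, salary in dollars)
players = {
    "overlooked veteran": (130, 85, 5, 450, 4, 950_000),
    "bargain catcher":    (110, 70, 3, 420, 5, 700_000),
    "glamour slugger":    (160, 40, 2, 550, 6, 7_500_000),
}

for name, (h, bb, hbp, ab, sf, salary) in players.items():
    rate = obp(h, bb, hbp, ab, sf)
    # "Value" here is simply OBP per million dollars of salary, a deliberately crude measure.
    print(f"{name:20s}  OBP {rate:.3f}   OBP per $1M of salary {rate / (salary / 1e6):.3f}")
```

Run it and the invented cheap players come out far ahead of the invented expensive one on a per-dollar basis, which is the whole point of the exercise: the stats surface value the scouts' eyes miss.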
Predictably, the A’s scouts, a group of leathery, tough old guys, can’t see the logic and don’t appreciate the implication that tables of stats can trump years of experience. Beane is too old to waste time on methods that no longer work and young enough that he’s willing to bet the farm—and his reputation—on unproven concepts. This isn’t a movie review and I’ll try not to spoil it for you, but this is the takeaway I received: what people are worth is the value they place on their integrity. 
Billy Beane takes a clutch of misfits, has-beens, and also-rans and turns them into a team that wins 20 in a row—an American League record at the time—by thinking of them as parts in a system rather than individual stars. Without the money to buy a slugger he goes for the ones who get on base. He buys a pitcher whose delivery looks like a knuckle-dragging primate on speed, and turns a broken catcher, Scott Hatteberg, into a first baseman. “What’s your biggest fear?” asks David Justice, a veteran player. “That someone will hit the ball toward me,” breathes Hatteberg. After Justice stops chuckling, he says, “Good one! That’s funny. But seriously. . . ” “No, really,” says Hatteberg, looking away. “That is my biggest fear.” 
By accepting the constraints and working to maximize the effects, Beane and his staff turned the club around and, some would argue, changed the game. He was hardly an inspirational speaker, at least as portrayed in the movie, and he seems to have deliberately distanced himself at first from the players. “That makes it easier for him to cut us, right?” asks a player of Beane’s assistant. But as the season grinds on with few wins Beane holds informal seminars on the method and gradually convinces the players that together they can win. A few men with journeyman talent and an ability to put ego aside can achieve more than the glittering superstars. He trades a player whose taste for the fast life is messing with his game, sends a rookie star to Philadelphia because of his attitude, and regretfully but firmly drops a player who cannot measure up. 
What are we worth? Exactly what we contribute when we put our hearts into it. But there’s no gauzy optimism in the A’s locker room, and you’ll never hear “I Believe I Can Fly” blasting from the sound system. In a scene that would make motivational coaches and school counselors cringe, Beane strides into the locker room after yet another loss and berates the players for celebrating anyway. “Do you like losing?” he yells and flings a bat down a corridor. In the sudden silence the sound reverberates for a long, long time. “That’s what losing sounds like,” he snarls, and stalks out. In the economy of teams at the bottom only one effect can give rise to a new cause: you have to hate losing more than you care about winning. In Beane’s pedagogy that’s neither a carrot nor a stick: it’s self-respect coupled with realism. 
Carol Dweck is a psychologist whose 30-plus years of research into motivation among children seem to back up Beane’s intuitions. She notes in her book, Self-Theories, that “The hallmark of successful individuals is that they love learning, they seek challenges, they value effort, and they persist in the face of obstacles.” Moreover, she punctures beliefs that are prevalent in our society, such as the belief that those with high ability are more likely to be mastery-oriented, or that praising a student’s intelligence will encourage qualities of mastery. They’re not and it won’t, she says. Instead, the ones who succeed are most often those who persist with vigor and humility to overcome obstacles, and who believe that they can learn, that intelligence is not fixed at birth. That’s a cheer for the underdogs, but Dweck even goes one better: her research shows that the students who easily pull A’s but collapse in frustration when up against something difficult can learn a new attitude. They can shift from avoiding anything that would spoil their record to enjoying the challenge of learning something new. In other words, they are motivated from within. 
Perhaps, at the risk of over-simplification, this could be expressed as a set of goals: See your limitations as challenges. Learn to love the questions. Keep at it. Share what you know.