Saturday, October 8, 2011

Jobs for Everyone

“One comfort is, that Great Men, taken up in any way, are profitable company. We cannot look, however imperfectly, upon a great man, without gaining something by him.”  — Thomas Carlyle, On Heroes, Hero-Worship, and the Heroic in History

The front page of Apple’s website on Wednesday, October 5, 2011, featured a single image — a black and white portrait of Steve Jobs with the dates 1955–2011. Simple, elegant, minimalist, the photograph had the classic style of an Apple ad. One almost expected to scroll down and see the words, “Think Different.”
Rarely does a CEO garner such respect and affection, much less one who built and headed a corporation with more cash in the bank than some countries. But then Steve Jobs was more than a businessman, more than an entrepreneur. His death at 56 cut short his arc of brilliance before it reached its apogee and robbed us of the chance to see what he might be like in 20 years. As the accolades poured in, and flowers, notes and apples were laid at the doors of Apple stores around the world, we were reminded that this kind of attention usually follows the death of royalty (Diana) or rock stars (John Lennon).
But Jobs was neither. A man with a child’s sense of wonder, he was the quintessential American success story. Adopted at birth by a working-class couple, he dropped out of college 17 years later because it was eating up his parents’ life savings—and he didn’t have a clue what he wanted to be anyway. Three years later he and Steve Wozniak built a prototype of an Apple computer in his parents’ garage. Within 10 years he had a $2 billion company, he and his pirate team had built the Macintosh, and in an ironic twist, he’d been fired from his own company by the man he brought in to help guide it. He went on to found NeXT and Pixar, and finally to return to Apple, where he brought out the iPod, the MacBook, iTunes, the iPad and, most famously, the iPhone. He died at home, surrounded by family, the day after the latest iteration, the iPhone 4S, was announced.
By now the essential elements of Jobs’ professional life are well-known, much as we know the beginning and the end of the Beatles. Like the Beatles he lived most of his adult life in the hot glare of media attention while he guarded his private life with a tenacity rarely seen in the celebrity world. But when Walter Isaacson’s authorized biography of Jobs is released later this October, much of that life will no doubt be revealed at last. 
By many accounts Jobs was mercurial and ruthless, a perfectionist with an eye for detail and the capacity to drive employees to despair with his demands. But many also speak of his kindness, his love for his wife of 20 years and their four children, his willingness to mentor those young entrepreneurs in whom he saw some of his early fire and brashness.
When I saw the announcement of his death my eyes filled with tears. In the days since I’ve found myself returning time and again to his image and life in odd moments between classes or when I’ve been waiting at a stoplight. I’ve wondered what his children and wife are going through, how his closest colleagues will feel when they walk the halls of 1 Infinite Loop, the Cupertino headquarters of Apple, and especially, what he must have thought about in the last painful weeks of his life. I have asked myself why he holds such fascination for so many of us and who he will become in the psyche of 21st century people. 
Already he is spoken of in the same breath as Edison, Walt Disney, and Leonardo da Vinci. David Pogue, in his regular column on tech products in the New York Times, speculated on the chances that another young visionary like Jobs is even now working in a garage somewhere, and put the odds at “Zero. Absolute zero.” People like Pogue, who have known Jobs for decades and have sometimes disagreed vociferously with decisions he made, see him as a rare creature, one of the few who deserve to be called “genius.”
First, Jobs brought together technology and art in ways that no one had thought of before. The products of his design teams were the result of his own visions and imagination. Someone recently described the process of design at Apple as stripping away layer after layer of clutter and chaos until they arrive at the luminescent, irreducible pearl at the center. Most corporations don’t allow that kind of time to be spent in reduction instead of addition, but then most corporations are content to repeat what works until well after it doesn’t anymore. To open the box on a new product from Apple is to witness the epitome of presentation. Every part of the packaging has a purpose, every part contributes to the whole, and the whole is much more than the sum of its parts. It makes you want to keep the packaging as art in itself.
Second, Jobs never looked back. It was his view that if you’d succeeded at something it was time to throw yourself in the deep end and splash around until you found a new lifeline. He grabbed the idea of the mouse from PARC, made it standard in the industry and then moved us away from it to something even more intuitive and natural—the gestures of hands and fingers. He took away our CD-ROMs, our external hard drives, cables, and flash drives. In their place he gave us elegance and simplicity. “It just works” was a refrain that constantly came through Apple’s marketing and advertising.
Third, no one in recent memory has both commanded a corporation and put himself in the skin of an average consumer. Jobs had an uncanny ability to think like a customer, to focus on the results the customer wanted, and then to provide the means to get there.
Fourth, Jobs could see not just what would sell, and not just what would make something good even better, but something that no one had thought of yet. Even back in the Apple II days, when most people couldn’t imagine computers doing more than keeping recipes and shopping lists, Jobs was designing a personal computer to be a “bicycle for the mind.” He had to wait for the rest of the world to catch up sometimes, but more often than not he made us want the future before we could understand it—and when we did it was so natural it made us feel like we’d invented it ourselves. That’s called vision, and almost nobody has it. Bush the First derided ‘the vision thing’ because he saw it, as most corporate managers do, as something a committee cuts and pastes together when they’ve run out of ideas.
Finally, Jobs raised the bar on performance so high that he made others want to do better. That’s charisma. Leaders want it, but it’s not for sale. Seldom seen, it’s the result, I think, of a person who embodies a cluster of paradoxes: power that surrounds itself with others who are brilliant; confidence without egotism; purpose with a sense of humor; and enthusiasm without mania. It’s the Tao in action. We are fortunate to have shared time and space with a man who found it—as we all may—inside himself. 

Saturday, October 1, 2011

Moneyball's Learning Curve

The fundamental mistake is in taking the patterns we observe around us as facts of nature. They are not; they are the result of rational individuals adjusting to a particular set of constraints. . . . Change the constraints and, given a little time to adjust, the patterns change. — David Friedman, Hidden Order: The Economics of Everyday Life
What’s the best way to get someone to change their behavior? Use a carrot? Use a stick? I’ve been interested in motivation ever since I became a teacher and discovered that teachers can’t motivate students. If you beat them with a stick it doesn’t increase their skills and they’ll come to hate the process of learning. If you entice them with the carrot they’ll do just enough to get the carrot and no more. Unlike teachers, most students don’t enjoy learning for its own sake. Come to think of it, teachers don’t either: they learn in order to accomplish a goal. But one thing that separates teachers from students is that teachers can’t understand why the goals they love aren’t what students love.
And that brings us to Moneyball, the movie starring Brad Pitt and Jonah Hill, based on the book of the same title by Michael Lewis. Lewis has a talent for making economics interesting in such bestsellers as Liar’s Poker and The Big Short. With Moneyball he looks inside the economics of baseball. The conventional wisdom is that the big spenders (Yankees, Red Sox) buy the best players and the division titles. They may even win the World Series. Money equals wins. But Billy Beane, the general manager of the Oakland A’s, had little money—in fact, one of the lowest payrolls in the league.
His first move was to hire a shy, soft-spoken kid, an economics graduate from Yale, who lived and breathed statistics—baseball statistics. It was the kid’s idea that meticulous scrutiny of a player’s stats could reveal patterns of performance that pointed to value rarely seen by scouts and managers alike. Most of the players Beane picks up in following this advice are the bargain-basement overlooked or the over-the-hill gang that no one wants.
Predictably, the A’s scouts, a group of leathery, tough old guys, can’t see the logic and don’t appreciate the implication that tables of stats can trump years of experience. Beane is too old to waste time on methods that no longer work and young enough that he’s willing to bet the farm—and his reputation—on unproven concepts. This isn’t a movie review and I’ll try not to spoil it for you, but here is my takeaway: what people are worth is the value they place on their integrity.
Billy Beane takes a clutch of misfits, has-beens, and also-rans and turns them into a team that wins 20 in a row—an American League record—by thinking of them as parts in a system rather than individual stars. Without the money to buy a slugger he goes for the ones who get on base. He buys a pitcher whose delivery looks like a knuckle-dragging primate on speed, and turns a broken catcher, Scott Hatteberg, into a first baseman. “What’s your biggest fear?” asks David Justice, a veteran player. “That someone will hit the ball toward me,” breathes Hatteberg. After Justice stops chuckling, he says, “Good one! That’s funny. But seriously. . . ” “No, really,” says Hatteberg, looking away. “That is my biggest fear.”
By accepting the constraints and working to maximize the effects, Beane and his staff turned the club around and, some would argue, changed the game. He was hardly an inspirational speaker, at least as portrayed in the movie, and he seems to have deliberately distanced himself at first from the players. “That makes it easier for him to cut us, right?” asks a player of Beane’s assistant. But as the season grinds on with few wins Beane holds informal seminars on the method and gradually convinces the players that together they can win. A few men with journeyman talent and an ability to put ego aside can achieve more than the glittering superstars. He trades a player whose taste for the fast life is messing with his game, sends a rookie star to Philadelphia because of his attitude, and regretfully but firmly drops a player who cannot measure up.
What are we worth? Exactly what we contribute when we put our hearts into it. But there’s no gauzy optimism in the A’s locker room, and you’ll never hear “I Believe I Can Fly” blasting from the sound system. In a scene that would make motivational coaches and school counselors cringe, Beane strides into the locker room after yet another loss and berates the players for celebrating anyway. “Do you like losing?” he yells and flings a bat down a corridor. In the sudden silence the sound reverberates for a long, long time. “That’s what losing sounds like,” he snarls, and stalks out. In the economy of teams at the bottom only one effect can give rise to a new cause: you have to hate losing more than you care about winning. In Beane’s pedagogy that’s neither a carrot nor a stick: it’s self-respect coupled with realism. 
Carol Dweck is a psychologist whose 30-plus years of research into motivation among children seem to back up Beane’s intuitions. She notes in her book, Self-Theories, that “The hallmark of successful individuals is that they love learning, they seek challenges, they value effort, and they persist in the face of obstacles.” Moreover, she punctures beliefs that are prevalent in our society: that those with high ability are more likely to be mastery-oriented, and that praising a student’s intelligence will encourage qualities of mastery. They’re not and it won’t, she says. Instead, the ones who succeed are most often those who persist with vigor and humility to overcome obstacles, and who believe that they can learn, that intelligence is not fixed at birth. That’s a cheer for the underdogs, but Dweck goes one better: her research shows that the students who easily pull A’s but collapse in frustration when up against something difficult can learn a new attitude. They can shift from avoiding anything that would spoil their record to enjoying the challenge of learning something new. In other words, they are motivated from within.
Perhaps, at the risk of over-simplification, this could be expressed as a set of goals: See your limitations as challenges. Learn to love the questions. Keep at it. Share what you know. 

Saturday, September 24, 2011

Understanding Backwards


Life must be lived forward, but can only be understood backwards. — Soren Kierkegaard
For a good part of my life I have seen religion as a duty which must be accomplished with dedication if not enjoyment. Since all people are sinners and sinners must seek salvation it did not occur to me that some people might not see the point in all this religion business. “Oh, I’m not religious,” some friends would say to me, as if it were genetically transmitted or perhaps an acquired taste. They would blithely go about their lives, unencumbered by guilt, enjoying their sins, and occasionally pausing to shake their heads at my dutifulness. “Why do you bother?” they would ask curiously. 
For my part, I could not understand how religion could be regarded as an accessory. It was core, at the heart, deep inside, that which guided and prompted all that was good and pure and true. One could no more shuck it off and live a decent upright life than one could see one’s hand in a room without light. There was one way to salvation and that was through obedience to the rules, as inexplicable as they appeared sometimes. And yet I continued to meet people who claimed no religious allegiance, but seemed to me honest and good, the kind of people who would take you in during a storm or give you a lift miles out of their way. It was disconcerting. Some of them even smoked.
So I tried harder, tried to be dutiful, tried to be aligned without completely losing myself. But myself would slip out of my grasp at the most inopportune moments and do something embarrassing, like refusing to stand and go forward for altar calls. Even if I had made a clear and heartfelt decision years before to join the side of the angels, I squirmed in the pew when the preacher began his pitch. I felt that I owed it to the unchurched and the disbelievers in the house to stand yet again and be a living example. Despite my inner diatribe that religion was personal and that honesty demanded a consonance between motivation, belief and action, I felt I was letting down the team. 
And yet I was always fascinated by religion, or rather by the quest for God and transcendence. Growing up in California in the 60s, I was surrounded by those who sought a shortcut to enlightenment or at least bliss. I plodded along, waving as they roared past, secure that I had the safer path if by far the slower one. If it was there, I thought, I’d find it eventually by dint of just keeping at it, one foot in front of the other. But I didn’t. 
I studied theology, philosophy of religion, eventually got a doctorate and taught religion for some years. I had no doubt I should be there and yet I constantly felt like an imposter. I could not be like my colleagues, men who had signed up for the church for life and who seemingly could overlook all manner of missteps and outright lies on the part of the church. I struggled to understand how to avoid the sin of self-righteousness while side-stepping hypocrisy. But pride goes before a fall—and I took a fall of my own making. 
Years later I am seeing some things much differently. I am learning not to let the foibles of the official church body distract me from my own spiritual quest. I have met the enemy, like Pogo, and they are me. I know what I am capable of doing against my better judgment and where most of the fault lines appear in my foundations. 
And I have learned, or perhaps discovered, that signing up for a set of beliefs is not the point. Some beliefs fall away over the years because they never really found a place; I never really believed them. Others simply don’t make sense no matter how I’ve tried. But the vast majority of religious beliefs ought to be seen as practices. We practice them because in the practicing comes understanding, and with understanding comes the willingness to live in grace, to be in God. “Religion is a practical discipline that teaches us to discover new capacities of mind and heart,” says Karen Armstrong in The Case for God. “You will discover their truth—or lack of it—only if you translate these doctrines into ritual or ethical action.”

Orthopraxy over orthodoxy—right action over right belief is how I see it—but with two important caveats. First, we do not earn our way through “right” action because this is not a contractual relationship. God is in the giving business, not the litigation business. Thus, I have nothing to fear from him; I have no need to protect myself. Second, belief is not abandoned, but made firm through action. “Like any skill,” continues Armstrong, “religion requires perseverance, hard work, and discipline. Some people will be better at it than others, some appallingly inept, and some will miss the point entirely. But those who do not apply themselves will get nowhere at all.”
In the end—and in fact, in the beginning and in the middle—is grace. That is what makes this whole venture possible. Room to move, to experiment, to make mistakes and learn from them. Here is the mysterious presence of the Christ. T. S. Eliot knew something of this, laying down the lines in The Waste Land:
Who is the third who walks always beside you?
When I count, there are only you and I together
But when I look ahead up the white road
There is always another one walking beside you. . . .
And that is enough for the time being.

Saturday, September 17, 2011

9/11 and Counting


“Not until we are lost, in other words, not till we have lost the world, do we begin to find ourselves, and realize where we are and the infinite extent of our relations.” — Henry David Thoreau, Walden
My grandfather was an historian and a college teacher who filled me with a love for history and a respect for those who write it. I was raised by my grandparents, teachers both, and our home was packed with books, magazines, and journals, many of them about ancient history, medieval history, and modern history. My first understanding was that history told the story of what had happened long ago, that it was a true and valid record of those events, and that it stood in the same relation to the Eternal Verities as the Law. One learned and believed History and one kept the Law. Neither was profitably to be questioned.
These beliefs, solidified and tested by teachers in elementary school, were gently but firmly undermined by my grandfather’s tutelage. History, he said, was what historians reconstructed from written documents, eyewitness accounts, physical evidence and a sanctified imagination. The story could have been written another way; in fact, there were many ways to tell the story and most of them could be seen to fit the facts as they were known.
This was endlessly fascinating to me. It threw a relativism into the works that furnished my imagination with a constant stream of long shots and close-ups seen from different angles. Historical figures, outsized characters like Lord Nelson, Napoleon, and Winston Churchill, became, through my grandfather’s stories, people whose flaws were as tangible as their virtues. Neither was to be ignored nor were the flaws to define the person, as tempting as that was. History, in the way my grandfather taught it, was complex and multi-layered, a spider’s web of nobility, contrivances, deceit and bravery. It was not, as Henry Ford was famously quoted as declaring, ‘one damn thing after another,’ but a vast and ongoing story—a tale told with a point, freely offered up for scrutiny.
The events of the 60s, exploding over my generation, came so fast and furiously that Ford’s complaint seemed justified. It was one damn thing after another. Apparently random events took on a sinister afterglow, conspiracy theories bred like fruit flies, and the Book of Revelation bookmarked the nightly news. And if journalism was the rough draft of history, then propaganda from all points on the political spectrum was the marketer’s flack, guaranteed to fill the worst with a passionate intensity.  
My cognitive dissonance over America’s Manifest Destiny gone rancid in Vietnam was further jolted by the realization that Martin Luther King, Jr. was breaking the Law. He didn’t just break it though, he first hauled it up from the depths like some blinkered Morlock, where it could be seen for the poisonous creature it was. The social effect of his nonviolent resistance to institutional racism was the permanent dwarfism of law. From that point on, certainly for my budding political awareness, the law no longer had the implicit seal of approval from on high. I saw it as a human construct, flawed and dangerous when it served only the interests of the powerful. For me this was a new experience: I found myself in a dark wood without the familiar landmarks and the path I’d traveled daily suddenly looked alien and forbidding. Thoreau had been there too, literally if not metaphorically. The traveller in the forest looks around and “Though he knows that he has travelled it a thousand times, he cannot recognize a feature in it,” says Thoreau. “. . . .[I]t is as strange to him as if it were a road in Siberia.” 
Perhaps that was our feeling on 9/11 when, on a perfect day in September, our predictable world turned dark and terrifying. How could this happen here? Ten years on from that day I wonder what we have learned. Are we still the adolescent nation, armed, fractious, and trigger-happy? Are we still oblivious to the effect our bumbling self-centeredness has on the world around us?  “Not till we have lost the world,” said Thoreau, “do we begin to find ourselves, and realize where we are and the infinite extent of our relations.”
Someone once remarked that “the world is passing strange and wondrous.” That it certainly is. There is mystery and wonder all around us—in the violent dislocations as well as in the harmonies we find. Our common histories quickly become uncommon when we make allowance for a shift in view.
“In rethinking our history,” says Howard Zinn, historian and author of A People’s History of the United States, “we are not just looking at the past, but at the present, and trying to look at it from the point of view of those who have been left out of the benefits of so-called civilization. It is a simple but profoundly important thing we are trying to accomplish, to look at the world from other points of view.”
That can be the legacy of 9/11.

Saturday, September 10, 2011

Readers Without Borders


“The essential point to grasp is that in dealing with capitalism we are dealing with an evolutionary process. . . . The fundamental impulse that sets and keeps the capitalist engine in motion comes from the new consumers’ goods, the new methods of production or transportation, the new markets, the new forms of industrial organization that capitalist enterprise creates.” — Joseph A. Schumpeter, Capitalism, Socialism, and Democracy
Borders closes this week, after a six-week countdown to bankruptcy, and the selling off, at constantly dropping prices, of everything from books to bookshelves. This is the fourth local Borders that has closed in the past few months, two in Washington, DC and two in Maryland. I knew all of them well and spent considerable time and money in each one. On the flyleaf of every book I’ve bought from them is my name, the date of purchase, and the location of the store. The last Borders book I bought—a collection of J. G. Ballard’s best short stories—was from the Silver Spring store, and its closing was duly noted on the flyleaf.
But it’s a cruel world out there (a doggie-dog world, as my students would say) and Borders is proof that everybody is someone else’s lunch. An independent bookstore in Takoma Park, Maryland, that many of us loved, Chuck and Dave’s, finally succumbed some years ago to the relentless pressure of discount pricing that the local Borders provided. And now Borders itself, Nook-less, and struggling to counteract Barnes and Noble on one side and the gigantic presence of Amazon on the other, has been devoured. There are rumors that the Silver Spring store will be taken over by Books-A-Million, an outfit that could charitably be described as ‘a book distribution outlet.’ Try asking its staffers for books by Auden, Barth, Nabokov, Proust, or Tolstoy, and you may be asked to repeat the question. 
As bookstores go, Borders was formulaic, as befits a contemporary corporate franchise. From store to store you could count on the same titles in the same sections. The effect, I suppose, was a predictability much like that of any major chain from McDonald’s to Goodyear. But Borders staffers seemed to love books and know quite a bit about them; they had favorites and knew where to find them. If you asked they would drop what they were doing and lead you to comparable titles. It’s true that the reshelving process rarely occurred in some stores, so that you’d find Lolita shacked up with Eric Jerome Dickey and Isaac Newton in the Psychology section, but that could simply have been an indication of a lively clientele constantly on the move.
Depending on the location, Borders appealed to the local demographic, but still had a depth in its selection of which Barnes and Noble still seems only fitfully aware. If you wanted Herodotus they had him, along with Tacitus and Livy; if it was Freud you were after they had his works—and Jung’s and Adler’s as well. Jeffrey Deaver? Shelf after shelf, but Erle Stanley Gardner, Dorothy Sayers, Kurt Mitchell, and even George MacDonald could be found. Religion and philosophy were pretty well represented at Borders too, although I couldn’t help noticing that Nietzsche invariably occupied more shelf space in every store than any other philosopher. But their selection of Taoist, Confucian, Buddhist, Hindu, and Jewish philosophy was excellent and far outstripped anything you could find at Barnes and Noble. Again, while B & N gives a lot of real estate to religion and spirituality, most of it seems to run to the lighter variety of “inspirational” works that clutter the waiting rooms of medical offices.
So Borders is gone and Barnes and Noble has won—for the time being. Yet, I don’t believe that e-readers like Kindle, the Nook, or the Sony reader spell the end of “real” books nor do I think that brick-and-mortar bookstores will completely disappear any time soon. And while the statistics that one hears about American reading habits are appalling if true (a majority of American adults don’t even read one book a year), I’m guessing that places like DC and the Washington Metro area are probably typical of many cities in America with universities, high concentrations of professionals, and diverse populations of people who read a great deal. 
Even so, the small, independent booksellers are quietly dropping out of sight, a fact greatly to be mourned. They simply can’t compete with the buying power of the bigger stores. They offer topical interest (every issue of Ellery Queen Mystery Magazine since 1941!), quietude broken only by the buzz of fluorescent lighting or the flatulence of the owner’s elderly cats, and comfy chairs. Very often there will be a pot of coffee or tea simmering, and a variety of biscuits at hand—the small comforts of a literate society. 
Every breakthrough in technology brings both untold blessings and considerable gnashing of teeth. Gutenberg’s printing press could churn out 3,600 pages per day as opposed to the few that could be produced by hand. Gone was the scriptorium, its rooms full of monks nodding over their copies of the Bible or Christian classics. Now eager readers of the latest works by Luther and Erasmus could have their own books, thus inciting revolutions in thinking that spread like a virus.
The transitions between technologies are rarely smooth because they cannot be planned for and their effects remain to be seen. Understanding the changes at first may be like encountering a tsunami at sea—it’s traveling 600 miles per hour but it’s only an inch high. By the time it hits the shallows of public awareness it’s too late to get out of the way. This is what Joseph Schumpeter called the Creative Destruction of capitalism. Every new technology destroys the previous one and sets up the conditions for its own destruction. While we benefit from the innovation we lose the traditions. “This process of Creative Destruction is the essential fact about capitalism,” he said. “It is what capitalism consists in and what every capitalist concern has got to live in.”
A few years ago I got a Kindle, a gift from my wife, because it saved on shelf space, the books were cheaper, and ultimately it saved trees. My gain in efficiency and conservation helped to bring about the demise of Borders, a “clean, well-lighted place,” in which ‘real’ books could be lingered over, paged through, bought and carried out. I love the feel, the smell, the texture of a book in hand, and I’ve got the bookshelves to prove it. And while the wheel of innovation turns and brings its own pleasures, I shall, God willing, shuffle off this mortal coil years from now, surrounded by loved ones and books. These are the wistful joys we carry into our temporary futures.

Saturday, September 3, 2011

One Two Many


It is neither a universe pure and simple nor a multiverse pure and simple. — William James, Pragmatism
Is the universe one or many? Or to put it another way, are we hedgehogs or foxes? Isaiah Berlin, philosopher, cultural critic, and wise man, wrote an essay years ago about this with the focus on Tolstoy’s view of history. It has taken on a life of its own over the years, known mostly for the first few pages where Berlin sets the context. “There is a line among the fragments of the Greek poet Archilochus,” he writes: “The fox knows many things, but the hedgehog knows one big thing.” If we take the line figuratively it divides the world into two groups. On one side are those who relate everything to one single, unifying vision, overarching everything and giving meaning to all things, down to the minutest detail. Those are the hedgehogs, and Berlin counts among their august company such figures as Dante, Plato, Lucretius, Pascal, Hegel, Dostoevsky, Nietzsche, and Proust.
On the other side, across the vast chasm that divides the two, are the foxes, those who pursue many ends, related or not, usually contradictory in their purposes, and connected only in some de facto way. Their ideas, notes Berlin, are centrifugal rather than centripetal, flying outward unencumbered by any “fanatical, unitary inner vision.” Shakespeare, according to Berlin, is just such an animal, as are Herodotus, Aristotle, Montaigne, Erasmus, Molière, Goethe, Pushkin, and Joyce. And maybe we could add Woody Allen, John Lennon, Jack Kerouac, Susan Sontag, and Robin Williams.
The difference between them does not seem to be the presence of a metaphysical ADHD, but rather this view of the universe as either monistic or pluralistic. It’s an old philosophical problem, probably the equivalent of a parlor game at Plato’s Academy. William James, realizing that many in his audience were hardly kept awake at night over such matters, considered it even so “the most central of all philosophic problems.” He believed that “if you know whether a man is a decided monist or a decided pluralist, you perhaps know more about the rest of his opinions than if you give him any other name ending in ist.” Philosophy, he notes, throughout its long history, has taken the search for unity as its default position, so much so that no one really questions it. But what about the variety in things? Doesn’t that matter too? In his usual brisk and humorous manner he asks what practical difference it would make to see all things through a single, unifying lens and goes on to show the presence of unity in our everyday lives. 
For example, even in our ways of talking we assume a oneness to “the world” and trade on this assumption to avoid having to explain the multitude of parts every time we open our mouths. We also find a continuity between the parts such that we believe the whole is made up of the way the pieces hang together. And so on. Through several examples James seems to beat the monistic drum until you realize that he has slyly provided a third option: the world is neither One nor Many but One and Many. 

I don’t toss and turn at night, vexed by this problem. But it’s always there, just at the edge of my peripheral vision, something that if looked at directly seems to float away and yet is constant in its persistence. Standard operating procedure these days seems to force one to choose between the extremes: either conservative or progressive, right or left, all or nothing. But between Rambo and Diary of a Wimpy Kid lies a vast spectrum of degrees of kind, along which we most certainly can claim a rightful place. I’m convinced that as we move through our days and years we instinctively find the Middle Path, an inward moral and cultural gyroscope that guides us through the social terrain. In a number of areas of life we might benefit from our own versions of James’ brooding reflection on this question. 
Religion: Like the barnyard denizens of Animal Farm, we’ve learned to chant in unison, “Four legs good! Two legs bad!”, except that it usually comes out as “Religion bad! Spirituality good!” Choosing one over the other brings out the worst in both: a sclerotic religiosity leads to self-righteousness and hypocrisy—and that’s just on a good day. Spirituality unattached to communion with others has no reference points; it floats in a vaguely mystical haze, unable to communicate with others and with no hope of transcending itself. In the geometry of the soul the ideal state is probably the line that bisects the angle between the vertical axis running Godward and the horizontal axis running toward others. Jesus did say, after all, that we are to love God and treat others as we wish to be treated.
Politics: Democracies demand commitment; politicians will settle for your vote and your cash. Since commitment can’t be bought, and many don’t vote, this democracy seems both anemic and volatile. There’s a rage just under the surface, like a persistent fever that drains our energy and spikes our resistance to what we don’t understand. And there’s a lot we don’t understand, like how grownups can act like children fighting in a schoolyard, all sweat, threats, and wounded egos. Wasn’t politics the ‘art of compromise’? Isn’t it possible to hold convictions, recognize the convictions of others, and yet find a way to do the good thing in the right way? 
Communication: We are social animals, clearly not ourselves without others. Through patience, practice, and a gracious humility, we can learn to communicate with others quite different from ourselves. But it doesn’t come naturally; it’s a learned response. Much of the tutoring is carried on through the media, odd creatures that have heads like humans and the backsides of. . . . horses. Lately, the view of those who follow the media has been of jostling backsides with precious few heads in sight. Perhaps we need to be out in front where we can put our heads together. 
As for me, I’m perpetually betwixt and between. I’m a fox with a hedgehog headache. So many interests, so little time—wouldn’t it be nice to synthesize all this into a simple Rule of Life. I shall, for the moment, leave the last word to Bono: we’re one but we’re not the same.

Saturday, August 27, 2011

Did You Feel That?

There's a 100 percent chance of an earthquake today. Though millions of persons may never experience an earthquake, they are very common occurrences on this planet. So today — somewhere — an earthquake will occur. — The United States Geological Survey
This week a most unusual thing happened: the East Coast “suffered” an earthquake registering magnitude 5.8. This is unusual because earthquakes rarely happen here. The last one originating in Maryland was in 1990 and had a magnitude of 2.5, barely enough to rattle the windows. The last one in Virginia of any note was in 2003 and hit 4.5, somewhere in the middle of a scale that runs from zero to nine—nine being of epic, apocalyptic proportions, like the quake that hit Sendai, Japan this year.
Nevertheless, the one that hit Washington, D.C. shortly after 1:51 pm on Tuesday, August 23, tipped over a couple of coffee cups and dropped a few stone blocks from the Washington Cathedral. That was the extent of the damage, but you wouldn’t have known it by the media firestorm that erupted within minutes. As Howard Kurtz, writing for The Daily Beast, pointed out, the Libyan rebels were closing in on Gaddafi (choose your spelling), certainly a significant event on the world stage, but here in Washington the earthquake blew everything else off the screen. The Washington Post the next day featured a front-page story and photograph of three panic-stricken women apparently fleeing for their lives down Connecticut Avenue. The local news faucets ran round-the-clock “man-in-the-street” interviews of the generic type: “I was in my office near the Capitol when the windows started rattling. . . At first I thought it was a truck going by, then I thought we were being bombed, so I ran out of the building. . .” And so on.
Some of us reacted differently. When a friend recounted how she had fled her building in terror and asked what I had done, I hesitated and then said, “Well, I grew up in California, so. . . .” I probably came off as insufferably smug, but it really wasn’t that big of a deal. For me the interest is twofold: how rare earthquakes are in this region and how much people overreacted. No doubt the two are linked. I checked the U.S. Geological Survey online tonight and found that there were dozens of earthquakes all over California today and all up the coast to Washington State, with Alaska’s Aleutian Islands pocked with many more, most of them over magnitude 4.5. I’m fairly certain none of them got more than a passing notice in the media.
This week alone Vanuatu took nine earthquakes in one day between 5.0 and 7.0. An earthquake with a magnitude of 7 shows up in red on the USGS website as “significant.” The Vanuatuans had three days in a row of such tremors, then they took a day off, and resumed with vigor, finishing out the week with enough earthquakes of significance to last Maryland for three centuries at least. You’re wondering where Vanuatu is, I bet. I looked it up on the map thoughtfully provided on the website and found that it’s a couple of dots several hundred miles off the northeast coast of Australia. I don’t know if anybody lives there, but if they do they probably have other things to worry about besides their lawn chairs tipping over.
Scanning a list of earthquakes world-wide so far this year (1,747 between 5 and 5.9 alone), one place appeared with alarming frequency, an area referred to as “east of Honshu, Japan.” That’s the fault line that slipped and sent a 23-foot tsunami into Sendai and other towns on the northeast coast of Honshu in March, with the death toll unofficially at 10,000. It makes the list more than any other place, but there are several other hot spots around the Pacific Rim that are offering some tough competition for first.
I saw video shot in a grocery store in Japan during that devastating quake. Customers do not scream or run; instead, they steady the grocery shelves and, as the camera vibrates with the tremors, bend down to begin cleaning up the mess. I guess it’s all about what one gets used to.
As a species humans are wonderfully adaptable. Drop us into a wretched situation and within days, sometimes hours, we will have figured out a way to cope. But coping as a mechanism for dealing with the unexpected seems to vary from culture to culture. For example, cultures that emphasize the community over the individual often pull together more quickly in crises than cultures where individualism is the priority. So Japan, high on the collectivistic scale, handles these situations of genuine devastation and horror with more patience and equanimity than we in the US (top of the list in individualism worldwide) seem to do. They’ve certainly had more practice. On the other hand, the people of Joplin, Missouri, hit on May 22 by a tornado that killed 153, the deadliest single tornado in the United States in more than sixty years, bore up resolutely under the strain and opened their schools on time this August, one of them in a former big-box store.
It’s tempting to indulge in generalizations about these things, especially those relating to cultural differences. But here is a truth, one that I hold as self-evident, that occurred to me as I reflected on our “all-earthquake-all-the-time” mentality. The American media, especially in the major markets, is addicted to drama, and that message seeps into our unconscious to the point that we find danger everywhere. In fact, we seem to revel in our manufactured paranoia. Paul Watzlawick, a leading psychiatric researcher at Stanford for many years, says, “Any idea, when firmly held, nurtured, and cultivated, will eventually create its own reality” (The Situation Is Hopeless, But Not Serious, 57). We seem to be living in what Gever Tulley calls a culture of “dangerism.” Since perception is so often reality, we have to respect the fears of others, however silly they might seem to us. But we don’t have to share them. I’m opting for an attitude that says by God’s grace we’ll endure whatever happens, we’ll try not to whine about it, and we’ll help one another. A lot of people, no doubt, will do without the first phrase while admirably living up to the other two. Personally, I’m hoping that more of us who believe in God’s grace can exemplify the courage of that modest group.