The Vanishing World of Touch

Not long ago I celebrated in my brimmings blog the realm of touch, so wonderfully depicted by my favorite nature writer, Diane Ackerman, in A Natural History of the Senses. What she doesn’t touch upon is the increasing loss of that tactile dimension in a virtual age powered by Artificial Intelligence, now pushed to the forefront by the coronavirus pandemic. Nearly a third of us now work from our homes. Fewer of us are needed. Sadly, we are probably witnessing the loss of a way of life to which we won’t fully return: fewer teachers, doctors, etc., increased surveillance, and a cadre of workers, many of color, laboring as grocery clerks, industrial farm hands, or in remote warehouses.

The loss of a tactile world undermines the human enterprise, for which social media becomes a poor substitute. Consider, then, the outcome for families: the stress of uncertainty and limited horizons of opportunity in a touchless society where we no longer shake hands, give hugs, or bestow a kiss upon the cheek, airport embraces of coming and going reduced to impalpable memory.

As never before in a world such as ours, we are children in the night needing to be held and to be loved. We cannot live happily in a world of reduced signifiers of human belonging. Touch is the lingua franca fundamental to our destiny.

—rj

The Left’s Problem with Free Speech

It didn’t take long for the Harper’s Magazine letter, signed by 153 heavyweight intellectuals, largely academics and writers protesting censorship, to draw a counterprotest. Not from the Right, as one might suppose, but from the Left, in a counter letter bearing 160 signatories, published on the online site The Objective.

Some argued the Harper signatories were white, economically privileged, academic elitists who don’t merit any claim to duress for their views. “They are totalitarians in the waiting,” commented Parker Molloy of Media Matters. “They are bad people. They want you to shut up.” Molloy is referencing the current cancel culture conflict, intimating the Harper signatories would repress minorities from speaking out.

Not only is this the race card fallback again, but it’s absurd on two counts:

Twenty-four of the signees were people of color. As one Black signatory to the Harper letter wrote, “If they didn’t recognize your name, they assumed you’re white.”

Protestors seem to have ignored signatories like Salman Rushdie, who had to go into hiding after a fatwa was issued against his life and must still change his address frequently, or chess champion Garry Kasparov, who was ostracized in Russia for opposing Vladimir Putin.

Do you think Noam Chomsky and Gloria Steinem haven’t been told to shut up by adversaries from the Right?

What especially rankles some is J.K. Rowling’s presence on the Harper list. You may not agree with her sentiments, as a traumatized rape victim, regarding transgender access to bathrooms, but she’s the one they specifically want to shut up, with some calling for a boycott.

Ironically, there are several rank hypocrites among the Harper signatories: New York Times editor Bari Weiss, literary scholar Cary Nelson, and political scientist Yascha Mounk. Weiss and Nelson have actively worked to silence pro-Palestinian voices; Mounk in 2019 enthusiastically supported the Bolivian coup bringing Jeanine Añez to power. Since then, massacres have followed, dissent has been restricted, and an election postponed.

In all of this comes the need to distinguish criticism from censorship. The first is fundamental to liberal democracy; the latter, its nemesis. The Left’s vitriolic response, an ad hominem assault on the race, economic status, and alleged motives of the Harper signatories, bears all the trademarks of a repressive body politic inimical to debate.

Leftist writer Freddie deBoer gets it right: “The people furious at this letter largely have genuine ideological problems with liberal norms and laws regarding free speech. Please, think for a minute and consider: what does it say when a completely generic endorsement of free speech and open debate is in and of itself immediately diagnosed as anti-progressive and anti-left?”

—rj

70th Anniversary: Korea, the Forgotten War

A forgotten war that shaped the modern world

Seventy years ago today, North Korean troops crossed the 38th parallel, invading South Korea. I was ten years old, but my father would send me up Front Street in Philly to get the Inquirer or Bulletin. I think it was five cents in those days, fifteen on Sundays. Pa would split the paper with me.

I knew the details intimately and followed the battle lines faithfully that summer, when U.S. troops intervened in large numbers.

And then came MacArthur’s superbly executed amphibious Inchon invasion behind enemy lines that fall, reining in the North Korean forces. Boldly, we marched into North Korea. It turned into a protracted war, however, with the intrusion of vast numbers of Chinese soldiers into the conflict, a colossal failure of U.S. intelligence that would cost many lives. I remember GIs telling me how the bodies of charging waves of Chinese would pile up in front of their machine guns, preventing clear fire.

Truman dismissed MacArthur, who returned home to a hero’s welcome. He had wanted to hit China. The bridges across the Yalu were never touched, allowing the Chinese free access. I remember the Chinese encirclement of our marines at the Chosin Reservoir, their desperate retreat after seventeen days of protracted battle amid sub-zero temperatures in what became America’s version of Dunkirk.

The war would continue unabated into election year, 1952, with an unpopular Truman bowing out. Eisenhower would be swept into office, pledging to end the conflict, which he did by intimating nuclear intervention. The enemy got the message and in 1953, an armistice was signed. It provided for prisoner exchange. The sad reality is it never fully happened, some 7,800 American POWs unaccounted for. All told, more than 50,000 Americans perished, 100,000 were wounded. Five million Koreans, North and South, died, the vast majority civilians.

The war proved to be the opening salvo of the Cold War, foreshadowing Vietnam. As a little boy sprawled out on the floor reading the war accounts, I never imagined I’d be part of an occupying force four years after the war’s end, safeguarding the Republic of South Korea from the North. I spent thirteen months there, initially as a seventeen-year-old. I remember naked, hungry children begging, adults living in holes covered over with sheet metal, bullet-shredded walls.

Today, North Korea remains a rogue state, menacing not only the Republic but, with its increasingly sophisticated nuclear arsenal, the U.S. mainland as well. We are at a loss for answers.

You’ll be hard-pressed to find anniversary accounts of the conflict in today’s press, understandably consumed with the pandemic, the economic downturn and, not least, Donald Trump. Still, the Korean War has long been called America’s forgotten war. It hasn’t been that way for me. More like a shadow I can’t escape.

—RJoly

Oliver Sacks’ Ambivalence on Living in the Digital Age


There isn’t anything I enjoy more in a stress-laden world than a time-out for a good read. Books lend me a purview of how others experience life, lending sagacity and connection with my fellows. Books teach me that I’m not alone.

Courtesy of The New Yorker (February 11, 2019), this morning I came upon Oliver Sacks’ restive short piece, “The Machine Stops.” Written in the final weeks of his life, it finds the famed neurologist reflecting on the fallout of living in the digital age.

Brilliant, cogent, unceasingly eloquent and abidingly compassionate, Sacks specialized in the eccentricities imposed by the brain, most famously in his Awakenings, later turned into one of the most compelling movies I’ve seen.

Sacks laments here the social distancing wrought by a technology that should be bringing us together. It reminds me of Tolstoy’s response on first seeing a film clip in his advanced years: though this new technology was latent with promise, too often technology had been harnessed for ignoble ends.

Beginning with the ubiquitous cellphone, Sacks complains that he “cannot get used to seeing myriads of people in the street peering into little boxes or holding them in front of their faces, walking blithely in the path of moving traffic, totally out of touch with their surroundings. I am most alarmed by such distraction and inattention when I see young parents staring at their cell phones and ignoring their own babies as they walk or wheel them along. Such children, unable to attract their parents’ attention, must feel neglected, and they will surely show the effects of this in the years to come.”

In short, our digital milieu has decimated a once fecund public and private life, replacing social interchange with inferior virtual substitutes. I remember in my boyhood sitting with neighbors on stoops in Philadelphia on humid summer nights, conversing until the arrival of night’s cool breezes sweeping across the Delaware; houses teeming with porches where we played games, conversed, and shared neighborhood babble. Mornings, I’d grab my ball glove and saunter off to a crowded diamond. Those ball fields, in Philly and afar, lie increasingly vacant in these days of video games.

In similar vein, Sacks continues that he’s “confronted every day with the complete disappearance of the old civilities. Social life, street life, and attention to people and things around one have largely disappeared, at least in big cities, where a majority of the population is now glued almost without pause to phones or other devices—jabbering, texting, playing games, turning more and more to virtual reality of every sort.”

Our personal lives have been turned inside out, our privacy invaded. Think of what Facebook has done with posts you thought were personal to your friends, or that daily invasion of your cell phone space by a stream of telemarketing calls, or the tracking of your computer viewing via cookies.

And then there’s that immense loss for our culture and, consequently, for ourselves in our spendthrift use of our time for trivialities, foreclosing on better priorities such as art, music, literature and science that have buttressed our civilization and refine our humanity, promoting sensitivity, tolerance, knowledge and wisdom. Inundated by media, we traffic in noise. Bored, we may not like ourselves. We no longer know how to sit still.

“Everything is public now, potentially,” Sacks writes: “one’s thoughts, one’s photos, one’s movements, one’s purchases. There is no privacy and apparently little desire for it in a world devoted to non-stop use of social media. Every minute, every second, has to be spent with one’s device clutched in one’s hand. Those trapped in this virtual world are never alone, never able to concentrate and appreciate in their own way, silently. They have given up, to a great extent, the amenities and achievements of civilization: solitude and leisure, the sanction to be oneself, truly absorbed, whether in contemplating a work of art, a scientific theory, a sunset, or the face of one’s beloved.”

The punchline of all this arrives for Sacks in his now retreating days of life when he conjectures the worth of a life lived for better values in a context of seemingly burgeoning social indifference:

“. . . it may not be enough to create, to contribute, to have influenced others if one feels, as I do now, that the very culture in which one was nourished, and to which one has given one’s best in return, is itself threatened. Though I am supported and stimulated by my friends, by readers around the world, by memories of my life, and by the joy that writing gives me, I have, as many of us must have, deep fears about the well-being and even survival of our world.”

And yet Sacks stubbornly defies those hovering specters of demise:

“Nonetheless, I dare to hope that, despite everything, human life and its richness of cultures will survive, even on a ravaged earth. While some see art as a bulwark of our collective memory, I see science, with its depth of thought, its palpable achievements and potentials, as equally important; and science, good science, is flourishing as never before, though it moves cautiously and slowly, its insights checked by continual self-testing and experimentation. I revere good writing and art and music, but it seems to me that only science, aided by human decency, common sense, farsightedness, and concern for the unfortunate and the poor, offers the world any hope in its present morass.”

I fervently hope along with you that Sacks’ midnight wager turns out right. But to paraphrase Keats, the thought paradoxically lingers in me: does Sacks “wake or sleep”?

—rj

And a Child Shall Lead Them: Healing What Ails Us

I was talking just a few minutes ago with my better half, wondering just how I used to spend my idle hours before the Internet came into vogue. As is, I’m cuffed to a binary lodestone, whether smart phone, iPad, or desktop, dulling awareness, squandering time, exponentially addictive.

Generally, my dawns begin not with photographing sunrises or heading to the gym, but grabbing my tablet, which accompanies me even to bed, for a wakeup breakfast of The Guardian, BBC, CNN, and NPR.

Unsatiated, I imbibe local news back home where I lived for 41 years before moving this past summer, all of this consuming at least an hour. I check for updates several times throughout the day.

I comb Facebook for friend posts, get off text messages as day tumbles into noon.

One of the inveterate things I do is to google this and google that. If curiosity killed the cat, it’s stuffed my brain into info overload.

According to a recent Ofcom report featured in The Guardian, I’m not alone by any means. 78% of us now have smartphones, rising to 95% among young people aged 16-24. Returning from work, we grab a fast meal, throw ourselves into a comfy chair and turn on, say, Netflix, for a few hours more of wasteful indulgence.

Bored and stressed, we moderns seek distraction. We have difficulty keeping company with ourselves.

Addicted, each day becomes what Buddhists term samsara, the unenlightened repetition of the daily round, captured famously in Bill Murray’s stellar performance in Groundhog Day.

And we pay a steep price for all of this in a lifespan never really long enough, missing out on the miracle of life that’s not only ourselves, and won’t happen again, but of those around us enveloped in a cosmos, earthly and heavenly, infinite yet temporal.

It was Wordsworth, nature poet of a quieter time, who told us “the child is father of the man” in “The Rainbow.” What he probably meant is that what we are as children we become as adults in the maturation of habits and sensibilities acquired when young, particularly an early fondness for nature.

I’d extend its meaning to include a child’s sometimes extraordinary ability to show us adults the way, through their frequent displays of sensory delight in the nowness of things, each day a renewed cornucopia, at least before the advent of video games.

Maybe you’re getting my drift—that one way out of our electronic matrix is to rethink what we loved to do as children and rediscover it again. I loved studying languages, playing and watching baseball, walking to the library and the adventure of a new book, traveling to new places, meeting new people, learning new things, the smell of country air, the touch of bare feet on cool earth in early morning in our garden.

Children teach us not to fret about tomorrow.
To stop if it isn’t fun.
To be curious.
Honest.
Passionate.
Hopeful.
Forgiving.
To savor the moment.
To forgive.
To love.

I hold to action over prayer, but were I a praying guy, I’d surely pray, “Lord, give me the mind of a child again!”

—rj

The Plight of Native Americans in a White America

The White Man’s misdeeds in America towards its indigenous peoples are incalculable in number and cruelty. I was reminded of this last week when Karen and I visited the Grand Canyon and learned from the Visitor Center that Yavapai and Apaches once lived adjacent to the Canyon. That is, until 1874, when the government closed the Camp Verde Reservation and forced its residents to trek 180 miles to the San Carlos Apache Reservation. More than 100 Native Americans perished.

Nearly two years ago we witnessed the suppression of the Dakota Access Pipeline protests in North Dakota, which had commenced in 2016. Primarily affecting Sioux residents of America’s fifth largest Indian reservation, encompassing 2 million acres, the pipeline traverses sites sacred to the tribe and may compromise the Reservation’s water purity.

Initially, it appeared the tribe had won when President Obama shelved the plan in late 2016, pending an environmental review, which would take years to complete.

Alas, there came the surprise of Trump’s election win and the inauguration of an administration strident in its anti-environmental bias. In January 2017 came Trump’s executive order approving both the Keystone XL (from Alberta, Canada) and Dakota pipelines.

The result: several hundred thousand barrels of oil now flow beneath the once pristine landscape.

This wasn’t a first happening for the tribe. In the 1960s, the Army Corps of Engineers built the Oahe Dam near Pierre, SD, flooding 56,000 acres of the Reservation’s farms and woodlands. Elderly residents recall their homes being burned prior to the flooding.

Ironically, the Standing Rock Reservation is the birthplace and final residence of Sitting Bull, who fiercely resisted white infringement on Indian land. It was his refusal to submit to the government’s order removing the Sioux to a reservation that led to the famous Battle of the Little Bighorn, in which the Sioux defeated Custer’s 7th Cavalry in 1876.

In 1890, he was shot to death at the Standing Rock Reservation by Indian agents attempting his arrest. Several weeks later, the army massacred 150 Sioux, perhaps more, at Wounded Knee Creek. Some historians suggest it was an act of vengeance, carried out by the 7th Cavalry.

A wise, observant chief, it was Sitting Bull who asserted, “Hear me people: we now have to deal with another race—small and feeble when our fathers first met them, but now great and overbearing. Strangely enough they have a mind to till the soil and the love of possession is a disease with them. These people have made many rules that the rich may break but the poor may not. They take their tithes from the poor and weak to support the rich and those who rule.”

Prescient and explicit, Sitting Bull’s comment lends context to the historical narrative of White infringement on the rights of America’s native peoples that continues even now.

—rj

Does the Qur’an Preach Violence?

Yesterday came news of the slaughter of up to 300 Sufi worshippers exiting a mosque in Egypt’s Sinai at the close of prayer, among them, twenty-seven children. It isn’t the first time such a murderous attack on unarmed civilians, even fellow Muslims, has occurred in Egypt and elsewhere.

Increasingly, Islamic violence has spread to Europe and North America as well. Thanks to Carnegie Mellon’s interactive platform, EarthTimeLapse, drawing on the Global Terrorism Database, we can even precisely map both its locale and frequency over the last 20 years.

Last year, 2016, Islamic extremists killed 269 people across Europe. In America, we’ve largely escaped since 9/11, apart from several sporadic incidents, the latest occurring in NYC when Uzbekistan immigrant Sayfullo Habibullaevic Saipov drove a rented truck into a crowded bike lane in lower Manhattan, killing eight on October 31, 2017.

All of this pales, however, when we include the Middle East and Africa, where 19,121 died last year, according to the Global Terrorism Database, nearly all of them Muslim, a fact we’re likely to miss in our frequent ethnocentrism.

Consequently, it’s not unreasonable that many have come to associate Islam, “the religion of peace,” with violence. Critics call it Islamophobia.

Feeding into the public’s unease have been the likes of Franklin Graham, Robert Spencer and Pamela Geller, who have vociferously argued that Islam is intrinsically disposed to violence, both in its long history and the present, posing an insidious threat to our nation’s future, given Muslims’ rising immigration numbers, high birth rates, and alleged intolerance.

And, of course, there’s President Trump who has seemingly bought into the nation’s anxiety, perhaps for political advantage.

I would personally like to defuse my own unease that erupts with almost daily news bulletins announcing some new, malicious violence somewhere on our troubled planet. Hopefully, its source isn’t Islam, but almost always it is.

I studied in France at the University of Dijon in the summer of 1985 and my best memories are of the friendships I shared with Muslims from Morocco, Iraq, Jordan, Iran, Syria and Israel. What won their affection was my sympathy for the Palestinians in their pursuit of nationhood, something they found incongruous, what with the stereotype Muslims often have of Americans, given our country’s traditional support for Israel.

We never quarreled. In fact, religion never came up at all.

At home, while I haven’t had much contact with Muslims, I’ve met several who treated me with kindness when I’ve met them in stores, allowing me to precede them in a line or fetching me a grocery cart.

My experience of Muslim reciprocal kindness tells me that, like most human beings, their hearts are good and want to share their goodness, and that their faith has been grievously maligned.

Now comes Garry Wills’ new book, What the Qur’an Meant: And Why It Matters (Viking, 2017). This is a crucial book, since Muslim proponents of terrorism trace what we now know as jihad back to the Qur’an, which they interpret literally, devoid of context or cultural antecedents, doing what fundamentalists generally do, whatever their proclaimed religion.

Taken out of context, the Qur’an can indeed be disturbing reading, but so can the Old Testament with its advocacy of genocide towards those of different faith and culture like the Moabites. Wills, on the other hand, persuasively argues that the Qur’an is utterly incompatible with the barbaric atrocities committed in its name by Isis, Al Qaeda, Boko Haram, the Taliban, and Al-shabaab.

Wills points out that the Qur’an’s terms jihad (“striving,” i.e., “zeal”) and sharia no longer carry their original nuance for these groups; jihad, for example, can suggest “holy war” in modern Arabic.

As for “sharia” with its modern association with complex religious laws, it occurs only once in the Qur’an (Q. 45:18) and simply means “the right path.” In fact, no complex system of religious laws even existed in the Prophet’s lifetime.

In Wills’ view, the Qur’an has been grossly abused by militant Muslims, its text supporting peaceable co-existence with Judaism and Christianity and recognizing Adam, Noah, Abraham, Moses and Jesus as antecedent prophets to the preeminent prophet, Mohammed.

The Qur’an does allow for defensive warfare against militant aggressors opposed to monotheism: “If God did not repel some people by means of others, many monasteries, churches, synagogues, and mosques, where God’s name is much invoked, would have been destroyed” (Q. 22:40).

The one liability for Wills is the Qur’an’s seeming denigration of women as in Q. 4:34: “If you fear bad conduct from your wives, advise them, then ignore them in bed, then strike them. If they obey you, you have no right to act against them.”

By the same token, the Qur’an doesn’t assign blame to Eve, who is unnamed, for the transgression in the garden, unlike the Old Testament.

While Wills’ scholarship seems impeccable in its fairness and exactitude, the problem of the Qur’an’s grievously distorted message and misappropriation by radical Islamic extremists and many Western critics remains. Good as Wills’ book is, it will prove no more effectual in promoting reconciliation among all faiths than a seed by itself can produce a harvest.

Islam needs to undergo theological and cultural reform, as occurred with Judaism and Christianity, to curtail radical extremism. Though there are reformers trying to do just that, such as Maajid Nawaz, Asra Nomani, and Irshad Manji, they’re all too few and face vehement opposition, if not enmity, among purists and entrenched theocracies like Iran. Islam isn’t merely a faith but a total way of life, in which change, if any occurs, will only come grudgingly.

What this book can do for those who read it is exonerate the vast majority of the world’s more than one billion Muslims, whether Sunni or Shiite, from the stigma of violence and intolerance so often imputed to both the Qur’an and its practice as a consequence of an extremist minority. After all, Muslims have been, by far, terrorism’s chief victims, as yesterday’s Sufi massacre attests.

—rj

NFL Hypocrisy

The media has been all over this story of Sunday’s NFL response to Trump’s provocative tweet that NFL team owners should fire players who don’t stand proud when the national anthem is played: “Wouldn’t you love to see one of these NFL owners when somebody disrespects our flag to say, ‘Get that son of a bitch off the field right now. He’s fired.’”

Even NFL commissioner Roger Goodell got in his licks at Trump, responding that “The NFL and our players are at our best when we help create a sense of unity in our country and our culture.”

All fine and good, but the NFL’s last-minute conversion to players’ right to freedom of speech reeks of blatant hypocrisy. In July 2016, five Dallas police officers were killed in a sniper ambush. As a symbol of community support for police officers, the Dallas Cowboys asked the NFL for permission to wear an “Arm in Arm” helmet decal. The NFL refused. Where was the “unity” then?

Meanwhile, NFL teams continue to discriminate against free agent Colin Kaepernick, who started the take-a-knee protests during the anthem. Quarterbacks have been subsequently signed without ever having thrown a football in an NFL game.

Now’s the time for NFL teams to walk the talk and return this former Super Bowl quarterback with a 90.3 passer rating to the playing field. Sooner or later, some team’s going to suffer an injured quarterback. Voilà!

—rj

Baseball’s Decline

Last night, the Cleveland Indians won their twentieth straight game, tying the 2002 Oakland A’s, an American League record. Win one more, and they’ll have tied the 1935 Chicago Cubs. Still, they have a ways to go for the all-time record, depending on how you count: the 1916 New York Giants won 26 straight games, although there was a tie that wasn’t counted against the streak.

You’d think Cleveland fans would turn out in droves to see their sizzling team, but not so. Last night, just 24,624 fans witnessed their historic blitz behind ace pitcher Corey Kluber at Progressive Field.

Seems they’d rather invest in their perennially dismal Cleveland Browns, who drew 67,431 for their NFL opener against the Pittsburgh Steelers, which, not surprisingly, they lost. How bad is it? The Browns have lost 13 of their last 17 home openers.

Despite MLB’s aggressive, multifarious marketing efforts, let’s face it: baseball faces an evolving slide into a pastime of only marginal interest, something like what used to be true of lacrosse or soccer, the latter increasingly shoving baseball aside as a draw for young people. As baseball aficionado C. J. Kelly observes in reminiscing about his old neighborhood,

the baseball fields near my house lay empty on hot summer days except for the occasional Church softball games. The park that surrounds them is even devoid of kids most of the year. The fast-flowing river is all you can hear. The sound of a ball hitting a bat whether it be wood, aluminum and even Whiffle, that was so much a part of my childhood, is missing. You’re more likely to hear skateboards rumbling down the hill leading to the park. I can’t remember the last time I saw a kid walking anywhere with a baseball glove.

By the way, the average age of those tuning into an MLB TV game is now 55+.

I remember being a kid in Philadelphia in 1950 when the Phillies won their first pennant since 1915. The excitement was palpable. Believe it or not, the World Series had a tradition then of playing day games. In factories, workers turned on their radios; our teachers routinely filled us in on what was happening; city newspapers screamed the team’s fortunes in bold black front-page headlines.

Changed, all utterly changed.

The ’80s inaugurated the era of performance-enhancing drugs. Maris, who had legitimately broken the Babe’s record of 60 homers, set in a 154-game season, way back in 1961, would ultimately fade into stellar darkness as Bonds, McGwire, and Sosa eclipsed his accomplishment in the ’90s. Now we know how they did it. How many others did it too we’ll never know, since MLB only instituted PED testing in 2003.

Ticket prices have soared. After all, players have to feed their families. Today, the average ball player makes $4 million a year. Draftees sign bonuses in the millions, never having played a single game in the Majors.

And for what? Take a relief pitcher, for example: maybe he pitches two or three times a week, to just one or two batters each time, yet he can earn a huge payday, say a million at the very least. The aces, of course, make much more. Aroldis Chapman of the Yankees gets a cool $21 million a year. Pity Craig Kimbrel. He lags far behind at $13,250,000.

And then there’s the DH or designated hitter, kind of a built-in pinch hitter who can bat multiple times in a game and never take the field. Again, a lot of big bucks seldom proportionate to their actual contribution.

There’s no team or fan loyalty anymore. Today’s credo: meet my bottom line or I’ll take my glove elsewhere. Players often end their careers having played for four or more teams.

The game’s lost much of its finesse, like bunting. And when players do bunt, some pitchers get angry, taking it personally.

Ridiculously, you see a shift on nearly every batter these days, even the low-.200 hitters.

Strikeouts don’t matter either as batters swing for the fences rather than the base hit. This year, homers rival New Mexico hot air balloons in their celestial ascent, while batting averages remain earthbound.

At the heart of baseball’s decline is game length, usually at least three hours, often more. We’re living in the age of the clock and speed, but baseball hasn’t gotten the message. Pitchers taking thirty seconds each time to throw the ball can drive you into dementia. And then there are the frequent catcher-pitcher confabs.

Meanwhile, young people are turning away from the game in droves, and even Little League participation is declining.

What’s especially disheartening to me is the increasing scarcity of African-Americans playing the game, opting instead for football or basketball. I think of Jackie Robinson, who pioneered their inclusion in Major League Baseball, and the long tradition of Black prowess contributing so much to making the game appealing and a gateway for disenfranchised Black youth.

The truth is, baseball is dangerously close to regressing to a white man’s game again, apart from the rich contribution of Caribbean ball players.

History used to matter in baseball, but they tore down Yankee Stadium anyway and built another, which they frequently can’t fill. Is Wrigley Field or Fenway next?

Back when I was a Philly kid, I’d listen to radio broadcasts while on the floor playing with my toys. Soon I knew every player by name. If I had any pocket change, I’d be up at Shibe Park watching the hapless A’s. Summers, not a day passed but I’d rush out of the flat, joining the neighborhood kids playing stickball against the factory walls.

I wish the old game were back. Today, it’s about money.

–rj

PS: My wife just told me the Indians played this afternoon, winning their 21st straight game, a new American League record. I don’t know about you, but I find that exciting.


Is Mindfulness Warmed-over Buddhism?

Mindfulness meditation seems everywhere these days. Even the corporate world embraces it, e.g., Google, Facebook, eBay, and Twitter. And in medical circles, it’s all the rage, particularly in psychiatry, where it increasingly rivals pharmaceutical intervention as a primary therapy in treating depression and generalized anxiety disorders.

But is there any real science behind mindfulness, or is it simply Buddhism warmed over for Western consumers?

Supposedly, mindfulness is all about being in the present. Never mind regrets about past mistakes or plans to make things better. Just let go. What matters is being sentient in the Now. In the sports world, you might call it “being in the zone.”

Mindfulness, as in Buddhism, has three steps: concentration, insight, and its sequel, empathy.

You get there largely by focusing on your breathing. While your mind will inevitably stray with what Buddhists call “monkey mind,” don’t worry about it. Simply listen to, without engaging, any thoughts that press in on you. Mindfulness encourages acceptance and avoids being judgmental.

But why mindfulness, even if it does help relieve your stress?

Why not a pill?

Why not counseling?

Or soft music?

Or having fun with a good friend?

Or relaxing on the beach?

Why not just slow things down and sit still?

Where’s the research to back up the craze or to validate that it’s more effective than traditional ways of promoting well-being?

In short, mindfulness has its critics, some of whom argue that self-confrontation can even be dangerous for you. Do you really want to probe repressed memories and labyrinthine chambers of loss, grief, and failure?

Melanie McDonagh, a writer for the Evening Standard (London), argues in The Spectator that mindfulness didn’t work for her, given her inability to stay focused.

Mindfulness is supposed to ultimately make you more compassionate. But where’s the proof of that?

…as far as I can gather, it’s mostly About Me Sitting.  Concentration on your breathing is a good way to chill out and de-stress, but it’s not a particularly good end in itself. Radiating compassion is fine, but it doesn’t obviously translate into action. Where’s the bit about feeding the hungry, visiting the prisoner, all the virtues that Christianity extols? Where in fact is your neighbor in the practice of self-obsession?

In rebuttal, the test of properly practiced mindfulness is demonstrated outwardly in leaving ourselves behind and thinking of others. Any failure doesn’t lie in mindfulness, but in those who really haven’t entered into what it’s all about. I like how Shinzen Young phrases it: “The new self is not a noun, it is a verb” (The Science of Enlightenment: How Meditation Works).

What really irks me about McDonagh is an underlying dislike of Buddhism. While extolling the virtues of Christianity, she glosses over its sordid history of crusades, inquisitions, misogyny, embrace of slavery, hostility toward gays, colonial genocide, etc. You’ll not find any of this in Buddhism.

Mindfulness, as in Buddhism, or even Christianity, teaches you to rid yourself of the sense of a separate self. In short, we’re all part of the experiential flux of time and the temporal.

Stephen Batchelor, a former Buddhist monk, puts it this way in his observations of the Dalai Lama, its exemplum, whom he has met and spent time with on several occasions:

At the heart of [his] sensibility plays a deep empathy for the plight of others, which seemed to pour forth from him effortlessly and abundantly…Such empathy requires that one undergoes a radical emptying of self, so that instead of experiencing oneself as a fixed, detached ego, one comes to see how one is inextricably enmeshed in the fabric of the world (Confessions of a Buddhist Atheist).

McDonagh is just plain wrong in her reductionism, which short-circuits any fair appraisal grounded in a thorough knowledge of mindfulness’s antecedents, methodology, and scientific standing, when she asserts that mindfulness is essentially warmed-over hash: “Think meditation, think Buddhism, and you’re there, so long as you don’t forget the breathing.”

On the contrary, while Western mindfulness owes much to Buddhism, it’s essentially rationalistic, eschewing metaphysics, and eclectic in its makeup, drawing from many strands to implement methodologies congruent with current science, validated through empirical research, much of it utilizing brain-imaging data.

It professes no deities, practices no rituals, has no hierarchy, and no theology. It attracts the best minds.

What it does share with Buddhism, and science for that matter, is a belief in the interweaving of cause and effect and the primary role of empiricism, not speculation, in assessing evidence.

Hence its appeal to Western minds and the fact that it works for diverse needs and in a plethora of settings.

–rj