Growth is Good?

When driving, I not infrequently see bumper stickers proclaiming, “Growth is Good.” I’m not sure about that. I never have been.

I’m drawn to solitude. I can’t fully explain why; I only know it has always been so.

From childhood, I loved my native New England—its rural charm, its villages gathered around commons harkening back to Revolutionary times. I miss the sea, the rolling mountains, the stone walls and white-steepled churches.

Vermont has long held me in its grip, with Maine not far behind—rural, twisting lanes with scant traffic, mountains rising beyond charming villages, the faint salt scent of the seashore.

For all my love of open space, my early years were divided between rural Rowley, MA, and Philadelphia. I’ll take Rowley any day over the so-called City of Brotherly Love. I have returned to Philadelphia twice; both visits stirred up memories I would rather forget—urban blight, high crime, dirty streets, endless row houses and sweltering summer nights.

When I lived in Rowley in the 1950s, the town numbered about 1,600 souls. Thirty-five miles north of Boston, it has since quadrupled in population to roughly 6,400 and become a fashionable bedroom community for Boston’s well-paid professionals, aided by the MBTA Commuter Rail—a comfortable fifty-five-minute jaunt into Boston’s North Station.

We lived on Bennett Hill Road, a mile’s walk from the common. In summer I would rush out the door, glove hooked over my bicycle’s handlebar, eager for baseball on the green.

I loved walking that narrow, curving road lined with beach plum bushes and open fields, breathing the smell of nearby salt marsh bordering the ocean, as I waited for the school bus at the corner.

Home was a two-hundred-year-old house set on twenty-six wooded acres of pine. Its long driveway was lined with stone walls of piled boulders in the old New England manner, guarded by towering elms like ancient sentries.

Deep in those woods I had a private place I visited often, drawn by its stillness—the scent of pine rising from fallen needles that softened the earth beneath my feet.

Today, Rowley finds itself embroiled in debate over a state mandate requiring density zoning for roughly six hundred new multi-family housing units near the train station.

Bennett Hill Road has changed. The beach plum bushes are gone; open fields have yielded to wooded suburban enclaves. In the 1950s, the average home sold for $10,000. Today it approaches $1 million.

I cannot forecast Rowley’s fate, though I hope it doesn’t become another Danvers or Peabody—once distinct towns now absorbed into the dense sprawl of greater Boston.

A major development called “Rowley Farms” proposes a “town within a town,” combining denser housing and retail—an idea that would have been unimaginable in 1957.

Bumper stickers may insist that growth is good. I remain unconvinced. I am, and likely will remain, a stubborn holdout for a way of life that prized quiet contentment—secluded from the noise, the restlessness, and the accumulated griefs of urban life.

—rj

Presidents’ Day and the Enigma of George Washington and a Nation

Painting: John Trumbull. George Washington resigns as Commander-in-Chief, Continental Army.

I grew up in an America that celebrated George Washington’s February 22 birthday as a national holiday, signed into law by President Rutherford B. Hayes in 1879 as a federal workers’ holiday, then extended nationally in 1885.

Things changed in 1968 when Congress passed the Uniform Monday Holiday Act, providing longer weekends and consolidating observances. Washington’s Birthday was reassigned to the third Monday in February, still legally designated as his birthday, though some congressional members proposed including Lincoln’s February 12 birthday under the broader aegis of Presidents’ Day.

That broader title caught on. Retailers and state governments increasingly adopted “Presidents’ Day.” Like so many holidays, it became a corporate sales opportunity.

Accordingly, the first item in my email this morning was a Presidents’ Day furniture sale—the first of what will no doubt be a barrage as the third Monday approaches.

Historian Jonathan Horn finds it absurd that we no longer distinctly celebrate Washington’s Birthday, given that it was observed in the young nation even before the framers of the Constitution met in 1787 (“Just Call It Washington’s Birthday,” Free Press, Feb. 11, 2026).

We owe much to this exemplary leader. Ken Burns credits Washington as “the glue that held it all together” in his PBS documentary The American Revolution. Facing superior, disciplined British forces, Washington understood that victory would require patience: knowing when to retreat, striking unexpectedly, and prolonging the war.

After defeats in New York, expiring enlistments, and desertions, matters reached their nadir during the winter encampment of 1777–78 at Valley Forge, outside British-occupied Philadelphia. Starvation loomed. Smallpox ravaged the ranks. Soldiers were unpaid, underfed, and poorly clothed.

With sagacity, Washington enlisted the Prussian officer Baron von Steuben, who molded the army into a disciplined fighting force. Recruits followed in greater numbers.

After General Gates’ victory at Saratoga, Washington was able to engage French support, with the Marquis de Lafayette serving as liaison.

The decisive blow came when French naval forces blocked Cornwallis’s escape at Yorktown. Washington had deceived Cornwallis into believing New York was his objective while covertly moving his troops south. The British capitulated, leading ultimately to peace in 1783.

Washington had endured criticism without vindictiveness, even surviving a mutiny threat by disgruntled, unpaid officers.

For Burns, Washington’s greatest moment was not Yorktown or Trenton, but his resignation of his commission in Annapolis in 1783. As Burns tells it, he “knew how to defer to Congress, knew how to inspire ordinary people in the dead of night, knew how to pick subordinate talent—just had a kind of presence to him that, without him, we don’t have a country” (Chadwick Moore, New York Post, Nov. 11, 2025).

He did something similar in refusing a third presidential term. His 1796 Farewell Address remains prescient in its warnings against partisanship, permanent foreign alliances, sectionalism, and constitutional usurpation: “Let there be no change by usurpation; for though this, in one instance, may be the instrument of good, it is the customary weapon by which free governments are destroyed.”

Yet Washington has come under fierce attack, criticized for slave holding and judged by contemporary moral standards. Some view him primarily as a symbol of racial oppression and seek removal of his name and likeness from public spaces.

In the aftermath of George Floyd’s killing in 2020 and the ensuing unrest, efforts accelerated to eliminate reminders of racial injustice, including monuments to Washington, Jefferson, Lincoln, Theodore Roosevelt, and Woodrow Wilson. Streets, buildings, and schools were renamed; statues toppled or defaced.

In Portland in 2020, protesters toppled and defaced a statue of Washington in a public park.

At the University of Washington, protesters called for removal of his statue.

In 2020, a statue of Washington at George Washington University was beheaded.

In 2021, the San Francisco Board of Education voted to rename a school honoring Washington, later reversing course after public scrutiny.

A Washington, D.C. working group, commissioned amid racial justice protests, recommended reviewing public names and monuments, suggesting federal sites be reconsidered for contextualization or renaming.

While some of this fury can be understood as anger over longstanding injustice, historian Howard Zinn argues in A People’s History of the United States that Washington’s mythic stature obscures his slaveholding and his violent campaigns against Indigenous peoples. Jill Lepore, in These Truths, likewise underscores the inseparability of his leadership from slavery.

It is painful to read, but much rings true.

Burns recounts Washington’s 1779 campaign against the Iroquois, ordering the destruction of settlements in retaliation for their alliance with the British: “Lay waste all the settlements around… that the country may not be merely overrun, but destroyed.” Towns and crops were burned. Many perished in the ensuing winter from famine and disease.

America was founded on a compromise that would lead to civil war and immense loss of life. Our history is marked by both courage and cruelty, liberty and bondage. We diminish ourselves if we pretend otherwise.

But we also diminish ourselves if we forget the magnitude of what was achieved: a fragile republic wrested from empire, sustained not by perfection but by discipline, restraint, and the voluntary surrender of power.

We are a nation still struggling to reconcile our ideals with our conduct. The work of ordered liberty, of constitutional self-government, of moral reckoning without erasure, remains unfinished.

The Revolution continues—not in the toppling of statues or canceling history, but by whether we can tell the truth about our past without losing the capacity to honor it.

—rj

Learning What Is Enough

On winter mornings, before the day has decided what it will become, the fields hold a stillness that feels provisional—frost clinging to the grass, fence lines darkened with damp, the land waiting without impatience. It is a good hour for reading slowly, for choosing words that do not hurry ahead of their meanings.

I have begun the year reading Wendell Berry. Now in his ninety-second year, he continues—more slowly, more deliberately—to farm and to write, unchanged in his fidelity to limits: the authority of place over abstraction, the moral claims of the local over the corporate, tradition understood not as nostalgia but as knowledge earned through use and endurance.

I read him most mornings. His work steadies the day. It does not offer solutions so much as orientation—toward what is given, what is sufficient, and what must be borne. Berry has always made room for joy, but never without sorrow, nor for hope without the acknowledgment of failure, including one’s own.

Some of his most influential prose appeared early, when his voice was still finding its public footing. The Long-Legged House and The Unsettling of America argued, quietly and insistently, that culture and agriculture are inseparable, and that when land is treated as commodity rather than community, both soil and people are diminished.

I return often to his poetry, especially A Timbered Choir: The Sabbath Poems. Written on Sundays and largely free of polemic, these poems are acts of attention. They move patiently through the stages of a human life—birth, labor, love, diminishment—offering a sacramental vision of ordinary days lived close to the ground. Among them is Berry’s most widely known poem, “The Peace of Wild Things,” whose calm acceptance of life’s ephemerality offers not escape from anxiety, but release from the burden of false mastery:

“I come into the peace of wild things who do not tax their lives with forethought of grief.”

The peace the poem offers is not consolation so much as proportion. Its discipline lies in relinquishing the anxious reach into the future and reentering creaturely time—where life is finite, local, and sufficient.

That same discipline governs Berry’s essay “Why I Am Not Going to Buy a Computer,” first published in 1987 and often misread as a rejection of technology itself. It is instead a meditation on the moral weight of tools. Berry does not deny their usefulness; he questions their claims. Certain technologies, he suggests, quietly privilege speed over deliberation and convenience over care, reshaping habits of attention until efficiency becomes an unquestioned good.

The good life, in Berry’s accounting, is not optimized. It is inhabited. To live well requires learning the difference between what is necessary and what merely promises ease.

Barbara Kingsolver, another Kentuckian, names this work plainly when she writes:

“I consider it no small part of my daily work to sort out the differences between want and need. I’m helped along the way by my friend Wendell, without his ever knowing it. He advises me to ask, in the first place, whether I wish to purchase a solution to a problem I don’t have.”

Berry’s essay is not finally about computers at all. It is about scale and consequence. It asks not simply what a tool can do, but what it may undo—what forms of patience, responsibility, and mutual care it quietly displaces. It asks how our choices shape our relationships to family, to community, and to the land that sustains both.

Berry still writes with pencil on a yellow legal pad. He still farms, though within the limits age imposes. He still publishes—new poems, even a recent novel. The persistence itself feels instructive.

In a culture bent on expansion and acceleration, Berry’s life suggests another measure of success: fidelity to place, restraint in use, and the long patience required to learn what is enough.

—rj


The Discipline of Kindness

Anger has become one of the easiest responses to modern life.

What troubles me most is not anger itself, but how easily my emotions can be manipulated—by discourtesy, by noise, by global politics, by the ambient insensitivity of others. That reactive state isn’t who I really am, yet it’s one I’m repeatedly invited into.

I want to be kind, not reactionary; deliberate, not pushed into negativity. I want to remain self-governing. As Marcus Aurelius put it, “Be tolerant with others and strict with yourself.”

Years ago, I came upon a short piece in Reader’s Digest titled “Do I Act or React?” I read it at exactly the right moment. Why should a discourteous store employee spoil my day? Or a waiter serving me something I never ordered? Or a driver who cuts me off?

These moments are trivial in isolation, yet corrosive in accumulation. As the Stoic writer Ryan Holiday reminds us, “Jerks abound everywhere. That’s their business, not yours.”

My wife once shared an encounter she had with a rude bank teller. Instead of meeting rudeness with rudeness, she simply said, “I hope your day gets better.” It was disarming—not passive, not superior, just humane. Too often, we carry our moods like open containers, spilling them onto others without noticing.

The distinguished novelist and essayist George Saunders suggests that literature can help us cultivate tolerance—not all at once, but incrementally, and therefore cumulatively. A practicing Buddhist, Saunders believes that reading fiction trains us in three essential truths:

You’re not permanent.
You’re not the most important thing.
You’re not separate.

Literature slows judgment. It places us inside other consciousnesses. It rehearses moral humility. In doing so, it loosens the grip of the ego—the very thing that insists on being offended.

Kindness, after all, isn’t mere niceness, which can look away from cruelty. Kindness sees clearly. It chooses understanding over reflex, restraint over retaliation.

So many of our perceived hurts come down to our desire to control others: their tone, their behavior, their awareness of us. When that control fails—as it inevitably does—we suffer. The antidote is not indifference, but proportion. We need to take ourselves less seriously.

Saunders doesn’t offer a list of recommended books, though he has often spoken admiringly of Grace Paley.

I return often to Marcus Aurelius’ Meditations, a slender volume that rewards endless rereading.

One line, in particular, feels endlessly applicable:

“Is a world without shameless people possible? No. So this person you’ve just met is one of them. Get over it.”

Not resignation—clarity. Not bitterness—freedom.

The Moral Arithmetic of American Capitalism

Did you know that the average CEO compensation at large U.S. public companies now stands at roughly 280 times the pay of a frontline worker?

That represents a staggering shift from the 1960s, when CEOs earned 20 to 30 times what their workers made. Since the 1970s, the CEO-to-worker pay ratio has increased by over 1,000 percent.

This divergence did not occur by accident. One pivotal change came in the late 1970s, when American corporations moved away from a model centered on growth, stability, and shared prosperity toward one focused on maximizing shareholder value. Executive pay was increasingly tied to stock price rather than the long-term health of the firm.

With the rise of stock options and equity grants, CEOs could reap enormous rewards without raising wages, expanding productivity, or strengthening the workforce. Compensation ballooned even when companies stagnated.

Tax policy amplified the effect. In the 1950s and 1960s, top marginal income tax rates ran as high as 70 to 90 percent, effectively discouraging runaway executive pay. That restraint largely disappeared in the 1980s, as marginal tax rates fell sharply, making extreme compensation both legal and cheap.

At the same time, labor power collapsed. Union membership declined, offshoring and automation accelerated, and job security eroded. Productivity rose; worker wages did not. Executive compensation absorbed the gains.

Business leaders defend this system by claiming that outsized pay is necessary to attract top talent. In practice, this has produced a self-perpetuating escalation, as boards benchmark CEO pay against ever-rising peer averages. In a globalized economy, profits flow upward, not outward.

Yet America’s extreme CEO-worker wage gap is not an inevitable feature of advanced capitalism.

Consider international comparisons:

Typical CEO-to-worker pay ratios (large firms):

  • United States: ~250–350:1
  • Western Europe: ~40–90:1
  • Japan: ~15–40:1

In much of Europe, workers sit on corporate boards, restraining excess. In Japan, adopting the American compensation model would be seen as collective irresponsibility, not enlightened management.

Public anger is justified—especially amid persistent inflation and decades of wage stagnation. Can the old restraints return?

There are tentative steps. The Tax Excessive CEO Pay Act of 2025, introduced by Rep. Rashida Tlaib and Sen. Bernie Sanders among others, would raise corporate tax rates on companies whose CEO pay exceeds worker pay by extreme margins, beginning at 50-to-1. But meaningful reform would require broad coalitions and a substantial shift in Congress. Change, if it comes, will be slow—and uncertain.

Transparency may be the public’s strongest immediate tool.

What has happened in America is not merely an economic evolution; it is a moral shift. Accumulation has replaced public responsibility as the dominant ethic, not only in corporate life but across society. Its most vivid emblem is the twice-elected billionaire president, Donald Trump, whose politics celebrate wealth while dismantling social safeguards.

Since 1990, the number of U.S. billionaires has grown from 66 to more than 800, while the median hourly wage has increased by only about 20 percent.

This is not efficiency.

It is not merit.

It is not inevitability.

It is obscene.

—rj

The Shouting Silence

D.G. Chapman, Unsplash

Silence has always allured me, most often when it is bound to expanses empty of people—though not always. I can find it just as readily in a library, or even in my own home when left to myself.

It is not, I believe, a resistance to an oppressive environment—work, academics, trauma, peer pressure, or the quotidian churn of human caprice—what psychology terms “psychological reactance.” It goes deeper than that, perhaps rooted in my introversion, which inclines me away from crowds and constant social encounter.

I carry memories of three landscapes that produced instant rapture: a sense of detachment, of absence from time itself—something larger than me, and yet intimately felt.

The first occurred when I was a graduate student in North Carolina, visiting the hillside at Kitty Hawk where the Wright brothers first achieved sustained flight in their ungainly aerial contraption. I had gone with friends, who wandered along the beachfront below, leaving me alone atop the hill. There, history seemed to recede. The wind moved through the grass, the sky stretched open and unmarked, and for a moment the present dissolved, as though time itself had paused in reverence.

Then there was Arlington National Cemetery, its vast rows of symmetrical white grave markers extending beyond easy comprehension. The stillness there was not empty but weighted, a silence shaped by collective sacrifice. For a brief moment, the eternal peace of America’s fallen became my own.

Most memorable of all were Scotland’s Highlands. Driving north from Edinburgh, I watched them rise suddenly and unexpectedly across the horizon—rugged, green, and seemingly untouched by human intrusion. I pulled over, stepped out, tested the firmness of their verdure beneath my feet, and listened to what I can only call their shouting silence. That moment remains my most cherished travel memory.

As an English major in college, I once took a course devoted entirely to Wordsworth—England’s great poet of landscape. I am, perhaps, a rarity in having read all of his several hundred poems. Among them, “Tintern Abbey” most fully captures my response to those landscapes:

“…that serene and blessed mood,
In which the affections gently lead us on,—
Until, the breath of this corporeal frame
And even the motion of our human blood
Almost suspended, we are laid asleep
In body, and become a living soul…”

Literary scholars describe this response under the notion of the sublime: the experience of being overwhelmed through intimacy with nature, a flash of clarity in which one intuits a larger coherence behind nature’s mystery. Wordsworth gives it further voice:

“And I have felt
A presence that disturbs me with the joy
Of elevated thoughts; a sense sublime
Of something far more deeply interfused,
Whose dwelling is the light of setting suns,
And the round ocean and the living air,
And the blue sky, and in the mind of man….”

Psychology approaches the experience from another angle. One theory frames it as a sensory reset—the mind’s need to unburden itself from obligation and affliction, a release from the cognitive overload of daily life.

I am especially drawn to E. O. Wilson’s Biophilia Hypothesis, which proposes that humans evolved in constant contact with nature, calibrating the nervous system through millennia of hunter-gatherer life. In that context, a deserted landscape could signal safety—the absence of predators, permission to rest.

Another perspective, drawn from research on the brain’s Default Mode Network, suggests that quiet environments can trigger awe by suspending habitual rumination. Freed from constant external demands, the mind drifts toward reflection, memory, and imaginative connection. In such moments, the brain is allowed to hear the rhythms it evolved to monitor.

This makes intuitive sense. We live in a world saturated with anthropophonic noise—human-made sound without pause or mercy. Though nature is never truly silent—wind, water, and the subtle movements of life persist—these sounds soothe rather than assault. They restore rather than demand.

Wordsworth seems to anticipate this longing even in the heart of the city. In “Composed upon Westminster Bridge, September 3, 1802,” he finds London redeemed by a rare moment of stillness:

“Earth has not any thing to show more fair:
Dull would he be of soul who could pass by
A sight so touching in its majesty:
This City now doth, like a garment, wear
The beauty of the morning; silent, bare,
Ships, towers, domes, theatres, and temples lie
Open unto the fields, and to the sky;
All bright and glittering in the smokeless air.
Never did sun more beautifully steep
In his first splendour, valley, rock, or hill;
Ne’er saw I, never felt, a calm so deep!
The river glideth at his own sweet will:
Dear God! the very houses seem asleep;
And all that mighty heart is lying still!”

Perhaps silence, then, is not an absence but a presence—one that returns us to ourselves, quiets the mind’s noise, and restores a way of listening we once possessed, and have not entirely forgotten.

Convergence: Seeing Others as Ourselves

I’ve been keeping a list for some time of my favorite blogs. There are so many to choose from that discovering one capable of sparking genuine enthusiasm often feels like chance—like gazing nightly at the starry heavens and marveling at what lies beyond human reach. When such a blog does appear, it draws me back again and again, not out of habit but wonder.

Relying on that list, I came upon a short paragraph-poem this morning by Dr. Drew Lanham, the award-winning African American professor of wildlife ecology at Clemson University. Lanham confesses to being “a man in love with nature—a wanderer finding foundation in wild places.” What follows is both intimate and expansive:

Handle my life in your hands as if it were your own. Feel the heart beating—small as it may be—and imagine it in your own chest. Beating in syncopated time to become shared meter. That pulse, the breathing, is your rhythm. Your in’s and out’s, its in’s and out’s. Look close under whatever warty skin or soft fur or gaudy feathers and see self. Its being is your being. Be in that same skin for what moments it will allow. Then, when the convergence between you is sealed, release that wild soul to free roaming as you would desire of your own.

What strikes me is the poem’s quiet tone: a persona grounded in kinship with the natural world, alive with empathy for the vulnerable. Applied beyond nature—to our fellow humans—it gestures toward something transcendent: a way of bridging difference, whether of creed, ethnicity, or race.

Such bridging begins only when we recognize our linkage, when we are willing, even briefly, to see ourselves in others—“Be in that same skin for what moments it will allow.”

Lanham wrote this poem after rescuing a frog from his cat’s pursuit.

After the Stars Go Dark: Thermodynamics for Mortals

Our universe is mind-boggling in its vastness and mystery. Astronomers estimate it contains on the order of a septillion stars—roughly a 1 followed by twenty-four zeroes—though such figures apply only to the observable universe, bounded by the reach of our most powerful telescopes.

Though stars appear to us as ageless fixtures of deep time, they too—like all things—have beginnings and endings.

Our universe itself burst into being some 13.8 billion years ago. It continues to expand as galaxies rush away from one another, yet it will not expand forever in any form recognizable to us. Ultimately, it will be unable to sustain the structures that make matter—and life—possible.

How it ends remains a matter of fierce conjecture. The leading scenario suggests that as galaxies drift ever farther apart, they will fade beyond visibility; their stars will exhaust their fuel, leaving space cold, dark, and diffuse—a state known as thermodynamic equilibrium, or maximum entropy.

Sleep well, however. Such an ending lies far beyond the human temporal imagination.

Our sun, a middle-aged star at roughly 4.6 billion years old, has another four or five billion years ahead of it. Before its quiet extinction, it will grow hotter and brighter, boiling away Earth’s oceans and transforming the planet into a scorched desert—an irreversible greenhouse effect.

What truly unsettles me is the possibility that other universes may have existed before our own. Classical Big Bang theory posited a singular origin—a point of infinite density from which space and time themselves emerged. More recent theories challenge this view, suggesting instead that our universe may be one episode in an endless cycle of expansion and contraction.

The mystery deepens further. Might other universes exist now, alongside our own? Many physicists think so. If space extends infinitely beyond the observable horizon—currently about 46 billion light-years in radius—there may be regions forever beyond our capacity to detect, perhaps governed by laws of physics unlike our own.

The takeaway is not randomness, but recurrence: a cosmos governed by patterned transformation—birth, death, and regeneration repeating across unimaginable scales.

The end of Earth, and even of our universe, would not mark finality, but transformation.

Our suffering arises from clinging to permanence. The Buddha may have intuited this truth 2,500 years ago: reality is not static but dynamic—endless flux, expansion and contraction. Modern physics echoes the insight in the laws of thermodynamics: energy is neither created nor destroyed, only transformed; and all systems tend toward entropy.

We live, then, not in a fragile accident, but in a universe shaped by the regularity of change itself.

–rj

Ken Burns’ The American Revolution

I’ve finished watching Ken Burns’ six-part series, The American Revolution, and I think it brilliant: through letters, paintings, actual locales, staged reenactments, and historians’ insights, it renders a reasoned, balanced portrait of the genesis of a new nation.

In watching it, I’ve found myself unlearning the version of American history I absorbed in school—one that portrayed the country as born purely of promise, while minimizing its foundations in slavery and the seizure of Indigenous lands.

I hadn’t realized, for instance, that the Revolution was in effect America’s first civil war: nearly 20 percent of the population sided with Britain as Loyalists. Atrocities occurred on both sides—burned homesteads, pillage, and widespread rape.

George Washington emerges as essential to the colonies’ improbable victory over seasoned British troops, often intuitive, and when necessary, boldly improvisational—especially in his surprise attack on the Hessian garrison at Trenton.

He also fought with chronically scarce resources, including men and weapons. Smallpox devastated his ranks until the ever-practical Washington ordered mandatory inoculation for the entire Continental Army. For this, and much more, he merits the accolade “the father of our country.”

The second sentence of the Declaration of Independence remains, for me, among the greatest ever written: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”

It has also, as historian Jill Lepore points out in These Truths, proved an instrument of exclusion, its author Thomas Jefferson—like Washington—owning hundreds of enslaved people and enjoying immense wealth.

The Declaration is a document of soaring ideals and deep compromises, and we live with those contradictions still—roughly a third of America’s wealth held by the top one percent, and inequities woven through our social and economic life.

The American Revolution, then, is best understood as a work in progress. It inspires hope that we can do better—and in some respects, we have—though much remains unfinished.

While the Revolution’s principal architects—Jefferson, Adams, Franklin—were men of the Enlightenment who trusted reason to guide human flourishing, the war itself was largely fought by working-class coalitions, many lured by the promise of 100 acres of land taken from Indigenous nations.

Burns isn’t receptive to the argument advanced by the 1619 Project—that American history truly begins with the arrival of enslaved Africans in 1619, and that the Revolution was in part propelled by Southern fears that Britain would eventually abolish slavery.

We do see, however, that the Dunmore Proclamation (1775)—offering freedom to enslaved people who joined the British cause—galvanized Southern resistance. Yet Britain itself, as Burns points out, was hardly on the brink of abolition, its Caribbean wealth built on massive slave labor. The proclamation was simply a political ploy: Dunmore himself owned many slaves, and slavery would endure in the Empire for another sixty years.

France entered the conflict in 1777, driven not by idealism but by a desire to avenge its humiliation at Britain’s hands and to reclaim lost influence. Without French military and financial support, the colonies almost certainly would have remained British dominions. By this point, the Revolution had become a global conflict, fought on many fronts.

Part V turns to Valley Forge, outside Philadelphia—the de facto capital of the newly united colonies. There the Revolution reached its nadir: troops half-starved, poorly clothed, ill-housed, and undersupplied as a brutal winter descended, the Congress unable for months to pay the troops. Many died. Many deserted.

With spring, the French presence was felt, dividing British resources. By 1781, the British had suffered a massive defeat at Yorktown at the hands of a combined force of American troops and the French fleet, which blocked their escape by sea. A peace treaty, however, would not come until 1783.

The war left the new nation weak and divided, its economy wracked by inflation, a huge national debt, and resentful farmers who bore much of the burden. In western Massachusetts, roughly a thousand of those farmers rose in insurrection before being put down by militia. That very weakness, however, led to the Constitutional Convention of 1787, which defined American governance with its checks and balances under the Constitution.

Women and slaves were, nonetheless, still excluded from the democratic franchise, and Indigenous lands were seized with violent alacrity.

Washington emerges as the series’ hero, not only innovative on the battlefield despite scant resources, but committed to democratic rule in resigning his military commission at war’s end.

The series’ central insight is that while the Revolution promised a nation unlike any other, that promise survives only through continual reengagement.

It merits wide viewing: a masterpiece deserving of the highest praise.

—rj

A Legacy of Righteous Minds

Existence exerts a randomness in its distribution of fate. The wicked, as Job tells us, often live long, escaping their misdeeds with impunity; the just and talented often find their lives curtailed amid their greatest promise.

The list of those I deem the “righteous”—those who’ve especially influenced who I am, the values I embrace, and my hopes for a better human future, yet were taken from us early (their age at death in parentheses)—includes Princeton sage Walter Kaufmann (59), biologist Stephen Jay Gould (60), astronomer Carl Sagan (62), science fiction writer Octavia Butler (58), essayist and novelist George Orwell (46), political sage and philosopher John Stuart Mill (66), and, not least, poet Gerard Manley Hopkins (44).

I’m tempted to write a series of extended separate tributes to each of them in Brimmings, but will limit my commentary for now.

I was in my early twenties, a college student just out of the military, when I somehow came upon Walter Kaufmann’s The Faith of a Heretic (1961), which I’m re-reading now. He was the first to admonish me to accept only the empirical in the quest to discern the probable, to find the courage to change course, and to live daringly: “The question is not whether one has doubts, but whether one is honest about them.”

Evolutionary biologist Stephen Jay Gould impressed me early with his incisive, scintillating prose endowed with grace, teaching me that science is not simply the pursuit of facts but a way of thinking that enlarges one’s humanity. Life is a by-product of chance and contingency: “Human beings arose, rather, as a consequence of thousands of linked events, none of which foresaw the future.”

Astronomer Carl Sagan demanded the imprimatur of evidence for any accepted belief. Rationality demands we not cloister ourselves in cultural hand-me-downs, and that extraordinary beliefs merit skepticism. Compromising truth invites demagoguery and superstition’s advance: “We make our world significant by the courage of our questions and the depth of our answers.”

African-American writer Octavia Butler has been a remarkable recent read, author of eleven standout science fiction novels resonant with urgency about the systemic collapse of ecosystems under climate change. Her Parable of the Sower, a must-read, has proven chillingly prescient. Change is life’s inevitability, morally indifferent, demanding adaptability to survive: “Human beings fear difference, and they fear it so deeply that they will not only oppress but destroy what they see as different.”

George Orwell, well known for his clairvoyant 1984, has always impressed me with the clarity of his writing, achieved through disciplined study, and with his wariness of manipulative despotism and its stratagems of verbal deceit such as “doublespeak”—timely and precise warnings against euphemism and abstraction: “Political language… is designed to make lies sound truthful and murder respectable.”

John Stuart Mill, “the saint of rationalism,” remains a seminal influence, ahead of his time, a champion of classical liberalism and its advocacy of the minority’s right to dissent. He taught me about nature’s indifference and logic’s necessity in a world absent of revelation. I return to him repeatedly for wisdom and inspiration: “If all mankind minus one, were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person, than he, if he had the power, would be justified in silencing mankind.”

Gerard Manley Hopkins has long been my most esteemed poet, with his vibrant “sprung rhythm,” latent with emotion, a passion for nature and for those who suffer—so many—life’s inequities. His poetry sings, reenacting experience through the sensory, capturing the essence of all things. A Jesuit priest, he never resolves the problem of suffering by resorting to a cozy theodicy or sentimentality, yet he helps make it endurable: “I wake and feel the fell of dark, not day.”

I am forever grateful for their stalwart witness to life’s truths. Their lives argue by example rather than by system—that meaning is neither guaranteed by justice nor extinguished by its absence. Fate distributes arbitrarily; conscience does not.

—rj