I am not partisan when it comes to politics. I support those whom I believe best represent the interests of our country and its democratic institutions.
Unfortunately, as many of us recognize, we are now in imminent danger under an obviously incompetent, self-centered, reactionary president whose conduct increasingly places personal interests above national stability and global responsibility.
Not only has he launched an unnecessary and destabilizing conflict, he now appears willing to offer Iran terms that verge on appeasement in his haste to bring the crisis to an end—without meaningful guarantees regarding nuclear enrichment, missile delivery systems, or Tehran’s support for terrorist proxies throughout the region.
This is a pariah regime that brutalizes its own people under the cloak of religion, yet the administration’s posture suggests geopolitical calculation rather than principled strategy, particularly with China looming in the background ahead of his upcoming visit there.
Trump has long displayed an affinity for authoritarian leaders such as Putin and Xi Jinping, often showing greater respect for strongmen abroad than for democratic allies at home.
Meanwhile, climate change is ignored, Ukraine is increasingly abandoned, FEMA and Medicaid are weakened, violence spreads across international shipping lanes, and the economy shows growing signs of strain. The list is long, and the cumulative effect is profound national instability.
Midterm elections are still six months away, while this presidency has another two and a half years remaining. Congress, despite its constitutional responsibilities, appears unwilling to act.
The ship of state is listing badly, and unless the country changes course, this president may take us all down with him.
Why some of the world’s most documented abuses fail to generate sustained global response
Iran, by almost any measure, ranks among the world’s most repressive regimes, much of its coercive power cloaked in religious authority.
It has not hesitated to use lethal force against its own citizens and consistently ranks among the highest globally in executions and political imprisonment (Amnesty International, 2023; United Nations Human Rights Council, 2024).
In the aftermath of the 1979 revolution (1979–1981), mass executions targeted former officials of the Shah’s government, military personnel, and political opponents.
One of the most notorious episodes occurred in 1988, when Ayatollah Khomeini issued a secret fatwa ordering the execution of political dissidents, primarily leftists. Conservative estimates range from 3,000 to 5,000 victims, though some human rights organizations place the toll closer to 30,000. The state has never acknowledged these killings; mass graves have been concealed, and public mourning by families has been forbidden (Human Rights Watch, 2024; Amnesty International, 2023; UN Special Rapporteur reports on Iran).
More recently, during the 2019 protests triggered by fuel price increases, security forces killed approximately 1,500 people within days (Amnesty International, 2020).
In 2022–2023, the death of Mahsa Amini, arrested by the Morality Police for alleged hijab violations, sparked nationwide protests. Authorities claimed she died of a heart attack, but eyewitness accounts and leaked medical evidence suggested severe physical abuse. At least 415 protesters were killed, including around fifty children (Human Rights Watch, 2023; IranWire, 2026).
Reports of an even more extreme crackdown emerged following protests beginning in late December 2025. On January 6, 2026, the government reportedly shut down internet access to obscure its response, culminating in a major escalation on January 8.
Official figures claimed 3,117 deaths, though these are widely disputed. The Human Rights Activists News Agency estimates approximately 6,800 civilians killed, with thousands more detained (UK Parliament Commons Library, 2026).
Further reporting suggests even higher numbers, with intelligence-linked documents alleging over 36,000 deaths in just two days (Iran International, 2026). If accurate, this would represent one of the deadliest short-term crackdowns in modern history.
Analysis of verified victim lists indicates that 16.1% of those killed were under 18, drawn from across Iranian society (IranWire, 2026).
Beyond Human Targets
The regime’s record of violence extends beyond human rights.
In July 2022, more than 1,700 dogs were reportedly killed during a raid on the Gandalf Dog Shelter near Damavand. The operation was documented with photos and video, and the animals were reportedly vaccinated and sterilized, undermining any public health justification (Al Bawaba, 2022; One Green Planet, 2022). Fire trucks were deployed afterward to wash away the evidence. Several animal rights advocates who attempted to reach the shelter were arrested (Iran International).
The response from major Western animal rights organizations was, by any measure, shameful. PETA issued a tweet or two. The Humane Society, the ASPCA, and World Animal Protection said nothing of substance. There were no sustained campaigns, no petitions with celebrity endorsements, no diplomatic pressure on Western governments to raise the matter with Iran.
Animal organizations are genuinely wary of having their condemnations exploited by hawkish political actors to justify sanctions or military posturing toward Iran. While this is not an entirely unreasonable concern, it produces a paralysis in which documented atrocities go unaddressed for fear of political repercussions.
Dog ownership itself is increasingly restricted in Iran. Municipal bans now exist in eleven Iranian cities, forbidding dog-walking in public spaces (Iran International, 2025; Wikipedia, n.d.).
A proposed law—“Protection of the Public’s Rights Against Animals”—would require government permits for pet ownership, even for common animals such as cats and rabbits (Network for Animals, 2021).
Part of the rationale offered by authorities draws on religious interpretation. In many traditions of Islamic jurisprudence, dogs are considered ritually impure (najis), particularly in relation to saliva, based on certain hadith literature. While the Qur’an does not prohibit dogs and even depicts them in functional roles such as hunting and guarding, later legal interpretations in some schools of thought discouraged close domestic contact.
Importantly, these interpretations are not uniform across the Muslim world. In many Muslim-majority societies, dogs are widely kept as pets or working animals. However, in the Islamic Republic of Iran, clerical authorities have drawn on stricter interpretations to reinforce social stigma and justify periodic enforcement campaigns.
The result is a convergence of ideology and governance: religious framing reinforcing state control over private life, including animal ownership.
A Pattern of Silence
Despite extensive documentation, international response has often been uneven.
Many large Western NGOs lean left politically, and the Iranian regime has long been framed, however implausibly, as a victim of Western imperialism and sanctions. Forceful condemnation of Iranian state conduct sits uncomfortably within that worldview.
The same dynamic has led Western feminist organizations to largely sidestep sustained criticism of Iran’s mandatory hijab laws. The tension between anti-imperialism and universal rights, whether human or animal, is never cleanly resolved, and Iran consistently receives the benefit of the doubt.
This selective principle of bearing witness has a broader parallel. Holocaust photographs—mass graves, liberation of the camps—are reproduced in textbooks and museums on the entirely defensible grounds that visual evidence is essential to bearing witness and resisting denial. Yet equivalent documentation from contemporary atrocities is routinely suppressed or ignored. The principle turns out to be applied not universally, but according to whose suffering fits the prevailing political narrative.
“When it comes to Iran, nothing matters.” — Marjan Keypour (Jerusalem Post, 2022)
The result is a selective visibility of suffering—where political context shapes not only diplomatic response, but also moral attention.
Conclusion
From the mass executions of the 1980s to recent protest crackdowns and ongoing social controls, the record points to a sustained pattern of repression.
Yet international attention remains intermittent.
What stands out is not only the severity of the abuses, but the unevenness of the world’s willingness to consistently see them.
In moments of political outrage, we often hear that some “moral compact” has been broken between the government and the people, as though public life rests on an understood promise of honesty and good faith. It is a comforting idea. It’s also, I think, a naïve one.
The record we inherit suggests otherwise. Bill Clinton misled the public under oath. John F. Kennedy and Lyndon B. Johnson deepened American involvement in the Vietnam War under a widening credibility gap. Franklin D. Roosevelt withheld the full truth of his condition from the electorate. These are not anomalies. They are reminders.
None of this excuses Donald Trump, whose conduct stands as an extreme case in its brazenness. But it does suggest that the deeper problem is not the breaking of a compact, but our belief that such a compact has ever governed political life in more than name.
Niccolò Machiavelli understood that those who govern must often act against truth while preserving its appearance. Thomas Hobbes argued that we submit to authority not because it is virtuous, but because it is necessary. Between them, the so-called compact begins to look less like a foundation than a useful fiction, one that steadies public faith even as it obscures political reality.
If that is so, then the question is not how to restore a moral politics, but how much of our moral life we should ever have entrusted to politics in the first place.
Here Henry David Thoreau offers a necessary restraint: that government is best which governs least, not because it is especially good, but because it is always liable to be otherwise.
And Wendell Berry reminds us, more quietly, that the work of responsibility does not belong first to governments at all, but to persons, living within limits, bound to places, accountable to one another in ways no distant authority can finally secure.
We have asked too much of politics, and in doing so, we have misunderstood it.
Power does not keep faith; it manages necessity. It persuades, it conceals, it endures. It cannot bear the weight of the moral order we would like to rest upon it.
That burden remains where it has always been—closer to home.
Not in the abstractions of the state, nor in the promises of those who govern, but in the small, stubborn practices of truthfulness and care: in what we say, what we refuse to say, what we permit, and what we will not.
If there is any compact worth defending, it is not the one we imagine between ourselves and power. It is the one we keep, or fail to keep, with one another.
On February 28, 2026, the United States and Israel launched coordinated aerial strikes on Iran, ostensibly to induce regime change and secure stability in the Middle East.
Central to this strategy was the expectation of a mass popular uprising—something glimpsed in January 2026, when tens of thousands of unarmed civilians took to the streets and were met with lethal force. Estimates suggest as many as 30,000 were killed, with 7,000 independently confirmed. The regime they opposed—a repressive Islamic theocracy entrenched since 1979—remains intact.
This is a war America is unlikely to win.
Despite the destruction of command centers, arsenals, and the targeted killing of senior leadership, including Supreme Leader Ayatollah Ali Khamenei, Iran has demonstrated a capacity for resilience that was either underestimated or ignored.
Its response has been asymmetric and expansive: ballistic missiles and drone strikes aimed not only at Israel but at a widening circle of nations hosting American bases—Qatar, the United Arab Emirates, Kuwait, Jordan, and Saudi Arabia. Recent launches have extended even farther, toward Crete, Turkey, and the joint UK–U.S. base on Diego Garcia in the Indian Ocean, 2,400 miles distant.
At such range, the perimeter of vulnerability shifts. Southeastern Europe comes into view; with further technological refinement, even cities such as Rome or Berlin may not remain beyond reach; within a decade, perhaps the United States itself.
As this conflict widens, its economic consequences are already apparent. Iran’s disruption of the Strait of Hormuz, through which roughly one-fifth of the world’s oil supply passes, has driven prices upward, with projections rising sharply. The leverage is stark: Iran need not defeat the United States militarily to impose severe costs. It need only prolong the conflict.
The political implications are equally stark. Domestic opposition is mounting here at home, shaped less by geopolitics than by inflation, felt daily in grocery bills and at the gas pump. A war that amplifies those pressures becomes difficult to sustain, regardless of its stated aims.
Even proposed escalations such as a seizure of Kharg Island, which handles a significant portion of Iran’s oil exports, risk becoming symbolic victories at disproportionate cost.
The comparison to Iwo Jima is not misplaced: a tactical gain unlikely to alter the strategic reality. Much of Iran’s missile infrastructure remains embedded and protected, beyond the reach of conventional assault.
Recent Trump statements suggesting that U.S. objectives have largely been met signal a search for an exit. Yet an off-ramp may not be readily available. Iran’s advantage lies in time. By sustaining pressure, economic as much as military, it can compel concessions without decisive confrontation.
There was another path.
Rather than precipitating war, a strategy of containment through sanctions and patience might have allowed the regime to atrophy under the weight of its own contradictions. Iran faces converging crises: acute water scarcity, environmental degradation, declining agricultural productivity, economic duress, and deep internal dissent. A large portion of its population—diverse, young, and increasingly disillusioned—has already demonstrated a willingness to demand change at great personal risk.
That internal pressure, not external force, may have proven the more decisive agent of transformation.
Talks between the United States and Iran continue this week in Geneva.
When desperate Iranians poured into the streets demanding regime change, President Trump pledged that help was on the way, only to retreat. In the aftermath, credible human rights organizations report that thousands of protesters were killed and many more arrested in a sweeping crackdown designed to extinguish dissent.
Now it appears the administration has chosen to defer confrontation in favor of negotiation. The pattern is familiar: pressure followed by pursuit of a deal. As in Venezuela, where engagement and economic overtures failed to dislodge a repressive regime despite widely disputed elections, the governing apparatus endured.
Tehran understands the language of leverage and Trump’s susceptibility to profitable business deals. It has reportedly floated the prospect of lucrative energy, mining, and aircraft agreements in exchange for relief from punitive economic sanctions. Yet the offer comes with a critical caveat: Iran would modify, but not relinquish, its uranium enrichment and leave untouched its ballistic missile programs.
Such an arrangement would not dismantle the regime’s capacity for repression at home or projection of force abroad. It would instead stabilize a government that has spent nearly half a century suppressing its people and destabilizing the Middle East, while edging ever closer to nuclear weapons capability and refining the missiles to deliver them.
Yes, help may indeed be on the way, but not for Iran’s beleaguered citizens if sanctions are lifted without fundamental change. It will be relief for the very regime that has kept them in chains.
Painting: John Trumbull, George Washington resigning his commission as Commander-in-Chief of the Continental Army.
I grew up in an America that celebrated George Washington’s February 22 birthday as a national holiday, signed into law by President Rutherford B. Hayes in 1879 as a holiday for federal workers in the District of Columbia and extended to all federal offices in 1885.
Things changed in 1968, when Congress passed the Uniform Monday Holiday Act, providing longer weekends and consolidating observances. The holiday was moved to the third Monday in February, though its legal name remains Washington’s Birthday; some members of Congress proposed including Lincoln’s February 12 birthday under the broader aegis of Presidents’ Day.
That broader title caught on. Retailers and state governments increasingly adopted “Presidents’ Day.” Like so many holidays, it became a corporate sales opportunity.
Accordingly, the first item in my email this morning was a Presidents’ Day furniture sale—the first of what will no doubt be a barrage as the third Monday approaches.
Historian Jonathan Horn finds it absurd that we no longer distinctly celebrate Washington’s Birthday, given that it was observed in the young nation even before the framers of the Constitution met in 1787 (“Just Call It Washington’s Birthday,” Free Press, Feb. 11, 2026).
We owe much to this exemplary leader. Ken Burns credits Washington as “the glue that held it all together” in his PBS documentary The American Revolution. Facing superior, disciplined British forces, Washington understood that victory would require patience: knowing when to retreat, striking unexpectedly, and prolonging the war.
After defeats in New York, expiring enlistments, and desertions, matters reached their nadir during the winter encampment of 1777–78 at Valley Forge, outside British-occupied Philadelphia. Starvation loomed. Smallpox ravaged the ranks. Soldiers were unpaid, underfed, and poorly clothed.
With sagacity, Washington enlisted the Prussian officer Baron von Steuben, who molded the army into a disciplined fighting force. Recruits followed in greater numbers.
After General Gates’ victory at Saratoga, Washington was able to engage French support, working through liaison with the Marquis de Lafayette.
The decisive blow came when French naval forces blocked Cornwallis’s escape at Yorktown. Washington had deceived Cornwallis into believing New York was his objective while covertly moving his troops south. The British capitulated, leading ultimately to peace in 1783.
Washington had endured criticism without vindictiveness, even surviving a mutiny threat by disgruntled, unpaid officers.
For Burns, Washington’s greatest moment was not Yorktown or Trenton, but his resignation of his commission in Annapolis in 1783. As Burns tells it, he “knew how to defer to Congress, knew how to inspire ordinary people in the dead of night, knew how to pick subordinate talent—just had a kind of presence to him that, without him, we don’t have a country” (Chadwick Moore, New York Post, Nov. 11, 2025).
He did something similar in refusing a third presidential term. His 1796 Farewell Address remains prescient in its warnings against partisanship, permanent foreign alliances, sectionalism, and constitutional usurpation: “Let there be no change by usurpation; for though this, in one instance, may be the instrument of good, it is the customary weapon by which free governments are destroyed.”
Yet Washington has come under fierce attack, criticized for slave holding and judged by contemporary moral standards. Some view him primarily as a symbol of racial oppression and seek removal of his name and likeness from public spaces.
In the aftermath of George Floyd’s killing in 2020 and the ensuing unrest, efforts accelerated to eliminate reminders of racial injustice, including monuments to Washington, Jefferson, Lincoln, Theodore Roosevelt, and Woodrow Wilson. Streets, buildings, and schools were renamed; statues toppled or defaced.
In Portland in 2020, protesters toppled and defaced a statue of Washington in a public park.
At the University of Washington, protesters called for removal of his statue.
In 2020, a statue of Washington at George Washington University was beheaded.
In 2021, the San Francisco Board of Education voted to rename a school honoring Washington, later reversing course after public scrutiny.
A Washington, D.C. working group, commissioned amid racial justice protests, recommended reviewing public names and monuments, suggesting federal sites be reconsidered for contextualization or renaming.
While some of this fury can be understood as anger over longstanding injustice, historian Howard Zinn argues in A People’s History of the United States that Washington’s mythic stature obscures his slaveholding and his violent campaigns against Indigenous peoples. Jill Lepore, in These Truths, likewise underscores the inseparability of his leadership from slavery.
It is painful to read, but much rings true.
Burns recounts Washington’s 1779 campaign against the Iroquois, ordering the destruction of settlements in retaliation for their alliance with the British: “Lay waste all the settlements around… that the country may not be merely overrun, but destroyed.” Towns and crops were burned. Many perished in the ensuing winter from famine and disease.
America was founded on a compromise that would lead to civil war and immense loss of life. Our history is marked by both courage and cruelty, liberty and bondage. We diminish ourselves if we pretend otherwise.
But we also diminish ourselves if we forget the magnitude of what was achieved: a fragile republic wrested from empire, sustained not by perfection but by discipline, restraint, and the voluntary surrender of power.
We are a nation still struggling to reconcile our ideals with our conduct. The work of ordered liberty, of constitutional self-government, of moral reckoning without erasure, remains unfinished.
The Revolution continues—not in the toppling of statues or canceling history, but by whether we can tell the truth about our past without losing the capacity to honor it.
On winter mornings, before the day has decided what it will become, the fields hold a stillness that feels provisional—frost clinging to the grass, fence lines darkened with damp, the land waiting without impatience. It is a good hour for reading slowly, for choosing words that do not hurry ahead of their meanings.
I have begun the year reading Wendell Berry. Now in his ninety-second year, he continues—more slowly, more deliberately—to farm and to write, unchanged in his fidelity to limits: the authority of place over abstraction, the moral claims of the local over the corporate, tradition understood not as nostalgia but as knowledge earned through use and endurance.
I read him most mornings. His work steadies the day. It does not offer solutions so much as orientation—toward what is given, what is sufficient, and what must be borne. Berry has always made room for joy, but never without sorrow, nor for hope without the acknowledgment of failure, including one’s own.
Some of his most influential prose appeared early, when his voice was still finding its public footing. The Long-Legged House and The Unsettling of America argued, quietly and insistently, that culture and agriculture are inseparable, and that when land is treated as commodity rather than community, both soil and people are diminished.
I return often to his poetry, especially A Timbered Choir: The Sabbath Poems. Written on Sundays and largely free of polemic, these poems are acts of attention. They move patiently through the stages of a human life—birth, labor, love, diminishment—offering a sacramental vision of ordinary days lived close to the ground. Among them is Berry’s most widely known poem, “The Peace of Wild Things,” whose calm acceptance of life’s ephemerality offers not escape from anxiety, but release from the burden of false mastery:
“I come into the peace of wild things who do not tax their lives with forethought of grief.”
The peace the poem offers is not consolation so much as proportion. Its discipline lies in relinquishing the anxious reach into the future and reentering creaturely time—where life is finite, local, and sufficient.
That same discipline governs Berry’s essay “Why I Am Not Going to Buy a Computer,” first published in 1987 and often misread as a rejection of technology itself. It is instead a meditation on the moral weight of tools. Berry does not deny their usefulness; he questions their claims. Certain technologies, he suggests, quietly privilege speed over deliberation and convenience over care, reshaping habits of attention until efficiency becomes an unquestioned good.
The good life, in Berry’s accounting, is not optimized. It is inhabited. To live well requires learning the difference between what is necessary and what merely promises ease.
Barbara Kingsolver, another Kentuckian, names this work plainly when she writes:
“I consider it no small part of my daily work to sort out the differences between want and need. I’m helped along the way by my friend Wendell, without his ever knowing it. He advises me to ask, in the first place, whether I wish to purchase a solution to a problem I don’t have.”
Berry’s essay is not finally about computers at all. It is about scale and consequence. It asks not simply what a tool can do, but what it may undo—what forms of patience, responsibility, and mutual care it quietly displaces. It asks how our choices shape our relationships to family, to community, and to the land that sustains both.
Berry still writes with pencil on a yellow legal pad. He still farms, though within the limits age imposes. He still publishes—new poems, even a recent novel. The persistence itself feels instructive.
In a culture bent on expansion and acceleration, Berry’s life suggests another measure of success: fidelity to place, restraint in use, and the long patience required to learn what is enough.
I’ve been absent from Brimmings for nearly a week, recovering from a serious bout with the flu. The fever lingered for ten days, and a persistent cough remains my daily companion.
That hasn’t stopped me from reading—slowly, attentively—six books already this year.
As I’ve previously shared, alongside my annual eclectic reading list, I’ve committed to a topical approach to reading as a way of resisting intellectual grazing and cultivating sustained attention (Topical Reading). I’ve begun with Kentucky sage Wendell Berry, now in his ninety-second year.
I didn’t want to one day come upon his obituary and feel the guilt pangs of having neglected an agrarian pacifist, a champion of the local, often described, without much exaggeration, as America’s “moral conscience.”
Berry has farmed a 125-acre hilly tract along the Kentucky River near Port Royal in Henry County, Kentucky, for more than forty years. Farming, for him, is not metaphor but moral practice. As he writes, “The care of the Earth is our most ancient and most worthy, and after all our most pleasing responsibility.”
Academically, Berry is no lightweight: he holds a BA and MA in English from the University of Kentucky, held a Stegner Fellowship at Stanford, and won a Guggenheim that took him to Italy. He taught briefly at New York University before returning—against the counsel of colleagues who believed he was jettisoning a promising academic career—to rural Kentucky and the family farm.
They were wrong.
Berry has since written more than fifty books spanning essays, novels, and poetry. His great theme is stewardship—not management or control, but reverent care. “The idea that people have a right to an economy that destroys nature is a contradiction,” he writes, insisting that economic life must answer to ecological reality.
For the farmer Berry, stewardship begins with the soil: an antipathy to chemicals, a reverencing of the biosphere, and a life lived according to natural rhythms. He is deeply opposed to industrial agriculture, which he regards as a cultural as well as ecological calamity: “Industrial agriculture is not just bad for farmers; it is bad for land, for rural communities, and ultimately for culture.”
Among American environmental writings, the two most salient works I’ve encountered are Thoreau’s Walden (1854) and Rachel Carson’s Silent Spring (1962). Thoreau’s aphoristic brilliance lends itself to endless quotation: “Our life is frittered away by detail… Simplify, simplify,” while Carson’s prose approaches poetry. Her opening paragraphs of Silent Spring remain, to my mind, the finest in environmental literature, exposing the arrogance behind what she called “the control of nature, a phrase conceived in arrogance, born of the Neanderthal age of biology and philosophy.”
I’m only in the early stages of getting acquainted with Berry, but he keeps distinguished company with Thoreau and Carson in his passion for preserving nature’s bounty and the pulchritude of a simplified life lived in fidelity to place and community.
In this sense, Berry reaches back to Thomas Jefferson, whom he quotes more than any other figure: “In my own politics and economics I am Jeffersonian.” Jefferson believed liberty was best secured in small, decentralized communities of independent producers, warning that distant power—whether governmental or economic—inevitably corrodes responsibility and freedom.
Though Berry was an activist who vehemently opposed the Vietnam War and has voted Democratic, his politics resist easy classification. He has lamented that America’s two major parties have grown increasingly to resemble one another.
There may appear, at first glance, to be overlap with libertarianism—his opposition to big government, military expansion, and imperial intervention—but the resemblance is superficial. Libertarianism exalts the autonomous individual; Berry emphasizes communal obligation. “We do not have to sacrifice our economic well-being in order to act responsibly toward our land and our neighbors,” he writes. “Rather, we must do so in order to preserve our economic well-being.”
Berry has his critics. His suspicion of technology strikes some as untenable in a hungry, overpopulated world. Can an aggregate of small family farms feed a wired and burgeoning global population, particularly in parts of Africa?
I find myself grappling with his apparent parochialism. Only a tiny fraction of Americans now farm. What of the rest of us who earn our livelihoods elsewhere? And in an interconnected age, can the local truly stand apart from the global?
Berry would respond that the issue is not technology itself, but dependence. “There is a difference between being technologically advanced and being technologically dependent,” he reminds us—a distinction too often elided in contemporary debates.
Ironically, Berry would fit comfortably in an Amish community. He still plows with horses. He owns no computer, television, or mobile phone, and has no internet access. He writes first in pencil, then types. He uses electricity sparingly, supplemented by solar panels, and his writing studio is without electricity. He walks the talk, living a life rooted—quite literally—in the land. Thoreau would have approved.
An iconoclast, Berry remains well worth reading. Growth, he reminds us, is not synonymous with the earth’s welfare. Economies, like soils, can be exhausted. Big government and industrial systems, he argues, erode local responsibility, foster dependency, and inflame military and international tensions. Rural poverty in places like Appalachia persists, in his view, because urban prosperity has been purchased by the plundering of these regions.
In 2013, President Barack Obama awarded Berry the National Humanities Medal.
In 2015, he became the first living writer inducted into the Kentucky Writers Hall of Fame.
That same year, the Library of America published a boxed set of his work—an honor accorded to only two living American writers at the time.
Berry may be impractical. He may be impossible to scale. But he leaves us with an uncomfortable and necessary reminder: care, once abandoned, is not easily restored—and neither are the land, the culture, nor the communities that depend upon it.
Did you know that the average CEO compensation at large U.S. public companies now stands at roughly 280 times the pay of a frontline worker?
That represents a staggering shift from the 1960s, when CEOs earned 20 to 30 times what their workers made. Since the 1970s, the CEO-to-worker pay ratio has increased by over 1,000 percent.
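The "over 1,000 percent" figure follows directly from the essay's own numbers. As a quick sketch (using the midpoint of the cited 1960s range as a baseline, an assumption for illustration):

```python
# Illustrative arithmetic behind the essay's figures.
# The baseline of ~25:1 (midpoint of the cited 20-30:1 range) and the
# current ratio of ~280:1 are the essay's own rough estimates.
ratio_1960s = 25
ratio_today = 280

percent_increase = (ratio_today / ratio_1960s - 1) * 100
print(f"Growth in CEO-to-worker pay ratio: {percent_increase:.0f}%")
# prints "Growth in CEO-to-worker pay ratio: 1020%"
```

A 280:1 ratio against a 25:1 baseline works out to roughly a 1,020 percent increase, consistent with the "over 1,000 percent" claim.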
This divergence did not occur by accident. One pivotal change came in the late 1970s, when American corporations moved away from a model centered on growth, stability, and shared prosperity toward one focused on maximizing shareholder value. Executive pay was increasingly tied to stock price rather than the long-term health of the firm.
With the rise of stock options and equity grants, CEOs could reap enormous rewards without raising wages, expanding productivity, or strengthening the workforce. Compensation ballooned even when companies stagnated.
Tax policy amplified the effect. In the 1950s and 1960s, top marginal income tax rates ranged from 70 percent to more than 90 percent, effectively discouraging runaway executive pay. That restraint largely disappeared in the 1980s, as marginal rates fell sharply, making extreme compensation both legal and cheap.
At the same time, labor power collapsed. Union membership declined, offshoring and automation accelerated, and job security eroded. Productivity rose; worker wages did not. Executive compensation absorbed the gains.
Business leaders defend this system by claiming that outsized pay is necessary to attract top talent. In practice, this has produced a self-perpetuating escalation, as boards benchmark CEO pay against ever-rising peer averages. In a globalized economy, profits flow upward, not outward.
Yet America’s extreme CEO-worker wage gap is not an inevitable feature of advanced capitalism.
Consider international comparisons:
Typical CEO-to-worker pay ratios (large firms):
United States: ~250–350:1
Western Europe: ~40–90:1
Japan: ~15–40:1
In much of Europe, workers sit on corporate boards, restraining excess. In Japan, adopting the American compensation model would be seen as collective irresponsibility, not enlightened management.
Public anger is justified—especially amid persistent inflation and decades of wage stagnation. Can the old restraints return?
There are tentative steps. The Tax Excessive CEO Pay Act of 2025, introduced by Rep. Rashida Tlaib and Sen. Bernie Sanders among others, would raise corporate tax rates on companies whose CEO pay exceeds worker pay by extreme margins, beginning at 50-to-1. But meaningful reform would require broad coalitions and a substantial shift in Congress. Change, if it comes, will be slow—and uncertain.
Transparency may be the public’s strongest immediate tool.
What has happened in America is not merely an economic evolution; it is a moral shift. Accumulation has replaced public responsibility as the dominant ethic, not only in corporate life but across society. Its most vivid emblem is the twice-elected billionaire president, Donald Trump, whose politics celebrate wealth while dismantling social safeguards.
Since 1990, the number of U.S. billionaires has grown from 66 to more than 800, while the median hourly wage has increased by only about 20 percent.
It is now 2026, with the midterm elections approaching in November. My New Year’s wish is straightforward: the impeachment of Donald Trump—assuming Democrats regain a decisive House majority—followed by a Senate trial resulting in his removal from office.
The Framers of the Constitution were not naïve about power. They were steeped in history’s lessons about its corrupting tendencies and had lived, in their own time, under the despotism of a foreign monarch. Their revolution was not merely against a man, but against unchecked executive authority.
Accordingly, the 55 delegates to the Constitutional Convention of 1787 chose to create a president, not a king. Crucially, they did not rely solely on periodic elections as a safeguard. Recognizing that elections alone might prove insufficient in moments of grave danger, they embedded in the Constitution a remedy for removing a corrupt or dangerous chief executive.
Those impeachment provisions are found in Articles I and II. Article I grants the House of Representatives the sole power of impeachment by majority vote and assigns the Senate the sole power to try impeachments, requiring a two-thirds vote of members present for conviction. Article II, Section 4 defines the standard:
“The President, Vice President and all civil Officers of the United States, shall be removed from Office on Impeachment for, and Conviction of, Treason, Bribery, or other high Crimes and Misdemeanors.”
Impeachment, it bears emphasis, is an accusation; removal requires conviction.
Presidential impeachment trials are exceedingly rare. In more than two centuries of constitutional government, only four have occurred:
Andrew Johnson (1868)
Bill Clinton (1999)
Donald Trump (2020 and 2021)
All four trials ended in acquittal. Richard Nixon almost certainly would have been removed, but he resigned before the House could vote on impeachment—the first presidential resignation in American history.
The rarity of impeachment trials reflects not restraint alone, but the gravity of the remedy. As Alexander Hamilton explained, impeachable offenses are those that violate the public trust—abuses of power that strike at the constitutional order itself.
Measured against that standard, there should be little ambiguity regarding Donald Trump’s “high crimes and misdemeanors.” They include conduct that betrays the nation’s best interests and undermines the rule of law: defiance of judicial orders; the use of the Department of Justice to shield allies and punish perceived enemies; and the deployment of violent rhetoric that incites threats against judges and congressional critics, including calls for the execution of former public servants from the military and intelligence communities.
To these may be added the consistent placation of authoritarian foreign leaders and the initiation of military actions—such as attacks on Venezuela and vessels at sea—without clear congressional authorization.
Conviction in the Senate requires 67 votes. Given political realities, Republicans alone are unlikely to supply them. That leaves responsibility where it has always rested in a constitutional democracy: with the electorate.
If the Constitution is to function as intended—if law is to prevail over personal power—then it falls to citizens to vote in numbers sufficient to make accountability possible. The midterms present such a moment.
Whether the nation seizes it will determine not merely the fate of one presidency, but the durability of the constitutional order itself.