
Monday, March 27, 2023

Americans Pull Back From Values That Once Defined the United States, Poll Finds

I teach this. My son was just saying, "This is nothing new to you." He's right. It's not. But it's cool to have a WSJ article I can share with my students and use in assignments.

See, at Wall Street Journal, "America Pulls Back From Values That Once Defined It, WSJ-NORC Poll Finds: Patriotism, religion and hard work hold less importance."

Sunday, January 1, 2023

The Boomers in the Twilight Zone

Following up, "Three-Quarters of Generation Z 'Not Interested In Sports'."

From Andrew Sullivan, "How exactly are they going to die? And how much choice should they have in it?":

I’m not particularly afraid of death. But I’m afraid of dying.

And dying can now take a very, very long time. In the past, with poorer diets, fewer medicines, and many more hazards, your life could be over a few months after being born or moments after giving birth or just as you were contemplating retirement. Now, by your sixties, you may well have close to a quarter of your life ahead of you. In 1860, life expectancy was 39.4 years. By 2060, it’s predicted to be 85.6 years. This is another deep paradigm shift in modernity we have not come close to adapting to.

For some, with their bodies intact and minds sharp, it’s a wonderful thing. But for many, perhaps most others, those final decades can be physically and mentally tough. Increasingly living alone, or in assisted living or nursing homes, the lonely elderly persist in a twilight zone of extended, pain-free — but not exactly better — life.

We don’t like to focus on this quality-of-life question because it calls into question the huge success we have had increasing the quantity of it. But it’s a big deal, it seems to me, altering our entire perspective on our lives and futures. Ricky Gervais has a great bit when he tells how he’s often told to stop smoking, or eat better, or exercise more — because leaving these vices behind will add a decade to his life. And his response is: sure, but the wrong decade! If he could get a decade in his thirties or forties again, he’d take it in an instant. But to live a crepuscular experience in your nineties? Not so much. “Remember, being healthy is basically just dying as slowly as possible,” he quipped. Not entirely wrong.

Anyone who has spent time caring for aging parents knows the drill: the physical and then the mental deterioration; the humiliations of helplessness; the often punitive absorption of drug after drug, treatment after treatment; multiple medicinal protocols of ever-increasing complexity and side effects. Staying in a family home becomes impossible for those who need 24-hour care, and for adult children to handle when they’re already overwhelmed by work and kids. Home-care workers — increasingly low-paid immigrants — can alleviate only so much.

All this is going to get much worse in the next couple of decades as the Boomers age further: “The population aged 45 to 64 years, the peak caregiving age, will increase by 1% between 2010 and 2030 while the population older than 80 years will increase by 79%.” I’ll be among them — on the edge of Gen X and Boomerville.

I mention all this as critical background for debating policies around euthanasia or “assisted dying” (a phrase that feels morbidly destined to become “death-care”). Oregon pioneered the practice in the US with the Death with Dignity Act in 1997. At the heart of its requirements is a diagnosis of six months to live. Following Oregon’s framework, nine other states and DC now have laws for assisted suicide. Public support for euthanasia has remained strong — 72 percent in the latest Gallup.

But this balance could easily get destabilized in the demographic traffic-jam to come. In 2016, euthanasia came to Canada — but it’s gone much, much further than the US. The Medical Assistance in Dying (or MAID) program is now booming and raising all kinds of red flags: there were “10,000 deaths by euthanasia last year, an increase of about a third from the previous year.” (That’s five times the rate of Oregon, which actually saw a drop in deaths last year.) To help bump yourself off in Canada, under the initial guidelines, there had to be “unbearable physical or mental suffering that cannot be relieved under conditions that patients consider acceptable,” and death had to be “reasonably foreseeable” — not a strict timeline as in Oregon. The law was later amended to allow for assisted suicide even if you are not terminally ill.

More safeguards are now being stripped away:

Gone is the “reasonably foreseeable” death requirement, thus clearing the path of eligibility for disabled individuals who otherwise might have a lifetime to live. Gone, too, is the ten-day waiting requirement and the obligation to provide information on palliative-care options to all applicants. … [O]nly one [independent witness] is necessary now. Unlike in other countries where euthanasia is lawful, Canada does not even require an independent review of the applicant’s request for death to make sure coercion was not involved.

This is less a slippery slope than a full-on, well-polished ice-rink. Several disturbing cases have cropped up — of muddled individuals signing papers they really shouldn’t have with no close relatives consulted; others who simply could not afford the costs of survival with a challenging disease, or housing, and so chose death; people with severe illness being subtly encouraged to die in order to save money:

In one recording obtained by the AP, the hospital’s director of ethics told [patient Roger Foley] that for him to remain in the hospital, it would cost “north of $1,500 a day.” Foley replied that mentioning fees felt like coercion and asked what plan there was for his long-term care. “Roger, this is not my show,” the ethicist responded. “My piece of this was to talk to you, (to see) if you had an interest in assisted dying.”

It’s hard to imagine a greater power-dynamic than that of a hospital doctor and a patient with a degenerative brain disorder. For any doctor to initiate a discussion of costs and euthanasia in this context should, in my view, be a firing offense.

Then this: in March, a Canadian will be able to request assistance in dying solely for mental health reasons. And the law will also be available to minors under the age of 18. Where to begin? How do we know that the request for suicide isn’t a function of the mental illness? And when the number of assisted suicides jumps by a third in one year, as it just did in Canada, it’s obviously not a hypothetical matter.

Ross Douthat had a moving piece on this — and I largely agree with his insistence on the absolute inviolable dignity of every human being and the unquantifiable moral value of every second of his or her life. I’m a Catholic, after all. At the same time, we have to assess what this moral absolutism means in practice. It can entail a huge amount of personal suffering; it deprives anyone of a right to determine how she or he will die; and it hasn’t been adapted to our unprecedented scientific achievements, which have turned so many medical fates into choices we simply cannot avoid.

Does the person who lives the longest win the race? So much of our medical logic suggests this, but it’s an absurd way to think of life. I’m changed forever by losing some of my closest friends when they were in their twenties and thirties from AIDS a couple of decades ago. They died; I didn’t. Wrapping my head around that has taken a while, but it became a burning conviction inside me that their lives were not worth less than mine for being cut so short; that life is less a race than a performance, less about how many years you can rack up than how much love and passion and friendship a life can express, however brief or interrupted.

I still think this. Which is why I do not want to force terminally sick people to live as their bodies and minds disintegrate so badly that they would really rather die. Dignity goes both ways. My suggestion would be simply not aggressively treating the conditions and illnesses that old age naturally brings, accepting the decline of the body and mind rather than fighting like hell against it, and finding far better ways to simply alleviate pain and distress.

And at some point, go gentle. Treating those at the end of life with psilocybin, or ketamine, or other psychedelics should become routine, as we care for the soul in the days nearer our deaths. (Congress should pass this bipartisan bill to waive Schedule 1 status when it comes to the terminally ill.) We can let people die with dignity, in other words, by inaction as much as action, and by setting sane, humane limits on our medicinal power — with the obvious exception of pain meds.

Even Ross allows that “it is not barbaric for the law to acknowledge hard choices in end-of-life care, about when to withdraw life support or how aggressively to manage agonizing pain.” But that should be less of an aside than a strong proposal. What kind of support for how long? In my view, not much and not for too long. What rights does a dying patient have in refusing treatment? Total. What depths of indignity does she have to endure? Not so much. I’m sure Dish readers have their own views and unique experiences — so let’s air them as frankly as we can in the weeks ahead (dish@andrewsullivan.com). There has to be a line. Maybe we can collectively try to find it.

I think of Pope John Paul II’s extremism on the matter of life — even as his body and mind twisted into a contortion of pain and sickness due to Parkinson’s and old age. His example did the opposite of what he intended: he persuaded me of the insanity of clinging to life as if death were the ultimate enemy. There’s little heroism in that — just agony and proof that we humans have once again become victims of our own intelligence, creating worlds we are not equipped or designed to live in, achieving medical successes that, if pursued to their logical conclusion, become grotesque human failures.

Moderation please, especially in our dotage. And mercy.

 

Three-Quarters of Generation Z 'Not Interested In Sports'

Generation Z, along with Millennials, is strangling this country.

It's not just a change in the modernist, 20th-century American economic, political, and social culture, but the annihilation of it.

I find it very strange, though I worry less about it as I get older. In any case, at Axios, "Gen Z more likely to stream live sports events."


Sunday, September 25, 2022

Fighting the Culture War Through Christ

Allie Beth Stuckey makes the case:

I know this is going to be controversial, even (especially?) for many conservatives. Sorry. This is Twitter. You have to endure my takes & I have to endure yours 😜 Anywho -

We can say, scientifically, that a unique human life is formed at conception. This is a fact. But that fact doesn’t tell us why that human is valuable, why she should be protected, and why it’s wrong to kill her.

We can employ logic & look at history to tell us that dehumanizing any person based on arbitrary reasons like size, age, or location leads to dehumanization of other kinds of people. But this doesn’t tell us why dehumanization or even murder is wrong.

We can look to biology to tell us that humans are sexually dimorphic, that the categories of male & female are fixed. But this doesn’t tell us why these facts are more important than how a person feels.

We can talk about the negative consequences of men identifying as women on girls’/women’s rights, safety & fairness, but we still don’t know why these rights matter more than the rights of men who want to enter women’s spaces.

We can point to economics & history to tell us why socialism and communism are failures. But we are defining “failure” with the assumption that mass starvation & poverty, the murder of dissidents, etc. are evil. Where does that assumption come from? We can say the state shouldn’t go after political opponents. Justice should be impartial. Bad behavior should be punished, good people should be left alone, & the innocent should be protected. But every single one of those words must be defined. Where do we get those definitions?

Admit it or not, our “why” behind the above arguments is the Bible. It’s wrong to kill a baby in the womb because God, who is the creator & authority over the universe, says he made us in His image & therefore it’s wrong to murder (Gen 9:6). There is no substantive, ultimate reason for the existence of human rights if humans are just accidental clumps of matter. The basis for innate human rights is that humans are uniquely valuable above plants & animals. Christianity insists we are because God says we are.

This God says He made us not only in His image, but as male & female. Feelings don’t override physical reality because we were God-created, not self-created. We don’t have the power to self-declare & self-identify, because God told us who we are when he made us. The right to & legitimacy of private property comes from God (“you shall not steal,” “you shall not covet”). The authority of a government comes from God & He gives its duties to punish evil & reward good (Romans 13).

Good & evil exist because God says that they do and He defines them. “Murder is bad because it is” and “women’s rights & privacy matter because they do” will not ultimately be enough against secular progressivism, which is a religion in itself with its own rigid doctrines.

You don’t have to be a Christian to acknowledge the necessity of its worldview in holding together everything that has ever made the West or any civilization lastingly good. There is ultimately no secular way to justify any anti-progressive or conservative argument.

Progressives understand this in a way conservatives don’t. They’re constantly attacking Christianity, because Christianity is and always has been the fundamental threat to their total control. We’ve always been a boil on the back of wicked tyrants and we still are.

God is a God of order, and progressivism constantly seeks disorder. Christians are always to be agents of order, in every place & age. Therefore we will always be - & rightly so - enemies of progressive ideology.

So - while I am happy to link arms with people of all backgrounds to push back against the chaos of today’s leftist lunacy, at the end of the day, simple anti-wokism will never, ever be a match for the threat this ideology poses. No secular movement will. The biggest failure of the conservative movement is thinking we are fighting for neutrality. Meanwhile, the left is playing for keeps knowing every space is for the taking, and nothing is neutral. Everything will be dominated by a worldview. The question is only ever, which one?

 

A New Counterculture?

From N.S. Lyons, who writes "The Upheaval" on Substack.

At City Journal, "If the Right captures some of the Left’s youthful energy and rebellious cachet, it would represent a tectonic cultural and political shift":

In July, the New York Times posted a job announcement seeking a reporter-cum-anthropologist to cover an important new beat: infiltrating the “online communities and influential personalities making up the right-wing media ecosystem” and “shedding light on their motivations” for the benefit of Times readers. Establishing this “critical listening post” would not be a role for the faint of heart. The daring candidate would have to be specifically “prepared to inhabit corners of the internet” where “far-right” ideas were discussed, all for the higher goal of determining “where and why these ideas take shape.”

You could be forgiven for questioning why the paper needed yet another reporter to shape the narrative about the political Right, given its constant focus on Donald Trump and the populist MAGA movement since 2016. But the timing of the announcement seemed to suggest that the Times had something else in mind. It arrived amid an explosion of media interest in understanding a strange new tribe, discovered suddenly not in the wilds of Kansas but right under their noses.

Back in April, an article by James Pogue in Vanity Fair revealed the emergence of a collection of “podcasters, bro-ish anonymous Twitter posters, online philosophers, artists, and amorphous scenesters”—sometimes called “‘dissidents,’ ‘neo-reactionaries,’ ‘post-leftists,’ or the ‘heterodox’ fringe . . . all often grouped for convenience under the heading of America’s New Right”—who represented the “seam of a much larger and stranger political ferment, burbling up mainly within America’s young and well-educated elite.” That last bit about the demographics of this so-called New Right may have been what got the Times’s attention. But Pogue had even more striking news: these dissidents, he wrote, had established “a position that has become quietly edgy and cool in new tech outposts like Miami and Austin, and in downtown Manhattan, where New Right–ish politics are in, and signifiers like a demure cross necklace have become markers of a transgressive chic.” This may have been the most alarming news of all for the paper of record: somehow, traditionalist right-wing conservatism had perhaps become cool.

Is it true—and if so, how is it possible? For at least a century, the Left has held a firm monopoly on “transgressive chic,” profitably waging a countercultural guerilla war against society’s hegemonic status quo. For the Right to capture some of the Left’s youthful energy and rebellious cachet would represent a tectonic cultural and political shift. We shouldn’t be shocked if it happens.

Few things are more natural for young people than to push back against the strictures and norms of their day, even if only to stand out a little from the crowd and assert their independence. A counterculture forms as a reaction against an official or dominant culture—and today, it is the woke neoliberal Left that occupies this position in America’s cultural, educational, technological, corporate, and bureaucratic power centers. In this culture, celebration of ritualized, old forms of transgression is not only permitted, but practically mandatory. Dissent against state-sponsored transgression, however, is now transgressive. All of what was once revolutionary is now a new orthodoxy, with conformity enforced by censorship, scientistic obscurantism, and eager witch-hunters (early-middle-aged, zealously dour, tight-lipped frown, NPR tote bag, rainbow “Coexist” bumper sticker, pronouns in email signature—we all know the uniform).

Moreover, young people living under the permanent revolution of today’s cultural mainstream often tend to be miserable. Their disillusionment opens the door to subversive second thoughts on such verities as the bulldozing of sexual and gender norms, the replacement of romance by a Tinder hellscape, general atomized rootlessness, working life that resembles neo-feudal serfdom, and the enervating meaninglessness of consumerism and mass media. In this environment, the most countercultural act is to embrace traditional values and ways of life—like the vogue among some young people for the Latin Mass. We shouldn’t be too surprised if at least a subset of those youth seeking to rebel against the Man might, say, choose to tune in to Jordan Peterson, turn on to a latent thirst for objective truth and beauty, and drop out of the postmodern Left...

He's good. 

Keep reading.

 

 

Friday, June 17, 2022

Our Civilizational Destruction

I can't disagree with Ms. Allie:



Tuesday, March 22, 2022

The Takeover of America's Legal System

A truly must-read article, from Aaron Sibarium, on Bari Weiss's Substack, "The kids didn't grow out of it." 

Ms. Bari has the introduction:

If you are a Common Sense reader, you are by now highly aware of the phenomenon of institutional capture. From the start, we have covered the ongoing saga of how America’s most important institutions have been transformed by an illiberal ideology—and have come to betray their own missions.

Medicine. Hollywood. Education. The reason we exist is because of the takeover of newspapers like The New York Times.

Ok, so we’ve lost a lot. A whole lot. But at least we haven’t lost the law. That’s how we comforted ourselves. The law would be the bulwark against this nonsense. The rest we could work on building anew.

But what if the country’s legal system was changing just like everything else?

Today, Aaron Sibarium, a reporter who has consistently been ahead of the pack on this beat, offers a groundbreaking piece on how the legal system in America, as one prominent liberal scholar put it, is at risk of becoming “a totalitarian nightmare.”

This is a long feature on a subject we think deserves your time. Save it, share it, or print it to read in a quiet moment...

And read Mr. Sibarium in full. Really. Don't miss it.

 

Monday, March 7, 2022

Herbert Marcuse and the Left's Endless Campaign Against Western 'Repression'

This is from Benedict Beckeld, at Quillette (via Maggie's Farm):

The Frankfurt School of social theory began about a century ago, in the Weimar Republic. It consisted in the main of a group of rather anti-capitalist, Marxist-light gentlemen who embraced oikophobia (the hatred or dislike of one’s own cultural home), and who were understandably disillusioned by the carnage of World War I. Our interest today is mainly historical; of its earlier members, such as Ernst Bloch, Walter Benjamin, Max Horkheimer, Herbert Marcuse, and Theodor Adorno, really only Adorno is still read with a measure of seriousness outside of academia.

The Frankfurt School popularized historicism—the belief that reflection itself is a part of history, which is to say that earlier thoughts are historically conditioned by the circumstances in which the thinkers lived, and should be seen in that light; and that what passes for “knowledge” is marred by the historical time and place in which that knowledge appeared. (This idea was present already in the second part of The Communist Manifesto.) The insights that a more positivist outlook claims to be certain, based on sensory data, historicism will consider uncertain and necessarily bound by subjective value judgments. A part of this view is the concern—and the French postmodernists will pick up this point—to identify, isolate, and thereby exorcise every sort of domination that any group might have held over any other group.

They wanted to find the particular reasons why someone in the past had thought in a particular way, reasons that were to be found mainly in external factors. Essentially, the Frankfurt School endeavored to establish a “value-free” social science, that is, the erasure of any sort of prejudice among philosophers and sociologists. Since Western civilization was monomaniacally seen as the history of dominations by various groups over one another—which meant that individual actors had to be viewed as purely nefarious oppressors—it followed quite naturally that much of the West was ready for the garbage heap. Not only were the workers and the poor oppressed by the rich, but the rich in turn were, along with everyone else, oppressed psychologically by Christian sexual mores and by the overall familial hierarchy of Western civilization. This is why, to many of the school’s members, not only smaller fixes had to be implemented here and there, but the whole edifice had to be brought down (which was itself ultimately a morally positivistic effort). With the rise of Nazism in Germany, many Frankfurt scholars moved to New York, and thereby gained a broader audience of impressionable college students....

Keep reading.

And see Linda Kimball's now classic article, "Cultural Marxism."  


Sunday, February 13, 2022

What's Really at Stake in America's History Wars?

At WSJ, "In debates about monuments, curricula and renaming, the facts of the past matter less than how we are supposed to feel about our country":

In January, McMinn County, Tenn., made international news for perhaps the first time in its history when the school board voted to remove “Maus,” the acclaimed graphic novel about the Holocaust, from the 8th-grade curriculum. The board stated that it made the change on account of the book’s “use of profanity and nudity,” asking school administrators to “find other works that accomplish the same educational goals in a more age-appropriate fashion.”

This curricular change, affecting a few hundred of the approximately 5,500 K-12 students in McMinn’s public schools, was quickly amplified on social media into a case of book banning with shades of Holocaust denial. The author of “Maus,” Art Spiegelman, said that the decision had “a breath of autocracy and fascism.” “There’s only one kind of people who would vote to ban Maus, whatever they are calling themselves these days,” tweeted the popular fantasy writer Neil Gaiman, earning more than 170,000 likes. The controversy sent the book to the top of Amazon’s bestseller list.

This outrage of the week will soon give way to another, but the war over history—how to remember it, represent it and teach it—is only getting fiercer. America’s political and cultural divisions increasingly take the form of arguments not about the future—what kind of country we want to be and what policies will get us there—but about events that are sometimes centuries in the past. The Holocaust, the Civil War, the Founding, the slave trade, the discovery of America—these subjects are constantly being litigated on social media and cable TV, in school boards and state legislatures.

None of those venues is well equipped to clarify what actually happened in the past, but then, the facts of history seldom enter into the war over history. Indeed, surveys regularly show how little Americans actually know about it. A 2019 poll of 41,000 people by the Woodrow Wilson National Fellowship Foundation found that in 49 states, a majority couldn't earn a passing score on the U.S. citizenship test, which asks basic questions about history and government. (The honorable exception was Vermont, where 53% passed.)

Ironically, the year after the survey, the Woodrow Wilson Foundation announced that it would drop the historical reference in its own name, citing the 28th president’s “racist legacy.” It was part of a growing trend. Woodrow Wilson’s name was also dropped from Princeton University’s school of international affairs. Yale University renamed a residential college named for John C. Calhoun, the antebellum Southern politician who was an ardent defender of slavery. The San Francisco school board briefly floated a plan to drop the names of numerous historical figures from public schools for various reasons, including George Washington and Thomas Jefferson because they were slaveholders.

It makes sense that educational institutions are leading the wave of renaming, because it is above all a teaching tool, one suited to the short attention span of today’s public debates. Actual historical understanding requires a much greater investment of effort and imagination than giving a thumbs up or down to this or that name. Often even a Wikipedia search seems to be too much to ask. One of the names that the San Francisco school board proposed to get rid of was Paul Revere’s, on the grounds that he was a leader of the Penobscot Expedition of 1779, which a board member believed was a campaign to conquer territory from the Penobscot Indians. In fact, it was a (failed) attempt to evict British naval forces from Penobscot Bay in Maine.

Clearly, the war over history has as much to do with the present as the past. To some extent, that’s true of every attempt to tell the story of the past, even the most professional and objective. In the 19th century, the German historian Leopold von Ranke saw it as his task to determine “how things really were,” but if that could be done, it wouldn’t be necessary for each generation of historians to write new books about the same subjects. We keep retelling the story of the Civil War or World War II not primarily because new evidence is discovered, but because the way we understand the evidence changes as the world changes.

That’s why so many of America’s historical battles have to do with race, slavery and colonialism—because no aspect of American society has changed more dramatically over time. It has never been a secret, for instance, that George Washington was a slaveholder. When he died in 1799, there were 317 enslaved people living at Mount Vernon.

But when Parson Weems wrote the first bestselling biography of Washington in 1800, he barely referred to the first president’s slaveholding, except for noting that in his will he provided for freeing his slaves, “like a pure republican.” When Weems does inveigh against “slavery” in the book, he is referring to British rule in America. For instance, he writes that the tax on tea, which led to the Boston Tea Party in 1773, was meant to “insult and enslave” the colonies. Today it’s impossible to ignore this glaring contradiction. Weems didn’t notice it and clearly didn’t expect his readers to, either.

Another explanation for this blind spot can be found in the book’s full title: “The Life of George Washington: With curious anecdotes, equally honorable to himself and exemplary to his young countrymen.” Weems was a minister, and his goal was moral uplift. That’s why he avoided writing about Washington’s treatment of his slaves but included the dubious story about young George confessing to chopping down the cherry tree. The point was to show Washington in a light that would make readers want to be better themselves.

Today’s war over history involves the same didactic impulses. Fights over the past aren’t concerned with what happened so much as what we should feel about it. Most people who argue about whether Columbus Day should become Indigenous Peoples’ Day, regardless of what side they’re on, have only a vague sense of what Columbus actually did. The real subject of debate is whether the European discovery of America and everything that flowed from it, including the founding of the U.S., should be celebrated or regretted. Our most charged historical debates boil down to the same terms Weems used: Is America “exemplary” and “honorable,” or the reverse?

How we answer that question has important political ramifications, since the farther America is from the ideal, the more it presumably needs to change. But today’s history wars are increasingly detached from practical issues, operating purely in the realm of emotion and symbol. Take the “land acknowledgments” that many universities, arts institutions and local governments have begun to practice—the custom of stating the name of the Native American people that formerly occupied the local territory. For example, the Board of Supervisors of Pima County, Az., recently voted to begin its meetings with the statement, “We honor the tribal nations who have served as caretakers of this land from time immemorial and respectfully acknowledge the ancestral homelands of the Tohono O’odham Nation.”

To their supporters, land acknowledgments are a way of rectifying Americans’ ignorance or indifference about the people who inhabited the country before European settlement. The use of words like “caretakers” and “time immemorial,” however, raises historical questions that the Pima Board of Supervisors is presumably unqualified to answer. People have been living in what is now Arizona for 12,000 years: Were the Tohono O’odham Nation really in their territory “from time immemorial,” or might they have displaced an earlier population?

Of course, the Board has no intention of vacating Tucson and restoring the land to its former inhabitants, so the whole exercise can be seen as pointless. Still, by turning every public event into a memorial of dispossession, land acknowledgments have the effect of calling into question the legitimacy of the current inhabitants—that is, the people listening to the acknowledgment.

The fear that the very idea of America is being repudiated has led Republican legislators in many states to introduce laws regulating the teaching of American history. These are often referred to as “anti-critical race theory” laws, but in this context the term is just a placeholder for a deeper anxiety. The controversial law passed in Texas last year, for instance, doesn’t prevent teachers from discussing racism. On the contrary, House Bill 3979 mandates the study of Frederick Douglass and Martin Luther King Jr., as well as Susan B. Anthony and Cesar Chavez. However, it does insist that students learn that “slavery and racism are…deviations from, betrayals of, or failures to live up to, the authentic founding principles of the United States, which include liberty and equality.” In other words, students should believe that the U.S. is “exemplary” and “honorable” in principle, if regrettably not in practice.

In the U.S., the war over history usually has to do with curricula and monuments because those are some of the only things the government can directly control. Removing “Maus” from the 8th-grade reading list can be loosely referred to as a “ban” only because actual book bans don’t exist here, thanks to the First Amendment. But other countries that are less free also have their history wars, and in recent years governments and ideologues have become bolder about imposing an official line.

In Russia last December, a court ordered the dissolution of Memorial, a highly respected nonprofit founded in 1989 to document the crimes of the Soviet era, after prosecutors charged that it “creates a false image of the USSR as a terrorist state.” In 2018, Poland made it illegal to attribute blame for the Holocaust to the “Polish nation.” In India in 2014, Penguin India agreed to stop publishing a book about the history of Hinduism by the respected American scholar Wendy Doniger, after a nationalist leader sued on the grounds that it focused on “the negative aspects” of the subject.

Such episodes are becoming more common with the rise of nationalist and populist movements around the world. When people invest their identity wholly in their nation, pointing out the evils in the nation’s past feels like a personal attack. Conversely, for people whose political beliefs hinge on distrusting nationalism, any refusal to focus on historic evils feels dangerous, like a tacit endorsement of them, as in the “Maus” episode. These extremes feed off one another, until we can only talk about the past in terms of praise or blame that would be too simple for understanding a single human being, much less a collection of millions over centuries.

It’s surprising to realize how quickly the American consensus on history has unraveled under the pressure of polarization...

Friday, June 4, 2021

Critical Race Theory Rapidly Destroying American Health Care

A great, great piece from Katie Herzog, at Bari Weiss's Substack, "What Happens When Doctors Can't Tell the Truth?":

People Are Afraid to Speak Honestly

They meet once a month on Zoom: a dozen doctors from around the country with distinguished careers in different specialties. They vary in ethnicity, age and sexual orientation. Some work for the best hospitals in the U.S. or teach at top medical schools. Others are dedicated to serving the most vulnerable populations in their communities.

The meetings are largely a support group. The members share their concerns about what’s going on in their hospitals and universities, and strategize about what to do. What is happening, they say, is the rapid spread of a deeply illiberal ideology in the country’s most important medical institutions.

This dogma goes by many imperfect names — wokeness, social justice, critical race theory, anti-racism — but whatever it’s called, the doctors say this ideology is stifling critical thinking and dissent in the name of progress. They say that it’s turning students against their teachers and patients and racializing even the smallest interpersonal interactions. Most concerning, they insist that it is threatening the foundations of patient care, of research, and of medicine itself.

These aren’t secret bigots who long for the “good old days” that were bad for so many. They are largely politically progressive, and they are the first to say that there are inequities in medicine that must be addressed. Sometimes it’s overt racism from colleagues or patients, but more often the problem is deeper, baked into the very systems clinicians use to determine treatment.

“There’s a calculator that people have used for decades that predicts the likelihood of having a successful vaginal delivery after you've had a cesarean,” one obstetrician in the Northeast told me. “You put in the age of the person, how much they weigh, and their race. And if they’re black, it calculates that they are less likely to have successful vaginal delivery. That means clinicians are more likely to counsel black patients to get c-sections, a surgery they might not actually need.”

There’s no biological reason for race to be a factor here, which is why the calculator just changed this year. But this is an example of how system-wide bias can harm black mothers, who are two to three times more likely to die in childbirth than white women even when you control for factors like income and education, which often make racial disparities disappear.

But while this obstetrician and others see the problems endemic in their field, they’re also alarmed by the dogma currently spreading throughout medical schools and hospitals.

I’ve heard from doctors who’ve been reported to their departments for criticizing residents for being late. (It was seen by their trainees as an act of racism.) I’ve heard from doctors who’ve stopped giving trainees honest feedback for fear of retaliation. I’ve spoken to those who have seen clinicians and residents refuse to treat patients based on their race or their perceived conservative politics.

Some of these doctors say that there is a “purge” underway in the world of American medicine: question the current orthodoxy and you will be pushed out. They are so worried about the dangers of speaking out about their concerns that they will not let me identify them except by the region of the country where they work.

“People are afraid to speak honestly,” said a doctor who immigrated to the U.S. from the Soviet Union. “It’s like back to the USSR, where you could only speak to the ones you trust.” If the authorities found out, you could lose your job, your status, you could go to jail or worse. The fear here is not dissimilar.

When doctors do speak out, shared another, “the reaction is savage. And you better be tenured and you better have very thick skin.”

“We’re afraid of what's happening to other people happening to us,” a doctor on the West Coast told me. “We are seeing people being fired. We are seeing people's reputations being sullied. There are members of our group who say, ‘I will be asked to leave a board. I will endanger the work of the nonprofit that I lead if this comes out.’ People are at risk of being totally marginalized and having to leave their institutions.”

While the hyper focus on identity is seen by many proponents of social justice ideology as a necessary corrective to America’s past sins, some people working in medicine are deeply concerned by what “justice” and “equity” actually look like in practice.

“The intellectual foundation for this movement is the Marxist view of the world, but stripped of economics and replaced with race determinism,” one psychologist explained. “Because you have a huge group of people, mostly people of color, who have been underserved, it was inevitable that this model was going to be applied to the world of medicine. And it has been.”

Whole Areas of Research Are Off-Limits

“Wokeness feels like an existential threat,” a doctor from the Northwest said. “In health care, innovation depends on open, objective inquiry into complex problems, but that’s now undermined by this simplistic and racialized worldview where racism is seen as the cause of all disparities, despite robust data showing it’s not that simple.”

“Whole research areas are off-limits,” he said, adding that some of what is being published in the nation’s top journals is “shoddy as hell.”

Here, he was referring in part to a study published last year in the Proceedings Of The National Academy Of Sciences. The study was covered all over the news, with headlines like “Black Newborns More Likely to Die When Looked After by White Doctors” (CNN), “The Lack of Black Doctors is Killing Black Babies” (Fortune), and “Black Babies More Likely to Survive when Cared for by Black Doctors” (The Guardian).

Despite these breathless headlines, the study was so methodologically flawed that, according to several of the doctors I spoke with, it’s impossible to extrapolate any conclusions about how the race of the treating doctor impacts patient outcomes at all. And yet very few people were willing to publicly criticize it. As Vinay Prasad, a clinician and a professor at the University of California San Francisco, put it on Twitter: “I am aware of dozens of people who agree with my assessment of this paper and are scared to comment.”

“It’s some of the most shoddy, methodologically flawed research we’ve ever seen published in these journals,” the doctor in the Zoom meeting said, “with sensational conclusions that seem totally unjustified from the results of the study.”

“It’s frustrating because we all know how hard it is to get good, sound research published,” he added. “So do those rules and quality standards no longer apply to this topic, or to these authors, or for a certain time period?”

At the same time that the bar appears to be lower for articles and studies that push an anti-racist agenda, the consequences for questioning or criticizing that agenda can be high.

Just ask Norman Wang. Last year, the University of Pittsburgh cardiologist was demoted by his department after he published a paper in the Journal of the American Heart Association (JAHA) analyzing and criticizing diversity initiatives in cardiology. Looking at 50 years of data, Wang argued that affirmative action and other diversity initiatives have failed both to meaningfully increase the percentage of black and Hispanic clinicians in his field and to improve patient outcomes. Rather than admitting, hiring and promoting clinicians based on their race, he argued for race-neutral policies in medicine.

“Long-term academic solutions and excellence should not be sacrificed for short-term demographic optics,” Wang wrote. “Ultimately, all who aspire to a profession in medicine and cardiology must be assessed as individuals on the basis of their personal merits, not their racial and ethnic identities.”

At first, there was little response. But four months after it was published, screenshots of the paper began circulating on Twitter and others in the field began accusing Wang of racism. Sharonne Hayes, a cardiologist at the Mayo Clinic, implored colleagues to “rise up.” “The fact that this is published in ‘our’ journal should both enrage & activate all of us,” she wrote, adding the hashtag #RetractRacists.

Soon after, Barry London, the editor in chief of JAHA, issued an apology and the journal retracted the work over Wang’s objection. London cited no specific errors in Wang’s paper in his statement, just that publishing it was antithetical to his and the journal’s values. Retraction, in a case like this, is exceedingly rare: When papers are retracted, it’s generally because the data or the study has been discredited. A search of the journal’s website and the Retraction Database found records of just two retractions in JAHA: Wang’s paper and a 2019 paper that erroneously linked heart attacks to vaping.

After the outcry, the American Heart Association (AHA), which publishes the journal, issued a statement denouncing Wang’s paper and promising an investigation. In a tweet, the organization said it “does NOT represent AHA values. JAHA is editorially independent but that’s no excuse. We’ll investigate. We’ll do better. We’re invested in helping to build a diverse health care and research community.”

As the criticism mounted, Wang was removed from his position as the director of a fellowship program in clinical cardiac electrophysiology at University of Pittsburgh Medical Center and was prohibited from making any contact with students. His boss reportedly told him that his classroom was “inherently unsafe” due to the views he expressed.

Wang is now suing both the AHA and the University of Pittsburgh for defamation and violating his First Amendment rights. To the doctors on the Zoom call, his case was a stark warning of what can happen when one questions policies like affirmative action, which, according to recent polling, is opposed by nearly two-thirds of Americans, including majorities of blacks, Hispanics, and Asians.

“I’m into efforts to make medicine more diverse,” a doctor from the Zoom group said. “But what’s gone off the rails here is that there is an intolerance of people that have another point of view. And that's going to hurt us all.”

JAHA isn’t the only journal issuing apologies. In February, the Journal of the American Medical Association (JAMA) released a podcast hosted by surgeon and then-deputy journal editor Edward Livingston, who questioned the value of the hyper focus on race in medicine as well as the idea that medicine is systemically racist.

“Personally, I think taking racism out of the conversation will help,” Livingston said at one point. “Many of us are offended by the concept that we are racist.”

It’s possible Livingston’s comments would have gone unnoticed but JAMA promoted the podcast on Twitter with the tone-deaf text: “No physician is racist, so how can there be structural racism in health care?”

Even more than in the case of Norman Wang, this tweet, and the podcast it promoted, led to a massive uproar. A number of researchers vowed to boycott the journal, and a petition condemning JAMA has received over 9,000 signatures. In response to the backlash, JAMA quickly deleted the episode, promised to investigate, and asked Livingston to resign from his job. He did.

If you try to access the podcast today, you find an apology in its place from JAMA editor-in-chief Howard Bauchner, who called Livingston’s statements, “inaccurate, offensive, hurtful and inconsistent with the standards of JAMA.” Bauchner was also suspended by JAMA pending an independent investigation. This Tuesday, JAMA announced that Bauchner officially stepped down. In a statement, he said he is “profoundly disappointed in myself for the lapses that led to the publishing of the tweet and podcast. Although I did not write or even see the tweet, or create the podcast, as editor in chief, I am ultimately responsible for them.”

Shortly after this announcement, the New York Times reported that “JAMA’s reckoning” led to a backlash from some JAMA members, who wrote in a letter to the organization that “there is a general feeling that the firing of the editors involved in the podcast was perhaps precipitous, possibly a blot on free speech and also possibly an example of reverse discrimination.” Bauchner’s last day at JAMA is June 30...

Keep reading.

 

Wednesday, June 2, 2021

Republicans Fight Back Against Critical Race Theory

Republicans are fighting critical race theory, but what about conservatives? 

It's not hard to see that C.R.T. is going to be with us for the long haul. The problem is what to do about it. What I'm seeing in response so far isn't very appealing, much less conservative. It's a lot of cancel culture coming from the right. It's too bad, too, because the solutions aren't far out of reach. Folks should look to first principles, especially federalism. That is, push education policy down to the local level as much as possible, and as fast as you can. Get Congress out of the picture. Give states and localities the money, and then let them decide their own curricula. The real conservative bet would be abolishing the Department of Education. Can the Republicans do that? They're the putative conservative party. They should go big and call for a massive downsizing of the federal government, devolving more and more responsibilities to the states. I really can't recall any Republican administration doing that, not even Ronald Reagan's.

Maybe you'd have to go back to Barry Goldwater's The Conscience of a Conservative for such bold initiatives to reinvent government? 

Downsize, devolve, and delegate education policy down from the federal government to the states. And then get government out of the way and let the people decide what's best for their kids and communities. 

We'll see.

At NYT, "Disputing Racism’s Reach, Republicans Rattle American Schools":

In Loudoun County, Va., a group of parents led by a former Trump appointee are pushing to recall school board members after the school district called for mandatory teacher training in “systemic oppression and implicit bias.”

In Washington, 39 Republican senators called history education that focuses on systemic racism a form of “activist indoctrination.”

And across the country, Republican-led legislatures have passed bills recently to ban or limit schools from teaching that racism is infused in American institutions. After Oklahoma’s G.O.P. governor signed his state’s version in early May, he was ousted from the centennial commission for the 1921 Race Massacre in Tulsa, which President Biden visited on Tuesday to memorialize one of the worst episodes of racial violence in U.S. history.

From school boards to the halls of Congress, Republicans are mounting an energetic campaign aiming to dictate how historical and modern racism in America are taught, meeting pushback from Democrats and educators in a politically thorny clash that has deep ramifications for how children learn about their country.

Republicans have focused their attacks on the influence of “critical race theory,” a graduate school framework that has found its way into K-12 public education. The concept argues that historical patterns of racism are ingrained in law and other modern institutions, and that the legacies of slavery, segregation and Jim Crow still create an uneven playing field for Black people and other people of color.

Many conservatives portray critical race theory and invocations of systemic racism as a gauntlet thrown down to accuse white Americans of being individually racist. Republicans accuse the left of trying to indoctrinate children with the belief that the United States is inherently wicked.

Democrats are conflicted. Some worry that arguing America is racist to the root — a view embraced by elements of the party’s progressive wing — contradicts the opinion of a majority of voters and is handing Republicans an issue to use as a political cudgel. But large parts of the party’s base, including many voters of color, support more discussion in schools about racism’s reach, and believe that such conversations are an educational imperative that should stand apart from partisan politics.

“History is already undertaught — we’ve been undereducated, and these laws are going to get us even less educated,” said Prudence L. Carter, the dean of the Graduate School of Education at the University of California, Berkeley. Attempts to suppress what is still a nascent movement to teach young Americans more explicitly about racist public policy, like redlining or the Chinese Exclusion Act of 1882, amount to “a gaslighting of history,” she said, adding, “It’s a form of denialism.”

The debate over the real or perceived influence of critical race theory — not just in schools but also in corporate, government and media settings — comes as both parties increasingly make issues of identity central to politics. And it accelerated during the presidency of Donald J. Trump, when discussions over racism in the country were supercharged by his racist comments and by a wave of protests last year over police killings of Black people.

 

Tuesday, March 2, 2021

Poor F*king George Stephanopoulos

This fake "journalist" is the reason I quit watching ABC News, and that includes even "ABC World News Tonight," which previously was my favorite, back in the day, when Peter Jennings held down the nightly news chair --- and that guy was the real deal, a star broadcaster with incredible appeal and savoir faire out the wazoo.

Nowadays, if I watch MSM news programming, I prefer CBS News, especially "CBS This Morning," which, while leftist, is still aiming for a pretty "middle class / working class" demographic, and I enjoy a lot of their segments, although I'm too lazy to blog them.

So, just read the whole thing, at also uber-woke CNN, a network I still watch, except for Jake Tapper, who I just can't stand. (And while the whole story isn't out yet, it turns out that Brooke Baldwin is not leaving the network of her own accord --- the truth will come out sometime, of course, but I'm sure she's got some revelations of "power struggles" over there, and it's going to be interesting to hear more about them.)

And one more thing about CNN, I still like Wolf Blitzer. I know he's under pressure from his producers to toe the "woke" line, but, jeez, he's 100s of times better than the dork Tapper, so at least in the early afternoons, if I'm watching CNN, it's not too bad. After that, I flip over to Fox News, and I definitely try to watch Tucker every night, and that's even though I sometimes think HE's a phony, given his elite pedigree (his dad married divorcée Patricia Caroline Swanson, of Swanson TV dinner fame). And if you recall, Tucker used to be a "golden boy" on daytime news shows, including a stint at --- you guessed it! --- CNN, when he was a co-host of "Crossfire" for a time, back when he wore a bow-tie. He's dropped that habit like a hot potato, and now looks more, well, normal, with his regular coat and tie on his evening shows.

Anyway, being a political scientist, I literally have to watch some television news, but all these "woke" networks are making it a chore. 

So, RTWT yourself, at "woke" CNN (and featuring the network's resident potato-head, Brian Stelter), "David Muir's new role at ABC News leads to drama with George Stephanopoulos and a visit from Bob Iger."


Monday, December 14, 2020

Unthinkable? As Pandemic Rages, Colleges Cut Tenure

At Tax Prof, "WSJ: Hit By Covid-19, Colleges Do The Unthinkable and Cut Tenure":

When Kenneth Macur became president at Medaille College in 2015, the small, private school in Buffalo, N.Y., was “surviving paycheck to paycheck,” he said. Enrollment was declining and the small endowment was flat.

Then came the coronavirus pandemic. The campus shut down and revenue plummeted 15%. Dr. Macur saw what he considered an opportunity: With the approval of the board of trustees, he suspended the faculty handbook by invoking an “act of God” clause embedded in it. He laid off several professors, cut the homeland security and health information management programs, rescinded the lifelong job security of tenure and rewrote the faculty handbook, rules that had governed the school for decades.

“I believe that this is an opportunity to do more than just tinker around the edges. We need to be bold and decisive,” he wrote in a letter to faculty on April 15. “A new model is the future of higher education.”

Dr. Macur and presidents of struggling colleges around the country are reacting to the pandemic by unilaterally cutting programs, firing professors and gutting tenure, all once-unthinkable changes. Schools employed about 150,000 fewer workers in September than they did a year earlier, before the pandemic, according to the Labor Department. That is a decline of nearly 10%. Along the way, they are changing the centuries-old higher education power structure.

The changes upset the “shared governance” model for running universities that has roots in Medieval Europe. It holds that a board of trustees has final say on how a school is run but largely delegates academic issues to administrators and faculty who share power.

This setup, and the job protection of tenure, promote a need for consensus and deliberation that is one reason why universities often endure for centuries. But this power structure can also hamper an institution’s ability to make tough personnel decisions or react quickly to changes in the labor market or economy.

In recent months, the American Association of University Professors, which advocates for faculty and helped establish the modern concept of tenure in 1940, has received about 100 complaints from professors around the country alleging power grabs by college presidents. The organization has labeled the changes at colleges a “national crisis.”

Hat Tip: Instapundit.


Wednesday, July 29, 2020

White Women in Pennsylvania Still All In for Trump

Interesting.

At Vanity Fair, "“You Might See People Digging In”: Can Joe Biden Actually Sway Obama–Trump Voters?":

In Pennsylvania, Joe Biden is hoping to peel off just enough white, working-class voters in crucial counties to edge out the president. But the women here—waitresses, churchgoers, bingo players, lifelong Democrats—show no signs of budging, pandemic be damned. “I am 110% Trump,” says one. “I love him.”

It was a Thursday night in January, before the coronavirus shut everything down, meaning it was time for bingo at St. Andrew Parish in Wilkes-Barre, Pennsylvania. Three dozen regulars—almost all of them women—filed into the church basement. Some grabbed “cuts” of pizza from the front of the room; a few lingered in the cold for one last smoke. As the clock approached 6:00, they settled into metal folding chairs, spread out their game sheets, and focused on the numbers.

The entire political world, in turn, has been focused on these women and the numbers—and potential power—they represent. The bingo players are part of the white working class, a prized group that helped elect Donald Trump in 2016. Many are Democrats who supported Barack Obama in one or both of his races and had never pulled the GOP lever before. To Republicans they represent the path to the president’s reelection. To Democrats they personify opportunity, a chance to siphon off just enough Trump votes in swing states to remove him from office. “I don’t need to win them,” said Democratic pollster Jill Normington. “I need to lose by less.”

Ever since Trump pulled off upset victories in the former Democratic strongholds of Michigan, Wisconsin, and Pennsylvania, both parties have viewed white women without college degrees as pivotal 2020 voters. White, working-class women and men are the nation’s largest bloc of voters, especially here in the Rust Belt, and women are considered more likely to reject Trump this time around. Polls bear this out, showing that the men in this group remain overwhelmingly behind the president, while many of the women are having second thoughts. Democrats hope that just as suburban women outside cities like Philadelphia, about two hours south of Wilkes-Barre, turned on Republicans in 2018, white, working-class women will follow suit this year.

But the bingo players at St. Andrew and their counterparts in their key region of Pennsylvania may be unexpectedly resistant. In this historically Democratic bastion, where coal once ruled and black-and-white photos of JFK still adorn walls, women who voted for Trump show few signs of wavering. They applaud his brusque demeanor, or they don’t. They support his right-wing policies, or they don’t. It doesn’t matter. They think Democrats have persecuted him without justification, believe he’s doing everything possible to combat COVID-19, and generally support his “law-and-order” response to what are likely the most pervasive protests in U.S. history. They have faith that he has the business acumen to reinvigorate the economy. Mainly, they have faith in him.

They support Trump because they like him. Actually, the word many of them use is “love.”

The reason is simple: He speaks to them, not down to them, eschewing words like “eschew.” While his life experience as a New York playboy-celebrity rich kid is wholly different from their own, they feel he’s one of them. “I am 110% Trump. I love him,” Barbara Bono said as she set up her bingo cards. “I love the way he talks. I understand him more than any other president. This whole place is Trump,” she said, sweeping her arm across the room as women around her nodded.

Bono, a 63-year-old retired Lord & Taylor warehouse worker, is precisely the kind of voter Joe Biden’s campaign hopes to win over: She’s a registered Democrat and former union member who never voted Republican before casting a ballot for Trump. She is Catholic, like so many in these parts, but supports abortion rights. She thought Bill Clinton was a “wonderful” president and didn’t care for George W. Bush. She voted for Obama in 2008, but sat out the election of 2012 because, she said, his Affordable Care Act drove up her health insurance costs. Still, she didn’t support Obama’s GOP opponent, Mitt Romney, another rich guy who, it must be noted, speaks nothing like her.

“I love the way he talks! Crazy Nancy!” Bono said, echoing the president’s nickname for House Speaker Nancy Pelosi, the highest-ranking elected woman officeholder in the history of the United States. “I love it. He is up early in the morning...He’s always talking to the American people. He’s all about our country.

“Pelooooski,” she continued, drawing laughs from the other bingo players. Then, in the spirit of Trump: “I can’t wait for her teeth to fall out!”

Bono is like a lot of the women I met during visits to Luzerne and Lackawanna Counties in northeastern Pennsylvania, called “NEPA” by locals, in late 2019 and early 2020. She has lived in Wilkes-Barre, the county seat of Luzerne, her entire life, though she now spends part of her winters in Valdosta, Georgia. She is a high school graduate, one of the white women without college degrees whose support Trump can’t afford to lose.

During conversations spanning seven months, the women I spoke to made plain that there is little, if anything, that would make them abandon Trump. Not the emergence of Biden, who’s fond of invoking his early childhood in nearby Scranton. Not a quarantine that has cost some of them their wages. Not the ensuing economic fallout. And certainly not the Trump detractors who say he has mishandled life-and-death issues that have consumed the nation: the coronavirus, the police killing of George Floyd, and the systemic racism it brought to the fore.

“A lot of people hate him, but I don’t get it,” said Florence “Flo” Eldredge, a waitress who missed months of work because of the pandemic. “I think he’s doing the best he can under the circumstances.”

Added her next-door neighbor Linda Stetzar: “I would give him a crown.”

It’s hard for an outsider to distinguish between Luzerne and Lackawanna Counties, which share a rolling landscape in the Appalachian Mountains and in the valleys along the Susquehanna River. Many of their towns flow smoothly into one another, their Americana displayed in street banners that celebrate their war heroes. But locals know the difference between Pittston and West Pittston, Old Forge and Forty Fort. They will tell you that the Irish settled this town, the Italians that town, the Poles moved here and the Germans there. They say it while acknowledging that influxes of immigrants weren’t always made to feel welcome. Their ancestors came from Europe to mine anthracite coal, which they did for generations until the mines closed in the middle of the last century. All these years later, they wear their heritage proudly.

Families remain close. I came across more than one pair of sisters, mothers and daughters, aunts and nieces dining or working or playing bingo together. Almost all were descendants of those early miners. But in recent years, after the mines closed and lace factories came and went, more and more residents have departed. Young people who attend college leave most frequently, unable to find white-collar jobs close to home. Those who stay are mostly white and older. They are often more conservative than their children. They still attend the churches, predominantly Catholic, that their forebears built. They are courteous and unhurried, the kind of people who call strangers like me “hon.” They work for state or local government or health care providers or a local university or, increasingly, one of the numerous warehouses and call centers that have popped up along the tangle of highways that crisscross here. They don’t expect something for nothing.

“There’s that dignity piece,” Scranton mayor Paige Cognetti told me. “If mom and dad worked in coal and lace, they worked their asses off. People don’t want programs or help. They want to earn it.”

Before the novel coronavirus, the economies in these two counties had improved considerably. Though they weren’t as strong as elsewhere in the state—the median income was lower, the unemployment rate higher—people didn’t despair. In fact, in Donald Trump, a lot of them saw hope.

Trump beat Hillary Clinton by a total of just 77,744 votes in Pennsylvania, Wisconsin, and Michigan, states once considered Democrats’ “blue wall.” More than half of that vote margin—44,292—came from Pennsylvania. And more than one third came from Luzerne County, a place that hadn’t voted Republican since George H.W. Bush in 1988 and that has voted for the presidential candidate who carried Pennsylvania since 1932. Trump carried Luzerne by 26,237 votes and had the biggest margin of victory there—19 points—since Richard Nixon in 1972. Next door, in Lackawanna County, Clinton won by a scant 3,599 votes even though she had personal ties to the area, having spent her childhood summers at a family cottage on Lake Winola, a short drive from her father’s hometown and final resting place, Scranton. Four years earlier Obama beat Romney there by 26,579 votes.

Theories abound as to why Trump did so well, particularly in Luzerne, which Pennsylvania pollster G. Terry Madonna told me was “a place where in my lifetime I never thought a Republican would win.” Madonna, director of the Franklin & Marshall College Poll, said Trump stepped into a void created by Democrats, who essentially abandoned cultural conservatives. “As the Democrats became an urban-based party, they moved away from the working-class roots that had been a part of their constituency,” he said...
Still more.