It’s Illegal for Most Public Universities in Missouri to Offer PhDs

Only one public university (system) in Missouri can offer PhDs. Only one can offer first-professional degrees in law (J.D.), medicine (M.D.), and more.

The University of Missouri and its supporters in the legislature have for decades maintained a monopoly on doctoral degrees. For a long time, only UM system schools could offer them.

For instance, in 2005, Missouri State University was banned from offering any doctoral, first-professional, or engineering programs unless it was in cooperation with Mizzou, which would be the degree-awarding institution. This was the price of then-Southwest Missouri State’s name change to Missouri State: the new name in exchange for limits on growth, protecting Mizzou’s position as the state’s largest university and its “prestige.” Other laws barred or scared off other universities from offering the highest degrees.

In 2018, Missouri passed a law with some good changes, some bad. Universities were finally given a pathway to offer more doctoral degrees — like, say, a Doctor of Education (Ed.D.) — without going through Mizzou. But it was enshrined into law that “the University of Missouri shall be the only state college or university that may offer doctor of philosophy degrees or first-professional degrees, including dentistry, law, medicine, optometry, pharmacy, and veterinary medicine” (H.B. 1465, page 4). Further, engineering degrees and a few others must still go through Mizzou.

Impacted universities include Missouri State, Truman, Central Missouri, Southeast Missouri, Harris-Stowe, Lincoln, Missouri Southern, Missouri Western, and Northwest Missouri. Looking at their catalogues you find no doctoral programs, with a few exceptions, such as two at Central Missouri offered through Mizzou and Indiana State, and eight at Missouri State, with one through UMKC.

Proponents frame all this as eliminating costly duplicate programs and promoting cooperation. But by that reasoning, why should multiple universities offer the same bachelor’s degrees? The actual reasoning is obvious. A monopoly on doctoral degrees means more students and income for the UM system. At the expense of every other public university. At the expense of students, who may want to study elsewhere. And to the detriment of the state, which loses money to other states when students don’t get into Mizzou or a sister school, are priced out, or do not find the program they’re looking for — they have no choice but to go to graduate school in another state.

It’s high time Missouri legislators corrected this nonsense. Students, alumni, and everyday proponents of fairness and sanity should contact their legislators and those who serve the districts of affected universities.

For more from the author, subscribe and follow or read his books.

War is Peace, Freedom is Slavery, Ending Democracy is Saving It

George Orwell’s 1984 quickly introduces the reader to the three slogans of its fictional authoritarian government: war is peace, freedom is slavery, ignorance is strength. According to the common interpretations, these are not meant to be literal equivalents — to be at war is not to be at peace. Rather, as the novel suggests, they are propagandistic cause-effect relationships, tradeoffs. War, the State promises, will bring about peace. True freedom is found in slavery — if you submit to the Party, you will live a successful, comfortable, happy life. Ignorance, giving up personal and contrary ways of thinking, makes society stable, safe, united. The slogans present necessary evils, unpleasant means to noble ends: accepting war, slavery, and ignorance brings personal and national benefits. (The order reversal of the middle slogan is intriguing. We have, from the reader’s perspective, “bad is good,” “good is bad,” “bad is good.” Orwell chose not to pen “slavery is freedom,” which would have aligned with the others and made the “slavery brings freedom” interpretation even stronger. Still, any notion of “freedom bringing slavery” is difficult to reconcile with the other two, given that this propaganda is presenting terrible things as desirable. The Party isn’t going to tell citizens to watch out for slavery but embrace ignorance and war.) Winston Smith, of course, finds out the hard way what happens when war, slavery, and ignorance are not accepted.

In a time of rightwing attempts to overthrow free and fair elections, rising authoritarianism among the populace, and an American system too underdeveloped to handle anti-democratic threats like Trump, one can’t help but think of Orwell. We’ve seen in terrifying fashion how democracy requires the truth to survive, withering in ages of disinformation. Even language became concerning. Blatant falsities about an inauguration crowd size were infamously labeled “alternative facts,” not really doublethink, but reminiscent of how past facts were erased and replaced in the novel. Truth Social, a platform built for Trump and his lies, sounds uncomfortably like the Ministry of Truth, the propaganda division of Oceania whose pyramid-shaped building displays the Party’s three slogans. Of course, conservatives delight in noting that 1984 was a 1949 response to authoritarian socialism in the Soviet Union, and often whine about how woke cancel culture, COVID vaccines, masks, and lockdowns, or welfare and universal services represent the tyranny and submissive collectivity of which Orwell wrote. But they forget Orwell was a socialist who advocated for democratic socialism as frequently as he warned of communism, and they live in a strange world where every liberal (to say nothing of Leftist) policy or cultural shift warrants screams of 1984 but demagogic leaders, casual dismissals of legal and democratic norms, absurdities spewed for reasons of power, plots to ignore election results, violent attacks on the Capitol, authoritarian and nationalistic voters, and so on somehow are of little concern.

But clearly, while it may be most appropriate for the text, depending on one’s reading, the cause-effect interpretation of the slogans doesn’t best reflect our realities. (Though you do see hints of it at times. American war has long been framed as necessary for peace, even if it achieves the opposite, and other horrors.) A literal equivalent interpretation gets much closer. While it probably won’t be publicized and sloganeered in a cartoonish manner, authoritarianism appears to rely on parts of the populace living in parallel worlds. (The State would publicize tradeoffs and push you to accept them, but it would not advertise the fact that you believe falsities and contradictions.) Parallel worlds, built on conspiracy theories and lies, were of course a major reason German democracy collapsed in the 1930s. The Nazis blamed Jews and Communists for Germany’s problems, which justified Hitler’s dismantling of democratic processes and restriction of civil rights. This is how authoritarianism grows and triumphs. It is not that one part of the populace believes war is necessary for peace and another does not. One believes war is peace. It doesn’t realize or accept that it’s ignorant, enslaved, at war; it thinks it is peaceful, free, and strong (this is different from the novel, where everyone knows, for instance, that it is wartime, with news from the front broadcast everywhere; “Winston could not definitely remember a time when his country had not been at war”). One part of the population believes destroying democracy is saving it. The armed mob that broke into the Capitol, the conservatives decrying mass voter fraud (60% of Republicans, nearly 40% of the nation, still believe the 2020 election was stolen), and even some of the politicians sustaining the lunacy…they believe democracy is in danger as sincerely as liberals (and moderates and sane conservatives). It must be protected from those cheating Democrats, fraudulent votes, bad voting machines. Their own reality. Such dupes are completely detached from quality standards of evidence and reason (why would you trust a bad documentary or an article on Breitbart over the conclusions of Republican-controlled, recounted states, Trump’s own Justice Department and Department of Homeland Security, and some 60 federal court cases?), but they think they’re saving democracy. When they’re actually cutting its throat.

For more from the author, subscribe and follow or read his books.

How You Can Help Missouri State Reach an FBS Conference

Missouri State students and alumni have long been unhappy being stuck in the Missouri Valley Conference. Just look at the extremely active forums of Missouri State’s page on 247Sports.com, where you will find constant dreamers longing for a school of our size to move onward and upward.

Much of this centers around football. MSU basketball, baseball, and so on playing in a smaller, less-renowned D-I conference has never been ideal, but at least we can win conference championships and go on to compete for NCAA national titles. We have the chance to battle at the highest level. With football, we’re FCS, and have no such opportunity. Bears fans want to step up to the FBS. 

And the administration is starting to feel the vibe. In August 2021, athletics director Kyle Moats told the Springfield News-Leader, “We’re happy in the Valley” but wanted to have everything in place so that “if we ever got the offer, we’d be ready to go.” Ten years ago, you would have only gotten the first part of that quote.

A move to FBS is no pipe dream. Since 2000, 33 FCS schools have advanced: Massachusetts, Old Dominion, Appalachian State, Georgia Southern, and more. Before that were the likes of Boise State, UConn, Boston, and Marshall. Geographically, Missouri State is well-positioned to join the Sun Belt Conference, Conference USA, or the American Athletic Conference (the Mid-American Conference is also a possibility; Bears swimming and diving is already a member). While a Power 5 conference like the Big 12 or SEC won’t happen, at least for another century or two, MSU has good opportunities for advancement now.

But the university and its supporters must take crucial steps to encourage the necessary invite. We need, as Moats pointed out, upgrades to Plaster Stadium. We need to keep improving the fan experience. Supporters must keep donating through the Missouri State Foundation site and MSU’s GiveCampus page. We need to attend games of all sports, no matter how the season is going. The NCAA has attendance requirements for FBS schools, though enforcement does not appear strict these days. More importantly, studies show higher attendance increases the odds of victory. We need to win to be noticed. And if you can’t make a game, stream it on ESPN+, watch it on TV, etc. Show broadcasters you love the content. Do the little things to help enrollment, too. Buy a car decal, wear MSU gear, post on social media. It’s small acts carried out by tens of thousands of people that change the world.

The arguments against ditching The Valley have never outweighed the potential benefits. Bigger conferences can mean bigger costs, yes. Some wouldn’t want to see MSU fail in a bigger conference, or shift to one unfocused on basketball. This is all short-sighted thinking. The SBC, CUSA, or AAC is a gateway to a more excited fanbase, broader national exposure, a higher profile, increased revenue from enrollment and attendance gains and TV contracts, and so on. We’ll have good years and off years, but we already know we can compete at the highest level of any sport if we have the right pieces in place. University advancement is an upward spiral, but you have to start spinning. When MSU sports regularly play Navy, Rice, SMU, or App State, you’ll be glad you did.

This article originally appeared on Yahoo! and the Springfield News-Leader.

For more from the author, subscribe and follow or read his books.

Protect Your Relationship From Politics at All Costs

There’s a delightful scene in Spider-Man: Far From Home:

“You look really pretty,” Peter Parker tells MJ, his voice nearly shaking. They stand in a theatre as an orchestra warms up.

“And therefore I have value?” MJ replies, peering at her crush from the corner of her eye.

“No,” Peter says quickly. “No, that’s not what I meant at all, I was just –”

“I’m messing with you.” A devilish smile crosses her face. “Thank you. You look pretty, too.”

To me, the moment hints at the need to insulate love from politics. In my own experience and in conversations with others, I’ve come across the perhaps not-uncommon question of how, in an age when politics has ventured into (some would say infected or poisoned) every aspect of life, do partners prevent division and discomfort? There are probably various answers, because there are various combinations of human beings and ideologies, but I’ll focus on what interests me the most and what the above scene most closely speaks to: love on the Left.

For partnerships of Leftists, or liberals, or liberals and Leftists, political disagreements may be rare (perhaps less so for the latter). But arguments and tensions can arise even if you and your partner(s) fall on the same place on the spectrum, because we are all, nevertheless, individuals with unique perspectives who favor different reasoning, tactics, policies, and so on. If this has never happened to you in your current relationship, you’ve either found something splendidly exceptional or simply not given it enough time. I recently spoke to a friend, D, who is engaged to E. They are both liberals, but D is at times spoken to as if this wasn’t the case, as if an education is in order, even over things they essentially agree on but approach in slightly different ways. Arguments can ensue. For me personally, there exists plenty of fodder for disagreements with someone likeminded: I’m fiercely against a Democratic expansion of the Supreme Court, and have in other ways critiqued fellow Leftists. This is what nuanced, independent thinkers are supposed to do, but it can create those “Christ, my person isn’t a true believer” moments.

If partners choose to engage in political dialogue (more on that choice in a moment), it’s probably a fine idea for both to make a strong verbal commitment to give the other person the benefit of the doubt. That’s a rule that a scene from a silly superhero movie reminded me of. MJ offered this to Peter, while at the same time making a joke based in feminist criticism. She could have bit off his head in earnest. Had she been talking to a cat-caller on the street, a toxic stranger on the internet, a twit on Twitter, she probably would have. But this isn’t a nobody, it’s someone she likes. Her potential partner and relationship are thus insulated from politics. She assumes or believes that Peter doesn’t value her just for her looks. He isn’t made to represent the ugliness of men. There’s a grace extended to Peter that others may not get or deserve. Obviously, we tend to do this with people we know, like family and friends. We know they’re probably coming from a good place, they’ve earned that grace, and so on. (There may be a case to extend this mercy to all people, until compelled to retract it, among other solutions, in the interests of cooling the national temperature and keeping us from tearing each other to pieces, but we’ll leave that aside.)

But thinking and talking about all this, which we often fail to do, seems important. How do I protect my relationship from politics? Hey, could we give each other the benefit of the doubt? Arguments between likeminded significant others can be birthed or worsened by not assuming the best right from the start. Each person should suppose, for example, that an education is not in order. I call it seeing scaffolding beneath ideas. If your person posits a belief, whether too radical or reactionary, that shocks your conscience, your first instinct might be to argue, “That’s obviously wrong/terrible, due to Reasons 1, 2, 3, and 4.” You know, to bite your lover’s head off. But this isn’t some faceless idiot on the screen. Instead, assume they know those reasons already — because they probably do — and reached their conclusion anyway. Imagine that Reasons 1-4 are already there, the education is already there, forming the scaffolding to this other idea. Instead of immediately correcting them, ask them how they reached that perspective, given their understanding of Reasons 1-4 (if they’ve never heard of those, then proceed with an education).

No progressive partner wants to be misrepresented, to hear that they only think this way because they don’t understand something, are a man and therefore think in dreadful male ways (like Peter and the joke), and so on: you think that because you’re a woman, white or black, straight or gay, poor or wealthy, too far Left or not far enough, not a true believer. Someone’s knowledge, beliefs, or identity-based perspective can be flawed, yes — suppose it’s not until proven otherwise. These things determine one’s mode of thought; suppose it’s in a positive way first. “Well, well, well, sounds like the straight white man wants to be shielded from critique!” God, yes. With your lover, I think it’s nice to be seen as a human being first. I certainly want to be seen as a human being before being seen as a man, for instance. I don’t want to represent or stand in for men in any fashion. A disgusting thought. Some will say that’s an attempt to stand apart from men to pretend my views aren’t impacted in negative ways by my maleness — to avoid the valid criticisms of maleness and thus myself. Perhaps so. But maybe others also wish to be seen as a human being before a woman, a human being before an African American, a human being before a Leftist. Because politics has engulfed everything, there are so few places left where this is possible. It may not be doable or even desirable to look at other people or all people in this way, but having one person to do it with is lovely. Or a few, for the polyamorous.

It’s a tempting suggestion, to shield our love from politics, to transcend it in some way (Anne Hathaway, in an Interstellar line that was wildly inappropriate for her scientist character, said that love was the one thing that transcended time and space — ending with “politics” would have made more sense). One way of doing that is to assume the best in your partner, and see before you an individual beyond belief systems, beyond identity, beyond ignorance. Again, until forced to do otherwise. All this can be tough for Leftists and liberals, because we’re so often at each other’s throats, racing to be the most pure or woke, and so on. There exists little humility. We want to lecture, not listen. Debate, not discuss. It’s a habit that can bleed into relationships, but small changes can reduce unwanted tensions and conflict. (If it’s wanted, if it keeps things spicy, I apologize for wasting your time. Enjoy the make-up sex.)

I do not know if rightwing lovers experience comparable fights, but I imagine all this could be helpful to them as well. They have their own independent thinkers and failed true believers.

An even better way to protect your relationship from politics is to simply refuse to speak of such things. Purposefully avoid the disagreements. This may be best for those dating across the ideological divide (though offering the benefit of the doubt would still be best for the Right-Left pairings or groupings that choose to engage in discourse). This may be surprising, but this is generally my preferred method, whether I’m dating someone who thinks as I do or rather differently. (I of course have a proclivity for a partner who shares my values, but I have dated and probably still could date conservatives, if they were of the anti-Trump variety. Some people are too far removed from my beliefs to be of interest, which is natural. This article is not arguing one should stay with a partner who turns out to have terrible views or supports a terrible man. This is also why “respect each other’s views” is a guideline unworthy of mention. Apart from being too obvious, it at some point should not be done.) Perhaps it’s because so much of my work, writing, and reading has to do with politics. I would rather unplug and not discuss such things with a mate, nor with many close friends and family members. Though it happens every now and then. If partners together commit to making this a general policy, it can be quite successful. And why not? While I see the appeal of learning and growing with your person through meaningful discussion of the issues, it risks having something come between you, and having an oasis from the noise and nightmare sounds even better, just as loving your partner for who they are sounds much less stressful than trying to change them.

For more from the author, subscribe and follow or read his books.

Will the NFL Convert to Flag Football in the Next Century?

A big part of the fun of American football is players smashing into each other. From the gladiatorial spectacles of Rome to today’s boxing, UFC/MMA, and football, watching contestants exchange blows, draw blood, and even kill one another has proved wildly entertaining. I know I have base instincts as well that enjoy, or are at least still engrossed by, brutal sport. I write “at least still” because the NFL has become harder to watch knowing the severe brain damage it’s causing.

This prompts some moral musings. The NFL certainly has the moral responsibility to thoroughly inform every player of the risks (and to not bury the scientific findings, as they once did). If all players understand the dangers, there is probably no ethical burden on them — morality is indeed about what does harm to others, but if all volunteer to exchange CTE-producing blows that’s fine. Beating up a random person on the street is wrong, but boxing isn’t, because it’s voluntary. In a scenario where some football players know the risks but not all, that’s a bit trickier. Is there something wrong about potentially giving someone brain damage who doesn’t know that’s a possibility, when you know? As for fans, is there a moral burden to only support a league (with purchases, viewership, etc.) that educates all its players on CTE? But say everyone is educated; if afterwards the NFL still has a moral duty to make the game safer through better pads and rules to reduce concussions, does it by extension also have the moral duty to end contact and tackles to eliminate concussions? There’s much to think about.

In any case, after head trauma findings could no longer be ignored, the NFL made, and continues to make, rule changes to improve safety (to limited effect thus far). Better helmets, elimination of head-to-head blows, trying to reduce kick returns, banning blindside blocks, and so on. At training camp, players are even wearing helmets over their helmets this year. Though some complain the game is being ruined, and others suggest the NFL is hardly doing enough, all can agree that the trend is toward player safety. Meanwhile, some young NFL players have quit as they’ve come to understand the risks. They don’t want disabilities and early death.

A parallel trend is the promotion of flag football. The NFL understands, Mike Florio notes, that if flag can be popularized all over the world then the NFL itself will become more international and make boatloads more money. It’s not really about safety (except perhaps for children). The organization helped get flag football into the World Games 2022 and promoted the journeys of the U.S. men’s and women’s teams, and is now trying for the 2028 Olympics. NFL teams have youth flag leagues, and Michael Vick, Chad Ochocinco, and Terrell Owens are playing in the NFL-televised American Flag Football League. The Pro Bowl is being replaced with a skills competition and a flag football game.

Troy Vincent, an NFL vice president, said recently, “When we talk about the future of the game of football, it is, no question, flag. When I’ve been asked over the last 24 months, in particular, what does the next 100 years look like when you look at football, not professional football, it’s flag. It’s the inclusion and the true motto of ‘football for all.’ There is a place in flag football for all.” He was careful to exclude the professional game here, focusing on opening the sport to girls, women, and poorer kids in the U.S. and around the world, but one wonders how long that exception will hold. If current trajectories continue, with a growth of flag and a reduction of ferocity in the NFL, one day a tipping point may be reached. It won’t happen easily if the NFL thinks such a change would cut into its profits, but it’s possible. It may not be in 50 years or 100, but perhaps after 200 or 500.

Changes in sports — the rules, the equipment, everything — may be concerning but should never be surprising. Many years ago, football looked rather different, after all. You know, when you couldn’t pass the ball forward, the center used his foot instead of his hands to snap, the point after was actually four points, you could catch your own punt and keep the ball, etc. The concussion crisis has of course also spurred calls to take the NFL back to pre-1940s style of play, getting rid of helmets and other protections to potentially improve safety. There’s evidence players protect their heads and those of others better when they don’t feel armored and invincible. This is another possible future. However, it’s also a fact that early football was much deadlier, and the dozens of boys and men who died each year playing it almost ended the sport in the early 20th century, so one may not want to get rid of too many modern pads and rules if we’re to keep tackle. An apparent contradiction like this means many factors are at play, and will have to be carefully parsed out. Perhaps a balance can be found — less armor but not too little — for optimal safety.

Though my organized tackle and flag experiences ended after grade school, with only backyard versions of each popping up here and there later on, I always considered flag just as fun to play. And while I think the flag of the World Games is played on far too narrow a field, and both it and the AFFL need field goals, kicks, light-contact linemen, and running backs (my flag teams had these), they’re both fairly entertaining (watch here and here). One misses the collisions and take-downs, but the throws, nabs, jukes, picks, and dives are all good sport. No, it’s not the same, but the future rarely is.

For more from the author, subscribe and follow or read his books.

The MAIN Reasons to Abolish Student Debt

Do you favor acronyms as much as you do a more decent society? Then here are the MAIN reasons to abolish student debt:

M – Most other wealthy democracies offer free (tax-funded) college, just like public schools; the U.S. should have done the same decades ago.

A – All positive social change and new government programs are “unfair” to those who came before and couldn’t enjoy them; that’s how time works.

I – Immense economic stimulus: money spent on debt repayment is money unspent in the market, so end the waste and boost the economy by trillions.

N – Neighbors are hurting, with skyrocketing costs of houses, rent, food, gas, and more, with no corresponding explosion of wages; what does Lincoln’s “government for the people” mean if not one that makes lives a little better?

For more from the author, subscribe and follow or read his books.

Three Thoughts on Democracy

The following are three musings on what might undermine and end American democracy, in the hopes such things can be countered.

Did the Electoral College prime Americans to reject democracy? The current declining trust in democracy and rising support for authoritarianism could perhaps be partly explained by preexisting anti-democratic norms. Supporters of the Electoral College, or those apathetic, were already comfortable with something disturbing: the candidate with fewer votes winning an election. How great a leap is it from there to tolerating (or celebrating) a candidate with fewer votes taking the White House due to some other reason? Trump and his supporters’ attempts to overturn a fair election may not be the best example here, as many of them believed Trump in fact won the most votes and was the proper victor, but one can fill in the blank with a clearer hypothetical. Imagine a violent coup takes place without anyone bothering to pretend an election was stolen; the loser simply uses force to seize power. Would a citizenry long agreeable to someone with fewer votes taking power be more complacent when a coup allows for the same? (Now imagine half the country wanted the coup leader to win the election — and this same half historically favored the Electoral College! Fertile soil for complacency.)

Does a two-party system make authoritarianism inevitable? No matter how terrible a presidential candidate is, he or she is better than the other party’s nominee. That is the mindset, and it helped secure Trump’s 2016 victory — the 62.9 million who voted for him were not all cultish true believers; many just regarded Democrats as the true enemy. Same for the 74.2 million who voted for him in 2020. Trump was a duncical demagogue with authoritarian tendencies who tried to deal a fatal blow to our democracy to stay in power. Future candidates will act in similar fashion. None of that matters in a nation with extreme political polarization. Authoritarians will earn votes, and possibly win, simply because they are not with the other party. The two-party trap could exterminate democracy.

We forget that authoritarians are popular. The Netflix docuseries How to Become a Tyrant offers many important warnings to those who care about preserving democracy. Perhaps its most crucial reminder is that authoritarians are popular. (Another: democracy is usually ended slowly, chipped away at.) Many are elected by majorities; even long after coming to power — with democracy replaced by reigns of terror — strongmen can have broad support, even devotion. This should not be so surprising. As noted above, one can see that authoritarianism as an ideology can grow favorable, as can candidates and politicians with authoritarian sentiments. (Research suggests the strongest predictor of whether someone is a Trump supporter is whether he or she has authoritarian views. Trump likely understood and used this.) Yet for those raised in free societies, this can be confounding. Could Americans really vote away democracy, could they be so blind? I would never do that. The answer is yes, and the question is: are you sure?

For more from the author, subscribe and follow or read his books.

Five Ways to Raise MSU’s Profile by 2025

We have three years. In 2025, Missouri State University will celebrate twenty years since our name change. We’ve bolstered attendance, built and renovated campus-wide, and grown more competitive in sports, resulting in a fast-climbing reputation and wider brand awareness.

Let’s keep it going. Here are five strategies to go from fast-climbing to skyrocketing before the historic celebration.

1) Sponsor “Matt & Abby” on social media. Matt and Abby Howard, MSU grads, have over 3 million followers on TikTok, over 1 million subscribers on YouTube, and nearly 800,000 followers on Instagram. Their fun videos occasionally provide free advertising, as they wear MO State shirts and hoodies, but a sponsorship to increase and focus this (imagine them doing BearWear Fridays) would be beneficial. Their views are now collectively in the billions.

2) Offer Terrell Owens a role at a football game. Legendary NFL receiver Terrell Owens (who has a sizable social media presence of his own) appeared on the MSU sideline during the 2021 season, as his son Terique is a Bears wide receiver. Invite Terrell Owens to join the cheer squad and lead the chants at a game. Or ask him to speak at halftime. Advertise it widely to boost attendance and get the story picked up by the national press.

3) Convince John Goodman to get on social media. Beloved actor and MSU alumnus John Goodman is now involved in university fundraising and related media — that’s huge. (Say, get him a role at a game, too.) The only thing that could make this better is if he would get on socials. Goodman would have millions of followers in a day, and with that comes exposure for MO State. Who knows what it would take to convince him after all these years avoiding it, but someone at this university has his ear…and should try.

4) Keep going after that Mizzou game. Mizzou men’s basketball coach Cuonzo Martin, as the former coach at MSU, is our best bet in the foreseeable future for the first MSU-Mizzou showdown since the Bears’ 1998 victory. In fact, a deal was in the works in summer 2020, but quickly fell apart. Martin’s contract ends in 2024 — if it is not renewed, scheduling a game will become much more difficult. Today MO State plays Mizzou in nearly all sports, even if football is irregular (last in 2017, next in 2033). We should keep fighting for a men’s basketball game. Then, of course, win it.

5) Build and beautify. From the John Goodman Amphitheatre to the renovation of Temple Hall, the campus is growing, dazzling. This should continue, for instance with the proposed facility on the south side of Plaster Stadium. Improving football facilities ups the odds of a future invite to an FBS conference. And one cannot forget more trees, possibly the most inexpensive way to radically beautify a university. Filling campus with more greenery, with more new and restored buildings, will position Missouri State as a destination campus for the next 20 years and beyond.

This article first appeared on Yahoo! and the Springfield News-Leader.

For more from the author, subscribe and follow or read his books.

Slowly Abandoning Online Communication and Texting

I grow increasingly suspicious of speaking to others digitally, at least in written form — comments, DMs, texts. It has in fact been 1.5 years since I last replied to a comment on socials, and in that time I have attempted to reduce texting and similar private exchanges. Imagine that, a writer who doesn’t believe in written communication.

The motives for these life changes were largely outlined in Designing a New Social Media Platform:

As everyone has likely noticed, we don’t speak to each other online the way we do in person. We’re generally nastier due to the Online Disinhibition Effect; the normal inhibitions, social cues, and consequences that keep us civil and empathetic in person largely don’t exist. We don’t see each other the same way, because we cannot see each other. Studies show that, compared to verbal communication, we tend to denigrate and dehumanize other people when reading their written disagreements, seeing them as less capable of feeling and reason, which can increase political polarization. We can’t hear tone or see facial expressions, the eyes most important of all, creating fertile ground for both unkindness and misunderstandings. In public discussions, we also tend to put on a show for spectators, perhaps sacrificing kindness for a dunk that will garner likes. So let’s get rid of all that, and force people to talk face-to-face.

Circling back to these points is important because they obviously apply not only to social media but to texting, email, dating apps, and many other features of modern civilization. We all know how easy it is for a light disagreement to somehow turn into something terribly ugly when texting a friend, partner, or family member. It happens so fast we’re bewildered, or angered that things spiraled out of control, that we were so inexplicably unpleasant. It needn’t be this way. Some modes of communication are difficult to curb — if your job involves email, for instance — but it’s helpful to seek balance. You don’t have to forsake a tool completely if you don’t want to, just use it differently, adopt principles. A good rule: at the first hint of disagreement or conflict, stop. (Sometimes we even know it’s coming, and can act preemptively.) Stop texting or emailing about whatever it is. Ask to Facetime or Zoom, or meet in person, or call (at least you can hear them). Look into their eyes, listen to their voice. There are things that are said via text and on socials that would simply never be said in person or using more intimate technologies.

Progress will be different for each person. Some would rather talk than text anyway, and excising the latter from their lives would be simple. Others may actually be able to email less and cover more during meetings. Some enviable souls have detached themselves from social media altogether — which I hope to do at some point, but have found a balance or middle ground for now, since it’s important to me to share my writings, change the way people think, draw attention to political news and actions, and keep track of what local organizations and activists are up to (plus, my job requires social media use).

Changing these behaviors is key to protecting and saving human relationships, and maybe even society itself. First, if there’s an obvious way to avoid firestorms with friends and loved ones, keeping our bonds strong rather than frayed, we should take it. Second, the contribution of social media to political polarization, hatred, and misinformation since 2005 (maybe of the internet since the 1990s) is immeasurable, with tangible impacts on violence and threats to democracy. Society tearing itself apart due at least partially to this new technology sounds less hyperbolic by the day.

And it’s troubling to think that I, with all good intentions, am still contributing to that by posting, online advocacy perhaps having a negative impact on the world alongside an important positive one. What difference does it really make, after all, to share an opinion but not speak to anyone about it? Wouldn’t a social media platform where everyone shared their opinions but did not converse with others, ignored the comments, be just as harmful to society as a platform where we posted opinions and also went to war in the comments section? Perhaps so. The difference may be negligible. But in a year and a half, I have not engaged in any online debate or squabble, avoiding heated emotions toward individuals and bringing about a degree of personal peace (I have instead had political discussions in person, where it’s all more pleasant and productive). If I could advocate for progressivism or secularism while avoiding heightened emotions toward individual pious conservatives, whether friends or random strangers, they could do the same, posting and opining while sidestepping heightened emotions toward me. This doesn’t solve the divisiveness of social media — the awful beliefs and posts from the other side (whichever that is for you) are still there. Plenty of harmful aspects still exist beside the positive ones that keep us on. But perhaps it lowers the temperature a little.

For more from the author, subscribe and follow or read his books.

The Future of American Politics

The following are five predictions about the future of U.S. politics. Some are short-term, others long-term; some are possible, others probable.

One-term presidents. In a time of extreme political polarization and razor-thin electoral victories, we may have to get used to the White House changing hands every four years rather than eight. In 2016, Trump won Michigan by 13,000 votes, Wisconsin by 27,000, Pennsylvania by 68,000, Arizona by 91,000. Biden won those same states in 2020 by 154,000, 21,000, 82,000, and 10,000, respectively. Other states were close as well, such as Biden’s +13,000 in Georgia or Clinton’s +2,700 in New Hampshire. Competitive races are nothing new in election history, and 13 presidents (including Trump) have failed to reach a second term directly after their first, but Trump’s defeat was the first incumbent loss in nearly 30 years. The bitter divisions and conspiratorial hysteria of modern times may make swing state races closer than ever, resulting in fewer two-term presidents — at least consecutive ones — in the near-term.

Mail privacy under rightwing attack. When abortion was illegal in the United States, there were many abortions. If Roe falls and states outlaw the procedure, or if the Supreme Court continues to allow restrictions that essentially do the same, we will again see many illegal terminations — only they will be far safer and easier this time, with abortion pills via mail. Even if your state bans the purchase, sale, or use of the pill, mail forwarding services or help from out-of-town friends (shipping the pills to a pro-choice state and then having them mailed to you) will easily get the pills to your home. Is mail privacy a future rightwing target? The U.S. has a history of banning the mailing of contraceptives, information on abortion, pornography, lottery tickets, and more, enforced through surveillance, requiring the Supreme Court to declare our mail cannot be opened without a warrant. It is possible the Right will attempt to categorize abortion pills as items illegal to ship and even push for the return of warrantless searches.

Further demagoguery, authoritarianism, and lunacy. Trump’s success is already inspiring others, some worse than he is, to run for elected office. His party looks the other way or enthusiastically embraces his deceitful attempts to overturn fair elections because it is most interested in power, reason and democracy be damned. Same for Trump’s demagoguery, his other lies and authoritarian tendencies, his extreme policies, his awful personal behavior — his base loves it all and it’s all terribly useful to the GOP. While Trump’s loss at the polls in 2020 may cause some to second-guess the wisdom of supporting such a lunatic, at least those not among the 40% of citizens who still believe the election was stolen, at present it seems the conservative base and the Republican Party are largely ready for Round 2. What the people want and the party tolerates they will get; what’s favored and encouraged will be perpetuated and created anew. It’s now difficult to imagine a normal human being, a classic Republican, a decent person like Mitt Romney, Liz Cheney, Jon Huntsman, John Kasich, or even Marco Rubio beating an extremist fool at the primary polls. The madness will likely continue for some time, both with Trump and others who come later, with only temporary respites of normalcy between monsters. Meanwhile, weaknesses in the political and legal system Trump exploited will no doubt remain unfixed for an exceptionally long time.

Republicans fight for their lives / A downward spiral against democracy. In a perverse sort of way, Republican cheating may be a good sign. Gerrymandering, voter suppression in all its forms, support for overturning a fair election, desperation to hold on to the Electoral College, and ignoring ballot initiatives passed by voters are the acts and sentiments of the fearful, those who no longer believe they can win honestly. And given the demographic changes already occurring in the U.S. that will transform the nation in the next 50-60 years (see next section), they’re increasingly correct. Republicans have an ever-growing incentive to cheat. Unfortunately, this means the Democrats do as well. Democrats may be better at putting democracy and fairness ahead of power interests, but this wall already has severe cracks, and one wonders how long it will hold. For example, the GOP refused to allow Obama to place a justice on the Supreme Court, and many Democrats dreamed of doing the same to Trump, plus expanding the Court during the Biden era. Democrats of course also gerrymander U.S. House and state legislature districts to their own advantage (the Princeton Gerrymandering Project is a good resource), even if Republican gerrymandering is worse, four times worse — therefore reaping bigger advantages. It’s sometimes challenging to parse out which Democratic moves are reactions to Republican tactics and which they would do anyway to protect their seats, but it’s obvious that any step away from impartiality and true democracy encourages the other party to do the same, creating a downward anti-democratic spiral, a race to the bottom.

(One argument might be addressed before moving on. Democrats generally make it easier for people to vote and support the elimination of the Electoral College, though again liberals are not angels and there are exceptions to both these statements. Aren’t those dirty tactics that serve their interests? As I wrote in The Enduring Stupidity of the Electoral College, which shows that this old anti-democratic system is unfair to each individual voter, “True, the popular vote may serve Democratic interests. Fairness serves Democratic interests. But, unlike unfairness, which Republicans seek to preserve, fairness is what’s right. Giving the candidate with the most votes the presidency is what’s right.” Same for not making it difficult for people who usually vote the “wrong” way to cast their ballots! You do what is right and fair, regardless of who it helps.)

Democratic dominance. In the long-term, Democrats will become the dominant party through demographics alone. Voters under 30 favored the Democratic presidential candidate by large margins in 2004, 2008, 2012, 2016, and 2020 — voters under 40 also went blue by a comfortable margin. Given that individual political views mostly remain stable over time (the idea that most or even many young people will grow more conservative as they age is unsupported by research), in 50 or 60 years this will be a rather different country. Today we still have voters (and politicians) in their 80s and 90s who were segregationists during Jim Crow. In five or six decades, those over 40 today (who lean Republican) will be gone, leaving a bloc of older voters who have leaned blue their entire lives, plus a new generation of younger and middle-aged voters likely more liberal than any of us today. This is on top of an increasingly diverse country, with people of color likely the majority in the 2040s — with the white population already declining by total numbers and as a share of the overall population, Republican strength will weaken further (the majority of whites have long voted Republican; the majority of people of color vote blue). A final point: the percentage of Americans who identify as liberal is steadily increasing, as opposed to those who identify as conservative, and Democrats have already won the popular vote in seven of the last eight presidential elections. Republican life rafts such as the Electoral College (whose swing states will experience these same changes) and other anti-democratic practices will grow hopelessly ineffective under the crushing weight of demographic metamorphosis. Assuming our democracy survives, the GOP will be forced to moderate to have a chance at competing.

For more from the author, subscribe and follow or read his books.

Is It Possible For Missouri State to Grow Larger Than Mizzou?

Students and alumni of Missouri State (and perhaps some of the University of Missouri) at times wonder if MSU will ever become the largest university in the state. While past trends are never a perfect predictor of the future, looking at the enrollment patterns of each institution can help offer an answer. Here are the total enrollment figures for each school since 2005.

Mizzou
Via its Student Body Profile reports and enrollment summary (Columbia campus):

2005 – 27,985
2006 – 28,253
2007 – 28,477
2008 – 30,200
2009 – 31,314
2010 – 32,415
2011 – 33,805
2012 – 34,748
2013 – 34,658
2014 – 35,441
2015 – 35,448
2016 – 33,266
2017 – 30,870
2018 – 29,866
2019 – 30,046
2020 – 31,103
2021 – 31,412

Missouri State
Via its enrollment history report (Springfield campus):

2005 – 19,165
2006 – 19,464
2007 – 19,705
2008 – 19,925
2009 – 20,842
2010 – 20,949
2011 – 20,802
2012 – 21,059
2013 – 21,798
2014 – 22,385
2015 – 22,834
2016 – 24,116
2017 – 24,350
2018 – 24,390
2019 – 24,126
2020 – 24,163
2021 – 23,618

In the past 16 years, MSU gained on average 278.3 new students each Fall. Mizzou gained 214.2 new students per year, an average tanked by the September 2015 racism controversy. Before the controversy (2005-2015 data), Mizzou gained 746.3 new students per year (MSU, over the same ten years, +366.9). From a low point in 2018, Mizzou has since, over a three-year period, gained on average 515.3 new students (over the same time, MSU saw -257.3 students — one school’s gain is often the other’s loss). This is too short a timeframe to draw unquestionable conclusions, but with Mizzou back on its feet it seems likely to continue to acquire more students on average each year, making MSU’s ascension to the top unlikely.
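For readers who want to check the arithmetic, or rerun it as new Fall headcounts are published, here is a minimal Python sketch that reproduces the averages above from the two enrollment lists. The script and its variable names are illustrative only, not anything produced by either university.

```python
# Fall enrollment, 2005-2021, copied from the lists above.
years = list(range(2005, 2022))
mizzou = [27985, 28253, 28477, 30200, 31314, 32415, 33805, 34748, 34658,
          35441, 35448, 33266, 30870, 29866, 30046, 31103, 31412]
msu = [19165, 19464, 19705, 19925, 20842, 20949, 20802, 21059, 21798,
       22385, 22834, 24116, 24350, 24390, 24126, 24163, 23618]

def avg_gain(series, start, end):
    """Average enrollment change per Fall between two years."""
    i, j = years.index(start), years.index(end)
    return (series[j] - series[i]) / (j - i)

print(round(avg_gain(msu, 2005, 2021), 1))     # 278.3 new students per Fall
print(round(avg_gain(mizzou, 2005, 2021), 1))  # 214.2
print(round(avg_gain(mizzou, 2005, 2015), 1))  # 746.3 (before the 2015 controversy)
print(round(avg_gain(msu, 2005, 2015), 1))     # 366.9
print(round(avg_gain(mizzou, 2018, 2021), 1))  # 515.3 (the rebound from the 2018 low)
print(round(avg_gain(msu, 2018, 2021), 1))     # -257.3
```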

Predicting future enrollment patterns is rather difficult, of course. Over the past decade, fewer Americans have attended university, including fewer Missourians — and that was before COVID. Like a pandemic or a controversy, some disruptors cannot be predicted, nor can boosts to student populations. But most challenges will be faced by both schools: fewer young people, better economic times (which draws folks to the working world), pandemics, etc. The rising cost of college may give a university that is slightly more affordable an edge, as has been Missouri State’s long-time strategy. An increased profile through growing name recognition (it’s only been 16 years since Missouri State’s name change), success in sports, clever marketing schemes (alumnus John Goodman is now involved with MSU), ending Mizzou’s near-monopoly on doctoral degrees, and so on could make a difference, but there remains a huge advantage to simply being an older school, with a head-start in enrollment and brand recognition.

For more from the author, subscribe and follow or read his books.

COVID Showed Americans Don’t Leech Off Unemployment Checks

In most states, during normal times, you can use unemployment insurance for at most 26 weeks, half the year, and will receive 30-50% of the wages from your previous job, up to a certain income. This means $200-400 a week on average. One must meet a list of requirements to qualify, for instance having lost a job due to cutbacks, through no fault of one’s own. Only 35-40% of unemployed persons receive UI.

This means that at any given time, about 2 million Americans are receiving UI; in April/May 2020, with COVID-19 and State measures to prevent its spread causing mass firings, that number skyrocketed to 22 million. Put another way, just 1-3% of the workforce is usually using UI, and during the pandemic spike it was about 16%. Just before that rise, it was at 1.5% — and it returned to that rate in November 2021, just a year and a half later. Indeed, the number of recipients fell as fast as it shot up, from 16% to under 8% in just four months (September 2020), down to 4% in six months (November 2020). For all the pearl-clutching among conservatives (at least those who did not use UI) over increased dependency, especially with the temporary $600 federal boost to UI payments, tens of millions of Americans did not leech off the system. They got off early, even though emergency measures allowed them to stay on the entire year of 2020 and into the first three months of 2021! (The trend was straight down, by the way, even before the $600 boost ended.)
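As a rough check on those percentages, here is a small sketch that divides the recipient counts cited above by the roughly 161-million-person labor force mentioned later in this piece. This is ballpark arithmetic under that assumption; official rates typically use covered employment as the denominator, which is why the pandemic-peak share lands a couple of points below the ~16% figure.

```python
labor_force = 161_000_000          # labor force figure cited later in the piece

usual_on_ui = 2_000_000            # typical recipients at any given time
covid_peak_on_ui = 22_000_000      # April/May 2020 spike

print(f"{usual_on_ui / labor_force:.1%}")       # 1.2% -- within the usual 1-3% range
print(f"{covid_peak_on_ui / labor_force:.1%}")  # 13.7% -- same ballpark as the ~16% cited
```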

This in fact reflects what we’ve always known about unemployment insurance. It’s used as intended, as a temporary aid to those in financial trouble (though many low-wage workers don’t have access to it, which must be corrected). Look at the past 10 years of UI use. The average stay in the program (“duration”) each year was 17 or 18 weeks in times of economic recovery, 14 or 15 weeks in better economic times (sometimes even fewer). Four months or so, then a recipient stops filing for benefits, having found a job or ameliorated his or her crisis in some fashion. Some “enjoy” the 30-50% of previous wages for the whole stretch, but the average recipient doesn’t even use UI for 20 weeks, let alone the full 26 allowed. This makes sense, given how much of a pay cut UI is. Again, many Americans stop early, and the rest are cut off — so why all the screaming about leeching? Only during the COVID crisis did the average duration climb higher, to 26-27 weeks, as the federal government offered months of additional aid, as mentioned — again, many did not receive benefits for as long as they could have.

Those that receive benefits will not necessarily do the same next year. In times of moderate unemployment, for example, about 30% of displaced workers and 50% of workers on temporary layoff who receive benefits in Year 1 will reapply for benefits in Year 2. The rest do not refile.

However, we must be nuanced thinkers. Multiple things can be true at the same time. UI can also extend unemployment periods, which makes a great deal of sense even if UI benefits represent a drastic pay cut. UI gives workers some flexibility to be more selective in the job hunt. An accountant who has lost her position may, with some money coming in and keeping a savings account afloat, be able to undertake a longer search for another accounting job, rather than being forced to take the first thing she can find, such as a waitressing job. This extra time is important, because finding a similar-wage job means you can keep your house or current apartment, won’t fall further into poverty, etc. There are many factors behind the current shortage of workers, and UI seems to be having a small effect (indeed, studies range between no effect and moderate effects). And of course, in a big, complex world there will be some souls who avoid work as long as they can, and others who commit fraud (during COVID, vast sums were siphoned from our UI by individuals and organized crime rings alike, in the U.S. and from around the globe; any human being with internet access can attempt a scam). But that’s not most Americans. While UI allows workers to be more selective, prolonging an unemployed term a bit, they nevertheless generally stop filing for benefits early and avoid going back.

To summarize, for the conservatives in the back. The U.S. labor force is 161 million people. A tiny fraction is being aided by UI at any given moment. Those that are generally don’t stay the entire time they could. Those who do use 26 weeks of benefits will be denied further aid for the year (though extended benefits are sometimes possible in states with rising unemployment). Most recipients don’t refile the next year. True, lengths of unemployment may be increased some, and there will always be some Americans who take advantage of systems like this, but most people would prefer not to, instead wanting what all deserve — a good job, with a living wage.

For more from the author, subscribe and follow or read his books.

Big Government Programs Actually Prevent Totalitarianism

There is often much screaming among conservatives that big government programs — new ones like universal healthcare, universal college education, or guaranteed work, and long-established ones like Social Security, Medicaid, and Medicare — somehow lead to dictatorship. There is, naturally, no actual evidence for this. The imagined correlation is justified with nothing beyond “that’s socialism, which always becomes totalitarianism,” an ignorance already addressed elsewhere. The experience of advanced democracies around the world, and indeed the U.S. itself, suggests big government programs, run by big departments with big budgets and big staffs helping tens of millions of citizens, can happily coexist alongside elected governing bodies and presidents, constitutions, and human rights, as one would expect.

Threats to democracy come from elsewhere — but what’s interesting to consider is how conservatives have things completely backward. Big government programs — the demonstration that one’s democracy is a government “for the people,” existing to meet citizen needs and desires — are key to beating back the real threats to a republic.

In a recent interview with The Nation, Bernie Sanders touched on this:

“Why it is imperative that we address these issues today is not only because of the issues themselves—because families should not have to spend a huge proportion of their income on child care or sending their kid to college—but because we have got to address the reality that a very significant and growing number of Americans no longer have faith that their government is concerned about their needs,” says the senator. “This takes us to the whole threat of Trumpism and the attacks on democracy. If you are a worker who is working for lower wages today than you did 20 years ago, if you can’t afford to send your kid to college, etc., and if you see the very, very richest people in this country becoming phenomenally rich, you are asking yourself, ‘Who controls the government, and does the government care about my suffering and the problems of my family?’”

Sanders argues that restoring faith in government as a force for good is the most effective way to counter threats to democracy.

And he’s right. Empirical evidence suggests economic crises erode the rule of law and faith in representative democracy. Depressions are not the only force that pushes in this direction, but they are significant and at times a killing blow to democratic systems. Unemployment, low wages, a rising cost of living — hardship and poverty, in other words — drive citizens toward extreme parties and voices, including authoritarians. Such leaders are then elected to office, and begin to dismantle democracy with support of much of the population. Europe in the 1930s is the oft-cited example, but the same has been seen after the global recession beginning in 2008, with disturbing outgrowths of recent declining trust in democracy: the success of politicians with demagogic and anti-democratic bents like Trump, hysteria over fictional stolen elections that threatens to keep unelected people in office, and dangerous far-right parties making gains in Europe. The Eurozone and austerity crisis, the COVID-induced economic turmoil, and more have produced similar concerns.

What about the reverse? If economic disaster harms devotion to real democracy and politicians who believe in it, does the welfare state increase support for and faith in democracy? Studies also suggest this is so. Government tackling poverty through social programs increases satisfaction with democratic systems! The perception that inequality is rising and welfare isn’t doing enough to address it does the exact opposite. A helping hand increases happiness, and is expected from democracies, inherently linking favorable views of republics and of redistribution. If we wish to inoculate the citizenry against authoritarian candidates and anti-democratic practices within established government, shoring up loyalty to democracy through big government programs is crucial.

It is as Sanders said: the most important thing for the government to do to strengthen our democracy and even heal polarization (“Maybe the Democrats putting $300 per child per month in my bank account aren’t so evil”) is simply to help people. To work for and serve all. Healthcare, education, income support, jobs…such services help those on the Right, Left, and everyone in between. This should be done whether there is economic bust or boom. People hold fast to democracy, a government of and by the people, when it is clearly a government for the people. If we lose the latter, so too the former.

For more from the author, subscribe and follow or read his books.

COVID Proved Social Conditions Largely Determine Our Health

In the past year, it has been heavily impressed upon Kansas Citians that one’s health is to a significant degree determined by factors beyond one’s control. The COVID-19 era is a key moment to further break down the reactionary notion that personal health choices are all that stands between an individual and optimal physical and mental well-being. It’s broadened our understanding of how health is also a product of social conditions.

The first and most elementary fact to note is that viruses, while often hitting vulnerable populations such as the elderly hardest, are not entirely discriminatory. They end the lives of the young and healthy as well. Regardless of one’s habits of eating, exercise, or not smoking, random exposure to illnesses new or old as one shops for groceries or rides in an Uber helps introduce the point: The environment often makes a mockery of our personal choices, as important as those are.

The family you are born into, where you grow up, and other factors beyond your control — and often your own awareness — have a large impact on your development and health as a child, which in turn impacts your health as an adult. (And the environment you happen to be in continues to affect you.) Poverty, extremely stressful on the mind and body in many ways, is the ultimate destructive circumstance for children and adults alike. Take the disturbing life expectancy differences between the poor and the better-off, for instance. In Kansas City’s poorest ZIP codes, which are disproportionately black, you can expect to live 18 fewer years on average compared to our richest, whitest ZIP codes, as Flatland reported on June 22. Poor families are less likely to have health care offered by an employer or be able to afford it themselves. They live in social conditions that include more violence or worse air and water pollution. They can at times only afford housing owned by negligent landlords slow to take care of mold, and must cope with a million other factors.

During the pandemic, what serious observers of the social determinants of health predicted came true: Black Kansas Citians were hammered by COVID-19. Here we feel, today, the cold touch of slavery and Jim Crow, which birthed disproportionate poverty, which nurtured worse health, which resulted in Black Kansas Citians being more likely to catch coronavirus and die from it, as The Star reported even in the early stages of the pandemic. Worse still, on Feb. 24, the paper noted that richer, whiter ZIP codes — the areas of less urgent need — were getting disproportionately more vaccines than poorer areas with more Black residents. The vaccines were first shipped by the state to health centers that were convenient for some but distant from others.

Imagine history and race playing a role in your health, how soon you could get a shot. Imagine transportation options and where you live being factors. Likewise, imagine the kind of job you have doing the same: Lower-income workers are more likely to have front-line jobs at restaurants and grocery stores, where you can catch the virus. The privileged, better-off often work from home.

Whether it is drinking water you don’t know is unsafe or working at a job that requires much human contact during a pandemic, the determinants of health stretch far beyond exercising, eating right, and choosing not to smoke. To reflect on this fact is to understand a moral duty. If social conditions affect the health of individuals and families, it is urgent to change social conditions — to build a decent society, one without poverty and the many horrors that flow from it.

In this moment, one important way to help move toward this goal is to urge the U.S. House to pass the reconciliation budget that just passed the Senate, to extend the direct child tax credit payments to families, boldly expand education and health care, and more. Onward, a better world awaits.

This article first appeared in The Kansas City Star: https://www.kansascity.com/opinion/readers-opinion/guest-commentary/article253638658.html

For more from the author, subscribe and follow or read his books.

Is Time the Only Cure for COVID Foolishness?

As August 2021 began, 50% of the U.S. population was fully vaccinated against COVID-19, over 165 million people. There have been 615,000 confirmed deaths — the actual number, given the national excess mortality rate since the start of 2020, is likely double official figures. Over a 12-month period, since last August, 2.5 million people were hospitalized, many leaving with lasting medical problems. All the while, protests and foaming at the mouth over mask and vaccine mandates continue; half the population has refused or delayed the vaccine, a group disproportionately (by roughly 20 percentage points) Republican.

Attempting to convince the conspiracy theorists, bullheaded conservatives, and those concerned over how (historically) fast the vaccine breakthrough occurred is of course still the moral and pressing thing to do. This piece isn’t an exercise in fatalism, despite its headline. However, great frustration exists: if the hesitant haven’t been convinced by now, what will move the needle? With over a year and a half to absorb the dangers of COVID, deadly and otherwise, and eight months to observe a vaccine rollout that has given 1.2 billion people globally highly effective protection, with only an infinitesimally small percentage seeing any side effects (similar to everyday meds), what could possibly be said to convince someone to finally listen to the world’s medical and scientific consensus, to listen to reason? People have been given a chance to compare the disease to the shots (the unvaccinated are 25 times more likely to be hospitalized from COVID and 24 times more likely to die, with nearly all [97, 98, 99%] of COVID deaths now among the unprotected population), but that requires a trust in the expert consensus and data and trials and peer-reviewed research and all those things that make American stomachs churn. Giving people accurate information and sources can even make them less likely to see the light! There is, for some bizarre reason, more comfort and trust in the rogue doctor peddling unfounded nonsense on YouTube.

It may be of some comfort then to recognize that the insanity will surely decrease as time goes on. It’s already occurring. The most powerful answer to “what will move the needle?” is “personal impact” — as time passes, more people will know someone hospitalized or wiped from existence by the disease, and also know someone who has been vaccinated and is completely fine. There will be more family members who get the vaccine behind your back and more friends and acquaintances you’ll see online or in the media expressing deep regret from their ICU hospital beds. You may even be hospitalized yourself. Such things will make a difference. States currently hit hardest by the Delta variant and seeing overall cases skyrocket — the less vaccinated states — are also witnessing increases in vaccination rates. Even conservative media outlets and voices are breaking under the weight of reason, finally beginning to promote the vaccine and changing viewers’ minds, while naturally remaining in Absurdsville by pretending their anti-inoculation hysteria never occurred and blaming Democrats for vaccine hesitancy. Eventually, falsities and mad beliefs yield to science and reason, as we’ve seen throughout history. True, many will never change their minds, and will go to their deaths (likely untimely) believing COVID to be a hoax, or exaggerated, or less risky than a vaccine. But others will yield, shaken to the core by loved ones lost to the virus (one-fourth to one-third of citizens at least know someone who died already) or vaccinated without becoming a zombie, or even by growing ill themselves.

To say more time is needed to end the foolishness is, admittedly, in part to say more illness and death are needed. As stated, the more people a hesitant person knows who have grown ill or died, the more likely the hesitant person is to get his or her shots. A terrible thing to say, yet true. That is why we cannot rest, letting time work on its own. We must continue trying to convince people, through example, empathy (it’s often not logic that changes minds, but love), hand-holding, and other methods offered by psychologists. Lives can be saved. And to convince someone to get vaccinated is not only to protect them and others against COVID, it suddenly creates a person in someone else’s inner circle who has received the shots, perhaps helping the behavior spread. Both we and Father Time can make sure hesitant folk know more people who have been vaccinated, the more pleasant piece of time’s function.

Hopefully, our experience with coronavirus will prepare us for more deadly pandemics in the future, in terms of our behavior, healthcare systems, epidemiology, and more. As bad as COVID-19 is, as bad as Delta is, humanity was exceptionally lucky. The disease could have been far deadlier, far more contagious; the vaccine could have taken much longer, and been less effective. We’ve seen four million deaths worldwide, but even with this virus evolving and worsening, we’ll likely see nothing like the 50 million dead from the 1918 pandemic. Some see the rebellion against masks, lockdowns, and vaccines as a frightening sign: such insanity will spell absolute catastrophe when a deadlier virus comes around. This writer has always suspected (perhaps only hoped) that view to be a bit backward. A deadlier virus would likely mean less rebellion (as would a virus you could see on other people, something more visually horrifying like leprosy). It’s the relative tameness of COVID that allows for the high degree of madness. Admittedly, there was anti-mask resistance during the 1918 crisis, but there could nonetheless be an inverse correlation between the seriousness of an epidemic and the willingness to engage in suicidal foolishness. That aligns with the idea that the more people you lose in your inner circle, the more likely you are to give in and visit your local health clinic. Let’s hope science and reason reduce the opportunities to test this correlation hypothesis.

For more from the author, subscribe and follow or read his books.

Woke Cancel Culture Through the Lens of Reason

What follows are a few thoughts on how to view wokeism and cancel culture with nuance:

Two Basic Principles (or, Too Much of a Good Thing)

There are two principles that first spring to mind when considering cancel culture. First, reason and ethics, to this writer, suggest that social consequences are a good thing. There are certain words and actions that one in a free society would certainly not wish to result in fines, community service, imprisonment, or execution by government, but are deserving of proportional and reasonable punishments by private actors, ordinary people. It is right that someone who uses a racial slur loses their job or show or social media account. A decent person and decent society wants there to be social consequences for immoral actions, because it discourages such actions and helps build a better world. One can believe in this while also supporting free speech rights and the First Amendment, which obviously have to do with how the government responds to what you say and do, not private persons and entities.

The second principle acknowledges that there will be many cases where social consequences are not proportional or reasonable, where things go too far and people, Right and Left, are crushed for rather minor offenses. It’s difficult to think of many social trends or ideological movements that did not go overboard in some fashion, after all. There are simply some circumstances where there was an overreaction to words and deeds, where mercy should have been the course rather than retribution. (Especially worthy of consideration: was the perpetrator young at the time of the crime, with an underdeveloped brain? Was the offense in the past, giving someone time to change and grow, to regret it?) Readers will disagree over which specific cases fall into this category, but surely most will agree with the general principle, simply that overreaction in fact occurs. I can’t be the only Leftist who both nods approvingly in some cases and in others thinks, “She didn’t deserve that” or “My, what a disproportionate response.” Stupid acts might deserve a different response than racist ones, dumb ideas a different tack than dangerous ones, and so on. It might be added that overreactions not only punish others improperly, but also encourage forced, insincere apologies — somewhat reminiscent of the adage that you shouldn’t make faith a requirement of holding office, as you’ll only end up with performative religiosity.

Acknowledging and pondering both these principles is important.

“Free Speech” Only Concerns Government-Citizen Interaction

Again, in most cases, the phrase “free speech” is basically irrelevant to the cancel culture conversation. It’s worth emphasizing. Businesses and individuals — social media companies, workplaces, show venues, a virtual friend who blocks you or deletes your comment — have every right to de-platform, cancel, censor, and fire. The whining about someone’s “free speech” being violated when they’re cancelled is sophomoric and ignorant — the First Amendment and free speech rights are about whether the government will punish you, not non-government actors.

Which makes sense, for an employer or individual could just as easily be said to have the “free speech right” to fire or cancel you — why is your “free speech right” mightier than theirs?

Public universities and government workplaces, a bit different, are discussed below.

Why is the Left at Each Other’s Throats?

At times the national conversation is about the left-wing mob coming for conservatives, but we know it comes for its own with just as much enthusiasm. Maybe more, some special drive to purge bad ideas and practices from our own house. Few involved in left-wing advocacy of some kind haven’t found themselves in the circular firing squad, whether firing or getting blasted — most of us have probably experienced both. It’s a race to be the most woke, and can lead to a lot of nastiness.

What produces this? Largely pure motives, for if there’s a path that’s more tolerant, more just, that will build a better future, we want others to see and take it. It’s a deep desire to do what’s right and get others to do the same. (That the pursuit of certain kinds of tolerance [racial, gender, etc.] would lead to ideological intolerance has been called ironic or hypocritical, but seems, while it can go too far at times, more natural and inevitable — there’s no ending separate drinking fountains without crushing the segregationist’s ideology.)

But perhaps the inner turmoil also comes from troublesome notions of monolithic group thinking, plus a desperate desire for there to be one right answer when there isn’t one. Because we sometimes look at impacted groups as composed of members all thinking the same way, or enough of them thinking the same way, we conclude there is one right answer and that anyone who questions it should be trampled on. For example, you could use “person with autism” (person-first language) rather than “autistic person” (identity-first language) and fall under attack for not being woke enough. Identity-first language is more popular among the impacted group members, and the common practice with language among non-impacted persons is to defer to majority opinions. But majority opinions aren’t strictly “right” — to say this is of course to say the minority of the impacted group members are simply wrong. Who would have the arrogance and audacity to say this? It’s simply different opinions, diversity of thought. (Language and semantics are minefields on the Left, but so are varying policy ideas.) There’s nothing wrong with deferring to majority opinion, but if we were not so focused on there being one right answer, if we didn’t view groups as single-minded or single-minded enough, we would be much more tolerant of people’s “mistakes” and less likely to stoop to nastiness. We’d respect and explore and perhaps even celebrate different views within our side of the political spectrum. It’s worth adding that we go just as crazy when the majority impacted group opinion is against an idea. It may be more woke, for example, to support police abolition or smaller police presences in black neighborhoods, but 81% of black Americans don’t want the police going anywhere, so the majority argument won’t always help a case. Instead of condemning someone who isn’t on board with such policies as not caring enough about racial justice, not being woke enough, being dead wrong, we should again remember there is great diversity of thought out there and many ideas, many possible right answers beyond our own, to consider and discuss with civility. One suspects that few individuals, if intellectually honest, would always support the most radical or woke policy posited (more likely, you’ll disagree with something), so more tolerance and humility are appropriate.

The same should be shown toward many in the middle and on the Right as well. Some deserve a thrashing. Others don’t.

The University Onus

One hardly envies the position college administrators find themselves in, pulled between the idea that a true place of learning should include diverse and dissenting opinions, the desire to punish and prevent hate speech or awful behaviors, the interest in responding to student demands, and the knowledge that the loudest, best organized demands are at times themselves minority opinions, not representative.

Private universities are like private businesses, in that there’s no real argument against them cancelling as they please.

But public universities, owned by the states, have a special responsibility to protect a wide range of opinion, from faculty, students, guest speakers, and more, as I’ve written elsewhere. As much as this writer loves seeing the power of student organizing and protest, and the capitulation to that power by decision-makers at the top, public colleges should take a harder line in many cases to defend views or actions that are deemed offensive, in order to keep these spaces open to ideological diversity and not drive away students who could very much benefit from being in an environment with people of different classes, ethnicities, genders, sexual orientations, religions, and politics. Similar to the above, that is a sensible general principle. There will of course be circumstances where words and deeds should be crushed, cancellation swift and terrible. Where that line is, again, is a matter of disagreement. But the principle is simply that public colleges should save firings, censorship, cancellation, suspension, and expulsion for more extreme cases than is current practice. The same for other public entities and public workplaces. Such spaces are linked to the government, which actually does bring the First Amendment and other free speech rights into the conversation, and therefore there exists a special onus to allow broader ranges of views.

Cancel Culture Isn’t New — It’s Just the Left’s Turn

If you look at the surveys that have been conducted, two things become clear: 1) support for cancel culture is higher on the Left, but 2) it’s also a problem on the Right.

50% of staunch progressives “would support firing a business executive who personally donated to Donald Trump’s campaign,” vs. 36% of staunch conservatives who “would support firing Biden donors.” Republicans are much more worried about their beliefs costing them their jobs (though a quarter of Democrats worry, too), conservatives are drastically more afraid to share opinions (nearly 80%, vs. just over 40% for strong liberals), and only in the “strong liberal” camp does a majority (58%) feel free to speak its mind without offending others (liberals 48%, conservatives 23%). While almost 100% of the most conservative Americans see political correctness as a problem, 30% of the most progressive Americans agree, not an insignificant figure (overall, 80% of citizens agree). There’s some common ground here.

While the Left is clearly leading modern cancel culture, it’s important to note that conservatives often play by the same rules, despite rhetoric about how they are the true defenders of “free speech.” If Kaepernick kneels for the anthem, he should be fired. If a company (Nike, Gillette, Target, NASCAR, Keurig, MLB, Delta, etc.) gets political on the wrong side of the spectrum, boycott it and destroy your possessions, while Republican officials legislate punishment. If Republican Liz Cheney denounces Trump’s lies, remove her from her leadership post. Rage over and demand cancellation of Ellen, Beyonce, Jane Fonda, Samantha Bee, Kathy Griffin, Michelle Wolf, and Bill Maher for using their free speech. Obviously, no one called for more firings for views he didn’t like than Trump. If the Dixie Chicks criticize the invasion of Iraq, wipe them from the airwaves, destroy their CDs. Thomas Hitchner recently put together an important piece on conservative censorship and cancellation during the post-9/11 orgy of patriotism, for those interested. And don’t forget what happened to Sinéad O’Connor after she tore up a photograph of the Pope (over the Catholic Church sexual abuse scandal) on SNL in 1992: her records were crushed under a steamroller in Times Square and her career was cancelled.

More importantly, when we place this phenomenon in the context of history, we come to suspect that rather than being something special to the Left (or naturally more powerful on the Left, because liberals hate free speech and so on), cancel culture seems to be, predictably, led by the strongest cultural and political ideology of the moment. When the U.S. was more conservative, it was the Right that was leading the charge to ensure people with dissenting views were fired, censored, and so on. The hammer, rather than being wielded by the far Left, came down on it.

You could look to the socialists and radicals, like Eugene Debs, who were literally imprisoned for speaking out against World War I, but consider, more recently, the McCarthy era after World War II, when government workers, literary figures, media anchors, and Hollywood writers, actors, and filmmakers accused of socialist or communist sympathies were hunted down and fired, blacklisted, slandered, imprisoned for refusing to answer questions at the witch trials, and so forth, as discussed in A History of the American People by conservative Paul Johnson. The Red Scare was in many ways far worse than modern cancel culture — it wasn’t simply the mob that came for you, it was the mob and the government. However, lest anyone think this was just Republican Big Government run amok rather than a cultural craze working in concert, recall that it was the movie studios doing the actual firing and blacklisting, the universities letting faculty go, LOOK and other magazines reprinting Army “How to Spot a Communist” propaganda, ordinary people pushing and marching and rallying against communism, etc.

All this overlapped, as leftwing economic philosophies usually do, with the fight for racial justice. Kali Holloway writes for The Nation:

There was also [black socialist] Paul Robeson, who had his passport revoked by the US State Department for his political beliefs and was forced to spend more than a decade living abroad. Racism and red-scare hysteria also canceled the acting career of Canada Lee, who was blacklisted from movies and died broke in 1952 at the age of 45. The [anti-segregationist] song “Mississippi Goddam” got Nina Simone banned from the radio and much of the American South, and the Federal Bureau of Narcotics essentially hounded Billie Holiday to death for the sin of stubbornly refusing to stop performing the anti-lynching song “Strange Fruit.”

Connectedly, there was the Lavender Scare, a purge of gays and suspected gays from government and private workplaces. Some 5,000 to 10,000 people lost their jobs:

“It’s important to remember that the Cold War was perceived as a kind of moral crusade,” says [historian David K.] Johnson, whose 2004 book The Lavender Scare popularized the phrase and is widely regarded as the first major historical examination of the policy and its impact. The political and moral fears about alleged subversives became intertwined with a backlash against homosexuality, as gay and lesbian culture had grown in visibility in the post-war years. The Lavender Scare tied these notions together, conflating gay people with communists and alleging they could not be trusted with government secrets and labelling them as security risks, even though there was no evidence to prove this.

The 1950s was a difficult era for the Left and its civil rights advocates, class warriors, and gay liberators, with persecution and censorship the norm. More conservative times, a stronger conservative cancel culture. This did not end with that decade, of course (one of my own heroes, Howard Zinn, was fired from Spelman College in 1963 for his civil rights activism), but soon a long transition began. Paul Johnson mused:

The significant fact about McCarthyism, seen in retrospect, was that it was the last occasion, in the 20th century, when the hysterical pressure on the American people to conform came from the right of the political spectrum, and when the witchhunt was organized by conservative elements. Thereafter the hunters became the hunted.

While, as we saw, the Right is still often a hunter as well, and therefore we see much hypocrisy today, there is some truth to this statement, as from the 1960s and ’70s the nation began slowly liberalizing. Individuals increasingly embraced liberalism, as did some institutions, like academia, the media, and Hollywood (others, such as the church, military, and law enforcement, remain quite conservative). The U.S. is still growing increasingly liberal, increasingly favoring New Deal policies, for example, even though more Americans still identify as conservative:

Since 1992, the percentage of Americans identifying as liberal has risen from 17% then to 26% today. This has been mostly offset by a shrinking percentage of moderates, from 43% to 35%. Meanwhile, from 1993 to 2016 the percentage conservative was consistently between 36% and 40%, before dipping to 35% in 2017 and holding at that level in 2018.

On top of this, the invention and growth of social media since the mid-2000s has dramatically changed the way public anger coalesces and is heard — and greatly increased its power.

So the Left has grown in strength at the same time as technology that can amplify and expand cancel culture, a convergence that is both fortunate and unfortunate — respectively, for those who deserve harsh social consequences and for those who do not.

For more from the author, subscribe and follow or read his books.

Did Evolution Make it Difficult for Humans to Understand Evolution?

It’s well known that people are dreadful at comprehending and visualizing large numbers, such as a million or billion. This is understandable in terms of our development as a species, as grasping the tiny numbers of, say, your clan compared to a rival one you’re about to be in conflict with, or understanding amounts of resources like food and game in particular places, would aid survival (pace George Dvorsky). But there was little evolutionary reason to adeptly process a million of something, intuitively knowing the difference between a million and a billion as easily as we do four versus six. A two second difference, for instance, we get — but few intuitively sense a million seconds is about 11 days and a billion seconds 31 years (making for widespread shock on social media).
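If you want to check those figures yourself, the arithmetic is simple enough; here is a minimal sketch, assuming nothing beyond a 365.25-day year:

```python
# Rough check of the seconds-to-time conversions mentioned above.
SECONDS_PER_DAY = 60 * 60 * 24               # 86,400 seconds in a day
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365.25  # ~31.6 million seconds in a year

million = 10**6
billion = 10**9

print(f"1 million seconds ≈ {million / SECONDS_PER_DAY:.1f} days")    # ~11.6 days
print(f"1 billion seconds ≈ {billion / SECONDS_PER_YEAR:.1f} years")  # ~31.7 years
```

The two quantities differ by three orders of magnitude, yet to our intuitions both simply register as "big."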

As anthropologist Caleb Everett, who pointed out that a word for “million” did not even appear until the 14th century, put it, “It makes sense that we as a species would evolve capacities that are naturally good at discriminating small quantities and naturally poor at discriminating large quantities.”

Evolution, therefore, made it difficult to understand evolution, which deals with slight changes to species over vast periods of time, resulting in dramatic differences (see Yes, Evolution Has Been Proven). It took 16 million years for Canthumeryx, with a look and size similar to a deer, to evolve into, among other new species, the 18-foot-tall giraffe. It took 250 million years for the first land creatures to finally have descendants that could fly. It stands to reason that such statements seem incredible to many people not only because of old religious tales they hold to that the evidence does not support, but also because it’s hard to grasp how much time that actually constitutes. Perhaps it would be easier to comprehend and visualize how small genetic changes between parent creatures and offspring could add up, eventually resulting in descendants that look nothing like ancient ancestors, if we could better comprehend and visualize the timeframes, the big numbers, in which evolution operates. 16 million years is a long time — long enough.

This is hardly the first time it’s been suggested that its massive timescales make evolution tough to envision and accept, but it’s interesting to think about how this fact connects to our own evolutionary history and survival needs.

Just one of those wonderful oddities of life.

For more from the author, subscribe and follow or read his books.

Suicide is (Often?) Immoral

Suicide as an immoral act is typically a viewpoint of the religious — it’s a sin against God, “thou shalt not kill,” and so on. For those free of religion, and of course some who aren’t, ethics are commonly based on what does harm to others, not yourself or deities — under this framework, the conclusion that suicide is immoral in many circumstances is difficult to avoid.

A sensible ethical philosophy considers physical harm and psychological harm. These harms can be actual (known consequences) or potential (possible or unknown consequences). The actual harm of, say, shooting a stranger in the heart is that person’s suffering and death. The potential harm on top of that is wide-ranging: if the stranger had kids it could be their emotional agony, for instance. The shooter simply would not know. Most suicides will entail these sorts of things.

First, most suicides will bring massive psychological harm, lasting many years, to family and friends. Were I to commit suicide, this would be a known consequence, known to me beforehand. Given my personal ethics, aligning with those described above, the act would then necessarily be unethical, would it not? This seems to hold true, in my view, even given my lifelong depression (I am no stranger to visualizations of self-termination and its aftermath, though fortunately with more morbid curiosity than seriousness to date; medication is highly useful and recommended). One can suffer and, by finding relief in nonexistence, cause suffering. As a saying goes, “Suicide doesn’t end the pain, it simply passes it to someone else.” Perhaps the more intense my mental suffering, the less unethical the act (more on this in a moment), but given that the act will cause serious pain to others whether my suffering be mild or extreme, it appears from the outset to be immoral to some degree.

Second, there are the potential harms, always trickier. There are many unknowns that could result from taking my own life. The potential harms could be more extreme psychological harms, a family member driven to severe depression or madness or alcoholism. (In reality, psychological harms are physical harms — consciousness is a byproduct of brain matter — and vice versa, so stress on one affects the other.) But they could be physical as well. Suicide, we know, is contagious. Taking my own life could inspire others to do the same. Not only could I be responsible for contributing, even indirectly, to the death of another person, I would also have a hand in all the actual and potential harms that result from his or her death! It’s a growing moral burden.

Of course, all ethics are situational. This is accepted by just about everyone — it’s why killing in self-defense seems less wrong than killing in cold blood, or why completely accidental killings seem less unethical than purposeful ones. These things can even seem ethically neutral. So there will always be circumstances that change the moral calculus. One questions if old age alone is enough (one of your parents or grandparents taking their own lives would surely be about as traumatic as anyone else doing so), but intense suffering from age or disease could make the act less unethical, in the same way deeper and deeper levels of depression may do the same. Again, less unethical is used here. Can the act reach an ethically neutral place? The key may simply be the perceptions and emotions of others. Perhaps with worsening disease, decay, or depression, a person’s suicide would be less painful to friends and family. It would be hard to lose someone in that way, but, as we often hear when someone passes away of natural but terrible causes, “She’s not suffering anymore.” Perhaps at some point the scale is tipped, with too much agony for the individual weighing down one side and too much understanding from friends and family lifting up the other. One is certainly able to visualize this — no one wants their loved ones to suffer, and the end of their suffering can be a relief as well as a sorrow, constituting a reduction in actual harm — and this is no doubt reality in various cases. This writing simply posits that not all suicides will fall into that category (many are unexpected), and, while a distinguishing line may be frequently impossible to see or determine, the suicides outside it are morally questionable due to the ensuing harm.

If all this is nonsense, and such sympathetic understanding of intense suffering brings no lesser amount of harm to loved ones, then we’re in trouble, for how else can the act break free from that immoral place, for those operating under the moral framework that causing harm is wrong?

It should also be noted that the rare individuals without any real friends or family seem to have less moral culpability here. And perhaps admitted plans and assisted suicide diminish the immorality of the act, regardless of the extent of your suffering — if you tell your loved ones in advance you are leaving, if they are there by your side in the hospital to say goodbye, isn’t that less traumatizing and painful than a sudden, unexpected event, with your body found cold in your apartment? In these cases, however, the potential harms, while some may be diminished in likelihood alongside the actual, still abound. A news report on your case could still inspire someone else to commit suicide. One simply cannot predict the future, all the effects of your cause.

As a final thought, it’s difficult not to see some contradiction in believing in suicide prevention, encouraging those you know or those you don’t not to end their lives, and believing suicide to be ethically neutral or permissible. If it’s ethically neutral, why bother? If you don’t want someone to commit suicide, it’s because you believe they have value, whether inherent or simply to others (whether one can have inherent value without a deity is for another day). And destroying that value, bringing all that pain to others or eliminating all of the individual’s potential positive experiences and interactions, is considered wrong, undesirable. Immorality and prevention go hand-in-hand. But with folks who are suffering we let go of prevention, even advocating for assisted suicide, because only in those cases do we begin to consider suicide ethically neutral or permissible.

In sum, one finds oneself believing that if causing harm to others is wrong, and suicide causes harm to others, suicide must in some general sense be wrong — but acknowledging that there must be specific cases and circumstances where suicide is less wrong, approaching ethical neutrality, or even breaking into it.

For more from the author, subscribe and follow or read his books.

Expanding the Supreme Court is a Terrible Idea

Expanding the Supreme Court would be disastrous. We hardly want an arms race in which the party that controls Congress and the White House expands the Court to achieve a majority. It may feel good when the Democrats do it, but it won’t when it’s the Republicans’ turn. 

The problem with the Court is that the system of unwritten rules, of the “gentlemen’s agreement,” is completely breaking down. There have been expansions and nomination fights or shenanigans before in U.S. history, but generally when a justice died or retired a Senate controlled by Party A would grudgingly approve a new justice nominated by a president of Party B — because eventually the situation would be reversed, and you wanted and expected the other party to show you the same courtesy. It was reciprocal altruism. It all seemed fair enough, because apart from a strategic retirement, it was random luck — who knew when a justice would die? 

The age of unwritten rules is over. The political climate is far too polarized and hostile to allow functionality under such a system. When Antonin Scalia died, Obama should have been able to install Merrick Garland on the Court — Mitch McConnell and the GOP Senate infamously wouldn’t even hold a vote, much less vote Garland down, for nearly 300 days. They simply delayed until a new Republican president could install Neil Gorsuch. Democrats attempted to block this appointment, as well as those of Kavanaugh (replacing the retiring Kennedy) and Barrett (replacing the late Ginsburg). The Democrats criticized the Barrett case for occurring too close to an election, mere weeks away, the same line the GOP had used with Garland, and conservatives no doubt saw the investigation into Kavanaugh as an obstructionist hit job akin to the Garland case. But it was entirely fair for Trump to replace Kennedy and Ginsburg, as it was fair for Obama to replace Scalia. That’s how it’s supposed to work. But that’s history — and now, with Democrats moving forward on expansion, things are deteriorating further.

This has been a change building over a couple decades. Gorsuch, Kavanaugh, and Barrett received just four Democratic votes. The justices Obama was able to install, Kagan and Sotomayor, received 14 Republican votes. George W. Bush’s Alito and Roberts received 26 Democratic votes. Clinton’s Breyer and Ginsburg received 74 Republican votes. George H.W. Bush’s nominees, Souter and Thomas, won the votes of 57 Democrats. When Ronald Reagan nominated Kennedy, more Democrats voted yes than Republicans, 51-46! Reagan’s nominees (Kennedy, Scalia, Rehnquist, O’Connor) won 159 Democratic votes, versus 199 Republican. Times have certainly changed. Partisanship has poisoned the well, and obstruction and expansion are the result.

Some people defend the new normal, correctly noting the Constitution simply allows the president to nominate and the Senate to confirm or deny. Those are the written rules, so that’s all that matters. And that’s the problem, the systemic flaw. It’s why you can obstruct and expand and break everything, make it all inoperable. And with reciprocal altruism, fairness, and bipartisanship out the window, it’s not hard to imagine things getting worse. If a party could deny a vote on a nominee for the better part of a year (shrinking the Court to eight, one notices, which can be advantageous), could it do so for longer? Delaying for years, perhaps four or eight? Why not, there are no rules against it. Years of obstruction would become years of 4-4 votes on the Court, a completely neutered branch of government, checks and balances be damned. Or, if each party packs the Court when it’s in power, we’ll have an ever-growing Court, a major problem. The judiciary automatically aligning with the party that also controls Congress and the White House is again a serious weakening of a check and balance. Democrats may want a stable, liberal Court around some day to strike down rightwing initiatives coming out of Congress and the Oval Office. True, an expanding Court will hurt and help parties equally, and parties won’t always be able to expand, but for any person who sees value in real checks on legislative and executive power, this is a poor idea. All the same can be said for obstruction.

Here is a better idea. The Constitution should be amended to reflect the new realities of American politics. This is to preserve functionality and meaningful checks and balances, though admittedly the only way to save the latter may be to undercut it in a smaller way elsewhere. The Court should permanently be set at nine justices, doing away with expansions. Election year appointments should be codified as obviously fine. The selection of a new justice must pass to one decision-making body: the president, the Senate, the House, or a popular vote by the citizenry. True, doing away with a nomination by one body and confirmation by another itself abolishes a check on power, but this may be the only way to avoid the obstruction, the tied Court, the total gridlock until a new party wins the presidency. It may be a fair tradeoff, sacrificing a smaller check for a more significant one. However, this change could be accompanied by much-discussed term limits, say 16, 20, or 24 years, for justices. So while only one body could appoint, the appointment would not last extraordinary lengths of time.

For more from the author, subscribe and follow or read his books.

We Just Witnessed How Democracy Ends

In early December, a month after the election was called, after all “disputed” states had certified Biden’s victory (Georgia, Arizona, Nevada, Wisconsin, Michigan, and Pennsylvania), with some certifying a second time after recounts, after 40-odd lawsuits from Trump, Republican officials, and conservative voters had failed miserably in the American courts, the insanity continued: about 20 Republican states (with 106 Republican members of the House) sued to stop electors from casting their votes for Biden; only 27 of 249 Republican congresspersons would acknowledge Biden’s victory. Rightwing media played along, even while knowing stolen election claims were lies. A GOP state legislator and plenty of ordinary citizens pushed for Trump to simply use the military and martial law to stay in power. The Trump administration contemplated using the National Guard to seize voting machines.

At this time, with his legal front collapsing, the president turned to Congress, the state legislatures, and the Electoral College. Trump actually pushed for the Georgia legislature to replace the state’s 16 electors (members of the 2020 Electoral College, who were set to be Biden supporters after Georgia certified Biden’s win weeks prior) with Trump supporters! Without any ruling from a court or state in support, absurd imaginings and lies about mass voter fraud were to be used to justify simply handing the state to Trump — a truly frightening attack on the democratic process. Trump knew he lost but did not care. Officials in other battleground states got phone calls about what their legislatures could do to subvert election results as well (state secretaries later being asked to “recalculate” and told things like “I just want to find 11,780 votes”). And it was theoretically possible for this to work, if the right circumstance presented itself. ProPublica wrote that

the Trump side’s legislature theory has some basis in fact. Article II of the U.S. Constitution holds that “each State shall appoint, in such Manner as the Legislature thereof may direct, a Number of Electors” to vote for president as a member of the Electoral College. In the early days of the republic, some legislatures chose electors directly or vested that power in other state officials. Today, every state allocates presidential electors by popular vote…

As far as the Constitution is concerned, there’s nothing to stop a state legislature from reclaiming that power for itself, at least prospectively. Separately, a federal law, the Electoral Count Act of 1887, provides that whenever a state “has failed to make a choice” in a presidential election, electors can be chosen “in such a manner as the legislature of such State may direct.”

Putting aside how a battle between certified election results and misguided screams of election fraud might be construed as a “failure to make a choice” by a Trumpian judge somewhere, the door is open for state legislatures to return to the days of divorcing electors from the popular vote. The challenge, as this report went on to say, is that in these battleground states, the popular vote-elector connection “is enshrined in the state constitution, the state’s election code or both,” which means that such a change was impossible in the moment only because a party would need dominant political power in these states to change the constitutions and election codes — needing a GOP governor, control of or supermajorities in both houses of the legislature, even the passing of a citizens’ vote on the matter, depending on the state. Republican officials, if willing to pursue this (and true, not all would be), couldn’t act at that particular moment in history because success was a political impossibility. Wisconsin, Michigan, and Pennsylvania, for instance, had Democratic governors and enough legislators to prevent a supermajority veto override. But it isn’t difficult to envision a parallel universe or future election within our own reality where a couple states are red enough to reclaim the power to appoint electors and do so, returning someone like Trump to office and making voting in their states completely meaningless.

In the exact same vein, House Republicans laid plans to challenge and throw out electors in January. This was theoretically possible, too. Per the procedures, if a House rep and senator together challenge a state’s slate of electors, the Congress as a whole must vote on whether to confirm or dismiss the electors. The latter would reduce the electoral votes of one candidate. Like the state legislature intervention, this was sure to fail only due to fortunate political circumstances. The Independent wrote, “There’s no way that a majority of Congress would vote to throw out Biden’s electors. Democrats control the House, so that’s an impossibility. In the Senate, there are enough Republicans who have already acknowledged Biden’s win (Romney, Murkowski, Collins and Toomey, to name just a few) to vote with Democrats.” Would things have gone differently had the GOP controlled both houses?

Desperate, Republicans even sued Mike Pence in a bizarre attempt to make the courts grant him the sole right to decide which electoral votes counted! Rasmussen, the right-leaning polling institution, liked this idea, favorably sharing a (false) quote attributed to Stalin, something suitably evil, in support of Pence throwing out votes. The vice president has the ceremonial duty of certifying the electoral votes after they are counted in Congress — Trump and others pushed Pence hard to simply reject votes for Biden. There is no law giving the vice president this authority. Pence refused to do this, understanding the illegality, but had he played along, democracy’s survival would have depended on the courts confirming that the Electoral Count Act of 1887 granted the vice president ceremonial, not decision-making, power.

Indeed, all that would be needed for success after such acts are judges to go along with them. Given that such changes are not unconstitutional, final success is imaginable, whether in the lower courts or in the Supreme Court, where such things would surely end up. It’s encouraging to see, both recently and during Trump’s term, that the judicial system has remained strong, continuing to function within normal parameters while the rest of the nation went mad. In late 2020, Trump and rightwing efforts to have citizen votes disqualified and other disgusting moves based on fraud claims were tossed out of the courtrooms due to things like lack of something called “evidence.” Even the rightwing Supreme Court, with three Trump appointees, refused to get involved in and shot down Trump’s nonsense (much like Trump’s own Justice Department and Department of Homeland Security). Yet we waited with bated breath to see if this would be so. It could have gone the other way — votes thrown out despite the lack of evidence, such decisions upheld by higher courts, results overturned. That’s all it would have taken this time — forget changing how electors are chosen or Congress (or Pence) rejecting them! If QAnon types can make it into Congress, if people like Trump can receive such loyalty from the congresspersons who approve Supreme Court justices and other judges, if someone like Trump can win the White House and be in a position to nominate justices, the idea of the absurdity seeping into the judicial system doesn’t seem so far-fetched. Like other presidents, Trump appointed hundreds of federal judges. And if that seems possible — the courts tolerating bad cases brought before them — then the courts ruling that states can return to anti-democratic systems of old eras or tolerating a purge of rightful electors seems possible, too. Any course makes citizen voting a sham.

The only bulwark against the overturning of a fair election and the end of American democracy was basically luck, comprising, before January 6 at any rate: 1) a small group of Republican officials being unable to act, and/or 2) a small group of judges being unwilling to act. It isn’t that hard to imagine a different circumstance that would have allowed state legislators or Congress or the vice president to terminate democracy and/or seen the Trumpian insanity infecting judges like it has voters and elected officials. In this sense, we simply got extremely lucky. And it’s worth reiterating that Number 1 needn’t even be in the picture — all you need is enough judges (and jurors) to go along with the foolishness Trump and the GOP brought into the courtroom and real democracy is finished.

(If you’re curious exactly how many people would be required to unjustly hand a battleground state to the loser, the answer is 20. This includes a U.S. district judge and jury, a majority of an appellate court, and a majority of the Supreme Court. This number drops to just eight if the district court somehow sees a bench trial, without a jury. But at most, the sanity of just 20 people stands between democracy and chaos in each state. In this election, one state, any of the battleground states, would not have been enough to seize the Electoral College; you would have needed three of them. Meaning at most five Supreme Court justices and 45 judges and jurors. In this sense, this election was far more secure than some future election that hinges on one state.)
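For anyone who wants the tally spelled out, here is a minimal sketch of the counting; it assumes a full 12-person jury and a standard three-judge appellate panel, which appears to be how the 20 and eight figures above are derived:

```python
# Head count needed to unjustly hand one battleground state to the loser,
# under the assumptions above (12-person jury, three-judge appellate panel).
district_judge = 1
jury = 12                 # drops to 0 in a bench trial
appellate_majority = 2    # 2 of a 3-judge panel
scotus_majority = 5

per_state = district_judge + jury + appellate_majority + scotus_majority
print(per_state)          # 20 per state (or 8 with a bench trial: 1 + 2 + 5)

# Across three battleground states, the same five justices sit at the top:
three_states = 3 * (district_judge + jury + appellate_majority) + scotus_majority
print(three_states)       # 50 = 5 justices + 45 judges and jurors
```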

Then on January 6, we noticed that our luck was composed of something else as well. It also included 3) a military that, like our justice system, hadn’t lost its mind yet. More on that momentarily.

When January 6 arrived, and it was time for Congress to count and confirm the Electoral College votes, GOP House reps and senators indeed came together to object to electors, forcing votes from both houses of Congress on elector acceptance. Then a Trumpian mob, sizably armed, overwhelmed the police and broke into the Capitol building to “Stop the Steal,” leaving people dead and IEDs that needed disarming — another little hint at what a coup, of a very different sort, might look and feel like. Though a few Republicans changed their minds, and plans to contest other states were scrapped, 147 Republican congresspersons still voted to sustain objections to the electors of Arizona and Pennsylvania! They sought to not confirm the electoral votes of disputed states until an “election audit” was conducted. Long after the courts (59 cases lost by then), the states (some Republican), and Trump’s own government departments had said the election was free and fair, and after they saw how Trump’s lies could lead directly to deadly violence, Republicans continued playing along, encouraging the continued belief in falsities and risking further chaos. They comprised 65% of GOP House members and 15% of GOP senators. (Some did the right thing: 10 Republicans in the House would vote to impeach Trump, seven in the Senate to convict.) This time, fortunately, there wasn’t enough congressional support to reject electoral votes. Perhaps next time there will be — and a judicial system willing to tolerate such a travesty.

Recent times have been a true education in how a democracy can implode. It can do so without democratic processes, requiring a dear leader spewing lies, enough of the populace to believe those lies, enough of the most devout to take violent action, and military involvement or inaction. If armed supporters storm and seize the Capitol and other places of power, then it doesn’t really matter what the courts say, but this only ultimately works if the military joins in or acquiesces to it. While the January 6 mob included active soldiers and veterans, this had no support from the military branches, which instead offered condemnation and a protective response. This was ideal, but next time we may not be so fortunate. But the end of the great experiment can also happen through democratic processes. Democratic systems can eliminate democracy. Other free nations have seen democracy legislated away just as they have seen military coups. You need a dear leader spewing lies to justify acts that would keep him in charge, enough of the populace to believe those lies, enough of the dear leader’s party to go along with those lies and acts for power reasons (holding on to branches of government), and enough judges to tolerate such lies and approve, legitimize, such acts. We can count our lucky stars we did not see the last one this time, but it was frightening to witness the first three.

Trump’s conspiracy theories about voter fraud began long before the election (laying the groundwork to question the election’s integrity should he lose) and continued long after. Polls suggested eight or nine of every ten Trump voters believed Biden’s victory wasn’t legitimate; about half of Republicans agreed. So many Republican politicians stayed silent or played along with Trump’s voter fraud claims, cementing distrust in the democratic process and encouraging the spread of misinformation, which, like Trump’s actions, increased political division, the potential for violence, and the odds of overturning a fair election. As with voter suppression, gerrymandering, denying justice confirmation votes, and much else, it is clear that power is more important than democracy to many Republicans. Anything to keep the White House, Congress, the Supreme Court. You can’t stand up to the Madman and the Masses. The Masses adore the Madman, and you can’t lose their votes (or, if an earnest supporter, go against your dear leader). Some politicians may even fear for their safety.

It was frightening to realize that democracy really does rest, precariously, on the truth. On human recognition of reality, on sensible standards of evidence, on reason. It’s misinformation and gullibility that can end the democratic experiment, whether by coup or democratic mechanisms or both.

What would happen next? First, no one would be punished, at least not for long. American law allows presidential pardons, another systemic weakness. Anyone who joined a violent coup or quietly worked against democracy would be forgiven and legitimated by the victorious president. Trump, after using this power during his term to free corrupt allies, later vowed to forgive the January 6 rioters if he ever returned to the White House.

More broadly, tens of millions of Americans would celebrate. They wouldn’t be cheering the literal end of democracy; they would be cheering its salvation, because to them fraud had been overcome. So a sizable portion of the population would exist in a delusional state, completely disconnected from reality, which could mean a relatively stable system, like other countries that have drifted from democracy. Perhaps the nation simply continues on, in a new form where elections are shams — opening the door to further authoritarianism. Despite much earnest sentiment toward and celebration of democracy, authoritarianism is troublingly popular among Trump voters and, to a lesser extent, Americans as a whole. Unless the rest of the nation became completely ungovernable, whether in the form of nationwide strikes and mass civil disobedience or the actual violence that the typically hyperbolic prophets of “civil war” predict, there may be few alternatives to a nation in a new form. Considering that Congress would need substantial Republican support to remove a president, or that Congress could simply be neutered by the military, an effective governmental response seems almost impossible.

We truly witnessed an incredible moment in U.S. history. It’s one thing to read about nations going over the cliff, and another to see the cliff approaching before your very eyes. Echoing the rise of authoritarians elsewhere in history, Trump reached the highest office using demagoguery (demonizing Mexicans and illegal immigrants, Muslims, China, the media, and other supposed existential threats) and nationalism (promising to crush these threats and restore American greatness). The prejudiced masses loved it. As president, his worst policies acted upon his demagoguery, with crackdowns on all legal immigration, Muslim immigrants, and illegal immigrants; he also consistently gave the finger to democratic and legal processes, ordering people to ignore subpoenas, declaring a national emergency to bypass congressional power and get his wall built, obstructing justice (even a Republican senator voted to convict him of this), and so on. Then, at the end, Trump sought to stay in office through lies and a backstabbing of democracy, the overturning of a fair vote. And even in all this, we were extremely lucky — not only that the judicial and military systems remained strong (it was interesting to see how unelected authorities can protect democracy, highlighting the importance of some unelected power in a system of checks and balances), but that Trump was always more doofus than dictator, without much of a political ideology beyond “me, me, me.” Next time we may not be so fortunate. America didn’t go over the cliff this time, but we must work to ensure we never approach it again.

For more from the author, subscribe and follow or read his books.

The Toolbox of Social Change

After reading one of my books, folks who aren’t involved in social movements often ask, in private or at public talks, “What can we do?” So distraught by horrors past and present, people feel helpless and overwhelmed, and want to know how we build that better world — how does one join a social movement, exactly? I often say it’s easy to feel powerless before all the daunting obstacles — and no matter how involved you get, you typically feel you’re not doing enough. Perhaps even the most famous activists and leaders felt that way. Fortunately, I continue, if you look at history it becomes clear that social change isn’t just about one person doing a lot. It’s about countless people doing just a little bit. Howard Zinn said, “Small acts, when multiplied by millions of people, can transform the world.” And he was right, as we’ve seen. Whatever challenges we face today, those who came before us faced even greater terrors — and they won, because growing numbers of ordinary people decided to act, decided to organize, to put pressure on the economically and politically powerful. I then list (some of) the tools in the toolbox of social change, which I have reproduced below so I can pass them along in written form.

The list roughly and imperfectly goes from smaller, less powerful tools to larger, more powerful ones. The first nine are largely done “together alone,” while the last nine are mostly in the realm of true organizing and collective action. Yet all are of extreme importance in building a more decent society. (The list ignores, perhaps rightly, the sentiment among some comrades that there should be no participation in current electoral systems, favoring instead the use of every tool at one’s disposal.) This is in no way a comprehensive list (writing books is hopefully on this spectrum somewhere, alongside many other things), but it is enough to get the curious started.


Talk to people

Post on social media

Submit editorials / earn media attention / advertise

Sign petitions

Call / email / write the powerful

Donate to candidates

Donate to organizations

Vote for candidates

Vote for policy initiatives

Volunteer for candidates (phonebank / canvass / register or drive voters)

Volunteer for policy initiative campaigns (phonebank / canvass / register or drive voters)

Run for office

Join an organization

Launch a policy initiative campaign (from petition to ballot)

March / protest / picket (at a place of power)

Boycott (organized refusal to buy or participate)

Strike (organized refusal to return to work or school)

Sit-in / civil disobedience / disruption (organized, nonviolent refusal to leave a place of power, cooperate, or obey the law; acceptance of arrest)

For more from the author, subscribe and follow or read his books.

The Psychology of Pet Ownership

For years now, extensive psychological research has concluded that a wealth of medical benefits exists for the individual who owns a pet. According to Abnormal Psychology (Comer, 2010), “social support of various kinds helps reduce or prevent depression. Indeed, the companionship and warmth of dogs and other pets have been found to prevent loneliness and isolation and, in turn, to help alleviate or prevent depression” (p. 260). Without companionship, people are far more likely to fall into depression when life presents increased stress. An article in Natural Health summarizes the medical advantages of pet ownership by saying, “researchers have discovered that owning a pet can reduce blood pressure, heart rate, and cholesterol; lower triglyceride levels; lessen stress; result in fewer doctor visits; and alleviate depression” (Hynes, 2005). Additionally, Hynes explains, “Infants who live in a household with dogs are less likely to develop allergies later in life, not only to animals but also to other common allergens.”

While immune system adaptation explains allergy prevention, a pet’s gift of reducing depression is multilayered. One of the most important components is touch therapy. The physical contact of petting a cat or dog provides a calming effect, comforting the owner and fighting off stress. The New York Times reports pets “provide a socially acceptable outlet for the need for physical contact. Men have been observed to touch their pets as often and as lovingly as women do” (1982). Physical touch in infancy is vital to normal brain development, and the need for contact continues into adulthood as a way to ease tension, express love, and feel loved. 

Another aspect of this phenomenon is unconditional love. Pets can provide people with love that is difficult or sometimes impossible to find from another person. In the article Pets for Depression and Health, Alan Entin, PhD, says unconditional love explains everything. “When you are feeling down and out, the puppy just starts licking you, being with you, saying with his eyes, ‘You are the greatest.’ When an animal is giving you that kind of attention, you can’t help but respond by improving your mood and playing with it” (Doheny, 2010). Pets are often the only source of true unconditional love a man or woman can find, and the feeling of being adored improves mood and self-confidence.

Not everyone is a pet person, which is why owning a pet will not be efficacious for everyone. Indeed, people who are already so depressed they cannot even take care of themselves will not see improvements. However, those who do take on the responsibility of owning a cat, dog, or any other little creature will see reduced depression simply because they are responsible for another living being’s life. In an article in Reader’s Digest, Dr. Yokoyama Akimitsu, head of Kyosai Tachikawa Hospital’s psychiatric unit, says pets help by “creating a feeling of being needed” (2000). This need, this calling to take care of the pet, will give the owner a sense of importance and purpose. It also provides a distraction from one’s life problems. These elements work in concert to battle depression.

Owning a pet also results in increased exercise and social contact with people. According to Elizabeth Scott, M.S., in her 2007 article How Owning a Dog or Cat Can Reduce Stress, dog owners in urban settings spend more time walking than non-owners. Exercise is known to burn off stress. Furthermore, Scott says, “When we’re out walking, having a dog with us can make us more approachable and give people a reason to stop and talk, thereby increasing the number of people we meet, giving us an opportunity to increase our network of friends and acquaintances, which also has great stress management benefits.” Increased exercise will also lead to an improved sense of well-being, due to endorphins released in the brain, and to better sleep.

Finally, owning a pet simply staves off loneliness. Scott says, “They could be the best antidote to loneliness. In fact, research shows that nursing home residents reported less loneliness when visited by dogs than when they spent time with other people” (2007). Just by being there for their owners, pets eliminate feelings of isolation and sadness. They can serve as companions and friends to anyone suffering from mild or moderate depression.

For more from the author, subscribe and follow or read his books.

References

Brody, J. E. (1982, August 11). Owning a Pet Can Have Therapeutic Value. In The New York Times. Retrieved December 13, 2010, from http://www.nytimes.com/1982/08/11/garden/owning-a-pet-can-have-therapeutic-value.html?scp=1&sq=1982%20pets&st=cse

Comer, R. J.  (2010). Abnormal Psychology (7th Ed.). New York: Worth Publishers

Doheny, K. (2010, August 18). Pets for Depression and Health. In WebMD. Retrieved December 13, 2010, from http://www.webmd.com/depression/recognizing-depression-symptoms/pets-depression

Hynes, A. (2005, March). The Healing Power of Animals. In CBS Money Watch. Retrieved December 13, 2010, from http://findarticles.com/p/articles/mi_m0NAH/is_3_35/ai_n9775602/

Scott, E. (2007, November 1). How Owning a Dog or Cat Can Reduce Stress. In About.com. Retrieved December 13, 2010, from http://stress.about.com/od/lowstresslifestyle/a/petsandstress.htm

Williams, M. (2000, August). Healing Power of Pets. In Reader’s Digest. Retrieved December 13, 2010, from http://www.drmartinwilliams.com/healingpets/healingpets.html

Designing a New Social Media Platform

In Delphi, Greece, μηδὲν ἄγαν (meden agan) was inscribed on the ancient Temple of Apollo — nothing in excess. Applying the famous principle to the design and structure of social media platforms could reduce a number of their negative effects: their addictive properties, online bullying, depression and lower self-worth, breakdowns in civility and their impact on political polarization, and so forth. Other problems, such as information privacy and the spread of misinformation (leading to all sorts of absurd beliefs, affecting human behaviors from advocacy to violence, with its own impact on polarization) will be more difficult to solve, and will involve proper management rather than UI changes (so they won’t be addressed here). The Social Dilemma, while mostly old news to anyone paying attention to such things, presents a good summary of the challenges and is worth a view for those wanting to begin an investigation.

A new, socially-conscious social media platform — we’ll call it “Delphi” for now — would be crafted to prevent such things to the extent possible, while attempting to preserve the more positive aspects of social media — the access to news and information, the sharing of ideas, exposure to differing views, the humor and entertainment, the preserved connections to people you like but just wouldn’t text or call or see. Because while breaking free and abandoning the platforms completely greatly improves well-being, the invention is as unlikely to disappear quickly as the telephone, so there should be some middle ground — moderation in all things, nothing in excess — between logging off for good and the more poisonous platforms we’re stuck with. People could then decide what works best for them. If you won’t break free, here’s at least something less harmful.

The new platform would do away with likes, comments, and shares. These features drive many of the addictive and depressive elements, as we all know; we obsessively jump back on to see how our engagement is going, and perhaps we can’t help but see this metric as a measure of our own self-worth — of our looks, intelligence, accomplishments, whatever the post “topic” might be. Comparing this metric to those of others, seeing how many more likes others get, can only worsen our perceptions of self, especially for young girls. Instagram is toying with removing public like counts, while still allowing users to see their own counts on the back end, which is barely helpful. All three features should simply be abolished. With Delphi, one would post a status, photo, video, or link and simply have no idea how many friends saw it or reacted to it. Have you ever simply stopped checking your notifications on current platforms? It is quite freeing, in my experience. You know (suspect) people are seeing a post, but you have no clue how many or what their reactions are. There’s no racing back on to count the likes or reply to a compliment or battle a debater or be hurt by a bully. You’re simply content, as if you had painted a mural somewhere and walked away.

There are of course possible work-arounds here. Obviously, if someone posted a link I wanted to share, I could copy the address and post it myself. (There may be a benefit to forcing people to open a link before sharing it; maybe we’d be more likely to actually read more than the headline before passing the piece on.) This wouldn’t notify the original poster, who would only know (suspect) that I’d stolen the link if they saw my ensuing post. Likewise, there’s nothing to stop people from taking screenshots of posts or copy-pasting text and using such things in their own posts, with commentary, unless we programmed the platform to detect and prevent this, or to detect and hide such things from the original poster. But you get the idea: you usually won’t see any reaction to your content.

Delphi wouldn’t entirely forsake interaction, however. It would replace written communication and emoji reactions with face-to-face communication. There would in fact be one button to be clicked on someone’s post, the calendar button, which would allow someone to request a day, time, and place to meet up or do a built-in video call to chat about the post (a video call request could also be accepted immediately, like FaceTime). The poster could then choose whether to proceed. As everyone has likely noticed, we don’t speak to each other online the way we do in person. We’re generally nastier due to the Online Disinhibition Effect; the normal inhibitions, social cues, and consequences that keep us civil and empathetic in person largely don’t exist. We don’t see each other the same way, because we cannot see each other. Studies show that, compared to verbal communication, we tend to denigrate and dehumanize other people when reading their written disagreements, seeing them as less capable of feeling and reason, which can increase political polarization. We can’t hear tone or see facial expressions, the eyes most important of all, creating fertile ground for both unkindness and misunderstandings. In public discussions, we also tend to put on a show for spectators, perhaps sacrificing kindness for a dunk that will garner likes. So let’s get rid of all that, and force people to talk face-to-face. No comments or messenger or tags or laugh reacts. Not only can this reduce political divisions by placing people in optimal spaces for respectful, empathetic discourse, it can greatly reduce opportunities for bullying.

The goal is to only get notifications (preferably just in-app, not via your phone) for one thing: calendar requests. Perhaps there would also be invitations to events and the like, but that’s the general idea. This means far less time spent on the platform, which is key because light users of social media are far less impacted by the negative effects.

To this end, Delphi would also limit daily use to an hour or so, apart from video calls. No more mindless staring for four hours. Nothing in excess.
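To make the design concrete, here is a minimal sketch in Python of the rules described so far. Everything in it (the class names, the fields, the one-hour figure) is a hypothetical illustration of the ideas above, not an actual implementation:

```python
# A minimal sketch of Delphi's interaction model, not a real implementation.
# The names (Post, CalendarRequest, Session) are hypothetical.

from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Optional


@dataclass
class Post:
    author: str
    content: str
    created_at: datetime = field(default_factory=datetime.utcnow)
    # Deliberately absent: like counts, comments, shares. The author never
    # sees engagement, only the post itself.


@dataclass
class CalendarRequest:
    # The single interaction type: propose a time and place (or video call).
    requester: str
    post: Post
    proposed_time: datetime
    place: str                       # e.g. "coffee shop" or "video call"
    accepted: Optional[bool] = None  # poster decides; None means pending


@dataclass
class Session:
    # Tracks daily use so the platform can enforce "nothing in excess".
    daily_limit: timedelta = timedelta(hours=1)
    used_today: timedelta = timedelta()

    def can_browse(self) -> bool:
        # Video calls would be exempt; only feed browsing counts here.
        return self.used_today < self.daily_limit


if __name__ == "__main__":
    post = Post(author="alice", content="Finished a mural today.")
    request = CalendarRequest(requester="bob", post=post,
                              proposed_time=datetime(2021, 6, 1, 18, 0),
                              place="video call")
    session = Session(used_today=timedelta(minutes=45))
    print(session.can_browse())  # True: 15 minutes of browsing left today
```

The important part is what is missing: a post simply has no field for likes, comments, or shares, and the only thing that can reference it is a calendar request.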

Much of the rest would be similar to what’s used today. We’d have profiles, pages, friends, a feed (the endless scroll problem is solved by the time limit). Abandoning the feed completely has benefits (returning to a world where you have to visit a profile or a page to see what’s happening), such as less depression-inducing peer comparison (look at how beautiful she is, how amazing his life is, and so on), but that could mean that one doesn’t really bother posting at all, knowing (suspecting) only a couple people will visit his or her profile. And one would also be less likely to be exposed to differing views if one has to seek them out. A feed may be necessary to keep some of the positive effects mentioned earlier. But perhaps going in the other direction could help — say, a feed just for pages and news, and a feed for friends, granting the ability to jump back and forth and ignore for a while so-and-so’s incredible trip to Greece.

For more from the author, subscribe and follow or read his books.

Merit Pay

“Too many supporters of my party have resisted the idea of rewarding excellence in teaching with extra pay, even though we know it can make a difference in the classroom,” President Barack Obama said in March 2009. The statement foreshadowed the appearance of teacher merit pay in Obama’s “Race to the Top” education initiative, which grants federal funds to top performing schools. Performance, of course, is based on standardized testing, and in the flawed Race to the Top, so are teacher salaries. Teacher pay could rise and fall with student test scores.

Rhetoric concerning higher teacher salaries is a good thing. Proponents of merit pay say meager teacher salaries are an injustice, and such a pay system is needed to alleviate the nation’s teacher shortage. However, is linking pay to test scores the best way to “reward excellence”? Do we know, without question, it “can make a difference in the classroom”? The answers, respectively, are no and no. Merit pay is an inefficient and potentially counterproductive way to improve education in American public schools. It fails to motivate teachers to better themselves or remain in the profession, it encourages unhealthy competition and dishonest conduct among teachers, and it poorly serves certain groups, such as special education students.

Educator Alfie Kohn, author of the brilliant Punished by Rewards, wrote an article in 2003 entitled “The Folly of Merit Pay.” He writes, “No controlled scientific study has ever found a long-term enhancement of the quality of work as a result of any incentive system.” Merit pay simply does not work. It has been implemented here and there for decades, but is always abandoned. A good teacher is intrinsically motivated: he teaches because he enjoys it. She teaches because it betters society. He teaches because it is personally fulfilling. Advocates of merit pay ignore such motivation, but Kohn declares, “Researchers have demonstrated repeatedly that the use of such extrinsic inducements often reduces intrinsic motivation. The more that people are rewarded, the more they tend to lose interest in whatever they had to do to get the reward.” Extra cash sounds great, but it is destructive to the inner passions of quality teachers.

Teachers generally rank salaries below excessive standardization and unfair accountability measures on their lists of grievances (Kohn, 2003). Educators leave the profession because they are being choked by federal standards and control, and politicians believe linking pay to such problems is a viable solution? Professionals also generally oppose merit pay, disliking its competitive nature. Professor and historian Diane Ravitch writes that an incentive “gets everyone thinking about what is good for himself or herself and leads to forgetting about the goals of the organization. It incentivizes short-term thinking and discourages long-term thinking” (Strauss, 2011). Teaching students should not be a game, with big prizes for the winners.

Further, at issue is the distorted view of students that performance pay perpetuates. Bill Raabe of the National Education Association says, “We all must be wary of any system that creates a climate where students are viewed as part of the pay equation, rather than young people who deserve a high quality education” (Rosales, 2009). In the current environment of high-stakes tests (which do not really evaluate the quality of teaching at all), merit pay is just another way to encourage educators to “teach to the test,” or worse: to cheat. The nation has already seen public school teachers under so much pressure that they resort to modifying their students’ scores in order to save their salaries or their jobs.

It is clear that merit pay does not serve young learners, but this is especially true in the case of special education students. The Individuals with Disabilities Education Act (IDEA) requires states that accept federal funding to provide individualized educational services to all children with disabilities. While the emphasis on “inclusion” of SPED children in regular classrooms is appropriate, these students are also included in the accountability statutes of No Child Left Behind. SPED students are required to meet “adequate yearly progress” (AYP) standards based on high-stakes tests in reading, math, and science, like other students. While some youths with “significant cognitive disabilities” (undefined by federal law) can take alternate assessments, there is a cap on how many students can do so (Yell, Katsiyannas, & Shiner, 2006, pp. 35-36). Most special education students must be included in standardized tests.

The abilities and the needs of special education students are too diverse to be put in the box that is a standardized test. SPED students are essentially being asked to perform at their chronological grade level, and for some students that is simply not possible. How does that fit with the Free Appropriate Public Education the IDEA guarantees, which focuses on “individualized” plans for the “unique needs” of the student? It does not. Progress is individual, not standardized. Further, linking teacher pay to this unreasonable accountability only makes matters worse. Performance pay will likely punish special education instructors. Each year, SPED students may make steady progress (be it academic, cognitive, social, or emotional), but teachers will see their salaries stagnate or be slashed because such gains do not meet federal or state benchmarks. Such an uphill battle will discourage men and women from entering the special education field, meaning fewer quality instructors to serve students with disabilities.

When a school defines the quality of teaching by how well students perform on one test once a year, everyone loses. When pay is in the equation, it’s worse. Obama deserves credit for beginning to phase out NCLB, but merit pay is no way to make public schools more effective. If politicians want to pay good teachers better and weed out poor teachers, their efforts would be better directed at raising salaries across the board and reforming tenure.

For more from the author, subscribe and follow or read his books.

References

Kohn, A. (2003). The Folly of Merit Pay. Retrieved February 19, 2012 from https://www.alfiekohn.org/article/folly-merit-pay/.

Rosales, J. (2009). Pay Based on Test Scores? Retrieved February 19, 2012 from http://www.nea.org/home/36780.html.

Strauss, V. (2011). Ravitch: Why Merit Pay for Teachers Doesn’t Work. Retrieved February 19, 2012 from http://www.washingtonpost.com/blogs/answer-sheet/post/ravitch-why-merit-pay-for-teachers-doesnt-work/2011/03/29/AFn5w9yB_blog.html.

Yell, M. L., Katsiyannas, A., & Shiner, J. G. (2006). The No Child Left Behind Act, Adequate Yearly Progress, and Students with Disabilities. Teaching Exceptional Children, 38(4), 32-39.

On Student Teaching

I am now two weeks from concluding my first student teaching placement (Visitation School), and my classroom management skills are still being refined. After observing for five days, slowly beginning my integration into a leadership role, I took over completely from my cooperating teacher. While excited to start, initially I had a couple days where I found one 6th grade class (my homeroom) difficult to control. There were times when other classes stepped out of line, naturally, but the consistency with which my homeroom became noisy and rowdy was discouraging.

“They’re your homeroom,” my cooperating teacher reminded me. “They feel more at home in your classroom, and will try to get away with more.”

There were a few instances where students took someone else’s property, or wrote notes to classmates, but the side chatter was the major offense. I would be attempting to teach and each table would have at least someone making conversation, which obviously distracts both those who wish to pay attention and those who don’t care. I would ask them to refocus and quiet themselves, which would work for but a few precious moments. There was one day I remember I felt very much as if the students were controlling me, rather than the other way around, and I made the mistake of hesitating when I could have doled out consequences. I spoke to my cooperating teacher about it during our feedback session, and she emphasized to me that I needed to prove to the students my willingness to enforce the policies, that I have the same authority as any other teacher in the building.

At Visitation, the classroom management system revolves around “tallies,” each of which equals three laps at recess before one can begin play. My homeroom deserved a tally the day I hesitated. I needed to come up with a concrete, consistent way of disciplining disruptive behavior. So I went home and developed a simple system I had thought about long ago: behavior management based on soccer. I cut out and laminated a yellow card and a red card. The next day, I sat each class down in the hall before they entered the room, and told them the yellow card would be shown as a warning, and the red card would mean tallies. These could be given individually or as a class, and, as in soccer, a red card could be given without a yellow card.

The students were surprisingly excited about this. Perhaps turning punishment into a game intrigued them; regardless, it made me wonder if this would work. But it seemed discussing the expectations I had of them, and the enforcement of such expectations, helped a good deal. Further, I was able to overcome my hesitation that day and dole out consequences for inappropriate behavior. My homeroom I gave a yellow card and then a red card, and they walked laps the next day.

My cooperating teacher noted the system would be effective because it was visual for the students. I also found that it allowed me to easily maintain emotional control; instead of raising my voice, I simply raised a card in my hand, and the class refocused. Its visibility allowed me to say nothing at all.

While different in purpose and practice, this system draws important elements from the Do It Again system educator Doug Lemov describes, including no administrative follow-up and logical consequences, but most significantly group accountability (Lemov, 2010, p. 192). It holds an entire class responsible for individual actions, and “builds incentives for individuals to behave positively since it makes them accountable to their peers as well as their teacher” (p. 192). Indeed, my classes almost immediately started regulating themselves, keeping themselves accountable for following my expectations (telling each other to be quiet and settle down, for instance, before I had to say anything).

Lemov would perhaps frown upon the yellow card, and point to the behavioral management technique called No Warning (p. 199). He suggests teachers:

  • Act early. Try to see the favor you are doing kids in catching off-task behavior early and using a minor intervention of consequence to prevent a major consequence later.
  • Act reliably. Be predictably consistent, sufficient to take the variable of how you will react out of the equation and focus students on the action that precipitated your response.
  • Act proportionately. Start small when the misbehavior is small; don’t go nuclear unless the situation is nuclear.

I have tried to follow these guidelines to the best of my ability, but Lemov would say the warning is not taking action, only telling students “a certain amount of disobedience will not only be tolerated but is expected” (p. 200). He would say students will get away with what they can until they are warned, and will only refocus and cease their side conversations afterwards. Lemov makes a valid point, and I have indeed seen this happen to a degree. As a whole, however, the system has been effective, and most of my classes do not at all take advantage of their warning. Knowing they can receive a consequence without a warning has helped, perhaps. After a month of using the cards, I have given my homeroom a red card three times. In my other five classes combined during the same period, there have been two yellows and only one red. I have issued a few individual yellows, but no reds.

Perhaps it is counterproductive to have a warning, but I personally feel that since the primary focus of the system is on group accountability, I need to give talkative students a chance to correct their behavior before consequences are doled out for the entire class. Sometimes a reminder is necessary, the reminder that their actions affect their classmates and that they need to refocus. I do not want to punish the students who are not being disruptive along with those who are without issuing some sort of warning that they are on thin ice.

***

During my two student teaching placements this semester, I greatly enjoyed getting to know my students. It was one of the more rewarding aspects of teaching. Introducing myself and my interests in detail on the first day I arrived proved to be an excellent start; I told them I liked history, soccer, drawing, reading, etc. Building relationships was easy, as students seemed fascinated by me and had an endless array of questions about who I was and where I came from.

Art is something I used to connect with students. At both my schools, the first students I got to know were the budding artists, as I was able to observe them sketching in the corners of their notebooks and later ask to see their work. There was one girl at my first placement who drew a new breed of horse on the homeroom whiteboard each morning; a boy at my second placement was drawing incredible fantasy figures every spare second he had. I was the same way when I was their age, so naturally I struck up conversations about their pictures. I tried to take advantage of such an interest by asking students to draw posters of Hindu gods or sketch images next to vocabulary words to aid recall. Not everyone likes to draw, but I like to encourage the skill and at least provide them an opportunity to try. Beyond this, I would use what novels students had with them to learn about their fascinations and engage them, and many were excited I knew The Hunger Games, The Hobbit, and The Lord of the Rings. We would discuss our favorite characters and compare such fiction to recent films.

For all my students, I strove to engage them each day with positive behavior, including greeting them by name at the door, drawing with and for them, laughing and joking with them, maintaining a high level of interest in what students were telling me (even if they rambled aimlessly, as they had the tendency to do) and even twice playing soccer with them at recess. The Catholic community of my first placement also provided the chance to worship and pray with my kids, an experience I will not forget.

One of my successes was remaining emotionally cool, giving students a sense of calm, confidence, and control about me. Marzano (2007) writes, “It is important to keep in mind that emotional objectivity does not imply being impersonal with or cool towards students. Rather, it involves keeping a type of emotional distance from the ups and downs of classroom life and not taking students’ outbursts or even students’ direct acts of disobedience personally” (p. 152). Even when I was feeling control slipping away from me, I did my best to be calm, keep my voice low, and correct students in a respectful manner that reminded them they had expectations they needed to meet. Lemov (2010) agrees, writing, “An emotionally constant teacher earns students’ trust in part by having them know he is always under control. Most of all, he knows success is in the long run about a student’s consistent relationship with productive behaviors” (p. 219). Building positive relationships required mutual respect and trust, and emotional constancy was key.

Another technique I emphasized was the demonstration of my passion for social studies, to prove to them the gravity of my personal investment in their success. One lesson from my first placement covered the persecution of Anne Hutchinson in Puritan America; we connected it to modern sexism, such as discrimination against women in terms of wage earnings. Another lesson was about racism, how it originated as a justification for African slavery and how the election of Barack Obama brought forth a surge of openly racist sentiment from part of the U.S. citizenry. I told them repeatedly that we studied history to become dissenters and activists, people who would rise up and destroy sexism and racism. I told them I had a personal stake in their understanding of such material, a personal stake in their future, because they were the ones responsible for changing our society in positive ways. Being the next generation, ending social injustices would soon be up to them.

Marzano (2007) says, “Arguably the quality of the relationships teachers have with students is the keystone of effective management and perhaps even the entirety of teaching” (p. 149). In my observation experiences, I saw burnt out and bitter teachers, who focused their efforts on authoritative control and left positive relationship-building on the sideline. The lack of strong relationships usually meant more chaotic classrooms and more disruptive behavior. As my career begins, I plan to make my stake in student success and my compassion for each person obvious, and stay in the habit.

For more from the author, subscribe and follow or read his books. 

References

Lemov, D. (2010). Teach like a champion: 49 techniques that put students on the path to college. San Francisco, CA: Jossey-Bass.

Marzano, R. (2007). The art and science of teaching. Alexandria, VA: Association for Supervision and Curriculum Development.

Bernie Will Win Iowa

Predicting the future isn’t something I make a habit of. It is a perilous activity, always involving a strong chance of being wrong and looking the fool. Yet sometimes, here and there, conditions unfold around us in a way that gives one enough confidence to hazard a prediction. I believe that Bernie Sanders will win Iowa today.

First, consider that Bernie is at the top of the polls. Polls aren’t always reliable predictors, and he’s neck-and-neck with an opponent in some of them, but it’s a good sign.

Second, Bernie raised the most money in Q4 of 2019 by far, a solid $10 million more than the second-place candidate, Pete Buttigieg. He has more individual donations at this stage than any candidate in American history, has raised the most overall in this campaign, and is among the top spenders in Iowa. (These analyses exclude billionaire self-funders Bloomberg and Steyer, who have little real support.) As with a rise in the polls, he has momentum like no one else.

Third, Bernie is the only candidate in this race who was campaigning in Iowa in 2016, which means more voter touches and repeat voter touches. This is Round 2 for him, an advantage — everyone else is in Round 1.

Next, don’t forget, Iowa in 2016 was nearly a tie between Bernie and Hillary Clinton. It was the closest result in the state’s caucus history; Hillary won just 0.3% more delegate equivalents. It’s probably safe to say Bernie is more well-known today, four years later — if he could tie then, he can win now.

Fifth, in Iowa in 2016, there were essentially two voting blocs: the Hillary Bloc and the Bernie Bloc. (There was a third but insignificant candidate.) These are the people who actually show up to caucus — what will they do now? I look at the Bernie Bloc as probably remaining mostly intact. He may lose some voters to Warren or others, as this field has more progressive options than last time, but I think his supporters’ fanatical passion and other voters’ interest in the most progressive candidate will mostly keep the Bloc together. The Hillary Bloc, of course, will be split between the many other candidates — leaving Bernie the victor. (Even if there is much higher turnout than in 2016, I expect the multitude of candidates to aid Bernie — and many of the new voters will go to him, especially if they’re young. An historic youth turnout is expected, and they mostly back Sanders.)

This last one is simply anecdotal. All candidates have devoted campaigners helping them. But I must say it. The best activists I know are on the case. They’ve put their Kansas City lives on hold and are in Iowa right now. The Kansas City Left has Bernie’s back, and I believe in them.

To victory, friends.

For more from the author, subscribe and follow or read his books.

The Enduring Stupidity of the Electoral College

To any sensible person, the Electoral College is a severely flawed method of electing our president. Most importantly, it is a system in which the less popular candidate — the person with fewer votes — can win the White House. That absurdity would be enough to throw the Electoral College out and simply use a popular vote to determine the winner. Yet there is more.

It is a system where your vote becomes meaningless, giving no aid to your chosen candidate, if you’re in your state’s political minority; where small states have disproportionate power to determine the winner; where white voters have disproportionate decision-making power compared to voters of color; and where electors, who are supposed to represent the majority of voters in each state, can change their minds and vote for whomever they please. Not even its origins are pure, as slavery and the desire to keep voting power away from ordinary people were factors in its design.

Let’s consider these problems in detail. We’ll also look at the threadbare attempts to justify them.

The votes of the political minority become worthless, leading to a victor with fewer votes than the loser

When we vote in presidential elections, we’re not actually voting for the candidates. We’re voting on whether to award decision-making power to Democratic or Republican electors. 538 people will cast their votes, and the candidate who receives a majority (at least 270 votes) will win. The electors are chosen by the political parties at state conventions, through committees, or by the presidential candidates, depending on the state. The electors could be anyone, but they are usually involved with the parties or are close allies. In 2016, for instance, electors included Bill Clinton and Donald Trump, Jr. Since they are chosen for their loyalty, they typically (though not always, as we will see) vote for the party that chose them.

The central problem with this system is that almost all states are all-or-nothing in how they award electors. (Only two states, Maine and Nebraska, have acted on this unfairness, awarding two electors to the statewide winner and one to the winner of each congressional district.) As a candidate, winning a state by a single citizen vote grants you all of its electors.

Imagine you live in Missouri. Let’s say in 2020 you vote Republican, but the Democratic candidate wins the state; the majority of Missourians voted Blue. All of Missouri’s 10 electors are then awarded to the Democratic candidate. When that happens, your vote does absolutely nothing to help your chosen candidate win the White House. It has no value. Only the votes of the political majority in the state influence who wins, by securing electors. It’s as if you never voted at all — it might as well have been 100% of Missourians voting Blue. As a Republican, wouldn’t you rather have your vote matter as much as all the Democratic votes in Missouri? For instance, 1 million Republican votes pushing the Republican candidate toward victory alongside the, say, 1.5 million Democratic votes pushing the Democratic candidate forward? Versus zero electors for the Republican candidate and 10 electors for the Democrat?

In terms of real contribution to a candidate’s victory, the outcomes can be broken down, and compared to a popular vote, in this way:

State Electoral College victor: contribution (electors)
State Electoral College loser: no contribution (no electors)

State popular vote victor: contribution (votes)
State popular vote loser: contribution (votes)

Under a popular vote, however, your vote won’t become meaningless if you’re in the political minority in your state. It will offer an actual contribution to your favored candidate. It will be worth the same as the vote of someone in the political majority. The Electoral College simply does not award equal value to each vote (see more examples below), whereas the popular vote does, by allowing the votes of the political minority to influence the final outcome. That’s better for voters, as it gives votes equal power. It’s also better for candidates, as the loser in each state would actually get something for his or her efforts. He or she would keep the earned votes, moving forward in his or her popular vote count. Instead of getting zero electors — no progress at all.

But why, one may ask, does this really matter? When it comes to determining who wins a state and gets its electors, all votes are of equal value. The majority wins, earning the right to give all the electors to its chosen candidate. How exactly is this unfair?

It’s unfair because, when all the states operate under such a system, it can lead to the candidate with fewer votes winning the White House. Under a winner-take-all distribution of electors, each state’s political minority votes are ignored — but those votes can add up. 66 million Americans may choose the politician you support, but the other candidate may win with just 63 million votes. That’s what happened in 2016. It also happened in 2000, as well as in 1876 and 1888. It simply isn’t fair or just for a candidate with fewer votes to win. It is mathematically possible, in fact, to take just 21.8% of the popular vote and still win the presidency. While very unlikely, it is possible. That would mean, for example, a winner with 28 million votes and a loser with 101 million! This is absurd and unfair on its face. The candidate with the most votes should be the victor, as is the case with every other political race in the United States, and as is standard practice among the majority of the world’s democracies.

The lack of fairness and unequal value of citizen votes go deeper, however.

Small states and white power

Under the Electoral College, your vote is worth less in big states. For instance, Texas, with 28.7 million people and 38 electors, has one elector for every 755,000 people. But Wyoming, with 578,000 people and 3 electors, has one elector for every 193,000 people. In other words, each Wyoming voter has a bigger influence over who wins the presidency than each Texas voter. The smallest states, holding about 4% of the U.S. population, control roughly 8% of the electors. Why not 4%, to keep votes equal? (For those who think all this was the intent of the Founders, to give more power to smaller states, we’ll address that later on.)

To make things even, Texas would need many more electors. As would other big states. You have to look at changing population data and frequently adjust electors, as the government is supposed to do based on the census and House representation — it just doesn’t do it very well. It would be better to do away with the Electoral College entirely, because under a popular vote the vote of someone from Wyoming would be precisely equal to the vote of a Texan. Each would be one vote out of the 130 million or so cast. No adjustments needed.
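For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch using the approximate figures above; it also estimates how many electors Texas would need, at Wyoming’s ratio, for its voters to carry equal weight:

```python
# Back-of-the-envelope arithmetic for the elector ratios cited above,
# using the approximate population figures from the text.

texas_pop, texas_electors = 28_700_000, 38
wyoming_pop, wyoming_electors = 578_000, 3

print(texas_pop // texas_electors)      # ~755,000 people per Texas elector
print(wyoming_pop // wyoming_electors)  # ~193,000 people per Wyoming elector

# Electors Texas would need to match Wyoming's people-per-elector ratio:
print(round(texas_pop / (wyoming_pop / wyoming_electors)))  # ~149, versus its actual 38
```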

It also just so happens that less populous states tend to be very white, and more populous states more diverse, meaning disproportionate white decision-making power overall.

Relatedly, it’s important to note that the political minority in each state, which will become inconsequential to the presidential race, is sometimes dominated by racial minorities, or at least most voters of color will belong to it. As Bob Wing writes, because “in almost every election white Republicans out-vote [blacks, almost all Democrats] in every Southern state and every border state except Maryland,” the “Electoral College result was the same as if African Americans in the South had not voted at all.”

Faithless electors

After state residents vote for electors, the electors can essentially vote for whomever they want, in many states at least. “There are 32 states (plus the District of Columbia) that require electors to vote for a pledged candidate. Most of those states (19 plus DC) nonetheless do not provide for any penalty or any mechanism to prevent the deviant vote from counting as cast. Four states provide a penalty of some sort for a deviant vote, and 11 states provide for the vote to be canceled and the elector replaced…”

Now, electors are chosen specifically because of their loyalty, and “faithless electors” are extremely rare, but that doesn’t mean they will always vote for the candidate you elected them to vote for. There have been 85 electors in U.S. history who abstained or changed their vote on a whim, sometimes for racist reasons, sometimes by accident. Even more changed their votes after a candidate died — perhaps the voters would have liked to select another option themselves. Even if rare, all this should not be possible or legal. It is yet another way the Electoral College has built-in unfairness — imagine the will of a state’s political majority being ignored.

(All this used to be worse, in fact. Early on, some state legislatures appointed electors, meaning whatever party controlled a legislature simply selected people who would pick its favored presidential candidate. How voters cast their ballots did not matter.)

* * *

Won’t a popular vote give too much power to big states and cities?

Let’s turn now to the arguments against a popular vote, usually heard from conservatives. A common one is that big states, or big cities, will “have too much power.” Rural areas and less populous states and towns will supposedly have less.

This misunderstands power. States don’t vote. Cities don’t vote. People do. If we’re speaking solely about power, about influence, where you live does not matter. The vote of someone in Eudora, Kansas, is worth the same as someone in New York, New York.

This argument is typically posited by those who think that because some big, populous states like California and New York are liberal, this will mean liberal rule. (Conservative Texas, the second-most populous state, and sometimes-conservative swing states like Florida [third-most populous] and Pennsylvania [fifth-most populous] are ignored.) Likewise, because a majority of Americans today live in cities, and cities tend to be more liberal than small towns, this will result in the same. The concern for rural America and small states is really a concern for Republican power.

But obviously, in a direct election each person’s vote is of equal weight and importance, regardless of where they live. 63% of Americans live in cities, so it is true that most voters will be living and voting in cities, but it cannot be said the small-town voter has a weaker voice than the city dweller. Their votes have identical sway over who will be president. In the same way, a voter in a populous coastal state has no more influence than one in Arkansas.

No conservative looks with dismay at the direct election of his Democratic governor or congresswoman and says, “She only won because the small towns don’t have a voice. We have to find a way to diminish the power of the big cities!” No one complains that X area has too many people and too many liberals and argues some system should fix this. No one cries, “Tyranny of the majority! Mob rule!” They say, “She got the most votes, seems fair.” Why? Because one understands that the vote of the rural citizen is worth the same as the vote of an urban citizen, but if there happens to be more people living in cities in your state, or if there are more liberals in your state, so be it. That’s the freedom to live where you wish, believe what you wish, and have a vote worth the same as everyone else’s.

Think about the popular vote in past elections. About half of Americans vote Republican, about half vote Democratic. One candidate gets a few hundred thousand or a few million more. It would be exactly the same if the popular vote determined the winner rather than the Electoral College — where you live is irrelevant. What matters is the final vote tally.

It’s not enough to simply complain that the United States is too liberal, and that therefore we must preserve the Electoral College. That’s really what this argument boils down to, and it’s not an argument at all. Unfair structures can’t be justified because they serve one political party. Whoever can win the most American votes should be president, no matter what party they come from.

But won’t candidates only pander to big states and cities?

This is a different question, and it has merit. It is true that where candidates campaign will change with the implementation of a popular vote. Conservatives warn that candidates will spend most of their time in the big cities and big states, and ignore rural places. This is likely true, as candidates (of both parties) will want to reach as many voters as possible in the time they have to garner support.

Yet this carries no weight as an argument against a popular vote, because the Electoral College has a very similar problem. Candidates focus their attention on swing states.

There’s a reason Democrats don’t typically campaign very hard in conservative Texas and Republicans don’t campaign hard in liberal California. Instead, they campaign in states that are more evenly divided ideologically, states that sometimes go Blue and sometimes go Red. They focus also on swing states with a decent number of electors. The majority of campaign events are in just six states. Unless you live in one of these places, like Ohio, Florida, or Pennsylvania, your vote isn’t as vital to victory and your state won’t get as much pandering. The voters in swing states are vastly more important, their votes much more valuable than elsewhere.

How candidates focusing on a handful of swing states might be so much better than candidates focusing on more populous areas is never explained by Electoral College supporters. It seems like a fair trade, but with a popular vote we also get the candidate with the most support always winning, votes of equal worth, and no higher-ups to ignore the will of the people.

However, with a significant number of Americans still living outside big cities, attention will likely still be paid to rural voters — especially, one might assume, by the Republican candidate. Nearly 40% of the nation living in small towns and small states isn’t something wisely ignored. Wherever the parties shift most of their attention, there is every reason to think Blue candidates will want to solidify their win by courting Blue voters in small towns and states, and Red candidates will want to ensure theirs by courting Red voters in big cities and states. Even if the rural voting bloc didn’t matter and couldn’t sway the election (it would and could), one might ask how a handful of big states and cities alone determining the outcome of the election is so much worse than a few swing states doing the same in the Electoral College system.

Likewise, the fear that a president, plotting reelection, will better serve the interests of big states and cities seems about as reasonable as fear that he or she would better serve the interests of the swing states today. One is hardly riskier than the other.

But didn’t the Founders see good reason for the Electoral College?

First, it’s important to note that invoking the Founding Fathers doesn’t automatically justify flawed governmental systems. The Founders were not perfect, and many of the policies and institutions they decreed in the Constitution are now gone.

Even before the Constitution, the Founders’ Articles of Confederation were scrapped after just seven years. Later, the Twelfth Amendment got rid of a system where the losing presidential candidate automatically became vice president — a reform of the Electoral College. Our senators were elected by the state legislatures, not by the people, until 1913 (Amendment 17 overturned clauses from Article 1, Section 3 of the Constitution). Only in 1856 did the last state, North Carolina, do away with property requirements to vote for members of the House of Representatives, allowing the poor to participate. The Three-Fifths Compromise (the Enumeration Clause of the Constitution), which counted slaves as less than full persons for political representation purposes, is gone, and today people of color, women, and people without property can vote thanks to various amendments. There were no term limits for the president until 1951 (Amendment 22) — apparently an executive without term limits didn’t give the Founders nightmares of tyranny.

The Founders were very concerned about keeping political power away from ordinary people, who might take away their riches and privileges. They wanted the wealthy few, like themselves, to make the decisions. See How the Founding Fathers Protected Their Own Wealth and Power.

The Electoral College, at its heart, was a compromise between Congress selecting the president and the citizenry doing so. The people would choose the people to choose the president. Alexander Hamilton wrote that the “sense of the people should operate in the choice of the person to whom so important a trust was to be confided.” He thought “a small number of persons, selected by their fellow-citizens from the general mass, will be most likely to possess the information and discernment requisite to such complicated investigations.”

Yet the Founders did not anticipate that states would pass winner-take-all elector policies, and some wanted it abolished. The Constitution and its writers did not establish such a mechanism. States did, and only after the Constitution, which established the Electoral College, was written. In 1789, only three states had such laws, according to the Institute for Research on Presidential Elections. It wasn’t until 1836 that every state (save one, which held out until after the Civil War) adopted a winner-take-all law; they sought more attention from candidates by offering all electors to the victor, they wanted their chosen sons to win more electors, and so forth. Before (and alongside) the winner-take-all laws, states were divided into districts and the people in each district would elect an elector (meaning a state’s electors could be divided up among candidates). Alternatively, state legislatures would choose the electors, meaning citizens did not vote for the president in any way, even indirectly! James Madison wrote that “the district mode was mostly, if not exclusively in view when the Constitution was framed and adopted; & was exchanged for the general ticket [winner-take-all] & the legislative election” later on. He suggested a Constitutional amendment (“The election of Presidential Electors by districts, is an amendment very proper to be brought forward…”) and Hamilton drafted it.

Still, among Founders and states, it was an anti-democratic era. Some Americans prefer more democratic systems, and don’t cling to tradition — especially tradition as awful and unfair as the Electoral College — for its own sake. Some want positive changes to the way government functions and broadened democratic participation, to improve upon what the Founders started, as we have so many times before.

Now, it’s often posited that the Founding Fathers established the Electoral College to make sure small states had more power to determine who won the White House. As we saw above, votes in smaller states are worth more than in big ones.

Even if the argument that “we need the Electoral College so small states can actually help choose the president” made sense in a bygone era where people viewed themselves as Virginians or New Yorkers, not Americans (but rather as part of an alliance called the United States), it makes no sense today. People now see themselves as simply Americans — as American citizens together choosing an American president. Why should where you live determine the power of your vote? Why not simply have everyone’s vote be equal?

More significantly, it cannot be said that strengthening smaller states was a serious concern to the Founders at the Constitutional Convention. They seemed to accept that smaller states would simply have fewer voters and thus less influence. Legal historian Paul Finkelman writes that

in all the debates over the executive at the Constitutional Convention, this issue [of giving more power to small states] never came up. Indeed, the opposite argument received more attention. At one point the Convention considered allowing the state governors to choose the president but backed away from this in part because it would allow the small states to choose one of their own.

In other words, they weren’t looking out for the little guy. Political scientist George C. Edwards III calls this whole idea a “myth,” stressing: “Remember what the country looked like in 1787: The important division was between states that relied on slavery and those that didn’t, not between large and small states.”

Slavery’s influence

The Electoral College is also an echo of white supremacy and slavery.

As the Constitution was formed in the late 1780s, Southern politicians and slave-owners at the Convention had a problem: Northerners were going to get more seats in the House of Representatives (which were to be determined by population) if blacks weren’t counted as people. Southern states had sizable populations, but large portions were disenfranchised slaves and free black people (South Carolina, for instance, was nearly 50% black).

This prompted slave-owners, most of whom considered blacks by nature somewhere between animals and whites, to push for slaves to be counted as fully human for political purposes. They needed blacks for greater representative power for Southern states. Northern states, also seeking an advantaged position, opposed counting slaves as people. This odd reversal brought about the Three-Fifths Compromise most of us know, which determined a slave would be counted as three-fifths of a person.

The Electoral College was largely a solution to the same problem. True, as we saw, it served to keep power out of the hands of ordinary people and in the hands of the elites, but race and slavery unquestionably influenced its inception. As the Electoral College Primer put it, Southerners feared “the loss in relative influence of the South because of its large nonvoting slave population.” They were afraid the direct election of the president would put them at a numerical disadvantage. To put it bluntly, Southerners were upset their states didn’t have more white people. A popular vote had to be avoided.

For example, Hugh Williamson of North Carolina remarked at the Convention, during debate on a popular election of the president: “The people will be sure to vote for some man in their own State, and the largest State will be sure to succede [sic]. This will not be Virga. however. Her slaves will have no suffrage.” Williamson saw that states with high populations had an advantage in choosing the president. But a great number of people in Virginia were slaves. Would this mean that Virginia and other slave states didn’t have the numbers of whites to affect the presidential election as much as the large Northern states?

The writer of the Constitution, slave-owner and future American president James Madison, thought so. He said that

There was one difficulty however of a serious nature attending an immediate choice by the people. The right of suffrage was much more diffusive in the Northern than the Southern States; and the latter could have no influence in the election on the score of the Negroes. The substitution of electors obviated this difficulty…

The question for Southerners was: How could one make the total population count for something, even though much of the population couldn’t vote? How could black bodies be used to increase Southern political power? Counting slaves helped put more Southerners in the House of Representatives, and now counting them in the apportionment of electors would help put more Southerners in the White House.

Thus, Southerners pushed for the Electoral College. The number of electors would be based on how many members of Congress each state possessed — which, recall, was affected by counting a black American as three-fifths of a person. Each state would have one elector per representative in the House, plus two for the state’s two senators (today we have 435 + 100 + 3 for D.C. = 538). In this way, the number of electors was still based on population (not the whole population, though, as blacks were not counted as full persons), even though a massive part of the American population in 1787 could not vote. The greater a state’s population, the more House reps it had, and thus the more electors it had. Southern electoral power was secure.

This worked out pretty well for the racists. “For 32 of the Constitution’s first 36 years, a white slaveholding Virginian occupied the presidency,” notes Akhil Reed Amar. The advantage didn’t go unnoticed. Massachusetts congressman Samuel Thatcher complained in 1803, “The representation of slaves adds thirteen members to this House in the present Congress, and eighteen Electors of President and Vice President at the next election.”

Tyrants and imbeciles

At times, it’s suggested that the electors serve an important function: if the people select a dangerous or unqualified candidate — like an authoritarian or a fool — to be the party nominee, the electors can pick someone else and save the nation. Hamilton said, “The process of election affords a moral certainty, that the office of President will never fall to the lot of any man who is not in an eminent degree endowed with the requisite qualifications.”

Obviously, looking at Donald Trump, the Electoral College is just as likely to put an immoral doofus in the White House as to keep one out. Trump may not fit that description to you, but some day a candidate may come along who does. And since the electors are chosen for their loyalty, they are unlikely to stop such a candidate, even if they have the power to be faithless. We might as well simply let the people decide.

It is a strange thing indeed that some people insist a popular vote will lead to dictatorship, ignoring the majority of the world’s democracies that directly elect their executive officer. They have not plunged into totalitarianism. Popular vote simply doesn’t get rid of checks and balances, co-equal branches, a constitution, the rule of law, and other aspects of free societies. These things are not incompatible.

France has had direct elections since 1965 (de Gaulle). Finland since 1994 (Ahtisaari). Portugal since 1918 (Pais). Poland since 1990 (Wałęsa). Why aren’t these nations run by despots by now? Why do even conservative institutes rank nations like Ireland, Finland, and Austria higher up on a “Human Freedom Index” than the United States? How is this possible, if direct elections of the executive lead to tyranny?

There are many factors that cause dictatorship and ruin, but simply giving the White House to whoever gets the most votes is not necessarily one of them.

Modern motives

We close by stating the obvious. There remains strong support for the Electoral College among conservatives because it has recently aided Republican candidates like Bush (2000) and Trump (2016). If the GOP lost presidential elections despite winning the popular vote, as the other party has, it would perhaps see the system’s unfair nature.

The popular vote, in an increasingly diverse, liberal country, doesn’t serve conservative interests. Republicans have won the popular vote just once since and including the 1992 election. Conservatives are worried that if the Electoral College vanishes and each citizen has a vote of equal power, their days are numbered. Better to preserve an outdated, anti-democratic system that benefits you than reform your platform and policies to change people’s minds about you and strengthen your support. True, the popular vote may serve Democratic interests. Fairness serves Democratic interests. But, unlike unfairness, which Republicans seek to preserve, fairness is what’s right. Giving the candidate with the most votes the presidency is what’s right.

For more from the author, subscribe and follow or read his books.

An Absurd, Fragile President Has Revealed an Absurd, Fragile American System

The FBI investigation into Donald Trump, one of the most ludicrous and deplorable men to ever sit in the Oval Office, was a valuable lesson in just how precariously justice balances on the edge of a knife in the United States. The ease with which any president could obstruct or eliminate accountability for his or her misdeeds should frighten all persons regardless of political ideology.

Let’s consider the methods of the madness, keeping in mind that whether or not a specific president like Trump is innocent of crimes or misconduct, it’s smart to have effective mechanisms in place to bring guilty executives, present or future, to justice. The stupidity of the system could be exploited by a president of any political party. This must be rectified.

A president can fire those investigating him — and replace them with allies who could shut everything down

The fact the above statement can be written truthfully about an advanced democracy, as opposed to some totalitarian regime, is insane. Trump of course did fire those looking into his actions, and replaced them with supporters.

The FBI (not the Democrats) launched an investigation into Trump and his associates concerning possible collusion with Russia during the 2016 election and obstruction of justice, obviously justified given his and their suspicious behavior, some of which was connected to actual criminal activity, at least among Trump’s associates who are now felons. Trump fired James Comey, the FBI director, who was overseeing the investigation. Both Trump and his attorney Rudy Giuliani publicly indicated the firing was motivated by the Russia investigation; Comey testified Trump asked him to end the FBI’s look into Trump ally Michael Flynn, though not the overall Russia inquiry.

The power to remove the FBI director could be used to slow down an investigation (or shut it down, if the acting FBI director is loyal to the president, which Andrew McCabe was not), but more importantly a president can then nominate a new FBI director, perhaps someone more loyal, meaning corrupt. (Christopher Wray, Trump’s pick, worked for a law firm that did business with Trump’s business trust, but does not seem a selected devotee like the individuals you will see below, perhaps because by the time his installation came around the investigation was in the hands of Special Counsel Robert Mueller.) The Senate must confirm the nomination, but that isn’t entirely reassuring. The majority party could push through a loyalist, to the dismay of the minority party, and that’s it. Despite this being a rarity, as FBI directors are typically overwhelmingly confirmed, it’s possible. A new director could then end the inquiry.

Further, the president can fire the attorney general, the FBI director’s boss. The head of the Justice Department, this person has ultimate power over investigations into the president — at least until he or she is removed by said president. Trump made clear he was upset with Attorney General Jeff Sessions for recusing himself from overseeing the Russia inquiry because Sessions could have discontinued it. It was reported Trump even asked Sessions to reverse this decision! Sessions recused himself less than a month after taking office, a couple months before Comey was fired. For less than a month, Sessions could have ended it all.

Deputy Attorney General Rod Rosenstein, luckily no Trump lackey, was in charge after Sessions stepped away from the matter. It was Rosenstein who appointed Robert Mueller special counsel and had him take over the FBI investigation, after the nation was so alarmed by Comey’s dismissal. Rosenstein had authority over Mueller and the case (dodging a bullet when Trump tried to order Mueller’s firing but was rebuked by his White House lawyer; Trump could have rescinded the Justice Department regulations that said only the A.G. could fire the special counsel, with an explosive court battle over constitutionality surely following) until Trump fired Sessions and installed loyalist Matt Whitaker as Acting Attorney General. Whitaker is a man who

defended Donald Trump Jr.’s decision to meet with a Russian operative promising dirt on Hillary Clinton. He opposed the appointment of a special counsel to investigate Russian election interference (“Hollow calls for independent prosecutors are just craven attempts to score cheap political points and serve the public in no measurable way.”) Whitaker has called on Rod Rosenstein to curb Mueller’s investigation, and specifically declared Trump’s finances (which include dealings with Russia) off-limits. He has urged Trump’s lawyers not to cooperate with Mueller’s “lynch mob.”

And he has publicly mused that a way to curb Mueller’s power might be to deprive him of resources. “I could see a scenario,” he said on CNN last year, “where Jeff Sessions is replaced, it would [be a] recess appointment and that attorney general doesn’t fire Bob Mueller but he just reduces his budget to so low that his investigation grinds to almost a halt.”

Whitaker required no confirmation from the Senate. Like an official attorney general, he could have ended the inquiry and fired Robert Mueller if he saw “good cause” to do so, or effectively crippled the investigation by limiting its resources or scope. That did not occur, but it’s not hard to imagine Whitaker parroting Trump’s wild accusations of Mueller’s conflicts of interest, or whipping up some bullshit of his own to justify axing the special counsel. The same can be said of Bill Barr, who replaced Whitaker. Barr, who did need Senate confirmation, was also a Trump ally, severely endangering the rule of law:

In the Spring of 2017, Barr penned an op-ed supporting the President’s firing Comey. “Comey’s removal simply has no relevance to the integrity of the Russian investigation as it moves ahead,” he wrote. In June 2017, Barr told The Hill that the obstruction investigation was “asinine” and warned that Mueller risked “taking on the look of an entirely political operation to overthrow the president.” That same month, Barr met with Trump about becoming the president’s personal defense lawyer for the Mueller investigation, before turning down the overture for that job.

In late 2017, Barr wrote to the New York Times supporting the President’s call for further investigations of his past political opponent, Hillary Clinton. “I have long believed that the predicate for investigating the uranium deal, as well as the foundation, is far stronger than any basis for investigating so-called ‘collusion,’” he wrote to the New York Times’ Peter Baker, suggesting that the Uranium One conspiracy theory (which had by that time been repeatedly debunked) had more grounding than the Mueller investigation (which had not). Before Trump nominated him to be attorney general, Barr also notoriously wrote an unsolicited 19-page advisory memo to Rod Rosenstein criticizing the obstruction component of Mueller’s investigation as “fatally misconceived.” The memo’s criticisms proceeded from Barr’s long-held and extreme, absolutist view of executive power, and the memo’s reasoning has been skewered by an ideologically diverse group of legal observers.

What happy circumstances, Trump being able to shuffle the investigation into his own actions to his first hand-picked attorney general (confirmation to recusal: February 8 to March 2, 2017), an acting FBI director (even if not an ally, the act itself is disruptive), a hand-picked acting attorney general, and a second hand-picked attorney general. Imagine police detectives are investigating a suspect but he’s their boss’ boss. That’s a rare advantage.

The nation held its breath with each change, and upon reflection it seems almost miraculous Mueller’s investigation concluded at all. Some may see this as a testament to the strength of the system, but it all could have easily gone the other way. There were no guarantees. What if Sessions hadn’t recused himself? What if he’d shut down the investigation? What if Comey, McCabe, or Rosenstein had been friendlier to Trump? What if Whitaker or Barr had blown the whole thing up? Yes, political battles, court battles, to continue the inquiry would have raged — but there are no guarantees they would have succeeded.

Tradition, political and public pressure…these mechanisms aren’t worthless, but they hardly seem as valuable as structural, legal changes to save us from having to simply hope the pursuit of justice doesn’t collapse at the command of the accused or his or her political allies. We can strip the president of any and all power over the Justice Department workers investigating him or her, temporarily placing A.G.s under congressional authority, and eradicate similar conflicts of interest.

The Department of Justice can keep its findings secret

Current affairs highlighted this problem as well. When Mueller submitted his finished report to Bill Barr, the attorney general was only legally required to submit a summary of Mueller’s findings to Congress. He did not need to provide the full report or full details to the House and Senate, much less to the public. He didn’t even need to release the summary to the public!

This is absurd, obviously setting up the possibility that a puppet attorney general might not tell the whole story in the summary to protect the president. Members of Mueller’s team are currently saying to the press that Barr’s four-page summary is too rosy, leaving out damaging information about Trump. The summary says Mueller found no collusion (at least, no illegal conspiring or coordinating), and that Barr, Rosenstein, and other department officials agreed there wasn’t enough evidence of obstruction of justice. But one shouldn’t be forced to give a Trump ally like Barr the benefit of the doubt; one should be able to see the evidence to determine if he faithfully expressed Mueller’s findings and hear detailed arguments as to how he and others reached a verdict on obstruction. Barr is promising a redacted version of the report will be available this month. He did not have to do this — we again simply had to hope Barr would give us more. Just as we must hope he can be pressured into giving Congress the full, unedited report. This must instead be required by law, and the public is at least owed a redacted version. Hope is unacceptable. It would also be wise to find a more independent, bipartisan or nonpartisan way to rule on obstruction if the special counsel declines to do so — perhaps done in a court of law, rather than a Trump lackey’s office.

The way of doing things now is simply a mess. What if an A.G. is untruthful in his summary? Or wants only Congress to see it, not the public? What if she declines to release a redacted version? What if the full report is never seen beyond the investigators and their Justice Department superiors, appointed supporters of the president being investigated? What if a ruling on obstruction is politically motivated?

We don’t know if the president can be subpoenaed to testify

While the Supreme Court has established that the president can be subpoenaed, or forced, to turn over materials (such as Nixon and his secret White House recordings), it hasn’t specifically ruled on whether the president must testify before Congress, a special counsel, or a grand jury if called to do so. While the president, like any other citizen, has Fifth Amendment rights (he can’t be “compelled in any criminal case to be a witness against himself,” risking self-incrimination), we do need to know if the executive can be called as a witness, and under what circumstances. Mueller chose not to subpoena Trump’s testimony because it would lead to a long legal battle. That’s what unanswered questions and constitutional crises produce.

We have yet to figure out if a sitting president can be indicted

If the executive commits a crime, can he or she be charged for it while in office? Can the president go to trial, be prosecuted, sentenced, imprisoned? We simply do not know. The Office of Legal Counsel at the Justice Department says no, but there is fierce debate over whether it’s constitutional or not, and the Supreme Court has never ruled on the matter.

There’s been much worry lately, due to Trump’s many legal perils, over this possible “constitutional crisis” arising, a crisis of our own design, having delayed creating laws for this sort of thing for centuries. For now, the trend is to follow Justice Department policy, rather helpful for a president who’s actually committed a felony. The president can avoid prosecution and punishment until leaving office, or even avoid it entirely if the statute of limitations runs out before the president’s term is over!

“Don’t fret, Congress can impeach a president who seems to have committed a crime. Out of office, a trial can commence.” That is of little comfort, given the high bar for impeachment. Bitter partisanship could easily prevent the impeachment of a president, no matter how obvious or vile the misdeeds. It’s not a sure thing.

The country needs to rule on this issue, at the least eliminating statutes of limitations for presidents, at most allowing criminal proceedings to occur while the president is in office.

We don’t know if a president can self-pardon (and pardons themselves are ruinous)

Trump, like the blustering authoritarian he is, declared he had the “absolute right” to pardon himself. But the U.S. has not figured this out either. It’s also a matter of intense debate, without constitutional clarity or judicial precedent. A sensible society might make it clear that the executive is not above the law — he or she cannot simply commit crimes with impunity, cannot self-pardon. Instead, we must wait for a crisis to force us to decide on this issue. And, it should be emphasized, the impeachment of a president who pardoned him- or herself would not be satisfactory. Crimes warrant consequences beyond “You don’t get to be president anymore.”

[Update: In early 2021, Trump spent his last days in office pardoning and granting clemency to corrupt allies and associates, those in prison for fraud, lying to the authorities, threatening witnesses, defying court orders, obstructing justice, and so on. Included were Steve Bannon, Roger Stone, Paul Manafort, Mike Flynn, George Papadopoulos, and more. Trump later promised that in a future term he would let more criminals back onto the streets: the January 6 Capitol rioters who assaulted police and broke into Congress to stop election proceedings, resulting in multiple deaths. The presidential pardon ensures that crimes, corruption, and attacks on democracy (one day perhaps successful) will simply go unpunished, encouraging further similar acts.]

Subpoenas can be optional

If you declined to show up in court after being issued a subpoena, you would be held in contempt. You’d be fined or jailed because you broke the law. It’s supposed to work a similar way when congressional committees issue subpoenas, instructing people to come testify or produce evidence. It is illegal to ignore a subpoena from Congress. Yet Trump has ordered allies like Carl Kline and Don McGahn to do just that, vowing to “fight all the subpoenas.” Leading Republican legislators like Lindsey Graham and Jim Jordan encouraged Donald Trump Jr. to ignore his subpoena. Barr waved away his subpoena to give Congress the full Mueller report. Various other officials have ignored their summonses as well.

When an individual does this, the congressional committee and then the whole house of Congress (either the Senate or the House of Representatives, not both) must vote on holding the individual in contempt.

This means that the seriousness of a subpoena depends upon the majority party in a house of Congress. If it’s not in the interest of, say, a Republican Senate to hold a Republican official in contempt after he refused to answer a subpoena in an investigation (maybe of a Republican president), then that’s that. There is no consequence for breaking the law and ignoring the order to appear or provide evidence. As long as you’re on the side of the chamber majority, you can throw the summons from the committee in the trash. (This isn’t the case with Trump, as the Democrats control the House and are thus able to hold someone in contempt, but the utter disregard for subpoenas Trump and others showed raised the question of what happens next, revealing this absurd system to this writer and others. If a chamber does hold someone in contempt, there are a few options going forward to jail or fine said person, one of which has a similar debilitating partisan wrench.) Perhaps we should construct a system, by giving committees more control over conviction and enforcement or by handing things over to the judicial system earlier, where breaking the law has consequences no matter who holds the majority, to prevent that behavior and allow investigations to actually operate.

[Update: Trump’s attempts to overturn the free and fair election of 2020 highlighted more disturbing weaknesses on the books. First, that state legislatures still theoretically have the power to decouple electors from the voters, appointing their own electors to vote for a presidential candidate and making voting by ordinary people meaningless. Second, Congress also has the power to dismiss electors, again laying waste to the integrity of the vote. See We Just Witnessed How Democracy Ends.]

For more from the author, subscribe and follow or read his books.

The Odd Language of the Left

Language fascinates me. This applies to the study of foreign languages and the pursuit of a proper, ideal form of one’s native language (such as the preservation of the Oxford comma to stave off chaos and confusion), but most importantly to how language is used for political and social issues — what words are chosen, what words are ethical (and in what contexts), how the definitions of words and concepts change over time, and so on.

These questions are important, because words matter. They can harm others, meaning they can be, at times, immoral to use. Individuals and groups using different definitions can impede meaningful conversation and knowledge or perspective sharing, to such a degree that, in cases where no definition is really any more moral than another, long arguments over them probably aren’t worth it.

Despite incessant right-wing whining about political correctness, the Left is doing an important service in changing our cultural language. It’s driven by thinking about and caring for other people, seeking equality and inclusion in all things, which could theoretically be embraced by anyone, even those on the other side of the political spectrum who don’t agree with this or that liberal policy, or don’t even understand or know people who are different. “Immigrants” is more humanizing than “aliens” or “illegals,” “Latinx” does away with the patriarchal, unnecessary male demarcation of groups containing both men and women (and invites in non-binary persons), and “the trans men” or simply “the men” is far more respectful than “the transgenders,” in the same way that there are much better ways of saying “the blacks.” There are of course more awful ways of talking about others, virulent hate speech and slurs; more people agree these things are unacceptable. As far as these less insidious word choices go, replacement is, in my view, right and understandable. Why not? Kind people seek ways to show more kindness, despite tradition.

What I find curious is when the Left begins questioning the “existence” of certain concepts. Finding better phrasing or definitions is often important and noble, but for years I’ve found the claims that such-and-such “does not exist” to be somewhat strange.

Take, for instance, “The friendzone does not exist.” This is the title of articles on Buzzfeed, Thought Catalog, and so forth, which the reader should check out to fully appreciate the perspective (of those rather unlike this writer, an admittedly privileged and largely unaffected person). It’s easy to see why one would want to wipe friendzone off the face of the Earth, as it’s often uttered by petulant straight men whining and enraged over being turned down. The rage, as noted in the articles, is the mark of entitled men feeling they are owed something (attention, a date, sex), wanting to make women feel guilty, believing they are victims, and other aspects of toxic masculinity. Such attitudes and anger lead to everything from the most sickening diatribes to the rape and murder of women. It’s a big part of why the feminist movement is important today.

Yet friendzone is a term used by others as well — it’s surely mostly used by men, but it’s impossible to know for certain if it’s disproportionately used by men of the toxic sort. If you’ll pardon anecdotal evidence, we’ve probably all heard it used by harmless people with some frequency. We’d need some serious research to find out. In any case, many human beings will at some point have someone say to them: “I don’t feel that way about you, let’s just be friends.” A silly term at some point arose (perhaps in Friends, “The One With the Blackout,” 1994) to describe the experience of rejection. What does it mean, then, to say “The friendzone does not exist”? It’s basically to say an experience doesn’t exist. That experience can be handled very differently, from “OK, I understand” to homicide, but it’s a happenstance that most people go through, so some kind of word for it was probably inevitable. If it wasn’t friendzone it likely would have been something else, and one suspects that if we eradicate this particular term a new one might eventually pop up in its place (justfriended?). It’s all a bit like saying “Cloud Nine does not exist” or “Cuffing season does not exist.” Well, those are expressions that describe real-world experiences. As long as a human experience persists, so will the concept and some kind of label or idiom, often more than one.

The relevant question is whether the use of the term friendzone encourages and perpetuates toxic masculinity. Is it contributing to male rage? Does one more term for rejection, alongside many others (shot down, for instance), have that power? Or is it a harmless expression, at times wielded by awful men like anyone else? That’s a difficult question to answer. (The only earnest way would be through scientific study, the basis of many left-wing views.) While I could be wrong, I lean towards the latter. I don’t suppose it’s any more harmful or unkind than shot down and so forth, and see such terms as inevitable, meaning what’s really important is changing the reactions to certain life events. My guess is the word is experiencing a bit of guilt by association — terrible men use it while expressing their childish sentiments about how they deserve this or that, about how women somehow hate nice guys, and so on, and thus the term takes on an ugly connotation to some people. Other terms are used by them less and don’t have that connotation. Readers will disagree on how strong the connotation is, and how harmful the term is, but the main point was simply to ponder how a word for a common experience should be said to “not exist” — it’s hard to discern whether such phrasing intrudes more on one’s knowledge of reality or English. Perhaps both equally. It’s rather different from saying, “This word is problematic, here’s a better one.” I could be misinterpreting all this, and every instance of denying existence is supposed to mean the word simply shouldn’t be used, leaving space for other, better ways to describe the concept, but that just goes back to my interest in how language is used in social issues — why say one but not the other, more clear, option? Anyway, read the articles and you’ll likely agree the very existence of concepts is being questioned. Finally, it’s interesting to consider why the Left ended up saying X doesn’t exist rather than, say, X is real and your toxic ass had better get used to it. What if, like words of the past, it had been adopted by those it was used against to strip it of its power and turn the tables? What causes that to happen to some words but not others? Is it because this one describes an event, not a person? Another intriguing question about language.

Similarly, does virginity exist? Not according to some (The Odyssey, Her Campus). Again, the sentiment is understandable. Women’s worth has long been closely tied to virginity (read your bible), and with that came widespread oppressive efforts to keep women’s bodies under tight control, still manifested today in incessant shaming for engaging in sex as freely as men do, murder, and more. Men have experienced something related, though far less oppressive and in an opposite sense: women are more valuable as virgins (or with fewer overall partners) and are judged for being sexually active, while men are shamed or ridiculed for being virgins or not engaging in sex. Further, the definition of virginity is open to debate (the definition of friendzone is as well, though the most common one was used above). Is a straight person a virgin if he or she has only had anal sex? Is a gay person, who has regular sex with a partner, technically a virgin until death? Because the word’s meaning is subjective, and because it was a basis of patriarchal oppression, so the argument goes, “virginity doesn’t exist.”

Virginity is a way of saying one hasn’t had some form of sexual experience. For some it’s vaginal penetration, for others it’s different — the particular act doesn’t really matter. It’s simply “I haven’t had sex yet,” whatever form sex may take in the individual mind. Everyone has their own view of it, but that doesn’t make it unreal — in the same way everyone has their own idea of what love is, and yet love exists. Having sex for the first time is quite an event in any human being’s life, and most or many will experience it. Even if our history had been free of misogyny and patriarchy, there likely would have eventually arisen some term for having never experienced sex (or having been turned down). Does the statement “Virginity doesn’t exist” make sense? As with friendzone, it’s a labeled common experience, or lack thereof. While it was and is wielded by misogynistic oppressors, it’s an occurrence, and a concept, that certainly “exists.”

Does having a term for all this harm society and hurt others, helping preserve the hysteria over who’s had intercourse, and the associated maltreatment? Again, it’s possible. But my point is that a term is unavoidable. The state of being is real, thus the concept is real, thus a word or phrase will inevitably be employed. Being “single” happens — does “singleness” not exist? Won’t there always be some way to describe that state? We could get rid of the words virgin and virginity, but there’s no getting rid of “I’ve had sex” versus “I haven’t.” Another phrase or term will suffice just as well to describe the concept. We can abolish friendzone, but “The person I like turned me down” isn’t going away. There may be better words and definitions for concepts, but there’s often no case against a concept’s reality, which is how all this is framed. What’s important is to try to change the perceptions and attitudes toward these concepts, not deny they exist. “Yes, you were put in the friendzone, but you’ve done that to a lot of women you weren’t interested in. That’s life, you’ll live, grow up.” “So what if she’s not a virgin? Should your dateability or worth go down if you weren’t one? Why hers and not yours?” And so on. Indeed, it seems more difficult to change attitudes towards life events when you start off by saying, in essence, and confusingly, that an expression isn’t real.

There are other examples of assertions I find awkward, but as this article is lengthy already I will just briefly mention a couple of them and hope the reader assumes I’ve given them more thought than a few sentences would suggest. “There’s no such thing as race, it’s a social construct,” while doing a service by reminding us we are all part of the same human family, has always seemed mostly pointless in a reality where individuals biologically have different shades of skin and hair texture, and many are brutally victimized because of it. “No human being is illegal” puts forward an ideal, which I support: amnesty, a faster legal entrance policy, and so on (I also support the dissolution of all borders worldwide and the establishment of one human nation, but that may not be implied here). It’s also a call to describe people in a more respectful way, i.e. “undocumented” rather than “illegal.” Still, it always seemed a little off. Some human beings are here illegally, and our task is to change the law to make that history. That the State designates some human beings as illegal is the whole problem, the entire point. True, it’s an ideal, an inspirational call. But I always thought replacing “is” with “should be” or something would be more to the point. But enough splitting hairs.

For more from the author, subscribe and follow or read his books.

Someone Worse Than Trump is Coming. Much of the Right Will Vote For Him Too.

Donald Trump is a nightmare — an immoral, vile, ignorant human being.

It is impossible to fully document his awfulness with brevity. Even when summarizing the worst things Trump has said and done it is difficult to know where to stop.

He calls women “dogs” — they are “animals,” “big, fat pigs,” “ugly,” and “disgusting” if they cross him or don’t please his gaze. You have to “treat ’em like shit,” they’re “horsefaces.” He makes inappropriate sexual jokes and remarks about his own daughter, about “grabbing” women “by the pussy” and kissing them without “waiting,” and admits to barging into pageant dressing rooms full of teenage girls with “no clothes” on. He mocks people with disabilities, Asians with imperfect English, including, probably, “the Japs,” and prisoners of war. Trump was sued for not renting to blacks, took it upon himself to buy full-page ads in New York papers calling for the restoration of the death penalty so we could kill black teens who allegedly raped a white woman (they were later declared innocent), and was a leader of the ludicrous “birther” movement that sought to prove Obama was an African national. He is reluctant to criticize Klansmen and neo-Nazis, and retweets racist misinformation without apology. He’s fine with protesters being “roughed up,” nostalgic about the good old days when they’d be “carried out on a stretcher,” even saying about one: “I’d like to punch him in the face.” He likewise makes light of physical attacks on journalists. He praises dictators. He threatens to violate the Constitution as a political strategy. He cheats on his wife with porn stars and pays them to keep quiet. The constant bragging about a high I.Q. (his “very, very large brain”) and his big fortune, among other things, are emblematic of his ugly narcissism. His daily rate of lies and inaccuracies is surely historic, with journeys into fantasyland over crowd sizes and wiretaps by former presidents.

And those are merely the uncontroversial facts. Trump faces nearly two dozen accusations of sexual assault. He is alleged to at times say extremely racist things, remarks about “lazy,” thieving “niggers.” His ex-wife claimed in 1990 that he sometimes read Hitler’s speeches, and Vanity Fair reported Trump confirmed this. The payment to Stormy Daniels was likely a violation of campaign finance laws — Trump’s former attorney implicated him in court. Trump is being sued for using the presidency to earn income, his nonprofit foundation being sued for illegal use of funds. Trump has almost certainly engaged in tax fraud, joined in his staff and own son’s collusion with Russia during the 2016 election, and obstructed justice.

All this of course speaks more to his abysmal personality and character than his political beliefs or actions as executive. That’s its own conversation, and it’s an important one because some conservatives accept Trump is not a good person but think his policies are just wonderful.

On the one hand, many of Trump’s policies are as awful as he is, and will not be judged kindly by history. Launching idiotic trade wars where he slaps a nation with tariffs and is immediately slapped with tariffs in return, hurting U.S. workers. Stoking nativist fear and stereotypes about Hispanic immigrants and Muslims, driving the enactment of (1) a ban on immigrants from several predominantly Muslim nations (doing away with vetting entirely, keeping good people, many fleeing oppression, war, and starvation, out with the bad) and limits to refugees and immigrants in general, and (2) the attempted destruction of DACA (breaking a promise the nation made to youths brought here illegally) and a “zero tolerance” policy on illegal entry that sharply increased family separations. Saying foreigners at the border who throw rocks at the military should be shot. Pushing to ensure employers are allowed to fire people for being gay or trans (and refuse them service as customers), eliminating anti-discrimination protections for trans students in public schools, and attempting to bar trans persons from military service. Voting against a U.N. resolution condemning the execution of gays.

On the other hand, we can be grateful that, to quote American intellectual Noam Chomsky, “Trump’s only ideology is ‘me.'” Trump is thoroughly defined by self-absorption. He flip-flops frequently — reportedly most influenced by the last person he speaks to — and even used to call himself more of a Democrat, advocating for a few liberal social policies while remaining conservative on business matters. He either changed his mind over time or, as I wrote elsewhere, believed running as a Republican offered the best chance at victory and thus adopted an extreme right-wing persona — an idea that doesn’t at all mean he isn’t also an awful person (rather, it’s evidence of the fact). Outside of policies that serve him personally it is difficult to know what Trump believes in — or if he even cares. He may genuinely lack empathy and have no interest in policies that don’t affect him. True, perhaps he isn’t merely playing to his base and actually has a vision for the country, but the “ideology of me” does appear preeminent. While it’s “deeply authoritarian and very dangerous,” as Chomsky says, it “isn’t Hitler or Mussolini.” And for this we can count ourselves somewhat fortunate. (Likewise, that Trump isn’t the brightest bulb in the box, speaking at a fourth-grade level, reportedly not reading that well and possessing a short attention span, lacking political knowledge, and being labeled a childish idiot by his allies.)

Next time we may not be so lucky. As hard or painful as it is to imagine, someone worse will likely come along soon enough.

One day Trump will leave the White House, and with a profound sense of relief we will hear someone declare: “Our long national nightmare is over.” That’s what Gerald Ford said to the country the day he took over from Nixon — a man corrupt, deceitful, paranoid, wrathful, and in many ways wicked (he is on audiotape saying “Great. Oh, that’s so wonderful. That’s good” when told his aides hired goons to break protesters’ legs). One wonders how many people in 1974 thought someone like Trump would be along in just eight presidencies. If there was a lack of imagination, we shouldn’t repeat it.

In significant ways, there are already foreshadowings of the next nightmare. Trump opened a door. His success was inspiration for America’s worst monsters. They have seen what’s possible — and will only be more encouraged if Trump is reelected or goes unpunished for wrongdoing and nastiness. I wrote before the election:

When neo-Nazi leaders start calling your chosen candidate “glorious leader,” an “ultimate savior” who will “Make America White Again” and represents “a real opportunity for people like white nationalists,” it may be time to rethink the Trump phenomenon. When former KKK leader David Duke says he supports Trump “100 percent” and that people who voted for Trump will “of course” also vote for Duke to help in “preserving this country and the heritage of this country,” it is probably time to be honest about the characteristics and fears of many of the people willing to vote for Trump. As Mother Jones documents, white nationalist author Kevin MacDonald called Trump’s movement a “revolution to restore White America,” the anti-Semitic Occidental Observer said Trump is “saying what White Americans have been actually thinking for a very long time,” and white nationalist writer Jared Taylor said Trump is “talking about policies that would slow the dispossession of whites. That is something that is very important to me and to all racially conscious white people.” Rachel Pendergraft, a KKK organizer, said, “The success of the Trump campaign just proves that our views resonate with millions. They may not be ready for the Ku Klux Klan yet, but as anti-white hatred escalates, they will.” She said Trump’s campaign has increased party membership. Other endorsements from the most influential white supremacists are not difficult to find.

It wasn’t all talk. Extreme racists got to work.

  • In 2016, David Duke of KKK fame, who was once elected to the Louisiana state house, came in seventh out of 24 candidates in the open primary for U.S. Senate. He earned 3% of the vote; about 59,000 ballots were cast for him.
  • In August 2018, Paul Nehlen, an openly “pro-White” candidate too racist for most social media platforms, garnered 11% of the vote in the GOP primary for Wisconsin’s 1st District (U.S. House of Representatives). He lost, but beat three other candidates.
  • John Fitzgerald, a vicious anti-Semite who ran for U.S. House of Representatives, beat a Democratic and independent candidate in California District 11’s open primary, coming in second with 23% of the vote. 36,000 people chose him. On November 6 he lost with 28% of the vote (43,000 votes).
  • A Nazi named Arthur Jones was the Republican nominee for U.S. House of Representatives from Illinois’ 3rd District (though he was the only person who ran as a Republican candidate, becoming the nominee by default). He just got 26% of the vote — 56,000 supporters.
  • Seth Grossman, who believes black people to be inferior, was the GOP nominee for U.S. House of Representatives from New Jersey’s 2nd District. He beat three other rivals, with 39% of the vote. He just garnered 46% of the vote in the general election. That’s 110,000 voters, just 15,000 short of the victor.
  • Russell Walker, who espouses the superiority of the white race, ran for District 48 in the North Carolina state house. He won the GOP primary in May, beating his rival with 65% of the vote. On November 6 he earned 37% of the vote in his race.
  • Steve West spreads conspiracy theories about the Jews, even saying “Hitler was right” about their influence in Germany. He won nearly 50% of the vote in the GOP primary for Missouri state house District 15, beating three others. On November 6 he also received 37% of the vote against his Democratic opponent.
  • Steve King has served in the U.S. House of Representatives since 2003. Hailing from Iowa’s 4th District, he said whites contributed more to civilization than people of color and constantly bemoans the threat that changing demographics represents to our culture. He also endorses white nationalists because they are “Pro Western Civilization” and spends time with groups founded and led by Nazis. He won 75% of the vote in the GOP primary — 28,000 votes. Then he got 50% in the general election (157,000 votes), keeping his seat.

There were others, of course, more subtle in their bigotry — more like Trump. Overall, there was a “record breaking” number of white supremacist candidates running for office this year. In most of the cases above, America couldn’t even keep such candidates in the single digits. Many beat more normal, tolerant candidates.

Those numbers may not seem all that impressive, not high enough to warrant any fears over a more horrific candidate winning the GOP presidential nomination. But it does not always take much. Turnout for the primaries is so low that only 9% of Americans chose Trump and Hillary as party nominees. More voted for others, but that’s all it took. Trump won the nomination with 13 million votes, with 16 million Republican voters choosing someone else (both record numbers). He thus won 45% of the primary votes, which is about what Mitt Romney (52%) and John McCain (47%) accomplished. In other words, it would take less than half of Republican voters in the primaries to usher a more extreme racist (or sexist or criminal or what have you) to the Republican nomination. After seeing what many conservative voters could ignore or zealously embrace about Trump, this does not seem so impossible these days. Many Trump supporters, in a tidal wave of surveys and studies, were shown to have extremely bigoted and absurd views. From there, it isn’t that hard to envision a situation similar to the one many conservatives faced in 2016, where they voted for an awful person they disliked to continue advancing conservative policies and principles. You have to stop abortion and the gays, you have to pack the Supreme Court, and so on. Some, to their immense credit, refused to do this — not voting, voting third party, or even voting for Clinton. But of course they were a minority. (And no, if you also believe absurd things, Democrats and liberals did not swing the election for Trump.)

The day of the election I felt more confident of Clinton’s victory than I had a couple weeks before. Previously, I had predicted that Trump was “probably” going to win. Perhaps it was a foolish optimism that washed over me on election day, when I expressed that Clinton would somehow eke out a narrow victory. I — and everyone else — should have known better. The tendency of the two parties to trade the White House every eight years, Clinton’s unpopularity on the Left, Trump as a reaction to the country’s first black president, the threat of the Electoral College handing the White House to another Republican with fewer votes…all sorts of factors should have made this an easy election to predict. Perhaps many of us simply did not want to face reality, did not want to believe we lived in a country where someone so awful could win, where so many voters are just like him or simply don’t care enough about his awfulness to refuse to vote for him. But after the shock and horror at Trump’s triumph abated, I could not shake the dread that this was merely the opening salvo in a battle against increasingly dangerous, extremist candidates.

Let’s hope, whether he — and it will certainly be a straight white male, given the extremist base — comes along in mere years or many decades, that we will not make the same mistake. Whether he will win is of course impossible to say. It will depend on how passionately we protest, how obsessively we organize, how voluminously we vote.

For more from the author, subscribe and follow or read his books.