The Future of American Politics

The following are five predictions about the future of U.S. politics. Some are short-term, others long-term; some are possible, others probable.

One-term presidents. In a time of extreme political polarization and razor-thin electoral victories, we may have to get used to the White House changing hands every four years rather than eight. In 2016, Trump won Michigan by 11,000 votes, Wisconsin by 23,000, Pennsylvania by 44,000, and Arizona by 91,000. Biden won those same states in 2020 by 154,000, 21,000, 81,000, and 10,000, respectively. Other states were close as well, such as Biden’s +12,000 in Georgia or Clinton’s +2,700 in New Hampshire. Competitive races are nothing new in election history, and 13 presidents (including Trump) have failed to reach a second term directly after their first, but Trump’s defeat was the first incumbent loss in nearly 30 years. The bitter divisions and conspiratorial hysteria of modern times may make swing state races closer than ever, resulting in fewer two-term presidents — at least consecutive ones — in the near term.

Mail privacy under rightwing attack. When abortion was illegal in the United States, there were many abortions. If Roe falls and states outlaw the procedure, or if the Supreme Court continues to allow restrictions that essentially do the same, we will again see many illegal terminations — only they will be far safer and easier this time, thanks to abortion pills sent through the mail. Even if your state bans the purchase, sale, or use of the pill, mail forwarding services or help from out-of-town friends (shipping the pills to a pro-choice state and then having them mailed to you) will easily get the pills to your home. Is mail privacy a future rightwing target? The U.S. has a history of banning the mailing of contraceptives, information on abortion, pornography, lottery tickets, and more, with bans enforced through surveillance so invasive the Supreme Court eventually had to declare that our mail cannot be opened without a warrant. It is possible the Right will attempt to categorize abortion pills as items illegal to ship and even push for the return of warrantless searches.

Further demagoguery, authoritarianism, and lunacy. Trump’s success is already inspiring others, some worse than he is, to run for elected office. His party looks the other way or enthusiastically embraces his deceitful attempts to overturn fair elections because it is most interested in power, reason and democracy be damned. Same for Trump’s demagoguery, his other lies and authoritarian tendencies, his extreme policies, his awful personal behavior — his base loves it all and it’s all terribly useful to the GOP. While Trump’s loss at the polls in 2020 may cause some to second-guess the wisdom of supporting such a lunatic (at least those not among the 40% of citizens who still believe the election was stolen), at present it seems the conservative base and the Republican Party are largely ready for Round 2. What the people want and the party tolerates they will get; what’s favored and encouraged will be perpetuated and created anew. It’s now difficult to imagine a normal human being, a classic Republican, a decent person like Mitt Romney, Liz Cheney, Jon Huntsman, John Kasich, or even Marco Rubio beating an extremist fool at the primary polls. The madness will likely continue for some time, both with Trump and others who come later, with only temporary respites of normalcy between monsters. Meanwhile, the weaknesses in the political and legal system that Trump exploited will no doubt remain unfixed for an exceptionally long time.

Republicans fight for their lives / A downward spiral against democracy. In a perverse sort of way, Republican cheating may be a good sign. Gerrymandering, voter suppression in all its forms, support for overturning a fair election, desperation to hold on to the Electoral College, and ignoring ballot initiatives passed by voters are the acts and sentiments of the fearful, those who no longer believe they can win honestly. And given the demographic changes already occurring in the U.S. that will transform the nation in the next 50-60 years (see next section), they’re increasingly correct. Republicans have an ever-growing incentive to cheat. Unfortunately, this means the Democrats do as well. Democrats may be better at putting democracy and fairness ahead of power interests, but this wall already has severe cracks, and one wonders how long it will hold. For example, the GOP refused to allow Obama to place a justice on the Supreme Court, and many Democrats dreamed of doing the same to Trump, plus expanding the Court during the Biden era. Democrats of course also gerrymander U.S. House and state legislature districts to their own advantage (the Princeton Gerrymandering Project is a good resource), even if Republican gerrymandering is worse — four times worse — therefore reaping bigger advantages. It’s sometimes challenging to parse out which Democratic moves are reactions to Republican tactics and which they would do anyway to protect their seats, but it’s obvious that any step away from impartiality and true democracy encourages the other party to do the same, creating a downward anti-democratic spiral, a race to the bottom.

(One argument might be addressed before moving on. Democrats generally make it easier for people to vote and support the elimination of the Electoral College, though again liberals are not angels and there are exceptions to both these statements. Aren’t those dirty tactics that serve their interests? As I wrote in The Enduring Stupidity of the Electoral College, which shows that this old anti-democratic system is unfair to each individual voter, “True, the popular vote may serve Democratic interests. Fairness serves Democratic interests. But, unlike unfairness, which Republicans seek to preserve, fairness is what’s right. Giving the candidate with the most votes the presidency is what’s right.” Same for not making it difficult for people who usually vote the “wrong” way to cast their ballots! You do what is right and fair, regardless of who it helps.)

Democratic dominance. In the long term, Democrats will become the dominant party through demographics alone. Voters under 30 favored the Democratic presidential candidate by large margins in 2004, 2008, 2012, 2016, and 2020 — voters under 40 also went blue by a comfortable margin. Given that individual political views mostly remain stable over time (the idea that most or even many young people will grow more conservative as they age is unsupported by research), in 50 or 60 years this will be a rather different country. Today we still have voters (and politicians) in their 80s and 90s who were segregationists during Jim Crow. In five or six decades, those over 40 today (who lean Republican) will be gone, leaving a bloc of older voters who have leaned blue their entire lives, plus a new generation of younger and middle-aged voters likely more liberal than any of us today. This is on top of an increasingly diverse country, with people of color likely the majority in the 2040s — with the white population already declining both in absolute numbers and as a share of the overall population, Republican strength will weaken further (the majority of whites have long voted Republican; the majority of people of color vote blue). A final point: the percentage of Americans who identify as liberal is steadily increasing relative to those who identify as conservative, and Democrats have already won the popular vote in seven of the last eight presidential elections. Republican life rafts such as the Electoral College (whose swing states will experience these same changes) and other anti-democratic practices will grow hopelessly ineffective under the crushing weight of demographic metamorphosis. Assuming our democracy survives, the GOP will be forced to moderate to have a chance at competing.

For more from the author, subscribe and follow or read his books.

Actually, “Seeing Is Believing”

Don’t try to find “seeing isn’t believing, believing is seeing” in the Bible, for though Christians at times use these precise words to encourage devotion, they come from an elf in the 1994 film The Santa Clause, an instructive fact. It is a biblical theme, however, with Christ telling the doubting Thomas, “Because you have seen me, you have believed; blessed are those who have not seen and yet have believed” (John 20:29), 2 Corinthians 5:7 proclaiming “We walk by faith, not by sight,” and more.

The theme falls under the first of two contradictory definitions of faith used by the religious. Faith 1 is essentially “I cannot prove this, I don’t have evidence for it, but I believe nonetheless.” Many believers profess this with pride — that’s true faith, pure faith, believing what cannot be verified. This is just the abandonment of critical thinking, turning off the lights. Other believers see the problem with it. A belief can’t be justified under Faith 1. Without proof, evidence, and reason, they realize, their faith is on the baseless, ridiculous level of every other wild human idea — believing in Zeus without verification, Allah without verification, Santa without verification. Faith 2 is the corrective: “I believe because of this evidence, let me show you.” The “evidence,” “proof,” and “logic” then offered are terrible and fall apart at once, but that has been discussed elsewhere. “Seeing isn’t believing, believing is seeing” aligns with the first definition, while Faith 2 aligns more closely with the title of this article (though room is always left for revelation as well).

I was once asked what would make me believe in God again, and I think about this from time to time. I attempt to stay both intellectually fair and deeply curious. As a six on the Dawkins scale, I have long maintained that deities remain in the realm of the possible, in the same way our being in a computer simulation is possible; yet given the lack of evidence, there is little reason to take either seriously at this time. For me, the last, singular reason to wonder whether God or gods are real is the fact that existence exists — but supposing higher powers were responsible for existence brings obvious problems of its own, so large they preclude religious belief. Grounds for believing in God again would have to come from elsewhere.

“Believing is seeing” won’t do. It’s just a hearty cry for confirmation bias and self-delusion (plus, as a former Christian, I can say it has already been tried). Feeling God working in your life, hearing his whispers, the tugs on your heart, dreams and visions, your answered prayers, miracles…these things, experienced by followers of all religions and insane cults, even by myself long ago, could easily be imagined fictions, no matter how much you “know” they’re not, no matter how amazing the coincidences, dramatic the life changes, vivid the dreams, unexplainable the events (of current experience anyway; see below).

In contrast, “seeing is believing” is rational, but one must be careful here, too. It’s a trillion times more sensible to withhold belief in extraordinary claims until you see extraordinary evidence than to believe wild things before verifying, perhaps hoping some proof or revelation comes along later. The latter is just gullibility, taking off the thinking cap, believing in Allah, Jesus, or Santa because someone told you to. However, for me, “seeing is believing” can’t just mean believing the dreadful “evidence” of apologetics referenced above, nor could it mean the god of a religion foreign to me appearing in a vision, confounding or suggestive coincidences and “miracles,” or other personal experiences that do not in any way require supernatural explanations. That’s not adequate seeing.

It would have to be a personal experience of greater magnitude. Experiencing the events of Revelation might do it — as interpreted by Tim LaHaye and Jerry B. Jenkins in their popular (and enjoyable, peaking with Assassins) book series of the late 90s and early 2000s, billions of Christians vanish, the seas turn to blood, people survive a nuclear bombing unscathed, Jesus and an army of angels arrive on the clouds, and so forth. These kinds of personal experiences would seem less likely to be delusions (though they still could be, if one is living in a simulation, insane, etc.), and would be a better basis for faith than things that have obvious or possible natural explanations, especially if they were accurately prophesied. In other words, at some stage personal experience does become a rational basis for belief; human beings simply tend to adopt a threshold that is outrageously low, far below anything that would necessitate supernatural involvement. (It’s remarkable where life takes you: from “I’m glad I won’t have to go through the tribulation, as a believer” to “The tribulation would be reasonable grounds to become a believer again.”) Of course, I suspect this is all mythological and have no worry it will occur. How concerned is the Christian over Kalki punishing evildoers before the world expires and restarts (Hinduism) or the Spider Woman covering the land with her webs before the end (Hopi)? I will convert to one of these faiths if their apocalyptic prophecies come to pass.

The reaction of the pious is to say, “But others saw huge signs like that, Jesus walked on water and rose from the dead and it was all prophesied and —” No. That’s the challenge of religion. Stories of what other people saw can easily be made up, often to match prophecy. Even a loved one relating a tale could have been tricked, hallucinating, delusional, lying. You can only trust the experiences you have, and even those you can’t fully trust! This is because you could be suffering from something similar — human senses and perceptions are known to miserably fail and mislead. The only (possible) solution is to go big. Really big. Years of predicted, apocalyptic disasters that you personally survive. You still might not be seeing clearly. But belief in a faith might finally be justified on rational, evidentiary grounds, in alignment with your perceptions. “Seeing is believing,” with proper parameters.

Anything short of this is merely “believing is seeing” — elf babble.

For more from the author, subscribe and follow or read his books.

History, Theory, and Ethics

The writing of history and the theories that guide it, argues historian Lynn Hunt in Writing History in the Global Era, urgently need “reinvigoration.”[1] The old meta-narratives used to explain historical change looked progressively weaker and fell under heavier criticism as the twentieth century reached its conclusion and gave way to the twenty-first.[2] Globalization, Hunt writes, can serve as a new paradigm. Her work offers a valuable overview of historical theories and develops an important new one, but this paper will argue Hunt implicitly undervalues older paradigms and fails to offer a comprehensive purpose for history under her theory. This essay then proposes some guardrails for history’s continuing development, not offering a new paradigm but rather a framing that gives older theories their due and a purpose that can power many different theories going forward.

We begin by reviewing Hunt’s main ideas. Hunt argues for “bottom-up” globalization as a meta-narrative for historical study, and contributes to this paradigm by offering a rationale for causality and change that places the concepts of “self” and “society” at its center. One of the most important points that Writing History in the Global Era makes is that globalization has varying meanings, with top-down and bottom-up definitions. Top-down globalization is “a process that transforms every part of the globe, creating a world system,” whereas the bottom-up view is of myriad processes wherein “diverse places become connected and interdependent.”[3] In other words, while globalization is often considered synonymous with Europe’s encroachment on the rest of the world, from a broader and, as Hunt sees it, better perspective, globalization would in fact be exemplified by increased interactions and interdependence between India and China, for example.[4] The exploration and subjugation of the Americas was globalization, but so was the spread of Islam from the Middle East across North Africa to Spain. Nor is it simply the spread of more advanced technology or capitalism or what Eurocentrism considers the most enlightened culture and value system: it is a reciprocal, “two-way relationship” that can be found anywhere as human populations move, meet, and start to rely on each other, through trade for example.[5] Hunt seeks to overcome two problems here: first, the Eurocentric top-down approach and its “defects”; second, the lack of a “coherent alternative,” which her work seeks to provide.[6]

Hunt rightly and persuasively makes the case for a bottom-up perspective on globalization as opposed to top-down, then turns to the question of why this paradigm has explanatory power. What is it about bottom-up globalization, the increasing interactions and interdependence of human beings, that brings about historical change? Here Hunt situates her historical lens alongside, and as successor to, previous ones explored early in the work. Marxism, modernization, and the Annales School offered theories of causality: cultural and political change was brought about by new modes of economic production, by the growth of technology and the State, or by geography and climate, respectively.[7] The paradigm of identity politics, Hunt notes, at times lacked such a clear “overarching narrative,” but implied that inclusion of The Other, minority or oppressed groups, in the national narrative was key to achieving genuine democracy (which falls more under purpose, to be explored later).[8] Cultural theories rejected the idea, inherent in older paradigms, that culture was produced by economic or social relations; culture was a force unto itself, composed of language, semiotics, and discourse, which determined what an individual thought to be true and how one behaved.[9] “Culture shaped class and politics rather than the other way around” — meaning culture brought about historical change (though many cultural theorists preferred not to focus on causation, perhaps similar to those engaged in identity politics).[10] Bottom-up globalization, Hunt posits, is useful as a modern explanatory schema for the historical field. It brings about changes in the self (in fact in the brain) and of society, which spurs cultural and political transformations.[11] There is explanatory power in increased connections between societies. For instance, she suggests that drugs and stimulants like coffee, brought into Europe through globalization, produced selves that sought pleasure and thrill (i.e., altered the neurochemistry of the brain) and changed society by creating central gathering places, coffeehouses, where political issues could be intensely discussed. These developments may have pushed places like France toward democratic and revolutionary action.[12] For Hunt, it is not enough to say culture alone directs the thinkable and human action, nor is the mind simply a social construction — the biology of the brain and how it reacts and operates must be taken into account.[13] The field must move on from cultural theories.

Globalization, a useful lens through which to view history, thus joins a long list, only partially outlined above. Beyond economics, advancing technology and government bureaucracy, geography and environment, subjugated groups, and culture, there is political, elite, or even “Great Men” history; social history, the story of ordinary people; the history of ideas, of things, of diseases and non-human species; microhistory and biography, a close look at events and individuals; and more.[14] Various ways of looking at history, some of which are true theories that include causes of change, together construct a more complete view of the past. They are all valuable. As historian Sarah Maza writes, “History writing does not get better and better but shifts and changes in response to the needs and curiosities of the present day. Innovations and new perspectives keep the study of the past fresh and interesting, but that does not mean we should jettison certain areas or approaches as old-fashioned or irrelevant.”[15] This is a crucial reminder. New paradigms can reinvigorate, but historians must be cautious of seeing them as signals that preceding paradigms are dead and buried.

Hunt’s work flirts with this mistake, though perhaps unintentionally. Obviously, some paradigms grow less popular, while others, particularly new ones, see surges in adherents. Writing History in the Global Era outlines the “rise and fall” of theories over time, the changing popularities and new ways of thinking that brought them about.[16] One implication in Hunt’s language, though such phrasing is employed from the viewpoint of historical hindsight or of those critical of older theories, is that certain paradigms are indeed dead or of little use — “validity” and “credibility” are “questioned” or “lost,” “limitations” and “disappointments” discovered, theories “undermined” and “weakened” by “gravediggers” before they “fall,” and so forth.[17] Again, these are not necessarily Hunt’s views, rather descriptors of changing trends and critiques, but Hunt’s work offers no nod to how older paradigms remain useful today, itself implying that those ways of writing history are now irrelevant. With prior theories worth less, a new one, globalization, is needed. Hunt’s work could have benefited from more resistance to this implication, with a serious look at how geography and climate, or changing modes of economic production, remain valuable lenses historians use to chart change and find truth — an openness to the full spectrum of approaches, for they all work cooperatively to reveal the past, despite their unique limitations. Above, Maza mentioned “certain areas” of history in addition to “approaches,” and continued: “As Lynn Hunt has pointed out, no field of history [such as ancient Rome] should be cast aside just because it is no longer ‘hot’…”[18] Hunt should have acknowledged and demonstrated that precisely the same is true of approaches to history.

Another area that deserves more attention is purpose. In the same way that not all historical approaches emphasize causality and change, not all emphasize purpose. Identity politics had a clear use: the inclusion of subjugated groups in history helped move nations toward political equality.[19] With other approaches, however, “What is it good for?” is more difficult to answer. This is to ask what utility a theory had for contemporary individuals and societies (and has for modern ones), beyond a more complete understanding of yesteryear or fostering new research. It may be more challenging to see a clear purpose in studying how the elements of the Annales School’s longue durée, such as geography and climate, change human development. How was such a lens utilized as a tool, if in fact it was, in the heyday of the Annales School? How could it be utilized today? (Perhaps it could be useful in mobilizing action against climate change.) The purpose of history — of each historical paradigm — is not always obvious.

Indeed, Hunt’s paradigm “offers a new purpose for history: understanding our place in an increasingly interconnected world,” a rather vague suggestion that sees little elaboration.[20] What does it mean to understand our place? Is this a recycling of “one cannot understand the present without understanding the past,” a mere truism? Or is it to say that a bottom-up globalization paradigm can be utilized to demonstrate the connection between all human beings, breaking down nationalism or even national borders? After all, the theory moves away from Eurocentrism and the focus on single nations. Perhaps it is something else; one cannot know for certain. Of course, Hunt may have wanted to leave this question to others, developing the tool and letting others determine how to wield it. Or her hesitation to more deeply and explicitly explore purpose, to adequately show how her theory is useful to the present, may reflect a simple desire to avoid the controversy of politics. The latter would be disappointing to those who believe history is inherently political or anchored to ethics, and either reason is out of step with Hunt’s introduction. History, Hunt writes on her opening page, is “in crisis” due to the “nagging question that has proved so hard to answer…‘What is it good for?’”[21] In the nineteenth and twentieth centuries, she writes, the answer shifted from developing strong male leaders to building national identity and patriotism to contributing to the social movements of subjugated groups by unburying histories of oppression.[22] All of these purposes are political. Hunt deserves credit for constructing a new paradigm, with factors of causality and much fodder for future research, but to open the work by declaring a crisis of purposelessness, framing purposes as political, and then not offering a fully developed purpose through a political lens (or through another lens, explaining why purpose need not be political) is an oversight.

Based on these criticisms, we have a clear direction for the field of history. First, historians should reject any implication of a linear progression of historical meta-narratives, which this paper argues Hunt failed to do. “Old-fashioned” paradigms in fact have great value today, which must be noted and explored. A future work on the state of history might entirely reframe, or at least dramatically add to, the discussion of theory. Hunt tracked the historical development of theories and their critics, with all the ups and downs of popularity. This is important epistemologically, but it emphasizes the failures of theories rather than their contributions, and presents them as stepping stones to be left behind on the journey to find something better. Marxism had a “blindness to culture” and had to be left by the wayside; its replacement had this or that limitation and was itself replaced, and so on.[23] Hunt writes globalization will not “hold forever” either.[24] A future work might instead, even if it included a brief, similar tracking, focus on how each paradigm added to our understanding of history, how it continued to do so, and how it does so today. As an example of the second task, Anthony Reid’s 1988 Southeast Asia in the Age of Commerce, 1450-1680 was written very much in the tradition of the Annales School, with a focus on geography, resources, climate, and demography, but it would be lost in a structure like Hunt’s, crowded out by the popularity of cultural studies in the last decades of the twentieth century.[25] Simply put, the historian must break away from the idea that paradigms are replaced. They are replaced in popularity, but not in importance to the mission of more fully understanding the past. As Hunt writes, “Paradigms are problematic because by their nature they focus on only part of the picture,” which highlights the necessity of the entire paradigmatic spectrum, as does her own application of globalization theory, which suggests that coffee from abroad spurred revolutionary movements in eighteenth-century Europe while sidelining countless other factors.[26] Every paradigm helps us see more of the picture. It would be a shame if globalization were downplayed as implicitly irrelevant only a couple decades from now, if still a useful analytical lens. Paradigms are not stepping stones; they are columns holding up the house of history — more can be added as we go.

This future theoretical work would also explore purpose, hypothesizing that history cannot be separated from ethics, and therefore from politics. Sarah Maza wrote in the final pages of Thinking About History:

Why study history? The simplest response is that history answers questions that other disciplines cannot. Why, for instance, are African-Americans in the United States today so shockingly disadvantaged in every possible respect, from income to education, health, life expectancy, and rates of incarceration, when the last vestiges of formal discrimination were done away with half a century ago? Unless one subscribes to racist beliefs, the only way to answer that question is historically, via the long and painful narrative that goes from transportation and slavery to today via Reconstruction, Jim Crow laws, and an accumulation, over decades, of inequities in urban policies, electoral access, and the judicial system.[27]

This is correct, and it goes far beyond the purpose of answering questions. History is framed as the counter, even the antidote, to racist beliefs. If one is not looking to history for such answers, there is nowhere left to go but biology, racial inferiority, beliefs deemed awful. History therefore informs ethical thinking; its utility is to help us become more ethical creatures, as (subjectively) defined by our society — and the self. This purpose is usually implied but rarely explicitly stated, and a discussion on the future of history should explore it. Now, one could argue that Maza’s dichotomy is simply steering us toward truth, away from incorrect ideas rather than unethical ones. But that does not work in all contexts. When we read Michel Foucault’s Discipline and Punish, he is not demonstrating that modes of discipline are incorrect — and one is hardly confused as to whether he sees them as bad things, these “formulas of domination” and “constant coercion.”[28] J.R. McNeill, at the end of Mosquito Empires: Ecology and War in the Greater Caribbean, 1620-1914, writes that yellow fever’s “career as a governing factor in human history, mercifully, has come to a close” while warning of a lapse in vaccination and mosquito control programs that could aid viruses that “still lurk in the biosphere.”[29] The English working class, wrote E.P. Thompson, faced “harsher and less personal” workplaces, “exploitation,” “unfreedom.”[30] The implications are clear: societies without such disciplines, without exploitation, with careful mosquito control would be better societies. For human beings, unearthing and reading history cannot help but create value judgments, and it is a small step from the determination of what is right to the decision to pursue it: political action. It would be difficult, after all, to justify ignoring that which was deemed ethically right.

Indeed, not only do historians implicitly suggest better paths and condemn immoral ones, but the notion that history helps human beings make more ethical choices is already fundamental to how many lay people read history — what is the cliché of being doomed to repeat the unlearned past about, if not avoiding tragedies and terrors deemed wrong by present individuals and society collectively? As tired and disputed as the expression is, there is truth to it. Studying how would-be authoritarians often use minority groups as scapegoats for serious economic and social problems to reach elected office in democratic systems creates pathways for modern resistance, making the unthinkable thinkable, changing characterizations of what is right or wrong, changing behavior. Globalization may alter the self and society, but the field of history itself, to a degree, does the same. This could be grounds for a new, rather self-congratulatory paradigm, but the purpose, informing ethical and thus political decision-making, can guide many different theories, from Marxism to globalization. As noted, prior purposes of history were political: forming strong leaders, creating a national narrative, challenging a national narrative. A new political purpose would be standard practice. One might argue moving away from political purposes is a positive step, but it must be noted that the field seems to move away from purpose altogether when it does so. Is purpose inherently political? This future text would make the case that it is. A purpose cannot be posited without a self-evident perceived good. Strong leaders are good, for instance — and therefore should be part of the social and political landscape.

In conclusion, Hunt’s implicit dismissal of older theories and her incomplete purpose for history deserve correction, and doing so pushes the field forward in significant ways. For example, using the full spectrum of paradigms helps us work on (never solve) history’s causes-of-causes ad infinitum problem. Changing modes of production may have caused change x, but what caused the changing modes of production? What causes globalization in the first place? Paradigms can interrelate, helping answer the thorny questions of other paradigms (perhaps modernization or globalization theory could help explain changing modes of production, before requiring their own explanations). How giving history a full purpose advances the field is obvious: it sparks new interest, new ways of thinking, new conversations, new utilizations, new theories, while, like the sciences, offering the potential — but not the guarantee — of improving the human condition.

For more from the author, subscribe and follow or read his books.


[1] Lynn Hunt, Writing History in the Global Era (New York: W.W. Norton & Company, 2014), 1.

[2] Ibid., 26, 35-43.

[3] Ibid., 59. See also 60-71.

[4] Ibid., 70.

[5] Ibid.

[6] Ibid., 77.

[7] Ibid., 14-17.

[8] Ibid., 18.

[9] Ibid., 18-27.

[10] Ibid., 27, 77.

[11] Ibid., chapters 3 and 4.

[12] Ibid., 135-141.

[13] Ibid., 101-118.

[14] Sarah Maza, Thinking About History (Chicago: University of Chicago Press, 2017).

[15] Maza, Thinking, 236.

[16] Hunt, Writing History, chapter 1.

[17] Ibid., 8-9, 18, 26-27, chapter 1.

[18] Maza, Thinking, 236.

[19] Hunt, Writing History, 18.

[20] Ibid., 10.

[21] Ibid., 1.

[22] Ibid., 1-7.

[23] Ibid., 8.

[24] Ibid., 40.

[25] Anthony Reid, Southeast Asia in the Age of Commerce, 1450-1680, vol. 1, The Lands Below the Winds (New Haven: Yale University Press, 1988).

[26] Hunt, Writing History, 121, 135-140.

[27] Maza, Thinking, 237.

[28] Michel Foucault, Discipline and Punish (New York: Vintage Books, 1995), 137.

[29] J.R. McNeill, Mosquito Empires: Ecology and War in the Greater Caribbean, 1620-1914 (New York: Cambridge University Press, 2010), 314.

[30] E.P. Thompson, The Essential E.P. Thompson (New York: The New Press, 2001), 17. 

Is It Possible For Missouri State to Grow Larger Than Mizzou?

Students and alumni of Missouri State (and perhaps some at the University of Missouri) at times wonder if MSU will ever become the largest university in the state. While past trends are never a perfect predictor of the future, looking at the enrollment patterns of each institution can help offer an answer. Here are the total Fall enrollment figures for each school since 2005.

Mizzou
Via its Student Body Profile reports and enrollment summary (Columbia campus):

2005 – 27,985
2006 – 28,253
2007 – 28,477
2008 – 30,200
2009 – 31,314
2010 – 32,415
2011 – 33,805
2012 – 34,748
2013 – 34,658
2014 – 35,441
2015 – 35,448
2016 – 33,266
2017 – 30,870
2018 – 29,866
2019 – 30,046
2020 – 31,103
2021 – 31,412

Missouri State
Via its enrollment history report (Springfield campus):

2005 – 19,165
2006 – 19,464
2007 – 19,705
2008 – 19,925
2009 – 20,842
2010 – 20,949
2011 – 20,802
2012 – 21,059
2013 – 21,798
2014 – 22,385
2015 – 22,834
2016 – 24,116
2017 – 24,350
2018 – 24,390
2019 – 24,126
2020 – 24,163
2021 – 23,618

In the past 16 years, MSU gained on average 278.3 new students each Fall. Mizzou gained 214.2 new students per year, an average tanked by the September 2015 racism controversy. Before the controversy (2005-2015 data), Mizzou gained 746.3 new students per year (MSU, over the same ten years, +366.9). From a low point in 2018, Mizzou has since, over a three-year period, gained on average 515.3 new students (over the same time, MSU saw -257.3 students — one school’s gain is often the other’s loss). This is too short a timeframe to draw unquestionable conclusions, but with Mizzou back on its feet it seems likely to continue to acquire more students on average each year, making MSU’s ascension to the top unlikely.
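
These averages are simple to reproduce. Below is a minimal Python sketch (the variable and function names are mine, for illustration only) that computes the average yearly change over any stretch of the enrollment histories listed above:

```python
# Fall headcounts, 2005-2021, from the reports cited above.
mizzou = [27985, 28253, 28477, 30200, 31314, 32415, 33805, 34748, 34658,
          35441, 35448, 33266, 30870, 29866, 30046, 31103, 31412]
msu = [19165, 19464, 19705, 19925, 20842, 20949, 20802, 21059, 21798,
       22385, 22834, 24116, 24350, 24390, 24126, 24163, 23618]

def avg_yearly_change(series, start=2005, first=2005, last=2021):
    """Average enrollment change per year between two Fall semesters."""
    i, j = first - start, last - start
    return (series[j] - series[i]) / (j - i)

print(avg_yearly_change(mizzou))              # ~214.2 per year, 2005-2021
print(avg_yearly_change(msu))                 # ~278.3 per year, 2005-2021
print(avg_yearly_change(mizzou, last=2015))   # ~746.3 pre-controversy
print(avg_yearly_change(mizzou, first=2018))  # ~515.3 since the 2018 low
```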

Predicting future enrollment patterns is rather difficult, of course. Over the past decade, fewer Americans have attended university, including fewer Missourians — and that was before COVID. Like a pandemic or a controversy, some disruptors cannot be predicted, nor can boosts to student populations. But most challenges will be faced by both schools: fewer young people, better economic times (which draw folks into the working world), pandemics, etc. The rising cost of college may give a university that is slightly more affordable an edge, as has long been Missouri State’s strategy. An increased profile through growing name recognition (it’s only been 16 years since Missouri State’s name change), success in sports, clever marketing (alumnus John Goodman is now involved with MSU), ending Mizzou’s near-monopoly on doctoral degrees, and so on could make a difference, but there remains a huge advantage to simply being an older school, with a head start in enrollment and brand recognition.

For more from the author, subscribe and follow or read his books.

COVID Showed Americans Don’t Leech Off Unemployment Checks

In most states, during normal times, you can use unemployment insurance (UI) for at most 26 weeks, half the year, and will receive 30-50% of the wages from your previous job, up to a certain income. This means $200-400 a week on average. You must meet a list of requirements to qualify, for instance having lost your job to cutbacks rather than through any fault of your own. Only 35-40% of unemployed persons receive UI.
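
To make the math concrete, here is a minimal sketch of how such a benefit is computed. The 40% replacement rate and $450 weekly cap below are assumptions for illustration, not any particular state’s schedule:

```python
def weekly_benefit(prior_weekly_wage, replacement_rate=0.40, weekly_cap=450):
    """A UI check replaces a fraction of prior wages, up to a state cap."""
    return min(prior_weekly_wage * replacement_rate, weekly_cap)

# A worker who earned $800/week would draw $320 -- within the
# $200-400 average range cited above.
print(weekly_benefit(800))   # 320.0
print(weekly_benefit(1500))  # 450 -- the cap binds for higher earners
```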

This means that at any given time, about 2 million Americans are receiving UI; in April/May 2020, with COVID-19 and State measures to prevent its spread causing mass layoffs, that number skyrocketed to 22 million. Put another way, just 1-3% of the workforce is usually using UI, and during the pandemic spike it was about 16%. Just before that rise, it was at 1.5% — and it returned to that rate in November 2021, just a year and a half later. Indeed, the number of recipients fell as fast as it shot up, from 16% to under 8% in just four months (by September 2020), down to 4% in six months (November 2020). As much pearl-clutching as there was among conservatives (at least those who did not use UI) over increased dependency, especially with the temporary $600 federal boost to UI payments, tens of millions of Americans did not leech off the system. They got off early, even though emergency measures allowed them to stay on through the entire year of 2020 and into the first three months of 2021! (The trend was straight down, by the way, even before the $600 boost ended.)
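
These percentages are back-of-the-envelope arithmetic, sketched below. The 161 million labor force figure is the one cited at the end of this piece; the in-text 16% peak likely reflects a narrower base (such as covered employment), so treat the exact denominator as an assumption:

```python
LABOR_FORCE = 161_000_000  # approximate U.S. labor force, cited below

def share_on_ui(recipients, base=LABOR_FORCE):
    """UI recipients as a percentage of the workforce."""
    return 100 * recipients / base

print(f"{share_on_ui(2_000_000):.1f}%")   # ~1.2% in normal times
print(f"{share_on_ui(22_000_000):.1f}%")  # ~13.7% at the April/May 2020 peak
```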

This in fact reflects what we’ve always known about unemployment insurance. It’s used as intended, as a temporary aid to those in financial trouble (though many low-wage workers don’t have access to it, which must be corrected). Look at the past 10 years of UI use. The average stay in the program (“duration”) each year was 17 or 18 weeks in times of economic recovery, 14 or 15 weeks in better economic times (sometimes even fewer). Four months or so, then a recipient stops filing for benefits, having found a job or ameliorated his or her crisis in some fashion. Some “enjoy” the 30-50% of previous wages for the whole stretch, but the average recipient doesn’t even use UI for 20 weeks, let alone the full 26 allowed. This makes sense, given how much of a pay cut UI is. Again, many Americans stop early, and the rest are cut off — so why all the screaming about leeching? Only during the COVID crisis did the average duration climb higher, to 26-27 weeks, as the federal government offered months of additional aid, as mentioned — again, many did not receive benefits for as long as they could have.

Those who receive benefits will not necessarily do the same next year. In times of moderate unemployment, for example, about 30% of displaced workers and 50% of workers on temporary layoff who receive benefits in Year 1 will reapply for benefits in Year 2. The rest do not refile.

However, we must be nuanced thinkers. Multiple things can be true at the same time. UI can also extend unemployment periods, which makes a great deal of sense even though UI benefits represent a drastic pay cut. UI gives workers some flexibility to be more selective in the job hunt. An accountant who has lost her position may, with some money coming in to keep a savings account afloat, be able to undertake a longer search for another accounting job, rather than being forced to take the first thing she can find, such as a waitressing job. This extra time is important, because finding a similar-wage job means you can keep your house or current apartment, won’t fall further into poverty, etc. There are many factors behind the current shortage of workers, and UI seems to be having a small effect (indeed, studies range between no effect and moderate effects). And of course, in a big, complex world there will be some souls who avoid work as long as they can, and others who commit fraud (during COVID, vast sums were siphoned from UI programs by individuals and organized crime rings alike, in the U.S. and around the globe; any human being with internet access can attempt a scam). But that’s not most Americans. While UI allows workers to be more selective, prolonging an unemployed term a bit, they nevertheless generally stop filing for benefits early and avoid going back.

To summarize, for the conservatives in the back: The U.S. labor force is 161 million people. A tiny fraction is being aided by UI at any given moment. Those receiving aid generally don’t stay on the entire time they could. Those who do use 26 weeks of benefits will be denied further aid for the year (though extended benefits are sometimes possible in states with rising unemployment). Most recipients don’t refile the next year. True, lengths of unemployment may be increased somewhat, and there will always be some Americans who take advantage of systems like this, but most people would prefer not to, instead wanting what all deserve — a good job with a living wage.

For more from the author, subscribe and follow or read his books.