Hegemony and History

The Italian Marxist Antonio Gramsci, writing in the early 1930s while imprisoned by Mussolini’s government, theorized that ruling classes entrenched themselves through a process he called cultural hegemony: the successful propagation of values and norms which, when accepted by the lower classes, produced passivity and thus the continuation of domination and exploitation from above. An ideology became hegemonic when it found support from historical blocs, alliances of social groups (classes, religions, families, and so on) — that is, broad, diverse acceptance of ideas that served the interests of the bourgeoisie in a capitalist society and freed the ruling class from some of the burden of using outright force. This paper argues that Gramsci’s theory is useful for historians because its conception of “divided consciousness” offers a framework for understanding why individuals failed to act in ways that aligned with their own material interests, or acted for the benefit of oppressive forces. Strictly speaking, this explanatory power belongs to cultural hegemony as a whole, but it is divided consciousness that permits hegemony to function. Rather than a terminus a quo, however, divided consciousness can be seen as created, at least partially, by hegemony and as responsible for ultimate hegemonic success — a mutually reinforcing system. The individual mind, and what occurs within it, is the necessary starting point for understanding how a domineering culture spreads and why members of social groups act in ways that puzzle later historians.

Divided (or contradictory) consciousness, according to Gramsci, was a phenomenon in which individuals believed both hegemonic ideology and contrary ideas based on their own lived experiences. Cultural hegemony pushed such contrary ideas out of the bounds of rational discussion concerning what a decent society should look like. Historian T.J. Jackson Lears, summarizing sociologist Michael Mann, wrote that hegemony ensured “values rooted in the workers’ everyday experience lacked legitimacy… [W]orking class people tend to embrace dominant values as abstract propositions but often grow skeptical as the values are applied to their everyday lives. They endorse the idea that everyone has an equal chance of success in America but deny it when asked to compare themselves with the lawyer or businessman down the street.”[1] In other words, what individuals knew to be true from simply functioning in society was not readily applied to the nature of the overall society; some barrier, created at least in part by the process of hegemony, existed. Lears further noted evidence from sociologists Richard Sennett and Jonathan Cobb, whose subaltern interviewees “could not escape the effect of dominant values” despite also holding contradictory ones, as “they deemed their class inferiority a sign of personal failure, even as many realized they had been constrained by class origins that they could not control.”[2] A garbage collector knew that his never having been taught to read properly was not his fault, yet he blamed himself for his position in society.[3] The result of this contradiction, Gramsci observed, was often passivity, consent to oppressive systems.[4] If one could not carry personal truths over into an assessment of how social systems operated, political action was less likely.

To understand how divided consciousness, for Gramsci, was achieved, it is necessary to consider the breadth of the instruments that propagated dominant culture. Historian Robert Gray, studying how the bourgeoisie achieved hegemony in Victorian Britain, wrote that hegemonic culture could spread not only through the state — hegemonic groups were not necessarily governing groups, though there was often overlap[5] — but through any human institutions and interactions: “the political and ideological are present in all social relations.”[6] Everything in Karl Marx’s “superstructure” could imbue individuals and historical blocs with domineering ideas: art, media, politics, religion, education, and so on. Gray wrote that British workers in the era of industrialization of course had to be pushed into “habituation” to the new and brutal wage-labor system by the workplace itself, but also through “poor law reform, the beginnings of elementary education, religious evangelism, propaganda against dangerous ‘economic heresies,’ the fostering of more acceptable expressions of working-class self help (friendly societies, co-ops, etc.), and of safe forms of ‘rational recreation.’”[7] The bourgeoisie, then, used many social avenues to manufacture consent, including legal reform that could placate workers. Some activities were deemed acceptable under the new system (joining friendly societies or trade unions) to keep more radical activities out of bounds.[8] It was also valuable to create an abstract enemy, a “social danger” for the masses to fear:[9] without an embrace of the dominant values and norms of industrial capitalism, there would be economic disaster, scarcity, loosening morals, the ruination of the family, and more.[10] Consciousness was therefore under assault by the dominant culture from all directions — heavy competition for values derived from lived experience, despite the latter’s tangibility.
At the macro level, Gramsci’s theory of cultural hegemony, to quote historian David Arnold, “held that popular ideas had as much historical weight or energy as purely material forces,” or even “greater prominence.”[11] At the micro level, one can infer, things work the same way in the individual mind, with popular ideas as powerful as personal experience; hence the presence of divided consciousness.

The concept of contradictory consciousness helps historians answer compelling questions and solve problems. Arnold notes Gramsci’s questions: “What historically had kept the peasants [of Italy] in subordination to the dominant classes? Why had they failed to overthrow their rulers and to establish a hegemony of their own?”[12] Contextually, why wasn’t the peasantry more like the industrial proletariat — the more rebellious, presumed leader of the revolution against capitalism?[13] The passivity wrought by divided consciousness provided an answer. While there were “glimmers” of class consciousness — that is, the application of lived experience to what social systems should be, and the growth of class-centered ideas aimed at ending exploitation — the Italian peasants “largely participated in their own subordination by subscribing to hegemonic values, by accepting, admiring, and even seeking to emulate many of the attributes of the superordinate classes.”[14] Their desires, having “little internal consistency or cohesion,” even allowed the ruling class to make soldiers of peasants,[15] meaning active participation in maintaining oppressive power structures. Likewise, Lears commented on the work of historian Lawrence Goodwyn and the question of why the Populist movement in the late nineteenth-century United States largely failed.
While not claiming hegemony as the only cause, Lears argued that the democratic movement was most successful in parts of the nation with democratic traditions, where such norms were already within the bounds of acceptable discussion.[16] Where they were not, where elites had more decision-making control, the “received culture” was more popular, with domination seeming more natural and inevitable.[17] Similarly, Arnold’s historiographical review of the Indian peasantry found that greater autonomy (self-organization to pursue vital interests) of subaltern groups meant hegemony was much harder to establish, with “Gandhi [coming] closest to securing the ‘consent’ of the peasantry for middle-class ideological and political leadership,” but the bourgeoisie failing to do the same.[18] Traditions and cultural realities could limit hegemonic possibilities; it’s just as important to historians to understand why something does not work out as it is to comprehend why something does. As a final example, historian Eugene Genovese found that American slaves demonstrated both resistance to and appropriation of the culture of masters, both in the interest of survival, with appropriation inadvertently reinforcing hegemony and the dominant views and norms.[19] This can help answer questions regarding why slave rebellions took place in some contexts but not others, or even why more did not occur — though, again, acceptance of Gramscian theory does not require ruling out all causal explanations beyond cultural hegemony and divided consciousness. After all, Gramsci himself favored nuance, with coexisting consent and coercion, consciousness of class or lived experience mixing with beliefs of oppressors coming from above, and so on.

The challenge of hegemonic theory and contradictory consciousness relates to parsing out the aforementioned causes. Gray almost summed it up when he wrote, “[N]or should behavior that apparently corresponds to dominant ideology be read at face value as a direct product of ruling class influence.”[20] Here he was arguing that dominant culture was often imparted in indirect ways, not through the intentionality of the ruling class or programs of social control.[21] But one could argue the converse as well: “Behavior that apparently corresponds to dominant ideology cannot be read at face value as a product of divided consciousness and hegemony.” It is a problem of interpretation, and it can be difficult for historians to separate divided consciousness or cultural hegemony from other historical causes and show which has more explanatory value. When commenting on the failure of the Populist movement, Lears mentioned “stolen elections, race-baiting demagogues,” and other events and actors with causal value.[22] How much weight should be given to dominant ideology and how much to stolen elections? This interpretive nature can appear to weaken the usefulness of Gramsci’s model. Historians have developed potential solutions. For instance, as Lears wrote, “[O]ne way to falsify the hypothesis of hegemony is to demonstrate the existence of genuinely pluralistic debate; one way to substantiate it is to discover what was left out of public debate and to account historically for those silences.”[23] If there was public discussion of a wide range of ideas, many running counter to the interests of dominant groups, the case for hegemony is weaker; if public discussion centered on a narrow slate of ideas that served obvious interests, the case is stronger. A stolen election may be assigned less causal value, and cultural hegemony more, if public debate was demonstrably restricted.
However, the best evidence for hegemony may remain the psychoanalysis of individuals, as seen above, demonstrating some level of divided consciousness. Even as a matter of demonstrability, then, contradictory consciousness is key to Gramsci’s overall theory. A stolen election may earn less causal value if such insightful individual interviews can be submitted as evidence.

In sum, for Gramscian thinkers divided consciousness is a demonstrable phenomenon that powers (and is powered by) hegemony and the acceptance of ruling class norms and beliefs. While likely not the only cause of passivity to subjugation, it offers historians an explanation as to why individuals do not act in their own best interests that can be explored, given causal weight, falsified, or verified (to degrees) in various contexts. Indeed, Gramsci’s theory is powerful in that it has much utility for historians whether true or misguided.

For more from the author, subscribe and follow or read his books.


[1] T.J. Jackson Lears, “The Concept of Cultural Hegemony: Problems and Possibilities,” The American Historical Review 90, no. 3 (June 1985): 577.

[2] Ibid., 577-578.

[3] Ibid., 578.

[4] Ibid., 569.

[5] Robert Gray, “Bourgeois Hegemony in Victorian Britain,” in Tony Bennett, ed., Culture, Ideology and Social Process: A Reader (London: Batsford Academic and Educational, 1981), 240.

[6] Ibid., 244.

[7] Ibid.

[8] Ibid., 246.

[9] Ibid., 245.

[10] Ibid.

[11] David Arnold, “Gramsci and Peasant Subalternity in India,” The Journal of Peasant Studies 11, no. 4 (1984): 158.

[12] Ibid., 157.

[13] Ibid., 157.

[14] Ibid., 159.

[15] Ibid.

[16] Lears, “Hegemony,” 576-577.

[17] Ibid.

[18] Arnold, “India,” 172.

[19] Lears, “Hegemony,” 574.

[20] Gray, “Britain,” 246.

[21] Ibid., 245-246.

[22] Lears, “Hegemony,” 576.

[23] Lears, “Hegemony,” 586.

20% of Americans Are Former Christians

It’s relatively well-known that religion in this country is declining, with 26% of Americans now describing themselves as nonreligious (9% taking the atheist or agnostic label, 17% saying they are “nothing in particular”). Less discussed is where these growing numbers come from and just how much “faith switching” happens here.

For example, about 20% of citizens are former Christians, one in every five people you pass on the street. Where these individuals go isn’t a foregone conclusion — at times it’s to Islam (77% of new converts used to be Christians), Hinduism, or other faiths (“Members of non-Christian religions also have grown modestly as a share of the adult population,” the Pew Research Center reports). But mostly it’s to the “none” category, which has thus risen dramatically and is the fastest-growing affiliation. In a majority-Christian country that is rapidly secularizing, all this makes sense. (For context, 34% of Americans — one in three people — have abandoned the belief system in which they were raised, this group including atheists, Christians, Buddhists, Muslims, everyone. 4% of Americans used to be nonreligious but are now people of faith.)

While Islam gains new converts at about the same rate it loses members, keeping its numbers steady (similar to Hinduism and Judaism), Christianity loses far more adherents than it brings in and is therefore seeing a significant decline (from 77% to 65% of Americans in just 10 years):

19.2% of all adults…no longer identify with Christianity. Far fewer Americans (4.2% of all adults) have converted to Christianity after having been raised in another faith or with no religious affiliation. Overall, there are more than four former Christians for every convert to Christianity.

A parallel statistic holds for the nonreligious, but in their favor: “For every person who has left the unaffiliated and now identifies with a religious group more than four people have joined the ranks of the religious ‘nones.’”

This is so even though kids raised unaffiliated are somewhat less likely to stay in their category than kids raised in several major faiths! 53% of Americans raised nonreligious remain so. That is better than the 45% of mainline Protestants who stick with their beliefs, but worse than the 59% of Catholics or 65% of evangelical Protestants. (Hinduism, Islam, and Judaism again beat everyone — one shouldn’t argue that high retention rates, or big numbers, prove beliefs true, nor low ones false.) Yet it is simply the case that there are currently many more religious people available to change their minds than there are skeptics to change theirs:

The low retention rate of the religiously unaffiliated may seem paradoxical, since they ultimately obtain bigger gains through religious switching than any other tradition. Despite the fact that nearly half of those raised unaffiliated wind up identifying with a religion as adults, “nones” are able to grow through religious switching because people switching into the unaffiliated category far outnumber those leaving the category.

Overall, this knowledge is valuable because the growing numbers of atheists, agnostics, and the unaffiliated are occasionally seen as coming out of nowhere, rather than out of Christianity itself. (And out of other faiths, to far lesser degrees: Muslims are 1% of the population, Jews 2%.) It is as if a few dangerous, free-thinking families were suddenly having drastically more children, or a massive influx of atheistic immigrants were pouring into the U.S., skewing the percentages. Rather, the 26% of Americans who are nonreligious consists largely of the 20% of Americans who have abandoned Christianity. The call’s coming from inside the church.


How Should History Be Taught?

Debate currently rages over how to teach history in American public schools. Should the abyss of racism receive full attention? Should we teach our children that the United States is benevolent in its wars and use of military power — did we not bring down Nazi Germany? Is the nation fundamentally good based on its history, worthy of flying the flag, or is it responsible for so many horrors that an ethical person would keep the flag in the closet or burn it in the streets? Left and Right and everyone in between have different, contradictory perspectives, but to ban and censor is not ideal. Examining the full spectrum of views will help students understand the world they inhabit and the field of history itself.

While historians once imagined objectivity was possible, they now typically understand the true nature of their work. “Through the end of the twentieth century,” Sarah Maza writes in Thinking About History, “the ideal of historical objectivity was undermined from within the historical community… The more different perspectives on history accumulated, the harder it became to believe that any historian, however honest and well-intentioned, could tell the story of the past from a position of Olympian detachment, untainted by class, gender, racial, national, and other biases.” Selecting and rejecting sources involves interpretation and subconsciously biased decisions. Historians looking at the same sources will have different interpretations of meaning, which leads to fierce debates in scholarly journals. Teachers are not value-neutral either. All this is taken for granted. “It is impossible to imagine,” Maza writes, “going back to a time when historians imagined that their task involved bowing down before ‘the sovereignty of sources.'” They understand it’s more complex than that: “The history of the American Great Plains in the nineteenth century has been told as a tale of progress, tragedy, or triumph over adversity,” depending on the sources one is looking at and how meaning is derived from them.

But this is a positive thing. It gives us a fuller picture of the past, understanding the experiences of all actors. “History is always someone’s story, layered over and likely at odds with someone else’s: to recognize this does not make our chronicles of the past less reliable, but more varied, deeper, and more truthful.” It also makes us think critically — what interpretation makes the most sense to us, given the evidence offered? Why is the evidence reliable?

If historians understand this, why shouldn’t students? Young people should be taught that while historical truth exists, any presentation of historical truth — a history book, say — was affected by human action and sentiment. This is a reality that those on the Left and Right should be able to acknowledge. Given this fact, and that both sides are after the same goal, to teach students the truth, the only sensible path forward is to offer students multiple interpretations. Read A Patriot’s History of the United States (Schweikart, Allen) and A People’s History of the United States (Zinn). There are equivalent versions of these types of texts for elementary and middle schoolers. Read about why World War II was “The Good War” in your typical textbook, alongside Horrible Histories: Woeful Second World War. Have students read history by conservatives in awe of a greatest country in the whole wide world, as well as by liberals fiercely critical of the nation and many of its people for keeping liberty and democracy exclusively for some for far longer than many other countries. They can study top-down history (great rulers, generals, and leaders drive change) and bottom-up social history (ordinary people coming together drives change). Or compare primary sources from the late nineteenth century to the early twentieth demanding or opposing women’s rights. Read the perspectives of both Native Americans and American settlers in the plains. Why not? This gives students a broader view of the past, shows them why arguments and debates over history exist, and helps them understand modern political ideologies.

Most importantly, as noted, it helps students think critically. Many a teacher has said, “I don’t want to teach students what to think, but rather how to think.” Apart from exploring the logical fallacies, which is also important, this doesn’t seem possible without exploring varying perspectives and asking which one a young person finds most convincing and why. One can’t truly practice the art of thinking without one’s views being challenged, being forced to justify the maintenance of a perspective or a deviation based on newly acquired knowledge. Further, older students can go beyond different analyses of history and play around with source theories: what standard should there be to determine if a primary source is trustworthy? Can you take your standard, apply it to the sources of these two views, and determine which is more solid by your metric? There is much critical thinking to be done, and it makes for a more interesting time for young people.

Not only does teaching history in this way reflect the professional discipline and greatly expand student knowledge and thought, it also aligns with the nature of public schools, or with what the general philosophy of public schools should be. The bent of a history classroom, or the history segment of the day in the youngest grades, is determined by the teacher, but also by the books, curricula, and standards approved or required by the district, the regulations of the state, and so forth. So liberal teachers, districts, and states go their way and conservative teachers, districts, and states go theirs. But who is the public school classroom for, exactly? It’s for everyone — which necessitates some kind of openness to a broad range of perspectives (public universities are the same way, as I’ve written elsewhere).

This may be upsetting and sensible at the same time. On the one hand, “I don’t want my kid, or other kids, hearing false, dangerous ideas from the other side.” On the other, “It would be great for my kid, and other kids, to be exposed to this perspective when it so often is excluded from the classroom.” Everyone is happy, no one is happy. Likely more the latter. First, how can anyone favor bringing materials full of falsities into a history class? Again, anyone who favors critical thinking. Make that part of the study — look at the 1619 Project and the 1776 Report together, and explore why either side finds the other in error. Second, how far do you go? What extreme views will be dignified with attention? Is one to bring in Holocaust deniers and square their arguments up against the evidence for the genocide? Personally, this writer would support that: what an incredible exercise in evaluating and comparing the quantity and quality of evidence (and “evidence”). Perhaps others will disagree. But none of this means there can’t be reasonable limits to presented views. If an interpretation or idea is too fringe, it may be a waste of time to explore it. There is finite time in a class period and in a school year. The teacher, district, and so on will have to make the (subjective) choice (no one said this was a perfect system) to leave some things out and focus on bigger divides. If Holocaust denial is still relatively rare, controversy over whether the Civil War occurred due to slavery is not.

Who, exactly, is afraid of pitting their lens of history against that of another? Probably he who is afraid his sacred interpretation will be severely undermined, she who knows her position is not strong. If you’re confident your interpretation is truthful, backed by solid evidence, you welcome all challengers. Even if another viewpoint makes students think in new ways, even pulling them away from your lens, you know the latter imparted important knowledge and made an impression. As the author of a book on racism used in high schools and colleges, what do I have to fear when some conservative writes a book about how things really weren’t so bad for black Kansas Citians over the past two centuries? By all means, read both books, think for yourself, decide which thesis makes the most sense to you based on the sources — or create a synthesis of your own. The imaginary conservative author should likewise have no qualms about such an arrangement.

I have thus far remained fairly even-handed, because Leftists and right-wingers can become equally outraged over very different things. But here I will wonder whether the Right would have more anxiety over a multiple-interpretation study specifically. Once a student has learned of the darkness of American history, it is often more difficult to be a full-throated, flag-worshiping patriot. This risk will drive some conservatives berserk. Is the Leftist parent equally concerned that a positive, patriotic perspective on our past alongside a Zinnian version will turn her child into someone less critical, more favorable to the State, even downplaying the darkness? I’m not sure if the Leftist is as worried about that. My intuition, having personally been on both sides of the aisle, is that the risk would be more disturbing for conservatives — the horrors still horrify despite unrelated positive happenings, but the view of the U.S. as the unequivocal good guy is quickly eroded forever. Hopefully I am wrong and that is the mere bias of a current mindset talking. Either way, this pedagogy, the great compromise, is the right thing to do, for the reasons outlined above.

In conclusion, we must teach students the truth — and while Americans will never fully agree on what that is, the closest we may come is acknowledging that this nation and its people have done horrific things as well as positive things. Teaching both is honest and important, and that’s what students will see when they examine different authors and documents. In my recent review of a history text, I wrote that the Left “shouldn’t shy away from acknowledging, for instance, that the U.S. Constitution was a strong step forward for representative democracy, secular government, and personal rights, despite the obvious exclusivity, compared to Europe’s systems.” Nor should one deny the genuine American interest in rescuing Europe and Asia from totalitarianism during World War II. And then there are the inventions, art, scientific discoveries, music, and many other things. The truth rests in nuance, as one might expect. James Baldwin said that American history is “more beautiful and more terrible than anything anyone has ever said about it.” (What nation does not have both horrors and wonderful things in its history? Where would philosophy be without the German greats?) I’ve at times envisioned writing a history of the U.S. through a “hypocrisy” interpretation, but it works the same under a “mixed bag” framing: religious dissenters coming to the New World for more freedom and immediately crushing religious dissenters, men who spoke of liberty and equality while owning slaves, fighting the Nazi master race with a segregated army, supporting democracy in some cases but destroying it in others, and so on. All countries have done good and bad things.

That is a concept the youngest children — and the oldest adults — can understand.


Big Government Programs Actually Prevent Totalitarianism

There is often much screaming among conservatives that big government programs — new ones like universal healthcare, universal college education, or guaranteed work, and long-established ones like Social Security, Medicaid, and Medicare — somehow lead to dictatorship. There is, naturally, no actual evidence for this. The imagined correlation is justified with nothing beyond “that’s socialism, which always becomes totalitarianism,” an ignorance already addressed elsewhere. The experience of advanced democracies around the world, and indeed of the U.S. itself, suggests big government programs, run by big departments with big budgets and big staffs helping tens of millions of citizens, can happily coexist alongside elected governing bodies and presidents, constitutions, and human rights, as one would expect.

Threats to democracy come from elsewhere — but what’s interesting to consider is how conservatives have things completely backward. Big government programs — the demonstration that one’s democracy is a government “for the people,” existing to meet citizen needs and desires — are key to beating back the real threats to a republic.

In a recent interview with The Nation, Bernie Sanders touched on this:

“Why it is imperative that we address these issues today is not only because of the issues themselves—because families should not have to spend a huge proportion of their income on child care or sending their kid to college—but because we have got to address the reality that a very significant and growing number of Americans no longer have faith that their government is concerned about their needs,” says the senator. “This takes us to the whole threat of Trumpism and the attacks on democracy. If you are a worker who is working for lower wages today than you did 20 years ago, if you can’t afford to send your kid to college, etc., and if you see the very, very richest people in this country becoming phenomenally rich, you are asking yourself, ‘Who controls the government, and does the government care about my suffering and the problems of my family?’”

Sanders argues that restoring faith in government as a force for good is the most effective way to counter threats to democracy.

And he’s right. Empirical evidence suggests economic crises erode the rule of law and faith in representative democracy. Depressions are not the only force that pushes in this direction, but they are significant and at times a killing blow to democratic systems. Unemployment, low wages, a rising cost of living — hardship and poverty, in other words — drive citizens toward extreme parties and voices, including authoritarians. Such leaders are then elected to office, and begin to dismantle democracy with support of much of the population. Europe in the 1930s is the oft-cited example, but the same has been seen after the global recession beginning in 2008, with disturbing outgrowths of recent declining trust in democracy: the success of politicians with demagogic and anti-democratic bents like Trump, hysteria over fictional stolen elections that threatens to keep unelected people in office, and dangerous far-right parties making gains in Europe. The Eurozone and austerity crisis, the COVID-induced economic turmoil, and more have produced similar concerns.

What about the reverse? If economic disaster harms devotion to real democracy and to politicians who believe in it, does the welfare state increase support for and faith in democracy? Studies suggest this is so. Government tackling poverty through social programs increases satisfaction with democratic systems! The perception that inequality is rising and welfare isn’t doing enough to address it does the exact opposite. A helping hand increases happiness and is expected from democracies, inherently linking favorable views of republics and redistribution. If we wish to inoculate the citizenry against authoritarian candidates and anti-democratic practices within established government, shoring up loyalty to democracy through big government programs is crucial.

It is as Sanders said: the most important thing the government can do to strengthen our democracy, and even heal polarization (“Maybe the Democrats putting $300 per child per month in my bank account aren’t so evil”), is simply to help people. To work for and serve all. Healthcare, education, income support, jobs… such services help those on the Right, the Left, and everyone in between. This should be done whether there is economic bust or boom. People hold fast to democracy, a government of and by the people, when it is clearly a government for the people. If we lose the latter, so too the former.

For more from the author, subscribe and follow or read his books.

COVID Proved Social Conditions Largely Determine Our Health

In the past year, it has been heavily impressed upon Kansas Citians that one’s health is to a significant degree determined by factors beyond one’s control. The COVID-19 era is a key moment to further break down the reactionary notion that personal health choices are all that stand between an individual and optimal physical and mental well-being. The pandemic has broadened our understanding of how health is also a product of social conditions.

The first and most elementary fact to note is that viruses, while often hitting vulnerable populations such as the elderly hardest, are not entirely discriminatory. They end the lives of the young and healthy as well. Regardless of one’s habits of eating, exercising, or not smoking, random exposure to illnesses new or old while shopping for groceries or riding in an Uber introduces the point: the environment often makes a mockery of our personal choices, as important as those are.

The family you are born into, where you grow up, and other factors beyond your control — and often your own awareness — have a large impact on your development and health as a child, which in turn shapes your health as an adult. (And the environment you happen to be in continues to affect you.) Poverty, extremely stressful on the mind and body in many ways, is the ultimate destructive circumstance for children and adults alike. Take the disturbing life expectancy gap between the poor and the better-off: in Kansas City’s poorest ZIP codes, which are disproportionately Black, you can expect to live 18 fewer years on average than in our richest, whitest ZIP codes, as Flatland reported on June 22. Poor families are less likely to have health care offered by an employer or to be able to afford it themselves. They live amid social conditions that include more violence and worse air and water pollution. They can at times only afford housing owned by negligent landlords slow to take care of mold, and they must cope with a million other factors.

During the pandemic, what serious observers of the social determinants of health predicted came true: Black Kansas Citians were hammered by COVID-19. Here we feel, today, the cold touch of slavery and Jim Crow, which birthed disproportionate poverty, which nurtured worse health, which resulted in Black Kansas Citians being more likely to catch coronavirus and die from it, as The Star reported even in the early stages of the pandemic. Worse still, on Feb. 24, the paper noted that richer, whiter ZIP codes — the areas of less urgent need — were getting disproportionately more vaccines than poorer areas with more Black residents. The vaccines were first shipped by the state to health centers that were convenient for some but distant from others.

Imagine history and race playing a role in your health, how soon you could get a shot. Imagine transportation options and where you live being factors. Likewise, imagine the kind of job you have doing the same: Lower-income workers are more likely to have front-line jobs at restaurants and grocery stores, where you can catch the virus. The privileged, better-off often work from home.

Whether it is drinking water you don’t know is unsafe or working at a job that requires much human contact during a pandemic, the determinants of health stretch far beyond exercising, eating right, and choosing not to smoke. To reflect on this fact is to understand a moral duty. If social conditions affect the health of individuals and families, it is urgent to change social conditions — to build a decent society, one without poverty and the many horrors that flow from it.

In this moment, one important way to help move toward this goal is to urge the U.S. House to pass the reconciliation budget that just passed the Senate, to extend the direct child tax credit payments to families, boldly expand education and health care, and more. Onward, a better world awaits.

This article first appeared in The Kansas City Star: https://www.kansascity.com/opinion/readers-opinion/guest-commentary/article253638658.html or https://edition.pagesuite.com/popovers/dynamic_article_popover.aspx?artguid=1ce78851-fef4-4f5d-b7a4-448618c1526c.
