Purpose, Intersectionality, and History

This paper posits that primary sources meant for public consumption best allow the historian to understand how intersections between race and gender were used, consciously or not, to advocate for social attitudes and public policy in the United States and the English colonies before it. This is not to say such use can never be gleaned from sources meant to remain largely unseen, nor that public ones will always prove helpful; the nature of sources simply creates a general rule. Public sources like narratives and films typically offer arguments.[1] Diaries and letters to friends tend to lack them. A public creation had a unique purpose and audience, unlikely to exist in the first place without an intention to persuade, and with that intention came more attention to intersectionality, whether in a positive (liberatory) or negative (oppressive) manner.

An intersection between race and gender traditionally refers to an overlap in challenges: a woman of color, for instance, will face oppressive norms targeting both women and people of color, whereas a white woman will only face one of these. Here the meaning will include this but is expanded slightly to reflect how the term has grown beyond academic circles. In cultural and justice movement parlance, it has become near-synonymous with solidarity, in recognition of overlapping oppressions (“True feminism is intersectional,” “If we fight sexism we must fight racism too, as these work together against women of color,” and so on). “Intersectionality” therefore has both a negative and a positive connotation: multiple identities plagued by multiple societal assaults, but also the coming together of those who wish to address this, who declare the struggle of others to be their own. We will thus consider intersectionality as oppressive and liberatory developments, intimately intertwined, relating to women of color.

Salt of the Earth, the 1954 film in which the wives of striking Mexican American workers ensure a victory over a zinc mining company by taking over the picket line, is intersectional at its core.[2] Meant for a public audience, it uses overlapping categorical challenges to argue for gender and racial (as well as class) liberation. The film was created by blacklisted Hollywood professionals alongside the strikers and picketers on whom the story is based (those of the 1950-1951 labor struggle at Empire Zinc in Hanover, New Mexico) to push back against American dogma of the era: normalized sexism, racism, exploitation of workers, and the equation of any efforts to address such problems with communism.[3] Many scenes highlight the brutality or absurdity of these injustices, with workers dying in unsafe conditions, police beating Ramon Quintero for talking back “to a white man,” and women being laughed at when they declare they will cover the picket line, only to amaze everyone when they ferociously battle police.[4]

Intersectionality is sometimes shown rather than told, with the protagonist Esperanza Quintero facing the full brunt of both womanhood and miserable class conditions in the company-owned town (exploitation of workers includes that of their families). She does not receive racist abuse herself, but she is a Mexican American woman whose husband does, and the implication is clear enough. She shares the burdens of racism with men, and those of exploitation — with women’s oppression a unique, additional yoke. In the most explicit expositional instance of intersectionality, Esperanza castigates Ramon for wanting to keep her in her place, arguing that it is precisely like the “Anglos” wanting to put “dirty Mexicans” in theirs.[5] Sexism is as despicable as racism, the audience is told, and therefore if you fight the latter you must also fight the former. The creators of Salt of the Earth use intersectionality to argue for equality for women by strategically tapping into preexisting anti-racist sentiment: the men of the movie understand from the start that bigotry against Mexican Americans is wrong, and this understanding is gradually extended to women. The audience — Americans in general, unions, the labor movement — must do the same.

A similar public source to consider is Toni Morrison’s 1987 novel Beloved. Like Salt of the Earth, Beloved is historical fiction. Characters and events are invented, but it is based on a historical event: in 1850s Ohio, a formerly enslaved woman named Margaret Garner killed one of her children and attempted to kill the rest to prevent their enslavement.[6] One could perhaps argue Salt of the Earth, though fiction, is a primary source for the 1950-1951 Hanover strike, given its Hanover co-creators; it is clearly a primary source for 1954 and its hegemonic American values and activist counterculture — historians can examine a source both as an event in itself and for what it says about an earlier event.[7] Beloved cannot be considered a primary source of the Garner case, being written about 130 years later, but it is a primary source of the late 1980s. Therefore, any overall argument or comments on intersectionality reflect and reveal the thinking of Morrison’s time.

In her later foreword, Morrison writes of another inspiration for her novel, her feeling of intense freedom after leaving her job to pursue her writing passions.[8] She explains:

I think now it was the shock of liberation that drew my thoughts to what “free” could possibly mean to women. In the eighties, the debate was still roiling: equal pay, equal treatment, access to professions, schools…and choice without stigma. To marry or not. To have children or not. Inevitably these thoughts led me to the different history of black women in this country—a history in which marriage was discouraged, impossible, or illegal; in which birthing children was required, but “having” them, being responsible for them—being, in other words, their parent—was as out of the question as freedom.[9]

This illuminates both Morrison’s purpose and how intersectionality forms its foundation. “Free” meant something different to women in 1987, she suggests, than to men. Men may have understood women’s true freedom as equal rights and access, but did they understand it also to mean, as women did, freedom from judgment, freedom not only to make choices but to live by them without shame? Morrison then turns to intersectionality: black women were forced to live by a different, harsher set of rules. This was a comment on slavery, but it is implied on the same page that the multiple challenges of multiple identities marked the 1980s as well: a black woman’s story, Garner’s case, must “relate…to contemporary issues about freedom, responsibility, and women’s ‘place.’”[10] In Beloved, Sethe (representing Garner) consistently saw the world differently than her lover Paul D, from what was on her back to whether killing Beloved was justified, an act of love, an act of resistance.[11] To a formerly enslaved black woman and mother, the act set Beloved free; to a formerly enslaved man, it was a horrific crime.[12] Sethe saw choice as freedom, and if Paul D saw the act as a choice that could not be made, if he offered only stigma, then freedom could not exist either. Recognizing the unique challenges and perspectives of black women and mothers, Morrison urges readers of the 1980s to do the same, to graft a conception of true freedom onto personal attitudes and public policy.

Moving beyond historical fiction, let us examine a nonfiction text from the era of the Salem witch trials to observe how Native American women were even more vulnerable to accusation than white women. Whereas Beloved and Salt of the Earth make conscious moves against intersectional oppression, the following work, wittingly or not, solidified it. Boston clergyman Cotton Mather’s A Brand Pluck’d Out of the Burning (1693) begins by recounting how Mercy Short, an allegedly possessed servant girl, was once captured by “cruel and Bloody Indians.”[13] This seemingly out-of-place opening establishes a tacit connection between indigenous people and the witchcraft plaguing Salem. The link is made more explicit later in the work, when Mather writes that someone executed at Salem testified that “Indian sagamores” had been present at witch meetings to organize “the methods of ruining New England,” and that Mercy Short, in a possessed state, revealed the same, adding that Native Americans at such meetings held a book of “Idolatrous Devotions.”[14] Mather, and others, believed indigenous peoples were involved in the Devil’s work. Several other afflicted women and girls had also survived Native American attacks, further connecting the two terrors.[15]

This placed women like Tituba, a Native American slave, in peril. Women were the primary victims of the witch hunts.[16] Tituba’s race was an added vulnerability (as was, admittedly, a pre-hysteria association, deserved or not, of Tituba with magic).[17] She was accused and pressured into naming other women as witches, then imprisoned (she later recanted).[18] A Brand Pluck’d Out of the Burning was intended to describe Short’s tribulation, as well as to offer some remedies,[19] but also to explain its cause. Native Americans, it told its Puritan readers, were heavily involved in the Devil’s work, likely helping to create further cross-categorical consequences for Native American women who came after Tituba. The text both described and maintained a troubling intersection in the New England colonies.

A captivity narrative from the previous decade, Mary Rowlandson’s The Sovereignty and Goodness of God, likewise encouraged intersectional oppression. This source is a bit different from A Brand Pluck’d Out of the Burning because it is a first-hand account of the author’s own experience; Mather’s work is largely a second-hand account of Short’s experience (compare “…shee still imagined herself in a desolate cellar” to the first-person language of Rowlandson[20]). Rowlandson was an Englishwoman from Massachusetts held captive for three months by the Narragansett, Nipmuc, and Wompanoag during King Philip’s War (1675-1676).[21] Her 1682 account of this event both characterized Native Americans as animals and carefully defined a woman’s proper place — encouraging racism against some, patriarchy against others, and the full weight of both for Native American women. To Rowlandson, native peoples were “dogs,” “beasts,” “merciless and cruel,” creatures of great “savageness and brutishness.”[22] They were “Heathens” of “foul looks,” whose land was unadulterated “wilderness.”[23] Native society was animalistic, a contrast to white Puritan civilization.[24]

Rowlandson reinforced ideas of true womanhood by downplaying the power of Weetamoo, the female Pocassett Wompanoag chief, whose community leadership, possession of vast land and servants, and engagement in diplomacy and war violated Rowlandson’s understanding of a woman’s proper role in society.[25] Weetamoo’s authority was well-known by the English.[26] Yet Rowlandson put her in a box, suggesting her authority was an act, never acknowledging her as a chief (unlike Native American men), and emphasizing her daily tasks to implicitly question her status.[27] Rowlandson ignored the fact that Weetamoo’s “work” was a key part of tribal diplomacy, attempted to portray her own servitude as unto a male chief rather than Weetamoo (giving possessions first to him), and later labeled Weetamoo an arrogant, “proud gossip” — meaning, historian Lisa Brooks notes, “in English colonial idiom, a woman who does not adhere to her position as a wife.”[28] The signals to her English readers were clear: indigenous people were savages and a woman’s place was in the domestic, not the public, sphere. If Weetamoo’s power was common knowledge, the audience would be led to an inevitable conclusion: a Native American woman was inferior twofold, an animal divorced from true womanhood.

As we have seen, public documents make a case for or against norms of domination that impact women of color in unique, conjoining ways. But sources meant to remain private are often less useful for historians seeking to understand intersectionality — as mentioned in the introduction, with less intention to persuade come fewer and less bold pronouncements, whether oppressive or liberatory. Consider the diary of Martha Ballard, written 1785-1812. Ballard, a midwife who delivered over eight hundred infants in Hallowell, Maine, left a daily record of her work, home, and social life.[29] The diary does have some liberatory implications for women, subverting ideas of men being the exclusive important actors in the medical and economic spheres.[30] But its purpose was solely for Ballard — keeping track of payments, weather patterns, and so on.[31] There was little need to comment on a woman’s place, and even less was said about race. Though there do exist some laments over the burdens of her work, mentions of delivering black babies, and notice of a black female doctor, intersectionality is beyond Ballard’s gaze, or at least beyond the purpose of her text.[32]

Similarly, private letters often lack argument. True, an audience of one is more likely to involve persuasion than an audience of none, but still less likely than a mass audience. And without much of an audience, ideas need not be fully fleshed out nor, at times, addressed at all. Intersectional knowledge can be assumed, ignored as inappropriate given the context, and so on. For instance, take a letter abolitionist and women’s rights activist Sarah Grimké wrote to Sarah Douglass of the Philadelphia Female Anti-Slavery Society on February 22, 1837.[33] Grimké expressed sympathy for Douglass, a black activist, on account of race: “I feel deeply for thee in thy sufferings on account of the cruel and unchristian prejudice…”[34] But while patriarchal norms and restrictions lay near the surface, with Grimké describing the explicitly “female prayer meetings” and gatherings of “the ladies” to which her early work was often confined, she made no comment on Douglass’ dual challenge of black womanhood.[35] The letter was a report of Grimké’s meetings, with no intention to persuade. Perhaps she felt it off-topic to broach womanhood and intersectionality. Perhaps she believed it too obvious to mention — or that it would undercut or distract from her extension of sympathy toward Douglass and the unique challenges of racism (“Yes, you alone face racial prejudice, but do we not both face gender oppression?”). On the one hand, the letter could seem surprising: how could Grimké, who along with her sister Angelina was pushing for both women’s equality and abolition for blacks at this time, not have discussed womanhood, race, and their interplays with a black female organizer like Douglass?[36] On the other, this is not surprising at all: this was a private letter with a limited purpose. It likely would have looked quite different had it been a public letter meant for a mass audience.

In sum, this paper offered a general view of how the historian can find and explore intersectionality, whether women of color facing overlapping challenges or the emancipatory mindsets and methods needed to address them. Purpose and audience distinguished the most and least useful sources for such an endeavor. Sources intended for the public, like films, novels, secondary narratives, first-person narratives, and more (autobiographies, memoirs, public photographs and art, articles, public letters), show how intersectionality was utilized, advancing regressive or progressive attitudes and causes. Sources meant to remain private, like diaries, personal letters, and so on (private photographs and art, some legal and government documents), often have no argument and are less helpful. From here, a future paper could explore the exceptions that of course exist. More ambitiously, another might attempt to examine the effectiveness of each type of source in producing oppressive or liberatory change: does the visual-auditory stimulation of film or the inner thoughts in memoirs evoke emotions and reactions that best facilitate attitudes and action? Is seeing the intimate perspectives of multiple characters in a novel of historical fiction most powerful, or that of one thinker in an autobiography, who was at least a real person? Or is a straightforward narrative, the writer detached, lurking in the background as far away as possible, just as effective as more personal sources in pushing readers to hold back or stand with women of color? The historian would require extensive knowledge of the historical reactions to the (many) sources considered (D.W. Griffith’s Birth of a Nation famously sparked riots — can such incidents be quantified? Were they more likely to occur with films than with photographs?) and perhaps a co-author from the field of psychology to scientifically test (admittedly present-day) human reactions to various types of sources to bolster the case.



[1] Mary Lynn Rampolla, A Pocket Guide to Writing in History, 10th ed. (Boston: Bedford/St. Martin’s, 2020), 14.

[2] Salt of the Earth, directed by Herbert Biberman (1954; Independent Productions Corporation).

[3] Carl R. Weinberg, “‘Salt of the Earth’: Labor, Film, and the Cold War,” Organization of American Historians Magazine of History 24, no. 4 (October 2010): 41-45.

  Benjamin Balthaser, “Cold War Re-Visions: Representation and Resistance in the Unseen Salt of the Earth,” American Quarterly 60, no. 2 (June 2008): 347-371.

[4] Salt of the Earth, Biberman.

[5] Ibid.

[6] Toni Morrison, Beloved (New York: Vintage Books, 2004), xvii.

[7] Kathleen Kennedy (lecture, Missouri State University, April 26, 2022).

[8] Morrison, Beloved, xvi.

[9] Ibid., xvi-xvii.

[10] Ibid., xvii.

[11] Ibid., 20, 25; 181, 193-195. To Sethe, her back was adorned with “her chokecherry tree”; Paul D noted “a revolting clump of scars.” This should be interpreted as Sethe distancing herself from the trauma of the whip, reframing and disempowering horrific mutilation through positive language. Paul D simply saw the terrors of slavery engraved on the body. Here Morrison subtly considers a former slave’s psychological self-preservation. When Sethe admitted to killing Beloved, she was unapologetic to Paul D — “I stopped him [the slavemaster]… I took and put my babies where they’d be safe” — but he was horrified, first denying the truth, then feeling a “roaring” in his head, then telling Sethe she loved her children too much. Then, like her sons and the townspeople at large, Paul D rejected Sethe, leaving her.

[12] Ibid., 193-195.

[13] Cotton Mather, A Brand Pluck’d Out of the Burning, in George Lincoln Burr, Narratives of the New England Witch Trials (Mineola, New York: Dover Publications, 2012), 259.

[14] Ibid., 281-282.

[15] Richard Godbeer, The Salem Witch Hunt: A Brief History with Documents (New York: Bedford/St. Martin’s, 2018), 83.

[16] Michael J. Salevouris and Conal Furay, The Methods and Skills of History (Hoboken, NJ: Wiley-Blackwell, 2015), 211.

[17] Godbeer, Salem, 83.

[18] Ibid., 83-84.

[19] Burr, Narratives, 255-258.

[20] Ibid., 262.

[21] Mary Rowlandson, The Sovereignty and Goodness of God by Mary Rowlandson with Related Documents, ed. Neal Salisbury (Boston: Bedford Books, 2018).

[22] Ibid., 76-77, 113-114.

[23] Ibid., 100, 76.

[24] This was the typical imperialist view. See Kirsten Fischer, “The Imperial Gaze: Native American, African American, and Colonial Women in European Eyes,” in A Companion to American Women’s History, ed. Nancy A. Hewitt (Malden MA: Blackwell Publishing, 2002), 3-11.

[25] Lisa Brooks, Our Beloved Kin: A New History of King Philip’s War (New Haven: Yale University Press, 2018), chapter one.

[26] Ibid., 264.

[27] Ibid.

   Rowlandson, Sovereignty, 81, 103.

[28] Brooks, Our Beloved Kin, 264, 270.

[29] Laurel Thatcher Ulrich, A Midwife’s Tale: The Life of Martha Ballard, Based on Her Diary, 1785-1812 (New York: Vintage Books, 1999).

[30] Ibid., 28-30.

[31] Ibid., 168, 262-263.

[32] Ibid., 225-226, 97, 53.

[33] Sarah Grimké, “Letter to Sarah Douglass,” in Kathryn Kish Sklar, Women’s Rights Emerges within the Antislavery Movement, 1830-1870 (New York: Bedford/St. Martin’s, 2019), 94-95.

[34] Ibid., 95.

[35] Ibid., 94.

[36] Ibid., 84-148.

‘Beloved’ as History

In one sense, fiction can present (or represent, a better term) history as an autobiography might, exploring the inner thoughts and emotions of a survivor or witness. In another, fiction is more like a standard nonfiction work, its omniscient gaze shifting from person to person, revealing that which a single individual cannot know and experience, but not looking within, at the personal. Toni Morrison’s 1987 Beloved exemplifies the synthesis of these two capacities: the true, unique power of fiction is the ability to explore the inner experiences of multiple persons. While only “historically true in essence,” as Morrison put it, the novel offers a history of slavery and its persistent trauma for the characters Sethe, Paul D, Denver, Beloved, and more.[1] It is posited here that Morrison believed the history of enslavement could be more fully understood through representations of the personal experiences of diverse impacted persons. This is the source of Beloved’s power.

One way to approach this is to consider different perspectives on the same or similar events. To Sethe, her back was adorned with “her chokecherry tree”; Paul D noted “a revolting clump of scars.”[2] This should be interpreted as Sethe distancing herself from the trauma of the whip, reframing and disempowering horrific mutilation through positive language. Paul D simply saw the terrors of slavery engraved on the body. Here Morrison subtly considers a former slave’s psychological self-preservation. As another example, both Sethe and Paul D experienced sexual assault. Slaveowners and guards, respectively, forced milk from Sethe’s breasts and forced Paul D to perform oral sex.[3] Out of fear, “Paul D retched — vomiting up nothing at all. An observing guard smashed his shoulder with the rifle…”[4] “They held me down and took it,” Sethe thought mournfully, “Milk that belonged to my baby.”[5] Slavery was a violation of personhood, an attack on motherhood and manhood alike. Morrison’s characters experienced intense pain and shame over these things; here the author draws attention not only to the pervasive sexual abuse inherent to American slavery but also to how it could take different forms, with different meanings, for women and men. Finally, consider how Sethe killed her infant to save the child from slavery.[6] Years later, Sethe was unapologetic to Paul D — “I stopped him [the slavemaster]… I took and put my babies where they’d be safe” — but he was horrified, first denying the truth, then feeling a “roaring” in his head, then telling Sethe she loved her children too much.[7] Then, like her sons and the townspeople at large, Paul D rejected Sethe, leaving her.[8] This suggests varying views on the meaning of freedom (is death true freedom or its absence? is true freedom determining one’s own fate?), as well as on ethics, resistance, and love; a formerly enslaved woman and mother may judge such an act differently than a formerly enslaved man might.[9]

Through the use of fiction, Morrison can offer diverse intimate perspectives, emotions, and experiences of former slaves, allowing for a more holistic understanding of the history of enslavement. This is accomplished through both a standard literary narrative and, in several later chapters, streams of consciousness from Sethe, Denver, Beloved, and an amalgamation of the three.[10] Indeed, Sethe and Paul D’s varying meanings and observations here are a small selection from an intensely complex work with several other prominent characters. There is much more to explore. It is also the case that in reimagining and representing experiences, Morrison attempts to make history personal and comprehensible for the reader, to transmit the emotions of slavery from page to body.[11] Can history be understood, she asks, if we do not experience it ourselves, in at least a sense? In other words, Beloved is history as “personal experience” — former slaves’ and the reader’s.[12]



[1] Toni Morrison, Beloved (New York: Vintage Books, 2004), xvii.

[2] Ibid., 20, 25.

[3] Ibid., 19-20, 127.

[4] Ibid., 127.

[5] Ibid., 236.

[6] Ibid., 174-177.

[7] Ibid., 181, 193-194.

[8] Ibid., 194-195.

[9] Morrison alludes, in her foreword, to wanting to explore what freedom meant to women: ibid., xvi-xvii.

[10] Ibid., 236-256.

[11] Morrison writes that to begin the book she wanted the reader to feel kidnapped, as Africans or sold/caught slaves experienced: ibid., xviii-xix. 

[12] Ibid., xix.

‘Salt of the Earth’: Liberal or Leftist?

Labor historian Carl R. Weinberg argues that the Cold War was fought at a cultural level, films being one weapon to influence American perspectives on matters of class and labor, gender, and race.[1] He considers scenes from Salt of the Earth, the 1954 picture in which the wives of striking Mexican American workers ensure a victory over a zinc mining company by taking over the picket line, scenes that evidence a push against hierarchical gender relations, racial prejudice, and corporate-state power over unions and workers.[2] Cultural and literary scholar Benjamin Balthaser takes the same film and explores the scenes left on the cutting room floor, positing that the filmmakers desired a stronger assault against U.S. imperialism, anti-communism at home and abroad (such as McCarthyism and the Korean War), and white/gender supremacy, while the strikers on whom the film was based, despite their sympathetic views and militancy, felt such commentary would hurt their labor and civil rights organizing — or even bring retribution.[3] Balthaser sees a restrained version born of competing interests, and Weinberg, without exploring the causes, notices the same effect: there is nearly no “mention of the broader political context,” little commentary on communism or America’s anti-communist policies.[4] It is a bit odd to argue Salt of the Earth was a cultural battleground of the Cold War that had little to say of communism, but Weinberg falls roughly on the same page as Balthaser: the film boldly takes a stand for racial and gender equality, and of course union and workers’ rights, but avoids the larger ideological battle, capitalism versus communism. They are correct: this is largely a liberal, not a leftist, film.

This does not mean communist sympathies made no appearance, of course: surviving the editing bay was a scene that introduced the character of Frank Barnes of “the International” (the Communist International), who strongly supported the strike and expressed a willingness to learn more of Mexican and Mexican American culture.[5] Later, “Reds” are blamed for causing the strike.[6] And as Weinberg notes, the Taft-Hartley Act, legislation laced with anti-communist clauses, is what forces the men to stop picketing.[7] Yet all this is as close as Salt comes to connecting labor, racial, and women’s struggles with a better world, how greater rights and freedom could create communism or vice versa. As Balthaser argues, the original script attempted to draw a stronger connection between this local event and actual/potential political-economic systems.[8] The final film positions communists as supporters of positive social changes for women, workers, and people of color, but at best only implies that patriarchy, workplace misery or class exploitation, and racism were toxins inherent to the capitalist system of which the United States was a part, toxins that only communism could address. And, it might be noted, the case for such an implication is slightly weaker for patriarchy and racism, as the aforementioned terms such as “Reds” only arise in conversations centered on the strike and the men’s relationships to it.

True, Salt of the Earth is a direct attack on power structures. Women, living in a company town with poor conditions like a lack of hot water, want to picket even before the men decide to strike; they break an “unwritten rule” by joining the men’s picket line; they demand “equality”; they mock men; they demand to take over the picket line when the men are forced out, battling police and spending time in jail.[9] Esperanza Quintero, the film’s protagonist and narrator, at first more dour, sparkles to life the more she ignores her husband Ramon’s demands and involves herself in the huelga.[10] By the end women’s power at the picket line has transferred to the home: the “old way” is gone, Esperanza tells Ramon when he raises a hand to strike her.[11] “Have you learned nothing from the strike?” she asks. Likewise, racist company men (“They’re like children”) and police (“That’s no way to talk to a white man”) are the villains, as is the mining company that forces workers to labor alone, resulting in their deaths, and offers miserable, discriminatory pay.[12] These struggles are often connected (intersectionality): when Esperanza denounces the “old way,” she compares being put in her place to the “Anglos” putting “dirty Mexicans” in theirs.[13] However, it could be that better working conditions, women’s rights, and racial justice can, as the happy ending suggests, be accomplished without communism. Without directly linking progress to the dismantling of capitalism, the film isolates itself from the wider Cold War debate.



[1] Carl R. Weinberg, “‘Salt of the Earth’: Labor, Film, and the Cold War,” Organization of American Historians Magazine of History 24, no. 4 (October 2010): 42.

[2] Ibid., 42-44.

[3] Benjamin Balthaser, “Cold War Re-Visions: Representation and Resistance in the Unseen Salt of the Earth,” American Quarterly 60, no. 2 (June 2008): 349.

[4] Weinberg, “Salt,” 43.

[5] Salt of the Earth, directed by Herbert Biberman (1954; Independent Productions Corporation).

[6] Ibid.

[7] Ibid.

[8] Balthaser, “Cold War,” 350-351. “[The cut scenes] connect the particular and local struggle of the Mexican American mine workers of Local 890 to the larger state, civic, and corporate apparatus of the international cold war; and they link the cold war to a longer U.S. history of imperial conquest, racism, and industrial violence. Together these omissions construct a map of cold war social relations…”

[9] Salt of the Earth, Biberman.

[10] Ibid.

[11] Ibid.

[12] Ibid.

[13] Ibid.

Work, Activism, and Morality: Women in Nineteenth-Century America

This paper argues that nineteenth-century American women viewed work as having a moral nature, and believed this idea extended to public advocacy. The latter is true in two senses: 1) that public advocacy also had a moral nature, and 2) that at times a relationship existed between the moral nature of their work and that of their activism. Private work could be seen as a moral duty or an evil nightmare, depending upon the context, and different women likewise saw activism as either right and proper or unethical and improper. More conservative women, for instance, did not support the shattering of traditional gender roles in the public sphere, seeing the efforts of other women to push for political and social change as troubling no matter the justification. Abolition, women’s rights, and Native American rights, if worth pursuing at all, were the purview of men. Reformist women, on the other hand, saw their public roles as moral responsibilities that echoed those of domestic life or addressed its iniquities. While the moral connection between the two spheres is at times frustratingly tenuous and indirect, let us explore women’s divergent views on the rightness or wrongness of their domestic work and political activity, while considering why some women saw relationships between them. In this context, “work” and its synonyms can be defined as a nineteenth-century woman’s everyday tasks and demeanor — not only what she does but how she behaves in the home as well (as we will see, setting a behavioral example could be regarded as a role as crucial to domestic life as household tasks).

In the 1883 memoir Life Among the Piutes, Sarah Winnemucca Hopkins (born Thocmentony) expressed a conviction that the household duties of Piute women and men carried moral weight.[1] She entitled her second chapter “Domestic and Social Moralities,” domestic moralities being proper conduct regarding the home and family.[2] “Our children are very carefully taught to be good,” the chapter begins — and upon reaching the age of marriage, interested couples are warned of the seriousness of domestic responsibilities.[3] “The young man is summoned by the father of the girl, who asks him in her presence, if he really loves his daughter, and reminds him, if he says he does, of all the duties of a husband.”[4] The concepts of love, marriage, and becoming a family were inseparable from everyday work. The father would then ask his daughter the same question. “These duties are not slight,” Winnemucca Hopkins writes. The woman is “to dress the game, prepare the food, clean the buckskins, make his moccasins, dress his hair, bring all the wood, — in short, do all the household work. She promises to ‘be himself,’ and she fulfils her promise.”[5] “Be himself” may be indicative of becoming one with her husband, or even submitting to his leadership, but regardless of interpretation it is clear, with the interesting use of present tense (“fulfils”) and lack of qualifiers, that there is no question the woman will perform her proper role and duties. There is such a question for the husband, however: “if he does not do his part” when childrearing he “is considered an outcast.”[6] Mothers in fact openly discussed whether a man was doing his duty.[7] For Winnemucca Hopkins and other Piutes, failing to carry out one’s domestic labor was a shameful wrong. This chapter, like the book in general, attempts to demonstrate to a white American audience “how good the Indians were” — not lazy, not seeking war, and so on — and work is positioned as an activity that makes them ethical beings.[8] And ethical beings, it implies, do not deserve subjugation and brutality. True, Winnemucca Hopkins may have emphasized domestic moralities to garner favor from whites with certain expectations of duty — but that does not mean these moralities were not in fact roots of Piute culture. More favor could have been curried by de-emphasizing aspects whites may have felt violated the social norms of work, such as men taking over household tasks, chiefs laboring while remaining poor, and so on; yet the author resists, which could suggest reliability.[9]

Like tending faithfully to private duties, for Winnemucca Hopkins advocacy for native rights was the right thing to do. A moral impetus undergirded both private and public acts. White settlers and the United States government subjected the Piutes, of modern-day Nevada, to violence, exploitation, internment, and removal; Winnemucca Hopkins took her skills as an interpreter and status as chief’s daughter to travel, write, petition, and lecture, urging the American people and state to end the suffering.[10] She “promised my people that I would work for them while there was life in my body.”[11] There was no ambiguity concerning the moral urgency of her public work: “For shame!” she wrote to white America, “You dare to cry out Liberty, when you hold us in places against our will, driving us from place to place as if we were beasts… Oh, my dear readers, talk for us, and if the white people will treat us like human beings, we will behave like a people; but if we are treated by white savages as if we are savages, we are relentless and desperate; yet no more so than any other badly treated people. Oh, dear friends, I am pleading for God and for humanity.”[12] The crimes against the Piutes not only justified Winnemucca Hopkins raising her voice — they should spur white Americans to do the same, to uphold their own values such as faith, belief in liberty, etc. For this Piute leader, just as there existed a moral duty to never shirk domestic responsibilities, there existed a moral duty to not turn a blind eye to oppression.

Enslaved women like Harriet Jacobs understood work in a different way. For Jacobs, the nature of her domestic labor was decidedly immoral.[13] In Incidents in the Life of a Slave Girl (1861), she wrote “of the half-starved wretches toiling from dawn till dark on the plantations… of mothers shrieking for their children, torn from their arms by slave traders… of young girls dragged down into moral filth… of pools of blood around the whipping post… of hounds trained to tear human flesh… of men screwed into cotton gins to die…”[14] Jacobs, a slave in North Carolina, experienced the horrors of being sexual property, forced household work, and the spiteful sale of her children.[15] Whereas Winnemucca Hopkins believed in the rightness of both her private work and her public advocacy, seeing them as related moral duties to the home and to her people, Jacobs experienced an even more direct connection between these spheres: the immorality of her private work led straight to, and justified, her righteous battle for abolition. Even before this, she resisted the evil of her work, most powerfully by running away, but also by turning away from a slaveowner’s sexual advances, among other acts.[16]

After her escape from bondage, Jacobs became involved in abolitionist work in New York and wrote Incidents to highlight the true terrors of slavery and push white women in the North toward the cause.[17] Much of her story has been verified by (and we know enough of slavery from) other sources; she is not merely playing to her audience and its moral sensitivities either.[18] One should note the significance of women of color writing books of this kind. Like Winnemucca Hopkins’ text, Jacobs’ contained assurances from white associates and editors that the story was true.[19] Speaking out to change hearts was no easy task — prejudiced skepticism abounded. Jacobs (and her editor, Lydia Maria Child) stressed the narrative was “no fiction” and expected accusations of “indecorum” over the sexual content, anticipating criticisms that could hamper the text’s purpose.[20] Writing could be dangerous and trying. Jacobs felt compelled to use pseudonyms to protect loved ones.[21] She ended the work by writing it was “painful to me, in many ways, to recall the dreary years I passed in bondage.”[22] Winnemucca Hopkins may have felt similarly. In a world of racism, doubt, reprisals, and trauma, producing a memoir was a brave, powerful act of advocacy.

Despite the pain (and concern her literary skills were inadequate[23]), Jacobs saw writing Incidents as the ethical path. “It would have been more pleasant for me to have been silent about my own history,” she confesses at the start, a perhaps inadvertent reminder that what is right is not always what is easy. She then presents her “motives,” her “effort in behalf of my persecuted people.”[24] It was right to reveal the “condition of two millions of women at the South, still in bondage, suffering what I suffered, and most of them far worse,” to show “Free States what Slavery really is,” all its “dark…abominations.”[25] Overall, the text is self-justifying. The evils of slavery warrant the exposé (Life Among the Piutes is similar). Jacobs’ public advocacy grew from and was justified by her experience with domestic labor and her moral values.

These same things, domestic labor and moral values, precluded public work for more conservative women. During the abolition and women’s rights movements of the nineteenth century, less radical women saw the public roles of their sisters as violating the natural order and setting men and women against each other.[26] Catherine Beecher, New York educator and writer, expressed dismay over women circulating (abolitionist) petitions in her 1837 “Essay on Slavery and Abolitionism, with Reference to the Duty of American Females.”[27] It was against a woman’s moral duty to petition male legislators to act: “…in this country, petitions to congress, in reference to the official duties of legislators, seem, IN ALL CASES, to fall entirely without [outside] the sphere of female duty. Men are the proper persons to make appeals to rulers whom they appoint…”[28] (This is an interesting use of one civil inequity to justify another: only men can vote, therefore only men should petition.) After all, “Heaven has appointed to one sex the superior, and to the other the subordinate station…”[29] Christianity was the foundation of the gender hierarchy, which meant, for Beecher, that women entering the political sphere violated women’s divinely decreed space and responsibilities. Women’s “influence” and “power” were to be exerted through the encouragement of love, peace, and moral rightness, as well as by professional teaching, in the “domestic and social circle.”[30] In other words, women were to hint to men and boys the proper way to act in politics only while at home, school, and so forth.[31] This highlights why domestic “work” must reach definitionally beyond household tasks: just as Winnemucca Hopkins and Jacobs were expected to maintain certain demeanors in addition to completing their physical labors, here women must be shining examples, moral compasses, with bearings above reproach.

To Beecher, direct calls and organizing for political and social change were clearly wrong; they threatened “the sacred protection of religion” and turned woman into a “combatant” and “partisan.”[32] They set women against God and men. Elsewhere, reformist women were also condemned for speaking to mixed-sex audiences, attacking men instead of supporting them, and more.[33] Beecher and other women held values that restricted women to domestic roles, to “power” no more intrusive to the gender order than housework — to adopt these roles was moral, to push beyond them immoral. The connection between the ideological spheres was that one served as an anchor on the other. (Limited advocacy to keep women in domestic roles, however, seemed acceptable: Beecher’s essay was public, reinforcing the expectations and sensibilities of many readers, and she was an activist for women in education, a new role yet one safely distant from politics.[34]) Reformist women, of course, such as abolitionist Angelina Grimké, held views a bit closer to those of Jacobs and Winnemucca Hopkins: women were moral beings, and therefore had the ethical responsibility to confront wrongs just as men did, and from that responsibility came the inherent social or political rights needed for the task.[35]

The diversity of women’s beliefs was the product of their diverse upbringings, environments, and experiences. Whether domestic labor was viewed as moral depended on its nature, its context, its participants; whether engagement in the public sphere was seen as the same varied according to how urgent, horrific, and personal social and political issues were regarded. Clearly, race impacted how women saw work. The black slave could have a rather different perspective on moral-domestic duty than a white woman (of any class). One historian posited that Jacobs saw the evils of forced labor as having a corrosive effect on her own morals, that freedom was a prerequisite to a moral life.[36] A unique perspective born of unique experiences. Race impacted perspectives on activism, too, with voices of color facing more extreme, violent motivators like slavery and military campaigns against native nations. Factors such as religion, political ideology, lack of personal impact, race, class, and so on could build a wall of separation between the private and public spheres in the individual mind, between where women should and should not act, but they could also have a deconstructive effect, freeing other nineteenth-century American women to push the boundaries of acceptable behavior. That domestic work and public advocacy had moral natures, aligning here, diverging there, at times connecting, has rich support in the extant documents.



[1] Sarah Winnemucca Hopkins, Life Among the Piutes (Mount Pleasant, SC: Arcadia Press, 2017), 25-27.

[2] Ibid., 25.

[3] Ibid., 25-26.

[4] Ibid., 26.

[5] Ibid.

[6] Ibid., 27.

[7] Ibid.

[8] Ibid.

[9] Ibid., 27-28.

[10] Ibid., 105-108 offers examples of Winnemucca Hopkins’ advocacy such as petitioning and letter writing. Her final sentence (page 107) references her lectures on the East Coast.

[11] Ibid., 105.

[12] Ibid., 106.

[13] “Slavery is wrong,” she writes flatly. Harriet Jacobs, Incidents in the Life of a Slave Girl, ed. Jennifer Fleischner (New York: Bedford/St. Martin’s, 2020), 95.

[14] Ibid., 96.

[15] Ibid., chapters 5, 16, 19.

[16] Ibid., 51 and chapter 27.

[17] Ibid., 7-18, 26.

[18] Ibid., 7-9.

[19] Ibid., 26-27, 207-209.

   Winnemucca Hopkins, Piutes, 109-119.

[20] Jacobs, Incidents, 25-27.

[21] Ibid., 25.

[22] Ibid., 207.

[23] Ibid., 25-26.

[24] Ibid., 26.

[25] Ibid.

[26] Catherine Beecher, “Essay on Slavery and Abolitionism, with Reference to the Duty of American Females,” in Kathryn Kish Sklar, Women’s Rights Emerges within the Antislavery Movement, 1830-1870 (New York: Bedford/St. Martin’s, 2019), 109-110.

[27] Ibid.

[28] Ibid., 110.

[29] Ibid., 109.

[30] Ibid.

[31] Ibid., 110.

[32] Ibid., 109-110.

[33] “Report of the Woman’s Rights Convention Held at Seneca Falls, N.Y.,” in ibid., 163.

“Pastoral Letter: The General Association of Massachusetts Churches Under Their Care,” in ibid., 120.

[34] Beecher, “Duty,” 109.

[35] Angelina Grimké, “An Appeal to the Women of the Nominally Free States,” in ibid., 103. See also Angelina Grimké, “Letter to Theodore Dwight Weld and John Greenleaf Whittier” in ibid., 132.

[36] Kathleen Kennedy (lecture, Missouri State University, April 12, 2022).

How the Women’s Rights Movement Grew Out of the Abolitionist Struggle

The women’s rights movement of mid-nineteenth century America grew out of the preceding and concurrent abolitionist movement because anti-slavery women recognized greater political power could help end the nation’s “peculiar institution.” The emancipation of women, in other words, could lead to the emancipation of black slaves. This is seen, for example, in the writings of abolitionist activist Angelina Grimké. “Slavery is a political subject,” she wrote in a letter to a friend on February 4, 1837, summarizing the words of her conservative critics, “therefore women should not intermeddle. I admitted it was, but endeavored to show that women were citizens & had duties to perform to their country as well as men.”[1] If women possessed full citizen rights, Grimké implied, they could fully engage in political issues like slavery and influence outcomes as men did. The political project of abolishing slavery necessitated political rights for the women involved in and leading it.

Other documents of the era suggest this prerequisite for abolition in similar ways. Borrowing the ideas of the Enlightenment and the national founding, abolitionists positioned the end of slavery as the acknowledgement of the inalienable rights of enslaved persons — to achieve this end, women’s rights would need to be recognized as well. In 1834, the American Anti-Slavery Society created a petition for women to sign that urged the District of Columbia to abolish slavery, calling for “the restoration of rights unjustly wrested from the innocent and defenseless.”[2] The document offered justification for an act as bold and startling (“suffer us,” “bear with us,” the authors urge) as women petitioning government, for instance the fact that the suffering of slaves meant the suffering of fellow women.[3] Indeed, many Americans believed as teacher and writer Catherine Beecher did, that “in this country, petitions to congress, in reference to the official duties of legislators, seem, IN ALL CASES, to fall entirely without [outside] the sphere of female duty. Men are the proper persons to make appeals to rulers whom they appoint…”[4] It would not do for women to petition male legislators to act. In drafting, circulating, and signing this petition, women asserted a political right (an inalienable right of the First Amendment) for themselves, a deed viewed as necessary in the great struggle to free millions of blacks. (Many other bold deeds were witnessed in this struggle, such as women speaking before audiences.[5])

Beecher’s writings reveal that opponents of women’s political activism understood abolitionists’ sentiments that moves toward gender equality were preconditions for slavery’s eradication. She condemned the “thirst for power” of abolitionists; women’s influence was to be exerted through the encouragement of love, peace, and moral rightness, as well as by professional teaching, in the “domestic and social circle.”[6] The male sex, being “superior,” was the one to go about “exercising power” of a political nature.[7] Here gender roles were clearly defined, to be adhered to despite noble aims. The pursuit of rights like petitioning was, to Beecher, the wrong way to end the “sin of slavery.”[8] Yet this castigation of the pursuit of public power to free the enslaved supports the claim that such a pursuit, with such a purpose, indeed took place.

Overall, reformist women saw all public policy, all immoral laws, within their grasp if political rights were won (a troubling thought to Beecher[9]). In September 1848, one Mrs. Sanford, a women’s rights speaker at a Cleveland gathering of the National Negro Convention Movement, summarized the goals of her fellow female activists: they wanted “to co-operate in making the laws we obey.”[10] The same was expressed weeks before at the historic Seneca Falls convention.[11] This paralleled the words of Grimké above, as well as her 1837 demand that women have the “right to be consulted in all the laws and regulations by which she is to be governed…”[12] Women saw themselves as under the heel of immoral laws. But as moral beings, Grimké frequently stressed, they had the responsibility to confront wrongs just as men did, and from that responsibility came the inherent political rights needed for such confrontations.[13] If a law, such as one granting the right to own human beings, was unjust, women would need power over lawmaking, from petitioning to the vote, to correct it.



[1] Angelina Grimké, “Letter to Jane Smith,” in Kathryn Kish Sklar, Women’s Rights Emerges within the Antislavery Movement, 1830-1870 (New York: Bedford/St. Martin’s, 2019), 93.

[2] The American Anti-Slavery Society, “Petition Form for Women,” in Kathryn Kish Sklar, Women’s Rights Emerges within the Antislavery Movement, 1830-1870 (New York: Bedford/St. Martin’s, 2019), 85.

[3] Ibid.

[4] Catherine Beecher, “Essay on Slavery and Abolitionism, with Reference to the Duty of American Females,” in Kathryn Kish Sklar, Women’s Rights Emerges within the Antislavery Movement, 1830-1870 (New York: Bedford/St. Martin’s, 2019), 110.

[5] “Report of the Woman’s Rights Convention Held at Seneca Falls, N.Y.,” in Kathryn Kish Sklar, Women’s Rights Emerges within the Antislavery Movement, 1830-1870 (New York: Bedford/St. Martin’s, 2019), 163.

[6] Beecher, “Duty,” 109.

[7] Ibid.

[8] Ibid., 111.

[9] Ibid., 110.

[10] “Proceedings of the Colored Convention,” in Kathryn Kish Sklar, Women’s Rights Emerges within the Antislavery Movement, 1830-1870 (New York: Bedford/St. Martin’s, 2019), 168.

[11] “Seneca Falls,” 165.

[12] Angelina Grimké, “Human Rights Not Founded on Sex,” in Kathryn Kish Sklar, Women’s Rights Emerges within the Antislavery Movement, 1830-1870 (New York: Bedford/St. Martin’s, 2019), 135.

[13] Angelina Grimké, “An Appeal to the Women of the Nominally Free States,” in Kathryn Kish Sklar, Women’s Rights Emerges within the Antislavery Movement, 1830-1870 (New York: Bedford/St. Martin’s, 2019), 103. See also Angelina Grimké, “Letter to Theodore Dwight Weld and John Greenleaf Whittier” in ibid., 132.

The Gender Order in Colonial America

Early New England history cannot be properly understood without thorough examination of the ways in which women, or the representations of women, threatened or maintained the gender hierarchy of English society. This is a complex task. Documents written by women and men alike could weaken or strengthen the ideology and practice of male dominance, just as the acts of women, whether accurately preserved in the historical record, distorted in their representation, or lost to humankind forever, could engage with the hierarchy in different ways. (The deeds of men could as well, but that falls beyond the scope of this paper.) This is not to say that every act or writing represented a conscious decision to threaten or shore up the gender order — some likely were, others likely not — but for the historian the outcome or impact grows clear with careful study. Of course, this paper does not posit that every source only works toward one end or the other. In some ways a text might undermine a social system, in other ways bolster it. Yet typically there will be a general trend. Uncovering such an overall impact upon the hierarchy allows for a fuller understanding of any given event in New England from the seventeenth through the early nineteenth century. (That is, from the English perspective. This paper considers how the English saw themselves and others, yet the same analysis could be used for other societies, a point we will revisit in the conclusion.)

Let us begin with a source that works to maintain the gender order, Mary Rowlandson’s 1682 The Sovereignty and Goodness of God. Rowlandson was an Englishwoman from Massachusetts held captive for three months by the Narragansett, Nipmuc, and Wompanoag during King Philip’s War (1675-1676). Her text, which became popular in the colonies, carefully downplays the power of Weetamoo, the female Pocassett Wompanoag chief, whose community leadership, possession of vast land and servants, and engagement in diplomacy and war violated Rowlandson’s Puritan understanding of a woman’s proper place in society.[1] As historian Lisa Brooks writes, “Throughout her narrative, Rowlandson never acknowledged that Weetamoo was a leader equal to a sachem [chief], although this was common knowledge in the colonies. Rather, she labored to represent Weetamoo’s authority as a pretension.”[2] In contrast, Rowlandson had no issue writing of “Quanopin, who was a Saggamore​ [chief]” of the Narragansetts, nor of “King Philip,” Metacom, Wompanoag chief.[3] It was appropriate for men to hold power.

That Rowlandson presented Weetamoo’s authority as an act is a plausible interpretation of the former’s lengthy depiction of Weetamoo dressing herself — this “proud Dame” who took “as much time as any of the Gentry of the land.”[4] She was playing dress-up, playing a part, Rowlandson perhaps implied, an idea that grows stronger with the next line: “When she had dressed her self, her work was to make Girdles of Wampom…”[5] The gentry do not work; the powerful do not labor. How can a working woman have authority? Further, Rowlandson ignored the fact that wampum “work” was a key part of tribal diplomacy, attempted to portray her servitude as unto Quinnapin rather than Weetamoo (giving possessions first to him), and later labeled the chief an arrogant, “proud gossip” — meaning, Brooks notes, “in English colonial idiom, a woman who does not adhere to her position as a wife.”[6] Rowlandson likely felt the need, whether consciously or not, to silence discomforting realities of Native American nations. Weetamoo’s power, and a more egalitarian society, threatened the English gender order, and it would thus not do to present dangerous ideas to a wider Puritan audience.

“Likely” is used as a qualifier because it must be remembered that publication of The Sovereignty and Goodness of God had to go through male Puritan authorities like clergyman Increase Mather, who wrote the preface.[7] It remains an open question how much of this defense of the gender hierarchy comes from Rowlandson and how much from the constraints of that hierarchy upon her: under the eyes of Mather and others, a narrative that did not toe the Puritan line would simply go unpublished. But the overall impact is clear. Rowlandson, as presented in the text, held true to the proper role of women — and thus so should readers.

Conflict with native tribes and captivity narratives held a central place in the colonial English psyche. One narrative that did more to threaten the gender order was that of Hannah Dustin’s captivity, as told by religious leader Cotton Mather, first from the pulpit and then in his 1699 Decennium Luctuosum. Unlike his father Increase, Cotton Mather was in a bit of a bind. Dangerous ideas were already on the loose; his sermon and writings would attempt to contain them.[8] Hannah Dustin of Massachusetts was captured by the Abenaki in 1697, during King William’s War. She was placed in servitude to an indigenous family of two men, three women, and seven children.[9] Finding themselves traveling with so few captors, Dustin and two other servants seized hatchets one night and killed the men and most of the women and children.[10] Dustin and the others scalped the ten dead and carried the flesh to Boston, earning fifty pounds, various gifts, and much acclaim. Mather’s representation of Dustin would have to confront and contextualize a seismic disturbance in the social order: women behaving like men, their use of extreme violence.

Mather first turned to the bible for rationalization, writing that Dustin “took up a Resolution, to imitate the Action of Jael upon Sisera…”[11] In Judges 4, a Kenite woman named Jael hammered a tent peg through the skull of the (male) Canaanite commander Sisera, helping the Israelites defeat the Canaanite army. Mather’s audiences would understand the meaning. Puritan women’s subservient and submissive status was rooted in the bible, yet there were extreme circumstances where female violence was justified; because Dustin’s gruesome act was likewise directed against the enemies of God, it could be tolerated. Mather then used place as justification: “[B]eing where she had not her own Life secured by any Law unto her, she thought she was not Forbidden by any Law to take away the Life of the Murderers…”[12] In other words, Dustin was in Native American territory, a lawless space. This follows the long-established colonizer mindset of our civilization versus their wilderness and savagery, but here, interestingly, that condemned wilderness was used to justify a Puritan’s act.[13] Being unprotected by Puritan laws in enemy lands, Mather wrote, Dustin saw herself as also being free from such laws, making murder permissible. However, the clergyman’s use of “she thought” suggests a hesitation to fully approve of her deed.[14] He nowhere claims what she did was right.

Clearly, Mather attempted to prevent erosion of the gender order through various provisos: a woman murdering others could only be agreeable before God in rare situations; she was outside Puritan civilization and law; and this was only her own view of acceptable behavior. He was also sure to present her husband as a “Courageous” hero who “manfully” saved their children from capture at risk of his own life, as if a reminder of who could normally and properly use violence.[15] Yet Mather could not shield from the public what was already known, acts that threatened the ideology of male superiority and social dominance. The facts remained: a woman got the best of and murdered two “Stout” men.[16] She killed women and children, typically victims of men. She then took their scalps and received a bounty, as a soldier might do. Further, she was praised by men of such status as Colonel Francis Nicholson, governor of Maryland.[17] Mather could not fully approve of Dustin’s actions, but given the acclaim she had garnered neither could he condemn them. Both his relaying of Dustin’s deed and his tacit acceptance presented a significant deviation from social norms to Puritan communities.

Finally, let us consider the diary of Martha Ballard, written 1785-1812. Ballard, a midwife who delivered over eight hundred infants in Hallowell, Maine, left a daily record of her work, home, and social life. This document subverts the gender order by countering contemporaneous texts that positioned men as the only significant actors in the medical and economic spheres.[18] It is true that this diary, unlike the other texts considered here, was never meant for public consumption. However, small acts by ordinary people can undermine social systems, wittingly or not, even when never known to others; texts, which by their nature can be read, can surely do the same. Either way, the diary did not remain private: it was read by Ballard’s descendants, historians, and possibly librarians, meaning its impact trickled beyond its creator and into the wider society of nineteenth-century New England.[19]

Written by men, doctors’ papers and merchant ledgers of this period were silent, respectively, on midwives and women’s economic functions in towns like Hallowell, implying absence or non-involvement, whereas Ballard’s diary illuminated their importance.[20] She wrote, for example, on November 18, 1793: “At Capt Meloys. His Lady in Labour. Her women Calld… My patient deliverd at 8 hour 5 minute Evening of a fine daughter. Her attendants Mrss Cleark, Duttum, Sewall, & myself.”[21] This passage, and the diary as a whole, emphasized that it was common for midwives and women to deliver infants safely and skillfully, with no man or doctor present.[22] Further, entries such as “I have been pulling flax,” “Dolly warpt a piece for Mrs Pollard of 39 yards,” and “Dolly warpt & drawd a piece for Check. Laid 45 yds” made clear that women had economic responsibilities that went beyond their own homes, turning flax into cloth (warping is a key step) that could be traded or sold.[23] Women controlled their labor, earning wages: “received 6/ as a reward.”[24] Though Ballard’s text records the everyday tasks of New England women of her social class, and had a limited readership compared to Rowlandson’s or Mather’s writings, it too presents dangerous ideas that might bother a reader wedded to the gender hierarchy: that women could be just as effective as male doctors, and that the agency and labor of women hinted at possibilities of self-sufficiency.

The events in this essay, the captivity of English women during war and the daily activities of many English women during peace, would look different without gender analysis, without considering how the acts of women and representations of events related to the gender order. Rowlandson would simply be ignorant, failing to understand who her actual master was, Weetamoo’s position, and so on. Dustin’s violence would be business as usual, a captive killing to escape, with all of Mather’s rationalizations odd and unnecessary. Ballard’s daily entries would just be minutiae, with no connection to or commentary on the larger society whence they came. Gender analysis is thus the necessary project. Examining how the gender hierarchy was defended or confronted provides the proper context for a fuller understanding of events — from an English perspective. A future paper might examine other societies, such as Native American nations, in the same way. Clearly, the acts of indigenous women and the (English) representations of those acts influenced English minds, typically threatening their hierarchy. But how did the acts of indigenous women and men, those of English women and men, and indigenous representations of such things engage with Native American tribes’ unique gender systems? We can find hints in English representations (Weetamoo may have been dismayed that Rowlandson violated indigenous gender norms[25]), but for an earnest endeavor, primary sources by native peoples will be necessary, just as English sources enabled this writing.



[1] Lisa Brooks, Our Beloved Kin: A New History of King Philip’s War (New Haven: Yale University Press, 2018), chapter one.

[2] Ibid., 264.

[3] Mary Rowlandson, The Sovereignty and Goodness of God by Mary Rowlandson with Related Documents, ed. Neal Salisbury (Boston: Bedford Books, 2018), 81.

[4] Ibid., 103.

[5] Ibid.

[6] Brooks, Our Beloved Kin, 264, 270.

[7] Rowlandson, Sovereignty, 28; Brooks, Our Beloved Kin, 264.

[8] “The Captivity of Hannah Dustin,” in Mary Rowlandson, The Sovereignty and Goodness of God by Mary Rowlandson with Related Documents, ed. Neal Salisbury (Boston: Bedford Books, 2018), 170-173.

[9] Ibid., 172.

[10] Ibid., 173.

[11] Ibid.

[12] Ibid.

[13] Kirsten Fischer, “The Imperial Gaze: Native American, African American, and Colonial Women in European Eyes,” in A Companion to American Women’s History, ed. Nancy A. Hewitt (Malden MA: Blackwell Publishing, 2002), 3-19.

[14] “Hannah Dustin,” 173.

[15] Ibid., 171-172.

[16] Ibid., 172.

[17] Ibid., 173.

[18] Laurel Thatcher Ulrich, A Midwife’s Tale: The Life of Martha Ballard, Based on Her Diary, 1785-1812 (New York: Vintage Books, 1999), 28-30.

[19] Ibid., 8-9, 346-352.

[20] Ibid., 28-30.

[21] Ibid., 162-163.

[22] See Ibid., 170-172 for infant mortality data.

[23] Ibid., 36, 73, 29.

[24] Ibid., 162. See also page 168.

[25] Brooks, Our Beloved Kin, 265.

The First American Bestseller: Mary Rowlandson’s 1682 ‘The Sovereignty and Goodness of God’

Historian John R. Gramm characterized Mary Rowlandson, an Englishwoman captured by allied Narragansett, Nipmuc, and Wampanoag warriors during King Philip’s War (1675-1676), as “both a victim and colonizer.”[1] This is correct, and it can be observed in what is often labeled the first American bestseller. Rowlandson’s narrative of her experience, The Sovereignty and Goodness of God, is written through these inseparable lenses, a union inherent to the psychology of settler colonialism (to be a colonizer is to be a “victim”) and other power systems. Reading the narrative through both lenses, rather than one, avoids both dehumanization and a colonizer mindset, allowing for a more nuanced study.

Rowlandson as victim appears on the first page, with her town of Lancaster, Massachusetts, attacked by the aforementioned tribes: “Houses were burning,” women and children clubbed to death, a man dying from “split open…Bowels.”[2] On the final page, after she had been held against her will for three months, forced to work, and ransomed for twenty pounds, she was still elaborating on the “affliction I had, full measure (I thought) pressed down and running over” — that cup of divinely ordained hardships.[3] Between war, bondage, the loss of her infant, and hardships such as hunger and cold, Rowlandson was a woman experiencing trauma, swept up in events and horrors beyond her control. “My heart began to fail,” she wrote, signifying her pain, “and I fell a weeping…”[4]

Rowlandson knew she was a victim. She did not know she was a colonizer, at least not in any negatively connoted fashion. Expressions of racial and moral superiority likewise run from opening to close. Native peoples are “dogs,” “beasts,” “merciless and cruel,” marked by “savageness and brutishness.”[5] She saw “a vast difference between the lovely faces of Christians, and the foul looks of these Heathens,” whose land was unadulterated “wilderness.”[6] Puritan society was civilization; native society was animalistic. That Rowlandson’s views persist despite her deeper understanding of and integration with Wampanoag society could be read as evidence of especially strong prejudices (though publication of her work may have required toeing the Puritan line). Regardless, her consciousness was thoroughly defined by religion and what historian Kirsten Fischer called the “imperial gaze.”[7] Rowlandson’s town of Lancaster was in the borderlands, meaning more conflict with Native Americans; she was a prosperous minister’s wife, making religion an even more central part of her life than it was for the average Puritan woman. (Compare this to someone like midwife Martha Ballard, whose distance from Native Americans and lower social class built a consciousness around her labor and relationships with other working women.[8]) Not only is the distinction between herself (civilized) and them (beastly) clear in Rowlandson’s mind, so is the religious difference — though for many European Americans Christianity and civilization were one and the same. The English victims are always described as “Christians,” which positions the native warriors as heathen Others (she of course makes this explicit as well, as noted).

These perspectives, of victim and colonizer, cannot be easily parsed apart. Setting aside Rowlandson’s kidnapping for a moment, settler colonization in some contexts requires a general attitude of victimhood. If “savages” are occupying land you believe God granted to you, as Increase Mather, who wrote Rowlandson’s preface, stated plainly, that is a wrong that can be addressed with violence.[9] Rowlandson is then a victim twofold. First, her Puritan promised land was being occupied by native peoples. Second, she was violently captured and held. To be a colonizer is to be a victim, by having “your” land violated by societies there before you, and by experiencing the counter-violence wrought by your colonization.

To only read Rowlandson’s captivity as victimhood is to simply adopt Rowlandson’s viewpoint, ignoring the fact that she is a foreigner with attitudes of racial and religious superiority who has encroached on land belonging to native societies. To read the captivity only through a colonizer lens, focusing on her troubling presence and views, is to dehumanize Rowlandson and ignore her emotional and physical suffering. When Chief Weetamoo’s infant dies, Rowlandson “could not much condole” with the Wampanoags, due to so many “sorrowfull dayes” of her own, including losing her own baby. She sees only the “benefit…more room.”[10] This callousness could be interpreted as a belief that Native Americans did not suffer like full human beings, mental resistance to an acknowledgement that might throw colonialism into question.[11] That is the colonizer lens. Yet from a victim-centered reading, it is difficult to imagine many contexts wherein a kidnapped person would feel much sympathy for those responsible for her captivity and servitude, the deaths of her infant and neighbors, and so on. Victim and colonizer indeed.



[1] John R. Gramm (lecture, Missouri State University, February 15, 2022).

[2] Mary Rowlandson, The Sovereignty and Goodness of God by Mary Rowlandson with Related Documents, ed. Neal Salisbury (Boston: Bedford Books, 2018), 74.

[3] Ibid., 118.

[4] Ibid., 88.

[5] Ibid., 76-77, 113-114.

[6] Ibid., 100, 76.

[7] Kirsten Fischer, “The Imperial Gaze: Native American, African American, and Colonial Women in European Eyes,” in A Companion to American Women’s History, ed. Nancy A. Hewitt (Malden MA: Blackwell Publishing, 2002), 3-19.

[8] Laurel Thatcher Ulrich, A Midwife’s Tale: The Life of Martha Ballard, Based on Her Diary, 1785-1812 (New York: Vintage Books, 1999). Ballard and her husband, a working middle-class woman and a tax collector, faced financial hardship and ended up living in “semi-dependence on their son’s land”; see page 265. Compare this to Rowlandson, Sovereignty, 15-16: coming from and marrying into large landowning families, Rowlandson did not need to work to survive. Given her background, her consciousness goes beyond women and work, to larger collective concerns of community, civilization, and faith.

[9] Rowlandson, Sovereignty, 28; Lisa Brooks, Our Beloved Kin: A New History of King Philip’s War (New Haven: Yale University Press, 2018), 11.

[10] Rowlandson, Sovereignty, 97.

[11] Brooks, Our Beloved Kin, 282.

Two Thoughts on Salem

Christians Blamed Native Americans for Witchcraft

Boston clergyman Cotton Mather saw New Englanders like Mercy Short as particularly vulnerable to attacks by the Devil in the late seventeenth century due to the presence of Christianity on Native American land (or, more in his parlance, land formerly occupied only by the indigenous). Mather’s A Brand Pluck’d Out of the Burning of 1693 opens with two sentences outlining how Mercy Short was captured by “cruel and Bloody Indians” in her youth.[1] They killed her family and held her for ransom, which was eventually paid. This first paragraph may seem out of place, its only purpose seemingly being to evoke sympathy: see how much this young woman has suffered. “[S]he had then already Born the Yoke in her youth, Yett God Almighty saw it Good for her to Bear more…”[2]

However, the paragraph serves to establish a tacit connection between indigenous people and the witchcraft plaguing Salem. This is made more explicit later in the text, when Mather writes that someone executed at Salem testified “Indian sagamores” had been present at witch meetings to organize “the methods of ruining New England,” and that Mercy Short, in a possessed state, revealed the same, adding Native Americans at such meetings held a book of “Idolatrous Devotions.”[3] Mather, and others, believed Indigenous peoples were involved in the Devil’s work, so torturous to New Englanders. This was perceived to be a reaction to the Puritan presence. “It was a rousing alarm to the Devil,” Mather wrote in The Wonders of the Invisible World (1692), “when a great company of English Protestants and Puritans came to erect evangelical churches in a corner of the world where he had reigned…”[4] The Devil, displeased that Christianity was now “preached in this howling wilderness,” used native peoples to try to drive the Puritans out, including the sorcery of “Indian Powwows,” religious figures.[5] Because of Christianity’s presence in the “New World,” people like Mercy Short were far more at risk of diabolical terror — Mather thought “there never was a poor plantation more pursued by the wrath of the Devil…”[6]

The Accusers Parroted Each Other and No One Noticed

During the Salem witch trials of 1692, hysteria spread and convictions were secured due in part to near-verbatim repetition among the accusers. It seems likely that, rather than arousing suspicion, the fact that New Englanders accusing their neighbors of witchcraft used precisely the same phrasing was viewed as evidence of truth-telling. Elizabeth Hubbard, testifying against a native woman named Tituba, reported: “I saw the apparition of Tituba Indian, which did immediately most grievously torment me…”[7] This occurred until “the day of her examination, being March 1, and then also at the beginning of her examination, but as soon as she began to confess she left off hurting me and has hurt me but little since.”[8] This is nearly identical to testimony given the same day by Ann Putnam, Jr. She said, “I saw the apparition of Tituba, Mr. Parris’s Indian woman, which did torture me most grievously…till March 1, being the day of her examination, and then also most grievously also at the beginning of her examination, but since she confessed she has hurt me but little.”[9] Though premeditation is in the realm of the possible (in other words, Putnam and Hubbard aligning their stories beforehand), this could be the result of spontaneous mimicking, whether conscious or subconscious, in a courtroom that was rather open (the second testifier copied the first because she was present to hear it).

This was a pattern in the trials that strengthened the believability of witchcraft tales. At the trial of Dorcas Hoar, accusers testified that “I verily believe in my heart that Dorcas Hoar is a witch” (Sarah Bibber), “I verily believe that Dorcas Hoar, the prisoner at the bar, is a witch” (Elizabeth Hubbard), “I verily believe in my heart that Dorcas Hoar is a witch” (Ann Putnam, Jr.), and “I verily believe in my heart that Dorcas Hoar is a most dreadful witch” (Mary Walcott).[10] Like the statements on Tituba, these occurred on the same day — a self-generating script that spelled destruction for the accused.



[1] Cotton Mather, A Brand Pluck’d Out of the Burning, in George Lincoln Burr, Narratives of the New England Witch Trials (Mineola, New York: Dover Publications, 2012), 259.

[2] Ibid.

[3] Ibid., 281-282.

[4] Cotton Mather, The Wonders of the Invisible World, in Richard Godbeer, The Salem Witch Hunt: A Brief History with Documents (New York: Bedford/St. Martin’s, 2018), 49.

[5] Ibid.

[6] Ibid.

[7] “Elizabeth Hubbard against Tituba,” in Richard Godbeer, The Salem Witch Hunt: A Brief History with Documents (New York: Bedford/St. Martin’s, 2018), 92.

[8] Ibid.

[9] Ibid., 93.

[10] “Sarah Bibber against Dorcas Hoar,” “Elizabeth Hubbard against Dorcas Hoar,” “Ann Putnam Jr. against Dorcas Hoar,” and “Mary Walcott against Dorcas Hoar,” in Richard Godbeer, The Salem Witch Hunt: A Brief History with Documents (New York: Bedford/St. Martin’s, 2018), 121-122.

History, Theory, and Ethics

The writing of history and the theories that guide it, argues historian Lynn Hunt in Writing History in the Global Era, urgently need “reinvigoration.”[1] The old meta-narratives used to explain historical change looked progressively weaker and fell under heavier criticism as the twentieth century reached its conclusion and gave way to the twenty-first.[2] Globalization, Hunt writes, can serve as a new paradigm. Her work offers a valuable overview of historical theories and develops an important new one, but this paper will argue Hunt implicitly undervalues older paradigms and fails to offer a comprehensive purpose for history under her theory. This essay then proposes some guardrails for history’s continuing development, not offering a new paradigm but rather a framing that gives older theories their due and a purpose that can power many different theories going forward.

We begin by reviewing Hunt’s main ideas. Hunt argues for “bottom-up” globalization as a meta-narrative for historical study, and contributes to this paradigm by offering a rationale for causality and change that places the concepts of “self” and “society” at its center. One of the most important points that Writing History in the Global Era makes is that globalization has varying meanings, with top-down and bottom-up definitions. Top-down globalization is “a process that transforms every part of the globe, creating a world system,” whereas the bottom-up view is myriad processes wherein “diverse places become connected and interdependent.”[3] In other words, while globalization is often considered synonymous with Europe’s encroachment on the rest of the world, from a broader and, as Hunt sees it, better perspective, globalization would in fact be exemplified by increased interactions and interdependence between India and China, for example.[4] The exploration and subjugation of the Americas was globalization, but so was the spread of Islam from the Middle East across North Africa to Spain. It is not simply the spread of more advanced technology or capitalism or what is considered to be, in eurocentrism, the most enlightened culture and value system, either: it is a reciprocal, “two-way relationship” that can be found anywhere as human populations move, meet, and start to rely on each other, through trade for example.[5] Hunt seeks to overcome two problems here. First, the eurocentric top-down approach and its “defects”; second, the lack of a “coherent alternative,” which her work seeks to provide.[6]

Hunt rightly and persuasively makes the case for a bottom-up perspective of globalization as opposed to top-down, then turns to the question of why this paradigm has explanatory power. What is it about bottom-up globalization, the increasing interactions and interdependence of human beings, that brings about historical change? Here Hunt is situating her historical lens alongside, and as successor to, previous ones, explored early in the work. Marxism, modernization, and the Annales School offered theories of causality. Cultural and political change was brought about by new modes of economic production, the growth of technology and the State, or by geography and climate, respectively.[7] The paradigm of identity politics, Hunt notes, at times lacked such a clear “overarching narrative,” but implied that inclusion of The Other, minority or oppressed groups, in the national narrative was key to achieving genuine democracy (which falls more under purpose, to be explored later).[8] Cultural theories rejected the idea, inherent in older paradigms, that culture was produced by economic or social relations; culture was a force unto itself, composed of language, semiotics, and discourse, which determined what an individual thought to be true and how one behaved.[9] “Culture shaped class and politics rather than the other way around” — meaning culture brought about historical change (though many cultural theorists preferred not to focus on causation, perhaps similar to those engaged in identity politics).[10] Bottom-up globalization, Hunt posits, is useful as a modern explanatory schema for the historical field. It brings about changes in the self (in fact in the brain) and in society, which spurs cultural and political transformations.[11] There is explanatory power in increased connections between societies. For instance, she suggests that drugs and stimulants like coffee, brought into Europe through globalization, produced selves that sought pleasure and thrill (i.e. altered the neurochemistry of the brain) and changed society by creating central gathering places, coffeehouses, where political issues could be intensely discussed. These developments may have pushed places like France toward democratic and revolutionary action.[12] For Hunt, it is not enough to say culture alone directs the thinkable and human action, nor is the mind simply a social construction — the biology of the brain and how it reacts and operates must be taken into account.[13] The field, Hunt concludes, must move on from cultural theories.

Globalization, a useful lens through which to view history, joins a long list, only partially outlined above. Beyond economics, advancing technology and government bureaucracy, geography and environment, subjugated groups, and culture, there is political, elite, or even “Great Men” history; social history, the story of ordinary people; the history of ideas, things, and diseases and non-human species; microhistory, biography, a close look at events and individuals; and more.[14] Various ways of looking at history, some of which are true theories that include causes of change, together construct a more complete view of the past. They are all valuable. As historian Sarah Maza writes, “History writing does not get better and better but shifts and changes in response to the needs and curiosities of the present day. Innovations and new perspectives keep the study of the past fresh and interesting, but that does not mean we should jettison certain areas or approaches as old-fashioned or irrelevant.”[15] This is a crucial reminder. New paradigms can reinvigorate, but historians must be cautious of seeing them as signals that preceding paradigms are dead and buried.

Hunt’s work flirts with this mistake, though perhaps unintentionally. Obviously, some paradigms grow less popular, while others, particularly new ones, see surges in adherents. Writing History in the Global Era outlines the “rise and fall” of theories over time, the changing popularities and new ways of thinking that brought them about.[16] One implication in Hunt’s language, though such phrasing is utilized from the viewpoint of historical time or those critical of older theories, is that certain paradigms are indeed dead or of little use — “validity” and “credibility” are “questioned” or “lost,” “limitations” and “disappointments” discovered, theories “undermined” and “weakened” by “gravediggers” before they “fall,” and so forth.[17] Again, these are not necessarily Hunt’s views, rather descriptors of changing trends and critiques, but Hunt’s work offers no nod to how older paradigms are still useful today, itself implying that different ways of writing history are now irrelevant. With prior theories worth less, a new one, globalization, is needed. Hunt’s work could have benefited from more resistance to this implication, with a serious look at how geography and climate, or changing modes of economic production, remain valuable lenses historians use to chart change and find truth — an openness to the full spectrum of approaches, for they all work cooperatively to reveal the past, despite their unique limitations. Above, Maza mentioned “certain areas” of history in addition to “approaches,” and continued: “As Lynn Hunt has pointed out, no field of history [such as ancient Rome] should be cast aside just because it is no longer ‘hot’…”[18] Hunt should have acknowledged and demonstrated that the precise same is true of approaches to history.

Another area that deserves more attention is purpose. In the same way that not all historical approaches emphasize causality and change, not all emphasize purpose. Identity politics had a clear use: the inclusion of subjugated groups in history helped move nations toward political equality.[19] With other approaches, however, “What is it good for?” is more difficult to answer. This is to ask what utility a theory had for contemporary individuals and societies (and has for modern ones), beyond a more complete understanding of yesteryear or fostering new research. It may be more challenging to see a clear purpose in the Annales School’s study of how elements of the longue durée, such as geography and climate, shape human development. How was such a lens utilized as a tool, if in fact it was, in the heyday of the Annales School? How could it be utilized today? (Perhaps it could be useful in mobilizing action against climate change.) The purpose of history — of each historical paradigm — is not always obvious.

Indeed, Hunt’s paradigm “offers a new purpose for history: understanding our place in an increasingly interconnected world,” a rather vague suggestion that sees little elaboration.[20] What does it mean to understand our place? Is this a recycling of “one cannot understand the present without understanding the past,” a mere truism? Or is it to say that a bottom-up globalization paradigm can be utilized to demonstrate the connection between all human beings, breaking down nationalism or even national borders? After all, the theory moves away from eurocentrism and the focus on single nations. Perhaps it is something else; one cannot know for certain. Of course, Hunt may have wanted to leave this question to others, developing the tool and letting others determine how to wield it. However, hesitation on Hunt’s part to more deeply and explicitly explore purpose, to adequately show how her theory is useful to the present, may be a simple desire to avoid the controversy of politics. This would be disappointing to those who believe history is inherently political or anchored to ethics, but either reason is out of step with Hunt’s introduction. History, Hunt writes on her opening page, is “in crisis” due to the “nagging question that has proved so hard to answer…‘What is it good for?’”[21] In the nineteenth and twentieth centuries, she writes, the answer shifted from developing strong male leaders to building national identity and patriotism to contributing to the social movements of subjugated groups by unburying histories of oppression.[22] All of these purposes are political. Hunt deserves credit for constructing a new paradigm, with factors of causality and much fodder for future research, but to open the work by declaring a crisis of purposelessness, framing purposes as political, and then not offering a fully developed purpose through a political lens (or through another lens, explaining why purpose need not be political) is an oversight.

Based on these criticisms, we have a clear direction for the field of history. First, historians should reject any implication of a linear progression of historical meta-narratives, which this paper argues Hunt failed to do. “Old-fashioned” paradigms in fact have great value today, which must be noted and explored. A future work on the state of history might entirely reframe, or at least dramatically add to, the discussion of theory. Hunt tracked the historical development of theories and their critics, with all the ups and downs of popularity. This is important epistemologically, but emphasizes the failures of theories rather than their contributions, and presents them as stepping stones to be left behind on the journey to find something better. Marxism had a “blindness to culture” and had to be left by the wayside; its replacement had this or that limitation and was itself replaced; and so on.[23] Hunt writes globalization will not “hold forever” either.[24] A future work might instead, even if it included a brief, similar tracking, focus on how each paradigm added to our understanding of history, how it continued to do so, and how it does so today. As an example of the second task, Anthony Reid’s 1988 Southeast Asia in the Age of Commerce, 1450-1680 was written very much in the tradition of the Annales School, with a focus on geography, resources, climate, and demography, but it would be lost in a structure like Hunt’s, crowded out by the popularity of cultural studies in the last decades of the twentieth century.[25] Simply put, the historian must break away from the idea that paradigms are replaced. They are replaced in popularity, but not in importance to the mission of more fully understanding the past. As Hunt writes, “Paradigms are problematic because by their nature they focus on only part of the picture,” which highlights the necessity of the entire paradigmatic spectrum, as does her putting globalization theory into practice, suggesting that coffee from abroad spurred revolutionary movements in eighteenth-century Europe, sidelining countless other factors.[26] Every paradigm helps us see more of the picture. It would be a shame if globalization were downplayed as implicitly irrelevant only a couple of decades from now, if still a useful analytical lens. Paradigms are not stepping stones but columns holding up the house of history — more can be added as we go.

The aforementioned theoretical book on the field would also explore purpose, hypothesizing that history cannot be separated from ethics, and therefore from politics. Sarah Maza wrote in the final pages of Thinking About History:

Why study history? The simplest response is that history answers questions that other disciplines cannot. Why, for instance, are African-Americans in the United States today so shockingly disadvantaged in every possible respect, from income to education, health, life expectancy, and rates of incarceration, when the last vestiges of formal discrimination were done away with half a century ago? Unless one subscribes to racist beliefs, the only way to answer that question is historically, via the long and painful narrative that goes from transportation and slavery to today via Reconstruction, Jim Crow laws, and an accumulation, over decades, of inequities in urban policies, electoral access, and the judicial system.[27]

This is correct, and goes far beyond the purpose of answering questions. History is framed as the counter, even the antidote, to racist beliefs. If one is not looking to history for such answers, there is nowhere left to go but biology and racial inferiority, beliefs deemed awful. History therefore informs ethical thinking; its utility is to help us become more ethical creatures, as (subjectively) defined by our society — and the self. This purpose is usually implied but rarely explicitly stated, and a discussion on the future of history should explore it. Now, one could argue that Maza’s dichotomy is simply steering us toward truth, away from incorrect ideas rather than unethical ones. But that does not work in all contexts. When we read Michel Foucault’s Discipline and Punish, he is not demonstrating that modes of discipline are incorrect — and one is hardly confused as to whether he sees them as bad things, these “formulas of domination” and “constant coercion.”[28] J.R. McNeill, at the end of Mosquito Empires: Ecology and War in the Greater Caribbean, 1620-1914, writes that yellow fever’s “career as a governing factor in human history, mercifully, has come to a close” while warning of a lapse in vaccination and mosquito control programs that could aid viruses that “still lurk in the biosphere.”[29] The English working class, wrote E.P. Thompson, faced “harsher and less personal” workplaces, “exploitation,” “unfreedom.”[30] The implications are clear: societies without disciplines, without exploitation, with careful mosquito control would be better societies. For human beings, unearthing and reading history cannot help but create value judgements, and it is a small step from determining what is right to deciding to pursue it, which is political action. It would be difficult, after all, to justify ignoring that which was deemed ethically right.

Indeed, not only do historians implicitly suggest better paths and condemn immoral ones, but the notion that history helps human beings make more ethical choices is already fundamental to how many lay people read history — what is the cliché of being doomed to repeat the unlearned past about if not avoiding tragedies and terrors deemed wrong by present individuals and society collectively? As tired and disputed as the expression is, there is truth to it. Studying how would-be authoritarians often use minority groups as scapegoats for serious economic and social problems to reach elected office in democratic systems creates pathways for modern resistance, making the unthinkable thinkable, changing characterizations of what is right or wrong, changing behavior. Globalization may alter the self and society, but the field of history itself, to a degree, does the same. This could be grounds for a new, rather self-congratulatory paradigm, but the purpose, informing ethical and thus political decision-making, can guide many different theories, from Marxism to globalization. As noted, prior purposes of history were political: forming strong leaders, creating a national narrative, challenging a national narrative. A new political purpose would be standard practice. One might argue moving away from political purposes is a positive step, but it must be noted that the field seems to move away from purpose altogether when it does so. Is purpose inherently political? This future text would make the case that it is. A purpose cannot be posited without a self-evident perceived good. Strong leaders are good, for instance — and therefore should be part of the social and political landscape.

In conclusion, Hunt’s implicit dismissal of older theories and her incomplete purpose for history deserve correction, and doing so pushes the field forward in significant ways. For example, using the full spectrum of paradigms helps us work on (never solve) history’s causes-of-causes ad infinitum problem. Changing modes of production may have caused change x, but what caused the changing modes of production? What causes globalization in the first place? Paradigms can interrelate, helping answer the thorny questions of other paradigms (perhaps modernization or globalization theory could help explain changing modes of production, before requiring their own explanations). How giving history a full purpose advances the field is obvious: it sparks new interest, new ways of thinking, new conversations, new utilizations, new theories, while, like the sciences, offering the potential — but not the guarantee — of improving the human condition.



[1] Lynn Hunt, Writing History in the Global Era (New York: W.W. Norton & Company, 2014), 1.

[2] Ibid., 26, 35-43.

[3] Ibid., 59. See also 60-71.

[4] Ibid., 70.

[5] Ibid.

[6] Ibid., 77.

[7] Ibid., 14-17.

[8] Ibid., 18.

[9] Ibid., 18-27.

[10] Ibid., 27, 77.

[11] Ibid., chapters 3 and 4.

[12] Ibid., 135-141.

[13] Ibid., 101-118.

[14] Sarah Maza, Thinking About History (Chicago: University of Chicago Press, 2017).

[15] Maza, Thinking, 236.

[16] Hunt, Writing History, chapter 1.

[17] Ibid., 8-9, 18, 26-27, chapter 1.

[18] Maza, Thinking, 236.

[19] Hunt, Writing History, 18.

[20] Ibid., 10.

[21] Ibid., 1.

[22] Ibid., 1-7.

[23] Ibid., 8.

[24] Ibid., 40.

[25] Anthony Reid, Southeast Asia in the Age of Commerce, 1450-1680, vol. 1, The Lands Below the Winds (New Haven: Yale University Press, 1988).

[26] Hunt, Writing History, 121, 135-140.

[27] Maza, Thinking, 237.

[28] Michel Foucault, Discipline and Punish (New York: Vintage Books, 1995), 137.

[29] J.R. McNeill, Mosquito Empires: Ecology and War in the Greater Caribbean, 1620-1914 (New York: Cambridge University Press, 2010), 314.

[30] E.P. Thompson, The Essential E.P. Thompson (New York: The New Press, 2001), 17. 

Comparative Power

The practice of reconstructing the past, with all its difficulties and incompleteness, is aided by comparative study. Historians, anthropologists, sociologists, and other researchers can learn a great deal about their favored society and culture by looking at others. This paper makes that basic point, but, more significantly, makes a distinction between the effectiveness of drawing meaning from cultural similarity/difference and doing the same from one’s own constructed cultural analogy, while acknowledging both are valuable methods. In other words, it is argued here that the historian who documents similarities and differences between societies stands on firmer methodological ground for drawing conclusions about human cultures than does the historian who is forced to fill in gaps in a given historical record by studying other societies in close geographic and temporal proximity. Also at a disadvantage is the historian working comparatively with gaps in early documentation that are filled only in later documentation. This paper is a comparison of comparative methods — an important exercise, because such methods are often wielded due to a dearth of evidence in the archives. The historian should understand the strengths and limitations of various approaches (here reciprocal comparison, historical analogy, and historiographic comparison) to this problem.

To begin, consider reciprocal comparison and the meaning derived from such an effort, specifically from likenesses or distinctions. Historian Robert Darnton found meaning in differences in The Great Cat Massacre: And Other Episodes in French Cultural History. What knowledge, Darnton wondered in his opening chapter, could we gain of eighteenth-century French culture by looking at peasant folk tales and contrasting them to versions found in other places in Europe? Whereas similarities might point to shared cultural traits or norms, differences would isolate the particular mentalités of French peasants, how they viewed the world and what occupied their thoughts, in the historical tradition of the Annales School.[1] So while the English version of Tom Thumb was rather “genial,” with helpful fairies, attention to costume, and a titular character engaging in pranks, in the French version the Tom Thumb character, Poucet, was forced to survive in a “harsh, peasant world” against “bandits, wolves, and the village priest by using his wits.”[2] In a tale of a doctor cheating Death, the German version saw Death immediately kill the doctor; with a French twist, the doctor got away with his treachery for some time, becoming prosperous and living to old age — cheating paid off.[3] Indeed, French tales focused heavily on survival in a bleak and brutal world, and on this world’s particularities. Characters with magical wishes asked for food and full bellies; others got rid of children who did not work, put up with cruel step-mothers, and encountered many beggars on the road.[4] Most folk tales mix fictional elements like ogres and magic with socio-economic realities from the place and time they are told, and therefore the above themes reflect the ordinary lives of French peasants: hunger, poverty, the early deaths of biological mothers, begging, and so on.[5] In comparing French versions with those of the Italians, English, and Germans, Darnton noticed unique fixations in French peasant tales and then contrasted these obsessions with the findings of social historians on the material conditions of peasant life, bringing these things together to find meaning, to create a compelling case for what members of the eighteenth-century French lower class thought about day to day and their attitudes towards society.

Now, compare Darnton’s work to ethno-historian Helen Rountree’s “Powhatan Indian Women: The People Captain John Smith Barely Saw.” Rountree uses ethnographic analogy, among other tools, to reconstruct the daily lives of Powhatan women in the first years of the seventeenth century. Given that interested English colonizers had limited access to Powhatan women and a “cloudy lens” of patriarchal eurocentrism through which they observed native societies, and given that the Powhatans left few records themselves, Rountree uses the evidence of daily life in nearby Eastern Woodland tribes to describe the likely experiences of Powhatan women.[6] For example: “Powhatan women, like other Woodland Indian women, probably nurse their babies for well over a year after birth, so it would make sense to keep baby and food source together” by bringing infants into the fields with them as the women work.[7] Elsewhere “probably” is dropped for more confident takes: “Powhatan men and women, like those in other Eastern Woodland tribes, would have valued each other as economic partners…”[8] A lack of direct archival knowledge of Powhatan society and sentiments is shored up through archival knowledge of other native peoples living in roughly the same time and region. The meaning Rountree derives from ethnographic analogy, alongside other techniques and evidence, is that the English were wrong, looking through their cloudy lens, to believe Powhatan women suffered drudgery and domination under Powhatan men. Rather, women experienced a great deal of autonomy, as well as fellowship and variety, in their work, and were considered co-equal partners with men in the economic functioning of the village.[9]  

Both Darnton and Rountree admit their methods have challenges where evidence is concerned. Darnton writes that his examination of folktales is “distressingly imprecise in its deployment of evidence,” the evidence being “vague” because the tales were written down much later — exactly how they were orally transmitted at the relevant time cannot be known.[10] In other words, what if the aspect of a story one marks as characteristic of the French peasant mentalité was not actually in the verbal telling of the tale? It is a threat to the legitimacy of the project. Rountree is careful to use “probably” and “likely” with most of her analogies; the “technique is a valid basis for making inferences if used carefully” (emphasis added), and one must watch out for the imperfections in the records of other tribes.[11] For what if historical understanding of another Eastern Woodland tribe is incorrect, and the falsity is copied over to the narrative of the Powhatan people? Rountree and Darnton acknowledge the limitations of their methods even while firmly believing they are valuable for reconstructing the past. This paper does not dispute that — however, it would be odd if all comparative methods were created equal.

Despite its challenges, reciprocal comparison rests on safer methodological ground, for it at least boasts two actually existing elements to contrast. For instance, Darnton has in his possession folktales from France and from Germany, dug up in the archives, and with them he can notice differences and thus derive meaning about how French peasants viewed the world. Such meaning may be incorrect, but is less likely to be so with support from research on the material conditions of those who might be telling the tales, as mentioned. Rountree, on the other hand, wields a tool that works with but one existing element. Historical, cultural, or ethnographic analogy takes what is known about other peoples and applies it to a specific group suffering from a gap in the historical record. This gap, a lack of direct evidence, is filled with an assumption — which may simply be wrong, without support from other research, like Darnton enjoys, to help out (to have such research would make analogy unnecessary). Obviously, an incorrect assumption threatens to derail derived meaning. If the work of Powhatan women differed in a significant way from that of other Eastern Woodland tribes, unseen and undiscovered and even silenced by analogy, the case for Powhatan economic equality could weaken. Again, this is not to deny the method’s value, only to note the danger that it carries compared to reciprocal comparison. Paradoxically, the inference that Powhatan society resembled other tribes nearby seems as probable and reasonable as it is bold and risky.

Michel-Rolph Trouillot, in Silencing the Past: Power and the Production of History, also found meaning in absence when examining whether Henri Christophe, monarch of Haiti after its successful revolution against the French from 1791 to 1804, was influenced by Frederick the Great of Prussia when Christophe named his new Milot palace “Sans Souci.” Was the palace named after Frederick’s own in Potsdam, or after Colonel Sans Souci, a revolutionary rival Christophe killed? Trouillot studied the historical record and found that opportunities for early observers to mention a Potsdam-Milot connection were suspiciously ignored.[12] For example, Austro-German geographer Karl Ritter, a contemporary of Christophe, repeatedly described his palace as “European” but failed to mention it was inspired by Frederick’s.[13] British consul Charles Mackenzie, “who visited and described Sans Souci less than ten years after Christophe’s death, does not connect the two palaces.”[14] Why was a fact that was such a given for later writers not mentioned early on if it was true?[15] These archival gaps of course co-exist with Trouillot’s positive evidence (“Christophe built Sans Souci, the palace, a few yards away from — if not exactly — where he killed Sans Souci, the man”[16]), but are used to build a case that Christophe had Colonel Sans Souci in mind when naming his palace, a detail that evidences an overall erasure of the colonel from history.[17] By contrasting the early historical record with the later one, Trouillot finds truth and silencing.

This historiographic comparison is different from Rountree’s historical analogy. Rountree fills in epistemological gaps about Powhatan women with the traits of nearby, similar cultures; Trouillot judges the gaps in early reports about Haiti’s Sans Souci palace to suggest later writers were in error and participating in historical silencing (he, like Darnton, is working with two existing elements and weighs the differences). Like Rountree’s, Trouillot’s method is useful and important: the historian should always seek the earliest writings from relevant sources to develop an argument, and if surprising absences exist there is cause to be suspicious that later works created falsities. However, this method too flirts with assumption. It assumes the unwritten is also the unthought, which is not always the case. It may be odd or unlikely that Mackenzie or Ritter would leave Potsdam unmentioned if they believed in its influence, but not impossible or unthinkable. It further assumes a representative sample size — Trouillot is working with very few early documents. Would the discovery of more affect his thesis? As we see with Trouillot and Rountree, and as one might expect, a dearth in the archives forces assumptions.

While Trouillot’s conclusion is probable, he is nevertheless at greater risk of refutation than Darnton or, say, historian Kenneth Pomeranz, who also engaged in reciprocal comparison when he put China beside Europe during the centuries before 1800. Unlike the opening chapter of The Great Cat Massacre, The Great Divergence finds meaning in similarities as well as differences. Pomeranz seeks to understand why Europe, rather than China, experienced an Industrial Revolution, and must sort through many posited causal factors. For instance, did legal and institutional structures more favorable to capitalist development give Europe an edge, contributing to greater productivity and efficiency?[18] Finding similar regulatory mechanisms like interest rates and property rights, and a larger “world of surprising resemblances” before 1750, Pomeranz argued for other differences: Europe’s access to New World resources and trade, as well as to coal.[19] This indicates that Europe’s industrialization occurred not due to the superior intentions, wisdom, or industriousness of Europeans but rather due to unforeseen, fortunate happenings, or “conjunctures” that “often worked to Western Europe’s advantage, but not necessarily because Europeans created or imposed them.”[20] Reciprocal comparison can thus break down eurocentric perspectives by looking at a broader range of historical evidence. No assumptions need be made (rather, assumptions, such as those about superior industriousness, can be excised). As obvious as it is to write, a wealth of archival evidence, rather than a lack, makes for safer methodological footing, as does working with two existing evidentiary elements, no risky suppositions necessary.

A future paper might muse further on the relationship between analogy and silencing, alluded to earlier — if Trouillot is correct and a fact-based narrative is built on silences, how much more problematic is the narrative based partly on analogy?[21] As for this work, in sum, the historian must use some caution with historical analogy, historiographic comparison, and other tools that have an empty space on one side of the equation. These methods are hugely important and often present theses of high probability. But they are by nature put at risk by archival gaps; reciprocal comparison has more power in its derived meanings and claims about other cultures of the past — by its own archival nature.



[1] Anna Green and Kathleen Troup, eds., The Houses of History: A Critical Reader in Twentieth-Century History and Theory, 2nd ed. (Manchester: Manchester University Press, 2016), 111.

[2] Robert Darnton, The Great Cat Massacre: And Other Episodes in French Cultural History (New York: Basic Books, 1984), 42.

[3] Ibid., 47-48.

[4] Ibid., 29-38.

[5] Ibid., 23-29.

[6] Helen C. Rountree, “Powhatan Indian Women: The People Captain John Smith Barely Saw,” Ethnohistory 45, no. 1 (winter 1998): 1-2.

[7] Ibid., 4.

[8] Ibid., 21.

[9] Ibid., 22.

[10] Darnton, Cat Massacre, 261.

[11] Rountree, “Powhatan,” 2.

[12] Michel-Rolph Trouillot, Silencing the Past: Power and the Production of History (Boston: Beacon Press, 1995), 61-65.

[13] Ibid., 63-64.

[14] Ibid., 62.

[15] Ibid., 64.

[16] Ibid., 65.

[17] Ibid., chapters 1 and 2.

[18] Kenneth Pomeranz, The Great Divergence: China, Europe, and the Making of the Modern World Economy (Princeton: Princeton University Press, 2000), chapters 3 and 4.

[19] Ibid., 29, 279-283.

[20] Ibid., 4.

[21] Trouillot, Silencing, 26-27.

When The Beatles Sang About Killing Women

Move over, Johnny Cash and “Cocaine Blues.” Sure, “Early one mornin’ while making the rounds / I took a shot of cocaine and I shot my woman down… Shot her down because she made me slow / I thought I was her daddy but she had five more” are often the first lyrics one thinks of when considering the violent end of the toxic masculinity spectrum in white people music. (Is this not something you ponder? Confront more white folk who somehow only see these things in black music, you’ll get there.) But The Beatles took things to just as dark a place.

Enter “Run For Your Life” from their 1965 album Rubber Soul, a song as catchy as it is chilling: “You better run for your life if you can, little girl / Hide your head in the sand, little girl / Catch you with another man / That’s the end.” Jesus. It’s jarring, the cuddly “All You Need Is Love” boy band singing “Well, I’d rather see you dead, little girl / Than to be with another man” and “Let this be a sermon / I mean everything I’ve said / Baby, I’m determined / And I’d rather see you dead.” But jealous male violence in fact showed up in other Beatles songs as well, and in the real world, with the self-admitted abusive acts and attitudes of John Lennon, later regretted but no less horrific for it.

This awfulness ensured The Beatles would be viewed by many in posterity as contradictory, with proto-feminist themes and ideas of the 1960s taking root in their music alongside possessive, murderous sexism. That is, if these things are noticed at all.


Hegemony and History

The Italian Marxist Antonio Gramsci, writing in the early 1930s while imprisoned by the Mussolini government, theorized that ruling classes grew entrenched through a process called cultural hegemony, the successful propagation of values and norms, which when accepted by the lower classes produced passivity and thus the continuation of domination and exploitation from above. An ideology became hegemonic when it found support from historical blocs, alliances of social groups (classes, religions, families, and so on) — meaning broad, diverse acceptance of ideas that served the interests of the bourgeoisie in a capitalist society and freed the ruling class from some of the burden of using outright force. This paper argues that Gramsci’s theory is useful for historians because its conception of “divided consciousness” offers a framework for understanding why individuals failed to act in ways that aligned with their own material interests or acted for the benefit of oppressive forces. Note that this offering characterizes cultural hegemony as a whole, but it is divided consciousness that permits hegemony to function. Rather than a terminus a quo, however, divided consciousness can be seen as created, at least partially, by hegemony and as responsible for ultimate hegemonic success — a mutually reinforcing system. The individual mind and what occurs within it is the necessary starting point for understanding how domineering culture spreads and why members of social groups act in ways that puzzle later historians.

Divided (or contradictory) consciousness, according to Gramsci, was a phenomenon in which individuals believed both hegemonic ideology and contrary ideas based on their own lived experiences. Cultural hegemony pushed such ideas out of the bounds of rational discussion concerning what a decent society should look like. Historian T.J. Jackson Lears, summarizing sociologist Michael Mann, wrote that hegemony ensured "values rooted in the workers' everyday experience lacked legitimacy… [W]orking class people tend to embrace dominant values as abstract propositions but often grow skeptical as the values are applied to their everyday lives. They endorse the idea that everyone has an equal chance of success in America but deny it when asked to compare themselves with the lawyer or businessman down the street."[1] In other words, what individuals knew to be true from simply functioning in society was not readily applied to the nature of the overall society; some barrier, created at least in part by the process of hegemony, existed. Lears further noted the evidence from sociologists Richard Sennett and Jonathan Cobb, whose subaltern interviewees "could not escape the effect of dominant values" despite also holding contradictory ones, as "they deemed their class inferiority a sign of personal failure, even as many realized they had been constrained by class origins that they could not control."[2] A garbage collector knew it was not his fault that he had never been taught to read properly, yet he blamed himself for his position in society.[3] The result of this contradiction, Gramsci observed, was often passivity, consent to oppressive systems.[4] If one could not apply personal truths to, or contrast them with, the operation of social systems, political action was less likely.

To understand how divided consciousness, for Gramsci, was achieved, it is necessary to consider the breadth of the instruments that propagated dominant culture. Historian Robert Gray, studying how the bourgeoisie achieved hegemony in Victorian Britain, wrote that hegemonic culture could spread not only through the state — hegemonic groups were not necessarily governing groups, though there was often overlap[5] — but through any human institutions and interactions: "the political and ideological are present in all social relations."[6] Everything in Karl Marx's "superstructure" could imbue individuals and historical blocs with domineering ideas: art, media, politics, religion, education, and so on. Gray wrote that British workers in the era of industrialization of course had to be pushed into "habituation" of the new and brutal wage-labor system by the workplace itself, but also through "poor law reform, the beginnings of elementary education, religious evangelism, propaganda against dangerous 'economic heresies,' the fostering of more acceptable expressions of working-class self help (friendly societies, co-ops, etc.), and of safe forms of 'rational recreation.'"[7] The bourgeoisie, then, used many social avenues to manufacture consent, including legal reform that could placate workers. Some activities were deemed acceptable under the new system (joining friendly societies or trade unions) to keep more radical activities out of bounds.[8] It was also valuable to create an abstract enemy, a "social danger" for the masses to fear.[9] The message was that without an embrace of the dominant values and norms of industrial capitalism there would be economic disaster, scarcity, loosening morals, the ruination of the family, and more.[10] The individual consciousness was therefore under assault by the dominant culture from all directions, heavy competition for values derived from lived experience, despite the latter's tangibility. In macro, Gramsci's theory of cultural hegemony, to quote historian David Arnold, "held that popular ideas had as much historical weight or energy as purely material forces" or even "greater prominence."[11] In micro, it can be inferred, things work the same way in the individual mind, with popular ideas as powerful as personal experience, and thus the presence of divided consciousness.

The concept of contradictory consciousness helps historians answer compelling questions and solve problems. Arnold notes Gramsci's questions: "What historically had kept the peasants [of Italy] in subordination to the dominant classes? Why had they failed to overthrow their rulers and to establish a hegemony of their own?"[12] Contextually, why wasn't the peasantry more like the industrial proletariat — the more rebellious, presumed leader of the revolution against capitalism?[13] The passivity wrought from divided consciousness provided an answer. While there were "glimmers" of class consciousness — that is, the application of lived experience to what social systems should be, and the growth of class-centered ideas aimed at ending exploitation — the Italian peasants "largely participated in their own subordination by subscribing to hegemonic values, by accepting, admiring, and even seeking to emulate many of the attributes of the superordinate classes."[14] Their desires, having "little internal consistency or cohesion," even allowed the ruling class to make soldiers of peasants,[15] meaning active participation in maintaining oppressive power structures. Likewise, Lears commented on the work of historian Lawrence Goodwyn and the question of why the Populist movement in the late-nineteenth-century United States largely failed. While not claiming hegemony as the only cause, Lears argued that the democratic movement was most successful in parts of the nation with democratic traditions, where such norms were already within the bounds of acceptable discussion.[16] Where they were not, where elites had more decision-making control, the "received culture" was more popular, with domination seeming more natural and inevitable.[17] Similarly, Arnold's historiographical review of the Indian peasantry found that greater autonomy (self-organization to pursue vital interests) of subaltern groups meant hegemony was much harder to establish, with "Gandhi [coming] closest to securing the 'consent' of the peasantry for middle-class ideological and political leadership," but the bourgeoisie failing to do the same.[18] Traditions and cultural realities could limit hegemonic possibilities; it is just as important to historians to understand why something does not work out as it is to comprehend why something does. As a final example, historian Eugene Genovese found that American slaves demonstrated both resistance to and appropriation of the culture of masters, both in the interest of survival, with appropriation inadvertently reinforcing hegemony and the dominant views and norms.[19] This can help answer questions regarding why slave rebellions took place in some contexts but not others, or even why more did not occur — though, again, acceptance of Gramscian theory does not require ruling out all causal explanations beyond cultural hegemony and divided consciousness. After all, Gramsci himself favored nuance, with coexisting consent and coercion, consciousness of class or lived experience mixing with beliefs of oppressors coming from above, and so on.

The challenge of hegemonic theory and contradictory consciousness lies in parsing out the aforementioned causes. Gray came close to summing it up when he wrote, "[N]or should behavior that apparently corresponds to dominant ideology be read at face value as a direct product of ruling class influence."[20] Here he was arguing that dominant culture was often imparted in indirect ways, not through intentionality of the ruling class or programs of social control.[21] But one could argue: "Behavior that apparently corresponds to dominant ideology cannot be read at face value as a product of divided consciousness and hegemony." It is a problem of interpretation, and it can be difficult for historians to parse out divided consciousness or cultural hegemony from other historical causes and show which has more explanatory value. When commenting on the failure of the Populist movement, Lears mentioned "stolen elections, race-baiting demagogues," and other events and actors with causal value.[22] How much weight should be given to dominant ideology and how much to stolen elections? This interpretive nature can appear to weaken the usefulness of Gramsci's model. Historians have developed potential solutions. For instance, as Lears wrote, "[O]ne way to falsify the hypothesis of hegemony is to demonstrate the existence of genuinely pluralistic debate; one way to substantiate it is to discover what was left out of public debate and to account historically for those silences."[23] If there was public discussion of a wide range of ideas, many running counter to the interests of dominant groups, the case for hegemony is weaker; if public discussion centered on a narrow slate of ideas that served obvious interests, the case is stronger. A stolen election may be assigned less causal value, and cultural hegemony more, if there existed restricted public debate. However, the best evidence for hegemony may remain the close study of individuals, as seen above, who demonstrate some level of divided consciousness. Even in matters of demonstrability, then, contradictory consciousness is key to Gramsci's overall theory. A stolen election may earn less causal value if such insightful individual interviews can be submitted as evidence.

In sum, for Gramscian thinkers divided consciousness is a demonstrable phenomenon that powers (and is powered by) hegemony and the acceptance of ruling class norms and beliefs. While likely not the only cause of passivity to subjugation, it offers historians an explanation as to why individuals do not act in their own best interests that can be explored, given causal weight, falsified, or verified (to degrees) in various contexts. Indeed, Gramsci’s theory is powerful in that it has much utility for historians whether true or misguided.



[1] T.J. Jackson Lears, “The Concept of Cultural Hegemony: Problems and Possibilities,” The American Historical Review 90, no. 3 (June 1985): 577.

[2] Ibid, 577-578.

[3] Ibid, 578.

[4] Ibid, 569.

[5] Robert Gray, "Bourgeois Hegemony in Victorian Britain," in Tony Bennett, ed., Culture, Ideology and Social Process: A Reader (London: Batsford Academic and Educational, 1981), 240.

[6] Ibid, 244.

[7] Ibid.

[8] Ibid, 246.

[9] Ibid, 245.

[10] Ibid.

[11] David Arnold, "Gramsci and the Peasant Subalternity in India," The Journal of Peasant Studies 11, no. 4 (1984): 158.

[12] Ibid, 157.

[13] Ibid, 157.

[14] Ibid, 159.

[15] Ibid.

[16] Lears, “Hegemony,” 576-577.

[17] Ibid.

[18] Arnold, “India,” 172.

[19] Lears, “Hegemony,” 574.

[20] Gray, “Britain,” 246.

[21] Ibid, 245-246.

[22] Ibid, 276.

[23] Lears, “Hegemony,” 586.

How Should History Be Taught?

Debate currently rages over how to teach history in American public schools. Should the abyss of racism receive full attention? Should we teach our children that the United States is benevolent in its wars and use of military power — did we not bring down Nazi Germany? Is the nation fundamentally good based on its history, worthy of flying the flag, or is it responsible for so many horrors that an ethical person would keep the flag in the closet or burn it in the streets? Left and Right and everyone in between have different, contradictory perspectives, but to ban and censor is not ideal. Examining the full spectrum of views will help students understand the world they inhabit and the field of history itself.

While objectivity was once imagined possible, historians now typically understand the true nature of their work. "Through the end of the twentieth century," Sarah Maza writes in Thinking About History, "the ideal of historical objectivity was undermined from within the historical community… The more different perspectives on history accumulated, the harder it became to believe that any historian, however honest and well-intentioned, could tell the story of the past from a position of Olympian detachment, untainted by class, gender, racial, national, and other biases." Selecting and rejecting sources involves interpretation and subconscious bias. Historians looking at the same sources will have different interpretations of meaning, which leads to fierce debates in scholarly journals. Teachers are not value-neutral either. All this is taken for granted. "It is impossible to imagine," Maza writes, "going back to a time when historians imagined that their task involved bowing down before 'the sovereignty of sources.'" They understand it's more complex than that: "The history of the American Great Plains in the nineteenth century has been told as a tale of progress, tragedy, or triumph over adversity," depending on the sources one is looking at and how meaning is derived from them.

But this is a positive thing. It gives us a fuller picture of the past, understanding the experiences of all actors. “History is always someone’s story, layered over and likely at odds with someone else’s: to recognize this does not make our chronicles of the past less reliable, but more varied, deeper, and more truthful.” It also makes us think critically — what interpretation makes the most sense to us, given the evidence offered? Why is the evidence reliable?

If historians understand this, why shouldn't students? Young people should be taught that while historical truth exists, any presentation of historical truth — a history book, say — was affected by human action and sentiment. This is a reality that those on the Left and Right should be able to acknowledge. Given this fact, and that both sides are after the same goal, to teach students the truth, the only sensible path forward is to offer students multiple interpretations. Read A Patriot's History of the United States (Schweikart, Allen) and A People's History of the United States (Zinn). There are equivalent versions of these types of texts for elementary and middle schoolers. Read about why World War II was "The Good War" in your typical textbook, alongside Horrible Histories: Woeful Second World War. Have students read history by conservatives in awe of the greatest country in the whole wide world, as well as by liberals fiercely critical of the nation and many of its people for keeping liberty and democracy exclusive to some far longer than many other countries did. They can study top-down history (great rulers, generals, and leaders drive change) and bottom-up social history (ordinary people coming together drives change). Or compare primary sources from the late nineteenth century to the early twentieth demanding or opposing women's rights. Why not? This gives students a broader view of the past, shows them why arguments and debates over history exist, and helps them understand modern political ideologies.

Most importantly, as noted, it helps students think critically. Many a teacher has said, “I don’t want to teach students what to think, but rather how to think.” This doesn’t seem possible without exploring varying perspectives and asking which one a young person finds most convincing and why. One can’t truly practice the art of thinking without one’s views being challenged, being forced to justify the maintenance of a perspective or a deviation based on newly acquired knowledge. Further, older students can go beyond different analyses of history and play around with source theories: what standard should there be to determine if a primary source is trustworthy? Can you take your standard, apply it to the sources of these two views, and determine which is more solid by your metric? There is much critical thinking to be done, and it makes for a more interesting time for young people.

Not only does teaching history in this way reflect the professional discipline and greatly expand student knowledge and thought, it also aligns with the nature of public schools, or with what the general philosophy of public schools should be. The bent of a history classroom, or the history segment of the day in the youngest grades, is determined by the teacher, but also by the books, curricula, and standards approved or required by the district, the regulations of the state, and so forth. So liberal teachers, districts, and states go their way and conservative teachers, districts, and states go theirs. But who is the public school classroom for, exactly? It's for everyone — which necessitates some kind of openness to a broad range of perspectives (public universities are the same way, as I've written elsewhere).

This may be upsetting and sensible at the same time. On the one hand, “I don’t want my kid, or other kids, hearing false, dangerous ideas from the other side.” On the other, “It would be great for my kid, and other kids, to be exposed to this perspective when it so often is excluded from the classroom.” Everyone is happy, no one is happy. Likely more the latter. First, how can anyone favor bringing materials full of falsities into a history class? Again, anyone who favors critical thinking. Make that part of the study — look at the 1619 Project and the 1776 Report together, and explore why either side finds the other in error. Second, how far do you go? What extreme views will be dignified with attention? Is one to bring in Holocaust deniers and square their arguments up against the evidence for the genocide? Personally, this writer would support that: what an incredible exercise in evaluating and comparing the quantity and quality of evidence (and “evidence”). Perhaps others will disagree. But none of this means there can’t be reasonable limits to presented views. If an interpretation or idea is too fringe, it may be a waste of time to explore it. There is finite time in a class period and in a school year. The teacher, district, and so on will have to make the (subjective) choice (no one said this was a perfect system) to leave some things out and focus on bigger divides. If Holocaust denial is still relatively rare, controversy over whether the Civil War occurred due to slavery is not.

Who, exactly, is afraid of pitting their lens of history against that of another? Probably he who is afraid his sacred interpretation will be severely undermined, she who knows her position is not strong. If you're confident your interpretation is truthful, backed by solid evidence, you welcome all challengers. Even if another viewpoint makes students think in new ways, even pulling them away from your lens, you know your own still imparted important knowledge and made an impression. As the author of a book on racism used in high schools and colleges, what do I have to fear when some conservative writes a book about how things really weren't so bad for black Kansas Citians over the past two centuries? By all means, read both books, think for yourself, decide which thesis makes the most sense to you based on the sources — or create a synthesis of your own. The imaginary conservative author should likewise have no qualms about such an arrangement.

I have thus far remained fairly even-handed, because Leftists and right-wingers can become equally outraged over very different things. But here I will wonder whether the Right would have more anxiety over a multiple-interpretation study specifically. Once a student has learned of the darkness of American history, it is often more difficult to be a full-throated, flag-worshiping patriot. This risk will drive some conservatives berserk. Is the Leftist parent equally concerned that a positive, patriotic perspective on our past alongside a Zinnian version will turn her child into someone less critical, more favorable to the State, even downplaying the darkness? I’m not sure if the Leftist is as worried about that. My intuition, having personally been on both sides of the aisle, is that the risk would be more disturbing for conservatives — the horrors still horrify despite unrelated positive happenings, but the view of the U.S. as the unequivocal good guy is quickly eroded forever. Hopefully I am wrong and that is the mere bias of a current mindset talking. Either way, this pedagogy, the great compromise, is the right thing to do, for the reasons outlined above.

In conclusion, we must teach students the truth — and Americans will never fully agree on what that is, but the closest one could hope for is agreement that this nation and its people have done horrific things as well as positive things. Teaching both is honest and important, and that's what students will see when they examine different authors and documents. In my recent review of a history text, I wrote that the Left "shouldn't shy away from acknowledging, for instance, that the U.S. Constitution was a strong step forward for representative democracy, secular government, and personal rights, despite the obvious exclusivity, compared to Europe's systems." Nor should one deny the genuine American interest in rescuing Europe and Asia from totalitarianism during World War II. And then there are the inventions, art, scientific discoveries, music, and many other things. The truth rests in nuance, as one might expect. James Baldwin said that American history is "more beautiful and more terrible than anything anyone has ever said about it." (What nation does not have both horrors and wonderful things in its history? Where would philosophy be without the German greats?) I've at times envisioned writing a history of the U.S. through a "hypocrisy" interpretation, but it works the same under a "mixed bag" framing: religious dissenters coming to the New World for more freedom and immediately crushing religious dissenters, the men who spoke of liberty and equality who owned slaves, fighting the Nazi master race with a segregated army, supporting democracy in some cases but destroying it in others, and so on. All countries have done good and bad things.

That is a concept the youngest children — and the oldest adults — can understand.


Famous Bands That Sang About Kansas City

One’s city pride quickly swells upon perusing Spotify for songs about Kansas City. There’s much to hear, from the gems of local talent (“Get Out – The KC Streetcar Song,” Kemet the Phantom) to the fantastic artists from afar (“Train From Kansas City,” Neko Case) to the biggest names in music history:

The Beatles sang of Kansas City beginning in 1961 with "Kansas City / Hey-Hey-Hey-Hey," which they took from Little Richard's work of the late 1950s, itself a version of the 1952 classic "Kansas City" by Leiber and Stoller ("I'm going to Kansas City / Kansas City here I come…"). Other famous musicians to record Leiber and Stoller's song include Willie Nelson, James Brown, and Sammy Davis Jr.

Frank Zappa performed the “Kansas City Shuffle.” Van Morrison had “The Eternal Kansas City”: “Dig your Charlie Parker / Basie and Young.” Yusuf (Cat Stevens) sang “18th Avenue (Kansas City Nightmare).” Clearly, and sadly, he did not have a pleasant stay.

Jefferson Airplane was "gonna move to Kansas City"; for Rodgers and Hammerstein, in their 1943 musical Oklahoma!, everything was "up to date in Kansas City." More recently, The New Basement Tapes, The Mowgli's, and of course Tech N9ne have joined in.

I have created a public playlist on Spotify of four hours of songs about KC. It has a bit of everything, from the jazz and blues of yesteryear to the folk and Americana and hip hop of today. It includes famous artists and the obscure, and everyone in between, with some repeats so one can hear different artists tackle the same song. “Kansas City Hornpipe” by Fred Morrison and “Kansas City, Missouri” by Humbird are particularly enjoyable. Some songs, naturally, are better than others, but the most subpar or campy of Spotify’s selection have been excluded (many local artists go nowhere for a reason). Finally, and unfortunately, one of the best hip hop songs about the city, Center of Attention’s “Straight Outta Kauffman,” is not available on Spotify, so it must be listened to elsewhere.

Find some of that “Kansas City wine” (Leiber and Stoller) and enjoy.


Review: ‘A History of the American People’

At times I read books from the other side of the political spectrum, and conservative Paul Johnson’s A History of the American People (1998) was the latest.

This was mostly a decent book, and Johnson deserves credit for various inclusions: a look at how British democracy influenced American colonial democracy, the full influence of religion on early American society, Jefferson’s racism, U.S. persecution of socialists and Wobblies during World War I, how the Democratic Party was made up of southern conservatives and northern progressives for a long time, and more.

However, in addition to (and in alignment with) being a top-down, “Great Men,” traditionalist history, the work dodges the darkness of our national story in significant ways. That’s the only way, after all, you can say things like Americans are “sometimes wrong-headed but always generous” (a blatant contradiction — go ask the Japanese in the camps about generosity) or “The creation of the United States of America is the greatest of all human adventures” (what a wonderful adventure black people had in this country). It’s the pitfall of conservative, patriotic histories — if you want the U.S. to be the greatest country ever, our horrors must necessarily be downplayed.

Thus, black Americans don’t get much coverage until the Civil War, whereas Native Americans aren’t really worth discussing before or after the Trail of Tears era. Shockingly, in this history the internment of the Japanese never occurred. It’s simply not mentioned! Johnson offers a rosy view of what the U.S. did in Vietnam, believing that we should have inflicted more vigorous violence on both Vietnam and Cuba. Poverty doesn’t get much attention. The Founding Fathers’ expressions of protecting their own wealth, class interests, and aristocratic power when designing our democracy naturally go unmentioned. Likewise, American attacks on other countries are always from a place of benevolence and good intentions, rather than, as they often were in actuality, for economic or business interests, to maintain global power, or to seize land and resources. To Johnson, the U.S. had “one” imperialist adventure, its war with Spain — this incredible statement was made not long after his outline of the U.S. invasion of Mexico to expand its borders to the Pacific.

Other events and people given short shrift include LGBTQ Americans, non-European immigrants, and the abolitionist movement — until the end of the book when the modern pro-life movement is compared to it in approving fashion. The labor and feminist movements aren't worth mentioning for their crucial successes, or intersectional solidarity in some places, only for their racism in others. Johnson is rather sympathetic to Richard Nixon, and somehow describes his downfall with no mention of Nixon's attempts, recorded on White House tapes, to obstruct the Watergate investigation — the discovery of which led to his resignation. If anything, the book is a valuable study on how bias, in serious history and journalism, usually manifests itself in the sin of omission, conscious or no, rather than outright falsities, conscious or no (not that conservatives are the only ones who do this, of course; the Left, which can take the opposite approach and downplay positive happenings in American history, shouldn't shy away from acknowledging, for instance, that the U.S. Constitution was a strong step forward for representative democracy, secular government, and personal rights, despite the obvious exclusivity, compared to Europe's systems).

Things really start to go off the rails with this book in the 1960s and later, when America loses its way and becomes not-great (something slavery and women as second-class citizens could somehow never cause), with much whining about welfare, academia, political correctness, and the media (he truly should have read Manufacturing Consent before propagating the myth that the liberal media turned everyone against the war in Vietnam). Affirmative action receives special attention and passion, far more than slavery or Jim Crow, and Johnson proves particularly thick-skulled on other matters of race (Malcolm X is a "black racist," slang and rap are super dangerous, no socio-economic and historical causes are mentioned that could illuminate highlighted racial discrepancies, and so on). Johnson cringingly blames the 1960-1990 crime wave on a less religious society; one wonders what he would make of the dramatic decrease in crime from the 1990s to today, occurring as the percentage of religious Americans continues to plunge — a good lesson on false causation.

All this may not sound at all like a “mostly decent” book, but I did enjoy reading most of it, and — despite the serious flaws outlined here, some unforgivable — most of the information in the space of 1,000 pages was accurate and interesting. It served as a good refresher on many of the major people and events in U.S. history, a look at the perspective of the other side, a prompt for thinking about bias (omission vs. inaccuracy, subconscious vs. conscious), and a reminder of who and what are left out of history — and why.


The Great Debate Over Robert Owen’s Five Fundamental Facts

In the early 1830s, British social reformer Robert Owen, called the "Founder of Socialism"[1] by contemporaries, brought forth his "Five Fundamental Facts" on human nature and ignited in London and elsewhere a dramatic debate — in the literal sense of fiery public discussions, as well as in books, pamphlets, and other works. While the five facts are cited in the extant literature on Owen and his utopian movement, a full exploration of the controversy is lacking, an unfortunate gap given the impression the moment left on witnesses and participants. Famous secularist and editor George Jacob Holyoake, at the end of his life in 1906, wrote, "Human nature in England was never so tried as it was during the first five years" after Owen's writings, when these five facts "were discussed in every town in the kingdom. When a future generation has courage to look into this unprecedented code as one of the curiosities of propagandism, it will find many sensible and wholesome propositions, which nobody now disputes, and sentiments of toleration and practical objects of wise import."[2]

The discourse continued into the 1840s, but its intensity lessened, and thus we will focus our attention on its decade of origin. This work will add to scholarship a little-explored subject, and argue that the great debate transcended common ideological divisions, not simply pitting socialist against anti-socialist and freethinker against believer, but freethinker against freethinker and socialist against socialist as well. The debate was nuanced and complex, and makes for a fascinating study of intellectual history in Victorian Britain, an overlooked piece of the Western discourse on free will going back to the ancient Greek philosophers and of the nature-nurture debate stirred up by John Locke and René Descartes in the 17th century.

The limited historiography of the "Five Fundamental Facts" recognizes their significance. J.F.C. Harrison of the University of Sussex wrote that Owen, in his "confidence in the discoverability of laws governing human action," laws he thought as immutable as physical ones, in fact "provided the beginnings of behavioural science."[3] Indeed, "in an unsophisticated form, and without the conceptual tools of later social psychology, Owen had hit upon the crucial role of character structure in the social process."[4] Further, Nanette Whitbread wrote that the school Owen founded to put his five facts into action and change human nature, the New Lanark Infant School, could "be justly described as the first in the developmental tradition of primary education."[5] However, the facts are normally mentioned only in passing — works on Owen and his movement that make no mention of them at all are not unusual — and for anything close to an exploration of the debate surrounding them one must turn to brief outlines in works like Robert Owen: A Biography by Frank Podmore, not an historian at all, but rather a parapsychologist and a founder of the Fabian Society.[6]

Robert Owen, to quote The Morning Post in 1836, was "alternately venerated as an apostle, ridiculed as a quack, looked up to and followed as the founder of a new philosophy, contemned as a visionary enthusiast, denounced as a revolutionary adventurer."[7] He was born in Wales in 1771, and as a young man came to manage a large textile mill in Manchester and then buy one in New Lanark, Scotland. Influenced by the conditions of the working poor and the ideas of the Enlightenment, and as a prosperous man, he engaged in writing, advocacy, and philanthropy for better working conditions and early childhood education in Britain after the turn of the century. Adopting a philosophy of cooperative, communal economics, Owen purchased an American town, New Harmony in Indiana, in 1825 and ran a utopian experiment that, though it inspired many more across the U.S. and elsewhere, was ultimately unsuccessful. He returned home in 1828, living in London and continuing to write and lecture for broad social change.

Soon Owen brought forth his Outline of the Rational System of Society, in circulation as early as 1832 — and by 1836 “too well known to make it requisite now to repeat,” as a Mr. Alger put it in the Owenite weekly New Moral World.[8] The Home Colonisation Society in London, an organization promoting the formation of utopian communities with “good, practical education” and “permanent beneficial employment” for all, without the “present competitive arrangements of society,” was just one of the work’s many publishers.[9] Owen, not one for modesty, declared it developed “the First Principles of the Science of Human Nature” and constituted “the only effectual Remedy for the Evils experienced by the Population of the world,” addressing human society’s “moral and physical Evils, by removing the Causes which produce them.”[10]

The text from the Home Colonisation Society began with Owen’s “Five Fundamental Facts,” the key to his rational system and therefore the prime target of later criticism.[11] They assert:

1st. That man is a compound being, whose character is formed of his constitution or organization at birth, and of the effects of external circumstances upon it from birth to death; such original organization and external influences continually acting and re-acting each upon the other.

2d. That man is compelled by his original constitution to receive his feelings and his convictions independently of his will.

3d. That his feelings, or his convictions, or both of them united, create the motive to action called the will, which stimulates him to act, and decides his actions.  

4th. That the organization of no two human beings is ever precisely similar at birth; nor can art subsequently form any two individuals, from infancy to maturity, to be precisely similar.

5th. That, nevertheless, the constitution of every infant, except in the case of organic disease, is capable of being formed into a very inferior, or a very superior, being, according to the qualities of the external circumstances allowed to influence that constitution from birth.[12]

As crucial as Owen’s five facts were to the subsequent arguments, he offered no defense of them in the short Society pamphlet, stating them, perhaps expectedly, as fact and immediately proceeding to build upon them, offering twenty points comprising “The Fundamental Laws of Human Nature.” Here again he explained that the character of an individual was malleable according to the environment and society in which he or she developed and existed — and how by building a superior society humanity could allow its members to flourish and maximize well-being. This was the materialism of the early socialists. That section was followed by “The Conditions Requisite for Human Happiness,” “The Principles and Practice of the Rational Religion,” “The Elements of the Science of Society,” and finally a constitution for a new civilization.

This paper will not explore Owen’s specific utopian designs in detail, but at a glance the rational society offered a government focused on human happiness, with free speech, equality for persons of all religions, education for all, gender equality, communal property, a mix of direct and representative democracy, the replacement of the family unit with the larger community structure, an end to punishments, and more. Overall, the needs of all would be provided for collectively, and work would be done collectively — the termination of “ignorance, poverty, individual competition…and national wars” was in reach.[13] Happier people were thought better people — by creating a socialist society, addressing human needs and happiness, “remodelling the character of man” was possible.[14] The five facts aimed to demonstrate this. While this pamphlet and others were brief, in The Book of the New Moral World, Owen devoted a chapter to justifying and explaining each of the five facts, and wrote of them in other publications as well. In that work he clarified, for instance, that it was an “erroneous supposition that the will is free,” an implication of the second and third facts.[15]

The reaction? As Holyoake wrote, in a front-page piece in The Oracle of Reason, “Political economists have run wild, immaculate bishops raved, and parsons have been convulsed at [Owen’s] communities and five facts.”[16] The facts, to many of the pious, smacked of the determinism rejected by their Christian sects. An anonymous letter on the front page of a later edition of the same publication laid out a view held by both Christians and freethinkers: “‘Man’s character is formed for him and not by him’ — therefore, all the religions of the world are false, is the sum and substance of the moral philosophy of R. Owen.”[17] With biological inheritances and environmental influences birthing one’s “feelings and convictions,” one’s “character,” free will was put into question. What moral culpability did human beings then have for their actions, and how could an individual truly be said to make a “choice” to believe or follow religious doctrine? Any religion that rested on free will would be contradictory to reality, and thus untrue. But, the anonymous writer noted, Calvinists and other determinists were safer — they believed in “supernatural” causes that formed one’s character, thus it would be disingenuous to say “all the religions of the world” were fiction, solely on the grounds that individuals did not have mastery over who they were.

The writer then offered further nuance and assistance to ideological opponents (he or she was clearly a freethinker, not only given the journal read and written to but also revealed by lines such as: “But what care religionists for justice in this world or the next? If they cared anything about ‘justice,’ and knew what the word meant, they would have long ere this abandoned the doctrine of an eternal hell”).[18] It was pointed out that “original sin” was found in non-deterministic and deterministic Christian sects alike — a formation of character before birth. “How then can the ‘five facts’ refute all religions…?”[19] If human beings were, from the universal or at least near-universal Christian point of view, shaped by supernatural forces beyond their control after Adam and Eve’s storied betrayal, it was a non sequitur, in the anonymous author’s mind, to say the molding of character invalidated common religions. Here we see an introduction to the complex ways the British of the Victorian era approached the debate.

Yet others were not always so gracious. In 1836, The Monthly Review wrote that “No one doubts the sincerity of Mr. Owen” and his desire to “create a world of happiness,” but “no man who takes for his guides common observation, and common sense — much more, that no person who has studied and who confides in the doctrines of the Bible, can ever become a convert to his views.”[20] The five facts were “intangible” and “obscure,” the arguments “bold, unauthorised, unsupported, ridiculous,” the vision for society as a whole “fanciful, impractical, and irreligious.”[21] How was it, the periodical asked, that these views could be “demonstrably true” yet had “never found acceptance with the mass of sober intelligent thinkers,” only the “paltry, insignificant, uninfluential, and ridiculed class of people” that were the Owenites, and Owen himself, who was “incompetent”?[22] The writer (or writers) further resented how Owen centered himself as something of a savior figure. Ridding the world of evil could be “accomplished by one whose soul like a mirror was to receive and reflect the whole truth and light which concerned the happiness of the world — and I, Robert Owen, am that mirror” — and did not the New Testament already serve the purpose of outlining the path to a more moral and happier world?[23] Overall, it was a scathing attack, an example of the hardline Christian view.

The January 1838 volume of The Christian Teacher, published to “uphold the religion of the New Testament, in contradistinction to the religion of creeds and parties,” included a writing by H. Clarke of Chorley.[24] To him the facts were “inconsistent and fallacious”: facts one, two, and four contradicted the fifth.[25] The first, second, and fourth facts established that a “man’s self” at birth “has at least something to do with forming his character,” but then the fifth established that “by the influence of external circumstances alone, any being” could be transformed into a “superior being.”[26] To Clarke, the facts at first emphasized that one’s biological constitution played a sizable, seemingly co-equal, role in forming one’s character — then the fifth fact threw all that out the window. If anyone could be made into a superior being, just via environment, what sense did it make to say that biology had any effect whatsoever on an individual’s nature?

Owen did seem to view circumstances as the predominant power. Though he firmly believed there existed, as he wrote, a “decided and palatable difference between [infants] at birth” due to biology, he indeed believed in bold, universal results: “selfishness…will cease to exist” alongside “all motives to individual pride and vanity,” and as “all shall be trained, from infancy, to be rational,” a humanity of “superior beings physically, intellectually, and morally” could arise.[27] Clarke was not alone in this critique. J.R. Beard wrote something similar in The Religion of Jesus Christ Defended from the Assaults of Owenism, which further held the common blank slate view of human nature (“at birth there is no mental or moral development”), meaning environment was all that was left: “What is this but to make ‘external circumstances’ the sole creator of the lot of man?”[28]

Clarke further took issue with what he viewed as the contradictory or hypocritical language of the Owenites. "So I learn from the votaries of Owenism…man's feelings and convictions are forced upon him irrespective of his will, it is [therefore] the extreme of folly to ask a man to believe this or that."[29] The Christian believed in belief, but "Owenism denies that man can believe as he pleases…yet strange to tell, almost the first question asked by an Owenite is, 'Do you believe Mr. Owen's five fundamental facts?'"[30] Belief in the five facts, Clarke pointed out, was required to be a member of Owen's association, which an "Appendix to the Laws and Regulations" of the association printed in The New Moral World in 1836 made clear.[31] If one's convictions were formed against one's will, what sense did it make to ask after or require beliefs? Clarke's own beliefs, one should note, while against Owen's views of human nature, were not necessarily hostile to socialism. He preferred "Christ to Mr. Owen, Christian Socialism to the five-fact-socialism."[32]

There were some who saw a distinction between the value of Owen's theories on human nature and that of his planned civilization. In 1836, The Morning Post found Owen, in his Book of the New Moral World, to be "radical" and "destructive," wanting to dissolve civilization and remake it; the idea that humanity had for millennia been living in systems contrary to its own nature and happiness was "almost incredible."[33] But the Post came from a more philosophical position and background than theological ("the Millenium [is] about as probable a consummation as the 'Rational System'").[34] Owen had therefore "displayed considerable acuteness and ability" regarding "metaphysical discussions," making the book worth a read for ontologists and those who enjoyed a "'keen encounter of the wit.'"[35]

As we saw with the anonymous writer in The Oracle of Reason, the five facts divided not only freethinkers and Christians, but also freethinkers as a group. There was too much intellectual diversity for consensus. For example, Charles Southwell, who was “rapidly becoming one of the most popular freethought lecturers in London,” debated Owen’s facts with well-known atheist Richard Carlile in Lambeth, a borough of south London.[36] The room “was crowded to suffocation, and hundreds retired unable to attain admittance. The discussion lasted two nights, and was conducted with talent and good feeling by both parties.”[37] Southwell defended the facts, while Carlile went on the offensive against them. 

The agnostic Lloyd Jones, journalist and friend of Owen, had much to say of Richard Carlile’s lectures on this topic.[38] In A Reply to Mr. R. Carlile’s Objections to the Five Fundamental Facts as Laid Down by Mr. Owen, Jones remarked that Carlile had called Owen’s Book of the New Moral World a “book of blunders” during his talk on November 27, 1837, but the audience “certainly could not avoid observing the multitudinous blunders committed by yourself, in endeavouring to prove it such.”[39] Carlile, according to Jones, insisted that individuals had much more power to steel themselves against circumstances and environments than Owen was letting on, throwing facts one and two into doubt. This is all rather one-sided, as Jones did not even bother to quote Carlile directly, but instead wrote, “You tell us we have a power to adopt or reject [convictions and feelings]: you have not given us your reasons for so saying; in fact, you did not condescend to reason upon any of the subjects broached during the evening’s discussion.”[40] Carlile should “try the question… Can you, by a voluntary action of your mind, believe that to be true which you now consider to be false; — or believe that to be false which you now consider true?… Certainly not.”[41] Jones also defended the idea that conviction and will were distinct, rather than one and the same as Carlile insisted.[42]

For the socialists, many of them of course Owenites anyway, there was much acceptance of the five facts. James Pate, for the Socialists of Padiham, wrote that an Owenite named Mr. Fleming came to their organization and, to a full house of about 300 people, "proved, in a plain yet forcible manner, the truth of the five fundamental facts; and…showed how little difficulty there would be in the practical application of Mr. Owen's views to all classes of society."[43] The audience was "so fully convinced" that few "dared venture to question any remarks."[44] But here divergent thoughts existed too, as we saw with H. Clarke. The branches of religious socialism and secular socialism made for varying thoughts on human nature among the radicals, or simply those sympathetic to or not offended by the idea of socialism. Frederick Lees, for instance, secretary of the British Association for the Suppression of Intemperance, castigated the "infidelity" of Owenism and its five facts but had little to say of socialism, save that it was a front for the former: "In the fair name of Socialism, and in the mask of friendship, Judas like, she [untruth, especially as related to infidelity] seeks to ensnare and betray."[45] Owen's followers, while they professed to desire the "establishment of a 'SOCIAL COMMUNITY,' their chief and greatest object is the ascendancy of an 'INFIDEL CREED.'"[46] Lees, striking a sympathetic note once more, added that Owenites should "dissolve the forced and arbitrary union between their absurd and infidel metaphysics, and the practical or working part of Socialism, which association of the two excites the rightful opposition of all lovers of christian truth…"[47]

For a forceful defense of religious socialism, take T.H. Hudson's lengthy work Christian Socialism, Explained and Enforced, and Compared with Infidel Fellowship: Especially, as Propounded by Robert Owen, Esq., and His Disciples. It was up to "the Christian Religion to secure true socialism," whereas Owen's views were "more likely to serve the purposes of the Prince of darkness."[48] Hudson spent one chapter, about forty pages, attacking the five facts, followed by three chapters, over 120 pages, advocating for Christian Socialism. The five facts were "based on the false assumptions, that man is good by nature" and were "decidedly irreligious."[49] Hudson lambasted the "disguised atheism" of the first fact: it did not mention God as man's creator, nor his spirit or soul, and left him helpless before nature, without free will.[50] The "infidel Socialist," in believing facts two and three, deepened trust in fatalism and the irresponsibility of individuals, but also fell for a "gross contradiction."[51] Hudson pointed out that the second fact established feelings and convictions were received independently of one's will, yet the third fact stated the will was made up of, created by, one's feelings and convictions.[52] The will was first presented as independent of the feelings and convictions, then as their creation. J.R. Beard echoed this: it would have been better to say feelings and convictions were received "anteriorly 'to his will'; for it is obviously his notion that man's will is not independent, but the result, the creation of his feelings and convictions."[53]

Like the atheist Carlile, Hudson thought one could put up "resistance" to external influences, could decide whether to "receive" or reject feelings and convictions — an exercise in willpower, which was thus independent of and prior to feelings and convictions; a person was not a "slave to circumstances."[54] This was a refrain of Owen's critics, with the added element at times of the impossibility of personal change under Owen's theory (indeed the impossibility that changing circumstances could change people). For instance, Minister John Eustace Giles, in Socialism, as a Religious Theory, Irrational and Absurd (1839), based on his lectures in Leeds, wondered how Owen could believe that "'man is the creature of circumstances'" yet "professes to have become wise" — did that not show Owen had "resisted" circumstances?[55] Did not this, plus Owen's desire to "change the condition of the world…thus shew that while man is the creature of circumstances, circumstances are the creatures of man"?[56] After focusing on semantics and perceived ambiguities in the fourth fact, though not closed to the possibility it was a simple truism, Hudson granted the fifth fact's claim that individuals could be improved but was insulted that Christianity, the remedy for "being alienated from God" and for humanity's "depraved nature," was not thought necessary to this improvement alongside changing environments.[57] Indeed, most egregious was the Owenite belief that people were fundamentally good.[58]

Whether due to varying personal beliefs or simply varying cautions about driving away potential converts in a pious age, the actual presentation of the fundamental facts as irreligious was not consistent. Lloyd Jones, in an 1839 debate with Mr. Troup, editor of The Montrose Review, over whether socialism was atheistic, asked some variant of "Where is the Atheism here?" after reading each of the five facts.[59] Owen, by contrast, also an unbeliever, in an 1837 debate with Rev. J.H. Roebuck of Manchester, called religions "geographical insanities" that could be wiped away by the five facts.[60] "Mr. Roebuck stated…that the two systems for which we contend are opposed to each other, and that both, therefore, cannot be true. Herein we perfectly agree."[61] The national discourse so intertwined the facts and the question of God that a person, on either side of the debate, could not help but assume that one would accompany the other. When a debate on "the mystery of God" was proposed to Owenite J. Smith in January 1837, "the challenge was [mis]understood by myself and all our friends, to be the discussion of the five fundamental facts."[62]

Overall, perhaps Robert Owen’s facts flustered the religious and irreligious, and socialists and anti-socialists alike, because they were simply so counterintuitive — not to mention theoretical, without contemporary science to back them up. Owen wrote, in The Book of the New Moral World, for instance: “Man is not, therefore, to be made a being of a superior order by teaching him that he is responsible for his will and his actions.”[63] Such blunt statements turned on its head what many, across ideologies, judged common sense. Owen’s ideas were “contrary to common sense” for Hudson, Christian socialist, in the same way they were “opposed to the common sense of mankind” for Giles, anti-socialist.[64] Would not teaching individual moral responsibility enable personal change and create a better society? Not so for Owen. The will was formed by circumstances — thus true personal change came about by purposefully changing environments. Create a better society first, and the positive personal change would follow. These were, according to Owen, “the laws of nature respecting man, individually, and the science of society,” and few posited laws of nature, proven or otherwise, do not provoke intense philosophical debate.[65]



[1] J. Eustace Giles, Socialism, as a Religious Theory, Irrational and Absurd: the First of Three Lectures on Socialism (as Propounded by Robert Owen and Others) Delivered in the Baptist Chapel South-Parade, Leeds, September 23, 1838 (London: Simpkin, Marshall, & Co., Ward & Co., G. Wightman, 1838), 4, retrieved from https://babel.hathitrust.org/cgi/pt?id=uiuo.ark:/13960/t63560551&view=1up&seq=10&q1=founder.

[2] George Jacob Holyoake, The History of Co-operation (New York: E.P. Dutton & Company, 1906), 1:147.

[3] J.F.C. Harrison, Robert Owen and the Owenites in Britain and America (Abingdon: Routledge, 2009), 66.

[4] Ibid.

[5] Nanette Whitbread, The Evolution of the Nursery-infant School: A History of Infant and Nursery Education in Britain, 1800-1970 (Abingdon: Routledge, 2007), 39:9-10.

[6] Frank Podmore, Robert Owen: A Biography (London: Hutchinson & CO, 1906), 481-482, 499-502.

[7] The Morning Post, September 14, 1836, cited in “The Book of the New Moral World,” The New Moral World (Manchester: Abel Heywood, 1836-7), 3:6, retrieved from https://babel.hathitrust.org/cgi/pt?id=uc1.31970026956075&view=1up&seq=18&size=125&q1=%22five%20fundamental%20facts%22.

[8] The Westminster Review (London: Robert Heward, 1832), 26:317, retrieved from https://babel.hathitrust.org/cgi/pt?id=nyp.33433096159896&view=1up&seq=329&q1=%22five%20fundamental%20facts%22; The New Moral World (London: Thomas Stagg, 1836), 2:62, retrieved from https://babel.hathitrust.org/cgi/pt?id=uc1.31970026956117&view=1up&seq=74&q1=%22five%20fundamental%20facts%22.

[9] Robert Owen, Outline of the Rational System of Society (London: Home Colonization Society, 1841), 2, retrieved from https://babel.hathitrust.org/cgi/pt?id=hvd.hnsp9t&view=1up&seq=6.

[10] Ibid, 1.

[11] This was explicitly stated by critics. Dismantle the five facts and the rest of the system goes down with it. See T.H. Hudson, Christian Socialism, Explained and Enforced, and Compared with Infidel Fellowship, Especially, As Propounded by Robert Owen, Esq., and His Disciples (London: Hamilton, Adams, and Co., 1839), 52, retrieved from https://babel.hathitrust.org/cgi/pt?id=nyp.33433075925721&view=1up&seq=62&q1=%22fundamental%20facts%22.

[12] Owen, Outline, 3.

[13] Ibid, 14.

[14] Ibid.

[15] Robert Owen, The Book of the New Moral World (London: Richard Taylor, 1836), 17, retrieved from https://babel.hathitrust.org/cgi/pt?id=mdp.39015003883991&view=1up&seq=47&q1=%22five%20fundamental%20facts%22.

[16] The Oracle of Reason (London: Thomas Paterson, 1842), 1:113, retrieved from https://archive.org/details/oracleofreasonor01lond/page/112/mode/2up?q=five+facts.

[17] Ibid, 161.

[18] Ibid.

[19] Ibid.

[20] The Monthly Review (London: G. Henderson, 1836), 3:62, retrieved from https://babel.hathitrust.org/cgi/pt?id=umn.319510028065374&view=1up&seq=80&q1=%22five%20fundamental%20facts%22.

[21] Ibid, 62, 67-68.

[22] Ibid, 63.

[23] Ibid, 62-63.

[24] The Christian Teacher and Chronicle of Beneficence (London: Charles Fox, 1838), 4:219, retrieved from https://babel.hathitrust.org/cgi/pt?id=hvd.ah6jrz&view=1up&seq=255&q1=%22five%20facts%22.

[25] Ibid.

[26] Ibid, 220.

[27] Owen, Book, 22-24.

[28] J.R. Beard, The Religion of Jesus Christ Defended from the Assaults of Owenism (London: Simpkin, Marshall and Company, 1839), 233, retrieved from https://babel.hathitrust.org/cgi/pt?id=hvd.hnmy5r&view=1up&seq=243&q1=%22second%20fact%22.

[29] Christian Teacher, 220.

[30] Ibid.

[31] Ibid, 220; New Moral World, 2:261.

[32] Christian Teacher, 220.

[33] New Moral World, 3:6.

[34] Ibid.

[35] Ibid.

[36] Edward Royle, Victorian Infidels: The Origins of the British Secularist Movement, 1791-1866 (Manchester: Manchester University Press, 1974), 69.

[37] The New Moral World (Leeds: Joshua Hobson, 1839), 6:957, retrieved from https://babel.hathitrust.org/cgi/pt?id=uc1.31970026956133&view=1up&seq=361&size=125&q1=%22five%20fundamental%20facts%22.

[38] Regarding Jones’ agnosticism, see: Report of the Discussion betwixt Mr Troup, Editor of the Montrose Review, on the part of the Philalethean Society, and Mr Lloyd Jones, of Glasgow, on the part of the Socialists, in the Watt Institution Hall, Dundee on the propositions, I That Socialism is Atheistical; and II That Atheism is Incredible and Absurd (Dundee: James Chalmers & Alexander Reid, 1839), retrieved from shorturl.at/pvxM1.

[39] Lloyd Jones, A Reply to Mr. Carlile’s Objections to the Five Fundamental Facts as Laid Down by Mr. Owen (Manchester: A. Heywood, 1837), 4, retrieved from https://babel.hathitrust.org/cgi/pt?id=wu.89097121669&view=1up&seq=12&q1=%22five%20fundamental%20facts%22.

[40] Ibid, 9.

[41] Ibid.

[42] Ibid, 10-11.

[43] New Moral World, 3:380.

[44] Ibid.

[45] Frederick R. Lees, Owenism Dissected: A Calm Examination of the Fundamental Principles of Robert Owen’s Misnamed “Rational System” (Leeds: W.H. Walker, 1838), 7, retrieved from https://babel.hathitrust.org/cgi/pt?id=uiug.30112054157646&view=1up&seq=7&q1=%22socialism%22.

[46] Ibid, 16.

[47] Ibid.

[48] Hudson, Christian Socialism, 4, 13.

[49] Ibid, 50-51.

[50] Ibid, 53-63.

[51] Ibid, 63-64, 66.

[52] Ibid, 66.

[53] Beard, Religion, 234.

[54] Hudson, Christian Socialism, 65-66.

[55] Giles, Socialism, 7.

[56] Ibid.

[57] Hudson, Christian Socialism, 72-81, 87-88.

[58] Ibid, 89.

[59] Report of the Discussion, 12.

[60] Public Discussion, between Robert Owen, Late of New Lanark, and the Rev. J.H. Roebuck, of Manchester (Manchester: A. Heywood, 1837), 106-107, retrieved from https://babel.hathitrust.org/cgi/pt?id=uc1.c080961126&view=1up&seq=111&q1=%22fundamental%20facts%22.

[61] Ibid, 107.

[62] New Moral World, 3:122.

[63] Owen, Book, 20.

[64] Hudson, Christian Socialism, 65; Giles, Socialism, 36.

[65] Owen, Book, 20.

On the Spring-Stone Debate

While finding a decisive victor in debates over semantics and historical interpretation often proves difficult, in the lively clash between historians David Spring and Lawrence Stone on social mobility into Britain’s landed elite, the former presented the stronger case. The discourse of the mid-1980s centered on two questions: how to define “open” when considering how open the upper echelon was to newcomers from 1540-1880 and, most importantly, how open it was to newcomers who came from the business world. On both counts, Spring offered the more compelling perspective on how one should regard the historical evidence and data Stone collected in his work An Open Elite?: namely, that it was reasonable to call the landed elite open to members of lower strata, including business leaders.

The debate quickly blurred the lines between the two questions. In his review of An Open Elite?, Spring noted that Stone showed growth in elite families from 1540 to 1879, beginning with forty families and seeing 480 join them, though not all permanently. Further, “Stone shows that regularly one-fifth of elite families were newcomers.”[1] In his reply, Stone declined to explore the “openness” of a twenty percent entry rate because it was, allegedly, irrelevant to his purpose: he was interested only in the entry of businessmen like merchants, speculators, financiers, and manufacturers, who did not come from the gentry, the relatively well-off stratum knocking at the gate of the landed elite. Spring “failed to distinguish between openness to new men, almost all from genteel families, who made a fortune in the law, the army, the administration or politics…and openness to access by successful men of business, mostly of low social origins.”[2]

True, Stone made clear who and what he was looking at in An Open Elite?: the “self-made men,” the “upward mobility by successful men of business,” and so on, but he leaned into, rather than brushed aside or contradicted, the idea of general social immobility.[3] Observe, for instance, the positioning of: “When analysed with care…the actual volume of social mobility has turned out to be far less than might have been expected. Moreover, those who did move up were rarely successful men of business.”[4] The notion of a landed elite closed off in general was presented first, followed by the specific concern about businessmen. Stone went beyond business many times (for instance: “the degree of mere gentry penetration up into the elite was far smaller than the earlier calculations would indicate”[5]), positing that the landed elite was closed not only to businessmen but to newcomers generally, which made his protestations against Spring rather disingenuous. Stone insisted to Spring that an open elite specifically meant, to historians and economists, a ruling class open to businessmen, not to all, but Stone himself had opened the door to the question of whether the landed elite was accessible to everyone by answering nay in his book. The question was therefore fair game in the debate, and Spring was there to provide the more convincing answer. A group composed of twenty percent newcomers from below could, to most reasonable persons, be described as relatively open. Even more so with the sons of newcomers added in: the landed elite was typically one-third newcomers and sons of newcomers, as Spring pointed out. It should be noted that both scholars highlighted the challenge of using quantitative data to answer such historical questions. The collection and publication of such numbers are highly important, but they hardly end the discussion — the question of openness persists, and any answer is inherently subjective.

However, it was the second point of contention where Spring proved most perceptive. He pointed out that while the gentry constituted 181 entrants into the landed elite during the observed centuries, those involved in business were not far behind, with 157, according to Stone’s data. This dwarfed the seventy-two from politics and seventy from the law. As Spring wrote, Stone’s quantitative tables conflicted with his text. Stone wrote in An Open Elite? that “most of the newcomers were rising parish gentry or office-holders or lawyers, men from backgrounds not too dissimilar to those of the existing county elite. Only a small handful of very rich merchants succeeded in buying their way into the elite…”[6] Clearly, even with different backgrounds, businessmen were in fact more successful at entering the landed elite than politicians and lawyers in the three counties Stone studied. What followed a few lines down in the book from Stone’s selected words made far more sense when considering the data: businessmen comprised “only a third of all purchasers…”[7] The use of “only” was perhaps rather biased, but, more significantly, one-third aligned not with the idea of a “small handful” but with the reality of 157 new entrants: of the roughly 480 total, about a third were business entrants, a bit more than a third gentry, and a bit less than a third lawyers, politicians, and so on. Spring could have stressed the absurdity, in this context, of the phrase “only a third,” but he did highlight the statistic in his rejoinder, where he drove home the basic facts of Stone’s findings and reiterated that the landed elite was about as open to businessmen as to others. Here is where quantitative data truly shines in history, for numbers can be compared against each other. The question of whether a single given number or percentage is big or small is messy and subjective, but whether one number is larger than another is not, and it provides clarity on issues like whether businessmen had some special difficulty accessing Britain’s landed elite.

Stone failed to respond directly to this point, a key moment that weakened his case; instead he sidetracked into issues concerning the permanence of newcomers and by-county versus global perspectives on the data, areas he had explored earlier in his response, now awkwardly grafted onto Spring’s latest argument. The reader is largely left to pick up on what is being implied, based on Stone’s earlier comments on those issues. He noted that only twenty-five businessmen of the 157 came from the two counties distant from London, seemingly implying that Hertfordshire, the London-area county, had tipped the scales: merchants and others were not as likely to rise into the landed elite in more rural areas. What relevance that had is an open question — it seemed more a truism than an argument against Spring’s point, as London was a center for business, and that result was therefore to be expected. Regardless, he did not elaborate. The adjacent implication was that Spring was again seeing “everything from a global point of view which has no meaning in reality, and nothing from the point of view of the individual counties.”[8] In the debate, Stone often cautioned that it made sense to look at counties individually, as they could be radically distinct — one should not simply look at the aggregated data. But the inherent problem with Stone’s attempted rebuttal was that he was using the global figures to make his overall case. He took three counties and lifted them up to represent a relatively closed elite in Britain as a whole. It would not do to now brush aside one county or lean heavily on another to bolster an argument. Spring, in a footnote, wrote something similar, urging Stone to avoid “making generalizations on the basis of one county. [Your] three counties were chosen as together a sample of the nation.”[9] To imply, as Stone did, that London could be ignored as some kind of anomaly contradicted his entire project.

Stone’s dodge into the permanence of entrants was likewise not a serious response to Spring’s observation that business-oriented newcomers nearly rivaled those from the gentry and far outpaced lawyers and politicians. He wrote that “of the 132 business purchasers in Hertfordshire, only 68 settled in for more than a generation…”[10] The transient nature of newcomers arose elsewhere in the debate as well. Here Stone moved the goalposts slightly: instead of mere entrants into the landed elite, look at who managed to remain. Only “4% out of 2246 owners” in the three counties over these 340 years were permanent newcomers from the business world.[11] It was implied these numbers were both insignificant and special to businesspersons. Yet footnote five, that associated with the statistic, undercut Stone’s point. Here he admitted Spring correctly observed that politicians and officeholders were forced to sell their county seats, their magnificent mansions, and abandon the landed elite, as defined by Stone, at nearly the same rate as businessmen, at least in Hertfordshire. Indeed, it was odd Stone crafted this response, given Spring’s earlier dismantling of the issue. The significance of Stone’s rebuttal was therefore unclear. If only sixty-eight businessmen lasted more than a generation, how did that compare to lawyers, office-holders, and the gentry? Likewise, if four percent of businessmen established permanent generational residence among the landed elite, what percentages did other groups earn? Again, Stone did not elaborate. But from his admission and what Spring calculated, it seems unlikely Stone’s numbers, when put in context, would help his case. Even more than the aggregate versus county comment, this was a non-answer.

The debate would conclude with a non-answer as well. There was of course more to the discussion — it should be noted that Stone put up an impressive defense of the selection of his counties and of the inability to include more, in response to Spring’s questioning of how representative they truly were — but Spring clearly showed, using Stone’s own evidence, that the landed elite was what a reasonable person could call open to outsiders in general and businessmen in particular, contradicting Stone’s positions on both counts in An Open Elite? Stone may have recognized this, given the paucity of counterpoints in his “Non-Rebuttal.” In Stone’s view, Spring “fail[ed] altogether to deal in specific details with the arguments used in my Reply,” and therefore “there is nothing to rebut.”[12] While it is true that Spring, in his rejoinder, did not address all of Stone’s points, he did focus tightly on the main ideas discussed in the debate and in this paper. So, as further evidence that Spring constructed the better case, Stone declined to return to Spring’s specific and central arguments about his own data, pointing instead to other research that more generally supported the idea of a closed elite. Stone may have issued a “non-rebuttal” not because Spring had ignored various points, but because Spring had stuck to the main ones, and there was little to be said in response.



[1] Eileen Spring and David Spring, “The English Landed Elite, 1540-1879: A Review,” Albion: A Quarterly Journal Concerned with British Studies 17, no. 2 (Summer 1985): 152.

[2] Lawrence Stone, “Spring Back,” Albion: A Quarterly Journal Concerned with British Studies 17, no. 2 (Summer 1985): 168.

[3] Lawrence Stone, An Open Elite? England 1540-1880, abridged edition (Oxford: Oxford University Press, 1986), 3-4.

[4] Ibid, 283.

[5] Ibid, 130.

[6] Ibid, 283.

[7] Ibid.

[8] Stone, “Spring Back,” 169.

[9] Spring, “A Review,” 154.

[10] Stone, “Spring Back,” 171.

[11] Ibid.

[12] Lawrence Stone, “A Non-Rebuttal,” Albion: A Quarterly Journal Concerned with British Studies 17, no. 3 (Autumn 1985): 396. For Spring’s rejoinder, see Eileen Spring and David Spring, “The English Landed Elite, 1540-1879: A Rejoinder,” Albion: A Quarterly Journal Concerned with British Studies 17, no. 3 (Autumn 1985): 393-396.

How to Write and Publish a Book (Odds Included)

My experience with writing books and finding publishers is extremely limited, but a few early insights might make it easier for others interested in doing the same. The following, it should be noted, relates to nonfiction works — the only world I know — but most of it could probably be applied to novels.

First, write your book. Take as much or as little time as needed. I cranked out the first draft of Racism in Kansas City in four months and promptly began sending it out to publishers (April 2014). Why America Needs Socialism I wrote off and on for six years, at the end throwing everything out (300 pages) and starting over (though making much use of old material), finishing a new version in five months. Just make your work the absolute best it can be, in content and grammar. You can, however, reach out to certain publishers before your manuscript is wholly finished. Pay attention to each press’s submission guidelines, but for most publishers an unfinished manuscript isn’t a big deal (many ask you to explain how much of the work is complete and how long it will take to finish). I feel safest having the manuscript done, and it would likely be risky to reach out if you didn’t have over half the piece written — your proposal to publishers will include sample chapters, and if editors like those they will ask for more: the whole manuscript thus far.

You’ll scour the internet for publishers who print books like yours and who accept unsolicited materials, meaning you can contact them instead of a literary agent. If you want the big houses like Simon & Schuster, Penguin Random House, or HarperCollins, you’ll need an agent, and I have no experience with that and thus have no advice. But a million small- and medium-sized publishers exist that will accept unsolicited queries from you, including significant places like Harvard University Press or Oxford University Press.

Following the firm’s guidelines on its website, you’ll generally email a book proposal, which offers an overview, target audience, table of contents, analysis of competing titles, author information, timeline, and sample chapter. If there’s no interest you won’t usually get a reply, but if there is you’ll be asked for the manuscript. It’s an easy process, there’s simply much competition and varying editor interests and needs, so you have to do it in volume. Keep finding houses and sending emails until you’re successful. From March to May 2018, I sent a proposal for Why America Needs Socialism to 91 publishers. Eight (about 9%) requested the full manuscript, and two (about 2%) wanted to publish it. The terms of the first offer were unfavorable (walk away if you have to), but by September, after seven months of searching, a home for the book was secured, Ig Publishing in New York City.

The same technique and persistence are required when seeking blurbs of praise for the back cover and promotional materials. You simply find ways to call or email a dozen or so other authors and prominent people and explain your book and publisher, and a handful of them (four, in my case) will accept your manuscript and agree to write a sentence of praise if they like it (or write a foreword, or peer review it, or whatever you seek). It is very convenient for nonfiction authors that so many of the folks you’d want to review your book are university professors: you simply find Cornel West’s email address on Harvard’s faculty page. Similarly, you shotgun emails to publications when the book comes out and ask them to review it. I sent a query to 58 magazines, papers, journals, and websites I thought would be interested in reviewing Why America Needs Socialism, offering to send a copy. Seven (12%) asked for the book so they could review it; two others invited me to write a piece on the work myself for their publications.

I didn’t keep such careful records of my Racism in Kansas City journey, but after I began submitting proposals it took three months to find a publisher who agreed to publish the work — temporarily. I made the mistake of working for 10 months with a publisher without a contract. At times, publishers will ask you to made revisions before signing a contract, a big gamble (that I wasn’t even really aware of at the time). This publisher backed out due to the national conversation on race sparked by Mike Brown’s death and subsequent events through late 2014 and early 2015, which was seemingly counter-intuitive for a publisher, but they were more used to tame local histories than what I had produced, a history of extreme violence and subjugation. So the search continued.

Writing a local story, at least a nonfiction work, certainly limits your options for a house. Some out-of-state presses, like the one above, will take interest, but generally your best bet lies with the local presses. And unfortunately, there aren’t many of them where I reside. The University of Missouri Press was shutting down, and the University Press of Kansas (KU) wanted me to make revisions before they would decide — and I wasn’t looking to repeat a mistake. I didn’t approach every Kansas City-area publisher but rather, feeling the pressure of much wasted time, decided to stop looking for a house and instead to self-publish (with Mission Point Press, run by the former head of Kansas City Star Books).

A traditional publisher pays all the costs associated with the book and you get an advance and a small royalty from each copy sold. (With Ig Publishing, I gave up an advance for a larger royalty — a worthwhile option if the book sells well.) With self-publishing, everything is in reverse: you pay a “nontraditional publisher” to birth the book — editing, cover design, maybe marketing and distribution — and you keep most of the profit from each copy sold (not all, as someone is printing it). There’s also the option of skipping a nontraditional publisher altogether and doing everything yourself, working only with a printer. A traditional house is the big prize for a writer, because it offers that coveted validation — a firm accepted your piece instead of rejecting it, like it rejected all those other authors. It’s about prestige and pride, and not having to say “Well…” after someone calls you a published author. But self-publishing can give you more control over the final product, in some circumstances more money over time, and it works well for a local book (it’s Kansas City readers and bookstores that want a book on Kansas City, so I don’t have to worry about marketing and distribution in other cities).

The whole process is an incredible adventure: the intense learning process of researching and writing, the obsession, the hunt for and exhilaration over a publisher, the dance and give-and-take with editors who see needed changes (“murder your darlings”), the thousands of footnotes you format (kidding, it’s hell), finding key individuals to write a foreword or advanced praise, getting that first proof in the mail, releasing and marketing your work, winning coverage and reviews in the press, giving book talks and interviews, hearing a reader say how much what you created meant to him, learning your book is a classroom text, being cited by other authors or asked to give advanced praise yourself, being recognized by strangers, seeing your work in libraries and bookstores across the city, and the country, and even the world.


A Religious War

The Taiping Rebellion was a devastating conflict between a growing Christian sect under Hong Xiuquan and the Qing Dynasty (1644-1911) government, resulting in the deaths of tens of millions of people. While the political forces within Hong’s “God Worshippers” wanted to solve the internal turmoil in China, and certainly influenced events, the Taiping Rebellion was a religious war. It was the influence of the West, more than the problems at home, that prompted the violence. While many revolutions had occurred before this with no Christian influence, examining the viewpoints of the God Worshippers and of Qing militia leader Zeng Guofan makes it exceedingly clear that without the influence of Western religion, the Taiping Rebellion never would have occurred.

From the point of view of Hong Xiuquan, religion was at the heart of everything he did. The origins of his faith and his individual actions immediately after his conversion explain his later choices and those of his followers during the rebellion. According to Schoppa, Hong had a vision in which he was vanquishing demons throughout the universe, under orders from men whom he later determined to be God and Jesus Christ. Hong believed that Christ was his older brother and that he was thus “God’s Chinese son” (71). Hong studied Liang Fa’s “Good Works to Exhort the Age,” which we examined during our discussion. Liang Fa emphasized that his conversion stemmed partly from the need to be pardoned of sin and partly from a desire to do good deeds to combat evil and eradicate it from his life (Cheng, Lestz 135). Reading Liang’s writings after the life-changing vision brought Hong to Christianity. It is essential to note that, as Schoppa puts it, “In his comprehension of the vision, Hong did not immediately see any political import” (71). All Hong was concerned about at this point was faith, not the Manchu overlords. He was so impassioned he would “antagonize his community by destroying statues of gods in the local temple” (Schoppa 71). What Hong would have done with his life had he not become a Christian is impossible to say. He had repeatedly failed the civil service examination; perhaps he would have had to take up farming like his father (Schoppa 71).

Instead, he formed the God Worshipping Society. According to Schoppa, certain groups that joined declared the demons in Hong’s vision to be the Manchu, who had to be vanquished (72). It was outside influences that politicized Hong’s beliefs. Yet even through the politicization, one sees that religion lay at the heart of the matter. The very society Hong wished to create was based on Christian ideals. Equality of men and women led to both sexes receiving equal land in Hong’s 1853 land system, the faith’s sense of community led to family units with shared treasuries, and church was required on the Sabbath day and for wedding ceremonies (Schoppa 73). Christianity brought about the outlawing of much urban vice as well, such as drinking and adultery. One might argue that behind all these Christian ideological policies were long-held Confucian beliefs. As we saw in “Qian Yong on Popular Religion,” eradicating gambling, prostitution, drugs, and the like was just as important to the elites and literati (those who had passed the examination) as it was to Hong (Cheng, Lestz 129-131). While there were indeed heavy Confucian influences on Hong’s teachings (evidenced by their Ten Commandments and the accompanying odes found in “The Crisis Within”), Schoppa makes it clear that “the Taiping Revolution was a potent threat to the traditional Chinese Confucian system” because it provided people with a personal God rather than simply the force of nature, Heaven (75). The social policies that emerged from Hong’s Christian ideals, like family units and laws governing morality, led Schoppa to declare, “It is little wonder that some Chinese…might have begun to feel their cultural identity and that of China threatened by the Heavenly Kingdom” (76). The point is, Hong never would have become a leader of the God Worshippers had Western Christianity not entered his life, and even after his growing group decided to overthrow the Manchu, the system of life they were fighting for and hoping to establish was founded on Christian beliefs. Just as Hong smashed down idols in his hometown after his conversion, so everywhere the God Worshippers advanced they destroyed Confucian relics, temples, and altars (Cheng, Lestz 148). The passion of Hong became the passion of all.

It was also the opinion of the Manchu government that this was a religious war. As the God Worshippers grew in number, Schoppa writes, “The Qing government recognized the threat as serious: A Christian cult had militarized and was now forming an army” (72). Right away, the Manchu identified this as a religious rebellion. “It was the Taiping ideology and its political, social, and economic systems making up the Taiping Revolution that posed the most serious threat to the regime” (Schoppa 73). This new threat prompted the Qing to order Zeng Guofan to create a militia and destroy the Taipings. “The Crisis Within” contains his “Proclamation Against the Bandits of Guangdong and Guangxi” from 1854. Aside from calling attention to the barbarism of the rebels, Zeng writes with disgust about Christianity and its “bogus” ruler and chief ministers. He mocks their sense of brotherhood, the teachings of Christ, and the New Testament (Cheng, Lestz 147). Zeng declares, “This is not just a crisis for our [Qing] dynasty, but the most extraordinary crisis of all time for the Confucian teachings, which is why our Confucius and Mencius are weeping bitterly in the nether world.” Then, in regard to the destruction of Confucian temples and statues, Zeng proclaims that the ghosts and spirits have been insulted and want revenge, and that it is imperative the Qing government enact it (Cheng, Lestz 148). This rhetoric does not concern politics and government, Manchu or anti-Manchu. Zeng makes it obvious what he aims to destroy and why. He views the rebellion as an affront to Confucianism. The Christians, he believes, must be struck down.

With the leader’s life defined by Christianity, with a rebellious sect’s social structure based heavily on Christianity, with the continued destruction of Confucian works in the name of Christianity, and with the government’s aim to crush the rebellion in the name of Confucius and Mencius, can anyone rationally argue that the Taiping Rebellion was not a religious war? A consensus should now be reached! The rebellion’s brutality and devastation is a tragedy when one considers the similar teachings of both sides of the conflict, the Confucian call for peaceful mediation of conflicts and the Christian commandment not to kill. 


Reference List

Pei-kai Cheng, Michael Lestz, and Jonathan D. Spence, eds., The Search for Modern China (New York: W.W. Norton & Company, 1999), 128-149.

R. Keith Schoppa, Revolution and Its Past (New Jersey: Prentice Hall, 2011), 71-76.

The Declining Value of Art

What gives art value? That is, inherent value, not mere monetary value. Perhaps it is actually quite similar for artist and spectator. The artist may impart value to her work based upon how much joy and fulfillment the process of its creation gave her, how satisfied she is with the final product if it matched or came close to her vision, how much pleasure others experience when viewing (or listening to) it, or how much attention, respect, and fame (and wealth) is directed her way because of it. Likewise, the spectator may see value in the work because he knows, perceives, or assumes the joy and satisfaction it gave the artist, because he is interested in and enjoys experiencing it, or because he respects a successful, famous individual.

There are various forces that impart value, but a significant one must be the effort required. This is, after all, what is meant by the ever-present “My kid could do that” muttered before canvases splattered with paint or adorned with a single monochrome square in art museums across the world — pieces sometimes worth huge sums. People see less value in a work of art that takes (on average among human beings) less effort, less skill. Likewise, most artists would likely be less crushed were a fire to consume a piece they’d spent a day completing versus one they’d spent a year completing. To most people, effort imparts value.

I’d be remiss, and haunted, if I didn’t mention here that this demonstrates how most people think in Marxian ways about value. (If you thought, dear reader, that in an article on art you’d find respite from socialist theory, you were wrong.) Marx wrote that “the value of a commodity is determined by the quantity of labour” needed to create it (Value, Price, and Profit). Again, not mere monetary value. This doesn’t mean “the lazier a man, or the clumsier a man, the more valuable his commodity, because the greater the time of labour required for finishing the commodity.” Rather, Marx was speaking about the average labor needed to create something: “social average conditions of production, with a given social average intensity, and average skill of the labour employed.” Labor, effort, imparts value on all human creations, whether it’s art, whether it’s for sale, and so forth. Doesn’t it follow, then, that what takes less effort has less inherent value?

This train of thought — how the effort put into paintings, drawings, writings, photographs, sculptures, music, and so on affects their value — arose during an interesting conversation on how much respect should be awarded to each of these forms. Respect was based on effort-value. In other words, does a “good” photograph deserve the same respect as a “good” painting? Does a “great” piece of writing, like a book, deserve the same admiration — does it have the same value — as a “great” sculpture? One may feel at first that they shouldn’t be compared. But all forms have value because they require effort, and thus if we can determine how much effort, on average among human beings, is required for two compared art forms, and then decide that one takes more effort, we will have also found a difference in value. (One need not worry about “great” being subjective, because we are only talking about how each individual personally views the value of different art forms; perceived effort will also be subjective, which is the whole point, as it determines one’s view on value.)

If it helps make this clearer, we might start with a comparison within a single form. Which takes more effort on average: to record a single or an album? Cartooning or hyperrealist drawing? Most people would say the latter finished products have more value because of the greater effort typically required (work may be a breeze for some hyperrealist artists, as easy as cartooning for cartoonists, but remember we are speaking of averages).

Now what about the average effort to create a “good” photograph versus, say, a “good” (let’s say realist) painting? It seems like it would certainly take more effort to make a good painting! The technology of photography is always advancing, making tasks easier and more accessible, and the form thus grows more widespread. After film yielded to digitalization and computerization, it became much easier to take a nice photograph — it’s easier to do and easier to do well. Exposure, shutter speed, aperture, ISO sensitivity, focus, white balance, metering, flash, and so on can now be manipulated faster and with greater ease, or automatically, requiring no effort at all. Recently, it’s become possible to edit photographs after the fact, fixing and improving them; you just need a program and the know-how to use it. Because the form has never existed without technology, the average effort to create a great photograph has probably never rivaled the average effort to create a great painting, but the gap was smaller in the past. Today anyone with the right technology can produce a great photo; true, it requires know-how, but surely the journey from knowing nothing to mastery is shorter and easier than the same journey for realist painting. (Film — now digital video — production is a similar story.) Because the effort needed for the same result — a good photo — has declined over time, the value of the form overall has also decreased. (This does not mean some photographers aren’t more creative, skilled, or knowledgeable than others, nor that there doesn’t remain more value in the work of hardline traditionalists who refuse to use this or that new technology.) But painting — the technology of painting — hasn’t really changed much through the ages; it still requires about the same effort to produce the same quality of work, and therefore its value holds steady. If “painters” start having robots paint incredible works for them, or aid them, there would obviously be a reduction in value. No one is as impressed by robot paintings or machine-assisted paintings.

Music is facing a smaller-scale attack on the value of the form with digitally created instrumentals, autotune, and so forth. Perhaps the value of writing declined slightly as we shifted from penmanship to typewriting to computer-based writing (with backspace and spell-check!). It will decline again as voice transcription programs are perfected and grow in popularity.

Sculpting, painting, and drawing — the forms least infected by technology — still essentially require the same effort to do, and the same effort to do well, as they have throughout human history. The tools and equipment have changed some, yes, but not nearly as much as those of other forms. Their value will remain the same as long as this state of affairs persists. If music, writing, film, and photography continually grow easier to do well, their value, by this metric, will decrease, slowed only by those who valiantly resist the technological changes. This does not mean a splatter painting automatically has more value than a beautiful photo — remember, we’re each personally comparing the value of what we subjectively see as “good” paintings versus “good” photographs; you may not see a splatter painting as good. Rather, it may simply mean that what you see as a good painting takes more effort on average to create, and thus has more value, than what you see as a good photo. Perhaps also more than a good book, song, or video, depending on the size and scope of the projects being compared (it may surpass a good video but not a good film, or a good short book but not a good tome; up to you).

It could be that effort required is somewhat rule-based, too, rather than just technology-based. Music, writing, film, and photography rely on more rules. That’s probably why technology is encroaching more quickly on such forms. In music, keys, pitches, quarter-notes, half-notes, and so forth are rules. Build a program that knows and follows them and you don’t need human players or singers anymore at all. Writing has spelling, grammar, and punctuation rules. So spell-check and A.I. can help you or do it all for you. Film has frames per second, photography f-stops, and together a thousand other rules. Devices can handle them. Artists break the rules all the time, but that doesn’t mean their forms don’t rely more heavily on them than other forms.

Sculpting marble or clay into something recognizable, adorning a canvas with life, or sketching a convincing face are perhaps not activities that rely as much on rules. This does not mean there are none; for instance, there are drawing guidelines to make a face proportional and grids to help you transfer reality to the paper. Again, the rules may or may not be followed. And this does not mean an A.I. couldn’t do such activities, because it could. It’s just hard to define what rule you’d use to draw something so perfectly it looks like a photograph, whereas you know you have to hit certain notes to sing something perfectly. You have to be talented to do either — but maybe one has more foundational rules to get you there.

I’ve sometimes wondered if closing the “effort gap” or “talent gap” between novices and incredible artists is easier in some art forms than others. Meaning, is the gulf between an inexperienced writer and an incredible writer smaller than the gulf between an inexperienced painter and an incredible painter? What about the gap between a new photographer and masterful one compared to the gap between a new sculptor and a highly advanced one? On average, that is. I would suppose the art forms that in any given era take more effort would have the largest chasm to cross. So it would be harder to become a master painter than a master photographer. Perhaps harder, also, than becoming a master cinematographer, writer, singer, or even musician. (I think this view explains why I personally respect and admire the best works of sculpting, painting, and drawing more than the best works of other forms, though music is high up there too. And that’s coming from a writer.)

If so, perhaps rules have something to do with it. We know that practice makes perfect. Some are born with unique gifts, no question, but others go from zero to hero through practice. Might more rules make it easier? Do human beings learn better, faster, with those defined rules? If you stripped away the aforementioned technology of singing, music, and writing (it’s impossible to do this with photography and film), would the rules of the forms alone make these things easier to master than art forms with fewer rules? It’s interesting to consider.


How the Founding Fathers Protected Their Own Wealth and Power

Extremely wealthy landowners, merchants, and slave-owners held political power both long before and long after independence from Britain.[1] As we have already seen, many of the founding fathers battled to keep religious power out of government,[2] but they saw no need for separation of State and business interests. Most of these wealthy men opposed democracy and designed the Constitution to ensure the aristocracy would continue to rule society.

Historian Howard Zinn wrote, “The Continental Congress, which governed the colonies throughout the war, was dominated by rich men, linked together in faction and compacts by business and family connections”[3] and that the Constitution

illustrates the complexity of the American system: that it serves the interests of a wealthy elite, but also does enough for small property owners, for middle-income mechanics and farmers, to build a broad base of support. The slightly prosperous people who make up this base of support are buffers against the blacks, the Indians, the very poor whites. They enable the elite to keep control with a minimum of coercion, a maximum of law—all made palatable by the fanfare of patriotism and unity…

When economic interest is seen behind the political clauses of the Constitution, then the document becomes not simply the work of wise men trying to establish a decent and orderly society, but the work of certain groups trying to maintain their privileges, while giving just enough rights and liberties to enough of the people to ensure popular support.[4]

The aristocrats made their desire for political power quite clear in their pre- and post-Revolution repression of poor people’s revolts (Bacon’s Rebellion, Shays’ Rebellion, the Whiskey Rebellion) and democratic movements aimed at redistributing property or abolishing debts. (Shays’ Rebellion, an uprising against debts and taxes, took place as the new Constitution was being written, in 1787, and certainly influenced its formation; see Zinn, Truth Has a Power of Its Own.) Nathaniel Bacon said, “The poverty of the country is such that all the power and sway is got into the hands of the rich, who by extortious advantages, having the common people in their debt, have always curbed and oppressed them in all manner of ways.”[5] The rich also structured their new government in a very specific way. Most of them believed the wealthy, the “well-born,” deserved decision-making power, not the common man. Alexander Hamilton revealed the common sentiment of class hostility and prejudice against the poor when he wrote:

All communities divide themselves into the few and the many. The first are the rich and well-born, the other the mass of the people… The people are turbulent and changing; they seldom judge or determine right. Give therefore to the first class a distinct permanent share in the government… Can a democratic assembly who annually revolve in the mass of the people be supposed steadily to pursue the common good? Nothing but a permanent body can check the imprudence of democracy…[6]

So only rich representatives could determine what was best for the common American people, not the common American people themselves. As Russian anarchist Mikhail Bakunin wrote, a “representative system, far from being a guarantee for the people, on the contrary, creates and safeguards the continued existence of a governmental aristocracy against the people.”[7] Many other founders echoed Hamilton, as is to be expected from those in the same socioeconomic stratum. John Jay, who helped write the Constitution and later served as the first Chief Justice of the United States, often said, “The people who own the country ought to govern it.”[8] James Madison argued senators should “come from, and represent, the wealth of the Nation.” He wrote the “turbulence and contention” of pure democracy (the people voting on public policy) threatened “the rights of property,” and sneered at politicians who supported giving all men “perfect equality in their political rights.”[9] He worried a growing population of laborers would “secretly sigh for a more equal distribution of [life’s] blessings.”[10] “The danger to the holders of property can not be disguised, if they be undefended against a majority without property,” dangers including “agrarian laws” and “leveling schemes” and “the cancelling or evading of debts.” It was important to “secure the rights of property agst. the danger from an equality & universality of suffrage, vesting compleat power over property in hands without a share in it.”[11]

He warned that in a democracy,

When the number of landowners shall be comparatively small…will not the landed interest be overbalanced in future elections, and unless wisely provided against, what will become of your government? In England, at this day, if elections were open to all classes of people, the property of landed proprietors would be insecure. An agrarian law would soon take place. If these observations be just, our government ought to secure the permanent interests of the country against innovation. Landholders ought to have a share in the government, to support these invaluable interests, and to balance and check the other. They ought to be so constituted as to protect the minority of the opulent against the majority.[12]

In other words, if you give democratic decision-making power to the landless, to the poor, to the majority, they might come for the wealth of the opulent (“agrarian law” is explained further below). John Adams believed, writes David McCullough, that

inequalities of wealth, education, family position, and such differences were true of all people in all times. There was inevitably a “natural aristocracy among mankind,” those people of virtue and ability who were “the brightest ornaments and the glory” of a nation, “and may always be made the greatest blessings of society, if it be judiciously managed in the constitution.” These were the people who had the capacity to acquire great wealth and make use of political power…[13]

To his credit (not really), Adams believed these men should serve in the Senate, but that the executive branch should be protected against their interests.

These men wrote of the need to protect the minority against the tyranny of the majority, but saw no threat in a minority of wealthy men withholding political and economic power from the majority. No great surprise, as they themselves were the rich, enlightened few. Though the Constitution and Bill of Rights granted unprecedented individual rights, the common citizen had little political power. The popular vote had almost no place in the government they designed. It did not appoint judges (it still does not), nor the president (it still does not, as the Electoral College has yet to be dismantled), nor senators (until 1913; Madison himself said that “the Senate, therefore, ought to be this body” that protects “the minority of the opulent against the majority”[14]). The people directly elected only members of the House, yet only property owners were allowed to vote, further disenfranchising the poor and keeping power in the hands of the better off. Only in 1856 did the last state, North Carolina, do away with property requirements to vote. And of course, most aristocrats had few qualms about failing to protect the groups they were personally subjugating, such as black slaves, free colonists of color, and women. In fact, with Britain beginning to dismantle slavery, its preservation was a motivating factor for independence among the colonial elite — another instance of protecting their wealth.

Granting decision-making power to the masses was out of the question. The founding fathers had interests to protect, their own minority interests. Thomas Jefferson noted that

Where not suppressed by the rod of despotism, men, according to their constitutions, and the circumstances in which they are placed, differ honestly in opinion. Some are whigs, liberals, democrats, call them what you please. Others are tories, serviles, aristocrats etc. The latter fear the people, and wish to transfer all power to the higher classes of society; the former consider the people as the safest depository of power in the last resort; they cherish them therefore, & wish to leave in them all the powers to the exercise of which they are competents. This is the division of sentiment now existing in the US.[15]

Jefferson, perhaps the most democratic of the founders, also denounced “the aristocracy of our monied corporations which dare already to challenge our government to a trial of strength and bid defiance to the laws of our country”; the aristocrats wanted to create for themselves a “government of an aristocracy, founded on banking institutions, and moneyed incorporations.”

As James Madison wrote, “Those who hold and those who are without property have ever formed distinct interests in society.”[16] He wanted “representatives whose enlightened views and virtuous sentiments render them superior to local prejudices and schemes of injustice” to check the “rage for paper money, for an abolition of debts, for an equal division of property, or for any other improper or wicked project” the rabble might think up.[17] For Madison and others, it was vital for presidents, congressmen, and justices to come from the upper class, which was and is overwhelmingly the case.[18] The most dangerous ideas circulating in the poor majority were those of redistributing property and wealth, forgiving debts, and so on, to end poverty and suffering—the “agrarian law” Madison feared. Thomas Paine supported such redistribution—he wrote in 1795 in Agrarian Justice:

To understand what the state of society ought to be, it is necessary to have some idea of the natural and primitive state of man; such as it is at this day among the Indians of North America. There is not, in that state, any of those spectacles of human misery which poverty and want present to our eyes in all the towns and streets in Europe. Poverty, therefore, is a thing created by that which is called civilized life. It exists not in the natural state. On the other hand, the natural state is without those advantages which flow from agriculture, arts, science and manufactures…

Civilization, therefore, or that which is so-called, has operated two ways: to make one part of society more affluent, and the other more wretched, than would have been the lot of either in a natural state…. The first principle of civilization ought to have been, and ought still to be, that the condition of every person born into the world, after a state of civilization commences, ought not to be worse than if he had been born before that period…

It is a position not to be controverted that the earth, in its natural, cultivated state was, and ever would have continued to be, the common property of the human race. In that state every man would have been born to property. He would have been a joint life proprietor with the rest in the property of the soil, and in all its natural productions, vegetable and animal…. Cultivation is at least one of the greatest natural improvements ever made by human invention. It has given to created earth a tenfold value. But the landed monopoly that began with it has produced the greatest evil. It has dispossessed more than half the inhabitants of every nation of their natural inheritance, without providing for them, as ought to have been done, an indemnification for that loss, and has thereby created a species of poverty and wretchedness that did not exist before.

In advocating the case of the persons thus dispossessed, it is a right, and not a charity, that I am pleading for…. Every proprietor, therefore, of cultivated lands, owes to the community ground-rent (for I know of no better term to express the idea) for the land which he holds…[19]

Paine then suggested the creation of a national fund that would grant each person, man or woman, 15 pounds sterling at the age of 21, to compensate for his or her loss of a share in the common property of the earth. He also called for social security, suggesting 10 pounds be granted to each individual per year after the age of 50. Paine envisioned a one-time guaranteed income payment, social security, taxes on the rich, free public schooling, child welfare programs, public housing, and public works programs. His thoughts later inspired many socialists, like Charles Fourier and Robert Owen.[20]


Notes

[1] See Zinn, The Politics of History, p. 57-71, for a discussion on the rule of the rich in pre-Revolution America

[2] See Kramnick and Moore, The Godless Constitution

[3] Zinn, A People’s History of the United States, 81

[4] Zinn, People’s, 99, 97

[5] Zinn, Politics of History, 61

[6] Zinn, People’s, 96

[7] Guérin, Anarchism, 17

[8] Frank Monaghan, John Jay, chapter 15, p. 323 (1935)

[9] James Madison, Federalist No. 10, “The Utility of the Union as a Safeguard Against Domestic Faction and Insurrection (continued),”  Daily Advertiser, November 22, 1787

[10] Chomsky, Common Good, 7

[11] http://press-pubs.uchicago.edu/founders/documents/v1ch16s26.html

[12] Yates, Notes of the Secret Debates of the Federal Convention of 1787

[13] McCullough, John Adams

[14] Yates, Notes of the Secret Debates of the Federal Convention of 1787

[15] http://lcweb2.loc.gov/service/mss/mtj.old/mtj1/054/054_1139_1141.pdf

[16] Federalist 10

[17] Federalist 10

[18] See Zinn, People’s, 260-261 for information on how class interests affected Supreme Court decisions

[19] Paine, Agrarian Justice

[20] Nichols, The “S” Word, 33, 46, 53; https://www.jacobinmag.com/2015/03/thomas-paine-american-revolution-common-sense/

Colonial Courting Rituals Would Be Creepy As Sin Today

Finding a mate just isn’t what it used to be. Back in the “good ol’ days,” the parents were more parental, the sexism was more sexist, and the hysteria over sex was more hysterical. The courting rituals were truly bizarre, and we can thank our lucky stars they no longer exist. Of course, most of these rituals were only practiced by white straight people, and some only by wealthier colonial or Victorian-era Americans. But today we can all mock them relentlessly together. Let’s get to it.

 

WHEN YOUR DAD PICKS WHO YOU’RE GOING TO BED WITH FOREVER

Gone are the days when your old man could get together with his buddy at the tavern, kick back, down a few cold ones, and decide who you’re going to spoon for the rest of your f*cking life. Yes, if you were unwise enough to be born in colonial times, dorky dads would arrange your marriage for you, hearing not your sobs but rather the jingling of cold hard cash wrought from your dowry or inheritance (depending on your gender). End up with some ugo disgustor? If you didn’t have any Freudian reason to think of your dad during business time, you certainly had this reason.

 

WHEN THE GIRL SEDUCES YOU WITH A FAN

When you see a well-to-do Victorian gal cooling herself with her fan, she ain’t cooling herself with her fan, son. She’s engaged in a complex system of signals ranking somewhere between the high-step strut of the Blue-Footed Booby and the third base coach of the New York Yankees. Is she fanning herself slowly? Sorry, she’s engaged. Quickly? Single. Fanning with the right hand? Oh my God / look at that face / you look like / my next mistake. Left hand? F*ck off. Fan open, then shut? Kiss her, bro. Fan open wide? She loves you. Fan half open? You’ve just been friend-zoned. Legs shut — I mean fan — fan shut? She hates you. Good luck remembering all that. Don’t mess this up.

 

WHEN AGE REALLY WAS JUST A NUMBER

The age at which most colonial women married hovered around 19-22 (men were usually in their late twenties). So not too different from modern times. But remember, that’s an average. Some girls did marry when they were teenagers (others were married off as children). The age of consent in the American colonies was usually 10, though sometimes 12. Eventually, states started raising it. California raised it to 14 in 1889, then 16 in 1897. Others followed suit after that, though one technically kept the age at 7 until the 1960s (looking at you, Delawarean sickos).

 

WHEN YOU HAD TO DO ALL YOUR FLIRTING IN FRONT OF HER MOM

Remember your middle school and early high school dances and the agonizing embarrassment of the ever-present, complex surveillance apparatus made up of moms, dads, older siblings, teachers, and Principal Bacon? Well, back in the olden days, chaperones weren’t something you could just wait out as the years ticked by. When a man came a-calling, he had to sweep the girl off her feet in front of the potential mother-in-law. There was no one-on-one time. You went over to her house and, if f*cking Pride and Prejudice with Keira Knightley is any indication, made boring conversation while drinking tea, playing cards, or abusing a piano. Want to come back again? You’ve got to impress the ‘rents.

 

WHEN YOU COULD GET MARRIED LITERALLY BY JUST SAYING YOU WERE

Yes, one courting ritual was called “handfasting” or “spousing.” If you wanted to be married (by law, mind you), all you had to do was just hold hands and say you were married. You could do this anywhere and at any time, during this age that now sort of sounds like a Libertarian paradise. Apparently (to absolutely no one’s surprise), men would often be all for this, getting married on the spot to a nice yet sexually repressed girl, having sex, and disappearing into the night. Then daddy had to hunt you down — not to kill you for sleeping with his little angel as might happen today, but to force you to actually be her husband.

 

WHEN YOU PARTOOK IN SLEEPOVERS AT HER MOM AND DAD’S

It was a simpler time, when religious parents knew that kids would mess around and knew there was nothing they could do about it, so they decided to pretend to do something about it while willfully facilitating it. We’re talking bundling, people. When parents said, “Yes, you kids can have a sleepover as long as you promise not to have sex,” they let ma sew the boyfriend up in a bag and let pa install an impervious one-foot-tall bundling board between your sides of the bed. Not letting Nathaniel sleep over was, apparently, deemed an ineffective way of preventing two lovebirds from engaging in smash game in the room adjacent to mom and dad’s.

 

WHEN THE “COURTSHIP STICK” WAS HOW YOU SEXTED

Just like today, when lovers send texts to each other while snuggling on the couch together after Christmas dinner so relatives can’t overhear them, colonials found a way to keep things spicy with secretiveness. The courtship stick was a six-foot-long hollow tube that allowed young men and women to whisper some sexy messages to each other in a world of zero privacy. Small homes shared with parents or parents-in-law, especially ones where touching is a no-no, mean there’s just no way to tell your gal her butt looks great in that dress she wears daily or your man that his plowing is an incredible turn-on.

 

WHEN YOU COULD SKIP THE ENGAGEMENT RING AND GIVE HER A THIMBLE INSTEAD

If you lived among the Puritans, you didn’t have to get your woman an engagement ring. Instead, you could give her a helpful piece of sewing equipment. Puritans weren’t showy people, so a little thimble could be offered to the woman (in their defense, it would later be fashioned into a ring; cheap-ass Puritans), presumably as a sign of all the trousers she would have to repair over the course of her lifetime. That’s how you really blow away the ladies.


Fictional Rosa Parks Speech

First off, let me say there are many folk who could give a speech better than I. On top of that, there are many better men and women who walked the halls of this fine institution who should be standing here before you instead. Highlander Folk School shines on as a beacon for equality, a garden that continues to grow the best civil rights activists and labor organizers in the country. I am very happy to be back here and honored to speak on the bus boycott that occurred in Montgomery just a few years back. Seems people are under the impression these days that the boycott happened because of me. I would like to assure you this is untrue. I can’t take credit for the crusade that occurred in Alabama. I was just the last straw. There were others who would not give up their seat on a bus and were arrested long before me. On the day it happened to me, I just couldn’t bear the thought of giving up my seat on a city bus to another white man and standing in the back for the rest of the long ride home. I would rather be hauled off in handcuffs than face that humiliation and degradation again. As Mrs. Virginia Durr once wrote to you, Highlander gave me a taste of freedom and equality; I thought of this place while the officers dragged me off the bus and to the station.

Look ahead a single year, and our world is changed for the better. A boycott occurred, and it succeeded. After a single year, no black man or woman has to feel the burn of embarrassment or the injustice of segregation on a city bus again. The boycott didn’t succeed because we were organized, though that was part of it. It didn’t succeed because we were angry, though that was part of it as well. It succeeded because we had perseverance. Organization defines the road, anger gets you on the road, but making the long journey to the end of the road, that is perseverance.

Activists like Jo Ann Robinson, president of the Women’s Political Council, demonstrated what perseverance really is, and indeed so did her members. Mrs. Robinson wrote to Montgomery Mayor W. Gayle in 1954 with a polite request for fairer policies on city buses. She did not even ask for desegregation, but instead requested that blacks begin sitting at the back of the bus and whites at the front, and when they met in the middle and all the seats were occupied, that would be it. She asked that the buses make more stops in black neighborhoods and that we not have to pay at the front of the bus and then make the humiliating trudge to the back entrance.

Her message fell on deaf ears, for that same “honorable” mayor she wrote to would two years later speak at the rally of the Central Alabama Citizens Council about how to preserve segregation. His presence supported and offered legitimacy to ten thousand angry white racists encouraging the killing of black men, women, and children. Jo Ann Robinson would not take no for an answer, however. Briefly mentioning the possibility of a boycott in her letter, she later organized it and made it a reality in December of 1955. She and her WPC members worked tirelessly into the early morning of the fifth to distribute tens of thousands of leaflets calling for a boycott all over Montgomery. Mrs. Robinson and fellow activists were arrested soon after the movement began, but even in the face of harassment, imprisonment, and threats of violence, they did not yield.

If any two men showed us true strength of character and steady perseverance, it was the two reverends, Ralph Abernathy and Martin Luther King, Jr. They held Montgomery Improvement Association meetings every week until the boycott succeeded. Dr. King was unequivocally our leader. If I was the spark, he was the fire. Under the same death threats and mistreatment we all faced, he ignited a passion in our hearts that helped us see this thing through. At one MIA meeting, Dr. King said, “With every great movement toward freedom there will inevitably be trials. Somebody will have to have the courage to sacrifice. You don’t get to the Promised Land without going through the Wilderness. You don’t get there without crossing over hills and mountains, but if you keep on keeping on, you can’t help but reach it. We won’t all see it, but it’s coming and it’s because God is for it” (Martin Luther King, Jr. Speaks to the Crowd). We did what Dr. King called us to do. We kept on keeping on. We braved the wilderness. Dr. King, in his wisdom and his own depth of perseverance, inspired us to stay the course.

Then there was everyone else: every man, woman, and child who refused to ride the Montgomery buses. This boycott began as a one-day movement. Instead, it lasted a year, because the black folk of Montgomery united and persevered together. At the first mass meeting of the MIA, Dr. King and Reverend Abernathy had to fight their way into the church through a joyous crowd of seven thousand people. In February 1956, activist Bayard Rustin noted that “42,000 Negroes have not ridden the busses since December 5” and that, of two men, one “walked 7 miles and the other 14 miles” to work each day (Bayard Rustin’s Diary). They weren’t the only ones walking those distances, either. Moreover, during this period dozens of taxi drivers and car-pool drivers were arrested. Yet we did not yield. All the while white folks talked about using “guns, bows and arrows, sling shots and knives” to “abolish the Negro race” and act on white people’s right to “life, liberty and the pursuit of dead niggers” (Handbill from Central Alabama Citizens Rally). Yet we did not yield. We persevered together. As one black maid said during the second month of the movement, “When you do something to my people you do it to me too” (Interview About the Boycott). That is true unity of spirit.

Our spirit went unbroken, and in November 1956 the Supreme Court upheld what we fought for in Browder v. Gayle. Bus segregation was rejected as unconstitutional, and the next month buses in Montgomery were integrated. It was a glorious day when I again rode a city bus. True equality is still a long way off. We are not out of the wilderness yet. However, the boycott victory has kept us going. As Dr. King said, “Let us continue with the same spirit, the same orderliness, with the same discipline, with the same Christian approach” (Martin Luther King, Jr. Speaks to the Crowd). There will be a day when prejudice and hate are not tolerated in this country. It is only a matter of time. Until that day, we will continue to persevere. We draw closer to the Promised Land.


Fictional New Deal Editorial

In this month of January 1935 Congress will vote on the Social Security Act. While the debates are waged in our national legislature, in barbershops and department stores, and at kitchen tables across the country, it is the intention of this paper to shed light on several key areas of the bill in drastic need of revision. The Social Security Act, if passed as-is, would create an unjust burden on American workers in trying times. By changing the means by which we achieve a noble end, working men and women can look forward to a brighter future rather than a darker one.

The Social Security Act will enact much-needed care for our underprivileged countrymen. The elderly, the unemployed, the handicapped, and dependent children will all benefit greatly from welfare. This paper has no bones to pick with President Franklin D. Roosevelt concerning the admirable and necessary measures this bill will take. The act will create insurance, a pool of money that can be tapped into for relief to the poor, dependent, and unemployed.

One component of the bill that must be revised, however, is who will be included in (or, more importantly, excluded from) the benefits of social security. The act primarily benefits white males. What about the other factory workers? What about the other farmers, and working women? Mr. Roosevelt is leaving them in the dust, to fend for themselves. The NAACP criticizes the Act, pointing out that workers such as cash tenants, sharecroppers, and domestic servants will be excluded from social security simply because blacks dominate those jobs. Mr. Roosevelt has bowed to the wishes of prejudiced southern congressmen, and as a result most blacks, and especially black women, will never see minimum wages, unemployment relief, or money for retirement. This, in our view, is unacceptable.

A graver issue is how Mr. Roosevelt plans to pay for this pool of relief money. Employers and employees will both pay a one percent tax on the first $3,000 earned each year. This will allow the Federal government to send monthly checks to retirees. Meanwhile, unemployment hovers at 25 percent. Millions of Americans who are bringing in a small income still live in poverty. The shacks in the Hoovervilles did not disappear when Hoover did. The United States economy has never been more severely crippled. And Mr. Roosevelt wants to take money from workers’ paychecks? From businesses? A business that is not burdened with such a tax will have more money for innovation that could stimulate the economy, or could hire a new worker and get someone off the streets. A worker with a bit more discretionary income will spend it during these hard times, saving his or her family from starvation and kick-starting the economy at the same time. The Los Angeles Times has declared that the current method of funding will slow recovery. The American people want reduced taxes, and have written Mr. Roosevelt pleading for such relief. Now is clearly not the time to burden the American family, nor American business, with an extra tax.

Instead, consider the views of Huey Long and Francis Townsend, who thought it would be better not to burden the poorest, but the richest. Does that not sound more reasonable? Redistribution of national income continues to attract supporters in huge numbers. The Townsend Plan alone has five million members, with a petition of 20 million names. People see this plan as their salvation. Long suggests capping an individual’s income at a few million dollars and collecting the rest to use for the welfare system. The top one percent of America owns a hefty percentage of the nation’s wealth. Those millionaires would do right to give more. Mr. Roosevelt says that a worker paying into the system gives him (and in this case, it is almost certainly a him) the moral right to receive money once retired or laid off. This paper would ask, what about the moral right of the rich? The moral right of Mr. Roosevelt? In our view, the wealthiest would be immoral to say five million a year is not enough, immoral not to care for the elderly and the poor when the common man, the forgotten man, cannot. Long, Townsend, the Congress of Industrial Organizations, this newspaper…we do not ask that millionaires give up their millions. Just their discretionary millions.

The Social Security Act should be passed, there is no question. However, it must be made more inclusive, refusing to stoop to the levels of older generations by enforcing Jim Crow laws on welfare. The plan must also be funded not on the backs of those suffering, but by those in mansions with new cars, who never have to fear being out of work, out of money, or out of food. The common man deserves freedom from such fear. Mr. Roosevelt understands this. If Mr. Roosevelt wishes to drive the money changers from the temple, as it were, it is the opinion of this paper that he do so not with a stick, but with a sword.


George Sakoulas

Learning a new language can be very difficult. There is grammar to learn and tenses to master. Becoming fluent is an even greater challenge. Someday I hope to speak Spanish fluently; I think that would be impressive. But have you ever met someone fluent in five languages? There is a man I know who has unique talents and skills and who lived through amazing history. He is George Sakoulas, my grandfather (“Popoule” in Greek).

“Everyone has a hidden ambition,” George says. His own was to work with words. He had a dream of being a freelance writer. George came to speak English, Spanish, Greek, Italian, and German fluently, with a little Portuguese on the side. He says English was his best language—not bad for knowing no English at all for years, speaking only Greek with his family and community. The inspiration for the remarkable achievement of becoming a master linguist? He flunked kindergarten. He could not speak English, and he couldn’t go to a Greek school—there were none. After that, his pursuit of languages began.

His father was an impoverished Greek immigrant who sailed to America in 1910. He opened a restaurant in downtown Kansas City called the Triangle Grill, named for its location between three streets. It no longer exists, but curiously a sculpture of many different triangles now stands where it once stood. George’s mother immigrated later. She was about thirteen when they married. George’s father left his wife behind when he came to America and lived there for eleven years before he had enough money to send for her (and, unexpectedly, their preteen daughter Nicoletta, George’s older sister).

“I was just a boy on my bicycle,” George once told me, summarizing his childhood. His bicycle story always makes me wonder at life fifty or sixty years ago. He had a bicycle delivery route. Helping those in need, he delivered medicine all over Kansas City. He was paid 50 cents a week, and was paying off his bicycle, which cost $22.50. His mother was worried a car would hit him. He was hit twice, but did not quit.

The Great Depression dominated George’s boyhood, when money was scarce, foodstuffs, oil, and materials were strictly rationed, and unemployment was high. George spent a good deal of time making his own toys. He remembers making a scooter from roller skates, a two-by-four, and an orange crate. He made toy guns using wood, clothespins, and rubber bands.

George was athletic, and was one of the fastest runners in track, which he ran at school and at a junior college in Kansas City. He played basketball in grade school. He remained very small in high school, and was therefore unable to participate in many sports. We Greeks are not known for our height. He later got into boxing, and was a champion in his weight division. “I got a lot of respect,” he says.

His generation was into Frank Sinatra and others. He felt too old for Elvis when “the king” became popular, telling me he considered Elvis to be a “weirdo” and a “hillbilly.” When the Beatles came over to the US, he thought they “looked like girls,” and made fun of them. George says, “So many changes come” and that modern singers “sound like they’re dying.”

When World War II began in September 1939, the age for new recruits in the United States dropped from 21 to 20. Years later, when the Japanese bombed Pearl Harbor in Hawaii, the age dropped to 18. In January 1943 George Sakoulas joined the army, interrupting his junior year in college. In 1943 the Allies invaded Italy, and the US began shipping Italian prisoners to stateside POW camps. Some were sent to a camp in Tooele, Utah, near Salt Lake City, which had been holding German prisoners already.

George was put in the infantry at first. There his linguistic skills were discovered and he was reassigned to the POW camp in Utah before he saw any fighting. “I hated it,” George tells me. While he was there he made applications to leave. He wanted to fight, not stay in the US. “Everyone wanted to fight,” he says.

“Other forces kept me from the war,” George says.

The army sent him to language school at Stanford (where he wishes he had finished, since Stanford is a prestigious school nowadays), where he improved his Italian. After that he was taken to Utah. As a translator, his primary job was to translate the commander’s orders. When he arrived at the camp, only Germans were being kept there—no Italians had yet arrived. George’s “baptism of fire,” as he calls it, came when a troop of Italian prisoners was finally brought into the compound. An old colonel stood by him as the column of soldiers marched through the gates.

The colonel pushed George towards them and ordered him to make them halt. George ran out in front, but did not remember the word for “halt.” So instead he shouted out “Stop!” in Italian, and the column obeyed. He later realized “halt” would have done fine; the Italian equivalent is “alt.”

Popoule wants it to be known how well the prisoners were treated. They were not abused in any way. He remembers life at the POW camp well. The Italians were allowed to cook their own food, and he would sometimes go down and eat alongside them, because their food was better than his own. He said he became friends with a lot of nice men.

The POWs were given tools for activities, and George received gifts like paintings. He was amazed to see a few Germans had constructed a small radio. The prisoners, if they attempted to escape (which happened rarely), were locked up for a whole week, with nothing to eat but bread and water. This was the only time prisoners were not treated well.

George oversaw a company of 200 Italian soldiers for the rest of the war. While the Geneva Convention prohibited hard prison labor, the Italians had plenty of tasks to keep them busy.

George never became a freelance writer. When he got back from Utah, he finished college at UMKC, receiving a degree in history and language. He then went into the restaurant business, where he was needed by his family. He married Goldie, my “Yia-Yia” (grandmother), who was also Greek. They met at a picnic at the Greek Orthodox Church, which they now attend regularly.


The Little Rock Nine

ED 6010 is the most racially diverse classroom I have ever been in. Let that disheartening statement sink in a moment. It is in part because Foundations of Education is a small class, without a doubt the smallest I have ever known, with 11 students. Three black students, eight white students, one white professor. I have never attended a class in which 25 percent of those present were African-American, a sad testament to the lack of diversity in both Overland Park, Kansas, where I grew up, and Springfield, Missouri, where I attended undergraduate school. These thoughts stirred within me because a memoir, Warriors Don’t Cry, was still fresh in my mind when I first entered ED 6010.

Warriors Don’t Cry is Melba P. Beals’ harrowing account of the integration of Central High School in Little Rock, Arkansas, in 1957. She and eight other black students attended the formerly all-white Central under the ruling in Brown v. Board of Education of Topeka, which determined the unconstitutionality of racial segregation in public schools. While it is perhaps a miracle the “Little Rock Nine” survived the unimaginable terrors of physical and verbal abuse inflicted during their year at Central, the Supreme Court case that made it possible was a miracle in itself. Amazingly, the Brown case of 1954 was a unanimous decision. It shocked the white world. “Chief Justice Earl Warren worked hard to achieve the compromises necessary for a unanimous decision because he believed that the full court should be behind such a dramatic order” (Fraser, 2010, p. 293). The rulings of many court cases balance on the edge of a knife, with a single deciding vote tipping the rulings one way or the other. How monumental, that such a controversial case, arguably the most controversial in decades, would be without dissent.

The Court declared, referring to black students, “to separate them from others of similar age and qualifications solely because of their race generates a feeling of inferiority as to their status in the community that may affect their hearts and minds in a way unlikely to ever be undone” (Fraser, p. 294). While the decision determined that separate could never be truly equal, a great step forward to be sure, the Court did not immediately order desegregation. “A year later the Court finally ordered school desegregation, but only ‘with all deliberate speed.’ The lack of a timetable encouraged the southern states to resist” (Norton et al., 2005, p. 810). And resist they did.

Little time needs to be spent explaining why white crowds gathered around Central High to protest and physically prevent integration, or why whites from other states journeyed to swell their numbers, as did local cops, or why Arkansas Governor Orval E. Faubus sent 250 National Guardsmen to block Melba and her friends from entering the high school. Centuries of racial prejudice and hatred explain that. Each generation taught the next how to think and behave towards blacks. Melba was struck, bruised, and burned with acid. She was ridiculed and tormented and spat on. Most of her abuse was inflicted by white students within Central. The other eight suffered just as badly. Melba recalls being cornered and persecuted in a school bathroom:

I looked up to see a flaming paper was coming right down on me. Girls were leaning over the top of the stalls on either side of me. Flaming paper floated down and landed on my hair and shoulders….

“Help!” I shouted. “Help!” The door wouldn’t open. Someone was holding it—someone strong, perhaps more than one person. I was trapped.

“Did you think we were gonna let niggers use our toilets? We’ll burn you alive, girl,” a voice shouted through the door. “There won’t be enough of you to worry about.”

I felt the kind of panic that stopped me from thinking clearly. My right arm was singed. The flaming wads of paper were coming at me faster and faster. I could feel my chest muscles tightening. I felt as though I would die any moment (Beals, 1994, p. 164).

I was struck by how much worse each day became for Melba. Perhaps it was my knowledge that her efforts would lead to change and would better the lives of African-Americans in the long term, but I fully expected conditions to improve for the Little Rock Nine given enough time. How wrong that assumption was. The death threats continued and intensified. The name-calling persisted. Efforts to physically and mentally harm the black students only became more organized, more desperate, more sinister. After hundreds of pages depicting such abuse, it hurt to read more, yet it continued. Melba and the others went through hell, and their struggle naturally evokes pity.

I did not at all expect to feel pity toward the white students, the abusers themselves. Do not misunderstand me: each tormentor is responsible for his or her horrific actions, and justice should be wrought upon them all. They will have God to answer to. At the same time, though, those kids were indoctrinated. They were not born with a hatred for the black race. Their parents and teachers taught them to hate. They taught them that blacks were inferior to whites, that it was acceptable to disrespect, cheat, and abuse them. I pity the kids because they were brainwashed, molded into bigots by people who were molded in the same fashion. Researcher Kenneth B. Clark’s findings, which influenced the Brown case, stated, “Children learn social, racial, and religious prejudices in the course of observing and being influenced by the existence of patterns in the culture in which they live” (Fraser, p. 297).

The cycle continues today in some families. Perhaps that will be the most important thing to keep in mind when I am an educator. Besides parents, teachers are the strongest influences and role models to children. It will be my responsibility to inspire attitudes of equality and respect and understanding, even—no, especially—if it challenges what students are hearing at home.

In her own home, Melba received support from her brother, her mother, and especially her grandmother. It was a different story in the black community as a whole. While some neighbors approved of the Nine’s actions as a catalyst for change, others opposed it and ostracized Melba and the others. After the school year ended and the Nine had spent time around the country being honored for their accomplishment, Melba recalls, “We had come home, to Little Rock, back to being called ‘niggers’ by the segregationists and those ‘meddling children’ by our own people. Our friends and neighbors resented not only the school closure but most especially the negative economic impact our presence in that school had on our community” (p. 307). Some African-Americans, like Melba’s distant father, opposed what the Nine were doing because it made Little Rock even more dangerous for their people. Vandalism and violence against blacks increased, and neighbors saw Melba as only inflaming an already tense relationship. Not only was it more dangerous on the streets of Little Rock, but blacks were also rejected in grocery stores and for employment even faster and more harshly than usual, in retribution for integration.

Melba felt the strain of ostracism as keenly as that of racism. She was abandoned by her old group of friends, who were “not willing to die” (Beals, p. 216) with her. She was not invited to parties, and her sixteenth birthday party was a lonely one. She found strength and friendship in the other members of the Nine: Elizabeth, Ernest, Gloria, Carlotta, Minnijean, Terrence, Jefferson, and Thelma. Unfortunately, the situation grew more dire for Melba. Her mother was fired from her teaching position. “Her superiors told her they were taking away her contract because she had allowed me to participate in the integration of Central” (p. 286), Melba writes. Only through exposing the mistreatment to the press did Melba’s mother get her job back (p. 294). Throughout the integration process, the press would prove to be a primary force in raising awareness, stirring sympathy for the Nine, and keeping the situation at Central from spiraling into chaos. With the world watching, perhaps white supremacists were held back from their most evil designs.

Melba’s faith throughout this crisis was astounding, and it clearly sustained her. She declares on page two, “The experience endowed me with an indestructible faith in God.” Her diary entries are often prayers to God, and she often mentions times when she prayed for Him to keep herself and her family safe. Trusting God during times of crisis and pain is possibly the most difficult thing to do as a Christian; it is often easier to blame God. Melba’s reliance on Him was as admirable as it was steadfast.

Melba and the other eight would never have gotten in the front door of Central without President Dwight D. Eisenhower federalizing the Arkansas National Guard and sending in the 101st Airborne Division to protect the students and see to it that integration took place. Governor Faubus challenged the authority of the Court and of the federal government in his effort to enforce segregation, and Eisenhower made a bold move in sending troops to demonstrate the power of federal over local government. There is controversy over the president’s thoughts and motivations, but Melba, her mother, and her grandmother looked upon him favorably for the decisions he made. Melba herself appreciated the Screaming Eagles’ protection, particularly that of her bodyguard Danny, and was sad to see them go (Beals, p. 162). Melba understood that Eisenhower was enforcing the decree of the Court (Beals, p. 145). However, I believe writing off Eisenhower as solely standing up for the federal government’s authority, as some might, is too simplistic.

After World War II, “Ike” was the most popular man in America (Kunhardt et al., 1999, p. 36), and throughout his presidency, he would avoid strong stances on controversial issues to protect that popularity (p. 40). He wanted to avoid dealing with civil rights directly, preferring to let race relations improve without government interference, but it is clear that Ike “disapproved of racial segregation” (Norton et al., p. 810). Ike was concerned about losing party votes in the South by acting on civil rights (Norton et al., p. 810). Boldly stepping in to force integration upon an angry southern populace ran counter to Ike’s way of doing things. He put aside concern for politics, an even graver concern for popularity, and an aversion to controversial issues to do what he knew was right. Melba writes, “He had stepped over a line no other President dared cross” (p. 309).

According to the Dwight D. Eisenhower Memorial Commission, Ike disliked racism, purposefully appointed federal judges who believed in civil rights, forced civil rights legislation through Congress, officially integrated the White House and the Army, and fought discrimination in the workplace (EMC website, 2011). In 1953 Ike said, “I believe as long as we allow conditions to exist that make for second-class citizens, we are making of ourselves less than first-class citizens” (EMC website, 2011). There was more to Ike’s decisions than the federal-state battle. He honestly wanted change and cared about the fate of the Nine.

So did Melba’s protector, a young member of the 101st Airborne named Danny. Judging from Melba’s accounts, Danny proved to truly care about her well-being. Though a soldier under orders, Danny’s commitment to Melba surpassed his instructions. This is possibly due to the soldiers being from the North, where more respectful attitudes toward African-Americans existed. “He looked me directly in the eye” (p. 135) is the first description Melba offers of Danny. A short, poignant sentence. If nothing else, it speaks of respect, even before they knew one another. Danny would later make sure Melba’s tormentors saw him and would stare them down (Beals, p. 136). He washed out her eyes when a student doused them in acid (Beals, p. 173). He protected her at every turn, but also offered her advice. That was certainly not in his job description. “’Patience,’ Danny said. ‘In order to survive this year you will have to become a soldier. Never let your enemy know what you are feeling’” (Beals, p. 161). Melba writes:

I feel specially cared about because the guard is there. If he wasn’t there, I’d hear more of the voices of those people who say I’m a nigger…that I’m not valuable, that I have no right to be alive. Thank you, Danny (p. 145).

Clearly, Melba thought much of Danny and cared about him. I believe their relationship was special to both. Danny could easily have withheld advice, never spoken to her, and never looked upon her with respect. Those kindnesses were not his orders. He offered them anyway. Though Melba admits, “I will never know if he only behaved that kindly because he was a great soldier or a good person or both” (p. 202), Danny’s actions indicate he sincerely wanted to help and protect a student fighting for change.

Change was Melba’s aspiration. Throughout Warriors Don’t Cry, Melba often mentions the northern city of Cincinnati, which she visited before integration began. It implanted a vision in her mind of what life in the South could be like:

For me, Cincinnati was the promised land. After a few days there, I lost that Little Rock feeling of being choked and kept in “my place” by white people. They weren’t in charge of me and my family in Cincinnati. I felt free, as though I could soar above the clouds (p. 30).

She refers to Cincinnati in her diary on September 3, 1957, the first time the Little Rock Nine attempted to enter Central High:

Dear Diary,

It’s happening today. What I’m afraid of most is that they won’t like me and integration won’t work and Little Rock won’t become like Cincinnati, Ohio (p. 46).

Melba discovered in Ohio that African-Americans could walk with pride, without having to step off the sidewalk for white people, that bathrooms and other facilities were integrated, and that white people smiled at her and treated her family with decency (Beals, pp. 30-31). She found equality. Her aspiration was to bring similar change to Little Rock. In the beginning, Melba believed that just by crossing into the white world she could present herself and show whites there was no need to treat her differently. She was young, and her naïve belief that change could come quickly is understandable. How devastating it must have felt, after Melba survived an entire year at Central, when “Governor Faubus had the last word. He closed all of Little Rock’s high schools” (Beals, pp. 306-307) to prevent another year of mixed classrooms. Personally, I felt a twinge of relief reading that. Melba and her friends would be spared another year of hell. The move, from a certain point of view, also reeked of desperation and defeat on Governor Faubus’ part.

Despite the setback for integration, Melba can rest assured today in the knowledge she was a significant part of the civil rights movement. She wrote Warriors Don’t Cry based on her diary entries, local newspapers kept from the time period, and her memory. Though memory can fade and change over time, I believe the story she has told is accurate and trustworthy, and is supported by other sources. Besides, far worse things have been done to African-Americans in this country’s history; nothing she describes strains belief. Melba writes, “I marvel at the fact that in the midst of this historic confrontation, we nine teenagers weren’t maimed or killed” (p. 309). Her purpose in writing this gripping narrative was not to glorify herself.

I believe she wrote this because most history textbooks devote mere sentences to the story of the Little Rock Nine. The college textbook A People and a Nation provides a paragraph (Norton et al., p. 810). One paragraph can never explain what truly happened at Central High, and Melba knew the need existed to tell the whole story, no matter how painful it was for her and regardless of how painful it is to read it.

Melba writes:

I began the first draft of this book when I was eighteen, but in the ensuing years, I could not face the ghosts that its pages called up. During intervals of renewed strength and commitment, I would find myself compelled to return to the manuscript, only to have the pain of reliving the past undo my good intentions. Now enough time has elapsed to allow healing to take place, enabling me to tell my story without bitterness (p. xvii).

It took over 30 years to write. It took hours to read.

Melba Beals’ legacy can be seen in ED 6010, a peacefully integrated course. This Foundations of Education class is welcoming and respectful. I am blessed by both where I live and the time in which I live. In 1954, de jure segregation was struck down. In 2011, de facto integration is incomplete in many parts of the nation, but much improved after nearly 60 years, with significant thanks owed to Melba Beals, the rest of the Little Rock Nine, and their struggle.


Reference List

Beals, M. P. (1994). Warriors Don’t Cry: A Searing Memoir of the Battle to Integrate Little Rock’s Central High. New York, NY: Washington Square Press.

Fraser, J. W. (2010). The School in the United States. New York, NY: Routledge.

Kunhardt Jr., P. B., Kunhardt III, P. B., & Kunhardt, P. W. (1999). The American President. New York, NY: Riverhead Books.

Dwight D. Eisenhower Memorial Commission. (2011). Dwight D. Eisenhower Memorial Commission website. Retrieved from http://www.eisenhowermemorial.org/Civil-Rights.htm

Norton, M. B., Katzman, D. M., Blight, D. W., Chudacoff, H. P., Logevall, F., Bailey, B., Paterson, T. G., & Tuttle, W. M. (2005). A People and a Nation. Boston, MA: Houghton Mifflin.

The Taiping Rebellion

The Taiping Rebellion (1850-1864) was a devastating conflict in China between a growing Christian sect under Hong Xiuquan (1815-1864) and the Qing Dynasty government (1644-1911) that resulted in the deaths of over ten million people. Opinions differ as to whether this was a religious or political war, and while elements of both are generally agreed to have been involved, the overwhelming significance of religion’s role is rarely appreciated. While the political forces within Hong’s “God Worshippers” did want to solve the internal turmoil in China, and certainly influenced events, the Taiping Rebellion was a religious war. It was more the influence of the West, not the problems at home, that sparked the violence. Many rebellions had occurred before this one with no Christian influence; examining the viewpoint of the God Worshippers and that of Qing militia leader Zeng Guofan (1811-1872) will make it exceedingly clear that without the influence of Western religion, the Taiping Rebellion never would have occurred.

From the point of view of Hong Xiuquan, religion was at the heart of everything he did. The origins of his faith and his individual actions immediately after his conversion explain his later choices and those of his followers during the rebellion. According to R. Keith Schoppa, a scholar of modern China, Hong had a vision in which he was vanquishing demons throughout the universe, under orders from men whom Hong later determined to be God and Jesus Christ. Hong believed that Christ was his older brother and Hong was thus “God’s Chinese son” (71). Hong studied “Good Works to Exhort the Age,” in which Christian author Liang Fa emphasized that his own conversion stemmed partly from the need to be pardoned of sin and partly from a desire to do good deeds to combat evil and eradicate it from his life (Cheng, Lestz 135). Reading Liang’s writings after the life-changing vision brought Hong to Christianity. It is essential to note that, as Schoppa puts it, “In his comprehension of the vision, Hong did not immediately see any political import” (71). All Hong was concerned about at this point was faith, not the Manchu (Qing) overlords. He was so impassioned he would “antagonize his community by destroying statues of gods in the local temple” (Schoppa 71). What Hong would have done with his life had he not become a Christian is impossible to say. He had repeatedly failed China’s all-important civil service examination; perhaps he would have taken up farming like his father (Schoppa 71).

Instead, he formed the God Worshipping Society. According to Schoppa, certain groups that joined declared the demons in Hong’s vision were the Manchu, and had to be vanquished (72). It was outside influences that politicized Hong’s beliefs. Yet even through this politicization, one sees that religion remained at the heart of the matter. The very society Hong wished to create was based on Christian ideals. Equality of men and women led to both sexes receiving equal land in Hong’s 1853 land system, the faith’s sense of community led to familial units with shared treasuries, and church was required on the Sabbath day and for wedding ceremonies (Schoppa 73). Christianity brought about the outlawing of much urban vice as well, such as drinking and adultery. One might argue that behind all these Christian ideological policies were long-held Confucian beliefs. According to the 1838 work “Qian Yong on Popular Religion,” eradicating gambling, prostitution, drugs, etc. was just as important to the elites and literati (those who had passed the civil service examination) as it was to Hong (Cheng, Lestz 129-131).

While there were indeed heavy Confucian influences on Hong’s teachings (evidenced by the Confucian adaptations of the Ten Commandments and the accompanying hymns found in Cheng and Lestz’s “The Crisis Within”), Schoppa makes it clear that “the Taiping Revolution was a potent threat to the traditional Chinese Confucian system” because it provided people with a personal God rather than simply the force of nature, Heaven (75). The social policies that emerged from Hong’s Christian ideals, like familial units and laws governing morality, led Schoppa to declare, “It is little wonder that some Chinese…might have begun to feel their cultural identity and that of China threatened by the Heavenly Kingdom” (76). The point is, Hong never would have become a leader of the God Worshippers had Western Christianity not entered his life, and even after his growing group decided to overthrow the Manchu, the system of life they were fighting for and hoping to establish was founded on Christian beliefs. Just as Hong smashed down idols in his hometown after his conversion, so everywhere the God Worshippers advanced they destroyed Confucian relics, temples, and altars (Cheng, Lestz 148). The passion of Hong became the passion of all.

On the other side of the coin, it was also the opinion of the Manchu government that this was a religious war. As the God Worshippers grew in number, Schoppa writes, “The Qing government recognized the threat as serious: A Christian cult had militarized and was now forming an army” (72). Right away, the Manchu identified this as a religious rebellion. “It was the Taiping ideology and its political, social, and economic systems making up the Taiping Revolution that posed the most serious threat to the regime” (Schoppa 73). This new threat prompted the Qing to order administrator Zeng Guofan to create militia units and destroy the Taipings. “The Crisis Within” contains his “Proclamation Against the Bandits of Guangdong and Guangxi” from 1854. Aside from calling attention to the barbarism of the rebels, Zeng writes with disgust about Christianity and its “bogus” ruler and chief ministers. He mocks their sense of brotherhood, the teachings of Christ, and the New Testament (Cheng, Lestz 147). Zeng declares, “This is not just a crisis for our [Qing] dynasty, but the most extraordinary crisis of all time for the Confucian teachings, which is why our Confucius and Mencius are weeping bitterly in the nether world” (Cheng, Lestz 148). Then, in regard to the destruction of Confucian temples and statues, Zeng proclaims that the ghosts and spirits have been insulted and want revenge, and it is imperative that the Qing government enact it (Cheng, Lestz 148). This rhetoric does not concern politics and government, Manchu or anti-Manchu. Zeng makes it obvious what he aims to destroy and why. He views the rebellion as an affront to Confucianism. The Christians, he believes, must be struck down.

With the leader’s life defined by Christianity, with a rebellious sect’s social structure based on Christianity, with the continued destruction of Confucian works in the name of Christianity, and with the government’s aim to crush the rebellion in the name of Confucius and Mencius, can anyone rationally argue that the Taiping Rebellion was not a religious war? Surely a consensus can now be reached. The rebellion’s brutality and devastation are a tragedy when one considers the similar teachings of the two sides: the Confucian call for the peaceful mediation of disputes and the Christian commandment not to kill. The Taiping hymn that accompanies the Christian sixth commandment says, “The whole world is one family, and all men are brethren / How can they be permitted to kill and destroy one another? / The outward form and the inward principle are both conferred by Heaven / Allow everyone, then, to enjoy the ease and comfort which he desires” (Cheng, Lestz 142).


References

Cheng, Pei-kai, Michael Lestz, and Jonathan D. Spence, eds. The Search for Modern China (New York: W.W. Norton & Company, 1999), 128-149.

Schoppa, R. Keith. Revolution and its Past (New Jersey: Prentice Hall, 2011), 71-76.

Qing Dynasty and Language

If your homeland were conquered by a foreign power, which would you expect: your occupier to force its foreign tongue upon you, or to adopt your language and operate its new government under it? Language is a powerful cultural identifier. For the Manchu people who conquered Ming Dynasty China and established the Qing Dynasty (1644-1911) in the seventeenth century, language was the most important factor in establishing the legitimacy of their rule. Careful analysis of Evelyn Rawski’s “Reenvisioning the Qing” reveals that the Manchu sought both to preserve and spread their own language and to embrace the linguistic identity of the Han Chinese, with intriguing historical consequences. Whether this seemingly counterproductive policy helped or hurt the Manchu in maintaining their empire is worth examining.

The Manchu had a history of interesting language interaction before seizing China. According to Rawski, “Mongol allies were vital to the Manchu conquest. Since these alliances were usually cemented by marriage exchanges, early Qing emperors claimed Mongol as well as Manchu ancestry. Mongolian and Manchu were the primary languages during the crucial conquest decades before 1644” (834). Even before they took Beijing, the Manchu were accustomed to adopting other tongues and sharing their own. Rawski calls attention to “the ability of the Manchus to bind warriors from a variety of cultural backgrounds to their cause” (834). Language was key to their success. Perhaps the ease with which the Manchu allowed a mutual exchange of language served as a precedent for their seemingly contradictory policies in China.

When the Qing Dynasty began, the Manchu immediately set to work dispelling the view that they were foreigners and establishing themselves as acceptable rulers. “The adoption of Ming state rituals was a crucial way for Manchu rulers to assert their legitimacy by linking themselves to the former legitimate imperial state” (Schoppa, 32). Religious, political, and household rituals were included. Rawski claims “the determination of the rulers to present themselves to their Chinese subjects as Confucian monarchs is evident in their acquisition of Chinese” (834). Along with adopting other Confucian rituals, the Manchu made a point of learning the Chinese language. As the empire expanded, so did their embrace of local languages. The Qianlong emperor of the eighteenth century spoke Manchu, Chinese, Mongolian, Uighur, and Tibetan, and declared these to be the official state languages (Rawski, 835). Rawski notes, “The emperor commissioned translations, dictionary compilations, and other projects to promote each language” (835). It is evident that the Manchu leaders wished to make the tongues of Han China a part of their own identity.

On the other side of the coin, they also aimed to preserve and teach Manchu. Rawski writes, “Northeastern peoples like the Daur, who had no written language of their own, learned Manchu” (836). The Manchu encouraged use of native languages throughout China, but here one sees the Manchu also sought to spread their own. The Daur, Ewenk, and Oroqen eventually spoke Manchu and wrote in its script (Rawski, 836). The Manchu also sought to teach their language to allied leaders residing in the capital: “Living in Peking, surrounded by the splendors of Han Chinese culture, they developed in the eighteenth century a definition of Manchu identity that stressed…fluency in the Manchu language” (Rawski, 838). Furthermore, the Manchu had many works translated into their tongue, and kept their government records and history in Manchu.

To the casual observer, it would seem that employing both strategies—preserving Manchu and embracing Chinese languages—would prove counterproductive. One might think that the Manchu should have required the use of their tongue in an effort to solidify their rule, or perhaps one would expect the Manchu to give up their language altogether to fully “sinicize,” to blend into Chinese culture. After all, the Manchu writing system was in its infancy, having just been created by Nurgaci and his son Hongtaiji (Rawski, 840). One could make the case that sinicization would have been more complete had they let their native tongue die out. However, by maintaining their language and encouraging native languages, the Manchu established a balance of power that was the key to preserving their rule. It allowed them to demonstrate the legitimacy of their rule and hold a multiethnic empire together.

The Manchu spread more than their language, of course, but rituals and traditions (such as mounted archery) do not create a balance of power. Language is key. What better way to show the Han people that life can resume as normal after a hostile takeover than to allow the people the right to continue, and even spread, their own language? Other empires of history have not shown the same wisdom. Additionally, holding on to Manchu within government circles and using it to fill in the gaps of literacy (as noted before, with the tribes of the outer regions) allows the invaders to preserve their identity. It distinguishes them, yes, but not in a way harmful to their rule, not in a way that marks them as aliens. They do so in a way that blends their tongue and thus their culture seamlessly into the multiethnic realm that is China. Whether one accepts that sinicization allowed the Qing Dynasty to last so long, or that it was by building cultural links with multiple ethnic groups as Rawski believes (831), the balance of power the Manchu created through language was instrumental.


References

Evelyn S. Rawski, “Reenvisioning the Qing: The Significance of the Qing Period in Chinese History,” The Journal of Asian Studies 55, no. 4 (November 1996): 829-850.

R. Keith Schoppa, Revolution and its Past (New Jersey: Prentice Hall, 2011), 32.

Chen Village

 

Chan, Anita, Richard Madsen, and Jonathan Unger, pp. 1-40, 74-168, 186-212 in Chen Village: Revolution to Globalization (1984)

 

The three authors of this text provide a captivating narrative of a small community called Chen Village under the government of the Chinese Communist Party, which enacts various reform efforts upon China with often harrowing effects. From the Great Leap Forward to the campaigns to the Cultural Revolution, Chen Village suffers and struggles to survive under Mao’s and then Deng Xiaoping’s policies. The authors’ argument (or one of them) is that Chinese village leaders, such as Qingfa of Chen Village, often found themselves in a cruel irony: they came to power because they were seen as opponents of class privilege and were removed from power because they were seen as beneficiaries of it. So it is with Chen Qingfa. Commune leaders were looking for a man of words, a man of action, and a man of wisdom. Party leaders also wanted to select someone with a “clean” class background; Qingfa was destitute and had been his whole life. He was illiterate, with humble beginnings. He was their man, and was thus appointed secretary.

Qingfa would later come under fire, transformed into an image of a hated landlord. His relations with former, since-dispossessed landlords would incite criticism. He would be accused of giving the best land to himself and his kin, and of eating finer foods than were available to the common man. He was disgraced under the accusation that he received foreign capitalist gifts and thus supported capitalism. Overall, having a better life or holding a leadership role was often seen as being of higher class. This impossible situation meant that, in addition to the turbulent nature of China’s economy and the CCP’s campaigns and policies, leadership roles such as Qingfa’s would be severely unstable and in a constant state of flux. This only hurt China and slowed its recovery.

The authors use concrete evidence. Because many Chinese who lived through this period are still alive, there is a plethora of direct quotations from interviewees. Written documents from the time period are also used as primary sources. The book is convincing and effective in showing the reader what Chen Village went through during those trying days.

One thing that struck me was how the sense of identity rooted in kinship refused to budge even in the face of communist reforms and their new ideology. Qingfa was most helpful to his relatives and neighbors, even going so far as to rig the land distribution lottery to ensure they got better land. He grouped his closest friends into the same work team. Even amid all the talk of communes, equality, and classlessness, older ingrained beliefs and traditions remained. It seems to me that Qingfa did create a higher class for himself, his family, and his friends. The ancient sense of identity undermined communism, ensuring that the creation of a classless society would ultimately fail.
