Merit Pay

“Too many supporters of my party have resisted the idea of rewarding excellence in teaching with extra pay, even though we know it can make a difference in the classroom,” President Barack Obama said in March 2009. The statement foreshadowed the appearance of teacher merit pay in Obama’s “Race to the Top” education initiative, which grants federal funds to top-performing states. Performance, of course, is based on standardized testing, and in the flawed Race to the Top, so are teacher salaries. Teacher pay could rise and fall with student test scores.

Rhetoric concerning higher teacher salaries is a good thing. Proponents of merit pay say meager teacher salaries are an injustice, and such a pay system is needed to alleviate the nation’s teacher shortage. However, is linking pay to test scores the best way to “reward excellence”? Do we know, without question, it “can make a difference in the classroom”? The answers, respectively, are no and no. Merit pay is an inefficient and potentially counterproductive way to improve education in American public schools. It fails to motivate teachers to better themselves or remain in the profession, it encourages unhealthy teacher competition and dishonest conduct, and it does not serve certain groups, like special education students, well.

Educator Alfie Kohn, author of the brilliant Punished by Rewards, wrote an article in 2003 entitled “The Folly of Merit Pay.” He writes, “No controlled scientific study has ever found a long-term enhancement of the quality of work as a result of any incentive system.” Merit pay simply does not work. It has been implemented here and there for decades, but is always abandoned. A good teacher is intrinsically motivated: he teaches because he enjoys it. She teaches because it betters society. He teaches because it is personally fulfilling. Advocates of merit pay ignore such motivation, but Kohn declares, “Researchers have demonstrated repeatedly that the use of such extrinsic inducements often reduces intrinsic motivation. The more that people are rewarded, the more they tend to lose interest in whatever they had to do to get the reward.” Extra cash sounds great, but it is destructive to the inner passions of quality teachers.

Teachers generally rank salaries below too much standardization and unfavorable accountability on their lists of grievances (Kohn, 2003). Educators leave the profession because they are being choked by federal standards and control, and politicians believe linking pay to such problems is a viable solution? Professionals also generally oppose merit pay, disliking its competitive nature. Professor and historian Diane Ravitch writes an incentive “gets everyone thinking about what is good for himself or herself and leads to forgetting about the goals of the organization. It incentivizes short-term thinking and discourages long-term thinking” (Strauss, 2011). Teaching students should not be a game, with big prizes for the winners.

Further, at issue is the distorted view of students that performance pay perpetuates. Bill Raabe of the National Education Association says, “We all must be wary of any system that creates a climate where students are viewed as part of the pay equation, rather than young people who deserve a high quality education” (Rosales, 2009). In the current environment of high-stakes tests (which do not really evaluate the quality of teaching at all), merit pay is just another way to encourage educators to “teach to the test,” or worse: cheating. The nation has already seen public school teachers under so much pressure they resort to modifying their students’ scores in order to save their salaries or their jobs.

It is clear that merit pay does not serve young learners, but this is especially true for special education students. The Individuals with Disabilities Education Act (IDEA) requires states that accept federal funding to provide individual educational services to all children with disabilities. While the widespread “inclusion” of SPED children in regular classrooms is appropriate, the students are also included in the accountability statutes of No Child Left Behind. SPED students are required to meet “adequate yearly progress” (AYP) standards based on high-stakes tests in reading, math, and science, like other students. While some youths with “significant cognitive disabilities” (undefined by federal law) can take alternate assessments, there is a cap on how many students can do so (Yell, Katsiyannis, & Shiner, 2006, pp. 35-36). Most special education students must be included in standardized tests.

The abilities and needs of special education students are too diverse to be put in the box that is a standardized test. SPED students are essentially being asked to perform at their chronological grade level, and for some students that is simply not possible. How does that fit with a Free Appropriate Public Education, the education program the IDEA guarantees, which focuses on “individualized” plans for the “unique needs” of the student? It does not. Progress is individual, not standardized. Further, linking teacher pay to this unreasonable accountability only makes matters worse. Performance pay will likely punish special education instructors. Each year, SPED students may make steady progress (be it academic, cognitive, social, or emotional), but teachers will see their salaries stagnate or be slashed because such gains do not meet federal or state benchmarks. Such an uphill battle will discourage men and women from entering the special education field, meaning fewer quality instructors to serve students with disabilities.

When a school defines the quality of teaching by how well students perform on one test once a year, everyone loses. When pay is in the equation, it’s worse. Obama deserves credit for beginning to phase out NCLB, but merit pay is no way to make public schools more effective. If politicians want to pay good teachers better and weed out poor teachers, their efforts would be better directed at raising salaries across the board and reforming tenure.

 

References

Kohn, A. (2003). The Folly of Merit Pay. Retrieved February 19, 2012 from https://www.alfiekohn.org/article/folly-merit-pay/.

Rosales, J. (2009). Pay Based on Test Scores? Retrieved February 19, 2012 from http://www.nea.org/home/36780.html.

Strauss, V. (2011). Ravitch: Why Merit Pay for Teachers Doesn’t Work. Retrieved February 19, 2012 from http://www.washingtonpost.com/blogs/answer-sheet/post/ravitch-why-merit-pay-for-teachers-doesnt-work/2011/03/29/AFn5w9yB_blog.html.

Yell, M. L., Katsiyannis, A., & Shiner, J. G. (2006). The No Child Left Behind Act, Adequate Yearly Progress, and Students with Disabilities. Teaching Exceptional Children, 38(4), 32-39.

On Student Teaching

I am now two weeks from concluding my first student teaching placement (Visitation School), and my classroom management skills are still being refined. After observing for five days, slowly beginning my integration into a leadership role, I took over completely from my cooperating teacher. While excited to start, initially I had a couple days where I found one 6th grade class (my homeroom) difficult to control. There were times when other classes stepped out of line, naturally, but the consistency with which my homeroom became noisy and rowdy was discouraging.

“They’re your homeroom,” my cooperating teacher reminded me. “They feel more at home in your classroom, and will try to get away with more.”

There were a few instances where students took someone else’s property, or wrote notes to classmates, but the side chatter was the major offense. I would be attempting to teach and each table would have at least someone making conversation, which obviously distracts both those who wish to pay attention and those who don’t care. I would ask them to refocus and quiet themselves, which would work for but a few precious moments. There was one day I remember I felt very much as if the students were controlling me, rather than the other way around, and I made the mistake of hesitating when I could have doled out consequences. I spoke to my cooperating teacher about it during our feedback session, and she emphasized to me that I needed to prove to the students my willingness to enforce the policies, that I have the same authority as any other teacher in the building.

At Visitation, the classroom management system revolves around “tallies,” one of which equals three laps at recess before one can begin play. My homeroom deserved a tally the day I hesitated. I needed to come up with a concrete, consistent way of disciplining disruptive behavior. So I went home and developed a simple system I had thought about long ago: behavior management based on soccer. I cut out and laminated a yellow card and a red card. The next day, I sat each class down in the hall before they entered the room, and told them the yellow card would be shown as a warning and the red card would mean tallies. These could be given individually or as a class, and, as in soccer, a red card could be given without a yellow card.

The students were surprisingly excited about this. Perhaps turning punishment into a game intrigued them; regardless, it made me wonder whether it would work. But it seemed discussing the expectations I had of them, and the enforcement of those expectations, helped a good deal. Further, I was able to overcome my hesitation that day and dole out consequences for inappropriate behavior. I gave my homeroom a yellow card and then a red card, and they walked laps the next day.

My cooperating teacher noted the system would be effective because it was visual for the students. I also found that it allowed me to easily maintain emotional control; instead of raising my voice, I simply raised a card in my hand, and the class refocused. Its visibility allowed me to say nothing at all.

While different in purpose and practice, this system draws important elements from the Do It Again system educator Doug Lemov describes, including no administrative follow-up and logical consequences, but most significantly group accountability (Lemov, 2010, p. 192). It holds an entire class responsible for individual actions, and “builds incentives for individuals to behave positively since it makes them accountable to their peers as well as their teacher” (p. 192). Indeed, my classes almost immediately started regulating themselves, keeping themselves accountable for following my expectations (telling each other to be quiet and settle down, for instance, before I had to say anything).

Lemov would perhaps frown upon the yellow card, and point to the behavioral management technique called No Warning (p. 199). He suggests teachers:

  • Act early. Try to see the favor you are doing kids in catching off-task behavior early and using a minor intervention of consequence to prevent a major consequence later.
  • Act reliably. Be predictably consistent, sufficient to take the variable of how you will react out of the equation and focus students on the action that precipitated your response.
  • Act proportionately. Start small when the misbehavior is small; don’t go nuclear unless the situation is nuclear.

I have tried to follow these guidelines to the best of my ability, but Lemov would say the warning is not taking action, only telling students “a certain amount of disobedience will not only be tolerated but is expected” (p. 200). He would say students will get away with what they can until they are warned, and will only refocus and cease their side conversations afterwards. Lemov makes a valid point, and I have indeed seen this happen to a degree. As a whole, however, the system has been effective, and most of my classes do not at all take advantage of their warning. Knowing they can receive a consequence without a warning has helped, perhaps. After a month of using the cards, I have given my homeroom a red card three times. In my other five classes combined during the same period, there have been two yellows and only one red. I have issued a few individual yellows, but no reds.

Perhaps it is counterproductive to have a warning, but I personally feel that since the primary focus of the system is on group accountability, I need to give talkative students a chance to correct their behavior before consequences are doled out for the entire class. Sometimes a reminder is necessary, the reminder that their actions affect their classmates and that they need to refocus. I do not want to punish the students who are not being disruptive along with those who are without issuing some sort of warning that they are on thin ice.

 

 

During my two student teaching placements this semester, I greatly enjoyed getting to know my students. It was one of the more rewarding aspects of teaching. Introducing myself and my interests in detail on the first day I arrived proved to be an excellent start; I told them I liked history, soccer, drawing, reading, etc. Building relationships was easy, as students seemed fascinated by me and had an endless array of questions about who I was and where I came from.

Art is something I used to connect with students. At both my schools, the first students I got to know were the budding artists, as I was able to observe them sketching in the corners of their notebooks and later ask to see their work. There was one girl at my first placement who drew a new breed of horse on the homeroom whiteboard each morning; a boy at my second placement was drawing incredible fantasy figures every spare second he had. I was the same way when I was their age, so naturally I struck up conversations about their pictures. I tried to take advantage of such an interest by asking students to draw posters of Hindu gods or sketch images next to vocabulary words to aid recall. Not everyone likes to draw, but I like to encourage the skill and at least provide them an opportunity to try. Beyond this, I would use what novels students had with them to learn about their fascinations and engage them, and many were excited I knew The Hunger Games, The Hobbit, and The Lord of the Rings. We would discuss our favorite characters and compare such fiction to recent films.

For all my students, I strove to engage them each day with positive behavior, including greeting them by name at the door, drawing with and for them, laughing and joking with them, maintaining a high level of interest in what students were telling me (even if they rambled aimlessly, as they had the tendency to do), and even twice playing soccer with them at recess. The Catholic community of my first placement also provided the chance to worship and pray with my kids, an experience I will not forget.

One of my successes was remaining emotionally cool, giving students a sense of calm, confidence, and control about me. Marzano (2007) writes, “It is important to keep in mind that emotional objectivity does not imply being impersonal with or cool towards students. Rather, it involves keeping a type of emotional distance from the ups and downs of classroom life and not taking students’ outbursts or even students’ direct acts of disobedience personally” (p. 152). Even when I was feeling control slipping away from me, I did my best to be calm, keep my voice low, and correct students in a respectful manner that reminded them they had expectations they needed to meet. Lemov (2010) agrees, writing, “An emotionally constant teacher earns students’ trust in part by having them know he is always under control. Most of all, he knows success is in the long run about a student’s consistent relationship with productive behaviors” (p. 219). Building positive relationships required mutual respect and trust, and emotional constancy was key.

Another technique I emphasized was the demonstration of my passion for social studies, to prove to them the gravity of my personal investment in their success. One lesson from my first placement covered the persecution of Anne Hutchinson in Puritan America; we connected it to modern sexism, such as discrimination against women in terms of wage earnings. Another lesson was about racism, how it originated as a justification for African slavery and how the election of Barack Obama brought forth a surge of openly racist sentiment from part of the U.S. citizenry. I told them repeatedly that we studied history to become dissenters and activists, people who would rise up and destroy sexism and racism. I told them I had a personal stake in their understanding of such material, a personal stake in their future, because they were the ones responsible for changing our society in positive ways. Being the next generation, ending social injustices would soon be up to them.

Marzano (2007) says, “Arguably the quality of the relationships teachers have with students is the keystone of effective management and perhaps even the entirety of teaching” (p. 149). In my observation experiences, I saw burnt-out and bitter teachers who focused their efforts on authoritarian control and left positive relationship-building on the sideline. The lack of strong relationships usually meant more chaotic classrooms and more disruptive behavior. As my career begins, I plan to make my stake in student success and my compassion for each person obvious, and to stay in that habit.

 

References:

Lemov, D. (2010). Teach like a champion: 49 techniques that put students on the path to college. San Francisco, CA: Jossey-Bass.

Marzano, R. (2007). The art and science of teaching. Alexandria, VA: Association for Supervision and Curriculum Development.

Bernie Will Win Iowa

Predicting the future isn’t something I make a habit of. It is a perilous activity, always involving a strong chance of being wrong and looking the fool. Yet sometimes, here and there, conditions unfold around us in a way that gives one enough confidence to hazard a prediction. I believe that Bernie Sanders will win Iowa today.

First, consider that Bernie is at the top of the polls. Polls aren’t always reliable predictors, and he’s neck-and-neck with an opponent in some of them, but it’s a good sign.

Second, Bernie raised the most money in Q4 of 2019 by far, a solid $10 million more than the second-place candidate, Pete Buttigieg. He has more individual donations at this stage than any candidate in American history, has raised the most overall in this campaign, and is among the top spenders in Iowa. (These analyses exclude billionaire self-funders Bloomberg and Steyer, who have little real support.) As with a rise in the polls, he has momentum like no one else.

Third, Bernie is the only candidate in this race who was campaigning in Iowa in 2016, which means more voter touches and repeat voter touches. This is Round 2 for him, an advantage — everyone else is in Round 1.

Next, don’t forget, Iowa in 2016 was nearly a tie between Bernie and Hillary Clinton. It was the closest result in the state’s caucus history; Hillary won just 0.3% more delegate equivalents. It’s probably safe to say Bernie is better known today, four years later — if he could tie then, he can win now.

Fifth, in Iowa in 2016, there were essentially two voting blocs: the Hillary Bloc and the Bernie Bloc. (There was a third but insignificant candidate.) These are the people who actually show up to caucus — what will they do now? I look at the Bernie Bloc as probably remaining mostly intact. He may lose some voters to Warren or others, as this field has more progressive options than last time, but I think his supporters’ fanatical passion and other voters’ interest in the most progressive candidate will mostly keep the Bloc together. The Hillary Bloc, of course, will be split between the many other candidates — leaving Bernie the victor. (Even if there is much higher turnout than in 2016, I expect the multitude of candidates to aid Bernie — and many of the new voters will go to him, especially if they’re young. An historic youth turnout is expected, and they mostly back Sanders.)

This last one is simply anecdotal. All candidates have devoted campaigners helping them. But I must say it. The best activists I know are on the case. They’ve put their Kansas City lives on hold and are in Iowa right now. The Kansas City Left has Bernie’s back, and I believe in them.

To victory, friends.

The Enduring Stupidity of the Electoral College

To any sensible person, the Electoral College is a severely flawed method of electing our president. Most importantly, it is a system in which the less popular candidate — the person with fewer votes — can win the White House. That absurdity would be enough to throw the Electoral College out and simply use a popular vote to determine the winner. Yet there is more.

It is a system where your vote becomes meaningless, giving no aid to your chosen candidate, if you’re in your state’s political minority; where small states have disproportionate power to determine the winner; where white voters have disproportionate decision-making power compared to voters of color; and where electors, who are supposed to represent the majority of voters in each state, can change their minds and vote for whomever they please. Not even its origins are pure, as slavery and the desire to keep voting power away from ordinary people were factors in its design.

Let’s consider these problems in detail. We’ll also look at the threadbare attempts to justify them.

The votes of the political minority become worthless, leading to a victor with fewer votes than the loser

When we vote in presidential elections, we’re not actually voting for the candidates. We’re voting on whether to award decision-making power to Democratic or Republican electors. 538 people will cast their votes, and the candidate who receives a majority, at least 270 votes, will win. The electors are chosen by the political parties at state conventions, through committees, or by the presidential candidates. It depends on the state. The electors could be anyone, but are usually involved with the parties or are close allies. In 2016, for instance, electors included Bill Clinton and Donald Trump, Jr. Since they are chosen for their loyalty, they typically (though not always, as we will see) vote for the party that chose them.

The central problem with this system is that nearly all states award their electors on an all-or-nothing basis. (Only two states, Maine and Nebraska, have acted on this unfairness, awarding two electors to the statewide winner and one to the winner of each congressional district.) As a candidate, winning a state by a single citizen vote grants you all of its electors.

Imagine you live in Missouri. Let’s say in 2020 you vote Republican, but the Democratic candidate wins the state; the majority of Missourians voted Blue. All of Missouri’s 10 electors are then awarded to the Democratic candidate. When that happens, your vote does absolutely nothing to help your chosen candidate win the White House. It has no value. Only the votes of the political majority in the state influence who wins, by securing electors. It’s as if you never voted at all — it might as well have been 100% of Missourians voting Blue. As a Republican, wouldn’t you rather have your vote matter as much as all the Democratic votes in Missouri? For instance, 1 million Republican votes pushing the Republican candidate toward victory alongside the, say, 1.5 million Democratic votes pushing the Democratic candidate forward? Versus zero electors for the Republican candidate and 10 electors for the Democrat?

In terms of real contribution to a candidate’s victory, the outcomes can be broken down, and compared to a popular vote, in this way:

State Electoral College victor: contribution (electors)
State Electoral College loser: no contribution (no electors)

State popular vote victor: contribution (votes)
State popular vote loser: contribution (votes)

Under a popular vote, however, your vote won’t become meaningless if you’re in the political minority in your state. It will offer an actual contribution to your favored candidate. It will be worth the same as the vote of someone in the political majority. The Electoral College simply does not award equal value to each vote (see more examples below), whereas the popular vote does, by allowing the votes of the political minority to influence the final outcome. That’s better for voters, as it gives votes equal power. It’s also better for candidates, as the loser in each state would actually get something for his or her efforts. He or she would keep the earned votes, moving forward in his or her popular vote count. Instead of getting zero electors — no progress at all.

But why, one may ask, does this really matter? When it comes to determining who wins a state and gets its electors, all votes are of equal value. The majority wins, earning the right to give all the electors to its chosen candidate. How exactly is this unfair?

It’s unfair because, when all the states operate under such a system, it can lead to the candidate with fewer votes winning the White House. Electors are distributed winner-take-all, and each state’s political-minority votes are ignored; but those votes can add up. 66 million Americans may choose the politician you support, but the other candidate may win with just 63 million votes. That’s what happened in 2016. It also happened in 2000, as well as in 1876 and 1888. It simply isn’t fair or just for a candidate with fewer votes to win. It is mathematically possible, in fact, to win just 21.8% of the popular vote and still win the presidency. While very unlikely, it is possible. That would mean, for example, a winner with 28 million votes and a loser with 101 million! This is absurd and unfair on its face. The candidate with the most votes should be the victor, as is the case with every other political race in the United States, and as is standard practice among the majority of the world’s democracies.
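The arithmetic of a minority-vote winner can be sketched with a toy model. The states, elector counts, and vote totals below are invented for illustration, not real election data:

```python
# Toy model: winner-take-all elector awards can crown a candidate
# with fewer total votes. All numbers here are hypothetical.

# (electors, votes_for_A, votes_for_B) per imaginary state
states = [
    (20, 51, 49),   # A wins narrowly: all 20 electors for 51 votes
    (20, 51, 49),   # A wins narrowly again
    (15, 0, 100),   # B wins in a landslide, but gets only 15 electors
]

electors_a = sum(e for e, a, b in states if a > b)
electors_b = sum(e for e, a, b in states if b > a)
votes_a = sum(a for e, a, b in states)
votes_b = sum(b for e, a, b in states)

print(f"A: {votes_a} votes, {electors_a} electors")
print(f"B: {votes_b} votes, {electors_b} electors")
# A wins the Electoral College 40-15 despite losing the popular
# vote 102-198: B's landslide margin earns B nothing extra.
```

Narrow wins harvest every elector, while lopsided losses waste every surplus vote; scale the same pattern up to fifty states and you get 2016 or 2000.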

The lack of fairness and unequal value of citizen votes go deeper, however.

Small states and white power

Under the Electoral College, your vote is worth less in big states. For instance, Texas, with 28.7 million people and 38 electors, has one elector for every 755,000 people. But Wyoming, with 578,000 people and 3 electors, has one elector for every 193,000 people. In other words, each Wyoming voter has a bigger influence over who wins the presidency than each Texas voter. The smallest states, home to 4% of the U.S. population, hold 8% of the electors. Why not 4%, to keep votes equal? (For those who think all this was the intent of the Founders, to give more power to smaller states, we’ll address that later on.)
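The per-elector figures above are simple division; here is the arithmetic, using the population and elector counts quoted in this essay:

```python
# People represented per elector, using the figures cited above.
texas_pop, texas_electors = 28_700_000, 38
wyoming_pop, wyoming_electors = 578_000, 3

per_elector_tx = texas_pop / texas_electors      # ~755,000 people per elector
per_elector_wy = wyoming_pop / wyoming_electors  # ~193,000 people per elector

# A Wyoming ballot carries roughly four times the elector
# weight of a Texas ballot.
relative_weight = per_elector_tx / per_elector_wy
print(round(per_elector_tx), round(per_elector_wy), round(relative_weight, 1))
```

Under a popular vote the ratio is 1.0 by definition: one person, one equally weighted vote.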

To make things even, Texas would need many more electors. As would other big states. You have to look at changing population data and frequently adjust electors, as the government is supposed to do based on the census and House representation — it just doesn’t do it very well. It would be better to do away with the Electoral College entirely, because under a popular vote the vote of someone from Wyoming would be precisely equal to the vote of a Texan. Each would be one vote out of the 130 million or so cast. No adjustments needed.

It also just so happens that less populous states tend to be very white, and more populous states more diverse, meaning disproportionate white decision-making power overall.

Relatedly, it’s important to note that the political minority in each state, which will become inconsequential to the presidential race, is sometimes dominated by racial minorities, or at least most voters of color will belong to it. As Bob Wing writes, because “in almost every election white Republicans out-vote [blacks, most all Democrats] in every Southern state and every border state except Maryland,” the “Electoral College result was the same as if African Americans in the South had not voted at all.”

Faithless electors

After state residents vote for electors, the electors can essentially vote for whomever they want, in many states at least. “There are 32 states (plus the District of Columbia) that require electors to vote for a pledged candidate. Most of those states (19 plus DC) nonetheless do not provide for any penalty or any mechanism to prevent the deviant vote from counting as cast. Four states provide a penalty of some sort for a deviant vote, and 11 states provide for the vote to be canceled and the elector replaced…”

Now, electors are chosen specifically for their loyalty, and “faithless electors” are extremely rare, but that doesn’t mean they will always vote for the candidate you elected them to vote for. There have been 85 electors in U.S. history who abstained or changed their vote on a whim, sometimes for racist reasons, sometimes by accident. Even more changed their votes after a candidate died; perhaps the voters would have liked to select another option themselves. Even if rare, all this should not be possible or legal. It is yet another way the Electoral College has built-in unfairness: imagine the will of a state’s political majority being ignored.

* * *

Won’t a popular vote give too much power to big states and cities?

Let’s turn now to the arguments against a popular vote, usually heard from conservatives. A common one is that big states, or big cities, will “have too much power.” Rural areas and less populous states and towns will supposedly have less.

This misunderstands power. States don’t vote. Cities don’t vote. People do. If we’re speaking solely about power, about influence, where you live does not matter. The vote of someone in Eudora, Kansas, is worth the same as someone in New York, New York.

This argument is typically posited by those who think that because some big, populous states like California and New York are liberal, this will mean liberal rule. (Conservative Texas, the second-most populous state, and sometimes-conservative swing states like Florida [third-most populous] and Pennsylvania [fifth-most populous] are ignored.) Likewise, because a majority of Americans today live in cities, and cities tend to be more liberal than small towns, this will result in the same. The concern for rural America and small states is really a concern for Republican power.

But obviously, in a direct election each person’s vote is of equal weight and importance, regardless of where you live. 63% of Americans live in cities, so it is true that most voters will be living and voting in cities, but it cannot be said the small-town voter has a weaker voice than the city dweller. Their votes have identical sway over who will be president. In the same way, a voter in a populous coastal state has no more influence than one in Arkansas.

No conservative looks with dismay at the direct election of his Democratic governor or congresswoman and says, “She only won because the small towns don’t have a voice. We have to find a way to diminish the power of the big cities!” No one complains that X area has too many people and too many liberals and argues some system should fix this. No one cries, “Tyranny of the majority! Mob rule!” They say, “She got the most votes, seems fair.” Why? Because one understands that the vote of the rural citizen is worth the same as the vote of an urban citizen, but if there happens to be more people living in cities in your state, or if there are more liberals in your state, so be it. That’s the freedom to live where you wish, believe what you wish, and have a vote worth the same as everyone else’s.

Think about the popular vote in past elections. About half of Americans vote Republican, about half vote Democrat. One candidate gets a few hundred thousand or a few million more. It would be exactly the same if the popular vote determined the winner rather than the Electoral College: where you live is irrelevant. What matters is the final vote tally.

It’s not enough to simply complain that the United States is too liberal and conclude that we must therefore preserve the Electoral College. That’s really what this argument boils down to, and it’s not an argument at all. Unfair structures can’t be justified because they serve one political party. Whoever can win the most American votes should be president, no matter what party they come from.

But won’t candidates only pander to big states and cities?

This is a different question, and it has merit. It is true that where candidates campaign will change with the implementation of a popular vote. Conservatives warn that candidates will spend most of their time in the big cities and big states, and ignore rural places. This is likely true, as candidates (of both parties) will want to reach as many voters as possible in the time they have to garner support.

Yet this carries no weight as an argument against a popular vote, because the Electoral College has a very similar problem. Candidates focus their attention on swing states.

There’s a reason Democrats don’t typically campaign very hard in conservative Texas and Republicans don’t campaign hard in liberal California. Instead, they campaign in states that are more evenly divided ideologically, states that sometimes go Blue and sometimes go Red. They focus also on swing states with a decent number of electors. The majority of campaign events are in just six states. Unless you live in one of these places, like Ohio, Florida, or Pennsylvania, your vote isn’t as vital to victory and your state won’t get as much pandering. The voters in swing states are vastly more important, their votes much more valuable than elsewhere.

Electoral College supporters never explain why candidates focusing on a handful of swing states is so much better than candidates focusing on more populous areas. Swapping one for the other seems a fair trade, and with a popular vote we also get the candidate with the most support always winning, votes of equal worth, and no higher-ups to ignore the will of the people.

However, with a significant number of Americans still living outside big cities, attention will likely still be paid to rural voters — especially, one might assume, by the Republican candidate. Nearly 40% of the nation living in small towns and small states isn’t something wisely ignored. Wherever the parties shift most of their attention, there is every reason to think Blue candidates will want to solidify their win by courting Blue voters in small towns and states, and Red candidates will want to ensure theirs by courting Red voters in big cities and states. Even if the rural voting bloc didn’t matter and couldn’t sway the election (it would and could), one might ask how a handful of big states and cities alone determining the outcome of the election is so much worse than a few swing states doing the same in the Electoral College system.

Likewise, the fear that a president, plotting reelection, will better serve the interests of big states and cities seems about as reasonable as fear that he or she would better serve the interests of the swing states today. One is hardly riskier than the other.

But didn’t the Founders see good reason for the Electoral College?

First, it’s important to note that invoking the Founding Fathers doesn’t automatically justify flawed governmental systems. The Founders were not perfect, and many of the policies and institutions they decreed in the Constitution are now gone.

Even before the Constitution, the Founders’ Articles of Confederation were scrapped after just seven years. Later, the Twelfth Amendment got rid of a system where the losing presidential candidate automatically became vice president — a reform of the Electoral College. Our senators were elected by the state legislatures, not we the people, until 1913 (Amendment 17 overturned clauses from Article 1, Section 3 of the Constitution). Only in 1856 did the last state, North Carolina, do away with property requirements to vote for members of the House of Representatives, allowing the poor to participate. The Three-Fifths Compromise (the Enumeration Clause of the Constitution), which valued slaves less than full people for political representation purposes, is gone, and today people of color, women, and people without property can vote thanks to various amendments. There were no term limits for the president until 1951 (Amendment 22) — apparently an executive without term limits didn’t give the Founders nightmares of tyranny.

The Founders were very concerned about keeping political power away from ordinary people, who might take away their riches and privileges. They wanted the wealthy few, like themselves, to make the decisions. See How the Founding Fathers Protected Their Own Wealth and Power.

The Electoral College, at its heart, was a compromise between Congress selecting the president and the citizenry doing so. The people would choose the people to choose the president. Alexander Hamilton wrote that the “sense of the people should operate in the choice of the person to whom so important a trust was to be confided.” He thought “a small number of persons, selected by their fellow-citizens from the general mass, will be most likely to possess the information and discernment requisite to such complicated investigations.”

Yet the Founders did not anticipate that states would pass winner-take-all elector policies, and some wanted the practice abolished. The Constitution and its writers did not establish such a mechanism. States did, and only after the Constitution, which established the Electoral College, was written. In 1789, only three states had such laws, according to the Institute for Research on Presidential Elections. It wasn’t until 1836 that every state (save one, which held out until after the Civil War) adopted a winner-take-all law; they sought more attention from candidates by offering all electors to the victor, they wanted their favorite sons to win more electors, and so forth. Before (and alongside) the winner-take-all laws, states were divided into districts and the people in each district would elect an elector (meaning a state’s electors could be divided up among candidates). Alternatively, state legislatures would choose the electors, meaning citizens did not vote for the president in any way, even indirectly! James Madison wrote that “the district mode was mostly, if not exclusively in view when the Constitution was framed and adopted; & was exchanged for the general ticket [winner-take-all] & the legislative election” later on. He suggested a Constitutional amendment (“The election of Presidential Electors by districts, is an amendment very proper to be brought forward…”) and Hamilton drafted it.

Still, among Founders and states, it was an anti-democratic era. Some Americans prefer more democratic systems, and don’t cling to tradition — especially tradition as awful and unfair as the Electoral College — for its own sake. Some want positive changes to the way government functions and broadened democratic participation, to improve upon and make better what the Founders started, as we have so many times before.

Now, it’s often posited that the Founding Fathers established the Electoral College to make sure small states had more power to determine who won the White House. As we saw above, votes in smaller states are worth more than in big ones.

Even if the argument that “we need the Electoral College so small states can actually help choose the president” made sense in a bygone era where people viewed themselves as Virginians or New Yorkers, not Americans (but rather as part of an alliance called the United States), it makes no sense today. People now see themselves as simply Americans — as American citizens together choosing an American president. Why should where you live determine the power of your vote? Why not simply have everyone’s vote be equal?

More significantly, it cannot be said that strengthening smaller states was a serious concern to the Founders at the Constitutional Convention. They seemed to accept that smaller states would simply have fewer voters and thus less influence. Legal historian Paul Finkelman writes that

in all the debates over the executive at the Constitutional Convention, this issue [of giving more power to small states] never came up. Indeed, the opposite argument received more attention. At one point the Convention considered allowing the state governors to choose the president but backed away from this in part because it would allow the small states to choose one of their own.

In other words, they weren’t looking out for the little guy. Political scientist George C. Edwards III calls this whole idea a “myth,” stressing: “Remember what the country looked like in 1787: The important division was between states that relied on slavery and those that didn’t, not between large and small states.”

Slavery’s influence

The Electoral College is also an echo of white supremacy and slavery.

As the Constitution was formed in the late 1780s, Southern politicians and slave-owners at the Convention had a problem: Northerners were going to get more seats in the House of Representatives (which were to be determined by population) if blacks weren’t counted as people. Southern states had sizable populations, but large portions were disenfranchised slaves and freemen (South Carolina, for instance, was nearly 50% black).

This prompted slave-owners, most of whom considered blacks by nature somewhere between animals and whites, to push for slaves to be counted as fully human for political purposes. They needed blacks for greater representative power for Southern states. Northern states, also seeking an advantaged position, opposed counting slaves as people. This odd reversal brought about the Three-Fifths Compromise most of us know, which determined an African American would be worth three-fifths of a person.

The Electoral College was largely a solution to the same problem. True, as we saw, it served to keep power out of the hands of ordinary people and in the hands of the elites, but race and slavery unquestionably influenced its inception. As the Electoral College Primer put it, Southerners feared “the loss in relative influence of the South because of its large nonvoting slave population.” They were afraid the direct election of the president would put them at a numerical disadvantage. To put it bluntly, Southerners were upset their states didn’t have more white people. A popular vote had to be avoided.

For example, Hugh Williamson of North Carolina remarked at the Convention, during debate on a popular election of the president: “The people will be sure to vote for some man in their own State, and the largest State will be sure to succede [sic]. This will not be Virga. however. Her slaves will have no suffrage.” Williamson saw that states with high populations had an advantage in choosing the president. But a great number of people in Virginia were slaves. Would this mean that Virginia and other slave states didn’t have the numbers of whites to affect the presidential election as much as the large Northern states?

The writer of the Constitution, slave-owner and future American president James Madison, thought so. He said that

There was one difficulty however of a serious nature attending an immediate choice by the people. The right of suffrage was much more diffusive in the Northern than the Southern States; and the latter could have no influence in the election on the score of the Negroes. The substitution of electors obviated this difficulty…

The question for Southerners was: How could one make the total population count for something, even though much of the population couldn’t vote? How could black bodies be used to increase Southern political power? Counting slaves helped put more Southerners in the House of Representatives, and now counting them in the apportionment of electors would help put more Southerners in the White House.

Thus, Southerners pushed for the Electoral College. The number of electors would be based on how many members of Congress each state possessed — which, recall, was affected by counting a black American as three-fifths of a person. Each state would have one elector per representative in the House, plus two for the state’s two senators (today we have 435 + 100 + 3 for D.C. = 538). In this way, the number of electors was still based on population (not the whole population, though, as blacks were not counted as full persons), even though a massive part of the American population in 1787 could not vote. The greater a state’s population, the more House reps it had, and thus the more electors it had. Southern electoral power was secure.

This worked out pretty well for the racists. “For 32 of the Constitution’s first 36 years, a white slaveholding Virginian occupied the presidency,” notes Akhil Reed Amar. The advantage didn’t go unnoticed. Massachusetts congressman Samuel Thatcher complained in 1803, “The representation of slaves adds thirteen members to this House in the present Congress, and eighteen Electors of President and Vice President at the next election.”

Tyrants and imbeciles

At times, it’s suggested that the electors serve an important function: if the people select a dangerous or unqualified candidate — like an authoritarian or a fool — to be the party nominee, the electors can pick someone else and save the nation. Hamilton said, “The process of election affords a moral certainty, that the office of President will never fall to the lot of any man who is not in an eminent degree endowed with the requisite qualifications.”

Obviously, looking at Donald Trump, the Electoral College is just as likely to put an immoral doofus in the White House as to keep one out. Trump may not fit that description to you, but some day a candidate may come along who does. And since the electors are chosen for their loyalty, they are unlikely to stop such a candidate, even if they have the power to be faithless. We might as well simply let the people decide.

It is a strange thing indeed that some people insist a popular vote will lead to dictatorship, ignoring the majority of the world’s democracies, which directly elect their executive officer. They have not plunged into totalitarianism. A popular vote simply doesn’t get rid of checks and balances, co-equal branches, a constitution, the rule of law, and other aspects of free societies. These things are not incompatible.

France has had direct elections since 1965 (de Gaulle). Finland since 1994 (Ahtisaari). Portugal since 1918 (Pais). Poland since 1990 (Wałęsa). Why aren’t these nations run by despots by now? Why do even conservative institutes rank nations like Ireland, Finland, and Austria higher up on a “Human Freedom Index” than the United States? How is this possible, if direct elections of the executive lead to tyranny?

There are many factors that cause dictatorship and ruin, but simply giving the White House to whomever gets the most votes is not necessarily one of them.

Modern motives

We close by stating the obvious. There remains strong support for the Electoral College among conservatives because it has recently aided Republican candidates like Bush (2000) and Trump (2016). If the GOP lost presidential elections due to the Electoral College after winning the popular vote, like the other party does, they’d perhaps see its unfair nature.

The popular vote, in an increasingly diverse, liberal country, doesn’t serve conservative interests. Republicans have won the popular vote just once in the elections from 1992 onward. Conservatives are worried that if the Electoral College vanishes and each citizen has a vote of equal power, their days are numbered. Better to preserve an outdated, anti-democratic system that benefits you than reform your platform and policies to change people’s minds about you and strengthen your support. True, the popular vote may serve Democratic interests. Fairness serves Democratic interests. But, unlike unfairness, which Republicans seek to preserve, fairness is what’s right. Giving the candidate with the most votes the presidency is what’s right.

An Absurd, Fragile President Has Revealed an Absurd, Fragile American System

The FBI investigation into Donald Trump, one of the most ludicrous and deplorable men to ever sit in the Oval Office, was a valuable lesson in just how precariously justice balances on the edge of a knife in the United States. The ease with which any president could obstruct or eliminate accountability for his or her misdeeds should frighten all persons regardless of political ideology.

Let’s consider the methods of the madness, keeping in mind that whether or not a specific president like Trump is innocent of crimes or misconduct, it’s smart to have effective mechanisms in place to bring guilty future executives to justice. The stupidity of the system could be exploited by a president of any political party. This must be rectified.

A president can fire those investigating him — and replace them with allies who could shut everything down

The fact the above statement can be written truthfully about an advanced democracy, as opposed to some totalitarian regime, is insane. Trump of course did fire those looking into his actions, and replaced them with supporters.

The FBI (not the Democrats) launched an investigation into Trump and his associates concerning possible collusion with Russia during the 2016 election and obstruction of justice, obviously justified given his and their suspicious behavior, some of which was connected to actual criminal activity, at least among Trump’s associates who are now felons. Trump fired James Comey, the FBI director, who was overseeing the investigation. Both Trump and his attorney Rudy Giuliani publicly indicated the firing was motivated by the Russia investigation; Comey testified Trump asked him to end the FBI’s look into Trump ally Michael Flynn, though not the overall Russia inquiry.

The power to remove the FBI director could be used to slow down an investigation (or shut it down, if the acting FBI director is loyal to the president, which Andrew McCabe was not), but more importantly a president can then nominate a new FBI director, perhaps someone more loyal, meaning corrupt. (Christopher Wray, Trump’s pick, worked for a law firm that did business with Trump’s business trust, but does not seem a hand-picked devotee like the individuals you will see below, perhaps because by the time his installation came around the investigation was in the hands of Special Counsel Robert Mueller.) The Senate must confirm the nomination, but that isn’t entirely reassuring. The majority party could push through a loyalist, to the dismay of the minority party, and that’s it. Though this would be a rarity, as FBI directors are typically overwhelmingly confirmed, it’s possible. A new director could then end the inquiry.

Further, the president can fire the attorney general, the FBI director’s boss. The head of the Justice Department, this person has ultimate power over investigations into the president — at least until he or she is removed by said president. Trump made clear he was upset with Attorney General Jeff Sessions for recusing himself from overseeing the Russia inquiry because Sessions could have discontinued it. It was reported Trump even asked Sessions to reverse this decision! Sessions recused himself less than a month after taking office, a couple months before Comey was fired. For less than a month, Sessions could have ended it all.

Deputy Attorney General Rod Rosenstein, luckily no Trump lackey, was in charge after Sessions stepped away from the matter. It was Rosenstein who appointed Robert Mueller special counsel and had him take over the FBI investigation, after the nation was so alarmed by Comey’s dismissal. Rosenstein had authority over Mueller and the case (dodging a bullet when Trump tried to order Mueller’s firing but was rebuked by his White House lawyer; Trump could have rescinded statutes that said only the A.G. could fire the special counsel, with an explosive court battle over constitutionality surely following) until Trump fired Sessions and installed loyalist Matt Whitaker as Acting Attorney General. Whitaker is a man who

defended Donald Trump Jr.’s decision to meet with a Russian operative promising dirt on Hillary Clinton. He opposed the appointment of a special counsel to investigate Russian election interference (“Hollow calls for independent prosecutors are just craven attempts to score cheap political points and serve the public in no measurable way.”) Whitaker has called on Rod Rosenstein to curb Mueller’s investigation, and specifically declared Trump’s finances (which include dealings with Russia) off-limits. He has urged Trump’s lawyers not to cooperate with Mueller’s “lynch mob.”

And he has publicly mused that a way to curb Mueller’s power might be to deprive him of resources. “I could see a scenario,” he said on CNN last year, “where Jeff Sessions is replaced, it would [be a] recess appointment and that attorney general doesn’t fire Bob Mueller but he just reduces his budget to so low that his investigation grinds to almost a halt.”

Whitaker required no confirmation from the Senate. Like an official attorney general, he could have ended the inquiry and fired Robert Mueller if he saw “good cause” to do so, or effectively crippled the investigation by limiting its resources or scope. That did not occur, but it’s not hard to imagine Whitaker parroting Trump’s wild accusations of Mueller’s conflicts of interest, or whipping up some bullshit of his own to justify axing the special counsel. The same can be said of Bill Barr, who replaced Whitaker. Barr, who did need Senate confirmation, was also a Trump ally, severely endangering the rule of law:

In the Spring of 2017, Barr penned an op-ed supporting the President’s firing Comey. “Comey’s removal simply has no relevance to the integrity of the Russian investigation as it moves ahead,” he wrote. In June 2017, Barr told The Hill that the obstruction investigation was “asinine” and warned that Mueller risked “taking on the look of an entirely political operation to overthrow the president.” That same month, Barr met with Trump about becoming the president’s personal defense lawyer for the Mueller investigation, before turning down the overture for that job.

In late 2017, Barr wrote to the New York Times supporting the President’s call for further investigations of his past political opponent, Hillary Clinton. “I have long believed that the predicate for investigating the uranium deal, as well as the foundation, is far stronger than any basis for investigating so-called ‘collusion,’” he wrote to the New York Times’ Peter Baker, suggesting that the Uranium One conspiracy theory (which had by that time been repeatedly debunked) had more grounding than the Mueller investigation (which had not). Before Trump nominated him to be attorney general, Barr also notoriously wrote an unsolicited 19-page advisory memo to Rod Rosenstein criticizing the obstruction component of Mueller’s investigation as “fatally misconceived.” The memo’s criticisms proceeded from Barr’s long-held and extreme, absolutist view of executive power, and the memo’s reasoning has been skewered by an ideologically diverse group of legal observers.

What happy circumstances, Trump being able to shuffle the investigation into his own actions to his first hand-picked attorney general (confirmation to recusal: February 8 to March 2, 2017), an acting FBI director (even if not an ally, the act itself is disruptive), a hand-picked acting attorney general, and a second hand-picked attorney general. Imagine police detectives are investigating a suspect but he’s their boss’ boss. That’s a rare advantage.

The nation held its breath with each change, and upon reflection it seems almost miraculous Mueller’s investigation concluded at all. Some may see this as a testament to the strength of the system, but it all could have easily gone the other way. There were no guarantees. What if Sessions hadn’t recused himself? What if he’d shut down the investigation? What if Comey, McCabe, or Rosenstein had been friendlier to Trump? What if Whitaker or Barr had blown the whole thing up? Yes, political battles, court battles, to continue the inquiry would have raged — but there are no guarantees they would have succeeded.

Tradition, political and public pressure…these mechanisms aren’t worthless, but they hardly seem as valuable as structural, legal changes that would save us from having to simply hope the pursuit of justice doesn’t collapse at the command of the accused or his or her political allies. We could strip the president of any and all power over the Justice Department workers investigating him or her, temporarily place attorneys general under congressional authority, and eradicate similar conflicts of interest.

The Department of Justice can keep its findings secret

Current affairs highlighted this problem as well. When Mueller submitted his finished report to Bill Barr, the attorney general was only legally required to submit a summary of Mueller’s findings to Congress. He did not need to provide the full report or full details to the House and Senate, much less to the public. He didn’t even need to release the summary to the public!

This is absurd, obviously setting up the possibility that a puppet attorney general might not tell the whole story in the summary to protect the president. Members of Mueller’s team are currently saying to the press that Barr’s four-page summary is too rosy, leaving out damaging information about Trump. The summary says Mueller found no collusion (at least, no illegal conspiring or coordinating), and that Barr, Rosenstein, and other department officials agreed there wasn’t enough evidence of obstruction of justice. But one shouldn’t be forced to give a Trump ally like Barr the benefit of the doubt; one should be able to see the evidence to determine if he faithfully expressed Mueller’s findings and hear detailed arguments as to how he and others reached a verdict on obstruction. Barr is promising a redacted version of the report will be available this month. He did not have to do this — we again simply had to hope Barr would give us more. Just as we must hope he can be pressured into giving Congress the full, unedited report. This must instead be required by law, and the public is at least owed a redacted version. Hope is unacceptable. It would also be wise to find a more independent, bipartisan or nonpartisan way to rule on obstruction if the special counsel declines to do so — perhaps done in a court of law, rather than a Trump lackey’s office.

The way of doing things now is simply a mess. What if an A.G. is untruthful in his summary? Or wants only Congress to see it, not the public? What if she declines to release a redacted version? What if the full report is never seen beyond the investigators and their Justice Department superiors, appointed supporters of the president being investigated? What if a ruling on obstruction is politically motivated?

We don’t know if the president can be subpoenaed to testify

While the Supreme Court has established that the president can be subpoenaed, or forced, to turn over materials (such as Nixon and his secret White House recordings), it hasn’t specifically ruled on whether the president must testify before Congress, a special counsel, or a grand jury if called to do so. While the president, like any other citizen, has Fifth Amendment rights (he can’t be “compelled in any criminal case to be a witness against himself,” risking self-incrimination), we do need to know if the executive can be called as a witness, and under what circumstances. Mueller chose not to subpoena Trump’s testimony because it would lead to a long legal battle. That’s what unanswered questions and constitutional crises produce.

We have yet to figure out if a sitting president can be indicted

If the executive commits a crime, can he or she be charged for it while in office? Can the president go to trial, be prosecuted, sentenced, imprisoned? We simply do not know. The Office of Legal Counsel at the Justice Department says no, but there is fierce debate over whether it’s constitutional or not, and the Supreme Court has never ruled on the matter.

There’s been much worry lately, due to Trump’s many legal perils, over this possible “constitutional crisis” arising, a crisis of our own design, since we have delayed creating laws for this sort of thing for centuries. For now, the trend is to follow Justice Department policy, rather helpful for a president who’s actually committed a felony. The president can avoid prosecution and punishment until leaving office, or even avoid it entirely if the statute of limitations runs out before the president’s term is over!

“Don’t fret, Congress can impeach a president who seems to have committed a crime. Out of office, a trial can commence.” That is of little comfort, given the high bar for impeachment. Bitter partisanship could easily prevent the impeachment of a president, no matter how obvious or vile the misdeeds. It’s not a sure thing.

The country needs to rule on this issue, at the least eliminating statutes of limitations for presidents, at most allowing criminal proceedings to occur while the president is in office.

We don’t know if a president can self-pardon

Trump, like the blustering authoritarian he is, declared he had the “absolute right” to pardon himself. But the U.S. has not figured this out either. It’s also a matter of intense debate, without constitutional clarity or judicial precedent. A sensible society might make it clear that the executive is not above the law — he or she cannot simply commit crimes with impunity, cannot self-pardon. Instead, we must wait for a crisis to force us to decide on this issue. And, it should be emphasized, the impeachment of a president who pardoned him- or herself would not be satisfactory. Crimes warrant consequences beyond “You don’t get to be president anymore.”

The Odd Language of the Left

Language fascinates me. This applies to the study of foreign languages and the pursuit of a proper, ideal form of one’s native language (such as the preservation of the Oxford comma to stave off chaos and confusion), but most importantly to how language is used for political and social issues — what words are chosen, what words are ethical (and in what contexts), how the definitions of words and concepts change over time, and so on.

These questions are important, because words matter. They can harm others, meaning they can be, at times, immoral to use. Individuals and groups using different definitions can impede meaningful conversation and the sharing of knowledge and perspectives, to such a degree that, in cases where no definition is really any more moral than another, long arguments over them probably aren’t worth it.

Despite incessant right-wing whining about political correctness, the Left is doing an important service in changing our cultural language. It’s driven by thinking about and caring for other people, seeking equality and inclusion in all things, which could theoretically be embraced by anyone, even those on the other side of the political spectrum who don’t agree with this or that liberal policy, or don’t understand or know people who are different. “Immigrants” is more humanizing than “aliens” or “illegals,” “Latinx” does away with the patriarchal, unnecessary male demarcation of groups containing both men and women (and invites in non-binary persons), and “the trans men” or simply “the men” is far more respectful than “the transgenders,” in the same way that there are much better ways of saying “the blacks.” There are of course more awful ways of talking about others, virulent hate speech and slurs; most people agree these things are unacceptable. As far as these less insidious word choices go, replacement is, in my view, right and understandable. Why not? Kind people seek ways to show more kindness, despite tradition.

What I find curious is when the Left begins questioning the “existence” of certain concepts. Finding better phrasing or definitions is often important and noble, but for years I’ve found the claims that such-and-such “does not exist” to be somewhat strange.

Take, for instance, “The friendzone does not exist.” This is the title of articles on Buzzfeed, Thought Catalog, and so forth, which the reader should check out to fully appreciate the perspective (of those rather unlike this writer, an admittedly privileged and largely unaffected person). It’s easy to see why one would want to wipe friendzone off the face of the Earth, as it’s often uttered by petulant straight men whining and enraged over being turned down. The rage, as noted in the articles, is the mark of entitled men feeling they are owed something (attention, a date, sex), wanting to make women feel guilty, believing they are victims, and other aspects of toxic masculinity. Such attitudes and anger lead to everything from the most sickening diatribes to the rape and murder of women. It’s a big part of why the feminist movement is important today.

Yet friendzone is a term used by others as well — it’s surely mostly used by men, but it’s impossible to know for certain if it’s disproportionately used by men of the toxic sort. If you’ll pardon anecdotal evidence, we’ve probably all heard it used by harmless people with some frequency. We’d need some serious research to find out. In any case, many human beings will at some point have someone say to them: “I don’t feel that way about you, let’s just be friends.” A silly term at some point arose (perhaps in Friends, “The One With the Blackout,” 1994) to describe the experience of rejection. What does it mean, then, to say “The friendzone does not exist”? It’s basically to say an experience doesn’t exist. That experience can be handled very differently, from “OK, I understand” to homicide, but it’s a happenstance that most people go through, so some kind of word for it was probably inevitable. If it wasn’t friendzone it likely would have been something else, and one suspects that if we eradicate this particular term a new one might eventually pop up in its place (justfriended?). It’s all a bit like saying “Cloud Nine does not exist” or “Cuffing season does not exist.” Well, those are expressions that describe real-world experiences. As long as a human experience persists, so will the concept and some kind of label or idiom, often more than one.

The relevant question is whether the use of the term friendzone encourages and perpetuates toxic masculinity. Is it contributing to male rage? Does one more term for rejection, alongside many others (shot down, for instance), have that power? Or is it a harmless expression, at times wielded by awful men like anyone else? That’s a difficult question to answer. (The only earnest way would be through scientific study, the basis of many left-wing views.) While I could be wrong, I lean towards the latter. I don’t suppose it’s any more harmful or unkind than shot down and so forth, and see such terms as inevitable, meaning what’s really important is changing the reactions to certain life events. My guess is the word is experiencing a bit of guilt by association — terrible men use it while expressing their childish sentiments about how they deserve this or that, about how women somehow hate nice guys, and so on, and thus the term takes on an ugly connotation to some people. Other terms are used by them less and don’t have that connotation. Readers will disagree on how strong the connotation is, and how harmful the term is, but the main point was simply to ponder how a word for a common experience should be said to “not exist” — it’s hard to discern whether such phrasing intrudes more on one’s knowledge of reality or English. Perhaps both equally. It’s rather different than saying, “This word is problematic, here’s a better one.” I could be misinterpreting all this, and every instance of denying existence is supposed to mean the word simply shouldn’t be used, leaving space for other, better ways to describe the concept, but that just goes back to interest in how language is used in social issues — why say one but not the other, clearer option? Anyway, read the articles and you’ll likely agree the very existence of concepts is being questioned.

Finally, it’s interesting to consider why the Left ended up saying X doesn’t exist rather than, say, X is real and your toxic ass had better get used to it. What if, like words of the past, it had been adopted by those it was used against to strip it of its power and turn the tables? What causes that to happen to some words but not others? Is it because this one describes an event, not a person? Another intriguing question about language.

Similarly, does virginity exist? Not according to some (The Odyssey, Her Campus). Again, the sentiment is understandable. Women’s worth has long been closely tied to virginity (read your bible), and with that came widespread oppressive efforts to keep women’s bodies under tight control, still manifested today in incessant shaming for engaging in sex as freely as men do, murder, and more. Men have experienced something related, though far less oppressive and in an opposite sense: women are more valuable as virgins (or with fewer overall partners) and are judged for being sexually active, while men are shamed or ridiculed for being virgins or not engaging in sex. Further, the definition of virginity is open to debate (the definition of friendzone is as well, though the most common one was used above). Is a straight person a virgin if he or she has only had anal sex? Is a gay person, who has regular sex with a partner, technically a virgin until death? Because the word’s meaning is subjective, and because it was a basis of patriarchal oppression, so the argument goes, “virginity doesn’t exist.”

Virginity is a way of saying one hasn’t had some form of sexual experience. For some it’s vaginal penetration, for others it’s different — the particular act doesn’t really matter. It’s simply “I haven’t had sex yet,” whatever form sex may take in the individual mind. Everyone has their own view of it, but that doesn’t make it unreal — in the same way everyone has their own idea of what love is, and yet love exists. Having sex for the first time is quite an event in any human being’s life, and most or many will experience it. Even if our history had been free of misogyny and patriarchy, there likely would have eventually arisen some term for having never experienced sex (or having been turned down). Does the statement “Virginity doesn’t exist” make sense? As with friendzone, it’s a labeled common experience, or lack thereof. While it was and is wielded by misogynistic oppressors, it’s an occurrence, and a concept, that certainly “exists.”

Does having a term for all this harm society and hurt others, helping preserve the hysteria over who’s had intercourse, and the associated maltreatment? Again, it’s possible. But my point is that a term is unavoidable. The state of being is real, thus the concept is real, thus a word or phrase will inevitably be employed. Being “single” happens — does “singleness” not exist? Won’t there always be some way to describe that state? We could get rid of the words virgin and virginity, but there’s no getting rid of “I’ve had sex” versus “I haven’t.” Another phrase or term will suffice just as well to describe the concept. We can abolish friendzone, but “The person I like turned me down” isn’t going away. There may be better words and definitions for concepts, but there’s often no case against a concept’s reality, which is how all this is framed. What’s important is to try to change the perceptions and attitudes toward these concepts, not deny they exist. “Yes, you were put in the friendzone, but you’ve done that to a lot of women you weren’t interested in. That’s life, you’ll live, grow up.” “So what if she’s not a virgin? Should your dateability or worth go down if you weren’t one? Why hers and not yours?” And so on. Indeed, it seems more difficult to change attitudes towards life events when you start off by saying, in essence, and confusingly, that an expression isn’t real.

There are other examples of assertions I find awkward, but as this article is lengthy already I will just briefly mention a couple of them and hope the reader assumes I’ve given them more thought than a few sentences would suggest. “There’s no such thing as race, it’s a social construct,” while doing a service by reminding us we are all part of the same human family, has always seemed mostly pointless in a reality where individuals biologically have different shades of skin and hair texture, and many are brutally victimized because of it. “No human being is illegal” puts forward an ideal, which I support: amnesty, a faster legal entrance policy, and so on (I also support the dissolution of all borders worldwide and the establishment of one human nation, but that may not be implied here). It’s also a call to describe people in a more respectful way, i.e. “undocumented” rather than “illegal.” Still, it always seemed a little off. Some human beings are here illegally, and our task is to change the law to make that history. That the State designates some human beings as illegal is the whole problem, the entire point. True, it’s an ideal, an inspirational call. But I always thought replacing “is” with “should be” or something would be more to the point. But enough splitting hairs.

Someone Worse Than Trump Is Coming. Much of the Right Will Vote for Him Too.

Donald Trump is a nightmare — an immoral, vile, ignorant human being.

It is impossible to fully document his awfulness with brevity. Even when summarizing the worst things Trump has said and done it is difficult to know where to stop.

He calls women “dogs” — they are “animals,” “big, fat pigs,” “ugly,” and “disgusting” if they cross him or don’t please his gaze. You have to “treat ’em like shit,” they’re “horsefaces.” He makes inappropriate sexual jokes and remarks about his own daughter, about “grabbing” women “by the pussy” and kissing them without “waiting,” and admits to barging into pageant dressing rooms full of teenage girls with “no clothes” on. He mocks people with disabilities, Asians with imperfect English (including, probably, “the Japs”), and prisoners of war. Trump was sued for not renting to blacks, took it upon himself to buy full-page ads in New York papers calling for the restoration of the death penalty so we could kill black teens who allegedly raped a white woman (they were later declared innocent), and was a leader of the ludicrous “birther” movement that sought to prove Obama was an African national. He is reluctant to criticize Klansmen and neo-Nazis, and retweets racist misinformation without apology. He’s fine with protesters being “roughed up,” nostalgic about the good old days when they’d be “carried out on a stretcher,” even saying about one: “I’d like to punch him in the face.” He likewise makes light of physical attacks on journalists. He praises dictators. He threatens to violate the Constitution as a political strategy. He cheats on his wife with porn stars and pays them to keep quiet. His constant bragging about his high I.Q. (his “very, very large brain”) and his big fortune, among other things, is emblematic of his ugly narcissism. His daily rate of lies and inaccuracies is surely historic, with journeys into fantasyland over crowd sizes and wiretaps by former presidents.

And those are merely the uncontroversial facts. Trump faces nearly two dozen accusations of sexual assault. He is alleged to at times say extremely racist things, remarks about “lazy,” thieving “niggers.” His ex-wife claimed in 1990 that he sometimes read Hitler’s speeches, and Vanity Fair reported Trump confirmed this. The payment to Stormy Daniels was likely a violation of campaign finance laws — Trump’s former attorney implicated him in court. Trump is being sued for using the presidency to earn income, and his nonprofit foundation is being sued for illegal use of funds. Trump has almost certainly engaged in tax fraud, joined in his staff and own son’s collusion with Russia during the 2016 election, and obstructed justice.

All this of course speaks more to his abysmal personality and character than his political beliefs or actions as executive. That’s its own conversation, and it’s an important one because some conservatives accept Trump is not a good person but think his policies are just wonderful.

On the one hand, many of Trump’s policies are as awful as he is, and will not be judged kindly by history. Launching idiotic trade wars where he slaps a nation with tariffs and is immediately slapped with tariffs in return, hurting U.S. workers. Stoking nativist fear and stereotypes about Hispanic immigrants and Muslims, driving the enactment of (1) a ban on immigrants from several predominantly Muslim nations (doing away with vetting entirely, keeping good people, many fleeing oppression, war, and starvation, out with the bad) and limits to refugees and immigrants in general, and (2) the attempted destruction of DACA (breaking a promise the nation made to youths brought here illegally) and a “zero tolerance” policy on illegal entry that sharply increased family separations. Saying foreigners at the border who throw rocks at the military should be shot. Pushing to ensure employers are allowed to fire people for being gay or trans (and refuse them service as customers), eliminating anti-discrimination protections for trans students in public schools, and attempting to bar trans persons from military service. Voting against a U.N. resolution condemning the execution of gays.

On the other hand, we can be grateful that, to quote American intellectual Noam Chomsky, “Trump’s only ideology is ‘me.'” Trump is thoroughly defined by self-absorption. He flip-flops frequently — reportedly most influenced by the last person he speaks to — and even used to call himself more of a Democrat, advocating for a few liberal social policies while remaining conservative on business matters. He either changed his mind over time or, as I wrote elsewhere, believed running as a Republican offered the best chance at victory and thus adopted an extreme right-wing persona — an idea that doesn’t at all mean he isn’t also an awful person (rather, it’s evidence of the fact). Outside of policies that serve him personally it is difficult to know what Trump believes in — or if he even cares. He may genuinely lack empathy and have no interest in policies that don’t affect him. True, perhaps he isn’t merely playing to his base and actually has a vision for the country, but the “ideology of me” does appear preeminent. While it’s “deeply authoritarian and very dangerous,” as Chomsky says, it “isn’t Hitler or Mussolini.” And for this we can count ourselves somewhat fortunate. (Likewise, that Trump isn’t the brightest bulb in the box, speaking at a fourth-grade level, reportedly not reading that well and possessing a short attention span, lacking political knowledge, and being labeled a childish idiot by his allies.)

Next time we may not be so lucky. As hard or painful as it is to imagine, someone worse will likely come along soon enough.

One day Trump will leave the White House, and with a profound sense of relief we will hear someone declare: “Our long national nightmare is over.” That’s what Gerald Ford said to the country the day he took over from Nixon — a man corrupt, deceitful, paranoid, wrathful, and in many ways wicked (he is on audiotape saying “Great. Oh, that’s so wonderful. That’s good” when told his aides hired goons to break protesters’ legs). One wonders how many people in 1974 imagined that someone like Trump would come along just eight presidencies later. If there was a lack of imagination we shouldn’t repeat it.

In significant ways, the next nightmare is already foreshadowed. Trump opened a door. His success was inspiration for America’s worst monsters. They have seen what’s possible — and will only be more encouraged if Trump is reelected or goes unpunished for wrongdoing and nastiness. I wrote before the election:

When neo-Nazi leaders start calling your chosen candidate “glorious leader,” an “ultimate savior” who will “Make American White Again” and represents “a real opportunity for people like white nationalists,” it may be time to rethink the Trump phenomenon. When former KKK leader David Duke says he supports Trump “100 percent” and that people who voted for Trump will “of course” also vote for Duke to help in “preserving this country and the heritage of this country,” it is probably time to be honest about the characteristics and fears of many of the people willing to vote for Trump. As Mother Jones documents, white nationalist author Kevin McDonald called Trump’s movement a “revolution to restore White America,” the anti-Semitic Occidental Observer said Trump is “saying what White Americans have been actually thinking for a very long time,” and white nationalist writer Jared Taylor said Trump is “talking about policies that would slow the dispossession of whites. That is something that is very important to me and to all racially conscious white people.” Rachel Pendergraft, a KKK organizer, said, “The success of the Trump campaign just proves that our views resonate with millions. They may not be ready for the Ku Klux Klan yet, but as anti-white hatred escalates, they will.” She said Trump’s campaign has increased party membership. Other endorsements from the most influential white supremacists are not difficult to find.

It wasn’t all talk. Extreme racists got to work.

  • In 2016, David Duke of KKK fame, who was once elected to the Louisiana state house, came in seventh out of 24 candidates in a run-off election for U.S. Senate. He earned 3% of the vote; about 59,000 ballots were cast for him.
  • In August 2018, Paul Nehlen, an openly “pro-White” candidate too racist for most social media platforms, garnered 11% of the vote in the GOP primary for Wisconsin’s 1st District (U.S. House of Representatives). He lost, but beat three other candidates.
  • John Fitzgerald, a vicious anti-Semite who ran for U.S. House of Representatives, beat a Democratic and independent candidate in California District 11’s open primary, coming in second with 23% of the vote. 36,000 people chose him. On November 6 he lost with 28% of the vote (43,000 votes).
  • A Nazi named Arthur Jones was the Republican nominee for U.S. House of Representatives from Illinois’ 3rd District (though he was the only person who ran as a Republican candidate, becoming the nominee by default). He just got 26% of the vote — 56,000 supporters.
  • Seth Grossman, who believes black people to be inferior, was the GOP nominee for U.S. House of Representatives from New Jersey’s 2nd District. He beat three other rivals, with 39% of the vote. He just garnered 46% of the vote in the general election. That’s 110,000 voters, just 15,000 short of the victor.
  • Russell Walker, who espouses the superiority of the white race, ran for District 48 in the North Carolina state house. He won the GOP primary in May, beating his rival with 65% of the vote. On November 6 he earned 37% of the vote in his race.
  • Steve West spreads conspiracy theories about the Jews, even saying “Hitler was right” about their influence in Germany. He won nearly 50% of the vote in the GOP primary for Missouri state house District 15, beating three others. On November 6 he also received 37% of the vote against his Democratic opponent.
  • Steve King has served in the U.S. House of Representatives since 2003. Hailing from Iowa’s 4th District, he said whites contributed more to civilization than people of color and constantly bemoans the threat that changing demographics represents to our culture. He also endorses white nationalists because they are “Pro Western Civilization” and spends time with groups founded and led by Nazis. He won 75% of the vote in the GOP primary — 28,000 votes. Then he got 50% in the general election (157,000 votes), keeping his seat.

There were others, of course, more subtle in their bigotry — more like Trump. Overall, there was a “record breaking” number of white supremacist candidates running for office this year. In most of the cases above, America couldn’t even keep such candidates in the single digits. Many beat more normal, tolerant candidates.

Those numbers may not seem all that impressive, not high enough to warrant any fears over a more horrific candidate winning the GOP presidential nomination. But it does not always take much. Turnout for the primaries is so low that only 9% of Americans chose Trump and Hillary as party nominees. More voted for others, but that’s all it took. Trump won the nomination with 13 million votes, with 16 million Republican voters choosing someone else (both record numbers). He thus won 45% of the primary votes, which is about what Mitt Romney (52%) and John McCain (47%) accomplished. In other words, it would take less than half of Republican voters in the primaries to usher a more extreme racist (or sexist or criminal or what have you) to the Republican nomination. After seeing what many conservative voters could ignore or zealously embrace about Trump, this does not seem so impossible these days. Many Trump supporters, in a tidal wave of surveys and studies, were shown to have extremely bigoted and absurd views. From there, it isn’t that hard to envision a situation like the one many conservatives faced in 2016, where they voted for an awful person they disliked to continue advancing conservative policies and principles. You have to stop abortion and the gays, you have to pack the Supreme Court, and so on. Some, to their immense credit, refused to do this — not voting, voting third party, or even voting for Clinton. But of course they were a minority. (And no, if you also believe absurd things, Democrats and liberals did not swing the election for Trump.)

The day of the election I felt more confident of Clinton’s victory than I had a couple weeks before. Previously, I had predicted that Trump was “probably” going to win. Perhaps it was a foolish optimism that washed over me on election day, when I expressed that Clinton would somehow eke out a narrow victory. I — and everyone else — should have known better. The tendency of the two parties to trade the White House every eight years, Clinton’s unpopularity on the Left, Trump as a reaction to the country’s first black president, the threat of the Electoral College handing the White House to another Republican with fewer votes…all sorts of factors should have made this an easy election to predict. Perhaps many of us simply did not want to face reality, did not want to believe we lived in a country where someone so awful could win, where so many voters are just like him or simply don’t care enough about his awfulness to refuse to vote for him. But after the shock and horror at Trump’s triumph abated, I could not shake the dread that this was merely the opening salvo in a battle against increasingly dangerous, extremist candidates.

Let’s hope, whether he — and it will certainly be a straight white male, given the extremist base — comes along in mere years or many decades, that we will not make the same mistake. Whether he will win is of course impossible to say. It will depend on how passionately we protest, how obsessively we organize, how voluminously we vote.

But Abortion!

There exists a particularly obnoxious set of visuals and memes produced by both conservative and less sophisticated liberal social media pages (looking at you, Occupy Democrats). They have to do with hypocrisy, and often revolve around abortion.

An example from the Left reads: “Only in America can you be pro-death penalty, pro-war, pro-nuclear weapons, pro-guns, pro-torture, anti-health care, and anti-food stamps and still call yourself ‘pro-life.'”

One from the Right goes: “Oh I get it now… The death penalty is bad, abortion is good.”

The implication or accusation of hypocrisy appears in conversation as well. Often when I post or write something critical of some horrible thing, it’s only a matter of time before a conservative friend or acquaintance drops by with the tired “Yet you support abortion rights, what a hypocrite.” There is a good chance that if you’re reading this right now, it is because you just said something along those lines, as my quest to one day be able to reply in article form to any political comment or question, saving vast amounts of time, continues.

The problem with such accusations of hypocrisy is that they are so easily reversed. Well, well, well, you’re pro-life yet not a pacifist — what we’ve got here is a hypocrite! Why, you’re a pacifist yet somehow pro-choice — at least be morally consistent! 

Typically, when someone comes along guns blazing in this fashion, they’re employing the whataboutism fallacy. The tactic distracts from, or even discredits, whatever was originally posited by accusing the speaker of hypocrisy. So perhaps I post about how I think we shouldn’t conduct drone bombings in the Middle East and Africa because they kill far more innocent civilians than actual targets. When the inevitable “but abortion!” comes, there is usually no agreement concerning the immorality of the original issue addressed. Sometimes there is, but usually the individual only provides it later (when pressed), after the implied or explicit accusation of hypocrisy. The individual isn’t much interested in discussing whether the original issue is or isn’t moral. He or she wants to discuss abortion and make sure you know you’re two-faced. In turn, I try to keep things on-topic (and celebrate agreements where we find them), a debate preference that seems to annoy some people to no end. I often say that each issue, each moral question, needs to be weighed on its own merits. People don’t often grasp right away that this belief is connected to whether or not someone is actually a hypocrite, and I don’t explain it because that would further derail the conversation away from whatever the original topic was. As a remedy, I’ll briefly explain my thoughts here.

Say you’re a conservative and you’ve posted about how killing babies in the womb is wrong. Here I come with “But you support our War on Terror, which kills countless pregnant women and other innocent human beings. Hypocritical much?” If you’re like me, you’d be somewhat annoyed at this distraction from the cause you were trying to advocate for, or perhaps you’re unlike me and don’t mind taking whatever detour someone wants to go on. Regardless, you likely think and believe something along the lines of: These things are not the same. They’re a bit different, they have slightly different contexts — even if they both result in similar tragedies. You’re probably counting the ways in which they’re distinct or shouldn’t be compared right now.

In thinking so, you are essentially acknowledging that each moral question should be weighed on its own merits. Unless you actually think you’re a hypocrite, you believe these are slightly different situations and therefore different stances concerning them may be morally justified.

And you would of course be correct. These situations — torture, war, the death penalty, abortion, homicide, unregulated gun ownership, free market healthcare, and on and on — are unique, and have very different questions you have to answer before you can make a decision on whether they’re ethical. You have to work through unique factors.

Many of the most deeply conservative and fervently religious people believe abortion is never morally permissible under any circumstance, while others (conservatives and liberals, religious persons and nonreligious persons, etc.) believe there are some or many instances where it is. The purpose of this article isn’t to argue one way or the other, which I have done elsewhere. No matter what you think about abortion, I hope to simply demonstrate that people across the political spectrum are a tad too quick to use the h-word. So what are some standard questions about abortion that make folks think differently?

  • Was the pregnancy the result of rape?
  • Does birth endanger the life of the mother?
  • Should the government force you to give birth against your will?
  • Is abortion less moral as the pregnancy goes on? Does the age of the fetus matter?
  • Does the fact that women seek unsafe black market abortions, resulting in health complications or death, in societies where abortion is illegal change the moral equation at all?

Those are important questions to think about and answer when deciding whether or when abortion is morally permissible, and each person will answer differently.

But the relevant question here is: Do we also need to ask those questions when we ponder the morality of war?

Not really. Those questions aren’t going to be very helpful when deciding whether massacring civilians while dropping bombs to kill terrorist suspects overseas is the right thing to do. The questions concerning war won’t sound like the questions concerning abortion, and vice versa. Each issue, each situation, has its own array of unique questions to consider. They’re truly dissimilar contexts. This is why accusations of hypocrisy like we saw above don’t make a lot of sense.

In fact, such accusations of hypocrisy are so easily reversed because they don’t really have much to do with hypocrisy at all. It’s a bit like saying it’s hypocritical to think killing someone in cold blood is wrong but killing someone in self-defense is not. It’s the same result, right? In either case someone is killed. You hypocrite! Well, no, these are different circumstances with different moral questions and answers. Real hypocrisy has more to do with situations that are essentially the same. If I curse like a sailor but lambaste others for cursing, that’s hypocrisy. If you think women should be forced to give birth regardless of circumstance but wouldn’t think the same for men if they could get pregnant, that’s hypocrisy. If you’re Mitch McConnell, that’s hypocrisy. And so on. It has to do with holding yourself to different standards than you hold others in the same situation, which is pretty disingenuous (the word actually derives from the Greek word ὑπόκρισις [hypókrisis], meaning play-acting or deceit). But in different situations you have unique things to figure out and may therefore end up with different moral answers. Even a close analog to abortion, infanticide (more universally opposed, yet not without exception, as with the infant in constant agony from an incurable illness), has a difference people have to mull over, namely that the baby has not yet been born. One can think both are wrong, that the difference is insignificant, but the fact remains it is a literal difference — the situations aren’t identical. They’re much closer than other comparisons, true, but there is a difference that is more significant to some than others. That’s my point. So you have to ask different questions and decide for yourself if different scenarios have the same moral conclusions; they may, but when they do not it isn’t necessarily hypocrisy, simply because the scenarios were not indistinguishable.

(This isn’t the only context in which “hypocrisy” isn’t really used correctly. I once thought of writing an article entitled No One Knows What Hypocrisy Means after I was called a hypocrite for frequently criticizing white attacks against innocent people of color but rarely — though not never — doing the same for the reverse. But one is an exponentially bigger societal problem than the other. I didn’t posit that one is the wrong thing to do and the other the right thing to do; it simply makes sense to focus most of our attention and energies on more prevalent problems.)

The conservative can say to the liberal, “You’re a hypocrite for being a pacifist yet pro-choice,” but why bother? The liberal can simply respond, “And you’re a hypocrite for being pro-life yet pro-war.” Stalemate. Are we all hypocrites then? I would posit, instead, that none of us are. I personally don’t believe a conservative who is pro-life yet pro-war is a hypocrite (if I did, we know what that would be an example of). This is because I know these issues are not the same, that the conservative has different reasoning for and answers to unique moral questions that could result in divergent conclusions between scenarios. I may not agree with that reasoning or those answers one iota, but I understand them and how they may not lead to the same place.

Your White Ancestors May Have Immigrated Illegally, Too

It is undeniable that the United States has a long history of extreme racism regarding citizenship. The Naturalization Act of 1790, passed just three years after a Constitution that spoke of “Justice” and “Liberty,” bluntly declared that only a “free white person” could become an American citizen. This remained unchanged for nearly eight decades, until the 14th Amendment, ratified in 1868 after the Civil War, determined anyone born in the U.S. was a citizen. This was immediately contradicted by the Naturalization Act of 1870, which declared the only non-whites this change applied to were blacks; the 1898 Supreme Court case of United States v. Wong Kim Ark finally secured citizenship for all people born here.

As for those already born who desired citizenship, the struggle continued. Women became truer citizens when they won the right to vote in 1920, unless they married an Asian non-citizen; then their citizenship could be revoked! Native Americans — whose ancestors had been here before anyone — had to wait until 1924 to be eligible for citizenship, Filipinos and people from India until 1946. Throughout the 1950s and 1960s, social movements then battled to make what had been promised by law a reality for men and women of color, whether native-born or immigrants.

Given white supremacy’s zealous protection of citizenship, it may seem surprising that there were no laws against immigration itself until 1875, when prostitutes and convicts were barred from entry. (But then, perhaps not so surprising, as most immigrants were from Europe — this despite hostilities towards the Irish, Catholics, Jews, and southern and eastern Europeans. All immigrants represented cheap labor, too.) Before that, immigration was reported but not regulated. Anyone could simply show up and try to scratch out a life for him- or herself. The unspoken message: you can come, but don’t expect citizenship, don’t expect any power or participation in this democracy.

Millions came before the first racist immigration restriction was created: the Chinese Exclusion Act of 1882, banning almost all immigration from China. Many American whites were openly bigoted, but also spoke of economics — Chinese workers hurting their wages and taking their jobs. Other Asians were banned as well, as were people deemed idiots and lunatics. So illegal immigration was not even possible until the late 19th century; before then, there were simply no immigration laws to break.

Racist laws continued, of course. In 1921, temporary caps were placed on the number of immigrants allowed into the U.S. from other countries; these were made permanent in the Immigration Act of 1924. This was particularly an effort to stem the post-Great War flood of southern and eastern European immigrants, especially Italians, who were coming by the hundreds of thousands. Complaints against them, says historian Mae Ngai of Columbia University, “sounded much like the ones that you hear today: ‘They don’t speak English. They don’t assimilate. They’re darker. They’re criminals. They have diseases.’”

Immigrants from northern and western European nations were favored, including the recent enemy, Germany, which was allowed the most immigrants. (Later, Nazi Germany would justify some of its own racist legislation using American law, which was widely considered the harshest immigration policy in the world; see Hitler’s American Model, Whitman.) In 1929, only 11.2% of yearly immigrants could come from Italy, Greece, Poland, Spain, Russia, and surrounding nations. Only 2.3% could come from outside Europe and the Americas (the Americas were exempt, with no quotas at all).

Immigration quotas by national origin. via George Mason University

This quota system persisted until the civilizing effects of the civil rights era reformed immigration law in 1965 and opened up the U.S. to more non-European immigrants (though quotas were then imposed on countries in the Americas).

Today, U.S. permanent immigration from other nations is capped at 675,000 people per year, except for people with close family in the U.S. — the number of permanent visas for that category is unlimited. In 2016, 618,000 permanent resident visas were issued; five million more applicants wait. No country can receive more than 7% of our visas. Add to this the temporary visas that are successfully converted into permanent ones and around one million people, most from Mexico, China, and other American and Asian nations, achieve permanent residency status here each year. Europeans make up a small minority of immigrants to the U.S.

In today’s debate over illegal immigration and citizenship (solved here), the white conservative trope that Central and South Americans should “do it right, do it legally like my ancestors did” is played on repeat. One has to question, however, whether such confidence is justified. During this period of tight restrictions on European immigrants there were indeed many illegal immigrants from Europe. How certain are you, exactly, that you are not a descendant?

To dodge the quota system, European immigrants would journey to Canada, Mexico, or Cuba and cross the border into the United States. Or they would simply pull ashore. The American Immigration Council documents:

In 1925, the Immigration Service reported 1.4 million immigrants living in the country illegally. A June 17, 1923, New York Times article reported that W. H. Husband, Commissioner General of Immigration, had been trying for two years “to stem the flow of immigrants from central and southern Europe, Africa and Asia that has been leaking across the borders of Mexico and Canada and through the ports of the east and west coasts.” A September 16, 1927, New York Times article describes government plans for stepped-up Coast Guard patrols because thousands of Chinese, Japanese, Greeks, Russians, and Italians were landing in Cuba and then hiring smugglers to take them to the United States.

The 1925 report regretted that the undocumented person’s “first act upon reaching our shores was to break our laws by entering in a clandestine manner.” The problem was so bad that Congress was forced to act:

The 1929 Registry Act allowed “honest law-abiding alien[s] who may be in the country under some merely technical irregularity” to register as permanent residents for a fee of $20 if they could prove they had lived in the country since 1921 and were of “good moral character.”

Roughly 115,000 immigrants registered between 1930 and 1940—80% were European or Canadian. Between 1925 and 1965, 200,000 unauthorized Europeans legalized their status through the Registry Act, through “pre-examination”—a process that allowed them to leave the United States voluntarily and re-enter legally with a visa (a “touch-back” program), or through discretionary rules that allowed immigration officials to suspend deportations in “meritorious” cases. In the 1940s and 1950s, several thousand deportations a year were suspended; approximately 73% of those who benefited were Europeans (mostly Germans and Italians).

The 1929 Registry Act, Steve Boisson writes for American History Magazine, was “a version of amnesty…utilized mostly by European or Canadian immigrants.” Much kinder treatment than mass deportations and separating children from parents, to be sure.

One woman who took advantage of the program, according to The Los Angeles Times, was Rosaria Baldizzi, who snuck in after leaving Italy.

Baldizzi would not become “legal” until a special immigration provision was enacted to offer amnesty to mainly European immigrants who arrived without proper documentation after 1921, who had established families, and who had already lived in the U.S. for seven years. She applied for legal status under the new policy and earned her citizenship three years later, in 1948. Only then, for the first time in more than two decades, could she stop worrying about her immigration status.

If you trace your family history you may be surprised by what you find. According to the Philadelphia Inquirer, Stanford professor Richard White, after researching his family tree,

discovered that his maternal grandfather, an Irishman, had entered the U.S. illegally from Canada in 1924 because he could not get a visa that year under the new quota laws. His grandfather failed in his first attempt, when he walked across a bridge into Detroit, got caught by U.S. customs officers, and was deported.

From Canada, the grandfather called his brother-in-law, a Chicago policeman, who came to Canada and met him there… The pair then walked to Detroit, but this time the brother-in-law, who was dressed in his police uniform, flashed his badge at the customs officers, who waved the duo through.

Even today there are white undocumented immigrants in the United States: an estimated 440,000 to 500,000 from Europe, including some 50,000 Irish.

The next time someone declares his or her ancestors came here legally, demand proof at once.

Let Them Flirt

Whether we have a Republican or Democratic president, diplomacy and open dialogue are key to peace with other countries. Given that, Trump is doing the right thing by talking and meeting with North Korea. It’s not a groundbreaking idea, as Obama also expressed willingness to meet with Kim and engaged in diplomacy with Iran that culminated in an important anti-nuclear accord (two things that conservatives who are now just in awe of Trump absolutely lost their shit over at the time; for some reason totalitarian enemies can now be trusted to keep their word, inspections now work, and so forth).

I wish with every atom of my being that it wasn’t Trump in negotiations with Kim, of course. Like, driving someone who’s dying to the hospital is the right thing to do, but do you really want the cat behind the wheel? I guess if Petals is all you’ve got… I’d prefer it be a president with actual political/international diplomatic experience, deep knowledge of North Korea and its regime, better attention capabilities and comprehension skills, fewer authoritarian mannerisms and ideas, and better moral character. I’d also like a president who talked more about negotiating to make North Korea’s horrific, Holocaust-like labor camps, where even family members of people who complain about the regime are starved and worked until death, a thing of the past. Kim doesn’t exactly “love his people,” as Trump says. This issue is just as urgent as ending a nuclear program. Reports suggest Trump didn’t bring up human rights abuses.

I will say, however, that I am pleasantly surprised with what Vox described as a “shockingly weak” concession from the supposed tough guy: Trump said U.S.-South Korean military exercises would cease. Such exercises have always been stupid, near-suicidal acts of aggression on our part. People just don’t realize how close the U.S. has come to nuclear catastrophe, accidental or intentional, over shit like that since the beginning of the nuclear Cold War; it really–and obviously–escalates things when you want to de-escalate things. So that, if it actually occurs, would be good. We could use less “toughness” in that and other regards. It’s also a good thing North Korea has publicly recommitted itself to doing away with its nukes (the U.S. should of course do the same), as unlikely as that is (they are its only deterrent to a U.S. invasion), and that Trump spoke of U.S. troops one day leaving South Korea. We just have to hope for the best with these talks; we want these awful, volatile men friendly. The main point is I’d rather have Trump and Kim frolicking arm-in-arm down the streets of Pyongyang than threatening each other with nuclear destruction. The world is a safer place under those circumstances.

My Disillusionment With Social Justice Organizing in Kansas City

While it originated in a rather different context, Elvis’ line “A little less conversation, a little more action, please” dances through my head when I reflect on the state of social justice organizing in Kansas City. The following thoughts come from observing, co-founding, and being employed by social justice groups here over the past few years. They represent my biggest concerns. As I will emphasize at the end, these problems don’t apply to all organizations, nor are they always present to the same degree.

First, many social justice groups focus heavily on events and gatherings where people simply sit around and talk. For some groups, this is literally all they do — either someone talking at the attendees, participants speaking with each other, or some combination of both. The primary purpose is education, raising awareness, whether concerning ideology, a social issue, or an organization’s affairs.

Now, this has value. Education, discussion, and perspective-taking are important. But I somewhat question how much value (especially compared to other tactics; see the next section). The people who come to monthly meetings, community forums, panels, and so on are mostly going to be people who already care about whatever issue or ideology is being discussed, and thus already know something about it. It’s true, no one is ever done learning or listening; and it is further true that there will always be a few newcomers who don’t know anything about racism or socialism or what it means to have no healthcare. But most people who attend probably know a great deal about these things, through personal experience or study or earlier thought and discussion. One gets that impression by observation, at any rate. That’s why I suspect there are real limits to the value of these kinds of events, given the prior interest, knowledge, and worldview of most of the audience. That is not to say they should never be held! It’s simply to question why they should be the majority or totality of a group’s efforts.

Things worsen when these events grow repetitive. There are some organizations’ events I pop into every once in a while, only to confirm they’re basically the same thing every time. And having been on the planning side of things, I understand why, or at least one of the reasons why: you’re always thinking of the few newcomers. If you dive too deep into a topic, newcomers will get utterly lost, or at least you fear they will. So you end up sticking with the basics, and boring anyone who knows a bit about the issue.

Therefore, it’s easy to simply stop going to the gatherings of groups whose ideals you earnestly support. You may enjoy conversing with your friends and fellows, and hearing the perspectives of others, but in the end you may not feel you’re learning all that much, things may get repetitive and boring, and it dawns on you that while all this isn’t without value it’s not bringing about social change as speedily as other possibilities. Is sitting and talking really the best use of our time, energy, and money? All this is my experience, anyway. (I recently quit my job over this very issue; it gnawed at me for months, and finally one day I stood up at a conference of social justice groups in D.C., told everyone this was a waste of money and time that could have been better used, and walked out.)

There has to be something beyond sitting and talking. You have to give people who care about these issues something to do. But too often that isn’t coming; organizers and attendees pat themselves on the back as if they’ve accomplished something (I sense that white people at conversations on race especially feel like they’ve done something useful, alleviating their white guilt but not really bettering society much), then everyone starts preparing for the next monthly meeting.

Most importantly, the majority of what many organizations do does not confront power. Resources, time, and human energy poured into sitting and talking aren’t being poured into activities and tactics that put pressure on decision-makers, which does more good for society. Educating yourself and others is just Step One; it is just the first tool in the toolbox of social change. Then you actually get to work. Get out the vote for policies and candidates (if your organization legally can). Put your own initiatives on ballots. Harass the powerful in business and politics with petitions, messages, and calls. Boycott businesses. Protest and march outside workplaces and representatives’ offices. Go on strike, refusing to return to work until your demands are met. Engage in acts of civil disobedience: sit in and occupy your workplace or a political chamber, block streets as the powerful try to head to work, chain yourself to trees, and other illegal acts, facing down the risk of arrest or violence by police or bystanders. And you keep doing these things until you win. That’s how social movements succeed.

We need to shift from education to agitation. Imagine if instead of regular meetings, groups organized regular phonebanking, signature gathering, protesting, civil disobedience, and so forth. Imagine constant disruption on a host of issues. Imagine the impact. We should set specific, measurable goals (local control of the police for instance) and do those things until we win. As long as it takes.

We could combine agitation with service. We could raise money to help pay off people’s medical debts, help create strike funds for workers, organize volunteer efforts to clean up long-neglected neighborhoods, and other tangible ways of helping others. Such things don’t put pressure on power (though they can grow organizations, and solidarity among the people), and they address symptoms rather than the diseases agitation seeks to eradicate, but they’re better than sitting around.

I simply feel that some social justice organizations need to ask themselves: How much of what we do puts the pressure on? Is our money, energy, and time confronting corporate power, political power, police power? Why settle for just 5% or 10% of your activities actually pressuring someone? Why not make it 75% or 80%, and drive social change forward faster, doing more to better people’s lives?

True, some groups face obstacles. You may have very limited resources, making cheap meetings tempting. If you’re a 501(c)(3), you can’t support candidates. If you’re a grant-funded nonprofit, your energy may have to go into what is dictated by (oftentimes corporate) funders. See, what one may wish to do may not have a grant that will fund it; one then must do things according to grants that exist; the requirements to fulfill such grants may not do much good for anyone. It’s a systemic problem. But I nevertheless imagine most slow-moving groups could find some room to shift from education to agitation, despite the challenges. If the limit is 55, why go 25?

Finally, the Left is fractured, which helps no one. Often Kansas City’s communists, socialists, and anarchists are all at each other’s throats. Differences between anti-capitalist ideologies have led some groups to simply declare they’re never working with these other groups ever again. And of course the radical Left as a whole often refuses to work with liberal or center-left groups that aren’t anti-capitalist, even when they’re fighting for a number of identical or near-identical policies. The liberal and center-left groups naturally don’t want to be associated with radicals who carry red flags, wear black masks, and talk about revolution. Yes, there are limits to cooperation here (you’re not going to get some revolutionaries to get out the vote for anything or anyone), and that’s fine, but there are many areas where cooperation is possible but is not being pursued for fairly stupid reasons. It is vital to the future of social justice work, and the future of countless people, for groups to find common ground and stand there in solidarity with each other, despite stark or maddening differences that lie outside such ground.

These divisions are so great that some groups won’t attend any protest or other event unless it’s their own. Unless they’re brought on board as a sponsor, some organizations wouldn’t dream of promoting important actions and activities being conducted by others. It’s not ours, why would we? That’s the attitude, one I’ve wrestled with professionally. Perhaps we feel it makes our own organization seem less legitimate: less of a leader or less independent or less active. Perhaps it’s the fear of lack of reciprocity. We’re spreading the word about their stuff, why aren’t they doing the same for us? There should really be some sort of formal agreement of mutual support for actions and activities that relate to shared values. You don’t have to help organize and plan everything everyone else is doing; just advertise it to your networks to help drive turnout and involvement in confronting power. You don’t have to promote things or participate in things you disagree with, just those you do. That’s solidarity, right?

This article certainly isn’t meant to indict all organizations in Kansas City. There are some that focus their efforts on pressuring the powerful and work with anyone who agrees on the solutions to specific problems. It’s urgent others move in that direction. That’s how we can be most effective at changing society in positive ways and do work we can take pride in.

On Monday, June 11, 2018, I will again be arrested for an act of civil disobedience with Stand Up KC and the Poor People’s Campaign. The time for sitting and talking is over.

If you feel as I do, join us.

Yes, Evolution Has Been Proven

Evolution is a simple idea: that over time, lifeforms change. In a small timespan, changes are subtle yet noticeable; in a massive one, changes are shockingly dramatic — descendants look nothing like their ancestors, becoming what we call new species.

Changes occur when genes mutate during the imperfect reproduction process, and are passed on if the mutation helps an individual creature escape predators, find food or shelter, or attract a mate, allowing it to more successfully reproduce than individuals without its new trait (natural selection). Some mutations, of course, hurt chances of survival or have no impact at all.
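The logic of natural selection can be sketched numerically. The toy model below is my own illustration, not anything from the works discussed here, and every number in it is hypothetical: a mutant allele with a small reproductive advantage spreads through a population generation by generation.

```python
# Toy haploid selection model (illustrative only): an allele with a small
# reproductive advantage s rises in frequency each generation.

def next_freq(p: float, s: float) -> float:
    """Allele frequency after one generation of selection."""
    return p * (1 + s) / (p * (1 + s) + (1 - p))

p = 0.001          # the mutation starts rare: 0.1% of the population
s = 0.05           # a hypothetical 5% reproductive advantage
generations = 0
while p < 0.99:    # run until the mutant allele is nearly universal
    p = next_freq(p, s)
    generations += 1

# Even a modest advantage carries a rare mutation to near-fixation
# within a few hundred generations.
```

Run with these made-up numbers, the allele sweeps from 0.1% to 99% of the population in a couple hundred generations; flip the sign of s and it dwindles away instead, which is the fate of harmful mutations.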

Naturalist and geologist Charles Darwin provided evidence for this idea in his 1859 book On the Origin of Species and other works, and over the century and a half since, research in multiple fields has consistently confirmed Darwin’s idea, irreparably damaging religious tales of the divine creation of life just as it exists today.


The Myths of Man

While many people of faith have adopted scientific discoveries such as the age of the earth and evolution into their belief systems, many have not. Hardline Christian creationists still believe humans and all other life originated 6,000 years ago, with a “Great Flood” essentially restarting creation 4,000 years ago, when thousands of “kinds” of land animals (tens of thousands of species) were rescued on Noah’s ark.

The logical conclusion of the story is utterly lost on believers. There are an estimated 6.5 million species that live on land today, perhaps 8-16 million total species on Earth (that’s a conservative estimate; it could be 100 million, as most of our oceans remain unexplored). People have cataloged 2 million species, discovering tens of thousands more each year. Put bluntly, believing that in four millennia tens of thousands of species could become millions of species requires belief in evolution at a pace that would make Darwin laugh in your face.
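The arithmetic can be made explicit. This is my own back-of-the-envelope calculation, using the rough figures above (20,000 as a stand-in for “tens of thousands” of ark species, 6.5 million land species today):

```python
import math

# Required growth rate if ~20,000 ark species became ~6.5 million land
# species in ~4,000 years (rough, illustrative figures).
start, today, years = 20_000, 6_500_000, 4_000

rate = math.log(today / start) / years     # continuous growth rate per year
doubling_time = math.log(2) / rate         # years for species count to double

print(round(doubling_time))   # → 479, i.e. roughly five centuries
```

The total species count would have to double roughly every 480 years, without pause, for four millennia straight: a speciation rate far beyond anything evolutionary biology has ever observed or proposed.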

To evolve the diversity of life we see today, much time was needed: far more than 4,000 years, and a planet far older than 6,000 years. We know the Earth is 4.5 billion years old because radioactive isotopes in terrestrial rocks (and meteorites) decay at consistent rates, allowing us to count backward. Fossil distribution, modern flora and fauna distribution, and the shape of the continents first indicated the continents were once one, and satellites proved the continents are indeed moving apart from each other at two to four inches per year, again allowing us to count backward (Why Evolution is True, Jerry Coyne). When we do so, we do not stop counting in the thousands.
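Both “count backward” arguments are simple arithmetic. The sketch below uses my own round, illustrative numbers, not Coyne’s figures:

```python
import math

# 1) Radiometric dating: uranium-238 decays to lead-206 with a half-life of
#    about 4.47 billion years. A mineral holding equal amounts of parent and
#    daughter isotope is therefore one half-life old.
half_life_u238 = 4.468e9                     # years
daughter_per_parent = 1.0                    # illustrative 1:1 ratio
age = half_life_u238 * math.log2(1 + daughter_per_parent)
# age comes out to about 4.5 billion years

# 2) Continental drift: an Atlantic roughly 2,500 miles wide, opening at
#    about 3 inches per year, implies an age on the order of 50 million
#    years, thousands of times older than a 6,000-year-old Earth permits.
atlantic_inches = 2_500 * 63_360             # miles converted to inches
drift_age = atlantic_inches / 3.0            # years
```

Real radiometric dating cross-checks multiple isotope systems and uses isochron methods rather than a single ratio, but the principle really is this simple: decay rates are constant, so isotope ratios encode elapsed time.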

Naturally, criticisms of myths can be waved away with more magic, which is why it’s mostly futile to tear them apart, something I learned after wasting time doing so during my early writing days. Perhaps God decided to make new species after the flood. Perhaps he in fact made millions of species magically fit on a boat roughly the size of a football field, like a bag from Harry Potter. It’s the same way he got pairs of creatures from whole other continents to, and later back from, the Middle East; how one family, through incest, rapidly evolved into multiple human races immediately after the flood (or did he make new human beings, too?); how a worldwide flood and the total destruction of every human civilization left behind no evidence. The power of a deity — and our imagination — can take care of such challenges to dogma. But it cannot eviscerate the evidence for evolution. Science is the true arrow in mythology’s heel.

Still, notions of intelligent design bring up many curious questions, such as why a deity would so poorly design, in identical ways, the insides of so many species (see below), why said deity would set up a world in which 99% of his creative designs would go extinct, and so on.

It seems high time we set aside ancient texts written by primitive Middle Eastern tribes and listened to what modern science tells us. And that’s coming from a former creationist.


It Wasn’t Just Darwin


Charles Darwin, 1809-1882. via Britannica

Creationists attempt to discredit evolution by attacking the reliability and character of Darwin, but forget he was just one man. Darwin spent decades gathering the best evidence for evolution of his day, showed for the first time its explanatory powers across disciplines (from geography to embryology), and brought his findings to the masses with his accessible books. But many came before him who deepened our understanding, and his, of where diverse life came from and of how the Earth wasn’t quite so young as the Bible suggests. For example:

  • In the sixth century B.C., the Greek philosopher Anaximander studied fossils and suggested life began with fishlike creatures in the oceans.
  • James Hutton argued in the 1700s that the age of the earth could be calculated based on an understanding of geologic processes like erosion and the laying down of sediment layers.
  • In 1809, Jean-Baptiste Lamarck theorized that physical changes to an individual acquired during its life could be passed to offspring (a blacksmith builds strength in his arms…could that lead to stronger descendants?).
  • By the 1830s, Charles Lyell was putting Hutton’s ideas to work, measuring the rate at which sediments were laid, and counting backward to estimate Earth’s age.
  • Erasmus Darwin, Charles’ grandfather, suggested “all warm-blooded animals have arisen from one living filament,” with “the power of acquiring new parts…delivering down those improvements by generation.”
  • Alfred Russel Wallace theorized natural selection independently of and at the same time as Charles Darwin!

In other words, if it wasn’t Darwin it would have been Wallace. If not Wallace then someone else. Like gravity or the heliocentric solar system, the scientific truth of evolution could not remain hidden forever.

Creationists also seize upon Darwin’s unanswered questions and use them to argue he “disproved” or “doubted” the validity of his findings. For example, Darwin, in his chapter on “Difficulties of the Theory” in On the Origin of Species, said the idea that a complex eye “could have been formed by natural selection, seems, I freely confess, absurd in the highest possible degree.”

Emphasis on seems. He went on to say:

When it was first said that the sun stood still and the world turned round, the common sense of mankind declared the doctrine false… Reason tells me, that if numerous gradations from an imperfect and simple eye to one perfect and complex, each grade being useful to its possessor, can be shown to exist, as is certainly the case; if further, the eye ever slightly varies, and the variations be inherited, as is likewise certainly the case; and if such variations should ever be useful to any animal under changing conditions of life, then the difficulty of believing that a perfect and complex eye could be formed by natural selection, though insuperable by our imagination, cannot be considered real.

In other words, the evolution of the eye is possible, and there is no real difficulty in supposing it given the other evidence he had found. Darwin knew he was not the end of the line. He made predictions concerning future discoveries, and supposed that other scientists would one day show how eyes could develop from non-existence to simple lenses to complex eyes, as they indeed have. It began with cells that are more sensitive to light than others. Biologists believe, in the words of Michael Shermer (God Is Not Great, Hitchens), that there was

Initially a simple eyespot with a handful of light-sensitive cells that provided information to the organism about an important source of the light; it developed into a recessed eyespot, where a small surface indentation filled with light-sensitive cells provided additional data on the direction of light; then into a deep recession eyespot, where additional cells at greater depth provide more accurate information about the environment; then into a pinhole camera eye that is able to focus an image on the back of a deeply-recessed layer of light sensitive cells; then into a pinhole lens eye that is able to focus the image; then into a complex eye found in such modern mammals as humans.

Earth has creatures with no eyes, creatures with “a handful of light-sensitive cells,” and all the other stages of eye development, right up to our complex camera eye. Given this, there is no reason to believe the evolution of the eye is impossible. As creatures evolved from lower lifeforms, there were slight variations in their ability to detect light, which proved useful for many, which helped creatures survive, which passed on the variations to offspring. This is how life can go from simple to complex over the generations. See The Evidence for Evolution, Alan Rogers, pp. 37-49, for a detailed study.

While the natural process has yet to be observed by humans — it takes eons, after all — we are able to create computer models that mimic beneficial mutations. Dan-Eric Nilsson and Susanne Pelger at Lund University in Sweden, for instance, made a simulation wherein a group of light-sensitive cells on top of a retina experienced random mutations in the tissues around them. The computer was programmed to keep mutations that improved vision in any way, no matter how small. So when the tissue pulled backward, for example, forming a “cup” for the primitive eye, this was preserved because it was an improvement. After 1,829 mutations (400,000 years), the simulation had a complex camera eye (Coyne). Computer models are a great tool for showing how evolution works. Simulations aren’t programmed to build something complex, only to follow the simple laws of natural selection. Check out Climbing Mount Improbable by Richard Dawkins for more.
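The logic of such simulations is easy to reproduce in miniature. The sketch below is my own toy analogue, not Nilsson and Pelger’s actual model: the “eye” is just a list of parameters, the “vision score” is a made-up function, and the only rule is that small random mutations are kept when they improve the score.

```python
import random

random.seed(42)  # fixed seed for a reproducible run

def vision_score(eye):
    """Made-up score: higher as each parameter nears a hypothetical optimum of 1.0."""
    return -sum((x - 1.0) ** 2 for x in eye)

eye = [0.0] * 5              # start: a crude, featureless light patch
kept = 0
for _ in range(100_000):     # generations of random mutation
    mutant = list(eye)
    i = random.randrange(len(mutant))
    mutant[i] += random.gauss(0.0, 0.01)        # one small random change
    if vision_score(mutant) > vision_score(eye):  # selection keeps improvements
        eye = mutant
        kept += 1

# No step aims at the final design, yet the "eye" ends up near-optimal.
```

After the run the parameters sit very close to the optimum, even though most random mutations were discarded along the way. Replace the toy score with real optical calculations and you have the shape of the Lund model.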


Strange Coincidences


Homologous limbs. via University of California Museum of Paleontology

While the study of homologous structures is fascinating, most won’t impress creationists. Humans, bats, birds, whales, and other creatures all have a humerus, radius, ulna, carpals, metacarpals, and phalanges in their forelimbs, with simple variations in size and sometimes number, suggesting they are related via a common ancestor yet have changed, evolved. But the creationist can simply say a sensible deity created them with similar structures. 

Yet there are some coincidences and oddities that no serious person would call intelligent design, and in fact scream common ancestry.

Modern whales have tiny leg bones inside their bodies that are detached from the rest of the skeleton. We humans have three muscles under our scalps that allow some of us to wiggle our ears; they do nothing for our hearing but are precisely the same muscles that allow other animals to turn their ears toward sounds. Goosebumps, now worthless, are vestiges of an era when our ancestors had fur. Our sinus cavities, behind our cheeks, have a drainage hole on top — our ancestors walked on all fours, and thus the location made sense, allowing better drainage. Cave salamanders have eyes but are totally blind. Koalas, which spend most of their time in trees, have pouches for their young that open upside down — their ancestors were diggers on the ground, so this was useful to protect young from dirt and rock thrown about, but now threatens to let koala joeys plunge from trees (The Greatest Show on Earth, Richard Dawkins).

Even more astonishing, within the neck of Earth’s mammals, the vagus/recurrent laryngeal nerve, instead of simply going the short distance from your brain to your voicebox, extends from the brain, goes down into your chest, twists around your aortic arch by the heart, and then travels back up to the voicebox! It’s three times longer than necessary.

Incredibly, this same lengthy, senseless detour is found in other mammals, even the towering giraffe, in which it is fifteen to twenty feet longer than needed (see evolutionist Richard Dawkins cut one open and look here). In fish, which evolved earlier than us, the nerve connects the brain to the gills in a simple, straightforward manner (Coyne). This indicates our common ancestors with fish did not have this issue, but our common ancestors with other, later species did. As our mammalian ancestors evolved, the nerve was forced to grow around other developing, growing, evolving structures.

Human males have another interesting detour. As explained by Dawkins, the vas deferens, the tube that carries sperm from testes to penis, is also longer than necessary — and indeed caught on something. The vas deferens leaves the testes, travels up above the bladder, and loops around the ureter like a hanger on a department store rack. It then finally finds its target, the seminal vesicle, which mixes secretions with the sperm. Then the prostate adds more secretions, finalizing the product (semen), which is ejaculated via the urethra. The vas deferens could go straight to the seminal vesicle (under instead of around the bladder and ureter), but it doesn’t.

This same trait is found in other male mammals, like pigs. Creatures like fish again do not have this mess. Our ancestors had testes within the body, like many modern species, and as they descended toward the scrotum, toward the skin for cooler temperatures, the wiring got caught on the ureter. Perhaps one could see an intelligent (?) designer having to jam some things together to make them work — a detour for the vas deferens here, another for the recurrent laryngeal nerve there — in one species. But in mammals across the board? How does that make more sense than all this being the imperfect byproduct of mindless evolution over time?


via Laryngopedia


via Anatomy-Medicine


And it doesn’t end there. Vertebrates (species that have a backbone) like us happen to have eyes with retinas installed backward. Rogers writes:

The light-sensitive portion of the retina faces away from the light… The nerves, arteries, and blood vessels that serve each photocell are attached at the front rather than the back. They run across the surface of the retina, obscuring the view. To provide a route for this wiring, nature has poked a hole in the retina, which causes a substantial blind spot in each eye. You don’t notice these because your brain patches the image up, but that fix is only cosmetic. You still can’t see any object in the blind spot, even if it is an incoming rock.

But cephalopods (squid, octopi, and other advanced invertebrates) have a more sensible set-up, with wiring in the back (Rogers). Guess what kind of creature appeared on this planet first? Yes, the invertebrates. These coincidences and bad engineering suggest that as life evolved to be more complex there were greater opportunities for messy tangles of innards.

The best creationists can do is declare there are good reasons for these developments, that evolutionists “fail to demonstrate how this detour…disadvantages the male reproductive system” for example, which is completely beside the point. There were indeed biological reasons behind the development of these systems, which served as an advantage, not a hindrance (breaking the vas deferens or recurrent laryngeal nerve to let other organs grow and evolve would not be good for survival). The point is that if some species share this trait, it hints at a common ancestor.

So does embryology, the study of development in the womb. The field of genetics, which we explore further in the next section, helped us discover dead genes, or pseudogenes, in lifeforms. These are genes that are usually inactive but carry traits that, if developed, would be viewed as abnormal. In light of evolution it makes sense that we still have them. And sometimes dead genes wake up.

Humans have just under 30,000 genes, with over 2,000 of them pseudogenes. We have dead genes for growing tails, for instance. We all have a coccyx, four fused vertebrae that make up the end of our spine — four vertebrae that are larger and unfused in primates, thus being the base of their tails (Coyne). Not only are some humans born with an extensor coccygis, the muscle that moves the tail in primates but is worthless in us due to our vertebrae being fused, but some people are born with a tail anywhere from one inch to one foot long! It has to be surgically removed.


Arshid Ali Khan, born in India in 2001, was worshiped as a reincarnation of the Hindu monkey god Hanuman. He had his tail removed in 2015. via Mirror

In fact, all human embryos begin with a fishlike tail, which is reabsorbed into the body around week seven. We develop a worthless yolk sac that is discarded by month two, a vestige of reptilian ancestors that laid eggs containing a fetus nourished with yolk. We develop three kidneys, the first resembling that of fish, the second resembling that of reptiles; these are also discarded, leaving us with our third, mammalian version. From month six to eight, we are totally covered in a coat of hair (lanugo) — primates develop their hair at the same stage, only they keep it. These marvels exist in other life, too. Horse embryos begin with three-toed hooves, then drop to one; they descended from creatures with more than just one toe. Occasionally, a horse is born with more than one hoof, or toe, on each foot (polydactyl horse), similar to its ancestors. Birds carry the genes necessary to grow teeth, minus a single vital protein; they descended from reptiles with teeth. Dolphin and whale embryos have hindlimb buds that vanish later; baleen whale embryos begin to develop teeth, then discard them (Coyne).


Premature infants still have some of their lanugo coat. They will soon lose it. via Mipediatra

It should also be noted that people with hypertrichosis are covered in fur like other primates — perhaps the reactivation of a “suppressed ancestral gene. In the course of evolution genes causing hair growth have been silenced and the appearance of hair in healthy humans can be explained by an erroneous reactivation of such genes.”


Supatra “Nat” Sasuphan, who has hypertrichosis, is the Guinness Book of World Records holder for hairiest person. via Fox News

Quite interesting that God would give us genes to grow tails and fur.

Our fetal development, you likely noticed, actually mimics the evolutionary sequence of humanity. This is most noticeably true with our circulatory system, which first resembles that of fish, then that of amphibians, then that of reptiles, then finally develops into our familiar mammalian circulatory system (Coyne). Strange coincidences indeed.

But there are more. As one would expect if evolution occurred, fossils of creatures found in shallower rock more closely resemble species living today; fossils found in deeper, older sedimentary layers differ more from modern life. This pattern has never been broken by any fossil discovery, and supports Darwin’s idea (Coyne).

Similarly, consider islands. The species found on islands consistently resemble those on the nearest continent. This at first does not sound surprising, as one would predict that life (usually birds, insects, and plant seeds) that colonized islands would do so from the closest landmass. But the key word is “resemble.” What we typically see are a few species native to a continent (the ancestors) and an explosion of similar species on the nearby islands (the descendants). Hawaii has dozens of types of honeycreepers (finches) and half the world’s 2,000 types of Drosophila fruit flies; Juan Fernandez and St. Helena are rich in different species of sunflower; the Galapagos islands have 14 types of finches; 75 types of lemurs, living or extinct, have been documented on Madagascar, and they are found nowhere else; New Zealand has a remarkable array of flightless birds; and Australia has all the world’s marsupials, because the first one evolved there. To the evolutionist, a tight concentration of similar species on islands (and individual islands having their own special species) is the result of an ancestral explorer from a nearby landmass whose descendants thrived in a new environment unprepared for them (a habitat imbalance), reproducing and evolving like crazy. Thus a finch on a continent has a great number of finch cousins on nearby islands — like her but not the same species (Coyne). Darwin himself, still a creationist at the time, was shocked by the fact that each island in the Galapagos, most in sight of each other, had a slightly different type of mockingbird (Rogers).

To the creationist, God simply has an odd affinity for overkill on islands.

 

Shared DNA

In the 20th century, geneticists like Theodosius Dobzhansky synthesized Darwin’s theory with modern genetics, showing how the random, natural mutation of genes during the copying of DNA changes the physiology of lifeforms (should that altered state help a creature survive, it will be passed on to offspring). The study of DNA proved once and for all that Darwin was right. By mapping the genetic code of Earth’s lifeforms, scientists determined — and continue to show — that all life on Earth shares DNA.

DNA is passed on through reproduction. You get yours from your parents. You share more DNA with your parents and siblings than you do with your more distant relatives. In the same way, humans share more DNA with some living things than with others. We share 98% with chimps, 85% with zebra fish, 36% with fruit flies, and 15% with mustard grass. By share, we mean that 98% of DNA base pairs (adenine, guanine, cytosine, and thymine) are in precisely the same position in human DNA compared to chimp DNA. (These four nucleobases can be traded between species. There is no difference between them — we’re all made of the same biochemical stuff.)
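The “percent shared” figures above come from counting matching bases position by position. A minimal sketch, assuming the two sequences are already aligned and of equal length (real genome comparisons use alignment algorithms that handle insertions and deletions):

```python
def percent_identity(seq_a: str, seq_b: str) -> float:
    """Percentage of positions where two aligned DNA sequences
    carry the same base (A, C, G, or T)."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(1 for x, y in zip(seq_a, seq_b) if x == y)
    return 100.0 * matches / len(seq_a)

# Hypothetical ten-base snippets, purely for illustration:
print(percent_identity("ACGTACGTAC", "ACGTACGTTC"))  # 9 of 10 match -> 90.0
```

Scale the same count up to billions of base pairs and you get the species-to-species percentages quoted above.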

It is not surprising that creatures similar to us (warm-blooded, covered in hair, bearing live young, etc.) are closer relatives than less similar ones. It’s no coincidence that apes look most like us and share the most DNA with us (and are able to communicate most directly with us, with one of our own languages, learning and holding entire conversations in American Sign Language). Evolutionary biologists used to use appearance and behaviors (such as gills or reproductive method) to suppose creatures were related, like the trout and the shark or the gorilla and the human being. But DNA now confirms the observations, as trout DNA is more similar to shark DNA than, say, buffalo DNA, and gorilla DNA is more similar to human DNA than, say, fruit fly DNA.

But all life shares DNA, no matter how different (for a deeper analysis, see Rogers pp. 25-31, 86-92). That simple truth proves a common ancestor. A god would have no need to make chimp and human DNA nearly identical, all the nucleobases laid out in nearly the same order; why do so, unless to suggest that evolution is true? When mapped out by genetic similarity, we see exactly what Darwin envisioned: a family tree with many different branches, all leading back to a common ancestor.


Our tree of life. Click link in text above to zoom. via Evogeneao

 

Transitional Forms

Darwin predicted we would find fossils of creatures with transitional characteristics between species, for example showing how lifeforms moved from water to land and back again. Unfortunately, the discovery of such fossils has done nothing to end the debate over evolution. 

For instance, as transitional fossils began to accumulate, it became even more necessary to attack scientific findings on Earth’s age. If you can keep the Earth young, evolution has no time to work and can’t be true. So, as mentioned, creationists insist radiometric dating is flawed. Rocks cannot be millions of years old, thus the fossils encased within them cannot be either. This amounts to nothing more than a denial of basic chemistry. Rocks contain elements whose atoms include unstable isotopes that decay into something else at constant rates. So we can look at an isotope and plainly see how close it is to transformation. We know the rate, and thus can count backward. If researchers relied on a single isotope, perhaps creationists would have a prayer at calling this science into question. But rubidium becomes strontium. Uranium changes to lead, potassium to argon, samarium to neodymium, rhenium to osmium, and more (see Rogers pp. 73-80 to explore further). This is something anyone can devote study to: grab some rocks and measure for yourself. All creationists can do is say we aren’t positive that “the decay rate has remained constant”! Can you imagine someone claiming that during Isaac Newton’s time gravity’s acceleration wasn’t 9.8 meters per second squared? Anyone can make stuff up!
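The “count backward” arithmetic works like this: a parent isotope decays exponentially, so the ratio of daughter atoms to remaining parent atoms fixes the elapsed time. A sketch using the real potassium-40 half-life of about 1.25 billion years, under the simplifying assumption that the sample started with no daughter atoms:

```python
import math

def age_from_ratio(daughter_to_parent: float, half_life_years: float) -> float:
    """Elapsed time given the measured daughter/parent isotope ratio,
    assuming no daughter atoms at formation and a constant decay rate."""
    decay_constant = math.log(2) / half_life_years            # lambda = ln 2 / half-life
    return math.log(1 + daughter_to_parent) / decay_constant  # t = ln(1 + D/P) / lambda

# Potassium-40 decays to argon-40 with a half-life of ~1.25 billion years.
# Equal parts parent and daughter means exactly one half-life has passed:
print(age_from_ratio(1.0, 1.25e9))  # -> ~1.25 billion years
```

Real labs correct for daughter atoms present at formation (using isochron methods), but the core logic is just this exponential decay formula.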

(You’ll find most denials of evolution rest on denials or misunderstandings of the most basic scientific principles. Some creationists insist evolution is false because it betrays the Second Law of Thermodynamics, which states that the energy available for work in a closed system will decrease over time — that things fall apart. So how could simple mechanisms become more complex? How could life? What they forget is that the Earth’s environment is not a closed system. The sun provides a continuous stream of new energy. Similarly, some believe in “irreducible complexity,” the idea that complex systems with interconnected parts couldn’t evolve because one part would have no function until another evolved, therefore the first part would never arise, and thus neither could the complex system. But the “argument from complexity” fails per usual. [Other arguments, such as the “watchmaker” and “747” analogies, are even worse. Analogy is one of the weakest forms of argument because it inappropriately pretends things must be the same. No, a watch cannot assemble itself. That does not mean life does not evolve. Analogies fighting evidence are always doomed.] Biologists have discovered that parts can first be used for other tasks, as was determined for the bacterial flagellum, the unwise centerpiece of creationist Michael Behe’s skepticism. Independent parts can evolve to work together on new projects later on. Rogers writes:

Many hormones fit together in pairs like a lock and key. What good is the lock without the key? How can one evolve before the other? Jamie Bridgham and his colleagues studied one such pair and found that the key evolved first — it formerly interacted with a different molecule. They even worked out the precise mutations that gave rise to the current lock-and-key interaction.

A part of this process is sometimes scaffolding, where parts that helped form a complex system disappear, leaving the appearance that the system is too magical to have arisen. The scaffolding required to build our bridges and other structures is the obvious parallel.)

Let’s consider the fossils humanity has found. Tiktaalik was a fish with transitional structures between fins and legs. “When technicians dissected its pectoral fins, they found the beginnings of a tetrapod hand, complete with a primitive version of a wrist and five fingerlike bones… [It] falls anatomically between the lobe-finned fish Panderichthys [a fish with amphibian-like traits], found in Latvia in the 1920s, and primitive tetrapods like Acanthostega [an amphibian with fish-like traits], whose full fossil was recovered in Greenland not quite two decades ago.” Tiktaalik had both lungs protected by a rib cage and gills, allowing it to breathe in air and water, like the West African lungfish and other species today. Its fossil location was actually predicted, as researchers knew the age and freshwater environment such a missing link would have to appear in (Coyne).

Ambulocetus had whale-esque flippers with toes (Rodhocetus is similar). Pezosiren was just like a modern manatee but had developed rear legs. Odontochelys semitestacea was an aquatic turtle with teeth. Darwinius masillae had a mix of lemur traits and monkey traits. Sphecomyrma freyi had features of both wasps and ants. Archaeopteryx was more bird-like than other feathered dinosaurs (that’s feathered reptiles), yet not quite like modern birds. Its asymmetrical feathers suggest it could fly. Microraptor gui, a dinosaur with feathered arms and legs, could likely glide. Other feathered dinosaurs were found fossilized sleeping with their heads tucked under their forearms or sleeping on nests of eggs, just like modern birds (Coyne; see also Dawkins pp. 145-180).

Australopithecus afarensis, Australopithecus africanus, Paranthropus, Homo habilis, Homo erectus, and many more species had increasingly modern human characteristics. Less and less like a primate, closer and closer to modern Homo sapiens. Fossils indicate increasing bipedality (walking upright on two legs), smaller jaws and teeth, increasingly arching feet, larger brains, etc. (Also important to note are the increasingly complex tools and shelters found with such fossils. Homo erectus left behind huts, spears, axes, and bowls. Our planet had not-fully-human creatures crafting quite human-like things. Think on that. See The History of the World, J.M. Roberts.)


A: chimp skull. B-N: transitional species from pre-human to modern human. via Anthropology

It doesn’t stop there, of course. Evolution can be seen in both the obvious and minuscule differences between species.

See for example “From Jaw to Ear” (2007) and “Evolution of the Mammalian Inner Ear and Jaw” (2013). It was theorized that three important bones of a mammal’s ear — the hammer, anvil, and stirrup — were originally part of the jaw of reptilian ancestors (before mammals existed). In modern mammals there is no connecting bone between the jaw and the three inner-ear bones, but if there was an evolution from reptilian jaw bone to mammalian inner-ear bone, fossils should show transitional forms. And they do: paleontologists have found fossils of early mammals where the same bones are used for hearing and chewing, as well as fossils where the jaw bones and inner-ear bones are still connected by another bone.

Creationists have a difficult time imagining how species could evolve from those without wings to those with, from those that live on land to water-dwellers, from aquatic lifeforms back to land lovers, and so on, because they believe intermediary, transitional traits would be no good at all, could not help a creature survive. “What good is half a wing?”

Yet today species exist that show how transitional traits serve creatures well. Various mammals, marsupials, reptiles, amphibians, fish, and insects glide. It is easy to envision how reptiles could have evolved gliding traits followed by powered flight over millions of years. Or consider creatures like hippos, which are closely related to and look like terrestrial mammals but spend almost all their days underwater, only coming ashore occasionally to graze. They mate and give birth underwater, and are even sensitive to sunburn. Give it eons, and couldn’t such species change bit by bit to eventually give up the land completely? The closest living relatives of whales are in fact hippos (Coyne). And finally, what of the reverse? What of ocean creatures that head to land? Crocodiles can gallop like mammals (up-down spine flexibility) as well as walk like lizards (right-left spine flexibility; see Dawkins). The mangrove rivulus, the walking catfish, American eels, West African lungfish, four-eyed fish, snakeheads, grunions, killifish, the anabas, and other species leave the waters and come onto land for a while, breathing oxygen in the air through their skin or even lungs, flopping or slithering or squirming or walking to a new location to find mates, food, or safety. Why is it so difficult to imagine a species spending a bit more time on land with each generation until it never returns to the water?

“Half a wing” is not a thing. There are only traits that serve a survival purpose in the moment, like membranes between limbs for gliding. Traits may develop further, they may remain the same, they may eventually be lost, all depending on changes in the environment over time. Environment (food sources, mating options, predators, habitability) drives evolutionary changes differently for every species. That’s natural selection. When some members of a species break away from the rest (due to anything from mudslides to migration to mountain range formation), they find themselves in new environments and evolve differently than the relatives they left behind. Coyne writes, “Each side of the Isthmus of Panama, for example, harbors seven species of snapping shrimp in shallow waters. The closest relative of each species is another species on the other side.” Species can change a little or change radically, unrecognizably, but either way they can become a new species — in fact, unable to reproduce with their long-lost relatives, because their genes have changed too greatly. That’s speciation.

There is no question that the fossil record starts with the simplest organisms and, as it moves forward in time, ends with the most complex and intelligent — all beginning in the waters but not staying there. Single-cell organisms before multicellular life. Bacteria before fungi, protostomes before fish, amphibians before reptiles, birds before human beings.

If they wish, creationists can believe the fossil record reflects the chosen sequence of a logical God, even if it does not support the Judeo-Christian creation story (in which birds appear on the same “day,” Day 5, as creatures that live in water, before land animals, which appear on Day 6; the fossil record shows amphibians, reptiles, and mammals appearing long before birds — and modern whales, being descendants of land mammals, don’t appear until later still, until after birds, just 33 million years ago). Yet they must face the evidence and contemplate what it indicates: that a deity created fish, then later fish with progressively amphibious features, then later amphibians; that he created reptiles, then later reptiles with progressively bird-like features, then birds; and so forth. No discovery has ever contradicted the pattern of change slowly documented since Darwin. God is quite the joker, laying things out, from fossils to DNA, in a neat little way to trick humans into thinking we evolved from simpler forms (note: some creationists actually believe this).

Yes, the believer can simply claim these were all their own species individually crafted by God, with no ancestors or descendants who looked or acted any different. The strange fact that we have birds that cannot fly and mammals in the oceans that need to come up to the surface for air doesn’t engage the kind of critical thinking one might hope for. It’s all just a creative deity messing with animals!

 

Watching Evolution Occur


Renee, an albino kangaroo at Namadgi National Park, Australia. via Telegraph

Most creationists are in fact quite close to accepting evolution as true.

First, they accept that genes mutate and can change an individual creature’s appearance. They know, for instance, about color mutations. We’re talking albinism, melanism, piebaldism, chimeras, erythristics, and so on.

Second, most creationists accept what they call “microevolution”: mutations help individuals survive and successfully reproduce, passing on the mutation, changing an entire species generation by generation in small ways, but of course not creating new species. They accept that scientists have observed countless microevolutionary changes: species like tawny owls growing browner as their environments see less snowfall, Trinidad guppies growing larger, maturing slower, and reproducing later when predators are removed from their environments, green anole lizards in Florida developing larger toepads with more scales to escape invaders, and more, all within years or decades. They understand evolution is how some insects adapt to pesticides and some pathogens, like HIV and the tuberculosis bacterium, adapt to our drugs over time, and how we human beings can create new viruses in the lab. They acknowledge that humanity is responsible, through artificial selection, or selective breeding, for creating so many breeds of dogs with varying appearances, sizes, abilities, and personalities (notice the greyhound, bred for speed by humans, closely resembles the cheetah, bred for speed by natural selection). In the same way, we’ve radically changed crops like corn and farm animals like turkeys (who are now too large to have sex), and derived cabbage, broccoli, kale, cauliflower, and brussels sprouts from a single ancestral plant, to better sate our appetites, simply by selecting individuals with traits we favor and letting them reproduce.


Wild banana (below) vs. artificially selected banana. via NBC News

The evidence presented thus far should push open-minded thinkers toward the truth, but for those still struggling to make the jump from microevolution to evolution itself, we are not done yet. The resistance is understandable given that small changes can easily be observed in the lab or nature, but large changes require large amounts of time — thousands, millions of years — and thus we mostly (but not entirely) have to rely on the evidence from DNA, fossils, embryology, and so on. Here are some points of perspective that can bridge the gap between small changes and big ones.

1. Little changes add up. If you accept microevolution, you accept that species can evolve to be smaller or bigger, depending on what helps them survive and reproduce. Scott Carroll studied soapberry bugs in the U.S. and observed some colonizing bigger soapberry bushes than normal; he predicted they would also grow larger, as larger individuals would be more successful at reaching fruit seeds. Over the course of a few decades, the bugs’ “beak” length grew 25%. That’s significant. Now imagine what could theoretically be done with more time. As Coyne writes, “If this rate of beak evolution was sustained over only ten thousand generations (five thousand years), the beaks would increase in size by a factor of roughly five billion…able to skewer a fruit the size of the moon.” This is unlikely to happen, but shows how little changes later yield dramatic results. Imagine traits other than size — all possible traits you can think of — changing at the same time and evolution doesn’t sound so impossible.
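Coyne's extrapolation is ordinary compound growth. The per-generation rate below (about 0.22%) is a back-of-envelope figure chosen to reproduce his numbers, not a measured value:

```python
# A tiny per-generation increase compounds enormously over many generations.
rate_per_generation = 0.00223  # ~0.22% growth per generation (illustrative)
generations = 10_000
factor = (1 + rate_per_generation) ** generations
print(f"{factor:.2e}")  # on the order of 5 billion, matching Coyne's figure
```

The same arithmetic that makes interest on a loan balloon over decades makes tiny heritable changes balloon over geological time.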

2. Genes are genes. This relates closely to the point above. If some genes can mutate, why can’t others? Genes determine everything about every creature. People who believe in microevolution accept that genes for size or color can change, but not genes for where your eyes are, whether you’re warm- or cold-blooded, whether you have naked skin or a thick coat of fur, whether you have a hoof or a hand, and so on. But there is no scientific basis whatsoever for this dichotomy of the possible. It’s simply someone claiming “These genes can mutate but not these, end of story” to protect the idea of intelligent design. Genes are genes. They are all simply sequences of nucleotides. As far as we know, no gene is safe from mutation.


Octogoat, a goat with eight legs, born in Kutjeva, Croatia. via ABC News

3. Mutations can be huge. We’ve seen how humans can have tails, but we also see “lobster claw hands,” rapid aging, extra limbs, conjoined twins, and other oddities. Consider other mutations: snakes with two heads, octopi with only six tentacles, ducks with four legs, cats with too many toes. For the common fruit fly, the antennapedia mutation means legs grow where the antennae are supposed to be! Dramatic mutations are possible. Survival is possible. Passing on new, weird traits is possible. With evolution, sometimes groups with new traits totally displace and eliminate the ancestral groups; sometimes they live side by side going forward. If you came across a forest and discovered one area was occupied by two-headed snakes and another by single-headed snakes, all other traits being the same, wouldn’t you be tempted to call them different species? Declare something new had arisen on Earth?

4. We are currently watching evolution occur. Scientists have observed speciation. They’ve taken insects, worms, and plants, put small groups of them in abnormal environments for many generations, and then seen they can no longer reproduce with cousins in the normal environments because they have evolved. It’s easy to create new species of fruit flies in particular because their generations are so short. Evolution for other species is typically much slower, but significant changes are being observed.

Say you were instead on the African savanna and came upon two groups of elephants. They are the same but for one startling difference: one group has no tusks. Like two-headed snakes, what a bold difference in appearance! Should we classify them as different species or the same? (Technically, they aren’t different species if they can still produce offspring together, but in the moment you aren’t sure.) Well, African elephants are increasingly being born without tusks. After all, those without are less likely to be killed by poachers for ivory. This is natural selection at work. Could not a changing environment and millions of years change more? Size, color, skin texture, hair, skeletal layout, teeth, and all other possible traits determined by all other genes?

Next, take a remarkable experiment involving foxes launched by Dmitry Belyaev and Lyudmila Trut in the Soviet Union in the late 1950s, which Trut is still running to this day. No, we can’t watch a species for 500,000 years to see dramatic evolution in action. But 60 years gives us something.

At the time, biologists were puzzled as to how dogs evolved to have different coats than wolves, since they couldn’t figure out how the dogs could have inherited those genes from their ancestors. Belyaev saw silver foxes as a perfect opportunity to find out how this happened. Belyaev believed that the key factor that was selected for was not morphological (physical attributes), but was behavioral. More specifically, he believed that tameness was the critical factor.

In other words, Belyaev wanted to see if foxes would undergo changes in appearance if they evolved different behaviors. So Belyaev and Trut set about taming wild silver foxes.


Wild silver fox. via Science News

They took their first generation of foxes (which were only given a short time near people) and simply allowed the least aggressive to breed. They repeated this with every generation. They had a control group that was not subjected to selective breeding.

The artificial selection, of course, succeeded in changing fox behavior. The foxes became much more open to humans, whining for attention, licking them, and wagging their tails when happy. But there was more:

A much higher proportion of experimental foxes had floppy ears, short or curly tails, extended reproductive seasons, changes in fur coloration, and changes in the shape of their skulls, jaws, and teeth. They also lost their “musky fox smell.”

Spotted coats began to appear. Trut wrote that skeletal changes included shortened legs and snouts as well. Belyaev said they started to sound more like dogs (Dawkins). Geneticists are now seeking to isolate the genes related to appearance that changed when selectively breeding for temperament.

Belyaev was right. And his foxes, through evolution, came to look more and more like dogs. This is the same kind of path that some wolves took when they evolved into dogs (less aggressive wolves would be able to get closer to humans, who probably started feeding them, aiding survival; tameness increased and physical changes went with it).

If such changes can occur in just 60 years, imagine what evolution could do with a hundred million years.


Dr. Lyudmila Trut with domesticated silver fox. via WXXI

 

In the Beginning

It’s true, scientists are still unsure how life first arose on Earth. Because it is an enduring mystery without hard evidence, scientists who offer hypotheses and speculations openly acknowledge them as such. Note that this is a big difference from evolution, which scientists speak about confidently due to the wealth of evidence.

But one professor at MIT believes that far from being unlikely, nonliving chemicals becoming living chemicals was inevitable.

From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat… When a group of atoms is driven by an external source of energy (like the sun or chemical fuel) and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate increasingly more energy. This could mean that under certain conditions, matter inexorably acquires the key physical attribute associated with life.

Researchers have discovered lipids, proteins, and amino acids beneath the seafloor, suggesting the chemical interaction between the mantle and seawater could produce the building blocks of life. From there, time and proper conditions could give rise to the first self-replicating molecule. Evolution would then continue on, spending billions of years developing the diverse flora and fauna we see today (a single cell leading to complex life under the right conditions should not be so shocking; as J.B.S. Haldane said, “You did it yourself. And it only took you nine months”).

Determining precisely how the first cell arose is the next frontier of evolutionary biology, and it is exciting to be here to witness the journey of discovery. New findings and experiments will wipe away “watchmaker” arguments used against the first cell. They will once again crush the “God gap,” the bad habit of the faithful of filling gaps in our scientific knowledge with divine explanations. I imagine that in our lifetime someone will finish what Stanley Miller’s famous 1950s experiments began: recreating the Earth’s early conditions to create life itself.

Yet our lack of knowledge concerning the beginning of life in no way hurts the case for evolution. Evolution is proven, as definitively as the fact that the earth orbits the sun.

The Scope of False Sexual Assault Allegations

When conservatives are confronted by the rise of a “liberal” cause, many find and point to a small problem in order to discredit or divert attention from the immense problem liberals are attacking.

It’s an unhealthy mix of the whataboutism fallacy (citing wrongs of the opposing side instead of addressing the point) and the false equivalence fallacy (describing situations as equivalent [I’ll add “in scope”] when they are not). We observe this during talk on racial violence, when many conservatives pretend hate crimes against whites are just as common as hate crimes against people of color; see “On Reverse Racism.”

Lately, the fallacy was on full display as high-profile men across the country were accused of sexual assault and harassment, many fired or urged to resign. In this frenzy of allegations, some Americans see and cheer a surge in bravery and collective solidarity among victims, inspired by each other and seeking justice, while others see and decry a male “witch hunt,” with evil women growing bolder about their lies, perhaps on the George Soros payroll. Where you land is a fairly decent predictor of your political views. Who was accused also determined for many which women to believe, with some conservatives supporting Republican Roy Moore through his underage-girls scandal but attacking Democrat Al Franken for groping women. Sadly, some liberals did the reverse. I witnessed a left-leaning acquaintance or two trying to discredit accusations against Franken (ones he publicly apologized for) by slandering the victims. Still, it is typically conservatives (often sexually frustrated men) who, when they encounter liberals talking about rape, sexual assault, sexual harassment, toxic masculinity, and so forth, bring up false rape accusations.

One comment on a mediocre article Men’s Health shared on how to make sure you have consent from a woman typified this. There were of course countless like it, many poorly written: “And remember if she regrets it the next day you’re still fucked”; “I bring my attorney and a notary on all dates and hook ups”; “There’s no such thing as consent anymore, it’s a witch hunt. Just say no gentleman”; “Don’t forget guys… If you have drank 12 drinks and she has 1 sip of beer…… You raped her.” Still more were angry at the article’s very existence: “Men’s health turning into click bate leftist agenda”; “Did a feminist write this?”; “Did a woman write this?” It’s sad that consent is now, apparently, a liberal, feminist scheme. But this comment got much attention and support, likely because people found it thoughtful and measured for some odd reason:

This is a touchy subject. Yes, respect women—We all know that. Have a woman’s consent—Yes, we all know that. Do not rape or sexually assault a woman—Yes we all know that. We respect the rules. However, there are some women that exploit and take advantage of the rules. It’s sad to say, there are some out there that falsely accuse a man of rape or sexual assault—ruining their lives. Being a man in today’s era, I’m afraid to ask a woman on a date. I feel sometimes a man needs a contract just to protect himself. Yes, this might sound objectionable and supercilious—but you can’t be too careful nowadays. We live in a different time now. Men: We need to change our attitudes and treatment of women. However, it’s okay that we protect ourselves—and we shouldn’t be demonized or vilified for doing so. I don’t want to be viewed or portrayed as the enemy, nor be apologetic for being a man.

An amusing piece of writing. “We all know” not to rape, assault, or harass women? If the collective male “we” legitimately “knew,” such things would be a thing of the past and a primer on consent unnecessary. “We live in a different time” where men are “afraid to ask a woman on a date”! If you’re going to “protect” yourself in some way, you wouldn’t be “demonized” for actually getting consent in some formal sense; you would be only if you used illegal and unethical methods to “protect” yourself, like secretly filming sex. And where are these women asking men to apologize for being a man, rather than for specific behaviors or attitudes that make them uncomfortable, scared, unsafe, or physically violated?

This is a perfect example of the fallacy above. “Men sexually assault women and shouldn’t, but what about the women who make false accusations?” The latter part is clearly his main concern; he didn’t stop by to condemn rapists, he came with another purpose. People may not intend it or even realize it (some do), but when men (or women) do this, they position false reports as a problem of the same significance, or nearly the same significance, as actual sex crimes, as if the scope, the prevalence, were comparable. That’s what taking a conversation on consent and redirecting it to one about false accusations does. It says, “This is what’s important. This is what we should be talking about.” It’s like bringing up asthma when everyone’s discussing lung cancer. It deflects attention from a problem that is much more severe, and it subtly undermines the credibility of rape victims. It’s not wrong to discuss small problems, of course, but they should always be kept in perspective. In my view, comments about hate crimes against whites or false accusations against men should never be uttered without the enormous asterisk that these are minuscule percentages of overall hate and sex crimes. In that way, we can think of others first. We can protect the credibility of real victims. We can remain rooted in the facts, neither implying a small problem is large nor the reverse. Naturally, including those caveats undermines the usual function of bringing up these issues, but no matter.

Yes, lying about sex crimes is an issue that exists. Yes, there should be some legal punishment for such an immoral act (nowhere near the punishment for sexual assault and harassment, obviously, because these are not in any way morally equivalent crimes). Yes, people are innocent until proven guilty, which is why men cannot be imprisoned until they see their day in court, even if they face social consequences like losing a job due to presumed guilt. You can oppose that on ethical grounds, but the ground is less stable than you might hope, especially when a man is accused by a coworker, family member, or someone else in close proximity. Is it most ethical to oppose a firing until a trial and risk keeping a rapist around the workplace? Putting others in danger? Forcing a victim to clock in next to him each day? Or is it most ethical to fire him and risk tearing down the life of an innocent man? It’s an unpleasant dilemma for any employer, university administrator, or whomever, but ethically there’s not much question. One risk is far graver, thus the answer is simple. This only grows more axiomatic when we acknowledge the likelihood of events.

The prevalence of proven false accusations of sexual assault is somewhere between 2% and 8% of cases. The National Sexual Violence Resource Center documents a 2006 study of 812 cases that found 2.1% were false reports, while a 2009 study of 2,059 cases and a 2010 study of 136 cases estimated 7.1% and 5.9%, respectively. Research from 2017 revealed a 5% false-claim rate for rape. The Making a Difference Project, using data from 2008, estimates 6.8%. These numbers are mirrored in prior American decades and in similar countries. While we can acknowledge that some innocent people in prison never see justice and are never set free, since 1989 only 52 men have been released from prison after it was determined their sexual assault charges were based on lies. Compare this to 790 murder exonerations; the number of people in state prisons for murder and for sexual assault/rape is about the same (though the former crime is far less common than the latter), making the low exoneration rate for sex-crime convictions all the more significant.
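For readers who want to check the arithmetic, the three studies cited above can be pooled into a single case-weighted estimate. This is only a rough sketch: the sample sizes and rates are those quoted above, and pooling studies that used different definitions of “false report” is indicative at best.

```python
# Case-weighted pooling of the three false-report studies cited above.
# Each entry is (case_count, false_report_rate) as quoted in the text.
studies = [
    (812, 0.021),   # 2006 study: 812 cases, 2.1% false
    (2059, 0.071),  # 2009 study: 2,059 cases, 7.1% false
    (136, 0.059),   # 2010 study: 136 cases, 5.9% false
]

total_cases = sum(cases for cases, _ in studies)
false_cases = sum(cases * rate for cases, rate in studies)
pooled_rate = false_cases / total_cases

print(f"Pooled false-report rate: {pooled_rate:.1%}")
```

The pooled figure works out to roughly 5.7%, squarely inside the 2–8% range the research converges on.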

Myriad definitions of both “false report” and “sexual assault” make the precise percentage difficult to nail down, and these statistics only address proven false reports (there are many cases in limbo, as conservative writers are quick to point out), but this research gives us a general idea. Reports of high percentages of false claims are typically not academic studies or have rather straightforward explanations, for example when Baltimore’s “false claim” rate plunged from 31% to under 2% when the police actually went through some training and “stopped the practice of dismissing rapes and sexual assaults on the scene”! It’s remarkable how legitimate investigations and peer-reviewed research can bring us closer to the truth.

In other words, when observing any sexual misconduct scandal, there is an extremely high chance the alleged victim is telling the truth. This is why we believe women. This is why they, not accused men, should be given the benefit of the doubt. It’s why the moral dilemma for employers and the like is hardly one at all. Were precisely 50% of sexual assault allegations lies, it would still be most ethical to take the risk of firing a good man rather than the risk of keeping a predator around. But since women are almost always telling the truth? Well, the decision is that much easier, and that much more clearly ethical.

In the U.S., there are some 321,500 rapes and sexual assaults per year, and 90% of adult victims are women (you’ve probably noticed how “men are raped too” is used in a similar manner to all this). One in six women are rape or attempted rape survivors. For every 1,000 rapes, 994 perpetrators (99%) will never go to prison.

Just How Bad is American Poverty?

“I was already It, whatever It was,” Jack London wrote in 1905, “and by aid of the books I discovered that It was a Socialist.” He continued, in his essay entitled “How I Became a Socialist,” by declaring:

Since that day I have opened many books, but no economic argument, no lucid demonstration of the logic and inevitableness of Socialism affects me as profoundly and convincingly as I was affected on the day when I first saw the walls of the Social Pit rise around me and felt myself slipping down, down, into the shambles at the bottom.[1]

Huge numbers of people fall into the pit of poverty, which can be very difficult to escape. It is certainly not a mere 15% of Americans or thereabouts, as the government’s outdated “poverty line” would have it (the threshold for a single person is $11,500 a year, as if someone making $12,000 isn’t poor). In reality, 48% of Americans live in poverty or near-poverty.[2] This is expected, as 40% of U.S. workers made under $15 an hour in 2015 and 50% of all jobs in the U.S. paid $34,000 annually or less in 2013.[3] Though it varies slightly by state, $34,000 is about $24,000 after taxes, or about $2,000 in take-home pay a month. If you make minimum wage, you earn just over $1,100 a month if working full-time. Meanwhile, the median cost of rent is about $1,000 and climbing.
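The monthly figures above follow from simple arithmetic; here is a quick sketch of the math. The $7.25 federal minimum wage is the current federal rate; the four-week month and the roughly $24,000 after-tax figure are the approximations used in the text.

```python
# Monthly-budget arithmetic behind the figures in the text.
MIN_WAGE = 7.25          # federal minimum wage, dollars per hour
HOURS_PER_WEEK = 40      # full-time
WEEKS_PER_MONTH = 4      # rough approximation

monthly_min_wage = MIN_WAGE * HOURS_PER_WEEK * WEEKS_PER_MONTH  # "just over $1,100"
monthly_median_job = 24_000 / 12   # $34,000 salary, ~$24,000 after taxes
median_rent = 1_000                # median monthly rent, and climbing

print(f"Minimum-wage month:   ${monthly_min_wage:,.0f}")
print(f"Median-job take-home: ${monthly_median_job:,.0f}")
print(f"Rent as a share of a minimum-wage month: {median_rent / monthly_min_wage:.0%}")
```

At these numbers, the minimum-wage worker grosses about $1,160 a month, and median rent alone would consume roughly 86% of it.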

56% of citizens have less than $1,000 in the bank, and one in three families has no savings at all.[4] Individuals making low wages must spend everything or nearly everything they make on groceries, electricity, water, rent, and gas or bus fare right away. If anything can be saved, it is often wiped out by the typical hurdles of life that better-off people consider mere annoyances, such as broken-down cars or doctor’s visits. 77% of Americans say they are living paycheck to paycheck.[5] Millions have negative wealth due to loans, negative equity on homes after the 2008 housing crash, and so on.[6] Even when the economy is doing well, millions remain unemployed.

The work of the poor is often unfulfilling, unpleasant, even humiliating or dangerous. Many work long hours — 65, 70, 75 or more a week — or multiple jobs to make ends meet, seeing their loved ones infrequently. While they work, their children attend inferior schools (school funding is based on property taxes), often experiencing low-quality teachers, crumbling facilities, overcrowded classes, and a lack of books, supplies, and physical and mental healthcare. “I want to be able to go to school and not have to worry about being bitten by mice, being knocked out by the gases, being cold in the rooms,” a Detroit student, Wisdom Morales, said in 2016. Poverty actually damages mental abilities and mental health in children and adults alike.

The life expectancy of the poor is over a decade shorter than that of the rich, due to worse health.[7] Factors include unhealthy food being the most affordable, unhealthy air and environments, stress and depression, smoking, lack of healthcare, and so on. Many low-income people have to live in dilapidated apartments or houses infested with roaches and mice, marred by feces, rot, and mold, sometimes without heating or air conditioning.[8] If you have a month where you can’t pay a utility bill, your water or electricity is immediately cut off. If you can’t pay rent, you are evicted.

Almost 50 million Americans rely on food stamps.[9] Even U.S. soldiers spend tens of millions of dollars in food stamps each year.[10] 65% of us will use welfare, help we must qualify for, at some point in our lives to get by.[11] There exists a population of 1.5 million households that live on $2 a day—Third World levels—due to unemployment, reduced hours, lack of knowledge concerning welfare programs, etc.[12] Some in these households sell themselves for sex, sell plasma, or sell scrap metal to survive.

Persons with disabilities have no minimum wage protection, and can make under $1 per hour.

Each year, 3.5 million people will experience homelessness at some point (while 18.5 million homes stand empty, waiting for citizens who can afford them). 23% are children; about 10% are veterans; over 40% are disabled; 20-25% suffer from mental illness; most homeless women are domestic abuse victims.[13] The homeless suffer humiliation, from being denied service at businesses due to appearance to cities criminalizing begging, loitering, and sleeping in public places or even private vehicles.[14] Benches and sidewalks are redesigned, at times with spikes, to drive away the homeless looking for rest.[15] When the temperature drops, homeless people die outside.

The percentage of workers over 65 has doubled since 1985, partly due to the elderly not having enough money to retire and Social Security payments being too dismal to live on.[16] What kind of society allows its elderly to live in poverty? Or its children? One in four U.S. children is food insecure, meaning they miss meals or are malnourished by cheap, unhealthy food – ketchup sandwiches, for instance.[17] Anastasia Basil remembered:

I’d come home from high school and there’d be nothing in the fridge but a bottle of red wine vinegar and a head of lettuce. On the counter, there’d be a bag of potatoes and a bottle of olive oil from the Dollar Store. That was dinner, potatoes and lettuce.

In the wealthiest nation on earth, children of the poor go to school with extremely painful rotting or impacted teeth.[18] Education activist Jonathan Kozol, in Savage Inequalities, wrote of the slums of East St. Louis:

As in New York City’s poorest neighborhoods, dental problems also plague the children here. Although dental problems don’t command the instant fears associated with low birth weight, fetal death or cholera, they do have the consequence of wearing down the stamina of children and defeating their ambitions. Bleeding gums, impacted teeth and rotting teeth are routine matters for the children I have interviewed in the South Bronx. Children get used to feeling constant pain. They go to sleep with it. They go to school with it.

Sometimes their teachers are alarmed and try to get them to a clinic. But it’s all so slow and heavily encumbered with red tape and waiting lists and missing, lost or canceled welfare cards, that dental care is often long delayed. Children live for months with pain that grown-ups would find unendurable. The gradual attrition of accepted pain erodes their energy and aspiration. I have seen children in New York with teeth that look like brownish, broken sticks. I have also seen teen-agers who were missing half their teeth. But, to me, most shocking is to see a child with an abscess that has been inflamed for weeks and that he has simply lived with and accepts as part of the routine of life. Many teachers in the urban schools have seen this. It is almost commonplace.

With low wages and no health insurance, seeing the dentist is a luxury.

Among advanced democracies, the U.S. has the highest rate of poverty, one of the highest rates of infant mortality, and some of the lowest marks for life expectancy, wages, and living standards for the poorest among us (see A People’s History of Poverty in America, Pimpare).

“As for the unfortunates, the sick, and ailing, and old, and maimed, I must confess I hardly thought of them at all [early on],” London wrote. “My joyous individualism was dominated by the orthodox bourgeois ethics.” But he experienced economic hardship personally, and travelled throughout America and Canada listening to people “all wrenched and distorted and twisted out of shape by toil and hardship and accident, and cast adrift by their masters like so many old horses.” He continued:

And as I listened my brain began to work. The woman of the streets and the man of the gutter drew very close to me. I saw the picture of the Social Pit as vividly as though it were a concrete thing, and at the bottom of the Pit I saw them, myself above them, not far, and hanging on to the slippery wall by main strength and sweat… Just as I had been an individualist without knowing it, I was now a Socialist without knowing it… I had been reborn…

 

 

Notes

[1] “How I Became a Socialist,” Jack London

[2] “Half of U.S. Poor or Low-Income,” CBS

[3] http://www.thenation.com/article/almost-half-of-all-american-workers-make-less-than-15-an-hour/; http://www.nytimes.com/2012/07/29/opinion/sunday/why-cant-we-end-poverty-in-america.html?_r=4&pagewanted=all

[4] http://www.forbes.com/sites/maggiemcgrath/2016/01/06/63-of-americans-dont-have-enough-savings-to-cover-a-500-emergency/#2715e4857a0b19717426dde1

[5] http://socialistappeal.org/news-analysis/editorials/1112-qis-capitalism-dyingq.html

[6] https://www.bloomberg.com/news/articles/2016-08-01/new-york-fed-study-finds-15-of-u-s-households-have-no-wealth

[7] https://www.nytimes.com/2016/02/13/health/disparity-in-life-spans-of-the-rich-and-the-poor-is-growing.html

[8] http://www.miamiherald.com/news/nation-world/national/article76429037.html

[9] http://socialistappeal.org/news-analysis/editorials/1112-qis-capitalism-dyingq.html

[10] http://www.truth-out.org/news/item/37259-us-soldiers-are-relying-on-millions-of-dollars-in-food-stamps-to-survive

[11] A People’s History of Poverty in America, Pimpare

[12] http://www.cbsnews.com/news/the-surging-ranks-of-americas-ultrapoor/

[13] https://gsgriffin.com/2016/12/08/u-s-canadian-city-governments-ending-homelessness-by-offering-jobs/

[14] http://www.frontsteps.org/wp-content/uploads/2014/04/DiscriminationReport20141.pdf; http://www.huffingtonpost.com/bill-quigley/ten-facts-about-homelessn_b_5977946.html

[15] https://www.theatlantic.com/business/archive/2014/06/how-cities-use-design-to-drive-homeless-people-away/373067/

[16] https://www.usatoday.com/story/money/personalfinance/retirement/2017/05/10/punching-past-65-older-worker-rate-highest-since-1962/101447336/

[17] http://abcnews.go.com/US/hunger_at_home/hunger-home-american-children-malnourished/story?id=14367230; https://thinkprogress.org/ketchup-sandwiches-and-other-things-stupid-poor-people-eat-41617483b497/

[18] http://www.cpr.org/news/story/tooth-decay-silent-epidemic-especially-poor-kids-colo; Savage Inequalities, Jonathan Kozol

On Homeschooling

While citizens should have the freedom to homeschool their children, in the same way they should be free to choose private schools over public schools, that does not mean there are no disadvantages to such a choice, to individuals and society at large.

There are some 2 million homeschooled children in the United States today, roughly 3% of students. Parents cite several reasons for homeschooling their children, including the desire to provide “religious and moral instruction,” a “concern about the school environment,” and “dissatisfaction with the academic instruction” at schools. A 2009 Department of Education report revealed that 83% of homeschool parents named providing religious and moral instruction as one of their reasons for the practice. Almost 70% of homeschool families are white, and the National Home Education Research Institute believes about 70% are evangelical Christians. This is still largely the white, conservative Christian movement it was when it launched in the 1980s, though we can happily say it is becoming more diverse: there are more minority families now, more people choosing this route not because of religion but because of factors like racism in schools or the sad state of many poorly funded city school districts. However, the words of Tony Perkins, president of the Family Research Council, still explain why conservative evangelicals homeschool:

As a homeschooling parent myself, I understand the desire to give children an environment that affirms traditional values. The government has eliminated God from the classroom and too often replaced Him with an anti-life, anti-family curriculum that misses life’s deepest meaning.

Again, parents have the right to think this way and keep their children at home. Nevertheless, the central disadvantage of homeschooling lies in its very purpose. The true danger isn’t that kids will be isolated or socially inept; a few may be, but most homeschool children participate in sports, organizations, and other social outlets (though this is not as extensive as being among peers eight hours a day or having instant access to a broad array of free clubs, societies, and teams). Children being homeschooled against their will is a danger, too, as it can breed resentment of parents, but that is not universal. The real problem is that children are primarily exposed to a single worldview. And of course that is the whole point.

It’s a disappointing state of affairs. Consider how many teachers one has in a public K-12 education in the suburbs or the cities: perhaps 50-60. Each of these teachers has his or her own worldview and life experience, family background, job history, travels, religion, political beliefs, ethnicity, sexual orientation, income level, and, perhaps most importantly, degree in education. No reasonable person would suggest one or two parents, no matter how well-educated, could provide the depth of knowledge that 60 people with specialized degrees and experience could in physics, mathematics, the arts, history, English, and so on.

There is a reason we have multiple teachers from 6th grade up, rather than just one: it is a task no one person should have or could possibly be qualified for. This is not to say homeschool education can’t be successful (homeschool students often excel in college and have higher state test scores, as any child receiving one-on-one, individualized instruction should), but I believe that education will not be as strong or as well-rounded if coming from a single person with a single worldview and life experience. What a shame that homeschooled students have nearly no chance of learning about Islam from a Muslim, evolution from a trained biologist, or communism from a Marxist. To me it’s a shame, to others it’s the point. Instead, students are limited to a narrow perspective, which will at best provide instruction from a parent less qualified than someone with an actual degree in a particular subject, and at worst outright lies about the world (anti-evolution, anti-climate change) and intolerance toward certain people (homosexuals, trans Americans).

One might make the same point about the social value of having more extensive interaction with diverse students. Instead of primary interaction with siblings or other homeschooled children who hold the same religious, conservative ideas, wouldn’t it better prepare students for a diverse world, and help them think critically from multiple viewpoints, if they interacted daily with Hindus, atheists, and African Americans? This is not to say homeschooled students don’t meet and befriend such kids at scouts, ballet, or football, but public school classrooms provide much longer, broader interactions, in an academic environment. There is value in that.

We value the integration and interaction of public schooling over homeschooling for the same reason we value integration and interaction over racially segregated classrooms. As I write in my book:

Integration is our hope because it is only through interaction that we come to know the Other. Separation and isolation is a breeding ground for misunderstanding, misjudgment, fear, and hostility. Interaction is diminishing arrogance and eradicating hatred at every moment. White soldiers of the Civil War forsook prejudice and assisted their black comrades to relocate when the cannons finally quieted because they had served with and befriended those men of color. Religious fundamentalists come to accept homosexuals when they find themselves sitting next to each other and conversing. Young students’ fear of special needs children fades away the longer they share a classroom. Integration serves a moral and social purpose.

The public school classroom provides the most direct interaction of diverse students, encouraging acceptance and understanding. The primary reason to reject homeschooling is the primary reason to support public schooling.

Public schooling is a precious creation. Our tax dollars should provide equally and adequately funded schools that are free and open to the public, contingent only on geographic location. Geographic location is not perfect, as our living arrangements and thus our schools are still very much divided by race and class, but it provides the best opportunity for students to learn with and from others of all political persuasions, religions, sexual orientations, races, income levels, and dietary preferences. Interaction and integration will breed peace and understanding, as it always does. That is what I want my tax dollars to build and what I think students need to experience, not private, corporate-controlled, or home education. There are still many other challenges in the world of education, such as eliminating high-stakes testing or expanding democratic control of standards, but public education is worth preserving if we desire a more tolerant society.

Ben Carson Does Not Do Unto Others

Ben Carson said in September 2015, “I would not advocate that we put a Muslim in charge of this nation,” suspicious of any faith that is “inconsistent with the values and principles of America.” These words exploded in his face, plunging his presidential campaign into a firestorm of criticism from liberals and conservatives alike.

He quickly amended his comments, explaining that he meant he couldn’t support a Muslim candidate who hadn’t “renounced the central [tenet] of Islam: Sharia Law,” under which “homosexuals–men and women alike–must be killed. Women must be subservient. And people following other religions must be killed.” But he acknowledged “that there are many peaceful Muslims who do not adhere to these beliefs” he could support if they repudiated these edicts.

The plot thickened on October 3, when, after the nonprofit Council on American-Islamic Relations called for him to pull out of the presidential race, Carson sought revenge by pushing the I.R.S. to rescind the nonprofit’s tax-exempt status, claiming it violated rules about interfering in a campaign.

Come on, Ben. Your position can be dismissed as absurd the moment you remember that to be an ethical person, you must hold yourself to the same standards you hold others. You must give others the freedom you desire for yourself. The Golden Rule, some call it, a simple idea that is found in virtually all major world religions.

In Christianity, it’s found in the book of Matthew: “Do to others what you would have them do to you.” In Islam, it’s in the Hadith: “Not one of you truly believes until you wish for others what you wish for yourself.” Far older than either of these are the words of Confucius, who said in the Analects, “Do not do to others what you do not want done to yourself.”

If one were to suggest a Christian shouldn’t be president, or a black man shouldn’t be president, Carson would call this what it is: bigotry, hatred, ignorance. It’s amazing a black man is saying something like this. How long ago was it that whites could openly say a black man shouldn’t be president?

And if Christians, most of whom no longer take the Old Testament’s harshest laws seriously, do not have to publicly renounce that book before getting Ben Carson’s support, why should Muslims who don’t take the Koran’s harshest laws seriously have to?

The fact that extremist Islam is a much greater threat to humanity today makes no difference in terms of ethics. If more Muslims take primitive laws seriously than do Christians, the Golden Rule remains unchanged. Were Christian oppression and terror a greater threat, and Islam the main religion in the United States, peaceful Christians would still wish to run for office without fear of a witch hunt, of Islamic politicians trying to weed out Christian candidates like Ben Carson who have yet to condemn the Old Testament.

Also, one wonders if Carson would approve of a Muslim candidate fighting to see a Christian nonprofit taxed because the nonprofit called for the candidate’s withdrawal after anti-Christian remarks.

Simple role reversal is not difficult. Neither is a cursory examination of U.S. laws specifically designed to protect people from the kind of discrimination Carson envisions.

When asked if he thought Islam was compatible with the Constitution, Carson said, “No.” True, edicts about killing non-believers and homosexuals and such would violate Constitutional law, but so would any requirement of religious confession or renunciation. Article VI of the Constitution notes, using several absolutes, that “no religious test shall ever be required as a qualification to any office.” True, Carson was simply speaking of who he would personally support, not explicitly calling for such an official test. Yet if one can recognize when someone else’s views do not reflect the spirit, and letter, of the Constitution, one should just as easily be able to recognize when one’s own views make the same mistake.

Snowden

Edward Snowden wants to come home.

The former National Security Agency analyst says he has volunteered “many times” to cut a deal with the U.S. government that would allow him to return to the U.S. from Russia in exchange for a reduced prison sentence. “So far, they’ve said they won’t torture me,” Snowden said, “but we haven’t gotten much further than that.”

Snowden faces up to 30 years in prison for exposing details of the NSA’s massive domestic spying program two years ago. The intelligence files he leaked to the press revealed the government was keeping records of nearly 2 billion phone calls, text messages, and emails every day. The Patriot Act of 2001 opened the door to this sort of program.

Despite the fact that some Americans labeled Snowden a “traitor,” a massive public uproar against the government spurred by Snowden’s revelations pushed President Obama to terminate the spying program in June 2015.

Yet the charges against Snowden remain, charges filed under the old Espionage Act, used in World War I to throw critics of the war in prison.

The U.S. is willing to cut a deal with Snowden, but it remains to be seen what sort of reduced sentence the government will accept.

Getting to Know China

Ask most Americans what they know about China and the response would probably be fairly simple: “Biggest population…run by a Communist Party…the Great Wall…China creates products we use…we owe them a ton of money.”

But as China appears in more and more newspaper headlines, there is a new interest in learning more about the growing superpower across the Pacific. If you’re one of those wanting to dig deeper, here are 9 incredible facts about our Chinese friends to get you started.


1. CHINA WAS THE MOST ADVANCED SOCIETY FOR ALMOST ALL OF HUMAN HISTORY.

Chinese emperors ruled over more people than the Roman Empire, and constructed a larger road system as well, according to historian Chris Harman (A People’s History of the World). Building the Great Wall was a feat unparalleled in the ancient world, a structure at least 6,000 miles long, possibly 13,000 miles at its prime…that’s half the circumference of the Earth!

The Chinese were the first to produce cast iron, in the early 5th century B.C., then steel during the Northern Wei Dynasty (A.D. 386-557), modern paper during the Western Han Dynasty (202 B.C.-A.D. 9), the mechanical clock during the Tang Dynasty (618-907), moveable-type printing and the compass during the Song Dynasty (960-1279), and gunpowder in the 9th century. They were the first society to use gunpowder-based weapons.

2. FAMOUS CHINESE EXPLORERS PUT EUROPEAN EXPLORERS TO SHAME.

True, in 1492 Columbus sailed the ocean blue, hoping to cross the Atlantic and reach Asia. His first journey used three ships and fewer than 90 men. But Zheng He set sail in 1405 with 62 ships and 27,800 men. Over several voyages, he explored the waters of Southeast Asia, India, east Africa, and the Persian Gulf, and made it all the way to Mecca in Arabia. Zheng He came from a Muslim family.


3. CHINA ONLY HAS ABOUT 7 MORE YEARS AS THE WORLD’S MOST POPULOUS NATION.

China currently has a population of 1.4 billion, yet its population growth rate has fallen from 2.8% in 1951 to 0.6% today. India, which currently has 1.3 billion people, is estimated to pass China in population in 2022.


4. CHINA’S ECONOMY WILL SURPASS THE U.S. ECONOMY AS THE LARGEST IN THE WORLD IN 11 YEARS.

China already has the largest economy in terms of purchasing power (it has more people spending more money than the U.S. does, adjusting for currency value and cost of living). But by 2026, Chinese productive output will surpass that of the U.S. as well.


5. ISLAM IS THE MOST POPULAR RELIGION AMONG YOUNG PEOPLE IN CHINA.

According to research at Renmin University in Beijing, Islam has a larger proportion of followers under 30 (22.4%) than any other religion in China. Currently, China has some 23 million Muslims, more than some nations in the Middle East.


6. CHINA HAS HIGHWAYS THAT ARE 50 LANES ACROSS. 50!

And yet traffic jams are still a problem. On Wednesday, October 7, 2015, thousands of cars were delayed for hours on the G4 Beijing-Hong Kong-Macau Expressway as drivers were forced to merge from 50 lanes down to 20. Still, it was not as bad as a 2010 traffic jam that lasted 11 days and stretched back 60 miles.


7. AN ESTIMATED 1.6 MILLION CHINESE DIE EACH YEAR FROM POLLUTED AIR.

That’s about 4,400 people a day. As pollution from Chinese industry and energy use poisons and clouds their cities, China serves as an example of what will happen globally if carbon dioxide emissions go unchecked.


8. CHINA BUILDS MASSIVE CITIES WHERE NO ONE LIVES.

China constructs huge “ghost cities” partly to meet the needs and interests of construction companies and keep its economy surging, but also to prepare for the 300 million Chinese expected to move from rural areas into cities by 2030.


9. YET CHINA HAS SOME OF THE MOST BEAUTIFUL LANDSCAPES ON EARTH.

Not only is the northern slope of Mount Everest in China, and not only are the Great Wall and Forbidden City as stunning as they are famous; this massive nation is also home to many incredible lesser-known landscapes, such as the Yangshuo region.

Mike Rowe Attacks Sanders

On Sunday afternoon, December 13, 2015, Bernie Sanders posted on Facebook:

At the end of the day, providing a path to go to college is a helluva lot cheaper than putting people on a path to jail.

He included a graphic reading, “$80 billion: the amount we spend every year to lock up 2.2 million fellow Americans. Share if you support investing in education rather than incarceration.”

When Sanders tweeted a similar statement, without the graphic, on Sunday evening, it caught the eye of television host Mike Rowe, who criticized Sanders on Facebook.

Rowe perceived that Sanders sought to “imply that a path to prison is the most likely alternative to a path to college.” He questions “the increasingly dangerous idea that a college education is the best path for the most people,” lambasting “misguided parents” and others who perpetuate the idea that work that doesn’t require a college degree is inferior.

As if the fear of falling into an inferior career wasn’t bad enough, Rowe writes,

…it seems the proponents of “college for all” need something even more frightening than the prospect of a career in the trades to frighten the next class into signing on the dotted line. According to Senator Sanders, that “something,” is a path to jail.

Rowe implies Sanders is a “knucklehead” showcasing “arrogance and elitism,” reminds Sanders of “the number of college graduates with criminal records” and people in vocational careers without a degree who do not go to jail, insists Sanders’ post implies there is “no hope” for you if you don’t go to college, and that it

…will encourage more kids who are better suited for an alternative path to borrow vast sums of money they’ll never be able to pay back in order to pay for a degree that won’t get them a job.

To his credit, Rowe shares his thoughts in a mostly respectful, thoughtful manner, even acknowledging that “Maybe the 140 character limit has doomed [Sanders] to be misunderstood or taken out of context. Certainly, it’s happened to me.”

He speaks rightly of the need to dispel the idea that vocational, physical, or trade work is somehow inferior, a “consolation prize.” Further, there is truth in his claim that a college degree is not a surefire way to gainful employment.

According to the Economic Policy Institute, unemployment for young college graduates is 7.2% (14.9% work part-time but want full-time work) and their wages have fallen 2.5% since 2000. In 2014, a massive 46% of employed college graduates under 27 were working in a job that did not require a college degree. Further, a “non college” job is more likely, compared to 2000, to be cashier, server, or bartender than electrician, dental hygienist, or mechanic, a reflection of “a decline in the demand for ‘cognitive skills.’”

This is something Rowe should keep in mind: while the demand for “college jobs” may weaken, so can the demand for jobs he favors that require vocational training, leaving an army of young people in fast food or otherwise unskilled jobs they neither desire nor enjoy.

Sadly, Rowe doesn’t seem to understand what Bernie Sanders means when he writes about “providing a path to college.” Sanders wants to make public colleges and universities tuition free, saying elsewhere:

It is a national disgrace that hundreds of thousands of young Americans today do not go to college, not because they are unqualified, but because they cannot afford it… We have got to make sure that every qualified American in this country who wants to go to college can go to college—regardless of income.

Either Rowe didn’t know this, which is surely the case, or his post is full of contradictions. Remember, he writes that Sanders and others should not encourage young people to take on vast sums of debt; he rightly calls the $1.3 trillion in student loans an “obscenity.”

But of course, where Sanders is concerned, “college for all” is not a call for everyone to go to college because any alternative is inferior. It is a call to use the vast wealth of the nation to end the massive waste of human talent, potential, and freedom inherent in a system where Americans who want to go to college cannot because of finances or the fear of the huge loans Rowe condemns. Hence, college as a right offered free of cost, like K-12 public school education. In other words, it should be available for those who desire it.

Though the graphic Sanders included on Facebook was not on the Twitter post Rowe saw, it clarifies his point: the U.S. spends huge sums to lock people up, money that could instead cover the cost of college. It also suggests Sanders sees a need for prison reform. Anyone who knows anything about Sanders, for example, knows he opposes the mass imprisonment of nonviolent offenders, supporting the legalization of marijuana. He says:

Too many Americans have seen their lives destroyed because they have criminal records as a result of marijuana use. That’s wrong. That has got to change.

States tend to spend more on housing inmates than educating K-12 students, and some spend more on prisons than colleges and universities. In recent decades, expenses on prisons have skyrocketed, largely to make room for drug offenders.

One study found that while 48.8% of the U.S. population had some college credits or a degree, only 12.7% of the incarcerated population did. This is largely because high school dropouts are far more likely to end up in prison than high school graduates; the large majority of prisoners have no high school diploma. The policies that push students out of school and toward the justice system, such as overly harsh punishments and barriers to re-entering school, are known as the “school-to-prison pipeline.”

Rowe is correct that prison isn’t the most likely alternative to college, something Sanders did not say, though we can understand why Rowe thought he implied it. And of course, not graduating high school is a much larger part of the problem than not going to college, doing far more to perpetuate poverty, which breeds crime. Sanders is likely alluding to the fact that so many of our prisoners are poor and uneducated, factors closely bound together.

Still, there is no reason not to widen opportunities and make improvements in both K-12 schools and colleges, and to ease social conditions for those who attend both. Even though there is, as Rowe says, vocational work that can make people happy and financially secure, Americans with college degrees still earn higher incomes, are more likely to have a pension and health insurance provided by an employer, and are less likely to be unemployed.

It might be wise to listen to Sanders for a way to broaden opportunities for lower- and middle-income people and eliminate crippling student debt, by using resources for free college, not to lock up nonviolent people. It might be wise to listen to Rowe to end stigmatization surrounding workers without college degrees: they are not inferior, lazy, foolish, or any other harmful descriptor.

Talking to Dead People (and Other Candidate Oddities)

We all have skeletons in the closet, but some skeletons are scarier than others. Here are 8 weird stories about the 2016 presidential candidates. As they say, truth is stranger than fiction.


1. WHEN BERNIE SANDERS WROTE BAD SEX FICTION

In his 1972 article “Man — and woman,” Bernie Sanders wrote about men masturbating to thoughts of abused and bound women, and women fantasizing about being gang raped while having sex with their husbands.

A spokesperson for Sanders said it was “a dumb attempt at dark satire…attacking gender stereotypes in the 1970s.”

Sanders himself said, “It was very poorly written and if you read it, what it was dealing with was gender stereotypes, why some men like to oppress women, why other women like to be submissive, you know, something like ‘Fifty Shades of Grey.’”

Gives “Feel the Bern” a whole new meaning, doesn’t it?


2. WHEN JOHN KASICH WENT TO WAR WITH BLOCKBUSTER OVER A SINGLE MOVIE

Something went very wrong when John Kasich and his wife made an oopsie at Blockbuster in the late 1990s.

“[We] thought, What the heck are we watching here? It was billed as a comedy, but it wasn’t funny. It was graphic, and brutal, and completely unnecessary, and it rubbed us in so many wrong ways we had to shut the thing off right there in the middle.”

The movie? Fargo.

Finding the infamous “woodchipper” scene disturbing, Kasich had found his crusade. “I got on the phone to Blockbuster and demanded that they take the movie off their shelves.” He apparently worked out a “deal” with the manager, who agreed to label the movie “graphic content.” Kasich “took his business elsewhere.”

But when he caught wind the manager wasn’t keeping his end of the deal, Kasich pulled out the big guns: another phone call. “Karen had to tell me to back off because I was driving everyone crazy,” he said. He later regretted his “rantings of a wild man.”


3. WHEN DONALD TRUMP APPOINTED HIMSELF JUDGE, JURY, AND EXECUTIONER

Donald Trump has referred to African Americans as “the blacks,” said “laziness is a trait in blacks,” worried over “black guys counting my money” (according to the former president of Trump Plaza Hotel & Casino), and he was sued by the Justice Department for not renting to blacks, a case that ended in a settlement.

But nothing “trumps” the time in 1989 when he rushed to judgment about the rape of a white female jogger in Central Park. Trump took out full-page newspaper ads calling for the execution of the suspects, who were African American teenagers. Blacks and whites alike were outraged. The suspects were later exonerated.

Apparently, Trump dislikes the idea of “innocent until proven guilty.”

Also, black people.


4. WHEN HILLARY CLINTON TALKED TO IMAGINARY FRIENDS IN THE WHITE HOUSE

While she was the first lady, Hillary Clinton used to pretend to talk to Eleanor Roosevelt. Apparently, when Clinton told her about times getting tough, Eleanor would “usually respond by telling me to buck up or at least to grow skin as thick as a rhinoceros.”

Clinton was encouraged to do this “reflective meditation” by a New Age spiritual counselor named Jean Houston in 1996.

According to Bob Woodward (The Choice), she also talked to Gandhi.

Bill Clinton said in 2012, during the dedication of Franklin D. Roosevelt Four Freedoms Park:

A special thanks to the members of the Roosevelt family who are here. And the one who is not, Eleanor, who made sure that the four freedoms were included in the preamble to the Universal Declaration of Human Rights in 1948. I know that because, as all of you famously learned when I served as president, my wife, now the secretary of state, was known to commune with Eleanor on a regular basis. And so she called me last night on her way home from Peru to remind me to say that. That Eleanor had talked to her and reminded her that I should say that.

5. WHEN TED CRUZ WANTED TO GET BIG GOVERNMENT OUT OF OUR TOILETS

Say what you will about Ted Cruz, he is the only politician brave enough to take a stand for toilet freedom.

In 2012, he said on Glenn Beck’s radio show:

The federal government’s already shown that they believe they can control every aspect of our life. I mean, right now Congress is trying to tell us what kind of light bulbs to buy and what kind of toilets. Right now you are prevented from buying a toilet that actually flushes because the bureaucrats in Washington know better than you do.

In 2013, he condemned the “federal government that thinks they have the authority to regulate our toilet seats.”

The regulations have to do with sanitation and common courtesy for disabled people in public facilities: toilets must have a hinged lid and an adequate supply of toilet paper, there must be one toilet seat and one urinal per 40 workers, and most restrooms must have one toilet that is accessible for disabled persons.

Thank God Ted Cruz is around to stand up for justice.


6. WHEN BEN CARSON INSISTED THAT JOSEPH, NOT THE EGYPTIAN PHARAOHS, BUILT THE PYRAMIDS

In his quest to convince the American people one doesn’t actually have to be that smart to be a brain surgeon, Ben Carson confirmed in November 2015 that he still believes the ancient pyramids were built to store grain, not as tombs for Egyptian rulers.

In a 1998 speech that included, of all things, ignorance, history, and science, Carson said:

My own personal theory is that Joseph built the pyramids to store grain. Now all the archeologists think that they were made for the pharaohs’ graves. But, you know, it would have to be something awfully big if you stop and think about it. And I don’t think it’d just disappear over the course of time to store that much grain.

And when you look at the way that the pyramids are made, with many chambers that are hermetically sealed, they’d have to be that way for various reasons. And various of scientists have said, “Well, you know there were alien beings that came down and they have special knowledge and that’s how — ” you know, it doesn’t require an alien being when God is with you.

In other words, no need to trust decades of archaeological discoveries. If the Bible said Joseph had to store grain, well, why not assume, without a shred of evidence, it was in the pyramids? Just don’t go too far with your superstition. That alien stuff is cray.


7. WHEN MIKE HUCKABEE’S SON BUTCHERED A DOG AND HUCKABEE COVERED FOR HIM

In 1998, 17-year-old David Huckabee participated in the torture and hanging of a stray dog at a Boy Scout camp in Arkansas, a misdemeanor, not a felony. Animal rights groups were enraged, and a local prosecuting attorney requested the Arkansas state police’s help in the investigation.

But Mike Huckabee, the governor of Arkansas, worked to keep that from happening.

John Bailey, then the director of Arkansas’s state police, tells NEWSWEEK that Governor Huckabee’s chief of staff and personal lawyer both leaned on him to write a letter officially denying the local prosecutor’s request. Bailey, a career officer who had been appointed chief by Huckabee’s Democratic predecessor, said he viewed the lawyer’s intervention as improper and terminated the conversation. Seven months later, he was called into Huckabee’s office and fired. “I’ve lost confidence in your ability to do your job,” Bailey says Huckabee told him. One reason Huckabee cited was “I couldn’t get you to help me with my son when I had that problem,” according to Bailey. “Without question, [Huckabee] was making a conscious attempt to keep the state police from investigating his son,” says I. C. Smith, the former FBI chief in Little Rock…

The state police did not grant the request. No charges were ever filed against David Huckabee.


8. WHEN LINDSEY GRAHAM DIDN’T SEND AN EMAIL…EVER

Lindsey Graham spent 8 years as a representative and 12 years as a senator, yet somehow he’s found a way to grandma his way out of the 21st century.

In 2015 he said to Chuck Todd on NBC, “I don’t email. No, you can have every email I’ve ever sent. I’ve never sent one.”

Sanders Asks Youth to ‘Prove Them Wrong’

On January 13, 2016, the Bernie Sanders campaign launched its Prove Them Wrong website, which calls on young people, especially any 17-year-old Iowan who will turn 18 by November 8, to pledge to “caucus” (vote) for Bernie Sanders in the Iowa caucus on February 1.

To participate in the Iowa Democratic caucus, students and Iowans in general must register as a Democrat and determine their voting location.

Prove Them Wrong declares:

They say you don’t care

They say you won’t caucus

They say Bernie can’t win

Prove them wrong

“They” refers to Americans who believe youth are apathetic about politics and voting. In a video to Iowa students, Sanders said the caucus

…gives you a unique opportunity to play a very big role in national politics… What your job is about is to raise the issues that are on your mind… Are we doing enough in terms of social justice in this country, combating racism and sexism and homophobia? How do we move forward to make sure the United States leads the world in combating climate change? What do we do about the high rate of childhood poverty in this country?

Sanders has a knack for attracting young potential voters, who support his progressive liberal platform that includes higher taxes on the extremely wealthy to pay for free college tuition, jobs programs, and universal health care; he also favors a higher minimum wage and fewer wars overseas.

The senator from Vermont is most popular among 18-29 year olds, according to The Guardian, amassing an enormous following of passionate supporters who flood the Internet with hashtags like #FeelTheBern and #BabesForBernie. In fact,

Of all those running for president, Sanders has the highest-level engagement on his individual Facebook posts, according to social media monitor CrowdTangle. He has the largest number of people liking his messages, sharing his thoughts, and commenting on his plans.

Sanders has 2.3 million likes on his presidential campaign Facebook page (2.8 million on his U.S. senator page), more than his closest rival Hillary Clinton. He is consistently the most-searched-for candidate on the web during the Democratic debates.

Sanders reached 2.5 million campaign donations faster than any presidential candidate in U.S. history, raised a fortune from ordinary Americans while refusing money from corporations and billionaires, and has attracted far more enormous crowds than Clinton and other candidates. He was voted the most popular senator in the nation.

Yesterday Sanders pulled ahead of Clinton in Iowa polls. As Barack Obama did in 2008, Sanders may garner a huge youth turnout that could help him win the state and propel him toward the White House.

Supreme Court to Rule on Obama’s Immigration Order

The U.S. Supreme Court will determine in 2016 the constitutionality of President Barack Obama’s recent executive action on immigration.

Obama’s 2014 Deferred Action for Parents of Americans program would allow up to 5 million illegal immigrants, parents of U.S. citizens or lawful permanent residents here for five years, to remain in the U.S. on work permits. They would not have legal status, but they would be exempt from deportation (thus the “deferred action” status).

Twenty-six states, all with Republican governors, have challenged the executive order on legal grounds. So far, a district court in Texas and the U.S. Court of Appeals for the Fifth Circuit have sided with the Republicans, and Obama’s Justice Department has filed with the Supreme Court to reverse the decisions of the lower courts.

Texas not only argues the executive action violated federal law and the Constitution, it opposes having to spend millions to issue driver’s licenses to the half million Texas parents who would be eligible. The Obama administration declared Texas would not be forced to do this.

The decisions of the lower court halted Obama’s plan. The president would have the remaining months of his term to implement the program if the Supreme Court sides with him in June.

Obama’s earlier and similar executive action on immigration has not been challenged. The 2012 Deferred Action for Childhood Arrivals program protected children brought illegally to the U.S. Over 720,000 people have thus far been shielded from deportation through that order, and Obama’s 2014 order would also expand the program.

Obama wants to give the undocumented immigrants the opportunity to “come out of the shadows” and have access to legal work. The conservative states have successfully argued in the lower courts that access to legal work also gives illegal immigrants access to “Social Security, Medicare, tax credits, and unemployment benefits,” though of course through legal work the undocumented would also be paying into those systems.

The summer decision will arrive in the middle of the 2016 presidential race, which has already generated fierce debate over the solution to illegal immigration.

Human Rights Campaign Endorses Clinton, Internet Explodes

On Tuesday, January 19, 2016, the Human Rights Campaign, the most prominent LGBTQ civil rights organization in the U.S., announced which Democratic candidate it was endorsing in the 2016 campaign.

The organization declared on Facebook, just after 6:00 in the morning (CST): “Senator to Secretary of State to Presidential Candidate, Hillary Clinton is a proven champion for LGBT equality. HRC is proud to announce its support for Hillary Clinton for President.”

At about 6 p.m., the post had about 5,600 likes and 2,440 shares. The comments section, on the other hand, was subject to a relentless assault by invading Bernie Sanders supporters.

There were 6,600 comments at 6 p.m., the vast majority displeased with the announcement.

A sample of 402 comments posted between 1:18 and 1:44 p.m. revealed near-unanimous opposition to the endorsement, with most writers supporting Sanders. 16 comments either supported the Clinton endorsement or objected to Bernie Sanders (4% of the sample). 7 comments (under 2%) were too vague or strange to determine an opinion. 379 comments, or 94%, condemned the endorsement and/or declared support for Sanders. The top comment, expressing disappointment and praising Sanders, had 5,870 likes.
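For anyone double-checking the sample arithmetic, the three groups of comments do add up to the full sample, and the stated percentages follow from the counts. A minimal Python sketch (variable names are mine; the counts come from this post’s own tally):

```python
# Tallies from the 402-comment sample described above.
total = 402
pro_endorsement = 16     # supported the endorsement or objected to Sanders
unclear = 7              # too vague or strange to categorize
anti_endorsement = 379   # condemned the endorsement and/or backed Sanders

# The three groups account for every sampled comment.
assert pro_endorsement + unclear + anti_endorsement == total

print(round(100 * pro_endorsement / total))   # 4 (percent)
print(round(100 * unclear / total, 1))        # 1.7, i.e. under 2 percent
print(round(100 * anti_endorsement / total))  # 94 (percent)
```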

Many visitors declared their donations to the organization would immediately cease. A typical comment read: “And this is what will make me stop my monthly contribution to HRC. Nice to see your organization is purely corporate and has no interest in the candidate’s integrity or history when it comes to LGBT issues.”

Criticisms included Clinton’s support for anti-gay rights legislation such as the Defense of Marriage Act and Don’t Ask Don’t Tell in the 1990s, her opposition to gay marriage until 2013, her relationship with Human Rights Campaign President Chad Griffin, who worked as a junior aide in Bill Clinton’s White House, and both her and the Human Rights Campaign’s support from large corporations (corporations Sanders is constantly attacking).

These were compared with Sanders’ call for the end of laws against homosexual behavior in Vermont while running for governor in 1972, his support for Burlington’s first gay pride march and gay pride day, his attack on anti-gay housing discrimination, his vote against DOMA and DADT, and his backing of the first legal civil unions in the nation in Vermont in 2000. Sanders approved publicly of gay marriage in 2009.

A spokesman for the Sanders campaign said, “It’s understandable and consistent with the establishment organizations voting for the establishment candidate, but it’s an endorsement that cannot possibly be based on the facts and the record.”

A petition pushing for the Human Rights Campaign to retract its endorsement and give it to Sanders quickly appeared on Change.org.

Why Iowa?

Iowa is the first state to vote for its Democratic and Republican presidential nominees by simple chance. It’s only been first since 1972, when Iowa officials planning the state convention had to reschedule the event on an earlier date because there were no hotel rooms available in Des Moines on the weekend in June they had selected.

Moving the state convention up meant moving the district conventions, county conventions, and caucus up as well, and thus the caucus ended up in January, ahead of New Hampshire’s primary. Very little time was spent campaigning in Iowa in 1972, but in 1976, Jimmy Carter invested time there and built momentum that helped him eventually win the White House.

Iowa has been guarding its privileged position ever since, and candidates compete fiercely for the momentum Carter tapped. If a state tries to move up its caucus or primary, Iowa moves its own up even earlier. Iowa’s first-in-the-nation vote is often criticized because Iowa is a rural, very white state: it doesn’t reflect the demographics or lifestyle of the country as a whole.

At the Toss of a Coin

In possibly the most exciting Iowa Democratic caucus in U.S. history, Hillary Clinton took Iowa with 49.9% of the vote to Bernie Sanders’ 49.6%. Iowa Democratic Party chairman Andy McGuire called it “the closest in Iowa Democratic history.”

Even more incredible, six Iowa precincts were decided by coin flip Monday night, February 1, 2016. Who knew? Iowa rules stipulate that when a precinct has an odd number of delegates up for grabs and the vote is a virtual tie, the final delegate must be decided by coin toss.

In an amazing stroke of luck, Hillary Clinton won all six coin tosses. One has only about a 1.6% chance (1 in 64) of winning six coin tosses in a row.
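The arithmetic behind that figure is simple: each fair coin toss is an independent 50/50 event, so the odds of winning six straight are 0.5 raised to the sixth power. A quick sketch:

```python
# Probability of winning six independent fair coin tosses in a row
p = 0.5 ** 6

print(p)                  # 0.015625 -- exactly 1/64
print(f"{p * 100:.1f}%")  # 1.6%
print(f"1 in {int(1 / p)}")  # 1 in 64
```

So "1.6%" is a rounded figure; the exact probability is 1 in 64.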

One of the coin tosses was rife with controversy. The Des Moines Register reports:

A total of 484 eligible caucus attendees were initially recorded at the site. But when each candidate’s preference group was counted, Clinton had 240 supporters, Sanders had 179 and Martin O’Malley had five (causing him to be declared non-viable).

Those figures add up to just 424 participants, leaving 60 apparently missing. When those numbers were plugged into the formula that determines delegate allocations, Clinton received four delegates and Sanders received three — leaving one delegate unassigned.

Unable to account for that numerical discrepancy and the orphan delegate it produced, the Sanders campaign challenged the results and precinct leaders called a Democratic Party hot line set up to advise on such situations.

Party officials recommended they settle the dispute with a coin toss.

Clinton declared victory Monday night with 95% of the votes tallied and a 0.2% lead. “Thank you all so much,” she said to a cheering crowd. “What a night. What a great campaign this has been.” She promised, “I am a progressive who gets things done for the people.”

Sanders’ reply? “The political revolution is just starting. Tonight we accomplished what the corporate media and political establishment once believed was impossible. Don’t underestimate us.” Indeed, Sanders was polling in single digits in Iowa not too long ago.

The race moves on to New Hampshire, which hosts its primary on February 9. Sanders holds a colossal lead over Clinton, 61% to 30%.

Republican candidate Ted Cruz came out on top in Iowa with 27.9% of the vote. His closest rival, Donald Trump, got 24.3%, followed by Marco Rubio with 23.1%. The latest New Hampshire polls for Republicans? “Trump with 30 percent, followed by Cruz, 12 percent; Rubio, 11 percent,” according to the Washington Post.

Norovirus Hits Kansas City

Some 600 Kansas Citians fell ill in mid-January 2016 when the New Theatre Restaurant in Overland Park, Kansas, was hit by norovirus.

Norovirus is commonly called “food poisoning” or “stomach flu,” its symptoms being stomach pain, nausea, diarrhea, and vomiting. It often spreads quickly due to improper hand washing, which contaminates people, food, drinks, and surfaces.

The event garnered national media attention, from The Guardian and ABC News, for example.

On Friday, January 29, 2016, a disinfecting and decontamination company cleaned New Theatre for a massive $40,000. Stomach flu outbreaks can, of course, cost businesses customers on top of the cleaning costs, though the New Theatre vice president for sales and marketing claims the theater continues to see “full houses.”

On Monday, KCTV 5 reported a Buffalo Wild Wings at 105th Street and Metcalf Avenue saw 10 sick patrons and staffers. Samples are under examination to determine if this is also norovirus; the Kansas Department of Agriculture recently cited the restaurant with 18 food safety violations. The restaurant is temporarily closed for cleaning.

Dr. Dana Hawkinson, an infectious disease specialist at the University of Kansas Hospital, says there is a possibility the two outbreaks are linked.

“Certainly there’s always a possibility that there could be a link, especially since they are so close in proximity. That would be reasonable,” Hawkinson said.

Bernie Rising

With New Hampshire residents poised to flock to the voting booths this Tuesday, February 9, 2016, polls show Bernie Sanders should win by a landslide.

After the near-tie in the Iowa caucus (Clinton won 49.9% to Sanders’ 49.6%), a February 3 poll by NBC News/Wall Street Journal/Marist showed Sanders crushing Clinton 58% to 38% among likely Democratic voters in New Hampshire.

A February 4 poll from the University of Massachusetts-Lowell/7 News showed Sanders with an even more massive lead, 63% to Clinton’s 30%.

Clinton has lost much support since last spring, when she dominated New Hampshire polls. A writer for Mother Jones suggests this calls into question the idea that Sanders is only winning because the state neighbors his home state of Vermont, as does the fact that New Hampshire tends to vote for more “establishment” candidates, not progressive outsiders. In other words, his support grows due to his ideas, not home field advantage.

Sanders’ insurgent campaign made similar gains before the closest result in Iowa history (coin tosses were involved), and has now eroded Clinton’s national lead as well. Formerly up by 31 points nationally, Clinton now leads Sanders by just 2 points (44% to 42%), according to a Quinnipiac poll released Friday. Given the margin of error, it’s neck and neck.

New Hampshire’s primary on Tuesday comes after last night’s tense, heated Democratic presidential debate, during which Clinton lambasted Sanders for his “artful smear” campaign insinuating “anybody who ever took donations or speaking fees from any interest group has to be bought” and that politicians who take “donations from Wall Street,” like Barack Obama and herself, are not “progressive.”

Clinton found it “quite amusing” that Sanders would call her part of the “establishment,” because “a woman running to be the first woman president” could not be part of the establishment.

Sanders countered:

What being part of the establishment is…is in the last quarter, having a super-PAC that raised $15 million from Wall Street, that throughout one’s life raised a whole lot of money from the drug companies and other special interests. To my mind, if we do not get a handle on money in politics and the degree to which big money controls the political process in this country, nobody is going to bring about the changes that is needed in this country for the middle class and working families.

This exchange is common in the race thus far, and will likely be repeated. Sanders refuses donations from corporations and the wealthy, instead building a grassroots campaign on small donations from individuals and unions. Clinton’s top donors are big banks and corporations, and despite her challenge to critics like Sanders to “just name one” instance where money influenced her vote, her quid pro quo relationship with corporate power is so well-documented other senators talk about it openly.

Even so, Sanders raised $20 million in January, to Clinton’s $15 million.

For the Glory of the City

Perhaps the news will break in The Kansas City Star some day that the Kansas City council, or a secret independent body, dispatched moles to the nation’s top magazines, newspapers, and websites.

Their mission was clear: find work in staff and editorial positions and begin a propaganda campaign. The glorification of Kansas City would lead to a mass migration of people and businesses, an economic and cultural renaissance, fame and fortune!

Hmmm, or maybe more Kansas Citians are simply finding national media jobs–completely devoid of conspiracy. Or perhaps the explanation for all this talk of Kansas City is even better. Perhaps this city, the geographic heart of America, really is that awesome.

Either way, let’s consider either how awesome Kansas City is or how well natives are peddling Kansas City propaganda. Up to you which one.

(Full list of KC honors here.)

 

HUFFINGTON POST

  • Named Kansas City the #1 “coolest” city in the U.S. to “visit right now.” We’re an “it” city with “hip” neighborhoods. (2014)
  • Put KC at #10 of the top cities for creatives. Called it “a creative hub to rival those in the northeast and on the West Coast,” cited 70 art galleries in the Crossroads Art District alone, the Nelson-Atkins Museum of Art, and the Kemper Museum of Contemporary Art. But we ranked below Des Moines and @&#$ing Kalamazoo. Thanks, moles. (2015)
  • Ranked KC #4 among the “most cultured cities.” That’s better. We don’t mind losing to New York. (2015)
  • Other honors over the past few years include #4 for best cities for newlyweds, #7 for college grads, and more. (2012)

 

FORBES

  • Forbes put Kansas City on its list of “Best Places for Business and Careers.” And who would know better than they? (2012)
  • Named us one of the 10 best cities to buy a home. (2012)
  • Called downtown one of America’s best. Forbes doesn’t seem to like specific rankings these days. (2011)
  • Things used to be different. #9 in “Top 10 Best Cities to Get Ahead,” #8 for commuters, #6 for couples, #16 for the outdoors, #7 for pet-friendly cities, #7 for volunteering. (2007-2009)

TRAVEL+LEISURE

  • Named KC #2 among “America’s Best Beer Cities.” Thank you, Boulevard. (2015)
  • Honestly, the agents at Travel + Leisure need to chill out or they’re gonna get made. Between March 3 and April 24 of 2015, they included Kansas City in “America’s Most Charming Cities,” “America’s Best Cities for Sweet Tooths,” “America’s Best Music Scenes,” “America’s Best Cities for Foodies,” and “20 Quirkiest Cities in America.” Wow.
  • Kansas City was America’s #3 favorite city. This one was by popular vote. We were voted #1 for good drivers, Christmas lights, affordability, flea markets, and BBQ. Hahaha, “good drivers.” They’ve been undercover too long. (2014)
  • #7 for coffee. #9 for picnics. (2014, 2015)

 

BUZZFEED

  • Some of what you find on Buzzfeed relating to Kansas City is “community member” posts–some random people made them, not the Buzzfeed staff. Still, the popular website is starting to take notice of the Paris of the Plains. Kansas City was included in their “29 Cities All Twentysomethings Should Pick Up and Move To” (admittedly, also determined by popular vote). KC was listed #1 (admittedly, they did use the phrase “in no particular order”). Still, we know why it was on top. Wink. (2015)
  • I won’t name names, but it’s clear who the mole at Buzzfeed is. He wrote “An Open Letter to Kansas City” during the Royals’ World Series run. (2014)
  • He also wrote “28 Signs You Grew Up in Kansas City.” Buzzfeed labeled it a “Top Post,” as it got over 367,000 views. (2013)
  • His name is Dan Oshinsky.

 

EXTRA AWESOMENESS

  • USA Today published Yelp’s ranking of the Nelson-Atkins as the #1 museum in the United States of America–it was the highest rated and most praised by visitors. #5 was the National World War I Museum. (2015)
  • #2 for “Up-and-Coming Downtowns” by Fortune. (2014)
  • #6 for friendliest cities by Men’s Health. (2014)
  • #8 among best burger cities by USA Today. (2012)
  • #5 among “Underrated Gay-Friendly Cities” by About.com. (2009)
  • #3 for best NFL stadiums by Fox Sports. (2006)
  • #2 for best NFL fan loyalty by American City Business Journal. Go Chiefs! (2006)

Bernie Sanders Can Still Win

On Super Tuesday 2016, Hillary Clinton dominated six Southern states and won one New England state, Massachusetts, by a 1% margin. Bernie Sanders won four states soundly: Colorado, Minnesota, Oklahoma, and Vermont.

The delegate count now stands at Clinton’s 577 to Sanders’ 386 (superdelegates aside).

Despite the scoreboard–and immediate establishment media talk of his doom–Sanders still has an opportunity to win the Democratic nomination, should his popularity continue to grow and his supporters charge the polls in the next primaries and caucuses.

Consider that on Super Tuesday,

Even in the states where Clinton won handily, like Texas, Virginia, and Georgia, Sanders still won handily with his core constituencies–voters aged 18 to 29, first-time primary voters, and independents. According to NBC News’ exit polls, Sanders won young voters by a 30-point margin in Texas, 39 points in Virginia, 13 points in Georgia, and even captured the youth vote in Clinton’s home state of Arkansas, where Bill Clinton served as governor, by 24 points.

Among first-time primary voters, Sanders won by, again, 30 points in Texas and 8 points in Virginia. And Sanders captured independent voters by 16 points in both Texas and Virginia, 3 points in Georgia, 13 points in Tennessee, and 17 points in Arkansas.

Only 15 of the 50 states have voted. As Melissa Cruz writes, “Taking into account both delegates and superdelegates, about 75 percent of delegates are still up for grabs. If superdelegates are not accounted for, roughly 64 percent of delegates are left within the Democratic primary election.”

Though it will be a battle, Bernie Sanders can still win.

Importantly, Clinton’s national lead over Sanders has disappeared, and in some polls he’s beating her by a slim margin. That balance of support–made obvious in the first three Democratic contests–will likely become evident again as the race moves forward, out of the South and into states with the largest offerings of delegates.

Even before the crucial Super Tuesday victories, Sanders proved he could do well in blue collar, Midwestern states like Iowa (where he lost by less than 1%), New England states like New Hampshire (where he crushed Clinton), and in the Southwest (he lost Nevada by just a few percentage points).

Cruz writes, “Sanders still has a strong chance in many of the blue collar states coming up, such as Michigan, Illinois, and Pennsylvania,” which offer 147, 182, and 210 delegates respectively. Ohio has 159 delegates, Indiana 92, and the highly progressive Wisconsin 96.

Even a Mother Jones writer who thinks “Bernie Sanders is in a whole lot of trouble” admits:

If he can roll with the punches, he just might make it to the sweet spot of the schedule, a four-week, 15-state stretch that represents his last best shot to turn things around, starting with Idaho, Utah, and Arizona on March 22.

This four-week stretch not only includes Southwestern states Sanders showed he can compete in but also the very liberal Washington, with its 118 delegates. Tom Cahill agrees that

…he faces a much more favorable electorate in states voting after March 15. If Sanders stays within 150 delegates by that benchmark, he can potentially narrow Clinton’s lead in the spring and overtake her in the summer as Sanders-favorable coastal states take to the polls.

Coastal states (Washington included) like the very liberal California, with its whopping 546 delegates awarded in June, could swing Sanders’ way and send him toward the White House. He has a good chance to win New England states like Rhode Island, Connecticut, and Delaware.

Like his passionate fan base, Sanders’ war chest is only continuing to grow. He’s raised $137 million in the campaign so far, and as the Mother Jones writer notes,

Sanders raised an absurd $42 million in February—$6 million of it on the Monday after the South Carolina blowout. Because he relies so heavily on small-dollar donors who haven’t hit the $2,700 limit, he can in theory keep circling back for more money to buy ads and build organizations in every state that comes up.

Jeffrey Lord Defends Belief the KKK is a “Leftist” Group With “Progressive Agenda”

On Wednesday, March 2, 2016, Donald Trump supporter and former Reagan administration official Jeffrey Lord shared a tense and heated exchange with CNN political analyst Van Jones for the second day in a row.

The previous day’s quarrel, sparked as the Super Tuesday election results rolled in, saw Jones criticize Trump for being slow to disavow supporters who belong to white supremacist groups like the Ku Klux Klan (Lord insisted Trump disavowed them “many, many, many times.”)

Trump, Jones said, “is whipping up and tapping into and pushing buttons that are very, very frightening to me and frightening to a lot of people. Number one, when he is playing funny with the Klan, that is not cool.”

Lord quickly called the Klan “a leftist terrorist organization.”

Jones at first steered away from “playing that game,” but Lord persisted, saying, “It’s wrong to understand that these are not leftists…they were the military arm, the terrorist arm of the Democratic Party, according to historians. For God’s sake, read your history.”

Jones countered, “I don’t care how they voted 50 years ago. I care about who they killed.”

Lord then attempted to connect the Klan’s association with southern Democrats in U.S. history to liberalism today. After Jones pointed out offensive things Trump said of minorities, Lord replied: “Van, but what you’re doing right here is dividing people. We’re all Americans here, Van. You are dividing people. This is what liberals do. You’re dividing people by race.”

“The Klan kill people by race,” Jones said.

“And they did it–they did it to further the progressive agenda. Hello?” Lord exclaimed.

An incredulous Jones called that idea “absurd.”

On Wednesday, Lord and Jones were back on CNN for more.

Jones said:

You know, I don’t understand why the right wing is so obsessed with trying to point out that the Ku Klux Klan, you know, 50, 60, 70 years was a part of the Democratic Party. The Democratic Party in that time was a racist party and there were violent elements. That is true because, obviously, the Republicans at that time were the party of Lincoln, who ended slavery. But we’ve had a reversal over these past 50 [years]–my entire lifetime.

Lord doubled down on his Tuesday comments:

My point is that race fuels the progressive movement and has always fueled the progressive movement. Whether it was slavery, segregation, lynching, the Ku Klux Klan, to today’s racial quotas, illegal immigration by skin color. You know, groups like La Raza, the Black Panthers, Black Lives Matter, et cetera, it’s always about let’s divide people by race and then here is the progressive agenda that we want to enact.

Jones quickly asked if “you are going to say that the people who are dividing America by race were progressives, were liberals?”

“Yes.”

“It was not progressives that were trying to keep slavery in place,” Jones insisted. “It was not progressives that were trying to keep segregation in place… There is this weird strain now on the right that tries to pretend that their hands are completely clean when it comes to race.”

After much bickering and crosstalk, Lord said, “The reason we needed those civil rights laws in the 1960s is because the Democratic Party went out of their way to undermine the civil rights laws.”

Jones drew a distinction between the southern Democrats of decades past and the Democratic Party today, echoing his previous comment on a “reversal” of ideology between the two major political parties:

There is something wrong with this particular view that because horrible racist Dixiecrats in the South did horrible racist things–and they did horrible racist things for a long time. In fact–

“For political reasons,” Lord interjected.

Jones continued:

Let me finish. For a long time they did horrible racist things. 50, 60 years ago. I say that is horrible. But guess what, those people left the Democratic Party and they joined your party. That is the problem. No, they literally left your party–

“No. That is simply not true,” Lord countered.