Big Government Programs Actually Prevent Totalitarianism

There is often much screaming among conservatives that big government programs — new ones like universal healthcare, universal college education, or guaranteed work, and long-established ones like Social Security, Medicaid, and Medicare — somehow lead to dictatorship. There is, naturally, no actual evidence for this. The imagined correlation is justified with nothing beyond “that’s socialism, which always becomes totalitarianism,” an ignorance already addressed. The experience of advanced democracies around the world, and indeed the U.S. itself, suggests big government programs, run by big departments with big budgets and big staffs helping tens of millions of citizens, can happily coexist alongside elected governing bodies and presidents, constitutions, and human rights, as one would expect.

Threats to democracy come from elsewhere — but what’s interesting to consider is how conservatives have things completely backward. Big government programs — the demonstration that one’s democracy is a government “for the people,” existing to meet citizen needs and desires — are key to beating back the real threats to a republic.

In a recent interview with The Nation, Bernie Sanders touched on this:

“Why it is imperative that we address these issues today is not only because of the issues themselves—because families should not have to spend a huge proportion of their income on child care or sending their kid to college—but because we have got to address the reality that a very significant and growing number of Americans no longer have faith that their government is concerned about their needs,” says the senator. “This takes us to the whole threat of Trumpism and the attacks on democracy. If you are a worker who is working for lower wages today than you did 20 years ago, if you can’t afford to send your kid to college, etc., and if you see the very, very richest people in this country becoming phenomenally rich, you are asking yourself, ‘Who controls the government, and does the government care about my suffering and the problems of my family?’”

Sanders argues that restoring faith in government as a force for good is the most effective way to counter threats to democracy.

And he’s right. Empirical evidence suggests economic crises erode the rule of law and faith in representative democracy. Depressions are not the only force that pushes in this direction, but they are significant and at times a killing blow to democratic systems. Unemployment, low wages, a rising cost of living — hardship and poverty, in other words — drive citizens toward extreme parties and voices, including authoritarians. Such leaders are then elected to office, and begin to dismantle democracy with support of much of the population. Europe in the 1930s is the oft-cited example, but the same has been seen after the global recession beginning in 2008, with disturbing outgrowths of recent declining trust in democracy: the success of politicians with demagogic and anti-democratic bents like Trump, hysteria over fictional stolen elections that threatens to keep unelected people in office, and dangerous far-right parties making gains in Europe. The Eurozone and austerity crisis, the COVID-induced economic turmoil, and more have produced similar concerns.

What about the reverse? If economic disaster harms devotion to real democracy and politicians who believe in it, does the welfare state increase support for and faith in democracy? Studies also suggest this is so. Government tackling poverty through social programs increases satisfaction with democratic systems! The perception that inequality is rising and welfare isn’t doing enough to address it does the exact opposite. A helping hand increases happiness, and is expected from democracies, inherently linking favorable views of republics and redistribution. If we wish to inoculate the citizenry against authoritarian candidates and anti-democratic practices within established government, shoring up loyalty to democracy through big government programs is crucial.

It is as Sanders said: the most important thing for the government to do to strengthen our democracy and even heal polarization (“Maybe the Democrats putting $300 per child per month in my bank account aren’t so evil”) is simply to help people. To work for and serve all. Healthcare, education, income support, jobs…such services help those on the Right, Left, and everyone in between. This should be done whether there is economic bust or boom. People hold fast to democracy, a government of and by the people, when it is clearly a government for the people. If we lose the latter, so too the former.

For more from the author, subscribe and follow or read his books.

COVID Proved Social Conditions Largely Determine Our Health

In the past year, it has been heavily impressed upon Kansas Citians that one’s health is to a significant degree determined by factors beyond one’s control. The COVID-19 era is a key moment to further break down the reactionary notion that personal health choices are all that stands between an individual and optimal physical and mental well-being. It’s broadened our understanding of how health is also a product of social conditions.

The first and most elementary fact to note is that viruses, while often hitting vulnerable populations such as the elderly hardest, are not entirely discriminatory. They end the lives of the young and healthy as well. Regardless of one’s habits of eating, exercise, or not smoking, random exposure to illnesses new or old as one shops for groceries or rides in an Uber helps introduce the point: The environment often makes a mockery of our personal choices, as important as those are.

The family you are born into, where you grow up, and other factors beyond your control — and often your own awareness — have a large impact on your development and health as a child, which in turn impacts your health as an adult. (And the environment you happen to be in continues to affect you.) Poverty, extremely stressful on the mind and body in many ways, is the ultimate destructive circumstance for children and adults alike. Take the disturbing life expectancy differences between the poor and the better-off, for instance. In Kansas City’s poorest ZIP codes, which are disproportionately black, you can expect to live 18 fewer years on average compared to our richest, whitest ZIP codes, as Flatland reported on June 22. Poor families are less likely to have health care offered by an employer or be able to afford it themselves. They live in social conditions that include more violence or worse air and water pollution. They can at times only afford housing owned by negligent landlords slow to take care of mold, and cope with a million other factors.

During the pandemic, what serious observers of the social determinants of health predicted came true: Black Kansas Citians were hammered by COVID-19. Here we feel, today, the cold touch of slavery and Jim Crow, which birthed disproportionate poverty, which nurtured worse health, which resulted in Black Kansas Citians being more likely to catch coronavirus and die from it, as The Star reported even in the early stages of the pandemic. Worse still, on Feb. 24, the paper noted that richer, whiter ZIP codes — the areas of less urgent need — were getting disproportionately more vaccines than poorer areas with more Black residents. The vaccines were first shipped by the state to health centers that were convenient for some but distant from others.

Imagine history and race playing a role in your health, how soon you could get a shot. Imagine transportation options and where you live being factors. Likewise, imagine the kind of job you have doing the same: Lower-income workers are more likely to have front-line jobs at restaurants and grocery stores, where you can catch the virus. The privileged, better-off often work from home.

Whether it is drinking water you don’t know is unsafe or working at a job that requires much human contact during a pandemic, the determinants of health stretch far beyond exercising, eating right, and choosing not to smoke. To reflect on this fact is to understand a moral duty. If social conditions affect the health of individuals and families, it is urgent to change social conditions — to build a decent society, one without poverty and the many horrors that flow from it.

In this moment, one important way to help move toward this goal is to urge the U.S. House to pass the reconciliation budget that just passed the Senate, to extend the direct child tax credit payments to families, boldly expand education and health care, and more. Onward, a better world awaits.

This article first appeared in The Kansas City Star: https://www.kansascity.com/opinion/readers-opinion/guest-commentary/article253638658.html or https://edition.pagesuite.com/popovers/dynamic_article_popover.aspx?artguid=1ce78851-fef4-4f5d-b7a4-448618c1526c.

Is Time the Only Cure for COVID Foolishness?

As August 2021 began, 50% of the U.S. population was fully vaccinated against COVID-19, over 165 million people. There have been 615,000 confirmed deaths — the actual number, given the national excess mortality rate since the start of 2020, is likely double official figures. Over a 12-month period, since last August, 2.5 million people were hospitalized, many leaving with lasting medical problems. All the while, protests and foaming at the mouth over mask and vaccine mandates continue; half the population has refused or delayed the vaccine, a group disproportionately Republican (by roughly 20 percentage points).

Attempting to convince the conspiracy theorists, bullheaded conservatives, and those concerned over how (historically) fast the vaccine breakthrough occurred is of course still the moral and pressing thing to do. This piece isn’t an exercise in fatalism, despite its headline. However, great frustration exists: if the hesitant haven’t been convinced by now, what will move the needle? With over a year and a half to absorb the dangers of COVID, deadly and otherwise, and eight months to observe a vaccine rollout that has given 1.2 billion people globally highly effective protection, with only an infinitesimally small percentage seeing any side effects (similar to everyday meds), what could possibly be said to convince someone to finally listen to the world’s medical and scientific consensus, to listen to reason? People have been given a chance to compare the disease to the shots (the unvaccinated are 25 times more likely to be hospitalized from COVID and 24 times more likely to die, with nearly all [97, 98, 99%] of COVID deaths now among the unprotected population), but that requires a trust in the expert consensus and data and trials and peer-reviewed research and all those things that make American stomachs churn. Giving people accurate information and sources can even make them less likely to see the light! There is, for some bizarre reason, more comfort and trust in the rogue doctor peddling unfounded nonsense on YouTube.

It may be of some comfort then to recognize that the insanity will surely decrease as time goes on. It’s already occurring. The most powerful answer to “what will move the needle?” is “personal impact” — as time passes, more people will know someone hospitalized or wiped from existence by the disease, and also know someone who has been vaccinated and is completely fine. There will be more family members who get the vaccine behind your back and more friends and acquaintances you’ll see online or in the media expressing deep regret from their ICU hospital beds. You may even be hospitalized yourself. Such things will make a difference. States currently hit hardest by the Delta variant and seeing overall cases skyrocket — the less vaccinated states — are also witnessing increases in vaccination rates. Even conservative media outlets and voices are breaking under the weight of reason, finally beginning to promote the vaccine and changing viewers’ minds, while naturally remaining in Absurdsville by pretending their anti-inoculation hysteria never occurred and blaming Democrats for vaccine hesitancy. Eventually, falsities and mad beliefs yield to science and reason, as we’ve seen throughout history. True, many will never change their minds, and will go to their deaths (likely untimely) believing COVID to be a hoax, or exaggerated, or less risky than a vaccine. But others will yield, shaken to the core by loved ones lost to the virus (one-fourth to one-third of citizens at least know someone who died already) or vaccinated without becoming a zombie, or even by growing ill themselves.

To say more time is needed to end the foolishness is, admittedly, in part to say more illness and death are needed. As stated, the more people a hesitant person knows who have grown ill or died, the more likely the hesitant person is to get his or her shots. A terrible thing to say, yet true. That is why we cannot rest, letting time work on its own. We must continue trying to convince people, through example, empathy (it’s often not logic that changes minds, but love), hand-holding, and other methods offered by psychologists. Lives can be saved. And to convince someone to get vaccinated is not only to protect them and others against COVID, it suddenly creates a person in someone else’s inner circle who has received the shots, perhaps helping the behavior spread. Both we and Father Time can make sure hesitant folks know more people who have been vaccinated, the more pleasant piece of time’s function.

Hopefully, our experience with coronavirus will prepare us for more deadly pandemics in the future, in terms of our behavior, healthcare systems, epidemiology, and more. As bad as COVID-19 is, as bad as Delta is, humanity was exceptionally lucky. The disease could have been far deadlier, far more contagious; the vaccine could have taken much longer, and been less effective. We’ve seen four million deaths worldwide, but even with this virus evolving and worsening, we’ll likely see nothing like the 50 million dead from the 1918 pandemic. Some see the rebellion against masks, lockdowns, and vaccines as a frightening sign: such insanity will spell absolute catastrophe when a deadlier virus comes around. This writer has always suspected (perhaps only hoped) that view to be a bit backward. A deadlier virus would likely mean less rebellion (as would a virus you could see on other people, something more visually horrifying like leprosy). It’s the relative tameness of COVID that allows for the high degree of madness. Admittedly, there was anti-mask resistance during the 1918 crisis, but there could nonetheless be an inverse correlation between the seriousness of an epidemic and the willingness to engage in suicidal foolishness. That aligns with this idea that the more people you lose in your inner circle the more likely you are to give in and visit your local health clinic. Let’s hope science and reason reduce the opportunities to test this correlation hypothesis.

Woke Cancel Culture Through the Lens of Reason

What follows are a few thoughts on how to view wokeism and cancel culture with nuance:

Two Basic Principles (or, Too Much of a Good Thing)

There are two principles that first spring to mind when considering cancel culture. First, reason and ethics, to this writer, suggest that social consequences are a good thing. There are certain words and actions that one in a free society would certainly not wish to result in fines, community service, imprisonment, or execution by government, but are deserving of proportional and reasonable punishments by private actors, ordinary people. It is right that someone who uses a racial slur loses their job or show or social media account. A decent person and decent society wants there to be social consequences for immoral actions, because it discourages such actions and helps build a better world. One can believe in this while also supporting free speech rights and the First Amendment, which obviously have to do with how the government responds to what you say and do, not private persons and entities.

The second principle acknowledges that there will be many cases where social consequences are not proportional or reasonable, where things go too far and people, Right and Left, are crushed for rather minor offenses. It’s difficult to think of many social trends or ideological movements that did not go overboard in some fashion, after all. There are simply some circumstances where there was an overreaction to words and deeds, where mercy should have been the course rather than retribution. (Especially worthy of consideration: was the perpetrator young at the time of the crime, with an underdeveloped brain? Was the offense in the past, giving someone time to change and grow, to regret it?) Readers will disagree over which specific cases fall into this category, but surely most will agree with the general principle, simply that overreaction in fact occurs. I can’t be the only Leftist who both nods approvingly in some cases and in others thinks, “She didn’t deserve that” or “My, what a disproportionate response.” Stupid acts might deserve a different response than racist ones, dumb ideas a different tack than dangerous ones, and so on. It might be added that overreactions not only punish others improperly, but also encourage forced, insincere apologies — somewhat reminiscent of the adage that you shouldn’t make faith a requirement of holding office, as you’ll only end up with performative religiosity.

Acknowledging and pondering both these principles is important.

“Free Speech” Only Concerns Government-Citizen Interaction

Again, in most cases, the phrase “free speech” is basically irrelevant to the cancel culture conversation. It’s worth emphasizing. Businesses and individuals — social media companies, workplaces, show venues, a virtual friend who blocks you or deletes your comment — have every right to de-platform, cancel, censor, and fire. The whining about someone’s “free speech” being violated when they’re cancelled is sophomoric and ignorant — the First Amendment and free speech rights are about whether the government will punish you, not non-government actors.

Which makes sense, for an employer or individual could just as easily be said to have the “free speech right” to fire or cancel you — why is your “free speech right” mightier than theirs?

Public universities and government workplaces, a bit different, are discussed below.

Why is the Left at Each Other’s Throats?

At times the national conversation is about the left-wing mob coming for conservatives, but we know it comes for its own with just as much enthusiasm. Maybe more, some special drive to purge bad ideas and practices from our own house. Few involved in left-wing advocacy of some kind haven’t found themselves in the circular firing squad, whether firing or getting blasted — most of us have probably experienced both. It’s a race to be the most woke, and can lead to a lot of nastiness.

What produces this? Largely pure motives, for if there’s a path that’s more tolerant, more just, that will build a better future, we want others to see and take it. It’s a deep desire to do what’s right and get others to do the same. (That the pursuit of certain kinds of tolerance [racial, gender, etc.] would lead to ideological intolerance has been called ironic or hypocritical, but seems, while it can go too far at times, more natural and inevitable — there’s no ending separate drinking fountains without crushing the segregationist’s ideology.)

But perhaps the inner turmoil also comes from troublesome ideas of group monolithic thinking, plus a desperate desire for there to be one right answer when there isn’t one. Because we sometimes look at impacted groups as composed of members all thinking the same way, or enough thinking the same way, we conclude there is one right answer and that anyone who questions it should be trampled on. For example, you could use “person with autism” (person-first language) rather than “autistic person” (identity-first language) and fall under attack for not being woke enough. Identity-first language is more popular among the impacted group members, and the common practice with language among non-impacted persons is to defer to majority opinions. But majority opinions aren’t strictly “right” — to say this is of course to say the minority of the impacted group members are simply wrong. Who would have the arrogance and audacity to say this? It’s simply different opinions, diversity of thought. (Language and semantics are minefields on the Left, but also varying policy ideas.) There’s nothing wrong with deferring to majority opinion, but if we were not so focused on there being one right answer, if we didn’t view groups as single-minded or single-minded enough, we would be much more tolerant of people’s “mistakes” and less likely to stoop to nastiness. We’d respect and explore and perhaps even celebrate different views within our side of the political spectrum. It’s worth adding that we go just as crazy when the majority impacted group opinion is against an idea. It may be more woke, for example, to support police abolition or smaller police presences in black neighborhoods, but 81% of black Americans don’t want the police going anywhere, so the majority argument won’t always help a case.

Instead of condemning someone who isn’t on board with such policies as not caring enough about racial justice, not being woke enough, being dead wrong, we should again remember there is great diversity of thought out there and many ideas, many possible right answers beyond our own, to consider and discuss with civility. One suspects that few individuals, if intellectually honest, would always support the most radical or woke policy posited (more likely, you’ll disagree with something), so more tolerance and humility is appropriate.

The same should be shown toward many in the middle and on the Right as well. Some deserve a thrashing. Others don’t.

The University Onus

One hardly envies the position college administrators find themselves in, pulled between the idea that a true place of learning should include diverse and dissenting opinions, the desire to punish and prevent hate speech or awful behaviors, the interest in responding to student demands, and the knowledge that the loudest, best organized demands are at times themselves minority opinions, not representative.

Private universities are like private businesses, in that there’s no real argument against them cancelling as they please.

But public universities, owned by the states, have a special responsibility to protect a wide range of opinion, from faculty, students, guest speakers, and more, as I’ve written elsewhere. As much as this writer loves seeing the power of student organizing and protest, and the capitulation to that power by decision-makers at the top, public colleges should take a harder line in many cases to defend views or actions that are deemed offensive, in order to keep these spaces open to ideological diversity and not drive away students who could very much benefit from being in an environment with people of different classes, ethnicities, genders, sexual orientations, religions, and politics. Similar to the above, that is a sensible general principle. There will of course be circumstances where words and deeds should be crushed, cancellation swift and terrible. Where that line is, again, is a matter of disagreement. But the principle is simply that public colleges should save firings, censorship, cancellation, suspension, and expulsion for more extreme cases than is current practice. The same for other public entities and public workplaces. Such spaces are linked to the government, which actually does bring the First Amendment and other free speech rights into the conversation, and therefore there exists a special onus to allow broader ranges of views.

Cancel Culture Isn’t New — It’s Just the Left’s Turn

If you look at the surveys that have been conducted, two things become clear: 1) support for cancel culture is higher on the Left, but 2) it’s also a problem on the Right.

50% of staunch progressives “would support firing a business executive who personally donated to Donald Trump’s campaign,” vs. 36% of staunch conservatives who “would support firing Biden donors.” Republicans are much more worried about their beliefs costing them their jobs (though a quarter of Democrats worry, too), conservatives are drastically more afraid to share opinions (nearly 80%, vs. just over 40% for strong liberals), and only in the “strong liberal” camp does a majority (58%) feel free to speak its mind without offending others (liberals 48%, conservatives 23%). While almost 100% of the most conservative Americans see political correctness as a problem, 30% of the most progressive Americans agree, not an insignificant figure (overall, 80% of citizens agree). There’s some common ground here.

While the Left is clearly leading modern cancel culture, it’s important to note that conservatives often play by the same rules, despite rhetoric about how they are the true defenders of “free speech.” If Kaepernick kneels for the anthem, he should be fired. If a company (Nike, Gillette, Target, NASCAR, Keurig, MLB, Delta, etc.) gets political on the wrong side of the spectrum, boycott it and destroy your possessions, while Republican officials legislate punishment. If Republican Liz Cheney denounces Trump’s lies, remove her from her leadership post. Rage over and demand cancellation of Ellen, Beyonce, Jane Fonda, Samantha Bee, Kathy Griffin, Michelle Wolf, and Bill Maher for using their free speech. Obviously, no one called for more firings for views he didn’t like than Trump. If the Dixie Chicks criticize the invasion of Iraq, wipe them from the airwaves, destroy their CDs. Thomas Hitchner recently put together an important piece on conservative censorship and cancellation during the post-9/11 orgy of patriotism, for those interested. And don’t forget what happened to Sinéad O’Connor after she tore up a photograph of the Pope (over the Catholic Church sexual abuse scandal) on SNL in 1992: her records were crushed under a steamroller in Times Square and her career was cancelled. Think also of the Right’s myriad book bans of late (like those of yesteryear), and how this impacts authors.

More importantly, when we place this phenomenon in the context of history, we come to suspect that rather than being something special to the Left (or naturally more powerful on the Left, because liberals hate free speech and so on), cancel culture seems to be, predictably, led by the strongest cultural and political ideology of the moment. When the U.S. was more conservative, it was the Right that was leading the charge to ensure people with dissenting views were fired, censored, and so on. The hammer, rather than wielded by the far Left, came down on it.

You could look to the socialists and radicals, like Eugene Debs, who were literally imprisoned for speaking out against World War I, or, more recently, to the McCarthy era after World War II, when government workers, literary figures, media anchors, and Hollywood writers, actors, and filmmakers accused of socialist or communist sympathies were hunted down and fired, blacklisted, slandered, and imprisoned for refusing to answer questions at the witch trials, as discussed in A History of the American People by conservative Paul Johnson. The Red Scare was in many ways far worse than modern cancel culture — it wasn’t simply the mob that came for you, it was the mob and the government. However, lest anyone think this was just Republican Big Government run amok rather than a cultural craze working in concert, recall that it was the movie studios doing the actual firing and blacklisting, the universities letting faculty go, LOOK and other magazines reprinting Army “How to Spot a Communist” propaganda, ordinary people pushing and marching and rallying against communism, etc.

All this overlapped, as left-wing economic philosophies usually do, with the fight for racial justice. Kali Holloway writes for The Nation:

There was also [black socialist] Paul Robeson, who had his passport revoked by the US State Department for his political beliefs and was forced to spend more than a decade living abroad. Racism and red-scare hysteria also canceled the acting career of Canada Lee, who was blacklisted from movies and died broke in 1952 at the age of 45. The [anti-segregationist] song “Mississippi Goddam” got Nina Simone banned from the radio and much of the American South, and the Federal Bureau of Narcotics essentially hounded Billie Holiday to death for the sin of stubbornly refusing to stop performing the anti-lynching song “Strange Fruit.”

Connectedly, there was the Lavender Scare, a purge of gays and suspected gays from government and private workplaces. 5,000-10,000 people lost their jobs:

“It’s important to remember that the Cold War was perceived as a kind of moral crusade,” says [historian David K.] Johnson, whose 2004 book The Lavender Scare popularized the phrase and is widely regarded as the first major historical examination of the policy and its impact. The political and moral fears about alleged subversives became intertwined with a backlash against homosexuality, as gay and lesbian culture had grown in visibility in the post-war years. The Lavender Scare tied these notions together, conflating gay people with communists and alleging they could not be trusted with government secrets and labelling them as security risks, even though there was no evidence to prove this.

The 1950s was a difficult era for the Left and its civil rights advocates, class warriors, artists, and gay liberators, with persecution and censorship the norm. More conservative times, a stronger conservative cancel culture. Not even rock and roll was safe. All this did not end with the decade, of course (one of my own heroes, Howard Zinn, was fired from Spelman College in 1963 for his civil rights activism), but soon a long transition began. Paul Johnson mused:

The significant fact about McCarthyism, seen in retrospect, was that it was the last occasion, in the 20th century, when the hysterical pressure on the American people to conform came from the right of the political spectrum, and when the witchhunt was organized by conservative elements. Thereafter the hunters became the hunted.

While, as we saw, the Right are still often hunters as well, and therefore we see much hypocrisy today, there is some truth to this statement, as from the 1960s and ’70s the nation began slowly liberalizing. Individuals increasingly embraced liberalism, as did some institutions, like academia, the media, and Hollywood (others, such as the church, the military, and law enforcement, remain quite conservative). The U.S. is still growing increasingly liberal, favoring New Deal policies more and more, for example, even though more Americans still identify as conservative:

Since 1992, the percentage of Americans identifying as liberal has risen from 17% then to 26% today. This has been mostly offset by a shrinking percentage of moderates, from 43% to 35%. Meanwhile, from 1993 to 2016 the percentage conservative was consistently between 36% and 40%, before dipping to 35% in 2017 and holding at that level in 2018.

On top of this, the invention and growth of social media since the mid-2000s has dramatically changed the way public anger coalesces and is heard — and greatly increased its power.

So the Left has grown in strength at the same time as technology that can amplify and expand cancel culture, a convergence that is both fortunate and unfortunate — respectively, for those who deserve harsh social consequences and for those who do not.

Did Evolution Make it Difficult for Humans to Understand Evolution?

It’s well known that people are dreadful at comprehending and visualizing large numbers, such as a million or billion. This is understandable in terms of our development as a species, as grasping the tiny numbers of, say, your clan compared to a rival one you’re about to be in conflict with, or understanding amounts of resources like food and game in particular places, would aid survival (pace George Dvorsky). But there was little evolutionary reason to adeptly process a million of something, intuitively knowing the difference between a million and a billion as easily as we do four versus six. A two-second difference, for instance, we get — but few intuitively sense that a million seconds is about 11 days and a billion seconds 31 years (making for widespread shock on social media).
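The seconds comparison is easy to verify with quick arithmetic. A minimal sketch in Python, using the standard approximations of 86,400 seconds per day and a 365.25-day year:

```python
# How long is a million seconds versus a billion seconds?
SECONDS_PER_DAY = 60 * 60 * 24               # 86,400 seconds in a day
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365.25  # ~31.6 million seconds in a year

million_as_days = 1_000_000 / SECONDS_PER_DAY        # ~11.6 days
billion_as_years = 1_000_000_000 / SECONDS_PER_YEAR  # ~31.7 years

print(f"A million seconds is about {million_as_days:.1f} days")
print(f"A billion seconds is about {billion_as_years:.1f} years")
```

The thousandfold jump from million to billion turns days into decades, which is precisely the leap our intuition fails to make.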

As anthropologist Caleb Everett, who pointed out that a word for “million” did not even appear until the 14th century, put it, “It makes sense that we as a species would evolve capacities that are naturally good at discriminating small quantities and naturally poor at discriminating large quantities.”

Evolution, therefore, made it difficult to understand evolution, which deals with slight changes to species over vast periods of time, resulting in dramatic differences (see Yes, Evolution Has Been Proven). It took 16 million years for Canthumeryx, with a look and size similar to a deer, to evolve into, among other new species, the 18-foot-tall giraffe. It took 250 million years for the first land creatures to finally have descendants that could fly. It stands to reason that such statements seem incredible to many people not only because of attachment to old religious tales that evidence does not support, but also because it’s hard to grasp how much time that actually constitutes. Perhaps it would be easier to comprehend and visualize how small genetic changes between parent creatures and offspring could add up, eventually resulting in descendants that look nothing like ancient ancestors, if we could better comprehend and visualize the timeframes, the big numbers, in which evolution operates. Sixteen million years is a long time — long enough.

This is hardly the first time it’s been suggested that its massive timescales make evolution tough to envision and accept, but it’s interesting to think about how this fact connects to our own evolutionary history and survival needs.

Just one of those wonderful oddities of life.
