Yes, Evolution Has Been Proven

Evolution is a simple idea: that over time, lifeforms change. In a small timespan, changes are subtle yet noticeable; in a massive one, changes are shockingly dramatic — descendants look nothing like their ancestors, becoming what we call new species.

Changes occur when genes mutate during the imperfect reproduction process, and are passed on if the mutation helps an individual creature escape predators, find food or shelter, or attract a mate, allowing it to more successfully reproduce than individuals without its new trait (natural selection). Some mutations, of course, hurt chances of survival or have no impact at all.

Naturalist and geologist Charles Darwin provided evidence for this idea in his 1859 book On the Origin of Species and other works, and over the century and a half since, research in multiple fields has consistently confirmed Darwin’s idea, irreparably damaging religious tales of the divine creation of life just as it exists today.  

The Myths of Man

While many people of faith have adopted scientific discoveries such as the age of the earth and evolution into their belief systems, many have not. Hardline Christian creationists still believe humans and all other life originated 6,000 years ago, and that a “Great Flood” essentially restarted creation 4,000 years ago, with thousands of “kinds” of land animals (tens of thousands of species) rescued on Noah’s ark.

The logical conclusion of the story is utterly lost on believers. There are an estimated 6.5 million species that live on land today, perhaps 8-16 million total species on Earth (that’s a conservative estimate; it could be 100 million, as most of our oceans remain unexplored). People have cataloged 2 million species, discovering tens of thousands more each year. Put bluntly, believing that in four millennia tens of thousands of species could become millions of species requires belief in evolution at a pace that would make Darwin laugh in your face.
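The arithmetic alone is damning. Here is a back-of-the-envelope sketch, using round figures drawn from the estimates above (both numbers are assumptions for illustration: say 50,000 species disembark the ark, and 6.5 million land species exist today):

```python
import math

# Round figures based on the estimates in the text (assumptions):
ark_species = 50_000        # "tens of thousands" of species aboard the ark
modern_species = 6_500_000  # estimated land species alive today
years = 4_000               # time since the flood in the creationist account

growth_factor = modern_species / ark_species  # a 130-fold increase required
doublings = math.log2(growth_factor)          # about 7 doublings
years_per_doubling = years / doublings

print(f"Species counts must double every ~{years_per_doubling:.0f} years")
```

Doubling the planet’s species count every six centuries or so is evolution far faster than anything biologists have ever proposed.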

To evolve the diversity of life we see today, much time was needed: far more than 4,000 years, on a planet far older than 6,000 years. We know the Earth is 4.5 billion years old because radioactive isotopes in terrestrial rocks (and in meteorites) decay at consistent rates, allowing us to count backward. Fossil distribution, modern flora and fauna distribution, and the shape of the continents first indicated the continents were once one, and satellites proved the continents are indeed moving apart from each other at two to four inches per year, again allowing us to count backward (Why Evolution Is True, Jerry Coyne). When we do so, we do not stop counting in the thousands.
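The counting itself is simple exponential-decay arithmetic. A minimal sketch of the idea follows; the half-life is the standard published value for potassium-40, but the measured daughter-to-parent ratio is invented for illustration, and real potassium-argon dating also corrects for branching decay, which this toy version ignores:

```python
import math

def radiometric_age(daughter_to_parent: float, half_life_years: float) -> float:
    """Age of a sample from the ratio of daughter atoms to surviving parent
    atoms, assuming every decayed parent became one measurable daughter."""
    # N_parent(t) = N0 * 2^(-t / half_life)  =>  t = half_life * log2(1 + D/P)
    return half_life_years * math.log2(1.0 + daughter_to_parent)

# Potassium-40 decays with a half-life of about 1.25 billion years.
# Illustrative (invented) measurement: 11.3 daughter atoms per parent atom.
age = radiometric_age(11.3, 1.25e9)
print(f"~{age / 1e9:.2f} billion years")  # ~4.53 billion years
```

The chemistry fixes the rate; the measurement fixes the ratio; the age follows.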

Naturally, criticisms of myths can be waved away with more magic, which is why it’s mostly futile to tear them apart, something I learned after wasting time doing so during my early writing days. Perhaps God decided to make new species after the flood. Perhaps he in fact made millions of species magically fit on a boat roughly the size of a football field, like a bag from Harry Potter. It’s the same way he got pairs of creatures on whole other continents to, and later from, the Middle East; how one family, through incest, rapidly evolved into multiple human races immediately after the flood (or did he make new human beings, too?); how a worldwide flood and the total destruction of every human civilization left behind no evidence. The power of a deity — and our imagination — can take care of such challenges to dogma. But it cannot eviscerate the evidence for evolution. Science is the true arrow in mythology’s heel.

Still, notions of intelligent design bring up many curious questions, such as why a deity would so poorly design, in identical ways, the insides of so many species (see below), why said deity would set up a world in which 99% of his creative designs would go extinct, and so on.

It seems high time we set aside ancient texts written by primitive Middle Eastern tribes and listened to what modern science tells us. And that’s coming from a former creationist.

It Wasn’t Just Darwin


Charles Darwin, 1809-1882. via Britannica

Creationists attempt to discredit evolution by attacking the reliability and character of Darwin, but forget he was just one man. Darwin spent decades gathering the best evidence for evolution of his day, showed for the first time its explanatory power across disciplines (from geography to embryology), and brought his findings to the masses with his accessible books. But many who came before him deepened our understanding, and his, of where diverse life came from and of how the Earth was not quite as young as the Bible claims. For example:

  • In the sixth century B.C., the Greek philosopher Anaximander studied fossils and suggested life began with fishlike creatures in the oceans.
  • James Hutton argued in the 1700s that the age of the earth could be calculated based on an understanding of geologic processes like erosion and the laying down of sediment layers.
  • In 1809, Jean-Baptiste Lamarck theorized that physical changes to an individual acquired during its life could be passed to offspring (a blacksmith builds strength in his arms…could that lead to stronger descendants?).
  • By the 1830s, Charles Lyell was putting Hutton’s ideas to work, measuring the rate at which sediments were laid, and counting backward to estimate Earth’s age.
  • Erasmus Darwin, Charles’ grandfather, suggested “all warm-blooded animals have arisen from one living filament,” with “the power of acquiring new parts…delivering down those improvements by generation.”
  • Alfred Wallace theorized natural selection independently of and at the same time as Charles Darwin!

In other words, if it wasn’t Darwin it would have been Wallace. If not Wallace then someone else. Like gravity or the heliocentric solar system, the scientific truth of evolution could not remain hidden forever.

Creationists also seize upon Darwin’s unanswered questions and use them to argue he “disproved” or “doubted” the validity of his findings. For example, Darwin, in his chapter on “Difficulties of the Theory” in The Origin of Species, said the idea that a complex eye “could have been formed by natural selection, seems, I freely confess, absurd in the highest possible degree.”

Emphasis on seems. He went on to say:

When it was first said that the sun stood still and the world turned round, the common sense of mankind declared the doctrine false… Reason tells me, that if numerous gradations from an imperfect and simple eye to one perfect and complex, each grade being useful to its possessor, can be shown to exist, as is certainly the case; if further, the eye ever slightly varies, and the variations be inherited, as is likewise certainly the case; and if such variations should ever be useful to any animal under changing conditions of life, then the difficulty of believing that a perfect and complex eye could be formed by natural selection, though insuperable by our imagination, cannot be considered real.

In other words, the evolution of the eye is possible, and there is no real difficulty in supposing it happened given the other evidence he had found. Darwin knew he was not the end of the line. He made predictions concerning future discoveries, and supposed that other scientists would one day show how eyes could develop from non-existence to simple lenses to complex eyes, as they indeed have. It began with cells more sensitive to light than others. Biologists believe, in the words of Michael Shermer (God Is Not Great, Hitchens), that there was

Initially a simple eyespot with a handful of light-sensitive cells that provided information to the organism about an important source of the light; it developed into a recessed eyespot, where a small surface indentation filled with light-sensitive cells provided additional data on the direction of light; then into a deep recession eyespot, where additional cells at greater depth provide more accurate information about the environment; then into a pinhole camera eye that is able to focus an image on the back of a deeply-recessed layer of light sensitive cells; then into a pinhole lens eye that is able to focus the image; then into a complex eye found in such modern mammals as humans.

Earth has creatures with no eyes, creatures with “a handful of light-sensitive cells,” and all the other stages of eye development, right up to our complex camera eye. Given this, there is no reason to believe the evolution of the eye is impossible. As creatures evolved from lower lifeforms, there were slight variations in their ability to detect light, which proved useful for many, which helped creatures survive, which passed on the variations to offspring. This is how life can go from simple to complex over the generations. See The Evidence for Evolution, Alan Rogers, pp. 37-49, for a detailed study.

While the natural process has yet to be observed by humans — it takes eons, after all — we are able to create computer models that mimic beneficial mutations. Dan-Eric Nilsson and Susanne Pelger at Lund University in Sweden, for instance, made a simulation wherein a group of light-sensitive cells on top of a retina experienced random mutations in the tissues around them. The computer was programmed to keep mutations that improved vision in any way, no matter how small. So when the tissue pulled backward, for example, forming a “cup” for the primitive eye, this was preserved because it was an improvement. After 1,829 mutations (400,000 years), the simulation had a complex camera eye (Coyne). Computer models are a great tool for showing how evolution works. Simulations aren’t programmed to build something complex, only to follow the simple laws of natural selection. Check out Climbing Mount Improbable by Richard Dawkins for more.
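The core loop of such a simulation is easy to sketch. The toy model below is entirely my own construction, far cruder than Nilsson and Pelger’s optical calculations: it evolves a single parameter, the depth of an eye cup, keeping any random mutation that raises a made-up “acuity” score:

```python
import random

random.seed(42)  # reproducible run

def acuity(cup_depth: float) -> float:
    """Made-up vision score: deeper cups detect light direction better,
    with diminishing returns (a stand-in for real optics calculations)."""
    return cup_depth / (1.0 + cup_depth)

depth = 0.0          # start as a flat patch of light-sensitive cells
kept_mutations = 0

for generation in range(10_000):
    mutant = max(0.0, depth + random.gauss(0, 0.01))  # small random change
    if acuity(mutant) > acuity(depth):  # selection: keep only improvements
        depth = mutant
        kept_mutations += 1

print(f"cup depth {depth:.1f} after keeping {kept_mutations} of 10,000 mutations")
```

Nothing tells the program to aim for a deep cup; selection on tiny improvements gets there on its own.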

Strange Coincidences


Homologous limbs. via University of California Museum of Paleontology

While the study of homologous structures is fascinating, most won’t impress creationists. Humans, bats, birds, whales, and other creatures all have a humerus, radius, ulna, carpals, metacarpals, and phalanges in their forelimbs, with simple variations in size and sometimes number, suggesting they are related via a common ancestor yet have changed, evolved. But the creationist can simply say a sensible deity created them with similar structures. 

Yet there are some coincidences and oddities that no serious person would call intelligent design, and in fact scream common ancestry.

Modern whales have tiny leg bones inside their bodies that are detached from the rest of the skeleton. We humans have three muscles under our scalps that allow some of us to wiggle our ears, which do nothing for our hearing but are the precise same muscles that allow other animals to turn their ears toward sounds. Goosebumps, now worthless, are vestiges of an era when our ancestors had fur. Our sinus cavities, behind our cheeks, have a drainage hole on top — our ancestors walked on all fours, and thus the location made sense, allowing better drainage. Cave salamanders have eyes but are totally blind. Koalas, which spend most of their time in trees, have pouches for their young that open upside down — their ancestors were diggers on the ground, so this was useful to protect young from dirt and rock thrown about, but now threatens to let koala joeys plunge from trees (The Greatest Show on Earth, Richard Dawkins).

Even more astonishing, within the neck of Earth’s mammals, the vagus/recurrent laryngeal nerve, instead of simply going the short distance from your brain to your voicebox, extends from the brain, goes down into your chest, twists around your aortic arch by the heart, and then travels back up to the voicebox! It’s three times longer than necessary.

Incredibly, this same lengthy, senseless detour is found in other mammals, even the towering giraffe, in which it is fifteen to twenty feet longer than needed (see evolutionist Richard Dawkins cut one open on video). In fish, which evolved earlier than us, the nerve connects the brain to the gills in a simple, straightforward manner (Coyne). This indicates our common ancestors with fish did not have this issue, but our common ancestors with other, later species did. As our mammalian ancestors evolved, the nerve was forced to grow around other developing, growing, evolving structures.

Human males have another interesting detour. As explained by Dawkins, the vas deferens, the tube that carries sperm from testes to penis, is also longer than necessary — and indeed caught on something. The vas deferens leaves the testes, travels up above the bladder and loops around the ureter like a hanger on a department store rack. It then finally finds its target, the seminal vesicle, which mixes secretions with the sperm. Then the prostate adds more secretions, finalizing the product (semen), which is ejaculated via the urethra. The vas deferens could go straight to the seminal vesicle (under instead of around the bladder and ureter), but it doesn’t.

This same trait is found in other male mammals, like pigs. Creatures like fish again do not have this mess. Our ancestors had testes within the body, like many modern species, and as they descended toward the scrotum, toward the skin for cooler temperatures, the wiring got caught on the ureter. Perhaps one could see an intelligent (?) designer having to jam some things together to make them work — a detour for the vas deferens here, another for the recurrent laryngeal nerve there — in one species. But in mammals across the board? How does that make more sense than all this being the imperfect byproduct of mindless evolution over time?

The recurrent laryngeal nerve. via Laryngopedia

The ductus (vas) deferens. via Anatomy-Medicine

And it doesn’t end there. Vertebrates (species that have a backbone) like us happen to have eyes with retinas installed backward. Rogers writes:

The light-sensitive portion of the retina faces away from the light… The nerves, arteries, and blood vessels that serve each photocell are attached at the front rather than the back. They run across the surface of the retina, obscuring the view. To provide a route for this wiring, nature has poked a hole in the retina, which causes a substantial blind spot in each eye. You don’t notice these because your brain patches the image up, but that fix is only cosmetic. You still can’t see any object in the blind spot, even if it is an incoming rock.

But cephalopods (squid, octopuses, and other advanced invertebrates) have a more sensible setup, with wiring in the back (Rogers). Guess what kind of creature appeared on this planet first? Yes, the invertebrates. These coincidences and bad engineering suggest that as life evolved to be more complex there were greater opportunities for messy tangles of innards.

The best creationists can do is declare there are good reasons for these developments, that evolutionists “fail to demonstrate how this detour…disadvantages the male reproductive system” for example, which is completely beside the point. There were indeed biological reasons behind the development of these systems, which served as an advantage, not a hindrance (breaking the vas deferens or recurrent laryngeal nerve to let other organs grow and evolve would not be good for survival). The point is that if some species share this trait, it hints at a common ancestor.

So does embryology, the study of development in the womb. The field of genetics, which we explore further in the next section, helped us discover dead genes, or pseudogenes, in lifeforms. These are genes that are usually inactive but carry traits that, if developed, would be viewed as abnormal. In light of evolution it makes sense that we still have them. And sometimes dead genes wake up.

Humans have just under 30,000 genes, with over 2,000 of them pseudogenes. We have dead genes for growing tails, for instance. We all have a coccyx, four fused vertebrae that make up the end of our spine — four vertebrae that are larger and unfused in primates, thus being the base of their tails (Coyne). Not only are some humans born with an extensor coccygis, the muscle that moves the tail in primates but is worthless in us due to our vertebrae being fused, but some people are born with a tail anywhere from one inch to one foot long! It has to be surgically removed.


Arshid Ali Khan, born in India in 2001, was worshiped as a reincarnation of the Hindu monkey god Hanuman. He had his tail removed in 2015. via Mirror

In fact, all human embryos begin with a fishlike tail, which is reabsorbed into the body around week seven. We develop a worthless yolk sac that is discarded by month two, a vestige of reptilian ancestors that laid eggs containing a fetus nourished with yolk. We develop three kidneys, the first resembling that of fish, the second resembling that of reptiles; these are also discarded, leaving us with our third, mammalian version. From month six to eight, we are totally covered in a coat of hair (lanugo) — primates develop their hair at the same stage, only they keep it. These marvels exist in other life, too. Horse embryos begin with three-toed hooves, then drop to one; they descended from creatures with more than just one toe. Occasionally, a horse is born with more than one hoof, or toe, on each foot (polydactyl horse), similar to its ancestors. Birds carry the genes necessary to grow teeth, minus a single vital protein; they descended from reptiles with teeth. Dolphin and whale embryos have hindlimb buds that vanish later; baleen whale embryos begin to develop teeth, then discard them (Coyne).


Premature infants still have some of their lanugo coat. They will soon lose it. via Mipediatra

It should also be noted that people with hypertrichosis are covered in fur like other primates — perhaps the reactivation of a “suppressed ancestral gene. In the course of evolution genes causing hair growth have been silenced and the appearance of hair in healthy humans can be explained by an erroneous reactivation of such genes.” We all have the genes for a full coat.


Supatra “Nat” Sasuphan, who has hypertrichosis, is the Guinness Book of World Records holder for hairiest person. via Fox News

Quite interesting that God would give us genes to grow tails and fur.

Our fetal development, you likely noticed, actually mimics the evolutionary sequence of humanity. This is most noticeably true with our circulatory system, which first resembles that of fish, then that of amphibians, then that of reptiles, then finally develops into our familiar mammalian circulatory system (Coyne). Strange coincidences indeed.

But there are more. As one would expect if evolution occurred, fossils of creatures found in shallower rock more closely resemble species living today; fossils found in deeper, older sedimentary layers are more different from modern life. This pattern has never been broken by any fossil discovery, and supports Darwin’s idea (Coyne).

Similarly, consider islands. The species found on islands consistently resemble those on the nearest continent. This at first does not sound surprising, as one would predict that life (usually birds, insects, and plant seeds) that colonized islands would do so from the closest landmass. But the key word is “resemble.” What we typically see are a few species native to a continent (the ancestors) and an explosion of similar species on the nearby islands (the descendants). Hawaii has dozens of types of honeycreepers (finches) and half the world’s 2,000 types of Drosophila fruit flies; Juan Fernandez and St. Helena are rich in different species of sunflower; the Galapagos islands have 14 types of finches; 75 types of lemurs, living or extinct, have been documented on Madagascar, and they are found nowhere else; New Zealand has a remarkable array of flightless birds; and Australia has all the world’s marsupials, because the first one evolved there. To the evolutionist, a tight concentration of similar species on islands (and individual islands having their own special species) is the result of an ancestral explorer from a nearby landmass whose descendants thrived in a new environment unprepared for them (a habitat imbalance), reproducing and evolving like crazy. Thus a finch on a continent has a great number of finch cousins on nearby islands — like her but not the same species (Coyne). Darwin himself, still a creationist at the time, was shocked by the fact that each island in the Galapagos, most in sight of each other, had a slightly different type of mockingbird (Rogers).

To the creationist, God simply has an odd affinity for overkill on islands.

Shared DNA

In the 20th century, geneticists like Theodosius Dobzhansky synthesized Darwin’s theory with modern genetics, showing how the random, natural mutation of genes during the copying of DNA changes the physiology of lifeforms (should that altered state help a creature survive, it will be passed on to offspring). The study of DNA proved once and for all that Darwin was right. By mapping the genetic code of Earth’s lifeforms, scientists determined — and continue to show — that all life on Earth shares DNA.

DNA is passed on through reproduction. You get yours from your parents. You share more DNA with your parents and siblings than you do with your more distant relatives. In the same way, humans share more DNA with some living things than with others. We share 98% with chimps, 85% with zebra fish, 36% with fruit flies, and 15% with mustard grass. By share, we mean that 98% of DNA base pairs (adenine, guanine, cytosine, and thymine) are in the precise same spot in human DNA compared to chimp DNA. (These four nucleobases can be traded between species. There is no difference between them — we’re all made of the same biochemical stuff.) 
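That “shared” figure is, at bottom, a position-by-position comparison of aligned sequences. A minimal sketch with invented toy fragments (real genome comparisons first require aligning the sequences, a much harder problem than this illustrates):

```python
def percent_identity(seq_a: str, seq_b: str) -> float:
    """Percent of aligned positions where the two sequences carry
    the same nucleobase (A, C, G, or T)."""
    assert len(seq_a) == len(seq_b), "sequences must be pre-aligned"
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

# Invented toy fragments, not real human or chimp DNA:
human_fragment = "ATGGCCATTGTAATGGGCCGCTGA"
chimp_fragment = "ATGGCCATTGTAATGGGCCACTGA"  # one substitution

print(f"{percent_identity(human_fragment, chimp_fragment):.1f}% identical")  # 95.8% identical
```

Run across billions of aligned base pairs instead of two dozen, this same tally is what yields figures like 98% for chimps and 85% for zebra fish.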

It is not surprising that creatures similar to us (warm-blooded, covered in hair, birth live young, etc.) are closer relatives than less similar ones. It’s no coincidence that apes look most like us and share the most DNA with us (and are able to communicate most directly with us, with one of our own languages, learning and holding entire conversations in American Sign Language). Evolutionary biologists used to use appearance and behaviors (such as gills or reproductive method) to suppose creatures were related, like the trout and the shark or the gorilla and the human being. But DNA now confirms the observations, as trout DNA is more similar to shark DNA than, say, buffalo DNA, and gorilla DNA is more similar to human DNA than, say, fruit fly DNA. 

But all life shares DNA, no matter how different (for a deeper analysis, see Rogers pp. 25-31, 86-92). That simple truth proves a common forefather. A god would not have to make creations with chimp and human DNA nearly the same, all the nucleobases laid out in nearly the same order; why do so, unless to suggest that evolution is true? When mapped out by genetic similarity, we see exactly what Darwin envisioned: a family tree with many different branches, all leading back to a common ancestor.  


Our tree of life. via Evogeneao

Transitional Forms

Darwin predicted we would find fossils of creatures with transitional characteristics between species, for example showing how lifeforms moved from water to land and back again. Unfortunately, the discovery of such fossils has done nothing to end the debate over evolution. 

For instance, as transitional fossils began to accumulate, it became even more necessary to attack scientific findings on Earth’s age. If you can keep the Earth young, evolution has no time to work and can’t be true. So, as mentioned, creationists insist radiometric dating is flawed. Rocks cannot be millions of years old, thus the fossils encased within them cannot either. This amounts to nothing more than a denial of basic chemistry. Rocks contain elements, whose atoms contain isotopes that decay into something else over time at constant rates. So we can look at an isotope and plainly see how far along it is toward transformation. We know the rate, and thus can count backward. If researchers relied on a single isotope, perhaps creationists would have a prayer of calling this science into question. But rubidium becomes strontium. Uranium changes to lead, potassium to argon, samarium to neodymium, rhenium to osmium, and more (see Rogers pp. 73-80 to explore further). This is something anyone can study for themselves: grab some rocks and measure. All creationists can do is say we aren’t positive that “the decay rate has remained constant”! Can you imagine someone saying that during Isaac Newton’s time gravity’s acceleration wasn’t 9.8 meters per second squared? Anyone can make stuff up!

(You’ll find most denials of evolution rest on denials or misunderstandings of the most basic scientific principles. Some creationists insist evolution is false because it violates the Second Law of Thermodynamics, which states that the energy available for work in a closed system will decrease over time — that things fall apart. So how could simple mechanisms become more complex? How could life? What they forget is that the Earth’s environment is not a closed system. The sun provides a continuous stream of new energy. Similarly, some believe in “irreducible complexity,” the idea that complex systems with interconnected parts couldn’t evolve because one part would have no function until another evolved, therefore the first part would never arise, and thus neither could the complex system. But the “argument from complexity” fails per usual. [Other arguments, such as the “watchmaker” and “747” analogies, are even worse. Analogy is one of the weakest forms of argument because it inappropriately pretends things must be the same. No, a watch cannot assemble itself. That does not mean life does not evolve. Analogies fighting evidence are always doomed.] Biologists have discovered that parts can first be used for other tasks, as was determined for the bacterial flagellum, the unwise centerpiece of creationist Michael Behe’s skepticism. Independent parts can evolve to work together on new projects later on. Rogers writes:

Many hormones fit together in pairs like a lock and key. What good is the lock without the key? How can one evolve before the other? Jamie Bridgham and his colleagues studied one such pair and found that the key evolved first — it formerly interacted with a different molecule. They even worked out the precise mutations that gave rise to the current lock-and-key interaction.

A part of this process is sometimes scaffolding, where parts that helped form a complex system disappear, leaving the appearance that the system is too magical to have arisen. The scaffolding required to build our bridges and other structures is the obvious parallel.)

Let’s consider the fossils humanity has found. Tiktaalik was a fish with transitional structures between fins and legs. “When technicians dissected its pectoral fins, they found the beginnings of a tetrapod hand, complete with a primitive version of a wrist and five fingerlike bones… [It] falls anatomically between the lobe-finned fish Panderichthys [a fish with amphibian-like traits], found in Latvia in the 1920s, and primitive tetrapods like Acanthostega [an amphibian with fish-like traits], whose full fossil was recovered in Greenland not quite two decades ago.” Tiktaalik had both lungs protected by a rib cage and gills, allowing it to breathe in air and water, like the West African lungfish and other species today. Its fossil location was actually predicted, as researchers knew the age and freshwater environment such a missing link would have to appear in (Coyne).

Ambulocetus had whale-esque flippers with toes (Rodhocetus is similar). Pezosiren was just like a modern manatee but had developed rear legs. Odontochelys semitestacea was an aquatic turtle with teeth. Darwinius masillae had a mix of lemur traits and monkey traits. Sphecomyrma freyi had features of both wasps and ants. Archaeopteryx was more bird-like than other feathered dinosaurs (that’s feathered reptiles), yet not quite like modern birds. Its asymmetrical feathers suggest it could fly. Microraptor gui, a dinosaur with feathered arms and legs, could likely glide. Other feathered dinosaurs were found fossilized sleeping with their head tucked under their forearm or sleeping on a nest of eggs, just like modern birds (Coyne; see also Dawkins pp. 145-180).

Australopithecus afarensis, Australopithecus africanus, Paranthropus, Homo habilis, Homo erectus, and many more species had increasingly modern human characteristics. Less and less like a primate, closer and closer to modern Homo sapiens. Fossils indicate increasing bipedality (walking upright on two legs), smaller jaws and teeth, increasingly arching feet, larger brains, etc. (Also important to note are the increasingly complex tools and shelters found with such fossils. Homo erectus left behind huts, spears, axes, and bowls. Our planet had not-fully-human creatures crafting quite human-like things. Think on that. See The History of the World, J.M. Roberts.)


A: chimp skull. B-N: transitional species from pre-human to modern human. via Anthropology

It doesn’t stop there, of course. Evolution can be seen in both the obvious and minuscule differences between species.

See for example “From Jaw to Ear” (2007) and “Evolution of the Mammalian Inner Ear and Jaw” (2013). It was theorized that three important bones of a mammal’s ear — the hammer, anvil, and stirrup — were originally part of the jaw of reptilian ancestors (before mammals existed). In modern mammals there is no connecting bone between the jaw and the three inner-ear bones, but if there was an evolution from reptilian jaw bone to mammalian inner-ear bone, fossils should show transitional forms. And they do: paleontologists have found fossils of early mammals where the same bones are used for hearing and chewing, as well as fossils where the jaw bones and inner-ear bones are still connected by another bone.

Creationists have a difficult time imagining how species could evolve from those without wings to those with, from those that live on land to water-dwellers, from aquatic lifeforms back to land lovers, and so on, because they believe intermediary, transitional traits would be no good at all, could not help a creature survive. “What good is half a wing?”

Yet today species exist that show how transitional traits serve creatures well. Various mammals, marsupials, reptiles, amphibians, fish, and insects glide. It is easy to envision how reptiles could have evolved gliding traits followed by powered flight over millions of years. Or consider creatures like hippos, which are closely related to and look like terrestrial mammals but spend almost all their days underwater, only coming ashore occasionally to graze. They mate and give birth underwater, and are even sensitive to sunburn. Give it eons, and couldn’t such species change bit by bit to eventually give up the land completely? The closest living genetic relatives of whales are in fact hippos (Coyne). And finally, what of the reverse? What of ocean creatures that head to land? Crocodiles can gallop like mammals (up-down spine flexibility) as well as walk like lizards (right-left spine flexibility; see Dawkins). The mangrove rivulus, the walking catfish, American eels, West African lungfish, four-eyed fish, snakeheads, grunions, killifish, the anabas, and other species leave the waters and come onto land for a while, breathing oxygen in the air through their skin or even lungs, flopping or slithering or squirming or walking to a new location to find mates, food, or safety. Why is it so difficult to imagine a species spending a bit more time on land with each generation until it never returns to the water?

“Half a wing” is not a thing. There are only traits that serve a survival purpose in the moment, like membranes between limbs for gliding. Traits may develop further, they may remain the same, they may eventually be lost, all depending on changes in the environment over time. Environment (food sources, mating options, predators, habitability) drives evolutionary change differently for every species. That’s natural selection. When some members of a species break away from the rest (due to anything from mudslides to migration to mountain range formation), they find themselves in new environments and evolve differently from the relatives they left behind. Coyne writes, “Each side of the Isthmus of Panama, for example, harbors seven species of snapping shrimp in shallow waters. The closest relative of each species is another species on the other side.” Species can change a little or change radically, unrecognizably, but either way they can be called a new species — in fact, unable to reproduce with their long-lost relatives, because their genes have changed too greatly. That’s speciation.

There is no question that the fossil record starts with the simplest organisms and, as it moves forward in time, ends with the most complex and intelligent — all beginning in the waters but not staying there. Single-cell organisms before multicellular life. Bacteria before fungi, protostomes before fish, amphibians before reptiles, birds before human beings.

If they wish, creationists can believe the fossil record reflects the chosen sequence of a logical God, even if it does not support the Judeo-Christian creation story (in which birds appear on the same “day,” Day 5, as creatures that live in water, before land animals, which appear on Day 6; the fossil record shows amphibians, reptiles, and mammals appearing long before birds — and modern whales, being descendants of land mammals, don’t appear until later still, until after birds, just 33 million years ago). Yet they must face the evidence and contemplate what it indicates: that a deity created fish, then later fish with progressively amphibious features, then later amphibians; that he created reptiles, then later reptiles with progressively bird-like features, then birds; and so forth. No discovery has ever contradicted the pattern of change slowly documented since Darwin. God is quite the joker, laying things out, from fossils to DNA, in a neat little way to trick humans into thinking we evolved from simpler forms (note: some creationists actually believe this).

Yes, the believer can simply claim these were all their own species, individually crafted by God, with no ancestors or descendants who looked or acted any different. The strange fact that we have birds that cannot fly and mammals in the oceans that must surface for air doesn’t prompt the critical thinking one might hope for. It’s all just a creative deity messing with animals!

Watching Evolution Occur


Renee, an albino kangaroo at Namadgi National Park, Australia. via Telegraph

Most creationists are in fact quite close to accepting evolution as true.

First, they accept that genes mutate and can change an individual creature’s appearance. They know, for instance, about color mutations. We’re talking albinism, melanism, piebaldism, chimeras, erythristics, and so on.

Second, most creationists accept what they call “microevolution”: mutations help individuals survive and successfully reproduce, passing on the mutation, changing an entire species generation by generation in small ways, but of course not creating new species. They accept that scientists have observed countless microevolutionary changes: species like tawny owls growing browner as their environments see less snowfall, Trinidad guppies growing larger, maturing slower, and reproducing later when predators are removed from their environments, green anole lizards in Florida developing larger toepads with more scales to escape invaders, and more, all within years or decades. They understand evolution is how some insects adapt to pesticides and some pathogens, like HIV and the tuberculosis bacterium, adapt to our drugs over time, and how we human beings can create new viruses in the lab. They acknowledge that humanity is responsible, through artificial selection, or selective breeding, for creating so many breeds of dogs with varying appearances, sizes, abilities, and personalities (notice the greyhound, bred for speed by humans, closely resembles the cheetah, bred for speed by natural selection). In the same way, we’ve radically changed crops like corn and farm animals like turkeys (who are now too large to have sex), and derived cabbage, broccoli, kale, cauliflower, and brussels sprouts from a single ancestral plant, to better sate our appetites, simply by selecting individuals with traits we favor and letting them reproduce.


Wild banana (below) vs. artificially selected banana. via NBC News

The evidence presented thus far should push open-minded thinkers toward the truth, but for those still struggling to make the jump from microevolution to evolution itself, we are not done yet. The resistance is understandable given that small changes can easily be observed in the lab or nature, but large changes require large amounts of time — thousands, millions of years — and thus we mostly (but not entirely) have to rely on the evidence from DNA, fossils, embryology, and so on. Here are some points of perspective that can bridge the gap between small changes and big ones.

1. Little changes add up. If you accept microevolution, you accept that species can evolve to be smaller or bigger, depending on what helps them survive and reproduce. Scott Carroll studied soapberry bugs in the U.S. and observed some colonizing bigger soapberry bushes than normal; he predicted they would also grow larger, as larger individuals would be more successful at reaching fruit seeds. Over the course of a few decades, the bugs’ “beak” length grew 25%. That’s significant. Now imagine what could theoretically be done with more time. As Coyne writes, “If this rate of beak evolution was sustained over only ten thousand generations (five thousand years), the beaks would increase in size by a factor of roughly five billion…able to skewer a fruit the size of the moon.” This will never actually happen, of course, but it shows how small changes compound into dramatic results. Imagine traits other than size — all possible traits you can think of — changing at the same time, and evolution doesn’t sound so impossible.
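The compounding behind Coyne’s figure can be sketched in a few lines of Python. The 100-generation figure for “a few decades” is my own assumption for illustration (soapberry bugs produce multiple generations per year); everything else follows from the numbers quoted above.

```python
# Back-of-the-envelope compounding, in the spirit of Coyne's calculation.
# Assumption (mine): the observed 25% beak growth accumulated over
# roughly 100 generations.
observed_growth = 1.25        # 25% increase over the observed period
observed_generations = 100    # assumed generations in "a few decades"

# Per-generation growth factor implied by the observation
per_gen = observed_growth ** (1 / observed_generations)

# Sustain that same per-generation rate over ten thousand generations
long_run = per_gen ** 10_000

print(f"per-generation factor: {per_gen:.5f}")            # 1.00223
print(f"factor after 10,000 generations: {long_run:.2e}")  # ~4.91e+09
```

A growth of barely 0.2% per generation, sustained, yields roughly the factor of five billion Coyne describes; that is the whole point of “little changes add up.”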

2. Genes are genes. This relates closely to the point above. If some genes can mutate, why can’t others? Genes determine everything about every creature. People who believe in microevolution accept that genes for size or color can change, but not genes for where your eyes are, whether you’re warm- or cold-blooded, whether you have naked skin or a thick coat of fur, whether you have a hoof or a hand, and so on. But there is no scientific basis whatsoever for this dichotomy of the possible. It’s simply someone claiming “These genes can mutate but not these, end of story” to protect the idea of intelligent design. Genes are genes. They are all simply sequences of nucleotides. As far as we know, no gene is safe from mutation.
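The “genes are genes” point can be made concrete with a toy example: a gene is just a sequence of the nucleotides A, C, G, and T, and a single-letter substitution can change which amino acid a codon encodes. The three codons below are a real subset of the standard genetic code; the rest is an illustrative sketch of my own.

```python
# Toy demonstration: one base substitution in a codon changes the
# amino acid it encodes. This tiny table is a subset of the standard
# genetic code.
CODON_TABLE = {
    "GGC": "Glycine",
    "GAC": "Aspartate",
    "GCC": "Alanine",
}

def translate(codon: str) -> str:
    """Map a three-letter codon to its amino acid (or 'unknown')."""
    return CODON_TABLE.get(codon, "unknown")

original = "GGC"
mutated = original[0] + "A" + original[2]  # substitute the middle base: GGC -> GAC

print(translate(original))  # Glycine
print(translate(mutated))   # Aspartate
```

There is nothing in the chemistry that marks some positions in a genome as mutable and others as off-limits; a substitution in a “size” gene and a substitution in a “body plan” gene are the same kind of event.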


Octogoat, a goat with eight legs, born in Kutjeva, Croatia. via ABC News

3. Mutations can be huge. We’ve seen how humans can have tails, but we also see “lobster claw hands,” rapid aging, extra limbs, conjoined twins, and other oddities. Consider other mutations: snakes with two heads, octopuses with only six arms, ducks with four legs, cats with too many toes. For the common fruit fly, the antennapedia mutation means legs grow where the antennae are supposed to be! Dramatic mutations are possible. Survival is possible. Passing on new, weird traits is possible. With evolution, sometimes groups with new traits totally displace and eliminate the ancestral groups; sometimes they live side-by-side going forward. If you came across a forest and discovered one area was occupied by two-headed snakes and another by single-headed snakes, all other traits being the same, wouldn’t you be tempted to call them different species? Declare something new had arisen on Earth?

4. We are currently watching evolution occur. Scientists have observed speciation. They’ve taken insects, worms, and plants, put small groups of them in abnormal environments for many generations, and then seen they can no longer reproduce with cousins in the normal environments because they have evolved. It’s easy to create new species of fruit flies in particular because their generations are so short. Evolution for other species is typically much slower, but significant changes are being observed.

Say you were instead on the African savanna and came upon two groups of elephants. They are the same but for one startling difference: one group has no tusks. Like two-headed snakes, what a bold difference in appearance! Should we classify them as different species or the same? (Technically, they aren’t different species if they can still produce offspring together, but in the moment you aren’t sure.) Well, African elephants are increasingly being born without tusks. After all, those without are less likely to be killed by poachers for ivory. This is natural selection at work. Could not a changing environment and millions of years change more? Size, color, skin texture, hair, skeletal layout, teeth, and all other possible traits determined by all other genes?

Next, take a remarkable experiment involving foxes launched by Dmitry Belyaev and Lyudmila Trut in the Soviet Union in the late 1950s, which Trut is still running to this day. No, we can’t watch a species for 500,000 years to see dramatic evolution in action. But 60 years gives us something.

At the time, biologists were puzzled about how dogs evolved coats so different from those of wolves, since it was unclear how dogs could have inherited the responsible genes from their ancestors. Belyaev saw silver foxes as a perfect opportunity to find out. He believed the key factor being selected for was not morphological (physical attributes) but behavioral; more specifically, he believed tameness was the critical factor.

In other words, Belyaev wanted to see if foxes would undergo changes in appearance if they evolved different behaviors. So Belyaev and Trut set about taming wild silver foxes.


Wild silver fox. via Science News

They took their first generation of foxes (which were only given a short time near people) and simply allowed the least aggressive to breed. They repeated this with every generation. They had a control group that was not subjected to selective breeding.

The artificial selection of course succeeded in changing fox behavior. The foxes became much more open to humans, whining for attention, licking them, and wagging their tails when happy. But there was more:

A much higher proportion of experimental foxes had floppy ears, short or curly tails, extended reproductive seasons, changes in fur coloration, and changes in the shape of their skulls, jaws, and teeth. They also lost their “musky fox smell.”

Spotted coats began to appear. Trut wrote that skeletal changes included shortened legs and snouts as well. Belyaev said they started to sound more like dogs (Dawkins). Geneticists are now seeking to isolate the genes related to appearance that changed when selectively breeding for temperament.

Belyaev was right. And his foxes, through evolution, came to look more and more like dogs. This is the same kind of path that some wolves took when they evolved into dogs (less aggressive wolves would be able to get closer to humans, who probably started feeding them, aiding survival; tameness increased and physical changes went with it).

If such changes can occur in just 60 years, imagine what evolution could do with a hundred million years.


Dr. Lyudmila Trut with domesticated silver fox. via WXXI

In the Beginning

It’s true, scientists are still unsure how life first arose on Earth. Because it is an enduring mystery without hard evidence, scientists who offer hypotheses openly acknowledge that they are speculating. Note the big difference from evolution, about which scientists speak confidently due to the wealth of evidence.

But one professor at MIT believes that far from being unlikely, nonliving chemicals becoming living chemicals was inevitable.

From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat… When a group of atoms is driven by an external source of energy (like the sun or chemical fuel) and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate increasingly more energy. This could mean that under certain conditions, matter inexorably acquires the key physical attribute associated with life.

Researchers have discovered lipids, proteins, and amino acids beneath the seafloor, suggesting the chemical interaction between the mantle and seawater could produce the building blocks of life. From there, time and proper conditions could give rise to the first self-replicating molecule. Evolution would then continue on, spending billions of years developing the diverse flora and fauna we see today (a single cell leading to complex life under the right conditions should not be so shocking; as J.B.S. Haldane said, “You did it yourself. And it only took you nine months”).

Determining precisely how the first cell arose is the next frontier of evolutionary biology, and it is exciting to be here to witness the journey of discovery. New findings and experiments will wipe away “watchmaker” arguments used against the first cell. They will once again crush the “God gap,” the bad habit of the faithful of filling gaps in our scientific knowledge with divine explanations. I imagine in our lifetime someone will finish what Stanley Miller’s famous 1950s experiment started, recreating the Earth’s early conditions to produce life itself.

Yet lack of knowledge concerning the beginning of life in no way hurts the case for evolution. Evolution is proven, as definitively as the fact that the Earth orbits the sun.

For more from the author, subscribe and follow or read his books.

Why I Am Not a Communist (Nor an Anarchist)

Having criticized the authoritarian communist states that arose in the 20th century, in particular the Bolsheviks in Russia for crushing worker power, and having also explored the basic tenets of anarchism (and how it is the father of the blasphemous bastard child that is anarcho-capitalism/libertarianism), I wanted to devote some time to musing over the merits of communism and anarchism relative to socialism.

While all anti-capitalist, these ideologies are not the same and should not be confused. I therefore include basic outlines (leaving out many different subtypes of each) before considering their relative advantages and downsides. I attempt to present each in its most ethical, idealized form (most free, most democratic, and so forth). Criticisms of ideologies should not be mistaken for disrespect for my Marxist comrades who think differently.

Communism destroys capitalism from the top down. The government, as an instrument of the people, owns all workplaces and organizes the economy and the workers according to a central plan that meets citizen needs. Under this system, competition can be wholly and more easily eliminated, making the enormous pressure to put profits over people a thing of the past. Wasteful and redundant production goes away with it, meaning more workers and resources for more important tasks that build a better society (for example, no more energy and billions spent on advertising, instead diverted to education). Further, the national wealth can be easily divided up among the people, public sector salaries enriching all.

However, communism entails enormous challenges. It surely requires giving up the full freedom to choose your line of work – if your community or national plan only allows for a certain number of bookstores or bookstore workers, there may not be room for you. You would be rejected upon applying with the local or national government to open a new bookstore (as you would surely have to do for a plan, and thus communism, to function) or upon applying for a job at an established bookstore. Under communism, workers are supposed to “own” their workplaces because they “own” the State, but this is a rather indirect form of control that leaves some people wanting. You may have options regarding the work you do, but you will have to sacrifice your interests for the sake of the plan.

Of course, as long as you don’t find yourself under authoritarian communism, you would help decide the plan, at the ballot box. But how much would you help? That raises a second challenge: can communism function without representative government (or a worse concentration of power)? A common notion is that the workers, the people, would elect members of their worker councils to participate in the design and execution of the national plan (or elect representatives from their geographic community, as is done in politics today). So if you worked in auto manufacturing while waiting for a bookstore job to open up, you would run or elect someone for the honor and task of representing the American Auto Workers Council on the National Planning Committee. The representatives, using a broad array of data on what goods and services are needed where, and what resources and workers will be needed to create and distribute them, would craft a central plan for a certain number of years.

Can this enormous power be socialized further? We understand the risks of representative governance – concentrated power is more easily influenced and corrupted, and doesn’t give people a direct say over their destinies. Even with the disappearance of capitalist businesses, a small group of decision-makers would still face enormous pressures from countless localities, people, and organizations. We could see to it that the people have a direct up or down vote on the plan after the representatives craft it (or other checks and balances). But eliminating a representative structure entirely seems impossible. Imagine the daunting task of voting on how much corn the U.S. should grow in a given three-year period. On how many more workers are needed to produce a higher number of EpiPens. On how many homes should be built in a city on the other side of the country. (It very much seems that you must make this vote on national matters, rather than simply voting on what your local community needs. If each municipality democratically decided what they needed, these decisions would have to be reconciled at the national level, as there may not be the resources to do everything every community decides to do. Like the would-be bookstore worker, some communities will not get what they wanted, making the vote a sham. And, naturally, trying “communism” at local levels, where communities can only use the workers and resources within their communities, leaves massive inequities between regions. It might be possible to instead divide up the national wealth to each region somewhat according to its need and then let each decide how to use its allotted funds, but how much each city or town should get would also be impossible to sensibly sort out using direct democracy.)

Organizing an economy is a monumental task requiring mountains of accurate, up-to-date data. How difficult for an elected body of experts – a full-time job with a high risk of costly mistakes and turmoil. Can workers devote the time and study to make educated decisions on what to produce, their quantities, prices, and required manpower and resources, for an entire country? Would not voting itself, on thousands or hundreds of thousands of economic details, take days, weeks, or months? And if the people cannot be expected to plan the economy via direct vote, how can they be expected to make an informed up-down vote on a plan formulated by others? There seems to be no escaping representative government with communism. These challenges suggest this system may not be preferable.

Anarchism does away with capitalism from the bottom up. Workplaces would be owned and run by workers and would federate to coordinate activities rather than compete, and local communities would make all decisions democratically. The State, as a hierarchical structure like capitalism, would be abolished. In this way, people would be as free as possible from compulsion, authority, and concentration of power, enjoying individual freedoms as long as they do not hurt others. You’d have equal power to make decisions that affect you, joining in your local citizen assembly and worker council. Anarchism harkens back to the era of “primitive communism” we explored elsewhere.

Anarchists have differing views on whether capitalism can be dismantled after the State. Does the State have a vital role to play in capitalism’s eradication? H.G. Wells, among others, thought only socialism could make anarchism possible:

Socialism is the preparation for that higher Anarchism; painfully, laboriously we mean to destroy false ideas of property and self, eliminate unjust laws and poisonous and hateful suggestions and prejudices, create a system of social right-dealing and a tradition of right-feeling and action. Socialism is the schoolroom of true and noble Anarchism, wherein by training and restraint we shall make free men.[1]

The challenge with anarchism is that, like “local communism,” it leaves communities to fend for themselves, meaning poorer peoples beside richer ones. Unless, of course, communities worked together, sharing workers and resources, in a movement toward the integration of larger and larger units and the necessary joint administration (however democratic), weakening local control and journeying down the path toward what are essentially nations. Further, if you avoided that, while a spirit of human oneness could indeed rise with the disappearance of nations, one wonders what is to stop factionalism based on community identity. Are pride and loyalty to a neighborhood, town, or city not predictable? One worries about true global solidarity. In the same vein, individual anarchist communities seem vulnerable to rivalry and conflict, especially if they differ in wealth, habitability, and so on. It all sounds a bit like the city-states of ancient Greece, albeit less capitalistic and more democratic. At the least, such a world seems more prone to conflict than one with a single government spanning all continents and meeting the needs of all people. Some form of State may be preferred for its ability to protect people.

Skeptics of anarchism may also see that statement as the answer to the question of crime, which, though it would be greatly reduced, is not likely to disappear entirely with the abolition of poverty (think of crimes of passion over infidelity, for instance). Yet anarchists typically despise the police – the personification of force, authority, and State violence. Can the police be made a thing of the past?

Socialist George Orwell wrote, “I worked out an anarchistic theory that all government is evil, that the punishment always does more harm than the crime and the people can be trusted to behave decently if you will only let them alone.” But he concluded, “It is always necessary to protect peaceful people from violence. In any state of society where crime can be profitable you have got to have a harsh criminal law and administer it ruthlessly.”[2]

Here Orwell lacks nuance and vision – of community policing, proportionate punishment, restorative justice, rehabilitation, and so on – which do not require a State; they can be done on an intimate, local level. Skeptics can rest easy on this point. The relevant task of anarchism (and socialism or communism) is to build a more humane, peaceful, fair criminal justice system that does not morph into what came before.

Then there’s socialism. “I should tie myself to no particular system of society other than that of socialism,” as Nelson Mandela would say.[3] Socialism also eliminates capitalism from the bottom up. As under anarchism, workers collectively own their workplaces, making decisions democratically and equitably sharing the profits of their labor, and such worker cooperatives can federate with each other to reduce competition and coordinate their creations and services. The State exists to serve various needs of the people, such as guaranteed healthcare and employment, and is in fact under the people’s direct democratic control (this was explored in detail in What is Socialism?). The problems with anarchism and communism can be avoided. Socialism is the human future.


Notes

[1] New Worlds for Old, H.G. Wells

[2] The Road to Wigan Pier, George Orwell

[3] 1964 court speech, Nelson Mandela. http://www.motherjones.com/politics/2013/12/nelson-mandela-epitaph-own-words-rivonia/

The Scope of False Sexual Assault Allegations

When conservatives are confronted by the rise of a “liberal” cause, many find and point to a small problem in order to discredit or divert attention from the immense problem liberals are attacking.

It’s an unhealthy mix of the whataboutism fallacy (citing wrongs of the opposing side instead of addressing the point) and the false equivalence fallacy (describing situations as equivalent [I’ll add “in scope”] when they are not). We observe this in discussions of racial violence, when many conservatives pretend hate crimes against whites are just as common as hate crimes against people of color; see “On Reverse Racism.”

Lately, the fallacy was on full display as high-profile men across the country were accused of sexual assault and harassment, many fired or urged to resign. In this frenzy of allegations, some Americans see and cheer a surge in bravery and collective solidarity among victims inspired by each other and seeking justice, while others see and decry a male “witch hunt,” with evil women growing more bold about their lies, perhaps on the George Soros payroll. Where you land is a fairly decent predictor of your political views. Who was accused also determined for many which women to believe, with some conservatives supporting Republican Roy Moore through his scandal involving the rape of underage girls but attacking Democrat Al Franken for groping women. Sadly, some liberals did the reverse. I witnessed a left-leaning acquaintance or two trying to discredit accusations against Franken, for which he publicly apologized, by slandering the victims. Still, it is typically conservatives (often sexually frustrated men) who, when they encounter liberals talking about rape, sexual assault, sexual harassment, toxic masculinity, and so forth, bring up false rape accusations.

One comment on a mediocre article Men’s Health shared on how to make sure you have consent from a woman typified this. There were of course countless like it, many poorly written: “And remember if she regrets it the next day you’re still fucked”; “I bring my attorney and a notary on all dates and hook ups”; “There’s no such thing as consent anymore, it’s a witch hunt. Just say no gentleman”; “Don’t forget guys… If you have drank 12 drinks and she has 1 sip of beer…… You raped her.” And still more angry with the article’s existence: “Men’s health turning into click bate leftist agenda”; “Did a feminist write this?”; “Did a woman write this?” It’s sad consent is a liberal, feminist scheme. But this comment got much attention and support, likely because people found it thoughtful and measured for some odd reason:

This is a touchy subject. Yes, respect women—We all know that. Have a woman’s consent—Yes, we all know that. Do not rape or sexually assault a woman—Yes we all know that. We respect the rules. However, there are some women that exploit and take advantage of the rules. It’s sad to say, there are some out there that falsely accuse a man of rape or sexual assault—ruining their lives. Being a man in today’s era, I’m afraid to ask a woman on a date. I feel sometimes a man needs a contract just to protect himself. Yes, this might sound objectionable and supercilious—but you can’t be too careful nowadays. We live in a different time now. Men: We need to change our attitudes and treatment of women. However, it’s okay that we protect ourselves—and we shouldn’t be demonized or vilified for doing so. I don’t want to be viewed or portrayed as the enemy, nor be apologetic for being a man.

An amusing piece of writing. “We all know” not to rape, assault, or harass women? If the collective male “we” legitimately “knew,” such things would be a thing of the past and a primer on consent unnecessary. “We live in a different time” where men are “afraid to ask a woman on a date”! If you’re going to “protect” yourself in some way, you wouldn’t be “demonized” for actually getting consent in some formal sense; only if you used illegal and unethical methods to “protect” yourself, like the secret filming of sex. And where are these women asking men to apologize for being men, rather than for specific behaviors or attitudes that make them uncomfortable, scared, unsafe, or physically violated?

This is a perfect example of the fallacy above. “Men sexually assault women and shouldn’t, but what about the women who make false accusations?” The latter part is clearly his main concern — he didn’t stop by to condemn rapists, he came with another purpose. They may not intend to or even realize it (some do), but when men (or women) do this they position false reports as a problem of the same significance, or nearly the same significance, as actual sex crimes. As if the scope, the prevalence, is comparable. That’s what taking a conversation on consent and redirecting it to one of false accusations does. It says, “This is what’s important. It’s what we should be talking about.” It’s like bringing up asthma when everyone’s discussing lung cancer. It deflects attention away from a problem that is much more severe. It’s a subtle undermining of the credibility of rape victims. It’s not wrong to discuss small problems, of course, but they should always be kept in perspective. It’s my view that comments about hate crimes against whites or false accusations against men should never be made without the enormous asterisk that these are minuscule percentages of overall hate and sex crimes. In that way, we can think about others first. We can protect the credibility of real victims. We can remain rooted in the facts — not imply a small problem is large, or vice versa. Naturally, including those caveats undermines the usual function of bringing up these issues, but no matter.

Yes, lying about sex crimes is an issue that exists. Yes, there should be some legal punishment for such an immoral act (not anywhere near the punishment for sexual assault and harassment, obviously, because these are not in any way morally equivalent crimes). Yes, people are innocent until proven guilty, which is why men are safe from prison until they see their day in court, even if they face social consequences like losing a job due to presumed guilt — which you can oppose on ethical grounds, though on less stable ground than you might hope, especially when a man is accused by a coworker, family member, or someone else in close proximity. Is it most ethical to oppose a firing until a trial and risk keeping a rapist around the workplace? Putting others in danger? Forcing a victim to clock in next to him each day? Or is it most ethical to fire him and risk tearing down the life of an innocent man? It’s an unpleasant dilemma for any employer, university administrator, or whomever, but ethically there’s not much question. One risk is far graver, thus the answer is simple. This only grows more axiomatic when we acknowledge the likelihood of events.

The prevalence of proven false accusations of sexual assault is somewhere between 2% and 8% of cases. The National Sexual Violence Research Center documents a 2006 study of 812 cases that found 2.1% were false reports, while a 2009 study of 2,059 cases and a 2010 study of 136 cases estimated 7.1% and 5.9%, respectively. Research from 2017 revealed a 5% false-claim rate for rape. The Making a Difference Project, using data from 2008, estimates 6.8%. These numbers are mirrored in prior American decades and in similar countries. While we can acknowledge that some innocent people in prison never see justice and are never set free, since 1989 only 52 men have been released from prison after it was determined their sexual assault charges were based on lies. Compare this to 790 murder exonerations; the number of people in state prisons for murder versus sexual assault/rape is about the same (though the former crime is far less common than the latter), making the low exoneration rate for sex-crime convictions all the more significant.

Myriad definitions of both “false report” and “sexual assault” make the precise percentage difficult to nail down, and these statistics only address proven false reports (there are many cases in limbo, as conservative writers are quick to point out), but this research gives us a general idea. Reports of high percentages of false claims are typically not academic studies or have rather straightforward explanations, for example when Baltimore’s “false claim” rate plunged from 31% to under 2% when the police actually went through some training and “stopped the practice of dismissing rapes and sexual assaults on the scene”! It’s remarkable how legitimate investigations and peer-reviewed research can bring us closer to the truth.

In other words, when observing any sexual misconduct scandal, there is an extremely high chance the alleged victim is telling the truth. This is why we believe women. This is why they, not accused men, should be given the benefit of the doubt. It’s why the moral dilemma for employers and the like is hardly one at all. Were precisely 50% of sexual assault allegations lies, it would still be most ethical to take the risk of firing a good man rather than the risk of keeping a predator around. But since women are almost always telling the truth? Well, the decision is that much easier.

In the U.S., there are some 321,500 rapes and sexual assaults per year, and 90% of adult victims are women (you’ve probably noticed how “men are raped too” is used in a similar manner to all this). One in six women are rape or attempted rape survivors. For every 1,000 rapes, 994 perpetrators (99%) will never go to prison.

For more from the author, subscribe and follow or read his books.

Which U.S. Wars Actually Defended Our Freedoms?

When pondering which of our wars literally protected the liberties of U.S. citizens, it is important to first note that war tends to eradicate freedoms. Throughout U.S. history, war often meant curtailment of privacy rights (mass surveillance), speech rights (imprisonment for dissent), and even the freedom to choose your own fate (the draft).

It also should be stated upfront that this article is only meant to address the trope that “freedom isn’t free” — that military action overseas protects the rights and liberties we enjoy here at home (even if virulent bigotry meant different people had very different rights throughout our history and into our present). It will not focus on the freedoms of citizens in other nations that the U.S. may have helped establish or sustain through war, nor non-American lives saved in other countries. However, it will address legitimate threats to American lives (such a right to life is not de jure, but expected).

As a final caveat, I do not in any way advocate for war. That has been made exceptionally clear elsewhere. While violence may at times be ethically justified, in the vast majority of cases it is not, for a broad array of reasons. So nothing herein should be misconstrued as support for imperialism or violence; rather, I merely take a popular claim and determine, as objectively as possible, if it has any merit. To a large degree I play devil’s advocate. To say a war protected liberties back home is not to justify or support that war, nor violence in general, because there are many other causes and effects to consider which will go unaddressed.

In “A History of Violence: Facing U.S. Wars of Aggression,” I outlined hundreds of American bombings and invasions around the globe, from the conquest and slaughter of Native Americans to the drone strikes in Yemen, Pakistan, Somalia, and elsewhere today. It would do readers well to read that piece first to take in the scope of American war. We remember the American Revolution, the Civil War, the World Wars, Korea, Vietnam, Iraq, and the War on Terror. But do we recall our bloody wars in Guatemala, Haiti, Mexico, and the Philippines? Since its founding in 1776, 241 years ago, the United States has been at war for a combined 220 years, as chronicled by the Centre for Research on Globalization (CRG). 91% of our existence has been marked by violence.

How many of those conflicts protected the liberties of U.S. citizens? How many years did the military literally defend our freedoms?

Well, what precisely is it that poses a threat to our freedoms? We can likely all agree that what qualify as freedoms are 1) rights to actions and words that can be exercised without any retribution, guaranteed by law, and 2) the total avoidance of miseries like enslavement, imprisonment, or death. Thus, a real threat to freedom would require either A) an occupation or overthrow of our government, resulting in changes to or violations of established constitutional liberties, or B) invasions, bombings, kidnappings, and other forms of attack. If you read the article mentioned above, it goes without saying the U.S. has much experience in assaults on the freedoms of foreign peoples. Much of our violence was the violence of empire, with the express and sole purpose of seizing natural resources and strengthening national power.

So what we really need to ask is: how close has the U.S. come to being occupied, or its citizens attacked? How many times has either actually occurred? We must answer these questions honestly. Should it be said that fighting Native American or Mexican armies protected freedom? No, the only reason our nation exists is that Europeans invaded their lands. We will include no war of conquest, from our fight with Spain over Florida to our invasion of Hawaii. We killed millions of innocent people in Vietnam. Were they going to attack America or Americans? No, we didn’t want the Vietnamese to (democratically) choose a Communist government. Now, you can believe that justifies violence if you wish. But the Vietnam War had nothing to do with defending our freedoms or lives. Neither did our invasion of Cuba in 1898. Nor our occupation of the Dominican Republic starting in 1916. Nor our wars with Saddam’s hopelessly weak Iraq. Nor many others.

Using these criteria, my answer to the titular question is that only four wars, representing 19 years, could reasonably meet Qualification 1 (some also meet the second qualification). These conflicts protected or expanded our liberties by law:

The American Revolution (1775-1783): While the Revolution was partly motivated by Britain’s moves to abolish slavery in its colonies, it did expand self-governance and lawful rights for white male property-holders.

The War of 1812 (1812-1815): While U.S. involvement in the War of 1812 had imperialist motives (expansion into Indian and Canadian territories) and economic motives (preserving trade with Europe), Britain was kidnapping American sailors and forcing them to serve on their ships (“impressment”). This war might have simply been included below, in Qualification 2, except for the fact that Britain captured Washington, D.C., and burned down the Capitol and the White House — the closest the U.S. has ever come to foreign rule.

The Civil War (1861-1865): Southern states, in their declarations of secession, explicitly cited preserving slavery as their motive. Four years later, slavery was abolished by law. Full citizenship, equal protection under the law, and voting rights for all men were promised, if not given.

World War II (1941-1945): The Second World War could also have simply been placed in Qualification 2 below. Beyond freeing Southeast Asia and Europe from the Axis, we would say the U.S. was protecting its civilians from another Pearl Harbor or from more German submarine attacks on trade and passenger ships in the Atlantic. Yet it is reasonable to suppose the Axis also posed a real threat to American independence, the only real threat since the War of 1812.

Had Germany defeated the Soviet Union and Britain (as it might have without U.S. intervention), establishing Nazi supremacy over Europe, it is likely its attention would have turned increasingly to the United States. Between the threat of invasion from east (Germany) and west (Japan), history could have gone quite differently.

German plans to bomb New York were concocted before the war; Hitler’s favorite architect described him as eager to one day see New York in flames. Before he came to power, Hitler saw the U.S. as a new German Empire’s most serious threat after the Soviet Union (Hillgruber, Germany and the Two World Wars). Some Japanese commanders wanted to occupy Hawaii after their attack, to threaten the U.S. mainland (Caravaggio, “‘Winning’ the Pacific War”). After Pearl Harbor, the U.S. did not declare war on Germany; it was the reverse. Japan occupied a few Alaskan islands, shelled the Oregon and California coasts, dropped fire balloons on the mainland, and planned to bomb San Diego with chemical weapons. Germany snuck saboteurs into New York and Florida. The Nazis designed their A-9 and A-10 rockets to reach the U.S., and their “Amerika Bomber” initiative produced designs for new long-range aircraft, including one, the Silbervogel, that could strike the U.S. from space. Hitler once said, “I shall no longer be there to see it, but I rejoice on behalf of the German people at the idea that one day we will see England and Germany marching together against America.” While an Axis invasion of the United States is really only speculation, it has some merit considering their modus operandi, plus an actual chance at success, unlike other claims.

19 years out of 220 is just 8.6% (we’ll use war-time years rather than total years, erring on the side of freedom).

Qualification 2 is harder to quantify. U.S. civilians in danger from foreign forces is a far more common event than the U.S. Constitution or government actually being in danger from foreign forces. We want to include dangers to American civilians both at home and overseas, and include not just prolonged campaigns but individual incidents like rescue missions. This will greatly expand the documented time the military spends “protecting freedom,” but such time is difficult to add up. Many military rescue operations last mere weeks, days, or hours. The Centre for Research on Globalization’s list focuses on major conflicts. We’ll need one that goes into detail on small-scale, isolated conflicts. We’ll want to look not just at the metric of time, but also the total number of incidents.

But first, we will use the CRG list and its year-based metric to consider Qualification 2. The following wars were meant, in some sense, to protect the lives of U.S. citizens at home and abroad. They do not meet the first qualification. Conflicts listed in Qualification 1 will not be repeated here. Five wars, representing 36 years, meet Qualification 2:

The Quasi-War (1798-1800): When the United States refused to pay its debts to France after the French Revolution, France attacked American merchant ships in the Mediterranean and Caribbean.

The Barbary Wars (1801-1805, 1815): The United States battled the Barbary States of Tripoli and Algiers after pirates sponsored by these nations began attacking American merchant ships.

The Anti-Piracy Wars (1814-1825): The U.S. fought pirates in the West Indies, Caribbean, and Gulf of Mexico.

World War I (1917-1918): The Great War nearly found itself in Qualification 1. After all, Germany under Kaiser Wilhelm II made serious plans, in the 1890s, to invade the United States so it could colonize other parts of Central and South America. During World War I, Germany asked Mexico to be its ally against the U.S., promising to help it regain territory the U.S. stole 70 years earlier. However, invasion plans evaporated just a few years after 1900, and Mexico declined the offer. The Great War appears here for the American merchant and passenger ships sunk on their way to Europe by German submarines (not just the Lusitania).

The War on Terror (1998, 2001-2017): It is very difficult to include the War on Terror here because, as everyone from Osama bin Laden to U.S. intelligence attests, it’s U.S. violence in the Middle East and Africa that breeds anti-American terror attacks in the first place. Our invasions and bombings are not making us safer, but rather less safe by widening radicalism and hatred. However, though this predictably endless war is counterproductive to protecting American lives, it can be reasonably argued that that is one of its purposes (exploitation of natural resources aside) and that killing some terrorists can disrupt or stop attacks (even if this does more harm than good overall), so it must be included.

36 years out of 220 is 16.4%. Together, it could be reasonably argued that 25% of U.S. “war years” were spent either protecting our constitutional rights from foreign dismemberment or protecting citizen lives, or some combination of both.

But we can also look at the total number of conflicts this list presents: 106. Four wars out of 106 is 3.8%, another five is 4.7%. Let’s again err on the side of freedom and split the Barbary and Terror wars into their two phases, making seven wars for 6.6%. Adding 3.8% and 6.6% gives us 10.4% of conflicts protecting freedom.
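The conflict-count arithmetic above can be sketched in a few lines as a quick check, using only the figures quoted in the text (106 total conflicts on the CRG list; 4 meeting Qualification 1; 5 meeting Qualification 2, counted as 7 when the Barbary and Terror wars are split into their two phases):

```python
# Percentages of CRG-listed conflicts meeting each qualification.
total_conflicts = 106
qual_1 = 4          # wars protecting constitutional liberties
qual_2_split = 7    # 5 wars, with Barbary and Terror wars counted twice

pct_1 = round(100 * qual_1 / total_conflicts, 1)
pct_2 = round(100 * qual_2_split / total_conflicts, 1)

print(pct_1)                     # 3.8
print(pct_2)                     # 6.6
print(round(pct_1 + pct_2, 1))   # 10.4
```

The rounding matches the percentages given in the paragraph above.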

Any such list is going to have problems. What does it include? What does it leave out? Does it describe the motivation or justification for violence? Does it do so accurately? Should recurring wars count as one or many? Does the list properly categorize events? This list labels U.S. forces violating Mexican territory to battle Native Americans and bandits as repeated “invasions of Mexico.” If Mexican forces did the same to the U.S., some of us would call it an invasion; others might rephrase. And couldn’t these incursions into a single nation be lumped together into a single conflict? Conversely, the list lumps scores of U.S. invasions and occupations of nearly all Central and South American nations into a single conflict, the Banana Wars — something I take huge issue with. The solution to issues like these is to either create a superior list from scratch or bring other lists into the analysis.

Let’s look at “Instances of Use of United States Armed Forces Abroad,” a report by the Congressional Research Service (CRS). It is a bit different. First, it includes not just major conflicts but small, brief incidents as well, and it’s smarter about lumping conflicts together (no Banana Wars, no Anti-Piracy Wars, though the U.S. incursions into Mexico to fight Native Americans and bandits are listed as one conflict). Thus, 411 events are documented. Second, even this is an undercount, as the list begins at 1798 rather than 1776. Third, unlike the first list, it does not include wars with Native Americans. This list is highly useful because the CRS is an agency of the Library of Congress, conducting research and policy analysis for the House and Senate, and thus its justifications for military action closely reflect official government opinion.

We will apply the same standards to this list as to the last. We’ll include the nine conflicts we studied above where the timeframe allows, as well as any events involving civilians, piracy, or counter-terrorism. We will thus tally the 411 events this way:

– 38 incidents/wars that involved “U.S. citizens,” “U.S. civilians,” “U.S. nationals,” “American nationals,” “American citizens,” etc.

– 9 incidents/wars related to “pirates” and “piracy” (does not include the rescue of U.S. citizen Jessica Buchanan, already counted above, nor Commodore Porter’s vicious 1824 revenge attack on the civilians of Fajardo, Puerto Rico, who were accused of harboring pirates)

– 6 official conflicts: the Quasi-War (“Undeclared Naval War with France”), two Barbary Wars, the War of 1812, and two World Wars (the Revolution does not appear on this list due to its timeframe; the Anti-Piracy Wars are included above, the War on Terror below)

+ 1 Civil War (it must be added, as it is not included on this list because it did not involve a foreign enemy)

– 27 incidents/wars related to combating “terrorism” or “terrorists”

That gives us 81 events that match Qualifications 1 and 2. 81 out of 412 is 19.7% — thus about one-fifth of military action since 1798 in some way relates to protecting Constitutional freedoms here at home or the right to life and safety for U.S. civilians around the globe. Of course, were we to only look at Qualification 1, we would have but three events — the War of 1812, the Civil War, and World War II — that preserved or expanded lawful rights, or 0.7% of our wars since 1798.
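As a check on the tally above, the category counts and the resulting share can be reproduced directly (the figures are those quoted from the CRS report in this article, not recomputed from the report itself):

```python
# Events on the CRS list matching Qualifications 1 and 2,
# grouped by the categories listed above.
category_counts = {
    "U.S. citizen/national incidents": 38,
    "piracy incidents": 9,
    "official conflicts": 6,
    "Civil War (added to the list)": 1,
    "terrorism incidents": 27,
}
matching = sum(category_counts.values())
total = 411 + 1  # adding the Civil War raises the list's 411 events to 412

print(matching)                          # 81
print(round(100 * matching / total, 1))  # 19.7
```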

The CRS list does not break some incidents down into periods shorter than years, and documenting those that last only days, weeks, or months is an enormous chore for a later day. Thus the estimate for time spent defending freedom will have to come from the CRG list: 25% of the time the military is active, it is involved in at least one conflict that is protecting freedom. Also, for some added information, there are 20 years on the CRS list, going back to 1798, without a new or ongoing incident. This is almost identical to the 21 years of peace since 1776 in the CRG analysis. So of the 219 years since 1798, we’ve spent 91% of our time at war, the same as the CRG list since 1776 (or trimmed to 1798).

(A list created by a professor at Evergreen State College goes from 1890-2017 and has five years of peace. We’ve been at war 96% of the time since 1890. It lists 150 conflicts, with only 3 having to do with rescues or evacuations of Americans [2%], 11 having to do with the War on Terror in Arabia and Africa in 1998 and after 9/11 [7.3%], plus World War I [0.6%]. That’s 9.9% for Qualification 2. Throw in another 0.6% for World War II, and thus Qualification 1, and we have 10.5% of conflicts since 1890 protecting freedom. Because this list begins so late, however, we will not use it in our averaging. Doing so would require us to trim the other lists to 1890, cutting out the piracy era, the Revolution, the Civil War, etc.)

Averaging the percentages from the two lists relating to total conflicts gives us 2.3% for Qualification 1 and 15% for Qualification 2. 17.3% all together. Trimming the CRG list to begin at 1798 yields about the same result.

In sum, it could be reasonably asserted that the U.S. military protects our freedoms and lives in 17.3% of conflicts. (If we take out the War on Terror for its deadly counter-productivity, which I would prefer, that number drops to 10.8%, with 17% of war years spent defending American freedom.)


Even Better Than ‘Angels in the Outfield’

Remember the movie Angels in the Outfield? It’s the classic story of Roger, a foster kid who prays for God to help the Angels win the pennant so that his dad will come back. (Sounds like one truly twisted deal, but Roger’s dad wasn’t at all serious. If we’re being honest, Roger seems old enough to have known about figurative language.)

If your memory is as decrepit as the cheap VCR tape of this movie in the box in your basement, this image may help:


Jesus, Roger looks uncomfortable in this picture. I don’t remember him being on the verge of tears in this scene. This looks like the beginning of an episode of Law and Order: SVU. CHUNG-CHUNG.

This is the scene in which Roger and his best buddy J. P. meet the indelibly cheerful Angels manager George Knox, who grows from skeptic to believer about the whole angels-playing-baseball thing (Roger is the only one that can see them). When Roger does see one, he’s like:


That’s where that hand motion comes from if you ever see people (me) doing this during a baseball game. The Royals once used the theme music to the movie when someone hit a home run, and I could never understand why I was the only one at Kauffman Stadium doing this while it played.

Also: That moment you realize Roger was played by Joseph Gordon-Levitt of 500 Days of Summer, Inception, and The Dark Knight Rises.


Angels in the Outfield is truly the greatest baseball movie of all time (bite me, Kevin Costner), therefore I in no way compare the Kansas City Royals to it casually. But without question, in every arena the Royals’ story rivals and surpasses Roger’s. This is such big news, I’m surprised more media attention hasn’t been paid to it.

KANSAS CITY’S PAIN IS GEORGE KNOX’S PAIN


George Knox hates to lose. Can any clip better represent the boiling rage lurking beneath the skin of every Royals fan, just waiting to detonate, through all the miserable seasons of the past years, when Kansas City was the laughingstock of Major League Baseball?

A clip of a nuke wouldn’t suffice. It has to be George Knox marching through a locker room of two dozen half-naked losers and absolutely destroying their fruit and meat platters. That is the pain Royals fans felt after every season–no, every game–before the Royals’ meteoric rise.

And this is Knox after becoming manager rather recently. Multiply this rage by 29 years, and you’ll understand Kansas City’s agony. There’s no comparison.

Even this bloody movie made us look like total twits. Why does this guy not slide? What is he doing?


MIRACLES CAN HAPPEN

Roger’s story is fictional, with fictional managers, ballplayers, and angels. At least, I hope angels don’t look like this:


Honestly, this angel looks like either the uncle you pray to God won’t sit next to you at Thanksgiving or the aunt that’s visibly ready to call your favorite music the work of Satan before you even tell her what it is. Not really sure which one at this point.

But the Royals’ story?

This isn’t a movie. And no players appear to defy physics as an angel lifts them into the air. It’s simply incredible baseball. It’s real life. That’s an important reason the Royals’ story is better.


Consider last year: Riding Jeremy Guthrie’s 7-inning shutout to beat the White Sox 3-1 on September 26, clinching their first playoff berth in 29 years. Four days later, staging a roaring comeback against the Oakland A’s in the do-or-die American League wild card game, down 3-7 but leveling the game in the 9th inning, eventually winning 9-8 in the 12th, after nearly 5 hours of play.

Sweeping both the American League division and championship series, earning the most consecutive wins in MLB postseason history. Making it to Game 7 of the World Series against the San Francisco Giants, but experiencing the most painful of defeats.

And this year: Winning their first American League Central title since 1985 on September 24 against the Mariners. On the brink of elimination in Game 4 of the AL division series against the Astros, down 4 runs in the 7th, then smashing in 5 runs in the 8th inning and piling on more in the 9th to win the game 9-6. They won the series in the next game.

Winning Game 6 of the AL championship series versus the Blue Jays, with Lorenzo Cain scoring from first base on Eric Hosmer’s single and closer Wade Davis shutting down the Blue Jays’ comeback threat with runners on first and third.

And last night, Game 1 of the 2015 World Series, versus the New York Mets. Alcides Escobar’s inside-the-park homer, the first in a World Series since 1929, the year the Great Depression began. Winning 5-4 after 14 innings, tying the record for the longest game in World Series history.

Could all this possibly be topped by the story of guys who only made it to the postseason with divine intervention in sparkling pajamas?


No. They’re cheaters.

Also, that’s Matthew McConaughey being picked up there. Swear to God. As he later said from the driver’s seat of a Lincoln, “Sometimes you’ve got to go back…”

Adrien Brody is also a ballplayer in this movie. McConaughey, Gordon-Levitt, Brody, Danny Glover, Tony Danza, Christopher Lloyd…seriously, is there anyone this film doesn’t have?

THE SMARMY SPORTSCASTER

It has Jay O. Sanders, actor and apparent concept art model for Mr. Incredible. He plays Ranch Wilder.

Roger and George Knox had to deal with Ranch Wilder, the “voice of the Angels,” who makes it clear throughout the film he very much wants the Angels to lose. He hates George Knox and is constantly being a Debbie Downer about the Angels’ postseason prospects.


Royals fans get Joe Buck.


Buck took a lot of heat during the 2014 World Series for what Royals fans perceived to be bias, in support of the Giants…and one pitcher in particular.

Ranch Wilder got fired. Buck is still going strong, back to call this 2015 World Series.

This just makes a better story. No one really seemed to mind Ranch Wilder’s Angel-bashing in the film. He was only fired because he left his mic on when he really went berserk.

But Kansas City’s story has more conflict, more passion and intrigue. Buck is back, and a lot of KC fans are enraged, enough to start petitions and even call the games themselves.

THAT ONE FAN THAT GETS A LOT OF SCREEN TIME AND NO ONE KNOWS WHY


Remember this guy? He’s that one fan in the crowd the movie focuses on, and likely the only human who has ever needed to professionally wax the sides of his neck.

He thinks Roger is crazy for seeing angels, he accidentally sits on Christopher Lloyd’s angel character, takes a baseball in the mouth, and at one point screams, “Hemmerling for Mitchell?! Go back to Cincinattiiiiiiii!” Classic quote.

Why is he always on screen? Why does he get so much attention? Why is that so obnoxious? In a way, he’s kind of the movie’s version of…of…


Marlins Man.

This mysterious and no doubt totally loaded figure has been spotted behind home plate throughout this postseason and the one in 2014, and works his way to other sports championships as well.

Always on screen, he is the one fan that gets any attention. He gets national attention! Yes, he donates a ton of money to charities, but what of the other 37,000 people in the stands? What about their stories? He leaves them in the dust.

It’s all an intentional thing. He picks his seat so he can be on camera. He loves to rep his completely irrelevant team, which has hopefully fired its graphic design staff by now.

Because he’s desperate as a toddler for attention, I think he successfully one-ups the blowhard from Angels in the Outfield. And anyone who disagrees with me is, to quote J. P., a “Nacho Butt.”

PUSHING THROUGH THE LOSS OF A PARENT

As mentioned, Roger is a foster kid. About two-thirds into the movie, his deadbeat dad–the same one who said if the Angels won the pennant he and Roger could “be a family again”–abandons Roger for good.

“Sorry, boy,” Dad of the Year says as Roger rushes up to him, excited to tell him about how well the Angels are doing. Dad pats Roger on the cheek and walks away, leaving Roger to try to croak out “Where are you going?” before he begins to weep.


If you’re a kid from a stable home watching this movie, it truly affects you, seeing someone your own age abandoned by his father. Not to mention Roger’s mother died, as did J. P.’s dad. Their stories are fictional, yet you know in the back of your mind while watching that millions of children experience abandonment, foster care, or homelessness, or have parents deceased or in jail. The movie, unlike the vast majority of children’s films, makes you think about the suffering of others and how to persevere through pain.

And if a fictional story about this is powerful, how much more so is real life?

Sadly, three Royals lost a parent this season.

Mike Moustakas lost his mother Connie on August 9, while Chris Young lost his father Charles on September 26. As reported by The Kansas City Star, Young pitched the next day to honor his dad, and went 5 innings without giving up a hit.

Edinson Volquez pitched last night, in Game 1 of the World Series. His father Daniel died just before the game, and Volquez’s family requested that Royals manager Ned Yost not tell Volquez until after he pitched.


In other words, the world knew of Volquez’s father’s death before Volquez.

Through all this, the Royals have persevered. Moustakas said after the game, “For all the stuff that’s happened this year, to all of our parents…it has to bring you closer together.”

Eric Hosmer said, “It’s just another angel above, just watching us and behind us through this whole run.”

A HAPPY ENDING?

The Angels in the movie won the pennant (we’re kind of left to wonder about the World Series). Roger and his best friend J. P. get adopted by George Knox and live happily ever after.

I don’t know if Ned Yost will adopt any players, nor if the Royals will finally, after 3 decades, win it all. But there is one thing I know to be true, that applies to touching movies and real life alike:

“It could happen.”

Still from Angels in the Outfield (1994), with Milton Davis Jr., Danny Glover, and Joseph Gordon-Levitt. (c) Buena Vista
