Is There Any Actual Science in the Bible?

Someone once told me that the bible was the greatest work of science ever written. This is mildly insane, as anyone who has read the bible knows that any grade school science book, even a children’s science book, presents more scientific knowledge. (And, given thousands of extra years of research, it’s probably more accurate.) The purpose of the bible, secularists and believers can surely agree, was not to acknowledge or pass down scientific principles. Finding incredible scientific truths in the text typically requires very loose interpretations. But as religious folk sometimes point to science in the bible as proof of its divine nature, it seems necessary to critically examine these claims.

In making the case that “every claim [the bible] makes about science is not only true but crucial for filling in the blanks of our understanding about the origin of the universe, the earth, fossils, life, and human beings,” Answers in Genesis points to verses that vary in ambiguity, meaning some are more plausible than others as presentations of valid scientific information.

Take Job 26:7, in which it is said God “spreads out the northern skies over empty space; he suspends the earth over nothing.” One may wonder what it means to spread skies over empty space. Perhaps it’s referencing the expanding universe, as others think verses like Job 9:8 do (“He alone spreads out the heavens”). But the second part matches well with what we know today: the globe isn’t sitting on the back of a turtle or anything else. Why this and other verses may not be as incredible as supposed is discussed below.

(It’s also often asserted that the Big Bang proves the bible right in its mention of a “beginning,” but we simply do not know for certain that nothing existed before the Big Bang.)

Answers in Genesis also believes the bible describes the water cycle. “All streams flow into the sea, yet the sea is never full. To the place the streams come from, there they return again,” reads Ecclesiastes 1:7. It also provides Isaiah 55:10: “The rain and the snow come down from heaven, and do not return to it without watering the earth and making it bud and flourish…” Some translations (such as NLT, ESV, and King James) are missing “without,” instead saying the rains “come down from heaven and do not return there but water the earth, making it bring forth and sprout,” which sounds more like a repudiation of the water cycle. But no matter; other verses, such as Psalm 135:7 in some translations or Job 36:27, speak of vapors ascending from the earth or God drawing up water.

From there things begin to fall apart (the Answers in Genesis list is not long).

The group presents Isaiah 40:22 and Psalm 103:12 as the bible claiming the world is spherical rather than flat (“He who sits above the circle of the earth”; “as far as the east is from the west”). But neither of these verses explicitly makes that case. A flat earth has east and west edges, and a circle is not three-dimensional. “Circle,” in the original Hebrew, was חוּג (chug), a word variously used for circle, horizon, vault, circuit, and compass. A “circle of the earth,” the Christian Resource Center insists, refers simply to the horizon, which from high up on a mountain is curved. If biblical writers had wanted to explicitly call the earth spherical they could have described it as a דּוּר (ball), as in Isaiah 22:18: “He will roll you up tightly like a ball and throw you.” This is not to say for certain that the ancient Hebrews did not think the world was a sphere; it is only to say the bible does not make that claim in a clear and unambiguous manner.

The remaining “evidences” are really nothing to write home about. “For the life of the flesh is in the blood” (Leviticus 17:11) is supposed to show an understanding of blood circulation; “the paths of the seas” (Psalm 8:8) is supposed to represent knowledge of sea currents; “the fixed order of the moon and the stars” (Jeremiah 31:35) is allegedly a commentary on the predictable paths of celestial bodies in space (rather than, say, their “fixed,” unchanging positions in space, another interpretation). But none of these actually suggest any deeper understanding than what can be easily observed: if you are cut open and lose enough blood you die, bodies of water flow in specific ways, and the moon and stars aren’t blasting off into space in random directions but rather maintain consistent movement through the skies from our earthly perspective. Again, maybe there were actually deeper understandings of how these things worked, but they were not presented in the bible.

The Jehovah’s Witness website has a go at this topic as well, using most of the same verses (bizarrely, it adds two to the discussion on the water cycle, two that merely say rain comes from the heavens).

The site uses Jeremiah 33:25-26 (“If I have not made my covenant with day and night and established the laws of heaven and earth…”) and Job 38:33 (“Do you know the laws of the heavens? Can you set up God’s dominion over the earth?”) to argue that the bible makes the case for the natural laws of science. Perhaps, but again, this doesn’t demonstrate any knowledge beyond what can be observed and, due to its consistency, called a law by ancient peoples. So maybe it’s one of God’s laws that the sun rises each day. It’s a law that water boils away when the heat gets too high. And so forth. These verses are acknowledgements that observable things function a certain way and that God made it so. There’s no verse that explains an actual scientific principle, such as force being equal to mass times acceleration, or light being a product of magnetism and electricity.
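For contrast, here is roughly what stating an actual scientific principle looks like in modern notation (Newton’s second law, and Maxwell’s result that the speed of light is fixed by the electric and magnetic constants):

$$\vec{F} = m\vec{a}, \qquad c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3 \times 10^8 \ \text{m/s}$$

Nothing in the cited verses approaches this level of specificity.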

True, it’s sometimes said the bible imparts the knowledge of pi (3.1415926…) and the equation for the circumference of a circle, but this is a bit misleading. There are a couple places where a circle “measuring ten cubits” across is mentioned, requiring “a line of thirty cubits to measure around it” (1 Kings 7:23, 2 Chronicles 4:2). Pi is implicitly three here. The equation (rough or exact) and pi (rough or exact) were possibly known, as they’re not too difficult to figure out after taking measurements, but that is not an absolute certainty based on this text. Regardless, neither the equation nor the value of pi is explicitly offered. (Why not? Because this is not a science book.) If these verses were meant, by God or man, to acknowledge or pass on scientific knowledge, then their authors either didn’t have much figured out or were not feeling particularly helpful. “Figure out the equation and a more precise value of pi yourself.”
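To spell out the arithmetic (taking, as readers generally do, the ten cubits as the diameter and the thirty cubits as the circumference):

$$\pi_{\text{implied}} = \frac{C}{d} = \frac{30\ \text{cubits}}{10\ \text{cubits}} = 3, \qquad \text{versus} \quad \pi = 3.14159\ldots$$

That is an error of roughly 4.5 percent: fine for describing a bronze basin, but hardly a revelation of the constant.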

The Jehovah’s Witness site further believes it’s significant the ancient Hebrews had sanitary practices, like covering up feces (Deuteronomy 23:13), keeping people with leprosy isolated (Leviticus 13:1-5), and washing your clothes after handling a carcass (Leviticus 11:28). However, if you read Deuteronomy 23:14, you see that feces must be covered up so God “will not see among you anything indecent” when he visits. It wasn’t to protect community health — or at least that went unmentioned. Noticing that leprosy can spread and deciding to quarantine people who have it is not advanced science. The guidelines for cleanliness after touching dead animals start off reasonable, then go off the rails. Even after washing your clothes you were for some reason still “unclean till evening,” just like any person or object that touched a woman on her period! (If this was just a spiritual uncleanliness, why were objects unclean? They don’t have souls.) The woman, of course, was unclean for seven days after her “discharge of blood.” How scientific.

Finally, this list mentions Psalm 104:6 (“You covered [earth] with the watery depths as with a garment; the waters stood above the mountains”) to posit that the biblical writers knew there was an era, before earth’s tectonic plates began to collide and form mountains, when the earth was mostly water — there is actual scientific evidence for this idea. The verse may be referencing the Great Flood story; verse 9 says of the waters, “never again will they cover the earth,” which sounds a lot like what God promised after wiping out humanity: “never again will there be a flood to destroy the earth” (Genesis 9:11). But if it does in fact reference the beginning of the world, it could be a verse a believer might use to make his or her case that the bible contains scientific truths, alongside Genesis 1:1-10, which also posits the earth was covered in water in the beginning.

There are of course many more alleged scientific truths, most more vague or requiring truly desperate interpretation. For instance, the “Behemoth” in Job 40 is sometimes said to describe a dinosaur, but it in no way has to be one. Hebrews 11:3 says: “By faith we understand that the worlds were framed by the word of God, so that the things which are seen were not made of things which are visible.” That, believers insist, can refer to nothing other than atoms — not any nonphysical possibility like, say, love or the breath of God. Others think a sentence like “all the peoples of the earth will mourn when they see the Son of Man coming on the clouds of heaven” (Matthew 24:30) hints at the future invention of the television! TV is apparently the only way everyone could see an event at the same time — miracles be damned. Still others suggest that when Genesis 2:1 says the heavens and earth “were finished” that this describes the First Law of Thermodynamics (constant energy, none created nor destroyed, in closed systems)! When Christ returns like a thief in the night, “the elements will melt with fervent heat; both the earth and the works that are in it will be burned up” (2 Peter 3:10) — that’s apparently a verse about nuclear fission. One begins to suspect people are reading too much into things.

We should conclude with four thoughts.

This can be done with any text. One can take any ancient document, read between the lines, and discover scientific truths. Take a line from the Epic of Gilgamesh, written in Babylonia: “The heroes, the wise men, like the new moon have their waxing and waning.” Clearly, the Babylonians knew the phases of the moon: how the moon waxes (enlarges) until it becomes full as it positions itself on the opposite side of the earth from the sun, allowing sunlight to envelop the side we can see. They knew how the moon then wanes (shrinks) as it positions itself between the earth and sun, falling into darkness (a new moon) because the sun only illuminates its far side, which we humans cannot see. This line must be in the text to acknowledge and impart scientific knowledge and prove the truth of the Babylonian faith, likely arranged by the moon god mentioned, Sin, or by his wife, Ningal.

This argument is no different than what we’ve seen above, and could be replicated countless times using other ancient books. Perhaps the Babylonians in fact did have a keen understanding of the moon and how it functions. But that does not mean a sentence like that in a story is meant to pass on or even indicate possession of such knowledge. Nor does it mean the gods placed it there, that the gods exist, or that the Epic is divinely inspired. Its presence in a text written between 2150 B.C. and 1400 B.C., even if surprising, simply does not make the book divine. It could be the first text in history that mentions the waxing and waning of the moon; that would not make its gods true.

(By contrast, archaeological and ethnographic research points to the Israelites as offshoots of Canaanites and other peoples around 1200-1000 B.C., with their first writings [not the Old Testament] appearing around the latter date. Though believers want to believe the Hebrews are the oldest people in human history, the evidence does not support this. I write this to stress that, just as Old Testament stories were taken from older cultures, the Hebrews may have learned of the water cycle and the like from others.)

A society’s scientific knowledge may mix with its religion, but that does not make its religion true. Even if the Hebrews were the first group of modern humans, with the first writings, the first people to acquire and pass along scientific knowledge, that would not automatically make the supernatural elements of their writings true. As elaborated elsewhere, ancient religious texts surely have real people, places, and events mixed in with total fiction. If some science is included that’s nice, but it doesn’t prove all the gods are real. The Hebrews knowing about the water cycle or pi simply does not prove Yahweh or the rest of the bible true, any more than what’s scientifically accurate in the Epic of Gilgamesh, the Koran, the Vedas, or any other ancient text proves any of its gods or stories true. That goes for the more shocking truths as well, simply because…

Coincidence is not outside the realm of the possible. As difficult as it may be to hear, it is possible that verses that reference a watery early earth or an earth suspended in space are successful guesses, nothing miraculous required. If one can look up and see the moon resting on nothing, is it so hard to imagine a human being wondering if the earth experiences the same? Could the idea that the earth was first covered in water not be a lucky postulation? Look at things through the lens of a faith that isn’t your own. Some Muslims believe the Koran speaks of XX and XY chromosome pairs (“He creates pairs, male and female, from semen emitted”), the universe ending in a Big Crunch (“We will fold the heaven, like the folder compacts the books”), wormholes (“Allah [who owns] wormholes”), pain receptors of the skin (“We will replace their skins with other new skins so that they may taste the torture”), and more. (Like nearly all faiths, it posits a beginning of the universe too.) How could they possibly know such things? Must Allah be real, the Koran divinely inspired, Islam the religion to follow? Or could these just be total coincidences, lucky guesses mixed with liberal interpretations of vague verses? Supposed references to atoms or mentions of planetary details in the bible could easily be the same. If you throw out enough ideas about the world, you’ll probably be right at times. Could the Hebrews, like Muslims, have simply made a host of guesses, some right and others wrong? After all…

There are many entirely unscientific statements in the bible. Does the ant truly have “no commander, no overseer or ruler, yet it stores its provisions in summer and gathers its food at harvest” (Proverbs 6:6-8), or were the Hebrews just not advanced enough in entomology to know about the ant queen? Are women really unclean in some way for a full week after menstruating, with every person or thing they touch unclean as well? Or was this just male hysteria over menstruation, so common throughout history? If the sun “hurries back to where it rises” (Ecclesiastes 1:5), does this suggest the Hebrews thought the sun was moving around the earth? Or was it just a figure of speech? One could likewise interpret Psalm 96:10 (“The world is firmly established, it cannot be moved”) to mean the earth does not rotate on its axis or orbit the sun. If one can interpret verses to make people seem smart, one can do the same to make them look ignorant. Do hares actually chew their cud (Leviticus 11:4), or did the Hebrews just not know about caecotrophy? Did Jesus not know a mustard seed is not “the smallest of all seeds” (Matthew 13:32)? Likewise, seeds that “die” don’t “produce many seeds” (John 12:24); dormant seeds will later germinate, but not dead ones. Some translations of Job 37:18 describe the sky “as hard as a mirror that’s made out of bronze” (NIRV, KJV, etc.). One could also go through the scientific evidence of today that contradicts biblical stories like the order of creation, or look at the biblical translations that mention unicorns, dragons, and satyrs, or just argue that supernatural claims of miracles, angels, devils, and gods are unscientific in general because they can’t be proven. But the point is made: the bible takes stabs at the natural world that aren’t accurate or that imply erroneous things.

In conclusion, the science in the bible is about what one would expect from Middle Eastern tribes thousands of years ago. There are some basic observations about the world that are accurate, others inaccurate. There are some statements about the universe that turned out to be true, just like in the Koran, but that doesn’t necessarily require supernatural explanations.


The Bereshit (Jesus in Genesis) Argument Has No Merit

On New Year’s Eve 2016, a friend introduced me to the term bereshit, Hebrew for “in the beginning.” It is the first word of the bible, and is believed by some to contain a secret message concerning the crucifixion of Christ. The bereshit argument is therefore also called the “Jesus in Genesis 1:1” theory.

The theory goes like this: Hebrew letters have special meanings, and when you examine the meanings of the six letters in bereshit (beyt-resh-aleph-shin-yud-tav) they form a sentence: “The Son of God is destroyed by his own hand on the cross.”

I told my friend I was skeptical but would research it, and later came across this graphic and this video (minutes 10:00 to 17:00). Both assert the following meanings or associations of the letters: beyt (house, tent), resh (first person, head), aleph (God), shin (consume, destroy, teeth), yud (hand, arm, works), and tav (covenant, mark, cross). Beyt and resh, when combined, make the word “son.” So the bereshit sequence can be read “son-God-destroy-hand-cross,” or “The Son of God is destroyed by his own hand on the cross.”

I reached out to some of today’s most respected and renowned Old Testament scholars to determine the merits of the bereshit theory. I also spoke to John E. Kostik, a well-traveled Christian speaker, who created the video. He informed me that proving the bereshit theory was as simple as looking up the meanings of Hebrew letters, which have matching Hebrew words. “Bereshit begins with the letter beyt. The Hebrew word for ‘house’ is beyt!”

I remembered a question John Goldingay, professor of Old Testament at Fuller Theological Seminary, posited to me earlier that day: “Why would no one have seen it for thousands of years?” So I asked Kostik why web information on it is relatively sparse and why many pastors and believers don’t know about it. He said that because the original language of Hebrew is not widely known, and because Jewish scholars do not view Christ as the messiah and therefore do not have open eyes, the spread of this knowledge has been limited. I asked for sources on the topic, and Kostik directed me to Jeff A. Benner’s work.

Like Kostik (and myself), Benner is not a professional scholar. He works for an engineering company and lives in a log cabin, but like Kostik he studies ancient Hebrew as a passion. He documents his studies on his website, which he dubbed the Ancient Hebrew Research Center. While I was disappointed not to find a university professor with findings published in peer-reviewed journals, that was the source I was given, so I pressed on.

The first task was to see if the ancient Hebrew word for “house” indeed had the same name as the first letter in bereshit.

I looked up these words in Benner’s dictionary of commonly used ancient Hebrew words in the bible, and consulted Strong’s Concordance to ensure they were accurate, which they were.

The definitions below with ancient Hebrew lettering are from Benner, with a Strong’s Concordance number to crosscheck. Definitions without ancient Hebrew lettering are from Strong’s Concordance alone. Hebrew words are read right to left.

(ba-yit): House. (The structure or the family, as a household that resides within the house. A housing. Within.) Strong’s 1004.

(rosh): Head. (The top of the body. A person in authority or role of leader. The top, beginning, or first of something.) Strong’s 7218.

(a-luph): Chief. (Accorded highest rank or office; of greatest importance, significance, or influence. One who is yoked to another to lead and teach.) Strong’s 441.

Not pictured. (shen): Tooth. Strong’s 8127/8128.

(yad): Hand. (The terminal, functional part of the forelimb. Hand with the ability to work, throw and give thanks.) Strong’s 3027.

Not pictured. (tav): Frowardness (perverse thing) or mark (from tavah, Strong’s 8427). Strong’s 8420/8420a.

These then needed to be compared to the letters themselves. Here are Benner’s descriptions of the early Hebrew letters:

(beyt, today ב): image of a house, tent

(resh, today ר): image of a man’s head

(aleph, today א): image of an ox’s head

(shin, today ש): image of two front teeth

(yud, today י): image of arm and hand

(tav, today ת): image of crossed sticks

You will notice the names of these Hebrew letters are indeed virtually the same as the Hebrew words above. We will get back to this.

Initial problems with the bereshit argument become evident fairly quickly. First, assuming these letters represent what’s asserted, bereshit reads “house-head-chief-tooth-hand-mark.” Benner himself does not include “God,” “consume,” “destroy,” “works,” “covenant,” or “cross” as definitions.

If we open the scope of the meanings to include Strong’s, that gives us:

  • House (court, door, dungeon, family, forth of, great as would contain, hangings)
  • Head (band, captain, company)
  • Chief (captain, duke, chief friend, governor, guide, ox; chief is actually not listed)
  • Tooth (crag, forefront, ivory, sharp)
  • Hand (be able, about, armholes, at, axletree, because of, beside, border)
  • Mark (very froward thing, perverse thing, desire, signature)

And still the key words are missing. “House-head-chief-tooth-hand-mark” is not all that close to the original bereshit claim. Even skipping Strong’s translations and using only Benner’s, a wide range of secret messages can be conjured. “Family-leader-yoked teacher-tooth-hand-perverse thing” is an equally valid secret message in the first word of the bible!

Key words necessary for the bereshit argument are simply assumed without basis. Aleph, while having to do with a leader, has nothing to do with God, as confirmed by the scholars I contacted. Notice a noun is transformed into a verb in the conversion of “tooth” to “destroy”! It’s merely “inferring a verb,” says John J. Collins, professor of Old Testament Criticism and Interpretation at Yale Divinity School.

When I raised to John Kostik the fact that these words were missing, he sent me an image that depicted shin standing for destruction in another word, but could not provide a source. “Maybe common sense is to be employed,” he said, adding, “God doesn’t have to source everything through man. God is the source.” I pointed out common sense could also make shin stand for dental hygiene. I did not receive a reply.

You’ll notice “son” is missing here. As explained above, one must combine the first two letters to create “son.” Beyt and resh can join to form the word bar, son (Strong’s 1247). Thus, bereshit can at best be read “son-chief-tooth-hand-mark,” according to Benner and Strong. Or “son-most important-tooth-hand-perverse thing” if you prefer.

Of course, opening the door to letter combinations, rather than moving bereshit closer to validation, can move it farther away. As before, many combinations and words, and thus secret messages, are possible. Beyt-resh-aleph could form bara’ (choose, Strong’s 1254). Resh-aleph could be used for the name Ra. We could combine shin-yud-tav to create shith (to put or set, Strong’s 7896). Yud-tav could form yath (whom, Strong’s 3487). Therefore, “The house of Ra is set” is an equally valid secret message in the first word of the bible, if not superior.
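To make the combinatorial point concrete, here is a minimal sketch in Python. The meaning lists are illustrative assumptions on my part, loosely drawn from the Benner and Strong glosses quoted above, not an exhaustive lexicon:

```python
from itertools import product

# Illustrative meaning lists (my assumption, loosely based on the
# Benner/Strong glosses quoted above) for the six letters of bereshit.
meanings = {
    "beyt":  ["house", "family", "tent"],
    "resh":  ["head", "leader", "first"],
    "aleph": ["chief", "ox", "teacher"],
    "shin":  ["tooth", "ivory", "sharp"],
    "yud":   ["hand", "arm", "work"],
    "tav":   ["mark", "signature", "perverse thing"],
}
letters = ["beyt", "resh", "aleph", "shin", "yud", "tav"]

# Every choice of one meaning per letter is an equally "valid" reading.
readings = list(product(*(meanings[l] for l in letters)))
print(len(readings))          # 3**6 = 729 candidate secret messages
print("-".join(readings[0]))  # house-head-chief-tooth-hand-mark
```

With just three glosses per letter there are already 729 equally arbitrary readings, before allowing letter combinations like bar or shith, prepositions, or inferred verbs, each of which multiplies the possibilities further.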

“I actually find this use of the Bible scary,” says Mark S. Smith, professor of Old Testament Language and Exegesis at Princeton Theological Seminary, “because it ends [up] being made into meanings that its creators want, and not what the Bible really says.” A similar sentiment was expressed to me by Michael V. Fox, professor emeritus at the Center for Jewish Studies at the University of Wisconsin, John Goldingay (“One can prove almost anything by this method”), and Walter Brueggemann, professor emeritus at Columbia Theological Seminary (“Sound[s] more like nonsense to me, pressing to [see] what is not there”).

Further, we must be sure to note bereshit contains no connective words. My example at best could be “house-Ra-set.” There is no “the,” “of,” or “is.” Where do bereshit believers get any pieces beyond “son-chief-tooth-hand-mark”? Even if we had “son-god-destroyed-hand-cross” there would still be room to create other narratives, for instance: “My son god destroyed when his hands formed a cross.” Additionally, even if prepositions formed a complete “The Son of God is destroyed by his own hand on the cross” there would remain the possibility that this was first discovered by some first-century A.D. scribe who invented a story of Jesus to “fulfill the prophecy.” But no matter. While “son-god-destroyed-hand-cross” would be intriguing indeed, “son-chief-tooth-hand-mark” is the best we have.

I reached out to ask Benner if he was a bereshit believer. He replied, “I personally do not believe that secret messages are encoded in specific words of the Bible.”

However, Benner’s website does associate letters with certain meanings. The scholars I spoke to were adamant that ancient Hebrew letters should not be viewed as “standing for” something. Ron Hendel, professor of Hebrew Bible and Jewish Studies at UC-Berkeley, says of shin, “It’s just a letter of the alphabet. It doesn’t stand for anything except the sound ‘sh.'” This is because ancient Hebrew was never pictographic (where symbols represent things); it was phonetic (where symbols — letters — represent sounds).

Early Hebrew letters (Paleo-Hebrew) came from the older Phoenician alphabet, which had 22 letters, all consonants, just like its Hebrew offspring. (Despite the resemblance, “phonetic” derives from the Greek word for sound, not from “Phoenician.”) The Phoenicians lived along the Syrian, Lebanese, and northern Israeli coast, and spread their alphabet across the Mediterranean regions, setting the stage for the development of Greek, Arabic, Hebrew, Latin, and later English.

In the phonetic Hebrew language the crossed sticks symbol, tav, represented only the “t” sound, as in “toy.” In a similar way, the Greek letter tau makes the “t” sound. English doesn’t generally spell out its letter names, but one could say the English tee makes the “t” sound. There is no evidence that the ox head, the crossed sticks, the man’s head, or the others were actually used by the Hebrews in a pictographic way, where if one wanted to write the word house one would draw beyt. You had to use letters to form words, like ba-yit above. And no one thought the word “house” contained the secret code of “house-arm-mark.”

“The letters never really ‘meant’ those things” to the Hebrews, says Molly Zahn, associate professor of Religious Studies at the University of Kansas, “because the whole point of an alphabet of only a limited number of letters (22 in the case of Hebrew) is to represent sounds, not ideas.” Pictographic languages like hieroglyphics require hundreds — thousands — of signs to be useful.

Other societies, such as the Egyptians and Sumerians, did use pictographic language for a time (think hieroglyphics and cuneiform), but there is no evidence the Hebrews did. The best evidence points to the first Hebrew writing system being an offshoot of the Phoenician script, which aligns neatly with the evidence that the Hebrew people themselves were an offshoot of the Canaanites, a group that included the Phoenicians.

Now, that does not mean these symbols were never used in a pictographic way by anyone — only that they were never used that way by the Hebrews. There is no evidence (“None whatsoever,” emphasized Victor H. Matthews, dean of Religious Studies at Missouri State University) that the Hebrews as an independent people used a pictographic language; they were likely already armed with a Canaanite phonetic language upon their formation. We thus arrive at the question of why the names of these Hebrew letters are essentially the same as the words for the everyday objects they were modeled on. This phenomenon has certainly made the bereshit argument seem plausible to some.

If we were to look back in time, before the Hebrews existed, before Phoenicia developed its groundbreaking alphabet, we would likely see the people of the region using pictograms of objects. As Zahn explains, they used the image of an ox’s head to mean an alpu (ox) and a little house drawing to represent a ba-yit. These were eventually used by the first phonetic thinkers to represent sounds, specifically these words’ first syllables, the “ah” and “b” sounds. Alpu evolved into different forms — aleph (Phoenician, Hebrew), alpha (Greek), alif (Arabic); so did ba-yit — beth (Phoenician), beyt (Hebrew), beta (Greek, today more vita), ba (Arabic), and so on. So it should not be surprising that objects and the letters modeled on those objects have nearly the same names. This is not unique to Hebrew, either. The Arabic word for tooth (sini) is written سن and sounds and looks remarkably like the letter س (sin). The Arabic word for hand (yd) is written يد and is somewhat close to the letter ي (ya). Other examples in Arabic and other tongues are not difficult to find.

Some will of course argue that the Hebrews, being “God’s chosen people,” invented the pictographs (and/or phonetics) themselves and disseminated them to other peoples. Or that regardless of how biblical Hebrew came about, God nevertheless orchestrated events so that whoever wrote Genesis unwittingly put a secret message of Christ’s story in “in the beginning.” But given the evidence it must be concluded that the message could at best be “son-chief-tooth-hand-mark,” which is itself an arbitrary arrangement, its word choices made by Christians wishing to construct what is not there.

The final verdict on bereshit? To quote Tremper Longman III, professor of Biblical Studies at Westmont College, “It’s bull.”

Is Relative Morality More Dangerous Than Objective Morality?

“The fool says in his heart, ‘There is no God.’ They are corrupt, their deeds are vile; there is no one who does good.”

Psalm 14:1 neatly summarizes the anti-atheist stereotype held by many people around the world, and further laid the foundation thousands of years ago for this modern Christian belief. It says so in the bible, thus it must be true. While some people of faith trust that the nonreligious are just as moral as they, others believe atheism makes one more likely to commit unethical acts or even that no one can be good without God.

Having already examined how deities are necessary neither to explain morality nor to justify moral decisions, and having cleared up confusion concerning objective morality versus objective truth, it seems relevant to address the idea that relative morality (humans alone deciding what is right and wrong) is so much more dangerous than objective morality (right and wrong as allegedly dictated by God and outlined in holy books).

First we will look at theists’ “relative morality in practice” argument and then move on to the theoretical or philosophical question of which is preferable, relative or objective morality.

The “in practice” argument of course centers on the atrocities of Hitler, Stalin, and other mass killers. “These atheists were responsible for the worst genocides in human history,” thus any morality devoid of gods is dangerous prima facie.

This falls apart for several reasons.

First, one notes the personal views of the worst despots are sometimes misconstrued. Hitler repeatedly professed his Christianity in his books and speeches, often to explicitly justify oppressing the Jews; he also publicly criticized the “atheist movement” of the Bolsheviks. Privately, however, he made clear he was an enemy of Christianity, calling it an “absurdity” based on “lies” (Bormann, Hitler’s Table Talk). “The heaviest blow that ever struck humanity was the coming of Christianity,” he said, because it led to Bolshevism. “Both are inventions of the Jew.” Christianity would be “worn away” by science, as all “myths crumble.”

However, anti-Christian is not necessarily atheist. Joseph Goebbels wrote that while Hitler “hates” Christianity, “the Fuhrer is deeply religious” (Goebbels Diaries). Hitler said in private that

An educated man retains the sense of the mysteries of nature and bows before the unknowable. An uneducated man, on the other hand, runs the risk of going over to atheism (which is a return to the state of the animal) as soon as he perceives that the State, in sheer opportunism, is making use of false ideas in the matter of religion… (Bormann)

Hitler said to companions, “Christianity is the most insane thing that a human brain in its delusion has ever brought forth, a mockery of everything divine,” suggesting a belief in higher powers.

And while some of Hitler’s policies attacked the Catholic Church and German Christianity in general, only those who stood up to the Nazis, like some church leaders and Jehovah’s Witnesses, were in danger of extermination. Hitler also persecuted atheists, banning most atheist groups, such as the German Freethinkers League. Again, fear of the link between atheism and Bolshevism was a factor.

With no real evidence Hitler was an atheist, what of Stalin?

The Soviet dictator’s case is more straightforward. He became an atheist as a youth, while studying to become a priest (also what a young Hitler wanted to do). “They are fooling us,” he said of his teachers. “There is no god” (Yaroslavsky, Landmarks in the Life of Stalin). “God’s not unjust, he doesn’t actually exist. We’ve been deceived” (Montefiore, Young Stalin). Later, he explained that “all religion is something opposite to science,” and oversaw “anti-religious propaganda” to eradicate “religious prejudices” (Pravda interview, September 15, 1927). Such efforts were meant to “convince the peasant of the nonexistence of God” (Stalin, “The Party’s Immediate Tasks in the Countryside” speech, October 22, 1924). As implied above, Communism in the Soviet Union typically embraced science and secularism.

Stalin thought religion was “opium for the people,” an exercise in “futility” that wrought “evil” (Hoxha, With Stalin). “The introduction of religious elements into socialism,” he wrote, “is unscientific and therefore harmful for the proletariat” (Stalin, “Party News,” August 2, 1909). He favored the “struggle” against religion. He also said he did not believe in fate, calling it a “relic of mythology” (Stalin, interview with Emil Ludwig, December 13, 1931). In terms of policy, Stalin shifted from a relative tolerance of religious freedom to a reign of terror against the Russian Orthodox Church and other faith organizations in the 1920s and 1930s. Countless priests, monks, and nuns were exterminated (100,000 between 1937-1938 alone; Yakovlev, A Century of Violence in Soviet Russia).

We could go on, digging into the views of other tyrants. But moving forward to the second point, can it be reasoned that, all other factors remaining the same, Stalin would not have harmed anyone had he believed in God? If Hitler had been a Christian? It is logical to posit Stalin’s disbelief was a contributing factor to his holocaust against his own people, even the primary factor in his massacres of religious leaders, but considering what believers in God (and Christ) have been capable of throughout history it is difficult to conclude piety would have stopped Hitler’s war, the Holocaust of Jews, Roma, and homosexuals, or Stalin’s mass murder of political enemies, kulaks (wealthy peasants), and ethnic minorities (such as the Poles). Would faith really have cured the imperial ambitions, extreme racism, fanatical patriotism, authoritarianism, lack of empathy, and power lust of these men? This is the problem with arguing that atheism was anything more than a contributing factor, at best, to some of the worst crimes of the 20th century. There are countless other examples of horrific violence committed by men who were unquestionably religious yet exhibited the same evil, and whose actions had a much stronger connection to their faiths than Stalin’s or Hitler’s actions had to their more secular views (that is, faith was the primary factor, not a contributing factor).

The crimes of the sincerely religious are vast and unspeakable, stretching not merely a few decades but rather millennia. If we could step back and witness the graveyard of all who were killed in the name of God, what would that look like? How many millions have been oppressed, tortured, maimed, and killed because “God said so”? To please the gods? To spread the faith?

Look to the atrocities that no thinking person believes divorced from faith. The 700-year Inquisition, the torture and mass murder of anyone who questioned Christian doctrine in Europe or refused to convert in the Americas and parts of Asia. The 400-year witch hunts of Europe and North America, the execution of women supposedly in league with and copulating with the devil. The 1,900-year campaign of terror against the Jews in Europe, the “Christ-killers.” The Crusades, bloody Christian-Muslim wars for control of the Holy Land that spanned two centuries and killed millions. The European Wars of Religion during the Reformation that lasted a century (Thirty Years’ War, Eighty Years’ War, French Wars of Religion, etc.), killing millions. And these are just the major wars and crimes against humanity of Christians from Europe! (See “When Christianity Was as Violent as Islam.”)

We could look at Arabian Islam, from the bloody conquest to establish a caliphate across the Middle East, North Africa, and Spain to the murder of infidels, from the Shia-Sunni wars to the terrorist attacks of the modern era. We could examine the appalling executions and genocide conducted by the Hebrews, according to their holy book. We could study the human sacrifices to the gods in South American and other societies. We could investigate today’s Christian-Muslim wars and the destruction of accused witches in sub-Saharan Africa. The scope of all this is enormous, encompassing all people who believed in a higher power in all cultures throughout all human history. The crimes of 20th century tyrants were horrific, but is there really a strong case that they could not have occurred on just as large a scale had the tyrants been more religious?

You will notice that all these atrocities were more closely connected to the faiths of the perpetrators than the atrocities of Hitler and Stalin were to their anti-Christian or secular views. The Jews were not killed in the name of atheism. Hitler’s attempt to conquer Europe was not an anti-Christian campaign. Stalin wanted to destroy religion, but few would suggest that was his primary goal, ahead of eradicating capitalism, establishing Communism, and modernizing Russia into a world power. Secular beliefs may have contributed to atrocities, but unlike these other examples they were not the primary factors. If belief or non-belief only need be contributing factors to credit them for crimes, we could also look at religious persons who committed crimes against humanity that weren’t closely motivated by or connected to faith.

Doing so makes faith guilty of any crime committed by a person of faith. And why not? If the False Cause Fallacy can be applied to atheists it can just as easily be applied to theists! (Same with the Poisoning the Well Fallacy: these atheists were evil, so atheism is evil; these people of faith were evil, so faith is evil.)

The Ottomans committed genocide against the Armenians from 1915-1922, killing 1.5 million, 75% of the Armenian population. Prime Minister Mehmed Talaat was its principal architect, and because he was a Muslim it must have been a belief in a higher power that enabled him to carry out this act. The Rwandan genocide of 1994 was not a religious conflict, but some Catholic faith leaders participated — a crime the Pope apologized for this year. Their belief in a god must be credited. Radovan Karadžić, president of Republika Srpska and a Serb, orchestrated the genocide of Muslims and Croats in 1995, during the Bosnian War. He saw his deeds as part of a “holy war” between Christianity and Islam. Would he have refrained from mass murder had he been an atheist? Would the old butcher Christopher Columbus? Would King Leopold II of Belgium? This Catholic monarch was responsible for the deaths of perhaps 10 million people in Congo. “I die in the Catholic religion,” he wrote in his last testament, “and I ask pardon for the faults I have or may have committed.” This game can be played with anyone in human history, from the Christian kings, queens, traders, and owners who enslaved 12-20 million Africans (which killed millions; see Harman, A People’s History of the World) to the Christian presidents of the United States who intentionally bombed millions of civilians in Vietnam.

One could make the embarrassing argument that those who committed such evils were not actually believers in God (a “secret atheist theory”). Yes, it is difficult to know an historical figure’s true thoughts. But one could just as easily pretend Stalin and others were secretly believers. We have to use the evidence we have.

So you can see how the legitimacy of causal connections is highly important. One who doesn’t care about the strength of such connections could easily attribute Hitler’s crimes to his belief in a higher power! (One could then argue Hitler’s belief was far more dangerous than Stalin’s atheism, as Hitler oversaw the deaths of 11 million noncombatants, versus Stalin’s 6 million — in the decades since the fall of the Soviet Union, researchers have determined the death toll estimate typically associated with Stalin, 20 million, is grossly inaccurate.) It is illogical to blame secularism for being anything more than a contributing factor to Stalin and Hitler’s actions in the same way it is illogical to blame faith for being anything more than a contributing factor to the Armenian, Congolese, or other genocides committed by religious persons. There are many events in history with faith as a primary cause, like the Inquisition, but it cannot be said the Holocaust and the Russian purges were primarily caused by atheism.

Third and finally, one could refute the notion atheists are worse people using scientific research. Children from nonreligious homes were actually found in a 2015 study to be more generous than those from religious homes. A “Good Samaritan” study found religiosity does not determine how likely people are to lend a helping hand. A study on cheating found that faith does not make one less likely to cheat. A 2014 study showed secular and religious people commit immoral acts equally. Some atheists trumpet the fact they are underrepresented in U.S. prisons, but they shouldn’t, as American atheists are predominantly educated, middle-to-upper-class whites, a group that is itself underrepresented in prisons. Similarly, some point out that nations like the United Kingdom, the Netherlands, Denmark, Sweden, the Czech Republic, Japan, and others have some of the highest rates of atheism and lowest rates of crime in the world, but this too should be avoided as a False Cause Fallacy. These nations are likewise disproportionately wealthy and educated — low crime rates and atheism are byproducts; they likely do not have a cause-effect relationship (but at least those worried about society falling into chaos and crime as atheism spreads can rest easy).

So is the belief in relative, godless morality so much more dangerous than the belief in objective, God-given morality? In practice, it appears not. The capacity for horrific actions in secular and religious people seems equivalent. Same with kindness and other positive actions.

From a theoretical standpoint, however, there are two facts that make relative morality better. They help explain why atheists are not worse people than believers.

First, objective morality has a glaring flaw: it cannot be known. Just as one cannot prove the existence of the Christian deity, there is no way to definitively prove that Christian right and wrong is the objective standard humanity is meant to follow. Why not Islamic right and wrong? Because one can’t prove which set of ethics is actually objective and god-decreed, each simply becomes one option among many and thus we have to choose among them (it’s quite relative!). Even if you believe in objective morality, there’s no way to actually know what it is. The person of (any) faith thinks he knows but might easily be wrong. “I’ve looked at her with lust in my heart, I’ve done wrong.” Well, perhaps not. It could be the higher power that actually exists doesn’t believe in thought crimes. Therefore, saying we should try to follow an objective morality, offered by a particular religion, is not particularly compelling. Relative ethics are of course known because we create them for ourselves.

Second, relativity allows us the freedom to make our ethics better. I understand why people of faith see a risk in humans deciding what’s right and wrong, but religion clearly isn’t any better in terms of danger to others (if you ask me why, it’s because religion is man-made, so it all makes sense). We have gods saying all sorts of things are right: killing homosexuals, those who engage in extramarital sex, and people who work on the Sabbath (Old Testament); enslaving people and oppressing women (New Testament); waging Jihad on nonbelievers and cutting off body parts for crimes (Qur’an). Well, perhaps humans would like to base what’s wrong on what actually causes harm to others, not what insults a deity, which makes all that killing and maiming wrong and makes things like working on the Sabbath, homosexuality, and sex outside marriage (and porn, masturbation, smoking weed, etc.) ethically permissible. We have the ability to continue to improve our ethics to a point where fewer people get killed for nonviolent “crimes.” Relative morality allows us to move past the absurdities and barbarism of ancient desert tribes. We’ve been very successful at this.

Yes, it also allows us to return to barbarism, with no thoughts of angry higher beings to stop us. Faith-based appeals can prevent barbarism too (“I can’t kill, I’ll go to hell”). But at least we’re free to move in a more positive direction if we choose. Religion doesn’t really offer that. God’s word is perfect and is not to be altered or deviated from; it has been set for thousands of years. Being paralyzed by religious ethics keeps us stuck in the dark ages, from oppressive Islamic societies in the Middle East and Asia to the lingering hysteria in the United States over homosexuality, a natural trait of the human species and other lifeforms. Progress on such matters requires putting aside ancient faith-based ideas of right and wrong (Americans were no longer allowed to execute homosexuals after 1786). The more humanity does so, the safer and freer each of us becomes.

Kentucky Judge Refuses to Marry Atheists

In July 2016, Kentucky judge Hollis Alexander refused to wed atheists Mandy Heath and her fiancé Jon because they requested any mention of God be excluded from the ceremony.

“I will be unable to perform your wedding ceremony,” Alexander told them. “I include God in my ceremonies and I won’t do one without him.”

Alexander, being the only judge in Trigg County able to perform a wedding ceremony, advised Heath to seek out a judge in another county. The Freedom From Religion Foundation, a leader in suits against violations of constitutional church-state separation, sent the judge a letter outlining the laws he chose to break, adding:

There is no requirement that such ceremonies be religious (any such requirement would be unconstitutional). Ms. Heath sought you out as the only secular alternative available to her under Kentucky law.

As a government employee, you have a constitutional obligation to remain neutral on religious matters while acting in your official capacity. You have no right to impose your personal religious beliefs on people seeking to be married. Governments in this nation, including the Commonwealth of Kentucky, are secular. They do not have the power to impose religion on citizens. The bottom line is that by law, there must be a secular option for people seeking to get married. In Trigg County, you are that secular option.

There is no word yet if a lawsuit will follow.

Kentucky is the state where Kim Davis worked as a county clerk; she refused to issue marriage licenses to gay couples, citing her Christian faith. Alexander also refuses to conduct weddings for LGBT Americans.

Atheists Sue Kansas City Over Payment to Baptists

On July 22, 2016, the American Atheists group and two Kansas City residents sued Kansas City Mayor Sly James and the city government for designating $65,000 in taxpayer funds for Modest Miles Ministries’ National Baptist Convention, taking place at Bartle Hall in early September.

Missouri’s Constitution forbids using taxpayer money to fund religious events and institutions: “No money shall ever be taken from the public treasury, directly or indirectly, in aid of any church, sect, or denomination of religion.” The lawsuit aims to prevent the city from handing over the funds.

“The National Baptist Convention is inherently religious — and it is clear under Missouri law and the First Amendment that Missouri taxpayers should not be paying for it,” argues Amanda Knief, legal director of American Atheists. The group’s website also notes:

Modest Miles Ministries claims in emails to the City that the funds will be used for transportation to and from the convention, making the funding purposes “secular.” That would mean, according to Modest Miles Ministries’ funding application, about 25% of the entire budget of the convention — $65,000 — is being spent on shuttles to and from the convention.

The $65,000 grant for the Baptist Convention was the second largest grant that the City gave in 2016. This was the fourth time the City has approved funding the National Baptist Convention: in 1998, the City approved $100,000 (about 32% of the convention’s total budget); in 2003, the City approved $142,000 (about 42% of the convention’s total budget); and in 2010, the City approved $77,585 (about 27% of the convention’s total budget).

The city government refused to comment to The Kansas City Star. But the paper says, “City spokesman Chris Hernandez pointed out no contract has been signed yet to spend the money. If and when that does happen, Hernandez said, the contract has language spelling out that the money would be used for secular purposes.”

The lawsuit says the Kansas City plaintiffs have a “right to be free from compelled support of religious institutions and activities,” and cites another Missouri case, “Trinity Lutheran Church of Columbia, Inc. v. Pauley, upheld by the Eighth Circuit in 2015, in which this court refused to allow public money to be spent on a Lutheran day care.”

The contract between the city and Modest Miles Ministries is due this month.

Foundations of Faith: A Comparative Analysis of Kohlberg, Erikson, and Fowler

Developmental psychologist James W. Fowler (b. 1940) posited in 1981 that the way in which men and women understand faith is determined by their construction of knowledge. One’s perception of self and one’s experiences in specific environments are more telling of how meaning is made from faith than how often one attends temple, mosque, or mass, how well one knows church doctrine, or how much holy scripture one can recite from memory. While it is important to note Fowler writes from a Christian perspective (being professor of theology at the United Methodist-affiliated Emory University in Atlanta, as well as a Methodist minister), his vision of human faith development is not meant to be content-specific. It is meant to be applicable to all faiths, disregarding religious bodies to focus solely on an individual’s spiritual and intellectual growth. Fowler formulated “stages of faith,” drawing inspiration from the developmental theories of Erik Erikson and Lawrence Kohlberg, among others. After exploring Fowler’s stages, this comparative analysis will examine the ideas of Kohlberg and Erikson, analyzing how their theoretical structures influenced the formation of Fowler’s work.

According to Stephen Parker’s “Measuring Faith Development,” Fowler’s idea was that faith was formed by many interrelated and developing structures, the interaction of which pinpointed one’s stage (2006, p. 337). “Stage progression, when it occurs, involves movement toward greater complexity and comprehensiveness in each of these structural aspects” (p. 337). The structures include form of logic (one progresses toward concrete and abstract reasoning), perspective taking (one gains the ability to judge things from various viewpoints), form of moral judgement (the improvement of moral reasoning), bounds of social awareness (becoming more open to changing social groups), locus of authority (moving toward self-confidence in internal decision-making), form of world coherence (growing aware of one’s own consciousness and one’s ability to understand the world using one’s own mental power), and symbolic function (increasing understanding that symbols have multiple meanings) (p. 338). These are the bricks that build each stage of faith; as one is able to think in more complex ways, one advances up Fowler’s spiritual levels.

The stages of faith are primal faith (pre-stage), intuitive-projective faith (1), mythic-literal faith (2), synthetic-conventional faith (3), individuative-reflective faith (4), conjunctive faith (5), and universalizing faith (6). According to Fowler, during the pre-stage, an infant cannot conceptualize the idea of “God,” but learns either trust or mistrust during relations with caretakers, which provides a basis for faith development (Parker, p. 339). More on this later. In the intuitive-projective stage, a child of preschool age will conceptualize God, though only as “a powerful creature of the imagination, not unlike Superman or Santa Claus.” During the mythic-literal stage, the child will develop “concrete operational thought,” and will view God as a judge who doles out rewards and punishments in a fair manner. In the synthetic-conventional stage, one will develop “formal operational thought”; the idea of a more personal God arises, and one begins to construct meaning from beliefs. The individuative-reflective stage at last brings about self-reflection of one’s beliefs. Parker writes, “This intense, critical reflection on one’s faith (one’s way of making meaning) requires that inconsistencies and paradoxes are vanquished, which may leave one estranged from previously valued faith groups.” As this occurs, and somewhat ironically, God is viewed as the embodiment of truth. Conjunctive faith is a stage in which one attempts to reconcile contradictions; while staying wary of them, he or she may see the nature of God as inherently unknowable, a “paradox,” while still being Truth. Where certainty breaks down, acceptance of the diverse beliefs of others grows more pervasive. Fowler suggests the conjunctive stage may occur during midlife. Finally, if one can attain it, the universalizing stage is when one becomes fully inclusive of other people, faiths, and ideas. People hold “firm and clear commitments to values of universal justice and love” (p. 339).

It is important to note these stages do not represent a universal, concrete timetable for faith development. Each stage requires greater critical thinking and self-reflection (which is what makes Fowler’s model applicable to multiple faiths), and therefore not everyone will progress through them at the same rate or even attain the same level of development. Further, the model does not address those who abandon faith completely; it demonstrates only a progressive scale that suggests one either stops where one is or moves toward greater knowledge of self and one’s values, and more open-mindedness in regard to others and the nature of God Himself. For many, faith development may not be so simple, nor so linear. Regardless, Fowler’s work has had a great impact on religious bodies and developmental psychology (Parker, p. 337).

Fowler borrowed much from other theorists. Psychologist and psychoanalyst Erik Erikson (1902-1994) created a model for the psychosocial development of men and women, from which Fowler later drew inspiration. In lieu of a lengthy summary of Erikson’s (and Kohlberg’s) ideas, this comparative analysis will provide a brief overview, focusing on the aspects that relate most closely to Fowler’s finished product. According to Erikson’s “Life Span Theory of Development,” human growth goes through eight stages, each of which features a crisis that, if successfully conquered, will result in the development of a valuable virtue, such as hope, love, or wisdom. Erikson’s crises were: trust vs. mistrust (infancy), autonomy vs. shame (toddlerhood), initiative vs. guilt (preschool), industry vs. inferiority (childhood), identity vs. role confusion (adolescence), intimacy vs. isolation (young adulthood), generativity vs. stagnation (middle adulthood), and integrity vs. despair (late adulthood) (Dunkel & Sefcek, 2009, p. 14). One’s ability to embody the more positive aspect of one of these pairs makes it likely one will do the same with the next positive aspect (p. 14).

Fowler liked Erikson’s trust vs. mistrust idea, seeing it as the very foundation of faith development. Clearly, trust becomes a critical theme as one is exposed to spiritual beliefs, the “known”-yet-unseen. Can one trust the holy book? Can one trust the priest, rabbi, or parent? It is interesting to consider how the development of trusting or distrusting relationships will affect future spiritual development. What are the results of the trust vs. mistrust conflict? Erikson felt that “for basic trust versus mistrust a marked tendency toward trust results in hope” (Dunkel & Sefcek, p. 13), which implies a lack of hope if unresponsive caretakers breed feelings of mistrust. While only Erikson concerned himself with the virtues gained from each life stage, Fowler, in adapting Erikson’s first stage, built into his model a single stage containing such a conflict. This raises questions. Can one successfully enter the intuitive-projective stage without building trusting relationships in the infant pre-stage? If so, what is the impact of mistrust in stage 1 and all the following stages? Could it mean different perspectives of God (for instance, perhaps as less fair-minded during the formation of concrete operational thought in the mythic-literal stage)? Would one likely progress through the stages more rapidly, or more slowly? Hypothetically, one less trusting might be quicker to see problems and contradictions in faith, advancing to the individuative-reflective stage sooner. Further, Erikson believed “optimal psychological health is reached when a ‘favorable ratio’ between poles is reached” (p. 13), meaning a positive trust-mistrust ratio is all that’s needed to develop hope and move through the stage. Indeed, “a ‘favorable ratio’ indicates that one can be too trusting” (p. 13). What would be the impact on faith development for someone who has grown too trusting of people? By their nature, both Erikson’s and Fowler’s stages build upon each other. For Erikson, trust made it “more likely the individual will develop along a path that includes a sense of autonomy, industry, identity, intimacy, generativity, and integrity” (p. 14). If Fowler’s model is built on the same principle of trust acquisition, what will happen to faith when the foundation is not ideal?

In reality, Fowler’s model parallels Erikson’s even more closely with regard to Erikson’s psychosocial crises. Erikson saw the individual as being pulled by two opposing forces in each stage, the favoring of the positive force leading to new virtues. On the surface, Fowler’s stages may appear simple and gradual, the progression seeming to occur naturally and expectedly, or at least without specifics on how or why individuals progress to higher levels of critical thinking and new perspectives on God. What takes one from an unexamined faith in the synthetic-conventional stage to taking a long, hard look at contradictions and controversies in the next? It cannot be simple maturation, or everyone would make it to the final stages. There must exist something that holds people back, or drives them forward. Cue Erikson and his crises. Erikson would say the individual must accept the force pushing forward and resist the one pulling backward. In his fifth stage, for instance, which Dunkel and Sefcek deem “the most important” (p. 14), an adolescent faces the crisis of identity versus role confusion. The adolescent must form an identity in the social world, build convictions, choose who he or she will be (p. 14). Confusion, temptation, and doubt will impede progress. In Fowler’s model, a crisis certainly makes sense, though perhaps less as a ratio or continuum and more as a single event or confrontation. For example, what better way to explain the transition from the intuitive-projective stage to the mythic-literal stage than the moment a parent tells the child Santa Claus isn’t real? That could begin the shift from imagination to logic, and with it a change in the child’s perception of God. This author sees his own transition into Fowler’s individuative-reflective stage as beginning the afternoon he read a work by the late evolutionary biologist and Harvard professor Stephen Jay Gould, who pointed out contradictions between the timeline of the biblical story of Noah and modern archeology. Though different for each individual, such turning points provide Erikson-esque crises that explain one’s advancement through Fowler’s model.

The work of psychologist Lawrence Kohlberg (1927-1987) also inspired Fowler. Fowler’s form of moral reasoning structure was an adaptation of Kohlberg’s “Six Stages of Development in Moral Thought” (Parker, p. 338). Kohlberg theorized that as one ages, the way in which one justifies actions advances through predictable stages. His Pre-Moral stage saw children motivated to make moral decisions through fear of punishment (Type 1), followed by the desire for reward or personal gain (Type 2). Morality of Conventional Role-Conformity was spurred by the desire to avoid the disapproval of peers and to abide by social norms (Type 3), and later the wish to maintain social order by obeying laws and the authorities who enforce them (Type 4). In the Post-Conventional stage, people acknowledge that laws are social contracts agreed upon democratically for the common good, and are thus motivated to behave morally to gain community respect (Type 5). Finally, one begins to see morality as residing solely within him- or herself: one must be motivated by universal empathy toward others, acting morally because it is just and true, not because it is the law or socially acceptable (Type 6) (Kohlberg, 2008, pp. 9-10). It is not difficult to see how Fowler viewed the development of moral judgment as a crucial building block of the development of faith. Universal morality, like universal faith, is a byproduct of deeper critical thinking, reflection, and cognitive ability.

In that regard, it is easy to see how well Fowler’s six stages and Kohlberg’s six stages align. Both move from perceptions and beliefs borrowed from and influenced by others, motivated by selfishness, to perceptions and beliefs formed in one’s own mind, motivated by empathy and love. Both advance toward justice for justice’s sake. One might call the stages pleasantly compatible. What’s fascinating, however, is that Fowler believed the majority of people remained in his third stage, the synthetic-conventional (with the few who advanced usually doing so only in their later years), while Kohlberg showed in his studies with children that “more mature modes of thought (Types 4–6) increased from age 10 through 16, less mature modes (Types 1–2) decreased with age” (Kohlberg, p. 19). (With age, of course, come factors such as “social experience and cognitive growth” (p. 18).) He saw youths who addressed moral conundrums (such as his famous Heinz Dilemma) with the Golden Rule and utilitarianism (p. 17), noting that “when Type 6 children are asked ‘What is conscience?’, they tend to answer that conscience is a choosing and self-judging function, rather than a feeling of guilt or dread” (p. 18).

Clearly, the post-conventional moral stage can emerge very early in life. Keeping in mind that Fowler’s form of moral reasoning structure may not be a perfect reproduction of Kohlberg’s ideas, it is interesting to consider the contradiction between an adolescent in the synthetic-conventional stage, an era marked by unexamined beliefs, conformity to doctrine, and identity heavily influenced by others, and a “Type 6” adolescent in the post-conventional stage of moral thinking, who uses reason, universal ethics, empathy, and justice to solve moral problems. Would not such rapid moral development lead to more rapid progression through Fowler’s model? If Type 4-6 thinking increases so early, why do so few people begin thinking critically about their faith and analyzing its contradictions, and why only so late in life? Perhaps it is simply that Type 6 children are such a minority; perhaps it is they who go on to reach the individuative-reflective stage. It would be intriguing to compare a child’s ability to answer moral dilemmas with his or her perspective on God and faith. How did the children of Kohlberg’s research view God? Surely some believed in God (and thus could be placed on Fowler’s model) and some did not. Was there a positive or negative correlation between moral decisions and faith? Were children moving through Fowler’s stages more likely or less likely to develop higher types of moral thinking? Or was there no effect at all? Fowler, of course, might say there are too many variables in faith progression, that it requires advancement in multiple interactive structures; even if a child makes it to Kohlberg’s final stage of moral development, there are six other structures affecting one’s spiritual progress that must be taken into account.

While this comparative analysis places an emphasis on Fowler, that is not to say Erikson’s and Kohlberg’s works do not stand on their own, or that their theories somehow automatically validate his. Placing them side by side simply provides an interesting perspective that both raises and answers questions. Whether examining the moral, the psychosocial, or the spiritual, it is clear self-reflection and critical thinking are paramount to development. Kohlberg, Erikson, and Fowler were leaders in their fields because they understood this and based their research on it. Their combined theories present a convincing case that as one grows, greater cognitive power and the confrontation of new ideas can change perspectives in positive ways, from forming one’s identity to learning love, empathy, and respect for others.
References

Dunkel, C. S., & Sefcek, J. A. (2009). Eriksonian lifespan theory and life history theory: An integration using the example of identity formation. Review of General Psychology, 13(1), 13-23.

Kohlberg, L. (2008). The development of children’s orientations toward a moral order. Human Development, 51, 8-20.

Parker, S. (2006). Measuring faith development. Journal of Psychology and Theology, 34(4), 337-348.

The Philosophy of Morality

Having explored how human morality — ideas and feelings of right and wrong — does not need a god to explain it, instead being the product of our evolutionary history and our unique societies, it is time to address a common criticism of godless morality.

It goes something like this: If morality is purely subjective, if right and wrong do not exist “beyond” or “outside” what humans determine they should be (in other words, are not set by a god), how can one justify telling someone else she has behaved in an immoral way? If a man says rape or murder is morally right, how can another justify saying he is wrong? With no empirical standard of what is ethical, ethics are simply opinions, and why would one human’s opinion have more weight or importance than another’s? Relative morality is meaningless morality.

We can put aside the obvious point that even if a god-decreed empirical standard exists there is no way for us to know precisely what it is. We’d have to first prove which god is real and which gods are fictional, then get clarification directly from this being on issues not specifically mentioned in its holy text. So the same question of how one justifies telling another she is wrong haunts the theory of Objective Morality as well.

More importantly, the common criticism is an incomplete thought, failing to comprehend the premise.

The premise is indeed that morality is opinion-based. Though rooted in evolution, the society and family one happens to be born into, life experiences, psychological states, and so on, right and wrong are ultimately matters of opinion. The answer to this question (“If morals are human opinions, how can one justify condemning another person’s actions?”) is then obvious: no justification is needed at all. Opinions do not need this kind of justification.

Suppose I were to ask, “What is your favorite color?” and then demanded you justify it using an empirical standard, a standard beyond yourself, beyond humanity — beyond human opinion. The very idea is absurd. The concept of a “favorite color” does not exist in any form beyond our individual selves (do you think that it too was decided by God for us humans to follow?). What sense does it make to demand that the person who expresses a favorite color also “back it up” using some mythological benchmark not set by humans? Opinions about the prettiest color stand on their own merits — the subjective standards of man, not the objective ones of a deity.

In precisely the same way, no external justification is needed to say, “What the rapist did was wrong, even if he didn’t think so.” If one states that another person behaved in an immoral way, that is a subjective viewpoint like one’s favorite color; there is no requirement that one justify saying so using anything other than human thought and reason. Opinions, moral or otherwise, do not need to be measured or validated against standards “beyond” or “outside” humanity.

The religious may believe these things are different, because naturally an Objective Favorite Color does not exist but an Objective Morality does. That’s as impossible to prove as the deity it’s based on, but think that if you wish. Regardless, the statement “You have to justify judging others if you don’t believe in an empirical standard” makes no sense. It’s precisely because one doesn’t believe an empirical standard exists that one doesn’t need to justify judging others! If you don’t believe in an Objective Favorite Color, you do not have to justify your favorite color using that standard. If you don’t believe in Objective Morality, you do not have to justify why you think someone did something immoral using that standard. You can stick to human standards, both individual and collective, and use them to justify your beliefs (for example, my morality — and that of many others — emphasizes minimizing physical and psychological harm; therefore rape is wrong, and the rapist has done wrong).

So if no justification is needed to state your opinion that a murderer has done wrong, if the very act of asking for justification is illogical because it ignores the obvious implication of the premise, what of the rest of the common criticism? If it’s all opinion, doesn’t one have to say all opinions are equal, if we look at things objectively? Any notion that Opinion A has more weight or importance than Opinion B is bunk. Is morality then meaningless?

It is true, if we view all this objectively, that Opinion A and Opinion B, whatever they may be, are indeed “equal,” “equally valid or important,” or however else you’d like to phrase it. How else could it be? If there is no deity, no Final Say, to give the thumbs up or down to moral opinions, that is simply reality. (Without an Objective Favorite Color, “My favorite is blue” and “My favorite is green” are both valid.) Now, this generally makes us uncomfortable or sick, because it means that though I think the opinions and ethics of the child molester are detestable and inferior to my own, there is no deity to say I am right and he is wrong; our opinions are equally valid. But that’s not the end of the story, because while opinions are equal, their real-world consequences are not.

Some moral views lead to death, physical and psychological pain, misery, terror, and so on. Others do not, or have opposite effects. These are real experiences. So while mere opinions, in and of themselves, can be said to be “equal,” we cannot say the same regarding their actual or possible effects. Some moral views are more physically and psychologically harmful than others. This is quite different from favorite colors.

See, the common criticism has it backwards. A lack of an empirical standard makes opinions meaningful, not meaningless. It’s where an empirical standard exists that opinions don’t matter. Consider an actual empirical standard: the truth (yes, atheists and liberals believe in absolute truth). Either George Washington existed or he didn’t. I say he did, another says he didn’t…one of us is incorrect. When it comes to the truth, opinions don’t matter. The objective truth is independent of our opinion. Morality is different: it is not independent of our opinions (it’s opinion-based, after all), and thus our moral views matter a great deal because some will cause more harm than others. If God exists and determined that killing a girl found to not be a virgin on her wedding night was right, your opinion about killing non-virgin girls on their wedding nights would be meaningless. It wouldn’t matter if you thought this wrong — you’d be incorrect. But if there is no deity-designed standard “beyond” humanity, your opinion is meaningful and matters a great deal because awful real-world consequences can be avoided if your moral opinion is heard and embraced.

“Well, so what?” one might ask. “Why is harm itself wrong? Who says we should consider death and pain ‘wrong’ rather than, say, life and happiness?”

The person who asks this has lost sight of linguistic meaning. What exactly does “wrong” (or “bad” or “evil” or “immoral”) mean? Well, it essentially means undesirable. To say something is wrong is to say it’s disagreeable, intolerable, unacceptable, something that should not be done, something to be avoided.

Why is harm wrong? Harm is wrong because it’s undesirable. To put it another way, asking “Why is harm wrong?” is really asking “Why is harm undesirable?” And the answer is “Because it hurts” — because we are conscious, organic creatures capable of experiencing death, pain, humiliation, grief, and so on. Now, this does not mean everyone will agree on what constitutes harm! That is the human story, after all: a vicious battle of opinions on what is harmful and what isn’t (and thus what’s wrong and what isn’t), with some ideas growing popular even while change awaits on the horizon. We even argue over whether causing harm to prevent a greater harm is right (desirable), as with killing one to save many or going to war to stop the evils of others. But the idea that harm is undesirable is universal, because each human creature has something they would not like to happen to them.

This includes those who bring pain and suffering to others or themselves. The rapist may not wish to be raped; the mullah who supports female genital mutilation may not wish to be castrated; the suicidal person may not wish to be tortured in a basement first; the masochist, who enjoys experiencing pain, may not wish to die; the serial killer may not wish to be left at the altar; the sadist, who loves inflicting pain, may not wish to be paralyzed from the neck down.

As soon as you accept the premise that each person has some form of harm he or she wants to avoid, you’ve accepted that harm is wrong — by definition. Even if our views on what is harmful (or how harmful something is) vary widely, we have a shared foundation built on the actual meanings of the terms we’re using. From this starting point, folk from all sides of an issue present their arguments (for instance, “It is wrong — undesirable — for a starving man to steal because that harms the property owner” vs. “It is right — desirable — for a starving man to steal because if he doesn’t he will die”). Though we individuals do not always do so, we often decide that what’s wrong (undesirable) for us is also wrong for others, because we evolved a capacity for empathy and are often smart enough to know that a group living under rules that apply to all can protect and benefit us by creating a more stable, cooperative, caring society. The disagreements may be savage, but the important premise, that harm is wrong because it is undesirable, is universally accepted. Things couldn’t be any other way unless you simply wanted to throw out the meaning of words.

The path forward from there is clear, despite the insistence of some that actions need external justification even if moral opinions do not. This is merely another go at an obviously flawed idea. If no external, objective standard is needed to justify moral views, why would you need one to justify actions based on those moral views? You wouldn’t. We justify our actions based on the subjective, human ideas that are our moral views, and then try to popularize our ideas because we think we know best. It’s simply what human creatures do, whether our ideas are in the minority or majority opinion, whether they lead to death and pain or peace and kindness.