Christianity may be compared to a grand cathedral with very little evidence for its foundation.
Christians who sincerely believe their religion is true have constructed many complex arguments to justify belief.
But although their intentions are good, their arguments are not.
Upon scrutiny, each argument leads to little or nothing of substance.
Due to the sheer quantity of arguments for Christianity, it took many years for me to recognize that none of them were solid.
I hope this document will help others to more easily see Christianity for what it is: a cathedral without a foundation.
Anyone who would like to improve this document may do so, and may freely distribute modified versions for any purpose without any requirements.
(One exception: I have placed a red dot in the bottom-right corner of graphics whose origins I have not yet investigated. You may want to determine who owns those images before using them in any commercial endeavor.)
Cosmological argument
The cosmological argument is an attempt to prove the existence of God based on the existence of the cosmos.
It is usually presented like this:
- Whatever begins to exist has a cause.
- The cosmos began to exist.
- Therefore, the cosmos had a cause, and we will call it "God".
This argument attempts to exploit the ill-defined term "exist".
Something that has physical existence, such as a rock, exists because it occupies a place in space and time.
By contrast, something that only has conceptual existence, such as algebra,
exists because some intelligent being organized it into a concept,
or at least recognized it and gave it a name.
It would obviously be a mistake to suppose that because algebra exists (as a concept)
that it therefore has physical existence, occupying a place in space and time.
The cosmological argument attempts to go in the opposite direction,
by suggesting that because the Universe exists (physically),
it must have been organized by some intelligent being.
If we suppose that the cosmological argument uses a consistent definition of "exist",
then it quickly falls apart.
To illustrate, let us consider this argument with each possible definition:
If we assume that "exist" refers to physical existence, its two premises fall apart.
The first premise depends on something that has never been observed.
No one has ever witnessed anything beginning to physically exist.
We don't even know if that is an event that can occur.
So any assertion about how such events occur, as in the second premise, is at least equally baseless.
If we assume that "exist" refers to conceptual existence, then the first two premises make a lot more sense.
Both natural and theological explanations suggest that the Universe did advance through stages of organization.
From this perspective, a simpler form of the cosmological argument is invoked by simply asking, "who detonated the Big Bang?".
And the implied answer, of course, is God.
However, supernovas, stellar fusion, and other large cosmological releases of energy are known to occur in response to natural conditions.
Therefore, a reason is still lacking to suppose that a big bang would require something supernatural.
There is something seemingly intuitive about the suggestion that organization implies intelligence,
but a great many highly-organized systems can be observed to be caused by natural processes.
If we consider only two possibilities, "God always existed, and he created the Universe out of nothing", or "The Universe always existed",
then the latter possibility really seems to add fewer superfluous details,
especially since we can observe the Universe in existence but cannot easily observe the existence of God.
It would be a special pleading fallacy to suppose that the Universe must have a cause, but God does not.
Ultimately, the cosmological argument reduces to, "why is there something rather than nothing?"
The only honest answers to this question will contain some form of, "we don't know", and will not contain, "the true answer is..."
Dogmatism
Dogmatism occurs when a person pridefully refuses to consider the possibility that he or she may be mistaken in matters of belief.
It causes a person to place more confidence in an idea than is justified by any reasonable basis for the confidence.
Dogmatism involves an excessive trust in one's self, and consequently a lack of faith in others.
Dogmatism frustrates communication because the dogmatic person will not legitimately consider any suggestions that others make.
The dogmatic person assumes a role of superiority in conversations by attempting to influence others without admitting any possibility of being influenced him- or herself.
Dogmatism also inhibits a person's own ability to progress in the refinement of his or her beliefs.
To illustrate this principle, consider a situation in which information about a particular concept is made available incrementally (that is, line upon line, precept upon precept).
A non-dogmatic person will make small adjustments to his or her beliefs as the new revelations are received.
Over time, this may slowly cause one idea to overtake another idea, changing which idea dominates in the recipient's set of beliefs.
By contrast, a dogmatic person has a habit of shifting full confidence toward the idea that currently dominates.
Thus, as new information is received, the dogmatic person dismisses its influence, and ultimately never allows any new idea to take root.
It is quite possible to be dogmatic about either belief or non-belief in any subject.
Consequently, dogmatism is a problem for both believers and non-believers alike.
One significant cause of dogmatism is having a strong desire to be perceived as being right.
By contrast, open-minded people would rather find out that they are wrong, so they can become right.
Why do people become dogmatic?
Do they believe that truth needs them to defend it?
Do they fear having to change their beliefs?
Do they fear the shame of being discovered in the wrong?
Perhaps it is a combination of all of these, but it is clear that there are few valid or good reasons to be dogmatic.
Dualism argument
Dualism is the belief that the processing of information (function) is insufficient to explain consciousness.
It stems from Descartes' mind-body problem.
If dualism is true, then a simulated brain would never be able to achieve consciousness, no matter how perfect the simulation was, because it would lack the "dual" that makes consciousness work.
Although dualism does not directly imply the existence of God, theists often claim that the dual is a spirit that dwells in a supernatural realm, which opens the possibility for resurrection, and an afterlife.
The argument for dualism usually sounds something like this: "I know there is a dual because I can sense that my feelings are so real, and there is no way that function alone could do that."
However, the dualism argument essentially reduces to an argument from ignorance.
It basically says, "I do not understand how functionalism could produce consciousness, therefore functionalism cannot produce consciousness".
But claiming the existence of a dual does nothing to explain consciousness either.
Daniel Dennett is famous, among other things, for identifying that dualism makes the homunculus fallacy.
That is, it attempts to explain consciousness by suggesting that there is a conscious component somewhere.
The problem is that this creates an infinite regress because then the dual would have to have a dual in order to be conscious.
If a hypothetical machine were functionally equivalent to a conscious being, but lacked the dual, then it follows that the dual could not provide any functional benefit.
The machine without the dual would then be indistinguishable from something that was conscious.
So the role of the dual could not be to do anything, but merely to exist.
Yet, if the brain even so much as had the capability to detect the dual's existence, then the dual has crossed the interface into the functional realm, and could be replaced by something that merely simulated the same function.
Entropy argument
The entropy argument postulates an imaginary law that forbids order from emerging from disorder.
If the entropy law were valid, then it would prevent natural evolution from occurring.
Because the entropy argument loosely resembles the Second Law of Thermodynamics, some people believe it is an actual physical law.
However, no such law has ever actually been established in physics, and numerous natural phenomena demonstrate that it would be false.
The Second Law of Thermodynamics says that the entropy of a closed system cannot decrease.
By contrast, the entropy argument claims that the entropy of a system cannot decrease without the application of intelligence.
Significantly, the entropy argument claims to apply to all systems, while the Second Law of Thermodynamics applies only to closed systems.
Also, the entropy argument implies that it can be circumvented by intelligence, whereas the Second Law of Thermodynamics is believed to always apply.
A closed system is one that cannot exchange entropy with its surroundings (in thermodynamic terms, an isolated system: one that exchanges neither matter nor energy).
For example, standard air conditioners would not be able to cool a room if the heat-exchanger were kept indoors.
However, by putting the heat exchanger outside, air conditioners can cool a room by heating the outside environment.
Life is not a closed system.
It consumes fuel, emits waste, and interacts with its environment in many ways.
Consequently, life is able to decrease in entropy.
A certain thought experiment proposed by James Clerk Maxwell, called Maxwell's Demon, suggested the possibility of circumventing the Second Law of Thermodynamics with the application of intelligence.
However, numerous attempts to demonstrate Maxwell's Demon in laboratory experiments have failed, and physicists almost universally agree that the Second Law of Thermodynamics cannot be bypassed by intelligence.
Confusion about Maxwell's Demon, confusion about the nature of open and closed systems, and an over-zealous desire to disprove evolution are probably all responsible for influencing the formation of the entropy argument.
"Evolution is not science" assertion
Sometimes theists claim that evolution is not observable, testable, repeatable, or falsifiable, and is therefore not part of science.
However, evolution is observable, testable, repeatable, and falsifiable.
Moreover, evolution is the product of many different disciplines of science.
Observable: The theory of evolution was the product of observations made about nature.
One can make the very same observations today.
For example, squirrels are easy to find, and they appear to be approximately half-way between mice and small monkeys.
As our powers of observation have improved, we have observed more evidence that supports evolution inside of individual cells.
Here is one of many examples of inter-species evolution being observed in modern times.
Testable: Numerous dating techniques have been developed.
They are based on radioactive decay, whose rates are known to be extremely stable and predictable.
They have been tested against stalagmite accumulation.
They have been tested against continental drift combined with magnetic pole reversals.
They have been tested against each other, against tree rings and known climate events, cosmic events, etc.
There are entire fields of science dedicated to testing dating techniques.
Fossils have been dated and those dates confirmed with alternate dating techniques.
Repeatable: Evolution is so reliable that similar laboratory experiments can reliably demonstrate that bacteria will evolve resistance to certain chemicals.
Here is a video of such an experiment.
Of course, we cannot repeat something that takes millions of years, but we can repeat the same principles in laboratory settings and confirm that they behave as expected.
As a simple example, genetic algorithms are an effective optimization technique in the domain of artificial intelligence.
They reliably improve with successive generations of simulated evolution.
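As a sketch of that reliability, here is a minimal genetic algorithm (a hypothetical toy example; the "OneMax" fitness function simply counts 1-bits and stands in for any measurable objective):

```python
import random

random.seed(0)  # deterministic run, for illustration only

GENOME_LEN = 32
POP_SIZE = 20

def fitness(genome):
    # OneMax: count the 1-bits; a stand-in for any objective function
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit independently with a small probability
    return [bit ^ (random.random() < rate) for bit in genome]

def evolve(generations=100):
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    best_history = []
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        best_history.append(fitness(pop[0]))
        survivors = pop[:POP_SIZE // 2]             # selection
        children = [mutate(random.choice(survivors))
                    for _ in range(POP_SIZE // 2)]  # reproduction with mutation
        pop = survivors + children
    return best_history

history = evolve()
print(history[0], history[-1])  # best fitness at the first and last generation
```

Because the fittest half always survives, the best fitness never decreases, and over successive generations it climbs toward the maximum, exactly the reliable improvement described above.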
Falsifiable: The theory of evolution was formed before we had the capability to sequence genomes.
If it had been a false theory, that would have been apparent when we started looking at the nucleotide sequences in DNA and found no statistically significant similarity between species that had previously been hypothesized to evolve into each other.
To the contrary, the field of phylogenetics was formed, and provides some of the strongest evidence yet available to support the theory of evolution.
Moreover, other markers such as mitochondrial DNA have also been found to reliably tell the same story.
Indeed, evolution has not been successfully falsified, but it has been subjected to many falsification attempts.
In particular, evolution has been a constant target by a large number of theist scientists with significant interest in debunking it.
The failure of many attempts to falsify the theory of evolution has been a tremendous boon toward validating it.
Faith argument
The faith argument sounds like this:
"Jesus loved you so much that he was willing to suffer for you. He suffered more than you can imagine, and ultimately laid down his life for your sins.
And all that he requires from you in exchange for his tremendous and loving gift of mercy that will enable you to attain incomprehensible glory for all eternity is that you have faith in him."
The first thing that should be immediately obvious when the faith argument is made is that someone really really really wants you to have faith.
The deal has been sweetened as far as any deal can possibly be sweetened:
If you just give them what they want, you will receive an eternal and unending reward of unimaginable bliss!
And if appealing to your sense of greed doesn't sway you, they resort to an emotional plea:
You would be a horrible ungrateful person for rejecting the most sincere act of sacrifice ever offered if you don't accept the deal!
And if that doesn't move you, they will try to appeal to your need for personal acceptance and sense of belonging:
Someone understands you, cares about you personally, accepts you, and loves you almost unconditionally, as long as you just have faith!
But who exactly wants you to have faith? And why do they want you to have faith so badly?
Let us consider both the believers' and unbelievers' answers to these questions:
Believers' version: God is the one who wants you to have faith.
The reason he wants you to have faith is for your own good.
Because God respects your free will, he will not force you to believe. (See also the free will argument.)
But if you believe, then he can take care of every other deficiency in your character and make you into a glorious being.
And when Christian religious denominations plead with you to have faith, they are only supporting God's will for your personal well-being.
Unbelievers' version: Religion is the entity that wants you to have faith.
The reason it wants you to have faith is because faith is what sustains the religion.
If a religion were to lose all of its believers, it would be a dead religion.
Just as corporations work to maximize profit, politicians work to improve their popularity, and universities work to establish credibility, so religions work to promote faith.
Perhaps, one of the best ways to discern which of these two explanations is more plausible is to simply try to understand them both.
It is very difficult to comprehend why God would be so particularly interested in faith, and not so much in other attributes that help a person to learn, such as interest in learning, humility, willingness to work hard, resilience to defeat, or curiosity.
It is also very difficult to comprehend the supposed relationship between free will and having faith. See also choice argument.
Faith in science
Sometimes theists point out that science requires faith.
The legitimacy of this claim depends entirely on how one defines "faith".
There are some definitions of faith for which this statement is true, and many for which it is not.
Sometimes faith is defined simply as "trust".
No scientist has time to repeat all of the experiments that all other scientists have performed.
Consequently, scientists operate with trust that other scientists actually performed the experiments they report to have performed, and actually obtained the results they report to have obtained.
This trust has been violated in several notable cases.
However, it can also be argued that science is more worthy of trust precisely because such cases have been exposed.
In science, researchers are expected to publish their methods so that others can validate their results, and this is what leads to the exposure of fraud.
Science is decentralized, and strives to be as transparent as possible.
By contrast, religious claims are consistently unverifiable, so there is no way to measure how much fraud has occurred in religion.
Because faith has many conflicting definitions, it can be dangerous to admit that scientists have "faith", because someone will inevitably interpret the statement using a different definition of the word.
Another common definition of faith is, the choice to accept or believe in something without having any verifiable evidence.
By this definition, faith is directly opposed to the scientific method.
The following thought experiment illustrates some of the issues with this definition of faith:
Suppose Alice asks Bob to consider a new idea.
Bob might choose to trust in Alice by evaluating her new idea.
His choice to tentatively accept her idea without proof, so that he might give the idea due consideration, might be called a type of faith.
Alternatively, Bob might tell Alice that he has faith in his existing beliefs, and is therefore unwilling to consider her new idea.
This is also a type of faith, even though the attitudes are almost completely opposite!
The primary difference is the object of the faith.
In the first case, Bob put his faith in Alice. In the second case, he put his faith in himself, or perhaps in the original source of his beliefs.
It would be arbitrary to refer to one of Bob's two potential choices as "faith" and the other as "doubt".
If Bob exhibits "faith" in Alice, that implies a degree of uncertainty about his existing beliefs,
and if he exhibits faith in his existing beliefs, that implies a degree of distrust toward Alice.
Thus, one cannot just choose to "have more faith".
One can only choose to exercise faith in something, and in that act chooses to exercise doubt in the opposing direction.
One might ask, would it be good for Bob to exercise faith in Alice?
Ultimately, the answer depends on whether or not Alice's new idea is right.
However, proof of the correctness of Alice's idea is probably not available, or else there would be no call for Bob to exercise faith in it at all.
Therefore, Bob must make his choice without knowing the answer.
In general, tentative trust leads to greater productivity.
Thus, good faith tends to be that which is motivated by humility, open-mindedness, optimism, curiosity, and temporary trust,
while bad faith tends to be that which is motivated by pride, closed-mindedness, pessimism, superstition, and permanent dogmatism.
Many people assume that "faith" implies "faith in God", and therefore lazily omit specifying the object when they talk about faith.
Such sloppy communication has a negative effect on those who do not believe in God because it implies that they are unable to extend trust.
It is even more exclusionary to refer to one's set of beliefs as one's "faith".
Other definitions of faith, such as "the substance of things hoped for, the evidence of things not seen", tend to be more confusing than helpful.
Founding fathers arguments
Many Christians in the United States revere the founding fathers of the United States (with some very good reasons).
However, sometimes they use the founding fathers to make such claims as:
"The founding fathers were all Christians", or "the founding fathers intended for the United States to be a Christian nation."
It is certainly true that most of the founding fathers were Christians.
However, given the time period, a surprising number of them were actually deists.
(Deism is the position that God created the Earth and life, but it rejects the idea of a personal God who intervenes in the affairs of humans.)
In the 1700's, science had not yet discovered any complete natural explanations for the organization of the Earth or the existence of complex life on it.
Consequently, deism represented the most rational explanation then available.
The assertion that the founding fathers intended for the United States to be a Christian nation is easily refuted.
The Constitution of the United States makes only two references to religion, and only for the purpose of clarifying that it should be separated from politics.
Sometimes Christians refer to the Declaration of Independence, which does include statements that offer deference to a Creator, but the Declaration of Independence was not established for the purpose of setting precedent for the establishment of the nation, as the Constitution was.
The other writings of the founding fathers explicitly describe that they were deliberate in creating a wall of separation between church and state, so it was obviously not merely an oversight that they failed to mention Christianity in the Constitution.
Moreover, history indicates that the founding fathers were deliberately trying to avoid some of the problems that occur when politics and religion are mixed, such as the debacle in which Henry VIII established the Church of England.
Sometimes, the motto "in God we trust" as printed on U.S. money, or the "under God" clause from the Pledge of Allegiance, are cited to suggest that the United States is a Christian nation.
However, both of those were added in the 1950's, and do not represent the diversity of views that exist within the United States.
Since the Founding Fathers of the United States were a diverse group with many differing opinions, it is certainly possible that some of them actually did intend for the United States to be a Christian nation.
However, the wishes of some of the Founding Fathers neither establish the direction for the nation nor even necessarily represent what is good or right.
Fruits, knowing them by
Matthew 7:16 records that Jesus described a method for discerning false prophets, "Ye shall know them by their fruits".
This line is sometimes flung at atheists to suggest that the fruits of religion are sufficient basis to believe, and that religion teaches people to be moral, while atheism deprives people of a basis for morality.
This argument reduces to a form of the morality argument, and makes the same fallacy of assuming that belief in God is required for morality.
Moreover, upon closer inspection, the "fruits" of religion are not quite the fresh produce that is often assumed:
While religious denominations teach their individual members to love their neighbors, the high-level reputation of religion is still one of intolerance and hypocrisy.
And while religious denominations certainly teach their individual members to repent of all their sins, the high-level reputation of religion is one of never acknowledging any wrong-doing, and stubbornly refusing to acknowledge new evidence or make any meaningful changes in its teachings or practices.
Further, religion is not the only entity capable of bearing fruit.
Science is made of scientists who dedicate tremendous time and effort toward making positive impacts in the world.
The fruits of science include real advances in the domain of knowledge.
Religion may claim to receive revelations from God about supernatural concepts, but its supposed contributions to knowledge cannot be verified.
By contrast, science is responsible for every advancement in technology, and can claim such fruits as modern sanitation, electricity, computers, the Internet, smart phones, and modern medical procedures.
Religious apologists often make the religious scientists argument in attempt to take credit for these fruits, but it is ultimately the methods of science and not religion that produced the fruit, and the scientific method works just as effectively for both believers and unbelievers.
The chart below shows the distribution of charitable giving in the United States in 2017 according to http://givingusa.org:
More than twice as much money was donated to religion as to any of the other categories, and religions in the United States tend to focus much more on producing faith than on producing works.
Religion also has a strong effect for making people skeptical toward science, and thus tends to discourage giving for fundamental research, which tends to have the greatest long-term benefit.
Notably, donations toward fundamental research were not even significant enough to warrant their own category.
A pessimistic perspective might observe that, because of religion, far more money is invested in attempts to interfere with the advancement of verifiable knowledge than to promote it.
Gaps in the fossil record
Gaps in the fossil record are often cited as evidence that evolution is not true.
Here is an example meme used to make this point:
This claim falsely assumes that evolution predicts there would not be gaps in the fossil record.
Counter to popular intuition, most dead creatures do not form into fossils, and evolution does not progress at a constant or even steady rate.
The combination of these two factors naturally results in fossils that are clustered around certain species with very few intermediate remnants.
Why do very few dead creatures actually become fossils?
Because highly specific conditions are necessary for the fossilization process to even begin.
The vast majority of creatures either decompose or are eaten by other creatures before leaving any permanent trace.
Why does evolution happen in unsteady spurts?
When the mutation of only one particular nucleotide in a sequence of DNA leads to a selective advantage, nature can find it very rapidly.
However, this typically leads to local optima, where a large population of a particular species may form.
In order to break out of a local optimum, an unlikely combination of mutations may be required, which can take orders of magnitude longer to occur in nature.
When that finally happens, evolution proceeds rapidly again until it finds the next local optimum.
This sporadic behavior can be easily confirmed by plotting fitness with respect to generations in a genetic algorithm.
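As a sketch of what such a plot looks like, here is a hypothetical toy model (all parameters are illustrative assumptions, not a biological simulation). It uses a "Royal Road" style fitness function that rewards only fully completed blocks of bits, so a simple evolutionary hill climber spends long stretches on flat plateaus, then jumps:

```python
import random

random.seed(1)  # deterministic run, for illustration only

BLOCK, BLOCKS = 4, 6  # genome of 24 bits, split into 6 "building blocks"

def fitness(genome):
    # Credit is earned only when an entire block is complete,
    # so progress arrives in jumps separated by flat plateaus.
    return sum(BLOCK for i in range(BLOCKS)
               if all(genome[i * BLOCK:(i + 1) * BLOCK]))

def run(generations=3000, rate=0.02):
    genome = [0] * (BLOCK * BLOCKS)
    best, history = fitness(genome), []
    for _ in range(generations):
        child = [bit ^ (random.random() < rate) for bit in genome]
        if fitness(child) >= best:  # (1+1) hill climber; ties allow neutral drift
            genome, best = child, fitness(child)
        history.append(best)
    return history

history = run()
plateaus = sum(1 for a, b in zip(history, history[1:]) if a == b)
print(f"{plateaus / len(history):.0%} of generations showed no visible progress")
```

Plotting `history` yields a staircase: long plateaus while neutral mutations drift, then an abrupt step when a block completes, which is the same qualitative shape as the punctuated pattern described above.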
Moreover, even if evolution did predict that there would be a continuous fossil record, the claim that there are big gaps is debatable.
Even among creatures that are still living, enough intermediate stages are represented to form an apparent spectrum of continuity.
If scientists were to divide the size of every gap in half by finding an intermediate fossil, opponents of evolution would likely still be dissatisfied with the span of the remaining gaps, because subjective evaluations are not very effective for convincing someone who is already determined to arrive at a particular conclusion.
So, are the gaps really unreasonably large? It depends whom you ask.
In general, those with more knowledge on the subject are also more likely to agree that the existing amount of evidence is already very convincing.
(It should also be noted that the meme shown above also makes several factual errors in addition to its claim about gaps in the fossil record: For example, the number of chimpanzees has already declined to fewer than 300K, the number of humans has grown to more than 7B, and humans did not evolve directly from chimpanzees, but from a common ancestor that they both share.)
Improbability argument
The improbability argument suggests that it is statistically improbable for even primitive life to have spontaneously formed.
(It differs from the Complexity argument, which uses complexity to dismiss evolution rather than abiogenesis.
Note that the two arguments are often made together by using complexity to dismiss both abiogenesis and evolution.)
The improbability argument is typically presented with the following supporting details:
- Even the simplest single-celled life form on Earth still contains significant complexity.
- The statistical probability of so much complexity spontaneously coming together is extremely small.
- Therefore, abiogenesis requires an intelligent designer.
The primary error of the improbability argument is its assumption that the simplest modern biological life is representative of early pre-life.
By modern definitions, "life" requires a number of complex features, such as cellular walls.
Consequently, simple chemical reactions, such as fire, are not regarded as being "alive".
However, nature is not constrained by modern definitions of life.
Whatever spontaneously formed more than 4 billion years ago probably met only a small subset of modern requirements for life, and important features like cellular encapsulation, photosynthesis, or the Krebs cycle probably evolved later.
The plausibility that pre-life may have been little more than a simple chemical reaction can be seen by noting how many properties of "life" are satisfied by fire, a simple chemical reaction with which most people are familiar:
- Fire consumes food (fuel),
- it emits waste (smoke),
- it generates heat,
- it grows and propagates,
- and it can die.
Although fire is not technically alive, it exhibits several of the properties of life, and it spontaneously occurs in nature.
The fact that no simple chemical reaction has been continuously reacting for 4 billion years indicates that there is a significant survival advantage for reactions with cellular encapsulation, so natural selection would apply just as well to pre-life as it does to modern life.
Another possible resolution for the improbability argument is massive parallelism.
The oceans are very big, especially relative to a microscopic organism, and abiogenesis only needed to occur once in order for the Earth to now be covered with life.
Moreover, the Milky Way galaxy is estimated to contain at least hundreds of billions of planets, the Observable Universe is estimated to contain trillions of galaxies, and the portion of the whole Universe occupied by the Observable Universe is entirely unknown.
It may be that much of the Universe is actually mostly lacking in life, and Earth just happens to have been a place where a highly improbable event occurred.
After all, if abiogenesis happened to occur anywhere at all, those places would be the only places where life would exist to wonder why it happened there.
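The arithmetic behind this "many trials" point can be sketched directly. The probability of at least one success in n independent trials is 1 - (1 - p)^n, which approaches certainty once n times p is large, even when p is astronomically small. The numbers below are purely hypothetical placeholders, not real estimates:

```python
import math

def p_at_least_once(p_single, n_trials):
    # P(at least one success in n independent trials) = 1 - (1 - p)^n,
    # computed via log1p/expm1 so tiny probabilities do not underflow.
    return -math.expm1(n_trials * math.log1p(-p_single))

# Purely hypothetical numbers for illustration:
p = 1e-20    # chance of abiogenesis per "site" per unit of time
few = 1e10   # trials in one small patch of one ocean
many = 1e22  # trials across billions of planets over billions of years

print(p_at_least_once(p, few))   # vanishingly unlikely in one place
print(p_at_least_once(p, many))  # nearly certain somewhere
```

So an event far too improbable to expect in any one place can still be effectively guaranteed somewhere, and only the lucky locations contain observers to ask why it happened there.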
Clearly, society itself is very much "alive".
In fact, in almost every aspect the life of society is much greater than that of any individual that supports it.
It has many more parallel "thoughts" than any human mind is even capable of supporting at once.
Yet, like all lesser forms of life, it seeks to survive, to grow, to propagate, and to advance.
This is significant to the human condition because it means that we are part of something much greater than our own lives.
The things we do with our own lives ultimately affect the life of society, which will not only live longer than we will, but will ultimately accomplish more good, achieve greater purpose, and facilitate prosperity for future generations.
This gives purpose to our own mortal lives, even if we will not personally live again after death.
It also dictates morality by showing us that the things we do are good, only to the extent that they positively assist the life of society.
There are, unfortunately, forms of life that pursue their own objectives by parasitically leeching from the lives of other entities.
Biological examples include cancers, mosquitoes, viruses, and ticks.
Non-biological examples also exist, including fraudulent organizations that steal from others in order to sustain their own operations.
In light of the detrimental effects that such life forms have on the goals of other lives, it is clear that parasitic behavior is unethical.
Sadly, it is better that some lives are destroyed than that the lives of humans or society be limited by life forms that take such easy and selfish paths.
Just like many other organizations, organized religions also have life.
They survive by convincing people that they represent God, but they have such conflicting doctrines that it is clear that most of them are ultimately frauds.
Like other forms of life, these frauds exhibit such a strong will to survive that they will do whatever it takes to sustain themselves, including the teaching of lies, the taking of donations under false pretenses of representing God, and the redirecting of human lives for their own support.
It should also be admitted that organized religions often also do much good, both for the lives of their adherents, and also for society.
Nevertheless, as society advances, and as science models truth with increasing accuracy, the fabrications of false religions will increasingly prevent their adherents from participating in the progress of society.
Truth is too important to sacrifice.
We cannot fail to model a large section of truth just to allow well-intentioned but fraudulent organizations to survive.
Simply put, the advancement of society matters much more than the lives of these religions.
Morality argument
The morality argument makes the implicit suggestion that atheists are immoral.
It is usually posed as follows:
- If there is no God, then there is no objective morality.
- There is objective morality.
- Therefore there is a God.
This argument relies on confusion about the definition of "objective morality".
When specific definitions are made, the argument falls apart.
To illustrate, consider the following specific definitions:
- relative morality: Morality is subjective. That is, it depends on individual priorities.
- divine sanctioned morality: What is moral depends only on God's priorities.
- objective morality: Morality does not depend on the priorities of anyone.
The precise differences between divine sanctioned morality and objective morality can be subtle, but are critical for the morality argument.
Objective morality suggests that if there is a God, he would conform to what is moral.
His role with respect to morality would be to show people what is objectively moral, but not to define it.
And, hypothetically, if God were to fall then what is moral would not be affected.
By contrast, divine sanctioned morality suggests that God defines what is moral.
In this context, it would be meaningless to call God "moral" because he would still be moral even if he were a big jerk.
And, hypothetically, if God were to fall then what is moral would fall with him.
So, if one believes in divine sanctioned morality, then the morality argument becomes:
- If there is no God, then there is no divine sanctioned morality.
- There is divine sanctioned morality.
- Therefore there is a God.
This view of the argument is trivial to dismiss because the second premise is not at all obvious.
And, if one believes in objective morality, then the morality argument becomes:
- If there is no God, then there is no one to show us what is objectively moral.
- There is someone to show us what is objectively moral.
- We call that being God.
Again, the argument falls apart because the second premise is no more obvious than the conclusion.
The morality argument is closely related to the transcendental argument, and may be considered a subset of it.
Pascal's wager
Blaise Pascal (1623-1662) pointed out in his writings that the possibility of an infinite reward would overwhelm any finite cost associated with potentially believing in something foolish.
It follows, therefore, that people should choose to believe in religion, even if it is probably not true, because the potential reward is so much bigger if it does turn out to be right.
This has come to be known as Pascal's Wager or Pascal's Gamble.
It is usually presented in a form somewhat like this:
- If you believe in God and you are wrong, you lose very little.
- If you do not believe in God, and you are wrong, you lose everything!
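The arithmetic driving the wager can be sketched as a naive expected-value comparison. The probabilities and payoffs below are made-up placeholders; the point is that an unbounded reward makes the expected value infinite no matter how small the probability or how large the finite cost, which is precisely the reasoning the advance fee scam comparison later in this section exploits.

```python
from math import inf

def expected_value(p_true: float, reward: float, cost: float) -> float:
    """Naive decision-theoretic value of paying `cost` now for a `reward`
    that is delivered only with probability `p_true`."""
    return p_true * reward - cost

# With any finite reward, a tiny probability keeps the bet unattractive.
print(expected_value(1e-9, 1_000_000, 50))   # negative

# But once the promised reward is unbounded, the expected value becomes
# infinite regardless of the cost or how unlikely the promise is.
print(expected_value(1e-9, inf, 1_000_000))  # inf
```

This is why the wager "works" on paper: the infinity in the reward column erases every other consideration, including whether the promise is credible at all.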
An implicit assumption of Pascal's wager is that God cares only about who believes in him.
Moreover, it assumes that he either cannot tell, or does not care about, the difference between those who are sincerely convinced of his existence and those who are merely pretending because they are obsessed with maximizing their own personal reward.
However, these assumptions are uncharacteristic of an omni-benevolent, all-knowing God.
The absurdity of Pascal's wager is more apparent when framed in the context of an advance fee scam:
Suppose a "Nigerian Prince" offers to send you $ONE MILLION DOLLARS if you pay an advance fee of $50.
Would a benevolent God be pleased with those who participate in such an obvious scam?
Obviously not!
Participating in a scam only strengthens the organization that operates it, and enables it to be more effective at defrauding other people.
But what if the "Nigerian Prince" raised his reward to $INFINITY DOLLARS!!!
Then, according to Pascal's wager, the deal has become far too good to pass up because it is now worth any cost even if there is only a small chance of it turning out to be legitimate.
Practically, however, such a ridiculous delayed reward only emphasizes the shady nature of the operation.
Theists may contend that their religious denomination is unlike an advance fee scam because it does not profit from the faith that people place in God.
However, most crimes can be committed through negligence, and fraud is certainly one that can be perpetrated in this manner.
Even if no individual directly profits from faith, it is still what sustains the life of the organization.
And even if the teachings of one religion happen to be true, the majority of religions still sustain themselves with false teachings.
Hence, there is significant risk of doing damage to society in accepting a religion that one does not genuinely believe to be true, and it is not reasonable to suppose that God would be pleased with those who are only gambling because they prioritize their own personal reward over their impact on society.
Postmortem compensation
Postmortem compensation is the idea that God will make everything right after we die.
This principle is used to justify the injustices that God allows to occur in this life.
It is closely related to the mysterious ways justification.
Postmortem compensation operates on the same principle as an advance fee scam.
That is, the participant must deliver now, but the compensation will not come until later, when it is too late to implicate the scammer.
While this level of justification may be sufficient to appease the mind of someone who is already determined to believe in God, it is certainly not sufficient to persuade someone who is honestly uncertain about the existence of God.
Consequently, answers that rely on postmortem compensation raise serious doubts about how a benevolent God could honestly expect people to believe in him while offering such a poor basis for that belief.
Purpose of life
Theists often think they have the only answer to the question, "What is the purpose of life?"
In fact, answers to that question do not even need to depend on superstition!
At a personal level, one's purpose is whatever he or she chooses for it to be.
This is true even if there is a God who wants us to choose to live for a specific purpose, such as having faith in him.
(See the faith argument.)
God's purpose does not automatically become our purpose until we exercise our free will by choosing to live for his will.
Thus, the purpose of life is whatever we choose for it to be, whether or not God exists.
At a higher level, the purpose of life is really quite obvious, especially if one examines the nature of life itself.
Evidence suggests that life began on this Earth about 4 billion years ago.
Since that time, it has progressed slowly from pre-cellular life to the extremely complex creatures that we are today.
Hence, the purpose that life has followed in general since its inception has consistently been the same: to progress.
When people choose a purpose for their individual lives, they are specializing within a society that works toward its higher-level purpose of progression.
Consider the following analogy: Suppose a small child becomes lost in the mountains.
The community quickly organizes in a unified effort to find the child.
Suppose 50 volunteers offer to help find the child.
They divide the area into 50 regions, and assign one region for each person to search.
After everyone does their part, the child is found and everyone rejoices.
Perhaps, one of the volunteers might lament, "I wanted to be the one to find the child. All of my effort was for nothing!"
But then a wise friend might counsel him, "Stop thinking about yourself! It's not all about you, and your personal glory. We all did our best to help, and the overall objective of the community was achieved! You need to learn to find joy in being part of something bigger than yourself."
Somewhat like the self-centered volunteer in that analogy, many theists suppose that life can have no purpose unless they personally get to live forever and obtain great personal glory.
But in focusing on themselves, they withdraw from their role in society.
If everyone lived for the "purpose" of having faith in God, society itself would make little progress.
But by pursuing whatever purpose we have individually chosen, we can find great meaning in sustaining and being part of something bigger than ourselves.
Significantly, even after we die, the life of society will live on.
Thus, living only for ourselves is really the path to having no purpose in life.
As the old saying often attributed to Shakespeare puts it, "What e'er thou art, act well thy part."
The person making the Theory/proof fallacy typically assumes the colloquial definitions.
He or she may have heard that a theory is something that has not been proven.
Technically, using the scientific definitions this is correct because theories are not mathematical assertions, so there are no accepted axioms suitable for forming the basis of proofs on such matters.
But the Big Bang Theory and the Theory of Evolution have been thoroughly validated with an overwhelming abundance of empirical evidence.
What makes them "theories" in the scientific sense is not lingering doubt, but that each is a well-substantiated explanatory framework for a large body of observations.
However, theories are still only models for reality, not reality itself.
Neither the Big Bang Theory nor the Theory of Evolution is yet perfectly understood, and both will continue to be refined with time.
Hence many theists will springboard off of this detail to make the Science changes fallacy.
Timeline of history
Evidence suggests that a rapid expansion of space began about 13.8 billion years ago, flinging matter apart in all directions.
This "Big Bang" was probably not the beginning of everything, but it is where all evidence we have yet found begins.
As gravity pulled matter together, swirls and eddies formed within the currents of expanding matter.
Within these swirls, smaller and faster-spinning swirls formed.
One of these contains the observable Universe--everything we can hope to ever observe.
Within this, smaller swirls of matter pulled together into galaxies, each typically centered on a supermassive black hole.
Within galaxies, yet smaller and faster-spinning clumps of matter agglomerated into stars surrounded by protoplanetary disks.
In time, these protoplanetary disks clumped into planets, surrounded by their own rings of matter that soon accreted into moons and satellites.
A simple pattern can be observed: behavior at small and large scales often exhibits similarity.
At smaller scales, things happen more quickly, so we can examine the outcome at smaller scales for a hint about where things might be going at the larger scale.
Our solar system coalesced about 4.56 billion years ago, and planet Earth a short 50 million years later.
Initially, the Earth was a hot and inhospitable planet, but its surface cooled and formed a crust.
This crust buffered its surface from the heat beneath, and allowed oceans of liquid water to pool on its surface.
Carbon dioxide, water vapor, nitrogen, and other chemicals degassed from volcanoes, forming oceans and an early atmosphere.
No one knows for sure exactly how life first formed on Earth (See abiogenesis), but we do know that when certain chemicals come together, reactions occur.
What we now call "life" may have originally been little more than a continuous chemical reaction.
(Instead of animals, think of a slowly burning fire or the foam that emerges when baking soda meets something acidic.)
Like life, chemical reactions consume fuel and produce some byproduct.
Most reactions die when the fuel runs out.
But if the byproduct of some reaction were a bubble, and that bubble contained all the chemicals necessary to do it again, then it would be a primitive cell.
It could wait for more food to become available, then would generate more cells.
That is all it would take to set the wheels of evolution in motion.
Single-celled organisms propagated, and diversity began to occur within their numbers.
The earliest evidence of life on Earth dates to approximately 3.5 billion years ago.
About 3 billion years ago, cyanobacteria began performing an important chemical reaction called photosynthesis.
This reaction consumed carbon dioxide and emitted oxygen as a waste product.
Over the subsequent billion years, cyanobacteria terraformed our planet's atmosphere.
The introduction of oxygen into the atmosphere opened the way for many new forms of life to evolve.
While some forms of life sat around waiting for food to come to them, others evolved to be more proactive.
Some of these even invaded other cells and stole the stored energy that they had prepared for fuel.
Being a parasite was not good for the host, but it was certainly good for the parasite, so the behavior became prevalent.
Then, approximately 2 billion years ago, something very significant happened:
A parasitic cell--perhaps while engaged in the very act of committing cellular burglary--changed.
It switched from being parasitic to being symbiotic.
Cells that were once energy thieves became mitochondria, the energy factories of the cell.
The other organelles in eukaryotic cells may have evolved in a similar manner, as they found that working together was more effective than competing individually.
In other words, that which is altruism at the level of an organelle is self-interestedness at the level of the Eukaryotic cell.
Because of this primitive altruism, eukaryotes were generally more effective than simpler forms of life.
Approximately 1 billion years ago, eukaryotes found that it was better to be altruistic at the cellular level as well, and they began to band together as multicellular life.
Instead of pursuing their own immediate interests, cells began to pursue what was best for the group as a whole.
That which is altruism at the level of the cell is self-interestedness at the level of the multi-cellular organism.
Following similar patterns at a higher level, specialization eventually began to occur within multicellular life forms.
Organs formed to facilitate digestive systems, circulatory systems, and nervous systems.
Simple animals emerged in the oceans about 600 million years ago.
By 500 million years ago, there were fish.
In the following 100 million years, both plants and animals emerged from the oceans and spread over the land.
It wasn't until 300 million years ago that reptiles began to roam the Earth.
We often think of dinosaurs as being ancient, but relative to the life of the Universe, this was the very recent past, spanning only 2% of its history.
How did we manage to get from such primitive creatures to modern society in such a relatively short period of time?
The answer is that we stopped serving only ourselves.
When altruism finally evolved at the next layer, that layer began to make rapid progress.
That which is altruism at the level of organs is self-interestedness for the life of the animal.
Reptilian brains are well-known for seeking the well-being of the individual animal.
For example, it is believed that dinosaurs laid their eggs, then abandoned their young to fend for themselves while the parents went off to serve themselves.
About 200 million years ago, mammals emerged with a better way of thinking.
Mammals have an additional layer in their brains, sometimes called the paleomammalian complex, or limbic system.
Unlike the reptilian complex, which seeks only the well-being of the individual, this layer gives animals a desire to seek the well-being of their social groups.
Once again, we see the next level of altruism.
It may be illustrated by a pack of wolves.
Individual members may sacrifice themselves for the good of the pack.
And that which is altruism at the level of a wolf is self-interestedness at the level of the pack.
66 million years ago, the Cretaceous-Paleogene extinction event occurred.
A large meteor struck the Yucatan peninsula near the present-day town of Chicxulub in southern Mexico, triggering a lingering impact winter.
This made life more difficult for everyone, but it had a greater impact on the reptiles who left their young to fend for themselves.
The more altruistic mammals survived to a much greater extent.
The evolutionary advantage was even stronger for primates, who appear to have diverged from other mammals somewhat before this event.
Primates have a third layer in their brains, the neocortex, that gives them the ability to reason.
About 1.8 million years ago, this advantage started to dramatically sway the evolutionary direction of primates, and their brains grew significantly in size.
Anatomically modern humans have only lived on this Earth for 200 thousand years--a little more than one tenth of one percent of one percent (not a typo) of the history of the Universe.
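The fractions quoted in this timeline are easy to sanity-check using the round-number ages given above (13.8 billion years for the Universe, roughly 300 million years of reptiles, and 200 thousand years of modern humans):

```python
# Round-number ages taken from this section, in years.
UNIVERSE_AGE = 13.8e9  # years since the Big Bang
REPTILE_ERA = 300e6    # years since reptiles began to roam the Earth
HUMAN_ERA = 200e3      # years of anatomically modern humans

# Each era as a fraction of the Universe's history.
print(f"{REPTILE_ERA / UNIVERSE_AGE:.1%}")  # ~2.2%
print(f"{HUMAN_ERA / UNIVERSE_AGE:.5%}")    # ~0.00145%
```

The human fraction, about 0.00145%, is indeed a little more than one tenth of one percent of one percent (0.001%), and the reptile era works out to roughly the 2% figure quoted earlier.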
Yet, as recent as humans may be, our major accomplishments tend to be even more recent.
To really see these we need to zoom in again...
...and again.
...and again.
In the mere 700 years since the Renaissance began, there has been a profound explosion of advancement.
We have transformed the world from one of predominantly agrarian societies and medieval caste systems to one of pervasive technologies and modern luxuries.
What is the explanation for this surge in development?
Certainly, we could not have done it without the large neocortices in our brains.
But, humans didn't just become rational in the last 700 years.
What changed is how we used our rational minds.
For ages, people worked to benefit themselves.
But sometime around the Renaissance, a tipping point was reached where enough people started to take up hobbies.
Such people began to find happiness not just in benefiting themselves, but in making an impact on society.
For ages, monarchs planned and conspired about how to secure power for themselves and their family lines.
In the late 1700s, the founding fathers of the United States carefully composed a constitution designed to decentralize power for the benefit of society.
For ages, technological innovations were wielded almost exclusively for crushing anyone who resisted authority.
But starting around the Renaissance, they began to be exchanged for the benefit of all.
A global economy began forming, and national borders became less a matter of individual identity and more something for politicians to fight over.
The biggest surges in evolutionary advancement throughout history have occurred when individuals at each level stopped focusing on their own well-being and started working for the benefit of the whole.
This one is no different.
Records such as the Bible show that humans have been searching for morality at least since the invention of writing.
The progress of modern society suggests that now we are starting to find it.
And it clearly has something to do with individuals working to help advance all of society.
Now, as society itself begins to come to life, it is becoming increasingly clear that the very purpose of life lies not in pursuing our own personal interests, but in seeking a form of altruism, and therein lie the meaning, fulfillment, and peace that people seek in their lives.
Trilemma
A trilemma is a difficult choice with three options.
Two famous trilemmas related to the debate about God include:
Epicurus' trilemma, an argument against God by the Greek philosopher Epicurus, and
the Apologetic trilemma, an argument for God by C.S. Lewis.
Epicurus' trilemma states:
- If God is unable to prevent evil, then he is not all-powerful.
- If God is not willing to prevent evil, then he is not all-good.
- If God is both willing and able to prevent evil, then why does evil exist?
The weak part of Epicurus' trilemma is its second point, which may be a false dichotomy.
Since "will" can be complex, other reasons for being unwilling to (at least temporarily) prevent evil could potentially exist.
Lewis' Apologetic trilemma states that Jesus must be a lunatic, a liar, or the Lord:
- Lunatic, meaning he falsely believed he was God.
- Liar, meaning he knowingly falsely claimed to be God.
- or Lord, meaning he was what he claimed to be.
The Apologetic trilemma also fails to cover a complete set of mutually exclusive possibilities.
(Some other possibilities include that Jesus never existed, or that he was misrepresented.)
However, the weakest part of the Apologetic trilemma is that there is little basis for determining which possibility is true.
Lewis argued that Jesus' teachings were too intelligent to have come from a lunatic, and too moral to have come from a liar.
He concludes, therefore, that Jesus must have been the Lord.
However, these arguments are easily challenged.
There are intelligent teachings to be found in many contradicting religions, but few people would resort to calling all of their founders "lunatics" for having some incorrect beliefs.
The words "lunatic" and "liar" are overly strong descriptions that may have been selected for their alliterative properties, but actually serve to shame the skeptic out of fully considering either of the first two options.
Together the first two options cover a far more plausible space of possibilities than the third option when all of the implications are considered.
If Jesus were Lord then a large body of supernatural phenomena must also be accepted.
But if Jesus had mistaken beliefs and/or engaged in some deliberate deceit, then one only needs to accept some fairly plausible social circumstances.
Witness-based arguments
Witness-based arguments rest the veracity of God on human witnesses.
They usually take one of the following forms:
- There are more witnesses for the resurrection of Christ than for [some well-accepted event].
- Witnesses suffered and sacrificed even their own lives to testify of Christ.
- The witnesses had no reason to be dishonest. Why would they lie?
- Witnesses are accepted in courts of law, so they are sufficient to prove something is true.
Unfortunately, all of these arguments commit logical fallacies:
#1 is an argumentum ad populum.
But popular consensus is a very poor mechanism for establishing truth.
Moreover, it is not clear to what extent witnesses in the time of Christ were influenced by each other.
Very few ancient witnesses actually left any written accounts, so we are relying on a few witnesses to have correctly counted the total number of other witnesses, and also to have correctly assessed the level of conviction of those other witnesses.
Even the gospels of Matthew and Luke appear to have derived some of their accounts from Mark, which suggests that what currently appears to be several witnesses may actually just be derivations from a common source.
And witnesses on matters of religion tend to be notoriously unreliable.
#2 is trivially turned against itself by noting that people have suffered and died for other religions as well.
Christianity may very well have been the best thing that ancient witnesses had ever encountered, and it may have very well been something that they considered to be worth dying for.
However that does not mean that it withstands modern science, or that it would necessarily carry the same weight in modern times.
If anything, the combination of barbarous treatment toward Christians and the fervor with which they withstood it casts doubt on the degree to which witnesses from that time would have offered objective unexaggerated accounts of the events about which they testified.
#3 is an attempt to shift the burden of proof onto the skeptic.
Just because one cannot imagine a valid reason for someone to bear false witness, does not mean there is none.
The social circumstances of ancient witnesses are not well-known, and significant rigor is certainly warranted before accepting a testimony about a supernatural event that falls far outside the scope of anything ever before observed.
Moreover, retellings of anecdotal experiences are known to be unreliable, especially when the story has passed through multiple intermediaries.
#4 is a bad analogy because courts of law exist for establishing guilt, not truth.
Science generally does not consider witnesses to be very reliable, and numerous studies have corroborated that human witnesses are heavily influenced by their own cognitive biases, and that physical evidence is much more reliable for establishing the veracity of events.