Ethical Realism

July 30, 2010

10 Myths About Beliefs

Filed under: epistemology,philosophy — JW Gray @ 10:00 am

There are many myths and misunderstandings that prevent clear thinking, good debate, and proper argumentation. I will discuss ten myths about beliefs, but first I will describe knowledge.

What is Knowledge?

Knowledge is defined by many philosophers as justified true belief. For example, my belief that I exist is a justified true belief. Some beliefs are justified, such as our belief that Einstein’s theory of physics is true. Some beliefs are true, such as our belief that gravity existed in 2009. However, some beliefs might be true without any justification, such as my belief that I will live for the next 10 years. It could be true, but I can’t really give a good reason for it. And some beliefs are true, but people hold them for the wrong reasons. For example, some people might believe that Einstein’s theory of physics is true just because their parents believe it (rather than because the most respectable scientists agree with it).

Knowledge requires justified beliefs, and justified beliefs often require arguments—reasons to accept a conclusion. Good reasons for a conclusion are good justifications.

True beliefs are beliefs in statements that refer to facts, and facts are things in the world. If it is true that I exist, then I am something in the world that the statement “I exist” refers to. No belief is required for me to exist. Facts do not require true statements to exist because true statements are merely statements that correspond to facts.

Myths About Beliefs

What are the myths about beliefs that I will discuss?

  1. All opinions are equal.
  2. Challenging a belief is an insult.
  3. Something is true because I feel certain about it.
  4. A statement is an argument.
  5. Disagreement is an argument.
  6. Controversial beliefs can’t be justified.
  7. An objection to a belief proves it’s wrong.
  8. Knowledge requires an explicit justification.
  9. Justified beliefs have to be certain.
  10. All beliefs are rationally acceptable.

All opinions are equal.

Not all opinions are equal. It is clear that not all opinions are equal, considering that we know some beliefs are true and others are false. First, mathematical truths, such as 1+1=2, are not up for debate. Second, it is pretty clear that some opinions are better than others now that science is so successful. How to make a television set is not a matter of belief; it is a matter of reality. Jumping off a skyscraper will kill you no matter how safe you might believe it to be. And so on.

It is also clear that not all opinions are equal because such a belief is self-defeating. Imagine for a moment that all opinions are equal. In that case, the opinion that “not all opinions are equal” is just as good as the opinion that “all opinions are equal.”

Two things make some opinions better than others: (a) truth and (b) justification. Some opinions are true and others are false. Some opinions are justified and others are unjustified. The best beliefs are the most justified beliefs. It is better to believe something false for good reasons than something true for bad reasons. Why? Good reasons are reliable and bad reasons aren’t. The odds of ending up with a true belief are higher with good reasons than with poor ones.

False justified beliefs.

Scientists at one point felt absolutely certain that Newton’s theory of physics was true, and they had no reason to doubt it. However, we later found out that it had some flaws, and Einstein’s theory of physics proved that Newton’s theory was false. At the same time, those scientists had justified beliefs about physics that were clearly superior to previous theories of physics.

True unjustified beliefs.

Some people believe that racism is wrong just because their parents think racism is wrong. However, getting all of our values from our parents is not a reliable way to get our moral values. Some parents are racists, but that isn’t a good reason to be a racist. Additionally, your parents might be correct that racism is wrong but have other false moral beliefs that you might accept without a second thought.

Challenging a belief is an insult.

It is not insulting to challenge someone’s belief. If you have a false belief, then you should welcome others to let you know about it. There’s nothing wrong with trying to correct people’s beliefs. Of course, it is most polite to actually tell someone why you think they are wrong. To demand agreement without argument can be disrespectful and pushy.

To be afraid to challenge someone’s beliefs can be disrespectful because you will be assuming that the person is too irrational and “can’t take criticism.” If your assumption is correct, then the person really is irrational (at least in one respect).

It is also disrespectful to punish or blame people merely for challenging your opinions. You are basically telling that person that he or she can’t possibly have a better reason for his or her belief than you have for holding yours.

Something is true because I feel certain about it.

No matter how much you feel like you are right, you might not be. Many people say that they are “certain” that something is true, but “certainty” means that a belief is infallible—that it couldn’t possibly be false. To feel certainty is not to actually have certainty.

We are, in fact, not certain about very much in life. We might be certain about mathematical and logical truths, but not so much about the world or ourselves. Most of our best knowledge is based on the assumption that the future will be like the past, but we can’t know that for certain. The sun might not rise tomorrow, gravity might not exist tomorrow, etc.

A statement is an argument.

A statement or assertion is not an argument. For example, “fire is hot” is not an argument because an argument requires a justification or reason to believe something. There is no reason given to accept it. However, a series of statements can be an argument. For example:

  1. Fire has high kinetic energy.
  2. All objects or areas of high kinetic energy feel hot.
  3. Therefore, fire is hot.

Disagreement is an argument.

No disagreement is required for an argument. The above argument concerning fire does not require any disagreement. It is a reason to believe that fire is hot despite the fact that people already agree that fire is hot. We can use arguments to try to give reasons for beliefs that no one has ever seriously questioned.

Controversial beliefs can’t be justified.

Just about everything in philosophy is controversial, but it is possible for two incompatible beliefs to both be justified. Philosophers have spent years studying and defending various theories, such as utilitarianism, and one philosopher can have a justified belief that utilitarianism is false even though another philosopher has a justified belief that utilitarianism is true. In a similar way, one philosopher can have a justified belief that cocaine should be legal and another can have a justified belief that it should be illegal. Such beliefs are not sufficiently justified to prove that they are true, but they are sufficiently justified for a person to rationally hold them.

Some actions are morally acceptable, such as walking on your hands, and some beliefs are rationally acceptable, such as the belief that cocaine should be legal.

An objection to a belief proves it’s wrong.

First, an objection might not be a good objection to a belief. For example, the fact that cocaine is unhealthy and harmful doesn’t in itself prove that cocaine should be illegal. (Those facts could be offered as objections to its legality, but they don’t settle the issue.)

Second, good objections do not prove a belief is wrong. A good objection is one that seems to undermine a belief. There are serious problems with many scientific theories, but scientists call them “anomalies.” These are issues that the scientist hopes will be “explained away” in the future rather than proof that the theory is false.

In order to know if a belief is undermined by an objection, we need to decide how much reason we have that supports the belief. If I drop an object and it doesn’t fall, that wouldn’t disprove the existence of gravity because we have so much reason to believe in gravity. However, an objection against the belief in ghosts could be very worrisome because we have so little reason to believe in ghosts to begin with.

Knowledge requires an explicit justification.

Explicit justification is basically an argument. We don’t need to be able to give an argument to have knowledge. I know that 1+1=2 but I don’t have an argument that proves it. Some of our knowledge is difficult to justify, but that doesn’t mean we don’t have knowledge. Some people know that killing people for no reason is wrong despite not knowing how to explain why it’s wrong.

Justified beliefs have to be certain.

Certainty requires infallibility—the impossibility of being wrong. However, justified beliefs can be false—such as the belief scientists had in Newton’s theory of gravity. Some beliefs are rationally acceptable and others are rationally required. It can be rationally acceptable to believe that cocaine should be legal, but we might have our doubts and admit a great deal of uncertainty. However, it seems rationally required to believe that Einstein’s theory of physics is very accurate. There is a lot of evidence to back it up, and it doesn’t make sense to disbelieve in it. At the same time Einstein’s theory might be false—something can be very accurate and be false nonetheless. (Newton’s theory of physics is very accurate but false.)

A lack of certainty and knowledge doesn’t prove that facts don’t exist. I exist even if I believe that I don’t exist. A lack of knowledge of my existence has no effect on my actual existence.

All beliefs are rationally acceptable.

Not all beliefs are rationally acceptable. Someone might try to replace the politically correct belief that “all opinions are equal” with a new one: “all beliefs are rationally acceptable.” However, it’s not rationally acceptable for me to believe that I don’t exist; no one would be here to have such a belief if it were true. It is also not acceptable to deny that 1+1=2, or to think that I will survive jumping off a skyscraper. And so on.


There are many strange beliefs people seem to have regarding beliefs, argumentation, evidence, justification, truth, facts, and knowledge in general. The fact is that there are justified true beliefs, but most of our best beliefs are merely “highly justified.” Knowing that most of our beliefs are uncertain merely requires us to admit that our beliefs could be false. That doesn’t mean we should seriously doubt such beliefs unless we are given a good reason for doubting them. For example, I believe that killing people willy-nilly is wrong and I don’t think that belief will ever change, but that belief lacks a level of certainty that other beliefs have, such as the belief that I exist or that 1+1=2. I have no doubt that killing people willy-nilly is wrong, but that doesn’t mean that I am certain in the strict sense of the word. (I suppose it is possible to find out that I am highly deluded about many things if there is an evil demon deceiving me, etc.)



  1. After reading some of your articles, I found this one to be very creative. I have a weblog as well and would like to repost several snippets of your articles on my own blog. Would it be all right if I do, as long as I reference your weblog or build a backlink to the post I took the snippet from? Otherwise, I understand and would not do it without your approval. I have bookmarked the post to my Twitter and Zynga accounts as well for reference. Anyway, thanks either way!

    Comment by Jerica Ballantine — August 2, 2010 @ 1:04 am | Reply

    • Thank you for the compliment.

      You can quote material from my website just like any other site, but I don’t want an entry to be quoted in its entirety. In that case you might as well just link to the original material.

      Comment by James Gray — August 2, 2010 @ 4:08 am | Reply

  2. Nice job! I discovered your blog a week ago while randomly looking for some stuff about ethical realism, and I’ve now read several of your posts. I truly like your tone, which is that of a competent pedagogue (don’t worry, that’s a compliment; there are not that many of them around!).

    However, a very important point about beliefs that you don’t mention in your post is that they are not subject to the will. Disagreements (even disagreements with oneself) are sometimes very painful insofar as giving up a belief, even in the presence of some very convincing arguments, can be difficult.

    I do not mention this point in order to be exhaustive but rather because I know of very little literature about it (though this is not the case for involuntariness per se, which has been masterfully analyzed by Bernard Williams, among others). This seems to me a very puzzling and interesting point. It’s nothing like bad faith or the like, but more a kind of epistemic akrasia, though surely done for some practical reason, and it puts some pressure on the rationality requirement. My guess would be that the beliefs we hold bear a close relationship to the image we have of ourselves (or to a certain form of life, to put it in a Wittgensteinian way) and that giving up some of them can have important effects we may sometimes not be ready to face. Anyway…

    Greetings from Switzerland,


    Comment by Patrik — August 8, 2010 @ 4:00 pm | Reply

    • Are you suggesting that it is a myth that “we choose what we believe?” I think it might be possible to choose what I believe sometimes, but not always. I suppose it is a myth that we “always choose what we believe.” This is a relevant point to make when discussing the ethics of atheism, and I have talked about it there to a minimal extent.

      Comment by James Gray — August 9, 2010 @ 4:37 am | Reply

  3. Yes, I am not only suggesting that it is a myth that we choose what we believe but also strongly asserting it. However, we have to make an important distinction between two kinds of doxastic voluntarism: (i) direct voluntarism and (ii) indirect voluntarism. The form I deny (in accordance with most of the profession; the only author I know of who still defends a form of voluntarism is Carl Ginet) is direct voluntarism.

    To deny (i) amounts to claiming that it is wrong that one can choose what one believes by deciding to do so, the way one can, in normal cases, choose to raise one’s hand, for instance. The parallel with action is not fortuitous here, since colloquial expressions such as ‘I made up my mind’ seem to make belief a form of action. However, this is false, and it can be explained by seeing that belief is conceptually linked with truth, i.e., to believe that p is to believe that p is true, and settling the matter of whether p is true cannot be subject to the will.

    We have, however, to distinguish (i) from (ii), which claims that it is possible to indirectly form some beliefs at will. By indirectly I mean that we may sometimes have the possibility to generate at will the kind of evidence we need to support a belief. An example could be someone who opens a door in order to form the belief that the door is open.

    All this is pretty standard since Williams’ canonical 1973 paper ‘Deciding to Believe’ and has since been continuously refined (recently in interesting ways by Nishi Shah and David Velleman). This is, however, not what my comment was directly aiming at. I was rather presupposing this distinction in order to claim that the rationality requirement you sketch at the end of the post might be a rationalist illusion. Indeed, the phenomenon I called ‘epistemic akrasia’, that is, the incapacity to give up a belief in the presence of conclusive evidence against it, shows that conclusive evidence might not be necessary to form a belief and that beliefs are also governed by some other, less epistemic norms. This is why I mentioned Wittgenstein: I have the intuition (but no deep argument) that some beliefs are entrenched within a certain ‘form of life’ and are therefore more difficult to give up.

    In sum, it is not one myth but two that I want to point out: (A) the direct voluntariness of belief and (B) the idea that beliefs are strictly governed by epistemic norms. The latter point does not amount to claiming that beliefs are not to be epistemically settled or that rationality does not play any role in forming them, but rather that we might go for a milder form of rationalism in some cases.



    Comment by Patrik — August 9, 2010 @ 5:56 am | Reply

    • Thank you for the thoughtful replies.

      I haven’t studied very much of the philosophical literature involving the potential voluntary aspect of beliefs. I agree that personal bias and emotions do play a role, but I think that finding a belief to be irrational (or lack of it to be irrational) is probably the most powerful of all (involuntary) forces. I don’t think I can voluntarily believe A and not-A simultaneously.

      I don’t want to suggest that rationality is the only force for beliefs. I want to discuss which beliefs are rational, justified, probably true, etc. precisely because I don’t think rationality is guiding people’s beliefs enough.

      At the same time I think we can do certain things to change our beliefs. We can read arguments, look into scientific evidence, try to question our current beliefs, be willing to admit that our beliefs are fallible, and so on. People who do these sorts of things properly are more likely to have intellectual virtues. They can be philosophical, open minded, and skeptical in an appropriate way.

      People who ignore arguments, try not to question their beliefs, and disregard scientific research are likely to end up being reckless and embodying intellectual vices. They take a greater risk of being dogmatic or fanatical.

      I might talk as if we should voluntarily believe whatever beliefs are sufficiently justified. Epistemology is generally taken to be normative. “You ought to believe what is rational, and ought not believe what is irrational.” I think this talk makes sense from a common sense standpoint, but it might be misleading in some way. What would be a more appropriate way to put such an idea?

      Comment by James Gray — August 9, 2010 @ 7:20 am | Reply

  4. I surely agree with your answer. Belief formation is normatively governed and the adequate norm is probably something like: ‘If the evidence for forming the belief that p is conclusive, then you ought to form the belief that p’ or ‘If the evidence for giving up the belief that p is conclusive, then you ought to give up the belief that p’. This is all fine for me and I follow you when you say that rationality is not guiding people’s beliefs enough (I have the same frightening fanatics at home).

    However, this concerns the normative part of belief, and I fear that limiting ourselves to asserting that belief is normatively governed and that people who do not respect the norm of belief fall into irrationality is just another form of (rationalistic) fanaticism. Indeed, it may be the case that our psychology does not fit our normative requirements and that we sometimes fall short of forming the adequate beliefs.

    This said, I think you are right when you claim that arguments, evidence, and rational norms are the safest way to form beliefs. The matter is more one of trying to understand what is going on in someone’s head when you have shown her, with a respectable amount of evidence, that she should not believe what she believes, but she still continues to believe it.

    One might answer to this, and I guess this would be a rejoinder you may agree with, that this amounts to a lack of intellectual virtues: someone who refuses to give up some unwarranted beliefs is not intellectually virtuous enough. I surely like that rejoinder. I surely like the rather Spinozist idea that intellectual matters should be conducted in a totally impersonal way. Sincerity seems to me to be a decisive virtue here, together with truthfulness.

    However, the matter is that, as far as I know, ought implies can in any modal logic. So when you say that ‘we should voluntarily believe whatever beliefs are sufficiently justified’, this is where I might disagree. So where do we end up? Indeed, if I endorse this last claim, I surely contradict what I claimed in the first paragraph…

    Well, I have a theory about that, but the story is complicated and some of its parts are not exactly standard, to put it at its mildest. I am inclined to think that we should split our doxastic practices into two sub-classes: acceptances and genuine beliefs. Acceptances are formed for epistemically governed normative reasons and are surely not voluntaristic in a direct sense. One can, however, read, talk, examine evidence, and the like in order to form an acceptance, and this amounts to an indirect motivation. In this sense, acceptances are in fact almost like beliefs but lack a phenomenal character that is proper to belief (something like a qualitative character of confidence or the like). Belief, on the other hand, is, like acceptance, governed by truth but might not fit the ‘ought implies can’ requirement. Another difference is that acceptance maybe comes in degrees, which is surely not the case for belief.

    Anyway, this is a complicated matter and my theory is surely insufficient, but I think it is motivated on a metaphilosophical level. Indeed, philosophy IS a normative matter. Philosophy ought to provide us with norms for judgments. However, philosophy ought also sometimes to be descriptive and then try to go beyond its normative task to explain why we fail to meet these normative requirements.

    Comment by Patrik — August 9, 2010 @ 8:20 am | Reply

    • I pretty much agree with what you are saying here, and I have four responses.

      One, normativity sometimes refers to qualities (virtues) rather than actions. We have already touched upon this idea, but I will spell it out a little more to show its relevance to “ought implies can.” Virtuous beliefs do not necessarily indicate voluntary actions (in the sense that a belief is a voluntary action). We could rate a belief as having evaluative properties (good, justified, rational) meaning something like “reliable, cautious, likely to be true or accurate” rather than “you are committing a sin when you choose to have a bad belief.”

      The word “ought” seems to be based on the interest we have in actions, but an action can also be evaluated on the grounds that it promotes values rather than by a mere prescription. We get caught up with the idea of rightness and wrongness involving actions, but normativity might be best modeled on “values.” Morality can be based on intrinsic values. Wrong actions, reckless beliefs, and other negative character traits are more likely to lead to intrinsic harm. Epistemology can also evaluate actions, beliefs, and character traits in terms of promoting or being based on truth, honesty, accuracy, etc.

      The point here is that not all evaluations have to rely on the “ought implies can” issue. The idea that an action is a sin or is “wrong” does imply can, but we can evaluate things in terms of values and virtues as well.

      Two, I alluded to the idea that we can control beliefs indirectly. That might solve the “ought implies can” problem. There is a sense that we can control our beliefs in an indirect way. We “ought to have justified beliefs” in whatever sense we actually do have control over it, meaning we can choose to act cautiously, question ourselves, look at arguments, etc.

      Three, I’m not sure that “acceptances” are different from how people use the word “belief.” Beliefs are very complicated, and what you call “acceptances” sounds like speculation about a sort of “quasi-belief.” Some beliefs are more of an unconscious, automatic response that might be like an “assumption.” It is pretty clear that we have little to no direct control over our unconscious responses of this sort.

      I take it that “quasi-beliefs” are conscious thoughts, and it is these beliefs that we use when making arguments or taking surveys asking us what we believe. We might then be able to have direct control over our quasi-beliefs. These are the sorts of beliefs we can control and act on with little trouble. (I actually don’t think “acceptances” are under unlimited voluntary control, but it at least seems likely that we can control our “acceptances” at least sometimes.)

      This idea of quasi-belief seems a little like playing with words. When people talk about “voluntary belief” they might actually be talking about quasi-beliefs or “assented propositions,” which could be voluntary. This is one of the reasons why I said that we might be able to choose some beliefs but not others.

      If we wanted to rate a belief as “wrong,” then that might imply that “ought implies can” applies, but we can sometimes have the right sort of belief. If we found that a person had no control over the belief, then it wouldn’t be “wrong.”

      Four, I’m not sure about how oppressively rationalistic what I am saying might be. I think some beliefs might require little to no justification to be “rationally permissible.” I don’t want to say that every belief requires a degree in philosophy to be “rational.” However, some beliefs might (require one to be a philosopher or behave like one to attain the necessary expertise), and some beliefs might be more potentially dangerous than others and therefore warrant more caution.

      Comment by James Gray — August 9, 2010 @ 11:09 am | Reply

  5. Great post. I’ve been interested in how we know what we know (epistemology). What bothers me about Justified True Belief isn’t Gettier problems so much as the “True” requirement. As non-omniscient humans, we have no way of knowing if a belief is true (aside from analytic or tautological statements, which are essentially true by definition) by any means except through our justifications for our beliefs. Thus, JTB becomes simply Justified Belief, and I would prefer the term Sufficiently Justified Belief to emphasize that it generally needs to be strongly justified to hold a statement to be true. Details about how strong our justification needs to be to be considered “sufficient” are welcome, and I see that you have at least hinted at possibilities in this post.
    If we already knew, somehow, that a belief was true, then we wouldn’t have to worry about justifications for it. It just seems to me that the “truth” requirement in JTB leads to either near absolute uncertainty, or circular reasoning. SWB avoids these problems, although it raises others.

    Comment by Michael A. Clem — October 14, 2010 @ 7:03 pm | Reply

    • Yes, knowing that we know can be difficult or impossible, but that doesn’t mean that we don’t have any justified true beliefs. However, I agree that we generally just settle with “justified belief” or “sufficiently justified belief.”

      Beliefs that are “sufficiently justified” tend to be beliefs we take to be “rational.” If they aren’t sufficiently justified, then they are irrational. I don’t have a definition or formula that can tell us exactly what that means for each belief, but I understand a sufficiently justified belief as a belief that appropriately exhibits various “values” or “theoretical virtues.” Having any value can be taken as a “justification” nothing-else-considered, but a sufficiently justified belief is justified “all else considered.”

      The belief that the Sun is up is easily justified by observation. The belief that 1+1=2 is easily justified by thought alone. It would be irrational to think that the Sun isn’t up based on our observations without very strong defeaters (objections) to consider, and it might be impossible to rationally believe that 1+1=3.

      It just seems to me that the “truth” requirement in JTB leads to either near absolute uncertainty, or circular reasoning. SWB avoids these problems, although it raises others.

      Do you mean that SJB (sufficiently justified belief) seems sufficient? It might be, but many people understand “justification” in terms of its “likelihood of being true.” We might not know that any belief really is true, but to “sufficiently justify our belief” is to show why it makes sense to think it really is true.

      I think most philosophers worry less now about knowledge in terms of truth and worry more about justification. The “coherence theory of truth” and “pragmatic theory of truth” seem obviously wrong — but the coherence theory of justification and pragmatic theory of justification do make a good deal of sense. I think philosophers tend to realize that now.

      Comment by James Gray — October 14, 2010 @ 9:02 pm | Reply

  6. Hi, I came across this while I was trying to brush up on the topic, I found it interesting to read and well written. Thorough but not lengthy. One quick note though there are some typos in ‘Challenging a belief is an insult.’ where you say “If your assumption is correct, ‘hen’ the..” 🙂
    and in ‘Controversial beliefs can’t be justified’ where you say “In a similar way it could be that one philosopher can have a justified ‘belie’ that cocain…”

    Do you have anything on if beliefs are choices?

    Comment by Troy Holmes — July 26, 2014 @ 4:23 am | Reply

    • Troy Holmes,

      Thanks for finding the typos. I corrected them.

      There might be myths about if beliefs are choices. For the most part they don’t seem to be choices, but there are things we can do to try to correct our unjustified beliefs. Thinking about the reasons for beliefs (or counter evidence against them) can help. Do you have any thoughts about the issue?

      Comment by JW Gray — July 26, 2014 @ 7:44 am | Reply
