Theory of Knowledge, Spring 2004
Does Knowledge Require Certainty?
Assoc. Prof. James Pryor
This first premise will come up many times in our discussion. Let's give it a name: the Certainty Principle. It says that knowledge requires certainty--that you know that p only if you are certain that p.
"Certainty" can mean different things. To say that you're certain that p might mean that you're especially confident, that you have no lingering doubts about P running through your mind. Call this the psychological sense of "certainty." Alternatively, to say that you're certain that p might mean that you have really good evidence for p, evidence which is so good that there's no chance of your being wrong. It's not possible to believe that p on the basis of that kind of evidence and be mistaken. Call this the evidential sense of "certainty."
Unger intends to be using the psychological sense of "certainty" in his argument. He does this just to keep the argument simpler. He says that the evidential sense of "certainty" is a normative notion--it has to do with how good your evidence is, and so with how confident you should be that P, not with how confident you actually are. Talk about shoulds is controversial, and Unger wants to keep his discussion as straightforward as he can. He does, though, think both that we are, in fact, certain of hardly anything, and that we should be certain of hardly anything.
Why does Unger think that we are, and should be, certain of hardly anything?
The answer is that he thinks that "being certain" is an absolute term, like "empty" and "flat." He thinks that emptiness requires a thing to have nothing in it whatsoever--however small. And he thinks that, in order to be flat, a thing must have no bumps or curves whatsoever--however small. If "certain" were an absolute term, too, then being certain would require having no doubts whatsoever.
Unger argues that if "flat" is an absolute term, then:
Necessarily, if x is flatter (or nearer to being flat) than y, then that must mean that x has fewer bumps or curves than y, so y must have some bumps or curves; so strictly speaking, y is not really flat.

Similarly, if "being certain" is an absolute term, then:
Necessarily, if you are (or should be) more certain of p than you are of q, then that must mean that you have (or should have) fewer doubts about p than about q, so you must have (or should have) some doubts about q; so strictly speaking, you're not really certain of q.

Unger is of course willing to allow that y might be close enough to being flat for all practical purposes. Likewise, you might be close enough to certain of q for all practical purposes. But there's a big difference between what's strictly speaking true and what it's acceptable to say, or what's near enough to the truth for practical purposes. Here we're just concerned with what's strictly speaking true.
Unger thinks that for most propositions q, the proposition that you exist is, and should be, more certain for you than q. Hence, if he's right that "being certain" is an absolute term, then--since there is something which is more certain for you than q--it follows that, strictly speaking, you're not certain of q. And if knowledge requires absolute certainty, then you can't know that q.
Do you think it's true that knowledge requires certainty? What if you believe that P, but you have some doubts running through your mind--doubts you recognize to be irrational and baseless? Would that prevent you from knowing P?
A related notion is the notion of defeasibility: the evidence you have for believing that p is defeasible just in case it can be overturned or defeated as more evidence comes in. An example of indefeasible evidence would be a mathematical proof. Most other kinds of evidence are defeasible. For example, we have plenty of evidence that Mars is not made of coffee. But one can imagine a sequence of discoveries that would turn the tables, and make it reasonable to think that perhaps Mars is made of coffee, after all. I'm not saying we're going to get that evidence. It's extremely unlikely that that will happen. But it's still possible. So our evidence that Mars is not made of coffee is defeasible. It could be defeated or overturned by more evidence.
As we've seen, some people think that knowledge requires absolute certainty. These people will say that you can never know that p if your evidence for p is less than fully certain. If there's any chance that your evidence might later be defeated, then it won't be good enough to give you knowledge that p.
The Lottery Argument seems to confirm this claim that defeasible evidence can never be good enough for knowledge. It seems to show that no matter how good your evidence is, so long as it leaves open some possibility of your being wrong, you won't know. You may be very highly justified in believing that your ticket will lose, but you don't know it.
Other people think that it is sometimes possible to know things on the basis of defeasible evidence. These people are called fallibilists.
They think it can be enough if your evidence is pretty damn good, but not so good as to make you infallible. For instance, you have pretty good evidence that Mars is not made of coffee. You might be wrong. Your evidence is defeasible. But suppose you're not wrong. Your evidence is pretty damn good. The fallibilist will say that in this kind of situation you can count as knowing.
Let's get clear about one thing. Earlier we were discussing the claim that:

Knowledge is factive: that is, if you know that P, then P has to be true.
That claim, by itself, is not enough to settle our current dispute about the Certainty Principle. The claim that knowledge is factive does not entail that:
Knowledge has to be based on indefeasible, absolutely certain evidence.
The fallibilist agrees that knowledge is factive. On his view, you can know P on the basis of fallible evidence, but only if P is also true. If there are other people who believe things on the basis of the same kinds of fallible evidence as you, but their beliefs are false, then their beliefs won't count as knowledge on anybody's view.
This point often confuses students, so make sure that you've thought it through and understood it.
The fallibilist says: to know P, you need to have good evidence for P, and in addition, P has to be true. (The evidence by itself usually won't guarantee that P is true.)
The Certainty Principle, on the other hand, says: to know P, your evidence has to be maximally good. It has to be so good that no one could have that evidence without P's being true.
We'll talk more about this later.
Sometimes people think that the debate about skepticism is just a debate about whether or not the Certainty Principle is true. But it's not that simple. If you accept the Certainty Principle, then it does look like skepticism will follow, at least about a great many topics. Maybe there are some things that you're infallible about (e.g., whether 1+1=2, or whether you're thinking about monkeys). But not many. So the Certainty Principle does seem to support skepticism.
It's the reverse direction that's trickier.
Let's distinguish three kinds of epistemically desirable state:

1. being absolutely certain that P
2. knowing that P (perhaps on the basis of defeasible, less-than-certain evidence)
3. having a justified or reasonable belief that P
What would be really interesting--and really troubling--is if the skeptic had arguments that threatened our possession of these less-demanding states, too: arguments that don't just fuss about our not having absolutely certain evidence.
The best, most interesting kinds of skeptical argument are of that sort. Some of them threaten to show that we can't even have justified beliefs about the world outside our heads. If they're right, then it's no more reasonable to believe that you're sitting down right now than it is to believe you're a brain in a vat.
This shows that we shouldn't think that the debate about skepticism is just a debate about whether the Certainty Principle is true. Even if we concede that we can't be certain of much about the outside world, there remain weaker--but still epistemically desirable--positions for us to aspire to. Some of them might deserve the name "knowledge." On the other hand, even if we decided that certainty isn't a requirement for knowledge, we might not yet be in the clear. The most powerful skeptical arguments don't assume anything as strong as the Certainty Principle. They purport to raise difficulties about our possessing even the weaker epistemic positions.