Due on Fri ~~Nov 3~~ Nov 10.

Sample answers in purple. I have sometimes relied on `p(A ∧ B) = p(B ∧ A)` without explicitly stating and justifying this assumption by appeal to result (27b).

Many of these problems are adapted from Titelbaum.

In all of the following problems, you can rely on the three probability axioms:

- **Non-Negativity:** For any proposition `H`, `p(H) ≥ 0`.
- **Normality:** The probability of any tautology is `1`.
- **Finite Additivity** (none of the problems require the extra strength of **Countable Additivity**): For any mutually exclusive propositions `H` and `G`, `p(H ∨ G) = p(H) + p(G)`.

You can also rely on the **Ratio Formula** (moreover, you can rely on the relevant divisors always being > 0):

- For any propositions `H` and `G` where `p(G) > 0`: `p(H|G) = p(H ∧ G)/p(G)`.

Wesley Salmon observed that each of the following confirmational situations can arise:

- Two pieces of evidence each confirm some hypothesis, but their conjunction disconfirms it.
- Two pieces of evidence each confirm some hypothesis, but their disjunction disconfirms it.
- A piece of evidence confirms each of two hypotheses, but it disconfirms their conjunction.
- A piece of evidence confirms each of two hypotheses, but it disconfirms their disjunction.

Provide a real-world example of each of these four situations. (None of the propositions (dis)confirming or being (dis)confirmed should be a tautology or a contradiction.)

Understand “confirming” as an incremental notion: making a hypothesis more plausible than it was before. You can rely on pre-theoretic intuitions about when this happens; it’s not necessary to commit to a specific probabilistic analysis of confirmation. That said, we found it easiest to give examples involving claims about how a fair six-sided die came up when rolled. (If you get stuck trying to do the same, try with a larger die, for example one with 8 or 10 or 12 sides.)

- (a) The hypothesis is that a fair six-sided die came up `1-or-2`. One piece of evidence is that it came up `1-or-3` (taking us from 2 successes out of 6 to 1 success out of 2). The other piece of evidence is that it came up `2-or-3`. The conjunction of these pieces of evidence is that the die came up `3`, which refutes the hypothesis.
- (b) The hypothesis is that a fair six-sided die came up in `{1,2,3,4}`. One piece of evidence is that it came up in `{2,3,4,5}`; the other piece of evidence is that it came up in `{2,3,4,6}`. Each piece of evidence takes the hypothesis from 4 successes out of 6 to 3 successes out of 4. But their disjunction takes the hypothesis to 3 successes out of 5.
- (c) Use the same propositions as in (a), but with the roles of hypotheses and evidence reversed. The evidence that the die came up `1-or-2` confirms each of the hypotheses that it came up `1-or-3` or came up `2-or-3` (taking us from 2 successes out of 6 to 1 success out of 2). But it refutes their conjunction, which is that the die came up `3`.
- (d) Use the same propositions as in (b), but with the roles of hypotheses and evidence reversed. The evidence that the die came up in `{1,2,3,4}` confirms each of the two hypotheses by taking them from 4 successes out of 6 to 3 successes out of 4. But it takes their disjunction from 5 successes out of 6 to 3 successes out of 4.
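As a sanity check, the four die examples above can be verified by direct enumeration. This is an illustrative sketch: propositions are modeled as sets of outcomes of a fair six-sided die, and conditional probabilities are computed by counting.

```python
from fractions import Fraction

# Propositions about a fair six-sided die, modeled as sets of outcomes.
OMEGA = {1, 2, 3, 4, 5, 6}

def p(s):
    """Unconditional probability of a set of outcomes."""
    return Fraction(len(s), len(OMEGA))

def cond(h, e):
    """p(h | e), by counting equally likely outcomes."""
    return Fraction(len(h & e), len(e))

H = {1, 2}; E1 = {1, 3}; E2 = {2, 3}
# (a) each piece of evidence confirms H, but their conjunction refutes it
assert cond(H, E1) > p(H) and cond(H, E2) > p(H) and cond(H, E1 & E2) == 0

H = {1, 2, 3, 4}; E1 = {2, 3, 4, 5}; E2 = {2, 3, 4, 6}
# (b) each piece confirms H (4/6 -> 3/4), but their disjunction disconfirms (3/5)
assert cond(H, E1) > p(H) and cond(H, E2) > p(H) and cond(H, E1 | E2) < p(H)

E = {1, 2}; H1 = {1, 3}; H2 = {2, 3}
# (c) E confirms each hypothesis but refutes their conjunction
assert cond(H1, E) > p(H1) and cond(H2, E) > p(H2) and cond(H1 & H2, E) == 0

E = {1, 2, 3, 4}; H1 = {2, 3, 4, 5}; H2 = {2, 3, 4, 6}
# (d) E confirms each hypothesis but disconfirms their disjunction (5/6 -> 3/4)
assert cond(H1, E) > p(H1) and cond(H2, E) > p(H2) and cond(H1 | H2, E) < p(H1 | H2)
print("all four situations verified")
```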

Using the axioms listed above, prove each of the following:

(a) `p(A) + p(¬A) = 1`

By Normality, `p(A ∨ ¬A) = 1`. But by Finite Additivity, `p(A ∨ ¬A) = p(A) + p(¬A)`.

(b) If `A` and `B` are logically equivalent, then `p(A) = p(B)`. (**Hint:** When `A` and `B` are logically equivalent, `{A, ¬B}` will be a partition of all the logical possibilities.)

If `A` and `B` are logically equivalent, then `A` and `¬B` are mutually exclusive. So by Finite Additivity, `p(A ∨ ¬B) = p(A) + p(¬B)`. But it will also be the case that `A ∨ ¬B` is a tautology. So by Normality, `p(A ∨ ¬B) = 1`. Putting these together, `p(A) + p(¬B) = 1`. Applying result (a), `p(¬B) = 1 - p(B)`, so substituting, we get `p(A) + 1 - p(B) = 1`, which simplifies to `p(A) = p(B)`.

(c) `p(A) = p(A ∧ B) + p(A ∧ ¬B)`

`A ∧ B` and `A ∧ ¬B` are mutually exclusive, so by Finite Additivity, `p((A ∧ B) ∨ (A ∧ ¬B)) = p(A ∧ B) + p(A ∧ ¬B)`. But `(A ∧ B) ∨ (A ∧ ¬B)` is logically equivalent to `A`, so by result (b), we have `p(A) = p(A ∧ B) + p(A ∧ ¬B)`.

(d) If `A` logically entails `B`, then `p(B) ≥ p(A)`.

If `A` logically entails `B`, then `A` is logically equivalent to `A ∧ B`, so by result (b), (i) `p(A) = p(A ∧ B)`. By result (c), (ii) `p(B) = p(B ∧ A) + p(B ∧ ¬A)`. Substituting in (i), this is equivalent to (iii) `p(B) = p(A) + p(B ∧ ¬A)`. Because Non-Negativity implies `p(B ∧ ¬A) ≥ 0`, (iii) implies that `p(B) ≥ p(A)`.

(e) `p(A ∨ B) = p(A) + p(B) - p(A ∧ B)`

By result (c) above, `p(A) = p(A ∧ B) + p(A ∧ ¬B)`, and `p(B) = p(A ∧ B) + p(¬A ∧ B)`. So the right-hand side of the desired result is equivalent to (i) `(p(A ∧ B) + p(A ∧ ¬B)) + (p(A ∧ B) + p(¬A ∧ B)) - p(A ∧ B)`, or (ii) `p(A ∧ B) + p(A ∧ ¬B) + p(¬A ∧ B)`. Since `A ∧ B`, `A ∧ ¬B`, and `¬A ∧ B` are mutually exclusive, Finite Additivity gives us that (ii) equals (iii) `p((A ∧ B) ∨ (A ∧ ¬B) ∨ (¬A ∧ B))`. But since `(A ∧ B) ∨ (A ∧ ¬B) ∨ (¬A ∧ B)` is logically equivalent to `A ∨ B`, result (b) gives us that (iii) equals (iv) `p(A ∨ B)`.

(f) `p(A) = p(E) p(A|E) + p(¬E) p(A|¬E)`

By the Ratio Formula, `p(E) p(A|E) = p(A ∧ E)`, and `p(¬E) p(A|¬E) = p(A ∧ ¬E)`. So the right-hand side is equivalent to `p(A ∧ E) + p(A ∧ ¬E)`, which by (c) above equals `p(A)`.

(g) `p(A|E) > p(A) iff p(E|A) > p(E)`

By the Ratio Formula, `p(A|E) = p(A ∧ E)/p(E)`, and `p(E|A) = p(A ∧ E)/p(A)`. We’re assuming that the divisors are non-zero, and by Non-Negativity we therefore have that they’re positive. So the left-hand inequality is equivalent to (i) `p(A ∧ E) > p(A) p(E)`, and the right-hand inequality is equivalent to (ii) `p(A ∧ E) > p(E) p(A)`. These claims are straightforwardly equivalent.

(h) `p(A|E) > p(A) iff p(A|E) > p(A|¬E)`

By the Ratio Formula, the right-hand inequality is equivalent to (i) `p(A ∧ E)/p(E) > p(A ∧ ¬E)/p(¬E)`, and since we’re assuming `p(E)` and `p(¬E)` are both `> 0`, (i) is equivalent to (ii) `p(¬E) p(A ∧ E) > p(E) p(A ∧ ¬E)`. Result (a) tells us that (ii) is equivalent to (iii) `(1 - p(E)) p(A ∧ E) > p(E) p(A ∧ ¬E)`, and result (c) tells us that (iii) is equivalent to (iv) `(1 - p(E)) p(A ∧ E) > p(E) (p(A) - p(A ∧ E))`. Simplifying, we get that (iv) is equivalent to (v) `p(A ∧ E) > p(E) p(A)`. But the Ratio Formula tells us that the left-hand inequality is equivalent to (vi) `p(A ∧ E)/p(E) > p(A)`, which, given that `p(E) > 0`, is equivalent to (v).
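The results (a)–(h) can also be spot-checked numerically. This sketch builds a randomly weighted distribution over the eight truth-assignments to three atoms (the names `A`, `B`, `E`, and the helper combinators are ours, chosen for illustration) and asserts each identity or equivalence:

```python
from itertools import product
import random

# A probability function on the language of three atoms is determined by a
# distribution over the eight possible worlds (truth-assignments).
random.seed(0)
weights = [random.random() for _ in range(8)]
total = sum(weights)
worlds = list(product([True, False], repeat=3))  # (A, B, E)
pr = {w: wt / total for w, wt in zip(worlds, weights)}

def p(pred):
    """Probability of the set of worlds where pred holds."""
    return sum(q for w, q in pr.items() if pred(*w))

A = lambda a, b, e: a
B = lambda a, b, e: b
E = lambda a, b, e: e
NOT = lambda f: (lambda a, b, e: not f(a, b, e))
AND = lambda f, g: (lambda a, b, e: f(a, b, e) and g(a, b, e))
OR = lambda f, g: (lambda a, b, e: f(a, b, e) or g(a, b, e))

def cond(f, g):
    """Conditional probability via the Ratio Formula."""
    return p(AND(f, g)) / p(g)

eps = 1e-12
assert abs(p(A) + p(NOT(A)) - 1) < eps                                  # (a)
assert abs(p(A) - (p(AND(A, B)) + p(AND(A, NOT(B))))) < eps             # (c)
assert p(B) >= p(AND(A, B)) - eps                                       # (d): A ∧ B entails B
assert abs(p(OR(A, B)) - (p(A) + p(B) - p(AND(A, B)))) < eps            # (e)
assert abs(p(A) - (p(E)*cond(A, E) + p(NOT(E))*cond(A, NOT(E)))) < eps  # (f)
assert (cond(A, E) > p(A)) == (cond(E, A) > p(E))                       # (g)
assert (cond(A, E) > p(A)) == (cond(A, E) > cond(A, NOT(E)))            # (h)
print("all identities check out")
```

This is only a spot check on one distribution, of course, not a substitute for the proofs.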

There are three mutually exclusive and jointly exhaustive theories: `A`, `B`, and `C`. All three theories agree that hypotheses `X` and `Y` are conditionally independent of each other, in the sense that `p(X ∧ Y|A) = p(X|A) p(Y|A)`, and similarly for `B` and `C`.

Here are some values:

`p(A) = .3, p(X|A) = .9, p(Y|A) = .4`
`p(B) = .1, p(X|B) = .3, p(Y|B) = .8`
`p(C) = .6, p(X|C) = 0, p(Y|C) = .2`

Answer the following:

(a) First, `X` is learned with certainty to be true. According to the Bayesian, what are the new unconditional probabilities for `A`, `B`, and `C`?

Let us compute `p(X) = p(X|A) p(A) + p(X|B) p(B) + p(X|C) p(C) = .9 × .3 + .3 × .1 + 0 × .6 = .27 + .03 + 0 = .30`.

Bayes’ Theorem says `p(A|X) = p(X|A) p(A) / p(X) = .9 × .3 / .3 = .9`. We also have `p(B|X) = p(X|B) p(B) / p(X) = .3 × .1 / .3 = .1`, and `p(C|X) = p(X|C) p(C) / p(X) = 0 × .6 / .3 = 0`.

(b) Next, after `X` has already been learned with certainty to be true, `Y` is learned with certainty to be true. According to the Bayesian, what are the new unconditional probabilities for `A`, `B`, and `C`?

Given our answers to (a), we’re now in a new probability distribution that we’ll call `q(⋅)`, whose values are:

`q(A) = p(A|X) = .9, q(Y|A) = p(Y|X ∧ A) = p(Y|A) = .4`
`q(B) = p(B|X) = .1, q(Y|B) = p(Y|X ∧ B) = p(Y|B) = .8`
`q(C) = p(C|X) = 0, q(Y|C) = p(Y|X ∧ C) = p(Y|C) = .2`

The equations `p(Y|X ∧ A) = p(Y|A)` and so on are justified because the Ratio Formula tells us that `p(Y|X ∧ A) = p(Y ∧ X ∧ A) / p(X ∧ A) = (p(Y ∧ X|A) p(A)) / (p(X|A) p(A)) = p(Y ∧ X|A) / p(X|A)`. But the conditional independence of `X` and `Y` in the problem setup tells us this last value equals `p(Y|A)`.

Thus using the same strategy as in (a) but with these new values, we get:

`q(Y) = q(Y|A) q(A) + q(Y|B) q(B) + q(Y|C) q(C) = .4 × .9 + .8 × .1 + .2 × 0 = .36 + .08 + 0 = .44`

`q(A|Y) = q(Y|A) q(A) / q(Y) = .4 × .9 / .44 = 0.8181... (9/11)`

`q(B|Y) = q(Y|B) q(B) / q(Y) = .8 × .1 / .44 = 0.1818... (2/11)`

`q(C|Y) = q(Y|C) q(C) / q(Y) = .2 × 0 / .44 = 0`

(c) Now, suppose that `X` and `Y` had instead been learned (with certainty) in the opposite order: `Y` was learned first, and `X` was learned second. In that case, according to the Bayesian, what would the unconditional probabilities be for `A`, `B`, and `C` after learning only `Y`?

Let us compute `p(Y) = p(Y|A) p(A) + p(Y|B) p(B) + p(Y|C) p(C) = .4 × .3 + .8 × .1 + .2 × .6 = .32`.

Bayes’ Theorem says `p(A|Y) = p(Y|A) p(A) / p(Y) = .4 × .3 / .32 = .375`. We also have `p(B|Y) = p(Y|B) p(B) / p(Y) = .8 × .1 / .32 = .25`, and `p(C|Y) = p(Y|C) p(C) / p(Y) = .2 × .6 / .32 = .375`.

(d) Sticking with the suppositions from (c), according to the Bayesian, what would the unconditional probabilities be for `A`, `B`, and `C` after next learning `X` (after having already learned `Y`)?

Given our answers to (c), we’re now in a new probability distribution that we’ll call `r(⋅)`, whose values are:

`r(A) = p(A|Y) = .375, r(X|A) = p(X|Y ∧ A) = p(X|A) = .9`
`r(B) = p(B|Y) = .25, r(X|B) = p(X|Y ∧ B) = p(X|B) = .3`
`r(C) = p(C|Y) = .375, r(X|C) = p(X|Y ∧ C) = p(X|C) = 0`

The equations `p(X|Y ∧ A) = p(X|A)` and so on are justified for the same reasons as in (b). Using the same strategy as before but with these new values, we get:

`r(X) = r(X|A) r(A) + r(X|B) r(B) + r(X|C) r(C) = .9 × .375 + .3 × .25 + 0 × .375 = .4125`

`r(A|X) = r(X|A) r(A) / r(X) = .9 × .375 / .4125 = 0.8181...`

`r(B|X) = r(X|B) r(B) / r(X) = .3 × .25 / .4125 = 0.1818...`

`r(C|X) = r(X|C) r(C) / r(X) = 0 × .375 / .4125 = 0`

(e) Now, suppose that `X` and `Y` had instead been learned (with certainty) at the exact same time: the agent learns `X ∧ Y` (with certainty) in a single moment. According to the Bayesian, what would the unconditional probabilities be for `A`, `B`, and `C` after learning `X ∧ Y`?

Let us compute `p(X ∧ Y)`, which I’ll abbreviate as `p(XY)`. This decomposes to `p(XY|A) p(A) + p(XY|B) p(B) + p(XY|C) p(C)`. The fact that `X` and `Y` are independent conditional on `A` tells us that the left-hand term of this sum equals `p(X|A) p(Y|A) p(A)`. Similarly for the other terms, so the result is `p(X|A) p(Y|A) p(A) + p(X|B) p(Y|B) p(B) + p(X|C) p(Y|C) p(C) = .9 × .4 × .3 + .3 × .8 × .1 + 0 × .2 × .6 = .108 + .024 + 0 = .132`.

The Ratio Formula tells us that (i) `p(A|XY) = p(AXY)/p(XY) = p(XY|A) p(A) / p(XY)`. The fact that `X` and `Y` are independent conditional on `A` tells us that `p(XY|A) = p(X|A) p(Y|A)`. Substituting into (i), we get (ii) `p(A|XY) = p(X|A) p(Y|A) p(A) / p(XY)`.

Plugging in the known values, we get `p(A|XY) = .9 × .4 × .3 / .132 = 9/11 = 0.8181...`. Similarly, `p(B|XY) = p(X|B) p(Y|B) p(B) / p(XY) = .3 × .8 × .1 / .132 = 2/11 = 0.1818...`, and `p(C|XY) = p(X|C) p(Y|C) p(C) / p(XY) = 0 × .2 × .6 / .132 = 0`.
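The computations in (a)–(e) can be condensed into a short sketch that runs the update in all three ways (X then Y, Y then X, and X ∧ Y at once) and confirms that, given the conditional independence assumption, every order lands on the same posterior:

```python
# Values from the problem setup.
priors = {'A': 0.3, 'B': 0.1, 'C': 0.6}
pX = {'A': 0.9, 'B': 0.3, 'C': 0.0}   # p(X|theory)
pY = {'A': 0.4, 'B': 0.8, 'C': 0.2}   # p(Y|theory)

def update(prior, likelihood):
    """Bayesian conditionalization: posterior ∝ likelihood × prior."""
    unnorm = {t: likelihood[t] * prior[t] for t in prior}
    total = sum(unnorm.values())
    return {t: v / total for t, v in unnorm.items()}

x_then_y = update(update(priors, pX), pY)
y_then_x = update(update(priors, pY), pX)
# Learning X ∧ Y at once: by conditional independence, p(XY|T) = p(X|T) p(Y|T).
both = update(priors, {t: pX[t] * pY[t] for t in priors})

for t in priors:
    assert abs(x_then_y[t] - y_then_x[t]) < 1e-12
    assert abs(x_then_y[t] - both[t]) < 1e-12
print(round(both['A'], 4), round(both['B'], 4), round(both['C'], 4))
# → 0.8182 0.1818 0.0
```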

Consider the discussion of Simpson’s Paradox in section 3.2.3 of Titelbaum’s book. Now consider the following scenario: a population can be partitioned into `{old, young}` and can also be partitioned into `{shuffleboarders, non-shuffleboarders}`. Within the subpopulation of old people, shuffleboarding is positively correlated with good health. And within the subpopulation of young people, shuffleboarding is positively correlated with good health. However, in the entire population, shuffleboarding is negatively correlated with good health. According to the Bayesian, is this scenario possible? If not, briefly explain (in English) why not. If so, briefly explain (in English) how it might happen.

Yes, this is possible. It can happen when being in `old` is negatively correlated with good health, and shuffleboarding is positively correlated with being in `old`.
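To see concretely how the answer above can work, here is a sketch with invented headcounts (the numbers are ours, chosen purely for illustration): shuffleboarding helps within each age group, but shuffleboarders are mostly old, and the old are less healthy overall.

```python
# Hypothetical counts: (group, shuffleboards?, population, number healthy).
data = [
    ('old',   True,  50, 20),   # 40% healthy
    ('old',   False, 50, 15),   # 30% healthy
    ('young', True,  10,  9),   # 90% healthy
    ('young', False, 90, 72),   # 80% healthy
]

def rate(rows):
    """Fraction healthy among the given rows."""
    n = sum(c for _, _, c, _ in rows)
    h = sum(h for _, _, _, h in rows)
    return h / n

for grp in ('old', 'young'):
    sh = rate([r for r in data if r[0] == grp and r[1]])
    non = rate([r for r in data if r[0] == grp and not r[1]])
    assert sh > non  # positively correlated within each subpopulation

overall_sh = rate([r for r in data if r[1]])
overall_non = rate([r for r in data if not r[1]])
assert overall_sh < overall_non  # negatively correlated in the whole population
print(round(overall_sh, 3), round(overall_non, 3))
# → 0.483 0.621
```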

Can a probability function satisfying the axioms listed earlier assign `p(H) = 0.5`, `p(G) = 0.5`, and `p(¬H ∧ ¬G) = 0.8`? Explain why or why not.

No, because `p(¬H) = 1 - p(H) = 0.5`, and by result 27d, `p(¬H ∧ ¬G) ≤ p(¬H)` (since `¬H ∧ ¬G` entails `¬H`), so `p(¬H ∧ ¬G)` can be at most `0.5`.

Prove that for any propositions `H` and `G`, if `p(H iff G) = 1` then `p(H) = p(G)`.

By result 27c, `p(H) = p(H ∧ G) + p(H ∧ ¬G)`, and `p(G) = p(G ∧ H) + p(G ∧ ¬H)`. Substituting these into the desired result and simplifying, we get that what we need to derive is that (i) `p(H ∧ ¬G) = p(G ∧ ¬H)`. We have as a premise that (ii) `p(H iff G) = 1`, so by result 27a, (iii) `p(¬(H iff G)) = 0`. But `¬(H iff G)` is logically equivalent to `(H ∧ ¬G) ∨ (G ∧ ¬H)`, so we have that (iv) `p((H ∧ ¬G) ∨ (G ∧ ¬H)) = 0`. Since those two disjuncts are mutually exclusive, Finite Additivity tells us (iv) is equivalent to (v) `p(H ∧ ¬G) + p(G ∧ ¬H) = 0`. Because of Non-Negativity, the only way (v) can be true is if `p(H ∧ ¬G) = p(G ∧ ¬H) = 0`, which gives us the desired result (i).

Can an agent have a probabilistic credence distribution `cr(⋅)` meeting all of the following constraints?

- (a) The agent is certain of `A ⊃ (B iff C)`.
- (b) The agent is equally confident of `B` and `¬B`.
- (c) The agent is twice as confident of `C` as `C ∧ A`.
- (d) `cr(B ∧ C ∧ ¬A) = 1/5`.

If not, prove that it’s impossible. If so, provide a probability table and demonstrate that the resulting distribution satisfies each of the four constraints.

**Hint:** Start by building a probability table; then figure out what each of the constraints says about the credence values in the table; then figure out if it’s possible to meet all of the constraints at once.

We start to build the table. We know from constraint (b) that `p(B) = p(¬B) = 0.5`.

| A | B | C | credence |
|---|---|---|----------|
| T | T | T | d |
| T | T | F | e |
| T | F | T | f |
| T | F | F | g |
| F | T | T | j |
| F | T | F | 0.5-d-e-j |
| F | F | T | k |
| F | F | F | 0.5-f-g-k |

We know from constraint (a) that `¬A ∨ (B iff C)` should have probability `1`, thus `1 = j + (0.5-d-e-j) + k + (0.5-f-g-k) + d + g = 1 - e - f`, so `e` and `f` must be `0`.

We know from constraint (c) that `d+f+j+k` is twice `d+f`, so `j+k = d+f = d`.

We know from constraint (d) that `j = .2`. Thus we have:

| A | B | C | credence |
|---|---|---|----------|
| T | T | T | d |
| T | T | F | 0 |
| T | F | T | 0 |
| T | F | F | g |
| F | T | T | j = 0.2 |
| F | T | F | 0.5-d-j = 0.3-d |
| F | F | T | k = d-j = d-0.2 |
| F | F | F | 0.5-g-k = 0.7-g-d |

This can be satisfied; for example, if `d = 0.2` and `g = 0.5`, we get:

| A | B | C | credence |
|---|---|---|----------|
| T | T | T | d = 0.2 |
| T | T | F | 0 |
| T | F | T | 0 |
| T | F | F | g = 0.5 |
| F | T | T | j = 0.2 |
| F | T | F | 0.3-d = 0.1 |
| F | F | T | k = d-0.2 = 0 |
| F | F | F | 0.7-g-d = 0 |

Now `p(B) = p(¬B) = 0.5`; `p(C) = 0.4`; and `p(C ∧ A) = 0.2`.
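The final table can be checked mechanically against all four constraints. A sketch, with worlds encoded as (A, B, C) truth-value triples:

```python
from fractions import Fraction as F

# The distribution from the final table above.
cr = {
    (True,  True,  True):  F(1, 5),   # d = 0.2
    (True,  True,  False): F(0),
    (True,  False, True):  F(0),
    (True,  False, False): F(1, 2),   # g = 0.5
    (False, True,  True):  F(1, 5),   # j = 0.2
    (False, True,  False): F(1, 10),
    (False, False, True):  F(0),
    (False, False, False): F(0),
}
assert sum(cr.values()) == 1

def p(pred):
    return sum(v for w, v in cr.items() if pred(*w))

# (a) certain of A ⊃ (B iff C)
assert p(lambda a, b, c: (not a) or (b == c)) == 1
# (b) equally confident of B and ¬B
assert p(lambda a, b, c: b) == p(lambda a, b, c: not b)
# (c) twice as confident of C as of C ∧ A
assert p(lambda a, b, c: c) == 2 * p(lambda a, b, c: c and a)
# (d) cr(B ∧ C ∧ ¬A) = 1/5
assert p(lambda a, b, c: b and c and not a) == F(1, 5)
print("all four constraints satisfied")
```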
A family has two children of different ages. Assume that each child has a probability of `1/2` of being a girl, and that the probability that the elder is a girl is independent of the probability that the younger is.

Conditional on the older child’s being a girl, what’s the probability that the younger one is?

Since the probability of the younger’s being a girl is independent of the older’s being a girl, the answer is still `1/2`.

Conditional on at least one child’s being a girl, what’s the probability that they both are?

There are four possibilities, and because the gender of each child is independent of the other’s, each of these four possibilities has probability `1/4`. In three of these possibilities, at least one of the children is a girl. So conditional on our being in one of those three cases, the probability that they’re both girls is `(1/4)/(3/4)`, or `1/3`.
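Both answers fall out of enumerating the four equally likely (elder, younger) combinations. A minimal sketch:

```python
from fractions import Fraction
from itertools import product

# The four equally likely families, as (elder, younger) pairs.
families = list(product(['G', 'B'], repeat=2))

def cond(h, e):
    """P(h | e), counting over equally likely families."""
    e_set = [f for f in families if e(f)]
    return Fraction(sum(1 for f in e_set if h(f)), len(e_set))

# Conditional on the elder being a girl, the younger is a girl with prob 1/2.
assert cond(lambda f: f[1] == 'G', lambda f: f[0] == 'G') == Fraction(1, 2)
# Conditional on at least one girl, both are girls with prob 1/3.
assert cond(lambda f: f == ('G', 'G'), lambda f: 'G' in f) == Fraction(1, 3)
```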

Flip and Flop are playing a game. They have a fair coin that they are going to keep flipping until one of two things happens: either the coin comes up heads twice in a row, or it comes up tails followed by heads. The first time one of these things happens, the game ends. If it ended with `HH`, Flip wins; if it ended with `TH`, Flop wins.

(a) What’s the probability that Flip wins after the first two tosses of the coin? What’s the probability that Flop wins after the first two tosses of the coin?

There are four possibilities for the first two tosses, each with equal probability. In one of those four (`HH`), Flip wins; in another (`TH`), Flop wins. In the other two possibilities, the second toss is a `T` and neither wins. So the answer is `1/4` for both.

(b) Flip and Flop play their game until it ends (at which point one of them wins). What’s the probability that Flop is the winner?

If the second toss was an `H`, then we’re in case (a), and Flop wins half of the time (depending on whether the first toss was a `T`). If the second toss was a `T` and the third toss was an `H`, then again Flop wins. If the second and third tosses were both `T` and the fourth was an `H`, then again Flop wins. The only way for Flip to win is for the first two tosses to both be `H`. That will happen only `1/4` of the time; Flop wins every other game that ever ends. So the probability that Flop wins is `3/4`.
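The reasoning in (b) amounts to solving a tiny Markov chain whose state is the previous toss. A sketch, solving it by hand in code:

```python
from fractions import Fraction

half = Fraction(1, 2)

# From state "previous toss was T": the next H gives TH immediately, and the
# next T keeps us in the same state, so Flop eventually wins with probability 1.
flop_from_T = Fraction(1)
# From state "previous toss was H": the next H ends the game as HH (Flop
# loses); the next T moves to state T.
flop_from_H = half * 0 + half * flop_from_T
# From the start, the first toss is H or T with equal probability.
flop_overall = half * flop_from_H + half * flop_from_T

assert flop_from_H == half
assert flop_overall == Fraction(3, 4)
print(flop_overall)  # Flip wins only via HH on the very first two tosses
```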

Pink gumballs always make my sister sick; blue gumballs make her sick half of the time; white gumballs make her sick only one-tenth of the time. Yesterday, my sister bought a single gumball randomly selected from a machine that’s one-third pink gumballs, one-third blue, and one-third white. Applying this general version of Bayes’ Theorem:

`cr(Hᵢ|E) = cr(E|Hᵢ) cr(Hᵢ) / ( cr(E|H₁) cr(H₁) + cr(E|H₂) cr(H₂) + … + cr(E|Hₙ) cr(Hₙ) )`

how confident should I be that my sister’s gumball was pink conditional on the supposition that it made her sick?

`cr(pink|sick) = cr(sick|pink) cr(pink) / ( cr(sick|pink) cr(pink) + cr(sick|blue) cr(blue) + cr(sick|white) cr(white) )`

`= 1 × 1/3 / ( 1 × 1/3 + 1/2 × 1/3 + 1/10 × 1/3 ) = (1/3) / ( 1/3 + 1/6 + 1/30 ) = (10/30) / ( 16/30 ) = 10/16 = 5/8`
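The same computation, as a short sketch using exact fractions:

```python
from fractions import Fraction as F

# Priors over colors and likelihoods of sickness given each color.
priors = {'pink': F(1, 3), 'blue': F(1, 3), 'white': F(1, 3)}
p_sick = {'pink': F(1), 'blue': F(1, 2), 'white': F(1, 10)}

total = sum(p_sick[c] * priors[c] for c in priors)        # cr(sick)
posterior = {c: p_sick[c] * priors[c] / total for c in priors}

assert posterior['pink'] == F(5, 8)
print(posterior)
```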

Consider the probabilistic credence distribution specified by this probability table:

| P | Q | R | credence |
|---|---|---|----------|
| T | T | T | 0.1 |
| T | T | F | 0.2 |
| T | F | T | 0 |
| T | F | F | 0.3 |
| F | T | T | 0.1 |
| F | T | F | 0.2 |
| F | F | T | 0 |
| F | F | F | 0.1 |

Answer the following questions about this distribution:

(a) What is `cr(P∣Q)`?

`cr(P|Q) = cr(P ∧ Q)/cr(Q) = (0.1 + 0.2) / (0.1 + 0.2 + 0.1 + 0.2) = 1/2`

(b) Relative to this distribution, is `Q` positively relevant to `P`, negatively relevant to `P`, or probabilistically independent of `P`?

Since `cr(P) = 0.1 + 0.2 + 0.3 = 0.6`, result (a) tells us that `Q` is negatively relevant to `P`.

(c) What is `cr(P∣R)`?

`cr(P|R) = cr(P ∧ R)/cr(R) = (0.1 + 0) / (0.1 + 0 + 0.1 + 0) = 1/2`

(d) What is `cr(P∣Q ∧ R)`?

`cr(P|Q ∧ R) = cr(P ∧ Q ∧ R)/cr(Q ∧ R) = 0.1 / (0.1 + 0.1) = 1/2`

(e) Conditional on `R`, is `Q` positively relevant to `P`, negatively relevant to `P`, or probabilistically independent of `P`?

Results (c) and (d) tell us that conditional on `R`, `Q` is independent of `P`.

(f) Does `R` screen off `P` from `Q`? Explain why or why not.

`P` and `Q` are not independent in the original distribution, but conditional on `R` they become independent. However, Titelbaum requires for screening off also that `cr(P|Q ∧ ¬R) = cr(P|¬R)`. One can verify from the table that the left-hand side is `0.2/(0.2+0.2) = 1/2`, while the right-hand side is `(0.2+0.3)/(0.2+0.3+0.2+0.1) = 0.5/0.8 = 5/8`. So this additional condition is not satisfied, and `R` does not screen off `P` from `Q`.
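All six answers can be recomputed directly from the table. A sketch:

```python
from fractions import Fraction as F

# The distribution from the table, as (P, Q, R) worlds.
cr = {
    (True,  True,  True):  F(1, 10), (True,  True,  False): F(2, 10),
    (True,  False, True):  F(0),     (True,  False, False): F(3, 10),
    (False, True,  True):  F(1, 10), (False, True,  False): F(2, 10),
    (False, False, True):  F(0),     (False, False, False): F(1, 10),
}

def p(pred):
    return sum(v for w, v in cr.items() if pred(*w))

def cond(h, e):
    return p(lambda *w: h(*w) and e(*w)) / p(e)

P = lambda p_, q, r: p_
Q = lambda p_, q, r: q
R = lambda p_, q, r: r

assert cond(P, Q) == F(1, 2)                            # (a)
assert cond(P, Q) < p(P) == F(3, 5)                     # (b): negatively relevant
assert cond(P, R) == F(1, 2)                            # (c)
assert cond(P, lambda p_, q, r: q and r) == F(1, 2)     # (d): independent given R
# (f): R does not screen off, since cr(P|Q ∧ ¬R) ≠ cr(P|¬R)
assert cond(P, lambda p_, q, r: q and not r) == F(1, 2)
assert cond(P, lambda p_, q, r: not r) == F(5, 8)
```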

Show that probabilistic independence is not transitive. That is, provide a single probability distribution on which all of the following are true: `X` is independent of `Y`, and `Y` is independent of `Z`, but `X` is not independent of `Z`. Show that your distribution satisfies all three conditions.

A simple solution would be to let `X` = `Z`. To have `X` and `Z` be different propositions, let `X` say that a fair six-sided die came up `1` or `2`, and let `Z` say that it came up `1` or `4`. Clearly these are not independent (the probability of each being true is `1/3`, but the probability of both being true is `1/6`, not `1/9`). But they are each independent of `Y`, which says that the die came up even.

In Section 3.3 Titelbaum pointed out that the following statement (labeled Equation (3.43) there) does not hold for every constant `k` and propositions `A`, `B`, and `C`: `cr(C|A) = k` and `cr(C|B) = k` entail `cr(C∣A ∨ B) = k`.

Describe a real-world example (involving dice, or cards, or something more interesting) in which it’s rational for an agent to assign `cr(C∣A) = k` and `cr(C∣B) = k` but `cr(C∣A ∨ B) ≠ k`. Show that your example meets this description.

Our answer to Problem (26b) provided such a case. The hypothesis `C` that the die comes up `1` through `6` gets probability `5/8` on either of the pieces of evidence about its coming up one of eight specified values. But their disjunction covers eleven values, only five of which are ones where `C` is true.

Prove that if `A` and `B` are mutually exclusive, then whenever `cr(C∣A) = k` and `cr(C∣B) = k`, it’s also the case that `cr(C∣A ∨ B) = k`.

If `A` and `B` are mutually exclusive, then so too will be `C ∧ A` and `C ∧ B`. Thus by Additivity, `cr((C ∧ A) ∨ (C ∧ B)) = cr(C ∧ A) + cr(C ∧ B)`.

By the Ratio Formula, `cr(C|A ∨ B) = cr(C ∧ (A ∨ B))/cr(A ∨ B) = cr((C ∧ A) ∨ (C ∧ B)) / cr(A ∨ B)`. Because of the mutual exclusivity of `C ∧ A` and `C ∧ B`, and also of `A` and `B`, this becomes `(cr(C ∧ A) + cr(C ∧ B))/(cr(A) + cr(B))`.

Since `k = cr(C|A) = cr(C ∧ A)/cr(A)`, `cr(C ∧ A) = k cr(A)`. Similarly, `cr(C ∧ B) = k cr(B)`. Substituting these into the result of the previous paragraph, we get that `cr(C|A ∨ B) = (k cr(A) + k cr(B))/(cr(A) + cr(B)) = k`.
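Both die-based claims above can be checked by enumeration. Note that for the Equation (3.43) counterexample this sketch uses the six-sided-die propositions from example (b) earlier (hypothesis `{1,2,3,4}`, evidence sets `{2,3,4,5}` and `{2,3,4,6}`), rather than the eight-value variant the sample answer mentions:

```python
from fractions import Fraction

OMEGA = {1, 2, 3, 4, 5, 6}   # outcomes of a fair six-sided die

def p(s):
    return Fraction(len(s), len(OMEGA))

def cond(h, e):
    return Fraction(len(h & e), len(e))

def indep(s, t):
    return p(s & t) == p(s) * p(t)

# Non-transitivity: X ⫫ Y and Y ⫫ Z, yet X and Z are not independent.
X, Y, Z = {1, 2}, {2, 4, 6}, {1, 4}
assert indep(X, Y) and indep(Y, Z) and not indep(X, Z)

# Equation (3.43) failure: equal conditional credences, unequal on the disjunction.
C, A, B = {1, 2, 3, 4}, {2, 3, 4, 5}, {2, 3, 4, 6}
assert cond(C, A) == cond(C, B) == Fraction(3, 4)
assert cond(C, A | B) == Fraction(3, 5)   # ≠ 3/4, because A and B overlap
```

Note the disjunction failure depends on `A` and `B` overlapping, which is exactly what the mutual-exclusivity proof rules out.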

At `t1`, `t2`, and `t3`, Jane assigns credences over the language `ℒ` constructed from atomic propositions `P` and `Q`. Jane’s distributions satisfy these constraints:

- (i) At `t1`, Jane is certain of `Q ⊃ P`, anything that proposition entails, and nothing else.
- (ii) Between `t1` and `t2`, Jane learns `P` and nothing else. She updates by conditionalizing between those two times.
- (iii) `cr1(Q∣P) = 2∕3`
- (iv) `cr3(Q∣¬P) = 1/2`
- (v) `cr3(P ⊃ Q) = cr2(P ⊃ Q)`
- (vi) At `t3`, Jane is certain of `¬(P ∧ Q)`, anything that proposition entails, and nothing else.

Here’s how we solved this. Begin with a table like this:

| P | Q | cr1 | cr2 | cr3 |
|---|---|-----|-----|-----|
| T | T | a | d | g |
| T | F | b | e | h |
| F | T | c | f | i |
| F | F | 1-a-b-c | 1-d-e-f | 1-g-h-i |

Constraint (i) tells us that `c = 0`, and that none of the other cells for `cr1` are `0`. Constraint (vi) tells us that `g = 0`, and that none of the other cells for `cr3` are `0`. Constraint (iii) tells us that `a = 2b`. Constraint (iv) tells us that `g+h = 1-2i`; with `g = 0`, that gives `h = 1-2i` and hence `1-g-h-i = i`. Filling in these results, we have:

| P | Q | cr1 | cr2 | cr3 |
|---|---|-----|-----|-----|
| T | T | 2b | d | 0 |
| T | F | b | e | 1-2i |
| F | T | 0 | f | i |
| F | F | 1-3b | 1-d-e-f | i |

Now constraint (v) tells us that `2i = 1-e`. And constraint (ii) tells us that `f = 0`, `1-d-e-f = 0`, `d = 2/3`, and `e = 1/3`. Filling in these results, we have:

| P | Q | cr1 | cr2 | cr3 |
|---|---|-----|-----|-----|
| T | T | 2b | 2/3 | 0 |
| T | F | b | 1/3 | 1-2i |
| F | T | 0 | 0 | i |
| F | F | 1-3b | 0 | i |

We also have that `2i = 1-e = 2/3`, so `i = 1/3` and `1-2i = 1/3`.

Answer the following:

Completely specify Jane’s credence distributions at `t2` and `t3`.

| P | Q | cr2 | cr3 |
|---|---|-----|-----|
| T | T | 2/3 | 0 |
| T | F | 1/3 | 1/3 |
| F | T | 0 | 1/3 |
| F | F | 0 | 1/3 |

Create a hypothetical prior for Jane. In other words, specify a regular probabilistic distribution `cr0` over `ℒ` such that `cr1` can be generated from `cr0` by conditionalizing on Jane’s set of certainties at `t1`, `cr2` is `cr0` conditionalized on Jane’s certainties at `t2`, and `cr3` is `cr0` conditionalized on Jane’s certainties at `t3`.

| P | Q | cr0 | cr1 | cr2 | cr3 |
|---|---|-----|-----|-----|-----|
| T | T | 2/5 | 2/4 | 2/3 | 0 |
| T | F | 1/5 | 1/4 | 1/3 | 1/3 |
| F | T | 1/5 | 0 | 0 | 1/3 |
| F | F | 1/5 | 1/4 | 0 | 1/3 |

Does Jane update by Conditionalization between `t2` and `t3`? Explain how you know.

No, because the probability of `¬P` goes from `0` to `2/3`, but conditionalizing can’t change a `0` probability into a positive one.

The Hypothetical Priors Theorem says that if an agent always updates by conditionalizing, then her credences can be represented by a hypothetical prior distribution. Is the converse of this theorem true?

No, because we represented Jane’s credences by a hypothetical prior distribution, but her change from `cr2` to `cr3` was not by conditionalizing.
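The claim that `cr0` works as a hypothetical prior can be verified by conditionalizing it on each time's certainties. A sketch, with worlds as (P, Q) pairs:

```python
from fractions import Fraction as F

# The hypothetical prior cr0 from the table above.
cr0 = {(True, True): F(2, 5), (True, False): F(1, 5),
       (False, True): F(1, 5), (False, False): F(1, 5)}

def conditionalize(prior, keep):
    """Zero out worlds outside `keep` and renormalize the rest."""
    total = sum(v for w, v in prior.items() if w in keep)
    return {w: (v / total if w in keep else F(0)) for w, v in prior.items()}

# t1: certain of Q ⊃ P, which rules out the world (¬P, Q).
cr1 = conditionalize(cr0, {(True, True), (True, False), (False, False)})
# t2: additionally certain of P.
cr2 = conditionalize(cr0, {(True, True), (True, False)})
# t3: certain of ¬(P ∧ Q).
cr3 = conditionalize(cr0, {(True, False), (False, True), (False, False)})

assert cr1 == {(True, True): F(1, 2), (True, False): F(1, 4),
               (False, True): F(0), (False, False): F(1, 4)}
assert cr2[(True, True)] == F(2, 3) and cr2[(True, False)] == F(1, 3)
assert cr3[(True, True)] == 0
assert all(cr3[w] == F(1, 3) for w in cr3 if w != (True, True))
```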

At noon I rolled a six-sided die. It came from either the Fair Factory (which produces exclusively fair dice), the Snake-Eyes Factory (which produces dice with a `1/2` chance of coming up one and an equal chance of each other outcome), or the Boxcar Factory (which produces dice with a `1/4` chance of coming up six and an equal chance of each other outcome).

Suppose you use the Principle of Indifference to assign equal credence to each of the three factories from which the die might have come. Applying the Principal Principle, what is your credence that my die roll came up three?

`cr(three) = cr(three|Fair) cr(Fair) + cr(three|Snake-Eyes) cr(Snake-Eyes) + cr(three|Boxcar) cr(Boxcar)`

`= 1/6 × 1/3 + 1/10 × 1/3 + 3/20 × 1/3 = 5/36`

Maria tells you that the die I rolled didn’t come from the Boxcar Factory. If you update on this new evidence by Conditionalization, how confident are you that the roll came up three?

`cr(three|Fair ∨ Snake-Eyes) = cr(three|Fair) cr(Fair|Fair ∨ Snake-Eyes) + cr(three|Snake-Eyes) cr(Snake-Eyes|Fair ∨ Snake-Eyes)`

`= 1/6 × 1/2 + 1/10 × 1/2 = 2/15`

Is Maria’s evidence “admissible” with respect to the outcome of the die roll? Explain.

Yes, because if we were certain of the chance of the die roll, her information would be redundant.

After you’ve incorporated Maria’s information into your credence distribution, Ron tells you the roll didn’t come up six. How confident are you in a three after conditionalizing on Ron’s information?

(This solution formerly had an error: it said `cr2(¬six|Snake-Eyes) = 1/10`, whereas it should be `9/10`.)

Let `cr2(⋅) = cr(⋅|Fair ∨ Snake-Eyes)`, that is, our credence distribution after hearing from Maria. Ron’s information affects our credences about whether the die came from the Fair Factory or the Snake-Eyes Factory:

`cr2(Fair|¬six) = cr2(¬six|Fair) cr2(Fair) / ( cr2(¬six|Fair) cr2(Fair) + cr2(¬six|Snake-Eyes) cr2(Snake-Eyes) )`

`= 5/6 × 1/2 / ( 5/6 × 1/2 + 9/10 × 1/2 ) = 25/52`

`cr2(Snake-Eyes|¬six) = cr2(¬six|Snake-Eyes) cr2(Snake-Eyes) / ( cr2(¬six|Fair) cr2(Fair) + cr2(¬six|Snake-Eyes) cr2(Snake-Eyes) )`

`= 9/10 × 1/2 / ( 5/6 × 1/2 + 9/10 × 1/2 ) = 27/52`

Now `cr2(three|¬six) = cr2(three|¬six ∧ Fair) cr2(Fair|¬six) + cr2(three|¬six ∧ Snake-Eyes) cr2(Snake-Eyes|¬six)`. Plugging in the previous lines, we get:

`= cr2(three|¬six ∧ Fair) × 25/52 + cr2(three|¬six ∧ Snake-Eyes) × 27/52`

`= 1/5 × 25/52 + 1/9 × 27/52 = 8/52 = 2/13 ≈ 0.15385`

(The claim that `cr2(three|¬six ∧ Snake-Eyes) = 1/9` deserves comment. Think of the Snake-Eyes Factory as producing ten-sided dice, with five sides labeled `1` and the rest numbered `2` through `6`. When you learn that the die didn’t come up `6`, that leaves nine other equally likely sides, only one of which is a three.)

Is Ron’s evidence “admissible” with respect to the outcome of the die roll? Explain.

No, because even if we were certain of the chance of the die roll, his information would still be probabilistically relevant.
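The three-stage computation above can be reproduced exactly with fractions. A sketch:

```python
from fractions import Fraction as F

# Chance of each face, per factory (via the Principal Principle).
chance = {
    'Fair':       {face: F(1, 6) for face in range(1, 7)},
    'Snake-Eyes': {1: F(1, 2), **{face: F(1, 10) for face in range(2, 7)}},
    'Boxcar':     {6: F(1, 4), **{face: F(3, 20) for face in range(1, 6)}},
}
cr = {f: F(1, 3) for f in chance}            # Principle of Indifference

# (a) credence in a three
assert sum(cr[f] * chance[f][3] for f in cr) == F(5, 36)

# (b) conditionalize on Maria's news: not the Boxcar Factory
cr = {'Fair': F(1, 2), 'Snake-Eyes': F(1, 2)}
assert sum(cr[f] * chance[f][3] for f in cr) == F(2, 15)

# (d) conditionalize on Ron's news: the roll didn't come up six
not6 = {f: 1 - chance[f][6] for f in cr}
total = sum(cr[f] * not6[f] for f in cr)
cr = {f: cr[f] * not6[f] / total for f in cr}
assert cr['Fair'] == F(25, 52) and cr['Snake-Eyes'] == F(27, 52)

# chance of a three given ¬six within each factory, then mix
three_given_not6 = {f: chance[f][3] / not6[f] for f in cr}
assert sum(cr[f] * three_given_not6[f] for f in cr) == F(2, 13)
```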

I just 3D-printed a cube, but I will not show it to you. All I’ll tell you about the cube is that each edge is somewhere between `10cm` and `20cm`.

(a) First, apply the Principle of Indifference to the edge-length of the cube. What is your credence that the cube has an edge-length of `15cm` or greater?

`1/2`

(b) Instead, apply the Principle of Indifference to the volume of the cube. What is your credence that the cube has a volume of `3375cm³` or greater?

Since the volume will range from `1000cm³` to `8000cm³`, the volumes of `3375cm³` or greater make up `4625/7000`, a bit over `.66` of the range.

(c) Instead, apply the Principle of Indifference to the surface area of the cube. What is your credence that the cube has a surface area of `1350cm²` or greater?

Since the surface area will range from `600cm²` to `2400cm²`, the surface areas of `1350cm²` or greater make up `1050/1800`, a bit over `.58` of the range.

(d) Your answers to (a), (b), and (c) will be pairwise inconsistent with each other. Why do philosophers take that to be a challenge for the Principle of Indifference?

Given the laws of geometry, a cube’s having an edge-length of `15cm` or greater, its having a volume of `3375cm³` or greater, and its having a surface area of `1350cm²` or greater are all equivalent conditions. Thus, assigning different credences to each of the three conditions yields inconsistent probability assignments to the same condition, depending on the way that condition is described.
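The three conflicting credences come from spreading a uniform distribution over three different parameterizations of the same cube. A sketch of the computation:

```python
def frac_at_or_above(threshold, lo, hi):
    """Fraction of the interval [lo, hi] at or above the threshold."""
    return (hi - threshold) / (hi - lo)

# The same cube, parameterized three ways: edge length, volume, surface area.
edge = frac_at_or_above(15, 10, 20)
volume = frac_at_or_above(15**3, 10**3, 20**3)
area = frac_at_or_above(6 * 15**2, 6 * 10**2, 6 * 20**2)

# Logically equivalent conditions, three different credences.
assert edge == 0.5
assert abs(volume - 4625/7000) < 1e-12
assert abs(area - 1050/1800) < 1e-12
print(round(edge, 3), round(volume, 3), round(area, 3))
```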