Due on Fri ~~Nov 3~~ Nov 10.

Many of these problems are adapted from Titelbaum.

In all of the following problems, you can rely on the three probability axioms:

- **Non-Negativity**: For any proposition `H`, `p(H) ≥ 0`
- **Normality**: The probability of any tautology is `1`
- **Finite Additivity** (none of the problems require the extra strength of **Countable Additivity**): For any mutually exclusive propositions `H` and `G`, `p(H ∨ G) = p(H) + p(G)`

You can also rely on the **Ratio Formula** (moreover, you can rely on the relevant divisors always being greater than zero):

- For any propositions `H` and `G` where `p(G) > 0`, `p(H|G) = p(H ∧ G)/p(G)`
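These axioms and the Ratio Formula can be spot-checked mechanically on a toy distribution. The sketch below (with arbitrary made-up weights) represents a credence function as weights over the four `H`/`G` truth-value combinations:

```python
# A toy credence distribution over the four truth-value combinations of H and G.
# The numbers are arbitrary illustrative values; any non-negative weights
# summing to 1 would do.
worlds = {  # (H, G) -> probability
    (True, True): 0.2,
    (True, False): 0.3,
    (False, True): 0.1,
    (False, False): 0.4,
}

def p(pred):
    """Probability of the set of worlds where pred holds."""
    return sum(weight for w, weight in worlds.items() if pred(w))

# Non-Negativity: every proposition gets probability >= 0 (weights are >= 0).
assert all(weight >= 0 for weight in worlds.values())
# Normality: a tautology (true in every world) gets probability 1.
assert abs(p(lambda w: True) - 1) < 1e-9
# Finite Additivity, applied to the mutually exclusive H and (¬H ∧ G):
assert abs(p(lambda w: w[0] or w[1])
           - (p(lambda w: w[0]) + p(lambda w: not w[0] and w[1]))) < 1e-9
# Ratio Formula: p(H|G) = p(H ∧ G) / p(G), defined since p(G) > 0 here.
p_H_given_G = p(lambda w: w[0] and w[1]) / p(lambda w: w[1])
print(p_H_given_G)
```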

Wesley Salmon observed that each of the following confirmational situations can arise:

- Two pieces of evidence each confirm some hypothesis, but their conjunction disconfirms it.
- Two pieces of evidence each confirm some hypothesis, but their disjunction disconfirms it.
- A piece of evidence confirms each of two hypotheses, but it disconfirms their conjunction.
- A piece of evidence confirms each of two hypotheses, but it disconfirms their disjunction.

Provide a real-world example of each of these four situations. (None of the propositions (dis)confirming or being (dis)confirmed should be a tautology or a contradiction.)

Understand “confirming” as an incremental notion: making a hypothesis more plausible than it was before. You can rely on pre-theoretic intuitions about when this happens; it’s not necessary to commit to a specific probabilistic analysis of confirmation. That said, we found it easiest to give examples involving claims about how a fair six-sided die came up when rolled. (If you get stuck trying to do the same, try with a larger die, for example one with 8 or 10 or 12 sides.)

Using the axioms listed above, prove each of the following:

- `p(A) + p(¬A) = 1`
- If `A` and `B` are logically equivalent, then `p(A) = p(B)`. (**Hint:** When `A` and `B` are logically equivalent, `{A, ¬B}` will be a partition of all the logical possibilities.)
- `p(A) = p(A ∧ B) + p(A ∧ ¬B)`
- If `A` logically entails `B`, then `p(B) ≥ p(A)`
- `p(A ∨ B) = p(A) + p(B) - p(A ∧ B)`
- `p(A) = p(E) p(A|E) + p(¬E) p(A|¬E)`
- `p(A|E) > p(A)` iff `p(E|A) > p(E)`
- `p(A|E) > p(A)` iff `p(A|E) > p(A|¬E)`

There are three mutually exclusive and jointly exhaustive theories: `A`, `B`, and `C`. All three theories agree that hypotheses `X` and `Y` are conditionally independent of each other, in the sense that `p(X ∧ Y|A) = p(X|A) p(Y|A)`, and similarly for `B` and `C`. Here are some values:

`p(A) = .3, p(X|A) = .9, p(Y|A) = .4`
`p(B) = .1, p(X|B) = .3, p(Y|B) = .8`
`p(C) = .6, p(X|C) = 0, p(Y|C) = .2`

Answer the following:

- (a) First, `X` is learned with certainty to be true. According to the Bayesian, what are the new unconditional probabilities for `A`, `B`, and `C`?
- (b) Next, after `X` has already been learned with certainty to be true, `Y` is learned with certainty to be true. According to the Bayesian, what are the new unconditional probabilities for `A`, `B`, and `C`?
- (c) Now, suppose that `X` and `Y` had instead been learned (with certainty) in the opposite order: `Y` was learned first, and `X` was learned second. In that case, according to the Bayesian, what would the unconditional probabilities be for `A`, `B`, and `C` after learning only `Y`?
- (d) Sticking with the suppositions from (c), according to the Bayesian, what would the unconditional probabilities be for `A`, `B`, and `C` after next learning `X` (after having already learned `Y`)?
- (e) Now, suppose that `X` and `Y` had instead been learned (with certainty) at the exact same time: the agent learns `X ∧ Y` (with certainty) in a single moment. According to the Bayesian, what would the unconditional probabilities be for `A`, `B`, and `C` after learning `X ∧ Y`?
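The repeated conditionalizations can be checked mechanically. The following sketch uses the values given above and exploits the stipulated conditional independence of `X` and `Y` within each theory; it illustrates the update mechanics, not a substitute for showing your reasoning:

```python
# Bayesian updating over three mutually exclusive, jointly exhaustive theories,
# using the values stated in the problem.
priors = {"A": 0.3, "B": 0.1, "C": 0.6}
p_X = {"A": 0.9, "B": 0.3, "C": 0.0}  # p(X|theory)
p_Y = {"A": 0.4, "B": 0.8, "C": 0.2}  # p(Y|theory)

def update(cr, likelihoods):
    """Conditionalize: posterior(T) is proportional to cr(T) * p(evidence|T)."""
    joint = {t: cr[t] * likelihoods[t] for t in cr}
    total = sum(joint.values())
    return {t: joint[t] / total for t in joint}

after_X = update(priors, p_X)            # after learning X
after_X_then_Y = update(after_X, p_Y)    # then learning Y
after_Y = update(priors, p_Y)            # opposite order: Y first
after_Y_then_X = update(after_Y, p_X)    # then X
# Learning X ∧ Y all at once: by conditional independence within each theory,
# p(X ∧ Y | T) = p(X|T) * p(Y|T).
after_XY = update(priors, {t: p_X[t] * p_Y[t] for t in priors})
print(after_X, after_X_then_Y, after_Y, after_Y_then_X, after_XY)
```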

Consider the discussion of Simpson’s Paradox in section 3.2.3 of Titelbaum’s book. Now consider the following scenario: a population can be partitioned into `{old, young}` and can also be partitioned into `{shuffleboarders, non-shuffleboarders}`. Within the subpopulation of old people, shuffleboarding is positively correlated with good health. And within the subpopulation of young people, shuffleboarding is positively correlated with good health. However, in the entire population, shuffleboarding is negatively correlated with good health. According to the Bayesian, is this scenario possible? If not, briefly explain (in English) why not. If so, briefly explain (in English) how it might happen.

Can a probability function satisfying the axioms listed earlier assign `p(H) = 0.5`, `p(G) = 0.5`, and `p(¬H ∧ ¬G) = 0.8`? Explain why or why not.

Prove that for any propositions `H` and `G`, if `p(H iff G) = 1` then `p(H) = p(G)`.

Can an agent have a probabilistic credence distribution `cr(⋅)` meeting all of the following constraints?

- The agent is certain of `A ⊃ (B iff C)`.
- The agent is equally confident of `B` and `¬B`.
- The agent is twice as confident of `C` as `C ∧ A`.
- `cr(B ∧ C ∧ ¬A) = 1/5`.

If not, prove that it’s impossible. If so, provide a probability table and demonstrate that the resulting distribution satisfies each of the four constraints.

**Hint:** Start by building a probability table; then figure out what each of the constraints says about the credence values in the table; then figure out if it’s possible to meet all of the constraints at once.
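For intuition about how subgroup and whole-population correlations can even point in opposite directions, here is a sketch with invented counts (the numbers are illustrative only, not part of the problem):

```python
# A made-up population in which a correlation holds within each subgroup but
# reverses in the whole population. Each row: (group, shuffleboards, total, healthy).
data = [
    ("old",   True,  90, 30),   # 30 of 90 old shuffleboarders are healthy
    ("old",   False, 10,  2),   # 2 of 10 old non-shuffleboarders
    ("young", True,  10,  9),   # 9 of 10 young shuffleboarders
    ("young", False, 90, 72),   # 72 of 90 young non-shuffleboarders
]

def rate(rows):
    """Fraction healthy among the people covered by these rows."""
    total = sum(n for _, _, n, _ in rows)
    healthy = sum(h for _, _, _, h in rows)
    return healthy / total

for group in ("old", "young"):
    sb = rate([r for r in data if r[0] == group and r[1]])
    non = rate([r for r in data if r[0] == group and not r[1]])
    assert sb > non  # within each group, shuffleboarders are healthier

overall_sb = rate([r for r in data if r[1]])       # all shuffleboarders
overall_non = rate([r for r in data if not r[1]])  # all non-shuffleboarders
assert overall_sb < overall_non  # reversed in the whole population
print(overall_sb, overall_non)
```

The reversal is driven by the uneven group sizes: shuffleboarders are concentrated in the (less healthy) old subgroup.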
A family has two children of different ages. Assume that each child has a probability of `1/2` of being a girl, and that the probability that the elder is a girl is independent of the probability that the younger is.

- Conditional on the older child’s being a girl, what’s the probability that the younger one is?
- Conditional on at least one child’s being a girl, what’s the probability that they both are?
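The standard enumeration argument for problems like this can be mechanized; the sketch below simply counts equiprobable (elder, younger) outcomes:

```python
from itertools import product

# The four equiprobable (elder, younger) combinations, each with probability 1/4.
outcomes = list(product(["girl", "boy"], repeat=2))

def cond_prob(event, given):
    """Conditional probability by counting equiprobable outcomes."""
    given_set = [o for o in outcomes if given(o)]
    return sum(1 for o in given_set if event(o)) / len(given_set)

# P(younger is a girl | elder is a girl)
a = cond_prob(lambda o: o[1] == "girl", lambda o: o[0] == "girl")
# P(both girls | at least one girl)
b = cond_prob(lambda o: o == ("girl", "girl"), lambda o: "girl" in o)
print(a, b)
```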

Flip and Flop are playing a game. They have a fair coin that they are going to keep flipping until one of two things happens: either the coin comes up heads twice in a row, or it comes up tails followed by heads. The first time one of these things happens, the game ends. If it ended with `HH`, Flip wins; if it ended with `TH`, Flop wins.

- What’s the probability that Flip wins after the first two tosses of the coin? What’s the probability that Flop wins after the first two tosses of the coin?
- Flip and Flop play their game until it ends (at which point one of them wins). What’s the probability that Flop is the winner?
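A quick Monte Carlo sketch can be used to check an analytical answer to the second part; note that a simulation gives an estimate, not a proof:

```python
import random

# Simulate Flip and Flop's game: flip a fair coin until either HH (Flip wins)
# or TH (Flop wins) occurs.
def play(rng):
    prev = rng.choice("HT")
    while True:
        cur = rng.choice("HT")
        if prev == "H" and cur == "H":
            return "Flip"
        if prev == "T" and cur == "H":
            return "Flop"
        prev = cur

rng = random.Random(0)  # fixed seed for reproducibility
n = 100_000
flop_wins = sum(play(rng) == "Flop" for _ in range(n))
print(flop_wins / n)  # estimate of P(Flop wins)
```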

Pink gumballs always make my sister sick; blue gumballs make her sick half of the time; while white gumballs make her sick only one-tenth of the time. Yesterday, my sister bought a single gumball randomly selected from a machine that’s one-third pink gumballs, one-third blue, and one-third white. Applying this general version of Bayes’ Theorem:

`cr(Hᵢ|E) = cr(E|Hᵢ) cr(Hᵢ) / ( cr(E|H₁) cr(H₁) + cr(E|H₂) cr(H₂) + … + cr(E|Hₙ) cr(Hₙ) )`

how confident should I be that my sister’s gumball was pink conditional on the supposition that it made her sick?
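The plug-in computation can be sketched as follows (the variable names are mine, not from the text):

```python
# Plugging the gumball numbers into the general Bayes' Theorem quoted above.
# H_i ranges over the three colors; E is "the gumball made her sick".
priors = {"pink": 1/3, "blue": 1/3, "white": 1/3}  # cr(H_i)
p_sick = {"pink": 1.0, "blue": 0.5, "white": 0.1}  # cr(E|H_i)

# Denominator: total probability of E, summed over the three hypotheses.
denominator = sum(p_sick[c] * priors[c] for c in priors)
cr_pink_given_sick = p_sick["pink"] * priors["pink"] / denominator
print(cr_pink_given_sick)
```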

Consider the probabilistic credence distribution specified by this probability table:

| P | Q | R | credence |
|---|---|---|----------|
| T | T | T | 0.1 |
| T | T | F | 0.2 |
| T | F | T | 0 |
| T | F | F | 0.3 |
| F | T | T | 0.1 |
| F | T | F | 0.2 |
| F | F | T | 0 |
| F | F | F | 0.1 |

Answer the following questions about this distribution:

- What is `cr(P|Q)`?
- Relative to this distribution, is `Q` positively relevant to `P`, negatively relevant to `P`, or probabilistically independent of `P`?
- What is `cr(P|R)`?
- What is `cr(P|Q ∧ R)`?
- Conditional on `R`, is `Q` positively relevant to `P`, negatively relevant to `P`, or probabilistically independent of `P`?
- Does `R` screen off `P` from `Q`? Explain why or why not.
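Answers here can be checked mechanically against the table. The sketch below encodes the eight rows and computes conditional credences by summing and dividing:

```python
# The eight rows of the table, encoded as (P, Q, R) -> credence.
table = {
    (True, True, True): 0.1,   (True, True, False): 0.2,
    (True, False, True): 0.0,  (True, False, False): 0.3,
    (False, True, True): 0.1,  (False, True, False): 0.2,
    (False, False, True): 0.0, (False, False, False): 0.1,
}

def cr(pred):
    """Unconditional credence: sum the rows where pred holds."""
    return sum(v for w, v in table.items() if pred(w))

def cond(event, given):
    """cr(event | given) via the Ratio Formula."""
    return cr(lambda w: event(w) and given(w)) / cr(given)

print(cr(lambda w: w[0]))                             # cr(P), for comparison
print(cond(lambda w: w[0], lambda w: w[1]))           # cr(P|Q)
print(cond(lambda w: w[0], lambda w: w[2]))           # cr(P|R)
print(cond(lambda w: w[0], lambda w: w[1] and w[2]))  # cr(P|Q ∧ R)
```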

Show that probabilistic independence is not transitive. That is, provide a single probability distribution on which all of the following are true: `X` is independent of `Y`, and `Y` is independent of `Z`, but `X` is not independent of `Z`. Show that your distribution satisfies all three conditions.

In Section 3.3 Titelbaum pointed out that the following statement (labeled Equation (3.43) there) does not hold for every constant `k` and propositions `A`, `B`, and `C`: `cr(C|A) = k` and `cr(C|B) = k` entail `cr(C|A ∨ B) = k`.

- Describe a real-world example (involving dice, or cards, or something more interesting) in which it’s rational for an agent to assign `cr(C|A) = k` and `cr(C|B) = k` but `cr(C|A ∨ B) ≠ k`. Show that your example meets this description.
- Prove that if `A` and `B` are mutually exclusive, then whenever `cr(C|A) = k` and `cr(C|B) = k`, it’s also the case that `cr(C|A ∨ B) = k`.
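For the mutually-exclusive case, a numeric spot-check on an invented distribution (not a proof, and not using any values from the text) can confirm the claimed behavior:

```python
# Spot-check of the mutually-exclusive case of Equation (3.43) on an invented
# six-world distribution. Worlds are (A, B, C) triples; A and B are never
# both true, so they are mutually exclusive.
worlds = {
    (True, False, True): 0.10, (True, False, False): 0.10,
    (False, True, True): 0.20, (False, True, False): 0.20,
    (False, False, True): 0.05, (False, False, False): 0.35,
}

def cr(pred):
    return sum(v for w, v in worlds.items() if pred(w))

def cond(event, given):
    return cr(lambda w: event(w) and given(w)) / cr(given)

k_A = cond(lambda w: w[2], lambda w: w[0])           # cr(C|A)
k_B = cond(lambda w: w[2], lambda w: w[1])           # cr(C|B)
k_AB = cond(lambda w: w[2], lambda w: w[0] or w[1])  # cr(C|A ∨ B)
print(k_A, k_B, k_AB)  # with A, B mutually exclusive, all three agree
```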

At `t1`, `t2`, and `t3`, Jane assigns credences over the language `ℒ` constructed from atomic propositions `P` and `Q`. Jane’s distributions satisfy these constraints:

- At `t1`, Jane is certain of `Q ⊃ P`, anything that proposition entails, and nothing else.
- Between `t1` and `t2`, Jane learns `P` and nothing else. She updates by conditionalizing between those two times.
- `cr1(Q|P) = 2/3`
- `cr3(Q|¬P) = 1/2`
- `cr3(P ⊃ Q) = cr2(P ⊃ Q)`
- At `t3`, Jane is certain of `¬(P ∧ Q)`, anything that proposition entails, and nothing else.

Answer the following:

- Completely specify Jane’s credence distributions at `t2` and `t3`.
- Create a hypothetical prior for Jane. In other words, specify a regular probabilistic distribution `cr0` over `ℒ` such that `cr1` can be generated from `cr0` by conditionalizing on Jane’s set of certainties at `t1`, `cr2` is `cr0` conditionalized on Jane’s certainties at `t2`, and `cr3` is `cr0` conditionalized on Jane’s certainties at `t3`.
- Does Jane update by Conditionalization between `t2` and `t3`? Explain how you know.
- The Hypothetical Priors Theorem says that if an agent always updates by conditionalizing, then her credences can be represented by a hypothetical prior distribution. Is the converse of this theorem true?
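The bookkeeping for specifying Jane’s distributions can be scaffolded with a generic conditionalization routine over the four `P`/`Q` worlds. The uniform prior below is a placeholder for illustration only, not Jane’s actual hypothetical prior:

```python
# Generic conditionalization over the four truth-value combinations of P and Q.
worlds = [(True, True), (True, False), (False, True), (False, False)]  # (P, Q)
cr0 = {w: 0.25 for w in worlds}  # placeholder regular prior, NOT Jane's cr0

def conditionalize(cr, evidence):
    """Drop the worlds where the evidence fails (credence 0), renormalize the rest."""
    kept = {w: p for w, p in cr.items() if evidence(w)}
    total = sum(kept.values())
    return {w: p / total for w, p in kept.items()}

# Example mechanics: conditionalize first on Q ⊃ P (i.e. ¬Q ∨ P), then on P.
cr1 = conditionalize(cr0, lambda w: (not w[1]) or w[0])
cr2 = conditionalize(cr1, lambda w: w[0])
print(cr1, cr2)
```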

At noon I rolled a six-sided die. It came from either the Fair Factory (which produces exclusively fair dice), the Snake-Eyes Factory (which produces dice with a `1/2` chance of coming up one and equal chance of each other outcome), or the Boxcar Factory (which produces dice with a `1/4` chance of coming up six and equal chance of each other outcome).

Suppose you use the Principle of Indifference to assign equal credence to each of the three factories from which the die might have come. Applying the Principal Principle, what is your credence that my die roll came up three?

Maria tells you that the die I rolled didn’t come from the Boxcar Factory. If you update on this new evidence by Conditionalization, how confident are you that the roll came up three?

Is Maria’s evidence “admissible” with respect to the outcome of the die roll? Explain.

After you’ve incorporated Maria’s information into your credence distribution, Ron tells you the roll didn’t come up six. How confident are you in a three after conditionalizing on Ron’s information?

Is Ron’s evidence “admissible” with respect to the outcome of the die roll? Explain.
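One way to organize these calculations is to build the joint credence over (factory, outcome) pairs from the stated chances, then conditionalize on each piece of evidence; the sketch below uses exact fractions:

```python
from fractions import Fraction as F

# Chance of each outcome 1..6 for dice from each factory, as stated above.
chance = {
    "fair":   {k: F(1, 6) for k in range(1, 7)},
    "snake":  {1: F(1, 2), **{k: F(1, 10) for k in range(2, 7)}},
    "boxcar": {6: F(1, 4), **{k: F(3, 20) for k in range(1, 6)}},
}
# Equal credence 1/3 in each factory (Principle of Indifference), combined
# with the chances (Principal Principle) to give a joint credence.
joint = {(f, k): F(1, 3) * p for f, d in chance.items() for k, p in d.items()}

def cond(event, given):
    num = sum(p for w, p in joint.items() if event(w) and given(w))
    den = sum(p for w, p in joint.items() if given(w))
    return num / den

everything = lambda w: True
print(cond(lambda w: w[1] == 3, everything))                  # credence in a three
print(cond(lambda w: w[1] == 3, lambda w: w[0] != "boxcar"))  # after Maria's evidence
print(cond(lambda w: w[1] == 3,
           lambda w: w[0] != "boxcar" and w[1] != 6))         # after Ron's evidence too
```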

I just 3D-printed a cube, but I will not show it to you. All I’ll tell you about the cube is that each edge is somewhere between `10cm` and `20cm`.

- (a) First, apply the Principle of Indifference to the edge-length of the cube. What is your credence that the cube has an edge-length of `15cm` or greater?
- (b) Instead, apply the Principle of Indifference to the volume of the cube. What is your credence that the cube has a volume of `3375cm³` or greater?
- (c) Instead, apply the Principle of Indifference to the surface area of the cube. What is your credence that the cube has a surface area of `1350cm²` or greater?
- (d) Your answers to (a), (b), and (c) will be pairwise inconsistent with each other. Why do philosophers take that to be a challenge for the Principle of Indifference?
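The three applications of the Principle of Indifference amount to putting a uniform distribution on three different parameterizations of the same cube; the arithmetic can be sketched as:

```python
from fractions import Fraction as F

def credence_at_least(lo, hi, threshold):
    """Uniform-on-[lo, hi] probability of the quantity being >= threshold."""
    return F(hi - threshold, hi - lo)

# Note 15cm edge, 3375cm³ volume, and 1350cm² surface area all describe the
# same cube, yet the uniform credences differ across parameterizations.
edge = credence_at_least(10, 20, 15)                  # uniform on edge length
volume = credence_at_least(10**3, 20**3, 15**3)       # uniform on volume
area = credence_at_least(6*10**2, 6*20**2, 6*15**2)   # uniform on surface area
print(edge, volume, area)
```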