Will I understand a causal decision theorist/frequentist/libertarian free will believer before the end of 2024?
Jan 2 · 38% chance

I've been having a long conversation about game theory and decision theory with @JulianMTG.

Concerningly, Julian has been better at predicting my responses to their questions than I've been at predicting their responses to mine. Several times I've said "ok, so we agree on X, how about Y?" or "if you believe X, doesn't that imply Y?", and their response has been akin to "no, I don't agree on X; where did you even get that?".

I don't think this has happened in the other direction, though it's been a long conversation and I may be forgetting some cases. This implies that Julian understands my position better than I understand theirs, which, on the outside view, implies that they're more likely to be correct than I am. But I can't square this with the (to me) obvious irrationality of taking actions that knowingly result in a lower payout, or of ignoring the deterministic nature of physics.
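(To make concrete what I mean by "knowingly taking a lower payout": here's a minimal sketch of the standard Newcomb-style expected-payoff comparison. The dollar amounts and the 0.99 predictor accuracy are the usual illustrative numbers, not anything Julian and I actually agreed on.)

```python
# Minimal sketch of the Newcomb-style payoff comparison (illustrative numbers only).
M = 1_000_000  # opaque box, filled iff the predictor expected one-boxing
K = 1_000      # transparent box, always available

def conditional_ev(one_box: bool, p: float) -> float:
    """Expected payoff if you treat the predictor's accuracy p as the
    probability that its prediction matches your actual choice."""
    if one_box:
        return p * M
    return p * K + (1 - p) * (M + K)

p = 0.99
print("one-box EV:", conditional_ev(True, p))   # ~990,000
print("two-box EV:", conditional_ev(False, p))  # ~11,000

# The causal-dominance reply: for any *fixed* box contents,
# two-boxing is exactly K better than one-boxing.
for contents in (0, M):
    print(f"opaque box holds {contents}: two-boxing gains {(contents + K) - contents}")
```

This is exactly the clash I can't get past: conditioning on your own choice says one-box, while holding the contents fixed says two-box, and each side reads the other as knowingly leaving money on the table.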

The object-level conversation has been unproductive, with endless debates about definitional issues and subtle differences in underlying assumptions whenever we discuss a specific thought experiment. At the moment I think I've frustrated them enough with my misunderstandings that they've stopped responding, but I'm hopeful that we can continue talking eventually. When we do, I'd like to come to understand their position well enough to confidently describe it to others. This market resolves based on whether I think that's occurred.


Maybe there's a typo: are you claiming physics is deterministic? As in quantum mechanics being wrong or incomplete? Or because only unitary evolution ever happens?

i am only the first of the three things in the title 😂

predicts NO

@JulianMTG Aaarggggh you see what I mean‽ Months of talking and I clearly just don't understand your position at all. Very frustrating.

@JulianMTG I mean, if the issue you two are having is something like "with Newcomb you're making decisions that ignore the effects of the perfect predictor and how things can be modeled deterministically, etc.", then I think you probably should be a frequentist (or at a bare minimum an objective Bayesian). If you're a subjective Bayesian, there's a potentially annoying line of attack on how you're constructing your expectation values.
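(A minimal sketch of the "constructing your expectation values" point, using the usual Newcomb payoffs; the predictor-accuracy threshold worked out below is illustrative arithmetic, not a claim made by anyone in this thread.)

```python
# Illustrative only: with the standard Newcomb payoffs, the conditional-EV
# ranking flips depending on the accuracy you assign the predictor.
M, K = 1_000_000, 1_000

def ev_one_box(p):   # p = credence that the prediction matches your choice
    return p * M

def ev_two_box(p):
    return p * K + (1 - p) * (M + K)

# One-boxing wins iff p*M > p*K + (1-p)*(M+K), i.e. p > (M + K) / (2*M).
threshold = (M + K) / (2 * M)
print("one-boxing maximizes conditional EV for p >", threshold)  # 0.5005

for p in (0.4, threshold, 0.6, 0.99):
    print(f"p={p}: one-box {ev_one_box(p):,.0f}  two-box {ev_two_box(p):,.0f}")
```

So whether the expected value even favors one-boxing hinges on the conditional probabilities you plug in, which is presumably where the subjective-vs-objective Bayesian disagreement would bite.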