Comment Rescue: True Rejections
A True Rejection is a scenario or a set of facts that — if it were true — would cause you to revise a conclusion that you have expressed. New commenter OhB1Knewbie understands:
THIS IS A THOUGHT EXPERIMENT! The purpose is not to win the argument. The purpose is to attempt a bit of self-examination and reflection. The goal of the game is to tease apart the two portions that form the foundation of any individual’s position. Any individual’s position has as its foundation an amalgam of logic/reason and emotion/faith. It is hoped that by separating these two constituents of the foundation, the part based on logic/reason may be examined to determine if in fact it is the True Rejection or if it is actually just a straw dog set up to protect and shield the emotion/faith based portion of the foundation of the individual’s position.
This exactly. If the logic/reason part is in the driver’s seat, then a change in certain facts or a discovered flaw in one’s deductions will cause a change in outcome. If the emotion/faith part is in the driver’s seat, then a change in the facts or deductions on which one is currently relying will… cause you to find new facts or deductions that lead you to believe the same old thing. OhB1Knewbie continues:
The underlying thesis [of the True Rejection game] is that unless you are addressing the logical foundation rather than the emotional foundation you’re wasting your time and energy. Because only the logical foundation is susceptible to change by the refutation or amending of the facts upon which it is based, only that portion should be addressed while the emotional foundation should be set aside in recognition that it will only change after a long slow simmer in the soup of logic/reason.
To honestly play the game, a player must search the cosmos of both the possible and the impossible to find something which would, if it were true, change their mind and hence their position. Failure to do so is to lose the game by default because you have refused to play. This is generally interpreted to imply that the logical foundation of your position is in fact only a straw dog intended to protect the emotional foundation which you have no intention of exposing or defending.
Of course all of this is predicated on the idea that decisions should be based only on logic/reason and not on emotion/faith, the purpose of which is to separate the educated/civilized from the ignorant/barbarian within any individual. The game is structured in recognition of the fact that we are all human and therefore tend to obstruct even ourselves in the honest evaluation of our motives and therefore are assisted by the abstraction of our logic/reason that the game necessitates.
This seems more or less right to me as well. Introspection suggests that emotional commitments can and do change over time, but it does take a good long while.
However, as Eliezer Yudkowsky has previously explained, True Rejections aren’t always easy to pin down:
I suspect that, in general, if two rationalists set out to resolve a disagreement that persisted past the first exchange, they should expect to find that the true sources of the disagreement are either hard to communicate, or hard to expose. E.g.:
- Uncommon, but well-supported, scientific knowledge or math;
- Long inferential distances;
- Hard-to-verbalize intuitions, perhaps stemming from specific visualizations;
- Zeitgeists inherited from a profession (that may have good reason for it);
- Patterns perceptually recognized from experience;
- Sheer habits of thought;
- Emotional commitments to believing in a particular outcome;
- Fear of a past mistake being disproven;
- Deep self-deception for the sake of pride or other personal benefits.
In politics, whole ideologies are explicitly premised on things like Zeitgeist and the epistemic value of habit and intuition.
Nor can I say that I am immune to these. Though I’ll of course take it as a compliment if you say it about me, which many of you do all the time.
I like this characterization. It makes me think of extremes which might reveal its epistemic limits. One extreme might be beliefs about, say, mathematics (or any other beliefs subject to strict proof). The True Rejection for a mathematical belief would be clear: just a presentation of a proof. Of course, being presented with the proof isn’t sufficient to change the person’s mind about mathematical belief P: they must understand the proof as well. And for someone (like me) who doesn’t understand complicated mathematical proofs, my assurance that the (e.g.) Löwenheim–Skolem theorem is in fact true rests on my confidence that people smarter than me both do understand the proof and aren’t lying about it.
Taking another extreme (this was mentioned by Wardsmith in a discussion with Patrick about AGW), I wonder what – if any – True Rejection could be imagined for changing one’s belief/disbelief in God. In some sense, I think the True Rejection for both the theist and the atheist would require articulating a complex belief set completely different from the one they currently experience. And that may not be so easy to do. So this particular True Rejection might be inexpressible. But I wouldn’t conclude from that that the person – even the atheist! – is irrational in maintaining their belief simply because they cannot articulate a True Rejection.
These types of situations make me wonder to what extent True Rejection disentangles closely held evidence-based beliefs about the empirical from the believer’s parallel emotional commitment to that belief on a subjective level. That is, the inability to articulate a True Rejection may appear to another person as evidence that the relevant belief is emotion-based rather than rational, but that presumes not only that a True Rejection can in principle be articulated, but also that the person being challenged is capable of articulating it. In trivial cases – like disputes about who won the Super Bowl in 1973 – the True Rejection method seems entirely applicable. In more complicated cases, where beliefs are determined for reasons that even we might not be fully cognizant of, things get dicey. Recall Hume here: our belief that there’s a hallway rather than a gaping void outside the door we are about to walk through is irrational.
my true rejection for my faith isn’t disproving G-d — i’d still hold my faith even if you did disprove G-d (in that case, it would be a rather silly faith, but I’m rather okay with silliness). If you could convince me that I’d be a better person (yes, I do have criteria here) without faith, then I’d not believe in G-d.
I take it that a TR is a statement of an objective state of affairs which would, if it were true, falsify your belief and cause you to change your mind. Your TR for the belief in God isn’t objective, since it contains subjectively determined value terms. In the ideal case, those terms would get a semantics to make them objectively measurable. I’m not sure that could be done, which is part of what I was getting at in the earlier comment.
You will be a better person if you give me all your worldly possessions.
no. I do not judge you as being better than the people I do support, in terms of your worth as an asset to humanity. I also believe that you have better means to support yourself.
Careful with this.
‘Cause I’m pretty sure there’s someone out there that you believe is better than you are, that you do support.
You need to go give *them* all your stuff 🙂
who is to say I don’t, Patrick?
he’s a nice guy. we share.
I’m crushed.
And some people are simply immune to logical thought:
Nigel Tufnel: The numbers all go to eleven. Look, right across the board, eleven, eleven, eleven and…
Marty DiBergi: Oh, I see. And most amps go up to ten?
Nigel Tufnel: Exactly.
Marty DiBergi: Does that mean it’s louder? Is it any louder?
Nigel Tufnel: Well, it’s one louder, isn’t it? It’s not ten. You see, most blokes, you know, will be playing at ten. You’re on ten here, all the way up, all the way up, all the way up, you’re on ten on your guitar. Where can you go from there? Where?
Marty DiBergi: I don’t know.
Nigel Tufnel: Nowhere. Exactly. What we do is, if we need that extra push over the cliff, you know what we do?
Marty DiBergi: Put it up to eleven.
Nigel Tufnel: Eleven. Exactly. One louder.
Marty DiBergi: Why don’t you just make ten louder and make ten be the top number and make that a little louder?
Nigel Tufnel: [pause] These go to eleven.
Or at the least, they are incapable of abstract reasoning, or thinking counterfactually (from their pov). I wouldn’t be inclined to say that someone who cannot think abstractly is irrational.
I know someone who counts pebbles in his head to do math problems. He’s quite rational.
My Alesis goes to +15.
I think any exercise that causes people to analyze their own opinions more rigorously is a Good Thing, but I’m still going to quibble here: if the True Rejection is supposed to highlight the logical basis for one’s opinion, then I don’t see what use there is in a True Rejection that’s fantastical or impossible. If my True Rejection for an opinion is “superintelligent aliens land on earth and tell me that I’m wrong”, that’s a pretty good sign that my opinion isn’t based on logic. Somewhat relatedly, if I have a belief about possible events that would come about several decades from now, having a True Rejection that basically says “those events don’t occur” doesn’t seem to add any clarity to my justifications for the opinion I hold right now.
True Rejections are one tool among many. I don’t want to be understood as taking them for the whole of rationality.
For example, there is at least one class of truth that doesn’t admit of a True Rejection: analytic truths, which stem from language and definition. I have no True Rejection for the statement “A bachelor is single.”
I’m convinced in fact that there isn’t one, at least as long as we are discussing in standard English.
So even if superintelligent aliens (or God!) said otherwise, I wouldn’t reject the statement “a bachelor is single.” I’d just conclude that they were using the language in an eccentric way, and that they were being very silly in arguing so.
If my True Rejection for an opinion is “superintelligent aliens land on earth and tell me that I’m wrong”, that’s a pretty good sign that my opinion isn’t based on logic.
Not if your opinion is that the statistics and observations of current astronomy and science don’t support intelligent alien lifeforms existing in our neck of the galaxy. Then it’s a perfectly good TR.
Well, that’s why I said “a pretty good sign” and not “incontrovertible proof”…
Any theory of knowledge that supposes the superior virtue of human intelligences denying their humanity seems like folly to me. You are human, you were born human, you’ll die human, and between now and then you’ll make all of your decisions in a sea of emotions you’ll barely understand and can’t possibly control. I’m not saying that’s ideal. I’m saying it’s reality.
Oh, you can control all of the emotions some of the time, and some of the emotions all of the time, but you can’t control all of the emotions all of the time.
I understand all that. But I’m also drawn to virtue ethics. Which one of my human capacities should I most exercise and nourish? Which one requires a lot of work to develop? Rationality, I’d say.
Emotions barely need nourishment at all. What they need, if anything, is weeding and pruning.
Full point.
Full point, only if you agree with the premise. Lots of people don’t. Are they less virtuous, less rational, because they are incapable of intellectualism or reject it?
Are they less rational if they are incapable of, or reject, intellectualism? Um, aren’t they by definition?
Let’s just put it this way: I’m less likely to be convinced that they can have well-formulated practical policy ideas.
If someone wants to argue foundational principles, rationalists vs. emotive types are certainly equally valid.
You want to talk practical implementations, it’s time to break out a measuring tape.
Um, aren’t they by definition?
Only according to one definition of rationality. I think what you’re getting at is that people who can’t/don’t want to think abstractly or comprehend deep logical arguments are irrational.
Have you ever heard of the Zen story where the teacher asks his students what the nature of the water in the pitcher is, and they all try to give an intellectual answer until one just knocks the pitcher over, breaking it?
Intellectualism isn’t the only avenue to the truth. It’s just the preferred method of Westerners.
And if I could edit, I’d add that I agree with Patrick above.
People who react to their emotions rather than choosing to do virtue instead of vice?
Yeah, I’d say that they’re less virtuous. Probably less rational too.
I mean, assuming virtue exists, of course. (Cthulhu fhtagn, etc.)
What if their emotions are very virtuous? Don’t Buddhists have something to say about this?
They say something to the effect of “don’t react, always act from a state of moral awareness”.
Moral awareness is a big thing to Buddhists.
They’re really big on stuff like “discipline” too.
Maybe, but the topic of discussion was emotions, not discipline. I’ve noticed that you like to expand the scope of an argument when it’s convenient.
Dude, way to ignore Jaybird’s first two sentences, which were entirely on point. I’ve noticed that you like to cherrypick which arguments to respond to when it’s convenient.
Since when does JB need an advocate?
I don’t need one.
The Truth could always use another.
The third sentence was intended to be read in the light of the first two.
Don’t let your emotions run away with you.
Right. But what is the neutral reader supposed to take away from that comment? That you endorse the idea that emotions alone can be virtuous? Or that there’s a downside to that as well: that high-stepping discipline is inconsistent with, and negates, emotional rationality and virtuousness?
But what is the neutral reader supposed to take away from that comment?
Let the neutral reader ask for clarification.
That you endorse the idea that emotions alone can be virtuous?
Um. No. I do not endorse that idea.
Or that there’s a downside to that as well: that high-stepping discipline is inconsistent with, and negates, emotional rationality and virtuousness?
We are now using sentences that make no sense to me.
Ahh, yes. And here we get the typical Jaybird games. Commenter S says something JB objects to. JB responds. Commenter S then asks for clarification of what JB meant. JB responds by saying that ‘well that isn’t what I meant at all, but I won’t articulate what I meant either, I’ll let you figure it out.’
Here’s what I’ve figured out: you have lots of objections that you can’t coherently articulate. That’s some short shrift, my friend.
Interesting observation, Mr. Stillwater. In the current crisis, this does not make him a bad person, nor does it make his objection moot, even if he cannot quite articulate it.
Words, and the mastery of them, are both overrated. This is the realm of the sophist, not the lover of wisdom. The cleverest man is not necessarily the wisest.
The lover of wisdom would help Brother JB articulate his argument to its best form even if he thinks he disagrees with it.
Perhaps in helping articulate it, the helper may even come to agree. That would rather be the point of this whole thing together. I often sit down to write a rebuttal, and upon further review end up with a concurrence.
Stillwater, let me be clear:
Buddhism teaches (among other things) that being carried away by emotions is bad. One should not be led by one’s emotions. One should not *REACT*. Instead, one should choose to act from a state of Moral Awareness.
This is pretty much *NOT* the same thing as responding as if, and let me cut and paste for you here, “their emotions are very virtuous”.
That is what the Buddhists teach about emotion (among other things, of course).
So when you ask whether high-stepping discipline is inconsistent with, and negates, emotional rationality and virtuousness, I feel like I am in a completely different conversation than the one you seem to think we’re in.
Tom Van Dyke,
You make an excellent point. The nature of my blog-commenting can sometimes make me lazy and too quick. And a principle of charity, which ought to always be in play, is sometimes lost. So I agree with you.
As for the other point you bring up – the irony of me criticizing someone for failing to adequately articulate their view isn’t lost on me. Well played.
I think so. Reason, unlike animal instincts, is a human tool to deal with reality as it presents itself to us. The fact that we don’t have perfect knowledge, or that we have emotions which are powerful at times, doesn’t mean we can’t learn to assess our emotional reactions to better understand our value-judgements and act according to the best of our knowledge, always willing to grow with new knowledge and better understanding.
Sheesh, Jason. I’m finding this whole line of posts you’ve been doing on these issues over the past few weeks my favorite stuff on the web these days.
Do you guys at Cato really spend your days talking about this stuff? I want to work there now. The whole being Libertarian thing… that’s more of a guideline than a requirement, right?
If you like this line of discussion, you should really check out Less Wrong and read some of the highest rated posts.
What James K said.
This line of posts owes a lot to my work — Eliezer Yudkowsky is commenting at Cato Unbound, as you know — but it also stems from my recreation time, for want of a better word. My husband is a participant in the Less Wrong community, and I’ve been a lurker for several months.
Our discussions at home often concern techniques of rationality, futurism, Bayesian reasoning, and whatever else the Less Wrong community is discussing at the time.
Yudkowsky’s page on Bayesian Reasoning is one of the best on the web.
Glad you brought this up again, Jason.
I wanted to comment on Koz’s comment here on the ‘Logical Rudeness’ thread, but I was pressed for time.
For me, it’s a much more practical matter.
#2 is my own personal belief (that AGW is real, and that we can do something about it), and it follows that some sort of action is required.
Scenarios 1 & 3 (that it’s not real, and that it is, but we can’t do anything about it) both end the same way: we sit on our hands and don’t do a darned thing.
Even if #3 turns out to be true, it seems more pragmatic to do what we can now, and wait and see (with what little time we’ve got) what comes along later.
Wages paid by firms should be allowed to float downward at times of loose labor markets, but during those times the federal government should provide wage support to restore workers’ total hourly cash compensation to the established minimum. This is an obvious economic stabilization and stimulus measure that shouldn’t be remotely controversial in principle. Determining the starting and stopping (or phase-out) triggers would be an involved and controversial process, but one that government is and should be perfectly capable of producing a workable solution to.
Beyond that, during demand shortfalls the wage restoration amount should actually bring take-home pay above what it had been prior to the labor market downturn, to provide additional spending power to consumers and buttress aggregate demand.
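To make the proposed arithmetic concrete, here is a minimal sketch of the top-up rule as I read it. The wage_support helper and every dollar figure are invented for illustration; the comment fixes neither the trigger levels nor the support amounts.

```python
# Minimal sketch of the wage-support idea described above; all
# names and numbers are illustrative, not part of any actual proposal.

def wage_support(market_wage: float, established_minimum: float,
                 demand_shortfall: bool = False,
                 stimulus_bump: float = 0.0) -> float:
    """Per-hour government top-up: restore pay to the minimum,
    plus an extra bump during demand shortfalls."""
    # Base case: make up the gap between the floated wage and the minimum.
    top_up = max(0.0, established_minimum - market_wage)
    if demand_shortfall:
        # During a demand shortfall, bring take-home pay above the
        # prior level to buttress aggregate demand.
        top_up += stimulus_bump
    return top_up

# Example: the firm's wage floats down from $10 to $8 against a $10 minimum.
print(wage_support(8.0, 10.0))   # 2.0 -> worker still takes home $10/hr
print(wage_support(8.0, 10.0, demand_shortfall=True,
                   stimulus_bump=0.5))   # 2.5 -> $10.50/hr
```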
Uh, yeah, this belongs in the minimum wage thread. My bad.
That does it: I’m cutting your wages in half.
No, don’t do that! I love my wages! Let the other guy have them all!
… this provides “stabilization” at the expense of slowing down the “renormalization” — aka shifting jobs from a quadrant where they shouldn’t be anymore to one where they should be. (Taking Mises’ explanation of “why we get depressions/recessions” as a given — provide a different one, and we’ll discuss which one is more applicable.)