The point of rationality is to detect your own cognitive failures and to recognize and exploit those of others. It's pretty explicitly NOT about ignoring them.
So when I observe someone saying "that guy is racist, don't believe his arguments", I can recognize that this is an emotional claim. It's an attempt to demonize a person rather than refute the argument. I can then observe that no one is disputing the actual argument, evaluate that on merit, and have a true belief about the world unbiased by my desire not to affiliate with racists. I can also update my beliefs about the rationality/honesty of the person saying "hey that guy is racist".
Conversely, I can also recognize that a desire not to be racist is strong in people, and exploit it when I want to manipulate the less rational. For example, if I'm arguing against economic protectionism with an emotionally driven person, I might use Jim Crow as an example of protectionism rather than occupational licensing.
i think you missed my point, which was that these so called "cognitive failures" are really integral to what we are. rationalism is not generally wrong. improving your cognitive abilities should be a goal for everyone. but it tends to lead people to a worldview where simple humanity is deemed inferior, wrong, and where a person actually starts believing that they outgrew themselves and their humanity, that they are superior, objective, free of bias. this is an illusion, nobody is free of bias.
in fact, people who believe they successfully suppressed or outgrew any emotion are typically the ones most influenced by it, subconsciously.
The rationalists I know in real life tend to be acutely aware that they are still biased in many ways, and would probably laugh at the sentence fragment "successfully suppressed or outgrew [an] emotion". Maybe we know completely different people who affiliate with the word "rationality".
Much like the Zen ideologies, there is no end state to being a rational human being. You can't just stop at one point and say "That's it, I'm perfectly rational now, therefore..."
Instead, it's an ongoing process: identifying your biases and reasoning about them never ends. You will always have biases; the trick is to make the subconscious influence conscious. To come back to the Zen comparison - the more rational you become, the more you realize you're not rational at all.
well said. i just feel it is something people easily say, but hardly put into practice.
i am being unfair, perhaps, as the other comment here states. i may be attributing negative traits i've seen in some people to a whole group that did not deserve it. but, OTOH, you two may be resorting to the "No True Scotsman" fallacy.
> you two may be resorting to the "No True Scotsman" fallacy.
That may be a fair statement, since I do not believe there is a perfectly rational person in the real world - just those who are working to fight against their inherent irrationality. All we can ever rationally (oh, the irony) expect is that they do their best.
So your rationality skills give you a lot of advantages in arguments, more or less independently of the actual merits of your position. Rather than compete in that zero-sum game, isn't the best thing for society then to distrust anyone with rationality skills and form social norms against them?
One of the predictions of this paper is that rational Bayesians sharing information will a) rapidly converge to agreement and b) after making arguments, will often switch sides. An individual's position in an argument will look like a convergent random walk rather than a gradual concession in a negotiation. Of course, real life arguments rarely go this way [1].
But when reading this, I was struck by the fact that in some rare cases I have engaged in arguments that move this way. In every case I can think of, the other participant in the conversation was either a mathematician, a philosopher, or a lesswrong reader.
[1] One rare exception to this is real life arguments about what a stock price should be. Yay for market transmission of information!
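For anyone curious what prediction (b) can look like mechanically, here's a minimal sketch - not the common-knowledge protocol from the paper, just a toy model of my own. Two agents (`alice` and `bob` are illustrative names) share a Gaussian prior, keep gathering private noisy signals, and naively treat each other's announced estimate as one more noisy observation. Their positions wander from round to round while rapidly converging on each other, which is the "convergent random walk" shape. The naive update double-counts shared information, so take it as qualitative illustration only.

```python
import random


class Agent:
    """Toy Bayesian with a Gaussian belief about an unknown quantity theta."""

    def __init__(self, prior_mean, prior_var):
        self.mean = prior_mean  # current posterior mean
        self.var = prior_var    # current posterior variance

    def update(self, observation, obs_var):
        # Conjugate Gaussian update: the new mean is the precision-weighted
        # average of the old mean and the observation.
        precision = 1.0 / self.var + 1.0 / obs_var
        self.mean = (self.mean / self.var + observation / obs_var) / precision
        self.var = 1.0 / precision


random.seed(0)
theta = 10.0  # the quantity being argued about (unknown to both agents)
alice = Agent(prior_mean=0.0, prior_var=25.0)
bob = Agent(prior_mean=0.0, prior_var=25.0)

for round_ in range(1, 9):
    # Each agent gathers a fresh noisy private signal...
    alice.update(theta + random.gauss(0, 3), obs_var=9.0)
    bob.update(theta + random.gauss(0, 3), obs_var=9.0)
    # ...then they exchange current estimates and (naively) update on each
    # other's announcement as if it were one more noisy observation.
    bob.update(alice.mean, obs_var=max(alice.var, 1e-6))
    alice.update(bob.mean, obs_var=max(bob.var, 1e-6))
    print(f"round {round_}: alice={alice.mean:+7.3f}   bob={bob.mean:+7.3f}")
```

Run it and the two estimates jitter past each other on the way to agreement, rather than each side slowly yielding ground from a fixed opening position.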
You're right that not everything is zero-sum. But I suspect an overwhelming majority of the issues that come up for argument are very close to zero-sum.
not to attack your position, but just to further illustrate where i'm going with all this.
> ...rational Bayesians sharing information...
setting aside the inherent humor of economics (which is sadly completely lost on economists), have you ever wondered - why would a purely rational actor even participate in the discussion, or, as a matter of fact, in anything? why would a purely logical machine get out of bed in the morning? would it not need to "want" something first? what would an emotionless logical machine "think" about, and why would it think about that and not something else? intellect without emotion is nothing. not figuratively, not poetically, but literally nothing.
P.S. since when does the word "converge" apply to anything going on in a market? :P
Why do you think a rational actor has no goals, or is somehow emotionless?
The word "converge" has always applied to market responses to new info. Find a news event (e.g., WMT Oct 13-14) and look at the second or minute level movements near that event. It's qualitatively quite similar to Aaronson's theorem.
well, i'm definitely not educated on the terminology, particularly when it comes to economics. i guess we could clear out a lot of the misunderstanding here as soon as we sort the terms out ;)
so, just IMO, a perfectly rational actor should be emotionless, at least on the matter at hand. any kind of emotional tendency would skew their reasoning.
to add to what you said more above, about those fruitful discussions: i would bet that the people you had those fruitful discussions with have had another common characteristic. namely, they did not have much personal stake in the issue you were discussing. when people approach a discussion with only a desire to learn and improve their opinions, they can indeed have a quality discussion.
ultimately, i don't believe such neat separation of rational and irrational can ever work (aside from sometimes being a useful approximation). which is why i asked those silly questions - how can a rational actor want something, without wanting it? i find the concept very contradictory, borderline useless.
edit - i think i have to back down a bit, or maybe just clarify, dunno. i maybe get what you meant. people that appreciate rationality will tend to be better discussion partners even if they are personally affected. but it will be much harder...
> a perfectly rational actor should be emotionless.
Nope. A perfectly rational actor should have unskewed reasoning, yes, but you can (in principle) achieve that by making your emotions not skew your reasoning rather than by throwing your emotions away.
> why would a purely logical machine get out of bed in the morning? would it not need to "want" something first?
In a word - huh? An irrational person might wake up one morning with the urge to paint a picture - are you suggesting a purely rational person wouldn't feel such an urge, or that they wouldn't act on it? In either case, why not?
i'm suggesting "purely rational" prohibits the existence of emotion, otherwise we're dealing with a contradiction. sorry, i already replied to the peer comment to yours, maybe you can reply there?..
Unfortunately, /u/yummyfajitas is severely mischaracterizing the point of "rationality". Viz:
>The point of rationality is to detect your own cognitive failures and to recognize and exploit those of others.
This is false. The point of "rationality" is to achieve greater cognitive success: to have your thoughts yield information about the world by allowing the world to move your thoughts. The people who try to do this (such as, in this case, me) do so because we feel like our thoughts and emotions ought to be about stuff. The more I make my emotions be linked to my thoughts and my thoughts be linked to the real world, the less gnawing self-doubt I have to deal with when things go bad, and the more I can enjoy when things go right.
If all you can do is recognize cognitive failures, you will end up an epistemic relativist, which is useless.
If what you care about is exploiting the cognitive failures of others, you're just a jerk.
You're talking more about epistemic rationality - getting closer to the truth; 'yummyfajitas seems to be talking more about instrumental rationality - doing and thinking stuff that systematically yields success. But generally, you're right, and here:
> If what you care about is exploiting the cognitive failures of others, you're just a jerk.
I totally, 100% agree with you. Rationality is a tool; if you use it to exploit people, you're just a jerk.
Everyone exploits people in this way. Have you ever worn a suit or otherwise altered your appearance to influence the decisions of others? Ever built a landing page using a theme other than default HTML, in order to make people happier when reading? Ever noted an irrelevant shared interest ("hey we both love kale!") to someone you are trying to sell to, or otherwise influence the behavior of?
Is everyone a jerk?
In my view, a big failure of rationalists (coming from the typical mind fallacy, most likely) is that they put too little effort into manipulations of this sort. It's certainly a failing of mine.
Fair enough. I see what you're getting at, and it's indeed the basic way we communicate - by influencing each other.
I thought long about it and I'm still confused at some points, but I ended up viewing the issue through a lens of intent. Am I exploiting people by building a pretty website? Maybe, in a way that my actions cause them to spend more time on it. But if I do it with intention of helping them accomplish whatever they're looking to accomplish, that will be beneficial to them, then it's ok. If I'm doing it to trick them into wasting more time on my site full of half-assed linkbait content so that I earn money through them viewing ads, then I am a fucking jerk.
So no, not everyone is a jerk. Only those who seek to act to purposefully harm others (usually to gain something at their expense). Which sort of fits the very definition of the word "jerk".
Not everyone sells. Not everyone does any of the things you list. Not everyone's a jerk, at least at a conscious level (and I think people are correct to put more trust in people who will only manipulate unconsciously)