Alfred Mele is an American philosopher and the William H. and Lucyle T. Werkmeister Professor of Philosophy at Florida State University. He specializes in irrationality, akrasia, intentionality and philosophy of[…]
Human beings will make unwise decisions — and sometimes they’ll make radically unwise decisions. But we aren’t fundamentally rational or irrational creatures.
Question: Can humans act against their own better judgment?
Alfred Mele: Plato, or at least Socrates, whose views Plato expressed, had an opinion about this, and the idea was that as the tempting option becomes closer in time, more readily available, what happens is you switch your judgment, so that at the last second the student would always judge that really, it's best to go to the party.
Now, I myself don't think that happens, partly on the basis of personal experience, but also partly because there's this thing called experimental philosophy now, where instead of relying on our own personal intuitions about how things happen, we actually go out and do surveys of ordinary people. Of course, the ordinary people are usually undergraduates. But undergraduates are pretty ordinary, nice people, you know, just lay people. And one thing I did recently was a survey of, I think, about 90 undergraduates. And I said, "Does this ever happen to you? You judge that it would be best to do a certain thing, and best from your own point of view too, not the point of view of your peers or parents or whatever. And then, still believing that you should do this thing, you do something else instead. Does it ever happen?" And I had what's called a Likert scale that goes from one to seven, with strongly agree at one and strongly disagree at seven. And the mean rating was, as I recall, 1.32. So almost all of them agreed that they do it sometimes. Now, they could be wrong. You know, they could be fooling themselves. But I suspect they're right. And if you think about your own case, and I think about my own case, sometimes I am convinced that I shouldn't do a certain thing and I just think, "What the hell. I'll do it. I shouldn't, but I will." It's never anything really bad, but it might be something like smoking a cigarette. I'm trying to quit right now because I've had dental surgery. But on New Year's Eve I smoked a cigarette, and I thought I shouldn't. So, yeah, I think it happens.
And why should we think that it doesn't happen? Well, if we thought that what we're most strongly motivated to do always lines up with what we judge best, then, since we always do what we're most strongly motivated to do, we would think that whenever we do something we're judging it best. But there's just too much evidence against that.
Question: Can the mind ever really deceive itself, or does it choose what to believe or disbelieve?
Alfred Mele: What might make self-deception impossible is a certain model of it, and it's a traditional model: a two-person intentional deception model. So, if I'm going to deceive you into believing something, I've got to know that it's false and come up with a strategy for getting you to believe that it's true. And the normal strategy is lying, and then you trust me, let's say. So, if you use that model for self-deception, then in the same head you've got knowledge of what's true, the intention to get yourself to believe that it's false, and some kind of strategy for doing it. Now, that's very puzzling, or paradoxical. How are you going to pull it off? It's as though I said, "Look, I'm going to deceive you now into believing that I drive a Range Rover, and this is how I'm going to do it. I'm going to pull a picture of me next to a Range Rover out of my wallet and show it to you, and you're going to believe it's true, but really it's false." Well, is that going to work? No. No way, because you know what I'm up to, right?
So, if you put all this in one head, then it looks like the person knows what he's up to, so he can't possibly succeed. So, it's paradoxical. And also, the person would have to believe the truth and its opposite at the same time. They both have to be there in the same place.
So, one thing I do is reject this two-person model of self-deception, and I have a different kind of model. The way to think about it is this: self-deception is motivationally biased false belief. Now, how might this happen? Well, first some examples. There was a survey done of some professors in the '90s, and the professors were asked, how good are you? Rate yourself relative to other professors on a 100-point scale. And 96 percent of the professors rated themselves above average with respect to other professors. But of course that can't be right, and it's an amazing figure. That's just one example.
There was a study done in conjunction with the SAT, also in the '90s if I recall. Students were asked all kinds of things. One thing was, how good are you at getting along with others? And there was a scale on which you rated yourself. All of them rated themselves above average, and 25 percent rated themselves in the top 1 percent in ability to get along with others. And of course, you can only have 1 percent in the top 1 percent; you can't have 25 percent.
So, what’s going on is that people tend to overestimate themselves on good things. This doesn’t happen in all people. There’s a phenomenon called “depressive realism.” The people who are the most accurate about themselves are depressed people. And one thing we don’t know for sure is whether depression causes the accuracy, or the accuracy causes the depression. It could be the second way.
So people have evidence coming in, and evidence that points toward the truth of propositions they'd like to be true tends to be more salient, for these people, for me too, for all of us. And so it has a greater grip on what we believe than it really ought to have, given just its evidential merit.
There are other examples, non-statistical ones, just things that happen. Parents might believe that their kids, maybe young teenagers, are not using drugs. But the neighbors and other people, presented with the same evidence, don't believe that. They believe that these parents' kids are using drugs, and somehow the evidence doesn't get treated properly by the parents. One thing that happens is that when thinking about something makes people uncomfortable, they tend to stop thinking about it, so they don't absorb the negative evidence or give it as much weight as it deserves. And when thinking about whether little Johnny is using drugs, images of innocent Johnny out playing in the sandbox with his toy trucks might come to mind, and they absorb attention, and then what you're thinking is, "Oh jeez, a kid like this couldn't be using drugs."
So, little things like that add up to unwarranted beliefs, and usually the bias is motivated by what you'd like to be true. And I think that's what self-deception is. It's nothing really exotic; it's a very ordinary thing. And I don't think self-deception is always bad, either. It's probably good to overestimate yourself to some degree, a little bit, on a variety of points, because I think it gives you a little more confidence and enables you to function better and so on. Of course, you can't be telling yourself, "Hey, this is what I'm doing," because then it won't work.
Question: Do you see human beings as fundamentally rational creatures?
Alfred Mele: Now, that is a good question, because there are two different senses of rationality, and sometimes they get conflated; and within each sense there are subdivisions. But think about rocks. Are they rational or irrational? Well, they're neither, right? So there's rational as opposed to non-rational: I'm rational, a rock isn't. And to be rational in that sense, you just have to be able to understand, think, reason, come to conclusions, things like that. But then there's also rational as opposed to irrational. Are people fundamentally rational in that sense? I think so, because if we were to try to imagine somebody who was utterly irrational, how would we interpret that person or understand his behavior? It looks like there's got to be some kind of pattern to it in order for us to make any kind of assessment of how rational the person is. So, yeah, I think rationality is widespread, and that's a good thing. And irrationality is falling short of rationality.
Now, some people like to measure irrationality objectively, from an external point of view. I always feel awkward about doing that because I don't know individuals inside and out. So I like to measure it from a subjective point of view, that is, from the individual's own point of view. Practical irrationality in this sense would be a matter of believing that a certain thing is best to do from your own point of view and not doing it. People say that happens to them. That's irrational. Also, people might accept certain modes of reasoning as legitimate and then sometimes reason in ways that violate those modes, as in self-deception. You and I might think the best way to reason about what's true is to reason objectively and not be biased by one's desires and emotions. But sometimes we might reason in a way that is biased by our desires and emotions. And that would be subjectively irrational too. So, yeah, there's a lot of subjective irrationality, but I think by and large, people are rational.
Now, if you measure rationality objectively, you might come to a different conclusion. But then what you have to do is have your own view about what is really rational to do, independently of people's preferences and the like. I can't see myself doing that.
Question: Does the distinction between subjective and objective irrationality help explain why people don't act in their best financial interest?
Alfred Mele: Yeah, I think it's related. If it were just a game, well, I guess buying and selling and so on could be seen as a kind of game. If you know people's preferences and you know the probabilities, then you can deduce what the right option is. And of course, ordinary folks aren't going to be exactly on the ball all the time in that connection. So people will make unwise decisions, and sometimes they'll make radically unwise decisions. And often that happens because they're influenced by the salience of the evidence as opposed to its significance or importance. Here's an example. Why do car advertisers, instead of talking about all the properties of the cars, show really attractive people driving them, and then really attractive people looking at the drivers? Well, because they figure that attracts people's attention and increases the likelihood that they'll buy the car. And people are moved by things like that. They shouldn't be; they should be moved by the objective data. It's a little bit because it's so much more boring to look at the data; it's a little bit harder for people to do that. But this doesn't mean that there's some kind of fundamental defect in people. It's just that maybe they don't care enough about making the best decision to pay close attention to the data.
Recorded on January 5, 2010
Interviewed by Austin Allen