This time I read my 2022 paper in Review of Philosophy and Psychology titled “Great Minds Do Not Think Alike: Philosophers’ Views Predicted by Reflection, Education, Personality, and Other Demographic Differences”. As the title suggests, various psychological factors predicted variance in philosophers’ answers to classic philosophical questions. This raises questions about how psychological and demographic differences can explain philosophical differences. There are also implications for scientific psychologists and academic philosophers.
Below are the syllabus and materials for my Introduction to Philosophy course. You are welcome to use any of the material as a student or as an instructor. The usual creative commons license applies to my portion of this—i.e., only the stuff to which I would have a copyright. (If you are my student, remember that you can be quizzed on the contents of the syllabus.)
I. Introduction to Philosophy
Did you know that people who study philosophy make significantly fewer reasoning errors than others? (See Livengood et al 2010 and Byrd 2014). And did you know that philosophy majors outperform basically everyone else on the GRE? And did you know that the median mid-career salary for people who major in philosophy is $81,000? And did you know that philosophy majors were projected to be the top-paid humanities major in 2016? Find out more about philosophy majors here. And if you’ve never taken a philosophy class, you might want to read this 3-4 page intro.
I took a few courses in biblical studies and Christian apologetics as an undergraduate. The courses definitely influenced my thinking, but not in the way that I expected.
For years, I intended to study engineering. In my senior year of high school, I was admitted to a public school with a decent engineering program. But late in the summer, I changed my mind. I had recently become a Christian and I was dating someone who was going to a Christian college. And apparently that was enough to convince 18-year-old me that I should also go to a Christian college and study the Bible. (Aside: Can you believe that 18-year-old me was allowed to vote and serve on a jury?)
I signed up for Christian apologetics courses — as well as biblical studies courses — hoping to find compelling arguments to rationalize my relatively new faith. At first, the arguments seemed compelling. I remember being excited to take the arguments to unbelieving friends back home and see what they had to say.
But the more I thought about the arguments, the less compelling they seemed.
On Saturday, I was on the Veracity Hill Podcast talking about the evidence that atheists and agnostics reason more reflectively (i.e., make fewer errors) than theists.
- What do we mean by ‘reflective’? And how do we measure reflection? Who counts as a theist? And how do we measure religiosity?
- What do these findings about atheists and theists tell us about atheism and theism (if anything)? And how might further research answer hitherto unanswered questions about how atheists and theists reason?
- What are some related findings? For instance, what does this have to do with other philosophical beliefs?
When you step back and question your beliefs and assumptions, do you expect to change your mind? Should you? I think that reflective reasoning is supposed to change our minds. But it might not change our beliefs. Sometimes reflection reinforces our beliefs. And sometimes reflection makes our beliefs more extreme or partisan. I’ll explain below.
Cognitive Science investigates the mind with methods and tools from various fields like computer science, neuroscience, psychology, and philosophy. Here are some popular cognitive science podcasts. I listen to almost all of them, so feel free to contact me if you have questions that are not answered in each podcast’s description below.
“I would use the Department of Education … to monitor our institutions of higher education for extreme political bias and deny federal funding if it exists.” –Ben Carson
1. Everyone has biases — political and otherwise.
So denying funding on the basis of any political bias would be tantamount to denying all federal education funding. That’d be problematic. So — if we assume a charitable interpretation of Carson — that’s surely not the Republican plan (…or is it?). So let’s assume that Carson is not out to defund any educational institution that exhibits just any political bias.
Instead, maybe Carson’s plan is to monitor for particular biases. The idea here would be that only institutions with certain biases should be defunded. But even that would be problematic. After all, Carson is a human. And humans are more likely to notice and take issue with others’ biases (Corner et al 2012; Lord et al 1979) or biases that merely seem like others’ biases (Trouche et al 2015, 2018). So Carson might be more attuned to and dismissive of others’ biases than his own. And that itself is a political bias.
To overcome that bias, we would need to make sure that whoever monitors for political bias is not politically biased themselves.
The 2016 US election has many people thinking about third party candidates. Good news: philosophers and others have been sorting out the ethics and rationality of voting for a while now. I talk about the philosophy of third party voting with Kurt Jaros below:
2022 Update: My own results mentioned below replicated in a paper now published in Review of Philosophy and Psychology. Free paper, audiopaper, and link to the journal’s version here.
Philosophy helps us reason better, right? I mean, taking courses in analytic philosophy and argument mapping does more for students’ critical thinking than even critical thinking courses do (Alvarez-Ortiz 2007). And the more training one has in philosophy, the better one does on certain reasoning tasks (Livengood et al 2010). So it’s no accident that philosophy majors tend to outperform almost every other major on the GRE, the GMAT, and the LSAT (“Why Study Philosophy…“; see also Educational Testing Service 2014). That’s why people like Deanna Kuhn have such high praise for philosophers’ reasoning (Kuhn 1991, 258-262).†
Reasoning expertise: We turn now to the philosophers…. The performance of the philosophers is not included in table form because it is so easily summarized. No variation occurs…philosophers [show] perfect performance in generation of genuine evidence, alternative theories, counterarguments, and rebuttals…. The philosophers display a sophisticated understanding of argumentative structure…. None of the philosophers [had] any special expertise in any of the content domains that the questions address…. The performance of philosophers shows that it is possible to attain expertise in the reasoning process itself, independent of any particular content to which the reasoning is applied.
But there’s much more to say about this. For instance, we might ask two questions about this evidence.
It’s one thing to claim that philosophers are better reasoners, but better reasoning is not the same as perfect reasoning. After all, philosophers might reason better than others and yet still be vulnerable to systematic reasoning errors. So we need to ask: Are philosophers prone to cognitive errors like everyone else?
Also, if philosophers are prone to cognitive error, what is the relationship between their errors and their philosophical views?
1. Are Philosophers Prone To Cognitive Error?
In order to understand the rest of the post, you will need to answer the question below. It should only take a moment.

A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?

The question comes from the Cognitive Reflection Test (CRT) (Frederick 2005). It is designed to elicit a quick answer. What answer first came to your mind?
If you are like most people, one answer quickly came to mind: “10 cents.” And if you are like many people, you had an intuitive sense that this answer was correct. Alas, 10 cents is not correct. You can work out the correct answer on your own if you like. The point I want to make is this: the intuitively correct answer to this question is demonstrably false. This suggests that answering this question intuitively constitutes an error in reasoning.
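For readers who want to check their work, here is a minimal sketch of the arithmetic behind the CRT’s bat-and-ball question (a bat and a ball cost $1.10 total; the bat costs $1.00 more than the ball), working in cents to avoid floating-point noise:

```python
# Bat-and-ball question from the CRT: bat + ball = 110 cents,
# and the bat costs 100 cents more than the ball.
total_cents = 110
difference_cents = 100

# The intuitive answer, "10 cents," fails the check:
# a 10-cent ball makes the bat 110 cents, for 120 cents total.
intuitive_ball = 10
assert intuitive_ball + (intuitive_ball + difference_cents) != total_cents

# Solving the two equations (bat + ball = 110, bat - ball = 100):
ball = (total_cents - difference_cents) // 2  # 5 cents
bat = ball + difference_cents                 # 105 cents
assert ball + bat == total_cents
print(f"ball = {ball} cents, bat = {bat} cents")  # ball = 5 cents, bat = 105 cents
```

The point is that the intuitive answer is one quick subtraction away from being checked, yet most people never run the check.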
It turns out that philosophers are less likely than others to make this error.
Jonathan Livengood and colleagues found that the more philosophical training one had, the less likely one was to make this error (Livengood et al 2010). I replicated this finding a few years later (Byrd 2014). Specifically, I found that people who had — or were candidates for — a Ph.D. in philosophy were significantly less likely than others to make this reasoning error — F(1, 558) = 15.41, p < 0.001, d = 0.32 (ibid.).
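As a rough sanity check on statistics like these (not part of the original analyses), an effect size such as Cohen’s d can be recovered from an F statistic with 1 numerator degree of freedom via the identity t = √F and, assuming two groups of roughly equal size, d = 2t/√df:

```python
import math

# Recovering Cohen's d from the reported F(1, 558) = 15.41, d = 0.32.
# For a two-group comparison, t = sqrt(F); assuming roughly equal group
# sizes, d = 2 * t / sqrt(df_error). With unequal groups the value shifts.
F, df_error = 15.41, 558
t = math.sqrt(F)
d = 2 * t / math.sqrt(df_error)
print(round(d, 2))  # close to the reported d = 0.32
```

Applied to the other statistics below, this equal-groups approximation will not always land so close, which may simply reflect unequal group sizes in those comparisons.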
Some philosophers performed perfectly on the CRT — even after controlling for whether philosophers were familiar with the CRT. However, many philosophers did not perform perfectly. Many philosophers made the error of responding unreflectively on one or two of the CRT questions. This implies an answer to our first question.
Answer: Yes. Philosophers’ reasoning is susceptible to systematic error.
So what about our second question?
2. Do Philosophers’ Errors Predict Their Views?
Among lay reasoners, the tendency to make this reasoning error on the CRT has correlated with believing that God exists, that immortal souls exist, that life experiences can count as evidence that a god exists, etc. (Shenhav, Rand, and Greene 2012). This finding is in line with a common theme in the research on reasoning: unreflective reasoning correlates with a bunch of religious, supernatural, and paranormal beliefs (Aarnio and Lindeman 2005; Bouvet and Bonnefon 2015; Giannotti et al 2001; Pennycook et al 2012; Pennycook et al 2013; Pennycook et al 2014a, 2014b).
And this finding has now been replicated among philosophers. Specifically, the more that philosophers were lured into the intuitively appealing yet incorrect answers on the CRT (e.g., “10 cents”), the more that they leaned toward or accepted theism — F(1, 559) = 7.3, p < 0.01, d = 0.16, b = 0.12 (Byrd 2014).
There is also evidence that people who make this error on the CRT are more prone to certain moral judgments. To see what I mean, consider the classic trolley scenario (Foot 1967): a runaway trolley is headed toward five people, who will be killed unless you pull a switch that diverts the trolley onto a side track, where it will kill one person instead.
So? Would you pull the switch or not? Those who answered unreflectively on the CRT have been less likely to pull the switch (Paxton, Ungar, and Greene 2012).
Once again, it turns out that this finding holds among philosophers as well. Philosophers who were more likely to make a reasoning error on the CRT were less likely to pull the switch — F(1, 559) = 6.93, p < 0.001, d = 0.15, b = 0.17 (Byrd 2014).
Philosophers’ proclivity to make this error was also positively associated with other philosophical views:
- Physical (as opposed to psychological) views of personal identity — F(1, 558) = 8.57, p < 0.001, d = 0.17.
- Fregeanism (as opposed to Russellianism) about language — F(1, 558) = 8.59, p < 0.01, d = 0.17.
I have lots of thoughts about these findings, but I want to keep things brief. For now, consider the implied answer to our second question.
Answer: Yes. Philosophers’ reasoning errors are related to their views.
So there you have it. It would seem that philosophers are susceptible to systematic reasoning errors. And insofar as philosophers are so susceptible, they tend toward certain views. I’m tempted to say more, but I’ve already done so elsewhere (Byrd 2014) and I am working on a pre-registered replication of these findings for—among other things—my dissertation.††
† Thanks to Greg Ray for pointing me to this passage.
†† What does the rest of the literature suggest about philosophers’ reasoning? Unsurprisingly, the verdict is disputed (Nado 2014; Machery 2015; Mizrahi 2015; Rini 2015). First, philosophers seem susceptible to the same tricks as anyone else (Schwitzgebel and Cushman 2015; Pinillos et al 2011). Second, even if philosophers are better reasoners, it’s not clear why they are better (Clarke 2013). I sketch an account in Byrd 2014, Section 3 (see also Weinberg, Gonnerman, Buckner, and Alexander 2010).
This paper attempts to specify the conditions under which a psychological explanation can undermine or debunk a set of beliefs. The focus will be on moral and religious beliefs, where a growing debate has emerged about the epistemic implications of cognitive science. Recent proposals by Joshua Greene and Paul Bloom will be taken as paradigmatic attempts to undermine beliefs with psychology. I will argue that a belief p may be undermined whenever: (i) p is evidentially based on an intuition which (ii) can be explained by a psychological mechanism that is (iii) unreliable for the task of believing p; and (iv) any other evidence for belief p is based on rationalization. I will also consider and defend two equally valid arguments for establishing unreliability: the redundancy argument and the argument from irrelevant factors. With this more specific understanding of debunking arguments, it is possible to develop new replies to some objections to psychological debunking arguments from both ethics and philosophy of religion.