This time I read my 2022 paper in Review of Philosophy and Psychology titled “Great Minds Do Not Think Alike: Philosophers’ Views Predicted by Reflection, Education, Personality, and Other Demographic Differences”. As the title suggests, various psychological factors predicted variance in philosophers’ answers to classic philosophical questions. This raises questions about how psychological and demographic differences can explain philosophical differences. There are implications for scientific psychologists as well as academic philosophers.
Continue reading Upon Reflection, Ep. 10: Great Minds Do Not Think Alike
Data quality on Amazon Mechanical Turk (mTurk) has suffered for years now (Chandler & Paolacci, 2017; Moss & Litman, 2018; Chmielewski & Kucker, 2019; Ahler et al., 2020; Kennedy et al., 2020; MacInnis et al., 2020). There are a few ways to protect online survey data quality. In this post, I will briefly cover five strategies for weeding out junk data in online research (not just via mTurk), from easiest to hardest.
Continue reading 5 Ways To Overcome Junk Data From mTurk (and online surveys more generally)
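Two common screening strategies of this kind are attention checks and completion-time floors. Here is a minimal sketch in Python; the field names, example records, and 60-second threshold are illustrative assumptions, not taken from the post itself:

```python
# Hypothetical survey responses; field names and values are illustrative.
responses = [
    {"id": 1, "attention_check": "pass", "seconds_elapsed": 310},
    {"id": 2, "attention_check": "fail", "seconds_elapsed": 295},
    {"id": 3, "attention_check": "pass", "seconds_elapsed": 42},
    {"id": 4, "attention_check": "pass", "seconds_elapsed": 280},
]

MIN_SECONDS = 60  # illustrative floor for a plausible completion time

def keep(response):
    """Retain a response only if it passed the attention check
    and was not completed implausibly fast."""
    return (response["attention_check"] == "pass"
            and response["seconds_elapsed"] >= MIN_SECONDS)

clean = [r for r in responses if keep(r)]
print([r["id"] for r in clean])  # respondents 1 and 4 survive the screen
```

In practice the threshold would be calibrated to the survey's length (e.g., a fraction of the median completion time) rather than hard-coded.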
Philosophers and cognitive scientists tend to think that reflective reasoning will improve our judgments and decisions. The idea is that reflection will lead us to test our judgments by “looking for their coherence with our beliefs about similar cases and our beliefs about a broader range of …issues” a la reflective equilibrium. This sounds intuitively plausible. But is it true? In this post I briefly present some research suggesting that reflective reasoning often, but not always, improves our judgments and decisions.
Continue reading What good is reflective reasoning?
I have had some side gigs in graduate school that involved creating invoices for hourly work—web development, copyediting, research assistance, etc. I used Toggl to log my time. At some point, I realized that I could log all of my work time—not just the billable time. So in 2018 and 2019, I logged all of my work time. In this post, I will summarize the 2018 and 2019 data and mention some takeaways for 2020.
Continue reading Two Years In The Life Of A Grad Student: Time Logging Data
One of the questions I get a lot these days is, “How can I learn data analysis without actually taking a statistics course?” In this post, I will relay my answers to that question.
From September 5 to September 30, there is an exciting, free, online conference about the philosophy and science of mind: the (second annual) Minds Online conference! Loads of wonderful scholars are sharing and commenting on each other’s research — and you can access and participate in all of it!
Here are a few things to note for those who are new to online conferences.
- Sessions: There are four sessions, each with a different topic and its own keynote.
- Timeline: Each session lasts one week. (So the conference lasts four weeks.)
- Participating: You can read papers starting the weekend before their session. And you can comment on papers Monday through Friday of their session.
So head on over and enjoy the wonder that is conferencing from the comfort of your home, office, favorite coffee shop, etc.
Here’s the program: http://mindsonline.philosophyofbrains.com/minds-online-2016-program/
I see more fact-checking on Facebook than I used to. While I’m glad to see fact-checking catching on, fact-checking isn’t enough — or so I’ll argue in this post.
1. Fact-checking: The problem
Let’s say that you and I agree on all the facts. Now let’s say that we start arguing. Will we agree? Will we even argue well? Not necessarily!
After all, we can reason badly even if we agree on the facts. Specifically, we can jump to conclusions that don’t follow from the facts. So fact-checking our argument(s) won’t necessarily fix all the problems with our argument(s).
2. Bad Arguments
Consider some of the claims that people make:
- The new federal healthcare policy caused …
Continue reading Fact-checking is not enough: We need argument-checking