How should a layperson think (or form belief attitudes) about any given statement? Is it enough to research reliable sources and use logical, unbiased thinking? In Chapter 1, I will argue that it is not and that a layperson should instead defer to the majority testimony of experts on that statement, if there are any experts on that statement.
For any statement (or proposition), p, there are three possible belief attitudes (or doxastic attitudes, in technical jargon) toward p: to believe p (which is consistent with having a range of degrees of confidence in p), to disbelieve p (to believe that p is false), and to withhold belief about p (to withhold belief about whether p is true or false).
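In schematic form (the notation here is my own, purely illustrative): for a subject $S$ and a proposition $p$, the three attitudes are mutually exclusive and jointly exhaustive options,

$$\mathrm{Bel}_S(p), \qquad \mathrm{Bel}_S(\neg p), \qquad \mathrm{W}_S(p),$$

where $\mathrm{Bel}_S(\neg p)$ captures disbelief and $\mathrm{W}_S(p)$ captures withholding.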
Doxastic attitudes may not be directly voluntary, but they can be informed by guiding principles that are sufficiently plausible or well evidenced on their face. Consider the following prima facie plausible principle: unless there are overriding moral objections, one should form epistemically rational doxastic attitudes, that is, doxastic attitudes that best serve one's epistemic ends, namely acquiring true beliefs and avoiding false beliefs, each to some appropriate extent.
Generally, when choosing between courses of action with a view to achieving one's ends, false beliefs are more undesirable than true beliefs are desirable: a single error or miscalculation is often enough to undermine such a decision. Nevertheless, true beliefs remain essential to choosing the best courses of action in pursuit of one's goals. One should therefore aim to maximize one's subjective chances of holding a favorable ratio of true beliefs to false beliefs (Alston, 1985), rather than the highest absolute number of true beliefs or the lowest absolute number of false beliefs.
It follows that unless there are overriding moral objections, one should acquire epistemically rational doxastic attitudes—that is, doxastic attitudes that maximize one’s subjective chances of holding a favorable ratio of true beliefs to false beliefs.
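As a rough formalization of this aim (my own gloss, not a formula from Alston): let $T(\delta)$ and $F(\delta)$ be the numbers of true and false beliefs one would end up holding under an assignment of doxastic attitudes $\delta$. The goal is then to choose $\delta$ so as to maximize one's subjective expectation of the truth ratio,

$$\delta^{*} = \arg\max_{\delta} \; \mathbb{E}\!\left[\frac{T(\delta)}{T(\delta) + F(\delta)}\right],$$

rather than to maximize $\mathbb{E}[T(\delta)]$ alone or to minimize $\mathbb{E}[F(\delta)]$ alone.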
Now, avoiding fallacies, mitigating one's cognitive biases, and relying on trustworthy sources of information are the most commonly discussed ways of promoting truth-conducive thinking (in other words, of acquiring epistemically rational doxastic attitudes). But for any given proposition, unless one is an expert on it, there is a way of thinking about it that is considerably more truth-conducive (or epistemically rational) yet rarely, if ever, discussed outside specialist academic circles: following the rules of epistemic rationality that govern how laypeople should learn from the testimony of experts.
There are cases of credentialed experts defending fringe claims in reliable sources, such as academic journals, without apparently committing any logical fallacies and without showing any overt signs of bias (Curry, 2011; Lemonick, 2010). This is to be expected in the sciences and in philosophy, where progress often requires that individual scientists and philosophers challenge mainstream or even consensus views.
Laypeople who have learned to discern logical fallacies, cognitive biases, and reputable versus disreputable sources of information will nonetheless be poorly equipped to rationally assess the statements of such credentialed experts unless and until they become acutely aware of the rules of epistemic rationality that govern how they should learn from expert testimony, that is, how they should adjust their own doxastic attitudes in light of the doxastic attitudes that experts hold.
These rules of epistemic rationality can be summarized as follows: for any proposition, p, a layperson with respect to p is rationally required to believe p if and only if p is believed by the majority of experts on p. There are exceptions to this rule (see Lahno, 2014; see also Coady, 2012; Goldman, 2001; Huemer, 2005). Of course, nothing in principle stops a layperson from becoming an expert on p; in that case, several alternative belief-forming methodologies are available to him or her in the published literature (Matheson, 2015).
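Stated schematically (again, the notation is mine): let $E_p$ be the set of experts on $p$. Then, for a layperson $L$ with respect to $p$,

$$L \text{ is rationally required to believe } p \iff \big|\{\,e \in E_p : \mathrm{Bel}_e(p)\,\}\big| > \tfrac{1}{2}\,\big|E_p\big|,$$

subject to the exceptions noted above (Lahno, 2014).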
Anyone can be an expert on any given proposition, regardless of their academic or professional credentials, provided that they have studied the evidence bearing on that proposition sufficiently thoroughly and impartially (Croce, 2019). The key word here is “sufficiently”. Given the Dunning–Kruger effect (i.e., “difficulties in recognizing one’s own incompetence lead to inflated self-assessments”[1]), those who lack the academic or professional credentials of credentialed experts should exercise caution in self-identifying as experts, and others should likewise exercise caution in identifying them as experts.
In the case of philosophical propositions, especially ethical and religious ones, it is disputed whether there are any genuine experts at all (De Cruz, 2018; Frances, 2018; Matheson, McElreath, & Nobis, 2018).
Bibliography
Alston, W. (1985). Concepts of epistemic justification. The Monist, 68(1), 57–89. https://doi.org/10.5840/monist198568116
Coady, D. (2012). What to Believe Now: Applying Epistemology to Contemporary Issues. Chichester: Wiley-Blackwell.
Croce, M. (2019). On What it Takes to be an Expert. The Philosophical Quarterly, 69(274), 1–21. https://doi.org/10.1093/pq/pqy044
Curry, J. (2011). Reasoning about climate uncertainty. Climatic Change, 108, 723–732. https://doi.org/10.1007/s10584-011-0180-z
Curtis, G. N. (n.d.). The Fallacy Files. Retrieved April 8, 2021, from http://www.fallacyfiles.org/
De Cruz, H. (2018). Religious Disagreement. Cambridge: Cambridge University Press.
Dowden, B. (n.d.). Fallacies. The Internet Encyclopedia of Philosophy. Retrieved April 8, 2021, from https://www.iep.utm.edu/fallacy/
Dunning, D. (2014). We are all confident idiots. Pacific Standard, 7(6), 46–54. https://psmag.com/social-justice/confident-idiots-92793
Frances, B. (2018). Philosophical Expertise. In D. Coady & J. Chase (Eds.), The Routledge Handbook of Applied Epistemology (Routledge Handbooks in Philosophy) (pp. 297–306). Routledge. https://doi.org/10.4324/9781315679099
Goldman, A. (2001). Experts: Which Ones Should You Trust? Philosophy and Phenomenological Research, 63(1), 85–110. https://doi.org/10.1111/j.1933-1592.2001.tb00093.x
Huemer, M. (2005). Is Critical Thinking Epistemically Responsible? Metaphilosophy, 36(4), 522–531. https://doi.org/10.1111/j.1467-9973.2005.00388.x
Is My Source Credible? (n.d.). University of Maryland University College. Retrieved April 8, 2021, from https://sites.umuc.edu/library/libhow/credibility.cfm
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. https://doi.org/10.1037/0022-3514.77.6.1121
Lahno, B. (2014). Challenging the majority rule in matters of truth. Erasmus Journal for Philosophy and Economics, 7(2), 54–72. https://doi.org/10.23941/ejpe.v7i2.167
Lemonick, M. D. (2010, November 1). Climate heretic: Judith Curry turns on her colleagues. Nature. https://doi.org/10.1038/news.2010.577
Matheson, J. (2015). Disagreement and Epistemic Peers. Oxford Handbooks Online. https://doi.org/10.1093/oxfordhb/9780199935314.013.13
Matheson, J., McElreath, S., & Nobis, N. (2018). Moral Experts, Deference & Disagreement. In J. C. Watson & L. K. Guidry-Grimes (Eds.), Moral Expertise: New Essays from Theoretical and Clinical Bioethics (pp. 87–106). Springer. https://doi.org/10.1007/978-3-319-92759-6_5
[1] Kruger & Dunning, 1999.