Moralistic fallacy

The moralistic fallacy is the informal fallacy of assuming that an aspect of nature cannot exist because it would have socially unpleasant consequences. Its typical form is "if X were true, then Z would follow!", where Z is something morally, socially, or politically undesirable. What ought to be moral is assumed a priori to be naturally occurring as well. The moralistic fallacy is conventionally presented as the reverse of the naturalistic fallacy. It can, however, be seen as a variation of the very same error, the difference between the two being pragmatic and depending on the intentions of whoever commits it: the naturalistic fallacy when the aim is to justify existing social practices on the grounds that they are natural, the moralistic fallacy when the aim is to combat existing social practices by denying that they are natural.
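Put schematically (an illustrative reconstruction of the forms just described, not a formalization drawn from the cited sources), the two fallacies run in opposite directions:

Moralistic fallacy: Z is morally undesirable; if X were a fact of nature, Z would follow; therefore, X is not a fact of nature.

Naturalistic fallacy: X is a fact of nature; therefore, X is morally acceptable.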

Examples

Steven Pinker writes that "[t]he naturalistic fallacy is the idea that what is found in nature is good. It was the basis for social Darwinism, the belief that helping the poor and sick would get in the way of evolution, which depends on the survival of the fittest. Today, biologists denounce the naturalistic fallacy because they want to describe the natural world honestly, without people deriving morals about how we ought to behave (as in: If birds and beasts engage in adultery, infanticide, cannibalism, it must be OK)." Pinker goes on to explain that "[t]he moralistic fallacy is that what is good is found in nature. It lies behind the bad science in nature-documentary voiceovers: lions are mercy-killers of the weak and sick, mice feel no pain when cats eat them, dung beetles recycle dung to benefit the ecosystem and so on. It also lies behind the romantic belief that humans cannot harbor desires to kill, rape, lie, or steal because that would be too depressing or reactionary."[1]

Moralistic fallacy: it would be depressing or reactionary if humans harbored natural desires to kill, rape, lie, or steal; therefore, humans cannot harbor such desires.

Naturalistic fallacy: birds and beasts engage in adultery, infanticide, and cannibalism; therefore, such behavior must be acceptable for humans.

Effects on science and society

Basic scientific findings or interpretations are sometimes rejected, or their pursuit, development, or acknowledgement is opposed or restricted, on the grounds that the knowledge could be misused or is potentially harmful.

In the late 1970s, Bernard Davis, responding to growing political and public calls to restrict basic research (as opposed to applied research) amid criticisms of dangerous knowledge (as opposed to dangerous applications), applied the term moralistic fallacy in its present sense.[2]

(The term had been used as early as 1957, with at least somewhat different import.[3])

In natural science, the moralistic fallacy can result in the rejection or suppression of basic science, whose goal is understanding the natural world, on account of its potential misuse in applied science, whose goal is the development of technology or technique.[4] This blurs the distinction between scientific assessment, which is the province of the natural sciences (such as physics and biology), and significance assessment, which is weighed in the social sciences (such as social psychology, sociology, and political science) and the behavioral sciences (such as psychology).

Davis asserted that in basic science, the descriptive, explanatory, and thus predictive power of information is primary, not its origin or its applications, since knowledge cannot be ensured against misuse, and misuse cannot falsify knowledge. Both the misuse of scientific knowledge and its prevention or suppression can have undesirable effects. In the early 20th century, the development of the basic science of quantum physics enabled the atomic bomb through applied science in the mid 20th century; had quantum physics been suppressed, however, much of the technology of communications and imaging, developed through other applied science, would also have been forgone.

Scientific theories with abundant research support can be discarded in public debates, where general agreement is decisive but can be utterly mistaken.[5] The obligation of basic scientists to inform the public can be stymied, however, by contrary voices both rousing alarm and touting assurances of public protection.[6] Davis held that greater and clearer familiarity with the uses and limitations of science would more effectively prevent the misuse of knowledge or harm from it.[7]

Natural science can help humans understand the natural world, but it cannot make policy, moral, or behavioral decisions.[7] Questions involving values—what people should do—are more effectively addressed through discourse in the social sciences, not by restriction of basic science.[7] Misunderstanding of the potential of science, together with misplaced expectations, has created moral and decision-making impediments, but suppressing science is unlikely to resolve these dilemmas.[7]

Seville Statement on Violence

The Seville Statement on Violence was adopted, in Seville, Spain, on 16 May 1986, by an international meeting of scientists convened by the Spanish National Commission for UNESCO. UNESCO adopted the statement, on 16 November 1989, at the twenty-fifth session of its General Conference. The statement purported to refute "the notion that organized human violence is biologically determined".[8]

Some, including Steven Pinker,[9] have criticized the Seville Statement as an example of the moralistic fallacy. Research in evolutionary psychology and neuropsychology suggests that human violence has biological roots.[10][11]

References

  1. Sailer, Steve (October 30, 2002). "Q&A: Steven Pinker of 'Blank Slate'". UPI. Archived from the original on December 5, 2015. Retrieved December 5, 2015.
  2. Davis BD (1978). "The moralistic fallacy". Nature. 272 (5652): 390. doi:10.1038/272390a0. PMID 11643452.
  3. Moore EC (1957). "The Moralistic Fallacy". The Journal of Philosophy. 54 (2): 29–42. doi:10.2307/2022356. JSTOR 2022356.
  4. Davis BD (2000). "The scientist's world". Microbiol Mol Biol Rev. 64 (1): 1–12. doi:10.1128/MMBR.64.1.1-12.2000. PMC 98983. PMID 10704471.
  5. Kreutzberg GW (2005). "Scientists and the marketplace of opinions". EMBO Rep. 6 (5): 393–6. doi:10.1038/sj.embor.7400405. PMC 1299311. PMID 15864285.
  6. Davis BD (2000), section "Technology".
  7. Davis BD (2000), section "Limited scope of science".
  8. Suter, Keith (2005). 50 Things You Want to Know About World Issues... But Were Too Afraid to Ask. Milson's Point, NSW, Australia: Transworld Publishers. ISBN 978-1-86325-503-5.
  9. Pinker, Steven. How the Mind Works. New York: W. W. Norton & Company, 1997, pp. 44 and 49.
  10. Jones D (2008). "Human behaviour: Killer instincts". Nature. 451 (7178): 512–5. doi:10.1038/451512a. PMID 18235473.
  11. May ME & Kennedy CH (2009). "Aggression as positive reinforcement in mice under various ratio- and time-based reinforcement schedules". Journal of the Experimental Analysis of Behavior. 91 (2): 185–96. doi:10.1901/jeab.2009.91-185. PMC 2648522. PMID 19794833.