Moral psychology

Moral psychology is a field of study in both philosophy and psychology. Some use the term "moral psychology" relatively narrowly to refer to the study of moral development.[1] However, others tend to use the term more broadly to include any topics at the intersection of ethics, psychology, and philosophy of mind.[2] Some of the main topics of the field are moral judgment, moral reasoning, moral sensitivity, moral responsibility, moral motivation, moral identity, moral action, moral development, moral diversity, moral character (especially as related to virtue ethics), altruism, psychological egoism, moral luck, moral forecasting, moral emotion, affective forecasting, and moral disagreement.[3][4]

Moral psychology is a relatively young branch within the field of psychology. The study of moral identity development is one of the areas with the most potential for growth, given the many open questions about its structure, mechanisms, and dynamics.[5] A moral act is behavior that has either a moral or an immoral consequence. Moral psychology can be applied across a broad range of studies, including philosophy and psychology, though how it is applied varies by culture. In many cultures, a moral act is one that entails free will, purity, liberty, honesty, and meaning, while an immoral act is one that entails corruption and fraudulence and usually leads to negative consequences.

Some psychologists who have worked in the field are Jean Piaget, Lawrence Kohlberg, Elliot Turiel, Jonathan Haidt, Linda Skitka, Leland Saunders, Marc Hauser, C. Daniel Batson, Jean Decety, Joshua Greene, A. Peter McGraw, Philip Tetlock, Darcia Narvaez, Tobias Krettenauer, Liane Young and Fiery Cushman. Some philosophers who have worked in the field are Stephen Stich, John Doris, Joshua Knobe, John Mikhail, Shaun Nichols, Thomas Nagel, Robert C. Roberts, Jesse Prinz, Michael Smith, and R. Jay Wallace.

Background

Moral psychology began with early philosophers such as Aristotle, Plato, and Socrates, who believed that "to know the good is to do the good" and who analyzed the ways in which people make decisions with regard to moral identity. The question of good versus evil has been studied since moral psychology became accepted as a formal branch of psychology and philosophy, and the field continues to expand. As psychology diverged from philosophy, moral psychology grew to include risk perception and moralization, morality in medical practice, concepts of self-worth, and the role of emotions in one's moral identity.

In most introductory psychology courses, students encounter moral psychology through the work of Lawrence Kohlberg,[6] who introduced his theory of moral development in 1969. The theory built on Piaget's observation that children develop intuitions about justice that they can only later articulate; increasing sophistication in articulating one's reasoning is a sign of development. In Kohlberg's account, moral cognitive development centers on justice and guides moral action, culminating in a postconventional thinker who can "do no other" than what is reasoned to be the most moral action. But researchers using the Kohlberg model found a gap between what people said was most moral and the actions they took. Today, many psychologists instead rely on Blasi's self-model, which links moral judgment to moral action through moral commitment: those with moral goals central to their self-concept are more likely to take moral action because they feel personally responsible for doing so, and those who are so motivated attain a distinctive moral identity.[5]

History

Historically, early philosophers such as Aristotle and Plato engaged in both empirical research and a priori conceptual analysis about the ways in which people make decisions about issues that raise moral concerns. Moral psychological issues have been central theoretical issues explored by philosophers from the early days of the profession right up until the present. With the development of psychology as a discipline separate from philosophy, it was natural for psychologists to continue pursuing work in moral psychology, and much of the empirical research of the 20th century in this area was completed by academics working in psychology departments.

Today moral psychology is a thriving area of research in both philosophy and psychology, often at an interdisciplinary level.[7] For example, the psychologist Lawrence Kohlberg questioned boys and young men about their thought processes when faced with a moral dilemma,[8][9] producing one of many influential empirical studies in moral psychology. As another example, the philosopher Joshua Knobe completed an empirical study on how the way an ethical problem is phrased dramatically affects an individual's intuitions about the proper moral response to it. More conceptually focused research has been carried out by researchers such as John Doris. Doris (2002) discusses how social psychological experiments, such as the Stanford prison experiment, and the idea of situationism call into question a key component of virtue ethics: the idea that individuals have a single, environment-independent moral character. As a further example, Shaun Nichols (2004) examines how empirical data on psychopathology suggest that moral rationalism is false.

Measures

Philosophers and psychologists have created structured interviews and surveys as a means to study moral psychology and its development.

Interview techniques

Since at least 1894, philosophers and psychologists have attempted to evaluate the morality of an individual, especially to distinguish adults from children in terms of their judgment, but early efforts failed because they "attempted to quantify how much morality an individual had—a notably contentious idea—rather than understand the individual's psychological representation of morality."[10] Lawrence Kohlberg addressed that difficulty in 1963 by modeling evaluative diversity as a series of developmental stages (à la Jean Piaget). Kohlberg's stages of moral development are:[11]

  1. Obedience and punishment orientation
  2. Self-interest orientation
  3. Interpersonal accord and conformity
  4. Authority and social-order maintaining orientation
  5. Social contract orientation
  6. Universal ethical principles

Stages 1 and 2 are combined into a single level labeled "pre-conventional", and stages 5 and 6 into a single level labeled "post-conventional"; psychologists can consistently categorize subjects into the resulting four stages using the "Moral Judgement Interview", which asks subjects why they endorse the answers they give to a standard set of moral dilemmas.[12]
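The four-way categorization described above can be sketched as a simple mapping. The function below is purely illustrative, not part of any published scoring instrument; it only encodes the merging of stages described in the text, using the stage labels from the list above.

```python
# Illustrative sketch only: collapse Kohlberg's six stages into the
# four categories described above (stages 1-2 merged, stages 3 and 4
# kept separate, stages 5-6 merged).

def mji_category(stage: int) -> str:
    """Map a Kohlberg stage number (1-6) to one of four categories."""
    if stage in (1, 2):
        return "pre-conventional"
    if stage == 3:
        return "interpersonal accord and conformity"
    if stage == 4:
        return "authority and social-order maintaining"
    if stage in (5, 6):
        return "post-conventional"
    raise ValueError(f"unknown stage: {stage}")
```

For instance, a subject whose dominant reasoning is stage 2 and one at stage 1 both land in the "pre-conventional" category, while stages 3 and 4 remain distinct.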

Rather than confirm the existence of a single highest stage, Larry Walker's cluster analysis of a wide variety of interview and survey variables for moral exemplars found three types: The "caring" or "communal" cluster was strongly relational and generative, the "deliberative" cluster had sophisticated epistemic and moral reasoning, and the "brave" or "ordinary" cluster was less distinguished by personality.[13]

Survey instruments

Between 1910 and 1930, in the United States and Europe, several morality tests were developed to classify subjects as fit or unfit to make moral judgments.[10][14] Test-takers would classify or rank standardized lists of personality traits, hypothetical actions, or pictures of hypothetical scenes. As early as 1926, catalogs of personality tests included sections specifically for morality tests, though critics persuasively argued that they merely measured awareness of social expectations.[15]

Meanwhile, Kohlberg inspired a new wave of morality tests. The Defining Issues Test (DIT, dubbed "Neo-Kohlbergian" by its proponents) scores relative preference for post-conventional justifications,[16] and the Moral Judgment Test (MJT) scores the consistency of one's preferred justifications.[17] Both treat evaluative ability as similar to IQ (hence the single score), allowing categorization by high versus low score.
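As a rough sketch of what "scoring relative preference for post-conventional justifications" could look like, the snippet below computes a DIT-style percentage from ranked justification items. The rank weights and item-stage assignments here are illustrative assumptions, not the actual DIT scoring key.

```python
# Simplified, illustrative sketch of a DIT-style score: the share of a
# respondent's ranking weight given to post-conventional (stage 5 or 6)
# justification items. Weights and stage assignments are assumptions.

RANK_WEIGHTS = {1: 4, 2: 3, 3: 2, 4: 1}  # top-ranked item counts most

def p_style_score(rankings):
    """rankings: list of (rank, stage) pairs, one per ranked item."""
    total = sum(RANK_WEIGHTS[rank] for rank, _stage in rankings)
    post = sum(RANK_WEIGHTS[rank] for rank, stage in rankings if stage >= 5)
    return 100.0 * post / total

# One dilemma, four ranked items: the respondent's first- and
# third-ranked items appeal to post-conventional reasoning.
example = [(1, 5), (2, 3), (3, 6), (4, 2)]
```

Here `p_style_score(example)` gives 60.0, i.e. 60% of the ranking weight went to post-conventional items, yielding the kind of single score that permits high-versus-low categorization.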

The Moral Foundations Questionnaire (MFQ) is based on moral intuitions consistent across cultures: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, and sanctity/degradation (liberty/oppression may be added). The focus on pre-conscious intuitions contrasts with Kohlberg's focus on post-conscious justifications, although the questions do ask respondents to rate what they consider morally relevant post-consciously (i.e. this is not a behavioral measure). The purpose of the questionnaire is to measure the degree to which people rely upon different sets of moral intuitions (which may coexist), rather than to categorize decision-makers, but the first two foundations cluster together with liberal political orientation and the latter three cluster with conservative political orientation. Thus, this survey allows categorization of people into a plurality which may reflect evaluative types more accurately than does political orientation by itself.[18][19]
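A hypothetical sketch of the kind of summary the MFQ supports: average the item ratings for each foundation, then group the first two foundations versus the latter three, mirroring the liberal/conservative clustering described above. The foundation grouping follows the text; the response scale and the cluster labels are assumptions for illustration, not the published scoring key.

```python
# Illustrative sketch: summarize MFQ-style item ratings per foundation,
# then group the first two foundations versus the latter three as the
# text describes. Scale (0-5) and cluster labels are assumptions here.

CLUSTERS = {
    "individualizing": ["care", "fairness"],
    "binding": ["loyalty", "authority", "sanctity"],
}

def foundation_means(responses):
    """responses: dict of foundation -> list of item ratings (e.g. 0-5)."""
    return {f: sum(r) / len(r) for f, r in responses.items()}

def cluster_means(responses):
    """Average each foundation, then average foundations within a cluster."""
    means = foundation_means(responses)
    return {
        cluster: sum(means[f] for f in fs) / len(fs)
        for cluster, fs in CLUSTERS.items()
    }

# Made-up respondent who rates care/fairness items highly.
sample = {
    "care": [5, 4], "fairness": [4, 4],
    "loyalty": [2, 1], "authority": [1, 2], "sanctity": [1, 1],
}
```

On this made-up sample, the first cluster's mean (4.25) exceeds the second's (about 1.33), the pattern the text associates with a liberal political orientation; the point of the instrument, though, is the profile of reliance on each foundation rather than a binary label.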

The Moral DNA survey by Roger Steare asks respondents to rank their virtues, then divides respondents by three virtue clusters: obedience, care, and reason. The survey was developed for use in business settings, especially to raise awareness of ways perceived workplace discrimination diminishes effective evaluative diversity.[20]

In 1999 some of Kohlberg's measures were put to the test when Anne Colby and William Damon published a study examining extraordinary moral development in the lives of moral exemplars. After identifying participants who exhibited high levels of moral commitment in their everyday behaviour, the researchers used the Kohlberg Moral Judgement Interview (MJI) to compare the 23 exemplars they studied with a more ordinary group of people. Along with the interviews, the researchers presented the 23 exemplars with two standard dilemmas to assess where they fell in Kohlberg's stages. The intention was to learn more about moral exemplars and to examine the strengths and weaknesses of the Kohlberg measure. The MJI scores were not clustered at the high end of Kohlberg's scale; they ranged from stage 3 to stage 5, with half of the exemplars at the conventional level (stages 3, 3/4, and 4) and the other half at the postconventional level (stages 4/5 and 5). Compared with the general-population groups these measures have been used on in the past, the scores of the moral exemplars may be somewhat higher than those of groups not selected for outstanding moral behaviour. The researchers noted that the "moral judgement scores are clearly related to subjects' educational attainment in this study"; among participants with a college education or above, there was no difference in moral judgement scores between genders. The study concluded that although the exemplars' scores may have been higher than those of nonexemplars, one is not required to score at Kohlberg's highest stages to exhibit high degrees of moral commitment and exemplary behaviour.[21]

Theories

Recent attempts to develop an integrated model of moral motivation[22] have identified at least six different levels of moral functioning, each of which has been shown to predict some type of moral or prosocial behavior: moral intuitions, moral emotions, moral virtues/vices (behavioral capacities), moral values, moral reasoning, and moral willpower. This Social Intuitionist model of moral motivation[23] suggests that moral behaviors are typically the product of multiple levels of moral functioning, and are usually energized by the "hotter" levels of intuition, emotion, and behavioral virtue/vice. The "cooler" levels of values, reasoning, and willpower, while still important, are proposed to be secondary to the more affect-intensive processes.

The "Moral Foundations Theory" of psychologist Jonathan Haidt examines the way morality varies between cultures and identifies five fundamental moral values shared to a greater or lesser degree by different societies and individuals.[24] According to Haidt, these are: care for others, fairness, loyalty, authority and purity.[25] Haidt's book for the general reader The Happiness Hypothesis looks at the ways in which contemporary psychology casts light on the moral ideas of the past. On the other hand, in a recent conference, Haidt expressed views that may suggest he does not support a science of morality.[26]

Moral identity

While Kohlberg (1983) emphasized the role of moral reasoning and Hoffman (1971; 2001) emphasized the role of moral emotion in moral action, empirical studies showed that reasoning and emotion only moderately predicted moral action. Scholars, such as Blasi (1980; 1983), began proposing identity as a motivating factor in moral motivation. Blasi proposed the self model of moral functioning, which described the effects of the judgment of responsibility to perform a moral action, one's sense of moral identity, and the desire for self-consistency on moral action. Studies of moral exemplars have shown that exemplary moral action often results from the intertwining of personal goals and desires with moral goals, and studies on moral behavior also show a correlation between moral identity and action. However, there is great need for future research on the relationship between moral identity and behavior. Hardy and Carlo (2005) raise critical questions about Blasi's model as well as the topic in general, such as the nature of the causal relationship between moral identity and behavior, the presence of mediating or moderating variables in the relationship, how moral identity relates to more automatic and unconscious moral behavior, and more. Hardy and Carlo (2005) also propose that researchers should seek to better operationalize and measure moral identity and apply findings to moral education and intervention programs.[27]

A study conducted by Anne Colby and William Damon[28][29] examined the lives of individuals who exhibit extraordinary moral commitment. It suggests that one's moral identity is formed through the synchronization of one's personal and moral goals; this unity of self and morality is what distinguishes exemplars from non-exemplars and in turn makes them exceptional (pg. 362). Colby and Damon studied moral identity through the narratives of Virginia Foster Durr and Suzie Valadez, whose behavior, actions, and life's work were considered morally exemplary by their communities and those with whom they came in contact. Virginia Durr was a leader in the American civil rights movement for over 30 years; in particular she fought for the voting rights of African Americans as well as racial integration (pg. 346). Suzie Valadez, on the other hand, provided lifelong services to the poor and less fortunate people of Juarez, Mexico (pg. 346). The authors describe these exemplars as maintaining a "unity between self and morality" (pg. 362). Common characteristics that these moral exemplars possess are certainty (clarity about what they believe is right and about their own personal responsibility to act on those beliefs), positivity (a positive approach to life, enjoyment of work, and optimism), and unity of self and moral goals (the central place of the exemplars' moral goals in their conception of their own identity) (pg. 361-362). The research suggests that a "transformation of goals" takes place during the evolution of one's moral identity, and that this development is therefore not an exercise in self-sacrifice but rather one undertaken with great joy. Unlike modern Western societal thinking, which is characterized by a split between moral goals and personal goals and tends to see the two in opposition, moral exemplars treat their personal goals and moral goals as synonymous.
"Their lives make it clear that one need not see personal goals and responsibility to others as necessarily in opposition" (pg. 368). This transformation is not always a deliberate process; it can be described as a developmental process that takes place in one's personal beliefs and affects one's conduct. Transformation is most often gradual, but it can also be rapidly set off by a "triggering event": "sudden, unexpected occurrences that create powerful emotional responses that 'trigger' a reexamination of one's life choices" (pg. 354). Triggering events can be anything from a powerful moment in a movie to a traumatic life event or, as in the case of Suzie Valadez, the perception of a vision from God. The transformation is brought about by powerful social interactions that gradually change and shape the person's goals. The research also found that there is no single "critical point" in one's life at which moral development takes place; for many of the moral exemplars interviewed, the triggering events and goal transformation did not occur until their 40s. Moral exemplars are said to have the same concerns and commitments as other moral people but to a greater degree, "extensions in scope, intensity and breadth" (pg. 364). Furthermore, exemplars possess an openness to new ideas and experiences, an "active receptiveness" (pg. 350) to things exterior to themselves. Through this active receptiveness, a relatively average person can experience a transformation of goals and become an exemplary figure over time.[21]

Hart and Fegley (1995)[30] contribute to the literature on moral identity by providing research on adolescent moral exemplars from diverse backgrounds. The study was conducted to see how teenagers who conducted themselves in a caring manner throughout their communities saw themselves, because research on teenagers up to that point had focused largely on delinquents. Their findings suggest that adolescent caring exemplars formulate their self-concept differently from comparable peers. Moral exemplars made more references to positive, moral, caring personality traits as well as to moral and caring goals. They were also more likely to emphasize academic goals and typical, morally neutral activities. However, there were no significant differences between the exemplars and the control group concerning moral knowledge. In a semantic space analysis, Hart and Fegley also discovered that moral exemplars tend to view their actual self as more integrated with their ideal self and expected self; that is, moral exemplars show fewer discrepancies among these self-schemas, which share very similar traits. For the study of moral identity, this strongly suggests that moral exemplarity is associated with a meaningful, moral definition of one's own identity.

According to Blasi's theory of moral character, a person's moral character is identified by their set of virtues and vices. He theorized that willpower, moral desire, and integrity enable a person to act morally according to a hierarchical ordering of virtues. He held that the "highest" and most complex virtues are expressed through willpower, while the "lowest" and simplest are expressed through integrity, and that to have the lower virtues one must have one or more of the higher virtues; this is not to say that one kind is more valuable than the other. Will as desire is expressed as wanting to "move forward" toward a virtue, whereas will as self-control is wanting to "move backward" from a vice; thus will as desire is the moral desire that encompasses the moral character's virtues and vices. The end goals of developing moral character and identity are to establish core goals, act according to those goals and values, and use one's strengths and gifts to make a difference.[31]

In an article on cultural pluralism and moral identity, David B. Wong discusses how the two are related. He shows that there are two standpoints from which to understand morality in terms of culture: on one, morality is a cultural invention created so that people have something to strive toward; on the other, any morality is as good as any other and is not a cultural invention. He goes on to show how morality can be viewed much like language: while one culture may regard a given way of judging a situation as morally right, that judgment can contradict another culture's interpretation, much like different moral dialects.

Moral self

A "moral" self is fostered by mutually responsive parenting in childhood: children with responsive parents develop more empathy, prosociality, a moral self, and conscience (e.g., Kochanska, 2002). Narvaez (2014) describes the neurobiological and social elements of early experience and their effects on moral capacities.

The moral self results from a differential process wherein some people integrate moral values into their self-concept.[32] This construct specifically refers to motivational processes. Research on the moral self has mostly focused on adolescence as a critical period for the integration of self and morality, which gives rise to a moral self.[33] In other words, self and morality are traditionally seen as separate constructs that become integrated in adolescence.[34] However, the moral self may be established around 2–3 years of age.[35][36] In fact, children as young as 5 years old are able to consistently identify themselves as having certain moral behavioral preferences reflective of the two internally consistent dimensions of the moral self: a preference for prosocial behaviour and avoidance of antisocial behaviour.[37] Children's moral self is also increasingly predictive of moral emotions with age.[37] Finally, children's moral self may be a precursor to the development of one's moral identity, which forms later in life.

Moral values

Kristiansen and Hotte review many research articles on people's values and attitudes and whether these guide behavior. Based on the research they reviewed and their own extension of Ajzen and Fishbein's theory of reasoned action, they conclude that the value-attitude-behavior relation depends on the individual and their moral reasoning. They also distinguish good values from bad values: good values guide our attitudes and behaviors, allow us to express and define ourselves, and involve knowing when a value is an appropriate response to the situation or person one is dealing with. Bad values, on the other hand, are relied on so heavily that they make one unresponsive to the needs and perspectives of others.

Another issue that Kristiansen and Hotte discovered through their research was that individuals tended to "create" values to justify their reactions to certain situations, in other words to use values as a "post-hoc justification of their attitudes (emotions) and behaviors". Kristiansen and Hotte call this phenomenon the "value justification hypothesis". The authors use an example from Faludi of how, during the period when women were fighting for their rights, a New Right group appealed to society's ideals of "traditional family values" as an argument against new legislation in order to mask its own "anger at women's rising independence." This can be equated with Jonathan Haidt's social intuitionist theory, in which individuals justify their intuitive emotions and actions through post-hoc reasoning.

Kristiansen and Hotte also found that independent selves' actions and behaviors are influenced by their own thoughts and feelings, whereas interdependent selves' actions, behaviors, and self-concepts are based on the thoughts and feelings of others. Westerners have two dimensions of emotion, activation and pleasantness; the Japanese have a third, the range of their interdependent relationships. Markus and Kitayama found that these two types of values carry different motives: Westerners' explanations show self-bettering biases, while Easterners tend toward "other-oriented" biases.[38]

Psychologist S. H. Schwartz (1999)[39] defines individual values as "conceptions of the desirable that guide the way social actors (e.g. organisational leaders, policymakers, individual persons) select actions, evaluate people and events, and explain their actions and evaluations." On this definition, values are goals and ethics that guide one's life. Cultural values form the basis for social norms, laws, customs, and practices. While individual values vary case by case (a result of unique life experience), the average of these values points to widely held cultural beliefs (a result of shared cultural values).

Moral virtues

Morality as virtues suggests that the morality of a person depends on the traits and temperaments that he or she possesses and values. Piaget and Kohlberg both developed stages of development to understand the timing and meaning of moral decisions. Lapsley and Narvaez,[40] in their paper "A Social-Cognitive Approach to Moral Personality", outline how social cognition explains aspects of moral functioning that other theories alone could not cover. The social cognitive approach to personality draws on six critical resources of moral personality: cognition, self-processes, affective elements of personality, changing social context, lawful situational variability, and the integration of other literature. Lapsley and Narvaez suggest that our moral values and actions stem from more than our virtues and are instead controlled largely by a set of schemas, cognitive structures that organize related concepts and integrate past events. They claim that schemas are "fundamental to our very ability to notice dilemmas as we appraise the moral landscape" (p. 197). As people add to their schemas through knowledge and experience, they deliberately shape their view of morality. This idea fits with Kohlberg's view that moral reasoning governs our actions. Although Kohlberg acknowledges virtues as an aspect of morality, he stresses a justice-reasoning approach to generate consensus about moral developmental dilemmas, and he argues that virtues are not the same across cultures; different societies live by different moral virtues. Lapsley and Narvaez suggest that over time we develop greater "moral expertise" (see also Narvaez, 2005; 2006). In gaining this moral expertise, we align our goals with our moral self, seek out and gain new knowledge of what it is to be moral, and develop highly practiced behavioral routines, all toward the ultimate goal of acting out what it means to be a moral person.

In doing so we achieve the social cognitive account of personality advocated by Cervone and Shoda (1999),[41] referred to as "personality coherence". Coherence in personality can be seen in the dynamic interaction between dispositions and changing contexts: "Persons and contexts are not static, orthogonal effects, but are instead in dynamic interaction" (Lapsley & Narvaez, 2004). This "phenomenological sense of self-coherence that orders our goals, preferences, and values, and gives meaning to personal striving and motivated behavior" allows us to become moral experts because, on the social cognitive approach, personality processes do not function independently but are instead organized into coherent systems shaped by our personal experience and social contexts (p. 11). This social-cognitive approach challenges the traditional trait approach, which emphasizes a structural basis of individual differences: the trait approach proposes that differences in personality arise from varying possession of traits, labeled the "having" side (as opposed to the "doing" side) of personality theory (Lapsley & Narvaez, 2004).

Moral reasoning

Main article: Moral reasoning

In the history of moral psychology, there is perhaps no more central figure than Lawrence Kohlberg, although Piaget (1932) was the inspiration for Kohlberg's work. Piaget watched children play games and noted how their rationales for cooperation changed with experience and maturation. He identified two stages, heteronomous (morality centered outside the self) and autonomous (internalized morality). Kohlberg sought to expand Piaget's work. His cognitive developmental theory of moral reasoning dominated the field for decades. Briefly stated, he focused on moral development as one's progression in the capacity to reason about justice. He used moral dilemmas or conflicts of interest. The most widely known moral scenario used in his research is usually referred to as the Heinz dilemma. He interviewed children and described what he saw in six stages.

Kohlberg identified six stages of moral judgment development, using an interview method with hypothetical dilemmas such as "Heinz and the drug." In the story, Heinz's wife is dying of cancer; the town's druggist has something that can help her but is charging more than Heinz can afford, so Heinz steals the drug to save his wife's life. Children aged 10, 13, and 16 were asked whether what Heinz did was okay. Responses range from stage one, where children recognize higher authorities and that there are set rules and punishments for breaking them, to stage six, where good principles make a good society and one weighs which principles are most agreeable and fair.[42] According to Kohlberg, an individual is considered more cognitively mature depending on their stage of moral reasoning, and an individual's stage of moral reasoning grows with both education and world experience. One example Kohlberg gives is "cognitive-moral conflict", in which an individual currently at one stage of moral reasoning has their beliefs challenged by a surrounding peer group; through this challenge the individual engages in "reflective reorganization", which allows movement to a new stage.

Kohlberg argued that though these six stages are artificial classifications of what is in reality much less discrete, they are the most logical. He claims that "anyone who interviewed children about dilemmas and who followed them longitudinally in time would come to our six stages and no others,"[43] and also that this is the best way to conceptualize not simply morality, but also specifically the direction of growth and progression of moral reasoning at the individual level over time.

Kohlberg's six stages emphasize the form or structure of morality over the content of morality, thus claiming his findings as universal. Form focuses purely on how one thinks about morality, or moral reasoning, shying away from explicitly defining what is or is not moral.

The philosopher and educational theorist John Dewey claimed that moral development is not fostered in the education system, stating that "the aim of moral education should be to stimulate people's thinking ability over time in ways which will enable them to use more adequate and complex reasoning patterns to solve moral problems."[44]

Despite the influence of Kohlberg, his views did not come without criticism and critique. Previous moral development scales, particularly Kohlberg's, believe that moral reasoning is dominated by one main perspective: justice. However, Gilligan and Attanucci argue that there is an alternative to this approach known as the care perspective.[45] The justice view deals with problems of inequality and oppression with equal rights and respect for all, whereas the care perspective deals with attachment to others. Both are unique experiences found within human development and experiences. Gilligan and Attanucci analyzed male and female responses to moral situations using content analysis to identify their moral considerations. Overall the study found that a majority of participants do represent both care and justice in their moral orientations. In addition, they found that men do tend to use the justice view significantly more than women and the same for women towards the care perspective.[45] This is significant as it illustrates that females were prone to view moral situations is a way that previous research did not account for and overlooked. At the time of production, this study really opened multiple closed doors leading to answers and further understanding of what separates men and women, specifically with how they handle and act upon moral situations. Solely looking at justice when determining moral development may not be appropriate for both genders. The results of this study indicate that one's moral standpoint—justice or caring—may be an extension of one's identity or preferred perception of life. This is especially true when a decision is made introspectively, or at the "postconventional" level. While this article from Gilligan and Attanucci did need a broader spectrum of test subjects to really prove authenticity, it is important to remember that not everyone views morality the same. 
Gilligan and Attanucci further suggested that holding different viewpoints on morality may be beneficial: acknowledging others' moral perspectives can expand one's own moral understanding. However, reviews by Walker (2006) and Jaffee and Hyde (2001) found that Gilligan's theory was not supported by empirical studies. Indeed, in neo-Kohlbergian studies using the Defining Issues Test, females tend to score higher than males, though generally not significantly so (Rest, Narvaez, Bebeau & Thoma, 1999).

The neo-Kohlbergian approach to moral judgment modified Kohlberg's theory in several systematic ways, including describing the development of moral judgment as a shifting distribution of moral schemas. With age, maturation, and education (higher education or its equivalent), one relies increasingly on postconventional thinking and less on personal-interest or maintaining-norms thinking.

Moral willpower

Metcalfe and Mischel (1999)[46] offered a new theory of willpower focused on the delay-of-gratification paradigm. They propose a hot/cool framework of analysis to describe how one controls the way a stimulus is interpreted and willpower is exerted. The hot system is referred to as the "go" system, whereas the cool system is referred to as the "know" system. The hot system is highly emotional, reflexive, and impulsive; it produces the "go" response (instant gratification) and therefore undermines efforts at self-control. It is specialized for quick emotional processing when confronted with "hot" stimuli (stimuli that sabotage self-control efforts). The cool system is cognitive, emotionally neutral, flexible, slow, integrated, contemplative, and strategic, and it specializes in complex episodic representations. The hot system develops early in life, whereas the cool system develops later because it relies on particular brain structures, notably the prefrontal cortex and hippocampus, and on cognitive capacities that mature later. With age, dominance shifts from the hot system to the cool system, though a balance between the two is possible; that balance is determined by stress, developmental level, and a person's self-regulating dynamics.[47] Which system is triggered determines how one reacts to the stimuli presented.

Baumeister, Miller, and Delaney (2005) explore the notion of willpower by first defining the self as being made up of three parts: reflexive consciousness, the person's awareness of their environment and of themselves as an individual; interpersonal being, which seeks to mold the self into one that will be accepted by others; and executive function, which encompasses choice, control, and self-regulation and serves as the starting point of the authors' discussion of willpower. They state that "[t]he self can free its actions from being determined by particular influences, especially those of which it is aware" (p. 68). Consciousness equips an individual with the ability to override instinctual reactions; however, there is a substantial cost to resisting these natural reactions and promoting moral ones. The three prevalent theories of willpower describe it as a limited supply of energy, as a cognitive process, and as a skill that is developed over time. Research has largely supported the view that willpower works like a "moral muscle" with a limited supply of strength that may be depleted, conserved, or replenished, and that a single act of self-control can significantly deplete this "supply".[48] Baumeister, Miller, and Delaney (2005) found that the depletion of willpower is caused by various "types of responses, including controlling emotion, controlling thoughts, impulse control and resisting temptation, and controlling performance" (p. 64). While volitional exertion reduces the ability to engage in further acts of willpower in the short term, such exertions actually improve a person's ability to exert willpower for extended periods in the long run. That is, much like a regular muscle, the "moral muscle" is susceptible to depletion when heavily exerted, but repeated exertion builds strength that makes future prolonged exertions easier.
Muraven, Baumeister, and Tice (1999) demonstrate that this moral muscle, when exercised, is strengthened in capacity rather than in power, meaning that subjects became less susceptible to depletion of their self-regulatory faculties.[49] A second part of the self-management theory is self-regulation, which includes recognizing the current situation, computing a desired response, and initiating a substitute reaction.[48] For example, the Muraven, Baumeister, and Tice (1999) study showed that more complex tasks, such as regulating one's mood, present substantial difficulty and may not be as effective in increasing willpower as more straightforward activities such as correcting posture or keeping a food journal.[49] Over time, however, the "moral muscle" may be exercised by performing small tasks of self-control, such as correcting slouched posture, resisting desserts, or completing challenging self-regulatory tasks. Just as a recovered alcoholic may find it easier to subsequently use self-control to quit smoking, someone who has exercised the self-control muscle in one area may possess greater self-regulatory stamina for the next task they attempt.[49] Lastly, Baumeister argues that self-management, the ability to alter one's responses, is a kind of skill that develops as one grows up.[48] Many small things can help a person replenish this source of willpower, such as meditation, rest, and positive emotion between tasks.[49] The authors also showed a conservation effect: as in sports, once people have used much of their energy (here, their volition and self-control), they begin to conserve what little remains so they can be more productive later on.
People tend to realize that they are drawing down their stored volition and self-control, and to disperse it only when needed.[49] This might lead individuals to use self-control sparingly in order to avoid depleting the limited reserve, but the research by Muraven, Baumeister, and Tice (1999) indicates that self-control can, as already mentioned, be strengthened through exercise, offering a more encouraging outlook. If self-control can be strengthened through practice, individuals need worry less about conserving a limited supply of self-control energy and can instead exercise their self-regulatory muscle to increase their self-control capabilities over time, and thus be more successful at achieving their goals.[49]

Moral behavior

James Rest (1983; Narvaez & Rest, 1995) reviewed the literature on moral functioning and identified at least four components or processes that must go right for a moral behavior to take place. (1) Moral sensitivity is noticing and interpreting what's happening; (2) Moral reasoning about what to do and making a judgment about what is the best (most moral) option; (3) Moral motivation (in the moment but also habitually, such as moral identity); (4) Moral implementation—having the skills and perseverance to carry out the action.

Reynolds and Ceranic (2007) examined various contributors to moral behavior, including moral judgment and moral identity, and identified some major limitations in the classic cognitive moral development theories. Rather than studying moral identity and moral judgment as separate contributors to moral behavior, they sought to bring the two concepts together. Their research suggests that moral identity and moral judgment work both separately and together in shaping moral behavior. They also examined the effects of social consensus: depending on the level of social consensus surrounding a behavior (high vs. low), motivating an individual to choose and endorse it will require a greater or lesser degree of moral identity, and particular behaviors may require different levels of moral reasoning. The study thus demonstrates an integrated approach to moral identity and moral judgment, as well as the effects of social consensus on both.[50]

Moral intuitions

In 2001, Jonathan Haidt introduced his Social Intuitionist Model, which claims that, with few exceptions, moral judgments are made on the basis of socially derived intuitions. Moral judgments are evaluations of the actions one makes, and depend on the judger's set of virtues and culture. Moral reasoning happens in several steps: searching for relevant evidence, weighing that evidence, coordinating it with past theories, and reaching a decision. It can also be defined as a conscious mental activity that consists of transforming the information presented in order to reach a moral judgment. Moral intuitions, by contrast, happen immediately and unconsciously: they are the conclusion a person faced with a situation comes to about whether it is right or wrong, without thinking about it in depth.[51]

This model suggests that moral reasoning is largely post-hoc rationalization that functions to justify one's instinctual reactions. Haidt provides four arguments for doubting the causal importance of reason. The first is research supporting a dual-process system in the brain for making automatic evaluations or assessments, which Haidt proposes applies to moral judgment. The second is evidence from Chaiken that evolved social motives bias humans to cohere with and relate to others' attitudes in order to achieve higher societal goals, which in turn influences moral judgment. Third, Haidt found that people engage in post-hoc reasoning when faced with a moral situation; this a posteriori (after-the-fact) explanation gives the illusion of objective moral judgment but is in reality subject to one's gut feeling. Lastly, research has shown that moral emotion has a stronger link to moral action than moral reasoning does, citing Damasio's research on psychopaths and Batson's empathy-altruism hypothesis.[51]

In 2008, Joshua Greene published a compilation which, in contrast to Haidt's model, suggested that moral reasoning does in fact take place. A deontologist holds a rule-based morality focused mainly on duties and rights; a consequentialist, in contrast, believes that only the best overall consequences ultimately matter.[52] Research has found that, generally speaking, individuals who respond to moral dilemmas in a consequentialist manner take longer to respond and show frontal-lobe activity (associated with cognitive processing), whereas individuals who respond deontologically generally answer more quickly and show activity in the amygdala (associated with emotional processing). Greene also proposes a distinction between an identifiable individual and a merely statistical someone, so that moral answers are also determined by whether one can identify with the other person. This research suggests that, although intuitions strongly influence morality (especially non-utilitarian moralities), individuals remain capable of genuine moral reasoning.[53]

With regard to moral intuitions, Haidt and Graham performed a study on the difference between the moral foundations of political liberals and political conservatives.[54] They challenge individuals to question the legitimacy of their moral world and introduce five psychological foundations of morality: harm/care, fairness/reciprocity, ingroup/loyalty, authority/respect, and purity/sanctity. Harm/care began with sensitivity to signs of suffering in offspring and has developed into a general dislike of seeing suffering in others and the capacity to feel compassion in response. Fairness/reciprocity develops as one observes or engages in reciprocal interactions; virtues related to fairness and justice have developed in all cultures, although they can be overridden. In particular, fairness "is an excellent candidate for a universal (though variably applied) value" (p. 105). Ingroup/loyalty consists of recognizing, trusting, and cooperating with members of one's ingroup while being wary and distrustful of members of other groups. Authority/respect concerns how one navigates hierarchical ingroups and communities. Lastly, purity/sanctity stems from the emotion of disgust, which guards the body by responding to elicitors that are biologically or culturally linked to disease transmission. The five-foundations theory is both a nativist and a cultural-psychological theory. Modern moral psychology concedes that "morality is about protecting individuals" and focuses primarily on issues of justice (harm/care and fairness/reciprocity) (p. 99). Haidt and Graham's research found that "justice and related virtues…make up half of the moral world for liberals, while justice-related concerns make up only one fifth of the moral world for conservatives" (p. 99).[54] Liberals value harm/care and fairness/reciprocity significantly more than the other foundations, while conservatives value all five equally.
Additionally, their research illustrated that social justice research and social psychology are constrained in their discussion of morality by a focus on harm and fairness. An examination of these texts found that the harm and fairness foundations were highly endorsed in the articles, while the three other moral domains were associated more with vice than with virtue because they conflicted with the harm and fairness foundations.[54] Haidt and Graham propose that for open discussions to take place in the political arena, liberals must recognize moral issues from a conservative perspective if they are to understand conservatives' stances and hope to enact change. Their paper concludes with a call for tolerance between those who value different moral foundations, which they see as necessary for progress.[54]

This idea is challenged by Augusto Blasi, who is hesitant to accept the theory wholeheartedly. Though Blasi recognizes that intuitions are sometimes valid and may motivate us to do moral things, this is not always the case; he emphasizes the importance of moral responsibility and reflection as one analyzes an intuition (p. 423).[55] His main argument is that some, if not most, intuitions tend to be self-centered and self-seeking (p. 397), and such desires and intuitions do not produce good moral actions. Blasi critiques Haidt's description of the average person and questions whether this model (having an intuition, acting on it, and then justifying it) always holds; he concludes that not everyone follows it. He likens Haidt to a magician creating a distraction: by leading people to focus on intuitions, Haidt causes them to ignore other elements such as identity or the self. Blasi is also concerned that the experimental designs used by psychologists today not only reveal but also hide many parts of humanity; in other words, they take something extremely complex and simplify it, which creates a skewed perspective. In more detail, Blasi summarizes Haidt's five default positions on intuition: (1) Moral judgments are normally caused by intuitions, whether the intuitions are themselves caused by heuristics or the heuristics are intuitions, and whether they are intrinsically based on emotions or depend on grammar-like rules only externally related to emotions. (2) Intuitions occur rapidly and appear unquestionably evident; either the intuitions themselves or their sources are unconscious. (3) Intuitions are responses to minimal information and are not the result of analysis or reasoning; neither do they require reasoning to appear solid and true. (4) Reasoning may occur, but infrequently; in any event, its purpose, and the purpose of reasons, is not to lead to and support a valid judgment, but to justify the judgment after the fact, either to other people or to oneself. Reasons, in sum, do not have a moral function. (5) Because such are the empirical facts, the "rationalistic" theories and methods of Piaget and Kohlberg are rejected. Blasi argues that Haidt does not provide adequate evidence to support his position (p. 412).[55]

Darcia Narvaez[56] emphasizes the interplay between intuitions and conscious processes as normal moral functioning, especially in non-novices. In her 2014 book,[57] she shows how intuitions are shaped in early life and underlie one's orientation toward moral self-protection or moral relational attunement in adulthood. Toxic early stress leads to stress reactivity and self-protectionism, which can hijack moral imagination and produce oppositional or withdrawing moral mindsets in everyday life. When a person is raised within the human evolved developmental niche, imagination builds on capacities for relational attunement to form communal imagination. Small-band hunter-gatherer and many indigenous societies, which raise their children within this niche, demonstrate these inherited moral capacities.

Moral emotions

Throughout history, thinking about the basis of morality has been dominated by the reasoning perspective (see Moral reasoning). Moral reasoning has been the focus of most study of morality dating back to Plato and Aristotle, while the emotive side of morality was looked upon with disdain, as subservient to higher, rational moral reasoning, with scholars such as Piaget and Kohlberg touting moral reasoning as the core of morality.[15] In the last 30–40 years, however, a new front of research has arisen: moral emotions as the basis for moral behavior. As the nature of moral emotions and their role in determining morality have been studied, the moral emotion perspective has gained credence. This development began with a focus on empathy and guilt but has since expanded to emotions such as anger, shame, disgust, awe, and elevation. With this new research, theorists have begun to question whether moral emotions might hold a larger role in determining morality, one that might even surpass that of moral reasoning.[58]

"One approach would be first to define morality and then to say that moral emotions are the emotions that respond to moral violations or that motivate moral behavior".[59] Philosophers have generally taken two approaches to defining moral emotion. The first "is to specify the formal conditions that make a moral statement (e.g., that it is prescriptive, that it is universalizable)".[60] This approach is tied to language and the definitions we give to moral emotions. The second approach "is to specify the material conditions of a moral issue, for example, that moral rules and judgments 'must bear on the interest or welfare either of society as a whole or at least of persons other than the judge or agent'".[61] This definition is more action-based, focusing on the outcome of a moral emotion, and it is preferable because it is not tied to language and can therefore be applied to prelinguistic children and animals. Moral emotions are thus "emotions that are linked to the interests or welfare either of society as a whole or at least of persons other than the judge or agent".[62]

There is debate over whether there is a set of basic emotions or whether there are "scripts or set of components that can be mixed and matched, allowing for a very large number of possible emotions".[58] Even those arguing for a basic set acknowledge that there are variants of each emotion; Ekman (1992) calls these variants "families". "The principal moral emotions can be divided into two large and two small joint families. The large families are the 'other-condemning' family, in which the three brothers are contempt, anger, and disgust (and their many children, such as indignation and loathing), and the 'self-conscious' family (shame, embarrassment, and guilt)…[T]he two smaller families are the 'other-suffering' family (compassion) and the 'other-praising' family (gratitude and elevation)".[58] Haidt also leaves theoretical room for cultural specificity: different cultures, he suggests, can formulate local moral emotions that reflect their intrinsic values. For example, Eastern cultures may be more inclined than Western cultures to consider serenity or calmness a moral emotion.

Jonathan Haidt argues that studies of moral reasoning in moral psychology have done little to determine what leads us to action. He criticizes the field's avoidance of emotion and believes that it is emotion that drives us to act: the higher the emotionality of a moral agent, the more likely they are to act morally. He also uses the term "disinterested elicitor" to describe someone who is less concerned with the self and more concerned with the well-being of things outside himself or herself. Haidt suggests that society is made up of these disinterested elicitors and that each person's prosocial action tendency is determined by his or her degree of emotionality. Using Ekman's idea of "emotion families", Haidt builds a scale of emotionality from low to high; combining this scale with the self-interested versus disinterested distinction yields a likelihood of action. A person operating at a low level of emotionality with self-interested emotions, such as sadness or happiness, is unlikely to act. A moral agent with high emotionality who operates as a disinterested elicitor, with emotions such as elevation, is much more likely to be morally altruistic.

Empathy also plays a large role in altruism. The empathy-altruism hypothesis states that feelings of empathy for another lead to an altruistic motivation to help that person.[63] In contrast, there may also be an egoistic motivation to help someone in need: on the Hullian tension-reduction model, the personal distress caused by another's need leads the person to help in order to alleviate their own discomfort.[64]

Batson, Klein, Highberger, and Shaw[65] conducted experiments in which they used empathy-induced altruism to lead people to make decisions requiring them to show partiality to one individual over another. In the first experiment, a participant from each of three groups (non-communication, communication/low-empathy, and communication/high-empathy) chose someone to experience a positive or negative task. The communication/high-empathy group showed more partiality than the other groups, having been successfully manipulated emotionally. Those who were successfully manipulated reported that, despite feeling compelled in the moment to show partiality, they still felt they had made the more "immoral" decision, since they had followed an empathy-based emotion rather than adhering to a justice perspective of morality.

With these two experiments on empathy-induced altruism, Batson, Klein, Highberger, and Shaw proposed that it can lead to actions that violate the justice principle. The second experiment operated similarly to the first, using low-empathy and high-empathy groups. After listening to an ostensibly ill child's emotionally charged interview describing her condition and the life it had left her to lead, participants faced the decision of whether to move her to an immediate-help group or leave her on a waiting list. Results were consistent with the first experiment: those in the high-empathy group were more likely than those in the low-empathy group to move the child up the list to receive treatment earlier. Participants made this choice in full knowledge that it would place children who were higher on the list, because they had been waiting longer and had a shorter estimated time left to live, behind this child, thereby delaying their treatment. Notably, when asked what the more moral choice would have been, these participants agreed that it would have been not to move the child ahead at the expense of the others. This makes it evident that when empathy-induced altruism is at odds with what is seen as moral, it often has the ability to win out over morality. The results showed that empathy-induced altruism and acting in accordance with the justice principle are independent of one another: they are two distinct prosocial motives, each with its own unique ultimate goal. In resource-allocation situations where the two motives conflict, empathy-induced altruism can become a source of immoral injustice.
Empathy-induced altruism and justice morality can work together in situations where the empathy is felt towards a victim of injustice.[65]

More recently, the neuroscientist Jean Decety, drawing on empirical research in evolutionary theory, developmental psychology, social neuroscience, and psychopathy, has argued that empathy and morality are neither systematically opposed to one another nor inevitably complementary.[66][67] Further, a better understanding of the relation between empathy and morality may require abandoning the notion of empathy in favor of more precise concepts, such as emotional sharing, empathic concern, and affective perspective-taking.[68][69]

Moral conviction

One of the main questions within the psychological study of morality is what qualitatively distinguishes moral attitudes from non-moral attitudes. Linda Skitka and colleagues have introduced the concept of moral conviction, which refers to a "strong and absolute belief that something is right or wrong, moral or immoral."[70] According to Skitka's Integrated Theory of Moral Conviction (ITMC), attitudes held with moral conviction, known as moral mandates, differ from strong but non-moral attitudes in a number of important ways. Namely, moral mandates derive their motivational force from their perceived universality, perceived objectivity, and strong ties to emotion.[71] Perceived universality refers to the notion that individuals experience moral mandates as transcending persons and cultures; additionally, they are regarded as matters of fact. Regarding association with emotion, ITMC is consistent with Jonathan Haidt's Social Intuitionist Model in stating that moral judgments are accompanied by discrete moral emotions (e.g., disgust, shame, guilt). Importantly, Skitka maintains that moral mandates are not the same thing as moral values: whether an issue will be associated with moral conviction varies across persons.

One of the main lines of ITMC research addresses the behavioral implications of moral mandates. Individuals prefer greater social and physical distance from attitudinally dissimilar others when moral conviction is high. Importantly, this effect of moral conviction could not be explained by traditional measures of attitude strength, extremity, or centrality. Skitka, Bauman, and Sargis placed participants in either attitudinally heterogeneous or homogeneous groups to discuss procedures regarding two morally mandated issues, abortion and capital punishment. Those in attitudinally heterogeneous groups demonstrated the least goodwill toward other group members, the least cooperation, and the most tension and defensiveness. Furthermore, individuals discussing a morally mandated issue were less likely to reach consensus than those discussing non-moral issues.[72]

Evolution

In Unto Others: The Evolution and Psychology of Unselfish Behavior (1998), Elliott Sober and David Sloan Wilson demonstrated that diverse moralities could evolve through group selection. In particular, they dismantled the idea that natural selection will favor a homogeneous population in which all creatures care only about their own personal welfare and/or behave only in ways that advance their own personal reproduction.[73] Tim Dean has advanced the more general claim that moral diversity would evolve through frequency-dependent selection, because each moral approach is vulnerable to a different set of situations that threatened our ancestors.[74] Darcia Narvaez[75] emphasizes the epigenetics of morality in a moral developmental systems theory.[76] Humans evolved to receive intensive, supportive parenting, which shapes the neurobiology for moral agility, relational attunement, and communal imagination (apparent in small-band hunter-gatherers). In the last 1% of human existence, the evolved developmental niche or "nest" (which mostly evolved to fixation in social mammals 30–40 million years ago; see Melvin Konner, The Evolution of Childhood) has been dismantled, undermining human development and human nature and leading to alienation from the natural world and widespread ill-being.

Sociological applications

Some research shows that people tend to self-segregate based on moral or moral-political values.[77][78]

Triune Ethics Theory (TET; Narvaez, 2008)

Triune Ethics Theory has been proposed by Darcia Narvaez (2008) as a metatheory that attempts to integrate the observations and explanations of multiple theories. In particular, it highlights the relative contributions to moral development of biological inheritance (including human evolutionary adaptations), environmental influences on neurobiology (including epigenetics; the theory focuses especially on the relative effects of environments that either replicate or deviate from the Environment of Evolutionary Adaptedness, or EEA), and the role of culture. TET proposes three ethics as the foundation or motivation for all ethics: security (or safety), engagement, and imagination. Each ethic is engaged by one of three unique but interrelated neural systems, which differ not only in the recency of their evolutionary development but also in their relative capacity to override one another.[79]

Security

The security ethic is based in the oldest part of the brain, involving the R-complex or the extrapyramidal action nervous system.[80] It is triggered by the stress response, which activates primal instincts and fight-or-flight responses.[81] These are centered on safety, survival, and thriving in an environment (or biological system). With these systems present at birth, the security ethic is conditioned during sensitive periods of development (such as infancy), by life experience, and by trauma.[79] Studies have shown that a dearth of touch in the early years results in an underdevelopment of serotonin receptors.[82] Children with faulty serotonin receptors are susceptible to somatosensory affectional deprivation (SAD), a condition related to depression, violent behavior, and stimulus seeking.[83] In adulthood, if serotonin receptors are not functioning properly, an individual is more prone to depression and anxiety.[84] If the receptors are damaged and one becomes fixated at this ethic, one can be seen as cold, closed-minded, and aggressive. On this theory, the security ethic is most responsible for racism and hatred toward outside groups.

Engagement

The ethic of engagement is centered in the upper limbic system, or the visceral-emotional nervous system.[80] The limbic system allows for external and internal emotional signaling and is critical to emotion, identity, memory for ongoing experience, and an individual's sense of reality and truth. The ethic of engagement refers to relational attunement in the moment, which the stress response prevents, with a focus on social bonding. It relies significantly on caregiver influence for its scheduled development in early childhood.[79] The engagement ethic is strongly associated with the hormone oxytocin, which is strongly present during breastfeeding between mother and child and is essential for building trust between them.

Imagination

The imagination ethic allows a person to step back from the impetuous emotional responses of the older parts of the brain and consider alternative actions based on logic and reason.[85] It is centered in the neocortex and related thalamic structures, including the frontal lobes used for reasoning and judgment.[80] Focused on the outside world, it integrates and coordinates the other parts of the brain, allowing for imaginative thinking and strategic problem solving. The ethic of imagination combines internal information with external information, allowing an adult to acknowledge, and possibly reject, the more emotional responses of the security or engagement ethics. It can build on the self-protective states of the security ethic (vicious or detached imagination) or on the prosocial engagement ethic (communal imagination).[79]

Topics

The subjects covered by moral psychology include:

  • The structure of action
  • Perceived causes and events of moral action
  • Emotions in morality
  • The faculties of the mind involved in moral decision
  • The interaction of those faculties and the emotions
  • Moral commitment
  • Rationality in moral matters
  • Moral judgement
  • Moral sensitivity
  • Moral motivation and identity
  • The relationship between ethics and moral action
  • The means by which moral agents understand each other

Footnotes

  1. See, for example, Lapsley, D.K. (1996). Moral Psychology. Boulder, CO: Westview Press.
  2. See, for example, Doris & Stich (2008) and Wallace (2007). Wallace writes: "Moral psychology is the study of morality in its psychological dimensions" (p. 86).
  3. See Doris & Stich (2008), §1.
  4. Teper, R.; Inzlicht, M.; Page-Gould, E. (2011). "Are we more moral than we think?: Exploring the role of affect in moral behavior and moral forecasting". Psychological Science. 22 (4): 553–558. doi:10.1177/0956797611402513.
  5. Hardy, S. A.; Carlo, G. (2011). "Moral identity: What is it, how does it develop, and is it linked to moral action?". Child Development Perspectives. 5 (3): 212–218. doi:10.1111/j.1750-8606.2011.00189.x.
  6. Kohlberg, L. (1969). Stage and sequence: The cognitive development approach to socialization. In D. A. Goslin (Ed.), Handbook of socialization theory and research (pp. 347–480). Chicago: Rand McNally.
  7. Doris & Stich (2008), §1.
  8. Kohlberg, Lawrence (1958). "The Development of Modes of Thinking and Choices in Years 10 to 16". Ph.D. dissertation, University of Chicago.
  9. Kohlberg, L. (1987). The Measurement of Moral Judgment Vol. 2: Standard Issue Scoring Manual. Cambridge University Press. ISBN 0-521-24447-1.
  10. Wendorf, Craig A (2001). "History of American morality research, 1894–1932". History of Psychology. 4 (3): 272–288. doi:10.1037/1093-4510.4.3.272.
  11. Kohlberg, Lawrence (1973). "The Claim to Moral Adequacy of a Highest Stage of Moral Judgment". Journal of Philosophy. 70 (18): 630–646. doi:10.2307/2025030. JSTOR 2025030.
  12. Colby, Anne; Kohlberg, L. (1987). The Measurement of Moral Judgment Vol. 2: Standard Issue Scoring Manual. Cambridge University Press. ISBN 0-521-24447-1.
  13. Walker, Lawrence J.; Frimer, Jeremy A.; Dunlop, William L. (2010). "Varieties of moral personality: beyond the banality of heroism". Journal of Personality. 78 (3): 907–942. doi:10.1111/j.1467-6494.2010.00637.x. PMID 20573130.
  14. Verplaetse, Jan (2008). "Measuring the moral sense: morality tests in continental Europe between 1910 and 1930". Paedagogica Historica. 44 (3): 265–286. doi:10.1080/00309230701722721.
  15. Kohlberg, Lawrence (1981). Essays on moral development, Vol. 1. The philosophy of moral development. San Francisco: Harper & Row.
  16. Rest, James (1979). Development in Judging Moral Issues. University of Minnesota Press.
  17. Lind, Georg (1978). "Wie misst man moralisches Urteil? Probleme und alternative Möglichkeiten der Messung eines komplexen Konstrukts". In Portele, G. Sozialisation und Moral. Weinheim: Beltz. pp. 171–201.
  18. Graham, Jesse; Haidt, Jonathan; Nosek, Brian A. (2009). "Liberals and conservatives rely on different sets of moral foundations". Journal of Personality and Social Psychology. 96 (5): 1029–1046. doi:10.1037/a0015141. PMID 19379034.
  19. Graham, J.; Haidt, J.; Koleva, S.; Motyl, M.; Iyer, R.; Wojcik, S.; Ditto, P.H. (2013). "Moral Foundations Theory: The pragmatic validity of moral pluralism" (PDF). Advances in Experimental Social Psychology. 47: 55–130. doi:10.1016/b978-0-12-407236-7.00002-4.
  20. Steare, Roger (2006). Ethicability. Roger Steare Consulting.
  21. Anne Colby and William Damon, "The Development of Extraordinary Moral Commitment," Morality in Everyday Life: Development Perspectives, Cambridge University Press, 1999, pp. 324-370.
  22. Leffel (2008)
  23. Leffel's (2008) model draws heavily on Haidt's (2001) "Social Intuitionist Model" of moral judgment.
  24. Haidt, Jonathan; Jesse Graham (2007). "When Morality Opposes Justice: Conservatives Have Moral Intuitions That Liberals May Not Recognize" (PDF). Social Justice Research. 20 (1): 98–116. doi:10.1007/s11211-007-0034-z. Archived from the original (PDF) on September 16, 2008. Retrieved 2008-12-14.
  25. Talks: Jonathan Haidt on the moral roots of liberals and conservatives at TED in 2008
  26. http://www.edge.org/3rd_culture/morality10/morality10_index.html
  27. Hardy, S. A.; Carlo, G. (2005). "Identity as a source of moral motivation". Human Development. 48: 232–256. doi:10.1159/000086859.
  28. Colby, A., & Damon, W. (1993). The uniting of self and morality in the development of extraordinary moral commitment. In G. G. Noam & T. E. Wren (Eds.), The moral self (pp. 149-174). Cambridge, MA: MIT Press.
  29. Colby, A., & Damon, W. (1992). Some do care: Contemporary lives of moral commitment. New York: Free Press.
  30. Hart, D.; Fegley, S. (1995). "Prosocial behavior and caring in adolescence: Relations to self-understanding and social judgment". Child Development. 66 (5): 1346–1359. doi:10.2307/1131651.
  31. Blasi, A. (2005). Moral character: A psychological approach. In D. K. Lapsley & F. C. Powers (Eds.), Character psychology and character education (pp. 67-100). Notre Dame, IN: University of Notre Dame Press.
  32. Krettenauer, T (2011). "The dual moral self: Moral centrality and internal moral motivation". The Journal of Genetic Psychology. 172: 309–328. doi:10.1080/00221325.2010.538451.
  33. see Krettenauer, T. (2013). Revisiting the moral self construct: Developmental perspectives on moral selfhood. In B. Sokol, U., F. Grouzet, & U. Müller (Eds.). Self-regulation and autonomy (pp. 115-140). Cambridge: Cambridge University Press.
  34. see for example, Damon, W., & Hart, D. (1988). Self-understanding in childhood and adolescence. Cambridge, England: Cambridge University Press.
  35. Emde, R.; Biringen, Z.; Clyman, R.; Oppenheim, D. (1991). "The moral self of infancy: Affective core and procedural knowledge". Developmental Review. 11: 251–270. doi:10.1016/0273-2297(91)90013-e.
  36. Kochanska, G (2002). "Committed compliance, moral self, and internalization: A mediational model". Developmental Psychology. 38: 339–351. doi:10.1037/0012-1649.38.3.339.
  37. Krettenauer, T.; Campbell, S.; Hertz, S. (2013). "Moral emotions and the development of the moral self in childhood". European Journal of Developmental Psychology. 10: 159–173. doi:10.1080/17405629.2012.762750.
  38. Kristiansen, Connie M., Alan M. Hotte, and Ottawa U. Carleton. "Morality And The Self: Implications For The When And How Of Value–Attitude–Behavior Relations." The Psychology of Values. Vol. 8. Hillsdale, NJ, England: Lawrence Erlbaum Associates, 1996. 91. Print. The Ontario Symposium on Personality and Social Psychology.
  39. Schwartz, S. H. (1999). "A Theory of Cultural Values and Some Implications for Work". Applied Psychology: An International Review. 48 (1): 23–47. doi:10.1080/026999499377655.
  40. Lapsley, D. K., & Narvaez, D. (2004). A social-cognitive approach to the moral personality Lawrence Erlbaum Associates Publishers, Mahwah, NJ
  41. Cervone, D., & Shoda, Y. (1999). The coherence of personality: Social-cognitive bases of consistency, variability, and organization. Guilford Press, New York, NY.
  42. Crain, W.C. "Kohlberg's Stages of Moral Development". Theories of Development. Prentice-Hall. Retrieved 3 October 2011.
  43. Kohlberg, Lawrence (1984). The Psychology of Moral Development: The Nature and Validity of Moral Stages (Essays on Moral Development, Volume 2). Harper & Row. p. 195.
  44. Kohlberg, Lawrence; Richard Hersh (1977). "Moral Development: A Review of the Theory". Theory Into Practice. 16 (2): 53–58. doi:10.1080/00405847709542675.
  45. Gilligan, C.; Attanucci, J. (1988). "Two Moral Orientations: Gender Differences and Similarities". Merrill-Palmer Quarterly. 34 (3): 223–237.
  46. Metcalfe, J.; Mischel, W. (1999). "A hot/cool-system analysis of delay of gratification: Dynamics of willpower". Psychological Review. 106 (1): 3–19. doi:10.1037/0033-295x.106.1.3.
  47. Metcalfe and Mischel (1999)
  48. Baumeister, Miller, & Delaney (2005). Self and Volition.
  49. Muraven, Baumeister & Tice. Longitudinal Improvement of Self-Regulation Through Practice.
  50. Reynolds, S.J., & Ceranic, T.L. (2007). The effects of moral judgment and moral identity on moral behavior: an empirical examination of the moral individual. Seattle, Journal of Applied Psychology.
  51. Haidt, Jonathan (October 2001). "The Emotional Dog and Its Rational Tail". Psychological Review. 108 (4): 814–834. doi:10.1037/0033-295X.108.4.814.
  52. Greene, Joshua (2008). The Secret Joke of Kant's Soul. In: Sinnott-Armstrong W Moral Psychology, Vol. 3: The Neuroscience of Morality: Emotion, Disease, and Development. Cambridge, MA: MIT Press.
  53. Sinnott-Armstrong, Walter, ed. (2008). Moral Psychology. Cambridge, MA: MIT Press. pp. 35–79.
  54. Haidt, Jonathan, and Jesse Graham. "When Morality Opposes Justice: Conservatives Have Moral Intuitions That Liberals May Not Recognize." Social Justice Research 20.1 (2007): 98-116. Print.
  55. Narvaez, Darcia (2009). Personality, Identity, and Character. New York, NY: Cambridge University Press. pp. 396–440.
  56. Narvaez, D (2010). "Moral complexity: The fatal attraction of truthiness and the importance of mature moral functioning". Perspectives on Psychological Science. 5 (2): 163–181. doi:10.1177/1745691610362351.
  57. Narvaez, D. (2014). Neurobiology and the Development of Human Morality: Evolution, Culture and Wisdom. New York: WW Norton.
  58. Haidt, J. (2003). The moral emotions. p. 855.
  59. Haidt, J. (2003). The moral emotions. p. 853
  60. Hare (1981)
  61. Gewirth, 1984
  62. Haidt, J. (2003). The moral emotions. p. 853.
  63. Batson, Klein, Highberger, & Shaw (1995)
  64. Batson, Fultz, & Schoenrade (1987)
  65. Batson, C. D.; Klein, T. R.; Highberger, L.; Shaw, L. L. (1995). "Immorality from empathy-induced altruism: When compassion and justice conflict". Journal of Personality and Social Psychology. 68 (6): 1042–1054. doi:10.1037/0022-3514.68.6.1042.
  66. Decety, J. (2014). The neuroevolution of empathy and caring for others: Why it matters for morality. In J. Decety and Y. Christen (Eds). New Frontiers in Social Neuroscience (pp. 127-151). New York: Springer.
  67. Decety, J.; Cowell, J. M. (2014). "The complex relation between morality and empathy". Trends in Cognitive Sciences. 18 (7): 337–339. doi:10.1016/j.tics.2014.04.008.
  68. Decety, J., & Cowell, J. M. (2014). Friends or foes: Is empathy necessary for moral behavior? Perspectives on Psychological Science, 9(5), 525-537.
  69. Decety, J., & Cowell, J. M. (2015). Empathy, justice and moral behavior. American Journal of Bioethics – Neuroscience, 6(3), 3-14.
  70. Skitka, Linda (2002). "Do the means always justify the ends or do the ends sometimes justify the means? A value protection model of justice". Personality and Social Psychology Bulletin. 28: 452–461. doi:10.1177/0146167202288003.
  71. Morgan, G. S.; Skitka, L. J. (2011). "Moral conviction". In Daniel J. Christie. Encyclopedia of Peace Psychology. Wiley-Blackwell.
  72. Skitka, L. J.; Bauman, C.; Sargis, E. (2005). "Moral conviction: Another contributor to attitude strength or something more?". Journal of Personality and Social Psychology. 88: 895–917. doi:10.1037/0022-3514.88.6.895.
  73. Sober, Elliott; Wilson, David Sloan (1998). Unto Others: The Evolution and Psychology of Unselfish Behavior. Cambridge: Harvard University Press.
  74. Dean, Tim (2012). "Evolution and moral diversity". Baltic International Yearbook of Cognition, Logic and Communication. 7. doi:10.4148/biyclc.v7i0.1775.
  75. Narvaez, D. (2014). Neurobiology and the Development of Human Morality: Evolution, Culture and Wisdom. New York: W.W. Norton.
  76. Narvaez, D. (2014b). The co-construction of virtue: Epigenetics, neurobiology and development. In Nancy E. Snow (Ed.), Cultivating Virtue (pp. 251-277). New York, NY: Oxford University Press.
  77. Haidt, Jonathan; Rosenberg, Evan; Hom, Holly (2003). "Differentiating Diversities: Moral Diversity Is Not Like Other Kinds". Journal of Applied Social Psychology. 33 (1): 1–36. doi:10.1111/j.1559-1816.2003.tb02071.x.
  78. Motyl, Matt; Iyer, Ravi; Oishi, Shigehiro; Trawalterl, Sophie; Nosek, Brian A. (2014). "How ideological migration geographically segregates groups". Journal of Experimental Social Psychology. 51: 1–14. doi:10.1016/j.jesp.2013.10.010.
  79. Narváez, Darcia, "Triune Ethics Theory and Moral Personality." In D. Narvaez and D.K. Lapsley (Eds.), Personality, Identity, and Character: Explorations in Moral Psychology. New York: Cambridge UP, 2009. pp. 136–158. Print.
  80. MacLean, P.D. (1990). The triune concept of the brain in evolution: Role in paleocerebral functions. New York: Plenum.
  81. Narvaez, D (2007). "Triune ethics: The neurobiological roots of our multiple moralities". New Ideas in Psychology. 26: 95–119. doi:10.1016/j.newideapsych.2007.07.008. Chemically, this ethic is based on the hormone adrenaline and the neurotransmitter norepinephrine.
  82. Kalin, N.H (1999). "Primate models to understand human aggression". Journal of Clinical Psychiatry.
  83. Prescott, J.W. (1996). "The origins of human love and violence". Pre- and Perinatal Psychology Journal.
  84. Caspi, A.; Sugden, K.; Moffitt, T.E.; Taylor, A.; Craig, I.W.; Harrington, W.; McClay, J.; Mill, J.; Martin, J.; Braithwaite, A.; Poulton, R. (2003). "Influence of life stress on depression: Moderation by a polymorphism in the 5-HTT gene". Science. 301: 386–389. doi:10.1126/science.1083968. PMID 12869766.
  85. Narvaez, D (2007). "Triune ethics: The neurobiological roots of our multiple moralities". New Ideas in Psychology. 26: 95–119. doi:10.1016/j.newideapsych.2007.07.008.

References and further reading

  • Baron, J.; Spranca, M. (1997). "Protected values". Organizational Behavior and Human Decision Processes. 70: 1–16. doi:10.1006/obhd.1997.2690. 
  • Batson, C.D. (1987). Distress and Empathy: Two Qualitatively Distinct Vicarious Emotions with Different Motivational Consequences.
  • Batson, C. D. (1991). The Altruism Question: Toward a Social-Psychological Answer. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Batson, C. D. (1995). Immorality From Empathy-Induced Altruism: When Compassion and Justice Conflict.
  • Blasi, A (1980). "Bridging moral cognition and moral action: A critical review of the literature". Psychological Bulletin. 88 (1): 1–45. doi:10.1037/0033-2909.88.1.1. 
  • Decety, J.; Cowell, J. M. (2014). "The complex relation between morality and empathy". Trends in Cognitive Sciences. 18 (7): 337–339. doi:10.1016/j.tics.2014.04.008. 
  • Doris, John M. (2002). Lack of Character: Personality and Moral Behavior. New York: Cambridge University Press.
  • Doris, John & Stich, Stephen. (2008). Moral Psychology: Empirical Approaches. The Stanford Encyclopedia of Philosophy (Fall 2008 Edition), Edward N. Zalta (ed.). link
  • Haidt, J (2001). "The emotional dog and its rational tail: A social intuitionist approach to moral judgment". Psychological Review. 108 (4): 814–834. doi:10.1037/0033-295x.108.4.814. 
  • Jackson, Frank & Smith, Michael (eds.) (2007). The Oxford Handbook of Contemporary Philosophy, Oxford University Press.
  • Kochanska, G (2002). "Mutually responsive orientation between mothers and their young children: A context for the early development of conscience". Current Directions in Psychological Science. 11 (6): 191–195. doi:10.1111/1467-8721.00198. 
  • Lapsley, Daniel K. (1996). Moral Psychology. Westview Press. ISBN 0-8133-3033-5
  • Leffel, G.M. (2008). "Who cares? Generativity and the moral emotions, Part 2: A social intuitionist model of moral motivation". Journal of Psychology and Theology. 36 (3): 182–201. 
  • McGraw, A.P.; Tetlock, P.E.; Kristel, O.V. (2003). "The limits of fungibility: Relational schemata and the value of things". Journal of Consumer Research. 30: 219–229. doi:10.1086/376805. 
  • Mikhail, John. (2011). Elements of Moral Cognition: Rawls' Linguistic Analogy and the Cognitive Science of Moral and Legal Judgment. New York: Cambridge University Press.
  • "Moral psychology" (2007). Britannica Concise Encyclopedia. Retrieved December 6, 2008 from Encyclopedia.com: link
  • Narvaez, D. (2005). The Neo-Kohlbergian tradition and beyond: Schemas, expertise and character. In G. Carlo & C. Pope-Edwards (Eds.), Nebraska Symposium on Motivation, Vol. 51: Moral Motivation through the Lifespan (pp. 119–163). Lincoln, NE: University of Nebraska Press.
  • Narvaez, D. (2006). Integrative Ethical Education. In M. Killen & J. Smetana (Eds.), Handbook of Moral Development (pp. 703–733). Mahwah, NJ: Erlbaum.
  • Narvaez, D (2008). "Triune ethics: The neurobiological roots of our multiple moralities". New Ideas in Psychology. 26: 95–119. doi:10.1016/j.newideapsych.2007.07.008. 
  • Narvaez, D (2010). "Moral complexity: The fatal attraction of truthiness and the importance of mature moral functioning". Perspectives on Psychological Science. 5 (2): 163–181. doi:10.1177/1745691610362351. 
  • Narvaez, D (2012). "Moral neuroeducation from early life through the lifespan". Neuroethics. 5 (2): 145–157. doi:10.1007/s12152-011-9117-5. 
  • Narvaez, D. (2014). Neurobiology and the development of human morality: Evolution, culture and wisdom. New York, NY: W.W. Norton.
  • Narvaez, D., & Lapsley, D.K. (Eds.) (2009). Personality, Identity, and Character: Explorations in Moral Psychology. New York: Cambridge University Press.
  • Lapsley, D.K., & Narvaez, D. (Eds.) (2004). Moral development, self and identity: Essays in honor of Augusto Blasi. Mahwah, NJ: Erlbaum.
  • Nagel, Thomas. (1970). The Possibility of Altruism. Princeton University Press.
  • Nichols, Shaun. (2004). Sentimental Rules: On the Natural Foundations of Moral Judgment. Oxford: Oxford University Press.
  • Plato. The Republic, public domain.
  • Richardson, Henry S. (2008). "Moral Reasoning", The Stanford Encyclopedia of Philosophy (Fall Edition), Edward N. Zalta (ed.). link
  • Roberts, Robert C. Emotions: An Essay in Aid of Moral Psychology. Cambridge: Cambridge University Press.
  • Rozin, P.; Markwith, M.; Stoess, C. (1997). "Moralization and becoming a vegetarian: The transformation of preferences into values and the recruitment of disgust". Psychological Science. 8 (2): 67–73. doi:10.1111/j.1467-9280.1997.tb00685.x. 
  • Sinnott-Armstrong, Walter, ed. (2007). Moral Psychology, 3 volumes. MIT Press. ISBN 0-262-69354-2
  • Smith, Michael. (1994). The Moral Problem. Cambridge: Basil Blackwell.
  • Tetlock, P.; Kristel, O.; Elson, B.; Green, M.; Lerner, J. (2000). "The Psychology of the Unthinkable: Taboo Trade-Offs, Forbidden Base Rates, and Heretical Counterfactuals". Journal of Personality and Social Psychology. 78: 853–870. doi:10.1037/0022-3514.78.5.853. 
  • Thagard, Paul (2007). "The Moral Psychology of Conflicts of Interest: Insights from Affective Neuroscience". Journal of Applied Philosophy. 24 (4): 367–380. doi:10.1111/j.1468-5930.2007.00382.x. 
  • Wallace, R. Jay. (2007). "Moral Psychology", Ch. 4 of Jackson & Smith (2007), pp. 86–113.
  • Wallace, R. Jay (2006). Normativity and the Will. Selected Essays on Moral Psychology and Practical Reason. Oxford: Clarendon Press.
  • Katz, S. (1997). Secular morality. In A. M. Brandt & P. Rozin (Eds.), Morality and Health (pp. 295–330). New York, NY: Routledge.
  • Helweg-Larsen, M.; Tobias, M. R.; Cerban, B. M. (2010). "Risk perception and moralization among smokers in the USA and Denmark: A qualitative approach". British Journal of Health Psychology. 15: 871–886. doi:10.1348/135910710x490415. 
  • Brandt, A. M. (2004). Difference and diffusion: Cross-cultural perspectives on the rise of anti-tobacco policies. In E. A. Feldman & R. Bayer (Eds), Unfiltered: Conflicts over tobacco policy and public health (pp. 255–380). Cambridge, MA: Harvard University Press.
  • Hardy, S. A.; Carlo, G. (2011). "Moral identity: What is it, how does it develop, and is it linked to moral action?". Child Development Perspectives. 5 (3): 212–218. doi:10.1111/j.1750-8606.2011.00189.x. 
  • Teper, R.; Inzlicht, M.; Page-Gould, E. (2011). "Are we more moral than we think?: Exploring the role of affect in moral behavior and moral forecasting". Psychological Science. 22 (4): 553–558. doi:10.1177/0956797611402513. 
  • Jones, A.; Fitness, J. (2008). "Moral hypervigilance: The influence of disgust sensitivity in the moral domain". Emotion. 8 (5): 613–627. doi:10.1037/a0013435. PMID 18837611. 
  • Rest, J. R., Narvaez, D., Bebeau, M., & Thoma, S. (1999). Postconventional moral thinking: A neo-Kohlbergian approach. Mahwah, NJ: Erlbaum.
  • Rozin, P (1999). "The process of moralization". Psychological Science. 10 (3): 218–221. doi:10.1111/1467-9280.00139. 
  • Rozin, P.; Lowery, L.; Imada, S.; Haidt, J. (1999). "The CAD triad hypothesis: A mapping between three moral emotions (contempt, anger, disgust) and three moral codes (community, autonomy, divinity)". Journal of Personality and Social Psychology. 76 (4): 574–586. doi:10.1037/0022-3514.76.4.574. 
  • Narváez, Darcia, and Daniel K. Lapsley. Personality, Identity, and Character: Explorations in Moral Psychology. New York: Cambridge UP, 2009. Print.

This article is issued from Wikipedia (version of 11 November 2016). The text is available under the Creative Commons Attribution/Share Alike license, but additional terms may apply for the media files.