Introduction to Principles of Ethics and Morality
for Scientists and Engineers
IDEESE: International Dimensions of Ethics Education in Science & Engineering
University of Massachusetts Amherst
- Introduction: Ethics for Science and Engineering Professionals
- Considering Impacts
- Ethical Dilemmas: Perspectives and Roles
- What is Ethics about? Are there Universal Ethical Principles?
- Knowledge and Skills Required for Ethical/Moral Decisions
- Bringing Technical Skill and Knowledge to Bear
- Spectrums of Moral Values
- Ethical Dialog and Deliberation
- International Implications
- References and Suggested Readings
This document is part of an NSF-funded project at the University of Massachusetts Amherst that is developing online case-based ethics curricula for science and engineering disciplines. The online materials include a set of cases based on real events with international ethical dimensions, including the incidents leading up to the Bhopal chemical plant disaster, the reporting of the SARS epidemic, and the international regulation of genetically modified foods. Each case module has a description and supplementary resources, including reference material; a set of interviews with fictitious stakeholders; and an online discussion forum structured to promote particular types of discussion, analysis, and reflection.
The online activities are designed to be modularly introduced into graduate and undergraduate college classes, so in some scenarios students will be exploring these cases with little or no introduction to basic topics in ethics. This document, sections of which might be assigned as required or optional reading, is provided to give students an overview of some overarching concepts and themes in ethics, particularly as they relate to the professional experiences of scientists and engineers. It should provide students some vocabulary and ideas to deepen their analysis and discussion of the case studies. It may also serve as a guide to instructors by outlining themes and discussion questions to expand upon.
Harris et al. (2000, pg. 19) report on five key objectives for ethics instruction in higher education developed by an interdisciplinary group of educators:
- Stimulating moral imagination (to consider multiple perspectives, anticipate the potential consequences of technical and other decisions; and consider creative solutions);
- Recognizing ethical issues (because the ethical dimensions of situations are not always apparent);
- Developing analytical skills (such as critical thinking, synthesizing information, and identifying areas of agreement to clarify differences or controversies);
- Eliciting a sense of responsibility (an active sense of caring and moral obligation, which can come into play even for seemingly minor decisions);
- Tolerating disagreement and ambiguity (the vagueness and ambiguity of ethical questions can be particularly frustrating or challenging for engineers and scientists, who are used to working on well-defined problems and expecting a certain rigor and objectivity).
This document builds upon these themes as they relate to scientists and engineers. For more information about the online curriculum and the design of ethics curriculum for science and engineering students, see Murray et al. 2009.
Engineers and scientists play a vital role in society. Their work is critical to many aspects of economic and social progress, to public health, safety, and wellbeing, and to the general advancement of human knowledge. Contributing in these ways requires not only technical competence, but also imagination, persistence, and integrity. On the job, engineers and scientists regularly make decisions that have ethical significance or moral relevance. The ethical/moral implications of these decisions may be small scale and personal, for example how one speaks of co-workers around the water cooler; they may be medium scale, for example affecting the profitability or reputation of one's company or department, or they may be wide-reaching, for example affecting the health and safety of thousands of people in years to come. Scientists and engineers can also be involved in regulatory or oversight functions, in panels and bodies that recommend or set ethics-related policies for organizations and governments, and in citizen-based advocacy and reform efforts. In these ways the roles, duties, and societal contributions of scientists and engineers go well beyond the purely technical aspects of research, development, and implementation.
Almost all ethically significant situations involve dilemmas where there are multiple interrelated needs, goals, or rules and no completely satisfactory solution--that is, any course of action will result in satisfying some goals and sacrificing others. In such predicaments, making decisions that are most likely to lead to positive outcomes for oneself, one's organization or cohort, and/or society in general requires certain knowledge, skills, and attitudes (cognitive capacities and character traits). The cognitive capacities called for in ethical dilemmas include knowing the laws or norms applicable to the situation, asking the right questions, being open to a sufficient scope of information and concerns, reflecting on one's own biases and values, and communicating clearly, as well as any specific technical knowledge related to the situation. Such skills and attitudes may develop "naturally" or "automatically" for some, but for most, they develop through the social mechanisms of observing others, trial-and-error learning, instruction, and assimilating social conventions and norms.
To enable a more ethical or moral society, these capacities need to be actively nurtured in both individuals and social systems. Yet education in engineering and science disciplines is often less focused on developing student responses to moral or ethical dilemmas than on imparting technical knowledge and skills. One way to train the mind to think deeply and flexibly about ethical dilemmas is to explore and evaluate realistic situations with peers and mentors. Though there are some general skills in moral/ethical reasoning, it is also true that the most useful concepts or approaches to handling dilemmas can differ according to context, so it is beneficial to gain practice "muddling through" situations similar to those that one expects to encounter as a professional.
You will be discussing one or more case studies involving ethical themes and dilemmas in your class. The overview of ethical themes below is intended to provide concepts and principles that will expand and refine your analysis and discussion.
One can and should evaluate the potential impacts of scientific/technical development, research, and implementation from many perspectives, including:
- Health and safety concerns
- Environmental impact
- Socio-political-economic effects on affected organizations, regions, or nations
- Interactions with prevailing cultural (or institutional) norms and values
Because engineers and scientists do not usually have deep expertise in the topics listed above, they must rely on outside information, opinion, and expertise. They must also be willing to engage in the creative and imprecise process of imagining hypothetical implications and webs of implications pointing to possible scenarios. While engineers and scientists often do not own the responsibility, nor have the time, to research and follow up on the ethical issues of impact, they can play a critical role in making sure that the right questions are asked and that the necessary technical information is accessible.
For instance, engineers and scientists can bring greater ethical awareness and focus to projects by raising ethical questions such as "How might this technology be used to harm or mislead people?" and "Do we know enough about the cultural norms of the implementation site?" Though investigating and responding to such questions may be the responsibility of managers, business experts, and policy makers, technical specialists can and should use their unique positions as professionals to voice their morally/ethically relevant concerns and advice.
Questions to consider:
- What are the core technical or scientific innovations of the case?
- What are potential impacts in each of the four impact-areas listed above? Include possible positive and negative impacts.
- Who is impacted, and how? Are those who bear the negative impacts the same as or different from those who receive the benefits?
- Take this analysis a few steps further: for each potential impact, what might it lead to? Imagine future scenarios in which some of these implications occur.
- What could be done to support the potential positive impacts and reduce the potential negative impacts? In the large sense, does the value gained from the project or technology outweigh the downsides?
- Consider the history of the introduction of existing technologies you are aware of. How does the "story" of the actual impacts of these technologies affect your analysis?
Professionals make dozens or even hundreds of decisions of ethical/moral import every day, but most of these are not ethical "problems" per se because they pose no great challenge--they do not cause one to pause to reflect upon the right choice of action. Our focus here is of course on the situations that do engage reflection in this way, though it must also be said that an important aspect of good ethical reasoning is recognizing that situations have ethical import and thus warrant reflection, which can mean stepping outside of one's habitual thought patterns.
We have noted that such situations are almost always dilemmas having a number of important factors that can't all be maximized. Work projects involve constraints and goals including being on time, within budget, and making efficient use of human and physical resources. Any change in time, cost, materials, personnel, etc. in the design, research, manufacturing, or deployment of a product or process may have a measurable effect on safety, quality, affordability, accessibility, sustainability, or environmental impact. One wants to "do right by" coworkers, managers and executives, customers and clients, community and country, family, and one's conscience. Balancing all of these factors, even being open to considering a full range of them, takes effort and skill. And yet doing so is exactly what our moral/ethical sense calls one to do.
One can think of an ethical dilemma's competing constraints and needs in terms of (1) stakeholders, and (2) personal roles. First, it is important to consider the perspectives of a full range of stakeholders, those who might be (or have been) affected by the project or technology, including customers, coworkers, suppliers, business interests, cultural factions, and citizens at large. Thinking flexibly and engaging in dialog to identify stakeholder groups and assess their needs, motivations, and constraints is an important part of the life cycle of any product or service.
Second, it is useful to reflect upon the multiple roles that one can hold within oneself. Multiple personal roles such as the following create a complex jumble of conflicting demands and needs within an individual. One can be:
- An employee—who has a duty to help with the profit and goals of one's company, and
- A supervisee and co-worker—who is motivated to keep one's boss happy and coworkers happy, and
- A citizen—who is concerned about the success and security of one's country, and the wellbeing of one's countrymen, and
- A household provider—who needs to keep one's job to support his/her family, and
- A professional—who has pride in one's profession and wants to uphold its regulations, principles, and reputation, and
- A global citizen—who wants the best for all people, present and future, and cares about the earth and environment, and
- A human being—who avoids unnecessary personal suffering, loss, embarrassment, confrontation, effort, etc.
Note that within these roles there may be further conflicts between short vs. long term interests. (In addition, one can acknowledge that other participants and stakeholders are also contending with such multiple personal roles.)
In response to an ethical dilemma one usually has to prioritize one or more stakeholders and/or personal roles at the expense of others. Creative thinking will help an individual or group find solutions that meet as many needs as are practically possible, sometimes even satisfying all parties. Still, in many situations there are difficult tradeoffs to be made. Usually just the exercise of being explicit and transparent about the pros and cons, and the various perspectives on the issue, is of general benefit to others.
Questions to consider:
- Pick a key decision that was made or needs to be made in your case. List two or more possible alternatives for that decision.
- List the stakeholders (or stakeholder types) affected by that decision. Describe their perspectives on the situation.
- For each stakeholder, how would they be affected by the possible decisions? What might they have to say about it (pros and cons)?
- Put yourself in the shoes of the person making the decision. Consider the various roles held by that person. Evaluate the pros and cons of the possible decisions from each of these roles (feel free to add other roles that seem relevant).
Though our focus here is on practical issues and skills applicable to professional ethics, it is worthwhile to briefly examine some general themes from ethical theories and the study of ethics. Doing so will introduce concepts and principles that help one think and dialog about ethical dilemmas, and it might provide guidance for on-the-job behavior.
What is ethics (or equivalently here, morality) about? There are a number of prominent perspectives on or approaches to this question:
- Character or values: Ethics is about the articulation and development of specific character traits (such as honesty, generosity, etc.) or specific values (such as freedom, responsibility, autonomy, etc.).
- Utility: Ethics is about behaviors aiming to achieve the greatest happiness, wellbeing, or welfare for the greatest number of people (the "greatest good"). A related line of reasoning is that the ethical value of an action depends on its actual consequences (i.e. one's intentions, feelings, or character do not matter much).
- Duty/obligation: Ethics is about adhering to moral/ethical principles or codes of conduct. The principles can come from authority figures, an authoritative source such as a religious text, or a general principle such as the Golden Rule (to do unto others as you would have them do unto you) or Kant's categorical imperative (to act only according to those principles/rules that could serve as universal laws for everyone to follow).
- Human Rights: Ethics is about honoring "human rights" (or claimed non-negotiable rights of a particular group). This view holds that people have certain inalienable rights that take precedence over other needs and desires. Those emphasizing human rights often also emphasize the role of personal or socio-political power differentials in the ethical evaluation of situations.
- Procedural: Ethics is about the methods used to communicate, reason, and make decisions in such a way as to produce the most ethical outcomes. The outcome is not as important as using valid (inclusive, democratic, transparent, free and fair, logical, etc.) procedures for making the decision.
- Ethics of care: Ethics is about human interdependence and vulnerability, the quality of human relationships, and the ways that people care for and about each other. General principles (such as the Golden Rule) should be used with great caution, as the specific details of each situation are important.
Though scholars have historically argued for the preeminence of one or another of these perspectives on how ethics should be approached, in the end there is no single agreed-upon way to determine or argue for what is morally right or just in a given situation (or in general). Each approach contains an intuitive grain of truth that reflects how real people think about real ethical situations. Any and all of the above perspectives may be used to evaluate a given situation, and all of these perspectives have been shown to lead to problematic conclusions if used exclusively (e.g. reasoning for the greatest good alone can result in ignoring the needs of minority populations; acting based on feelings or convictions of care without paying attention to utility can result in unintended consequences; procedural ethics can work if all parties agree to the procedure, but determining an ethically valid procedure cannot rely solely on procedural methods). Despite the unfortunately messy nature of ethics in real situations, it is vital to be able to reason about ethics, not to determine the right ethical solution to a problem (which may not exist as such), but to be able to make sincere efforts to find a (or any) solution. When ethical dilemmas are discussed, many of the proposed solutions can be problematic, due to misinformation, illogical thinking, narrow perspectives, etc. The process of weeding out ineffective solutions is a critical one even when finding a final solution is very difficult.
The perspectives listed above are general to all ethical situations. Informed dialog about any particular domain of ethical consideration (e.g. environmentalism, abortion, genetically modified organisms, distribution of wealth, etc.) requires a familiarity with the particular principles and common lines of argument used by the contrary sides of the issue. It also usually requires input from technical expertise in the appropriate domain and the gathering of extensive factual knowledge.
Ethics is in part a matter of following rules or principles established by authorities of various types. Those working in technical professions are under the purview of explicit rules, including laws, regulations, and codes imposed by governments, companies, and oversight and standards associations. There are also implicit obligations or duties, which, though not written down, strongly govern expected practice.
Yet the dilemmas one encounters in these modern (or "post-modern") times are often not amenable to the easy and straightforward application of rules. Rules from various sources can be in conflict. Conventions and norms of practice can be in flux. Situations can arise that the rule-makers did not anticipate. Just as we may need to compare and prioritize the conflicting needs or values of others and within ourselves, we may need to compare and prioritize the rules that are applicable to a situation.
Questions to consider:
- Pick a key decision that was made or needs to be made in your case. List two or more possible alternatives for that decision.
- What authoritative (or official) rules, policies, or guidelines apply to this decision? What informal or implicit rules or duties apply?
- Evaluate the possible decisions and their outcomes from at least three (and up to six) of the approaches to ethics listed above.
- Compare and contrast these approaches. Are some more useful or intuitively valid than others? Do some generate more specific guidance than others?
- For you, which is more important to evaluating situations with ethical import: action (behavior), intent (and reasoning), or feeling (quality of care and connection)? And why do you find it so? Does your answer differ for different situations?
- Were the official rules/guidelines clear and useful enough? Were they too general, vague, or simple?
Real life dilemmas are not always black and white--there can be grey zones and slippery slopes of ambiguity, uncertainty, and complex interacting factors. However, this does not mean that "anything goes," that all opinions are equally valid, or that it is useless to ply the tools of reason and dialog toward finding satisfactory solutions to real problems. General knowledge, skills, and attitudes and specific knowledge and skills are vital.
First, general critical and logical thinking skills are important. Many of the ethical arguments one hears from peers or in the public sphere are dubious or blatantly erroneous because they rely on any one of a large number of logical fallacies. These include "confirmation bias" (interpreting a situation so as to confirm one's preconceptions), "false dichotomy" (treating two alternatives as the only possible options when in reality there are more), "ad hominem" attacks, and "appeal to authority or tradition" (see the Wikipedia entries on cognitive biases and logical fallacies).
Research into human cognition has shown that rationality is "bounded," that is, numerous non-logical cognitive biases are practically hard-wired into human thought. Those who commit them may be wrong, but they are not necessarily ignorant. Avoiding them requires diligence and skill, and research indicates that avoiding many of them becomes more difficult in issues for which we have more at stake or more emotional charge. Metacognitive skills and emotional/social intelligence skills can therefore be just as important as logical reasoning skills. Dilemmas are often charged situations, where the ability to reflect on one's own thinking, emotional state, biases, and assumptions is vitally important.
Ethical/moral deliberation always includes considerations of both fact and value, that is, arguments about what is true and arguments about what is good, right, or just. Disagreements that seem to be about values often turn out to be disagreements about facts (what is really the case) when participants probe further. The focus of an ethical debate will often shift as new facts become apparent. Clear ethical debate will try to separate questions of fact and value. It will identify agreed-upon facts, conflicting facts (and whether it is possible to resolve these differences), which facts really matter, and relevant facts that are unknown (and how to find them).
Complicating factors such as the trustworthiness of sources of information may surface along the way. Of particular difficulty in the realm of facts are causal principles and predictions, which claim that certain factors will inevitably, or usually, lead to certain outcomes (as part of a moral argument supporting some course of action). Such claims cannot be proven, and their acceptance depends in part on the strength of the causation argument. For example, one might hold firmly to the notion that genetically modified foods will lead to higher crop yields. This hypothesis may be treated like a "fact" by some and not by others. Ethical analysis benefits from the skills of checking sources and assigning hypotheticals the appropriate degree of uncertainty.
Another common complication is in the definition and application of terms or concepts. Abstract terms such as "responsibility," "equality," "human rights," "the middle class," "conflict of interest," and "environmentally dangerous" can be used without acknowledging that different parties may hold very different meanings for the term. Rules, guidelines, and principles may include abstract concepts. Even when there is agreement on the definition of a concept, there can be disagreement concerning its importance in a situation or what things constitute correct instances of the concept. Rigorous dialog helps to map out the range of these concept meanings and ground meanings in concrete examples.
An important emotional/social skill is "cognitive empathy"—being able to put oneself in another's shoes. This is important for many reasons. The most obvious is that one's "moral compass" and inclination to care about others is directly linked to one's ability to imagine the suffering that others are, were, or could be experiencing. But one's ability to understand another's argument also depends on cognitive empathy. It has been argued that there is an overemphasis in secondary education on "critical thinking skills" without a balancing emphasis on "listening skills" (see Elbow 2005). This critique notes that too often "critical" faculties are used to discredit arguments that are at odds with one's own preconceptions. It claims that we overuse critical thinking that insulates one from discomforting or effortful lines of thought, while we underuse critical thinking that reflects on our own beliefs.
Included as important skills for thinking about ethics are an acute awareness of what we don't know and a propensity to reflect on the certainty with which one holds one's beliefs. An open curiosity about the beliefs and experiences of others opens up the space necessary to inquire into many of the complications mentioned above, such as whether participants hold the same meaning for important terms. Thus the attitudes or traits of humility, openness, patience, persistence, and curiosity are important, not only as moral ends in themselves, but as a means to reason effectively about what is true.
Questions to consider:
- In the case you are considering, what questions of fact are in dispute or uncertain? What questions of value--of what is good, right, or just--are there disagreements about?
- Learning new facts can change one's perspective about what is right. Did this happen at any point in this case (for you or the actors in the case)? Are there unknown matters of fact that would have significant impact on how one evaluates the ethics of the situation?
- Do some actors in the case hold to hypotheses or predictions with the certainty of facts? If so, are they justified in doing so?
- Where is the trustworthiness of sources of information an issue in the case? What could be done to increase the trustworthiness of, or your certainty in, the information available?
- Are there terms or concepts that have widely differing meanings or associations for the parties in the case? Or for the discussion participants in your class? How does this affect people's ability to understand each other and what could be done to improve the situation?
- Reflect on the skill of "cognitive empathy" for this case: to what degree did the actors in the case have it or need it? To what degree do you see it in your own thinking and in that of other students discussing the case?
- Reflect upon the traits of humility, openness, and curiosity in the actors in the case. When these traits were evident, how did this seem to affect the outcomes? How might things have gone differently if these traits were more strongly established in some of the actors?
Thus far we have been discussing general skills important to ethical reasoning. Specific knowledge and skills also come into play. In professional contexts one should be aware of the standards, laws, norms, and precedents that apply. These may come from unions, professional societies, regional or national laws, non-governmental associations, oversight and regulatory bodies, quality control units, societal expectations, etc. One should be able to determine the spheres of action, duties, responsibilities, and liabilities of those in responsible (supervisory), acting (direct operation), or responsive (emergency response, damage control, accident investigation) roles.
In each domain of ethical debate (e.g. information privacy, environmental degradation, genetically modified organisms, distribution of wealth, etc.) informed reasoning requires a familiarity with the common arguments and principles put forth, and the commonly noted strengths and flaws of each.
Topics in professional ethics. In the resource materials provided with the case, one may find links to material covering general professional norms and profession-specific guidelines for areas including:
- Conflicts of interest, accepting gifts, bribery, perks, client relationships
- Intellectual property (trade secrets, patents, trademarks, copyrights)
- Publishing, authorship, plagiarism, peer review
- Confidentiality, whistle blowing
- Fraud, forgery, data fabrication, perjury
- Public relations, media, marketing
- Professional vs personal duties; legal vs moral responsibility
- Public safety, health, and welfare
- Social norms, social pressures, cultural taboos
- Public service, expert testimony, responsibility to inform the public
- Determining risk, liability, accountability, rights, and responsibilities
Though it is very important to know and understand the rules and authorities under which one's professional work is governed, and to have the conviction and integrity to follow those rules, it is equally important to develop skills to reflect on those rules (their purpose, scope, and limitations) and reflect on the relationship between one's personal values and what the rules imply--in other words, to reflect upon the process of moral decision making itself. In fact much of ethics deals with how one explains or justifies one's or another's actions as morally right or wrong, and what principles and concepts one brings into play to evaluate or prescribe behavior.
Questions to consider:
- What technical or scientific knowledge is important to this case? What is the appropriate role of technical experts here?
- For the case in consideration, what official rules, policies, and guidelines are relevant?
- What are the relevant bodies or organizations making the rules and performing oversight functions (local, national, and international, as applicable)?
- Do you think the existing operative rules and policies are appropriate and sufficient? Are they reasonable? Specific enough? Up-to-date?
- What rules or norms should guide professional behavior? Are there professional organizations that have published relevant guidelines for workplace practices?
- Which of the "ethics topics" above apply to the case?
- If the case materials or your instructor have provided them, describe and reflect upon the common lines of reasoning that have been followed regarding the specific topic of the case.
Harris et al. (2000, p. 35) propose that professional ethics can begin with a minimalist set of moral values that includes "positive duties of mutual support, loyalty, and reciprocity; negative duties to refrain from harming others; and the norms for determining just procedures in resolving issues of justice." This set leaves room for personal and cultural differences yet provides a basis for framing codes of professional ethics.
Most people would agree to values or ethical claims phrased at this level of generality. Disagreements arise over how these values are interpreted and applied in real contexts and in how priority is assigned to various duties or values. Identifying areas of agreement is good, as it provides a starting point for dialog. But one must also be cautious that group dialog processes do not end with or overly focus on areas of agreement, as doing so can merely produce a "feel good" atmosphere where "all parties have been heard," but that glosses over real and important differences of opinion and leaves outcomes too general to be of practical application.
One common difference in value priorities can be explained along a spectrum of order-oriented values (sometimes associated with politically conservative/traditionalist values) vs. change-oriented values (sometimes associated with politically liberal/progressive values). Order-oriented lines of reasoning tend to prioritize obedience to authority and duty, and to value virtuous character traits such as honor and respect, self-discipline and courage, tolerance and generosity, and honesty and integrity. In evaluating morality, order-oriented individuals tend to focus on personal responsibility and to value actions and consequences over intentions, feelings, or reasons (since rationalization and self-justification all too easily seduce one into unethical behavior..."the road to hell is paved with good intentions"). They also tend to give more weight to cultural or social conventions (norms, etiquette, and taboos) than to universal moral obligations.
On the other side of the spectrum are change-oriented lines of reasoning. They tend to prioritize awareness of power dynamics and power imbalances and have a more critical stance toward rules imposed by authorities or cultural norms. They emphasize dialog, reflection, intention, understanding multiple perspectives, the quality of relationships, and the often complex process of balancing competing interests.
Of course the generalities noted above have many exceptions, and are described to illustrate a common range of value priorities found in many discussions (not to categorize people into one group or predict how a group will argue). Another common difference in values can be described in terms of prioritizing individual freedoms and rights vs. obligations and duties to the group or collective good.
Summary of some dimensions of values priority:
- order (stability; holding onto the things that work; minimizing chaos/uncertainty) vs. change (innovation; changing the things that don't work; adapting to or accepting chaos/uncertainty)
- authority, duty, honor, respect, self-discipline, courage vs. flexibility, openness to perspectives, dialog, quality of relationships
- action, outcomes, and consequences vs. intentions, feelings, opportunity, and reasons
- personal responsibility and virtuous character vs. collective responsibility and systemic causes (including the dynamics of personal and political power)
- cultural/social conventions (norms, etiquette, and taboos) vs. universal moral obligations, suspicion of cultural/social rules and norms
In a very rough way, the values on the left side above correspond with order-orientation and those on the right with change-orientation. You may notice that your values lean toward one end for some dimensions, and the other end for other dimensions. You may also notice that which side of these dimensions you align with changes according to the situation or topic. Evaluating where one falls along each of these dimensions separately may enrich dialog and opinion justification.
Differences such as those noted above tend to be deep seated and tied to one's identity and core moral intuitions. Thus, it may not be practical to look for agreement at this level of generality regarding which of the competing values concepts are more important--the conversation can become too "philosophical" and ungrounded. Dialog can be more effectively focused on (1) articulating and honoring diverse values; and (2) focusing on the concrete implications of possible courses of action. People often agree more easily that a certain outcome is preferable (or undesirable) than on why it is preferable (or undesirable) in more abstract or philosophical terms (this phenomenon has been called "convergence": participants on different sides of a debate converge on a solution without necessarily having altered their values or arguments).
Questions to consider:
- Where do each of the key actors in the case seem to fall along the order vs. change-orientation spectrum described above? Along each of the separate dimensions listed?
- Where do you seem to fall along these dimensions? Does your explanation of your most important values depend strongly on the situation?
- Does thinking about these values dimensions help your thinking and debate about the case, or does it get in the way?
- Are there differences of opinion or values that seem relatively fixed for participants? (You can answer this for actors in the case or the participants in your class discussion).
- Are there aspects of the situation where it may not be viable to work toward mutual agreement? Can satisfactory solutions be found if the different perspectives "agree to disagree?"
Answering ethical questions such as "what is the right thing to do?" "was a decision (or action) just?" and "who (or what) is at fault?" will usually involve investigation, dialog, deliberation, and decision making with a group of people. In difficult cases, the relevant facts and values are often obscure and can take considerable time and effort to sort out. In such cases, it is appropriate to be suspicious of those with immediate ethical certitude, who believe that dialog and deliberation are not needed.
The principles of just, efficient, and valid dialog and deliberation are outside the scope of this document, but they are important, so we mention the topic briefly here and encourage the reader to investigate further on his/her own. (See Suggested Readings below.)
The following material is taken directly from M.J. Peterson's "Transnational Aspects of Ethical Debate" which is included in the curriculum materials for this project (see http://www.umass.edu/sts/ethics):
Most moral philosophy and ethical discussion assumes that everyone involved in or observing the situation shares the same broad values, expresses them in similar rules, and gives the values similar weight when balancing between competing rules. Ethical arguing becomes more complicated when different people maintain non-identical sets of values (for instance, individualists who emphasize autonomy and individual freedom and communitarians who emphasize membership in groups and allowing groups room to follow their way of life), express the same value in different rules (for instance, believe that humans have a right to life but disagree about abortion because some define “life” as beginning at the moment sperm and egg trigger the process of fetus development and others define it as beginning at the point a fetus could survive outside the womb), or maintain different hierarchies among values (for instance, a situation in which some regard privacy as more important than public access to information about past criminal records and others regard knowing the whereabouts of repeat pedophiles who have finished serving their jail terms as more important than privacy).
…In today’s globalized world, ethical theory and moral philosophy also have to address the challenge of cultural moral relativism, the idea that the different ethical beliefs of the many societies around the world deserve equal respect whatever their content and whatever the content of the rules derived from them. Arguments in favor of cultural moral relativism start from the well-established observation that traditions of ethics and morality and the sets of rules derived from them do vary from one society to another. The next step in such arguments is to claim that no society has the right to criticize the ethics or ethical rules of another.
…One of the strongest arguments against cultural moral relativism claims that there is a universal human nature or a universal set of human needs which lead to adoption of similar basic moral values in all cultures. Cultural moral relativism is not consistent with universal human rights, for example. Adherents of this view further argue that most of what appear to be cultural differences in ethical systems are differences in how people interpret and apply these similar basic beliefs in particular situations.
…The distinctive element of transnational ethical differences is the need to be particularly sensitive to the question of how far the differences of view expressed by participants depend on culturally-derived differences in judgments and/or standards. Whereas national ethical debates proceed against the background of a thick set of shared cultural references and practices, transnational ethical debates do not. Clarification of terms may have proceeded along different paths, making a literal translation of a phrase from one language into another misleading. The moral codes may be different in significant ways. The process of arguing by example and counter-example can be slowed down, though very likely enriched, by the different exemplary stories familiar in various cultures. These differences mean that participants in transnational ethical debates must be willing not only to hear the questions and explanations of others but to elaborate their own positions and explanations in ways that help participants from other cultures understand them accurately. This requires making one’s own tacit assumptions explicit, something that can be difficult because the background knowledge provided by a culture is so taken for granted that a participant may have trouble bringing relevant parts of it into active memory where it is available for conscious expression. Yet if enough participants make this effort the result will be a better informed debate all around even if in the end participants “agree to disagree” and design a solution allowing divergent approaches rather than settling on a common one.
Questions to consider:
- For the case under consideration, discuss how cultural differences and historical conditions among different cultures affect their values, needs, and moral codes.
- In dialogs with other students around this case, were there instances where examples or life stories helped clarify someone's values or the terms being used, and created deeper mutual understanding?
- Can one society or culture rightly pass ethical/moral judgments on the rules or norms of another?
- Are moral principles socially constructed and relative, or are they more universal? Can ethical principles be universal or must they always be contingent? Are some relative while others are universal?
Additional materials by MJ Peterson are found on the IDEESE curriculum site at http://www.umass.edu/sts/ethics. These cover themes including:
- Workplace Ethics in Transnational Contexts. Professional codes of ethics may not be consistent across countries.
- International Accountability. Bodies and mechanisms at several international levels hold researchers, research institutes, firms, and others accountable to society.
- Transnational Diffusion of Ideas and Practices. Understanding the processes by which ideas and debates diffuse across countries.
- Transnational Conduct. Effective participation in cross-border scientific cooperation requires sensitivity to the implications of differences in national ethics and standards.
- Variation in International Regulatory Processes. International regulatory processes vary widely, spanning multilateral intergovernmental organizations such as United Nations Conferences, regional conferences or commissions, and other international bodies, including private industry standards-setting bodies.
- Responsible Participation. Scientists and engineers participate in international regulatory processes in a variety of ways. These include: epistemic communities, professional associations, scientists as citizen-advocates, scientists as employees of private organizations, and scientists as government officials.
Below are some questions related to these themes (from IDEESE, 2009).
- We often speak of living in a “globalized” world, which is a way of saying that there is now more interconnection among countries and their societies.
- What are the sources of this interconnection?
- What kinds of activities occur across borders?
- Are there concerns about social and environmental implications of international activities?
General Ethics, Ethics Curriculum, Ethics for Science and Engineering
Goodman, Joan F. & Lesnick, H. (2001). The Moral Stake in Education: Contested Premises and Practices. New York: Longman.
Harris, Charles E., Pritchard, Michael S., & Rabins, Michael J. (2000). Engineering Ethics : Concepts and Cases. 2nd Ed. Belmont, CA: Thomson/Wadsworth.
IDEESE (2009). Slides on "International Ethics," available at http://www.umass.edu/sts/ethics.
Murray, T., Ake, J., Peterson, M.J. & Fountain, J. (2009). Online Curriculum and Dialog Design for Ethics Skills for Science and Engineering Students. Submitted to E-Learn-2009.
Whitbeck, Carol (1998). Ethics in Engineering Practice and Research. Cambridge Univ. Press.
Psychology of decision making; human reason and emotion; cognitive biases
Berreby, D. (2005). Us and Them. NY: Little Brown and Co.
Damasio, A. (1999). The Feeling of What Happens: Body and Emotion in the Making of Consciousness. Harcourt: NY.
Elster, J. (1999). Alchemies of the mind: Rationality and the emotions. Cambridge, UK: Cambridge University Press.
Feldman, R. (2009). The Liar in Your Life. NY: Hachette Book Group.
Gilbert, D. (2006). Stumbling on Happiness. NY: Knopf Random House.
Gilovich, T. (1991). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. NY: Free Press.
Goleman, D. (1985). Vital Lies, Simple Truth; The Psychology of Self-deception. Simon & Schuster: New York.
Goleman, D. (1995). Emotional Intelligence. Bantam Books: NY.
Haidt, J. (2006). The Happiness Hypothesis: Finding Modern Truth in Ancient Wisdom. NY: Basic Books.
Matthews, G, Zeidner, M, and Roberts, R (2002). Emotional intelligence: Science & myth. Cambridge, MA: Bradford Book/MIT Press.
Sunstein, C.R. (2002). Risk and Reason. Cambridge Univ. Press: NY.
Tavris, C. & Aronson, E. (2007). Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. Harcourt Inc.: NY.
Tversky, A. & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185, 1124-1131.
Ethics, Trust, Deception
Aronson, E. (1992). Age of Propaganda; The Everyday Use and Abuse of Persuasion, W.H. Freeman and Company: New York
Bok, S. (1978). Lying: Moral Choice in Public and Private Life, Vintage Books, New York
Bok, S. (1983). Secrets: On the Ethics of Concealment and Revelation, Vintage Books, New York
Callahan, D. (2004). The cheating culture: Why more Americans are doing wrong to get ahead. Orlando, Florida: Harcourt Books.
Feldman, R. (2009). The Liar in Your Life. NY: Hachette Book Group.
Flores, F. & Solomon, R. (2001). Building trust in business, politics, relationships, and life. New York, NY: Oxford University Press.
Habermas, J. (1990). Moral Consciousness and Communicative Action (translated by C. Lenhardt & S.W. Nicholsen). MIT Press, Cambridge, MA.
Kofman, F. (2006). Conscious Business: How to build value through values. Sounds True, Boulder CO
Solomon, Robert & Flores, Fernando (2001). Building Trust in Business, Politics, Relationships, and Life. Oxford Univ. Press: NY.
Dialog, Deliberation, Group Decision Making, Conflict Resolution, Listening
Atlee, T. (2003). The Tao of Democracy: Using Co-Intelligence to Create a World That Works for All. The Writers Collective, Cranston, RI.
Bohm, D. (1996). On Dialog (Edited by Lee Nichol). Routledge, NY.
Butler, C.T.L. (1987). On Conflict & Consensus: a handbook on Formal Consensus decision-making. FoodNotBombs Publ. Portland, ME.
Deutsch, M. (1973). The Resolution of Conflict: Constructive and Destructive Processes. Yale University Press, New Haven, CT.
Elbow, P. (2005). Bringing the rhetoric of assent and the believing game together—And into the classroom. College English, (March).
Fisher, R. & Ury, W. (1981). Getting to Yes: Negotiating Agreement Without Giving In. Penguin Books, NY.
Katsh, E. & Rifkin, J. (2001). Online Dispute Resolution. Jossey-Bass, San Francisco.
Mindell, A. (2002). The Deep Democracy of Open Forums. Hampton Roads Publ., Charlottesville, VA.
Postman, N. (1985). Amusing Ourselves to Death: Public Discourse in the Age of Show Business. Penguin Books: NY.
Robert, H. M. (1967). Robert's Rules of Order. NY: Pyramid.
Rosenberg, M. (1999). Nonviolent Communication: A Language of Compassion. PuddleDancer Press, Encinitas, CA.
Rosenberg, M. (2005). Speak Peace in a World of Conflict: What You Say Next Will Change Your World. Encinitas, CA: PuddleDancer Press.
Adult Developmental Theory
Basseches, M. (1984). Dialectical Thinking and Adult Development. Norwood, NJ: Ablex Publishing.
Cook-Greuter, S. (2005). Ego Development: Nine Levels of Increasing Embrace. Available at www.cook-greuter.com.
Fischer, K. (1980). A theory of cognitive development: The control and construction of hierarchies of skills. Psychological Review, 87(6), 477-531.
Kegan, R. (1994). In Over Our Heads: The Mental Demands of Modern Life. Harvard Univ. Press: Cambridge.
King, P.M. and Kitchener, K.S. (1994). Developing reflective judgment: Understanding and promoting intellectual growth and critical thinking in adolescents and adults. San Francisco: Jossey-Bass.
Perry, W. G. Jr. (1970). Forms of intellectual and ethical development in the college years: A scheme. New York: Holt.
 This work is part of the UMass Science, Technology & Society Initiative, see www.umass.edu/sts/ethics. Thanks to project team members John Ake, Michelle Sagan Goncalves, Ron Sandler, Jane Fountain (PI), M.J. Peterson, Neal Anderson, Marc Acherman, and Paula Stamps. This material is based upon work supported by the National Science Foundation under grant number 0734887. Any opinions, findings, conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.
 We will treat "ethics" and "morality" as synonymous here, though in some theories they have different meanings.
 Of course, engineers and scientists with exceptional compassion and/or dedication will participate more fully in these issues. Inspiring and educational stories of such individuals can be found in the literature.
 A related perspective is that ethics should be judged by what we actually do, not by what we think or intend to do. What one actually does may or may not correspond to what one thinks or knows is the right thing to do. This brings up the importance of the faculty of will (or willpower).
 We use "order-" and "change-" orientation, and stay away from calling them "progressive" and "conservative" because our goal here is not to explore political issues, but rather to motivate discussion about how value systems can differ in different systematically groups within the same culture, region, or nation. We caution readers and instructors about discussions that become overly focused on political or left-vs-right ideology, at the expense of the larger ethical and meta-ethical issues. It is not our goal to encourage students to categorize themselves or others into rigid political categories. There are many, often incompatible, theories about the primary characteristics of liberal/progressive vs. conservative/traditionalist (or left vs. right) political orientations, and the question is complex (and differs for different geopolitical regions).
 See M.J. Peterson's "Resolving Ethical Disagreements" which is included in the curriculum materials for this project. There are also many books on related topics such as dialog and deliberation, communication skills, group facilitation, democratic decision making processes, and conflict resolution.
 Additional resources are available through The National Coalition on Dialog and Deliberation: http://www.thataway.org/, and in particular http://www.thataway.org/exchange/categories.php?cid=62&recommended=1 and http://www.thataway.org/exchange/.