Why Most Audit Ethics Courses Don't Work

Ethics training and education usually assume that people recognize when they encounter ethical situations—and that a better understanding of ethics results in better ethical decisions. But these assumptions are false! Instead, the authors suggest other approaches to improve ethical decision making by auditors. And although this article focuses on auditing, it also applies to other business and nonbusiness situations.


The Journal of Corporate Accounting & Finance, May/June 2012. © 2012 Wiley Periodicals, Inc. Published online in Wiley Online Library (wileyonlinelibrary.com). DOI 10.1002/jcaf.21767

Feature Article

Stephen R. Goldberg and David P. Centers

INTRODUCTION

Ethics training and education typically assume that people recognize when they encounter ethical situations and that a better understanding of ethics results in better ethical decisions. This article draws from the work of Bazerman and Tenbrunsel1 and others to explain why these assumptions are false, and it suggests approaches to improve ethical decision making. Our discussion emphasizes an audit context but applies to other business and nonbusiness situations. Most cases of unethical behavior and corruption are unintentional, the result of ethical decisions bounded by innate psychological processes and the fading of the ethical dimension of the problem. For this reason, laws and regulations directed at intentional corruption are often of little efficacy in protecting society. We generally fail to recognize that our ethical judgments are biased in ways we would criticize with greater awareness, and we underestimate the degree to which our behavior is affected by incentives and other situational factors. In addition, the implications of bounded ethicality are compounded when considering the individual, organizational, and societal levels. Most people want to act ethically, so to improve ethical judgment we need to understand and accept the limitations of the human mind.

Behavioral ethicists argue that auditor bias arises unconsciously when decisions are made, long before auditors report their judgment. Audit failures are a natural by-product of the auditor-client relationship. The current system makes it psychologically impossible to make objective judgments because of motivated blindness in even the most honest of auditors. Cases of audit failure are inevitable. By focusing solely on auditor neglect and corruption when analyzing audit failures, innate bias in auditor decision making is ignored.

In the following sections of this article, we discuss the reasons for bounded ethicality, exacerbating factors at the organizational level, issues specific to the audit industry, and approaches to reducing innate bias. Finally, we conclude with a brief comment.

BOUNDED ETHICALITY

Bounded ethicality refers to the ways people systematically and unknowingly engage in unethical behavior. There is a gap between how ethical we think our behavior is and how ethical our behavior really is. Built-in psychological responses occur at the individual, group, and societal levels. Our ethical behavior is sometimes inconsistent or even hypocritical. Individuals can believe in ethical norms even when they act contrary to those norms, but judge their own wrongdoings differently than they judge others.2 Ethical fading is the process by which the ethical aspects of a decision fade from the mind. For example, leading up to the financial crisis, sellers of subprime mortgages saw themselves as making business decisions, not ethical decisions. Exhibit 1 lists factors that bound our ethicality at the individual level.

Exhibit 1
Causes of Unknowing Systematic Decision Biases

• Self-interest bias
• Moral judgments preceding moral reasoning
• In-group favoritism
• Over-discounting the future
• Indirect blindness
• Outcome bias
• Identifiable victim effect
• Status quo bias

More than we realize, decision making is influenced toward self-interest,3 with pressures from a profit-focused environment. When individuals have a vested interest in seeing a problem in a particular manner, they are not capable of unbiased, objective decisions. We tend first to determine our preference for a certain outcome on the basis of self-interest, and then to justify this preference on the basis of fairness by altering the importance of the attributes that affect what is fair. This happens whether we want it to or not. Under intense uncertainty, egocentrism is magnified. On the other hand, when data are clear, the ability to manipulate fairness is limited.

Ethicists argue that moral reasoning influences moral judgments. However, behaviorists conclude that moral judgments often precede moral reasoning.4 Quick emotional reactions influence our judgment, and then we look for reasons to justify our decision. Bazerman and Tenbrunsel describe two types of thinking. Under System 1 thinking, we process information quickly, intuitively, and automatically. System 1 thinking serves us well for the majority of our daily decisions. System 2 thinking is slower, more explicit, effortful, and logical; we weigh costs and benefits. When these two systems disagree, it is a red flag to reassess our decision. Decisions are most ethically compromised when our minds are overloaded.

Behaviorists have also identified in-group favoritism. We tend to be more comfortable doing favors for those with whom we identify, such as those who have the same alma mater, religion, race, or gender as us. The net result is discrimination against those who are not like us. This may explain the observation that blacks with qualifications similar to those of whites are less likely to get loan approvals. In-group favoritism has the same effect as out-group discrimination, without the discriminator thinking he has done anything wrong. Also, the less time we have to think, the more likely we are to fall back on stereotypes.

We tend to use high discount rates when considering the future consequences of our decisions. As a result, we focus on and overweigh short-term considerations at the expense of long-term concerns. Requiring future generations to pay for the current generation's economic or environmental mistakes is an example.5

Although changes in appearance and strength suggested the use of steroids in baseball, no investigation took place for years. To increase prices and revenue, companies in the pharmaceutical industry sometimes sell cancer or other drugs to a less well-known company, which then raises prices to cancer patients tenfold or more.6 Even when data suggesting unethical intent are obvious, human intuition does not sufficiently hold people and organizations accountable for such indirect unethical behavior.

We also have a tendency to judge the wisdom of decisions based on their outcomes.7 Outcome bias is more prominent when one instance is confronted at a time. Seeing multiple versions of a story allows us to avoid outcome bias by comparing and considering the differences between versions.

We punish people too harshly for making sensible decisions that have unlucky outcomes. Compounding the problem, judging decisions based on outcomes means we often wait too long to condemn unethical behavior. President Bush went to war over weapons of mass destruction. When the war was going well, there was less condemnation. After the war went poorly, there was greater condemnation.

There is a slippery slope to accepting indirect unethical behavior.8 Consider the classical (and mythical) analogy of a frog in warm water that is heated slowly. The frog adjusts to the temperature change and does not save itself by jumping out of the water. We tend to accept small and then larger and larger unethical behavior. People commonly fail to notice the slippery slope in others' ethical behavior. The Securities and Exchange Commission did not notice Madoff, due partly to motivated blindness and partly to the slippery slope, as his scheme took 15 years to develop.

Consider two scenarios involving an auditor of a large corporation.9 In the first scenario, for three years in a row the auditor's client produces high-quality financial statements; in the fourth year, the client clearly violates generally accepted accounting principles (GAAP). In the second scenario, the auditor notices that the client stretches the rules in a few areas the first year but does not appear to break any rules. The next year, the firm is even more unethical, committing a minor violation of GAAP. The third year, the violations are a bit more severe, and in the fourth year there are clear violations of GAAP in its financial statements. In the first scenario, the auditor is more likely to identify the ethical transgression and take appropriate action. In the second scenario, it is far less likely that the auditor would identify the ethical transgression. This should put us on alert for slowly degrading ethical behavior.

People tend to be far more concerned with and show more sympathy for identifiable victims than for statistical victims.10 We overlook unethical behavior when there are no identifiable victims. An example is top universities giving priority to legacy candidates over more qualified candidates.

Bazerman and Tenbrunsel describe a conflict between our "want self" and our "should self." The should self encompasses our ethical intentions and the belief that we should behave according to our ethical values and principles. The want self reflects our actual behavior, typically characterized by self-interest and a relative disregard for ethical considerations. Research shows that the should self dominates before and after we make a decision, but the want self often wins at the moment of decision. In other words, we predict should decisions but make want decisions.

Want versus should decisions are driven by differences in motivations at different points in time and by ethical fading. When we think about our future behavior, it is difficult to anticipate the actual situation we will face. General principles and attitudes drive our predictions: we see the forest but not the trees. As the situation approaches, we see the trees and the forest disappears. Our decision is driven by details, not abstract principles. In the prediction phase of ethical decision making, we clearly see the ethical aspects of a given decision. Our moral values are evoked, and we clearly believe we will behave according to those values. At the time the decision is made, ethical fading occurs and we no longer see the ethical dimensions of the decision. We focus on making the best business or legal decision and thus might behave unethically.

Visceral responses dominate at the time we make decisions.11 Responses such as hunger and pain are hardwired into our brains to increase the chances of survival. Our behavior becomes automatic, such as eating or fleeing danger. In the classic Ford Pinto case, the pressures of competition likely resulted in feelings akin to the survival instinct. Ford decision makers thought in terms of cost-benefit rather than the ethics of endangering passengers with a faulty gas tank design. Ethical considerations faded away.

As we gain distance from visceral responses to an ethical dilemma, ethical implications come back to us. We are faced with a contradiction between ourselves as ethical people and our unethical actions. We are motivated to reduce the dissonance. Psychological cleansing is an aspect of moral disengagement, a process that allows us to turn our usual ethical standards on and off at will. Consumers who desire an article of clothing produced by child labor reconcile their push-pull attraction to the purchase by reducing the degree to which child labor is viewed as a societal problem.12 When people are in environments that allow them to cheat, they reduce the degree to which they view cheating as a moral problem. The process of moral disengagement allows us to behave contrary to our personal code of ethics while maintaining the belief that we are ethical people.

We prefer to blame other people and other things (e.g., the economy, a boss, a subordinate, a friend, "everybody is doing it," "the law permits it," maximizing shareholder value). The more tempted we are to behave unethically, the more likely we are to see the action as common and thus acceptable. If you can't spin your ethical behavior to your advantage, you can always change your ethical standards. Once you change your ethical standards, the power of your moral principles diminishes; there may no longer be a line you won't cross. We become desensitized as our exposure to unethical behavior increases.

The status quo bias is the desire to maintain an established behavior or condition rather than change it. We tend to be more concerned about the risk of change than the risk of failing to change. Potential losses loom larger than potential gains. As a result, status quo inertia is a barrier to wise action.

BOUNDED ETHICALITY AT THE ORGANIZATIONAL LEVEL

Ethical gaps at the individual level are compounded when considered at the organizational level (see Exhibit 2). Informal values may have greater influence on the ethics of employees than formal policies or codes. Informal values are determined by who really runs the company. Which divisions or departments have the most power? Over the years, power at Arthur Andersen shifted from audit to consulting, and the values of the organization shifted with it.

Knowing what is talked about and what is not talked about can shed light on values. What stories or slogans are repeated over and over? What values are emphasized in those stories? If something is valued above ethics, ethical fading is more likely to occur. Is there a story about an employee being rebuffed for taking an ethical position or being rewarded for taking an unethical position? Toward the end at Andersen, no one talked about accounting professionalism and the public trust—only revenue generation.

What wording or labeling is used to describe questionable behavior? Is lying referred to as misrepresenting the facts? Is stealing referred to as an inappropriate allocation of resources? If unethical behavior is not called by its name, it is unlikely that intervention will be attempted.

Groups tend to avoid a realistic appraisal of alternative courses of action in favor of unanimity. This inhibits individuals within a group from challenging questionable decisions. Also, we often consider only the data readily available to us rather than think through and request the data that would improve the quality of decisions, particularly when coupled with time pressure or self-interest. Furthermore, organizations sometimes create functional boundaries by separating pieces of decisions for different parts of the organization. The typical ethical dilemma is then seen as a marketing, engineering, or financial problem even though the issue is apparent to other groups.

Exhibit 2
Organizational Biases in Ethical Decision Making

• Informal organizational values
• Groupthink
• Limiting analysis to the data in front of you
• Functional boundaries

BOUNDED ETHICS AND AUDITING

Independence from clients is critical to the audit profession's effectively serving the public and adding value through the increased reliability and credibility of financial reporting. Prior to the Sarbanes-Oxley Act (SOX), US auditing regulation allowed auditors' motivated blindness to flourish. Self-interest came from large consulting and audit fees and from high-priced jobs with clients such as Enron, WorldCom, Global Crossing, Tyco International, and Sunbeam. Exhibit 3 identifies general tactics used by industries to institutionalize corruption.13 Auditors, clients, and lobbyists argued that nonaudit services improved audit efficiency and that restricting these services would harm audit quality.14 Further, they argued, the more auditors know about the client, the better the audit, and there is no evidence of lack of independence contributing to audit fraud. Financial scandals at Enron, WorldCom, Adelphia, Global Crossing, Xerox, and Tyco followed, and then the Sarbanes-Oxley Act of 2002.

Exhibit 3
Tactics Used to Institutionalize Corruption

• Disinformation campaigns
• Obfuscation
• Claimed need to search for a smoking gun
• Shifting views of facts

Likely as a result of audit industry lobbying, SOX failed to respond to some of the independence flaws in the industry. SOX prevented some nonaudit services but allowed others to continue, and it required partner rotation after seven years but did not require audit firm rotation. Auditors are still allowed to take jobs at former clients. Legislators egocentrically focused on how reforms would affect their own campaign contributions rather than on the costs to society. Consistent with discounting the future and status quo bias, many corporate leaders do not want to change their current relationship with their auditors, even though it can be argued that it is in their companies' long-run interest to improve the quality of the audits they receive.

Firms made the concession to disclose details of their relationships with clients to investors. However, behavioral ethicists argue that disclosure can intensify bias.15 Transparency may result in unintended consequences if we fail to account for the psychological process of moral compensation. The opportunity to behave morally by disclosing conflicts of interest might give people a license to engage in future immoral behavior and therefore maintain their moral equilibrium.

REDUCING BIAS IN ETHICAL DECISIONS

There is limited evidence that ethics education and training result in more ethical decisions without considering innate biases.16 People need to be trained on the psychological mechanisms that encourage unethical behavior and the techniques that inspire ethical conduct. Exhibit 4 identifies steps that would reduce bounded ethicality.

Exhibit 4
How to Overcome Bounded Ethicality

• Be aware of egocentric tendencies and other biases.
• Close the gap between informal and formal value systems.
• Close the gap between the should and want selves.
• Practice, plan, and precommit decision making.
• Consider all options at the same time.
• Apply ethical decision making to all major decisions.
• Debrief decisions.
• Engage in group decision making.
• Hold decision makers accountable.
• Delay implementation of decisions.
• Be alert to slowly degrading ethical behavior.

Informal organizational values teach employees what is really expected of them and can promote unethical behavior. Firm leaders should identify the informal systems that exist and the underlying pressures they place on employees, and then influence these systems to create ethical informal cultures. To determine informal values, identify characteristics that make misalignment between formal and informal values more likely. Pay attention to areas in an organization that are characterized by uncertainty, time pressure, short-term horizons, and isolation. For example, the focus on billable hours by accountants, consultants, and lawyers creates incentives to engage in unnecessary projects or false reporting to run up hours. Profit goals tend to dwarf other goals, such as sustainability.

Informal norms may be difficult to identify, as they are embedded in the euphemisms used by employees, the stories they tell, the socialization methods they encounter, and informal enforcement. Language can be used to mask unethical behavior, as in the term "creative accounting" rather than "cooking the books," or "collateral damage" rather than "dead civilians." Informal systems are influenced by which behaviors are rewarded or punished. If generating revenues is rewarded and violating policies and audit standards is ignored, the encouraged behavior is clear. An ethical corporate leader is critical to having ethical decisions made in an organization. Closing the ethical gap in an organization requires a thorough audit of top leaders' decisions and behaviors.

Formal systems (e.g., codes of ethics) that are borrowed from another firm, rather than reflecting the specific values of the organization, have little impact. There was a quiet dilution of standards at Andersen as the auditor-salesman gained power relative to the Professional Standards Group, which was responsible for quality control and compliance with auditing and accounting standards.

Organizations should also be cautious when instituting compliance systems. When a restrictive compliance system is later removed, the cost of noncompliance becomes less burdensome. Ethics have faded from the decision; the decision remains a business or practical decision.

Project yourself into future situations to anticipate which motivations will be most powerful. Prepare for questions you are hoping won't be asked. Rehearse or practice for an upcoming event. Given that abstract thinking dominates when we predict our actions, it is useful to bring abstract thinking to light when making decisions: focus on high-level abstract principles at the time of the decision. If you find yourself thinking about the trees and not the forest, ask whether you would be comfortable sharing the decision with your mother or hearing it in your eulogy. Precommit to a respected person a desired course of action. Those who publicly commit to a course of action in advance are more likely to follow through with the decision than those who do not make such a commitment.

When decision options are evaluated individually, we are more likely to go along with a questionable option. Therefore, we should reformulate decision making by considering multiple options. Comparing the ethical option with unethical options helps bring the should rather than the want choice to the forefront.

Consider the ethical aspects of all important decisions. If you only analyze the ethics of decisions already identified as ethical ones, you are not likely to realize when you are making a decision with ethical implications. Immediate decision feedback can help assess actions. Feedback should warn about the likelihood of distortion and how bias might affect your recollection of the decision. Debrief your decisions with the help of a trusted friend or colleague playing devil's advocate. Group decision making and systems that hold people accountable for their decisions also help remove bias.

People are more likely to choose according to the interests of their should selves when making decisions about the future rather than when making decisions that will be implemented immediately.17 A time delay allows people to look beyond their emotional dislike of incurring the immediate costs of implementation. Slightly delaying policy implementation gives people time to listen to their should selves. It also gives people time to prepare for a policy's impact.

At the audit industry level, the profession, regulators, and the public should consider steps to reduce or eliminate self-serving biases.18 Steps to increase auditor independence and encourage unbiased audits would require rotation of auditors, prohibit the selling of any consulting services, prohibit auditors from taking jobs with firms they audit, and set audit contracts of limited duration with clients who are not able to fire the auditors.

However, due to perceived self-interest, it is not likely that the auditing profession would willingly accept major steps to further increase auditor independence.

CONCLUDING COMMENT

To improve the service that auditors provide and to reduce the likelihood of regulatory intervention, auditors and audit firms should gain a better understanding of the behavioral factors that encourage bias and enact remedies that reduce the likelihood of bias in their individual and organizational plans and actions. They should also consider industry regulation that would strengthen independence and lead to greater benefit to the public and, thus, greater value added from audits.

NOTES

1. Bazerman, M. H., & Tenbrunsel, A. E. (2011). Blind spots: Why we fail to do what's right and what to do about it. Princeton, NJ: Princeton University Press.

2. Epley, N., & Dunning, D. (2000). Feeling holier than thou: Are self-serving assessments produced by errors in self- or social prediction? Journal of Personality and Social Psychology, 79, 861–875.

3. Moore, D. A., & Loewenstein, G. (2004). Self-interest, automaticity, and the psychology of conflict of interest. Social Justice Research, 17, 189–202.

4. Sonenshein, S. (2007). The role of construction, intuition, and justification in responding to ethical issues at work: The sensemaking-intuition model. Academy of Management Review, 32, 1022–1040.

5. Paharia, N., Kassam, K. K., Greene, J. D., & Bazerman, M. H. (2009). Dirty work, clean hands: The moral psychology of indirect agency. Organizational Behavior and Human Decision Processes, 109, 134–141.

6. Berenson, A. (2006, March 12). A cancer drug's big price rise disturbs doctors and patients. New York Times. Retrieved from http://www.nytimes.com/2006/03/12/business/12price.html

7. Cushman, F. (2007). Crime and punishment: Distinguishing the roles of causal and intentional analyses in moral judgment. Cognition, 108, 353–380.

8. Gino, F., & Bazerman, M. H. (2009). When misconduct goes unnoticed: The acceptability of gradual erosion in others' unethical behavior. Journal of Experimental Social Psychology, 45, 708–719.

9. Bazerman, M. H., Moore, D. A., Tetlock, P. E., & Tanlu, L. (2006). Reports of solving the conflicts of interest in auditing are highly exaggerated. Academy of Management Review, 31(1), 43–49.

10. Small, D. A., & Loewenstein, G. (2005). The devil you know: The effects of identifiability on punishment. Journal of Behavioral Decision Making, 18, 311–318.

11. Loewenstein, G. (1996). Out of control: Visceral influences on behavior. Organizational Behavior and Human Decision Processes, 65, 272–292.

12. Paharia, N., & Deshpande, R. (2009). Sweatshop labor is wrong unless the jeans are cute: Motivated moral disengagement. Harvard Business School Working Paper No. 09-079.

13. These tactics are also used by the tobacco, oil and gas, and coal industries.

14. See note 9.

15. Cain, D. M., Loewenstein, G., & Moore, D. A. (2005). The dirt on coming clean: Perverse effects of disclosing conflicts of interest. Journal of Legal Studies, 34, 1–25.

16. Tenbrunsel, A. E., & Messick, D. M. (2004). Ethical fading: The role of self-deception in unethical behavior. Social Justice Research, 17, 223–236.

17. Rogers, T., & Bazerman, M. H. (2008). Future lock-in: Future implementation increases selection of "should" choices. Organizational Behavior and Human Decision Processes, 106, 1–20.

18. See note 9.

Stephen R. Goldberg, PhD, CPA, is a professor of accounting at Grand Valley State University. His teaching and research interests focus on financial accounting, international accounting, corporate governance, and ethics. David P. Centers, MBA, is an accounting instructor at Grand Valley State University. Previously, he worked in the governmental, industrial, and not-for-profit sectors as an auditor, plant controller, and controller. The authors have written articles in a number of academic as well as practitioner-oriented journals.
