Interplay of Research, Standard Setting and Regulation
Panel members:
• Jean C. Bedard, Bentley University
• Joe Carcello, University of Tennessee
• Mike Stein, Old Dominion University and PCAOB Academic Fellow
The Panel’s Focus
• To illustrate the role of research in setting and evaluating auditing standards and related regulations
  • In the context of a taxonomy of research types
• To ground our discussion, we focus on specific recent examples
• We cover auditing standards/regs of the PCAOB, the SEC, and the US Auditing Standards Board
  • Although some references to non-US standards arise
Policy Evaluation in Social Science
• An informal word (or two) about policy evaluation
• This is a large field of research, based in economics, sociology, military history, health care...
• In general, a policy is good if it makes society better off
• The challenges to such assessments are obvious:
  • What does “better off” mean? There are both costs and benefits
  • How do we measure “better off”?
  • Who is “better off”? There are always trade-offs!
  • What are the unintended consequences?
The “Interplay” in Auditing Practice
[Diagram: the interplay among Research, Standards/Regulations, and their Impact on Auditors, Preparers, and Users]
Taxonomy of Auditing Research Related to Standards/Regulations
• Research identifies a problem, and standards/regs are subsequently developed to address it
• A standard/reg is considered or proposed to address a problem, and research assesses its potential impact
• A standard/reg is promulgated but is nonspecific; research evaluates alternative ways of implementing it
• A standard/reg is promulgated, and research evaluates its impact since implementation
• A standard/reg is promulgated, and research looks back prior to implementation to assess whether it was indicated
Research Assesses How Nonspecific Standards/Regs Could Be Implemented
• In this situation, a standard is promulgated but leaves open how audit firms should implement it
• Research can evaluate implementation options in the laboratory or in the field
• Example: SAS 99 requires audit engagement teams to “brainstorm” about fraud, but does not say how. Two recent studies investigate:
  • Hunton & Gold (TAR 2010) manipulate brainstorming methods, finding that “open brainstorming” is inferior
  • Brazel, Carpenter, and Jenkins (TAR 2010) conduct a field study showing that 91 percent of engagements use open brainstorming
Research Looks Back Prior to Implementation of a Standard/Regulation
• Research examines conditions prior to implementation to assess whether the standard/reg was needed, or to identify other effects
• Examples:
  • SOX limited the provision of non-audit services. Kinney, Palmrose, and Scholz (JAR 2004) examine whether NAS affected auditor independence (measured as restatements); their findings fail to support the restrictions
  • SOX limited partner tenure on public engagements to 5 years. Bedard and Johnstone (2010) find higher hours and lower realization rates following partner turnover (2002-2003); billing rates are also higher for engagements with partner tenure > 5 years. Eliminating long-tenure partners may put pressure on firms to reduce the investment in learning about new clients.
Research Identifies a Problem for Standard-Setters
• Beasley, Carcello, and Hermanson (COSO 1999)
  • COSO-sponsored study of fraudulent financial reporting from 1987 to 1997
  • Found a high incidence of fraud involving revenue recognition; the SEC used this as support for SAB 101
  • Found weak governance; the SEC used this as support for not exempting smaller public companies from some of the governance changes recommended by the BRC
• Carcello and Neal (The Accounting Review 2000, 2003)
  • Examine relations between audit committee characteristics and going-concern (GC) reporting, and between audit committee characteristics and auditor changes after a GC report
Ex Ante Research on Possible or Proposed Standards/Regulations
• Once a problem is identified, policy alternatives to address it can be assessed in advance through analytical modeling, a “natural laboratory”, or an experiment
• Examples:
  • Carcello and Santore (working paper, 2011) use an analytical model to examine the likely effects of a partner signature requirement on partners, firms, and society
  • Van de Poel and Vanstraelen (AJPT, forthcoming) find poor reporting under a “comply-or-explain” internal control standard in the Netherlands (i.e., low-quality explanations of why companies are not complying)
  • Zimbelman (JAR 1999) investigates whether a separate and explicit fraud-risk assessment (later required by SAS No. 82) would improve planning judgments
Research Evaluates the Impact of an Implemented Standard/Regulation
• A standard or regulation has been implemented and research considers its impact; this is probably the most frequent category
• Bedard and Graham (TAR, forthcoming) find that most Section 404 deficiencies are discovered by auditors, and that company managements tend to classify detected deficiencies as less severe than auditors do (implying ineffective management testing)
• Carcello, Vanstraelen, and Willenborg (TAR 2009) examine whether a change to a more rule-based regime for GC reporting in Belgium improved auditor reporting
• Carcello, Hermanson, and Huss (AJPT 1995) examine whether SAS 59 improved auditor GC reporting prior to bankruptcy