
Producing Good Research: Behavioral and Otherwise



  1. Producing Good Research: Behavioral and Otherwise Jane Jollineau Kennedy University of Washington American Accounting Association Auditing Doctoral Consortium Clearwater, Florida. January 14, 2004

  2. Today’s Outline • What is good research (experimental and otherwise)? • Why do experiments? • Finding a topic in Behavioral (auditing) research • Common Pitfalls

  3. I. What is good research? • Good topic • Good design

  4. A. What is a good research topic? • Problem is described clearly, comprehensively • Important constituents care about it! • Problem is challenging but not impossible • Possible to evaluate solutions or interpret the direction of results • The results will add to our knowledge about the problem and have theoretical and/or practical ramifications

  5. Tip: Kinney’s 3 Paragraphs (footnote 18 in his 1986 TAR article) • For every research idea/paper, write 3 paragraphs answering the following: • What problem or issue will be addressed? • Why is the problem or issue important? • How will you address the problem (and what do you expect to find)?

  6. My Example Paper for Today Frank Hodge, TAR, October 2001: “Hyperlinking unaudited information to audited financial statements: effects on investor judgments” A test of Kinney’s 3 questions

  7. Q1: Hodge 2001 “This study investigates whether firms can influence investors’ perceptions of their financial reports by hyperlinking unaudited information to information in the audited financial statements.”

  8. Hodge 2001 Q1 continued Specifically, does hyperlinking optimistic, unaudited information about a firm’s prospects to audited financial statements lead investors to: • Misclassify unaudited info. as audited, • Inflate the credibility of unaudited info., • Judge the firm’s earnings potential to be higher, relative to viewing the same info. in hardcopy format? Will a warning help?

  9. Q2: Who cares about this problem? • Management • SEC • FASB • AICPA • Accounting/auditing researchers • System designers(?)

  10. Q3: Method and results? • Method: • Used an experiment. Why? Data not observable from investors’ investment decisions. Can control info. and extraneous factors re. people, tasks, and environment. • Results: • Hyperlinked info was perceived as more credible • Firm judged to have greater earnings potential • Simple labels “(Un)audited” reduced the effects

  11. B. What is good research design? • Theory answers “why?” • Construct Validity • Internal Validity • External Validity • Giving yourself a decent chance of testing your theory and finding results

  12. Model & Definitions (Kinney ’86) Y = f( X, Vs, Zs ) • Y = phenomenon to be explained • X = your (new) theory about a cause of Y • Vs = prior causes of Y • Zs = contemporaneous causes of Y
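
One concrete, purely illustrative way to write out this functional form is a linear specification; the coefficients and notation below are mine, not Kinney's:

    Y = \beta_0 + \beta_1 X + \sum_j \gamma_j V_j + \sum_k \lambda_k Z_k + \varepsilon
    % \beta_1 is the hypothesized effect of X on Y; the V_j (prior causes) and
    % Z_k (contemporaneous causes) must be controlled, measured, or randomized
    % so their influence is not mistaken for the X effect.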

  13. A simple model (diagram): does X0 cause Y1? Y1 is also affected by the prior causes {V-3, V-2, V-1} and the contemporaneous cause Z0 (subscripts index time).

  14. Building a Research Model (diagram): rows are the conceptual, operational, and control levels; columns are independent and dependent. Conceptual: theoretical X → theoretical Y. Operational: operational X → operational Y. Control: other potentially influential variables, the Vs and Zs.

  15. For X to cause Y: • X and Y are correlated • Alternative explanations are ruled out by design, including • Y causes X • X and Y caused by an omitted V or Z • Reason to believe that operational X and Y represent theoretical X and Y • Reason to believe that the X-to-Y relation generalizes to other persons, times, and settings.

  16. …threats to Predictive Validity • Statistical Conclusion Validity (5) • Internal Validity (4) • Construct Validity (2 and 3) • External Validity (1)

  17. Threats to Validity (cont’d), placed on the same diagram: 1 = External Validity (the conceptual theoretical X → theoretical Y link); 2 and 3 = Construct Validity (the links from theoretical X and Y down to operational X and Y); 4 = Internal Validity (operational X → operational Y); 5 = Statistical Conclusion Validity; the Vs and Zs remain at the control level.

  18. Research Model for Hodge 2001 (diagram, same framework): Conceptual: association of unaudited information with audited F/S (independent) → financial decisions (dependent), link 1. Operational: hyperlinked letter to F/S (2) → credibility and earnings judgments (3), with internal validity (4) and statistical conclusion validity (5) on the operational link. Control: accounting & finance courses; investing experience.

  19. To achieve Internal Validity: “account” for Vs and Zs by: • random assignment of participants to treatment/control • estimate statistically and remove (covariate analysis, regression) • match on Vs • hold constant by design/selection (special case of matching)
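
A minimal sketch of the first two tactics (random assignment plus covariate analysis), assuming a small simulated dataset; the variable names and effect sizes are hypothetical:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 120

    # A measured prior cause (a V), e.g., participants' task experience
    experience = rng.normal(0, 1, n)

    # Random assignment of participants to treatment (1) or control (0)
    treatment = rng.permutation(np.repeat([0, 1], n // 2))

    # Simulated outcome: a treatment effect, the V's influence, and noise
    judgment = 0.5 * treatment + 0.8 * experience + rng.normal(0, 1, n)

    # Covariate analysis: estimate the treatment effect while statistically
    # removing variance attributable to the measured V
    X = sm.add_constant(np.column_stack([treatment, experience]))
    print(sm.OLS(judgment, X).fit().summary())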

  20. The Researcher’s Problem (Kinney 1986), or Giving Yourself a Chance to Find Results • α = risk that data incorrectly “accepts” new theory • β = risk that data incorrectly “rejects” new theory • δ = “true” effect of X on Y • σ = residual variation given research design (i.e., after effects of Vs and Zs) • n = available sample size • All five are related through a single, simple formula

  21. Researcher’s Problem (continued) β = f( α, δ, σ, n ), with directional signs (−, −, +, −) • α is fixed at .05 or .10 by convention • β is the risk of failing to detect the ‘true’ phenomenon (the one you want to minimize) • δ = f(X) • σ = f(Vs, Zs) • n is semi-fixed by data availability or cost
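
Those directional signs can be checked with a standard power calculation. The sketch below uses a two-sample t-test purely for illustration (the choice of test and the numbers are mine, not Kinney's): β falls as α, δ, or n rises and climbs as σ rises.

    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()

    def beta(delta, sigma, n, alpha=0.05):
        # Standardized effect size is delta scaled by residual variation sigma
        power = analysis.power(effect_size=delta / sigma, nobs1=n,
                               alpha=alpha, alternative='larger')
        return 1 - power  # beta = risk of missing a 'true' effect

    print(beta(delta=0.5, sigma=1.0, n=30))              # baseline
    print(beta(delta=0.5, sigma=1.0, n=30, alpha=0.10))  # larger alpha -> smaller beta
    print(beta(delta=0.8, sigma=1.0, n=30))              # larger delta -> smaller beta
    print(beta(delta=0.5, sigma=2.0, n=30))              # larger sigma -> larger beta
    print(beta(delta=0.5, sigma=1.0, n=100))             # larger n -> smaller beta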

  22. Graphically… [figure: sampling distributions of Ȳ given H0, n, σ and given HA, n, σ, with α, β, and δ marked and the Accept H0 / Reject H0 regions shown] (small δ, small σ: β okay)

  23. Graphically… [figure: same setup] (large δ, large σ: β okay)

  24. Graphically… [figure: same setup] (small δ, large σ: β yikes!)

  25. What to do? Need Power! • Increase sample size (not always possible) • Improve research design • Theory • Have you identified and measured all the major V’s and Z’s? • Have you manipulated the X’s at appropriate levels that allow you to find effects? • Manipulation checks?
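
For the “increase sample size” route, required cell sizes can be estimated before running the study. A hedged illustration (the effect sizes are generic benchmarks, not estimates from any cited paper):

    from statsmodels.stats.power import TTestIndPower

    # Participants needed per cell for power = 0.80 at alpha = 0.05,
    # for a few assumed standardized effect sizes
    for d in (0.2, 0.5, 0.8):
        n = TTestIndPower().solve_power(effect_size=d, power=0.80, alpha=0.05)
        print(f"effect size {d}: about {n:.0f} participants per cell")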

  26. II. Why use experiments? • Causal theory can be tested (not just association) • Manipulate the variables of interest and control or randomize on others • Can get at “process” not just outcomes • Ex ante research is possible • Some conditions that do not exist in natural settings can be created in the lab

  27. (Difficult but) Necessary Choices in Experiments • Professional participants? • Monetary incentives? • Between- or within-subjects design (see Schepanski, Tubbs, and Grimlund 1992)? • Level of analysis, individual or market (see Libby, Bloomfield & Nelson 2002)?

  28. 1. Choice of participants? • Theory should dictate this choice • Negative externalities of using professionals when not needed: limited resource (Libby, Bloomfield & Nelson 2002) • Selection bias is stronger with professionals than with more available groups like students (Peecher & Solomon 2001) • Hodge 2001 used MBA students as investors • Elliott, Hodge and Kennedy (2004)

  29. 2. Monetary incentives? • Dictated by theory • Want participants to take the task seriously • Many results are driven more by ability than by incentives (running a 4-minute mile) • Incentives do not necessarily mitigate judgment bias, though in some cases they do reduce it (see Camerer & Hogarth 1999) • Incentives are not always possible or practical in real life either! • Hodge 2001 paid a flat rate for participation

  30. 3. Within- vs. Between-subjects? • Enhanced statistical power: subjects are their own control, so less variance • Increases salience of treatment effects • Vulnerable to carry-over effects • Can help demonstrate the (un)intentional nature of cognition • Requires proper analysis • Hodge 2001 used a between-subjects design

  31. Hodge 2001 Experimental Design • 3 × 1 between-subjects • Control: hardcopy of the unaudited letter and the audited financial statements • Treatment 1: audited financial statements on computer with a hyperlinked unaudited letter • Treatment 2: same as Treatment 1, except that when the hyperlink was activated a notice popped up saying “(Un)audited.”
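
As context, here is a sketch of how responses from such a 3 × 1 between-subjects design could be compared across cells, using made-up data; this is not Hodge's actual data or analysis:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Hypothetical credibility judgments (0-100 scale) for the three cells
    hardcopy = rng.normal(55, 10, 40)             # control
    hyperlinked = rng.normal(65, 10, 40)          # treatment 1
    hyperlink_labeled = rng.normal(58, 10, 40)    # treatment 2, "(Un)audited" label

    # One-way ANOVA across the three between-subjects conditions
    f_stat, p_val = stats.f_oneway(hardcopy, hyperlinked, hyperlink_labeled)
    print(f"F = {f_stat:.2f}, p = {p_val:.3f}")

    # Planned comparison: hyperlinking without a label vs. hardcopy control
    t_stat, p_t = stats.ttest_ind(hyperlinked, hardcopy)
    print(f"t = {t_stat:.2f}, p = {p_t:.3f}")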

  32. 4. Individuals or Markets? • Theory dictates • Markets are expensive • Markets can have fewer observations, and therefore less power (unless repeated measures are used) • Research shows that biases are not driven out by markets (though they may be reduced) • Hodge 2001 studies individual judgments but ultimately wants to say something about behavior in the market: do systematic individual results aggregate to market behavior?

  33. III. Finding a topic in behavioral auditing research? • Think broadly! Is Hodge 2001 auditing research? • You want to get the “big potatoes!” • Audits have a supply and a demand side • Many financial reporting issues are of concern to auditors and investors • Audit environment has changed greatly

  34. Finding a topic in behavioral audit research? (cont’d) • Enron and other accounting scandals have emphasized many problems such as: • lack of independence • engagement risk • incentives to report aggressively • incentives to please the client • Sarbanes-Oxley Act • Read, read, read…

  35. IV. Common Pitfalls • Topic fails “Yawn Test” • Predicting the null hypothesis • Accepting the null hypothesis • Low power tests • Poor motivation; small incremental contribution • Running experiments too early • Not getting adequate feedback before submitting

  36. Top 5 Takeaways • Have fun! • Seek lots of input at each stage • Use a well-thought-out design • Choose the research method that best informs the question • Choose an interesting topic

  37. Helpful Papers on Research • Bonner, S. Accounting Horizons, Dec. 1999. • Camerer, C., & R. Hogarth. JRU, 1999. • Kinney, W. The Accounting Review, Apr. 1986. • Libby, R., & M. Lipe. JAR, 30, 1992. • Libby, R., R. Bloomfield, & M. Nelson. AOS, Nov. 2002. • Maines, L. BRIA, Supplement 1994. • Peecher, M., & I. Solomon. IJA, 2001.

  38. Q & A?
