
Research Methods



Presentation Transcript


  1. Research Methods

  2. Types of Methods • Software Methods • Scientific Methods • Requirements Elicitation

  3. Software Development Methodologies / Models

  4. Development Methodologies • Traditional Waterfall Model • Systems Development Life Cycle (SDLC) • Structured Systems Analysis and Design • Rapid Applications Development (RAD) • Spiral Model • Agile Methodologies

  5. Development Methodologies (1/2) • Agile software development • Agile Unified Process (AUP) • Open Unified Process • Best practice • Cathedral and the Bazaar, Open source • Constructionist design methodology (CDM) • Cowboy coding • Design by Use (DBU) • Design-driven development (D3) • Don't repeat yourself (DRY) or Once and Only Once (O3) • Dynamic Systems Development Method (DSDM) • Extreme Programming (XP)

  6. Development Methodologies (2/2) • Iterative and incremental development • KISS principle (Keep It Simple, Stupid) • MIT approach • Quick-and-dirty • Rational Unified Process (RUP) • Scrum (management) • Spiral model • Software Scouting • Test-driven development (TDD) • Unified Process • Waterfall model • Worse is better (New Jersey style) • Extreme Programming (XP) • You Ain't Gonna Need It (YAGNI)

  7. The Waterfall Model • Whatever means of software acquisition you choose, all the stages of the development life cycle are followed. However, what happens at each stage differs depending on whether you opt for bespoke development, an off-the-shelf purchase, or end-user development.

  8. Drawbacks of SDLC • Sequential nature of life cycle • Bureaucratic, long winded, expensive • Minor changes can cause problems • Cost of correcting errors • Misunderstandings/omissions may not come to light until user acceptance test stage – maybe too late to make significant changes • Change may be needed after sign off by user

  9. Drawbacks of SDLC • User Dissatisfaction • Early sign-off • Incorrect functionality • Incomplete functionality • User friendliness • Bugs • Lack of participation

  10. Drawbacks of SDLC • Applications backlog • Visible • Invisible • Failure to meet needs of management • Strategic/tactical potential ignored • Unambitious systems design

  11. Drawbacks of SDLC • Problems with documentation • User acceptance • Restrictive • Slow • Maintenance workload • Inflexibility • To cope with rapidly changing business climate

  12. ‘V’ Model • [Diagram: the left, descending arm runs from Project Initiation through Requirements Specification, Detailed Requirements Specification, Architectural Software Design and Detailed Software Design to Module Design and Code & Unit Test; the right, ascending arm runs back up through Software Integration & Test (Debugged Modules), System Integration & Test (Integrated Software) and Acceptance Testing (Verified System) to Evolution and Product Phase-Out. QA activities verify each specification/design level against its corresponding test level.]

  13. SSADM • Only covers part of the system development process, i.e. analysis and design. • It emphasises the importance of the correct determination of systems requirements.

  14. SSADM Stages • Feasibility Study • Stage 0 – Feasibility • Requirements Analysis • Stage 1 – Investigation of current requirements • Stage 2 – Business Systems Options • Requirements Specification • Stage 3 – Definition of Requirements

  15. SSADM Stages • Logical System Specification • Stage 4 – Technical System Options • Stage 5 – Logical Design • Physical Design • Stage 6 – Physical Design

  16. Rapid Applications Development (RAD) • A method of developing information systems which uses prototyping to achieve user involvement and faster development compared to traditional methodologies such as SSADM. • A prototype is a preliminary version of part of an information system, or a framework of all of it, which can be reviewed by end-users. Prototyping is an iterative process in which users suggest modifications before further prototypes and the final information system are built.

  17. The Spiral Model • Developed by Boehm (1988) • An iterative systems development model in which the stages of analysis, design, code and review repeat as new features for the system are identified.

  18. The Capability Maturity Model for Software Development • A 5 stage model for judging the maturity of the software processes of an organisation and for identifying the key practices that are required to increase the maturity of these processes. • Many large specialist organisations (e.g. NASA) have achieved the higher levels. • Many smaller companies have processes that are at stage 1 or 2.

  19. Dynamic Systems Development Methodology (DSDM) • A methodology that describes how RAD can be approached. • The focus of this approach is on delivering the business functionality on time. • Testing is integrated throughout the life cycle and not treated as a separate activity. • For further information refer to:

  20. Scientific ResearchMethodologies / Models

  21. Quantitative & Qualitative

  22. Main distinctions seen between quantitative and qualitative ‘paradigms’ • The conventional and constructivist belief systems (adapted from Guba and Lincoln 1989)

  23. Main distinctions seen between quantitative and qualitative ‘paradigms’ • - Common dichotomies in methodological literature

  24. Basic research methods • Quantitative research (e.g. survey) • Qualitative research (e.g. face-to-face interviews; focus groups; site visits) • Case studies • Participatory research

  25. Quantitative research • Involves information or data in the form of numbers • Allows us to measure or to quantify things • Respondents don’t necessarily give numbers as answers - answers are analysed as numbers • Good example of quantitative research is the survey
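The point that respondents give words but the answers are analysed as numbers can be sketched in a few lines of Python. This is an invented illustration: the scale mapping and responses below are hypothetical, not from the slides.

```python
# Hypothetical example: coding Likert-scale survey answers as numbers.
from statistics import mean

# Map verbal answers onto a 1-5 numeric scale
scale = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

# Respondents answered in words...
responses = ["agree", "neutral", "strongly agree", "agree", "disagree"]

# ...but the analysis treats those answers as numbers
scores = [scale[r] for r in responses]
print(f"n = {len(scores)}, mean = {mean(scores):.2f}")  # n = 5, mean = 3.60
```

The same coded scores could then feed any standard quantitative summary (counts, means, cross-tabulations).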

  26. Surveys • Think clearly about questions (need to constrain answers as much as possible) • Make sure results will answer your research question • Can use Internet for conducting surveys if need to cover wide geographic reach

  27. Qualitative research • Helps us flesh out the story and develop a deeper understanding of a topic • Often contrasted to quantitative research • Together they give us the ‘bigger picture’ • Good examples of qualitative research are face-to-face interviews, focus groups and site visits

  28. Face-to-face interviews • Must prepare questions • Good idea to record your interviews • Interviews take up time, so plan for an hour or less (roughly 10 questions) • Stick to your questions, but be flexible if relevant or interesting issues arise during the interview

  29. Focus groups • Take time to arrange, so prepare in advance (use an intermediary to help you if you can) • Who will be in your focus group? (e.g. age, gender) • Size of focus group (8-10 is typical) • Consider whether or not to have separate focus groups for different ages or genders (e.g. discussing sex and sexuality)

  30. Site visits and observation • Site visits involve visiting an organization, community project etc • Consider using a guide • Observation is when you visit a location and observe what is going on, drawing your own conclusions • Both facilitate making your research more relevant and concrete

  31. Case studies • Method of capturing and presenting concrete details of real or fictional situations in a structured way • Good for comparative analysis

  32. Participatory research • Allows participation of community being researched in research process (e.g. developing research question; choosing methodology; analysing results) • Good way to ensure research does not simply reinforce prejudices and presumptions of researcher • Good for raising awareness in community and developing appropriate action plans

  33. Planning your research: Key questions • What do you want to know? • How do you find out what you want to know? • Where can you get the information? • Who do you need to ask? • When does your research need to be done? • Why? (Getting the answer)

  34. Step 1: What? • What do I want to know? • When developing your research question, keep in mind: Who your research is for; What decisions your research will inform; What kind of information is needed to inform those decisions. • Conduct a local information scan • Take another look at your research question

  35. Step 2: How? Where? Who? • How do I find out what I want to know? • Where can I get the information I need? • Who do I need to ask? • Choose your methodology • quantitative or numbers information • qualitative in-depth explanatory information • case studies • site visits or observation • participatory research

  36. Step 3: When? • When do all the different parts of the research need to be done? • List all your research work areas • Map them against a timeline • Develop a work plan

  37. Step 4: Why? Getting the answer • Collect your data • Keep returning to your research question • Organize your research results to answer the question • Keep in mind who you are doing the research for • Focus on what research results do tell you • Be creative, methodical and meticulous

  38. Requirements Elicitation

  39. Requirements Elicitation Information to elicit: – Description of the problem domain – List of problems/opportunities requiring solution (the requirements) – Any client-imposed constraints upon the system

  40. Requirements Elicitation Requirements Elicitation Techniques: – Background Reading – Hard data collection – Interviews – Questionnaires – Group Techniques – Participant Observation – Ethnomethodology – Knowledge Elicitation Techniques

  41. Sources of Information • Clients (actual and potential) • Users of systems (actual and potential) • Domain Experts • Pre-existing system (within the problem domain) • Other relevant products • Documents • Technical standards and legislation

  42. Challenges of Elicitation (1/2) • Thin spread of domain knowledge – The knowledge might be distributed across many sources. It is rarely available in an explicit form (i.e. not written down) – There will be conflicts between knowledge from different sources. • Tacit knowledge (the “say-do” problem) – People find it hard to describe knowledge they regularly use.

  43. Challenges of Elicitation (2/2) • Limited Observability – The problem owners might be too busy coping with the current system. – Presence of an observer may change the problem, e.g. Probe Effect, Hawthorne Effect • Bias – People may not be free to tell you what you need to know. – People may not want to tell you what you need to know. • The outcome will affect them, so they may try to influence you (hidden agendas)

  44. EXERCISE

  45. Qualitative Methods (interpretive; ideographic, hermeneutic): • Construction of Reality • Ethnographic Case Studies • Unstructured Interview • Participant Observation • Diary Keeping • Narratives — Quantitative Methods (positivistic, empirical; nomothetic): • Experimental • Falsification • Correlational • Surveys • Structured Interview • Postal Questionnaires • Tests of Performance, Attitude, Intervention
