CMP3265 – Professional Issues and Research Methods Research Proposals: • Aims and objectives, Method, Evaluation • Yesterday – aims and objectives: clear, timely, significant, original, feasible • This morning – Research Methods, Research Evaluation
Research Method - characteristics The method should encompass: • What areas of research are involved and how this research builds upon them • How the research is to be conducted • A plan for the research project that will achieve the objectives • How the project is to be managed Questions of a chosen method: • Will the method deliver the aims/objectives? • Is the method appropriate for the type of research? • Can the results produced by the method be reproduced?
Back to Yesterday’s Example “Research Hypothesis: Object-oriented database technology leads to better quality software than RDB technology” METHOD - How the research is to be conducted • Assemble development team T • Identify an application A • Apply T to A using OODB • Apply T to A using RDB • Use results to ‘prove’ hypothesis Comment on the method.
better method? “Research Hypothesis: Object-oriented database technology leads to better quality software than RDB technology” METHOD: • Identify a set of software developers T • Randomly separate T into development teams T1 and T2 • Identify an application A • Apply T1 to A using OODB technology • Measure product quality by recording metrics eg development time, bug rate, LOC, code complexity • Apply T2 to A using RDB technology • Measure product quality by recording metrics eg development time, bug rate, LOC, code complexity Comment on the method.
better method? BUT: • Splitting one pool of developers randomly into two teams can still introduce bias – one team may, by chance, end up with the more experienced developers • Using a single application A is not good enough – the result may not hold for other kinds of application • The choice of the _actual_ OODB and RDB technologies will influence the result
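To make the random-assignment step concrete, here is a minimal Python sketch, assuming a purely hypothetical pool of developers and placeholder metric names; the real metric values would come from the development exercise itself.

```python
import random

# Hypothetical pool of developers (names invented for illustration).
developers = ["dev%02d" % i for i in range(20)]

# Randomly assign the pool to two equal-sized teams, so that differences
# in individual skill are spread by chance rather than by choice.
random.shuffle(developers)
half = len(developers) // 2
team_t1, team_t2 = developers[:half], developers[half:]

# Each team builds the same application A with a different technology.
# The metric values below are placeholders, to be filled in from the
# actual development exercise (development time, bug rate, LOC, complexity).
results = {
    "T1 (OODB)": {"dev_time": None, "bug_rate": None, "loc": None, "complexity": None},
    "T2 (RDB)":  {"dev_time": None, "bug_rate": None, "loc": None, "complexity": None},
}

print("Team T1:", team_t1)
print("Team T2:", team_t2)
```

Repeating the study with several independent random splits, or with a larger pool, would reduce the risk that a single split happens to concentrate the experienced developers in one team.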
Example 2 “Hypothesis: Algorithm A is faster at solving problems from population X than Algorithm B” METHOD - How the research is to be conducted: • Implement algorithm A • Implement algorithm B • Generate a random sample from X • Apply A to each member of the sample – record CPU time • Apply B to each member of the sample – record CPU time • Use results to ‘prove’ hypothesis Comment on the method.
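A rough sketch of how this could be run, assuming two stand-in functions (algorithm_a, algorithm_b) and randomly generated integer lists in place of the real algorithms and the real population X:

```python
import random
import time

# Placeholder implementations: in a real study these would be the two
# algorithms under comparison, applied to genuine problem instances from X.
def algorithm_a(problem):
    return sorted(problem)

def algorithm_b(problem):
    return sorted(problem, reverse=True)

def cpu_seconds(func, problem):
    """Run func on one problem instance and return the CPU time used."""
    start = time.process_time()
    func(problem)
    return time.process_time() - start

# Population X of problem instances (here, random integer lists) and a
# random sample drawn from it.
population = [[random.randint(0, 10**6) for _ in range(2000)] for _ in range(100)]
sample = random.sample(population, 20)

times_a = [cpu_seconds(algorithm_a, p) for p in sample]
times_b = [cpu_seconds(algorithm_b, p) for p in sample]

print("mean CPU time, algorithm A:", sum(times_a) / len(times_a))
print("mean CPU time, algorithm B:", sum(times_b) / len(times_b))
```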
Research Methods – Sampling Research methods often involve SAMPLING: • Choose a representative / random sample S from population X • Experiment and obtain results on sample S • Make claims about S • GENERALISE the claims to the whole population X
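A minimal illustration of the sample-then-generalise pattern, using an invented population so that the sample estimate can be checked against the known population value:

```python
import random

random.seed(1)  # fixed seed so the illustration is repeatable

# Invented population X: 10,000 individuals, each flagged True if they have
# some property of interest (the true proportion is only known here because
# the population is simulated).
population = [random.random() < 0.35 for _ in range(10_000)]

# Choose a random sample S from X and obtain a result on S.
sample = random.sample(population, 200)
claim_about_sample = sum(sample) / len(sample)

# The claim about S is then GENERALISED to the whole population X, which is
# only safe if S is genuinely representative of X.
true_population_value = sum(population) / len(population)
print("estimate from sample S   :", claim_about_sample)
print("true value in population X:", true_population_value)
```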
Sampling - Pitfalls Consider the following samples. Is it safe to generalise? 1. What is being tested: a software method; Population: programmers; Sample: a set of IT students 2. What is being tested: an educational software method; Population: IT students; Sample: a random set of Huddersfield IT students 3. Re-look at Example 2 above Samples must be representative of the larger population
Research Methods – cause and effect Research methods often involve trying to prove causality – that a particular feature causes a particular effect • Does the use of one particular method improve a process compared to some other? • Does the use of an enhanced algorithm improve its quality (eg speed)? • Does a course of training / learning improve the effectiveness of developers?
The effect of ‘Extraneous Variables’ Research often falls down because of extraneous variables in cause and effect – e.g. is it the new method / algorithm that causes the improvement, or some other factor? For example, it may be the skill of the team itself, rather than the method, that produces an observed improvement. In computing research, experiments tend to be VERY complex – it is important to remove any extraneous variables that may produce side-effects
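A toy simulation of this point, with entirely invented numbers: the apparent improvement shown by team T1 comes from the skill of its developers (the extraneous variable), not from the new method it is using.

```python
import random

random.seed(2)  # fixed seed so the illustration is repeatable

# Invented skill levels: team T1 happens to get the more experienced developers.
skill_t1 = [random.gauss(8.0, 1.0) for _ in range(10)]   # more experienced
skill_t2 = [random.gauss(5.0, 1.0) for _ in range(10)]   # less experienced

# In this simulation the outcome depends only on skill; the "new method"
# used by T1 contributes nothing, yet T1 still looks better on average.
def project_outcome(skill):
    return skill + random.gauss(0.0, 0.5)

outcomes_new_method = [project_outcome(s) for s in skill_t1]  # T1, new method
outcomes_old_method = [project_outcome(s) for s in skill_t2]  # T2, old method

print("mean outcome, new method (T1):", sum(outcomes_new_method) / len(outcomes_new_method))
print("mean outcome, old method (T2):", sum(outcomes_old_method) / len(outcomes_old_method))
```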
Research Evaluation Typically there are several general ways to evaluate research results (plus combinations of these) • Literature Comparison – show the work is superior to existing / past work by comparing it with written accounts of other work • Empirical – run experiments and take measurements • Rational – prove or demonstrate properties [eg prove an algorithm’s computational complexity]
Research Evaluation - Measurements Comment on these fictional claims: “.. research shows that Pentium processors can function in environments at twice the normal ambient temperature” ? “.. research shows that XP is twice as reliable as W95” ? “..research shows that the XP OS has twice as many bugs as Linux“ ? Scientists must be very careful with statistics and measurements
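A small sketch, using fabricated bug counts, of why a headline ratio such as “twice as many bugs” should be reported together with the underlying measurements, sample sizes and spread:

```python
import statistics

# Fabricated bug counts per release for two systems (illustrative numbers only).
bugs_system_1 = [2, 1, 3, 2, 2]
bugs_system_2 = [4, 1, 7, 3, 5]

mean_1 = statistics.mean(bugs_system_1)
mean_2 = statistics.mean(bugs_system_2)

# The headline figure on its own: "twice as many bugs".
print("headline ratio:", mean_2 / mean_1)

# The context that should accompany it: sample size and spread.
print("system 1: mean", mean_1, "stdev", round(statistics.stdev(bugs_system_1), 2),
      "n =", len(bugs_system_1))
print("system 2: mean", mean_2, "stdev", round(statistics.stdev(bugs_system_2), 2),
      "n =", len(bugs_system_2))
```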
Example of “Good” research – best research paper handout Using FORM rather than CONTENT, analyse and evaluate the computing research paper given out. Consider: • What type of research is it – ground breaking, new or improved method, new algorithm, new theory etc • Are its aims clear, significant, timely, original? • Does it outline what areas of research are involved and how this research builds upon them? • The method that was used to conduct the research • How the research was evaluated • Literature comparison? • Empirical? • Rational?
Example of “Good” research – best research paper handout • What type of research is it – it is ground-breaking in that it combines two areas into one for a new framework (for something…!) • The objective seems fairly clear, and it is written as if it is significant, timely, original (you need to be a subject expert to decide on that one) • What kind of method + evaluation does it use? • Outlines the theory about ‘problems’ and ‘solutions’ that share a common formulation • Outlines a ‘discovered’ algorithm (LDFS) that is general enough to solve the scope of problems • Evaluates the new algorithm • rationally (by proving propositions), • empirically, by applying it to a subset of the formulation (MDPs) with comparison to previous state-of-the-art algorithms • comparisons are also made throughout the paper
UK Research Assessment Exercise This is a huge evaluation of research to determine the ‘value’ of research groups in UK universities. Each subject area has to present evidence: • For each submitted academic: 4 pieces of work (publications) + 100 words stating their ‘impact’ • ‘Peer esteem’ – measured by factors such as conference / journal organisation and management • Number of PhD awards per year • Amount of research income per year
Portfolio Exercise • Take any piece of published research (this can include staff research disseminated in last term’s seminar series) and analyse and evaluate it with respect to: • What type of research is it – ground breaking, new or improved method, new algorithm, new theory etc • Are its aims clear, significant, timely, original? • What kind of method does it use? • How is it evaluated? • Literature comparison? • Empirical? • Rational? Comment on its overall quality (and include a copy of the publication or a link to it in your portfolio)