A programming exercise evaluation service for Mooshak
José Paulo Leal | Ricardo Queirós
CRACS & INESC-Porto LA
Faculdade de Ciências, Universidade do Porto
Rua do Campo Alegre, 1021
4169-007 Porto
PORTUGAL
Outline
• Introduction
  • Context
  • Motivation
  • Goal
• Architecture
  • eLearning Frameworks
  • E-Framework
  • Evaluation service (service genre, expression and usage model)
• Design
• Conclusion
1. Introduction: Context
• Experience of projects with evaluation components
  • Mooshak - contest management system for ICPC contests
  • EduJudge - use of the UVA programming exercise collections in LMSs
• Emergence of eLearning frameworks
  • they advocate SOA approaches to facilitate technical interoperability
  • based on a survey, the most prominent is the E-Framework (E-F)
1. Introduction: Motivation
• Integration of systems for automatic evaluation of programs
  • program evaluators are complex
  • difficult to integrate into eLearning systems (e.g. LMSs)
  • program evaluators should be autonomous services
• Modelling evaluation services
  • communication with heterogeneous systems:
    • Learning Object Repositories (LOR)
    • Learning Management Systems (LMS)
    • Integrated Development Environments (IDE)
  • conformance to eLearning frameworks improves interoperability
1. Introduction: Motivation
• Integration of the evaluation service in an eLearning network
1. Introduction: Goal
• Architecture
  • integration of the evaluation service in an eLearning network
  • definition of an evaluation service in an eLearning framework
  • formalise concepts related to program evaluation
• Design
  • extend an existing contest management system
  • expose evaluation functions as services
  • reuse existing administration functions
2. Architecture
• eLearning frameworks
  • specialized software frameworks
  • advocate SOA to facilitate technical interoperability
• Types:
  • Abstract: creation of specifications and best practices for eLearning systems (e.g. IEEE LTSA, OKI, IMS AF)
  • Concrete: service designs and/or components that can be integrated in implementations of artifacts (e.g. SIF, E-F)
• Survey: E-F and SIF are the most promising frameworks
  • they are the most active projects
  • both have a large number of implementations worldwide
2. Architecture
• E-Framework
  • initiative established by JISC, DEEWR, NZ MoE and SURF
  • aims to facilitate system interoperability via a SOA approach
  • has a knowledge base to support its technical model
  • http://www.e-framework.org/
2. Architecture
• Support of the online community (developers wiki)
• Contribution to the E-Framework:
  • Service Genre (SG)
  • Service Expression (SE)
  • Service Usage Model (SUM)
2. Architecture - SG
• Text File Evaluation Service Genre
  • responsible for the assessment of a text file
    • a text file with an attempt to solve an exercise
    • the exercise is described by a learning object
  • supports three functions (see the interface sketch below):
    • ListCapabilities
    • EvaluateSubmission
    • GetReport
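A minimal sketch of how the service genre's three functions could be modelled as an abstract interface. The class, method and parameter names are assumptions for illustration; they are not part of the E-Framework contribution itself.

```python
from abc import ABC, abstractmethod


class TextFileEvaluationService(ABC):
    """Abstract view of the Text File Evaluation service genre (names assumed)."""

    @abstractmethod
    def list_capabilities(self) -> list[str]:
        """Return the capabilities supported by this evaluator."""

    @abstractmethod
    def evaluate_submission(self, lo_reference: str, attempt: bytes,
                            capability: str) -> str:
        """Request an evaluation of an attempt; return a ticket or a report."""

    @abstractmethod
    def get_report(self, ticket: str) -> str:
        """Return the XML evaluation report for a previously issued ticket."""
```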
2. Architecture - SG
• ListCapabilities function:
  • lists all the capabilities supported by a specific evaluator
  • capabilities depend strongly on the evaluation domain:
    • computer programming evaluator: programming language compilers
    • electronic circuit simulator: the collection of gates allowed in a circuit
2. Architecture - SG
• EvaluateSubmission function:
  • requests an evaluation for a specific exercise (see the request sketch below)
  • the request includes:
    • a reference to an exercise, as a learning object held in a repository
    • a text file with an attempt to solve that exercise
    • the evaluator capability necessary for a proper evaluation of the attempt
  • the response includes:
    • a ticket for a later report request, or a detailed evaluation report
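A hedged sketch of what an EvaluateSubmission request could look like over the REST binding. The endpoint URL, field names and response handling are assumptions for illustration; the slides state only that the functions are exposed as SOAP and REST web services.

```python
import requests

# Assumed REST endpoint; the concrete URL is not given in the slides.
SERVICE_URL = "http://evaluator.example.org/evaluate"


def evaluate_submission(lo_reference: str, source_path: str, capability: str) -> str:
    """Send an attempt for evaluation; return a ticket or an inline XML report."""
    with open(source_path, "rb") as source:
        response = requests.post(
            SERVICE_URL,
            data={"lo": lo_reference, "capability": capability},
            files={"attempt": source},
        )
    response.raise_for_status()
    return response.text  # ticket for GetReport, or the evaluation report itself


# Example: evaluate a Java attempt against an exercise held in a repository.
# ticket = evaluate_submission("http://lor.example.org/lo/42", "Fib.java", "java")
```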
2. Architecture - SG
• GetReport function:
  • gets the report for a specific evaluation
  • the report included in the response may be transformed on the client side (see the transformation sketch below):
    • based on an XML stylesheet
    • able to filter out parts of the report
    • or to calculate a classification based on its data
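A sketch of the client-side transformation described above, using an XSLT stylesheet to derive a classification from a report. The report structure and the stylesheet are invented for illustration; the real report follows the ERL schema introduced later.

```python
from lxml import etree

# Assumed stylesheet: computes a grade as the percentage of passed test cases.
XSLT = b"""<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/report">
    <xsl:value-of select="round(100 * count(test[@result='passed'])
                                 div count(test))"/>
  </xsl:template>
</xsl:stylesheet>"""


def classify(report_xml: bytes) -> str:
    """Apply the stylesheet to an evaluation report and return a classification."""
    transform = etree.XSLT(etree.XML(XSLT))
    return str(transform(etree.XML(report_xml)))


# classify(b"<report><test result='passed'/><test result='failed'/></report>")
# -> "50"
```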
2. Architecture - SE
• The Evaluate-Programming-Exercise SE
  • requests
    • program source code
    • a reference to a programming exercise as a Learning Object (LO)
  • resources
    • learning objects retrieved from a repository
    • LOs are archives with assets (test cases, description) and metadata
  • responses
    • an XML document containing the evaluation report
    • details of the test case evaluations
[Diagram: the Evaluation Engine takes source code plus a LO reference as input, uses the LO retrieved from the repository as a resource, and produces the evaluation report as output.]
2. Architecture - SE
• The E-Framework model contains 20 distinct elements to describe a service expression (SE)
• Major E-Framework elements:
  • Behaviours & Requests
  • Use & Interactions
  • Applicable Standards
  • Interface Definition
  • Usage Scenarios
2. Architecture - SE
• Behaviours & Requests
  • details technical information about the functions of the SE
  • the 3 types of request handled by the SE:
    • ListCapabilities: provides client systems with the capabilities of a particular evaluator
    • EvaluateSubmission: requests an evaluation for a specific programming exercise
    • GetReport: allows a requester to get the report for a specific evaluation using a ticket
2. Architecture - SE
• Use & Interactions
  • illustrates how the functions defined in the Behaviours & Requests section are combined to produce a workflow
[Workflow diagram: (1) the Learning Management System sends a LO reference and an attempt to the Evaluation Engine (correction and classification); (2, 3) the engine uses the LO reference to retrieve the LO from the Learning Objects Repository; (4) the evaluation report is returned to the LMS.]
2. Architecture - SE
• Applicable Standards
  • enumerates the technical standards used in the SE
  • content standards (IMS CP, IEEE LOM, EJ MD) and interoperability standards (IMS DRI)
2. Architecture - SE
• Interface Definition
  • formalizes the interfaces of the service expression
  • syntax of the requests and responses of the SE functions
  • functions exposed as SOAP and REST web services
2. Architecture - SE
• Interface Definition
  • Evaluation Response Language (ERL)
    • covers the definition of the response messages of the 3 functions
    • formalised in XML Schema (see the parsing sketch below)
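A sketch of consuming an ERL-style response on the client side. Since the slides do not show the schema, the element and attribute names below are hypothetical; the real vocabulary is defined by the ERL XML Schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical ERL-like report; real element names may differ.
SAMPLE = """
<evaluationResponse>
  <report exercise="http://lor.example.org/lo/42" capability="java">
    <test id="1" result="passed"/>
    <test id="2" result="failed" observation="Wrong Answer"/>
  </report>
</evaluationResponse>
"""

root = ET.fromstring(SAMPLE)
for test in root.iter("test"):
    # Print each test case's id, outcome, and any judge observation.
    print(test.get("id"), test.get("result"), test.get("observation", ""))
```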
2. Architecture - SE
• Usage Scenarios
2. Architecture - SUM
• Text File Evaluation SUM
  • describes the workflows within a domain
  • composed of SGs or SEs
  • template diagram from the E-F
  • two business processes:
    • Archive Learning Objects
    • Evaluate Learning Objects
3. Design
• Evaluation Service: design principles & decisions
  • support the E-Framework architecture
  • extend an existing contest management system - Mooshak
  • reuse existing functions rather than implement new ones
  • create a front controller for the service
  • maintain the administration web interface
  • map service concepts to Mooshak concepts
3. Design
• Evaluation Service: mapping service concepts to Mooshak
  • Service -> Contest
    • only contests marked as serviceable
    • several contests served simultaneously
    • the same contest can be served and managed
  • Capability -> Contest + Language
    • service requests specify a contest & a language (within that contest)
    • controls the evaluation context
    • produces the evaluation report (XML)
3. Design
• Evaluation Service: mapping service concepts to Mooshak
  • Service requester -> Team
    • IDs based on the remote IP address & port
    • basis for authentication
    • also useful for auditing
  • Learning Object -> Problem
    • LOs downloaded from remote repositories
    • converted to Mooshak problems
    • downloaded problems used as a cache (see the caching sketch below)
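A hedged sketch of the LO-to-problem caching idea described above. The cache directory, the key derivation and the conversion step are placeholders for illustration, not Mooshak's actual implementation.

```python
import pathlib
import urllib.request
import zipfile

CACHE = pathlib.Path("problems")  # assumed local cache of converted problems


def get_problem(lo_reference: str) -> pathlib.Path:
    """Return the local problem for a LO, downloading and converting on a miss."""
    # Derive a filesystem-safe cache key from the LO reference (illustrative).
    key = lo_reference.replace("/", "_").replace(":", "_")
    problem = CACHE / key
    if not problem.exists():                               # cache miss
        CACHE.mkdir(exist_ok=True)
        archive = problem.with_suffix(".zip")
        urllib.request.urlretrieve(lo_reference, archive)  # fetch the LO package
        convert_to_mooshak_problem(archive, problem)       # hypothetical converter
    return problem


def convert_to_mooshak_problem(archive: pathlib.Path, dest: pathlib.Path) -> None:
    """Unpack the LO archive (assets + metadata) into a problem directory."""
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dest)
```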
4. Conclusion
• Definition of an evaluation service
  • contribution to the E-Framework with a new Service Genre, Service Expression and Service Usage Model
  • validation of the proposed model with an extension of the Mooshak contest management system
• Current and future work
  • a first prototype is already available
  • communication with repositories is still in development
  • integration in a network of eLearning systems
  • a full evaluation of this service is planned for next fall
Questions?

Authors
José Paulo Leal
zp@dcc.fc.up.pt
http://www.dcc.fc.up.pt/~zp

Ricardo Queirós
ricardo.queiros@eu.ipp.pt
http://www.eseig.ipp.pt/docentes/raq

Thanks!