This research overview covers the development of a Dynamic Hypermedia Engine (DHE) that automatically adds links and other hypermedia services to applications. It also covers digital library service integration, relationship analysis, educational research on collaborative examinations, and virtual communities.
Research Overview
Michael Bieber
Information Systems Department
College of Computing Sciences
New Jersey Institute of Technology
bieber@njit.edu
http://www-ec.njit.edu
Research Overview
• Dynamic Hypermedia Engine
• Digital Library Service Integration
• Relationship Analysis
• Educational Research: Collaborative Examination
• Virtual Communities
Dynamic Hypermedia Engine
Automatically adds links and other "hypermedia" services to applications:
• comments
• guided tours
• structural search (based on links and relationships instead of keywords)
• others...
Buzzword compatible:
• Java, Servlets, RMI, XML, XHTML, RDF, etc.
Dynamic Hypermedia Engine
Example display: 1997 Sales = $127,322.12; 1997 Expenses = $85,101.99
• Links generated based on application structure, not search or lexical analysis
• You cannot do a search on the display text "$127,322.12" to find related information...
• But you can find relationships for the element Sales[1997]
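The idea above can be sketched in code: links attach to application elements such as Sales[1997], never to the displayed string. This is a minimal illustrative sketch, not the actual DHE implementation; all class and method names here are hypothetical.

```python
# Hypothetical sketch of structure-based linking (names are illustrative,
# not the real DHE API). Links key off the element (name, index), so the
# display text "$127,322.12" plays no role in finding relationships.

class Element:
    """An application element identified by name and index, not display text."""
    def __init__(self, name, index, display):
        self.name = name          # e.g. "Sales"
        self.index = index        # e.g. 1997
        self.display = display    # e.g. "$127,322.12"

    def key(self):
        return (self.name, self.index)

class LinkEngine:
    """Maps element keys to related element keys, in both directions."""
    def __init__(self):
        self.relationships = {}   # key -> list of (label, related key)

    def relate(self, a, b, label):
        self.relationships.setdefault(a.key(), []).append((label, b.key()))
        self.relationships.setdefault(b.key(), []).append((label, a.key()))

    def links_for(self, element):
        return self.relationships.get(element.key(), [])

sales = Element("Sales", 1997, "$127,322.12")
expenses = Element("Expenses", 1997, "$85,101.99")

engine = LinkEngine()
engine.relate(sales, expenses, "same fiscal year")

print(engine.links_for(sales))   # [('same fiscal year', ('Expenses', 1997))]
```

A keyword search over the display text would find nothing useful here; the lookup succeeds because the relationship was recorded against the structural key ("Sales", 1997).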
Looking for Collaboration
• Applications to integrate with DHE
• Field study sites
DLSI Architecture
Integration example: linking related documents
• Digital Library: Multimedia Document Services
DLSI Architecture
Integration example: discussing a document
• Digital Library: Multimedia Document Services
• Asynchronous Discussion Tools (Groupware)
DLSI Architecture
• Digital Library: Multimedia Document Services
• Asynchronous Discussion Tools
• Hypermedia Services
• Processes/Workflows
• Decision Analysis Support
• Conceptual Knowledge Structures
• Others...
All integrated through the Dynamic Hypermedia Engine
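The architecture above routes every service through one engine. A minimal sketch of that hub-and-spoke pattern, assuming a simple registry interface (the class and service names are hypothetical, not the actual system):

```python
# Illustrative hub-and-spoke integration sketch: each digital library or
# groupware service registers a handler with a central engine, which then
# routes requests by service name. Not the actual DHE codebase.

class Engine:
    def __init__(self):
        self.services = {}        # service name -> handler callable

    def register(self, name, handler):
        self.services[name] = handler

    def request(self, name, *args):
        if name not in self.services:
            raise KeyError(f"no service registered for {name!r}")
        return self.services[name](*args)

engine = Engine()
engine.register("discussion", lambda doc: f"opening discussion on {doc}")
engine.register("linking", lambda doc: f"links related to {doc}")

print(engine.request("discussion", "report.pdf"))
print(engine.request("linking", "report.pdf"))
```

The design point is that services never call each other directly: adding a new tool means one `register` call, and every existing service can reach it through the engine.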
Looking for Collaboration
• Digital library services to integrate into this infrastructure
• Collections to integrate, so they can use the various digital library services
Relationship Analysis (RA)
• Motivation: What to link?
• RA: a systematic analysis methodology based on relationships
• RA provides analysts with a deeper understanding of a system or information domain
• The relationships "discovered" can be implemented as links (automatically by DHE)
RA: Sample Analysis Questions (replace "item" with "vendor")
• Activity Relationships
• Who uses this item, and how?
• What are this item's inputs and outputs; what does it produce?
• What is required to use this item?
• Who is involved with this item?
• Intentional Relationships ("meaning/opinions")
• Which goals, issues and arguments involve this item?
• What are the policies, positions or statements on this item?
• What comments and opinions have been expressed about this item?
• What are the constraints, limitations, priorities and options for this item?
• What rationale exists for this item?
Vendor Relationships (possible links resulting from an RA analysis)
• Vendor details (address, contact, customer service, Web site)
• Reliability (on-time, complete orders, quality, service)
• Vendor agreements & discounts
• Who else has used this vendor
• Purchasing history with this vendor (mine, others)
• All application screens with this vendor
• All documents concerning this vendor
• Annotations/comments on this vendor
• Policies regarding this vendor
• Rationale for using this vendor in the past
• What people typically buy from this vendor
• Which vendors generally give better deals than this one
• Alternatives to this vendor
• Social considerations regarding this vendor
• Vendor's parent company and subsidiaries
• Vendor's partnerships and agreements with other companies
• Instructions: how to choose a vendor; how to evaluate a vendor
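Relationships discovered by an RA analysis like the vendor list above can be recorded as typed link data that DHE-style software could then generate automatically. This is a hypothetical encoding for illustration only; the relationship names and function are assumptions, and only a few of the slide's relationship types are shown.

```python
# Hypothetical encoding of RA output: each discovered relationship type
# becomes one candidate link per vendor. The labels come from a subset of
# the vendor example on the slide; the schema itself is illustrative.

VENDOR_RELATIONSHIPS = {
    "details": "address, contact, customer service, Web site",
    "reliability": "on-time, complete orders, quality, service",
    "agreements": "vendor agreements & discounts",
    "history": "purchasing history with this vendor",
    "annotations": "comments and opinions on this vendor",
    "alternatives": "alternatives to this vendor",
}

def links_for_vendor(vendor_id):
    """Produce one candidate link per discovered relationship type."""
    return [
        {"source": vendor_id, "type": rel, "label": label}
        for rel, label in VENDOR_RELATIONSHIPS.items()
    ]

for link in links_for_vendor("vendor-42"):
    print(f'{link["source"]} --{link["type"]}--> {link["label"]}')
```

Because the relationship types live in data rather than in hand-authored anchors, every vendor element in the application gets the same link menu for free, which is the sense in which RA results can be "implemented as links automatically."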
Looking for Collaboration
• Domains/complex systems to analyze using Relationship Analysis
• Field study sites
Collaborative Examinations
Jia Shen, NJIT
Starr Roxanne Hiltz, NJIT
Kung-E Cheng, Rutgers University
Yooncheong Cho, Rutgers University
Michael Bieber, NJIT
Collaborative Exam
1. Why?
• To reduce the instructor's own workload
• To test a new method of conducting exams
2. A form of collaborative learning
• Previous research is limited
Exam Procedures
• Traditional exam:
• 3-hour, in-class, 3-4 essay questions, 6 pages of notes
• Collaborative exam:
• Students compose questions
• Students select questions (eliminated in spring 2000)
• Students answer selected questions
• Students grade questions
• Intermediate grading by a Ph.D. student
• Professor assigns final grade and handles disputes
Issues
• Need to see behind anonymity
• Grading guidelines and grade inflation
• Consistent grading
• Trade-offs for students
- drawn-out process vs. concentrated
- access to everything vs. limited access to notes
- we could not fully justify the process to the students
• Trade-offs for professors
- limited but harder grading vs. easier grading
- drawn-out process vs. concentrated
- much more administration
Looking for Collaboration
• Other courses that would like to use a similar approach, or which we can contrast to our collaborative examinations
Knowledge Sharing and Learning in Virtual Communities
Michael Bieber (NJIT)
Ricki Goldman (NJIT)
Roxanne Hiltz (NJIT)
Il Im (NJIT)
Ravi Paul (NJIT)
Jenny Preece (University of Maryland, Baltimore County)
Ron Rice (Rutgers University, SCILS)
Ted Stohr (Stevens Institute of Technology)
Murray Turoff (NJIT)
Motivation
Why do people participate in virtual communities?
• to attract customers/clients
• for amusement
• to socialize; to find comfort (medical communities)
• to network, build contacts
• to improve what you do (job, personal)
• to find information, solve problems, learn from others
==> collaboration, knowledge-sharing and learning underlie most of these, directly or indirectly
Research Question: How best to support this?
Goal
Increasing people's effectiveness by helping them share knowledge and learn through virtual communities
Example Tasks (of individuals) for an academic research community
• learning about the community domain
• learning about relevant people in the community
• teaching a course
• finding materials on a research topic
• mentoring members in research or learning
• developing software using community research
• developing/selling software to serve the community
Example Community Tasks of an academic research community
• running a conference
• conducting elections
• writing the newsletter / submitting to the newsletter
• making the budget
• proposing & running a task force
• recruiting new society members
Approach
• Concept building regarding knowledge and learning within virtual communities
• Study testbed communities
• Prototype tools
• Prototype procedures
• Evaluate
• virtual communities
• learning and effectiveness
• the prototype tools and procedures
Community Knowledge Resides in...
• documents (published papers, reports, photos, videos, lesson plans, syllabi, etc.)
• discussions
• decisions
• conceptual models
• formal educational modules
• workflows/processes
• people's expertise
• links/relationships among all of these
Community Services Architecture
• Digital Library: Multimedia Document Services
• Asynchronous Discussion Tools
• Hypermedia Services
• Processes/Workflows
• Decision Analysis Support
• Conceptual Knowledge Structures
• Others...
All integrated through the Dynamic Hypermedia Engine
Evaluation
• Focus on the individual level and the community level
• Pilots and assessment on actual communities
• Action Research: work actively with participants
• Propositions/hypotheses and measures
• Formative Evaluation to assess/improve tools (requirements analysis, usability testing)
• Summative Evaluation to assess usage, impacts, satisfaction (direct observation, interviews, surveys, usage profiles)
Looking for Collaboration
• Testbed communities