The Networked Education Database
Matthew Pittinsky, Ph.D. Candidate, Teachers College, Columbia University
NED: A Vision
• Schools have long invested in student administrative systems.
• Schools are now adopting eLearning systems equipped with gradebooks, class rosters, and Web-based survey (assessment) tools.
• Both are Internet-enabled.
• Could “generic” and custom data be collected through these systems automatically and anonymously, massively reducing the cost and complexity of educational research?
NED: The Problem…
Original Data Collection
• Requires precious classroom time.
• Informed consent is difficult to secure.
• Customizing instruments for context across sites and time is costly and discourages certain types of data collection (e.g. sociometric).
• Data entry and coding inhibit sharing and re-use.
• Incomplete responses undermine results and inhibit certain types of data collection (e.g. full classrooms).
Major Secondary Datasets
• International studies (e.g. TIMSS).
• Federal studies (e.g. NELS, HSB).
• State data warehouses (e.g. the Florida K-20 Education Data Warehouse (EDW)).
• Sponsored private studies (e.g. Add Health).
Secondary Datasets: Issues
• Require tough trade-offs when operationalizing specific research questions.
  – e.g. classmate-effect studies.
• Often based on stratified samples, not whole classrooms and schools.
  – e.g. same-teacher class periods.
• Rarely longitudinal within academic years.
• Rarely contextualized (e.g. relationship questions that require a roster).
NED: The Solution…
NED: A Data Collection Model
• Asynchronous (outside class time)
• Automatic (pre-scheduled; handles add/drops)
• Contextual (draws on system data to generate questions)
• Non-duplicative (uses already-stored or entered data where possible)
• Complete (form checks)
• Anonymous (unique ID)
• Efficient (paperless and coded)
• Sustainable (self-perpetuating)
(A minimal configuration sketch follows below.)
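Read as system requirements, the bullets above map naturally onto a small, declarative administration record. The Python sketch below shows one hypothetical shape for such a record; every class and field name is an illustrative assumption, not the pilot's actual schema.

```python
# Hypothetical NED-style administration record; all names are
# illustrative assumptions, not taken from the actual extension.
from dataclasses import dataclass
from datetime import date

@dataclass
class NedAdministration:
    survey_id: str
    opens: date                      # asynchronous: completed outside class time
    closes: date                     # e.g. a two-week "live" window
    catch_up_new_adds: bool = True   # automatic: late adds are caught up
    enforce_completion: bool = True  # complete: form checks before submission
    reuse_system_data: bool = True   # non-duplicative: pull already-stored data
    anonymous_ids: bool = True       # anonymous: responses keyed by unique ID

# Placeholder dates; the pilot ran October, January, and May windows,
# each live for two weeks.
fall = NedAdministration("ned-admin-1", date(2004, 10, 16), date(2004, 10, 30))
```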
NED: A Dataset
• Classroom-level data.
• Same-teacher data.
• Sociometric & social-psychological data.
• Longitudinal data.
• Multi-site data.
• At scale…
How NED Works
• School installs the NED extension and marks participating classes.
• The school’s eLearning system automatically posts the survey on schedule (with an announcement).
• Participant provides consent.
• Survey is delivered through the eLearning system GUI.
• Survey draws on class context (roster, subject matter, student information, etc.) when phrasing customized questions.
• Survey enforces certain completion rules.
• Survey responses are stored in special encrypted tables that self-delete after posting.
• Survey responses and pre-existing data (demographic, gradebook) are packaged and securely posted to NED.
• New students are “caught up” when added to a course.
• The teacher knows how many students completed, but not who. (This lifecycle is sketched in code below.)
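To make the lifecycle concrete, here is a minimal, runnable Python sketch of one administration's flow. All class, field, and function names are assumptions for illustration; the pilot's actual custom Blackboard extension code is not shown in this deck.

```python
# Illustrative sketch of one NED administration; names are assumptions.
import json
from dataclasses import dataclass, field

@dataclass
class Student:
    roster_name: str               # known only inside the school's system
    ned_id: str                    # anonymous ID: the only identifier sent to NED
    consented: bool = False
    answers: dict = field(default_factory=dict)

def contextual_questions(roster, subject):
    """Contextual: questions are phrased from roster/subject data on file."""
    return [f"How often do you work with {s.roster_name} in {subject}?"
            for s in roster]

def run_administration(roster, subject, system_data):
    questions = contextual_questions(roster, subject)
    encrypted_store = {}                     # stand-in for the encrypted tables
    for s in roster:
        if not s.consented:                  # consent gate
            continue
        if len(s.answers) < len(questions):  # completion rules enforced
            continue
        encrypted_store[s.ned_id] = s.answers
    # Non-duplicative: merge responses with data the system already holds
    # (gradebook, demographics), keyed by the same anonymous ID.
    package = {nid: {"responses": r, **system_data.get(nid, {})}
               for nid, r in encrypted_store.items()}
    payload = json.dumps(package)            # posted to NED over a secure channel
    encrypted_store.clear()                  # local responses self-delete
    return payload, len(package)             # the teacher sees only the count
```

A real deployment would add the scheduling, announcement, and add/drop handling described in the bullets above.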
Participant Anonymity
• Data arrives at NED as secondary data (anonymous and coded).
• Survey responses are tagged with unique participant IDs.
• Generic data is tagged with the same ID and automatically merged with survey responses (gradebook, demographic, course overlap).
• The structure of the unique participant ID allows sorting by class and school; however…
• Data arrives at NED without any knowledge of the participant’s school or classroom identity. (One possible ID scheme is sketched below.)
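The deck does not document the actual ID schema, but one hedged way to get a structure that sorts by school and class while staying anonymous is to compose opaque random segments, as in this Python sketch (all names hypothetical):

```python
# One hypothetical structured-but-anonymous participant ID scheme;
# the actual NED ID schema is not documented in this deck.
import secrets

def make_participant_id(school_code: str, class_code: str) -> str:
    """Opaque random segments let NED group records by school and class
    without revealing which school, class, or student they denote."""
    return f"{school_code}-{class_code}-{secrets.token_hex(4)}"

school_code = secrets.token_hex(3)   # assigned once per site
class_code = secrets.token_hex(3)    # assigned once per class section
pid = make_participant_id(school_code, class_code)
# e.g. "a91f3c-07be42-5d1e88aa": sortable on the first two segments,
# but not mappable back to a named school or classroom.
```

Keying the ID to a class segment would also explain one pilot issue noted later: a student who changed class periods ended up with different participant IDs across classes.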
Participant Confidentiality
• Survey responses are encrypted and automatically self-delete on the local server.
• No school official has access to student responses or completion status.
• Data transmission is via a secure protocol. (A minimal sketch of this flow follows.)
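The deck specifies only “encrypted,” “self-delete,” and “secure protocol.” A minimal sketch of that flow, assuming the third-party cryptography (Fernet) and requests packages rather than whatever the pilot actually used, might look like this:

```python
# Minimal sketch: encrypt at rest, post securely, self-delete locally.
# Library choices here are assumptions; the pilot's mechanisms are
# not specified beyond "encrypted" and "secure protocol".
import os
import requests
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # held by the extension, not school staff
box = Fernet(key)

def store_response(path: str, response_bytes: bytes) -> None:
    """Encrypt a survey response at rest on the local server."""
    with open(path, "wb") as fh:
        fh.write(box.encrypt(response_bytes))

def post_and_self_delete(path: str, ned_url: str) -> None:
    """Send the response to NED over HTTPS, then remove the local copy."""
    with open(path, "rb") as fh:
        payload = box.decrypt(fh.read())
    requests.post(ned_url, data=payload, timeout=30)
    os.remove(path)                # local copy self-deletes after posting
```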
NED: The Pilot…
NED Pilot
• Custom extension to the Blackboard Learning System.
• Solicited 15 sites; 5 agreed; 3 ultimately participated.
• All secondary schools (2 private / 1 public), in 3 different states and regions.
• 19 teachers, 37 classes, 732 participants.
• Three pre-scheduled NED survey administrations (October, January, May).
  – Each “live” for two weeks.
• NED staff know site names, but not the names of participating schools (if a district), teachers, or class information.
• Students were provided an incentive to participate.
• Surveys included questions from other datasets to compare responses.
• Approximately 250 development hours.
NED Status
• First administration launched 10-16.
• Several showstopper technical issues identified and resolved.
• One site dropped out.
• Participation varied across sites. [Participation figures not reproduced here.]
  * All surveys received were complete.
Implementation Issues
• Not a standard “building block”; required custom coding.
• Bb installations vary, affecting custom code.
• Not Bb’s standard survey tool:
  – Survey formatting limited.
  – Save-and-start, adaptive, and timing features limited.
  – Low ease of use (e.g. self-reference not grayed out in sociometric questions; matrix questions scroll off screen without freezing the roster).
• Gradebook entries are user-defined, without a standard taxonomy.
• Many schools create one mega-site for all class periods.
• Pilot leans away from core subjects.
• Pilot leans away from same-teacher course sections.
• Many schools do not use Bb as their gradebook or student profile of record.
• Relying on teacher responses for student-level data is not always viable (e.g. mixed age-grade classes).
Implementation Issues (cont.)
• Required an “enterprise license” of Blackboard.
• Data transmission via local SQL scripts, not a Web service.
• The IP address of the sending site could allow matching of the school name with the unique ID schema.
• Different participant IDs across classes (if a student changed class periods).
• System reports fragmented and unusable without additional programming.
• Total eligible population not included in system reports.
• Ideal survey length difficult to assess.
• Anonymity and class-time impact concerns arose during site solicitation.
• Will students participate?
Future Directions
• Implement through standardized APIs and via the eLearning system’s survey tool.
• Pilot with a larger number of sites.
• Pilot with smaller, more frequent surveys.
• Pilot with full site participation across all classes and grades.
• Pilot with a full age-grade population over time.
• Include non-eLearning systems (e.g. TPR) and non-Bb eLearning systems.
• Formalize a vendor NED interface program.
• Expand to higher education.
NED: Imagine
• A national dataset.
• Fed from tens of thousands of sites.
• Collecting unique classroom-level data.
• Throughout the academic year and a student’s educational career.
• With minimal site-specific maintenance.
• Efficiently and cost-effectively.
NED Team
• Technical
  – Tim Streightiff, lead developer
  – Linda Merryman, project director
  – Basheer Azizi, database engineer
• Functional
  – Matthew Pittinsky, principal investigator
  – Gary Natriello, principal investigator
Dissertation Study
• A specific case of “classmate effects”: the myriad ways in which a student’s classmate group exerts an independent, significant influence on many important educational processes and outcomes.
• Are a teacher’s expectations for a student influenced by the student’s classmates?
  – A teacher will develop expectations for Student A in part based on the expectations the teacher holds for Student A’s friends (“Company You Keep”).
  – Classmates will provide the reference-group lens through which Student A is evaluated and the teacher’s expectations for him or her are formed (“Big Fish, Small Pond”).
Other Studies / Projects
• Projects
  – Tagging engine for library resources (e.g. www.connotea.org from nature.com)
  – ePortfolio (scholarly)
  – SNA application for the rest of us
  – TC alumni community of practice (with data collection)
• Studies
  – Teacher-perceived vs. student-objective friendship networks
  – Revisit Davis’s “Frog Pond” study
  – Revisit college attachment post-Web 2.0
  – TE (teacher expectations) as a function of teacher network
  – Admissions SNA study (counselor/recruiter networks)