Fusing physical and cognitive spaces: Using wireless networked sensors to assess the who, what, where, when, and how of student learning
Gregory K. W. K. Chung, UCLA/CRESST
Mani B. Srivastava, Department of Electrical Engineering, UCLA
Annual Conference of the National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
September 14-15, 2000, Los Angeles, CA
Measuring Behavior
• Current techniques
  • Real-time observation with sampling
  • Observation of videotaped or audiotaped data
• Characteristics
  • Time-consuming and prone to error
  • Rarely capture temporal properties of behavior
  • Major advantage: human-in-the-loop categorizing of observations
Measuring Behavior
• Sensor-based techniques
  • Computationally measure physical properties of a person and related objects
  • Computationally derive observations from sensor data
• Vast improvement in observation capabilities
  • Scalability (high number of observations)
  • Efficiency (more information per unit cost)
  • Timeliness (rapid turnaround time)
  • Accuracy
Measuring Behavior
• Sensor-based techniques (continued)
  • Measure the who, what, where, when, and how of human-human and human-object interactions
• Key challenges
  • Develop algorithms that aggregate sensor data into measures of the construct of interest that are accurate, meaningful, credible, and in a form usable by different end users (a rough aggregation sketch follows this list)
  • Relate behavioral measurements to cognitive processes and task outcomes
  • Approximate the 24/7 human observer
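As a rough illustration of the aggregation challenge, the sketch below collapses a stream of raw proximity readings into contiguous "near" episodes that an observer could then interpret. The record format, distance threshold, and minimum duration are assumptions made for illustration, not values from the project.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ProximityReading:
    """One raw reading from a proximity sensor (hypothetical record format)."""
    t: float           # seconds since the start of the session
    distance_m: float  # estimated distance between two tagged entities

def near_episodes(readings: List[ProximityReading],
                  near_threshold_m: float = 0.5,
                  min_duration_s: float = 5.0) -> List[Tuple[float, float]]:
    """Collapse raw proximity readings into (start, end) episodes during which
    two entities stayed within near_threshold_m of each other for at least
    min_duration_s. Both thresholds are illustrative assumptions."""
    ordered = sorted(readings, key=lambda r: r.t)
    episodes, start = [], None
    for r in ordered:
        if r.distance_m <= near_threshold_m:
            if start is None:
                start = r.t          # episode begins
        else:
            if start is not None and r.t - start >= min_duration_s:
                episodes.append((start, r.t))
            start = None             # episode ends or never started
    if start is not None and ordered[-1].t - start >= min_duration_s:
        episodes.append((start, ordered[-1].t))
    return episodes
```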
Wireless Networked Sensors
• Wireless networked sensors
  • Integrate sensing and short-range communication functions in a single unit (see the record sketch below)
  • Low power consumption (long operational life)
  • Small form factor (embed in everyday objects)
  • RF (avoids line-of-sight problems)
• Tetherless, bi-directional connection to the Internet
  • Remote measurement and control capability
  • Embed "intelligence" and interactivity in everyday objects
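A minimal sketch of the kind of record such a node might transmit appears below. The field names, units, and JSON serialization are assumptions for illustration; the actual nodes and protocols are not specified here.

```python
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class SensorReading:
    """One report from a wireless networked sensor node
    (hypothetical message format, for illustration only)."""
    node_id: str      # identity of the tagged student or object
    sensor: str       # e.g., "position", "orientation", "proximity", "acoustic"
    value: tuple      # sensor-specific payload, e.g., (x, y, z) in meters
    timestamp: float  # seconds since the Unix epoch

    def to_packet(self) -> bytes:
        """Serialize the reading for transmission over the node's RF link."""
        return json.dumps(asdict(self)).encode("utf-8")

# Example: a position report from a tag attached to student S1
packet = SensorReading("S1", "position", (2.3, 4.1, 0.0), time.time()).to_packet()
```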
Sample of Sensor Types
• Acoustic
• Light
• Image/video
• Touch/pressure
• Temperature
• Identification
• Position (x, y, z)
• Proximity (x′, y′, z′)
• Orientation (360°)
• Movement (acceleration)
Potential Application
• Describing interaction
  • Student-object
  • Student-student
  • Student-teacher
  • Teacher-object
• Triangulate multiple measures to successively refine inferences about an interaction (a simple combination rule is sketched below)
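One way such triangulation might work, sketched below under assumed thresholds, is to label a pair of students as interacting only when proximity, orientation, and acoustic evidence agree.

```python
def interacting(distance_m: float,
                facing_each_other: bool,
                either_speaking: bool,
                max_distance_m: float = 1.5) -> bool:
    """Label a pair of students as 'interacting' only when independent
    measures agree: they are close (proximity), oriented toward one another
    (orientation), and at least one is speaking (acoustic). The distance
    threshold and the conjunctive rule are illustrative assumptions."""
    return (distance_m <= max_distance_m
            and facing_each_other
            and either_speaking)

# Example: two students 0.8 m apart, facing each other, one speaking
print(interacting(0.8, True, True))   # -> True
```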
Example: Deriving observations of a small-group object categorization task
Object and student position, student orientation, and object-proximity data allow the following questions to be answered (question 1 is sketched in code below):
1. How many objects are categorized correctly by shape? (12: squares, triangles, circles)
2. What object are the students focused on? (the rhombus)
3. How many objects remain to be categorized? (1: the rhombus)
(Diagram: three students, S1-S3, seated around the categorization objects.)
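A sketch of how question 1 might be answered from position data alone appears below. The bin regions, shape labels, and example coordinates are hypothetical.

```python
from typing import Dict, Optional, Tuple

Point = Tuple[float, float]    # (x, y) position in meters
Rect = Tuple[Point, Point]     # (lower-left corner, upper-right corner)

def region_of(pos: Point, regions: Dict[str, Rect]) -> Optional[str]:
    """Return the name of the category region containing pos, if any."""
    x, y = pos
    for name, ((x0, y0), (x1, y1)) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def count_correct(objects: Dict[str, Tuple[str, Point]],
                  regions: Dict[str, Rect]) -> int:
    """Count objects whose sensed position lies in the region matching their
    shape; objects maps object id -> (shape, sensed position)."""
    return sum(1 for shape, pos in objects.values()
               if region_of(pos, regions) == shape)

# Hypothetical layout: a square bin and a triangle bin on the table
regions = {"square": ((0.0, 0.0), (0.5, 0.5)),
           "triangle": ((0.6, 0.0), (1.1, 0.5))}
objects = {"obj1": ("square", (0.2, 0.3)),     # placed correctly
           "obj2": ("triangle", (0.3, 0.2))}   # misplaced in the square bin
print(count_correct(objects, regions))          # -> 1
```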
Example: Deriving observations of small-group instruction
Position, orientation, and acoustic data allow the following questions to be answered (question 1 is sketched in code below):
1. Who is paying attention to the teacher? (S1, S2)
2. Which students are participating? (S1, S2)
3. What is the nature of the utterance? (S2 asks a question)
4. Which students are not paying attention or participating? (S3, S4)
(Diagram: teacher T with students S1-S4.)
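A sketch of how question 1 might be answered from position and orientation data appears below. The 30° attention cone and the example coordinates are assumed values, not validated parameters.

```python
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]   # (x, y) position in meters

def facing(student_pos: Point, heading_deg: float,
           target_pos: Point, tolerance_deg: float = 30.0) -> bool:
    """True if target_pos lies within tolerance_deg of the direction the
    student's tag reports they are facing. The tolerance is an assumption."""
    dx = target_pos[0] - student_pos[0]
    dy = target_pos[1] - student_pos[1]
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    diff = abs((bearing - heading_deg + 180.0) % 360.0 - 180.0)
    return diff <= tolerance_deg

def attending(students: Dict[str, Tuple[Point, float]],
              teacher_pos: Point) -> List[str]:
    """Ids of students oriented toward the teacher; students maps
    id -> (position, heading in degrees)."""
    return [sid for sid, (pos, heading) in students.items()
            if facing(pos, heading, teacher_pos)]

# Example: S1 faces the teacher, S3 faces away
students = {"S1": ((1.0, 0.0), 180.0), "S3": ((2.0, 0.0), 0.0)}
print(attending(students, teacher_pos=(0.0, 0.0)))   # -> ['S1']
```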
Potential Application
• Describing the classroom environment
• Measures of:
  • Amount of lecture, independent, and small-group instruction
  • Student resource use
  • Student roaming profiles (a simple roaming measure is sketched below)
  • Teacher-student interaction
  • Student-student interaction
  • Student attention
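For example, a roaming profile could be summarized from a timestamped position trace. The sketch below reports total distance traveled; the trace format and example values are assumptions.

```python
import math
from typing import List, Tuple

# One trace entry: (timestamp in seconds, x in meters, y in meters)
PositionTrace = List[Tuple[float, float, float]]

def distance_traveled(trace: PositionTrace) -> float:
    """Total path length of a student's movement through the classroom,
    computed from successive sensed positions (a crude roaming measure)."""
    ordered = sorted(trace)                      # order by timestamp
    return sum(math.hypot(x1 - x0, y1 - y0)
               for (_, x0, y0), (_, x1, y1) in zip(ordered, ordered[1:]))

# Example: a student who moved from their desk to a bookshelf and back
trace = [(0.0, 1.0, 1.0), (30.0, 4.0, 5.0), (60.0, 1.0, 1.0)]
print(distance_traveled(trace))   # -> 10.0
```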
Next Steps
• NSF Information Technology Research Grant (2000-2002)
  • UCLA Electrical Engineering is the lead department (PI: Srivastava); the UCLA Computer Science Department and CRESST are partners
• Develop technology: wireless protocols, network architectures, middleware architecture, data management and mining, user profiling, speech recognition
• Application domain: assessing young children's (K-1) problem-solving development
Next Steps
• Qualitative analyses of the classroom, children's interactions with each other, and children's interactions with objects
• Develop measures using sensor data
• Validate measures against human observations
• Develop a sensor-based assessment of children's problem-solving skills
  • Use play or other manipulative-based tasks that require demonstration of performance
  • Use an extended task to gather data over time