Evaluating Ubiquitous Computing Applications In Situ
Katherine Everitt (UW, IRS intern), Sunny Consolvo (IRS, UW), Ian Smith (IRS), James Landay (IRS, UW CSE)
Intel Research Seattle / University of Washington
In-Use, In-Situ Workshop, 28 October 2005
Talk Overview
• 2 evaluation approaches
  • Wizard of Oz
  • Mobile phones (existing technologies)
• Cross-cutting problem: prototype fidelity
• Value of in situ evaluations
In Situ Evaluation Technique: Wizard of Oz
• Two examples:
  • CareNet Display
  • Home Energy Tutor Configuration Tool
[Images: Home Energy Tutor Configuration Tool; CareNet Display]
WoZ Example: CareNet Display
• Interactive photo augmented with care-relevant updates
• Goal: help local care network members provide day-to-day care
WoZ Example: CareNet Display
• System architecture for evaluation (illustrative sketch below)
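The slide above referred to a diagram rather than code, but the essence of a Wizard-of-Oz architecture is easy to sketch: a human operator supplies the "sensed" updates that the not-yet-built sensing layer would eventually produce, and the display renders them as if they came from real sensors. The sketch below is a minimal, hypothetical illustration of that relay; the threading setup, update format, and prompts are assumptions, not the CareNet Display's actual implementation.

```python
# Minimal Wizard-of-Oz relay sketch (hypothetical; not the actual CareNet architecture).
# A human "wizard" types care-relevant updates; the "display" thread renders whatever
# the wizard last entered, standing in for sensing the system does not yet have.
import queue
import threading
import time

updates = queue.Queue()  # wizard -> display channel

def display_loop():
    """Poll for new updates and 'render' them, as the in-home display would."""
    latest = "No updates yet"
    while True:
        try:
            latest = updates.get(timeout=1.0)  # new update from the wizard
        except queue.Empty:
            pass  # keep showing the last update
        print(f"[display] {time.strftime('%H:%M')}  {latest}")
        time.sleep(5)

threading.Thread(target=display_loop, daemon=True).start()

# Wizard console: the researcher observes the elder (or phones a caregiver)
# and manually enters what a sensor would eventually report.
while True:
    entry = input("wizard> (e.g. 'Medication taken at 9am'): ")
    updates.put(entry)
```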
WoZ Example: CareNet Key Results
• Improved relationships among the care network (qualitative feedback)
  • Including caregiver-caregiver relationships
• Improved quality of conversations with the elder
• Before deployment, all elders said they would share with local members
  • Distant relations were a concern at the start
  • Elders were bad predictors of whom to share with
In Situ Wizard of Oz: Challenges
• Data collection was labor intensive
  • Full-time intern for 3 months! No exceptions!
  • Not including development and analysis!
• Unreliable technology
  • GPRS
  • Difficult-to-use touch screen
• Can't rely on participants to alert you when the technology fails
  • No matter how much you beg!
In Situ Evaluation Technique: Mobile Phones (Existing Technologies)
• Houston: sharing fitness information within a social group
  • Focus on the social effects
  • Competition, peer pressure/support
• Reno: sharing location data within a family
  • Focus on privacy issues
  • Location system design guidance
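Reno's privacy focus comes from location being disclosed person-to-person, with the participant deciding what to reveal, rather than broadcast automatically. The sketch below is a generic illustration of that kind of request/response disclosure; the place names, auto-share rule, and function names are illustrative assumptions, not Reno's actual design.

```python
# Generic sketch of person-to-person location disclosure (illustrative only;
# not Reno's actual design). A request names the requester; the user decides
# whether to answer, and with which place label, keeping disclosure under
# the participant's control -- the property the privacy findings hinge on.
PLACES = {"home", "work", "gym", "school"}  # hypothetical user-defined place names

def handle_location_request(requester, current_place, auto_share_with=frozenset()):
    """Return the place label to disclose, or None to stay silent."""
    if requester in auto_share_with and current_place in PLACES:
        return current_place  # pre-approved family member gets an automatic reply
    answer = input(f"{requester} asks where you are [place name / blank to ignore]: ")
    return answer if answer in PLACES else None  # user stays in control

reply = handle_location_request("spouse", "work", auto_share_with={"spouse"})
print("disclosed:", reply)  # -> "work" without prompting
```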
Mobile Phone Example: Houston
mobile computing + social influence = increased step count
• Key Decision: step counts are a reasonable proxy for physical activity
Mobile Phone Example: Houston
• 3-week, in situ study of Houston
• 3 groups of women aged 28–42 (13 participants total)
• Carried an "extra" mobile phone daily
  • Too complex to "modify" their own phones
• One group with no "assisted" sharing
• Two groups with enhanced sharing
• Used the first week to set a baseline (see the sketch below)
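The first-week baseline mentioned above implies a simple analysis: compare each participant's later daily step counts against her own baseline week. A minimal sketch of that comparison follows; the numbers and function name are made up for illustration.

```python
# Sketch of the baseline comparison implied by the study design (illustrative only).
# Week 1 step counts set each participant's baseline; later weeks are compared to it.
from statistics import mean

def baseline_and_change(daily_steps, baseline_days=7):
    """Return (baseline mean, mean of remaining days, percent change)."""
    baseline = mean(daily_steps[:baseline_days])
    rest = mean(daily_steps[baseline_days:])
    return baseline, rest, 100.0 * (rest - baseline) / baseline

# Hypothetical participant: 7 baseline days followed by 14 study days.
steps = [5200, 4800, 6100, 5000, 5500, 4700, 5900,
         6200, 6800, 5900, 7100, 6500, 7000, 6400,
         6900, 7200, 6600, 7400, 6800, 7100, 7300]
base, rest, pct = baseline_and_change(steps)
print(f"baseline {base:.0f} steps/day, study {rest:.0f} steps/day ({pct:+.1f}%)")
```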
Houston: Key Results
• Participants wanted credit for all activities
  • And proper credit within an activity
• 7 of 13 participants increased their average daily step count
• Qualitative data suggests that most participants changed their behavior
• Sharing motivates some people
• Support, but don't require, social interaction
Mobile Phones: Challenges
• Input/output of a mobile phone is limited
  • Exacerbated by development choices
• A 2nd mobile phone was unnatural
  • But the 2nd phone offered us "more control"
  • The Nokia 6600 is considered "too big" by some
• Data loss when out of range (see the sketch below)
• Timing of SMS
• Charging batteries: custom apps drain batteries faster than participants are used to
  • Nightly charging isn't always done
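The data-loss and SMS-timing problems above are classic symptoms of uploading readings immediately over an unreliable link. One common mitigation (not necessarily what Houston or Reno did) is store-and-forward: persist every reading locally and only discard it after a confirmed upload. A minimal sketch, with a hypothetical `send` stub standing in for the real SMS/GPRS upload:

```python
# Store-and-forward sketch for field-study data collection (hypothetical design;
# the slides do not specify how Houston or Reno handled coverage gaps).
import json
import time
from collections import deque

pending = deque()  # readings waiting to be uploaded

def send(record):
    """Stand-in for the real SMS/GPRS upload; returns False when out of coverage."""
    return False  # hypothetical: assume we are currently out of range

def log_reading(record):
    pending.append(record)  # always persist locally first
    flush()

def flush():
    while pending:
        record = pending[0]
        try:
            if not send(record):  # still out of range; try again later
                return
        except Exception:
            return  # transient failure; keep the data, retry later
        pending.popleft()  # only discard after a confirmed send

# Example usage with a timestamped step-count reading (field names are illustrative).
log_reading({"t": time.time(), "steps": 6120, "participant": "P07"})
print(json.dumps(list(pending)))  # anything still queued will be retried
```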
Mobile Phones: Loss of expensive prototype technologies
• Theft
  • Theft from the participant
  • Theft from you, by the participant
• Damage
  • Pedometers lost to the toilet, washing machine, sea, etc.
  • Phones/PDAs dropped (resulting in cracked screens)
• Lost (the phone gets left somewhere)
Cross-cutting problem: Fidelity of evaluation prototypes
[Images: iGlove, iBracelet]
Value of early-stage in situ evaluations: No pain, no gain
• Where would this woman clip a pedometer or carry a large cell phone?
• The glow of the CareNet Display was often distracting for participants who could see it while watching a movie or trying to sleep.
Thank you! • Questions? Comments? • Contact us at: • everitt@cs.washington.edu • [sunny.consolvo, ian.e.smith, james.a.landay]@intel.com