Explore automatic adaptation techniques for a better user experience on diverse mobile devices. Learn about manual vs. automatic adaptation, their issues, and proposed solutions. Discover Community-Driven Adaptation (CDA), which empowers users in the adaptation process. Covers research issues and an experimental evaluation.
Community-Driven Adaptation Iqbal Mohomed Department of Computer Science University of Toronto
Heterogeneity in Mobile Devices • Significant Resource Variability among Mobile Devices • Screen real-estate • Networking • Power Capacity and Consumption • User Interface • Memory • Processing Capability
One Size Does Not Fit All • Providing a high-quality user experience requires content and applications to be customized for different devices!
Useful Customizations • A plethora of techniques exists for transforming content • Modality (convert text to speech on cell phones) • Fidelity (remove color information in images on devices with monochrome displays and limited bandwidth) • Layout (simplify the layout of web pages on PDAs) • Summarization (condense large paragraphs of text into single lines on a pager using an automated text summarizer) • Distinct content types usually benefit from different transformations • Most transformations have configuration parameters that can be varied • How do we choose?
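To make the fidelity example concrete, here is a minimal sketch (not from the original system) of one such transformation with a tunable configuration parameter: re-encoding an image in grayscale at reduced JPEG quality, as one might do for a monochrome, bandwidth-limited device. It assumes the Pillow imaging library; the function name and default quality are illustrative.

```python
from io import BytesIO
from PIL import Image  # Pillow

def reduce_fidelity(image_bytes: bytes, quality: int = 50) -> bytes:
    """Strip color and recompress; `quality` is the tunable parameter."""
    img = Image.open(BytesIO(image_bytes))
    gray = img.convert("L")                      # drop color information
    out = BytesIO()
    gray.save(out, format="JPEG", quality=quality)
    return out.getvalue()
```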
Manual Adaptation • Publishers make content available for several classes of devices • e.g., HTML and WAP versions of a Web page • Disadvantages: high human cost • Must be done for every device (multiple versions) • Maintaining consistency and coherence is hard • You can never cover all possible versions! • Continuous effort to support new types of devices • In practice: • Only done for a few high-traffic sites • Limited number of devices • Slow update propagation • Not scalable
Automatic Adaptation • Extract multiple versions of content automatically • Optimize for device type, user preferences, context, etc. • Can be static or dynamic • Static: run a converter after each change • Dynamic: adapt content on-the-fly • Issues • Where is the adaptation decision made? • How is the adaptation/customization decision made?
Where to Perform Adaptation? • Adaptation can be performed on: • Server • Places burden on each and every content producer • Client • Applications must be made adaptation-aware (potentially requiring source code changes) • Approaches such as Puppeteer alleviate this problem • May require server support • Middleware • Proxy interposed on the network path between server and client • Natural aggregation point
How to Adapt Automatically? • Rule-based adaptation • Apply preset adaptation rules to content • e.g., convert images larger than 10 KB to JPEGs at 50% resolution • e.g., ASP.NET Mobile Controls provides profiles for a large set of devices, where the rules apply to web controls • Constraint-based adaptation • Captures the trade-off between resource consumption and user satisfaction in a mathematical formula • An automatic solver makes the adaptation decision by finding a solution that meets all constraints, minimizes resource consumption, and maximizes user satisfaction
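The following is a minimal Python sketch of how the two approaches above might look in code. The rule table, the object fields, and the brute-force constraint "solver" are illustrative assumptions, not part of any framework named on the slide.

```python
# Rule-based adaptation: apply the first preset rule whose predicate matches,
# mirroring the example rule "images larger than 10 KB -> lower-resolution JPEG".
RULES = [
    (lambda obj: obj["type"] == "image" and obj["size"] > 10 * 1024,
     {"format": "JPEG", "scale": 0.5}),
]

def apply_rules(obj):
    for predicate, params in RULES:
        if predicate(obj):
            return params        # first matching rule wins
    return None                  # no adaptation

# Constraint-based adaptation: pick the candidate version with the highest
# user-satisfaction score whose size fits the resource budget.
def constraint_based_choice(candidates, byte_budget):
    """candidates: list of (size_bytes, satisfaction) pairs."""
    feasible = [c for c in candidates if c[0] <= byte_budget]
    return max(feasible, key=lambda c: c[1]) if feasible else None
```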
Limitations of Rules and Constraints • Specifying per-object, per-device rules is too much work • No different from manual adaptation • In practice, a small set of global rules is used • Global rules are insufficient because they are content-agnostic • The desired adaptation for an object depends on the task and the object's content (semantics)!
Core Issues • Need rule for every object, device, task • Computer alone can't do it • Human Designer can, but it is costly and does not scale • Idea: • Have the user decide • Apply decision to like-minded users
Community-Driven Adaptation (CDA) • Group users into communities based on their adaptation requirements • System makes an initial prediction as to how to adapt content (using rules and constraints) • Let users fix adaptation decisions • Feedback mechanism • System learns from user feedback • Improves adaptation predictions for future accesses by members of the community
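A minimal sketch of the CDA feedback loop, assuming fidelity is expressed as a discrete level (e.g., a number of PJPEG slices, as in the experiments later). The data structures and the "take the maximum of the community's feedback" heuristic are illustrative; choosing good prediction heuristics is one of the research issues listed below.

```python
from collections import defaultdict

# (community, object) -> fidelity levels that past users settled on
history = defaultdict(list)

def record_feedback(community, obj_id, level):
    """Called when a user stops requesting refinements for an object."""
    history[(community, obj_id)].append(level)

def predict_fidelity(community, obj_id, default=1):
    """Predict the fidelity to serve next time; fall back to a rule-based
    default when the community has no history for this object."""
    levels = history.get((community, obj_id))
    return max(levels) if levels else default
```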
How it Works [Architecture diagram: Mobile 1 and Mobile 2 applications access Server 1 and Server 2 through the CDA Proxy; the proxy supplies adaptation predictions and handles "improve fidelity" requests from users]
Advantages • User Empowerment: Users can fix bad adaptation decisions • Minimal Inconvenience: Burden of feedback is spread over entire community and is very low for each member • User does not have to provide feedback in every interaction
Research Issues • How good are CDA predictions? • What are good heuristics for learning how to adapt? • At what granularity should user accesses be tracked? (e.g., object, page, site) • How do we classify users into communities? • Does the classification change over time? • What types of adaptation does this technique support? • Fidelity, page layout, modality (text to voice, video to image) • User interfaces • What are good affordances to support adaptation? • What are the effects of the UI on the quality of adaptation predictions?
Experimental Evaluation • Goal: Quantify extent to which CDA predictions meet users’ adaptation requirements • Approach: • Step 1: User study • Create trace that captures levels of adaptation that users consider appropriate for a given task/content • Step 2: Simulation • Compare rule-based and CDA predictions to values in trace
Experimental Setup • 1 application: Web browsing • 1 kind of adaptation: Fidelity • 1 data type: Images • 1 adaptation method: Progressive JPEG compression • 1 community: same device (laptop at 56 Kbps), same content, same tasks
Trace-Gathering Proxy • When loading a page: • Transcode Web images into PJPEG • Split the PJPEG into 10 slices • Provide only the 1st slice initially • An IE plug-in enables users to request fidelity refinements • When the user clicks on an image: • Provide an additional slice • Reload the image in IE • Add the request to the trace
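A minimal sketch of the bookkeeping described above, assuming the PJPEG transcoding step has already produced a list of 10 byte-range slices (the transcoder itself is elided). Class and variable names are illustrative.

```python
trace = []  # (user, url, slices_served) tuples logged for the user study

class SlicedImage:
    def __init__(self, url, slices):
        self.url = url
        self.slices = slices     # the 10 byte ranges of the PJPEG stream
        self.served = 1          # only the 1st slice is sent with the page

    def refine(self, user):
        """Handle a click on the image: serve one more slice and log it."""
        if self.served < len(self.slices):
            self.served += 1
            trace.append((user, self.url, self.served))
        return b"".join(self.slices[:self.served])
```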
Web Sites and Tasks • Car show: find cars with license plates • E-Store: buy a PDA, camera, and Aibo based on visual features • UofT Map: determine the names of all buildings between the main library and the subway • Goal: finish the task as fast as possible (minimize clicks) • Traces capture the minimum fidelity level that users consider sufficient for the task at hand
Sample Web Site and Task [Screenshots of the car show application at the lowest fidelity and at an improved fidelity]
Evaluation Metrics • Extra data • Measure of overshoot • Extra data sent by the prediction beyond what was selected by the user • Extra clicks • Measure of undershoot • Number of times a user would have to click to raise the fidelity level from the prediction to what they required in the trace
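A minimal sketch of how the two metrics could be computed from the trace, assuming fidelity is counted in slices and slice_bytes[i] gives the size of slice i for the object; the function names are illustrative.

```python
def extra_data(predicted, required, slice_bytes):
    """Overshoot: bytes sent beyond the fidelity level the user selected."""
    if predicted <= required:
        return 0
    return sum(slice_bytes[required:predicted])

def extra_clicks(predicted, required):
    """Undershoot: clicks needed to raise fidelity from the prediction to
    the level required in the trace."""
    return max(0, required - predicted)
```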
Results • Sample baseline rule-based policies: fixed20 (used when data needs to be conserved) and fixed40 (used when clicks need to be conserved) • Sample data-conservative CDA policy: e.g., avg incurs the same number of extra clicks as the rule-based baseline but sends much less extra data (about 90% less) • Sample click-conservative CDA policy: e.g., max uses the same amount of extra data as the rule-based baseline but requires fewer extra clicks (about 40% fewer)
Summary • CDA adapts data taking into account the content's relationship to the user's task • CDA outperforms rule-based adaptation • 90% less bandwidth wastage • 40% fewer extra clicks
And finally ... Questions? Contact Information: iq@cs.toronto.edu www.cs.toronto.edu/~iq