Towards Soft Optimization Techniques for Parallel Cognitive Applications Woongki Baek, JaeWoong Chung, Chi Cao Minh Christos Kozyrakis, Kunle Olukotun Computer Systems Laboratory Stanford University http://tcc.stanford.edu
Introduction • Cognitive applications are widely used • From commercial fields to military and security applications • Efficient parallelization techniques are needed • To process ever-increasing data sets • This brief announcement discusses: • Optimizations that build upon soft computing properties • How they target common bottlenecks for parallel execution • A case study with Loopy Belief Propagation (LBP)
Soft Computing Properties • Conventional vs. cognitive applications • Conventional: precise data, strict accuracy requirements • Cognitive: inherently noisy inputs, must handle uncertainty, an acceptable approximation of the "correct" answer suffices • Soft computing properties: • User-defined, relaxed correctness • E.g., a small misclassification rate is tolerable • Redundancy • E.g., computations and communications on already-converged nodes • Inherent adaptivity to errors • E.g., noisy data from sensor nodes in online object tracking
Soft Optimizations (1) • O1) Reducing computation • Target: excessive workload due to large data sets • Possible optimizations: • Data dropping (sampling) • Lazy computation • Aggressive solution pruning • O2) Mitigating imbalance • Target: work and region imbalances • Possible optimizations: • Adaptive workload discarding • Selective barriers
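To make O1 concrete, here is a minimal Python sketch of data dropping (sampling); the keep rate, the backlog threshold, and the `process` callback are illustrative assumptions, not values from the talk:

```python
# Hedged sketch of O1 "data dropping": under heavy load, process only a
# random sample of the inputs and accept a small accuracy loss.
# keep_rate and backlog_threshold are invented for illustration.
import random

def soft_process(items, process, keep_rate=0.5, backlog_threshold=10_000):
    if len(items) > backlog_threshold:          # drop data only when overloaded
        items = [x for x in items if random.random() < keep_rate]
    return [process(x) for x in items]
```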
Soft Optimizations (2) • O3) Reducing communication • Target: expensive communication on large-scale systems • Possible optimization: • Adaptive communication • O4) Reducing synchronization • Target: frequent, expensive synchronization • Possible optimizations: • Selective synchronization • Imprecise updates
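One way O4's selective synchronization could look in code, sketched with Python threads; the synchronization period, the worker count, and the worker body are assumptions for illustration:

```python
# Hedged sketch of O4 "selective synchronization": workers cross the
# barrier only every SYNC_PERIOD iterations, tolerating slightly stale
# shared state in between. All constants here are illustrative.
import threading

N_WORKERS, SYNC_PERIOD = 4, 8
barrier = threading.Barrier(N_WORKERS)

def worker(n_iters, do_local_work):
    for it in range(n_iters):
        do_local_work(it)                       # compute on local data
        if (it + 1) % SYNC_PERIOD == 0:         # synchronize only periodically
            barrier.wait()

threads = [threading.Thread(target=worker, args=(32, lambda it: None))
           for _ in range(N_WORKERS)]
for t in threads: t.start()
for t in threads: t.join()
```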
Case Study: Loopy Belief Propagation (LBP) • An efficient approximation for probabilistic inference on graphical models • Computes and communicates "beliefs" of the nodes in the graph • Complexity grows linearly with the number of nodes • Not guaranteed to compute the correct beliefs for networks with loops
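To make the case study concrete, here is a minimal sum-product LBP sketch on a three-node loop. This is not the authors' implementation: the random potentials, the two-state variables, and the fixed iteration count are assumptions for illustration.

```python
# Minimal loopy belief propagation on a pairwise MRF with a loop.
import numpy as np

N_STATES = 2
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}           # a 3-node cycle
rng = np.random.default_rng(0)
phi = {i: rng.random(N_STATES) for i in neighbors}      # node potentials
psi = {(i, j): rng.random((N_STATES, N_STATES))         # edge potentials
       for i in neighbors for j in neighbors[i] if i < j}
psi.update({(j, i): p.T for (i, j), p in list(psi.items())})

# m[(i, j)] is the message node i sends to neighbor j
m = {(i, j): np.ones(N_STATES) for i in neighbors for j in neighbors[i]}

def belief(i):          # b_i proportional to phi_i times incoming messages
    b = phi[i] * np.prod([m[(k, i)] for k in neighbors[i]], axis=0)
    return b / b.sum()

for _ in range(50):     # iterate; convergence is not guaranteed on loopy graphs
    new_m = {}
    for i in neighbors:
        for j in neighbors[i]:
            prod = phi[i] * np.prod([m[(k, i)] for k in neighbors[i] if k != j],
                                    axis=0)
            msg = psi[(i, j)].T @ prod                  # sum over sender states
            new_m[(i, j)] = msg / msg.sum()
    m = new_m

print([belief(i) for i in neighbors])
```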
Evaluation • Soft optimizations on LBP • Adaptive message version 1 (MSG1) • Discards messages when both the sender and the receiver have converged • Target: O1, O2, and O3 • Adaptive message version 2 (MSG2) • Discards messages when only the sender has converged • Target: O1, O2, and O3 • Lazy belief computation (LazyBC) • Skips belief computations on converged nodes • Target: O1 and O3 • Evaluation • A detailed simulator for a shared-memory multiprocessor • Metrics: performance improvement and accuracy loss
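A hedged sketch of how MSG1/MSG2-style message discarding and LazyBC could be layered onto the LBP loop above (it reuses `neighbors`, `phi`, `psi`, `m`, and `belief` from that sketch); the threshold `EPS` and the per-node convergence test are assumptions, not the paper's exact criteria:

```python
# Continues the LBP sketch above. Skips work on converged nodes:
#   MSG1   - discard a message when both sender and receiver have converged
#   MSG2   - discard when the sender alone has converged (more aggressive)
#   LazyBC - skip belief recomputation on converged nodes
EPS = 1e-4                                 # assumed convergence threshold
old_b = {i: belief(i) for i in neighbors}
converged = {i: False for i in neighbors}

for _ in range(50):
    new_m = dict(m)                        # discarded messages keep old values
    for i in neighbors:
        for j in neighbors[i]:
            # MSG1 test; MSG2 would check only the sender: `if converged[i]`
            if converged[i] and converged[j]:
                continue
            prod = phi[i] * np.prod([m[(k, i)] for k in neighbors[i] if k != j],
                                    axis=0)
            msg = psi[(i, j)].T @ prod
            new_m[(i, j)] = msg / msg.sum()
    m = new_m
    for i in neighbors:
        if converged[i]:
            continue                       # LazyBC: converged beliefs stay as-is
        b = belief(i)
        converged[i] = bool(np.abs(b - old_b[i]).max() < EPS)
        old_b[i] = b
```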
Results: Speedup • #1: Soft optimizations improve performance significantly • E.g., 56.1x on 32 processors (superlinear) • #2: Consistent benefits over conventional versions for all processor counts
Results: Accuracy • Outstanding tolerance to soft optimizations (accuracy loss < 0.2%) • Note: NOT a proof of general accuracy guarantees • Further research is necessary on this question • Soft versions can provide real-time answers that are verified later
Implications • We have shown that soft optimizations can be valuable • Further research • Algorithms • Evaluation with more applications • Analysis of convergence, accuracy, and performance • Programming languages and runtime systems • Language constructs to express soft optimizations • Runtime support for monitoring & automatic adaptation • Architectures • Support for relaxed coherence or communication • Support for introspection
Thanks & Questions? Woongki Baek (wkbaek@stanford.edu)