Learning and Testing Submodular Functions
Grigory Yaroslavtsev
Columbia University, October 26, 2012
With Sofya Raskhodnikova (SODA'13)
+ Work in progress with Rocco Servedio
Submodularity
• Discrete analog of convexity/concavity, law of diminishing returns
• Applications: optimization, algorithmic game theory
Let $f : 2^X \to \mathbb{R}$, where $X$ is a finite ground set
• Discrete derivative: $\partial_x f(S) = f(S \cup \{x\}) - f(S)$
• Submodular function: $\partial_x f(S) \ge \partial_x f(T)$ for all $S \subseteq T \subseteq X$ and $x \in X \setminus T$
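To make the definition concrete, here is a small brute-force check (my own sketch, not part of the talk): it stores a function as a table over a tiny ground set and verifies the decreasing-derivative condition on every pair $S \subseteq T$.

```python
from itertools import combinations

def subsets(ground):
    """All subsets of the ground set, as frozensets."""
    for r in range(len(ground) + 1):
        for c in combinations(sorted(ground), r):
            yield frozenset(c)

def derivative(f, S, x):
    """Discrete derivative: d_x f(S) = f(S + {x}) - f(S)."""
    return f[S | {x}] - f[S]

def is_submodular(f, ground):
    """Brute-force check: d_x f(S) >= d_x f(T) for all S <= T and x outside T."""
    return all(
        derivative(f, S, x) >= derivative(f, T, x)
        for S in subsets(ground)
        for T in subsets(ground) if S <= T
        for x in ground - T
    )

# Example: f(S) = min(|S|, 2) is monotone submodular (a concave function of |S|).
ground = frozenset({1, 2, 3, 4})
f = {S: min(len(S), 2) for S in subsets(ground)}
print(is_submodular(f, ground))  # True
```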
Exact learning
• Q: Reconstruct a submodular $f$ exactly (for all arguments) with poly(|X|) queries?
• A: Only an $\tilde{\Theta}(\sqrt{|X|})$-approximation (multiplicative) is possible [Goemans, Harvey, Iwata, Mirrokni, SODA'09]
• Q: Reconstruct $f$ on a $1-\epsilon$ fraction of points (PAC-style learning with membership queries under the uniform distribution)?
• A: Almost as hard [Balcan, Harvey, STOC'11]
Approximate learning
• PMAC-learning (multiplicative approximation) with poly(|X|) queries, over arbitrary distributions [BH'11]
• PAAC-learning (additive approximation):
  • [Gupta, Hardt, Roth, Ullman, STOC'11]
  • Running time: poly [Cheraghchi, Klivans, Kothari, Lee, SODA'12]
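For orientation, the two guarantees being contrasted can be written as follows (paraphrased from the PMAC definition of [BH'11] and the additive-error model; the symbols $h$, $D$, $\alpha$, $\epsilon$, $\delta$ are the usual learning-theory ones, not notation from these slides):

```latex
% PMAC-learning with approximation factor \alpha:
% with probability at least 1-\delta the learner outputs a hypothesis h with
\Pr_{x \sim D}\bigl[\, h(x) \le f(x) \le \alpha \cdot h(x) \,\bigr] \ge 1 - \epsilon .
% PAAC-learning with additive error \alpha:
\Pr_{x \sim D}\bigl[\, |h(x) - f(x)| \le \alpha \,\bigr] \ge 1 - \epsilon .
```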
Learning • For all algorithms
Learning: Bigger picture
• Hierarchy of valuation classes (each contains the next): Subadditive ⊇ XOS (= fractionally subadditive) ⊇ Submodular ⊇ Gross substitutes ⊇ OXS ⊇ Additive (linear)
• Value queries / demand queries
• Other positive results:
  • Learning valuation functions [Balcan, Constantin, Iwata, Wang, COLT'12]
  • PMAC-learning (sketching) valuation functions [Badanidiyuru, Dobzinski, Fu, Kleinberg, Nisan, Roughgarden, SODA'12]
  • PMAC-learning Lipschitz submodular functions [BH'10] (concentration around the average via Talagrand)
Discrete convexity • Monotone convex • Convex
Discrete submodularity
• Case study: $R = 1$ (Boolean submodular functions $f : 2^X \to \{0,1\}$)
• Monotone submodular = monomial
• Submodular = 2-term CNF
Discrete monotone submodularity
• Monotone submodular: $f(S) \le f(T)$ and $\partial_x f(S) \ge \partial_x f(T)$ for all $S \subseteq T \subseteq X$, $x \in X \setminus T$
• Theorem: for monotone submodular $f : 2^X \to \{0, \dots, R\}$ and every $S \subseteq X$:
  $f(S) = \max_{T \subseteq S,\, |T| \le R} f(T)$
• $f(S) \ge \max_{T \subseteq S,\, |T| \le R} f(T)$ (by monotonicity)
• For the other direction, let $T$ be a smallest subset of $S$ such that $f(T) = f(S)$:
  • for every $x \in T$ we have $f(T \setminus \{x\}) < f(T)$
  => by submodularity, the restriction of $f$ to subsets of $T$ is strictly increasing along every chain
  => $|T| \le f(T) - f(\emptyset) \le R$
Representation by a formula
• Theorem: for monotone submodular $f : 2^X \to \{0,\dots,R\}$: $f(S) = \max_{T \subseteq S,\, |T| \le R} f(T)$
• Notation switch: sets $S \subseteq X$ ↔ Boolean strings $x \in \{0,1\}^n$, so $f : \{0,1\}^n \to \{0,\dots,R\}$
• Monotone pseudo-Boolean formula = one with no negations
• (Monotone) submodular $f$ can be represented as a (monotone) pseudo-Boolean 2R-DNF with constants in $\{0,\dots,R\}$
Discrete submodularity
• Submodular $f : \{0,1\}^n \to \{0,\dots,R\}$ can be represented as a pseudo-Boolean 2R-DNF with constants in $\{0,\dots,R\}$
• Hint [Lovász] (submodular monotonization): given submodular $f$, define $f_{mon}(S) = \min_{T \supseteq S} f(T)$. Then $f_{mon}$ is monotone and submodular.
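A brute-force rendering of the monotonization hint (my sketch, taking the definition $f_{mon}(S) = \min_{T \supseteq S} f(T)$ stated above; the toy function below is made up):

```python
from itertools import combinations

def monotonize(f):
    """Lovasz-style monotonization: f_mon(S) = min over supersets T of S of f(T)."""
    return {S: min(v for T, v in f.items() if S <= T) for S in f}

# Toy non-monotone submodular function (modular, hence submodular, for simplicity):
# f(S) = 2 + sum of weights of elements in S, with one negative weight.
ground = frozenset({1, 2, 3})
weight = {1: -1, 2: 1, 3: 1}
all_sets = [frozenset(c) for r in range(len(ground) + 1)
            for c in combinations(sorted(ground), r)]
f = {S: 2 + sum(weight[i] for i in S) for S in all_sets}

f_mon = monotonize(f)
# f_mon is monotone (and, per the hint, remains submodular).
assert all(f_mon[S] <= f_mon[T] for S in all_sets for T in all_sets if S <= T)
```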
Learning pB-formulas and k-DNF
• $\mathcal{C}$ = class of pB-DNF of width $k$ with constants in $\{0,\dots,R\}$
• $i$-slice of $f$: the Boolean function $f_i$ defined by $f_i(x) = 1 \iff f(x) \ge i$
• $f \in \mathcal{C}$ iff its $i$-slices are $k$-DNF, and then $f(x) = \max\{\, i : f_i(x) = 1 \,\}$ (taken as 0 if no slice fires)
• PAC-learning $f$ reduces to PAC-learning its $i$-slices ($k$-DNFs)
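A minimal illustration of the slice decomposition (my sketch, using the threshold reading of $i$-slices given above; the tiny truth table is made up):

```python
def slices(f, R):
    """The i-th slice of f is the Boolean function [f(x) >= i], for i = 1..R."""
    return {i: {x: int(v >= i) for x, v in f.items()} for i in range(1, R + 1)}

def recombine(slice_fns, domain):
    """Recover f(x) = sum_i [f(x) >= i], i.e. the largest i whose slice fires (0 if none)."""
    return {x: sum(s[x] for s in slice_fns.values()) for x in domain}

# Tiny made-up example: f takes values in {0, 1, 2}, so R = 2.
f = {'00': 0, '01': 1, '10': 1, '11': 2}
assert recombine(slices(f, R=2), f) == f
```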
Learning pB-formulas and k-DNF
• Learn every $i$-slice on a $1 - \epsilon/R$ fraction of arguments
• Learning $k$-DNF (let its Fourier sparsity = $s$):
  • Kushilevitz-Mansour (Goldreich-Levin): poly($n$, $s$, $1/\epsilon$) queries/time
  • ``Attribute-efficient learning'': query complexity with only logarithmic dependence on $n$
  • Lower bound: learning even a random $k$-junta ($k$-DNF) up to constant precision requires many queries
• Optimizations:
  • Slightly better than KM/GL by looking at the Fourier spectrum of $k$-DNF (see SODA paper: switching lemma => sparsity bound)
  • Do all $R$ iterations of KM/GL in parallel by reusing queries
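The basic query primitive behind Kushilevitz-Mansour / Goldreich-Levin is estimating individual Fourier coefficients from membership queries; below is a minimal sketch of that single step (my own illustration, not the full KM/GL bucketing and search, and with a made-up oracle f):

```python
import random

def estimate_fourier_coefficient(query, S, n, samples=20000):
    """Estimate hat{f}(S) = E_x[ f(x) * (-1)^{sum_{i in S} x_i} ] over uniform x in {0,1}^n."""
    total = 0.0
    for _ in range(samples):
        x = [random.randint(0, 1) for _ in range(n)]
        chi = (-1) ** sum(x[i] for i in S)  # the character chi_S(x)
        total += query(x) * chi
    return total / samples

# Made-up oracle: f(x) = (-1)^(x0 + x1); its only nonzero Fourier coefficient is on S = {0, 1}.
f = lambda x: (-1) ** (x[0] + x[1])
print(estimate_fourier_coefficient(f, {0, 1}, n=5))  # close to 1.0
print(estimate_fourier_coefficient(f, {2}, n=5))     # close to 0.0
```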
Property testing
• Let $C$ be the class of submodular functions $f : \{0,1\}^n \to \{0,\dots,R\}$
• How to (approximately) test whether a given $f$ is in $C$?
• Property tester: (randomized) algorithm for distinguishing:
  • $f \in C$
  • $f$ is $\epsilon$-far from $C$ ($f$ must be changed on at least an $\epsilon$ fraction of points to land in $C$)
• Key idea: $k$-DNFs have small approximate representations:
  • [Gopalan, Meka, Reingold, CCC'12] (using quasi-sunflowers [Rossman'10]): for every $k$-DNF formula $F$ there exists a $k$-DNF formula $F'$ of size $(k \log(1/\epsilon))^{O(k)}$ such that $F$ and $F'$ are $\epsilon$-close
Testing by implicit learning
• Good approximation by juntas => efficient property testing [surveys: Ron; Servedio]
• $\epsilon$-approximation by a junta of size depending only on $k$, $R$, $\epsilon$ (not on $n$)
• Good dependence on the parameters:
  • [Blais, Onak] sunflowers for submodular functions
• Query complexity: independent of $n$
• Running time: exponential in the junta size (we think it can be reduced)
• We have a lower bound for testing $k$-DNF (reduction from Gap Set Intersection: distinguishing a random $k$-junta from a random ($k$ + O(log))-junta requires many queries)
Previous work on testing submodularity [Parnas, Ron, Rubinfeld '03; Seshadhri, Vondrak, ICS'11]:
• Upper bound
• Lower bound
• Special case: coverage functions [Chakrabarty, Huang, ICALP'12]
• Gap in query complexity