When a Good Reputation isn’t Good Enough
Jonathan Traupman and Robert Wilensky
U.C. Berkeley
Introduction
• Reputation systems are a key component of many peer-to-peer systems
• Lots of application-specific features, but most share a common structure:
  • Aggregates feedback about transactions
  • Seems to create trust from thin air
• Anecdotal evidence suggests reputation systems work pretty well
Evaluation Is Difficult
• E.g., peer-to-peer markets:
  • Many trust signals besides reputation
  • Market and payment provider offer some indemnity
• Difficult to separate out the role of the reputation system
• Cannot realistically experiment with alternative reputation systems
Some Questions
• In the absence of external forces, is a reputation system sufficient for encouraging cooperation?
• Under what conditions does stable cooperation arise?
• How well or poorly do existing reputation systems meet these requirements?
• How can we design systems to better encourage cooperation?
A Game-Theoretic Model
• Model the trading process as a series of simple games:
  • Interaction game: agents decide whether or not to trade
  • Transaction game: agents decide whether to cooperate or defect
  • Reputation game: agents decide how to leave feedback
• Observe which strategies are optimal under different conditions
Interaction Game
• Simultaneous, perfect information
• Played repeatedly until a pair willing to interact is found
• No direct payoffs
• Small penalty for failing too often
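Below is a minimal sketch of the pairing loop, under stated assumptions: agents are plain dicts, and a hypothetical `willing(a, b)` predicate stands in for a's reputation-based decision to trade with b. The penalty bookkeeping is likewise illustrative.

```python
import random

def find_trading_pair(agents, willing):
    """Draw candidate pairs until both sides agree to interact.

    `willing(a, b)` is a hypothetical stand-in for a's reputation-based
    decision to trade with b; the failure counter models the small
    penalty for failing to interact too often.
    """
    while True:
        a, b = random.sample(agents, 2)
        if willing(a, b) and willing(b, a):
            return a, b
        a["failed_interactions"] = a.get("failed_interactions", 0) + 1
        b["failed_interactions"] = b.get("failed_interactions", 0) + 1
```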
Transaction Game
• Simultaneous, perfect information
• Agents choose whether to cooperate or defect
• Payoffs based on both agents’ behavior
• An instance of the Prisoner’s Dilemma
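As a concrete illustration, here is one possible payoff table for the transaction game. The numbers are assumptions chosen to satisfy the standard Prisoner’s Dilemma ordering (T > R > P > S, with 2R > T + S); the paper’s actual values may differ.

```python
# Illustrative Prisoner's Dilemma payoffs, keyed by
# (my_move, partner_move), where True = cooperate and False = defect.
# These particular numbers are an assumption, not taken from the paper.
PD_PAYOFF = {
    (True,  True):  3,  # R: reward for mutual cooperation
    (True,  False): 0,  # S: sucker's payoff (I cooperate, partner defects)
    (False, True):  5,  # T: temptation to defect against a cooperator
    (False, False): 1,  # P: punishment for mutual defection
}

# Defection strictly dominates: whatever the partner does, defecting
# pays more -- which is why cooperation needs reputation to survive.
assert PD_PAYOFF[(False, True)] > PD_PAYOFF[(True, True)]
assert PD_PAYOFF[(False, False)] > PD_PAYOFF[(True, False)]
```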
Reputation Game
• Mixed game:
  • Perfect information
  • First move simultaneous
  • Subsequent moves sequential
• No direct payoff, but the outcome influences reputation
• A “Tragedy of the Commons”:
  • Honest reputations benefit the community
  • Individuals benefit from dishonesty or apathy
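Here is a sketch of one plausible reading of this game, using the three feedback parameters listed on the simulator slide that follows (1st positive rate, 1st negative rate, retaliation rate). The exact mechanics are an interpretation, not the paper’s specification: ratings are True (positive), False (negative), or None (no feedback).

```python
import random

def reputation_game(a, b, a_cooperated, b_cooperated):
    """One reading of the mixed reputation game (an assumption)."""
    def first_move(agent, partner_cooperated):
        # Simultaneous first move: each agent decides blind whether to rate.
        p = (agent["first_positive_rate"] if partner_cooperated
             else agent["first_negative_rate"])
        return partner_cooperated if random.random() < p else None

    rating_for_b = first_move(a, b_cooperated)
    rating_for_a = first_move(b, a_cooperated)

    # Sequential follow-up: an agent who held back but now sees a
    # negative against itself may retaliate in kind.
    if rating_for_a is False and rating_for_b is None \
            and random.random() < a["retaliation_rate"]:
        rating_for_b = False
    if rating_for_b is False and rating_for_a is None \
            and random.random() < b["retaliation_rate"]:
        rating_for_a = False
    return rating_for_a, rating_for_b
```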
Evolutionary Simulator
• Repeatedly play the three games
• Periodically evaluate agent performance by mean payoff per transaction
• Keep successful agents
• “Breed” new agents by combining parameters of successful parents:
  • Interaction parameters: new-user interactivity, low-experience interactivity, high-experience interactivity
  • Transaction parameter: honesty
  • Reputation parameters: 1st negative rate, 1st positive rate, retaliation rate
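A compact sketch of the evolutionary step, under stated assumptions: agents are plain parameter records scored by mean payoff per transaction, the top half survives, and children mix parameters uniformly from two parents with small Gaussian mutation. The selection fraction and mutation scale are illustrative choices, not the paper’s settings.

```python
import random
from dataclasses import dataclass

@dataclass
class Agent:
    new_user_interactivity: float
    low_exp_interactivity: float
    high_exp_interactivity: float
    honesty: float
    first_negative_rate: float
    first_positive_rate: float
    retaliation_rate: float
    payoff: float = 0.0
    transactions: int = 0

    def fitness(self) -> float:
        # Agents are evaluated by mean payoff per transaction.
        return self.payoff / self.transactions if self.transactions else 0.0

PARAMS = ["new_user_interactivity", "low_exp_interactivity",
          "high_exp_interactivity", "honesty", "first_negative_rate",
          "first_positive_rate", "retaliation_rate"]

def breed(parent_a: Agent, parent_b: Agent) -> Agent:
    # "Breed" a child by taking each parameter from one of the two
    # successful parents, with a little mutation, clamped to [0, 1].
    kwargs = {}
    for p in PARAMS:
        value = getattr(random.choice([parent_a, parent_b]), p)
        kwargs[p] = min(1.0, max(0.0, value + random.gauss(0.0, 0.05)))
    return Agent(**kwargs)

def next_generation(agents: list[Agent]) -> list[Agent]:
    # Keep the most successful half, refill the population by breeding.
    agents.sort(key=Agent.fitness, reverse=True)
    survivors = agents[: len(agents) // 2]
    children = [breed(*random.sample(survivors, 2)) for _ in survivors]
    return survivors + children
```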
Experiments
• All experiments performed 20 times
• Unconstrained evolution:
  • Reputation system modeled after Percent Positive Feedback (PPF) on eBay
• Retaliation prohibited:
  • Retaliation rate parameter forced to zero
• Simultaneous feedback:
  • Retaliation rendered impossible by forcing agents to leave feedback blind
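For reference, the simple form of Percent Positive Feedback that a simulation like this would plausibly use; eBay’s production metric adds refinements (time windows, unique raters) beyond this plain ratio, and the new-user convention below is an assumption.

```python
def percent_positive_feedback(positives: int, negatives: int) -> float:
    """PPF in its simplest form: the share of feedback that is positive."""
    total = positives + negatives
    return 1.0 if total == 0 else positives / total  # no feedback yet: assume 100%

# e.g. 95 positives and 5 negatives yield a PPF of 95%
assert percent_positive_feedback(95, 5) == 0.95
```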
Unconstrained Evolution
• Most similar to current market conditions
• None of the markets was able to maintain cooperation
• Moderate (~50%) retaliation rate
• Retaliation caused all agents to hesitate to leave feedback first
• A dysfunctional reputation system permits defection to emerge as the optimal strategy
Disabled Retaliation
• Knock out the retaliation parameter
• 14 of 20 markets remained cooperative for 10,000 generations
  • 2 oscillated
  • 4 remained uncooperative
• Much higher participation in the reputation system
• A lack of direct incentives for honest feedback allowed agent apathy to prevent cooperation in the remaining markets
Disabled Retaliation
• Results improve further if we also force the 1st negative rate to 1.0
• Retaliation is clearly an obstacle to cooperation
Simultaneous Feedback
• Disabling retaliation outright is not possible in a real marketplace
• Simultaneous feedback is a common suggestion:
  • Modify the reputation game to be simultaneous rather than sequential
  • You can’t retaliate if you don’t know what feedback you’re getting
• Still other ways to game the system, but a good first step
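In terms of the earlier reputation-game sketch, the change amounts to deleting the sequential follow-up: both ratings are fixed before either is revealed, so retaliation is impossible by construction. Parameter names are the same hypothetical ones as before.

```python
import random

def simultaneous_reputation_game(a, b, a_cooperated, b_cooperated):
    # Both agents rate blind; neither rating can condition on the
    # other's, so the retaliation step from the sequential sketch
    # disappears entirely.
    def rate(agent, partner_cooperated):
        p = (agent["first_positive_rate"] if partner_cooperated
             else agent["first_negative_rate"])
        return partner_cooperated if random.random() < p else None

    # Returns (rating_for_a, rating_for_b), as in the sequential sketch.
    return rate(b, a_cooperated), rate(a, b_cooperated)
```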
Simultaneous Feedback
• 19 of 20 markets oscillated
  • One remained non-cooperative throughout
• Agent apathy remains a problem:
  • In highly cooperative markets, agents get lazy about leaving and using feedback
  • This permits defectors to gain a foothold
  • Eventually, cooperation is restored
Conclusions
• Under the right conditions, a simple reputation system like PPF can maintain cooperation
  • Users must participate frequently and honestly
• As currently implemented, PPF cannot maintain cooperation on its own:
  • It permits retaliation
  • It does nothing to prevent apathy
• This largely confirms intuitions about reputation systems, but provides a better theoretical and experimental foundation for those arguments
Conclusions
• This work provides some guidelines for designing better reputation systems:
  • Must prohibit retaliation and other means of gaming the reputation system
  • Should create incentives for honest participation to combat user apathy