Trust Course CS 6381 -- Grid and Peer-to-Peer Computing Gerardo Padilla
Source • Part 1: A Survey Study on Trust Management in P2P Systems • Part 2: Trust-χ: A Peer-to-Peer Framework for Trust Establishment
Outline • What is Trust? • What is Trust Management? • How to measure Trust? • Example • Reputation-based Trust Management Systems • DMRep • EigenRep • P2PRep • Frameworks for Trust Establishment • Trust-χ
What is Trust? • Kini & Choobineh: trust is "a belief that is influenced by the individual's opinion about certain critical system features". • Gambetta: "…trust (or, symmetrically, distrust) is a particular level of the subjective probability with which an agent will perform a particular action, both before [the trustor] can monitor such action (or independently of his capacity ever to be able to monitor it)". • The Trust-EC project (http://dsa-isis.jrc.it/TrustEC/): trust is "the property of a business relationship, such that reliance can be placed on the business partners and the business transactions developed with them". • Grandison and Sloman: trust is "the firm belief in the competence of an entity to act dependably, securely and reliably within a specified context".
What is Trust? Some Basic Properties of Trust Relations • Trust is relative to some business transaction. A may trust B to drive her car but not to baby-sit. • Trust is a measurable belief. A may trust B more than A trusts C for the same business. • Trust is directed. A may trust B to be a profitable customer but B may distrust A to be a retailer worth buying from. • Trust exists and evolves in time. The fact that A trusted B in the past does not in itself guarantee that A will trust B in the future. B’s performance and other relevant information may lead A to re-evaluate her trust in B.
Reputation, Trust and Reciprocity • Reputation: perception that an agent creates through past actions about its intentions and norms. • Trust: a subjective expectation a peer has about another's future behavior based on the history of their encounters. • Reciprocity: mutual exchange of deeds. • The three form a cycle: an increase in pi's reputation increases pj's trust of pi, which increases pi's reciprocating actions, which in turn increase pi's reputation.
Outline • What is Trust? • What is Trust Management? • How to measure Trust? • Example • Reputation-based Trust Management Systems • DMRep • EigenRep • P2PRep • Frameworks for Trust Establishment • Trust-χ
What is Trust Management? • "a unified approach to specifying and interpreting security policies, credentials, relationships [which] allows direct authorization of security-critical actions" – Blaze, Feigenbaum & Lacy • Trust Management is the capture, evaluation and enforcement of trusting intentions. • Related areas: distributed agents / artificial intelligence, social sciences
What is Trust Management? Trust management systems fall into three families: • Policy-Based Trust Systems • Social Network-Based Trust Systems • Reputation-Based Trust Systems
What is Trust Management? Policy-Based Trust Systems • Example: PolicyMaker • Goal: access control • Peers use credential verification to establish a trust relationship • Unilateral: only the resource owner requests to establish trust
What is Trust Management? Social Network-Based Trust Systems • Example: Marsh, Regret, NodeRanking, … • Based on social relationships between peers when computing trust and reputation values • Form conclusions about peers through analyzing a social network
What is Trust Management? Reputation-Based Trust Systems • Example: DMRep, EigenRep, P2PRep, XRep, NICE, … • Based on measuring reputation • Evaluate the trust in the peer and the trust in the reliability of the resource
Outline • What is Trust? • What is Trust Management? • How to measure Trust? • Example • Reputation-based Trust Management Systems • DMRep • EigenRep • P2PRep • Frameworks for Trust Establishment • Trust-χ
How to measure Trust? An example of a computational model: A Computational Model of Trust and Reputation (Mui et al., 2001) • Assume a social network where no new peers are expected to join or leave (i.e. the social network is static) • Two peers a and b interact repeatedly; in each encounter the action space is {cooperate, failing}: a peer either cooperates or fails to reciprocate.
How to measure Trust? An example of a computational model • Reputation: perception that a peer creates through past actions about its intentions and norms • Let θji(c) represent pi's reputation in a social network of concern to pj for a context c. • This value measures the likelihood that pi reciprocates pj's actions.
How to measure Trust? An example of a computational model (context c) • θab: b's reputation in the eyes of a. • Xab(i): the i-th transaction between a and b. • After n transactions we obtain the history data • History: Dab = {Xab(1), Xab(2), … , Xab(n)}
How to measure Trust? An example of a computational model (context c) • θab: b's reputation in the eyes of a. • Let p be the number of cooperations by peer b toward a in the n previous encounters. • b's reputation θab for peer a should be a function of both p and n. • A simple function is the proportion of cooperative actions over all n encounters (or transactions). • From statistics, a proportion random variable can be modeled as a Beta distribution.
How to measure Trust? An example of a computational model • Note: the shape of the Beta distribution is controlled by its two shape parameters α and β (figure omitted).
How to measure Trust? An example of a computational model • Prior for the reputation: p(θ) = Beta(α, β) • θ̂: estimator for θ • Prior assumption (uniform prior): α = β = 1 • A simple estimator for θab is the proportion of cooperations in the n finite encounters: θ̂ab = p / n • This is b's reputation in the eyes of a.
How to measure Trust? An example of a computational model • Trust is defined as the subjective expectation a peer has about another's future behavior based on the history of encounters: T(c) = E[θ(c) | D(c)] • The higher the trust level for peer pi, the higher the expectation that pi will reciprocate peer pj's actions.
How to measure Trust? An example of a computational model • Assuming that each encounter's cooperation probability is independent of the other encounters between a and b, the likelihood of p cooperations and (n − p) failings over the n encounters is p(D | θ) = θ^p (1 − θ)^(n−p). • Combining the prior and the likelihood, the posterior estimate for θ becomes (subscripts omitted): p(θ | D) ∝ θ^(p+α−1) (1 − θ)^(n−p+β−1), i.e. a Beta(α + p, β + n − p) distribution.
How to measure Trust? An example of a computational model • Trust towards b from a is the conditional expectation of θ given D: Tab = p(xab(n+1) = cooperate | D) = E[θab | Dab] • With the uniform prior α = β = 1 this gives Tab = (p + 1) / (n + 2).
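A minimal sketch (not part of the original slides) of this Beta-posterior trust estimate in Python; the function name and the boolean encoding of encounters are assumptions made for illustration:

```python
def trust_estimate(history, alpha=1.0, beta=1.0):
    """Posterior-mean trust estimate under a Beta(alpha, beta) prior.

    history: list of bool, True = the other peer cooperated in that encounter.
    Returns E[theta | D] = (alpha + p) / (alpha + beta + n), which reduces to
    (p + 1) / (n + 2) for the uniform prior alpha = beta = 1.
    """
    n = len(history)
    p = sum(history)  # number of cooperations
    return (alpha + p) / (alpha + beta + n)

# Example: 7 cooperations in 10 encounters -> trust = 8 / 12 ≈ 0.67
print(trust_estimate([True] * 7 + [False] * 3))
```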
Outline • What is Trust? • What is Trust Management? • How to measure Trust? • Example • Reputation-based Trust Management Systems • DMRep • EigenRep • P2PRep • Frameworks for Trust Establishment • Trust-χ
Reputation-based Trust Management Systems: Introduction • Examples of completely centralized mechanisms for storing and exploring reputation data: • Amazon.com: visitors usually look for customer reviews before deciding to buy new books. • eBay: participants in eBay's auctions can rate each other after each transaction.
Reputation-based Trust Management Systems: P2P Properties • No central coordination • No central database • No peer has a global view of the system • Global behavior emerges from local interactions • Peers are autonomous • Peers and connections are unreliable
Reputation-based Trust Management Systems: Design Considerations • The system should be self-policing: the shared ethics of the user population are defined and enforced by the peers themselves and not by some central authority • The system should maintain anonymity: a peer's reputation should be associated with an opaque identifier rather than with an externally associated identity • The system should not assign any profit to newcomers • The system should have minimal overhead in terms of computation, infrastructure, storage, and message complexity • The system should be robust to malicious collectives of peers who know one another and attempt to collectively subvert the system
Reputation-based Trust Management Systems: DMRep – Managing Trust in a P2P Information System (Aberer & Despotovic, 2001) • P2P facts: no central coordination or database (e.g. not eBay); no peer has a global view; peers are autonomous and unreliable • Trust matters in digital communities, but information is dispersed and sources are not unconditionally trustworthy • Solution: reputation as decentralized storage of a replicated and redundant transaction history • Calculate a binary trust metric based on the history of complaints.
Reputation-based Trust Management Systems: DMRep – Notation • Let P denote the set of all peers. • The behavioral data B are observations t(q, p) that a peer q makes when it interacts with a peer p. • The behavioral data of p: B(p) = { t(p, q) or t(q, p) | q ∈ P }, with B(p) ⊆ B • In a decentralized system, how do we model, store, and compute B?
Reputation-based Trust Management Systems: DMRep • In the decentralized environment, if a peer q has to determine the trustworthiness of a peer p: • It has no access to the global knowledge B and B(p) • There are 2 ways to obtain data: • Directly, by interactions: Bq(p) = { t(q, p) | t(q, p) ∈ B } • Indirectly, through a limited number of referrals from witnesses r ∈ Wq ⊆ P: Wq(p) = { t(r, p) | r ∈ Wq, t(r, p) ∈ B }
Reputation-based Trust Management Systems: DMRep • Assumption: the probability of cheating or malicious behavior within a society is comparably low • In case of malicious behavior by q, a peer p can file a complaint c(p, q) • Complaints are the only behavioral data B used in this model
Reputation-based Trust Management Systems: DMRep • Let us look at a simple situation: p and q interact, and later r wants to determine the trustworthiness of p and q. • Assume p is cheating and q is honest. • After their interaction, q will file a complaint about p, and p will file a complaint about q in order to hide its misbehavior. • From this single interaction, r cannot detect that p is cheating. • If p continues to cheat with more peers, r can conclude that p is very probably the cheater by observing the other complaints about p.
Reputation-based Trust Management Systems: DMRep • Based on the previous simple scenario, the reputation T(p) of a peer p can be computed as the product T(p) = |{c(p, q) | q ∈ P}| × |{c(q, p) | q ∈ P}| • |{c(p, q) | q ∈ P}|: number of complaints filed by p • |{c(q, p) | q ∈ P}|: number of complaints about p • A high value of T(p) indicates that p is not trustworthy • Problems: the reputation is determined from global knowledge of complaints, which is very difficult to obtain. How should the complaints be stored?
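A minimal sketch of this global trust product (the representation of complaints as (filer, target) pairs is an illustrative assumption, not the paper's code):

```python
def dmrep_trust(peer, complaints):
    """DMRep-style global trust value T(p).

    T(p) = |complaints filed by p| * |complaints about p|; a high value
    means p is considered not trustworthy. complaints is an iterable of
    (filer, target) pairs.
    """
    filed = sum(1 for filer, _ in complaints if filer == peer)
    received = sum(1 for _, target in complaints if target == peer)
    return filed * received

# Example: p filed 2 complaints and received 3, so T(p) = 6.
complaints = [("p", "q"), ("p", "r"), ("q", "p"), ("r", "p"), ("s", "p")]
print(dmrep_trust("p", complaints))
```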
Reputation-based Trust Management Systems: DMRep • The storage structure proposed in this approach uses P-Grid (others could be used, such as CAN or Chord). • P-Grid is a peer-to-peer lookup system based on a virtual distributed search tree. • Each peer stores the data items whose keys have the peer's associated path as a prefix. • For the trust management application, these data items are the complaints, indexed by the peer number.
(Figure: example P-Grid showing each peer's routing table and data store.)
Reputation-based Trust Management Systems: DMRep • The same data can be stored at multiple peers; these replicas improve reliability. • As the example shows, collisions of interest may occur, where peers are responsible for storing complaints about themselves. We do not exclude this: for large peer populations these cases will be very rare, and multiple replicas will be available to double-check.
Reputation-based Trust Management Systems: DMRep • Problem: the peers providing the data could themselves be malicious. • Assume that peers are malicious only with a certain probability π ≤ πmax < 1. • Choose the number of replicas r so that, on average, πmax^r < ε, where ε is an acceptable fault-tolerance. • Solution: if we receive the same data about a specific peer from a sufficient number of replicas, we need no further checks; otherwise, we continue the search.
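As a small worked example of this rule (a sketch: the function simply solves πmax^r < ε for the smallest integer r):

```python
import math

def replicas_needed(pi_max, eps):
    """Smallest number of replicas r such that pi_max ** r < eps."""
    return math.floor(math.log(eps) / math.log(pi_max)) + 1

# Example: peers lie with probability at most 0.3 and we accept a 1% chance
# that every replica lies; 0.3 ** 4 ≈ 0.008 < 0.01, so 4 replicas suffice.
print(replicas_needed(0.3, 0.01))
```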
Reputation-based Trust Management Systems: DMRep • How does it work? P-Grid has two operations for storing and retrieving information: • insert(p; k; v), where p is an arbitrary peer in the network, k is the key value to be searched for, and v is a data value associated with the key. • query(r; k) : v, where r is an arbitrary peer in the network; the query returns the data values v for the corresponding key k.
Reputation-based Trust Management Systems: DMRep • How does it work? • Every peer p can file a complaint about q at any time. It stores the complaint by sending the messages insert(a1; key(p); c(p, q)) and insert(a2; key(q); c(p, q)) to arbitrary peers a1 and a2.
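A sketch of complaint filing built on those primitives (the insert callable and the random peer selection are stand-ins introduced for illustration, not the paper's code):

```python
import random

def file_complaint(peers, p, q, insert):
    """Peer p files a complaint c(p, q) about peer q.

    peers: list of peer identifiers; insert: callable (peer, key, value)
    standing in for the P-Grid insert(a; k; v) primitive. The complaint is
    sent to two arbitrary peers, once indexed by p's key and once by q's
    key; P-Grid routing then forwards it to the responsible peers.
    """
    complaint = ("complaint", p, q)
    a1, a2 = random.sample(peers, 2)
    insert(a1, p, complaint)  # stored under the complaining peer's key
    insert(a2, q, complaint)  # stored under the accused peer's key
```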
Reputation-based Trust Management Systems: DMRep – Query Results • Assume that a peer p queries for information about q (p evaluates the trustworthiness of q). • p submits messages query(a; key(q)) to arbitrary peers a. • This process is performed s times.
Reputation-based Trust Management Systems: DMRep – Query Results • The result of these queries is a set W = { (ai, cri(q), cfi(q), fi) | i = 1, …, w }, where: • w: number of witnesses found • cri(q): number of complaints that q received, according to witness ai • cfi(q): number of complaints that q filed, according to witness ai • fi: the frequency with which ai is found (due to the non-uniformity of the P-Grid structure)
Reputation-based Trust Management Systems: DMRep – Variability • Different frequencies fi indicate that not all witnesses are found with the same probability, due to the non-uniformity of the P-Grid structure. • Witnesses found less frequently will probably also not receive as many storage messages when complaints are filed, so the number of complaints they report will tend to be too low. • Problem: we need to compensate for the information contribution of every witness. • Solution: normalize the values using the frequencies fi. • High contribution (high fi): high probability of being found • Low contribution (low fi): low probability of being found
Reputation-based Trust Management Systems: DMRep – Variability • The normalization is based on the probability of not finding witness i in s attempts.
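A simplified illustration of this compensation (the scaling by observation frequency below is an assumption made only for illustration; the paper's exact normalization formula is more involved):

```python
def aggregate_witness_reports(reports):
    """Frequency-compensated aggregation of witness reports about a peer q.

    reports: list of (cr_i, cf_i, f_i) tuples, where cr_i / cf_i are the
    complaints received / filed by q according to witness i, and f_i is how
    often witness i was found among the s queries. Counts from rarely-found
    witnesses are scaled up, since such witnesses also received fewer
    storage messages when complaints were filed.
    """
    avg_f = sum(f for _, _, f in reports) / len(reports)
    cr = sum(cr_i * avg_f / f_i for cr_i, _, f_i in reports) / len(reports)
    cf = sum(cf_i * avg_f / f_i for _, cf_i, f_i in reports) / len(reports)
    return cr, cf

# Example: three witnesses, the third one found less often than the others.
print(aggregate_witness_reports([(4, 2, 3), (5, 2, 3), (2, 1, 1)]))
```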
Reputation-based Trust Management Systems: DMRep – Trust • This model proposes that a peer p decides whether it considers peer q trustworthy (a binary decision) based on tracking the history and computing T. • To do so, p keeps statistics of the average number of complaints received and complaints filed, aggregating all observations it makes over its lifetime. • It then uses the following heuristic approach:
Reputation-based Trust Management Systems: DMRep – Trust • If an observed value for complaints exceeds the general average of the trust measure by too much, the agent is considered dishonest.
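A rough sketch of such a decision rule (the comparison and the threshold factor are assumptions chosen for illustration; the paper's heuristic differs in detail):

```python
def is_trustworthy(cr_q, cf_q, avg_cr, avg_cf, factor=4.0):
    """Binary trust decision for peer q.

    cr_q, cf_q: complaints received / filed by q as reported by witnesses.
    avg_cr, avg_cf: the evaluating peer's lifetime averages over all peers
    it has observed. q is distrusted when its complaint product exceeds the
    average product by more than the (arbitrarily chosen) factor.
    """
    return cr_q * cf_q <= factor * avg_cr * avg_cf

# Example: 12 * 10 = 120 > 4 * (3 * 2) = 24, so q is considered dishonest.
print(is_trustworthy(12, 10, 3, 2))
```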
Reputation-based Trust Management Systems: DMRep – Discussion • Strengths • The method can be implemented in a fully decentralized peer-to-peer environment and scales well to a large number of participants. • Limitations • Assumes an environment with low cheating rates. • Depends on a specific data management structure. • Not robust to malicious collectives of peers.
Outline • What is Trust? • What is Trust Management? • How to measure Trust? • Example • Reputation-based Trust Management Systems • DMRep • EigenRep • P2PRep • Frameworks for Trust Establishment • Trust-χ
Reputation-based Trust Management Systems: EigenRep – The EigenTrust Algorithm for Reputation Management in P2P Networks (Kamvar, Schlosser, 2003) • Goal: to identify sources of inauthentic files and bias peers against downloading from them. • Method: give each peer a trust value based on its previous behavior.
Reputation-based Trust Management Systems: EigenRep – Terminology • Local trust value cij: the opinion that peer i has of peer j, based on past experience. • Global trust value ti: the trust that the entire system places in peer i. • (Example graph over peers 1–4: local values c12 = 0.3, c23 = 0.7, c21 = 0.6, c14 = 0.01; global values t1 = 0.3, t2 = 0.2, t3 = 0.5, t4 = 0.)
Reputation-based Trust Management Systems: EigenRep – Normalizing Local Trust Values • All cij are non-negative • ci1 + ci2 + … + cin = 1 • (Example: after normalization, peer 1 has c12 = 0.9 and c14 = 0.1.)
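A minimal sketch of this normalization step (following the common EigenTrust convention of clipping negative raw scores to zero; the names and example scores are assumptions for illustration):

```python
def normalize_local_trust(raw_scores):
    """Turn peer i's raw local scores s_ij into normalized values c_ij.

    raw_scores: dict {peer_j: raw score from past transactions with j}.
    Negative scores are clipped to 0, then the values are scaled so that
    they are non-negative and sum to 1.
    """
    clipped = {j: max(s, 0.0) for j, s in raw_scores.items()}
    total = sum(clipped.values())
    if total == 0:
        return {j: 0.0 for j in raw_scores}  # no positive experience yet
    return {j: v / total for j, v in clipped.items()}

# Example: peer 1's raw scores give c12 = 0.9, c13 = 0.0, c14 = 0.1.
print(normalize_local_trust({2: 9, 3: -2, 4: 1}))
```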