Trust
CS 6381 -- Grid and Peer-to-Peer Computing
Gerardo Padilla
2
Source
• Part 1: A Survey Study on Trust Management in P2P Systems
• Part 2: Trust-χ: A Peer-to-Peer Framework for Trust Establishment
3
Outline
• What is Trust?
• What is Trust Management?
• How to measure Trust?
  – Example
• Reputation-based Trust Management Systems
  – DMRep
  – EigenRep
  – P2PRep
• Frameworks for Trust Establishment
  – Trust-χ
4
What is Trust?
• Kini & Choobineh: trust is "a belief that is influenced by the individual's opinion about certain critical system features"
• Gambetta: "trust (or, symmetrically, distrust) is a particular level of the subjective probability with which an agent will perform a particular action, both before [the trustor] can monitor such action (or independently of his capacity ever to be able to monitor it)"
• The Trust-EC project (http://dsa-isis.jrc.it/TrustEC/): trust is "the property of a business relationship, such that reliance can be placed on the business partners and the business transactions developed with them"
• Grandison and Sloman: trust is "the firm belief in the competence of an entity to act dependably, securely and reliably within a specified context"
5
What is Trust? Some Basic Properties of Trust Relations
• Trust is relative to some business transaction. A may trust B to drive her car but not to baby-sit.
• Trust is a measurable belief. A may trust B more than A trusts C for the same business.
• Trust is directed. A may trust B to be a profitable customer but B may distrust A to be a retailer worth buying from.
• Trust exists and evolves in time. The fact that A trusted B in the past does not in itself guarantee that A will trust B in the future. B’s performance and other relevant information may lead A to re-evaluate her trust in B.
6
Reputation, Trust and Reciprocity
• Reputation: the perception that an agent creates through past actions about its intentions and norms.
• Trust: a subjective expectation a peer has about another's future behavior, based on the history of their encounters.
• Reciprocity: mutual exchange of deeds.
[Figure: a reinforcement cycle -- pi's reciprocating actions increase pi's reputation, which increases pj's trust of pi, which in turn encourages further reciprocating actions by pi.]
8
What is Trust Management?
• “ a unified approach to specifying and interpreting security policies, credentials, relationships [which] allows direct authorization of security-critical actions” – Blaze, Feigenbaum & Lacy
• Trust management is the capture, evaluation, and enforcement of trusting intentions.
• Related areas: distributed artificial intelligence (agents) and the social sciences.
9
What is Trust Management?
Trust management systems fall into three categories:
• Policy-Based Trust Systems
• Reputation-Based Trust Systems
• Social Network-Based Trust Systems
10
What is Trust Management?
Policy-Based Trust Systems
• Example: PolicyMaker
• Goal: access control
• Peers use credential verification to establish a trust relationship
• Unilateral: only the resource owner requests to establish trust
11
What is Trust Management?
Social Network-Based Trust Systems
• Examples: Marsh, Regret, NodeRanking, …
• Base trust and reputation values on the social relationships between peers
• Form conclusions about peers by analyzing a social network
12
What is Trust Management?
Reputation-Based Trust Systems
• Examples: DMRep, EigenRep, P2PRep, XRep, NICE, …
• Based on measuring reputation
• Evaluate both the trust in a peer and the trust in the reliability of a resource
14
How to Measure Trust? An Example Computational Model
A Computational Model of Trust and Reputation (Mui et al., 2001)
• Assume a social network where no new peers are expected to join or leave (i.e., the social network is static)
• Action space = {cooperate, failing}
[Figure: peers a and b; both cooperate]
15
How to Measure Trust? An Example Computational Model
• Assume a social network where no new peers are expected to join or leave (i.e., the social network is static)
• Action space = {cooperate, failing}
[Figure: peer a cooperates, peer b fails]
16
How to Measure Trust? An Example Computational Model
• Reputation: the perception that a peer creates through past actions about its intentions and norms
  – Let θji(c) denote pi's reputation in a social network of concern to pj, for a context c.
  – This value measures the likelihood that pi reciprocates pj's actions.
17
How to Measure Trust? An Example Computational Model
• θab: b's reputation in the eyes of a.
• Xab(i): the i-th transaction between a and b, in context c:
  Xab(i) = 1 if b's action is cooperate, 0 otherwise
• After n transactions we obtain the history data
  History: Dab = {Xab(1), Xab(2), …, Xab(n)}
18
How to Measure Trust? An Example Computational Model
• θab: b's reputation in the eyes of a.
• Let p be the number of cooperations by peer b toward a in the n previous encounters.
  – b's reputation θab for peer a should be a function of both p and n.
  – A simple choice is the proportion of cooperative actions over all n encounters (transactions).
• From statistics, a proportion random variable can be modeled by a Beta distribution.
19
How to Measure Trust? An Example Computational Model
• NOTE: the Beta distribution Beta(α, β) has two shape parameters, α and β.
[Figure: Beta density curves for various shape parameters]
20
How to Measure Trust? An Example Computational Model
• Beta distribution: p(θ̂) = Beta(α, β)
  – θ̂: estimator for θ
  – α and β: α = β = 1 (by prior assumption)
• A simple estimator for θab:
  θ̂ab = p / n
• b's reputation in the eyes of a is the proportion of cooperations in n finite encounters.
21
How to Measure Trust? An Example Computational Model
• Trust is defined as the subjective expectation a peer has about another's future behavior, based on the history of encounters:
  T(c) = E[θ(c) | D(c)]
  – The higher the trust level for peer ai, the higher the expectation that ai will reciprocate peer aj's actions.
22
How to Measure Trust? An Example Computational Model
• Assuming each encounter's cooperation probability is independent of other encounters between a and b, the likelihood of p cooperations and (n − p) failings over the n encounters is:
  L(D | θ) = θ^p (1 − θ)^(n−p)
• Combining the prior and the likelihood, the posterior estimate becomes (subscripts omitted):
  p(θ | D) = Beta(p, n − p)
23
How to Measure Trust? An Example Computational Model
• Trust towards b from a is the conditional expectation of θab given D:
  Tab = p(Xab(n+1) | D)
Then
  Tab = E[θab | D] = p / n
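In code, the whole model reduces to counting cooperative encounters. A minimal sketch (the function name and the list encoding of the history are my own, not from Mui et al.):

```python
def reputation(history):
    """Estimate theta_ab, b's reputation in the eyes of a.

    history is a list of encounter outcomes X_ab(i): 1 if b cooperated,
    0 otherwise. With p cooperations in n encounters, the slides' posterior
    is Beta(p, n - p), whose mean p / n serves both as the reputation
    estimate and as the trust value T_ab = E[theta_ab | D].
    """
    n = len(history)
    p = sum(history)
    return p / n if n else 0.0

# b cooperated in 3 of 4 encounters with a
print(reputation([1, 1, 0, 1]))  # 0.75
```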
25
Reputation-based Trust Management Systems -- Introduction
• Examples of completely centralized mechanisms for storing and exploring reputation data:
  – Amazon.com: visitors usually read customer reviews before deciding to buy new books.
  – eBay: participants in eBay's auctions can rate each other after each transaction.
26
Reputation-based Trust Management Systems -- P2P Properties
• No central coordination
• No central database
• No peer has a global view of the system
• Global behavior emerges from local interactions
• Peers are autonomous
• Peers and connections are unreliable
27
Reputation-based Trust Management Systems -- Design Considerations
• The system should be self-policing
  – The shared ethics of the user population are defined and enforced by the peers themselves, not by some central authority
• The system should maintain anonymity
  – A peer's reputation should be associated with an opaque identifier rather than with an externally associated identity
• The system should not assign any profit to newcomers
• The system should have minimal overhead in terms of computation, infrastructure, storage, and message complexity
• The system should be robust to malicious collectives of peers who know one another and attempt to collectively subvert the system
28
Reputation-based Trust Management Systems -- DMRep
Managing Trust in a P2P Information System (Aberer, Despotovic, 2001)
• P2P facts:
  – No central coordination or database (i.e., not eBay)
  – No peer has a global view
  – Peers are autonomous and unreliable
• Trust matters in digital communities, but information is dispersed and sources are not unconditionally trustworthy
• Solution: reputation as decentralized storage of a replicated and redundant transaction history
  – Compute a binary trust metric based on a history of complaints.
29
Reputation-based Trust Management Systems -- DMRep Notation
• Let P denote the set of all peers.
• The behavioral data B are observations t(q, p) that a peer q makes when it interacts with a peer p.
• The behavioral data of p: B(p) = { t(p, q) or t(q, p) | q ∈ P }, with B(p) ⊆ B
In a decentralized system, how can B be modeled, stored, and computed?
30
Reputation-based Trust Management Systems -- DMRep
• In the decentralized environment, if a peer q has to determine the trustworthiness of a peer p:
  – It has no access to the global knowledge B and B(p)
  – There are two ways to obtain data:
    • Directly, by interactions: Bq(p) = { t(q, p) | t(q, p) ∈ B }
    • Indirectly, through a limited number of referrals from witnesses r ∈ Wq ⊆ P:
      Wq(p) = { t(r, p) | r ∈ Wq, t(r, p) ∈ B }
31
Reputation-based Trust Management Systems -- DMRep
• Assumption:
  – The probability of cheating or malicious behavior within a society is comparatively low
• In case of malicious behavior by q, a peer p can file a complaint c(p, q)
• Complaints are the only behavioral data B used in this model
32
Reputation-based Trust Management Systems -- DMRep
• Consider a simple situation:
• p and q interact; later, r wants to determine the trustworthiness of p and q.
  – Assume p is cheating and q is honest
  – After their interaction:
    • q will file a complaint about p
    • p will file a complaint about q in order to hide its own misbehavior
  – r cannot detect that p is cheating
  – If p continues to cheat with more peers, r can conclude that p is very probably the cheater by observing the other complaints about p
33
Reputation-based Trust Management Systems -- DMRep
• Based on the previous scenario, the reputation T(p) of a peer p can be computed as the product
  T(p) = |{c(p, q) | q ∈ P}| × |{c(q, p) | q ∈ P}|
  – A high value of T(p) indicates that p is not trustworthy
  – |{c(p, q) | q ∈ P}|: number of complaints made by p
  – |{c(q, p) | q ∈ P}|: number of complaints about p
Problem: this reputation is based on global knowledge of complaints, which is very difficult to obtain. How should the complaints be stored?
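The product metric is easy to sketch over a list of (complainant, subject) pairs (this encoding is my own illustration, not the paper's data model):

```python
def dmrep_trust(complaints, peer):
    """DMRep reputation T(p) = |complaints filed by p| * |complaints about p|.

    complaints is a list of (filer, subject) pairs, one per complaint c(filer,
    subject). A high value flags an untrustworthy peer: a cheater both attracts
    complaints and files counter-complaints to hide its own misbehavior.
    """
    filed = sum(1 for p, q in complaints if p == peer)
    received = sum(1 for p, q in complaints if q == peer)
    return filed * received

# p cheats q and r; each victim complains about p, and p counter-complains
complaints = [("q", "p"), ("p", "q"), ("r", "p"), ("p", "r")]
print(dmrep_trust(complaints, "p"))  # 2 * 2 = 4
print(dmrep_trust(complaints, "q"))  # 1 * 1 = 1
```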
34
Reputation-based Trust Management Systems -- DMRep
• The storage structure proposed in this approach uses P-Grid (others, such as CAN or Chord, could be used)
• P-Grid is a peer-to-peer lookup system based on a virtual distributed search tree.
• Each peer stores the data items whose associated path is a prefix of the data key.
  – For the trust management application, these are the complaints, indexed by peer number.
35
[Figure: a P-Grid example showing each peer's routing table and data store]
36
Reputation-based Trust Management Systems -- DMRep
• The same data can be stored at multiple peers; these replicas improve reliability.
• As the example shows, conflicts of interest may occur, where peers are responsible for storing complaints about themselves. This is not excluded: for large peer populations such cases are very rare, and multiple replicas are available for double-checking.
37
Reputation-based Trust Management Systems -- DMRep
• Problem: the peers providing the data could themselves be malicious.
• Assume that peers are malicious only with a certain probability π ≤ πmax < 1.
  – With r replicas, the probability that all replicas are malicious satisfies πmax^r < ε on average, where ε is an acceptable fault tolerance.
• Solution: if we receive the same data about a specific peer from a sufficient number of replicas, no further checks are needed; otherwise, continue searching.
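The bound πmax^r < ε directly yields how many replicas must agree. A small helper (purely illustrative):

```python
import math

def replicas_needed(pi_max, eps):
    """Smallest replica count r with pi_max**r < eps, i.e. the chance that
    every consulted replica is malicious stays below the fault tolerance."""
    return math.ceil(math.log(eps) / math.log(pi_max))

# With at most 30% malicious peers and a 1% tolerance:
print(replicas_needed(0.3, 0.01))  # 4, since 0.3**4 = 0.0081 < 0.01
```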
38
Reputation-based Trust Management Systems -- DMRep
• How does it work? P-Grid has two operations for storing and retrieving information:
  – insert(p, k, v), where p is an arbitrary peer in the network, k is the key value to be searched for, and v is a data value associated with the key.
  – query(r, k) : v, where r is an arbitrary peer in the network; the query returns the data values v for the corresponding key k.
39
Reputation-based Trust Management Systems -- DMRep
• How does it work?
  – Every peer p can file a complaint about q at any time. It stores the complaint by sending the messages
    insert(a1, key(p), c(p, q)) and
    insert(a2, key(q), c(p, q))
    to arbitrary peers a1 and a2.
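The double insert can be mimicked with an in-memory stand-in for the P-Grid store (the dict-based store and the identity key function are stand-ins for illustration only, not the real P-Grid routing):

```python
# Toy stand-in for the P-Grid store: complaints are indexed by peer key,
# and every complaint c(p, q) is filed twice, under both parties' keys.
store = {}

def key(peer):
    return peer  # a real P-Grid would map peers into a binary key space

def insert(storage_key, complaint):
    store.setdefault(storage_key, []).append(complaint)

def query(storage_key):
    return store.get(storage_key, [])

def file_complaint(p, q):
    c = (p, q)          # complaint c(p, q): p complains about q
    insert(key(p), c)   # stored under p's key (a complaint filed by p)
    insert(key(q), c)   # stored under q's key (a complaint about q)

file_complaint("p", "q")
print(query(key("q")))  # [('p', 'q')]
```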
40
Reputation-based Trust Management Systems -- DMRep: Query Results
• Assume a peer p queries for information about q (p evaluates the trustworthiness of q):
  – p submits messages query(a, key(q)) to arbitrary peers a.
  – This process is performed s times.
41
Reputation-based Trust Management Systems -- DMRep: Query Results
• The result of these queries is a set W, where:
  – w: number of witnesses found
  – cri(q): number of complaints that q received, according to witness ai
  – cfi(q): number of complaints that q filed, according to witness ai
  – fi: the frequency with which ai is found (reflecting the non-uniformity of the P-Grid structure)
42
Reputation-based Trust Management Systems -- DMRep: Variability
• Different frequencies fi indicate that not all witnesses are found with the same probability, due to the non-uniformity of the P-Grid structure.
  – Witnesses found less frequently probably also receive fewer storage messages when complaints are filed, so the number of complaints they report will tend to be too low.
  – Problem: the information contribution from every witness must be compensated.
  – Solution: normalize the reported values using the frequencies fi:
    • High frequency (high fi): high probability of being found
    • Low frequency (low fi): low probability of being found
43
Reputation-based Trust Management Systems -- DMRep: Variability
• If fi is the probability of finding witness i in a single attempt, then (1 − fi)^s is the probability of not finding witness i in s attempts; this quantity is used to normalize the complaint counts each witness reports.
44
Reputation-based Trust Management Systems -- DMRep: Trust
• The model decides whether a peer p considers peer q trustworthy (a binary decision) based on tracking the history and computing T.
• For this, p keeps statistics of the average number of complaints received and complaints filed, aggregating all observations it makes over its lifetime.
• It then uses the following heuristic approach:
45
Reputation-based Trust Management Systems -- DMRep: Trust
• Heuristic: if an observed number of complaints exceeds the general average of the trust measure by too much, the peer is judged dishonest.
46
Reputation-based Trust Management Systems -- DMRep: Discussion
• Strengths
  – The method can be implemented in a fully decentralized peer-to-peer environment and scales well to large numbers of participants.
• Limitations
  – Assumes an environment with low cheating rates
  – Tied to a specific data management structure
  – Not robust to malicious collectives of peers
48
Reputation-based Trust Management Systems -- EigenRep
The EigenTrust Algorithm for Reputation Management in P2P Networks (Kamvar, Schlosser, 2003)
• Goal: identify sources of inauthentic files and bias peers against downloading from them.
• Method: give each peer a trust value based on its previous behavior.
49
Reputation-based Trust Management Systems -- EigenRep: Terminology
• Local trust value cij: the opinion that peer i has of peer j, based on past experience.
• Global trust value ti: the trust that the entire system places in peer i.
[Figure: four peers with global trust values t1 = 0.3, t2 = 0.2, t3 = 0.5, t4 = 0, and local trust values c21 = 0.6, c23 = 0.7, c14 = 0.01, c12 = 0.3]
50
Reputation-based Trust Management Systems -- EigenRep: Normalizing Local Trust Values
• All cij are non-negative
• ci1 + ci2 + … + cin = 1
[Figure: peer 1 with normalized local trust values c12 = 0.9 and c14 = 0.1]
51
Reputation-based Trust Management Systems -- EigenRep: Local Trust Vector
• Local trust vector ci: contains all local trust values cij that peer i has of other peers j.
[Figure: for c12 = 0.9 and c14 = 0.1, peer 1's local trust vector is c1 = (0, 0.9, 0, 0.1, 0, 0)^T]
52
Reputation-based Trust Management Systems -- EigenRep: Local Trust Values
• Model assumptions:
  – Each time peer i downloads an authentic file from peer j, cij increases.
  – Each time peer i downloads an inauthentic file from peer j, cij decreases.
• How can these assumptions be quantified?
53
Reputation-based Trust Management Systems -- EigenRep: Local Reputation Values
• Local reputation values = own experience
  – sat(i, j): number of satisfactory transactions (downloads) peer i has had with peer j
  – unsat(i, j): number of unsatisfactory transactions (downloads) peer i has had with peer j
  – Local reputation value: sij = sat(i, j) − unsat(i, j)
54
Reputation-based Trust Management Systems -- EigenRep: Normalizing Local Reputation Values
• Normalize the local reputation values into a local reputation vector:
  cij = max(sij, 0) / Σj max(sij, 0)
• Notes:
  – Local reputation vector: ci = (ci1, …, ciN)^T
  – Most entries are 0
  – 0 ≤ cij ≤ 1 and Σj cij = 1
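A direct sketch of this normalization (function and variable names are illustrative):

```python
def normalize(s_row):
    """Normalize one peer's local reputation values s_ij into c_ij:
    c_ij = max(s_ij, 0) / sum_j max(s_ij, 0)."""
    clipped = [max(s, 0) for s in s_row]
    total = sum(clipped)
    if total == 0:
        return [0.0] * len(s_row)  # peer i has no positive experience at all
    return [s / total for s in clipped]

# peer i: net +9 satisfactory downloads from j, +1 from k, bad experience with m
print(normalize([9, 1, -4]))  # [0.9, 0.1, 0.0]
```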
55
Reputation-based Trust Management Systems -- EigenRep: Normalizing Local Reputation Values
• Issues:
  – Advantage of normalizing:
    • Reduces the problem where "malicious peers can assign arbitrarily high local trust values to other malicious peers, and arbitrarily low local trust values to good peers, easily subverting the system."
  – Disadvantages of normalizing:
    • "The normalized trust values do not distinguish between a peer with whom peer i did not interact and a peer with whom peer i has had poor experience."
    • "These cij values are relative, and there is no absolute interpretation. That is, if cij = cik, we know that peer j has the same reputation as peer k in the eyes of peer i, but we don't know if both of them are very reputable, or if both of them are mediocre."
56
Reputation-based Trust Management Systems -- EigenRep: Local Reputation Values
• Problem: each peer has limited experience of its own.
• Solution: get information from other peers who may have more experience with other peers.
• How?
57
Reputation-based Trust Management Systems -- EigenRep: Combining Information by Asking Others
• Ask for the opinions of the people you trust.
[Figure: peers' local trust vectors are sparse; most entries are 0]
58
Reputation-based Trust Management Systems -- EigenRep: Aggregating Local Reputation Values
• Peer i asks its friends about their opinions of peer k:
  tik = Σj cij cjk
  – Ask your friends j what they think of peer k, and weight each friend's opinion by how much you trust him.
59
Reputation-based Trust Management Systems -- EigenRep: Aggregating Local Reputation Values
• Peer i asks its friends about their opinions of all peers:
  tik = Σj cij cjk, for each k
• In matrix form, with C = [cij] the matrix of all local trust values:
  ti = C^T ci
  where ti = (ti1, …, tiN)^T is the vector of peer i's aggregated opinions.
60
Reputation-based Trust Management Systems -- EigenRep: Aggregating Local Reputation Values
• Peer i asks its friends about their opinions of other peers again (in effect, asking its friends' friends):
  ti = (C^T)^2 ci
• Continuing in this manner:
  ti = (C^T)^n ci
• If n is large, ti converges to the same vector for every peer i (the left principal eigenvector of C, provided C is irreducible and aperiodic).
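The repeated multiplication is just a power iteration; starting from different peers' local vectors yields the same global vector. A minimal sketch (the 3-peer matrix is an illustrative, irreducible and aperiodic example, not from the paper):

```python
def global_trust(C, c_i, iterations=50):
    """Iterate t <- C^T t starting from peer i's local trust vector c_i.
    C[i][j] holds the normalized local trust value c_ij; for irreducible,
    aperiodic C the iterate converges to the left principal eigenvector."""
    n = len(C)
    t = list(c_i)
    for _ in range(n * 0 + iterations):
        # (C^T t)_k = sum_j c_jk * t_j
        t = [sum(C[j][k] * t[j] for j in range(n)) for k in range(n)]
    return t

# An illustrative 3-peer matrix; each row is a normalized local trust vector
C = [[0.2, 0.8, 0.0],
     [0.5, 0.3, 0.2],
     [0.4, 0.4, 0.2]]

# Different starting peers converge to the same global reputation vector
print(global_trust(C, [1.0, 0.0, 0.0]))
print(global_trust(C, [0.0, 0.0, 1.0]))
```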
61
Reputation-based Trust Management Systems -- EigenRep: Global Reputation Vector
• This eigenvector t is called the global reputation vector.
  – Its element tj quantifies how much trust the system as a whole places in peer j.
• How can t be estimated?
62
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (non-distributed)
• Basic EigenTrust algorithm (non-distributed)
  – Assume some central server knows all the cij values and performs the computation:

    t(0) = e;
    repeat
      t(k+1) = C^T t(k);
      δ = ||t(k+1) − t(k)||;
    until δ < ε;
63
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (non-distributed)
• Basic EigenTrust algorithm: issues to consider
  – A priori notions of trust: some peers in the network are known to be trustworthy (pre-trusted peers); it is a good idea to incorporate this information.
  – Inactive or new peers: peers that do not download files from other peers or do not know other peers.
  – Malicious collectives: a malicious collective is a group of malicious peers who know each other, give each other high local trust values, and give all other peers low local trust values in an attempt to subvert the system and gain high global trust values.
64
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (non-distributed)
• Pre-trusted peers: let P be a set of peers known to be trusted, and let p be the pre-trust vector over P, where
  pi = 1/|P| if i ∈ P, and pi = 0 otherwise
• Assign some trust to the pre-trusted peers, and use this vector for new or inactive peers:
  ci = p
65
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (non-distributed)
• To avoid malicious collectives:
  t(k+1) = (1 − a) C^T t(k) + a p
  where a is some constant less than 1.
• This strategy breaks collectives by having each peer place at least some trust in the pre-trusted peers P, which are not part of a collective.
• Strong assumption: pre-trusted peers are essential.
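The damped update can be sketched directly. A minimal illustration (the matrix, the pre-trust vector, and the choice a = 0.15 are assumed example values; the slides leave a as a tunable constant):

```python
def eigen_trust(C, p, a=0.15, eps=1e-10):
    """Damped power iteration: t(0) = p; t(k+1) = (1 - a) C^T t(k) + a p.
    p is the pre-trust vector; a < 1 anchors every score to the pre-trusted
    peers, which is what breaks up malicious collectives."""
    n = len(C)
    t = list(p)
    while True:
        t_next = [(1 - a) * sum(C[j][k] * t[j] for j in range(n)) + a * p[k]
                  for k in range(n)]
        delta = max(abs(x - y) for x, y in zip(t_next, t))
        t = t_next
        if delta < eps:
            return t

# Peer 0 is the only pre-trusted peer; rows of C are normalized local trust
C = [[0.2, 0.8, 0.0],
     [0.5, 0.3, 0.2],
     [0.4, 0.4, 0.2]]
p = [1.0, 0.0, 0.0]
t = eigen_trust(C, p)
```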
66
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (non-distributed)
• Modified basic EigenTrust algorithm (non-distributed):

    t(0) = p;
    repeat
      t(k+1) = C^T t(k);
      t(k+1) = (1 − a) t(k+1) + a p;
      δ = ||t(k+1) − t(k)||;
    until δ < ε;

Now, let's consider a distributed environment.
67
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (distributed)
• All peers in the network cooperate to compute and store the global trust vector.
• Each peer computes and stores its own global trust value:
  ti(k+1) = (1 − a)(c1i t1(k) + … + cNi tN(k)) + a pi
  (the component-wise form of t(k+1) = (1 − a) C^T t(k) + a p)
• Minimize the computation, storage, and message overhead.
68
Distributed Algorithm (cont.)
• Ai: set of peers that have downloaded files from peer i.
• Bi: set of peers from which peer i has downloaded files.

    for each peer i do {
      query all peers j ∈ Ai for tj(0) = pj;
      repeat
        compute ti(k+1) = (1 − a)(c1i t1(k) + … + cNi tN(k)) + a pi;
        send cij ti(k+1) to all peers j ∈ Bi;
        wait for all peers j ∈ Ai to return cji tj(k+1);
      until |ti(k+1) − ti(k)| < ε;
    }

[Figure: peer i with its predecessors Ai (peers that download from i) and successors Bi (peers that i downloads from), exchanging weighted trust values cji tj(k)]
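A toy synchronous simulation of the cycle above: each peer recomputes only its own value from the weighted opinions it receives. All names, the 3-peer matrix, the pre-trust vector, and a = 0.15 are illustrative assumptions:

```python
def distributed_cycle(C, t, p, a=0.15):
    """One query cycle: every peer i recomputes its own t_i from the
    messages c_ji * t_j(k) returned by the peers j in its set A_i
    (peers that downloaded from i). C[j][i] holds c_ji."""
    n = len(t)
    return [(1 - a) * sum(C[j][i] * t[j] for j in range(n)) + a * p[i]
            for i in range(n)]

# Rows are normalized local trust vectors; peer 0 is pre-trusted
C = [[0.0, 0.7, 0.3],
     [0.5, 0.0, 0.5],
     [0.9, 0.1, 0.0]]
p = [1.0, 0.0, 0.0]

t = list(p)
for _ in range(200):  # query cycles; in practice each peer stops once stable
    t = distributed_cycle(C, t, p)
```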
69
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (distributed)
• Complexity: evaluated for a network of 1,000 peers after 100 query cycles.
70
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (distributed, secure)
• Issue: the trust value of one peer should be computed by more than one other peer, because
  – malicious peers may report false trust values of their own, and
  – malicious peers may compute false trust values for others.
71
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (distributed, secure)
• Solution strategy:
  – The current trust value of a peer must not be computed by, or reside at, the peer itself, where it can easily become subject to manipulation.
  – The trust value of one peer in the network is computed by more than one other peer.
• Use multiple DHTs (such as CAN or Chord) to assign mother peers.
• The number of mother peers per peer is the same for all peers.
72
[Figure: peer i with four peer sets -- Predecessors Ai (download from i), Successors Bi (downloaded by i), Mothers Mi (compute for i), and Daughters Di (computed by i); each mother holds i's sets Ai and Bi and its local trust vector ci]
73
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (distributed, secure)
• Each peer's trust value is now computed by its mother peers. For a daughter d, a mother runs:

    query all peers j ∈ Ad for cjd tj(0) = cjd pj;
    repeat
      compute td(k+1) = (1 − a)(c1d t1(k) + … + cNd tN(k)) + a pd;
      send cdj td(k+1) to all peers j ∈ Bd;
      wait for all peers j ∈ Ad to return cjd tj(k+1);
    until |td(k+1) − td(k)| < ε;

[Figure: as before, peer i with predecessors Ai, successors Bi, mothers Mi, and daughters Di; the mothers exchange the iterates ti(1), ti(2), …, ti]
74
Reputation-based Trust Management Systems -- EigenRep: EigenTrust Algorithm (distributed, secure)

    for each peer i do
      send cij, Bi, Ai to its mother peers Mi;
      collect cdj, Bd, Ad from its daughter peers Di;
      for each daughter d ∈ Di (i = Hash(d)) do {
        query all peers Hash(j), j ∈ Ad, for cjd tj(0) = cjd pj;
        k = 0;
        repeat
          compute td(k+1) = (1 − a)(c1d t1(k) + … + cNd tN(k)) + a pd;
          send cdj td(k+1) to all peers Hash(j), j ∈ Bd;
          wait for all peers Hash(j), j ∈ Ad, to return cjd tj(k+1);
          k = k + 1;
        until |td(k+1) − td(k)| < ε;
      }
    end
75
Reputation-based Trust Management Systems -- EigenRep: Limitations of EigenRep
• Cannot distinguish between newcomers and malicious peers.
• Malicious peers can still cheat cooperatively.
  – A peer should not report its predecessors by itself.
• Flexibility: how to calculate reputation values as peers join and leave, or go online and offline?
• When should global reputation values be updated? (They depend on the new local reputation vectors of all peers.)
• Anonymity: a mother peer knows its daughters.
77
Reputation-based Trust Management Systems -- P2PRep: Introduction
Choosing Reputable Servents in a P2P Network (Cornelli et al., 2002)
• The focus is not on the computation of reputations, but on the security of the exchanged messages:
  – Queries
  – Votes
• How to prevent various security attacks
78
Reputation-based Trust Management Systems -- P2PRep: Introduction
• Gnutella is used as the reference system:
  – A fully decentralized P2P infrastructure
  – Peers have low accountability and trust
  – Security threats to Gnutella:
    • Distribution of tampered information
    • Man-in-the-middle attacks
79
Reputation-based Trust Management Systems -- P2PRep: Sketch of P2PRep
• Goals: ensure the authenticity of offerers and voters, and the confidentiality of votes
• Use public-key encryption to provide integrity and confidentiality of messages
• Require each peer_id to be a digest of a public key for which the peer knows the private key
• Votes are values expressing opinions about other peers
• Servent reputation represents the "trustworthiness" of a servent in providing files
• Servent credibility represents the "trustworthiness" of a servent in providing votes
80
Reputation-based Trust Management Systems -- P2PRep: Sketch of P2PRep
• P selects a peer among those who respond to P's query
• P polls its peers for opinions about the selected peer
• Peers respond to the polling with votes
• P uses the votes to make its decision
81
Reputation-based Trust Management Systems -- P2PRep: Approaches
• Two approaches:
  – Basic polling: voters do not include their peer_id in votes
  – Enhanced polling: voters declare their peer_id in votes
82
Reputation-based Trust Management Systems -- P2PRep: Basic Polling
• Phase 1: Resource searching. p sends a Query message to search for resources, and servents matching the request respond with a QueryHit.
83
Reputation-based Trust Management Systems -- P2PRep: Basic Polling
• Phase 2: Vote polling. p polls its peers about the reputation of a top list T of servents, and peers wishing to respond send back a PollReply.
84
Reputation-based Trust Management Systems -- P2PRep: Basic Polling
• Phase 3: Voter evaluation. p selects a set of voters, contacts them directly, and expects a confirmation message back.
85
Reputation-based Trust Management Systems -- P2PRep: Basic Polling
• Phase 4: Resource download. p selects a servent s from which to download the resource and runs a challenge-response phase before downloading.
86
Reputation-based Trust Management Systems -- P2PRep: Enhanced Polling
• Phase 1: Resource searching. p sends a Query message to search for resources, and servents matching the request respond with a QueryHit.
87
Reputation-based Trust Management Systems -- P2PRep: Enhanced Polling
• Phase 2: Vote polling. p polls its peers about the reputation of a top list of servents, and peers wishing to respond send back a PollReply.
88
Reputation-based Trust Management Systems -- P2PRep: Enhanced Polling
• Phase 3: Voter evaluation. p selects a set of voters and contacts them directly, preventing voters from declaring fake IPs.
89
Reputation-based Trust Management Systems -- P2PRep: Enhanced Polling
• Phase 4: Resource download. p selects a servent s from which to download the resource and runs a challenge-response phase before downloading.
90
Reputation-based Trust Management Systems -- P2PRep: Comparison: Basic vs. Enhanced
• Basic polling: all votes are considered equal
• Enhanced polling: peer_ids allow p to weight the votes based on each voter v's trustworthiness (credibility)
91
Reputation-based Trust Management Systems -- P2PRep: Security Improvements (1)
• Distribution of tampered information: B responds to A with a fake resource
• P2PRep solution:
  – A discovers the harmful content came from B
  – A updates B's reputation, preventing further interaction with B
  – A becomes a witness against B in polling by others
92
Reputation-based Trust Management Systems -- P2PRep: Security Improvements (2)
• Man-in-the-middle attack: data from C to A can be modified by B, who is on the path
  • A broadcasts a Query and C responds
  • B intercepts the QueryHit from C and rewrites it with B's IP and port
  • A receives B's reply and chooses B for downloading
  • B downloads the original content from C, modifies it, and passes it to A
93
Reputation-based Trust Management Systems -- P2PRep: Security Improvements (2)
• Man-in-the-middle attack:
  – P2PRep addresses this problem by including a challenge-response phase before downloading
  – To impersonate C, B would need:
    • C's private key
    • To devise a public key whose digest is C's identifier
  – Public-key encryption strongly enhances the integrity of the exchanged messages
  – Both versions of the protocol address this problem
95
Frameworks for Trust Establishment -- Trust-χ: Introduction
• Trust establishment via trust negotiation
  – Exchange of digital credentials
• The credential exchange has to be protected
  – Policies govern credential disclosure
• Claim: current approaches to trust negotiation do not provide a comprehensive solution that takes into account all phases of the negotiation process
96
Slide from: http://www.ccs.neu.edu/home/ahchan/wsl/symposium/bertino.ppt
Frameworks for Trust Establishment -- Trust-χ: Trust Negotiation Model
[Figure: a client and a server, each with a policy base and a subject profile; the client sends a resource request, the two parties exchange policies and credentials, and finally the resource is granted]
97
Frameworks for Trust Establishment -- Trust-χ
• An XML-based system
• Designed for a peer-to-peer environment:
  – Both parties are equally responsible for negotiation management
  – Either party can act as a requester or as a controller of a resource
• X-TNL: an XML-based language for specifying certificates and policies
98
Frameworks for Trust Establishment -- Trust-χ
• Certificates are of two types:
  – Credentials: state personal characteristics of the owner and are certified by a CA
  – Declarations: collect personal information about the owner that does not need to be certified
• Trust tickets (X-TNL): used to speed up negotiations for a resource when access was granted in a previous negotiation
• Support for policy preconditions
• Negotiation is conducted in phases
99
Frameworks for Trust Establishment -- Trust-χ: Credentials and Declarations
[Figure: (a) an X-TNL credential; (b) an X-TNL declaration]
100
The Basic Trust-X System
[Figure: two negotiating parties, Alice and Bob; each runs a tree manager and a compliance checker, backed by a policy database, an X-Profile, and a mailbox store]
Slide from: http://www.ccs.neu.edu/home/ahchan/wsl/symposium/bertino.ppt
101
Frameworks for Trust Establishment -- Trust-χ: Message Exchange in a Trust-X Negotiation
[Figure: Alice sends Bob a service request; the negotiation then proceeds in phases: an introductory phase (preliminary information exchange, prerequisite acknowledgement), a policy-exchange phase (bilateral disclosure and matching of disclosure policies), a credential-disclosure phase (actual exchange of credentials and/or declarations), and finally resource disclosure (service granted)]
Slide from: http://www.ccs.neu.edu/home/ahchan/wsl/symposium/bertino.ppt
102
Frameworks for Trust Establishment -- Trust-χ: Disclosure Policies
• "They state the conditions under which a resource can be released during a negotiation."
• Prerequisites: associated with a policy, a set of alternative disclosure policies that must be satisfied before the disclosure of the policy they refer to.
103
Frameworks for Trust Establishment -- Trust-χ: Logic Formalism
• A term P(C) consists of a credential type P() and a set of conditions C.
• A policy is expressed as R ← P1(c), P2(c), where R is the resource the policy refers to and P1(c), P2(c) are the requested certificates.
• Disclosure policies are expressed as logical expressions which can specify either simple or composite conditions on certificates.
Slide from: http://www.ccs.neu.edu/home/ahchan/wsl/symposium/bertino.ppt
104
Example
• Consider a rental-car service.
• The service is free for employees of the Corrier company. The company already knows Corrier employees and has digital copies of their driving licenses, so it only asks employees for the company badge and a valid copy of the ID card, to double-check ownership of the badge. By contrast, the rental service is available, on payment, to unknown requesters, who must first submit a digital copy of their driving licence and then a valid credit card. These requirements can be formalized as follows:
105
Example (2)
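One way to encode the two alternative policies as data, as a toy illustration only (Trust-χ actually expresses these in X-TNL, and every name below is my own):

```python
# Each resource maps to alternative credential sets, any one of which
# unlocks disclosure of the resource.
policies = {
    "rental_car": [
        {"corrier_badge", "id_card"},        # known Corrier employees, free
        {"driving_licence", "credit_card"},  # unknown requesters, on payment
    ]
}

def can_disclose(resource, presented):
    """The resource is released if the presented credentials satisfy
    at least one alternative disclosure policy."""
    return any(required <= presented for required in policies[resource])

print(can_disclose("rental_car", {"corrier_badge", "id_card"}))  # True
print(can_disclose("rental_car", {"driving_licence"}))           # False
```

A fuller model would also order the disclosures (licence before credit card) via the prerequisite chains described earlier.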
106
Trust-X negotiation
107
Frameworks for Trust Establishment -- Trust-χ: Negotiation Tree
• Used in the policy-evaluation phase
• Maintains the progress of a negotiation
• Used to identify at least one possible trust sequence that can lead to a successful negotiation (a view)
108
Frameworks for Trust Establishment -- Trust-χ: Negotiation Tree (2)
109
Summary
• Thanks! Questions?