Long before systems analysis took it up, trust was discussed at length in the social science literature. To date, however, this prior literature has not been fully unified, although solid progress has been made. As McKnight and Chervany (1996) observe, there are so many different definitions of trust in the research that it is impossible to compare studies and derive the meta-theoretical conclusions one would hope could unify them.
Much of the systems analysis literature does not seriously attempt to address the lack of a definition of trust, but only makes passing reference to the fact that a discussion is going on. Many papers have repeated this exact quotation on trust:
"Trust (or, symmetrically, distrust) is a particular level of the subjective probability with which an agent will perform a particular action, both before [we] can monitor each action (or independently of his capacity of ever be able to monitor it) and in a context in which it affects [our] own action." (Gambetta, as quoted in Capra, 2004, p. 108)
as definitive. Otherwise, many studies use a dictionary definition of trust, say from Webster's or the Oxford English Dictionary, before moving on to the problem at hand. While these strategies are legitimate as far as they go, they do not encompass the full body of work. The situation leaves something to be desired, and some have gone so far as to attack the state of the research outright as chaos:
Overall, it would appear that the confusion that existed in other fields may be slopping over into the IS field. Thus, while each individual IS researcher is being careful to base his/her work on established literature, the collective work is in danger of being chaotic because of the inconsistencies in the work that they are drawing from. (Gefen, Rao, & Tractinsky, 2003, p. 3)
To address this, what follows is a simplified framework built from the best conceptualizations to date.
It is not surprising that systems analysis research has failed to reach consensus on what trust means. More complete surveys (McKnight and Chervany, 1996; Chopra and Wallace, 2003) conclude that the social science literature has multiple definitions of trust because each fits the particular narrow lens or perspective of its respective discipline, and thus each is like a blind man describing an elephant. To summarize:
- Psychology. Trust is a personal attribute or attributes of the trustor, ranging from how likely they are to trust someone else to their feelings towards the person they are trusting.
- Social psychology. Trust is a phenomenon of interpersonal relationships.
- Sociology. Trust is structural. Trust comes either from faith in social institutions like the government or from connection through social networks.
- Economics. Trust is a rational choice mechanism.
These perspectives yield three broad structural categories for conceptualizing trust:
- Impersonal/Structural. Trust is placed in the social institutions surrounding the situation, for example, banking regulations.
- Dispositional. Does a person have more/less faith in human nature?
- Personal/Interpersonal. Trust is between the parties in that situation.
There is also a general model of how trust affects an individual, stemming from the dictionary definitions and verified by research: trust is a) a personal feeling of the trustor, b) about how confident to be in positive expectations of the trustee, c) which induces some behavioural change in the trustor. This leads Chopra and Wallace (2003) to define trust concisely as
"the willingness to rely on a specific other, based on confidence that one's trust will lead to positive outcomes." (p.333)
Finally, trust has both rational cognitive and emotional affective dimensions, a distinction that is often grappled with in the systems analysis literature (e.g. Camp, 2003). While economists look at the rational choice mechanism as a cognitive decision to trust, psychologists study how emotional states lead people to trust or distrust. McKnight and Chervany (1996) manage to combine all these disparate views of trust in the following table:
|                 | Impersonal | Personal                         | Interpersonal                 |
|-----------------|------------|----------------------------------|-------------------------------|
| Affective state | Structural | Dispositional, Attitude, Feeling |                               |
| Cognitive state | Structural |                                  | Expectancy, Belief, Intention |
Against this framework, we can place the various dimensions people have used to characterize trust. The following is drawn from Chopra and Wallace (2003), with some minor changes to relocate displaced categorizations, plus the dimensions described by McKnight and Chervany (1996):
- Trust as a personality characteristic of the trustor. Disposition to trust, personal attraction to the trustee, expectation, confidence, behaviour, cooperation, reliance.
- Trustworthiness of the trustee. Competence (credibility, expertness, dynamism), positive intentions (goodwill, benevolence, loyalty, motivations, open-mindedness), ethics (moral order, integrity, honesty, fairness, moral commitment, fulfillment of obligations, fiduciary obligation), predictability (reliability, dependability, consistency, responsiveness, behaving as expected, safety, shared understanding).
- Situation. Context, social trust (trust in trust), obligation of the trustee to the trustor.
Chopra and Wallace (2003) further describe trust as developed through predictability (past experience with the trustee), judgment (calculation based on available evidence), bonding (emotional relationship), reputation, and identification (do we share a common identity?). In a later paper, McKnight and Chervany (2001) summarize the background (hygiene) factors that are necessary to enable the process of developing trust, restructured here as follows (a rough sketch of how such factors might be combined is given after the list):
- Disposition to trust. The willingness to trust across a wide variety of situations or persons often depends on one's own personality, such as one's:
- Faith in humanity. One believes others are usually upright, well-meaning, and dependable.
- Trusting stance. One believes dealing with people as though they are well-meaning and reliable leads to better outcomes.
- Willingness to depend. One is volitionally prepared to make oneself vulnerable to the other person.
- Judgment of trustworthiness. While trustworthiness as a personal trait was described above as a multi-dimensional concept, making a personal judgment about the trustworthiness of another person involves at least:
- Trusting beliefs. One believes the trustee has desirable traits, such as the degree of competence and predictability related to trustworthiness above.
- Trusting intentions. One feels safe in depending or relying on the trustee.
- Probable future dependence. One forecasts or predicts that one will depend on the other person.
- Situational normality. One feels comfortable that the situation is normal or favourable.
- Social recourse. Where recommendations involve tapping a social network to decide whom to trust, recourse is the converse flow: one feeds back into that social network to punish or respond, in order to improve the odds of a successful outcome. Licensing, auditing laws (i.e. punishing the trustee for malfeasance), guarantees, contracts, regulations, promises, legal recourse, processes, and procedures all exist to increase the probability of success.
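To make these hygiene factors concrete, here is a rough sketch, in Python, of how they might be combined into a single willingness-to-depend score. This is not a model taken from McKnight and Chervany (2001); the factor names, the 0-to-1 scales, and the equal weighting are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class TrustFactors:
    """Illustrative hygiene factors, each scored on an assumed 0.0-1.0 scale."""
    faith_in_humanity: float      # disposition: others are usually well-meaning and dependable
    trusting_stance: float        # disposition: treating people as reliable leads to better outcomes
    trusting_beliefs: float       # judged competence and predictability of this trustee
    situational_normality: float  # the situation feels normal or favourable
    social_recourse: float        # contracts, regulation, guarantees, legal recourse are available

def willingness_to_depend(f: TrustFactors) -> float:
    """Combine the factors into one score; the equal weighting is an arbitrary choice."""
    scores = [
        f.faith_in_humanity,
        f.trusting_stance,
        f.trusting_beliefs,
        f.situational_normality,
        f.social_recourse,
    ]
    return sum(scores) / len(scores)

# Example: strong beliefs about this trustee, but weak structural assurance.
print(willingness_to_depend(TrustFactors(0.7, 0.6, 0.9, 0.8, 0.2)))  # 0.64
```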
Distrust. The latter point, social recourse, points to where trust contrasts with distrust. Trust enables higher gains while distrust avoids potential loss (Camp, 2003). Along the axis of personal, interpersonal, and structural relationships, distrust is, respectively, a person's level of general cautiousness or wariness; the degree to which they wish to control a particular interpersonal relationship in order to feel safe; and a mechanism (e.g. jail) to revoke trust either granted or presumed by default. One reasonable definition says distrust is "the lack of firm belief in the competence of an entity to act dependably, securely, and reliably within a specified context." (Grandison and Sloman, 2000, p. 3)
In information systems, many attempts have been made to formalize or model trust for ease of implementation by machine. The following principles are commonly held:
- Trust is contextual / situational. Trust varies from situation to situation. You care more about competence than benevolence from your doctor, as you do not want them to hurt you, but you do not care very much if they are in it for the money; yet benevolence is the most important quality in parent/child relationships. Alternatively, one trusts different parties for different purposes: one may trust their car mechanic to operate on their engine, but not on their heart.
- Trust has levels. We trust people at different levels. One may trust Joe more than Jake as a car mechanic, and therefore prefer to have Joe service their car. One may trust a CEO more with sensitive information than a new co-op student, or trust their wife more than the town gossipmonger. Further, gradations of trust are continuous rather than discrete: the longer an employee works successfully at a company, the more responsibility we entrust to them.
- Trust is not static. Trust changes over time: it increases as relationships grow, is withdrawn as people become distrusted, or decays as we lose touch with people. In practice, few systems take this into consideration (Grandison and Sloman, 2000).
- Trust is transitive. As Grandison and Sloman (2000) point out, trust is decided either on the basis of a personal judgment or on a recommendation by another trusted party. If A trusts B and B trusts C, then A trusts C. We often rely on others to make our trust judgments for us: one may trust their family doctor to recommend a specialist, or a medical association to certify both the family doctor and the specialist. Many researchers have relied on this aspect of trust to build recommendation, reputation, or certification systems that confer trust based on the trustee's position in a social network relative to the trustor, typically automatically, by modeling this social network explicitly (a minimal sketch follows this list).
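As an illustration of the transitive approach these systems take, here is a minimal sketch of trust propagation over an explicitly modeled social network. The dictionary-of-dictionaries graph, the 0-to-1 trust levels, and the rule of multiplying trust along a recommendation chain are assumptions of this sketch, not any specific published algorithm.

```python
# Each entry records how much one party directly trusts another, on a 0.0-1.0 scale.
trust: dict[str, dict[str, float]] = {
    "A": {"B": 0.9},
    "B": {"C": 0.8, "D": 0.3},
    "C": {},
    "D": {},
}

def derived_trust(source: str, target: str, visited: frozenset = frozenset()) -> float:
    """Best trust the source can derive in the target through any chain of recommendations.
    Trust only weakens along a chain, since levels in [0, 1] are multiplied together."""
    if target in trust.get(source, {}):
        return trust[source][target]
    best = 0.0
    for middleman, level in trust.get(source, {}).items():
        if middleman not in visited:  # avoid cycling through the same parties
            best = max(best, level * derived_trust(middleman, target, visited | {source}))
    return best

print(derived_trust("A", "C"))  # A trusts B (0.9), B trusts C (0.8) -> 0.72
print(derived_trust("A", "D"))  # A trusts B (0.9), B barely trusts D (0.3) -> 0.27
```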
However, most attempts have focused on transitive trust because it is convenient for the modeling technique; ignoring the full richness of how people actually use their social connections to decide trust leads to faulty systems. For one, recommendations themselves also have to be trusted (Abdul-Rahman & Hailes, 1997a): one may not trust their car mechanic to give medical advice, and one may place a different level of trust in the state-run institutions of the Third World. Secondly, trust is only partially transitive, not totally transitive. If a friend recommends an accountant, it does not automatically follow that money can be trusted with that accountant; we still have to evaluate that accountant ourselves (Reagle, 1996). Further, trust is bound up in personal relationships as described above, so until the accountant has been met and a personal relationship built, we will not trust each other as much as the friend and the accountant do. Also, people lie, are prejudiced, or have other biases that we cannot assume away, even though most of the systems literature does make this simplifying assumption for the convenience of its models. This can be dangerous if the system forces a user to become vulnerable to a party with whom, despite what the model believes, the user does not get along. A preferred model is instead to use recommendations to identify candidates, and then to make personal judgments and build relationships from there.
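Following that preferred model, the sketch below uses recommendations only to shortlist candidates and leaves the final choice to the trustor's own judgment. The threshold, the candidate names, and the personal_judgment callback are all hypothetical stand-ins, not part of any cited system.

```python
from typing import Callable, Optional

def choose_trustee(recommended: dict[str, float],
                   personal_judgment: Callable[[str], float],
                   threshold: float = 0.5) -> Optional[str]:
    """Recommendations narrow the field; a personal judgment makes the final call."""
    shortlist = [name for name, level in recommended.items() if level >= threshold]
    ranked = sorted(shortlist, key=personal_judgment, reverse=True)
    return ranked[0] if ranked else None

# A friend's recommendations shortlist two accountants; the third never makes the cut.
recommended = {"accountant_1": 0.8, "accountant_2": 0.7, "accountant_3": 0.3}
own_assessment = {"accountant_1": 0.4, "accountant_2": 0.9, "accountant_3": 0.95}
print(choose_trustee(recommended, personal_judgment=own_assessment.get))
# -> accountant_2: the trustor's highest-rated candidate among those shortlisted,
#    even though accountant_3 would have scored higher on personal judgment alone.
```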
References
Abdul-Rahman, A. and Hailes, S. (1997a). A distributed trust model. In proceedings ACM New Security Paradigms Workshop '97, Cumbria, UK. September 1997, 48-60.
Abdul-Rahman, A. and Hailes, S. (1997b). Using recommendations for managing trust in distributed systems. In Proceedings IEEE Malaysia International Conference on Communication '97 (MICC'97), Kuala Lumpur, Malaysia. November 1997.
Abrams, M. D. and Joyce, M. V. (1995). Trusted systems concepts. Computers and security, 14(1), 57-68.
Camp, L. J. (2003). Designing for trust. In: Rino Falcone, et al. (Eds.): Trust, Reputation, and Security: Theories and Practice, AAMAS 2002 International Workshop, Bologna, Italy, July 15, 2002, Selected and Invited Papers.
Capra, L. (2004). Engineering human trust in mobile system collaborations. In SIGSOFT'04, 107-116.
Chopra, K. and Wallace, W. A. (2003). Trust in electronic environments. Proceedings of the 36th Hawaii International Conference on System Sciences, January, 2003, 331-340.
Gefen, D., Rao, V. S., and Tractinsky, N. (2003). The conceptualization of trust, risk and their relationship in electronic commerce: The need for clarifications. In Proceedings of the 36th Hawaii International Conference on System Sciences (HICSS ’03), Hawaii, 192-201.
Grandison, T. and Sloman, M. (2000). A survey of trust in Internet applications. IEEE Communications Surveys, Fourth Quarter, 2-16.
Guha, R., Kumar, R., Raghavan, P., and Tomkins, A. (2004). Propagation of trust and distrust. In Proceedings of WWW2004, New York, May 17-22.
McKnight, D. H., and Chervany, N. L. (1996). The meanings of trust. University of Minnesota MIS Research Center Working Paper series, WP 96-04. Available from http://misrc.umn.edu/wpaper/WorkingPapers/9604.pdf
McKnight, D. H. and Chervany, N. L. (2001). Conceptualizing trust: A typology and e-commerce customer relationships model. In Proceedings of the 34th Hawaii International Conference on System Sciences, January 3-6, 2001, 7022-7031.
Mundie, C., de Vries, P., and Corwine, M. (2002). Trustworthy computing. Microsoft whitepaper. Available from http://www.microsoft.com/mscorp/twc/twc_whitepaper.mspx
Reagle Jr., J. M. (1996). Trust in electronic markets: The convergence of cryptographers and economists. First Monday, 1(2). Available from http://www.firstmonday.dk/issues/issue2/markets/index.html
Trusted Computing Platform Alliance. (2001). Main specification, Version 1.1b. Available at http://www.trustedcomputing.org/docs/main%20v1_1b.pdf
Yu, B. and Singh, M. P. (2002). An evidential model of distributed reputation management. In Proceedings of AAMAS ’02, Bologna, Italy, July 15-19, 2002, 294-301.
Discussion
There is also a very different definition of trust: trust is the amount of damage you allow someone else to be able to inflict on you. So a high level of trust doesn't mean nothing can go wrong; it means that if something goes wrong, it will hurt a lot. In this definition trust is neither a probability nor an estimation. It is measured by how much of yourself you have already exposed to your opponent.
How much someone else (or something) can inflict on you is (insofar as you are not omnipotent) independent of you. I don't think you can define trust as something independent of the person who trusts. -- ZbigniewLukasiak