Attributes and Dimensions of Trust in Secure Systems

Matthew Bradbury, Daniel Prince, Victoria Marcinkiewicz, and Tim Watson. Attributes and Dimensions of Trust in Secure Systems. In Proceedings of the 12th International Conference on the Internet of Things, STaR-IoT’22, 179–186. Delft, Netherlands, 7 November 2022. ACM. doi:10.1145/3567445.3571105.


In this paper we set out our position on the concept of trust in secure systems. This position evolved from previous work investigating trust-based task offloading. The views in this paper were intended to resolve issues that we perceived with the use of trust in literature produced by the security community. To resolve these issues we provided more general definitions for trust and related concepts, splitting them into the label that is assigned (trusted/trustworthy) and the measurements (trustiness/trustworthiness). We also identified a set of trust attributes that could be applied to this more general definition and explored dimensions along which those attributes could be measured.

Prior to this work we noted that:

  1. The majority of existing work tends to redefine what trust means depending on what that work was investigating.
  2. Many works focus on trust in the behaviour of a trustee, so definitions of trust tend to centre on behavioural trust.
  3. Many other types of trust exist beyond behavioural trust.
  4. Trusted and trustworthy are often used as synonyms, but we believe they are different concepts.
  5. Trust is often assumed or placed in an entire system (or subsystem) without evidence to support this assignment.

So the aim of this paper was to: 1) define trust (and related concepts) in a general way, such that they could be paired with 2) different trust attributes that specify what attribute of a system is being assessed, and 3) specify dimensions along which those attributes could be measured.


  • Trustiness — A measurement of the attributes under consideration by the trustor to assess the ability of the trustee to meet the trustor's trust expectations.
  • Trustworthiness — A measure of the uncertainty in the trustiness the trustor has in the trustee.
  • Trusted — An entity in a system is deemed to be trusted when the trustiness is sufficiently high.
  • Trustworthy — An entity in a system is deemed to be trustworthy when the trustworthiness is sufficiently high.

Here, the trustor is the entity that performs an assessment of trust in, or assumes trust in, a trustee.
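The split between the measurements (trustiness/trustworthiness) and the labels (trusted/trustworthy) can be sketched in code. This is an illustrative model only, not the paper's formalism: the class and function names, the thresholds, and the representation of trustworthiness as certainty (the inverse of uncertainty) on a 0–1 scale are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class TrustAssessment:
    """A trustor's assessment of a trustee for one trust attribute."""
    trustiness: float       # measurement of the attribute under consideration
    trustworthiness: float  # certainty in that measurement (1 = no uncertainty)

def is_trusted(a: TrustAssessment, threshold: float = 0.8) -> bool:
    # The "trusted" label is assigned when trustiness is sufficiently high.
    return a.trustiness >= threshold

def is_trustworthy(a: TrustAssessment, threshold: float = 0.8) -> bool:
    # The "trustworthy" label is assigned when trustworthiness is sufficiently high.
    return a.trustworthiness >= threshold

# A trustee can measure as trusted while not being trustworthy:
# here trustiness is high, but the trustor is uncertain about it.
a = TrustAssessment(trustiness=0.9, trustworthiness=0.4)
print(is_trusted(a), is_trustworthy(a))  # → True False
```

The point the example makes is that the two labels can disagree: a high trustiness measurement backed by little evidence yields a trusted but not trustworthy trustee.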


Using the example proactive trust-based task offloading system, we examined the different trust attributes on the proposed scale. The key takeaway is that while there was an element of trust assessment in this system, it is not possible to describe it as holistically trusted, because some attributes were not evidenced and other attributes were evidenced to different degrees.

| Attribute | Scale | Activity | Scope | Strength | Source | Time of Evidence |
| --- | --- | --- | --- | --- | --- | --- |
| Data Accuracy | None | — | — | — | Assumed | — |
| Data Integrity | Ordinal | Reactive | Local | High | Direct | Sampled |
| Data Provenance | Ordinal | Reactive | Local | High/Medium | Direct | Sampled |
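One way to read the classification above is as a check over the attribute/dimension pairs: the system is only holistically trusted if every attribute is actually evidenced. The encoding below is a hypothetical sketch (the dictionary layout and the `holistically_trusted` rule are my own illustration, not the paper's method); the dimension names and values come from the table.

```python
# Classification of the task-offloading example, keyed by trust attribute.
# Dimensions with no evidence are simply omitted from an attribute's entry.
trust_attributes = {
    "Data Accuracy":   {"scale": "None", "source": "Assumed"},
    "Data Integrity":  {"scale": "Ordinal", "activity": "Reactive",
                        "scope": "Local", "strength": "High",
                        "source": "Direct", "time_of_evidence": "Sampled"},
    "Data Provenance": {"scale": "Ordinal", "activity": "Reactive",
                        "scope": "Local", "strength": "High/Medium",
                        "source": "Direct", "time_of_evidence": "Sampled"},
}

def holistically_trusted(attrs: dict) -> bool:
    # Holistic trust requires every attribute to be evidenced: a measurable
    # scale and a source of evidence that is not merely assumed.
    return all(dims.get("scale") != "None" and dims.get("source") != "Assumed"
               for dims in attrs.values())

print(holistically_trusted(trust_attributes))  # → False (Data Accuracy is assumed)
```

Because Data Accuracy is unmeasured and merely assumed, the check fails, matching the paper's conclusion that the example system cannot be described as holistically trusted.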


Understanding what is meant by trust is highly important, but it is typically left poorly specified. For example, the 2014 EU regulation on “electronic identification and trust services for electronic transactions in the internal market” (Regulation No 910/2014) contains some 200 references to the word trust, but never defines what is meant by it.


This is not the first piece of work in which views on the different types of trust attributes have been explored. Jøsang et al., Grandison and Sloman, and Daubert have all proposed similar sets of trust attributes. We noticed that the trust attributes of interest have evolved over time, likely due to the different ways in which trust is being assessed. These works have also considered uncertainty (for example, via belief, disbelief, and uncertainty), but this has typically been applied to what we would call trustiness rather than trustworthiness as we have defined them.