Avoiding distrust in e-government
Kjell Jørgen Hole

We study e-government services and develop a trust model to illustrate how incidents affecting few users can cause pervasive distrust, and why it is hard to regain lost trust. We then discuss how to build and maintain trust when the high complexity of e-government infrastructures makes incidents inevitable.


  1. Avoiding distrust in e-government
  Kjell Jørgen Hole, Simula@UiB
  Last updated 16.05.17

  Overview
  ❖ Introduction
  ❖ Defining trust
  ❖ Information infrastructures
  ❖ Explanatory trust model
  ❖ Trust is fragile
  ❖ Tipping points
  ❖ Distrust is robust
  ❖ Building and preserving trust

  Introduction

  2. E-government in Norway
  Note: Norwegian citizens now receive information from the government in personal digital "mail boxes."
  ❖ The Norwegian government is developing e-services for citizens and companies
  ❖ applications, invoicing, appointments, and various types of reports will be handled electronically
  ❖ sensitive information such as health and tax data is to be sent over the Internet to personal devices

  Trust modeling
  Note: While the model provides one set of explanations, other explanations are also possible.
  ❖ We'll model a population of users who influence each other's level of trust in e-government services
  ❖ The model explains why
  ❖ trust decreases rapidly when distrust starts to spread
  ❖ it is hard to determine which incidents will lead to widespread distrust
  ❖ it is difficult to create pervasive trust when there is much distrust

  Defining trust

  3. Trust as a computational construct
  Note: The literature contains many different definitions of 'trust' because the concept is both context- and agent-dependent.
  ❖ Here, an individual's trust in an entity is given by three mutually exclusive states
  ❖ trust
  ❖ mistrust
  ❖ distrust

  State of trust
  ❖ An individual that trusts an entity has a positive expectation of the entity's future behavior
  ❖ The individual will cooperate with the entity even though there is a possibility that the entity will misbehave and inflict cost or damage
  ❖ The entity gains the individual's trust over time through repeated actions benefiting the individual

  State of mistrust
  ❖ An individual harboring mistrust believes the uncertainty is too large to expect a particular behavior from an entity
  ❖ a citizen may believe in a government's desire to deliver secure services, but have no confidence in the government's ability to deliver
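As a minimal sketch (my own, not from the slides), the three mutually exclusive states can be captured in a small Python enum; the name `TrustState` is an assumption for illustration:

```python
from enum import Enum

class TrustState(Enum):
    """The three mutually exclusive trust states an individual can hold."""
    TRUST = 1     # positive expectation of the entity's future behavior
    MISTRUST = 2  # uncertainty too large to expect a particular behavior
    DISTRUST = 3  # expects the entity to deliberately act against the individual

# An individual is always in exactly one of the three states.
current_state = TrustState.MISTRUST
```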

  4. State of distrust
  ❖ An individual distrusting an entity believes it will deliberately act against her in a given situation
  ❖ a distrusting citizen may think that the government uses collected information to spy on individuals

  Trust varies over time
  Note: Trust is situational and varies over time. In area A, the individual trusts an entity enough to cooperate. In area B, the individual actively distrusts the entity and will take action against it, convinced that the entity will respond in turn. Between the cooperation and noncooperation thresholds is mistrust, in which the individual believes in the entity's intent to deliver a certain service quality but is not certain of the entity's ability to do so.
  [Figure: an individual's development of trust over time; the trust level crosses the cooperation threshold into area A (trust) and the noncooperation threshold into area B (distrust), with mistrust in between.]

  Mutual influence
  ❖ Since most users do not fully understand the reasons for incidents in national computer systems, they will seek advice from others
  ❖ The users' trust is particularly influenced by the opinions of family, friends, and co-workers
  ❖ especially when they all start to discuss incidents widely reported by the media
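The two thresholds in the figure above can be sketched as a simple mapping from a continuous trust level to the three discrete states. The numeric threshold values below are hypothetical; the slides only posit that a cooperation threshold lies above a noncooperation threshold:

```python
def state_from_level(level, coop=0.6, noncoop=0.3):
    """Map a continuous trust level in [0, 1] to a discrete state.

    Above the cooperation threshold the individual trusts (area A);
    below the noncooperation threshold she distrusts (area B);
    in between she harbors mistrust. Threshold values are illustrative.
    """
    if level >= coop:
        return "trust"
    if level <= noncoop:
        return "distrust"
    return "mistrust"
```

For example, `state_from_level(0.5)` falls between the two thresholds and returns `"mistrust"`.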

  5. Population has degrees of trust
  Note: The three fractions of trust, mistrust, and distrust sum to one.
  ❖ Note that the whole population has different degrees of trust, mistrust, and distrust at the same time, measured by the fractions of individuals in each of the three states

  Information infrastructures

  Infrastructure definition
  ❖ A national information infrastructure is a socio-technological system that consists of
  ❖ stakeholders,
  ❖ networked computer systems,
  ❖ security and privacy policies, and
  ❖ threats such as equipment failure, extreme weather, hacking, and sabotage

  6. LHR events
  Note: LHR events are outliers that are also referred to as black or gray swans. See the lecture on antifragile ICT systems to learn more about LHR events.
  ❖ Incidents occur in infrastructures all the time, but users do not detect most of the events because automated mechanisms and system operators limit their impacts
  ❖ From the users' point of view, infrastructures tend to be stable over long periods, punctuated by large-impact, hard-to-predict, and rare (LHR) events

  Complex adaptive system
  Note: The complexity is due to the many interactions between the users and the infrastructure, the large amount of communication between the subsystems, the influence of changing policies and threats, and the adaptation of stakeholders and infrastructure to internal and external changes.
  ❖ We view a national information infrastructure as a complex adaptive system and LHR events as surprising and extreme global behavior

  LHR incidents are inevitable
  Note: While improved risk management can assess and mitigate more incidents, incidents will still occur because an infrastructure has too many dynamic interactions for humans to even enumerate all possible rare and extreme behaviors of the system.
  ❖ Because national information infrastructures are complex adaptive systems, LHR incidents will occur no matter the quality of the risk management
  ❖ Next, we develop a model to study how LHR incidents affect a population's trust in information infrastructures, especially e-government services
  ❖ we'll focus on incidents leading to widespread distrust

  7. Explanatory trust model

  Simple trust model
  Note: Simple, generic trust model shaped as a doughnut.
  ❖ Two-dimensional, discrete-time cellular automaton
  ❖ individuals are represented by 10,000 patches on a square that wraps around at the edges
  ❖ synchronous (deterministic) updates of trust states

  States of trust
  ❖ An individual's state of trust is given by the color of the patch
  ❖ trust is green
  ❖ mistrust is yellow
  ❖ distrust is red

  8. Realization of mutual influence
  Note: The figure shows the so-called Moore neighborhood.
  ❖ At each time step, the state of an individual is updated based on its own state and the states of its eight neighbors

  Set of update rules (used in examples)
  (1) A green patch changes to yellow when it has at most four green neighbors
  (2) A yellow patch turns red when it has at most three green neighbors
  (3) A red patch becomes yellow when at least seven neighbors are green
  (4) A yellow patch turns green when at least six neighbors are green

  Alternative sets of update rules
  Note: Each column defines a set of four update rules. The first two entries in a column define the maximum number of green neighbors causing changes towards distrust, while the last two entries define the minimum number of green neighbors needed to change away from distrust. The last column is the rule set from the previous slide.

  Changes                      Color-changing thresholds
  green → yellow (max green):  3 3 3 3 3 3 3 3 4 4 4 4 4 4
  yellow → red (max green):    1 1 1 1 2 2 2 2 1 1 2 2 3 3
  red → yellow (min green):    6 6 7 7 6 6 7 7 6 7 6 7 6 7
  yellow → green (min green):  5 6 5 6 5 6 5 6 6 6 6 6 6 6
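One synchronous update with the example rule set (1)-(4) can be sketched as follows, on a square grid that wraps around at the edges (Moore neighborhood of eight cells). This is a minimal Python rendering of the stated rules, not the authors' code:

```python
GREEN, YELLOW, RED = 0, 1, 2  # trust, mistrust, distrust

def step(grid):
    """One synchronous update of the cellular automaton.

    grid is an n-by-n list of lists; the edges wrap around (a torus),
    and every patch is updated from the same previous generation.
    """
    n = len(grid)

    def green_neighbors(r, c):
        # Count green patches among the eight Moore neighbors.
        return sum(
            grid[(r + dr) % n][(c + dc) % n] == GREEN
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)
        )

    new = [row[:] for row in grid]
    for r in range(n):
        for c in range(n):
            g = green_neighbors(r, c)
            s = grid[r][c]
            if s == GREEN and g <= 4:     # rule (1): at most four green neighbors
                new[r][c] = YELLOW
            elif s == YELLOW and g <= 3:  # rule (2): at most three green neighbors
                new[r][c] = RED
            elif s == RED and g >= 7:     # rule (3): at least seven green neighbors
                new[r][c] = YELLOW
            elif s == YELLOW and g >= 6:  # rule (4): at least six green neighbors
                new[r][c] = GREEN
    return new
```

For instance, a lone yellow patch in a sea of green has eight green neighbors, so rule (4) flips it back to green in one step, while its green neighbors each still see seven green neighbors and stay green under rule (1).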

  9. Properties of rule sets (1)
  Note: The rule sets all have the same properties. These properties were selected simply because they reflect reasonable assumptions about trust.
  ❖ An individual with trust goes through a period of mistrust before developing distrust
  ❖ An individual with distrust develops mistrust before trust

  Properties of rule sets (2)
  ❖ Individuals that have trusted an entity for a long time are reluctant to mistrust or distrust the entity
  ❖ Distrusting individuals are even more reluctant to ever again trust an entity that has violated their trust and caused pain or damage

  Properties of rule sets (3)
  Note: Users harboring mistrust will (directly or indirectly) experience trust-reducing events in the future because incidents are inevitable in a complex adaptive system. Hence, it is reasonable to believe that users with mistrust will develop distrust when they are surrounded by enough mistrust.
  ❖ Since mistrust is believed to be a less stable state than distrust, an individual harboring mistrust develops distrust when surrounded by much mistrust

  10. Explanatory model
  Note: See the lecture on antifragile ICT systems to better understand why it is, at best, very hard to predict extreme global behavior in complex adaptive systems.
  ❖ The trust model is non-predictive in the sense that it cannot forecast a population's trust in a real system
  ❖ However, it offers an explanation of how the degree of trust changes in a large community of users

  Trust is fragile

  Development of distrust
  Note: While most incidents go unnoticed by the media, a few incidents are widely reported. Not all reported events are very serious from a technical point of view, but extensive media coverage can still create mistrust among a significant fraction of users.
  ❖ We first study how a high degree of trust can turn into a high degree of distrust
  ❖ We concentrate on incidents reported in the media creating some percentage of initial mistrust
  ❖ At the start of a model run, a selectable percentage of the patches (chosen uniformly at random) are yellow and the rest are green
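The start of a model run described above can be sketched in a few lines of Python. The helper names `initial_grid` and `fractions` are my own, and the grid size, mistrust percentage, and seed are illustrative parameters:

```python
import random

GREEN, YELLOW, RED = 0, 1, 2  # trust, mistrust, distrust

def initial_grid(n=100, initial_mistrust=0.10, seed=42):
    """n-by-n grid with a selectable fraction of yellow patches,
    chosen uniformly at random; the remaining patches are green."""
    rng = random.Random(seed)
    return [
        [YELLOW if rng.random() < initial_mistrust else GREEN for _ in range(n)]
        for _ in range(n)
    ]

def fractions(grid):
    """Fractions of individuals in the trust, mistrust, and distrust
    states; by construction the three fractions sum to one."""
    flat = [s for row in grid for s in row]
    total = len(flat)
    return tuple(flat.count(s) / total for s in (GREEN, YELLOW, RED))

grid = initial_grid()
green_frac, yellow_frac, red_frac = fractions(grid)
```

Repeatedly applying an update rule set to such a grid and tracking the three fractions over time is then enough to observe how initial mistrust spreads or dies out.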
