Making Users Feel Accountable: Deterring Abuses of Private Information within Information Systems

Anthony Vance and Braden Molyneux


A long-standing tenet of information security is the Principle of Least Privilege: the concept that “every privileged user of the system should operate using the least amount of privilege necessary to complete the job” (Saltzer 1974, p. 389). However, many medical, financial, and personnel records systems are intentionally designed in violation of this principle: because of job requirements, users of these systems are given broad access to information in the system. With this broad access comes the potential for abuse. For example, the Integrated Data Retrieval System (IDRS) used by the U.S. Internal Revenue Service (IRS) allows auditors to view the tax records of individuals they are not assigned to audit (U.S. Department of the Treasury 2004). Every year, the IRS sanctions or terminates hundreds of employees for unauthorized access violations made out of curiosity or for personal gain (TIGTA 2009). Abuses by users of medical records systems are similarly frequent (Rubenstein 2008).

If the potential for abuse of private information is so great, why is the Principle of Least Privilege not more strictly employed in these systems? One reason is concern for practicality and efficiency (Rubenstein 2008). As one hospital administrator put it: “There are just thousands of people who have access—and need to have access—to confidential information, and to try to change their behavior is a challenge” (Rubenstein 2008, p. D1).

Research Question

One promising means for modifying the behaviors of users with broad access privileges is accountability, “the implicit or explicit pressure to justify one’s beliefs and actions to others” (Tadmor and Tetlock 2009, p. 8). The construct of accountability has received substantial attention in the fields of psychology and organizational behavior (Lerner and Tetlock 1999). Findings are consistent that a person’s expectation of being held accountable for an action lessens the likelihood that the action will be performed in socially unacceptable ways (Gelfand and Realo 1999). However, although the potential of IT systems to foster accountability has long been recognized (Zuboff 1988), as yet no research has examined the effects of accountability on users of an information system. The research question of the proposed study is therefore the following:

RQ: How can features of an information system be designed to manipulate perceptions of accountability and thereby deter computer abuse?

Theory

Key elements of accountability theory include (1) identifiability, a person’s perception that his/her actions may be linked to him/her; (2) evaluation, the expectation that another will compare a user’s actions with some standard; and (3) justification, the expectation of being required to give reasons justifying one’s words or actions (Lerner and Tetlock 1999). The dependent variable of interest is a person’s intention to violate the organization’s IT privacy policy (Siponen and Vance 2009).

Approach

Two different methodologies will be used for this study. First, a factorial survey method will be used (Rossi and Anderson 1982; Jasso 2006). In this method, respondents read a series of short scenarios, or vignettes, describing a hypothetical situation. Textual elements within the scenario are experimentally varied. After reading the vignette, respondents rate their intention of behaving in the same way as the character portrayed in the scenario. Second, a field experiment will be conducted involving a homework submission system in which students are free to view the homework submissions of other students but are required by policy not to do so until their own homework assignment is submitted. Different students will be administered different accountability features in the system as treatments.

Subjects for this study will come from two sources. First, for the factorial survey, the primary data collection will involve university employees with broad access within a personnel records system. Second, the experiment will take place in several undergraduate programming courses over the course of a semester.

Expected Contributions

The proposed study breaks new ground by applying accountability theory to the field of information systems. An additional potential contribution is to identify which design features of an information system can be manipulated to make feelings of accountability more or less salient among system users and thereby deter computer abuse. A third potential contribution is to provide practitioners with empirical evidence for using accountability mechanisms as alternatives to the Principle of Least Privilege in preventing abuses of private information within information systems.
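The factorial survey design described in the Approach section can be sketched in code: each vignette is one cell of a full factorial crossing of the manipulated accountability dimensions, with the varied textual elements substituted into a fixed scenario template. The factor names, levels, and scenario text below are illustrative assumptions only, not the study's actual instrument.

```python
from itertools import product

# Hypothetical accountability dimensions and their textual levels
# (illustrative only; not the factors or wording of the proposed study).
FACTORS = {
    "identifiability": ["Accesses are logged under your user ID.",
                        "Accesses are anonymous."],
    "evaluation":      ["Access logs are reviewed monthly by a supervisor.",
                        "Access logs are never reviewed."],
    "justification":   ["You must enter a reason before opening a record.",
                        "No reason is required to open a record."],
}

# Fixed scenario text with slots for the experimentally varied elements.
TEMPLATE = ("Pat, a records clerk, is curious about a coworker's file. "
            "{identifiability} {evaluation} {justification} "
            "Pat opens the coworker's record anyway.")

def generate_vignettes():
    """Return one vignette per cell of the full factorial design."""
    names = list(FACTORS)
    return [TEMPLATE.format(**dict(zip(names, levels)))
            for levels in product(*FACTORS.values())]

vignettes = generate_vignettes()
print(len(vignettes))  # 2 x 2 x 2 = 8 vignette versions
```

In practice, each respondent would rate a random subset of these versions, and regressing the ratings on the factor levels estimates each accountability dimension's effect on the intention to violate policy.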

Manuscript Status

The manuscript is currently a working paper. The theory and literature review sections are near completion. Data is currently being collected as part of the pretest for the factorial survey. Both the primary data collection for the factorial survey and the field experiment begin in January 2010.

References

Gelfand, M., and Realo, A. 1999. “Individualism-Collectivism and Accountability in Intergroup Negotiations.” Journal of Applied Psychology 84(5), pp. 721-736.

Jasso, G. 2006. “Factorial Survey Methods for Studying Beliefs and Judgments.” Sociological Methods and Research 34, pp. 334-423.

Lerner, J. S., and Tetlock, P. E. 1999. “Accounting for the Effects of Accountability.” Psychological Bulletin 125(2), pp. 255-275.

Rossi, P., and Anderson, A. 1982. “The Factorial Survey Approach: An Introduction.” In P. Rossi and S. Nock (eds.), Measuring Social Judgements. Beverly Hills, CA: Sage.

Rubenstein, S. 2008. Wall Street Journal (Eastern edition), New York, NY, Apr. 29, p. D1.

Saltzer, J. H. 1974. “Protection and the Control of Information Sharing in Multics.” Communications of the ACM 17(7).

Siponen, M., and Vance, A. Forthcoming. “Neutralization: New Insights into the Problem of Employee Information Systems Security Policy Violations.” MIS Quarterly.

Tadmor, C., and Tetlock, P. 2009. “Accountability.” In The Cambridge Dictionary of Psychology. Cambridge, UK: Cambridge University Press.

TIGTA: Treasury Inspector General for Tax Administration. 2009. Semiannual Report to Congress. Accessed at http://www.treas.gov/tigta/semiannual/semiannual_mar2009.pdf, 10/5/2009.

U.S. Department of the Treasury. 2004. “The Audit Trail System for Detecting Improper Activities on Modernized Systems Is Not Functioning.” Internal memorandum. Accessed at http://www.ustreas.gov/tigta/auditreports/2004reports/200420135fr.pdf, 10/5/2009.

Zuboff, S. 1988. In the Age of the Smart Machine: The Future of Work and Power. New York: Basic Books.
