OECD REVIEW OF SURVEYS ON THE DIGITAL SECURITY RISK MANAGEMENT PRACTICES OF BUSINESSES Benjamin C. Dean OECD Expert Workshop on Improving the Measurement of Digital Security Incidents and Risk Management Swiss Re Centre for Global Dialogue, Switzerland 13 May 2017
Agenda • Overview of OECD project • Findings of Phase 1 report • Key methodological issues – Epistemological – Definitions, taxonomies – Conceptual • Looking ahead
Our map of ‘cyberspace’ is incomplete
Overview of OECD project • Goals – Establish a measurement framework – Improve the evaluation of policies – Improve digital security risk management practices – Learn from failures and successes – Promote adoption of good practice • In response to calls from: – the 2016 OECD Ministerial Declaration on the Digital Economy (“Cancún Declaration”) – the 2015 OECD Council Recommendation on Digital Security Risk Management for Economic and Social Prosperity (“OECD Security Risk Recommendation”)
Overview of OECD project • Phase 1 – Reviews 14 surveys – Examines challenges to terminology and measurement – Compares some data – Decides on measurement priorities – Aims to achieve consensus on a measurement framework • Phase 2 – Deliver a priority list of core indicators and a questionnaire • Phase 3 – Validate the core indicators either through existing data collections or a separate stand-alone pilot survey
Draft framework for survey analysis Based on ‘2015 OECD Security Risk Framework’ 1. Risk awareness, risk management skills and governance 2. Risk assessment practices 3. Incident impact evaluation 4. Risk reduction practices 5. Risk transfer practices Each has a set of 3-5 objects based on past surveys
Section 1: Stocktaking exercise 14 surveys covered • 9 ‘official’ surveys: – Australia, Canada, Eurostat, Korea, OECD, UK, US (3: DoJ, FBI & NCSA) • 5 ‘non-official’ surveys: – France (2: AIF & CGEPME), FERMA, Ponemon, ICSPA
Section 2: Trends across surveys • Challenges in comparing different: – Concepts, taxonomies, definitions – Survey populations – Countries – Methodological quality – Years • Emphasis placed on surveys with results broken down by enterprise size
Key methodological issues • Epistemological: what can we ‘know’ and measure empirically? • Definitions, taxonomies: what are the components? • Conceptual: how do the components relate to one another?
Key methodological issues: Epistemology • Philosophy of science • Deductive vs inductive • Limits to inferences from inductive methods
Key methodological issues: Definitions and taxonomies • Definitions: – Enterprise size, sample population – Digital security – Incidents • Taxonomies: – Incidents – Technical and governance measures – Skills and training
Key methodological issues: Conceptual 1. Overall: Threat → Vulnerability → Incident → Impact 2. Economic impacts: costs vs. losses 3. Incident types vs. incident impacts
Section 3: Looking ahead Key questions to answer • What are the goals of the stakeholders (policymakers, insurers, businesses)? • Which definitions can/should be standardised? • How to reconcile different conceptual frameworks? • Who is the ideal respondent? • Other indicator-specific issues …