
Policy Briefing

Evaluating Cyber Security Evidence for Policy Advice (ECSEPA)

Cyber security policy making: a framework to assess evidence quality

Cyber security is considered a “Tier 1” risk to National Security. Civil servants across the UK Government are working on policy advice for cyber security – but how they acquire and use evidence to make recommendations is not well understood. This matters because the source and credibility of evidence affect the effectiveness and authority of the judgements made about threats, risks, mitigation and consequences. This briefing sets out findings from research into how evidence is being incorporated into developing effective cyber security policies across UK Government.

The project included three primary activities to assess the effectiveness of cyber security decision-making.

Mapping exercise

We worked with the cyber security policy community to create a map of where cyber security policy development is taking place across government, and what evidence is used by policymakers.2

Evidence Quality Assessment Model (EQAM)

Our framework rates evidence samples relative to each other based on source and credibility, and is designed to help policymakers assess the credibility of their evidence.3

Policy crisis games

We brought together the policy community working across government to take part in a simulated ‘crisis game’. We observed participants’ decision-making processes as they worked in teams to develop solutions to a fictional escalating cyber security crisis.

Use of evidence in cyber security policy making

Evidence-informed policy making is intended to reduce uncertainty in decision making by drawing on rigorously collated information to turn policy goals into reasonable, concrete and achievable outcomes. The quality of evidence used is crucial to the value of advice provided by civil servants to decision makers. The UK hopes to become “the safest place to live and do business online”, as outlined in its National Cyber Security Strategy (NCSS),1 and cyber security is increasingly pervasive across all policy areas. This means it will be ever more important for civil servants to assess, and be confident in, the quality of the evidence they are using. The NCSS seeks to make cyber security part of business-as-usual by embedding it into policy making, regulatory frameworks, business practices, research agendas and institutional structures throughout Government and society.

To achieve this, government departments must work together to share information, views and decision-making processes. The Cabinet Office Strategic Policy Making team described types of evidence with a list of sources reproduced here.4 Previous research has found that in practice, the UK public sector uses a more limited range of evidence, specifically research and statistics, policy evaluation, economic modelling and expert knowledge. Published research is not always used.5


Cyber security policy making in the UK faces novel challenges:6

- The landscape is developing rapidly and cuts across most policy portfolios, whether it concerns maintaining the security of personal health records, defending the national power grid against cyber attacks or preventing online scams and fraud.

- Cyber security is a political issue and evidence can be contradictory, gathered selectively and/or carry specific agendas or goals which could reduce its rigour and reliability. For example, states may prioritise using evidence from within their sovereign borders.

- The stakeholder community involved in meeting the NCSS objectives is vast, with competing priorities and vested interests in particular policy decisions. For example, preventing online scams and fraud involves financial institutions (such as banks and insurers), law enforcement agencies and cyber threat intelligence companies, which all have different priorities, liabilities and regulatory standards.

Types of evidence

- Expert knowledge
- Published research
- Statistics
- Stakeholder consultations
- Previous policy evaluations
- Internet resources
- Outcome of consultations
- Costings of policy options
- Results from statistical and economic modelling

Our interviews with civil servants working in cyber security across UK government departments (including Digital, Culture, Media and Sport and the Home Office) and specialist agencies such as the London Mayor’s Office for Policing and Crime and the National Crime Agency (NCA) found that they use a wide range of sources. However, the quality of this evidence is not always considered or understood, which has consequences for the quality of advice created based on it.

Evidence used by UK civil servants in cyber security

- Research on trends from open source material (such as forums, news articles and newsletters).

- Threat intelligence reports from academics and think tanks; surveys and case studies received from government sources (restricted and unrestricted), and from businesses.

- Intelligence reports from domestic and overseas sister agencies, restricted government information and the Crime Survey for England and Wales.

- Action Fraud and general policing data from the NCA, the Cyber Security Breaches Survey, and Office for National Statistics (ONS) data sources and reports.

- Classified information from law enforcement agencies and the intelligence community.

What makes high quality evidence?

Evidence quality has been discussed and measured in other disciplines. In medicine, for example, the presence of randomised control trials is a key measure of evidence quality (the What Works Centre for Local Economic Growth uses the ‘Maryland Scientific Methods Scale’).7 The Department for International Development aims to help civil servants to understand different types of empirical research evidence, appreciate the principles of high quality evidence, consider how the context of research findings affects the way staff might use them, and understand how to make sense of inconsistent or conflicting evidence.8

The ‘Evidence Quality Assessment Model’

We have developed a framework to assess evidence quality for cyber security, which we hope will help civil servants to provide the best policy advice based on the available evidence. It is designed for civil servants who provide short-term and long-term policy advice to measure the quality of evidence they use and to express the level of confidence they have in that evidence. It can also be used for reflection on the diversity of evidence sources they rely on. The framework positions evidence samples relative to each other based on two dimensions of evidence quality: source and credibility. There are different quality issues associated with data and human sources of evidence, but data sources may be preferred if they are more objective and tangible. The credibility dimension reflects the point that the method and provider of evidence both underpin the quality of the information.


The Evidence Quality Assessment Model

The model positions evidence from less credible to more credible across two types of source: data sources and human sources.

Data sources – less credible: Evidence based on open source data and third-party sites, such as blogs or industry sources.

Example: Kaspersky Lab Global Report. Kaspersky Lab is a multinational cyber security and anti-virus provider headquartered in Moscow, Russia. The report covers security events from around the globe.

Considerations:

- Industry sources can be advantageous to the organisations that collect and publish the data: they can use selected evidence to corroborate their findings, with the assumption that other sources may not publish their own evidence and that non-technical audiences may not understand how the data are collected. Such sources can be biased for commercial advantage and are not peer reviewed.

- ‘Digital evidence’ is subject to easier manipulation.9 Unlike analogue evidence such as ridge patterns for fingerprinting or polymarkers for DNA analysis, editing software exists for almost all types of digital information.

- Data analysis of cyber-attacks is open to interpretation. For example, evaluating the level of sophistication of a cyber-attack has controversially been used as an indicator of the identity of the attacker.10

Data sources – more credible: Evidence based on reliable and regulated sources using rigorous methods.

Example: IBM 2017 report. IBM X-Force Research is a team that monitors and analyses security issues and provides threat intelligence content. This report covers IBM X-Force Research’s findings.

Considerations:

- Transparency around how evidence is collected, processed, stored and handled is essential if it is to be used for policy decisions related to legislation or regulation. It is particularly important for non-technical cyber security policymakers, who may require further transparency to determine the credibility of the evidence.

- Digital forensics (the recovery and investigation of material found in digital devices) is subject to strict chain of custody and preservation procedures.

Human sources – less credible: Testimony obtained through unregulated means, such as media reports and online forums.

Example: BBC article on the main technology events of 2017.

Considerations:

- Bias may affect the credibility of testimony. The news article in the example relies heavily on the opinions of political leaders and acknowledged experts. While experts can be trusted to provide sound advice, individuals with strong political, commercial or ideological views may shape the argument or perspective.

Human sources – more credible: Expert witnesses, subject matter experts, and the intelligence community.

Example: NCSC password security guidance.

Considerations:

- The cyber threat intelligence industry is a major source of information for government agencies and corporations for policy making and decisions about security.

- Geopolitical affiliations can cast a shadow on providers.

- Conflicting evidence among different providers can occur; for example, password advice from market leaders differs from that of the NCSC.


The next iteration of the framework

The use of a framework to measure evidence quality, combined with the rich stakeholder discussions it would enable, could improve policymakers’ understanding of, and decision making in, the field of cyber security. This framework is the first step: we plan to develop the idea further with input from the UK policy-making community experienced in cyber security, to help refine the evidence quality criteria and validate the framework.

Questions for the policy community

- How could a tool for assessing evidence quality change the way you use evidence?

- How could the next iteration of this framework be improved?

- What are the outstanding barriers and challenges to developing good cyber security policy? How can the research community support this?

- The ECSEPA Mapping project identifies where cyber security policy development is taking place across Government, and what evidence is used by policymakers. The map is designed to be used and updated by policymakers. Is this visualisation useful to you? How could it be improved?

We’d love to hear your thoughts on these questions. Please get in touch using the contact details below.

References

1. National Cyber Security Strategy 2016–2021. Available at: https://www.gov.uk/government/publications/national-cyber-security-strategy-2016-to-2021

2. ECSEPA Map. Available at: https://www.riscs.org.uk/ecsepa-map/

3. Hussain A, Shaikh S, Chung A, Dawda S and Carr M (2018). ‘An Evidence Quality Assessment Model for Cyber Security Policymaking’. Critical Infrastructure Protection XII, IFIP.

4. Cabinet Office (1999). Professional Policy Making for the Twenty-First Century. Available at: https://dera.ioe.ac.uk/6320/1/profpolicymaking.pdf

5. Nutley S, Davies H and Walter I (2002). Evidence Based Policy and Practice: Cross Sector Lessons from the UK. Working Paper 9, Social Policy Research and Evaluation, New Zealand.

6. Chung A, Dawda S, Hussain A, Shaikh S and Carr M (2018). ‘Cyber Security: Policy’. In: LR Shapiro and MH Maras (eds), Encyclopedia of Security and Emergency Management. Springer Nature.

7. What Works Centre for Local Economic Growth, The Maryland Scientific Methods Scale. Available at: https://whatworksgrowth.org/resources/the-scientific-maryland-scale

8. DFID (2014). How to Note: Assessing the Strength of Evidence. Available at: https://www.gov.uk/government/publications/how-to-note-assessing-the-strength-of-evidence

9. Chaikin D (2006). ‘Network investigations of cyber attacks: the limits of digital evidence’. Crime, Law and Social Change, 46, 239–256.

10. Guitton C and Korzak E (2013). ‘The Sophistication Criterion for Attribution’. The RUSI Journal, 158(4), 62–68.

Our research

This briefing was produced in partnership with UCL STEaPP’s Policy Impact Unit as part of the work carried out by the ECSEPA project team at UCL and Coventry University. This research has been funded by the Engineering and Physical Sciences Research Council (EPSRC) as part of the ECSEPA project, which is part of the Research Institute for Sociotechnical Cybersecurity (RISCS).

Contact us

Professor Madeline Carr specialises in Global Politics and Cyber Security at UCL and is Director of RISCS. m.carr@ucl.ac.uk

Professor Siraj Shaikh specialises in Systems Security at the Institute for Future Transport and Cities (IFTC) at Coventry University. s.shaikh@coventry.ac.uk

Further details at: riscs.org.uk
