
Introduction: how to use this spreadsheet

The tables in this spreadsheet illustrate the metrics scoring process described by Krag Brotby and Gary Hinson in their book "PRAGMATIC Security Metrics".

This spreadsheet is provided as a tool to accompany and illustrate the concepts laid out in depth in the book. The spreadsheet alone is not sufficient for you to choose suitable information security metrics, nor to design a sound measurement system. The book describes a rational process to score, rank and shortlist candidate metrics, and goes on to cover the complexities of designing, using and maintaining a metrics system. In particular, you need to understand the thinking behind the PRAGMATIC scoring criteria, and hopefully to appreciate the reasons why we have scored the example metrics as stated herein.

The example metrics and their PRAGMATIC scores in this spreadsheet are purely illustrative, demonstrating the PRAGMATIC scoring process in practice. In determining the scores, we have made lots of assumptions about how the metrics would be viewed by management in a fictional generic mid-sized commercial organization, "ACME Inc.", settling the scores by discussing them and resolving our differences (mostly!). The PRAGMATIC scores make sense to us, but you are free to disagree with us. Seriously, consider and challenge these examples, preferably in conjunction with the book: on that we insist.

Do not rely solely on these example metrics or the scores, not even if you happen to work for a generic mid-sized commercial organization and agree 100% with us, which is highly unlikely. Don't just meekly swallow our scores. Develop a set of candidate information security metrics and score them in your own specific organizational context. Be creative. Develop your own metrics, and steal good metrics ideas from wherever you find inspiration (even outside the information security field). Use whatever structures make sense to you. Discuss, score and refine the scores with your peers and colleagues. Argue about them. Get passionate about information security and metrics. Benefit as much from the process as from the output. We hope you'll enjoy the game even more than the final result.

Using this spreadsheet and the example metrics simplistically, without the broader context from the book, is likely to be inefficient and ineffective, and may well harm rather than help your information security. Read the book to take a systematic view of your security metrics, specifying metrics that are useful at your current stage and that drive improvements to your information security arrangements. The systematic aspect goes beyond simply picking metrics from a list: there are maturity and other factors to consider. Trust us, this is not nearly as easy as it appears. Use the book.

And don't say we didn't warn you.
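To make the arithmetic behind the tables concrete: each metric is rated from 0 to 100 against the nine PRAGMATIC criteria (Predictiveness, Relevance, Actionability, Genuineness, Meaningfulness, Accuracy, Timeliness, Independence and Cost), the overall PRAGMATIC score is the unweighted mean of those nine ratings (as the ranking table's title indicates), and the second table simply re-sorts the metrics by that overall score. The sketch below shows the calculation in Python, our stand-in for the original spreadsheet formulas; the function and variable names are illustrative, and the two data rows are copied from the tables in this sheet.

    # Minimal sketch of the PRAGMATIC scoring arithmetic (assumed Python
    # stand-in for the spreadsheet formulas; names are illustrative).

    CRITERIA = ("Predictiveness", "Relevance", "Actionability", "Genuineness",
                "Meaningfulness", "Accuracy", "Timeliness", "Independence", "Cost")

    def pragmatic_score(ratings):
        """Overall PRAGMATIC score (%): the unweighted mean of nine 0-100 ratings."""
        if len(ratings) != len(CRITERIA):
            raise ValueError("expected one rating per PRAGMATIC criterion")
        return round(sum(ratings) / len(ratings))

    # Two rows copied from the sheet: (ref, metric, nine criterion ratings)
    metrics = [
        ("4.1", "Security risk management maturity",
         (92, 98, 68, 78, 90, 83, 89, 84, 92)),
        ("6.1", "Quality of security metrics in use",
         (96, 91, 99, 92, 88, 94, 89, 79, 95)),
    ]

    # Ranking, as in the second table: sort by overall score, highest first.
    for ref, name, ratings in sorted(metrics,
                                     key=lambda m: pragmatic_score(m[2]),
                                     reverse=True):
        print(f"{ref}  {name}: {pragmatic_score(ratings)}%")

    # Prints:
    # 6.1  Quality of security metrics in use: 91%
    # 4.1  Security risk management maturity: 86%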


Example PRAGMATIC-scored information security metrics,
structured according to ISO/IEC 27002:2005

Metric ref #  Risk management metric

4.1 Security risk management maturity

4.2 Number of high/medium/low risks currently untreated/unresolved

4.3 Information security budget variance

4.4 Process/system fragility or vulnerability

4.5 Number of unpatched technical vulnerabilities

4.6 Information security risk scores

4.7 Total liability value of untreated/residual risks

4.8 Coupling index

4.9 Changes in network probe levels

4.10 Organizational and technical homogeneity

4.11 % of controls working as defined


4.12 Organization's insurance coverage versus annual premiums

4.13 Number of attacks

Security policy metric


5.1 Number of security policies, standards, procedures and metrics with committed owners

5.2 Security policy management maturity

5.3 Traceability of policies, control objectives, standards & procedures

5.4 Number of important operations with documented & tested security procedures

5.5 Comprehensiveness of security policy coverage

5.6 Policy coverage of frameworks such as ISO/IEC 27002

5.7 Number or % of security policies addressing viable risks

5.8 Quality of security policies

5.9 % of policy statements unambiguously linked to control objectives

5.10 Thud factor (policy verbosity/red tape index, waffle-o-meter)


5.11 Number of security policies whose review/reapproval is overdue

5.12 Flesch readability scores for policies, procedures, standards and guidelines

5.13 Number or % of security policies that are clear

5.14 % of security policies that satisfy documentation standards

5.15 Number of security policies that are inconsistent with other policies or obligations

Management/governance metric

6.1 Quality of security metrics in use

6.2 % of security controls that may fail silently

6.3 Security governance maturity

6.4 Information security ascendency

6.5 % of controls unambiguously linked to control objectives

6.6 Number of controls meeting defined control criteria/objectives

6.7 % of critical controls consistent with controls policy


6.8 Corporation's economic situation

6.9 % of controls that are ossified or redundant

6.10 Control objectives tied to specific business objectives

6.11 Days since the last serious information security incident

6.12 Annual cost of information security controls

6.13 Number of different controls

6.14 Extent of accountability for information assets

6.15 Information security expenditure

6.16 Benford's law

6.17 NPV (Net Present Value)

6.18 ROI (Return On Investment)

6.19 IRR (Internal Rate of Return)

6.20 Payback period

6.21 Information Security Management customer satisfaction rating
6.22 Information security controls coverage

6.23 DEFCON level

6.24 Controls consistency

6.25 Scope of information security activities

6.26 VAR (Value At Risk)

6.27 ROSI (Return on Security Investment)

6.28 Security budget as % of IT budget or turnover

Information asset management metric
7.1 Number of orphaned information assets without an owner

7.2 Information asset management maturity

7.3 % of information assets not [correctly] classified

7.4 Unowned information asset days

7.5 Integrity of the information asset inventory


7.6 Value of information assets owned by each Information Asset Owner

7.7 % of information assets not marked with the [correct] classification

Human resources security metric
8.1 Human resources security maturity

8.2 Security awareness level

8.3 Rate of change in employee turnover and/or absenteeism

8.4 Staff morale & attitude

8.5 Tone at the top

8.6 Corporate security culture

8.7 System accounts-to-employees ratio

8.8 Opinion surveys and direct observations of the culture

8.9 Help desk security traffic volumes

8.10 Culture / world view


8.11 Employee turn versus account churn

8.12 Organizational dysfunction

8.13 Psychometrics

Physical & environmental security metric

9.1 Power consumed by the computer suite versus air conditioning capacity

9.2 Physical and environmental security maturity

9.3 Discrepancies between physical location and logical access location

9.4 Number of unsecured access points

9.5 Number of unacceptable physical risks on premises

9.6 Distance between employee and visitor parking

9.7 % of facilities that have adequate external lighting

IT security metric

10.1 IT security maturity


10.2 % of systems checked and fully compliant to applicable (technical) security standards

10.3 Time from change approval to change

10.4 Correlation between system/configuration logs and authorized change requests

10.5 % of IT devices not securely configured

10.6 Rate of change of emergency change requests

10.7 % of highly privileged/trusted users or functions

10.8 Entropy of encrypted content

10.9 % of IT/process changes abandoned, backed-out or failed for information security reasons

10.10 Vulnerability index

10.11 Delays and inconsistencies in patching

10.12 Perceptions of rate of change in IT

10.13 Patching policy compliance

10.14 Number of changes

10.15 Number of viruses detected in user files


10.16 Number of firewall rules changed

10.17 Toxicity rate of customer data

Access control metric


11.1 Rate of messages received at central access logging/alerting system

11.2 Information access control maturity

11.3 Days since logical access control matrices for application systems were last reviewed

11.4 % of inactive user accounts that have been disabled in accordance with policy

11.5 Rate of detection of access anomalies

11.6 Logical access control matrices for applications: coverage and detail

11.7 Logical access control matrices for applications: state of development

11.8 Quality of identification and authentication controls

11.9 % of business units that have proven their identification and authentication mechanisms

11.10 Number of times that assets were accessed without authentication or validation
Software security metric

12.1 Software security maturity

12.2 % of controls tested realistically

12.3 Software quality assurance

12.4 Quality of system security revealed by testing

12.5 Extent to which information security is incorporated in software QA

12.6 Extent to which QA is incorporated in information security processes

12.7 % of configuration items in line with service levels for performance and security

12.8 % of technical controls that fail-safe

12.9 Number of deviations identified between configuration repository and actual asset configurations

Incident management metric

13.1 Information security incident management maturity

13.2 Time taken to remediate security incidents


13.3 Time lag between incident and detection

13.4 % of incidents for which root causes have been diagnosed and addressed

13.5 Cumulative costs of information security incidents to date

13.6 Number of information security events and incidents, major and minor

13.7 Number of information security incidents that could have been prevented, mitigated or avoided

13.8 Non-financial impacts of incidents

Business continuity metric

14.1 Coverage of business impact analyses

14.2 Business continuity management maturity

14.3 % of critical business processes having adequate business continuity arrangements

14.4 % of business processes having defined RTOs and RPOs

14.5 Business continuity plan maintenance status

14.6 Disaster recovery test results


14.7 Uptime

14.8 IT capacity and performance

14.9 Mapping critical business processes to disaster recovery and business continuity plans

14.10 Business continuity expenditure

14.11 % of critical systems reviewed for compliance with critical control requirements

Compliance & assurance metric

15.1 Information security compliance management maturity

15.2 Breakdown of exceptions and exemptions

15.3 Number and severity of findings in audit reports, reviews, assessments etc.

15.4 Status of compliance with externally-imposed information security obligations

15.5 Historic consequences of noncompliance

15.6 Number of systems whose security has been accredited

15.7 Status of compliance with internally-mandated (corporate) information security requirements

15.8 Number of unapproved/unlicensed software installations identified on corporate IT equipment

15.9 % of security policies supported by adequate compliance activities

15.10 Compliance benchmark against peers

15.11 Number or rate of security policy noncompliance infractions detected

15.12 Embarrassment factor

15.13 % of purchased software that is unauthorized

15.14 Proportionality of expenditure on assurance versus potential impact x likelihood

15.15 % of software licenses purchased but not accounted for in repository

15.16 % of critical information assets residing on fully compliant systems
Example PRAGMATIC-scored information security metrics,
structured according to ISO/IEC 27002:2005

S/M/O  P  R  A  G  M  A  T  I    Note

Scored using a maturity scale with scoring


SM 92 98 68 78 90 83 89 84
indicators (see appendix H)

Measures the overall effectiveness of risk


SMO 87 87 84 81 89 80 87 83
management processes

Changes to the infosec budget are likely to


affect what can realistically be achieved, but M 70 90 85 77 80 77 80 90
specific differences can't be predicted

Contrast results of theoretical assessments with


actual performance as a measure of risk and SMO 90 90 44 80 92 77 66 60
security management

The unknown ones are arguably more worrying


MO 80 64 80 70 80 75 25 85
but we can't count them!

Most RA methods generate some sort of


scoring output that is useful to compare SM 72 60 55 70 71 40 60 60
whatever is analyzed

Assumes we can estimate the liability (potential


SM 88 98 59 33 96 33 77 38
impact x likelihood) that remains untreated

The amount of coupling between systems and


processes has an impact on the risk of cascade S 68 85 50 60 72 47 35 61
failures ('the domino effect')

Exploratory probes often precede attacks, so


may give an early warning (for various values of MO 50 80 10 68 66 85 50 70
'early')

Homogeneity or heterogeneity impacts on


aggregated risk and can be measured using MO 67 70 40 59 67 50 33 65
scoring scales

Presumes controls are being systematically


MO 62 62 44 26 66 25 22 36
tested/proven
If management spends $50k pa to provide $1m
in business interruption insurance, the risk S 64 46 5 25 20 16 10 82
appetite has been partially quantified

"Attack" would have to be carefully defined;


M 13 9 1 2 12 1 4 1
ignores differing impacts, and accidents

S/M/O  P  R  A  G  M  A  T  I    Note

To achieve effective accountability and


M 81 87 90 95 92 92 77 92
responsibility requires an identified "owner"

Scored using a maturity scale with scoring


SM 90 95 70 80 88 85 90 82
indicators (see appendix H)

They should all be explicitly/directly interlinked


M 85 89 88 90 91 87 65 84
and/or cross-referenced

Procedural documentation captures knowledge


but must be tested for accuracy; essential for MO 95 96 91 85 95 84 62 90
business continuity

Standards must be based on and clarify policy S 75 82 92 78 80 70 73 60

Policy coverage metric implies management


MO 70 75 90 69 85 76 72 65
has a reasonable idea of the policy landscape

A measure of completeness of governance


M 65 76 91 73 83 77 70 61
structure

Having pre-agreed criteria or standards for


MO 80 85 40 66 72 75 80 80
quality, readability etc. helps!

Confirms that detailed policy statements


M 92 91 64 60 85 65 45 75
reflect/satisfy identified control objectives

Policies are one form of control addressing


risks: if there is no viable risk, why have a M 82 80 60 60 70 45 85 86
policy?
Presumably ranges from "none" to "we live by
it" but very subjective unless supported by O 54 88 92 14 97 77 43 90
criteria

The volume level in decibels when the policy


suite is dropped from one meter negatively MO 68 77 60 86 35 70 64 88
correlates to its effectiveness :-)

Drives the policy review process; assumes


O 75 70 68 41 96 50 56 90
policies have defined review/reapproval dates

Ideal scores likely to vary between those types


MO 66 47 79 45 74 38 44 50
of document

… as perceived by the intended audiences O 60 49 76 43 88 45 41 43

S/M/O  P  R  A  G  M  A  T  I    Note

Measured using metametrics similar to


SM 96 91 99 92 88 94 89 79
PRAGMATIC

Controls that fail without it being obvious are a


SMO 90 90 90 90 90 93 86 93
liability, increasing security risk

Scored using a maturity scale with scoring


SM 95 97 70 78 91 89 90 85
indicators (see appendix H)

Position of security in organization very


S 97 87 15 94 86 90 99 97
indicative of authority, culture and effectiveness

ISO27k approach: without clear objectives, how


would you determine whether the controls are M 92 91 64 60 85 65 45 75
adequate?

Either objectives are incorrect or control


MO 88 86 88 65 78 60 26 90
inadequate - binary measure

Without a controls policy, implementation is


likely to be ad hoc, contradictory and SM 83 92 80 83 89 82 32 70
inconsistent
Financial stress generally results in increased
SM 72 80 10 80 80 80 61 80
risk

“It’s the way we’ve always done it: put the pink
copy in the green file …"; gathering the data SM 85 88 85 80 84 75 22 62
has spin-off benefits

Controls that do not support specific business


SM 96 95 65 55 99 50 40 70
objectives may not be needed

"Serious" needs to be defined by severity


M 62 70 11 87 87 10 92 95
criteria

Difficult to determine accurately but a very


interesting metric, especially if analyzed by SM 94 92 90 77 97 44 50 16
different categories of expense

Fewer effective system-wide controls are more


efficient: too few lose granularity, too many are too SM 71 75 72 75 88 30 50 65
costly

Good indicator of security maturity S 94 93 78 36 72 76 30 40

Compare against previous years’ figures and


projections for future years, and against various SM 82 94 60 60 89 29 33 49
other overheads 

Predictability of first digit of random numbers O 84 30 53 95 11 98 62 98

Standard accounting technique reflects the time


M 77 72 25 35 85 55 44 60
value of money

Standard accounting technique M 70 72 25 30 82 50 44 60

Standard accounting technique M 69 72 25 30 82 50 44 60

Standard accounting technique M 65 72 25 25 88 50 44 60

Effectiveness can be compromised by low


SM 60 60 40 35 85 51 85 15
scores / perception of lack of value
Source: Scott Berinato. Measure with a
MO 87 89 65 40 74 35 46 40
maturity scale or ISO27K checklist

External threats may pose organizational risk M 5 10 30 85 25 71 88 90

Standardized controls for same things, follows


M 78 83 67 60 71 33 27 31
controls policy consistently

Does scope include SCADA/ICS, facilities,


production systems, elevators, archives, S 86 74 35 44 70 37 30 44
backups, business continuity etc.?

Standard risk management technique M 70 65 20 30 35 40 30 30

ROI as applied to security investments, using


M 40 40 20 20 55 45 25 40
ALE

Possible comparator between business units or


organizations but wide variety of possible M 13 3 16 2 2 0 4 18
reasons for the differences

S/M/O  P  R  A  G  M  A  T  I    Note
Information asset owners are accountable for
their adequate protection; orphaned assets are M 85 90 97 90 90 95 85 99
less likely to be properly secured

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90 85
indicators (see appendix H)

Absent asset classification, there is risk of


under-protecting or the cost of over-protecting MO 75 75 97 85 90 80 80 80
assets; assumes a comprehensive inventory

Counted from an arbitrary start or asset


creation date: speeds up the nomination of MO 40 51 84 77 74 86 92 94
owners for all (significant) information assets

Measure the completeness, accuracy and


MO 82 66 83 78 80 43 50 66
up-to-date-ness of the inventory
Assumes there is an information assets
M 48 64 78 57 79 38 50 22
valuation process, which has spin-off benefits

Presumes that labels are meaningful and


O 52 53 63 44 62 13 17 87
correct

S/M/O  P  R  A  G  M  A  T  I    Note

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90 85
indicators (see appendix H)

Measured using surveys, scales, tests etc. MO 86 89 86 82 85 80 69 48

Big changes (especially sudden increases)


suggest potentially increased security risks and SM 60 66 20 85 60 80 75 80
so are probably worth investigating

Difficult to measure objectively without testing,


SM 88 72 60 75 65 75 20 50
yet important for security and fraud

Difficult to measure objectively, yet an important


driver for security and governance throughout SM 95 50 57 40 91 45 50 25
the organization

In the hands of competent psychometricians,


SM 60 76 55 75 60 60 10 75
OCAI scores are highly predictive of behaviors

Measures personal accountability; will stimulate


an interesting discussion about what the 'ideal' MO 74 67 38 39 68 42 36 83
value should be!

Culture can be improved by focusing attention


SM 80 80 60 55 75 55 10 45
using surveys.

Assumes that security support calls are


routinely identified as such by the Help Deskers' O 24 33 16 58 5 35 33 45
call logging system

Difficult to measure objectively without testing,


SM 66 30 10 70 40 56 15 40
yet important for security and fraud
Any differences between the rates of change
should be investigated but may be legitimate; O 30 30 11 36 44 36 62 57
may need to count contractors separately

Difficult to measure objectively, yet symptomatic


SM 75 20 10 60 80 40 15 10
of likely security risks within the organization

Can be used to check the extent to which


MO 40 24 0 79 15 55 10 42
employees are suited to their roles

S/M/O  P  R  A  G  M  A  T  I    Note
Virtually all the power ends up as heat, so it is
important to track power consumed against the O 81 69 89 92 80 99 98 90
air conditioning capacity

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90 85
indicators (see appendix H)

Inconsistencies between logical and physical


M 75 76 72 90 82 75 85 83
locations of employees are a concern

Locate (and perhaps rank) vulnerabilities


MO 95 80 90 70 85 77 45 75
through tiger team/penetration testing

Useful way to highlight issues but a rather


M 70 60 85 60 90 60 30 60
subjective and coarse measure

Assumes there is a perceived need for


O 1 0 6 93 2 93 66 45
segregation

Measures a single security control O 2 5 70 42 11 46 35 18

S/M/O  P  R  A  G  M  A  T  I    Note

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90 85
indicators (see appendix H)
Measures of the technical compliance
O 81 77 89 86 89 73 74 78
processes

Approval suggests proper risk evaluation


process ergo reducing the time needed to M 70 71 76 90 60 84 64 60
achieve changes is likely to be beneficial

This is a process compliance metric in the


context of IT change management: M 87 80 90 80 80 80 60 50
unauthorized changes may be incidents

Assumes there are technical security


configuration standards and systematic O 83 80 77 75 59 74 76 88
assessment (e.g. the CIS benchmarks)

A peak would suggest a cluster of serious


issues and possibly a process failure if the MO 64 71 69 73 78 70 70 69
changes are not really emergencies

Indicative of the organization's control over


privileged access: potentially a very granular MO 86 80 51 40 65 39 55 95
information-rich metric

Measured by sampling and looking for


O 78 66 23 78 3 93 74 79
patterns/uneven distributions

Measures procedural deficiencies: these


changes should have been blocked by change MO 50 70 55 60 65 40 50 45
testing and authorization processes

Could be measured using automated analyses


and/or penetration tests, and compared across O 74 85 71 74 60 32 46 33
systems

Would have to be carefully defined O 43 41 77 62 36 32 48 34

This is clearly a highly subjective measure but


extreme values (whether high or low) suggest M 40 50 6 65 70 50 30 14
increased information security risks

Compare actual patch status of systems against


the policy on patching & vulnerability O 66 52 55 77 19 36 11 8
management

… in the reporting period, a crude activity


O 55 24 9 6 2 3 15 26
measure loosely correlated with security

Let's be generous and assume this actually


O 8 13 6 11 3 2 5 5
covers all forms of malware
Would have to be carefully defined O 2 1 1 10 2 33 14 4

Don't ask us what this means: this extremely


obscure metric was suggested by someone O 0 0 0 0 0 0 0 0
else!

S/M/O  P  R  A  G  M  A  T  I    Note
A 'heartbeat' metric: the sudden unexpected
lack of data from a system is probably a O 87 88 94 93 93 94 97 89
security incident

Scored using a maturity scale with scoring


SM 90 95 70 80 90 80 90 85
indicators (see appendix H)

Indicates how effective are the processes to


MO 55 80 95 30 80 85 60 70
manage/maintain access rights

During the reporting period MO 68 56 74 76 73 64 64 52

Correlate turnstile/card accesses, network


activity (NIDS), system logins, plus application- O 83 86 65 75 70 52 44 61
level records; process metric

Comparative metric highlights good and bad


MO 60 70 65 70 78 68 50 50
examples

Maturity metric highlights systems without


MO 70 50 60 60 88 25 40 20
effective access controls

Could incorporate password length &


complexity, relative strengths of I&A M 60 87 40 40 56 36 41 22
mechanisms etc.; should reflect risk though

During the reporting period M 69 73 72 32 36 4 56 2

How would you know this? O 61 78 33 16 33 0 44 35


S/M/O  P  R  A  G  M  A  T  I    Note

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90 85
indicators (see appendix H)

Controls degrade over time, untested controls


may not operate as expected, meet control M 92 95 90 65 95 60 75 55
objectives and address risks

Difficult to define the metric, but an important


parameter to measure & improve across all M 83 85 91 73 90 68 70 80
system developments

Difficult to define the metric, but an important


parameter to measure & improve for each IT M 83 88 83 73 90 68 80 82
system

Security needs to be considered in all SDLC


stages - affects feasibility, design, M 85 80 67 62 70 50 35 35
implementation etc.

Absent effective QA, information security is


M 75 70 66 61 80 50 35 36
unlikely to be effective, efficient and consistent

Includes service levels for availability, of course MO 60 75 65 62 40 70 35 80

Controls that fail, especially silently, can be very


risky in certain circumstances, hence this is M 59 55 66 78 77 33 20 48
highly context-dependent

The number and magnitude of deviation would


MO 50 60 60 64 40 50 40 60
be more indicative of risk

S/M/O  P  R  A  G  M  A  T  I    Note

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90 85
indicators (see appendix H)

Simple yet effective measure of incident


M 82 69 85 76 80 75 65 75
response capability
The times are not always clear-cut, but good
MO 80 70 72 30 75 50 50 65
forensics can help

Failure to determine root cause invites


M 85 85 67 40 77 40 48 16
recurrence

Useful for business case and security budgeting SM 76 85 0 30 95 30 33 40

Steal a dime, steal a dollar: even minor frauds


or incidents are indicative of untreated threats SMO 70 60 0 50 72 35 35 70
and vulnerabilities

Subjective and ill-defined metric; incident


M 50 75 0 15 85 5 16 9
reports are more useful

Subjective and difficult to quantify SM 60 65 0 20 60 6 30 20

S/M/O  P  R  A  G  M  A  T  I    Note

Implies there is a comprehensive map of


SM 95 90 99 90 95 80 86 80
business processes

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90 87
indicators (see appendix H)

Assumes the organization has identified its


critical business processes, and that BC M 85 97 93 84 89 75 85 85
adequacy can be determined

RTO & RPO are basic BCM parameters;


perhaps change 'defined' to 'approved' if M 88 99 90 68 93 68 92 68
random values may be assigned

Count the number of plans that have not been


reviewed/tested when planned, or the days MO 75 75 90 73 84 76 80 77
overdue

High assurance can only be achieved by highly


SMO 83 80 85 91 92 75 75 81
realistic (ideally live) testing
A classic IT performance metric directly related
to availability but often politically defined MO 84 97 66 78 94 61 79 47
and determined

Comprises a whole family of more specific


metrics, collectively indicating how close to the SMO 92 92 82 77 96 62 84 64
red line IT is being run

This should generate the map or landscape


used by metric 14.1 and numerous BCM SM 85 92 79 81 90 70 75 40
activities

Assumes the costs relating to resilience,


disaster recovery and contingency are tracked SM 75 92 20 82 95 70 70 70
reasonably accurately

Only measures the assessment O 62 53 68 36 5 69 34 43

S/M/O  P  R  A  G  M  A  T  I    Note

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90 85
indicators (see appendix H)

Analysis of unauthorized and authorized


(respectively) non-compliance to policies, M 87 83 84 94 81 83 84 87
procedures, standards, laws, regs etc.

May be numeric, textual or both (annotated


scores); may include internal, external and SM 79 89 87 96 92 84 30 96
certification audits, even reviews etc.

Avoiding corporate and/or personal liabilities is


SMO 77 85 85 70 98 68 35 89
a strong management driver

Historical trends and common factors generally


S 70 80 72 82 80 80 20 67
indicate systematic security issues

Assumes a formal accreditation process M 72 79 73 89 68 32 22 89

Lack of compliance means increased risk,


incorrect requirements or inadequate SMO 75 75 73 63 65 58 40 40
compliance procedures
Assumes a compliance checking process is
running; could be analyzed by business unit for M 58 55 82 73 86 47 64 66
additional leverage

Hinges on the definition of 'adequate' M 96 92 78 40 75 33 60 34

In regulated sectors, indicative of risk of


regulatory sanctions; also indicative of cost SM 80 65 62 61 90 60 22 65
effectiveness of infosec

Simply identify and count the number of policy


noncompliance incidents each reporting period O 55 64 75 50 68 34 59 76
to drive up compliance ...

Number of embarrassing privacy breaches &


other information security incidents that become SM 26 38 20 50 63 72 40 87
public knowledge per year

Indicates a severe gap in the procurement


M 71 51 90 75 82 35 13 20
processes!

Measures whether or not the organization is


focusing assurance activities on the risks that M 65 40 85 40 3 20 46 76
actually matter; confusing definition

A very narrow (specific) metric with negligible


MO 1 1 90 84 1 70 50 81
relevance to information security

Assumes we know which assets are critical and


which systems are compliant, but compliant M 48 26 36 41 56 13 19 46
does not necessarily mean secure
C Score

92 86%

90 85%

95 83%

22 69%

52 68%

50 60%

10 59%

42 58%

40 58%

45 55%

22 41%
94 40%

7 6%

C Score

90 88%

88 85%

85 85%

60 84%

81 77%

85 76%

78 75%

80 73%

75 72%

84 72%
89 72%

41 65%

34 64%

35 53%

12 51%

C Score

95 91%

80 89%

90 87%

99 85%

75 72%

70 72%

35 72%
79 69%

39 69%

40 68%

95 68%

20 64%

43 63%

37 62%

59 62%

23 62%

88 60%

88 58%

88 58%

90 58%

80 57%
30 56%

91 55%

27 53%

45 52%

22 38%

30 35%

88 16%

C Score

90 91%

90 86%

80 82%

82 76%

70 69%
26 51%

44 48%

C Score

90 86%

75 78%

91 69%

50 62%

70 58%

20 55%

44 55%

10 52%

95 38%

10 37%
20 36%

5 35%

5 30%

C Score

98 88%

90 86%

60 78%

55 75%

42 62%

66 41%

31 29%

C Score

90 86%
70 80%

80 73%

47 73%

36 72%

83 72%

60 63%

34 59%

60 55%

19 55%

42 46%

40 41%

5 37%

67 23%

78 15%
17 9%

0 0%

C Score

79 90%

90 86%

80 71%

75 67%

11 61%

20 59%

40 50%

42 47%

50 44%

33 37%
C Score

90 86%

60 76%

20 73%

10 73%

50 59%

50 58%

20 56%

10 50%

20 49%

C Score

90 86%

60 74%
65 62%

40 55%

55 49%

50 49%

42 33%

17 31%

C Score

88 89%

90 86%

75 85%

90 84%

93 80%

60 80%
89 77%

29 75%

40 72%

70 72%

33 45%

C Score

90 86%

88 86%

36 77%

60 74%

65 68%

88 68%

70 62%
17 61%

30 60%

10 57%

33 57%

87 54%

6 49%

35 46%

30 45%

12 33%
Example information security metrics,
ranked by PRAGMATIC score (unweighted)

Rank Ref Example metric

1 6.1 Quality of security metrics in use

2 7.1 Number of orphaned information assets without an owner

3 11.1 Rate of messages received at central access logging/alerting system

4 14.1 Coverage of business impact analyses

5 6.2 % of security controls that may fail silently

6 5.1 Number of security policies, standards, procedures and metrics with committed owners

7 9.1 Power consumed by the computer suite versus air conditioning capacity

8 6.3 Security governance maturity

9 14.2 Business continuity management maturity

10 10.1 IT security maturity

11 12.1 Software security maturity


12 13.1 Information security incident management maturity

13 15.1 Information security compliance management maturity

14 7.2 Information asset management maturity

15 8.1 Human resources security maturity

16 9.2 Physical and environmental security maturity

17 4.1 Security risk management maturity

18 15.2 Breakdown of exceptions and exemptions

19 11.2 Information access control maturity

20 4.2 Number of high/medium/low risks currently untreated/unresolved

21 14.3 % of critical business processes having adequate business continuity arrangements

22 5.2 Security policy management maturity

23 5.3 Traceability of policies, control objectives, standards & procedures

24 6.4 Information security ascendency


25 5.4 Number of important operations with documented & tested security procedures

26 14.4 % of business processes having defined RTOs and RPOs

27 4.3 Information security budget variance

28 7.3 % of information assets not [correctly] classified

29 14.5 Business continuity plan maintenance status

30 14.6 Disaster recovery test results

31 10.2 % of systems checked and fully compliant to applicable (technical) security standards

32 8.2 Security awareness level

33 9.3 Discrepancies between physical location and logical access location

34 14.7 Uptime

35 5.5 Comprehensiveness of security policy coverage

36 15.3 Number and severity of findings in audit reports, reviews, assessments etc.

37 12.2 % of controls tested realistically


38 5.6 Policy coverage of frameworks such as ISO/IEC 27002

39 7.4 Unowned information asset days

40 14.8 IT capacity and performance

41 5.7 Number or % of security policies addressing viable risks

42 9.4 Number of unsecured access points

43 13.2 Time taken to remediate security incidents

44 15.4 Status of compliance with externally-imposed information security obligations

45 12.3 Software quality assurance

46 5.8 Quality of security policies

47 12.4 Quality of system security revealed by testing

48 10.3 Time from change approval to change

49 10.4 Correlation between system/configuration logs and authorized change requests

50 14.9 Mapping critical business processes to disaster recovery and business continuity plans

51 5.10 Thud factor (policy verbosity/red tape index, waffle-o-meter)

52 5.9 % of policy statements unambiguously linked to control objectives

53 6.5 % of controls unambiguously linked to control objectives

54 6.6 Number of controls meeting defined control criteria/objectives

55 10.5 % of IT devices not securely configured

56 10.6 Rate of change of emergency change requests

57 6.7 % of critical controls consistent with controls policy

58 14.10 Business continuity expenditure

59 5.11 Number of security policies whose review/reapproval is overdue

60 11.3 Days since logical access control matrices for application systems were last reviewed

61 6.9 Corporation's economic situation

62 4.4 Process/system fragility or vulnerability

63 6.10 % of controls that are ossified or redundant


64 7.5 Integrity of the information asset inventory

65 8.3 Rate of change in employee turnover and/or absenteeism

66 15.5 Historic consequences of noncompliance

67 15.6 Number of systems whose security has been accredited

68 4.5 Number of unpatched technical vulnerabilities

69 6.11 Control objectives tied to specific business objectives

70 6.12 Days since the last serious information security incident

71 11.4 % of inactive user accounts that have been disabled in accordance with policy

72 5.12 Flesch readability scores for policies, procedures, standards and guidelines

73 5.13 Number or % of security policies that are clear

74 6.13 Annual cost of information security controls

75 10.7 % of highly privileged/trusted users or functions

76 6.14 Number of different controls


77 15.7 Status of compliance with internally-mandated (corporate) information security requirements

78 13.3 Time lag between incident and detection

79 9.5 Number of unacceptable physical risks on premises

80 6.15 Extent of accountability for information assets

81 6.16 Information security expenditure

82 8.4 Staff morale & attitude

83 6.17 Benford's law

84 15.8 Number of unapproved/unlicensed software installations identified on corporate IT equipment

85 11.5 Rate of detection of access anomalies

86 6.18 NPV (Net Present Value)

87 4.6 Information security risk scores

88 15.9 % of security policies supported by adequate compliance activities

89 12.5 Extent to which information security is incorporated in software QA
90 4.7 Total liability value of untreated/residual risks

91 11.6 Logical access control matrices for applications: coverage and detail

92 10.8 Entropy of encrypted content

93 12.6 Extent to which QA is incorporated in information security processes

94 8.5 Tone at the top

95 6.19 ROI (Return On Investment)

96 4.8 Coupling index

97 6.20 IRR (Internal Rate of Return)

98 4.9 Changes in network probe levels

99 6.21 Payback period

100 15.10 Compliance benchmark against peers

101 15.11 Number or rate of security policy noncompliance infractions detected

102 6.8 Information Security Management customer satisfaction rating

103 12.7 % of configuration items in line with service levels for performance and security

104 6.22 Information security controls coverage

105 13.4 % of incidents for which root causes have been diagnosed and addressed

106 4.10 Organizational and technical homogeneity

107 10.9 % of IT/process changes abandoned, backed-out or failed for information security reasons

108 6.23 DEFCON level

109 10.10 Vulnerability index

110 8.6 Corporate security culture

111 8.7 System accounts-to-employees ratio

112 15.12 Embarrassment factor

113 5.14 % of security policies that satisfy documentation standards

114 6.24 Controls consistency

115 8.8 Opinion surveys and direct observations of the culture


116 6.25 Scope of information security activities

117 7.6 Value of information assets owned by each Information Asset Owner

118 5.15 Number of security policies that are inconsistent with other policies or obligations

119 11.7 Logical access control matrices for applications: state of development

120 12.8 % of technical controls that fail-safe

121 12.9 Number of deviations identified between configuration repository and actual asset configurations

122 13.5 Cumulative costs of information security incidents to date

123 15.13 % of purchased software that is unauthorized

124 13.6 Number of information security events and incidents, major and minor

125 7.7 % of information assets not marked with the [correct] classification

126 11.8 Quality of identification and authentication controls

127 10.11 Delays and inconsistencies in patching

128 15.14 Proportionality of expenditure on assurance versus potential impact x likelihood

129 15.15 % of software licenses purchased but not accounted for in repository

130 14.11 % of critical systems reviewed for compliance with critical control requirements

131 11.9 % of business units that have proven their identification and authentication mechanisms

132 9.6 Distance between employee and visitor parking

133 10.12 Perceptions of rate of change in IT

134 4.11 % of controls working as defined

135 4.12 Organization's insurance coverage versus annual premiums

136 8.9 Help desk security traffic volumes

137 6.26 VAR (Value At Risk)

138 8.10 Culture / world view

139 11.10 Number of times that assets were accessed without authentication or validation

140 10.13 Patching policy compliance

141 8.11 Employee turn versus account churn


142 6.27 ROSI (Return on Security Investment)

143 8.12 Organizational dysfunction

144 13.7 Number of information security incidents that could have been prevented, mitigated or avoided

145 15.16 % of critical information assets residing on fully compliant systems

146 13.8 Non-financial impacts of incidents

147 8.13 Psychometrics

148 9.7 % of facilities that have adequate external lighting

149 10.14 Number of changes

150 6.28 Security budget as % of IT budget or turnover

151 10.15 Number of viruses detected in user files

152 10.16 Number of firewall rules changed

153 4.13 Number of attacks

154 10.17 Toxicity rate of customer data


Example information security metrics,
ranked by PRAGMATIC score (unweighted)

S/M/O  P  R  A  G  M  A  T    Note
Measured using metametrics similar to
SM 96 91 99 92 88 94 89
PRAGMATIC

Information asset owners are accountable for


their adequate protection; orphaned assets are M 85 90 97 90 90 95 85
less likely to be properly secured

A 'heartbeat' metric: the sudden unexpected lack


of data from a system is probably a security O 87 88 94 93 93 94 97
incident

Implies there is a comprehensive map of


SM 95 90 99 90 95 80 86
business processes

Controls that fail without it being obvious are a


SMO 90 90 90 90 90 93 86
liability, increasing security risk

To achieve effective accountability and


M 81 87 90 95 92 92 77
responsibility requires an identified "owner"

Virtually all the power ends up as heat, so it is


important to track power consumed against the O 81 69 89 92 80 99 98
air conditioning capacity

Scored using a maturity scale with scoring


SM 95 97 70 78 91 89 90
indicators (see appendix H)

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90
indicators (see appendix H)

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90
indicators (see appendix H)

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90
indicators (see appendix H)
Scored using a maturity scale with scoring
SM 90 95 70 80 90 85 90
indicators (see appendix H)

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90
indicators (see appendix H)

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90
indicators (see appendix H)

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90
indicators (see appendix H)

Scored using a maturity scale with scoring


SM 90 95 70 80 90 85 90
indicators (see appendix H)

Scored using a maturity scale with scoring


SM 92 98 68 78 90 83 89
indicators (see appendix H)

Analysis of unauthorized and authorized


(respectively) non-compliance to policies, M 87 83 84 94 81 83 84
procedures, standards, laws, regs etc.

Scored using a maturity scale with scoring


SM 90 95 70 80 90 80 90
indicators (see appendix H)

Measures the overall effectiveness of risk


SMO 87 87 84 81 89 80 87
management processes

Assumes the organization has identified its


critical business processes, and that BC M 85 97 93 84 89 75 85
adequacy can be determined

Scored using a maturity scale with scoring


SM 90 95 70 80 88 85 90
indicators (see appendix H)

They should all be explicitly/directly interlinked


M 85 89 88 90 91 87 65
and/or cross-referenced

Position of security in organization very


S 97 87 15 94 86 90 99
indicative of authority, culture and effectiveness
Procedural documentation captures knowledge
but must be tested for accuracy; essential for MO 95 96 91 85 95 84 62
business continuity

RTO & RPO are basic BCM parameters;


perhaps change 'defined' to 'approved' if random M 88 99 90 68 93 68 92
values may be assigned

Cuts (or increases!) to the infosec budget are


likely to affect what can realistically be achieved, M 70 90 85 77 80 77 80
but specific differences can't be predicted

Absent asset classification, there is risk of


under-protecting or the cost of over-protecting MO 75 75 97 85 90 80 80
assets; assumes a comprehensive inventory

Count the number of plans that have not been


reviewed/tested when planned, or the days MO 75 75 90 73 84 76 80
overdue

High assurance can only be achieved by highly


SMO 83 80 85 91 92 75 75
realistic (ideally live) testing

Measures of the technical compliance processes O 81 77 89 86 89 73 74

Measured using surveys, scales, tests etc. MO 86 89 86 82 85 80 69

Inconsistencies between logical and physical


M 75 76 72 90 82 75 85
locations of employees are a concern

A classic IT performance metric directly related


to availability but often politically defined MO 84 97 66 78 94 61 79
and determined

Standards must be based on and clarify policy S 75 82 92 78 80 70 73

May be numeric, textual or both (annotated


scores); may include internal, external and SM 79 89 87 96 92 84 30
certification audits, even reviews etc.

Controls degrade over time, untested controls


may not operate as expected, meet control M 92 95 90 65 95 60 75
objectives and address risks
Policy coverage metric implies management has
MO 70 75 90 69 85 76 72
a reasonable idea of the policy landscape

Counted from an arbitrary start or asset creation


date: speeds up the nomination of owners for all MO 40 51 84 77 74 86 92
(significant) information assets

Comprises a whole family of more specific


metrics, collectively indicating how close to the SMO 92 92 82 77 96 62 84
red line IT is being run

A measure of completeness of governance


M 65 76 91 73 83 77 70
structure

Locate (and perhaps rank) vulnerabilities


MO 95 80 90 70 85 77 45
through tiger team/penetration testing

Simple yet effective measure of incident


M 82 69 85 76 80 75 65
response capability

Avoiding corporate and/or personal liabilities is a


SMO 77 85 85 70 98 68 35
strong management driver

Difficult to define the metric, but an important


parameter to measure & improve across all M 83 85 91 73 90 68 70
system developments

Having pre-agreed criteria or standards for


MO 80 85 40 66 72 75 80
quality, readability etc. helps!

Difficult to define the metric, but an important


parameter to measure & improve for each IT M 83 88 83 73 90 68 80
system

Approval suggests proper risk evaluation


process ergo reducing the time needed to M 70 71 76 90 60 84 64
achieve changes is likely to be beneficial

This is a process compliance metric in the


context of IT change management: unauthorized M 87 80 90 80 80 80 60
changes may be incidents

This should generate the map or landscape used


SM 85 92 79 81 90 70 75
by metric 14.1 and numerous BCM activities
Policies are one form of control addressing risks:
M 82 80 60 60 70 45 85
if there is no viable risk, why have a policy?

Confirms that detailed policy statements


M 92 91 64 60 85 65 45
reflect/satisfy identified control objectives

ISO27k approach: without clear objectives, how


would you determine whether the controls are M 92 91 64 60 85 65 45
adequate?

Either objectives are incorrect or control


MO 88 86 88 65 78 60 26
inadequate - binary measure

Assumes there are technical security


configuration standards and systematic O 83 80 77 75 59 74 76
assessment (e.g. the CIS benchmarks)

A peak would suggest a cluster of serious issues


and possibly a process failure if the changes are MO 64 71 69 73 78 70 70
not really emergencies

Without a controls policy, implementation is likely


SM 83 92 80 83 89 82 32
to be ad hoc, contradictory and inconsistent

Assumes the costs relating to resilience, disaster


recovery and contingency are tracked SM 75 92 20 82 95 70 70
reasonably accurately

Presumably ranges from "none" to "we live by it"


O 54 88 92 14 97 77 43
but very subjective unless supported by criteria

Indicates how effective are the processes to


MO 55 80 95 30 80 85 60
manage/maintain access rights

Financial stress generally results in increased


SM 72 80 10 80 80 80 61
risk

Contrast results of theoretical assessments with


actual performance as a measure of risk and SMO 90 90 44 80 92 77 66
security management

“It’s the way we’ve always done it: put the pink
copy in the green file …"; gathering the data has SM 85 88 85 80 84 75 22
spin-off benefits
Measure the completeness, accuracy and
MO 82 66 83 78 80 43 50
up-to-date-ness of the inventory

Big changes (especially sudden increases)


suggest potentially increased security risks and SM 60 66 20 85 60 80 75
so are probably worth investigating

Historical trends and common factors generally


S 70 80 72 82 80 80 20
indicate systematic security issues

Assumes a formal accreditation process M 72 79 73 89 68 32 22

The unknown ones are arguably more worrying


MO 80 64 80 70 80 75 25
but we can't count them!

Controls that do not support specific business


SM 96 95 65 55 99 50 40
objectives may not be needed

"Serious" needs to be defined by severity


M 62 70 11 87 87 10 92
criteria

During the reporting period MO 68 56 74 76 73 64 64

The volume level in decibels when the policy


suite is dropped from one meter negatively MO 68 77 60 86 35 70 64
correlates to its effectiveness :-)

Drives the policy review process; assumes


O 75 70 68 41 96 50 56
policies have defined review/reapproval dates

Difficult to determine accurately but a very


interesting metric, especially if analyzed by SM 94 92 90 77 97 44 50
different categories of expense

Indicative of the organization's control over


privileged access: potentially a very granular MO 86 80 51 40 65 39 55
information-rich metric

Fewer effective system-wide controls are more


efficient: too few lose granularity, too many are too SM 71 75 72 75 88 30 50
costly
Lack of compliance means increased risk,
incorrect requirements or inadequate compliance SMO 75 75 73 63 65 58 40
procedures

The times are not always clear-cut, but good


MO 80 70 72 30 75 50 50
forensics can help

Useful way to highlight issues but a rather


M 70 60 85 60 90 60 30
subjective and coarse measure

Good indicator of security maturity S 94 93 78 36 72 76 30

Compare against previous years’ figures and


projections for future years, and against various SM 82 94 60 60 89 29 33
other overheads 

Difficult to measure objectively without testing,


SM 88 72 60 75 65 75 20
yet important for security and fraud

Predictability of first digit of random numbers O 84 30 53 95 11 98 62

Assumes a compliance checking process is


running; could be analyzed by business unit for M 58 55 82 73 86 47 64
additional leverage

Correlate turnstile/card accesses, network


activity (NIDS), system logins, plus application- O 83 86 65 75 70 52 44
level records; process metric

Standard accounting technique reflects the time


M 77 72 25 35 85 55 44
value of money

Most RA methods generate some sort of scoring


output that is useful to compare whatever is SM 72 60 55 70 71 40 60
analyzed

Hinges on the definition of 'adequate' M 96 92 78 40 75 33 60

Security needs to be considered in all SDLC


stages - affects feasibility, design, M 85 80 67 62 70 50 35
implementation etc.
Assumes we can estimate the liability (potential
SM 88 98 59 33 96 33 77
impact x likelihood) that remains untreated

Comparative metric highlights good and bad


MO 60 70 65 70 78 68 50
examples

Measured by sampling and looking for


O 78 66 23 78 3 93 74
patterns/uneven distributions

Absent effective QA, information security is


M 75 70 66 61 80 50 35
unlikely to be effective, efficient and consistent

Difficult to measure objectively, yet an important


driver for security and governance throughout SM 95 50 57 40 91 45 50
the organization

Standard accounting technique M 70 72 25 30 82 50 44

The amount of coupling between systems and


processes has an impact on the risk of cascade S 68 85 50 60 72 47 35
failures ('the domino effect')

Standard accounting technique M 69 72 25 30 82 50 44

Exploratory probes often precede attacks, so


may give an early warning (for various values of MO 50 80 10 68 66 85 50
'early')

Standard accounting technique M 65 72 25 25 88 50 44

In regulated sectors, indicative of risk of


regulatory sanctions; also indicative of cost SM 80 65 62 61 90 60 22
effectiveness of infosec

Simply identify and count the number of policy


noncompliance incidents each reporting period O 55 64 75 50 68 34 59
to drive up compliance ...

Effectiveness can be compromised by low


SM 60 60 40 35 85 51 85
scores / perception of lack of value
Includes service levels for availability, of course MO 60 75 65 62 40 70 35

Source: Scott Berinato. Measure with a maturity


MO 87 89 65 40 74 35 46
scale or ISO27K checklist

Failure to determine root cause invites


M 85 85 67 40 77 40 48
recurrence

Homogeneity or heterogeneity impacts on


aggregated risk and can be measured using MO 67 70 40 59 67 50 33
scoring scales

Measures procedural deficiencies: these


changes should have been blocked by change MO 50 70 55 60 65 40 50
testing and authorization processes

External threats may pose organizational risk M 5 10 30 85 25 71 88

Could be measured using automated analyses


and/or penetration tests, and compared across O 74 85 71 74 60 32 46
systems

In the hands of competent psychometricians,


SM 60 76 55 75 60 60 10
OCAI scores are highly predictive of behaviors

Measures personal accountability; will stimulate


an interesting discussion about what the 'ideal' MO 74 67 38 39 68 42 36
value should be!

Number of embarrassing privacy breaches &


other information security incidents that become SM 26 38 20 50 63 72 40
public knowledge per year

Ideal scores likely to vary between those types


MO 66 47 79 45 74 38 44
of document

Standardized controls for same things, follows


M 78 83 67 60 71 33 27
controls policy consistently

Culture can be improved by focusing attention


SM 80 80 60 55 75 55 10
using surveys.
Does scope include SCADA/ICS, facilities,
production systems, elevators, archives, S 86 74 35 44 70 37 30
backups, business continuity etc.?

Assumes there is an information assets


M 48 64 78 57 79 38 50
valuation process, which has spin-off benefits

… as perceived by the intended audiences O 60 49 76 43 88 45 41

Maturity metric highlights systems without


MO 70 50 60 60 88 25 40
effective access controls

Controls that fail, especially silently, can be very


risky in certain circumstances, hence this is M 59 55 66 78 77 33 20
highly context-dependent

The number and magnitude of deviation would


MO 50 60 60 64 40 50 40
be more indicative of risk

Useful for business case and security budgeting SM 76 85 0 30 95 30 33

Indicates a severe gap in the procurement


M 71 51 90 75 82 35 13
processes!

Steal a dime, steal a dollar: even minor frauds or


incidents are indicative of untreated threats and SMO 70 60 0 50 72 35 35
vulnerabilities

Presumes that labels are meaningful and correct O 52 53 63 44 62 13 17

Could incorporate password length & complexity,


relative strengths of I&A mechanisms etc.; M 60 87 40 40 56 36 41
should reflect risk though

Would have to be carefully defined O 43 41 77 62 36 32 48

Measures whether or not the organization is


focusing assurance activities on the risks that M 65 40 85 40 3 20 46
actually matter; confusing definition
A very narrow (specific) metric with negligible
MO 1 1 90 84 1 70 50
relevance to information security

Only measures the assessment O 62 53 68 36 5 69 34

During the reporting period M 69 73 72 32 36 4 56

Assumes there is a perceived need for


O 1 0 6 93 2 93 66
segregation

This is clearly a highly subjective measure but


extreme values (whether high or low) suggest M 40 50 6 65 70 50 30
increased information security risks

Presumes controls are being systematically


MO 62 62 44 26 66 25 22
tested/proven

If management spends $50k pa to provide $1m


in business interruption insurance, the risk S 64 46 5 25 20 16 10
appetite has been partially quantified

Assumes that security support calls are routinely


identified as such by the Help Deskers' call O 24 33 16 58 5 35 33
logging system

Standard risk management technique M 70 65 20 30 35 40 30

Difficult to measure objectively without testing,


SM 66 30 10 70 40 56 15
yet important for security and fraud

How would you know this? O 61 78 33 16 33 0 44

Compare actual patch status of systems against


the policy on patching & vulnerability O 66 52 55 77 19 36 11
management

Any differences between the rates of change


should be investigated but may be legitimate; O 30 30 11 36 44 36 62
may need to count contractors separately
ROI as applied to security investments, using
M 40 40 20 20 55 45 25
ALE

Difficult to measure objectively, yet symptomatic


SM 75 20 10 60 80 40 15
of likely security risks within the organization

Subjective and ill-defined metric; incident reports


M 50 75 0 15 85 5 16
are more useful

Assumes we know which assets are critical and


which systems are compliant, but compliant M 48 26 36 41 56 13 19
does not necessarily mean secure

Subjective and difficult to quantify SM 60 65 0 20 60 6 30

Can be used to check the extent to which


MO 40 24 0 79 15 55 10
employees are suited to their roles

Measures a single security control O 2 5 70 42 11 46 35

… in the reporting period, a crude activity


O 55 24 9 6 2 3 15
measure loosely correlated with security

Possible comparator between business units or


organizations but wide variety of possible M 13 3 16 2 2 0 4
reasons for the differences

Let's be generous and assume this actually


O 8 13 6 11 3 2 5
covers all forms of malware

Would have to be carefully defined O 2 1 1 10 2 33 14

"Attack" would have to be carefully defined;


M 13 9 1 2 12 1 4
ignores differing impacts, and accidents

Don't ask us what this means: this extremely


O 0 0 0 0 0 0 0
obscure metric was suggested by someone else!
I C Score

79 95 91%

99 90 91%

89 79 90%

80 88 89%

93 80 89%

92 90 88%

90 98 88%

85 90 87%

87 90 86%

85 90 86%

85 90 86%
85 90 86%

85 90 86%

85 90 86%

85 90 86%

85 90 86%

84 92 86%

87 88 86%

85 90 86%

83 90 85%

85 75 85%

82 88 85%

84 85 85%

97 99 85%
90 60 84%

68 90 84%

90 95 83%

80 80 82%

77 93 80%

81 60 80%

78 70 80%

48 75 78%

83 60 78%

47 89 77%

60 81 77%

96 36 77%

55 60 76%
65 85 76%

94 82 76%

64 29 75%

61 78 75%

75 55 75%

75 60 74%

89 60 74%

80 20 73%

80 80 73%

82 10 73%

60 80 73%

50 47 73%

40 40 72%
86 84 72%

75 75 72%

75 75 72%

90 70 72%

88 36 72%

69 83 72%

70 35 72%

70 70 72%

90 89 72%

70 80 71%

80 79 69%

60 22 69%

62 39 69%
66 70 69%

80 91 69%

67 65 68%

89 88 68%

85 52 68%

70 40 68%

95 95 68%

52 75 67%

88 41 65%

90 34 64%

16 20 64%

95 60 63%

65 43 63%
40 70 62%

65 65 62%

60 42 62%

40 37 62%

49 59 62%

50 50 62%

98 23 62%

66 17 61%

61 11 61%

60 88 60%

60 50 60%

34 30 60%

35 50 59%
38 10 59%

50 20 59%

79 34 59%

36 50 58%

25 70 58%

60 88 58%

61 42 58%

60 88 58%

70 40 58%

60 90 58%

65 10 57%

76 33 57%

15 80 57%
80 20 56%

40 30 56%

16 40 55%

65 45 55%

45 60 55%

90 91 55%

33 19 55%

75 20 55%

83 44 55%

87 87 54%

50 35 53%

31 27 53%

45 10 52%
44 45 52%

22 26 51%

43 12 51%

20 40 50%

48 10 50%

60 20 49%

40 55 49%

20 6 49%

70 50 49%

87 44 48%

22 42 47%

34 42 46%

76 35 46%
81 30 45%

43 33 45%

2 50 44%

45 66 41%

14 40 41%

36 22 41%

82 94 40%

45 95 38%

30 22 38%

40 10 37%

35 33 37%

8 5 37%

57 20 36%
40 30 35%

10 5 35%

9 42 33%

46 12 33%

20 17 31%

42 5 30%

18 31 29%

26 67 23%

18 88 16%

5 78 15%

4 17 9%

1 7 6%

0 0 0%
This spreadsheet is protected internationally by copyright law.

Copyright © 2013 Brotby & Hinson
