
Electronic voting: practice and theory

Mark Ryan
University of Birmingham

Network security lecture


2 March 2011
Outline

1 Potential & current situation

2 Desired properties

3 Trust assumptions

4 Example 1: FOO (1992)

5 Example 2: Helios (2009)

6 Example 3: JCJ/Civitas (2008) (main focus)

7 Conclusions
Electronic voting: potential

Electronic voting potentially offers:

Efficiency
  higher voter participation
  greater accuracy
  lower costs

Better security
  vote-privacy even in the presence of corrupt election authorities
  voter verification, i.e. the ability of voters and observers to check the
  declared outcome against the votes cast

Governments the world over have been trialling e-voting, e.g. the USA, UK,
Canada, Brazil, the Netherlands and Estonia.

It can also be useful for smaller-scale elections (student guild, shareholder
voting, trade union ballots, local government).
Current situation
The potential benefits have turned out to be hard to realise.

In the UK:
  The May 2007 elections included 5 local authorities that piloted a range of
  electronic voting machines.
  The Electoral Commission report concluded that the implementation and
  security risk was significant and unacceptable, and recommended that no
  further e-voting take place until a sufficiently secure and transparent
  system is available.

In the USA:
  Diebold controversy since 2003, when code was leaked on the internet.
  The Kohno/Stubblefield/Rubin/Wallach analysis concluded that the Diebold
  system was far below even the most minimal security standards: voters
  without insider privileges can cast unlimited votes without being detected.
Current situation in USA, continued

In 2007, the Secretary of State for California commissioned a top-to-bottom
review, by computer science academics, of the four machines certified for use
in the state. The result was a catalogue of vulnerabilities, including:
  appalling software engineering practices, such as hardcoding crypto keys in
  source code, bypassing OS protection mechanisms, ...
  susceptibility of voting machines to viruses that propagate from machine to
  machine, and that could maliciously cause votes to be recorded incorrectly
  or miscounted
  weakness-in-depth: architecturally unsound systems in which, even as known
  flaws are fixed, new ones are discovered.
In response to these reports, she decertified all four types of voting
machine for regular use in California, on 3 August 2007.
Situation in USA 2008 election
Several other states followed California's lead, and decertified electronic
voting machines.
But other states have continued to use touch-screen systems, having invested
massively. (E.g., the state of Colorado spent $41M on electronic voting
systems for its 3M voters, on machines that California has now decertified...)
Diebold, one of the main suppliers, tried unsuccessfully to sell their
e-voting business. Instead, they rebranded it Premier Election
Solutions and revised their forecasts downwards.
Current situation in Estonia

Estonia is a tiny former Soviet republic (pop. 1.4M), nicknamed "e-Stonia"
because of its tech-savvy character.
The Oct. 2005 local election allowed voters to cast ballots on the internet.
Of the 496,336 votes cast in total, 9,317 (1.9%) were cast online.
Officials hailed the experiment a success, saying there were no reports of
hacking or flaws. The system was based on Linux.
Voters need a special ID smartcard, a $24 device that reads the card, and a
computer with internet access. About 80% of Estonian voters have the cards
anyway; they have also been used since 2002 for online banking and tax
records.

Feb. 2007 general election: 30,275 voters used internet voting.


Internet voting and coercion resistance

The possibility of coercion (e.g. by family members) seems very hard to avoid
for internet voting.

In Estonia, the threat is somewhat mitigated (the precedence rule is sketched
below):
  The election system allows multiple online votes to be cast by the same
  person during the days of advance voting, with each vote cancelling the
  previous one.
  The system gives priority to paper ballots; a paper ballot cancels any
  previous online ballot by the same person.
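As a rough illustration of that precedence rule only, here is a minimal sketch
with a hypothetical data model (this is not the actual Estonian system): later
online votes cancel earlier ones, and a paper ballot cancels all of a voter's
online votes.

```python
# Minimal sketch of the re-vote precedence rule described above.
# Hypothetical data model -- not the actual Estonian system.
from dataclasses import dataclass

@dataclass
class OnlineVote:
    voter_id: str
    candidate: str
    timestamp: int  # later timestamp = cast later during advance voting

def counted_ballots(online_votes, paper_voters):
    """Return the online ballot that counts for each voter.

    - Among a voter's online votes, only the latest one counts.
    - A paper ballot cancels all of that voter's online votes.
    """
    latest = {}
    for v in online_votes:
        cur = latest.get(v.voter_id)
        if cur is None or v.timestamp > cur.timestamp:
            latest[v.voter_id] = v
    return {vid: v.candidate for vid, v in latest.items()
            if vid not in paper_voters}

# Example: Alice re-votes online; Bob's paper ballot cancels his online vote.
votes = [OnlineVote("alice", "X", 1), OnlineVote("alice", "Y", 5),
         OnlineVote("bob", "X", 2)]
print(counted_ballots(votes, paper_voters={"bob"}))   # {'alice': 'Y'}
```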
Where are we?

1 Potential & current situation

2 Desired properties

3 Trust assumptions

4 Example 1: FOO (1992)

5 Example 2: Helios (2009)

6 Example 3: JCJ/Civitas (2008) (main focus)

7 Conclusions
Desired properties

Verifiability
  Outcome of election is verifiable by voters and observers
  You don't need to trust election software

Incoercibility
  Your vote is private even if you try to cooperate with a coercer
  even if the coercer is the election authorities

Usability
  Vote & go
  Verify any time
Examples

[Diagram: example voting methods (raising hands; website voting using Tor;
website voting) positioned against the properties Verifiable, Incoercible
and Usable]
Voting system: desired properties in more detail

Privacy: the fact that a particular voter voted in a particular way is not
revealed to anyone
Receipt-freeness: a voter cannot later prove to a coercer that she voted in a
certain way
Coercion-resistance: a voter cannot interactively cooperate with a coercer to
prove that she voted in a certain way
Individual verifiability: a voter can verify that her vote was really counted
Universal verifiability: a voter can verify that the published outcome really
is the sum of all the votes
Eligibility verifiability: a voter can verify that only eligible votes have
been counted
Fairness: no early results can be obtained which could influence the
remaining voters
. . . and all this even in the presence of corrupt election authorities!
Are these properties even simultaneously satisfiable?

Contradiction?
  Eligibility: only legitimate voters can vote, and only once
  Effectiveness: the number of votes for each candidate is published after
  the election
  Privacy: the fact that a particular voter voted in a particular way is not
  revealed to anyone (not even the election authorities)

Contradiction?
  Receipt-freeness: a voter cannot later prove to a coercer that she voted in
  a certain way
  Individual verifiability: a voter can verify that her vote was really
  counted
  Individual verifiability (stronger): ..., and if her vote wasn't counted,
  she can prove that.
Where are we?

1 Potential & current situation

2 Desired properties

3 Trust assumptions

4 Example 1: FOO (1992)

5 Example 2: Helios (2009)

6 Example 3: JCJ/Civitas (2008) (main focus)

7 Conclusions
How could it be secure?


Trust assumption possibilities

Nothing is required-to-be-trusted.

Some hardware or software is required-to-be-trusted,
  but you can audit it on some cut/choose basis
  e.g. PaV, Helios 2.0

Some hardware or software is required-to-be-trusted,
  but you can bring/make/source your own
  e.g. FOO, JCJ/Civitas

Some hardware or software is required-to-be-trusted,
  and you just have to assume it is
  e.g. current DRE solutions
Security by trusted client software

[Diagram: the client software is trusted by the user, but does not need to be
trusted by the authorities or by other voters; the rest of the system is not
trusted by the user, and doesn't need to be trusted by anyone.]
Where are we?

1 Potential & current situation

2 Desired properties

3 Trust assumptions

4 Example 1: FOO (1992)

5 Example 2: Helios (2009)

6 Example 3: JCJ/Civitas (2008) (main focus)

7 Conclusions
First, some cryptography

Blind signatures
  Normally, when Alice signs a message M, creating Sign_SKA(M), she knows
  what the message M is.
  In a blind signature, Bob can ask her to sign a blinded version of the
  message, blind_b(M). After she signs it, he can unblind it:
    unblind_b(Sign_SKA(blind_b(M))) = Sign_SKA(M)

Commitments
  Alice can send Bob a commitment commit_c(M) to a message M.
  Later, she can reveal c and M, and Bob can verify that it is indeed the
  correct M that she committed to.
  Alice cannot lie, e.g., she cannot find some other c' and M' that have the
  same commitment commit_c'(M').
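To make these two primitives concrete, here is a toy sketch in Python: a
Chaum-style RSA blind signature and a hash-based commitment. The parameters
and helper names are illustrative only; real deployments use proper key sizes,
padding and vetted libraries.

```python
import hashlib, secrets

# --- Toy RSA blind signature (Chaum-style); tiny insecure parameters ---
p, q = 1000003, 1000033
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))          # signer's (Alice's) private exponent

def blind(m, r):        # Bob blinds message m with a random factor r
    return (m * pow(r, e, n)) % n

def sign(mb):           # Alice signs the blinded message (she sees only mb)
    return pow(mb, d, n)

def unblind(sb, r):     # Bob removes the blinding factor
    return (sb * pow(r, -1, n)) % n

def verify(m, s):       # anyone can check Alice's signature on m
    return pow(s, e, n) == m % n

m = 424242
r = secrets.randbelow(n - 2) + 2
sig = unblind(sign(blind(m, r)), r)
assert verify(m, sig)                      # unblind_b(Sign(blind_b(M))) = Sign(M)

# --- Hash-based commitment ---
def commit(M, c):       # c is a random key, kept secret until opening
    return hashlib.sha256(c + M).hexdigest()

def open_commit(com, M, c):
    return commit(M, c) == com

c = secrets.token_bytes(32)
com = commit(b"my vote", c)
assert open_commit(com, b"my vote", c)     # Alice reveals M and c, Bob re-checks
```

The blinding works because (M * r^e)^d = M^d * r mod n, so dividing out r leaves
an ordinary signature on M that Alice never saw in the clear.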
FOO 92 protocol [Fujioka, Okamoto, Ohta 1992]

Parties: Alice (the voter), the aDministrator, and the Collector.

Phase I (registration)
  Alice -> Administrator:  { blind(commit(v, c), b) }_{A^-1}    (signed by Alice)
  Administrator -> Alice:  { blind(commit(v, c), b) }_{D^-1}    (blind-signed by the administrator)
  Alice unblinds:          unblind(...) = { commit(v, c) }_{D^-1}

Phase II (voting, via an anonymous channel)
  Alice -> Collector:      { commit(v, c) }_{D^-1}
  Collector publishes:     (l, commit(v, c))   for a fresh list index l

Phase III (opening, via an anonymous channel)
  Alice -> Collector:      (l, c)
  Collector opens:         open(commit(v, c), c) = v, and publishes v
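To see how the three phases fit together, here is a compact, self-contained
sketch in Python with toy primitives. It is illustrative pseudocode rather
than a faithful implementation: keys are tiny, Alice's own signature and the
anonymous channel are only simulated, and with a hash commitment the voter
must reveal v together with c in phase III.

```python
import hashlib, secrets

# toy RSA key pair for the aDministrator (insecure parameters, illustration only)
p_, q_ = 1000003, 1000033
N, E = p_ * q_, 65537
D = pow(E, -1, (p_ - 1) * (q_ - 1))

commit = lambda v, c: hashlib.sha256(c + v).hexdigest()
digest = lambda s: int(hashlib.sha256(s.encode()).hexdigest(), 16) % N  # map into Z_N

# --- Phase I: registration (administrator blind-signs the commitment) ---
v, c = b"candidate-Y", secrets.token_bytes(16)
x = commit(v, c)                                  # commit(v, c)
b = secrets.randbelow(N - 2) + 2                  # blinding factor
blinded = (digest(x) * pow(b, E, N)) % N          # blind(commit(v, c), b)
# (in FOO, Alice also signs this with her own key so the administrator can
#  check her eligibility; omitted here)
blind_sig = pow(blinded, D, N)                    # administrator signs blindly
y = (blind_sig * pow(b, -1, N)) % N               # Alice unblinds: Sign_D(commit(v, c))
assert pow(y, E, N) == digest(x)                  # valid administrator signature

# --- Phase II: voting (anonymous channel, simulated) ---
board = []                                        # Collector's public list
if pow(y, E, N) == digest(x):                     # Collector checks the signature
    board.append((len(board), x, y))              # publishes (l, commit(v, c))

# --- Phase III: opening (anonymous channel, simulated) ---
l = 0
vote, key = v, c                                  # Alice sends (l, v, c)
assert commit(vote, key) == board[l][1]           # Collector opens the commitment
print("counted:", vote.decode())
```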
FOO 92 properties
Let us consider for each one whether FOO has it or not:
Eligibility
Fairness
Privacy
Receipt-freeness
Coercion-resistance
Individual verifiability
Universal verifiability
Eligibility verifiability
4 out of 8 ... not too bad, but not good enough!

FOO usability in a real election: an exercise for the reader.


Where are we?

1 Potential & current situation

2 Desired properties

3 Trust assumptions

4 Example 1: FOO (1992)

5 Example 2: Helios (2009)

6 Example 3: JCJ/Civitas (2008) (main focus)

7 Conclusions
Election of president at University of Louvain

The election
  Helios 2.0
  25,000 potential voters
  5,000 registered, 4,000 voted
  Educated, but not technical, voters
  30% of voters checked their vote
  No valid complaints

Verifiability
  Anyone can write code to verify the election
  Sample python code provided

No coercion resistance
  Only recommended for low-coercion environments
  Re-votes are allowed, but don't help w.r.t. an insider coercer

[Adida/deMarneffe/Pereira/Quisquater 09]
Zero-knowledge proofs

Problem
  I give you an encryption enc_k(v) of a value v.
  How do you know that I know v (e.g., that I didn't copy someone else's
  ballot)?
  How do you know that v satisfies certain constraints, e.g., that it is a
  valid vote?

Solution
  I give you a zero-knowledge proof that I know the v inside enc_k(v).
  I give you a zero-knowledge proof that the v inside enc_k(v) satisfies the
  relevant constraints.

Zero-knowledge proofs are cryptographic objects that (among other
applications) provide unforgeable proofs about values hidden inside
encryptions. They don't reveal the value itself, but prove something about it.
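As a flavour of how such proofs work, here is a toy non-interactive Schnorr
proof of knowledge of a discrete log, made non-interactive with the
Fiat-Shamir heuristic. Proofs of this shape are the usual building block for
"I know the value inside this Elgamal encryption"; the group parameters below
are tiny and purely illustrative.

```python
import hashlib, secrets

# Tiny illustrative group: p = 2q+1, g generates the subgroup of prime order q.
p, q, g = 2039, 1019, 4

def H(*vals):                       # Fiat-Shamir challenge from a hash
    data = b"|".join(str(v).encode() for v in vals)
    return int(hashlib.sha256(data).hexdigest(), 16) % q

def prove(x):
    """Prove knowledge of x such that y = g^x, without revealing x."""
    y = pow(g, x, p)
    w = secrets.randbelow(q)        # fresh secret nonce
    a = pow(g, w, p)                # commitment
    c = H(g, y, a)                  # challenge derived by hashing
    s = (w + c * x) % q             # response
    return y, (a, s)

def verify(y, proof):
    a, s = proof
    c = H(g, y, a)
    return pow(g, s, p) == (a * pow(y, c, p)) % p   # check g^s = a * y^c

x = secrets.randbelow(q)            # the secret (e.g. encryption randomness)
y, proof = prove(x)
assert verify(y, proof)             # verifier learns nothing about x itself
```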
Elgamal encryption

Public parameters: a group G and a generator g.
Private key: x
Public key: h = g^x

Encryption of m:
  (c, d) = (g^r, m · h^r)

Variation:
  (c, d) = (g^r, g^m · h^r)

Decryption of (c, d) (assuming the variation):
  g^m = d / c^x

Homomorphic combination
  (c1, d1) · (c2, d2) = (g^(r1+r2), g^(m1+m2) · h^(r1+r2))

Verifiable, distributed decryption
  The decryption key can be held among a set of keyholders.
  Each keyholder is required to participate in the decryption.
  Each keyholder also creates a zero-knowledge proof that he performed his
  part correctly.
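A minimal sketch of the variation above and of its homomorphic property,
using a tiny illustrative group; recovering m from g^m is done by brute
force, which is fine when m is a small tally.

```python
import secrets

# Tiny illustrative group: p = 2q+1, g generates the subgroup of prime order q.
p, q, g = 2039, 1019, 4

x = secrets.randbelow(q - 1) + 1        # private key
h = pow(g, x, p)                        # public key h = g^x

def encrypt(m):
    """Elgamal variation: (c, d) = (g^r, g^m * h^r)."""
    r = secrets.randbelow(q - 1) + 1
    return pow(g, r, p), (pow(g, m, p) * pow(h, r, p)) % p

def combine(ct1, ct2):
    """Componentwise product encrypts the sum of the plaintexts."""
    (c1, d1), (c2, d2) = ct1, ct2
    return (c1 * c2) % p, (d1 * d2) % p

def decrypt(ct, max_m=100):
    c, d = ct
    gm = (d * pow(c, -x, p)) % p        # g^m = d / c^x
    for m in range(max_m + 1):          # brute-force the small exponent m
        if pow(g, m, p) == gm:
            return m
    raise ValueError("tally out of range")

# Three yes/no ballots (1 = yes); only the combined tally is ever decrypted.
ballots = [encrypt(1), encrypt(0), encrypt(1)]
tally_ct = ballots[0]
for b in ballots[1:]:
    tally_ct = combine(tally_ct, b)
print(decrypt(tally_ct))                # 2
```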
Helios 2.0

User prepares her ballot (encrypted vote) on her computer, together with ZKPs
to show that it is a valid vote.
  Cut-and-choose auditability provides assurance of correctness.
  Not much guarantee of privacy on the client side.

Ballots are checked & homomorphically combined into a single encrypted
outcome.
  The outcome is decrypted by a threshold of talliers.
  Proof of correct decryption.
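The cut-and-choose auditability on the client side can be illustrated as a
cast-or-audit challenge, in the style of Helios's ballot audit. The class and
helper names below are hypothetical, and the Elgamal setting is the toy one
from the earlier sketch: if the voter challenges the client, it must reveal
the plaintext and randomness so anyone can re-encrypt and compare; a cheating
client is caught with probability 1/2 per challenge.

```python
import secrets

# Same tiny Elgamal setting as in the earlier sketch.
p, q, g = 2039, 1019, 4
x = secrets.randbelow(q - 1) + 1
h = pow(g, x, p)

def encrypt(m, r):
    return pow(g, r, p), (pow(g, m, p) * pow(h, r, p)) % p

class VotingClient:
    """Possibly-dishonest client: it may encrypt a different vote."""
    def __init__(self, cheat=False):
        self.cheat = cheat
    def prepare(self, vote):
        self.vote = 0 if self.cheat else vote     # a cheater flips the vote
        self.r = secrets.randbelow(q - 1) + 1
        return encrypt(self.vote, self.r)
    def audit(self):
        return self.vote, self.r                  # reveal plaintext + randomness

def audit_ballot(ciphertext, intended_vote, client):
    v, r = client.audit()
    return v == intended_vote and encrypt(v, r) == ciphertext

# The voter may challenge the client several times before finally casting;
# an audited ballot is discarded and a fresh one is prepared for casting.
client = VotingClient(cheat=True)
ct = client.prepare(vote=1)
print(audit_ballot(ct, intended_vote=1, client=client))   # False: cheating detected
```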
Where are we?

1 Potential & current situation

2 Desired properties

3 Trust assumptions

4 Example 1: FOO (1992)

5 Example 2: Helios (2009)

6 Example 3: JCJ/Civitas (2008) (main focus)

7 Conclusions
Verifiability and Incoercibility

JCJ/Civitas is the only protocol (to my knowledge) that achieves both
verifiability and incoercibility in strong forms. This makes it of great
theoretical interest, although its complexity may make it unusable in
practice.

How does it achieve these properties?

Verifiability
  Everything that the servers process is published.

Incoercibility
  Voters cannot prove that a given value is their credential. Votes under
  invalid credentials may be cast, but won't be counted. Observers can verify
  that votes with incorrect credentials weren't counted, but they can't see
  which ones those were.
Verifiable reencryption mixes

Problem
  We want to shuffle a bunch of encryptions {v}^m_pk, like putting them into
  a big box, closing it, and shaking it for a long time!
  But we want to be sure that
    what comes out of the box is what goes in, as a whole
    no-one can link any particular object that comes out with a particular
    object that went in

Solution: a verifiable reencryption mix
  It takes as input the bunch of encryptions {v}^m_pk.
  It re-randomises them all.
  It outputs the results (in a random order).

How to verify it did the mix correctly?
  Ask it to do another mix. Flip a coin.
  If heads, ask it to prove the correspondence between the input and the
  result of the second mix.
  If tails, ask it to prove the correspondence between the output and the
  result of the second mix.
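Here is a sketch of the re-randomisation step and of the kind of "prove the
correspondence" check described above: the mixer keeps its permutation and
re-encryption factors secret, but can reveal them for one of its two mixes
when challenged, so a verifier can recompute that mix exactly. The names and
structure are illustrative only.

```python
import secrets

# Tiny Elgamal setting, as before.
p, q, g = 2039, 1019, 4
x = secrets.randbelow(q - 1) + 1
h = pow(g, x, p)

def encrypt(m):
    r = secrets.randbelow(q - 1) + 1
    return pow(g, r, p), (pow(g, m, p) * pow(h, r, p)) % p

def reencrypt(ct, s):
    """Re-randomise (c, d) -> (c * g^s, d * h^s): same plaintext, fresh look."""
    c, d = ct
    return (c * pow(g, s, p)) % p, (d * pow(h, s, p)) % p

def mix(cts):
    """One mix: secretly permute and re-randomise; keep the secrets for audits."""
    perm = list(range(len(cts)))
    secrets.SystemRandom().shuffle(perm)
    out, rand_factors = [], []
    for i in perm:
        s = secrets.randbelow(q - 1) + 1
        out.append(reencrypt(cts[i], s))
        rand_factors.append(s)
    return out, (perm, rand_factors)

def check_mix(inputs, outputs, perm, rand_factors):
    """Given the revealed secrets, recompute the mix and compare."""
    return all(outputs[j] == reencrypt(inputs[i], s)
               for j, (i, s) in enumerate(zip(perm, rand_factors)))

ballots = [encrypt(m) for m in (0, 1, 1, 0)]
mixed, (perm, rand_factors) = mix(ballots)
# A challenged mixer reveals the secrets for ONE of its two mixes, so the
# verifier can recompute that mix without ever linking input to output overall.
print(check_mix(ballots, mixed, perm, rand_factors))   # True
```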
JCJ/Civitas step-by-step

1 Voter obtains her credential d.
    She obtains it in several parts, each one from a different Registrar.
    She puts them together herself.
    She can't prove to anyone the validity of her credential.
2 Voter casts her ballot:
      ( {v}^m_pkT , {d}^m'_pkR , zkp )
3 System removes malformed ballots.
4 System verifiably re-encrypts and mixes the ( {v}^m_pkT , {d}^m'_pkR ) part.
5 System compares the encrypted credentials of all ballots using a plaintext
  equivalence test (PET), and discards duplicates (see the PET sketch below).
6 System uses PETs against the electoral register to remove any ineligible
  votes.
7 Keyholders decrypt using verifiable threshold decryption.
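Steps 5 and 6 rely on plaintext equivalence tests. The idea, sketched below
with a single key holder rather than the distributed keyholders used in
Civitas, is: divide the two ciphertexts componentwise, raise the result to a
secret random exponent, and decrypt; the result is 1 exactly when the two
plaintexts are equal, and otherwise it is a random-looking group element that
reveals nothing further about them.

```python
import secrets

# Tiny Elgamal setting; plaintexts are group elements (e.g. credentials).
p, q, g = 2039, 1019, 4
x = secrets.randbelow(q - 1) + 1
h = pow(g, x, p)

def encrypt(m):                       # standard Elgamal: (g^r, m * h^r)
    r = secrets.randbelow(q - 1) + 1
    return pow(g, r, p), (m * pow(h, r, p)) % p

def decrypt(ct):
    c, d = ct
    return (d * pow(c, -x, p)) % p

def pet(ct1, ct2):
    """Plaintext equivalence test: True iff both ciphertexts hide the same value.

    Divide componentwise (giving an encryption of m1/m2), blind it with a
    secret random exponent z, then decrypt: the result is 1 iff m1 == m2.
    """
    (c1, d1), (c2, d2) = ct1, ct2
    c = (c1 * pow(c2, -1, p)) % p
    d = (d1 * pow(d2, -1, p)) % p
    z = secrets.randbelow(q - 1) + 1
    return decrypt((pow(c, z, p), pow(d, z, p))) == 1

cred = pow(g, secrets.randbelow(q), p)          # a credential, as a group element
other = pow(g, secrets.randbelow(q), p)
print(pet(encrypt(cred), encrypt(cred)))        # True: same credential twice
print(pet(encrypt(cred), encrypt(other)))       # False (with overwhelming probability)
```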
JCJ-Civitas
[Diagram: a voter holding credential d submits the ballot
( {v}^m_pkT , {d}^m'_pkR , zkp ); the electoral register holds entries such as
( {d}^m''_pkR , Anne Jones ). The ballots flow through the pipeline:
  remove malformed ballots -> remove duplicates -> remove ineligible ballots
  -> decrypt -> results]
JCJ-Civitas: verifiability

All the intermediate stages are published.
The mixnet proofs are published.
The decryption proofs are published.

A voter can check that her encrypted vote is among the inputs to the first
mix; then she can check the overall correctness of the mix, and the
decryption.
Any observer can check all the mixes, the decryptions, and the tallying.

Eligibility verifiability
  Voters construct a secret credential through interaction with multiple
  registrars.
  A public part of the credential is published on the electoral register, for
  public scrutiny.
  Voters submit their vote with a differently randomised public part of the
  credential.
  This means that any observer can verify that all the votes cast were cast
  by eligible voters.
Conclusions

Electronic voting is coming, whether we like it or not.

Sadly, the current systems are woefully inadequate.
  In the USA and UK, they don't manage to satisfy basic security properties,
  like resistance to virus attacks and to tampering.
  They don't even try to satisfy stronger properties, like privacy guarantees
  against corrupt election officials, or universal and individual
  verifiability.

There are many academic protocols which have much better properties, although
some of them underestimate the importance of usability issues.
