
Privacy Issues

Two basic ideas


Personal privacy:
"A desire to be left alone, free to be ourselves, unconstrained by the prying of others"
(Wacks, 2010)
Includes issues of:
Intrusion,
Surveillance,
Use of one's image, etc.

Data or information privacy:


Refers to how data that can be linked to an individual is collected and used
Is concerned with under what circumstances, and in what ways, data may be collected, stored,
and processed
Electronic data raises significant concerns:
Much greater quantity of data
Remote access to data
Security of data is a major concern

Why should we have privacy?


1. Promotes individuality through personal autonomy, avoiding manipulation or
domination by others
2. Permits emotional release - removing our social "masks"
3. Permits self-evaluation - testing creative and moral activities and ideas
4. Allows sharing of confidences and intimacies in limited and protected communication

Why should we NOT have privacy?


1. May conceal things, e.g., domestic oppression
2. May weaken detection and apprehension of threats to social order (criminals, terrorists)
3. May hamper the free flow of information, impeding transparency and candor
4. May obstruct business efficiency, slowing down commercial decisions
5. May be a form of deception - withholding (unflattering) information

OECD (Organization for Economic Co-operation and Development) Guidelines:


Established 8 principles that are reflected in legislation of most nations
1. Collection Limitation
 There should be limits to the collection of personal data, and any such data should be
obtained by lawful and fair means, with the knowledge or consent of the data subject
2. Data Quality
 Personal data should be relevant to the purposes for which they are to be used and, to the
extent necessary for those purposes, should be accurate, complete and kept up-to-date.
3. Purpose Specification
 Purposes should be specified at the time of collection … subsequent use limited to the fulfilment of those
purposes, or such others as are not incompatible with those purposes and as are specified on
each occasion of change of purpose
4. Use Limitation
 Data not disclosed to 3rd parties except with consent or if required by law
5. Security Safeguards
 Reasonable safeguards against loss, unauthorized access, destruction etc.
6. Openness
 There should be a general policy of openness about developments, practices and
policies … means of establishing the existence and nature of data, and the identity of the party
responsible for it (the data controller)
7. Individual Participation
 Essentially, that an individual may have access to, and the ability to correct, their data
8. Accountability
 There is an identified party (the data controller) who is accountable for compliance with the principles

Transfer of data between countries


 Free transborder flow, except where the other country does not substantially observe the guidelines
 Restrictions may apply to certain categories of personal data
 Cooperation between countries

GDPR rules and regulations


- Data quality
- Right to access
- Confidentiality & security

Defines personal data very broadly … data is personal when someone is able to link the information to a person,
even if the organization holding the data can't make this link itself.

PII - Personally identifiable information


Any data that alone or in combination with other data can be used to identify an individual.
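
A minimal sketch of this idea (the datasets, field names, and records below are invented for illustration): fields that are not identifying on their own can, in combination, single out one individual when joined against another dataset.

```python
# Hypothetical illustration: none of these fields is a direct identifier,
# yet their combination can single out one person in a "de-identified" dataset.

# De-identified health records: no names, only quasi-identifiers.
health_records = [
    {"zip": "M3J", "birth_year": 1990, "sex": "F", "diagnosis": "asthma"},
    {"zip": "M3J", "birth_year": 1985, "sex": "M", "diagnosis": "diabetes"},
    {"zip": "M5V", "birth_year": 1990, "sex": "F", "diagnosis": "flu"},
]

# A public dataset (e.g., a voter roll) linking the same quasi-identifiers to names.
public_roll = [
    {"name": "Alice Smith", "zip": "M3J", "birth_year": 1990, "sex": "F"},
    {"name": "Bob Jones",   "zip": "M3J", "birth_year": 1985, "sex": "M"},
]

def reidentify(records, roll):
    """Join on the quasi-identifiers; a unique match turns a record into PII."""
    matches = []
    for rec in records:
        candidates = [p for p in roll
                      if (p["zip"], p["birth_year"], p["sex"])
                      == (rec["zip"], rec["birth_year"], rec["sex"])]
        if len(candidates) == 1:  # exactly one person fits -> re-identified
            matches.append((candidates[0]["name"], rec["diagnosis"]))
    return matches

print(reidentify(health_records, public_roll))
# -> [('Alice Smith', 'asthma'), ('Bob Jones', 'diabetes')]
```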

DC- Data Controller


Can be either individuals or legal persons such as companies, governments, schools, and NGOs

Impacts of GDPR
Reorients data protection as a fundamental right
Human centered
 The processing of personal data should be designed to serve mankind
Individual control
 Natural persons should have control over their own personal data.
The data does not primarily serve business needs; the regulation exists to protect the individual.

Human Rights Act 1977 (Canada)


Privacy Act 1983 - separated privacy protection from the Human Rights Act

PIPEDA - Personal Information Protection and Electronic Documents Act


 Protection / fair handling of personal information, with consent
FIPPA for Ontario - Sets rules for collection, use, disposal of, and access to information
 Establishes the Office of the Information and Privacy Commissioner of Ontario
US
The US has only a patchwork of sector-specific laws that fail to adequately protect data, rather than
comprehensive legal protection for personal data

Safe Harbour
 Allowed data transfer to those companies that agreed to adhere to fair information
practices
 Was declared invalid by the EU Court of Justice in 2015 because of concerns over NSA
collection of data on EU citizens via social media companies that had custody of that
data
 Replaced with the EU-US Privacy Shield program:
 Limits access by US government authorities - tries to limit mass surveillance
 Companies can self-certify as meeting the GDPR requirements
 Annual review mechanism
 But still has concerns relating to
 Deletion of data,
 Collection of massive amounts of data,
 Clarification of a new Ombudsperson mechanism
 The US works to make these issues compatible with the GDPR through the FTC
FTC - Federal Trade Commission
 Is the primary enforcer of consumer protection, in which privacy is becoming more and
more important
 In the US a company doesn't have to have or disclose a privacy policy, but the FTC's position
is that if a company provides a privacy policy, it must comply with it.
 FTC regards it as a violation of the Act for a company to retroactively change its privacy
policy without providing data subjects an opportunity to opt out of the new privacy
practice
 FTC does attempt to pursue privacy issues
Summary
 EU setting agenda through GDPR (for general data collection and processing) and ePrivacy (for
communications data) regulations
 US - no general privacy legal framework, many sectoral laws, general impression that industry
efforts to address privacy through self-regulation have been too slow, and have failed to
provide adequate and meaningful protection
 But the FTC is reasonably activist and has more power than Privacy Commissioners in Canada to
impose penalties
 In Canada, future updates are likely coming (to bring legislation into alignment with the EU)

Ethics for Intelligent machines - Regina Rini - YorkU Philosophy


 Ex: a patient is in so much pain and the doctor wants to help, but the patient cannot ask for anything. Two options:
o Allow the patient's condition to take its natural course
o Give painkillers to reduce the pain
o Consequentialism: judge by the consequences of your actions (choose the best outcome available)
 You can't think far ahead to all the future effects of your actions
 How far into the future do you have to know the consequences of your actions?
 Gives you a method/solution for solving moral problems
 How do you establish value?
o Deontology: what matters is not just the consequences, but the nature of the action itself
 Some types of actions are wrong even if they lead to the best outcome
 Killing (the action) is always wrong, regardless of consequences
 Could you rationally live in a world where people lie to each other or kill people?
o Virtue Theory:
 Focuses not on the outcome and not on the action, but instead on the person who does it
 Would a good person do this?
 What type of person has good character traits - benevolent, merciful?
 But would this kind of person be able to end the life of the patient (example above)?
o Relationship theory:
 Focuses on the social relationships between the acting party, the person affected, and
society as a whole
 Overall social roles, not just one individual

 If the machine does the decision-making


o Consequentialism
 Works well because the actions are well defined by the algorithm (see the sketch at the
end of this section)
 But what if it kills one person to save 5 people - to the machine, is the best outcome 1
or 5?
 A very problematic situation
o Deontology
 Encode the rules as a list - never kill people, etc. (relatively easier to implement)
 Hard to make it think like a person
o Virtue Theory
 Doesn't make sense for a machine
o Relationship theory
 Doesn't make sense for a robot because it does not have social roles
 Who decides (Computer-as-proxy model)?
o The robot can be used as a tool, a proxy to help the surgeon, with the surgeon directing the robot
o Autonomous cars
 Programmer decides -> paternalism challenge
 Your decisions override the decisions of others
 Consequentialism
 Deontology
 Paternalism challenge
 Consumer decides -> implementation challenge
 Consequentialism: doesn't matter who decides
 Emulation model -> progress challenge - relying upon people depends on the kinds of people
(locks in a potentially bad option based on the society)
 Have the AI emulate the best examples or data sets
 Virtue theory
 Relationship theory
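
A minimal sketch of the contrast drawn above (the actions, harm numbers, and rule labels are invented, not any real autonomous-vehicle logic): a consequentialist module simply picks the action with the lowest harm score, while a deontological module first removes any action that breaks a listed rule, regardless of its score.

```python
# Hypothetical sketch: two toy decision modules for a machine choosing among
# candidate actions. The actions and numbers are invented for illustration.

actions = {
    "swerve_left": {"people_harmed": 1, "violates_rule": "harms a bystander"},
    "stay_course": {"people_harmed": 5, "violates_rule": None},
}

def consequentialist_choice(options):
    """Pick whichever action minimizes total harm (the best available outcome)."""
    return min(options, key=lambda a: options[a]["people_harmed"])

def deontological_choice(options, forbidden=("harms a bystander",)):
    """Discard actions that break a rule, then choose among what remains."""
    permitted = {a: v for a, v in options.items()
                 if v["violates_rule"] not in forbidden}
    if not permitted:
        return None  # every option breaks a rule: the rule list gives no answer
    return min(permitted, key=lambda a: permitted[a]["people_harmed"])

print(consequentialist_choice(actions))  # swerve_left  (1 harmed beats 5 harmed)
print(deontological_choice(actions))     # stay_course  (swerving breaks a rule)
```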
Short Paper 1 Notes
 Classify automatic, semi-autonomous, and autonomous systems
 Responsibilities of stakeholders
 Stakeholders -> software agent
o Slow down development due to fear of liability
o Learning autonomy

Short paper 2 notes


What is a policy vacuum, what is a black box, and what are the concerns about black-box machines?
Additional reference explaining what black-box ML systems are
2nd additional reference on the issues of black-box ML systems

Moor's 1985 paper discussion


Citation: J. H. Moor, "What Is Computer Ethics?", Metaphilosophy, vol. 16, no. 4, pp. 266-275, 1985
