1. Data at the centre of a complex world
Kate Carruthers
Version 1.0
May 2018
Classification: PUBLIC
2. May 18 Kate Carruthers | UNSW 1
Slides are available
https://www.slideshare.net/carruthk
3. 5 Propositions about data
1. Data is not neutral
2. There is no such thing as raw data
3. The signal to noise ratio has changed
4. Data is not inherently smart
5. The more data we have, the less anonymity we have
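Proposition 5 can be made concrete with a toy k-anonymity check (the records, attribute names and values below are invented for illustration): each attribute added to a dataset shrinks the group of people who share the same values, until individuals become unique.

```python
from collections import Counter

# Invented toy records: four residents of the same postcode.
records = [
    {"postcode": "2052", "age": 34, "sex": "F"},
    {"postcode": "2052", "age": 34, "sex": "M"},
    {"postcode": "2052", "age": 41, "sex": "F"},
    {"postcode": "2052", "age": 41, "sex": "M"},
]

def smallest_group(rows, keys):
    """Size of the smallest group sharing the same values for `keys`
    (the k in k-anonymity); k == 1 means someone is uniquely identifiable."""
    groups = Counter(tuple(r[k] for k in keys) for r in rows)
    return min(groups.values())

print(smallest_group(records, ["postcode"]))                # 4: everyone blends in
print(smallest_group(records, ["postcode", "age"]))         # 2
print(smallest_group(records, ["postcode", "age", "sex"]))  # 1: fully identifiable
```

With only a postcode every record is indistinguishable from three others; add age and sex and every person is unique — more data, less anonymity.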
4. 2 Propositions about data ethics
1. Technology has no ethics. People demonstrate ethics.
2. Technology inherits the biases of its makers – therefore we need diversity and formal mechanisms to reduce bias.
5. The world’s most valuable resource is no longer oil, but data
Source: https://www.economist.com/news/leaders/21721656-data-economy-demands-new-approach-antitrust-rules-worlds-most-valuable-resource
6. Source: https://www.economist.com/news/briefing/21721634-how-it-shaping-up-data-giving-rise-new-economy
11. Source: http://beyondplm.com/2014/07/22/plm-implementations-nuts-and-bolts-of-data-silos/
12. Source: http://www.matricis.com/en/integration-solutions/data-integration/
13. “Complexity is a defining feature of the digital era, & we are not adjusting our governance structures to manage it.”
Kent Aiken, Prime Minister’s Fellow, Public Policy Forum Canada, 2017
14. “Privacy is an inherent human right, and a requirement for maintaining the human condition with dignity and respect.”
– Bruce Schneier
Source: https://www.schneier.com/essays/archives/2006/05/the_eternal_value_of.html
15. Source: https://www.isoqsltd.com/quick-guide-complying-gdpr/
16. Ethics
Moral principles that govern a person’s behaviour or the way in which they conduct an activity…
“We ask ethical questions whenever we think about how we should act. Being ethical is a part of what defines us as human beings.”
The Ethics Centre, Sydney
17. Areas of focus
• Ethics of data - how we generate, record & share data
• Ethics of algorithms - how we interpret data via artificial intelligence, machine learning and robots
• Ethics of practices - devising responsible innovation and professional codes to guide this emerging science
Source: What is data ethics? Luciano Floridi, Mariarosaria Taddeo. Phil. Trans. R. Soc. A 2016, 374: 20160360; DOI: 10.1098/rsta.2016.0360. Published 14 November 2016
18. The essentials
• Privacy
• Cyber Security
• Ethics
• Data Governance
19. To protect our data we need to understand it.
20. Third-party risk
Source: https://www.arnnet.com.au/article/609309/australian-red-cross-launches-investigation-after-massive-data-cock-up/
21. Risk management
Source: http://www.news.com.au/technology/online/australias-2016-census-had-significant-and-obvious-oversights-report-finds/news-story/6edcf8f897b2361965bd72683ee6edbe
22. Poor data practices
Source: http://thehackernews.com/2017/07/sweden-data-breach.html
23. “Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach”
Source: https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
24. May 2017: Cambridge Analytica: Microsoft’s Exploitative Ad-Tech
Trump & Brexit’s infamous data firm was allegedly grown in the Microsoft-funded advertising research labs at Cambridge University.
Source: https://medium.com/textifire/cambridge-analytica-microsofts-exploitative-ad-tech-c2db8633f542
26. Privacy by Design
1. Proactive not Reactive; Preventative not Remedial
2. Privacy as the Default Setting
3. Privacy Embedded into Design
4. Full Functionality – Positive-Sum, not Zero-Sum
5. End-to-End Security – Full Lifecycle Protection
6. Visibility and Transparency – Keep it Open
7. Respect for User Privacy – Keep it User-Centric
Source: https://www.ipc.on.ca/wp-content/uploads/Resources/7foundationalprinciples.pdf
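Principle 2, “Privacy as the Default Setting”, translates directly into code. A minimal sketch (the class and field names are hypothetical): every data-sharing option is opt-in, so a user who takes no action gets maximum privacy.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical account settings: every sharing feature defaults to off."""
    profile_public: bool = False       # visibility is opt-in
    share_with_partners: bool = False  # third-party sharing is opt-in
    analytics_tracking: bool = False   # behavioural tracking is opt-in

# A user who never opens the settings page stays private by default.
untouched = PrivacySettings()
print(any(vars(untouched).values()))  # False: nothing is shared
```

The design point is that privacy requires no action by the individual; sharing anything is a deliberate choice.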
27. OWASP Security by Design Principles
1. Minimize attack surface area
2. Establish secure defaults
3. Principle of least privilege
4. Principle of defence in depth
5. Fail securely
6. Don’t trust services
7. Ensure separation of duties
8. Always avoid security by obscurity
9. Keep security simple
10. Fix security issues correctly
Open Web Application Security Project https://www.owasp.org/index.php/Security_by_Design_Principles
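Two of these principles, “establish secure defaults” and “fail securely”, can be sketched in a few lines (the permission store and all names are invented): an authorisation check that answers “deny” both for unknown users and when the lookup itself fails.

```python
PERMISSIONS = {"alice": {"read"}}  # invented in-memory permission store

def is_allowed(user: str, action: str) -> bool:
    """Secure default is deny; any failure in the lookup also denies."""
    try:
        return action in PERMISSIONS[user]  # KeyError for unknown users
    except Exception:
        return False  # fail securely: an error never grants access

print(is_allowed("alice", "read"))    # True
print(is_allowed("alice", "write"))   # False: not granted, so denied
print(is_allowed("mallory", "read"))  # False: unknown user, denied by default
```

The insecure anti-pattern is the mirror image: returning True on error, or maintaining a deny-list so that anyone not explicitly blocked gets access.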
28. IEEE-CS/ACM Software Engineering Code of Ethics & Professional Practice
Software engineers shall commit themselves to making the analysis, specification, design, development, testing and maintenance of software a beneficial and respected profession. In accordance with their commitment to the health, safety and welfare of the public, software engineers shall adhere to the following Eight Principles:
• Public: Software engineers shall act consistently with the public interest.
• Client and Employer: Software engineers shall act in a manner that is in the best interests of their client and employer, consistent with the public interest.
• Product: Software engineers shall ensure that their products and related modifications meet the highest professional standards possible.
• Judgement: Software engineers shall maintain integrity and independence in their professional judgment.
• Management: Software engineering managers and leaders shall subscribe to and promote an ethical approach to the management of software development and maintenance.
• Profession: Software engineers shall advance the integrity and reputation of the profession consistent with the public interest.
• Colleagues: Software engineers shall be fair to and supportive of their colleagues.
• Self: Software engineers shall participate in lifelong learning regarding the practice of their profession and shall promote an ethical approach to the practice of the profession.
https://www.computer.org/web/education/code-of-ethics
29. ACM Code of Ethics
As an ACM member I will
1. Contribute to society and human well-being.
2. Avoid harm to others.
3. Be honest and trustworthy.
4. Be fair and take action not to discriminate.
5. Honor property rights including copyrights and patents.
6. Give proper credit for intellectual property.
7. Respect the privacy of others.
8. Honor confidentiality.
From the ACM Code of Ethics http://www.acm.org/about/code-of-ethics
30. Issues with Codes, Guidelines
• Often nobody knows they exist
• Even if they do, nobody follows them
• If they are followed, compliance can degenerate into mere work-to-rule
• It’s hard to articulate all ethical requirements in a code
• Needs strong institutional support
31. Some new approaches
• FATML – Fairness, Accountability and Transparency in Machine Learning
• IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems
• Algorithmic Justice League
• Education in digital ethics
32. http://www.fatml.org/
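One concrete check discussed in the FATML community is demographic parity: comparing the rate of positive decisions a model gives each group. A minimal sketch, with invented decisions and groups:

```python
def positive_rate(decisions):
    """Fraction of positive (1) outcomes in a list of model decisions."""
    return sum(decisions) / len(decisions)

# Invented model decisions (1 = approved, 0 = rejected) for two groups.
group_a = [1, 1, 0, 1]  # 75% approved
group_b = [0, 1, 0, 0]  # 25% approved

# Demographic parity difference: 0.0 would mean equal approval rates.
gap = abs(positive_rate(group_a) - positive_rate(group_b))
print(gap)  # 0.5: a large gap that warrants investigation
```

Parity of outcomes is only one of several competing fairness definitions; which one applies is itself an ethical judgement, not a purely technical one.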
33. https://standards.ieee.org/develop/indconn/ec/autonomous_systems.html
35. We cannot leave ethics to volunteer groups!
36. Source: https://www.theverge.com/2016/3/24/11297050/tay-microsoft-chatbot-racist
37. Source: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
38. House of Lords Artificial Intelligence Committee 2017
The Lords’ report proposes five main principles for an AI code:
1. Artificial intelligence should be developed for the common good and benefit of humanity
2. Artificial intelligence should operate on principles of intelligibility and fairness
3. Artificial intelligence should not be used to diminish the data rights or privacy of individuals, families or communities
4. All citizens have the right to be educated to enable them to flourish mentally, emotionally and economically alongside artificial intelligence
5. The autonomous power to hurt, destroy or deceive human beings should never be vested in artificial intelligence
Source: Report of Session 2017-19 – published 16 April 2018 – HL Paper 100
https://publications.parliament.uk/pa/ld201719/ldselect/ldai/100/10002.htm
39. Australian efforts
• The Autonomy, Agency and Assurance Innovation Institute (3A Institute) brings together leading researchers from around the world and a range of disciplines to build a new applied science around the management of artificial intelligence, data and technology, and their impact on humanity.
• The Australian National University (ANU) launched the 3A Institute in September 2017 in collaboration with CSIRO’s Data61.
• It is charged with creating a curriculum for training certified AI practitioners by 2022, as well as researching and informing policy and public understanding of AI technologies.
https://cecs.anu.edu.au/3a-institute
40. “Public agencies urgently need a practical framework to assess automated decision systems and to ensure public accountability.”
Source: Algorithmic Impact Assessments: A Practical Framework For Public Agency Accountability, Dillon Reisman, Jason Schultz, Kate Crawford, Meredith Whittaker, April 2018, https://ainowinstitute.org/aiareport2018.pdf
41. Source: Algorithmic Impact Assessments: A Practical Framework For Public Agency Accountability, Dillon Reisman, Jason Schultz, Kate Crawford, Meredith Whittaker, April 2018, https://ainowinstitute.org/aiareport2018.pdf
42. Digital Ethics?
“A few guidelines are useful in most situations:
Use the golden rule: ask yourself how you would like to be treated as a human being, citizen or customer.
There are always unintended consequences: embrace new positive uses of technology, and block undesirable uses.
Success usually comes from exercising discipline and self-restraint in using technology, rather than pushing the limits.”
Source: Goasduff, C. L. (2016, March 07). Kick-Start the Conversation on Digital Ethics. Retrieved August 15, 2017, from http://www.gartner.com/smarterwithgartner/kick-start-the-conversation-on-digital-ethics/
43. We need formal ways to consider the ethical implications of data usage
44. Slides are available
https://www.slideshare.net/carruthk
45. Resources
• AI Now Institute at New York University
• Georgetown University, Kennedy Institute of Ethics, Ethics Lab
• Causeit Data Ethics
• The BIG Data Ethics Cheat Sheet, Hacker Noon
• Digital Ethics Lab, Oxford Internet Institute, University of Oxford
• Guidelines on Ethical Research, British Sociological Association
• What is data ethics? Luciano Floridi, Mariarosaria Taddeo. Phil. Trans. R. Soc. A 2016, 374: 20160360; DOI: 10.1098/rsta.2016.0360. Published 14 November 2016
• Digital Enlightenment Forum: Digital Ethics. Workshop Report. (2016, March 1). Retrieved August 16, 2017, from https://digitalenlightenment.org/sites/default/files/users/14/Digital%20Ethics%20Workshop%20Report%20v2.pdf
• A deep study on the concept of digital ethics. Maggiolini, Piercarlo. (2014). Revista de Administração de Empresas, 54(5), 585-591. https://dx.doi.org/10.1590/S0034-759020140511
• Algorithmic Impact Assessments: A Practical Framework For Public Agency Accountability, Dillon Reisman, Jason Schultz, Kate Crawford, Meredith Whittaker, April 2018, https://ainowinstitute.org/aiareport2018.pdf
46. Thank you
Kate Carruthers
k.carruthers@unsw.edu.au