Dasts16 a koene_un_bias

  1. User agency on social networks that are mediated by algorithms Ansgar Koene HORIZON Digital Economy Research, University of Nottingham
  2. User experience satisfaction on social network sites
  3. Human attention is a limited resource. Filter
  4. Information services, e.g. internet search, news feeds etc.
     • free-to-use => no competition on price
     • lots of results => no competition on quantity
     • Competition on quality of service
     • Quality = relevance = appropriate filtering
     Good information service = good filtering
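The "quality = relevance = appropriate filtering" idea on slide 4 can be sketched in a few lines: an information service is, at its core, a ranking function over candidate results. The scoring function and data below are invented for illustration; no real search engine scores this crudely.

```python
# A minimal sketch of "good information service = good filtering":
# rank candidate results by a toy relevance score (query-term overlap).
# All data and function names here are illustrative, not from the talk.

def relevance(query, document):
    """Fraction of query terms that appear in the document."""
    q_terms = set(query.lower().split())
    d_terms = set(document.lower().split())
    return len(q_terms & d_terms) / len(q_terms)

def filter_results(query, documents, top_k=2):
    """Return the top_k most relevant documents, best first."""
    ranked = sorted(documents, key=lambda d: relevance(query, d), reverse=True)
    return ranked[:top_k]

results = filter_results(
    "algorithmic bias",
    ["news about sports", "report on algorithmic bias", "bias in hiring algorithms"],
)
```

Note that the user never sees the discarded results, which is exactly the loss of agency the talk goes on to discuss.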
  5. Sacrificing control for Convenience
  6. Sacrificing control for Convenience
  7. Personalized recommendations
     • Content based – similarity to past results the user liked
     • Collaborative – results that similar users liked (people with statistically similar tastes/interests)
     • Community based – results that people in the same social network liked (people who are linked on a social network e.g. ‘friends’)
  8. How do the algorithms work?
  9. User understanding of social media algorithms More than 60% of Facebook users are entirely unaware of any algorithmic curation of their news feed: “They believed every single story from their friends and followed pages appeared in their news feed”. Published at: CHI 2015
  10. Revealing News Feed behaviour
  11. Participants indicate desired changes
  12. Information filtering, or ranking, implicitly manipulates choice behaviour. Many online information services are ‘free-to-use’: the service is paid for by advertising revenue, not by users directly → potential conflict of interest: promote advertisements vs. match user interests. Advertising inherently tries to manipulate consumer behaviour. Personalized filtering can also be used for political spin, propaganda etc. Manipulation: conflict of interest
  13. Trending Topics controversy
  14. Q&A with N. Lundblad (Google) Nicklas Lundblad, Head of EMEA Public Policy and Government Relations at Google: Human attention is the limited resource that services need to compete for. As long as competing platforms exist, loss of agency due to algorithms deciding what to show to users is not an issue; users can switch to another platform.
  15. UnBias: Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy
     WP1: ‘Youth Juries’ workshops with 13-17 year olds to co-produce citizen education materials on properties of information filtering/recommendation algorithms;
     WP2: co-design workshops, hackathons and double-blind testing to produce user-friendly open source tools for benchmarking and visualizing biases in the algorithms;
     WP3: design requirements for algorithms that satisfy subjective criteria of bias avoidance, based on interviews and observation of users’ sense-making behaviour;
     WP4: policy briefs for an information and education governance framework for social media usage, developed through broad stakeholder focus groups with representatives of government, industry, third-sector organizations, educators, lay-people and young people (a.k.a. “digital natives”).
  16. Thank you for your attention
  17. It’s based on data so it must be true “More data, not better models” Belief that the ‘law of large numbers’ means Big Data methods do not need to worry about model quality or sampling bias as long as enough data is used. “More data” is the key to deep-learning success compared to previous AI approaches.
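Slide 17's fallacy can be checked with a quick simulation: the law of large numbers makes a biased estimate converge, but to the wrong value. The population, the 3:1 sampling rule, and all numbers below are invented for illustration.

```python
# Sketch of why "more data" cannot fix sampling bias: with a biased
# sampling rule, the estimate converges as n grows, but to the wrong value.
import random

random.seed(0)
population = [0] * 500 + [1] * 500   # true population mean = 0.5

def biased_sample(n):
    """Biased sampling rule that picks 1s three times as often as 0s."""
    weights = [3 if x == 1 else 1 for x in population]
    return random.choices(population, weights=weights, k=n)

for n in (100, 10_000, 1_000_000):
    est = sum(biased_sample(n)) / n
    print(n, round(est, 3))   # settles near 0.75, not 0.5, as n grows
```

More samples only sharpen the estimate around the biased value of 0.75; no quantity of data recovers the true mean of 0.5 unless the sampling rule itself is fixed.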
  18. Garbage in -> garbage out: perpetuating the status quo. ProPublica: “Machine Bias”
  19. ‘equal opportunity by design’ “Big Data: A Report on Algorithmic Systems, Opportunities, and Civil Rights”, a White House report focused on the problem of avoiding discriminatory outcomes. To avoid exacerbating biases by encoding them into technological systems, the report proposes a principle of ‘equal opportunity by design’: designing data systems that promote fairness and safeguard against discrimination from the first step of the engineering process and continuing throughout their lifespan.