SFScon 22 - Claudia Falchi Delgado - The Impact of Sex and Gender Biases in Artificial Intelligence .pdf

Preventing biases and fostering digital literacy with the "no-code movement"

What is the impact of sex and gender differences on Artificial Intelligence (AI) used in biomedicine and healthcare? How can AI biases threaten our health, and how can well-designed AI mitigate inequalities? According to Dr. Antonella Santuccione Chadha, founder of the Women’s Brain Project, the ambitious goals set by precision medicine will be achieved by using the latest advances in AI to identify the role of inter-individual differences. Yet we need to consider, more than ever, that our decisions significantly impact the future treatment of patients and society. Despite scientific progress, most biomedical solutions do not account for bias detection, whether the bias is desirable or undesirable. Furthermore, most algorithms neglect sex and gender and their relevance to individual variation in health and illness. Failure to account for these variations could result in errors and discriminatory consequences (Cirillo et al., 2020).
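To make "bias detection" concrete, here is a minimal sketch, assuming synthetic placeholder data and invented feature names rather than anything from the talk: it trains a simple classifier that ignores sex and then compares its sensitivity separately for female and male records, the kind of per-group check most biomedical solutions skip.

```python
# A minimal sketch of bias detection: compare a model's error rates by sex.
# The dataset, features, and imbalance here are synthetic placeholders.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
sex = rng.choice(["F", "M"], size=n, p=[0.25, 0.75])       # under-represented female group
severity = rng.normal(0.0, 1.0, n)
# Simulated ground truth: disease risk depends on severity and, partly, on sex.
risk = 1 / (1 + np.exp(-(severity + np.where(sex == "F", 0.8, 0.0))))
disease = rng.random(n) < risk

X = pd.DataFrame({"severity": severity})                    # note: sex is NOT a feature
y = pd.Series(disease, name="disease")
X_tr, X_te, y_tr, y_te, sex_tr, sex_te = train_test_split(
    X, y, pd.Series(sex), test_size=0.3, random_state=0
)

model = LogisticRegression().fit(X_tr, y_tr)
pred = model.predict(X_te)

# Per-group sensitivity (true positive rate): a large gap signals a bias problem.
for group in ["F", "M"]:
    mask = (sex_te == group).values & y_te.values
    tpr = pred[mask].mean() if mask.any() else float("nan")
    print(f"sensitivity for sex={group}: {tpr:.2f}")
```

A large sensitivity gap between the two groups is exactly the sort of error and discriminatory consequence described above.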

A solution to this issue would be the “no-code movement,” a type of software development that allows non-programmers, or people with stronger creative than technical skills, to build immersive online interactions, such as 2D (Web 2.0) and 3D (Web 3.0) applications, and to create software through a user-friendly graphical interface instead of by writing complex code. The no-code movement rests on the fundamental belief that technology should enable and facilitate creation and understanding, not act as a barrier to entry. In this regard, user experience (UX) approaches can be used to make it inclusive, adaptable, and customized so that every sex and gender variation is represented.

Historical sex and gender inequities and prejudices can infect health research and practice (Cirillo et al., 2020). Developing AI-based precision systems in healthcare should thus enable the differentiation of disease vulnerabilities and treatment responses among individuals while avoiding discriminatory biases. Furthermore, the no-code movement supports visualizations, logical statements, and dimensionality reduction techniques, which can be implemented to achieve interpretability (Cirillo et al., 2020). Developing and applying fair approaches is therefore critical for implementing unbiased and interpretable models for everyone.
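As one hypothetical illustration of dimensionality reduction used for interpretability, the sketch below, built on synthetic clinical variables invented for this example, projects the data onto two principal components and colours the points by recorded sex, so an unbalanced or skewed sample becomes visible at a glance.

```python
# A minimal sketch: PCA as an interpretability aid for spotting
# unbalanced sex representation in a clinical dataset.
# The dataset and column names are synthetic placeholders.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "age": rng.normal(55, 12, n),
    "systolic_bp": rng.normal(130, 15, n),
    "cholesterol": rng.normal(200, 30, n),
    "bmi": rng.normal(27, 4, n),
    # Deliberately unbalanced: roughly 80% of the records are labelled "M".
    "sex": rng.choice(["M", "F"], size=n, p=[0.8, 0.2]),
})

features = ["age", "systolic_bp", "cholesterol", "bmi"]
coords = PCA(n_components=2).fit_transform(
    StandardScaler().fit_transform(df[features])
)

# Plot the 2D projection coloured by sex; the imbalance is immediately visible.
for label, marker in [("M", "o"), ("F", "^")]:
    mask = (df["sex"] == label).to_numpy()
    plt.scatter(coords[mask, 0], coords[mask, 1], marker=marker, alpha=0.6, label=label)
plt.xlabel("PC1")
plt.ylabel("PC2")
plt.legend(title="sex")
plt.title("PCA projection of synthetic clinical data")
plt.show()
```

This is the kind of view a no-code tool could expose directly in its visual interface, without requiring the user to write the projection code themselves.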

The no-code approach also saves money, since it requires less upkeep and a smaller upfront investment. The budget that would otherwise go to coding could be better spent on well-designed research covering all sexes and genders, and on the post-trial work that is essential to scientific advancement (Rich-Edwards et al., 2018).
Furthermore, it allows for decentralized autonomy and innovation. By decentralizing development, that is, by removing the barriers that keep healthcare users and professionals from experimenting and testing, no-code platforms encourage autonomy and innovation. They also allow a system to be understood inside and out, avoiding the “black box” effect. (...)
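One way to look "inside" a model is post-hoc explanation; the following sketch uses permutation importance (a standard scikit-learn facility, applied here to synthetic data and invented feature names) to measure how strongly a trained classifier leans on each input, including sex.

```python
# A minimal sketch of opening the "black box": permutation importance reveals
# how much a trained model relies on each input feature, including sex.
# All data and feature names are synthetic placeholders.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 1500
df = pd.DataFrame({
    "age": rng.normal(60, 10, n),
    "biomarker": rng.normal(1.0, 0.3, n),
    "sex": rng.integers(0, 2, n),          # 0 = female, 1 = male (encoded)
})
# Simulated outcome that depends on the biomarker and, undesirably, on sex alone.
logit = 2.0 * (df["biomarker"] - 1.0) + 1.5 * df["sex"]
df["outcome"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(
    df[["age", "biomarker", "sex"]], df["outcome"], test_size=0.3, random_state=0
)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Permutation importance: shuffle one column at a time and measure the drop in
# accuracy; a large drop for "sex" flags heavy reliance on that feature.
result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for name, score in zip(X_te.columns, result.importances_mean):
    print(f"{name:10s} importance: {score:.3f}")
```

If the "sex" column shows an outsized importance score, the model's behaviour depends on sex in ways that should be examined and either justified (a desirable bias) or mitigated (an undesirable one).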

  1. The Impact of Sex and Gender Biases in Artificial Intelligence, by Claudia Falchi Delgado
  2. The Women's Brain Project (https://www.womensbrainproject.com/): our mission is to bring precision to medical research, towards sustainability in healthcare. DEXAI - Artificial Ethics (https://www.dexai.eu/): designing ethical, compliant and explainable Artificial Intelligence.
  3. Guess what? We are different.
  4. Sex AND gender differences: inter-individual differences in health due to biological and genetic factors (sex) and socio-cultural factors (gender).
  5. Evidence of sex and gender differences has been reported in chronic diseases such as diabetes, cardiovascular disorders, neurological diseases, mental health disorders, cancer, and autoimmunity, and in physiological processes such as brain aging and sensitivity to pain. Moreover, there are differences in lifestyle factors associated with sex and gender, such as diet, physical activity, tobacco use, and alcohol consumption (Cirillo et al., 2020).
  6. Consequences: open questions remain regarding health differences across the gender spectrum, due to the scarcity of studies dedicated to intersex, transgender, and nonbinary individuals. Observed sex and gender differences in health and well-being are influenced by complex links between biological and socio-economic factors, which are often surrounded by confounding variables such as stigma, stereotypes, and the misrepresentation of data. Consequently, health research and practices can be entangled with sex and gender inequalities and biases (Cirillo et al., 2020, p. 2).
  7. Artificial Intelligence
  8. Artificial intelligence as a double-edged sword: (1) AI can magnify and perpetuate existing sex and gender inequalities; (2) if appropriately designed, it can mitigate inequalities by effectively integrating sex and gender differences in healthcare.
  9. Artificial Intelligence Ethics?
  10. Biases: (01) desirable bias and (02) undesirable bias. The difference between them is found in the impact that these biases have on patients’ wellbeing and healthcare access.
  11. (Cirillo et al., 2020, p. 3)
  12. Sources and types of health data: experimental and clinical data; digital biomarkers.
  13. Technologies for the analysis and development of health data: Big Data analytics, Natural Language Processing, Robotics.
  14. Big Data analytics: common Big Data analytics processes and approaches include the creation of data management infrastructures and the application of data-driven algorithms and AI solutions. Sex differences have been observed in behavioral and social patterns related to communication, such as the number and duration of phone calls and the degree of social networking among callers. Regarding associations between mental health and social networks, men express higher negativity and a lower desire for social support on social media than women. Findability, Accessibility, Interoperability, and Reusability (FAIR) recommendations support responsible research and gender equality, and will facilitate the identification of sex and gender differences in health, accurate indicators for prevention and diagnosis, and effective treatment (Cirillo et al., 2020).
  15. Natural Language Processing (NLP): NLP consists of computational systems that understand and manipulate written and spoken human language for purposes like machine translation, speech recognition, and conversational interfaces. These systems allow predictions that can contribute to clinical decisions, such as diagnosis, prognosis, risk of relapse, and fluctuations in symptoms in response to treatments. A flourishing area of NLP is that of medical chatbots, which aim to improve users’ wellbeing through real-time symptom assessment and recommendation interfaces. Medical chatbots include Woebot, shown to relieve anxiety and depression, and Moodkit, which recommends chatting and journaling activities through text and voice notes (Cirillo et al., 2020).
  16. Robotics: robots are expected to provide personalised assistance to patients according to their specific needs and preferences, at the right time and in the right way, with applications in neurology, rehabilitation, and assistive approaches for improving the quality of life of patients and caregivers. It has been demonstrated that the outcome of a humanoid robot’s task can be affected by its gender, as in the case of female charity robots receiving more donations from men than from women. Awareness of sex and gender differences in patients and robots could lead to better healthcare assistance and effective human-machine interactions for biomedical applications, as well as a better translation of ethical decision-making into machines (Cirillo et al., 2020).
  17. The Digital Divide: across LMICs overall, women are 8% less likely than men to own a mobile phone (Cirillo et al., 2020, p. 7).
  18. Explainable Artificial Intelligence (XAI): in the context of Precision Medicine, the expected outputs of AI models are predictions of risk, diagnoses of medical conditions, or recommendations of treatments, with a profound influence on people’s lives and health. XAI would enable us to find potentially mistaken conclusions derived from training an algorithm with misrepresented data, facilitating the identification of the undesirable biases generally found in clinical data with unbalanced sex and gender representation. Explaining the decisional processes will also help discover sex and gender differences in clinical data that is representative, promoting the desired biases for personalised preventative and therapeutic interventions. XAI is also useful in basic research, for instance in efforts to create “visible” deep neural networks that provide automatic explanations of the impact of a genotypic change on cellular phenotypic states.
  19. A no-code-movement approach? Benefits: the development and application of FAIR approaches will be critical for the implementation of unbiased and interpretable models for Precision Medicine. The “no-code movement” allows non-programmers to create software using a user-friendly graphical interface instead of writing complex code. Technology should enable and facilitate creation and understanding, not be a barrier to entry. In this regard, user experience (UX) approaches can be taken to make it inclusive, adaptable, and customized so that every sex and gender variation is represented. Visualizations, logical statements, and dimensionality reduction techniques can be implemented in computational tools to achieve interpretability for everyone (patients, researchers, and health professionals).
  20. Recommendations: recent developments in bias detection and mitigation include adopting re-sampling, adversarial learning, and open-source toolkits such as IBM AI Fairness 360 (AIF360); a minimal re-sampling sketch follows this slide list. (1) Distinguish between desirable and undesirable biases and guarantee the representation of desirable biases in AI development. (2) Increase awareness of unintended biases in the scientific community, the technology industry, among policy makers, and in the general public. (3) Implement explainable algorithms, which not only provide understandable explanations for the layperson, but could also be equipped with integrated bias detection systems and mitigation strategies, and be validated with appropriate benchmarking. (4) Utilize the no-code-movement approach to visualizations, logical statements, and dimensionality reduction techniques to achieve interpretability for everyone (patients and health professionals). (5) Incorporate key ethical considerations during every stage of technological development, ensuring that the systems maximize the well-being and health of the population.
  21. Thank You. Reference: Cirillo, D., Catuara-Solarz, S., Morey, C., Guney, E., Subirats, L., Mellino, S., Gigante, A., Valencia, A., Rementeria, M. J., Chadha, A. S., & Mavridis, N. (2020). Sex and gender differences and biases in artificial intelligence for biomedicine and healthcare. npj Digital Medicine, 3(1). https://doi.org/10.1038/s41746-020-0288-5
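As a companion to the re-sampling recommendation on slide 20, here is a minimal sketch; it uses plain pandas and scikit-learn on synthetic data, not the AIF360 toolkit named in the slide, and every column name is a placeholder. It oversamples the under-represented sex before training, then reports a simple ratio of positive prediction rates between the groups as a rough disparate-impact check.

```python
# A minimal sketch of bias mitigation by re-sampling: oversample the
# under-represented sex so both groups carry equal weight in training,
# then compare positive-prediction rates across groups.
# All data here is synthetic and for illustration only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 3000
df = pd.DataFrame({
    "sex": rng.choice(["F", "M"], size=n, p=[0.2, 0.8]),   # skewed sample
    "marker": rng.normal(0.0, 1.0, n),
})
df["treated"] = rng.random(n) < 1 / (1 + np.exp(-(df["marker"] + (df["sex"] == "F") * 0.7)))

# Re-sample: draw the same number of rows from each sex group (with replacement).
target = df["sex"].value_counts().max()
balanced = pd.concat([
    g.sample(target, replace=True, random_state=0) for _, g in df.groupby("sex")
]).reset_index(drop=True)

features = pd.get_dummies(balanced[["marker", "sex"]], columns=["sex"], drop_first=True)
model = LogisticRegression().fit(features, balanced["treated"])

# Rough disparate-impact check on the original, unbalanced data: the ratio of
# positive prediction rates between groups (values far from 1 warrant scrutiny).
X_all = pd.get_dummies(df[["marker", "sex"]], columns=["sex"], drop_first=True)
X_all = X_all.reindex(columns=features.columns, fill_value=0)
pred = model.predict(X_all)
rate = {g: pred[(df["sex"] == g).to_numpy()].mean() for g in ["F", "M"]}
print(f"positive rate F: {rate['F']:.2f}, M: {rate['M']:.2f}, ratio: {rate['F'] / rate['M']:.2f}")
```

Re-sampling is only one of the mitigation options the slide lists; adversarial learning and dedicated toolkits such as AIF360 offer more principled machinery, but the balancing step above conveys the core idea.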

×