Gender inequality and technology: between women and algorithms

Authors

  • Luiza Berger von Ende, Universidade Federal de Santa Maria
  • Rafael Santos de Oliveira, Universidade Federal de Santa Maria

Keywords:

algorithms, fundamental rights, gender discrimination, equality, women

Abstract

The use of algorithms is already ubiquitous in everyday life, allowing large volumes of data to be processed quickly. Although they may appear impartial, algorithms can produce discriminatory outcomes against parts of society, and this research aims to identify how this can occur in relation to women. The deductive approach method was adopted, and the functionalist and monographic methods were employed as methods of procedure. It was found that technology is neither neutral nor free of limitations, and that these shortcomings especially harm groups and individuals who are already marginalized and can negatively affect women's lives.
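The kind of bias the abstract refers to can be made concrete with word embeddings, the setting studied by Bolukbasi et al. (2016) in the reference list below. The sketch that follows is purely illustrative and is not part of the article's method; it assumes the gensim library and its downloadable GloVe vectors, and asks a pretrained model to complete the analogy "man is to programmer as woman is to ...", a probe on which biased embeddings typically return stereotyped occupations.

```python
# Illustrative probe of gender bias in pretrained word embeddings,
# in the spirit of Bolukbasi et al. (2016), cited in the references.
# Assumes gensim and its downloadable GloVe vectors; these are
# illustrative choices, not tools used by the article's authors.
import gensim.downloader as api

# Load a small pretrained embedding (downloaded on first run).
model = api.load("glove-wiki-gigaword-100")

# Analogy query: "man" is to "programmer" as "woman" is to ... ?
# Stereotyped completions in the top results indicate encoded bias.
for word, score in model.most_similar(positive=["woman", "programmer"],
                                      negative=["man"], topn=5):
    print(f"{word}: {score:.3f}")
```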

Author Biographies

Luiza Berger von Ende, Universidade Federal de Santa Maria

Law student at the Federal University of Santa Maria (UFSM). Researcher at the Center for Studies and Research in Law and Internet at UFSM (CEPEDI/UFSM), registered on the CNPq research platform. PIBIC/CNPq scholarship holder (2020-2021). Student in the Technical Course in Informatics at the Polytechnic College of UFSM.

Rafael Santos de Oliveira, Universidade Federal de Santa Maria

PhD in Law from the Federal University of Santa Catarina (2010), with a concentration in International Relations, including a doctoral exchange period (sandwich PhD) with a CAPES scholarship at the Università degli Studi di Padova, Italy (2009). Master's in Latin American Integration (Integration Law) from the Federal University of Santa Maria (2005) and Bachelor of Laws from the Federal University of Santa Maria (2003). Associate Professor I in the Law Department of the Federal University of Santa Maria (UFSM), on an exclusive-dedication basis, and in the Graduate Program in Law at UFSM (Master's). Coordinator of the Center for Studies and Research in Law and Internet (CEPEDI/UFSM).

References

BOLUKBASI, Tolga; CHANG, Kai-Wei; ZOU, James; SALIGRAMA, Venkatesh; KALAI, Adam. Man is to computer programmer as woman is to homemaker? Debiasing word embeddings. In: 30TH CONFERENCE ON NEURAL INFORMATION PROCESSING SYSTEMS, 2016, Barcelona. Proceedings of the 30th Conference on Neural Information Processing Systems. New York: Curran Associates Inc., 2016. p. 4356-4364. Available at: https://papers.nips.cc/paper/6228-man-is-to-computer-programmer-as-woman-is-to-homemaker-debiasing-word-embeddings.pdf. Accessed on: 23 Oct. 2020.

DASTIN, Jeffrey. Amazon scraps secret AI recruiting tool that showed bias against women. Reuters, 10 Oct. 2018. Available at: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G. Accessed on: 24 Oct. 2020.

DATTA, Amit; TSCHANTZ, Michael Carl; DATTA, Anupam. Automated experiments on ad privacy settings: a tale of opacity, choice, and discrimination. Proceedings on Privacy Enhancing Technologies, v. 2015, n. 1, p. 92-112, 2015. Available at: https://content.sciendo.com/configurable/contentpage/journals$002fpopets$002f2015$002f1$002farticle-p92.xml. Accessed on: 23 Oct. 2020.

KEARNS, Michael; ROTH, Aaron. The ethical algorithm: the science of socially aware algorithm design. New York: Oxford University Press, 2019.

McMILLAN, Graeme. It’s not you, it’s it: voice recognition doesn’t recognize women. Time, 1 June 2011. Available at: https://techland.time.com/2011/06/01/its-not-you-its-it-voice-recognition-doesnt-recognize-women/. Accessed on: 24 Oct. 2020.

NATIONAL CENTER FOR WOMEN & INFORMATION TECHNOLOGY. By the numbers. National Center for Women & Information Technology, 21 Apr. 2020. Available at: https://www.ncwit.org/sites/default/files/resources/ncwit_btn_07232020.pdf. Accessed on: 22 Oct. 2020.

NOBLE, Safiya Umoja. Algorithms of oppression: how search engines reinforce racism. New York: New York University Press, 2018. [e-book]

O’NEIL, Cathy. Weapons of Math Destruction: how big data increases inequality and threatens democracy. New York: Crown Publishers, 2016. [e-book]

PARISER, Eli. O filtro invisível: o que a Internet está escondendo de você. Rio de Janeiro: Zahar, 2012.

SILVA, Fernanda dos Santos Rodrigues; SILVA, Rosane Leal da. Reconhecimento facial e segurança pública: os perigos do uso da tecnologia no sistema penal seletivo brasileiro. In: 5º CONGRESSO INTERNACIONAL DE DIREITO E CONTEMPORANEIDADE: mídias e direitos da sociedade em rede. Anais do 5º Congresso Internacional de Direito e Contemporaneidade: mídias e direitos da sociedade em rede. Santa Maria (RS): UFSM, 2019. Available at: https://www.ufsm.br/app/uploads/sites/563/2019/09/5.23.pdf. Accessed on: 23 Oct. 2020.

UN WOMEN. UN Women ad series reveals widespread sexism. UN Women, 21 Oct. 2013. Available at: https://www.unwomen.org/en/news/stories/2013/10/women-should-ads. Accessed on: 23 Sept. 2020.

Published

2020-12-15

How to Cite

Berger von Ende, L., & Santos de Oliveira, R. (2020). Gender inequality and technology: between women and algorithms. Revista Brasileira de Iniciação Científica, 7(6), 210–219. Retrieved from https://periodicoscientificos.itp.ifsp.edu.br/index.php/rbic/article/view/249