Gender inequality and technology: between women and algorithms
Keywords:
algorithms, fundamental rights, gender discrimination, equality, women
Abstract
The use of algorithms is already ubiquitous in everyday life, enabling large amounts of data to be processed quickly. Although they may seem impartial, algorithms can discriminate against parts of society, and this research aims to identify how such discrimination can occur in relation to women. The deductive approach was adopted, and the functionalist and monographic methods were employed as research procedures. It was found that technology is not neutral and has limitations that especially harm groups and individuals who are already marginalized, and that it can negatively impact women's lives.
References
BOLUKBASI, Tolga; CHANG, Kai-Wei; ZOU, James; SALIGRAMA, Venkatesh; KALAI, Adam. Man is to computer programmer as woman is to homemaker? Debiasing word embeddings. In: 30TH CONFERENCE ON NEURAL INFORMATION PROCESSING SYSTEMS, 2016, Barcelona. Proceedings of the 30th Conference on Neural Information Processing Systems. New York: Curran Associates Inc., 2016. p. 4356–4364. Available at: https://papers.nips.cc/paper/6228-man-is-to-computer-programmer-as-woman-is-to-homemaker-debiasing-word-embeddings.pdf. Accessed: 23 Oct. 2020.
DASTIN, Jeffrey. Amazon scraps secret AI recruiting tool that showed bias against women. Reuters, 10 Oct. 2018. Available at: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G. Accessed: 24 Oct. 2020.
DATTA, Amit; TSCHANTZ, Michael Carl; DATTA, Anupam. Automated experiments on ad privacy settings: a tale of opacity, choice, and discrimination. Proceedings on Privacy Enhancing Technologies, v. 1, p. 92–112, 2015. Available at: https://content.sciendo.com/configurable/contentpage/journals$002fpopets$002f2015$002f1$002farticle-p92.xml. Accessed: 23 Oct. 2020.
KEARNS, Michael; ROTH, Aaron. The ethical algorithm: the science of socially aware algorithm design. New York: Oxford University Press, 2019.
McMILLAN, Graeme. It's not you, it's it: voice recognition doesn't recognize women. Time, 1 June 2011. Available at: https://techland.time.com/2011/06/01/its-not-you-its-it-voice-recognition-doesnt-recognize-women/. Accessed: 24 Oct. 2020.
NATIONAL CENTER FOR WOMEN & INFORMATION TECHNOLOGY. By the numbers. National Center for Women & Information Technology, 21 Apr. 2020. Available at: https://www.ncwit.org/sites/default/files/resources/ncwit_btn_07232020.pdf. Accessed: 22 Oct. 2020.
NOBLE, Safiya Umoja. Algorithms of oppression: how search engines reinforce racism. New York: New York University Press, 2018. [e-book]
O'NEIL, Cathy. Weapons of math destruction: how big data increases inequality and threatens democracy. New York: Crown Publishers, 2016. [e-book]
PARISER, Eli. O filtro invisível: o que a Internet está escondendo de você. Rio de Janeiro: Zahar, 2012.
SILVA, Fernanda dos Santos Rodrigues; SILVA, Rosane Leal da. Reconhecimento facial e segurança pública: os perigos do uso da tecnologia no sistema penal seletivo brasileiro. In: 5º CONGRESSO INTERNACIONAL DE DIREITO E CONTEMPORANEIDADE: mídias e direitos da sociedade em rede. Anais do 5º Congresso Internacional de Direito e Contemporaneidade: mídias e direitos da sociedade em rede. Santa Maria (RS): UFSM, 2019. Available at: https://www.ufsm.br/app/uploads/sites/563/2019/09/5.23.pdf. Accessed: 23 Oct. 2020.
UN WOMEN. UN Women ad series reveals widespread sexism. UN Women, 21 Oct. 2013. Available at: https://www.unwomen.org/en/news/stories/2013/10/women-should-ads. Accessed: 23 Sept. 2020.
License
Copyright (c) 2021 Revista Brasileira de Iniciação Científica
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.