Gender Bias in Artificial Intelligence: A Systematic Review of the Literature
Keywords
Bias, Gender, Artificial Intelligence, Systematic Literature Review
Abstract
This study presents a Systematic Literature Review (SLR) of gender bias in Artificial Intelligence (AI). The research combined two techniques: a domain-based approach to the SLR process, which provides a bibliometric description of the sample, and an in-depth examination of the thematic categories that emerged from inductive categorization, derived from reading and interpreting the final sample of 35 articles. To answer three key research questions on the types, causes, and mitigation strategies of gender bias in AI, three thematic treemaps were constructed, providing a systematic overview as an essential contribution to the literature. The main types of gender bias found in AI are categorized as societal, technical, and individual. Societal and socio-technical aspects stand out as the leading causes of bias, while debiasing, dataset design, and gender sensitivity were the most frequent strategies for overcoming it. The study also proposes theoretical, practical, managerial, capacity-building, and policy implications that address broad socio-technical challenges and point to the changes needed to create bias-free artificial intelligence.
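For readers who wish to reproduce the kind of thematic treemap described above, the sketch below shows one way to build a two-level treemap in Python with pandas and plotly. It is a minimal illustration under stated assumptions: the category labels are drawn from the abstract's own taxonomy, but the article counts and the layout choices are placeholders, not the study's actual tallies.

```python
# Minimal sketch of a thematic treemap for SLR categories.
# Assumes pandas and plotly are installed (pip install pandas plotly).
# The counts below are illustrative placeholders, NOT the study's data.
import pandas as pd
import plotly.express as px

# Hypothetical tally: one row per (research question, thematic category).
data = pd.DataFrame({
    "question": ["Types"] * 3 + ["Causes"] * 2 + ["Mitigation"] * 3,
    "category": [
        "Societal", "Technical", "Individual",          # types of bias
        "Societal aspects", "Socio-technical aspects",  # leading causes
        "Debiasing", "Dataset design", "Gender sensitivity",  # strategies
    ],
    "articles": [12, 9, 4, 11, 8, 10, 7, 6],  # placeholder article counts
})

# Nest each category under its research question; rectangle area is
# proportional to the number of articles coded to that category.
fig = px.treemap(data, path=["question", "category"], values="articles")
fig.show()
```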
References
Arseniev-Koehler, A., Cochran, S. D., Mays, V. M., Chang, K. W., & Foster, J. G. (2022). Integrating topic modelling and word embedding to characterize violent deaths. Proceedings of the National Academy of Sciences, 119(10), e2108801119. https://doi.org/10.1073/pnas.2108801119
Asr, F. T., Mazraeh, M., Lopes, A., Gautam, V., Gonzales, J., Rao, P., & Taboada, M. (2021). The gender gap tracker: Using natural language processing to measure gender bias in media. PLoS ONE, 16(1), e0245533. https://doi.org/10.1371/journal.pone.0245533
Bardhan, R., Sunikka-Blank, M., & Haque, A. N. (2019). Sentiment analysis as a tool for gender mainstreaming in slum rehabilitation housing management in Mumbai, India. Habitat International, 92, 102040. https://doi.org/10.1016/j.habitatint.2019.102040
Bhardwaj, R., Majumder, N., & Poria, S. (2021). Investigating gender bias in BERT. Cognitive Computation, 13(4), 1008–1018. https://doi.org/10.1007/s12559-021-09881-2
Breazeal, C., & Brooks, R. (1997). Gender Holes in Intelligent Technologies. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 1187–1192). IEEE.
Chen, X., Li, Z., Setlur, S., & Xu, W. (2022). Exploring racial and gender disparities in voice biometrics. Scientific Reports, 12(1), 3723. https://doi.org/10.1038/s41598-022-06673-y
Conz, E., & Magnani, G. (2020). A dynamic perspective on the resilience of firms: A systematic literature review and a framework for future research. European Management Journal, 38(3), 400–412. https://doi.org/10.1016/j.emj.2019.12.004
Corrêa, V. S., Brito, F. R. S., Lima, R. M., & Queiroz, M. M. (2022a). Female entrepreneurship in emerging and developing countries: A systematic literature review. International Journal of Gender and Entrepreneurship, 14(3), 300–322. https://doi.org/10.1108/IJGE-08-2021-0142
Corrêa, V. S., Lima, R. M., Brito, F. R. S., Machado, M. C., & Nassif, V. M. J. (2022b). Female entrepreneurship in emerging and developing countries: A systematic review of practical and policy implications and suggestions for new studies. Journal of Entrepreneurship in Emerging Economies. https://doi.org/10.1108/JEEE-04-2022-0115
Crawford, K. (2021). The Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press.
Crawford, K. (2013). The hidden biases of big data. Harvard Business Review Blog, Apr 1. Retrieved from http://blogs.hbr.org/2013/04/the-hidden-biases-in-big-data/. Accessed on Apr 10, 2023.
Das, S., & Paik, J. H. (2021). Context-sensitive gender inference of named entities in text. Information Processing & Management, 58(1), 102423. https://doi.org/10.1016/j.ipm.2020.102423
Deacon, T. W., & Brooks, D. R. (1988). Artificial Intelligence and the Bias of the Human Architect. In Proceedings of the 10th International Joint Conference on Artificial Intelligence (pp. 799–805). Morgan Kaufmann Publishers Inc.
DeFranza, D., Mishra, H., & Mishra, A. (2020). How language shapes prejudice against women: An examination across 45 world languages. Journal of Personality and Social Psychology, 119(1), 7. https://doi.org/10.1037/pspa0000188
Draude, C., Klumbyte, G., Lücking, P., & Treusch, P. (2020). Situated algorithms: a sociotechnical systemic approach to bias. Online Information Review, 44(2), 325–342. https://doi.org/10.1108/OIR-10-2018-0332
Dwork, C., & Minow, M. (2022). Distrust of Artificial Intelligence: Sources & Responses from Computer Science & Law. Daedalus, 151(2), 309–321. https://doi.org/10.1162/daed_a_01918
Fossa, F., & Sucameli, I. (2022). Gender Bias and Conversational Agents: an ethical perspective on Social Robotics. Science and Engineering Ethics, 28(3), 23. https://doi.org/10.1007/s11948-022-00376-3
Fyrvald, J. (2019). Mitigating algorithmic bias in Artificial Intelligence systems. Ph.D. Thesis, Uppsala Universitet. Available at https://www.diva-portal.org/smash/get/diva2:1334465/FULLTEXT01.pdf
Hägg, G., & Gabrielsson, J. (2020). A systematic literature review of the evolution of pedagogy in entrepreneurial education research. International Journal of Entrepreneurial Behaviour and Research, 26(5), 829–861. https://doi.org/10.1108/IJEBR-04-2018-0272
Haraway, D. (1987). A Manifesto for Cyborgs: Science, technology, and socialist feminism in the 1980s. Australian Feminist Studies, 2(4), 1–42.
Haraway, D. (1991). Simians, Cyborgs, and Women: The Reinvention of Nature. Routledge.
Huluba, A. M., Kingdon, J., & McLaren, I. (2018). The UK Online Gender Audit 2018: A comprehensive audit of gender within the UK’s online environment. Heliyon, 4(12), e01001. https://doi.org/10.1016/j.heliyon.2018.e01001
Jones, J. J., Amin, M. R., Kim, J., & Skiena, S. (2020). Stereotypical gender associations in language have decreased over time. Sociological Science, 7, 1–35. https://doi.org/10.15195/v7.a1
Kordzadeh, N., & Ghasemaghaei, M. (2022). Algorithmic bias: review, synthesis, and future research directions. European Journal of Information Systems, 31(3), 388–409. https://doi.org/10.1080/0960085X.2021.1927212
Kraus, S., Breier, M., & Dasí-Rodríguez, S. (2020). The art of crafting a systematic literature review in entrepreneurship research. International Entrepreneurship and Management Journal, 16, 1023–1042. https://doi.org/10.1007/s11365-020-00635-4
Kuppler, M. (2022). Predicting the future impact of Computer Science researchers: Is there a gender bias? Scientometrics, 127(11), 6695–6732. https://doi.org/10.1007/s11192-022-04337-2
Kurpicz-Briki, M., & Leoni, T. (2021). A World Full of Stereotypes? Further Investigation on Origin and Gender Bias in Multi-Lingual Word Embeddings. Frontiers in Big Data, 4, 625290. https://doi.org/10.3389/fdata.2021.625290
Licklider, J. C. R., & Taylor, R. W. (1968). The computer as a communication device. Science and Technology, 76(2), 1–3.
Machado, M. C., Vivaldini, M., & de Oliveira, O. J. (2020). Production and supply-chain as the basis for SMEs’ environmental management development: A systematic literature review. Journal of Cleaner Production, 273, 123141. https://doi.org/10.1016/j.jclepro.2020.123141
Mahmud, H., Islam, A. K. M. N., Ahmed, S. I., & Smolander, K. (2022). What influences algorithmic decision-making? A systematic literature review on algorithm aversion. Technological Forecasting and Social Change, 175, 121390. https://doi.org/10.1016/j.techfore.2021.121390
Martínez, C. D., García, P. D., & Sustaeta, P. N. (2020). Hidden Gender Bias in Big Data as Revealed Through Neural Networks: Man is to Woman as Work is to Mother? Revista Española de Investigaciones Sociológicas (REIS), 172, 41–76. https://doi.org/10.5477/cis/reis.172.41
Nadeem, A., Marjanovic, O., & Abedin, B. (2022). Gender bias in AI-based decision-making systems: a systematic literature review. Australasian Journal of Information Systems, 26. https://doi.org/10.3127/ajis.v26i0.3835
Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press. https://doi.org/10.18574/nyu/9781479833641.001.0001
Oldenziel, R. (1992). Cynthia Cockburn, Machinery of Dominance: Women, Men, and Technical Know-How (Book Review). Technology and Culture, 33(1), 151.
Orgeira-Crespo, P., Míguez-Álvarez, C., Cuevas-Alonso, M., & Rivo-López, E. (2021). An analysis of unconscious gender bias in academic texts by means of a decision algorithm. PLoS ONE, 16(9), e0257903. https://doi.org/10.1371/journal.pone.0257903
Pair, E., Vicas, N., Weber, A. M., Meausoone, V., Zou, J., Njuguna, A., & Darmstadt, G. L. (2021). Quantification of Gender Bias and Sentiment Toward Political Leaders Over 20 Years of Kenyan News Using Natural Language Processing. Frontiers in Psychology, 12, 712646. https://doi.org/10.3389/fpsyg.2021.712646
Patón-Romero, J. D., Vinuesa, R., Jaccheri, L., & Baldassarre, M. T. (2022). State of Gender Equality in and by Artificial Intelligence. IADIS International Journal on Computer Science and Information Systems, 17(2), 31–48.
Paul, J., & Criado, A. R. (2020). The art of writing literature review: What do we know and what do we need to know? International Business Review, 29(4), 101717. https://doi.org/10.1016/j.ibusrev.2020.101717
Petreski, D., & Hashim, I. C. (2022). Word embeddings are biased. But whose bias are they reflecting? AI & Society, 1–8. https://doi.org/10.1007/s00146-022-01443-w
Reyero Lobo, P., Daga, E., Alani, H., & Fernandez, M. (2022). Semantic Web technologies and bias in artificial intelligence: A systematic literature review. Semantic Web (Preprint), 1–26. https://doi.org/10.3233/SW-223041
Santos, S. C., & Neumeyer, X. (2021). Gender, poverty and entrepreneurship: A systematic literature review and future research agenda. Journal of Developmental Entrepreneurship, 26(3). https://doi.org/10.1142/S1084946721500187
Savoldi, B., Gaido, M., Bentivogli, L., Negri, M., & Turchi, M. (2021). Gender bias in machine translation. Transactions of the Association for Computational Linguistics, 9, 845–874. https://doi.org/10.1162/tacl_a_00401
Scheuerman, M. K., Paul, J. M., & Brubaker, J. R. (2019). How computers see gender: An evaluation of gender classification in commercial facial analysis services. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–33. https://doi.org/10.1145/3359246
Schopmans, H., & Cupać, J. (2021). Engines of patriarchy: ethical artificial intelligence in times of illiberal backlash politics. Ethics & International Affairs, 35(3), 329–342. https://doi.org/10.1017/S0892679421000356
Schwemmer, C., Knight, C., Bello-Pardo, E. D., Oklobdzija, S., Schoonvelde, M., & Lockhart, J. W. (2020). Diagnosing gender bias in image recognition systems. Socius, 6, 2378023120967171. https://doi.org/10.1177/2378023120967171
Shrestha, S., & Das, S. (2022). Exploring gender biases in ML and AI academic research through systematic literature review. Frontiers in Artificial Intelligence, 5. https://doi.org/10.3389/frai.2022.976838
Tannenbaum, C., Ellis, R. P., Eyssel, F., Zou, J., & Schiebinger, L. (2019). Sex and gender analysis improves science and engineering. Nature, 575(7781), 137–146. https://doi.org/10.1038/s41586-019-1657-6
Thelwall, M. (2018). Gender bias in machine learning for sentiment analysis. Online Information Review, 42(3), 343–354. https://doi.org/10.1108/OIR-05-2017-0153
Tomalin, M., Byrne, B., Concannon, S., Saunders, D., & Ullmann, S. (2021). The practical ethics of bias reduction in machine translation: Why domain adaptation is better than data debiasing. Ethics and Information Technology, 23, 419–433. https://doi.org/10.1007/s10676-021-09583-1
Tranfield, D., Denyer, D., & Smart, P. (2003). Towards a methodology for developing evidence‐informed management knowledge by means of systematic review. British Journal of Management, 14(3), 207–222. https://doi.org/10.1111/1467-8551.00375
Tubaro, P., Coville, M., Le Ludec, C., & Casilli, A. A. (2022). Hidden inequalities: the gendered labour of women on micro-tasking platforms. Internet Policy Review, 11(1). https://doi.org/10.14763/2022.1.1623
Turkle, S. (2005). The second self: Computers and the human spirit. MIT Press.
Vargas-Solar, G. (2022). Intersectional study of the gender gap in STEM through the identification of missing datasets about women: A multisided problem. Applied Sciences, 12(12), 5813. https://doi.org/10.3390/app12125813
Vlasceanu, M., & Amodio, D. M. (2022). Propagation of societal gender inequality by internet search algorithms. Proceedings of the National Academy of Sciences of the United States of America, 119(29), e2204529119. https://doi.org/10.1073/pnas.2204529119
Waelen, R., & Wieczorek, M. (2022). The struggle for AI’s recognition: understanding the normative implications of gender bias in AI with Honneth’s theory of recognition. Philosophy & Technology, 35(2), 53. https://doi.org/10.1007/s13347-022-00548-w
Wajcman, J. (2004). TechnoFeminism. Polity Press.
Wellner, G., & Rothman, T. (2020). Feminist AI: Can We Expect Our AI Systems to Become Feminist? Philosophy & Technology, 33(2), 191–205. https://doi.org/10.1007/s13347-019-00352-z
Witherspoon, E. B., Schunn, C. D., Higashi, R. M., & Baehr, E. C. (2016). Gender, interest, and prior experience shape opportunities to learn programming in robotics competitions. International Journal of STEM Education, 3, 1–12. https://doi.org/10.1186/s40594-016-0052-1