References
- Adams, R., & Loideáin, N. (2019). Addressing indirect discrimination and gender stereotypes in AI virtual personal assistants: the role of international human rights law. Cambridge International Law Journal, 8(2), 241–257.
- Ahn, J., Kim, J., & Sung, Y. (2022). The effect of gender stereotypes on artificial intelligence recommendations. Journal of Business Research, 141, 50–59.
- Barbul, M., Pelau, C., & Pop, S. (2022). Genders’ Fascinations and Fears Towards the Use of Cognitive Computing. Proceedings of the 8th International Conference on New Trends in Sustainable Business and Consumption (BASIQ), 672–679.
- Belanche, D., Casaló, L. V., Flavián, C., & Schepers, J. (2019). Service robot implementation: a theoretical framework and research agenda. Service Industries Journal, 40(3–4), 203–225. https://doi.org/10.1080/02642069.2019.1672666
- Blum, L., & Blum, M. (2023). A Theoretical Computer Science Perspective on Consciousness and Artificial General Intelligence. Engineering, 25, 12–16. https://doi.org/10.1016/j.eng.2023.03.010
- Costa, P., & Ribas, L. (2019). AI becomes her: Discussing gender and artificial intelligence. Technoetic Arts, 17(1–2), 171–193.
- Cui, J., & Zhong, J. (2023). The effect of robot anthropomorphism on revisit intentions after service failure: a moderated serial mediation model. Asia Pacific Journal of Marketing and Logistics, 35(11), 2621–2644. https://doi.org/10.1108/apjml-10-2022-0862
- Dang, N. B., & Bertrandias, L. (2023). Social robots as healing aids: How and why powerlessness influences the intention to adopt social robots. Technological Forecasting and Social Change, 196, 122845. https://doi.org/10.1016/j.techfore.2023.122845
- Davis, N., et al. (2023). I’m Only Human? The Role of Racial Stereotypes, Humanness, and Satisfaction in Transactions with Anthropomorphic Sales Bots. Journal of the Association for Consumer Research.
- Delgosha, M. S., & Hajiheydari, N. (2020). How human users engage with consumer robots? A dual model of psychological ownership and trust to explain post-adoption behaviours. Computers in Human Behavior, 117, 106660. https://doi.org/10.1016/j.chb.2020.106660
- Flavián, C., Belk, R. W., Belanche, D., & Casaló, L. V. (2024). Automated social presence in AI: Avoiding consumer psychological tensions to improve service value. Journal of Business Research, 175, 114545. https://doi.org/10.1016/j.jbusres.2024.114545
- Hall, L. (2021). Programming the machine: gender, race, sexuality, AI, and the construction of credibility and deceit at the border. Internet Policy Review, 10(4).
- Happe, L., & Marquardt, K. (2025). Authentic interdisciplinary online courses for alternative pathways into computer science. Journal of Systems and Software, 219, 112240. https://doi.org/10.1016/j.jss.2024.112240
- Hillis, C., Bagheri, E., & Marshall, Z. (2024). Exploring Gender Bias in Search Engines. International Review of Information Ethics, 34, 1–7.
- Hitron, T., et al. (2022). AI bias in Human-Robot Interaction: An evaluation of the Risk in Gender Biased Robots. 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) - Social, Asocial, and Antisocial Robots, Napoli, Italy.
- Jeldes-Delgado, F., Ferreira, T. A., Diaz, D., & Ortiz, R. (2024). Exploring gender stereotypes in financial reporting: An aspect-level sentiment analysis using big data and deep learning. Heliyon, 10(20), e38915. https://doi.org/10.1016/j.heliyon.2024.e38915
- Li, Y., Wu, B., Huang, Y., & Luan, S. (2024). Developing trustworthy artificial intelligence: insights from research on interpersonal, human-automation, and human-AI trust. Frontiers in Psychology, 15.
- Liu, X., & Shi, Z. (2025). Served or exploited: The impact of data capture strategy on users’ intention to use selected AI systems. Journal of Consumer Behaviour. https://doi.org/10.1002/cb.2428
- Loureiro, S. M. C., Jiménez-Barreto, J., Bilro, R. G., & Romero, J. (2023). Me and my AI: Exploring the effects of consumer self-construal and AI-based experience on avoiding similarity and willingness to pay. Psychology and Marketing, 41(1), 151–167. https://doi.org/10.1002/mar.21913
- Lu, Y., & Zhang, J. (2025). Balancing identity diversity and product contexts: Understanding consumer trust in AI-enhanced chatbot services. Journal of Retailing and Consumer Services, 84.
- Marinucci, L., Mazzuca, C., & Gangemi, A. (2023). Exposing implicit biases and stereotypes in human and artificial intelligence: state of the art and challenges with a focus on gender. AI & Society, 38(2), 747–761.
- Pelau, C., Barbul, M., Bojescu, I., & Niculescu, M. (2025). AI, How Much Shall I Tell You? Exchange and Communal Consumer–AI Relationships and the Willingness to Disclose Personal Information. Behavioral Sciences, 15(3), 386. https://doi.org/10.3390/bs15030386
- Piercy, C. W., Montgomery-Vestecka, G., & Lee, S. K. (2025). Gender and accent stereotypes in communication with an intelligent virtual assistant. International Journal of Human-Computer Studies, 195, 103407. https://doi.org/10.1016/j.ijhcs.2024.103407
- Pop, S., Pelau, C., Ciofu, I., & Kondort, G. (2023). Factors Predicting Consumer-AI Interactions. New Trends in Sustainable Business and Consumption, 592–597. https://doi.org/10.24818/basiq/2023/09/068
- Teng, D., Ye, S., & Martinez, V. (2025). Gen-AI’s effects on new value propositions in business model innovation: Evidence from information technology industry. Technovation, 143, 103191. https://doi.org/10.1016/j.technovation.2025.103191
- Van Doorn, J., Mende, M., Noble, S. M., Hulland, J., Ostrom, A. L., Grewal, D., & Petersen, J. A. (2017). Domo Arigato Mr. Roboto: Emergence of Automated Social Presence in Organizational Frontlines and Customers’ Service Experiences. Journal of Service Research, 20(1), 43-58. https://doi.org/10.1177/1094670516679272
- Wang, L. (2020). The Three Harms of Gendered Technology. Australasian Journal of Information Systems, 24.
- Woods, H. (2018). Asking more of Siri and Alexa: feminine persona in service of surveillance capitalism. Critical Studies in Media Communication, 35(4), 334–349.