A Study of the Development and Deployment of Emotionally Intelligent Artificial Intelligence

Ejuchegahi Anthony Angwaomaodoko

Abstract

This paper examines the relationship between artificial intelligence (AI) and emotional intelligence (EI), particularly in light of the growing deployment of emotionally responsive AI systems in sensitive human contexts such as healthcare, education, and customer service. Although AI has made significant progress in recognising and simulating human emotions through affective computing, it remains structurally incapable of experiencing emotion, as it lacks consciousness and self-awareness, a limitation known as the "empathy gap." Emotionally intelligent AI can approximate empathy through data-driven models, but these simulations are not equivalent to genuine empathy. This distinction raises ethical concerns, as users may form emotional attachments to, or place trust in, AI systems that cannot actually comprehend emotion. Such an illusion of empathy, reinforced by the human tendency to anthropomorphise machines, can leave some individuals, particularly vulnerable ones, emotionally dependent or open to manipulation. The paper also highlights key risks associated with emotionally intelligent AI, including bias in emotion recognition, emotional manipulation by commercial and political actors, and the disruption of genuine human interaction. It concludes that although AI can enhance user enjoyment and engagement, it must be designed and deployed ethically and transparently if it is to remain a supportive tool rather than a replacement for genuine human empathy.



Keywords


artificial intelligence; emotional intelligence; affective computing; empathy gap; ethical AI



References


1. Goleman, D. (1995). Emotional Intelligence: Why It Can Matter More Than IQ. London: Bloomsbury Publishing.

2. Kuwaiti, A. A., Nazer, K., Al-Reedy, A., Al-Shehri, S., Al-Muhanna, A., Subbarayalu, A. V., Muhanna, D. A., & Al-Muhanna, F. A. (2023). A review of the role of artificial intelligence in healthcare. Journal of Personalized Medicine, 13(6), 951. doi: 10.3390/jpm13060951

3. Hammad, T. M. (2024). Exploring the Intersection of AI and Emotional Intelligence: Navigating the Promise and Peril. International Journal for Multidisciplinary Research, 6(3).

4. Cowie, R. (2011). Affective Computing and Intelligent Interaction. Lecture Notes in Computer Science. doi: 10.1007/978-3-642-24600-5

5. Thakkar, A., Gupta, A., & De Sousa, A. (2024). Artificial intelligence in positive mental health: a narrative review. Frontiers in Digital Health, 6. doi: 10.3389/fdgth.2024.1280235

6. Trung, H. H., Chau, A. T., Hong, A. N., & Thu, H. V. (2025). The impact of artificial intelligence (AI) on the exploitation and commercial use of personal images and Vietnamese legal regulations. International Journal of Law, Policy and Social Review, 7(2), 69-73.

7. Picard, R., Vyzas, E., & Healey, J. (2001). Toward machine emotional intelligence: analysis of affective physiological state. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(10), 1175–1191. doi: 10.1109/34.954607

8. Kattnig, M., Angerschmid, A., Reichel, T., & Kern, R. (2024). Assessing trustworthy AI: Technical and legal perspectives of fairness in AI. Computer Law & Security Review, 55, 106053. doi: 10.1016/j.clsr.2024.106053

9. LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436–444. doi: 10.1038/nature14539

10. Devlin, J., Chang, M., Lee, K., & Toutanova, K. (2018). BERT: Pretraining of Deep Bidirectional Transformers for Language Understanding. arXiv (Cornell University). doi: 10.48550/arxiv.1810.04805

11. Salovey, P., & Mayer, J. D. (1990). Emotional intelligence. Imagination, Cognition, and Personality, 9(3), 185–211. doi: 10.2190/dugg-p24e-52wk-6cdg

12. Mayer, J. D., Caruso, D. R., & Salovey, P. (2016). The ability Model of Emotional intelligence: Principles and updates. Emotion Review, 8(4), 290–300. doi: 10.1177/1754073916639667

13. Serrat, O. (2017). Understanding and developing emotional intelligence. In Springer eBooks (pp. 329–339). doi: 10.1007/978-981-10-0983-9_37

14. Kotsou, I., Mikolajczak, M., Heeren, A., Grégoire, J., & Leys, C. (2018). Improving Emotional intelligence: A systematic review of existing work and future challenges. Emotion Review, 11(2), 151–165. doi: 10.1177/1754073917735902

15. Khare, S. K., Blanes-Vidal, V., Nadimi, E. S., & Acharya, U. R. (2024). Emotion recognition and artificial intelligence: A systematic review (2014–2023) and research recommendations. Information Fusion, 102, 102019. doi: 10.1016/j.inffus.2023.102019

16. Pudasaini Pramila, T. (2025). Emotional Intelligence and Effective Leadership in the Digital Era. Leadership Studies in the Turbulent Business Ecosystem. doi: 10.5772/intechopen.114331

17. Gaikwad, S. (2023). The Role of Emotional Intelligence in HR Leadership. International Journal of Education and Science Research Review, 10(2), 373-376.

18. McStay, A. (2020). Emotional AI: The Rise of Empathic Media. Sage Publications.

19. Deckker, D., & Sumanasekara, S. (2025). Systematic review on AI in emotional intelligence and psychological education. EPRA International Journal of Research & Development (IJRD), 400–414. doi: 10.36713/epra21351

20. Premack, D., & Woodruff, G. (1978). Does the chimpanzee have a theory of mind? Behavioral and Brain Sciences, 1(4), 515–526. doi: 10.1017/s0140525x00076512

21. Semin, G. R., & Fiedler, K. (1988). The cognitive functions of linguistic categories in describing persons: Social cognition and language. Journal of Personality and Social Psychology, 54(4), 558–568. doi: 10.1037/0022-3514.54.4.558

22. Rabinowitz, N. C., Perbet, F., Song, F., Zhang, C., Eslami, S. M. A., & Botvinick, M. (2018). Machine Theory of Mind. Proceedings of the 35th International Conference on Machine Learning.

23. De Sio, F. S., & Van Den Hoven, J. (2018). Meaningful Human Control over Autonomous Systems: A Philosophical Account. Frontiers in Robotics and AI, 5. doi: 10.3389/frobt.2018.00015

24. Vicci, H. (2024). Emotional Intelligence in Artificial Intelligence: A Review and Evaluation Study. SSRN Electronic Journal. doi: 10.2139/ssrn.4818285

25. Uddin, T. A., Sazzard, H., Hossain, M. S., Raihan, U. I., & Karl, A. (2019). Facial Expression Recognition using Convolutional Neural Network with Data Augmentation. DiVA Portal.

26. Sang, D. V., Van Dat, N., & Thuan, D. P. (2017). Facial expression recognition using deep convolutional neural networks. 2017 9th International Conference on Knowledge and Systems Engineering (KSE), 130–135. doi: 10.1109/kse.2017.8119447

27. Razavi, M., Ziyadidegan, S., Mahmoudzadeh, A., Kazeminasab, S., Baharlouei, E., Janfaza, V., Jahromi, R., & Sasangohar, F. (2024). Machine Learning, Deep Learning, and Data Preprocessing Techniques for Detecting, Predicting, and Monitoring Stress and Stress-Related Mental Disorders: Scoping Review. JMIR Mental Health, 11, e53714. doi: 10.2196/53714

28. Affectiva. (n.d.). Humanising technology with Emotion AI. Retrieved from https://www.affectiva.com/

29. Wang, X., Li, X., Yin, Z., Wu, Y., & Jia, L. (2023). Emotional intelligence of large language models. arXiv (Cornell University). doi: 10.48550/arxiv.2307.09042

30. OpenAI. (2023). GPT-4 Technical Report. Retrieved from https://cdn.openai.com/papers/gpt-4.pdf

31. Dorigoni, A., & Giardino, P. L. (2025). The illusion of empathy: evaluating AI-generated outputs in moments that matter. Frontiers in Psychology, 16. doi: 10.3389/fpsyg.2025.1568911

32. Rashkin, H., Smith, E. M., Li, M., & Boureau, Y. (2019). Towards Empathetic Open-domain Conversation Models: A New Benchmark and Dataset. Conference: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. doi: 10.18653/v1/p19-1534

33. Zhou, L., Gao, J., Li, D., & Shum, H. (2020). The design and implementation of XiaoIce, an empathetic social chatbot. Computational Linguistics, 46(1), 53–93. doi: 10.1162/coli_a_00368

34. Müller, V. C. (2020). Ethics of artificial intelligence. In A. Elliott, The Routledge Social Science Handbook of AI. London: Routledge.

35. Fulmer, R., Joerin, A., Gentile, B., Lakerink, L., & Rauws, M. (2018). Using Psychological Artificial Intelligence (TESS) to relieve symptoms of depression and anxiety: randomised controlled trial. JMIR Mental Health, 5(4), e64. doi: 10.2196/mental.9782

36. Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behaviour therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomised controlled trial. JMIR Mental Health, 4(2), e19. doi: 10.2196/mental.7785

37. Inkster, B., Sarda, S., & Subramanian, V. (2018). An Empathy-Driven, Conversational Artificial Intelligence Agent (WYSA) for Digital Mental Wellbeing: Real-World Data Evaluation Mixed-Methods Study. JMIR Mhealth and Uhealth, 6(11), e12106. doi: 10.2196/12106

38. Kretzschmar, K., Tyroll, H., Pavarini, G., Manzini, A., & Singh, I. (2019). Can your phone be your therapist? Young people's ethical perspectives on the use of fully Automated conversational Agents (Chatbots) in mental health support. Biomedical Informatics Insights, 11. doi: 10.1177/1178222619829083

39. Lachimipriya, K., Regina, G., Rajam, M., Shyja, R., & Sb, V. (2025). Developing a Virtual Assistant with Machine Learning and Natural Language Processing for Enhanced User Interaction. Retrieved from https://www.researchgate.net/publication/387685806_Developing_a_Virtual_Assistant_with_Machine_Learning_and_Natural_Language_Processing_for_Enhanced_User_Interaction

40. Sajja, R., Sermet, Y., Cikmaz, M., Cwiertny, D., & Demir, I. (2024). Artificial Intelligence-Enabled Intelligent Assistant for Personalized and Adaptive Learning in Higher Education. Information, 15(10), 596. doi: 10.3390/info15100596

41. Singh, P., & Singh, V. (2024). The power of AI: enhancing customer loyalty through satisfaction and efficiency. Cogent Business & Management, 11(1). doi: 10.1080/23311975.2024.2326107

42. Microsoft. (n.d.). Azure Language in Foundry Tools. Retrieved from https://azure.microsoft.com/en-us/products/ai-services/ai-language

43. Haleem, A., Javaid, M., Asim Qadri, M., Pratap Singh, R., & Suman, R. (2022). Artificial intelligence (AI) applications for marketing: A literature-based study. International Journal of Intelligent Networks, 3, 119–132. doi: 10.1016/j.ijin.2022.08.005

44. Rane, N., Paramesha, M., Choudhary, S., & Rane, J. (2024). Artificial Intelligence in Sales and Marketing: Enhancing Customer Satisfaction, Experience and Loyalty. SSRN Electronic Journal. doi: 10.2139/ssrn.4831903

45. Odili, P. O., Daudu, C. D., Adefemi, A., Ekemezie, I. O., & Usiagu, G. S. (2024). The impact of artificial intelligence on recruitment and selection processes in the oil and gas industry: A review. Engineering Science & Technology Journal, 5(2), 612–638. doi: 10.51594/estj.v5i2.836

47. IMD. (2025, August). AI in HR: How is Artificial Intelligence transforming human resources? Retrieved from https://www.imd.org/blog/digital-transformation/ai-in-hr/

48. Tariq, M. U., Poulin, M., & Abonamah, A. A. (2021). Achieving Operational Excellence Through Artificial Intelligence: Driving Forces and Barriers. Frontiers in Psychology, 12. doi: 10.3389/fpsyg.2021.686624

49. Wan, J., Li, X., Dai, H.-N., Kusiak, A., Martinez-Garcia, M., & Li, D. (2021). Artificial-Intelligence-Driven Customized Manufacturing Factory: Key Technologies, Applications, and Challenges. Proceedings of the IEEE, 109(4), 377–398. doi: 10.1109/jproc.2020.3034808

50. Higginbotham. (2024, April 16). Employee wellbeing challenges: Addressing burnout and stress. Retrieved from https://www.higginbotham.com/blog/employee-wellbeing-addressing-burnout-and-stress/

51. Andrejevic, M., & Selwyn, N. (2019). Facial Recognition Technology in Schools: Critical Questions and Concerns. Learning Media and Technology, 45(2), 115–128. doi: 10.1080/17439884.2020.1686014

52. Barker, D., Tippireddy, M. K. R., Farhan, A., & Ahmed, B. (2025). Ethical Considerations in Emotion Recognition Research. Psychology International, 7(2), 43. doi: 10.3390/psycholint7020043

53. Lukac, M., Zhambulova, G., Abdiyeva, K., & Lewis, M. (2023). Study on emotion recognition bias in different regional groups. Scientific Reports, 13(1). doi: 10.1038/s41598-023-34932-z

54. Buolamwini, J. (2023, December 13). Unmasking the bias in facial recognition algorithms. Retrieved from https://mitsloan.mit.edu/ideas-made-to-matter/unmasking-bias-facial-recognition-algorithms

55. Togioka, B. M., & Young, E. (2024). Diversity and Discrimination in Health Care. Retrieved from https://www.ncbi.nlm.nih.gov/books/NBK568721/

56. Floridi, L. (2022). The Ethics of Artificial Intelligence. Oxford University Press.

57. Varshita, A., & Tamanna, S. (2025). AI Chatbot Companions Impact on Users Parasocial Relationships and Loneliness. International Journal of Research Publication and Reviews, 6(5), 4715-4724.

58. Williams, A., Brooks, C., & Shmargad, Y. (2020). How Algorithms Discriminate Based on Data They Lack. Penn State University Press.

59. Shaalan, A., Tourky, M., & Ibrahim, K. (2024). AI Caramba! Leveraging AI for Effective Digital Relationship Marketing, 309–352. doi: 10.4018/979-8-3693-5340-0.ch011

60. Darling, K. (2021). The New Breed: What Our History with Animals Reveals about Our Future with Robots. Henry Holt and Company.

61. Asimolowo, A. (2025). Leveraging AI-Powered Chatbots for Mental Health Support for High School Students. Iconic Research and Engineering Journals, 8(7), 194-205.

62. Cuadra, A., Wang, M., Stein, L. A., Jung, M. F., Dell, N., Estrin, D., & Landay, J. A. (2024). The Illusion of Empathy? Notes on Displays of Emotion in Human-Computer Interaction. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1–18. doi: 10.1145/3613904.3642336

63. Rudin, C. (2019). Stop explaining black box machine learning models for high-stakes decisions and use interpretable models instead. Nature Machine Intelligence, 1(5), 206–215. doi: 10.1038/s42256-019-0048-x

64. Kordzadeh, N., & Ghasemaghaei, M. (2021). Algorithmic bias: review, synthesis, and future research directions. European Journal of Information Systems, 31(3), 388–409. doi: 10.1080/0960085x.2021.1927212

65. Ghotbi, N. (2022). The Ethics of Emotional Artificial Intelligence: A Mixed Method Analysis. Asian Bioethics Review, 15(4), 417–430. doi: 10.1007/s41649-022-00237-y

66. Saranya, A., & Subhashini, R. (2023). A systematic review of Explainable Artificial Intelligence models and applications: Recent developments and future trends. Decision Analytics Journal, 7, 100230. doi: 10.1016/j.dajour.2023.100230

67. Chinnaraju, A. (2025). Explainable AI (XAI) for trustworthy and transparent decision-making: A theoretical framework for AI interpretability. World Journal of Advanced Engineering Technology and Sciences, 14(3), 170–207. doi: 10.30574/wjaets.2025.14.3.0106

68. Jobin, A., Ienca, M., & Vayena, E. (2019). The global landscape of AI ethics guidelines. Nature Machine Intelligence, 1(9), 389–399. doi: 10.1038/s42256-019-0088-2

69. European Parliament. (2023, June 08). EU AI Act: first regulation on artificial intelligence. Retrieved from https://www.europarl.europa.eu/topics/en/article/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence Accessed on 20th September, 2025.

70. Stanford Report. (2025, August 27). Why AI companions and young people can make for a dangerous mix. Retrieved from https://news.stanford.edu/stories/2025/08/ai-companions-chatbots-teens-young-people-risks-dangers-study

71. Akande, B. (2024). Personalisation and Empathy in AI Companions: Enhancing Emotional Connection to Alleviate Loneliness in Adults.

72. Turkle, S. (2011). Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic Books.

73. Keles, B., McCrae, N., & Grealish, A. (2019). A systematic review: the influence of social media on depression, anxiety and psychological distress in adolescents. International Journal of Adolescence and Youth, 25(1), 79–93. doi: 10.1080/02673843.2019.1590851






Copyright (c) 2025 Ejuchegahi Anthony Angwaomaodoko

This work is licensed under a Creative Commons Attribution 4.0 International License.