Review Article
A Bibliometric Analysis of Emotion Artificial Intelligence in Education Using VOSviewer and CitNetExplorer
Tingna Wei1* and Mengyao Li2
1School of Foreign Languages, Dezhou University, China
2Teaching Department, Da Han Elementary School in Tianqu New District, China
*Corresponding author: Tingna Wei, School of Foreign Languages, Dezhou University, China
Received Date: June 26, 2024; Published Date: July 01, 2024
Abstract
With the development of AI technology, emotion artificial intelligence has gained significant attention from academics and practitioners and has become an actively researched topic in the field of education. However, existing studies rarely examine emotion artificial intelligence from the perspective of the competences it needs to possess in the educational process. To bridge this research gap, this study conducts a bibliometric analysis using visualization and citation network analyses. The results reveal two main research domains of emotion artificial intelligence in education: models and frameworks for emotion recognition, and emotion recognition channels and methods. This paper concludes that emotion artificial intelligence needs to possess personal and social competences in the process of serving education. The findings may contribute to further studies on the design and application of emotion artificial intelligence in education and other fields and may inspire technological advancements that help tackle current challenges.
Keywords: Emotion artificial intelligence; Education; Bibliometric analysis; Citation network analysis; Visualization analysis; Competence
Introduction
Emotion artificial intelligence (EAI) is frequently known as affective computing or artificial emotion intelligence. With the advancement of artificial intelligence, EAI is experiencing rapid growth [1]. EAI is the field dedicated to the study and development of artificial intelligence (AI) and machines that are capable of detecting and responding to affective states [2]. More specifically, it is the process of capturing facial expressions and bodily signals triggered by emotions and feelings through various sensors and recognizing these signals in order to comprehend human emotions and provide apt feedback accordingly. Artificial intelligence finds application in educational contexts and has received considerable attention due to its capability to engage in meaningful human-computer interaction [3,4]. Over the past decade, a growing number of studies have highlighted the benefits of human-computer/robot interactions (Huang & Chan, 2017) [5].
However, the existing studies on the emotional characteristics of AI in education are limited and not yet mature. Recognizing affective states is advantageous for analyzing users’ reactions, enabling the elicitation of behavioral intentions and the development of appropriate responses [6]. The existing studies on EAI mainly involve emotion classification [7], emotion detection [8,9], and textual and facial approaches to emotion recognition [10]. A literature review by Yadegaridehkordi et al. [6] selected 94 papers published up to 2017 and summarized them from different research perspectives. There is also a need for a citation network analysis that reveals the entire developmental path of the field and its insufficiencies, especially one covering the most recent studies.
To address this gap in research, the present study undertakes a bibliometric analysis utilizing visualization and citation network software. The primary objective is to examine the development of EAI in education and its recent design and research advancements. The findings may reveal current trends and research gaps in EAI, fostering a more diverse range of research perspectives and inspiring technological researchers and designers to gain a deeper understanding of practical needs within educational settings. In the subsequent sections of this article, we commence with a review of the current empirical studies and literature surveys on the utilization of EAI in education. Then, we outline the methods employed for the literature searches and visualization. Finally, we present the analysis results that answer the research questions and draw implications from this study.
Literature Review
Emotion artificial intelligence/affective computing
Picard [11] first introduced affective computing in her paper, exploring neurological research on human emotions and the potential for computers to emulate them through expression recognition. In 2000, she gave a clear concept of affective computing, that is, computers with a comparable affective system that enables them to intelligently grasp human emotions during human-computer interactions. Affective computing integrates computer science, psychology, cognition, and physiology in its study [12]. Emotion AI involves the utilization of affective computing and AI methodologies to perceive and empathize with human emotional experiences (McStay, 2018). It is developed to detect facial expressions, gaze directions, gestures, voice patterns, and brain activity [13,14]. The roots of automatic emotion recognition can be traced to diverse theories of emotion, which endeavor to explain what emotion entails and how we acquire knowledge about it [15]. The Basic Emotion Theory, introduced by Ekman [16], classifies human emotions across cultures into anger, disgust, fear, joy, sadness, and surprise. Ballesteros et al. [17] utilize artificial intelligence (AI) and computer vision algorithms to detect human emotions.
Emotion artificial intelligence in education
There is increasing commercial use of emotion AI [18]. Education is one of the industries that holds great potential for the implementation of AI [19]. It has been shown that the learning process can be positively or negatively influenced by an individual’s emotional state [20,21], which in turn affects learners’ performance, motivation, and engagement [22,23]. Thus, more and more studies on education and learning are conducted from the perspective of emotions [24]. In order to promote future research built on shared discoveries and interdisciplinary exploration, we intend to present clear outcomes from a bibliometric analysis of this subject. We aim to identify the leading researchers in this field by addressing the following research question (RQ1).
RQ1. What are the top keyword items, authors, organizations, and countries in the studies of emotion artificial intelligence in education?
Over the last decades, researchers have carried out studies on EAI in education from different perspectives. Huang et al. (2021) show that the PAD model, proposed by Mehrabian and Russell in 1974, is a computational model of emotion (CME) that can be used to recognize users’ emotions. Electrocardiograms (ECGs) as well as multimodal approaches are also used for emotion recognition [9,25]. Emotions have also been detected by merging two categories of facial expression data under the framework of Discriminant Laplacian Embedding [5]. Llurba et al. [26] investigated the feasibility of utilizing a camera for emotion recognition with the intention of harnessing this information to enhance the teaching-learning process. It helped improve educational procedures, including teachers’ decision-making in the classroom, while optimizing attention to students.
The current e-learning environment is plagued by emotional illiteracy, leading to a decline in learning enthusiasm and productivity. To address this issue, methods of emotion recognition and regulation have been proposed to modulate the negative emotions of speakers [27]. Several studies present affective tutoring systems that respond to student emotions and manage learners’ negative emotions [28,29]. Given the diverse research topics of EAI, it is crucial to clarify the primary classification of the overall research. Consequently, this study addresses the following research question (RQ2) in order to elucidate the general research interest in this subject area. In addition, considering EAI’s response to the user’s emotions, we need to explore what capabilities the chatbots require (RQ3).
RQ2. What are the primary themes of emotion artificial intelligence in education?
RQ3. What abilities should emotion artificial intelligence have in education?
Methods
Literature Search and Results Analysis in WOS
We searched the related literature in the Core Collection on the Web of Science. Four indexes were included in the literature search due to their high relevance to the research questions in this study: Science Citation Index Expanded (2013 to present), Social Sciences Citation Index (2006 to present), Arts & Humanities Citation Index (2008 to present), and Emerging Sources Citation Index (2017 to present). The other two indexes in the Core Collection, Current Chemical Reactions (1985 to present) and Index Chemicus (1993 to present), were not included because their research areas are irrelevant to this study. The following search strategy was adopted: (emotion artificial intelligence OR EAI OR affective AI OR affective artificial intelligence) (Topic) AND (educat* OR teach* OR learn*) (Topic). The “Analyze Results” function on Web of Science provided publication trend analysis by year, by categories (subjects and research areas), and by publication titles (journals). We first retrieved fundamental bibliometric data, including the year-based publication trend over the last 25 years (the longest range the website allows), the most published research categories, and the most published journals.
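When the search strategy above needs to be reproduced or varied, the query string can be assembled programmatically. The sketch below is illustrative: it uses the Web of Science advanced-search Topic field tag (TS=) and assumes multi-word phrases should be quoted.

```python
# Assemble the Web of Science advanced-search query used in this study.
# TS= searches the Topic field (title, abstract, author keywords, Keywords Plus).
emotion_terms = ['"emotion artificial intelligence"', "EAI",
                 '"affective AI"', '"affective artificial intelligence"']
education_terms = ["educat*", "teach*", "learn*"]

query = "TS=({}) AND TS=({})".format(
    " OR ".join(emotion_terms), " OR ".join(education_terms))
print(query)
```

Keeping the two term groups in separate lists makes it easy to extend either side (e.g., adding a synonym) without rewriting the boolean structure by hand.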
Visualization and Citation Network Analysis
We exported the full records and the cited references of the search results for visualization analysis. The plain text file could be processed by VOSviewer directly. However, among the results, the early-access publications did not contain publication years, which CitNetExplorer could not process. It was crucial to include these early-access publications since they were usually published in recent years and represented the most recent research directions on this topic. We therefore processed the text records by replacing the early-access date fields (EA, publication month and year) with the publication year field of official documents (PY, publication year). In this way, the early-access publications could be processed by the software in the same way as formal publications.
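The EA-to-PY substitution described above is a plain text transformation on the exported records. A minimal sketch is shown below; the exact shape of the EA field (month followed by a four-digit year) is assumed from the Web of Science export format, so the regular expression may need adjusting for real files.

```python
import re

def fix_early_access(record_text):
    """Replace an early-access date field (EA <month> <year>) with a
    publication-year field (PY <year>) so CitNetExplorer can parse the record.
    Records that already carry a PY field are left unchanged."""
    lines = record_text.splitlines()
    has_py = any(line.startswith("PY ") for line in lines)
    out = []
    for line in lines:
        m = re.match(r"EA\s+.*?(\d{4})\s*$", line)
        if m and not has_py:
            out.append("PY " + m.group(1))  # keep only the year
        else:
            out.append(line)
    return "\n".join(out)

sample = "PT J\nEA AUG 2023\nER"
print(fix_early_access(sample))  # EA AUG 2023 becomes PY 2023
```

Running the function over each record in the exported file yields a text that both VOSviewer and CitNetExplorer can process uniformly.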
Then, the text file of the full records was processed respectively in VOSviewer (Van Eck and Waltman, 2014) and CitNetExplorer (Van Eck and Waltman, 2010). VOSviewer could visualize the connections between authors, keywords, countries, and organizations. For each analytical label, we identified the items with the highest occurrence and citation frequencies. In the meantime, the software provided lists counting the citations and occurrences of authors, keywords, countries, and organizations. According to the citations, we collected the top 10 authors, countries, and organizations to show the representative contributors to this topic. We collected the top 25 keyword items measured by occurrences, as they would represent the popular issues and elements related to this topic. CitNetExplorer allowed conducting “clustering” analysis so that closely connected studies were displayed in the same cluster. This software also allowed citation network analysis, including the longest paths and the highly cited or recent studies. This function would allow identifying the theoretical foundations and advancements in this area. However, in some cases current studies directly cited the pioneering publications, so the longest citation paths were no longer than 4. Such paths could not generate significantly enlightening interpretations of the citation network. In those cases, analyses of the most cited and recent publications were conducted to reveal the advancements in the research topics.
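The occurrence and total link strength statistics reported by VOSviewer can be understood as simple counts: an item's occurrences are the number of publications listing it, and its total link strength sums the co-occurrence links it shares with other items. A standard-library sketch on invented toy keyword lists:

```python
from collections import Counter
from itertools import combinations

# Toy author-keyword lists, one per publication (the DE field in a WOS export).
records = [
    ["artificial intelligence", "emotion recognition", "deep learning"],
    ["artificial intelligence", "deep learning"],
    ["emotion recognition", "eeg"],
]

# Occurrences: in how many publications each keyword appears.
occurrences = Counter(k for rec in records for k in set(rec))

# Total link strength: for each keyword, the number of co-occurrence
# links with other keywords, summed over all publications.
link_strength = Counter()
for rec in records:
    for a, b in combinations(sorted(set(rec)), 2):
        link_strength[a] += 1
        link_strength[b] += 1

for kw, n in occurrences.most_common():
    print(f"{kw}: occurrences={n}, link strength={link_strength[kw]}")
```

This mirrors how the top-25 keyword table in the Results section should be read: a keyword can have modest occurrences yet a high link strength if it co-occurs with many different terms.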
Results
Literature Search Results and Publication Trends
On February 19, 2024, we searched the Core Collection on WOS with the relevant indexes. The results totaled 952 publications, among which 283 were early-access papers. For all these results, Web of Science revealed the year-based publication trends. As demonstrated in Figure 1, although we selected to display the publication numbers for the past 25 years, studies before 2008 were too few to be displayed. The current peak was recorded in 2023 (N = 189). In 2024, ten publications had been published by the time of data collection. The number of publications has increased sharply since around 2020. We then analyzed the Web of Science categories to which these results belong. The top ten categories are shown in Figure 2.


The top ten publication titles, i.e., journal names, are shown in Figure 3. The related studies were widely distributed across different journals, as demonstrated by the low percentage of publications in each journal. Figure 4 shows that the People’s Republic of China and the USA are the two countries with the most publications.


Visualization Analysis and Most-Cited Items (RQ1)
VOSviewer visualized 152 keyword items from the included studies. The top 25 keywords, with their occurrences and total link strengths, were as follows:
- artificial intelligence (occurrences = 205, link strength = 555)
- emotion recognition (occurrences = 111, link strength = 408)
- deep learning (occurrences = 110, link strength = 354)
- machine learning (occurrences = 107, link strength = 353)
- emotion (occurrences = 62, link strength = 174)
- affective computing (occurrences = 58, link strength = 201)
- classification (occurrences = 54, link strength = 232)
- recognition (occurrences = 49, link strength = 154)
- model (occurrences = 41, link strength = 128)
- sentiment analysis (occurrences = 39, link strength = 134)
- feature extraction (occurrences = 36, link strength = 156)
- emotions (occurrences = 31, link strength = 87)
- system (occurrences = 31, link strength = 106)
- natural language processing (occurrences = 28, link strength = 90)
- education (occurrences = 24, link strength = 55)
- eeg (occurrences = 23, link strength = 97)
- convolutional neural networks (occurrences = 21, link strength = 77)
- depression (occurrences = 21, link strength = 71)
- face recognition (occurrences = 21, link strength = 93)
- facial expression recognition (occurrences = 21, link strength = 66)
- ai (occurrences = 18, link strength = 48)
- children (occurrences = 18, link strength = 50)
- convolutional neural network (occurrences = 17, link strength = 53)
- design (occurrences = 16, link strength = 45)
- features (occurrences = 16, link strength = 69)

As demonstrated by the different colors in Figure 5, these keywords were grouped into 5 clusters. Cluster 4 (13 items, in yellow) and cluster 2 (24 items, in green) contained nodes larger than the rest, representing the keyword items that occurred most frequently. Cluster 4 was represented by keyword items such as “anxiety”, “artificial intelligence”, “mental health”, and “depression”. Cluster 2 was represented by “deep learning” and “emotion recognition”. Other noticeable nodes included “education” and “emotion” (cluster 1, in red), “affective computing” (cluster 3, in blue), and “personality” (cluster 5, in purple).
We also identified the top 10 authors and organizations according to the citations of each item. The top 10 authors on this topic were: Achini Adikari (documents = 5, citations = 101, link strength = 9), Damminda Alahakoon (documents = 5, citations = 101, link strength = 9), Brian A. Bottge (documents = 5, citations = 129, link strength = 0), Chen Xieling (documents = 4, citations = 81, link strength = 12), Cheng Gary (documents = 4, citations = 81, link strength = 12), Xie Haoran (documents = 4, citations = 81, link strength = 12), Zou Di (documents = 4, citations = 81, link strength = 12), Daswin De Silva (documents = 4, citations = 88, link strength = 8), Ketan Kotecha (documents = 4, citations = 32, link strength = 6), and Nick Haber (documents = 3, citations = 58, link strength = 9).
The top 11 organizations were: King Saud University (documents = 12, citations = 136, link strength = 13), Prince Sattam Bin Abdulaziz University (documents = 11, citations = 47, link strength = 17), Princess Nourah bint Abdulrahman University (documents = 10, citations = 33, link strength = 20), King Khalid University (documents = 7, citations = 19, link strength = 18), McGill University (documents = 7, citations = 77, link strength = 3), Sejong University (documents = 7, citations = 316, link strength = 1), Stanford University (documents = 7, citations = 661, link strength = 1), University of Southern California (documents = 6, citations = 155, link strength = 3), University of Cambridge (documents = 6, citations = 913, link strength = 2), Chinese Academy of Sciences (documents = 6, citations = 113, link strength = 1), and King Abdulaziz University (documents = 6, citations = 85, link strength = 1).
Literature Clustering
Clustering Application Themes in Education
To reveal the primary themes of emotion artificial intelligence in education (RQ2), we clustered the results with CitNetExplorer. The clustering function of the software is based on the citation relationships between the literature search results. All 729 publications and their references were grouped into 6 clusters, as Figure 6 demonstrates. Cluster 1 (in blue) included 156 publications, cluster 2 (in green) 141, cluster 3 (in purple) 63, cluster 4 (in orange) 46, cluster 5 (in yellow) 17, and cluster 6 (in pink) 9. The remaining 297 results were not assigned to any cluster because the minimum cluster size was set to 10 publications. As the clustering is based on close connections built on citations, we examined the clustering criteria. Clusters 1 and 2 had the largest numbers of publications and citation links; the other four clusters contained fewer publications, each with citation scores above 15. Given the significant volume of publications and citation connections, our analysis primarily concentrated on the first two clusters.
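CitNetExplorer's clustering itself uses a modularity-based algorithm, but the basic idea of grouping citation-connected publications and discarding groups below a minimum size can be illustrated with a much simpler stand-in: connected components of the (undirected) citation graph with a size threshold. The graph below is a toy example, not data from this study.

```python
from collections import defaultdict

# Toy directed citation links: (citing paper, cited paper).
citations = [("A", "B"), ("B", "C"), ("D", "E"), ("F", "G"), ("G", "C")]

# Build an undirected adjacency list from the citation links.
adj = defaultdict(set)
for citing, cited in citations:
    adj[citing].add(cited)
    adj[cited].add(citing)

def components(adj, min_size=1):
    """Group nodes into connected components, keeping only components
    that reach the minimum size (analogous to a minimum cluster size)."""
    seen, clusters = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(adj[n] - comp)
        seen |= comp
        if len(comp) >= min_size:
            clusters.append(comp)
    return clusters

print(components(adj, min_size=3))  # {A,B,C,F,G} kept; {D,E} below threshold
```

Unclustered publications in the real analysis correspond to components (or modularity groups) that fall below the threshold, which is why 297 results remained unassigned.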
The clustering results revealed two main research domains of emotion artificial intelligence in education: models and frameworks for emotion recognition, and emotion recognition channels and methods. Despite some interdisciplinary citations and presumably erroneous clustering results from the software, each cluster demonstrated a focus on different research preferences. According to the dominant research topics of the most cited and recent literature, we summarized the focus of each cluster. In cluster 1, studies focused on different models for emotion recognition, such as the CNN-Bi-LSTM-Attention model proposed to automatically extract features and categorize emotions from EEG signals [30], and support vector machine (SVM)-based machine learning models used to classify the emotional states and attention levels of participants in a video conversation [31]. Adikari proposed a self-structuring AI framework for deep emotion modeling (2021). By contrast, cluster 2 contained studies on the various channels used to measure emotion, including textual, visual, vocal, physiological, and multimodal channels [6], as well as deep learning approaches for recognition [32,33]. The clustering results did not lead to an absolute and clear-cut grouping of the literature, since some studies in each cluster might belong to the focus of other clusters, and unclustered studies took a considerable proportion. However, this did not prevent the identification of the main research themes of emotion artificial intelligence in educational research (Figure 6).
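The models named above (CNN-Bi-LSTM-Attention, SVM) are beyond the scope of a short sketch, but the pipeline they share — extract features from a signal, then classify them into emotional states — can be illustrated with a deliberately simplified stand-in: toy band-power-like features and a nearest-centroid classifier. Both the feature values and the classifier are illustrative substitutes, not the methods of [30] or [31].

```python
import math

# Toy training data: (feature vector, label). The two features stand in
# for per-band EEG power values; real systems use far richer features.
TRAIN = [
    ((0.9, 0.1), "positive"), ((0.8, 0.2), "positive"),
    ((0.2, 0.9), "negative"), ((0.1, 0.8), "negative"),
]

def centroids(samples):
    """Average the feature vectors of each class into a centroid."""
    sums = {}
    for x, y in samples:
        sx, n = sums.get(y, ((0.0,) * len(x), 0))
        sums[y] = (tuple(a + b for a, b in zip(sx, x)), n + 1)
    return {y: tuple(a / n for a in sx) for y, (sx, n) in sums.items()}

def classify(x, cents):
    """Assign a feature vector to the class with the nearest centroid."""
    return min(cents, key=lambda y: math.dist(x, cents[y]))

cents = centroids(TRAIN)
print(classify((0.85, 0.15), cents))  # near the "positive" centroid
```

Swapping the nearest-centroid step for an SVM or a recurrent network changes the decision boundary, not the overall extract-then-classify structure that cluster 1 studies share.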
Foundations and Developments of Emotion Artificial Intelligence Applied to Education
The most cited publications are analyzed in this part. In cluster 1, the pioneering publication with the highest citation score was written by Russell, in which he presented an affective space model to differentiate various emotional states (1980). The second and third most cited were published in 1997 and 2012. Long short-term memory, a type of recurrent neural network proposed by Hochreiter and Schmidhuber [34], laid the theoretical foundation for later emotional state recognition. Koelstra et al. [35] used electroencephalogram (EEG) and peripheral physiological signals to analyze human affective states. Following these three publications, more and more emotion recognition models were researched, such as EEG-based emotion recognition models [36,37] (Şengür & Siuly, 2020). Sajno et al. [38] conducted a narrative review to provide a comprehensive overview of the use of machine learning algorithms to infer psychological states from biosignals. In cluster 2, the three most cited publications concerned deep learning-based methods for the recognition of facial expressions [39,40]. Following these deep learning-based methods, emotion measurement channels were explored and discussed, such as vocal methods [41], visual methods [9,42], and textual methods [10].

Discussion
Publication Trends
Due to the evolution of educational formats, the integration of online and offline teaching has emerged as the pivotal development trajectory for education, both presently and in the future [43]. Applying emotion artificial intelligence to education has been an emerging topic within the past several years. There is a need for all learning applications and platforms to possess an integrated capability to identify and track learners’ emotions [6]. Utilizing artificial intelligence for emotion recognition holds significant benefits for individuals as well as for humanity [44]. The precision of AI systems in perceiving and interpreting human emotions paves the way for enhanced human-computer interactions, tailored user experiences, commercial use in psychiatry, and educational advancements [18,45]. EAI intended for educational purposes still requires further specification and necessitates interdisciplinary research to validate its applicability in educational settings. The implementation of EAI faces several challenges in emotion analysis, including issues with data sources and the constraints of current models [46].
Although the research themes of emotion artificial intelligence in education vary across research areas, the primary themes are concentrated on models and frameworks for emotion recognition, and on emotion recognition channels and methods. Emotion measurement channels include textual, visual, vocal, physiological, and multimodal channels (Chen & Sun, 2012) [6,47]. Emotion models are mainly recognized as the discrete emotion model and the dimensional emotion model (Bakker et al., 2014) [48,49].
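The two model families can be related: a dimensional estimate (valence, arousal) can be mapped back to a discrete label by a nearest-neighbor lookup against prototype coordinates. The sketch below is a minimal illustration; the prototype placement is loosely inspired by Russell's circumplex, but the exact coordinate values are assumptions, not empirically calibrated data.

```python
import math

# Hypothetical prototype coordinates (valence, arousal) in [-1, 1];
# the placement is illustrative only.
PROTOTYPES = {
    "joy":      ( 0.8,  0.5),
    "anger":    (-0.6,  0.7),
    "sadness":  (-0.7, -0.4),
    "fear":     (-0.5,  0.8),
    "surprise": ( 0.3,  0.9),
    "disgust":  (-0.8,  0.2),
}

def discrete_label(valence, arousal):
    """Map a dimensional emotion estimate to the nearest discrete label."""
    return min(PROTOTYPES,
               key=lambda e: math.dist((valence, arousal), PROTOTYPES[e]))

print(discrete_label(0.7, 0.4))  # falls nearest the "joy" prototype
```

This kind of mapping is one reason the two model families coexist in the literature: dimensional models suit continuous sensor outputs, while discrete labels suit feedback and reporting.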
The abilities for emotion artificial intelligence in education
Emotion artificial intelligence needs to cover personal and social competence to facilitate its successful and productive use in education. Personal competence mainly refers to emotional awareness (EA), the ability to recognize others’ emotions [50]. For artificial intelligence in education, emotion recognition involves the meticulous analysis and processing of emotional signals gathered by sensors and processed by a computer, enabling the accurate determination of the user’s current emotional state [51]. Through emotional expression, individuals communicate a vast amount of information that is conducive to interpersonal communication. Thus, it is vital for artificial intelligence to have emotion recognition ability, which can help it respond to users efficiently in education. Weng and Lin analyzed multimodal emotion recognition algorithms in artificial intelligence (2022). Vempati and Sharma reviewed the automated recognition of emotions from EEG signals using artificial intelligence techniques [52]. Besides, emotion artificial intelligence requires social competence to communicate with users. Academic emotions can significantly impact the learning effect due to their direct correlation with factors such as motivation, controllability, and cognition [53]. The empathetic dimension has been shown to exert a positive impact on the application of AI chatbots for enhancing learners’ learning outcomes, especially when addressing their emotional discomfort. Magno and Dossena showed that the quality of information and emotional experiences provided by chatbots plays a pivotal role in determining customer satisfaction [54].
Conclusion
Major Findings
Motivated by the benefits that blended teaching has brought to education, artificial intelligence has gained much attention and exploration, attracting rising academic interest; this is especially true of emotion artificial intelligence, which is very helpful for recognizing users’ emotional states and regulating their learning. The existing literature from recent years has not presented a literature review or investigated the requisite abilities of EAI. This study explores EAI application in educational contexts through visualization and citation network analyses of this topic, leading to the following findings. First, the research and application of EAI in education is a hot topic due to its advantages in distance education and the advocacy of blended learning. Second, the hot issues on this topic include models and frameworks, channels, and methods for emotion recognition. Third, EAI needs to cover personal and social competence to facilitate its successful and productive use in education.
Limitations
There are some limitations in this paper. First, literature not retrieved from the Web of Science Core Collection is not included. Second, the findings would be strengthened by combination with empirical study. Nevertheless, the paper aims to provide reliable theoretical support for later studies in this area.
Implications for Future Studies
There are theoretical implications for future studies. This study may offer policymakers and practitioners in the various education sectors fresh perspectives on the more efficient utilization of EAI, and this systematic review of the literature can help improve the design of EAI so that it better serves education. Additionally, the findings carry practical implications: teaching practitioners can test and build on this review of EAI in the process of teaching. The existing studies have focused on the classification of users’ emotional states; however, studies on the reasons why learners develop different emotions during learning are rare, which can be addressed in further studies.
Acknowledgment
None.
Conflict of Interest
No conflict of interest.
References
- Ong DC, Soh H, Zaki J, Goodman ND (2021) Applying Probabilistic Programming to Affective Computing. IEEE transactions on affective computing 12(2): 306-317.
- Caruelle D, Shams P, Gustafsson A, Lervik Olsen L (2022) Affective computing in marketing: Practical implications and research opportunities afforded by emotionally intelligent machines. Marketing Letters 33(1): 163-169.
- Lin YP, Yu ZG (2023) A bibliometric analysis of artificial intelligence chatbots in educational contexts. Interactive Technology and Smart Education.
- Pavlik JV (2023) Collaborating with ChatGPT: Considering the Implications of Generative Artificial Intelligence for Journalism and Media Education. Journalism & Mass Communication Educator 78(1): 84-93.
- Wang H, Huang H, Makedon F (2013a) Emotion detection via discriminant Laplacian embedding. Universal Access in the Information Society 13(1): 23-31.
- Yadegaridehkordi E, Noor NFBM, Bin Ayub MN, Affal HB, Hussin NB (2019) Affective computing in education: A systematic review and future research. Computers & Education 142(12): 103649.
- So HJ, Lee JH, Park HJ (2019) Affective computing in education: platform analysis and academic emotion classification. International Journal of Advanced Smart Convergence 8(2): 8-17.
- Blanchard EG, Volfson B, Hong YJ, Lajoie SP (2009) Affective Artificial Intelligence in Education: From Detection to Adaptation. Artificial Intelligence in Education: Building Learning Systems that Care: From Knowledge Representation to Affective Modelling, Proceedings of the 14th International Conference on Artificial Intelligence in Education, AIED 2009, Brighton, UK. DBLP.
- Saneiro M, Santos OC, Salmeron-Majadas S, Boticario JG (2014) Towards emotion detection in educational scenarios from facial expressions and body movements through multimodal approaches. The Scientific World Journal.
- Lin HCK, Wang CH, Chao CJ, Chien MK (2012) Employing textual and facial emotion recognition to design an affective tutoring system. Turkish Online Journal of Educational Technology 11(4): 418-426.
- Picard RW (1995) Affective Computing, M.I.T Media Laboratory Perceptual Computing Section Technical Report 321: 1-26.
- Strauss M, Reynolds C, Hughes S, Park K, McDarby G, et al. (2005) Affective Computing: A Review. Springer: Berlin/Heidelberg, Germany pp. 699-706.
- Benitez-Quiroz CF, Srinivasan R, Martinez, AM (2016) EmotioNet: An accurate, real-time algorithm for the automatic annotation of a million facial expressions in the wild. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
- Dehghan A, Ortiz EG, Shu G, Masood SZ (2017) Dager: deep age, gender and emotion recognition using convolutional neural network.
- Corvite S, Roemmich K, Rosenberg TI, Andalibi N (2023) Data subjects’ perspectives on emotion artificial intelligence use in the workplace: A relational ethics lens. Proceedings of the ACM on Human-Computer Interaction 7(CSCW1): 1-38.
- Ekman P (1999) Basic emotions. Handbook of Cognition and Emotion pp. 45-60.
- Ballesteros JA, Ramírez VGM, Moreira F, Solano A, Pelaez CA (2024) Facial emotion recognition through artificial intelligence. Frontiers in Computer Science 6: 1359471.
- Monteith S, Glenn T, Geddes J, Whybrow PC, Bauer M (2022) Commercial use of Emotion Artificial Intelligence (AI): Implications for psychiatry. Current Psychiatry Reports 24(3): 203-211.
- Reindl S (2021) Emotion AI in education: A literature review. International Journal of Learning Technology 16(4): 288.
- Kort B, Reilly R, Picard RW (n.d.) An affective model of interplay between emotions and learning: Reengineering educational pedagogy-building a learning companion. Proceedings IEEE International Conference on Advanced Learning Technologies.
- Faria AR, Almeida A, Martins C, Gonçalves R, Martins J, et al. (2017). A global perspective on an emotional learning model proposal. Telematics and Informatics 34(6): 824-837.
- Pekrun R, Schutz PA (2007) Emotion in education. Elsevier.
- Muñoz K, Mc Kevitt P, Lunney T, Noguez J, Neri L (2011) An emotional student model for game-play adaptation. Entertainment Computing 2(2): 133-141.
- Baldassarri S, Hupont I, Abadía D, Cerezo E (2015) Affective-aware tutoring platform for interactive digital television. Multimedia Tools and Applications 74(9): 3183-3206.
- Hasnul MA, Aziz NA, Alelyani S, Mohana M, Aziz A Abd (2021) Electrocardiogram-based emotion recognition systems and their applications in healthcare-a review. Sensors 21(15): 5015.
- Llurba C, Fretes G, Palau R (2022) Pilot study of real-time emotional recognition technology for secondary school students. Interaction Design and Architecture(s) 52: 61-80.
- Tian F, Gao P, Li L, Zhang W, Liang H, et al. (2014a) Recognizing and regulating e-learners’ emotions based on interactive Chinese texts in e-learning systems. Knowledge-Based Systems 55: 148-164.
- Lin HCK, Wu CH, Hsueh YP (2014) The influence of using affective tutoring system in accounting remedial instruction on learning performance and usability. Computers in Human Behavior 41: 514-522.
- Woolf BP, Arroyo I, Cooper D, Burleson W, Muldner K (2010) Affective tutors: automatic detection of and response to student emotion. Springer Berlin Heidelberg 308: 207-227.
- Huang Z, Ma Y, Wang R, Li W, Dai Y (2023) A model for EEG-based emotion recognition: CNN-bi-LSTM with attention mechanism. Electronics 12(14): 3188.
- Kodithuwakku J, Arachchi DD, Rajasekera J (2022) An emotion and attention recognition system to classify the level of engagement to a video conversation by participants in real time using machine learning models and utilizing a neural accelerator chip. Algorithms 15(5): 150.
- Trabelsi Z, Alnajjar F, Parambil MMA, Gochoo M, Ali L (2023) Real-time attention monitoring system for classroom: A deep learning approach for student’s behavior recognition. Big Data and Cognitive Computing 7(1): 48.
- He E, Chen Q, Zhong Q (2023) SL-Swin: A transformer-based deep learning approach for macro- and micro-expression spotting on small-size expression datasets. Electronics 12(12).
- Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput 9(8): 1735-1780.
- Koelstra S, Muhl C, Soleymani M, Lee JS, Yazdani A, et al. (2012) DEAP: A database for emotion analysis using physiological signals. IEEE Transactions on Affective Computing 3(1): 18-31.
- Zheng WL, Lu BL (2015) Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Transactions on Autonomous Mental Development 7(3): 162-175.
- Fares A, Zhong S, Jiang J (2019) EEG-based image classification via a region-level stacked bi-directional deep learning framework. BMC Med Inform Decis Mak 19(6): 268.
- Sajno E, Bartolotta S, Tuena C, Cipresso P, Pedroli E, et al. (2023) Machine learning in biosignals processing for mental health: A narrative review. Frontiers in Psychology 13.
- Krizhevsky A, Sutskever I, Hinton GE (2012) ImageNet classification with deep convolutional neural networks. Adv. Neural Inf Process Syst 25: 1097-1105.
- Lucey P, Cohn JF, Kanade T, Saragih J, Ambadar Z, et al. (2010) The extended cohn-kanade dataset (ck+): A complete dataset for action unit and emotion-specified expression. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition-Workshops, San Francisco, CA, USA pp. 94-101.
- Yoo H, Baek JW, Chung K (2021) CNN-Based Voice Emotion Classification Model for Risk Detection. Intelligent Automation & Soft Computing 29(2): 319-334.
- Park SJ, Kim BG, Chilamkurti N (2021) A robust facial expression recognition algorithm based on multi-rate feature fusion scheme. Sensors 21(21): 6954.
- Yang Y, Zhang H, Chai H, Xu W (2023) Design and application of intelligent teaching space for blended teaching. Interactive Learning Environments 31(10): 6147-6164.
- Tanabe H, Shiraishi T, Sato H, Nihei M, Inoue T, et al. (2023) A concept for emotion recognition systems for children with profound intellectual and multiple disabilities based on artificial intelligence using physiological and motion signals. Disability and Rehabilitation: Assistive Technology pp. 1-8.
- Lu X (2022) Deep learning-based emotion recognition and visualization of figural representation. Frontiers in Psychology 12: 818833.
- Rokhsaritalemi S, Sadeghi Niaraki A, Choi SM (2023) Exploring emotion analysis using artificial intelligence, geospatial information systems, and extended reality for urban services. IEEE Access 11: 92478-92495.
- Harley JM, Bouchet F, Hussain MS, Azevedo R, Calvo R (2015) A multi-componential analysis of emotions during complex learning with an intelligent multi-agent system. Computers in Human Behavior 48: 615-625.
- Cai Y, Li X, Li J (2023) Emotion recognition using different sensors, emotion models, methods, and datasets: A comprehensive review. Sensors 23: 2455.
- Plutchik R (2003) Emotions and life: perspectives from psychology, biology, and evolution. American Psychological Association.
- Elyoseph Z, Hadar Shoval D, Asraf K, Lvovsky M (2023) Chatgpt outperforms humans in emotional awareness evaluations. Frontiers in Psychology 14.
- Weng Y, Lin F (2022) Multimodal emotion recognition algorithm for artificial intelligence information system. Wireless Communications and Mobile Computing.
- Vempati R, Sharma L (2023) A systematic review on automated human emotion recognition using electroencephalogram signals and artificial intelligence. Results in Engineering 18.
- Mao X, Li Z (2010) Agent based affective tutoring systems: A pilot study. Comput Educ 55(1): 202-208.
- Magno F, Dossena G (2022) The effects of Chatbots’ attributes on customer relationships with brands: PLS-sem and importance-performance map analysis. The TQM Journal 35(5): 1156-1169.
- Adikari A, Gamage G, de Silva D, Mills N, Wong SMJ, et al. (2021) A self-structuring artificial intelligence framework for deep emotions modeling and analysis on the social web. Future Generation Computer Systems 116: 302-315.
- Ekman P, Friesen WV, O’Sullivan M, Chan A, et al. (1987) Universal and cultural differences in the judgments of facial expressions of emotion. Journal of Personality and Social Psychology 53(4): 712-717.
- Hill J, Randolph Ford W, Farreras IG (2015) Real conversations with artificial intelligence: A comparison between human-human online conversations and human-chatbot conversations. Computers in Human Behavior 49: 245-250.
- Huang KL, Duan SF, Lyu X (2021) Affective voice interaction and artificial intelligence: A research study on the acoustic features of gender and the emotional states of the pad model. Frontiers in Psychology 12.
- Picard RW (2000) Affective Computing. Cambridge, MA, MIT Press, USA.
- Zhai C, Wibowo S (2022) A systematic review on cross-culture, humor, and empathy dimensions in conversational chatbots: the case of second language acquisition. Heliyon 8(12): e12056.
Tingna Wei* and Mengyao Li. A Bibliometric Analysis of Emotion Artificial Intelligence in Education Using VOSviewer and CitNetExplorer. Iris J of Edu & Res. 3(3): 2024. IJER.MS.ID.000565.
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.