Citation Link: https://doi.org/10.25819/ubsi/10782
Integrating Learning Context and Explainability in Educational Recommender Systems Using Markov Decision Process over Knowledge Graphs
Alternate Title
Integration von Lernkontext und Erklärbarkeit in Bildungsempfehlungssystemen unter Verwendung des Markov-Entscheidungsprozesses über Wissensgraphen
Source Type
Doctoral Thesis
Author
Issue Date
2025
Abstract
Human learning is a complex and multi-dimensional process, governed by a wide range of factors that describe the individual differences between learners and the contexts in which learning happens. Because these differences influence how learners respond to learning content and activities, an effective learning process must take these factors into account when generating learning recommendations. Context-aware recommender systems (CARS) offer a promising solution for tailoring learning experiences to specific learning contexts. However, existing CARS often fall short in comprehensively integrating complex contextual data into their reasoning. Moreover, complex CARS face challenges in providing transparent and explainable recommendations to learners and educators, especially those without a technical background, which hinders CARS’s effectiveness and acceptance among these educational stakeholders.
This thesis addresses these challenges by developing a novel method for building Context-Aware Recommendations and Explainability through Knowledge Graphs (CARExKG). Structurally, the CARExKG method employs knowledge graphs to represent contextual learning factors and their interdependencies, capturing the dynamic interplay between contextual variables in complex learning settings. Algorithmically, CARExKG employs a Markov decision process over knowledge graphs, featuring a context-sensitive reward function that enables the recommender system (RS) to generate contextualized learning paths across diverse learning settings.
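To make this formulation more concrete, the following minimal sketch frames path recommendation as a Markov decision process over a knowledge graph, with graph nodes as states, outgoing edges as actions, and a reward that blends goal relevance with learner-context fit. All class names, fields, weights, and the one-step greedy policy are illustrative assumptions made for this sketch only, not the formulation or implementation developed in the thesis.

```python
# Illustrative sketch: MDP-style path recommendation over a small knowledge graph.
# States are graph nodes, actions are outgoing edges, and the reward mixes the
# item's goal relevance with how well it fits the learner's current context.
from dataclasses import dataclass, field

@dataclass
class LearnerContext:
    # Example contextual factors; the thesis models richer interdependencies.
    available_minutes: int
    prior_knowledge: set = field(default_factory=set)

@dataclass
class KGNode:
    node_id: str
    duration_minutes: int
    prerequisites: set     # node_ids the learner should already know
    relevance: float       # relevance to the learning goal, in [0, 1]
    neighbors: list        # action space: reachable next items in the graph

def reward(node: KGNode, ctx: LearnerContext, w_rel=0.6, w_ctx=0.4) -> float:
    """Context-sensitive reward: goal relevance plus a simple context-fit score."""
    fits_time = node.duration_minutes <= ctx.available_minutes
    has_prereqs = node.prerequisites <= ctx.prior_knowledge
    context_fit = (fits_time + has_prereqs) / 2.0
    return w_rel * node.relevance + w_ctx * context_fit

def greedy_path(start: KGNode, ctx: LearnerContext, horizon: int = 5) -> list:
    """One-step-lookahead policy over the graph; a stand-in for full MDP planning."""
    ctx.prior_knowledge.add(start.node_id)
    path, current = [start], start
    for _ in range(horizon):
        if not current.neighbors:
            break
        current = max(current.neighbors, key=lambda n: reward(n, ctx))
        ctx.prior_knowledge.add(current.node_id)          # learner state evolves
        ctx.available_minutes -= current.duration_minutes
        path.append(current)
    return path

# Tiny usage example with two hypothetical learning items:
intro = KGNode("hygiene_intro", 10, set(), 0.7, [])
advanced = KGNode("infection_control", 20, {"hygiene_intro"}, 0.9, [])
intro.neighbors.append(advanced)
ctx = LearnerContext(available_minutes=35)
print([n.node_id for n in greedy_path(intro, ctx)])
```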
To ensure human oversight, reduce stakeholders’ resistance to recommendations, and encourage collaborative human-AI decision-making, CARExKG further incorporates an explainability framework that uses large language models and expert input from pedagogy specialists to generate user-centric explanations. These explanations support learners’ understanding of the reasoning behind the recommended path and strengthen their decision-making ability and ownership of their educational journey.
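As a rough illustration of how such an explainability step might be wired together, the sketch below turns a recommended path, the matched context factors, and an expert-provided pedagogical hint into a prompt for a large language model. The prompt template, the example values, and the single-function design are hypothetical placeholders, not the explanation framework described in the thesis.

```python
# Illustrative sketch: composing a user-centric explanation prompt from the
# recommended path, the contextual factors it matched, and pedagogical guidance.
def build_explanation_prompt(path_titles, matched_factors, pedagogy_hint):
    steps = " -> ".join(path_titles)
    factors = ", ".join(matched_factors)
    return (
        "Explain to a learner, in plain language and in at most four sentences, "
        f"why the learning path '{steps}' was recommended. "
        f"Ground the explanation in these contextual factors: {factors}. "
        f"Follow this guidance from pedagogy experts: {pedagogy_hint}"
    )

prompt = build_explanation_prompt(
    ["Hand hygiene basics", "Infection control in elderly care"],
    ["limited shift time", "prior first-aid training"],
    "Emphasise relevance to daily care tasks rather than test performance.",
)
# The resulting prompt would then be sent to a large language model of choice
# to produce the explanation text shown to the learner.
```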
To evaluate the proposed method, a set of experiments was designed to measure the effectiveness of the knowledge graphs, the recommendation algorithm, and the explainability framework. CARExKG was then evaluated as a complete system in a real-world user study with nursing staff in two elderly care homes, where a complex learning scenario was constructed to mimic the multi-dimensional challenges they face in their profession. The evaluation results demonstrate the effectiveness of the proposed approach for constructing the knowledge graph and for the reasoning of the RS. The results also show that the CARExKG method improves learner satisfaction and outcomes in vocational education and training settings. These findings underscore the importance of the interdisciplinary approach followed in designing CARExKG, which combines artificial intelligence, educational technology, and pedagogy to create adaptive, explainable, and learner-centered educational tools.
File(s)
Name
Dissertation_Rasheed_Hasan_Abu.pdf
Size
6.53 MB
Format
Adobe PDF
Checksum (MD5)
1fdcc4810e7506e0ad3831ffca2132af
Owning collection