Title: Towards user-centered explainable energy demand forecasting systems
Authors: Shajalal, Md; Boden, Alexander; Stevens, Gunnar
Type: InProceedings
Issued: 2022
Available: 2025-12-05
Language: en
Subjects: 004 Informatik (computer science); Explainable Energy Demand Forecasting; Human-centered Explanation; DeepLIFT; Shapley Additive Explanation; LSTM; Erklärbare Energiebedarfsprognose (explainable energy demand forecasting)
Handle: https://dspace.ub.uni-siegen.de/handle/ubsi/7244
DOI: https://doi.org/10.1145/3538637.3538877
URN: urn:nbn:de:hbz:467-72447
Note: This is the preprint and accepted version.

Abstract: In recent years, eXplainable Artificial Intelligence (XAI) has received considerable attention as a way to explain the decision-making processes of machine learning models. The aim is to increase the acceptance, trust, and transparency of AI models by providing explanations of the models' decisions. However, most prior work on XAI focuses on supporting AI practitioners and developers in understanding and debugging models. In this paper, we propose a user-centered explainable energy demand prediction and forecasting system that aims to provide explanations to end-users in the smart home. In doing so, we present an overview of the explainable system and propose a method combining Deep Learning Important FeaTures (DeepLIFT) and Shapley Additive Explanations (SHAP) to explain the predictions of an LSTM-based energy forecasting model.
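The DeepLIFT/SHAP combination mentioned in the abstract rests on SHAP's local-accuracy (additivity) property: per-feature attributions sum to the difference between the model's output for the instance and for a reference (baseline) input. A minimal sketch of that property, using a toy linear model instead of the paper's LSTM (for linear models the exact SHAP values have a closed form); all weights and inputs here are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Toy stand-in for a forecasting model: a linear model, whose exact
# SHAP values have the closed form phi_i = w_i * (x_i - baseline_i).
w = np.array([0.5, -1.2, 2.0])   # illustrative feature weights
b = 0.3                          # illustrative bias

def model(x):
    return float(w @ x + b)

x = np.array([1.0, 0.5, -0.2])   # instance to explain (hypothetical readings)
baseline = np.zeros(3)           # reference input, e.g. an "average" profile

phi = w * (x - baseline)         # per-feature attributions

# SHAP's local-accuracy property: attributions sum to f(x) - f(baseline).
assert np.isclose(phi.sum(), model(x) - model(baseline))
print(phi)  # one attribution per input feature
```

For a deep model such as an LSTM, the closed form above does not apply; Deep SHAP approximates these attributions by propagating DeepLIFT-style contribution scores through the network against the baseline.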