Academic Paper

Enhancing Household Energy Consumption Predictions Through Explainable AI Frameworks
Document Type
Periodical
Source
IEEE Access. 12:36764-36777, 2024
Subject
Aerospace
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Engineered Materials, Dielectrics and Plasmas
Engineering Profession
Fields, Waves and Electromagnetics
General Topics for Engineers
Geoscience
Nuclear Engineering
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Predictive models
Forecasting
Explainable AI
Data models
Energy consumption
Artificial intelligence
Vectors
Energy management
Machine learning
Sustainable development
Low-carbon economy
Renewable energy sources
Home automation
Closed box
energy forecasting
feature importance
household energy consumption
machine learning models
XAI
Language
English
ISSN
2169-3536
Abstract
Effective energy management is crucial for sustainability, carbon reduction, resource conservation, and cost savings. However, conventional energy forecasting methods often lack accuracy, suggesting the need for advanced approaches. Artificial intelligence (AI) has emerged as a powerful tool for energy forecasting, but its lack of transparency and interpretability poses challenges for understanding its predictions. In response, Explainable AI (XAI) frameworks have been developed to enhance the transparency and interpretability of black-box AI models. Accordingly, this paper focuses on achieving accurate household energy consumption predictions by comparing prediction models on several evaluation metrics, namely the Coefficient of Determination (R²), Root Mean Squared Error (RMSE), Mean Squared Error (MSE), and Mean Absolute Error (MAE). The best model is identified by comparing predictions on unseen data, after which the predictions are explained using two XAI frameworks: Local Interpretable Model-Agnostic Explanations (LIME) and Shapley Additive Explanations (SHAP). These explanations identify the characteristics that contribute most to energy consumption predictions, including insights into feature importance. Our findings underscore the significance of current consumption patterns and lagged energy consumption values in estimating energy usage. This paper further demonstrates the role of XAI in developing consistent and reliable predictive models.
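The four evaluation metrics named in the abstract have standard closed-form definitions. As a minimal illustrative sketch (not the paper's own code), they can be computed in plain Python; the example consumption values below are hypothetical:

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute R^2, MSE, RMSE, and MAE for equal-length sequences
    of actual and predicted values."""
    n = len(y_true)
    mean_true = sum(y_true) / n
    # Residual sum of squares and total sum of squares for R^2
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_true) ** 2 for t in y_true)
    mse = ss_res / n
    return {
        "R2": 1 - ss_res / ss_tot,
        "MSE": mse,
        "RMSE": math.sqrt(mse),
        "MAE": sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n,
    }

# Hypothetical hourly household consumption (kWh) vs. model predictions
actual = [1.2, 0.9, 1.5, 2.1, 1.8]
predicted = [1.1, 1.0, 1.4, 2.0, 1.9]
print(regression_metrics(actual, predicted))
```

Lower MSE, RMSE, and MAE and an R² closer to 1 indicate a better-fitting model, which is the basis on which the paper compares candidate predictors.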