Enhancing Household Energy Consumption Predictions Through Explainable AI Frameworks

Benedetto F.
2024-01-01

Abstract

Effective energy management is crucial for sustainability, carbon reduction, resource conservation, and cost savings. However, conventional energy forecasting methods often lack accuracy, suggesting the need for advanced approaches. Artificial intelligence (AI) has emerged as a powerful tool for energy forecasting, but its lack of transparency and interpretability poses challenges for understanding its predictions. In response, Explainable AI (XAI) frameworks have been developed to enhance the transparency and interpretability of black-box AI models. Accordingly, this paper focuses on achieving accurate household energy consumption predictions by comparing prediction models based on several evaluation metrics, namely the Coefficient of Determination (R2), Root Mean Squared Error (RMSE), Mean Squared Error (MSE), and Mean Absolute Error (MAE). The best model is identified by comparison after making predictions on unseen data, after which the predictions are explained by leveraging two XAI frameworks: Local Interpretable Model-Agnostic Explanations (LIME) and Shapley Additive Explanations (SHAP). These explanations help identify crucial characteristics contributing to energy consumption predictions, including insights into feature importance. Our findings underscore the significance of current consumption patterns and lagged energy consumption values in estimating energy usage. This paper further demonstrates the role of XAI in developing consistent and reliable predictive models.
2024
Bhandary, A., Dobariya, V., Yenduri, G., Jhaveri, R.H., Gochhait, S., Benedetto, F. (2024). Enhancing Household Energy Consumption Predictions Through Explainable AI Frameworks. IEEE ACCESS, 12, 36764-36777 [10.1109/ACCESS.2024.3373552].
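The abstract describes a two-stage workflow: candidate models are compared on unseen data with R2, RMSE, MSE and MAE, and the best performer is then explained with LIME and SHAP. The Python sketch below illustrates that workflow in minimal form; the synthetic lagged-consumption data, the two placeholder regressors and every name in it are assumptions made for illustration, not the authors' implementation or dataset.

# Hypothetical sketch of the "compare, then explain" workflow summarized in the
# abstract. NOT the authors' code or data: the synthetic lagged-consumption
# features, the two candidate regressors and all names below are illustrative
# assumptions, chosen only to show how R2/RMSE/MSE/MAE model selection can be
# followed by SHAP and LIME explanations.
import numpy as np
import shap                                            # pip install shap
from lime.lime_tabular import LimeTabularExplainer     # pip install lime
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy household series: usage depends mainly on current consumption and its
# one-step lag (echoing the abstract's finding), plus a weather-style feature.
n = 2000
current = rng.gamma(2.0, 1.5, n)                  # current consumption (kWh)
lag_1 = np.roll(current, 1)                       # previous-period consumption
temperature = rng.normal(20.0, 8.0, n)            # ambient temperature (C)
y = 0.6 * current + 0.3 * lag_1 - 0.02 * temperature + rng.normal(0.0, 0.2, n)

feature_names = ["current_kwh", "lag_1_kwh", "temperature_c"]
X = np.column_stack([current, lag_1, temperature])
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Score each candidate model on held-out data with the four metrics named in the paper.
models = {
    "linear_regression": LinearRegression(),
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    mse = mean_squared_error(y_test, pred)
    scores[name] = {
        "R2": r2_score(y_test, pred),
        "MSE": mse,
        "RMSE": mse ** 0.5,
        "MAE": mean_absolute_error(y_test, pred),
    }
    print(name, scores[name])

# Keep the model with the best R2 on unseen data and explain its predictions.
best_name = max(scores, key=lambda k: scores[k]["R2"])
best_model = models[best_name]

# SHAP: model-agnostic attributions for a sample of test rows.
shap_explainer = shap.Explainer(best_model.predict, X_train)
shap_values = shap_explainer(X_test[:100])
mean_abs_shap = np.abs(shap_values.values).mean(axis=0)
print("mean |SHAP| per feature:", dict(zip(feature_names, mean_abs_shap)))

# LIME: local explanation of a single prediction.
lime_explainer = LimeTabularExplainer(X_train, feature_names=feature_names, mode="regression")
lime_exp = lime_explainer.explain_instance(X_test[0], best_model.predict, num_features=3)
print(lime_exp.as_list())

On data generated this way, the SHAP and LIME attributions concentrate on the current-consumption and lagged-consumption features by construction, which mirrors the kind of feature-importance insight the abstract reports for the real household data.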
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11590/471827
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science (ISI): 0