Federated Learning-Based Consumption Prediction in Smart Homes

Smart homes, powered mostly by Internet of Things (IoT) devices, have become very popular due to their ability to provide a holistic approach towards effective energy management. This is made feasible through the deployment of multiple sensors, whose data enable the prediction of energy consumption with machine learning approaches.

Keywords: federated learning; energy consumption; smart homes

1. Introduction

Smart homes have become increasingly popular with the widespread adoption of Internet of Things (IoT) devices [1][2][3][4][5], and have paved the way for improving multiple aspects of homes by utilizing the enormous amounts of data generated every day. One key challenge in this domain is predicting energy consumption to optimize energy management, reduce waste, and save costs [6][7][8]. Several studies have investigated this problem, including both centralized and decentralized approaches [9][10].
Traditional centralized prediction models, such as regression and time-series analysis, require data to be collected and processed on a central server [11][12][13]. However, collecting and transmitting sensitive data from smart homes to a central server can pose privacy concerns. Additionally, these methods do not scale well to large datasets, and the central server can become a bottleneck in the prediction process.
To address these challenges, decentralized approaches based on federated learning (FL), such as FedAvg, have emerged [14]. FL enables multiple clients to train a machine learning model collaboratively without sharing their raw data. FL has been successfully applied to energy load prediction for smart homes, improving prediction accuracy while preserving data privacy [15][16][17]. Current FL approaches, however, do not take the nature and additional properties of the data into account. In particular, limited work has been conducted on federated learning for time-series datasets [18][19], and none of these works exploits the age of the data.

2. Federated Learning-Based Consumption Prediction in Smart Homes via Age-Based Model Weighting

One of the early works on smart home consumption prediction was conducted by [20], who presented an energy management system (EMS) for smart homes. This system uses a data-acquisition module, an IoT device with a unique IP address, to interface with each home device, creating a wireless mesh network of devices. The module, referred to as the system on chip (SoC), collects energy-consumption data from each smart home device and sends them to a central server for analysis. The energy-consumption data from all residential areas are collected on the utility company’s server, resulting in a large collection of big data. The proposed system uses standard business intelligence (BI) and big data analytics software to manage energy consumption effectively and meet consumer demand.
More recently, deep learning techniques have been applied to multiple domains, for example, the area of manufacturing [21][22][23]. In the area of smart home consumption prediction, ref. [24] proposed a convolutional neural network (CNN)-based model for predicting the electricity consumption of smart homes. Similarly, ref. [25] proposed a long short-term memory (LSTM)-based model for predicting the energy consumption of smart homes, experimenting on multiple datasets to show the effectiveness of LSTMs.
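As a concrete illustration of such sequence models, the following is a minimal LSTM forecaster sketch in Python (PyTorch); the window length, layer sizes, and synthetic data are illustrative assumptions rather than the architectures of refs. [24][25].

```python
# Minimal sketch of an LSTM forecaster for household energy consumption.
# Window size, hidden size, and the synthetic batch are assumptions.
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    def __init__(self, n_features=1, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):              # x: (batch, window, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])   # predict the next consumption value

# Toy usage: 24-step windows of hourly consumption -> next-hour prediction.
model = LSTMForecaster()
x = torch.randn(8, 24, 1)              # synthetic batch of 8 windows
y_hat = model(x)                       # shape (8, 1)
loss = nn.MSELoss()(y_hat, torch.randn(8, 1))
loss.backward()
```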
Federated learning (FL), also referred to as collaborative learning, is a machine learning method that enables training an algorithm without transferring data samples between various decentralized edge devices or servers that store local data samples. This approach distinguishes itself from conventional centralized machine learning techniques, where all local datasets are uploaded to a central server, as well as traditional decentralized alternatives that often assume a uniform distribution of local data samples. Numerous studies have investigated the application of FL in predicting energy consumption in smart homes.
Previous work [26] suggested two approaches to decrease the costs associated with uplink communication. The first approach involves utilizing structured updates, which involves learning an update from a limited parameter space that is represented by a smaller set of variables. This can be achieved through techniques like low-rank approximation or applying a random mask. The second approach, known as sketched updates, entails learning a complete model update and then compressing it using a combination of quantization, random rotations, and subsampling before transmitting it to the server. Experimental results on convolutional and recurrent networks demonstrate that these proposed methods can reduce communication costs.
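The sketch below illustrates the flavor of such sketched updates, combining a random subsampling mask with uniform quantization before upload; the keep probability, bit width, and rescaling are assumptions for illustration, not the exact scheme of ref. [26].

```python
# Illustrative "sketched update": subsample with a random mask, then quantize
# the surviving entries. Parameters here are assumptions, not ref. [26]'s setup.
import numpy as np

def sketch_update(update, keep_prob=0.1, levels=256, seed=0):
    rng = np.random.default_rng(seed)
    mask = rng.random(update.shape) < keep_prob          # random subsampling mask
    vals = update[mask]
    lo, hi = vals.min(), vals.max()
    scale = (hi - lo) / (levels - 1) if hi > lo else 1.0
    q = np.round((vals - lo) / scale).astype(np.uint8)   # uniform quantization
    return q, lo, scale, mask

def unsketch(q, lo, scale, mask, keep_prob=0.1):
    out = np.zeros(mask.shape)
    out[mask] = q.astype(np.float64) * scale + lo        # dequantize kept entries
    return out / keep_prob                               # rescale the sparse estimate

update = np.random.randn(1000)                           # a flattened model update
q, lo, scale, mask = sketch_update(update)
approx = unsketch(q, lo, scale, mask)
```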
Ref. [14] introduced a practical technique for federated learning of deep networks using iterative model averaging. They conducted a thorough empirical assessment, utilizing five distinct model architectures and four datasets. The results of these experiments indicate that the proposed approach remains resilient even when confronted with unbalanced and non-independent and identically distributed (non-IID) data distributions, which are common characteristics of this scenario. The authors focused on reducing communication costs, which are a primary constraint in federated learning. They demonstrated that their method significantly reduces the number of communication rounds required, achieving a reduction of 10–100 times compared to synchronized stochastic gradient descent. Ref. [27] presented a system that allows for training a deep neural network using TensorFlow on data stored on a mobile phone. The data remain on the device and are not shared. The weights are combined in the cloud using federated averaging, creating a global model that is sent back to the phones for inference. To ensure privacy, secure aggregation is used to make sure individual updates from phones are not viewable on a global level. This system has been used in large-scale applications, such as phone keyboards. The approach addresses several practical issues, including device availability, which depends on the local data distribution in complex ways, unreliable device connectivity, interrupted execution, coordinating execution across devices with different availability, and limited device storage and computing resources.
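A minimal sketch of this iterative model averaging (FedAvg-style) loop is given below; the NumPy parameter vectors and the toy local_train routine are placeholders assumed for illustration, standing in for real local training.

```python
# FedAvg-style round: selected clients train locally, the server takes a
# data-size-weighted average of their parameters.
import numpy as np

def fedavg_round(global_params, clients, local_train):
    updates, sizes = [], []
    for data in clients:                                  # in practice, a sampled subset
        local = local_train(global_params.copy(), data)   # several local epochs
        updates.append(local)
        sizes.append(len(data))
    weights = np.array(sizes, dtype=float) / sum(sizes)
    return sum(w * u for w, u in zip(weights, updates))   # weighted average

# Toy usage: "training" just nudges parameters toward each client's data mean.
def local_train(params, data):
    return params + 0.1 * (np.mean(data) - params)

clients = [np.random.randn(n) + i for i, n in enumerate([50, 80, 120])]
theta = np.zeros(3)                                       # a 3-parameter "model"
for _ in range(10):
    theta = fedavg_round(theta, clients, local_train)
```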
In recent work, ref. [18] proposed a federated series forecasting framework by redesigning a hybrid model so that neural networks can utilize the extra information in the time series, achieving time-series-specific learning via exponential smoothing.
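The snippet below sketches only the exponential-smoothing ingredient of such hybrids: a smoothed level series normalizes the raw consumption before it is fed to a learned forecaster. The smoothing factor and the normalization scheme are assumptions, not the exact framework of ref. [18].

```python
# Simple exponential smoothing: l_t = alpha * x_t + (1 - alpha) * l_{t-1}.
import numpy as np

def exponential_smoothing(series, alpha=0.3):
    level = np.empty_like(series, dtype=float)
    level[0] = series[0]
    for t in range(1, len(series)):
        level[t] = alpha * series[t] + (1 - alpha) * level[t - 1]
    return level

x = np.abs(np.random.randn(100)) + 1.0   # synthetic consumption series
l = exponential_smoothing(x)
normalized = x / l                       # de-levelled input for a neural forecaster
```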
Regarding smart homes and energy prediction, a number of approaches have been introduced. In their position paper, ref. [15] proposed a novel architecture for smart homes, called IOTFLA, focusing on the security and privacy aspects, which combines federated learning with secure data aggregation. Ref. [17] proposed a prediction model based on the analysis of the energy usage patterns of the households. They used a clustering algorithm to group the households with similar energy consumption patterns and then trained a prediction model for each cluster. The results showed that their model can accurately predict the energy consumption of households.
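The following sketch illustrates the cluster-then-predict idea: households are grouped by their consumption profiles with k-means and a separate (here, linear) model is fit per cluster. The number of clusters, the profile features, and the per-cluster model are assumptions rather than the exact setup of ref. [17].

```python
# Cluster households by consumption profile, then fit one forecaster per cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

profiles = np.random.rand(60, 24)                 # 60 households, mean hourly profiles
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)

models = {}
for c in range(3):
    members = profiles[labels == c]
    X, y = members[:, :-1], members[:, -1]        # predict the last hour from the rest
    models[c] = LinearRegression().fit(X, y)
```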
Moreover, non-independent and identically distributed (non-IID) data samples across participating nodes slow model training and impose additional communication rounds for FL to converge. Ref. [28] proposed the federated adaptive weighting (FedAdp) algorithm, which aims to accelerate model convergence in the presence of nodes with non-IID datasets.
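One simplified way to realize such contribution-aware weighting is sketched below, where clients whose updates align better with the aggregate direction receive larger weights; the softmax-of-cosine rule is an illustrative assumption, not the exact FedAdp formulation.

```python
# Contribution-aware aggregation: weight clients by how well their update
# aligns with the average update direction (illustrative simplification).
import numpy as np

def adaptive_weights(client_updates):
    global_dir = np.mean(client_updates, axis=0)
    cosines = [np.dot(u, global_dir) /
               (np.linalg.norm(u) * np.linalg.norm(global_dir) + 1e-12)
               for u in client_updates]
    expo = np.exp(np.array(cosines))
    return expo / expo.sum()

updates = [np.random.randn(10) for _ in range(5)]
w = adaptive_weights(updates)
aggregated = sum(wi * ui for wi, ui in zip(w, updates))
```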
In [29], the authors employed privacy-preserving principal component analysis (PCA) to extract features from smart meter data. Using this approach, they trained an artificial neural network in a federated manner, incorporating three weighted averaging strategies, with the goal of relating the smart meter data to the socio-demographic attributes of consumers. Ref. [30] proposed a personalized federated learning (PFL)-based user-level load-forecasting system, in which the personalized model derived from local data performs better than the global model. To add another layer of privacy protection, the authors also employed a novel differential privacy (DP) method based on generative adversarial network (GAN) theory, which balances prediction accuracy and privacy throughout the game. Through simulation tests on real-world datasets, they demonstrated that the proposed system can meet the requirements for accuracy and privacy in practical load-forecasting scenarios.
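The feature-extraction idea of ref. [29] can be sketched as PCA followed by a small neural classifier; plain (non-private) PCA, the component count, and the synthetic labels below are assumptions, since the cited work uses a privacy-preserving PCA variant inside a federated setup.

```python
# PCA feature extraction from smart-meter readings, then a small classifier
# linking the features to a (here synthetic) socio-demographic attribute.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

readings = np.random.rand(200, 336)          # 200 consumers, two weeks of half-hourly data
labels = np.random.randint(0, 2, size=200)   # synthetic attribute (e.g., household-size bucket)

features = PCA(n_components=10).fit_transform(readings)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300).fit(features, labels)
```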
To address the long-term optimization considerations for latency, accuracy, and energy consumption in wireless federated learning, ref. [31] introduced a mixed-integer optimization problem. The objective was to minimize the cost function over a finite number of rounds while adhering to the energy budget constraints of each client in the long run. To tackle this optimization problem, the authors proposed an online algorithm called per-round energy drift plus cost (PEDPC), which consists of two main components: client selection and bandwidth allocation. The client selection is addressed using the increasing time-maximum client selection (ITMCS) algorithm, while the barrier method is employed for bandwidth allocation. This approach allows for effectively handling the optimization problem in a real-time fashion.
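A heavily simplified illustration of per-round client selection under long-term energy budgets is sketched below; the greedy rule shown is an assumption for exposition only and is not the PEDPC or ITMCS algorithm of ref. [31].

```python
# Greedy per-round client selection that respects each client's remaining
# energy budget (illustrative only; not the cited PEDPC/ITMCS method).
def select_clients(est_energy, remaining_budget, max_clients):
    # Prefer clients with the most remaining budget relative to their per-round cost.
    ranked = sorted(remaining_budget,
                    key=lambda c: remaining_budget[c] / est_energy[c],
                    reverse=True)
    chosen = [c for c in ranked if remaining_budget[c] >= est_energy[c]][:max_clients]
    for c in chosen:
        remaining_budget[c] -= est_energy[c]   # charge this round's energy
    return chosen

energy = {"c1": 2.0, "c2": 1.0, "c3": 3.0}
budget = {"c1": 10.0, "c2": 4.0, "c3": 9.0}
print(select_clients(energy, budget, max_clients=2))
```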
Advances, open problems, and future directions in federated learning have been described in recent papers [27][32][33].
While these studies have shown the promise of FL for energy-consumption prediction in smart homes, there is still room for improvement. In particular, the use of FL for prediction models that can be implemented on resource-constrained smart home devices remains a challenging problem. None of the aforementioned methods exploits the age of the datasets held by the clients.
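To make the age idea concrete, one plausible weighting scheme down-weights clients whose local data are older when averaging models; the exponential decay and its combination with data-size weights below are illustrative assumptions, not the entry's exact method.

```python
# Illustrative age-based aggregation weights: fresher local datasets count more.
import numpy as np

def age_based_weights(data_sizes, data_ages_days, decay=0.05):
    sizes = np.asarray(data_sizes, dtype=float)
    freshness = np.exp(-decay * np.asarray(data_ages_days, dtype=float))
    w = sizes * freshness
    return w / w.sum()

print(age_based_weights(data_sizes=[100, 80, 120], data_ages_days=[1, 30, 90]))
```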

References

  1. Stojkoska, B.L.R.; Trivodaliev, K.V. A review of Internet of Things for smart home: Challenges and solutions. J. Clean. Prod. 2017, 140, 1454–1464.
  2. Alaa, M.; Zaidan, A.A.; Zaidan, B.B.; Talal, M.; Kiah, M.L.M. A review of smart home applications based on Internet of Things. J. Netw. Comput. Appl. 2017, 97, 48–65.
  3. Padmanaban, S.; Nasab, M.A.; Shiri, M.E.; Javadi, H.H.S.; Nasab, M.A.; Zand, M.; Samavat, T. The Role of Internet of Things in Smart Homes. Artif. Intell.-Based Smart Power Syst. 2023, 13, 259–271.
  4. Trakadas, P.; Masip-Bruin, X.; Facca, F.M.; Spantideas, S.T.; Giannopoulos, A.E.; Kapsalis, N.C.; Martins, R.; Bosani, E.; Ramon, J.; Prats, R.G.; et al. A Reference Architecture for Cloud–Edge Meta-Operating Systems Enabling Cross-Domain, Data-Intensive, ML-Assisted Applications: Architectural Overview and Key Concepts. Sensors 2022, 22, 9003.
  5. Trakadas, P.; Sarakis, L.; Giannopoulos, A.; Spantideas, S.; Capsalis, N.; Gkonis, P.; Karkazis, P.; Rigazzi, G.; Antonopoulos, A.; Cambeiro, M.A.; et al. A cost-efficient 5G non-public network architectural approach: Key concepts and enablers, building blocks and potential use cases. Sensors 2021, 21, 5578.
  6. Tso, G.K.; Yau, K.K. Predicting electricity energy consumption: A comparison of regression analysis, decision tree and neural networks. Energy 2007, 32, 1761–1768.
  7. Zhao, H.X.; Magoulès, F. A review on the prediction of building energy consumption. Renew. Sustain. Energy Rev. 2012, 16, 3586–3592.
  8. Kalafatelis, A.; Panagos, K.; Giannopoulos, A.E.; Spantideas, S.T.; Kapsalis, N.C.; Touloupou, M.; Kapassa, E.; Katelaris, L.; Christodoulou, P.; Christodoulou, K.; et al. ISLAND: An Interlinked Semantically-Enriched Blockchain Data Framework. In Proceedings of the Economics of Grids, Clouds, Systems, and Services: 18th International Conference, GECON 2021, Virtual Event, 21–23 September 2021; Proceedings 18. Springer: Berlin/Heidelberg, Germany, 2021; pp. 207–214.
  9. Hiremath, R.B.; Shikha, S.; Ravindranath, N. Decentralized energy planning; modeling and application—A review. Renew. Sustain. Energy Rev. 2007, 11, 729–752.
  10. Priyadarshini, I.; Sahu, S.; Kumar, R.; Taniar, D. A machine-learning ensemble model for predicting energy consumption in smart homes. Internet Things 2022, 20, 100636.
  11. Karamplias, T.; Spantideas, S.T.; Giannopoulos, A.E.; Gkonis, P.; Kapsalis, N.; Trakadas, P. Towards Closed-Loop Automation in 5G Open RAN: Coupling an Open-Source Simulator with XApps. In Proceedings of the 2022 Joint European Conference on Networks and Communications & 6G Summit (EuCNC/6G Summit), Virtual, 7–10 June 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 232–237.
  12. Kaloxylos, A.; Gavras, A.; Camps, D.; Ghoraishi, M.; Hrasnica, H. AI and ML–Enablers for beyond 5G Networks. 2021. Available online: https://5g-ppp.eu/wp-content/uploads/2021/05/AI-MLforNetworks-v1-0.pdf (accessed on 5 March 2023).
  13. Giannopoulos, A.; Spantideas, S.; Kapsalis, N.; Karkazis, P.; Trakadas, P. Deep reinforcement learning for energy-efficient multi-channel transmissions in 5G cognitive hetnets: Centralized, decentralized and transfer learning based solutions. IEEE Access 2021, 9, 129358–129374.
  14. McMahan, B.; Moore, E.; Ramage, D.; Hampson, S.; y Arcas, B.A. Communication-efficient learning of deep networks from decentralized data. In Proceedings of the Artificial Intelligence and Statistics, Fort Lauderdale, FL, USA, 20–22 April 2017; PMLR: Westminster, UK, 2017; pp. 1273–1282.
  15. Aïvodji, U.M.; Gambs, S.; Martin, A. IOTFLA: A secured and privacy-preserving smart home architecture implementing federated learning. In Proceedings of the 2019 IEEE Security and Privacy Workshops (SPW), San Francisco, CA, USA, 19–23 May 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 175–180.
  16. Gao, J.; Wang, W.; Liu, Z.; Billah, M.F.R.M.; Campbell, B. Decentralized federated learning framework for the neighborhood: A case study on residential building load forecasting. In Proceedings of the 19th ACM Conference on Embedded Networked Sensor Systems, Coimbra, Portugal, 15–17 November 2021; pp. 453–459.
  17. Tun, Y.L.; Thar, K.; Thwal, C.M.; Hong, C.S. Federated learning based energy demand prediction with clustered aggregation. In Proceedings of the 2021 IEEE International Conference on Big Data and Smart Computing (BigComp), Jeju, Republic of Korea, 17–20 January 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 164–167.
  18. Li, Y. Federated Learning for Time Series Forecasting Using Hybrid Model. 2019. Available online: https://www.semanticscholar.org/paper/Federated-Learning-for-Time-Series-Forecasting-Li/620c6b7b4e04988a207662d46f321a514a56a773 (accessed on 2 February 2023).
  19. Díaz González, F. Federated Learning for Time Series Forecasting Using LSTM Networks: Exploiting Similarities through Clustering. 2019. Available online: https://www.semanticscholar.org/paper/Federated-Learning-for-Time-Series-Forecasting-LSTM-Gonz%C3%A1lez/ea4101aa3f6308141ad75a28e2dc3d829a02cf97 (accessed on 4 March 2023).
  20. Al-Ali, A.R.; Zualkernan, I.A.; Rashid, M.; Gupta, R.; AliKarar, M. A smart home energy management system using IoT and big data analytics approach. IEEE Trans. Consum. Electron. 2017, 63, 426–434.
  21. Liu, Y.; Garg, S.; Nie, J.; Zhang, Y.; Xiong, Z.; Kang, J.; Hossain, M.S. Deep anomaly detection for time-series data in industrial IoT: A communication-efficient on-device federated learning approach. IEEE Internet Things J. 2020, 8, 6348–6358.
  22. Truong, H.T.; Ta, B.P.; Le, Q.A.; Nguyen, D.M.; Le, C.T.; Nguyen, H.X.; Do, H.T.; Nguyen, H.T.; Tran, K.P. Light-weight federated learning-based anomaly detection for time-series data in industrial control systems. Comput. Ind. 2022, 140, 103692.
  23. Ji, S.; Zhu, J.; Yang, Y.; Zhang, H.; Zhang, Z.; Xia, Z.; Zhang, Z. Self-Attention-Augmented Generative Adversarial Networks for Data-Driven Modeling of Nanoscale Coating Manufacturing. Micromachines 2022, 13, 847.
  24. Dey, N.; Fong, S.; Song, W.; Cho, K. Forecasting energy consumption from smart home sensor network by deep learning. In Proceedings of the Smart Trends in Information Technology and Computer Communications: Second International Conference, SmartCom 2017, Pune, India, 18–19 August 2017; Revised Selected Papers 2. Springer: Berlin/Heidelberg, Germany, 2018; pp. 255–265.
  25. Alden, R.E.; Gong, H.; Ababei, C.; Ionel, D.M. LSTM forecasts for smart home electricity usage. In Proceedings of the 2020 9th International Conference on Renewable Energy Research and Application (ICRERA), Glasgow, UK, 27–30 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 434–438.
  26. Konečnỳ, J.; McMahan, H.B.; Yu, F.X.; Richtárik, P.; Suresh, A.T.; Bacon, D. Federated learning: Strategies for improving communication efficiency. arXiv 2016, arXiv:1610.05492.
  27. Bonawitz, K.; Eichner, H.; Grieskamp, W.; Huba, D.; Ingerman, A.; Ivanov, V.; Kiddon, C.; Konečnỳ, J.; Mazzocchi, S.; McMahan, B.; et al. Towards federated learning at scale: System design. Proc. Mach. Learn. Syst. 2019, 1, 374–388.
  28. Wu, H.; Wang, P. Fast-convergent federated learning with adaptive weighting. IEEE Trans. Cogn. Commun. Netw. 2021, 7, 1078–1088.
  29. Wang, Y.; Bennani, I.L.; Liu, X.; Sun, M.; Zhou, Y. Electricity consumer characteristics identification: A federated learning approach. IEEE Trans. Smart Grid 2021, 12, 3637–3647.
  30. Qu, X.; Guan, C.; Xie, G.; Tian, Z.; Sood, K.; Sun, C.; Cui, L. Personalized Federated Learning for Heterogeneous Residential Load Forecasting. Big Data Min. Anal. 2023.
  31. Ji, Y.; Zhong, X.; Kou, Z.; Zhang, S.; Li, H.; Yang, Y. Efficiency-Boosting Federated Learning in Wireless Networks: A Long-Term Perspective. IEEE Trans. Veh. Technol. 2023.
  32. Li, T.; Sahu, A.K.; Talwalkar, A.; Smith, V. Federated learning: Challenges, methods, and future directions. IEEE Signal Process Mag. 2020, 37, 50–60.
  33. Kairouz, P.; McMahan, H.B.; Avent, B.; Bellet, A.; Bennis, M.; Bhagoji, A.N.; Bonawitz, K.; Charles, Z.; Cormode, G.; Cummings, R.; et al. Advances and open problems in federated learning. In Foundations and Trends® in Machine Learning; Now Publishers, Inc.: Norwell, MA, USA, 2021; Volume 14, pp. 1–210.