| Pub. Date | Oct 2022 |
|---|---|
| Product Name | The IUP Journal of Electrical and Electronics Engineering |
| Product Type | Article |
| Product Code | IJEEE011022 |
| Author Name | Lucas Richter, Ilja Dontsov and Tania Jacob |
| Availability | YES |
| Subject/Domain | Engineering |
| Download Format | PDF Format |
| No. of Pages | 14 |
Measured data in the context of smart cities can be used to develop new and innovative business models that improve efficiency and quality of life. Time-series classification algorithms can help automate many different processes, such as forecasting services. To ensure data security and privacy, Federated Learning trains a global model collaboratively across multiple clients. When data distributions and data quantities differ across the participating clients, neural networks suffer from slow convergence and overfitting. In this study, different data-clustering strategies, based on the data values and the number of clients, were developed and evaluated for updating the global model weights. Public time-series data was downloaded from the Internet to generate various synthetic datasets and to train a Relational-Regularized Autoencoder for classification purposes. The results showed an improvement in model performance with respect to generalization.
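As a rough illustration of the clustering idea described above, client weight vectors can be grouped by a simple per-client data statistic before aggregation. The abstract does not specify the exact strategy, so the choice of statistic, the one-dimensional k-means routine, and the two-stage (within-cluster, then across-cluster) averaging below are all assumptions for illustration only:

```python
import numpy as np

def clustered_average(weights, stats, n_clusters=2, seed=0):
    """Average client weight vectors within clusters of clients with
    similar data statistics, then average the cluster means.
    This is a hypothetical aggregation strategy, not the paper's."""
    rng = np.random.default_rng(seed)
    stats = np.asarray(stats, dtype=float)
    # Simple 1-D k-means on a per-client data statistic
    # (e.g. the mean of each client's measurements).
    centers = stats[rng.choice(len(stats), n_clusters, replace=False)]
    for _ in range(10):
        labels = np.argmin(np.abs(stats[:, None] - centers[None, :]), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = stats[labels == k].mean()
    # Average weights inside each cluster first, then across the cluster
    # means, so a cluster with many similar clients cannot dominate the
    # global update.
    cluster_means = [
        np.mean([w for w, l in zip(weights, labels) if l == k], axis=0)
        for k in range(n_clusters) if np.any(labels == k)
    ]
    return np.mean(cluster_means, axis=0)
```

With this scheme, two near-duplicate clients holding weights `[1, 1]` and one outlier client holding `[3, 3]` aggregate to `[2, 2]` rather than the plain mean `[5/3, 5/3]`, which is the intended effect of balancing differing data distributions.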
The digitalization of industrial processes, marketing, the control of energy fluxes, and healthcare is accompanied by high-frequency data generation on edge devices. This data contains a great deal of information about processes that could be optimized and automated. Because operating in the cloud raises concerns about data privacy, security, and communication latency, many companies prefer edge computing. This often means that each company or device has access to a relatively small amount of data, which leads to overfitting when Machine Learning models are applied to specific problems. Since many applications benefit from sharing their knowledge, Federated Learning (FL) helps to resolve this conflict by training locally and aggregating model weights on a central server.
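The local-training/central-aggregation loop described above can be sketched as a minimal FedAvg-style round. The linear-regression local model, the learning rate, and the weighting by data quantity are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local training: a few gradient-descent steps of
    linear regression on its private data (a stand-in for any local
    model; the raw data never leaves the client)."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg(w, clients):
    """Server step: each client trains locally from the current global
    weights, then the server averages the resulting weight vectors,
    weighted by each client's data quantity."""
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    local_weights = [local_update(w.copy(), X, y) for X, y in clients]
    return np.average(local_weights, axis=0, weights=sizes)
```

Repeating `w = fedavg(w, clients)` over several communication rounds moves the global model toward a consensus fit of all clients' data while only weight vectors, never raw measurements, travel to the server.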
Keywords: Federated Learning (FL) Strategies, Relational-Regularized Autoencoder (RAE), Time-Series Classification