MACEDO, D. E. C. R.; http://lattes.cnpq.br/4068176664073062; MACEDO, Daniel Enos Cavalcanti Rodrigues de.
Abstract:
Federated Learning is a technique for training Machine Learning models that preserves the privacy of the data held by participating devices. Privacy is ensured by delegating model training to the devices themselves, each of which uses only its own local dataset to train a local model. The technique has attracted attention from academia and industry in contexts where applications face increasingly stringent security and privacy requirements.
However, once training is delegated to the devices, its efficiency becomes subject to the characteristics of those devices and of the context in which they operate. Analyzing these characteristics and implementing techniques that account for them is therefore essential to keep training efficient.
Among these characteristics, device mobility and system non-stationarity stand out. Device mobility can prevent training rounds from completing, since moving devices may lose their connection to the network. In turn, the system's non-stationarity is directly related to drift in the data distribution and the consequent degradation of the model's learning performance.
MoSimFeL, a Federated Learning coordination algorithm, is introduced as a solution to ensure training efficiency in scenarios with device mobility and non-stationary systems.
To validate its effectiveness, a simulator called FLSimulator was developed. This simulator evaluates Federated Learning applications in the described context and is used to experimentally analyze the behavior of MoSimFeL in an image classification application. The results of these simulations demonstrate that MoSimFeL can carry out federated training even in scenarios with intense user mobility and non-stationary conditions, something that traditional algorithms struggle to achieve.
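
To illustrate the local-training-and-aggregation mechanism summarized at the start of the abstract, the sketch below shows a generic FedAvg-style round in Python: each device trains on its own private data and the server aggregates only model parameters. This is a minimal sketch under assumed names (local_train, federated_round, a toy linear model); it does not reproduce MoSimFeL, FLSimulator, or any specific design from this work.

import numpy as np

def local_train(global_weights, local_data, epochs=1, lr=0.1):
    # Hypothetical local update: the device refines the global model using
    # only its own data (a toy gradient step on a linear least-squares model).
    w = global_weights.copy()
    X, y = local_data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of the least-squares loss
        w = w - lr * grad
    return w

def federated_round(global_weights, devices):
    # One FedAvg-style round: every device trains locally and the server
    # averages the returned weights; raw data never leaves the devices.
    updates = [local_train(global_weights, data) for data in devices]
    sizes = np.array([len(data[1]) for data in devices], dtype=float)
    return np.average(np.stack(updates), axis=0, weights=sizes)

# Toy usage: three devices, each holding private linear-regression data.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    devices.append((X, y))

w = np.zeros(2)
for _ in range(50):
    w = federated_round(w, devices)
print("aggregated weights:", w)  # moves toward true_w without sharing raw data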