
Communication Efficient Federated Learning for Wireless Networks

Mingzhe Chen, Shuguang Cui

Paperback

SEK 2,469


Estimated delivery time: 10-16 working days


This book provides a comprehensive study of Federated Learning (FL) over wireless networks. It consists of three main parts: (a) fundamentals and preliminaries of FL, (b) analysis and optimization of FL over wireless networks, and (c) applications of wireless FL for Internet-of-Things systems. In the first part, the authors provide a detailed overview of the widely studied FL framework. In the second part, the authors comprehensively discuss three key wireless techniques, including wireless resource management, quantization, and over-the-air computation, that support the deployment of FL over realistic wireless networks. This part also presents several solutions based on optimization theory, graph theory, and machine learning to optimize the performance of FL over wireless networks. In the third part, the authors introduce the use of wireless FL algorithms for autonomous vehicle control and mobile edge computing optimization.

Machine learning and data-driven approaches have recently received considerable attention as key enablers for next-generation intelligent networks. Currently, most existing learning solutions for wireless networks rely on centralizing the training and inference processes by uploading data generated at edge devices to data centers. However, such a centralized paradigm may lead to privacy leakage, violate the latency constraints of mobile applications, or be infeasible due to the limited bandwidth or power of edge devices. To address these issues, distributing machine learning at the network edge provides a promising solution, in which edge devices collaboratively train a shared model using mobile data generated in real time. Avoiding data uploads to a central server not only helps preserve privacy but also reduces network traffic congestion and communication cost. Federated learning (FL) is one of the most important distributed learning algorithms: it enables devices to train a shared machine learning model while keeping their data local. However, FL training requires communication between wireless devices and edge servers over wireless links, so wireless impairments such as noise, interference, and uncertainty in wireless channel states significantly affect the training process and performance of FL. For example, transmission delay can significantly impact the convergence time of FL algorithms. Consequently, wireless network performance must be optimized to support the implementation of FL algorithms.

This book targets researchers and advanced-level students in computer science and electrical engineering. Professionals working in signal processing and machine learning will also find this book useful as a reference.
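To make the federated learning paradigm described in the blurb concrete, the following minimal Python sketch shows FedAvg-style training: each device runs a local update on its private data and the server aggregates only model weights, never raw data. This is an illustrative example, not code from the book; the least-squares model, learning rate, client sizes, and function names are assumptions chosen for brevity.

```python
import numpy as np

def local_update(global_weights, local_data, lr=0.1, epochs=1):
    """One device's local training: gradient descent on its private least-squares data."""
    w = global_weights.copy()
    X, y = local_data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of the local mean squared error
        w -= lr * grad
    return w

def federated_averaging(global_weights, clients, rounds=10):
    """Server loop: broadcast the model, collect local models, average them by dataset size."""
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    for _ in range(rounds):
        local_models = [local_update(global_weights, data) for data in clients]
        # Only model weights are exchanged; raw data never leaves the devices.
        global_weights = sum(s * w for s, w in zip(sizes, local_models)) / sizes.sum()
    return global_weights

# Toy usage: three "devices" holding private data drawn from the same underlying model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (30, 50, 20):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    clients.append((X, y))

w = federated_averaging(np.zeros(2), clients, rounds=50)
print(w)  # converges close to [2.0, -1.0]
```

In a wireless deployment, the model exchange in each round traverses noisy, interference-limited links, which is precisely the communication bottleneck the book's analysis and optimization techniques address.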
  • Authors: Mingzhe Chen, Shuguang Cui
  • Format: Paperback
  • ISBN: 9783031512681
  • Language: English
  • Number of pages: 179
  • Publication date: 2025-03-11
  • Publisher: Springer International Publishing AG