Optimizing sparse ternary compression with thresholds for communication-efficient federated learning
Abstract
Federated learning (FL) enables decentralized model training while preserving client data privacy, but it suffers from significant communication overhead due to frequent parameter exchanges. This study investigates how the choice of sparse ternary compression (STC) threshold affects communication efficiency and model accuracy on the CIFAR-10 and MedMNIST datasets. Experiments covered thresholds from 1.0 to 1.9 and batch sizes of 10, 15, and 20. Thresholds between 1.2 and 1.5 reduced total communication costs by approximately 10–15% while maintaining acceptable accuracy. These findings suggest that careful threshold tuning achieves substantial communication savings with minimal loss in model performance, offering practical guidance for improving the efficiency and scalability of FL systems.
Keywords
Communication efficiency; Distributed machine learning; Federated learning; Sparse ternary compression; STC threshold
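To make the thresholded STC step concrete, the minimal Python sketch below ternarizes a simulated client update: entries whose magnitude exceeds the threshold times the update's standard deviation are kept and replaced by a single shared magnitude, and all other entries are dropped. The thresholding rule, the helper name stc_compress, and the example parameters are illustrative assumptions, not the authors' exact formulation.

    import numpy as np

    def stc_compress(update, threshold=1.2):
        """Sketch of threshold-based sparse ternary compression (assumed rule).

        Entries with |update| > threshold * std(update) are kept and quantized
        to +/- mu, where mu is the mean magnitude of the kept entries; the rest
        are set to zero.
        """
        cutoff = threshold * np.std(update)      # assumed thresholding rule
        mask = np.abs(update) > cutoff           # entries that survive sparsification
        if not mask.any():                       # guard: nothing exceeds the cutoff
            return np.zeros_like(update), 0.0, mask
        mu = np.abs(update[mask]).mean()         # shared magnitude for kept entries
        ternary = np.where(mask, np.sign(update) * mu, 0.0)
        return ternary, mu, mask

    # Example: compress a simulated client update and report the sparsity achieved.
    rng = np.random.default_rng(0)
    update = rng.normal(size=10_000).astype(np.float32)
    compressed, mu, mask = stc_compress(update, threshold=1.5)
    print(f"kept {mask.mean():.1%} of entries, shared magnitude mu = {mu:.4f}")

Under this assumed rule, raising the threshold keeps fewer entries per round (lower communication cost) at the risk of discarding informative gradient components, which is the trade-off the reported 1.2–1.5 range balances.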
DOI: http://doi.org/10.11591/ijai.v14.i6.pp4902-4912
Copyright (c) 2025 Nithyaniranjana Murthy Chittaiah, Manjula Sunkadakatte Haladappa

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN: 2089-4872, e-ISSN: 2252-8938
This journal is published by the Institute of Advanced Engineering and Science (IAES).