

DEED: A general quantization scheme for saving bits in communication

Submit Time: 2020-06-16
Authors: Tian Ye (1); Peijun Xiao (2); Ruoyu Sun (2)
Institutes: 1. Tsinghua University; 2. University of Illinois Urbana-Champaign


Quantization is a popular technique to reduce communication in distributed optimization. Motivated by the classical work on inexact gradient descent (GD) \cite{bertsekas2000gradient}, we provide a general convergence analysis framework for inexact GD that is tailored to quantization schemes. We also propose a quantization scheme, Double Encoding and Error Diminishing (DEED). DEED achieves small communication complexity in three settings: frequent-communication large-memory, frequent-communication small-memory, and infrequent-communication (e.g., federated learning). More specifically, in the frequent-communication large-memory setting, DEED can be easily combined with Nesterov's method, so that the total number of bits required is $\tilde{O}( \sqrt{\kappa} \log 1/\epsilon )$, where $\tilde{O}$ hides numerical constants and $\log \kappa$ factors. In the frequent-communication small-memory setting, DEED combined with SGD requires only $\tilde{O}( \kappa \log 1/\epsilon )$ bits in the interpolation regime. In the infrequent-communication setting, DEED combined with Federated Averaging requires a smaller total number of bits than Federated Averaging alone. All these algorithms converge at the same rate as their non-quantized versions, while using fewer bits.
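DEED itself is specified in the paper; as a flavor of how gradient quantization saves communication bits, the following is a minimal sketch of standard unbiased stochastic quantization (a generic baseline, not the DEED scheme). Each 64-bit gradient coordinate is encoded as a small integer code plus one shared float scale, and stochastic rounding keeps the reconstruction unbiased:

```python
import numpy as np

def quantize(v, bits=4, rng=None):
    """Stochastically quantize vector v to 2**bits - 1 levels per coordinate.

    Returns (codes, step): integer codes plus a float step size, so the
    receiver reconstructs codes * step. Stochastic rounding makes the
    reconstruction unbiased: E[dequantize(quantize(v))] = v.
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = np.abs(v).max()
    if scale == 0.0:
        return np.zeros(v.shape, dtype=np.int32), 0.0
    L = 2 ** bits - 1                 # number of quantization intervals
    x = v / scale * L                 # map coordinates into [-L, L]
    low = np.floor(x)
    # round up with probability equal to the fractional part -> unbiased
    codes = (low + (rng.random(v.shape) < (x - low))).astype(np.int32)
    return codes, scale / L

def dequantize(codes, step):
    return codes.astype(np.float64) * step

g = np.array([0.5, -1.2, 3.0, 0.0])
codes, step = quantize(g, bits=4)
g_hat = dequantize(codes, step)
# Each coordinate now costs ~5 bits (4 magnitude bits + sign) plus one
# shared scale, instead of 64 bits; reconstruction error is at most `step`.
```

The per-coordinate error is bounded by one quantization step, and the error-feedback/encoding refinements analyzed in the paper are what drive this cost down further while preserving the unquantized convergence rate.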
From: 叶添 (Tian Ye)
Recommended reference: Tian Ye, Peijun Xiao, Ruoyu Sun. (2020). DEED: A general quantization scheme for saving bits in communication. [ChinaXiv:202005.00043]
Version History
[V2] 2020-06-16 03:38:15 chinaXiv:202005.00043V2
[V1] 2020-05-13 13:14:30 chinaXiv:202005.00043v1