Erratum to “LightVeriFL: A Lightweight and Verifiable Secure Aggregation for Federated Learning”
This article addresses an error in [1]: in Equation (2), x was not set in boldface. The corrected equation is given below.
The problem of coding for the uplink and downlink of cloud radio access networks (C-RANs) with K users and L relays is considered. It is shown that low-complexity coding schemes that achieve any point in the rate-fronthaul region of joint coding and compression can be constructed starting from at most $4(K+L)-2$ point-to-point codes designed for symmetric channels. This reduces the seemingly hard task of constructing good codes for C-RANs to the much better understood task of finding good codes for single-user channels.
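For a concrete sense of scale (an illustrative instance, not taken from the abstract): with $K=2$ users and $L=2$ relays, the construction requires at most $4(2+2)-2 = 14$ point-to-point codes.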
In this paper, the problem of zero-error network function computation is considered, where, in a directed acyclic network, a single sink node is required to compute with zero error a function of the source messages separately generated by multiple source nodes. From the information-theoretic point of view, we are interested in the fundamental computing capacity, which is defined as the average number of times the function can be computed with zero error per use of the network.
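For context, a standard formalization of this quantity (under the usual block-computation convention, which the abstract does not spell out) is $\mathcal{C}(\mathcal{N},f) = \sup\{k/n : f \text{ can be computed } k \text{ times with zero error over } n \text{ uses of the network}\}$.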
The Shannon lower bound has been the subject of several important contributions by Berger. This paper surveys Shannon bounds on rate-distortion problems under mean-squared error distortion with a particular emphasis on Berger’s techniques. Moreover, as a new result, the Gray-Wyner network is added to the canon of settings for which such bounds are known. In the Shannon bounding technique, elegant lower bounds are expressed in terms of the source entropy power.
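For reference (stated here in its standard single-source form, for illustration), the Shannon lower bound under mean-squared error distortion $D$, written via the entropy power $N(X) = \frac{1}{2\pi e} e^{2h(X)}$ of a source $X$ with differential entropy $h(X)$, reads $R(D) \ge h(X) - \frac{1}{2}\log(2\pi e D) = \frac{1}{2}\log\frac{N(X)}{D}$; the surveyed bounds extend this idea to multiterminal settings such as the Gray-Wyner network.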
Proactive testing and interventions are crucial for disease containment during a pandemic until widespread vaccination is achieved. However, a key challenge remains: Can we accurately identify all new daily infections using only a fraction of the tests needed to test everyone, every day?
Uniform continuity bounds on entropies are generally expressed in terms of a single distance measure between probability distributions or quantum states, typically the total variation or trace distance. However, if an additional distance measure is known, the continuity bounds can be significantly strengthened. Here, we prove a tight uniform continuity bound for the Shannon entropy in terms of both the local and total variation distances, sharpening an inequality in I. Sason, IEEE Trans. Inf. Theory, 59, 7118 (2013).
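As background (stated for illustration; the strengthened two-distance bound is the paper's contribution), a classical total-variation-only inequality of this type reads $|H(P) - H(Q)| \le \varepsilon \log(m-1) + h_b(\varepsilon)$ for distributions $P, Q$ on an alphabet of size $m$ with total variation distance $\varepsilon \le 1 - 1/m$, where $h_b$ denotes the binary entropy function.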
The problem of statistical inference in its various forms has been the subject of decades of extensive research. Most of the effort has been focused on characterizing the behavior as a function of the number of available samples, with far less attention given to the effect of memory limitations on performance. Recently, this latter topic has drawn much interest in the engineering and computer science literature.
Prospective authors are requested to submit new, unpublished manuscripts for inclusion in the upcoming event described in this call for papers.
It has recently been shown that for compressive sensing, significantly fewer measurements may be required if the sparsity assumption is replaced by the assumption that the unknown vector lies near the range of a suitably chosen generative model.
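As an illustrative formulation of this setup (the standard generative-model recovery problem, with notation not taken from the abstract): given measurements $y = Ax + \eta$ and a generative model $G:\mathbb{R}^k \to \mathbb{R}^n$, the unknown vector is estimated as $\hat{x} = G(\hat{z})$ with $\hat{z} = \arg\min_z \|A\,G(z) - y\|_2$.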