ECCC Report TR17-177
https://eccc.weizmann.ac.il/report/2017/177
Comments and Revisions published for TR17-177 (en-us)
Mon, 23 Apr 2018 17:25:34 +0300
Revision 1
On Communication Complexity of Classification Problems
Shay Moran,
Daniel Kane,
Roi Livni,
Amir Yehudayoff
https://eccc.weizmann.ac.il/report/2017/177#revision1

This work introduces a model of distributed learning in the spirit of Yao's communication complexity model. We consider a two-party setting, where each of the players gets a list of labelled examples and they communicate in order to jointly perform some learning task. To naturally fit into the framework of learning theory, we allow the players to send each other labelled examples, where each example costs one unit of communication. This model can also be thought of as a distributed version of sample compression schemes.
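To make the communication measure concrete, here is a toy sketch (not taken from the paper) of a protocol in this model for the simple class of one-dimensional thresholds, under the assumption that points at or below the threshold are labelled +1 and points above it are labelled -1. The function names and protocol are illustrative only; the model's cost is the number of labelled examples exchanged.

```python
def learn_threshold(alice, bob):
    """Toy two-party protocol for 1-D thresholds (realizable case).

    alice, bob: lists of (x, label) pairs with label in {+1, -1},
    assumed consistent with some threshold t (x <= t gets +1).
    Returns (threshold, cost), where cost counts labelled examples sent.
    """
    def extremes(sample):
        # Each party compresses its data to at most two labelled examples:
        # its rightmost positive point and its leftmost negative point.
        pos = [x for x, y in sample if y == +1]
        neg = [x for x, y in sample if y == -1]
        msgs = []
        if pos:
            msgs.append((max(pos), +1))
        if neg:
            msgs.append((min(neg), -1))
        return msgs

    # Every message is one labelled example, i.e. one unit of communication.
    sent = extremes(alice) + extremes(bob)
    cost = len(sent)

    # Both parties now know the combined extremes and agree on a threshold.
    pos = [x for x, y in sent if y == +1]
    neg = [x for x, y in sent if y == -1]
    lo = max(pos) if pos else float("-inf")   # rightmost positive seen
    hi = min(neg) if neg else float("inf")    # leftmost negative seen
    return (lo + hi) / 2, cost                # any point in (lo, hi) works
```

The point of the sketch is the accounting: regardless of how many examples each party holds, this protocol exchanges at most four labelled examples, which is the kind of bound the model is designed to measure.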
We study several fundamental questions in this model. For example, we define the analogues of the complexity classes P, NP and coNP, and show that in this model P equals the intersection of NP and coNP. The proof does not seem to follow from the analogous statement in classical communication complexity; in particular, our proof uses different techniques, including boosting and metric properties of VC classes.
This framework allows us to prove, in the context of distributed learning, unconditional separations between various learning settings, such as realizable versus agnostic learning, and proper versus improper learning. The proofs here are based on standard ideas from communication complexity as well as learning theory and geometric constructions in Euclidean space. As a corollary, we also obtain lower bounds that match the performance of algorithms from previous works on distributed classification.
Mon, 23 Apr 2018 17:25:34 +0300
Paper TR17-177
On Communication Complexity of Classification Problems
Shay Moran,
Daniel Kane,
Roi Livni,
Amir Yehudayoff
https://eccc.weizmann.ac.il/report/2017/177
Thu, 16 Nov 2017 04:44:10 +0200