REPAIR: REnormalizing Permuted Activations for Interpolation Repair
Abstract
In this paper we look into the conjecture of Entezari et al. (2021), which states that if the permutation invariance of neural networks is taken into account, then there is likely no loss barrier to the linear interpolation between SGD solutions. First, we observe that neuron alignment methods alone are insufficient to establish low-barrier linear connectivity between SGD solutions due to a phenomenon we call variance collapse: interpolated deep networks suffer a collapse in the variance of their activations, causing poor performance. Next, we propose REPAIR (REnormalizing Permuted Activations for Interpolation Repair), which mitigates variance collapse by rescaling the preactivations of such interpolated networks. We explore the interaction between our method and the choice of normalization layer, network width, and depth, and demonstrate that using REPAIR on top of neuron alignment methods leads to a 60%-100% relative barrier reduction across a wide variety of architecture families and tasks. In particular, we report a 74% barrier reduction for ResNet50 on ImageNet and a 90% barrier reduction for ResNet18 on CIFAR10.
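Read literally, the abstract suggests a simple recipe: interpolate the (permutation-aligned) weights, measure per-channel preactivation statistics of both endpoint networks and of the interpolated network on a few calibration batches, then rescale each channel of the interpolated network so its statistics match the interpolation of the endpoint statistics. The PyTorch sketch below illustrates this idea for a single layer; it is a hedged reconstruction from the abstract, not the authors' released code, and the names (`interpolate_state_dicts`, `channel_stats`, `repair_layer`) as well as the forward-hook rescaling are illustrative assumptions.

```python
import torch


def interpolate_state_dicts(model_a, model_b, alpha):
    """theta_alpha = (1 - alpha) * theta_A + alpha * theta_B.
    Assumes model_b's neurons were already permuted to align with model_a's."""
    sd_a, sd_b = model_a.state_dict(), model_b.state_dict()
    return {k: (1 - alpha) * sd_a[k] + alpha * sd_b[k] for k in sd_a}


@torch.no_grad()
def channel_stats(model, layer, loader, device="cpu"):
    """Per-channel mean/std of `layer`'s outputs over a calibration loader."""
    feats = []
    hook = layer.register_forward_hook(
        lambda m, inp, out: feats.append(out.transpose(0, 1).flatten(1)))
    for x, _ in loader:
        model(x.to(device))
    hook.remove()
    feats = torch.cat(feats, dim=1)            # shape: (channels, samples)
    return feats.mean(dim=1), feats.std(dim=1)


@torch.no_grad()
def repair_layer(model_a, model_b, model_interp, layer_name, alpha, loader):
    """Rescale one layer of the interpolated network so each channel's
    statistics match the interpolation of the endpoint networks' statistics,
    counteracting the variance collapse described in the abstract."""
    layer_of = lambda m: dict(m.named_modules())[layer_name]
    mu_a, sd_a = channel_stats(model_a, layer_of(model_a), loader)
    mu_b, sd_b = channel_stats(model_b, layer_of(model_b), loader)
    mu_c, sd_c = channel_stats(model_interp, layer_of(model_interp), loader)
    goal_mu = (1 - alpha) * mu_a + alpha * mu_b
    goal_sd = (1 - alpha) * sd_a + alpha * sd_b

    def rescale(m, inp, out):
        # Broadcast per-channel statistics over any trailing spatial dims.
        shape = [1, -1] + [1] * (out.dim() - 2)
        return ((out - mu_c.view(shape)) / (sd_c.view(shape) + 1e-5)
                * goal_sd.view(shape) + goal_mu.view(shape))

    layer_of(model_interp).register_forward_hook(rescale)
```

For networks with BatchNorm, a natural variant of the same correction is to reset the BN running statistics of the interpolated network and re-estimate them on calibration data, which is consistent with the abstract's remark that the method interacts with the choice of normalization layer.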
Original language | English |
---|---|
Title | The Eleventh International Conference on Learning Representations |
Publication status | Published - 20 Jan 2023 |
Event | 11th International Conference on Learning Representations: ICLR 2023 - Hybrid event, Kigali, Hybrid / Virtual, Rwanda. Duration: 1 May 2023 → 5 May 2023 |
Conference
Conference | 11th International Conference on Learning Representations |
---|---|
Short title | ICLR 2023 |
Country/Territory | Rwanda |
City | Kigali, Hybrid / Virtual |
Period | 1/05/23 → 5/05/23 |
Activities
- 1 Guest lecture
- Navigating the Depths of Adaptive Embedded Intelligence: Ensembling, Reconfiguring and Editing Models
  Olga Saukh (Speaker)
  14 Dec 2023 · Activity: Talk or presentation › Guest lecture › Science to science