Subspace-Configurable Networks

Olga Saukh, Dong Wang, Xiaoxi He, Lothar Thiele

Publication: Working paper / Preprint

Abstract

While the deployment of deep learning models on edge devices is increasing, these models often lack robustness when faced with dynamic changes in sensed data. This can be attributed to sensor drift, or variations in the data compared to what was used during offline training due to factors such as specific sensor placement or naturally changing sensing conditions. Hence, achieving the desired robustness necessitates the utilization of either an invariant architecture or specialized training approaches, like data augmentation. Alternatively, input transformations can be treated as a domain shift problem, and solved by post-deployment model adaptation. In this paper, we train a parameterized subspace of configurable networks, where an optimal network for a particular parameter setting is part of this subspace. The obtained subspace is low-dimensional and has a surprisingly simple structure even for complex, non-invertible transformations of the input, leading to an exceptionally high efficiency of subspace-configurable networks (SCNs) when limited storage and computing resources are at stake. We evaluate SCNs on a wide range of standard datasets, architectures, and transformations, and demonstrate their power on resource-constrained IoT devices, where they can take up to 2.4 times less RAM and be 7.6 times faster at inference time than a model that achieves the same test set accuracy, yet is trained with data augmentations to cover the desired range of input transformations.
Original language: English
Publisher: arXiv
Number of pages: 36
Publication status: Published - 21 Nov 2023
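The core idea from the abstract can be illustrated with a minimal sketch (not the authors' code; all sizes and names are hypothetical): the weights of the deployed model are a linear combination of D learned base weight sets, and a small configuration function maps the transformation parameter (here, a rotation angle) to the D mixing coefficients. At inference time, only the coefficients change with the sensing conditions.

```python
import numpy as np

# Illustrative sketch of a subspace-configurable network (SCN).
# Assumptions (not from the paper): a single linear+ReLU layer as the
# inference model, D = 3 subspace dimensions, and a fixed random linear
# map over [cos(theta), sin(theta)] standing in for a trained
# configuration network.

rng = np.random.default_rng(0)

D = 3                       # subspace dimensionality (low, per the abstract)
in_dim, out_dim = 8, 4

# D base weight matrices and biases spanning the weight subspace.
base_W = rng.standard_normal((D, out_dim, in_dim))
base_b = rng.standard_normal((D, out_dim))

# Stand-in for the trained configuration network's parameters.
cfg_W = rng.standard_normal((D, 2))

def config_net(theta):
    """Map the transformation parameter theta to D subspace coefficients."""
    feats = np.array([np.cos(theta), np.sin(theta)])
    logits = cfg_W @ feats
    e = np.exp(logits - logits.max())
    return e / e.sum()      # softmax -> convex combination of base weights

def scn_forward(x, theta):
    """Configure the weights for theta, then run ordinary inference."""
    alpha = config_net(theta)                 # (D,)
    W = np.tensordot(alpha, base_W, axes=1)   # (out_dim, in_dim)
    b = alpha @ base_b                        # (out_dim,)
    return np.maximum(W @ x + b, 0.0)         # ReLU layer

x = rng.standard_normal(in_dim)
y0 = scn_forward(x, 0.0)           # model configured for angle 0
y1 = scn_forward(x, np.pi / 2)     # same model reconfigured for pi/2
```

Because only the D coefficients depend on the input transformation, reconfiguring the network costs one small forward pass plus a weighted sum of base weights, which is what makes SCNs attractive on storage- and compute-limited IoT devices.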

Fields of Expertise

  • Information, Communication & Computing

