1-Lipschitz Layers Compared: Memory, Speed and Certifiable Robustness
Published in CVPR, 2024
We compare existing methods for constructing 1-Lipschitz convolutions, analysing them both theoretically and experimentally.
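For context on the quantity such methods control, here is a minimal sketch (in PyTorch; the function name and setup are illustrative, not from the paper) of estimating the Lipschitz constant of a stride-1 convolution via power iteration on the operator and its adjoint:

```python
import torch
import torch.nn.functional as F

def conv_spectral_norm(kernel, input_shape, n_iters=50):
    """Estimate the L2 Lipschitz constant (spectral norm) of a
    zero-padded, stride-1 convolution by power iteration.

    kernel: (out_c, in_c, k, k), k odd; input_shape: (in_c, H, W).
    """
    x = torch.randn(1, *input_shape)
    pad = kernel.shape[-1] // 2  # "same" padding for odd kernels
    for _ in range(n_iters):
        x = x / x.norm()
        y = F.conv2d(x, kernel, padding=pad)            # apply A
        x = F.conv_transpose2d(y, kernel, padding=pad)  # apply A^T
    # Rayleigh-quotient-style estimate: sigma ~ ||A v|| for unit v.
    return F.conv2d(x / x.norm(), kernel, padding=pad).norm().item()

# Example: a random 3x3 convolution on 3-channel 32x32 inputs.
kernel = torch.randn(16, 3, 3, 3)
print(conv_spectral_norm(kernel, (3, 32, 32)))
```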
Published in arXiv, 2023
We show a shortcoming of the activation functions currently popular in 1-Lipschitz networks, both theoretically and empirically. We then propose an activation function that provably overcomes this limitation.
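For reference, a minimal sketch of MaxMin, a widely used 1-Lipschitz activation of the kind such works study (singling it out as the relevant example is our assumption, not a claim from the paper):

```python
import torch

def maxmin(x: torch.Tensor) -> torch.Tensor:
    """MaxMin activation: split channels into two halves and return
    the elementwise (max, min) of the pairs. It is 1-Lipschitz and
    gradient-norm preserving."""
    a, b = x.chunk(2, dim=1)
    return torch.cat([torch.maximum(a, b), torch.minimum(a, b)], dim=1)
```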
Published in ECCV, 2022
We introduce a rescaling-based parameterization that guarantees that linear layers (both fully-connected and convolutional) are 1-Lipschitz.
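A minimal sketch of a rescaling of this kind for a fully-connected weight matrix, assuming the column rescaling d_i = (sum_j |W^T W|_{ij})^{-1/2}, which bounds the spectral norm of the rescaled matrix by 1; the helper name is illustrative:

```python
import torch

def lipschitz_rescale(weight: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Rescale a weight matrix of shape (out_features, in_features)
    so that the result is guaranteed 1-Lipschitz.

    Column j is scaled by d_j = (sum_i |W^T W|_{ij})^{-1/2}.
    """
    wtw = weight.T @ weight                       # Gram matrix W^T W
    scale = wtw.abs().sum(dim=1).clamp_min(eps)   # row sums of |W^T W|
    return weight * scale.rsqrt()                 # rescale columns

# Sanity check: the spectral norm of the rescaled matrix is at most 1.
W = lipschitz_rescale(torch.randn(64, 128))
print(torch.linalg.matrix_norm(W, ord=2))  # <= 1
```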