Zero-Shot Quantization via Weight-Space Arithmetic

Daniele Solombrino

Antonio Andrea Gargiulo

Adrian Robert Minut

Luca Zhou

Alessandro Zirilli

Emanuele Rodolà

BibTeX Citation
@misc{solombrino2026zeroshotquantizationweightspacearithmetic,
      title={Zero-Shot Quantization via Weight-Space Arithmetic},
      author={Daniele Solombrino and Antonio Andrea Gargiulo and Adrian Robert Minut and Luca Zhou and Alessandro Zirilli and Emanuele Rodolà},
      year={2026},
      eprint={2604.03420},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2604.03420},
}

This paper shows that robustness to post-training quantization (PTQ) is a transferable direction in weight space. The key idea is the quantization vector: extracted from a donor task by simple weight-space arithmetic, it can patch a receiver model and improve robustness to PTQ-induced noise by as much as 60%, without receiver-side quantization-aware training (QAT).
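The weight-space arithmetic can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the `alpha` scaling coefficient, the toy parameter values, and the assumption that the quantization vector is the elementwise difference between a donor's quantization-robust weights and its base weights are all hypothetical details chosen here for clarity.

```python
import numpy as np

def quantization_vector(donor_robust, donor_base):
    """tau = theta_robust - theta_base, computed per parameter tensor."""
    return {name: donor_robust[name] - donor_base[name] for name in donor_base}

def patch_receiver(receiver, tau, alpha=1.0):
    """Add the donor's quantization vector to a receiver checkpoint,
    without any receiver-side training data or QAT."""
    return {name: w + alpha * tau[name] for name, w in receiver.items()}

# Toy checkpoints (hypothetical values, for illustration only).
donor_base   = {"attn.w": np.array([1.0, -2.0, 0.5])}
donor_robust = {"attn.w": np.array([0.9, -1.7, 0.4])}   # e.g. donor after QAT
receiver     = {"attn.w": np.array([2.0,  1.0, -0.5])}

tau = quantization_vector(donor_robust, donor_base)
patched = patch_receiver(receiver, tau, alpha=1.0)
```

In real checkpoints the same arithmetic would run over every tensor in a model's state dict; the zero-shot aspect is that only the donor pair is ever trained.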

Because the method requires no receiver training data, it provides a zero-shot, low-cost alternative to QAT for extremely low-bit deployment. The results on Vision Transformer (ViT) models suggest that quantization robustness is not merely a byproduct of task-specific training, but a reusable feature of weight-space geometry that can be transferred rather than retrained.
