NeuronQuant: Accurate and Efficient Post-Training Quantization for Spiking Neural Networks

Paper Link

Haomin Li*, Fangxin Liu*, Zewen Sun, Zongwu Wang, Shiyuan Huang, Ning Yang, Li Jiang

This is the official implementation of the paper "NeuronQuant: Accurate and Efficient Post-Training Quantization for Spiking Neural Networks" (ASP-DAC 2025).

Introduction

Figure: Overview of the NeuronQuant framework.

Prepare Models and Run Quantization

./run.sh
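
The full quantization pipeline is launched through run.sh. As a rough illustration of what a post-training quantization step involves (this is a minimal sketch, not the repository's actual code; the function name quantize_weights_ptq, the bit-width, and the per-tensor scaling scheme are assumptions made for this example), symmetric uniform PTQ of one layer's weights might look like:

```python
# Minimal sketch of symmetric uniform post-training weight quantization.
# Names and the quantization scheme are illustrative assumptions; the actual
# NeuronQuant pipeline is driven by ./run.sh in this repository.
import torch

def quantize_weights_ptq(weight: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    """Quantize a weight tensor to signed integers, then dequantize ("fake quantization")."""
    qmax = 2 ** (num_bits - 1) - 1                       # e.g. 127 for 8-bit signed
    scale = weight.abs().max().clamp(min=1e-8) / qmax    # per-tensor scale factor
    q = torch.clamp(torch.round(weight / scale), -qmax, qmax)
    return q * scale                                      # back to floating point

# Example: apply 4-bit fake quantization to a fully connected layer's weights.
layer = torch.nn.Linear(128, 10)
with torch.no_grad():
    layer.weight.copy_(quantize_weights_ptq(layer.weight, num_bits=4))
```
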

Citation

@inproceedings{li2025neuronquant,
  title={NeuronQuant: Accurate and Efficient Post-Training Quantization for Spiking Neural Networks},
  author={Li, Haomin and Liu, Fangxin and Sun, Zewen and Wang, Zongwu and Huang, Shiyuan and Yang, Ning and Jiang, Li},
  booktitle={2025 30th Asia and South Pacific Design Automation Conference (ASP-DAC)},
  year={2025}
}

Acknowledgement

Our code is based on the implementation of ANN2SNN_SRP.

Contact Us

If you have any questions, please contact:
