Document Details


2402.04291.pdf
Clip: BiLLM: Pushing the Limit of Post-Training Quantization for LLMs. Wei Huang, Yangdong Liu, Haotong Qin, Ying Li, Shiming Zhang, Xianglong Liu, Michele Magno
Filename: 2402.04291.pdf
Filetype: application/pdf
Size: 7299460 bytes
Uploaded On: 2024-02-25
Abstract:
Summary:
Tags:
Notes:
Visible: 1
Status: Parsed
Author: Wei Huang, Yangdong Liu, Haotong Qin, Ying Li, Shiming Zhang, Xianglong Liu, Michele Magno, Xiaojuan Qi
CreationDate: 2024-02-08T01:47:39+00:00
Creator: LaTeX with hyperref
Keywords: Model Binarization, Large Language Model, Model Compression, Deep Learning
ModDate: 2024-02-08T01:47:39+00:00
PTEX.Fullbanner: This is pdfTeX, Version 3.141592653-2.6-1.40.25 (TeX Live 2023) kpathsea version 6.3.5
Producer: pdfTeX-1.40.25
Subject: Proceedings of the International Conference on Machine Learning 2024
Title: BiLLM: Pushing the Limit of Post-Training Quantization for LLMs
Trapped: False
Pages: 19