Document Details
Clip:
Published as a conference paper at ICLR 2019

THE LOTTERY TICKET HYPOTHESIS: FINDING SPARSE, TRAINABLE NEURAL NETWORKS

Jonathan Frankle, MIT CSAIL (jfrankle@csail.mit.edu)
Michael Carbin, MIT CSAIL (mcarbin@csail.mit.edu)

ABSTRACT: Neural network pruning techniques can reduce the parameter counts of trained networks by over 90%, decreasing storage requirements and improving computational performance of inference without compromising accuracy. However, contemporary experience is that the sparse architectures produced by pruning are difficult to train from the start, which would similarly improve training performance.
Filename:
1803.03635v5.pdf
Filetype:
application/pdf
Size:
4001475 bytes
Uploaded On:
2025-10-24
Abstract:
Summary:
Tags:
Notes:
Visible:
1
Status:
Parsed
Author:
CreationDate:
2019-03-05T01:37:10+00:00
Creator:
LaTeX with hyperref package
Keywords:
ModDate:
2019-03-05T01:37:10+00:00
PTEX.Fullbanner:
This is pdfTeX, Version 3.14159265-2.6-1.40.17 (TeX Live 2016) kpathsea version 6.2.2
Producer:
pdfTeX-1.40.17
Subject:
Title:
Trapped:
False
Pages:
42