Document Details
Clip:
Under review as a conference paper at ICLR 2020
iSPARSE: OUTPUT-INFORMED SPARSIFICATION OF NEURAL NETWORKS
Anonymous authors. Paper under double-blind review.
ABSTRACT: Deep neural networks have demonstrated unprecedented success in various knowledge management applications. However, the networks created are often very complex, with large numbers of trainable edges which require extensive computational resources. We note that many successful networks nevertheless often contain large numbers of redundant edges. Moreover, many of these edges may have negligible contributions towards the overall network performance. In this paper, we propose a novel iSparse framework, and experimentally show, that we can sparsify the network by 30-50% without impacting the network performance. iSparse leverages a novel edge significance score, E, to determine the importance
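The clipped abstract describes sparsification as zeroing low-importance edges. As a minimal sketch of that idea, the snippet below prunes the lowest-importance fraction of a weight matrix; since the clip cuts off before defining the paper's output-informed score E, plain weight magnitude is used here as a stand-in importance score.

```python
import numpy as np

def sparsify_by_magnitude(weights, sparsity=0.4):
    """Zero out the lowest-importance fraction of weights.

    Stand-in importance score: |w|. The iSparse edge significance
    score E is output-informed and is NOT reproduced here (the
    clipped abstract does not define it).
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of edges to drop
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
W_sparse = sparsify_by_magnitude(W, sparsity=0.4)
```

With continuous random weights (no magnitude ties), exactly `int(0.4 * W.size)` entries are zeroed, matching the 30-50% sparsity range the abstract reports.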
Filename:
442_isparse_output_informed_sparsi.pdf
Filetype:
application/pdf
Size:
454953 bytes
Uploaded On:
2024-01-27
Abstract:
Summary:
Tags:
Notes:
Visible:
1
Status:
Parsed
Author:
Title:
Subject:
Creator:
LaTeX with hyperref package
Producer:
pdfTeX-1.40.19
Keywords:
CreationDate:
2019-11-14T16:45:55-07:00
ModDate:
2019-11-14T16:45:55-07:00
Trapped:
False
PTEX.Fullbanner:
This is pdfTeX, Version 3.14159265-2.6-1.40.19 (TeX Live 2018)
Pages:
14