Document Details
Clip:
Published as a conference paper at ICLR 2020. Reformer: The Efficient Transformer. Nikita Kitaev (U.C. Berkeley & Google Research, kitaev@cs.berkeley.edu), Łukasz Kaiser (Google Research), Anselm Levskaya (Google Research), {lukaszkaiser,levskaya}@google.com. Abstract: Large Transformer models routinely achieve state-of-the-art results on a number of tasks, but training these models can be prohibitively costly, especially on long…
Filename:
2001.04451v2.pdf
Filetype:
application/pdf
Size:
658515 bytes
Uploaded On:
2025-10-24
Visible:
1
Status:
Parsed
CreationDate:
2020-02-19T01:32:39+00:00
Creator:
LaTeX with hyperref package
ModDate:
2020-02-19T01:32:39+00:00
PTEX.Fullbanner:
This is pdfTeX, Version 3.14159265-2.6-1.40.17 (TeX Live 2016) kpathsea version 6.2.2
Producer:
pdfTeX-1.40.17
Trapped:
False
Pages:
12