LM Explained: Readings of the GPT-3, GPT-2, and GPT-1 Papers
T5: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, 2019
arXiv: https://arxiv.org/abs/1910.10683  Chinese translation: https://zhuanlan.zhihu.com/p/89719631  Discussion: How would you evaluate Goo…
Papers:
GPT: Improving Language Understanding by Generative Pre-Training
GPT-2: Language Models are Unsupervised Multitask Learners
GPT-3: Language Models are Few-Shot Learners
Reference: in-depth readings of the GPT, GPT-2, and GPT-3 papers…