Full paper title and link: "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
Project repository: google-research/bert
BERT stands for: Bidirectional Encoder Representations from Transformers