BERT: Pre-training of deep bidirectional transformers

paper
advanced
Landmark Research
90 min

About This Resource

One of the first publicly released large pre-trained language models; many of its variants remain in active use today.

Author: Devlin et al.
Source: Google
