1. Introduction


How?

2. Background (BERT)


2.1 Setup

Figure: BERT input representation (from "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"): https://i.imgur.com/RAmRtu5.png
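As described in the BERT paper, the input representation of each token is the element-wise sum of three learned embeddings: a token embedding, a segment embedding (sentence A vs. sentence B), and a position embedding. A minimal NumPy sketch of that sum, with toy dimensions and hypothetical token ids chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, max_len, num_segments, hidden = 100, 16, 2, 8

# randomly initialized stand-ins for the learned embedding tables
token_emb = rng.standard_normal((vocab_size, hidden))
segment_emb = rng.standard_normal((num_segments, hidden))
position_emb = rng.standard_normal((max_len, hidden))

# toy sequence: [CLS] sentence-A tokens [SEP] sentence-B tokens [SEP]
token_ids = np.array([1, 5, 6, 2, 7, 8, 2])    # hypothetical vocabulary ids
segment_ids = np.array([0, 0, 0, 0, 1, 1, 1])  # 0 = sentence A, 1 = sentence B
positions = np.arange(len(token_ids))          # absolute positions 0..6

# BERT input representation: sum of the three embeddings per token
input_repr = token_emb[token_ids] + segment_emb[segment_ids] + position_emb[positions]
print(input_repr.shape)  # (7, 8): one hidden-size vector per input token
```

The resulting matrix (sequence length x hidden size) is what the first Transformer layer consumes; in the actual model the three tables are trained jointly rather than randomly initialized.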

2.2 Architecture