BERT

BERT is a language model architecture commonly used for classification and embedding tasks; the term is often used interchangeably with "encoder-only language model". It was introduced by machine learning researchers at Google in 2018.
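As a sketch of the embedding use case, the snippet below extracts a sentence vector from a pretrained BERT model. It assumes the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint, which are common choices but not the only ones; the [CLS] token's final hidden state is used here as a simple sentence embedding.

```python
# Minimal sketch: sentence embedding with BERT via Hugging Face transformers.
# Assumes `transformers` and `torch` are installed and the
# `bert-base-uncased` checkpoint can be downloaded.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT encodes text into vectors.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The final hidden state of the [CLS] token (position 0) is a
# common choice of fixed-size sentence representation.
embedding = outputs.last_hidden_state[0, 0]
print(embedding.shape)  # bert-base produces 768-dimensional vectors
```

For classification, the same [CLS] vector is typically fed to a small task-specific head (e.g. a linear layer), which is how BERT is commonly fine-tuned.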
