| Model Name | olm/olm-roberta-base-dec-2022 |
| Description | A RoBERTa model trained on data up to December 2022 |
| Use For | General English text |
| Limitations | English-language text only |
| Graft Default | No |
Reference information
| Source | RoBERTa (2022), Hugging Face |
| Trained on | December 2022 cleaned Common Crawl dataset and December 2022 cleaned Wikipedia dataset |
| Paper | RoBERTa: A Robustly Optimized BERT Pretraining Approach |
| Embedding Dimension | 768 |