Chiamakac/IgboNER-Models

IgboBERT Models: Building and Training Transformer Models for the Igbo Language


This repository contains the IgboBERT model training experiments presented in the paper "IgboBERT Models: Building and Training Transformer Models for the Igbo Language". We trained an IgboBERT model from scratch to serve as a baseline model for the Igbo language, and also fine-tuned several state-of-the-art transformer models, including mBERT, XLM-R, and DistilBERT, on Igbo data.

Citation


@inproceedings{chukwuneke2022igbobert,
  title={IgboBERT Models: Building and Training Transformer Models for the Igbo Language},
  author={Chukwuneke, Chiamaka and Ezeani, Ignatius and Rayson, Paul and El-Haj, Mahmoud},
  booktitle={Proceedings of the Thirteenth Language Resources and Evaluation Conference},
  pages={5114--5122},
  year={2022}
}

About

IgboBERT Models
