We are re-implementing BERT for R in {torchtransformers}. We find {torch} much easier to work with in R than {tensorflow}, and strongly recommend starting there!
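If you would like to try {torchtransformers}, it can be installed from GitHub. A minimal sketch follows; the repository location is our best understanding and may change, so check that package's own documentation first:

```r
# Sketch only: the GitHub location of {torchtransformers} is an assumption;
# consult that package's documentation for current installation instructions.
# install.packages("remotes")
remotes::install_github("macmillancontentscience/torchtransformers")
```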
RBERT is an R implementation of BERT, the Natural Language Processing model developed at Google and originally released as a Python package.
You can install RBERT from GitHub with:
```r
# install.packages("devtools")
devtools::install_github(
  "jonathanbratt/RBERT",
  build_vignettes = TRUE
)
```
RBERT requires TensorFlow; currently the TensorFlow version must be 1.13.1 or lower. You can install it using the tensorflow package (installed as a dependency of this package; see the note below about Windows).
```r
tensorflow::install_tensorflow(version = "1.13.1")
```
The current CRAN version of reticulate (1.13) causes some issues with the tensorflow installation. Rebooting your machine after installing Anaconda seems to fix this issue; alternatively, you can upgrade to the development version of reticulate:
devtools::install_github("rstudio/reticulate")
RBERT is a work in progress. While fine-tuning a BERT model using RBERT may be possible, it is not currently recommended.
RBERT is best suited for exploring pre-trained BERT models and for obtaining contextual representations of input text to use as features in downstream tasks.
- See the “Introduction to RBERT” vignette included with the package for more specific examples.
- For a quick explanation of what BERT is, see the “BERT Basics” vignette.
- The package RBERTviz provides tools for making fun and easy visualizations of BERT data.
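As a rough sketch of that feature-extraction workflow: the function names and arguments below (download_BERT_checkpoint() and extract_features()) follow the "Introduction to RBERT" vignette as we understand it, so treat the exact signatures as assumptions and consult the vignette for the definitive usage.

```r
library(RBERT)

# Download (and cache) a pre-trained BERT checkpoint.
# The model name and return value are assumptions based on the vignette.
ckpt_dir <- download_BERT_checkpoint(model = "bert_base_uncased")

# Extract contextual embeddings for a couple of sentences.
text_to_process <- c(
  "The dog chased the ball.",
  "A stitch in time saves nine."
)
feats <- extract_features(
  examples = text_to_process,
  ckpt_dir = ckpt_dir,
  layer_indexes = 1:12
)

# The returned object contains per-token output vectors that can be used
# as features in downstream models.
str(feats, max.level = 1)
```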
The first time you run the test suite, the 388.8MB bert_base_uncased.zip file will be downloaded into your tests/testthat/test_checkpoints directory. Subsequent test runs will reuse that download. This was our best compromise to allow for relatively rapid testing without bloating the repository.
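For reference, a typical way to run the test suite from a local clone of the repository (this is the standard devtools workflow; nothing here is RBERT-specific):

```r
# From the root of a local clone of the RBERT repository.
# The first run triggers the checkpoint download described above.
# install.packages("devtools")
devtools::test()
```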
This is not an officially supported Macmillan Learning product.
Questions or comments should be directed to Jonathan Bratt (jonathan.bratt@macmillan.com) and Jon Harmon (jon.harmon@macmillan.com).