RBERTviz provides tools for convenient visualization of BERT models in the RBERT package.
You can install RBERTviz from GitHub with:
```r
# install.packages("devtools")
devtools::install_github(
  "jonathanbratt/RBERTviz",
  build_vignettes = TRUE
)
```
RBERTviz is intended to be used alongside the RBERT package. See the installation instructions in that repository.
RBERTviz currently enables visualization of:
- the attention matrices for each attention head
- the output vectors at each transformer layer
The attention visualizer is essentially a wrapper around an earlier version of the transformer visualization tools adapted by Jesse Vig.
The output vector visualizer is a collection of routines for generating 2D PCA plots of the layer outputs, which can be interpreted as context-dependent embedding vectors.
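The 2D-PCA idea can be illustrated generically in base R. The matrix and token labels below are random toy stand-ins for real layer outputs (which RBERTviz obtains from an RBERT model); this sketch does not use the RBERTviz API, only `stats::prcomp` and base plotting:

```r
# Toy stand-in for transformer layer outputs: one 768-dimensional
# output vector per token. Values are random here; real vectors would
# come from RBERT, and each row would be a context-dependent embedding.
set.seed(42)
tokens <- c("the", "bank", "of", "the", "river")
layer_output <- matrix(rnorm(length(tokens) * 768), nrow = length(tokens))

# Project the vectors onto their first two principal components.
pca <- prcomp(layer_output, center = TRUE, scale. = FALSE)
coords <- pca$x[, 1:2]

# Plot each token at its 2D position.
plot(coords, xlab = "PC1", ylab = "PC2", type = "n")
text(coords, labels = tokens)
```

With real layer outputs, tokens that BERT treats similarly in context land near each other in this plot; RBERTviz's own routines produce richer versions of this view for each transformer layer.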
See the “Introduction to RBERTviz” vignette included with the package for more detailed examples of usage.
This is not an officially supported Macmillan Learning product.
Questions or comments should be directed to Jonathan Bratt ([email protected]) and Jon Harmon ([email protected]).