Text Analytics Toolbox Model for BERT-Base Multilingual Cased Network
Pretrained BERT-Base Multilingual Cased Network for MATLAB
66 Downloads
Updated
11 Sep 2024
BERT-Base Multilingual Cased is a pretrained language model based on Deep Learning Transformer architecture that can be used for a wide variety of Natural Language Processing (NLP) tasks. This model has 12 self-attention layers and a hidden size of 768.
To load a BERT-Multilingual Cased model, you can run the following code:
[net, tokenizer] = bert(Model="multilingual");
MATLAB Release Compatibility
Created with
R2023b
Compatible with R2023b to R2024b
Platform Compatibility
Windows macOS (Apple silicon) macOS (Intel) LinuxTags
Community Treasure Hunt
Find the treasures in MATLAB Central and discover how the community can help you!
Start Hunting!Discover Live Editor
Create scripts with code, output, and formatted text in a single executable document.