When using BERT with Japanese, the original model published by Google has problems handling Japanese, so some people kindly distribute re-trained models. Thank you very much.
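A minimal sketch of loading one such community-distributed Japanese BERT model, assuming the Hugging Face transformers library and the Tohoku University model "cl-tohoku/bert-base-japanese" (neither is named in the original note; any re-trained Japanese model would do):

  # Requires: pip install transformers torch fugashi ipadic
  from transformers import AutoTokenizer, AutoModel

  # Hypothetical choice of community-retrained Japanese BERT
  model_name = "cl-tohoku/bert-base-japanese"
  tokenizer = AutoTokenizer.from_pretrained(model_name)
  model = AutoModel.from_pretrained(model_name)

  # Tokenize a Japanese sentence and get contextual embeddings
  text = "日本語の文章をBERTで処理する。"
  inputs = tokenizer(text, return_tensors="pt")
  outputs = model(**inputs)
  print(outputs.last_hidden_state.shape)  # (1, seq_len, 768)

The point of the re-trained models is the tokenizer: Google's original multilingual BERT splits Japanese text poorly, while models like this one use a Japanese morphological analyzer (MeCab via fugashi) before subword splitting.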

BERT


This page is auto-translated from /nishio/日本語BERT using DeepL. If you find something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I'm very happy to spread my thoughts to non-Japanese readers.