- Start from distributed representations of characters, then gradually attach them together to build distributed representations of larger structures.
- Should we segment Japanese into words with MeCab, etc. (like the spaces in kana-only books), or
- attach frequent 2-grams together, or
- attach them in the way I wrote in Good or bad distributed representation?
- Units larger than words
- Scrapbox page titles, etc.
- Proposal to apply [keyphrase extraction]
- Scrapbox page titles, etc.
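The "attach frequent 2-grams" idea can be sketched as a BPE-like merge step: count adjacent character pairs, merge the frequent ones into larger units, and embed each unit by averaging its character vectors. This is a minimal sketch of my reading of the note, not the author's actual method; the threshold, the greedy left-to-right merging, and the mean-pooling of character vectors are all assumptions.

```python
from collections import Counter

def frequent_2grams(text, min_count=2):
    """Count adjacent character 2-grams and keep the frequent ones."""
    pairs = Counter(zip(text, text[1:]))
    return {p for p, c in pairs.items() if c >= min_count}

def attach(text, merges):
    """Greedily attach characters left to right when their 2-gram is frequent."""
    chars, out, i = list(text), [], 0
    while i < len(chars):
        if i + 1 < len(chars) and (chars[i], chars[i + 1]) in merges:
            out.append(chars[i] + chars[i + 1])  # merge the frequent pair
            i += 2
        else:
            out.append(chars[i])
            i += 1
    return out

def unit_vector(unit, char_vecs):
    """Embed a multi-character unit as the mean of its character vectors
    (an assumed composition rule; any other pooling would work the same way)."""
    vecs = [char_vecs[ch] for ch in unit]
    return [sum(dim) / len(vecs) for dim in zip(*vecs)]
```

For example, `attach("ababc", frequent_2grams("ababc"))` groups the repeated pair into units `["ab", "ab", "c"]`, and `unit_vector` then gives each unit a single vector, so the same machinery can be iterated to build units larger than words.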
This page is auto-translated from /nishio/文字の分散表現をくっつける using DeepL. If you find something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I'm very happy to spread my thoughts to non-Japanese readers.