- Forms of Knowledge Representation
- I think something new might emerge, but it’s not clear yet.
- Machines, like humans, would be better off learning while outputting a “summary for future review”.
- The attribute of a link from one page to another would be the distribution of surrounding words, as in [CBOW]
- A linked empty page would then be the set of the distributions of the words surrounding each link to it
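- A minimal sketch of what “distribution of surrounding words” could mean here, in the CBOW spirit: for each bracketed link in a page, count the words within a small window around it. The tokenizer, window size, and sample text are assumptions for illustration, not anything specified above.

```python
import re
from collections import Counter

def link_context_distributions(text, window=2):
    """For each [bracketed] link, count the words within `window`
    tokens on either side (a CBOW-style bag of context words)."""
    tokens = re.findall(r"\[[^\]]+\]|\w+", text)
    dists = {}
    for i, tok in enumerate(tokens):
        if tok.startswith("[") and tok.endswith("]"):
            link = tok[1:-1]
            context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
            dists.setdefault(link, Counter()).update(
                w for w in context if not w.startswith("["))
    return dists

# The link [CBOW] gets the surrounding words as its "attribute".
print(link_context_distributions(
    "the link attribute could be the surrounding words, as [CBOW] uses context words"))
```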
- What would you write in a book if a machine could freely write in Scrapbox?
- What currently gets written out to the local file system is the text of each page and the keywords in that text.
- Transcribing whole pages of text just produces dead text.
- It could also be said that since the original is in the file system, it doesn’t need to be written out
- If Scrapbox is a form of knowledge representation, then “sometimes dead text gets put there, right?”
- I don’t remember it.
- Writing it down doesn’t make me remember it.
- Bracket the keywords and it becomes a network.
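- As a rough sketch of that “bracket and it becomes a network” step: pages and their bracketed keywords form a bipartite graph, so two pages that bracket the same word end up connected through it. The page data and helper name below are made up for illustration.

```python
import re
from collections import defaultdict

def build_network(pages):
    """pages: {title: text}. Each [bracketed] keyword becomes a node
    connected to the pages that mention it, so pages sharing a
    keyword are two hops apart."""
    graph = defaultdict(set)
    for title, text in pages.items():
        for keyword in re.findall(r"\[([^\]]+)\]", text):
            graph[title].add(keyword)
            graph[keyword].add(title)
    return graph

pages = {
    "Forms of Knowledge Representation": "link attributes as in [CBOW]",
    "distillation": "soft targets; only loosely related to [CBOW]",
}
print({k: sorted(v) for k, v in build_network(pages).items()})
```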
- If a machine reviews it instead of a human looking at it, what does that look like? (a rough sketch follows this list)
- Select and follow links based on interest
- consult a dictionary
- From the description, look up in the dictionary again any words that interest you.
- I’d look at Wikipedia and quote the opening sentences.
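- One naive reading of the “machine reviewing” bullets above is a walk over the link network that prefers links it has visited least, i.e. a crude stand-in for “interest”. The scoring rule, the graph, and the step count are all assumptions layered on this sketchy description, not a defined algorithm.

```python
import random

def review_walk(graph, start, steps=5, seed=0):
    """Follow links from `start`, always moving to the neighbor
    visited least so far (ties broken randomly), as a crude
    stand-in for 'select and follow links based on interest'."""
    rng = random.Random(seed)
    visits = {start: 1}
    node, path = start, [start]
    for _ in range(steps):
        neighbors = graph.get(node, [])
        if not neighbors:
            break
        node = min(neighbors, key=lambda n: (visits.get(n, 0), rng.random()))
        visits[node] = visits.get(node, 0) + 1
        path.append(node)
    return path

graph = {
    "CBOW": ["word2vec", "distillation"],
    "word2vec": ["CBOW"],
    "distillation": ["soft target", "CBOW"],
    "soft target": ["distillation"],
}
print(review_walk(graph, "CBOW"))
```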
- distillation
- Distilling Knowledge in Deep Learning | Code Craft House
- Re-train using the probability outputs of the classification model as a teacher: soft target loss (a sketch follows this list)
- Distillation in natural language
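- A minimal numpy sketch of the soft-target idea from the bullet above: the student is trained against the teacher’s temperature-softened probability outputs instead of hard labels. The temperature value and the toy logits are assumptions for illustration; the usual recipe also mixes in a hard-label cross-entropy term.

```python
import numpy as np

def softmax(logits, T=1.0):
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def soft_target_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy between the teacher's temperature-softened
    probabilities and the student's (the soft target loss)."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T) + 1e-12)
    return -(p_teacher * log_p_student).sum(axis=-1).mean()

# Toy example: one 3-class prediction from teacher and student.
teacher = np.array([[2.0, 1.0, 0.1]])
student = np.array([[1.5, 1.2, 0.3]])
print(soft_target_loss(student, teacher))
```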
This page is auto-translated from /nishio/2018-10-09 using DeepL. If you find something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I’m very happy to spread my thought to non-Japanese readers.