Kozaneba
from Diary 2023-11-01 My current impressions and thoughts on [omni]
- Good: the "page that sometimes comes up".
- Pages emerge from time to time.
- The form of expression is problematic.
- Currently, I'm "keeping it all" as an experiment, but it's clearly not good.
- After all, we're only looking at the most recent generation, so we don't need the one in front of us.
- If you haven't read the part in the foreground, you should.
- You might want to look back and read it, but it doesn't have to be all unfolded from the beginning.
- The frequency gap is too large.
- And it is not a good idea for that change to be made by a change in title.
- Not that there is anything wrong with that per se, but the problem is that you are treating the page with the changed title as a "newly added page".
- It seems a bit cumbersome to solve this problem with Scrapbox.
- Iterative Commenter
- As expected, the "behavior of slowly extending branches into the surrounding area" is occurring.
- Neri Neri's value is slowly decreasing; when it occurs, it spreads outward slowly over more time.
- It does not have the "KJ-method-like effect of repeated summarization" that the original Recurrent Notes had.
- Iterative Commenter should be moved to private.
- Because the system causes the Serendipity of Random Reading.
- About the original data
- There's a big difference between having 60 books' worth of my Scrapbox articles and 100 books' worth of real book-derived data.
- What's the difference?
- Books assume a one-dimensional context.
- It implicitly assumes that the reader has read the previous sentence.
- So "what the reader is assumed to already know from what has appeared up to that point" is not explicitly stated.
- Scrapbox, on the other hand, makes no assumptions about what the reader has read beforehand.
- So relevant content is explicitly made into a link.
- Ideally, the first thing to do is to put a large one-dimensional item, like a record of an event or lecture material, on the table.
- Event articles are dying.
- what should be done?
- One idea: Leverage memo.
- Given pages k-1 and k, make a leverage memo for page k.
- This is a little over 500 tokens of input.
- Summary is an ambiguous concept.
- It is not a good idea to think of leverage memos as "making summaries".
- A leverage memo is not a summary; it extracts reusable components.
- Returning null is often the correct answer.
- They're not trained on that kind of data set.
- There is no data set for this purpose; do you want to make one?
- Context-Specific Summaries
- I want to take out where the context matches.
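The leverage-memo idea above can be sketched in code. This is a minimal sketch under my own assumptions: the prompt wording and function names are hypothetical, and the actual LLM call is left out. It only captures the behaviors stated in the notes: the input is pages k-1 and k (roughly 500 tokens), the goal is extracting reusable components rather than summarizing, and returning null is often the correct answer.

```python
# Hypothetical sketch of one "leverage memo" step. Not an existing API.

LEVERAGE_PROMPT = (
    "Context (page k-1):\n{prev}\n\n"
    "Target (page k):\n{page}\n\n"
    "Extract reusable components from the target, using the context. "
    "This is NOT a summary. If nothing is reusable, answer exactly: null"
)

def build_leverage_prompt(prev_page: str, page: str) -> str:
    """Build the prompt for one (k-1, k) page pair."""
    return LEVERAGE_PROMPT.format(prev=prev_page, page=page)

def parse_leverage_memo(raw_output: str):
    """Treat 'null' (or empty output) as 'no memo', because
    'returning null is often the correct answer'."""
    text = raw_output.strip()
    if not text or text.lower() == "null":
        return None
    return text
```

The null-handling is the point: the pipeline must accept "nothing reusable here" as a first-class result instead of forcing a summary out of every page.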
- Markdown conversion
- Simple format conversion
- This is boring.
- Stylistic Conversion
- From fragmentary notes to blog post
- Conversion to English
- And determine if it should be targeted as a preliminary step?
- Relation to search hits
- There are smaller units than the current chunk of 500 tokens.
- Scrapbox's brief article is "it".
- Evergreen notes should be atomic.
- The string of 500 tokens cut from the book is not it.
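One way to get units smaller than a fixed 500-token window, sketched under the assumption that a Scrapbox-style page already has one atomic unit per line: merge whole lines up to the token budget instead of cutting an arbitrary 500-token string. `count_tokens` is a stand-in here; a real tokenizer (not shown) would replace it.

```python
# Sketch: line-respecting chunking instead of fixed-width token windows.
# count_tokens defaults to character length purely for illustration.

def atomic_chunks(page_text, max_tokens=500, count_tokens=len):
    chunks, current, size = [], [], 0
    for line in page_text.splitlines():
        cost = count_tokens(line)
        if current and size + cost > max_tokens:
            # Flush before the budget is exceeded: a line is never split.
            chunks.append("\n".join(current))
            current, size = [], 0
        current.append(line)
        size += cost
    if current:
        chunks.append("\n".join(current))
    return chunks
```

Because boundaries always fall between lines, each chunk stays a set of complete atomic statements, unlike "the string of 500 tokens cut from the book".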
- Maybe I'll stop working with GitHub Actions once and for all.
from Diary 2023-11-02 Current impressions and thoughts on OMNI
Iterative Commenter is running on Github Actions.
- omni-private is working locally.
- It's easier to experiment that way.
- The code is shared with public omni.
Iterative Commenter should be moved to private.
- → Stop working with GitHub Actions for now.
public /nishio → GitHub
- Save for now
- Cleaned up articles are available separately.
- The Wikignome for maintenance should be made with 3.5.
Machine-generated duplicate content is questionable.
ideal
- About embed
- Use cache
- Does not carry over unused cache
- Do not target machine-generated content
- About Update
- Maintain telomeres
- Or overwrite without logging.
- If you want to see the past, use History.
- search results
- viewable
- Shouldn't be shown from the beginning.
- too much
- More DIGEST
- Human-made DIGESTs will be data for future FTs.
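The embed-cache policy in the "ideal" list above can be sketched as a single refresh pass. All names here are hypothetical; `embed` stands in for whatever embedding call is used. The three stated rules are: use the cache, don't carry over unused cache entries, and don't target machine-generated content.

```python
# Sketch of the cache-refresh policy for embeddings (hypothetical names).

def refresh_embedding_cache(chunks, old_cache, embed, is_machine_generated):
    """chunks: texts to index this run.
    old_cache: dict text -> vector from the previous run.
    Returns a fresh cache; old entries not referenced this run are dropped."""
    new_cache = {}
    for text in chunks:
        if is_machine_generated(text):
            continue  # "Do not target machine-generated content"
        if text in new_cache:
            continue  # already handled this run
        if text in old_cache:
            new_cache[text] = old_cache[text]  # "Use cache"
        else:
            new_cache[text] = embed(text)      # only embed what is new
    return new_cache  # "Does not carry over unused cache"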
Lost in the meadow, which way to go?
- State with more than one attractive goal.
- If it's one thing, go that way.
- The problem of knowing which way to go when many goals lie in different directions.
- Situations that call for Kozaneba.
Joint activities with AI
- Not much has been done.
- With omni-private, you just throw a query and the AI generates it and reads it.
- Just a single request response
- And they're doing it in Scrapbox, which is unsuitable.
AI Reading Notebook
- Let the AI do the activity of [read a book].
- What is reading?
- Creating a knowledge network.
- What kind of network is needed?
- There should be backlinks, maybe.
- There should be 2-hop links.
- Why? Because this Scrapbox reminds me of what I've forgotten.
- Landing it in the context of nishio?
- A system that allows people to find connections between books, and if you let people read the output of 60 books of NISHIO, you can land the books in NISHIO's context as a result.
- You donāt have to make it explicit.
- Well, if you want to experience this effect, you must first have a reasonable amount of output, so it's not for the average person at all.
- You shouldn't aim for the general public first.
Kozaneba
- I have always used it when I need a non-written form of expression.
- Rather "functional enough"?
- It would be useful to write notes as sentences.
- The issues are browser compatibility rather than functionality, e.g. I want to be able to use it from my iPad.
- The chopping of the text is done by humans, and I think doing it has the effect that "better understanding is created by reading while processing", like copying sutras or active reading.
- But that may be an assumption.
- If an LLM helped, I might end up saying "I should have done this sooner!"
- I've been using it for a long time, but it's not the type of tool I use every day.
- Tools to retrieve when needed
- Should there be an interconnection with Scrapbox?
- Scrapbox's "link by string" and Kozaneba's non-verbal "link by placement" and "link by line" are complementary.
- The Scrapbox-side view is not very customizable, so another view is needed.
- More development of mem.nhiro.org.
Keichobot
- A system that encourages verbalization by asking humans questions.
- Focus on bringing it out from scratch.
- so it is not connected to existing conversations or Scrapbox stock.
- There is of course the possibility that it would be good if it were connected.
- I put the chat logs in Scrapbox.
- It was chopped into chunks and entered into the vector index.
- This had merit.
- Right now it's chopped at 500 tokens, so the granularity is rough, but this exchange is QA, so it might be good to index it in the form of QA pairs or something.
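The QA-pair indexing idea can be sketched as a log-restructuring step. The turn format and pairing scheme here are my assumptions: I assume the Keichobot log is a sequence of (role, text) turns where the bot asks questions and the human answers, and each bot question is paired with the user reply that follows it.

```python
# Sketch: index Keichobot logs as Q/A pairs instead of flat 500-token chunks.

def to_qa_chunks(turns):
    """turns: list of (role, text), role in {"bot", "user"}.
    Returns one chunk per bot-question / user-answer pair."""
    chunks, pending_question = [], None
    for role, text in turns:
        if role == "bot":
            pending_question = text  # remember the latest question
        elif role == "user" and pending_question is not None:
            chunks.append(f"Q: {pending_question}\nA: {text}")
            pending_question = None
    return chunks
```

Each chunk then carries a complete question-answer unit, so a vector-index hit returns the exchange in a form that makes sense on its own.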
Situations that call for Kozaneba
Kozaneba
This is a bit of a leap.
- Too drawn to the specific idea of "leverage memos".
- Overly specific implementation images
- Given a page in a book, if the LLM can find links to what is related to what in the story up to that point in the book, the LLM can create a network structure while reading a one-dimensional book.
There are links to Scrapbox, so there should be backlinks
- Scrapbox does not have the ability to display them, so there should be a new view.
https://kozaneba.netlify.app/#view=lhq9ixTHLRlLh55wFcT7
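The backlink table that the new view would render can be sketched as a simple inversion of the links the LLM emits while reading page by page. The pair format and function name are hypothetical.

```python
# Sketch: derive backlinks from (source, destination) links emitted
# while the LLM reads a one-dimensional book page by page.
from collections import defaultdict

def build_backlinks(links):
    """links: iterable of (src_page, dst_page) pairs.
    Returns dst_page -> sorted list of pages that link to it."""
    back = defaultdict(set)
    for src, dst in links:
        back[dst].add(src)
    return {dst: sorted(srcs) for dst, srcs in back.items()}
```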
- Refine the concept of summary
- Create reusable components
- You can return null.
- Finding connections between context and a given sentence
- What is context?
- The pages before the one you are currently reading in the book.
- Books I'm reading now, and others.
- Which is the context?
- Ideas and books in Scrapbox
Support by the LLM in Kozaneba
- Where to chop the text
- Where to title the group
This page is auto-translated from /nishio/omniに関する現時点での感想と思考の整理 using DeepL. If you see something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I'm very happy to spread my thoughts to non-Japanese readers.