Lab Youth Training Camp Day 3
Suppressing pages that have already appeared should be limited to pages that come up in vector searches.
-
If something pulled in at random turns out to be relevant, it should run a vector search on that content in turn and use the most relevant parts.
-
The AI Reading Notebook is working, for now.
-
No vector search.
-
I'm running it on GPT-3.5, which only goes up to a 4K context.
- Either use the 16K version, or...
- Then again, the temptation becomes "well, we've got lots of space, so let's put more information in."
- But then the ratio of "what it reads" to "what it writes" goes up, and its own writing might fade away more and more!
- Maybe it will.
- In the end, what matters is the amount of output.
-
Oh, I thought I was using 3.5, but it was 4.
Since I/O is assumed to go through Scrapbox, there's only one input/output channel and error messages land in the same place; with that mechanism, if it loops mechanically, it will read the error message and write the continuation from it.
Oh, and with 3.5 it sometimes just translates what it read and writes that down as it goes.
- Once that happens, the next run looks at that output and continues from it, so it turns into a translation, not notes.
Rough estimate: it costs about $0.48 per step, and each book will take about 100-500 steps, so this gets pretty expensive.
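- In other words, roughly $0.48 × 100 = $48 at the low end and $0.48 × 500 = $240 at the high end, per book.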
The fill_with_related_fragments system currently returns a string that fills the given token budget plus a list of the titles of the pages it used, but this should change.
- Because right now it has the "stack up the relevant stuff and fill the rest with random chunks" behavior baked in.
- In the new implementation, I want to hand over half the frame, and pour the text data into the other half plus whatever is left over.
- The current design doesn't allow that kind of flexibility, which is bad.
- A whole bunch of code that shouldn't be tightly coupled has ended up tied to the interface defined at the beginning (rough sketch of the intended direction below).
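A minimal sketch of the direction, under my own assumptions (build_prompt, count_tokens, truncate_to_tokens are hypothetical names, not the actual omni interface; a whitespace split stands in for a real tokenizer):

```python
def count_tokens(s: str) -> int:
    # Placeholder tokenizer; a real implementation would use tiktoken.
    return len(s.split())

def truncate_to_tokens(s: str, budget: int) -> str:
    return " ".join(s.split()[:budget])

def build_prompt(related_fragments: str, text_data: str, frame: int) -> str:
    # Give related fragments at most half of the token frame...
    half = frame // 2
    fragments = truncate_to_tokens(related_fragments, half)
    # ...then pour the text data into the other half plus whatever
    # the fragments left unused, instead of padding with random chunks.
    leftover = half - count_tokens(fragments)
    text = truncate_to_tokens(text_data, half + leftover)
    return fragments + "\n" + text
```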
I forgot to push again today, so the code doing the periodic execution is a bit old...
- I'll push and trigger it manually, then.
Going multi-head: let's add lots of heads.
- Generating topics to read on my phone on the train ride home from camp.
- Changed cron to once an hour.
Let's see if they're ready by the time I leave.
- Perspective → title
- → Add to the vector-search NG list
- Increase the proportion of vector searches
- OK for now, since āup to 3ā has been changed to ā3ā.
- In the future I'd like to do some balancing for the initial phase, when there is little input.
- Distinguish between titles picked up at random and those picked up by vector search (sketch of these tweaks below).
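A rough sketch of these selection tweaks, under my own assumptions (pick_titles and its arguments are hypothetical names, not the real omni code):

```python
import random

def pick_titles(vector_results: list[str], all_titles: list[str],
                ng_list: set[str], n_vector: int, n_random: int):
    # The NG (exclusion) list applies only to vector-search results;
    # random picks stay unrestricted.
    vec = [t for t in vector_results if t not in ng_list][:n_vector]
    pool = [t for t in all_titles if t not in vec]
    rand = random.sample(pool, min(n_random, len(pool)))
    # Tag each title so the output can distinguish random picks
    # from vector-search picks.
    return [(t, "vector") for t in vec] + [(t, "random") for t in rand]
```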
leaving (fleeing) the venue
- I made a lot of parallel notes for now.
- Iāll have to remember to stop this one tonightā¦
- It runs once an hour, so itās going to be tough tomorrow morning.
- The theory that it's already a mess even now.
- TODO: back to once a day when I get home.
- Good to run many themes in parallel.
- If you remove the 🤖 mark, the auto-update stops.
- But explicitly removing the mark doesn't feel like a great experience.
- Wouldnāt it be nice if the frequency slowly decreased?
- Spaced Repetition
- Gradually increase the span.
- If a human reacts, it goes back to day one.
- Record the generation date and time.
- If the page's update date is later than the generation date, an edit was made, so regenerate at the next regular timing.
- If it wasn't edited, regenerate only once 1.5 × the "gap between the last generation and the one before it" has passed (sketch below).
- So the intervals go 1, 2, 3, 5, 8, 13, and so on.
- Almost the Fibonacci sequence.
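A minimal sketch of that update rule, with hypothetical names (should_regenerate and its arguments are mine, not the actual implementation):

```python
from datetime import datetime

def should_regenerate(prev_generated_at: datetime,
                      generated_at: datetime,
                      page_updated_at: datetime,
                      now: datetime) -> bool:
    # A human edit after the last generation resets the schedule:
    # regenerate at the very next regular timing.
    if page_updated_at > generated_at:
        return True
    # Otherwise wait until 1.5x the previous gap has elapsed,
    # giving intervals of roughly 1, 2, 3, 5, 8, 13 days.
    last_gap = generated_at - prev_generated_at
    return now - generated_at > last_gap * 1.5
```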
When an error occurs, all the fragments get written out and I have to delete them; very tedious to do from a phone.
- If I don't turn it off, I'll just get the error again the next time the context width is exceeded, which is a bad implementation lol
I just counted: 20 parallel topic pages. I'm making too many!
- Some are keywords I learned today listening to the Lab Youth members' LTs.
- Some are keywords that have been mentioned for years.
- General keywords can be answered by plain GPT-4 as well.
- Others, like Kozaneba, are keywords raw GPT-4 can't answer.
- So far it has answered those well, picking up information from Scrapbox.
I got a release notification from GitHub at 16:07, but 20 GPT-4 runs fire after this, huh? So an overwrite comes in about 30 minutes? Hard to write anything...
- I still think it should run during sleeping hours.
16:58 I thought maybe an update would come while I was lost at Yokohama station, but it didnāt.
- Is it down due to some kind of error?
- It hasn't even crashed, though!
- I may have committed with pdb.set_trace left in without checking carefully again.
- Doesn't seem like it, though.
- Iāll have to look at the logs to be sure.
- Or tether and work on it at the next transfer on the way home.
- I donāt know if such a place exists.
- Stopped the running action.
- I thought I'd changed the schedule but forgot to push, again...
- I got an email notification that it was released.
Debugging, mystery solved.
- The embedding API fails and retries endlessly.
- So why does it fail?
- The "previous note" is an empty string.
- How did that happen?
- The GPT-4 API had failed and the error log was written as the previous note; I didn't want it read as a note, so I deleted it from my phone, which left behind a blank line.
Hmmm, what to do.
- A: Fundamentally, it's bad that the error log can be read as a note.
- B: When there's only a blank line above the marker, it's wrong to treat that blank line as the previous note; careless mistakes could trigger this again in the future.
- C: It's odd to do a vector search at all when the previous note is genuinely empty.
- The original design filled it in randomly, but I decided to ignore that use case.
- Even if the note is empty, the title exists, so use that.
- D: It's wrong for the embedding API call to retry infinitely when it's genuinely passed an empty string.
- It now throws an exception immediately, since an empty string being passed at all is abnormal (sketch of the idea below).
B, C, D fixed: https://github.com/nishio/omni/commit/33ab6c7ad456c5aa96033603a4f06cb7e5a72465
A fixed: https://github.com/nishio/omni/commit/03b6ca3e660ced94fdcbb84e975c62bc115e3d4b
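A minimal sketch of the D fix, assuming the 2023-era openai SDK and a hypothetical get_embedding wrapper (the name and bounded-retry scheme are my assumptions, not necessarily what the commits above do):

```python
import time
import openai

def get_embedding(text: str, max_retries: int = 5) -> list[float]:
    # Fail fast: an empty string reaching this point means an upstream
    # bug (e.g. an error log deleted down to a blank "previous note"),
    # so raise instead of retrying forever.
    if not text.strip():
        raise ValueError("empty string passed to the embedding API")
    for attempt in range(max_retries):
        try:
            res = openai.Embedding.create(
                model="text-embedding-ada-002", input=text)
            return res["data"][0]["embedding"]
        except openai.error.OpenAIError:
            # Transient failures: back off, but only a bounded number of times.
            time.sleep(2 ** attempt)
    raise RuntimeError("embedding API kept failing")
```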
The original plan to interact on the train on the way home was a dud.
-
I looked through all of them anyway.
- What I turned off: 🤖Zero Knowledge Proof
-
Pages that seem iffy now might turn out to be interesting if I keep running it for a while.
-
I've increased the proportion of vector searches now, so when those run out it'll start pulling in things it doesn't have yet.
-
Diary 2023-08-22 ← Diary 2023-08-23 → Diary 2023-08-24
100 days ago: Diary 2023-05-15. 1 year ago: Diary 2022-08-23.
This page is auto-translated from /nishio/日記2023-08-23 using DeepL. If you see something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I'm very happy to spread my thoughts to non-Japanese readers.