halfbakery
humans are your god

modify weights and continually update LLMs with them
 

LLMs are brittle. They can be trained once and thereafter never learn again, or they can be continually updated in various ways, all of which lead to forgetting things that are important. One way to prevent this so-called "catastrophic forgetting" in LLMs is Elastic Weight Consolidation (EWC), a technique that constrains the LLM's weights against change. Of course, there is a trade-off: constrain too much and the model can't learn anything new; constrain too little and it forgets everything it used to know. What if there were a way to constrain only the weights that are important?
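[As background, the EWC penalty referred to above can be sketched in a few lines. This is a generic illustration of the published technique, not part of the idea itself; all names are illustrative:]

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    # Quadratic pull toward the old weights theta_star, scaled
    # per-parameter by a Fisher-information estimate of how important
    # each weight was to the old task. lam trades plasticity
    # (learn new things) against stability (remember old ones).
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

# An unimportant weight (fisher = 0) may drift freely;
# an important one (fisher = 4) is penalized for moving.
theta      = np.array([1.0, 2.0])
theta_star = np.array([1.0, 1.0])
fisher     = np.array([0.0, 4.0])
print(ewc_penalty(theta, theta_star, fisher, lam=2.0))  # → 4.0
```

[This penalty is added to the ordinary training loss, so gradient descent is free to move unimportant weights while important ones are anchored near their old values.]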

Using primary human input as a proxy for what is important, this is possible. A secondary LLM would be trained on the primary LLM's previously held data. It would combine that data with the new experiences both LLMs are receiving. Its output would be a very limited set of concepts pertinent both to the previous data and to what the human operator is asking for. For example, if the human operator input "I need some water", the secondary LLM would highlight the places where those words influence the primary LLM's weight matrix. These places, and only these places, would receive a much stronger (decreasing over time) constraint against being forgotten. The primary LLM would then be more "focused", for lack of a better word, on the concepts surrounding the human in question and his need for water.
Voice, Sep 16 2024
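[A minimal sketch of the selective constraint the idea describes, assuming the secondary LLM emits a 0/1 relevance mask over the primary model's weights. The mask, the boost factor, and the decay schedule are all hypothetical illustrations, not a worked-out method:]

```python
import numpy as np

def focused_fisher(fisher, relevance, boost=10.0, decay=0.9, step=0):
    # Weights the secondary model flags as relevant to the current human
    # request ("I need some water") receive a much stronger
    # anti-forgetting constraint; the extra protection decays
    # geometrically as time steps pass.
    return fisher * (1.0 + boost * decay**step * relevance)

fisher    = np.array([1.0, 1.0, 1.0])
relevance = np.array([0.0, 1.0, 0.0])  # only the middle weight is "about water"
print(focused_fisher(fisher, relevance, boost=10.0, decay=0.5, step=1))
# → [1. 6. 1.]
```

[The boosted values would then be used in place of the plain Fisher estimates in the EWC penalty, so only the highlighted region of the weight matrix is strongly anchored.]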

Attention is All You Need https://research.go...on-is-all-you-need/
This paper boosted what had been relatively steady progress into a sudden explosion of the current chat-based stuff that's been so successful recently. [zen_tom, Sep 16 2024]

I've read it through a few times now and even looked up some stuff.

It feels wrong.

I have no other criteria I can give you for that feeling. It just feels wrong.

I don't know why.

Is the technical concept of "attention" (Attention is All You Need, Google 2017) doing what you largely describe in your second paragraph?
zen_tom, Sep 16 2024
  

Yes. [marked-for-expiry] I somehow thought this would allow dynamic learning but of course it won't.
Voice, Sep 16 2024
  
      
  


 
