You know LLMs are regurgitating when they contradict their own statements just because you click 'redo' on a prompt. I doubt that if you asked a person the same question twice, they would suddenly say the complete opposite of what they just said.
Comparing LLMs trained on Reddit comments to people who learn to speak as a byproduct of actually interacting with people and the world is nuts.
Actually, I think the line between creating and regurgitating is so blurred that you can't tell me a single creative thing you've done. So if 99% of people aren't creative and just regurgitate, why do we hold AI to such a high standard?
Can you show me one single thing you did in your life that was truly creative and not regurgitated?
I think that was my point: I generally regurgitate. A person can do that a lot in life.
That's why people are mistaking LLMs for AGI.
For now, I think the key difference between me and an LLM is that an LLM still needs a prompt.
It's not surveying the world around it and deciding what it needs to do.
I do a lot of something that I think an LLM can't do: look at things, figure out what attributes they have, and work out how I can harness those attributes to solve problems. Most of those attributes are unknown to the human race when I start.
So if I make an AI with a prompt and tell it to re-prompt itself every day for the rest of its life, does that mean it's smart now? Or does it not count just because I gave it the first prompt? I doubt your own first prompt came from yourself. Your first prompt was probably given in your mum's belly.
---
I could give an initial prompt to my AI to survey the server and act accordingly… and it could re-prompt itself every day.
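To be concrete, here's a rough sketch of the kind of loop I mean (Python, with `ask_model()` as a hypothetical stand-in for whatever chat-completion API you'd actually call, and `survey_server()` only grabbing a couple of basic metrics):

```python
import os
import shutil
import time

def ask_model(prompt: str) -> str:
    """Stand-in for whatever chat-completion API you use (hypothetical)."""
    # In practice: send `prompt` to your LLM endpoint and return its reply.
    return f"(model reply to: {prompt[:60]}...)"

def survey_server() -> str:
    """Gather a few real metrics for the model to 'look at'."""
    load1, load5, load15 = os.getloadavg()  # Unix-only
    disk = shutil.disk_usage("/")
    return (f"load avg {load1:.2f}/{load5:.2f}/{load15:.2f}, "
            f"{disk.free // 2**30} GiB free of {disk.total // 2**30} GiB")

# The one prompt a human supplies: the 'first prompt'.
next_prompt = "Survey the server and decide what, if anything, needs doing."

while True:
    observation = survey_server()
    reply = ask_model(f"{next_prompt}\n\nCurrent state: {observation}")
    print(reply)
    # The model writes tomorrow's prompt for itself.
    next_prompt = ask_model(
        f"Given your last reply:\n{reply}\n\n"
        "Write the prompt you should run tomorrow."
    )
    time.sleep(24 * 60 * 60)  # wait a day, then re-prompt
```

After the seed prompt, the human never touches it again; every later prompt comes from the model's own previous output plus whatever it observed.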
---
> I do a lot of something that I think an LLM can't do: look at things, figure out what attributes they have, and work out how I can harness those attributes to solve problems. Most of those attributes are unknown to the human race when I start.
Any examples? An AI can look at a conversation and extract insights better than most people. It can negotiate better than most people.
---
I haven't heard anything you can do that an LLM can't. Prompting yourself to do something isn't, in my view, a differentiator.
You also self-prompt based on previous feedback, and you've been doing it since you were a baby. So someone gave you the source prompt too. Maybe DNA.