https://news.ycombinator.com/item?id=41058194
What this implies is future models will be even better at sounding smart but even more likely to hallucinate and give you wrong answers.
The future is r-slurred.
this is kinda like incest, where the offspring turns r-slurred
hot robot brain sexo?
Habsburg GPT![:marseysnappyautism: :marseysnappyautism:](https://i.rdrama.net/e/marseysnappyautism.webp)
Considering how neural networks iterate, that's actually a very accurate analogy for what's happening here.![:nerd: :nerd:](https://i.rdrama.net/e/nerd.webp)
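For what it's worth, the analogy can be made concrete. When each generation of a model is fit only to samples drawn from the previous generation's model, sampling noise compounds and the learned distribution degenerates over time. A minimal illustrative sketch (a one-dimensional Gaussian standing in for a language model; all names are made up for the example):

```python
# Toy sketch of "model collapse": generation t+1 is fit only to samples
# generated by generation t, never to the original data.
import random
import statistics

def train_on_own_output(n_samples=100, generations=2000, seed=0):
    """Return the fitted variance after each generation."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0  # generation 0: the "real" data distribution
    variances = []
    for _ in range(generations):
        # Sample synthetic "training data" from the current model...
        samples = [rng.gauss(mu, sigma) for _ in range(n_samples)]
        # ...then refit the next model to that synthetic data alone.
        mu = statistics.fmean(samples)
        sigma = statistics.pstdev(samples)
        variances.append(sigma ** 2)
    return variances

variances = train_on_own_output()
print(f"gen 1 variance: {variances[0]:.3f}")
print(f"gen 2000 variance: {variances[-1]:.6f}")
```

With finite samples each refit slightly underestimates and randomly perturbs the spread, so the fitted variance tends toward zero: the later "generations" produce increasingly narrow, repetitive output, which is the degeneration the thread is joking about.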
yeah but wouldn't they be trained by humans so it would be like a strict natural selection to remove the r-slurred
Eugenics, you say? Tsk.
I'd say it's closer to getting cancer, but your analogy also works.