https://news.ycombinator.com/item?id=41058194
What this implies is that future models will be even better at sounding smart but even more likely to hallucinate and give you wrong answers.
The future is r-slurred.
!nooticers Literal nothing burger.
"When we pipe data into our training and make literally zero attempt to separate bad from good outputs first, there's more bad outputs!"
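The complaint above is about skipping any quality filter before training. A minimal sketch of what even a trivial filter looks like: the `quality_score` heuristic and the `0.5` threshold here are hypothetical stand-ins for whatever a real pipeline would use (perplexity filters, learned classifiers, deduplication, etc.), not anyone's actual method.

```python
def quality_score(text: str) -> float:
    """Toy heuristic: penalize very short or highly repetitive samples."""
    words = text.split()
    if not words:
        return 0.0
    unique_ratio = len(set(words)) / len(words)  # repetition penalty
    length_ok = min(len(words) / 20, 1.0)        # too-short penalty
    return unique_ratio * length_ok

def filter_corpus(samples, threshold=0.5):
    """Keep only samples whose score clears the threshold."""
    return [s for s in samples if quality_score(s) >= threshold]

corpus = [
    "the cat sat on the mat and watched the rain fall on the quiet street outside",
    "spam " * 20,  # degenerate repetitive output
    "ok",          # too short to be useful
]
kept = filter_corpus(corpus)
print(len(kept))  # only the first sample survives
```

The point being made is just that doing nothing at all, i.e. `threshold=0`, is a choice too, and the paper's result follows directly from it.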
People have been training off AI data for a year or more by now.
I've been training a year or more on your mom.