I don't like how OpenAI keeps closing off more and more of what they're doing behind the scenes, but frick me I guess. 4o was less a step forward and more a step sideways, and this o1 bullshit is in the same vein.
They're supposedly running a huge transformer doing chain-of-thought (CoT) behind the scenes while apps and consumers only see the final answer, but who fricking cares about that shit anyways, number goes up, slop about PhD-level performance, and bro it scored like 85 on this one test bro, just one more prompt bro, just 900 more crosschecks bro.
If I read one more fricking r-slur post about strawberries I will unironically force-feed them 3,000 strawberries.
Do you know cars? Temp is like asking what happens when you frick with a setting in the ECU. It's about the probability curve the model samples from, and it gets complicated.
Tl;dr the model scores every candidate next token, and with temp set to 0.7 it can still pick from a pretty wide spread of those options. Setting it to 0 kills the curve and forces it onto the top pick only. Rough sketch of the math after the link.
https://medium.com/@albert_88839/large-language-model-settings-temperature-top-p-and-max-tokens-1a0b54dcb25e
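Since people keep asking, here's a minimal sketch of the math the article is describing. The logits and token count are made up, and real providers stack top-p, tie-breaking, and hardware nondeterminism on top of this, so treat it as an illustration, not any vendor's actual sampler.

```python
# Rough sketch of what temperature does to the "probability curve".
# The logits below are made-up scores for five candidate next tokens.
import numpy as np

def softmax_with_temperature(logits, temperature):
    """Turn raw scores into sampling probabilities.

    Temperature near 0 collapses the curve onto the top token;
    higher values flatten it so more options stay in play.
    """
    if temperature <= 0:
        # Greedy limit: all probability mass on the argmax.
        probs = np.zeros(len(logits))
        probs[int(np.argmax(logits))] = 1.0
        return probs
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()            # numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

logits = [4.0, 3.5, 2.0, 0.5, -1.0]   # pretend next-token scores
for t in (0.0, 0.7, 1.5):
    print(t, np.round(softmax_with_temperature(logits, t), 3))
```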
No clue what you mean by "talking like it has a fever".
Going back to the car analogy, you set it to 0 so you know it's the LLM provider fricking around. Your car will autoshift for a number of reasons, but setting temp to 0 is like hardcoding "at X RPM, go up one gear." You benchmark it to make sure that happens every time. If all of a sudden the gear isn't going up at X RPM anymore, you know the LLM provider fricked with the model again.
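And a minimal sketch of the "benchmark at temp 0" idea, assuming the official openai Python client; the model name, prompt, and saved baseline are placeholders, and even temp 0 isn't perfectly deterministic in practice, so a real check would run a batch of prompts and alert on how many drift.

```python
# Sketch of a temp-0 regression check: same prompt, temperature=0,
# compared against an answer saved when the model behaved as expected.
# Model name, prompt, and baseline are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = "Answer with one word: what is the capital of France?"
BASELINE = "Paris"  # captured on a known-good day

resp = client.chat.completions.create(
    model="gpt-4o",  # whichever model you're pinning
    messages=[{"role": "user", "content": PROMPT}],
    temperature=0,
)
answer = resp.choices[0].message.content.strip()

if answer != BASELINE:
    print(f"Model drifted: expected {BASELINE!r}, got {answer!r}")
else:
    print("Still shifting gears at the same RPM.")
```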
Mm, that article was pretty simple and helpful. Thanks.
Are there any LLMs I can run locally that compare at all to ChatGPT? I've tried twice, once recently and once six months ago, and they always fricking suck.
Lurk /g/ or go on huggingface to answer your question. I do this for corpo shit so I don't really care about open source local models tbh. A modified version of llama is your best bet.
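If you do go the huggingface route, a minimal sketch looks something like this; the model ID is a placeholder for whichever Llama fine-tune you pick, and you'll want enough VRAM (or a quantized build via llama.cpp instead) for it to be tolerable.

```python
# Sketch of running a local Llama-family fine-tune with transformers.
# The model ID is a placeholder; swap in whatever you actually download.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="some-org/llama-finetune",  # placeholder model ID
    device_map="auto",                # use a GPU if one is available
)

out = generator(
    "Explain what temperature does in an LLM sampler.",
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(out[0]["generated_text"])
```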