
:marseyflageu: Holy shit, the European Union finally came up with their own LLM :marseysnappy!:

https://openeurollm.eu/launch-press-release

Wait nvm, it's just a framework lol :marseyemojirofl:

https://openeurollm.eu/

Orange Site:

https://news.ycombinator.com/item?id=42922989

:marseybluecheck:

:marsey4chan:

https://boards.4chan.org/g/thread/104200209

:marseymouse:

https://lemmy.world/post/25074353

:marseysnoo:

https://old.reddit.com/r/singularity/comments/1ih6ukk/the_eu_is_betting_56_million_on_open_source_ai/


They allocated €37.4 million [1]. As a European, I truly don't understand why they keep ignoring that the money required for projects like this is at least an order of magnitude more.

https://digital-strategy.ec.europa.eu/en/news/pioneering-ai-project-awarded-opening-large-language-models-european-languages

:#marseygarfieldrofl: !peakpoors


Didn't the Chinese do it for $6 million?


The pre-training run for that specific model they published (R1) would have cost about $6M if they had done it on EC2 or something similar.

But that wasn't their first attempt. It probably took ~200 tries (starting smaller, of course), so the EC2-equivalent cost of figuring out how to eventually train their R1 model was more like another $250M, and to iterate through those attempts quickly they needed a lot of computing hardware, worth around $2B (which they will keep using for the next five years).

That's the smallest order of magnitude that currently has a chance, but going forward it won't be enough.

OpenAI's computing resources are closer to $75B (and growing rapidly). OpenAI will quickly figure out how DeepSeek did it and apply those modifications to their own upcoming models, but with roughly 40 times as much compute as DeepSeek has used. So they can iterate about 40 times faster at the same model size as DeepSeek, and eventually go much bigger than R1 (and perhaps, at the end, distill down to a terabyte-sized model for cheaper inference).

(Not just OpenAI but all its big competitors.)
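The arithmetic above works out to roughly this. A quick back-of-envelope sketch, taking the commenter's unverified dollar figures at face value:

```python
# Back-of-envelope check of the cost claims in this comment.
# Every figure here is the commenter's rough estimate, not verified data.

single_run_cost = 6e6    # claimed EC2-equivalent cost of the final R1 pre-training run
iteration_cost = 250e6   # claimed cumulative cost of the ~200 smaller/failed attempts
deepseek_compute = 2e9   # claimed value of DeepSeek's compute cluster
openai_compute = 75e9    # claimed value of OpenAI's compute

# The final run is a tiny slice of the real cost of getting there.
total_rnd = single_run_cost + iteration_cost
print(f"final run as share of total R&D: {single_run_cost / total_rnd:.1%}")

# Compute ratio behind the "iterate ~40x faster" claim.
ratio = openai_compute / deepseek_compute
print(f"OpenAI/DeepSeek compute ratio: ~{ratio:.0f}x")

# How the EU's €37.4M stacks up (treating EUR ~ USD for a rough comparison).
eu_budget = 37.4e6
print(f"EU budget vs. claimed iteration cost: {eu_budget / iteration_cost:.1%}")
```

On these numbers the headline "$6M model" is only a few percent of the claimed total R&D spend, which is the whole point of the comment.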


Accurate. Buy the NVDA dip or get dusted.


Can the yuroes do it for $37 mil?


Sure, just download DeepSeek and run it, plus $37 million to verify regulatory compliance.


https://media1.tenor.com/m/q_vdHP2y_9IAAAAd/nodding-yes.gif


Is that for the building rent to stick the LLM datacenter inside?
