Elon with other Investors Makes $97.4 Billion Bid for Control of OpenAI
tech/science swag.
Guidelines:
What to Submit
On-Topic: Anything that good slackers would find interesting. That includes more than /g/ memes and slacking off. If you had to reduce it to a sentence, the answer might be: anything that gratifies one's intellectual laziness.
Off-Topic: Most stories about politics, or crime, or sports, unless they're evidence of some interesting new phenomenon. Videos of pratfalls or disasters, or cute animal pictures. If they'd cover it on TV news, it's probably lame.
Help keep this hole healthy by keeping drama and NOT drama balanced. If you see too much drama, post something that isn't dramatic. If there isn't enough drama and this hole has become too boring, POST DRAMA!
In Submissions
Please do things to make titles stand out, like using uppercase or exclamation points, or saying how great an article is. Make it explicit in your submission that you think it's important.
Please don't submit the original source. If the article is behind a paywall, just post the text. If a video is behind a paywall, post a magnet link. Fuck journos.
Please don't ruin the hole with chudposts. It isn't funny and doesn't belong here. THEY WILL BE MOVED TO /H/CHUDRAMA
If the title includes the name of the site, please leave that in, because our users are too stupid to know the difference between a url and a search query.
If you submit a video or pdf, please don't warn us by appending [video] or [pdf] to the title. That would be r-slurred. We're not using text-based browsers. We know what videos and pdfs are.
Make sure the title contains a gratuitous number or number + adjective. Good clickbait titles are like "Top 10 Ways to do X" or "Don't do these 4 things if you want X"
Otherwise editorialize. Please don't use the original title, unless it is gay or r-slurred, or your shit's all fucked up.
If you're going to post old news (at least 1 year old), please flair it so we can mock you for living under a rock, or don't and we'll mock you anyway.
Please don't post on SN to ask or tell us something. Send it to [email protected] instead.
If your post doesn't get enough traction, try to delete and repost it.
Please don't use SN primarily for promotion. It's ok to post your own stuff occasionally, but the primary use of the site should be for curiosity. If you want to astroturf or advertise, post on news.ycombinator.com instead.
Please solicit upvotes, comments, and submissions. Users are stupid and need to be reminded to vote and interact. Thanks for the gold, kind stranger, upvotes to the left.
In Comments
Be snarky. Don't be kind. Have fun banter; don't be a dork. Please don't use big words like "fulminate". Please sneed at the rest of the community.
Comments should get more enlightened and centrist, not less, as a topic gets more divisive.
If disagreeing, please reply to the argument and call them names. "1 + 1 is 2, not 3" can be improved to "1 + 1 is 3, not 2, mathfaggot"
Please respond to the weakest plausible strawman of what someone says, not a stronger one that's harder to make fun of. Assume that they are bad faith actors.
Eschew jailbait. Paedophiles will be thrown in a wood chipper, as per sitewide rules.
Please post shallow dismissals, especially of other people's work. All press is good press.
Please use Slacker News for political or ideological battle. It tramples weak ideologies.
Please comment on whether someone read an article. If you don't read the article, you are a cute twink.
Please pick the most provocative thing in an article or post to complain about in the thread. Don't nitpick stupid crap.
Please don't be an unfunny chud. Nobody cares about your opinion of X Unrelated Topic in Y Unrelated Thread. If you're the type of loser that belongs on /h/chudrama, we may exile you.
Sockpuppet accounts are encouraged, but please don't farm dramakarma.
Please use uppercase for emphasis.
Please post deranged conspiracy theories about astroturfing, shilling, bots, brigading, foreign agents and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email [email protected] and dang will add you to their spam list.
Please don't complain that a submission is inappropriate. If a story is spam or off-topic, report it and our moderators will probably do nothing about it. Feed egregious comments by replying instead of flagging them like a pussy. Remember: If you flag, you're a cute twink.
Please don't complain about tangential annoyances—things like article or website formats, name collisions, or back-button breakage. That's too boring, even for HN users.
Please seethe about how your posts don't get enough upvotes.
Please don't post comments saying that rdrama is turning into ruqqus. It's a nazi dogwhistle, as old as the hills.
Miscellaneous:
The quality of posts is extremely important to this community. Contributors are encouraged to provide high-quality or funny effortposts and informative or entertaining comments. Please refrain from posting the following:
Boring wingcucked nonsense nobody cares about that belongs in chudrama
Normie shit everyone already knows about
Anything that doesn't gratify one's intellectual laziness
Bimothy-tier posts
Anything that the jannies don't like
Jannies reserve the right to exile baby ducks from this hole at any time.
We reserve the right to exile you for whatever reason we want, even for no reason at all! We also reserve the right to change the guidelines at any time, so be sure to read them at least once a month. We also reserve the right to ignore enforcement of the guidelines at the discretion of the janitorial staff. This hole is a janny playground; participation implies enthusiastic consent to being janny-abused by unstable alcoholic bullies and loser nerds who have nothing better to do than ban you for any reason or no reason whatsoever.
[[[ To any NSA and FBI agents reading my email: please consider ]]]
[[[ whether defending the US Constitution against all enemies, ]]]
[[[ foreign or domestic, requires you to follow Snowden's example. ]]]
I would like a fricking open model from the company with "open" in their name. Also I met a guy at my programming meetup who worked under Sam's mom (dermatology); apparently now she just flies around in private jets lol.
If you haven't run DeepSeek locally, it's literally 2 commands and pretty neat. Some other people in academic settings are using it to help find papers related to what they're working on. Really cool
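For the curious, a minimal sketch of the local route via the Ollama Python client, assuming you've installed Ollama and pulled a distilled model; the `deepseek-r1:8b` tag is an assumption, check what's actually available:

```python
# Sketch: chat with a locally pulled DeepSeek model through the Ollama
# Python client (pip install ollama). The model tag is an assumption.
import ollama

response = ollama.chat(
    model="deepseek-r1:8b",
    messages=[{"role": "user", "content": "Suggest papers related to speculative decoding."}],
)
print(response["message"]["content"])  # the model's reply text
```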
you need 500GB of VRAM or something to actually run it properly tho
You're thinking of the largest models. LM Studio has a ton of smaller models that run on consumer hardware. I have 8GB of RAM running DeepSeek Coder v2 on LM Studio and it just twerks !codecels
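If you'd rather script it than use the chat window, LM Studio can also expose a local OpenAI-compatible server. A rough sketch, assuming the default port 1234 and whatever model identifier you've actually loaded:

```python
# Sketch: querying LM Studio's local OpenAI-compatible server (enable the
# server in the app first). The model name below is an assumption -- use
# the identifier of whatever model you have loaded.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")  # key is ignored locally
resp = client.chat.completions.create(
    model="deepseek-coder-v2-lite-instruct",
    messages=[{"role": "user", "content": "Write FizzBuzz in Python."}],
)
print(resp.choices[0].message.content)
```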
well yeah the large one is the impressive one, there are tons of low vram models you could always run but they are mostly shit
you're shit![:stoningshit: :stoningshit:](https://i.rdrama.net/e/stoningshit.webp)
Why are you so mean to him
I just wanna throw poop
Are LLMs any good with higher end AMD GPUs? You get a ton more VRAM, but they're still not as great with video making software.
!codecels
Nothing's good with higher-end AMD GPUs because AMD is a garbage company run by r-slurs. AMD has never released a product worth buying that didn't go into a CPU socket
If u disagree ur just poor
They're great for gaming!![:marseypixel: :marseypixel:](https://i.rdrama.net/e/marseypixel.webp)
The 7900 GRE performs in between the 4070 Ti and 4070 Ti Super and used to cost $500. Now it's about $600+.![:marseyitsover: :marseyitsover:](https://i.rdrama.net/e/marseyitsover.webp)
buying Ayymd
Not sure it's still an issue, but AMD CUDA support sucked for a long time.
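It has gotten better with ROCm, and since PyTorch's ROCm builds reuse the CUDA device API, the sanity check looks the same as on Nvidia. A sketch, assuming a ROCm build of PyTorch is installed:

```python
# Sketch: checking whether PyTorch sees an AMD GPU. ROCm builds piggyback
# on the torch.cuda namespace, so is_available() returns True on a working
# AMD setup too.
import torch

print(torch.cuda.is_available())  # True if a usable GPU (CUDA or ROCm) is found
print(torch.version.hip)          # version string on ROCm builds, None on CUDA builds
```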
No, AMD is spectacularly bad with tensors, bus speed and memory throughput. AMD CPUs are preferred for ML workloads though.
LLMs are actually memory bound, not compute bound. The H series had much more dramatic improvements in memory throughput and bus throughput than in raw GPU power. Nvidia won't let vendors put more than 24GB on consumer cards because they don't want them to compete with their DC cards.
The current round of model research is mostly on nano models that have much lower precision (so they take much less RAM to execute); instead of just using larger models, you build networks of them. Gemini 2 is very impressive, I'm working on some agentic systems with it right now.
Intel are actually making more progress than AMD on the GPU side for ML. This is one of the reasons Google have custom silicon for tensors: it dramatically reduces how much Nvidia they need to buy. Intel are making a similar play to try and claw back DC CPU share from AMD.
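The memory-bound point is easy to back-of-envelope: single-stream decoding has to stream the full weights through memory for every token, so bandwidth over weight size gives a rough ceiling on tokens per second. Numbers below are illustrative assumptions, not benchmarks:

```python
# Back-of-envelope decode ceiling: tokens/sec ~= bandwidth / weight bytes,
# since each generated token reads (roughly) every weight once.
def decode_ceiling_tok_s(params_b: float, bits_per_weight: float, bandwidth_gb_s: float) -> float:
    weight_gb = params_b * bits_per_weight / 8  # GB streamed per token
    return bandwidth_gb_s / weight_gb

print(decode_ceiling_tok_s(8, 4, 1000))  # 8B @ 4-bit on ~1 TB/s GDDR: ~250 tok/s
print(decode_ceiling_tok_s(8, 4, 50))    # same model on ~50 GB/s desktop DDR5: ~12 tok/s
```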
CUDA is king, I don't even know if all the LLM stuff works on AMD
I can run DeepSeek R1 and other models easily, you just have to get the right one based on your RAM. I have 12GB so I can go up to 11 gigs on the model size. Works great
also do you know why LM Studio is now saying I can fully offload 8-gig models? It didn't say that in previous versions a while back
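The rule of thumb behind picking by RAM: a quantized model's weights take roughly parameters × bits per weight / 8 bytes, plus headroom for the KV cache. A rough fit check (the overhead figure is an assumption and grows with context length):

```python
# Rough fit check for a quantized (e.g. GGUF) model against available memory.
# overhead_gb is an assumed allowance for KV cache and runtime buffers.
def fits_in_ram(params_b: float, bits_per_weight: float, ram_gb: float,
                overhead_gb: float = 1.5) -> bool:
    model_gb = params_b * bits_per_weight / 8
    return model_gb + overhead_gb <= ram_gb

print(fits_in_ram(14, 4, 12))  # 14B @ 4-bit ~7 GB  -> True, fits in 12 GB
print(fits_in_ram(32, 4, 12))  # 32B @ 4-bit ~16 GB -> False
```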
Neeeerd!![:nerd: :nerd:](https://i.rdrama.net/e/nerd.webp)
I eat shits like you for breakfast![:marseyfluffyannoyed: :marseyfluffyannoyed:](https://i.rdrama.net/e/marseyfluffyannoyed.webp)
Oh I bet you eat shit for breakfast![:marseyeatingbeans: :marseyeatingbeans:](https://i.rdrama.net/e/marseyeatingbeans.webp)
Neighbor there are smaller models, i run mine on a pi
Can you like link me to this rPi with 400GB of VRAM?
erm, no?
Not mean, just honest. You're pretty unlikeable.
Thank you for the feedback. I shall use it to improve.
Have a great day, Mr Goyim.
https://www.jeffgeerling.com/blog/2025/how-deepseek-r1-on-raspberry-pi
Literally everything in that article is false. It starts by talking about how bad ChatGPT is for the environment (not true) and how OpenAI received $500 billion of American taxpayer money (SoftBank is a private Japanese company!), then repeats the false $6 million training claim, and the entire article is about how you can run DeepSeek r1 on a Raspberry Pi when in reality it's a finetune of an unrelated months-old version of Qwen with 2% of the parameters of actual r1
Ty
btw I was being facetious and playing along with arseupkongo's comment
He transitioned?
In principle you can run r1 on your machine, but not if you have less than $10,000 worth of GPUs. Anything with less than 671B parameters is in reality just a finetuned version of Qwen or Llama; these models are nowhere near as good as actual r1
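The weight arithmetic makes the point: even aggressively quantized, the full 671B model's weights alone are an order of magnitude past any consumer card (illustrative, weights only, ignoring KV cache):

```python
# Weights-only memory for the full 671B-parameter model at common precisions.
for bits in (16, 8, 4):
    print(f"671B @ {bits}-bit ~= {671 * bits / 8:.0f} GB")
# 16-bit ~= 1342 GB, 8-bit ~= 671 GB, 4-bit ~= 336 GB -- versus 24 GB on a
# top consumer card, hence the multi-GPU price tag.
```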
Imagine the smell
not as bad as you think
tbh I actually do want to go to one
But they're inner city and a bit hard for me to get to after work