Marsey giving advice to Marsey (+more AI slop)

https://i.rdrama.net/images/1716506724347168.webp

https://i.rdrama.net/images/17165067244932284.webp

https://i.rdrama.net/images/17165067246736412.webp

https://i.rdrama.net/images/17165067248130205.webp

https://i.rdrama.net/images/17165067249263797.webp


@TouchFluffyTails

What are the limiting factors on image generation?

Is it impossible to generate a high-quality image at all on, say, a current Pixel with NPUs, or is it just super fricking slow vs. an actual Nvidia GPU?
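For illustration, nothing about non-Nvidia hardware makes it impossible, just slow. A minimal sketch (assuming the Hugging Face `diffusers` library; the model id and prompt are placeholders) where the exact same pipeline runs on CUDA, Apple's MPS, or plain CPU, and only the wall-clock time changes:

```python
import torch
from diffusers import StableDiffusionPipeline

# Pick the best available backend: CUDA, then Apple Metal (MPS), then CPU.
if torch.cuda.is_available():
    device, dtype = "cuda", torch.float16
elif torch.backends.mps.is_available():
    device, dtype = "mps", torch.float16
else:
    device, dtype = "cpu", torch.float32  # fp16 is poorly supported on CPU

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder model id
    torch_dtype=dtype,
).to(device)

# Identical call on every backend; only how long it takes differs.
image = pipe("a cat giving advice to another cat", num_inference_steps=25).images[0]
image.save("out.png")
```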

I've been playing with LLM chat bots on my Mac Studio; it seems like RAM is the biggest limiting factor, then compute, but I guess GGUFs let you use disk storage too?
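On the GGUF point: llama.cpp memory-maps the model file by default, so weights are paged in from disk on demand instead of being copied into RAM up front (the OS evicts and re-reads pages as needed, which is also why it crawls once the model outgrows RAM). A minimal sketch via the `llama-cpp-python` bindings, with a placeholder model path:

```python
from llama_cpp import Llama  # pip install llama-cpp-python

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,       # context window; the KV cache grows with this
    n_gpu_layers=-1,  # offload every layer to Metal/CUDA if available
    use_mmap=True,    # the default: map the file, don't eagerly load it
)

out = llm("Q: What is the biggest bottleneck for local LLMs? A:", max_tokens=64)
print(out["choices"][0]["text"])
```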


>exposing yourself as a RAMlet

:marseyyikes:


I have 64GB :marseyconfused: my friends said that was too much, but I couldn't switch from my PC with 64GB to a Mac with less - especially with it being soldered and shared with the GPU.

Sharing it with the GPU/NPUs is what makes it awesome at this AI stuff though; even a 4090 only has 24GB of VRAM.
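Back-of-the-envelope on what actually fits where (the bits-per-weight figures for GGUF quants are approximations, and the fraction of unified memory macOS lets Metal use is an assumption, since it's capped below 100%):

```python
# Rough weight footprint of a dense model at a given quantization.
def weight_gib(params_b: float, bits_per_weight: float) -> float:
    return params_b * 1e9 * bits_per_weight / 8 / 2**30

# Approximate bits/weight for common GGUF quant types (assumption:
# exact values vary slightly between tensor types).
QUANTS = {"F16": 16.0, "Q8_0": 8.5, "Q4_K_M": 4.85}

GPU_24GB = 24.0          # e.g. an RTX 4090
MAC_64GB = 64.0 * 0.75   # assumption: Metal gets ~75% of unified memory

for params in (7, 13, 70):
    for name, bits in QUANTS.items():
        gib = weight_gib(params, bits)
        print(f"{params:>3}B {name:<7} ~{gib:5.1f} GiB  "
              f"fits 24GB GPU: {gib < GPU_24GB}  fits 64GB Mac: {gib < MAC_64GB}")
```

At roughly 4.85 bits/weight, a 70B model comes out near 40 GiB: too big for a 24GB card, but comfortably inside a 64GB Mac's GPU-visible memory, which is the poster's point.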


It's VRAM, not RAM, that limits you. Anything besides a GPU is going to be pretty much unusable, never mind slow, as soon as you go above the very worst models.
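This is easy to measure rather than argue about. A rough harness (same `llama-cpp-python` bindings as above, placeholder model path, no numbers claimed): load the same GGUF once with zero layers offloaded and once with all of them, and compare tokens per second:

```python
import time
from llama_cpp import Llama

MODEL = "./mistral-7b-instruct.Q4_K_M.gguf"  # placeholder path

def tokens_per_sec(n_gpu_layers: int) -> float:
    llm = Llama(model_path=MODEL, n_gpu_layers=n_gpu_layers, verbose=False)
    t0 = time.perf_counter()
    out = llm("Once upon a time", max_tokens=128)
    return out["usage"]["completion_tokens"] / (time.perf_counter() - t0)

print(f"CPU only (n_gpu_layers=0): {tokens_per_sec(0):.1f} tok/s")
print(f"Full offload (-1):         {tokens_per_sec(-1):.1f} tok/s")
```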



