https://civitai.com/models/363402/rabbit-hole-hatsune-mikupure-pure
idk where to put it but here you go if you want to make images of the bunny
good, finally something not early acc-
why would you make the LoRA larger than the dataset it's representing (especially in latent space, there's no way you have 270 MB of latents)?
the only thing that does is make it less compatible with other LoRAs, which may have problems of their own
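For a rough sense of why dim drives file size: each adapted layer stores two low-rank matrices, so the added parameters per layer come to roughly rank × (d_in + d_out), and total size scales linearly with rank. A minimal sketch of that arithmetic (the layer shapes below are made up for illustration, not pulled from any real SDXL LoRA):

```python
# Rough LoRA size estimate: parameter count scales linearly with rank.
# Layer shapes are illustrative placeholders, not real SDXL dimensions.

def lora_params(rank: int, layers: list[tuple[int, int]]) -> int:
    """Each adapted layer adds a (rank x d_in) down matrix and a (d_out x rank) up matrix."""
    return sum(rank * (d_in + d_out) for d_in, d_out in layers)

# Hypothetical set of attention projections, as (d_in, d_out) pairs.
layers = [(1280, 1280)] * 40 + [(640, 640)] * 40

for rank in (32, 128, 256):
    params = lora_params(rank, layers)
    mb = params * 2 / 1024**2  # fp16 = 2 bytes per parameter
    print(f"rank {rank:3d}: ~{params / 1e6:.1f}M params, ~{mb:.0f} MB in fp16")
```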
Because memory is cheap and there's no reason to bother making it smaller; I train at 128 dim.
SDXL LoRAs get big fast if you want quality.
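The commenter is presumably using a dedicated trainer such as kohya's sd-scripts; purely as an illustration of what the dim/alpha knobs correspond to, here is how the same rank-128 setting would be declared with Hugging Face peft (the target module list is an assumption based on the usual SDXL attention projections, not the exact set their trainer adapts):

```python
# Illustrative only: declaring a rank-128 LoRA with Hugging Face peft.
# target_modules is an assumption (typical attention projections), not
# necessarily what the commenter's trainer actually adapts.
from peft import LoraConfig

lora_config = LoraConfig(
    r=128,            # "network dim" / rank
    lora_alpha=64,    # scaling factor; alpha / r is the effective multiplier
    init_lora_weights="gaussian",
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],
)

# With diffusers, this config would be attached to the SDXL UNet via
# unet.add_adapter(lora_config) before training.
```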
Preventing overfitting is a reason; overfitting makes it behave like crap on other models or alongside other LoRAs.
This is a relatively simple character LoRA, not one of those LoRAs with hundreds or thousands of styles inside.
On SDXL I've found 128 dim to be a sweet spot; 256 or higher is where issues like that start (SD 1.5 worked best at 256).
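If you want to check what dim an existing LoRA was actually trained at, the rank can be read straight off the lora_down weight shapes in the safetensors file. A small sketch (the file path is a placeholder, and the key naming assumes the common kohya-style "*lora_down.weight" convention):

```python
# Inspect the rank of an existing LoRA by looking at its low-rank weight shapes.
# Path is a placeholder; key naming assumes the kohya-style "*lora_down.weight" convention.
from collections import Counter
from safetensors.torch import load_file

state_dict = load_file("character_lora.safetensors")

ranks = Counter(
    tensor.shape[0]                      # lora_down weights are (rank, d_in, ...)
    for key, tensor in state_dict.items()
    if key.endswith("lora_down.weight") and tensor.ndim >= 2
)

print("ranks found across modules:", dict(ranks))
```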