zlacker

[return to "Z-Image: Powerful and highly efficient image generation model with 6B parameters"]
1. paweld+Uyk 2025-12-06 17:10:41
>>doener+(OP)
Did anyone test it on 5090? I saw some 30xx reports and it seemed very fast
2. Wowfun+eJk 2025-12-06 18:35:10
>>paweld+Uyk
Even on my 4080 it's extremely fast, it takes ~15 seconds per image.
3. accrua+akl 2025-12-06 23:56:10
>>Wowfun+eJk
Did you use PyTorch Native or Diffusers Inference? I couldn't get the former working yet, so I used Diffusers, but it's terribly slow on my 4080 (~4 min/image). Trying again with PyTorch now — apparently Diffusers is expected to be slow.
4. Wowfun+Nkl 2025-12-07 00:01:47
>>accrua+akl
Uh, not sure? I downloaded the portable build of ComfyUI and ran the CUDA-specific batch file that comes with it.

(I'm not used to using Windows and I don't know how to do anything complicated on that OS. Unfortunately, the computer with the big GPU also runs Windows.)
