I was curious, do you run Stable Diffusion locally? On someone else’s server? What kind of computer do you need to run SD locally?

  • TheForvalaka@lemmy.world · 1 year ago

    I run it locally. I prefer having the most control I can over the install, what extensions I want to use, etc.

    The most important thing for running it, in my opinion, is VRAM. The more the better; get as much as you can.

    • korewa@reddthat.com · 1 year ago

      I run it locally too. I have a 10 GB 3080.

      I haven't had VRAM issues; could you elaborate on your statement?

      I know with local LLaMA I've been limited to 13B models.

      • TheForvalaka@lemmy.world · 1 year ago

        Stable Diffusion loves VRAM. The larger and more complex the images you’re trying to produce, the more it’ll eat.

        My line of thinking is that if you have a slower GPU it’ll generate slower, sure, but if you run out of VRAM it’ll straight up fail and shout at you.

        I’m not an expert in this field though, so grain of salt, YMMV, all that.
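        Some rough numbers back this up. SD denoises in a latent space downscaled 8x from pixel space (with 4 latent channels), and vanilla self-attention memory grows with the square of the number of latent positions, which is why big images blow up VRAM faster than generation time. This is a back-of-the-envelope sketch with made-up helper names, not a real memory profiler, and memory-efficient attention (e.g. xformers) softens the quadratic term in practice:

        ```python
        # Illustrative only: counts tensor elements, not actual bytes allocated.

        def latent_elements(width: int, height: int, channels: int = 4) -> int:
            """Elements in the latent tensor for a width x height image (8x downscale)."""
            return (width // 8) * (height // 8) * channels

        def attention_elements(width: int, height: int) -> int:
            """Vanilla self-attention matrix size: quadratic in latent positions."""
            tokens = (width // 8) * (height // 8)
            return tokens * tokens

        for size in (512, 768, 1024):
            print(size, latent_elements(size, size), attention_elements(size, size))
        ```

        Doubling the side length (512 to 1024) quadruples the latent tensor but grows the naive attention matrix 16x, so VRAM runs out long before patience does.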

  • DrRatso@lemmy.ml · 1 year ago

    Runs fine with a 1660, but that's about 20 sec for 512x512, and upscaling takes upwards of a minute.

    If you want to run it online, I suggest Paperspace's paid tier. It's not too big of a hassle to set up, but you might have to wait a couple of minutes spamming refresh to get a better GPU; an instance can run for 6 hours before it's auto-shutdown. Generally 2-4 sec for 512 and 10-20 for 1024. Also, you'll have to either download models every time, settle for only two or three models at a time, or fork over a couple extra bucks for permanent storage, as the base paid tier only includes 15 GB.

  • voluntaryexilecat@lemmy.dbzer0.com · 1 year ago

    locally, always.

    I even got it to run without a GPU on an old i5 CPU with 8 GB of system RAM (not VRAM) paired with 32 GB of swap. SD 1.5 takes 4-10 minutes per image, SDXL about 2 hours. But it works. With a GPU it's between 7 and 90 seconds per image, depending on model and settings.

  • BitSound@lemmy.world · 1 year ago

    I run it locally with an 11 GB 1080 Ti. It's just exposed on the local network, so I still use the main SD website if I'm out and about somewhere.