I’d like to contribute to the Lemmy community. I’ve been running my own private Linux servers for more than 25 years, for things like email (years ago, before all the spam), file serving, backups, etc. It’s an old, not very powerful computer running Ubuntu Server in a corner of my house. Is it worth running a Lemmy instance on such a machine? I suppose there’d also be the question of how much data is going in and out, and how that would impact my cable internet usage. Thoughts?

  • KonQuesting@lemmy.sdf.org · 2 years ago

    How much headroom do you have left on that? I’m considering starting up a public instance and would love to get an estimate for per-user workload on a federated instance.

    • Slashzero@hakbox.social · 2 years ago

      With just me on the system, CPU is barely ever over 2–3%. Memory usage looks fine. You know what? Let me post some graphs for the past 24 hours; I’ve pretty much been on here nonstop that whole time. Again, I’m the only user on my instance.
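
      For anyone who wants to pull the same numbers without setting up a graphing stack, a quick snapshot of headroom on a stock Ubuntu server looks something like this (the `docker stats` line assumes the usual Docker Compose deployment of Lemmy; skip it if you run the services bare):

      ```shell
      #!/bin/sh
      # Rough headroom check using standard tools.
      free -h        # memory and swap in use vs. available
      uptime         # 1/5/15-minute load averages; compare against core count
      df -h /        # disk space remaining on the root filesystem
      # Per-container usage, if Lemmy runs under Docker Compose:
      # docker stats --no-stream
      ```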

      • KonQuesting@lemmy.sdf.org · 2 years ago

        Awesome, this is super helpful! I’d be using a very similar setup. It might be best to start small, invite a couple people on, and see how that memory scales. I’ll be avoiding any auto-scaling unless it becomes a much bigger project.

        • Slashzero@hakbox.social · 2 years ago

          Well, ideally each service would have its own dedicated resources to begin with. But given that all of the Lemmy services plus Postgres are running on 2 cores with 2 GB of RAM, that’s pretty impressive.
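
          If you do want to give each service dedicated resources on a single box, Compose can cap each container. A minimal sketch, assuming the service names from the standard Lemmy docker-compose setup; the specific limits here are illustrative, not recommendations:

          ```yaml
          services:
            postgres:
              mem_limit: 1g    # cap Postgres memory
              cpus: 1.0
            lemmy:
              mem_limit: 512m  # backend
              cpus: 0.5
            lemmy-ui:
              mem_limit: 256m  # frontend
              cpus: 0.5
          ```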

          Anyway, autoscaling doesn’t necessarily solve scaling issues without a lot of thought and planning. It’s not always as simple as throwing more hardware at the problem, as I’m sure you already know.