I’m planning to migrate my email to a different provider, but they don’t offer much storage, so I was wondering what people would recommend for this kind of setup. Basically, I’d like to use the new provider as something like a relay: it would only store an email or two at a time, and a self-hosted solution would grab the emails from the provider, store them locally, and delete them off the provider, so the provider never holds my entire email history. It should also keep a copy of my sent mail somewhere. Ideally I’d want to be able to set this up with a mail client like NextCloud’s.

  • Admiral Patrick@dubvee.org · 2 years ago

    That sounds like POP3.

    Unlike IMAP, where your inbox lives on the mail server, POP stores the messages only until you download them.

    So you should be able to look for a provider that allows you to connect with POP3 and set your client up to fetch them periodically.
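    As a rough illustration of that fetch-then-delete cycle: curl can speak POP3 directly, so a periodic job could look something like the sketch below. All hostnames, accounts, and paths here are placeholders, and the script is a dry run that only prints the commands it would execute — drop the echo wrappers to make it real.

```shell
#!/bin/sh
# Dry-run sketch of a POP3 fetch-and-delete cycle using curl.
# POP_HOST, POP_USER, and MAILDIR are hypothetical placeholders.
POP_HOST="pop.example.org"
POP_USER="you@example.org"
MAILDIR="/var/mail/archive"

fetch_cmds() {
    # curl retrieves a message when its number is in the URL path,
    # and -Q sends a raw POP3 command (here DELE) to remove it server-side.
    echo "curl -s --user ${POP_USER} pop3s://${POP_HOST}/1 -o ${MAILDIR}/msg1.eml"
    echo "curl -s --user ${POP_USER} pop3s://${POP_HOST} -Q 'DELE 1'"
}

fetch_cmds
```

    Run from cron every few minutes, this keeps the provider-side mailbox nearly empty while your local archive grows.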

  • TCB13@lemmy.world · 2 years ago

    The good old fetchmail is probably what you’re looking for. Run your local/self-hosted email server, then use fetchmail as described here to fetch the email from the provider and deliver it into the local accounts. There’s also getmail (does the same, but written in Python), guide here, or go-getmail.
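    For instance, a minimal ~/.fetchmailrc along those lines might look like this — hostnames, accounts, and the password are placeholders; check the fetchmail man page for your provider’s actual settings:

```
# ~/.fetchmailrc — sketch with placeholder hostnames and accounts
set daemon 300                        # poll every 5 minutes
poll pop.provider.example protocol pop3
    user "you@provider.example" password "app-password"
    is "localuser" here
    ssl
    no keep                           # delete mail from the provider once fetched
```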

    Alternatively, and probably way better:

    Postfix has a feature called the ETRN service, documented here. It can be used to queue incoming email and deliver it to another server when a connection is available:

    The SMTP ETRN command was designed for sites that have intermittent Internet connectivity. With ETRN, a site can tell the mail server of its provider to “Please deliver all my mail now”. The SMTP server searches the queue for mail to the customer, and delivers that mail by connecting to the customer’s SMTP server.

    From what I know about it you might be able to:

    1. Configure just an SMTP/Postfix server on the cloud provider;
    2. Configure a full IMAP/SMTP server on the self-hosted / local machine;
    3. Configure the “cloud” Postfix to deliver all incoming email into your local / self-hosted Postfix using relay_domains here and here;
    4. Set up ETRN in the “cloud” provider to deal with your local server being offline / unavailable;
    5. On the local machine create a simple bash script + systemd timer / cron like this:
    { sleep 1; echo "ehlo selfhosted.example.org"; sleep 1; echo "etrn your-domain.example.org"; sleep 1; echo "quit"; } | nc remote-cloud-server.example.org 25
    
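    For step 3, the “cloud” Postfix config could look roughly like this — the domains and hostnames are placeholders, and the exact parameters are covered in the Postfix documentation linked above:

```
# /etc/postfix/main.cf on the "cloud" relay — sketch with placeholder domains
relay_domains  = your-domain.example.org
transport_maps = hash:/etc/postfix/transport

# keep undeliverable mail queued while the self-hosted server is offline
maximal_queue_lifetime = 7d
```

```
# /etc/postfix/transport (run "postmap /etc/postfix/transport" after editing)
your-domain.example.org   smtp:[selfhosted.example.org]:25
```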

    This command will connect to the cloud server and ask it to deliver all queued email to the self-hosted instance. It can be set up to run every x minutes or, if you want to get fancy, when the network comes up, using the network-online.target target as described here. Note that the script isn’t strictly necessary; it just guarantees that if the connection between the servers goes down, you’ll get all the queued email delivered right away once it comes back.
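    If you go the systemd route, a minimal service/timer pair might look like this — the unit names and the script path are assumptions, not anything prescribed by Postfix:

```
# /etc/systemd/system/etrn-flush.service — sketch; names and paths are placeholders
[Unit]
Description=Ask the cloud relay to flush queued mail via ETRN
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
ExecStart=/usr/local/bin/etrn-flush.sh
```

```
# /etc/systemd/system/etrn-flush.timer
[Unit]
Description=Run the ETRN flush periodically

[Timer]
OnBootSec=2min
OnUnitActiveSec=15min

[Install]
WantedBy=timers.target
```

    Enable it with systemctl enable --now etrn-flush.timer.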

    The following links may also be of interest so your local / self-hosted email server can send email:

    Now a note about NextCloud: their webmail is the worst possible solution; I wrote a very detailed description of the issues here. Do yourself a favor and use Roundcube.

    • jcg@halubilo.socialOP · 2 years ago

      Wow, thanks for the very detailed info! I’ll look into all of these. I read your post about the NC webmail and yeah, I might just go for Roundcube lol. I’ve had performance issues with the files part of NC, but it just works better for me than other solutions, so I figured I may as well tack it on; it seems I’ll have more performance/resource concerns if I do, though.

      • TCB13@lemmy.world · 2 years ago

        You’re welcome. Well, Syncthing is great; setup is easy and it does get the job done. My favorite way of running it is to have a central “server” (like your NAS or so) and have all devices connect to it, but NOT to each other. This way your NAS acts as a single source of truth for the files, and conflicts are close to none. Another advantage of running it like that is that you can plug other things into the file storage, like WebDAV, SMB, or Filebrowser, to support accessing files from any browser and from iOS devices.