I have a confession to make.

I’ve been working in IT for about six or seven years now, and I’ve been selfhosting for about five. In all that time, at work or at home, I’ve never bothered with backups. I know they are essential for every IT network, but I never cared to learn them. A few copies of some hard disks here and there is honestly all I know. I’ve tried a few times, but I often found the learning curve too steep, or the command line gave me errors I didn’t want to troubleshoot.

It is time to make a change. I’m looking for an easy-to-learn backup solution for my home network. I’m running a Proxmox server with about 8 VMs on it, including a NAS full of photos and a media server with lots of movies and shows. It has 2x 8TB disks in a RAID 1 set. Besides that, I’ve got two Windows laptops and a Linux desktop.

What could be a good backup solution that is also easy to learn?

I’ve tried Borg, but I couldn’t figure out all the command-line options. I’m leaning towards Proxmox Backup Server, but I don’t know how well it works with anything other than my Proxmox server. I’ve also thought about Veeam, since I encounter it sometimes at work, but the free version supports only up to 10 devices.

My plan now is to set up two backup servers: one onsite, running on something like a Raspberry Pi or an HP EliteDesk, and an HP MicroServer N40L that I can store offsite.

What could be the perfect backup solution for me?

EDIT:

After a few replies I feel the need to mention that I’m looking for a free, centrally managed option. Thanks!

  • ikidd@lemmy.world · 1 year ago

    Proxmox Backup Server is free and absolutely essential in a PVE system. You can restore entire VMs, volumes, folders, and files. You can keep many versions thanks to its fantastic dedup system, and you can mirror the backups to USB drives or other PBS remotes. If you’re using a ZFS filesystem on your PVE storage, then every backup is snapshotted at a point in time to prevent database issues on restore.
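    For what it’s worth, plain Linux hosts (not just PVE guests) can also push backups to PBS with the proxmox-backup-client CLI. A hedged sketch; the datastore, user, and host are made-up placeholders, and RUN="echo" turns it into a dry run that only prints the commands:

```shell
#!/bin/sh
# Dry-run sketch of backing up a plain Linux host to PBS.
# The repository string (user@realm@host:datastore) is a placeholder.
set -eu
RUN="echo"

export PBS_REPOSITORY="backup@pbs@192.168.1.50:datastore1"

# Archive /etc and /home as pxar archives on the PBS datastore
$RUN proxmox-backup-client backup etc.pxar:/etc home.pxar:/home

# List existing snapshots for this host
$RUN proxmox-backup-client snapshots

# Restore one archive from a snapshot to a local directory
$RUN proxmox-backup-client restore "host/$(uname -n)/2024-01-01T00:00:00Z" etc.pxar /tmp/restore

echo done > /tmp/pbs-sketch.marker
```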

    • atek@lemm.ee (OP) · 1 year ago

      I’m going to try that for my servers! What do you use for your files (music, photos and such)?

      • ikidd@lemmy.world · 1 year ago

        I use Nextcloud in Docker. I have a Debian LXC on Proxmox that runs my Docker containers, and since the backend storage is ZFS, I can snapshot it before any major upgrade to the OS or the containers. I have restored a whole LXC from PBS when something like my mail server got borked and I’d forgotten to snapshot.

  • doeknius_gloek@feddit.de · 1 year ago

    I’ve been working in IT for about 6/7 years now and I’ve been selfhosting for about 5. And in all this time, in my work environment or at home, I’ve never bothered about backups.

    That really is quite a confession to make, especially in a professional context. But good for you for finally coming around!

    I can’t really recommend a solution with a GUI, but I can tell you a bit about how I back up my homelab. Like you, I have a Proxmox cluster with several VMs and a NAS. I’ve mounted some storage from my NAS into Proxmox via NFS, and that’s where I let Proxmox store backups of all VMs.

    On my NAS I use restic to back up to two targets: an offsite NAS that holds full backups, plus Wasabi S3 for the stuff I really don’t want to lose. I like restic a lot and found it rather easy to use (also coming from borg/borgmatic). It supports many different storage backends and multithreading (looking at you, Borg).
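    As a rough script, the two-target restic flow above might look like this. The hostnames, bucket, and paths are invented placeholders, and RUN="echo" prints each command instead of executing it (remove it for real use):

```shell
#!/bin/sh
# Dry-run sketch of restic backing up to two targets with a retention policy.
set -eu
RUN="echo"

# restic reads the repo password from a file (path is a placeholder)
export RESTIC_PASSWORD_FILE="/root/.restic-password"

# Target 1: full backups to the offsite NAS over SFTP
$RUN restic -r sftp:backup@offsite-nas:/srv/restic backup /mnt/tank

# Target 2: only the irreplaceable data to Wasabi S3
# (credentials would come from AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY)
$RUN restic -r s3:https://s3.wasabisys.com/my-bucket backup /mnt/tank/photos

# Retention: keep 7 daily, 4 weekly, 6 monthly snapshots, then reclaim space
$RUN restic -r sftp:backup@offsite-nas:/srv/restic forget \
    --keep-daily 7 --keep-weekly 4 --keep-monthly 6 --prune

echo done > /tmp/restic-sketch.marker
```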

    I run TrueNAS, so I make use of ZFS Snapshots too. This way I have multiple layers of defense against data loss with varying restore times for different scenarios.

  • momsi@lemmy.world · 1 year ago

    Maybe have a look at UrBackup. GUI, “centrally managed”, free…

    And please, as mentioned in another comment, have a look at borgmatic. It makes Borg really easy to use and has some super handy features: backing up to multiple locations is just one extra line in the config, and I just love the Healthchecks integration. Set and forget, until either Healthchecks notifies you of a problem or you really need to recover data.
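    For reference, a minimal borgmatic config along those lines might look like this. This is an illustrative sketch, not a drop-in file: option names follow recent borgmatic releases (older versions nest them under location:/retention:/hooks: sections), and the paths, labels, and ping URL are placeholders.

```yaml
# /etc/borgmatic/config.yaml -- illustrative sketch, not a drop-in file
source_directories:
    - /home
    - /etc

# "multiple locations by just adding a line": one repository entry per target
repositories:
    - path: /mnt/backup/borg
      label: local
    - path: ssh://user@offsite-host/./borg
      label: offsite

# retention policy
keep_daily: 7
keep_weekly: 4
keep_monthly: 6

# the Healthchecks integration mentioned above (placeholder URL)
healthchecks:
    ping_url: https://hc-ping.com/your-uuid-here
```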

    • atek@lemm.ee (OP) · 1 year ago

      I’m gonna look into that! Borgmatic looks a lot easier than Borg, but that CLI still scares me. I like working with Linux commands, but for something new like backups I’d rather click through a GUI to set everything up.

      • momsi@lemmy.world · 1 year ago

        When I got started I preferred GUI apps too, but the more you use CLI tools, the more you come to appreciate them. These days I find them better: they’re more precise and tend to push you towards using them correctly. They’re also mostly well documented and even offer on-the-fly help via -h flags and the like. The Getting Started page of borgmatic is really well written, too. Just play around with it ;)

  • lemmyvore@feddit.nl · 1 year ago

    I use the daily/weekly/monthly pattern for machine backups:

    • Run an rsync job once a day that copies whatever you deem important from the target machine to a daily backup dir.
    • Once a week, sync the daily dir to a weekly dir.
    • Once a month, take a snapshot of the weekly dir as a tarball.
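    The three steps above can be sketched as a runnable script. This demo uses a throwaway tree under /tmp (all paths are placeholders) and falls back to cp -a when rsync isn’t installed:

```shell
#!/bin/sh
# Demo of the daily/weekly/monthly rotation using throwaway dirs under /tmp.
set -eu

SRC=/tmp/rota-demo/src
BACKUP_ROOT=/tmp/rota-demo/backups
mkdir -p "$SRC" "$BACKUP_ROOT/daily" "$BACKUP_ROOT/weekly" "$BACKUP_ROOT/monthly"
echo "important data" > "$SRC/notes.txt"

# Daily: mirror the source into the daily dir
if command -v rsync >/dev/null 2>&1; then
    rsync -a --delete "$SRC/" "$BACKUP_ROOT/daily/"
else
    # cp -a stands in for rsync -a on systems without rsync
    rm -rf "$BACKUP_ROOT/daily"
    cp -a "$SRC" "$BACKUP_ROOT/daily"
fi

# Weekly: sync the daily dir to the weekly dir
cp -a "$BACKUP_ROOT/daily/." "$BACKUP_ROOT/weekly/"

# Monthly: snapshot the weekly dir as a tarball
tar -czf "$BACKUP_ROOT/monthly/$(date +%Y-%m).tar.gz" -C "$BACKUP_ROOT/weekly" .
```

    In practice each step would be its own cron entry (daily, weekly, monthly) pointing at real directories.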

    In addition to that I use Pika Backup (it’s a very user friendly GUI for Borg) to make incremental backups of the monthly dir to a couple of external HDDs.

    • spez_@lemmy.world · 1 year ago

      I use restic, for the incremental backups and deduplication. I don’t think tarballs give you either of those.

      • lemmyvore@feddit.nl · 1 year ago

        If you use a backup solution that does incremental backups and deduplication, you can probably replace the monthly tarball with a monthly deduplicated backup.

        Tarballs are still useful for one-off, long-term backups, for example archiving to optical media (burning Blu-rays).

  • withtheband@lemmy.world · 3 months ago

    You can use Syncthing to get files from all of your devices onto your central server, and then use something like FreeFileSync to back up the entire folder structure to another drive.

  • Decronym@lemmy.decronym.xyz (bot) · 3 months ago

    Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

    DNS: Domain Name Service/System
    ESXi: VMware virtual machine hypervisor
    LXC: Linux Containers
    NAS: Network-Attached Storage
    RAID: Redundant Array of Independent Disks for mass storage
    Unifi: Ubiquiti WiFi hardware brand

    6 acronyms in this thread; the most compressed thread commented on today has 7 acronyms.

    [Thread #152 for this sub, first seen 20th Sep 2023, 15:35]

  • rentar42@kbin.social · 1 year ago

    Good on you for finally getting into it. I switched to something systematic only very recently myself (previously it was “copy important stuff to an external HDD whenever I think of it”).

    The one thing I learned (luckily the easy-ish way) is: test your backups. Yes, it’s annoying, but since you rarely (ideally never!) need to restore a backup, it’s incredibly easy to assume everything is working when in fact it either never worked properly or silently started failing at some point.

    A backup solution that has never been tested via a full restore of at least something has to be assumed to be broken.

    Which reminds me: I have to set up the cron job to periodically test a percentage of all backed up data.

    I decided to use Kopia, btw, but can’t really say if that’s well-suited for your goals.

  • hoodlem@hoodlem.me · 1 year ago

    I back up Proxmox VMs and templates onto my NAS, and from there into the cloud. If you don’t want the cloud, maybe auto-backup to an external drive and keep it somewhere safe (out of range of a possible disaster at your home).

  • Yote.zip@pawb.social · 1 year ago

    Have you tried Vorta (based on Borg) or Kopia yet? I’m not sure what your exact requirements are but those two have similar featuresets and they’re very easy to use.

  • Appoxo@lemmy.dbzer0.com · 1 year ago

    If you’re not afraid of Windows: Veeam Backup & Replication (Community Edition).

    It has a nice GUI and works very well. The GUI is well explained, and there are knowledge bases for Hyper-V, VMware, and some others. The agent can be deployed manually, and Linux agents can write to a repository. I don’t think Proxmox is a supported hypervisor, though.

    The Community Edition is free for, I think, up to 10 workloads.

    Maybe take a look.

    You could also try to get your hands on an NFR license, which unlocks the premium features for a 1-year runtime.

    Edit: I use the Windows agent for my personal rig and back up via SMB. We use it at work, so I’m partially biased towards that solution.

    • jubilationtcornpone@sh.itjust.works · 1 year ago

      I’ll second Veeam. It only runs on Windows, but as far as backup and recovery software goes, it’s the gold standard, and the competition isn’t even close.

      • Oisteink@feddit.nl · 1 year ago

        Have you ever had it back up a Proxmox cluster? I’d say it’s suboptimal advice to go with Veeam for this use case.

        Yeah, I use Veeam for backups at work, but we run VMware and some MS servers, and use rsync or Bacula for our Linux boxes. It’s a great product.

        • warmaster@lemmy.world · 1 year ago

          What would you recommend for me?

          I have a homelab with:

          1 laptop on Windows

          3 desktop PCs (2 on Linux, 1 on Windows)

          1 server running Proxmox VE

          1 old 2-bay Synology NAS.

  • Rootiest@lemm.ee · 1 year ago

    Kopia is my favorite by far!

    It’s super fast and has tons of great features including cutting-edge encryption and several compression options.

    It has a GUI and is cross-platform.

    It can do both cloud and local/network backups.

    That includes locally mounted disks, SFTP, rsync, or any network share etc. accessible from your machine, as well as many cloud options.

    The de-duplication stuff is also killer. If you upload the same file (or chunk of data) in different folders or even from different systems it will map them to the same backup storage potentially saving you a ton of storage space.

    It also uses a rolling hash system so if you modify just a handful of megabytes from a 25GB file many times, only the megabytes of changes will need to be backed up to store the version history. You do not need to store 25GB every time you modify that file.

    There’s a ton of other goodies as well!

    And it’s all FOSS!

    I use it to back up to an external hard drive, a NAS, and Amazon S3. You can configure multiple repositories like that and have them all run at the same time (subject to their individual scheduling policies, of course).
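    If you’re curious how the multiple-repository setup looks on the CLI side: Kopia ties one connected repository to one config file, so each destination gets its own --config-file. The paths and bucket name below are placeholders, and RUN="echo" keeps this a dry run that only prints the commands:

```shell
#!/bin/sh
# Dry-run sketch of running several Kopia repositories side by side.
set -eu
RUN="echo"

# One repository per destination, each with its own config file
$RUN kopia repository create filesystem --path /mnt/external/kopia \
    --config-file "$HOME/.config/kopia/external.config"
$RUN kopia repository create s3 --bucket my-backups \
    --config-file "$HOME/.config/kopia/s3.config"

# Snapshot the same folder into each repository
$RUN kopia snapshot create "$HOME/photos" --config-file "$HOME/.config/kopia/external.config"
$RUN kopia snapshot create "$HOME/photos" --config-file "$HOME/.config/kopia/s3.config"

echo done > /tmp/kopia-sketch.marker
```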

  • Mio@feddit.nu · 1 year ago

    Use Veeam. If you hit the device limit, just configure it to send to an SMB share; then you need no license.

  • fear025@lemm.ee · 1 year ago

    I like BackupPC, it’ll do what you want but it may be more challenging to learn than some of these other options.

  • rambos@lemm.ee · 1 year ago

    I use the Kopia CLI, but the easiest one I came across with a simple GUI is Duplicati.

    • pory@lemmy.world · 1 year ago

      KopiaUI is fantastic and easy to use. I used to run Duplicati, but it had database issues that kept coming up, forcing a sixty-hour rebuild every couple of weeks, and I wasn’t happy with the idea of my PC potentially failing during one of those six days per month.

      • rambos@lemm.ee · 1 year ago

        Thanks. I had read about the Duplicati issues, and that’s why I moved to Kopia. Duplicati is still doing smaller backups with no issues, though.

        I know about KopiaUI on the desktop, but can it run in server mode? Or do you connect to the server using the desktop app? I just start the Kopia web server when I’m testing backups, but that’s not the easiest way, I’d say.

        • pory@lemmy.world · 1 year ago

          I don’t run anything on the server because I don’t need to. I have my home server mounted as a network drive in Windows, so I just point Kopia’s database at a folder there. It’s stored as an encrypted backup, and I’ve got Kopia’s config (and the encryption key) backed up in a few places, so if the worst-case scenario happens to my PC, I’ll just reinstall Kopia on a fresh Windows install and HDD, restore the config from the backup, then restore the backup.

          I also keep a backup target on an older 8TB drive that I leave with a friend and update whenever he visits, for extra safety. If my whole apartment with my PC and server burns down, I’ll at least have an outdated snapshot and lose only a month or so instead of decades.

          • rambos@lemm.ee · 1 year ago

            Thanks for explaining, it makes sense. I’ll just leave it in the CLI and start the web server when needed, so I can use a cron job when the PC is off. It’s only a few commands away from a nice GUI similar to the desktop version.

  • Monkey With A Shell@lemmy.socdojo.com · 1 year ago

    It has been a while since I used Proxmox, but I seem to recall it having a built-in option to export the VMs to an external host on a periodic cadence? That would solve the configured-system backup issue, if it still exists. More directly, my preferred method is to keep the payload objects (photos/files) on a separate dedicated storage NAS with RAID and automatic ZFS dataset snapshots, to accommodate both a disk failing and the ‘oh shit, I meant to delete the file, not the whole folder!’ type of loss. For a NAS in my case I use XigmaNAS, the predecessor to CoreNAS, fka FreeNAS, largely because it doesn’t try to be too fancy: it just serves the drives and provides some ancillary services around that job.
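    The ZFS snapshot safety net can be sketched like this. The pool/dataset and snapshot names are placeholders, and RUN="echo" prints the commands instead of running them:

```shell
#!/bin/sh
# Dry-run sketch of the ZFS snapshot safety net for 'oops' recovery.
set -eu
RUN="echo"

# Point-in-time snapshot before a risky change (instant, initially zero-size)
$RUN zfs snapshot tank/photos@before-upgrade

# List what can be rolled back to
$RUN zfs list -t snapshot -r tank/photos

# Undo the 'deleted the whole folder' mistake
$RUN zfs rollback tank/photos@before-upgrade

echo done > /tmp/zfs-sketch.marker
```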

    So, long version short: what exactly are you trying to back up? The pictures, or the service hosting them?

    • atek@lemm.ee (OP) · 1 year ago

      I guess I just want to make sure the pictures are safe. Beyond that I’ll back up my /home/user folder, but otherwise it’s not that hard to rebuild my VMs.

      • Monkey With A Shell@lemmy.socdojo.com · 1 year ago

        The simplest way is to keep them on a dedicated storage system that you don’t even need to access directly for the most part. If there’s one thing I’ve learned over many years of playing with servers, it’s that the end user/admin is more of a hazard to your data than a system failure ever could be. A RAID 1 will automatically protect you if one of the hard drives happens to die, without you thinking about it, but it will just as quickly delete everything on both drives if you run the wrong command.

        My nightmare example, from personal experience, when installing a new pair of drives with the intent to migrate to them:

        • Install drive ‘b’; rsync -a drive ‘a’ to ‘b’.
        • Wipe ‘a’ for storage/disposal; install the new drive ‘a’ in the original slot of ‘a’.
        • Start the second rsync, intended to copy ‘b’ to ‘a’, but forget to change the drives and instead sync the new blank ‘a’ over ‘b’, which holds the only copy of your data…

        Fortunately I managed to get most everything back with some data recovery tools, but the second after pressing Enter, watching it all go away, was gut-wrenching. Since then I’ve become a lot more aware of keeping a certain level of protection against human error.