I was starting an install of Debian the other day, and it suggested 25 GB as the root partition (including /usr, but not /var, /home, or /tmp). I had to laugh. My server has a 50 GB partition for that purpose and it’s around half full.
I aborted the installation. Might try again later today. (Switching this machine from Kubuntu, using a new drive, so it’s not critical that it be done at a certain time.)
I have to admit, I have only the barest understanding of this flatpak, snap, Docker, etc. business. I’ve been using Linux since the late 90s and missed this development. I haven’t been following what’s happening on the development side, I suppose - in part because there isn’t much need to, because Linux has gotten so good.
But while I’m a power user (hey, I used Slackware until about 2015), I’ve found that I much prefer not having to spend hours and hours administering my machines every few weeks or months.
Sorry for the long comment. But this has been bugging me for a long time and you triggered me. No need to try to answer my questions if you’re not feeling it, I’m just dumping. You can stop reading here, if you like.
I’ve used a few AppImages for limited cases - BalenaEtcher to burn HomeAssistant onto SD cards for my Raspberry Pi, and I think the scanning software I use is also an AppImage. The idea of bundling the libraries and binaries together is a good one for certain cases, such as software I don’t use regularly. BalenaEtcher strikes me as a perfect use case for AppImage, because I don’t want to spend time installing it, keeping it up to date, and uninstalling it when I’m only going to use it once or twice. I never even move it out of my Downloads directory - just download, run, and delete.
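The whole workflow is just this (the filename is a placeholder for whatever the release page actually serves; the `touch` is a stand-in for the download so the sketch runs as-is):

```shell
# Download-run-delete AppImage workflow sketch.
mkdir -p ~/Downloads
cd ~/Downloads
f=balenaEtcher.AppImage
[ -f "$f" ] || touch "$f"   # stand-in here; normally this is the downloaded file
chmod +x "$f"               # an AppImage is a single self-contained executable
# ./"$f"                    # run it in place: no install step, no root
rm "$f"                     # done; nothing left behind to uninstall
```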
Ubuntu (I use Kubuntu) moved Firefox to a snap some time back. I get it - sandboxing, not a bad idea. But I’m pretty sure I ended up with one installed system-wide and one erroneously installed under my user account, possibly from forgetting to “sudo” during an update one day (I’m really not sure how it happened). And that latter one, if it existed, was almost certainly sitting in my /home directory somewhere, because my user account doesn’t have authority to write to /usr or /opt or anywhere like that. I didn’t plan to install software in /home, didn’t allocate space for it, and don’t really like the concept in general. (I’ve since switched to the deb for Firefox, and I think I got the snaps cleaned up.)
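For what it’s worth, as far as I can tell the snap itself always lives system-wide under /var/lib/snapd - what shows up in ~/snap is only per-user application data, not a second install. Checking what’s actually there is quick (read-only except the commented removal at the end):

```shell
# See what Firefox snap bits exist on this machine.
command -v snap >/dev/null || { echo "snapd not installed; skipping"; exit 0; }
snap list firefox 2>/dev/null || echo "no firefox snap installed"
ls -d ~/snap/firefox 2>/dev/null \
    && echo "^ per-user data dir (not a second install)" \
    || echo "no per-user Firefox snap data"
# sudo snap remove --purge firefox   # removes the snap and its data
```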
If we’re going to do images installed by users, /opt seems like a much better choice, albeit with some controls - maybe /opt/username/, with permissions set per user. I’d be okay with a user account being able to install there while being unable to screw up system files. My current backup strategy involves grabbing everything in /home with a few very specific exceptions, and I clearly don’t need the current release of Firefox in my backup.
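That strategy is easy to express with tar excludes (GNU tar matches exclusion patterns against any path component by default). The demo below runs against a scratch tree so it’s safe to execute as-is; swap $src for /home and pick your own exclusions in real use:

```shell
# Back up a home tree while skipping snap data and caches.
src=$(mktemp -d)
mkdir -p "$src/me/snap/firefox" "$src/me/Documents"
echo keep > "$src/me/Documents/notes.txt"
echo skip > "$src/me/snap/firefox/blob"

tar czf /tmp/home-backup.tgz \
    --exclude='snap' \
    --exclude='.cache' \
    -C "$src" .

tar tzf /tmp/home-backup.tgz | grep Documents/notes.txt        # kept
tar tzf /tmp/home-backup.tgz | grep -q snap \
    && echo "snap leaked into backup" || echo "snap excluded"
```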
I have OpenProject (community edition) installed for keeping track of a restoration project I’m working on, and I’m pretty sure I used Docker to install it. I have to admit it was easy to install (but so are debs, 99.9% of the time), but now I’m wondering about the best way to get the data back out so I can migrate it to my server (it’s running on my desktop because my server was that 2008 computer). I assume I can back up and restore, but I haven’t looked into it yet. Or heck, maybe it’s possible to just move the Docker image, the way I moved the HomeAssistant KVM image. It looks like the data is stored in a separate volume (which I interpret to mean a file that acts as a virtual disk, similar to how KVM has a virtual disk for the OS and apps in the virtual machine). Also, I’m not clear on whether Docker images update automatically or whether I should be updating them manually.
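Partially answering my own questions after a look at the docs: images don’t update themselves - you pull the new image and recreate the container, and the named volume with the data survives that. A volume (with the default local driver) is actually just a directory Docker manages under /var/lib/docker/volumes, not a disk image. The usual migration trick is to tar the volume’s contents through a throwaway container. The volume name below is a guess - check `docker volume ls` for the real one:

```shell
# Back up a named Docker volume to a tarball via a throwaway Alpine container.
# "openproject_pgdata" is a placeholder volume name.
command -v docker >/dev/null || { echo "docker not installed; skipping"; exit 0; }
docker run --rm \
  -v openproject_pgdata:/data:ro \
  -v "$PWD":/backup \
  alpine tar czf /backup/openproject_pgdata.tgz -C /data .
# Restore on the server: create/run the same way, but extract instead:
# docker run --rm -v openproject_pgdata:/data -v "$PWD":/backup \
#   alpine tar xzf /backup/openproject_pgdata.tgz -C /data
```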
Then there’s Zwift. Zwift is a virtual cycling program that runs on Windows, Mac, Android, and iOS. No Linux client, which isn’t a surprise. I have a whole Windows 10 computer in the basement that only runs Zwift, and it’s the only Windows machine I use. But someone created a Docker image of Zwift! I tried it on my Linux desktop machine a while back, and it worked. Very cool! But Zwift updates the program regularly, introducing new bugs and features - does the maintainer of that image have to do anything? What if he or she loses interest? It’d be nice to ditch Windows, but I have no idea if that Docker image will remain usable indefinitely.
I think Zwift is using Wine to run. So it seems the Docker image bundles the Zwift Windows client, Wine and its libraries, and whatever else Wine needs, with only the kernel and GPU drivers coming from the Kubuntu install…but I’m really not sure. Theoretically I don’t need to know, until something breaks.
I have yet to use a flatpak, I think.
I’ve considered asking about all of this in the Linux community here on Lemmy, but there’s probably an article with an overview of it somewhere, and I just need to search for it.
Format with btrfs if you use flatpak; it has deduplication.
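Flatpak’s OSTree store already hardlink-deduplicates identical files, and btrfs adds filesystem-level reflinks and transparent compression on top. For example (mkfs.btrfs works on a plain file, so this is testable without a spare disk; mounting needs root):

```shell
# Sketch: create a btrfs filesystem and mount it with zstd compression.
command -v mkfs.btrfs >/dev/null || { echo "btrfs-progs not installed; skipping"; exit 0; }
truncate -s 512M /tmp/btrfs-demo.img
mkfs.btrfs -f -L flatpak-demo /tmp/btrfs-demo.img
# sudo mount -o compress=zstd /tmp/btrfs-demo.img /mnt
```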
WTF!? Didn’t know you could post dissertations here!