• NuXCOM_90Percent@lemmy.zip · 11 months ago

    I am more familiar with RHEL than Ubuntu (I still can’t grok what the hell they are advertising when you try to update home Ubuntu…). But you are generally paying for a more curated selection of packages in the default repositories, as well as active support for the more “bleeding edge” stuff.

    Which DOES provide “security”, both in the sense of having more vetted third party packages (rather than doing your own research on which solution to use, you use the one the people you threw money at decided on for you) and in response time. Because if someone manages to sneak malware into a popular package, you don’t just have people on call to roll that back and implement mitigations/recoveries immediately. They are also on call to call you to say “Yo, gimp is gonna shove bitcoin mining goatse into every single picture you make. We suggest you do the following…” at 2 am.

    • hemko@lemmy.dbzer0.com · 11 months ago

      Let’s be honest, you’re paying for enterprise support. It ticks the boxes in your report and makes management happy - and there’s nothing wrong with that, as it will save your ass sometimes too.

      You’d get the same experience with any RHEL clone (old CentOS, Rocky) or even a completely different distro like Debian.

    • c10l@lemmy.world · 11 months ago

      I’ll be honest, I’m not that familiar with Ubuntu either. I do have pretty extensive experience with RHEL (though mostly through CentOS, back when it was effectively a RHEL clone) and even more with Debian (upon which Ubuntu is based).

      you are generally paying for a more curated selection of packages in the default repositories

      You seem to be implying that having fewer packages in the default repos somehow increases security. I don’t buy that. Packages that are not installed on the base system are fully optional (and even some that are, if you’re willing to do some cleanup!). Trimming them from the repos doesn’t shrink your attack surface, because what isn’t installed isn’t an attack vector either way. Having them in the repos does mean they go through the distro’s security process, patching, etc.

      Should the user choose to install that piece of software (otherwise it doesn’t matter), that process should mean increased security vs. the alternative - installing those packages either from upstream or from a third party. Either of those may have security practices on par with the distro’s, but more likely they’re worse. Furthermore, upgrades could become more perilous for essentially two reasons:

      • It’s difficult to update (you’ll need to track upstream, check whether it has CVEs, etc., and update manually, vs. an apt upgrade or similar - see the sketch after this list).
      • The CVE fix you need may only exist in a major version above the one you’re running, which could mean a lot more work on upgrading, breakages, outages, etc. - compare that with Debian stable or Ubuntu LTS, where security fixes keep coming for years.
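
      To make the first point concrete, here is a rough Python sketch of what “track upstream and check for CVEs” looks like when a package lives outside the distro repos. It queries the public NVD CVE API; the package keyword (“nginx”) is just a placeholder and the endpoint/parameters are written from memory, so treat it as an illustration rather than a recipe. With a repo package, apt upgrade plus the distro’s security advisories do all of this for you.

      ```python
      import requests

      # Illustration only: manually checking an out-of-repo package against the
      # public NVD CVE API (endpoint/params from memory; may need adjusting).
      NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

      def recent_cves(keyword: str, limit: int = 5) -> list[str]:
          """Return a few CVE IDs whose records mention `keyword`."""
          resp = requests.get(
              NVD_API,
              params={"keywordSearch": keyword, "resultsPerPage": limit},
              timeout=30,
          )
          resp.raise_for_status()
          return [item["cve"]["id"] for item in resp.json().get("vulnerabilities", [])]

      if __name__ == "__main__":
          # You still have to decide, for each hit, whether your exact version is
          # affected and whether a fix exists for the branch you are running.
          for cve_id in recent_cves("nginx"):  # "nginx" is a placeholder
              print(cve_id)
      ```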

      having more vetted third party packages

      Surely the mass of independent security researchers is more likely to find and file CVEs than the limited staff at Red Hat, who probably have better things to worry about. On top of that, whatever CVEs RH do find, they will likely submit to the CVE database anyway, so it doesn’t matter.

      They are also on call to call you to say “Yo, gimp is gonna shove bitcoin mining goatse into every single picture you make. We suggest you do the following…” at 2 am.

      That sounds like a nightmare scenario (almost literally!). Please don’t wake me up unless we’re bleeding money, reputation or potential revenue. Everything else can wait until next morning. My sleep can’t.

      • NuXCOM_90Percent@lemmy.zip · 11 months ago

        Not having a package installed DOES decrease your potential attack vectors. But it is more about decreasing the burden of picking a solution. For example, let’s say you are setting up a Kubernetes install and need to pick an ingress controller. You can read through the documentation and maybe even check various message boards to figure out which are good options. But you need to sift through the FUD, and you often end up needing an expert to make an informed decision.

        Or you can rely on the company you are paying to have already done that, likely by contracting it out to an expert who figures out which solutions are well maintained and have solid update policies.

        Because, getting back to a CVE: some software has a policy of backporting security fixes to the current LTS (or even a few of the previous ones). Others will just tell you to upgrade to the latest version… which can be a huge problem if you were holding at 3.9 until 4.x became stable enough to support the massive API changes. A “properly” curated package repository not only prioritizes the former but does so at every level, so that you don’t find out you were dependent on some random piece of software maintained by a kid who decided he was going to delete everything and fuck over half the internet (good times).
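
        As a concrete illustration, here is a minimal Python sketch of the question a curated repo answers for you: “has this CVE been fixed in the release I am actually running, or am I being told to chase a new major version?” It pulls the Debian security tracker’s public JSON export; the URL and JSON layout are written from memory and may have drifted, and the package/CVE names are just examples.

        ```python
        import requests

        # Sketch against the Debian security tracker's JSON export (URL and JSON
        # structure from memory; treat as an assumption, not gospel).
        TRACKER_URL = "https://security-tracker.debian.org/tracker/data/json"

        def cve_status(package: str, cve_id: str, release: str = "bookworm") -> str:
            """Return the tracker's status for cve_id in package on a given release."""
            data = requests.get(TRACKER_URL, timeout=120).json()  # large download
            info = data.get(package, {}).get(cve_id, {})
            rel = info.get("releases", {}).get(release, {})
            # "resolved" means the fix was backported to that release's version,
            # so you patch in place instead of jumping a major version upstream.
            return rel.get("status", "unknown")

        if __name__ == "__main__":
            # Example package/CVE; substitute whatever you actually depend on.
            print(cve_status("openssl", "CVE-2022-3602"))
        ```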

        And yes, you can go a long way by reading the bulletins by the various security researchers. But that is increasingly a full time job that requires a very specialized background.

        Given infinite money and infinite time? Sure, hire your own team of specialists in every capacity you need. Given the reality, you look for a “secure”/“enterprise” OS where you can outsource that and pay a fraction of the price.

        As for the 2 am wake-up call: if you have global customers, then “wait until next morning” might mean a full work day where they are completely vulnerable, getting hammered, and deciding that every single loss is your fault because you couldn’t maintain a piece of software. Or you might have sensitive enough customers/data that a sufficiently bad breach is the end of the company itself (plus an investigation to see who is at fault).

        Which all gets back to why this is a non-issue for consumers. Enterprise OSes already exist and are not some evil scheme MS is working toward. And even the vast majority of companies don’t need them (but really should run them and consider paying for the support package on top…). So there is absolutely zero reason that the “home” version would ever be locked away behind one.

        • caseyweederman@lemmy.ca · 11 months ago

          This is what you get when you pay.
          Security backports to old versions of software that have fallen out of support.

          • c10l@lemmy.world · 11 months ago

            Or, you know… you can get it for free with Debian, which circles back to my initial argument.

            • caseyweederman@lemmy.ca · 11 months ago

              Well, no.
              You can get it for free with Debian, or even Ubuntu on an LTS version. Just not forever.
              The reason enterprises want to pay money for extended long-term support is so they don’t have to keep jumping major versions (with the possibility of breaking whatever unique environment they had going) every couple of years.

              Even the Linux kernel itself scaled back how long it’s willing to support LTS releases, leaving long-term users with the work of sourcing backports themselves or constantly testing out new versions.

              I’m very comfortable running Sid at home, but there the annoyance is limited to one person if I have to spend a couple of hours combing through git diffs.

              Ten years between OS refreshes is money.