Based on https://privacytests.org

Desktop browsers in their current stable versions, sorted from best (left) to worst (right). These are:

Librewolf, Mullvad, Brave, Tor, Safari, Chromium/Ungoogled, Firefox, Edge, Opera, Vivaldi, Chrome.

Note: Each test is counted with a value of one in this chart; however, the tests are not all equally important for privacy. The chart still gives a picture of which browsers value privacy and which do not.

The maximum (worst possible) score is 143.
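
For concreteness, here is a minimal sketch of how such an unweighted score works: every failed test adds one point, so a browser's bar is just its count of failed tests out of 143. The data structure and test names below are made up for illustration and are not privacytests.org's actual format.

```python
# Minimal sketch of the scoring behind the chart: every failed test adds one
# point, so a browser's bar is simply its count of failed tests (max 143).
# The result structure and test names below are invented for illustration;
# this is not privacytests.org's actual data format.

def chart_score(results):
    """Count failed tests, where results maps test name -> True (pass) / False (fail)."""
    return sum(1 for passed in results.values() if not passed)

# Made-up example: three tests, two failures -> score of 2.
example = {
    "partitions third-party cookies": True,
    "sends GPC signal by default": False,
    "blocks known trackers": False,
}
print(chart_score(example))  # 2
```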

Edit: Also, FUCK BRAVE, but for reasons other than these points. Read the description before you vote or comment, ffs…

          • legless velociraptor@sh.itjust.works · 8 months ago

            Even though it’s not, it certainly looks like one. It’s what your readers perceive that matters. I write a little blog about family life in a local newspaper. I write the content before it is published, but after publication the text belongs to the readers. I can’t alter the content after reading the comments and seeing what people think the text is expressing. It wouldn’t matter anyway, as no one will come back later to read the updated version, and almost everyone reads it in the first few hours after publication. Editing after publication is futile.

            This is just to underpin that explaining a graph on a different website is bonkers, as only data analysts will actually follow the link to understand the data behind the graph. “Normal” people will take the graph as is and jump to the most obvious conclusion: “Librewolf, Mullvad, Brave, Chromium are all better than Firefox, and Chrome is the worst!” Or, even better, “Ah, a list of browsers. Chrome seems to be the best one. Cool! scroll.” Those are the people who didn’t even comment, upvote, or downvote the post; you’ll never know who they are or how many of them there are.

            The best approach might indeed be to delete the post, build yourself some data visualization knowledge, and come back with an improved graph. Also, even though you say it’s impossible to weigh the individual points in the tests, this might still be something you have to do to get your message across, whatever that might be. It involves work. You’ll be OK. Making mistakes like this and posting them publicly is what gives you the information you need to improve.

            • Vub@lemmy.worldOP · 8 months ago

              Thanks for your feedback - and no worries about it being too long :)

    • tiramichu@lemm.ee · 8 months ago

      The tables in the source show why.

      Those tables measure many things (preventing URL tracking, blocking cookies, limiting what gets stored in session storage, and so on) that none of the mainstream browsers do, including Firefox.

      Some of those are deliberate decisions made for good reason: although blocking those things would improve privacy, it would also massively impact usability, so only the most all-in, privacy-focused browsers do it.

      OP themselves notes that each ‘check’ was weighted as a 1 and each ‘cross’ as a 0 when calculating the size of the bars in the chart, without considering how important those features are relative to one another.

      Personally I believe the approach to generating the chart is flawed and does not give a fair measure of browser privacy.
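
      To make the weighting point concrete, here is a small hypothetical sketch: two browsers can swap places depending on whether the tests are weighted. The test names, weights, and results are invented for the example, not taken from privacytests.org.

      ```python
      # Hypothetical illustration of why unweighted counting can mislead. Test names,
      # weights, and results are invented; they are not taken from privacytests.org.

      weights = {
          "blocks third-party cookies": 5.0,  # assumed high privacy impact
          "hides referrer header": 1.0,       # assumed low impact
          "strips tracking parameters": 1.0,  # assumed low impact
      }

      browser_a = {"blocks third-party cookies": True,  "hides referrer header": False, "strips tracking parameters": False}
      browser_b = {"blocks third-party cookies": False, "hides referrer header": True,  "strips tracking parameters": True}

      def unweighted(results):
          # Each passed test counts as exactly 1, as in the chart above.
          return sum(results.values())

      def weighted(results):
          # Each passed test counts by its (hypothetical) importance.
          return sum(weights[t] for t, ok in results.items() if ok)

      print(unweighted(browser_a), unweighted(browser_b))  # 1 vs 2 -> B looks better
      print(weighted(browser_a), weighted(browser_b))      # 5.0 vs 2.0 -> A looks better
      ```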

      • Mario_Dies.wav@lemmy.dbzer0.com · 8 months ago

        Yes, this seems like a really flawed method, and the only reason I could see someone using it would be if they were trying to make one browser appear better or worse than it really is.

        The truth is this chart is misleading and not very helpful.

      • nxdefiant@startrek.website · 8 months ago

        They are FF with the defaults set to “I don’t care if enabling this breaks my websites”.

        Telemetry is personal preference. Sending that data to a company you trust to use it for the stated purpose (making Firefox better) is a choice, and FF lets you easily disable it.

        • Katlah@lemmy.dbzer0.com · 8 months ago

          “I don’t care if enabling this breaks my websites”

          I haven’t experienced any website breakage with Librewolf. Mullvad breaks websites because it ships NoScript by default (even though uBlock Origin has script blocking built in).

    • nxdefiant@startrek.website · 8 months ago

      Firefox tries to keep its defaults on the ‘functional’ side of safe. Firefox will pass almost every test here if you change some settings away from the defaults to something stricter (like strict Enhanced Tracking Protection), but doing so can actually break some websites.

      That said, this is a Brave ad. A big tip-off for me is GPC (Global Privacy Control). Its development was supported by the Mozilla Foundation, yet Firefox gets a ‘fail’ here because it isn’t on by default. No test that expects a user to know what all these things are can also reasonably expect that user not to check whether they’re enabled. A fair test would have rated all these browsers with their defaults AND hardened (without plugins).

      Keep in mind that Brave doesn’t give you the choice to disable GPC; they dictate what you can and cannot do in this regard, so it isn’t possible to A/B test Brave’s behavior in a lot of these cases.
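
      As a rough illustration of that "defaults AND hardened" comparison, the sketch below launches Firefox twice via Selenium, once stock and once with a few commonly cited hardening prefs flipped. The pref names are examples and may differ between Firefox versions; this is not an official or complete hardening list, and it assumes selenium and geckodriver are installed.

      ```python
      # Rough sketch of the "defaults AND hardened" comparison: launch Firefox twice
      # with Selenium, once stock and once with a few commonly cited hardening prefs.
      # Pref names are examples and may vary between Firefox versions; this is not an
      # official or complete hardening list. Requires selenium and geckodriver.

      from selenium import webdriver
      from selenium.webdriver.firefox.options import Options

      HARDENED_PREFS = {
          "privacy.trackingprotection.enabled": True,    # tracking protection in all windows
          "privacy.globalprivacycontrol.enabled": True,  # send the GPC signal
          "privacy.resistFingerprinting": True,          # Tor-Browser-style fingerprinting defenses
          "network.cookie.cookieBehavior": 5,            # reject/partition third-party cookies
      }

      def launch_firefox(hardened):
          opts = Options()
          if hardened:
              for name, value in HARDENED_PREFS.items():
                  opts.set_preference(name, value)
          return webdriver.Firefox(options=opts)

      # Run the same page against both configurations and compare results by hand.
      for label, hardened in [("default", False), ("hardened", True)]:
          driver = launch_firefox(hardened)
          driver.get("https://privacytests.org")  # or a local test harness
          print(label, driver.title)
          driver.quit()
      ```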

    • Vub@lemmy.worldOP · 8 months ago

      “How is Firefox that high?”

      How? Firefox 121 has that score because it “failed” 86 of the 143 tests; you can see which ones if you go to the source.

      As someone already commented, Firefox actually lets a bunch of things through and has telemetry. But it can be hardened, and it is (IMO) the best browser overall.

      But it is not perfect.

      Please note (AGAIN) that one test (one point) here is NOT equal in importance to every other test, but I have no way to weigh one test against another; there are 143 tests (points) to consider. I thought it was obvious, but for some reason people chose not to read the description, or misunderstood it. Sorry if it wasn’t clear enough.

      • tiramichu@lemm.ee · 8 months ago

        I appreciate your effort in doing the work and putting the chart together :)

        You fairly disclosed how it was all generated, but sadly not everyone will read that. A lot of people are going to look at that chart, and their conclusion will be “Wow, Firefox is like a tenth as private as Brave!”, which you and I both know is not a true statement at all.

        Even if people did read it, things get saved and shared out of context all the time, and before you know it this chart will pop up in some Discord argument as evidence for why Firefox sucks just as much as Chrome, completely divorced from your comments or any reference to where the data came from. The source website URL isn’t part of the image either.

        I guess that’s the danger of visualisations. Data viz can paint a complex picture clearly in an instant; we just have to make sure it’s painting the right one.