For example, theming shouldn’t have to be a 10-step process. Make Flatpaks use your themes correctly. Another thing is Qt theming: why can’t you use the Breeze style outside of KDE? It’s the best and most consistent application style for Qt apps. And a final point: when is the org.foo.bar naming scheme going to be fixed so you can use the actual name of the package rather than the technical name? flatpak remove and flatpak install both work without the full name, so why doesn’t flatpak run? The naming is the only thing snaps have over Flatpak. If nothing is being done, how can I contribute?
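As an illustration of that last point, a wrapper around `flatpak run` could resolve partial names much like `flatpak install` and `flatpak remove` already do. This is a hypothetical sketch, not existing Flatpak behaviour: `resolve_app_id` and `sample_ids` are invented for the example, and a real wrapper would read the installed IDs from `flatpak list --columns=application`.

```python
# Hypothetical sketch of partial-name resolution for `flatpak run`,
# mirroring what `flatpak install`/`flatpak remove` already do.
# resolve_app_id() and sample_ids are invented for illustration; a real
# wrapper would read `flatpak list --columns=application` instead.
import subprocess

def resolve_app_id(partial, app_ids):
    """Return the unique installed app ID matching `partial`, else None."""
    p = partial.lower()
    # Prefer an exact match on the last segment of the reverse-DNS name.
    exact = [a for a in app_ids if a.rsplit(".", 1)[-1].lower() == p]
    if len(exact) == 1:
        return exact[0]
    # Otherwise fall back to a unique case-insensitive substring match.
    subs = [a for a in app_ids if p in a.lower()]
    return subs[0] if len(subs) == 1 else None

def run_by_partial_name(partial):
    ids = subprocess.check_output(
        ["flatpak", "list", "--columns=application"], text=True
    ).split()
    app = resolve_app_id(partial, ids)
    if app is None:
        raise SystemExit(f"no unique match for {partial!r}")
    subprocess.run(["flatpak", "run", app])

sample_ids = ["org.gimp.GIMP", "org.mozilla.firefox", "com.valvesoftware.Steam"]
print(resolve_app_id("gimp", sample_ids))   # → org.gimp.GIMP
```

The ambiguity check matters: with more than one match the wrapper should refuse rather than guess, exactly like the interactive prompt `flatpak install` shows.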

  • TCB13@lemmy.world · 9 months ago

    I’ve been using Linux for quite a while now. While not everything is perfect,

    You don’t have to upsell me on Linux, I’m already sold. However, I use Linux, Windows and macOS at work and at home, and I notice the subtle differences when it comes to a polished experience and cohesion. Windows is the worst on cohesion, as expected.

    And which kind of cohesion on the desktop are you missing and I don’t? I mean sure, Apple has one big ecosystem with everything tied to it. It is convenient and easy as long as you’re within that one ecosystem. And Linux for example doesn’t sell an operating system and online services and software

    I’m not talking about ecosystems, I’m talking about the small annoyances, from icons that don’t have a consistent look across apps on Linux to the fact that whenever I want to add a VLAN, instead of doing it in GNOME Settings (gnome-control-center) I’m forced to use nm-connection-editor, which is a different application. Settings are kind of scattered around. The same happened with WireGuard VPNs for a while…

    Flatpak isn’t something tightly integrated into the system and it isn’t Linux’s default choice

    It does use a lot of the same containerization technologies that Docker, LXC, etc. use, such as cgroups, namespaces and bind mounts. To me it seems more like the higher levels are missing pieces that facilitate communication between applications (be it protocols, code or documentation), and sometimes it is as simple as configuration.
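    Those shared kernel primitives are easy to observe on any Linux box. As a quick Linux-only sketch, this lists the namespaces the current process lives in - the same mechanisms (together with cgroups and bind mounts) that Flatpak, Docker and LXC all build on:

```python
# Linux-only peek at the kernel namespaces this process lives in - the
# same primitives (together with cgroups and bind mounts) that Flatpak,
# Docker and LXC all build on. Each entry reads like "mnt:[4026531840]".
import os

for ns in sorted(os.listdir("/proc/self/ns")):
    print(ns, "->", os.readlink(f"/proc/self/ns/{ns}"))
```

    Run inside a Flatpak or Docker sandbox, the inode numbers differ from the host’s, which is exactly what the isolation amounts to.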

    Additionally I have the command line, where everything ties into everything else superbly. It follows the Unix philosophy. That means I have tools that are supposed to do one task, but do that task well. And then I have a simple means of connecting and concatenating them, which makes things really easy. I don’t know how Apple does stuff.

    Apple also has it. macOS is a UNIX system, for what it’s worth, and that’s the reason why a large number of developers use it instead of Windows. The CLI tools you use under Linux are most likely available for macOS as well via https://brew.sh/.

    To be fair, Apple actually does a decent job when it comes to connecting things, as they even created a programming language called AppleScript that was made specifically so you can automate macOS GUI applications easily. On Linux this can be done with strongwind (deprecated?), dogtail (dead?) or xdotool, but those tools are sloppy and hard to use.

    In AppleScript you can access the native APIs of macOS GUIs and simulate a user clicking on buttons and menus; you can also tell it to record some action on the GUI and it will translate it to code.

    Recently Apple even made their macOS GUI automations available from JavaScript, and you can do the exact same things you used to be able to do from AppleScript in JavaScript. They actually invested so much in it that you can even build entire macOS desktop apps with JavaScript, using the typical UI components and frameworks Apple provides.

    a game that isn’t available for Mac or you’re forced to use Microsoft Access or other specific software for work? Or you want to watch Virtual Reality pornography and that happens to be something the app store cuts down on?

    There’s a difference between macOS and iOS. What you described is what happens on iOS - you’re required to use the store and whatnot - but under macOS you can get applications from anywhere you want, like you do on Windows and Linux.

    You just can’t compare specifically Flatpak to the way Apple does it. It is something that is focused on decoupling things from the system, not integrate them

    Apple does enforce a LOT of separation. They call it sandboxed apps and it is all based on capabilities; you may enjoy reading this. Applications get their isolated space at ~/Library/Containers and are not allowed to just write to any filesystem path they want.

    A sandboxed app may even think it is writing into a system folder for preference storage, for example, but the system rewrites the path so that it ends up in the Containers folder instead. For example, under macOS apps typically write their data to ~/Library/Application Support. A sandboxed app cannot do that - the data is instead written beneath the ~/Library/Containers/app-id path for that app.
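    As a toy model of that redirection - this is an illustration only, not Apple’s actual implementation, and the bundle ID com.example.Foo is invented - the rewrite amounts to re-rooting $HOME-relative paths under the app’s container:

```python
# Toy model of the macOS sandbox path rewrite - an illustration only,
# not Apple's implementation; the bundle ID "com.example.Foo" is invented.
from pathlib import PurePosixPath

def redirect(path, home, bundle_id):
    """Re-root a $HOME-relative path under the app's container."""
    p, h = PurePosixPath(path), PurePosixPath(home)
    container = h / "Library" / "Containers" / bundle_id / "Data"
    try:
        rel = p.relative_to(h)
    except ValueError:
        return path  # outside $HOME: handled by other sandbox rules
    return str(container / rel)

print(redirect("/Users/me/Library/Application Support/Foo/prefs.plist",
               "/Users/me", "com.example.Foo"))
# → /Users/me/Library/Containers/com.example.Foo/Data/Library/Application Support/Foo/prefs.plist
```

    The point is that the app keeps using the familiar path while the system transparently confines the write to the per-app container.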

    And here’s how good Apple is, any application, including 3rd party tools running inside your Terminal will be restricted:

    I bet you weren’t expecting that a simple ls would trigger the sandbox restrictions applied to the Terminal application. The best part is that instead of doing what Flatpak does (just blocking things and leaving the user unable to do anything), the system will prompt you for a decision.

    I would really like Linux to step up their game regarding a few things. Desktop application sandboxing and distribution is one thing. If you use Flatpak for this, the blame is on you. It is not the solution I’d like to see. And it is not the intended way of using Linux, so you can’t really complain. We need a proper solution instead.

    But okay, I see your point. I still believe that Flatpak could’ve done a few things better just by looking at what Apple does.

    Half of the success of Windows and macOS is the fact that they provide solid and stable APIs and development tools that “make it easy” to develop for those platforms. Linux is very bad at that. If major pieces of an OS are constantly changing and require large re-works of applications, then developers are less likely to support it. To be fair, the Linux situation might be even harder than that - there is no distribution-“sponsored” IDE (like Visual Studio or Xcode), nor userland API documentation, frameworks etc.

    If Linux is able to provide those things we may even get proprietary software like Adobe on Linux, because let’s face it, the lack of Adobe and others is also Linux’s fault, not only those companies’. It is really fucking hard to develop and support software for Linux when you have to deal with at least two major half-assed desktop environments (KDE and GNOME), one of which decides to reinvent the wheel every now and then, breaking APIs with little to no regard for software. To make things worse, you’ll end up finding out that most of the time people are running KDE plus a bunch of GNOME/GTK/libadwaita components, creating a Frankenstein of a system because some specific app depends on said components.

    • rufus@discuss.tchncs.de · 9 months ago

      Hehe, yeah, I see. I can agree with a lot of that. Maybe I should try a Mac for once, and for more than 20 minutes. I think I mostly read the iPhone stuff and shake my head: how they force developers to buy a Mac, restrict the whole iPhone ecosystem. I don’t think I’d feel at home on a platform like that.

      Concerning the MacBooks: I’ve recently learned about the M2 and M3 MacBooks and their outstanding performance at some workloads. For example, people are doing machine learning (AI) stuff on them, and the number of tokens an LLM can process/generate on them is on a whole other level compared to what my Intel machine does. I think Apple did a good job with that hardware. However, they cost so much more… I can get a very decent frame.work laptop with a modern Ryzen for $2,070 or buy a new MacBook for $3,400 with a bit less RAM and the same amount of storage. It’d be faster at a singular workload I’m somewhat interested in, but I’m not sure if it is worth that kind of money.

      And I think I’m getting old. I’m accustomed to how Linux works, I know my way around, I have my workflow set up. I’m not sure if I can be bothered to learn something new… exchange the little annoyances for something that requires me to adapt to an entirely new workflow… Maybe I’ll try it anyway. See if there are cracked versions of macOS that I can boot in a VM and see if I like it. I have to think about that.

      Thank you for the discussion. I really don’t see Flatpak as the pinnacle of software distribution, but Linux is constantly evolving and I’m pretty sure we’ll get there someday for desktop applications. I think all the containerization stuff - cgroups, systemd and so on - is a good approach. It makes many things so much easier than they used to be. I can spin up light containers and services and have them run with arbitrary permissions and environments on a server, and all I need is a few lines of text. Sure, on the server I configure the permissions and what they’re allowed to access myself; that can’t be transferred directly to the desktop. We still need additional interfaces, and especially ways to address what you said. Linux is a good desktop operating system, but there are some things that need to be solved better (or at all).

      Since you mentioned that GUI application automation: that is a crazy approach. I saw some CI pipelines using such tools to test GUI applications and web interfaces. Load XY, press TAB 4 times, hit enter, search for an element with Z in the name, press ALT+F, do something else and then take a screenshot… The whole thing looked completely mental (to me). And I think there is something like that on Windows, too. I can only imagine things like that break easily, and you’re never able to change things if people actually rely on it. But I’m really not an expert on this. It might have valid use-cases. Or it’s just a silly way of doing things.

      Something I don’t agree with is Windows and macOS succeeding because of solid and stable APIs. Theoretically this might be the case for developers; for Windows desktop end-users it is certainly not. My family threw out several printers because after a Windows Update there were no drivers available any more. Most of my old games don’t work any more, I’ve tried. Installing the old dotnet or C++ runtimes and DirectX versions is a hassle, sometimes impossible. Some games crap out entirely. And I can’t do it the other way around and install an old version of Windows on modern hardware. So while in theory the Windows kernel API might enjoy a good development model, it has little to zero effect on end-users and on why they buy Windows laptops in large quantities. And if success in the market is the measurement: contrary to Windows, Linux is the dominant operating system on servers and very successful there. So I don’t think this is the real reason. But reliable interfaces are certainly something we want. Apple changed the entire processor architecture, and then again. With them things also don’t stay the same; they solve that with other techniques. And a MacBook won’t be thrown in the garbage after a few years because it’s gotten so slow - I see people keeping them for quite some time. But they usually don’t run the latest version of macOS any more. At least that’s what I’ve seen.

      Anyway, it’s getting kind of late here. Thanks for the comment and the additional info you linked. I’m going to read the links tomorrow.

      • TCB13@lemmy.world · 9 months ago

        See if there are cracked versions of macOS that I can boot in a VM and see if I like it. I have to think about that.

        You don’t need any cracks. The issue with running macOS in a VM is that the VM won’t provide a compatible GPU and it will lag a lot. Yes, it’s painful, and there aren’t decent workarounds unless you can pass through an entire GPU supported natively by macOS.

        The whole thing looked completely mental (to me.) And I think there is something like that on Windows, too.

        Yes, there’s vTask (proprietary) and AutoIt for Windows. The second one is very good and very reliable.

        I can only imagine things like that break easily, and you’re never able to change things if people actually rely on it. But I’m really not an expert on this. It might have valid use-cases. Or it’s just a silly way of doing things.

        AutoIt doesn’t break as much as you’d think if the developer knows what they’re doing. “Unfortunately” I spent the better part of 2010 writing AutoIt to automate exporting data from a very proprietary Siemens software, and after a few months you just learn how to do it properly :P It can target the Win32 controls directly, and you can bind code to UI events by their internal Windows IDs. Another interesting thing it can do (sometimes) is explore a program’s DLLs and internal function calls and call those directly from your code instead of clicking buttons.
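        Why ID-based targeting holds up is easy to show with a toy model. Everything below is invented for the illustration - the Window class, the “IDOK” ID and the coordinates - it just mimics the contrast between coordinate clicking and control-ID clicking (the latter is what AutoIt’s real ControlClick function does):

```python
# Invented toy model of why control-ID targeting (what AutoIt's real
# ControlClick does) survives layout changes that break coordinate-based
# clicking. Window, "IDOK" and the coordinates are all made up.
class Window:
    def __init__(self, controls):
        self.controls = controls  # {control_id: (x, y)}, layout-dependent
        self.clicked = []

    def click_at(self, x, y):
        # Fragile: only works if the button still sits at (x, y).
        for cid, pos in self.controls.items():
            if pos == (x, y):
                self.clicked.append(cid)

    def click_control(self, cid):
        # Robust: depends only on the control's stable internal ID.
        if cid in self.controls:
            self.clicked.append(cid)

v1 = Window({"IDOK": (100, 200)})
v2 = Window({"IDOK": (150, 240)})  # same dialog after a UI update

v1.click_at(100, 200)       # works on v1...
v2.click_at(100, 200)       # ...silently misses on v2
v2.click_control("IDOK")    # still works after the update
print(v1.clicked, v2.clicked)   # → ['IDOK'] ['IDOK']
```

        The coordinate script breaks the moment a button moves; the ID-based one keeps working across UI updates, which is why such automations can survive for a decade.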

        What Apple does with AppleScript is a less advanced version of AutoIt; you can call their frameworks’ functions directly from it (hence the ability to build entire applications) and interact in robust ways with the GUI of some applications. Applications can also load “plugins” into the editor and provide methods to do certain tasks the developer decided might be important for someone.

        Might have valid use-cases.

        In macOS land the use case is allowing anyone without much coding experience to automate some GUI task. While not perfect, this is a large win for a lot of people, especially because you can just click “record” > do your repetitive task > “finish” and it will translate the task into code. The best part is that this “record” feature doesn’t actually record click positions; it finds out the IDs of the buttons and menus you clicked and writes optimized, reliable code for the task.

        In my case with the Siemens software the use-case was very simple: we needed access to data that was only made available either through their software, which was about €300/month, OR with a special license and another tool (that provided a local API via socket with the data) that would cost around €50,000/month - when you see a price like that, I believe it’s totally justifiable and okay to automate the UI. Note that this was in 2010, and from what I’ve been told my code is still running the same task today without changes (AutoIt is compiled and they don’t even have the source). I believe this speaks volumes about how reliable AutoIt can be.

        Something I don’t agree with is Windows and MacOS succeeding because of solid and stable APIs. Theoretically this might be the case for developers

        And developers create software that people use. Large companies, without being given stable APIs and good documentation, won’t ever feel like developing for Linux. They couldn’t justify a very expensive development process with large maintenance costs for such a small market share. If the APIs were more stable and there were better frameworks, it would be easier to justify.

        So while in theory the Windows Kernel API might enjoy a good development model, it has little to zero effect on the end-user

        It’s not just about the kernel, it’s about the higher-level APIs and frameworks that let developers work quickly. It’s about having C# and knowing the thing is very well supported in every corner of Windows and whatnot. It’s about having entire SDKs with everything integrated in an IDE made by them, where everything works on the first try.

        Most of my old games don’t work any more, I’ve tried. Installing the old dotnet or C++ runtimes and DirectX versions is a hassle, sometimes impossible. Some games crap out entirely. I can’t do it the other way around and install an old version of Windows on modern hardware.

        It seems you’re picking the hard case: games. But, for instance, you can install Office 2003 and Photoshop 6 on Windows 11 and they’ll run without hacks - the Linux desktop (not CLI) never offered this kind of long-term support. Recently I had an experience with an old game on modern Windows that might interest you: https://lemmy.world/post/10112060.

        Apple changed the entire processor architecture, and then again. With them things also don’t stay the same; they solve that with other techniques. And a MacBook won’t be thrown in the garbage after a few years because it’s gotten so slow - I see people keeping them for quite some time. But they usually don’t run the latest version of macOS any more. At least that’s what I’ve seen.

        Apple simply obliterates the old and doesn’t care much about it; Microsoft is usually way better at this. BUT… still, as you’ve noticed, their Rosetta 2 compatibility layer allows you to run Intel software on ARM machines without issues, even games and heavy stuff.

        But they usually don’t run the latest version of macOS any more. At least that’s what I’ve seen.

        Yes, they have restrictions because they usually want to clean their kernel and some system components of support for older hardware, and this seems to be a big advantage when it comes to the performance and reliability of their OS. Either way, those machines with older macOS versions keep working and getting at least most of the software for a reasonable time.

        • rufus@discuss.tchncs.de · 9 months ago

          Yes, I Googled a bit and found out how to virtualize macOS; the install did the first reboot already. Seems they took inspiration from Scotty from the TOS Enterprise: it suggested 2h50 at first, but the minutes are coming down fast.

          We’ll see about that graphics acceleration. The laptop doesn’t have a dedicated GPU anyway. Either QEMU/KVM does it, or I can pass through half the Intel iGPU, or it’ll just be slow.

          I can empathize with your story about the GUI automation. Sometimes you just need a solution for your problem. If it’s still running more than 10 years later, it probably was the right call. I mean, sometimes crazy workarounds stick and do the trick. You can always calculate whether buying software/a license or paying someone to come up with a solution is cheaper. 13 × 12 × €50,000 is a good amount of money.

          Thank you for the Midtown Madness 2 link. I need that, too. Spent quite some time in that blocky version of San Francisco when I was a kid.

          I don’t really have a better use case for Windows on my laptop at home. I use it to update stuff like the GPS and probably one or two other things. I moved a few games there after the SSD with Linux on it was filled up.

          (Edit: The install is done. You were right, the desktop is totally sluggish and I don’t have any sound. And I skipped the AppleID. I’ve closed it for now. Maybe I can find better settings on the weekend and try to install something on it.)

          • TCB13@lemmy.world · 9 months ago

            We’ll see about that graphics acceleration. The laptop doesn’t have a dedicated GPU anyway. Either QEMU/KVM does it, or I can pass through half the Intel iGPU, or it’ll just be slow.

            Assuming you have a GPU supported by macOS, you might be able to get good results by treating it like a Hackintosh: https://dortania.github.io/OpenCore-Install-Guide/

            I’m not sure how macOS plays with GVT-g / SR-IOV / sharing slices of hardware, but this guy says he got it to work: https://www.reddit.com/r/VFIO/comments/innriq/successful_macos_catalina_with_intel_gvtg/. I personally never got macOS with GPU acceleration working fine in a VM because my host is NVIDIA and unsupported. However, I did have very good results on HP Mini computers running macOS by following the links above.

            Thank you for the Midtown Madness 2 link. I need that, too. Spent quite some time in that blocky version of San Francisco when I was a kid.

            I believe the hacks work with other games from that time as well, since they solve the DirectX and GPU issues nicely without permanent changes to your system.