Part of what’s making learning Linux so difficult for me is how fragmented it all is. You can install programs with sudo apt install (program). You can get programs with snaps. You can get programs with flatpaks. You can install with tar.gz files. You can install with .deb files. You can get programs with .sh files. There’s probably more I don’t know about.

I don’t even know where all these programs are being installed. I haven’t learned how to uninstall them yet. And I’m sure that each way has a different way to uninstall too.

So that brings me to my main question. Why not consolidate all this? Sure, files CAN be installed anywhere if you want, but why not make a folder like /home/programs/ where it’s assumed that programs would be installed?

On Windows, programs can be installed anywhere, but the default is C:\Program Files\ or C:\Program Files (x86)\ or something like that. Now, you can change it all you want when you install the programs. I could install to C:\Fuckfuckfuck\ if I wanted to. I don’t want to, so I leave it alone, because C:\Program Files\ is where it’s assumed all the files are.

Furthermore, I see no benefit to installing 15 different programs in 7 different folders. I begrudgingly understand why there are so many different installation methods, but I do NOT understand why, as a collective community, we can’t have something like a standardized setting in each distro that lets you set one place for all your installation files.

Because of the fragmentation of distros, I can understand why we can’t have a standardized location across all distros like Windows has. However, I DON’T see why we can’t have a setting, chosen on first boot after installation, that tells each future installation which folder to install to.

I would personally pick /Home/Programs/, but maybe you want /root/Jamies Files/ because you’re Jamie, and those are your files.

In either case, as we boot up during the install, it would ask us where we want our program files installed. And from then on, no matter what method of install you choose, it would default to whatever your chosen folder was.

Now, you could still install other places too, but you would need to direct that on a per install basis.

So what’s the benefit of having programs each installed in separate locations that are wildly different?

  • highball@lemmy.world · 4 days ago

    If you delete a program from the Programs folder, does it get uninstalled from the system? Nope. You have to go dig through the registry and delete any mentions. You have to go looking for shortcuts and delete all those in multiple locations. If you go to Control Panel -> Add/Remove Programs, that works decently, but it’s not guaranteed. It’s been a couple decades since I’ve used Windows regularly, but there were all kinds of folders (AppData, Roaming, and that’s just what I remember having to go dig through) because there is no real standard for anything. That doesn’t even get into system files and DLLs.

    There are package managers for Windows too. Chocolatey is something I have seen in README.md files for various install instructions.

    The Unix-like install locations are for organization. It goes back to Unix server OSes, and Linux is Unix-like as well. Linux doesn’t have a registry to keep track of all the locations of all the files and configuration for each program, so organized locations where the system can expect to find specific things are how it’s done. I much prefer the organized file structure; digging through the registry, where there is only minimal organization, was something I always hated.

  • Rimu@piefed.social · 6 days ago

    What you’re seeing is the result of decades of new ways to install stuff being added at different times. As a new way is added, all the old ways still need to work, because getting everyone to switch to the new way is impossible. There is no central authority making these decisions; it’s more of a marketplace of ideas, with different ‘sellers’ competing for attention.

  • demesisx@infosec.pub · 3 days ago

    You should try NixOS or something similar. It sounds like all of your gripes with Linux are solved by NixOS. It makes system management a LOT more sane than what the FHS approach has become.

  • Zozano@lemy.lol · 5 days ago

    Windows certainly doesn’t have uniformity.

    Where are my game saves located?

    Are they in my hidden AppData folder? If so, which of the three subdirectories do they live in?

    If not there, then surely it’s in the Saved Games folder.

    Nope. It must be in My Documents.

    Shit… Maybe in the Program Files?

    For fuck sake, where is it?!

    Web browser > search > pcgamingwiki (great resource BTW), save game location. AH-HA! IT’S IN… My Documents?

    I just checked there! (Half an hour passes)

    Found it! Now why the FUCK does Windows partition the local user directory from the OneDrive user directory?!

    Windows is a FUCKING mess. Once you get used to Linux, you’ll understand the worst thing is Mozilla thinks it’s okay to put its config file one directory up from where it should be.

    • Ketata Mohamed@mastodon.tn · 5 days ago

      @Zozano @Lost_My_Mind
      Seriously, I have been gaming for more than 30 years, and if you want me to swear, I’ll swear. For 25 years I have been playing pirated games. On v1.0.0 of a random game, the save file is located in My Documents\My Saved Files; then after installing update 1.1.0 the saved progress is gone, and after much research I find it under my docs\the game name; then after 1.2.0 I find it under AppData\Local… and so on, and every time both the name and the path change.

  • manicdave@feddit.uk · 5 days ago

    Linux is actually kinda designed to be less fragmented than Windows, really.

    The reason you don’t pick an install directory is because the standard is that binaries live where binaries live, dependencies live where dependencies live, logs live where logs live, etc.

    All the user should have to worry about is where the media (or whatever your program works with) lives.

    Always try to find the apt install instructions for whatever program you want, and it’s easy to uninstall with apt remove.
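
    For example, a minimal sketch of that loop (vlc is just a stand-in for whatever package you’re after):

        sudo apt install vlc    # install from your distro's repositories
        dpkg -L vlc             # list every file the package put on the system
        sudo apt remove vlc     # uninstall it again (add --purge to drop its config files too)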

    Apart from a few deb packages, almost everything that can’t be managed via apt should be considered incomplete or experimental. If it was ready for you to just use it without issue, it would be in an apt repository.

    It may seem a bit daunting to have to use command line at first, but once you’re used to it, you’ll realise how absolutely broken and archaic managing software on windows is. (Like seriously, it’s 2024 and you’re still having to fish through slow or sketchy websites to find installers for tools and drivers.)

  • Possibly linux@lemmy.zip · 6 days ago

    Linux doesn’t really have standalone programs like Windows. A package is a series of files that get placed in the proper places, plus some optional scripting. Packages have dependencies, so you can’t just run a binary from a package. The closest thing Linux has is AppImage, but it has lost a lot of steam.

    In Linux there are two general types of package managers. The first is native packages. Native packages install to the root filesystem and are part of the core system.

    The second type of package manager is the portable format, like Flatpak. Flatpaks can either be installed system-wide or as a local user. The big difference is that they run in their own environment and have limited permissions. This is done by creating a sandbox that has its own filesystem, so that it is independent of the system. This is also what makes them portable, as that environment is the same no matter what.

    Technically snap packages are portable but you aren’t going to see much use outside of Ubuntu since the underlying architecture has so many flaws.

    • This is sometimes true.

      Go and Rust both (often) build single-executable binaries, often with very few (and, rarely, no) dependencies. It’s becoming more rare for developers to include proper man pages, more’s the pity, but things like man pages, READMEs, and LICENSE files are often the only assets packages from these languages include.

      If you’re installing with Cargo or go install, then even the intermediate build assets are fairly well-contained; go install hides binaries quite effectively from users who don’t know to include GOPATH(/bin) in their paths, because Go puts everything into a single tree rooted there.
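
      A rough illustration of that containment; the module path here is only an example, and any go-installable tool behaves the same way:

          go install github.com/junegunn/fzf@latest   # builds the tool and drops a single binary...
          ls "$(go env GOPATH)/bin"                   # ...into this one tree, and nowhere else
          export PATH="$(go env GOPATH)/bin:$PATH"    # only now is it callable by name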

      Libraries are a different matter; you get headers and usually more documentation. Interpreted languages are as you say: a great pile of shit spewed all over your system, and how bad that can be depends a lot on how many different ways you install them.

      Anyway, I’m not disagreeing with you, except that it’s a trend for newer compiled languages to build stand-alone binaries that you can usually just copy between systems and have it work.

    • Lost_My_Mind@lemmy.world (OP) · 6 days ago

      Now does flatpak get its programs from the same place that the terminal would? I’m still trying to grasp what’s even happening here. Because from my (limited) experience I like flatpaks more than any other method used so far, and am unclear why anyone would use the terminal if given the choice.

      As for snaps, I heard Ubuntu owns the technology behind snaps, and for some reason everybody hates snaps because Canonical owns it. Which I don’t get. As far as I know they don’t abuse snaps, and they don’t cause viruses or anything. So why would it matter who owns the technology behind them?

      • PetteriPano@lemmy.world · 6 days ago

        everybody hates snaps because canonical owns it

        We kind of like things to be open so that we can review, or replace, them. The Snap store is proprietary and controlled by Canonical. I don’t want my data collected and subject to Canonical’s EULA when using my choice of distro.

        Canonical has a history of making bad choices, so the level of trust is not very high. It feels like an attempt at embrace, extend, extinguish: get people hooked on snaps, then make snaps suck on other distros, that kind of thing.

      • Possibly linux@lemmy.zip · 6 days ago

        https://flathub.org/

        The reason most people don’t like snaps is fairly complicated. It started with Ubuntu forcing some basic packages to install as a snap instead of a native package. The thing is, snaps are not native packages, and because of this it caused major problems. These days a lot of the issues have been addressed, but there are still some serious design flaws. The biggest issue is that it is way overly complex and depends on a privileged daemon. The result is poor performance and a clunky experience.

      • banazir@lemmy.ml · 5 days ago

        Now does flatpak get its programs from the same place that the terminal would?

        I usually install Flatpaks from the terminal, but as to your question: no, the distro’s package manager and Flatpak have different repositories (servers with software packages) and formats. While distros like Fedora have their own Flatpak repositories, most people use Flathub. You can install apps as Flatpak on any distro that supports them, but native package managers generally don’t support other distros’ repositories.
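
        A small sketch of how the two live side by side on a Debian-ish system (the app ID is just an example):

            flatpak remotes                        # repositories Flatpak knows about (e.g. flathub)
            flatpak install flathub org.gimp.GIMP  # this comes from Flathub
            apt list --installed | grep -i gimp    # apt's database doesn't track the Flatpak at all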

        for some reason everybody hates snaps because canonical owns it.

        As I understand it, the Snap server software is proprietary and doesn’t support independent repositories, so you have to install snaps from Canonical. This is not exactly in line with Free (as in Freedom) Software principles. Canonical has made many questionable decisions in the past.

  • TheGrandNagus@lemmy.world · 5 days ago

    This is part of why these days I just stick to flatpaks. No fragmentation, same on any distro, I know where all the programs are going as well as all their config files.

    If I want to back up my flatpaks I can do so trivially.

    It’s a godsend. Way better than having a bunch of different formats everywhere, or the Windows-style mess where some programs are installed in some random XYZ directory, some in Program Files, some in Program Files (x86), with config files saved literally anywhere. Maybe it’s in one of the dozens of poorly laid out AppData folders, maybe it’s where the exe is, maybe it’s in Documents, maybe it’s in C:\, maybe it’s hidden in my user directory, etc. I’ve even seen config files saved to bloody OneDrive by default, leading to some funky app behaviour when I wasn’t connected to the internet, or when I ran out of OneDrive space.

        • iopq@lemmy.world · 5 days ago

          No, because Docker shows up in random places in your system and takes forever to set up compared to the actual program.

          There are a few repos for online management consoles, and the original version used a .sh file that installed in 30 seconds on a single-core free VPS. The Docker version took like two minutes, and when I uninstalled it, I still had traces of Docker on the VPS.

          • Jakeroxs@sh.itjust.works · 4 days ago

            There’s a lot of configuration you can do with Docker containers to ease the “where are my files” problem, and in my experience the prune commands work well to clean up leftover files and such.
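
            For instance, a rough sketch (the image name and host path are made up, and the prune command is destructive, so read what it lists before confirming):

                # keep the app's data somewhere you choose by bind-mounting it into the container
                docker run -d -v "$HOME/appdata/myapp:/config" myapp-image
                # and periodically clean up anything no longer referenced
                docker system prune -a --volumes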

  • Captain Aggravated@sh.itjust.works · 5 days ago

    You can install programs with sudo apt install (program). You can get programs with snaps. You can get programs with flatpaks. You can install with tar.gz files. You can install with .deb files. You can get programs with .sh files.

    APT, the Advanced Packaging Tool, uses .deb files. Speaking technically, APT is the app-store part of the system; the part that actually installs packages is called dpkg. If you have a .deb file and you want to install it, you can invoke dpkg directly. I have encountered things like printer drivers that the vendor will provide in .deb format. For one reason or another they’re doing it the way the Debian/Ubuntu standard package manager does it, but not publishing the files to the repositories.
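
    If you ever have to do that by hand, it looks roughly like this (the .deb filename is made up):

        sudo dpkg -i vendor-printer-driver.deb    # install the local .deb directly
        sudo apt -f install                       # pull in any dependencies dpkg couldn't resolve
        # or, with modern apt, let it handle both steps at once:
        sudo apt install ./vendor-printer-driver.deb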

    Snaps, flatpaks and appimages are intended to solve the problem of fragmentation. See, APT with its .deb files is the Debian system, used by most if not all Debian/Ubuntu forks. DNF with .rpm files is Red Hat/Fedora’s system. Arch uses Pacman with .pkg.tar files. SUSE, Nix, and a couple of others have their own systems, and none of them interoperate.

    One thing these do have in common is where they put the files they install; they are sorted by type and function into several directories under the root directory. Most of them you’ll find in the /usr directory, such as /usr/bin, /usr/sbin, /usr/lib or /usr/share. bin is for binary files; this is where executables like sh or bash or cat or grep live. sbin is for binaries that are only for those with superuser privileges; you have to switch users to root or use something like sudo to execute those. Non-executable files like libraries, data, assets etc. will be variously stored in lib, local, share and so forth. You may find there are duplicate folders, like /bin and /usr/bin. In the olden days of minicomputers these were often located on separate drives, but nowadays /bin is a symlink (basically a shortcut) to /usr/bin. Some of that is maintained for backward compatibility, or in case you still need to use the differences. It gets a lot more nuanced than I’m talking about here.
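
    You can poke at this yourself; on a modern merged-/usr, Debian-ish system the output looks roughly like this:

        which bash       # /usr/bin/bash: the executable lives in the shared bin directory
        ls -ld /bin      # shows /bin -> usr/bin, i.e. the old path is now just a symlink
        dpkg -L grep     # lists grep's files spread across /usr/bin, /usr/share/man, etc.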

    They aren’t installed in /home/$USER because if you have multiple users on the machine you probably want them all to have access to the software. Say you’ve got a husband, wife and two kids, would you want to have to install four copies of LibreOffice?

    If you do want to install software in the user’s home directory, the place to do that is probably /home/$USER/.local/bin
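
    Something along these lines (some-tool is a placeholder, and many distros already have ~/.local/bin on the PATH):

        mkdir -p ~/.local/bin
        cp some-tool ~/.local/bin/
        # if your shell doesn't already search there:
        echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc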

    =====

    This does indeed make it a pain in the ass to publish software for Linux. So people wanted one system that would work on most if not all distros. Of course, three competitors arrived.

    Snap is Canonical/Ubuntu’s attempt, which IIRC started with their embedded systems but which they’ve been pushing as their whole deal. It’s the closest thing to an Apple App Store there is: it’s owned by Canonical, the back end is proprietary so no one else can host the Snap repository, etc. A lot of folks don’t like that they kept some of it proprietary, plus there are performance issues with how it’s implemented; it mounts virtual disks for each app installed from Snap (or something similar to that), so it gets messy.

    Flatpak I think sprang up in or around the Fedora side of the world, but it has always had the goal of being omnicompatible. It is totally open; you can host your own Flatpak repo if you want, though the de facto standard repository is Flathub. At this point, I think Flathub has achieved the goal of being the one place where, if you have a commercial app you want to also publish a Linux version of, publishing it as a Flatpak will reach most of the Linux audience in one shot. There are a couple of downsides to it; for example, Flatpak isn’t a great way to distribute command line tools, but for GUI-based applications it works fine. Flatpaks are weird; they tend to install in /var somewhere, I think for reasons surrounding permissions.
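
    Day-to-day use looks roughly like this, using VLC’s Flathub ID as the example:

        flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
        flatpak install flathub org.videolan.VLC      # apps are addressed by reverse-DNS ID
        flatpak run org.videolan.VLC
        flatpak uninstall --delete-data org.videolan.VLC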

    AppImage is also around; the technique here is that it’s basically a .iso file that has everything the app will need to run in it, so you mount the .iso and run the executable file inside. No infrastructure or installer needed. There are package managers for it, but they aren’t required. I find AppImage to be the one to use if you’re making some small niche piece of software that 200 people in the world will ever want and you don’t want to bother with the repositories, but I don’t like using them for main distribution.
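
    Using one is about as simple as it gets (the filename is invented):

        chmod +x Some-App-1.2.3.AppImage   # mark it executable once
        ./Some-App-1.2.3.AppImage          # run it in place; deleting the file is the whole "uninstall"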

    =====

    Let’s change gears and talk about .tar.gz or .run or .sh files. These are non-standard ways of installing software on Linux and I would avoid them unless it’s a last resort type of situation.

    A .tar.gz file is functionally similar to a .zip file on Windows; it’s a TAR file (Tape ARchive. A bunch of files packaged into one continuous file, initially designed to be written to magnetic tape) which has been compressed with GZip. When software is distributed this way, it is up to the user to extract and store it…somewhere. Probably the best place to do this is in /opt, which is the standard location for pre-compiled software manually added to the machine by the administrator.
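
    A typical manual install of one of these might look like this (the names are made up):

        sudo tar -xzf some-tool-1.0.tar.gz -C /opt                    # unpack the archive under /opt
        sudo ln -s /opt/some-tool-1.0/bin/some-tool /usr/local/bin/   # optional: put it on the PATH
        # "uninstalling" later just means deleting those two paths again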

    .run files have basically no standard, can do damn near anything, and are a bad idea. They could be written in any language and do anything. You just don’t know. The one I encountered the most was Simplify3D’s installer; what they did was directly translate their Windows installer to Linux, the way Windows install wizards work, rather than do things in a Linux way. Hopefully you don’t have to deal with that.

    A .sh file is a shell script. Once again these can do anything, but it’ll be a text file full of shell commands that you can read and understand. It’ll probably consist of a series of commands like wget or curl, which download files from the internet, and then commands to put them where they need to go in the file system. Once again this tends to be the approach of the “I’m usually a Windows guy but here’s my Linux version” folks. This is going to choose where to put software for you, hopefully in /usr/local.
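
    The one safety habit worth having with these (installer.sh being whatever you downloaded):

        less installer.sh    # read it first; it's only shell commands, so you can see what it will do
        sh ./installer.sh    # run it only once you're comfortable with where it puts things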

    =====

    You may also find yourself compiling software from source code, which is what’s happening when you’re told to go to GitHub and then run a series of instructions like git clone, make, make install. Make is basically a scripting system specifically for compiling software; it will figure out what to do. I believe locally compiled software ends up in /usr/local/bin.
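
    The classic dance, with a made-up repository URL:

        git clone https://github.com/example/some-project.git
        cd some-project
        make                 # compile; some projects want ./configure first
        sudo make install    # typically copies the results under /usr/local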

    =====

    Summary: There are a lot of directories where software ends up, and it’s done the way it is so you can tell by where it is stored what it is, what it does, who has access to it, who manages it and how it arrived on the system. It also keeps automatic processes like the package manager from interfering with software manually installed or compiled by the users or system administrators.

  • Solumbran@lemmy.world · 6 days ago

    The idea is to not manage programs by hand to avoid messes.

    The multiple solutions come from various needs, but it’s more of an underlying complexity of installation than a real intent to have so many.

    Ideally we would need one clean package manager that handles everything without ever having to tinker with installation paths and various methods.

  • Nicht BurningTurtle@feddit.org · 6 days ago

    I don’t see the issue, since you rarely have to run a program by going to the location of the binary (AppImage and others excluded).

    • Lost_My_Mind@lemmy.world (OP) · 6 days ago

      I just like to understand things, and I like to be organized. Being organized helps me understand things.

      Another thing I’m not understanding is, if Android is just Linux anyways, why aren’t there PC distros that are just Android?

      I need a resource where my brain can actually ask all the questions. Youtube videos are kind of informative. My issue with them is they’re more along the lines of “here’s how to do this thing that only advanced users will even know the terms being discussed”. Whereas I’m like “Where is the uninstall button for programs?”

      • catloaf@lemm.ee · 6 days ago

        Usually right next to the install button. Or, if you used the command line, change apt install vim to apt remove vim.

        The best way to learn how to use something is, of course, the manual.

        • Lost_My_Mind@lemmy.world (OP) · 6 days ago

          Is vim a command? Or are you using it as an example of a program? I’ve only heard of “sudo apt install (program)”

          • Dudewitbow@lemmy.zip · 6 days ago

            Vim is a very common text editor. He’s just using it as an example of a program to install/remove.

            • Lost_My_Mind@lemmy.world (OP) · 6 days ago

              I’m on ZorinOS actually, but I’m not sure if it’s permanent. I’m going to be buying a bunch of smaller SSDs next month. Just trying a crapload of new distros. I haven’t landed on anything yet.

          • catloaf@lemm.ee · 6 days ago

            I literally just said to read the manual. It will tell you much more than you are asking.

            • Lost_My_Mind@lemmy.world (OP) · 6 days ago

              I’m still at work, so I’m not near my computer. Plus…I’m not sure which manual you mean. I didn’t mention my distro.

              • lime!@feddit.nu · 5 days ago

                “The manual” in Linux always refers to the man command. Run it with the name of a command as an argument and you will get a full description of how that command is used.
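
                For example (press q to quit the pager):

                    man apt    # the manual page for the apt command
                    man man    # the manual for the manual system itself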

        • Lost_My_Mind@lemmy.world (OP) · 6 days ago

          One of the big problems I see with Linux is the lack of software that people know. The response often seems to be “Well, we don’t have THAT, but we have this alternative…”. And the reason for that is that big-name software sticks to where it knows the userbase is.

          Android, on the other hand, IS where the userbase is. You’re either on iPhone or you’re on Android. So there’s a lot of software already available, ready to run. It would work the same on both your phone and PC, since it literally is the same apk. And it will always have support, due to being one of the main ways people use phones.

          The fact that Linux HASN’T found a way to use APKs and Android-based distros baffles me, as it already has a MASSIVE foothold in what people know. There’s so much potential there! Imagine plugging your cell phone into a desktop via a dock, or a USB cable, or even some wireless communication (not bluetooth or wifi), and suddenly your entire PC setup is actually running off your phone. For most people, their cell phone would be a good enough desktop if it had a desktop mode. I connected a keyboard with a trackpad to my Samsung A8 Android tablet and don’t feel the need for an actual laptop. I use Win-X Launcher to give it a traditional desktop launcher feel, and I’m happy with it.

          • ZoDoneRightNow@kbin.earth · 5 days ago

            I don’t understand what you are saying. Android is missing a bunch of stuff that Linux users rely on for a full desktop experience. They have completely different use cases. Android isn’t designed with desktop in mind; Linux is. A lot of the Linux apps I rely on aren’t on Android, and vice versa.

            • Lost_My_Mind@lemmy.world (OP) · 5 days ago

              A lot of the linux apps I rely on aren’t on android and vice versa.

              That’s exactly my point. I’m saying since it runs on the same format anyways, why NOT make it run on both? Then when someone uses an app on their phone, you could convince them to use that same app on a desktop, since they know it.

              And then, once it’s established to people that Android and Linux work together, publishers will start designing their Android apps like a hybrid. Eventually phones would just become Linux distros on a phone. And when you get home, you connect a mouse/keyboard, and switch to PC Mode. And eventually every Linux program would work on Android, and every apk would work on Linux distros.

              And both ecosystems would gain a huge amount of software.

              • lime!@feddit.nu · 5 days ago

                I’m saying since it runs on the same format anyways, why NOT make it run on both?

                It’s not the same format. Android is using an old Linux kernel, yes, but the two systems are not compatible at all.

                Interestingly, what you’re talking about exists: it’s called Samsung DeX. They are cancelling it because nobody uses it.

          • Possibly linux@lemmy.zip · 6 days ago

            You can run Waydroid for Android app support. I’m not really sure I understand what you are saying. “Big name” proprietary software will never come to Linux, as there is no incentive for companies to spend money on that. You technically can run pretty much any Android app on Linux, but that’s a privacy nightmare.

  • Goingdown@sopuli.xyz · 5 days ago

    First of all, in Linux everyone should only use software from the distribution repositories (e.g. via the apt command in Debian, Ubuntu and Mint, or the dnf/yum command in Fedora, etc.). Package managers will install software in a controlled way, and it is really easy to remove it too. And there is usually a GUI app for installing apps from the distribution repositories.

    The second way is to use Flatpak or Snap. They are pretty similar and will keep things easy.

    Do not install .sh packages or tar.gz files unless you really know what you are doing. These are only for expert cases.

    One fundamental change coming from Windows is that in Linux you should never worry about the location where software is installed (except for those expert cases, which you should not use). It will always be put in the correct places. In Linux, apps are split up so that executables go to /usr/bin, library files to /usr/lib64 and /usr/lib, other non-modifiable application data to /usr/share, etc. It takes quite a while to get used to, but in the long term it feels more natural than the Windows way of dumping everything into the app’s own directory.

    My recommendation would be to install some user-friendly distribution (Ubuntu, Fedora, Mint) and just go ahead with the default package management it offers. If you like the Android way of handling software, Fedora Silverblue is kind of like that: system upgrades are handled the same way, and applications are installed as Flatpaks.

  • insomniac_lemon@lemmy.cafe · 6 days ago

    I understand the fragmentation here, as you can get what you need in a format that works well enough.

    Different package formats often have technical differences. Recently I had the option to use something as a flatpak to reduce the lib32 dependencies on my system… but I didn’t go with that, as the other dependencies it needed (OpenGL, graphics driver, etc.) were redundant thanks to sandboxing (~2GB download!).

    Anything native from itch, GOG, or Humble doesn’t really ‘install’; it just gets extracted… so the files are what they are (portable, except game saves/user data likely won’t be). This also lets you run it off a slower, larger-capacity drive.

    EDIT: Also, if you need to compile something, it will probably also just compile to wherever you put it (into a bin folder).

    Non-system stuff like this is more viable for things that you don’t need updated frequently or ever (particularly games/software that are post-development). For sure, most of the time the best experience is via your package manager.

  • Auster@lemm.ee · 6 days ago

    I think that while, yes, fragmentation hinders a system, it is also its saving grace, as it stops a given family of systems from growing into what made the competition problematic.

    Taking the Program Files folders as an example: they have limited read/write permissions on Windows, so whenever possible I try to install programs into a folder I make in the root of C:. But more and more (since at least Windows XP, from what I could observe), Microsoft is training users into using only the user folders, and fewer and fewer programs give an option to install elsewhere, installing only into the Program Files folder instead. Meanwhile, on Linux Mint (my distro of choice), if an AppImage (my go-to medium for programs) isn’t working well, I can always fall back to other means, such as APT directly, downloading .deb files and extracting them, getting it from Flatpak, compiling it myself, building a custom AppImage, running it in a VM or emulator, or, in the worst case, dual-booting Mint and some other distro.

    Also, although there are many package managers, in my experience they usually work similarly. Some changes in syntax, options and names, but nothing outlandish. It would be, I think, like someone learning a language close to their mother tongue. And from experience, you can even organize installations in a more standardized way, although it will take some effort on your part to figure out how, since some adaptations may be needed (Java 8 and SDL PTSD intensify).

    And lastly, from what I can observe, stuff in Linux more often than not shares logic or even methods with a lot of other stuff in the system. Dunno if it’s a bit of bias from someone who’s been using Linux for a few years already, but the fragmentation usually feels superficial to me, with distros mostly being tweaks of the ones they stem from, and major changes being more noticeable when distros are sufficiently far apart.

  • Jumuta@sh.itjust.works · 5 days ago

    There are different ways to install things because they each have use cases in which they’re better than the others (or used to have).

    • binary package managers (e.g. apt): fast and lightweight because they only download/install the necessary binaries

    • flatpak: can be installed on any distro, but takes up more storage space because they’re installed in a sandbox and all the dependencies are also installed with it, for every application

    • snap: same thing as flatpak but a bit worse, though some applications are only packaged for snap because Canonical paid a lot of big companies to package for snap (they didn’t incentivise against flatpak, they just didn’t fund flatpak)

    • appimage: the ‘Windows exe’ kind of thing; it has all the dependencies bundled, so it’s distro-agnostic, but you have to manage the AppImage files yourself unless you get a manager for them, and you can’t update them centrally like you can with other stuff

    • source code repos (e.g. the AUR): you have to compile every new version yourself on your machine, so it’s slow to update, but they often offer things not in the binary package manager

    • .sh files for installation: idk why these are used, they’re just annoying. a lot of proprietary software from corpos use them (probably so they can verify dependencies themselves and not trust the system)

    • binary files (e.g. .deb): same thing as with appimage, except they’re not distro-agnostic

    • tar.gz: just a compressed archive format, like zip

    • Daemon Silverstein@thelemmy.club · 5 days ago

      .sh files for installation: idk why these are used, they’re just annoying. a lot of proprietary software from corpos use them (probably so they can verify dependencies themselves and not trust the system)

      GOG (Good Old Games) distributes the games using a .sh file containing all the binaries and assets needed for the game. It’s strange to think of, but the binary data coexists with textual shellscript instructions, thanks to the exit instruction (which ensures that the shell won’t try to interpret the binary data) alongside some awk/grep/tail wizardry to extract the binary data from the current shellscript.

      It’s probably because .sh can run on any distro, since every distro has a shell interpreter. Also, they don’t need to be compiled (unlike an .appimage, for example); it’s just a .sh merged with a binary archive (possibly a .tar.gz).
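
      A toy version of that trick, just to show the shape of it; the marker name and paths are invented, and real GOG installers are far more elaborate:

          #!/bin/sh
          # toy self-extracting installer: script text on top, binary payload appended below
          INSTALL_DIR="${1:-$HOME/Games/mygame}"
          mkdir -p "$INSTALL_DIR"
          # find the first line after the marker; everything from there on is a gzipped tar archive
          PAYLOAD_LINE=$(awk '/^__PAYLOAD_BELOW__$/ { print NR + 1; exit }' "$0")
          tail -n +"$PAYLOAD_LINE" "$0" | tar -xzf - -C "$INSTALL_DIR"
          exit 0   # stop here so the shell never tries to interpret the binary data
          __PAYLOAD_BELOW__
          (binary .tar.gz data appended here)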

    • DarkThoughts@fedia.io · 5 days ago

      .sh files are shell scripts; they’re comparable to Windows batch files or newer PowerShell scripts. They can be useful for tools with lots of dependencies, which they then download on their own, so you often see them when you want to install something like LLM tools from GitHub or whatever. They’re easy to put together and easy to edit, even for the user themselves, unlike a precompiled installer.