DrakeFox: Just to avoid this being a complete derail of the topic I'll try to be brief without having quotes and quotes and quotes.
Hmm, I thought we were discussing multiple smaller topics. If I used a single response, I would very probably derail your initial post by ignoring or overlooking some arguments and building up a pile of my own... This is why I went with partial quotes...
Yes, when you receive a heavily quoted post, there is an impression that someone is trying to prove something, because it produces posts that are hard to process; you start weighing your time and whether a response is needed at all - as if your discussion opponent were some kind of troll.
That's not what I was personally up to. I have my experience, you have yours. It's pointless to talk someone into something he disagrees with. I am just sharing my POV; nobody is obligated to anything.
DrakeFox: I do end-user support at a school, and sadly often see users who come in with a dead hard drive and no recovery image. No sticker, and a Windows 8 ISO will refuse to run on the system. MacBooks have it the best in that regard: an internet connection and Apple ID required, and that's about it.
That's really strange, because it should activate using the BIOS/UEFI-embedded key - possibly requiring a telephone call. Can you please give me some search keywords? I am curious why the installation would fail.
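If you ever want to verify that such a machine really carries a firmware-embedded key, here is a rough sketch (command availability depends on the firmware and Windows edition, so treat it as an assumption):

From an admin command prompt on Windows:
wmic path SoftwareLicensingService get OA3xOriginalProductKey

From a Linux live system - the OEM key sits in the ACPI MSDM table:
# strings /sys/firmware/acpi/tables/MSDM

If the MSDM table is missing, setup has no key to pick up automatically.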
DrakeFox: While it's true Linux has had a Software Center by default for a long time, I find it actually runs counter to some of the ideals of Linux.
Well, by that I meant "package management system". There are many different ones, and each has its own approach that forms its very core. A few contrasting examples (a rough command sketch follows the list):
- apt/dpkg (Debian, Ubuntu, SteamOS) - binary; installs are fast and inter-package problems are rather easy to handle. They do allow automated building from source, but that is rather limited to a few packages instead of the whole system (which may be the serious downside). They accept mixing of architectures (x86 and x64) and mixing of sources, and usually resolve even those conflicts well. But when some newer package requires a newer library that conflicts with an older library other packages depend on, there is nothing the owner can do but wait until it is packaged - or try it himself, which is much harder than a typical install and may cause the owner to flee to another distro.
- portage (Gentoo, Funtoo) - a completely different bag of nails. It lets the owner define which target package he wants and with which features, then calculates all the necessary dependencies and starts building them from source. Why? Because this way it has unparalleled flexibility in combining any software together. The problem is that the whole spectrum of problems around actually compiling software is also thrown into the bag and mixed with the typical package-dependency problems. The "recipe" repository is rather large and includes many versions of different packages, as well as unofficial additional trees. Portage supports creating binary packages, but its ability to resolve dependencies between binary packages is limited (as of 2012). Other projects decided to expand on that (Sabayon), but I personally found it not as flexible as apt, and improper use may cause a serious mess. The situation from the previous point is just a typical case for portage and is usually resolved automatically - but the owner still needs to know and understand which path to pick. On the other side, if some system library is updated, you have a typical 5 GB of source software to recompile - automated, but (CPU) time consuming (it can do that in the background, actually).
- pacman (Arch and Arch-based) - behaves like apt/dpkg, but cuts down on automation. It expects the user to understand the problem and carry out the action. Example: where apt/dpkg would automatically add new user groups to the system (because newer software requires it), pacman would only mention that. Furthermore, it offers the ability to automatically compile and install software from a source repository similar to the one portage uses, but with much less automation. With ABS, the features a package should enable are defined in the build script (PKGBUILD) and not passed as an option to the package manager (a "use flag" on Gentoo). That means... a much faster, simpler system, which is more flexible than portage and faster than apt - but the downside is that it requires a lot more attention to individual packages, as well as to system configuration (including knowledge about it). Personally, I found such a system to be very simple, but the amount of such "simple" actions was over the top. I went back to using (and fighting against) additional layers of automation.
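To make the contrast concrete, here is a rough sketch of what installing one package looks like under each approach. Package names and USE flags are placeholders, not tested commands:

apt/dpkg:
# apt-get update
# apt-get install somepackage
# dpkg --add-architecture i386
(the last one enables mixing 32-bit packages into a 64-bit install)

portage:
# echo "app-misc/somepackage someflag" >> /etc/portage/package.use
# emerge --ask app-misc/somepackage
(first pick the features via USE flags - on newer installs package.use may be a directory, so drop a file in there instead - then portage resolves the dependencies and compiles everything)

pacman/ABS:
# pacman -S somepackage
$ makepkg -si
(a binary install like apt, or build from a PKGBUILD you adjusted yourself and install the result)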
As you see... there is no fragmentation. It's the actual goal that defines the approach, defines the tool, and builds the system around it.
DrakeFox: If any OS, Linux is the one with the spirit of "you can run whatever you want without asking anyone permission", so it's a bit of a paradox that in order to be sure some software you want to run is going to work on your system you need to wait for those who maintain your distro tree. (Heck on both Ubuntu and Mint installing LibreOffice 5 you need to fiddle around in a terminal with extra PPAs to get it to even show up in the software center)
That is a situation typical of the binary approach.
You can include Windows there. They ship an update, a great amount of software stops working. Would you attempt to re-compile that software yourself in VS against a newer library? Or would you rather draw their attention so that they "fix their stuff", and wait for that?
By "fiddling around terminal" you are actually doing that. "Fiddling around" Visual Studio to re-compile the project for newer library that is force-shipped would be much more time-consuming.
In both cases, this is actually not what you should have to do - unless you explicitly want it right here and right now.
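For reference, on Ubuntu/Mint that "fiddling" usually boils down to three commands (I am assuming the commonly used LibreOffice PPA here, not necessarily the exact one DrakeFox meant):

# add-apt-repository ppa:libreoffice/ppa
# apt-get update
# apt-get install libreoffice

One extra package source, and then it behaves like any other binary package.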
Gentoo would laugh at that:
# emerge "=app-office/libreoffice-5*"
Portage will install the newer libraries into a new "slot" and use them in parallel.
.... and after 2 hours of compilation in the background, you have it. There is no guarantee it runs, but usually it does. =)
The same applies to pacman (Arch and Arch-based). The newer version (or the recipe to build it) would be available much, much faster, but there can be harder problems to solve along the way.
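On Arch that typically means grabbing the build recipe from the AUR and building it yourself; the package name below is just a placeholder:

$ git clone https://aur.archlinux.org/somepackage.git
$ cd somepackage
$ makepkg -si

makepkg pulls the build dependencies via pacman, builds the package and installs the result.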
DrakeFox: Sure, for much software you can download and compile it yourself, or maybe even find binary packages, but the fragmentation once again means it might work out of the box, or need some serious tinkering.
Well, I think what I described above pretty much explains that there is no "fragmentation".
If you really want it to work well, and with little effort at that, then it's a binary distribution with a long support window and stable (outdated? well-aged? you decide) software. Still, much of the (user-facing) software is available in newer versions as backports, if there is a need for that.
That's about the equivalent of using Windows XP... until something newer stabilizes.
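On Debian, for example, enabling backports is two lines plus an explicit install (the release and mirror names are assumptions for that time frame):

# echo "deb http://ftp.debian.org/debian jessie-backports main" >> /etc/apt/sources.list
# apt-get update
# apt-get -t jessie-backports install somepackage

The -t switch explicitly picks the backported version; everything else stays on the stable branch.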
DrakeFox: As a user experience, Linux might be a lot easier depending on your user interface (personally, I found KDE4 cumbersome to work with some years ago; especially the Plasma bits seemed like pointless eye candy).
I really loved KDE3 at some point, then I used GNOME2 for a big chunk of time and started hating it (KDE3, that is). When KDE4 came out, it (the Plasma component mostly) crashed and memory-leaked so often it was not funny. Right now KDE4 is very stable, very functional, fully translated. And outdated - KF5 is knocking on the door with double the RAM requirement (still half of what Windows 10 uses), and the Plasma crash fun all over again. =)
DrakeFox: And once you get enough knowledge of the workings of the OS, it's most likely easier to fix when it breaks down, since you don't have to guess at closed-source internals. But while Windows is a lot more crash prone, it's gotten quite good at repairing itself (though it would be preferable to not have to do so).
I know Wine is an API implementation, and it's a great solution for running software, but API or not you're still running into some emulation of the environment (drive_c for instance), which may or may not cause issues. It works quite well often enough, but it still may require hacks which aren't required to run the executable in a native Windows environment (I took a look at the Wine page for Bioshock since I kinda fancied playing that again: categorized Bronze, it installs and runs, and you apparently need to do a bit of emulated registry hacking to not have mouse issues).
For me that's acceptable.
For a good deal of end users, where reading a 3-step guide to something is beyond what's reasonable to ask, it's not.
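For what it's worth, the usual mouse fix suggested for such games under Wine is forcing DirectInput to grab the cursor. A minimal sketch (whether Bioshock needs exactly this key is my assumption):

$ cat > mousegrab.reg <<'EOF'
REGEDIT4

[HKEY_CURRENT_USER\Software\Wine\DirectInput]
"MouseWarpOverride"="force"
EOF
$ wine regedit mousegrab.reg

That imports one key into the Wine-emulated registry, nothing more.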
DrakeFox: And that video... I think I spaced out because of the text-to-speech voices. Personally, the UI I like the most at the moment seems to be Cinnamon (on Mint). It's basic and fairly lightweight and not overly complicated to set up. But hey, why choose? You can run Cinnamon, MATE, and Unity on the same system. Maybe even KDE, though last time I installed KDE from the software repo, after first login to the KDE environment it managed to render the Unity shell unloadable (Ubuntu, about 2 years ago).
Well, that sounds like an interesting bug. Probably the PPA had a different library version and installed it, causing a segmentation fault in Unity. That's worth reporting, if the PPA is maintained. Otherwise, ask on the official Ubuntu IRC channels. I have not used Ubuntu (full-time, that is) since 2011, so I am not sure whether KDE is in the official repository. If it is, it's always better to stick to that, unless something is really wrong with it. Well, basically, what I said a few paragraphs above about binary distros.
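If you want to check whether a package comes from the official repository or from some PPA before installing it, apt can tell you (kde-plasma-desktop as the metapackage name is an assumption for that Ubuntu release):

$ apt-cache policy kde-plasma-desktop
# apt-get install kde-plasma-desktop

The policy output lists every source offering the package and which one wins, so you can see in advance whether you are about to mix a PPA build into the official desktop.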