Posted July 01, 2018
Sorry, I'm hitting space limits, so I'm going to have to cut your quotes short.
So, as you're about to say, big companies can't afford as many risks, but they also handle ambitious projects? Ambition is risk. There's, unfortunately, a fundamental conflict within your own logic. You're free to think whatever you will, but please ask yourself whether you might have a stake in this.
Let's take your Microsoft example... too big for its own good.
The guy in question, and his team, were hired to make Windows Vista look like OSX. His biggest problem was that he couldn't get confirmation on which department was responsible for freeing which resources (allocated RAM). As such, every time a meeting took place, the decision would change. The guy has now gone down in history as "the guy who only wrote a few lines of code over the course of a whole year," which in turn highlighted the problems with measuring programmers by "lines of code." There were major communication issues within the company, because Microsoft was more interested in making sure certain people (usually supervisors) kept their jobs. When open source, you're right there's ...things.
So, in other words, it's bad because it's a free market of ideas? Let every idea flourish, because a lot of people have different needs. Someone like myself really likes LXDE; it fits my need for using few system resources. It's not very user friendly, though, but that's the trade-off. Meanwhile, Gnome2 was good for people who liked pretty interfaces, which was also good for kids, because all the flashy compositing kept children's attention when viewing the screen. Then you have people like my girlfriend, who absolutely loves the whole Windows 8 interface, so Gnome3 was right up her alley, because it was something she was used to from using her cellphone all the time (to be fair, she stuck with Windows due to familiarity). Everyone has different needs, and thus these different products are GOOD. If you have two ideas that are very similar, people will debate, fight, and argue over the two, and the winner will come out on top (VIM!). When Bill Gates ... it.
Android is Linux mixed with closed-source software. What's cool about it is that if you install the Termux app, you get the Linux shell back. It's amazing what you can pull off with it, too. If it wasn't for Linux and distros being a thing, we wouldn't even have Android; we'd have iOS, and MS sure as hell would still be just as far behind the ball as they've been so far. It's kinda funny, though: despite all their opposition in the past, Microsoft has spent the past few years trying to buddy up with open source software. They just can't pay the devs enough, or hire enough of them, to deal with all the problems. Open source is a great way to use the customers' own needs to get a product developed without having to pay anyone to do it, which is also what fueled a lot of development on Windows: Microsoft went out of their way to point out that they allow us to develop programs for Windows (I remember reading it in one of the EULAs).
With Linux you might ... anything serious.
People in general are only equipped to do small things, where open source gives more freedom, which is why MS is behind the ball. The "do something small and do it really well" idea actually came before the big push for open software. It's known as "the Unix philosophy," and it was so good at making development happen, even in a closed-source environment (since it understood humans, computers, and how they interact), that most OSes (including OSX) are based on it. The OS itself is stable and sound, while the rest you see is everyone else doing their own thing, and those projects aren't really that old. Volunteer work takes a lot of time, and the fact that they've managed to get all this accomplished is quite a feat when compared to your Apple and Microsoft.

Microsoft, for example, is hanging on to really, really old code. I remember reading that, right now, they're having huge issues with third-party Office modules: they're trying to develop a new way to replace the legacy code, but it's so ingrained into the base that they're having trouble dislodging it. There weren't enough alternatives to keep them in line. They've even backed out of it at one point, and it's a huge problem for the mobile versions: the legacy code seems to be tied to x86 CPUs, so they can't make it work on mobile platforms. This is going to be a big problem for them moving forward, as more and more people want to use office tools during their commute and show presentations on the go, which has been something everyone's been aiming for as long as I can remember. OpenOffice/LibreOffice is much, much closer to this goal, but I have a feeling someone else will beat both of them to it first (Google, most likely). Once you start getting into bigger ... DRM imho)
QEMU's emulation is slower, but I've found it to be far more reliable. When I was doing my own kernel development, I remember we (my friend [who was instructing me] and I) had to switch to QEMU just to get the graphics driver to work like it did on real hardware (our code was working on the real thing, but not in VMware). Sure, VMware has its speed bonuses, which is great if you're trying to run a specific OS because you need that OS (which you most likely wouldn't if the software were open source, since someone could port it for you), but it's really bad if you need an accurate x86 emulator. Meanwhile, QEMU seems to be the de facto standard emulator for people doing microcontroller development who aren't ready to invest in hardware. Another example ...3DSMAX in a heartbeat.
The comparison is much like Windows vs Linux: 3DSMAX is like Photoshop, while Blender is like GIMP. The projects are more... infantile? Another one I would like to personally ... Reflect
Depends, actually. I never used Clonezilla, but usually anyone who's doing that kind of thing shouldn't be too afraid of the terminal, at which point the OS itself handles it quite well (all hardware is a device, so you can, for example, vim your USB drive and look at the raw data on it). I personally prefer ddrescue, and have used it in a pinch. It's based on dd, which is great if you aren't doing forensics. My guess, from looking at screenshots, is that Clonezilla is just a frontend for dd. As for Reflect, it has this thing called VSS, which allows it to image your computer while it's in use. You can read more about it here: https://forum.macrium.com/Topic24066.aspx#24104
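If you've never tried it, here's a minimal sketch in Python of the "all hardware is a device" point. The device path is an assumption (point it at your actual drive), and you'd need root to open it:

```python
# A minimal sketch: on Linux, block devices are just files, so you can
# read raw bytes straight off a drive. /dev/sdb is an assumption --
# substitute your own device, and run as root.
DEVICE = "/dev/sdb"

with open(DEVICE, "rb") as disk:
    sector = disk.read(512)  # first sector: the MBR on legacy-partitioned disks

print(sector[:16].hex(" "))  # peek at the raw bytes
print("MBR boot signature present:", sector[510:512] == b"\x55\xaa")
```

That same trick is why dd and ddrescue can exist at all: they're basically just reading and writing these device files.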
Most will, including Linux. However, it's discouraged, and for good reason: if the disk is in use, changes can occur, which can ruin your backups. Imagine a file entry that points to a certain block on the disk; then you save a file, and the OS doesn't write it back to the same block (which is common, and is what causes disk fragmentation, a huge problem that plagues NTFS). Now you decide to go watch a YouTube video: it loads thumbnails. That previous block (which the backup is currently pointing to) is now occupied with data from those thumbnails. If your backup program got the first reference to the file but not the block that actually stored its contents, then when you restore your backup, your important source code looks like gibberish that conveniently starts with the letters "JPEG." In theory, imaging software should be able to catch this kind of thing while it's running, but reality is quite different, and that's why, unlike Linux, Windows requires that files be locked while in use. Imagine, too, that an important photograph you wanted to save just got replaced with the wonderful two letters "MZ".

However, if your program is smart, it'll read the filesystem and copy the files, rather than making a raw image. This is also bad, because oftentimes you make images precisely because of "unlisted files," which is what "forensics" is about. The idea is to recover files that were deleted without being sent to the recycle bin (a virus, shift-del, etc.). Any program that follows the chains and reads the sectors immediately, to prevent the race condition described above, would then in turn not be able to read unlisted files, and would fail at the job of computer forensics.
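To give a concrete taste of what reading "unlisted files" means, here's a minimal file-carving sketch in Python. It ignores the filesystem entirely and scans a raw image for JPEG magic bytes, which is how deleted files can still be recovered as long as their sectors haven't been overwritten. The image filename is an assumption, and real forensic carvers are far smarter about validating what they find:

```python
# A minimal sketch of file carving: scan a raw disk image for JPEG
# signatures without consulting the filesystem at all, so even files
# that no directory entry lists anymore can be pulled out.
JPEG_START = b"\xff\xd8\xff"  # JPEG files begin with this marker
JPEG_END = b"\xff\xd9"        # ...and end with this one

with open("disk.img", "rb") as f:  # "disk.img" is assumed to be a raw image (e.g. from dd)
    data = f.read()                # fine for small images; use mmap for big ones

pos = 0
found = 0
while (start := data.find(JPEG_START, pos)) != -1:
    end = data.find(JPEG_END, start)
    if end == -1:
        break
    found += 1
    with open(f"carved_{found}.jpg", "wb") as out:
        out.write(data[start:end + 2])  # keep the end marker
    pos = end + 2

print(f"carved {found} candidate JPEGs")
```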
One thing about the ... even have forks.
Emulators are a niche market, though. One last thing ... gaped PC.
You can, actually. Most people only know how to use the repositories, but package managers actually work on files, too. You can safely air-gap Linux computers; it's just more tedious (though no more tedious than Windows). Some desktops even have support for double-clicking their respective package files. MS has been hacked a lot, though. Same with Mac. This is why MS is being so tyrannical.
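For what it's worth, a minimal sketch of that "package managers work on files" point, assuming a Debian-family distro and a hypothetical .deb you carried over on a USB stick:

```python
# A minimal sketch: installing a local .deb on an air-gapped Debian/Ubuntu
# machine by handing the file straight to dpkg -- no network repo involved.
# The package filename is hypothetical; substitute whatever you carried over.
import subprocess

pkg = "somepackage_1.0_amd64.deb"  # copied over on a USB stick
subprocess.run(["sudo", "dpkg", "-i", pkg], check=True)  # dpkg installs directly from the file
```

dpkg won't fetch dependencies for you, of course; on an air-gapped box you'd carry those over too, which is the "more tedious" part.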