Tallima: It even has 50c microtransactions for when you want a child's account added.
PookaMustard: What in the seven hells is this 50c microtransaction you keep talking about? I went through the entire process and I was never charged; in fact that functionality was working without a single extra cent. Where do you come from?
http://windows.microsoft.com/en-US/windows-live/family-safety-why-does
Tallima: That's simply not true. It *is* a game. You get to search high and low like a hidden objects game to find the off-switch to "features" like "reboot your PC without asking you," "shut down your PC without asking you," "download and install updates with the full load of your Internet bandwidth and processor without telling you," "useless voice search" and whatnot.

It even has 50c microtransactions for when you want a child's account added.

If that's not a game, I don't know what is!
Painted_Doll: Too bad you can't return this game to the publisher or ask for a refund. :)
I tried to go back to 7, but it was too late. :(

I'm learning all about Mint though!
Post edited November 02, 2015 by Tallima
Windows 10 is unable to execute privacy.exe, OS already in use by Microsoft.
Post edited November 02, 2015 by phaolo
Strange that this page exists, yet I was never asked to spend an iota to make this child account work, complete with the time curfew! But as the page itself implies, COPPA requires that the 50c be charged. So if anything, you should deal with COPPA first, and boy do I hate them. But then there's the question that presents itself:

Why am I not charged?
PookaMustard: Strange that this page exists, yet I was never asked to spend an iota to make this child account work, complete with the time curfew! But as the page itself implies, COPPA requires that the 50c be charged. So if anything, you should deal with COPPA first, and boy do I hate them. But then there's the question that presents itself:

Why am I not charged?
I'll look back into it. If there's a way to do it without an MS account, then you wouldn't need the COPPA charge. When I tried to make it, it said that for all the features I wanted (Internet controls was the biggest thing), I needed an MS account -- which clearly makes no sense to me.

On the Mint front, pretty much all of the games on my to-play-next list are fully working on Mint. Yippee!
Regarding the initial issue,

ShellExecuteEx is likely used by the installer to run a helper library or a second installer process. It's a way of asking Windows to launch something new -- think of it as running a command through the Run dialog for you. The target doesn't have to be an executable; it can be anything Windows can open from the Run dialog, like a path to a file or a URI.

The method's documentation lists the possible error codes, and most of them center on a file not being found, not being accessible, there not being enough memory available, or the file having no association, so Windows doesn't know what to do with it. Code 32 itself is documented as a missing DLL, which doesn't obviously fit here though.
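As a quick decoder for those return values, the documented SE_ERR_* constants can be put in a small lookup table. This is just a sketch; the values and meanings are as listed in the Win32 ShellExecute documentation (ShellExecute treats any return value of 32 or below as an error):

```python
# Documented ShellExecute/ShellExecuteEx error codes (values <= 32 are errors).
SE_ERR_CODES = {
    0: "The operating system is out of memory or resources",
    2: "File not found (SE_ERR_FNF)",
    3: "Path not found (SE_ERR_PNF)",
    5: "Access denied (SE_ERR_ACCESSDENIED)",
    8: "Out of memory (SE_ERR_OOM)",
    26: "Sharing violation (SE_ERR_SHARE)",
    27: "Incomplete or invalid file association (SE_ERR_ASSOCINCOMPLETE)",
    28: "DDE transaction timed out (SE_ERR_DDETIMEOUT)",
    29: "DDE transaction failed (SE_ERR_DDEFAIL)",
    30: "DDE is busy (SE_ERR_DDEBUSY)",
    31: "No application associated with the file type (SE_ERR_NOASSOC)",
    32: "A required DLL was not found (SE_ERR_DLLNOTFOUND)",
}

def describe_se_err(code: int) -> str:
    """Translate a ShellExecute return code into a human-readable message."""
    if code > 32:
        return "Success (return value > 32)"
    return SE_ERR_CODES.get(code, f"Unknown error code {code}")
```

So an installer reporting "32" from ShellExecuteEx would, per the documentation, be complaining about a DLL it couldn't find.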

A few quick things to try: make sure the installer has finished downloading before you run it, and that you're not running it out of a temporary folder which gets deleted before the installer finishes. Also try running it with administrative privileges by right-clicking the file and choosing Run as administrator.

As mentioned above, if "process already in use" shows up, the Galaxy installer may already be running in the background without a visible window, which could make it refuse to launch another instance. Try opening the task list (Ctrl+Shift+Esc) and, under the Details tab, look for setup_galaxy_1.1.5.28.exe. If you can't find the Details tab, you might need to click the More details button (or similar -- sorry, Danish OS, so I don't know the exact English label) to see the tabs. If it's running, right-click it and select End process tree to make sure any sub-processes are killed along with the main setup.
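The End process tree step can also be scripted; on Windows, `taskkill` with `/T /F` kills a process and its children. A small sketch that only builds the command (actually running it obviously needs a Windows machine):

```python
def kill_process_tree_cmd(image_name: str) -> list[str]:
    """Build the Windows taskkill invocation that ends a process and all of
    its children: /IM selects by image name, /T includes the child process
    tree, /F forces termination. Returned as an argument list, ready to be
    passed to subprocess.run() on Windows."""
    return ["taskkill", "/IM", image_name, "/T", "/F"]

# For the installer mentioned above:
cmd = kill_process_tree_cmd("setup_galaxy_1.1.5.28.exe")
```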

And of course, the old joke: "Have you tried turning it off and on again?"

As for Tallima rolling back to Windows 7: I hope you have the installation media for the computer, so you can wipe it entirely and go back to a clean 7. Otherwise Mint might be the solution, but before wiping your disk and throwing Linux on there, make sure you at least have some sort of Windows install media in case you decide you want to go back again... just in case. I've been fiddling with Mint and got Freedom Planet running quite well on an old 1 GHz HP notebook from 2006. It runs rather well, but there's a bit of a learning curve once you start running into trouble.
PookaMustard: Strange that this page exists, yet I was never asked to spend an iota to make this child account work, complete with the time curfew! But as the page itself implies, COPPA requires that the 50c be charged. So if anything, you should deal with COPPA first, and boy do I hate them. But then there's the question that presents itself:

Why am I not charged?
Tallima: I'll look back into it. If there's a way to do it without an MS account, then you wouldn't need the COPPA charge. When I tried to make it, it said that for all the features I wanted (Internet controls was the biggest thing), I needed an MS account -- which clearly makes no sense to me.

On the Mint front, pretty much all of the games on my to-play-next list are fully working on Mint. Yippee!
Erm... correct me if I'm wrong, but don't you just need the admin account to be connected to a Microsoft account? Then you can create local accounts for the kids under Control Panel -> User Accounts -> Manage other accounts?

OR

the best way that always works: GPEDIT.MSC
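Along the same lines as dewtech's suggestion, a local child account (no email, no credit card) can be set up from an elevated command prompt with the classic `net user` command, which even supports logon-hour restrictions. A hedged sketch that only builds the commands -- the account name and curfew hours here are made-up examples:

```python
def child_account_cmds(username: str, hours: str = "M-F,16:00-20:00") -> list[list[str]]:
    """Build the classic 'net user' commands for a local (non-Microsoft)
    child account: create it, then restrict its logon hours - the
    old-school time curfew. Run the results from an elevated prompt
    via subprocess.run() on Windows."""
    return [
        ["net", "user", username, "/add"],          # create the local account
        ["net", "user", username, f"/times:{hours}"],  # limit when it can log on
    ]

cmds = child_account_cmds("KidAccount")
```

This only covers logon times, not the web filtering or game timers the Win10 family features promise, but it works entirely offline.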
Tallima: I'll look back into it. If there's a way to do it without an MS account, then you wouldn't need the COPPA charge. When I tried to make it, it said that for all the features I wanted (Internet controls was the biggest thing), I needed an MS account -- which clearly makes no sense to me.

On the Mint front, pretty much all of the games on my to-play-next list are fully working on Mint. Yippee!
dewtech: Erm... correct me if I'm wrong, but don't you just need the admin account to be connected to a Microsoft account? Then you can create local accounts for the kids under Control Panel -> User Accounts -> Manage other accounts?

OR

the best way that always works: GPEDIT.MSC
I need an email (my 6-year-old doesn't have one) or to create an MS account to get started. I created an MS account (he had no email) and it finished with "please insert credit card."

I just tried it again.

You can make an account (he has one on this PC), but I wanted to use the Win10 features that you're supposed to be able to use with minors.

If there is a way to do it without a credit card, then that's a testament to how crappy Win10 is. I'm no novice when it comes to computers. If anyone is able to create an account that has internet controls and game timers and whatnot on it without a credit card, let me know. Mine ain't workin'.
DrakeFox: As for Tallima rolling back to Windows 7: I hope you have the installation media for the computer, so you can wipe it entirely and go back to a clean 7. Otherwise Mint might be the solution, but before wiping your disk and throwing Linux on there, make sure you at least have some sort of Windows install media in case you decide you want to go back again... just in case. I've been fiddling with Mint and got Freedom Planet running quite well on an old 1 GHz HP notebook from 2006. It runs rather well, but there's a bit of a learning curve once you start running into trouble.
That's already the third Windows I've wiped from a hard drive permanently, replacing it with Linux after saving the product key - in case the hardware gets sold. I don't understand in the slightest what you're claiming here - Linux install images are fully bootable OSes, with a web browser, network card drivers, sometimes even an office suite. And from 2006? Why not put Windows on it and get a modern dedicated machine for Linux? Seriously, I can't understand you guys who make a horror drama out of a children's cartoon (Mint), because the first thing you do is go for dual-boot with all its complications. This is neither Gentoo from stage3 nor a Debian netinst - it installs in under 15 minutes. The amount of fail-safes and automation in Debian perhaps surpasses Windows itself, if you don't count dialog boxes and Next buttons. It won't allow you to remove the kernel or any vital component of your system. I mean, nobody starts breaking stuff to learn something without understanding the consequences. Don't do that on your production machine! Just use the software and create documents. For wild experiments, there is always a virtual machine: VMware, VirtualBox, QEMU - you name it. You're not breaking your Windows host install either (and you have a Linux copy, just in case... lol)

By the way, if you really want it fast and brutal, you just need three things:
1) a VM
2) Linux From Scratch in your browser
3) KeepNote or similar
In about three weeks, you will be complaining on the Mint forums about how "bloated" it is.
What kind of world is this that says Windows 10 isn't for gaming?

Please, for the sake of humanity, use your brain.
DrakeFox: As for Tallima rolling back to Windows 7: I hope you have the installation media for the computer, so you can wipe it entirely and go back to a clean 7. Otherwise Mint might be the solution, but before wiping your disk and throwing Linux on there, make sure you at least have some sort of Windows install media in case you decide you want to go back again... just in case. I've been fiddling with Mint and got Freedom Planet running quite well on an old 1 GHz HP notebook from 2006. It runs rather well, but there's a bit of a learning curve once you start running into trouble.
Lin545: That's already the third Windows I've wiped from a hard drive permanently, replacing it with Linux after saving the product key - in case the hardware gets sold. I don't understand in the slightest what you're claiming here - Linux install images are fully bootable OSes, with a web browser, network card drivers, sometimes even an office suite. And from 2006? Why not put Windows on it and get a modern dedicated machine for Linux? Seriously, I can't understand you guys who make a horror drama out of a children's cartoon (Mint), because the first thing you do is go for dual-boot with all its complications. This is neither Gentoo from stage3 nor a Debian netinst - it installs in under 15 minutes. The amount of fail-safes and automation in Debian perhaps surpasses Windows itself, if you don't count dialog boxes and Next buttons. It won't allow you to remove the kernel or any vital component of your system. I mean, nobody starts breaking stuff to learn something without understanding the consequences. Don't do that on your production machine! Just use the software and create documents. For wild experiments, there is always a virtual machine: VMware, VirtualBox, QEMU - you name it. You're not breaking your Windows host install either (and you have a Linux copy, just in case... lol)

By the way, if you really want it fast and brutal, you just need three things:
1) a VM
2) Linux From Scratch in your browser
3) KeepNote or similar
In about three weeks, you will be complaining on the Mint forums about how "bloated" it is.
Not sure I quite understand your post.

I am fully aware Linux is a fully bootable OS. As I said, I'd fiddled with Mint on an old laptop and used it for testing our network setup. After adding a static IP to the ethernet NIC, then asking it to go back to DHCP, it suddenly thought I had two ethernet cards, neither of which seemed connected to the actual hardware. It's likely just because I haven't dug into the config files or dmesg to troubleshoot the issue.

As to why I haven't got it on my modern system: well, I play more than old games, so I'd kinda like my OS to be able to play them. I initially partitioned my system to have a Linux (Debian) and Windows boot. I realised after a year that I rarely booted into Debian because hardly any of my games ran properly there, apart from a few (Jagged Alliance 2 + Wine runs better on Linux in a window than on Windows, in my experience). I got kinda tired of the system kernel panicking every other time I tried to make a configuration change. Thus I decided I needed dedicated hardware to fiddle around with Linux, so it was not an either/or for me.

The mention of the 2006 machine was because we have a couple of old laptops (from 2006) which I wanted to see if I could repurpose into usable machines. They came with XP, which is sadly out of support and not really a wise choice for anyone who wants to use it online, even with another browser. As mentioned, it runs rather well - better than XP did, actually (though the gig of extra RAM might take part of the credit there).


Oh, and as for "Windows 10 isn't for gaming"... if it wasn't for gaming, why would Microsoft bother to implement a game recording/streaming overlay? Just like Windows 8, and Vista, and XP, and 2000, and ESPECIALLY ME before it, there are issues at launch with backwards compatibility. If anything should be said, it's that computers aren't for gaming: they were initially made as business machines, but clever coders found ways of making them run games anyway. As time went on, dedicated gaming-specific hardware appeared (graphics cards, for instance), but the architecture is still that of a general-purpose instrument with no direct hardware support for gaming. What makes it capable of gaming is the software, and Windows 10 is just as much for gaming as previous iterations, I'd say, except with the usual compatibility issues.
DrakeFox: Not sure I quite understand your post.
For the user, Linux is easier than Windows.
The first thing you do with Windows is install it alone on dedicated hardware.

But with Linux, you're told to go dual-boot, create a Windows backup, etc. Why?

If your machine came with Linux, would you also be instructing people to dual-boot Windows and back up Linux?
Artificial hardship.

DrakeFox: As I said, I'd fiddled with Mint on an old laptop and used it for testing our network setup. After adding a static IP to the ethernet NIC, then asking it to go back to DHCP, it suddenly thought I had two ethernet cards, neither of which seemed connected to the actual hardware. It's likely just because I haven't dug into the config files or dmesg to troubleshoot the issue.
What, why? Have you messed with the config files? Then why dig into them? Dmesg is just a log file.
This is clearly a bug, likely Ubuntu-specific - perhaps something to do with "upstart" renaming network interfaces.
I've never had it on my machines (I'm not calling it fake) - I have two plugged into ethernet and have used a lot of distributions over six years, but it's nothing regular.


DrakeFox: As to why I haven't got it on my modern system: well, I play more than old games, so I'd kinda like my OS to be able to play them.
-___-

Windows can't play any modern games. It's stuck at version 3.1. And more importantly, Windows can't use anything higher than a 486 processor, due to the 16-megabyte RAM limit. Same logic, bro.

DrakeFox: I initially partitioned my system to have a Linux (Debian) and Windows boot. I realised after a year that I rarely booted into Debian because hardly any of my games ran properly there, apart from a few (Jagged Alliance 2 + Wine runs better on Linux in a window than on Windows, in my experience).
I've played JA2, JA2 1.13 and JA2 Wildfire on Linux, by the way. They run perfectly in a window.
As well as Painkiller Black Edition, Quake 4, Doom 3, Unreal Tournament 2004, Prey, Crysis, Far Cry, and Stalker CoP.
And I have no idea how an old laptop is supposed to provide the OpenGL 4 hardware that Linux supports.

DrakeFox: I got kinda tired of the system kernel panicking every other time I tried to make a configuration change. Thus I decided I needed dedicated hardware to fiddle around with Linux, so it was not an either/or for me.
Ah, so Windows is also not for you, because you dislike running OSes on dedicated hardware?

DrakeFox: The mention of the 2006 machine was because we have a couple of old laptops (from 2006) which I wanted to see if I could repurpose into usable machines. They came with XP, which is sadly out of support and not really a wise choice for anyone who wants to use it online, even with another browser.
Windows XP is pretty hard to beat, actually. It's simple enough and arguably resource-efficient. If you install a good security suite and hide behind a router, there should be nothing serious. Yes, also disable Flash, or use Chrome's PepperAPI-based Flash.
Sure, MS refuses to update the core libraries, hence newer software will turn incompatible - but that's no different from restricting yourself to an older Linux distribution (with an older kernel, userspace and desktop - also more resource-efficient).

I mean, the problem is the hardware itself. It's limited and probably not very well supported by the manufacturer - incorrect ACPI code, overheating CPU, errors in the graphics driver. All of these are worth reporting; some are beyond correcting. For example, Intel refuses to support the GMA500 on Windows 10 for exactly the same reasons.

As to kernel panics, you don't get them when you change configuration.
You may misconfigure the boot loader or format part of the hard drive - that would render the system non-bootable.
Well, there is only one type of panic which can be caused by that, but that would involve making the whole drive C: (the root partition) inaccessible to the OS on boot.

My assumption is that Linux allowed you to do too much, and this is what you did; or you had badly supported hardware.

DrakeFox: As mentioned, it runs rather well - better than XP did, actually (though the gig of extra RAM might take part of the credit there).
Linux's memory and CPU footprint is far more efficient than XP's, provided a similar software stack is used. However, the start phase may require more RAM, because Linux ships with far more drivers out of the box.

If one takes KDE4, for example, it ships with about the same amount of bells and whistles as the Windows 7 DE. Yet I prefer KDE4 myself.


DrakeFox: Oh, and as for "Windows 10 isn't for gaming"... if it wasn't for gaming, why would Microsoft bother to implement a game recording/streaming overlay? Just like Windows 8, and Vista, and XP, and 2000, and ESPECIALLY ME before it, there are issues at launch with backwards compatibility. If anything should be said, it's that computers aren't for gaming: they were initially made as business machines, but clever coders found ways of making them run games anyway. As time went on, dedicated gaming-specific hardware appeared (graphics cards, for instance), but the architecture is still that of a general-purpose instrument with no direct hardware support for gaming. What makes it capable of gaming is the software, and Windows 10 is just as much for gaming as previous iterations, I'd say, except with the usual compatibility issues.
They just port and ship the older (outdated) libraries, which is what "compatibility modes" are.
This is not very different from using an older Wine version on Linux... but you don't have to wait.

PC hardware has always been far superior to consoles... I have no idea where you got the "general purpose" thing. It has a CPU, APU, GPU, and earlier, dedicated sound DSPs. The software stack on the PC is itself more generic; consoles typically booted straight into the game:
bootloader + kernel + init system + userspace + peripheral support (printers, scanners, etc.) + desktop stuff + a heap of libraries for different software + the different software itself
vs
bootloader + kernel + 3D, input and sound hardware + one game and just its libraries.

Besides, games and drivers had the advantage of a single hardware base, where on the PC there is a heap of models from a hive of manufacturers.

As for W10, it's too bloated for games and too bloated for work. But that's just my opinion (I have no plans to use any Windows, because there are no advantages).
Lin545: For the user, Linux is easier than Windows.
The first thing you do with Windows is install it alone on dedicated hardware.

But with Linux, you're told to go dual-boot, create a Windows backup, etc. Why?

If your machine came with Linux, would you also be instructing people to dual-boot Windows and back up Linux?
Artificial hardship.
If my machine came with Linux, I wouldn't be dual-booting Windows. And while I can go online and get a Linux any day I want, if I wipe the drive of the pre-installed Windows before I get a backup of the install media, I'd have to either get lucky and find an image of the OEM's Windows install, or potentially buy Windows again, should I decide to jump back to Windows for some reason.

And yes, Linux is easier for those who've used Linux for the majority of their time. Just as Windows is easier for long-time Windows users, and OS X is easier for long-time OS X users. I believe it's called experience.

Lin545: What, why? Have you messed with the config files? Then why dig into them? Dmesg is just a log file.
This is clearly a bug, likely Ubuntu-specific - perhaps something to do with "upstart" renaming network interfaces.
I've never had it on my machines (I'm not calling it fake) - I have two plugged into ethernet and have used a lot of distributions over six years, but it's nothing regular.
I didn't mess with the config files until I tried to get it working again. I used the network settings dialog accessible from the panel: added a static IP to test a network, removed the static IP, and presto, issue. I know dmesg is a log file, and I was hoping it might log warnings or errors hinting at why the network card stopped working.

Then again, I may be one of those walking disasters when it comes to Linux. I've tried Arch, Ark, Gentoo, Slack, Ubuntu, OpenSUSE, Mint, Red Hat and Fedora (over about 15 years). And usually something "interesting" happens. I've had kernel panics because I used the OpenSUSE configurator to set X to run in a resolution higher than 640x480. That's not supposed to happen; it just did.

Lin545: Windows can't play any modern games. It's stuck at version 3.1. And more importantly, Windows can't use anything higher than a 486 processor, due to the 16-megabyte RAM limit. Same logic, bro.

I've played JA2, JA2 1.13 and JA2 Wildfire on Linux, by the way. They run perfectly in a window.
As well as Painkiller Black Edition, Quake 4, Doom 3, Unreal Tournament 2004, Prey, Crysis, Far Cry, and Stalker CoP.
And I have no idea how an old laptop is supposed to provide the OpenGL 4 hardware that Linux supports.
Almost. I'm happy to see many more new game releases on Linux than a mere five years ago. And sure, you can run many of the non-Linux releases through something like Wine. But it will be unsupported and a your-mileage-may-vary hack to get it running.

Some games (like UT2004 and Doom 3) have native Linux versions which work quite well (wish I had a screenshot of the entertaining glitches UT2004 had, though).

Lin545: Ah, so Windows is also not for you, because you dislike running OSes on dedicated hardware?
I think you missed my point there. I prefer to run an OS on dedicated hardware. But I found myself booting Linux so rarely when it was an either/or choice that I wanted a dedicated machine to do all the fiddling around and learning of the OS's ins and outs on - one that could be booted simultaneously, without being a VM, which could introduce its own host of problems.

Lin545: Windows XP is pretty hard to beat, actually. It's simple enough and arguably resource-efficient. If you install a good security suite and hide behind a router, there should be nothing serious. Yes, also disable Flash, or use Chrome's PepperAPI-based Flash.
Sure, MS refuses to update the core libraries, hence newer software will turn incompatible - but that's no different from restricting yourself to an older Linux distribution (with an older kernel, userspace and desktop - also more resource-efficient).
Which is why I was so positive about Mint running on the laptop better than Windows XP did. As I was handing it off to someone who is _not_ that security-conscious, I would not hand off an XP machine, knowing security precautions would be thrown to the wind.

Lin545: My assumption is that Linux allowed you to do too much, and this is what you did; or you had badly supported hardware.
My assumption is that I quite like Linux as an idea. And it's matured well over the 15 years. But I like to be able to dig into a system, to know how it functions - to be able to trace problems, troubleshoot them properly, and tweak it for performance. Windows has gone downhill since 95 on that front.

Trouble is, things tend to break even when I'm using the distro's built-in applets to change configuration. And when it doesn't break, I tend to want to play games on my system, preferably without having to tinker and hack to get them running. It's great that I can do so, and Wine has proven itself quite handy there. But if something is released for Windows (and a lot of things are primarily released for Windows), then when I want to play it, I'd prefer not to first have to play a puzzle game of how to get it running. Especially given my bad attention span - I hop between several games a week.
It doesn't help with the fragmentation of Linux distros (who uses upstart, who uses systemd, is this one still running init? yum or apt-get, or are we doing rpm archives? where are shared objects stored this time around? Things change too much from distro to distro to quickly find information that's likely to help). It's one of Linux's greatest strengths, but it's also one of the things making it rather tricky for new users to learn.
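That fragmentation shows up even in scripts, which end up having to probe for whichever package manager the distro ships. A small sketch of that probing (the list of managers here is just the common handful, not exhaustive):

```python
import shutil

def detect_package_manager() -> str:
    """Probe PATH for common package managers, the way cross-distro scripts
    have to: apt-get (Debian/Ubuntu/Mint), dnf/yum (Fedora/Red Hat),
    zypper (openSUSE), pacman (Arch). Returns the first one found."""
    for pm in ("apt-get", "dnf", "yum", "zypper", "pacman"):
        if shutil.which(pm):
            return pm
    return "unknown"
```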

Lin545: PC hardware has always been far superior to consoles... I have no idea where you got the "general purpose" thing. It has a CPU, APU, GPU, and earlier, dedicated sound DSPs. The software stack on the PC is itself more generic; consoles typically booted straight into the game:
bootloader + kernel + init system + userspace + peripheral support (printers, scanners, etc.) + desktop stuff + a heap of libraries for different software + the different software itself
vs
bootloader + kernel + 3D, input and sound hardware + one game and just its libraries.

Besides, games and drivers had the advantage of a single hardware base, where on the PC there is a heap of models from a hive of manufacturers.
Back in ye olden days, consoles had specific registers for stuff like sprites, and special hardware sound effects and channels - geared towards games. The PC was general-purpose and so had to do all the same stuff in software, which comes at a performance overhead. Sure, the PC is way more powerful, but it comes with overhead in software.
Current-gen consoles (apart from Nintendo's) run on PC hardware, so the main advantage they have spec-wise is that they still don't have to run a large OS behind it all (though they're getting there), and there's no fragmentation of drivers to worry about, as the hardware and thus the API are known already. A slight but powerful difference. I kinda doubt it's allowed now, but if it were, it'd mean a developer could bypass the overhead of something like DirectX/OpenGL and program directly against the GPU/CPU where needed, because they'd know it's the same hardware in all consoles. AMD is trying something along those lines with Mantle, and DirectX 12 (Windows 10) is allowing more direct hardware access in the same vein. Will it pay off or collapse entirely? Time will tell.

Lin545: As for W10, it's too bloated for games and too bloated for work. But that's just my opinion (I have no plans to use any Windows, because there are no advantages).
And I'm not saying you should. You've got Linux working for you, and the know-how to get the software without a native Linux release working too. Why on earth would you go back to a proprietary system which costs you money and gives you no advantage?

But to the joe-average user who's used Windows for some time, making the jump to Linux can be a scary and frustrating prospect when something _does_ fall over, or when they want to run something not released and supported on Linux. And that's where I was merely trying to remind them to have a way to fall back if they gave up.

In the end, I think Three Dead Trolls in a Baggie said it best: Every OS Sucks.

edit: And thankfully things have gotten better for every one of the three OSes in the song. They still all have their advantages, disadvantages and learning curves.
Post edited November 03, 2015 by DrakeFox
DrakeFox: If my machine came with Linux, I wouldn't be dual-booting Windows. And while I can go online and get a Linux any day I want, if I wipe the drive of the pre-installed Windows before I get a backup of the install media, I'd have to either get lucky and find an image of the OEM's Windows install, or potentially buy Windows again, should I decide to jump back to Windows for some reason.
Actually no; since Windows 7 (?) this isn't the case anymore.
You can freely download the official ISO anytime, burn it and install.
You just need the product key, which is either in the UEFI/BIOS area, on a sticker, or can be extracted with a script from the registry.

Regarding buying a license, it's around $20-40 from official OEM sellers (on reddit).
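The "extracted with a script from the registry" part refers to the DigitalProductId value; a hedged sketch of the widely circulated base-24 decoder for the classic (pre-Windows 8) scheme follows. On newer machines the key is usually embedded in the UEFI ACPI MSDM table instead, where this decode does not apply:

```python
def decode_product_key(digital_product_id: bytes) -> str:
    """Base-24 decode of the 15 key bytes at offset 52 of the registry value
    HKLM\\SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\DigitalProductId
    (the classic pre-Windows-8 scheme)."""
    chars = "BCDFGHJKMPQRTVWXY2346789"  # the 24 characters used in product keys
    key_bytes = list(digital_product_id[52:67])
    out = []
    for _ in range(25):
        acc = 0
        for j in range(14, -1, -1):  # long division of the 120-bit value by 24
            acc = (acc << 8) | key_bytes[j]
            key_bytes[j] = acc // 24
            acc %= 24
        out.append(chars[acc])
    key = "".join(reversed(out))
    return "-".join(key[i:i + 5] for i in range(0, 25, 5))
```

On Windows you would read the raw value with the stdlib winreg module and pass its bytes to this function.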

DrakeFox: And yes, Linux is easier for those who've used Linux for the majority of their time. Just as Windows is easier for long-time Windows users, and OS X is easier for long-time OS X users. I believe it's called experience.
No, Linux is easier to use by itself. Only recently did Windows get an application store; Linux has had one for ages.
The way incoming files are treated is pretty secure.
There are exploits, but usually only dedicated, crafted cracking proves successful.
Compare that with adware, the credit card requirement, personal data collection, and the hidden-file-extension "feature" that is misused even today, with the execute attribute based on the (hidden) file extension.

I have experience teaching both Windows and Linux to people, even seniors who do trading, documentation, talking, social stuff and banking with it. Linux is much easier to use. It can be more picky about hardware, however, if the manufacturer decides not to play along. But if the hardware is supported, on something like Debian Stable, it just works - like good old Windows 3.1.

DrakeFox: I didn't mess with the config files until I tried to get it working again. I used the network settings dialog accessible from the panel: added a static IP to test a network, removed the static IP, and presto, issue. I know dmesg is a log file, and I was hoping it might log warnings or errors hinting at why the network card stopped working.
DrakeFox: Then again, I may be one of those walking disasters when it comes to Linux. I've tried Arch, Ark, Gentoo, Slack, Ubuntu, OpenSUSE, Mint, Red Hat and Fedora (over about 15 years). And usually something "interesting" happens. I've had kernel panics because I used the OpenSUSE configurator to set X to run in a resolution higher than 640x480. That's not supposed to happen; it just did.
Some interesting distros you mention there. I am surprised you still use Windows if you've tried them... Gentoo, Slack and perhaps Arch users may understand.
No Debian?

From around 2011 to 2013, the AMD open drivers were under development and hence not really stable, but most distros tried to push them nevertheless, because the closed-source ones were even worse. AMD GPUs have been "usable" since around 2014, so before that - and ever since 3D cards became important - it was all-NVIDIA.
Still....

Today, mostly only CrossFire/SLI support is lacking.

avatar
DrakeFox: Almost. I'm happy to see many more new game releases on Linux than a mere 5 years ago. And sure you can run many of the non-Linux releases through something like Wine. But it will be unsupported and be a Your Mileage May Vary hack to get it running.
Wine is a free WinAPI implementation - a library, not an emulator - so it's actually more reliable than Windows itself. At least for me : >

avatar
DrakeFox: I think you miss my point there. I prefer to run an OS on dedicated hardware. But I found myself booting Linux so rarely when it was an either/or choice, that I wanted to have a dedicated machine to do all the fiddling around learning the OS ins and outs on, that I could have booted simultaneously, without it being a VM which could introduce its own host of problems.
Yeah.
Well, I couldn't bring myself to get past /etc/fstab (in 2010, editing it by hand was still required).
But after Vista knocked on the door, I suddenly felt exceptionally motivated.

avatar
DrakeFox: My assumption is I quite like Linux as an idea. And it's matured well over the 15 years. But I like to be able to dig into a system, to know how it functions. To be able to trace problems and troubleshoot them properly and to tweak it for performance. Windows has gone downhill since 95 on that front.

Trouble is, things tend to break even when I'm using the distro-built-in applets to change configuration. And when things don't break,
I tend to want to play games on my system, preferably without having to tinker and hack to get them running. It's great that I can do so, and Wine has proven itself quite handy there. But if something is released for Windows (and a lot of things are primarily released for Windows), then when I want to play it, I would prefer to play it without first having to play a puzzle game of how to get it running. Especially given my bad attention span, which has me hopping between several games a week.
Debian Stable-based (SolydX or -K), or an LTS version of Ubuntu (after at least 6 months have passed), plus an NVIDIA GPU, should give the fewest problems. I see trouble only in games that extensively use DX11 features and all the recent .NET stuff. Yet today, it's mostly either native or runs pretty well via Wine.

Yes, but YMMV - recent Windows-only games target the Windows platform, so if you find yourself with Windows at home, congratulations! (no sarcasm)

I find the most trouble to be file name casing, because extFS is case-SENsitive. There are a dozen programs for that, as well as scripts, but one should not forget = ]
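A minimal sketch of such a lowercasing script, assuming POSIX sh and a flat directory; the function name and the tr-based approach are just one way to do it:

```shell
# Lowercase every regular file name in a directory, since ext
# filesystems treat Readme.TXT and readme.txt as different files.
lowercase_dir() {
    for f in "$1"/*; do
        [ -f "$f" ] || continue
        base=$(basename "$f")
        lower=$(printf '%s' "$base" | tr '[:upper:]' '[:lower:]')
        # Only rename when the name changes and the target is free
        if [ "$base" != "$lower" ] && [ ! -e "$1/$lower" ]; then
            mv "$f" "$1/$lower"
        fi
    done
}
```

Run it as `lowercase_dir /path/to/gamedir`; it skips subdirectories and refuses to clobber an existing lowercase twin.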
Actually I find Wine far better than Windows for one reason - if I install a game into its own prefix (called a "bottle" if you use CrossOver), I can archive and store it: such an installation is fully portable, so to speak. So I don't have to reinstall anything when I decide to replay; I just un-7z it and run.
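A rough sketch of that prefix workflow; the game name and paths are invented, the Wine step is commented out because it needs Wine installed, and tar stands in for 7z (same idea, different archiver):

```shell
# WINEPREFIX tells Wine which "bottle" directory to use.
PREFIX="$HOME/wine-prefixes/somegame"

# Installing into a dedicated prefix (commented out: needs Wine):
# WINEPREFIX="$PREFIX" wine setup.exe

# The prefix is just a directory tree, so archiving it preserves
# the whole installation; fake a tiny prefix here for illustration:
mkdir -p "$PREFIX/drive_c"
touch "$PREFIX/drive_c/installed.flag"
tar -czf "$HOME/somegame-prefix.tar.gz" -C "$HOME/wine-prefixes" somegame

# Restoring later is just unpacking and pointing WINEPREFIX at it:
rm -rf "$PREFIX"
tar -xzf "$HOME/somegame-prefix.tar.gz" -C "$HOME/wine-prefixes"
```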

avatar
DrakeFox: It doesn't help the fact with the fragmentation of Linux distros (who use upstart, who use systemd, is this one still running init?, yum or apt-get or are we doing rpm archives. Where are shared objects stored this time around? Things change too much from distro to distro to be able to quickly find information which is likely to help you). It's one of Linux's greatest strengths, but it's also one of the things making it rather tricky for new users to learn.
Never had a single problem with that.
Whether it's source-based or binary; stable or experimental; rolling, semi-rolling or discrete release - it all depends on the user and their task. Nobody takes the VM away =)

Newbies, or anyone who doesn't want to tinker with the OS, are always better off with a binary-based, stable, discrete-release distro. If they have cutting-edge hardware or require cutting-edge software, they can pick drivers and such from backports.

avatar
DrakeFox: Back in ye olden days consoles had specific registers for stuff like sprites, special hardware sound effects and channels. Geared towards games. The PC was general purpose and so had to do all the same stuff but in software which comes at a performance overhead. Sure the PC is way more powerful, but it comes with overhead in software.
Current Gen consoles (apart from the Nintendo) run on PC hardware, so the main advantage they'd have specs-wise is they still don't have to run a large OS behind it all (though they're getting there), and there's no fragmentation of drivers to worry about, as the hardware and thus API is known already. Slight but powerful difference. I kinda doubt it's allowed now, but if it was, it'd mean a developer could bypass the overhead of something like DirectX/OpenGL and program functions directly towards the GPU/CPU where needed, because they know it'd be the same hardware in all consoles. AMD is trying something along those lines with Mantle, and DirectX 12 (Windows 10) is allowing more direct hardware access in the same vein. Will it pay off or collapse entirely? Time will tell.
Yes, I remember those times, when consoles could boast fluid palette effects and non-tearing scrolling :) , but since around 3dfx it has been no trouble anymore. Which is what I meant: the PC hardware's ability to do anything does not mean it cuts corners anywhere. The only "advantage" of consoles today is hardware optimization and proprietary technologies/networks. But that's a performance kind of "advantage", not a flexibility "advantage" - most of them are still black boxes.
Vulkan is mostly here, and expected to play nicely with the upcoming migration to Wayland(-technology). It's pretty sure to stay and will replace OpenGL as the API.

avatar
DrakeFox: But to the joe-average user who's used Windows for some time, making the jump to Linux can be a scary and frustrating prospect when something _does_ fall over, or they want to run something not released and supported on Linux. And that's where I was merely trying to remind them to have a way to fall back if they gave up.

In the end, I think Three Dead Trolls in a Baggie said it best: Every OS Sucks

edit: And thankfully things have gotten better for every one of the three OSes in the song. They still all have their advantages, disadvantages and learning curve.
Well... let me show you which of the three this Linux user thinks is the best: [url=https://www.youtube.com/watch?v=780s0WRL8Rk]https://www.youtube.com/watch?v=780s0WRL8Rk

hehe =)
Post edited November 03, 2015 by Lin545
Just to avoid this being a complete derail of the topic I'll try to be brief without having quotes and quotes and quotes.

I do end-user support at a school, and sadly often see users who come in with a dead hard drive, no recovery image, no sticker - and a Windows 8 ISO will refuse to run on the system. Macbooks have it the best in that regard: internet connection and Apple ID required, and that's about it.

While it's true Linux has had a Software Center by default for a long time, I find it actually runs counter to some of the ideals of Linux. If any OS, Linux is the one with the spirit of "you can run whatever you want without asking anyone permission", so it's a bit of a paradox that in order to be sure some software you want to run is going to work on your system, you need to wait for those who maintain your distro tree. (Heck, on both Ubuntu and Mint, installing LibreOffice 5 means fiddling around in a terminal with extra PPAs to get it to even show up in the Software Center.)
Sure, for much software you can download and compile it yourself, or maybe even find binary packages, but the fragmentation once again means it might work out of the box, or need some serious tinkering.

As a user experience, Linux might be a lot easier depending on your user interface (Personally I found KDE4 cumbersome to work with some years ago, especially the plasma bits seemed like pointless eyecandy).
And once you get enough knowledge of the workings of the OS, it's most likely easier to fix when it breaks down, since you don't have to guess at some closed-source internals. But while Windows is a lot more crash-prone, it's gotten quite good at repairing itself (though it would be preferable to not have to do so).

I know Wine is an API implementation and a great solution for running software, but API or not, you're still running into some emulation of environment (drive_c for instance) which may or may not cause issues. It works quite well often enough, but still may require hacks which aren't required to run the executable in a native Windows environment. (I took a look at the Wine page for BioShock since I kinda fancied playing that again: categorized Bronze, it installs and runs, and you apparently need to do a bit of registry hacking in the emulated environment to not have mouse issues.)
For me that's acceptable.
For a good deal of end users where reading a 3 step guide to something is beyond what's reasonable to ask, it's not.

And that video... I think I spaced out because of the text-to-speech voices. Personally, the UI I like the most at the moment seems to be Cinnamon (on Mint). It's basic, fairly lightweight and not overly complicated to set up. But hey, why choose - you can run Cinnamon, Mate, and Unity on the same system. Maybe even KDE, though the last time I installed KDE from the software repo, after the first login to the KDE environment it managed to render the Unity shell unloadable (Ubuntu, about 2 years ago).
avatar
DrakeFox: Just to avoid this being a complete derail of the topic I'll try to be brief without having quotes and quotes and quotes.
Hmm, I thought we were discussing multiple smaller topics. If I used a single response, I would very probably derail your initial post by ignoring or overlooking some arguments and building up a pile of my own... This is why I went with partial quotes...

Yes, when you receive a heavily quoted post, there is an impression that someone is trying to prove something, because it produces posts that are hard to process, and you start weighing your time and whether a response is needed at all - looking at your discussion opponent as some kind of troll.

That's not what I was personally up to. I have my experience, you have yours. It's pointless to talk someone into something they disagree with. I am just exchanging my POV; nobody is obligated to anything.

avatar
DrakeFox: I do end-user support at a school, and sadly often see users who come in with a dead hard drive, no recovery image, no sticker - and a Windows 8 ISO will refuse to run on the system. Macbooks have it the best in that regard: internet connection and Apple ID required, and that's about it.
That's really strange, because it should activate using the key embedded in the BIOS/UEFI - possibly requiring a telephone call. Can you please give me some search keywords? I am curious why the installation would fail.
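For what it's worth, on Windows 8-era OEM machines the key is usually embedded in the ACPI MSDM table. A hedged sketch of checking for it from Linux - the sysfs path is the standard one, but the table simply doesn't exist on many boxes, reading it needs root, and `strings` comes from binutils:

```shell
# Check for the firmware-embedded Windows key (ACPI MSDM table).
check_msdm() {
    table=/sys/firmware/acpi/tables/MSDM
    if [ -r "$table" ]; then
        # The printable tail of the table holds the OEM product key
        strings "$table" | tail -n 1
    else
        echo "no MSDM table (or not readable) - no firmware-embedded key"
    fi
}
check_msdm
```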

avatar
DrakeFox: While it's true Linux has had a Software Center by default for a long time, I find it actually runs counter to some of the ideals of Linux.
Well, by that I meant "package management system". There are many different ones, and each one's approach builds the very core of its distro. A few contrasting examples:
- apt/dpkg (Debian, Ubuntu, SteamOS) - binary-based; installs are fast and inter-package problems are rather easy to handle. It allows automated building from source, but that is limited to a few packages rather than the whole system (which may be a serious downside). It accepts mixing architectures (x86 and x64) and mixing sources, and usually resolves even those conflicts well. But when some newer package requires a newer library that conflicts with an older library other packages depend on, there is nothing the owner can do but wait until it is packaged - or try it himself, which is much harder than a typical install and may cause the owner to flee to another distro.

- portage (Gentoo, Funtoo) - a completely different bag of nails. It lets the owner define which target package he wants, and with which features, then calculates all the necessary dependencies and starts building them from source. Why? Because this way it has unparalleled flexibility in combining any software together. The problem is that the whole spectrum of problems around actually compiling software is also thrown into the bag and mixed with the typical package-dependency problems. The "recipe" repository is rather large and includes many versions of different packages, as well as unofficial additional trees. Portage supports creating binary packages, but its ability to resolve between binary packages is limited (as of 2012). Other projects have tried to expand there (Sabayon), but I personally found it is not as flexible as apt, and improper use may cause a serious mess. The situation from the previous paragraph is just a typical case for portage and is usually resolved automatically - but the owner still needs to know and understand which path to pick. On the other hand, if some system library is updated, you may have a typical 5 GB of software to recompile from source - automated, but (CPU) time consuming (it can do that in the background, actually).

- pacman (Arch and derivatives) - behaves like apt/dpkg, but cuts down on automation. It expects the user to understand the problem and carry out the action. Example: where apt/dpkg would automatically add new user groups to the system (because newer software requires them), pacman would only mention it. Furthermore, with ABS it can automatically compile and install software from a source repository similar to portage's, but with much less automation: the features a package should enable are defined in the build script (PKGBUILD) and not processed as an option for the package manager (a "USE flag" on Gentoo). That means a much faster, simpler system, which is more flexible than portage and faster than apt - but the downside is that it requires a lot more attention per individual package, as well as to system configuration (including knowledge about it). Personally, I found such a system very simple, but the amount of such "simple" actions was over the top. I went back to using (and fighting against) additional layers of automation.
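To make the contrast concrete, here is a toy helper that prints the distro-native install command for whichever of the three package managers is on PATH. The function name is invented, and it deliberately only prints the command rather than running it:

```shell
# Same request, different native command depending on the distro family.
install_cmd() {
    pkg="$1"
    if command -v apt-get >/dev/null 2>&1; then
        echo "sudo apt-get install $pkg"   # apt/dpkg: prebuilt binaries
    elif command -v emerge >/dev/null 2>&1; then
        echo "sudo emerge $pkg"            # portage: compiles from source
    elif command -v pacman >/dev/null 2>&1; then
        echo "sudo pacman -S $pkg"         # pacman: binaries, less automation
    else
        echo "install $pkg with your distro's package manager"
    fi
}
install_cmd libreoffice
```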

As you see... there is no fragmentation. It's the actual goal that defines the approach, which defines the tool, and the system is built around it.

avatar
DrakeFox: If any OS, Linux is the one with the spirit of "you can run whatever you want without asking anyone permission", so it's a bit of a paradox that in order to be sure some software you want to run is going to work on your system, you need to wait for those who maintain your distro tree. (Heck, on both Ubuntu and Mint, installing LibreOffice 5 means fiddling around in a terminal with extra PPAs to get it to even show up in the Software Center.)
That is a situation typical of the binary approach.
You can include Windows there: they ship an update, and a great amount of software stops working. Would you attempt to recompile that software yourself in VS against a newer library?.. Or would you just draw their attention so that they "fix their stuff", and wait until then?
By "fiddling around in a terminal" you are actually doing that. "Fiddling around" in Visual Studio to recompile the project against a force-shipped newer library would be much more time-consuming.
In both cases, this is actually not what you should do - unless you explicitly want it right here and right now.
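For reference, the "fiddling" usually amounts to three commands on Ubuntu/Mint. Sketched here as a dry run that only prints them, since they need root and network access; ppa:libreoffice/ppa is the commonly cited archive, but verify it on Launchpad before trusting it:

```shell
# Dry-run sketch of the Ubuntu/Mint PPA dance: printed, not executed.
ppa_steps() {
    echo "sudo add-apt-repository ppa:libreoffice/ppa"
    echo "sudo apt-get update"
    echo "sudo apt-get install libreoffice"
}
ppa_steps
```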

Gentoo would laugh at that:
# emerge =libreoffice-5
Portage will install the newer library in a new "slot" and use it in parallel.
.... and after 2 hours of compilation in the background, you have it. There is no guarantee it runs, but usually it does. =)
The same applies to pacman (Arch and derivatives). The newer version (the recipe to build it) becomes available much, much faster, but there can be harder problems to solve along the way.

avatar
DrakeFox: Sure you can for much software download and compile software yourself, or maybe even find binary packages, but the fragmentation once again means it might work out of the box, or need some serious tinkering.
Well, I think what I described above explains pretty well that there is no "fragmentation".
If you really want it working well and without effort, it's a binary distribution with a long support window and stable (outdated? well-aged? you decide) software. Still, much of the (user) software is available in newer versions via backports, if there is a need for that.
That's about the equivalent of staying on Windows XP... until something newer stabilizes.

avatar
DrakeFox: As a user experience, Linux might be a lot easier depending on your user interface (Personally I found KDE4 cumbersome to work with some years ago, especially the plasma bits seemed like pointless eyecandy).
I really loved KDE3 at some point; then I used Gnome2 for a big chunk of time and started hating it (KDE3). When KDE4 came out, it (mostly the plasma component) crashed and leaked memory so often it was not funny. Right now KDE4 is very stable, very functional, fully translated. And outdated - KF5 is knocking on the door with double the RAM requirement (still 1/2 of what Windows 10 uses), and the plasma crash fun. =)

avatar
DrakeFox: And that video....I think I spaced out because of the text to speech voices. Personally the UI I like the most at the moment seems to be Cinnamon (on Mint). It's basic and fairly lightweight and not overly complicated to setup. But hey, why choose, you can run Cinnamon, Mate, and Unity on the same system. Maybe even KDE though last time I installed KDE from the software repo, after first login to the KDE environment it managed to render the Unity shell unloadable (Ubuntu about 2 years ago)
Well, that sounds like an interesting bug. Probably the PPA had a different library version and installed it, causing a segmentation fault in Unity. That's worth reporting, if the PPA is maintained. Otherwise, ask on the official Ubuntu IRC channels. I have not used Ubuntu (full-time, that is) since 2011, so I am not sure KDE is in the official repository. If it is, it's always better to stick with that, unless something is really wrong with it. Basically, what I said a few paragraphs above about binary distros.
Post edited November 05, 2015 by Lin545