DivisionByZero.620: snip

What do you think?

Discussion guidelines:
snip
My thoughts on your post:
Meta level: So, you clearly put a lot of thought into this, yet your guidelines read as if you don't know GOG culture...

Object level:
#1 - Rather than a digital dystopia, you will continue to see the rise of trusted curation - Apple, Amazon, even GOG is an example... The horror stories about digital privacy breaches will hopefully increase awareness of digital data security - it's not valued enough today, but people will have to learn the hard way to value (and pay for) it, including noticing the abuses their own governments commit.

#2 - Short-term thinking is always competing with long-term thinking. If consumers don't value maintenance and stability, then it's to be expected that providers will focus on what sells. High-quality niches will still exist, though, and will obviously be priced higher for those who care enough. Basically, the productivity gains from new coding paradigms being so much easier to use mean optimization is less necessary; however, these things are cyclical, and as competition increases, the coders who can squeeze optimization out of their code are the ones likely to outcompete the rest. Until the next accessibility shift...

#3 - Undervaluing what you are consuming is to be expected. It's that pressure that will bring on the correction, as providers drop out of the rat race and the survivors can then increase prices, find niches, etc... You're spot on that we decide. And if you're in the minority, well - pay more for what you want. You can, and it is that simple.

#4 - What is so bleak about it? After several years of Microsoft's near monopoly, we have seen Apple, Android, and even Linux gain a lot of ground... a lot of choice, with clear pros and cons for the consumer to evaluate...

Overall, you seem to consider that the trends you dislike in IT are disliked by the majority... well, if that were the case, they would not be trends, right? I guess you can believe that users/consumers are making their choices without knowledge of the consequences, but that should be easy to correct - engage them and explain the problems... Me, I think most people do know about the security issues, or the walled gardens, or whatever else you decry, but they make mostly rational determinations about what matters more to them. For them, it is all golden... And if I'm wrong, then all you need is patience: when the shit hits the fan, you'll be able to say "I told you so" and offer alternatives to the newly awakened majority, which will value what you're selling and make you rich. Wish you luck :)
Telika: I simply think that tablets will progressively become the standard. Whether we want it or not.
And we'll all be old conservative men with our desktop machines. :P

GET OFF MY LAWN, YOU YOUNG HOODLUMS! AND TAKE YOUR STUPID TABLETS WITH YOU! BACK IN MY DAY, WE USED REAL COMPUTERS!

I can probably see myself doing this 20 years from now.
Post edited August 19, 2015 by monkeydelarge
Last night, I wrote a parody of the opening post, but today I've decided I can't be bothered to redo that, so I'll get to the point.

1 is an issue that has existed from the start; only now has awareness of it increased greatly.

2 is pointless babble that segues into how it is too easy to program these days and how QA testing is dead.

3 is a hilarious misunderstanding as to what free software is, and seems to miss the point, seeing as truly free software seems to be doing just fine.

4 endorses voiding the warranty of an item with massive costs, ignores the company's massive markup, fails to realize that OS X is essentially Unix with a price tag, and raises doubts that the OP has used Linux recently, as they describe issues that have all long been solved.

The conclusion I came to was yes, I was laughing at the post, but for all the wrong reasons.
Post edited August 19, 2015 by Darvond
So we'll all be going back to terminals and mainframes?
I don't think we're quite there yet.
And I'm not quite sure this is the golden age, to be honest.
Telika: I simply think that tablets will progressively become the standard. Whether we want it or not.
monkeydelarge: And we'll all be old conservative men with our desktop machines. :P
Hey, we already are, with what we run on them. :-/
Nope. The golden age of personal computing is just starting.

There is nothing personal about a fat plastic box sitting under a desk. There is everything personal about a small device you can take everywhere with you, that knows you and who you are, that can call you the way you want, that can help you in any way possible.

#1 Transparency in software development has never been better.
#2 Privacy through obscurity is not privacy.
#3 Software development practices have never been at a better place and understanding.
#4 User interfaces are going to continue evolving, and we are going to leave behind the old and tired more-static-than-a-zombie icon-based interface.
Elenarie: Nope. The golden age of personal computing is just starting.

There is nothing personal about a fat plastic box sitting under a desk. There is everything personal about a small device you can take everywhere with you, that knows you and who you are, that can call you the way you want, that can help you in any way possible.

#1 Transparency in software development has never been better.
#2 Privacy through obscurity is not privacy.
#3 Software development practices have never been at a better place and understanding.
#4 User interfaces are going to continue evolving, and we are going to leave behind the old and tired more-static-than-a-zombie icon-based interface.
It's pretty clear that there are two camps of thinking here: you are quite clearly of the Star Trek universe way of thinking, whereas I would class myself as of the Blade Runner dystopian-future way of thinking. Not saying either is wrong, but I'm not sure either camp is going to budge much. On your points:
#1 Yes, partly. Open source is coming in a lot, which is good overall. However, juxtaposed with that is Firefox bowing to pressure and allowing a DRM playback module.
#2 You're right, it's not, but privacy via abstinence is. We are not obliged to grab everything that comes out, or jump to the "next big thing".
#3 Yup, I agree here (being a programmer, I would, though :o). I would imagine - totally unsubstantiated here - that a lot of the issues found in software are down to poor management and budgets/timelines, rather than poor coding practices. I.e. the programmers just aren't given the time by the bean counters.
#4 This I don't fully agree with. Metro, for instance, was meant to be the next big step. It's awful if you use a PC, and I don't like it on Windows Phone either. I still believe the division between desktop and mobile should remain; they are for different things at the end of the day. I mean, you could do digital modelling on your phone, but would you want to?
Elenarie: Nope. The golden age of personal computing is just starting.

<snip>

#1, #2, #3 & #4
Hmmm, I'll agree with you, except that software, like games, is actually fairly new compared to other technologies, and we were bound to make huge mistakes. Unfortunately, unless everyone gives a damn, or can do things without everything being made for them, the golden age won't really begin...

Although I'm sure Stallman might say the golden age was in the '70s or so, with the mainframes, since they built software as they needed it, shared source code, made drivers, improved things, and computers were for everyone... Ahhh, that must have been quite a thing back then... too bad I wasn't born yet :(
nightcraw1er.488: It's pretty clear that there are two camps of thinking here: you are quite clearly of the Star Trek universe way of thinking, whereas I would class myself as of the Blade Runner dystopian-future way of thinking. Not saying either is wrong, but I'm not sure either camp is going to budge much. On your points:
#1 Yes, partly. Open source is coming in a lot, which is good overall. However, juxtaposed with that is Firefox bowing to pressure and allowing a DRM playback module.
#2 You're right, it's not, but privacy via abstinence is. We are not obliged to grab everything that comes out, or jump to the "next big thing".
#3 Yup, I agree here (being a programmer, I would, though :o). I would imagine - totally unsubstantiated here - that a lot of the issues found in software are down to poor management and budgets/timelines, rather than poor coding practices. I.e. the programmers just aren't given the time by the bean counters.
#4 This I don't fully agree with. Metro, for instance, was meant to be the next big step. It's awful if you use a PC, and I don't like it on Windows Phone either. I still believe the division between desktop and mobile should remain; they are for different things at the end of the day. I mean, you could do digital modelling on your phone, but would you want to?
#1 That is of no concern. Transparency doesn't mean that people can have access to content that they haven't paid for.

#2 What privacy? Non-WinRT apps never asked for any permissions. We were using random Win32 apps that had the power to do whatever the crap they wanted in the background, and nobody complained. But when the actual power was given to the user, people started complaining about privacy concerns. That is just backwards. (See the sketch after this post.)

#3 Of course. But then, that is never an easy issue to solve. The perception of IT in companies needs to improve. To them, IT is just a tool that keeps things running, yet it should be on a much higher level, as everything they do depends on it. But then, that is what happens when you have "business" jokers in charge of companies and not actual engineers.

#4 Yes, I would want to. Because now, with Windows 10, phones will support extending to a completely usable desktop. Meaning if you have a UWP modelling app, you can just connect your phone to a display, connect proper input devices, and do all the work you would normally do on a big fat plastic box.
Post edited August 19, 2015 by Elenarie
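To illustrate the permission point in #2 above (my own sketch, not from the thread - the file path is hypothetical): a classic Win32-era program inherits the full privileges of the logged-in user, so it can read any file the user can, with no manifest declaration and no permission prompt.

#include <stdio.h>

int main(void) {
    /* Open a (hypothetical) personal file directly; a classic Win32 app
       runs with the user's full privileges, so no prompt ever appears. */
    FILE *f = fopen("C:\\Users\\Public\\Documents\\notes.txt", "r");
    if (f == NULL)
        return 1;

    int lines = 0;
    char buf[256];
    while (fgets(buf, sizeof buf, f) != NULL)
        lines++;  /* it could just as easily upload the contents somewhere */
    fclose(f);

    printf("Read %d lines without asking anyone.\n", lines);
    return 0;
}

A WinRT/UWP app, by contrast, has to declare capabilities such as library, camera, or microphone access in its package manifest, and the OS mediates that access - which is the "power given to the user" being referred to.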
Where are the Cliff Notes?
On the UI discussion from Elenarie's #4.

What are the emerging alternatives to icons that you all see? I'm curious and have no idea what you might be talking about... VR?

I see icons as just digital buttons, and menus as digital drawers... they resonate with those physical experiences. Touch interfaces, I think, will take us even closer to those analogues, rather than in a new direction. Kind of like how they made drag and drop much more intuitive, imo...
DivisionByZero.620: <snip>
I don't really have much to say on #1, #3, or #4. Though I would mention for #1 that back in about 2008, a company making medical equipment decided to add a wireless interface to their pacemakers in order to reduce the need to operate in order to reconfigure them (a good idea, a very positive move to improve their product). Unfortunately, they left this wireless network completely unsecured: you could shut down someone's pacemaker (or reprogram it in other ways to kill them) with just a laptop. So I suppose this golden age started finishing a while ago.

I would like to disagree with you on #2. I feel that what you've described is at best a gross simplification of the software development industry. I'm not sure where your "conventional wisdom in the industry" comes from, but I disagree that it is the conventional wisdom.

What you've called "modern software development paradigms" would, I think, be better described as "business strategy and values". I consider a development paradigm to be concerned with the delivery of the software rather than with the priorities assigned to its quality (quality is usually implicit in the development paradigm - sometimes stated, but never as anything other than "bug free").

With regard to the quality of programmers - low-quality programmers do not write low-quality programs. Low-quality programmers do not produce anything at all. A poor programmer just fails. The really bad ones can't even achieve the basic tasks; the slightly better ones can't manage to put those tasks together to make an actual program. In fact, you have to be a reasonably competent programmer just to know what those tasks are, and an even better one to then structure a program such that it doesn't turn into a runaway project. The better the programmer, the better they'll be able to work out the tasks to make a good product. So I feel you're setting the bar very low for the level of programmer that is able to produce a program (I mean an actual program that people buy; anyone can write "hello world").

So we're already at "average" quality programmers if they've written a proper application (or, more likely, a range of skills, with some very good and some quite bad). Then there is the observation I initially read in "The Mythical Man-Month", which I've since read has been re-assessed and proven still to be true: "The difference in productivity between a very high quality programmer and a low quality programmer is about a factor of 10". Over my 12 years in this industry, programming for several firms, I can tell you that in my experience this is true. Given that the pay of programmers doesn't range by a factor of 10, you can conclude it would be corporate suicide to deliberately recruit anyone other than the best you can get. You may recruit a range of experience levels, but never anything but the best ability.
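To put rough numbers on that (a back-of-the-envelope sketch; the 10x productivity figure is the Mythical Man-Month one above, while the 2x pay gap is purely my own assumption):

#include <stdio.h>

int main(void) {
    /* Illustrative assumptions, not measured data. */
    const double productivity_ratio = 10.0; /* best vs. weakest developer output */
    const double pay_ratio = 2.0;           /* assumed best vs. weakest salary */

    /* Output per salary unit: under these assumptions the best
       developer delivers 5x more work for the money. */
    printf("Value for money of hiring the best: %.1fx\n",
           productivity_ratio / pay_ratio);
    return 0;
}

Under those assumptions, paying double for ten times the output is a 5x bargain per unit of work.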

So these businesses that deliberately hire from the shallow end of the programming gene pool to cut costs - they just don't exist. Someone might have tried it; in fact, several companies almost certainly have. The important point is that they have long since ceased to trade (this is one of the reasons, I believe, that outsourcing has previously failed so badly: the management making the decisions failed to understand the complexity of producing software, and the importance of very high quality developers). Every place I've worked at, or even seen, has placed exceptional importance on recruiting the best they can get.

Your assertion about rewards might be true (the closest I've encountered was a bonus for releasing the product on time, but that was on the condition that it was released to the quality standards applied by the company, which were high); however, I don't think it is conventional wisdom to do this, and certainly not at the cost of quality. The assessment of how "good" a developer is (which then affects pay/rewards) is usually affected far more negatively by poor code than positively by output. In my early days of coding, I was told that I was able to very quickly write very buggy code; it was not a compliment.

The reason you're seeing more low-quality programs is actually (in my opinion) the following:
1) Programs are more complex; the more a program tries to do, the more it can fail to do perfectly - try writing a bug in "Hello World".
2) Release schedules are being more heavily enforced. The developers would fix the bugs if they had time, but companies are more willing to release with bugs than to lose out to the competition and not release this year. To an extent this is justified; they do have to compete.
3) People are expecting higher quality. It used to be par for the course for applications to bomb out, possibly crashing the entire OS. Why were autosave and document recovery introduced into MS Word?

Anyway, that's enough from me for now. I doubt anyone would want to read any more than that.
Post edited August 19, 2015 by wpegg
Elenarie: #1 That is of no concern. Transparency doesn't mean that people can have access to content that they haven't paid for.

#2 What privacy? Non-WinRT apps never asked for any permissions. We were using random Win32 apps that had the power to do whatever the crap they wanted in the background, and nobody complained. But when the actual power was given to the user, people started complaining about privacy concerns. That is just backwards.

#3 Of course. But then, that is never an easy issue to solve. The perception of IT in companies needs to improve. To them, IT is just a tool that keeps things running, yet it should be on a much higher level, as everything they do depends on it. But then, that is what happens when you have "business" jokers in charge of companies and not actual engineers.

#4 Yes, I would want to. Because now, with Windows 10, phones will support extending to a completely usable desktop. Meaning if you have a UWP modelling app, you can just connect your phone to a display, connect proper input devices, and do all the work you would normally do on a big fat plastic box.
#1 I have not stated that watching content you don't own should be allowed; however, content that you do purchase should actually work, now and into the future. Closed environments like the example given just lead to problems with this, as people are pointing out with SecuROM on Win10.

#2 Not at all. In the past, people did not understand that there was a problem. For many years, going back to the early days, computers were either a mystery or confined to technical people. As the market has changed and more people (from more diverse backgrounds) have taken computing up, such things as privacy are becoming more of an issue, certainly in the last year or so with the naked-pictures news story. Non-technical people have had to start learning these things to remain protected online. With the vast leaps in technology, e.g. internet download/upload speeds, the far greater reach of social media, etc., everyone is struggling to keep up with what information is out there and what it can be used for. I still tend to pull the wireless receiver out when I am not using the internet; do many people actually do this whilst wandering around with a phone? I doubt it.

#3 Agree.

#4 I wasn't really trying to push for the plastic box, more the traditional keyboard/mouse setup or digital stylus/pad setups. And if by Win10 phones you mean the Nokia ones, I have to say I am really unimpressed by the one I have.

tinyE: Where are the Cliff Notes?
Here: https://www.youtube.com/watch?v=b7lKKNrXUJg
Telika: I simply think that tablets will progressively become the standard. Whether we want it or not.
What is this, 2010?

Haven't tablets and phones already become the standard? Why are we still treating the breakthrough of tablets and the heavy push of Apple products like it's still a new thing?

Besides, many websites sold out by making their interfaces suitable for tablets.
Post edited August 19, 2015 by Elmofongo
I'm with Elenarie on this one. In most respects, things are better than ever: most people have a personal computer on them at all times, can get tons of useful software for free, and there are also a lot of good free tools for creating such software. Open source software is taken more seriously than ever, Linux in particular; more new OSes are reaching the public's hands, and older OSes are being taken in interesting new directions (even though some of their fans sometimes don't like that). Creators have easier ways than ever to reach their public and to get funding.

So as Elenarie said, the golden age of computing is just starting.