Lionel212008: The M1 has shown us a very different approach to chip design. It's more than just a mobile chip on a laptop. It has shaken up the x86 hegemony.
I don't think that Apple's constant flip-flopping on processor design and usage has much impact in the grand scheme of things. They have good market share in mobile phones, but their computer offerings remain pretty niche (in the business environment, they have reasonable penetration in creative industries but not much elsewhere).

The x86-64 design will continue to adapt and develop.
Whatever happened to the Cell CPUs the PS3 had?
§pectre: Whatever happened to the Cell CPUs the PS3 had?
probably a total fail

I can't see why they would change from x86; it does its job well, so there's not much reason to switch.
Post edited February 19, 2021 by Orkhepaj
§pectre: Whatever happened to the Cell CPUs the PS3 had?
As I understand it, CELL BE (Broadband Engine) was "simply" an underclocked POWER core (eight?) surrounded by speciality co-processors.
The main selling point, again as I got it, was the interplay between the CPU and those chips.
The description sounded to me somewhat like what AMD pushes as its APUs.

Seeing how the platform's support is gradually being phased out (GCC 10 dropped it in '19 I think, and in the Linux kernel they've proposed removing it, though not the PS3 support), I think it's safe to say it's dead, at least in that form.
Post edited February 19, 2021 by osm
osm: So the success of the M1 will most probably hinge on people habitually squandering much of their hard-earned on a certain company, not on its technical superiority over x86, whether that's real or merely debatable.
M1's success depends on Ye Apple Faithful.
But the chip does show that x86 could do with an overhaul.

I'm typing this on a Macbook Air M1, and this machine is an absolute beauty.
It's fast. Like, really fast. Compiling is nippy, and heavy apps like IntelliJ IDEA fly like poo off a shovel.

Rosetta, the x86 translator, is impressive too.
I tried installing a few Windows-only GOG games on it and running them through Crossover (Wine). A couple didn't work, but many did.
IIRC, the graphics performance is supposed to be akin to a 1650 Ti, and I'd say that sounds about right.

All that alone isn't the impressive bit, though.
The impressive thing is the lack of any fan in this machine.
This kind of performance with passive cooling is damn impressive.

Instant sleep. Instant wake-up.
Not a second or two. Tablet-like instant.

The only downside is that it's running macOS.

This thing, including Rosetta for x86 games, with a real Linux on it would be absolutely fantastic.
If it weren't for all the damn telemetry, Windows would be just dandy too.
But macOS? Meh.
brouer: [...]
All that alone isn't the impressive bit, though.
The impressive thing is the lack of any fan in this machine.
This kind of performance with passive cooling is damn impressive.
[...]
That's always been a fixation for Apple; they even used to downclock Intel's CPUs to keep their computers as silent as possible. That, and the obsession with design.
Also, see the very last paragraph of http://landley.net/history/mirror/os2/history/os2warp/index.html

/thread
Lionel212008: I am not saying that an M1-like design approach is a panacea.

However, it is a design that does make a lot of sense.

Does the future look more ARM for the mainstream?
The biggest issue with x86 vs. ARM CPUs is legacy support. That's the reason x86 is still around, why Windows 10 still has a floppy disk driver, and why x86 is so power (electricity) hungry. Microsoft tried hard in years past to get x86 into smaller and smaller form factors and failed due to power requirements and user interface design. (UMPCs, anyone?) All of that failure was legacy support: power due to the CPU requirements, UI due to expectations and assumptions made by applications and their developers. (Mouse? Must be present. Display? At least 800x600. Full keyboard? Absolutely. CPU power? Full throttle constantly. Memory? As much as we want. Fat-finger touch input? What is that?)

As another legacy support example, it took decades to get rid of the 16-bit BIOS infrastructure, and in some cases it's still present in today's systems as UEFI CSM mode. That deprecated infrastructure required x86 CPUs to boot in 16-bit mode, and the CPU is still expected to switch into 16-bit mode when executing UEFI CSM code today. That requirement alone means 16-bit mode can't be removed from the CPU design: removing it would make OSes like WinXP and Vista (and, to a lesser extent, Win7 and Win8) unbootable on such systems, even though all of those operating systems only use the BIOS support during initial startup.
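(Aside: on a Linux box you can see which boot path your machine took, since the kernel only creates /sys/firmware/efi when the firmware handed off through UEFI. A minimal sketch, assuming standard sysfs paths:)

```python
from pathlib import Path

# The Linux kernel exposes this directory only when the system was
# booted through UEFI; if it's absent, the firmware used the legacy
# 16-bit BIOS path (which includes UEFI's CSM compatibility mode).
if Path("/sys/firmware/efi").exists():
    print("Booted via UEFI")
else:
    print("Booted via legacy BIOS (or UEFI CSM)")
```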

Getting rid of the legacy support isn't simple, either. Many places, governments, companies, etc. rely on that legacy support to run their operations. There was a US state that recently said their COVID stimulus checks would be delayed due to computer programming issues. Turns out their unemployment system is still written in COBAL, a programming language that is 60 years old. Upgrades are not cheap, and in some cases an "upgrade" would require either complete replacement of entire infrastructure or complete rewrites / recreation of source code, all of which is usually cost-prohibitive.

Another problem is ARM's lack of the standards that are taken for granted on x86. An example is hardware detection: on x86 you have things like ACPI tables in the firmware that tell the OS what hardware is present and where in memory it is located. On ARM, that information has to be built into the OS / device bootloader, which is one of the reasons upgrading things like Android devices is so painful and typically not done by manufacturers.
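(To make the ACPI point concrete: Linux exports the firmware's ACPI tables under sysfs, and every table starts with the same standard header. A rough sketch that lists them, assuming a Linux x86 system; reading the table files usually requires root:)

```python
import struct
from pathlib import Path

TABLE_DIR = Path("/sys/firmware/acpi/tables")

# Every ACPI table begins with a standard header; the first 16 bytes
# are: signature (4 bytes), length (u32), revision (u8), checksum (u8),
# and OEM ID (6 bytes).
if TABLE_DIR.exists():
    for table in sorted(TABLE_DIR.iterdir()):
        if not table.is_file():
            continue
        sig, length, rev, _checksum, oem = struct.unpack(
            "<4sIBB6s", table.read_bytes()[:16]
        )
        oem_id = oem.decode(errors="replace").strip("\x00 ")
        print(f"{sig.decode():<4} rev {rev:<3} len {length:<7} OEM {oem_id}")
else:
    print("No ACPI tables exposed (not Linux, or not an ACPI system)")
```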

None of this is to say ARM doesn't have utility or that x86 cannot be fixed, but each architecture has its uses and advantages. You probably won't be running the next up-and-coming AAA game on an ARM device, but if you just want some Facebook / YouTube on the go, ARM is a pretty good choice. Don't care about power consumption, and need raw compute power / legacy application support? x86 has you covered. It's about using the right tool for the right job. In that regard, the future remains bright for both platforms.
dtgreene: Case in point: Try creating a file with the name "con" (with *any* extension) on a Windows system and see what happens.
osm: I'm sure there are a lot of quirks like that with the mammoth crappile that is Win, but I don't even have Crapdose on my desktop. Will try on a work system. Will it kill kittens?
It's a reserved file name for legacy reasons. (DOS exposed devices as special file names so programs could use them like ordinary files: CON for the console, PRN for the printer, AUX and NUL, and COM1, COM2, etc. for the serial ports. Windows was originally a GUI frontend application for DOS and as such had to abide by DOS's restrictions. Those limitations were kept in Win95, Win98, and WinME, as they were still DOS under the hood. They were then carried over to WinNT, Windows 2000, and WinXP for compatibility reasons, and they still exist today as legacy code kept around from back then.)
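(You don't even need a Windows box to see the rule: Python's pathlib knows the DOS device names. A quick sketch; note that is_reserved() is deprecated as of Python 3.13, but it still illustrates the point:)

```python
from pathlib import PureWindowsPath

# DOS device names (CON, PRN, AUX, NUL, COM1-COM9, LPT1-LPT9) are
# reserved on Windows regardless of extension; pathlib encodes that
# rule. (is_reserved() is deprecated in Python 3.13+, but works.)
for name in ["con", "con.txt", "com1.log", "nul", "readme.txt"]:
    print(f"{name:<12} reserved: {PureWindowsPath(name).is_reserved()}")
```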
Post edited February 21, 2021 by phibbs
phibbs: Turns out their unemployment system is still written in COBAL
Pretty sure it's CABAL ;-)

PS also, "unemployment system in COBOL" - oh the irony!

PPS You could've taken some time to look through the posts in this whopping 2-page thread - could've saved you a lot of typing
Lionel212008: The M1 has shown us a very different approach to chip design. It's more than just a mobile chip on a laptop. It has shaken up the x86 hegemony.
pds41: I don't think that Apple's constant flip-flopping on processor design and usage has much impact in the grand scheme of things. They have good market share in mobile phones, but their computer offerings remain pretty niche (in the business environment, they have reasonable penetration in creative industries but not much elsewhere).

The x86-64 design will continue to adapt and develop.
Actually, Apple has flipped CPU architectures multiple times throughout the Mac's life. Originally, the Mac ran on Motorola's 68000-series CPUs. In the mid-'90s (the System 7 era), Apple switched to the PowerPC CPUs it co-developed with IBM and Motorola. Later, Apple switched to Intel's x86 CPUs. Now Apple has flipped to its own self-branded ARM CPUs.

Sadly, each flip has had some level of impact on application development. There have also been issues with legacy compatibility. (Again, there's a reason x86 is still around.) And all of that comes before Apple's own shenanigans with deprecating standards. (Deprecating OpenGL and rejecting Vulkan are the most recent examples, but Apple has done this in the past as well.) None of this is good for application developers, who need stable systems to develop for, or for end users, who expect their software to work years later. Even Apple has trouble dealing with the constant changes, as they routinely send out updates that break their own software and devices. As a result, most people steer clear of macOS unless they can afford the constant upgrades and workflow changes that macOS comes with.

If Apple would get over it's commitment issues, it could over take the PC market. People love Apple's simplicity and design. They also trust Apple more than other manufacturers. The biggest limitation on dominance Apple has is it's own inability to work with others.
phibbs: Turns out their unemployment system is still written in COBAL
osm: Pretty sure it's CABAL ;-)

PS also, "unemployment system in COBOL" - oh the irony!

PPS you could've taken some time to look throught the posts in this whopping 2-page thread - could've saved you a lot of typing
Heh.

I could have, but I wanted to post my own take. The thread got me all nostalgic. :)
pds41: I don't think that Apple's constant flip-flopping on processor design and usage has much impact in the grand scheme of things.
phibbs: If Apple would get over it's commitment issues, it could over take the PC market. People love Apple's simplicity and design. They also trust Apple more than other manufacturers. The biggest limitation on dominance Apple has is it's own inability to work with others.
Sorry to be that guy: but it's "its" not "it's". "it's" is one of those special cases where you only use it for "it is" and not for any other situation. I also got that wrong until I was 24 and took a course in teaching English as a foreign language.

Not sorry: I respectfully disagree with you. Apple is overpriced. Apple wants you to send your machine back to them for repairs even if it's just a failed hard drive or a dead CMOS battery. Apple sells itself on its iconic, reliable products. I enjoy using Apple Macs, although I have never bought one. But what you're buying comes off a very standardised assembly line and is massively marked up.

For that reason, most home consumers won't be massively impacted by the processor change, as Apple emulates the previous architecture (or at least used to) for a few macOS release cycles. But businesses don't like it, because there comes a point when the old architecture stops being emulated, or the newer macOS becomes unsupported on it.

Hopefully Intel will be successful in solving their problems but que será, será.
phibbs: The thread got me all nostalgic. :)
http://landley.net/history/mirror/os2/history/ then
phibbs: Another problem is ARM's lack of standards which are taken for granted on x86. An example of this is hardware detection. On x86 you have things like ACPI tables in the firmware that tell the OS what hardware is present and where in memory the hardware is located. On ARM that stuff has to be built into the OS / device bootloader. Which is one of the reasons why upgrading things like Android devices is so painful and typically not done by manufacturers.
This sort of thing is why the Linux kernel developers introduced something called "device tree", to try to deal with all the special cases of ARM hardware.
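(If you're on an ARM Linux box you can poke at that device tree directly; the kernel exposes it under /proc/device-tree. A small sketch, assuming such a system:)

```python
from pathlib import Path

# /proc/device-tree is a symlink to /sys/firmware/devicetree/base on
# systems booted with a device tree (typical on ARM; absent on most
# x86/ACPI systems).
dt = Path("/proc/device-tree")
if dt.exists():
    # The root "compatible" property is a NUL-separated list of strings
    # identifying the board, most specific first.
    compat = (dt / "compatible").read_bytes().split(b"\x00")
    print("Compatible:", ", ".join(s.decode() for s in compat if s))
else:
    print("No device tree exposed (probably an ACPI system)")
```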

phibbs: If Apple would get over it's commitment issues, it could over take the PC market. People love Apple's simplicity and design. They also trust Apple more than other manufacturers. The biggest limitation on dominance Apple has is it's own inability to work with others.
lupineshadow: Sorry to be that guy: but it's "its" not "it's". "it's" is one of those special cases where you only use it for "it is" and not for any other situation. I also got that wrong until I was 24 and took a course in teaching English as a foreign language.
On that tangent:

In English, apostrophes are used for noun possessives, but never for pronoun possessives. Words like "my", "your", "his", "her" (and "hers"), "whose" (don't confuse it with "who's", which is a contraction just like "it's"), "our", and so on have no apostrophes, so why would "its" be any different?

English may be inconsistent, but at least it's consistent in that respect.
Post edited February 21, 2021 by dtgreene
Lionel212008: Does the future look more ARM for the mainstream?
ARM is not good for serious gaming.
Apple has done nothing good for serious gaming, either.