The M1 has shown us a very different approach to chip design. It's more than just a mobile chip in a laptop. It has shaken up the x86 hegemony.

As transistor upgrades become more difficult, chip-makers have to bank on innovation to make themselves stand out. This means there will be a focus on 3D chip stacking, more efficient system design approaches, and an emphasis on microarchitectural idiosyncrasies (the Apple M1 does this well with its wider core design, unified memory architecture, innovative cache design, and unique out-of-order execution approach). It is difficult for x86 to mimic the M1 because of its legacy baggage.

I am not saying that an M1-like design approach is a panacea.

However, it is a design that does make a lot of sense.

Does the future look more ARM for the mainstream?

Disclaimer: I am not an engineer. Forgive me if I have misunderstood certain chip design concepts. I am coming at this mainly from a sustainability, or performance-per-watt, angle.
Post edited February 19, 2021 by Lionel212008
Not the first or the last time some new ARM chip design has spelled the end of x86. Nonetheless, x86 is still here and continues to thrive. It'll take something a lot more earth-shattering, like mass-produced consumer quantum computing CPUs, to actually put an end to x86. This is just another evolution of ARM, not a game-changing revolution in computing. The most it'll do is provide an incentive to introduce another innovation to boost x86 efficiency/performance.
Post edited February 19, 2021 by anzial
anzial: Not the first or the last time some new ARM chip design has spelled the end of x86. Nonetheless, x86 is still here and continues to thrive. It'll take something a lot more earth-shattering, like mass-produced consumer quantum computing CPUs, to actually put an end to x86. This is just another evolution of ARM, not a game-changing revolution in computing. The most it'll do is provide an incentive to introduce another innovation to boost x86 efficiency/performance.
Actually, when it comes to quantum computers, I have a feeling that they won't replace CPUs, but will rather be something like GPUs, as the programming model you need to use is very different. In particular, I believe even a simple instruction like "x = y" won't work on a quantum computer, since every operation must be reversible. This means that most of the software written for classical computers can't be ported, and even the programming languages used will be fundamentally different, with some operations we take for granted being impossible.

Hence, I see quantum computers being used for specialized applications, like cryptography, and a classical CPU (whether x86, ARM, or another architecture entirely) being used for everything else, much like the current situation with GPUs.
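For the curious, here's a minimal sketch in plain Python (no quantum SDK) of what "every operation must be reversible" means: a quantum gate is a unitary matrix, so it always has an inverse, whereas a classical overwrite like x = y throws information away and cannot be undone. The Pauli-X gate below is standard; the helper function and variable names are just for illustration.

```python
# Minimal sketch (plain Python, no quantum SDK) of reversibility:
# quantum gates are unitary matrices, so every gate has an inverse,
# while a classical overwrite like "x = y" destroys the old value of x.

def apply(gate, state):
    """Apply a 2x2 gate (list of rows) to a single-qubit state [a, b]."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Pauli-X ("quantum NOT") is its own inverse: applying it twice restores the state.
X = [[0, 1],
     [1, 0]]

state = [0.6, 0.8]      # an arbitrary normalised qubit state
once = apply(X, state)  # "flipped"
twice = apply(X, once)  # flipped back -- no information lost
print(twice)            # [0.6, 0.8]

# By contrast, the classical assignment below destroys the old value of x;
# there is no operation that recovers it afterwards.
x, y = 1, 0
x = y                   # irreversible: the previous x is gone
```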
avatar
Lionel212008: I am not saying that an M1-like design approach is a panacea.

However, it is a design that does make a lot of sense.
You could say the same about many a chip design TBH. POWER makes a lot of sense, so do MIPS, the RISC-V effort, etc. Maybe even the Russian e2k (aka Elbrus, aka the VLIWy distant Itanium relative) does too... for something, somewhere, lol.

The market is king tho.

DOS doesn't make a lot of sense in 2021, yet the most popular (apparently) OS still carries some of its (mostly dreadful) design decisions in a huge bag of legacy crap, some of which is possibly traceable to CP/M.
And DOS was thoroughly outdated by OS/2 on i386 like 30 yrs ago. Yet you have C:\ in your computer, but not a lot of OS/2... Thanks to M$ thugs, corruption, and some really "smart" decision making on the part of IBM.

Making x86 obsolete will probably take something else, though. Heck, they said (and still say) that the DEC Alpha was a wonderful design... yet...

I personally like that you can actually buy a functional POWER9 (soon POWER10) system which is open down to the firmware and actually usable, at least for developing high-performance applications. Tho you can browse teh Internets with it too.

So the success of the M1 will most probably hinge on people habitually handing much of their hard-earned cash to a certain company, not on its technical superiority over x86, whether that's real or merely debatable.
Post edited February 19, 2021 by osm
Apple has changed its hardware a few times before; prior to using Intel it was using PowerPC, another RISC-type chip that Apple co-designed (with IBM and Motorola). The ARM design is useful for Apple because it allows them to develop software which works across their notebook and mobile device platforms. However, Apple's PC hardware sales are small compared to the industry overall, so the switch is unlikely to have a significant impact on Intel's sales of consumer chips and chipsets.

The modern approach for CPUs is to have a combination of "big cores" and "small cores" on a single chip. This allows for more efficient use of silicon die area, since it is unlikely a user will need to perform deep learning or heavy FP16 calculations on all 6, 8, or 24+ cores. My understanding is that Intel uses this design on its consumer processors from the 12th-gen Core series, and that Apple is already using this in its ARM processors.
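As a rough illustration (an assumption-laden sketch, not anything from the post above): on a Linux machine with one of Intel's hybrid parts, recent kernels expose the two core types as separate PMU devices, so a few lines of Python can show which logical CPUs are the "big" (performance) cores and which are the "small" (efficiency) cores. The sysfs paths are my assumption about that layout; on non-hybrid or non-Intel systems they simply won't exist.

```python
# Rough sketch (Linux only, and only meaningful on a hybrid CPU such as Alder Lake):
# read the CPU lists that recent kernels expose for the two core types.
# The paths below are an assumption about that sysfs layout; if they are absent,
# the machine most likely isn't a hybrid Intel part.
from pathlib import Path

def read_cpu_list(path):
    """Return the CPU list stored at `path`, or a note if the node doesn't exist."""
    p = Path(path)
    return p.read_text().strip() if p.exists() else "not present"

print("P-cores:", read_cpu_list("/sys/devices/cpu_core/cpus"))
print("E-cores:", read_cpu_list("/sys/devices/cpu_atom/cpus"))
```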

You also have to understand that ARM licenses an architecture; it doesn't own its own fabs like Intel does. So while there are shortages of chips for automotive, PCs, and other uses affecting things like graphics cards, this is unlikely to greatly affect the output of Intel chips, because Intel owns its own chip fabrication plants.

ARM is not the only consumer-grade architecture in competition with x86. There are others that may well replace ARM in the future as a consumer-grade offering. In my own opinion, as an incumbent in the consumer and enterprise space, Intel has a secure future for its product lines. The most likely outcome is that ARM continues to dominate mobile and x86 continues to dominate consumer PCs.
osm: DOS doesn't make a lot of sense in 2021, yet the most popular (apparently) OS still carries some of its (mostly dreadful) design decisions in a huge bag of legacy crap, some of which is possibly traceable to CP/M.
And DOS was thoroughly outdated by OS/2 on i386 like 30 yrs ago. Yet you have C:\ in your computer, but not a lot of OS/2... Thanks to M$ thugs, corruption, and some really "smart" decision making on the part of IBM.
Case in point: Try creating a file with the name "con" (with *any* extension) on a Windows system and see what happens.
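For anyone who wants to try it, here's a small Python sketch of that experiment (the filename "con.txt" is just an example; the behaviour comes from the DOS-era reserved device names, and the exact outcome varies by Windows version):

```python
# Sketch of the experiment described above. "CON" (along with NUL, PRN, AUX,
# COM1, LPT1, ...) is a reserved device name inherited from DOS, so on Windows
# "con.txt" is not treated as an ordinary file: depending on the Windows version
# the open() below either fails with an error or silently talks to the console
# device, and no con.txt ever shows up in the directory listing.
# (On Linux/macOS it just creates an ordinary file called con.txt.)
import os

try:
    with open("con.txt", "w") as f:
        f.write("hello\n")
except OSError as e:
    print("could not create the file:", e)

print("directory contents:", os.listdir("."))
```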
osm: DOS doesn't make a lot of sense in 2021, yet the most popular (apparently) OS still carries some of its (mostly dreadful) design decisions in a huge bag of legacy crap, some of which is possibly traceable to CP/M.
And DOS was thoroughly outdated by OS/2 on i386 like 30 yrs ago. Yet you have C:\ in your computer, but not a lot of OS/2... Thanks to M$ thugs, corruption, and some really "smart" decision making on the part of IBM.
dtgreene: Case in point: Try creating a file with the name "con" (with *any* extension) on a Windows system and see what happens.
I'm sure there are a lot of quirks like that with the mammoth crappile that is Win, but I don't even have Crapdose on my desktop. Will try on a work system. Will it kill kittens?
Post edited February 19, 2021 by osm
schewy: The modern approach for CPUs is to have a combination of "big cores" and "small cores" on a single chip. This allows for more efficient use of silicon die area, since it is unlikely a user will need to perform deep learning or heavy FP16 calculations on all 6, 8, or 24+ cores. My understanding is that Intel uses this design on its consumer processors from the 12th-gen Core series, and that Apple is already using this in its ARM processors.
Sounds like ARM's big.LITTLE.
dtgreene: Actually, when it comes to quantum computers, I have a feeling that they won't replace CPUs,
Agree to disagree. Quantum computing is the future, unless something better comes along. It won't happen tomorrow or within the next 10 years, but eventually it'll be everywhere, running everything.
dtgreene: Sounds like ARM's big.LITTLE.
That's exactly what it is. Intel basically stole the idea lol :) as they did many times before.
Post edited February 19, 2021 by anzial
dtgreene: Case in point: Try creating a file with the name "con" (with *any* extension) on a Windows system and see what happens.
osm: I'm sure there are a lot of quirks like that with the mammoth crappile that is Win, but I don't even have Crapdose on my desktop. Will try on a work system. Will it kill kittens?
On a modern Windows system, the worst that will happen is that you will get an informative error message.

On somewhat less modern Windows systems, you'll get an error message, but it won't make sense in the context.

You also get a strange error if you have a file named "con" and attempt to delete it via Windows Explorer.
dtgreene: Actually, when it comes to quantum computers, I have a feeling that they won't replace CPUs,
anzial: Agree to disagree. Quantum computing is the future, unless something better comes along. It won't happen tomorrow or within the next 10 years, but eventually it'll be everywhere, running everything.
Have you actually looked into how quantum computing actually works?
Post edited February 19, 2021 by dtgreene
schewy: The modern approach for CPUs is to have a combination of "big cores" and "small cores" on a single chip. This allows for more efficient use of silicon die area, since it is unlikely a user will need to perform deep learning or heavy FP16 calculations on all 6, 8, or 24+ cores. My understanding is that Intel uses this design on its consumer processors from the 12th-gen Core series, and that Apple is already using this in its ARM processors.
dtgreene: Sounds like ARM's big.LITTLE.
It has indeed found its way into Intel chips: https://www.phoronix.com/scan.php?page=news_item&px=Intel-Alder-Lake-Hybrid-Model

Also, AMD's CPUs, I think, even have an ARM core inside (the Platform Security Processor), for all the "trusted" stuff. Or something like that.
dtgreene: On a modern Windows system, the worst that will happen is that you will get an informative error message.

On somewhat less modern Windows systems, you'll get an error message, but it won't make sense in the context.

You also get a strange error if you have a file named "con" and attempt to delete it via Windows Explorer.
Please stop replying with one-liners to everyone else's response to the OP. If you want to add something to the discussion, that would be great, but you have posted a lot of junk and it ruins the conversation.
dtgreene: Sounds like ARM's big.LITTLE.
anzial: That's exactly what it is. Intel basically stole the idea lol :) as they did many times before.
Incidentally, when it comes to x86 instruction set extensions:
* If Intel develops an extension, AMD would later adopt it in many cases. (Notable exception: VMX, as AMD uses SVM instead)
* If AMD develops an extension, Intel would *not* adopt it. (Notable exception: long mode, which is probably the most important extension these days)
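If you want to see which of these your own machine reports, here's a quick and dirty check (Linux only; the flag names are the kernel's conventions rather than anything from the posts above: "lm" = long mode/AMD64, "vmx" = Intel VT-x, "svm" = AMD-V, "sse4a" = an AMD-only SSE extension Intel never adopted):

```python
# Quick check (Linux only) of which extensions the local CPU reports,
# by parsing the "flags" line of /proc/cpuinfo.
def cpu_flags():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for flag in ("lm", "vmx", "svm", "sse4a", "avx2"):
    print(f"{flag:6s} {'yes' if flag in flags else 'no'}")
```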
Anyway, most probably the motivation on Apple's side was a) control (as always) and b) unification. Not performance, let alone chip design.
Lionel212008: The M1 has shown us a very different approach to chip design. It's more than just a mobile chip in a laptop. It has shaken up the x86 hegemony.

As transistor upgrades become more difficult, chip-makers have to bank on innovation to make themselves stand out. This means there will be a focus on 3D chip stacking, more efficient system design approaches, and an emphasis on microarchitectural idiosyncrasies (the Apple M1 does this well with its wider core design, unified memory architecture, innovative cache design, and unique out-of-order execution approach). It is difficult for x86 to mimic the M1 because of its legacy baggage.

I am not saying that an M1-like design approach is a panacea.

However, it is a design that does make a lot of sense.

Does the future look more ARM for the mainstream?

Disclaimer: I am not an engineer. Forgive me if I have misunderstood certain chip design concepts. I am coming at this mainly from a sustainability, or performance-per-watt, angle.
Keep in mind that Apple's M1 is built on a more advanced node than AMD's CPUs, which would use around 25-30% less power if built on the same node. That's just to point that out, without downplaying the good job Apple has done with their CPU.

Near future? Not really. If you just look at the mobile market, ARM is king, queen and prince there, and in the ultra-low-power laptop segment with Chromebooks at the moment, which, as of yesterday's news, have eaten into Apple's market share BTW.

Distant future? Who knows, anything can happen, but you need much, much more than recompiling software for the new architecture in order to steer the industry to a new standard, whatever uninformed, less knowledgeable people might think.

Not to nitpick on anyone, I'll just leave this here: people often refer to ARM as an architecture while it is a microarchitecture, and x86 is basically legacy: today we have x64 CPUs which are only nominally CISC but are much closer to RISC CPUs internally.
Post edited February 19, 2021 by Judicat0r