
LeahKerr: Is the 5000 series better than the 4000 series?
The 5000 series is the newest Ryzen CPU series. Short answer: yes.
The Ryzen 4000 processors come with integrated graphics, mobile or not. In performance, the desktop versions are closer to the 3000 series than to the 5000 series.
Judicat0r: Software would have to be rewritten for the new architecture, and that could take many years; beyond the rewriting itself, steering the whole industry to a new standard is a massive task.
dtgreene: It's worth noting that this issue isn't as bad as it sounds, and it's mainly a problem for proprietary software that the developer doesn't want to compile for the new architecture, or for abandoned proprietary software (a category that includes a lot of games).

All that needs to be done is as follows:
* Compilers need to support the new architecture as a target; both gcc and clang support aarch64, so this isn't a problem here.
* Assembly programs need to be rewritten for the new architecture; fortunately, assembly is used very little these days (OS kernels, device drivers, and embedded systems being where you're most likely to see it). (ZSNES is one example of a program that won't transfer because of this, but then again, it doesn't even support amd64, which is currently the most common desktop/laptop ISA.)
* Any bugs related to subtle differences that affect higher-level languages need to be taken care of. This shouldn't be too much of an issue, but there might be programs that try to do things like access 4-byte values at addresses that aren't multiples of 4 (see the sketch after this list).

Solve those, and the software is now running on the new CPU type.
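
To make the alignment point concrete, here's a minimal C sketch (the buffer and values are invented for the example) of the kind of access that happens to work on x86 but is undefined behavior in C and can fault on strict-alignment targets, plus the portable fix:

#include <stdint.h>
#include <string.h>
#include <stdio.h>

int main(void) {
    unsigned char buf[8] = {1, 2, 3, 4, 5, 6, 7, 8};

    /* Risky: buf + 1 is not 4-byte aligned. x86 tolerates the
     * misaligned load; strict-alignment targets may fault, and the
     * C standard calls it undefined behavior either way. */
    uint32_t risky = *(uint32_t *)(buf + 1);

    /* Portable: memcpy lets the compiler emit whatever loads the
     * target architecture actually supports. */
    uint32_t safe;
    memcpy(&safe, buf + 1, sizeof safe);

    printf("%08x %08x\n", (unsigned)risky, (unsigned)safe);
    return 0;
}

Both variants build with stock gcc or clang; the difference only shows up at run time on a strict-alignment target, which is exactly why such bugs surface during a port.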
Dark_art_: ARM is "backwards-compatibility free" at the moment, which means that, unlike x86/x64, all the resources get to be fully utilized. IMHO, if someone designed an x64 chip with only one application in mind, I have no doubt it would be more efficient than an ARM chip.
dtgreene: Not quite true; most aarch64 CPUs support 32-bit ARM programs.

(The ones in the Raspberry Pi 3 and 4, for example, do so, and in fact Raspberry Pi OS runs in 32-bit mode.)
Yeah, we agree on that, but it's not about technical reasons so much as practical ones: proprietary software is the core of the problem, and industry standards have the greatest inertia.

On the other hand, it's worth noting that Microsoft's OSes haven't run on bare metal for years now; between virtualization, abstraction layers, and translation, they could run on different architectures with reasonable effort.
Judicat0r: On the other hand, it's worth noting that Microsoft's OSes haven't run on bare metal for years now; between virtualization, abstraction layers, and translation, they could run on different architectures with reasonable effort.
You still can't run software for different architectures without some CPU emulation, which will cause a loss in performance.
Judicat0r: On the other hand, it's worth noting that Microsoft's OSes haven't run on bare metal for years now; between virtualization, abstraction layers, and translation, they could run on different architectures with reasonable effort.
dtgreene: You still can't run software for different architectures without some CPU emulation, which will cause a loss in performance.
It depends on the type of emulation: it can be done in software, but it can also use much faster dedicated hardware support, as on the M1, with rather good performance.

But yes, the core of my point was this: ARM is here to stay and may eventually replace x64, but the hurdles it has to overcome are not easy ones, because beyond the technical reasons there are practical/economic/logistic ones.

It has succeeded in the mobile and low-power market, but not quite in HPC and servers, despite having remarkable hardware capable of very good performance.
Why?
In my opinion, for the aforementioned reasons: steering the industry to a whole new standard takes years, and the will to do it.

Tangentially, it's worth noting that Intel has a major part in this situation after years of stagnating technology; indeed, when AMD came back into the fight, they were caught with their pants down.
Now imagine the performance of the M1 without AMD hardware to compare it to.

What I'm trying to say is that x64 is facing competition from ARM partly because of ten-plus years of poor innovation and lack of technical advancement; Apple abandoned Intel hardware, guess why.
If the competition had been stronger throughout the last decade, well, I'm not so sure we would be at this point now.
Judicat0r: It depends on the type of emulation: it can be done in software, but it can also use much faster dedicated hardware support, as on the M1, with rather good performance.
Hardware emulation would mean that there'd actually have to be an x86-compatible chip in there somewhere.
Judicat0r: It depends on the type of emulation: it can be done in software, but it can also use much faster dedicated hardware support, as on the M1, with rather good performance.
dtgreene: Hardware emulation would mean that there'd actually have to be an x86-compatible chip in there somewhere.
Frankly, I'm not an expert, but what you write kind of defeats the purpose of emulation altogether if you still need native hardware; take console emulators, for example. Additionally, hardware emulation can be implemented in different ways.

The purpose of emulation is to translate software into a form that the hardware can understand and execute efficiently. Now, I'm not aware of the intricacies of Apple's approach, but I know it happens in a couple of different ways.

One has the software translated in memory by Rosetta 2 and then executed; the other translates the non-native application before it is first started and then executes it on local hardware. I'm simplifying here, but, again, I'm not an expert on that stuff.

As far as I'm aware, none of the mentioned methods involves native x86-64 hardware in Apple's silicon, and to my knowledge the M1 isn't a hybrid-architecture chip.
dtgreene: Hardware emulation would mean that there'd actually have to be an x86-compatible chip in there somewhere.
Judicat0r: Frankly, I'm not an expert, but what you write kind of defeats the purpose of emulation altogether if you still need native hardware; take console emulators, for example. Additionally, hardware emulation can be implemented in different ways.

The purpose of emulation is to translate software into a form that the hardware can understand and execute efficiently. Now, I'm not aware of the intricacies of Apple's approach, but I know it happens in a couple of different ways.

One has the software translated in memory by Rosetta 2 and then executed; the other translates the non-native application before it is first started and then executes it on local hardware. I'm simplifying here, but, again, I'm not an expert on that stuff.

As far as I'm aware, none of the mentioned methods involves native x86-64 hardware in Apple's silicon, and to my knowledge the M1 isn't a hybrid-architecture chip.
Any translation will result in a reduction of performance.

If you have a program that does heavy calculations on the CPU, then the cost of emulation will be noticeable. If we have an ARM CPU emulating a program compiled for an x86 CPU, it's going to run significantly slower than if the program had been compiled for ARM and run natively.

Either you have native hardware (in which case you are actually using an x86-compatible CPU to run x86 programs), or you are emulating it in software.

(Worth noting: Hardware virtualization only works if the CPU can natively execute the target instruction set. It's apparently possible to run Linux on a VM in an Apple Silicon mac, but it has to be a version compiled for ARM; it still can't do an x86 VM without emulation.)
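
As a toy illustration of where the software-emulation overhead comes from (the three-instruction guest ISA below is invented for the example; a real emulator is vastly more involved): every guest instruction costs the host a fetch, a decode, and a dispatch branch on top of the actual work, where natively compiled code would spend roughly one instruction. Translators like Rosetta 2 aim to pay that cost once, up front, rather than on every execution.

#include <stdint.h>
#include <stdio.h>

/* Made-up guest ISA, purely for illustration. */
enum { OP_ADD, OP_SUB, OP_HALT };

typedef struct { uint8_t op, dst, src; } GuestInsn;

/* Interpret a guest program over a tiny register file. The switch is
 * the per-instruction fetch/decode/dispatch overhead that native
 * execution doesn't pay. */
static void run(const GuestInsn *pc, uint32_t *regs) {
    for (;; pc++) {
        switch (pc->op) {
        case OP_ADD: regs[pc->dst] += regs[pc->src]; break;
        case OP_SUB: regs[pc->dst] -= regs[pc->src]; break;
        case OP_HALT: return;
        }
    }
}

int main(void) {
    uint32_t regs[4] = {10, 3, 0, 0};
    const GuestInsn prog[] = {
        {OP_ADD, 0, 1},  /* r0 += r1 -> 13 */
        {OP_SUB, 0, 1},  /* r0 -= r1 -> 10 */
        {OP_HALT, 0, 0},
    };
    run(prog, regs);
    printf("r0 = %u\n", (unsigned)regs[0]);  /* prints: r0 = 10 */
    return 0;
}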
No, I have a Ryzen 7 3800 and people said I bought too heavy a CPU.

(btw it's an absolute beast of a CPU; if only GPUs would leap forward as much)
Judicat0r: What I'm trying to say is that x64 is facing competition from ARM partly because of ten-plus years of poor innovation and lack of technical advancement; Apple abandoned Intel hardware, guess why.
Apple going ARM was probably inevitable at some point given that they like a tightly controlled supply chain. Intel supply and thermal issues may have been the last straw, but Apple could have gone fully AMD (CPU/GPU) instead of in-house.

The talk of ARM replacing x86 really brings me back, though, since there were similar murmurings in the 90s with Oracle's Network Computers set to replace PCs via ARM/DEC's StrongARM etc. Didn't happen then, obviously, but it would have been interesting if it had.

Intel's fundamental mistake in retrospect was tying architectural improvements to their fab business. It's hard to be sympathetic when they most certainly milked a dominant position, with literally no improvement in core counts at the consumer level and mediocre IPC gains since Nehalem/Sandy Bridge respectively, but they'd still be competitive if they had got their 10nm process working properly. TigerLake has, and IceLake had, a good IPC gain over SkyLake, but it's a laptop-only gain (and 1 (?) server chip, seemingly produced solely to fulfill a pledge to shareholders) until early next year, assuming the backport doesn't slip again.

It should also be noted that a lot of AMD's improvements with Zen are not strictly architectural, but due to more cache, managed better; ironically, Intel did much the same with its initial Broadwell chips by adding EDRAM modules to compensate for low clocks. IIRC the 5775C still has the best IPC of any Intel desktop chip as a result. It doesn't really matter to the consumer where the performance comes from, though, of course.
Abishia: No, I have a Ryzen 7 3800 and people said I bought too heavy a CPU.

(btw it's an absolute beast of a CPU; if only GPUs would leap forward as much)
Nice!
Phasmid: Apple going ARM was probably inevitable at some point given that they like a tightly controlled supply chain. Intel supply and thermal issues may have been the last straw, but Apple could have gone fully AMD (CPU/GPU) instead of in-house.
They may have anti-competitive agreements; who knows what's stipulated in the contract.

Funny how Apple seems to not want to deal with nVidia and in the end must pay them royalties/licensing fees.
When I decide to buy a new graphics card it will be Nvidia. As always.
Wishmaster777: When I decide to buy a new graphics card it will be Nvidia. As always.
When I decide to buy a new graphics card it will definitely not be Nvidia, as always.

(Problem is that they keep their Linux driver proprietary, which results in it not playing well with the kernel; plus I really don't want to run proprietary software in kernel mode, where malicious code can do the most harm. If they start open sourcing their driver, and said driver ends up in mainline Linux, then I'll consider Nvidia, but not before.)
Dark_art_: Funny how Apple seems to not want to deal with nVidia and in the end must pay them royalties/licensing fees.
Not that odd, really; just about no one wants to deal with nVidia, Apple included.

Jensen might be an effective CEO for internal matters, but he's burnt bridges with nearly everyone nVidia has ever worked with: Sony, MS, Tesla, Apple. Maybe Nintendo too, given that the Switch jailbreak was 100% nVidia's fault. Intel picked their direct competitor AMD over nVidia when they needed a graphics option a few years ago. Even their AIBs dislike nVidia, over GPP trying to co-opt their branding and over having their profit margins dictated to them.