Posted December 21, 2020

CthulhuInSpace
Just because we disagree doesnt mean I hate you.
Registered: Oct 2008
From United States

Themken
Old user
Registered: Nov 2011
From Other
Posted December 21, 2020
The Ryzen 4000 processors come with integrated graphics, mobile or not. The desktop versions are closer to the 3000 series than the 5000 series in performance.

Judicat0r
New User
Registered: Dec 2009
From Italy
Posted December 21, 2020


All that needs to be done is as follows:
* Compilers need to support the new architecture as a target; both gcc and clang support aarch64, so this isn't a problem here.
* Assembly programs need to be converted to the new architecture; fortunately, assembly is used very little these days (OS kernels, device drivers, and embedded systems being where you're most likely to see it). (ZSNES is one example of a program that won't make the transition because of this, but then again, it doesn't even support amd64, which is currently the most common desktop/laptop ISA.)
* Any bugs related to subtle differences that affect higher-level languages need to be taken care of. This shouldn't be too much of an issue, but there might be programs that try to do things like access 4-byte values at addresses that aren't multiples of 4 (see the sketch after this list).
Solve those, and the software is now running on the new CPU type.
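The alignment issue in that last point is easy to show with a few lines of generic C. This is just a minimal sketch of the pitfall and the portable fix, not taken from any real program:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned char buf[8] = {0x78, 0x56, 0x34, 0x12, 0xEF, 0xBE, 0xAD, 0xDE};

    /* Non-portable: treats an address that is not necessarily 4-byte
     * aligned as a uint32_t pointer. On x86 this usually "just works"
     * (perhaps a little slower); on stricter architectures, or with
     * strict alignment checking enabled, it can fault, and it is
     * undefined behaviour in C either way. */
    uint32_t bad = *(uint32_t *)(buf + 1);

    /* Portable: copy the bytes into a properly aligned object instead. */
    uint32_t good;
    memcpy(&good, buf + 1, sizeof good);

    printf("unaligned load: 0x%08X, memcpy load: 0x%08X\n",
           (unsigned)bad, (unsigned)good);
    return 0;
}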


(The ones in the Raspberry Pi 3 and 4, for example, do so, and in fact Raspberry Pi OS runs in 32-bit mode.)
On the other hand, it's worth noting that Microsoft's OSes haven't run directly on the metal for years now; between virtualization, abstraction layers, and translation, they could run on different architectures with reasonable effort.

dtgreene
vaccines work she/her
Registered: Jan 2010
From United States
Posted December 21, 2020
You still can't run software for different architectures without some CPU emulation, which will cause a loss in performance.

Judicat0r
New User
Registered: Dec 2009
From Italy
Posted December 22, 2020


But, yes, the core of my point was this: ARM is here to stay and may eventually replace x64, but the hurdles it has to overcome are not easy ones, because beyond the technical reasons there are practical/economical/logistical ones.
It succeeded in the mobile and low-power market, but in HPC and servers not quite, despite having remarkable hardware capable of very good performance.
Why?
In my opinion, for the aforementioned reasons: steering the industry to a whole new standard takes years and the will to do it.
Tangentially, it's worth noting that Intel has a major part in this situation, with years of stagnating technology; indeed, when AMD came back into the fight, they were caught with their pants down.
Imagine now the performance of the M1 without AMD hardware to compare it to.
What I am trying to say is that x64 is facing competition from ARM also because of more than ten years of poor innovation and lack of technical advancement; Apple abandoned Intel hardware, guess why.
If the competition had been stronger throughout the last decade, well, I'm not so sure we would be at this point now.

dtgreene
vaccines work she/her
Registered: Jan 2010
From United States

Judicat0r
New User
Registered: Dec 2009
From Italy
Posted December 22, 2020


The purpose of emulation is to translate software in a way that is comprehensible to, and efficiently executable by, the hardware. Now, I'm not aware of the intricacies of Apple's approach, but I know it happens in a couple of different ways.
One implies that the software is translated in memory by Rosetta 2 and then executed; the other should translate the non-native application before it is started and then execute it on local hardware. I'm simplifying here but, again, I'm not an expert on that stuff.
As far as I am aware, none of the mentioned methods involves native x86-64 hardware in Apple's silicon, and to my knowledge the M1 isn't a hybrid-architecture chip.
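For what it's worth, the difference between the two approaches can be sketched with a toy example in C. This has nothing to do with Rosetta 2's actual internals; the "guest" opcodes and every function name here are invented purely to show the idea of translating everything before launch versus translating lazily while running:

#include <stddef.h>
#include <stdio.h>

/* A made-up "guest" instruction set: opcodes that operate on one register. */
enum { OP_INC, OP_DOUBLE, OP_PRINT, OP_HALT };

typedef void (*host_fn)(long *reg);          /* "translated host code": a C function */

static void host_inc(long *r)    { (*r)++; }
static void host_double(long *r) { (*r) *= 2; }
static void host_print(long *r)  { printf("reg = %ld\n", *r); }

/* "Translation": map one guest opcode to a piece of host code.
 * A real binary translator emits machine code; a function pointer stands in here. */
static host_fn translate(int opcode)
{
    switch (opcode) {
    case OP_INC:    return host_inc;
    case OP_DOUBLE: return host_double;
    case OP_PRINT:  return host_print;
    default:        return NULL;             /* OP_HALT: nothing to run */
    }
}

/* Strategy 1: translate the whole program before it starts, then run the result. */
static void run_ahead_of_time(const int *prog, size_t n)
{
    host_fn translated[64];                  /* fixed small size is fine for this toy */
    long reg = 1;
    for (size_t i = 0; i < n; i++)           /* all translation cost paid up front */
        translated[i] = translate(prog[i]);
    for (size_t i = 0; i < n && translated[i]; i++)
        translated[i](&reg);
}

/* Strategy 2: translate lazily, the first time each instruction is reached,
 * caching the result (roughly the shape of a JIT-style translator). */
static void run_on_demand(const int *prog, size_t n)
{
    host_fn cache[64] = {0};
    long reg = 1;
    for (size_t i = 0; i < n; i++) {
        if (!cache[i])
            cache[i] = translate(prog[i]);   /* translation interleaved with execution */
        if (!cache[i])
            break;
        cache[i](&reg);
    }
}

int main(void)
{
    const int prog[] = { OP_INC, OP_DOUBLE, OP_DOUBLE, OP_PRINT, OP_HALT };
    run_ahead_of_time(prog, 5);
    run_on_demand(prog, 5);
    return 0;
}

Either way the processor only ever executes host (ARM) instructions; the x86-64 code itself is never run directly.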

dtgreene
vaccines work she/her
Registered: Jan 2010
From United States
Posted December 22, 2020


If you have a program that does heavy calculations on the CPU, then the performance hit from emulation will be noticeable. If we have an ARM CPU and are emulating a program compiled for an x86 CPU, it's going to run significantly slower than if the program had been compiled for ARM and run natively.
Either you have native hardware (in which case you are actually using an x86-compatible CPU to run x86 programs), or you are emulating it in software.
(Worth noting: hardware virtualization only works if the CPU can natively execute the target instruction set. It's apparently possible to run Linux in a VM on an Apple Silicon Mac, but it has to be a version compiled for ARM; it still can't do an x86 VM without emulation.)
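To make the overhead concrete, here's a toy comparison in C: the same arithmetic done directly versus driven through an interpreter-style fetch/decode/dispatch loop. The "guest program" and opcode names are invented for this sketch, and real emulators and binary translators are far cleverer than this, but the basic story is the same: each guest instruction turns into several host operations.

#include <stdio.h>
#include <time.h>

enum { G_ADD, G_LOOP_END };                  /* made-up guest opcodes */

static long run_native(long iters)
{
    volatile long acc = 0;                   /* volatile so the loop isn't optimised away */
    for (long i = 0; i < iters; i++)
        acc += 1;                            /* the work, done directly */
    return acc;
}

static long run_emulated(long iters)
{
    /* The "guest program": one ADD then an end-of-loop marker. Volatile
     * forces a real fetch of each opcode, as an interpreter has to do. */
    static volatile int program[] = { G_ADD, G_LOOP_END };
    volatile long acc = 0;
    for (long i = 0; i < iters; i++) {
        for (int pc = 0; program[pc] != G_LOOP_END; pc++) {
            switch (program[pc]) {           /* decode + dispatch */
            case G_ADD: acc += 1; break;     /* execute */
            }
        }
    }
    return acc;
}

int main(void)
{
    const long N = 50 * 1000 * 1000;
    clock_t t0 = clock();
    long a = run_native(N);
    clock_t t1 = clock();
    long b = run_emulated(N);
    clock_t t2 = clock();

    printf("native:   %ld in %.3fs\n", a, (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("emulated: %ld in %.3fs\n", b, (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}

Exact numbers depend on the compiler and optimisation level, but the emulated path always does strictly more work per "guest" operation, which is exactly the performance hit being described.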

Abishia
New User
Registered: Jul 2015
From Netherlands
Posted December 22, 2020
No, I have a Ryzen 7 3800 and people said I bought too heavy a CPU.
(BTW, it's an absolute beast of a CPU; if only GPUs would leap forward this much.)

Phasmid
New User
Registered: Apr 2012
From New Zealand

CthulhuInSpace
Just because we disagree doesnt mean I hate you.
Registered: Oct 2008
From United States

Dark_art_
🔴I'm just glad that cows don't fly YO
Registered: Dec 2017
From Portugal
Posted December 23, 2020

Funny how Apple seems to not want to deal with nVidia and in the end must pay them royalties/licensing fees.

WishmasterTheDark
Heavy Metal Inquisitor
Registered: Feb 2017
From Montenegro
Posted December 23, 2020
low rated
When I decide to buy a new graphics card it will be Nvidia. As always.

dtgreene
vaccines work she/her
Registered: Jan 2010
From United States
Posted December 23, 2020
When I decide to buy a new graphics card it will definitely not be Nvidia, as always.
(Problem is that they keep their Linux driver proprietary, which results in it not playing well with the kernel; plus I really don't want to run proprietary software in kernel mode, where malicious code can do the most harm. If they start open sourcing their driver, and said driver ends up in mainline Linux, then I'll consider Nvidia, but not before.)

Phasmid
New User
Registered: Apr 2012
From New Zealand
Posted December 23, 2020

Jensen might be an effective CEO for internal matters, but he's burnt bridges with nearly everyone nVidia has ever worked with: Sony, MS, Tesla, Apple. Maybe Nintendo too, given that the Switch jailbreak was 100% nVidia's fault. Intel picked their direct competitor AMD over nVidia when they needed a graphics option a few years ago. Even their AIB partners dislike nVidia, after GPP tried to co-opt their branding and their profit margins were dictated to them.