avatar
WinterSnowfall: ...
Something I'm familiar with from working at the wire level, and that a lot of people don't understand, is that there's a lot of bias against integrated GPUs. Historically they were cheaply made and existed as a stopgap until an upgrade, or to save space, but the newer ones are actually aiming at performance. The PS4 is a great example of how flawed the common notion of integrated GPUs is among hardware enthusiasts. Fundamentally, an integrated GPU can be given custom buses and the like, which significantly increase bandwidth to the GPU and reduce the number of CPU operations needed.

The issue is that no one understands the fundamental purpose of the GPU anymore: to offload work from the CPU. The same goes for SPUs, onboard PICs, etc. That means, of course, that the newer stuff is going to have better performance while completely negating its historical purpose (since the hardware is being re-integrated). I foresee a push for integrated GPUs in the future, once people slowly realize this, and consequently a lot of heated debates from people with experience at the high level but no proper understanding of what's actually going on.

The general rule is: the closer the hardware, the better the bandwidth, and integration allows for significantly improved bandwidth.

I have an AMD C60, and my computer can run Skyrim smoothly, but not Terraria or Minecraft. It seems counter-intuitive until you realize what's really going on.
avatar
brouer: The 4700u gets close enough to the M1 that I can't tell the difference now that I can't compare them side by side.

[...]

Of course, I'm not going to run CP2077 on this thing, but games like Shadow Tactics and Anno 1404 (neither of which worked on the Macbook) run with all graphics settings maxed out.
avatar
WinterSnowfall: So that's on an RX Vega 7 iGPU... you're maxing out Anno 1404 at 1080p?

It would be good news for me as well, as the graphics performance seems comparable to the Xe line.

Edit: I'm so curious to see how it goes that I'm installing it now :P.

Update: It's kind of rough @1080p when I fully zoom out (I get around 40fps), so I'll stick with 1360x768, which is what I probably would have done anyway, given how hard the Anno UI is to read on smaller screens.

It could be that it would have worked better on Windows, who knows.
So, the 4700U has a Vega 7 GPU.

I have a laptop with a 3500U, which apparently has a Vega 8 GPU.

Looking at CPU comparison sites, the 4700U is clearly the better CPU (ignoring things like price and availability), and yet the GPU number is higher on the 3500U. Could someone explain what's going on?

(Then again, I get better performance on Stranger of Sword City Revisited on a desktop with a HD 4600 GPU, so things aren't as straightforward as they seem. (I'm suspecting a CPU bottleneck here, since the desktop's CPU is more powerful. Also, in case it matters, the games are being run via WINE.))
avatar
WinterSnowfall: I've been using a thin and light for years now - a fanless N5000 (Pentium Silver) based laptop. It had sufficient punch (or let's call it "slap") in its HD 605 iGPU to let me play the likes of Quake 3, Freelancer or Jedi Academy @1280x1024 60Hz, which, if you think about it, is pretty impressive. Again, this is a fanless system.

The downside? 4GB of RAM. Soldered, as most 13.3" notebooks come with these days. And limited performance when I had to do the odd compilation while away from home (less of a problem these days, I admit).
Wondering how it compares to my small laptop, which has chromebook tier specs. (Celeron CPU, 4GB RAM soldered, 64GB eMMC, according to some sites it may have a slot for an M.2 SATA SSD.)

(By the way, this laptop can run Ikenfell, which I believe would not run on my desktop, though I haven't tried since fixing the cooling and updating the OS to Debian bullseye; I note that the laptop ran the game back when it was still on buster.)
avatar
WinterSnowfall: Can't wait for the discrete GPUs from Intel, to be honest, after I've seen what this baby iGPU can do. Not that I'm going to buy one necessarily, but more competition in the space is welcome. And since Intel has open drivers in the Linux space, it's a win for us Linux users - we now have a potential alternative to increasingly overpriced AMD cards.
I'd like to see Intel's discrete GPUs as well. Ideally, I'd like it to be possible to build a computer with an AMD CPU and an Intel GPU.

I'm hoping that Intel has some good entries in the low budget GPU space, which apparently has been rather sparse even without the current GPU price inflation.
Post edited May 13, 2021 by dtgreene
avatar
dtgreene: Could someone explain what's going on?
Mandatory clip :P. In short, more CPU cores, less space for GPU cores. But the Vega 7 has slightly higher clocks than the Vega 8 I think, so you won't feel much of a difference, though there still is one.

P.S.: The 7/8 are actually the Vega CU (compute unit) counts of the iGPUs; not sure if you were aware of that.
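
If you ever want to double-check on your own machine, something along these lines should print the CU count per device; a rough sketch, assuming pyopencl and an OpenCL runtime for the iGPU are installed:

import pyopencl as cl

# list every OpenCL device the runtime exposes, with its compute unit count
for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(f"{device.name}: {device.max_compute_units} compute units")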

avatar
dtgreene: Wondering how it compares to my small laptop, which has chromebook tier specs. (Celeron CPU, 4GB RAM soldered, 64GB eMMC, according to some sites it may have a slot for an M.2 SATA SSD.)
I've been around an N4100 (Celeron) powered laptop as well. It is somewhat slower than the N5000, which has a higher boost clock and a larger boost window.
Post edited May 13, 2021 by WinterSnowfall
APUs need the competition. Maybe after DDR5 goes mainstream we'll have some serious iGPU battles.
avatar
WinterSnowfall: So that's on an RX Vega 7 iGPU... you're maxing out Anno 1404 at 1080p?

It would be good news for me as well, as the graphics performance seems comparable to the Xe line.

Edit: I'm so curious to see how it goes that I'm installing it now :P.

Update: It's kind of rough @1080p when I fully zoom out (I get around 40fps), so I'll stick with 1360x768, which is what I probably would have done anyway, given how hard the Anno UI is to read on smaller screens.

It could be that it would have worked better on Windows, who knows.
avatar
dtgreene: So, the 4700U has a Vega 7 GPU.

I have a laptop with a 3500U, which apparently has a Vega 8 GPU.

Looking at CPU comparison sites, the 4700U is clearly the better CPU (ignoring things like price and availability), and yet the GPU number is higher on the 3500U. Could someone explain what's going on?

(Then again, I get better performance on Stranger of Sword City Revisited on a desktop with a HD 4600 GPU, so things aren't as straightforward as they seem. (I'm suspecting a CPU bottleneck here, since the desktop's CPU is more powerful. Also, in case it matters, the games are being run via WINE.))
I'd need a list of games and your results on laptop vs. desktop to give you an accurate answer. Knowing your responses before, I'd bet it's a matter of bandwidth, as I explained above. Your one example reminds me of Serment, which has a CPU bottleneck issue that causes the sound to glitch. Too much abstraction, which is something I've noticed with a lot of the games I've been looking at on DLSite. They're simple games, without much in the way of graphics, nor do they appear to require much processing, but they use some god-awful game-making framework similar to Unity (which also always runs slowly for me). To look at them you'd think they were from 2005, but even on newer computers they can be real slogs. Wirth's law demonstrated before your very eyes.

EDIT:

And for some reason, games on DLSite using "Live2D" all warn that they're CPU intensive. Looking up what "Live2D" actually is, it would appear that whatever technique people are using to layer animated images is CPU heavy. I'm not too surprised, as you either have to push a lot of triangles or do transparency just to get the kinds of rounded edges these games are full of. And, would you look at that, your game has an example of this process right here: https://items.gog.com/saviors_of_sapphire_wings_/_stranger_of_sword_city_revisited/mp4/Battle_2.gif.mp4
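
To put a rough number on why CPU-side transparency adds up, here's a quick sketch of mine (not anything Live2D actually does, just the basic per-pixel blend) that software-composites a few 1080p RGBA layers with numpy and times it:

import time
import numpy as np

H, W, LAYERS = 1080, 1920, 8
rng = np.random.default_rng(0)
layers = rng.random((LAYERS, H, W, 4), dtype=np.float32)  # stand-in RGBA layers in [0, 1]

def composite(layers):
    # painter's algorithm: blend each layer over the running result ("over" operator)
    out = np.zeros((H, W, 3), dtype=np.float32)
    for layer in layers:
        alpha = layer[..., 3:4]
        out = layer[..., :3] * alpha + out * (1.0 - alpha)
    return out

start = time.perf_counter()
composite(layers)
elapsed = time.perf_counter() - start
print(f"{LAYERS} layers composited in {elapsed * 1000:.1f} ms "
      f"(~{1.0 / elapsed:.1f} fps if you did this every frame)")

Even before any mesh deformation, that per-pixel blend alone chews through a decent slice of a 60fps frame budget on a weak CPU.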

That said, it shouldn't be nearly as bad as things appear to be in some of the demos I've tried running on my own computer. Playing Wing Commander Privateer, I have a lot of things on screen at all times that would have to be displayed in a similar way (at a lower resolution, not that it should matter once it's upscaled), and it runs just fine (even under DOSBox emulation!). My guess is that there's a fancy library that makes this easier for developers and that could be optimized quite a bit, but the library's developers aren't doing that. The thing is, from what I can tell from this game, they aren't really helped much by the convenience of such software either: those animations are just bouncing critters, which hardly calls for mapping to models and such. But hey, I've seen people pull in crazy libraries for simple tasks before.

EDIT AGAIN: I should probably make it clear that, given how sporadically Live2D is used, this is just an example, not what I think is the main source of your problem. I see this kind of thing crop up in projects all the time. You get a little of this, that, and the other, each of which alone wouldn't be much, but combine a lot of these libraries that were optimized in isolation, not in an environment where the devs considered other things running (like how web browsers like to assume they're the only thing you ever run on a computer), and what do you think you'll get?
Post edited May 13, 2021 by kohlrak
avatar
dtgreene: Looking at CPU comparison sites, the 4700U is clearly the better CPU (ignoring things like price and availability), and yet the GPU number is higher on the 3500U. Could someone explain what's going on?
In Vega 6, 7 and 8, the number indicates the number of compute units in the chip.
The 4000u series chips are a fairly significant upgrade across the board, so even a 4500u with Vega 6 will easily outperform the 3500u's Vega 8.

The gaming performance wasn't what got me hooked on the 4700u though. That's just a welcome surprise.
I wanted a very quiet and battery efficient "ultra portable", preferably with a premium feel to it, for office, browsing and C++ development.
If I trusted Apple to support Rosetta for the long haul, the MacBook Air would have been my first choice. As it is, I wanted a real X86-64, so the HP Envy x360 13 was the best I could find.
HP's Spectre line is even more spiffy, but they are all Intel chips, and I wanted a Ryzen.
Post edited May 13, 2021 by brouer
avatar
kohlrak: (like how web browsers like to assume they're the only thing you ever run on a computer)
Speaking of which, I've noticed that chromium, at least on my Raspberry Pi, seems to assume that the local disk is fast, which in the Pi's case, it often isn't. In fact, in some cases, it is actually faster to re-download images instead of loading them from cache, particularly if running off a microSD card, so the browser is behaving sub-optimally in this case.

Switching to better storage did help (and when I was using a spinning hard drive, the disk accesses were very noticeable), and giving the browser a RAM disk for its storage (using firejail) seemed to provide even better performance, particularly for sites like youtube (I haven't tried Zoom yet).
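
For anyone who wants to check the disk-vs-network thing on their own Pi, the crude way is just to time both paths by hand; the path and URL below are placeholders, and you'd want to pick a file that isn't already sitting in the OS page cache so the disk read is honest:

import time
import urllib.request

CACHED_COPY = "/home/pi/.cache/example/image.jpg"  # placeholder path on the microSD card
URL = "https://example.com/image.jpg"              # placeholder URL for the same file

start = time.perf_counter()
with open(CACHED_COPY, "rb") as f:
    f.read()
disk_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
with urllib.request.urlopen(URL) as response:
    response.read()
network_ms = (time.perf_counter() - start) * 1000

print(f"disk: {disk_ms:.1f} ms, network: {network_ms:.1f} ms")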

HP's Spectre line is even more spiffy
I'd be careful about anything named after a security vulnerability.
Post edited May 13, 2021 by dtgreene
avatar
kohlrak: (like how web browsers like to assume they're the only thing you ever run on a computer)
avatar
dtgreene: Speaking of which, I've noticed that chromium, at least on my Raspberry Pi, seems to assume that the local disk is fast, which in the Pi's case, it often isn't. In fact, in some cases, it is actually faster to re-download images instead of loading them from cache, particularly if running off a microSD card, so the browser is behaving sub-optimally in this case.

Switching to better storage did help (and when I was using a spinning hard drive, the disk accesses were very noticeable), and giving the browser a RAM disk for its storage (using firejail) seemed to provide even better performance, particularly for sites like youtube (I haven't tried Zoom yet).
Yeah, it's been a huge problem with development for the past decade. Devs used to tell you the minimum hardware; in the 2000s they more or less targeted the average machine; and now they just straight up mandate updates and/or stealth-update, even on mobile devices, which are harder to upgrade. Something's really wrong with the industry right now, and it doesn't help that Moore's law has ended.
avatar
dtgreene: Speaking of which, I've noticed that chromium, at least on my Raspberry Pi, seems to assume that the local disk is fast, which in the Pi's case, it often isn't. In fact, in some cases, it is actually faster to re-download images instead of loading them from cache, particularly if running off a microSD card, so the browser is behaving sub-optimally in this case.

Switching to better storage did help (and when I was using a spinning hard drive, the disk accesses were very noticeable), and giving the browser a RAM disk for its storage (using firejail) seemed to provide even better performance, particularly for sites like youtube (I haven't tried Zoom yet).
avatar
kohlrak: Yeah, it's been a huge problem with development for the past decade. Devs used to tell you the minimum hardware; in the 2000s they more or less targeted the average machine; and now they just straight up mandate updates and/or stealth-update, even on mobile devices, which are harder to upgrade. Something's really wrong with the industry right now, and it doesn't help that Moore's law has ended.
It's also a problem with the world wide web:
* Sites requiring JavaScript when they have no good reason to
* Sites whose JavaScript pulls in lots of libraries, which in turn have to be downloaded by every visitor
* Other annoyances, like sites that auto-play video (youtube, I'm looking at you), or sites that only allow users a limited number of article views per month, without giving the user a choice about whether to use one after the link is clicked
* Of course, user-agent discrimination (I notice it a lot when using lynx; way too many sites give me 403 or other errors when they shouldn't)

Really, the web has become too complex, and way too resource heavy.

(I'm thinking that allowing client-side scripting in web browsers was a mistake.)
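
(On the user-agent point: it's easy to see for yourself by requesting the same page with a lynx-style User-Agent and a mainstream-browser one and comparing the status codes. A rough sketch; the URL is a placeholder and the lynx string is approximate:)

import urllib.error
import urllib.request

URL = "https://example.com/some-article"  # placeholder; substitute a site that blocks text browsers
AGENTS = {
    "lynx": "Lynx/2.8.9rel.1 libwww-FM/2.14",
    "firefox": "Mozilla/5.0 (X11; Linux x86_64; rv:88.0) Gecko/20100101 Firefox/88.0",
}

for name, user_agent in AGENTS.items():
    request = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(request) as response:
            print(f"{name}: HTTP {response.status}")
    except urllib.error.HTTPError as error:
        print(f"{name}: HTTP {error.code}")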
avatar
kohlrak: Yeah, it's been a huge problem with development for the past decade. Devs used to tell you the minimum hardware; in the 2000s they more or less targeted the average machine; and now they just straight up mandate updates and/or stealth-update, even on mobile devices, which are harder to upgrade. Something's really wrong with the industry right now, and it doesn't help that Moore's law has ended.
avatar
dtgreene: It's also a problem with the world wide web:
* Sites requiring JavaScript when they have no good reason to
* Sites whose JavaScript pulls in lots of libraries, which in turn have to be downloaded by every visitor
* Other annoyances, like sites that auto-play video (youtube, I'm looking at you), ...
* Of course, user-agent discrimination (I notice it a lot when using lynx; way too many sites give me 403 or other errors when they shouldn't)

Really, the web has become too complex, and way too resource heavy.

(I'm thinking that allowing client-side scripting in web browsers was a mistake.)
It's Wirth's law. I don't believe the mistake was those choices, because removing them would take power and freedom from developers and keep handing it to the corporations that are actually causing the problems right now (just look at how Android is slowly becoming iOS, with its constant new restrictions on projects like Termux). Instead, I think it's an ideological problem. I've been self-taught in assembly since about 2007, and since then I've been in discussions with people about the poor optimization of compilers (they've gotten better, but now people don't even flip the release switches anymore; they just end up using "obfuscators", which make the whole issue worse). The argument I've had with other coders goes something like this:
Me: "Customers are expensive, and a 5-year-old computer shouldn't be struggling to run your program with only a music player and an email client in the background."
Them: "Yeah, but software isn't developed like that. You see, it's about who can make the product the fastest, not who can make the product run the fastest. And the customer is always willing to upgrade, especially since RAM is cheap."
Me: "And what about when they stop being willing?"
Them: "That'll never happen, because people just know what they like and they'll buy it regardless. Plus, the changes you're proposing wouldn't make much of a difference."
That's the usual flow of the arguments I've had all these years. I even had one guy argue that, in the same amount of time I spent optimizing, he found an exe compressor. Of course, that only helps download size, not RAM, but that was the argument.
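
To make the "release switches" point concrete, here's a throwaway experiment (a rough sketch of mine, assuming gcc and Python 3 are on the PATH): build the same toy loop at -O0 and at -O2 and time the result.

import os
import subprocess
import tempfile
import time

C_SOURCE = """
#include <stdio.h>
int main(void) {
    volatile double acc = 0.0;  /* volatile, so -O2 can't just delete the loop */
    for (long i = 0; i < 200000000L; i++)
        acc += (double)i * 0.5;
    printf("%f\\n", acc);
    return 0;
}
"""

with tempfile.TemporaryDirectory() as workdir:
    source = os.path.join(workdir, "bench.c")
    with open(source, "w") as f:
        f.write(C_SOURCE)
    for flag in ("-O0", "-O2"):
        exe = os.path.join(workdir, "bench" + flag)
        subprocess.run(["gcc", flag, source, "-o", exe], check=True)
        start = time.perf_counter()
        subprocess.run([exe], check=True, stdout=subprocess.DEVNULL)
        print(flag, f"{time.perf_counter() - start:.2f} s")

Same source, one flag flipped, and the gap is usually big enough that you don't need a stopwatch to notice it.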

And the truth is, they're right about the bigger picture: the customer always upgrades, don't they? There was an argument that, at the rate we were going, by 2020 we wouldn't be able to make transistors any smaller, and we'd be in real trouble if that trend continued. So Moore's law slowed down to buy time, it seems, but notice that prices haven't slowed down with it, so now we know people will pay just about anything for anything.
or sites that only allow users a limited number of article views per month, without giving the user a choice about whether to use one after the link is clicked
This one I'm going to be more nuanced about, though. I think they have that right, but such sites should not be given the preferential treatment they have been getting. And from a similar angle, when I go to youtube and look up something controversial, I don't want some braindead narrative: I want to hear all the wild conspiracy theories so I can use logic and reason to discredit the truly wacky ones. I mean, hell, we're prioritizing biased, cut-down footage over uploads of original, unedited footage, which should be a huge red flag to anyone.

The bigger irony is that the people who have the least problem with it are usually anticapitalist, yet it's corporate media being propped up by a corporation. You'd think this is a separate topic or line of logic, but I think it goes to the heart of the issue: there's so much reason to dislike something these days, and it's so much easier to like something even when we should be fundamentally opposed to it. As such, we're OK with a narrative we agree with even if it's non-factual, clearly biased, etc., and similarly we're OK with massive amounts of bloat as long as we can cope.

This (the limited-articles sites) wouldn't be nearly as big of a problem if they weren't given preferential treatment. People would recognize them, not click on them, and they'd be pushed down the queue. I mean, hell, you don't even end up at icy-veins or anything when looking up Diablo 3 stuff; it's all corporate "opinion pieces." It's arguably not even limited to politics, but it's just easiest to point to politics 'cause it's so charged right now that it's obvious to at least half the people you talk to.

or sites that only allow users a limited number of article views per month, without giving the user a choice about whether to use one after the link is clicked
avatar
kohlrak: This one I'm going to be more nuanced about, though. I think they have that right, but such sites should not be given the preferential treatment they have been getting. And from a similar angle, when I go to youtube and look up something controversial, I don't want some braindead narrative: I want to hear all the wild conspiracy theories so I can use logic and reason to discredit the truly wacky ones. I mean, hell, we're prioritizing biased, cut-down footage over uploads of original, unedited footage, which should be a huge red flag to anyone.

The bigger irony is that the people who have the least problem with it are usually anticapitalist, yet it's corporate media being propped up by a corporation. You'd think this is a separate topic or line of logic, but I think it goes to the heart of the issue: there's so much reason to dislike something these days, and it's so much easier to like something even when we should be fundamentally opposed to it. As such, we're OK with a narrative we agree with even if it's non-factual, clearly biased, etc., and similarly we're OK with massive amounts of bloat as long as we can cope.

This (the limited-articles sites) wouldn't be nearly as big of a problem if they weren't given preferential treatment. People would recognize them, not click on them, and they'd be pushed down the queue. I mean, hell, you don't even end up at icy-veins or anything when looking up Diablo 3 stuff; it's all corporate "opinion pieces." It's arguably not even limited to politics, but it's just easiest to point to politics 'cause it's so charged right now that it's obvious to at least half the people you talk to.
The main problem I have with them is that simply clicking a link (and one that doesn't even use a POST request) results in a side effect, in this case the consumption of a resource (the reader's limited number of articles per month). A simple HTTP GET request should never have such side effects, and a link with such side effects should require user confirmation first.

I could compare this to having a link that, when clicked, will result in an order that costs the user money. (There's a good reason that posting a link to a place like amazon doesn't result in an automatic purchase, for example.)
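
The way I'd want it done, sketched with a made-up Flask endpoint (the routes and names here are hypothetical, just to show the shape of it): GET renders a free preview with no side effects, and only an explicit POST, i.e. a button the reader deliberately clicks, touches the monthly quota.

from flask import Flask, render_template_string

app = Flask(__name__)
free_views = {"reader@example.com": 5}  # hypothetical per-reader allowance

@app.get("/article/<int:article_id>")
def preview(article_id):
    # GET: side-effect free, so it's safe to follow from a search result or a shared link
    return render_template_string(
        "<h1>Article {{ id }} (preview)</h1>"
        "<form method='post'><button>Use one of my free views</button></form>",
        id=article_id,
    )

@app.post("/article/<int:article_id>")
def read_full(article_id):
    # POST: the only place the quota is actually decremented
    reader = "reader@example.com"  # stand-in for whatever login the site uses
    if free_views[reader] <= 0:
        return "Out of free articles this month", 402
    free_views[reader] -= 1
    return f"Full text of article {article_id} ({free_views[reader]} free views left)"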

avatar
dtgreene: It's also a problem with the world wide web:
* Sites requiring JavaScript when they have no good reason to
* Sites whose JavaScript pulls in lots of libraries, which in turn have to be downloaded by every visitor
* Other annoyances, like sites that auto-play video (youtube, I'm looking at you), ...
* Of course, user-agent discrimination (I notice it a lot when using lynx; way too many sites give me 403 or other errors when they shouldn't)

Really, the web has become too complex, and way too resource heavy.

(I'm thinking that allowing client-side scripting in web browsers was a mistake.)
avatar
kohlrak: It's Wirth's law. I don't believe the mistake was those choices, because removing them would take power and freedom from developers and keep handing it to the corporations that are actually causing the problems right now (just look at how Android is slowly becoming iOS, with its constant new restrictions on projects like Termux). Instead, I think it's an ideological problem. I've been self-taught in assembly since about 2007, and since then I've been in discussions with people about the poor optimization of compilers (they've gotten better, but now people don't even flip the release switches anymore; they just end up using "obfuscators", which make the whole issue worse). The argument I've had with other coders goes something like this:
If the code is running on the computer of someone other than the author, and it's run automatically, then the developers shouldn't have that much freedom. In particular, I think the web would be better if browsers didn't execute things like JavaScript by default, and would be required to ask the user first; this would keep JavaScript from being used in places it isn't necessary, and would reduce the number of security issues that would arise.

Actually, I think it would be reasonable for a typical user to have two web browsers, one of which is intentionally feature-limited (no JavaScript, no video/audio, and perhaps other dangerous or resource-intensive features absent), with the user switching to the other one only when needed. In particular, it would make sense to separate documents (and other sites that don't need client-side scripting) from web apps. It would also encourage web developers to design pages as documents instead of web apps, except when things actually need to be web apps.
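
The funny thing is how little the "documents" half of that split actually needs. This toy fetcher (a sketch, nothing more; the URL is a placeholder) already covers the no-JS, no-media case simply by refusing to execute or load anything beyond the markup:

import urllib.request
from html.parser import HTMLParser

class DocumentOnly(HTMLParser):
    # crude text-mode renderer: skip <script>/<style> contents, print everything else
    def __init__(self):
        super().__init__()
        self.skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if self.skip_depth == 0 and data.strip():
            print(data.strip())

URL = "https://example.com/"  # placeholder
with urllib.request.urlopen(URL) as response:
    page = response.read().decode("utf-8", errors="replace")
DocumentOnly().feed(page)

A real limited browser would obviously need links, forms and caching, but the point stands: for plain documents, none of the heavy machinery is required.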
Post edited May 15, 2021 by dtgreene
avatar
kohlrak: This one I'm going to be more nuanced about, though. I think they have that right, but such sites should not be given the preferential treatment they have been getting. And from a similar angle, when I go to youtube and look up something controversial, I don't want some braindead narrative: I want to hear all the wild conspiracy theories so I can use logic and reason to discredit the truly wacky ones. I mean, hell, we're prioritizing biased, cut-down footage over uploads of original, unedited footage, which should be a huge red flag to anyone.

The bigger irony is that the people who have the least problem with it are usually anticapitalist, yet it's corporate media being propped up by a corporation. You'd think this is a separate topic or line of logic, but I think it goes to the heart of the issue: there's so much reason to dislike something these days, and it's so much easier to like something even when we should be fundamentally opposed to it. As such, we're OK with a narrative we agree with even if it's non-factual, clearly biased, etc., and similarly we're OK with massive amounts of bloat as long as we can cope.

This (the limited-articles sites) wouldn't be nearly as big of a problem if they weren't given preferential treatment. People would recognize them, not click on them, and they'd be pushed down the queue. I mean, hell, you don't even end up at icy-veins or anything when looking up Diablo 3 stuff; it's all corporate "opinion pieces." It's arguably not even limited to politics, but it's just easiest to point to politics 'cause it's so charged right now that it's obvious to at least half the people you talk to.
avatar
dtgreene: The main problem I have with them is that simply clicking a link (and one that doesn't even use a POST request) results in a side effect, in this case the consumption of a resource (the reader's limited number of articles per month). A simple HTTP GET request should never have such side effects, and a link with such side effects should require user confirmation first.

I could compare this to having a link that, when clicked, will result in an order that costs the user money. (There's a good reason that posting a link to a place like amazon doesn't result in an automatic purchase, for example.)
In the same way, though, it's their choice how they distribute the "free articles" as well as which ones they pick. I think it should be up to the market to punish them, not sweeping changes that would also affect sites that don't engage in this behavior. My bigger gripe is that there's no way to downvote sites or otherwise punish them. Frankly, I don't trust the mainstream media on the left or the right, nor do I want to pay for "opinion pieces," so why am I being pushed toward them by priority? The algorithm is not representing my wishes, and the money involved is more or less creating a pseudo-monopoly. I've straight up started using other search engines at this point. When I am forced to use Google or a derivative, I generally take advantage of certain features to narrow my searches to exactly what I want, which isn't effective when I want a lot of different sources, 'cause I've pruned them all away trying to kill the mainstream trash.
avatar
kohlrak: It's Wirth's law. I don't believe the mistake was those choices, because removing them would take power and freedom from developers and keep handing it to the corporations that are actually causing the problems right now (just look at how Android is slowly becoming iOS, with its constant new restrictions on projects like Termux). Instead, I think it's an ideological problem. I've been self-taught in assembly since about 2007, and since then I've been in discussions with people about the poor optimization of compilers (they've gotten better, but now people don't even flip the release switches anymore; they just end up using "obfuscators", which make the whole issue worse). The argument I've had with other coders goes something like this:
If the code is running on the computer of someone other than the author, and it's run automatically, then the developers shouldn't have that much freedom. In particular, I think the web would be better if browsers didn't execute things like JavaScript by default, and would be required to ask the user first; this would keep JavaScript from being used in places it isn't necessary, and would reduce the number of security issues that would arise.
There goes cookie-law compliance. Also, you'd just get another notification, like the cookie banners, anyway. I do think something needs to be done here, though, as malicious JS is still a thing. We need more specific user control over what kinds of operations can be done. Moreover, we also need better education on the topic, but the population as a whole has the wrong mentality about these "magic boxes." Something like C should be a graduation requirement at this point. You aren't permitted to operate a gun if you don't know how to use it properly, so why are we permitted knives and computers when they can be just as dangerous (if not more so, in the case of a honeypot)?
Actually, I think it would be reasonable for a typical user to have two web browsers, one of which is intentionally feature-limited (no JavaScript, no video/audio, and perhaps other dangerous or resource-intensive features absent), with the user switching to the other one only when needed. In particular, it would make sense to separate documents (and other sites that don't need client-side scripting) from web apps. It would also encourage web developers to design pages as documents instead of web apps, except when things actually need to be web apps.
Well, in particular, the corporate industry wants these "reactive web pages," with things like notifications, adverts, etc., which pretty much all require JS. GOG, for example, would not function on strictly server-side scripting. I think we're also at the point, though, where we need something better than the current setup of using client-side scripting for basic features like notifications and updates to page content. JavaScript has long outgrown its original purpose of simple changes and reactions, to the point where people are using it to build applications that run outside of a web browser. We have too much cheap code from people who barely know how to code, and people like me can't get a job doing it 'cause I don't have some fancy paper that says I know what I'm doing, while the clowns doing this stuff do. The whole corporate setup of legal protection based on certifications and blame-shifting is an absolute cancer on all industries, but it's very obvious in the "computer technology" industry.

I don't believe the problem is the coders, but rather the system that leads to the prioritization of bad practices. Of course, it's not as simple as trying to target and destroy that system; we first have to accept who's responsible: us. No change is going to permanently fix the issue if we don't address the cause, the source that led to the problem. Right now there's a push to change programming languages to make them more "idiot-proof" for the devs who have certifications but no idea what they're doing. The situation is a mess, and we're putting band-aids on gaping wounds.