I guess all Cyberpunk does now is kill GPUs,
while Nvidia keeps bolting duct tape over performance.

I expected them to update PhysX or bring it into a new realm.. nope.
Post edited 3 days ago by XeonicDevil
Sad, since GPU PhysX is the reason why some old games from the late 2000s and early 2010s still have better physics than most modern games in the 2020s. I would rather have better physics than better graphics and ray tracing.

I also wonder if Nvidia will make PhysX slower on older GPUs with a driver update... I remember when they turned off Nvidia 3D Vision through a driver update for all GPUs in 2019.
Maybe someone will come up with another DLL wrapper solution that translates 32-bit PhysX to 64-bit CUDA or something.
Randalator: Just run it on the CPU. That old ass PhysX stuff won't make any halfway decent CPU break a sweat...
https://youtu.be/mJGf0-tGaf4

If you assume things, you make a you-know-what out of u and me :P.

Don't confuse run-of-the-mill PhysX games, which were optimized for the CPU in the first place, with games that are intended to run PhysX through CUDA. The latter will perform absolutely abysmally if you shove them onto a CPU, because the entire workload will be single-threaded.
Post edited 3 days ago by WinterSnowfall
To be honest, I expected the tech to evolve 10 years ago... as per usual, we were essentially sold a tech that sold GPUs at the time.. they went that far, then dropped it entirely.

Nvidia has been getting by on minimum effort,
and we were too dumb to say... no.

They literally search for the code that runs the worst and call it Crysis.. to sell you 1% of performance... oh wait, that's not performance.. it's just a key.. much like making APIs that work fine outdated... we pay for some dumb guy's code to run.
Post edited 3 days ago by XeonicDevil
One of many good reasons I am against proprietary stuff: if they decide to "drop" it, there is almost nothing anyone can do to get the old stuff working properly again. Sure... a "translator"... an emulation may help, but there is no guarantee of that. Nvidia has surely benefited from the commercial side more than enough already... same with MS, Apple and many more companies... so I would not say they have not been sufficiently paid for their "awesome work"...
It's about time to become more social and try to benefit the customers even more.

Those creating emulations and all sorts of workarounds... they are getting paid close to nothing, but why should a huge company care as long as they get the "benefits"? In theory there is a huge demand for "alternate approaches", but it is always difficult to put them on fully legal footing... because those who truly hold power obviously do not want to "share too much of their pretty huge cookie"... and in most cases politics still backs up those "needs" of the more powerful ones. The world could be different, or at least offer way more "options", yet we are still far away from that.
Post edited 2 days ago by Xeshra
P. Zimerickus: You don't save your GPUs O.O ??????????

check this one out.....

https://www.youtube.com/watch?v=Hne3SoBoAuQ

They call it a dream build but...
Correct me if I'm wrong, but... a top-tier build without water cooling.... like, ridiculous, yeah?

I still can't understand why the market seems to be stuck on these preposterous offerings.... I know, it's still not a car worth 40k or more, but please.... industry
It has liquid cooling. The big black block with hoses going to the radiator...?
paladin181: It has liquid cooling. The big black block with hoses going to the radiator...?
It is absolutely amazing how much energy is generated in quantum-style computing. You put in 1 watt and your little mistress is able to compete against the combined computing power of all CPUs working in 1999, including those Wall Street bastards.
Randalator: Just run it on the CPU. That old ass PhysX stuff won't make any halfway decent CPU break a sweat...
https://youtu.be/mJGf0-tGaf4
Who cares about the 5000 series anyway? It is currently the worst Nvidia has ever made, due to several issues... and pricing is just one of them.


MSI RTX 5090 Suprim Liquid SOC crashing in Avatar, Assassin’s Creed Mirage, and Star Wars Outlaws

https://www.youtube.com/watch?v=XM0Nw6UiHEU

The 4090 goes for 3000+ now because it is still "the best" stable card.

AMD is done competing, because they realized... they could hardly do even worse than Nvidia, so they had to pass.

Although I bet those cards may work great for AI datacenter tasks... which is "working as intended".
Post edited Yesterday by Xeshra
Now you know how AMD GPU users have always felt!

This doesn't seem like the first (or last) time either. Didn't e.g. the Splinter Cell games use some proprietary Nvidia GeForce features which Nvidia dropped from later GPUs, making those games look like shit later on or something? Like the lighting in those games became much worse etc.?
timppu: Now you know how AMD GPU users have always felt!

This doesn't seem like the first (and last) time either. Didn't e.g. the Splinter Cell games use some proprietary NVidia Geforce features which NVidia dropped from later GPUs, making those games look like shit later on or something? Like the lightning in those games became much worse etc.?
I thought Pandora Tomorrow just didn't work anymore, which is why it's not sold anywhere. But it may be a different reason...
https://www.youtube.com/watch?v=_dUjUNrbHis

RTX 5080 PhysX Performance - Mirrors Edge & Borderlands 2 (32-bit PhysX Not Supported on 50-Series)
Get a laptop with a 4060 and save on your electric bill ;)

Prove me wrong!
Ooooo what are the chances AMD will copy this move with the 9070 XT...