Forums » Off-Topic and Casual Chatter

Code popping GPUs

    • 68 posts
    October 6, 2021 11:20 AM PDT

    The elephant in the room atm in the MMO world is New World.  Check this video out, especially our devs, and then talk amongst yourselves.  Below...

     

    https://youtu.be/6A0sLVgJ7qU

    • 1921 posts
    October 6, 2021 1:11 PM PDT

    IMO:

    Yeah, been an issue for a few months.

    In my experience, the optimized benchmark for GPU/CPU load in first/third person 3D games is No Man's Sky 3.60+.  Took them a few years to get there, but wow.
    It uses (not an exaggeration) less than 5% of a 1050/1060/1070/1080/166x/RTX GPU under any conditions.  You can go and try it for yourself, it's honestly almost hard to believe, when you see it in practice.

    Similarly, you can look at games like Horizon Zero Dawn.  Basically uses no GPU whatsoever.  Both NMS and HZD maintain 60/75/144 FPS or whatever you'd care to set as an upper limit, regardless of low/med/high quality settings.
    These games were built, from scratch, not to rely on the GPU, yet they provide a visually stunning 3D environment with no hitching, tearing, low frame rate, display lag, input lag, or anything remotely like that.
    It can be done.  These games and many others like them are proof of that.

    When I start a new game now, I check how much GPU load is produced during the main menu, then how much is produced after the game initially loads the player into the starting environment or scene.  These values vary dramatically, and most new games are extremely poorly written.  Some games run the GPU at 100% just at the main menu, because they have no frame rate limiting of any kind.  No consideration whatsoever is given to the reality of power draw from a fully loaded GPU.
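    (If anyone wants to check their own card the same way, here's a rough sketch of one way to do it, assuming an NVIDIA card with nvidia-smi on the PATH; AMD folks would need a different tool, so treat it as illustrative only:)

        # poll GPU utilization, power draw, and temperature once a second
        # while a game runs; nvidia-smi ships with the standard NVIDIA driver
        # (single-GPU system assumed for simplicity)
        import subprocess, time

        QUERY = ["nvidia-smi",
                 "--query-gpu=utilization.gpu,power.draw,temperature.gpu",
                 "--format=csv,noheader,nounits"]

        while True:
            util, watts, temp = subprocess.check_output(QUERY, text=True).strip().split(", ")
            print(f"load {util:>3}%   power {watts} W   temp {temp} C")
            time.sleep(1)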
    Often, games developed with a very small team, and/or developed very rapidly, or with a team with very little prior experience, suffer from 100% GPU load and a game that is ... frankly very simplistic, from a 3D perspective, when compared with NMS & HZD.
    My suspicion, based on seeing what is possible with NMS and HZD, is that New World is not programmed properly.  That can mean a lot of things, but in this case, I suspect whatever underlying graphics library or engine they are using does not have reducing GPU load as a high priority goal.

    What's extremely telling is that you can often set frame limits like 30, 60, or higher FPS, and when you do, there is a direct and immediate corresponding change in GPU load. 
    Ok, so clearly, it's possible to reduce GPU load, yet.. it's not done, whether intentionally or out of ignorance.  So people run their cards at (hyperbole) 900 FPS and it's OMG so amazing, "WOW, I got (hyperbole) 900 FPS!", on their 144Hz freesync/g-sync monitors, and their cards draw north of 400W for 10 hours in a microATX case kept in a cupboard under their desk with one case fan, and then they're surprised when it literally melts? :|
    Maybe while New World sorts out their optimization, some customers should frame limit down to 30-60FPS, keep the card at under 50-75% load, and be patient.
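    To make the frame-limit point concrete, here's a toy main loop (not any particular engine; the names are made up for illustration) showing why a cap translates directly into lower GPU load:

        # toy game loop with a frame cap: when a frame finishes early, the loop
        # sleeps out the rest of the frame budget, so the GPU sits idle instead
        # of rendering extra frames nobody asked for.  remove the cap and the
        # loop (and the GPU) runs flat out, no matter how simple the scene is.
        import time

        TARGET_FPS = 60
        FRAME_BUDGET = 1.0 / TARGET_FPS

        def run(render_frame):              # render_frame() does one frame of work
            while True:
                start = time.perf_counter()
                render_frame()
                elapsed = time.perf_counter() - start
                if elapsed < FRAME_BUDGET:  # finished early: wait, GPU idles
                    time.sleep(FRAME_BUDGET - elapsed)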

    Short version?  Running any GPU at 100% for hours or days will cause problems, in my experience.  The rapid, dramatic, and constant thermal cycling alone reduces MTBF; that's been known in electronics of any kind, and especially in FPGAs, CMOS, ASICs, or any remotely similar ~VLSI tech, for many decades.  Likely, New World is simply highlighting a much larger systemic problem, and it's just exacerbated by the long play times most hardcore players put into an MMO.

    • 2752 posts
    October 6, 2021 4:16 PM PDT

    From what I can tell New World is just highlighting faulty GPU manufacturing more than it is responsible for any card deaths. EVGA came out and said as much about their cards regarding New World. 

    • 68 posts
    October 6, 2021 4:35 PM PDT
    The thing I was noticing was that Jay was finding bug after bug in Amazon's engine coding as the culprit. It doesn't surprise me that Amazon could not possibly care less if they fry peeps' GPUs with smurfed coding. I'm figuring, from what I've heard of Pantheon's engine and programming, coupled with what our devs have basically told us, that this wouldn't be a thing. But it'd still be nice to hear a dev's take on this. My understanding is that an RTX 3090 would be gross overkill, but again, I'd love one of our devs' perspective on it.
    • 223 posts
    October 6, 2021 8:57 PM PDT

    Interesting video. Clearly there is an issue with New World. However, I do feel for the devs; it's perhaps unfair to claim bad code as the cause. They've developed a proprietary engine, from my understanding, and that in itself would be a massive undertaking. The amount of testing involved, software and hardware, would be monumental. From what I've seen of NW, in many respects it doesn't seem ready for release.

     

    This may very well be a bullet dodged by Pantheon, as VR is using a licensed engine: Unity. The benefit in licensing engines such as Unity or Unreal is that this effort isn't (shouldn't be) the burden of the licensee. Fingers crossed :)

    • Moderator
    • 9115 posts
    October 7, 2021 1:30 AM PDT

    Moved to off-topic as it doesn't relate to Pantheon at all.

    • 422 posts
    October 7, 2021 4:57 AM PDT

    I love how some random hardware failure that happens in very specific targeted instances to a literal handful of GPUs is suddenly a huge issue every game dev needs to be aware of.

    As Iksar said, EVGA has stated it was just faulty hardware.

    • 68 posts
    October 7, 2021 5:22 AM PDT
    EVGA did say that. Then someone else demonstrated there was more to it. For one thing the GPU was set to 102% but hitting 117%, then set to 90% and still pegging over 100%. Or the GPU setting itself to 4k in the game menu when it wasn't an option on the monitor. This subject is all new info to me. I wish you weren't so dismissive.
    • 1921 posts
    October 7, 2021 7:47 AM PDT

    kellindil said:

    I love how some random hardware failure that happens in very specific targeted instances to a literal handful of GPUs is suddenly a huge issue every game dev needs to be aware of.

    ...

    IMO:

    If a 3D first/third-person game can be made where it uses 5% of a GPU, and the exact same game can be made where it uses 100% of a GPU, all other features/performance in-game being equal, which implementation is better?
    Maybe if more game devs were aware of it, more games would be made like NMS and HZD, where the GPU is a nice-to-have rather than consuming more power than the CPU by several times.  And yet, the game performance and visual appeal is still astonishingly awesome. :)

    • 2752 posts
    October 7, 2021 12:21 PM PDT

    dudimous said: EVGA did say that. Then someone else demonstrated there was more to it. For one thing the GPU was set to 102% but hitting 117%, then set to 90% and still pegging over 100%. Or the GPU setting itself to 4k in the game menu when it wasn't an option on the monitor. This subject is all new info to me. I wish you weren't so dismissive.

    It's all hardware in the end. The game doesn't communicate directly with the graphics card, and it shouldn't even have to worry about how the card is doing or whether it can handle the load. If it's a software issue, then the issue is in DirectX, the Nvidia driver, or the GPU BIOS. The worst thing that should happen if the NW developers screwed up is the game crashing or having bad FPS.

    It is a problem that New World is exposing whatever the underlying fault is, for sure, and definitely something for some to be cautious about. But the responsibility for preventing the problem is on the GPU.

    • 68 posts
    October 7, 2021 12:24 PM PDT
    I thought this would be a topic where Pantheon would have a chance to shine because all I've heard is about how the devs have made so many decisions to make the engine more efficient and scalable. I'm surprised that a lot of the reactions I've gotten have been: "shut up"
    • 2053 posts
    October 7, 2021 12:37 PM PDT

    vjek said: IMO:

    If a 3D first/third-person game can be made where it uses 5% of a GPU, and the exact same game can be made where it uses 100% of a GPU, all other features/performance in-game being equal, which implementation is better?
    Maybe if more game devs were aware of it, more games would be made like NMS and HZD, where the GPU is a nice-to-have rather than consuming more power than the CPU by several times.  And yet, the game performance and visual appeal is still astonishingly awesome. :)

    To be clear, I have no experience of NMS or HZD at all.

    If 'ordinary' mainstream MMOs of various genres can be made to function normally while only using 5% of a good GPU's ability, then the only logical conclusions I can draw from that are:

    1. The games you mention - NMS & HZD - have some extremely unique properties to their gameplay that remove the need for heavy-duty graphics processing. OR

    2. The makers of those games have made a discovery that SHOULD soon be making headlines on news & fan sites across the Gaming World. OR

    3. The Graphics card industry and the Game Development industry have been in collusion for years now to keep consumers buying $1000 cards when $100 cards - or even onboard graphics - would be quite sufficient.

    I'd like to believe #2, but I'm not holding my breath. I don't believe #3 for a second. That pretty much leaves #1 as the only sensible explanation to me.

    Can you offer a fourth choice?

     


    This post was edited by Jothany at October 7, 2021 12:40 PM PDT
    • 1921 posts
    October 7, 2021 1:54 PM PDT

    IMO:

    Some programmers are better than others, is my best estimate so far.

    Those two games aren't unique.  They're just two very popular examples of this technique in action.  I've seen many others, over the past few years.

    I haven't decompiled those two games to check, but I suspect they're simply doing all the heavy lifting with the CPU now, and only using the GPU as an output device in the simplest possible way.  Yet, even in those games, CPU load is minor.  Certainly nowhere near 6-8 cores at 100% @ 3-4+ GHz.  More like 2-3 cores at 20-30% @ 2-3 GHz.

    From what I've seen, many modern console ports end up like this (using CPU instead of GPU) yet still sustain all 3D/Video/Graphics features, full 60/75/144 FPS, all post processing, full shadows, full draw distance, full everything, because they're doing it all with the CPU instead.  Several console ports I've played over the past few years have a performance profile similar to HZD's when played on a PC.

    It may simply be that 10th and 11th gen CPUs are so much better that some developers have found doing it all there ensures 100% compatibility, and no GPU-related platform/driver/heat issues.
    I know personally, using the Panda3D library with Python, provided I don't use compiled GPU shaders or GPU-specific libraries/functions, it's all done on the CPU, and the results are similar: something like 10-13% of a GT 1030 to render a 128x128x9 tile 3D scene at very high/detailed smoothing/shading/scale/MinLevel/BlockSize, etc., while sustaining 60 FPS trivially.
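    Not my actual project, but a bare-bones sketch of the kind of Panda3D setup I mean (the heightmap filename is just a placeholder; Panda3D wants a greyscale image sized (2^n)+1 on each side, e.g. 257x257):

        # cap the frame rate via Panda3D's config system before the engine starts,
        # then build CPU-generated terrain with GeoMipTerrain (no custom GPU shaders)
        from panda3d.core import loadPrcFileData, GeoMipTerrain
        loadPrcFileData("", "clock-mode limited\nclock-frame-rate 60")

        from direct.showbase.ShowBase import ShowBase

        class TerrainDemo(ShowBase):
            def __init__(self):
                ShowBase.__init__(self)
                self.terrain = GeoMipTerrain("demo")
                self.terrain.setHeightfield("heightmap.png")  # placeholder heightmap
                self.terrain.setBlockSize(32)    # size of each LOD chunk
                self.terrain.setMinLevel(2)      # coarsest allowed detail level
                self.terrain.setNear(40)
                self.terrain.setFar(200)
                self.terrain.setFocalPoint(self.camera)
                root = self.terrain.getRoot()
                root.reparentTo(self.render)
                root.setSz(60)                   # vertical scale
                self.terrain.generate()
                self.taskMgr.add(self.update, "updateTerrain")

            def update(self, task):
                self.terrain.update()            # re-tessellate around the camera
                return task.cont

        TerrainDemo().run()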


    This post was edited by vjek at October 7, 2021 1:55 PM PDT
    • 2053 posts
    October 7, 2021 3:58 PM PDT

    Well I hope it's all real. I'm a firm believer in 'There ain't no such thing as a free lunch'.

    Your description of the situation still sounds "too good to be true" to me.

    • 1921 posts
    October 7, 2021 5:33 PM PDT

    IMO:

    You (and anyone) should really try either game to see it for yourself. :) Don't take my word for it.
    NMS in particular is just.. jaw dropping in optimized efficiency in any version after 3.60.
    Similarly, Panda3D and Python are both free, and you can try that for yourself for free, on any platform.

    • 2053 posts
    October 10, 2021 12:36 PM PDT

    vjek said: Similarly, Panda3D and Python are both free, and you can try that for yourself for free, on any platform.

    LoL. My coding skills start at Basic, flourish in Fortran, and fade away in Algol. I sort of recognize Panda3D as some sort of programming language. But Python..... that's a whole 'nother Circus. (wink wink nudge nudge)

    :D

    • 68 posts
    October 11, 2021 5:32 AM PDT
    PS: it's not just EVGA. Atm Gigabyte is having the most deaths. Again... it's coding exposing flaws in hardware by being buggy and causing cards to exceed spikes they were never meant for. Bad coding.