DirectX 9 CPU Benchmark Thread

Discussion in 'General Discussion' started by Thomas Jansen, Jul 22, 2019.

  1. Balrog

    Balrog Well-Known Member

    Joined:
    Apr 10, 2015
    Ratings:
    +466 / 0 / -0
    Still only rumours, but:

    "Greymon55 continued, saying that production of Zen 4 processors with 3D-stacked cache will not begin until production of Zen 3D chips has stopped so that the line can be changed over to the new chips. What this means in practice, confirmed by a follow-up reply, is that AMD's first wave of Zen 4-based processors likely won't include any chips with 3D V-cache. Instead, those parts will probably come next year."

    Source: https://hothardware.com/news/zen-4-cpus-will-also-get-3d-v-cache-next-year
     
• Like x 1
  2. nolive721

    nolive721 Well-Known Member

    Joined:
    Dec 2, 2018
    Ratings:
    +74 / 0 / -0
  3. Giovanni Giorgio

    Giovanni Giorgio Member

    Joined:
    Feb 28, 2022
    Ratings:
    +6 / 0 / -0
It really sounds too good to be true, but the performance gains are real. If I hadn't owned and tested the 5800X before upgrading, I would also have a hard time believing it could make such a difference.
     
  4. Pape78

    Pape78 New Member

    Joined:
    Jun 26, 2021
    Ratings:
    +4 / 0 / -0
That sounds really interesting. I'm thinking of switching from my 5600X.
I use the HP Reverb G2, and even there I hit the CPU limit; my GPU load doesn't get above 75%.

Could you check the GPU load on your end? You can see it with the tool fpsVR.
     
  5. Giovanni Giorgio

    Giovanni Giorgio Member

    Joined:
    Feb 28, 2022
    Ratings:
    +6 / 0 / -0
The Oculus Rift has its own performance monitor HUD, and I've used that to back my claims up, so fpsVR isn't needed. I didn't watch the GPU load in a separate app (I could use GPU-Z for that), but I'm pretty sure the GPU was fully utilized. I'll check the GPU load later today and let you know.
     
  6. Maarten

    Maarten Member

    Joined:
    Apr 15, 2019
    Ratings:
    +23 / 0 / -0
The CV1 is a completely different story to the G2; they're not comparable. The CV1 is low resolution with better software, including the OpenComposite option, while the G2 is high resolution with crappy WMR software.
If you plan on doing ranked, you'd better test with time progression on, because that's on by default on those servers.
     
  7. Giovanni Giorgio

    Giovanni Giorgio Member

    Joined:
    Feb 28, 2022
    Ratings:
    +6 / 0 / -0
I wasn't in the mood for recording this in VR, so I did it in pancake mode, but at 4K60 with VSync off to roughly simulate the VR load. It's not the same thing, I know, but at least it's something. The GPU used here is an Nvidia GTX 1080 Ti; the CPU is the Ryzen 7 5800X3D.
The RivaTuner OSD statistics show CPU and GPU usage, and the framerate, obviously.

The video hasn't finished processing yet, so you'll have to wait a little.

    Raceroom CPU Stress test Ryzen 5800X3D at Nords with 99 AI cars 52 visible cars 4K60FPS Drunk Mode lol
     
• Like x 1
  8. Pape78

    Pape78 New Member

    Joined:
    Jun 26, 2021
    Ratings:
    +4 / 0 / -0
I just tested it too and got about 55 fps at the start, with 99 opponents and 52 visible on the Nordschleife. You were in the GPU limit the whole time, too, so the framerate would probably have been higher otherwise. In any case, that sounds tempting, since I'm at the CPU limit even in VR.
     
• Like x 1
  9. Giovanni Giorgio

    Giovanni Giorgio Member

    Joined:
    Feb 28, 2022
    Ratings:
    +6 / 0 / -0
Yes, with a better GPU the frames would be higher with the 5800X3D, though not in this particular run. As you've noticed, I'm GPU-bottlenecked in this specific scenario. You, on the other hand, are CPU-bottlenecked.

Here's the same race but at 1080p to eliminate the GPU bottleneck. Instead of uploading a video (I don't think it's necessary at this point), I took some screenshots to show the limit of the 5800X3D in that exotic race with 100 cars at the Nords. It seems changing the GPU won't gain that much, except in VR. The game is heavily CPU-bottlenecked due to being DX9 and mostly single-core, like StarCraft.

[Screenshots: 1080p run showing the 5800X3D's CPU limit]
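If anyone wants to check this on their own logs instead of eyeballing the OSD, the logic is simple: if the GPU sits near 100% while the framerate is limited, you're GPU-bound; if it idles well below that, the CPU is the wall. A minimal sketch in Python, assuming a CSV export with hypothetical "gpu_usage" and "fps" columns (RivaTuner, fpsVR and GPU-Z can all log to file, each with their own column names):

```python
# Minimal sketch: classify a logged run as CPU- or GPU-bound.
# Column names "gpu_usage" and "fps" are assumptions; adjust them
# to whatever your monitoring tool actually writes.
import csv

def classify_bottleneck(path, gpu_bound_threshold=95.0):
    gpu, fps = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            gpu.append(float(row["gpu_usage"]))
            fps.append(float(row["fps"]))
    avg_gpu = sum(gpu) / len(gpu)
    avg_fps = sum(fps) / len(fps)
    # GPU pegged near 100% -> GPU is the limiter; GPU idling while
    # the framerate stays flat -> the CPU (or the DX9 engine) is.
    verdict = "GPU-bound" if avg_gpu >= gpu_bound_threshold else "CPU-bound"
    print(f"avg GPU {avg_gpu:.0f}%, avg fps {avg_fps:.0f} -> {verdict}")

classify_bottleneck("monitor_log.csv")
```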
     
    Last edited: May 4, 2022
  10. Giovanni Giorgio

    Giovanni Giorgio Member

    Joined:
    Feb 28, 2022
    Ratings:
    +6 / 0 / -0
I think this is enough to show the limit of the 5800X3D with the video and the screenshots. I hope it helps people decide whether or not to upgrade their CPU to this one. For me it was well worth it.
     
• Like x 2
  11. Maarten

    Maarten Member

    Joined:
    Apr 15, 2019
    Ratings:
    +23 / 0 / -0
    12900K with E-cores disabled, XMP, no overclock (yet)

    heaven.png

    cpu.png

    memory.png
     
• Like x 2
• Wonderful x 1
  12. Vale

    Vale Well-Known Member

    Joined:
    Jul 4, 2019
    Ratings:
    +278 / 0 / -0
Grats on the new record! I thought 12th gen would do very well, but I'm not sure why you're on DDR4 when it supports DDR5, or why the E-cores are disabled.
     
  13. Maarten

    Maarten Member

    Joined:
    Apr 15, 2019
    Ratings:
    +23 / 0 / -0
DDR5 is not by definition faster than DDR4. I already have very fast DDR4, so I used that.
Windows 10 doesn't handle the E-cores very well, which causes lag spikes in VR. Maybe Windows 11 does a better job, but I haven't tested that yet.
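If anyone would rather keep the E-cores enabled for the rest of the system, an alternative workaround (just a sketch, not tested with RaceRoom here) is to pin only the game to the P-cores so the Windows 10 scheduler can't bounce it onto the E-cores. On a 12900K, logical processors 0-15 are the eight hyperthreaded P-cores. With Python's psutil:

```python
# Sketch: pin the game process to the P-cores of a 12900K.
# Logical CPUs 0-15 are the eight hyperthreaded P-cores; 16-23 are
# the E-cores. The executable name is an assumption; check yours.
import psutil

P_CORES = list(range(16))

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "RRRE64.exe":  # assumed game executable
        proc.cpu_affinity(P_CORES)
        print(f"pinned PID {proc.pid} to P-cores")
```

The same idea works without Python via `start /affinity FFFF RRRE64.exe` from a command prompt (FFFF is a hex mask for the first 16 logical processors).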
     
  14. Vale

    Vale Well-Known Member

    Joined:
    Jul 4, 2019
    Ratings:
    +278 / 0 / -0
Good to know. I do believe DDR5 is much quicker, but there's only one expensive way to find out!

As for W11, the annual update will be out in a couple of months, so it's probably best to wait until then to change, as a lot of the UI stuff they dropped from W10 will hopefully be coming back.
     
    Last edited: May 13, 2022
  15. Balrog

    Balrog Well-Known Member

    Joined:
    Apr 10, 2015
    Ratings:
    +466 / 0 / -0
It hasn't been the case so far: module latencies have increased in line with the frequencies, so there's no significant performance difference between the two. But manufacturers are coming up with better and better latency-frequency ratios, so we'll see.
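To put rough numbers on that: first-word latency in nanoseconds is about 2000 × CL / transfer rate in MT/s. A quick sketch with illustrative kit specs (the kits are just examples, not a market survey):

```python
# First-word latency (ns) ~= 2000 * CAS_latency / transfer_rate (MT/s).
# Kit specs below are illustrative examples only.
kits = {
    "DDR4-3200 CL16": (3200, 16),
    "DDR4-3600 CL14": (3600, 14),
    "DDR5-5200 CL40": (5200, 40),
    "DDR5-6000 CL30": (6000, 30),
}
for name, (mts, cl) in kits.items():
    print(f"{name}: {2000 * cl / mts:.1f} ns")
# -> 10.0, 7.8, 15.4 and 10.0 ns: fast DDR4 still beats early DDR5
#    on absolute latency, and DDR5 only catches up as the ratios improve.
```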
     
  16. Yuval Rosen

    Yuval Rosen Well-Known Member

    Joined:
    Jun 18, 2021
    Ratings:
    +49 / 0 / -0
    Specs:
    - GPU: Nvidia GeForce RTX 3070 Founders Edition
    - CPU: Intel Core i5-12600k
- RAM: Kingston Fury 2x8GB 5200MHz DDR5
    - Motherboard: MSI Pro Z690-a

    upload_2022-8-30_19-10-22.png
     
  17. Yuval Rosen

    Yuval Rosen Well-Known Member

    Joined:
    Jun 18, 2021
    Ratings:
    +49 / 0 / -0
When overclocked with Intel XTU's Speed Optimizer from 4.5 GHz all-core to 4.7 GHz, it managed to pull off this:
    upload_2022-9-1_1-6-18.png
     
• Like x 1
  18. Sascha Reynders

    Sascha Reynders Well-Known Member

    Joined:
    Feb 18, 2016
    Ratings:
    +109 / 0 / -0
    My latest build: 12900K running stock with OCTVB +2 profile, both P and E cores enabled, Asus Strix Z690-A Gaming WiFi D4 (DDR4) motherboard. Memory is 4x8GB G.Skill Trident Z 4000CL15-16-16-36 but will only run in Gear 2 at those speeds/timings. At 3600 it runs stable in Gear 1 with tightened timings.

    R3E 12900K CPU Benchmark.png

    • CPU: Intel i9 12900K
    • CPU OC: stock with OCTVB +2 profile enabled
    • Memory: 32GB DDR4-3600
    • Memory timings: CL14-14-14-34
    • GPU: Nvidia GeForce RTX 3080ti
    • OS: Windows 11
     
• Like x 3
  19. hugotwowheels

    hugotwowheels Member

    Joined:
    Jun 10, 2022
    Ratings:
    +11 / 0 / -0
Have you done anything else besides changing the CPU? I'm on a Ryzen 5 3600 / RX 6600 at 4K but can't even come close to this kind of performance. DXVK helps a lot, but shadows are bugged with it.

Edit: Answering my own question: no, a 5800X3D simply is that superior.
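For anyone else trying DXVK with this game: the usual setup (a sketch of the standard drill, not an official recommendation) is to drop DXVK's d3d9.dll next to the game's executable, optionally with a dxvk.conf in the same folder, e.g.:

```
# dxvk.conf, placed next to the game's .exe alongside DXVK's d3d9.dll.
# Option names are from DXVK's documented dxvk.conf; values are examples.
dxvk.hud = fps,gpuload       # on-screen fps and GPU load readout
d3d9.maxFrameLatency = 1     # fewer queued frames, can help input latency
```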
     
    Last edited: Sep 21, 2022
  20. Spidybite

    Spidybite New Member

    Joined:
    Jan 28, 2022
    Ratings:
    +3 / 0 / -0
Silly question...
I guess anything over 60 FPS is good, but does anything actually run at 640x360? (Presumably it's just being used as a standard for benchmarking purposes only?)