I tried gaming on a cheap dual Xeon system

Key findings

  • For high resolution gaming, a budget Xeon configuration can make sense, as the GPU matters far more than the CPU at 4K.
  • Server CPUs can drag down performance in esports titles, which prioritize single-core performance and high refresh rates.
  • While not ideal for high refresh rate gaming, Xeon systems paired with powerful GPUs can deliver playable FPS at high resolutions.



There's no denying that server hardware offers top-notch performance in virtualization-heavy tasks. However, when it comes to gaming, multi-core Xeon and Epyc CPUs often fall short. To put this in perspective, even the most expensive server processor will easily lose to a low-cost mainstream chip in gaming workloads.

But with old Xeon chips available at bargain prices, it's pretty tempting to pick one up. Having recently purchased a dual Xeon system for my home lab, I decided to test it out with some of the most graphically demanding games in my library. If you're curious, here's a log of all my observations.


Related

I tried running some macOS apps on Proxmox. This is how it went

As long as you have a fast processor, enough RAM and plenty of patience, you can run a surprising number of apps on your Proxmox Hackintosh

The test bench

Before I get to the pictures and benchmark tables, I want to go over the specs of my test bench. Since we're talking budget processors, I did all my testing on a cheap X99 motherboard equipped with two Intel Xeon E5-2650 v4 processors, each running at stock settings. For memory, I used two 32GB ECC registered DDR4-2400 sticks in dual-channel mode, though in hindsight I should have gone with a quad-channel configuration for the best performance. For those living in the US, it should be possible to put together a similar system for well under $250.
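
For the curious, the quad-channel regret comes down to simple arithmetic; here's a quick sketch of the theoretical peak bandwidth (not measured figures):

```python
# Theoretical peak DDR4 bandwidth: each channel is 64 bits (8 bytes) wide,
# and DDR4-2400 performs 2400 mega-transfers per second.
transfers_per_second = 2400 * 10**6
bytes_per_transfer = 8

per_channel = transfers_per_second * bytes_per_transfer / 10**9  # GB/s
print(f"Per channel:  {per_channel:.1f} GB/s")       # 19.2 GB/s
print(f"Dual channel: {2 * per_channel:.1f} GB/s")   # 38.4 GB/s
print(f"Quad channel: {4 * per_channel:.1f} GB/s")   # 76.8 GB/s
```

In other words, my dual-channel setup left half the platform's memory bandwidth on the table.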


With that out of the way, it's time to address the obvious: the GPU. For the first round of testing, I used my old GTX 1080, which still delivers solid frame rates at 1080p and can even hold its own at 1440p. Since the project would feel incomplete with only an eight-year-old GPU, I brought in my RTX 3080 Ti for the second wave of testing. While it's no RTX 4090, it should serve as a good stand-in for a high-end current-gen graphics card. I was also tempted to give my Intel Arc A750 a shot, but decided against it since my X99 motherboard doesn't support Resizable BAR, and without that setting, the Alchemist family's performance is massively compromised. As for the PSU, I used my 1000W Corsair RM1000e to give the test bench all the power it could need.
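
If you want to check your own board for Resizable BAR support, here's a rough sketch of one way to do it on Linux by grepping lspci output (on Windows, tools like GPU-Z report the same thing):

```python
import subprocess

# Rough check (Linux): lspci -vv lists a "Resizable BAR" capability on
# devices that expose it. Run as root, since the capabilities section
# may be hidden from regular users.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
if "Resizable BAR" in out:
    print("A device on this system advertises Resizable BAR")
else:
    print("No Resizable BAR capability found")
```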


Related

Which GPU should you buy in each $100 price range?

There is something for everyone here, whether you are a budget user or a gaming enthusiast

As for games, I decided to test five of the most graphically demanding titles in my library: Armored Core VI: Fires of Rubicon, Baldur's Gate 3, Cyberpunk 2077, Elden Ring, and Red Dead Redemption 2. I used the highest possible settings in almost every game and tested each title twice: once at 1080p and again at 4K.

Red Dead Redemption 2 kept crashing when I set Volumetric Fog, Screen Space Reflections, and other demanding effects to Ultra on my GTX 1080, so I had to compromise and turn them down to Medium/High.

To avoid skewing the results, I left DLSS and ray tracing off on both GPUs. Finally, instead of running the built-in benchmarking tools, I played each game for a few minutes while keeping an eye on GPU usage, system temperatures, and FPS.
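
In case you want to replicate that logging at home, here's a minimal sketch of the approach, assuming a single NVIDIA GPU with the nvidia-smi utility on the PATH (the FPS itself came from an in-game overlay):

```python
import subprocess
import time

# Poll nvidia-smi once a second and print GPU utilization and temperature.
# Assumes one NVIDIA GPU; with multiple GPUs, each sample spans several lines.
def poll_gpu(samples=60, interval=1.0):
    for _ in range(samples):
        out = subprocess.check_output(
            [
                "nvidia-smi",
                "--query-gpu=utilization.gpu,temperature.gpu",
                "--format=csv,noheader,nounits",
            ],
            text=True,
        ).strip()
        util, temp = (field.strip() for field in out.split(","))
        print(f"GPU load: {util}%, temperature: {temp}°C")
        time.sleep(interval)

if __name__ == "__main__":
    poll_gpu()
```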


The GTX 1080 held everything back at 4K

And the Xeon processors were the bottleneck at 1080p

Running Red Dead Redemption 2 on a Uperfect UGame K118

During the first round of testing with the GTX 1080, the frame rates were more or less what you would expect from a server PC with an old GPU. At 4K, the games could barely be called playable, and that was because the system was being slowed down by the GTX 1080. Sure, I might have gotten slightly better frame rates on a mainstream processor, but a 4-5 FPS difference doesn't mean much when you're playing at less than 20 FPS (if you can even call it that).


GTX 1080 + 2x Xeon E5-2650 v4

  • Red Dead Redemption 2 (4K, custom settings): 25 FPS
  • Red Dead Redemption 2 (1080p, custom settings): 38 FPS
  • Cyberpunk 2077 (4K, Ultra settings): 17 FPS
  • Cyberpunk 2077 (1080p, Ultra settings): 53 FPS

However, the situation was completely different after I turned the resolution down to 1080p. In every title except Red Dead Redemption 2 and Cyberpunk 2077, GPU utilization dropped well below 100%, meaning that the 24-core, 48-thread pair of Xeons was now the bottleneck, not the GPU.
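
That bottleneck call isn't magic, by the way; it boils down to a crude heuristic like this one (the 95% threshold is my own rule of thumb, not an official metric):

```python
# Crude bottleneck heuristic: if GPU utilization stays pinned near 100%,
# the GPU is the limit; if it sits well below that, the CPU likely is.
def classify_bottleneck(gpu_util_samples, threshold=95):
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return "GPU-bound" if avg >= threshold else "likely CPU-bound"

print(classify_bottleneck([98, 100, 99, 97]))  # "GPU-bound" (4K behavior)
print(classify_bottleneck([71, 65, 74, 68]))   # "likely CPU-bound" (1080p)
```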

GTX 1080 + 2x Xeon E5-2650 v4

  • Armored Core VI: Fires of Rubicon (4K, Ultra settings): 34 FPS
  • Armored Core VI: Fires of Rubicon (1080p, Ultra settings): 52 FPS
  • Baldur's Gate 3 (4K, Ultra settings): 24 FPS
  • Baldur's Gate 3 (1080p, Ultra settings): 57 FPS
  • Elden Ring (4K, Ultra settings): 33 FPS
  • Elden Ring (1080p, Ultra settings): 46 FPS


In Baldur's Gate 3, Armored Core VI: Fires of Rubicon, and Elden Ring, performance was clearly held back by the server CPUs. I don't want to complicate things with more numbers, but let's just say that Elden Ring consistently achieved a solid 60 FPS at 1080p when I paired my GTX 1080 with a Ryzen 5 1600, while Baldur's Gate 3 and Armored Core VI typically exceeded the 60 FPS threshold on that setup, with frame rates in the 80 FPS range depending on the region.

With the RTX 3080 Ti, the system was almost entirely CPU-bound

And in almost every game, the FPS at 4K and 1080p were pretty similar


After completing the benchmarks on the GTX 1080, I swapped in my RTX 3080 Ti. In Armored Core VI and Elden Ring, GPU utilization barely reached the 70% mark, meaning the RTX 3080 Ti was being held back by the processors.

RTX 3080 Ti + 2x Xeon E5-2650 v4

  • Armored Core VI: Fires of Rubicon (4K, Ultra settings): 54 FPS
  • Armored Core VI: Fires of Rubicon (1080p, Ultra settings): 65 FPS
  • Elden Ring (4K, Ultra settings): 37 FPS
  • Elden Ring (1080p, Ultra settings): 42 FPS


I usually play both titles at these resolutions (with Ultra settings) on my primary machine, which packs a Ryzen 5 5600X, and as you can imagine, frame rates there always stay above the 60 FPS mark. Lowering the resolution also didn't increase the frame rate by much, which makes it clear that these server CPUs aren't suited to high refresh rate gaming.

RTX 3080 Ti + 2x Xeon E5-2650 v4

  • Cyberpunk 2077 (4K, Ultra settings): 44 FPS
  • Cyberpunk 2077 (1080p, Ultra settings): 63 FPS

Running Baldur's Gate 3 on the Uperfect UGame K118 portable monitor


Meanwhile, in Cyberpunk 2077 at 4K Ultra settings, the last-gen champion was definitely challenged, as GPU usage remained pinned at 100%. Switching to 1080p made a huge difference in frame rates, though it's worth noting that my Ryzen 5 5600X easily hits over 90 FPS with the same GPU.

RTX 3080 Ti + 2x Xeon E5-2650 v4

  • Baldur's Gate 3 (4K, Ultra settings): 78 FPS
  • Baldur's Gate 3 (1080p, Ultra settings): 84 FPS
  • Red Dead Redemption 2 (4K, custom settings): 68 FPS
  • Red Dead Redemption 2 (1080p, custom settings): 75 FPS


Now for the more interesting results: Red Dead Redemption 2 and Baldur's Gate 3 ran at surprisingly high frame rates, even at 4K. One could argue that I ran RDR2 with slightly toned-down settings, but even then, the FPS surprised me by landing pretty close to what I usually get on my main machine. It all made sense once I opened Task Manager and checked the load on each core: both games were distributing their work efficiently across all cores, resulting in better performance than the other titles I tested.
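
If you'd rather not eyeball 48 Task Manager graphs yourself, a few lines of Python with the third-party psutil package surface the same information:

```python
import psutil  # third-party: pip install psutil

# Sample per-core load over ten seconds while the game runs, then check
# how evenly the work is spread across all the logical cores.
loads = psutil.cpu_percent(interval=10, percpu=True)
average = sum(loads) / len(loads)
print(f"{len(loads)} logical cores, average load {average:.0f}%")
print(f"busiest core: {max(loads):.0f}%, idlest core: {min(loads):.0f}%")
```

A small spread between the busiest and idlest cores is exactly the pattern I saw in RDR2 and Baldur's Gate 3.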

Should you buy Xeon systems for gaming?

A server-grade Intel Xeon E5-2650 v4 processor


The answer is clearly a bit more complicated than just shouting “no” and calling it a day. If you plan on playing esports titles, you should avoid server CPUs and invest in a high-end mainstream processor instead. Those who prefer high refresh rates over sharp resolutions will be sorely disappointed with the performance of server processors.

On the other hand, if you game at high resolutions where the GPU is more important than the processor, a cheap Xeon setup might be just what you need. As you've already seen, there are some games that can use multiple cores to make up for the lack of single-core performance. In summary, while overall performance varies quite a bit depending on the title, you can expect playable FPS at high resolutions if you pair these CPUs with powerful graphics cards. Just keep an eye on the system's power consumption if you don't want to be surprised by an outrageously high electricity bill.


Related

7 things you should know before buying a server PC

Buying a server PC is different from buying gaming gear. Here are seven things to consider before getting one for your home lab.
