In my last video I unboxed the brand new Nvidia GeForce GTX 1080Ti, the best new graphics card on the market, and now it's time to do something with it. In this video I'll install the 1080Ti into Project Sledgehammer and put it up against the pair of Titan Xs it replaces. Can the 1080Ti single-handedly outperform the Titan Xs in SLI? We'll answer that question and more.
While this is not the first time I've benchmarked a video card, this is the first time I've used a structured method and created an official review of my findings.
Despite what other reviewers do, I firmly believe the true purpose of a benchmark is to establish baseline performance. For that reason, in an official review I will never benchmark a graphics card that is overclocked beyond its factory settings. The point is to convey to my viewers the minimum performance they can expect, and if I'm able to overclock my GPUs higher than you can, then my videos do nothing to help you understand what to expect. For the same reason, I run official benchmarks on the stock cooler, since results from an aftermarket water cooling solution are of no help to those who do not water-cool their hardware. This does not mean I will never show overclocked and water-cooled results; those videos just won't be official reviews. To smooth out any oddities I may encounter while testing, I run each test 3 times at each resolution. I record the minimum and maximum frames per second and average those, and for the synthetic benchmarks I also average the scores. From the 1080Ti going forward, I also record the average FPS of each test, which may or may not vary wildly from the high / low average.
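To make the averaging method above concrete, here is a minimal sketch of how the high / low average works out. The FPS numbers are hypothetical placeholders, not actual results from my testing:

```python
# Sketch of the high / low averaging method: 3 runs per resolution,
# each run contributes the midpoint of its min and max FPS.
# These FPS values are made up for illustration only.

runs = [
    {"min": 41.0, "max": 88.0},  # run 1
    {"min": 43.5, "max": 90.5},  # run 2
    {"min": 42.1, "max": 89.3},  # run 3
]

# Midpoint of min and max FPS for each run.
per_run = [(r["min"] + r["max"]) / 2 for r in runs]

# The reported high / low average is the mean of those midpoints.
high_low_average = sum(per_run) / len(per_run)
print(round(high_low_average, 2))  # prints 65.73 for these sample numbers
```

The benchmark-reported average FPS I record going forward is a separate number, which is why it can differ noticeably from this high / low figure.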
The software I use will change from time to time as needed to remain relevant, but for now I am using the following to benchmark hardware: Ghost Recon Wildlands, Rise of the Tomb Raider, Grand Theft Auto V, Fire Strike and Fire Strike Ultra, Time Spy, Unigine Heaven 4.0, and Unigine Valley 1.0. All games are run at their Ultra presets, and the synthetic benchmarks at their default settings.
With that said, let's start by recapping the specs of Sledgehammer. Sledgehammer is built on an Asus Z10PE-D16 Workstation motherboard, with two 2.3GHz Intel Xeon E5-2696 v3 18-core processors with Hyper-Threading. These processors have the potential to Turbo up to 3.3GHz. Just for clarification, the 2696 is the OEM version of the 2699. It also has 256GB of error-correcting DDR4 RAM, boots from a 950 Pro SSD, and runs dual Maxwell Titan Xs. Nearly anything you would like to know about it is in the prior 5 videos of this series.
To begin, I removed one Titan X and ran my tests with a single graphics card. I then reinstalled the second Titan X and ran my benchmarks in SLI mode. After all of that, I removed both Titan Xs, installed the 1080Ti, and ran my tests again. The results were expected, but still quite impressive to see.
You will notice the Titan X results do not have the benchmark FPS average. I did not record those numbers for the Titan X, but going forward, all of my tests, including the 1080Ti, will have it.
This graph is exactly what we expected. In Ghost Recon Wildlands, a single Titan X comes in at a high / low average of 44.72 frames per second at 1080p, definitely lower than the desired minimum of 60. In SLI, we jump to 58.25 fps, but still just shy of 60. At 4K the numbers are almost depressing, coming in at 20.46 fps in the single-card configuration and just 27.01 fps in SLI. The 1080Ti, however, not only breaks through the 60 fps barrier at 1080p, but does it as a single card, hitting 66.87 fps and averaging 66.79 fps during the entire benchmark. 4K, however, is a different story. Even the mighty 1080Ti comes nowhere near the 60 fps mark in this game at Ultra. I'm not sure if that's because the game is so new, or if it's just that demanding.
In Rise of the Tomb Raider, the Titan X holds its own with a high / low average of 111.1 fps at 1080p, falling just short at 55.3 fps at 4K. Oddly enough, you will notice the Titan X actually dips in performance at 1080p in SLI. This is one case where you would be better off with a single card if you game at 1080p. The 4K performance, however, increased as expected: with Titan Xs in SLI, I averaged a high / low value of 67.53 fps. The 1080Ti of course still dominates here, beating both the single and SLI performance of the Titan X once again with 134.86 fps and 85.46 fps at 1080p and 4K respectively. The 4K test averaged 63.13 fps throughout the entire benchmark.
In Grand Theft Auto V things get much more interesting. The single-card performance of the Titan X and the 1080Ti is nearly neck and neck, which tells me I more than likely hit a bottleneck somewhere in the system. I've heard GTA likes higher clock speeds, and despite my cornucopia of cores, the low clock speed might be holding things back. All the same, at 1080p or 4K, either single card is more than enough to keep you over the desired 60 fps minimum. Interestingly enough, we again see a dip in 1080p performance with SLI, and at 4K the 1080Ti alone averaged almost a frame per second higher than the Titan Xs in SLI. One thing to note: I didn't include the average FPS for this game, because GTA doesn't give you an overall average like the others do. With the next card I test, I will start giving a calculated average of the averages.
3DMark's Fire Strike is available in 3 tests. The standard Fire Strike runs at 1080p while Fire Strike Ultra runs at 4K, so I've combined the results as 1080p and 4K in this chart. At 1080p both cards blow this test out of the water, with the 1080Ti taking a commanding lead over the aging Titan X. At 4K, however, they all suffer a fate similar to playing console games, with only the 1080Ti actually breaking the 30 fps barrier. The scores reflect a similar outcome, with the 1080Ti nearly breaching the 20,000 mark at 1080p but coming in nearly identical to the Titan Xs in SLI at 4K.
The rendering API Ghost Recon Wildlands uses is unknown to me, and it appears Ubisoft doesn't want you to know, as my Google searches were never able to officially confirm it. Time Spy, however, is definitely a DirectX 12 test. It also breaks my mold of 1080p and 4K testing by giving me results at 1440p. In this test all 3 configurations came in fairly close to one another, with no card, or combination of cards, actually reaching 60 fps. The 1080Ti does beat the SLI performance of the Titan Xs, however. For your comparison, the 1080Ti obtained an average score of 6,175.67, with the single Titan X falling behind at 5,313.33.
Unigine Heaven 4.0, undoubtedly the longest benchmark of my tests, shows a clear disdain for SLI graphics. The single Titan X had a wonderful 110.17 high / low average at 1080p, with a 4K average just under 60 at 53.05 fps. In SLI, the 1080p performance actually dropped significantly, with only an 11 fps increase at 4K. Keep in mind this test does not even have a 4K setting; it is only obtainable by forcing the benchmark to run at the system resolution. No matter how you cut it, the 1080Ti is the clear and obvious winner here, averaging over 60 fps in 4K mode. The overall scores reflect the same results, with the single Titan X taking second place over SLI and the 1080Ti still winning.
Unigine's Valley 1.0 is a newer benchmark which appears to be better optimized, though it too lacks a 4K mode. The single Titan X is still preferable to SLI: the only SLI win is at 4K, and by less than 2 fps, hardly enough to justify using 2 GPUs. The 1080Ti is still the winner here, taking home a clear trophy at 4K while only marginally better at 1080p. The scores actually tell a slightly different story, with Titan X SLI taking a clear lead over a single card at 4K. It seems the moral here is that more GPUs only help at higher resolutions.
So it would seem the 1080Ti is definitely the way to go. It doesn't get any better than this... Or does it?