
Nvidia 600 Series - Do you NEED PCIe 3.0


Hajimoto

PCI-E 2.0 vs. PCI-E 3.0 with the GTX 690

When the GTX 680 was first launched, I assumed that its performance would be slowed on anything but a PCI-E 3.0 slot. NVIDIA had other ideas: their post-release drivers dialed its bandwidth back to PCI-E 2.0 when the card was used on X79-based systems. The reason was quite simple: while the interconnects are built into the Sandy Bridge-E chips, Intel doesn't officially support PCI-E 3.0 through that architecture. As a result, performance issues arose in rare cases when running two or more cards on some X79 systems.

 

This new GTX 690 uses an internal PCI-E 3.0 bridge chip which allows it to avoid the aforementioned problems. But with a pair of GK104 cores beating at its heart, bottlenecks could presumably occur with anything less than a full-bandwidth PCI-E 3.0 x16 connection. This could cause issues for users of non-native PCI-E 3.0 boards (like P67, Z68 and even X58) who want a significant boost to their graphics but don't want to upgrade to Ivy Bridge or Sandy Bridge-E.

 

In order to test how the GTX 690 reacts to changes in the PCI-E interface, we used our ASUS X79WS board which can switch its primary PCI-E slots between Gen2 and Gen3 through a simple BIOS option. All testing was done at 2560 x 1600 in order to eliminate any CPU bottlenecks.

 

[Chart: GTX 690 benchmark results, PCI-E 2.0 vs. PCI-E 3.0 at 2560 x 1600]

 

While some of you may not have been expecting these results, they prove that anyone with a PCI-E 2.0 motherboard won't have to run out for a last-minute upgrade. We saw hardly any variance between the two interfaces, and in the few cases where there was a discernible difference, it was well within the margin of error. Skyrim is the odd game out, but we'll get to that later.

Let's explain what's happening here, since it all centers around the complex dance between the CPU, the GPU and their interconnecting buses. At higher framerates, a ton of information passes through the PCI-E interface as the GPU calls on the processor for more frames, which can saturate a lower-bandwidth PCI-E bus. At higher resolutions and image quality settings like the ones used in the tests above, the GPU becomes the bottleneck, so it calls for fewer frames from the CPU and the PCI-E interface becomes less of a determining factor in overall performance.
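For a bit of context on the raw numbers, here's a quick back-of-envelope sketch (our own, not from the original article) of the theoretical one-way bandwidth each interface offers over an x16 link, using the published per-lane signalling rates and line encodings:

```python
# Rough sketch: theoretical one-way bandwidth of a PCI-E x16 link,
# derived from the per-lane transfer rate and line-encoding overhead.

def link_bandwidth_gbs(transfers_per_sec, encoding_efficiency, lanes=16):
    """Return usable bandwidth in gigabytes per second, one direction."""
    bits_per_sec = transfers_per_sec * encoding_efficiency * lanes
    return bits_per_sec / 8 / 1e9

pcie2 = link_bandwidth_gbs(5e9, 8 / 10)      # PCI-E 2.0: 5 GT/s, 8b/10b encoding
pcie3 = link_bandwidth_gbs(8e9, 128 / 130)   # PCI-E 3.0: 8 GT/s, 128b/130b encoding

print(f"PCI-E 2.0 x16: {pcie2:.1f} GB/s")    # ~8.0 GB/s
print(f"PCI-E 3.0 x16: {pcie3:.1f} GB/s")    # ~15.8 GB/s
```

So Gen3 roughly doubles the pipe; the question the benchmarks answer is whether the GTX 690 ever actually needs more than the Gen2 pipe at playable settings.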

 

This brings us to our next point: users may still encounter a bandwidth bottleneck, but only when the CPU has to send large batches of frames off to the GPU, something that doesn't typically happen at higher resolutions. We'd normally see that kind of situation when the GPU is operating at ultra-high framerates. This is why Skyrim (which still seems oddly CPU-bound at 2560 x 1600) sees an ever so slight benefit from PCI-E 3.0.

 

Naturally, higher-clocked processors can throw out more frames, which is why overclocking your CPU can result in a potential PCI-E bottleneck. But once again, with a GPU as powerful as the GTX 690, a situation like this will only happen at framerates so high that it won't cause any noticeable in-game performance drop-offs.
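To illustrate that point, here's a rough sketch of how the per-frame transfer budget shrinks as framerates climb. The framerates are arbitrary examples and the bandwidth figures are just the theoretical x16 numbers from above, so treat this as an illustration rather than a measurement:

```python
# Illustration only: how many megabytes the CPU could push across the bus
# per frame before saturating the link, at a few example framerates.
# 8.0 and 15.75 GB/s are the theoretical x16 figures computed earlier.

PCIE2_X16_GBS = 8.0
PCIE3_X16_GBS = 15.75

for fps in (60, 120, 240, 480):
    budget2 = PCIE2_X16_GBS * 1000 / fps   # MB per frame available on Gen2
    budget3 = PCIE3_X16_GBS * 1000 / fps   # MB per frame available on Gen3
    print(f"{fps:>3} fps: Gen2 ~{budget2:.0f} MB/frame, Gen3 ~{budget3:.0f} MB/frame")
```

At the modest framerates a GTX 690 produces at 2560 x 1600, even the Gen2 budget is enormous; only at the very high framerates you see at low resolutions does the gap between the two interfaces start to matter at all.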

In order to put our conclusions into context, we've repeated some of the tests below at lower resolutions / IQ settings. The three games chosen were the only ones that displayed a clear difference after multiple benchmark run-throughs.

 

[Chart: GTX 690 benchmark results, PCI-E 2.0 vs. PCI-E 3.0 at lower resolutions / IQ settings]

 

As you can see, there is a falloff here that's beyond our margin of error, but even in this case the GTX 690 with an overclocked processor is able to push such high framerates that a few percentage points won't make one lick of difference. With a lower-clocked CPU, the gap would be even smaller.

 

So let's sum this up: while PCI-E 3.0 can make a minor on-paper difference in some situations, it certainly isn't needed to ensure your GTX 690 is the fastest graphics card on the block. If you have a PCI-E 2.0 motherboard with a decent processor, an upgrade to a Gen3 interface really isn't necessary.

This story was straight up stolen from The Hardware Canucks; click HERE to read the original story.
