
Kepler is here: The GTX 680



I'm set to receive one of these bad boys as soon as NewEgg has them in stock. I'll stick it in my current system for now, then move it into a new build once Intel's Ivy Bridge arrives.

 

http://www.pcgamer.com/2012/03/22/nvidia-unveils-the-geforce-gtx-680-massively-multicore-insanely-powerful/


Wow!! I love competition!! Finally Nvidia is coming out with a stellar card again!

There are many things I really like in that review. I won't be an early adopter, because what I have is still pretty good and does what I need.

However, when I get to the point of fully upgrading my system to the latest generation, this is going to be right at the top of the to-do list... if AMD doesn't come out with a killer solution of its own ;)


Wow, that sounds like exactly what I need for Minecraft.

 

Honestly, I don't know if they are coming out with any games that need that power. So what is the benefit of being an early adopter, aside from bragging rights?

 

Running Skyrim/BF3 across 3+ monitors at a high framerate on a single GPU.

 

Only thing I can think of.


Only 2 GB of RAM on that monster? Seems like such a waste of potential. For reference, I have VRAM issues in some games on my piddly little 2 GB ATI CrossFire setup.


2 GB of VRAM is plenty; remember most people game at 1920x1080 (1920x1200 myself), and that seems to be the norm.

1920 x 1080 = 2,073,600 pixels per frame

x 32-bit colour (4 bytes per pixel)

2,073,600 x 4 = 8,294,400 bytes per video page (in 32-bit colour)

Roughly 8 MB per frame.

(2 GB VRAM) 2,147,483,648 bytes / 8,294,400 bytes per frame = about 258 full frames

In effect the card could buffer over 250 plain frames; who in the real world renders even a few dozen frames ahead? No one.

Most cards today render 2-4 frames ahead.

Of course I've simplified the calculation (ignoring z-buffers and the like), but you get the idea.

2 GB is plenty even for a GTX 680 driving 6 monitors buffering 6 frames each (roughly twice the current norm for buffering).
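If anyone wants to check that arithmetic, here's a quick Python sketch of the same back-of-the-envelope maths. The resolutions and the 2 GB figure are just the examples from this post (nothing queried from an actual card), and it only counts plain 32-bit output frames, not textures or geometry:

```python
# Back-of-the-envelope: how many plain 32-bit frames fit in a given amount of VRAM.
# Example figures only; a real card also holds textures, vertex buffers, etc.

def frame_bytes(width, height, bits_per_pixel=32):
    """Size of one uncompressed frame in bytes."""
    return width * height * bits_per_pixel // 8

vram_bytes = 2 * 1024**3  # 2 GB card

# Single screens plus a 3x1 surround resolution as a multi-monitor example.
for width, height in [(1920, 1080), (1920, 1200), (5760, 1080)]:
    per_frame = frame_bytes(width, height)
    print(f"{width}x{height}: {per_frame / 1024**2:.1f} MB per frame, "
          f"{vram_bytes // per_frame} whole frames fit in 2 GB")
```

Even the 5760x1080 surround case leaves room for well over 80 whole frames.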

 

=======

 

This has nothing to do with how many frames per second the card can push out; that is tied to the clock speed. It is just how many frames can be sitting in memory at any one time, ready to be pushed out (paged to the display) virtually instantly.

 

 

Frame rate drops (a separate issue) happen when the GPU cannot supply frames to the display fast enough (or, on occasion, when the online network connection delays the push-out of frames for sync reasons).

 

 

Looks good, but is there anything out there besides 6-screen flight sims that might actually use the power?



Graphics memory is used to store vertex and index buffers, as well as various textures.

You also have to factor in games that use heavy shader techniques, such as deferred lighting (which allows many more lights than the small hardware-limited count you get with traditional forward rendering).

These shaders can require 3+ fullscreen buffers per frame: one for colour information, one for normal information, one for depth information. If you have anti-aliasing enabled, that's a couple more fullscreen buffers on top, depending on the quality level. All of these buffers are stored in video memory on the graphics card (keeping them in system memory would add CPU and transfer overhead), and each of these fullscreen buffers (a.k.a. render targets) takes up just as much space as the final output buffer.

When it comes down to it, the final "output" frame is an insignificant amount of memory compared to everything else being stored. So more memory on the graphics card DOES help, especially for games with high graphics demands that you want to run across multiple monitors. The bandwidth (throughput) of the graphics RAM matters even more, though.

 

Excerpt from a dev journal for Tabula Rasa:

In Tabula Rasa, even at a modest 1024x768 resolution, we can consume well over 50 MB of video memory just for render targets used by deferred shading and refraction. This does not include the primary back buffer, vertex buffers, index buffers, or textures. A resolution of 1600x1200 at highest quality settings requires over 100 MB of video memory just for render targets alone.

We utilize four screen-size render targets for our material attribute data when rendering geometry with our material shaders. Our light shaders utilize two screen-size render targets. These render targets can be 32 bits per pixel or 64, depending on quality and graphics settings. Add to this a 2048x2048 32-bit shadow map for the global directional light, plus additional shadow maps that have been created for other lights.
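To put rough numbers on that, here's a small Python sketch of the same kind of render-target accounting. The buffer list (four material targets, two light targets, a 2048x2048 shadow map) comes from the excerpt above; the bytes-per-pixel values are my own assumption for the highest quality setting, so treat it as a ballpark estimate rather than the engine's actual bookkeeping:

```python
# Rough VRAM estimate for deferred-shading render targets.
# Target counts follow the Tabula Rasa excerpt; 8 bytes/pixel assumes
# 64-bit targets at high quality. Back buffer, refraction targets,
# textures and geometry are NOT counted here.

def targets_mb(width, height, bytes_per_pixel, count=1):
    """Memory used by `count` render targets of the given size, in MB."""
    return width * height * bytes_per_pixel * count / 1024**2

def deferred_estimate_mb(width, height, bytes_per_pixel=8):
    material = targets_mb(width, height, bytes_per_pixel, count=4)  # material attribute targets
    lighting = targets_mb(width, height, bytes_per_pixel, count=2)  # light accumulation targets
    shadow = targets_mb(2048, 2048, 4)                              # 32-bit global shadow map
    return material + lighting + shadow

for width, height in [(1024, 768), (1600, 1200), (1920, 1080)]:
    print(f"{width}x{height}: ~{deferred_estimate_mb(width, height):.0f} MB in render targets alone")
```

That lands at roughly 50 MB for 1024x768 and a little over 100 MB for 1600x1200, the same ballpark as the quoted figures, and at 1920x1080 it is already north of 100 MB before a single texture or vertex buffer is loaded.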

 

 

 

 

However, as the GTX 680 can (for right now) only drive up to 4 monitors anyway, I think the 2 GB limit will be fine.

 

 


