
Rooster's OC Experience: The Forum


Rooster90

Well, to put it lightly, I'm a total noob when it comes to overclocking. I know some basics and understand a few terms, but beyond that this is the first computer where I'm really delving into it. (Refer to my avatar; that will explain how I feel when people start throwing voltages and weird abbreviations at me.) I will document my methods and results here, mostly for my own benefit, but also for others. Tips and pointers are always appreciated.

 

Here's my system, all stock settings:

 

CPU: Intel Core i7-2600K @ 3.4 GHz (at default it actually runs closer to 3.5 GHz, according to CPU-Z)

Mobo: MSI P67A-GD65

RAM: G.Skill Ripjaws X 8GB DDR3 (dual-channel) @ 1333 MHz

GPU: EVGA GTX 560 Ti 1GB

-Graphics Clock: 900 MHz

-Memory Clock: 2106 MHz

-Processor (Shaders) Clock: 1800 MHz

Case: Cooler Master HAF 912. Great ventilated case at an unbeatable price; highly recommended for those on a tight budget. http://www.newegg.com/Product/Product.aspx?Item=N82E16811119233

 

I have a sealed liquid cooling system on my CPU and three 120mm fans throughout my case: one intake on the side and two exhausts (one in the top rear, the other on the very top). My Corsair 850W PSU sits at the bottom of the case.

 

Liquid Cooler = Asetek 550LC: http://www.asetek.com/products/oem-standard-products/120mm-products/550lc.html

 

On to the fun stuff. I just uninstalled my video drivers, rebooted into Safe Mode (hit F8 on start-up and selected it from the advanced start-up options), and reset all of my CPU overclocking profiles to their default settings. My mobo comes with a pretty sweet BIOS that's actually a GUI with clickable buttons, which makes it a much more user-friendly experience. I then installed the latest drivers for my EVGA GTX 560 Ti (275.33 as of this writing).

 

Nvidia drivers: http://www.nvidia.com/Download/index.aspx?lang=en-us

 

On to some synthetic testing. My first test checks my CPU at stock and the liquid cooling. I'm using a program called RealTemp to monitor all of my cores' temps simultaneously. Hajimoto recommended a good stress test called Prime95 for this very purpose. Note that I'm using the 64-bit version, matching my OS. For those of you running a 32-bit OS, there is a "related link" on that page for the 32-bit version.
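If you'd rather have a record of the run than eyeball RealTemp the whole time, here's a minimal sketch of how one could summarize a per-core temperature log afterwards. It assumes you've exported a CSV with a timestamp column followed by one column per core; the temps.csv name and the column layout are assumptions for illustration, not RealTemp's actual log format.

```python
# Minimal sketch: summarize per-core temps from a CSV written by your
# monitoring tool during a Prime95 run.
# Assumed layout (hypothetical; adjust to whatever your tool actually writes):
#   time,core0,core1,core2,core3
#   12:01:05,41,43,40,42
import csv

def summarize(path="temps.csv"):
    readings = {}  # core name -> list of temps in deg C
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for col, val in row.items():
                if col.lower().startswith("core"):
                    readings.setdefault(col, []).append(float(val))
    for core, temps in sorted(readings.items()):
        print(f"{core}: max {max(temps):.0f} C, "
              f"avg {sum(temps) / len(temps):.1f} C over {len(temps)} samples")

if __name__ == "__main__":
    summarize()
```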

 

I've run both the "In-place large FFTs" test (described as a maximum heat/power-consumption test) and the "Blend" test, which exercises a little bit of everything. I ran each for about 5-10 minutes with my max temperature sitting somewhere around 69-70 degrees C. Not bad, if I do say so myself. Intel really did deliver a great line of processors that run exceptionally cool. According to CPU-Z the core voltage is sitting at 1.152 V.

 

So far, so good. My CPU and cooler can more than take the heat at stock settings, and there's hardly a peep out of my fans. In my next post I will begin GPU tests, both synthetic and "real", at stock settings.

 

To be continued....

Edited by Rooster90


 

NICE RIG already ..

 

have fun ...

(don't hunt on a full belly)


PART 2: GPU stock settings. Synthetic benchmarking/stressing.

 

I'm using another Haji-approved program for this one. This time it's the Unigine-developed benchmark called "Heaven".

 

Direct download here: http://www.techpowerup.com/downloads/1961/Unigine_Heaven_DX11_Benchmark_2.5.html

 

Supposedly this is top-of-the-line stressing for DX11 systems, and if good ol' Hajimoto recommends it, I'm definitely willing to give it a go. It's a modest 228MB download, FYI. I will also be using EVGA Precision to monitor my GPU.

 

Found here: http://www.evga.com/precision/

 

Idling, my GPU runs at a cool and calm 36 degrees C. Now to begin the Heaven test. I'll also be monitoring my CPU temps, for the hell of it.

 

Heaven settings:

DX11

Stereo 3D: disabled

Shaders: High

Tessellation: Extreme

Anisotropy: 16x

AA: 8x

Res: 1680x1050 (old monitor, still running strong though).

 

All of these are pretty much the highest settings the program offers. On to the results....

 

DX11 = BRUTAL

Max CPU temp: 46 degrees C

GPU temp: ~80 degrees C, fan speed ~50%

Average frames: ~21fps

Score: 582

 

Ouch. For kicks and giggles, I'll add DX9 and 10 results as well.

 

DX9 Results:

Max CPU temp: low 40s

GPU temp: ~80 degrees C, fan speed ~55%

Avg frames: 51

Score: 1287

 

DX10 Results:

Max CPU temp: 46 degrees C

GPU temp: ~80 degrees C, fan speed ~53%

Avg frames: 39

Score: 971

 

Hmm. Interesting indeed.
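To put those three runs side by side, here's a rough sketch (the averages are just the numbers reported above, nothing new) of how much each render path costs relative to DX9 on this rig:

```python
# Quick comparison of the three Heaven runs above (same rig, same settings,
# only the render path changes). Average FPS copied from the posted results.
runs = {"DX9": 51, "DX10": 39, "DX11 (extreme tessellation)": 21}

baseline = runs["DX9"]
for path, fps in runs.items():
    drop = (1 - fps / baseline) * 100
    print(f"{path}: {fps} fps ({drop:.0f}% below DX9)")
# DX9: 51 fps (0% below DX9)
# DX10: 39 fps (24% below DX9)
# DX11 (extreme tessellation): 21 fps (59% below DX9)
```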

 

To be continued....

Edited by Rooster90

Thoughts and Reflections so far.

 

I'm not really sure what to think of my Heaven results. I honestly don't know what results a comparable system should be getting. They seem pretty low to me, but I can't be sure; maybe it just means Heaven really is as brutal as they say. I still feel like I should have performed better, especially considering my resolution. I also noticed at the end of the benchmark that it said it was using 32-bit C++ binaries, even though I'm on a 64-bit system... I don't know whether that means anything or not. Somebody better versed in the program could clarify this.

 

What do you guys think? Is this to be expected of this benchmarking program, or is there something sinister going on here?

 

I've already named one possible culprit, and that is my RAM. I got it for a great price, but you get what you pay for: it clocks in at 1333 MHz stock with a CAS latency of 9. Meh. But for something like this, you'd think the benchmark would lean more on the GPU. Refer to my first post for my GPU's stock clocks; even at stock they're pretty damn good. What else could be going on here?

 

I await your thoughts and opinions (if any). I will continue benchmarking my stock system tomorrow.

Edited by Rooster90

Couldn't run DX11 on my GTX 285, but my DX10 looks to be in line, so you're about where you should be.

 

 

 

Powered by Unigine Engine

Heaven Benchmark v2.5 Basic

FPS: 22.6

Scores: 568

Min FPS: 11.4

Max FPS: 42.7

 

Hardware

Binary: Windows 32bit Visual C++ 1600 Release Mar 1 2011

Operating system: Windows 7 (build 7601, Service Pack 1) 64bit

CPU model: Intel® Core i5-2500K CPU @ 3.30GHz

CPU flags: 3292MHz MMX SSE SSE2 SSE3 SSSE3 SSE41 SSE42 HTT

GPU model: NVIDIA GeForce GTX 285 8.17.12.7533 1024Mb

 

Settings

Render: direct3d10

Mode: 1680x1050 8xAA fullscreen

Shaders: high

Textures: high

Filter: trilinear

Anisotropy: 16x

Occlusion: enabled

Refraction: enabled

Volumetric: enabled

Tessellation: disabled


I've had this CPU for so long, and after comparing the results here and in Batwing's thread I won't be buying a new one for years to come... wth!?

 

Heaven Benchmark v2.5 Basic

FPS: 39.5

Scores: 996

Min FPS: 19.2

Max FPS: 82.2

 

Hardware

Binary: Windows 32bit Visual C++ 1600 Release Mar 1 2011

Operating system: Windows 7 (build 7601, Service Pack 1) 64bit

CPU model: Intel® Core2 Quad CPU Q9550 @ 2.83GHz

CPU flags: 2830MHz MMX SSE SSE2 SSE3 SSSE3 SSE41 HTT

GPU model: NVIDIA GeForce GTX 560 Ti 8.17.12.7533 1024Mb

 

Settings

Render: direct3d10

Mode: 1680x1050 8xAA fullscreen

Shaders: high

Textures: high

Filter: trilinear

Anisotropy: 16x

Occlusion: enabled

Refraction: enabled

Volumetric: enabled

Tessellation: disabled


Confirmed, this is a very good stress test for the GPU. Overclocking the CPU here doesn't help; overclocking the GPU is what may make a difference.

 

Beautiful graphics on this benchmark, by the way; I loved the art.

 

Here is my result

 

Heaven Benchmark v2.5 Basic

FPS: 20.4

Scores: 513

Min FPS: 10.1

Max FPS: 42.5

 

Hardware

Binary: Windows 32bit Visual C++ 1600 Release Mar 1 2011

Operating system: Windows 7 (build 7601, Service Pack 1) 64bit

CPU model: Intel® Core i7 CPU 920 @ 2.67GHz

CPU flags: 3500MHz MMX SSE SSE2 SSE3 SSSE3 SSE41 SSE42 HTT (here it looks like I am OCed at 3.5 GHz; I thought I was at 3.7..)

GPU model: AMD Radeon HD 6900 Series 8.850.0.0 2048Mb

 

Settings

Render: direct3d11

Mode: 1920x1200 8xAA fullscreen

Shaders: high

Textures: high

Filter: trilinear

Anisotropy: 16x

Occlusion: enabled

Refraction: enabled

Volumetric: enabled

Tessellation: extreme

 

EDIT:

 

I lowered my res to 1680x1050 to give you a comparison. However, I notice my card gives relatively better results at higher res. Of course the lower res cranked up better numbers, but when the average difference is only about 3 FPS at a much higher resolution, the higher res is the better deal here. Which makes me think that if you also had a higher res you could get better results as well. It looks like last-gen video cards like to push their horsepower on big screens :) (rough per-pixel math after the numbers below)

 

 

FPS: 23.8

 

Scores: 599

 

Min FPS: 10.6

 

Max FPS: 52.9
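Rough math on that, using the average FPS from my two runs above: normalizing by pixels rendered per second (just fps times width times height, an approximation that ignores everything except raw resolution), the higher res is actually pushing more pixels:

```python
# Rough per-pixel comparison of the two runs above.
# Average FPS and resolutions are copied from the posted results.
runs = [
    ("1920x1200, extreme tess", 1920, 1200, 20.4),
    ("1680x1050, extreme tess", 1680, 1050, 23.8),
]

for name, w, h, fps in runs:
    mpix_per_s = fps * w * h / 1e6
    print(f"{name}: {fps} fps ~= {mpix_per_s:.0f} Mpixels/s")
# 1920x1200, extreme tess: 20.4 fps ~= 47 Mpixels/s
# 1680x1050, extreme tess: 23.8 fps ~= 42 Mpixels/s
```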

Edited by Batwing~SPARTA~

I've had this CPU for so long, and after comparing the results here and in Batwing's thread I won't be buying a new one for years to come... wth!?

 

 

 

Now doesn't that sound familiar??? Learn to trust the Hajimoto....

You have a powerhouse CPU and all that is needed is a fair liquid cooler to help keep things in check when you start to push that thing a bit.


Now doesn't that sound familiar??? Learn to trust the Hajimoto....

You have a powerhouse CPU and all that is needed is a fair liquid cooler to help keep things in check when you start to push that thing a bit.

 

To be fair, I made that choice because of you a few months ago... otherwise I would have already bought a new CPU; you know I was about to.

But this really confirms exactly what you said! <<that sounded like proper English when I typed it... anyway... lol


Powered by Unigine Engine

Heaven Benchmark v2.5 Basic

FPS: 69.5

Scores: 1934

Min FPS: 36.7

Max FPS: 159.2

 

Hardware

Binary: Windows 32bit Visual C++ 1600 Release Mar 1 2011

Operating system: Windows 7 (build 7601, Service Pack 1) 64bit

CPU model: Intel® Core™ i7-920 CPU @ 2.67GHz

CPU flags: 4395MHz MMX SSE SSE2 SSE3 SSSE3 SSE41 SSE42 HTT

GPU model: NVIDIA GeForce GTX 590 8.17.12.7533 1536Mb

 

Settings

Render: direct3d11

Mode: 1920x1080 8xAA fullscreen

Shaders: high

Textures: high

Filter: trilinear

Anisotropy: 16x

Occlusion: enabled

Refraction: enabled

Volumetric: enabled

Tessellation: extreme

Edited by Hajimoto

Powered by Unigine Engine

Heaven Benchmark v2.5 Basic

 

FPS: 34.6

Scores: 873

Min FPS: 12.8

Max FPS: 88.5

 

Hardware

Binary: Windows 32bit Visual C++ 1600 Release Mar 1 2011

Operating system: Windows 7 (build 7600) 64bit

CPU model: Intel® Core i7 CPU 930 @ 2.80GHz

CPU flags: 4009 MHz MMX SSE SSE2 SSE3 SSSE3 SSE41 SSE42 HTT

GPU model: NVIDIA GeForce GTX 470 8.17.12.7533 1280Mb

 

Settings

 

Render: direct3d11

Mode: 1920x1080 8xAA fullscreen

Shaders: high

Textures: high

Filter: trilinear

Anisotropy: 16x

Occlusion: enabled

Refraction: enabled

Volumetric: enabled

Tessellation: extreme

 

This was with (not exactly fantastic) stock clocks on the video card.

 

Core Clk: 607

Shader Clk: 1215

Memory Clk: 1674

Fan speed: 50%

Max temp: 76 deg C

 

 

I am planning to run the benchmark after I OC a bit and stress test.


@Hajimoto:

 

Your tessellation was set to Normal. Set it to "Extreme" so it's comparable to everybody else's scores. I've been OCing my GPU and running a multitude of tests today; tessellation alone, going from Normal to Extreme, makes a 10-15 fps difference (on my rig). I imagine that's because the technology is still fairly new, as is DX11.

 

@Noob: What manufacturer/brand of 560 Ti do you own? I own the EVGA Superclocked version (900 MHz core clock, stock). I'm wondering whether, despite the manufacturer's numbers, my GPU is simply using inferior hardware/architecture compared to other 560s. I find that a little hard to swallow considering the brand, but it's been shown in the past that hardware can print big numbers on the box and still not perform up to snuff against similar products. It certainly isn't my CPU, because even at stock settings my CPU is essentially idling; at full tilt I'm only using about 50-60% of my CPU during ArmA 2.

Edited by Rooster90

Ah, that explains a bit. Your 560 is, in fact, superior to mine. That accounts for some of the discrepancy, though the gap is still fairly large (10-15 fps more than mine).

 

My 560 at stock is:

 

Core: 900

Shader: 1800

Memory: 2106

 

Currently I have it OC slightly at:

 

Core: 950

Shader: 1900

Memory: 2106

 

I likely could OC my card further, but when going through my OC tests I was using Heaven as a guide, and anything above what I have now started to produce artifacts and glitches. Granted, the program pushes my GPU further in DX11, and for longer, than any game does, and having all the settings maxed out doesn't help. However, I'm sure your card is designed to handle the higher speeds and heat better than mine. I don't want to push my luck and fry the card early.
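For perspective, that bump works out to a pretty modest overclock against the stock clocks listed above:

```python
# Percentage overclock relative to the stock clocks listed above.
stock = {"core": 900, "shader": 1800, "memory": 2106}   # MHz
oc    = {"core": 950, "shader": 1900, "memory": 2106}   # MHz

for domain in stock:
    pct = (oc[domain] / stock[domain] - 1) * 100
    print(f"{domain}: {stock[domain]} -> {oc[domain]} MHz (+{pct:.1f}%)")
# core: 900 -> 950 MHz (+5.6%)
# shader: 1800 -> 1900 MHz (+5.6%)
# memory: 2106 -> 2106 MHz (+0.0%)
```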


Rooster,

 

Be aware Noob's test was with tessellation "disabled", which makes a big difference in the final results.

 

Why don't you simply take Noob's hardware settings as a reference and emulate them? Don't push the OC; just match Noob's and see if you still have artifacts. Pump up the fan speed to be sure the artifacts aren't caused by temperature under OC settings.

 

@ Haji:

 

That freaking 590 is a real monster, man! In your config it isn't even your CPU pumped to 4.4 making the difference, nor the great RAM performance you have at almost 1600. It really is that freaking monster :)

Edited by Batwing~SPARTA~

Rooster,

 

Be aware Noob's test was with tessellation "disabled", which makes a big difference in the final results.

 

Why don't you simply take Noob's hardware settings as a reference and emulate them? Don't push the OC; just match Noob's and see if you still have artifacts. Pump up the fan speed to be sure the artifacts aren't caused by temperature under OC settings.

 

So was Donzi's, and Rooster had posted his DX10 in there too, until you, Haji, and Medic posted; that's why I did the DX10 benchmark.


Idk why I didn't think of that... *facepalm*

 

Running with Noob's clock speeds and with tessellation disabled, my score is almost EXACTLY identical to his. However, I caught two small artifacts during the test. These may very well be heat-related: I had my fan speed set to a rather audible 65% (~3150 RPM), and the peak temperature was around 73-74 degrees C, which also happened to be around the time I noticed the artifacts....

 

I guess the logical next question is: what temp does Noob's GPU typically run at, and at what fan speed? His GPU may also be a more efficient, cooler-running card than mine, giving him even more added performance.


Oh Noob, that's true :) In fact I didn't even notice you were testing DX10 instead of 11. Not a biggie, but it nearly gave our good Rooster a cardiac arrest while he was trying to match your results :)

 

Comparing Rooster's results and yours on DX10, the two cards really perform about the same, with minimal difference. I give a special commendation to your machine anyway, because it's using the good, faithful Q9550 quad-core. However, as we know, CPU performance is not a major factor here.

 

When you have time, please re-post a DX11 benchmark with Rooster's settings; that will help us see whether Rooster really needs to be concerned.

 

I think that overall, although benchmarks aren't always real-world experience, this is giving all of us an opportunity to review and refine our performance, which is a great thing anyway :)

 

@ Haji:

 

Just because I love it when you slap my face.. lol.. could you please post your benchmark at max res, 1920x1200, everything maxed out? So I can see where my single GPU stacks up against yours? Thx my friend... It's almost like asking a friend to perform euthanasia.. but oh well... we like living dangerously, don't we? :)

Edited by Batwing~SPARTA~

Here's my 1920x1080

Heaven Benchmark v2.5 Basic

FPS: 21.4

Scores: 538

Min FPS: 6.6

Max FPS: 52.9

 

Settings

Render: direct3d11

Mode: 1920x1080 8xAA fullscreen

Shaders: high

Textures: high

Filter: trilinear

Anisotropy: 16x

Occlusion: enabled

Refraction: enabled

Volumetric: enabled

Tessellation: extreme

=======================================================================================

=======================================================================================

And the lower 1680x1050

 

FPS: 24.0

Scores: 604

Min FPS: 6.5

Max FPS: 60.1

 

Settings

Render: direct3d11

Mode: 1680x1050 8xAA fullscreen

Shaders: high

Textures: high

Filter: trilinear

Anisotropy: 16x

Occlusion: enabled

Refraction: enabled

Volumetric: enabled

Tessellation: extreme


Rooster, my card came with its own custom cooler.

Batwing, yes, we achieve the same scores, because I guess this benchmark is GPU-only... but that doesn't explain why I beat, or at least match, his scores in ArmA 2, a very CPU-intensive game... I guess it's the overclock... but Rooster should "beat" me in ArmA, no doubt about it.

Link to comment
Share on other sites

Btw, if you're getting artifacts, don't run that shit for too long! It's probable that you left your voltage at default, which could cause the artifacts; but if you up the voltage it'll get even hotter, which can also result in artifacts, etc. And I've read that if you run that card faster than 950 to 1000 it'll start to use over 300W... bad idea.
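The voltage point is basically the usual first-order dynamic-power rule of thumb, P roughly proportional to V^2 times f: power goes up linearly with clock but with the square of voltage. A rough sketch of what a small bump does (the clock and voltage numbers here are illustrative assumptions, not measurements from this card):

```python
# First-order estimate of how GPU power scales with clock and core voltage,
# using the dynamic-power approximation P ~ V^2 * f.
# Baseline and overclocked values below are assumed for illustration only.
def relative_power(f0, v0, f1, v1):
    return (f1 / f0) * (v1 / v0) ** 2

base_clock, base_volt = 900, 1.00    # MHz, V (assumed baseline)
oc_clock, oc_volt = 1000, 1.10       # MHz, V (assumed overvolted OC)

factor = relative_power(base_clock, base_volt, oc_clock, oc_volt)
print(f"~{(factor - 1) * 100:.0f}% more dynamic power")   # ~34% more
```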


Just ran DX10 and 11 tests with Noob's clock settings.

 

With Noob's original DX10 test, our scores are literally the same: same average FPS, same score. However, I'm noticing about 2 or 3 minor artifacts in random places throughout the test. It seems that once my card starts encroaching on that 69-70 degree mark and beyond, it flakes out. Noob's doesn't, thanks to a much better cooling system on his GPU, I'm sure of it. In DX11 with Extreme tessellation the artifacts get much worse. However, this is a benchmark program and not an actual game, which would be better optimized to handle these sorts of issues. I'll be back once I've run some benchmarks in actual games.

 

EDIT (after reading Noob's post): Well, at 30% fan speed (the lowest it will go) my fan runs very quietly at about 1280 RPM. So that 950-1000 limit must be a card-specific issue. I've got 850W of power in my rig and a very energy-efficient mobo and CPU, so I've got the headroom for additional fan power.

Edited by Rooster90

ArmA is still a mystery in that regard.

 

However, I guess the answer with ArmA really comes down to how many threads the game can use and optimize. I'm not too strong on ArmA's software architecture, but what I can say is that the 9550 performs very well on the main 4 threads. I don't know whether the 9550 has the option to double up to 8 threads.

 

However, Rooster's 2600K is not necessarily more powerful than your 9550. The fact is the 2600K runs at lower wattage and lower temps (which in general means it "performs better"), but not necessarily better in absolute performance across the 4 main threads.

 

Then you need to evaluate how ArmA uses those threads. I'm almost inclined to think your 9550 could perform better on the ArmA engine, simply because when they wrote that engine the 9550 was the kick-ass chip available :) It isn't even optimized for the newer 2600K.

 

@ Rooster:

 

Now that you see the DX11 results, I would say the minimal difference there is just down to the way you overclock your card. If you overclock with exactly the same values as Noob, you should see a perfectly fair result.

 

EDIT

 

I had to correct a very misleading comment I wrote above about the 2600K's performance:

 

What I meant is really about how a new processor's workload can be optimized by newer software. Looking at the 3DMark Vantage High results on Tom's Hardware's comparison chart

http://www.tomshardware.co.uk/charts/desktop-cpu-charts-2010/3DMark-Vantage-High,2418.html

the 2600K performs at 200% compared to the 9550. But how would it perform on an older benchmark, say one written in the 9550's golden days?

So I think that, in some way, the ArmA engine may be giving an edge to the old processor. Just an idea; I could be totally wrong.

Edited by Batwing~SPARTA~
