Hi,
I would like to present this project of mine, if that's OK with you.
It consists of achieving a 1:1 FSB/DRAM ratio
and stabilizing the 4CoreDual with a mild overclock, without using any v-mod.
Even though it's slightly off-topic, I will say a word or two about how the GeForce 7600 behaves.
Some BIOS settings, screenshots and explanations to follow
Code:
* CPU CONFIGURATION
CPU Host Frequency [Manual]
BIOS Default (Auto)
CPU Frequency [278]
[i]A mild adjustment, I know, but fine with me.
For those who still wonder:
all else being equal, a more powerful graphics adapter will significantly reduce the headroom for any standard overclock[/i]
BIOS Default (266)
PCIE Clock [Sync with CPU]
BIOS Default (Auto)
AGP/PCI Clock [66 / 33]
BIOS Default (Auto)
Spread Spectrum [Disabled]
[i]Apparently, only high-precision devices in the room (e.g. not FM receivers) would benefit from having this enabled[/i]
BIOS Default (Auto)
Boot Failure [Enabled]
Ratio Actual Value 10
[i]Do not hesitate to start with a lower ratio (x9, then x9.5 in this case).
After all, overclocking is about making compromises, not merely a GHz race[/i]
BIOS Default (10)
Enhanced Halt State [Enabled]
[i]Glad that feature was implemented in the unofficial BIOSes.
As you all know, it makes the ratio change dynamically from x6 up to whatever the CMOS value is[/i]
BIOS Default (Disabled)
Max CPUID Value Limit [Disabled]
Cpu Thermal Throttling [Disabled]
[i]A Pentium 4-era safety feature (it throttles the CPU to prevent overheating)[/i]
BIOS Default (Enabled)
No-execute Memory Protection [Disabled]
Intel(R) SpeedStep(TM) tech. [Disabled]
BIOS Default (Enabled)
* CHIPSET SETTINGS
DRAM Frequency [266 MHz (DDRII533)]
[i]With PC2-5300 modules (whose nominal clock speed is 333 MHz),
the ASRock BIOS defaults to 266 MHz[/i]
BIOS Default (Auto)
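To make the 1:1 business concrete, here is a quick back-of-envelope check (my own arithmetic, not anything from the ASRock manual): the Intel FSB is quad-pumped and DDR2 transfers twice per clock, so running the DRAM bus in sync with the 278 MHz host clock set above gives:

```python
# Back-of-envelope 1:1 FSB/DRAM arithmetic (my own sanity check,
# not taken from the ASRock manual).
fsb_base = 278                    # MHz, the host clock set in the BIOS
fsb_effective = fsb_base * 4      # Intel's FSB is quad-pumped -> MT/s
dram_clock = fsb_base             # 1:1 divider keeps the DRAM bus in sync
dram_effective = dram_clock * 2   # DDR2 transfers twice per clock

print(fsb_effective)    # 1112 (an FSB-1066 CPU, mildly overclocked)
print(dram_effective)   # 556  (DDR2-533 modules pushed to DDR2-556)
```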
Flexibility option [Disabled]
DRAM CAS# Latency [5]
BIOS Default (Auto)
DRAM Bank Interleave [4-Way]
BIOS Default (Auto)
Precharge to Active (Trp) [4T]
BIOS Default (Auto)
Active to Precharge (Tras) [12T]
BIOS Default (Auto)
Active to CMD (Trcd) [4T]
[i]While adjusting BIOS settings, I recommend the loose timings supposedly used with DDR2 @ 333 MHz,
e.g. 5-5-5-13.
That can be a temporary solution for stabilization purposes.
Anyway, my timings are a mix of the two JEDEC values, and they seem OK, as no errors showed up in MemTest86+.[/i]
BIOS Default (Auto)
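For what it's worth, here is how I reason about why those loose cycle counts are safe at these clocks (a sketch of my own, standard latency arithmetic rather than ASRock data):

```python
# Convert a CAS setting (in clock cycles) into absolute latency.
# My own back-of-envelope arithmetic, not ASRock or JEDEC source data.
def cas_latency_ns(cas_cycles, dram_clock_mhz):
    """Absolute latency implied by a CAS setting, in nanoseconds."""
    return cas_cycles / dram_clock_mhz * 1e3

# CL5 at the rated 333 MHz clock of PC2-5300 modules:
print(round(cas_latency_ns(5, 333), 1))   # 15.0 ns
# The same CL5 setting running 1:1 with a 278 MHz host clock:
print(round(cas_latency_ns(5, 278), 1))   # 18.0 ns
```

In other words, running the bus below its rated clock turns the same cycle counts into more nanoseconds of real latency, which is exactly the extra margin that makes loose JEDEC-style values a safe starting point.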
REF to ACT/REF to REF (Trfc) [Auto]
[i]Trfc (Tfc) doesn't show up in CPU-Z; AIDA64 reports 34[/i]
ACT (0) to ACT (1) (Trrd) [Auto]
Read to Precharge (Trtp) [Auto]
Write to read CMD (Twrt) [Auto]
Write Recovery Time (Twr) [Auto]
DRAM BUS Selections [Dual Channel]
BIOS Default (Auto)
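As an aside, dual channel is where most of this board's memory bandwidth comes from. A rough theoretical peak, using my own back-of-envelope figures (assuming a 64-bit bus per channel and the DDR2-556 effective rate that a 278 MHz host clock implies):

```python
# Theoretical peak bandwidth of the dual-channel DDR2 setup
# (back-of-envelope; real-world throughput on the PT880 is far lower).
channels = 2
bytes_per_transfer = 8       # assuming a 64-bit bus per channel
transfers_per_sec = 556e6    # DDR2-556 effective rate at a 278 MHz bus

peak_gb_s = channels * bytes_per_transfer * transfers_per_sec / 1e9
print(round(peak_gb_s, 1))   # 8.9 GB/s theoretical peak
```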
DRAM Command Rate [2T Command]
- Advanced Memory Configuration
DRAM Drv & DCLK Ctrl [Auto]
(All four)
CLKBUF Skew [300ps]
(All four)
[i]I chose to set skew @ 300ps for stabilization purposes.[/i]
BIOS Default (Auto)
- Advanced Host Configuration
Pipeline DRQCTL [Enabled]
[i]Enabling this is supposedly more effective with DDR1-type DRAM
(in terms of increased theoretical memory bandwidth),
but it doesn't seem to affect my DDR2 setup[/i]
BIOS Default (Auto)
GTL Control [Auto]
* CHIPSET SETTINGS (continued)
DRAM Voltage [Normal]
BIOS Default (Auto)
AGP Voltage [High]
[i]Cranks up Northbridge voltage[/i]
BIOS Default (Auto)
Primary Graphics Adapter [PCI-Express gfx]
BIOS Default (PCI)
AGP Mode, Fast Write, Aperture Size
[i]NA (I would probably use 4X, Enabled and 32 MB with the AGP card)[/i]
BIOS Default (Auto, Disabled, 64 MB)
V-Link Speed [Normal]
[i]Setting this to Fast could lead to a faster boot, but also to pushing the chipset beyond its limit.[/i]
PCI Delay transaction [Enabled]
[i]Merely allows write-posting to continue while a non-postable PCI transaction is underway (source: TechARP).
I never noticed any benefit from setting this to Disabled
(it's such a basic queuing mechanism, I am sure even ASRock engineers were able to implement this PCI 2.1 feature correctly!)[/i]
IDE Drive Strength [Normal]
[i]Never noticed any benefit from setting this to Low or Ultra-high, but my main (C:) HDD isn't an IDE one[/i]
PCIE Downstream Pipeline [Disabled]
[i]I think it's safe to say the PCIE Pipeline should be disabled in most cases[/i]
BIOS Default (Auto)
PCIE VC1 Request Queue [Disabled]
BIOS Default (Auto)
HD Onboard Audio [Disabled]
BIOS Default (Enabled)
Echo TPR Disable [Auto]
[i]I have no idea what this is; I simply noticed people switch it to Auto (sheepish behavior, I guess!)[/i]
BIOS Default (Enabled)
Just in case some of you are interested:
* ACPI SETTINGS
ACPI HPET Table [Enabled]
BIOS Default (Disabled)
* ADVANCED PCI~PNP SETTINGS
PCI Latency Timer [64]
BIOS default (32)
PCI IDE Busmaster [Enabled]
Pictures of the corresponding BIOS settings
Overview :
CPU :
Advanced Memory :
Advanced Host :
Chipset (first part) :
Chipset (continued) :
OS : x86 Windows Vista Ultimate SP2
Intel Dual Core E7300 (2.66 GHz, 1066 MHz FSB, 3 MB L2 Cache)
2x 1 GB Transcend DDR2-667 PC2-5300 CL5 (1.8 V modules) sticks
CL4 (CAS 4), and even more so CL3-rated DRAM, whether DDR2-667 or 533, will probably cause trouble, except in compatibility mode.
If you don't find the DDR2 transition worthwhile, I suggest using DDR1 in compatibility mode.
This way, you will attain a nice 2:1 ratio (does not apply to non-FSB-1066 CPUs)
and obtain the best timings (e.g. 2.5-3-3-3-6) with minimal BIOS tweaking.
Other DRAM manufacturers I could recommend modules from: Corsair, Kingston
I ordered a fanless GeForce 7600 GS from eBay as a replacement for the crappy FX 5200.
Before switching, I was able to reach 300+ (310 max) FSB with the FX 5200 (and DDR1-400).
Not anymore: the 7600 GS seems to drain a lot more resources.
By the way, I noticed some people use an ATI X1950 based card, or even a HD 3850 in their rig.
Well, I don't think this mobo, with its somewhat crippled PCIE bus, "deserves" such expensive / recent ATI cards (just my 2 cents).
That said, mine might inherit the HD 4850 from the main computer, when the latter gets (really) deprecated.
For this 4CoreDual-SATA2 R2.0 from ASRock,
I use the latest BIOS version, from PCtreiber (2.20a).
It, along with 2 HDDs and 2 optical drives, runs just fine with my old 320 W Enermax Noisetaker power supply
(up to 23A on the +12V).
CPU Temps are incredibly stable (43-44°C) as the CPU fan is regulated (PWM fan + fanmate)
GPU temps oscillate between 55 and 61°C (after about 1 hour of benchmarking)
No errors show up after one hour of OCCT stress-testing (Core 1 temperature rises to 51°C).
As I said, I dumped the GeForce FX 5200 (AGP)
for the much more powerful (yet aging) PCI-E 7600 GS from Gigabyte (a fanless, heatsink-equipped model).
The funny thing is I had more than one driver to choose from, not even considering the WDDM one, but anyway.
In terms of overclocking, or should I say bumping / adjusting
a few values here and there, here are my results:
CPU : Starting @ 282, Vista's desktop would be garbled. I decided to stick with 278,
which brings the CPU clock speed upper limit to around 2780 MHz
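That figure is just the host clock times the multiplier; a one-liner makes it obvious (and also shows what Enhanced Halt State does when it drops the ratio to x6, as mentioned in the CPU section):

```python
# Core clock arithmetic for the E7300 (my own figures, nothing exotic).
def cpu_clock_mhz(fsb_mhz, multiplier):
    """Core clock is simply the host (FSB) clock times the ratio."""
    return fsb_mhz * multiplier

print(cpu_clock_mhz(278, 10))   # 2780 MHz at the full x10 ratio
print(cpu_clock_mhz(278, 6))    # 1668 MHz when C1E drops the ratio to x6
```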
GPU : Gigabyte already overclocked the GPU core from 200 to 250 MHz,
so I simply overclocked the memory (GDDR2) from 200 to 250 MHz via Afterburner (in steps of 12 units).
This aside, in terms of compatibility,
I was getting severe artefacts in Internet Explorer 9 (even with the wddm driver),
same issues with the default picture viewer.
I suppose Vista & Se7en's handling of hardware acceleration and desktop composition (Aero) is to blame,
as this generation of GeForce probably cannot handle those novelties very well.
I managed to get rid of IE artefacts (especially annoying with multiple tabs opened) by reverting to software rendering instead of using GPU rendering (in advanced options).
Time for some CPU-Z and GPU-Z captures
GPU-Z one
CPU-Z ones
- CPU highest value
- CPU lowest value
- Memory
And now, some measurements (antivirus disabled):
Aquamark
Avg FPS: 60.12
Avg Triangles Per Second: 18098076
Aquamark Score Render: 11615
Aquamark Score Simulation: 6232
Aquamark Score: 60120
3Dmark 2001SE
20075 marks
3Dmark '03
10551 marks
What an ugly series of tests (except for Mother Nature)!
3Dmark '05
4816 marks
3Dmark '06
NA
Too many artefacts. Plus, annoying slowdowns at this resolution (1280x1024).
Bottom line : Would I recommend the Asrock 4CoreDual (or similar boards based on VIA's PT880 chipset) ?
If you plan to overclock with a decent gfx adapter plugged in, not really (again, it might be worth it if you apply some v-mod I am not familiar with). Not to mention the badly documented, potentially troublesome BIOS settings (e.g. the infamous PCIE Downstream Pipeline). Then you have to deal with the Vista / Se7en artefacts I mentioned (Windows 8, despite a lighter Aero engine, is affected too), bearing in mind I barely checked a couple of 3D games for potential lock-ups, artefacts and the like.
That is, unless your budget won't allow anything other than a used GeForce 6xxx or 7xxx (check the VGA compatibility list published on asrock.com).
However, combining x86 Windows XP (SP3) + a compatible / reasonably powerful gfx card + DDR1 (+ patience!) amounts to a very stable, affordable, fast (compared to the older generation of chipsets) and overclockable machine. If you really wish to use a more powerful gfx card, you will have to give up on 300+ FSB overclocking IMHO (unless you're very lucky).
Addendum : It is possible to display up to 4 screens (I think they call this MultiView, nothing to do with SLI obviously) simultaneously, as both an AGP and a PCI-E adapter can be plugged in at the same time.
I would like to thank all the people who provided results and hints from hours and hours of stress testing, here and on various boards.
Any question or comment welcome!