Single vs. Dual Channel DDR2

Message boards : Number crunching : Single vs. Dual Channel DDR2


Gen_X_Accord
Joined: 5 Jun 06
Posts: 154
Credit: 279,018
RAC: 0
Message 60982 - Posted: 4 May 2009, 8:31:00 UTC

{I'm not a gamer; I surf the net, watch "Lost" and "Heroes" online, make a few DVD's of my kids pictures and videos. That is why my new system is a compromise of performance and budget.}

I have a new motherboard, processor, and memory on the way. The memory is 1066 DDR2, and the processor is an AMD Athlon X2 7850 Black Edition that has a memory controller for the 1066. Now I have heard that if I configure my memory to run in dual-channel mode, it will only run at 800 MHz. My question is, which do you think will deliver the most performance for Rosetta, not to mention other applications... single channel at 1066 MHz or dual channel at 800 MHz?
I'd like an opinion from others who have experimented with the difference between the two instead of wasting time playing with it myself. (I don't have time, I have Mother's Day DVDs to produce. 8-)
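For reference, the peak-bandwidth arithmetic behind the two options can be sketched like this (theoretical numbers only; each DDR2 channel is 64 bits wide, i.e. 8 bytes per transfer, and real application gains are far smaller than peak figures suggest):

```python
# Theoretical peak bandwidth for DDR2: transfers/s (in millions, MT/s)
# times 8 bytes per transfer (64-bit channel) times the channel count.
def ddr2_peak_mb_s(mt_per_s, channels):
    return mt_per_s * 8 * channels

print(ddr2_peak_mb_s(1066, 1))  # 8528 MB/s  (single channel at 1066)
print(ddr2_peak_mb_s(800, 2))   # 12800 MB/s (dual channel at 800)
```

On paper, dual channel at 800 wins by roughly 50%, though as the replies in this thread note, Rosetta is far more sensitive to CPU cache than to memory bandwidth.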
mikey
Joined: 5 Jan 06
Posts: 1895
Credit: 9,183,794
RAC: 3,318
Message 60985 - Posted: 4 May 2009, 10:52:00 UTC - in response to Message 60982.  

{I'm not a gamer; I surf the net, watch "Lost" and "Heroes" online, make a few DVD's of my kids pictures and videos. That is why my new system is a compromise of performance and budget.}

I have a new motherboard, processor, and memory on the way. The memory is 1066 DDR2, and the processor is an AMD Athlon X2 7850 Black Edition that has a memory controller for the 1066. Now I have heard that if I configure my memory to run in dual-channel mode, it will only run at 800 MHz. My question is, which do you think will deliver the most performance for Rosetta, not to mention other applications... single channel at 1066 MHz or dual channel at 800 MHz?
I'd like an opinion from others who have experimented with the difference between the two instead of wasting time playing with it myself. (I don't have time, I have Mother's Day DVDs to produce. 8-)


Dual channel is faster. I have run both single- and dual-channel installations, and dual is faster!
NightmareXX
Joined: 12 Jun 06
Posts: 24
Credit: 1,885,318
RAC: 0
Message 61012 - Posted: 5 May 2009, 0:31:05 UTC

From my own testing on a system with exactly the same components, just changing the memory speed had no effect on R@H.

Tested on a Phenom 9850 with memory at 800 MHz and 667 MHz. There was no drop in the average credit granted between the two.

What does affect R@H pretty significantly, however, is the amount of L2/L3 cache your CPU has. My 9850 (RIP) only had 4 MB of L2/L3 cache total. I swapped it out for an Intel Q9450, which has a slower clock speed but a massive 12 MB of L2 cache (6 MB shared per pair of cores!), and the number of decoys I was able to generate per work unit shot up, resulting in a much higher RAC :)
Gen_X_Accord
Joined: 5 Jun 06
Posts: 154
Credit: 279,018
RAC: 0
Message 61017 - Posted: 5 May 2009, 8:07:38 UTC - in response to Message 61012.  

From my own testing on a system with exactly the same components, just changing the memory speed had no effect on R@H.

Tested on a Phenom 9850 with memory at 800 MHz and 667 MHz. There was no drop in the average credit granted between the two.

What does affect R@H pretty significantly, however, is the amount of L2/L3 cache your CPU has. My 9850 (RIP) only had 4 MB of L2/L3 cache total. I swapped it out for an Intel Q9450, which has a slower clock speed but a massive 12 MB of L2 cache (6 MB shared per pair of cores!), and the number of decoys I was able to generate per work unit shot up, resulting in a much higher RAC :)


I know that those Rosetta Mini apps rely heavily on processor cache. I would have loved to have gotten a Phenom II quad or triple core. But... budget came into play, and I had to decide on the minimum of what I needed to really do a good job on the video encoding/DVD creation. And the $70 I saved by getting an X2 7850 instead of the Phenom II X3 720 helped pay for most of my new motherboard (an ASUS M3N78-EM with NVIDIA GeForce 8300 chipset, by the way).
I do like that the 7850 has a lot of the features that are big selling points for the Phenom and Phenom II, and an unlocked multiplier in case I decide to try my hand at a little overclocking. Whether or not overclocking has a big effect on Rosetta results is a whole other thread, I'm sure.
mikey
Joined: 5 Jan 06
Posts: 1895
Credit: 9,183,794
RAC: 3,318
Message 61019 - Posted: 5 May 2009, 12:36:48 UTC - in response to Message 61012.  

From my own testing on a system with exactly the same components, just changing the memory speed had no effect on R@H.

Tested on a Phenom 9850 with memory at 800 MHz and 667 MHz. There was no drop in the average credit granted between the two.

What does affect R@H pretty significantly, however, is the amount of L2/L3 cache your CPU has. My 9850 (RIP) only had 4 MB of L2/L3 cache total. I swapped it out for an Intel Q9450, which has a slower clock speed but a massive 12 MB of L2 cache (6 MB shared per pair of cores!), and the number of decoys I was able to generate per work unit shot up, resulting in a much higher RAC :)


You are absolutely correct, L2 cache size plays a HUGE role in BOINC crunching as a whole! That is why Celeron and Duron processors don't do as well as their full-cache counterparts under BOINC.

But you were talking about memory speed, not the dual- vs. single-channel aspect of the memory. Dual channel also makes a big difference over single channel. I have also changed the memory speed on my PCs and it made absolutely no difference! BUT going from one memory stick to two, single to dual channel, did make a big difference.
TomaszPawel
Joined: 28 Apr 07
Posts: 54
Credit: 2,791,145
RAC: 0
Message 61038 - Posted: 6 May 2009, 19:09:48 UTC - in response to Message 61012.  
Last modified: 6 May 2009, 19:10:30 UTC

From my own testing on a system with exactly the same components, just changing the memory speed had no effect on R@H.

Tested on a Phenom 9850 with memory at 800 MHz and 667 MHz. There was no drop in the average credit granted between the two.

What does affect R@H pretty significantly, however, is the amount of L2/L3 cache your CPU has. My 9850 (RIP) only had 4 MB of L2/L3 cache total. I swapped it out for an Intel Q9450, which has a slower clock speed but a massive 12 MB of L2 cache (6 MB shared per pair of cores!), and the number of decoys I was able to generate per work unit shot up, resulting in a much higher RAC :)


Yes, it looks very clear when you compare a Q8200 vs. a Q9550 - 4 MB L2 vs. 12 MB L2...

Maybe RAM CL (CAS latency) also has some impact on speed.
WWW of Polish National Team - Join! Crunch! Win!
Gen_X_Accord
Joined: 5 Jun 06
Posts: 154
Credit: 279,018
RAC: 0
Message 61041 - Posted: 7 May 2009, 0:47:27 UTC

Can you help me with the FSB and memory speed, then? Here is what it looks like with everything in the BIOS set to auto. I have the memory in dual-channel configuration, but I get the same reading in single-channel configuration.

[CPU-Z screenshot]

As you can see, it is only running at 667 MHz. I read somewhere that this is because I am using 1066 MHz NVIDIA SLI-Ready EPP memory. Now I have set the DRAM frequency manually to 533 and left everything else on auto, including voltage. Here is what CPU-Z shows with that setting...

[CPU-Z screenshot]

I have set the memory speed to 400 MHz and the FSB:DRAM ratio was 1:2.

What is the best setting for even performance all around and Rosetta crunching?
I read somewhere that you want the FSB:DRAM ratio to be 1:1? If that is true, what do I change in the BIOS to get that?
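For reference, the FSB:DRAM value CPU-Z reports is just the reduced ratio of the base clock to the actual memory clock (CPU-Z's "DRAM Frequency", which is half the DDR2 data rate). A quick sketch of that arithmetic (the function name is illustrative, not anything from CPU-Z itself):

```python
from math import gcd

def fsb_dram_ratio(fsb_mhz, dram_clock_mhz):
    # Reduce the base-clock : memory-clock pair to lowest terms,
    # e.g. a 200 MHz base clock with a 400 MHz DRAM clock (DDR2-800) -> 1:2.
    g = gcd(fsb_mhz, dram_clock_mhz)
    return (fsb_mhz // g, dram_clock_mhz // g)

print(fsb_dram_ratio(200, 400))  # (1, 2) - DDR2-800 on a 200 MHz base clock
print(fsb_dram_ratio(200, 200))  # (1, 1) - memory clock matching the base clock
```

The 1:1 advice is usually heard about Intel front-side-bus systems, where running memory and FSB in lockstep avoids divider overhead; on an AMD chip with the memory controller on the die it should matter much less.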
mikey
Joined: 5 Jan 06
Posts: 1895
Credit: 9,183,794
RAC: 3,318
Message 61044 - Posted: 7 May 2009, 11:52:41 UTC - in response to Message 61041.  

Can you help me with the FSB and memory speed, then? Here is what it looks like with everything in the BIOS set to auto. I have the memory in dual-channel configuration, but I get the same reading in single-channel configuration.


That is because 667 is the "standard" and nothing above that is "approved and certified" yet. You will need to go into the manual settings and set it there. I found this on Tom's Hardware:
"the memory isn't "really" DDR2-1066 because the DDR2-1066 "standard" doesn't exist. All DDR2-1066 configures as DDR2-800 or DDR2-667 on all boards.

But the advice you've found here is correct, simply increase the voltage and memory speed in BIOS, that's only two settings that should get you where you want to be." http://www.tomshardware.com/forum/245626-30-ddr2-1066-memory-appears-ddr2 Be VERY careful when overclocking, though: if you go too far, BOOM!
dcdc
Joined: 3 Nov 05
Posts: 1832
Credit: 119,677,569
RAC: 10,479
Message 61107 - Posted: 11 May 2009, 12:42:57 UTC

no need to increase the memory voltage...
Chilean
Joined: 16 Oct 05
Posts: 711
Credit: 26,694,507
RAC: 0
Message 61112 - Posted: 11 May 2009, 18:07:26 UTC - in response to Message 61107.  

no need to increase the memory voltage...


this.

If your mem is 1066 out of the box, it shouldn't need a voltage increase.
Gen_X_Accord
Joined: 5 Jun 06
Posts: 154
Credit: 279,018
RAC: 0
Message 61123 - Posted: 12 May 2009, 6:53:42 UTC

That becomes the question of the day: is it 1066? Or is it 667 that has to be manually set to get it to run at 1066? Here is a link to the site and exactly what I purchased...

http://www.compusa.com/applications/SearchTools/item-details.asp?EdpNo=2701646&CatId=3980

I put the pics of CPU-Z up to show what it says the memory is when the BIOS is set to auto. I have also read somewhere that this is because it is EPP SLI-ready memory, and the only way to have it show as standard 1066 is to get non-EPP memory, like Kingston value RAM or something.
Anyway, I have found that the sweet spot seems to be at 400 MHz; with a 200 MHz processor bus and a 2800 MHz processor, a lot of the numbers and clock dividers come out even. And even when I did set it at 1066, I left the voltage alone; I just left that setting on auto. If the board can't automatically adjust for it, then it sucks and I'll tell Asus so too. 8-)
Chilean
Joined: 16 Oct 05
Posts: 711
Credit: 26,694,507
RAC: 0
Message 61149 - Posted: 13 May 2009, 1:42:53 UTC - in response to Message 61123.  

That becomes the question of the day: is it 1066? Or is it 667 that has to be manually set to get it to run at 1066? Here is a link to the site and exactly what I purchased...

http://www.compusa.com/applications/SearchTools/item-details.asp?EdpNo=2701646&CatId=3980

I put the pics of CPU-Z up to show what it says the memory is when the BIOS is set to auto. I have also read somewhere that this is because it is EPP SLI-ready memory, and the only way to have it show as standard 1066 is to get non-EPP memory, like Kingston value RAM or something.
Anyway, I have found that the sweet spot seems to be at 400 MHz; with a 200 MHz processor bus and a 2800 MHz processor, a lot of the numbers and clock dividers come out even. And even when I did set it at 1066, I left the voltage alone; I just left that setting on auto. If the board can't automatically adjust for it, then it sucks and I'll tell Asus so too. 8-)


If you want to squeeze out the performance you paid for, then you have to go into the BIOS and manually change the speed to 1066...

BTW, my AMD Athlon 64 X2 6400+ (my used-to-be pride) runs @ 3.2 GHz... I wonder why yours runs slower, even though its model number is higher.

And my Athlon benchmarks higher as well :S

Mine: https://boinc.bakerlab.org/rosetta/show_host_detail.php?hostid=1009549
Yours: https://boinc.bakerlab.org/rosetta/show_host_detail.php?hostid=1012775

Kinda odd...


Gen_X_Accord
Joined: 5 Jun 06
Posts: 154
Credit: 279,018
RAC: 0
Message 61153 - Posted: 13 May 2009, 6:07:21 UTC

That's because my processor is not an X2 6400 but an X2 7850, which is really a Phenom-series quad core with two of the cores disabled. At least that is what I read online somewhere.
Gen_X_Accord
Joined: 5 Jun 06
Posts: 154
Credit: 279,018
RAC: 0
Message 61156 - Posted: 13 May 2009, 9:08:15 UTC

I just did some research over at OCZ's website forums and found the answer to my questions about why 1066 memory might originally boot up and run at 667 or 800. And I am just one of many people who have been confused by the same issue. Bad craziness, this computer stuff is. <-- Yoda speak.
Chilean
Joined: 16 Oct 05
Posts: 711
Credit: 26,694,507
RAC: 0
Message 61175 - Posted: 13 May 2009, 23:05:00 UTC

Well, from my own experience, the last time I built a PC with 1066 memory (a Phenom X4), I had to manually tell it to run at that speed. If I set it on auto, it would go to 800, I believe.
Gen_X_Accord
Joined: 5 Jun 06
Posts: 154
Credit: 279,018
RAC: 0
Message 61186 - Posted: 14 May 2009, 7:21:52 UTC
Last modified: 14 May 2009, 7:23:33 UTC

Here's why you had to tell it to do that. Mine wants to run at 667 on auto, but it will run at 1066. Here are two good articles.

OCZ FAQ's

and

Why CPU-Z doesn’t match your memory’s advertised speeds
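In short (and presumably what that second article covers, going by its title), DDR means double data rate: data moves on both clock edges, so the advertised DDR2 rating is twice the actual memory clock CPU-Z reports. A one-line sketch:

```python
def advertised_ddr2_rating(cpuz_dram_freq_mhz):
    # DDR transfers data on both clock edges, so the marketing number
    # is double the real clock CPU-Z shows (e.g. 533 MHz -> DDR2-1066).
    return 2 * cpuz_dram_freq_mhz

print(advertised_ddr2_rating(533))  # 1066
print(advertised_ddr2_rating(333))  # 666, marketed as DDR2-667
```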




©2024 University of Washington
https://www.bakerlab.org