I honestly believe that if R@H harnesses GPUs

Chilean
Joined: 16 Oct 05 · Posts: 711 · Credit: 26,694,507 · RAC: 0
Message 56267 - Posted: 7 Oct 2008, 5:07:38 UTC

We would be WAY over the 150 TFLOPS mark.
robertmiles
Joined: 16 Jun 08 · Posts: 1233 · Credit: 14,284,221 · RAC: 1,121
Message 56326 - Posted: 11 Oct 2008, 2:11:32 UTC

Unfortunately, that would take significantly more effort from the Rosetta@home programmers to maintain. There's already a new version of BOINC that allows use of the GPU, but very few BOINC projects are likely to be ready to make use of it soon.
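
For context, detecting the GPU itself is the easy part: the standard CUDA runtime exposes it directly, and a GPU-aware client does something along these lines. A minimal sketch using only public CUDA API calls (an illustration, not BOINC's actual detection code):

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    // Ask the CUDA runtime how many usable devices are present.
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("no CUDA-capable GPU found\n");
        return 1;
    }
    // Report each card, roughly the information a client would log.
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("GPU %d: %s, %d multiprocessors, %.0f MB global memory\n",
               i, prop.name, prop.multiProcessorCount,
               prop.totalGlobalMem / (1024.0 * 1024.0));
    }
    return 0;
}

The hard part, as the next post explains, is not finding the GPU but maintaining a second, specialized code path for it.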
Mike Tyka
Joined: 20 Oct 05 · Posts: 96 · Credit: 2,190 · RAC: 0
Message 56338 - Posted: 11 Oct 2008, 19:02:13 UTC - in response to Message 56326.  

Unfortunately, that would take significantly more effort from the Rosetta@home programmers to maintain. There's already a new version of BOINC that allows use of the GPU, but very few BOINC projects are likely to be ready to make use of it soon.


Projects like Folding@home change their code rarely, mostly changing only parameters and input files. That means writing highly specialised code, such as for a GPU, makes a lot of sense for them.

In contrast, we modify the code constantly (as well as the parameters and input files), so writing highly specialized code is a bit of a headache for us: keeping track of changes across multiple versions of the same code (remember, Rosetta already has to run on many platforms, never mind different CPU architectures) is just not viable for us right now. Maybe one day, when we can "freeze" a given section of the code and just want to crunch in bulk, we'll embark on porting the code.

I know it's sad, and it seems like a waste of CPU, but practically it's tricky.

Mike

http://beautifulproteins.blogspot.com/
http://www.miketyka.com/
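
To make the maintenance problem Mike describes concrete: picture a single scoring term that has to live in both a CPU and a GPU code path. Below is a toy CUDA sketch built around a simplified Lennard-Jones pair term (an illustration only; Rosetta's actual energy function is far more complex). The formula itself can be shared, but the loop and reduction structure cannot, so every algorithmic change to one path has to be re-implemented and re-validated in the other.

#include <cstdio>
#include <cuda_runtime.h>

// Toy pair score (NOT Rosetta's energy function): a bare Lennard-Jones
// term as a function of squared distance. __host__ __device__ lets the
// formula itself be shared between the two paths.
__host__ __device__ float lj_pair(float r2) {
    float inv_r6 = 1.0f / (r2 * r2 * r2);
    return 4.0f * (inv_r6 * inv_r6 - inv_r6);
}

// CPU path: a plain O(N^2) double loop over unique pairs.
float score_cpu(const float3* pos, int n) {
    float total = 0.0f;
    for (int i = 0; i < n; ++i)
        for (int j = i + 1; j < n; ++j) {
            float dx = pos[i].x - pos[j].x;
            float dy = pos[i].y - pos[j].y;
            float dz = pos[i].z - pos[j].z;
            total += lj_pair(dx * dx + dy * dy + dz * dz);
        }
    return total;
}

// GPU path: one thread per atom i, each writing a partial sum that the
// host adds up afterwards. Same physics, completely different code --
// this is the second implementation that has to track every change.
__global__ void score_gpu(const float3* pos, int n, float* partial) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float s = 0.0f;
    for (int j = i + 1; j < n; ++j) {
        float dx = pos[i].x - pos[j].x;
        float dy = pos[i].y - pos[j].y;
        float dz = pos[i].z - pos[j].z;
        s += lj_pair(dx * dx + dy * dy + dz * dz);
    }
    partial[i] = s;
}

int main() {
    const int n = 256;
    static float3 h_pos[n];
    for (int i = 0; i < n; ++i)  // arbitrary, well-separated test points
        h_pos[i] = make_float3(i * 0.9f, 0.0f, 0.0f);

    float cpu = score_cpu(h_pos, n);

    float3* d_pos;
    float* d_partial;
    cudaMalloc(&d_pos, n * sizeof(float3));
    cudaMalloc(&d_partial, n * sizeof(float));
    cudaMemcpy(d_pos, h_pos, n * sizeof(float3), cudaMemcpyHostToDevice);
    score_gpu<<<(n + 127) / 128, 128>>>(d_pos, n, d_partial);

    static float h_partial[n];
    float gpu = 0.0f;
    cudaMemcpy(h_partial, d_partial, n * sizeof(float), cudaMemcpyDeviceToHost);
    for (int i = 0; i < n; ++i) gpu += h_partial[i];

    printf("cpu score = %f, gpu score = %f\n", cpu, gpu);
    cudaFree(d_pos);
    cudaFree(d_partial);
    return 0;
}

Multiply that duplication by every score term, every platform, and every code change, and the maintenance cost becomes clear.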
rochester new york
Joined: 2 Jul 06 · Posts: 2842 · Credit: 2,020,043 · RAC: 0
Message 56448 - Posted: 23 Oct 2008, 3:48:06 UTC

http://www.isgtw.org/?pid=1001391
Chilean
Joined: 16 Oct 05 · Posts: 711 · Credit: 26,694,507 · RAC: 0
Message 56453 - Posted: 24 Oct 2008, 1:23:38 UTC - in response to Message 56448.  

http://www.isgtw.org/?pid=1001391


Well... My GPU can run Crysis on Gamer settings...
So I would guess it could crunch proteins like a hungry mad dog.
]{LiK`RangerS`
Joined: 27 Oct 08 · Posts: 39 · Credit: 6,552,652 · RAC: 0
Message 56507 - Posted: 29 Oct 2008, 8:29:15 UTC - in response to Message 56453.  

I was planning on getting 2 Radeon 2 GB graphics cards ... CHILI
The_Bad_Penguin
Joined: 5 Jun 06 · Posts: 2751 · Credit: 4,271,025 · RAC: 0
Message 56519 - Posted: 29 Oct 2008, 22:13:42 UTC

Dang, that's sweet!

I decided to go "cheap": I got some mobos with four PCIe x16 slots, plus 4 nVidia 9600 GTs and 3 nVidia 9600 GSOs (I'll probably pick up a 4th somewhere along the line).

I'm trying to bring this "farm" online sometime within the next few weeks.

IMHO, ATI GPUs are currently "better" than nVidia's, but DC projects (other than F@H) seem to be supporting nVidia (e.g., GPUGRID).

I have one ATI 4850, and I might get two more at some point; I guess the only place to throw them is F@H.
