Could GPUs mean the end of DC as we know it?

FoldingSolutions
Joined: 2 Apr 06
Posts: 129
Credit: 3,506,690
RAC: 0
Message 42500 - Posted: 23 Jun 2007, 10:00:16 UTC

Most modern GPUs break the 100 GFLOPS mark while most modern CPUs barely reach 10 GFLOPS, and there is more and more talk of GPGPUs, which could effectively eliminate the only strength CPUs have left: the ability to run a wide range of code. Could machines with hundreds of GFLOPS of performance dissuade people with mere CPUs from contributing to DC projects, because they feel their contribution is now so tiny that they may as well not bother? That would, in a way, defeat the point of distributed computing, which is that everyone can get involved in science. I know this might not be the case for another year or so, but it may become an issue. What do you think?
ID: 42500 · Rating: 1

Greg_BE
Joined: 30 May 06
Posts: 5691
Credit: 5,859,226
RAC: 0
Message 42502 - Posted: 23 Jun 2007, 10:49:53 UTC - in response to Message 42500.  

I'll say this: if GPUs become the hot thing and affordable, you might find a few of us who just bought CPU machines a bit discouraged if the mega-crunchers take over. Personally, I'll just crunch what I can until I can afford a new machine. I am waiting for the Cell and GPU processors to sort themselves out, as well as the multicore units. After the dust settles, I can make a choice about where to spend my money and continue with ultra-high-speed crunching.


ID: 42502 · Rating: 0

soriak
Joined: 25 Oct 05
Posts: 102
Credit: 137,632
RAC: 0
Message 42503 - Posted: 23 Jun 2007, 10:50:15 UTC - in response to Message 42500.  
Last modified: 23 Jun 2007, 10:51:29 UTC

If we're really going to see that kind of performance for general tasks, I think it'd be a very good idea to upgrade to a GPGPU system regardless of DC ;)

We'd quickly see clients that support this new kind of computing and take advantage of the full speed, and with that a lot of new science would be possible. What now takes a year could be done in a month, and that's just with the GPU speed we have today... pretty cool ;)

It's important to keep in mind, though, that even if it's 10x as fast, there'll initially still be WAY more people with CPU machines. So the biggest combined contribution will still come from them.

Of course, I don't think there's such a thing as enough (much less too much) computing power. Make it available, and someone will find a way to push it to its limits.
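To put that point in rough numbers, using the GFLOPS figures from the opening post (the host counts below are purely illustrative, not project statistics):

$$100{,}000 \text{ CPU hosts} \times 10\ \text{GFLOPS} = 1\ \text{PFLOPS}, \qquad 1{,}000 \text{ GPU hosts} \times 100\ \text{GFLOPS} = 0.1\ \text{PFLOPS},$$

so a device that is ten times faster per host only dominates the combined total once something like a tenth as many people are running one.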
ID: 42503 · Rating: 0

Tom Philippart
Joined: 29 May 06
Posts: 183
Credit: 834,667
RAC: 0
Message 42506 - Posted: 23 Jun 2007, 12:20:33 UTC - in response to Message 42503.  

It can never be the end of DC as we know it, since GPUs are only good in a limited number of fields (like graphics), and the same goes for PS3s.

The CPU is the only component able to do "nearly" everything. GPUs and PS3s have advantages in some fields, but disadvantages in a lot of others.

The guys at Folding@home said that they can only run one type of simulation on GPUs and another on PS3s, but the majority of tasks still need CPUs.
ID: 42506 · Rating: 0

FoldingSolutions
Joined: 2 Apr 06
Posts: 129
Credit: 3,506,690
RAC: 0
Message 42509 - Posted: 23 Jun 2007, 13:03:04 UTC - in response to Message 42506.  

Maybe when the Intel Terascale project comes to fruition and becomes x86 or x86_64, CPUs may just hold their ground. This brings up another issue: there's so much talk about Intel and their super processing projects, but what is AMD doing? Their only apparent footing in the field of high-performance computing is through their acquisition of GPU maker ATI.
ID: 42509 · Rating: 0

soriak
Joined: 25 Oct 05
Posts: 102
Credit: 137,632
RAC: 0
Message 42517 - Posted: 23 Jun 2007, 15:45:36 UTC

@Tom

GPGPU means general-purpose computing on GPUs, i.e. they can do what CPUs do now, just a whole lot faster. We're not quite there yet, but from what I've been reading we're getting there. That would definitely make CPUs obsolete.
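As a concrete illustration of the idea, here is a minimal sketch of "CPU-style" work written for a GPU in NVIDIA's CUDA C (the toolkit behind the Tesla boards linked later in this thread). The SAXPY kernel, array sizes and names are made up for illustration and are not code from Rosetta@home or any other DC project.

```cuda
// saxpy.cu -- illustrative only, not code from any DC project.
// The serial CPU loop "for each i: y[i] = a*x[i] + y[i]" becomes
// one lightweight GPU thread per array element.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // global thread index
    if (i < n)
        y[i] = a * x[i] + y[i];                      // each thread handles one element
}

int main() {
    const int n = 1 << 20;                           // about a million elements
    const size_t bytes = n * sizeof(float);

    // Fill host arrays with known values.
    float *hx = new float[n], *hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copy the data into the graphics card's memory.
    float *dx = 0, *dy = 0;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch ~4096 blocks of 256 threads: thousands of "loop iterations" run at once.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %.1f (expected 5.0)\n", hy[0]);

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
```

The catch, as noted above in the thread, is that only workloads which decompose this cleanly into thousands of independent, identical operations see the advertised GFLOPS.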
ID: 42517 · Rating: 0

Greg_BE
Joined: 30 May 06
Posts: 5691
Credit: 5,859,226
RAC: 0
Message 42522 - Posted: 23 Jun 2007, 17:22:21 UTC

The Cell processor and other advances will make the CPU extinct, just like the old faithful 8-inch floppy from long ago. GPGPU will be a very new and exciting thing to watch: how fast the technology advances, and how fast the price drops after it comes to market.
ID: 42522 · Rating: 0

FoldingSolutions
Joined: 2 Apr 06
Posts: 129
Credit: 3,506,690
RAC: 0
Message 42525 - Posted: 23 Jun 2007, 18:21:14 UTC

Check this out... http://www.nvidia.com/object/tesla_computing_solutions.html
ID: 42525 · Rating: 0

Tom Philippart
Joined: 29 May 06
Posts: 183
Credit: 834,667
RAC: 0
Message 42528 - Posted: 23 Jun 2007, 19:09:54 UTC - in response to Message 42517.  

thanks for the info!
ID: 42528 · Rating: 0

Greg_BE
Joined: 30 May 06
Posts: 5691
Credit: 5,859,226
RAC: 0
Message 42533 - Posted: 23 Jun 2007, 19:36:55 UTC - in response to Message 42525.  

That's impressive. If it computed the picture on the right, that's something!
ID: 42533 · Rating: 0

FoldingSolutions
Joined: 2 Apr 06
Posts: 129
Credit: 3,506,690
RAC: 0
Message 42536 - Posted: 23 Jun 2007, 20:35:43 UTC - in response to Message 42533.  

500+ GFLOPS is impressive, and that's for a single GPU; scale up to 10 or more in a dedicated rack and the possibilities are, well, multiples of 500+ GFLOPS. ATI needs to come up with something better than relying on F@H to do everything for them.
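Taking those round numbers at face value, the arithmetic for such a rack is roughly

$$10 \times 500\ \text{GFLOPS} = 5\ \text{TFLOPS}$$

of single-precision peak, before any losses from keeping the cards fed with data.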
ID: 42536 · Rating: 0

dcdc
Joined: 3 Nov 05
Posts: 1832
Credit: 119,821,902
RAC: 15,180
Message 42543 - Posted: 23 Jun 2007, 23:28:42 UTC

I think DC has a long future ahead. I'm sure the guys here could make use of 100x the computational power, and if they couldn't, there are plenty more projects that will come to life over the next few years. DC is still in its infancy; there's so much scope. Can you imagine if every item from car engines to vacuum cleaners were evolved using DC during the design stages to improve efficiency? Pretty much anything can be analysed by brute-force DC while scientists and designers do the more elegant work in the meantime.
ID: 42543 · Rating: 0

Greg_BE
Joined: 30 May 06
Posts: 5691
Credit: 5,859,226
RAC: 0
Message 42549 - Posted: 24 Jun 2007, 7:40:09 UTC - in response to Message 42543.  

I think you're right on this: DC can take advantage of more powerful home systems to get faster and more detailed results. What's needed is people who can split a project into multiple pieces, send it out over a DC network, and then recombine all the data into one big picture. Think of it as taking a Cray and splitting it into a thousand pieces, each piece a mini Cray itself, then sending out a wind-tunnel simulation or some other project, letting these systems work on it, and getting the results back. I bet that in the same amount of time you can get more results back on a DC grid than if you ran it on a Cray or some other supercomputer alone. So you're probably right, DC is still being figured out for various applications. It will be interesting to see where it goes.

ID: 42549 · Rating: 0

FluffyChicken
Joined: 1 Nov 05
Posts: 1260
Credit: 369,635
RAC: 0
Message 42556 - Posted: 24 Jun 2007, 14:15:43 UTC

Nah, DC (home-computer DC) will not go away; I think it will be used more and more. It has never been about the CPU, but about the willingness of people to do it on their computers. What does the work at our end doesn't really matter.

We will have these newfangled GPUs, processors, etc. in our machines if they are useful to us; Dell will make sure of that even if they are not useful, as long as they can sell them.

But you hit on something in your sentence: GPUs cannot do everything, while the CPUs we have now can do a lot more. So CPUs are still needed. Folding@home can only do some parts on the GPU, and GPUs are at least still flexible, otherwise we'd all be buying long-established dedicated hardware for FFTs or something.

It is the power of these processors that makes you think the projects will keep it in-house... will they, or will they just try much more complex things on them?

Doing it in-house also wouldn't give them the advantage of widespread ideas, advertisement, etc., of which a project like Rosetta@home must surely have got far more than it could have imagined.
Team mauisun.org
ID: 42556 · Rating: 0

David Emigh
Joined: 13 Mar 06
Posts: 158
Credit: 417,178
RAC: 0
Message 42590 - Posted: 24 Jun 2007, 23:13:57 UTC - in response to Message 42509.  

This brings up another issue: there's so much talk about Intel and their super processing projects, but what is AMD doing? Their only apparent footing in the field of high-performance computing is through their acquisition of GPU maker ATI.


AMD's answer:

http://www.amd.com/us-en/Corporate/VirtualPressRoom/0,,51_104_543~117412,00.html?redir=dtqc03

The link is only to a press release, but it's better than nothing.
Rosie, Rosie, she's our gal,
If she can't do it, no one shall!
ID: 42590 · Rating: 0

AgnosticPope
Joined: 16 Dec 05
Posts: 18
Credit: 148,821
RAC: 0
Message 42594 - Posted: 25 Jun 2007, 2:40:34 UTC
Last modified: 25 Jun 2007, 2:46:38 UTC

About a decade ago, CPUs were supposedly on the road to extinction because RISC processors were so much better. Apple went with a RISC design for its Mac. Then, lo and behold, a decade later Apple went back to Intel and the "classic" CPU machine.

So the GPU folks have developed a technique for doing 10x the floating-point calculations? Well, my guess is that the same technique could be incorporated back into a CPU fairly easily, in a generation or two at most.

The market still demands CPUs because the market wants to buy a single machine that runs "everything" in one box. Yes, there will be perturbations in the market from time to time, as the RISC and GPU break-outs clearly demonstrate. But I think we are a long way away from counting the CPU out!

Oh, and then there is THIS:
ENGINEERS from AMD and ATI are going to start working on a unified chip that will have the GPU and CPU on the same silicon. We learned this from high-ranking sources close to the companies, more than once.
Don't get too excited, as it will take at least eighteen months to see such a dream come true.

This is the ultimate OEM chip, as it will be the cheapest way to have the memory controller, chipset, graphics function and CPU on a single chip. This will be the ultimate integration, as it will decrease the cost of the platform and make even cheaper PCs possible.

CPUs are being shrunk to a 65 nanometre process as we speak, and the graphics guys are expected to migrate to this process next year. The graphics firms are still playing with 80 nanometre but will ultimately go to 65 nanometre later next year.

DAAMIT engineers will be looking to shift to 65 nanometre, if not even to 45 nanometre, to make such a complex chip as a CPU/GPU possible.

We still don't know whether they are going to put a CPU on a GPU or a GPU on a CPU, but either way will give you the same product.


See what I mean?

== Bill
ID: 42594 · Rating: 0

Paydirt
Joined: 10 Aug 06
Posts: 127
Credit: 960,607
RAC: 0
Message 42604 - Posted: 25 Jun 2007, 14:16:13 UTC

I agree that it is not the end, only the beginning, for DC science. Yeah, we might see a day when F@H & R@H have more computing power than they need (which would be GREAT). How much will ClimatePrediction need before it is "full"?

Also, I think you will see DC solutions being used for a wider variety of issues/problems. AI, space travel, psychology, sociology, RNA medicine, etc.

Maybe computing power will become very cheap? That would be cool.


JUST NOW we have kids growing up with Google, the Internet, etc... They will have a different perspective on things and they will have new ideas on how we can harness it all for the betterment of mankind.
ID: 42604 · Rating: 0

Greg_BE
Joined: 30 May 06
Posts: 5691
Credit: 5,859,226
RAC: 0
Message 42613 - Posted: 25 Jun 2007, 18:34:46 UTC

I wouldn't think there is such a thing as TOO MUCH computing power for science projects like this. There's always something they can tweak or build brand new that will take advantage of the new computing technology.

They might be able to work on a whole set of related molecules instead of just one at a time, for instance. Or do high-resolution searches in greater detail and send back results that can be viewed as they really are, instead of the strands we are looking at now. Or any number of other things...

Two or three work units at a time instead of one, maybe? Who knows...
ID: 42613 · Rating: 0

Paydirt
Joined: 10 Aug 06
Posts: 127
Credit: 960,607
RAC: 0
Message 42618 - Posted: 25 Jun 2007, 19:24:49 UTC

They could have "enough" computing power once those pesky humans in the science department at Washington aren't able to come up with enough new ideas to run computations on. I guess there are always proteins whose 3D structures need figuring out, and interaction simulations for designer proteins... Eventually, though, if they had a TON of computing power, they'd run out of stuff to do :)
ID: 42618 · Rating: 0

FoldingSolutions
Joined: 2 Apr 06
Posts: 129
Credit: 3,506,690
RAC: 0
Message 42619 - Posted: 25 Jun 2007, 19:36:50 UTC - in response to Message 42613.  

I agree, there are always new ways to make use of increased computing power. SETI@home made use of this and developed their "enhanced" app, which looks more closely at the data they already have. The CPU/GPU chip also looks very exciting, as do the upcoming Penryn chips from Intel and the Phenom chips from AMD.

With literally hundreds of millions of computers (PCs and corporate machines) now connected to the internet or in some way able to be utilized for DC, the problem at the moment is getting enough people interested and "in the know" about it, and getting the people who run the DC projects to write more flexible code that can run on many types of chip.

With the enhancement of GPUs, I agree with AgnosticPope that the technologies developed for them may find their way onto a CPU. And there is always the possibility that the power of the GPU has been massively overestimated: GPUs only perform single-precision floating-point operations, while a CPU does double precision, so a GPU has a lot less "real" computing power than the large numbers boast.

I think the points raised here are excellent and provide a very balanced view of what GPUs mean for DC, and of how they're making the CPU developers put their thinking caps on and even join forces (ATI & AMD). Keep the ideas coming ;)
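For reference, the single-versus-double precision gap mentioned above can be quantified by the relative rounding error (unit roundoff) of each format:

$$u_{\text{single}} = 2^{-24} \approx 6\times 10^{-8}\ (\approx 7\ \text{decimal digits}), \qquad u_{\text{double}} = 2^{-53} \approx 1.1\times 10^{-16}\ (\approx 16\ \text{decimal digits}),$$

so a single-precision GFLOPS figure is not directly comparable to a double-precision one for work that needs the extra digits.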
ID: 42619 · Rating: 0


