[Beowulf] GP-GPU experience
hahn at mcmaster.ca
Mon Apr 4 15:20:08 PDT 2011
>>>> well, for your application, which is quite narrow.
>>> Which is about any relevant domain where massive computation takes place.
>> you are given to hyperbole. the massive domains I'm thinking of
>> are cosmology and explicit quantum condensed-matter calculations.
>> the experts in those fields I talk to both do use massive computation
>> and do not expect much benefit from GPUs.
> Even the field you give as an example, quantum mechanics:
> the vast majority of quantum-mechanics calculations are massive matrix computations.
yes, specifically very large sparse eigensystems. do you have an example
of effectively using GPUs for this?
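for context, the kind of workload in question looks like the sketch below: a Lanczos solve on a large sparse operator, where nearly all the time goes into repeated sparse matrix-vector products. this is an illustrative stand-in (a small 1-D Laplacian, not anyone's actual Hamiltonian), not code from either poster.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

# Small 1-D Laplacian as a stand-in for the large sparse operators
# that arise in condensed-matter eigenproblems.
n = 1000
H = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1],
          shape=(n, n), format="csr")

# Lanczos iteration for a few of the smallest eigenvalues; the dominant
# cost is the repeated sparse matvec, a memory-bandwidth-bound kernel,
# which is why it is not an automatic win on GPUs.
vals, vecs = eigsh(H, k=4, which="SM")
```

at production scale (thousands of cores, >4 GB/core) the same algorithm is distributed over MPI ranks, and the matvec becomes communication-bound as well.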
> Furthermore, I didn't look at the field you're speaking about.
> I did, however, look at one other quantum-mechanics calculation,
> where someone used one core of his quad-core box and massive RAM.
sorry, I'm talking thousands of cores, ideally with > 4GB/core.
> It took me one afternoon to explain to the guy how to trivially use all
> 4 cores for that calculation, sharing the same RAM buffer.
the point is that lots of serious science uses MPI already,
and doesn't care much about GPUs. if they were free, sure,
they might be interesting.
> My attempt to write a sieve directly on the GPU, so that everything runs
> inside the GPU, is in a different league, sir, from what you are
> talking about.
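for readers unfamiliar with the workload: a sieve marks composites in a large boolean array, and the marking step is data-parallel, which is what makes it a plausible GPU kernel. the sketch below is a plain CPU sieve of Eratosthenes for illustration only; it is not the poster's GPU code.

```python
import numpy as np

def sieve(limit):
    """Sieve of Eratosthenes; return all primes <= limit."""
    is_prime = np.ones(limit + 1, dtype=bool)
    is_prime[:2] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # The marking step is a vectorized strided write -- the part
            # a GPU port would express as a data-parallel kernel.
            is_prime[p * p :: p] = False
    return np.flatnonzero(is_prime)

primes = sieve(100)
```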
bully for you. your application is a niche.
> Your kind of talk is: "there are no tanks in the city; we will drive all
> tanks out of the city, so that only our CPUs are left again."
nonsense. I'm saying that GPUs are a nice, specialized accelerator.
you can't have them without hosts, so you need to compare host vs host+GPU.
> Those days are over. Just get creative and find a way to do it on a GPU.
don't be silly. GPUs have weaknesses as well as strengths. packaging
and system design are among the minor sticking points with GPUs.