using graphics cards as generic FLOP crunchers

Eugene.Leitl at lrz.uni-muenchen.de
Mon Mar 19 13:09:09 PST 2001


'Bryce' wrote:
 
>  Bx notes the beowulf geeks are getting seriously freaky; they're searching for a way to use the GPUs on high-
> end video cards to contribute to the processing power of the main FPU

This is not the first time the topic has come up. Last time we decided it 
wasn't worthwhile, iirc.

> <kx> wx, er, it is quite insane.  they can do the calculations but there's no way to get the results out.

It depends on the ratio of computation to the result data that has to be moved. 
Clearly, 64 MBytes of VRAM are usable for ANNs, which do require extensive 
matrix multiplications (at least the canonical kind does), whereas the 
traffic to and from the input and output layers is comparatively negligible.
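
To put rough numbers on that: for one fully connected layer the weight matrix
can stay resident in VRAM, and only the input and output activations have to
cross the bus. A back-of-the-envelope sketch in C (purely illustrative, the
layer sizes are made-up examples):

#include <stdio.h>

/* Rough arithmetic-intensity estimate for one fully connected ANN layer.
 * Assumption: the weight matrix is resident in VRAM, so only the input
 * and output activations travel over the bus.  Sizes are arbitrary. */
int main(void)
{
    const double n_in  = 4096.0;                 /* input neurons          */
    const double n_out = 4096.0;                 /* output neurons         */
    const double bytes_per_value = 4.0;          /* 32-bit activations     */

    double flops   = 2.0 * n_in * n_out;         /* one multiply + one add per weight */
    double traffic = (n_in + n_out) * bytes_per_value;

    printf("FLOPs           : %.0f\n", flops);
    printf("bus traffic (B) : %.0f\n", traffic);
    printf("FLOPs per byte  : %.1f\n", flops / traffic);
    return 0;
}

The FLOPs-per-byte ratio grows with the layer width, which is why the traffic
for the input and output layers looks negligible next to the matrix math.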

> <sx> kx: Suddenly you get something that looks suspiciously like a vector multiply.

GeForce3 does do vector addition and vector multiply, amongst other things. 
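
For what it's worth, the primitive in question is just a component-wise
multiply/add over 4-vectors. A plain C stand-in (not an actual GeForce3
programming interface, just to show the shape of the operation):

#include <stdio.h>

/* Component-wise 4-vector multiply and add, the kind of per-vertex
 * primitive the discussion is about.  Plain C illustration only. */
typedef struct { float x, y, z, w; } vec4;

static vec4 vmul(vec4 a, vec4 b)
{
    return (vec4){ a.x * b.x, a.y * b.y, a.z * b.z, a.w * b.w };
}

static vec4 vadd(vec4 a, vec4 b)
{
    return (vec4){ a.x + b.x, a.y + b.y, a.z + b.z, a.w + b.w };
}

int main(void)
{
    vec4 a = { 1.0f, 2.0f, 3.0f, 4.0f };
    vec4 b = { 0.5f, 0.5f, 2.0f, 1.0f };
    vec4 c = vadd(vmul(a, b), b);    /* multiply-add across all four lanes */
    printf("%g %g %g %g\n", c.x, c.y, c.z, c.w);
    return 0;
}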

> <kx> sx, still, I suspect that using a faster CPU will be easier and cheaper

The problem is the overhead of dealing with the nooks and crannies of horribly 
misused 3D accelerators. Also, the hardware goes stale awfully quickly, 
so your investment would seem to have a very short half-life.



