[Beowulf] nVidia revealed as evil
deadline at eadline.org
Thu Jan 4 13:02:40 PST 2018
My first response was to chuckle. As you know, the entire concept of
Beowulf clusters was based largely on using hardware in ways it was
not supposed to be used. "You can't use desktop x86 and Ethernet
for supercomputing!" (as it was called at the time)
We also know Intel decided to fuse off features on its processors so
it could reduce capability on the desktop parts and thereby charge
more for "server processors." This fact has been the basis
for my construction of high-performance desk-side clusters: same
basic guts, slower memory, but much cheaper, and in many
cases similar performance.
NVIDIA did the same with the SP-to-DP ratio and with ECC, but
deep learning (DL) has no need for DP (or even SP), so a $700
video card is a bargain for DL-type work.
BTW, I find it interesting that one of the most popular codes run
on NVIDIA GPUs is Amber (MD). It has been optimized to use
SP where it can, and many Amber users turn off ECC because it
slows down the GPU. The upshot is that Amber runs really well on
NVIDIA video cards.
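For what it's worth, on cards that actually expose ECC (Tesla/Quadro class; consumer GeForce boards like the 1080 Ti don't have it), the usual way to turn it off is via nvidia-smi. A sketch, assuming GPU index 0 and root access:

```shell
# Check the current and pending ECC mode on GPU 0
# (only meaningful on an ECC-capable card)
nvidia-smi -i 0 -q | grep -A 2 "Ecc Mode"

# Disable ECC on GPU 0; the change takes effect after the next reboot
sudo nvidia-smi -i 0 -e 0
```

Disabling ECC frees the memory reserved for check bits and removes the bandwidth overhead, which is why the Amber crowd does it.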
I'm actually building a dual 1080 Ti box to run Amber for one
of my customers this month. Fortunately it won't go in
a data center but in the lab. So when the NVIDIA police
investigate, the students can quickly switch over to GTA.
> Of course you cannot use our less expensive hardware for whatever you
> want! Because it includes proprietary software, we can ex-post-facto
> forbid you from using the thing you paid for any way you want.
> Looks like Stallman was right all along.
> Beowulf mailing list, Beowulf at beowulf.org sponsored by Penguin Computing