[Beowulf] visualization machine
Andrew Robbie (GMail)
andrew.robbie at gmail.com
Sun Mar 30 07:17:41 PDT 2008
On Thu, Mar 27, 2008 at 9:41 PM, Ricardo Reis <rreis at aero.ist.utl.pt> wrote:
> Hi all
> I beg to take advantage of your experience, although the topic isn't
> completely a cluster thing. I got some money to buy a new machine, at least
> 8GB, and I'm deciding between 2 x dual-core or 1 x quad (or even 2 x
> quads). It must be one machine because it must (urgh) be able to use its
> 8GB in serial codes (don't ask).
Just be aware that most of the machines designed to be number crunchers
have shortcomings in board layout or bus design that make them suck for
visualization. Not that many boards will be happy with 8GB, for starters:
so few machines are actually ever populated with big DIMMs that you almost
always hit issues. So you end up going for machines with lots of RAM slots,
ECC support etc., which is all good. These are almost always at least dual
socket. But many of those motherboards aren't designed to take a 16x PCIe
graphics card and only have PCIe 8x buses. Also, graphics cards have an
extra retaining lug which extends further than the PCIe slot; on server
motherboards this spot is commonly blocked by some capacitor.
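As a quick sanity check on Linux, `lspci -vv` reports both the link width a
PCIe device is capable of (LnkCap) and the width it actually negotiated
(LnkSta), so a x16 card stuck in an x8 slot shows up immediately. A minimal
sketch; the sample text below is hypothetical output, since real output
varies per machine (on real hardware, pipe `sudo lspci -vv` in instead):

```shell
# Sketch: extract the negotiated PCIe link width from lspci-style output.
# Hardcoded sample for a x16-capable card that only negotiated x8.
sample='LnkCap: Port #0, Speed 2.5GT/s, Width x16, ASPM L0s
LnkSta: Speed 2.5GT/s, Width x8, TrErr- Train-'

# Pull the width number off the LnkSta (negotiated status) line.
width=$(printf '%s\n' "$sample" | sed -n 's/.*LnkSta:.*Width x\([0-9]*\).*/\1/p')
echo "negotiated link width: x$width"
```

If LnkSta reports a smaller width than LnkCap, the slot (or the BIOS) is
limiting the card, which is exactly the 8x-bus problem above.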
High end graphics cards always take up two slots and require additional
power connectors. On the Quadro 5600 and other cards this connector enters
from the top, not the end of the card, making it impossible to fit them in
a 3U case. Oh yeah -- don't think about putting one of these under your
desk unless you want to wear earmuffs in the office.
> Anyway, I've been experiencing with
> paraview for parallel visualization and was wondering on your opinion
> on... buying an ultra-duper-cool state-of-the-art graphics card (Nvidia) or
> 2 graphic cards?
Depends -- is performance critical *now*? If so, buy the fastest Quadro. If
instead you want to maximize performance over time, just upgrade the
graphics card every six months or so, staying at the sweet spot on the
price/performance curve. Quadros are the first, premium-priced parts out of
the fab; the same chips, with slightly slower/cheaper memory, become mass
market later.
Don't bother with SLI; you won't notice any speedup unless you invest lots
of tuning time. And since your viz app is 3rd party, probably no speedup at
all.
ATI vs nVidia: ATI drivers really, really suck. nVidia drivers are
generally fine unless you are on the bleeding edge (eg a brand new part) or
a corner case (eg quad-buffered stereo on a 2.2 kernel but with recent
hardware & drivers). nVidia developer support sucks too unless you are a
major game author or, eg, Industrial Light & Magic. ATI developer support
is non-existent under Linux; under Windows I'm told they can be ok about
fixing bugs.
Andrew (flight simulation geek)
> thanks for your time,
> Ricardo Reis
> 'Non Serviam'
> PhD student @ Lasef
> Computational Fluid Dynamics, High Performance Computing, Turbulence
> Cultural Instigator @ Rádio Zero
> Beowulf mailing list, Beowulf at beowulf.org