Dolphin Wulfkit

Tim Wait TIMOTHY.R.WAIT at saic.com
Tue May 7 06:53:31 PDT 2002


> Let's not turn this one into a flamewar, ok ? The original poster
> (Kevin Van Workum) asked a legitimate question and I think he got his
> answer (so basically this stuff is now OT).

I doubt Kevin got anything out of the usual Myrinet/SCI pissing match.

As the owner of a 40 node SCI system, and an 8 node Myrinet system,
I'll attempt to put forth a few observations.

The SCI interconnect is very nice, once you have it working.
The cables are a bit of a pain, and you'll have raw fingers
once you're done. We had a bad cable for about a year which
wreaked havoc with the system, causing sporadic and inconsistent
failures. The only way to isolate hardware problems is to
physically partition the system, which is time consuming and
somewhat frustrating. The Scali software is nice, but I really
only use ScaMPI, being more of a traditionalist ;)
The SCI drivers are publicly available now, so you can compile
them into your own kernel, just like mpich-gm. There used to be
a restriction to 2.2 kernels, but after upgrading my
machines to Red Hat 7.2, Scali put together an SSP2.x package for me
that would work (we couldn't afford the upgrade to SSP3.x, which
handles 2.4 with no problem). Support from Scali is very good.
Support from Dolphin, however, seems to be limited to sending
replacement parts; it's up to you to do the troubleshooting.
Since we resolved our cable issue, I have had no problems.

As for performance, I'll simply say that our very tightly coupled
atmospheric code runs on these old Tbird 1.1/PC133 boxes at about
the same speed as on an O2k.
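
If you want an apples-to-apples look at the interconnect itself rather
than at a whole application, a ping-pong microbenchmark is the usual
trick; since ScaMPI and mpich-gm are both standard MPI, the same source
builds against either. The sketch below is my own generic version, not
anything shipped with the Scali or Myricom packages, and the message
size and iteration count are just illustrative. Build it with whatever
compiler wrapper your MPI provides (mpicc here) and run it on two nodes.

/* pingpong.c -- minimal MPI ping-pong round-trip timing.
 * Typical build: mpicc pingpong.c -o pingpong ; run on exactly 2 procs. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, i;
    const int iters  = 1000;   /* illustrative values only */
    const int nbytes = 8192;
    char *buf;
    double t0, t1;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (size != 2) {
        if (rank == 0)
            fprintf(stderr, "run this on exactly 2 processes\n");
        MPI_Finalize();
        return 1;
    }

    buf = malloc(nbytes);

    MPI_Barrier(MPI_COMM_WORLD);
    t0 = MPI_Wtime();
    for (i = 0; i < iters; i++) {
        if (rank == 0) {
            /* rank 0 sends, then waits for the echo */
            MPI_Send(buf, nbytes, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(buf, nbytes, MPI_CHAR, 1, 0, MPI_COMM_WORLD, &status);
        } else {
            /* rank 1 echoes everything back */
            MPI_Recv(buf, nbytes, MPI_CHAR, 0, 0, MPI_COMM_WORLD, &status);
            MPI_Send(buf, nbytes, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
        }
    }
    t1 = MPI_Wtime();

    if (rank == 0)
        printf("avg round trip: %g usec for %d byte messages\n",
               (t1 - t0) / iters * 1e6, nbytes);

    free(buf);
    MPI_Finalize();
    return 0;
}

Running that over SCI and over Myrinet on the same pair of nodes gives
a much cleaner comparison than application wall-clock times.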

Hope this helps.

Tim

-- 
Tim Wait                   waitt at saic.com
SAIC - Advanced Systems Group
1710 SAIC Drive MS 2-3-1, McLean VA 22102
Phone: (703) 676-7108, fax (703) 676-5509



