Can we have a moment of silence (or several million dollars) . . . please?

Bob Drzyzgula bob at drzyzgula.org
Mon Jun 25 20:39:20 PDT 2001


On Mon, Jun 25, 2001 at 09:47:25PM -0400, Greg Lindahl wrote:
> > 3) Take the cost to design and fab a cpu (>$1 Billion) and divide
> >    by number of units sold and see how much you need to make
> >    a living.
> 
> Hello? A cpu does not take up an entire fab. If it's built in small
> volumes, it shares with other items.  The Power3/4, MIPS, and
> Transmeta chips do not have dedicated fabs; I don't think the Samsung
> fab that makes Alphas is dedicated either.

AFAIK, even the Athlon fabs (Dresden & Austin, last I knew) aren't
dedicated, but that could have changed.

> The whole reason Digital
> sold their fab to Intel was to take advantage of the fact that it was
> mostly idle, but that Intel can fill it up. Today, if you want
> something fabbed in low volume, like the custom chip on a Myrinet
> board, which contains a CPU, you go to one of the 2 companies in
> Taiwan that offers a fab for low-volume use.

Or use an FPGA. The ratio of ASIC NRE to the per-unit cost
premium for programmable logic keeps getting larger, which
pushes the break-even volume for an ASIC higher all the
time. Not that I have any idea what Myrinet specifically
is using.
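
(Just to put a number on that tradeoff, here's a toy break-even
calculation. Every figure is invented purely for illustration and
has nothing to do with Myrinet's actual costs.)

    # Toy ASIC-vs-FPGA break-even sketch; all numbers are made up.
    # The point: as ASIC NRE grows faster than the FPGA per-unit
    # premium, the volume you need before the ASIC pays off climbs.
    def breakeven_units(asic_nre, asic_unit_cost, fpga_unit_cost):
        # Extra you pay per unit by staying with the FPGA:
        premium = fpga_unit_cost - asic_unit_cost
        # Units you must ship before the ASIC's NRE pays for itself:
        return asic_nre / premium

    print(breakeven_units(asic_nre=500000, asic_unit_cost=20,
                          fpga_unit_cost=120))   # 5000.0 units
    # Double the NRE and the break-even volume doubles too, so the
    # FPGA stays the cheaper choice at ever-larger volumes.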

> On the design side, AMD proved that they can design a cpu with 1/10
> the money it costs Intel. The Alpha team has likewise been extremely
> cost effective. 1/10 the design cost on 1/100 the sales does make for
> higher overhead, but not necessarily a prohibitive cost.

Personally, I think that the most remarkable thing about
AMD is their ability to do what they've done without *any*
high-end stuff to subsidize the activity. MDR estimates
that a 1.33GHz Athlon costs about $62 to manufacture. If
AMD is able to get a $50 margin on these, then they can
bring in a billion dollars on 20 million of the little
suckers.  OTOH, MDR also figures that a 900MHz UltraSPARC
III costs about $145 to build. If Sun can get a $2000
margin, they only need to sell half a million of them
to get the same billion. But it is a pretty tall order
to get people to pay those kinds of prices for a slab of
silicon in a package with little balls of solder on the
bottom. No, you pretty much need a *system* to go with it,
and that system has to offer something that you can't get
any other way. Even this sell is getting tougher all the
time, but Sun is still, at this point, pulling it off.
Not that they aren't feeling some pain at the moment.
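
(For anyone who wants to check that arithmetic, here is the same
back-of-the-envelope calculation spelled out. The per-chip costs
are the MDR estimates quoted above; the margins are my guesses,
not anyone's real numbers.)

    # MDR cost estimates with guessed-at margins, as in the text above.
    athlon_cost, athlon_margin = 62, 50       # 1.33GHz Athlon
    usparc_cost, usparc_margin = 145, 2000    # 900MHz UltraSPARC III

    target = 1e9   # one billion dollars of gross margin

    print(target / athlon_margin)   # 20000000.0 -- twenty million Athlons
    print(target / usparc_margin)   # 500000.0   -- half a million UltraSPARCs

    # The ratios behind the "40x on less than 3x" line below:
    print(usparc_margin / athlon_margin)   # 40.0  -- 40x the per-unit income
    print(usparc_cost / athlon_cost)       # ~2.34 -- on under 3x the build cost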

When you look at those margins, though, it's pretty easy
to see the attraction of the high end -- 40x the per-unit
margin on less than 3x the manufacturing cost? Cool. That
is of course what
Intel is after with the Xeons. At the cost of investing
in a slightly modified infrastructure, they can sell
essentially the same chip at much higher margins.

I would assert, however, that it is precisely that
strategy -- of trying to have it both ways -- that got
Intel into whatever trouble it's gotten itself into
with Rambus and the Pentiums III and 4. I think that
Intel spent so much brainpower trying to avoid damaging
their high-margin business that they lost sight of the
fact that their high-volume, low-end business was what
was driving the process and keeping the costs down
at the high end. Intel forgot that if you don't "eat
your own children", someone else will eat them anyway:
http://www.zdnet.com/zdnn/content/zdnn/0326/299230.html

This was Andy Grove's focus -- to quote the above article,
"Companies succeed only by being the first to obsolete
their own products." While it would at first blush appear
that this is precisely what Intel is doing with the
PIII/P4/Itanium, it's more complicated than that.
IMHO, they made two primary mistakes: First was when
they tied their own hands in the Rambus deal.
It could easily be argued that, especially in the
i820/i840 timeframe, Rambus wasn't really Ready, and
neither were the chipsets. Intel could have been out
there kicking butt with "entry-level" SDRAM systems
but no, they were stuck trying to foist RDRAM onto
their loyal customers, and then sucking up to them with
hardware trade-ins after screwing up massively on the
MTH (Memory Translator Hub).

The second big mistake was trying to keep the Celeron
under control. In this case, it wasn't enough for them
to market the Celeron in such a way as to make it appear
unsuitable for serious use. No, they had to *actually*
cripple it to make sure that it was really a worse chip
than the Pentium II/III. Why? To protect the fat margins on
the Pentium II/III. But AMD *didn't have* any fat margins
to protect. They barely had any margins at all. They
had no qualms whatsoever about attacking a $500 part with a
$100 part. Intel should have been doing the same thing --
if people wanted 100MHz FSB Celerons, well, then, hell --
sell them 100MHz FSB Celerons. Is it really better to lose
the sale to AMD than it is to lose the sale to your own
product? I personally don't think that Intel would ever
have lost as much market share as they did if they hadn't
handed it over to AMD in this way.

*THIS* is why we need healthy competition in commodity
processors. If AMD hadn't been out there keeping Intel
honest, who knows what kind of crap we'd be paying a
fortune for at this point.

> One of the most annoying parts of mostly off-topic discussions is when
> people give "facts" that aren't. I know your intentions are good, but
> can we discuss commodity clusters?

How is this not about commodity clusters? Understanding where one's
parts are going to come from next year would seem to be quite relevant.

--Bob



