[Beowulf] Re: Religious wars

Robert G. Brown rgb at phy.duke.edu
Wed Jul 23 13:37:16 PDT 2008


On Tue, 22 Jul 2008, Greg Lindahl wrote:

> On Tue, Jul 22, 2008 at 10:54:47AM -0400, Bob Drzyzgula wrote:
>
>> It is not even certain that the default, base install of a Linux
>> system will include Emacs
>
> This just indicates a conspiracy of vi users. Or, more likely,
> vi users complained that emacs was in the default. Emacs users
> aren't bothered by having vi around.

More likely it is history and inertia.  I think even systems people tend
to be a bit dazed by Moore's Law.  Note that Bob and I started out on
systems with far less than 100 MB of DISK and perhaps a MB of system
memory on a fat SERVER in the latter 80's.  And the P(o)DP(eople) made
do with even less in the early 80's.  And the costs for these servers
were staggeringly higher -- $40K and up without bound (hundreds of
thousands of dollars to millions of dollars in 1980 money), until pretty
much the end of the 80's when Suns got sufficiently commoditized and
pressure from the descendants of the IBM PC continued to mount and
prices started to drop on non-PC iron and workstations.  Still, we paid
just about $100K even for a refrigerator-sized SGI 220S with two
processors, 8 MB of memory, and something like a 100 MB drive in 1989 or
1990 or thereabouts.  Software maintenance alone on it was $3500 a year.
We sold it in maybe 1995 for $3500 -- Sparc 2's were down to that or
even less, ultrasparcs were coming out, one could put that much compute
power on your desk with much more disk for the maintenance cost alone
(and no need for 1.5 tons of AC just to run two processors!).

Back then (80's) resources were tight and expensive and fitting into a
small footprint was key, as the 100 MB or so one could afford had to
hold the OS itself, the sources and build spaces for e.g. emacs and the
like, and /home for all users.  Our Sun 4/110 served something like 50 or 100
users on a mix of tty connections (into big multiport serial bus
interfaces) and IP connections, on a processor far slower and with far
less memory than the one in my phone or PDA, and its performance was
quite acceptable.  BUT, things like emacs had big memory footprints for
the time; some people wouldn't install it on a public server just
because enough simultaneous users would bring it to its knees, shared
libraries or not.  People bought whole workstations in part so they
could run emacs and code development tools off the public servers
without resource competition.

vi back then was little more than a shell on ed IIRC -- tiny and
efficient.  More importantly, ed and vi were compiled static, along with
a handful of other key tools, and lived in /sbin on early systems.  They
were literally the first things installed in the bootstrap install of
any unixoid operating system, and they were the only things that would
WORK if libc failed (or the drive/partition holding libc failed, but the
/ or /boot partition survived with the /sbin image intact).  Which
happened.  Not even that infrequently.

So if you wanted to NOT have to do a full reinstall from a QIC tape
following an elaborate and arcane bootstrapping procedure, followed by a
rebuild and reconfiguration, or a restore from a tape backup that one
would pray actually worked and in any event would cost SOME user SOME
time or critical files, you learned vi.  It was the editor that worked
when all others failed.

I learned this (like so many other lessons back then) the hard way -- a
system I was managing died -- I think it was one of the SGIs -- and I
had to go in to perform some /etc surgery to try to bring it back
without a reinstall.  However, its network was gone, its access to
/usr/share was gone, and /usr (a separate partition) had come
disconnected somehow.  jove (which I had already learned and mostly
used) Did Not Work.  Emacs Did Not Work.  Both binaries needed something
that was gone.  I thought I was dead in the water, and talked to my
guru about it, looking for help, as I did NOT want to do a reinstall and
he was a guru, right?  A magic worker.

At which time he gave me his usual withering look, mumbled about the
importance of my reading all of the man pages -- I mean ALL of them, and
I mean whether or not I ever needed to use what they described -- and
whacked me upside the head with a banana while describing the proper
function and purpose of /sbin, what the "s" stood for, and why I had
to learn to use (and remember!) vi or even ed because without an editor
in the base system life was bad.  I humbly spent weeks popping in and
out of insert mode and learning the embedded ed commands until I could
at least reliably survive, but I missed my jove.

NOW it doesn't matter any more.  I carry a workable, bootable linux
around in my pocket at all times nowadays on a USB thumb drive that has
more memory than any of my computers or servers did (including those
that served whole departments) until maybe twelve or thirteen years ago.
The thumb drive there at this very moment holds a bit of a small linux,
just a rescue system, and I think it has both vim and emacs on it (but
not jove, alas).  In my bag I have my somewhat more expensive 8 GB thumb
drive, large enough to hold a kitchen-sink installation of Fedora 8 and
still leave me a few GB to use as a "small" home directory.  And by next
year, I'm guessing that 16 GB thumbs will cost what 8's do now (if not
4's), and one will be able to carry around a kitchen sink linux PLUS
10-12 GB of personal workspace for maybe $50.

Give me a machine that can boot USB and four minutes and I'll be working
on a linux machine no matter what it has installed, that kind of thing.
Even networks and devices, the things that plagued that sort of freedom
for so long, are being tamed by hal and friends so that e.g.
NetworkManager makes the network "just work" no matter what hardware it
finds at boot time.

The issue of resource consumption between vi(m) and emacs is hence truly
irrelevant to modern resource scales, and becoming more irrelevant
daily.  RAM size is scaling up at roughly 10 MB a day, amortized over a
year.  HD is scaling up by what, 10 GB a day?  More?  So if an 8 MB
memory footprint was relevant yesterday, it probably isn't today (he
says typing the reply on a system with 4 GB of RAM, whereas last year I
bought a system with 2 GB of RAM and two years previously got a system
with 512 MB of RAM). Like that.

Even mighty emacs vanishes without a trace in 4 GB, far less than 1% of
the resource -- now bloat is represented by the ever expanding maw of
e.g. ooffice (or better yet, by Vista of Evil, which crawls in 2 GB and
needs 4 GB to really get happy).  X at over 100 MB is "suddenly" almost
inconsequential.  ooffice eating 100 MB more on a personal laptop (not
even a server) -- who cares?  And next year, or the year after that, 8
GB RAM systems for $1000, thumb drives with 32 GB, computer/PDAs that
run at a GHz or more and have many GB of internal memory hard and soft,
TBs of disk standard.

> Interestingly, looking at the Red Hat RPMs, a full emacs install is
> only 2X the size of a full vim install. The main difference is that a
> minimal vim install is very small because it doesn't need vim-common,
> but no one's done the work to get emacs-nox to run sans emacs-common.
> Either way, it's a tiny fraction of a DVD, so you can look forward
> to full emacs on your rescue disk soon. Assuming the conspirators and
> complainers don't have their way.

This is dead certain correct.  A tiny fraction of a DVD, a thumb drive,
of main memory, and a truly minuscule fraction of the TB-scale OTC
drives here and coming.  But it was not always so, and it is the Old
Guys that still configure many of the base setups.  It's like my parents
-- they grew up in the great depression, and never quite got used to the
idea that they weren't actually hungry and poor even when they were
quite comfortably off.  They would still go dumpster diving into their
late 70's, because why throw a chair away only because it had a broken
leg or a stain on the seat?  Why buy a chair when one can find one in a
dumpster?  Never mind that it takes days of work to fix up the chair but
only hours of work to earn the money to buy a new one.  Bad ecology, but
(perhaps unfortunately) good economics.

We are now embarrassed by computer resource riches, but our minds were
set by our early experiences with poverty and scarcity.  The same issue
(this isn't entirely OT) comes up in coding practice.  I know people who
work for days, sometimes weeks, tightening up code so it is absolutely
efficient, and everything is done in a clean way that doesn't waste
memory or time.  OTOH, I personally write code (and teach my students to
write code) that is resource aware but SENSIBLE about it.

By this I mean that if one can accomplish some task in 1 millisecond in
the initialization phase of some program using an hour of programming in
a straightforward way, but could reduce it to a microsecond by spending
a week reordering loops and optimizing, NOBODY CARES.  1 millisecond is
less than human reaction time -- nobody could tell the difference,
literally.  Even a half second is probably irrelevant.  If there is a
way of programming that reduces the memory footprint by a few hundred K
but requires great care in managing the memory and coding, versus just
allocating a few big blocks -- ditto.  A few hundred K is quite
irrelevant now, in nearly all cases (where once it would have been
TERRIBLE practice).
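
To make "sensible" concrete, here is the kind of thing I mean as a
little C sketch -- the names and sizes are made up purely for
illustration, not taken from anybody's real code.  Do the
initialization the obvious way, with one big malloc and a dumb loop,
and don't burn a week shaving a few hundred K or a millisecond off of
it:

/* Straightforward, "wasteful" initialization: one big block, one dumb
 * loop.  Tuning it harder might save a few hundred K (packing the
 * struct) or a millisecond (a cleverer fill) -- which nobody will ever
 * notice. */
#include <stdio.h>
#include <stdlib.h>

#define NSITES 100000            /* made-up problem size */

typedef struct {
    double x, y, z;              /* let the compiler worry about padding */
    int flag;
} site_t;

site_t *init_sites(void)
{
    /* one big malloc instead of many tiny, carefully managed ones */
    site_t *sites = malloc(NSITES * sizeof *sites);
    if (sites == NULL) {
        perror("malloc");
        exit(EXIT_FAILURE);
    }
    for (int i = 0; i < NSITES; i++) {
        sites[i].x = sites[i].y = sites[i].z = 0.0;
        sites[i].flag = 0;
    }
    return sites;
}

int main(void)
{
    site_t *sites = init_sites();
    printf("initialized %d sites\n", NSITES);
    free(sites);
    return 0;
}

On any machine bought this decade, neither the padding in that struct
nor the millisecond spent in that loop will ever be noticed.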

The opposite is true (of course) in core loops.  Memory leaks, wasted
cycles, all add up there (depending).  Even there, adding a millisecond
to a loop that takes a minute to complete is invisible, while adding it
to a loop that takes 100 microseconds to complete is a disaster (I'll
append a little sketch of what I mean below).  This
sort of style infuriates some purists.  They'll work for days to avoid
wasting something that there is a nearly inexhaustible supply of, a
supply that is exponentially growing so fast that in the time it takes
them to complain about it, the net resource has grown more than the
marginal difference consumed.  This is the ssh vs rsh question -- rsh
is maybe 10 or 20 times faster than ssh, but for most cluster purposes
nobody should really care -- ssh isn't the actual IPC channel, it is
just used out of band to start and stop tasks and if it takes ten
seconds to do this instead of one second on a task that will run for a
week, what difference does it make?  There, the tradeoff the OTHER
way is the time it might take you to cope with a cracking episode caused
by rsh's utter lack of security, which can be an issue even inside a
cluster, if multiple people use the cluster and some of them are
untrustworthy.  You know.  Grad students.  Postdocs.  Disgruntled
employees.  IP thieves.  Most privacy abuses are internal, IIRC --
originate inside the so-called "trusted" space from employees or people
with the "right" to be there.
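
Getting back to the core-loop point, here is the little sketch I
promised -- again made-up C, purely illustrative: the same malloc that
is harmless in an initialization phase becomes pure waste (and a
standing invitation to leak) once it lands inside a loop body that runs
a million times:

/* Made-up inner-loop kernel that needs a scratch buffer, two ways. */
#include <stdio.h>
#include <stdlib.h>

#define NSTEPS 1000000           /* hypothetical iteration count */
#define BUFLEN 4096              /* hypothetical scratch size */

/* Sloppy: a malloc/free on every pass through the hot loop.  The
 * per-pass overhead is pure waste repeated a million times, and if the
 * free() is ever forgotten the leak eats memory at a ferocious rate. */
double kernel_sloppy(void)
{
    double sum = 0.0;
    for (int step = 0; step < NSTEPS; step++) {
        double *scratch = malloc(BUFLEN * sizeof *scratch);
        if (scratch == NULL)
            break;
        for (int i = 0; i < BUFLEN; i++) {
            scratch[i] = (double) i * step;
            sum += scratch[i];
        }
        free(scratch);
    }
    return sum;
}

/* Sensible: allocate the scratch space once, outside the loop. */
double kernel_sensible(void)
{
    double sum = 0.0;
    double *scratch = malloc(BUFLEN * sizeof *scratch);
    if (scratch == NULL)
        return 0.0;
    for (int step = 0; step < NSTEPS; step++) {
        for (int i = 0; i < BUFLEN; i++) {
            scratch[i] = (double) i * step;
            sum += scratch[i];
        }
    }
    free(scratch);
    return sum;
}

int main(void)
{
    printf("sloppy:   %g\n", kernel_sloppy());
    printf("sensible: %g\n", kernel_sensible());
    return 0;
}

Same handful of microseconds, completely different verdict, purely
because of where they land.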

So even if emacs IS pure, unadulterated bloat (and it's not -- it is a
damn powerful tool, just one that is too complex for most normal humans
to master any more except for their own specific narrow context) I'm
here to announce to the world that it is IRRELEVANTLY SMALL as of
several years ago, and getting smaller by the day as Moore's Law
advances even faster than Lisp programmers can expand to fill the
available space.

But jove is still smaller, tighter, faster, and better, if all you do is
code in C, fortran, simple scripting languages, and write straight text.
It's as good as or better than vi(m) for editing system files or anything
else.  Small, tight, and fast are irrelevant now -- if it ran ten times
faster than emacs proper nobody could tell, if it were ten times smaller
nobody would care, if the code were ten times perfecter -- well, people
DO care about tightness and quality.  jove is probably better debugged
and more trouble free than emacs (in any modern flavor).

But as you note, people use what they learn first, and they learn first
what they need to learn given their task mix.  For really old sysadmins
and coders, that was almost certainly vi (I'm a bit of an exception).
For almost as old coders, it is emacs, but sysadmins still needed to do
vi first and foremost until people stopped separating out /usr as a
partition (which was done to facilitate early installs and upgrades,
where one might want to preserve a big chunk of /, especially /etc).
For coders of the early 90's on, it was pretty much both, although even
through the early 90's having emacs relied on having a good sysadmin who
built and installed it and who used a nonstandard layout and installed
/usr in the / partition so libraries emacs relied on wouldn't go away
while the system still functioned.  From the mid 90's on (with linux on
the move and packaging systems flourishing) people no longer HAD to
build emacs, and it became truly universal.  At the same time it went
through its greatest period of bloat -- xml coders wanted xml indented
and color coded, ditto html coders, ditto php coders, it split into an
x-only version and a tty but still GUIish version, people integrated
mice to make it more gui-like either way, etc etc.

Somewhere in there somebody introduced mouse-clickable buttons on a
button panel, and I stopped even LOOKING at emacs or xemacs in new
releases.  It now largely looks like, and functions like, other WYSIWYG
editors in many contexts and for many users.  The cleanness and
fingers-on-the-keysness of it that were its original appeal are now
distant memories; even compiling tends to be done by means of pulldown
mouse menus.  I can now go head to head with many emacs users just as I
could any ooffice user, and open a file, make a key-based edit, compile
it, and run it in about the time it takes them to open it, make the same
change, and reach for their mouse to initiate the compile.  Ease of
learning has at long last started to win out over speed.

Learn to use a Mac in a day, pay for it forever, used to be an
instructive adage.  GUIs are easy to learn and use, but slow as molasses
when you want to do certain kinds of work.  Many/most new emacs users I
know -- ones that have started in the 2000's, say -- don't even know the
elementary cursor movement key combinations.  They just use the cursor
keys -- it is easy, if slow.  They don't know how to move or invoke make
or split screens with just their fingers.  They use the mouse.  They
don't grok Ctrl-space, Alt-F, Alt-F, Ctrl-W, Ctrl-Ifrog, space, Ctrl-Y
to move two words from wherever you are to the location of the word
"frog", which takes less than the time required to actually get to the
mouse with your hand, let alone move the cursor, highlight the text, cut
it, scroll down to the word frog (which might be ten pages down), click
on an insert point, and select "paste" from a menu.  And it leaves your
fingers right on the home keys, still typing, your train of thought
unbroken.

Ultimately it is THIS that is the real shame.  The really good Unix text
editors, especially the ones for programmers, were expert friendly to be
sure -- one has to WORK to learn to do this sort of thing at the speed
of thought.  As they are GUIfied, made idiot-simple, made to look like
all those Mac interfaces and MS Word and Ooffice-alikes, something
important is lost.  Speed.  I mean serious speed, speed of work done by
humans.  Human time is the resource that really costs money, just as
much today as it did back in 1982 or 1988 when computers were
extraordinarily expensive, whereas now they are cheap.  I've written entire
functioning multilevel autobuilding mysql integrated websites in php in
a day, and debugged them and extended them in a day or two more -- less
than 40 hours of work -- using jove.

I can rip out code in jove (and y'all KNOW I can type like the wind in
jove, as jove is one reason I am so list-prolific:-).  One can pop in
and out of it to test, flop windows and test, run multiple source
windows and test, in fractions of a second and without conscious
thought, so the connection of brain to actual task semantics is never
broken.  And this isn't to tout it -- I'm sure vim or emacs users of the
old school could do the same as long as they keep the fingers on those
home keys and have mastered the keystroke-based shortcuts.  Or joe users
-- wordstar may have been the last great PC/DOS editor to keep fingers
firmly on the keys where they belong when working with text.  But show
me a "programmer" who cannot work without their mouse and a GUI-based
text editor, who has to scroll slowly up and down or constantly move
hands from the keys to the mouse and back to select even elementary
functions, and I'll show you somebody that will take a month to do the
work I did in 3 days...

    rgb

>
> -- greg
>
>
> _______________________________________________
> Beowulf mailing list, Beowulf at beowulf.org
> To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf
>

-- 
Robert G. Brown                            Phone(cell): 1-919-280-8443
Duke University Physics Dept, Box 90305
Durham, N.C. 27708-0305
Web: http://www.phy.duke.edu/~rgb
Book of Lilith Website: http://www.phy.duke.edu/~rgb/Lilith/Lilith.php
Lulu Bookstore: http://stores.lulu.com/store.php?fAcctID=877977


