[Beowulf] Moores Law is dying
himikehawaii1 at yahoo.com
Thu Apr 9 08:33:56 PDT 2009
Multi-core or multi-processor systems require a different way of looking at the situation. The traditional decision tree of yes/no and one line of code (the classic if/then/else statement is just one example) means you cannot effectively divide up an application written to run in a linear fashion, one line after the next, where each prior decision determines the next action. As long as this style of programming is the norm, multiple cores or processors can do little more than segregate different programs onto different cores. For the majority of computer uses, the user is accustomed to this linear process; the advent of multiple cores will not help a word processor, though it may help with the overhead of the graphical user interface, or with the bloated code of which 99% of people use only a tiny fraction.
HPC (clusters) or even multi-cores: I just bought three more quad-cores for personal use, basically so I can have more programs open at a reasonable speed, not because any individual program actually takes real advantage of them. That would require the program to have a predictive, even prescient, idea of what the average user will want. True, algorithms have gotten better at predicting; then again, Google's search results have arguably degraded under the sheer volume of information.
The average user still buys into Bill Gates' old idea of a computer on every desk, running bloated C code programs and operating systems, and that vision was never about doing super-complex financial simulations. On that subject I can only think of two things: 1) my old statistics teacher telling me, "tell me the results you want and we can find a statistical way to verify it," and 2) computers only do what you tell them (GIGO). If you feed in a bad financial simulation, or a projection biased toward the result you want, then getting the answer a few seconds faster through more GHz or RAM changes nothing. Not a single analytical tool I know of for the average user (or, for that matter, the advanced user, outside perhaps of weather, nuclear, and genome work where specialty programs are written) comes close to being able to divide up its work effectively. True, multi-users and multi-taskers may benefit, but without a real paradigm shift in basic uses (spreadsheets, databases, word processing, and so on), the benefit of multi-cores and HPC amounts to having 25 windows open at once: your word processor and spreadsheet get a faster response from more GHz, not a complex model that the user has no idea how to operate and generally has no need for.
True, companies can benefit by using their systems to design, say, chips instead of buying a supercomputer; Intel has long used HPC to leverage its excess cycles. But the paradigm shift needed to make multi-core and HPC explode has to come from the average user, and unfortunately the most likely place for it to appear is the multiplayer game market rather than general use.
I personally would love to put my excess capacity to use, but the tools are out of reach: obtaining a Beowulf or other HPC system is not realistic for me. I may have 25 cores across various computers, with terabytes of storage at my disposal at home, but there is no cost-effective way for me to obtain the systems or the programming languages to try to develop the next "killer application."
You mention Moore's law, yet it was applications that drove adoption: VisiCalc launched the Apple II; Lotus 1-2-3 and WordPerfect launched the IBM PC. Windows and the Mac basically just created the need for faster speeds to support their overhead, and let the more "average" user start using computers. Has there been any real change in mainstream database development, or in financial tools for the masses? Do you really need HPC for a typical word processor or desktop publishing? True, we all like it faster, but the underlying point stands: there has been no fundamental change, beyond making computers easier for the average person to use, plus bloated code whose features are mostly never used.
There is a dedicated segment (Linus and Linux being the prime example), but 1% of a market base cannot steer a market. I do think we are on the cusp of change, and I also believe that a select few companies do not want too much power in the hands of the masses. Look at the idea of online services for what is typically performed on the desktop now: word processing, spreadsheets, and so on run as client-server applications and paid for on a per-use basis. This is exactly the old "big iron" idea, dumb terminals attached to a mainframe, with the various departments charged back for services.
I hope that open source, including access to HPC software, will open a new field in which the average interested party can work on developing systems that effectively utilize multiple cores and HPC. But the average user cannot afford the complete Adobe suite, and is instead catered to by others telling them what they need, which removes creativity and advancement. And while we use 4th-plus generation languages, they are still just easier-to-use versions of, say, the old Altair or Apple BASIC with its line numbers: most people have a hard enough time thinking in a linear, step-by-step fashion, and few can plan 20 chess moves into the future.
So more cores and faster GHz mostly just feed the impatient masses; they do not lead to the breakthrough in which true use of multi-threading, hyper-threading, multi-cores, and HPC will shine.
What we really need is a paradigm shift in thinking about what a computer can do beyond its typical modern uses. It has been a while since I heard much about neural nets or fuzzy logic, since these topics do little for the average user. Correcting misspelled words on the fly in a word processor is neither of these; it is just using excess cycles to branch off and run a subroutine. Yet the average user thinks that because a computer can do this, or keep seven Google windows open while their IM runs, their system is exploiting what really matters here.
So as long as computers in the developed world are treated as disposable items whenever the latest, greatest, faster system comes along (even as big iron and legacy systems still exist), users will always see the machine as it was originally named, the "personal computer": they think individually and do not deviate from their linear process.
What we really need is a new killer operating system with killer applications that wakes up the general user: one where I can affordably link my multiple systems, with the bandwidth to process the massive growth of information. I cannot call it data, since the internet is too full of unverified factoids (not really facts). And the average user should not have to depend on Google, which may now return 38 million hits on a query.
Just as Moore's law brought computing power into the hands of the general public, it was applications that drove use and growth. Until there is a real paradigm shift in what the general public sees their "PC" (Macs included) as being for, and until there are killer applications, reasonably priced ways to utilize legacy as well as newer PCs, and affordable programming languages and operating systems, growth will be slow, and slower still. In business we call this the law of diminishing returns; it is also known as the maturity cycle of an industry. Absent a fundamental shift, the computer industry, at least for 99.9% of users, is a mature industry, and improvements will not come as rapidly.
Instead, just as Linux was made open source to spark the interest of developers seeking something new (when was the last real company born out of a garage?), maybe HPC should consider going open source to foster development.
I can think of several uses for HPC, but as an average user they are beyond my budget, so I am locked out of even trying. Just as the PC was treated as a "toy" (see HP and the Apple II, or IBM, which did not take it seriously until forced to), the fact is that growth will continue to slow. There is little need for faster cycles when most people cannot even use the cycles they have, which sit wasted between keystrokes.
So unless someone finds a way for the average person to "hack" away at something new that would benefit the general use of HPC, multi-cores, multi-nodes, and so on, beyond how many windows can be open at once and small subroutines that do little more than automatically correct my typing, we will stay in this era of stagnant growth.
--- On Thu, 4/9/09, ariel sabiguero yawelak <asabigue at fing.edu.uy> wrote:
From: ariel sabiguero yawelak <asabigue at fing.edu.uy>
Subject: Re: [Beowulf] Moores Law is dying
To: "Bruno Coutinho" <coutinho at dcc.ufmg.br>
Cc: richard.walsh at comcast.net, beowulf at beowulf.org
Date: Thursday, April 9, 2009, 3:48 AM
I am not sure that all those GHz are useless in a desktop. All the users
of the Atom processor I know are kind of disappointed: 1.6 GHz and a
single core does not seem to be enough to run their preferred OS. They
pay that price because they can carry it, but as far as I can
understand, they need more. I can do really a lot with such a system, but
I had to learn, not painlessly, that I am not an average user.
Just last year 55,000 new security threats were discovered, and thus
their antivirus systems should try to do something about them. Don't
forget about personal firewalls working at the speed of P2P
applications, and all the FX required to run a desktop today. I don't
think all the GHz we have are enough for a regular Vista user (9x% of
the market of multicore personal systems?).
Unfortunately we have not figured out how to expose the capabilities of
parallel processing to a Visual Studio programmer, and thus 99% (please
add more nines as required) of applications cannot deal with
multiple cores. As we know, a single-threaded application today might
even run slower than a few years ago. In some way, we "earned" a few
years to learn how to go parallel thanks to multicore systems: we can
allocate one core for the system, another for the antivirus, another for
the application, and still have one more. But not on most
notebooks, and even less for users whose OS puts a limit on the number of
cores it is licensed to handle.
What did we do? We just "released" all the power of one core to a
single-threaded program while using the other cores to "serve" the
application in the foreground. But we still don't know how to solve general
programming problems on a parallel system. Well, sure, we know for
certain applications: we spawn more threads as more connections arrive
at a web server or an IMAP server, and that can really use a
multiprocessor system. Most of the readers of this list really know how
to solve applications in parallel, even beyond my imagination! All the
science you do in parallel really squeezes our systems, but even
though we use COTS systems (and I suffer when you buy equipment from
server vendors instead of building from scratch, as the original Beowulf was
about, and as Google learned to do), we don't have COTS programmers. Think
of the training and know-how required for a skilled parallel programmer.
Parallel programming is still done by artists. A good parallel program
is a rare piece of art.
What about other programs? Well, we need a programmer who has taken at
least a "101 parallel programming" or "101 High Performance
Computing" lecture, which covers only a tiny subset of the humans
writing lines of code.
And please remember all those (barely) human programmers who specify their problem
in some sort of 4GL, which is ultimately translated into rules,
interpreted, and finally executed, single-threaded, on one processor. Those
guys fly high on GHz and always need more, more, and more GHz; their
only way to run faster is to get more GHz.
I think that, in the end, until we learn how to translate single-threaded
programs written in standard programming languages (yes, Visual
Basic, Java, C#, and so on) into fine-grained parallel code, we are
constrained to pray for GHz to make our systems run faster. Most of
IT is based on reuse (otherwise we might have moved on from Fortran and
DOS...), and we will have to reuse single-threaded things for years to come.
The users will not be amazed for long by playing variations of Space Invaders
and Pacman on their mobiles.
Bruno Coutinho wrote:
> I think that even if they stop scaling down the size of desktop processors
> due to lack of interest in more performance,
> someone will continue doing it (even at a much slower rate) for HPC.
> No matter how much computing power future processors will have,
> someone will invent an application that needs more.
Beowulf mailing list, Beowulf at beowulf.org sponsored by Penguin Computing