[Beowulf] Large amounts of data to store and process

Prentice Bisbal pbisbal at pppl.gov
Mon Mar 18 08:49:57 PDT 2019


On 3/15/19 9:23 PM, Gerald Henriksen wrote:
> On Fri, 15 Mar 2019 05:28:42 +0000, you wrote:
>
>> I think what I was getting at is: why not bring current HPC practices to everyday desktops? Since we are reaching certain limits and have to write code to take advantage of more and more cores, why not use MPI and the like to help distribute the software across those cores?
> I suspect the simple answer is a combination of most of the software
> running on desktops doesn't really need all those extra cores we are
> now getting, and some of the more common desktop applications don't
> really lend themselves to parallel processing anyway.
>
> It's great that you can go out and get 32 core Threadripper and sit it
> on your desk for an affordable price, but for 90% or more of the
> market at least 28 of those cores would spend most of their life idle.

Not really. The OS can still assign those cores to different 
applications, so that each application has its own core, or each core 
is shared among a smaller number of applications. That would still 
yield a noticeable speedup for the user, just not in single-application 
performance, which is typically what we concern ourselves with in HPC.
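To illustrate the distinction, here is a minimal sketch in Python (using the standard multiprocessing module rather than MPI; the task and process count are arbitrary choices for the example): each worker is a serial task in its own process, which the OS scheduler is free to place on its own core. Aggregate throughput improves with more cores, even though no individual task runs any faster.

```python
# Sketch: independent serial tasks in separate processes.
# The OS can place each process on its own core, improving aggregate
# throughput without speeding up any single task.
from multiprocessing import Pool
import os

def busy_task(n):
    # A purely serial task: sum the first n integers the slow way.
    total = 0
    for i in range(n):
        total += i
    return total

if __name__ == "__main__":
    n = 100_000
    tasks = [n] * 8  # eight independent "applications"
    workers = min(8, os.cpu_count() or 1)
    with Pool(processes=workers) as pool:
        results = pool.map(busy_task, tasks)
    # Every task produces the same answer; only wall-clock time
    # for the whole batch depends on how many cores were available.
    assert results == [sum(range(n))] * 8
```

With 8 cores the batch finishes roughly 8x sooner than on one core, but each individual `busy_task` call takes just as long either way, which is the single-application performance HPC usually cares about.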

--
Prentice
