[Beowulf] Haswell as supercomputer microprocessors

Jörg Saßmannshausen j.sassmannshausen at ucl.ac.uk
Mon Aug 3 03:59:00 PDT 2015


Hi Mikhail,

I would guess your queueing system could take care of that. 

With SGE you can define how many slots (cores) each node offers. Thus, if you only 
want to use 16 out of the 18 cores, you simply define 16 slots for that node.
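
For example (just a sketch; the queue name all.q and the host names are 
placeholders, and the exact syntax depends on your GridEngine version), you could 
limit the slots per host in the queue configuration:

  # offer only 16 slots on each 18-core host
  qconf -aattr queue slots "[node01=16]" all.q
  qconf -aattr queue slots "[node02=16]" all.q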

Alternatively, at least OpenMPI allows you to underpopulate the nodes as well.
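
For instance (a sketch assuming a reasonably recent Open MPI; ./my_app and the 
total rank count are placeholders):

  # start only 16 ranks per 18-core node and pin them to cores
  mpirun --map-by ppr:16:node --bind-to core -np 256 ./my_app

Older Open MPI versions use -npernode 16 for the same effect.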

Having said that, is there a good reason why you want to purchase 18 cores and 
then only use 16? 
The only reason I can think of for needing or wanting to do that is if your job 
requires more memory per core than a fully populated node provides. For 
memory-intensive work I still think that fewer cores per node and more nodes are 
beneficial here (e.g. on a node with 128 GB, 16 ranks get 8 GB each instead of 
about 7.1 GB with 18).

My 2 cents from a sunny London

Jörg

On Monday 03 Aug 2015 10:06:27 Mikhail Kuzminsky wrote:
>  New special supercomputer microprocessors (like IBM Power BQC and Fujitsu
> SPARC64 XIfx) have 2**N + 2 cores (N=4 for the former, N=5 for the latter),
> where the last 2 cores are redundant: not used for computations, but only for
> other work with Linux, or even to replace a failed computational core.
> 
> Current Intel Haswell E5 v3 may also have 18 = 2**4 + 2 cores.  Does it make
> sense to borrow the POWER BQC or SPARC64 XIfx idea (not exactly) and use
> only 16 Haswell cores for parallel computations? If the answer is "yes",
> how can this be done under Linux?
> 
> Mikhail Kuzminsky,
> Zelinsky Institute of Organic Chemistry RAS,
> Moscow

-- 
*************************************************************
Dr. Jörg Saßmannshausen, MRSC
University College London
Department of Chemistry
Gordon Street
London
WC1H 0AJ 

email: j.sassmannshausen at ucl.ac.uk
web: http://sassy.formativ.net

Please avoid sending me Word or PowerPoint attachments.
See http://www.gnu.org/philosophy/no-word-attachments.html